PDF/A is one of the best formats to archive my documents.
What Linux software would allow me to scan documents to PDF/A?
A PNG-to-PDF/A or TIFF-to-PDF/A tool would be fine too.
Is there a way to set two different background pictures for my two monitors in Windows 7?
By default the same background picture is used for both displays. I am looking for a solution without installing extra software.
I am looking to see if anything exists that would allow us to capture all outgoing email on a machine -- for example, in a staging environment -- and drop it in a single place, which ideally would be something we could check with a mail client.
Currently we're doing this at the software level (if the environment is staging, rewrite the address), which is a bit ugly and leads to errors.
The servers are currently on Debian Linux, using exim as the mail transport.
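In case it's useful context, about all I know how to do at the MTA level is check where exim would route a given address on the staging box (on Debian the binary may be called exim4, and -bt only tests routing, nothing is actually sent):

exim -bt someone@example.com    # shows which router/transport would handle this recipient
exim -bp                        # lists what is sitting in the queue (same as mailq)

What I can't see is how to go from that to a single catch-all mailbox we can open in a mail client.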
Open to any and all suggestions!
What is the best method to rip/encode a standard-definition DVD to be streamed to an Xbox 360 from a Vista Media Center PC (preferably using free software)? I've found tons of tutorials on the web explaining how to create WMVs from VOB files using FFmpeg, but every combination of settings that I have tried has resulted in very poor video quality. I've also tried various video conversion tools, but everything seems to result in poor video quality, or audio that is out of sync with the picture. Please help!
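For example, one of the many combinations I've tried looked roughly like this (bitrates and codec choices varied between attempts; the input filename is just an example of a ripped VOB):

# convert a ripped VOB to WMV for streaming from Media Center
ffmpeg -i VTS_01_1.VOB -c:v wmv2 -b:v 2500k -c:a wmav2 -b:a 192k output.wmv

The result plays, but the picture looks noticeably worse than the original DVD.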
I am sure my Windows Server 2008 box is constantly under attack both at the network level and web application level.
The question is: how do I detect these attacks? Is there any lightweight software available that can monitor the server?
Note that I am running this on a VPS, so the monitoring program will have to run on the same server.
We are running a payments (EFT transaction processing) application which is processing high volumes of transactions 24/7 and are currently investigating a better way of doing DB replication to our disaster recovery site.
Our current and previous strategies have included using both DoubleTake and Redgate to replicate data to a warm stand-by.
DoubleTake is the supported solution from the payments software vendor however their (DoubleTake's) support in South Africa is very poor. We had a few issues and simply couldn't ever resolve them so we had to give up on DoubleTake.
We have been using Redgate to manually read the data from the primary site (via queries) and write to the DR site but this is:
A bad solution
Getting the software vendor hot and bothered whenever we have support issues, as it has a tendency to interfere with the payment application, which is very DB-intensive.
We recently upgraded the whole system to run on SQL 2008 R2 Enterprise which means we should probably be looking at using some of the built-in replication features.
The server has 2 fairly large databases with a mixture of tables containing highly volatile transactional data and pretty static configuration data.
Replication would be done over a WAN link to a separate physical site and needs to achieve the following objectives.
RPO: Zero loss - This is transactional data with financial impact so we can't lose anything.
RTO: Tending to zero - The business depends on our ability to process transactions; every minute we are down, we are losing money.
I have looked at a few of the other questions/answers but none meet our case exactly:
SQL Server 2008 failover strategy - Log shipping or replication?
How to achieve the following RTO & RPO with logshipping only using SQL Server?
What is the best of two approaches to achieve DB Replication?
My current thinking is that we should use mirroring, but I am concerned that for an RPO of zero we will need synchronous (high-safety) commits, where the principal waits on the mirror, and this could impact the performance of the primary DB, which is not an option.
Our current DR process is to:
Stop incoming traffic to the primary site and allow all in-flight transactions to complete.
Allow the replication to DR to complete.
Change network routing to route to DR site.
Start all applications and services on the secondary site (Ideally we can change this to a warmer stand-by whereby the applications are already running but not processing any transactions).
In other words the DR database needs to, as quickly as possible, catch up with primary and be ready for processing as the new primary. We would then need to be able to reverse this when we are ready to switch back.
Is there a better option than mirroring (should we be doing log-shipping too) and can anyone suggest other considerations that we should keep in mind?
Is there an option, or any software, that will allow me to temporarily ungroup and re-group specific similar taskbar buttons in Windows XP?
I would like the default behavior to be set to 'group similar taskbar buttons', but sometimes I prefer that specific applications be temporarily ungrouped for better window management.
Is there any built-in UI for this kind of hardware, like the one that exists in the Modern UI for Wi-Fi, Bluetooth, mobile broadband and other common settings, or am I forced to use separate software (besides the obvious drivers for the hardware)? The thing is that I have a built-in fingerprint reader in my laptop and I have installed all the necessary official drivers for it (and it looks like they are working fine, by the way). But I have not found any UI setting where I could change the sign-in option from password/picture password/PIN to fingerprint.
Right now I have Fedora dual-booted with Windows 7. The reasoning behind that is just that Windows was the first OS I ever used and has some essential software, and Fedora is the first Linux distribution I tried, but I would like to hear the arguments for other distros, as I may be looking to switch.
Thanks ahead of time.
I'm looking for a tool that is able to (remotely) monitor CPU and memory on a Windows server and, most importantly, which service/process is using them.
Or, is it possible to monitor a specific running service?
We have a server that freezes on a regular basis, and we're trying to find the culprit without using a local debugger.
It would be great if the monitoring software came with an agent that we could install on the remote clients for maximum accuracy.
Any suggestions are very much appreciated.
I burned a CD on XP using the built-in burning software and I can read the CD on that machine, but when I insert it into my Vista machine, I can't read the files. It shows the correct volume label, and the correct 'free space', but I can't access the actual files.
Am I missing something obvious?
(Both systems are fully up-to-date)
I'm new to Linux and have noticed that there are numbers beside certain commands I look up.
For example, I want to look up accept() in the context of network programming, but man accept shows this instead:
accept(8) Easy Software Products accept(8)
NAME
accept/reject - accept/reject jobs sent to a destination
So how do you switch to the manual pages in the other sections, like accept(1) through accept(7)?
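To be concrete, what I'm after is a way to say something like the following (if this is even how it works):

man 2 accept     # view accept() from section 2 (system calls / network programming)
man -f accept    # list which sections have an "accept" page (same as whatis)
man -k accept    # search page names and descriptions (same as apropos)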
My external HD got unplugged without ejecting and my Mac will no longer mount the drive, but it recognizes it's there.
I have already tried repairing the disk in Disk Utility, and erasing it in Disk Utility, and it still won't mount. I can't imagine the hardware is actually damaged; otherwise it wouldn't even be recognized, right?
Is there any other software solution I can try? Recovering my files is not a concern.
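For reference, the command-line equivalents of what I have been trying look roughly like this (disk2 is just an example identifier; yours may differ):

diskutil list               # confirm the Mac still sees the physical disk
diskutil repairDisk disk2   # repair attempt from the command line
diskutil mountDisk disk2    # manual mount attempt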
Seeing the 'is downloading a torrent of software you own illegal' thread made me think about this. Anyone have any answers?
NOTE: SuperUser is NOT a legal resource and any advice/answers here should NOT be used in any way.
There was a tool called PB Downforce that did something like this, but it has been discontinued because it was buggy and stopped working for its purpose. So is there any alternative? I have licensed software that is tied to the hard drive ID. Thanks.
In my HFT software I plan to use one core for stock index calculation. That would simply be a while(true) loop without any delays, which will recalculate (sum and multiply) the components as often as possible (so millions of times per second), and I plan to do that 8 hours per day, every day.
I have never before loaded my computer to 100% full-time, every day, on a regular basis. Could it be dangerous? Does the processor have some kind of "resource" (a very large one, of course) after which it stops working?
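If it matters, this is roughly how I intend to keep the calculation on one core and watch temperatures while it runs (assuming a Linux box with util-linux and lm-sensors installed; ./index_calc is just a placeholder name for my program):

taskset -c 3 ./index_calc   # pin the busy-loop calculation to core 3 only
watch -n 5 sensors          # in another terminal, keep an eye on CPU temperatures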
I develop web applications on Mac OS X using SQLite, MySQL and PostgreSQL, and these are then deployed to the web server. I want to take some of the pain out of working in the terminal locally when dealing with these databases. Is there any software available (free or otherwise) that can handle all three of these database technologies in a GUI for the Mac and is actually decent and worth it?
I'm setting up our local server and want to run virtual machines on it, but it seems VMware ESXi is not suited to our server.
Server: Dell SC 1424
CPU: 2× Xeon 3.2 GHz (800 MHz bus, 2 MB L2 cache)
RAM: 6 GB DDR-266 ECC
Hard disks: 2× Hitachi SATA 1 TB; RAID: Dell CERC 2s (RAID 0, 1)
NIC: 2× Broadcom 1 Gb/s
I'm wondering if you're familiar with this area and have any suggestions for virtualization software for our server. I just want to use the server for a few purposes (web hosting, Subversion, and to get experience with some server OSs).
Thank you for helping.
I'm using Input Director as a software KVM to control my laptop from my desktop, and all is almost OK with the setup. However, key-presses on the master keyboard seem to repeat very easily on the slave, and it is close to impossible to type a word on the slave without getting repeated characters. I typed the word 'repeat' on the master keyboard and my editor on the slave captured the characters 'repeeaatt'.
Both machines are Windows 7.
My server has received a sudden increase in (read) web traffic requesting many map image tiles, and Apache cannot handle it.
Apache cannot even handle the redirections! The average load on my CentOS machine is more than 200.
Is there some software out there that can redirect SOME of the traffic, such as only the traffic for a specific directory (such as http://example.com/maptiles/abc.png), to a different address (such as http://s3.amazonaws.com/mytiles/abc.png)?
Can this be done with HAProxy?
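To make the goal concrete, this is roughly the behaviour I'm after (the redirect target shown is what I would like to see, not actual output from my server):

curl -I http://example.com/maptiles/abc.png
# desired response: HTTP/1.1 301 Moved Permanently
# desired header:   Location: http://s3.amazonaws.com/mytiles/abc.png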
I've been trying to learn about networking, network maintenance, network administration, and things of that nature (I want to be a network engineer when I get out of university, after I finish high school), and I want to set up my older PC (running Slackware) as a modem, as a project to help me learn. I want to know what kind of hardware I'll need. Pretty much all I know is that my current modem uses ADSL2+ and PPPoA, which I think is a software thing anyway.
How can I accomplish this?
Working on a new project, and need some documentation advice. I need to be able to document ~ 400 client sites to track assets, setup, hardware layout, network layout, etc. We want to have photos, maps, etc. wherever possible. We need this for a call center environment.
Has anyone found any off the shelf software that performs this functionality, or are we on our own to develop the tools that we require?
When I try to compile the tomcat connector from source, everything appears fine except that no mod_jk.so file gets created.
Software versions:
RHEL6 x86_64
httpd-2.4.3
tomcat-connector 1.2.37
Commands:
cd native
./configure --with-apxs=/usr/local/apache2/bin/apxs
make
cd apache-2.0
ls
The only warning message during the make is:
Warning! dlname not found in /usr/local/tomcat-connectors-1.2.37-src/native/apache-2.0/mod_jk.la.
Does anyone have any suggestions on how to get the mod_jk.so file to be generated?
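For reference, this is how I checked what the build actually produced (as far as I know, libtool normally leaves the real shared object under apache-2.0/.libs/ when one gets built):

cd /usr/local/tomcat-connectors-1.2.37-src/native
find . -name 'mod_jk*'      # list everything the build produced with mod_jk in the name
ls apache-2.0/.libs/        # where libtool would normally place mod_jk.so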
I know Ghost and Clonezilla aren't able to build images of a system while the system is running (without rebooting). I haven't checked Acronis, though, but I'm not keen on proprietary solutions.
Question: Is there a software solution that is able to build a "live" image?
I would appreciate answers, since I'm one step away from building a Clonezilla test environment and this will help with my decision.
Thank you.
Our team rolled puppet out to our systems over the last six months. We're managing all sorts of resources, and some of them have sensitive data (database passwords for automated backups, license keys for proprietary software, etc.).
Other teams want to get involved in the development of (or at least be able to see) our modules and manifests. What have other people done to continue to have secure data moving through Puppet, while sharing the modules and manifests with a larger audience?