Search Results

Search found 7263 results on 291 pages for 'job boards'.


  • Looking for simple Windows scan (multiple pages) to one PDF application?

    - by Troggy
    I would like to find some simple scan software for a Windows machine that can scan to PDF, but I would like it to do batch or multiple pages into one big PDF. I saw a couple of questions on scan-to-PDF software, but did not see anything about scanning to large, multi-page PDFs. EDIT: I am surprised there are not more options out there. Do many of the scanners/all-in-one devices come with included software that performs this function? EDIT 2: I tried Scan2PDF and it locked up on me multiple times in the middle of the scan job and then gave me non-English error messages. Otherwise, I liked how simple the app was: just select the number of pages and hit OK. Any other success stories out there?

    Read the article

  • WMI Notification and database mirroring

    - by user22215
    Hi all, I'm having a problem configuring a WMI alert that I would like to use with database mirroring. I'm running Windows 2008 Enterprise x64 with SQL Server 2008 Enterprise x64, and SQL Server has SP1 installed. Basically, I create an alert, select WMI as the type, and enter the statement below: SELECT * FROM DATABASE_MIRRORING_STATE_CHANGE WHERE DatabaseName = 'testmove' AND State = 8. I have also made sure Service Broker is enabled for msdb and all mirrored databases; however, I still can't get this to work: the alert never fires. I'm testing just the alert functionality; I have not even added the agent job yet. I tested this by right-clicking my mirrored database and forcing it to fail over. Any help with this problem would be much appreciated.

    Read the article

  • What is the best way to handle the multitude of different logs created all around the place?

    - by Low Kian Seong
    I run a few applications which create their own logs. I also run cron scripts on the same server to import data for my app. When these cron jobs error out, the default is that an email is sent to the user that runs the cron job. There are just too many places I need to check logs and mail for things that might have gone wrong. My question is: what is the best way to handle this? Even better, is there a log parser application that will go through all the system logs and surface the cases where something really went wrong, instead of me having to go through them daily?
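
    One rough way to approximate this would be a nightly cron script that greps a known set of logs for error patterns and mails a single digest; the paths, pattern and recipient below are only placeholders:

        #!/bin/sh
        # Hypothetical digest script; adjust LOGS, PATTERN and RECIPIENT as needed.
        LOGS="/var/log/syslog /var/log/myapp/*.log"
        PATTERN='ERROR|CRITICAL|Traceback'
        RECIPIENT=admin@example.com
        # Collect matching lines (filename-prefixed), keep the last 200 to bound mail size
        DIGEST=$(grep -E -H "$PATTERN" $LOGS 2>/dev/null | tail -n 200)
        # Only send mail when something matched
        [ -n "$DIGEST" ] && printf '%s\n' "$DIGEST" | mail -s "log digest: possible errors" "$RECIPIENT"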

    Read the article

  • How to swap Escape and Caps Lock?

    - by pexeer
    I am using Arch Linux. When I program, I like to swap Escape and Caps Lock. I know that GNOME can do this, but I can not find the option in GNOME 3.6.2 on Arch Linux. So I use xmodmap and created a file, ~/.xmodmap:

        clear Lock
        keysym Caps_Lock = Escape
        keysym Escape = Caps_Lock
        add Lock = Caps_Lock

    When I run xmodmap ~/.xmodmap it works well, but it is not applied automatically when I log in to GNOME, even though I added xmodmap ~/.xmodmap to ~/.xprofile. Am I doing something wrong? How can I solve this issue?
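
    One workaround sometimes suggested is to run xmodmap from a GNOME autostart entry, delayed a little so gnome-settings-daemon does not override the map right after login; a minimal sketch, assuming xmodmap is on the PATH:

        # Create a per-user autostart entry that re-applies ~/.xmodmap shortly after login
        mkdir -p ~/.config/autostart
        printf '%s\n' '[Desktop Entry]' 'Type=Application' \
            'Name=Swap Escape and Caps Lock' \
            'Exec=sh -c "sleep 3; xmodmap $HOME/.xmodmap"' \
            > ~/.config/autostart/xmodmap-swap.desktop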

    Read the article

  • In Linux: how to exit a program but not kill it?

    - by biomed
    I use Ubuntu 10.10 and I have a Python program (Mnemosyne) whose data files I synchronize using Dropbox. My problem scenario is that I forget to close (exit) this program: I leave it running at home and go to work, but if I then open the program at work and work in it, the data file is changed, and I lose progress when I exit the copy left running at home, because it saves automatically on exit. I thought I could create a cron job to automatically close Mnemosyne every morning regardless of whether I remember to do it, but if I use kill the program exits without saving the data file, and I end up with a tmp file and an error message when I restart it. Is there a better way of sending the exit signal to this program, emulating me clicking the File > Exit menu option? Thanks
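
    One possible approach, assuming an X session and that the window title contains "Mnemosyne", is to ask the window manager to close the window via wmctrl, which is roughly like clicking the close button and usually gives the program a chance to save; a sketch of such a cron entry:

        # Run as the desktop user; DISPLAY must point at the running X session
        0 6 * * * DISPLAY=:0 wmctrl -c "Mnemosyne" > /dev/null 2>&1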

    Read the article

  • Timeout ssh sessions after inactivity?

    - by Insyte
    PCI requirement 8.5.15 states: "If a session has been idle for more than 15 minutes, require the user to re-enter the password to re-activate the terminal." The first, and most obvious, way to deal with ssh sessions that are idling at the bash prompt is by enforcing a read-only, global $TMOUT of 900. Unfortunately, that only covers sessions sitting at the bash prompt. The spirit of the PCI spec would also require killing sessions running top/vim/etc. I've considered writing a */1 cron job that parses the output of "/usr/bin/w" and kills the associated shell, but that seems like a blunt instrument. Any ideas for something that would actually do what the spec requires and just lock the terminal? I've looked at away and vlock; they both seem great for voluntarily locking your terminal, but I need a cron/daemon task that will enforce locking.
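
    For reference, a minimal sketch of the read-only global TMOUT mentioned above (which, as noted, only covers sessions idling at a shell prompt, not ones running top/vim):

        # /etc/profile.d/tmout.sh -- idle interactive shells exit after 15 minutes
        TMOUT=900
        readonly TMOUT
        export TMOUT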

    Read the article

  • smarter OS X smart folder

    - by vectorizor
    Hi all, Smart folders on OS X are nice and all, but you can only access them from the Finder sidebar and nowhere else (or am I wrong?). A better way would be for them to appear as normal folders in the file system, so they are available from anywhere, say the command line for instance. Or, if you had a smart folder that finds movies/music in ~/Download and added a shortcut to it in ~/movies, you could see your music/movies from Front Row. The smart folder wouldn't have to be refreshed immediately, maybe every hour or so from a cron/launchd job. Any way to do that? A
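
    One way to approximate this from cron/launchd would be a small script that rebuilds a plain folder of symlinks from a Spotlight query (a smart folder is essentially a saved Spotlight search); the paths and query below are only illustrative:

        #!/bin/sh
        # Rebuild ~/movies from a Spotlight search of ~/Download; run periodically from cron or launchd.
        DEST="$HOME/movies"
        mkdir -p "$DEST"
        # Drop the old symlinks, then link every file Spotlight classifies as a movie
        find "$DEST" -type l -delete
        mdfind -onlyin "$HOME/Download" 'kMDItemContentTypeTree == "public.movie"' |
        while IFS= read -r f; do
            ln -sf "$f" "$DEST/"
        done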

    Read the article

  • mod_rewrite, 301 problem

    - by blid
    Hi, currently I can access a specific page in two ways: 1. http://a.com/foo/bar 2. http://a.com/index.php?url=foo/bar. What I'm trying to achieve is to allow only the first way, and issue a 301 redirect from the second to the first. Here's the code I have made so far and put into .htaccess:

        <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteRule ^(.*)$ index.php?url=$1 [QSA,L]
        #tricky part
        RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /index\.php\?url=?(.*)\ HTTP/
        RewriteRule ^index\.php$ http://a.com/$1 [R=301,L]
        </IfModule>

    Currently it does almost all the job: it redirects index.php to /, but index.php?url=foo/bar goes to /?url=foo/bar, and I can't manage to make it right. TIA.

    Read the article

  • What kind of time lapse image stitching software is out there?

    - by AaronLS
    I took a bunch of digital images (JPG format) of a sunrise that I want to make into a time-lapse movie. The only recommendation I've seen was iStopMotion for Mac, but I am running Windows. I would prefer software that takes into account the metadata in each image indicating when it was taken, and uses that to determine how long to display each frame, since the shots won't be perfectly consistent in their temporal spacing. Onion skinning would be a cool feature too. Please, one software suggestion per answer to allow the voting system to do its job. Thanks in advance.

    Read the article

  • Unix Shell/SSH config to allow TCP port forwarding without showing a command prompt

    - by Raphael K
    I'm running Debian Linux. I'd like to have a user account that is able to connect via SSH for TCP forwarding only, without a command prompt. E.g. the following would work (from a remote computer): ssh -D1234 user@myhost, but no command prompt would appear. Using a shell like /bin/false or /sbin/nologin is too restrictive, as it doesn't even allow the user to log in. A shell that only allows the "exit" command or Ctrl+D would do the job. I know that something similar is possible to allow only SFTP, but I can't find the equivalent for TCP forwarding. Thanks
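
    A sketch of one common pattern (directive availability depends on the OpenSSH version): force a no-op command for that user in sshd_config while still allowing forwarding, and have the client skip the remote command with -N:

        # Client side: open the SOCKS tunnel without requesting a shell
        ssh -N -D1234 user@myhost

        # Server side, appended to /etc/ssh/sshd_config (then reload sshd):
        #   Match User user
        #       AllowTcpForwarding yes
        #       ForceCommand /bin/false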

    Read the article

  • How to switch off the monitor when mouse reaches the edge of the screen?

    - by evgeny9
    I have two computers at home (Windows XP and Windows 7), but one monitor for both of them. They are connected to this monitor using different interfaces: DVI and VGA. I'm also using one keyboard and one mouse to control both PCs with the help of Synergy or Input Director, but I still need to switch between the monitor inputs manually. I wonder if there is some way (software) to switch these inputs (turn off the monitor) when the mouse reaches the edge of the screen. So far I have found several answers that help avoid pressing the hardware buttons, but none that do the job automatically based on the mouse pointer coordinates. Thank you.

    Read the article

  • Copying files between servers by creation time

    - by driftux
    My bash scripting knowledge is very weak, which is why I'm asking for help here. What is the most effective bash script, performance-wise, to find and copy files from one Linux server to another using the specifications described below? I need a bash script which finds only new files created on server A, in directories named "Z", within the last 0 to 10 minutes, and then transfers them to server B. I think it can be done by building and executing a command for each newly found file: "scp /X/Y.../Z/file root@hostname:/X/Y.../Z/". If the script finds that the corresponding remote path does not exist on server B, it should move on and copy the next file whose directory does exist. Files should be copied with their permissions, group, owner and creation time. X/Y... are various directory paths. I want to set up a cron job to execute this script every 10 minutes, so performance is very important in this case. Thank you.
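
    A minimal, unoptimized sketch of this approach; the source root, the "Z" directory name and the destination host are placeholders:

        #!/bin/bash
        # Copy files modified in the last 10 minutes inside any directory named Z to server B.
        SRC_ROOT=/X
        DEST=root@hostname
        find "$SRC_ROOT" -type d -name Z -print0 |
        while IFS= read -r -d '' dir; do
            find "$dir" -maxdepth 1 -type f -mmin -10 -print0 |
            while IFS= read -r -d '' file; do
                # -p preserves modification times and modes; log files whose remote dir is missing
                scp -p "$file" "$DEST:$dir/" || echo "skipped $file" >&2
            done
        done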

    Read the article

  • Duplication of Windows 7 Backup

    - by Steven Pickles
    I use the built-in backup utility for Windows 7 because it's automated and flexible enough to allow me to schedule a daily shadow-copy backup of particular files and folders directly to a separate internal RAID 0 array (2 x 1TB). It's also lightweight and stays out of the way. For off-site backup purposes, each week I copy the contents of the internal backup from the RAID 0 array to an external 1 TB drive, and then move this drive to a different building. The copy from the internal backup to the external backup typically works like this: mount and erase the contents of the external drive; highlight the backup folder on the internal drive and hit Ctrl+C; hit Ctrl+V on the root directory of the external drive. Is there a better way to synchronize? Microsoft's SyncToy application does a pitiful job and often leaves the folders not truly synchronized, which completely defeats the ability to use the backup's restore feature.

    Read the article

  • Mailing list with dynamically generated addresses

    - by Joe Tomasone
    I am trying to implement a dynamic mailing list from a database that changes quite often. Conditions: Postfix is the MTA; email addresses are in a MySQL database; Postfix only allows senders whose addresses are in that database (via smtpd_sender_restrictions); a cron job extracts the current addresses from the database nightly, puts them into an alias file, and runs postalias on it. This works well, but since the sender address remains unchanged, many domains are rejecting the mail because my server is not a DNS-listed mail server for the sender's domain. So I either have to find a way to rewrite the outgoing address as "listserv@mydomain", or find some mailing list package that will use database-retrieved addresses (either queried directly or from a flat file) as the subscriber list, with that list replaced daily. I've tried Sympa and am pretty much ready to give up on it - it's a nightmare to get working right - but that's the only open source list server I have seen that works with dynamic mailing lists. Does anyone have any ideas? Thanks, Joe
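
    For illustration, a rough sketch of the nightly cron step described above; the database, table and alias names are hypothetical, and the alias file must still be referenced from alias_maps in main.cf:

        #!/bin/sh
        # Rebuild the list alias from MySQL, then regenerate the alias database.
        # Assumes MySQL credentials are available, e.g. via ~/.my.cnf.
        ALIASES=/etc/postfix/listserv_aliases
        {
            printf 'listserv: '
            mysql -N -B -e 'SELECT email FROM subscribers' maillist | paste -sd, -
        } > "$ALIASES"
        postalias "$ALIASES"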

    Read the article

  • MSN 2011 extremely laggy at times?

    - by chobo2
    Hi, I am using MSN 2011 and think they did a terrible job on it. It seems so buggy. At times, typing a message is like watching a slide show: if I type a sentence you can see it really struggle, and the sentence shows up seconds after I typed it. This only happens sometimes. I heard this has something to do with some ads making it slow. Does anyone have a fix for this? Also, I noticed that about 70% of the time when I check auto-login (remember my password and username), it forgets that I checked the box the next time I start my computer, so I end up typing in my username and password again. Thanks

    Read the article

  • Make a mosaic picture out of 900 images

    - by unor
    I have ~900 pictures (mostly photos) of varying sizes. Now I want to create one big picture that includes all 900 pictures at a small, thumbnail-like resolution. The tool should automatically resize the thumbnails so that everything fits. All pictures should be placed next to each other, so there should be no border/padding. Each picture should be included exactly one time! (However, it would be okay to omit a few so that there is no empty space.) I'm looking for a tool (FLOSS, for GNU/Linux) that can do the job. I tried Metapixel, but it needs an input image that should be "resembled" by the thumbnails, and it only uses a selection of all the pictures. I found montage, but couldn't get a result yet, because my system was unresponsive for hours after starting it. Maybe there are some tweaks? AndreaMosaic is recommended in this answer, but it's not FLOSS (and needs Wine to run on GNU/Linux).
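
    For reference, an ImageMagick montage invocation for roughly this layout, with resource limits that may help with the unresponsiveness; the tile count and thumbnail size are arbitrary:

        # 30x30 grid of 900 thumbnails, no border/padding between cells
        montage -limit memory 256MB -limit map 512MB \
                -tile 30x30 -geometry 120x90+0+0 \
                *.jpg mosaic.png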

    Read the article

  • crontab still sending emails even with > /dev/null

    - by user2344668
    I have a root crontab that runs a script with output redirected to /dev/null, but I always get the emails whenever it runs. I only want to receive error emails.

        # Rackspace driveclient update (12pm MST)
        0 12 * * * /root/scripts/driveclient-update > /dev/null

    The only way I can get the mail to stop entirely is to use > /dev/null 2>&1, but then I won't get error emails either. This is happening on three different CentOS servers: two are 6.3 and one is 6.4. NOTE: I have read over and over that > /dev/null is supposed to send stdout there and prevent the email if the script produces nothing but stdout, so it works for at least some people; I cannot figure out why it is not working on these servers. Here's an example of where /dev/null is supposed to work: http://www.alphadevx.com/a/384-Suppressing-Cron-Job-Email-Notifications
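
    For reference, the difference between the two redirections (cron mails anything the job writes to either stream, so if mail still arrives with only stdout redirected, one likely explanation is that the script writes to stderr even on success):

        # stdout discarded; stderr still triggers a cron mail (errors only)
        0 12 * * * /root/scripts/driveclient-update > /dev/null
        # both streams discarded; no mail at all, not even on errors
        0 12 * * * /root/scripts/driveclient-update > /dev/null 2>&1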

    Read the article

  • stsadm farm backup exits with ffffffff

    - by overbyte
    I have a SP2007 farm that uses stsadm through Scheduled Tasks to run farm backups. It always worked fine; however, one day it ran for a couple of seconds and just exited with code ffffffff. I looked at Event Viewer and the SharePoint logs themselves, and nothing unusual happened at the time this job ran. No files were created, so there is no spbackup.log. I searched the net for batch files and STSADM return codes, but this error code doesn't even seem to exist. Any other recommended places to look for issues like this?

    Read the article

  • Is there a chroot build script somewhere?

    - by Nils
    I am about to develop a little script to gather information for a chroot jail. In my case this looks (at first glance) pretty simple: the application has a clean RPM install and put almost all of its files into a sub-directory of /opt. My idea is: do a find of all binaries; check their library dependencies; record the results in a list; rsync that list into the chroot target directory before starting the application. Now I wonder: is there any script around that already does such a job (perl/bash/python)? So far I have found only specialized solutions for single applications (like sftp-chroot). Update: I see three close votes for the reason "off topic". This question arose because I have to install that ancient piece of software on a server at work. So if you still feel this is off-topic, leave a comment...
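
    A rough sketch of the find/ldd/rsync idea above; the application and jail directories are placeholders:

        #!/bin/bash
        APP_DIR=/opt/myapp          # where the RPM installed the application
        CHROOT=/srv/chroot/myapp    # target jail directory
        BINS=$(mktemp); FILES=$(mktemp)
        # 1. find all executables, 2. add the shared libraries they depend on
        find "$APP_DIR" -type f -perm -u+x > "$BINS"
        cp "$BINS" "$FILES"
        while IFS= read -r bin; do
            ldd "$bin" 2>/dev/null | awk '/=> \//{print $3} /^[[:space:]]*\//{print $1}'
        done < "$BINS" >> "$FILES"
        sort -u "$FILES" -o "$FILES"
        # 3. copy everything into the jail, preserving the directory structure
        rsync -a --files-from="$FILES" / "$CHROOT"
        rm -f "$BINS" "$FILES"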

    Read the article

  • How do I change the output line length from the "top" linux command running in batch mode

    - by Tom
    The following command is useful for capturing, in a file, the current processes that are taking up the most CPU: top -c -b -n 1 > top.log. The -c flag is particularly useful because it gives you the command line arguments of each process rather than just the process name. The problem is that each line of output is truncated to fit the current terminal window. This is okay if you can have a wide terminal, because you get most of the output, but if your terminal is only 165 characters wide, you only get 165 characters of information per process, which is often not enough to show the full process command. This is a particular problem when the command is executed without a terminal, for example from a cron job. Does anyone know how to stop top truncating data, or how to force top to display a certain number of characters per line? This is not urgent, because there is an alternative method of getting the top 10 CPU-using processes: ps -eo pcpu,pmem,user,args | sort -r -k1 | head -n 10
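
    With many procps builds the batch-mode width follows the COLUMNS environment variable (newer procps-ng top also has a -w width option), so something like this may avoid the truncation:

        COLUMNS=512 top -c -b -n 1 > top.log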

    Read the article

  • nginx not returning 304 on cached content

    - by Don H
    I'm using nginx as a reverse proxy with an Apache back-end handling some PHP files. The files return the right expiry headers and proxy_cache does a good job of caching them, but I've noticed that the cached content returns a 200 on every refresh, when it might be more efficient to return a 304 for the cached files. The files in question are generated by PHP, and the URLs do not contain .php because they've been prettified. Any idea why nginx might not be returning 304 on repeated visits to cached PHP output? To clarify: it's using proxy_cache to cache dynamic PHP pages (not static HTML pages generated by PHP). I'm setting Expires headers in the PHP file of time + 24 hours. With that in mind, I was hoping nginx would be able to return 304s for its cached versions during that 24-hour window.

    Read the article

  • Get-Mailbox not returning all mailboxes

    - by rotard
    I am trying to set up an Exchange mailbox backup job with Vembu StoreGrid, and StoreGrid is unable to list the mailboxes for the client. While troubleshooting the issue, I noticed something else: running the Get-Mailbox command on the mail server as the backup user only shows the mailbox for that account, while running Get-Mailbox as my admin account returns a list of what appears to be all the mailboxes. My service account is a member of "Administrators", "Domain Admins", and "Domain Users". What additional permissions might be required to list all mailboxes in the system?

    Read the article

  • Bacula configuration for clients that are turned on and off randomly

    - by Rastloser
    I'm evaluating Bacula as a centralized backup tool for a small network where users will turn machines on and off unpredictably. Some of the headless Linux boxes I need to back up are intended to be turned off by pressing the on/off-button on the case, without any way of telling the user to wait for a backup job to finish. So, we don't know when backup jobs may run (anacron might help with this, right?) and we don't know whether they'll be allowed to finish. Is Bacula a reasonable choice for such an environment?

    Read the article

  • limit the speed of writing files to NFS

    - by xgwang
    CentOS 5.6. An NFS share is mounted on the server for backup disk space. When the backup job starts, it can reach 80 MB/s, and we really did not expect it to take that much bandwidth, so I need to find a way to limit the speed of writing to NFS. I tried rsync with --bwlimit=5000; that did limit the reading speed, but the accumulated data was still written at 80 MB/s, with no write activity at all for seconds in between. Is there any way to limit the writing speed to NFS?

    Read the article

  • RAID 5 SCSI fault

    - by HaLaBi
    I do not have much knowledge about servers, and I spent all day looking around the internet for a solution to my RAID 5 problem. All of a sudden, two disks failed. The server won't boot (HP ProLiant, Windows 2003 R2; very old, maybe 10 years). I know that if one disk is faulty I can add a new disk, rebuild it, and things will be fine; the problem is that two went faulty :( Is this normal, two at the same time? Is there anything else I can do that I am not aware of, other than taking them out and reinserting them? Windows won't boot. The array menu shows that disks 0 and 4 are "Missing". Any other tricks or things to try? It is important because, for some unknown reason, the backup job did not work for a month and I just found out, so I need to bring this RAID 5 array back online.

    Read the article
