Search Results

Search found 9464 results on 379 pages for 'cron task'.


  • Is there a lightweight MTA for Ubuntu 9.10 Desktop?

    - by Joe Casadonte
    I'm writing a Perl script to run as a cron job, and I want to email results & errors to a local account on the laptop. I'd like something that can talk SMTP (do any MTAs not adhere to SMTP?). I use Thunderbird 3, so I'll also need a POP/IMAP server (unless T-Bird can read straight from an mbox file; I'll have to check into that). No need for spam controls as I'll lock it down real tight, only accepting mail originating from the laptop itself. Thanks!
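
    A hedged sketch, assuming a local-only MTA (e.g. Postfix restricted to localhost) ends up installed: cron already mails each job's output through the local sendmail interface, so the Perl script may not need any SMTP code at all. The script path and user below are hypothetical.

        # /etc/cron.d/report -- cron mails the job's stdout/stderr to MAILTO
        # via the local MTA's sendmail interface
        MAILTO=joe
        0 7 * * * joe /home/joe/bin/report.pl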

    Read the article

  • Backup Script - Could Not Open Input File

    - by Iestyn
    This is the backup script that I've got going: http://pastebin.com/4g4E6wUz This is the cron info: /usr/local/bin/php /home/backups/backup-db.php --filename-dated ALL No matter what I do, I keep getting this error: Could not open input file: /home/backups/backup-db.php - and that is the correct location of the file. I just don't know what else to try; I feel I've been working on this for so long that I've explored every avenue. On the other hand, sometimes I think the time I've spent on it is clouding my thoughts and I'm missing something stupidly obvious. Just wondering if someone can give me a few pointers? Also, on a last note, does anyone know of a way/article to auto-generate a full backup of cPanel every * amount of days and store it in a location that I want? Kind regards.
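
    A few pointers, as a hedged debugging sketch: "Could not open input file" from the PHP CLI almost always means the file isn't readable in the cron job's environment (wrong user, permissions on /home/backups, or invisible characters in the crontab line), so reproducing the command outside cron narrows it down:

        # run exactly what cron runs, as the user the cron entry belongs to
        sudo -u root /usr/local/bin/php /home/backups/backup-db.php --filename-dated ALL
        # confirm the path, permissions and ownership cron will see
        ls -l /home/backups /home/backups/backup-db.php
        # expose invisible characters (CR, tabs) lurking in the crontab line
        crontab -l | cat -A | grep backup-db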

    Read the article

  • automated printouts from a wireless printer

    - by Piotr
    I have a wireless printer which is always on, and an always-on fanless Linux server. Looking at the mprinter project on Kickstarter, I started to wonder if there is already an app somewhere on the internet that can prepare an automated daily printout based on some settings. Things to be printed could include the weather forecast for my location, todos scheduled for that day, a "quote of the day" or "word of the day", stats from Google Analytics for my site, and many more... I would schedule a printout at 6:15 every work day so it's on my printer by the time I am up, having a coffee. Does anyone know of something that can be used for this purpose? While I know this can be done by combining the power of TeX, cron and a scripting language to manage the dynamic parts of the PDF, I believe this is a use case someone has already addressed.
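
    Failing a ready-made app, a minimal sketch of the cron-plus-script approach described above; the data sources, paths and schedule are assumptions, and lp submits to the default CUPS printer:

        #!/bin/sh
        # build today's briefing as plain text and send it to the printer
        OUT=/tmp/briefing-$(date +%F).txt
        {
            date '+%A, %d %B %Y'
            curl -s 'http://example.com/weather-feed'      # hypothetical source
            cat /home/piotr/quote-of-the-day.txt           # hypothetical file
        } > "$OUT"
        lp "$OUT"
        # crontab: 15 6 * * 1-5 /home/piotr/bin/briefing.sh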

    Read the article

  • Windows VPS running Apache and MySQL, PHP scripts running slow... but CPU usage is 1-3%

    - by Roeland
    So every night I run some cron jobs. It takes about 20 minutes to process all the records; I gather the script does something like 10,000 SQL queries. I figured the task was just that intense and needs time to complete, but I looked at CPU and memory usage, and they are super low. CPU usage is between 1-3% and once in a while bounces to 50-ish for 2-3 seconds. This VPS is running Windows Server 2003 with Apache and MySQL. Does this sound right?
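
    As a sanity check on the numbers: 20 minutes for roughly 10,000 queries works out to about 120 ms per query (1,200 s / 10,000), which is consistent with a job bound by per-query latency (round-trips and disk seeks) rather than by CPU - and that would explain the 1-3% usage.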

    Read the article

  • Is there a way to rsync in batches?

    - by Chris
    I have a huge chunk of data (11G) in a Subversion repository that I'm migrating to Alfresco with rsync; Alfresco's Lucene index picks up new files as they hit the file system. I'm using a dav mount as a proxy to allow me to rsync. The issue I'm having is that the post-rsync indexing is quite an expensive operation for such a huge chunk of data, so I was wondering whether there's a way I could logically separate the rsync into equally sized batches (say 500MB each) so I could schedule them in cron. At the moment, I'm traversing the top-level folders and taking the smallest ones across first, but once I'm done with those, the much larger sub-directories are going to be quite troublesome. Please let me know if you need any further info. Thanks in advance.
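
    A hedged sketch of one way to batch it, assuming the dav mount is at /mnt/alfresco and a hypothetical source root: build a file list once, split it into fixed-size slices, and feed each slice to rsync via --files-from (split slices by line count, so the byte size per slice is only approximate):

        cd /srv/svn-data                              # hypothetical source root
        find . -type f > /tmp/all-files.txt
        split -l 5000 /tmp/all-files.txt /tmp/batch.  # tune -l toward ~500MB/slice
        for list in /tmp/batch.*; do
            rsync -a --files-from="$list" . /mnt/alfresco/
        done

    Each slice could equally be its own cron entry, spreading the indexing cost across the day.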

    Read the article

  • chroot for insecure program execution

    - by attwad
    Hi, I have never set up a chroot-jailed environment before and I am afraid I need some help to do it well. To explain shortly what this is all about: I have a webserver to which users send Python scripts to process various files that are stored on the server (the system is for research purposes). Every day a cron job starts the execution of the uploaded scripts via a command of this kind: /usr/bin/python script_file.py All of this is really insecure, and I would like to create a jail into which I would copy the necessary files (uploaded scripts, files to process, the Python binary and its dependencies). I have already looked at various utilities for creating jails, but none of them seemed up to date, or they lacked solid documentation (i.e. the links proposed in How can I run an untrusted python script). Could anyone guide me to a viable solution to my problem, like a working example of a script that creates a jail, puts some files in it and executes a Python script? Thank you very much.
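
    A bare-bones sketch of building such a jail by hand (paths and the Python version are assumptions, and chroot --userspec needs a reasonably recent GNU coreutils); note that a chroot alone is not a strong security boundary, so the script should still run as an unprivileged user inside it:

        JAIL=/srv/pyjail
        mkdir -p "$JAIL/usr/bin" "$JAIL/usr/lib" "$JAIL/work"
        cp /usr/bin/python "$JAIL/usr/bin/"
        cp -r /usr/lib/python2.6 "$JAIL/usr/lib/"     # stdlib; version assumed
        # copy the shared libraries the interpreter links against
        for lib in $(ldd /usr/bin/python | awk '{print $(NF-1)}' | grep '^/'); do
            cp --parents "$lib" "$JAIL"
        done
        cp uploaded_script.py "$JAIL/work/"
        chroot --userspec=nobody:nogroup "$JAIL" /usr/bin/python /work/uploaded_script.py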

    Read the article

  • User account automatically filling up with dead.letter file

    - by jeroen
    I have one user account, on a server with about 400 accounts, that is filling up automatically. The dead.letter file in the user's home directory grows on its own until the account is full (about 10-40 MB per day). The user is using Microsoft Outlook to send and receive mail. What can be causing this, and how can I keep it from happening? Right now I have an emergency cron job to delete the file, but I would like a "real" solution. Edit: The server version is Red Hat Enterprise Linux ES release 4 (Nahant Update 4). Edit 2: It seems to be mainly spam; I see various mailer headers (from PHP to Outlook Express), and a frequently appearing header is [email protected] Update: I have asked the hosting provider of the dedicated server to look into the problem as well, as it's their control panel that could be a cause of the problem.
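
    Until the root cause is found (headers like these often point at a compromised script or account whose outgoing spam is bouncing), a slightly safer stopgap than a blind delete is a size-gated cron job; a sketch, with the username hypothetical:

        # root crontab: drop dead.letter hourly, but only once it exceeds 1MB
        0 * * * * find /home/theuser -maxdepth 1 -name dead.letter -size +1M -delete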

    Read the article

  • VPS running out of memory, 200MB free

    - by demon
    At the beginning of this year I got a VPS for my website because I was running into the resource limits of shared hosting. Here is what I know: 2GB memory, with 1GB swap; Debian x64 server edition installed. Software running on the webserver: mysql, apache, postfix, pop3, imap, amavisd, clamd, cron, fail2ban, munin-node, pure-ftpd, spamd, nginx. Now for the setup: nginx listens on port 80 and handles the static files; the PHP side is done by Apache running mod_php in combination with APC (no variable caching!). I am running a pretty busy Drupal and phpBB stack on the server: for Drupal I am using Boost and Authcache on a Pressflow stack to handle the load, and phpBB is just phpBB3 with some mods installed, with at most 30 users online at a time. The problem is that it starts using swap a few days after a reboot, and the site becomes slower. I've attached screenshots from Monit and Munin, so maybe somebody can help me out...
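
    A first diagnostic, as a sketch: identify the resident-memory hogs, since Apache with mod_php in 2GB usually needs its worker count capped (the numbers below are placeholders to tune against Munin's graphs, not recommendations):

        ps aux --sort=-rss | head -n 15      # largest resident-memory consumers
        # if apache prefork workers dominate, cap them in apache2.conf, e.g.:
        #   <IfModule mpm_prefork_module>
        #       MaxClients          40
        #       MaxRequestsPerChild 2000
        #   </IfModule>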

    Read the article

  • Backup of images

    - by Sam Kong
    I've just installed Ubuntu on a file server. It will share a folder (Samba) and employees of my company will save photos on it. Currently the photos total about 100GB, and every day about 20MB is added. My question is about a backup plan. I want to back up the photos to a remote server using a cron job. I can think of two options: rsync, or git. The image files won't be changed, so rsync will do. But people keep telling me I should git all my data. What would you do? Thanks. Sam
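
    Since the files are write-once, a nightly rsync is likely enough; git adds version history these photos will never need and keeps a second full copy of every binary in .git. A minimal sketch, with paths and host hypothetical:

        # crontab: at 02:00, push only files the remote doesn't have yet
        0 2 * * * rsync -a --ignore-existing /srv/photos/ backup@remote.example:/srv/photos/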

    Read the article

  • How to automatically set default quota limits for users on XFS filesystem, when the new account is created

    - by acidburn2k
    I guess the title explains the problem pretty well. Do you have an idea for a mechanism that will automatically assign default quota values to every new account created (sort of like the skel scheme works, but in this area)? I am looking for a generic, clean solution, not ugly cron-based scripts or wrapper scripts around user creation. I would also like to avoid any external, unmaintained stuff (like forgotten PAM modules and such). Anything that could lead to overhead and extra work in the future isn't really a solution, nor is checking for new accounts every minute.
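
    On XFS this may need no per-account hook at all: xfs_quota can store a default limit on the filesystem itself, which every user without an explicit quota inherits, including accounts created later. A sketch, assuming /home is an XFS mount with user quotas (uquota) enabled:

        # -d applies the limits as the defaults rather than to a single user
        xfs_quota -x -c 'limit -u bsoft=4g bhard=5g -d' /home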

    Read the article

  • Looking for a Linux stream ripper that can be scheduled

    - by Anthony D
    I have an MP3 stream I want to schedule a recording of. I can do it using wget to a file; it's just a straight MP3 stream. However, I'd like to use a command-line stream ripper that will do a better job. Anyone know of one? Update 1: wget grabs whatever part of the stream it comes in on, which may not be the start of a frame in the MP3 file. Also, wget is not really schedule-ready: I experimented with starting it with a cron job and then killing it later, and this produced a file that didn't really start and stop where I wanted.
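
    streamripper is one command-line ripper that understands MP3 framing and so starts files on a frame boundary; a hedged scheduling sketch (URL and paths are placeholders, and % must be escaped in crontabs): -l caps the recording length in seconds and -a rips to a single file instead of splitting on track metadata.

        # record one hour starting at 14:00 daily
        0 14 * * * streamripper http://example.com/stream.mp3 -l 3600 -a show-$(date +\%F).mp3 -d /home/anthony/rips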

    Read the article

  • Scripting the Mongo shell

    - by cKendrick
    On my production stack, I have a front-end server and a Mongo server. I would like to be able to set up a cron job on the front-end server to create some logs daily. I wrote a script that does this: ./mongo server:27017/dbname --quiet my_commands.js If I run it from the Mongo server as above, it works fine. However, I would like to be able to run it from the front-end server. When I try to do that, I get: -bash: mongo: command not found Since mongo is not installed on the front-end server, it gives me that error. Is it possible to somehow point mongo at the installation on the Mongo server?
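
    Two common routes, sketched under assumptions: install only the mongo shell client on the front-end (it speaks to server:27017 over the wire; no local server required), or leave the script on the Mongo host and invoke it over ssh from the front-end's crontab (key-based ssh and paths assumed):

        # front-end crontab: run the report remotely at 01:00
        0 1 * * * ssh mongo-host '/usr/bin/mongo localhost:27017/dbname --quiet /home/ck/my_commands.js' >> /var/log/mongo-report.log 2>&1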

    Read the article

  • I'd like to archive files from Ubuntu to Windows between two computers on a shared home network

    - by Wabbitseason
    I have an old laptop running Ubuntu 9.10 which I use as a LAMP environment for web development, and I have a comfortable, powerful desktop computer with Windows 7 installed on it. These two are connected to a home router so both can access the internet. I have been able to set up Samba so I can mount my Apache home directory; it is accessible from Windows and is mapped as a network drive. What I'd like to do is access some Windows folders from Linux so I could automatically create backups (with cron scripts) of my work to physically different locations on the Windows box. Perhaps at a later time I'd set up a local Subversion repository, but I'd love to keep backups of that on the Windows drives too. Using Ubuntu's Places/Network menu I can see my desktop, but I'm unable to log in to it despite having created the correct username and password on Windows. All I can get is the following error message: "Unable to mount location. Failed to retrieve share list from server." What could be misconfigured?
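
    Whatever is breaking share-list browsing, mounting the Windows share directly by IP sidesteps it, and the backup half is then a normal cifs mount plus a cron'd rsync. A hedged sketch (IP, share name and credentials are placeholders; on Ubuntu 9.10 the smbfs package provides mount.cifs):

        sudo apt-get install smbfs
        sudo mkdir -p /mnt/winbackup
        sudo mount -t cifs //192.168.1.10/Backups /mnt/winbackup \
            -o username=wabbit,password=secret,iocharset=utf8
        # then in crontab: 0 * * * * rsync -a /var/www/ /mnt/winbackup/www/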

    Read the article

  • Can expire_logs_days be less than 1 day in MySQL?

    - by Scott
    So... yesterday I received an "after the fact" email about a campaign that has started for one of the services I run. Now the DB server is getting hammered, hard, to the tune of about 300MB/min of binary logging for the replica. As you can imagine, this is chewing up space at a fairly tremendous rate. My normal 7-day expiry of binary logs just isn't cutting it. I've resorted to truncating the logs to just the last 4 hours (verifying that replication is up to date with mk-heartbeat) with: PURGE MASTER LOGS BEFORE DATE_SUB( NOW(), INTERVAL 4 HOUR); I'm just running that from cron every few hours to weather the storm, but it made me question the minimum value for expire_logs_days. I haven't come across a value less than 1, but that doesn't mean it isn't possible. http://dev.mysql.com/doc/refman/5.0/en/server-system-variables.html#sysvar_expire_logs_days gives the type as numeric, but doesn't indicate whether it expects integers.
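
    For what it's worth, the 5.0/5.1 manuals list expire_logs_days as an integer with a range of 0 to 99, so sub-day values aren't accepted, and a cron'd PURGE like the one above is the usual workaround. A sketch of the crontab form (credential handling elided):

        # every 2 hours, keep only the last 4 hours of binary logs
        0 */2 * * * mysql -e "PURGE MASTER LOGS BEFORE DATE_SUB(NOW(), INTERVAL 4 HOUR);"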

    Read the article

  • Auto-scaling EC2 Servers and Updating Code

    - by jstats
    We've come to the point where we need to set up autoscaling for our web server, and I'm unsure how to go about scaling servers and updating the existing code without building a new AMI and changing the autoscaling config to use it. I've read a bit about people bundling the new code, uploading it to S3 and having new servers grab the bundle on boot, but that doesn't seem all that pleasant either. Currently the web app's files live in a git repo; when we update the code, we push it to GitHub, ssh into the web server and run a hook to bring down the latest code. So I was thinking that another option could be to just run that hook as an hourly or daily cron task. Unfortunately that doesn't cover every type of update (for example new blog posts' images and such, which aren't included in the git repo), but it's something. Could anyone provide some advice on what a common solution is, or on why my proposed solution is a bad idea? Thanks all.
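
    One common middle ground, sketched under assumptions (repo path, branch and bucket are hypothetical): keep the AMI generic and have a first-boot script pull the current code, with non-repo assets synced from S3, so instances launched by autoscaling come up current without a new AMI:

        #!/bin/bash
        # runs once at boot, via user-data or an init script baked into the AMI
        cd /var/www/app && git pull origin master
        # assets that live outside the repo (uploads, blog images, ...)
        s3cmd sync s3://my-app-assets/ /var/www/app/assets/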

    Read the article

  • In Linux: how to exit a program without killing it?

    - by biomed
    I use Ubuntu 10.10 and I have a Python program (Mnemosyne) whose data files I synchronize using Dropbox. Here is my problem scenario: I leave the program running at home and go to work; if I then open the program at work and work in it, the data file is changed, and I lose that progress when the copy still running at home exits, because it automatically saves (and overwrites the synced file) on exit. I thought I could create a cron job to close Mnemosyne automatically every morning, regardless of whether I remember to do it, but if I use kill, the program exits without saving the data file, and I end up with a tmp file and an error message when I restart it. Is there a better way of sending the exit signal to this program, emulating me clicking the File > Exit menu option? Thanks.
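
    For a GUI program, the graceful equivalent of File > Exit is asking the window manager to close the window; the app then runs its normal save-on-exit path. A hedged sketch using wmctrl - the window title and display number are assumptions, and the entry belongs in the desktop user's crontab:

        # 08:00 on weekdays: politely close the Mnemosyne window on the local display
        0 8 * * 1-5 DISPLAY=:0 wmctrl -c "Mnemosyne"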

    Read the article

  • smarter OS X smart folder

    - by vectorizor
    Hi all, Smart folders on OS X are nice and all, but you can only access them from the Finder sidebar, and nowhere else (or am I wrong?). A better way would be for them to appear as normal folders in the file system, so they are available from anywhere, say the command line for instance. (Or, if you had a smart folder that finds movies/music in ~/Downloads and added a shortcut to it in ~/Movies, you could see your music/movies from Front Row.) The smart folder wouldn't have to be refreshed immediately, maybe every hour or so from a cron/launchd job. Any way to do that? A
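
    One way to fake it, as a sketch: re-run the Spotlight query behind the smart folder with mdfind from an hourly cron/launchd job and mirror the results as symlinks in a real directory (query and paths are examples):

        #!/bin/sh
        DIR="$HOME/Movies/from-downloads"
        mkdir -p "$DIR" && rm -f "$DIR"/*
        mdfind -onlyin "$HOME/Downloads" 'kMDItemContentTypeTree == "public.movie"' |
        while read -r f; do
            ln -s "$f" "$DIR/"
        done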

    Read the article

  • Copying files between servers by creation time

    - by driftux
    My bash scripting knowledge is very weak, which is why I'm asking for help here. What is the most effective bash script, performance-wise, for finding and copying files from one Linux server to another, per the specification below? I need a bash script which finds only new files, created on server A within the last 10 minutes, in directories named "Z", and transfers them to server B. I think it can be done by building a command and executing it for each new file found: scp /X/Y.../Z/file root@hostname:/X/Y.../Z/ If the script finds no matching remote path on server B, it should move on and copy the next file whose directory does exist. Files should be copied with permissions, group, owner and creation time preserved. X/Y... stand for various directory paths. I want to set up a cron job to execute this script every 10 minutes, so performance is very important in this case. Thank you.
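
    A minimal sketch under those constraints; note that Linux filesystems generally don't record creation time, so mtime (-mmin) is the usual proxy, and scp -p preserves modes and times but not owner/group (rsync -a would preserve those too, at the cost of creating missing remote directories instead of skipping them). /X, Z and hostname are the placeholders from the question:

        #!/bin/bash
        # run from cron every 10 minutes
        find /X -type d -name Z | while read -r dir; do
            find "$dir" -maxdepth 1 -type f -mmin -10 | while read -r f; do
                # scp fails harmlessly when the matching path is absent on B
                scp -p "$f" "root@hostname:$f"
            done
        done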

    Read the article

  • How to automatically print the contents of a folder in OS X?

    - by MDRoz
    I would like to be able to set up a folder on my Mac, where I could dump files and have them be automatically sent to the default printer. This way I would be able to print files at home when I'm not physically at home, using something like Dropbox. It wouldn't have to be real-time; i.e., a scheduled job that checks the folder every so often would be acceptable. What's the easiest approach? Automator? Applescript? Cron job?
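
    All three would work; a plain shell script on a schedule is the simplest, while an Automator folder action is the more Mac-native route. A sketch with hypothetical paths (lpr submits to the default CUPS printer):

        #!/bin/sh
        # print everything dropped into the queue folder, then archive it
        QUEUE="$HOME/PrintQueue"
        mkdir -p "$QUEUE/done"
        for f in "$QUEUE"/*; do
            [ -f "$f" ] || continue
            lpr "$f" && mv "$f" "$QUEUE/done/"
        done
        # crontab: */10 * * * * "$HOME/bin/printqueue.sh"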

    Read the article

  • RSS downloader script

    - by The Digital Ninja
    I have a Linux-powered Synology NAS at my house. I'm looking to set up a cron script that checks a group of RSS feeds and auto-downloads new video podcasts to a shared folder. I can do most of the scripting, such as deleting files older than 3 weeks and the wget parts, but I'm not sure how to parse the RSS feeds and check dates so as to grab only the latest episodes. I figured it's best not to re-invent the wheel, and surely someone out there has a command-line RSS downloader or some such script. Any ideas?
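
    A hedged sketch of the missing piece: podcast feeds carry each episode's download URL in an <enclosure> tag, so xmllint can extract them and wget -nc (no-clobber) skips anything already fetched, which stands in for explicit date checking. The feed URL and paths are placeholders, and xmllint may need installing on the Synology:

        #!/bin/sh
        DEST=/volume1/video/podcasts
        for feed in http://example.com/videocast.rss; do
            curl -s "$feed" |
            xmllint --xpath '//enclosure/@url' - |
            sed 's/ url="/\n/g; s/"//g' | grep '^http' |
            while read -r url; do
                wget -nc -P "$DEST" "$url"
            done
        done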

    Read the article

  • Mailing list with dynamically generated addresses

    - by Joe Tomasone
    I am trying to implement a dynamic mailing list from a database that changes quite often. Conditions:
      - Postfix is the MTA.
      - Email addresses are in a MySQL database.
      - Postfix only allows senders whose addresses are in that database (via smtpd_sender_restrictions).
      - A cron job extracts the current addresses from the database nightly, puts them into an alias file, then runs postalias on it.
    This works well, but since the sender remains the subscriber's own address, many domains reject the email because my server is not a DNS-listed mail server for the sender's domain. So I either have to find a way to rewrite the outgoing sender as "listserv@mydomain", or find a mailing list package that will use database-retrieved emails (either queried directly or from a flat file) as the subscriber list, with that list replaced daily. I've tried Sympa and am pretty much ready to give up on it - it's a nightmare to get working right - but it's the only open-source list server I have seen that works with dynamic mailing lists. Does anyone have any ideas? Thanks, Joe
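
    If the aim is simply to masquerade every outgoing sender as the list address, Postfix can do that without a list manager via sender_canonical_maps; a sketch, assuming this instance relays only list traffic (otherwise the catch-all pattern below is far too broad):

        # main.cf
        sender_canonical_maps = regexp:/etc/postfix/sender_canonical

        # /etc/postfix/sender_canonical: rewrite every envelope sender
        /./    listserv@mydomain.example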

    Read the article

  • crontab still sending emails even with > /dev/null

    - by user2344668
    I have a root crontab that runs a script with output redirected to /dev/null, but I always get the email whenever it runs, and I only want to receive error emails. # Rackspace driveclient update (12pm MST) 0 12 * * * /root/scripts/driveclient-update > /dev/null The only way I can get the mail to stop is to use > /dev/null 2>&1, but then I won't get error emails either. This is happening on three different CentOS servers: two are 6.3 and one is 6.4. NOTE: I have read over and over that > /dev/null is supposed to send stdout there and prevent the email when the script produces nothing but stdout, so it works for at least some people; I cannot figure out why it is not working on these servers. Here's an example of where /dev/null is supposed to work: http://www.alphadevx.com/a/384-Suppressing-Cron-Job-Email-Notifications
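
    The usual explanation: > /dev/null discards only stdout, and cron mails whatever is left, so mail arriving despite the redirect means the script writes its routine output to stderr. A quick sketch to confirm, plus one workaround:

        # run by hand with stdout discarded; anything still printed is stderr
        /root/scripts/driveclient-update > /dev/null
        # workaround: capture stderr, discard stdout, and emit the capture
        # (which cron then mails) only when the script exits non-zero
        0 12 * * * out=$(/root/scripts/driveclient-update 2>&1 >/dev/null) || echo "$out"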

    Read the article

  • How do I change the output line length of the "top" Linux command running in batch mode

    - by Tom
    The following command is useful for capturing the processes currently using the most CPU in a file: top -c -b -n 1 > top.log The -c flag is particularly useful because it gives you the command-line arguments of each process rather than just the process name. The problem is that each line of output is truncated to fit the current terminal width. This is OK if you have a wide terminal, but if your terminal is only 165 characters wide, you only get 165 characters of information per process, which is often not enough to show the full process command. It is a particular problem when the command is executed without a terminal, for example from a cron job. Does anyone know how to stop top truncating data, or how to force top to display a certain number of characters per line? This is not urgent, because there is an alternative way to get the top 10 CPU-using processes: ps -eo pcpu,pmem,user,args | sort -r -k1 | head -n 10
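
    In batch mode, procps top takes its line width from the COLUMNS environment variable instead of a terminal, so widening that is usually all it takes (512 is an arbitrary generous value):

        COLUMNS=512 top -c -b -n 1 > top.log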

    Read the article

  • Is there any way to remotely configure a Microsoft Lync account?

    - by John O
    There are no Perl modules for Lync, and no open-source clients. Windows PowerShell can do some things with it, but only on the machine where the server software is installed. It would be useful to be able to forward a certain desk phone number (we use Lync for VoIP) to a personal cell phone. We can do this from our own desktop machines, but only using the Lync client. It would be nice to have a cron script that just did the rotations; then I wouldn't have to carry the lousy on-call phone around with me. communicator.exe doesn't take any useful parameters, nor are there any obvious function names in the DLLs that would let me use rundll32.exe to accomplish this. There is a Lync SDK, but no examples of changing call forwarding, and my Windows 7 machine refuses to install the Silverlight SDK dependency for some reason I can't fathom. Does anyone have any other ideas how I might accomplish this?

    Read the article

  • Do best-practices say to restrict the usage of /var to sudoers?

    - by NewAlexandria
    I wrote a package and would like to use /var to persist some data. The data I'm storing would perhaps even be thought of as an addition to /var/db. The pattern I observe is that files in /var/db and its surroundings are owned by root. The primary (intended) use of the package is filtering cron jobs - meaning you would need permission to edit the crontab anyway. Should I presume a sudo install of the package? Should I have the package gracefully degrade to a /usr subdirectory, and if so, which one? If I 'opinionate' that any non-sudo install requires a configrc (with paths), where should the package look for that config file (presuming a shared-host environment)? Incidentally, this package is a Ruby gem, and you can find it here.
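
    As a sketch of one plausible fallback order (shell pseudocode; the directory and file names are illustrative conventions, not mandates): prefer /var/db when it is writable, which implies a root/sudo install, and otherwise fall back to per-user locations, honoring a configrc when one exists:

        # hypothetical data-dir resolution
        if [ -w /var/db ]; then
            DATA_DIR=/var/db/mypkg          # system-wide (sudo) install
        elif [ -f "$HOME/.mypkgrc" ]; then
            . "$HOME/.mypkgrc"              # user-supplied paths (shared host)
        else
            DATA_DIR="$HOME/.mypkg"         # unprivileged default
        fi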

    Read the article
