Search Results

Search found 3615 results on 145 pages for 'cron daily'.


  • Cron job for list of urls

    - by mathew
    Duplicate of http://stackoverflow.com/questions/2968918/how-do-i-do-cron-job-for-list-of-urls-closed (sorry for the confusion). How do I set up a cron job using a list of URLs? The search path is mentioned below: www.mydomain.com/search?url=www.google.com After google.com, the next URL in urllist.txt should be requested, so by the end of the day the whole list of URLs in urllist.txt will have been processed and saved in my database. What I need is how to feed the URLs from urllist.txt into the search path and put that into a cron job; the rest I can handle (one possible approach is sketched below). Thanks, Mathew

    Read the article
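
    A minimal sketch of one way to do this, assuming urllist.txt holds one URL per line; the script path, list path and schedule below are placeholders, and cron would call the script once a day:

        #!/bin/sh
        # Hit the search endpoint once for every URL listed in urllist.txt.
        while read -r url; do
            curl -s "http://www.mydomain.com/search?url=${url}" > /dev/null
        done < /home/user/urllist.txt

        # Example crontab entry: run the loop daily at 01:00.
        0 1 * * * /home/user/run_urllist.sh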

  • help with inline images/mail with cron - php?

    - by David Verhulst
    I've got mailings that need to be sent using cron. When I load the script manually, everything works fine; with cron I get broken images. To change the src of my img I used: $body = eregi_replace("managersrc_logo","images/managers/acertainlogo.jpg",$body); Because I thought it was important to use absolute paths, I also tried: $body = eregi_replace("managersrc_logo","http://www.site.com/images/managers/acertainlogo.jpg",$body); In that case I do not see the images even when I run the cron script manually, and the automated cron does not display the images either. When I check the source of the received mail I always see "cid:encryptedstuff", even when I use absolute paths. Why is that? I just want my absolute paths printed in the src attribute of the img tag. Who changes my absolute path to cid: ? Is it PHP, PHPMailer or Outlook itself? Any help, someone? David

    Read the article

  • Kohana 3 and CRON always accessing index.php (not following the URI argument)

    - by alex
    OK, I hope this is my last question about cron jobs and Kohana 3 (note: the others are not duplicates, just other problems). Here is my cron job, set up in cPanel: php /home/user/public_html/index.php --uri=properties/update as per this answer. I have set it up so it emails me the output, and it runs every 5 minutes. Unfortunately, it always emails me the source of the home page of my site (index.php or /). I can access that URL fine in my browser, i.e. http://www.example.com/properties/update, and it works and does its job correctly. I can tell the cron job is never hitting the script because I have a file logger in place. Would this have anything to do with .htaccess? Has this happened to anyone before, and how did they fix it? (A possible HTTP-based workaround is sketched after this entry.) Many thanks.

    Read the article
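
    A hedged workaround sketch, assuming the route is reachable over HTTP from the server itself; it sidesteps CLI argument parsing entirely by letting cron request the route the way a browser would (URL taken from the post above):

        # Request the Kohana route every 5 minutes; -q suppresses wget's progress
        # output and -O - writes the response to stdout so cron can email it.
        */5 * * * * wget -q -O - http://www.example.com/properties/update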

  • cron stops running when processing multiple items

    - by James
    Cron stops running (with no visible error) after a few hours when I add more than 5 jobs to my crontab. Each job runs every minute (it polls a webpage for information, which takes 1 second). I tried putting all of my PHP jobs in a shell script and calling that shell script instead, but the same problem occurs: cron stops running, with no error in the log file and no error email sent out either. Has anyone encountered this before? Where/how can I debug this? (A logging sketch follows this entry.)

    Read the article
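
    A minimal debugging sketch, with placeholder paths; wrapping each job so its start time, output and errors are all captured makes it easier to see when and where cron stops:

        # Log a timestamp plus all output (stdout and stderr) for each run.
        * * * * * echo "poll started $(date)" >> /var/log/polljob.log; /usr/bin/php /home/user/poll.php >> /var/log/polljob.log 2>&1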

  • Running CodeIgniter cron on localhost

    - by stef
    I'm trying to get a cron job to run every 5 minutes on my localhost. Using the Cronnix app I entered the following command: 0,5 * * * * root curl http://localhost:8888/site/ > /dev/null The script runs fine when I visit http://localhost:8888/site/ in my browser. I've read some material about getting CI to run from cron, using wget and various other options, but none of it makes a lot of sense. In another SO post I found the following command: wget -O - -q -t 1 http://www.example.com/cron/run What does the "-O - -q -t 1" syntax mean exactly? Are there other options? (A brief annotated sketch follows this entry.)

    Read the article
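
    A short annotated sketch of that wget invocation, plus an every-five-minutes crontab entry using the localhost URL from the post above:

        # -O -   write the fetched page to stdout instead of saving it to a file
        # -q     quiet mode, no progress output
        # -t 1   try the request only once (no retries)
        wget -O - -q -t 1 http://localhost:8888/site/ > /dev/null

        # Equivalent crontab entry firing every 5 minutes:
        */5 * * * * wget -O - -q -t 1 http://localhost:8888/site/ > /dev/null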

  • How to determine bandwidth used by cron job?

    - by Lost_in_code
    I'm not a Unix guy. cPanel does a good job of managing cron jobs, and that is what I use to run dozens of them; combined, they run more than 5000 times every day, and every job makes a call to an external API. How can I check how much bandwidth all the cron jobs are eating? For my website I use AWStats, which shows bandwidth usage et al. Another concern is that I don't want the admins to ban the cron jobs for using too much bandwidth (and CPU), more than what is allocated in my web hosting package. (A rough measuring sketch follows this entry.)

    Read the article
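
    A rough measuring sketch, assuming the jobs can make their API calls through curl; it logs the bytes transferred per call so a daily total can be summed afterwards (URL and log path are placeholders):

        # Append downloaded/uploaded byte counts for each API call to a log.
        curl -s -o /dev/null -w "%{size_download} %{size_upload}\n" "https://api.example.com/endpoint" >> /home/user/cron_bandwidth.log

        # Sum the downloaded bytes recorded so far:
        awk '{sum += $1} END {print sum " bytes downloaded"}' /home/user/cron_bandwidth.log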

  • cron library for java

    - by nutsiepully
    I am looking for a cron expression library in Java, something that can parse cron expressions and return future fire times for the trigger. An API along the lines of: CronExpression cronExpression = new CronExpression("0 30 4 * * *"); List<Date> fireTimes = cronExpression.getFireTimes(todaysDate, nextWeekDate); I don't want to use something as complicated as Quartz. The purpose is basically to use cron like a regex for timings, that's all; I do not want a background scheduler. I tried googling but wasn't able to find anything very helpful. Any suggestions would be appreciated. Regards, Pulkit P.S. I looked at using the CronExpression class out of Quartz; it wasn't very helpful and failed some tests.

    Read the article

  • MySQL equivalent to .pgpass, or automatic authentication in a cron job for mySQL

    - by Ibrahim
    I'm writing a bash script to back up my databases. Most are PostgreSQL, and with Postgres there's a way to avoid having to authenticate: create a ~/.pgpass file containing the Postgres password. I put this in root's home directory and made it chmod 0600, so that root can dump the Postgres databases without having to authenticate. Now I want to do something similar for MySQL, although I only have one MySQL database. How can I do this? I don't want to specify the password on the command line for mysqldump, because this is part of a script that might be somewhat visible to other users. Is there a better way (i.e. built into MySQL) than making a file that only root can read, reading it to get the MySQL password, and then using that in the bash script as a variable? (One commonly mentioned option is sketched below.)

    Read the article
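
    A sketch of the usual MySQL counterpart, an options file that the client tools read automatically; the file below mirrors the .pgpass setup described above (path and credentials are placeholders):

        # /root/.my.cnf, chmod 0600, read automatically by mysql and mysqldump:
        #   [client]
        #   user=root
        #   password=yourpasswordhere

        # The backup line in the cron script then needs no password on the command line:
        mysqldump --all-databases > /root/backups/mysql_$(date +%F).sql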

  • Two parts: linux startup script to connect to bluetooth and cron to keep it connected

    - by D.R.
    I have a mini Bluetooth keyboard and a Raspberry Pi running a Debian-based distro. I know the MAC address of the keyboard, but for this question let's just use AA:BB:CC:DD:EE:FF. Right now I have to have a wired keyboard connected, as well as my Bluetooth dongle for the mini keyboard, and on the wired keyboard I have to run the following when the device boots up: sudo hidd --connect AA:BB:CC:DD:EE:FF If the device goes idle for too long, the Bluetooth connection drops and I have to pull out my wired keyboard and retype that same command. What I'm looking for is a way to have that command run at startup, and a way to sense when it gets disconnected so that it will auto-reconnect. The annoying thing is that the keyboard has to be in pairing mode (even though it has already been paired) when I run that command, otherwise it tells me the host is down; so perhaps the script needs to prevent it from disconnecting due to inactivity, otherwise I'll have to put it back in pairing mode to reconnect. So to recap: a script to connect at startup (I can make sure to put the keyboard into pairing mode before turning it on), and a script to prevent it from disconnecting (maybe some sort of signal to send to it every 60 seconds or something?). Any help with this is greatly appreciated; I've been searching long and hard for an answer. Thanks! (A rough cron-based watchdog is sketched after this entry.)

    Read the article
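
    A rough watchdog sketch that cron could run every minute, assuming this version of hidd supports --show and lists the keyboard's MAC while it is connected (worth verifying on the Pi before relying on it); the script path is a placeholder:

        #!/bin/sh
        # Reconnect the Bluetooth keyboard if hidd no longer lists it.
        MAC="AA:BB:CC:DD:EE:FF"
        if ! hidd --show | grep -qi "$MAC"; then
            hidd --connect "$MAC"
        fi

        # Crontab entry (root) to run the check every minute:
        * * * * * /usr/local/bin/bt-keyboard-watchdog.sh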

  • Minutely cron, ensuring only a single instance

    - by Christopher Nadeau
    Is there a way to run a script every minute (or 2, or 5, etc.), but only if it isn't already running? We have a set of scripts that need to run every minute. Sometimes they start and finish in a second; other times they go on for 5 minutes. Our current way of avoiding simultaneous executions is setting an is_running flag in each script and exiting if it's still enabled, but this is a little unreliable (i.e., a fatal error can leave the flag enabled even after the script has halted). We could write our own little manager, but I'm wondering if there is a more fashionable solution that already exists. (A lock-file sketch follows this entry.)

    Read the article
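
    A minimal sketch using flock from util-linux, assuming it is installed; with -n, a run that finds the lock already held exits immediately instead of queueing, so overlapping executions are simply skipped (paths are placeholders):

        # Only start the script if no other run currently holds the lock.
        * * * * * flock -n /var/lock/minutely-job.lock /usr/local/bin/minutely-job.sh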

  • Why doesn't this cron work?

    - by Alex
    I do "crontab -e" and add the following line: 0 9 * * * /usr/bin/python /home/g1/g1/utils/statsEmail.py > /home/g1/log/statsemail.log But it doesn't work! Why? The script itself works. Also, the log is empty. My other command in crontab is this, and it works: 0 9 * * * /usr/bin/python /home/g1/g1/sphinx/updateall.py > /home/g1/log/updateall.log

    Read the article
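
    A small diagnostic sketch; the entry above only captures stdout, so any Python traceback goes to stderr and never reaches the log. Redirecting both streams usually reveals what is failing:

        # Capture stdout and stderr so import errors and tracebacks land in the log.
        0 9 * * * /usr/bin/python /home/g1/g1/utils/statsEmail.py > /home/g1/log/statsemail.log 2>&1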

  • aws s3 works with script but not on cron

    - by user3800017
    My first post, hopefully not the last. I have a few servers on the AWS EC2 platform, and I made a simple script to back up my custom logs to their S3 storage bucket. The problem is that the script works fine on its own, but when I add it to the crontab the script executes except for the s3 sync/mv part. Here is my code:
        NOW=$(date "+%b_%d_%Y")
        MY_HOSTNAME=`uname -n`
        mv /opt/req/req* /opt/req/bkup/
        mv /opt/response/res* /opt/req/bkup/
        cd /opt/req/bkup/
        tar -cvf ${MY_HOSTNAME}_req_bkup_${NOW}.tar re*
        rm *.txt
        aws s3 mv /opt/req/bkup/* s3://req
    (A PATH-related sketch follows this entry.)

    Read the article
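
    A hedged sketch of the most common cause; cron runs with a minimal PATH, so an aws binary installed under a non-standard prefix is often simply not found. The path below is a placeholder, running which aws on the server gives the real one:

        # Find where the aws CLI lives when the script is run by hand:
        which aws

        # Then either set PATH at the top of the crontab ...
        PATH=/usr/local/bin:/usr/bin:/bin

        # ... or call the CLI by its full path inside the script:
        /usr/local/bin/aws s3 mv /opt/req/bkup/ s3://req --recursive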

  • Let CRON send emails through SMTP (Debian Squeeze)

    - by supernova
    I would like to send an email whenever a cron job has completed, and I read that this is possible with exim4. In /etc/alias I added the line: myuser: [email protected] In /etc/exim4/update-exim4.conf.conf I set dc_smarthost='smtp.myserver.ip::25 and in /etc/exim4/passwd.client I set smtp.myserver.ip:[email protected]:mypassword My problem is that I can't see any login on my mail server, and in the exim logs I saw a few lines such as:
        2012-10-13 09:17:01 1TMvy1-0001fp-F2 ** [email protected] R=nonlocal: Mailing to remote domains not supported
        2012-10-13 09:17:01 1TMvy1-0001fr-JE <= <> R=1TMvy1-0001fp-F2 U=Debian-exim P=local S=17426
    Are there any additional config settings I have to set? Edit: I solved the previous error by running dpkg-reconfigure, but now I'm facing the following error: <root@debian> R=dnslookup T=remote_smtp defer (-53): retry time not reached for any host (A queue-flush sketch follows this entry.)

    Read the article
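
    A small sketch for the 'retry time not reached for any host' part; exim remembers failed delivery attempts and waits out its retry interval, so after fixing the smarthost configuration a forced queue run shows whether delivery works now:

        # Show what is waiting in the queue, then force a delivery attempt for
        # everything, ignoring retry timers (-qff also retries frozen messages).
        mailq
        exim4 -qff -v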

  • [PHP] Dynamic cron jobs?

    - by Frankie Laguna
    I'm writing a script that needs to be called at a random time during the day, but I'm not sure how to accomplish this. I don't want to waste server resources running a cron job every minute, and generating the random times for, say, a month in advance and then creating cron jobs for each of them isn't what I'm looking for either; I want the script to be called at random. Also, the script only needs to be executed once a day. Thanks in advance! (A random-delay sketch follows this entry.)

    Read the article
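
    A common sketch for this pattern: let cron fire once at midnight and sleep a random number of seconds (up to a full day) before running the script, so the effective run time changes every day; the script path is a placeholder:

        # At 00:00, wait a random 0-86399 seconds, then run the script once.
        0 0 * * * sleep $(shuf -i 0-86399 -n 1) && /usr/bin/php /path/to/script.php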

  • /etc/crontab or any user crontab is not being executed

    - by ian
    My server is CentOS 5. When I edit /etc/crontab, or edit any user's crontab (including root's) via the "crontab -e" command, the log just shows "(system) RELOAD (/etc/crontab)" or "(admin) RELOAD (cron/admin)"; there is no CMD line in /var/log/cron. Sample entries in /var/log/cron:
        Aug 10 10:21:33 localhost crontab[31688]: (root) BEGIN EDIT (root)
        Aug 10 10:21:42 localhost crontab[31688]: (root) REPLACE (root)
        Aug 10 10:21:42 localhost crontab[31688]: (root) END EDIT (root)
        Aug 10 10:22:01 localhost crond[2688]: (root) RELOAD (cron/root)
    Result of "service crond status": crond (pid 1345) is running... The command "cat /var/log/messages | grep cron" does not give anything. Contents of /etc/cron.allow:
        admin
        root
    Contents of /etc/crontab:
        SHELL=/bin/bash
        PATH=/sbin:/bin:/usr/sbin:/usr/bin
        MAILTO=root
        HOME=/
        # run-parts
        01 * * * * root run-parts /etc/cron.hourly
        02 4 * * * root run-parts /etc/cron.daily
        22 4 * * 0 root run-parts /etc/cron.weekly
        42 4 1 * * root run-parts /etc/cron.monthly
        * * * * * root run-parts /bin/date >> /data/date.txt
    Result of "ps aux | grep cron":
        root 1345 0.0 0.1 5268 1204 ? Ss 11:43 0:00 crond
    Contents of admin's crontab:
        * * * * * /bin/date >> /data/date.txt
    Note that it is not only admin's crontab that isn't running; no cron jobs are running at all. Any ideas why? (A minimal test entry is sketched below.)

    Read the article
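
    A minimal test sketch; run-parts expects a directory of scripts, so the run-parts /bin/date line above will never produce a CMD entry. A direct command in /etc/crontab (note the user field) is a quick way to confirm whether crond executes anything at all:

        # Test entry for /etc/crontab: run date directly every minute as root.
        * * * * * root /bin/date >> /data/date.txt 2>&1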

  • rake task via cron problem loading rubygems

    - by Matenia Rossides
    I have managed to get a cron job to run a rake task by doing the following: cd /home/myusername/approotlocation/ && /usr/bin/rake sendnewsletter RAILS_ENV=development I have checked with which ruby and which rake to make sure the paths are correct (from bash). The job looks like it wants to run, as I get the following email from the cron daemon when it completes:
        Missing these required gems:
        chronic whenever searchlogic adzap-ar_mailer twitter gdata bitly ruby-recaptcha
        You're running:
        ruby 1.8.7.22 at /usr/bin/ruby
        rubygems 1.3.5 at /home/myusername/gems, /usr/lib/ruby/gems/1.8
        Run `rake gems:install` to install the missing gems.
        (in /home/myusername/approotlocation)
    My custom rake file within lib/tasks is as follows:
        task :sendnewsletter => :environment do
          require 'rubygems'
          require 'chronic'
          require 'whenever'
          require 'searchlogic'
          require 'adzap-ar_mailer'
          require 'twitter'
          require 'gdata'
          require 'bitly'
          require 'ruby-recaptcha'
          @recipients = Subscription.all(:conditions => {:active => true})
          for user in @recipients
            Email.send_later(:deliver_send_newsletter,user)
          end
        end
    With or without the require lines it still gives me the same error. Can anyone shed some light on this, or alternatively advise me on how to make a custom file within the script directory that will run this function? (I already have a cron job working that processes all my delayed_jobs.) Cheers! (An environment-related sketch follows this entry.)

    Read the article
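
    A hedged sketch of one likely cause: the gems live under /home/myusername/gems, which the login shell probably picks up via GEM_HOME/GEM_PATH, while cron's minimal environment does not. Setting them explicitly in the cron command is a quick test (values are assumptions based on the paths in the error message):

        # Make the user-installed gems visible to the cron-invoked rake run (one line in the crontab).
        cd /home/myusername/approotlocation/ && GEM_HOME=/home/myusername/gems GEM_PATH=/home/myusername/gems:/usr/lib/ruby/gems/1.8 /usr/bin/rake sendnewsletter RAILS_ENV=development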

  • CLI include paths to run zend framework via cron

    - by summerg
    I wrote a command-line utility using Zend Framework to do some nightly reporting. It uses a ton of the same functionality as the accompanying site. It works great when I run it by hand, but when I run it from cron I have include path issues. It seems like it should be easily fixed with set_include_path, but maybe I'm missing something? My directory structure looks like this:
        /var/www/clientname/
          application
            Globals.php
          commandline
            commandline_bootstrap.php
          public_html
            public_bootstrap.php
          library
            Zend
    In public_bootstrap.php I use set_include_path without a problem, relative to the current directory: set_include_path('../library' . PATH_SEPARATOR . get_include_path()); If I understand correctly, in commandline_bootstrap.php I need to put in the absolute path, so cron knows where everything is. My file starts like this:
        error_reporting(E_ALL);
        set_include_path('/var/www/clientname/library' . PATH_SEPARATOR . get_include_path());
        require_once "../application/Globals.php";
    But when I run it via cron I get the following error: PHP Fatal error: require_once(): Failed opening required '../application/Globals.php' (include_path='/var/www/clientname/library/') in /var/www/clientname/commandline/zfcli.php on line 11 I think PHP is accepting my new path, because when I run it from the command line and dump phpinfo I can see: include_path = /var/www/clientname/library/:.:/usr/share/pear:/usr/share/php = .:/usr/share/pear:/usr/share/php I admit the syntax here looks a little strange, but I can't figure out how to fix it. Any suggestions would be greatly appreciated. (A working-directory sketch follows this entry.) Thanks, summer

    Read the article
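
    A minimal sketch of the usual fix on the cron side; a require like '../application/Globals.php' resolves against the current working directory (for cron, typically the invoking user's home), not the script's own directory, so changing into that directory first keeps the relative paths working (the schedule is a placeholder, the script name is taken from the error above):

        # Run the CLI script from its own directory so relative requires resolve.
        30 2 * * * cd /var/www/clientname/commandline && /usr/bin/php zfcli.php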

  • Running CRON job on Ubuntu server for SugarCRM

    - by Logik
    I am pretty inexperienced with Linux, so please be descriptive in your answer. My environment: a local Linux server running Ubuntu 12.04 and hosting SugarCRM 6.5.2. There is an area in SugarCRM called Scheduler where I can configure some predefined jobs; in my case I am trying to run email reminders (every minute/hour/day/month). For this scheduler to be effective, I read somewhere that I need to set up a cron job, so I did some research and finally put the following line in the crontab for the root user, as per the instructions given by SugarCRM:
        cd /var/www/crm; php -f cron.php > /dev/null 2>&1
    Well, I am creating contracts in my SugarCRM (AOS module) and I want email reminders for these contracts to be sent to the person concerned. My SugarCRM email is configured correctly and I can send test emails using it, but cron plus the scheduler gives no result: I don't receive any emails. Then I tried to read /var/log/syslog, and it shows an entry for the following line each minute:
        Oct 27 15:03:01 unicomm CRON[28182]: (root) CMD (cd /var/www/crm; php -f cron.php > /dev/null 2>&1)
    I've a few questions: 1) What does the cron job line I've added to the crontab mean? It is not making any sense to me (an annotated sketch follows this entry). 2) How am I supposed to get this to work? I've searched a lot (including the SugarCRM forum), but no luck.

    Read the article
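
    A short annotated sketch of what that crontab line does, piece by piece:

        # cd /var/www/crm     change into the SugarCRM install directory first
        # php -f cron.php     run SugarCRM's cron.php with the PHP command-line binary (-f = parse this file)
        # > /dev/null         discard normal output
        # 2>&1                send error output to the same place, i.e. also discarded
        cd /var/www/crm; php -f cron.php > /dev/null 2>&1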

  • Bash commands not executed when through cron job - PHP

    - by basicxman
    Hi there! I have a cron job running a PHP script every five minutes; the PHP script executes two bash commands at the end. I know the script is running because of a log file it appends to. When I run the PHP script manually via the Ubuntu GNOME Terminal, both bash commands execute flawlessly; however, when the PHP script is triggered via cron, the two bash commands are not run. Any ideas? (A display-related sketch follows this entry.)
        $command = 'notify-send "' . count($infoleakPosts) . ' New Posts."';
        `$command`;
        $command = 'firefox http://example.com';
        `$command`;
        */1 * * * * php /home/andrew/grab.php USERNAME PASSWORD # JOB_ID_1

    Read the article
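
    A hedged sketch of the usual suspect here; notify-send and firefox are GUI programs that need the desktop session's DISPLAY (and, on some setups, its D-Bus session address), which cron does not provide. Prefixing the crontab entry with DISPLAY is a quick first test, assuming the desktop session runs on :0:

        # Give the cron-spawned PHP process, and the commands it shells out to,
        # access to the local X display (:0 is an assumption about the session).
        */1 * * * * DISPLAY=:0 php /home/andrew/grab.php USERNAME PASSWORD # JOB_ID_1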

  • cron job for updating user profile data imported via facebook connect

    - by Abidoon Nadeem
    I want to write a cron job to update the user profile data on my website that I pull for users who register via Facebook Connect. The objective is to keep their profile data on my website in sync with their profile data on Facebook; so if a user updates their profile picture on Facebook, I want to update their profile picture on my website as well, via a cron job that runs every 24 hours. I wanted to know, first, whether this is possible and, second, whether it violates Facebook's privacy policy. Based on my research it seems doable, but I wanted to know if anyone has already done something like this before. It would really help.

    Read the article
