Search Results

Search found 2603 results on 105 pages for 'daily'.

Page 35/105

  • How can I schedule a daily backup with SQL Server Express?

    - by edosoft
    I'm running a small web application with SQL Server Express (2005) as the backend. I can create a backup with a SQL script; however, I'd like to schedule this on a daily basis. As an extra (should-have) option, I'd like to keep only the last X backups, for space-saving reasons. Any pointers? [edit] SQL Server Agent is unavailable in SQL Server Express...
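
    Since SQL Server Agent is not included in the Express edition, one common workaround is a plain T-SQL backup script driven by the Windows Task Scheduler. A minimal sketch, assuming a database named MyAppDb and a backup folder C:\Backups (both placeholders):

        -- backup.sql: full backup to a date-stamped file (MyAppDb and C:\Backups are assumptions)
        DECLARE @file NVARCHAR(260);
        SET @file = N'C:\Backups\MyAppDb_' + CONVERT(NVARCHAR(8), GETDATE(), 112) + N'.bak';
        BACKUP DATABASE MyAppDb TO DISK = @file WITH INIT;

        rem schedule it nightly at 02:00 (adjust the instance name and paths to your setup)
        schtasks /Create /SC DAILY /ST 02:00 /TN "MyAppDb nightly backup" /TR "sqlcmd -S .\SQLEXPRESS -E -i C:\Backups\backup.sql"

    For the keep-only-the-last-X requirement, a second scheduled command such as forfiles /P C:\Backups /M *.bak /D -7 /C "cmd /c del @path" (delete .bak files older than seven days) is one option, assuming forfiles is available on that Windows version.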

    Read the article

  • Can't (re)Install VLC (removed by update{again})

    - by David matthews
    I use VLC a lot, and when 2.0 came out Ubuntu did not update to that version; the repo still had the older version even months later. So I added the daily repo http://ppa.launchpad.net/videolan/stable-daily/ubuntu and that worked for a while. A few months later I received a 'Distribution upgrade', and when I installed it, it removed VLC. When I tried to re-install, it gave me a bunch of unmet dependencies, so I disabled the source, ran apt-get update, and tried to install the older VLC; that did not work either. I eventually found a web page that helped me get it working, and I was also able to get the 'Stable Daily' repo working too. But last night I got another 'distro upgrade' and it uninstalled VLC again. When I try to reinstall from daily I get:

        The following packages have unmet dependencies:
         vlc : Depends: fonts-freefont-ttf but it is not installable
               Depends: vlc-nox (= 2.0.3+git20121005+r392-0~r42~precise1) but it is not going to be installed
               Depends: libvlccore5 (>= 2.0.0) but it is not going to be installed
               Recommends: vlc-plugin-notify (= 2.0.3+git20121005+r392-0~r42~precise1) but it is not going to be installed
               Recommends: vlc-plugin-pulse (= 2.0.3+git20121005+r392-0~r42~precise1) but it is not going to be installed
        E: Unable to correct problems, you have held broken packages.

    and from the default source:

         vlc : Depends: vlc-nox (= 2.0.3-0ubuntu0.12.04.1) but it is not going to be installed
               Depends: libvlccore5 (>= 2.0.0) but it is not going to be installed
               Recommends: vlc-plugin-notify (= 2.0.3-0ubuntu0.12.04.1) but it is not going to be installed
         vlc-plugin-pulse : Depends: vlc-nox (= 2.0.3-0ubuntu0.12.04.1) but it is not going to be installed
               Depends: libvlccore5 (>= 2.0.0) but it is not going to be installed
        E: Unable to correct problems, you have held broken packages.

    (And yes, I ran apt-get update after turning off daily.) Any ideas? (Ubuntu 12.04 64-bit)
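
    A common way out of this kind of PPA-versus-archive version mismatch is to roll the PPA packages back to the stock Ubuntu versions with ppa-purge before reinstalling; a minimal sketch (the PPA name is taken from the question, the rest is standard apt usage):

        sudo apt-get update
        sudo apt-get install ppa-purge
        # downgrade everything that came from the stable-daily PPA back to the archive versions
        sudo ppa-purge ppa:videolan/stable-daily
        # then reinstall VLC from the default Ubuntu source
        sudo apt-get install vlc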

    Read the article

  • rsnapshot intervals in configuration file...

    - by Patrick
    A simple question about rsnapshot. In order to perform daily backups I'm going to add lines to cron on my Ubuntu machine. Why, then, do I also have these lines in rsnapshot.conf?

        #########################################
        # BACKUP INTERVALS                      #
        # Must be unique and in ascending order #
        # i.e. hourly, daily, weekly, etc.      #
        #########################################
        interval  hourly  6
        interval  daily   7
        interval  weekly  4
        #interval monthly 3

    If I use cron, should I disable them? Thanks. P.S. I've just realized that in the crontab I still have "hourly" and "daily". Should I then uncomment only the one I use in the crontab? And what's the point of specifying hourly if it is already specified in cron? I'm a bit confused.

        # crontab -e
        0 */4 * * *  /usr/local/bin/rsnapshot hourly
        30 23 * * *  /usr/local/bin/rsnapshot daily
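
    For what it's worth, the two files answer different questions: the interval lines (called "retain" in newer rsnapshot versions) in rsnapshot.conf tell rsnapshot how many snapshots of each level to keep and rotate, while cron decides when each level actually runs, so both are needed and they have to agree on the level names. A minimal sketch of a matching pair (times and counts are only examples):

        # rsnapshot.conf -- keep 6 hourly and 7 daily snapshots
        interval  hourly  6
        interval  daily   7

        # crontab -- run the hourly level every 4 hours and the daily level once a night;
        # a level that is listed in the conf but never called from cron simply never runs
        0 */4 * * *  /usr/local/bin/rsnapshot hourly
        30 23 * * *  /usr/local/bin/rsnapshot daily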

    Read the article

  • How to create a log file in the folder which will be created at run time

    - by swati
    Hello everyone, I am new to the Apache logger. I am using Apache log4j for my application, with the following configuration file:

        # configure the root logger
        log4j.rootLogger=INFO, STDOUT, DAILY

        # configure the console appender
        log4j.appender.STDOUT=org.apache.log4j.ConsoleAppender
        log4j.appender.STDOUT.Target=System.out
        log4j.appender.STDOUT.layout=org.apache.log4j.PatternLayout
        log4j.appender.STDOUT.layout.conversionPattern=%d{yyyy-MM-dd HH:mm:ss.SSS} [%p] %c:%L - %m%n

        # configure the daily rolling file appender
        log4j.appender.DAILY=org.apache.log4j.DailyRollingFileAppender
        log4j.appender.DAILY.File=log4jtest.log
        log4j.appender.DAILY.DatePattern='.'yyyy-MM-dd-HH-mm
        log4j.appender.DAILY.layout=org.apache.log4j.PatternLayout
        log4j.appender.DAILY.layout.conversionPattern=%d{yyyy-MM-dd HH:mm:ss.SSS} [%p] %c:%L - %m%n

    When my application runs it creates a folder called somename_2010-04-09-23-09, and my log file has to be created inside that folder (which is created at run time). Is there any way to do that? Is there any way to specify in the configuration file that the log file should be created, at run time, inside that folder? I would really appreciate it if someone could answer my questions. Thanks, Swati
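
    One approach that works with log4j 1.x: property files are run through variable substitution, so a value like ${logdir} is resolved from a Java system property at configuration time. A minimal sketch (the property name logdir and the folder-naming code are assumptions, not part of the original setup):

        # log4j.properties -- ${logdir} comes from a system property set before log4j is configured
        log4j.appender.DAILY.File=${logdir}/log4jtest.log

        // somewhere at the very start of main(), before any Logger is created
        String runDir = "somename_" + new java.text.SimpleDateFormat("yyyy-MM-dd-HH-mm").format(new java.util.Date());
        new java.io.File(runDir).mkdirs();                  // create the run-time folder
        System.setProperty("logdir", runDir);               // referenced as ${logdir} above
        org.apache.log4j.PropertyConfigurator.configure("log4j.properties");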

    Read the article

  • Can I use the whatthetrend.com API to get daily and weekly Twitter trends?

    - by Charles S.
    The Twitter API allows me to receive daily trends and weekly trends (in addition to current trends, of course) as either JSON or XML. Is there an equivalent in the whatthetrend.com API? I don't see a method/parameter at first glance, but I wanted to see if anyone out there knew a way... http://api.whatthetrend.com/api There's a method to look up a trend definition by keyword, so I guess I could use that based on the Twitter API, but I'd rather load all the data at once than have to hit the API remotely every time I want to look up a definition. Thanks

    Read the article

  • When configuring daily backups, which files should I include to be sure I have the MySQL databases?

    - by user575599
    I have a dedicated LAMP server with cPanel hosting 100 websites (some of them have MySQL databases). I am currently using Jungle Disk Server Edition to back up our files from the LAMP server to Amazon S3. Once a week we are backing up the entire cPanel account set, which is an enormous strain on resources, but that is a separate issue. Now, what I want to do is set up a daily job to back up just the HTML files and the MySQL databases. If I just back up the "public_html" folder, will my MySQL database info be stored in that directory? Would backing up the public_html folder be enough to recover the databases? I can find plenty of resources online about how to manually back up MySQL databases, but with 100 sites I need it automated. I'm hoping for an easy solution where I can just grab a folder to back up each day.
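
    MySQL data does not live under public_html; on a typical cPanel/LAMP box the data files are under /var/lib/mysql, and copying raw data files while mysqld is running is not a reliable backup anyway. A common pattern is a nightly cron script that dumps every database into one folder, and then letting the existing Jungle Disk job pick that folder up. A minimal sketch, assuming credentials are available to root via /root/.my.cnf and /backup/mysql is the folder handed to Jungle Disk (both assumptions):

        #!/bin/sh
        # /etc/cron.daily/mysql-dumps -- dump each database to its own compressed file
        OUT=/backup/mysql
        mkdir -p "$OUT"
        for db in $(mysql -N -e 'SHOW DATABASES' | grep -vE '^(information_schema|performance_schema)$'); do
            mysqldump --single-transaction --routines "$db" | gzip > "$OUT/$db.sql.gz"
        done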

    Read the article

  • Cannot run logwatch due to Date::Manip issue

    - by Quintin Par
    I tried to run logwatch as follows:

        [root@machine cron.daily]# ./0logwatch
        ERROR: Date::Manip unable to determine TimeZone.
        Execute the following command in a shell prompt:
            perldoc Date::Manip
        The section titled TIMEZONES describes valid TimeZones and where they can be defined.

    My date is as follows:

        [root@machine cron.daily]# date
        Thu Aug 23 06:25:21 GMT 2012

    Now, based on details in various forums, I tried to fix this by setting /etc/timezone to "+0800", but it didn't work. My /etc/localtime points to /usr/share/zoneinfo/GMT and is managed by puppet. How do I go about fixing this? I still want all my machines to be in the GMT timezone.

    EDIT: Sadly, both of the changes are not working:

        [root@machine cron.daily]# cat /etc/TIMEZONE
        UTC

    Quanta's:

        [root@machine cron.daily]# cat ~/.bash_profile
        # .bash_profile

        # Get the aliases and functions
        if [ -f ~/.bashrc ]; then
            . ~/.bashrc
        fi

        # User specific environment and startup programs
        PATH=$PATH:$HOME/bin
        export TZ=GMT
        export PATH

        [root@machine cron.daily]# source ~/.bash_profile
        [root@machine cron.daily]# ./0logwatch
        ERROR: Date::Manip unable to determine TimeZone.
        Execute the following command in a shell prompt:
            perldoc Date::Manip
        The section titled TIMEZONES describes valid TimeZones and where they can be defined.
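
    One thing that often bites here: TZ exported in ~/.bash_profile only reaches interactive shells, while the cron.daily job runs in cron's own minimal environment, so Date::Manip never sees it. Two hedged ways to hand it a timezone (treat the exact files as assumptions for your distro):

        # Option 1: give every cron job a timezone -- add near the top of /etc/crontab
        TZ=GMT

        # Option 2: set it only for logwatch -- if /etc/cron.daily/0logwatch is a shell wrapper,
        # add this near its top (if it is the Perl script itself, the equivalent is
        # $ENV{'TZ'} = 'GMT'; before Date::Manip is loaded)
        export TZ=GMT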

    Read the article

  • Windows server's HDD spin down daily/nightly - Does it make sense?

    - by Riccardo
    A Windows Server 2003 R2 machine has the following hard disk configuration:

    - 3 internal hard disks attached to a 3Ware unit, configured in RAID 1 + a spare unit
    - 3 external USB backup disks: 2 Verbatim 1TB (Samsung HD103SI) + 1 Western Digital 1TB (WD10EADS)

    The server runs 365 days per year, 24h; however:

    - at daytime the server/user usage is limited to the internal hard disks
    - at nighttime there is no user usage apart from scheduled maintenance tasks; basically the server is idle from 7PM to 8AM, apart from nightly backups (a few hours)

    I was wondering whether to: (a) let Windows manage power savings, allowing disks to spin down accordingly, OR keep the disks always on, to avoid premature wear due to continuous spin up/down; (b) leave the internal disks always on and force the external disks to power down while idle (this requires third-party tools, such as Verbatim's Green Button utility). Your thoughts?

    Read the article

  • Which tool do you use for tracking daily/weekly progress on your long-term goals?

    - by NoCatharsis
    I read through this post and this post to get an idea of short-term task management applications out there. However, now I'm having a problem tracking my longer-term goals pertaining to things like career, education, and exercise. I just discovered Disciplanner which might meet my needs but haven't had time to set it up just yet. Not to mention, it took me about 2 weeks of playing with every online to-do list app I could find before I came to a decision on which to use. I'm very particular about my tasks and goal-setting, so I would prefer to get some advice from other superusers before diving into the web again to find another life-management website.

    Read the article

  • How can I automate or script daily downloads for any new anti-virus databases, and then have the program scan my drive?

    - by Macgrimm
    Howdy, all Super Users. I humbly ask if any Super User can direct this long-time, gray-haired Apple tech in the right direction on this issue. I believe there are probably many ways to skin this cat, but I am looking for simply the best, most unattended way to get it done. Any help will be greatly appreciated. Also, I know there is much better software out there for the Mac, so please don't go there! The politics of this company dictate which anti-virus we have to use. Anyway, without any further wait: basically I am trying to automate two very important functions of McAfee anti-virus for Mac. First, I want to automate the process of retrieving new virus definition files, and second, I want to automate the process of scanning for viruses. It turns out that in McAfee Anti-Virus for the Mac both are manual functions, left up to the user (per user account) to perform. Depending on all of about 150 Mac users to perform these two tasks themselves yields around 65% compliance. My question then is: I can use the command line, such as (open /Applications/McAfee\ Security.app), to open the Security Console, but how can I make McAfee go out and grab the definition files and scan the computer? I have to admit I am at a crossroads and "MacAltimers" has set in. I would really appreciate it if any of you Super Users could help me out. Thanks to all up front, Macgrimm
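
    If the installed McAfee build exposes command-line entry points for updating and scanning (worth confirming in McAfee's own documentation; the two commands below are purely hypothetical placeholders, not real McAfee binaries), both tasks could be run system-wide from root's crontab instead of depending on each of the 150 users, e.g.:

        # root crontab sketch -- /path/to/mcafee-update and /path/to/mcafee-scan are hypothetical
        # placeholders for whatever CLI the installed McAfee version actually provides
        0 6 * * * /path/to/mcafee-update        >> /var/log/mcafee-update.log 2>&1
        0 1 * * * /path/to/mcafee-scan /Users   >> /var/log/mcafee-scan.log 2>&1

    A launchd LaunchDaemon would be the more Mac-native way to schedule the same two commands, but the crontab form is shorter to sketch.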

    Read the article

  • Is Clonezilla a good option for a daily batch-file-based backup of a Windows XP PC?

    - by rossmcm
    Having just been through the process of rebuilding a Windows XP desktop machine when the disk died, I'm anxious to make it a lot less painful. I didn't lose any data, but reinstalling everything took ages. Clonezilla seems to be a highly mentioned free backup tool. How easy would it be to implement the following:

    - a nightly unattended backup of the desktop's disk image to another network machine (or a second drive in the machine), hopefully with compression
    - restore from that image using USB boot media

    so that if I come in to work and find the hard drive has tanked, it is just a matter of replacing the dead drive with a new one, booting from the USB stick, choosing the image to restore, and then finding something else to do for an hour or two. When it is finished I would hopefully be back to where I was.

    Read the article

  • How to make a redundant desktop system with daily snapshots? (Is btrfs ready for use?)

    - by TestUser16418
    I want to configure a desktop system in which the home filesystem would be redundant (e.g. RAID-1) and would have weekly snapshots taken. I've already done this with ZFS; the snapshot system is wonderful, and with send/recv you can easily create backups on external media. Unfortunately, at this point, I want GNU+Linux and not FreeBSD or Solaris, so I'm looking for suggestions for good alternatives. I reckon that my alternatives are:

    - btrfs - it seems to be exactly what I need: it has snapshots and commands that allow you to easily replicate zfs send. Yet all documentation mentions that it's still experimental. I can't seem to find any actual reports on its reliability or usability issues. Can you point me to any information that could clarify whether it would be a possible choice? I have a large preference for this option, mostly because I don't want to reformat the drives when btrfs becomes ready, but there's no information on whether it's usable at all, whether it's a silly idea to use it, etc. The question that I cannot get the answer to is what "experimental" means.
    - LVM snapshots and ext4 - preferably not, since it can consume an awful amount of space when new files are created. Creating 200 GB of files requires 200 GB of free space and 200 GB additionally for snapshots. I have also found it unreliable -- a failed metadata rewrite results in an unreadable PV. I'm wondering how btrfs would compare here.
    - A single filesystem (ext4) on a RAID-1 array with custom COW snapshots using hardlinks (like cp -al). That's my current preference if I can't use btrfs.

    So how experimental is btrfs, which should I choose, and do I have any other options? What if I don't keep external incremental backups, would that affect my choice?
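
    For reference, the btrfs workflow the first option relies on looks roughly like this; a minimal sketch assuming two spare disks (/dev/sdb and /dev/sdc are placeholders) and a kernel/btrfs-progs recent enough to have read-only snapshots and send/receive:

        # mirrored btrfs filesystem for /home (data and metadata both RAID-1)
        mkfs.btrfs -m raid1 -d raid1 /dev/sdb /dev/sdc
        mount /dev/sdb /home
        mkdir -p /home/.snapshots

        # weekly read-only snapshot, e.g. from cron
        btrfs subvolume snapshot -r /home /home/.snapshots/$(date +%Y-%m-%d)

        # replicate a chosen snapshot to an external btrfs disk, zfs-send style
        btrfs send /home/.snapshots/SNAPSHOT_NAME | btrfs receive /mnt/external/backups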

    Read the article

  • How can I get a Windows server to collate and email 4 daily reports/graphs for server performance

    - by Glyn Darkin
    I run a Windows 2008 web server and would like to set up the most basic performance monitoring in the world. What I would like is:

    - a plot of ASP.NET request time for each w3wp process for a 24-hour period
    - a plot of CPU % utilisation for each w3wp process for a 24-hour period
    - a plot of memory utilisation for each w3wp process for a 24-hour period
    - a plot of network utilisation for each w3wp process for a 24-hour period
    - a plot of disk utilisation for each w3wp process for a 24-hour period

    I would like these plots to be emailed to me each morning. Does anybody know the simplest way to set this up? Thanks for your help in advance. Glyn
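
    One low-tech route, sketched under the assumption that the built-in tools are acceptable: collect the counters with logman into a daily CSV, then have a scheduled PowerShell task (PowerShell 2.0 or later for Send-MailMessage) mail the file out each morning; the counter paths, collector name, folder and SMTP details below are placeholders to adjust.

        rem create and start a collector that samples the w3wp processes every minute into CSV
        logman create counter W3wpDaily -c "\Process(w3wp*)\% Processor Time" "\Process(w3wp*)\Working Set" -si 00:01:00 -f csv -o C:\PerfLogs\w3wp
        logman start W3wpDaily

        # scheduled each morning: attach the newest CSV (chart it in Excel, or keep it as raw data)
        $csv = Get-ChildItem C:\PerfLogs -Filter *.csv | Sort-Object LastWriteTime | Select-Object -Last 1
        Send-MailMessage -SmtpServer smtp.example.com -From server@example.com -To me@example.com `
            -Subject "w3wp performance $(Get-Date -Format d)" -Attachments $csv.FullName

    Getting actual plots rather than raw numbers would mean opening the CSV in Performance Monitor or charting it in Excel; the ASP.NET request-time, network and disk counters can be added to the -c list in the same way.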

    Read the article

  • SEO: Nested List vs List, Split Over Divs vs Definition List

    - by Jon P
    From an SEO perspective, which, if any, is better?

    Option 1: Nested lists with h2 tags

        <ul id="mainpoints">
          <li><h2>Powerful Analysis</h2>
            <ul>
              <li>Charting and indicators</li>
              <li>Daily trading signals</li>
              <li>Company health checks</li>
            </ul>
          </li>
          <li><h2>World Market Data</h2>
            <ul>
              [List items removed for brevity]
            </ul>
          </li>
          <li><h2>Daily Market Data</h2>
            <ul>
              [List items removed for brevity]
            </ul>
          </li>
        </ul>

    Option 2: Divs with h2 and lists

        <div id="mainpoints">
          <div>
            <h2>Powerful Analysis</h2>
            <ul>
              <li>Charting and indicators</li>
              <li>Daily trading signals</li>
              <li>Company health checks</li>
            </ul>
          </div>
          <div>
            <h2>World Market Data</h2>
            <ul>
              [List items removed for brevity]
            </ul>
          </div>
          <div>
            <h2>Daily Market Data</h2>
            <ul>
              [List items removed for brevity]
            </ul>
          </div>
        </div>

    Option 3: Definition list

        <dl id="mainpoints">
          <dt>Powerful Analysis</dt>
          <dd>- Charting and indicators</dd>
          <dd>- Daily trading signals</dd>
          <dd>- Company health checks</dd>
          <dt>World Market Data</dt>
          [List items removed for brevity]
          <dt>Daily Market Data</dt>
          [List items removed for brevity]
        </dl>

    My instincts tell me that, semantically, the pure list options (1 & 3) are the best, and that h2 may be more SEO-friendly (1 & 2), which would point to option 1 as the best option. I do love the lean markup of the definition list, but will I take an SEO hit by losing the h2 tags? Before anyone asks, h2 is not valid markup inside a dt tag. Are my instincts right that a nested list is the way to go?

    Read the article

  • How to handle these variables in rsync exclude file?

    - by linux
    I have an ignore file for rsync, but I can't figure out how to ignore this string of file names and the username:

        backup/cpbackup/daily/username/homedir/mail/cur/1244452567.H511146P7355.dwhs45.dwhs.net,S=2161:2,
        backup/cpbackup/daily/username/homedir/mail/cur/1244455430.H516330P14494.dwhs45.dwhs.net,S=4062:2,

    I tried this:

        backup/cpbackup/daily/*/homedir/mail/cur/*

    and this:

        *.*.dwhs45.dwhs.*

    But of course that would be too easy. Basically I just want to not transfer all the mail in the /cur/ directory for all users to the backups.
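
    For what it's worth, two rsync pattern rules explain why the attempts fail: a single * never matches a slash (** does), and a pattern containing a slash is matched against the path relative to the transfer root rather than just the file name. The simplest form is usually to exclude the cur directories themselves, since excluding a directory also skips everything inside it. A hedged sketch (the anchoring assumes the transfer starts at the directory that contains backup/):

        # rsync exclude file: skip every user's Maildir "cur" directory and its contents
        backup/cpbackup/daily/*/homedir/mail/cur/

        # used as, e.g.:
        # rsync -av --exclude-from=/etc/rsync-excludes /source/ user@host:/dest/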

    Read the article

  • cron job email customization

    - by user12145
    I have a list of cron jobs where some execute daily and others execute every 15 minutes. I do want to receive an email for the ones that run daily, but I want email disabled for the ones running every 15 minutes (or maybe receive one daily summary email for them instead). Is there a way to do this in crontab?
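
    Two standard crontab tricks cover this, since cron only sends mail when a job produces output and (in Vixie cron and its descendants) MAILTO can be changed between lines; a sketch with placeholder paths:

        MAILTO=you@example.com
        # daily job: any output is mailed as usual
        30 2 * * *   /usr/local/bin/daily-report.sh

        MAILTO=""
        # every 15 minutes: discard output so no mail is ever generated
        */15 * * * * /usr/local/bin/frequent-task.sh > /dev/null 2>&1

    For the "one daily summary" variant, the frequent job could instead append to a log file (>> /var/log/frequent-task.log 2>&1) that a single daily cron entry mails out.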

    Read the article

  • Why is wp-cron taking up so many resources?

    - by Gaia
    From /var/logs/httpd/error-log:

        [Thu Apr 22 01:41:15 2010] [notice] mod_fcgid: call /var/www/vhosts/mydomain.com/httpdocs/wp-cron.php with wrapper /usr/bin/php-cgi
        [Thu Apr 22 01:41:15 2010] [notice] mod_fcgid: server /var/www/vhosts/mydomain.com/httpdocs/wp-cron.php(17999) started
        ...The previous line shows up 8661 times...

    What's in cron?

        Apr 22, 2010 @ 18:25 (1271960731)  Twice Daily  wp_version_check
        Apr 22, 2010 @ 18:25 (1271960731)  Twice Daily  wp_update_plugins
        Apr 22, 2010 @ 18:25 (1271960731)  Twice Daily  wp_update_themes
        Apr 23, 2010 @ 12:21 (1272025294)  Once Daily   wp_scheduled_delete

    Running CentOS 5 / Plesk 9.3 / PHP as FastCGI / suExec, with WP 2.9.2. Thanks in advance.
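
    wp-cron.php is triggered by page views rather than by a real scheduler, so a busy (or bot-hammered) site can spawn it constantly. A common mitigation, sketched here on the assumption that the WordPress root is the path shown in the log, is to disable the built-in trigger and call wp-cron.php from real cron instead:

        // wp-config.php: stop WordPress from launching wp-cron.php on every request
        define('DISABLE_WP_CRON', true);

        # system crontab: run the WordPress scheduler every 15 minutes instead
        */15 * * * * wget -q -O /dev/null "http://mydomain.com/wp-cron.php?doing_wp_cron"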

    Read the article

  • Cross-match a number of worksheets to one master worksheet

    - by Carter
    Hopefully the title is not too confusing. Basically, I have a master list of addresses, listed in multiple columns (Column A - street number, Column B - street name, Column C - street type, etc.), and I get another set of addresses on a daily basis with the same address formatting. What I need to do is cross-match the daily changing list of addresses against the first list and remove any matching entries. So, for example, if the first list has 123 Main St on it, I have to ensure that there are no entries of 123 Main St on any subsequent daily lists. I'm using one address as an example, but the lists contain upwards of 10,000 addresses that have to be cross-matched. I don't need them flagged or highlighted, just deleted from the daily lists (though if they have to be flagged or highlighted, I could work with that). Any help here would be much appreciated.
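
    A formula-only sketch that avoids VBA, assuming Excel 2007 or later (for COUNTIFS), the master list on a sheet named Master, and the address columns in A:C on both sheets (all assumptions): add a helper column to each daily sheet, then filter on it and delete the flagged rows.

        ' helper column D on the daily sheet, filled down from row 2:
        ' counts how many master rows match this row's street number, street name and street type
        =COUNTIFS(Master!$A:$A, A2, Master!$B:$B, B2, Master!$C:$C, C2)

    Any row where the helper value is greater than 0 already exists in the master list, so an AutoFilter on D>0 followed by deleting the visible rows clears the duplicates; the same helper could drive conditional highlighting if flagging turns out to be enough.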

    Read the article

  • If email not received then do X (outlook 2013 on Exchange 2010)

    - by Brad
    I receive notification emails daily and would like to automate an easier way to manage all of those notifications. For example:

    - Notification 1 from [email protected] is received daily between 10pm-1am
    - Notification 2 from [email protected] is received daily between 12am-3am
    - Notification 3 from [email protected] is received daily between 1am-4am

    I am looking for a way to page myself at [email protected] on my cellphone if any of these messages is not received within the defined time frame in which the email should have arrived. I would basically like to email a page like:

        ATTENTION: Notification 2 not received within the allowed range.

    This way I would be notified instead of having to check the email manually and see that I only received two of the three alerts. Is there a way to do this in Outlook? Our Exchange server is a hosted Exchange server at GoDaddy, if that info is needed.

    Read the article
