Search Results

Search found 2603 results on 105 pages for 'daily'.

Page 39/105

  • Webalizer causing high CPU load

    - by Tom
    We use webalizer to generate reports on our Apache access logs - it is useful in conjunction with Google Analytics. The problem is that webalizer uses a lot of CPU when running. If I run top I can see two perl processes at 90% CPU - this slows down the machine, and therefore the website, for our users. Webalizer is run via a daily cron job (/etc/cron.daily/00webalizer): #! /bin/bash # update access statistics for the web site if [ -s /var/log/httpd/access_log ]; then exec /usr/bin/webalizer -Q fi Does anyone know how to limit how much CPU webalizer can use? For example, would nice help, and how would I use it?
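
    A minimal sketch of the same cron job with the webalizer run reniced - nice and ionice are standard coreutils/util-linux tools, and the log path and -Q flag are taken from the script above:

        #! /bin/bash
        # update access statistics for the web site, at the lowest CPU and
        # I/O priority so the perl processes do not starve Apache
        if [ -s /var/log/httpd/access_log ]; then
            exec nice -n 19 ionice -c3 /usr/bin/webalizer -Q
        fi

    nice -n 19 only lowers the scheduling priority, so webalizer still uses otherwise-idle CPU but yields to Apache; if the slowdown is mostly disk I/O, the ionice idle class is the part that actually helps.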

    Read the article

  • What backup solution for Windows 2008 R2 servers on XenServer 5.0?

    - by Niels R.
    A friend of mine is hosting a lot of Linux VMs on his servers using XenServer 5.0. He uses rdiff-backup to make daily backups. I'm trying to convince him to host some Windows VMs (Windows 2008 R2 Web Edition) too, so he could provide (me) .NET hosting. The main problem at the moment is a backup strategy for these Windows VMs. I would like to see something like a weekly full backup (snapshot of the VM?) with daily incrementals. I've looked at Windows Backup, but because the backups are made onto network shares it doesn't keep incrementals (from what I understand). Does anyone have any experience with this situation? How did you solve this in a "not-too-hard-to-install/maintain" way?
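
    One possible building block for the weekly full is a scheduled snapshot taken on the XenServer host with the xe CLI. A hedged sketch only - the VM name-label "win2008-web" is an assumption, the exact selector syntax can vary between XenServer versions, and the snapshot would still need to be exported off-host and old ones pruned:

        #!/bin/bash
        # weekly cron job on the XenServer host (assumed VM name-label)
        VM_UUID=$(xe vm-list name-label="win2008-web" --minimal)
        xe vm-snapshot uuid="$VM_UUID" new-name-label="win2008-web-$(date +%F)"

    The daily incrementals would then be a separate, in-guest mechanism (or a tool that understands VHD deltas), since the snapshot only covers the weekly full.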

    Read the article

  • Nginx, logrotate and empty files

    - by user37887
    Hi. I have a problem with nginx/logrotate. Nginx is logging access to 2 files (main and data). I have the following crontab setting: 0 * * * * /usr/sbin/logrotate -f /home/orwell/orwell-setup/bin/logrotate-nginx And the file "logrotate-nginx" has the following content: /tmp/data.log { rotate 90 daily missingok notifempty size 1 sharedscripts postrotate [ ! -f /tmp/nginx.pid ] || kill -USR1 `cat /tmp/nginx.pid` MORE THINGS endscript } /tmp/main.log { rotate 90 daily missingok notifempty size 1 sharedscripts postrotate [ ! -f /tmp/nginx.pid ] || kill -USR1 `cat /tmp/nginx.pid` MORE THINGS endscript } The rotation is done on both files, but then nginx stops logging into them. Both files are created, but they are empty. Any ideas why nginx stops logging info to both files?
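
    For comparison, a hedged sketch of a single consolidated stanza: listing both logs in one block means sharedscripts really does send only one USR1 after both files have been rotated, and create makes sure a writable file exists immediately. The ownership and mode shown are assumptions - use whatever user nginx actually runs as:

        /tmp/main.log /tmp/data.log {
            daily
            rotate 90
            missingok
            notifempty
            create 0640 www-data www-data
            sharedscripts
            postrotate
                [ -f /tmp/nginx.pid ] && kill -USR1 `cat /tmp/nginx.pid`
            endscript
        }

    It is also worth confirming that /tmp/nginx.pid really exists: with the original guard, a missing pid file turns the postrotate into a silent no-op, and nginx keeps writing to the rotated (renamed) files while the newly created ones stay empty.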

    Read the article

  • Is there any desktop version available for WYSIWYG editors

    - by user825904
    I have been searching for a long time and am not able to find an editor I can keep open for daily note-taking with some code snippets, one that highlights the code just like the editor we use here when we ask questions. All I want is to write some text as normal, then write a few lines of PHP, click a code button (or press Ctrl+K as we do here) and have it highlighted; then on the same page write some paragraph text as normal without highlighting, plus some bullet points; then add some C++ code and wrap that too, all on the same page, and the page grows daily. Currently I use Notepad++, but it does not support syntax highlighting of different languages on the same page in individual blocks - it highlights the whole file, even the normal text. Basically I am looking for a WYSIWYG editor, but as a Windows desktop application.

    Read the article

  • How reliable are Compact Flash drives

    - by bakytn
    For example, USB drives have read/write limits, and they are considered unreliable and not durable for daily use over many years. My question, however, is about Compact Flash drives. AFAIK they are built in almost the same way as USB flash drives. I was going to use a Compact Flash drive for daily usage and install a lightweight operating system on it (Linux). What kind of limitations do they have? Read/write operations? Just age over time? Something else? Thank you!

    Read the article

  • SQL Server 2005 Default Backup Plan

    - by tylerl
    I noticed that a newly imported database on SQL Server 2005 had configured itself (without my knowledge) to perform daily backups, but it's not deleting old files and is quickly filling up the disk. I don't know how the backup job got configured (maybe that's something that gets transferred when you move a database?) but I'm having trouble modifying it. The backup runs as part of a SQL Server Agent job called "Daily Backups". This job runs a package called "(SSIS Packages)\Maintenance Plans\Backup Plan" -- which I can't find. The "Management\Maintenance Plans" area for my server is empty. I imagine I could delete the existing plan and re-create it manually, but I was hoping to just modify what was already there, since all that's missing is deleting old files.

    Read the article

  • Tool to monitor file size, file existence, parse xml, etc

    - by Artur Carvalho
    I'm trying to find a tool that helps me monitor several things. Here are some requirements: shows results on a web page; checks existence of files/folders; checks sizes of files/folders; can parse xml files; can have several statuses depending on, for instance, whether it's after 9pm; pings workstations/servers to ensure they are on or off; creates daily/weekly/monthly reports (pdf, html, csv); shows daily/weekly/monthly scheduled tasks; checks whether specific users are logged in on a machine; checks which users are logged in on a machine. I've looked into some solutions but could not find what I wanted. Usually tools like Nagios are more focused on servers, and Spiceworks is not so specific. At this point I'm using a little PowerShell script that does several of these items, but before losing more time probably reinventing the wheel, what tools are out there? Thank you in advance.

    Read the article

  • Rsync backup - detect new directory and backup only from that directory

    - by Pracovek
    The new cPanel daily backup creates a separate directory for each day's backup. This creates a problem when I try to use rsync to do an offsite backup, since I would like to rsync only the latest data. E.g. on the backup server I have a directory "backup", and on the server from which we are pulling backups I get directories 2013-11-07, 2013-11-08 etc. in the backup directory. If I back up the whole /backup directory on the server it will use a lot more space, so I would like to back up only the latest directory in the backup directory, e.g. 2013-11-08. Is there a way to detect the latest directory in the backup directory and pass that directory name to rsync for backup?
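
    A minimal sketch of the "latest directory" idea, relying on the fact that the YYYY-MM-DD names sort chronologically; the host name and paths are assumptions:

        #!/bin/bash
        # pull only the newest dated backup directory from the cPanel server
        SRC_HOST="cpanel-server"
        LATEST=$(ssh "$SRC_HOST" 'ls -1d /backup/*/ | sort | tail -n 1')
        rsync -az --delete "$SRC_HOST:$LATEST" /backup/latest/

    Because the directory names are dates, a plain lexical sort picks the most recent one; --delete keeps /backup/latest/ as a mirror of that single day rather than accumulating everything.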

    Read the article

  • Windows Server Backup 2008 R2 - what is generating all the change data?

    - by bobjandal
    We have a small, relatively idle Windows Server 2008 R2 installation that does basic file sharing and Exchange for about 10 not very active users. When running a Windows Server Backup, the daily incremental data is about 20GB. This is not coming from users' shared files, nor from changes in their mailbox sizes. The total size of the installation is 249GB, which is mostly old files. Where is all this data coming from, and how can I reduce it? Using online backup of the VHD file from the backup is taking a while because of this daily change. Is there some way I can at least see what files are changing and contributing to this data? Options I can think of but am not sure about: 1) pagefile churning - although the backup does not include the pagefile, perhaps the changed blocks left behind are included? 2) logs or something? But the installation size stays the same every day. 3) Should I zero free space using sdelete before backing up, perhaps?

    Read the article

  • How to set up a file server in a restricted corporate environment

    - by Emilio M Bumachar
    I work in a big corporation, and the disk space my team gets on the corporate file server is so low that I am considering turning my work PC into a file server. I ask this community for links to tutorials, software suggestions, and advice in general about how to set it up. My machine is an Intel Core2Duo E7500 @ 3GHz, 3 GB of RAM, running Windows XP Service Pack 3. Upgrading, formatting or installing another OS is out of the question. But I do have Administrator privileges on the PC, and I can install programs (at least for now). A lot of security software I don't even know about is and must remain installed. But I only need communication within the corporate network, which is not restricted. People have usernames (logins) on the corporate network, and I need to use them to restrict access. Simply put, I have a list of logins of team members, and only people on the list should access the files. I have about 150 GB of free disk space. I'm thinking of allocating 100 GB to the team's shared files. I plan monthly backups on machines of co-workers with the same configuration. But automation of backups is a nice but unnecessary feature: it's totally acceptable for me to manually copy the contents to a different machine once a month. Uptime is important, as everyone would use these files in their daily work. I have experience as a Python and C programmer, but no experience whatsoever as a sysadmin, and almost nothing of my programming experience is network programming. I'm a complete beginner in this. Thanks in advance for any help. EDIT I honestly appreciate all the warnings, I really do, but what I plan to make available is mostly stuff that now lives solely on DVDs, just for space reasons. Reading them is 'daily work', but the files written daily will remain on the corporate server. As for the importance of uptime, I think I overstated it: a few outages are OK, it's already an improvement over getting the DVDs. As for policy, my manager is kind of on my side; I will confirm that before making my move. As for getting more space through the proper channels, well, that was Plan A, and it's still on the table... but I don't have much hope. I'm not as "core business" as I'd like.

    Read the article

  • Recycle application pool, warm-up scripts - performance tuning in a SharePoint WCM site

    - by joel14141
    I was trying to tune a public-facing WCM site we have in SharePoint, and I have the following doubts. By default, application pools are set to recycle themselves at 2 am at night, and because of that we need warm-up scripts. But as I was googling this topic I found mixed reactions: some MVPs say it is not advisable to recycle the application pool daily, and some say otherwise, so I am confused. If I am not recycling the application pool, then I don't have to use warm-up scripts. But my site is public facing and used all around the globe, so is it advisable to recycle it daily when that will affect the performance of the site? Even if I run warm-up scripts afterwards, I don't think it would be as good as it should be. Any advice on that?

    Read the article

  • Win Server 2008: Task Scheduler runs programs twice or late

    - by SomeName
    Hi, I need to restart a service every day. I have logon hours restricted at 3:00 am, and the server will log out existing TS connections. I have two tasks scheduled: "Daily At 3:20 am every day" "start a program" "c:\windows\system32\sc.exe stop myservice" "Daily At 3:22 am every day" "start a program" "c:\windows\system32\sc.exe start myservice" I came in today to notice that the service wasn't running. I've been digging in logs, and found these entries: For the stop task, history: a) 3:29:35 am: Action Completed (sc result code 0) b) 3:20:00 am: Action Completed (sc result code 0) For the start task, history: a) 3:29:35 am: Action Completed (sc result code ERROR_SERVICE_ALREADY_RUNNING 1056 (0x420)) b) 3:22:01 am: Action Completed (sc result code 0) Checking event logs shows me: a) 3:29:35 am, Application log, Source myservice, "The service was stopped" b) 3:29:25 am, System log, Source Service Control Manager, "The myservice service entered the stopped state" So, what would have caused both tasks to run at 3:29 am? Why don't I see a message from the SCM saying that the service entered the running state? Is this the preferred way to do this? Thanks!

    Read the article

  • Nginx, logrotate and empty files

    - by tzulberti
    I have a problem with nginx/logrotate. Nginx is logging access to 2 files (main and data). I have the following crontab setting: 0 * * * * /usr/sbin/logrotate -f /home/orwell/orwell-setup/bin/logrotate-nginx And the file "logrotate-nginx" has the following content: /tmp/data.log { rotate 90 daily missingok notifempty size 1 sharedscripts postrotate [ ! -f /tmp/nginx.pid ] || kill -USR1 `cat /tmp/nginx.pid` MORE THINGS endscript } /tmp/main.log { rotate 90 daily missingok notifempty size 1 sharedscripts postrotate [ ! -f /tmp/nginx.pid ] || kill -USR1 `cat /tmp/nginx.pid` MORE THINGS endscript } The rotation is done on both files, but then nginx stops logging into them. Both files are created, but they are empty. Any ideas why nginx stops logging info to both files?

    Read the article

  • Backup XAMPP (Htdocs & MySQL)

    - by Max
    I have a development server, but would like to back everything up at least daily to a remote location. I would like to back up the htdocs folder and the MySQL databases, and if possible also the settings of the server and anything else relevant. At the moment I am using Dropbox for the htdocs, but this is not ideal. I have looked into Git and simple Dropbox copy-paste on a daily basis. I was wondering what advice anyone would have. For example, how hard would it be to set it up as a cloud-based system? Any and all advice is greatly appreciated.
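
    A hedged sketch of a nightly cron script covering htdocs, the MySQL databases and the server configuration, shipping the result off-site with rsync. The /opt/lampp paths assume a Linux XAMPP install (on Windows it would be under C:\xampp), and the credentials and remote target are placeholders:

        #!/bin/bash
        STAMP=$(date +%F)
        DEST="/backups/$STAMP"
        mkdir -p "$DEST"

        # dump every database into one file (read the password from ~/.my.cnf or similar)
        /opt/lampp/bin/mysqldump --all-databases -u root > "$DEST/mysql-all.sql"

        # archive the web root and the XAMPP configuration
        tar czf "$DEST/htdocs-and-etc.tar.gz" /opt/lampp/htdocs /opt/lampp/etc

        # push the day's backup to the remote location
        rsync -az "$DEST" backupuser@remote-host:/srv/xampp-backups/

    Dumping with mysqldump rather than copying the raw data directory keeps the backup consistent even while MySQL is running.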

    Read the article

  • Why would anacron not be running?

    - by Rory
    I have an Ubuntu system that has anacron installed. However, I'm pretty sure it's not running. It's not running the commands in /etc/cron.daily to rotate the syslog files (I'm using sysklogd, which has its own rotating log method, not logrotate). The last time the logs were rotated was in October 2009. /var/spool/anacron/cron.daily exists and its contents are 20091015. AFAIR we had a power outage then, and everything rebooted. How can I debug anacron? How can I see why it's not running? My first instinct is to look for /var/log/anacron, but that's not there. How can I fix it to make it run again?
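
    A few hedged commands for poking at it - the -d/-f/-n flags are standard anacron options (stay in the foreground with debug output to stderr, force the jobs, and ignore the start-up delays):

        # run anacron by hand and watch what it decides to do
        sudo anacron -d -f -n

        # the daily timestamp should move past 20091015 if cron.daily actually ran
        cat /var/spool/anacron/cron.daily

        # and check that something still launches anacron at all
        cat /etc/cron.d/anacron
        ls -l /etc/init.d/anacron

    If the forced run works but the timestamp never advances on its own, the problem is more likely whatever is supposed to start anacron (cron entry or boot script) than anacron itself.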

    Read the article

  • /usr/bin/mandb: can't search directory

    - by tfe
    Today I got this email from my Debian server: test -x /usr/sbin/anacron || ( cd / && run-parts --report /etc/cron.daily ) /etc/cron.daily/man-db: /usr/bin/mandb: can't search directory /usr/local/share/man/man1/: Permission denied Can somebody tell me what this means? I didn't change any permissions: drw---S--- 2 root staff 4096 Jun 28 14:05 man1 P.S. The directory /usr/local/share/man/man1 contains 1 file: csf.1. Yesterday (Jun 28) CSF/LFD was updated automatically. How do I fix this problem?
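
    The listing shows man1 with mode drw---S--- - no execute (search) bit anywhere - which is exactly what "can't search directory" means, so something did change the permissions (most likely whatever dropped csf.1 in there). A hedged fix restoring the usual Debian layout for that directory (root-owned, group staff, setgid, world-searchable):

        sudo chmod 2755 /usr/local/share/man/man1
        # re-run the daily job to confirm the complaint is gone
        sudo /etc/cron.daily/man-db

    The 2 in 2755 keeps the setgid bit that the listing shows, so files created there continue to inherit the staff group.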

    Read the article

  • ionice idle is ignored

    - by Ferran Basora
    I have been testing the ionice command for a while, and the idle (3) class seems to be ignored in most cases. My test is to run both commands at the same time: du <big folder> ionice -c 3 du <another big folder> If I check both processes in iotop I see no difference in the percentage of I/O utilization for each process. To provide more information about the CFQ scheduler, I'm using a 3.5.0 Linux kernel. I started doing this test because I'm experiencing system lag each time the daily cron job updatedb.mlocate is executed on my Ubuntu 12.10 machine. If you check the /etc/cron.daily/mlocate file you realize that the command is executed like: /usr/bin/ionice -c3 /usr/bin/updatedb.mlocate Also, the funny thing is that whenever my system for some reason starts using swap memory, the updatedb.mlocate I/O process is scheduled ahead of the kswapd0 process, and then my system gets stuck. Any suggestions? References: http://ubuntuforums.org/showthread.php?t=1243951&page=2 https://bugs.launchpad.net/ubuntu/+source/findutils/+bug/332790
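
    One thing worth checking first: the idle class (-c3) is only honoured by the CFQ I/O scheduler, so if the disk is on deadline or noop the flag is effectively a no-op, which would match these symptoms. A quick hedged check (sda is an assumption - substitute the right device):

        cat /sys/block/sda/queue/scheduler      # prints e.g. "noop deadline [cfq]"
        # switch to CFQ at runtime if it is not already the active scheduler
        echo cfq | sudo tee /sys/block/sda/queue/scheduler

    If CFQ is already active and the classes are still ignored, the test itself may be the issue: two du runs over cached or different directory trees will not necessarily contend for the same disk.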

    Read the article

  • FreeBSD dev server on VirtualBox under Windows

    - by g_kaya
    I need a unixy environment for development purposes. I hate doing things on Windows, but it is more stable for daily use and I don't have a Mac, so I'm having to use Windows (7). I want to run FreeBSD in a virtual machine, configure it to be the localhost server, be able to connect using ssh (within my home network), and be able to install the VirtualBox guest additions. If the guest additions aren't the best there, I can use Solaris or Linux flavours instead. I need no GUI. I don't know anything about networking, so I need a detailed explanation from the wise people here, or a nice doc to read. Edit: To be more specific as requested, I use the following on unices: django 1.4, apache, python (2.7), emacs, mysql, probably node.js, bash scripting. I use Windows to be able to do daily things easily, like connecting to my tablet, browsing and learning Java. And I don't want to use Linux as my desktop OS, because it gets broken a lot and maintaining it is annoying (WLAN problems and more).
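
    Since no GUI is needed, the guest can run headless with SSH reaching it through a NAT port-forward. A hedged sketch run from the Windows host ("FreeBSD" is an assumed VM name, 2222 an arbitrary host port; modifyvm must be run while the VM is powered off):

        VBoxManage modifyvm "FreeBSD" --natpf1 "guestssh,tcp,,2222,,22"
        VBoxManage startvm "FreeBSD" --type headless
        ssh -p 2222 user@localhost

    A bridged adapter instead of NAT would make the VM directly addressable on the home network, at the cost of depending on the WLAN setup.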

    Read the article

  • Checkbox values to varchar via Spring

    - by iowatiger08
    I am trying to get a varchar message from a database to display the selected values of a checkbox field in a JSP for a patient's medication dosage frequency. The possible values will be saved as a comma-delimited string in the varchar. For most form fields there is simply a one-form-value-to-one-database-field ratio, but in this case I need to merge the values that come in as a String[] into the comma-delimited string, and then, when retrieving that record for that medication of that patient, display the selected values from the comma-delimited string as selected from the selectableDosageFrequencyList. Your assistance in this is greatly appreciated, as I am not sure what I am missing here.

    In the application context, I created the list of possible values as part of the ServiceBean:

        <property name="selectableDosageFrequencyList">
            <set>
                <value>On an empty stomach</value>
                <value>Every other day</value>
                <value>4 times daily</value>
                <value>3 times daily</value>
                <value>Twice daily</value>
                <value>At bedtime</value>
                <value>With meal</value>
                <value>As needed</value>
                <value>Once daily</value>
            </set>
        </property>

    This is set up in the flow as request scope:

        <view-state id="addEditMedication" model="medication">
            <on-render>
                <set name="requestScope.selectableDosageFrequencyList" value="memberService.buildSelectableDosageFrequencyList(patient)" />
            </on-render>
            ...
            <transition on="next" to="assessment" >
                <evaluate expression="memberService.updateMedication(patient, medication)" />
            </transition>
        </view-state>

    I have helper methods in the memberService that need to be executed when the form is initialized and then when the form is completed:

        //get the form fields selected and build the new string for db
        public String setSelectedDosageFrequency(String [] dosageFrequencies){
            String frequencies = null;
            if (dosageFrequencies != null){
                for (String s : dosageFrequencies){
                    frequencies = frequencies + "," + s;
                }
            }
            return frequencies;
        }

        //get value from database and build selected Set
        public LinkedHashSet<String> getSelectedDosageFrequencyList(String dosageFrequency){
            String copyOfDosages = dosageFrequency; //may not need to do this
            LinkedHashSet<String> setofSelectedDosageFrequency = new LinkedHashSet<String>();
            while (copyOfDosages != null && copyOfDosages.length() > 0){
                for (String aFrequency : selectableDosageFrequencyList){
                    if (copyOfDosages.contains(aFrequency)){
                        setofSelectedDosageFrequency.add(aFrequency);
                        if (!copyOfDosages.equals(aFrequency) && copyOfDosages.endsWith(aFrequency)){
                            copyOfDosages.replaceAll("," + aFrequency, "");
                        } else if (!copyOfDosages.equals(aFrequency) && copyOfDosages.contains(aFrequency=",")){
                            copyOfDosages.replaceAll(aFrequency + ",", "");
                        } else
                            copyOfDosages.replaceAll(aFrequency, "");
                        copyOfDosages.trim();
                    }
                }
            }
            return setofSelectedDosageFrequency;
        }

    The Medication class that backs the form has a variable for the dosage frequency as a string:

        private String dosageFrequency;

    In the JSP I am currently doing this:

        <div class="formField">
            <form:label path="dosageFrequency">Dosage Frequency</form:label>
            <ul class="multi-column double" style="width: 550px;">
                <form:checkboxes path="dosageFrequency" items="${selectableDosageFrequencyList}" itemLabel="${selectableDosageFrequencyList}" element="li" />
            </ul>
        </div>

    Read the article

  • Chennai Metro Water [Indian Websites]

    - by samsudeen
    If you are living in Chennai, you will definitely have this question in your mind: "Does Chennai survive this summer without any water crisis?". Chennai Metro Water maintains a website where you can find all the details about water storage and supply. The site has all the information about water supply that matters, especially for Chennai residents. I have listed some of the uses of this site. Water Sewer Application Status: If you are a new resident and have applied for a water/sewer connection, you can check your application (Water Sewer Application Status) by entering your WA No (Water Application No). You can also download all the required application forms. Register Complaints: You can register complaints (Register Complaints) about water supply, defective water meters and sewage blockages online. Pay Water & Sewage Tax: You can pay your water & sewage tax (Pay Tax) online, including any dues/arrears left out. Lake Level: The site also maintains the current (Lake Level) and historic water levels of the Chennai water reservoirs, including the daily inflow/outflow of water. You can also see the daily/monthly rainfall levels at these water reservoirs from 1965 onwards. Hope this information is a little useful for those who are planning or building a new house in the Chennai Corporation area.

    Read the article

  • Is OpenTK Dead?

    - by ashes999
    Looking at OpenTK, I notice some disturbing signs: The last news item was posted on December 31st, 2010. The main forum gets about one post a day. On SourceForge, the last nightly build was in March, and the last release was in 2010. Does OpenTK exist anymore, or is it abandonware now? Edit: Some people have expressed concern at my use of "ambiguous" and "loaded terms" like "dead", "abandonware", and others. What I'm asking is this: software projects comprise many pieces: The actual software project (such as OpenTK). A group of people who maintain the software (project leads, core developers). Some vehicle by which users can find and consume the latest versions (such as releasing daily builds). A community (can I ask questions about it? Get answers?). Updates (are there new features? New releases? Active development? A roadmap?). Some projects have all of these things. Most have a few. Some have nothing, other than maybe the actual software project itself. Is OpenTK one of these? Because it seems like: The actual software project is stable. The maintainers don't contribute to it anymore. There are no more latest versions (daily builds), not since 2010 (2+ years). The community is very low-traffic (nobody is asking/answering questions; who is actually using this anyway?). There have been no updates since 2010.

    Read the article

  • First Stable Version of Opera 15 has been Released

    - by Akemi Iwaya
    Opera has just released the first stable version of their revamped browser and will be proceeding at a rapid pace going forward. There is also news concerning the three development streams they will maintain along with news of an update for the older 12.x series for those who are not ready to update to 15.x just yet. The day is full of good news for Opera users whether they have already switched to the new Blink/Webkit Engine version or are still using the older Presto Engine version. First, news of the new development streams… Opera has released details outlining their three new release streams: Opera (Stable) – Released every couple of weeks, this is the most solid version, ready for mission-critical daily use. Opera Next – Updated more frequently than Stable, this is the feature-complete candidate for the Stable version. While it should be ready for daily use, you can expect some bugs there. Opera Developer – A bleeding edge version, you can expect a lot of fancy stuff there; however, some nasty bugs might also appear from time to time. From the Opera Desktop Team blog post: When you install Opera from a particular stream, your installation will stick to it, so Opera Stable will be always updated to Opera Stable, Opera Next to Opera Next and so on. You can choose for yourself which stream is the best for you. You can even follow a couple of them at the same time! Of particular interest is the announcement of continued development for the 12.x series. A new version (12.16) is due to be released soon to help keep the older series up to date and secure while the transition process from 12.x to 15.x continues.    

    Read the article

  • Antenna Aligner Part 8: It’s Alive!!!

    - by Chris George
    Finally the day has come: Antenna Aligner v1.0.1 has been uploaded to the App Store and... "Waiting for review"... fast forward 7 days and much checking of emails later... WOO HOO! Now what? So I set my Facebook page to go live, https://www.facebook.com/AntennaAligner, and started by sending messages to my mates that have iPhones! Amazingly a few of them bought it! Similarly some of my colleagues were also kind enough to support me and downloaded it too! Unfortunately the only way I knew they had bought it was from them telling me, as the iTunes Connect data is only updated daily at about midday GMT. This is a shame; surely they could provide more granular updates throughout the day? Although I suppose once an app has been out in the wild for a while, daily updates are enough. It would, however, be nice to get a ping when you make your first sale! I would have expected more feedback on my Facebook page as well; maybe I'm just expecting too much, or perhaps I've configured the page wrong. The new Facebook Timeline layout is just confusing, and I'm not sure it's all public - I'll check that! So please take a look and see what you think! I would love to get some more feedback/reviews/suggestions... Oh and watch out for the Android version coming soon!

    Read the article

  • New Year's Resolutions and Keeping in Touch in 2011

    - by Brian Dayton
    The run-up to Oracle OpenWorld 2010 San Francisco--and the launch of Fusion Applications--was a busy time for many of us working on the applications business at Oracle. The great news was that the Oracle Applications general sessions, sessions, demogrounds and other programs were very well attended and well received. Unfortunately, for this blog, the work wasn't done there. Yes, there haven't been many additional blog entries since the previous one, which one industry analyst told us "That's a good post!" That being said, our New Year's Resolution is to blog more frequently about what's been keeping us busy since Oracle OpenWorld San Francisco. A quick summary: - A 4-part webcast series covering major elements of Oracle's Applications strategy - Oracle OpenWorld Brazil - Oracle OpenWorld China - A stellar fiscal Q2 for Oracle and our applications business - Engagement with many Oracle Fusion Applications Early Adopter customers (more on this in the coming year) Objectives for the Coming Year Looking forward at 2011 there are many ways in which we hope to continue making connections with our valued customers and partners, sharing information about where Oracle Applications are headed, and answering questions about how to manage your Oracle Applications roadmap. Things to look for in 2011: - Stay connected with Oracle Applications on a daily basis via our Facebook page. You don't have to be a member of Facebook---but if you are and "like" the page you'll have daily insights and updates delivered to your account http://www.facebook.com/OracleApps - Coming soon, an Oracle Applications strategy update World Tour---a global program that takes key updates and information to cities around the globe - Save the date: On February 3rd, Oracle will be hosting a global, online conference for Oracle Applications customers, partners and interested parties Happy New Year and look for us in 2011.

    Read the article
