Search Results

Search found 49453 results on 1979 pages for 'memory mapped files'.


  • A Lot of WordPress wp-cron.php in Memory

    - by ServerChecker
    My client is hosting many WordPress blogs. I checked the system load with "ps -ef | grep -i php" (the server hosts client domains using suPHP and cPanel) and I see that many of the blogs have wp-cron.php in memory, sometimes with several copies of wp-cron.php running for the same domain. Is having wp-cron.php loaded this often normal for WordPress, or is it a misconfiguration?
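
    A minimal sketch of how one might count wp-cron.php processes per account (my own suggestion, not from the original post; under suPHP each account runs as its own user, so grouping by the UID column of ps -ef approximates a per-domain count):

        ps -ef | grep '[w]p-cron.php' | awk '{print $1}' | sort | uniq -c | sort -rn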

    Read the article

  • rsync -c -i flags identical files as different

    - by Scott
    My goal: given a list of files on the local server, show any differences from the files with the same absolute path on the remote server; e.g. compare local /etc/init.d/apache to the same file on the remote server. "Difference" for me means a different checksum. I don't care about file modification times, and I also do not want to sync the files (yet); only show the diffs. I have rsync 3.0.6 on both the local and remote servers, which should be able to do what I want. However, it is claiming that local and remote files, even with identical checksums, are still different. Here's the command line:

        $ rsync --dry-run -avi --checksum --files-from=/home/me/test.txt --rsync-path="cd / && rsync" / me@remote:/

    where "me" = my username, "remote" = the remote server hostname, the current working directory is '/', and test.txt contains one line reading "/etc/init.d/apache". OS: Linux 2.6.9. Running cksum on /etc/init.d/apache on both servers yields the same result; the files are the same. However, the rsync output is:

        me@remote's password:
        building file list ... done
        .d..t...... etc/
        cd+++++++++ etc/init.d/
        <f+++++++++ etc/init.d/apache
        sent 93 bytes  received 21 bytes  20.73 bytes/sec
        total size is 2374  speedup is 20.82 (DRY RUN)

    The output codes (see http://www.samba.org/ftp/rsync/rsync.html) mean that rsync thinks: /etc is identical except for the mod time; /etc/init.d needs to be changed; and /etc/init.d/apache will be sent to the remote server. I don't understand how, with the --checksum option and the files having identical checksums, rsync can think they're different. (I've tried with other files having identical mod times, and those files are not flagged as different.) I did run this in /, and made sure (AFAIK) that it's run remotely in /, so even relative pathnames will still be correct. I ran rsync with -avvvi for more debug info, but saw nothing remarkable. I'm wondering: is rsync still looking at file mod times, even with --checksum? Am I somehow not setting up the path(s) right? What am I doing wrong?
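
    For completeness, a quick way to re-confirm that the checksums really do match on both ends (a sketch based on the setup described above, assuming ssh access as "me"):

        cksum /etc/init.d/apache
        ssh me@remote cksum /etc/init.d/apache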

    Read the article

  • Removing DS_Store files and variants?

    - by Ron Gejman
    I am running an Ubuntu 10.04.1 LTS server. I frequently open files over AFP from my Mac. Inevitably this creates .DS_Store files on the server (although for some reason they are named :2eDS_Store). However, it also creates variants of the DS_Store files, and these variants are often named similarly to other files in that directory. E.g.:

        ~$ ls
        total 60K
        -rw-r--r-- 1 tarakhovsky  16K 2010-11-30 18:28 :2eDS_Store
        drwx--S--- 4 tarakhovsky 4.0K 2010-11-08 13:58 :2eTemporaryItems/
        lrwxrwxrwx 1 tarakhovsky   15 2010-10-19 17:44 bigdisk -> /media/bigdisk//
        ...
        drwxr-xr-x 3 tarakhovsky 4.0K 2010-11-03 18:24 Temporary Items/
        drwxr-xr-x 3 tarakhovsky 4.0K 2010-11-30 01:34 tmp/
        ...

    I've disabled creation of DS_Store files using:

        defaults write com.apple.desktopservices DSDontWriteNetworkStores true

    so hopefully this won't continue to occur, but I really want to get rid of all of the existing DS_Store variants already on the server. Any ideas as to why these variants are being created and how I can get rid of them all?
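
    A minimal cleanup sketch (my assumption, not from the original post): the :2e prefix looks like a hex encoding of a leading dot (0x2e is "."), so listing everything matching the known DS_Store names before deleting is a reasonable first pass; the share path is a placeholder, and the other variants would need their own -name patterns:

        # review what would be removed first
        find /srv/share \( -name ':2eDS_Store' -o -name '.DS_Store' \) -print
        # then delete once the list looks right
        find /srv/share \( -name ':2eDS_Store' -o -name '.DS_Store' \) -delete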

    Read the article

  • jZip isn't integrating with Windows 7 shell

    - by Aayush
    I installed jZip on my new Windows 7 64-bit OS, but it doesn't seem to integrate with the shell. It doesn't appear in the right-click menu for any file (not even zip files). To zip a file I have to open jZip, go to File > New and browse for the files, and to extract compressed files I have to open the file in jZip and then click Extract. There is no right-click menu integration whatsoever, although there used to be when I had jZip on the Windows 7 RC. I reinstalled, checking the settings in case I had made a mistake, and I did enable shell integration during installation. Does anyone know what is wrong and how I should solve this? Thanks!

    Read the article

  • Count total file size of many files in Windows

    - by user105249
    When I want to burn a CD-R with lots of files, I have to make sure the total file size in my folder doesn't exceed the capacity of the disc (680 MB). In Windows, what ways are there to check the total size of a bunch of files? (1) I put them in a folder, right-click it and check the properties, but this is an annoying trial-and-error kind of way: either there are too many files in the folder or too few. (2) I watch the size go up as I keep selecting more files (using Alt and the Down arrow). No. 2 is my favorite way to do it. Here's my question: for some reason Windows (I still use XP) only shows the total size for up to 100 selected files. When you select more than 100 files, no size information appears any more. Is there a way, a trick, or an app to work around this problem?
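
    One workaround (my suggestion, not part of the original question) is to let the command prompt add up the sizes: dir /s prints per-directory sizes and a grand total at the end. The folder path below is a placeholder:

        cd /d D:\folder-to-burn
        dir /s

    The final "Total Files Listed" line shows the combined size in bytes; 680 MB is about 713,031,680 bytes (680 x 1,048,576), so stay under that.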

    Read the article

  • iMac memory limit

    - by Mike
    I have an iMac from the first generation of aluminum iMacs; the reported model is "iMac 7,1". The manual for this iMac says I can install two 2GB modules, but when the manual was written there were no modules larger than 2GB, and the machine shipped with Leopard, which I suppose can handle less memory than Snow Leopard. Today 4GB modules exist, so can I install two 4GB modules and make it 8GB? Thanks.

    Read the article

  • Out of memory error while uploading file into a blob

    - by stacker
    I received an out of memory error (SQL Error: 0, SQLState: 53200) from Postgres while trying to upload a 10MB file into a single row, into a column of the blob type bytea. Which configuration parameters should be changed to allow inserts of this size, or should this work out of the box without modifications? Is there an option to create something like the so-called blob-spaces in Informix?
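
    One alternative sometimes used when a bytea insert runs out of memory (my suggestion, not a confirmed answer) is PostgreSQL's large-object facility, which stores the data by reference instead of as one huge literal. A minimal sketch, assuming the file is readable by the database server; the path, table and column names are placeholders:

        -- import the file as a large object; this returns the new object's OID
        SELECT lo_import('/tmp/upload.bin');
        -- store the returned OID in a column of type oid, e.g.:
        -- INSERT INTO uploads (name, content_oid) VALUES ('upload.bin', 12345);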

    Read the article

  • How do I restart my Linux server every 2 days via crontab?

    - by Barkat Ullah
    I have a Linux server running the OS version below:

        Linux 2.6.32-220.7.1.el6.x86_64

    I want to restart it every 2 days via crontab; please help me. One more thing: I use the code below (the crontab line and the contents of /root/clearcache.sh) to drop my memory caches every hour.

        0 * * * * /root/clearcache.sh

        #!/bin/sh
        sync; echo 3 > /proc/sys/vm/drop_caches

    But for the first 15 minutes of every hour, right after the caches are cleared, my server is very slow and my sites do not load. Restarting the server also drops the caches, so I decided to restart the server every 2 days instead. Will restarting help, or is there another way to drop my memory caches that will not take my server down?
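
    A minimal crontab sketch for the reboot part (my suggestion; the time of day is a placeholder, and note that */2 in the day-of-month field matches the odd days of the month, so it can fire on consecutive days across a month boundary):

        # root's crontab: reboot at 04:00 on every second day of the month
        0 4 */2 * * /sbin/shutdown -r now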

    Read the article

  • Make Quanta Plus the default editor for PHP files in Ubuntu

    - by diEcho
    In Ubuntu, how can I make Quanta Plus the default editor for PHP files? In Windows we just use the "Open with" context menu the first time for any new type of file and check "Use this application for these types of files"; after that the files always open in the chosen application. I want to know if Ubuntu has a similar feature. Also, if I open two PHP files, two instances of Quanta Plus open. How can I get subsequent PHP files to open in another tab of the same instance instead?
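
    One way to set the association from the command line (a sketch, not from the original question; the .desktop file name may differ depending on how Quanta Plus was installed, and the MIME type should be taken from the query output):

        # check the MIME type Ubuntu reports for a PHP file
        xdg-mime query filetype index.php
        # make Quanta Plus the default handler for that type
        xdg-mime default quanta.desktop application/x-php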

    Read the article

  • tar - exclude certain files

    - by Alan
    I wish to tar all files in a directory and its subdirectories that do NOT end in .jpg, .bmp, .gif, or .png. So, given the following folders and files:

        foo/file.txt
        foo/file.gif
        foo/bar/file
        foo/bar/image.jpg

    I want to tar only the files file.txt and file; file.gif and image.jpg should be ignored. I would also like to maintain the folder structure. My first thought was to pipe the results of the find command through grep -v ".jpg|.gif|.bmp|.png" into a text file, and then use tar's include argument to feed it that list of files. However, the results of the grepped find command also contain directories (in the example above, "foo" and "foo/bar"), and when a directory is fed to tar, it includes all files in that directory, so I would end up with a tar file containing all of the files, which is not what I want. Is there any way to prevent find from outputting directories? Is there a far easier way to approach this?
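
    A minimal sketch of one way to do this (my suggestion, assuming GNU find and tar): -type f restricts find to regular files so no directories are emitted, and tar reads the resulting list from stdin via -T -; the archive name is a placeholder:

        find foo -type f ! \( -iname '*.jpg' -o -iname '*.gif' -o -iname '*.bmp' -o -iname '*.png' \) -print \
          | tar -cvf not-images.tar -T -

    For file names containing spaces or newlines, -print0 on the find side together with tar's --null option is safer.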

    Read the article

  • Clearing cmd.exe memory or cache

    - by Abs
    The main question is: how do I clear the command prompt's memory or cache? I ran this in cmd.exe:

        svn info <URL>

    which prompted me for a username and password, which I entered. I then logged off, logged back in immediately, entered the same command, and wasn't asked for a username and password. I want to clear this to test something without restarting my server! How can I do this?
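
    For what it's worth, this is probably not cmd.exe caching anything: the Subversion client caches credentials on disk. A sketch of where to look (my assumption about the setup; the path is Subversion's default credential cache on Windows):

        rem list the cached credentials
        dir "%APPDATA%\Subversion\auth\svn.simple"
        rem remove them to force a new prompt
        del /q "%APPDATA%\Subversion\auth\svn.simple\*"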

    Read the article

  • How to fix Windows 2008 R2 showing less memory than available

    - by eugeneK
    I've got Windows 2008 R2 64-bit installed on a Dell R410 server with 8GB of RAM. Dell OpenManage shows 8GB total and 4GB available for use. In Windows Control Panel > System I see 64-bit and 8GB of RAM, while the Performance tab of Windows Task Manager shows only 4GB of memory available. Dell support ran some checks and told me that if the BIOS shows 8GB of RAM (and it does), then it's an operating system issue. I've tried searching online for a resolution but found none. Please help, thanks.
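
    Two quick checks that might narrow it down (my suggestions, not from the original question): confirm what Windows sees per memory module, and check whether a boot-time memory cap is set:

        rem list the modules and sizes Windows detects
        wmic memorychip get capacity, devicelocator
        rem show the current boot entry; a truncatememory value here (set by msconfig's "Maximum memory") would cap usable RAM
        bcdedit /enum {current}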

    Read the article

  • HP ML150 G6 upgrading RAM/CPU beyond specs?

    - by Morten Green Hermansen
    I am being told that some limits on some HP servers can be exceeded. Do any of you have experience with that? An ML150 G6 is limited to 48GB of RAM, but I have been talking to a German company that guarantees me this server can be upgraded to 384GB of RAM (using 32GB memory modules and 2 CPUs): http://www.compuram.de/en/memory,HP+%28-Compaq%29,Server,Proliant,ML150+G6.htm Can this really be true? The server I have uses E5504 CPUs, but will I be able to upgrade to any CPU that uses the LGA1366 socket, all the way from a low-wattage L5640 to the 6-core, high-wattage versions like an X5650 (if cooling and power are adequate, of course)? Is there any limitation from the power regulators or the chipset (Intel 5500)? I am looking forward to any reply. Thanks in advance and best regards, - Morten Green Hermansen, Fanitas

    Read the article

  • Eclipse: Organising Files

    - by someguy
    I want to import a project that I'm planning to build upon. The problem is that it is very messy, with source files, class files and libraries all under one directory. How would I organise these files using Eclipse? I know you can change the source folder and the output folder, but when I change the source folder, the files that I want inside it do not physically move to that folder. The output folder is fine, though. I would also like a separate folder for libraries, but I'm not sure how to go about this. Here's how I would like it:

        src: this folder will contain source files.
        bin: this folder will contain binary (class) files.
        lib: this folder will contain external libraries.
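
    For reference, once the files physically live under src, bin and lib, the project's .classpath file would describe that layout roughly like this (a sketch only; the jar name is a placeholder and the JRE container entry varies by Eclipse version):

        <?xml version="1.0" encoding="UTF-8"?>
        <classpath>
            <classpathentry kind="src" path="src"/>
            <classpathentry kind="lib" path="lib/some-library.jar"/>
            <classpathentry kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER"/>
            <classpathentry kind="output" path="bin"/>
        </classpath>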

    Read the article

  • Commit charge peak higher than system limit

    - by Grubsnik
    We are seeing some very strange behaviour on our servers and Google didn't turn up anything useful, so I'm tossing it out here. A standard server is configured with 4GB of RAM and two 4GB page files, and runs Windows Server 2003. The servers run 50-120 VB6/.NET applications which normally consume no more than 100 MB of memory each, but will occasionally run up to 300 MB. The issue of a single process using far too much memory is being traced down elsewhere; the thing that is baffling us is that the reported peak commit charge is vastly higher than what we have available. As the image above shows, we are getting reported peaks far higher than what the system is actually capable of delivering. This number has been seen as high as 29GB, which makes no sense at all for a system with a commit limit of 12GB (4GB of RAM plus two 4GB page files). Does anyone have an idea what is going on?

    Read the article

  • Apache Custom Log Format

    - by Shishant
    Hello, I am trying to write a reward system wherein users are given reward points if they download complete files, so what should my log format be? This is my first time and I haven't done custom logs before, and this is what I understand after searching a lot. First of all, which file should I edit for custom logs? This is the thing I can't find. I am using an Ubuntu server with the default Apache, PHP5 and MySQL installation.

        # I use these commands and they work fine
        nano /etc/apache2/apache2.conf
        /etc/init.d/apache2 restart

    I think this is what I need to do for my purpose:

        LogLevel notice
        LogFormat "%f %u %x %o" rewards
        CustomLog /var/www/logs/rewards_log rewards

    Is this correct as it is, or is there something missing? Is there any particular location where I need to add this? One more thing: is %o the file size that was sent, and is it possible to log only files from a particular directory, or only files larger than 10MB? Thank you.
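
    For reference, a sketch built only from directives I'm sure exist (my suggestion, not a confirmed answer): %f is the filename, %u the authenticated user, %X the connection status when the response finished (useful for spotting aborted downloads), and %O the actual bytes sent, which requires mod_logio. Restricting logging to one directory can be done with SetEnvIf plus the env= clause on CustomLog; the /downloads/ prefix is a placeholder:

        LogFormat "%f %u %X %O" rewards
        SetEnvIf Request_URI "^/downloads/" rewardlog
        CustomLog /var/www/logs/rewards_log rewards env=rewardlog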

    Read the article

  • Add entire 300 GB filesystem to Git Annex repository?

    - by Ryan Lester
    By default, I get an error from the process that I have too many open files. If I lift the limit manually, I get an error that I'm out of memory. For whatever reason, it seems that Git Annex in its current state is not optimised for this sort of task (adding thousands of files to a repository at once). As a possible solution, my next thought was to do something like:

        cd /
        find . -type d | git annex add --$NONRECURSIVELY
        find . -type f | git annex add
        # Need to add the parent directories of each file first, or adding the files fails

    The problem with this solution is that the documentation doesn't seem to describe a way to add a directory non-recursively in Git Annex. Is there something I'm missing, or a workaround for this? If my proposed solution is a dead end, are there other ways people have solved this problem?
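
    One workaround that might sidestep the open-files and memory limits (my suggestion, untested against this setup) is to feed the files to git annex add in fixed-size batches rather than all at once:

        cd /
        # add files in batches of 1000 so no single git-annex invocation sees the whole tree
        find . -type f -print0 | xargs -0 -n 1000 git annex add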

    Read the article

  • Delete files from directory: memory exhausted

    - by codeholic
    This question is a logical continuation of http://serverfault.com/questions/45245/how-can-i-delete-all-files-from-a-directory-when-it-reports-argument-list-too-lo

    I have:

        drwxr-xr-x 2 doreshkin doreshkin 198291456 Apr 6 21:35 session_data

    I tried:

        find session_data -type f -delete
        find session_data -type f | xargs rm -f
        find session_data -maxdepth 1 -type f -print0 | xargs -r0 rm -f

    The result is the same:

        find: memory exhausted

    What can I do to remove this directory?
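
    Two approaches that avoid building the whole file list in memory (my suggestions, not from the original question): stream the directory entries with a small Perl loop, or let rsync empty the directory by syncing an empty one over it:

        # unlink entries one at a time without ever holding the full listing
        perl -e 'opendir(D, "session_data") or die; while (defined($f = readdir D)) { unlink "session_data/$f"; }'

        # or: overwrite the directory contents with an empty directory
        mkdir /tmp/empty
        rsync -a --delete /tmp/empty/ session_data/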

    Read the article

  • Deleting files using .NET that were migrated from win2k3

    - by Andrew Duncan
    We recently migrated an ASP.NET website from Windows 2003 to Windows 2008 R2 by zipping up all the files and extracting them to the new site. Since migrating, the web application is still able to upload and delete new files; however, it's unable to delete files that were copied over from the original Win 2k3 app. We're guessing it's a permissions problem because the error is: Access to the path 'E:.......PATH.....' is denied. We've been trying to match the permissions of a newly uploaded file to those of the migrated files. Newly uploaded files seem to get the app pool user in their permissions and as the owner, whereas the original files don't have this. Any help anyone can offer would be fantastic. Thanks,
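
    If the missing app pool ACL really is the cause, a sketch of how one might grant it recursively from an elevated prompt (the path and app pool name below are placeholders, not from the original post):

        icacls "E:\Sites\MySite\Uploads" /grant "IIS APPPOOL\MyAppPool":(OI)(CI)M /T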

    Read the article
