Search Results

Search found 40479 results on 1620 pages for 'binary files'.


  • Linux/Apache performance very slow even on local network

    - by klausch
    I have an Ubuntu server machine running Apache and MySQL. System and version info is as follows:

        Linux kernel 3.0.0-12
        Apache/2.2.20
        MySQL Ver 14.14 Distrib 5.1.58

    I am running a few websites on this server, some HTML only, some PHP/MySQL. The problem is that response time is very slow, on static as well as dynamic sites. Sometimes it takes more than 10 seconds before a response is given, which makes the sites very slow and almost unusable. The problem occurs even when requesting from the local network. I have added the involved subdomains to my /etc/hosts file, and above all the problem is not solved by using IP numbers instead of URLs, so there is no DNS lookup issue. I have modified the log format to show the response times, and sometimes a file takes 12 seconds to be served; see the jquery~.js file in the example screenshot (link to screenshot picture from access logfile). I have no explanation for this extremely long response time, but it is not even the only issue here: some other files take a long time to be served too, yet do not show a long response time in the log file. So probably different issues are involved. I cannot find a solution so far; any suggestions? Thanks in advance, Klaas

    Some extra configuration info, apache2.conf (comments removed):

        LockFile ${APACHE_LOCK_DIR}/accept.lock
        PidFile ${APACHE_PID_FILE}
        Timeout 300
        KeepAlive On
        MaxKeepAliveRequests 100
        KeepAliveTimeout 5

        <IfModule mpm_prefork_module>
            StartServers 5
            MinSpareServers 5
            MaxSpareServers 10
            MaxClients 150
            MaxRequestsPerChild 0
        </IfModule>

        <IfModule mpm_worker_module>
            StartServers 2
            MinSpareThreads 25
            MaxSpareThreads 75
            ThreadLimit 64
            ThreadsPerChild 25
            MaxClients 150
            MaxRequestsPerChild 0
        </IfModule>

        <IfModule mpm_event_module>
            StartServers 2
            MinSpareThreads 25
            MaxSpareThreads 75
            ThreadLimit 64
            ThreadsPerChild 25
            MaxClients 150
            MaxRequestsPerChild 0
        </IfModule>

        User ${APACHE_RUN_USER}
        Group ${APACHE_RUN_GROUP}
        AccessFileName .htaccess

        <Files ~ "^\.ht">
            Order allow,deny
            Deny from all
            Satisfy all
        </Files>

        DefaultType text/plain
        HostnameLookups Off
        ErrorLog ${APACHE_LOG_DIR}/error.log
        LogLevel warn
        Include mods-enabled/*.load
        Include mods-enabled/*.conf
        Include httpd.conf
        Include ports.conf
        LogFormat "%v:%p %h %l %u %t \"%r\" %>s %O \"%{Referer}i\" \"%{User-Agent}i\"" vhost_combined
        LogFormat "%h %l %u %t \"%r\" %>s %O \"%{Referer}i\" \"%{User-Agent}i\" %T/%D" combined
        LogFormat "%h %l %u %t \"%r\" %>s %O" common
        LogFormat "%{Referer}i -> %U" referer
        LogFormat "%{User-agent}i" agent
        Include conf.d/
        Include sites-enabled/

    And the virtual host file for one of the slow sites; in fact it is pretty straightforward:

        <VirtualHost *:80>
            ServerAdmin [email protected]
            ServerSignature EMail
            ServerName toenjoy.drsklaus.nl
            DocumentRoot /var/www/toenjoy.drsklaus.nl
            <Directory />
                Options FollowSymLinks
                AllowOverride None
            </Directory>
            <Directory /var/www/toenjoy.drsklaus.nl/>
                Options Indexes FollowSymLinks MultiViews
                AllowOverride AuthConfig
                AuthType Basic
                AuthName "To Enjoy"
                AuthUserFile /etc/.htpasswd
                Require user petraaa
                Order allow,deny
                allow from all
            </Directory>
            ScriptAlias /cgi-bin/ /usr/lib/cgi-bin/
            <Directory "/usr/lib/cgi-bin">
                AllowOverride None
                Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch
                Order allow,deny
                Allow from all
            </Directory>
            ErrorLog /var/log/apache2/error.log
            # Possible values include: debug, info, notice, warn, error, crit, alert, emerg.
            LogLevel warn
            CustomLog /var/log/apache2/access.log combined
            Alias /doc/ "/usr/share/doc/"
            <Directory "/usr/share/doc/">
                Options Indexes MultiViews FollowSymLinks
                AllowOverride None
                Order deny,allow
                Deny from all
                Allow from 127.0.0.0/255.0.0.0 ::1/128
            </Directory>
        </VirtualHost>

    And the output of free -m:

        klaas@ubuntu-server:/etc/apache2$ free -m
                     total       used       free     shared    buffers     cached
        Mem:          1997       1401        595          0        144       1017
        -/+ buffers/cache:        238       1758
        Swap:         2035          0       2035

    I have no indication that swapping occurs at the moments the site is slow. I have run top, and it does not appear to be a CPU issue. I have the impression that the spawning of an Apache process could be the bottleneck, but that is just a guess. Maybe this gives some extra information!

    EDIT: The problem seemed to be gone for some time, but it occurs again! And not only with Apache: connecting over SSH also takes a tremendous time, sometimes up to 15 seconds before the passphrase is asked for. scp also works very slowly. The behaviour is really unpredictable and makes the server very hard to use. Any ideas?
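
    (A diagnostic sketch, not part of the original post: since both Apache and SSH stall, timing each phase of a request can show whether the delay is in connection setup or in the transfer itself. The hostname is taken from the vhost above.)

        curl -o /dev/null -s -w 'dns=%{time_namelookup}s connect=%{time_connect}s ttfb=%{time_starttransfer}s total=%{time_total}s\n' http://toenjoy.drsklaus.nl/

    If the connect phase is fast but the first byte is slow for both Apache and SSH, reverse DNS on the server is a classic culprit; HostnameLookups is already Off for Apache, and "UseDNS no" in /etc/ssh/sshd_config does the same for SSH.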

  • Steps to install solely Ubuntu 13.04 on a Dell Inspiron 14z ultrabook with SSD+HDD

    - by rishy
    I have tried a few things, like disabling Intel Smart Response and choosing AHCI in the BIOS, but there are certain problems I am still facing. I can't see my SSD during the installation of Ubuntu (I am planning to install Ubuntu on my SSD and keep other files on the HDD). When I run Ubuntu my laptop gets overheated and battery life drops to 90 minutes (I guess it's related to my graphics card, an ATI Radeon HD 7570). The cooling fan seems to run at full speed; it worked much better in Windows. So, overall, I want to know the exact steps I need to follow to install Ubuntu on my SSD and use my HDD for other files, and how I can get rid of the overheating and battery problems.
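
    (A sketch of the usual first checks, not from the original post; the TLP PPA below is an assumption for Ubuntu 13.04.)

        sudo lsblk -d -o NAME,SIZE,MODEL     # from the live session: the SSD should be listed alongside the HDD
        sudo add-apt-repository ppa:linrunner/tlp && sudo apt-get update
        sudo apt-get install tlp tlp-rdw     # laptop power management, often helps heat and battery life
        sudo tlp start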

  • Color highlights have vanished in emacs

    - by Jim Kiley
    I'm using emacs on a remote Linux server that I access via ssh. I'm editing C files that have a non-standard suffix, so I have had to enter c-mode manually with M-x c-mode every time I open one of those files. I found this annoying, so I started monkeying with my .emacs to make that problem go away. This made all the color highlights in c-mode go away instead. Correction: all my color highlights are gone. I've removed the .emacs file and logged out and back in, but the color highlights are still gone. I miss them! They were very helpful. How do I get them back?
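
    (A hedged sketch of the usual ~/.emacs fix; ".xyz" is a placeholder for the non-standard C suffix.)

        ;; turn syntax highlighting back on globally
        (global-font-lock-mode 1)
        ;; open files with the non-standard suffix in c-mode automatically
        (add-to-list 'auto-mode-alist '("\\.xyz\\'" . c-mode))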

  • What statistics app should I use for my website?

    - by Camran
    I have my own server (with root access). I need statistics on the users who visit my website, etc. I have looked at an app called Webalizer; is this a good choice? I run Apache 2 on an Ubuntu 9.x system. If you know of any good statistics apps for servers, please let me know. And a follow-up question: all statistics are read from log files, right? So how large would these log files become? The possibility to split them would be good; I don't know if that is possible with Webalizer, though.
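
    (On the follow-up: splitting and capping log growth is normally handled by logrotate rather than by the statistics app. A minimal sketch for /etc/logrotate.d/apache2, assuming the stock Ubuntu log paths.)

        /var/log/apache2/*.log {
            weekly
            rotate 52
            compress
            delaycompress
            notifempty
            sharedscripts
            postrotate
                /etc/init.d/apache2 reload > /dev/null
            endscript
        }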

  • WordPress serving PHP but not CSS or JS

    - by Jason
    I'm trying to set up an Amazon EC2 instance to run a Django app and a WordPress instance side by side, differing only by the incoming URL. Initially, accessing the site via mysite.com/wordpress worked, but I also needed to catch incoming requests for the subdomain address blog.mysite.com. To do that, I created a default file in /etc/apache2/sites-enabled and included two virtualhost directives, one of which was:

        <VirtualHost *:80>
            ServerName www.blog.mysite.com
            <Directory /var/www/wordpress>
                Order deny,allow
                Allow from all
            </Directory>
        </VirtualHost>

    This created some errors with the other virtualhost, so I restored the default 000-default file configuration and restarted. Now accessing mysite.com/wordpress takes forever, and even then the CSS and JS files do not load. Inside the Firebug Net tab I can see the HTML response, but the CSS and JS files are not loading at all. What happened here?
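
    (A sketch of how the two name-based vhosts could coexist, under the assumption that the Django site answers on the bare domain. Note that the original block's ServerName www.blog.mysite.com would not match requests for blog.mysite.com, since name-based matching compares the Host header exactly.)

        NameVirtualHost *:80

        <VirtualHost *:80>
            ServerName mysite.com
            ServerAlias www.mysite.com
            DocumentRoot /var/www
            # Django (WSGI) configuration goes here
        </VirtualHost>

        <VirtualHost *:80>
            ServerName blog.mysite.com
            DocumentRoot /var/www/wordpress
            <Directory /var/www/wordpress>
                Order deny,allow
                Allow from all
            </Directory>
        </VirtualHost>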

  • Simple secured SFTP tunnel?

    - by babonk
    I'd like to set up an SFTP tunnel so that I can connect to an IP-restricted SFTP server through a gateway computer from anywhere, and download the files to anywhere. I was thinking of using a combination with netcat, having it listen for either WinSCP or PuTTY SFTP (doesn't matter which). I'm not sure how I would download the files to the connecting computer, though. I would like the tunnel to be secured, preferably with a username/password. I'm open to alternative software, but I'm looking for unintrusive, simple command-line tools, because I don't want to install a lot on this computer. Thanks
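
    (A minimal sketch using plain OpenSSH port forwarding instead of netcat, assuming the gateway runs sshd; the hostnames are placeholders.)

        # forward local port 2222 through the gateway to the IP-restricted SFTP server
        ssh -L 2222:sftp.internal.example:22 user@gateway.example

        # then, in another shell (or WinSCP/PSFTP pointed at localhost:2222):
        sftp -oPort=2222 user@localhost

    The tunnel itself is then authenticated and encrypted by SSH, which covers the username/password requirement.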

  • ffmpeg error while segmenting

    - by Tommy Ng
    I'm using ffmpeg and segmenter on Ubuntu 10.04 to create a transport stream from FLV/H.264 video files and then cut the .ts into segments for iPad streaming. Some .ts files produce an error with segmenter:

        Output #0, mpegts, to '29':
            Stream #0.0: Video: 0x0000, yuv420p, 480x360, q=2-31, 90k tbn, 25 tbc
            Stream #0.1: Audio: 0x0000, 0 channels, s16
        [mpegts @ 0x11f4ac0] sample rate not set
        Could not write mpegts header to first output file

    My ffmpeg command for creating the .ts file:

        ffmpeg -i 1.flv -f mpegts -acodec libfaac -ar 48000 -ab 64k -s 480x360 -vcodec libx264 -b 192k -flags +loop -cmp +chroma -partitions +parti4x4+partp8x8+partb8x8 -subq 5 -trellis 1 -refs 1 -coder 0 -me_range 16 -keyint_min 25 -sc_threshold 40 -i_qfactor 0.71 -bt 200k -maxrate 192k -bufsize 192k -rc_eq 'blurCplx^(1-qComp)' -qcomp 0.6 -qmin 10 -qmax 51 -qdiff 4 -level 30 -aspect 480:360 -g 30 -async 2 -y 1.ts

    My segmenter command:

        segmenter 1.ts 10 1 1.m3u8 path/to/streams/
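
    (A hedged first check, not from the original post: the "0 channels" and "sample rate not set" lines suggest the failing inputs have no usable audio stream, so inspecting the source and dropping audio when it is absent is a quick test.)

        ffmpeg -i 1.flv    # with no output file this just prints the input streams; check for an Audio line
        # if the source has no audio stream, replace the audio options with -an:
        ffmpeg -i 1.flv -f mpegts -an -s 480x360 -vcodec libx264 -b 192k -y 1.ts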

  • Is Ethernet 10/100 LAN transfer faster than USB 1.0?

    - by dag729
    I have an old laptop (PIII 800 MHz with 256 MB RAM) that I wish to use as my home server. It will have to serve just two people, so I think I'll be more than OK as far as RAM and CPU go. The issue is the data, because the internal hard disk is 12 GB, which is ridiculous! I have more than 60 GB of mixed storage and counting (images, videos and music) on an external USB hard disk. I could put that disk in my desktop PC and serve the big files over Ethernet, or leave it in its USB enclosure attached to the laptop. The question is: which of these solutions will be faster? USB 1.0 attached to the server (laptop), or the desktop's internal disk serving files over 10/100 Ethernet on demand?
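
    (The raw numbers, for reference: even at realistic efficiencies the Ethernet path comes out several times faster.)

        USB 1.x full speed:   12 Mbit/s ≈  1.5 MB/s theoretical
        100BASE-TX Ethernet: 100 Mbit/s ≈ 12.5 MB/s theoretical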

  • Backup solution

    - by user66115
    We are currently looking for a new backup solution. Our current network is 5 remote locations with a tape backup in each plant. Right now we are looking at an MPLS VPN and running backups out of our main plant. The main things that we back up are user private folders and department files, and each plant has its own file server that houses CAD drawings. My plan is to have everything but the CAD drawings at the main facility. We would start with a full backup of the drawing files and then send change-only backups back to the main plant. Besides tape, what would be the best way to back up? Our contact at PC Connection is pointing us toward a Tandberg Data device.
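
    (A sketch of the "changes back to the main plant" step, assuming the file servers can run rsync over the MPLS VPN; hostnames and paths are placeholders.)

        # full copy on the first run, then only changed CAD files afterwards
        rsync -az --partial --delete /srv/cad/ backup.main-plant.example:/backups/plant1/cad/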

  • Why does my drive show as 180GB used, but copies only 40GB?

    - by Manuel
    My computer crashed, so I removed the drive (with Windows XP Professional 32-bit) and put it in another computer running Windows 7 64-bit. Booting into Win7 and going to My Computer shows the XP drive as more than 180 GB used. So, as a quick backup, I'm trying to copy all the disk's contents to a folder, but when I start copying only 40 GB is shown as the total size of the files to copy. I have enabled showing system and hidden files in the View options. What could be the problem?
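
    (A hedged suggestion: Explorer silently skips files the current user cannot read under the old XP ACLs, which can account for the missing bulk. Robocopy run from an elevated prompt in backup mode copies regardless of the ACLs; the drive letters are placeholders.)

        robocopy X:\ D:\xp-backup /MIR /B /R:1 /W:1 /LOG:D:\xp-backup.log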

  • I can't write to a folder which I'm a member of

    - by user3265472
    I'm trying to set up folder access for a group so that all members of that group can create/edit/delete files within the folder.

        # create my group and add a member
        sudo addgroup dev
        sudo adduser martyn dev

    Now, logged in as "martyn", I check that my user has been added to the "dev" group:

        $ groups martyn
        martyn : martyn dev

    Now I want to change the group ownership of my project folder so all members of that group can edit it and the files/folders within it:

        sudo chgrp -R dev myproject

    Just to check:

        martyn@localhost:/var/www$ ls -l
        total 4
        drwxrwxr-x 3 dev dev 4096 May 31 15:53 myproject

    Now here's where it fails. I want to create a file within myproject (logged in as "martyn", a member of "dev"):

        vi myproject/test

    ...but when I try to save the file I get the following error:

        "myproject/test" E212: Can't open file for writing

    Why, as user "martyn", a member of "dev", can I not write this file? Even if I create the file so it exists, change its ownership to "dev", then try to edit and save, I get the same error.
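
    (A likely explanation, offered as a sketch: "groups martyn" reads the group database, but a running session keeps the group set it had at login, so a session that predates the adduser does not have dev active yet.)

        id            # shows the groups active in *this* session, unlike "groups martyn"
        newgrp dev    # start a shell with dev active; logging out and back in also works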

  • Enable mod_deflate per directory level

    - by z1_jabbar
    I am using the following configuration. When I access the site it compresses all the JSPs under the /abc path, but it ignores all the JS and CSS files. I want to compress the JS and CSS files under all the subfolders of the /abc path too. How can I do this? Thanks!

        <LocationMatch "/abc">
            <IfModule mod_deflate.c>
                SetOutputFilter DEFLATE

                # Don't compress images
                SetEnvIfNoCase Request_URI \.(?:gif|jpe?g|png)$ no-gzip dont-vary

                # Don't compress PDFs
                SetEnvIfNoCase Request_URI \.pdf$ no-gzip dont-vary

                # Don't compress compressed file formats
                SetEnvIfNoCase Request_URI \.(?:7z|bz|bzip|gz|gzip|ngzip|rar|tgz|zip)$ no-gzip dont-vary

                <IfModule mod_headers.c>
                    Header append Vary User-Agent
                </IfModule>
            </IfModule>
        </LocationMatch>
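
    (A hedged alternative: compressing by content type instead of excluding extensions sidesteps the exclusion list and catches CSS/JS wherever they live under /abc; application/x-javascript is the type Apache 2.2 typically assigns to .js files.)

        <LocationMatch "/abc">
            <IfModule mod_deflate.c>
                AddOutputFilterByType DEFLATE text/html text/css application/x-javascript application/javascript
            </IfModule>
        </LocationMatch>

    If the CSS/JS are served by a front-end proxy or a different handler rather than by this Apache, the filter here would never see them, which is worth ruling out first.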

  • Recommend web file-sharing software, please

    - by Baczek
    I'm looking for a web platform to put company files on. My requirements:

    - should be accessible via a browser
    - should be open source
    - must be installable (Dropbox is a no-go)
    - must have an option to put an access-time limit on a file
    - must perform garbage collection automatically after a file expires
    - must be able to mark files as public or private
    - an option to protect a file with a PIN code, for users without accounts in the system, would be nice to have

    The problem is I don't even know what to search for; all my googling turns up either complete groupware solutions or P2P file-sharing software. If such a thing doesn't exist, please don't hesitate to say so, so I can crawl to a corner and cry myself to sleep. TIA

  • GlassFish v3 and Java EE in production mode: what are the options to update a live web app?

    - by shadesco
    I am building a web app using Java EE and GlassFish v3. I want to move it to production mode soon; however, I have zero experience using GlassFish in production. I would appreciate some guidance on how to approach the following scenario: say I have deployed the web app using the admin console, pointing to the .war file. What if I want to update this live application? Do I need to: a) undeploy, build a new .war file (with updates), copy it to the app folder and redeploy? Or b) move in only the changed files, i.e. .class files, JSPs, etc., without undeploying first?
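
    (A sketch of the usual CLI route for option (a), without clicking through the console; the application name is whatever "asadmin list-applications" reports, and the path is a placeholder.)

        asadmin list-applications
        asadmin redeploy --name mywebapp /path/to/mywebapp.war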

  • Homegroup sharing problems

    - by soandos
    I can see the other people in my homegroup and their folders, but when I click on those folders I cannot see the contents (no error message; just nothing happens). The other people in the homegroup can see me and my files just fine. In addition, I cannot see the files they have under the Network tab, though they can all see each other's stuff. What could be the issue? The homegroup has already been created and recreated numerous times. Perhaps unrelated, but I am also unable to turn off password-protected sharing.

  • How can I display the host name on the Windows desktop?

    - by Martin
    I do a lot of work on Windows Server 2008 remote desktops and often lose track of which host I am currently logged on to. Is there a way of displaying (without installing any non-standard apps) the host name or IP address of the host I am connected to, in either the wallpaper or the notification area? I tried creating files on the desktop with the name of the machine, but my roaming profile shows the same set of desktop files on every machine, so that was scuppered. Duh! In shell windows this is easy: just set the prompt to display the host name. Surely there is a simple way of doing the same for the graphical desktop.

  • Add folder name to beginning of filename - getting multiple renames

    - by Flibble Wibble
    I've used dbenham's excellent response to the question of how to add the folder name to the beginning of a filename in a cmd script:

        @echo off
        pushd "Folder"
        for /d %%D in (*) do (
            for %%F in ("%%~D\*") do (
                for %%P in ("%%F\..") do (
                    ren "%%F" "%%~nxP_%%~nxF"
                )
            )
        )
        popd

    What I'm finding is that, seemingly randomly (though it probably isn't), the script will run through several child folders and rename correctly, but then it gets to a folder where it gets stuck in a loop and starts adding the folder name repeatedly to the files inside. I have 90,000 files in 300 folders to rename this weekend. Any chance you can guess the cause? PS: Is there a maximum number of files that is acceptable in each folder?
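
    (A guess at the cause, offered as a sketch: a plain "for" enumerates the directory while the renames are happening, and on some filesystems a renamed file can be encountered again later in the same scan, picking up the prefix repeatedly. Capturing the listing first with "for /f" over "dir /b" avoids that; 300 files per folder is nowhere near any practical limit.)

        @echo off
        pushd "Folder"
        for /d %%D in (*) do (
            rem capture the file list before renaming anything in this folder
            for /f "delims=" %%F in ('dir /b /a-d "%%D"') do (
                ren "%%D\%%F" "%%D_%%F"
            )
        )
        popd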

  • Is it safe to compress my Windows 7 %USERPROFILE%\AppData folder?

    - by Kev
    Having just read Scott Hanselman's latest blog entry, Guide to Freeing up Disk Space under Windows 7, in which he suggests turning on NTFS compression, I am wondering whether it's wise to turn it on for the whole of my %USERPROFILE%\AppData folder. I already use it for a number of less-travelled folders that contain static files, such as downloads or images. My system drive is a 128 GB SSD in a Dell Precision T5400 3 GHz quad-core Xeon workstation, so I ought not to notice the extra cycles used to compress and decompress files on their way to and from the disk. However, would there be any good reason not to do this? In fact, could I safely compress the whole of my %USERPROFILE% folder?
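
    (If you try it, the built-in compact tool can compress the tree selectively and report the ratio; a sketch, run from an elevated prompt. Files that are locked while in use simply get skipped thanks to /i.)

        compact /c /s:"%USERPROFILE%\AppData" /i /q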

  • Applications start very slowly from a network path

    - by Snowfox
    Hi. We have a Windows 2008 server which hosts the network share \\srvcompany\lib. This share contains several applications needed for daily business, and every client/user (all Win XP) has desktop shortcuts to these apps. The problem is that on several (but not all) clients the apps start very slowly. If I copy an application's program files to a local folder, it starts quickly. When I watch memory usage in Task Manager on a "slow" machine while an application starts, I notice that memory usage grows much more slowly than when I start the app on a "fast" machine. But when I copy files from this share with Windows Explorer, the speed is nearly the same. I've also checked the network driver; both tested clients have the same network card with the same driver version. Has anyone an idea where or what I should check next to solve this problem? Thanks for any answers.

  • How to automatically print the contents of a folder in OS X?

    - by MDRoz
    I would like to be able to set up a folder on my Mac where I could dump files and have them automatically sent to the default printer. This way I would be able to print files at home when I'm not physically there, using something like Dropbox. It wouldn't have to be real-time; a scheduled job that checks the folder every so often would be acceptable. What's the easiest approach? Automator? AppleScript? A cron job?
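
    (One possible shape for the cron route, as a sketch: a small shell script that prints whatever lands in a drop folder and then moves it aside; the folder name is a placeholder.)

        #!/bin/sh
        # print everything dropped into ~/PrintDrop, then archive it
        DIR="$HOME/PrintDrop"
        DONE="$DIR/printed"
        mkdir -p "$DONE"
        for f in "$DIR"/*; do
            [ -f "$f" ] || continue
            lpr "$f" && mv "$f" "$DONE/"    # lpr sends to the default CUPS printer
        done

    A crontab line such as "*/5 * * * * $HOME/bin/printdrop.sh" would then check the folder every five minutes.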

  • Online backup service _with_ filtering (by extension, size and so...)

    - by QyRoN
    Hello, I recently discovered the ability to back up personal data to an online server, but I was surprised to find that no popular service seems to provide filtering capabilities, i.e. they all back up the entire contents of a specific folder. Are there any free options with filtering? To be specific, I need the following features:

    - Backup to an online server.
    - Automatic but bandwidth-transparent backup, i.e. it backs up my files automatically but won't try to do so while I'm heavily using the computer or the internet.
    - Individual filtering settings for folders, i.e. I want to specify which files to back up in every folder.
    - Some free plan (since I'm not going to use more than 500 MB of space).

  • Is it okay to use random URLs instead of passwords?

    - by stew
    Is it considered "safe" to use URL constructed from random characters like this? http://example.com/EU3uc654/Photos I'd like to put some files/picture galleries on a webserver that are only to be accessed by a small group of users. My main concern is that the files should not get picked up by search-engines or curious power-users that poke around my site. I've set up an .htaccess file, just to notice that clicking on http://user:pass@url/ links doesn't work well with some browsers/email clients, prompting dialogs and warnings messages that confuse my not-too-computer-savy users.

  • Open original Microsoft Office document (not "version 1") on Mac OS X Lion restart

    - by FlyingMolga
    My MacBook Pro running Lion has been freezing frequently lately, and I've had to restart it with the power button. When Lion starts up again, the Microsoft Office applications that were running relaunch and load different autosaved versions of the documents I had open (i.e. it does not open abc.xlsx but "[version 1] of abc.xlsx"). Sometimes it also opens the original files. Several times I've entered data into these "version 1" files, only to try to save and realize that it isn't the original file and is sometimes missing data that is contained in the original. Is there any way to make AutoRecover open the actual document with the unsaved changes, instead of making a new temporary version?

  • Cron job failing to back up a Postgres database

    - by user705142
    I'm unsure what's going on here: I've got a backup script which runs fine under root. It produces a 300 kB database dump in the proper directory. When running it as a cron job with exactly the same command, however, an empty gzip file appears with nothing in it. The cron log shows no error, just that the command has been run. This is the script:

        #! /bin/bash
        DIR="/opt/backup"
        YMD=$(date "+%Y-%m-%d")
        su -c "pg_dump -U postgres mydatabasename | gzip -6 > "$DIR/database_backup.$YMD.gz" " postgres

        # delete backup files older than 60 days
        OLD=$(find $DIR -type d -mtime +60)
        if [ -n "$OLD" ] ; then
            echo deleting old backup files: $OLD
            echo $OLD | xargs rm -rfv
        fi

    And the cron job:

        01 10 * * * root sh /opt/daily_backup_script.sh

    It produces a database_backup file, just an empty one. Anyone know what's going on here?
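
    (A common culprit, offered as a sketch: cron runs with a minimal PATH, so a pg_dump that resolves in an interactive root shell may not resolve inside the su -c under cron, and gzip then happily writes an empty file. Using absolute paths and capturing stderr exposes the real error; /usr/bin/pg_dump is an assumption, so check it with "which pg_dump". Incidentally, the cleanup find uses -type d, which matches directories; for the backup files it should be -type f.)

        # inside the script: absolute paths, and stderr logged somewhere visible
        su -c "/usr/bin/pg_dump -U postgres mydatabasename 2>> /tmp/pg_backup.err | /bin/gzip -6 > '$DIR/database_backup.$YMD.gz'" postgres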
