Search Results

Search found 40479 results on 1620 pages for 'binary files'.

Page 457/1620

  • Bandwidth monitor for apache websites

    - by bmaynard
    I am after a web application that will parse Apache log files and record how much bandwidth each user has used. We have several virtual hosts with custom log files, and the I/O is recorded at the end of each logfile. However, I can't find an application that will parse multiple log files and display a summary for each site. I believe AWStats can do this, but I want to be able to see all of my clients in one list. If there is something that integrates into Cacti, that would be perfect.
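
    For illustration, a rough sketch of the per-site summing this describes (not a polished tool): it assumes the stock "combined" log format, where the response size is the 10th whitespace-separated field, and the log path pattern is a placeholder.

        #!/usr/bin/env python
        # Sketch: sum the response-size field of each vhost's access log
        # and print a per-site total in megabytes.
        import glob

        for logfile in sorted(glob.glob("/var/log/apache2/*-access.log")):
            total = 0
            with open(logfile) as fh:
                for line in fh:
                    fields = line.split()
                    # field 10 is the byte count in the combined log format;
                    # skip lines where it is "-"
                    if len(fields) >= 10 and fields[9].isdigit():
                        total += int(fields[9])
            print("%-40s %8.1f MB" % (logfile, total / 1048576.0))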

    Read the article

  • Tell browsers to cache until last modified date changes?

    - by Chad Johnson
    My web site consists of static HTML files which are usually republished once per day, and sometimes more often. I'm using Apache. In the vhost settings for my site, I'd like to tell browsers to cache HTML files indefinitely, until Apache sees that they have been modified. As soon as an HTML file is changed, Apache should immediately begin telling browsers it has changed and send the updated file; browsers should never receive old versions of files. Maybe ExpiresByType text/html with a base of "modification" and no "plus x days" part? Is something like this possible?
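
    One hedged sketch of vhost directives that gets this behaviour (it assumes mod_headers is enabled): Apache already sends Last-Modified for static files, so marking HTML as no-cache makes browsers revalidate on every request and receive a 304 Not Modified until the file actually changes.

        # Sketch only: force revalidation of HTML so changes are picked up
        # immediately; unchanged files come back as cheap 304 responses.
        <FilesMatch "\.html?$">
            Header set Cache-Control "no-cache"
        </FilesMatch>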

    Read the article

  • Norton Backup "Failed to Restore"

    - by Teknophilia
    I recently had one of my computers (XP) die on me. I had its files set to automatically back up to another PC's hard drive using Norton. I've tried using Norton restore on the second computer to restore some files (Word documents, pictures), but when I do, I get a dialog box saying that it "Failed to Restore". When I click to continue, it shows a list of the files I tried to restore, along with a status indicator for each file (which says "invalid file"). Any ideas?

    Read the article

  • Restoring Subversion repositories from backup

    - by John Hoge
    Hi, I had to restore a Subversion server from a backup image taken the previous night. Everything worked fine after the restore except for one repository. A working copy had been committed on the server after the latest backup, so this working copy had newer files than the restored repository. I tried to commit the files using TortoiseSVN, but SVN didn't recognize that the files in the working copy were newer than those in the repository. I'm using Subversion Server 1.6.5 on Windows 2003 Server and TortoiseSVN 1.6.8 64-bit on a Windows 7 64-bit client. Thanks, John
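
    For reference, one common way out of this situation is a sketch along these lines (the repository URL and paths are placeholders): check out a fresh working copy from the restored repository, copy the newer files over it while skipping the .svn metadata, then commit the differences.

        rem Sketch: re-apply the newer files on top of a clean checkout
        svn checkout https://svnserver/svn/myrepo C:\fresh-wc
        robocopy C:\old-wc C:\fresh-wc /E /XD .svn
        svn status C:\fresh-wc
        svn commit C:\fresh-wc -m "Re-apply changes made after the last backup"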

    Read the article

  • How do I reassemble a zip file that has been emailed in multiple parts?

    - by Guy
    I received three emails, each containing part of a zip file; the extensions end in .z00, .z01 and .z02. (It was emailed this way to get around the typical 10MB attachment limit per email.) I have put all three files into one directory. Both 7-Zip and WinZip can open the first file (the .z00 file) and list the contents of the zip, but when I try to extract the files, both programs report errors. What is the least error-prone way of reassembling this zip and getting to the files?
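
    One recovery approach people sometimes use, sketched below under the assumption that the parts are simply a raw split of a single archive (filenames are placeholders, and the repair step needs Info-ZIP's zip tool installed): concatenate the pieces back together in order, then let zip try to fix the result.

        copy /b archive.z00+archive.z01+archive.z02 combined.zip
        zip -FF combined.zip --out repaired.zip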

    Read the article

  • Distorted data format in a WordPad file after moving from Windows XP to Windows 7

    - by Harpreet
    I have many data files that were set to open in WordPad on Windows XP. Those files have a particular format for the data, like the following:

        Name of Data file
        No. of data columns
        Name of data in column_1
        Name of data in column_2
        ...
        Name of data in column_n
        column_1 column_2 column_3 ... column_n

    Now my computer has been formatted and the OS has been changed to Windows 7, but when I open my data files in WordPad the format above is no longer present; it appears distorted. Does anyone know how to restore the format shown above, which is how the data used to look in XP? I have attached a snapshot of the new distorted format as seen in WordPad on Windows 7. The snapshot shows 100 column names, but only 5 data columns are present when there should actually be 100.

    Read the article

  • What tools should I use to edit H.264 MP4 GoPro videos?

    - by WW.
    I have recorded videos using a GoPro, which produces MP4 files containing H.264-encoded video. I would like to do some simple editing tasks on these videos without losing quality: cut various scenes together and change the soundtrack. I'm using Windows XP Pro, so I have Windows Movie Maker, which seems like it should be sufficient but cannot read the MP4 files that I have. Can I install a codec to allow WMM to read the MP4 files? Can I convert from MP4 to something that WMM reads? Is there a different video editing program that I should use? Free software would be preferable, but I'm willing to pay if it's a superior solution.

    Read the article

  • Folder Redirection won't load on Windows 7 Machine in Windows 2008 R2 Network

    - by leeand00
    Okay, so redirected profiles don't load, exactly: the computer is joined to the network, but it won't display any of the user's files on her desktop that are in her redirected profile. I know the redirection itself works because we have a Terminal Server, and when the user logs in there, her files appear. I checked the user's profile in Active Directory Users and Computers and compared it with a working user's profile. When that didn't turn up any differences, I looked at her computer account and found that on the Dial-in tab the Network Access Permission wasn't set to "Control access through NPS Network Policy" like it was on the other machines on the network, so I selected it, ran gpupdate /force on her machine and rebooted. This did not fix the issue. Is there anything else that could be preventing the redirected files on the user's desktop from showing up when she logs in?
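
    As a first diagnostic it may be worth confirming, on the affected Windows 7 machine while the user is logged on, that the Folder Redirection policy is actually being applied to her account (the report path below is a placeholder):

        gpresult /h C:\temp\gpreport.html /scope user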

    Read the article

  • uninstall mysql on linux with plesk

    - by Arsenal
    I'm having trouble uninstalling MySQL on my CentOS 4 server that runs Plesk. I'm actually trying to upgrade MySQL 4.1 to MySQL 5.0 using the following command: yum update mysql. However, I get a list of file conflicts. So I tried to remove MySQL 4.1 and perform a clean install, but when I use yum remove mysql* it deletes all of its dependencies, and apparently some of these are packages needed by Plesk, which causes Plesk to stop working. I did a full restore and everything is okay now, but how can I remove MySQL without ruining Plesk? I have also tried rpm -qa | grep mysql to get a list of all the MySQL packages and remove them one by one, but there's a duplicate in that list, so I can't delete those (because it says it doesn't know which one to take). Any help would be greatly appreciated!
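
    A hedged sketch of how this is often handled (exact package names vary, so check rpm -qa | grep -i mysql first): remove only the MySQL 4.1 packages themselves with --nodeps, so the dependent Plesk packages are left alone, and use --allmatches to clear the duplicate entries, then install 5.0 from whichever repository is providing it.

        # Sketch: drop only the old MySQL packages, keep their dependents
        rpm -e --nodeps --allmatches mysql mysql-server
        yum install mysql mysql-server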

    Read the article

  • Mac sees mangled filenames from Linux SAMBA share

    - by me2
    Certain filenames in my Samba shares from Linux are not getting transmitted properly. It affects certain files in certain folders, but not all folders or all files within a single folder. I can find no discernible pattern to the mangling, but I'm hoping this is a known problem. No amount of rebooting or restarting services will fix it. The filenames, when they get mangled, all end up in this form: 0JY4B3~H.M4V 0MBS1O~M.M4V 0NKDX9~R.M4V 0O0ZTA~A.M4V These are MPEG-4 files; the extension remains intact. Any ideas?
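
    Those names look like DOS 8.3 "mangled" names, which Samba falls back to when a filename contains characters it cannot represent for the client. A hedged smb.conf sketch worth trying (values are typical defaults for a UTF-8 setup, not verified against this particular server):

        [global]
            # send filenames as UTF-8 and stop 8.3 name mangling
            unix charset = UTF-8
            mangled names = no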

    Read the article

  • What are the requirements for getting Django translations to work?

    - by Espen Christensen
    Hi, I am hosting several Django sites on a CentOS 5 box, but I'm having difficulties with translations. First I had to upgrade the gettext package from 0.14 to 0.16, but that didn't help. Now I can make and compile translation files with the management commands, but the translations do not show. I am sure that the translation files are located in the right place, since they work with the same setup on a local installation, and Django's own translation files do not work either (e.g. the admin is not translated). What could I be missing in my server setup that makes this happen?
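
    A minimal sketch of the i18n-related settings worth double-checking on the server (the setting names match Django of that era; the language code and locale path are placeholders):

        # settings.py sketch: i18n must be on, the locale middleware loaded,
        # and the project's locale directory on LOCALE_PATHS.
        USE_I18N = True
        LOCALE_PATHS = ('/path/to/project/locale',)
        MIDDLEWARE_CLASSES = (
            'django.contrib.sessions.middleware.SessionMiddleware',
            'django.middleware.locale.LocaleMiddleware',   # after SessionMiddleware
            'django.middleware.common.CommonMiddleware',
            # ... the rest of the middleware stack
        )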

    Read the article

  • Broken characters in filenames only in some directories

    - by Kaivosukeltaja
    We have a web server running CentOS 5.8 that uses SVN for version control. When trying to switch to the latest revision, we got an error about the filenames of files in an upload directory:

        svn: Error converting entry in directory 'adm/emails/upload' to UTF-8
        svn: Valid UTF-8 data (hex: 54 79) followed by invalid UTF-8 sequence (hex: f6 6b 69 72)

    Upon investigating, we noticed there were some files that had broken filenames:

        $ ls ~/public_html/adm/emails/upload/
        Ty?el?m?trendit.csv  Ty?kirja1.csv

    To get the update completed quickly, we simply mved the files into our home directory. Surprisingly, their filenames looked fine in their new location:

        $ ls ~/
        Työelämätrendit.csv  Työkirja1.csv

    After the update we moved them back to where they were and their filenames were broken again. What could cause this and how can we fix it? The system's locale is set to LANG=en_US.UTF-8.
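
    For what it's worth, the invalid byte 0xf6 is "ö" in ISO-8859-1, so the names in that directory are almost certainly stored in Latin-1 rather than UTF-8; re-encoding the filenames themselves is the usual fix. A sketch with convmv (without --notest it only previews; add --notest once the preview looks right):

        convmv -f iso-8859-1 -t utf-8 -r --notest ~/public_html/adm/emails/upload/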

    Read the article

  • Set Microsoft Word template to always save documents based on it to a certain location

    - by nhinkle
    Some of my professors demand very specific formats for papers typed up for their courses. I've created word templates (.dotx files) for these, so I don't have to set up the formatting each time I go to write something. I already have a template for each of my classes, and have my files organized such that each class has its own directory. I would like to be able to specify a default save location for each template. I know how to set the general default save location for all documents, but I want to change it just for a specific template. Even if there were a way to have it save files generated by the template into the folder the template file resides in, that would be nice. Anybody have any ideas?

    Read the article

  • Directory comparison in Meld but ignoring changes that only involve file timestamp?

    - by creamcheese
    I'm using Meld to compare two directories of source code on Ubuntu. However, because all of the files in one of the directories have been 'touched' so that all of their timestamps were updated, Meld is showing them as different, even though the contents of the files have not changed. But I'm only trying to find files that have different content. I don't see an option to get Meld just to look at changed contents. Any ideas for how to do this in Meld or is there a better GUI directory comparison tool for Ubuntu?
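
    If Meld itself won't ignore timestamps, a quick content-only list of differing files can be had from the shell (a sketch; adjust the paths):

        diff -rq /path/to/dir1 /path/to/dir2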

    Read the article

  • Permissionless external drive with NTFS

    - by user12889
    I have an external hard disk which has one partition, formatted as NTFS. I use this drive on multiple computers with different logins on different machines, Windows XP and Windows 7. All files are plain old files, not OS-encrypted or compressed. Every now and then Windows 7 does not let me access some files, citing permission problems. I can circumvent this case by case by taking ownership and setting appropriate permissions, but this is tedious. Is there a simple way to tell Windows not to enforce or store any permissions on any file/directory on a partition?
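
    NTFS can't really be told to skip permissions entirely, but one common workaround is sketched below (the drive letter is a placeholder, and it rewrites the ACL on every file): from an elevated prompt, take ownership once and grant Everyone full control over the whole partition so any login on either machine can open the files.

        rem Sketch: make everything on the external drive readable by anyone
        takeown /f X:\ /r /d y
        icacls X:\ /grant Everyone:(OI)(CI)F /t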

    Read the article

  • How to synchronize two folders on two remote Linux virtual machines

    - by Manoj Agarwal
    I have two virtual machines; the host OS is ESXi 3.5 and the guest OS is CentOS 4.6. There are two remotely located ESXi servers, each containing a CentOS 4.6 virtual machine. I want any change I make to a file or folder on one virtual machine to be automatically synchronized to the other, remote virtual machine. The synchronization process should be automatic, and it should only sync differences rather than copying everything over with a blanket overwrite. The sync should be intelligent enough to work out what has changed and what hasn't, and should only update the changed files/folders. Further, there should be some sort of overview and selection for syncing: for example, if it shows that 4 files have changed, it should be possible to sync only two of them and leave the other two for the time being. So, some intelligent syncing mechanism for Linux is needed.
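
    For reference, rsync already covers the differential part of this, and a tool like Unison adds two-way sync with interactive selection of changes. A minimal one-way rsync sketch (hostnames and paths are placeholders; the -n dry run doubles as the "overview" of what would change):

        rsync -avzn /data/shared/ root@othervm:/data/shared/   # dry run: list what differs
        rsync -avz  /data/shared/ root@othervm:/data/shared/   # sync only the differences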

    Read the article

  • How can I create multiple identical AWS EC2 server instances with large amounts of persistent data?

    - by mojones
    I have a CPU-intensive data-processing application that I want to run across many (~100,000) input files. The application needs a large (~20GB) data file in order to run. What I would like to do is: (1) create an EC2 machine image that has my application and associated data files installed, (2) boot up a large number (e.g. 100) of instances of this image, and (3) split my input files into 100 batches and send one batch to be processed on each instance. I am having trouble figuring out the best way to ensure that each instance has access to the large data file. The data file is too big to fit on the root filesystem of an AMI. I could use Block Storage, but a given Block Storage volume can only be attached to a single instance, so I would need 100 clones. Is there some way to create a custom image that has more space on the root filesystem so that I can include my large data file? Or is there a better way to tackle this problem?
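
    One common pattern worth considering, sketched under the assumption that an S3 client is baked into the image (the bucket and paths below are placeholders): keep the 20GB file in S3 and have each instance pull it onto local instance storage from a startup script, so every clone fetches its own copy at boot instead of carrying it in the AMI.

        # Sketch: run at boot on each instance
        aws s3 cp s3://my-data-bucket/reference-data.dat /mnt/data/reference-data.dat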

    Read the article

  • What's a good way for organizing PDF documents on windows?

    - by Ivan
    I'm looking for a good way to manage a lot of PDF documents (e.g. papers, ebooks) on Windows. Ideally I'm looking for a Windows version of the great Mac app Yep. I've looked quite a bit and haven't found any Windows app that provides an organized overview of your PDF documents. I've considered just tagging the PDF files, but there don't seem to be any apps to simply tag files and search the tags easily. I've found TaggedFrog, but the tags are kept in the app's internal DB and are associated with the filename, so if you move or rename a file it loses all its tags. In a nutshell: is there a good Windows app to organize and efficiently tag PDF files?

    Read the article

  • Using wget to recursively download whole FTP directories

    - by user9406
    I want to copy all of the files and folders from one host to another. The files on the old host sit at /var/www/html, I only have FTP access to that server, and I can't tar all the files. A regular FTP connection to the old host drops me in the /home/admin folder. I tried running the following command from my new server: wget -r ftp://username:[email protected] But all I get is a made-up index.html file. What is the right syntax for using wget recursively over FTP?
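
    A sketch of the usual fix (credentials and host are placeholders): point wget at the actual directory and mirror it; -nH plus --cut-dirs keeps the var/www/html prefix from being recreated under the local download directory.

        wget -m -nH --cut-dirs=3 "ftp://USER:PASSWORD@HOST/var/www/html/"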

    Read the article

  • Do large folder sizes slow down IO performance?

    - by Aaron
    We have a Linux server process that writes a few thousand files to a directory, deletes the files, and then writes a few thousand more files to the same directory without deleting the directory. What I'm starting to see is that the process doing the writing is getting slower and slower. My question is this: the directory size of the folder has grown from 4096 to over 200000, as seen in this output of ls -l:

        root@ad57rs0b# ls -l 15000PN5AIA3I6_B
        total 232
        drwxr-xr-x 2 chef chef 233472 May 30 21:35 barcodes

    On ext3, can these large directory sizes slow down performance? Thanks. Aaron
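
    For what it's worth, ext3 directory files never shrink once they have grown, so lookups in a once-huge directory stay slow; the usual remedies are to recreate the directory or to make sure hashed directory indexes are enabled and rebuilt. A sketch (the device name is a placeholder, and e2fsck -D needs the filesystem unmounted):

        tune2fs -l /dev/sdX1 | grep dir_index    # is the feature already on?
        tune2fs -O dir_index /dev/sdX1           # enable it if not
        e2fsck -fD /dev/sdX1                     # rebuild and optimize directories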

    Read the article

  • rsync - How to exclude one .htaccess but not all of them

    - by Cory Gagliardi
    I have an rsync command for copying my files from dev to production. I don't want to copy the .htaccess file that's in the root of the HTML directory, but I do want to copy the few .htaccess files that are in its subdirectories. I'm using the argument --exclude .htaccess, which is stopping all of the .htaccess files from getting copied. The other arguments I'm including are -a --recursive --times --perms. Is it possible to configure rsync to do this? Edit: Here is my full command:

        rsync -a --recursive --times --perms \
            --exclude prop_images --exclude tracking --exclude vtours \
            --exclude .htaccess --exclude .htaccess_backup --exclude "*~" \
            /home/user/dev_html/* /home/user/public_html/
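
    A sketch of the usual fix: sync the directory itself (trailing slash rather than a glob) and anchor the exclude to the transfer root with a leading slash, so only the top-level .htaccess is skipped while the ones in subdirectories still copy. Note that -a already implies --recursive, --times and --perms.

        rsync -a --exclude=/.htaccess --exclude=.htaccess_backup --exclude="*~" \
            --exclude=prop_images --exclude=tracking --exclude=vtours \
            /home/user/dev_html/ /home/user/public_html/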

    Read the article

  • Can't get powershell to return where results from GCI using ACL

    - by Rossaluss
    I'm trying to get PowerShell to list files in a directory that are older than a certain date and match a certain user. I've got the script below so far, which gives me all the files older than a certain date and lists the directory and who owns them:

        $date=get-date
        $age=$date.AddDays(-30)
        ls '\\server\share\folder' -File -Recurse | `
            where {$_.lastwritetime -lt "$age"} | `
            select-object $_.fullname,{(Get-ACL $_.FullName).Owner} | `
            ft -AutoSize

    However, when I try to use an additional where condition to select only files owned by a certain user, I get no results at all, even though I know I should, based on the match I'm trying to obtain (as below):

        $date=get-date
        $age=$date.AddDays(-30)
        ls '\\server\share\folder' -File -Recurse | `
            where ({$_.lastwritetime -lt "$age"} -and {{(get-acl $_.FullName).owner} -eq "domain\user"}) | `
            select-object $_.fullname,{(Get-ACL $_.FullName).Owner} | `
            ft -AutoSize

    Am I missing something? Can I not use the get-acl command in a where condition as I've tried to? Any help would be appreciated. Thanks
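
    Not an authoritative fix, but as written the filter compares a script block to a string (which is why nothing ever matches); a sketch with both tests inside a single script block behaves as intended:

        # Sketch: one script block, conditions combined with -and
        $age = (Get-Date).AddDays(-30)
        Get-ChildItem '\\server\share\folder' -File -Recurse |
            Where-Object { $_.LastWriteTime -lt $age -and (Get-Acl $_.FullName).Owner -eq 'domain\user' } |
            Select-Object FullName, @{Name='Owner'; Expression={ (Get-Acl $_.FullName).Owner }} |
            Format-Table -AutoSize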

    Read the article

  • Break a hard link of a file in use

    - by Stebi
    I used hard links to merge duplicated files on my SSD (space is still precious) and now have a weird problem. Common files like msvcr110.dll got hard linked. Now I want to delete a program which has this file in its installation directory, but I cannot, because the file (via its other link in another location) is used by a currently running application (I don't know which), and Windows doesn't allow me to delete a file that is in use. I can rename the file, but it still points to the same data, so it's not possible to delete it. Is there any way to break the hard link of a file which is currently in use? I currently use a trash folder where I move these files so I can delete the directory structure of the program being uninstalled, but I'd like to get rid of this leftover (although it doesn't take much space, as it's a hard link).

    Read the article

  • Data Archiving vs not

    - by Recursion
    For the sake of data integrity, is it wiser to archive your files or just leave them unarchived? No compression is being used. My thinking is that if you leave your files unarchived, any corruption will only hurt a small number of files. But if you archive, let's say, all of your documents, then even the slightest corruption can make the entire archive unrecoverable. So what's the best way to keep a clean file system without being subject to data corruption?

    Read the article

  • Move some iTunes library items to different drive?

    - by Sören Kuklau
    My internal hard drive is somewhat small, and I only regularly listen to a fraction of my iTunes library anyway, so I'd like to keep large portions of it on an external drive for archival purposes. Since dealing with multiple iTunes libraries is somewhat painful, the solution I'm looking for is to move individual items of the library to a different location without compromising the "Keep organized" and "Copy files" settings. I found an AppleScript that I assume is supposed to do this, Move Files To Folder…, but it instead copies the files and doesn't update the library accordingly. I can do this manually by moving the file and then accessing it in iTunes; it'll prompt me for the new location. I just don't intend to do this one by one for thousands of files.

    Read the article
