Search Results

Search found 37883 results on 1516 pages for 'sparse files'.

  • Web application and remote storage of files

    - by Matt
    Hi, I have a web application that can store lots and lots of files on the server, i.e. users upload data to it. The files are stored below a particular storage path. The web host will be an IBM xSeries 345. However, its disks are really expensive, so we would like to put the files onto a less expensive server. Now here is the question: should I NFS-mount a path on the storage server onto the IBM server, or should I write some scripts to upload the files to the storage server instead? Both the storage server and the web host are on the same network; only the web server is visible to the world. Is NFS performance suitable for an expected low to moderately loaded server?
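    A minimal sketch of the NFS option, assuming the storage server already exports a directory to the web host (the host name, export path, and mount point below are hypothetical):

      # mount the storage server's export where the web application writes its files
      sudo mount -t nfs storage01:/export/uploads /var/www/app/storage

      # or make it persistent across reboots with an /etc/fstab entry
      storage01:/export/uploads  /var/www/app/storage  nfs  defaults  0  0

    For a low-to-moderately loaded server on the same LAN, NFS is generally adequate; the upload-script approach avoids NFS's locking and latency quirks but adds moving parts of its own.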

    Read the article

  • Extract music files from an audio CD [closed]

    - by Jatin
    Possible Duplicate: What good, free audio CD ripping/extraction tools exist for Windows, supporting multiple formats? I have an audio CD containing tracks in the .cda (CD Audio Track) format. Each of these files is only 1 KB in size, and the rest of the CD has nothing else. Is there a way I can get the audio off the CD, convert it to MP3 format, and then play it on any other device I like?

    Read the article

  • Recommendations on managing dot files for users using Puppet

    - by Beaming Mel-Bin
    The goal is to have a collection of dot files (.bashrc, .vimrc, etc.) in a central location. Once they're there, Puppet should push the files out to all managed servers. I was initially thinking of giving users FTP access so they could upload their dot files, and then having an rsync cron job, but that might not be the most elegant or robust solution. I wanted to see if anyone else had some recommendations.
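    A minimal sketch of the rsync cron idea mentioned above, assuming the uploaded dot files land in /srv/dotfiles and Puppet serves files from /etc/puppet/files (both paths are hypothetical):

      # crontab entry: pull the uploaded dot files into the Puppet fileserver tree nightly at 02:00
      0 2 * * * rsync -a --delete /srv/dotfiles/ /etc/puppet/files/dotfiles/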

    Read the article

  • Disaster recovery backup of files/photos for personal use

    - by Renesis
    I'm looking for the best method to store a backup of important files and 5+ years of digital photos that is safe from some type of fire/flood disaster in my home. I'm looking for something that is affordable (less than $100/yr or as a one-time cost), reliable (at least a smaller chance of failing than there is of fire or flood), easy for the initial backup and for adding to it, and at least semi-easy to recover from. I recently purchased a small home safe for physical vitals. It was inexpensive, solid, and is fire/water safe. If I had a physical copy of the digital files, the safe would work fine for this, but I don't know what to store in it that adequately meets the requirements above. Hard drive: I read that the danger of it not spinning up makes a hard drive a bad choice for this type of storage, although it was my first thought and would definitely be the simplest choice - very easy to take out once a month and add files to. DVDs: way too much of a hassle for both backup and restore. Tape: no idea on the affordability of this option. Online: given that I already have at least 300 GB, ever-increasing megapixels mean ever-bigger files, and my ISP upload speed is about 2 Mbps at best, this just doesn't sound like a good option for me, but I could be convinced. Other: have I missed something? Also, I'm already covered both for sync between computers (Dropbox) and for a nightly backup of these files (external HDD). The problem with the nightly backup is obviously that it's always with the computer and in a disaster would be destroyed along with it. Is anyone else doing something similar? Is the HDD as poor a choice as I read, or is it a feasible option? Maybe two, to reduce the likelihood of failure?

    Read the article

  • How do I fix a gid on files moved to a new server

    - by Tim Abell
    Hi, I've copied a folder of data from one Linux server to another via a tarball. The group IDs (GIDs) don't match up on the two servers, so I now have files that look like: -rw-rw-r-- 1 tim 1013 88 2008-11-14 10:18 config. There is a mixture of group ownerships in the folder that I want to preserve, so I can't just use chgrp -R. How do I change all files/folders with GID 1013 to another group, without affecting other files/folders? Thanks
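    A minimal sketch with GNU find, assuming the data lives under /srv/data and the target group is called devs (both names are hypothetical):

      # re-group only the files and directories currently owned by GID 1013
      find /srv/data -group 1013 -exec chgrp devs {} +

      # afterwards this should print nothing if everything was changed
      find /srv/data -group 1013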

    Read the article

  • How to display certain lines from a text file in Linux?

    - by Boaz
    Hi, I guess everyone knows the useful Linux command-line utilities head and tail. head lets you print the first X lines of a file; tail does the same but for the end of the file. What is a good command to print the middle of a file? Something like middle --start 10000000 --count 20 (print the 10,000,000th to the 10,000,019th lines). I'm looking for something that will deal with large files efficiently. I tried tail -n 10000000 | head 10 and it's horrifically slow. Thanks, Boaz
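    A couple of standard ways to do this efficiently, sketched against a hypothetical bigfile.log:

      # sed: print lines 10,000,000-10,000,019, then quit so the rest of the file is never read
      sed -n '10000000,10000019p;10000019q' bigfile.log

      # tail -n +N (note the plus sign) starts output at line N instead of printing the last N lines
      tail -n +10000000 bigfile.log | head -n 20

    The plus sign is the key difference: tail -n 10000000 means "the last ten million lines", while tail -n +10000000 means "everything from line 10,000,000 onwards".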

    Read the article

  • eMule cannot create Incoming Files directory

    - by Tim
    Hi, I have installed eMule on Windows 7. When I start eMule, I receive the message "failed to create Incoming Files directory 'C:\Program Files\eMule\Incoming' - Access is denied", a similar message for "C:\Program Files\eMule\Temp", and "Failed to initialize cryptokeys - secure ident disabled". It looks like Windows 7 does not allow eMule to perform such operations. How can I give eMule the rights to do what it needs? Thanks!

    Read the article

  • Temp files created in every folder in Windows Server 2003

    - by i.h4d35
    We have some folders which are shared over the AD domain (Windows Server 2003). It was just noticed that in 2 of those folders (which contain only Excel and Word files), whenever a file is opened and closed, the temp file that was created for that file still remains. Apparently this has been going on for the past couple of years, which has led to an insane number of temp files in each folder/subfolder under those shared folders. These shared folders are on the D: drive, not the C: drive. There is only one group (containing 2 users) that accesses the folders in question. I cannot tell whether this has to do with settings/permissions for the user, the group, or the individual client machines. For now, I have manually deleted all the temp files from each folder/subfolder. While this is not critical at the moment, I'd still like to clear it up. Also, it takes an additional fraction of a second to open folders that contain more than 10,000 temp files. Thanks in advance.

    Read the article

  • OS X: How to show all files in chooser dialogs

    - by Stabledog
    On OS X 10.5 and 10.6 at least, the command to show all files in Finder only affects Finder windows -- the File / Open dialogs in applications still show a limited set of files, ignoring nearly all the Unix stuff. Is there a setting which enables ALL file dialogs of ALL types to show ALL files? I really am a big boy and promise not to harm them! :)

    Read the article

  • Windows Vista not indexing files

    - by kevin
    I'm using Windows Vista's search index service to quickly launch programs by pressing the Windows key and typing the name of the program. However, I'm having a hard time trying to understand why certain files won't appear. I want to launch "pageant.exe", located in "C:\Program Files\Putty", with this method, but it doesn't show up in the results. In the index search options I specified that I wanted to index the "Start menu" and "C:\Program Files". I've checked that .exe files are indeed indexed, and they are. I've also tried to completely rebuild the index, with no luck. What am I doing wrong?

    Read the article

  • Concatenating ogg video files from the command line

    - by Noufal Ibrahim
    Okay. I've got a few Ogg files I created using a desktop recording tool. I've transcoded them once using ffmpeg (mainly to clip off the beginnings and the ends). Now I have 3 such files that I want to concatenate into a single .ogv file. I tried using oggCat, but it crashed with some kind of error (I tried concatenating a file to itself using oggCat and that failed too, leading me to believe that my distro is shipping a broken version of the package). Simply cat-ing the files works, but then I can't seek, which is not cool. I ran mencoder like this: mencoder -ovc lavc -oac lavc file1.ogv file2.ogv file3.ogv -o complete.ogv. It transcodes the files into an AVI and clips off a little of each of the 3 videos. So, how do I do this? Update 1: My current workaround is to transcode the 3 files into .mpg using ffmpeg, then cat them together, and then transcode the result back into an .ogv. Update 2: PiTiVi works for this kind of thing, but I need something from the command line that I can automate and script.
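    A hedged sketch using ffmpeg's concat demuxer, assuming a reasonably recent ffmpeg build that includes it; the inputs must share the same codecs and encoding parameters, which they should if they came from the same recording tool:

      # list the inputs in playback order
      printf "file '%s'\n" file1.ogv file2.ogv file3.ogv > inputs.txt

      # stitch them together without re-encoding
      ffmpeg -f concat -i inputs.txt -c copy complete.ogv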

    Read the article

  • How do I mklink junction + move content from C:\Program Files to D:\Program Files?

    - by Matt
    I have a few applications that absolutely refuse to install into anything but C:\Program Files or C:\Program Files (x86). Changing the registry keys for the default install folders doesn't seem to help, so now I'm wondering about throwing an NTFS junction in there to force these pesky applications to cooperate. There are files currently in use by Windows, so it's quite likely I won't be able to do this from within the running OS. Is there some bootable Windows 7 system tool that would allow me to make this happen? It seems I will need the ability to copy files (with permissions!) from one drive to another, as well as create the junction for Windows.

    Read the article

  • Copying Metadata Between Files in iTunes

    - by Levi Hackwith
    A while back, I converted some AVI files to .m4v files that would be playable on my iPhone. When I play these files on my PC using iTunes, the quality is terrible because the resolution is so low. My solution is to convert the AVIs with HandBrake using the "Universal" preset; it works like a charm and I can now watch them both on my PC and on my iPhone. The problem: I want to import the newly converted files into my iTunes library and be able to copy the metadata (show, season (these are TV episodes), description, etc.) without having to manually copy and paste values from one file to another. Is it possible to just say "copy this file's metadata to this file"?

    Read the article

  • Any program or editor in Windows 7 to view ".md" files

    - by Anmol Saraf
    I understand that '.md' is the extension for the Markdown format. While installing 'Grunt' from GitHub, I see a lot of files with the .md extension inside the node_modules/grunt/docs folder. As far as I understand, these files are rendered by GitHub for documentation purposes, if I am not wrong. My question is: are there any editors, tools, or programs available for Windows 7 where I can view these .md files rendered? When I try to open any of these files in my text editor, it displays them in raw format with all the '#' etc. markup. I want to see the formatted version of these files so that I can navigate the documentation on my machine even without an internet connection. Thanks for helping!
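    One offline option is pandoc, which renders Markdown to HTML; a small sketch assuming pandoc is installed and run from a bash-style shell such as Git Bash (the file names are hypothetical):

      # render a single file
      pandoc README.md -o README.html

      # render every .md file in the docs folder
      for f in docs/*.md; do pandoc "$f" -o "${f%.md}.html"; done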

    Read the article

  • Moved files only opening as 'read-only' in Excel

    - by Lance Roberts
    I moved a large directory of Excel files to another machine. When I sign in as myself (a domain administrator), the files open just fine, but when I sign in as a Power User directly on the machine, the files open as 'Read-Only'. I've reset all the attributes through both Windows and DOS, but to no avail. I've also checked for the Search Indexing bug, but indexing is already turned off. Any ideas?

    Read the article

  • Thoughts on Apache log file sizes?

    - by Nathan Long
    Do you place any limits on the size of Apache log files (access.log and error.log)? Specifically, can you give: reasons to limit log file sizes (disk space; any others?); reasons NOT to limit log file sizes (research into performance issues or security breaches; any others?); methods of doing so (a cron job that periodically deletes the file, or its first N lines? any others?); and anything you might salvage before deleting (for example, grep out how many times a file was downloaded before deleting the access logs). I'd like to get the thoughts of experienced sysadmins before I do anything. (Marking as community wiki since this may be a matter of opinion.)
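    On the "salvage before deleting" point, a minimal hand-rolled sketch (the log path and file name are hypothetical; logrotate is the usual tool for the rotation itself):

      # record how many times a particular file was downloaded before the log goes away
      grep -c 'GET /downloads/report.pdf' /var/log/apache2/access.log >> /var/log/apache2/download-counts.txt

      # truncate in place rather than deleting, so Apache keeps writing to the same open file
      : > /var/log/apache2/access.log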

    Read the article

  • Search in multiple xml files

    - by Ram
    I have a Windows XP SP2 system where the Windows Explorer search is not able to find text inside XML files. Is there some setting that enables searching inside XML files? It finds text in .txt and .doc files in the same folder.

    Read the article

  • Re-sync deleted files from rsync

    - by hfranco
    I need to recover files that have been deleted. My scenario: I have an rsync script that runs at 9 PM and mirrors everything from a directory on server1 to another directory on the backup server, server2. A couple of files have been accidentally deleted from server1. How do I recover those files back onto server1 with rsync?
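    A minimal sketch of pulling the deleted files back from the mirror, with hypothetical paths; note this only helps if the nightly job has not already propagated the deletion to server2 (for example, if it runs with --delete):

      # preview what would come back without changing anything
      rsync -avn --ignore-existing server2:/backups/server1/ /srv/data/

      # copy back only what is missing on server1, leaving existing files untouched
      rsync -av --ignore-existing server2:/backups/server1/ /srv/data/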

    Read the article

  • Files have no ownership permissions and I can't assign ownership

    - by Force Flow
    I'm having problems with file permissions on a Server 2008 R1 server. Office 2010 temp files are being created without any security permissions assigned; they aren't being deleted, I can't assign ownership, and I can't delete them. I downloaded and ran the Sysinternals tool handle.exe. When running it for the first time, handle64.exe was created, but it was not assigned any permissions either; I cannot assign ownership and cannot delete it. Seemingly random files in random places don't seem to have any permissions assigned. Access is denied when attempting to change ownership to the administrator or the Administrators group. If I try to replace the inheritable permissions of the folder these files are in, access is denied for the files with no permissions. I attempted to use subinacl to view the ownership information on the files that had no permissions, but access was denied there as well. I also tried setting the owner with SetACL in an elevated cmd window, but access was denied as well. This problem only surfaced in the last few days, and I'm unsure what the cause is or how to correct it.

    Read the article

  • nginx status codes 200 and 304

    - by Chamnap
    I'm using nginx + Passenger. I'm trying to understand the nginx responses 200 and 304. What do they each mean? Sometimes it responds with 304 and other times only with 200. Reading the YUI blog, it seems the browser needs the "Last-Modified" header to validate its copy with the server. I'm wondering why the browser needs to verify the last-modified date. Here is my nginx configuration: location / { root /var/www/placexpert/public; # <--- be sure to point to 'public'! passenger_enabled on; rack_env development; passenger_use_global_queue on; if ($request_filename ~* ^.+\.(jpg|jpeg|gif|png|ico|css|js|swf)$) { expires max; break; } } How would I add the "Last-Modified" header to the static files? Which value should I set?
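    A quick way to see both responses with curl (the host and path are hypothetical): the first request should return 200 along with validators such as Last-Modified; replaying it with If-Modified-Since set to that value returns 304 if the file is unchanged.

      # first request: expect 200 plus a Last-Modified header for a static file
      curl -I http://example.com/stylesheets/application.css

      # conditional request using the date returned above: expect 304 Not Modified if the file hasn't changed
      curl -I -H 'If-Modified-Since: Mon, 05 Apr 2010 12:00:00 GMT' http://example.com/stylesheets/application.css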

    Read the article

  • Did chkdsk make it harder to restore files?

    - by neyl
    My friend asked me to try to fix his loaded Sansa Clip+ which wasn't playing. After opening it in MSC mode, I discovered that the Music directory was empty and the total size of all files was only a few MB, yet the disk properties showed it was 7 GB full. I then ran Tools - Error Checking, and Windows dutifully informed me that the disk was corrupt and that I should run the check again, allowing Windows to fix errors. I did that, and it told me everything was fixed and that all files had been placed in the FOUND.000 directory. FOUND.000 was about 7.5 GB of FILE0000.CHK through FILE1546.CHK. (I am aware of methods like ChkBack that scan and convert these to mp3 etc., BUT the original filenames and structure are needed!) Now I started getting worried that I had made things worse. I have plenty of experience with data recovery programs - Recuva, Restore My Files, etc. - and I was planning to use them to scan the drive anyway. But NOW, after CHKDSK "fixed" the drive, maybe it modified critical FAT information vital for data recovery. So I ran these programs and got 0!!! No trace of the files! I tried a ton of recovery programs with the same results, TILL EaseUS Data Recovery Wizard found all the files and I purchased the program for $55! My question: in your opinion, did running CHKDSK with automatic fixing of errors make matters worse (i.e. many data recovery programs didn't find a trace, and they would have done so if not for CHKDSK), or was the filesystem too corrupt anyhow for regular file recovery programs? If I were a professional, would I be responsible for having run CHKDSK with automatic fixing? Do you know of a better data recovery program than EaseUS Data Recovery Wizard? In my experience I haven't found one! Thanks

    Read the article

  • Large temp files created in Windows Server 2003 temp folder

    - by BlueGene
    I'm managing a Windows Server 2003 machine with around 30 GB of space in the primary partition. A couple of times the server has crashed with an error message saying that the C: drive is full. After searching folders to free up space, I found a lot of temp files being created in C:\WINNT\Temp, some of them of enormous size, more than 2 GB. The temp files have a common name pattern, Efs###.tmp. Since we frequently encrypt files using Windows EFS, I initially suspected Windows encryption. But after reading the documentation, I found that Efs###.tmp files are in fact created by EFS, but they are created only under the folder you're currently encrypting, not in the Temp folder. This looks very strange, since Efs###.tmp files shouldn't be created under C:\WINNT\Temp unless someone tried to encrypt that Temp folder itself. The server has a Tivoli backup client. Could that be interfering with Windows encryption? Can anyone shed some light on what could be causing the issue?

    Read the article

  • Preserving name when bulk unzipping files?

    - by Elip
    Sorry, this question is trivial, but I couldn't get it to work. I have a folder full of .zip files; each .zip file contains exactly one .xml file. The zip files have sensible names like a.zip, while the .xml files contained in them have some "randomname".xml. Now I want to unpack all the .zip files in the folder, preserving the names of the .zip files, so that a.zip gets unpacked into a.xml, b.zip into b.xml, etc. I only managed to achieve batch unpacking with the command: for z in *.zip; do unzip "$z"; done. How do I enhance this to keep the names?
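    A minimal sketch that relies on each archive holding exactly one file: unzip -p writes an archive's contents to standard output, so the output can simply be redirected into a file named after the zip.

      # a.zip -> a.xml, b.zip -> b.xml, ...
      for z in *.zip; do unzip -p "$z" > "${z%.zip}.xml"; done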

    Read the article

  • Problem opening files with Gvim on Windows 7

    - by Oscar Duignan
    I just installed gvim on Windows 7 for the first time, and I'm having a problem opening some files. When I open a file, Vim seems to flash up a cmd window for a few seconds before closing it (too quickly to make out the contents), and I end up with a C:/Program folder and a /Files/vim/vimfiles/doc/ folder in the directory of the file I just opened. I can see it's trying to access C:/Program Files/vim/vimfiles/doc, which is where Vim is installed; however, it's choking on the space, and I'm not familiar enough with gvim to work out why. Any and all ideas are greatly appreciated.

    Read the article
