Search Results

Search found 38393 results on 1536 pages for 'war files'.

Page 64/1536

  • 7-Zip many files from different folders?

    - by mafutrct
    I would like to add a large number of files with different names from different folders to a single 7-Zip archive using 7za.exe. This should be simple, but it has turned out to be a major pain. I created a file that contains the paths (7za a out.7z @list.txt), but once there are too many (~100) files, it fails. Apparently the content of the argument file is pushed onto the command line buffer [Edit: this was likely misinformation on my part; either way it was not the reason], which is far too small (the number of files to add is more than one million). Splitting the process up by adding the files one by one is not feasible because of the way 7za works: when adding the next file, it creates a copy of the archive, adds the file to the copy, and finally replaces the original. This is terribly slow once the archive grows to a couple of hundred MB. So far I am using a combination of the two approaches, adding a dozen files at a time in a loop, but it is an unreliable hack and still very slow. Is there a better way to do it? I tried 7-Zip wrapper DLLs (I'm a C# programmer), but none of them worked reliably, and I was repeatedly told to just use 7za instead.
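
    A sketch of one workaround, assuming the files can be grouped under a handful of root folders (the folder names and exclude pattern below are placeholders): pointing 7za at the folders themselves lets it walk the trees and write the archive in a single pass, instead of appending in many passes.

        rem Passing folder names makes 7za add their entire contents in one pass
        7za a out.7z "C:\data\folderA" "C:\data\folderB"

        rem Unwanted files can be excluded by pattern if needed
        7za a out.7z "C:\data\folderA" -xr!*.tmp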

    Read the article

  • Speed up deletion of a large number of files on NTFS volumes

    - by sharptooth
    Every now and then I need to delete a folder containing something like 500k files from an NTFS volume. I do this with Windows Explorer. Since NTFS journals all metadata changes, each deletion is carried out serially, so deleting all 500k files takes ages. I remember that when I did the same on FAT32 it ran incomparably faster. Is there any way to speed up deletion of a large number of files on NTFS volumes?
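
    A commonly used sketch, assuming the folder can be deleted from a command prompt rather than Explorer (the path below is a placeholder): del and rd skip Explorer's per-file overhead and the Recycle Bin.

        rem Delete the files first, suppressing per-file output, then remove the tree
        del /f /s /q "D:\folder-to-delete" > nul
        rd /s /q "D:\folder-to-delete"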

    Read the article

  • Program to swap files between drives?

    - by josi
    Has anyone built a program/script to swap files between two hard drives when both are nearly full? That is, one copies a file over, then the other copies a file back, and then each deletes the files that were copied. It's kind of annoying: I have a 6 TB RAID at about 4 TB full and a 4.5 TB drive that is basically full, and I can't really swap them easily without doing many copies and deletes of files. Does anyone know a way to make them just swap? lol

    Read the article

  • Why would anacron not be running?

    - by Rory
    I have an Ubuntu system that has anacron installed, but I'm pretty sure it's not running. It's not running the commands in /etc/cron.daily to rotate the syslog files (I'm using sysklogd, which has its own log rotation method, not logrotate). The last time the logs were rotated was in October 2009. /var/spool/anacron/cron.daily exists and its contents are 20091015. AFAIR we had a power outage then and everything rebooted. How can I debug anacron? How can I see why it's not running? My first instinct is to look for /var/log/anacron, but that's not there. How can I fix it so it runs again?
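
    A few commands that may help narrow it down, a sketch assuming a standard Debian/Ubuntu anacron package: run anacron in the foreground with debugging, then check its job definitions, timestamps, and whatever is supposed to start it.

        # Run all due jobs now, in the foreground, with debug messages to stderr
        sudo anacron -d -f -n

        # Check the job definitions and the timestamp files
        cat /etc/anacrontab
        ls -l /var/spool/anacron/

        # Check how anacron is supposed to be started (cron entries and boot scripts)
        grep -r anacron /etc/cron* /etc/init.d/ 2>/dev/null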

    Read the article

  • How to handle files that don't need version control in mercurial

    - by richardh
    I am new to Mercurial, and for the most part I do LaTeX reports and statistical calculations in R using .csv and/or .sqlite files. For LaTeX, all I really care about is the .tex file. For R, I don't need version control on the .csv or .sqlite files because they are static. When I do 'hg add' for a repo with a .csv and/or .sqlite file, I get a warning like: rev2.sqlite: up to 3070 MB of RAM may be required to manage this file (use 'hg revert rev2.sqlite' to cancel pending addition). So I revert and subsequently use adds like hg add -X *.sqlite. I guess I really have two questions: (1) Should I ignore these warnings? Because these large files are static, can I just add them to the repo, knowing that the diffs will always be empty, and not worry about wasted resources? (2) If I should keep excluding these files from the repo, is there a way to make this the default, i.e. add something to my .hgrc file that always appends options like -I *.tex -I *.R to my 'hg add' commands? Thanks!
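
    A sketch of two options, assuming the goal is simply that these extensions never get added: an .hgignore file in the repository root keeps them out of hg add and hg status entirely, and the old-style [defaults] section in .hgrc can append options to every hg add (the patterns below are examples).

        # .hgignore in the repository root: never offer these files for tracking
        syntax: glob
        *.sqlite
        *.csv

        # Alternatively, in ~/.hgrc: append options to every 'hg add'
        [defaults]
        add = -X **.sqlite -X **.csv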

    Read the article

  • How to make VhdResizer work on XP Mode VHD files

    - by A_M
    Hi, I'm trying to shrink a Windows 7 XP Mode VHD file with little success. I've been trying to use VhdResizer. When I select my VHD file, it says "VhdExpand only supports fixed and dynamic VHD files". My XP Mode VHDs are dynamic files. Does anyone have any idea why it is failing? Failing that, does anyone have a process I can use to shrink my XP Mode VHD files? Thanks.

    Read the article

  • What is the most reliable way to copy access front end files to client PCs

    - by Funky Si
    I have several in-house databases with Access 2003 front ends, either .adp or .ade files. I need to copy these from my server to every client machine. In the past I have used rollout scripts to copy the files to the All Users desktop folder. I have since adapted this to also copy files to the Public desktop folder, since we now have Windows 7 client machines as well as XP. The problem is that some of the time these scripts don't work on Windows 7. Is there a better way of copying these files to a mix of Windows 7 and XP clients, or are rollout scripts the best way?
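
    A minimal sketch of a rollout script that picks the right desktop folder on both OS versions (the server share is a placeholder, and it assumes robocopy is available on the XP clients; it ships with Vista/7 but on XP comes from the Resource Kit, otherwise swap in xcopy). On XP %PUBLIC% is undefined, so the script falls back to the All Users desktop.

        @echo off
        set "DEST=%ALLUSERSPROFILE%\Desktop"
        if defined PUBLIC set "DEST=%PUBLIC%\Desktop"
        robocopy "\\server\share\frontends" "%DEST%" *.ade *.adp /R:2 /W:5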

    Read the article

  • Robocopy to copy only new folders and files

    - by Valery Shampal
    The task: find all new files and subfolders under some root folder (let us say Documents) and copy them to another disk (J: in this case). Command line used: robocopy c:\users\valery\documents j:\robocopy /XO /E /MAXAGE:20131030 /XD. Result: a full folder tree is created, and only new files are copied, as intended, which is good. The point is that I do not want to create subfolders on the target disk if there are no new files in them. The results are acceptable, but it is a lot of work to go through all the folders to find the new files, as well as to work out which subfolders are new. Regards, Valery
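
    A sketch of one approach: /S (copy subdirectories, but not empty ones) instead of /E may already avoid most of the empty folders, and any that still get created can be cleaned up with a second pass that removes empty directories, since rd only succeeds on empty folders. Paths mirror the question; run the loop from a batch file (use %d instead of %%d if typing it directly at the prompt).

        robocopy c:\users\valery\documents j:\robocopy /XO /S /MAXAGE:20131030

        rem Remove any directories that ended up empty, deepest first
        for /f "delims=" %%d in ('dir /ad /b /s "j:\robocopy" ^| sort /r') do rd "%%d" 2>nul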

    Read the article

  • Microsoft Outlook: Export list of currently opened PST files

    - by ultrasawblade
    At my current workplace we are upgrading various users from XP to Windows 7. The users frequently have anywhere from 10 to 30 or so .pst files open in their installation of Microsoft Outlook 2007, and they are particularly helpless without these files. I know how to view the list of currently opened PST files, and would like to know if there is an easy way to capture that information other than taking screenshots of the Options - Data Files window. Does migwiz.exe transfer this information? Is that the only way? Is there a tool that will let you capture and restore that information? I don't want to export or move the actual .pst files themselves (yes, some of them are on network locations, which is terrible, I know), just reopen files in a new installation of Outlook that were open in the previous installation.
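
    One way to capture the list without screenshots, a sketch assuming the Outlook 2007 object model (the Stores collection and Store.FilePath) is reachable via PowerShell on the old machine; the output path is a placeholder.

        # Outlook must be installed; this starts (or attaches to) it via COM
        $outlook = New-Object -ComObject Outlook.Application
        $outlook.Session.Stores |
            ForEach-Object { $_.FilePath } |
            Where-Object { $_ -like '*.pst' } |
            Out-File C:\temp\pst-list.txt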

    Read the article

  • Dropbox picture sync: Skip RAW files?

    - by Steven Lu
    I like the convenience of having Dropbox keep track of my photos because it tends to work with my devices over 3G (I am often tethering my iPad and MacBook to my phone) as well as Wi-Fi, but it's a waste of network traffic to sync the RAW files from my camera or memory card. They clutter up the Dropbox list and the files are just huge. Is there a way to configure the Dropbox client so that it ignores a certain file extension for the picture sync? Also, I suspect that if I just go and delete the RAW files, then the next time I plug in the memory card and tell Dropbox to sync, it will re-download them, which would be terrible. I could switch to iCloud Photo Stream, I suppose, but there would be no access over 3G that way, and I've already got years of experience with Dropbox, so I know it's going to just work. I think any method that works for excluding files from sync in Dropbox in general should work here too. Edit: Wow, there are 19k votes for this exact request.

    Read the article

  • DVD/CD burning .zip: is it more reliable, faster, longer lasting to burn a zip of files rather than the files as a folder?

    - by Rob
    Is it more reliable, faster, and longer lasting to burn a zip (or a few large zips) of files to CD/DVD rather than the files as a folder? I'm just wondering whether thousands of small files would be recorded less efficiently than one or a few large zips. Also, even after the burning program verifies the disc, I use Beyond Compare to compare the files with those on the disc. They always compare as binary identical, but I hear the drive stuttering, presumably as the head is shifted only slightly each time to seek the next file, which leads me to think that it is best to make one or more zips and copy those locally to compare. Or is it that burning individual files to the disc makes them less readable, which causes the head to stutter? There aren't any problems, and my disc burns are reliable; I'm just thinking about efficiency and longevity. The discs burn and verify fast enough on my 18x DVD burner. I'm using ImgBurn mostly, and have used Nero in the past. I burn whole discs, closed and finalised. I'm not sure which write mode, but I would think Disc At Once from a temporary cached image made by the burning program would be the most reliable.

    Read the article

  • How to download batch of randomly named files

    - by TheLearner
    I need to download a bunch of files from a website, but I don't want to have to click on each file to add it to downloadall or whatever. The structure of the website is as follows: http://something.com/katalog/?get=Exclusive/group1/2012.09.03/ The directory has loads of randomly named files with .doc extensions. I can't use the batch feature because the files don't start or end with the same characters, e.g. 001...100. Any ideas?
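
    If the server shows a browsable index for that directory, a single wget call can grab every .doc in it; a sketch (adjust the URL and extension as needed):

        # -r: recurse, -l 1: this directory only, -np: don't ascend,
        # -nd: no local subfolders, -A doc: keep only .doc files
        wget -r -l 1 -np -nd -A doc "http://something.com/katalog/?get=Exclusive/group1/2012.09.03/"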

    Read the article

  • Looking for a command-line tool to copy files to remote computers (similar to psexec)

    - by hatchetman82
    Hi, I'm looking for a small utility that can copy files to/from remote Windows hosts and can take the credentials (domain user and password) as part of its command line, similar to psexec. I know I can use net use to map the target directory to a drive letter and then use xcopy, and I know psexec can upload files to be executed on the remote machine and then delete them, but I'm looking for a small utility for distributing files to remote hosts that is not as awkward to use as net use + xcopy.
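
    Short of a dedicated tool, one way to make net use + xcopy less awkward is to wrap them in a small batch file that takes the host, user and password as arguments; a sketch (the script name, share and target path are placeholders):

        @echo off
        rem Usage: pushfile.cmd <host> <domain\user> <password> <localfile>
        net use \\%1\c$ %3 /user:%2
        xcopy /y "%4" "\\%1\c$\target\"
        net use \\%1\c$ /delete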

    Read the article

  • Windows XP - recover document opened directly from IE

    - by Thingfish
    Hi, I'm attempting to help a family member recover a document. The Word 2007 document was downloaded and opened directly from a webmail interface using Internet Explorer running on Windows XP. The user saved the document multiple times while working on it for the better part of a day. After closing Word 2007, the user was not able to locate the document, and I have so far not been able to help. The computer has not been turned off, and the user has not attempted to open the document directly from the mail again. Recreating the events on Vista/Windows 7, it's easy enough to locate the document under the Temporary Internet Files folder, but I have not been able to do the same on Windows XP. Any suggestions on how to locate this document, or whether it's even possible? Thanks
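
    A sketch of where to look from a command prompt while logged on as that user, assuming default XP profile locations; the cache's Content.IE5 subfolders are hidden, so /a is needed, and Word's AutoRecover folder is worth checking too.

        cd /d "%USERPROFILE%\Local Settings\Temporary Internet Files"
        dir *.doc* /s /b /a

        rem Word 2007 AutoRecover files on XP
        dir "%USERPROFILE%\Application Data\Microsoft\Word\*.asd" /s /b /a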

    Read the article

  • git - recover deleted files from a prior commit

    - by Walter White
    I accidentally deleted some files in a prior commit and would like to recover them. How can I do this? I ran this and found exactly what I was looking for: git whatchanged --diff-filter=D. At the time I made the commit, I should have committed only the new/changed files and then run a reset --hard to recover the missing files. I have about 100 files that I need to restore. I don't want to do a straight revert, as that would also undo the changes in that commit. Any ideas?
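
    A sketch: once git whatchanged --diff-filter=D has identified the commit, the deleted paths can be checked out from that commit's parent without reverting the rest of the commit (the commit hash and paths below are placeholders).

        # List the files deleted in that commit
        git diff-tree --diff-filter=D --name-only -r <commit>

        # Restore them from the parent of the deleting commit
        git checkout <commit>^ -- path/to/deleted/file another/deleted/file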

    Read the article

  • Program for keeping encrypted files

    - by Giorgi
    I am looking for a program that will encrypt files specified by me and allow me to view/edit/delete those files without creating a virtual disk. I do not want a virtual disk, as a domain administrator could access it, so TrueCrypt is not a possibility. One possibility is to use WinRAR with a password-protected archive, but WinRAR serves a different goal, so it is not very user friendly for this purpose. If possible, it would also be nice if the program did not create temp files while I have the files open. Any suggestions?

    Read the article

  • Truecrypt files corrupted after moving PC into another case

    - by Dygerati
    I recently bought a new PC case and transferred all of my PC hardware into it. The only hardware modification was the addition of two identical RAM modules. The entire process went smoothly, and everything worked and booted as before. The only side effect showed up when I accessed one of my file-based hidden TrueCrypt volumes shortly afterwards. Some of the files in the volume - NOT all - seem to be entirely corrupted: the directory and file names are garbled characters, but a few of the directories in the same volume appear and function normally. Also, all files in the non-hidden TrueCrypt volume are still intact. Is this not weird? The only other real change I can think of is that the hard drives were connected to different SATA ports on the motherboard. I really don't know the TrueCrypt encryption well enough to know what could cause this, and the fact that not all the files were corrupted makes it more bizarre still. So, first off (and I'm not too hopeful on this point), would it be possible to restore these files? I had a backup of most, but not all, of the files involved. Other than that, I'm just curious how this happened and how I can prevent it next time. Thanks!

    Read the article

  • Why does NetBeans 6.8 upload all files of a remote (PHP) project by default?

    - by xaguilars
    Hi, I wanted to know if there is some option to stop NetBeans from uploading all the files of a recently imported remote (PHP) project. I always check "Upload files on run" in the project configuration, but when I click Run, NetBeans selects all files by default (I modified only some of them). The file checkboxes cannot be cleared all at once; you have to do it one by one (imagine you have 5000 files...). That's annoying. Do you know of any solution? Thank you

    Read the article

  • Server stopped responding; where to look to find out what happened?

    - by Cid
    I have a server that had been running for well over 5 months when it suddenly stopped responding. I couldn't SSH into it or do anything else, so I decided to reboot it, and the reboot fixed it. I'm trying to figure out what happened and I'm not sure exactly where to look. I started looking in /var/log, but there are tons of files in there and I'm not sure which ones I should pay attention to. I'm slowly going through each one of them, but if anyone can point me in the right direction, it would be great. Thanks!
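
    A few starting points, a sketch for a typical Linux syslog layout (log file names vary by distro): the reboot history, the kernel log, and anything logged around the time it stopped responding.

        # Reboot/shutdown history
        last -x | head

        # Kernel and system messages around the incident
        less /var/log/syslog
        less /var/log/kern.log

        # Quick scan for out-of-memory kills, panics and disk errors
        grep -iE 'oom|panic|error|fail' /var/log/syslog /var/log/kern.log | less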

    Read the article

  • How do I get a listing of music files on a specific drive

    - by Kevin34
    I'm helping someone set up their iPod, but they are using Windows 7 and I know XP. I don't see the music in the directory listing on his computer that I see on the iPod, so I'm trying to search for all music files on E:. In Windows XP this is easy; Windows 7 has changed everything. I googled this and found a suggestion to type "music" in the Windows search bar. This results in music "Libraries". Great, but there's still not a listing of the files. I can search for *.wma, but that doesn't list all the music on the iPod. There are many types of music files, so how do I get a list of ALL music files on JUST drive E:? Again, on XP this was VERY easy.
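
    From a command prompt, dir accepts several patterns at once, so a sketch covering the common formats would be (add or remove extensions as needed; the output file name is a placeholder):

        dir e:\*.mp3 e:\*.wma e:\*.m4a e:\*.aac e:\*.flac /s /b > %USERPROFILE%\Desktop\music-list.txt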

    Read the article

  • Remove all files except for a few, from a folder in Unix

    - by nikhil
    I often face this problem: I have a set of files in a folder and would like to delete all of them except a few. For example, I have files named according to their date of creation (like 11-1-11.tar, 10-1-11.tar and so on). Now I would like to delete files like 10-1-11, 9-1-11 and so on, but not certain other files. Basically, I would like to control exactly which files are deleted and which are retained. How would I do this?
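
    Two sketches, using a filename from the question plus a placeholder for anything else to keep: find lets you spell out exactly what to retain, and bash's extglob gives a shorter one-liner.

        # Keep only the named files, delete every other regular file in this folder
        find . -maxdepth 1 -type f ! -name '11-1-11.tar' ! -name 'other-file-to-keep.tar' -delete

        # Or, with bash extended globbing
        shopt -s extglob
        rm -- !(11-1-11.tar|other-file-to-keep.tar)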

    Read the article

  • How to share files over a local network in Ubuntu

    - by rails_guy
    I am trying to paste some files from my laptop to my desktop; both run Ubuntu. From the laptop I can see the desktop under Places - Network, and I can see the files on the desktop, but when I try to paste a file it says "permission denied". What can I do on the desktop so that it allows my laptop to paste files?
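
    If the desktop is exporting the folder through the usual Samba-based sharing (Nautilus "Sharing Options"), ticking "Allow others to create and delete files in this folder" is the GUI fix. A rough sketch of the equivalent smb.conf share definition, assuming a Samba share and with the share name and path as placeholders; the folder itself also needs write permission for the connecting or guest user.

        # /etc/samba/smb.conf on the desktop (restart Samba afterwards)
        [shared]
           path = /home/user/shared
           read only = no
           guest ok = yes

        # give the folder itself write permission for guests
        chmod o+rwx /home/user/shared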

    Read the article

  • retain last used path to location for saving files in Windows 7

    - by Mark Miller
    I am using Microsoft Office 2010 and Windows 7 on a Dell PC. I am opening a bunch of MS Word files one at a time, copying the data tables in them, pasting the data into Excel, and saving the Excel files as comma-delimited text files. I am creating a separate Excel file for each Word file. The path to the folder containing the saved comma-delimited files is quite long, something like this: c:\users\me\aa\bb\cc\dd\ee\ Every time I open Excel and save a new comma-delimited file I have to re-navigate the entire path (c:\users\me\aa\bb\cc\dd\ee). In the past Windows seemed to remember the last used path, saving a lot of tedious keystrokes. In fact, I think Windows did this for me as recently as last week, albeit on a different computer. Can I apply a setting somewhere in Windows asking it to offer the last used path as the default when saving files, so I do not have to re-navigate the entire directory structure for each new comma-delimited file? If so, how? Where is the option for specifying that setting? Thank you for any help.

    Read the article

  • Sorting Files into Subfolders based on EXIF Date

    - by honestor
    I have a huge directory from an HDD recovery that contains 70000+ JPEG files. I tried playing around with some AppleScripts that I found, but had no luck. I have already installed ExifTool, which might be useful for this task. The current directory structure is as follows:
        dir001
          - file0001.jpg ... file9999.jpg
        dir002
          - file0001.jpg ... file9999.jpg
        ...
        dir070
          - file0001.jpg ... file9999.jpg
    The files mostly have EXIF data, but some files have no metadata. Now I hope to be able to sort and rename these files into folders based on the date:
        1999
          - 1999 01 31
            - 1999_01_31_-_22_59_59.jpg
        2000
          - 2000 05 20
            - 2000_05_20_-_21_59_59.jpg
            - 2000_05_20_-_22_59_59.jpg
    I figured AppleScript/Automator might come in handy for this, but any other solution would be welcome too!
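
    Since ExifTool is already installed, its file-renaming feature can build that whole layout in one pass. A sketch (run it on a copy first), assuming the shots carry DateTimeOriginal; the source path is a placeholder, and files without that tag are simply left where they are.

        # Move and rename each JPEG into YEAR/YEAR MM DD/ based on its EXIF date;
        # %%-c appends a counter if two shots share the same second
        exiftool -r '-FileName<DateTimeOriginal' -d '%Y/%Y %m %d/%Y_%m_%d_-_%H_%M_%S%%-c.%%e' /path/to/recovered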

    Read the article
