Search Results

Search found 40999 results on 1640 pages for 'duplicate files'.

  • Mercurial (hg) commit only certain files

    - by bresc
    Hi, I'm trying to commit only certain files with hg. Because hg automatically includes every modified file, whenever I try to commit a change it wants to commit all files. But I don't want that, because certain files are not "ready" yet. There is hg commit -I thefile.foo, but that only covers one file. It would be better for me if I could turn off the auto-include behaviour, as in git. Is this possible? Thanks.
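
    In case it's useful, hg can take explicit file arguments on commit, and -I can be repeated; a minimal sketch (file names are placeholders):

        hg commit -m "partial commit" foo.c bar.c
        hg commit -m "partial commit" -I foo.c -I bar.c
        hg status    # the remaining changes stay uncommitted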

  • DOS application to allow remote management of files over serial link

    - by tomlogic
    Hark back to the days of DOS. I have an embedded DOS handheld device, and I'm looking for a tool to manage the files stored on it. I picture an application I can launch on the device that opens COM1 for commands to get a directory listing, send/receive files via X/Y/ZMODEM, move/delete files, and create/move/delete directories. A Windows application can then download a recursive file listing and manage those files (for example, synchronizing with a local directory). Keep in mind that this is DOS: 8.3 filenames, 640K of RAM, and a 19200 bps serial link (yuk!). I'd prefer something with source in case we need to add features (for example, the ability to get a checksum of a file for change detection). Now that I've written this description, I realize I'm asking for something like LapLink or pcAnywhere. Norton no longer sells DOS versions of pcAnywhere, and LapLink V for DOS seems pricey at $50. Are you aware of any similar apps from those good old days?

  • Is there a way to back up all my files and replace them with 0-byte files with the same names?

    - by laggingreflex
    My laptop's main drive keeps filling up, so I back files up to a USB drive and delete the originals. But then I find myself getting (downloading, or receiving from someone else) files that I already have backed up but couldn't recall at the moment. So is there a way I can keep a 0-byte file with the same name as the backed-up copy, so that when I'm asked whether to overwrite the existing file, I can easily choose no, knowing I probably already have the file in the backup? EDIT: better yet, replace each file with a shortcut (.lnk) on the external drive, so I can access the files hassle-free and not get errors from 0-byte files being accidentally opened.
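
    One possible approach (a sketch, untested here; drive letters are placeholders): robocopy's /CREATE switch replicates a directory tree with every file written as zero bytes, so running it from the backup back to the laptop would generate the placeholders in one pass:

        robocopy E:\Backup C:\Placeholders /E /CREATE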

  • Google Drive and sync?

    - by Royi Namir
    Two questions, please: Where are Google Drive offline docs stored on my computer? Which folder? (when accessing docs.google.com/offline) I have a text file in my Google Drive. When I click on it, I can only view it (no edit). The only option to edit is to export it to Google Docs, but then I have 2 files: the original text file and the editable one. So now I have to sync both the regular file AND the version created by Google Docs. Is that the normal behavior?

  • ZFS, dedupe and PST files

    - by Unreason
    I am interested to know what the expected maximum dedupe ratio would be for a set of PST files. I have ~40 GB of PST files from ~15 users, with a high level of duplication of attachments. I am running tests to see if I can get significant space savings by storing the data on ZFS with dedupe. For this purpose I have installed a test setup of Nexenta, but I was wondering if someone here has already done this and what level of deduplication I might expect (or, in other words, how sensitive are PST files to block alignment, and what parameters can influence the ratio?). Initial tests show a very low dedupe ratio, and I did find an explanation that block-level dedupe would not be efficient here and that byte-level dedupe would be much better (and that it should be performed by an application that is aware of the internal organization), so I am just double-checking here in case someone has more input. Otherwise I will probably be converting the PST files to IMAP.
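
    For reference, a sketch of the knobs involved (pool and dataset names are placeholders): zdb -S simulates dedup over existing data, and recordsize is the block-alignment parameter most likely to move the ratio:

        zdb -S tank                      # simulated dedup table and ratio
        zfs set dedup=on tank/pst
        zfs set recordsize=8K tank/pst   # smaller records raise the odds of aligned duplicates
        zpool list tank                  # the DEDUP column shows the live ratio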

  • Cannot copy files from Windows 2003 server over the network

    - by Mark
    It seems quite strange. I have a shared folder with full read/write permission on my Windows 2003 server. With an XP client, I can create a new folder in the share and copy files to it normally, but I cannot copy those files back to my client PC. I tried using FTP and WebDAV to get the files from the server; neither worked. Is the issue related to NETWORK SERVICE? Thanks for your help.

  • Speed up deletion of large numbers of files on NTFS volumes

    - by sharptooth
    Every now and then I need to delete a folder containing something like 500k files from an NTFS volume. I do this with Windows Explorer. Since NTFS journals all metadata changes, each deletion is carried out serially, so deleting the whole 500k files takes ages. I remember that when I did the same on FAT32 it ran incomparably faster. Is there any way to speed up the deletion of large numbers of files on NTFS volumes?
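
    A commonly cited alternative to Explorer (a sketch; the folder path is a placeholder) is deleting from a command prompt, which avoids Explorer's enumeration and Recycle Bin overhead:

        del /f /s /q D:\bigfolder > nul
        rmdir /s /q D:\bigfolder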

  • 7-Zip many files from different folders?

    - by mafutrct
    I would like to add a large number of files with different names from different folders to a single 7-Zip archive using 7za.exe. This should be simple, but it turned out to be a major pain. I created a file that contains the paths (7za a out.7z @list.txt), but once there are too many (~100) files, it fails. Apparently the content of the argument file is pushed onto the command line buffer [Edit: this was likely misinformation on my part; either way, it was not the reason], which is far too small (the number of files to add is more than one million). Splitting the process up by adding the files one by one is not feasible because of the way 7za works: when adding the next file, it creates a copy of the archive, adds the file to the copy, and finally replaces the original. This is terribly slow once the archive reaches a couple of hundred MB in size. So far I have been using a combination of the two approaches, adding a dozen files at a time in a loop, but it is an unreliable hack and still very slow. Is there a better way to do it? I tried 7-Zip wrapper DLLs (I'm a C# programmer), but none of them worked reliably, and it was repeatedly suggested that I just use 7za instead.
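
    One workaround sketch (untested; archive and list names are placeholders): build an uncompressed tar from the list file in a single pass, then compress the tar once, so 7za never rewrites a growing .7z per batch:

        7za a -ttar bundle.tar @list.txt
        7za a out.7z bundle.tar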

  • Program to swap files between drives?

    - by josi
    Has anyone built a program/script to transfer files between 2 hard drives when both are nearly full? One copies a file over, then the other copies a file the other way, then each deletes the files that were copied. It's kind of annoying: I have a 6 TB RAID at about 4 TB full and a 4.5 TB drive that's basically full, and I can't really swap their contents easily without doing many copies and deletes. Does anyone know a way to make them just swap? lol

  • Why would anacron not be running?

    - by Rory
    I have an Ubuntu system that has anacron installed, but I'm pretty sure it's not running. It's not running the commands in /etc/cron.daily to rotate the syslog files (I'm using sysklogd, which has its own log-rotating method and doesn't use logrotate). The last time the logs were rotated was in October 2009. /var/spool/anacron/cron.daily exists and its contents are 20091015. AFAIR we had a power outage then and everything rebooted. How can I debug anacron? How can I see why it's not running? My first instinct is to look for /var/log/anacron, but that's not there. How can I fix it so it runs again?
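
    A debugging sketch (paths as on a stock Ubuntu install): run anacron by hand in the foreground with debug output, and check the schedule and timestamp files:

        sudo anacron -d -f -n    # -d: stay in foreground, log to stderr; -f: ignore timestamps; -n: skip job delays
        cat /etc/anacrontab
        ls -l /var/spool/anacron/
        grep -i anacron /var/log/syslog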

  • How to handle files that don't need version control in Mercurial

    - by richardh
    I am new to Mercurial, and for the most part I do LaTeX reports and statistical calculations in R using .csv and/or .sqlite files. Re LaTeX, all I really care about is the .tex file. Re R, I don't need version control on the .csv or .sqlite files because they are static. When I do 'hg add' for a repo with a .csv and/or .sqlite file, I get a warning like: rev2.sqlite: up to 3070 MB of RAM may be required to manage this file (use 'hg revert rev2.sqlite' to cancel pending addition) So I revert and subsequently use adds like hg add -X *.sqlite. I guess I really have two questions: (1) Should I ignore these warnings? Because these large files are static, can I just add them to the repo, knowing that the diffs will always be empty, and not worry about wasted resources? (2) If I should keep excluding these files from the repo, is there a way I can make this the default? I.e., can I add something to my .hgrc file that always appends options like -I *.tex -I *.R to my 'hg add' commands? Thanks!
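
    For what it's worth, a sketch of the usual approach: list the static extensions in an .hgignore file in the repo root, so a plain hg add never picks them up:

        syntax: glob
        *.sqlite
        *.csv

    Mercurial also honours a [defaults] section in hgrc (deprecated in later releases) that can append options such as -X *.sqlite to every hg add.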

  • What is the most reliable way to copy Access front-end files to client PCs

    - by Funky Si
    I have several in-house databases with Access 2003 front ends, either .adp or .ade files. I need to copy these from my server to every client machine. In the past I have used rollout scripts to copy the files to the All Users desktop folder. I have since adapted this to also copy the files to the Public desktop folder, since we started having Windows 7 client machines as well as XP. The problem is that some of the time these scripts don't work on Windows 7. Is there a better way of copying these files to a mix of Windows 7 and XP clients, or are rollout scripts the best way?
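
    A sketch of one way to cover both systems with a single script (the share path is a placeholder): the %PUBLIC% variable only exists on Vista/7, so its presence can select the right desktop folder:

        if defined PUBLIC (
            copy /y \\server\frontends\*.ade "%PUBLIC%\Desktop\"
        ) else (
            copy /y \\server\frontends\*.ade "%ALLUSERSPROFILE%\Desktop\"
        )

    On XP, %ALLUSERSPROFILE%\Desktop is the All Users desktop; on Windows 7 that variable points at C:\ProgramData instead, which is why the branch is needed.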

  • How to make VhdResizer work on XP Mode VHD files

    - by A_M
    Hi, I'm trying to shrink a Windows 7 XP Mode VHD file, with little success. I've been trying to use VhdResizer. When I select my VHD file, it says "VhdExpand only supports fixed and dynamic VHD files". My XP Mode VHDs are dynamic files. Does anyone have any idea why it is failing? Failing that, does anyone have a process I can use to shrink my XP Mode VHD files? Thanks.

  • Robocopy to copy only new folders and files

    - by Valery Shampal
    A task: to find all new files and subfolders under some root folder (let us say Documents) and to copy them to another disk (J: in this case). Command line used: robocopy c:\users\valery\documents j:\robocopy /XO /E /MAXAGE:20131030 /XD Result: a full folder tree is created, and only new files are copied, as intended. That much is good. The point is that I do not want to create all the subfolders on the target disk if there are no new files in them. The results are acceptable, but it is a lot of work to go through all the folders to find the new files, and to work out which subfolders are new ones. Regards, Valery
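
    A possible tweak (a sketch, untested against this data): robocopy's /S switch copies subdirectories but skips empty ones, so swapping /E for /S should omit any directory in which no file passes the /XO and /MAXAGE filters:

        robocopy c:\users\valery\documents j:\robocopy /XO /S /MAXAGE:20131030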

  • Microsoft Outlook: Export list of currently opened PST files

    - by ultrasawblade
    At my current workplace we are upgrading various users from XP to Windows 7. Frequently the users have anywhere from 10 to 30 or so .pst files opened within their installation of Microsoft Outlook 2007. These users are particularly helpless without these files. I know how to view the list of currently opened PST files, and would like to know if there is an easy way to capture that information other than taking screenshots of the Options - Data Files window. Does migwiz.exe transfer this information? Is that the only way? Is there a tool that will let you capture and restore that information? I don't want to export or move the actual .pst files themselves (yes, some of them are on network locations, very terrible, I know), just reopen files in the new installation of Outlook that were open in the previous installation.
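
    A sketch for capturing the list per user (assumes Outlook 2007 is installed and this is run inside the user's session; the output file name is arbitrary):

        $outlook = New-Object -ComObject Outlook.Application
        $outlook.Session.Stores | ForEach-Object { $_.FilePath } | Out-File pstlist.txt

    Each mounted data file's full path lands in pstlist.txt. The same list lives in the user's Outlook profile in the registry, so it should travel with an Easy Transfer (migwiz.exe) settings migration, though that is worth verifying on one machine first.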

  • Dropbox picture sync: Skip RAW files?

    - by Steven Lu
    I like the convenience of having Dropbox keep track of my photos, because it tends to work with my devices over 3G (I am often tethering to my phone with my iPad and MacBook) as well as Wi-Fi, but it's a waste of network traffic to sync the raw files from my camera or memory card. They clutter up the Dropbox list and the files are huge. Is there a way to configure the Dropbox client so that it ignores a certain file extension in the picture sync? Also, I suspect that if I just go and delete the raw files, then the next time I plug in the memory card and tell Dropbox to sync, it will re-download them. Which would be terribad. I could switch to iCloud for Photo Stream, I suppose, but there would be no access via 3G that way, and I've already got years of experience with Dropbox, so I know it's going to just work. I think any method that works for excluding files from Dropbox sync in general should work here too. Edit: Wow, there are 19k votes for this exact request.

  • DVD/CD burning .zip: is it more reliable, faster, longer lasting to burn a zip of files rather than the files as a folder?

    - by Rob
    Is it more reliable, faster, or longer-lasting to burn a zip (or a few large zips) of files to CD/DVD rather than the files as a folder? I'm just thinking that thousands of small files might not be recorded as efficiently as one or a few large zips. Also, even after the burning program verifies the disc, I use Beyond Compare to compare the files with those on the disc. The binary compares always come back identical, but I hear the drive stuttering, presumably as the head is shifted slightly each time to seek the next file, which leads me to think it's best to make one or more zips and copy those locally to compare. Or is it that burning individual files to the disc makes it less readable, which causes the head to stutter? There aren't any problems; my disc burns are reliable. I'm just thinking about efficiency and longevity; the discs burn and verify fast enough on my 18x DVD burner. I'm using ImgBurn mostly, and have used Nero in the past. I burn whole discs, closed and finalised. I'm not sure which write mode, but I would think Disc At Once from a temporary cached image made by the burning program would be the most reliable.

  • How to download a batch of randomly named files

    - by TheLearner
    I need to download a bunch of files from a website, but I don't want to have to click on each file to add it to downloadall or whatever. The structure of the website is as follows: http://something.com/katalog/?get=Exclusive/group1/2012.09.03/ The directory has loads of randomly named files with .doc extensions. I can't use a batch feature because the files don't start or end with the same characters, e.g. 001...100. Any ideas?
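
    A sketch with wget (assuming the URL serves a plain directory index): recurse one level, accept only .doc files, and flatten the host's directory tree:

        wget -r -l1 -np -nd -A doc "http://something.com/katalog/?get=Exclusive/group1/2012.09.03/"

    The -A list restricts what gets saved; index pages are still fetched for link extraction and then deleted.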

  • Looking for a command-line tool to copy files to remote computers (similar to psexec)

    - by hatchetman82
    Hi, I'm looking for a small utility that can copy files to/from remote Windows hosts and that can take the credentials (domain user and password) as part of its command line, similar to psexec. I know I can use net use to map the target directory to a drive letter and then use xcopy, and I know psexec can upload files to be executed on the remote machine and then delete them, but I'm looking for a small utility for distributing files to remote hosts that is not as awkward to use as net use + xcopy.
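
    Absent a dedicated tool, a sketch of wrapping the awkward part in one script (host, share, and credentials are placeholders):

        net use \\remotehost\c$ /user:DOMAIN\admin P@ssw0rd
        xcopy C:\payload \\remotehost\c$\payload\ /s /e /y
        net use \\remotehost\c$ /delete

    psexec itself can also push a file with its -c switch, but only the single executable it is about to run.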

  • Windows XP - recover document opened directly from IE

    - by Thingfish
    Hi, I'm attempting to help a family member recover a document. The Word 2007 document was downloaded and opened directly from a webmail interface using Internet Explorer running on Windows XP. The user saved the document multiple times while working on it for the good part of a day. After closing Word 2007, the user was not able to locate the document, and I have so far not been able to help. The computer has not been turned off, and the user has not attempted to open the document directly from the mail again. Recreating the events on Vista/Windows 7, it's easy enough to locate the document under the Temporary Internet Files folder. I have, however, not been able to do the same on Windows XP. Any suggestions for how to locate this document, or is it even possible? Thanks
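
    A sketch of where to look on XP (the username is a placeholder; the cache lives in hidden subfolders, so a command prompt search is more reliable than Explorer):

        cd /d "C:\Documents and Settings\<user>\Local Settings\Temporary Internet Files"
        dir *.doc* /s /a /od

    Word 2007 may also have left an AutoRecover copy under "%USERPROFILE%\Application Data\Microsoft\Word".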

  • git - recover deleted files from a prior commit

    - by Walter White
    I accidentally deleted some files in a prior commit and would like to recover them. How can I do this? I ran this and found exactly what I was looking for: git whatchanged --diff-filter=D At the time I made the commit, I should have committed only the new/changed files and then run a reset --hard to recover the missing files. I have about 100 files that I need to restore. I don't want to do a straight revert, as that would also undo the other changes in that commit. Any ideas?
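
    A sketch (abc1234 stands in for the commit in question): list the paths deleted in that commit, then check each one out from the commit's parent, which restores the files without reverting the commit's other changes:

        for f in $(git diff --name-only --diff-filter=D abc1234^ abc1234); do
            git checkout abc1234^ -- "$f"
        done

    The restored files end up staged, ready for a follow-up commit; paths containing spaces would need git diff -z and a different loop.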

  • Program to keep files encrypted

    - by Giorgi
    I am looking for a program that will encrypt files specified by me and allow me to view/edit/delete those files without creating a virtual disk. I do not want a virtual disk, because a domain administrator can access it, so TrueCrypt is not a possibility. One possibility is to use WinRAR with a password-protected archive, but WinRAR serves a different goal, so it is not very user-friendly for this purpose. If possible, it would be nice if the program did not create temp files while I have the files open. Any suggestions?

  • TrueCrypt files corrupted after moving PC into another case

    - by Dygerati
    I recently bought a new PC case and transferred all of my PC hardware into it. The only hardware modification was the addition of two identical RAM modules. The entire process went smoothly, and everything worked and booted as before. The only side effect appeared when I accessed one of my file-based hidden TrueCrypt volumes shortly thereafter. Some of the files in the volume (NOT all) seem to be entirely corrupted: the directory and file names are garbled characters, though a few of the directories in the same volume appear and function normally. Also, all files in the non-hidden TrueCrypt volume are still intact. Isn't that weird? The only other real change I can think of is that the hard drives were connected to different SATA ports on the motherboard. I really don't know how TrueCrypt's encryption works well enough to know what could cause this, and the fact that not all the files were corrupted makes it more bizarre still. So, first off (and I'm not too hopeful on this point), would it be possible to restore these files? I had a backup of most, but not all, of the files involved. Other than that, I'm just curious how this happened and how I can prevent it next time. Thanks!

  • Server stopped responding - where to look to find out what happened?

    - by Cid
    I have a server that had been running for well over 5 months when it suddenly stopped responding. I couldn't ssh into it or anything else, so I decided to reboot it, and the reboot fixed it. I'm trying to figure out what happened, and I'm not sure exactly where to look. I started looking in /var/log, but there are tons of files in there and I'm not sure which ones I should pay attention to. I'm slowly going through each of them, but if anyone can point me in the right direction, it would be great. Thanks!
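
    A sketch of a first pass through the logs (stock Ubuntu/Debian paths assumed), working from timestamps and the usual suspects rather than reading every file:

        ls -lt /var/log | head     # which logs were written just before the hang
        last -x | head             # reboot and shutdown history
        grep -iE 'oom|panic|segfault|error' /var/log/syslog /var/log/kern.log
        dmesg | tail -50           # kernel messages since the reboot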
