Search Results

  • VMware vCenter Tomcat log files missing

    - by sttaq
    I have upgraded from vCenter 5.0 to a trial version of vCenter 5.1. Previous versions used to write Tomcat log files inside the tomcat\logs folder, but since the update I have noticed that Tomcat is not writing any new log files. The log files were of the following format: vctomcat-stderr.XXXX-XX-XX.log and vctomcat-stdout.XXXX-XX-XX.log. Is there a way to configure the Tomcat instance that ships with vCenter to produce these log files again? Also, VMware seems to have removed the Configure Tomcat application that they used to ship with the older version. Any reasons for this?

  • Media Encoder with Merge Files & Sandy Bridge Hardware Encoding Support

    - by GruffTech
    So I have a Z68 chipset that allows Quick Sync encoding on my i7 processor. I want a media encoder that supports both hardware encoding and lets me "merge files". My problem is that Fraps breaks my recordings into 4GB files. I need to stitch these files together and re-encode for YouTube etc., but I want to use the hardware encoding I paid for. I have not figured out how to do this in Handbrake or MediaEspresso.
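
    Neither of those two tools, but an ffmpeg build with Quick Sync support exposes an h264_qsv encoder, and its concat demuxer can join the Fraps segments losslessly before the re-encode. A minimal sketch, assuming such a build and hypothetical file names:

      rem list.txt contains one line per segment, in recording order:
      rem   file 'game-part1.avi'
      rem   file 'game-part2.avi'
      ffmpeg -f concat -safe 0 -i list.txt -c:v h264_qsv -global_quality 23 -c:a aac -b:a 192k out.mp4

    Here -global_quality plays roughly the role CRF plays for x264; adjust it to trade size against quality.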

  • Encrypted folders and files stay encrypted when copied

    - by user66126
    Hi all, I just tried out the Windows 7 feature to encrypt a folder. I found that when I access the encrypted folder from another computer (the parent folder of the encrypted folder is shared) I can see the files there but I cannot open them (which is good). But when I copy a file to another folder outside the encrypted folder (regardless of whether it is on the same remote computer or on the computer from which I am accessing the files), I can then open the file without any problem. This might be how it works, but that's not what I need. My question is: can I encrypt a folder (and all files inside), access those files (create, edit) seamlessly while I am logged in normally to the computer, but have the files stay encrypted when they are copied to another directory outside the encrypted folder, regardless of whether they are copied to the same computer, to another computer, or uploaded to a remote server? If this is not a feature that Windows natively supports, is there third-party software that does this? Thank you.
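
    For what it's worth on why this probably happens: EFS (the Windows 7 folder encryption used here) decrypts transparently for any account that holds the key, so as soon as an authorized user copies a file over the network or onto a non-NTFS destination, the copy is written as plaintext; the protection lives in the NTFS file on the original volume, not in the data itself. cipher.exe can at least confirm whether a given copy is still encrypted; a small sketch with a hypothetical path:

      cipher /c "D:\Shared\Secure\report.docx"

    For the stay-encrypted-everywhere requirement, container- or archive-level encryption (the kind of third-party tooling asked about) is the usual route.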

  • Caching/preloading files on Linux into RAM

    - by Andrioid
    I have a rather old server that has 4GB of RAM and is pretty much serving the same files all day, but it is doing so from the hard drive while 3GB of RAM are "free". Anyone who has ever tried running a RAM drive can witness that it's awesome in terms of speed. The memory usage of this system is usually never higher than 1GB/4GB, so I want to know if there is a way to use that extra memory for something good. Is it possible to tell the filesystem to always serve certain files out of RAM? Are there any other methods I can use to improve file reading capabilities by use of RAM? More specifically, I am not looking for a 'hack' here. I want file system calls to serve the files from RAM without needing to create a RAM drive and copy the files there manually, or at least a script that does this for me. Possible applications here are: web servers with static files that get read a lot, application servers with large libraries, desktop computers with too much RAM. Any ideas? Edit: Found this very informative: The Linux Page Cache and pdflush. As Zan pointed out, the memory isn't actually free. What I mean is that it's not being used by applications, and I want to control what should be cached in memory.
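
    One way to get close to this without a RAM drive is to pre-warm (and optionally pin) the kernel page cache. A sketch using vmtouch, a small third-party tool built for exactly this; the path is hypothetical:

      # read the files once so the page cache holds them
      vmtouch -t /srv/www/static

      # or keep them resident by locking the pages (needs root or a raised RLIMIT_MEMLOCK)
      vmtouch -d -l /srv/www/static

    Ordinary reads are then served from cache through normal filesystem calls, which is the "no hack" behaviour asked for; without -l the kernel remains free to evict those pages under memory pressure.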

  • 7zip many files from different folders?

    - by mafutrct
    I would like to add a large number of files with different names from different folders to a single 7zip archive using 7za.exe. This should be simple, but it turned out to be a major pain. I created a file that contains the paths (7za -a @list.txt), but once there are too many (~100) files, it fails. Apparently the content of the argument file is pushed onto the command-line buffer, which is far too small (the number of files to add is about 1 million). Splitting the process up by adding the files one by one is not feasible because of the way 7za works: when adding the next file, it creates a copy of the archive, adds the file to the copy and finally replaces the original. This is terribly slow once the archive gets to a couple of hundred MB in size. So far I am using a combination of the two approaches, adding a dozen files each time in a loop, but it is an unreliable hack and still very slow. Is there a better way to do it? I tried to use 7zip wrapper DLLs (I'm a C# programmer), but none of them worked reliably, and it was repeatedly suggested that I just use 7za instead.
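
    One thing that may be worth trying: 7-Zip has an include switch that reads its file list from a listfile (-i@listfile), which is processed inside 7za itself rather than being expanded onto the command line. A sketch, with hypothetical names, assuming a reasonably recent 7za:

      rem list.txt holds one path per line; -scs tells 7za how the listfile is encoded
      7za a backup.7z -i@list.txt -scsUTF-8 -mx=5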

  • How to handle files that don't need version control in Mercurial

    - by richardh
    I am new to Mercurial, and for the most part do LaTeX reports and statistical calculations in R using .csv and/or .sqlite files. Regarding LaTeX, all I really care about is the .tex file. Regarding R, I don't need version control on the .csv or .sqlite files because they are static. When I do 'hg add' for a repo with a .csv and/or .sqlite file, I get a warning like: rev2.sqlite: up to 3070 MB of RAM may be required to manage this file (use 'hg revert rev2.sqlite' to cancel pending addition). So I revert and subsequently use adds like hg add -X *.sqlite. I guess I really have two questions: (1) Should I ignore these warnings? Because these large files are static, can I just add them to the repo knowing that the diffs will always be empty and not worry about wasted resources? (2) If I should keep excluding these files from the repo, is there a way I can make this option stick, i.e. add something to my .hgrc file that always appends an option like -I *.tex -I *.R to my 'hg add' commands? Thanks!
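
    The usual way to make the exclusion stick is an .hgignore file in the repository root rather than per-command -X flags; hg add and hg status then skip the patterns automatically. A minimal sketch:

      # .hgignore
      syntax: glob
      *.sqlite
      *.csv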

  • How can I recover files when the folder shows as empty but the files are not deleted?

    - by Borror0
    Yesterday, my laptop caught a virus which caused massive damage. Since then, I have been trying to recover important files before reformatting my computer, a task the virus has not made easy. Restore points predating the attack have been deleted. Most of my folders show as empty. My Start menu is essentially empty, with the exception of Trillian and Mirror's Edge. The same goes for my desktop, which only has programs that were installed after the attack. Searching for files through my computer is pretty much useless, as it only rarely brings up anything. I suspect most of my files have not been deleted. While my folders show as empty, uTorrent still displays them and I can open them from there. Unfortunately, when I select Open Containing Folder, the folder still shows as completely empty even if I'm currently watching a video from that very folder. Further adding evidence to the not-deleted, just-missing theory, the data recovery software I'm using (Restoration) can only find a handful of the missing files. If they were deleted, I could do a forensic recovery to get them back, but since they're probably still somewhere on my computer, just out of my reach, I can't find them. Under those circumstances, is there a way I can recover those files?
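
    A common trick of the malware of that era was not to delete anything but to mark files and folders as hidden + system, which makes them invisible with default Explorer settings. If that is what happened here, clearing the attributes brings everything back; a sketch, run from a command prompt, with a hypothetical path:

      attrib -h -s -r /s /d "C:\Users\me\*"

    Enabling "Show hidden files and folders" in Folder Options first is a quick way to confirm whether the files are merely hidden before running anything.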

  • UNIX command for replacing files

    - by all-R
    Hi, on the Mac, you know there is no "merge folders" when you Command+C/Command+V one folder onto another; it actually replaces it. I'm simply wondering what UNIX behaviour this is based on. Correct me if I'm wrong, but "cp -R" DOES merge folders, no? And that's what I'm doing via the Finder, copying some files and folders... thanks!
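
    For what it's worth, the Finder's replace-the-whole-folder behaviour is its own; it is not inherited from any UNIX command. A quick sketch of the merging behaviour on the command line, with hypothetical directory names:

      # cp -R merges: files already in dst/ that aren't in src/ survive,
      # files with the same name are overwritten
      cp -R src/ dst/

      # rsync does the same and can show what it would change first
      rsync -avn src/ dst/   # dry run
      rsync -av  src/ dst/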

  • Highest compression for files (for web transfer)?

    - by Rogue
    I have seen some highly compressed files around (for example, 700MB of data compressed to around 30-50MB). But how do you get such compressed files? I have tried using software like WinRAR and 7-Zip but have never achieved such high compression. What are the techniques/software that allow you to compress files so well? (P.S. I'm using Windows XP)
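
    The ratio depends far more on the data than on the tool: text, logs, databases and installers shrink enormously, while video, JPEGs and MP3s barely shrink at all, and the extreme examples floating around are usually highly redundant data or specially prepared "repacks". That said, 7-Zip's LZMA2 at maximum settings with a large dictionary is a reasonable upper bound for general data; a sketch:

      7z a -t7z -m0=lzma2 -mx=9 -md=64m -ms=on archive.7z data\

    Note that a 64MB dictionary needs correspondingly more RAM on both the compressing and the decompressing side.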

  • Including files in a symlinked directory when backing up with duplicity

    - by Rob
    I'm backing up using duplicity, great tool. I'm unable to include files in the backup that are within a directory that is a symlink. Using the following: duplicity <dup args> --include /var/www/**/current --exclude '**' duplicity will only back up the symlink itself. I've tried: duplicity <dup args> --include /var/www/**/current/* --exclude '**' # and duplicity <dup args> --include /var/www/**/current/** --exclude '**' Not even the symlink is backed up then. The "current" directory links to a directory like: /var/www/host.com/de9f2c7fd25e1b3afad3e85a0bd17d9b100db4b3 That directory contains a few static HTML & CSS files. I want those files to be backed up, regardless of which SHA'd directory "current" points to. Any help appreciated.
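
    duplicity stores symlinks as symlinks and does not follow them, so one workaround is to resolve each "current" link at backup time and include its real target path instead. A shell sketch along those lines; <dup args> stands for whatever options and target URL are already in use, and the glob may need adjusting to match the real layout:

      INCLUDES=""
      for link in /var/www/*/current; do
          INCLUDES="$INCLUDES --include $(readlink -f "$link")"
      done
      duplicity <dup args> $INCLUDES --exclude '**'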

  • How to sync Ovi files with Ubuntu desktop?

    - by MikeG
    Hi, I just signed in to the Ovi Files service and it works great on a Windows desktop, but it lacks a connector for Linux. Is there any alternative way to achieve the same thing? The goal is to sync files on the desktop with a web storage system like Google Docs, Ovi Files or Dropbox. Thanks.

  • Are PHP session files ever deleted?

    - by GetFree
    I see there are thousands of files in my "/tmp" directory (a CentOS machine) and almost all of them are PHP session files. I'm worried about the possible impact this might have on my system. Are those files ever deleted, either by the OS, Apache or PHP, or do I have to take care of it myself?
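
    PHP normally cleans these up itself through its session garbage collector, driven by a few php.ini settings; if session files are piling up, those values are the place to look. A sketch of the relevant settings:

      ; php.ini
      session.gc_maxlifetime = 1440   ; seconds a session file may sit idle before it becomes eligible for removal
      session.gc_probability = 1      ; GC runs on roughly gc_probability / gc_divisor
      session.gc_divisor     = 100    ;   of all session starts, here 1%

    Some distributions set gc_probability to 0 and clean the session directory from a cron job instead, so it is worth checking which scheme this particular CentOS box actually uses before concluding nothing is cleaning up.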

  • How to avoid copying corrupted files with rsync

    - by Roberto Aloi
    I have an HDD with plenty of files, some of which are unfortunately corrupted. I'm now trying to copy the good files onto a new HDD. I'm using: rsync -azP SRC TGT When rsync comes to one of the corrupted files, I can see a message in the console: rsync: read errors mapping XXX: Input/output error (5) In the target folder, I still see the corrupted file, which I'm not able to open and which I have to delete manually. Is there any option to tell rsync not to copy files after an I/O error?
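
    I'm not aware of an rsync switch that drops the destination copy when the source read fails, so one hedged alternative is to give up rsync's niceties and copy file by file, removing the target whenever cp reports an error. A rough shell sketch, with SRC and TGT standing for the same paths as above:

      cd SRC || exit 1
      find . -type f -print0 | while IFS= read -r -d '' f; do
          mkdir -p "TGT/$(dirname "$f")"
          if ! cp -p -- "$f" "TGT/$f"; then
              echo "read error, dropping: $f" >&2
              rm -f -- "TGT/$f"
          fi
      done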

  • How to avoid compressing compressed files

    - by Gzorg
    Most compression programs compress all files by default. But when archiving a folder containing already compressed files, there is no need to compress them a second time: archives, packed setup programs, JPGs, movies, MP3s, and so on. Are there any compression programs that allow an arbitrary list of file types to be stored uncompressed while the others are still compressed? It looks like WinRAR can't. I expect this would be doable with tar + gzip/bzip2 and some scripting in various ways. Edit: WinRAR can.
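
    For reference, the WinRAR feature the edit alludes to is, if I remember the switch correctly, -ms ("store without compression") with a list of extensions, which is also exposed in the GUI archiving dialog. A command-line sketch; the extension list and archive name are just examples:

      rar a -ms"jpg;png;mp3;avi;mkv;zip;7z;rar" archive.rar data\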

  • Is there a way to backup all my files and replace with 0-byte files with same name?

    - by laggingreflex
    My main drive on my laptop keeps filling up, so I take a backup on a USB drive and delete the original files. But then I find myself getting (downloading, or getting from someone else) files that I already have backed up but couldn't recall at the moment. So is there a way I can keep a 0-byte file with the same name as the backed-up copy, so that when I'm asked whether to overwrite the existing file, I can easily choose no, knowing I probably have this file already in the backup? EDIT: better yet, replace each file with a shortcut (.lnk) to the external drive, so I can access the files hassle-free and not get any errors because of 0-byte files being accidentally opened.
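
    For the plain 0-byte variant, robocopy can mirror a directory tree as zero-length files in one pass with its /CREATE switch. A sketch, with hypothetical drive letters:

      robocopy "E:\Backup" "C:\Placeholders" /E /CREATE

    The .lnk variant would need a small script instead (for example via PowerShell and the WScript.Shell COM object), since robocopy has no shortcut-creation option.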

  • Mercurial (hg) commit only certain files

    - by bresc
    Hi, I'm trying to commit only certain files with hg. Because hg seems to pick up every modified file automatically, whenever I try to commit a change it wants to commit all files. But I don't want that, because certain files are not "ready" yet. There is hg commit -I thefile.foo, but this is only for one file. The better way for me would be if I could turn off this automatic inclusion, as in git. Is this possible? Thanks!
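
    Two things that may help, sketched below with hypothetical file names: hg commit accepts any number of explicit paths (not just a single -I), and -I itself can be repeated; everything not named simply stays modified in the working copy until a later commit.

      # commit only the named files
      hg commit -m "partial commit" src/ready1.c src/ready2.c

      # or use repeated include patterns
      hg commit -m "partial commit" -I 'glob:src/ready*.c' -I doc/notes.txt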

  • ZFS, dedupe and PST files

    - by Unreason
    I am interested to know what the expected maximum dedupe ratio would be for a set of PST files. I have ~40GB of PST files from ~15 users with a high level of duplication of attachments. I am running tests to see if I can get significant space savings by storing the data on ZFS with dedupe. For this purpose I have installed a test setup of Nexenta, but I was wondering if someone here has already done this and what level of deduplication I might expect (or, in other words, how sensitive are PST files to block alignment, and what parameters can influence the ratio?). Initial tests show a very low dedupe ratio, and I did find an explanation that block-level dedupe would not be efficient here and that byte-level dedupe would be much better (and that it should be performed by an application that is aware of the internal organization), so I am just double-checking whether someone has more input. Otherwise I will probably be converting the PST files to IMAP.
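
    Two ZFS-side knobs worth trying before giving up, sketched here with a hypothetical pool/dataset name: zdb can simulate the dedup table to predict the ratio without enabling anything, and a smaller recordsize sometimes helps when the duplicated data (attachments inside the PSTs) does not line up on large blocks.

      # simulate dedup and print the would-be dedup ratio for the pool
      zdb -S tank

      # smaller records give duplicate runs a better chance of aligning
      zfs set recordsize=8K tank/pst
      zfs set dedup=on      tank/pst

    Note that dedup=on only affects data written afterwards, so the PST files have to be copied in again for each test.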

  • Cannot copy files from a Windows 2003 server over the network

    - by Mark
    It seemed quite strange. I have a shared folder with full read/write permission on my Windows 2003 server. From an XP client, I can create a new folder in the share and copy files to it normally, but I cannot copy those files back to my client PC. I tried using FTP and WebDAV to get the files from the server; neither worked. Is the issue related to the NETWORK SERVICE account? Thanks for your help.

  • DOS application to allow remote management of files over serial link

    - by tomlogic
    Harken back to the days of DOS. I have an embedded DOS handheld device, and I'm looking for a tool to manage the files stored on it. I picture an application I can launch on the device that opens COM1 for commands to get a directory listing, send/receive files via X/Y/ZMODEM, move/delete files, and create/move/delete directories. A Windows application can then download a recursive file listing and manage those files (for example, synchronizing with a local directory). Keep in mind that this is DOS: 8.3 filenames, 640K of RAM and a 19200bps serial link (yuk!). I'd prefer something with source in case we need to add additional features (for example, the ability to get a checksum of a file for change detection). Now that I've written this description, I realize I'm asking for something like LapLink or pcAnywhere. Norton no longer sells DOS versions of pcAnywhere, and LapLink V for DOS seems pricey at $50. Are you aware of any similar apps from those good old days?

  • Google Drive and sync?

    - by Royi Namir
    Two questions, please. (1) Where are Google Drive offline docs stored on my computer, i.e. in which folder? (This is when accessing docs.google.com/offline.) (2) I have a text file in my Google Drive. When I click on it, I can view it only (no edit). The only option to edit is to export it to Google Docs, but then I have 2 files: the original text file and the editable one. So now I have to sync both the regular file AND the version created by Google Docs. Is that the normal behavior?

  • Speeding up deletion of large numbers of files on NTFS volumes

    - by sharptooth
    Every now and then I need to delete a folder containing something like 500k files from an NTFS volume. I do this with Windows Explorer. Since NTFS journals all metadata changes, each deletion is carried out serially, and so the whole 500k-file deletion takes ages. I remember that when I did the same on FAT32 it ran incomparably faster. Is there any way to speed up deletion of a large number of files on NTFS volumes?
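
    Independent of the filesystem question, a large share of the time usually goes to Explorer itself (per-file UI updates and Recycle Bin handling), so deleting from a command prompt is often the single biggest win. A sketch, with a hypothetical path:

      del /f /s /q "D:\big-folder" >nul
      rmdir /s /q "D:\big-folder"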

  • Why would anacron not be running?

    - by Rory
    I have an Ubuntu system that has anacron installed. However, I'm pretty sure it's not running. It's not running the commands in /etc/cron.daily to rotate the syslog files (I'm using sysklogd, which has its own log-rotating method and does not use logrotate). The last time the logs were rotated was in October 2009. /var/spool/anacron/cron.daily exists and its contents are 20091015. AFAIR we had a power outage then, and everything rebooted. How can I debug anacron? How can I see why it's not running? My first instinct is to look for /var/log/anacron, but that's not there. How can I fix it to make it run again?
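
    anacron has no log file of its own by default (it logs to syslog), but it can be run by hand in a verbose foreground mode, which usually shows immediately whether it thinks the jobs are up to date or is failing. A sketch:

      # -d: stay in the foreground and print debug messages to stderr
      # -f: force all jobs, ignoring the timestamps in /var/spool/anacron
      # -n: run the jobs now, without the usual start-up delays
      sudo anacron -d -f -n

      # also worth checking that the hooks that normally start anacron are intact
      grep -r anacron /etc/cron.d/ /etc/crontab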

  • Program to swap files between drives?

    - by josi
    Has anyone built a program or script to transfer files between two hard drives when both are nearly full, so that one copies a file over, then the other copies a file back the other way, and then each deletes the files that were copied? It's kind of annoying: I have a 6TB RAID that is about 4TB full and a 4.5TB drive that is basically full, and I can't really swap their contents easily without doing many copies and deletes of files. Does anyone know a way to make them just swap? lol
