Search Results

Search found 43406 results on 1737 pages for 'drag and drop files'.

Page 92 of 1737

  • DOS Batch file to find "new" files by date

    - by Todd McArthur
     My PC has entered an infinite BSOD loop, but I do have access to a safe-mode command prompt. I'm trying to get an idea of what changed that might have triggered this; e.g. I might have gotten a virus, or an app update went belly up. I'd like to see which files were created/modified in the last few days or week, or at least the *.exe, *.dll, *.com, *.bat, etc. I thought I was OK with my batch-fu, but I'm stumped on how to write a quick batch file/command that would list the files for me.

        REM This will find the files, but the results are all muddled
        REM all EXE files, reverse sort by date, recursively through sub-directories
        dir *.exe /O-D /S

     What I'd really like is to find all executable file types that were created or modified in the last 3-7 days. Can anyone point me in the right direction?
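
     One possible route, if forfiles.exe is available (it ships with Windows Vista/7 and Server 2003, but not with a stock XP install), is to let it do the date filtering. A minimal sketch; the cutoff date and start path are placeholders, and the date format follows the system locale:

        REM list executables changed on or after the given date, recursing from C:\
        forfiles /P C:\ /S /M *.exe /D +04/10/2010 /C "cmd /c echo @fdate @ftime @path"
        forfiles /P C:\ /S /M *.dll /D +04/10/2010 /C "cmd /c echo @fdate @ftime @path"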

    Read the article

  • Shared Files stuck locked even after closing all sessions

    - by Chris S
    We run a business app from a shared network drive (has to be this way). When I go to do updates it complains that files are locked. Generally there are open sessions from people who left their computer on, but with no locks on files; there aren't necessarily always sessions open when it complains about locked files. If I close these sessions they disappear. I say "disappear" because I suspect they're actually hanging open. If I try to restart the Server service, it hangs on stopping. Restarting the whole server (it's a VM) unlocks the files. The Server is a Windows 2008 R2 Ent VM running on Hyper-V; the share is accessed through DFS. Offline Files and caching are disabled (Share and GPO). All clients are Win7. Nothing has SP1 yet. Any ideas on what causes the file locks to hang? Any ideas for a solution other than rebooting the server every time?
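
     When this happens, it may be worth checking what the server itself still thinks is open before rebooting it. A hedged sketch using the built-in commands (the handle ID below is just an example):

        REM list files opened over the network, with their handle IDs and lock counts
        net file
        REM force-close one handle by ID
        net file 42 /close
        REM openfiles gives the same view and can also disconnect by ID
        openfiles /query
        openfiles /disconnect /id 42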

    Read the article

  • Proper way to rotate Nginx logs

    - by depesz
     I would like to achieve rotation of nginx logs that:

        - would work without any extra software (i.e. best if without "logrotate")
        - would create rotated files with names based on date

     The best approach is something like PostgreSQL has - i.e. in its log_filename config variable I can specify strftime-style %Y-%m-%d, and it will automatically change the log on date (or time) change. Another approach, from Apache: sending logs via a pipe to the rotatelogs program. As far as I was able to search, no such approach exists. All I can do is use logrotate with the dateext option, but it has its own set of drawbacks, and I'd rather use something that works like |rotatelogs or log_filename in PostgreSQL.
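
     Nginx reopens its log files when the master process receives the USR1 signal, so a short script run from cron can produce date-named files without logrotate. A minimal sketch; the log and pid paths are assumptions:

        #!/bin/sh
        # move the live log aside under a dated name, then tell nginx to reopen its logs
        mv /var/log/nginx/access.log "/var/log/nginx/access-$(date +%Y-%m-%d).log"
        kill -USR1 "$(cat /var/run/nginx.pid)"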

    Read the article

  • Configure Apache + Passenger to serve static files from different directory

    - by Rory Fitzpatrick
     I'm trying to set up Apache and Passenger to serve a Rails app. However, I also need it to serve static files from a directory other than /public and give precedence to these static files over anything in the Rails app. The Rails app is in /home/user/apps/testapp and the static files in /home/user/public_html. For various reasons the static files cannot simply be moved to the Rails public folder. Also note that the root http://domain.com/ should be served by the index.html file in the public_html folder. Here is the config I'm using:

        <VirtualHost *:80>
            ServerName domain.com
            DocumentRoot /home/user/apps/testapp/public
            RewriteEngine On
            RewriteCond /home/user/public_html/%{REQUEST_FILENAME} -f
            RewriteCond /home/user/public_html/%{REQUEST_FILENAME} -d
            RewriteRule ^/(.*)$ /home/user/public_html/$1 [L]
        </VirtualHost>

     This serves the Rails application fine but gives 404 for any static content from public_html. I have also tried a configuration that uses DocumentRoot /home/user/public_html, but this doesn't serve the Rails app at all, presumably because Passenger doesn't know to process the request. Interestingly, if I change the conditions to !-f and !-d and the rewrite rule to redirect to another domain, it works as expected (e.g. http://domain.com/doesnt_exist gets redirected to http://otherdomain.com/doesnt_exist). How can I configure Apache to serve static files like this, but allow all other requests to continue to Passenger?
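
     One thing worth checking in the config as posted: consecutive RewriteCond lines are ANDed by default, so requiring the same path to be both a regular file (-f) and a directory (-d) can never match, which would explain the 404s. A hedged sketch of the same block with an [OR] flag added:

        RewriteEngine On
        RewriteCond /home/user/public_html/%{REQUEST_FILENAME} -f [OR]
        RewriteCond /home/user/public_html/%{REQUEST_FILENAME} -d
        RewriteRule ^/(.*)$ /home/user/public_html/$1 [L]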

    Read the article

  • Compressing and copying large files on Windows Server?

    - by Aaron
     I've been having a hard time copying large database backups from the database server to a test box at another site. I'm open to any ideas that would help me get this database moved without having to resort to a USB hard drive and the mail. The database server is running Windows Server 2003 R2 Enterprise, 16 GB of RAM and two quad-core 3.0 GHz Xeon X5450s. Files are SQL Server 2005 backup files between 100 GB and 250 GB. The pipe is not the fastest, and SQL Server backup files typically compress down to 10-40% of the original, so it made sense to me to compress the files first. I've tried a number of methods, including:

        gzip 1.2.4 (UnxUtils) and 1.3.12 (GnuWin)
        bzip2 1.0.1 (UnxUtils) and 1.0.5 (Cygwin)
        WinRAR 3.90
        7-Zip 4.65 (7za.exe)

     I've attempted to use WinRAR and 7-Zip options for splitting into multiple segments. 7za.exe has worked well for me for database backups on another server, which has ~50 GB backups. I've also tried splitting the .BAK file first with various utilities and compressing the resulting segments. No joy with that approach either; no matter the tool I've tried, it ends up butting up against the size of the file. Especially frustrating is that I've transferred files of similar size on Unix boxes without problems using rsync+ssh. Installing an SSH server is not an option for the situation I'm in, unfortunately. For example, this is how 7-Zip dies:

        H:\dbatmp>7za.exe a -t7z -v250m -mx3 h:\dbatmp\zip\db-20100419_1228.7z h:\dbatmp\db-20100419_1228.bak

        7-Zip (A) 4.65  Copyright (c) 1999-2009 Igor Pavlov  2009-02-03
        Scanning
        Creating archive h:\dbatmp\zip\db-20100419_1228.7z
        Compressing  db-20100419_1228.bak

        System error:
        Unspecified error
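
     For the transfer step itself (once the archive volumes exist), a restartable copy can help over a flaky pipe. A hedged sketch using robocopy, which is built into Vista/2008 and available for Server 2003 from the resource kit; the destination share name is hypothetical:

        robocopy H:\dbatmp\zip \\testbox\backups db-*.7z.* /Z /R:10 /W:30 /LOG:copy.log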

    Read the article

  • Process PHP files from a network share in a vmware virtual machine

    - by nhinkle
     As a testing environment, I have set up a vmware virtual machine running Windows Server 2008 R2. I have Apache and PHP installed (as part of the xampp package). I am doing the development outside of the VM, and so want Apache to serve PHP files from a VM shared folder (which appears as a network share in the VM). I have done this by creating an NTFS symbolic link in Apache's htdocs directory. I can access this directory from the browser, and plain-text files are readable. However, PHP fails to process files, instead returning the following error:

        Warning: Unknown: failed to open stream: No such file or directory in Unknown on line 0

        Fatal error: Unknown: Failed opening required 'C:/xampplite/htdocs/path/to/file.php' (include_path='.;C:\xampplite\php\PEAR') in Unknown on line 0

     It appears to be a permissions issue: PHP doesn't seem to be allowed to read the file to process it. However, Apache has no problem opening files in the directory. I cannot figure out how to give PHP the necessary permissions to process the file. Does anybody know of a way to make this work, or else another solution for getting the files into the VM automatically while I develop on the host machine?
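
     One alternative worth trying, instead of the NTFS symlink, is pointing Apache at the UNC path directly with an Alias; Apache on Windows accepts UNC paths written with forward slashes. The share and directory names below are hypothetical, and the account the Apache service runs under still needs read access to the share:

        Alias /dev "//vmware-host/Shared Folders/project"
        <Directory "//vmware-host/Shared Folders/project">
            Order allow,deny
            Allow from all
        </Directory>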

    Read the article

  • Use DOS batch to move all files up 1 directory

    - by Harminoff
     I have created a batch file to be executed through the right-click menu in Win7. When I right-click on a folder, I would like the batch file to move all files (excluding folders) up 1 directory. I have this so far:

        PUSHHD %1
        MOVE "%1\*.*" ..\

     This seems to work as long as the folder I'm moving files from doesn't have any spaces. When the folder does have spaces, I get an error message: "The syntax of the command is incorrect." So my batch works on a folder titled PULLTEST but not on a folder titled PULL TEST. Again, I don't need it to move folders, just files. And I would like it to work in any directory on any drive. There will be no specific directories that I will be working in. It will be random. Below is the registry file I made, if needed for reference.

        Windows Registry Editor Version 5.00

        [HKEY_CLASSES_ROOT\Directory\shell\PullFiles]
        @="PullFilesUP"

        [HKEY_CLASSES_ROOT\Directory\shell\PullFiles\command]
        @="\"C:\\Program Files\\MyBatchs\\PullFiles.bat\" \"%1\""
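
     Because the registry entry already passes the folder path wrapped in quotes, the usual fix is to strip and re-add them with %~1 so paths containing spaces survive the expansion. A hedged sketch (note the snippet as posted also spells PUSHD with an extra H, which would fail on its own):

        @echo off
        pushd "%~1"
        move "%~1\*.*" ..\
        popd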

    Read the article

  • Disk full, how to move mysql database files?

    - by kopeklan
     My database files are located in /var/lib/mysql, which is on partition /dev/sda5. This partition is full (refer here for details), so I'm going to move the location of the database files from /var/lib/mysql to /home/lib/mysql. What is the right way to move these database files? I'm going to do these steps:

        1. Stop the HTTP server and PHP
        2. Change datadir=/var/lib/mysql to datadir=/home/lib/mysql in /etc/my.cnf
        3. Move all database files to the new location
        4. Run killall -9 mysql, then /etc/init.d/mysqld start
        5. Start the HTTP server and PHP

     Is this right? Correct me if I'm wrong. Added: currently, mysql won't stop; refer here: mysql won't stop, mysqld_safe appeared in top
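
     A minimal sketch of the copy itself, assuming a stock CentOS-style init script; ownership has to follow the files, and on SELinux-enabled systems the new path also needs the mysqld file context (restorecon/semanage) or mysqld may refuse to start:

        /etc/init.d/mysqld stop                  # or deal with the stuck mysqld first, as noted above
        cp -a /var/lib/mysql /home/lib/mysql     # -a keeps ownership, permissions, timestamps
        chown -R mysql:mysql /home/lib/mysql
        # update datadir (and the socket path, if it lives under datadir) in /etc/my.cnf, then:
        /etc/init.d/mysqld start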

    Read the article

  • VMware vCenter tomcat log files missing

    - by sttaq
     I have upgraded to a trial version of vCenter 5.1 from 5.0. Previous versions used to have tomcat log files inside the tomcat\logs folder, but since I updated I have noticed that tomcat is not writing any new log files. The log files were of the following format:

        vctomcat-stderr.XXXX-XX-XX.log
        vctomcat-stdout.XXXX-XX-XX.log

     I would like to know if there is a way by which we can configure tomcat (which is shipped with vCenter) to produce these log files. Also, VMware seems to have removed the Configure Tomcat application that they used to ship with the older version. Any reasons for this?

    Read the article

  • Media Encoder w/ Merge Files & SandyBridge Hardware Encoding Support

    - by GruffTech
     So I have a Z68 chipset that allows Quick Sync encoding on my i7 processor. I want a media encoder that supports both hardware encoding and allows me to "merge files". My problem is Fraps breaks my recording into 4 GB files. I need to stitch these files together and re-encode for YouTube etc., but I want to use the hardware encoding I paid for. I have not figured out how to do this in Handbrake or MediaEspresso.

    Read the article

  • Caching/preloading files on Linux into RAM

    - by Andrioid
     I have a rather old server that has 4 GB of RAM and it is pretty much serving the same files all day, but it is doing so from the hard drive while 3 GB of RAM are "free". Anyone who has ever tried running a ram-drive can witness that it's awesome in terms of speed. The memory usage of this system is usually never higher than 1 GB/4 GB, so I want to know if there is a way to use that extra memory for something good. Is it possible to tell the filesystem to always serve certain files out of RAM? Are there any other methods I can use to improve file reading capabilities by use of RAM? More specifically, I am not looking for a 'hack' here. I want file system calls to serve the files from RAM without needing to create a ram-drive and copy the files there manually. Or at least a script that does this for me. Possible applications here are: web servers with static files that get read a lot, application servers with large libraries, desktop computers with too much RAM. Any ideas? Edit: Found this very informative: The Linux Page Cache and pdflush. As Zan pointed out, the memory isn't actually free. What I mean is that it's not being used by applications and I want to control what should be cached in memory.
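
     Since the page cache keeps anything recently read as long as the memory isn't needed elsewhere, one low-tech option is simply to pre-read the hot files; the vmtouch utility (if installed) can do the same explicitly and can even lock pages in memory. A hedged sketch with a hypothetical path:

        # warm the page cache by reading the files once
        find /var/www/static -type f -exec cat {} + > /dev/null
        # or, with vmtouch: -t touches files into memory, -l locks them so they are not evicted
        vmtouch -t /var/www/static
        vmtouch -l /var/www/static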

    Read the article

  • Encrypted folders and files stay encrypted when copied

    - by user66126
     Hi all, I just tried out the Windows 7 feature to encrypt a folder. I found that when I access the encrypted folder from another computer (the parent folder of the encrypted folder is shared) I can see the files there but I cannot open them (which is good). But when I copy a file to another folder outside the encrypted folder (regardless of whether it is on the same remote computer or on the computer from which I am accessing the files), then I can open the file without any problem. This might be how it works ... but that's not what I need. My question is: can I encrypt a folder (and all files inside), access those files (create, edit) seamlessly while I am logged in normally to the computer, but make the files stay encrypted when they are copied to another directory outside the encrypted folder? Regardless of whether they are copied to the same computer, another computer, or uploaded to a remote server. If this is not a feature that Windows natively supports, is there third-party software that does that? Thank you.

    Read the article

  • How to handle files that don't need version control in mercurial

    - by richardh
     I am new to mercurial, and for the most part do LaTeX reports and statistical calculations in R using .csv and/or .sqlite files. Re LaTeX, all I really care about is the .tex file. Re R, I don't need version control on the .csv or .sqlite files because they are static. When I do 'hg add' for a repo with a .csv and/or .sqlite file, I get a warning like:

        rev2.sqlite: up to 3070 MB of RAM may be required to manage this file
        (use 'hg revert rev2.sqlite' to cancel pending addition)

     So I revert and subsequently use adds like hg add -X *.sqlite. I guess I really have two questions: (1) Should I ignore these warnings? Because these large files are static, can I just add them to the repo knowing that the diff files will always be empty and not worry about wasted resources? (2) If I should keep excluding these files from the repo, is there a way that I can fix this option? I.e., add to my .hgrc file something that always appends an option like -I *.tex -I *.R to my 'hg add' commands? Thanks!
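
     If the .csv and .sqlite files never need to be tracked at all, a repository-level .hgignore keeps a plain 'hg add' (and 'hg status') from picking them up, which avoids both the warning and the per-command -X flags. A minimal sketch of such a file at the repo root:

        syntax: glob
        *.sqlite
        *.csv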

    Read the article

  • 7zip many files from different folders?

    - by mafutrct
     I would like to add a large number of files with different names from different folders to a single 7zip archive using 7za.exe. This should be simple, but it turned out to be a major pain. I created a file that contains the paths (7za -a @list.txt), but once there are too many (~100) files, it fails. Apparently the content of the argument file is pushed onto the command line buffer, which is far too small (the number of files to add is 1m). Splitting the process up by adding the files one by one is not feasible due to the way 7za works: when adding the next file, it creates a copy of the archive, adds the file to the copy and finally replaces the original. This is terribly slow once the archive gets to a couple hundred MB in size. So far I am using a combination of the two approaches by adding a dozen files each time in a loop, but it is an unreliable hack and still very slow. Is there a better way to do it? I tried to use 7zip wrapper DLLs (I'm a C# programmer), but none of them worked reliably and I was repeatedly advised to just use 7za instead.

    Read the article

  • How can I recover files when the folder shows empty but the files are not deleted?

    - by Borror0
     Yesterday, my laptop caught a virus which caused massive damage. Since then, I have been trying to recover important files before reformatting my computer, a task the virus has not made easy. Restore points predating the attack have been deleted. Most of my folders show empty. My Start menu is essentially empty, with the exception of Trillian and Mirror's Edge. The same goes for my Desktop, which only has programs which were installed after the attack. Searching for files through my computer is pretty much useless, as it only rarely brings up anything. I suspect most of my files have not been deleted. While my folders show empty, uTorrent still does display them and I can open them from there. Unfortunately, when I select Open Containing Folder, the folder still shows as completely empty even if I'm currently watching a video from that very folder. Further adding evidence to the not-deleted, just-missing theory, the data recovery software I'm using (Restoration) can only find a handful of the missing files. If they were deleted, I could do a forensic recovery to get them back, but since they're probably still somewhere on my computer, just out of my reach, I can't find them. Under those circumstances, is there a way I can recover those files?
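
     Many fake-antivirus infections from that period did not delete anything; they only set the hidden and system attributes on files and shortcuts so Explorer and Search stop showing them, which would match the symptoms described (the files still open fine from uTorrent). If that is what happened here, clearing the attributes makes everything visible again. A hedged sketch from a command prompt, with a hypothetical profile path:

        attrib -h -s "C:\Users\YourName\*" /S /D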

    Read the article

  • UNIX command for replacing files

    - by all-R
     Hi, on the Mac, you know there is no 'merge folders' when you Command+C/Command+V one folder onto another; it actually replaces it. I'm simply wondering what UNIX behavior this is based on. Because correct me if I'm wrong, but "cp -R" DOES merge files, no? And that's what I'm doing via the Finder, copying some files and folders... thanks!
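
     The Finder's replace behavior is its own; the underlying BSD tools do merge. A small illustration with hypothetical paths, using a trailing slash on the source so its contents, rather than the folder itself, are copied:

        # cp -R merges the contents of source into an existing destination folder
        cp -R ~/source/ ~/destination/
        # ditto, which ships with OS X, also merges and preserves Mac metadata
        ditto ~/source ~/destination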

    Read the article

  • Highest compression for files (for web transfer)?

    - by Rogue
     I have seen some highly compressed files around (e.g. I have seen 700 MB of data compressed to around 30-50 MB). But how do you get such compressed files? I have tried using software like WinRAR and 7-Zip but have never achieved such high compression. What are the techniques/software that allow you to compress files so well? (P.S. I'm using Windows XP)
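
     Ratios like that depend mostly on the data itself (text, databases and uncompressed installers shrink enormously; JPEGs and videos barely at all) rather than on a secret tool. For reference, a hedged example of 7-Zip's stronger settings from the command line, with a hypothetical input folder:

        7z a -t7z -mx=9 -md=64m -ms=on archive.7z C:\data\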

    Read the article

  • Including files in a symlink directory when backing up with duplicity

    - by Rob
     I'm backing up using Duplicity, great tool. I'm unable to include files in the backup that are within a directory that is a symlink. Using the following:

        duplicity <dup args> --include /var/www/**/current --exclude '**'

     duplicity will only back up the symlink. I've tried:

        duplicity <dup args> --include /var/www/**/current/* --exclude '**'
        # and
        duplicity <dup args> --include /var/www/**/current/** --exclude '**'

     Then not even the symlink is backed up. The "current" directory links to a directory like:

        /var/www/host.com/de9f2c7fd25e1b3afad3e85a0bd17d9b100db4b3

     which contains a few static HTML & CSS files. I want those files to be backed up, regardless of which sha'd directory "current" points to. Any help appreciated.
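
     Duplicity records symlinks as symlinks and does not follow them, so one workaround is to hand it the resolved target instead of the link, regenerated at each run so it always tracks the current deploy. A hedged sketch for a single host (the rest of the command is unchanged from the above):

        duplicity <dup args> --include "$(readlink -f /var/www/host.com/current)" --exclude '**'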

    Read the article

  • How to avoid copying corrupted files with rsync

    - by Roberto Aloi
     I have an HDD with plenty of files, some of which are unfortunately corrupted. I'm now trying to copy the good files onto a new HDD. I'm using:

        rsync -azP SRC TGT

     When rsync comes to one of the corrupted files, I can see a message in the console:

        rsync: read errors mapping XXX: Input/output error (5)

     In the target folder, I still see the corrupted file, which I'm not able to open and which I have to delete manually. Is there any option to tell rsync not to copy files after an I/O error?
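
     I'm not aware of an rsync switch that discards the destination copy after a read error on the source, but a per-file loop can approximate it by deleting the partial copy whenever rsync exits non-zero. A hedged sketch (GNU/Linux shell, paths hypothetical):

        cd /path/to/SRC
        find . -type f -print0 | while IFS= read -r -d '' f; do
            # -aR preserves attributes and recreates the relative path under TGT
            rsync -aR "$f" /path/to/TGT/ || rm -f "/path/to/TGT/$f"
        done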

    Read the article

  • How to sync Ovi files with Ubuntu desktop?

    - by MikeG
     Hi, I just signed up for the Ovi Files service and it works great on the Windows desktop, but it lacks a connector for Linux. Is there any alternative way to achieve the same? The goal is to sync your files on the desktop with a web storage system like Google Docs or Ovi Files, Dropbox and so on. Thanks.

    Read the article

  • Are PHP session files ever deleted?

    - by GetFree
     I see there are thousands of files in my "/tmp" directory (a CentOS machine) and almost all of them are PHP session files. I'm worried about the possible impact this might have on my system. Are those files ever deleted, either by the OS, Apache or PHP? Or do I have to take care of it myself?
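
     PHP's own session garbage collector removes stale files probabilistically at request startup, governed by three php.ini settings; a sketch of typical values (some distributions disable or externalize this, so the numbers below are illustrative):

        session.gc_probability = 1
        session.gc_divisor     = 100    ; ~1% of requests run the session GC
        session.gc_maxlifetime = 1440   ; seconds a session may sit idle before it is eligible for removal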

    Read the article

  • How to avoid compressing compressed files

    - by Gzorg
     Most compression programs compress all files by default. But when archiving a folder containing already-compressed files, there is no need to compress them a second time: archives, packed setup programs, JPEGs, movies, MP3s, ... Are there any compression programs that allow an arbitrary list of file types to be stored while the others are still compressed? It looks like WinRAR can't. I expect this would be doable with tar+gz/bzip2 and some scripting in various ways. Edit: WinRAR can.
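
     With 7-Zip from the command line, one way to approximate this is two passes over the same archive, since the compression level applies to the files added in each invocation: already-compressed types go in with -mx=0 (store), everything else with a normal level. A hedged sketch with hypothetical file sets:

        7z a -mx=0 backup.7z *.jpg *.mp3 *.zip
        7z a -mx=9 backup.7z docs\ src\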

    Read the article
