Search Results

Search found 54055 results on 2163 pages for 'multiple files'.

  • DOS application to allow remote management of files over serial link

    - by tomlogic
    Hark back to the days of DOS. I have an embedded DOS handheld device, and I'm looking for a tool to manage the files stored on it. I picture an application I can launch on the device that opens COM1 for commands to get a directory listing, send/receive files via X/Y/ZMODEM, move/delete files, and create/move/delete directories. A Windows application can then download a recursive file listing and manage those files (for example, synchronizing with a local directory). Keep in mind that this is DOS -- 8.3 filenames, 640K of RAM and a 19200bps serial link (yuk!). I'd prefer something with source, in case we need to add features (for example, the ability to get a checksum of a file for change detection). Now that I've written this description, I realize I'm asking for something like LapLink or pcAnywhere. Norton no longer sells DOS versions of pcAnywhere, and LapLink V for DOS seems pricey at $50. Are you aware of any similar apps from those good old days?

  • Speed up deletion of large numbers of files on NTFS volumes

    - by sharptooth
    Every now and then I need to delete a folder containing something like 500k files from an NTFS volume. I do this with Windows Explorer. Since NTFS journals all metadata changes, each deletion is carried out serially, so deleting the whole 500k files takes ages. I remember that when I did the same on FAT32 it ran incomparably faster. Is there any way to speed up deletion of large numbers of files on NTFS volumes?
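
    A standard workaround (a suggestion, not from the question) is to skip Explorer and delete from a command prompt, which avoids the recycle bin, the progress UI and the per-file overhead Explorer adds; the path is a placeholder:

        del /f /s /q "D:\folder-to-remove" > nul
        rmdir /s /q "D:\folder-to-remove"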

  • How to avoid compressing compressed files

    - by Gzorg
    Most compression programs compress all files by default. But when archiving a folder containing already-compressed files, there is no need to compress them a second time: archives, packed setup programs, JPEGs, movies, MP3s and so on. Are there any compression programs that accept a list of file types to be stored as-is while the others are still compressed? It looks like WinRAR can't. I expect this would be doable with tar + gzip/bzip2 and some scripting, in various ways. Edit: WinRAR can.
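
    For what it's worth, WinRAR's command-line -ms switch is intended for exactly this; if I recall the syntax correctly, it takes a semicolon-separated list of extensions to store rather than compress (the list below is illustrative):

        rar a -msjpg;zip;rar;7z;mp3;avi archive.rar myfolder\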

  • Why would anacron not be running?

    - by Rory
    I have an Ubuntu system that has anacron installed, but I'm pretty sure it's not running. It's not running the commands in /etc/cron.daily to rotate the syslog files (I'm using sysklogd, which has its own log-rotation method and doesn't use logrotate). The last time the logs were rotated was in October 2009. /var/spool/anacron/cron.daily exists and its contents are 20091015. AFAIR we had a power outage then and everything rebooted. How can I debug anacron? How can I see why it's not running? My first instinct is to look for /var/log/anacron, but that's not there. How can I fix it to make it run again?
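
    One way to start debugging (a suggestion, not from the post) is to run anacron by hand in the foreground, forcing all jobs so you can watch the output directly:

        sudo anacron -d -f -n    # -d: stay in the foreground, -f: force jobs, -n: run them now
        cat /etc/anacrontab      # confirm the cron.daily job is actually listed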

  • 3 Monitor PCI-e Graphics card on Linux (without tremendous pain)?

    - by N Rahl
    As we are all painfully aware, the only way to get multiple monitors AND compositing (Compiz) on Linux is to use a single graphics card that can drive both (or, in my case, all three) screens. I bought a Radeon 5750 specifically because it claims to be able to drive 3 monitors. I can plug in 3 monitors (2 DVI, 1 HDMI) and the Catalyst Control Center shows all 3, but only 2 can be enabled at a time. The exact message is:

        The current settings cannot be applied. Possible issues may include:
        - Display(s) cannot be enabled.
        - Setting(s) cannot be applied due to insufficient video memory.

    So I'm going to assume that either the 5750 doesn't support 3 monitors, OR, more likely, ATI couldn't be bothered to add that support to their Linux drivers. So this is a multipart question: First, can anyone suggest a PCI Express graphics card that can run 3 screens on Linux without tremendous pain? I'm looking for something where you install the driver and all three screens "just work". Does such a card exist? Second, if you have a 5750, have you been able to get it to do 3 monitors? I'm running Ubuntu 10.04 at the moment.
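
    One likely cause, offered as an educated guess rather than taken from the post: on the HD 5xxx series the third display generally has to hang off the DisplayPort connector (or an active DisplayPort adapter), because the DVI/HDMI outputs share only two clock generators. A quick way to check what the driver sees (output names vary by driver):

        xrandr --query
        xrandr --output DVI-0 --auto --output DVI-1 --auto --right-of DVI-0 \
               --output DisplayPort-0 --auto --right-of DVI-1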

  • Google Drive and sync?

    - by Royi Namir
    Two questions, please: (1) Where are Google Drive offline docs stored on my computer? Which folder? (When accessing docs.google.com/offline.) (2) I have a text file in my Google Drive. When I click on it, I can view it only (no edit). The only option to edit it is to export it to Google Docs, but then I have 2 files: the original text file and the editable one. So now I have to sync both the regular file AND the version created by Google Docs. Is that the normal behavior?

  • Keep Uploaded Files in Sync Across Multiple Servers - LAMP

    - by Dfranc3373
    I have a website that currently utilizes 2 servers, an application server and a database server. However, the load on the application server is increasing, so we are going to add a second application server. The problem I have is that the website has users upload files to the server. How do I get the uploaded files onto both of the servers? I do not want to store images directly in a database, as our application is database-intensive already. Is there a way to sync the servers with each other, or is there something else I can do? Any help would be appreciated. Thanks
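
    A minimal one-way sketch with rsync (host name and paths are placeholders). Note that if uploads can land on either application server, a one-way push will lose files; shared storage (NFS, GlusterFS) or a two-way tool such as unison is the safer design:

        # on app1, e.g. from cron every minute
        rsync -az /var/www/uploads/ deploy@app2.example.com:/var/www/uploads/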

  • HAProxy, health checking multiple servers with different host names

    - by Marco Bettiolo
    I need to load balance between multiple running servers with different host names, and I cannot set up the same virtual host on each one. Is it possible to have only one listen configuration with multiple servers and make the health checks apply the http-send-name-header Host directive? I am using HAProxy 1.5. I came up with the working haproxy.cfg below; as you can see, I had to set a different hostname for each health check, because the health check ignores http-send-name-header Host. I would have preferred to use variables or some other method and keep things more concise.

        global
            log 127.0.0.1 local0 notice
            maxconn 2000
            user haproxy
            group haproxy

        defaults
            log global
            mode http
            option httplog
            option dontlognull
            retries 3
            option redispatch
            timeout connect 5000
            timeout client 10000
            timeout server 10000
            stats enable
            stats uri /haproxy?stats
            stats refresh 5s
            balance roundrobin
            option httpclose

        listen inbound :80
            option httpchk HEAD / HTTP/1.1\r\n
            server instance1 127.0.0.101 check inter 3000 fall 1 rise 1
            server instance2 127.0.0.102 check inter 3000 fall 1 rise 1

        listen instance1 127.0.0.101:80
            option forwardfor
            http-send-name-header Host
            option httpchk HEAD / HTTP/1.1\r\nHost:\ www.example.com
            server www.example.com www.example.com:80 check inter 5000 fall 3 rise 2

        listen instance2 127.0.0.102:80
            option forwardfor
            http-send-name-header Host
            option httpchk HEAD / HTTP/1.1\r\nHost:\ www.bing.com
            server www.bing.com www.bing.com:80 check inter 5000 fall 3 rise 2

  • How to handle files that don't need version control in Mercurial

    - by richardh
    I am new to Mercurial, and for the most part do LaTeX reports and statistical calculations in R using .csv and/or .sqlite files. Re LaTeX, all I really care about is the .tex file. Re R, I don't need version control on the .csv or .sqlite files because they are static. When I do 'hg add' for a repo with a .csv and/or .sqlite file, I get a warning like:

        rev2.sqlite: up to 3070 MB of RAM may be required to manage this file
        (use 'hg revert rev2.sqlite' to cancel pending addition)

    So I revert and subsequently use adds like hg add -X *.sqlite. I guess I really have two questions: (1) Should I ignore these warnings? Because these large files are static, can I just add them to the repo, knowing that the diffs will always be empty, and not worry about wasted resources? (2) If I should keep excluding these files from the repo, is there a way to make this the default? I.e., add something to my .hgrc file that always appends options like -I *.tex -I *.R to my 'hg add' commands? Thanks!
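
    Assuming the goal is simply that a bare 'hg add' skips these files, a .hgignore file at the repository root should do it (a minimal sketch):

        syntax: glob
        *.csv
        *.sqlite

    Ignored files are skipped by 'hg add' with no arguments, but can still be added explicitly by name if that is ever needed.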

  • How to make VhdResizer work on XP Mode VHD files

    - by A_M
    I'm trying to shrink a Windows 7 XP Mode VHD file, with little success. I've been trying to use VhdResizer. When I select my VHD file, it says "VhdExpand only supports fixed and dynamic VHD files", yet my XP Mode VHDs are dynamic files. Does anyone have any idea why it is failing? Failing that, does anyone have a process I can use to shrink my XP Mode VHDs? Thanks.
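
    A guess at the cause, not taken from the post: XP Mode machines normally run on a differencing VHD whose parent is the shared "Windows XP Mode base.vhd", and a differencing disk is neither fixed nor dynamic, which would explain the resizer's complaint. With the VM shut down, and working on copies of both files, Windows 7's diskpart can merge the child into its parent, yielding an ordinary VHD the resizer should accept (the path is a placeholder):

        diskpart
        DISKPART> select vdisk file="C:\VMs\Windows XP Mode.vhd"
        DISKPART> merge vdisk depth=1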

  • What is the most reliable way to copy Access front end files to client PCs

    - by Funky Si
    I have several in-house databases which have Access 2003 front ends, either .adp or .ade files. I need to copy these from my server to every client machine. In the past I have used rollout scripts to copy the files to the All Users desktop folder. I have since adapted this to also copy files to the Public desktop folder, since we started having Windows 7 client machines as well as XP. The problem is that some of the time these scripts don't work on Windows 7. Is there a better way of copying these files to a mix of Windows 7 and XP clients, or are rollout scripts the best way?
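
    A rough sketch of a rollout script that picks the right desktop folder by probing the environment (%PUBLIC% exists on Vista/7 but not on XP, where the shared desktop lives under %ALLUSERSPROFILE%); the server path and filename are placeholders:

        @echo off
        set "DEST=%PUBLIC%\Desktop"
        if not defined PUBLIC set "DEST=%ALLUSERSPROFILE%\Desktop"
        copy /y "\\server\apps\frontend.ade" "%DEST%\"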

  • Linux: 3 Monitor PCI-e Graphics card (without tremendous pain)?

    - by N Rahl
    As we are all painfully aware, the only way to get multiple monitors AND compositing (Compiz) on Linux is to use a single graphics card that can drive both (or, in my case, all three) screens. I bought a Radeon 5750 specifically because it claims to be able to drive 3 monitors. I can plug in 3 monitors (2 DVI, 1 HDMI) and the Catalyst Control Center shows all 3, but only 2 can be enabled at a time. I'll post the exact error message here soon, but it's not very helpful. So I'm going to assume that either the 5750 doesn't support 3 monitors, OR, more likely, ATI couldn't be bothered to add that support to their Linux drivers. So this is a multipart question: First, can anyone suggest a PCI Express graphics card that can run 3 screens on Linux without tremendous pain? I'm looking for something where you install the driver and all three screens "just work". Does such a card exist? Second, if you have a 5750, have you been able to get it to do 3 monitors? I'm running Ubuntu 10.04 at the moment. Thanks, Nick

  • Microsoft Outlook: Export list of currently opened PST files

    - by ultrasawblade
    At my current workplace we are upgrading various users from XP to Windows 7. Frequently the users have anywhere from 10 to 30 or so .pst files open in their installation of Microsoft Outlook 2007, and they are particularly helpless without these files. I know how to view the list of currently opened PST files, and would like to know if there is an easy way to capture that information other than taking screenshots of the Options - Data Files window. Does migwiz.exe transfer this information? Is that the only way? Is there a tool that will let you capture and restore that information? I don't want to export or move the actual .pst files themselves (yes, some of them are on network locations, very terrible, I know), just reopen, in the new installation of Outlook, the ones that were open in the previous installation.
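
    One possible way to capture the list (an untested sketch using the Outlook 2007 object model from PowerShell; Store.FilePath was added in Outlook 2007):

        # Outlook must be installed; lists the path of every open data file
        $outlook = New-Object -ComObject Outlook.Application
        $outlook.Session.Stores | ForEach-Object { $_.FilePath } |
            Out-File "$env:USERPROFILE\Desktop\pst-list.txt"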

  • Robocopy to copy only new folders and files

    - by Valery Shampal
    The task: to find all new files and subfolders under some root folder (let us say Documents) and to copy them to another disk (J: in this case). Command line used:

        robocopy c:\users\valery\documents j:\robocopy /XO /E /MAXAGE:20131030 /XD

    Result: a full folder tree is created, and only new files are copied, as intended. That much is good. The point is that I do not want to create all the subfolders on the target disk if there are no new files in them. The results are acceptable, but it is a lot of work to go through all the folders to find the new files, and to work out which subfolders are new ones. Regards, Valery
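
    The usual fix, assuming standard robocopy behaviour, is to swap /E for /S: /S copies subdirectories but skips empty ones, and a directory whose files are all filtered out by /XO and /MAXAGE counts as empty, so it is never created on the target:

        robocopy c:\users\valery\documents j:\robocopy /XO /S /MAXAGE:20131030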

  • Dropbox picture sync: Skip RAW files?

    - by Steven Lu
    I like the convenience of having Dropbox keep track of my photos, because it tends to work with my devices over 3G (I am often tethering to my phone with my iPad and MacBook) as well as WiFi, but it's a waste of network traffic to sync the raw files from my camera or memory card. They clutter up the Dropbox list, and the files are just huge. Is there a way to configure the Dropbox client so that it ignores a certain file extension for the picture sync? Also, I suspect that if I just go and delete the raw files, the next time I plug in the memory card and tell Dropbox to sync, it will re-download them, which would be terrible. I could switch to iCloud Photo Stream, I suppose, but there would be no access via 3G that way, and I've already got years of experience with Dropbox, so I know it's going to just work. I think any method that works for excluding files from Dropbox sync in general should work here too. Edit: Wow, there are 19k votes for this exact request.

  • Nginx Multiple If Statements Cause Memory Usage to Jump

    - by Justin Kulesza
    We need to block a large number of requests by IP address with nginx. The requests are proxied by a CDN, so we cannot block on the actual client IP address at the connection level (it would be the IP address of the CDN, not the actual client). Instead we have $http_x_forwarded_for, which contains the IP we need to block for a given request. For the same reason we cannot use iptables, as blocking the IP address of the proxied client would have no effect. We need nginx to block the request based on the value of $http_x_forwarded_for. Initially, we tried multiple simple if statements: http://pastie.org/5110910 However, this caused our nginx memory usage to jump considerably: we went from somewhere around a 40MB resident size to over a 200MB resident size. If we changed things up and created one large regex that matched the necessary IP addresses, memory usage was fairly normal: http://pastie.org/5110923 Keep in mind that we're trying to block many more than 3 or 4 IP addresses... more like 50 to 100, which may be included in several (20+) nginx server configuration blocks. Thoughts? Suggestions? I'm interested both in why memory usage would spike so greatly with multiple if blocks, and in whether there are better ways to achieve our goal.
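
    The usual low-memory alternative to a stack of ifs is a single geo block keyed on the header (a sketch, assuming the header carries a single address; the addresses are documentation placeholders):

        geo $http_x_forwarded_for $blocked {
            default          0;
            192.0.2.10/32    1;
            198.51.100.0/24  1;
        }

        server {
            if ($blocked) {
                return 403;
            }
        }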

  • How to download a batch of randomly named files

    - by TheLearner
    I need to download a bunch of files from a website, but I don't want to have to click on each file to add it to DownloadAll or whatever. The structure of the website is as follows: http://something.com/katalog/?get=Exclusive/group1/2012.09.03/ The directory has loads of randomly named files with .doc extensions. I can't use the batch feature because the files don't start or end with the same characters, e.g. 001...100. Any ideas?
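
    Assuming the page links directly to the .doc files, wget can grab them all in one pass (-r recurse, -np don't ascend to the parent, -nd don't recreate the directory tree, -A keep only matching suffixes):

        wget -r -np -nd -A doc "http://something.com/katalog/?get=Exclusive/group1/2012.09.03/"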

  • Looking for a command-line tool to copy files to remote computers (similar to psexec)

    - by hatchetman82
    Hi. I'm looking for a small utility that can copy files to/from remote Windows hosts and that can take the credentials (domain user and password) as part of its command line, similar to psexec. I know I can use net use to map the target directory to a drive letter and then use xcopy, and I know psexec can upload files to be executed on the remote machine and then delete them, but I'm looking for a small utility to distribute files to remote hosts that is not as awkward to use as net use + xcopy.
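
    Short of a dedicated tool, the net use + xcopy dance can at least be hidden behind a small wrapper (an untested sketch; the share and target folder are placeholders):

        @echo off
        rem usage: pushfile.cmd host domain\user password localfile
        net use "\\%1\c$" %3 /user:%2
        xcopy /y %4 "\\%1\c$\temp\"
        net use "\\%1\c$" /delete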

  • Setting up a copy of a site with IIS 7?

    - by SJaguar13
    I have a site running on IIS with a dyndns.org domain that points to the IP of the Windows 2008 machine hosting it. I need a copy of that site for development purposes, so I set up another folder with all the files and created a new site in IIS. I don't really have a domain for it, so I was just going to use the IP address. When I go to localhost, 127.0.0.1, or the internal IP, I get "bad hostname". If I use the IP address on port 80 (the same as the real version of the site), I get 404 Not Found. If I use a different port, so that the two sites aren't on the same IP and port, I get a connection timeout. How do I go about setting this up?
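
    One way out of the IP/port clash (a suggestion, not from the post) is to give the dev copy a made-up host name: add a line such as "127.0.0.1 dev.mysite.local" to C:\Windows\System32\drivers\etc\hosts, then bind the dev site to that host header; if I have the appcmd syntax right, something like:

        %windir%\system32\inetsrv\appcmd set site "DevSite" /+bindings.[protocol='http',bindingInformation='*:80:dev.mysite.local']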

  • Add folder name to beginning of filename - getting multiple renames

    - by Flibble Wibble
    I've used dbenham's excellent response to the question of how to add the folder name to the beginning of a filename in a cmd script:

        @echo off
        pushd "Folder"
        for /d %%D in (*) do (
            for %%F in ("%%~D\*") do (
                for %%P in ("%%F\..") do (
                    ren "%%F" "%%~nxP_%%~nxF"
                )
            )
        )
        popd

    What I'm finding is that, seemingly randomly (though it probably isn't), the script runs through several child folders and renames correctly, but then it gets to a folder where it gets stuck in a loop and starts adding the folder name repeatedly to the files inside. I have 90,000 files in 300 folders to rename this weekend. Any chance you can guess the cause? PS: Is there a maximum number of files that is acceptable in each folder?
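
    A plausible explanation, offered as a guess: the inner plain for enumerates the directory while it is being modified, and when a renamed file sorts after the enumeration's current position it gets encountered, and renamed, again. Taking a snapshot of the listing first sidesteps that; a sketch of the same script using dir /b:

        @echo off
        pushd "Folder"
        for /d %%D in (*) do (
            rem snapshot the file list so renamed files cannot be re-enumerated
            for /f "delims=" %%F in ('dir /b /a-d "%%D"') do (
                ren "%%D\%%F" "%%~nxD_%%F"
            )
        )
        popd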

  • DVD/CD burning .zip: is it more reliable, faster, longer lasting to burn a zip of files rather than the files as a folder?

    - by Rob
    Is it more reliable, faster and longer-lasting to burn to CD/DVD one zip (or a few large zips) of files rather than the files as a folder? I'm wondering whether thousands of small files would be recorded less efficiently than one or a few large zips. Also, even after the burning program verifies the disc, I use Beyond Compare to compare the files with those on the disc. They always binary-compare as identical, but I hear the drive stuttering, presumably as the head shifts slightly each time to seek the next file, which leads me to think it's best to make one or more zips and copy those locally to compare. Or is it that burning individual files to the disc produces something less readable, which causes the head to stutter? There aren't any problems, and my disc burns are reliable; I'm just thinking about efficiency and longevity. The discs burn and verify fast enough on my 18x DVD burner. I'm using ImgBurn mostly, and I've used Nero in the past. I burn whole discs, closed and finalised. I'm not sure which write mode, but I would think Disc At Once from a temporary cached image made by the burning program would be the most reliable.

  • How to move mail accounts when migrating webhosting

    - by pkswatch
    I am migrating my website abc.com from one webhosting company to another in a shared hosting environment; both have cPanel. The second hosting account, the one I am preparing to move to, is a multi-domain hosting account with 3 domains already in it. The problem is that I have many email accounts associated with my website abc.com, which are accessed using webmail. So if I move it to the other host, will I lose all those accounts and their emails? If yes, then how should I synchronise the email accounts so that all the accounts and the contained emails remain intact? I saw several sync tools like IMAP Sync, etc., but these require two hosts while synchronizing, and as you see, I have just one domain name to be synchronized over 2 servers. PS: I do not have SSH access on either of them, and I have made a complete backup of all files using the backup wizard in cPanel.
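
    Note that imapsync's two "hosts" are the two mail servers, not two domains, so a single domain moving between boxes is fine: point --host1 at the old server's IP or hostname and --host2 at the new one's, with the same mailbox on each side (all values below are placeholders):

        imapsync --host1 203.0.113.10 --user1 info@abc.com --password1 'oldpass' \
                 --host2 203.0.113.20 --user2 info@abc.com --password2 'newpass'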

  • git - recover deleted files from a prior commit

    - by Walter White
    I accidentally deleted some files in a prior commit and would like to recover them. How can I do this? I ran this and found exactly what I was looking for:

        git whatchanged --diff-filter=D

    At the time I made the commit, I should have committed only the new/changed files and then run a reset --hard to recover the missing files. I have about 100 files that I need to restore. I don't want to do a straight revert, as that would also undo the changes in that commit. Any ideas?
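
    A per-file recipe that avoids reverting the whole commit (<commit> and the paths are placeholders): find the commit that deleted the file, then check the file out from that commit's parent, which still contains it. The last line restores everything the commit deleted in one go:

        git rev-list -n 1 HEAD -- path/to/file    # the commit that deleted it
        git checkout <commit>^ -- path/to/file    # restore from its parent
        git diff --diff-filter=D --name-only <commit>^ <commit> | xargs git checkout <commit>^ --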

  • Why NetBeans 6.8 remote projects (PHP) upload all files by default

    - by xaguilars
    I wanted to know if there's some option to stop NetBeans from uploading all the files of a recently imported remote (PHP) project. I always check "Upload files on run" in the project configuration, but when I click Run, NetBeans selects all files by default (I modified only some). The file checkboxes cannot be cleared all at once; you have to do it one by one (imagine you have 5000 files...). That's annoying. Do you know any solution? Thank you.

  • TrueCrypt files corrupted after moving PC into another case

    - by Dygerati
    I recently bought a new PC case and transferred all of my PC hardware into it. The only hardware modification was the addition of two identical RAM modules. The entire process went smoothly, and everything worked and booted as before. The only side effect appeared when I accessed one of my file-based hidden TrueCrypt volumes shortly thereafter: some of the files in the volume, NOT all, seem to be entirely corrupted. The directory and file names are garbled characters, but a few of the directories in the same volume appear and function normally. Also, all files in the non-hidden TrueCrypt volume are still intact. Is this not weird? The only other real change I can think of is that the hard drives were connected to different SATA ports on the motherboard. I don't know how TrueCrypt's encryption works well enough to guess what could cause this... and the fact that not all the files were corrupted makes it more bizarre still. So, first off (and I'm not too hopeful on this point), would it be possible to restore these files? I had a backup of most, but not all, of the files involved. Other than that, I'm just curious how this happened and how I can prevent it next time. Thanks!
