Search Results

Search found 9286 results on 372 pages for 'transfer speed'.


  • How to restore a Windows Easy Transfer file from a 64-bit machine to a 32-bit machine?

    - by Kevin Davis
    Using a 32-bit laptop, I saved my settings and files with "Windows Easy Transfer" from the Windows 7 RC, setting the file destination to a Win2K8 R2 machine that happened to be 64-bit. When I reinstalled my laptop and tried to restore my settings from the file I'd saved, I was surprised to get an error: "Windows Easy Transfer can't transfer files from a 64-bit computer to a 32-bit computer." Is there a known workaround? Any ideas on how to unpack the file and get my stuff back?

    Read the article

  • Linux command-prompt FTP to an FTP server running on Windows

    - by Vass
    I am running FileZilla Server on Windows Vista, set up to be reachable from outside IPs, and connecting with the FileZilla client works fine from another machine as well as from an Ubuntu guest running in VirtualBox on the same host. Now I want to try the command prompt. I run ftp xxx.xxx.xx.xx, enter the name and password, and get the ftp command prompt, but commands such as ls and cd are not working properly. cd tells me the current directory is "/", the root, which makes no sense on a Windows operating system. The FileZilla client takes the user directly to the root folder of the filespace granted to that user; can the same be done from the command prompt? It is as if the command prompt puts me in a root that does not exist, or that I lack the permissions to move around in. Is there a way to land in the correct directory directly, especially when the slashes go the wrong way around?
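
    If the login succeeds but ls and cd then misbehave (typically hanging or erroring on the listing), the usual culprit is the data connection: the classic command-line ftp client defaults to active mode, which NAT and firewalls often block, while FileZilla defaults to passive mode. A minimal sketch of the workaround, with the server address as a placeholder:

        $ ftp xxx.xxx.xx.xx
        Name: myuser
        Password:
        ftp> passive            # toggle passive-mode data connections
        Passive mode on.
        ftp> ls                 # the listing should now come through
        ftp> cd somedir

    As for the "/" root: FTP presents the user's configured home directory as a virtual root with forward slashes, regardless of the server's operating system, so that part is expected behavior. The lftp client, which defaults to passive mode, is another option.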

    Read the article

  • How to speed up Apache

    - by Zen_silence
    We have a server with 8 cores, 16 GB of RAM, and RAID 0 SAS 10K drives. Our goal is to use this to serve a fairly simple PHP application quickly. We have tested all the other components, and we think we have narrowed the bottleneck down to Apache. I am no Apache guru; I have done some research and tested a couple of things, but when I launch 100 concurrent connections against the server with JMeter, the first 10-20 come back quickly (30-100 ms) while the rest take between 1,000 and 3,000 ms. Does anyone have ideas on what to change in our Apache config to make this faster? Right now it is a vanilla install of Apache.
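
    That pattern, where the first requests return fast and the rest queue up, usually means the default worker pool is too small, so later connections wait for a free slot. A hedged starting point for a stock prefork-MPM Apache 2.2; the numbers below are illustrative rather than tuned for this hardware, and Apache 2.4 renames MaxClients to MaxRequestWorkers:

        # httpd.conf -- prefork MPM sketch for ~100+ concurrent connections
        <IfModule mpm_prefork_module>
            StartServers          20
            MinSpareServers       20
            MaxSpareServers       50
            ServerLimit          256
            MaxClients           256
            MaxRequestsPerChild 4000
        </IfModule>

        KeepAlive On
        KeepAliveTimeout 2    # short, so idle keep-alives release workers quickly

    With 16 GB of RAM, check the resident size of one Apache+PHP process under load and cap MaxClients so the whole pool fits in memory; for a simple PHP application, an opcode cache such as APC typically helps more than further Apache tuning.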

    Read the article

  • Why can't I get more speed with iperf on Windows XP?

    - by SledgehammerPL
    I am testing my bandwidth and throughput using iperf (jperf) on a desktop PC running Windows XP. I can't get more than 3 Mbit/s to the outside until I change the TCP window size; about 84 KB works well, but I can't force XP to use this value by default. I have tried a great many magic spells in the Registry and used several TCP optimizers, but nothing works. I will accept that everything is OK when I can reboot the PC, run iperf, and see 18.1 Mbit/s, like the Linux box standing right next to my Windows XP box. Is that possible?
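
    On XP the default receive window comes from the TCP/IP stack's global settings, and a window larger than 64 KB only takes effect if RFC 1323 window scaling is enabled. A hedged sketch of the relevant registry values, assuming the ~84 KB window that worked in testing (0x15000 = 86016 bytes); a reboot is required afterwards:

        Windows Registry Editor Version 5.00

        [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters]
        ; enable window scaling and timestamps so windows above 64 KB are honored
        "Tcp1323Opts"=dword:00000003
        ; default and maximum receive window, in bytes
        "TcpWindowSize"=dword:00015000
        "GlobalMaxTcpWindowSize"=dword:00015000

    Applications can still override the window per socket, so an iperf run without an explicit -w option should then pick up the new default.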

    Read the article

  • Criteria strings: how many different criteria can be entered to retrieve specific data?

    - by Janet
    For our membership database we currently use an old DOS program, "Arclist". The program is old, but the one feature we desperately need in a database program is the ability to enter multiple criteria at one time, for a "one-time" extraction of the data meeting all the criteria entered in what I call a "criteria string". One example would be extracting only those records with zip codes matching (67893, 54235, 54323, 54201, 54302, 54303, 54301, 67894, 67895). Another set of criteria might omit records "not equal to" one value in one field while also extracting records whose information matches a requested value in another field.

    Read the article

  • Windows 7 can't copy file - Error 0x800700DF: The file size exceeds the limit allowed and cannot be saved

    - by JJGroover
    Any attempt to copy files larger than about 40 MB from a network share (a SAN running Openfiler/Samba) to my local machine running Windows 7 fails with the following error: "Error 0x800700DF: The file size exceeds the limit allowed and cannot be saved." I've tried copying to my C: drive and to a USB drive with the same result; smaller files copy just fine. Clearly 40 MB is not that big a file, so I'm assuming it is some buggy interaction between Windows 7 and Samba. Google has so far turned up nothing. Can anyone point me in the right direction?
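
    Error 0x800700DF is the signature of the Windows WebClient (WebDAV) service's file-size cap rather than of SMB itself; its FileSizeLimitInBytes value defaults to 50,000,000 bytes, which lines up with copies failing somewhere past the 40 MB mark. If the share is actually being reached over WebDAV, a hedged sketch of the usual fix, run from an elevated prompt:

        REM raise the WebDAV file-size cap to 4 GB, then restart the service
        reg add "HKLM\SYSTEM\CurrentControlSet\Services\WebClient\Parameters" /v FileSizeLimitInBytes /t REG_DWORD /d 0xffffffff /f
        net stop WebClient
        net start WebClient

    If the share really is mounted over plain SMB, this key does not apply and the Samba configuration becomes the next suspect.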

    Read the article

  • Is it possible to download extremely large files intelligently or in parts via SSH from Linux to Windows?

    - by Andrew
    I have a ~35 GB file on a remote Ubuntu Linux server. Locally I am running Windows XP, so I connect to the Linux server over SSH (specifically, with a Windows program called SSH Secure Shell Client, version 3.3.2). Although my broadband connection is quite good, my download of the large file often fails with a Connection Lost error. I suspect the connection drops for a second or two every several hours; since the file is so large, the download takes 4.5 to 5 hours, and the connection may well drop during that window. I think this because I have successfully downloaded files of this size using the same connection and the same SSH software on the same computer; in other words, sometimes I get lucky and the download finishes before the connection drops. Is there any way to download the file intelligently, so that the operating system or software knows where it left off and can resume from that point after a break in the connection? Perhaps it is possible to download the file in sections? I do not know of a convenient way to split the file into multiple files myself; it is binary, not human-readable, so splitting it would be difficult. As it is now, if the entire ~35 GB download doesn't finish before the connection breaks, I have to start over and overwrite the 5-20 GB chunk downloaded so far. Do you have any advice? Thanks.
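
    Resumable transfer over SSH is exactly what rsync's --partial mode provides; on Windows XP it would have to come from Cygwin or cwRsync, which is an assumption about what can be installed locally. A hedged sketch, with the host and paths as placeholders:

        # -P is shorthand for --partial --progress: keep the partial file on failure
        rsync -avP user@remote.example.com:/data/bigfile.bin /cygdrive/c/downloads/

        # after a dropped connection, simply rerun the same command;
        # rsync resumes from the partial file instead of starting over

    OpenSSH's sftp client offers the same idea through its reget command, which resumes a partially downloaded file where it left off.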

    Read the article

  • Replacing a Windows File Server

    - by Keltari
    We have a Windows file server that needs to be replaced. Unfortunately, too many custom-built applications, shares, and random things rely on the existence of that file server to simply stand up a new one, so we need to make the transition as seamless as possible. We decided to copy the contents of the old server to the new one and then give the new one the same name. I have made a list of everything I need to do; am I missing anything?

    1. The new file server is up and running.
    2. I copied the directories of the old file server to the new one.
    3. I set up all the shares/permissions for those directories in advance.
    4. Copy the contents from oldserver to newserver:

        robocopy \\vash\d$\nasshare\ \\vash2\l$\originalvash\nasshare\ /E /ZB /copyall /dcopy:T /FP /x /v /np /mt:8 /eta /log:robocopy.log /tee

    5. Rename and re-IP oldserver to something else (I need it available in case something is missing).
    6. Rename and re-IP newserver to oldserver.

    Obviously, I need to test whether the shares are working properly. Are there any steps I'm missing?
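
    One step that may be missing, under the assumption that both servers are domain-joined: the old computer account has to give up its name before the new server can take it, and clients can cache the old name's address while testing. A hedged sketch using netdom, with server names and credentials as placeholders:

        REM rename the old server out of the way first, then take over its name
        netdom renamecomputer OLDSERVER /newname:OLDSERVER-RETIRED /userd:DOMAIN\admin /passwordd:* /reboot
        netdom renamecomputer NEWSERVER /newname:OLDSERVER /userd:DOMAIN\admin /passwordd:* /reboot

        REM on a test client, flush stale name resolution
        ipconfig /flushdns
        nbtstat -R

    Checking DNS records and service principal names registered to the old name (for example with setspn -L OLDSERVER) would round out the list.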

    Read the article

  • Network Drive Via Ethernet Port for Speed?

    - by Yar
    I have a MacBook with FireWire 400 and USB 2.0, so the only way I can get fast external storage is through the Ethernet port. A really fast FireWire 800 drive on another computer is actually much faster than the built-in drive (according to XBench), so I thought I would go one better and buy an Ethernet-ready drive. I bought a Seagate GoFlex Home Network Storage System, and it seems the only way to get it working is to plug it into a router. Can this drive be used without a router (i.e., connected directly to the computer)? Are there any drives that can be plugged directly into the Ethernet port for fast access? I don't want the drive on my router: I want it on my computer. Ideally I'd need 7200 rpm or faster, too. Update: I just chatted with Seagate, and they said this particular drive will not work that way. Will any others?

    Read the article

  • What hash should be used to ensure file integrity?

    - by Corey Ogburn
    It's no secret that large files offered up for download often come with their MD5 or SHA-1 hash so that you can verify the file's integrity after downloading. Are these still the best algorithms to use for this? Obviously these are very popular hashes that potential downloaders can easily check. Ignoring that factor, which hashes have the best properties for this purpose? For example, bcrypt would be horrible for it: it's designed to be slow. That would suck on the 7.4 GB dual-layer OS ISO you just downloaded, when hashing a 12-letter password can take up to a second with the right parameters.
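
    For integrity checking, as opposed to password storage, a fast cryptographic hash is exactly what you want, and SHA-256 is the common modern choice where MD5 and SHA-1 are considered too weak against deliberate tampering. A minimal sketch of publishing and verifying a checksum, with file names as placeholders:

        # publisher: generate the checksum file alongside the download
        sha256sum big-image.iso > big-image.iso.sha256

        # downloader: verify after the transfer completes
        sha256sum -c big-image.iso.sha256
        # prints "big-image.iso: OK" if the file is intact

    A hash alone only detects corruption; catching malicious substitution also requires obtaining the checksum, or a signature over it, through a channel the attacker cannot modify.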

    Read the article

  • Compressing and copying large files on Windows Server?

    - by Aaron
    I've been having a hard time copying large database backups from the database server to a test box at another site. I'm open to any ideas that would help me get this database moved without resorting to a USB hard drive and the mail. The database server runs Windows Server 2003 R2 Enterprise with 16 GB of RAM and two quad-core 3.0 GHz Xeon X5450s. The files are SQL Server 2005 backup files between 100 GB and 250 GB. The pipe is not the fastest, and SQL Server backup files typically compress down to 10-40% of the original, so it made sense to compress the files first. I've tried a number of methods, including:

    - gzip 1.2.4 (UnxUtils) and 1.3.12 (GnuWin)
    - bzip2 1.0.1 (UnxUtils) and 1.0.5 (Cygwin)
    - WinRAR 3.90
    - 7-Zip 4.65 (7za.exe)

    I've attempted to use the WinRAR and 7-Zip options for splitting into multiple segments. 7za.exe has worked well for me for database backups on another server, which has ~50 GB backups. I've also tried splitting the .BAK file first with various utilities and compressing the resulting segments. No joy with that approach either; no matter which tool I try, it ends up choking on the size of the file. Especially frustrating is that I've transferred files of similar size on Unix boxes without problems using rsync+ssh. Installing an SSH server is not an option for the situation I'm in, unfortunately. For example, this is how 7-Zip dies:

        H:\dbatmp>7za.exe a -t7z -v250m -mx3 h:\dbatmp\zip\db-20100419_1228.7z h:\dbatmp\db-20100419_1228.bak

        7-Zip (A) 4.65  Copyright (c) 1999-2009 Igor Pavlov  2009-02-03
        Scanning

        Creating archive h:\dbatmp\zip\db-20100419_1228.7z

        Compressing  db-20100419_1228.bak

        System error:
        Unspecified error

    Read the article

  • Copy files with filter (XP)

    - by fire
    I have a huge folder (over 6 GB) with multiple sub-folders that I want to copy onto an external hard drive, but to save space I do not want it to copy any PDF, EXE, or ZIP files across. Is there any software that will help me achieve this? I have looked at TeraCopy, but it doesn't seem to have any filter mechanism. I am using Windows XP (sigh). Edit: I have found the xcopy command; will this do it? Can anyone help me with the syntax?
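
    xcopy can do this with its /EXCLUDE switch, which takes a text file of patterns; any file whose full path contains one of the listed strings is skipped, so bare extensions work as patterns. A short sketch, with the paths as placeholders:

        REM build the filter file, one substring per line (avoid trailing spaces)
        echo .pdf>exclude.txt
        echo .exe>>exclude.txt
        echo .zip>>exclude.txt

        REM /E sub-folders incl. empty ones, /I treat target as folder, /H hidden files
        xcopy "C:\hugefolder" "E:\hugefolder" /E /I /H /EXCLUDE:exclude.txt

    One caveat of the substring matching: a folder with, say, ".zip" anywhere in its path would have its entire contents skipped as well.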

    Read the article

  • Rsync and lazy mode?

    - by fabien-barbier
    Since transferring or copying a file while it is being written to sometimes corrupts the transferred copy, can we define a time interval within which rsync checks each file in a given directory for changes? Files that have not changed during that interval would be transferred, while those with changes would be skipped. Can I do that with rsync, or with another tool? Is there a script that adds this functionality to rsync? Thanks.
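
    rsync has no built-in "quiet period" option, but the behavior can be composed: let find select only files whose modification time is older than the interval, then feed that list to rsync. A hedged sketch, assuming a 10-minute quiet period and placeholder paths:

        #!/bin/sh
        # transfer only files that have been untouched for the last 10 minutes
        cd /data/outgoing || exit 1
        find . -type f -mmin +10 > /tmp/stable-files.txt
        rsync -av --files-from=/tmp/stable-files.txt . user@backuphost:/data/incoming/

    A file modified between the find and the rsync can still slip through, so for hard guarantees a tool like lsof can additionally confirm that no process holds a file open before it is shipped.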

    Read the article

  • NTFS write speed really slow (<15 MB/s)

    - by Zulakis
    I got a new Seagate 4 TB hard drive and formatted it with NTFS using parted:

        parted /dev/sda
        > mklabel gpt
        > mkpart pri 1 -1

        mkfs.ntfs /dev/sda1

    When copying files or testing write speed with dd, the maximum write speed I can get is about 12 MB/s, yet the drive should be capable of at least 100 MB/s. top shows high CPU usage for the mount.ntfs process; the system has an AMD dual-core CPU. This is the output of parted /dev/sda unit s print:

        Model: ATA ST4000DM000-1F21 (scsi)
        Disk /dev/sda: 7814037168s
        Sector size (logical/physical): 512B/4096B
        Partition Table: gpt

        Number  Start  End          Size         File system  Name  Flags
         1      2048s  7814035455s  7814033408s               pri

    The kernel in use is 3.5.0-23-generic. The ntfs-3g versions I tried are ntfs-3g 2012.1.15AR.1 (the Ubuntu 12.04 default) and the newest version, ntfs-3g 2013.1.13AR.2. When formatted with ext4 I get good write speeds of about 140 MB/s. How can I fix the write speed?
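
    High CPU in mount.ntfs combined with throughput stuck in the 10-15 MB/s range is the classic symptom of ntfs-3g receiving small write chunks through FUSE; the usual mitigation on kernels of this vintage is the big_writes mount option, which lets the kernel hand ntfs-3g much larger writes. A hedged sketch, with the mount point as a placeholder:

        # remount with larger FUSE write requests
        umount /dev/sda1
        mount -t ntfs-3g -o big_writes,noatime /dev/sda1 /mnt/data

        # or persistently, via /etc/fstab:
        # /dev/sda1  /mnt/data  ntfs-3g  defaults,big_writes,noatime  0  0

    Even with this, ntfs-3g in user space will not match in-kernel ext4; the 140 MB/s ext4 result at least confirms that the hardware and the partition alignment are fine.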

    Read the article

  • Rsync over NFS with QoS: How to view real transfer speed?

    - by Ian Mackinnon
    We have a bandwidth limit between a Linux server and a NAS, created using 'tc' with an IP filter. When writing to an NFS mount of the NAS, rsync claims a very high transfer speed for each file and then waits a long time before acknowledging that everything has finished. The total time taken is consistent with the QoS limit and with the time the same transfer takes over FTP. Why does the write to the NFS mount report higher transfer speeds than are actually happening on the network, and how can I monitor the actual bandwidth of the transfer?
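
    The inflated figure comes from buffering: rsync measures how fast its writes enter the client's page cache, and an NFS mount acknowledges those writes long before the data has crossed the wire, so the real rate only shows up as the long flush at the end. To see the actual traffic, measure at the network interface rather than inside rsync; a hedged sketch, with the interface name and NAS address as placeholders:

        # live per-connection bandwidth toward the NAS
        iftop -i eth0 -f 'host 192.168.1.50'

        # or simple per-interface totals
        nload eth0

    Mounting the NFS share with the sync option would make rsync's own numbers track the wire speed more closely, at some cost in throughput.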

    Read the article

  • Remote file copy util (like rsync) but that will take account of data already copied (in this session)

    - by Rory McCann
    Let's say I have a directory with two files, both identical and quite large (e.g., 2 GB each), and I want to rsync that directory to a remote host. As I understand it (and I could be wrong), rsync calculates checksums of files. Surely if it sees two files with the same checksum it could copy the first file over and then do a local copy on the remote host for the second? That would make it faster, no? On a similar note, doesn't rsync hash all the remote files before copying? If it saw a different file with the same hash as a file to be transferred, it could do a local copy on the remote host. Does rsync support this sort of thing? Is there some way to turn it on? Is there a tool similar to rsync that will do this sort of 'hash-based' local copying?
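
    For what it's worth, stock rsync matches files by path rather than by content hash, so it would send both copies; its checksums operate on blocks within a source/destination file pair, not across the whole tree. The closest built-in approximations, sketched here with placeholder paths, reuse receiver-side data only when rsync can guess a basis file:

        # -y / --fuzzy: when the destination file is missing, look for a
        # similarly named file in the destination directory to use as a basis
        rsync -av --fuzzy /src/ user@remote:/dest/

        # --copy-dest: also consult another receiver-side directory and
        # locally copy files found there at the same relative path
        rsync -av --copy-dest=/dest-previous /src/ user@remote:/dest/

    True content-addressed transfer is the territory of deduplicating backup tools such as bup or borg rather than of stock rsync.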

    Read the article

  • Hibernate speed and hard drive temperature

    - by cometbill
    My computer is taking in the region of four minutes to hibernate with 4 GB of RAM. Also, the top of it is rather hot to rest a hand on, so I ran CrystalDiskInfo, which reports a temperature of 49 degrees C. It's a Western Digital 5400 rpm drive, and it has been in the laptop for most of the 2.75 years I've owned it. The Power On Count is 1219 and the Power On Hours figure is 3940. Booting up is quicker than resuming from hibernation, too. Is there anything I can or should check? Advice is greatly appreciated.

    Read the article

  • Method for transferring large files, for newbies

    - by doug
    One of my friends is now in China and he wants to send me his home-made video files. I have a Linux hosting account with GoDaddy and I've configured an FTP account for him. Unfortunately, he has trouble using the FTP account. Can you recommend a better option? Thank you.

    Read the article

  • Verizon FiOS Speed concerns

    - by Josh K
    I'm working on getting a separate internet connection run out here, and I was looking into the available options. FiOS claims to offer 25 Mbit/s up / 25 Mbit/s down as a maximum rate. Do they publish minimum rates? Is there anything about fiber I should be concerned about: special hardware, special routers, availability?

    Read the article
