Search Results

Search found 70198 results on 2808 pages for 'file transfer'.


  • Measure data transfer rate over TCP using C#

    - by publicENEMY
    I want to measure the current download speed. I'm sending a huge file over TCP. How can I capture the transfer rate every second? If I use IPv4InterfaceStatistics or a similar method, I capture the device transfer rate instead of the file transfer rate: it counts all traffic passing through the network interface, not just the single file I am transferring. How can I capture the transfer rate of the file itself? I'm using C#.
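
    A common approach is to count the bytes you write to the socket yourself and sample that counter once per second, so the measurement covers only your own stream rather than the whole interface. A minimal sketch of the sending side (the endpoint and file path are placeholder assumptions):

        using System;
        using System.Diagnostics;
        using System.IO;
        using System.Net.Sockets;

        class TransferRateSample
        {
            static void Main()
            {
                using (var client = new TcpClient("peer-host", 9000))   // placeholder endpoint
                using (NetworkStream net = client.GetStream())
                using (FileStream file = File.OpenRead(@"C:\big.file")) // placeholder path
                {
                    var buffer = new byte[81920];
                    long bytesThisSecond = 0;
                    var watch = Stopwatch.StartNew();
                    int read;
                    while ((read = file.Read(buffer, 0, buffer.Length)) > 0)
                    {
                        net.Write(buffer, 0, read);
                        bytesThisSecond += read;
                        if (watch.ElapsedMilliseconds >= 1000)  // report once per second
                        {
                            Console.WriteLine("{0:F2} KB/s",
                                bytesThisSecond / watch.Elapsed.TotalSeconds / 1024);
                            bytesThisSecond = 0;
                            watch.Restart();
                        }
                    }
                }
            }
        }

    Note this measures the rate at which data is handed to the TCP stack, not confirmed end-to-end delivery; the receiving side can produce its own figure by counting the bytes it reads from the stream in the same way.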



  • How to transfer data between two networks efficiently

    - by Tono Nam
    I would like to transfer files between two places over the internet. Right now I have a VPN and I am able to browse, download, and transfer files, so my question is not really how to transfer the files; instead, I would like to use the most efficient approach, because the two places constantly share a lot of data. The reason I want to get rid of the VPN is that it is too slow. Having a high upload speed is very expensive/impossible at residential places, so I would like to use a different approach. I was thinking about using programs such as http://www.dropbox.com . The problem with Dropbox is that the free version comes with only 2 GB of storage. I think the deals they offer are OK, and I might be willing to pay to get that increase in speed, but I am concerned with the speed of transferring data: Dropbox will upload the file to their server and then send it from the server to the other location. I would like it to be even faster. Anyway, I was thinking: why not create a program myself? This is the algorithm I had in mind; let me know if it sounds too crazy. (Remember, my goal is to transfer files as fast as possible.) Things I will use in this algorithm: a server on the internet called S (it has fast download and upload speeds; I pay to host a website and some services there and want to take advantage of it); client A at location 1; client B at location 2. So let's say at location 1, 20 large files are created and need to be transferred to location 2. Client A compresses the files with the highest compression ratio possible. Client A starts sending data via UDP to client B. Because I am using UDP, I will include a sequence number in each packet. Server S helps speed things up: for example, every time a packet is lost, server S informs client A that it needs to resend that packet. Anyway, I think this approach will increase the transfer rate. I do not know if it is possible to start sending data while it is being compressed, or if it is possible to start decompressing data before the whole file has been received. Maybe it would be faster to start sending the files right away without compressing. If I knew I would always be sending large text files, I would obviously use compression, but I need this as a general algorithm. So I guess my questions are: could I increase performance by using UDP instead of TCP and using an extra server to keep track of lost packets? And how should I compress files before sending? Compressing a 1 GB file with the highest compression ratio takes about an hour! I would like to take advantage of that time by sending the file as it is being compressed.
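
    On the compress-while-sending question: yes, both ends can be streamed. Wrapping the outgoing connection in a compressing stream lets compression, transmission, and decompression overlap instead of running as separate phases, which sidesteps the hour-long pre-compression wait. A minimal sketch in C# over TCP (host, port, and path are placeholder assumptions; the UDP-plus-server-S scheme would need its own reliability layer, which TCP already provides):

        using System.IO;
        using System.IO.Compression;
        using System.Net.Sockets;

        class StreamingSender
        {
            static void Main()
            {
                using (var client = new TcpClient("peer-host", 9000))        // placeholder endpoint
                using (NetworkStream net = client.GetStream())
                using (var gzip = new GZipStream(net, CompressionMode.Compress))
                using (FileStream file = File.OpenRead(@"C:\data\big.file")) // placeholder path
                {
                    file.CopyTo(gzip); // bytes are compressed and sent as they are read
                }
                // The receiver mirrors this: wrap its NetworkStream in a
                // GZipStream(CompressionMode.Decompress) and copy to a FileStream,
                // decompressing while data is still arriving.
            }
        }

    Gzip's ratio is lower than an archiver's maximum setting, but because the stages overlap, the wall-clock time is usually far better than compress-then-send for a one-off transfer.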


  • FireWire 800 data transfer too slow on MacBook Pro Unibody and Win7 RC

    - by dtmunir
    I'm trying to back up some data to an external hard drive and am finding the transfer rate unbearably slow. My environment is as follows: MacBook Pro Unibody (late 2008); Windows 7 RC, 64-bit; LaCie Rugged 500GB portable hard drive. I have tried a number of methods, including simple copying in Explorer, TeraCopy, CrashPlan, and Windows Backup. I am averaging around 1MB/s, which seems terribly slow. How do I identify the cause of this slow file transfer, and how do I go about addressing it?


  • robocopy transfer file and not folder

    - by Bill McKay
    I'm trying to use robocopy to transfer a single file from one location to another, but robocopy seems to think I'm always specifying a folder. Here is an example:

        robocopy "c:\transfer_this.txt" "z:\transferred.txt"

    But I get this error instead:

        2009/08/11 15:21:57 ERROR 123 (0x0000007B) Accessing Source Directory c:\transfer_this.txt\

    (note the '\' at the end of transfer_this.txt). But if I treat it like an entire folder:

        robocopy "c:\folder" "z:\folder"

    it works, but then I have to transfer everything in the folder. How can I transfer only a single file with robocopy?
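
    For reference, robocopy's positional arguments are source directory, destination directory, and then one or more file names or wildcards, so a single file is selected by the third argument rather than by the source path (robocopy cannot rename the file as part of the copy). A sketch matching the example above:

        robocopy c:\ z:\ transfer_this.txt

    The file arrives as z:\transfer_this.txt; renaming it to transferred.txt would be a separate step.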


  • Slow Transfer Speeds from KVM host to client

    - by indian maiden
    I am trying to isolate the root cause of slow transfer speeds from my host OS to a KVM client. Both run Linux. Rsync on the host (192.168.1.72):

        rsync -auv --progress rut3.img /tmp/                [54.09MB/s]

    Rsync from the host to the client:

        rsync -auv --progress rut3.img 192.168.1.80:/tmp/   [25.52MB/s]

    I realize there will be some TCP overhead on the transfer, but over 50%? Can someone enlighten me on what could be slowing the transfers down so much?
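
    Two things commonly checked in this situation, offered here as assumptions to verify rather than a diagnosis: rsync to 192.168.1.80 runs over SSH, so encryption costs CPU on top of TCP; and a guest using a fully emulated NIC (e.g. rtl8139 or e1000) instead of the paravirtualized virtio model can lose a large share of its throughput to emulation. If the guest is managed by libvirt, the interface stanza selecting virtio looks roughly like this (the bridge name is a placeholder):

        <interface type='bridge'>
          <source bridge='br0'/>
          <model type='virtio'/>
        </interface>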


  • rsync doesn't use delta transfer on first run

    - by ockzon
    I'm trying to synchronize a large local directory (with a batch file using rsync 3.0.7 on Cygwin, Windows 7 x64; 30k files, 200GB) to a remote server (Debian x64 with kernel 2.6, rsyncd 3.0.7) over a slow internet connection (90kbyte/s upload). I know almost all the files are identical, and I verified that using md5sum locally and remotely. However, when executing rsync from my local machine, every file gets transferred completely the first time. When I terminate the batch file after a few transfers and run it again, the already-transferred files are skipped. But as soon as it gets to a file not yet transferred, it uploads the file as a whole again instead of noticing that the checksum is the same locally and remotely. The batch file calling rsync looks like this (backslashes and line breaks added here for readability):

        c:\cygwin\bin\rsync.exe --verbose --human-readable --progress --stats \
            --recursive --ignore-times --password-file pwd.txt \
            /cygdrive/d/ftp/data/ \
            rsync://[email protected]:33400/data/ | \
        c:\cygwin\bin\tee.exe --append rsync.log

    I experimented with the following parameters in varying combinations, but that didn't help either: --checksum, --partial, --partial-dir=/tmp/.rsync-partial, --compress.


  • How to Customize the File Open/Save Dialog Box in Windows

    - by Lori Kaufman
    Generally, there are two kinds of Open/Save dialog boxes in Windows. One kind looks like Windows Explorer, with the tree on the left containing Favorites, Libraries, Computer, etc. The other kind contains a vertical toolbar called the Places Bar. The Windows Explorer-style Open/Save dialog box can be customized by adding your own folders to the Favorites list. You can then click the arrows to the left of the main items (except Favorites) to collapse them, leaving only the list of default and custom Favorites. The Places Bar is located along the left side of the File Open/Save dialog box and contains buttons providing access to frequently used folders. The default buttons on the Places Bar are links to Recent Places, Desktop, Libraries, Computer, and Network. However, you can change these links to point to custom folders of your choice. We will show you how to customize the Places Bar using the registry, and using a free tool in case you are not comfortable making changes in the registry.
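
    As an illustration of the registry route: as far as I recall, the Places Bar entries for the common dialog live under a Placesbar key in HKEY_CURRENT_USER, with values Place0 through Place4 naming the folders. A sketch of a .reg file under that assumption (the folder paths are placeholders, and the key affects the Places Bar-style dialog only):

        Windows Registry Editor Version 5.00

        [HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Policies\comdlg32\Placesbar]
        "Place0"="C:\\Users\\Public\\Documents"
        "Place1"="D:\\Projects"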


  • What happens when my domain provider cancels order after domain transfer?

    - by Saifur Rahman Mohsin
    I purchased two domains, say xyz.in and abc.com, in October 2012, and I got emails saying they would expire in October 2013. I called my local domain provider and told him I'd like to transfer the domains from Webiq to GoDaddy, to which he said I cannot unless the domains are active. He asked me to pay for both domains and renew them, after which I could transfer them via the domain panel. When I went to the domain panel I noticed the order had been placed, so I made the transfer, which completed successfully. Just as he mentioned, the one-year validity period for each domain carried over to GoDaddy as well. Additionally, I added one year to both domains via GoDaddy, and GoDaddy also provided an extra free year for each since I paid for the transfer on 11/10/2013 at 9:18 PM MST, so both were stated to be valid until 2016, and that's what a whois lookup showed as well. But now it suddenly shows that my domains expire a year earlier (whois now shows 2015). This is confusing, as I have no idea whom to blame for the missing year. I'm wondering what would have happened if the client order my old domain provider registered the domains under had been cancelled. Since the domains were no longer under their control, would they still be able to deduct that one year? When I submitted a support request to Webiq they replied: "Your domain abc.com has been transferred away from us on 17-11-2013 and the domain xyz.in was transferred away from us on 18-01-2014. There are no order cancellation actions placed. If you have any billing related issues kindly contact your parent reseller." I need some guidance on what might have happened, or on understanding how this renewal accounting works!


  • File transfer from MP3 player to computer

    - by JP
    I own an old 20GB Creative Zen jukebox (all its external screws are gone, LOL) and I would like to transfer files back to my PC to put them on my new iPod. The problem is, no matter what software I use (WinAMP, Creative Media Source, or Windows Media Player), it just stops transferring files to my HD with an error message saying there is no more space in the destination folder. The thing is, there is still 320GB free. I have tried a lot of things: installing a newer driver, the latest Zen plug-in for Media Source, the latest WinAMP version. Sometimes it just works, and then it stops working again and I get this nonsense error. Restarting my PC sometimes solves the issue by giving me enough time to transfer 10 or 15 more files, and then I get the error again and again. Yesterday, though, I managed to transfer up to 3GB of MP3s to my computer before getting the error. It seems like I'm having a driver issue or weird behavior from the player and/or the software I'm using; three different programs shouldn't all reproduce exactly the same issue by themselves, so it must be something related to the driver. I can't find any post about such an issue on old forums. Any idea?


  • Opening a file opens the folder the file is in, not the file itself

    - by Pepe Lebuntu
    Whenever I try to open a file (such as an .odt or .doc) from, say, the Dash or the Firefox downloads list, Ubuntu 11.10 opens Nautilus at the folder where the file is, rather than launching the application and loading the file straight away. In previous releases, when I clicked a downloaded file, it went straight to LibreOffice, and that was fine. This adds a superfluous step to the process. How do I associate the correct applications with these extensions?
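
    In case it helps: associations can usually be set either from the file manager (right-click the file, Properties, Open With, choose LibreOffice Writer, then set it as default) or from a terminal with xdg-mime. A sketch for .odt files (the .desktop file name is an assumption that can differ by release):

        xdg-mime default libreoffice-writer.desktop application/vnd.oasis.opendocument.text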


  • Cannot move file: "file cannot be found" (Windows XP)

    - by Steve
    I have some CR2 files in a subfolder of My Documents called My Photos on a Windows XP PC. I want to move them across a WiFi network to an external HDD attached to a Windows 7 PC. I have read/write permissions on the external HDD, as I mapped the drive using the Windows 7 user account. When I try to move a single CR2 file, I receive: "Cannot copy IMG_3317: Cannot find the specified file. Make sure you specify the correct path and file name." If I refresh the source folder, the file is still there. It is not read-only, and I have read/write access to it. I can view its properties. Why can't I move this file? I have been able to move similar files in the past.


  • Extremely slow transfer speed Ubuntu -> Windows

    - by Hailwood
    I have two laptops: one running Ubuntu 12.04 (ext4), the other running Windows 7 (NTFS). I am copying over 40GB of data (one file) from the Ubuntu laptop to the Windows laptop (browsing the shared folder on Ubuntu from Windows and using copy/paste). But I am getting transfer speeds topping out at ~700kb/s. Surely this is not right. I am transferring via WiFi on both laptops. My download speeds can reach 7-8MB/s on both laptops, so I know it is not the WiFi cards or the router topping out.

        wlan0     Link encap:Ethernet  HWaddr 84:4b:f5:db:b4:85
                  inet addr:192.168.1.66  Bcast:192.168.1.255  Mask:255.255.255.0
                  inet6 addr: fe80::864b:f5ff:fedb:b485/64 Scope:Link
                  UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
                  RX packets:11941185 errors:0 dropped:0 overruns:0 frame:0
                  TX packets:11306693 errors:0 dropped:0 overruns:0 carrier:0
                  collisions:0 txqueuelen:1000
                  RX bytes:10087111370 (10.0 GB)  TX bytes:7843524888 (7.8 GB)


  • Is there a way to use Windows Easy Transfer on Windows Server 2008?

    - by CJM
    At work, I'd been experimenting with using Windows Server 2008 as a desktop machine. I'm a software developer, so some of the server software was particularly appropriate, and back in the day there was a suggestion that Server 2008 would be faster than Vista (mainly because of less bloat). I now want to move to a new Windows 7 workstation. Not only does Server 2008 not have Windows Easy Transfer, but I can't attack the problem from the Windows 7 end either: when I try to run the migration wizard, it claims that the software 'isn't compatible with this version of Windows'. I'd bet that it would work fine, if only it weren't for the arbitrary version check... Is there any way to coax this software into working? If not, are there any good alternatives to Windows Easy Transfer? I don't fancy having to copy application settings etc. across manually...


  • Linux to Linux, 10TB transfer?

    - by lostincode
    I've looked at all the previous similar questions, but the answers seemed to be all over the place, and no one was moving a lot of data (100GB != 10TB). I've got about 10TB that I need to move from one RAID to another, over gigabit Ethernet, with XFS file systems on both ends. My biggest concern is having the transfer die midway and not being able to resume easily. Speed would be nice, but ensuring the transfer completes is much more important. Normally I'd just tar & netcat, but the RAID I'm moving from has been super flaky of late and I need to be able to recover and resume if it drops mid-process. Should I be looking at rsync?
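
    rsync is a reasonable fit here precisely because an interrupted run can simply be restarted and will skip whatever already arrived intact. A sketch of an invocation under those assumptions (paths and host are placeholders):

        rsync -a --partial --progress /mnt/old-raid/ user@newhost:/mnt/new-raid/

    Here -a preserves ownership, permissions, and times, --partial keeps partially transferred files so a resumed run can pick them up, and re-running the identical command after a failure continues where the previous run stopped.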


  • Transfer nearly an entire file system to a fresh install as smoothly as possible

    - by Xander
    I've got a friend who needs his computer working in just a few hours. His files are safe; however, he managed to corrupt his main install of Windows 7. My plan is to go in with a Linux disc, copy his C:\ to a backup drive I've got, and then reinstall. Restoring many of his files will be pretty simple (documents and such); however, applications won't transfer as easily. Is there any easy way to transfer applications such as MS Office (which he needs in the morning) or other commercial software packages without the hassle of locating the keys and reinstalling them completely? I don't think just moving them over will work, because I'm sure much of that is stored in the registry (validation and such). Anyway, quick responses would be super nice! Additional help would be great too!


  • Slow WLAN file transfer between server and tablet

    - by user266985
    My file server is running Ubuntu 12.04 and shares files over Samba. It is connected via gigabit Ethernet. My desktop, running Windows 8.1, is also connected via gigabit Ethernet. I can transfer files between the two and completely saturate that gigabit pipe. However, I just got a Surface Pro 2, and I'm trying to stream HD movies from the server to the device over WiFi. For some reason, I can't get much past 1.5MB/s transferring files over the network. I've tried streaming through XBMC and a standard file copy; no difference. To add to the confusion, if I connect to my guest network and then use my VPN server (installed on the router) to access the file server, I get around 3.2MB/s. I've been running diagnostics to find the root cause, and I think I've found it, but I have no idea what is causing it or how to fix it.

        Router: Asus RT-N66U
        Surface Pro 2 network card: Marvell Avastar 350N (driver 19/09/2013, v14.69.24044.150)
        InSSIDer: Link Score: 100; Co-Channels: 0; Overlapping: 0; 5GHz Network Channel: 48+44

    iperf with the file server as server and the Surface Pro 2 as client - TCP performance: acceptable

        ------------------------------------------------------------
        Server listening on TCP port 5001
        TCP window size: 85.3 KByte (default)
        ------------------------------------------------------------
        [  4] local 192.168.0.90 port 5001 connected with 192.168.0.56 port 57367
        [ ID] Interval       Transfer     Bandwidth
        [  4]  0.0- 1.0 sec  10.1 MBytes  84.7 Mbits/sec
        [  4]  1.0- 2.0 sec  10.4 MBytes  87.6 Mbits/sec
        [  4]  2.0- 3.0 sec  10.6 MBytes  88.8 Mbits/sec
        [  4]  3.0- 4.0 sec  10.7 MBytes  89.5 Mbits/sec
        [  4]  4.0- 5.0 sec  10.1 MBytes  84.4 Mbits/sec
        [  4]  5.0- 6.0 sec  10.2 MBytes  85.8 Mbits/sec
        [  4]  6.0- 7.0 sec  7.04 MBytes  59.1 Mbits/sec
        [  4]  7.0- 8.0 sec  10.8 MBytes  90.2 Mbits/sec
        [  4]  8.0- 9.0 sec  10.6 MBytes  89.1 Mbits/sec
        [  4]  9.0-10.0 sec  8.62 MBytes  72.3 Mbits/sec
        [  4]  0.0-10.0 sec  99.2 MBytes  83.1 Mbits/sec

    iperf with the Surface Pro 2 as server and the file server as client - performance: poor

        ------------------------------------------------------------
        Client connecting to 192.168.0.56, TCP port 5001
        TCP window size: 22.9 KByte (default)
        ------------------------------------------------------------
        [  3] local 192.168.0.90 port 40233 connected with 192.168.0.56 port 5001
        [ ID] Interval       Transfer     Bandwidth
        [  3]  0.0- 1.0 sec  1.50 MBytes  12.6 Mbits/sec
        [  3]  1.0- 2.0 sec  1.50 MBytes  12.6 Mbits/sec
        [  3]  2.0- 3.0 sec  1.50 MBytes  12.6 Mbits/sec
        [  3]  3.0- 4.0 sec  1.25 MBytes  10.5 Mbits/sec
        [  3]  4.0- 5.0 sec  1.62 MBytes  13.6 Mbits/sec
        [  3]  5.0- 6.0 sec  1.50 MBytes  12.6 Mbits/sec
        [  3]  6.0- 7.0 sec  1.38 MBytes  11.5 Mbits/sec
        [  3]  7.0- 8.0 sec  1.50 MBytes  12.6 Mbits/sec
        [  3]  8.0- 9.0 sec  1.50 MBytes  12.6 Mbits/sec
        [  3]  9.0-10.0 sec  1.62 MBytes  13.6 Mbits/sec
        [  3]  0.0-10.1 sec  15.0 MBytes  12.4 Mbits/sec

    For some reason it gets capped in one direction, and I haven't got a clue why. Any suggestions?

    Edit: My link speed is reported as 270Mbps by Windows. I'm less than two metres from the router with a clear line of sight.


  • Data transfer is extremely slow after partitioning external USB drive

    - by user125912
    I bought an external USB 3.0 drive with 500GB capacity. The OS is Windows 7. I use it in a USB 2.0 slot, no problem. Initially I used it without making several partitions and it was fast as hell. Then I had the great idea to make partitions: one for programs, one for data, and one for backup. I chose the free EASEUS Partition Master 9.1.1 and ended up with these partitions:

        F: Apps, primary, NTFS, 100GB
        H: Data, logical, NTFS, 250GB
        B: Backup, logical, NTFS, 150GB

    THE PROBLEM: When I copy files from C: to F:, I get a transfer rate of about 100 KB/s! When I copy files from C: to H:, I get a transfer rate of about 4 MB/s! That's all much too slow, slower than before. What can I do to speed things up? Thanks in advance!


  • PHP File Downloading Questions

    - by nsearle
    Hey all! I am currently running into problems with users downloading a file stored on my server. I have code set up to automatically download a file once the user hits the download button. It works for all files, but when the size gets larger than 30 MB it has issues. Is there a limit on user downloads? Also, I have supplied my example code and am wondering if there is a better practice than using the PHP function file_get_contents. Thank you all for the help!

        $path = $_SERVER['DOCUMENT_ROOT'] . '../path/to/file/';
        $filename = 'filename.zip';
        $filesize = filesize($path . $filename);
        @header("Content-type: application/zip");
        @header("Content-Disposition: attachment; filename=$filename");
        @header("Content-Length: $filesize");
        echo file_get_contents($path . $filename);
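
    One likely culprit around the 30 MB mark is PHP's memory_limit, since file_get_contents reads the entire file into memory before echoing it. A sketch of a lower-memory alternative using the same hypothetical $path and $filename (and the same headers as above), streaming the file in chunks instead:

        $handle = fopen($path . $filename, 'rb');
        if ($handle !== false) {
            while (!feof($handle)) {
                echo fread($handle, 8192); // send 8 KB at a time
                flush();                   // push output toward the client
            }
            fclose($handle);
        }

    readfile($path . $filename); achieves much the same thing in one call, also without building the whole file as a string in memory.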


  • Using Multiple File Handles for Single File

    - by Ryan Rosario
    I have an O(n^2) operation that requires me to read line i from a file and then compare line i to every line in the file, repeating for all i. I wrote the following code to do this with two file handles, but it does not yield the result I am looking for. I imagine this is a simple error on my part.

        IN1 = open("myfile.dat","r")
        IN2 = open("myfile.dat","r")
        for line1 in IN1:
            for line2 in IN2:
                print line1.strip(), line2.strip()
        IN1.close()
        IN2.close()

    The result:

        Hello Hello
        Hello World
        Hello This
        Hello is
        Hello an
        Hello Example
        Hello of
        Hello Using
        Hello Two
        Hello File
        Hello Pointers
        Hello to
        Hello Read
        Hello One
        Hello File

    The output should contain 15^2 lines.
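
    What happens is that the inner loop exhausts IN2 during the first pass of the outer loop; a file handle is a single-pass iterator, so every later inner loop sees an already-drained file and yields nothing. One fix is to rewind the second handle before each inner pass, sketched here:

        IN1 = open("myfile.dat", "r")
        IN2 = open("myfile.dat", "r")
        for line1 in IN1:
            IN2.seek(0)  # rewind so every outer line sees the whole file again
            for line2 in IN2:
                print line1.strip(), line2.strip()
        IN1.close()
        IN2.close()

    If the file fits in memory, reading it once into a list (lines = IN1.readlines()) and looping over the list twice avoids the rewind entirely.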


  • Best Practice: Apache File Upload

    - by matnagel
    I am looking for a solution for trusted users to upload PDF files via HTML forms (with maybe PHP involved). This is a fairly standard Ubuntu Linux server with Apache 2.x and PHP 5. I am wondering what the benefits of the Apache file upload module are. There have been no updates for some time; is it actively maintained? What are the advantages over traditional PHP upload with Apache 2 without this module? http://commons.apache.org/fileupload I remember traditional PHP file upload is difficult, with some pitfalls; will the Apache file upload module improve the situation? The solution I am looking for will be part of an existing website and be integrated into the admin web frontend. Things I am not considering are WebDAV, SSH, FTP, FTPS, and FTP over SSH. It should work with a browser and without installing special client software, so I am asking about a browser-based upload without special client-side requirements. I can require a modern browser like Firefox >= 3.5 or a modern WebKit browser like Chrome or Safari from the users.
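
    Worth noting: Apache Commons FileUpload at that URL is a Java servlet library, not an Apache httpd module, so it would not slot directly into a PHP stack. For comparison, the "traditional PHP upload" referred to here is roughly the following pattern; a minimal sketch with an assumed form field name and upload directory (the usual pitfalls are validating size and type and never trusting the client-supplied file name):

        <?php
        // Hypothetical handler for <input type="file" name="pdffile"> in a multipart POST form.
        if (isset($_FILES['pdffile']) && $_FILES['pdffile']['error'] === UPLOAD_ERR_OK) {
            if ($_FILES['pdffile']['size'] <= 10 * 1024 * 1024) {      // 10 MB cap
                $name = basename($_FILES['pdffile']['name']);
                $safe = preg_replace('/[^A-Za-z0-9._-]/', '_', $name); // sanitize the name
                move_uploaded_file($_FILES['pdffile']['tmp_name'],
                                   '/var/www/uploads/' . $safe);       // assumed directory
            }
        }
        ?>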

