Search Results

Search found 12531 results on 502 pages for 'resume download'.

Page 45/502 | < Previous Page | 41 42 43 44 45 46 47 48 49 50 51 52  | Next Page >

  • Downloading movies with BitTorrent

    - by Quintin Par
    I come from a part of the world where the average Hollywood movie is released a year or more later than in the US. For example, Transformers: Revenge of the Fallen will only be released here 5-6 months down the line. The Reader, I think, will not be released at all. Is this a valid justification for me to download movies from the torrent networks and watch them, even if I don't share them? P.S. I don't think my country has all these American-style cyber laws to catch me; it's just the feeling of guilt. Note: SuperUser is not a legal resource.

    Read the article

  • Apache stopping downloads partway through

    - by Ben Smiley
    On my site there are some digital files which can be downloaded through a PHP script. The script works fine for small files, but large files (e.g., 115 MB) cannot be downloaded successfully. The connection dies after around 15 minutes, but it's not consistent: sometimes longer, sometimes shorter. I don't think it's a problem with the script timing out, because the download time isn't consistent. Equally, it doesn't seem like a memory-limit problem, because the amount downloaded varies each time. Does anyone know of any Apache or PHP settings which could cause this kind of problem?
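
    For reference, a minimal sketch of a download script that sidesteps the usual suspects (PHP's execution-time limit and output buffering) by streaming the file in chunks; the file path and chunk size here are placeholders, not taken from the question:

        <?php
        // Sketch: stream a large file in pieces so the transfer is not
        // capped by max_execution_time or memory_limit.
        $path = '/var/files/archive.zip';      // placeholder path
        set_time_limit(0);                     // lift the script time limit
        while (ob_get_level()) {               // drop any output buffers
            ob_end_clean();
        }
        header('Content-Type: application/octet-stream');
        header('Content-Length: ' . filesize($path));
        header('Content-Disposition: attachment; filename="' . basename($path) . '"');
        $fp = fopen($path, 'rb');
        while (!feof($fp)) {
            echo fread($fp, 8192);             // 8 KB at a time
            flush();                           // push each chunk out to Apache
        }
        fclose($fp);

    If a script like this still dies partway through, Apache-side settings such as TimeOut, or an intermediate proxy, become the more likely culprits.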

    Read the article

  • How to make iTunes download all podcast episodes irrespective of listened/not

    - by user15660
    I am a big iPod user (especially for podcasts). I have used an iPod touch, an iPod classic, and an iPod nano. I noticed that iTunes stops downloading new episodes of podcasts that you no longer listen to. Is there a way to force iTunes to download all episodes of my podcasts, irrespective of whether I listen to them or not? I am not interested in just clicking an episode to make iTunes think that I'm listening. I need some kind of programmatic workaround or batch script that runs and updates all podcasts/episodes automatically.

    Read the article

  • Using wget to download pdf files from a site that requires cookies to be set

    - by matt74tm
    I want to access a newspaper site and then download their epaper copies (in PDF). The site requires me to log in using my email address and password, and then it permits me to access those PDF URLs. I'm having trouble 'setting my session' in wget. When I log into the site from my browser, it sets two cookie values:

        [email protected]
        Password=12345

    I tried:

        wget --post-data "[email protected]&Password=12345" http://epaper.abc.com/login.aspx

    However, that just downloaded the login page and saved it locally. The FORM on the login page has two fields, txtUserID and txtPassword, and radio buttons like this:

        <input id="rbtnManchester" type="radio" checked="checked" name="txtpub" value="44">

    Another button:

        <input id="rbtnLondon" type="radio" name="txtpub" value="64">

    If I post this to the login.aspx page, I get the same output:

        wget --post-data "[email protected]&txtPassword=12345&txtpub=44" http://epaper.abc.com/login.aspx

    If I add --save-cookies abc_cookies.txt, the saved file doesn't seem to contain anything other than the default content. For the last command, if I add --debug as well, it says:

        ...
        Set-Cookie: ASP.NET_SessionId=05kphcn4hjmblq45qgnjoe41; path=/; HttpOnly
        ...
        Stored cookie epaper.abc.com -1 (ANY) / <session> <insecure> [expiry none] ASP.NET_SessionId 05kphcn4hjmblq45qgnjoe41
        Length: 107253 (105K) [text/html]
        Saving to: `login.aspx'
        ...
        Saving cookies to abc_cookies.txt.

    However, abc_cookies.txt shows ONLY the following:

        # HTTP cookie file.
        # Generated by Wget on 2011-08-16 08:03:05.
        # Edit at your own risk.

    (Not sure why I'm not getting any responses on SO - perhaps SU is a better forum - http://stackoverflow.com/questions/7064171/using-wget-to-download-pdf-files-from-a-site-that-requires-cookies-to-be-set)

    EDIT 1

        C:\Temp>wget --cookies=on --keep-session-cookies --save-cookies abc_cookies.txt --post-data "txtUserID=abc%40gmail.com&txtPassword=password&txtpub=44&chkbox=checkbox&submit.x=48&submit.y=7" http://epaper.abc.com/login.aspx --debug
        SYSTEM_WGETRC = c:/progra~1/wget/etc/wgetrc
        syswgetrc = C:\Program Files (x86)\GnuWin32/etc/wgetrc
        DEBUG output created by Wget 1.11.4 on Windows-MinGW.
        --2011-08-18 08:15:59--  http://epaper.abc.com/login.aspx
        Resolving epaper.abc.com... seconds 0.00, 999.999.99.99
        Caching epaper.abc.com => 999.999.99.99
        Connecting to epaper.abc.com|999.999.99.99|:80... seconds 0.00, connected.
        Created socket 300.
        Releasing 0x00a2ae80 (new refcount 1).
        ---request begin---
        POST /login.aspx HTTP/1.0
        User-Agent: Wget/1.11.4
        Accept: */*
        Host: epaper.abc.com
        Connection: Keep-Alive
        Content-Type: application/x-www-form-urlencoded
        Content-Length: 100
        ---request end---
        [POST data: txtUserID=abc%40gmail.com&txtPassword=password&txtpub=44&chkbox=checkbox&submit.x=48&submit.y=7]
        HTTP request sent, awaiting response...
        ---response begin---
        HTTP/1.1 200 OK
        Connection: keep-alive
        Date: Thu, 18 Aug 2011 02:46:17 GMT
        Server: Microsoft-IIS/6.0
        X-Powered-By: ASP.NET
        X-AspNet-Version: 2.0.50727
        Set-Cookie: ASP.NET_SessionId=owcrje55yl45kgmhn43gq145; path=/; HttpOnly
        Cache-Control: private
        Content-Type: text/html; charset=utf-8
        Content-Length: 107253
        ---response end---
        200 OK
        Registered socket 300 for persistent reuse.
        Stored cookie epaper.abc.com -1 (ANY) / <session> <insecure> [expiry none] ASP.NET_SessionId owcrje55yl45kgmhn43gq145
        Length: 107253 (105K) [text/html]
        Saving to: `login.aspx.1'
        100%[=====================>] 107,253     24.9K/s   in 4.2s
        2011-08-18 08:16:05 (24.9 KB/s) - `login.aspx.1' saved [107253/107253]
        Saving cookies to abc_cookies.txt.
        Done saving cookies.

        C:\Temp>wget --referer=http://epaper.abc.com/login.aspx --cookies=on --load-cookies abc_cookies.txt --keep-session-cookies --save-cookies abc_cookies.txt http://epaper.abc.com/PagePrint/16_08_2011_001.pdf --debug
        SYSTEM_WGETRC = c:/progra~1/wget/etc/wgetrc
        syswgetrc = C:\Program Files (x86)\GnuWin32/etc/wgetrc
        DEBUG output created by Wget 1.11.4 on Windows-MinGW.
        Stored cookie epaper.abc.com -1 (ANY) / <session> <insecure> [expiry none] ASP.NET_SessionId owcrje55yl45kgmhn43gq145
        --2011-08-18 08:16:12--  http://epaper.abc.com/PagePrint/16_08_2011_001.pdf
        Resolving epaper.abc.com... seconds 0.00, 999.999.99.99
        Caching epaper.abc.com => 999.999.99.99
        Connecting to epaper.abc.com|999.999.99.99|:80... seconds 0.00, connected.
        Created socket 300.
        Releasing 0x00598290 (new refcount 1).
        ---request begin---
        GET /PagePrint/16_08_2011_001.pdf HTTP/1.0
        Referer: http://epaper.abc.com/login.aspx
        User-Agent: Wget/1.11.4
        Accept: */*
        Host: epaper.abc.com
        Connection: Keep-Alive
        Cookie: ASP.NET_SessionId=owcrje55yl45kgmhn43gq145
        ---request end---
        HTTP request sent, awaiting response...
        ---response begin---
        HTTP/1.1 200 OK
        Connection: keep-alive
        Date: Thu, 18 Aug 2011 02:46:30 GMT
        Server: Microsoft-IIS/6.0
        X-Powered-By: ASP.NET
        X-AspNet-Version: 2.0.50727
        content-disposition: attachement; filename=Default_logo.gif
        Cache-Control: private
        Content-Type: image/GIF
        Content-Length: 4568
        ---response end---
        200 OK
        Registered socket 300 for persistent reuse.
        Length: 4568 (4.5K) [image/GIF]
        Saving to: `16_08_2011_001.pdf'
        100%[=====================>] 4,568       7.74K/s   in 0.6s
        2011-08-18 08:16:14 (7.74 KB/s) - `16_08_2011_001.pdf' saved [4568/4568]
        Saving cookies to abc_cookies.txt.
        Done saving cookies.

    Contents of abc_cookies.txt:

        epaper.abc.com  FALSE   /       FALSE   0       ASP.NET_SessionId       owcrje55yl45kgmhn43gq145

    Read the article

  • Redirecting to a different exe for download based on user agent

    - by Ra
    I own a Linux-Apache site where I host exe files for download. Now, when a user clicks this link to my site (published on another site):

        http://mysite.com/downloads/file.exe

    I need to dynamically check their user agent and redirect them to either:

        http://mysite.com/downloads/file-1.exe

    or:

        http://mysite.com/downloads/file-2.exe

    It seems to me that I have two options:

    1. Put in place a .htaccess file stating that .exe files should be handled as scripts, then write a script that checks the user agent and redirects to a real exe placed in another folder, and call this script file.exe.
    2. Use Apache mod_rewrite to point file.exe to redirect.php.

    Which of these is better? Any other considerations? Thanks.
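
    For reference, a sketch of the second option; the rewrite rule assumes .htaccess sits in the site root, and the WOW64 substring test in redirect.php is a hypothetical example condition, not something from the question:

        # .htaccess: route requests for file.exe to the PHP script
        RewriteEngine On
        RewriteRule ^downloads/file\.exe$ /downloads/redirect.php [L]

        <?php
        // redirect.php: pick a build based on the browser's user agent.
        $ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
        $target = (strpos($ua, 'WOW64') !== false)   // placeholder test
            ? '/downloads/file-1.exe'
            : '/downloads/file-2.exe';
        header('Location: ' . $target, true, 302);
        exit;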

    Read the article

  • Downloading videos from a flash website

    - by Mimi
    There are plugins that can capture Flash videos from websites like YouTube and others. There's also the browser cache, which keeps the videos so I can copy them somewhere else and have them stored on my computer. I know of these, but how can I download a video from a website that (I think) is all Flash? The address doesn't change wherever you navigate, and it stays the same when you play a video on the site. No plugin I've tried (RealPlayer, Ant Video Downloader, IDM) has worked with it, and nothing gets cached from that website.

    Read the article

  • IIS7 response size thresholds

    - by DanielM
    I have a customer who is attempting video playback via HTTP progressive download of very large files (> 1 GB). There is no problem once a file is cached at the edge via my CDN, but hits to my origin (first hits, prior to edge-cache population) experience stalling and loss of sync between audio and video about an hour and a half into playback. This occurs pretty reliably at that point, suggesting that some threshold somewhere is getting hit. Are there IIS configuration knobs governing HTTP response size? Other data points: I am unable to replicate this problem. I am looking at client bandwidth and last-mile issues. I am looking at possible encoding-recipe dependencies. But this problem never came up when we were using a "push" cache configuration (CDN-hosted origin), so something funky server-side at my origin seems like a likely culprit. Thanks ...

    Read the article

  • How can I make a specific image stop loading in Firefox?

    - by Gibby
    I don't want to prevent all images from loading or stop loading everything, I just want to stop loading an individual image after I've seen that it's probably not something I care about so I can use that bandwidth for something else. My internet is very slow, but I still like browsing sites like tumblr, imgur, etc. that have lots of images. It seems like GIFs are getting more and more common and they can be several megabytes each... my internet just can't handle it. When I right-click a broken/unloaded image in Firefox, there's an option to reload the image. I essentially want the counterpart to that: to right-click a still-loading image and stop the download. Is this possible? Greasemonkey script, extension, I'll take any method.

    Read the article

  • How to Suppress Repetition of Warnings That an Application Was Downloaded From the Internet on Mac OS X?

    - by Jonathan Leffler
    On Mac OS X, when I run Firefox (and Thunderbird, and ...) which I downloaded from Mozilla, the OS pops up a warning that the file was downloaded over the Internet, giving the date on which it was downloaded. I have no problem with that warning the first time I use a downloaded application - but the repeated warnings are a nuisance. Is there a way to suppress that dialog box? Is there a way to avoid it appearing in the first place? (Some applications I download from a corporate intranet - those don't produce the equivalent warning; any idea what the criteria are for when the warning is generated?)
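
    For what it's worth, the warning is driven by the com.apple.quarantine extended attribute that Mac OS X sets on downloaded files; a sketch of clearing it from Terminal, with the application path as an example:

        # Remove the quarantine attribute from a downloaded app (example path)
        xattr -d com.apple.quarantine /Applications/Firefox.app

    Intranet downloads that skip the warning were presumably saved by a tool that never set the attribute in the first place.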

    Read the article

  • Downloading file from FTP using cURL

    - by Josiah
    I'm trying to use a cURL command to download a file from an FTP server to a local drive on my computer. I've tried:

        curl "ftp://myftpsite" --user name:password -Q "CWD /users/myfolder/" -O "myfile.raw"

    But it returns an error that says:

        curl: Remote file name has no length!
        curl: try 'curl --help' or 'curl --manual' for more information
        curl: (6) Could not resolve host: myfile.raw; No data record of requested type

    I've tried some other methods, but nothing seems to work. Also, I'm not quite sure how to specify which folder I want the file to be downloaded to. How would I do that?
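
    For reference, a sketch of the usual form of this command, putting the remote path directly in the URL and naming the local output file with -o (host, path, and credentials are the question's placeholders):

        curl --user name:password -o myfile.raw "ftp://myftpsite/users/myfolder/myfile.raw"

    -o also controls where the file lands: run the command from the target local folder, or pass a full path such as -o /data/myfile.raw.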

    Read the article

  • Downloads speed starts ok but after a few seconds they go down to 10kbps?

    - by peterg
    Since yesterday, any file I try to download from any host starts downloading OK (at 1 Mbps), but after a few seconds the speed decreases down to 10 kbps. I use a mobile modem (Huawei) and I connect to a WCDMA network. What could be happening? I use Kaspersky, and I checked its Network Monitor, and I don't see any weird process using bandwidth. What other tests can I run? Can you recommend an app to monitor traffic, to see whether I have some malware or where my bandwidth goes?

    Read the article

  • Ubuntu apt-get install (--download-only) executed from another machine on behalf of mine

    - by Maroloccio
    I have a server on a network segment with no direct or indirect access to the Internet. I want to perform an:

        apt-get install <package_name>

    Is there a way to somehow delegate the process of downloading the required files to another machine, by exporting the server configuration so as to satisfy all dependencies, while running:

        apt-get install --download-only <package_name>

    Can, in effect, apt-get install read a configuration from an exported archive rather than from the local package database? Can the list of packages to be downloaded be retrieved, along with an installation script to perform the installation, instead of the actual packages? (A further level of indirection, which would help me schedule this with wget at appropriate times...)
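
    For reference, a sketch of one way to get exactly that list from the offline server itself, using apt-get's --print-uris flag (the package name is a placeholder, and the server's package lists need to be reasonably current for the dependency resolution to be right):

        # On the offline server: compute the full set of .deb URLs, download nothing
        apt-get install --print-uris -qq <package_name> | cut -d"'" -f2 > uris.txt

        # On a machine with Internet access:
        wget -i uris.txt

        # Back on the server, install the fetched packages:
        sudo dpkg -i *.deb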

    Read the article

  • Why is my wireless so slow compared to my wired download speed?

    - by Shawn
    I just used speedtest.net (using Firefox) to compare my wired connection speed with my wireless connection speed. With my current contract (with Videotron), I'm supposed to get:

        Download speed: 8 Mbps
        Upload speed: 1 Mbps

    Here are the results of the speedtest.net test:

    Wired:

        Ping: 14 ms
        Download speed: 8.41 Mbps
        Upload speed: 1.04 Mbps

    Wireless:

        Ping: 16 ms
        Download speed: 0.18 Mbps
        Upload speed: 0.98 Mbps

    The difference in download speeds seems staggering to me, since I did the test 1 meter away from my router. Any clue as to why my wireless download speed is so low compared to my wired download speed? I'm using Ubuntu 11.04 on an Acer Aspire 5536-5519. Oh, and it might be worth mentioning that my girlfriend has no trouble at all with her wireless connection. No slowness at all. (She uses Firefox on Windows 7 on a Dell.) Here are the results for the same test on her system:

        Ping: 22 ms
        Download speed: 8.44 Mbps
        Upload speed: 1.02 Mbps

    Read the article

  • Installing maven on Ubuntu by manual download

    - by WebDevHobo
    To install Maven, I downloaded the latest version from the website and then followed these steps: http://maven.apache.org/download.html#Installation

    The last step, the version check, does not work. It says that 'mvn' is currently not installed and that I should type:

        sudo apt-get install maven2

    If I run the mvn file directly by its full path, it does work:

        root@ubuntu:~# /usr/local/apache-maven/apache-maven-2.2.1/bin/mvn --version
        Apache Maven 2.2.1 (r801777; 2009-08-06 12:16:01-0700)
        Java version: 1.6.0_21
        Java home: /usr/java/jdk1.6.0_21/jre
        Default locale: en_US, platform encoding: UTF-8
        OS name: "linux" version: "2.6.32-25-generic" arch: "i386" Family: "unix"

    So, what am I doing wrong here? Or what would an apt-get install do extra that I might have forgotten?
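
    For reference, a sketch of the step the Maven install instructions expect: putting mvn on the shell's PATH, using the paths already shown in the question:

        # Add to ~/.bashrc (or /etc/profile for all users):
        export M2_HOME=/usr/local/apache-maven/apache-maven-2.2.1
        export PATH=$M2_HOME/bin:$PATH

        # Or symlink the launcher into a directory already on PATH:
        sudo ln -s /usr/local/apache-maven/apache-maven-2.2.1/bin/mvn /usr/local/bin/mvn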

    Read the article

  • rss downloader script

    - by The Digital Ninja
    I have a Synology NAS, powered by Linux, at my house. I'm looking to set up a cron script to check a group of RSS feeds and auto-download new video podcasts to a shared folder. I can do most of the scripting, such as deleting files older than 3 weeks, and the wget parts. But I'm not sure how to parse the RSS feed and check dates to grab only the latest episodes. I figured it's best not to re-invent the wheel, and surely someone out there has a command-line RSS downloader or some such script. Any ideas?
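
    For reference, a sketch of the feed-parsing step, assuming xmllint (from libxml2) is available on the NAS; the feed URL and destination share are placeholders, and the exact spacing of xmllint's attribute output can vary between versions:

        #!/bin/sh
        # Pull enclosure URLs out of an RSS feed and fetch any new ones.
        FEED="http://example.com/podcast.xml"   # placeholder feed
        DEST="/volume1/video/podcasts"          # placeholder share
        wget -qO- "$FEED" \
          | xmllint --xpath '//item/enclosure/@url' - \
          | tr ' ' '\n' \
          | sed -n 's/^url="\(.*\)"$/\1/p' \
          | while read -r url; do
              # -nc skips files already present, so only new episodes arrive
              wget -nc -P "$DEST" "$url"
            done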

    Read the article

  • Download web server structure with empty files

    - by golimar
    I want to make a mirror of a web server, but downloading the actual files will take too long. So I thought of having just the directory and file structure, and when I need the actual contents of a file, I can download just that file. I have tried:

        wget --spider URL

    and in a short time it has created on my local disk the directory structure, with no files. But I've checked all of wget's and curl's switches, and there is nothing like what I need. Can this be done with wget, curl, or any other tool?
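
    For what it's worth, a sketch of one workaround, on the assumption that no tool does this natively: let the spider crawl, scrape the URLs it reports from its log, and create an empty local file for each (example.com is a placeholder host):

        wget --spider -r -np http://example.com/ 2>&1 \
          | grep -oE 'http://example\.com/[^ ]+' | sort -u \
          | while read -r url; do
              path="${url#http://example.com/}"
              case "$path" in
                "")  continue ;;           # the site root itself
                */)  mkdir -p "$path" ;;   # a directory
                *)   mkdir -p "$(dirname "$path")" && touch "$path" ;;
              esac
            done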

    Read the article

  • Trying to find a good filehost [closed]

    - by user67481
    I'm looking for a good filehost that I can use to host downloads linked from my blog (personally created files, no copyright infringement). I've been looking at MediaFire, but I'm not sure what else is out there that would meet my needs. Ideally I want something that has no files-per-day-per-user limits, can host individual files of at least 500 MB each, and has very little hassle for the users who download from it. I'll pay for a 'premium' or whatever level of account if necessary. Any good suggestions? Or will MediaFire be my best bet for this?

    Read the article

  • How to pause and resume a game in XNA using the same key?

    - by user13095
    I'm attempting to implement a really simple game state system; this is my first game - trying to make a Tetris clone. I'd consider myself a novice programmer at best. I've been testing it out by drawing different textures to the screen depending on the current state. The 'Not Playing' state seems to work fine: I press Space and it changes to 'Playing', but when I press 'P' to pause or resume the game, nothing happens. I tried checking current and previous keyboard states, thinking it was happening too fast for me to see, but again nothing seemed to happen. If I change either the pause or the resume key, so they're both different, it works as intended. I'm clearly missing something obvious, or completely lacking some know-how in regards to how Update and/or the keyboard states work. Here's what I have in my Update method at the moment:

        protected override void Update(GameTime gameTime)
        {
            KeyboardState CurrentKeyboardState = Keyboard.GetState();

            // Allows the game to exit
            if (CurrentKeyboardState.IsKeyDown(Keys.Escape))
                this.Exit();

            // TODO: Add your update logic here
            if (CurrentGameState == GameStates.NotPlaying)
            {
                if (CurrentKeyboardState.IsKeyDown(Keys.Space))
                    CurrentGameState = GameStates.Playing;
            }

            if (CurrentGameState == GameStates.Playing)
            {
                if (CurrentKeyboardState.IsKeyDown(Keys.P))
                    CurrentGameState = GameStates.Paused;
            }

            if (CurrentGameState == GameStates.Paused)
            {
                if (CurrentKeyboardState.IsKeyDown(Keys.P))
                    CurrentGameState = GameStates.Playing;
            }

            base.Update(gameTime);
        }
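
    For reference, a minimal sketch of the current/previous-state comparison the asker mentions trying. Note that it also folds the two 'P' checks into a single toggle, so the Paused branch cannot immediately undo the Playing branch within the same frame (previousKeyboardState is an assumed field added to the Game class):

        KeyboardState current = Keyboard.GetState();

        // Toggle only on the frame the key goes down, not while it is held.
        if (current.IsKeyDown(Keys.P) && previousKeyboardState.IsKeyUp(Keys.P))
        {
            if (CurrentGameState == GameStates.Playing)
                CurrentGameState = GameStates.Paused;
            else if (CurrentGameState == GameStates.Paused)
                CurrentGameState = GameStates.Playing;
        }

        previousKeyboardState = current;   // remember this frame's state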

    Read the article

  • Android: Download the Android SDK components for offline install

    - by Tawani
    Is it possible to download the Android SDK components for offline install, without using the SDK Manager? The problem is that I am behind a firewall which I have no control over, and both download URLs seem to be blocked (the SDK Manager throws a connection-refused exception):

        https://dl-ssl.google.com/android/repository/repository.xml
        http://dl-ssl.google.com/android/repository/repository.xml

        Failed to fetch URL http://dl-ssl.google.com/android/repository/repository.xml, reason: Connection refused: connect

    Read the article

  • Where should I download corflags.exe from?

    - by mmiika
    I'm running Windows Server 2008 64-bit "workstation" and would like to get corflags.exe. Which SDK do I need to download? Edit: I know about the .NET Framework 2.0 Software Development Kit (SDK) (x64) and the Windows SDK for Windows Server 2008 and .NET Framework 3.5, but I was hoping to find something smaller, as these are quite large downloads. Also, the note about the 2.0 SDK seems to suggest downloading the 3.5 one; should I follow that?

    Read the article

  • Downloading an Android layout from the Internet

    - by oscarello
    Hi, I would like to ask if there's a way to download an Android layout from the Internet into the "res/layout" folder. I was thinking of getting the file using an HttpUrlConnection and a FileOutputStream, as discussed here: http://stackoverflow.com/questions/576513/android-download-binary-file-problems but I can't figure out how to put it into the "res/layout" folder. Thanks!

    Read the article

  • Upload/download files securely, WinForms C# and ASP.NET

    - by mikeh
    From a WinForms application, I need to upload and download files to an ASP.NET web server using HTTP/HTTPS.

    - Only need to send/receive one file at a time
    - Cannot use FTP; must use HTTP/HTTPS
    - Need a progress bar
    - Upload and download must be username/password authenticated

    Is there an easy way to do this?
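
    For reference, a sketch of one straightforward client-side route using WebClient's async methods, which report progress and accept credentials. The URL, credentials, paths, and the progressBar1 control are placeholders, and the server still needs a page or handler that accepts the upload:

        // C# sketch: authenticated download with progress reporting.
        private void StartDownload()
        {
            var client = new System.Net.WebClient();
            client.Credentials = new System.Net.NetworkCredential("user", "password");
            client.DownloadProgressChanged +=
                (s, e) => progressBar1.Value = e.ProgressPercentage;
            client.DownloadFileCompleted +=
                (s, e) => MessageBox.Show(e.Error == null ? "Done" : e.Error.Message);
            client.DownloadFileAsync(
                new Uri("https://server.example.com/files/report.bin"),
                @"C:\temp\report.bin");
        }

    Uploads work the same way with UploadFileAsync and the UploadProgressChanged event.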

    Read the article

  • How to download attachment file from JSP

    - by Stardust
    I want to know how I can download a file from a JSP page, served as an attachment based on the Content-Disposition header, from a mail server. I want to create a link on the JSP page, and by clicking on that link the user can download the file from the mail server. The link should use the Content-Disposition attachment type. How can I do that in JSP?
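
    For reference, a minimal sketch of the servlet such a link could point to; fetching the bytes from the mail server is elided, and fetchAttachmentFromMailServer is a hypothetical helper, as are the parameter and file names:

        // The JSP link points at this servlet, e.g. <a href="download?id=123">report.pdf</a>
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            String filename = "report.pdf";  // placeholder name
            byte[] data = fetchAttachmentFromMailServer(req.getParameter("id"));  // hypothetical helper
            resp.setContentType("application/octet-stream");
            resp.setHeader("Content-Disposition", "attachment; filename=\"" + filename + "\"");
            resp.setContentLength(data.length);
            resp.getOutputStream().write(data);
        }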

    Read the article

  • HTTP New location after download starts

    - by Flavius
    Hi, I'm using X-Accel-Redirect (so, implicitly, nginx) together with Content-Type and Content-Disposition to serve a file download, and everything works great. What I need to accomplish is redirecting the user to a new location after the download starts. I've tested with both Refresh and Location headers; neither works. Is this possible with HTTP 1.1/nginx?

    Read the article
