Search Results

Search found 16316 results on 653 pages for 'force download'.


  • Large file download for a Rails project

    - by Horace Ho
    A client project goes online in two months. One changed requirement is to support worldwide download of large files (10 to 15 MB per RAW camera file, with 1000 to 5000 downloads expected per day) for their customers. The process will be: there is an upload screen via Paperclip to the Rails local public folder; an hourly task uploads the files to web storage (S3?); the download URL is then updated from the Paperclip URL to the web URL. Questions: is there a gem/plug-in for this purpose? If not, is there a gem/plug-in for S3 you would recommend? Questions about the storage provider: is S3 recommended, or is there another service you would suggest? The baseline: the client's web server does not and will not have the bandwidth to handle the downloads. Thanks

    Read the article

  • Download data in background with iOS4

    - by Sagar
    As per the latest Kindle v2.5 update, it supports "continue downloading books while the app is in the background on iOS 4 devices". How is it possible to download content in the background? As per the iOS multitasking documentation, only audio, VoIP, and location updates are allowed in the background. I've also confirmed that NSURLConnection doesn't keep downloading new data while the app is in the background. So how is it possible in the Kindle app? Edit: I haven't checked the Kindle app on an iOS 4 multitasking-enabled device, so if anyone can let me (and the community) know what exactly the Kindle app does to download, that would be very helpful.

    Read the article

  • Download Specific Images

    - by thebourneid
    I'm trying to search for and download specific images (front and back cover) from a website, but whatever I do I only ever download one of them. What should I change in my code to download both when they are found?

        while ($title_found =~ /'(http:\/\/images.blu-ray.com\/movies\/covers\/\d+_.*?)'/gis) {
            $url = getSite($1);
            if ($title_found =~ /front/) {
                $filename = 'front.jpg';
            } elsif ($title_found =~ /back/) {
                $filename = 'back.jpg';
            }
        }
        my $dir = 'somepath'.$filename;
        open F, ">", $dir;
        binmode F;
        print F $url;
        close F;
        return 0;

    Read the article

  • IE cannot download file with unicode pathname

    - by MM
    I have a web app that allows users to upload and download image files by pressing buttons on a web page. A user of this page is reporting that IE 7 and 8 fail to download files when the files have Unicode pathnames. IE prompts the user with a dialog stating: "Internet Explorer cannot download (file) at (webserver).". Unfortunately I have not been able to reproduce the problem using these versions on my machine. My question is: what could cause this, and how can I prevent it from happening? I have read about problems with cache control (I currently have it set to no-cache); however, I am not using HTTPS, and the problem only occurs with filenames containing Unicode characters.
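    A common culprit with IE and non-ASCII names is the Content-Disposition header: IE 7/8 do not understand the RFC 5987 filename* syntax and instead expect the plain filename parameter to contain percent-encoded UTF-8. A minimal sketch of a download script that branches on the browser (the path and filename are placeholders, not from the question):

        <?php
        $path     = '/files/фото.jpg';   // hypothetical file with a Unicode name
        $filename = basename($path);
        $ua       = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

        header('Content-Type: application/octet-stream');
        if (strpos($ua, 'MSIE') !== false) {
            // IE 7/8: send the UTF-8 filename percent-encoded in the plain parameter.
            header('Content-Disposition: attachment; filename="' . rawurlencode($filename) . '"');
        } else {
            // Other browsers: RFC 5987 extended syntax.
            header("Content-Disposition: attachment; filename*=UTF-8''" . rawurlencode($filename));
        }
        readfile($path);
        ?>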

    Read the article

  • Removing response.End() includes the HTML of the page in the download

    - by vinay_rockin
        HttpResponse response = Context.Http.Response;
        response.Headers.Clear();
        response.Clear();
        response.ContentType = Component.OverrideMimeType ? Component.MimeType : "application/download";
        response.AppendHeader("Content-Disposition", String.Format("attachment; filename=\"{0}\"", HttpUtility.HtmlEncode(Path.GetFileName(file.FileName))));
        response.OutputStream.Write(file.Contents, 0, file.Contents.Length);
        response.Flush();
        // response.End();

    If I use response.End() it throws an exception, and if I comment out response.End() the download includes the HTML of the page on which the download link resides. I want to download the file without introducing this extra HTML. Any idea how to fix this?

    Read the article

  • Need php script to download a file on a remote server and save locally

    - by bigLarry
    I'm trying to download a file on a remote server and save it to a local subdirectory. The following code seems to work for small files (< 1 MB), but larger files just time out and don't even begin to download.

        <?php
        $source = "http://someurl.com/afile.zip";
        $destination = "/asubfolder/afile.zip";
        $data = file_get_contents($source);
        $file = fopen($destination, "w+");
        fputs($file, $data);
        fclose($file);
        ?>

    Any suggestions on how to download larger files without interruption?
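    For comparison, a minimal sketch that streams the remote file in small chunks instead of buffering it whole with file_get_contents(), and lifts PHP's execution time limit (assumes allow_url_fopen is enabled; the URL and path are taken from the question):

        <?php
        $source      = "http://someurl.com/afile.zip";
        $destination = "/asubfolder/afile.zip";

        set_time_limit(0);                      // remove the max_execution_time cap

        $in  = fopen($source, "rb");
        $out = fopen($destination, "wb");
        if ($in && $out) {
            while (!feof($in)) {
                fwrite($out, fread($in, 8192)); // copy 8 KB at a time, so memory stays flat
            }
        }
        if ($in)  fclose($in);
        if ($out) fclose($out);
        ?>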

    Read the article

  • Download HP Power Protector for ESXi

    - by Mark Henderson
    The HP PowerProtector user guide states that to install the HP PowerProtector client on an ESXi host: "Download the latest version of HPPP from the HP website (http://www.hp.com/go/rackandpower). The ESXi Server is automatically detected, and a shutdown command script is generated." However, in typical HP fashion, after clicking through no less than 6 different links to get to the downloads page (http://h18004.www1.hp.com/products/servers/proliantstorage/power-protection/software/power-protector/pp-dl.html), I am presented with:

    • HP Power Protector (HPPP) - Windows
    • HP Power Protector (HPPP) - Linux x86
    • HP Power Protector (HPPP) - Linux x64
    • HP Power Protector (HPPP) - Linux IA64
    • HP Power Protector (HPPP) - HPUX

    The Linux packages contain an RPM and in no way resemble what is in the HP documentation. None of these are labelled for ESXi. Does anyone know where or how to get the HP Power Protector ESXi client installed?

    Read the article

  • Where to download the n900 Songbird client?

    - by Walter White
    Hi all, I am confused ... is Songbird releasing a client for the n900? This page has been up for a few months, and I've been eagerly awaiting a download, but cannot find the link: http://getsongbird.com/gadgets/ Does anyone have any ideas? The media player on the n900 works well, but it doesn't connect to Last.fm. It isn't a deal breaker, but it would be a nice-to-have. Songbird seems to run better than Rhythmbox and offers a cleaner, more polished look. Thanks, Walter

    Read the article

  • Downloads on Vista Home Premium start off fast but slow down to 0 Kb/s and hang

    - by user66265
    I have Windows Vista Home Premium on my computer, and every time I go to download something it starts out at about 1.5 Mb/s and stays there for about 3 seconds, then slows to 800 Kb/s and continues to drop until it reaches 0 Kb/s and hangs. I've tried just about everything I can find, such as uninstalling all firewalls/antivirus, disabling RSS, autotuning, and TCP chimney offload via netsh, and updating everything, but it still keeps happening. I'd prefer not to reinstall, but if I have to then I have to... EDIT: Figured it out - the router needed a firmware update.

    Read the article

  • Automatically save/download e-mail body to disk

    - by CatamountJack
    Is there a program that will allow me to connect to my mail server (IMAP) and automatically save certain new e-mails to disk? Multiple times a day I receive automated e-mail updates about pending jobs from a system that processes some information for us. The data in these e-mails is written as plain-text within the body of the message. I would like to download the newest message, parse it, and display it on my desktop. The last two parts I can manage ok - it's just the automatic downloading that is posing a challenge. I don't use Outlook (I do use Thunderbird), but would prefer not to have the client open to make this happen. I'm currently running Win7.
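    Since Thunderbird doesn't need to be open for this, one option is a small script run on a schedule (e.g. via Task Scheduler on Win7). A rough sketch using PHP's standard imap extension - the server, credentials, sender address, and output path are all placeholders, and it assumes the body is the first plain-text MIME part:

        <?php
        // Hypothetical server and credentials - replace with the real account.
        $mbox = imap_open('{mail.example.com:993/imap/ssl}INBOX', 'user', 'password');

        // Find unread messages from the job-processing system (placeholder sender).
        $uids = imap_search($mbox, 'UNSEEN FROM "jobs@example.com"', SE_UID);

        if ($uids !== false) {
            foreach ($uids as $uid) {
                // Section "1" is the first body part - assumed plain text here.
                $body = imap_fetchbody($mbox, $uid, '1', FT_UID);
                file_put_contents("C:/jobdata/job-$uid.txt", $body);
            }
        }
        imap_close($mbox);
        ?>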

    Read the article

  • Prevent Windows Live Mail from downloading all messages from IMAP

    - by m8t
    Hello, recently I've been trying the Windows Live Mail client. Simple and beautiful. I have set up an IMAP account, and I'm used to a client downloading only headers. However, Windows Live Mail automatically creates a list of tasks to download all messages from all directories when you close the client. Is it possible to avoid this? It's a good and a bad thing: you can work offline and you have a backup, but it takes extremely long to perform - in fact I have about a hundred thousand emails, so this task can take a whole day. After looking in the settings I don't see anything special; maybe you have an idea? Thank you, Mike

    Read the article

  • Apache rewrite rules cause a download dialog for the PHP file

    - by Shaihi
    I have Apache 2.2.17 from the WampServer 2.1 installation, and I am debugging a website entirely locally on my computer. I have the following rules in the .htaccess:

        # Use PHP5 Single php.ini as default
        AddHandler application/x-httpd-php5s .php

        Options +FollowSymlinks
        RewriteEngine on
        RewriteBase /
        RewriteRule ^bella/(.*)/(.*)$ beauty.php?beauty_id=$1 [L]
        RewriteRule ^(argentina|brasil|chile|colombia|espana|mexico|rep_dominicana|uruguay|venezuela|peru|bolivia|cuba|ecuador|panama|paraguay|puerto_rico)/$ country.php?name=$1 [L]
        RewriteRule ^(argentina|brasil|chile|colombia|espana|mexico|rep_dominicana|uruguay|venezuela|peru|bolivia|cuba|ecuador|panama|paraguay|puerto_rico)/(hi5|facebook|twitter|orkut)/$ socialnetw.php?country=$1&category=$2 [L]

    The problem: when I enable these rules and try to access http://localhost/index.php using Firefox, I get a download dialog for the PHP file. If I comment out the Rewrite* part of the .htaccess file, index.php loads fine, but navigation on the page is broken...

    Read the article

  • Adobe Volume License for InDesign Digital Download Location?

    - by elistp
    We recently purchased a volume license for Adobe InDesign from Dell. We received an e-mail for the order that contains the serial #. However, there is no information on how to obtain Adobe InDesign from the Adobe licensing portal. This is our first time dealing with Adobe volume licensing, so I'm a bit lost as to what we're supposed to do. I've googled around a little and found an Adobe licensing portal, but I do not have access to it. Does anyone with experience concerning Adobe volume licensing have any idea what we're supposed to do to get a download of our purchase?

    Read the article

  • where to download emacs manuals as offline html files

    - by Jisang Yoo
    When you press C-h i in Emacs, it shows what's called the top of the INFO tree, and it links to all kinds of manuals: AUCTeX, Org Mode, Emacs, Emacs FAQ, Emacs Lisp Intro, Elisp, ... . Is there a place where I can download all of them at once as HTML files? The GNU home page has links to some of them in HTML format:

        http://www.gnu.org/software/emacs/manual/elisp.html_node.tar.gz
        http://www.gnu.org/software/emacs/manual/emacs.html_node.tar.gz

    But I cannot find a link to a single tar.gz file packing all of them.

    Read the article

  • Norton Security Suite Symantec Download Manager Error: "Error writing to disk"

    - by Stephen Pace
    My broadband provider (Comcast) decided to switch their 'included with service' security suite from McAfee to Norton Security Suite. Their email directed me to a site that downloaded the Symantec Download Manager (NortonDL.exe), and that went fine. I'm running Windows 7 32-bit; running this application pops up the standard User Account Control message, and the software is correctly identified as coming from Symantec. I answer 'yes' to allow the software to install and upon launch immediately get an "Error writing to disk" error. I searched the Internet for this error, but mainly found Comcast users complaining about the same issue with no resolution other than to call Symantec. I found no one suggesting a successful workaround, and it appeared that most of the support calls took up to three hours. I'd like to avoid that if possible. Ideas? To be honest, I'm getting close to bagging this installation and just moving to Microsoft Security Essentials.

    Read the article

  • php file downloads instead of being processed with ajax on apache

    - by eagleon
    I have a small website where some content is displayed within an HTML tag using AJAX. The content is simply taken from another page on the same website. However, sometimes instead of loading the parsed PHP file, the browser displays a download box instead. I downloaded the file, and it looks like a text file mixed with binary or gzipped data. I can't paste the binary stuff here, but here are some of the headers:

        Jul 2012 18:52:16 GMT
        Server: Apache/2
        X-Powered-By: PHP/5.3.10
        Content-Encoding: gzip
        Vary: Accept-Encoding,User-Agent
        Keep-Alive: timeout=1, max=95
        Connection: Keep-Alive
        Transfer-Encoding: chunked
        Content-Type: text/html

        HTTP/1.1 304 Not Modified
        Date: Sun, 01 Jul 2012 18:52:16 GMT
        Server: Apache/2
        Connection: Keep-Alive
        Keep-Alive: timeout=1, max=93
        ETag: "2fc857-409-4c39691c59b40"

        HTTP/1.1 304 Not Modified
        Date: Sun, 01 Jul 2012 18:52:16 GMT
        Server: Apache/2
        Connection: Keep-Alive
        Keep-Alive: timeout=1, max=92
        ETag: "2fc854-3e5-4c39691b65900"

        HTTP/1.1 304 Not Modified
        Date: Sun, 01 Jul 2012 18:52:16 GMT
        Server: Apache/2
        Connection: Keep-Alive
        Keep-Alive: timeout=1, max=91
        ETag: "2fc847-3e3-4c3969197d480"

    and large blocks of stuff like this: µàl]&BaËÜk#ìÏ
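    The Content-Encoding: gzip header on a body the browser then treats as binary suggests the fragment may be getting compressed twice (e.g., by both PHP and mod_deflate). A hedged sketch of what the AJAX endpoint could send to rule that out - apache_setenv() only exists under mod_php, so it is guarded, and the fragment content is a placeholder:

        <?php
        // Serve this fragment uncompressed to rule out double gzip encoding.
        if (function_exists('apache_setenv')) {
            apache_setenv('no-gzip', '1');          // ask mod_deflate to skip this request
        }
        ini_set('zlib.output_compression', 'Off');  // disable PHP-level output compression
        header('Content-Type: text/html; charset=utf-8');

        echo '<div>fragment content here</div>';    // placeholder fragment
        ?>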

    Read the article

  • Software to automatically download a file from FTP and then rename->replace existing file

    - by pauska
    Hi. We pay a news agency to provide us MP3s of hourly news bulletins. They put the MP3s on an FTP server about 10 minutes before every hour, with files named after the date and time (example: 02012010_1600.mp3 or similar). I need to find some solution that downloads only the latest modified file from the FTP server, renames it to news.mp3, and replaces the previous news.mp3. This should preferably run on a Windows 2008 Server, as a service if possible. Anyone have suggestions for software?
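    Failing a ready-made tool, this is small enough to script and run hourly from Task Scheduler. A sketch in PHP - the host and credentials are hypothetical, and it assumes the server supports the MDTM command for modification times:

        <?php
        // Hypothetical FTP details - replace with the agency's server and login.
        $conn = ftp_connect('ftp.example.com');
        ftp_login($conn, 'user', 'pass');
        ftp_pasv($conn, true);

        // Pick the most recently modified mp3 in the remote directory.
        $latest = null;
        $latestTime = 0;
        $files = ftp_nlist($conn, '.');
        if ($files !== false) {
            foreach ($files as $file) {
                if (substr($file, -4) === '.mp3') {
                    $t = ftp_mdtm($conn, $file);   // -1 if the server lacks MDTM
                    if ($t > $latestTime) {
                        $latestTime = $t;
                        $latest = $file;
                    }
                }
            }
        }

        // Download to a temporary name, then replace news.mp3 in one step.
        if ($latest !== null && ftp_get($conn, 'news.tmp', $latest, FTP_BINARY)) {
            rename('news.tmp', 'news.mp3');
        }
        ftp_close($conn);
        ?>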

    Read the article

  • 403 Forbidden when trying to download file that was uploaded using SSH

    - by Simon Hartcher
    I have FTP access to an Apache server on Linux to upload files so that they can be downloaded from the web. I was recently granted SSH access for extra permissions and figured it would be quicker to download files directly to the server, instead of downloading them to my machine and then FTPing them to the server. When I downloaded a file over SSH on the server and then placed it in the public_html directory, it was not visible from the web. The permissions (from SSH and the FTP client) were the same as all the other files that are visible, but it did not appear in the directory listing, and if I typed the filename into my browser I would get a 403 error. Obviously, when I FTP a file to the server, something else happens to make it web-visible that I am not currently privy to. What am I missing that is causing the file to be invisible from the web?

    Read the article

  • CentOS Insufficient space in download directory /var/cache/yum/base/packages

    - by Joao Heleno
    Hello! I was trying to yum install libpcap when I got:

        Error Downloading Packages:
          14:libpcap-0.9.4-15.el5.i386: Insufficient space in download directory /var/cache/yum/base/packages
            * free 0
            * needed 108 k

    Here's the output from df -h:

        Filesystem            Size  Used Avail Use% Mounted on
        /dev/sda1              20G   19G     0 100% /
        /dev/sda3             202G   38G  154G  20% /home
        tmpfs                 1.5G     0  1.5G   0% /dev/shm

    And fdisk -l:

        Disk /dev/sda: 250.0 GB, 250000000000 bytes
        255 heads, 63 sectors/track, 30394 cylinders
        Units = cylinders of 16065 * 512 = 8225280 bytes

           Device Boot      Start         End      Blocks   Id  System
        /dev/sda1   *           1        2611    20972826   83  Linux
        /dev/sda2            2612        3251     5140800   82  Linux swap / Solaris
        /dev/sda3            3252       30394   218026147+  83  Linux

    I have launched yum clean all with no success clearing up space. Please advise. Thanks.

    Read the article

  • Installing maven on Ubuntu by manual download

    - by WebDevHobo
    To install Maven, I downloaded the latest version from the website and then followed these steps: http://maven.apache.org/download.html#Installation The last step, the version check, does not work. It says that 'mvn' is currently not installed and that I should type sudo apt-get install maven2. If I run the mvn file directly, it does work:

        root@ubuntu:~# /usr/local/apache-maven/apache-maven-2.2.1/bin/mvn --version
        Apache Maven 2.2.1 (r801777; 2009-08-06 12:16:01-0700)
        Java version: 1.6.0_21
        Java home: /usr/java/jdk1.6.0_21/jre
        Default locale: en_US, platform encoding: UTF-8
        OS name: "linux" version: "2.6.32-25-generic" arch: "i386" Family: "unix"

    So, what am I doing wrong here? Or what would an apt-get install do extra that I might have forgotten?

    Read the article

  • Downloading movies with BitTorrent

    - by Quintin Par
    I come from a part of the world where the average Hollywood movie is released a year or more later than in the US. For example: Transformers: Revenge of the Fallen will be released only 5-6 months down the line. The Reader, I think, will not be released at all. Is this a valid justification for me to download movies from the torrent networks and watch them, even if I don't share them? P.S. I don't think my country has all these American-style cyber laws to catch me. It's just the guilt feeling. Note: Super User is not a legal resource.

    Read the article

  • Apache stopping downloads part way through

    - by Ben Smiley
    On my site there are some digital files which can be downloaded through a PHP script. The script works fine for small files, but large files (e.g. 115 MB) cannot be downloaded successfully. The connection dies after around 15 minutes, but it's not consistent - sometimes longer, sometimes shorter. I don't think it's a problem with the script timing out, because the download time isn't consistent. Equally, it doesn't seem like a memory limit problem, because the amount downloaded varies each time. Does anyone know of any Apache or PHP related settings which could cause this kind of problem?
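    For what it's worth, a minimal sketch of a download handler that rules out the usual PHP suspects - max_execution_time and output buffering - by lifting the time limit and streaming the file in flushed chunks (the path is a placeholder, not from the question):

        <?php
        $path = '/files/video.zip';   // hypothetical file to serve

        set_time_limit(0);            // no PHP execution time limit for this request
        header('Content-Type: application/octet-stream');
        header('Content-Length: ' . filesize($path));
        header('Content-Disposition: attachment; filename="' . basename($path) . '"');

        $fp = fopen($path, 'rb');
        while (!feof($fp)) {
            echo fread($fp, 8192);    // 8 KB at a time keeps memory use flat
            flush();                  // push each chunk to the client immediately
        }
        fclose($fp);
        ?>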

    Read the article

  • Using wget to download pdf files from a site that requires cookies to be set

    - by matt74tm
    I want to access a newspaper site and then download their e-paper copies (in PDF). The site requires me to log in using my email address and password, and then it permits me to access those PDF URLs. I'm having trouble 'setting my session' in wget. When I log into the site from my browser, it sets two cookie values:

        [email protected]
        Password=12345

    I tried:

        wget --post-data "[email protected]&Password=12345" http://epaper.abc.com/login.aspx

    However, that just downloaded the login page and saved it locally. The FORM on the login page has two fields, txtUserID and txtPassword, and radio buttons like this:

        <input id="rbtnManchester" type="radio" checked="checked" name="txtpub" value="44">

    Another button:

        <input id="rbtnLondon" type="radio" name="txtpub" value="64">

    If I post this to the login.aspx page, I get the same output:

        wget --post-data "[email protected]&txtPassword=12345&txtpub=44" http://epaper.abc.com/login.aspx

    If I add --save-cookies abc_cookies.txt, the file doesn't seem to contain anything other than the default content. For the last command, if I add --debug as well, it says:

        ...
        Set-Cookie: ASP.NET_SessionId=05kphcn4hjmblq45qgnjoe41; path=/; HttpOnly
        ...
        Stored cookie epaper.abc.com -1 (ANY) / <session> <insecure> [expiry none] ASP.NET_SessionId 05kphcn4hjmblq45qgnjoe41
        Length: 107253 (105K) [text/html]
        Saving to: `login.aspx'
        ...
        Saving cookies to abc_cookies.txt.

    However, abc_cookies.txt shows ONLY the following:

        # HTTP cookie file.
        # Generated by Wget on 2011-08-16 08:03:05.
        # Edit at your own risk.

    (Not sure why I'm not getting any responses on SO - perhaps SU is a better forum - http://stackoverflow.com/questions/7064171/using-wget-to-download-pdf-files-from-a-site-that-requires-cookies-to-be-set)

    EDIT 1

        C:\Temp>wget --cookies=on --keep-session-cookies --save-cookies abc_cookies.txt --post-data "txtUserID=abc%40gmail.com&txtPassword=password&txtpub=44&chkbox=checkbox&submit.x=48&submit.y=7" http://epaper.abc.com/login.aspx --debug
        DEBUG output created by Wget 1.11.4 on Windows-MinGW.
        --2011-08-18 08:15:59--  http://epaper.abc.com/login.aspx
        Connecting to epaper.abc.com|999.999.99.99|:80... connected.
        ---request begin---
        POST /login.aspx HTTP/1.0
        User-Agent: Wget/1.11.4
        Accept: */*
        Host: epaper.abc.com
        Connection: Keep-Alive
        Content-Type: application/x-www-form-urlencoded
        Content-Length: 100
        ---request end---
        [POST data: txtUserID=abc%40gmail.com&txtPassword=password&txtpub=44&chkbox=checkbox&submit.x=48&submit.y=7]
        HTTP request sent, awaiting response...
        ---response begin---
        HTTP/1.1 200 OK
        Connection: keep-alive
        Date: Thu, 18 Aug 2011 02:46:17 GMT
        Server: Microsoft-IIS/6.0
        X-Powered-By: ASP.NET
        X-AspNet-Version: 2.0.50727
        Set-Cookie: ASP.NET_SessionId=owcrje55yl45kgmhn43gq145; path=/; HttpOnly
        Cache-Control: private
        Content-Type: text/html; charset=utf-8
        Content-Length: 107253
        ---response end---
        200 OK
        Stored cookie epaper.abc.com -1 (ANY) / <session> <insecure> [expiry none] ASP.NET_SessionId owcrje55yl45kgmhn43gq145
        2011-08-18 08:16:05 (24.9 KB/s) - `login.aspx.1' saved [107253/107253]
        Saving cookies to abc_cookies.txt.
        Done saving cookies.

        C:\Temp>wget --referer=http://epaper.abc.com/login.aspx --cookies=on --load-cookies abc_cookies.txt --keep-session-cookies --save-cookies abc_cookies.txt http://epaper.abc.com/PagePrint/16_08_2011_001.pdf --debug
        DEBUG output created by Wget 1.11.4 on Windows-MinGW.
        Stored cookie epaper.abc.com -1 (ANY) / <session> <insecure> [expiry none] ASP.NET_SessionId owcrje55yl45kgmhn43gq145
        --2011-08-18 08:16:12--  http://epaper.abc.com/PagePrint/16_08_2011_001.pdf
        Connecting to epaper.abc.com|999.999.99.99|:80... connected.
        ---request begin---
        GET /PagePrint/16_08_2011_001.pdf HTTP/1.0
        Referer: http://epaper.abc.com/login.aspx
        User-Agent: Wget/1.11.4
        Accept: */*
        Host: epaper.abc.com
        Connection: Keep-Alive
        Cookie: ASP.NET_SessionId=owcrje55yl45kgmhn43gq145
        ---request end---
        HTTP request sent, awaiting response...
        ---response begin---
        HTTP/1.1 200 OK
        Connection: keep-alive
        Date: Thu, 18 Aug 2011 02:46:30 GMT
        Server: Microsoft-IIS/6.0
        X-Powered-By: ASP.NET
        X-AspNet-Version: 2.0.50727
        content-disposition: attachement; filename=Default_logo.gif
        Cache-Control: private
        Content-Type: image/GIF
        Content-Length: 4568
        ---response end---
        200 OK
        2011-08-18 08:16:14 (7.74 KB/s) - `16_08_2011_001.pdf' saved [4568/4568]
        Saving cookies to abc_cookies.txt.
        Done saving cookies.

    Contents of abc_cookies.txt:

        epaper.abc.com  FALSE   /       FALSE   0       ASP.NET_SessionId       owcrje55yl45kgmhn43gq145

    Read the article

  • Redirecting to a different exe for download based on user agent

    - by Ra
    I own a Linux/Apache site where I host exe files for download. Now, when a user clicks this link to my site (published on another site): http://mysite.com/downloads/file.exe I need to dynamically check their user agent and redirect them to either http://mysite.com/downloads/file-1.exe or http://mysite.com/downloads/file-2.exe It seems to me that I have two options:

    1. Put a .htaccess file stating that .exe files should be considered to be scripts. Then write a script that checks the user agent and redirects to a real exe placed in another folder. Call this script file.exe.
    2. Use Apache mod_rewrite to point file.exe to redirect.php.

    Which of these is better? Any other considerations? Thanks.
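    For reference, a sketch of option 2: a redirect.php behind a mod_rewrite rule such as "RewriteRule ^downloads/file\.exe$ /redirect.php [L]". The user-agent test here (64-bit Windows vs. everything else) is purely hypothetical - substitute whatever distinction the two builds actually require:

        <?php
        // redirect.php - picks a download based on the User-Agent header.
        $ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

        // Hypothetical rule: 64-bit Windows clients get file-2, everyone else file-1.
        if (strpos($ua, 'Win64') !== false || strpos($ua, 'WOW64') !== false) {
            $target = '/downloads/file-2.exe';
        } else {
            $target = '/downloads/file-1.exe';
        }

        header('Location: ' . $target, true, 302);   // temporary redirect
        exit;
        ?>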

    Read the article

  • Downloading videos from a flash website

    - by Mimi
    There are plugins that can capture Flash videos from websites like YouTube and others. There's also the browser cache, which keeps the videos so I can copy them somewhere else and store them on my computer. I know of these, but how can I download a video from a website that (I think) is all Flash? The address doesn't change wherever you navigate to, and so it stays the same when you play a video on the website. No plugin I've tried (RealPlayer, Ant Video Downloader, IDM) has worked with it, and nothing gets cached from that website.

    Read the article
