Search Results

Search found 11748 results on 470 pages for 'webclient download'.


  • Capture reload/endrequest event after server redirect to download file

    - by Prutswonder
    Inside a web page I have an Excel download button, which redirects to a page that serves the requested Excel file via the application/ms-excel MIME type, which usually results in a file download in the browser. In the page, I have the following jQuery code:

        $(document).ready(function () {
            $(".div-export .button").click(function () {
                setBusy(true);
            });
            Sys.WebForms.PageRequestManager.getInstance().add_endRequest(function () {
                setBusy(false);
            });
        });

    This displays a busy animation while the user waits for the Excel file to be served. The problem is that the animation doesn't end (setBusy(false) is never reached) after the file download, because the endRequest event doesn't get fired, probably because of the server redirect. Does anyone have a workaround for this? Edit: The download button is handled in an UpdatePanel.

  • Web Performance testing using VS2010 "Testing a file download"

    - by cheedep
    Hi all, I am trying out the VS 2010 testing tools for the first time. I recorded a web performance test, and one of my actions involves a file download implemented as in the KB article at http://support.microsoft.com/kb/812406, i.e. by streaming chunks of 10000 bytes. However, my test fails at the download with "The response stream has been closed". Please help me understand why this is happening; I would also welcome suggestions on how you would test such a file download. My main aim is to see how the download performs in a load test over an intercontinental 350 kbps connection with files of about 30-50 MB. Thanks.

  • How to programmatically download an image from a website?

    - by MemoryLeak
    I need to download images from a website, and I have the login name and password, but if I just use the URL to download an image, it throws an exception: there is no value in session. I think I need to log in to the website before I can programmatically download the image. Do you have any solutions? Thanks in advance!
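
    One possible approach, sketched below: perform the login request first, keep the session cookies in a CookieContainer, and reuse them for the image request. The login URL, form field names, and image URL here are only placeholders for whatever the site actually uses.

        using System;
        using System.IO;
        using System.Net;
        using System.Text;

        class ImageDownloader
        {
            static void Main()
            {
                var cookies = new CookieContainer();

                // 1) Post the credentials to the (hypothetical) login page.
                var login = (HttpWebRequest)WebRequest.Create("http://example.com/login");
                login.Method = "POST";
                login.ContentType = "application/x-www-form-urlencoded";
                login.CookieContainer = cookies;                  // session cookie ends up here
                byte[] form = Encoding.UTF8.GetBytes("user=name&pass=secret");
                using (Stream s = login.GetRequestStream())
                    s.Write(form, 0, form.Length);
                using (login.GetResponse()) { }                   // we only need the cookies

                // 2) Request the image with the same cookie container.
                var img = (HttpWebRequest)WebRequest.Create("http://example.com/images/1.jpg");
                img.CookieContainer = cookies;
                using (var resp = img.GetResponse())
                using (var file = File.Create("1.jpg"))
                    resp.GetResponseStream().CopyTo(file);
            }
        }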

  • PHP Download Script

    - by KA_lin
    [edited] I am trying to make a script that downloads a file. The problem is that I am accessing a page (on a server on the network) that generates that file (it opens a download window). Is there a way I can get that file without that pop-up, in PHP? NOTE: I cannot modify the generating page... It is an Excel file. The application is called Cognos. I managed with Opera to see the page variables parsed, so I can get to the download page, but I must save that download into a folder without the download pop-up.

  • refresh page after form download submit

    - by solomongaby
    Hello, I have a form whose action returns a download. The problem is that the page pops up the download, and you can save it, but it will not allow another form submit. I was thinking of doing a page refresh after the submit, but I can't figure out how to do that without stopping the download. Do you have any ideas? Thanks

  • Problem with file download on Zend Framework

    - by user1400
    Hello all, I am uploading files to the server in my application and it works fine. I want other users to be able to download these files, but I get an error. I created an upload folder inside the public folder and I upload my files into it. Now when I create a link (<a href="http://mytest/public/1.jpg">download image</a>) to these files, I get the error "The requested URL /public/upload/1.jpg was not found on this server." How should I set up routing to download these files? Some help please, thanks.

  • Avoid application becoming unresponsive on download

    - by baron
    Hi, I have an application which downloads a file from a network location and then displays that file's information. It uses WebClient.DownloadFile to achieve this. The problem is that when the user clicks a button to start the download, the application becomes unresponsive until the file has downloaded, which gives the impression the application may have hung. I would like to seek ideas that would avoid this scenario. Though my solution will need to be as quick and cheap as possible, I would be interested in hearing about custom loaders that have been built, or some sort of loading symbol / progress bar / hourglass : ) Thanks for reading.
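
    One straightforward option, sketched below, is to switch from the blocking DownloadFile call to WebClient.DownloadFileAsync, which downloads in the background and raises events you can use to drive a progress bar; the URL and target path are placeholders.

        using System;
        using System.Net;

        class Downloader
        {
            static void Main()
            {
                var client = new WebClient();

                // Raised periodically with progress information; ideal for a progress bar.
                client.DownloadProgressChanged += (s, e) =>
                    Console.WriteLine("Downloaded {0}%", e.ProgressPercentage);

                // Raised once the file is on disk (or the download failed).
                client.DownloadFileCompleted += (s, e) =>
                    Console.WriteLine(e.Error == null ? "Done" : "Failed: " + e.Error.Message);

                // Returns immediately, so the UI stays responsive.
                client.DownloadFileAsync(new Uri("http://example.com/big.file"), @"C:\Temp\big.file");

                Console.ReadLine(); // keep this console demo alive while the download runs
            }
        }

    In a WinForms or WPF application the same events are raised on the UI thread via the synchronization context, so the handlers can update controls directly.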

  • C#/.Net Download file from premium rapidshare account

    - by Simon
    Hello, how can I log in to a premium RapidShare account from my code? I tried this but it is not working:

        string authInfo = "name" + ":" + "pass";
        authInfo = Convert.ToBase64String(Encoding.Default.GetBytes(authInfo));
        client.Headers["Authorization"] = "Basic " + authInfo;
        client.DownloadFile("url", "C:\\Temp\\aaaa.file");

    OR

        WebClient client = new WebClient();
        client.Credentials = new NetworkCredential("name", "pass");
        client.DownloadFile("url", "C:\\Temp\\aaaa.file");

    Is there any simple way to download the file directly from RapidShare premium? Thank you a lot! Regards, Simon

  • How to disable automatic download of new podcast episodes in Rhythmbox?

    - by meceso
    After adding a new podcast to Rhythmbox, it starts downloading the newest episode automatically without asking the user for permission. I find this behaviour pretty annoying, especially when you have a lot of podcasts and are using 3G broadband to access the internet. Can you update the podcast feed and then choose manually which episodes you want to download? I did not find this in the settings/preferences...

  • Download/update webpages listed in XML sitemap

    - by unor
    I'm searching for a FLOSS tool that downloads all pages (and embedded resources, e.g. images) linked in an XML sitemap (built according to http://www.sitemaps.org/). The tool should "crawl" the sitemap regularly and look for new and deleted URLs and for changes in the lastmod element. So whenever a page gets added/deleted/updated, the tool should apply the changes. Some sitemaps list sub-sitemaps in a sitemapindex (nested sitemap elements); the tool should understand this, load all linked sub-sitemaps and look for URLs in there. I know there are tools that allow me to extract all URLs from the sitemap, so that I could feed them to wget or similar tools (see for example: Extract Links from a sitemap(xml)). But this wouldn't help with getting notified about updates to pages. Tracking the webpages themselves for updates doesn't work, because "secondary" content on the pages changes daily, but lastmod gets only updated when relevant content changed.
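
    To illustrate the mechanics (not a recommendation of a specific tool), here is a minimal sketch that reads a sitemap using the standard http://www.sitemaps.org/schemas/sitemap/0.9 namespace, recurses into sitemapindex entries, and reports each URL with its lastmod value; the sitemap URL is a placeholder.

        using System;
        using System.Xml.Linq;

        class SitemapReader
        {
            static readonly XNamespace Ns = "http://www.sitemaps.org/schemas/sitemap/0.9";

            static void Main()
            {
                Walk("http://example.com/sitemap.xml");
            }

            // Prints every <url> entry; follows <sitemapindex> to sub-sitemaps.
            static void Walk(string sitemapUrl)
            {
                XDocument doc = XDocument.Load(sitemapUrl);

                // A sitemap index points at further sitemaps.
                foreach (var sub in doc.Descendants(Ns + "sitemap"))
                    Walk((string)sub.Element(Ns + "loc"));

                // A regular sitemap lists the pages themselves.
                foreach (var url in doc.Descendants(Ns + "url"))
                    Console.WriteLine("{0}  lastmod={1}",
                        (string)url.Element(Ns + "loc"),
                        (string)url.Element(Ns + "lastmod") ?? "(none)");
            }
        }

    Comparing the reported URL set and lastmod values against a stored copy from the previous run would give exactly the add/delete/update signal described above.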

  • Mac limit update download speed

    - by ILMV
    I have a MacBook with Snow Leopard that I need to update, but I want to limit the download speed so it doesn't wipe out my entire bandwidth. Is there an application or setting change I can use to limit the speed to 20 KBps? I've already tried ipfw through the terminal with little success. Thanks :-)

  • the right options to traverse/download the pages/directories of a subdomain

    - by Lorraine Bernard
    Let's suppose there is a site with the following directories (subdomain):

        index.php
        |-sub1
        |  |-index.php
        |  |-sub1sub1
        |  |  |-index.php
        |  |  |-other.php
        |  |  |-sub1sub1sub1
        |-sub2
        |  |-index.php
        |  |- ...
        |-sub3
        |  |- ...

    My questions are: 1) how can I properly display the site of the sub1 subdomain (http://domain/sub1) locally, and 2) how can I get just the files and directories which are children of sub1 (sub1sub1 and sub1sub1sub1, for example)? I tried the following options (for wget), but it also retrieves the files and directories which are in sub2, sub3 etc.:

        wget -E -H -k -K -r http://domain/sub1/index.php

  • How to download a big file with Chrome on Mac OS X?

    - by Eye of Hell
    If I try to download a big file on an unstable connection/server (Xcode 4), Google Chrome simply "stops" downloading on the first network error, so I have the first 1-2-3 gigabytes of the file and Chrome thinks that the download is finished. Unfortunately, I need to download the entire file, so I need a more advanced download tool like wget. But here comes a problem: most URLs currently on the web are not direct URLs but multiple "redirect" pages that use complex JavaScript in order to generate the next URL and redirect the browser to it. Chrome handles such things fine, but if I try to supply such a URL to wget, it will download some "intermediate" page as a file - not the file itself but an HTML page with complex redirect JavaScript. Is there any way to get a direct URL from Chrome, or to somehow discover it so I can use it with wget? Maybe there is some advanced download manager integrated into Chrome that I just need to install? I use Mac OS X 10.6.6 and the latest Google Chrome.

  • How to automate downloading files?

    - by Damon
    I got a book which came with a pass to access digital versions of hi-res scans of much of the artwork in the book. Amazing! Unfortunately, all of these are presented as 177 pages of 8 images each, with links to zip files of jpgs. It is extremely tedious to browse, and I would love to be able to get all the files at once rather than sitting and clicking through each one separately. The pages are archive_bookname/index.1.htm through archive_bookname/index.177.htm, and each of those pages has 8 links to files such as <snip>/downloads/_Q6Q9265.jpg.zip, <snip>/downloads/_Q6Q7069.jpg.zip, <snip>/downloads/_Q6Q5354.jpg.zip, which don't quite go in order. I cannot get a directory listing of the parent /downloads/ folder. Also, the files are behind a login wall, so using a non-browser tool might be difficult without knowing how to recreate the session info. I've looked into wget a little, but I'm pretty confused and have no idea if it will help me with this. Any advice on how to tackle this? Can wget do this for me automatically?
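
    As a rough illustration of the scripting involved (independent of wget), the sketch below walks the 177 index pages, scrapes the .jpg.zip links with a regular expression, and downloads each one. The base URL is a placeholder, and the session cookie copied from the browser is assumed to be enough to get past the login wall.

        using System;
        using System.IO;
        using System.Net;
        using System.Text.RegularExpressions;

        class ArtworkGrabber
        {
            static void Main()
            {
                const string baseUrl = "http://example.com/archive_bookname/";   // placeholder
                const string cookie = "session=PASTE_FROM_BROWSER";              // assumption: reusing the browser session
                var client = new WebClient();

                for (int page = 1; page <= 177; page++)
                {
                    // WebClient may drop custom headers after a request, so reapply before each call.
                    client.Headers[HttpRequestHeader.Cookie] = cookie;
                    string html = client.DownloadString(baseUrl + "index." + page + ".htm");

                    // Grab every link ending in .jpg.zip on the page.
                    foreach (Match m in Regex.Matches(html, "href=\"([^\"]+\\.jpg\\.zip)\""))
                    {
                        Uri absolute = new Uri(new Uri(baseUrl), m.Groups[1].Value);  // resolve relative links
                        string fileName = Path.GetFileName(absolute.LocalPath);
                        Console.WriteLine("Downloading " + fileName);

                        client.Headers[HttpRequestHeader.Cookie] = cookie;
                        client.DownloadFile(absolute, fileName);
                    }
                }
            }
        }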

  • What is the best way to download files via HTTP using .NET?

    - by Shamika
    In one of my applications I'm using the WebClient class to download files from a web server. Depending on the web server, the application sometimes downloads millions of documents. It seems that when there are a lot of documents, the WebClient doesn't scale up well performance-wise. It also seems that the WebClient doesn't immediately close the connection it opened to the web server, even after it has successfully downloaded a particular document. I would like to know what other alternatives I have.
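
    One lower-level alternative, sketched here, is to drive HttpWebRequest directly, raise the per-host connection limit, and dispose of every response stream promptly so the connection is returned to the pool and reused; the URL list is a placeholder.

        using System;
        using System.IO;
        using System.Net;

        class BulkDownloader
        {
            static void Main()
            {
                // Allow more than the default two concurrent connections per host.
                ServicePointManager.DefaultConnectionLimit = 16;

                string[] urls = { "http://example.com/doc1.pdf", "http://example.com/doc2.pdf" }; // placeholder list

                foreach (string url in urls)
                {
                    var request = (HttpWebRequest)WebRequest.Create(url);
                    request.KeepAlive = true;   // reuse the underlying connection across requests

                    // Disposing the response (and its stream) hands the connection back for reuse.
                    using (var response = request.GetResponse())
                    using (var source = response.GetResponseStream())
                    using (var target = File.Create(Path.GetFileName(new Uri(url).LocalPath)))
                    {
                        source.CopyTo(target);
                    }
                }
            }
        }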

  • mysql image disable print download

    - by Vish
    Hi, We use a Flex AIR client and a WAMP server. Tiff images are stored in MySQL. Currently, I can download the image from the AIR client and it prompts for a download dialog. Things are fine up to this point. We got a new requirement: only some users may print the image which gets downloaded; other users should not be able to print the tiff image. I am wondering how to accomplish this. One idea, not sure if it's efficient, is to convert the requested image to PDF on the server side, disable the print option there (I hope there are APIs available) and send back the PDF. Please let me know if there are better ideas. Also, is there a way to prevent the file download dialog from popping up every time the file is requested for download? Can we just get the file stream to the client and manipulate it to open with a particular viewer, or write it to PDF... Please help.

  • How to open the download window when a dynamically created link is clicked in asp.net

    - by Ranjana
    I have stored a text file in the database. I need to show the text file when I click the link, and this link has to be created dynamically. My code is below.

    aspx.cs:

        protected void Page_Load(object sender, EventArgs e)
        {
            if (!Page.IsPostBack)
            {
                DataTable dtassignment = new DataTable();
                dtassignment = serviceobj.DisplayAssignment(Session["staffname"].ToString());
                if (dtassignment != null)
                {
                    Byte[] bytes = (Byte[])dtassignment.Rows[0]["Data"];
                    //download(dtassignment);
                }
                divlink.InnerHtml = "";
                divlink.Visible = true;
                foreach (DataRow r in dtassignment.Rows)
                {
                    divlink.InnerHtml += "<a href='" + "'onclick='download(dtassignment)'>" + r["Filename"].ToString() + "</a>" + "<br/>";
                }
            }
        }

        public void download(DataTable dtassignment)
        {
            System.Diagnostics.Debugger.Break();
            Byte[] bytes = (Byte[])dtassignment.Rows[0]["Data"];
            Response.Buffer = true;
            Response.Charset = "";
            Response.Cache.SetCacheability(HttpCacheability.NoCache);
            Response.ContentType = dtassignment.Rows[0]["ContentType"].ToString();
            Response.AddHeader("content-disposition", "attachment;filename=" + dtassignment.Rows[0]["FileName"].ToString());
            Response.BinaryWrite(bytes);
            Response.Flush();
            Response.End();
        }

    I have got the link created dynamically, but I am not able to download the text file when I click the link. How do I carry this out? Please help me out...
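
    For context, a common pattern for this situation is sketched below (names such as LookUpAssignment are hypothetical): a client-side onclick cannot invoke a server-side method directly, so instead the page renders plain links that carry the file name in the query string, and Page_Load streams the file back whenever that parameter is present.

        using System;
        using System.Data;
        using System.Web;

        // Sketch only, assuming the same page both renders the links and serves the files.
        public partial class Assignments : System.Web.UI.Page
        {
            protected void Page_Load(object sender, EventArgs e)
            {
                string name = Request.QueryString["file"];
                if (!string.IsNullOrEmpty(name))
                {
                    DataRow row = LookUpAssignment(name);        // hypothetical data-access helper
                    Response.Clear();
                    Response.ContentType = row["ContentType"].ToString();
                    Response.AddHeader("content-disposition", "attachment;filename=" + name);
                    Response.BinaryWrite((byte[])row["Data"]);
                    Response.End();
                    return;
                }

                // Normal page load: emit one plain link per row, carrying the file name in the URL.
                DataTable dtassignment = serviceobj.DisplayAssignment(Session["staffname"].ToString());
                foreach (DataRow r in dtassignment.Rows)
                {
                    divlink.InnerHtml += "<a href='?file=" + Server.UrlEncode(r["FileName"].ToString()) + "'>"
                                       + r["FileName"].ToString() + "</a><br/>";
                }
            }
        }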
