Search Results

Search found 11748 results on 470 pages for 'webclient download'.

Page 36/470 | < Previous Page | 32 33 34 35 36 37 38 39 40 41 42 43  | Next Page >

  • Context Menu for Browser to download file to specific folder

    - by elcojon
    There is this website which generates customized audio files for me. I would like to save them in a special folder, but I don't want to select the "special folder" each time in my browser's file-chooser dialogue. I would prefer to have a custom entry in the context menu when I right-click the download link. This context-menu entry should do the trick and download the file to the predefined "special folder". How would I go about that? I am using Safari and Chrome, so a solution in either browser is fine. To get into the context menu of the browser, what kind of programming do I need to do? Is it an extension, a plugin, etc.? Thanks
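
    In Chrome this is extension territory. A minimal JavaScript sketch of the idea, assuming a hypothetical "special-folder" name; note that chrome.downloads can only write below the browser's default download directory, and the manifest must request the "contextMenus" and "downloads" permissions. Safari would need a separate extension written against its own API:

        // background.js -- sketch only; the subfolder name is an assumption,
        // and chrome.downloads saves relative to the default download directory.
        chrome.contextMenus.create({
          id: "save-to-special-folder",
          title: "Save to special folder",
          contexts: ["link"]
        });

        chrome.contextMenus.onClicked.addListener(function (info) {
          if (info.menuItemId === "save-to-special-folder" && info.linkUrl) {
            chrome.downloads.download({
              url: info.linkUrl,
              filename: "special-folder/" + info.linkUrl.split("/").pop()
            });
          }
        });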

    Read the article

  • How to Make a Plugin for Chrome (dll) like RealPlayer's Download and Record Plugin (capturing media)

    - by uenx
    Hi guys. I'm trying to make a media download bar for the Chrome browser like RealPlayer's (a DLL plugin). Whenever you open a page that contains a media stream, like YouTube, it would show a download bar at the top-left corner of the Flash player, allowing you to download the video/song to your computer. How does it capture the video URL from the Flash player? Which method and language (C++ or C#) do I have to use? Thanks in advance :) (and sorry for the bad English)

    Read the article

  • iPhone file download not working

    - by Anonymous
    Hi, in my app I first connect to a web service, which in return sends a URL for a file. I use the URL to download the file and then display it in a new view. I get the correct URL but am not able to download the file from that location. I have another test app which downloads a file from the same location, and it works like a charm. Below is my code for the web-service file download; this is the snippet where I parse the web service XML and then pass the result to NSData for the file download. Any suggestions on where I am going wrong? I am referring to the following tutorials: Web Service, PDF Viewer.

        if ([elementName isEqualToString:@"PRHPdfResultsResult"]) {
            NSLog(soapResults);
            UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Report downloaded from:"
                                                            message:soapResults
                                                           delegate:self
                                                  cancelButtonTitle:@"OK"
                                                  otherButtonTitles:nil];
            NSData *pdfData = [[NSData alloc] initWithContentsOfURL:
                                  [NSURL URLWithString:soapResults]];
            // Store the data locally as a PDF file
            NSString *resourceDocPath = [[NSString alloc] initWithString:
                [[[[NSBundle mainBundle] resourcePath]
                    stringByDeletingLastPathComponent]
                    stringByAppendingPathComponent:@"Documents"]];
            NSString *filePath = [resourceDocPath stringByAppendingPathComponent:@"myPDF.pdf"];
            [pdfData writeToFile:filePath atomically:YES];
            [alert show];
            [alert release];
            [soapResults setString:@""];
            elementFound = FALSE;
        }
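
    For what it's worth, one common culprit in this pattern (an assumption, since the parsed value isn't shown): strings accumulated by an NSXMLParser delegate often carry stray whitespace or newlines, which makes [NSURL URLWithString:] return nil. A small sketch of trimming the value and writing into the sandbox Documents directory instead of a bundle-relative path:

        // Hypothetical fix: trim parser whitespace before building the URL
        NSString *urlString = [soapResults stringByTrimmingCharactersInSet:
                                  [NSCharacterSet whitespaceAndNewlineCharacterSet]];
        NSData *pdfData = [NSData dataWithContentsOfURL:[NSURL URLWithString:urlString]];

        // Write into the app sandbox's Documents directory
        NSString *docsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                NSUserDomainMask, YES) objectAtIndex:0];
        [pdfData writeToFile:[docsDir stringByAppendingPathComponent:@"myPDF.pdf"]
                  atomically:YES];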

    Read the article

  • Logging full network connection like in browser

    - by pokemarine
    Can I set up my browser, or download a special application, to log/download everything that goes through the network while browsing a site? When I open a website in Firefox, it downloads the files to temporary folders, but I need everything, and by everything I mean full Ajax request logging, image downloading, etc.: every file that is downloaded while browsing the site (and not temporarily; I need them later, so it should log and categorize them somehow). In other words: I want to automatically save everything you can see in the browser's network tab (after pressing F12), and not just the responses; I also need the information about what was sent and what came back.

    Read the article

  • wget recursively with -np option still ascends to parent directory

    - by vectra
    tl;dr: will `wget --no-parent -r` download from a directory above the given URL's directory? When using wget to download, say, images recursively from example.com/a/b with the -r and -np options, will a picture under example.com/a/c/ be downloaded when example.com/a/b/ delivers an HTML file containing a link to that picture? If so, how do I get all pictures that are in a folder and its subfolders, and only those? The description of the option --no-parent says "Do not ever ascend to the parent directory when retrieving recursively". However, directory browsing delivers a link to the parent directory, which wget follows despite the mentioned option. What did I miss? Edit: using GNU Wget 1.12
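
    A sketch of one way to fence the crawl in (URL and accept list are placeholders). Note the trailing slash on the start URL: without it, wget treats b as a file and uses /a/ as the parent directory, which widens what --no-parent allows:

        # Recurse only below /a/b/ and keep only image files
        wget -r --no-parent -l inf -A '*.jpg,*.png,*.gif' http://example.com/a/b/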

    Read the article

  • Make download dialog show up for pictures in IIS 6.0

    - by LinuxGnut
    I have a client that is offering picture downloads from their site. They want the user to have to download the picture rather than have it appear in the browser when the link is clicked. Inserting the text "Right-click to save as" or something similar isn't an option with this client, as everything needs to be done their way. Now, I know I could accomplish this task in PHP or ASP, but I'd rather not have to write an addition to Magento to do it. So is there a way in IIS 6.0 (Server 2003) to send the correct headers for image formats so that a download dialog pops up?
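
    For reference, the header that triggers the download dialog is Content-Disposition. In IIS 6 a custom header can be attached per-folder (folder Properties > HTTP Headers > Custom HTTP headers), so assuming the downloadable images live in their own directory, the header to add would look like:

        Content-Disposition: attachment

    Without an explicit filename parameter, browsers fall back to the file name in the URL. The caveat is that the header applies to everything in that folder, so images that should still display inline must live elsewhere.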

    Read the article

  • linux: upload / download difference on network shares

    - by Batsu
    I have a Red Hat Enterprise Linux 6 machine (with SELinux) which shows a significant difference in speed between download and upload (the latter significantly slower) of files shared over the LAN. The bottleneck seems to be the output of the Linux machine, since I get a rate of around 1Mb/s when:
    - WinXP machines download files shared (using samba) by the RHEL machine
    - uploading files from the RHEL machine to a WinXP shared folder
    while:
    - uploading from the XP machines to the Linux shares
    - downloading the XPs' shares on the RHEL machine
    - any share between Windows machines
    all run smoothly (around 50Mb/s). Since the upload from RHEL to the WinXP share is slowed too, I would exclude an issue in the samba configuration. What could possibly determine this limit in the upload speed? Update: iptables doesn't show any output rule, and disabling it doesn't make any noticeable difference, so I would rule it out too.

    Read the article

  • Instant file sharing between users in PHP

    - by Skyfe
    Hi there, I'm working on a rather complex system in which users can directly exchange files with each other from the website. Are any of these things possible:
    - have another user download a file which is still being uploaded by another user (in progress), OR
    - make a user automatically (instantly) download a file from another user's PC through our website, OR
    - make a user automatically (instantly) download a file from our server, so it's directly downloaded to the user's PC with the download progress shown on our website, without the normal Internet Explorer or Firefox download dialog.
    Thank you very much in advance. Best Regards, Webcodez.net. UPDATE: an example would be MSN's file sharing, but through a website instead of an application.
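
    For the third variant, a minimal PHP sketch of streaming a server-side file in chunks (the path and chunk size are assumptions); flushing per chunk is what would let the application track progress server-side. Note this still goes through the browser's normal download UI, which can't be replaced from PHP alone:

        <?php
        // Hypothetical passthrough script; $path is a placeholder.
        $path = '/srv/shared/example.bin';

        header('Content-Type: application/octet-stream');
        header('Content-Disposition: attachment; filename="' . basename($path) . '"');
        header('Content-Length: ' . filesize($path));

        $fh = fopen($path, 'rb');
        while (!feof($fh)) {
            echo fread($fh, 8192); // send one chunk
            flush();               // push it to the client immediately
        }
        fclose($fh);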

    Read the article

  • What is the best way to make versioned files available for download

    - by Saif Bechan
    I have a small PHP framework which I want to make available for download. It is located in a git repository, but the latest commit is not always the one that I want to make available. Is there some place I can make the versions available for download? Another thing about this framework is that I release additional components for it, and these also have different versions. Is there somewhere I can host the whole project so people can browse through everything and download what they need, or should I build this myself?
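
    One sketch of doing this straight from the repository (version numbers are placeholders): tag each release, then publish an archive of exactly that tag. Hosting services such as GitHub also generate downloadable archives per tag automatically, which covers the "browse and download" part:

        # Mark the commit that should be downloadable as a release
        git tag -a v1.2.0 -m "Release 1.2.0"

        # Build a distributable archive of exactly that tag
        git archive --format=zip -o framework-1.2.0.zip v1.2.0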

    Read the article

  • Download files using Perl

    - by Neeraj
    I have a project that depends on some binaries being downloaded from the web at install time. For this, what I do is:

        if (file present in src/)
            skip that file
        else
            use wget to download the file

    The problem with this approach is that when I interrupt a download in the middle and invoke the script the next time, the partially downloaded file is also skipped (which is not desired); I also want wget to resume the download of the partially downloaded file. How should I go about it? Possible solutions I could think of: let the file be downloaded to some temporary name, say download_tmp, and copy it to the original name only if successful; handle SIG{'INT'} to write proper cleanup code. But none of these helps resume the partial download. Any insights?
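
    A sketch in Perl (URL and file names are placeholders): wget's -c flag resumes a partial file, and staging the download in a separate directory means an interrupted file is never mistaken for a finished one:

        #!/usr/bin/perl
        use strict;
        use warnings;
        use File::Copy qw(move);

        my $url  = 'http://example.com/binaries/tool.bin';  # placeholder
        my $file = 'src/tool.bin';

        unless (-e $file) {
            # -c resumes a partial download; -P stages it outside src/
            system('wget', '-c', '-P', 'tmp_download', $url) == 0
                or die "wget failed\n";
            move('tmp_download/tool.bin', $file)
                or die "move failed: $!\n";
        }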

    Read the article

  • Persistent retrying resuming downloads with curl

    - by Svish
    I'm on a Mac and have a list of files I would like to download from an FTP server. The connection is a bit buggy, so I want it to retry and resume if the connection is dropped. I know I can do this with wget, but unfortunately Mac OS X doesn't come with wget, and to install it (unless I have missed something) I would need to install Xcode and MacPorts first, which I would like to avoid. Curl is available, though, but I don't know how it works or how to use it, really. If I have a list of files in a text file (one full path per line, like ftp://user:pass@server/dir/file1), how can I use curl to download all those files? And can I get curl to never give up, retrying indefinitely and resuming downloads where it left off?
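
    A sketch of a loop over such a list (the list file name is a placeholder): -C - resumes from wherever the previous attempt stopped, -O keeps the remote file name, and --retry handles transient failures; the until loop keeps going even when curl gives up entirely:

        # files.txt: one full ftp:// URL per line
        while read -r url; do
            until curl -C - -O --retry 10 --retry-delay 5 "$url"; do
                echo "retrying $url" >&2
                sleep 5
            done
            # caveat: if a server rejects resuming an already-complete
            # file, this loop would spin; skip files you know are done
        done < files.txt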

    Read the article

  • PDF download hanging from server with Firefox/Chrome

    - by Cruachan
    I have a Windows 2008 R2 (virtual) server running a number of websites. My client has uploaded several PDFs by FTP to a download directory, from where they can be retrieved via a web page. This works fine in IE and Safari, but when attempting to download with Firefox or Chrome, both browsers hang, and Firefox shows 'Stopped' in the status bar at the bottom of the page. We've tried this on several PCs at different locations, so I think this is a server problem. Why would this be? Is there some configuration setting I need to change?

    Read the article

  • How to download a page from source code

    - by newBie
    I need to download a page that is linked from another page's source code. For example, from the source of the Cellini's Italian Restaurant page I want to download "/len/aaproximat...php". I didn't find a suitable regex for it, and I need to download that page. Can anyone help? I'm using VB.NET.
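
    A minimal VB.NET sketch of the idea (the base URL and the href pattern are assumptions): fetch the page source, pull the relative link out with a regex, and download it:

        Imports System.Net
        Imports System.Text.RegularExpressions

        Module LinkDownloader
            Sub Main()
                Dim baseUrl As String = "http://www.example.com"  ' placeholder
                Using client As New WebClient()
                    Dim html As String = client.DownloadString(baseUrl)

                    ' Grab the first href pointing under /len/ (assumed pattern)
                    Dim m As Match = Regex.Match(html, "href=""(/len/[^""]+\.php)""")
                    If m.Success Then
                        Dim pageUrl As String = baseUrl & m.Groups(1).Value
                        client.DownloadFile(pageUrl, "page.php")
                    End If
                End Using
            End Sub
        End Module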

    Read the article

  • 14 WordPress Photo Blog & Portfolio Themes

    - by Aditi
    The best thing you can do to preserve your memories is to capture them. Photographs can help you relive all those sweet moments you had with your special someone or the ones closest to you. With the sudden explosion in the number of blogs on the blogosphere, it was quite obvious that many bloggers would like to share their most cherished memories on their blogs. We saw blogs full of images along with intricate details, and now we are presenting some WordPress themes to help you showcase your photography or make a photo blog, so that you can share those small delights you captured with your special ones, no matter where they are. These WordPress photo blog themes are not just for personal use, as some of them have been designed especially for professional use.
    - Graphix: $69 Single & $149 Developer Package | DownLoad
    - DeepFocus: $39 Package | DownLoad
    - ReCapture: $50 or $75 Package | DownLoad
    - PhotoGraphic: $50 or $75 Package | DownLoad
    - PhotoLand: $39 Single & $99 Developer Package | DownLoad
    - SimplePress: perfect theme for showcasing your portfolio, very simple and easy to navigate, with lots of features. $39 Single & $99 Developer Package | DownLoad
    - ePhoto: $39 Single & $99 Developer Package | DownLoad
    - Outline: $50 or $75 Package | DownLoad
    - Gallery: features a simple options panel for easy setup, automatic resizing and cropping for thumbnails, and 5 colour styles. $49 | DownLoad
    - eGallery: one of the best themes to showcase your images, with some features you don't see in other themes of this kind. It's particularly nice if you want to encourage social interaction, as readers can rate and comment on your images. Compatible with all major web browsers. $39 | DownLoad
    - Photoblog: $49 | DownLoad
    - Ultra Web Studio: $30 | DownLoad
    - Showtime: ultimate WordPress theme for creating your web portfolio, with 3 different styles. $40 | DownLoad
    - Boomerang: $35 | DownLoad
    Related posts: 6 PhotoBlog Portfolio WordPress Themes, Wootube WordPress Video Blog Theme, 7 Portfolio WordPress Themes

    Read the article

  • [ASP.NET] Odd HttpRequest behaviour

    - by barguast
    I have a web service which runs with an HttpHandler class. In this class, I inspect the request stream for form / query string parameters. In some circumstances, it seemed as though these parameters weren't getting through. After a bit of digging around, I came across some behaviour I don't quite understand. See below:

        // The request contains 'a=1&b=2&c=3'

        // TEST ONLY: Read the entire request
        string contents;
        using (StreamReader sr = new StreamReader(context.Request.InputStream))
        {
            contents = sr.ReadToEnd();
        }
        // Here 'contents' is usually correct - containing 'a=1&b=2&c=3'.
        // Sometimes it is empty.

        string a = context.Request["a"];
        // Here, a = null, regardless of whether the 'contents' variable
        // above is correct.

    Can anyone explain to me why this might be happening? I'm using a .NET WebClient and UploadDataAsync to perform the request on the client, if that makes any difference. If you need any more information, please let me know.
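
    One plausible explanation, offered as an assumption based on the client code described: ASP.NET only parses the body into Request.Form (and thus Request["a"]) when the request carries Content-Type: application/x-www-form-urlencoded, and WebClient.UploadDataAsync does not set that header by itself. A sketch of the client side (URL is a placeholder):

        // Hypothetical client-side fix: declare the body as form data so
        // the server parses it into Request.Form / Request["a"].
        WebClient client = new WebClient();
        client.Headers[HttpRequestHeader.ContentType] =
            "application/x-www-form-urlencoded";
        byte[] body = Encoding.UTF8.GetBytes("a=1&b=2&c=3");
        client.UploadDataAsync(new Uri("http://example.com/handler.ashx"), body);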

    Read the article

  • Handling multiple async HTTP requests in Silverlight serially

    - by Jeb
    Due to the async nature of HTTP access via WebClient or HttpWebRequest in Silverlight 4, when I want to do multiple HTTP GETs/POSTs serially, I find myself writing code that looks like this:

        doFirstGet(someParams, () => {
            doSecondGet(someParams, () => {
                doThirdGet(...
            }
        });

    Or something similar: I'll end up nesting subsequent calls within callbacks, usually implemented using lambdas of some sort. Even if I break things out into Actions or separate methods, it still ends up being hard to read. Does anyone have a clean solution for executing multiple HTTP requests in SL4 serially? I don't need things to be synchronous; I just want serial execution.
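
    One hedged sketch of flattening the nesting: keep a queue of steps, where each step receives a "next" continuation and the async completion handler invokes it to start the following step. The names here are illustrative, not an established API:

        // Runs a list of async steps one after another.
        public static class Serial
        {
            public static void Run(Queue<Action<Action>> steps)
            {
                if (steps.Count == 0) return;
                Action<Action> step = steps.Dequeue();
                step(() => Run(steps));  // each step triggers the next on completion
            }
        }

        // Usage sketch: each lambda starts an async request and calls
        // 'next' from its *Completed handler.
        var steps = new Queue<Action<Action>>();
        steps.Enqueue(next => doFirstGet(someParams, next));
        steps.Enqueue(next => doSecondGet(someParams, next));
        steps.Enqueue(next => doThirdGet(someParams, next));
        Serial.Run(steps);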

    Read the article

  • Inconsistent file downloads of (what should be) the same file

    - by Austin A.
    I'm working on a system that archives large collections of timestamped images. Part of the system deals with saving an image to a growing .zip file. This morning I noticed that the log system said an image was successfully downloaded and placed in the zip file, but when I downloaded the .zip (from an apache alias running on our server), the images didn't match the log. For example, although the log said that camera 3484 captured images on January 17, 2011, when I download from the apache alias, the downloaded zip file only contains images up to January 14.

    So, I sshed onto the server and unzipped the file in its own directory, and that zip file has images from January 14 to today (January 17). What strikes me as odd is that this should be the exact same file as the one I downloaded from the apache alias.

    Other experiments: I scp-ed the file from the server to my local machine, and the zip file has the newer images. But when I use an SCP client (in this case, Fugu for OS X), I get the zip file with the older images.

    In short: unzipping a file on the server, or after downloading through scp or wget, gives one zip file, but unzipping a file downloaded from Chrome, Firefox, or an SCP client gives a different zip file, when they should be exactly the same.

    Unzipping on the server...

        [user@server ~]$ cd /export1/amos/images/2011/84/3484/00003484/
        [user@server 00003484]$ ls -la
        total 6180
        drwxr-sr-x 2 user groupname      24 Jan 17 11:20 .
        drwxr-sr-x 4 user groupname      36 Jan 11 19:58 ..
        -rw-r--r-- 1 user groupname 6309980 Jan 17 12:05 2011.01.zip
        [user@server 00003484]$ unzip 2011.01.zip
        Archive:  2011.01.zip
         extracting: 20110114_140547.jpg
         extracting: 20110114_143554.jpg
        replace 20110114_143554.jpg? [y]es, [n]o, [A]ll, [N]one, [r]ename: y
         extracting: 20110114_143554.jpg
         extracting: 20110114_153458.jpg
        (...bunch of files...)
         extracting: 20110117_170459.jpg
         extracting: 20110117_173458.jpg
         extracting: 20110117_180501.jpg

    Using wget through the apache alias:

        local:~ user$ wget http://example.com/zipfiles/2011/84/3484/00003484/2011.01.zip
        --12:38:13--  http://example.com/zipfiles/2011/84/3484/00003484/2011.01.zip
                   => `2011.01.zip'
        Resolving example.com... ip.ip.ip.ip
        Connecting to example.com|ip.ip.ip.ip|:80... connected.
        HTTP request sent, awaiting response... 200 OK
        Length: 6,327,747 (6.0M) [application/zip]
        100%[======================================>] 6,327,747  1.03M/s  ETA 00:00
        12:38:56 (143.23 KB/s) - `2011.01.zip' saved [6327747/6327747]
        local:~ user$ unzip 2011.01.zip
        Archive:  2011.01.zip
         extracting: 20110114_140547.jpg
        (... same as before...)
         extracting: 20110117_183459.jpg

    Using scp to grab the zip:

        local:~ user$ scp user@server:/export1/amos/images/2011/84/3484/00003484/2011.01.zip .
        2011.01.zip    100% 6179KB 475.3KB/s   00:13
        local:~ user$ unzip 2011.01.zip
        Archive:  2011.01.zip
         extracting: 20110114_140547.jpg
        (...same as before...)
         extracting: 20110117_183459.jpg

    Using Fugu to download 2011.01.zip from /export1/amos/images/2011/84/3484/00003484/ gives images 20110113_090457.jpg through 201100114_010554.jpg. Using Firefox to download 2011.01.zip from http://example.com/zipfiles/2011/84/3484/00003484/2011.01.zip gives images 20110113_090457.jpg through 201100114_010554.jpg. Using Chrome gives the same results as Firefox.

    Relevant section from apache httpd.conf:

        # ScriptAlias: This controls which directories contain server scripts.
        # ScriptAliases are essentially the same as Aliases, except that
        # documents in the realname directory are treated as applications and
        # run by the server when requested rather than as documents sent to the client.
        # The same rules about trailing "/" apply to ScriptAlias directives as to
        # Alias.
        #
        ScriptAlias /cgi-bin/ "/var/www/cgi-bin/"

        Alias /zipfiles/ /export1/amos/images/

    Read the article

  • Windows Phone 7, download xml over ssl with authentication

    - by Snake
    Hi, I'm trying to download a file from my provider. The URL is protected with a basic username and password, and everything is sent over SSL. So I try to do this:

        WebClient proxy = new WebClient();
        proxy.DownloadStringCompleted += (o, dscea) => System.Diagnostics.Debugger.Break();
        proxy.Credentials = new NetworkCredential("username", "password");
        proxy.DownloadStringAsync(new Uri("https://..../.../data.xml"));

    As you can see, I try to authenticate. The data IS correct, and the code works when I try to download something from Twitter. What am I forgetting in order to connect to this XML file?
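
    One possibility, offered as an assumption about the server's behaviour: WebClient.Credentials only answers an explicit 401 challenge, while some servers expect the Authorization header preemptively on the first request. A sketch of sending Basic credentials up front (assuming the Windows Phone stack allows setting the header directly; needs System.Text and System.Diagnostics):

        WebClient proxy = new WebClient();
        // Hypothetical fix: build the Basic auth header ourselves so it is
        // sent with the first request instead of waiting for a 401.
        string token = Convert.ToBase64String(
            Encoding.UTF8.GetBytes("username:password"));
        proxy.Headers["Authorization"] = "Basic " + token;
        proxy.DownloadStringCompleted += (o, e) => Debug.WriteLine(e.Result);
        proxy.DownloadStringAsync(new Uri("https://example.com/data.xml"));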

    Read the article

  • How do I download a large file (via HTTP) in .NET

    - by nickcartwright
    I need to download a LARGE file (2GB) over HTTP in a C# console app. The problem is, after about 1.2GB, the app runs out of memory. Here's the code I'm using:

        WebClient request = new WebClient();
        request.Credentials = new NetworkCredential(username, password);
        byte[] fileData = request.DownloadData(baseURL + fName);

    As you can see, I'm reading the file directly into memory. I'm pretty sure I could solve this if I were to read the data back from HTTP in chunks and write it to a file on disk. Does anyone know how I could do this?
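
    A sketch of the chunked approach (URL, credentials, and file names are placeholders): stream the response body through a fixed-size buffer into a FileStream, so memory use stays constant regardless of file size:

        using System;
        using System.IO;
        using System.Net;

        class Downloader
        {
            static void Main()
            {
                var request = (HttpWebRequest)WebRequest.Create("http://example.com/big.bin");
                request.Credentials = new NetworkCredential("user", "pass");

                using (WebResponse response = request.GetResponse())
                using (Stream input = response.GetResponseStream())
                using (FileStream output = File.Create("big.bin"))
                {
                    byte[] buffer = new byte[64 * 1024];  // 64 KB chunks
                    int read;
                    while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
                    {
                        output.Write(buffer, 0, read);
                    }
                }
            }
        }

    For what it's worth, WebClient.DownloadFile also writes straight to disk rather than buffering the whole response, so it may be enough on its own.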

    Read the article

  • Downloading a file from the internet with '&' in URL using wget

    - by matt_tm
    Hi, I'm trying to download a file from a URL that looks like this:

        http://pdf.example.com/filehandle.ashx?p1=ABC&p2=DEF.pdf

    In the browser, this link prompts me to download a file called x.pdf irrespective of what DEF is (but 'x.pdf' has the right content). However, using wget I get the following:

        >wget.exe http://pdf.example.com/filehandle.ashx?p1=ABC&p2=DEF.pdf
        SYSTEM_WGETRC = c:/progra~1/wget/etc/wgetrc
        syswgetrc = C:\Program Files\GnuWin32/etc/wgetrc
        --2011-01-06 07:52:05--  http://pdf.example.com/filehandle.ashx?p1=ABC
        Resolving pdf.example.com... 99.99.99.99
        Connecting to pdf.example.com|99.99.99.99|:80... connected.
        HTTP request sent, awaiting response... 500 Internal Server Error
        2011-01-06 07:52:08 ERROR 500: Internal Server Error.

        'p2' is not recognized as an internal or external command,
        operable program or batch file.

    (The unquoted & is taken by the Windows shell as a command separator, so only the p1 part reaches wget and the rest is run as a separate command.) This is on a Windows Vista system.

    Edit 1: quoting the URL fetches the file:

        >wget.exe "http://pdf.example.com/filehandle.ashx?p1=ABC&p2=DEF.pdf"
        SYSTEM_WGETRC = c:/progra~1/wget/etc/wgetrc
        syswgetrc = C:\Program Files\GnuWin32/etc/wgetrc
        --2011-02-06 10:18:31--  http://pdf.example.com/filehandle.ashx?p1=ABC&p2=DEF.pdf
        Resolving pdf.example.com... 99.99.99.99
        Connecting to pdf.example.com|99.99.99.99|:80... connected.
        HTTP request sent, awaiting response... 200 OK
        Length: 4568 (4.5K) [image/JPEG]
        Saving to: `filehandle.ashx@p1=ABC&p2=DEF.pdf'

        100%[======================================>] 4,568  --.-K/s  in 0.1s

        2011-02-06 10:18:33 (30.0 KB/s) - `filehandle.ashx@p1=ABC&p2=DEF.pdf' saved [4568/4568]

    Read the article

  • Logging into Local Statusnet instance on Apache causes browser to download a file

    - by DilbertDave
    I've installed StatusNet 0.9.1 on a Windows Server via the WAMP stack, and on the whole it seems to be fine. However, when logging in using IE7 or Chrome, the browsers invoke a file download, i.e. the File Download dialog is displayed. In IE7 the file is called notice, with the content below (some parts starred out):

        <?xml version="1.0" encoding="UTF-8"?>
        <OpenSearchDescription xmlns="http://a9.com/-/spec/opensearch/1.1/">
          <ShortName>Mumble Notice Search</ShortName>
          <Contact>david.carson@*****.com</Contact>
          <Url type="text/html" method="get" template="http://voice.*****.com/mumble/search/notice?q={searchTerms}"></Url>
          <Image height="16" width="16" type="image/vnd.microsoft.icon">http://voice.*****.com/mumble/favicon.ico</Image>
          <Image height="50" width="50" type="image/png">http://voice.******.com/mumble/theme/cloudy/logo.png</Image>
          <AdultContent>false</AdultContent>
          <Language>en_GB</Language>
          <OutputEncoding>UTF-8</OutputEncoding>
          <InputEncoding>UTF-8</InputEncoding>
        </OpenSearchDescription>

    In Chrome (Linux and Windows!) the file is called people and contains similar XML. This is not an issue when logging in using Firefox. This is obviously a configuration issue, but I'm not having much luck tracking it down. I tested the previous version of StatusNet on an Ubuntu Server VM on our network, and it worked fine for months. Thanks in advance.

    Read the article

  • Tool to search, watch and download YouTube videos on Ubuntu

    - by Mike
    I am looking for a tool like MacTubes for Ubuntu. MacTubes is a Mac app that can search, watch, and download YouTube videos. The great advantage of this app is that it searches YouTube, shows all the videos available there, and lets you select them all and download everything quickly and easily. MacTubes is so awesome that it also converts videos to MP4 and downloads the HD version of a video when available. I use this on my Mac, but my sister uses Linux, and I am looking for something like that for her. I have tried Miro, but Miro's search feature is bad as hell: I search for something using MacTubes and it shows me 1600 results, while the same search under Miro shows me 40 results, and Miro never shows more than one page of results. I would prefer an application with a GUI instead of the command line, because my sister's proficiency in Linux is not that good. Any suggestions? Thanks.

    Read the article

  • Passive mode FTP file download hangs from specific machine

    - by chiptuned
    I have a server, an AWS instance, that just cannot download files from one specific FTP server. I can connect to the FTP server fine and run some commands, but when I request a file, it just hangs. Here is the debug output of the base Linux ftp client after login:

        ---> SYST
        215 UNIX Type: Apache FtpServer
        Remote system type is UNIX.
        ftp> get outgoing/catalog.gz catalog.gz
        local: catalog.gz remote: outgoing/catalog.gz
        ---> PASV
        227 Entering Passive Mode (64,156,167,125,135,191)
        ---> RETR outgoing/catalog.gz
        150 File status okay; about to open data connection.

    That's it. Then it just sits there and nothing transfers. I have verified that a data connection is made, but the client gets no data:

        ? ss -nt dst 64.156.167.125
        State    Recv-Q  Send-Q  Local Address:Port      Peer Address:Port
        ESTAB    0       0       10.185.147.150:41190    64.156.167.125:21
        ESTAB    0       0       10.185.147.150:48871    64.156.167.125:48557

    The FTP server is not in my control, and downloads from other FTP servers in passive mode have worked. Active mode does not work, as the system is behind a firewall. Every FTP client I've tried has the same problem. The download works from other systems, even from other AWS instances I have with the same Security Group (though not necessarily the same distro or config). I understand it may be some issue on the server side, but I want to know what it is about my particular machine that makes the transfer hang when it works on every other machine I can get my hands on. Please let me know what the culprit on the client side could be, or what else to look at.

    Read the article
