Search Results

Search found 12327 results on 494 pages for 'attachment download'.

Page 43/494 | < Previous Page | 39 40 41 42 43 44 45 46 47 48 49 50  | Next Page >

  • Using psftp to upload and download files

    - by macha
    Hello, I am trying to upload and download files between my desktop and my server. After some searching I downloaded psftp. I used to use FileZilla, but I cannot install it on this desktop for a few reasons, whereas psftp (like PuTTY) is just a standalone executable for file transfer. After going through this page, http://www.math.tamu.edu/~mpilant/math696/psftp.html, I understood that put and get are the two commands I would use to upload and download files. But when I log on to the server and say "get filename", it throws back the error "local: unable to open filename". I tried other files too and get the same error every time. Am I making a mistake, or is it a problem with the executable? I did not find relevant tags for this topic; could somebody suggest the right forum for this issue?
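
    A minimal psftp session sketch of the usual fix: "local: unable to open <name>" on a get generally means psftp cannot create the file in its current local directory, so switch to a writable local folder with lcd before downloading (host name and paths below are placeholders):

        C:\> psftp user@yourserver.example.com
        psftp> lpwd                          (prints the current local directory)
        psftp> lcd C:\Users\you\Downloads    (switch to a writable local folder)
        psftp> cd /remote/path
        psftp> get report.pdf                (now saved into the local folder above)
        psftp> put notes.txt
        psftp> quit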

    Read the article

  • Pipeline For Downloading and Processing Files In Unix/Linux Environment With Perl

    - by neversaint
    I have a list of file URLs I want to download:

        http://somedomain.com/foo1.gz
        http://somedomain.com/foo2.gz
        http://somedomain.com/foo3.gz

    What I want to do for each file is: download foo1, foo2, ... in parallel with wget and nohup, and every time a download completes, process the file with myscript.sh. What I have is this:

        #! /usr/bin/perl
        @files = glob("foo*.gz");
        foreach $file (@files) {
            my $downurls = "http://somedomain.com/".$file;
            system("nohup wget $file &");
            system("./myscript.sh $file >> output.txt");
        }

    The problem is that I can't tell when a file has finished downloading, so myscript.sh doesn't get executed properly. What's the right way to achieve this?
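
    For comparison, a minimal sketch of the sequencing idea (written in Python purely for illustration): let the download call block until wget exits, then run the processing step, so each file is only handed to myscript.sh once it is fully on disk. The URL list is taken from the question.

        import subprocess

        urls = [
            "http://somedomain.com/foo1.gz",
            "http://somedomain.com/foo2.gz",
            "http://somedomain.com/foo3.gz",
        ]

        with open("output.txt", "a") as log:
            for url in urls:
                filename = url.rsplit("/", 1)[-1]
                # subprocess.run blocks until wget exits, so the file is
                # complete on disk before the processing step starts.
                subprocess.run(["wget", "-q", url], check=True)
                subprocess.run(["./myscript.sh", filename], stdout=log, check=True)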

    Read the article

  • Best free site/blog to put video tutorial

    - by nash
    Hi, I am planning to upload a video tutorial on a particular software technology. The videos will total around 600 MB to 1 GB, and I want to offer them for free for anyone to download. Is there any site or blog where I can put them? I am planning to split the videos into parts and zip them. Does Blogger or any other CMS let me upload files to their site and post download links, or is buying web space the only option? I am not keen on using YouTube, as I specifically want users to download the files by clicking links rather than watch them online. I was also thinking of just uploading the videos to RapidShare/Megaupload/MediaFire and putting the links in a Blogger post. Any suggestions from you guys?

    Read the article

  • pdf file download

    - by rex
    Once a member signs up via a form, they can download the PDF file. Currently the PDF is linked with plain HTML, which means that anyone with the URL can download it. What's the best way to protect the page so that the user doesn't see the URL of the PDF file? I tried creating a Flash file and opening the URL from Flash using the following:

        var myPDF = new URLRequest("temp/test.pdf");
        navigateToURL(myPDF);

    but it opens a new window and shows the URL! Is there a way to make the browser download the file instead of opening it in a new browser window? Thanks, Rex
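
    The usual approach, independent of Flash, is to serve the PDF through a server-side script that sends a Content-Disposition: attachment header after checking the member's session, so the real file path never appears in the page. A minimal sketch of the idea, shown with Python's standard http.server purely for illustration (the path, port and filename are placeholders):

        from http.server import BaseHTTPRequestHandler, HTTPServer

        PDF_PATH = "temp/test.pdf"   # real location, never exposed to the client

        class DownloadHandler(BaseHTTPRequestHandler):
            def do_GET(self):
                # A real implementation would verify the member's session here.
                with open(PDF_PATH, "rb") as f:
                    data = f.read()
                self.send_response(200)
                self.send_header("Content-Type", "application/pdf")
                # 'attachment' asks the browser to save rather than display.
                self.send_header("Content-Disposition", 'attachment; filename="guide.pdf"')
                self.send_header("Content-Length", str(len(data)))
                self.end_headers()
                self.wfile.write(data)

        HTTPServer(("", 8000), DownloadHandler).serve_forever()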

    Read the article

  • Getting attachments from a mail account with .NET

    - by MarceloRamires
    I'd like a free .NET library to fetch attachments from an account (such as Gmail or others), via IMAP4 or otherwise, and save them to a folder. Ideally it would let me list the attachments and download only selected ones (filtering by extension, name and/or size). I've already done this with a trial version of EAGetMail, but buying the unlimited version of that library isn't suitable for what I'm trying to do (I didn't realise this particular functionality was among the time-limited ones).

    --- [edit - Higuchi] ---

    I'm using the following code:

        Dim cl As New Pop3Client()
        cl.UserName = "[email protected]"
        cl.Password = "mypassword"
        cl.ServerName = "pop.gmail.com"
        cl.AuthenticateMode = Pop3AuthenticateMode.Pop
        cl.Ssl = False
        cl.Authenticate()                          ' takes a while, but passes even if the password is wrong
        Dim mg As Pop3Message = cl.GetMessage(1)   ' throws an exception: Message = "Pop3 connection is closed"

    As the comments show, I am having trouble connecting and getting the first e-mail. Any help?
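
    For comparison, a minimal sketch of the same task using only Python's standard library (poplib plus the email package); note that Gmail's POP3 server requires SSL on port 995, which is also the most likely reason the connection above gets closed. Credentials are placeholders and an ./attachments folder is assumed to exist.

        import os
        import poplib
        from email import message_from_bytes

        conn = poplib.POP3_SSL("pop.gmail.com", 995)   # Gmail requires SSL here
        conn.user("user@example.com")
        conn.pass_("password")

        count, _ = conn.stat()
        for i in range(1, count + 1):
            _, lines, _ = conn.retr(i)
            msg = message_from_bytes(b"\r\n".join(lines))
            for part in msg.walk():
                name = part.get_filename()
                # Filter by extension/name/size here as needed.
                if name and name.lower().endswith((".pdf", ".zip")):
                    with open(os.path.join("attachments", name), "wb") as f:
                        f.write(part.get_payload(decode=True))
        conn.quit()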

    Read the article

  • iPhone SDK: My server doesn't support range header requests, does that mean it's impossible for me t

    - by Jessica
    I am currently developing an iPhone app which involves downloads of up to 300 MB. I have been told by my hosting service that my server does not support range header requests. However, when I download a file from my server using a download client such as Safari's download manager, resume options are available and work. Does this mean they have a workaround for servers that don't support range header requests, one that I could implement in my iPhone app? Or are they using a technique too complex to implement on the iPhone? If you know of a technique, code samples would be greatly appreciated.
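
    A quick way to check what the server actually supports is to send a Range request yourself and look at the response: 206 Partial Content means byte ranges (and therefore resumable downloads) work, while 200 means the header was ignored. A small sketch using the Python requests library (the URL is a placeholder):

        import requests

        url = "http://example.com/bigfile.zip"
        resp = requests.get(url, headers={"Range": "bytes=0-1023"}, stream=True)

        if resp.status_code == 206:
            print("Server honours Range requests; resume is possible.")
            print("Content-Range:", resp.headers.get("Content-Range"))
        else:
            # 200 means the whole file is being sent and ranges are ignored.
            print("No byte-range support (status %d)." % resp.status_code)
        resp.close()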

    Read the article

  • Large file download for a Rails project

    - by Horace Ho
    A client project goes online in two months. One of the changed requirements is to support large-file downloads (10 to 15 MB per RAW camera file, with 1,000 to 5,000 downloads expected per day) for their customers worldwide. The process will be:

        - an upload screen via paperclip into the Rails app's local public folder
        - an hourly task that uploads the files to web storage (S3?)
        - updating the download URL from the paperclip URL to the web storage URL

    Questions: is there a gem/plug-in for this purpose? If not, is there a gem/plug-in for S3 you would recommend? About the storage provider: is S3 recommended, or is there another service you would suggest? The baseline is that the client's web server does not and will not have the bandwidth to handle the downloads. Thanks
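
    A minimal sketch of the hourly hand-off step, shown with boto3 (the AWS SDK for Python) purely to illustrate the flow; a Rails app would normally do the same thing with an S3 gem such as aws-sdk or paperclip's S3 storage option. Bucket name and paths are placeholders.

        import boto3

        s3 = boto3.client("s3")
        BUCKET = "client-raw-downloads"        # placeholder bucket name

        def push_to_s3(local_path, key):
            # Upload the RAW file that paperclip stored locally...
            s3.upload_file(local_path, BUCKET, key)
            # ...then return a time-limited URL to store as the download link,
            # so the client's own server never serves the bytes itself.
            return s3.generate_presigned_url(
                "get_object",
                Params={"Bucket": BUCKET, "Key": key},
                ExpiresIn=3600,
            )

        print(push_to_s3("public/system/raw/0001/IMG_0001.CR2", "raw/IMG_0001.CR2"))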

    Read the article

  • Download data in background with iOS4

    - by Sagar
    According to the notes for its latest update (v2.5), the Kindle app has "support for continue downloading books while the app is in the background on iOS 4 devices". How is it possible to download content in the background? As per the iOS multitasking documentation, only audio, VoIP and location updates can run in the background, and I've also made sure that an NSURLConnection doesn't keep downloading new data once the app goes to the background. So how does the Kindle app do it?

    Edit: I haven't checked the Kindle app on an iOS 4 multitasking-enabled device myself, so if anyone can tell me (and the community) what exactly the Kindle app does to keep downloading, that would be very helpful.

    Read the article

  • Downloading a csv file in django

    - by spyder
    I am trying to download a CSV file using HttpResponse, to make sure that the browser treats it as an attachment. I followed the instructions provided here, but my browser does not prompt a "Save As" dialog. I cannot figure out what is wrong with my function. All help is appreciated.

        def savefile(request):
            try:
                myfile = request.GET['filename']
                filepath = settings.MEDIA_ROOT + 'results/'
                destpath = os.path.join(filepath, myfile)
                response = HttpResponse(FileWrapper(file(destpath)), mimetype='text/csv')
                response['Content-Disposition'] = 'attachment; filename="%s"' %(myfile)
                return response
            except Exception, err:
                errmsg = "%s"%(err)
                return HttpResponse(errmsg)

    Happy Pat's day!
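
    For reference, the smallest view that reliably triggers the Save As prompt looks like the sketch below (written against the same era's API, where mimetype is the keyword argument; newer Django spells it content_type). If a stripped-down version like this prompts correctly but the view above does not, the usual suspects are the request being made via AJAX (in which case no dialog ever appears) or the except branch silently returning plain HTML.

        from django.http import HttpResponse

        def export_csv(request):
            # Hypothetical minimal view: build the CSV in memory so no file
            # path or FileWrapper is involved at all.
            rows = [["id", "name"], ["1", "alpha"], ["2", "beta"]]
            body = "\n".join(",".join(r) for r in rows)

            response = HttpResponse(body, mimetype="text/csv")   # content_type= on newer Django
            response["Content-Disposition"] = 'attachment; filename="results.csv"'
            return response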

    Read the article

  • Download Specific Images

    - by thebourneid
    I'm trying to find and download specific images (the front and back cover) from a website, but whatever I do I only ever end up downloading one of them. What should I change in my code to download both of them when they are found?

        while ($title_found =~ /'(http:\/\/images.blu-ray.com\/movies\/covers\/\d+_.*?)'/gis) {
            $url = getSite($1);
            if ($title_found =~ /front/) {
                $filename = 'front.jpg';
            } elsif ($title_found =~ /back/) {
                $filename = 'back.jpg';
            }
        }
        my $dir = 'somepath'.$filename;
        open F, ">", $dir;
        binmode F;
        print F $url;
        close F;
        return 0;
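
    For comparison, a minimal sketch of a loop that saves every match, written in Python purely for illustration; the key point is that the output file is opened and written inside the loop, once per matched URL, rather than once after the loop ends. The page URL is a placeholder.

        import re
        import urllib.request

        page = "http://www.blu-ray.com/movies/some-title/"   # placeholder
        html = urllib.request.urlopen(page).read().decode("utf-8", "ignore")

        pattern = r"'(http://images\.blu-ray\.com/movies/covers/\d+_.*?)'"
        for url in re.findall(pattern, html):
            # Decide the local name from this particular match.
            name = "front.jpg" if "front" in url else "back.jpg" if "back" in url else None
            if name is None:
                continue
            # Download and write inside the loop, so both covers get saved.
            with open(name, "wb") as f:
                f.write(urllib.request.urlopen(url).read())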

    Read the article

  • IE cannot download file with unicode pathname

    - by MM
    I have a web app that lets users upload and download image files by pressing buttons on a web page. A user of this page reports that IE 7 and 8 fail to download files whose pathnames contain Unicode characters. IE prompts them with a dialog stating: "Internet Explorer cannot download (file) at (webserver).". Unfortunately I have not been able to reproduce the problem with those versions on my own machine. My question is: what could cause this, and how can I prevent it? I have read about problems with cache control (I currently have it set to no-cache); however, I am not using HTTPS, and the problem only occurs with file names containing Unicode characters.
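
    One widely used workaround (exact behaviour varies by IE version, so treat this as an assumption to test) is to send the filename percent-encoded as UTF-8 in the Content-Disposition header, which IE decodes back to the original name, and to add the RFC 5987 filename* form for other browsers. A small sketch of building such a header in Python; the filename is just an example:

        from urllib.parse import quote

        filename = "图片_01.png"                 # example Unicode filename
        encoded = quote(filename)                # UTF-8 percent-encoding

        # Percent-encoded name for older IE, plus the RFC 5987 form for others.
        header = 'attachment; filename="%s"; filename*=UTF-8\'\'%s' % (encoded, encoded)
        print("Content-Disposition:", header)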

    Read the article

  • Removing response.End() includes the HTML of the page in the download

    - by vinay_rockin
        HttpResponse response = Context.Http.Response;
        response.Headers.Clear();
        response.Clear();
        response.ContentType = Component.OverrideMimeType ? Component.MimeType : "application/download";
        response.AppendHeader("Content-Disposition", String.Format("attachment; filename=\"{0}\"", HttpUtility.HtmlEncode(Path.GetFileName(file.FileName))));
        response.OutputStream.Write(file.Contents, 0, file.Contents.Length);
        response.Flush();
        // response.End();

    If I use response.End() it throws an exception, and if I comment out response.End() the download includes the HTML of the page the download link sits on. I want to download the file without introducing this extra HTML. Any idea how to fix this?

    Read the article

  • Need php script to download a file on a remote server and save locally

    - by bigLarry
    I'm trying to download a file from a remote server and save it to a local subdirectory. The following code seems to work for small files (under 1 MB), but larger files just time out and don't even begin to download.

        <?php
        $source = "http://someurl.com/afile.zip";
        $destination = "/asubfolder/afile.zip";
        $data = file_get_contents($source);
        $file = fopen($destination, "w+");
        fputs($file, $data);
        fclose($file);
        ?>

    Any suggestions on how to download larger files without interruption?
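
    file_get_contents pulls the whole remote file into memory before a single byte is written locally, which is what hurts with large files; the usual fix is to stream it to disk in chunks (in PHP that means an fopen/fread loop or cURL writing straight to a file handle). A minimal sketch of the chunked idea, shown in Python purely for illustration:

        import shutil
        import urllib.request

        source = "http://someurl.com/afile.zip"
        destination = "/asubfolder/afile.zip"

        # Stream the response to disk in fixed-size chunks instead of
        # loading the whole file into memory first.
        with urllib.request.urlopen(source, timeout=60) as resp, \
             open(destination, "wb") as out:
            shutil.copyfileobj(resp, out, length=64 * 1024)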

    Read the article

  • Download HP Power Protector for ESXi

    - by Mark Henderson
    The HP PowerProtector user guide states that to install the HP PowerProtector client on an ESXi host: Download the latest version of HPPP from the HP website (http://www.hp.com/go/rackandpower). The ESXi Server is automatically detected, and a shutdown command script is generated. However, in typical HP fashion, after clicking through no less than 6 different links to get to the downloads page, I am presented with:

        http://h18004.www1.hp.com/products/servers/proliantstorage/power-protection/software/power-protector/pp-dl.html

        HP Power Protector (HPPP) - Windows
        HP Power Protector (HPPP) - Linux x86
        HP Power Protector (HPPP) - Linux x64
        HP Power Protector (HPPP) - Linux IA64
        HP Power Protector (HPPP) - HPUX

    The Linux packages contain an RPM and in no way resemble what is in the HP documentation. None of these are labelled for ESXi. Does anyone know where or how to get the HP Power Protector ESXi client installed?

    Read the article

  • Where to download n900 songbird client?

    - by Walter White
    Hi all, I am confused... is Songbird releasing a client for the N900? This page has been up for a few months, and I've been eagerly awaiting a download, but I cannot find the link: http://getsongbird.com/gadgets/ Does anyone have any ideas here? The media player on the N900 works well, but it doesn't connect to Last.fm. It isn't a deal breaker, but it would be a nice-to-have. Songbird seems to run better than Rhythmbox and offers a cleaner, more polished look. Thanks, Walter

    Read the article

  • Downloads on Vista Home Premium start off fast but slow down to 0 Kb/s and hang

    - by user66265
    I have Windows Vista Home Premium on my computer, and every time I go to download something it starts out at about 1.5 Mb/s and stays there for about 3 seconds, then it slows to 800 Kb/s and keeps dropping until it reaches 0 Kb/s and hangs. I've tried just about everything I can find, such as uninstalling all firewalls/antivirus, running the netsh commands to disable RSS, auto-tuning and TCP chimney, and updating everything, but it still keeps happening. I'd prefer not to reinstall, but if I have to then I have to...

    EDIT: Figured it out, the router needed a firmware update.

    Read the article

  • Automatically save/download e-mail body to disk

    - by CatamountJack
    Is there a program that will allow me to connect to my mail server (IMAP) and automatically save certain new e-mails to disk? Multiple times a day I receive automated e-mail updates about pending jobs from a system that processes some information for us. The data in these e-mails is written as plain-text within the body of the message. I would like to download the newest message, parse it, and display it on my desktop. The last two parts I can manage ok - it's just the automatic downloading that is posing a challenge. I don't use Outlook (I do use Thunderbird), but would prefer not to have the client open to make this happen. I'm currently running Win7.
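
    If a small script is acceptable instead of a mail client, Python's standard imaplib and email modules can do this in a few lines and be run from Task Scheduler on Windows 7; a minimal sketch (the server, credentials and the sender used in the search criterion are placeholders):

        import imaplib
        import email

        conn = imaplib.IMAP4_SSL("imap.example.com")
        conn.login("user@example.com", "password")
        conn.select("INBOX")

        # Only unread messages from the processing system (sender is a placeholder).
        _, data = conn.search(None, '(UNSEEN FROM "jobs@example.com")')
        for num in data[0].split():
            _, msg_data = conn.fetch(num, "(RFC822)")
            msg = email.message_from_bytes(msg_data[0][1])
            # Take the plain-text body and save it to disk for later parsing.
            for part in msg.walk():
                if part.get_content_type() == "text/plain":
                    with open("job_update_%s.txt" % num.decode(), "wb") as f:
                        f.write(part.get_payload(decode=True))
                    break
        conn.logout()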

    Read the article

  • Prevent Windows Live Mail from downloading all messages from IMAP

    - by m8t
    Hello, I've recently been trying the Windows Live Mail client. Simple and beautiful. I have set up an IMAP account, and I'm used to a client downloading only the headers. However, Windows Live Mail automatically creates a list of tasks to download all messages from all folders when you close the client. Is it possible to avoid this? It's both a good and a bad thing: you can work offline and you have a backup, but it takes extremely long to complete; in fact I have about a hundred thousand emails, and this task can take a whole day. After looking through the settings I don't see anything relevant. Maybe you have an idea? Thank you, Mike

    Read the article

  • Apache rewrite rules cause a download dialog for the PHP file

    - by Shaihi
    I have Apache 2.2.17 from the WAMPServer 2.1 installation, and I am debugging a website entirely locally on my computer. I have the following rules in the .htaccess:

        # Use PHP5 Single php.ini as default
        AddHandler application/x-httpd-php5s .php

        Options +FollowSymlinks
        RewriteEngine on
        Rewritebase /
        RewriteRule ^bella/(.*)/(.*)$ beauty.php?beauty_id=$1 [L]
        RewriteRule ^(argentina|brasil|chile|colombia|espana|mexico|rep_dominicana|uruguay|venezuela|peru|bolivia|cuba|ecuador|panama|paraguay|puerto_rico)/$ country.php?name=$1 [L]
        RewriteRule ^(argentina|brasil|chile|colombia|espana|mexico|rep_dominicana|uruguay|venezuela|peru|bolivia|cuba|ecuador|panama|paraguay|puerto_rico)/(hi5|facebook|twitter|orkut)/$ socialnetw.php?country=$1&category=$2 [L]

    The problem: when I enable these rules and try to access http://localhost/index.php with Firefox, I get a download dialog for the PHP file. If I comment out the Rewrite* part of the .htaccess file, index.php loads fine, but navigation on the page is broken...
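
    A hedged guess to try: the AddHandler line names a handler type (application/x-httpd-php5s) that a shared host's "single php.ini" setup defines, but a stock local WAMP install usually does not; since WAMP's own httpd.conf already maps .php to PHP, one experiment is to drop that line and keep only the rewrite rules. This is an assumption to verify against the local PHP module config, not a confirmed fix.

        # .htaccess sketch with the shared-host handler line removed;
        # WAMP's httpd.conf is assumed to already map .php to mod_php.
        Options +FollowSymlinks
        RewriteEngine on
        RewriteBase /
        RewriteRule ^bella/(.*)/(.*)$ beauty.php?beauty_id=$1 [L]
        # ...country and social-network rules unchanged...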

    Read the article

  • Adobe Volume License for Indesign Digital Download Location?

    - by elistp
    We recently purchased a volume license for Adobe InDesign from Dell. We received an e-mail for the order that contains the serial number, but there is no information on how to obtain the InDesign installer from the Adobe licensing portal. This is our first time dealing with Adobe volume licensing, so I'm a bit lost as to what we're supposed to do. I've googled around a little and found an Adobe licensing portal, but I do not have access to it. Does anyone with experience of Adobe volume licensing know what we're supposed to do to get a download of our purchase?

    Read the article

  • where to download emacs manuals as offline html files

    - by Jisang Yoo
    When you press C-h i in Emacs, it shows what's called the top of the INFO tree, which links to all kinds of manuals: AUCTeX, Org Mode, Emacs, Emacs FAQ, Emacs Lisp Intro, Elisp, ... Is there a place where I can download all of them at once as HTML files? The GNU home page has links to some of them in HTML format:

        http://www.gnu.org/software/emacs/manual/elisp.html_node.tar.gz
        http://www.gnu.org/software/emacs/manual/emacs.html_node.tar.gz

    But I cannot find a link to a single tar.gz file packing all of them.
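
    The per-manual tarballs can at least be fetched and unpacked in one go; a small sketch using Python's standard library (the list holds the two manuals linked above; extending it to other manuals assumes they follow the same naming pattern, which should be verified):

        import tarfile
        import urllib.request

        BASE = "http://www.gnu.org/software/emacs/manual/"
        manuals = ["emacs", "elisp"]   # extend as needed

        for name in manuals:
            archive = "%s.html_node.tar.gz" % name
            urllib.request.urlretrieve(BASE + archive, archive)
            # Each tarball unpacks into its own *.html_node directory of HTML pages.
            with tarfile.open(archive, "r:gz") as tar:
                tar.extractall("emacs-manuals")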

    Read the article

  • Norton Security Suite Symantec Download Manager Error: "Error writing to disk"

    - by Stephen Pace
    My broadband provider (Comcast) decided to switch their included-with-service security suite from McAfee to Norton Security Suite. Their email directed me to a site that downloads the Symantec Download Manager (NortonDL.exe), and that part went fine. I'm running Windows 7 32-bit; running the application pops up the standard User Account Control prompt and the software is correctly identified as coming from Symantec. I answer 'Yes' to allow the install, and on launch I immediately get an "Error writing to disk" error. I searched the Internet for this error, but mainly found Comcast users complaining about the same issue with no resolution other than calling Symantec; no one described a successful workaround, and it appeared that most of the support calls took up to three hours. I'd like to avoid that if possible. Ideas? To be honest, I'm getting close to bagging this installation and just moving to Microsoft Security Essentials.

    Read the article

  • php file downloads instead of being processed with ajax on apache

    - by eagleon
    I have a small website where some content is displayed inside an HTML tag using AJAX. The content is simply taken from another page on the same website. However, sometimes instead of loading the parsed PHP file, the browser shows a download box instead. I downloaded the file, and it looks like a text file mixed with binary or gzipped data. I can't paste the binary stuff here, but here are some of the headers:

        Jul 2012 18:52:16 GMT
        Server: Apache/2
        X-Powered-By: PHP/5.3.10
        Content-Encoding: gzip
        Vary: Accept-Encoding,User-Agent
        Keep-Alive: timeout=1, max=95
        Connection: Keep-Alive
        Transfer-Encoding: chunked
        Content-Type: text/html

        HTTP/1.1 304 Not Modified
        Date: Sun, 01 Jul 2012 18:52:16 GMT
        Server: Apache/2
        Connection: Keep-Alive
        Keep-Alive: timeout=1, max=93
        ETag: "2fc857-409-4c39691c59b40"

        HTTP/1.1 304 Not Modified
        Date: Sun, 01 Jul 2012 18:52:16 GMT
        Server: Apache/2
        Connection: Keep-Alive
        Keep-Alive: timeout=1, max=92
        ETag: "2fc854-3e5-4c39691b65900"

        HTTP/1.1 304 Not Modified
        Date: Sun, 01 Jul 2012 18:52:16 GMT
        Server: Apache/2
        Connection: Keep-Alive
        Keep-Alive: timeout=1, max=91
        ETag: "2fc847-3e3-4c3969197d480"

    and large blocks of stuff like this: µàl]&BaËÜk#ìÏ

    Read the article

  • Software to automatically download a file from FTP and then rename->replace existing file

    - by pauska
    Hi. We pay a news agency to provide us with MP3s of hourly news bulletins. They put the MP3s on an FTP server about 10 minutes before every hour, with files named after the date and time (for example 02012010_1600.mp3 or similar). I need a solution that downloads only the most recently modified file from the FTP server, renames it to news.mp3, and replaces the previous news.mp3. This should preferably run on a Windows 2008 Server, as a service if possible. Anyone have suggestions for software?
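
    If a short script run from Task Scheduler would do instead of packaged software, Python's standard ftplib covers this; a minimal sketch that picks the most recently modified MP3, downloads it, and then swaps it in as news.mp3 (host, credentials and paths are placeholders):

        import os
        from ftplib import FTP

        ftp = FTP("ftp.example.com")
        ftp.login("user", "password")
        ftp.cwd("/bulletins")

        # MDTM returns '213 YYYYMMDDHHMMSS', so the reply sorts chronologically.
        names = [n for n in ftp.nlst() if n.lower().endswith(".mp3")]
        latest = max(names, key=lambda n: ftp.sendcmd("MDTM " + n).split()[1])

        tmp = "news.mp3.part"
        with open(tmp, "wb") as f:
            ftp.retrbinary("RETR " + latest, f.write)
        ftp.quit()

        # Replace the old file only after the download has fully completed.
        os.replace(tmp, "news.mp3")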

    Read the article

  • 403 Forbidden when trying to download file that was uploaded using SSH

    - by Simon Hartcher
    I have FTP access to an Apache server on Linux for uploading files so that they can be downloaded from the web. I was recently granted SSH access as well, and figured it would be quicker to download files directly onto the server instead of downloading them to my machine and then FTPing them up. But when I downloaded a file to the server over SSH and placed it in the public_html directory, it was not visible from the web. The permissions (checked from both SSH and the FTP client) were the same as all the other, visible files, yet it did not appear in the directory listing, and typing the filename into my browser gave a 403 error. Obviously, when I FTP a file to the server, something else happens that makes it web-visible which I am not currently privy to. What am I missing that is causing the file to be invisible from the web?

    Read the article
