Search Results

Search found 11542 results on 462 pages for 'download'.

Page 43/462

  • Downloading a large site with wget

    - by Evan Gill
    Hi, I'm trying to mirror a very large site, but wget never seems to finish properly. I am using the command:
      wget -r -l inf -nc -w 0.5 {the-site}
    I have downloaded a good portion of the site, but not the whole thing. The content does not change fast enough to bother using time-stamping. After running overnight, this is what appears:
      File `{filename}.html' already there; not retrieving.
      File `{filename}.html' already there; not retrieving.
      File `{filename}.html' already there; not retrieving.
      File `{filename}.html' already there; not retrieving.
      Killed
    Does anyone know what is happening and how I can fix it?
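    The bare "Killed" at the end looks like the kernel's OOM killer rather than wget exiting on its own (wget keeps its list of visited URLs in memory, which can grow very large on big mirrors); that is an assumption, not something the log above confirms. A hedged sketch of how one could check and then resume (the log file name is a placeholder):
      # see whether the kernel reports having killed a process for running out of memory
      dmesg | grep -i "killed process"
      # resume where it left off: -nc skips files already on disk, -o keeps the output in a log file
      wget -r -l inf -nc -w 0.5 -o mirror.log {the-site}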

    Read the article

  • Mac: Script application downloaded from the Internet

    - by Svish
    I downloaded a PHP framework and have started to make a website using it. Sometimes I need to look at the source of that framework, and every time I open a file I haven't opened before I get this message: “somefile.php” is a script application which was downloaded from the Internet. Are you sure you want to open it? That is OK and nice I suppose, but I am getting tired of it. Is there a way I can fix all the files in my web directory so that the OS somehow forgets the files are from the Internet, or something like that?
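    In case it helps anyone searching for this: the dialog is driven by the com.apple.quarantine extended attribute that Safari and other downloaders stamp onto files. A hedged sketch of clearing it for a whole directory (the path is a placeholder; -r assumes a version of xattr that supports recursion):
      # strip the quarantine attribute from everything under the web directory
      xattr -r -d com.apple.quarantine ~/Sites/mysite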

    Read the article

  • PHP file gets downloaded instead of executed when browsed in any browser

    - by baltusaj
    I have a phpinfo.php file which I am trying to run by browsing to it, but the browser downloads the file instead of executing it. phpinfo.php contains:
      <?php phpinfo(); ?>
    Following this post, I added the following lines to my /etc/apache2/httpd.conf and restarted Apache, but in vain; phpinfo.php still gets downloaded:
      AddType application/x-httpd-php .php .phtml
      AddType application/x-httpd-php-source .phps
    Have I added these lines to the correct file? On an openSUSE forum the following was mentioned. I followed this too, but still no success; the same problem persists. "In case the browser wants to save your php files instead of displaying the content, you should enable php support in the /etc/apache2/mod_userdir.conf file. Add the following line to it, just after the line, and restart the server. Include /etc/apache2/conf.d/php5.conf"
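    For reference, a hedged sketch of the more usual openSUSE route: making sure the PHP module itself is installed and loaded rather than only adding AddType lines. The package and helper names below are assumptions about an older openSUSE/PHP5 setup:
      # install and enable the Apache PHP5 module, then restart Apache
      sudo zypper install apache2-mod_php5
      sudo a2enmod php5
      sudo rcapache2 restart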

    Read the article

  • Something is downloading in the background on Linux

    - by wisdom
    My laptop boots normally into Linux (Linux Mint), and the first thing I do is open gnome-system-monitor. The shock is that something is downloading (the problem appeared two days ago) even though I haven't run anything yet! I ran iftop in a terminal but saw nothing strange (screenshot provided), and I also tried nethogs, which showed nothing at all. What makes it more complicated is that when I reboot, the same problem is there, so I can't browse the Internet, and no one connected to my network can browse any more; it's absorbing the bandwidth horribly! After many reboots it goes back to a normal state (no background downloading), but I really can't figure out what is going on behind the scenes. Any suggestion or help would be appreciated... thanks
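    A hedged sketch of how one could try to pin the traffic to a process the next time it happens (assumes ss from iproute2 and lsof are installed, which is usually the case on Mint):
      # list every established connection together with the process that owns it
      sudo ss -tunap | grep ESTAB
      # cross-check with lsof: all processes holding open network connections
      sudo lsof -i -n -P | grep ESTABLISHED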

    Read the article

  • How to download a full website as PDF?

    - by MartyIX
    I'm trying to make an offline version of a web site and I'm looking for a tool that would do the task automatically for the whole web site (circa 1000 pages of HTML plus images). Is there anything like that, and free? I know it is quite a challenge for a program, but maybe I'll be lucky :). EDIT: It should be a program for Windows.

    Read the article

  • Good Firefox extension to download .flv Flash video?

    - by Tamás Szelei
    I'm looking for a decent Flash video downloader for Firefox. There are tons of them. I'd like to use it on a custom site with some random Flash video player, not YouTube. How can it be accomplished? Edit: None of the methods you guys suggested has worked so far; I looked in the various places where the temporary file might end up, and it just isn't there. It is indeed a Flash player, but it seems like it's not caching the file; is that possible?

    Read the article

  • Why does Outlook Express download old (POP3) mail?

    - by CityNeonRain
    We are using Outlook Express to access mail through a POP3 server. The problem is that every now and then, Outlook Express starts downloading all old emails, which amounts to tens of thousands of emails. This happens on every single computer (three) at the office. Is this caused by our OE configuration? Or by our hosting?

    Read the article

  • Save a single web page (with background images) with Wget

    - by mikael
    I want to use Wget to save single web pages (not recursively, not whole sites) for reference, much like Firefox's "Web Page, complete". My first problem is: I can't get Wget to save background images specified in the CSS. Even if it did save the background image files, I don't think --convert-links would convert the background-image URLs in the CSS file to point to the locally saved background images. Firefox has the same problem. My second problem is: if there are images on the page I want to save that are hosted on another server (like ads), these won't be included. --span-hosts doesn't seem to solve that problem with the line below. I'm using:
      wget --no-parent --timestamping --convert-links --page-requisites --no-directories --no-host-directories -erobots=off http://domain.tld/webpage.html
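    For completeness, a hedged sketch of a variant that may be worth trying: --span-hosts only has an effect together with recursion or --page-requisites, and CSS url() references are only parsed by newer Wget versions (roughly 1.12 and later), so both points are assumptions about the installed version:
      # -p fetches page requisites, -H lets them come from other hosts, -k rewrites links for offline viewing
      wget --page-requisites --span-hosts --convert-links --adjust-extension --no-directories -e robots=off http://domain.tld/webpage.html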

    Read the article

  • Download databasename.bak file

    - by Jordon
    I have downloaded a databasename.bak file from my hosting company. When I tried to restore that DB file in SQL Server 2008 it kept giving me the following error:
      The media family on device 'C:\go4sharepoint_1384_8481.bak' is incorrectly formed. SQL Server cannot process this media family. RESTORE HEADERONLY is terminating abnormally. (Microsoft SQL Server, Error: 3241)
    According to this error and the following link http://www.sqlcoffee.com/Troubleshooting047.htm it is clear that either the file I am downloading is corrupt or it is getting corrupted on the way. Any idea why I keep receiving this error? I have tried almost everything but am unable to fix this problem; please help me.
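    A hedged sketch of two quick checks, assuming sqlcmd is on the PATH and the instance name below is replaced with the real one; a size mismatch against the copy on the host would point to the transfer (for example an FTP client downloading in ASCII mode) rather than the backup itself:
      REM compare the size of the local file with the size reported by the hosting company
      dir C:\go4sharepoint_1384_8481.bak
      REM ask SQL Server to validate the backup media without actually restoring it
      sqlcmd -S .\SQLEXPRESS -E -Q "RESTORE VERIFYONLY FROM DISK = N'C:\go4sharepoint_1384_8481.bak'"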

    Read the article

  • Using wget to recursively download whole FTP directories

    - by user9406
    I want to copy all of the files and folders from one host to another. The files on the old host sit at /var/www/html, I only have FTP access to that server, and I can't tar all the files. A regular FTP connection to the old host drops me into the /home/admin folder. I tried running the following command from my new server:
      wget -r ftp://username:[email protected]
    But all I get is an auto-generated index.html file. What's the right syntax for using wget recursively over FTP?
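    A hedged sketch of syntax that should work: in a wget FTP URL the path is taken relative to the login directory (here /home/admin), so reaching an absolute path needs an escaped leading slash (%2F). The credentials and host below are placeholders:
      # -r recurse, -nH drop the hostname directory, --cut-dirs=2 strip var/www from the local paths
      wget -r -nH --cut-dirs=2 "ftp://USER:PASSWORD@OLD-HOST/%2Fvar/www/html/"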

    Read the article

  • Download discussions from usenet

    - by user22559
    Hello. Does anyone know a good (and maybe free) usenet client that would allow me to save all the discussions from one (or more) groups to text files? Preferably each post to its own text file. I need this to run some data mining on those discussions. Thanks :)

    Read the article

  • What causes PHP pages to consistently download instead of running normally

    - by Jonathan
    Hi, I'm running an Ubuntu Server in a VM to test out different web forum solutions. I have set up ~/public_html/ to be accessible through the Apache 2 web server, and that works fine. However, when I go to a .php file in a browser (using my VM's ip-address/~username/phpfile.php) it does not display it as it should. Instead it offers to save the file or asks what program to open it with. Interestingly, that dialog box does recognise that it is a PHP file. I have the following version of PHP installed on the system:
      PHP 5.3.2-1ubuntu4.5 with Suhosin-Patch (cli) (built: Sep 17 2010 13:49:46)
      Copyright (c) 1997-2009 The PHP Group
      Zend Engine v2.3.0, Copyright (c) 1998-2010 Zend Technologies
    And the following server:
      Server version: Apache/2.2.14 (Ubuntu)
      Server built: Nov 18 2010 21:19:09
    If anyone knows what might be causing this, or potential solutions, it would make me very happy :) EDIT: Turns out this behaviour was only apparent on files in the ~/public_html/ directory. All PHP files in /var/www/ work fine. Prizes go to whoever can explain why? :D (And by prizes I just mean a well done, no actual prizes I'm afraid.)
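    A hedged guess at the explanation: the stock Ubuntu PHP packaging of that era ships /etc/apache2/mods-available/php5.conf with an <IfModule mod_userdir.c> block that sets "php_admin_value engine Off" for /home/*/public_html, which matches the symptom of PHP working in /var/www but not in userdirs. A sketch of how to confirm and undo it (paths assume Ubuntu 10.04/10.10 packages):
      # show the userdir block that disables the PHP engine in public_html
      grep -n -A5 "mod_userdir" /etc/apache2/mods-available/php5.conf
      # comment that block out, then reload Apache
      sudo nano /etc/apache2/mods-available/php5.conf
      sudo /etc/init.d/apache2 reload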

    Read the article

  • Setting up a download limit for a computer

    - by sprsr
    Hello all, a friend of mine asked me a question like this: "I have a modem and a housemate. He is using my modem and slowing down my Internet. What I want to do is limit his bandwidth without using any program like NetLimiter or the like. How can I do that?" What are the ways to do this? Thanks.
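    A hedged sketch of one built-in option, under the assumption that the connection is shared through a Linux box that routes for the housemate (otherwise the QoS page of the modem/router is the place to look). The interface name and rate are placeholders, and this simple form caps the whole interface rather than a single host; per-host limits need a classful qdisc plus filters:
      # cap all traffic leaving eth0 at roughly 1 Mbit/s using the token bucket filter
      sudo tc qdisc add dev eth0 root tbf rate 1mbit burst 32kbit latency 400ms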

    Read the article

  • Why is Dropbox syncing so freakishly slowly in my Linux virtual machine?

    - by Bec
    I am setting up a Linux virtual machine (Windows 7 64-bit host, Ubuntu 64-bit guest, using VirtualBox) and I just installed Dropbox and set it to sync. I've only got about 2 GB in there, so I figured it should take just an afternoon, but it's going at about 0.5 kB/second and says it will take about 60 days. I usually get about 200 kB/second in the host OS, and downloading straight from the Dropbox website through Firefox in the Ubuntu VM I get about the same, but sync is really slow. Any tips?
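    Since plain downloads in the guest are fast, this may not help, but a hedged sketch of two VirtualBox networking knobs that may be worth trying (run on the host with the VM powered off; the VM name and adapter name are placeholders):
      # switch the first adapter from NAT to bridged and use the paravirtualised virtio NIC
      VBoxManage modifyvm "Ubuntu64" --nic1 bridged --bridgeadapter1 "Local Area Connection"
      VBoxManage modifyvm "Ubuntu64" --nictype1 virtio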

    Read the article

  • Server not responding to SSH and HTTP but ping works

    - by yes123
    Hello guys, I requested a hard reboot because neither SSH nor HTTP worked, while ping worked normally. Which logs should I check to understand what the problem was? Thanks! (Debian 6 with LAMP)
    Edit: my memory and swap:
      Mem: 4040068k total, 1114920k used, 2925148k free, 109212k buffers
      Swap: 1051384k total, 0k used, 1051384k free, 283820k cached
    4 GB of RAM (and more than 1 TB of HDD). The cause dates from 2 days ago: look how the usage of swap goes up more than 60% in less than 10 hours. My control panel reports this as the top 5 memory-using processes. If every apache2 process is 190 MB large, that's bad, because if I run top I see 262 sleeping processes, most of them apache2! My Apache mpm_prefork settings are:
      <IfModule mpm_prefork_module>
          StartServers          5
          MinSpareServers       5
          MaxSpareServers      10
          ServerLimit        1500
          MaxClients         1500
          MaxRequestsPerChild 2000
      </IfModule>
      KeepAlive On
      MaxKeepAliveRequests 100
      KeepAliveTimeout 4
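    A hedged back-of-the-envelope: if each apache2 child really is around 190 MB resident, then 4 GB of RAM only covers roughly 4096 / 190 ≈ 21 children, so a ServerLimit/MaxClients of 1500 practically guarantees that a traffic spike pushes the box deep into swap until SSH and HTTP stop responding while ping still works. A sketch of how to measure the real per-child size before lowering MaxClients (assumes the processes are named apache2, as on Debian):
      # average resident size of the apache2 children, in MB, plus the number of children
      ps -C apache2 -o rss= | awk '{sum+=$1; n++} END {printf "%.0f MB average over %d processes\n", sum/n/1024, n}'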

    Read the article

  • Download or view a server's WINS database

    - by Segfault
    I am trying to troubleshoot a WINS browsing problem in a Server 2008 AD forest. I am in one domain and the problem is with a sibling domain. What command can I use to dump or view the WINS database on a particular AD server, by name, in a different domain than mine? I thought one of the subcommands of net would have an option for this, but I can't find it. I also tried browstat.exe getblist but it gives me the error message "The list of servers for this workgroup is not currently available". I am not a domain admin and have no rights to either domain beyond those of a normal user. Does anyone know how this can be done?

    Read the article
