Downloading a large site with wget
Posted by Evan Gill on Super User
Published on 2010-06-15T16:09:34Z
Hi,
I'm trying to mirror a very large site, but wget never seems to finish properly. I am using the command:
wget -r -l inf -nc -w 0.5 {the-site}
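For reference, this should be equivalent to the same command spelled out with long option names (the URL below is only a placeholder):

# recurse with no depth limit, skip files already on disk, wait 0.5 s between requests
wget --recursive --level=inf --no-clobber --wait=0.5 http://example.com/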
I have downloaded a good portion of the site, but not the whole thing. The content does not change fast enough to bother using time-stamping.
After the command has been running overnight, these messages appear:
File `{filename}.html' already there; not retrieving.
File `{filename}.html' already there; not retrieving.
File `{filename}.html' already there; not retrieving.
File `{filename}.html' already there; not retrieving.
Killed
Does anyone know what is happening and how I can fix it?