How to use wget to grab a copy of Google Code site documents?
Posted by Alex Reynolds on Super User, 2012-03-26.
I have a Google Code project with a lot of wiki-based documentation. I would like to create a copy of this documentation for offline browsing, using wget or a similar utility.
I have tried the following:
$ wget --no-parent \
--recursive \
--page-requisites \
--html-extension \
--base="http://code.google.com/p/myProject/" \
"http://code.google.com/p/myProject/"
The problem is that links within the mirrored copy come out like this:
file:///p/myProject/documentName
Rewriting links this way causes 404 (not found) errors, since the links point to nowhere valid on the filesystem.
What options should I use with wget instead, so that I can make a local copy of the site's documentation and other pages?
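For what it's worth, I have been considering a variant that drops --base and adds --convert-links (to rewrite links in the downloaded pages so they point at the local copies) plus --no-host-directories, but this is only an untested guess on my part and I have not confirmed that it fixes the file:/// links:

$ wget --recursive \
       --no-parent \
       --page-requisites \
       --html-extension \
       --convert-links \
       --no-host-directories \
       "http://code.google.com/p/myProject/"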