How to use wget to grab a copy of Google Code site documents?
- by Alex Reynolds
I have a Google Code project which has a lot of wiki'ed documentation. I would like to create a copy of this documentation for offline browsing, using wget or a similar utility.
I have tried the following:
$ wget --no-parent \
       --recursive \
       --page-requisites \
       --html-extension \
       --base="http://code.google.com/p/myProject/" \
       "http://code.google.com/p/myProject/"
The problem is that links within the mirrored copy end up looking like:
file:///p/myProject/documentName
Rewriting the links in this way causes 404 (not found) errors, since they point to nowhere valid on the local filesystem.
What options should I use instead with wget, so that I can make a local copy of the site's documentation and other pages?
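For reference, here is a rough sketch of the kind of invocation I am guessing at. My assumption is that --convert-links (and perhaps --no-host-directories) is what rewrites links so they resolve locally, but I have not confirmed that this combination actually works:

# Untested guess: --convert-links should rewrite links in downloaded pages
# so they point at the local copies, and --no-host-directories drops the
# leading code.google.com/ directory level from the saved paths.
$ wget --recursive \
       --no-parent \
       --page-requisites \
       --html-extension \
       --convert-links \
       --no-host-directories \
       "http://code.google.com/p/myProject/"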