
Batch download pages from a wiki without special pages

From time to time I find documentation on the web that I need for offline use on my notebook. Usually I fire up wget and mirror the whole site. With a wiki, though, I would like to skip the special pages and download only the content pages.
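
My usual starting point is something like the following (the URL is just a placeholder):

    wget --recursive --convert-links --page-requisites https://docs.example.org/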

Asked by: Guest | Views: 293
Total answers/comments: 1
Guest [Entry]

"From the wget man page:

-R rejlist --reject rejlist

Specify comma-separated lists of file name suffixes or patterns to
accept or reject. Note that if any of
the wildcard characters, *, ?, [ or
], appear in an element of acclist or rejlist, it will be treated
as a pattern, rather than a suffix.

This seems like exactly what you need.
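
For example, a recursive mirror that skips MediaWiki-style special and talk pages might look like this; the wiki URL and the exact reject patterns are placeholders and depend on the wiki's URL layout:

    wget -r -np -k -p -R "Special:*,Talk:*" https://wiki.example.org/wiki/Main_Page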

Note: to reduce the load on the wiki server, you might want to look at the -w and --random-wait flags.
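
Putting it together, a politer run might look like this (the one-second base delay is just an illustration):

    wget -r -np -k -p -w 1 --random-wait -R "Special:*,Talk:*" https://wiki.example.org/wiki/Main_Page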