Probably many people, like me, have wondered at some point how to download an entire site. I remember doing it years ago with wget and the -r option, but today I found a brutally effective way to grab everything with a single Linux command.
wget --random-wait -r -p -e robots=off -U mozilla http://www.site.com/
The options used here do the following:
-r tells wget to download recursively, following the links on each page.
-p tells wget to fetch all page requisites, including images.
-e robots=off tells wget not to obey the robots.txt file.
-U mozilla presents a browser-like identity via the User-Agent header.
--random-wait makes wget wait a random number of seconds between requests, which helps you avoid getting blacklisted.
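The same command is easier to read with long-form options, one per line. This is just a sketch: www.site.com is a placeholder for whatever site you actually want, and the --wait=1 line is my own addition (--random-wait varies the base delay set by --wait, so giving it a base of one second makes the randomization meaningful):

```shell
#!/usr/bin/env bash
# Long-form equivalent of the one-liner, built as an array so each
# option can carry a comment. www.site.com is a placeholder URL.
opts=(
  --recursive         # -r: follow links and download recursively
  --page-requisites   # -p: also fetch images, CSS, and other assets
  --execute robots=off    # -e robots=off: ignore robots.txt
  --user-agent=mozilla    # -U mozilla: browser-like identity
  --wait=1 --random-wait  # pause a randomized interval between requests
)

# Print the assembled command; run it directly by replacing `echo wget`
# with `wget` once you've substituted a real URL.
echo wget "${opts[@]}" http://www.site.com/
```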
Other Useful wget Parameters:
--limit-rate=20k caps the download speed (here at 20 KB/s) so you don't saturate your connection or hammer the server.
-b sends wget to the background immediately, so the download keeps running after you log out.
-o $HOME/wget_log.txt writes wget's output to a log file instead of the terminal.
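These parameters combine naturally with the mirror command above. A hedged sketch of such a combined invocation (the URL is again a placeholder, and the command is printed rather than executed here):

```shell
#!/usr/bin/env bash
# Throttled, backgrounded, logged recursive download.
# Without -o, a backgrounded wget logs to ./wget-log by default;
# -o redirects that log to the file you name instead.
cmd="wget --limit-rate=20k -b -o $HOME/wget_log.txt -r http://www.site.com/"

# Shown rather than run; paste the printed line to execute it for real.
echo "$cmd"
```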