Creating an offline backup or static version of your website using Wget

Posted on June 23, 2015 in Programming, Ubuntu


Sometimes you need a backup of your site exactly as it displays on the web, but there's another use for an offline copy. On a WordPress site, for instance, pages are generated dynamically every time they load. By creating an offline backup, you get a static version of your dynamic site: plain HTML that loads without any higher-level programming. That's helpful if you ever need to shut down PHP or your MySQL database.

So let's get down to it. Wget is a simple but powerful command you can run from the command line on your Linux box.

This command will download everything needed, including JS files, CSS files, and images. For WordPress sites it recreates the folder structure, generating index files so the naming structure you already had is preserved. There are many options you can use, but this is the simple form:

wget -E -m -p -k https://example.com/

(Replace https://example.com/ with your own site's URL.)
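In case it helps, here is a sketch of the same command wrapped in a small helper function, with each flag annotated. The function name and the example.com URL are placeholders, not anything Wget provides:

```shell
# Sketch: mirror a site for offline use. The function name is just for illustration.
mirror_site() {
    # -E (--adjust-extension): save pages with an .html extension
    # -m (--mirror): recursive download with timestamping and infinite depth
    # -p (--page-requisites): also fetch the CSS, JS, and images each page needs
    # -k (--convert-links): rewrite links so the local copy works offline
    wget -E -m -p -k "$1"
}

# Usage (example.com is a placeholder; substitute your site's URL):
# mirror_site "https://example.com/"
```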

If you run into trouble, you may have to add a few other arguments. For instance, the site might be checking the user agent. To set one, add something like the following:

--user-agent="Mozilla/5.0 (Macintosh; Intel Mac OS X 10.8; rv:21.0) Gecko/20100101 Firefox/21.0"
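Combined with the mirror flags, that looks roughly like this (a sketch; the function name and example.com URL are placeholders):

```shell
# Sketch: mirror a site while presenting a browser User-Agent string,
# for servers that refuse Wget's default user agent.
mirror_site_as_browser() {
    wget -E -m -p -k \
        --user-agent="Mozilla/5.0 (Macintosh; Intel Mac OS X 10.8; rv:21.0) Gecko/20100101 Firefox/21.0" \
        "$1"
}

# Usage: mirror_site_as_browser "https://example.com/"
```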

There can also be an issue with the server rejecting the Accept header Wget sends. I had a site that would only download the first page, but adding the following header made it download everything:

--header="Accept: */*"
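Putting it all together, a full invocation with the mirror flags, the spoofed user agent, and the permissive Accept header might look like this (again a sketch; the function name and URL are placeholders):

```shell
# Sketch: everything combined into one command.
mirror_site_full() {
    wget -E -m -p -k \
        --user-agent="Mozilla/5.0 (Macintosh; Intel Mac OS X 10.8; rv:21.0) Gecko/20100101 Firefox/21.0" \
        --header="Accept: */*" \
        "$1"
}

# Usage: mirror_site_full "https://example.com/"
```

The result is a browsable static copy under a directory named after the host, which you can serve from any plain web server with no PHP or MySQL required.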
