---
title: "Archiving Sites"
date: 2019-08-02T22:42:16-04:00
draft: false
---
I have several old WordPress sites that are now only alive for archival purposes. I've been trying to migrate off of WordPress since I don't like having yet another system to manage. Luckily, for archival-type sites I don't need the whole backend of WordPress, since I don't actually need any of the CRUD functionality. (Well, I do need the R part.)
Solution: `wget` comes to the rescue.
Thanks to user chuckg, I now know you can run `wget -m -k -K -E https://url/of/web/site` to get a full offline copy of a website.
Joel Gillman expanded the command out to its long-form flags: `wget --mirror --convert-links --backup-converted --adjust-extension https://url/of/web/site`
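
In case it's helpful, here's roughly what each of those flags does (short and long forms side by side, per the `wget` man page). The URL is just a placeholder for whichever site you want to archive.

```bash
# Mirror a site for offline archival.
# -m / --mirror             recursive download with timestamping, infinite depth
# -k / --convert-links      rewrite links in the saved pages so they work locally
# -K / --backup-converted   keep a .orig copy of each file before its links are rewritten
# -E / --adjust-extension   save pages with an .html extension where appropriate
wget --mirror --convert-links --backup-converted --adjust-extension \
    https://url/of/web/site
```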
There are other solutions in that Stack Overflow post, but something about the simplicity of `wget` appealed to me.
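
Since I have a handful of these old sites to archive, something like the following loop should do the trick. The URLs here are just placeholders for my actual sites, and each archive lands in its own directory under `archives/`.

```bash
#!/bin/sh
# Archive each old site into ./archives/<hostname>/...
# The URLs below are placeholders -- swap in the real sites.
for site in https://old-blog.example.com https://old-wiki.example.com; do
    wget --mirror --convert-links --backup-converted --adjust-extension \
        --directory-prefix=archives "$site"
done
```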