Added archive tag
parent a6ccbec33d
commit be1f539662
4 changed files with 4 additions and 2 deletions
```diff
@@ -2,6 +2,7 @@
 title: "Archiving Sites"
 date: 2019-08-02T22:42:16-04:00
 draft: false
+tags: [ "archive" ]
 ---
 
 I have several old WordPress sites lying around that I would like to archive but no longer maintain. Since I don't intend to create any more content on these sites, we can use tools like `wget` to scrape an existing site and produce a somewhat *read-only* copy of it. I say read-only not because the copy can't be edited, but because it is no longer in the original source format of the website.
```
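The post's actual `wget` invocation isn't part of this diff. As a minimal sketch of the kind of scrape it describes (`https://example.com` is a placeholder, not a URL from the post):

```bash
# Hypothetical sketch, not the post's exact command: mirror a site into
# a static, offline-browsable copy.
#   --mirror            recursive download with timestamping
#   --convert-links     rewrite links to point at the local copy
#   --adjust-extension  save pages with .html extensions
#   --page-requisites   fetch the CSS, images, and scripts each page needs
#   --no-parent         never ascend above the starting directory
wget --mirror --convert-links --adjust-extension \
     --page-requisites --no-parent https://example.com
```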
```diff
@@ -2,7 +2,7 @@
 title: "Mirror Download with wget"
 date: 2020-01-20T21:18:12-05:00
 draft: false
-tags: [ "linux" ]
+tags: [ "linux", "archive" ]
 ---
 
 This post describes downloading a `centos` repo using `wget`, though the ideas here apply to any mirror with packages exposed via HTTP.
```
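As a rough sketch of that kind of recursive HTTP download (the mirror URL below is a placeholder, not necessarily the one the post uses):

```bash
# Hypothetical sketch: recursively pull a package tree exposed over HTTP.
# The mirror URL is a placeholder.
#   --recursive             follow links downward through the tree
#   --no-parent             never ascend above the starting directory
#   --no-host-directories   don't create a top-level directory per host
#   --reject "index.html*"  skip the generated directory listings
wget --recursive --no-parent --no-host-directories \
     --reject "index.html*" \
     http://mirror.centos.org/centos/7/os/x86_64/
```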
```diff
@@ -2,7 +2,7 @@
 title: "Offline Pip Packages"
 date: 2020-01-20T23:11:05-05:00
 draft: false
-tags: [ "python" ]
+tags: [ "python", "archive" ]
 ---
 
 There are a few reasons I can think of to have offline pip packages:
```
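The post's list of reasons is cut off by the diff context, but the basic workflow is a two-step `pip` pattern. A minimal sketch, with `requests` as a stand-in package:

```bash
# Hypothetical sketch: cache a package and its dependencies locally,
# then install entirely from that cache. "requests" is a stand-in.
pip download --dest ./packages requests

# Later, on a machine without network access:
pip install --no-index --find-links ./packages requests
```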
```diff
@@ -2,6 +2,7 @@
 title: "Backing Up YouTube Channels"
 date: 2020-02-17T23:17:47-05:00
 draft: false
+tags: [ "archive" ]
 ---
 
 There is great content on YouTube that I would be sad to see disappear. So I did some digging and found a [great discussion](https://www.reddit.com/r/DataHoarder/comments/863aid/what_is_your_method_of_viewing_youtubedl_backed/dw25vnm/) on Reddit about backing up YouTube videos. The solution is based on `youtube-dl`, and I modified the script a little to fit my needs.
```
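The modified script itself isn't included in this diff. As a hedged sketch of the `youtube-dl` pattern the linked Reddit thread builds on (the channel URL is a placeholder):

```bash
# Hypothetical sketch: archive a channel, recording finished downloads in
# archive.txt so re-runs only fetch new videos. The URL is a placeholder.
youtube-dl --download-archive archive.txt \
    --output '%(uploader)s/%(upload_date)s - %(title)s.%(ext)s' \
    https://www.youtube.com/channel/CHANNEL_ID
```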