How to download Google Takeout files (150GB+) with wget (on a remote server)
I wanted to migrate from Google to something else (check here) and for that I needed a backup of all my Google data. To export your data you can just go to https://takeout.google.com, select the products and services whose data you want to export, and click Export.
I first tried a .zip export, but my backup was 150 GB+ and that meant 50 x 2 GB zip files. After 10 files I didn't want to keep going. I then tried the tar.gz export, which supports much bigger file sizes, and ended up with 17 archives of around 10 GB each.
Downloading through the browser was a nightmare, and I also wanted the files on my local storage server (an HP MicroServer with Proxmox running Open Media Vault in a VM), so it made sense to download them directly to the server. After a bit of research I found a way to download Google Takeout files with wget. Info: although this may seem easy, Google disables the link if you cancel the download in the browser, and the link you can copy from the Takeout page does not work with wget.
You can check the references at the end of this post, but the steps (as written in this answer) are as follows:
1. Initiate the download via the Takeout page in your browser.
2. Go to "Window ⇾ Downloads".
3. Locate the download that is currently in progress.
4. Right click + Copy link address.
5. From your terminal on the server, run wget -O filename "url" (see the example after this list; the quotes around the URL matter).
6. Pause the download in the browser and let wget download the file.
7. Just be sure not to cancel the browser download before wget finishes.
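For reference, this is roughly what it looks like on the server. The file name and URL below are placeholders, not real values: the actual link is a long, session-bound URL copied from the browser's download manager. Quoting the URL is important because it contains & characters that the shell would otherwise interpret.

# Hypothetical example: the file name and URL are placeholders.
# -O saves the download under the given name instead of the URL-derived one.
wget -O takeout-001.tgz "https://takeout.google.com/...copied-link-address..."

# If the transfer is interrupted, wget -c can try to resume it. The signed
# Takeout links do not always allow resuming, so this may or may not work.
wget -c -O takeout-001.tgz "https://takeout.google.com/...copied-link-address..."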
If you want to do the same, I would recommend using screen or tmux so that, should your remote connection drop, the download keeps running in the session on the server. A minimal screen example follows.
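A minimal sketch with screen, where the session name and file name are just examples:

# Start a named screen session on the remote server.
screen -S takeout

# Inside the session, run the download (placeholder file name and URL).
wget -O takeout-001.tgz "https://takeout.google.com/...copied-link-address..."

# Detach with Ctrl-A then d; wget keeps running even if your SSH session drops.
# Reattach later to check progress:
screen -r takeout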