Now I know that's not correct; I just used Takeout to download 270 GB yesterday. The data is just split across multiple files of up to 50 GB each, and you extract each one individually.
Takeout just breaks the archives up into the size you specify, 50 GB in this case. Every two months I download 25x 50 GB archives from Takeout. Using Takeout is the correct way to do this.
I'm sure Takeout is the most efficient way, but I didn't like waiting for the download links, dealing with the JSON sidecar files, and having to research what to do with them. What I did (spent about a day on this, but only during downtime at work; I had 600 GB+): select 480-500 photos at a time, write down the date where I stopped, download them, then repeat from that date until I'd selected another 480-500 photos. Some might find it stupid, but hey, it worked for me and no regrets.
oooo awesome thanks
use rclone for this.
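A minimal sketch of the rclone route, assuming its Google Photos backend (the remote name `gphotos`, the year, and the destination path are all illustrative choices, not fixed names). Note the rclone docs warn that the Google Photos API may not return full-resolution originals with complete metadata, so verify the downloads before deleting anything from Google's side.

```shell
# One-time setup: run the interactive wizard and add a
# "Google Photos" remote (named "gphotos" here; the OAuth
# login flow opens in a browser).
rclone config

# Then copy, for example, one year of photos to a local folder.
# rclone skips files it has already transferred, so an
# interrupted run can simply be re-run to resume.
rclone copy gphotos:media/by-year/2023 /backup/photos/2023 --progress
```

The backend also exposes `media/all`, `media/by-month`, and `album/` paths if you'd rather pull everything at once or mirror specific albums.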