
I am trying to download my 35GB Google Takeout .tgz file but it keeps failing part way through.

  • On Chrome I typically get about 3 GB of progress, then it reports "Failed - Network error". I have tried several times.

[Screenshot: partial and failed attempts]

  • On Firefox, I managed to download about 75% of the file on my one attempt.

My internet is pretty stable and I haven't had problems downloading relatively large ISO images, but I can't be sure my connection will be 100% glitch-free for the roughly 3 hours the download takes.

I'm considering generating a .zip takeout instead, but then I'd have to manually download lots of 2 GB zips.

Got any better ideas?

Andy Joiner

3 Answers


If you are familiar with Linux/Unix, there is a simpler method using wget or curl.

Steps:

  • Initiate the download from the Takeout page in your browser
  • Go to "Window->Downloads"
  • Locate the download that is currently in progress
  • Right-click it and choose "Copy link address"
  • From your terminal, run: wget {url}

wget is more stable, and it lets you set timeouts and retry behaviour manually.
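As a sketch, the wget invocation could look like the following. The URL is a placeholder for whatever you copied from the browser, and the output filename is an arbitrary choice; the flags are optional tuning for flaky connections:

```shell
# Placeholder URL: paste the in-progress download link copied from the browser.
# --continue resumes a partially downloaded file instead of starting over;
# --tries=0 retries indefinitely on transient failures;
# --timeout=60 gives up on a stalled connection after 60 s and retries.
wget --continue --tries=0 --timeout=60 \
     --output-document=takeout.tgz \
     'https://takeout-download.usercontent.google.com/...'
```

If the transfer drops part way, re-running the same command with --continue picks up from the existing partial file rather than restarting from zero.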

Bonus: you can run this on a remote server; I have seen speeds up to 250 MBps that way. Just be sure not to cancel the browser download before your wget finishes.

user1101791

Unfortunately, the wget method above didn't work for me; however, this variant did:

Initiate the takeout page download in your browser, then open the page with "Inspect" to get the developer tools. (I used Firefox, but the same should work in Chrome.)

Pause the download, and then in the developer tools look at the requests in the "Network" tab to identify the request that's to takeout-download.usercontent.google.com. Right-click on that and choose "Copy Value->Copy as cURL (Posix)".

In a terminal window where you've ssh'ed into wherever you want to download (probably running inside screen), paste the command line and add a -O (dash capital O) at the end of the command. (That -O is to get the command to actually save the result to a local file rather than spit it onto stdout.)
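A minimal sketch of that terminal step, assuming the "Copy as cURL" command has been pasted in (the URL, cookie header, and output filename below are placeholders; the original copied command will carry many more headers):

```shell
# Run inside a screen/tmux session on the remote host so the transfer
# survives the SSH session disconnecting.
# -o names the output file explicitly (the answer's -O would instead use
# the remote filename); -C - resumes from any existing partial file.
curl 'https://takeout-download.usercontent.google.com/...' \
     -H 'Cookie: ...copied from the browser request...' \
     -o takeout.tgz \
     -C -
```

Note that the copied request carries your session cookies, so it should only be pasted on a machine you trust.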

galacticninja

Use aria2c!

  1. Open up the Developer Tools, then click the download link.
  2. View the Network tab and identify the request; it will likely come from a googleapis.com subdomain.
  3. Right click the request and Copy request location.
  4. Click on the Headers tab and in the Request Headers section, locate the line beginning with Cookie:.
  5. Combine all of the information you now have into a download command with aria2 like so:

    aria2c -o export.mbox -c --header="Cookie: AUTH_e2e0q...etc" long-request-url-goes-here

Source: https://kylekelly.com/posts/2014/12/04/google-takeout-with-aria.html