Tags: bash, http, cookies, google-drive-api

How can I download a file over the maximum file size limit from Google Drive programmatically?


I have a large file (supplier-data.tar.gz) that I want to download programmatically from Google Drive using the following command in the Linux shell.

wget --load-cookies /tmp/cookies.txt "https://docs.google.com/uc?export=download&confirm=$(wget --quiet --save-cookies /tmp/cookies.txt --keep-session-cookies --no-check-certificate "https://docs.google.com/uc?export=download&id=$FILEID" -O- | sed -rn 's/.*confirm=([0-9A-Za-z_]+).*/\1\n/p')&id=$FILEID" -O $FILENAME && rm -rf /tmp/cookies.txt
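
For readability, here is the same confirm-token approach broken into two explicit steps (a sketch; $FILEID and $FILENAME are the same placeholders used in the one-liner above):

FILEID="<your-file-id>"
FILENAME="supplier-data.tar.gz"

# First request: save the session cookies and pull the confirm token out of the warning page.
CONFIRM=$(wget --quiet --save-cookies /tmp/cookies.txt --keep-session-cookies --no-check-certificate \
  "https://docs.google.com/uc?export=download&id=${FILEID}" -O- \
  | sed -rn 's/.*confirm=([0-9A-Za-z_]+).*/\1/p')

# Second request: reuse the cookies and pass the confirm token to fetch the actual file.
wget --load-cookies /tmp/cookies.txt \
  "https://docs.google.com/uc?export=download&confirm=${CONFIRM}&id=${FILEID}" \
  -O "${FILENAME}" && rm -f /tmp/cookies.txt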

I tested this on a smaller file and it works. I confirmed that the files I was testing with were shared so that anyone with the link can access them, and that I used the correct $FILEID from the share link. I executed the above snippet while logged in on my own system.

The problem is that when I change $FILEID and $FILENAME to a large file (for example, supplier-data.tar.gz) on Google Drive, the resulting download isn't the same file that is stored on Google Drive. I know that when you download a large file through the Google Drive website, a prompt pops up and requires user intervention, as shown in the attached screenshot.

Another screenshot with the log messages produced by the command snippet is attached. (Screenshots: the "Can't scan file for viruses" warning and the wget log messages.)
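
A quick way to see what was actually saved is to check the file type; when the confirm-token trick fails on a large file, the saved download is usually a small HTML warning page rather than the gzip archive (a sketch, assuming $FILENAME from the command above):

# Expect "gzip compressed data" for the real archive; a warning page shows up as an HTML document.
file "$FILENAME"
# Peek at the first bytes; a warning page typically starts with <!DOCTYPE html>.
head -c 200 "$FILENAME"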


Solution

    1. Install gdown: pip install gdown
    2. Run gdown <file-id> to download the large file from Google Drive (see the sketch below).

    Thanks to pgame 001. See How to download large .tgz file from Google drive for installing Docker containers on Cloud9 Ubuntu 18.04LTS?
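
Putting both steps together, a minimal sketch (the <file-id> placeholder is the ID taken from the Drive share link; the -O output-filename flag is an assumption about the gdown CLI, not part of the original answer):

# Install the gdown CLI.
pip install gdown

# Download the large file by its Drive file ID; gdown handles the large-file
# confirmation step that breaks the plain wget approach.
gdown <file-id> -O supplier-data.tar.gz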