I have a large file (supplier-data.tar.gz) that I want to download programmatically from Google Drive, using the following command in a Linux shell:
wget --load-cookies /tmp/cookies.txt "https://docs.google.com/uc?export=download&confirm=$(wget --quiet --save-cookies /tmp/cookies.txt --keep-session-cookies --no-check-certificate "https://docs.google.com/uc?export=download&id=$FILEID" -O- | sed -rn 's/.*confirm=([0-9A-Za-z_]+).*/\1\n/p')&id=$FILEID" -O $FILENAME && rm -rf /tmp/cookies.txt
I tested this on a smaller file and it works. I verified that the files I tested with are shared so that anyone with the link can access them, and that I used the correct $FILEID from the share link. I ran the snippet above while logged in on my own system.
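For anyone unsure where $FILEID comes from: it is the segment of the share link between /d/ and the next slash. A minimal sketch of extracting it with sed (the share URL below is a placeholder, not a real file):

```shell
# Hypothetical share link; FILEID stands in for the real id
SHARE_URL='https://drive.google.com/file/d/FILEID/view?usp=sharing'
# Capture everything between "/d/" and the following "/"
FILEID=$(printf '%s' "$SHARE_URL" | sed -n 's#.*/d/\([^/]*\)/.*#\1#p')
echo "$FILEID"
```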
The problem is that when I change $FILEID and $FILENAME to a large file (for example, supplier-data.tar.gz) on Google Drive, the resulting download is not the same file stored on Google Drive. I know that when you download a large file through the Google Drive website, a confirmation prompt pops up and requires user intervention, as shown in the attached screenshot.
Another screenshot, with the log messages produced by the command snippet, is attached.
To download a large file from Google Drive, install and use gdown instead:

pip install gdown
gdown <file-id>

Thanks to pgame 001. See also: How to download large .tgz file from Google drive for installing Docker containers on Cloud9 Ubuntu 18.04LTS?
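A sketch of how the gdown invocation could look for this case, assuming gdown is already installed via pip; the FILEID value is a placeholder, and the command string is only echoed here rather than executed:

```shell
# Placeholder id and the asker's target filename
FILEID='FILEID'
FILENAME='supplier-data.tar.gz'
# gdown's --fuzzy flag accepts a full share URL as well as a bare id;
# -O names the output file, mirroring wget's -O in the question.
CMD="gdown --fuzzy https://drive.google.com/file/d/${FILEID}/view -O ${FILENAME}"
echo "$CMD"
```

Unlike the wget one-liner, gdown handles the large-file confirmation step itself, so no cookie juggling is needed.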