I have thousands of files in a shared Google Drive folder over which I do not have full control. I need to download them to a server with enough memory and CPU for analysis.
First, create a Google Apps Script project in the Drive account that can see the shared folder, and run the following function, which recursively collects the ID of every file under the source folder:

function myFunction() {
  const sourceFolderId = "18b0tVyn4ErDfduhaMQi4vZSOCYAqB2tl"; // Please set the source folder ID.

  // Recursively collect {name, id} for every file under the folder,
  // including files in nested subfolders.
  const getFiles = (id, res = []) => {
    const folder = DriveApp.getFolderById(id);
    const files = folder.getFiles();
    while (files.hasNext()) {
      const file = files.next();
      res.push({ name: file.getName(), id: file.getId() });
    }
    const folders = folder.getFolders();
    while (folders.hasNext()) getFiles(folders.next().getId(), res);
    return res;
  };

  const files = getFiles(sourceFolderId);

  // Write one file ID per line to a text file in your own Drive.
  const fileIdsText = files.map(file => file.id).join("\n");
  const txtFile = DriveApp.createFile("file_ids.txt", fileIdsText);
  console.log("File IDs saved to: " + txtFile.getUrl());
}
After execution, the log shows the URL of a file named file_ids.txt, created in your own Google Drive. Each line of that file contains the ID of one file in your targeted folder, including files in its subfolders.
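Before kicking off a download of thousands of files, it can be worth sanity-checking the ID list. Drive file IDs consist of letters, digits, hyphens, and underscores, so a quick filter catches blank or malformed lines. This is a minimal sketch against a hypothetical sample file (sample_ids.txt and clean_ids.txt are illustrative names, not part of the script above):

```shell
#!/bin/bash
# Build a small sample list (a stand-in for the real file_ids.txt),
# then keep only lines that look like Drive file IDs.
printf '%s\n' '18b0tVyn4ErDfduhaMQi4vZSOCYAqB2tl' 'not an id!' '' > sample_ids.txt
grep -E '^[A-Za-z0-9_-]+$' sample_ids.txt > clean_ids.txt
wc -l < clean_ids.txt
```

Only the first line survives the filter; the malformed and empty lines are dropped.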
For Linux users, install gdown (e.g. pip install gdown) and run the following shell script, which downloads each file by reading the IDs from file_ids.txt:

#!/bin/bash
# Download each file listed (one Drive file ID per line) in file_ids.txt.
while IFS= read -r file_id; do
  download_url="https://drive.google.com/uc?id=${file_id}"
  gdown "$download_url"
done < file_ids.txt
echo "File download completed."
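With thousands of files, transient network errors and Drive rate limits are likely, and a single failure should not abort the whole run. One approach is to wrap gdown in a retry helper; the sketch below is an assumption-laden example (gdown on PATH, three attempts and a five-second pause are arbitrary choices), not part of gdown itself:

```shell
#!/bin/bash
# Retry a single download up to 3 times before giving up on that ID.
download_with_retry() {
  local file_id="$1" attempt
  for attempt in 1 2 3; do
    if gdown "https://drive.google.com/uc?id=${file_id}"; then
      return 0
    fi
    echo "attempt ${attempt} failed for ${file_id}" >&2
    [ "$attempt" -lt 3 ] && sleep 5
  done
  return 1
}

# Usage (reads the ID list produced by the Apps Script above):
# while IFS= read -r id; do download_with_retry "$id"; done < file_ids.txt
```

Failed IDs can additionally be appended to a log file and re-run later, which is often simpler than restarting the whole batch.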