The official documentation is not much help on this topic. First of all, its example searches by mimeType, not by parent ID. And when I replaced the search term with the example given there, it still does not find anything. My code is below. Note that the function is named copy_folder only because I was actually trying to copy all files and sub-folders of a folder into a new folder, and the first step is to get the contents of the source folder. The source folder is in a team drive. The 'files' key in the response is always empty, even though the folder I tested actually contains files and sub-folders.
def copy_folder(service, src, dst):
    """
    Copies a folder. It will create a new folder in dst with the same name as src,
    and copies the contents of src into the new folder.
    src: Source folder's ID
    dst: Destination folder's ID that the source folder is going to be copied to
    """
    page_token = None
    while True:
        response = service.files().list(
            q="'%s' in parents" % src,
            supportsAllDrives=True,
            spaces='drive',
            fields='nextPageToken, files(id, name)',
            pageToken=page_token,
        ).execute()
        for file in response.get('files', []):
            # Process change
            print('Found file: %s (%s)' % (file.get('name'), file.get('id')))
        page_token = response.get('nextPageToken', None)
        if page_token is None:
            break
In that case, in order to retrieve the files in the subfolders, a recursive search is required. For this, I created a library, getfilelistpy, that achieves it. So in this answer, I would like to achieve your goal using that library. The script is as follows.
Before you use this script, please install getfilelistpy as follows.
$ pip install getfilelistpy
And, after you have copied and pasted the following script, please set the variable src.
import pickle
import os.path
from google_auth_oauthlib.flow import InstalledAppFlow
from google.auth.transport.requests import Request
from getfilelistpy import getfilelist

SCOPES = 'https://www.googleapis.com/auth/drive.metadata.readonly'
creds = None
creFile = 'token_sample.pickle'
if os.path.exists(creFile):
    with open(creFile, 'rb') as token:
        creds = pickle.load(token)
if not creds or not creds.valid:
    if creds and creds.expired and creds.refresh_token:
        creds.refresh(Request())
    else:
        flow = InstalledAppFlow.from_client_secrets_file(
            'client_secret.json', SCOPES)
        creds = flow.run_local_server()
    with open(creFile, 'wb') as token:
        pickle.dump(creds, token)

src = '###'  # Please set the folder ID.
fields = 'nextPageToken, files(id, name)'
resource = {
    "oauth2": creds,
    "id": src,
    "fields": fields,
}
res = getfilelist.GetFileList(resource)
print(dict(res))
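If you prefer not to add a dependency, the recursive traversal that such a library performs can be sketched by hand. In the sketch below, list_children is a hypothetical callable you would supply (for example, a thin wrapper around service.files().list() with the "'%s' in parents" query and pagination shown in this answer); the folder mimeType check uses the standard Drive value application/vnd.google-apps.folder.

```python
FOLDER_MIME = 'application/vnd.google-apps.folder'

def walk_folder(folder_id, list_children):
    """Recursively collect every non-folder file under folder_id.

    list_children(folder_id) must return a list of dicts with at least
    'id', 'name', and 'mimeType' keys (all pages already merged).
    """
    files = []
    for item in list_children(folder_id):
        if item.get('mimeType') == FOLDER_MIME:
            # Descend into the sub-folder first.
            files.extend(walk_folder(item['id'], list_children))
        else:
            files.append(item)
    return files

# Usage with an in-memory stand-in for the Drive API:
tree = {
    'root': [
        {'id': 'f1', 'name': 'sub', 'mimeType': FOLDER_MIME},
        {'id': 'a', 'name': 'a.txt', 'mimeType': 'text/plain'},
    ],
    'f1': [
        {'id': 'b', 'name': 'b.txt', 'mimeType': 'text/plain'},
    ],
}
found = walk_folder('root', lambda fid: tree.get(fid, []))
print([f['name'] for f in found])  # ['b.txt', 'a.txt']
```

This is only a sketch of the recursion; in real use each list_children call would itself loop over nextPageToken as in the scripts above.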
This script also works with the scopes https://www.googleapis.com/auth/drive.readonly and https://www.googleapis.com/auth/drive. When you want to retrieve only the file list directly under the specific folder, you can also use the following script.
from googleapiclient.discovery import build

src = '###'  # Please set the folder ID.
service = build('drive', 'v3', credentials=creds)
fields = 'nextPageToken, files(id, name)'
q = "'%s' in parents" % src
values = []
nextPageToken = ""
while True:
    res = service.files().list(
        q=q,
        fields=fields,
        pageSize=1000,
        pageToken=nextPageToken or "",
        includeItemsFromAllDrives=True,
        supportsAllDrives=True,
    ).execute()
    values.extend(res.get("files"))
    nextPageToken = res.get("nextPageToken")
    if nextPageToken is None:
        break
print(values)
In this case, in order to retrieve the file list from a shared drive, please use supportsAllDrives=True and includeItemsFromAllDrives=True. If you still cannot retrieve the files, please also add corpora="drive" and driveId="###driveId###" to the request.
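For reference, the shared-drive parameters can be assembled in one place. build_list_kwargs below is a hypothetical helper name; the keys it sets (q, fields, pageSize, supportsAllDrives, includeItemsFromAllDrives, corpora, driveId) are the keyword arguments accepted by the Drive API v3 files.list method.

```python
def build_list_kwargs(folder_id, drive_id=None):
    """Assemble keyword arguments for service.files().list().

    When drive_id is given, the search is additionally scoped to that
    shared drive via corpora='drive' and driveId.
    """
    kwargs = {
        'q': "'%s' in parents" % folder_id,
        'fields': 'nextPageToken, files(id, name)',
        'pageSize': 1000,
        'supportsAllDrives': True,
        'includeItemsFromAllDrives': True,
    }
    if drive_id:
        kwargs['corpora'] = 'drive'
        kwargs['driveId'] = drive_id
    return kwargs

print(build_list_kwargs('abc123', drive_id='xyz789')['corpora'])  # drive
```

You would then call service.files().list(**build_list_kwargs(src, drive_id=...), pageToken=...).execute() inside the pagination loop shown above.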