I tried to upload a zip file (containing a .rvt base file and multiple .rvt link files) to an OSS bucket following the steps from here. I received a 504 Gateway Timeout status. My zip file is close to 1GB in size, and upon searching, I understand that large files should be uploaded in chunks using resumable uploads, as described here. Before I tried to chunk the data, I listed the details of my bucket using this API - https://developer.api.autodesk.com/oss/v2/buckets/bucketkey/objects/objectkey/details - and I saw that my zip file had already been uploaded without the resumable API. I then took the objectId from the response of this API and, after base64-encoding it, used it with the Translation APIs. I tested the generated URN in the Forge Viewer and could see my .rvt linked model as well.
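For reference, the encoding step looked roughly like this (a minimal sketch in Node.js; the objectId value is just a placeholder for what the /details response returned in my case):

```typescript
// Placeholder objectId, as returned in the /details response of the bucket
const objectId = 'urn:adsk.objects:os.object:bucketkey/objectkey';

// Base64-encode the objectId to get the URN for the Translation (Model Derivative) APIs;
// the trailing '=' padding is stripped since the docs recommend unpadded base64
const urn = Buffer.from(objectId).toString('base64').replace(/=+$/, '');
console.log(urn); // this is the URN passed to the translation job and the viewer
```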
I would like to know why I received a timeout in the first place, with no objectId in the response from the API call. Since the file was in fact uploaded (as verified by listing the bucket), and I can use the URN generated from it, will there be any hidden issues associated with this timeout? Please share your insights on this. Also, it would be great if you could point me to the correct, detailed steps for uploading larger zip files (> 1GB).
Thank you.
It is somewhat strange that you got a 504 Gateway Timeout error even though the object was successfully uploaded, but it is possible. Uploading is a multi-step process (storing the bytes in cloud storage, checking authorization, updating the records of buckets/objects, etc.), and it can happen that the HTTP request times out after the bucket records have already been updated; in that case the upload succeeds but the client never receives the response containing the objectId, which matches what you saw.
In any case, large files should always be uploaded using the resumable/chunked upload. If you're developing with Node.js, you can take a look at how the resumable upload is implemented in the Forge extension for VS Code: https://github.com/petrbroz/vscode-forge-tools/blob/develop/src/commands/data-management.ts#L230-L272.
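If you'd rather call the OSS endpoints directly, here's a minimal sketch of what a chunked upload could look like in Node.js/TypeScript, assuming the OSS v2 resumable endpoint (PUT .../objects/:objectKey/resumable with Content-Range and Session-Id headers); the 5 MB chunk size and the axios HTTP client are my own choices here, not requirements:

```typescript
import * as fs from 'fs';
import { randomUUID } from 'crypto';
import axios from 'axios';

// 5 MB chunks; OSS requires every chunk except the last to be at least 2 MB
const CHUNK_SIZE = 5 * 1024 * 1024;

async function uploadResumable(bucketKey: string, objectKey: string, filePath: string, accessToken: string) {
    const totalSize = fs.statSync(filePath).size;
    const sessionId = randomUUID(); // ties all chunks to the same upload session
    const fd = fs.openSync(filePath, 'r');
    try {
        for (let offset = 0; offset < totalSize; offset += CHUNK_SIZE) {
            const end = Math.min(offset + CHUNK_SIZE, totalSize) - 1;
            const chunk = Buffer.alloc(end - offset + 1);
            fs.readSync(fd, chunk, 0, chunk.length, offset);
            await axios.put(
                `https://developer.api.autodesk.com/oss/v2/buckets/${bucketKey}/objects/${encodeURIComponent(objectKey)}/resumable`,
                chunk,
                {
                    headers: {
                        Authorization: `Bearer ${accessToken}`,
                        'Content-Type': 'application/octet-stream',
                        // Tells OSS which byte range of the complete file this chunk covers
                        'Content-Range': `bytes ${offset}-${end}/${totalSize}`,
                        'Session-Id': sessionId
                    },
                    maxBodyLength: Infinity
                }
            );
        }
    } finally {
        fs.closeSync(fd);
    }
}
```

If I remember correctly, intermediate chunks come back as 202 Accepted, and the final chunk returns the object details (including the objectId) once OSS has assembled the full file.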