I'm trying to move data from one bucket to another within the same project on Google Cloud. The buckets contain plain images and documents (.png, .jpg, and .pdf files). I have tried several times, but the result is always the same.
Below is the command I'm using:

gsutil -m cp -r -n gs://bucket1/folder1/subfolder/* gs://bucket2/folder1/subfolder/ 2>&1 | tee ./result.txt
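For reference, one variation I'm considering (a sketch only, using the same paths as above) is adding gsutil's -L manifest flag. It writes a per-object status log, and rerunning the identical command with the same manifest skips objects already recorded as transferred, which would make an interrupted run resumable:

```shell
# -m: parallel transfers; -r: recurse; -n: no-clobber (skip existing objects)
# -L manifest.log: record per-object status; rerunning the same command
#   with the same manifest retries only objects not yet marked successful
gsutil -m cp -r -n -L manifest.log \
    gs://bucket1/folder1/subfolder/* \
    gs://bucket2/folder1/subfolder/
```

The manifest is a CSV, so it can also be inspected afterwards to see which objects failed.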
It runs for a few hours, then the screen hangs and I no longer see any logs in the Google console.
My questions are as follows: Is there any setting on GCP that blocks traffic after it has been flowing for several hours, or something similar? Is there a size cap at the bucket level? If so, how can I check it?
Do I have to add any additional parameters to the command above? Do you have any other suggestions for completing this task?
I installed the "screen" package on the instance and started the copy job inside a screen session. Even when the network fluctuated or the console hung, the job kept running in the background. It's a small trick, but it was very helpful for completing the task, and I hope it helps others too. Note: I copied the files and folders rather than moving them.
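The steps above can be sketched as follows (the session name "gsutil-copy" is my own choice; the bucket paths are the same ones from the question):

```shell
# Start a named screen session on the instance
screen -S gsutil-copy

# Inside the session, run the copy; -m parallelizes, -n skips existing objects
gsutil -m cp -r -n gs://bucket1/folder1/subfolder/* \
    gs://bucket2/folder1/subfolder/ 2>&1 | tee ./result.txt

# Detach with Ctrl-A then D; the job keeps running even if SSH drops
# Reattach later to check progress:
screen -r gsutil-copy
```

Because the job runs inside screen rather than the SSH session itself, a dropped connection or a hung console no longer kills the transfer.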