I am able to copy data from one S3 bucket to another as long as the objects are only a few KB in size. But once the size grows to several MB, the task fails with a taskStartToClose timeout (timeout type START_TO_CLOSE). I tried overriding defaultTaskStartToCloseTimeoutSeconds to 60 seconds, but when I execute the workflow I see the value is still 10 seconds. I don't understand why the value differs from the 60 seconds I configured. As a result, the activity fails to copy large files. Here is the code for the copy activity:
@Override
public String copyData(String jobFlowId, String inputFilename) throws IOException, InterruptedException {
    // credentials is a field on this activity implementation
    AmazonS3 s3Client = new AmazonS3Client(credentials);
    String baseOutputFilename = "job/";
    String outputFilename = baseOutputFilename + inputFilename;
    // server-side copy from the source bucket/key to the destination
    CopyObjectRequest copyObjRequest = new CopyObjectRequest(
            "bucket1", "/job/data.txt", "bucket2", outputFilename);
    s3Client.copyObject(copyObjRequest);
    return "s3n://bucket2/" + outputFilename;
}
defaultTaskStartToCloseTimeoutSeconds is passed to SWF when the activity version type is registered. Type registration happens only once, and the registered type is immutable after that, so changing the timeout value in your Java code does not change what is already registered. The solution is either to bump the activity version number (in the @Activities annotation) to force registration of a new activity version, or to explicitly override the timeout by passing an ActivitySchedulingOptions parameter to each activity invocation.
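For illustration, here is a minimal sketch of both fixes. The interface name CopyActivities, the client variable copyActivitiesClient, the version strings, and the 60-second values are my assumptions, not taken from your code:

import com.amazonaws.services.simpleworkflow.flow.ActivitySchedulingOptions;
import com.amazonaws.services.simpleworkflow.flow.annotations.Activities;
import com.amazonaws.services.simpleworkflow.flow.annotations.ActivityRegistrationOptions;
import com.amazonaws.services.simpleworkflow.flow.core.Promise;

// Option 1: bump the version so SWF registers a new activity type
// carrying the new defaults (names and values are illustrative).
@Activities(version = "2.0")  // previously "1.0"
@ActivityRegistrationOptions(
        defaultTaskScheduleToStartTimeoutSeconds = 60,
        defaultTaskStartToCloseTimeoutSeconds = 60)
public interface CopyActivities {
    String copyData(String jobFlowId, String inputFilename);
}

// Option 2: keep the registered type and override the timeout per call
// from inside the workflow implementation. The generated activity client
// has overloads that accept an ActivitySchedulingOptions parameter.
ActivitySchedulingOptions options = new ActivitySchedulingOptions()
        .withStartToCloseTimeoutSeconds(60L);
Promise<String> result = copyActivitiesClient.copyData(jobFlowId, inputFilename, options);

Note that Option 1 changes the default for every invocation of that activity version, while Option 2 applies only to the calls where you pass the override.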