Tags: java, spring-boot, spring-cloud-dataflow

Spring Cloud DataFlow: launching a task after a new file arrives in an AWS S3 bucket source


I'm trying to set up a batch process that is triggered by a new file arriving in an AWS S3 bucket.

So the flow is:

1 - A new file is uploaded to an AWS S3 bucket.
2 - SCDF detects the new file.
3 - SCDF launches the task (a Spring Batch application).
4 - The Spring Batch application processes the file and stores the results in a DB.

Something similar to this recipe, but with an S3 bucket: https://dataflow.spring.io/docs/recipes/batch/sftp-to-jdbc/
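In SCDF shell terms, what I'm aiming for is roughly the following (only a sketch: `inboundS3`, `my-bucket`, and `fileIngestTask` are placeholder names, and the `task.launch.request.taskName` property and `task-launcher-dataflow` sink are taken from the app-starter generation that recipe uses, so exact names may differ by release):

```
stream create inboundS3 --definition "s3 --s3.remote-dir=my-bucket --task.launch.request.taskName=fileIngestTask | task-launcher-dataflow"
stream deploy inboundS3
```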

Maybe I'm misunderstanding the concept, but with the SFTP source I could set the host, port, user, and password, whereas the S3 source doesn't seem to expose region and credentials properties.

Where do I set the AWS properties?


Solution

  • There's an Amazon AWS common options section in the README (see: old-app / new-app), which includes the common AWS-specific properties one can override.

    You can pass them inline in the stream definition, or when deploying the stream by following the deployer properties convention; see the sketch below.
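    For example (a sketch: `inboundS3` and `my-bucket` are placeholders, and the `cloud.aws.*` keys are the Amazon AWS common options from that README, so double-check the exact names for your app-starter release). The first command sets the properties inline in the definition; the second shows the deploy-time alternative using the `app.<app-name>.<property>` convention:

    ```
    stream create inboundS3 --definition "s3 --s3.remote-dir=my-bucket --cloud.aws.credentials.access-key=<access-key> --cloud.aws.credentials.secret-key=<secret-key> --cloud.aws.region.static=us-east-1 --cloud.aws.stack.auto=false | task-launcher-dataflow"

    stream deploy inboundS3 --properties "app.s3.cloud.aws.region.static=us-east-1,app.s3.cloud.aws.credentials.access-key=<access-key>,app.s3.cloud.aws.credentials.secret-key=<secret-key>"
    ```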