The current pipeline has a "Copy data" step that copies files from an SFTP server to a data lake. The second step processes the newly copied data with Azure Functions. Ideally, the pipeline would pass the name or path of each file to the Azure Function, so that the function can read the file from the data lake and eventually store the processed data in PostgreSQL (inside a VNet).
For example, how could I embed the file name or file path in the request body?
Another option would be a trigger on the blob storage / data lake, but I would slightly prefer to pass the path or file name explicitly. Suggestions and ideas are appreciated.
Since you copy the files to the data lake, you can use a storage event trigger to capture the name and path of each created file. You can then pass those values to the Azure Function in the POST body JSON.
For example, @triggerBody().fileName and @triggerBody().folderPath return the file name and the folder path, respectively.
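
As a sketch, the Azure Function activity's body could embed both trigger outputs in the POST JSON via string interpolation. The keys fileName and folderPath here are ones you define yourself, not names fixed by the service:

    {
        "fileName": "@{triggerBody().fileName}",
        "folderPath": "@{triggerBody().folderPath}"
    }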
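
On the function side, the values can then be read from the request body. A minimal Python sketch of an HTTP-triggered function, assuming the illustrative keys above:

    import azure.functions as func

    def main(req: func.HttpRequest) -> func.HttpResponse:
        # Parse the JSON body sent by the pipeline's Azure Function activity
        try:
            body = req.get_json()
        except ValueError:
            return func.HttpResponse("Request body must be JSON", status_code=400)

        file_name = body.get("fileName")
        folder_path = body.get("folderPath")
        if not file_name or not folder_path:
            return func.HttpResponse("Missing fileName or folderPath", status_code=400)

        # Here you would open folder_path/file_name from the data lake
        # and store the processed result in PostgreSQL.
        return func.HttpResponse(f"Processing {folder_path}/{file_name}", status_code=202)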