I'm trying to recursively read multiple files of the same type from a container in Azure Blob Storage, using a Python Function App. How could that be done using the bindings in the orchestrator's function.json, shown below? What changes should be made in local.settings.json, given that I've already specified the connection strings and blob paths there?
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "name": "context",
      "type": "orchestrationTrigger",
      "direction": "in"
    },
    {
      "name": "inputblob",
      "type": "blob",
      "dataType": "string",
      "path": "test/{file_name}.pdf{queueTrigger}",
      "connection": "CONTAINER_CONN_STR",
      "direction": "in"
    }
  ]
}
*test: the container directory I have.
CONTAINER_CONN_STR: the app setting (already specified in local settings) that holds the connection string.
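For context, reading every PDF under a prefix can also be done directly with the blob SDK in a single flat listing, since blob storage has no real directories and list_blobs already covers nested paths. A minimal sketch, assuming the azure-storage-blob package is installed and CONTAINER_CONN_STR is set as an environment variable; the pdf_names helper and the container name "test" are illustrative:

```python
import os

def pdf_names(blob_names):
    # Pure helper: keep only PDF blobs from a flat listing. "Subfolders"
    # such as test/sub/x.pdf show up as plain names containing slashes.
    return [name for name in blob_names if name.endswith(".pdf")]

def read_all_pdfs(container="test"):
    # Requires a live storage account; CONTAINER_CONN_STR is assumed to be
    # the setting holding the connection string, as in the question.
    from azure.storage.blob import ContainerClient
    client = ContainerClient.from_connection_string(
        os.environ["CONTAINER_CONN_STR"], container)
    return {
        name: client.download_blob(name).readall()
        for name in pdf_names(b.name for b in client.list_blobs())
    }
```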
Also, when doing this the normal way without a binding, downloading the files to the local system gives the error below:
Exception: PermissionError: [Errno 13] Permission denied: 'analytics_durable_activity/'
Stack: File "C:\Program Files\Microsoft\Azure Functions Core Tools\workers\python\3.8\WINDOWS\X64\azure_functions_worker\dispatcher.py", line 271, in _handle__function_load_request
func = loader.load_function(
how could that be done using the bindings in the orchestrator's function.json as shown below? What appropriate changes should be made in local settings
The configuration that you have used looks good. For more information, you can refer to this example.
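One thing worth noting: in Durable Functions, a blob input binding is normally attached to an activity function rather than to the orchestrator itself. A minimal sketch of such an activity's __init__.py, assuming a binding like the one above sits in the activity's function.json (the parameter names file_name and inputblob mirror the question; the body is illustrative):

```python
def main(file_name: str, inputblob: str) -> int:
    # The Functions runtime resolves the blob path, downloads the blob,
    # and passes its content in as a string (dataType: "string"),
    # so no SDK call is needed here.
    return len(inputblob)
```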
Also, when doing this the normal way without a binding, downloading the files to the local system gives the error below:
You might get this error when you are trying to open a file but your path is a folder, or if you don't have the required permissions.
You can refer to this SO thread which discusses a similar issue.
REFERENCES: Set, View, Change, or Remove Permissions on Files and Folders | Microsoft Docs