What am I trying to do? I have a data lake container with an HDFS-style namespace, e.g. "container/year/month/day/bunch of files". Files are uploaded on a daily basis, and the folder structure is dynamic, based on the current date. I need my Azure Function to trigger when files are uploaded into the day directory; those files will then be processed and their data dumped into a SQL Server DB [C# code]. The only problem I have is triggering my function over a dynamic directory. Please help me or suggest how to approach this.
Thanks a million.
You don't need a dynamic folder name. The path of a blob trigger has to be set when the function is compiled: you give it as a constant, or read it from an environment variable (app setting).
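(For reference on the app-setting option: a binding path can pull its fixed part from an app setting with a %...% token, which is resolved once when the function host starts. A minimal sketch, assuming a hypothetical app setting named BlobPath holding something like "yourcontainername/2020/06":)

using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class ConfigurablePathFunction
{
    [FunctionName("ConfigurablePathFunction")]
    public static void Run(
        // %BlobPath% is read from app settings at host startup; it is not
        // re-evaluated per invocation, so the path is still fixed at runtime.
        [BlobTrigger("%BlobPath%/{filename}", Connection = "str")] Stream myBlob,
        string filename,
        ILogger log)
    {
        log.LogInformation($"Processed blob {filename}, {myBlob.Length} bytes");
    }
}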
So, there are two ways:
1. The first way is simple. Just do it like this:
using System;
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Extensions.Logging;

namespace FunctionApp23
{
    public static class Function1
    {
        [FunctionName("Function1")]
        public static void Run(
            // {year}, {month}, {day} and {filename} are binding expressions:
            // each one matches a single path segment, so the trigger fires for
            // any blob shaped like yourcontainername/2020/06/15/somefile.csv.
            // "str" is the name of the app setting that holds the storage
            // connection string.
            [BlobTrigger("yourcontainername/{year}/{month}/{day}/{filename}", Connection = "str")] Stream myBlob,
            string filename,
            ILogger log)
        {
            log.LogInformation($"C# Blob trigger function processed blob\n Name: {filename}\n Size: {myBlob.Length} bytes");
        }
    }
}
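Since you mentioned dumping the file contents into SQL Server, here is a minimal sketch of what the processing body could look like. Everything specific below is an assumption for illustration: the Microsoft.Data.SqlClient package, an app setting named SqlConnectionString, and a hypothetical table dbo.DailyFiles with FileName and Content columns.

using System;
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Data.SqlClient;
using Microsoft.Extensions.Logging;

public static class ProcessToSql
{
    [FunctionName("ProcessToSql")]
    public static void Run(
        [BlobTrigger("yourcontainername/{year}/{month}/{day}/{filename}", Connection = "str")] Stream myBlob,
        string filename,
        ILogger log)
    {
        // Read the uploaded blob (assumed here to be text).
        string content;
        using (var reader = new StreamReader(myBlob))
        {
            content = reader.ReadToEnd();
        }

        // "SqlConnectionString" is an assumed app setting holding the
        // SQL Server connection string; "dbo.DailyFiles" is a hypothetical table.
        var connStr = Environment.GetEnvironmentVariable("SqlConnectionString");
        using (var conn = new SqlConnection(connStr))
        {
            conn.Open();
            using (var cmd = new SqlCommand(
                "INSERT INTO dbo.DailyFiles (FileName, Content) VALUES (@name, @content)", conn))
            {
                cmd.Parameters.AddWithValue("@name", filename);
                cmd.Parameters.AddWithValue("@content", content);
                cmd.ExecuteNonQuery();
            }
        }

        log.LogInformation($"Dumped {filename} into SQL Server");
    }
}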
2. The second way: deploy a timer trigger alongside your blob trigger, and put code in it that updates the environment variable. (This timer trigger fires once a day; a rough sketch follows at the end of this answer.)
I don't recommend this method: although it achieves the "dynamic" behaviour, I don't think your use case needs it. In theory the first method is sufficient, but the sketch of the timer-trigger shape is below anyway.
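A minimal sketch, under these assumptions: the blob trigger would read its path prefix from an app setting via a %DailyBlobPath% binding expression (DailyBlobPath is a hypothetical name), and the actual update of that app setting, which has to go through the Azure Management API and restarts the function host, is left as a stub.

using System;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

namespace FunctionApp23
{
    public static class DailyPathUpdater
    {
        // CRON expression: fires once a day at 00:00 UTC.
        [FunctionName("DailyPathUpdater")]
        public static void Run([TimerTrigger("0 0 0 * * *")] TimerInfo timer, ILogger log)
        {
            // The companion blob trigger would be declared as
            // [BlobTrigger("%DailyBlobPath%/{filename}", Connection = "str")].
            var now = DateTime.UtcNow;
            string todayPath = $"yourcontainername/{now:yyyy}/{now:MM}/{now:dd}";

            // Stub: writing the new value of the DailyBlobPath app setting
            // requires a call to the Azure Management API (e.g. via the Azure
            // SDK or REST), and changing an app setting restarts the host.
            log.LogInformation($"Would point DailyBlobPath at {todayPath}");
        }
    }
}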