I want to insert a row into my_logging_table as a log entry whenever a PutSQL processor in my NiFi pipeline fails. So far I have had to create a separate logging PutSQL processor for every pipeline processor, each with the following SQL statement:
insert into schema.my_logging_table values
('Failed-NiFi-processor-name', 'failed', current_timestamp)
This obviously doubles the number of NiFi processors, since each pipeline processor needs its own logging PutSQL processor so that the correct processor name gets logged.
Is there a way to have my pipeline PutSQL processors update a flowfile attribute on failure (I'm thinking of passing along the name of the processor)? That way I could route all failures to a single logging PutSQL processor that reads the failing processor's name from the flowfile attribute and inserts the row into the database. I noticed the UpdateAttribute processor, but I would have to build one of those for every processor as well...
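For illustration, the single logging processor I have in mind would run something like this (failed.processor is an attribute name I made up, and I'm assuming PutSQL's SQL Statement property supports Expression Language, which I believe it does in recent NiFi versions):

insert into schema.my_logging_table values
('${failed.processor}', 'failed', current_timestamp)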
Yes, you'd need to stick an UpdateAttribute processor on the failure branch of each processor, because not all processors write failure attributes on their own. You can see which attributes a processor writes in its documentation - see the Writes Attributes section of the PutSQL docs, for example.
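A minimal sketch of the pattern (the attribute name failed.processor and the value IngestOrdersPutSQL are placeholders of my own choosing):

UpdateAttribute on each processor's failure branch, with one added dynamic property:
    failed.processor = IngestOrdersPutSQL

Then route all of those failure branches (a Funnel works well here) into one shared logging PutSQL whose SQL Statement property uses Expression Language exactly as you sketched above:

insert into schema.my_logging_table values
('${failed.processor}', 'failed', current_timestamp)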
Alternatively, you could