I have created an Apache NiFi flow to fetch a simple table from a MySQL server and copy it into InfluxDB after some data transformation.
I use the PutInfluxDatabaseRecord processor to convert the data from Avro format into InfluxDB line protocol and to insert the data into InfluxDB (using the StandardInfluxDatabaseService 1.15.0-SNAPSHOT controller service).
In the Avro Reader 1.15.2 controller service I have selected the option Schema Access Strategy: Use Embedded Avro Schema.
In the properties of the PutInfluxDatabaseRecord processor I set the Timestamp field property to "time_stamp", which is the name of the field that contains the timestamp in the processor's input flow files.
When running the flow, I get the following error:
PutInfluxDatabaseRecord[id=6d21f785-017e-1000-e0da-0528ab2de725] Failed procession flow file f0c553ef-7a4b-414c-bd9f-be2c6b7bf4f5 due to For input string: "2021-01-11 09:00:00.0": java.lang.NumberFormatException: For input string: "2021-01-11 09:00:00.0"
In the Properties tab of the PutInfluxDatabaseRecord processor, the help text of the Timestamp field property says that the supported types are java.util.Date, java.lang.Number, and java.lang.String.
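My guess from the exception (I have not checked the processor source) is that, because time_stamp arrives as a java.lang.String, the processor tries to parse it as a numeric epoch value, which of course fails for a formatted date string. Something along these lines (the class name is only for illustration):

    // My guess, not the processor's actual code: a formatted date string
    // cannot be parsed as a numeric epoch timestamp.
    public class TimestampParseDemo {
        public static void main(String[] args) {
            String fromMySql = "2021-01-11 09:00:00.0";  // value as it arrives from QueryDatabaseTable
            String epochMillis = "1610355600000";        // 2021-01-11 09:00:00 UTC as epoch milliseconds

            System.out.println(Long.parseLong(epochMillis)); // works
            System.out.println(Long.parseLong(fromMySql));   // NumberFormatException: For input string: "2021-01-11 09:00:00.0"
        }
    }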
Any hints on how I could fix this?
Thanks,
Bernardo
I fixed the issue by changing a setting in the QueryDatabaseTable processor that I use to fetch the data from the MySQL server:
Use Avro Logical Types: true
This prevents NiFi from writing time_stamp as a string: with logical types enabled, QueryDatabaseTable writes the DATETIME column as an Avro long annotated with the timestamp-millis logical type instead of a formatted string.
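For anyone hitting the same problem, here is a rough sketch with the Avro Java API of what the setting changes about the schema of the time_stamp field (the demo class is only illustrative; the behaviour itself is what the QueryDatabaseTable documentation describes for date/time columns):

    import org.apache.avro.LogicalTypes;
    import org.apache.avro.Schema;

    public class TimestampSchemaDemo {
        public static void main(String[] args) {
            // Use Avro Logical Types = false: the MySQL DATETIME column is written as a plain string
            Schema asString = Schema.create(Schema.Type.STRING);

            // Use Avro Logical Types = true: the column becomes a long annotated with timestamp-millis
            Schema asTimestampMillis =
                    LogicalTypes.timestampMillis().addToSchema(Schema.create(Schema.Type.LONG));

            System.out.println(asString);          // "string"
            System.out.println(asTimestampMillis); // {"type":"long","logicalType":"timestamp-millis"}
        }
    }

Since the value now carries proper timestamp semantics (epoch milliseconds in a long) instead of a formatted string, it maps to one of the types the Timestamp field property supports (java.util.Date / java.lang.Number), and the NumberFormatException goes away.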