I have 2 influx databases: 1 on localhost, 1 online accessible from a domain. I save data to the local influx database every few seconds and I want to make a copy every few seconds to the online influx (acting as a cloud). Now, another feature would be great to have: if you lose connection to the cloud, some kind of local buffer collects the data and syncs with the online influx once the connection is up again.
I suggest running only one InfluxDB instance.
The Telegraf data collector supports buffering data in case of network issues. The `metric_buffer_limit` setting controls how many metrics are buffered. Quoting from the Telegraf documentation:
```toml
## Maximum number of unwritten metrics per output. Increasing this value
## allows for longer periods of output downtime without dropping metrics at the
## cost of higher maximum memory usage.
metric_buffer_limit = 10000
```
Adjusting the buffer limit should let you ride out network glitches without losing metrics, and there is no need to run two InfluxDB instances for this: Telegraf can write the same metrics to multiple outputs.
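As a rough sketch of what the Telegraf config could look like with both a local and a remote (cloud) InfluxDB output, each with its own buffer. The URLs, database name, and input plugin here are placeholders you would replace with your own:

```toml
[agent]
  interval = "10s"          # how often inputs are collected
  flush_interval = "10s"    # how often outputs are flushed
  metric_buffer_limit = 10000  # per-output buffer; raise for longer outages

# Example input -- swap in whatever actually produces your data
[[inputs.cpu]]

# Local InfluxDB
[[outputs.influxdb]]
  urls = ["http://localhost:8086"]
  database = "mydb"          # placeholder database name

# Remote "cloud" InfluxDB; metrics are buffered independently for this
# output, so a dropped connection here does not affect the local write
[[outputs.influxdb]]
  urls = ["https://influx.example.com:8086"]  # placeholder domain
  database = "mydb"
```

Because each output keeps its own buffer of unwritten metrics, a connection loss to the remote instance only queues metrics for that output (up to `metric_buffer_limit`); they are retried on the next flush once the connection returns, while writes to the local instance continue unaffected.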