
Efficient way to organize a sqlite3 DB for data acquisition: a single table with all the readings, or one table per DAQ session?


I am building data-logging software in Python that stores a time stamp plus 4 temperature readings in a sqlite3 database, in the table format below. The data arrives over the serial port from an external DAQ at 4 values per second and is written to the DB by Python.

time temp1 temp2 temp3 temp4

One session may take readings over several hours and may log 1000 rows or more.

I intend to run multiple sessions of the same format, each lasting several hours.

My question is: should I log all the values to the same table, as shown below for two independent sessions (there may be more)?

time temp1 temp2 temp3 temp4
session1_time0 session1_temp1 session1_temp2 session1_temp3 session1_temp4
session1_time99 session1_temp1 session1_temp2 session1_temp3 session1_temp4
SESSION2_time0 SESSION2_temp1 SESSION2_temp2 SESSION2_temp3 SESSION2_temp4
SESSION2_time99 SESSION2_temp1 SESSION2_temp2 SESSION2_temp3 SESSION2_temp4

or

Or should I create a separate table for each session inside the sqlite3 database?

Session1 Table inside Database

time temp1 temp2 temp3 temp4
session1_time0 session1_temp1 session1_temp2 session1_temp3 session1_temp4
session1_time99 session1_temp1 session1_temp2 session1_temp3 session1_temp4

SESSION2 Table inside same Database

time temp1 temp2 temp3 temp4
SESSION2_time0 SESSION2_temp1 SESSION2_temp2 SESSION2_temp3 SESSION2_temp4
SESSION2_time99 SESSION2_temp1 SESSION2_temp2 SESSION2_temp3 SESSION2_temp4

SESSION n table inside the same database, and so on

What is the correct way to organize the database in this situation: a single table containing all the readings, or a table per session?

  • Assume the format of the logging remains the same (time + 4 channels)
  • One session happens, its values are committed, and the DB is closed before the next session starts; there are no parallel reads or writes.
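For concreteness, the per-session logging loop I have in mind looks roughly like this (a minimal sketch: `:memory:` stands in for a real database file, and the serial-port parsing is replaced by hard-coded placeholder values):

```python
import sqlite3
import time

# one connection per session; in practice this would be a file path
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE IF NOT EXISTS readings "
    "(time REAL, temp1 REAL, temp2 REAL, temp3 REAL, temp4 REAL)"
)

def log_reading(t, t1, t2, t3, t4):
    """Insert one reading: time stamp + 4 temperature channels."""
    conn.execute(
        "INSERT INTO readings VALUES (?, ?, ?, ?, ?)",
        (t, t1, t2, t3, t4),
    )

# in the real program the four temperatures are parsed from the serial
# port at 4 values per second; placeholders are used here
for _ in range(4):
    log_reading(time.time(), 20.1, 20.5, 21.0, 19.8)
conn.commit()

n_rows = conn.execute("SELECT COUNT(*) FROM readings").fetchone()[0]
```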

Solution

  • Use one table, with a session_id column alongside the measurements. Then again, if this is just for logging, consider plain CSV/JSON files instead, one file per running instance; if you later want to publish the data, import the files into a real database with a COPY command (yes, I am thinking PostgreSQL).
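A minimal sketch of the single-table design with a session_id column (table, column, and index names are illustrative; `:memory:` stands in for a file path):

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # a file path in a real logger
conn.execute(
    "CREATE TABLE IF NOT EXISTS readings ("
    " session_id INTEGER NOT NULL,"
    " time REAL NOT NULL,"
    " temp1 REAL, temp2 REAL, temp3 REAL, temp4 REAL)"
)
# an index on session_id keeps per-session queries fast as sessions pile up
conn.execute(
    "CREATE INDEX IF NOT EXISTS ix_readings_session ON readings(session_id)"
)

# two sessions logged into the same table
rows = [(1, 0.25 * i, 20.0, 20.5, 21.0, 19.8) for i in range(4)]
rows += [(2, 0.25 * i, 22.0, 22.5, 23.0, 21.8) for i in range(3)]
conn.executemany("INSERT INTO readings VALUES (?, ?, ?, ?, ?, ?)", rows)
conn.commit()

# retrieving one session is a plain WHERE clause, not a table lookup
session1 = conn.execute(
    "SELECT time, temp1, temp2, temp3, temp4 FROM readings"
    " WHERE session_id = ?",
    (1,),
).fetchall()
```

Compared with one table per session, this keeps the schema fixed, and cross-session queries (say, comparing two runs) become a simple SQL filter instead of dynamically generated table names.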