I'm using tcpdump to watch packets. I loop over each line of its live output and aggregate data over a 60-second window.
After each window I update the database. I need a row in the database every 60 seconds, even when there are no packets (by inserting null data).
The problem is that when there is no internet or no traffic, reading the next line blocks, so the loop stalls until the next packet arrives and the interval between database updates can exceed 60 seconds.
import subprocess
import time

ts = int(time.time())
p = subprocess.Popen(
    ("tcpdump", "-neqli", "eth0"),
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,
    universal_newlines=True,
)
interval = 60
sessions = {}
for row in iter(p.stdout.readline, ""):
    # Do some work, aggregate data
    split_row = row.split(" ")
    log_ts = split_row[0]
    sessions[log_ts] = "Anything"
    # readline() blocks, so this check only runs when a packet arrives
    if int(time.time()) - interval > ts:
        # Insert in database, then start a fresh window
        insert_in_database(sessions)
        sessions = {}
        ts = int(time.time())
Thanks to @Gerd's answer, I solved this with a thread and a queue.
The producer uses the code above without the timing section: on every new line it gets the current queue data and updates it, and instead of inserting into the database it puts the aggregated data into the queue.
The consumer then loops every 60 seconds, gets the queue data, and inserts it.
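The producer side can be sketched like this (a minimal sketch; the `stream` parameter and the `(timestamp, line)` item shape are my assumptions, not the exact code from the answer):

```python
import io
import queue

def producer(stream, q):
    # Read tcpdump-style lines from `stream` (p.stdout in the real
    # script) and hand each parsed row to the consumer via the queue.
    for row in iter(stream.readline, ""):
        log_ts = row.split(" ")[0]  # first field is the timestamp
        q.put((log_ts, row.rstrip("\n")))
```

In the real script you would start it with `threading.Thread(target=producer, args=(p.stdout, q), daemon=True).start()`, so the blocking `readline()` no longer stalls the 60-second loop.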
I suggest using a producer-consumer approach with two threads. Here is a complete example (with random integer values rather than strings).
In your case, the producer thread would read the tcpdump output from the subprocess and put the results in a queue; the consumer thread would look at the queue every 60 seconds and either find some data or find the queue empty. So your consumer thread's main loop would look something like this:
while True:
    if not q.empty():
        item = q.get()
        print('Getting ' + str(item) + ' : ' + str(q.qsize()) + ' items in queue')
        # insert data into database
    else:
        print('No data')
        # insert null data into database
    time.sleep(60.0)
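To make the whole pattern concrete, here is a self-contained, runnable sketch of the two-thread setup, with the timing scaled down from 60 seconds to fractions of a second and random integers standing in for tcpdump lines (the `produce`/`consume` names and parameters are mine, chosen for illustration):

```python
import queue
import random
import threading
import time

def produce(q, n_items, delay):
    # Stand-in for the tcpdump reader thread: emit random integers.
    for _ in range(n_items):
        q.put(random.randint(0, 100))
        time.sleep(delay)

def consume(q, interval, rounds, results):
    # Every `interval` seconds, drain the queue; record None when it
    # was empty (the "insert null data" case from the answer).
    for _ in range(rounds):
        time.sleep(interval)
        batch = []
        while not q.empty():
            batch.append(q.get())
        results.append(batch if batch else None)

q = queue.Queue()
results = []
prod = threading.Thread(target=produce, args=(q, 10, 0.01))
cons = threading.Thread(target=consume, args=(q, 0.2, 3, results))
prod.start()
cons.start()
prod.join()
cons.join()
print(results)
```

Because the consumer sleeps for a fixed interval instead of blocking on the producer, it appends an entry every round whether or not packets arrived, which is exactly the "row every 60 seconds" guarantee the question asks for.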