Tags: python, python-multithreading, configobj

Is this a decent way of sharing dynamic ConfigObj files?


I have several Python processes which monitor and act on physical I/O, e.g. shutting down a motor if its current is too high. They need to let each other know why they have done something, so I thought a shared file might be a simple solution. The various processes can write to this file, and the others need to know when it has been written to. I'm already using ConfigObj for static configuration files, so I thought I'd give it a try for dynamic files as well. Writes shouldn't occur very often, perhaps once per second at most and usually much less frequently. I came up with this example, which seems to work.

import copy
import os.path
import threading
import time
from configobj import ConfigObj

class ConfigWatcher(threading.Thread):
    """Watch a shared ConfigObj file, reloading it on external changes
    and writing it out when this process changes it internally."""
    def __init__(self, watched_items):
        self.watched_items = watched_items
        self.config = self.watched_items['config']
        super().__init__()

    def run(self):
        self.reload_config()
        while True:
            # First look for external changes (another process wrote the file)
            if self.watched_items['mtime'] != os.path.getmtime(self.config.filename):
                print("external change detected")
                self.reload_config()
            # Then look for internal changes (this process altered the dict)
            if self.watched_items['config'] != self.watched_items['copy']:
                print("internal change detected")
                self.save_config()
            time.sleep(.1)

    def reload_config(self):
        try:
            self.config.reload()
        except Exception:
            # The file may be mid-write by another process; retry on the next poll
            pass
        self.watched_items['mtime'] = os.path.getmtime(self.config.filename)
        self.watched_items['copy'] = copy.deepcopy(self.config)

    def save_config(self):
        self.config.write()
        self.reload_config()

if __name__ == '__main__':
    from random import randint
    config_file = 'test.txt'
    with open(config_file, 'w') as openfile:
        openfile.write('x = 0 # comment\r\n')
    config = ConfigObj(config_file)
    watched_config = {'config': config}  # dictionary to pass to the thread
    watcher = ConfigWatcher(watched_config)
    watcher.daemon = True  # daemonize so we can exit on Ctrl-C
    watcher.start()
    time.sleep(.1)  # let the daemon get going
    while True:
        newval = randint(0, 9)
        print("is:{0} was:{1}, altering dictionary".format(newval, config['x']))
        config['x'] = newval
        time.sleep(1)
        # Simulate another process writing the file directly
        with open(config.filename, 'w') as openfile:
            openfile.write('x = {0} # external write\r\n'.format(randint(10, 19)))
        time.sleep(1)
        print("is:{0} was:{1}".format(config['x'], newval))
        time.sleep(1)

My question is: is there a better/easier/cleaner way of doing this?


Solution

  • Your approach is vulnerable to race conditions if you have multiple processes trying to monitor and update the same files.

    I would tend to use SQLite for this, with a timestamped "log" table to record the messages; a minimal sketch follows below. The "monitor" thread can just check the maximum timestamp or integer key value. Some would say this is overkill, I know, but I'm sure that once you have a shared database in the system you will find other clever uses for it.

    As a bonus, you get auditability; the history of changes can be recorded in the table.
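
    Here is a minimal sketch of that idea, assuming a shared database file events.db and a table named log; the path, table name, and helper functions are illustrative choices, not a fixed API. Each process appends a row saying what it did and why, and a monitor polls for rows with an id greater than the last one it saw.

    import sqlite3
    import time

    DB_PATH = 'events.db'  # hypothetical database file shared by all processes

    def init_db():
        # Safe for every process to call at startup
        with sqlite3.connect(DB_PATH) as conn:
            conn.execute(
                'CREATE TABLE IF NOT EXISTS log ('
                'id INTEGER PRIMARY KEY AUTOINCREMENT, '
                'ts REAL NOT NULL, '
                'source TEXT NOT NULL, '
                'message TEXT NOT NULL)')

    def post_event(source, message):
        # SQLite serializes writers, so concurrent inserts from
        # several processes are safe
        with sqlite3.connect(DB_PATH) as conn:
            conn.execute(
                'INSERT INTO log (ts, source, message) VALUES (?, ?, ?)',
                (time.time(), source, message))

    def poll_events(last_seen_id):
        # A monitor only asks for rows newer than the last id it saw
        with sqlite3.connect(DB_PATH) as conn:
            return conn.execute(
                'SELECT id, ts, source, message FROM log '
                'WHERE id > ? ORDER BY id',
                (last_seen_id,)).fetchall()

    if __name__ == '__main__':
        init_db()
        post_event('motor_guard', 'shut down motor: current too high')
        last_id = 0
        for row_id, ts, source, message in poll_events(last_id):
            print('{0} {1} {2}: {3}'.format(row_id, ts, source, message))
            last_id = row_id

    Because each INSERT is atomic and writers are serialized, two processes can log at the same moment without the lost-update risk of the shared-file approach, and the log table doubles as the audit history mentioned above.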