
Shared config variable between multiple processes


Say you have a config.py which contains:

settings = read_yaml('settings.yaml')

so config.settings is a dictionary.

In one script, foo.py, you have:

import time
import config

config.settings['foo'] = str(time.time())
write_yaml('settings.yaml', config.settings)

and in another script, bar.py, you have:

from time import sleep
import config

while True:
    sleep(10)
    print(config.settings['foo'])

How would you keep the printed value in bar.py up to date with the new value after running foo.py at any time, without the obvious answer of reading the file again? The while loop in bar.py needs to be as quick as possible!

I currently run these in separate shells, i.e.:

$ python bar.py
$ python foo.py

But I could run bar in a thread if that is possible?


Solution

  • I don't know how fast you need this to be. But it would certainly be possible to just reload the config module with importlib.reload. So config.py and foo.py stay the same and your bar.py changes to:

    import importlib
    from time import sleep
    import config
    
    while True:
        print(config.settings['foo'])
        sleep(10)
        importlib.reload(config)
    

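    As a sanity check, the reload behaviour can be demonstrated end to end. This sketch uses a throwaway module written to a temp directory as a stand-in for your config.py (the name demo_config is made up for the demo):

    ```python
    # Demonstrate that importlib.reload re-reads a module from disk.
    import importlib
    import os
    import sys
    import tempfile

    # Create a throwaway module standing in for config.py.
    tmpdir = tempfile.mkdtemp()
    module_path = os.path.join(tmpdir, "demo_config.py")
    with open(module_path, "w") as f:
        f.write("settings = {'foo': 'old'}\n")

    sys.path.insert(0, tmpdir)
    importlib.invalidate_caches()  # make sure the new directory is seen
    import demo_config

    assert demo_config.settings['foo'] == 'old'

    # Simulate foo.py rewriting the file. (A different file length avoids
    # a stale bytecode-cache hit when the mtime resolution is coarse.)
    with open(module_path, "w") as f:
        f.write("settings = {'foo': 'updated'}\n")

    # Without reload the stale value persists; reload re-executes the module.
    importlib.reload(demo_config)
    print(demo_config.settings['foo'])  # prints "updated"
    ```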
    Update

    The example above works for Python >= 3.4, use imp.reload for earlier versions of Python 3 or reload for Python 2.
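  • If reloading on every iteration is still too slow, and you are able to start both workers from one parent script (an assumption, since you currently launch foo.py and bar.py separately), a multiprocessing.Manager dict shares live state between processes without touching settings.yaml at all. The writer/reader functions below are hypothetical stand-ins for foo.py and bar.py:

    ```python
    # Share a settings dict between processes via a Manager proxy,
    # instead of round-tripping updates through settings.yaml.
    import time
    from multiprocessing import Manager, Process

    def writer(settings):
        # Stand-in for foo.py: update the shared value.
        settings['foo'] = str(time.time())

    def reader(settings, out):
        # Stand-in for bar.py: read the current value instantly.
        out.append(settings['foo'])

    if __name__ == '__main__':
        with Manager() as manager:
            settings = manager.dict({'foo': 'initial'})
            results = manager.list()

            w = Process(target=writer, args=(settings,))
            w.start()
            w.join()

            r = Process(target=reader, args=(settings, results))
            r.start()
            r.join()

            final = results[0]  # the value written by the other process
        print(final)
    ```

    The Manager proxy serializes every access through a server process, so it is slower than a plain dict but far faster than re-reading and re-parsing a YAML file on each loop.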