python, python-multithreading

Python Thread Acquire Lock Prioritization


Is there a way to hand a threading.Lock over to a pre-decided thread in Python? Consider the following example:

import os, json
import threading
from threading import Lock, Thread
from time import sleep


class Export:

    def __init__(self):
        self.relative_path = "file.txt"
        self.special_lock  = Lock()

    def push(self, data):
        self.special_lock.acquire()
        if not os.path.exists(self.relative_path):
            open(self.relative_path, 'w').close()
        with open(self.relative_path, 'a') as fo:
            json.dump(data, fo)
        self.special_lock.release()

    def rarePrinting(self):
        while True:
            self.push("Important")
            sleep(1)

    def commonPrinting(self):
        while True:
            self.push("Not So Important")
            sleep(0.1)

export = Export()
t1 = threading.Thread(target=export.rarePrinting)
t2 = threading.Thread(target=export.commonPrinting)
t1.start()
t2.start()

t1 is writing information to the file that is more important than what t2 writes. However, since I don't have a graceful exit in my program, I don't know which thread will be blocked on acquire when it terminates. I just want to guarantee that all of the t1 data that is fetched gets written to the file. Is there a way to prioritize the threads like this without declaring another lock?


Solution

  • Use a priority queue, which handles the locking itself. rare_printing and common_printing simply put items of different priority into the queue, and a third thread, writer, takes the next available item from the queue and appends it to your file.

    import json
    import threading
    from queue import PriorityQueue
    from time import sleep
    
    
    class Export:
    
        def __init__(self):
            self.relative_path = "file.txt"
            self.jobs = PriorityQueue()
    
        def writer(self):
            with open(self.relative_path, 'a') as fo:
                while True:
                    _, data = self.jobs.get()
                    json.dump(data, fo)
    
        def rare_printing(self):
            while True:
                self.jobs.put((0, "Important"))
                sleep(1)
    
        def common_printing(self):
            while True:
                self.jobs.put((1, "Not So Important"))
                sleep(0.1)
    
    
    export = Export()
    t1 = threading.Thread(target=export.rare_printing)
    t2 = threading.Thread(target=export.common_printing)
    t3 = threading.Thread(target=export.writer)
    t1.start()
    t2.start()
    t3.start()
    

    All three threads have equal access to the queue, but whenever rare_printing adds an item, it jumps ahead of anything common_printing has previously queued.
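
    To see that ordering in isolation, here is a minimal standalone snippet (the priorities 0 and 1 mirror the ones used above):

    from queue import PriorityQueue

    q = PriorityQueue()
    q.put((1, "Not So Important"))  # queued first, but lower priority (larger number)
    q.put((0, "Important"))         # queued later, but higher priority (smaller number)

    print(q.get())  # (0, 'Important') comes out first
    print(q.get())  # (1, 'Not So Important')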

    This does assume that writer can remove jobs faster than they are added; if the queue backs up, rare_printing's high-priority items will keep jumping ahead and jobs from common_printing may be starved.
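
    Since the question also mentions the lack of a graceful exit: a common way to guarantee that everything already queued gets written is to stop the producers, then push a sentinel at the lowest possible priority so writer only sees it after draining every real job. The names put_job and SENTINEL below are my own, and the itertools.count() tiebreaker is an extra safeguard that keeps FIFO order within one priority and avoids comparing payloads on ties. A minimal sketch, not a drop-in replacement for the class above:

    import itertools, json, threading
    from queue import PriorityQueue

    jobs = PriorityQueue()
    counter = itertools.count()  # tiebreaker: unique, ever-increasing sequence number

    def put_job(priority, data):
        jobs.put((priority, next(counter), data))

    SENTINEL = None  # hypothetical shutdown marker

    def writer(path):
        with open(path, 'a') as fo:
            while True:
                _, _, data = jobs.get()
                if data is SENTINEL:  # sorts after every real job, so the queue
                    break             # is fully drained before the writer exits
                json.dump(data, fo)
                fo.flush()            # push each record out as it is written

    t = threading.Thread(target=writer, args=("file.txt",))
    t.start()
    put_job(0, "Important")
    put_job(1, "Not So Important")
    put_job(float('inf'), SENTINEL)  # enqueue only after producers have stopped
    t.join()                         # returns once everything queued was written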