So I have been using the `*args`/`**kwargs` functionality in Python for a while, and I came across a problem that I can't seem to find a solution to in the documentation or here on Stack Overflow.
I have a multithreaded job which sends requests to the server in parallel and then does some analysis on the JSON that is returned. For each request-response pair I need to write a line to a CSV file.
Because this is done in parallel, it would be problematic to write to the CSV inside the threaded job function. The solution I came up with is to put the analysis result in a queue as well, and then have a function grab the analysis results from the queue and write them to the CSV in order.
Now to the real problem: the function I use to write to the CSV takes positional and keyword arguments, but the queue doesn't know how to handle that.
Is there a way to put `*args` and `**kwargs` into a queue, then get them out one by one and pass them to another function?
I want something that would look like this:
csv_que = Queue()

# put arguments into the queue to be written later
def write_row_to_que(self, *args, **kwargs):
    csv_que.put(*args, **kwargs)

# pull each set of arguments from the queue and pass
# them to the write_row function
def csv_writer_func():
    while True:
        args, kwargs = csv_que.get()
        write_row(args, kwargs)
I don't know if this is the right way to go about solving this, but I would love to hear some thoughts, and whether this functionality (passing arguments into a queue and then pulling them from it) is possible.
This modification should be enough to get you started:
def write_row_to_que(self, *args, **kwargs):
    csv_que.put((args, kwargs))
Pack the parameters into a single tuple, and the `get()` method at the other end will retrieve them:

args, kwargs = csv_que.get()

After this line, `args` will be a tuple of the positional arguments and `kwargs` will be a dictionary of the keyword arguments originally supplied. To call the writer with them, unpack them again: `write_row(*args, **kwargs)`.
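Putting the pieces together, here is a minimal runnable sketch of the whole pattern. The sentinel value, the `output.csv` filename, and the way keyword arguments are rendered as `key=value` cells are my own illustrative choices, not something from the question:

```python
import csv
import threading
from queue import Queue

csv_que = Queue()
SENTINEL = None  # pushed onto the queue to tell the writer thread to stop

def write_row_to_que(*args, **kwargs):
    # Pack both argument kinds into one tuple so a single
    # put()/get() pair carries them through the queue.
    csv_que.put((args, kwargs))

def csv_writer_func(path):
    # Single consumer thread: the only place that touches the file,
    # so no locking around the CSV writer is needed.
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        while True:
            item = csv_que.get()
            if item is SENTINEL:
                break
            args, kwargs = item
            # Positional args become the leading cells; keyword args
            # are appended as key=value cells (an arbitrary layout).
            row = list(args) + [f"{k}={v}" for k, v in sorted(kwargs.items())]
            writer.writerow(row)

writer_thread = threading.Thread(target=csv_writer_func, args=("output.csv",))
writer_thread.start()

# These calls could come from many worker threads; Queue is thread-safe.
write_row_to_que("GET", "/status", code=200)
write_row_to_que("POST", "/data", code=201, ms=12)

csv_que.put(SENTINEL)  # no more rows
writer_thread.join()
```

Because `queue.Queue` is thread-safe, any number of worker threads can call `write_row_to_que` concurrently while the single writer thread drains the queue in order.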