Tags: python, arguments, multiprocessing, pool, iterable

Multiprocessing - pass shared Queue and unique number for each worker


I can't quite figure out how to pass each worker both a shared Queue and a unique number.

The idea is to create several channels for playing audio songs. Each channel must be unique, so when a song arrives I put it on whichever channel is available.

My code:

from multiprocessing import Pool, Queue
from functools import partial
import pygame
queue = Queue()


def play_song(shared_queue, chnl):

    channel = pygame.mixer.Channel(chnl)
    while True:
        sound_name = shared_queue.get()
        channel.play(pygame.mixer.Sound(sound_name))
    


if __name__ == "__main__":
    channels = [0, 1, 2, 3, 4]

    func = partial(play_song, queue)
    p = Pool(5, func, (channels,))

This code of course doesn't raise any error, because it's multiprocessing, but the problem is that channels is passed to play_song as the whole list instead of one number being mapped to each worker.

So basically, instead of each worker initializing its channel like this:

channel = pygame.mixer.Channel(0)  # each worker would get one number from the list: 0, 1, 2, 3 or 4

I am getting this:

channel = pygame.mixer.Channel([0, 1, 2, 3, 4])  # for each worker

I tried playing with functools.partial, but without success.

I was successful with the pool.map function: I could pass individual numbers from the channels list, but I couldn't share the Queue among the workers.
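
Roughly the kind of attempt I mean (a sketch, not my exact code): passing a plain multiprocessing.Queue through pool.map's arguments fails, because the queue cannot be pickled for the workers.

from multiprocessing import Pool, Queue


def play_song(args):
    shared_queue, chnl = args
    print(chnl, shared_queue)


if __name__ == "__main__":
    queue = Queue()
    with Pool(5) as p:
        # On CPython 3 this typically fails with
        # "RuntimeError: Queue objects should only be shared between processes through inheritance"
        p.map(play_song, [(queue, chnl) for chnl in range(5)])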


Solution

  • Eventually I found a solution to my Pygame problem that does not require threads or multiprocessing.


    Background to the problem:

    I was working with PyAudio, and since it is quite a low-level audio API, I had problems mixing several sounds at the same time (and in general). The reasons are:

    1) It is not easy (maybe impossible) to start several streams at the same time, or to feed those streams at the same time (it looks like a hardware limitation).

    2) Based on 1) I tried a different approach: have one stream into which the audio waves of the different sounds are added up before entering the stream. That works, but it is unreliable, because adding up too many waves results in 'sound cracking' as the amplitudes get too high (see the short sketch after this background).

    Based on 1) and 2) I wanted to try running the streams in different processes, hence this question.
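
    As a small illustration of point 2) (a numpy sketch, not the code I actually used): summing several full-scale waves pushes the samples past the 16-bit range, which is exactly what produces the cracking unless the mix is scaled back down.

    import numpy as np

    t = np.linspace(0, 1, 44100, endpoint=False)
    # two nearly full-scale 16-bit sine waves
    a = (0.9 * 32767 * np.sin(2 * np.pi * 440 * t)).astype(np.int16)
    b = (0.9 * 32767 * np.sin(2 * np.pi * 660 * t)).astype(np.int16)

    mixed = a.astype(np.int32) + b.astype(np.int32)
    print(mixed.max() > 32767)            # True: the naive sum no longer fits in int16

    safe = (mixed // 2).astype(np.int16)  # scaling the sum down keeps it in range (but quieter)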


    Pygame solution (single process):

    for sound_file in sound_files:
        available_channel = pygame.mixer.find_channel()  # if there are 8 channels, it can play 8 sounds at the same time
        available_channel.play(sound_file)
    

    If sound_files are already loaded as pygame.mixer.Sound objects, this gives near-simultaneous playback.
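
    A self-contained version of that idea (the file names and the channel count are placeholders, not my actual setup):

    import pygame

    pygame.mixer.init()
    pygame.mixer.set_num_channels(8)  # up to 8 sounds can play at the same time

    # pre-load the files into Sound objects so play() does not pay the decoding cost
    sound_files = [pygame.mixer.Sound(path) for path in ("kick.wav", "snare.wav")]

    for sound_file in sound_files:
        available_channel = pygame.mixer.find_channel()  # returns None if every channel is busy
        if available_channel is not None:
            available_channel.play(sound_file)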

    Multiprocessing solution

    Thanks to Darkonaut, who pointed out the multiprocessing method, I managed to answer my initial question on multiprocessing. I think it is already answered on Stack Overflow, but I will include it here.

    The example is not finished, because I didn't use it in the end, but it meets my initial requirement: processes with a shared queue, each receiving a different parameter.

    import multiprocessing as mp


    def channel(que, channel_num):
        # each worker gets the same shared queue but its own channel number
        que.put(channel_num)


    if __name__ == '__main__':
        shared_queue = mp.Queue()  # created under the main guard so it also behaves on spawn-based platforms

        processes = [mp.Process(target=channel, args=(shared_queue, channel_num)) for channel_num in range(8)]

        for p in processes:
            p.start()

        for i in range(8):
            print(shared_queue.get())

        for p in processes:
            p.join()