c# multithreading queue synchronize

Synchronize threads' access to a fixed-size queue by thread order


I was asked the following question in an interview:

There is a fixed-size queue of tasks. Threads want to enqueue tasks. If the queue is full, they should wait. The thread order should be preserved: if thread1 came with task1 and thread2 came with task2 after that, task1 should enter the queue before task2.

Other threads want to dequeue tasks and execute them. If the queue is empty, they should wait, and their order should also be preserved: if t3 came before t4, t3 should dequeue a task before t4.

How to achieve this (in pseudo-code)?


Solution

    1. Simple solution: .NET 4.0 introduced the System.Collections.Concurrent namespace, and its classes work quite reliably - I couldn't provoke any errors from them.
      ConcurrentQueue<T> and BlockingCollection<T> are the place to start your research (a short sketch using them appears after this list). But I think your question is not about the standard solution - that's a bad answer in an interview - so:
    2. Solution based on material from Jeffrey Richter's book:
      Base code (C#):

      using System;
      using System.Collections.Generic;
      using System.Threading;
      
      internal sealed class SynchronizedQueue<T> {
          private readonly Object m_lock = new Object();
          private readonly Queue<T> m_queue = new Queue<T>();
      
          public void Enqueue(T item) {
              Monitor.Enter(m_lock);
              try {
                  // After enqueuing an item, wake up any/all waiters
                  m_queue.Enqueue(item);
                  Monitor.PulseAll(m_lock);
              } finally {
                  // Release the lock even if an exception is thrown
                  Monitor.Exit(m_lock);
              }
          }
      
          public T Dequeue() {
              Monitor.Enter(m_lock);
              try {
                  // Loop while the queue is empty (the condition)
                  while (m_queue.Count == 0)
                      Monitor.Wait(m_lock);
                  // Dequeue an item from the queue and return it for processing
                  T item = m_queue.Dequeue();
                  return item;
              } finally {
                  Monitor.Exit(m_lock);
              }
          }
      }
      

      This class is thread-safe, but it still doesn't enforce the ordering - and there are many ways to implement that. From the same book:

      ConcurrentQueue and ConcurrentStack are lock-free; these both internally use Interlocked methods to manipulate the collection.

      So, you must remove the Monitor usage and add a check that your thread is the next one allowed to enqueue an item. This can be done by maintaining the number of current adders and the current queue length in private fields; make these fields volatile.
      You should use Interlocked.Exchange to get the current adder count and Interlocked.Read to get the current queue length.
      After that, you have a unique number for your thread - current length + current adders. Use the SpinWait class to spin until the current length becomes equal to your number, then enqueue the item and leave the Enqueue method (see the sketch after this list).
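      A minimal sketch of that ticket idea (my own illustration, not code from the book; the class and field names are invented). Each enqueuing thread atomically takes a ticket on arrival and then spins until the number of items already enqueued matches its ticket, so items enter the underlying ConcurrentQueue<T> in arrival order:

      using System.Collections.Concurrent;
      using System.Threading;
      
      internal sealed class OrderedBoundedEnqueuer<T> {
          private readonly ConcurrentQueue<T> m_queue = new ConcurrentQueue<T>();
          private readonly int m_capacity;
          private long m_nextTicket;     // next ticket handed to an arriving thread
          private long m_enqueuedCount;  // how many items have been enqueued so far
      
          public OrderedBoundedEnqueuer(int capacity) {
              m_capacity = capacity;
          }
      
          public void Enqueue(T item) {
              // Taking the ticket fixes this thread's position in the arrival order.
              long myTicket = Interlocked.Increment(ref m_nextTicket) - 1;
              SpinWait spinner = new SpinWait();
              // Spin until every earlier thread has enqueued and there is room in the queue.
              // The counters are touched only through Interlocked, so no lock is needed.
              while (Interlocked.Read(ref m_enqueuedCount) != myTicket || m_queue.Count >= m_capacity)
                  spinner.SpinOnce();
              m_queue.Enqueue(item);
              // Publishing the new count releases the holder of the next ticket.
              Interlocked.Increment(ref m_enqueuedCount);
          }
      }

      The dequeue side can be made fair the same way, with its own ticket counter around ConcurrentQueue<T>.TryDequeue. Note that spinning is only reasonable for short waits; for long waits you would fall back to blocking.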
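      For completeness, here is the standard-library route from point 1 (my own example, not from the book): a BlockingCollection<T> over a ConcurrentQueue<T> with a bounded capacity blocks producers when the queue is full and consumers when it is empty, although it does not guarantee the per-thread ordering asked about above:

      using System;
      using System.Collections.Concurrent;
      using System.Threading.Tasks;
      
      internal static class BoundedQueueDemo {
          private static void Main() {
              // Bounded to 2 items: Add blocks while the queue is full,
              // GetConsumingEnumerable blocks while it is empty.
              var queue = new BlockingCollection<int>(new ConcurrentQueue<int>(), 2);
      
              Task producer = Task.Run(() => {
                  for (int i = 0; i < 5; i++)
                      queue.Add(i);          // blocks when 2 items are already queued
                  queue.CompleteAdding();    // tells the consumer no more items will arrive
              });
      
              Task consumer = Task.Run(() => {
                  foreach (int item in queue.GetConsumingEnumerable())
                      Console.WriteLine(item);
              });
      
              Task.WaitAll(producer, consumer);
          }
      }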

    I strongly recommend you study this book's chapters about multithreading and locks - you'll be much better prepared for this type of question. Also try searching for similar questions here. For example:

    Creating a blocking Queue<T> in .NET?