Tags: c#, .net, multithreading, thread-safety, thread-synchronization

Thread Safety Atomic Producer/Consumer


I have an app which utilizes a producer/consumer pattern for processing events (2 threads total). The main (UI) thread atomically writes to a reference-type field, and my instantiated thread atomically reads the field. Here's the demo code:

using System;
using System.Threading;

public class MyApp
{
    private static Thread WorkerThread;        // Thread which processes the events.
    private static EventArgs QueuedEvent;      // The latest event registered by the app.
    private static EventArgs ProcessedEvent;   // The latest event processed by the worker.

    public static void Main(string[] args)
    {
        // Create and start the thread.
        WorkerThread = new(new ThreadStart(DoWork));
        WorkerThread.Start();

        // Attach event
        [...]
    }

    // Event which is fired from the main thread.
    // The timing of this event is unpredictable; 
    // it could be super fast, never fire, or anything in between.
    private static void OnEvent(object sender, EventArgs e)
    {
        // Set this as the latest event.
        // It is OK if the previous event is LOST.
        QueuedEvent = e;
    }

    // Main loop for consumer thread.
    private static void DoWork()
    {
        while (true)
        {
            // Copy QueuedEvent to local variable so it's not overwritten.
            var argsToWorkOn = QueuedEvent;

            // For performance, don't do work if the args haven't changed.
            if (argsToWorkOn != ProcessedEvent)
            {
                // Do expensive work with argsToWorkOn.
                [...]

                // Remember that we already did work with these args.
                ProcessedEvent = argsToWorkOn;
            }
        }
    }
}

The Question

Performance in both threads is the #1 priority here; they must not block each other. Given that this is the case, what is objectively the most performant way to implement thread safety here? Is it:

  • Not necessary? Definitely not. Unless I'm mistaken.
  • Use volatile? This was my initial answer. But after researching for several hours, I truthfully have no idea what to conclude concerning volatile here. I've read arguments for and against volatile, especially with respect to modern architecture.
  • Use lock? Maybe. The main thread could be blocked every time the worker thread reads the field, which is bad. (A sketch of this option follows the list.)
  • Use Interlocked? I don't think it's very applicable here, but I could be wrong.
  • Use ReaderWriterLockSlim? Haven't actually looked too far into it yet, I just know it exists.
  • Something else?
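
For reference, here is a minimal sketch of what the lock option could look like. The SyncRoot field and the ReadQueuedEvent helper are illustrative names I'm introducing here; they are not part of the original code:

private static readonly object SyncRoot = new();   // hypothetical lock object, not in the original code

// Producer (main thread): briefly holds the lock while publishing the latest event.
private static void OnEvent(object sender, EventArgs e)
{
    lock (SyncRoot)
    {
        QueuedEvent = e;
    }
}

// Consumer (worker thread): briefly holds the lock while copying the latest event.
private static EventArgs ReadQueuedEvent()
{
    lock (SyncRoot)
    {
        return QueuedEvent;
    }
}

Each critical section only copies a single reference, so any blocking would be momentary, but it is still a potential stall on both threads.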

Solution

  • What is objectively the most performant way to implement thread safety here?

    Use the volatile keyword:

    private static volatile EventArgs QueuedEvent;
    private static volatile EventArgs ProcessedEvent;
    

    This is the absolute winner regarding the utilization of your CPU, and the absolute performance loser regarding your own time and sanity. You would need months of study, especially of the source code of the .NET runtime, before the confusion goes away and you have a firm understanding of the guarantees that volatile gives you in practice. And I say "in practice" because it could be argued that, according to the official documentation, the .NET source code itself is incorrect, since it makes assumptions that the documentation doesn't provide.

    In short, the volatile keyword ensures that the CPU instructions of your code will not be reordered in a way that would allow one thread to see an EventArgs instance in a partially initialized state. It also ensures that the compiler will not emit code that reads QueuedEvent only once and never again, always returning the same cached value from then on. A thread that reads the volatile QueuedEvent field will see a reasonably fresh value, about as fresh as it would be if you were using lock or Interlocked.
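
    If you would rather keep the fields non-volatile and make the synchronization explicit at each access, the Volatile.Read/Volatile.Write methods from System.Threading give the same acquire/release guarantees per call. Below is a minimal sketch of how that could look with the fields from the question; it is an equivalent alternative I'm sketching under that assumption, not something this answer prescribes:

    using System;
    using System.Threading;

    public class MyApp
    {
        private static EventArgs QueuedEvent;      // Written by the main thread, read by the worker.
        private static EventArgs ProcessedEvent;   // Touched only by the worker thread in this demo.

        // Producer (main thread): publish the latest event with release semantics.
        private static void OnEvent(object sender, EventArgs e)
        {
            Volatile.Write(ref QueuedEvent, e);
        }

        // Consumer (worker thread): read the latest event with acquire semantics.
        private static void DoWork()
        {
            while (true)
            {
                var argsToWorkOn = Volatile.Read(ref QueuedEvent);

                if (argsToWorkOn != ProcessedEvent)
                {
                    // Do expensive work with argsToWorkOn.
                    ProcessedEvent = argsToWorkOn;
                }
            }
        }
    }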