c# · memory-management · nonblocking · volatile · memory-barriers

Volatile and Thread.MemoryBarrier in C#


To implement lock-free code in a multithreaded application I used volatile variables. In theory, the volatile keyword simply ensures that all threads see the most up-to-date value of a volatile variable; so if thread A updates the variable and thread B reads it just after that update happens, B will see the value A just wrote. But according to the book C# 4.0 in a Nutshell, this is incorrect, because

applying volatile doesn’t prevent a write followed by a read from being swapped.
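
(The reordering the book describes is usually illustrated with two threads that each write their own flag and then read the other's; a minimal sketch of that pattern, with made-up names, looks like this:)

class StoreLoadSketch
{
    volatile bool _a, _b;

    void ThreadA()
    {
        _a = true;      // volatile write (release semantics)
        if (!_b)        // volatile read (acquire) - may be moved before the write above
        {
            // with volatile alone, both threads can end up here at the same time
        }
    }

    void ThreadB()
    {
        _b = true;
        if (!_a)
        {
            // ...
        }
    }
}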

Could this problem be solved by putting Thread.MemoryBarrier() before every read of the volatile variable, like this:

private volatile bool _foo = false;

private void A()
{
    //…
    Thread.MemoryBarrier();
    if (_foo)
    {
        //do something
    }
}

private void B()
{
    //…
    _foo = true;
    //…
}

And if this solves the problem, consider a while loop that depends on that variable in one of its conditions; is putting Thread.MemoryBarrier() before the while loop a correct way to fix the issue? For example:

private void A()
{
    Thread.MemoryBarrier();
    while (_someOtherConditions && _foo)
    {
        // do something.
    }
}

To be more precise: I want _foo to give its freshest value whenever any thread asks for it, at any time; so if inserting Thread.MemoryBarrier() before reading the variable fixes the issue, could I use a Foo property instead of the raw _foo field and put the Thread.MemoryBarrier() inside the property's getter, like this:

public bool Foo
{
    get 
    {
        Thread.MemoryBarrier();
        return _foo;
    }
    set
    {
        _foo = value;
    }
}

Solution

  • The "C# In a Nutshell" is correct, but its statement is moot. Why?

    • A 'write' followed by a 'read', without 'volatile', is guaranteed to occur in program order anyway if it affects logic within a single thread.
    • The 'write' before a 'read' in a multi-threaded program is utterly pointless to worry about in your example.

    Let's clarify. Take your original code:

    private void A() 
    { 
        //… 
        if (_foo) 
        { 
            //do something 
        } 
    }
    

    What happens if your thread has already checked the _foo variable, but the scheduler suspends it just before the //do something comment? At that point the other thread could change the value of _foo, which means all your volatiles and Thread.MemoryBarriers counted for nothing! If it is absolutely essential that the "do something" not run when _foo is false, then you have no choice but to use a lock.
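
    If that is your situation, a minimal sketch of the lock-based version (reusing your names; the _gate lock object is just illustrative) is:

    private readonly object _gate = new object();
    private bool _foo;   // 'volatile' is no longer needed: the lock supplies the fences

    private void A()
    {
        lock (_gate)
        {
            if (_foo)
            {
                // do something - B() cannot flip _foo while this lock is held
            }
        }
    }

    private void B()
    {
        lock (_gate)
        {
            _foo = true;
        }
    }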

    However, if it is OK for the "do something" to already be executing when _foo suddenly becomes false, then the volatile keyword was more than enough for your needs.
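
    The same reasoning covers your while-loop question: with _foo declared volatile (and assuming _someOtherConditions is also a volatile field), every evaluation of the loop condition performs a fresh read, so the JIT cannot cache the value in a register across iterations. A sketch:

    private volatile bool _foo = true;
    private volatile bool _someOtherConditions = true;

    private void A()
    {
        // The condition is re-read on every iteration, so a change made by
        // another thread is observed on the next check; the loop body may
        // simply finish its current iteration first.
        while (_someOtherConditions && _foo)
        {
            // do something
        }
    }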

    To be clear: the responders telling you to use a memory barrier are either incorrect or recommending overkill.