
C++ RAII vs. defer?


I've recently begun learning C++; previously I programmed in Go.

I was recently informed that I should not be using new directly, because a thrown exception can cause the allocated memory never to be freed, resulting in a memory leak. One popular solution to this is RAII, and I found a really good explanation of why to use RAII and what it is here.
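Here's a minimal sketch of the kind of leak they meant (do_work() stands in for any call that may throw):

#include <memory>

void do_work();  // any call that might throw

void leaky() {
    int* p = new int(42);
    do_work();      // if this throws, the delete below never runs
    delete p;
}

void safe() {
    auto p = std::make_unique<int>(42);  // RAII: freed on every exit path
    do_work();
}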

However, coming from Go this whole RAII thing seemed unnecessarily complicated. Go has something called defer that solves this problem in a very intuitive way: you just prefix the call you want to run later with defer, e.g. defer free(ptr) or defer close_file(f), and it automagically happens when the surrounding function returns.

I did a search and found two sources that had attempted to implement the defer functionality in C++ here and here. Both ended up with almost exactly the same code, perhaps one of them copied the other. Here they are:

Defer implementation 1:

template <typename F>
struct privDefer {
    F f;
    privDefer(F f) : f(f) {}
    ~privDefer() { f(); }
};

template <typename F>
privDefer<F> defer_func(F f) {
    return privDefer<F>(f);
}

#define DEFER_1(x, y) x##y
#define DEFER_2(x, y) DEFER_1(x, y)
#define DEFER_3(x)    DEFER_2(x, __COUNTER__)
#define defer(code)   auto DEFER_3(_defer_) = defer_func([&](){code;})

Defer implementation 2:

template <typename F>
struct ScopeExit {
    ScopeExit(F f) : f(f) {}
    ~ScopeExit() { f(); }
    F f;
};

template <typename F>
ScopeExit<F> MakeScopeExit(F f) {
    return ScopeExit<F>(f);
}

// helper macros needed for SCOPE_EXIT to compile: the extra level of
// indirection makes the preprocessor expand __LINE__ before pasting
#define DO_STRING_JOIN2(arg1, arg2) arg1 ## arg2
#define STRING_JOIN2(arg1, arg2) DO_STRING_JOIN2(arg1, arg2)

#define SCOPE_EXIT(code) \
    auto STRING_JOIN2(scope_exit_, __LINE__) = MakeScopeExit([=](){code;})
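For context, a minimal usage sketch of either macro (my example, not from the linked posts): the cleanup runs when the enclosing scope is left, whether normally, by an early return, or by an exception.

#include <cstdio>

void read_header(const char* path) {
    FILE* f = std::fopen(path, "rb");
    if (!f) return;
    defer(std::fclose(f));           // implementation 1
    // SCOPE_EXIT(std::fclose(f));   // implementation 2, equivalent

    // ... use f; fclose(f) runs automatically when this scope ends
}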

I have 2 questions:

  1. It seems to me that this defer is essentially doing the same thing as RAII, but much more neatly and intuitively. What is the difference, and do you see any problems with using these defer implementations instead?

  2. I don't really understand what the #define part does in these implementations. What is the difference between the two, and is one of them preferable?


Solution

  • It seems to me that this defer is essentially doing the same thing as RAII, but much more neatly and intuitively. What is the difference, and do you see any problems with using these defer implementations instead?

    RAII's pros:

    • More robust and DRY: with RAII you don't have to remember to write defer at every place a resource is acquired.
    • RAII handles ownership transfer (via move semantics); see the File sketch below.
    • defer can be implemented with RAII, but not the other way around (with movable resources).
    • With RAII you can handle different paths for success and error (e.g. a database commit/rollback when an exception is thrown); you might have finally/on_success/on_failure helpers.
    • It composes: one object can own several resources.
    • It can be used at global scope (even though globals should be avoided in general).

    RAII's cons:

    • You need one class per "resource type" (though the standard library provides several generic ones: containers, smart pointers, lock guards, ...).
    • Destructor code should not throw (Go doesn't have exceptions, but error handling inside defer is also problematic).
    • It can be misused at global scope (static initialization order fiasco, SIOF).

    For real resources, you should really use RAII.
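    To illustrate the ownership-transfer point (my sketch, not from the original answer), a minimal RAII wrapper for a FILE* could look like this; moving a File hands the handle over, and the destructor closes it exactly once:

    #include <cstdio>

    class File {
    public:
        explicit File(const char* path, const char* mode)
            : f_(std::fopen(path, mode)) {}

        File(File&& other) noexcept : f_(other.f_) { other.f_ = nullptr; }
        File& operator=(File&& other) noexcept {
            if (this != &other) {
                close();
                f_ = other.f_;
                other.f_ = nullptr;
            }
            return *this;
        }

        File(const File&) = delete;             // one owner at a time
        File& operator=(const File&) = delete;

        ~File() { close(); }                    // released on every exit path

        std::FILE* get() const { return f_; }

    private:
        void close() {
            if (f_) { std::fclose(f_); f_ = nullptr; }
        }
        std::FILE* f_ = nullptr;
    };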

    For code where you have to roll back or delay a change, a finally-style class might be appropriate. Macros should generally be avoided in C++, so I strongly suggest using the plain RAII syntax instead of the macro wrappers:

    // ..
    ++s[i];
    const auto _ = finally([&](){ --s[i]; });
    backtrack_algo(s, /*..*/);
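    The finally helper used above isn't defined in this answer; a minimal sketch of one (similar in spirit to gsl::finally, names are mine) could be:

    #include <utility>

    template <typename F>
    class final_action {
    public:
        explicit final_action(F f) : f_(std::move(f)) {}

        // Moving transfers responsibility for running the callback.
        final_action(final_action&& other)
            : f_(std::move(other.f_)), active_(other.active_) {
            other.active_ = false;
        }

        final_action(const final_action&) = delete;
        final_action& operator=(const final_action&) = delete;
        final_action& operator=(final_action&&) = delete;

        ~final_action() { if (active_) f_(); }

    private:
        F f_;
        bool active_ = true;
    };

    template <typename F>
    final_action<F> finally(F f) {
        return final_action<F>(std::move(f));
    }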
    

  • I don't really understand what the #define part does in these implementations. What is the difference between the two, and is one of them preferable?

    Both use the same technique: an object whose destructor runs the deferred code (i.e. RAII). The macro (#define) is there to declare a variable with a "unique" identifier of that object's type, so that you can call defer several times in the same function. After macro replacement, SCOPE_EXIT(fclose(f)); on line 42 results in something like:

    auto scope_exit_42 = MakeScopeExit([=](){ fclose(f); });
    

    One uses __COUNTER__, which is not a standard macro but is supported by most compilers (and so really ensures uniqueness). The other uses __LINE__, which is a standard macro, but it would break uniqueness if you call defer twice on the same line.
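    As a side note (my illustration, not part of the original answer), the two-level macros (DEFER_1/DEFER_2 and STRING_JOIN2/DO_STRING_JOIN2) exist because ## pastes its operands before they are macro-expanded, so an extra level of indirection is required to paste the value of __LINE__ or __COUNTER__ rather than the literal token:

    #define CAT_DIRECT(a, b) a##b
    #define CAT(a, b) CAT_DIRECT(a, b)

    // CAT_DIRECT(x, __LINE__) would paste the tokens as-is and produce the
    // identifier x__LINE__, which is not what we want.
    int CAT(x, __LINE__) = 0;  // __LINE__ expands first: declares e.g. x7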

    The other difference is the default capture: [&] (by reference, instead of [=] by value) is fine here, because the lambda never outlives the enclosing scope, so there is no lifetime issue.

    Both also forget to delete (or otherwise handle) the copy/move operations of their type, though in practice the variable generated by the macro cannot really be reused, so it rarely matters.