
Why do references take up memory?


It would seem a reference is simply an alias; yet adding reference fields to a struct increases the structure's size, even when the reference is initialized at declaration as an alias for another field of the same structure.

For example:


#include <iostream>

using namespace std;

int
main(int, char **)
{
    struct {
        int integers[2];
    } first;
    struct {
        int integers[2];
        int &one = integers[0];
        int &two = integers[1];
    } second;

    cout << sizeof first << " " << sizeof first.integers << " " <<
        sizeof second << endl;

    return 0;
}

The above program prints 8 8 24 here. I understand the first two numbers, but not the third. Why does adding such references matter? What is stored in that memory that cannot be resolved at compile time? Unlike pointers, references cannot be reseated after declaration by design, can they? So why are they stored at all?


Solution

  • Even with the first and second structures defined the way you do, I think those reference members cannot be optimized away (not in the particular program you wrote, but in the general case of using those structures). For example, suppose that at some point in code you decide to create an instance of the second structure but initialize the reference members differently, perhaps even dynamically, in a way not known at compile time. Consider the following usage:

    #include <iostream>
    
    int main()
    {
        struct
        {
            int integers[2];
        } first;
    
        struct
        {
            int integers[2];
            int &one = integers[0];
            int &two = integers[1];
        } second;
    
        int user_choice{ 0 };
        std::cin >> user_choice;
    
        int i{ 56 }, j{ 78 };
        decltype(second) third{ {12, 34}, i, (user_choice < 42) ? i : j };
    
        std::cout << third.integers[0] << ' ' << third.integers[1] << ' '
            << third.one << ' ' << third.two << '\n';
    }
    

    In the program above, the compiler simply cannot know beforehand whether third.two will refer to i or to j: that depends on the number the user enters at run time (try it at https://godbolt.org/z/43bM1o by entering 7 instead of 100, for example).
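
    To see that reference members are, in effect, stored like pointers, here is a small comparison sketch (the struct names Plain, WithRefs, and WithPtrs are made up for illustration). The standard leaves it unspecified whether a reference requires storage, but on typical 64-bit compilers the variant with reference members and the variant with pointer members come out the same size:

    ```cpp
    #include <iostream>

    // Baseline: just the array, no indirection members.
    struct Plain {
        int integers[2];
    };

    // Reference members, initialized as aliases for the array elements.
    struct WithRefs {
        int integers[2];
        int &one = integers[0];
        int &two = integers[1];
    };

    // The same layout spelled with explicit pointers.
    struct WithPtrs {
        int integers[2];
        int *one = &integers[0];
        int *two = &integers[1];
    };

    int main()
    {
        // On a typical 64-bit target, Plain is 8 bytes, while the
        // other two each add two pointer-sized members (plus padding),
        // so WithRefs and WithPtrs report the same size.
        std::cout << sizeof(Plain) << ' ' << sizeof(WithRefs) << ' '
                  << sizeof(WithPtrs) << '\n';
    }
    ```

    Each reference member must be able to refer to a different object in a different instance, so the compiler reserves an address-sized slot for it in every object, exactly as it would for a pointer.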