Tags: c++, sum, cpu, computation-theory, computation

What are the chances of a 1 + 1 sum giving the wrong result?


I know that, much as we would like to believe computers are unerring, transistors are not perfect, and at the transistor level 1 + 1 will not always return 2.

I also know that, to protect us from such errors, most computers nowadays have redundancy as well as error-detection and error-correction mechanisms.
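To make "error detection" concrete, here is a minimal sketch of even parity, the simplest such scheme: one extra bit makes the total number of 1-bits even, so any single flipped bit becomes detectable (though not correctable). Real ECC memory uses Hamming-style SECDED codes instead; the helper name `even_parity` and the simulated bit flip below are purely illustrative.

```cpp
#include <bitset>
#include <cstdint>
#include <iostream>

// Even parity over a 64-bit word: true if the number of 1-bits is even.
bool even_parity(std::uint64_t word)
{
    return std::bitset<64>(word).count() % 2 == 0;
}

int main()
{
    std::uint64_t stored = 0b1011;          // value written to "memory"
    bool parity_at_write = even_parity(stored);

    stored ^= (1ULL << 2);                  // simulate a single-bit flip

    bool parity_at_read = even_parity(stored);
    if (parity_at_read != parity_at_write)
        std::cout << "Single-bit error detected\n";
    return 0;
}
```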

That being said, what are the chances of the following C++ program printing the wrong result, without warning? Is there even a chance?

#include <iostream>
using namespace std;

int main()
{
    int a = 1, b = 1;
    int sum = a + b;   // the addition whose correctness is in question

    cout << "Sum = " << sum << endl;

    return 0;
}

Let's assume we are using an average x64 $1000 laptop, as of 2020.

This question has a broader scope: we run billions of calculations per second, and I want to know, at a theoretical level, how much can go wrong in a complex program.
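As a rough, hands-on way to frame that broader question, here is a minimal sketch (my own addition, not part of the original question) that repeats the 1 + 1 addition a billion times and counts mismatches. The iteration count is arbitrary, and `volatile` is only there so the compiler cannot fold the additions away at compile time; on any healthy machine the mismatch count should be zero.

```cpp
#include <chrono>
#include <cstdint>
#include <iostream>

int main()
{
    // volatile forces the reads, so each addition really happens at run time
    volatile int a = 1, b = 1;
    const std::int64_t iterations = 1'000'000'000;  // one billion additions
    std::int64_t mismatches = 0;

    auto start = std::chrono::steady_clock::now();
    for (std::int64_t i = 0; i < iterations; ++i)
    {
        if (a + b != 2)
            ++mismatches;
    }
    auto stop = std::chrono::steady_clock::now();

    std::chrono::duration<double> elapsed = stop - start;
    std::cout << "Additions:  " << iterations << "\n"
              << "Mismatches: " << mismatches << "\n"
              << "Elapsed:    " << elapsed.count() << " s\n";
    return 0;
}
```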


Solution

  • Yes, there is a chance of 1 + 1 yielding something other than 2. The chance of that happening is so close to zero that it cannot be measured.

    This is so for the following reasons:

    1. First of all, the likelihood of things going wrong at the quantum level is infinitesimally low. The term "glitch" does exist in IT, but in the vast majority of cases it turns out to be due to some hardware malfunction, like a network cable not making perfect contact. In the remaining, extremely small percentage of cases where a glitch has been observed in software, it is simply another term for "we are not quite sure why this happened": most likely a logic bug, a multithreading issue, or some other non-quantum effect. Glitches due to quantum uncertainty do not happen at any rate that has ever required consideration in our profession.

    2. The computer system on which you are going to run this little test program of yours is constantly running megabytes of code that perform various other functions, all of which rely on 1 + 1, or any other computation, always yielding the correct result. If the slightest mishap were ever to happen, the computer would crash miserably and spectacularly. So your puny little program does not even need to run: the fact that your computer, and hundreds of millions of computers worldwide, works flawlessly around the clock is proof that 1 + 1 is always computed as 2 with an extremely high degree of certainty.