Tags: sql, sql-server, oracle, sybase

What is the reason behind precision and scale naming?


I have a lot of trouble understanding why precision and scale are named the way they are in database types.

I see precision and scale differently.

PRECISION FROM MY POINT OF VIEW

For me, precision would be how many digits there are to the right of the decimal point. For instance, 1 is less precise than 1.0001.

SCALE FROM MY POINT OF VIEW

Scale would be how much a number can go up or down. For instance, 0 to 1000 is a bigger scale than 0 to 10. Or even 0 to 1.0 is a bigger scale than 0 to 1.

PRECISION AND SCALE IN DATABASE

However, in database lexicon these terms have different meanings: precision is the total number of digits in a number, and scale is the number of digits to the right of the decimal point.
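
For example, as I understand it, a column declared DECIMAL(5, 2) would have precision 5 and scale 2: five digits in total, two of them to the right of the decimal point (a made-up sketch in standard SQL; exact rounding and overflow behavior differs between SQL Server, Oracle and Sybase):

    CREATE TABLE price_example (    -- hypothetical table, for illustration only
        amount DECIMAL(5, 2)        -- precision 5, scale 2: values from -999.99 to 999.99
    );

    INSERT INTO price_example (amount) VALUES (123.46);   -- fits: 3 digits before the point + 2 after = 5
    INSERT INTO price_example (amount) VALUES (123.456);  -- typically rounded to 123.46
    INSERT INTO price_example (amount) VALUES (1234.5);   -- rejected: would need 6 digits of precision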

I'm always forgetting the meaning of these two words because I can't make sense of them.

I hope you can help me understand why they are called this way.


Solution

  • Perhaps it's easier if you consider how the numbers look in scientific notation.

    X * 10^Y
    

    where X has a single digit before the decimal point.

    Now, how big the number is (its "scale") is fundamentally determined by Y. Are we counting in ones? Millions? Thousandths? That's scale.

    Regardless of the absolute scale of the number, the digits in X determine how precise we're being. Can I distinguish 1.1 ones from 1.2 ones? Can I distinguish 1.1 millions from 1.2 millions? Can I distinguish 1.1 thousandths from 1.2 thousandths? All are equivalent - two digits (including the one before the decimal point) of precision.

    If I can distinguish 1.01 millions from 1.00 millions, that's more precise than only being able to distinguish differences of 0.1 millions; that's 3 digits of precision.

    But 1.01*10^-3 is not more precise than 1.01*10^10; it merely operates at a smaller scale.

    Beyond that, I don't know what you want. OK, you've told us what you'd like the words to mean, but that's not what they mean. This is what they mean.


    UPDATE - One other thing I should mention. It may seem that scale and precision are conflated in some way, because if we take a physical example, surely "1 millimeter from the bullseye" is more precise than "1 meter from the bullseye", right?

    But remember that precision and scale describe a variable's data type, not a specific measurement. If measuring in meters, we can't express "1 millimeter from the bullseye" with less than 4 digits of precision ("0.001 meters"); but we could describe "1 meter from the bullseye" with 1 digit of precision. So that actually does align with our desire to call "1 mm from the bullseye" somehow more precise.
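
    To connect this back to the database types (a rough sketch in standard SQL; the table and column names are made up, and exact limits vary by product), the two bullseye measurements above could be declared as:

        CREATE TABLE bullseye_distances (     -- hypothetical table, for illustration only
            distance_fine   DECIMAL(4, 3),    -- precision 4, scale 3: up to 9.999, in steps of 0.001 meters
            distance_coarse DECIMAL(1, 0)     -- precision 1, scale 0: -9 to 9, whole meters only
        );

        -- Both values are distances in meters; the first column keeps 4 digits of precision,
        -- the second keeps only 1.
        INSERT INTO bullseye_distances (distance_fine, distance_coarse) VALUES (0.001, 1);

    The declaration fixes the precision and scale of the type; how close any particular row lands to the bullseye is a property of the measurement, not of the column.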