When querying data from MySQL, I noticed this behavior:
select 1 / 100;
-- 0.0100
select 1 * 0.01;
-- 0.01
I tried running these on different machines, but got the same result: division produces 4 decimal places, while multiplication produces 2.
Can someone tell me why they differ?
The rules for the resulting data type are described here:
In division performed with /, the scale of the result when using two exact-value operands is the scale of the first operand plus the value of the div_precision_increment system variable (which is 4 by default).
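You can verify this by changing the variable for your session and rerunning the query (a quick sketch; div_precision_increment is settable at both global and session scope):

SELECT @@div_precision_increment;        -- 4 (the default)
SET SESSION div_precision_increment = 6;
SELECT 1 / 100;                          -- 0.010000 (scale 0 + 6)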
So we have:
SELECT 1 / 100    -- 0.0100   (scale 0 + 4)
SELECT 1.0 / 100  -- 0.01000  (scale 1 + 4)
SELECT 1.00 / 100 -- 0.010000 (scale 2 + 4)
SELECT 1 / 100.0  -- 0.0100   (only the first operand's scale counts: 0 + 4)
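Multiplication follows a different rule: per the same documentation page, the scale of the result is the sum of the scales of the two operands, with no increment added. That is why 1 * 0.01 keeps exactly 2 decimal places (0 + 2):

SELECT 1 * 0.01    -- 0.01   (scale 0 + 2)
SELECT 1.0 * 0.01  -- 0.010  (scale 1 + 2)
SELECT 1.00 * 0.01 -- 0.0100 (scale 2 + 2)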