I have 2 queries:
(1)
declare @m varchar
set @m='10'
select * from test where month=@m
(2)
declare @m varchar(2)
set @m='10'
select * from test where month=@m
The number of rows in the result is different: the second variant returns more rows than the first. What could be the reason?
That's because when you don't specify how many bytes the varchar variable can hold, the engine uses the default, which is 1:
When n isn't specified in a data definition or variable declaration statement, the default length is 1. If n isn't specified when using the CAST and CONVERT functions, the default length is 30.
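You can see both defaults in action with a quick sketch (runs as-is on SQL Server, no table needed):

declare @v varchar    -- no length given: defaults to varchar(1)
set @v='10'           -- assignment silently truncates to '1'
select @v             -- returns '1'
select cast('123456789012345678901234567890XYZ' as varchar)
                      -- returns the first 30 characters: the CAST/CONVERT default is 30, not 1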
So in the first case @m holds only '1' (the assignment silently truncated '10'), and the query effectively runs as:
select * from test where month='1'
while in the second case @m holds the full '10':
select * from test where month='10'
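You can confirm the truncation directly with datalength (again, no table needed):

declare @m varchar
set @m='10'
select @m as stored_value, datalength(@m) as stored_bytes   -- returns '1', 1

The fix is simply to always give varchar variables an explicit length, as your second query already does.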