I am using SQL Server 2008 R2 Express Edition.
I wrote the query below to generate a number sequence from 1 to @n. While testing it (purely in a query window), I found that if I significantly change the value of @n, I get incorrect results, and re-executing yields the same errors. However, if I open a new query window, the results are perfect.
Looking at the algorithm, I cannot see why the results should be unstable, or how a query could produce varying results for a fixed input at all.
DECLARE @n INT;
SET @n = 65536;

DECLARE @t TABLE (n INT NOT NULL PRIMARY KEY);

IF @n > 0
BEGIN
    DECLARE @r INT, @i INT, @l INT;

    -- @r = FLOOR(LOG2(@n)): the number of doubling passes needed
    SET @r = FLOOR(1.442695040888964 * LOG(@n));
    SET @i = 1;
    SET @l = 0;

    INSERT INTO @t (n) VALUES (1);

    -- double the row count on each pass
    WHILE @l < @r
    BEGIN
        INSERT INTO @t (n) SELECT n + @i FROM @t;
        SET @i = @i * 2;
        SET @l = @l + 1;
    END;

    -- top up the remaining @n - @i rows
    INSERT INTO @t (n) SELECT TOP (@n - @i) n + @i FROM @t;
END;

--SELECT COUNT(1) FROM @t;
SELECT * FROM @t;
EDIT
Change the 65536 to 5000, execute, change it back to 65536, and scroll down to, say, row 169,770. I get the value 40000 in row 169,770. In a new window, it runs correctly.
EDIT2
Besides getting randomly correct/incorrect results, something else appears to be wrong: I am now getting consistently incorrect results for some values, such as 655360.
Add an ORDER BY to the last statement.
There is no natural or default order in a table; without an ORDER BY clause, SQL Server is free to return the rows in any order, and that order can change between executions.
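Concretely, the final statement of the query becomes (only the ORDER BY clause is new):

```sql
-- The order of a result set is undefined unless you request one explicitly.
SELECT * FROM @t
ORDER BY n;  -- n is the primary key, so this sort is cheap
```
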
Edit:
I attribute it to the use of floating point numbers: 1.442695040888964 is 1/LN(2), so FLOOR(1.442695040888964 * LOG(@n)) is computing FLOOR(LOG2(@n)), and floating-point rounding can push that result off by one when @n is at or near an exact power of two.
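One way to sidestep the floating point entirely (my own sketch, not part of the original answer) is to derive @r by repeated doubling with integer arithmetic:

```sql
-- Compute @r = FLOOR(LOG2(@n)) without any floating-point math.
-- Assumes @n > 0 and @n small enough that @p * 2 does not overflow INT.
DECLARE @n INT = 655360;
DECLARE @r INT = 0, @p INT = 1;
WHILE @p * 2 <= @n
BEGIN
    SET @p = @p * 2;   -- @p tracks 2^@r
    SET @r = @r + 1;
END;
SELECT @r AS floor_log2;  -- 19 for @n = 655360, since 2^19 = 524288 <= 655360 < 2^20
```
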
I posted how to build a tally table an hour ago: Maximum recursion has been exhausted
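I cannot reproduce the linked answer here, but a minimal tally-table sketch along those lines (assuming it uses a recursive CTE, which is where the "maximum recursion has been exhausted" error message comes from) looks like:

```sql
-- Generate 1..@n with a recursive CTE instead of a WHILE loop.
DECLARE @n INT = 65536;
WITH Tally (n) AS (
    SELECT 1
    UNION ALL
    SELECT n + 1 FROM Tally WHERE n < @n
)
SELECT n
FROM Tally
ORDER BY n
OPTION (MAXRECURSION 0);  -- lift the default 100-level recursion cap
```
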