I am writing a stored procedure that performs some common checks. To avoid declaring dozens of @variables, the procedure contains code like the following:
IF @type = 'spl'
BEGIN
    SELECT col1, col4, col5, col9, col20, ... INTO #tmp
    FROM tbl1
    WHERE primary_key = @unique_id
END
ELSE
BEGIN
    SELECT col1, col5, col7, col8, col19, ... INTO #tmp
    FROM tbl2
    WHERE primary_key = @unique_id
END
Though these two SELECT ... INTO statements could never run in the same execution, this still causes a "#tmp table already exists" error, and the stored procedure can't be created.
Is there any common workaround for this?
The following is certainly possible, but it looks even worse than declaring dozens of @variables. It is the only thing I can come up with at the moment, though.
SELECT *
INTO #TEMP
FROM
    (SELECT
         CASE WHEN @type = 'spl'
              THEN a.col1
              ELSE b.col1
         END AS col1,
         ...
     FROM ... CROSS JOIN ...
    ) AS src
You can do a "conditional" insert from multiple data sources by combining them with union all and giving each source an individual where clause so that only one of them can produce rows:
insert into #tmp
select col1, col2, ...
from tbl1
where @type = 'spl' and primary_key = unique_id
union all
select col1, col2, ...
from tbl2
where @type <> 'spl' and primary_key = unique_id;
If you want to have the output table created for you, you can use select ... into ... with a where clause that prevents any rows from being inserted. The result will be an empty table with the correct schema:
-- Create empty #tmp table.
select col1, col2, ...
into #tmp
from tbl1
where 42 = 6 * 9; -- Answer to the Ultimate Question of Life, the Universe, and Everything.
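Putting the two snippets together, the whole procedure might look roughly like this. This is only a sketch: the procedure name, parameter types, and the tbl1/tbl2 and col1/col2 names are placeholders carried over from the question.

```sql
-- Sketch only; names and types are placeholders from the question.
CREATE PROCEDURE do_common_checks
    @type      varchar(10),
    @unique_id int
AS
BEGIN
    -- Create the empty #tmp table once; the impossible predicate
    -- keeps it empty while fixing the schema.
    SELECT col1, col2
    INTO #tmp
    FROM tbl1
    WHERE 1 = 0;

    -- Populate it from exactly one source.
    INSERT INTO #tmp
    SELECT col1, col2
    FROM tbl1
    WHERE @type = 'spl' AND primary_key = @unique_id
    UNION ALL
    SELECT col1, col2
    FROM tbl2
    WHERE @type <> 'spl' AND primary_key = @unique_id;
END
```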
Note that executing separate queries using if/else allows the query optimizer to generate an optimal plan for each query rather than for the whole union all mash-up, and it makes the intent clearer for long-term maintenance.
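If you prefer that route, the original if/else structure can be kept: create the empty #tmp once up front and change each branch from select ... into to a plain insert, so only one statement ever creates the table. Again a sketch, reusing the question's placeholder names:

```sql
-- Create the table exactly once, at the top of the procedure.
select col1, col2, ...
into #tmp
from tbl1
where 1 = 0;

if @type = 'spl'
begin
    insert into #tmp
    select col1, col2, ...
    from tbl1
    where primary_key = @unique_id;
end
else
begin
    insert into #tmp
    select col1, col2, ...
    from tbl2
    where primary_key = @unique_id;
end
```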
It may also be a good time to read up on parameter sniffing.
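To see why it matters in this exact pattern: the cached plan for the union all statement is compiled for the first @type value the procedure is called with, which may fit the other branch poorly. One common mitigation (a trade-off, not a universal fix) is OPTION (RECOMPILE):

```sql
-- Recompile this statement on every execution so the optimizer sees
-- the actual parameter values; costs some CPU per call.
insert into #tmp
select col1, col2, ...
from tbl1
where @type = 'spl' and primary_key = @unique_id
union all
select col1, col2, ...
from tbl2
where @type <> 'spl' and primary_key = @unique_id
option (recompile);
```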