Recently I've been working on a SQL Server job that performs our weekly backups. As part of a business requirement, the databases need to be backed up in order from smallest to largest, as accurately as possible.
I've managed to do this with the sp_msforeachdb
stored proc, using code I found that helps determine database size:
IF OBJECT_ID('tempdb..#temp1') IS NOT NULL
DROP TABLE #temp1
CREATE TABLE #temp1
(name varchar(50),
database_size varchar(50),
Freespace varchar(50))
INSERT INTO #temp1(name,database_size,Freespace)
EXEC sp_msforeachdb
'use ?;SELECT database_name = db_name()
,database_size = ltrim(str((convert(DECIMAL(15, 2), dbsize) + convert(DECIMAL(15, 2), logsize)) * 8192 / 1048576, 15, 2))
,''unallocated space'' = ltrim(str((
CASE
WHEN dbsize >= reservedpages
THEN (convert(DECIMAL(15, 2), dbsize) - convert(DECIMAL(15, 2), reservedpages)) * 8192 / 1048576
ELSE 0
END
), 15, 2))
FROM (
SELECT dbsize = sum(convert(BIGINT, CASE
WHEN type = 0
THEN size
ELSE 0
END))
,logsize = sum(convert(BIGINT, CASE
WHEN type <> 0
THEN size
ELSE 0
END))
FROM sys.database_files
) AS files
,(
SELECT reservedpages = sum(a.total_pages)
,usedpages = sum(a.used_pages)
,pages = sum(CASE
WHEN it.internal_type IN (
202
,204
,211
,212
,213
,214
,215
,216
)
THEN 0
WHEN a.type <> 1
THEN a.used_pages
WHEN p.index_id < 2
THEN a.data_pages
ELSE 0
END)
FROM sys.partitions p
INNER JOIN sys.allocation_units a
ON p.partition_id = a.container_id
LEFT JOIN sys.internal_tables it
ON p.object_id = it.object_id
) AS partitions'
SELECT [name] AS [DB_Name], ((CAST (database_size AS DECIMAL(10,2))) - (CAST (Freespace AS DECIMAL (10,2)))) AS Used
FROM #temp1
WHERE [name] <> 'tempdb'
ORDER BY Used
Understandably, I think, my company doesn't put undocumented stored procs in production code, so I've been trying to find a workaround.
The logic I came up with to get around this was to first create a temporary table (#temp) and populate it with a consecutive ID number plus the database ID and name of every database on the server, pulled from the sys.databases table. Each row of #temp therefore holds a sequential ID, the DB_ID, and the Name.
Once this table is created, I've been attempting to loop through the databases in #temp to UPDATE the Used column (which holds the number of pages used in each database, as pulled from SUM(used_pages) in sys.allocation_units). After this, I already have a functional cursor that goes through #temp and runs the backups in the order we need using ORDER BY (a rough sketch of that step follows). So essentially, in this three-part process, I'm just stuck on the step that populates the used space in #temp.
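For reference, the backup step (part 3) looks roughly like the sketch below; the backup path and WITH options are placeholders rather than our actual job settings, and Used is cast because #temp stores it as varchar:
DECLARE @name SYSNAME, @backupSql NVARCHAR(MAX);
DECLARE backup_order CURSOR FOR
    SELECT [Name]
    FROM #temp
    WHERE [Name] <> 'tempdb'
    ORDER BY CAST(Used AS BIGINT) ASC;  -- smallest databases first
OPEN backup_order
FETCH NEXT FROM backup_order
INTO @name
WHILE @@FETCH_STATUS = 0
BEGIN
    -- build and run the backup for the current database
    SET @backupSql = N'BACKUP DATABASE ' + QUOTENAME(@name)
        + N' TO DISK = ''X:\Backups\' + @name + N'.bak'' WITH INIT;';
    EXEC sp_executesql @backupSql;
    FETCH NEXT FROM backup_order
    INTO @name
END
CLOSE backup_order
DEALLOCATE backup_order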
To avoid the terror that cursors can be any more than I already have, I originally tried using a WHILE
loop to populate my #temp table. Here's what I came up with:
DECLARE @sql NVARCHAR(4000);
DECLARE @BigSQL NVARCHAR(4000);
DECLARE @dbName VARCHAR(100);
SET @sql = 'UPDATE tempdb..#temp
SET Used = (SELECT SUM(a.used_pages)
FROM sys.partitions p
INNER JOIN sys.allocation_units a ON p.partition_id = a.container_id
LEFT JOIN sys.internal_tables it ON p.object_id = it.object_id)';
SET @dbName = N'master'
DECLARE @db_name VARCHAR(100)
,@db_size DECIMAL(15, 2)
,@db_freespace DECIMAL(15, 2);
IF object_id ('tempdb..#temp') IS NOT NULL
DROP TABLE #temp
--index me
CREATE TABLE #temp (
ID INT IDENTITY(1,1),
DB_ID INT,
[Name] VARCHAR(50)
,Used VARCHAR(50)
)
INSERT INTO #temp (DB_ID, [Name])
SELECT database_id, [Name]
FROM sys.databases
--#temp check 1
SELECT *
FROM #temp
DECLARE @min INT
, @max INT
, @cur INT;
DECLARE @executionString VARCHAR(MAX),
@DB VARCHAR(50);
SELECT @min = MIN(ID),
@max = MAX(ID),
@cur = MIN(ID)
FROM #temp;
WHILE (@cur <= @max)
BEGIN
SELECT @dbName = [Name] FROM sys.databases WHERE database_id = @cur;
SET @BigSQL = 'USE ' + @dbName + '; EXEC sp_executesql N'''+ @sql + '''';
EXEC(@BigSQL);
SET @cur = @cur + 1;
END
--#temp check 2
SELECT *
FROM #temp
I was never able to get this to work because, from what I understand, variables won't work in a USE [database] statement in dynamic SQL, and the solution proposed by Tom Staab using sp_executesql isn't working for me. Basically, what ended up happening was that every time I executed the code, the number of pages used in the first database referenced populated down to every database in my #temp table. Because of this, I figured the issue might be in the loop, and that transitioning over to a cursor might help. I wrote the following:
DECLARE @used INTEGER
, @Name VARCHAR(100);
IF object_id ('tempdb..#temp') IS NOT NULL
DROP TABLE #temp
--index me
CREATE TABLE #temp (
ID INT IDENTITY(1,1)
, DB_ID INT
, [Name] VARCHAR(50)
,Used VARCHAR(50)
)
INSERT INTO #temp (DB_ID, [Name])
SELECT database_id, [Name]
FROM sys.databases
--#temp check 1
SELECT *
FROM #temp
DECLARE temp_populate CURSOR FOR
SELECT [Name]
FROM #temp
OPEN temp_populate
FETCH NEXT FROM temp_populate
INTO @Name
WHILE @@FETCH_STATUS = 0
BEGIN
SELECT @used = SUM(used_pages)
FROM sys.allocation_units
IF((SELECT Used FROM #temp WHERE [Name] = @Name) IS NULL)
UPDATE #temp
SET Used = @used
WHERE [Name] = @Name;
ELSE
PRINT ''
END
CLOSE temp_populate
DEALLOCATE temp_populate
--#temp check 2
SELECT *
FROM #temp
The issue I'm getting here is that the first database in the #temp table (in this case master) just gets evaluated over and over again until I cancel the query. If I SELECT * FROM #temp after canceling, the first row of the table populates correctly, but all the rest remain NULL.
I know this is quite an extensive post, but I've been working on this for a couple of days now and would honestly appreciate a fresh pair of eyes, or someone more experienced with SQL who could propose a different method entirely for what I'm trying to do. Thanks so much for reading all this!
Try this query to get the size of each database. It reads from sys.master_files, which is a server-wide view, so there is no need to switch database context or loop over the databases at all:
IF OBJECT_ID('tempdb..#temp') IS NOT NULL
    DROP TABLE #temp
CREATE TABLE #temp (
ID INT IDENTITY(1,1)
, DB_ID INT
, [Name] VARCHAR(50)
, Used DECIMAL(8,2)
)
INSERT INTO #temp (DB_ID, [Name], Used)
SELECT
MF.database_id
,database_name = DB_NAME(database_id)
, total_size_mb = CAST(SUM(size) * 8. / 1024 AS DECIMAL(8,2))
FROM sys.master_files MF
GROUP BY database_id
ORDER BY total_size_mb DESC
--#temp check 1
SELECT *
FROM #temp
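With the table populated this way, the smallest-to-largest order the backup job needs just comes from sorting on Used and skipping tempdb, for example:
SELECT ID, [Name], Used AS total_size_mb
FROM #temp
WHERE [Name] <> 'tempdb'
ORDER BY Used ASC   -- smallest first
That result set can drive the existing backup cursor directly.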