I have a big PostgreSQL database with both INTEGER and BIGINT columns. INTEGER has a maximum value of 2,147,483,647 and BIGINT of 9,223,372,036,854,775,807.
How can I find the columns whose current value is close to the type's maximum (about 80-90% used), so that I can change the column type? Are there any tools for that?
I could write a custom script that runs many queries against the database, but maybe there is an existing Linux tool for this.
Thank you.
Your question only makes sense if the columns you are talking about are filled from a sequence.
If I understood you correctly, you can use this query to figure out what percentage of the available numbers has been exhausted:
SELECT relname AS sequence_name,
coalesce(nextval(s.oid) /
CASE a.atttypid
WHEN 'smallint'::regtype THEN 32767::numeric
WHEN 'integer'::regtype THEN 2147483647::numeric
WHEN 'bigint'::regtype THEN 9223372036854775807::numeric
END, 0) * 100 AS percent_used
FROM pg_class AS s
JOIN pg_depend AS d ON d.objid = s.oid
JOIN pg_attribute a ON d.refobjid = a.attrelid
AND d.refobjsubid = a.attnum
WHERE s.relkind = 'S'
AND d.refclassid = 'pg_class'::regclass
AND d.classid = 'pg_class'::regclass
ORDER BY 2 DESC;
This assumes that there is a dependency between the column and the sequence, which is established for serial and bigserial columns as well as for identity columns. Note that nextval consumes a value from the sequence each time you run the query.
This bit of advice probably comes too late, but you should always use bigint for generated keys; then you are never in danger of running out of numbers.
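For example, a new table could be defined with a bigint identity column like this (table and column names are made up for illustration):

    CREATE TABLE orders (
        -- bigint identity key: the sequence range is effectively inexhaustible
        id bigint GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
        created_at timestamp with time zone NOT NULL DEFAULT current_timestamp
    );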