I have an application that passes around a lot of JSON. There is some analytical work that would be best handled by PG, so I thought I'd pass the records to PG, let a trigger extract them into another table, and allow users to run analysis on that other table. However, I cannot seem to get the function to handle the JSON data correctly.
Sample table:
create table if not exists json_testing.test_table
(
json_col jsonb
);
Sample trigger:
create or replace trigger split_record
after insert
on json_testing.test_table
FOR EACH ROW EXECUTE PROCEDURE record_splitter();
Sample JSON:
{
"attr1": 1,
"attr2": 2
}
And the function which I just can't seem to get working:
CREATE OR REPLACE FUNCTION record_splitter()
RETURNS TRIGGER
LANGUAGE PLPGSQL
AS
$$
BEGIN
INSERT INTO json_testing.some_other_table(col1, col2)
SELECT NEW->'attr_1', NEW->'attr_2';
RETURN NEW;
END;
$$
[2023-03-05 18:17:15] [42883] ERROR: operator does not exist: json_testing.test_table -> unknown
[2023-03-05 18:17:15] Hint: No operator matches the given name and argument types. You might need to add explicit type casts.
[2023-03-05 18:17:15] Where: PL/pgSQL function record_splitter() line 3 at SQL statement
I feel like I'm doing something silly here. It seems to be trying to coerce the table itself into some other datatype, as opposed to treating NEW as a json blob...
Like Adrian Klaver said in his comment, you need to reference the column, not the row: NEW is the whole row record, so NEW->'attr_1' makes Postgres look for a -> operator on the table type, which is exactly what the error message is complaining about. NEW.json_col is the jsonb value you want. Note also that your sample JSON uses the keys attr1 and attr2, not attr_1 and attr_2, so the extraction would return NULL even once the operator error is fixed. Finally, for a single-row insert a VALUES clause is more idiomatic than a SELECT with no FROM:
CREATE OR REPLACE FUNCTION record_splitter()
RETURNS TRIGGER
LANGUAGE PLPGSQL
AS $$
BEGIN
INSERT INTO json_testing.some_other_table(col1, col2)
VALUES (NEW.json_col->'attr1', NEW.json_col->'attr2');
RETURN NEW;
END; $$
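One more gotcha to watch for: -> returns jsonb, so it only inserts cleanly if the target columns are jsonb. If some_other_table has typed columns, use ->> (which returns text) and cast. A sketch, assuming a hypothetical some_other_table with integer columns (the original post doesn't show its definition):

    -- Hypothetical target table with typed columns (assumption, not from the post):
    CREATE TABLE IF NOT EXISTS json_testing.some_other_table
    (
        col1 int,
        col2 int
    );

    -- In the trigger body, ->> extracts the value as text, which is then cast:
    INSERT INTO json_testing.some_other_table(col1, col2)
    VALUES ((NEW.json_col->>'attr1')::int, (NEW.json_col->>'attr2')::int);

    -- Quick check: inserting the sample JSON into test_table fires the trigger
    INSERT INTO json_testing.test_table (json_col)
    VALUES ('{"attr1": 1, "attr2": 2}');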