I have a function returning json that contains a timestamptz. When I execute it in the DataGrip console, I get JSON like
{..., "time_created":"2024-05-15T11:14:23.384266+00:00",...}.
But when I execute the same function in C# with ExecuteScalar, the returned value is like
{..., "time_created":"2024-05-15T09:14:23.384266+02:00",...}.
It behaves the same even if the function returns json cast to text beforehand. Note that the timestamps are converted to the server time zone, and the server's time offset is added. This cannot be rendered in Quasar controls without modification.
What causes ExecuteScalar to do the conversion even if text is returned from the function?
If a function returns a table containing timestamptz, the values are returned correctly, and serialization creates values in the form "time_created":"2024-05-15T11:14:23.384266Z", which can be rendered fine in a Quasar input.
Is there a way to force whatever is converting the JSON strings while reading with ExecuteScalar to do it with ...Z?
The function is
create function data_survey_r("Key" integer) returns text language plpgsql as $$
BEGIN
RETURN (SELECT ROW_TO_JSON(t)::text FROM data.survey t WHERE t.id = "Key");
END $$;
or
create function data_survey_r("Key" integer) returns json language plpgsql as $$
BEGIN
RETURN (SELECT ROW_TO_JSON(t) FROM data.survey t WHERE t.id = "Key");
END $$;
It is called from C# as
ret = (string) await connection.ExecuteScalarAsync("data_survey_r", new { Key = key }, commandType: CommandType.StoredProcedure);
EDIT (the text from this comment):
create function test_now() returns json as $$
BEGIN
  RETURN (SELECT json_build_object('now', now()));
END $$
language plpgsql;
C#
NpgsqlConnection con = new NpgsqlConnection(_options.ConnectionString);
con.Open();
var ret = await con.ExecuteScalarAsync("test_now");
Result in console: {"now" : "2024-06-22T12:53:30.515689+00:00"}
Result in C#: {"now" : "2024-06-22T14:53:40.079963+02:00"}
I think we managed to get to the culprit in the comments discussion by using the diagnostic test_now function (whose definition and outputs you later added to the question).
It demonstrates that Postgres itself generates the timestamptz formatting in the JSON, based on the time zone set for your connection/session (which for Npgsql is presumably your computer's time zone, +02:00, and possibly +00:00 UTC for DataGrip).
This Postgres behavior can be quickly tested with psql:
select test_now();
test_now
---------------------------------------------
{"now" : "2024-06-22T19:17:45.53308+03:00"}
(1 row)
SET TIME ZONE 'UTC';
SET
select test_now();
test_now
----------------------------------------------
{"now" : "2024-06-22T16:18:11.827584+00:00"}
(1 row)
ExecuteScalar() does not play a role, contrary to what your question suggests:
What causes ExecuteScalar to do the conversion even if text is returned from the function?
but this confirms your initial observation:
It behaves the same even if the function returns json cast to text beforehand.
We could do the SET TIME ZONE from within our C# code, as suggested here, but we would still get UTC time in a format you don't want (...+00:00 vs ...Z), as per your original question:
Is there a way to force whatever is converting json strings while reading with ExecuteScalar to do it with ...Z?
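For completeness, setting the session time zone from C# could look like the sketch below (inside an async method; the connection string and the test_now function come from the question, and Dapper's ExecuteAsync/ExecuteScalarAsync extensions are assumed to be available via using Dapper;):

```csharp
using Dapper;
using Npgsql;

await using var con = new NpgsqlConnection(_options.ConnectionString);
await con.OpenAsync();

// Force the session time zone so Postgres formats timestamps as UTC...
await con.ExecuteAsync("SET TIME ZONE 'UTC';");

// ...but the JSON string still ends in +00:00, not Z.
var json = (string?)await con.ExecuteScalarAsync("select test_now()");
```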
I am not aware of a way for Dapper to add interceptors when doing ExecuteScalar and modify the generated JSON string with a date format of our choosing, which would be the ideal solution. There is a similar, relatively recent use case with some solutions, but also with this comment from Marc Gravell:
Dapper wants nothing to do with your fancy serialization shenanigans :) Basically, no: read it from the database as a string, then deserialize.
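Following that advice, one option is to read the JSON as a string and normalize the offsets ourselves. A minimal sketch (the helper name NormalizeToZ and the regex are mine; it parses each ISO-8601 timestamp carrying a numeric offset and rewrites it as UTC with a Z suffix):

```csharp
using System;
using System.Globalization;
using System.Text.RegularExpressions;

static string NormalizeToZ(string json) =>
    // Match ISO-8601 timestamps with a numeric offset, e.g.
    // 2024-05-15T09:14:23.384266+02:00, and rewrite them as UTC ...Z.
    Regex.Replace(
        json,
        @"\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}(\.\d+)?[+-]\d{2}:\d{2}",
        m => DateTimeOffset
                 .Parse(m.Value, CultureInfo.InvariantCulture)
                 .UtcDateTime
                 .ToString("yyyy-MM-dd'T'HH:mm:ss.ffffff'Z'",
                           CultureInfo.InvariantCulture));

// Usage:
// var json = (string)await connection.ExecuteScalarAsync(...);
// var normalized = NormalizeToZ(json);  // "...T07:14:23.384266Z"
```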
If that's not available, we have two options.
ExecuteScalar
Your original function
create function data_survey_r("Key" integer) returns text language plpgsql as $$
BEGIN
RETURN (SELECT ROW_TO_JSON(t)::text FROM data.survey t WHERE t.id = "Key");
END $$;
uses ROW_TO_JSON, which doesn't allow formatting suggestions for timestamptz; however, this answer suggests a workaround:
There is no way to influence the format used by row_to_json. You could define a view on your table and use
to_char(row_added_dttm, 'YYYY-MM-DD"T"HH24:MI:SS.MS"Z"')
to format the timestamp as a string. Then you can use row_to_json on that view to get your desired result.
Also note the comment there, which provides an alternative approach.
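Applied to the question's data.survey table, that workaround could look like this sketch (the view name survey_json is mine and only the relevant columns are shown; I use AT TIME ZONE 'UTC' so the Z suffix is truthful, and US instead of the quoted MS to keep the question's microsecond precision):

```sql
-- View that pre-formats time_created as a UTC string ending in Z.
CREATE VIEW data.survey_json AS
SELECT t.id,
       to_char(t.time_created AT TIME ZONE 'UTC',
               'YYYY-MM-DD"T"HH24:MI:SS.US"Z"') AS time_created
FROM data.survey t;

-- ROW_TO_JSON on the view now emits e.g.
-- "time_created":"2024-05-15T11:14:23.384266Z"
SELECT ROW_TO_JSON(v)::text FROM data.survey_json v WHERE v.id = 1;
```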