For efficiency, and to avoid exhausting the PostgreSQL connection pool, I want to execute some requests without waiting for them to succeed or fail. 100% of the requests are heavy, failsafe (internal error catching) functions that return VOID, and I want to be able to execute many of them without having to wait for a response to free the pool connection, because the responses are always going to be null since all those procedures return VOID.
What options do I have? I'm open to different solutions.
I was even wondering if just writing PERFORM myProcedure() would work, since PERFORM expects no return value.
Would this libpq mechanism work if I never retrieve the response, or will it instead block my whole pool because I never retrieved the nulls? http://www.postgresql.org/docs/9.4/static/libpq-async.html
What you appear to be looking for is asynchronous queries with pipelining. Queries are queued up on the connection, executed in the order they're sent, and their results are returned in the order of execution.
This is supported by the PostgreSQL protocol. Some client drivers, including libpq and PgJDBC, support asynchronous queries. However, the only driver I'm aware of that supports pipelining is PgJDBC, and it doesn't support pipelining for individual queries, only queries in a batch run via executeBatch.
To use libpq's async mechanism to do what you want, you'll need a pool of connections (since it can't pipeline on one connection), and you'll need to periodically check whether the query on each connection has completed. You can't fire and forget, and there's no callback mechanism for completions. Something like an epoll loop to check for new pending input would do, and it sounds like that'd be a good fit for node.js's event-driven model.
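As a rough sketch of that pool-plus-readiness-loop approach: the fragment below dispatches one query per connection with PQsendQuery, then waits for socket readiness before draining results. It uses select() rather than epoll for brevity (the structure is the same), and the procedure name my_procedure() and the conninfo string are placeholders, not anything from the original question. Error handling is abbreviated, and it won't run without a live server.

```c
#include <stdio.h>
#include <sys/select.h>
#include <libpq-fe.h>

#define NCONN 4  /* pool size: one query in flight per connection */

int main(void)
{
    PGconn *conns[NCONN];

    for (int i = 0; i < NCONN; i++) {
        conns[i] = PQconnectdb("dbname=mydb");   /* assumed conninfo */
        if (PQstatus(conns[i]) != CONNECTION_OK) {
            fprintf(stderr, "%s", PQerrorMessage(conns[i]));
            return 1;
        }
        PQsetnonblocking(conns[i], 1);
        /* Dispatch without waiting for the result. */
        PQsendQuery(conns[i], "SELECT my_procedure()");
    }

    int pending = NCONN;
    while (pending > 0) {
        fd_set rfds;
        int maxfd = -1;
        FD_ZERO(&rfds);
        for (int i = 0; i < NCONN; i++) {
            if (conns[i] == NULL) continue;
            int fd = PQsocket(conns[i]);
            FD_SET(fd, &rfds);
            if (fd > maxfd) maxfd = fd;
        }
        /* Block until at least one connection has data to read. */
        select(maxfd + 1, &rfds, NULL, NULL, NULL);

        for (int i = 0; i < NCONN; i++) {
            if (conns[i] == NULL) continue;
            if (!FD_ISSET(PQsocket(conns[i]), &rfds)) continue;
            PQconsumeInput(conns[i]);
            if (PQisBusy(conns[i]))
                continue;  /* result not fully received yet */
            /* You still have to read the results, even the nulls,
             * if only to notice errors and free the connection. */
            PGresult *res;
            while ((res = PQgetResult(conns[i])) != NULL) {
                if (PQresultStatus(res) != PGRES_TUPLES_OK)
                    fprintf(stderr, "%s", PQresultErrorMessage(res));
                PQclear(res);
            }
            PQfinish(conns[i]);
            conns[i] = NULL;
            pending--;
        }
    }
    return 0;
}
```

The key point the sketch illustrates: each connection is freed only after its (null) result has been consumed, which is exactly why the work has to be spread across a pool rather than fired blindly down one connection.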
There's been some discussion on adding a pipelined asynchronous mode to libpq, where queries return a future object, and you can check whether the result object is populated and/or wait on it, while also dispatching more queries. Nothing has been implemented yet AFAIK. You might want to get involved. It's not technically all that hard, and is mostly a matter of not having people who want it enough to help implement it. Speak up on pgsql-hackers.
I was even wondering if just writing PERFORM myProcedure() would work, since PERFORM expects no return value.
No. It still waits. The caller has to know whether an error occurred, and subsequent code is also allowed to assume that any side effects of the procedure are complete.
As for node-postgres: asynchronous callbacks look like the default way to do things on node, but the queries still won't be pipelined.
So, I'd either use a pool of connections with libpq, using async queries and an epoll loop, or I'd use node-postgres.