I haven't been able to reach a high number of instances with BigQuery remote functions and a first gen cloud function (link). As such, I've deployed a 2nd gen cloud function with the same code/config, but I get an Access denied error from the BQ web interface.
The connection does have the invoke permission: if I configure the connection to call a first gen cloud function, I don't get an access denied error. This is illustrated below, where the first gen call works while the second gen call does not, even though both use the same connection.
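For reference, the service account attached to the connection can be checked with the `bq` CLI (connection ID `project_name.eu.gcf-con`, matching the queries below):

```shell
# Show the connection's details; the serviceAccountId field in the
# output is the identity BigQuery uses to call the endpoint.
bq show --connection project_name.eu.gcf-con
```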
CREATE OR REPLACE FUNCTION `project_name`.trash.add_fake_first_gen(user_id int64, corp_id STRING) RETURNS STRING REMOTE
WITH CONNECTION `project_name.eu.gcf-con` OPTIONS (endpoint = 'first_gen_url', max_batching_rows=1);
SELECT `project_name.trash.add_fake_first_gen`(1, "B");
CREATE OR REPLACE FUNCTION `project_name`.trash.add_fake_second_gen(user_id int64, corp_id STRING) RETURNS STRING REMOTE
WITH CONNECTION `project_name.eu.gcf-con` OPTIONS (endpoint = 'second_gen_url', max_batching_rows=1);
SELECT `project_name.trash.add_fake_second_gen`(1, "B");
Both cloud functions share the same networking configuration and service account:
Configuration of the first gen cloud function (working):
Configuration of the second gen cloud function (access denied):
Do 2nd gen functions need additional configuration to work with remote functions?
As suggested by @guillaumeblaquiere, the service account associated with the gen2 cloud function should also have the Cloud Run Invoker role:
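A sketch of that grant with gcloud, since gen2 functions are backed by a Cloud Run service (the service name and region below, `SERVICE_NAME` and `europe-west1`, and the service account email `SA_EMAIL` are placeholders to substitute with your own values):

```shell
# Grant the Cloud Run Invoker role on the Cloud Run service that
# backs the gen2 function, so the service account can invoke it.
gcloud run services add-iam-policy-binding SERVICE_NAME \
  --region=europe-west1 \
  --member="serviceAccount:SA_EMAIL" \
  --role="roles/run.invoker"
```

After the binding propagates, the `add_fake_second_gen` query above should stop returning the access denied error.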