Tags: google-cloud-platform, google-cloud-data-fusion

Cloud Data Fusion - Input HTTP Post Body from BQ rows


I am a new Cloud Data Fusion user and have run into a problem I can't find a solution for.

I have a table in BQ with ~150 rows of latitude and longitude points. For each row, I want to pass the lat and lng into an HTTP POST request to get a result from the TravelTime API. Ultimately I want a table with all my original rows plus a column containing the response for each one.

Where I am stuck is that so far I have only been able to hard-code the body of the POST request into the HTTP Source plugin and successfully write the response to a file in GCS. However, I expect the rows will change over time, so I would like to dynamically generate the POST request body from my BQ data and pass it in.

Is this possible with Data Fusion? Is this an advisable approach, or is there a better way?



Solution

  • As @Albert Shau and @user3750486 agreed in the comments:

    There is no out-of-the-box way to dynamically pass data from BQ rows into the body of an HTTP POST request.

    A possible workaround is an HTTP transform plugin that sits in the middle of the pipeline and is configured to make calls based on the input data. The pipeline would then be a BQ source, followed by that plugin, followed by the GCS sink. I think your best bet would be to write a custom transform (a rough sketch of what such a transform could look like is included at the end of this answer).

    This can be done by following the link that @Albert Shau provided, or by writing custom code in a GCP Cloud Function, as the OP did.

    Posting this as a community wiki answer for the benefit of anyone who encounters this use case in the future.

    Feel free to edit this answer for additional information.
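
For reference, below is a rough sketch of what such a custom transform could look like, assuming the CDAP plugin API that Data Fusion is built on. The class name, the input field names (`lat`, `lng`), the output column (`traveltime_response`), and the endpoint and header names are illustrative assumptions; a real plugin would read them from a `PluginConfig` and build the actual TravelTime request payload rather than the placeholder JSON used here.

```java
import io.cdap.cdap.api.annotation.Description;
import io.cdap.cdap.api.annotation.Name;
import io.cdap.cdap.api.annotation.Plugin;
import io.cdap.cdap.api.data.format.StructuredRecord;
import io.cdap.cdap.api.data.schema.Schema;
import io.cdap.cdap.etl.api.Emitter;
import io.cdap.cdap.etl.api.Transform;

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

/**
 * Illustrative transform: for every input record it POSTs the lat/lng to an HTTP API
 * and emits the original fields plus the raw response body as a string column.
 * Field names, endpoint, and headers are placeholders, not the exact TravelTime contract.
 */
@Plugin(type = Transform.PLUGIN_TYPE)
@Name("HttpPostPerRow")
@Description("Posts lat/lng from each record to an HTTP API and appends the response.")
public class HttpPostPerRowTransform extends Transform<StructuredRecord, StructuredRecord> {

  // In a real plugin these would come from a PluginConfig (ideally as macros / secure macros).
  private static final String API_URL = "https://api.traveltimeapp.com/v4/time-filter"; // assumed endpoint
  private static final String APP_ID = "<your-app-id>";
  private static final String API_KEY = "<your-api-key>";

  @Override
  public void transform(StructuredRecord input, Emitter<StructuredRecord> emitter) throws Exception {
    double lat = ((Number) input.get("lat")).doubleValue();  // assumed input field name
    double lng = ((Number) input.get("lng")).doubleValue();  // assumed input field name

    // Build the request body dynamically from the row instead of hard-coding it in a source.
    String body = String.format("{\"lat\": %s, \"lng\": %s}", lat, lng); // replace with the real TravelTime payload
    String response = post(API_URL, body);

    // Output schema = input schema + one extra string column for the response.
    List<Schema.Field> fields = new ArrayList<>(input.getSchema().getFields());
    fields.add(Schema.Field.of("traveltime_response", Schema.of(Schema.Type.STRING)));
    Schema outputSchema = Schema.recordOf("output", fields);

    StructuredRecord.Builder builder = StructuredRecord.builder(outputSchema);
    for (Schema.Field field : input.getSchema().getFields()) {
      builder.set(field.getName(), input.get(field.getName()));
    }
    builder.set("traveltime_response", response);
    emitter.emit(builder.build());
  }

  private static String post(String urlString, String body) throws Exception {
    HttpURLConnection conn = (HttpURLConnection) new URL(urlString).openConnection();
    conn.setRequestMethod("POST");
    conn.setRequestProperty("Content-Type", "application/json");
    conn.setRequestProperty("X-Application-Id", APP_ID); // header names assumed; check the API docs
    conn.setRequestProperty("X-Api-Key", API_KEY);
    conn.setDoOutput(true);
    try (OutputStream os = conn.getOutputStream()) {
      os.write(body.getBytes(StandardCharsets.UTF_8));
    }
    StringBuilder sb = new StringBuilder();
    try (BufferedReader reader =
             new BufferedReader(new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
      String line;
      while ((line = reader.readLine()) != null) {
        sb.append(line);
      }
    }
    return sb.toString();
  }
}
```

To use something along these lines, the transform would be packaged as a plugin JAR and deployed to the Data Fusion instance, after which it can be dropped between the BQ source and the GCS (or BQ) sink in the pipeline.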