google-cloud-platform · google-cloud-composer · google-cloud-data-fusion

How can I pass parameters to a macro in a Data Fusion batch pipeline?


I have seen macros and dynamic pipelines ([Building Dynamic Pipelines in Cloud Data Fusion Using Macros](https://www.qwiklabs.com/focuses/12371?parent=catalog)), but that covers scenarios in development or test environments. How can I pass values to the macros in a production environment? That is, if I have a macro ${bq.dataset} on a BigQuery sink, I need to set its value at runtime, considering that this Data Fusion pipeline is triggered from Cloud Composer or some other scheduling mechanism. In another case, I may need to set a date value in the import query of a database source: an import query with a macro value for dynamic ingestion into Cloud Storage or BigQuery.

In this case, my values for macro can be:

{"bq.dataset": "myTable"} 

I need to implement a production pipeline whose parameters are set at runtime, without human intervention.
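For context, the values I want to pass could be computed programmatically just before the pipeline is triggered. A minimal Python sketch of what I have in mind (the macro name `import.date` is hypothetical, chosen here only to illustrate a date macro inside an import query; `bq.dataset` matches the example above):

```python
from datetime import date, timedelta
import json

def build_runtime_args(run_day: date) -> dict:
    """Build the runtime-argument map that resolves the pipeline's macros.

    Keys must match the macro names used in the pipeline, e.g. a
    ${bq.dataset} macro on the BigQuery sink and a ${import.date}
    macro inside the source's import query (a hypothetical name).
    """
    return {
        "bq.dataset": "myTable",
        # Resolve the date macro to "yesterday" for a daily batch ingestion.
        "import.date": (run_day - timedelta(days=1)).isoformat(),
    }

args = build_runtime_args(date(2021, 6, 15))
print(json.dumps(args))
```

The resulting dictionary is exactly the kind of key/value map I would want the scheduler to hand to the pipeline on each run.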

I hope you can help me find the best way to do this.


Solution

  • I think this can be solved by using preferences. Please let us know if that solves your use case.
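    Preferences can also be set without human intervention through the CDAP REST API that Data Fusion exposes (`PUT /v3/namespaces/{namespace}/apps/{app}/preferences` stores them at the pipeline level, where macros fall back to them). A hedged sketch, assuming Python; the endpoint URL, namespace, and pipeline name below are placeholders you would replace with your instance's values:

    ```python
    import json

    def preferences_request(endpoint: str, namespace: str, app: str, prefs: dict):
        """Build the CDAP REST call that stores preferences at the
        application (pipeline) level so runs resolve their macros from them:
        PUT /v3/namespaces/{namespace}/apps/{app}/preferences
        """
        url = f"{endpoint}/v3/namespaces/{namespace}/apps/{app}/preferences"
        body = json.dumps(prefs)
        return url, body

    url, body = preferences_request(
        "https://example-datafusion/api",  # hypothetical instance API endpoint
        "default",
        "my_pipeline",                     # hypothetical pipeline name
        {"bq.dataset": "myTable"},
    )
    print(url)
    print(body)
    # The request itself would be sent with an OAuth bearer token, e.g.:
    # requests.put(url, data=body, headers={"Authorization": f"Bearer {token}"})
    ```

    The same map can instead be passed as runtime arguments when the run is started (for example from a Cloud Composer task), which takes precedence over preferences for that run.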
