apache-spark, dbt

How to pass Spark configuration parameters to DBT?


I am using dbt to connect to AWS EMR. I can run Spark SQL queries, but where do I set configuration parameters such as spark.sql.shuffle.partitions? In regular Spark code you would set one with:

sqlContext.setConf("spark.sql.shuffle.partitions", "1200")


Solution

  • Since I have not received any other answers, here is what I believe is the current approach (though, again, I am not certain it is the right way to go):

    {{ config(
        materialized='table',
        pre_hook=['SET spark.sql.shuffle.partitions=1200'],
        ...
    )}}
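
    For context, a complete model file using this approach might look like the sketch below. The model name, referenced staging model, and query are hypothetical, chosen only for illustration; the key part is that the pre_hook runs the SET statement in the same Spark session immediately before the model's SQL, so the shuffle-partition setting applies to the materialization that follows:

    -- models/orders_agg.sql (hypothetical model)
    {{ config(
        materialized='table',
        pre_hook=['SET spark.sql.shuffle.partitions=1200']
    ) }}

    select customer_id, count(*) as order_count
    from {{ ref('stg_orders') }}   -- hypothetical upstream model
    group by customer_id

    Note that the setting is scoped to the session the hook runs in, so it does not affect other models unless they declare the same pre_hook.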