apache-spark, databricks, azure-databricks

Databricks spark configuration using secrets in property name


Is it possible to refer to a databricks secret in my property name like this:

fs.azure.account.auth.type.{{secrets/my_scope/my_secret1}}.dfs.core.windows.net OAuth
fs.azure.account.auth.type.{{secrets/my_scope/my_secret2}}.dfs.core.windows.net OAuth

or are secrets only allowed as the assigned value?

My Databricks workflow fails with this message: Failure to initialize configuration for storage account [REDACTED].dfs.core.windows.net: Invalid configuration value detected for fs.azure.account.key

I am not 100% certain the configuration is the problem, so I just want to make sure this is possible before looking for other issues.


Solution

  • No, it's not possible: secrets are matched against the full value of a property, not substituted inside a string, as described in the documentation. It's also easy to verify. For example, start a cluster with such a value, then use a Scala snippet to filter out the relevant entries:

    %scala
    
    spark.conf.getAll.filter { _._1.contains(".dfs.core.windows.net")}
    

    then you should get something like this:

    res1: scala.collection.immutable.Map[String,String] = Map(
      fs.azure.account.auth.type.{{secrets/my_scope/my_secret1}}.dfs.core.windows.net -> OAuth
    )
    

    showing that no substitution happened. The error message is a bit misleading: Databricks detected that the value references a secret and redacted it.
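    Since a secret reference only works as the full value of a property, one workaround is to read the secret at runtime with dbutils.secrets.get and construct the property name yourself. A minimal sketch, assuming my_scope/my_secret1 holds the storage account name (the scope and key names here are the hypothetical ones from the question):

    ```scala
    // The secret reference cannot be embedded inside the property name,
    // so build the full key from the account name instead.
    def accountAuthTypeKey(account: String): String =
      s"fs.azure.account.auth.type.$account.dfs.core.windows.net"

    // In a Databricks notebook (dbutils is only available there), you could then:
    // val account = dbutils.secrets.get(scope = "my_scope", key = "my_secret1")
    // spark.conf.set(accountAuthTypeKey(account), "OAuth")
    ```

    This keeps the secret out of the notebook source while still letting the account name vary per environment.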