I'm using ADF to call Databricks notebooks via a linked service.
When ADF passes a parameter as blank (either by leaving the input box empty or by typing ""), the Databricks notebook widget reads it as two double-quote characters (the literal string "").
As in the image below, both params are read as double quotes in the Databricks notebook (using dbutils.widgets.get()).
To my knowledge, one way to handle this is to add handling code in the notebook after reading the widget input. However, I'm wondering if there's something I've missed, or whether there is any other way around it.
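For context, the post-read handling I mean looks roughly like this (a sketch; clean_widget_value is my own helper name, and the dbutils call is commented out because it only exists inside a Databricks notebook):

```python
def clean_widget_value(raw):
    """Treat a literal pair of double quotes as a blank value."""
    if raw == '""':
        return ""
    return raw

# value = clean_widget_value(dbutils.widgets.get("name"))
```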
In Parameters, if you give a blank value, it takes the value from Default value, whatever you gave earlier. So, check the Default value you have given.
Below is the input passed when I run the notebook with blank values.
If you observe this input closely, even though I did not give quotes, they were added, because the parameter is of type String: whatever you pass is enclosed in quotes automatically. The same happened for you when you passed double quotes.
Inputs:
So, when these inputs are sent to Databricks, it takes them with the double quotes.
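The quoting can be reproduced in plain Python, since json.dumps does to a value roughly what the String parameter type does on the way to the notebook (an illustration, not actual ADF code):

```python
import json

# A blank String parameter arrives as the two characters "",
# and ordinary text gets wrapped in quotes the same way.
print(json.dumps(""))       # prints ""
print(json.dumps("hello"))  # prints "hello"
```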
And below are the run details in Databricks and the run output in ADF.
So, do json.loads() on whichever parameter has the double quotes. Below is the parameter I sent. And when sending data back to ADF, do json.dumps() on it.
import json

# Decode the JSON-quoted widget values ('"abc"' -> 'abc')
x = json.loads(dbutils.widgets.get("name"))
y = json.loads(dbutils.widgets.get("id"))

dt = x + " " + y
# Re-encode before returning so ADF receives valid JSON
dt = json.dumps(dt)
dbutils.notebook.exit(dt)
Output in Databricks:
My suggestion is to send the text directly without double quotes for String-type parameters, or do a json.loads() in the notebook.
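If you are not sure whether a given parameter will arrive quoted, a more tolerant variant is possible (a sketch; read_param is my own helper, not part of dbutils):

```python
import json

def read_param(raw):
    """Decode a JSON-quoted string; pass plain text through unchanged."""
    try:
        value = json.loads(raw)
        return value if isinstance(value, str) else raw
    except (json.JSONDecodeError, TypeError):
        return raw

# read_param('""')    -> ''      (a blank survives as a real empty string)
# read_param('"abc"') -> 'abc'
# read_param('hello') -> 'hello' (unquoted text is left alone)
```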