I'm totally new to Databricks (and in fact to all Azure components).
Long story short:
I need to invoke a JAR file the same way I do in my terminal.
Meaning something like
java -jar C:\Windows\...\myjavaapp.jar javaClass configFile=C:\Windows\...\params.properties aParam1=True aParam2=Field1,Field2,Field3 aParam3=output.csv
That is how I run it in the Windows terminal.
But how do I do this in a Databricks notebook cell? Is it possible? I have already imported the JAR
file into my notebook's libraries.
Locally, I built a Python app to run these commands, and it worked perfectly. Is it possible to do the same in Databricks? I tried, but with no success. I don't know whether I'm doing something wrong or whether it just doesn't work at all. I need to do this dynamically, meaning I build the different commands through string manipulation.
Hope I was clear with my question.
Thanks for your help and time!
EDIT: here's the Python function I'm using to run the command:
import subprocess

def java_run_command(command):
    try:
        print('Trying to run command ' + command)
        result = subprocess.run(command, capture_output=True, text=True, check=True, shell=True)
        output = result.stdout
        error = result.stderr
        if error:
            # date_format_str() is a helper of mine that returns a timestamp string
            print(date_format_str() + " - Some errors were found: \n")
            print(error)
    except subprocess.CalledProcessError as e:
        print(date_format_str() + " - Error executing java command: \n")
        print(e)
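For what it's worth, here is a minimal sketch of how such a command string could be built dynamically in a notebook before passing it to a function like the one above. The DBFS-style paths and parameter values are placeholders, not values confirmed by the question; on Databricks, files uploaded to DBFS are typically reachable from local processes under `/dbfs/...` (an assumption about where the JAR would live):

```python
# Sketch: building the java command through string manipulation.
# All paths below are hypothetical placeholders.
jar_path = "/dbfs/FileStore/jars/myjavaapp.jar"

params = {
    "configFile": "/dbfs/FileStore/config/params.properties",
    "aParam1": "True",
    "aParam2": "Field1,Field2,Field3",
    "aParam3": "output.csv",
}

# Join key=value pairs exactly as in the terminal invocation
args = " ".join(f"{k}={v}" for k, v in params.items())
command = f"java -jar {jar_path} javaClass {args}"
print(command)
```

The resulting string could then be handed to `java_run_command(command)`; whether it succeeds still depends on a JRE being available on the cluster nodes.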
You can run JAR files in Databricks as jobs; follow the steps below.
Go to Workflows and click Create Job.
You will get an interface like the one below; fill in all the details and click Create.
Task name - Give the task a name.
Type - Select the type; here it is JAR.
Main class - Give the name of your class. Make sure the name is correct.
Cluster - Select the cluster on which your task should run.
Dependent libraries - Add your JAR file here.
Parameters - Give the parameters as a list.
After clicking Create you will get the job page.
Here, click Run Now and check the run's details and output.
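If you later want to trigger the job from code instead of the UI, a minimal sketch using the Jobs API `run-now` endpoint could look like the following. The endpoint path and the `jar_params` field are from Jobs API 2.1, but the host, token, and job id are assumptions; the actual HTTP call is shown only as a comment:

```python
# Sketch: building a run-now request for an existing JAR job.
# job_id, host, and token would come from your own workspace.
import json

def build_run_now_payload(job_id, jar_params=None):
    # jar_params, if given, overrides the parameter list defined on the JAR task
    payload = {"job_id": job_id}
    if jar_params:
        payload["jar_params"] = jar_params
    return payload

payload = build_run_now_payload(
    123,  # hypothetical job id
    ["javaClass", "configFile=/dbfs/FileStore/config/params.properties", "aParam1=True"],
)
print(json.dumps(payload))

# The actual call would be something like:
#   requests.post(f"{host}/api/2.1/jobs/run-now",
#                 headers={"Authorization": f"Bearer {token}"}, json=payload)
```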