
How to add third-party Java JAR files for use in PySpark


I have some third-party database client libraries written in Java, and I want to access them through py4j's

java_gateway.py

E.g.: to make the client class (not a JDBC driver!) available to the Python client via the Java gateway:

java_import(gateway.jvm, "org.mydatabase.MyDBClient")

It is not clear where to add the third-party libraries to the JVM classpath. I tried adding them to compute-classpath.sh, but that did not seem to work. I get:

Py4JError: Trying to call a package

Also, when comparing to Hive: the Hive JAR files are not loaded via compute-classpath.sh, which makes me suspicious. There seems to be some other mechanism for setting up the JVM-side classpath.


Solution

  • You can add external JARs as arguments when launching pyspark; see the sketch after this list for doing the same thing programmatically:

    pyspark --jars file1.jar,file2.jar
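
If you build the session from a standalone Python script rather than the pyspark shell, the same idea can be expressed with the spark.jars configuration property. A minimal sketch, assuming a hypothetical JAR at /path/to/mydb-client.jar that contains the org.mydatabase.MyDBClient class from the question (the path, app name, and no-argument constructor are all placeholders):

    from py4j.java_gateway import java_import
    from pyspark.sql import SparkSession

    # Put the JAR on the classpath before the JVM starts; this is the
    # programmatic equivalent of "pyspark --jars".
    spark = (
        SparkSession.builder
        .appName("third-party-jar-example")           # hypothetical app name
        .config("spark.jars", "/path/to/mydb-client.jar")  # hypothetical path
        .getOrCreate()
    )

    # Make the class reachable through the Py4J gateway, as in the question.
    jvm = spark.sparkContext._jvm
    java_import(jvm, "org.mydatabase.MyDBClient")

    # Assumes the client class has a no-argument constructor.
    client = jvm.org.mydatabase.MyDBClient()

Note that spark.jars must be set before the session (and hence the JVM) is created; adding it to an already-running SparkContext has no effect, which is likely why editing compute-classpath.sh after the fact appeared not to work.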