scala, apache-spark, kubernetes, k3s, rancher-desktop

spark-submit error "To use support for EC Keys you must explicitly add this dependency to classpath"


I am running a local K3s Kubernetes cluster created by Rancher Desktop.

kubectl cluster-info returns:

Kubernetes control plane is running at https://127.0.0.1:6443
# ...

When I submit the Spark application (written in Scala) with:

spark-submit \
        --master=k8s://https://127.0.0.1:6443 \
        --deploy-mode=cluster \
        --name=findretiredpeople \
        --class=com.sundogsoftware.spark.Hello \
        --conf=spark.executor.instances=2 \
        --conf=spark.kubernetes.container.image=hongbo-miao/hello:latest \
        local:///target/scala-2.12/hello_2.12-1.0.jar

I got this error:

23/03/20 18:14:10 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
23/03/20 18:14:10 INFO SparkKubernetesClientFactory: Auto-configuring K8S client using current context from users K8S config file
Exception in thread "main" io.fabric8.kubernetes.client.KubernetesClientException: JcaPEMKeyConverter is provided by BouncyCastle, an optional dependency. To use support for EC Keys you must explicitly add this dependency to classpath.
  at io.fabric8.kubernetes.client.internal.CertUtils.handleECKey(CertUtils.java:164)
  at io.fabric8.kubernetes.client.internal.CertUtils.loadKey(CertUtils.java:134)
  at io.fabric8.kubernetes.client.internal.CertUtils.createKeyStore(CertUtils.java:112)
  at io.fabric8.kubernetes.client.internal.CertUtils.createKeyStore(CertUtils.java:247)
  at io.fabric8.kubernetes.client.internal.SSLUtils.keyManagers(SSLUtils.java:153)
  at io.fabric8.kubernetes.client.internal.SSLUtils.keyManagers(SSLUtils.java:147)
  at io.fabric8.kubernetes.client.utils.HttpClientUtils.applyCommonConfiguration(HttpClientUtils.java:204)
  at io.fabric8.kubernetes.client.okhttp.OkHttpClientFactory.createHttpClient(OkHttpClientFactory.java:89)
  at org.apache.spark.deploy.k8s.SparkKubernetesClientFactory$.createKubernetesClient(SparkKubernetesClientFactory.scala:118)
  at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.$anonfun$run$4(KubernetesClientApplication.scala:242)
  at org.apache.spark.util.Utils$.tryWithResource(Utils.scala:2763)
  at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.run(KubernetesClientApplication.scala:242)
  at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.start(KubernetesClientApplication.scala:214)
  at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:958)
  at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
  at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
  at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
  at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1046)
  at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1055)
  at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
23/03/20 18:14:10 INFO ShutdownHookManager: Shutdown hook called
23/03/20 18:14:10 INFO ShutdownHookManager: Deleting directory /private/var/folders/22/ntjwd5dx691gvkktkspl0f_00000gq/T/spark-15d1055d-c2fb-40d4-81ff-66b602595979

Solution

  • After being stuck here for a while, I suddenly remembered that I had hit the same error with Apache Flink last year, so I tried a similar approach, and it worked for Apache Spark too!
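
    As background: the kubeconfig that K3s generates uses an elliptic-curve (EC) client key (that is what handleECKey in the stack trace is choking on), and the fabric8 Kubernetes client inside spark-submit needs BouncyCastle on the classpath to parse it. If you want to confirm the key type on your own cluster, a check along these lines should work (assuming the default kubeconfig location):

    grep client-key-data ~/.kube/config \
      | awk '{print $2}' \
      | base64 --decode \
      | head -n 1
    # K3s typically prints: -----BEGIN EC PRIVATE KEY-----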

    Here is the solution:

    Download the latest versions of the bcprov-jdk15on and bcpkix-jdk15on JAR files from Maven Central.
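
    For example, with curl (the 1.70 version number is just what was current for me; check Maven Central for the newest release):

    curl -L -O https://repo1.maven.org/maven2/org/bouncycastle/bcprov-jdk15on/1.70/bcprov-jdk15on-1.70.jar
    curl -L -O https://repo1.maven.org/maven2/org/bouncycastle/bcpkix-jdk15on/1.70/bcpkix-jdk15on-1.70.jar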

    In my case, I installed Apache Spark on macOS with brew install apache-spark, so my folder is at /opt/homebrew/Cellar/apache-spark; you will need to find your own location.
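
    If you are unsure where your installation lives, Homebrew can tell you:

    brew --prefix apache-spark
    # e.g. /opt/homebrew/opt/apache-spark, a symlink into the Cellar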

    Then move those two JAR files to this folder:

    /opt/homebrew/Cellar/apache-spark/{version}/libexec/jars/
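
    Putting it together, the copy looks something like this (the file names assume the 1.70 downloads from above):

    mv bcprov-jdk15on-1.70.jar bcpkix-jdk15on-1.70.jar \
      "$(brew --prefix apache-spark)/libexec/jars/"

    # Sanity check that both JARs landed where Spark will pick them up:
    ls "$(brew --prefix apache-spark)/libexec/jars" | grep bcp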
    

    Now I can submit the application again, and the error is gone!