apache-spark, cassandra, datastax, datastax-enterprise

datastax - Failed to connect to DSE resource manager on spark-submit


dsetool status

DC: dc1     Workload: Cassandra     Graph: no
==============================================================================
Status=Up/Down
|/ State=Normal/Leaving/Joining/Moving
--   Address          Load         Owns   VNodes   Rack   Health [0,1]
UN   192.168.1.130    810.47 MiB   ?      256      2a     0.90
UN   192.168.1.131    683.53 MiB   ?      256      2a     0.90
UN   192.168.1.132    821.33 MiB   ?      256      2a     0.90

DC: dc2     Workload: Analytics     Graph: no     Analytics Master: 192.168.2.131
==============================================================================
Status=Up/Down
|/ State=Normal/Leaving/Joining/Moving
--   Address          Load         Owns   VNodes   Rack   Health [0,1]
UN   192.168.2.130    667.05 MiB   ?      256      2a     0.90
UN   192.168.2.131    845.48 MiB   ?      256      2a     0.90
UN   192.168.2.132    887.92 MiB   ?      256      2a     0.90

When I try to launch the spark-submit job:

dse -u user -p password spark-submit  --class com.sparkLauncher  test.jar prf

I get the following error (edited):

ERROR 2017-09-14 20:14:14,174 org.apache.spark.deploy.rm.DseAppClient$ClientEndpoint: Failed to connect to DSE resource manager
java.io.IOException: Failed to register with master: dse://?

....

Caused by: com.datastax.driver.core.exceptions.InvalidQueryException: The method DseResourceManager.registerApplication does not exist. Make sure that the required component for that method is active/enabled

....

ERROR 2017-09-14 20:14:14,179 org.apache.spark.deploy.rm.DseSchedulerBackend: Application has been killed. Reason: Failed to connect to DSE resource manager: Failed to register with master: dse://?
org.apache.spark.SparkException: Exiting due to error from cluster scheduler: Failed to connect to DSE resource manager: Failed to register with master: dse://?

....

WARN  2017-09-14 20:14:14,179 org.apache.spark.deploy.rm.DseSchedulerBackend: Application ID is not initialized yet.
ERROR 2017-09-14 20:14:14,384 org.apache.spark.SparkContext: Error initializing SparkContext.
java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem

ERROR 2017-09-14 20:14:14,387 org.apache.spark.deploy.DseSparkSubmitBootstrapper: Failed to start or submit Spark application
java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem

I can confirm that I have granted the privileges described in this documentation: https://docs.datastax.com/en/dse/5.1/dse-admin/datastax_enterprise/security/secAuthSpark.html. I am running this on AWS, if that makes a difference, and I can confirm that the routes between the nodes are all open. I am able to start the Spark shell from any of the Spark nodes, can bring up the Spark UI, and can get the Spark master address from cqlsh.
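
For reference, this is a minimal sketch of the kind of grants that documentation page describes. The role name app_role, the credentials, and the target node are placeholders assumed here for illustration, and the exact grant set can vary between DSE versions, so treat it as a sketch rather than an authoritative list:

# Run against a node in the Analytics DC (dc2 here); app_role is a hypothetical role name.
cqlsh -u cassandra -p password 192.168.2.131 -e "
GRANT EXECUTE ON REMOTE OBJECT DseResourceManager TO app_role;
GRANT EXECUTE ON REMOTE OBJECT DseClientTool TO app_role;
"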

Any pointers will be helpful, thanks in advance!


Solution

  • For a reason I have not been able to pinpoint, I can run the job as described above in cluster mode but not in client mode (see the sketch below).
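
As a point of comparison, the working cluster-mode invocation looks roughly like the following; --deploy-mode cluster is the standard spark-submit flag for cluster deploy mode, and the user, password, and jar are the placeholders from the question:

dse -u user -p password spark-submit --deploy-mode cluster --class com.sparkLauncher test.jar prf

In cluster mode the driver is launched on a node inside the Analytics datacenter rather than on the submitting machine, which may be why the registration with the DSE resource manager succeeds there while client mode fails.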