Tags: amazon-web-services, apache-spark, pyspark, aws-glue

Is there a way to set multiple --conf as job parameters in AWS Glue?


I'm trying to configure Spark in my Glue jobs. Entering settings one by one under 'Edit job' > 'Job parameters' as a key/value pair (e.g. key: --conf, value: spark.executor.memory=10g) works, but when I put several settings together in one value (delimited by a space or comma), it results in an error. I also tried sc._conf.setAll, but Glue ignores the config and insists on using its defaults. Is there a way to do this with Spark 2.4?
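For reference, the in-script approach the question describes looks roughly like the sketch below. It has no effect because settings such as executor memory must be fixed before the executors launch, and Glue has already created the context by the time the script runs:

```python
from pyspark import SparkContext

# Glue has already started this context with its own defaults.
sc = SparkContext.getOrCreate()

# Mutating the conf of a running context does not reconfigure
# the executors, so Glue keeps its default memory settings.
sc._conf.setAll([
    ("spark.executor.memory", "10g"),
    ("spark.yarn.executor.memoryOverhead", "7g"),
])
```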


Solution

  • Yes, you can pass multiple settings through a single --conf job parameter:

    Key: --conf

    value: spark.yarn.executor.memoryOverhead=7g --conf spark.executor.memory=7g

    Glue passes the value through verbatim after the --conf key, so each additional setting inside the value needs its own --conf prefix.
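The same trick works when the job is defined programmatically. Below is a minimal sketch using boto3's create_job; the job name, role ARN, and script location are placeholders, not values from the question:

```python
import boto3

glue = boto3.client("glue")

glue.create_job(
    Name="my-spark-job",  # hypothetical job name
    Role="arn:aws:iam::123456789012:role/GlueJobRole",  # hypothetical role
    Command={
        "Name": "glueetl",
        "ScriptLocation": "s3://my-bucket/scripts/job.py",  # hypothetical path
    },
    GlueVersion="1.0",  # Glue 1.0 runs Spark 2.4
    # One "--conf" key whose value chains the remaining settings:
    DefaultArguments={
        "--conf": "spark.yarn.executor.memoryOverhead=7g"
                  " --conf spark.executor.memory=7g",
    },
)
```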