I ran a Spark application on Spark 2.3 with `spark.executor.cores` set to 25.
"Allocated CPU VCores" on the YARN Running Applications page shows 2 VCores (1 for the driver, 1 for the executor).
"Cores" in the Spark UI Executors tab shows 25, and I can see 25 parallel tasks running in a stage as well.
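For reference, the job was submitted roughly like this (a sketch; the master, deploy mode, executor count, and jar name are my assumptions, not details from the original run — only `spark.executor.cores=25` is from the question):

```shell
# Hypothetical submission matching the symptoms above:
# one driver + one executor container, executor requesting 25 cores.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.executor.cores=25 \
  --conf spark.executor.instances=1 \
  your-app.jar
```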
I'm wondering why the YARN Running Applications page shows a misleading 2 VCores. Is this a known bug?
I did find a related bug on HDP: https://jira.pnda.io/browse/PNDA-4006. That bug was reported against HDP 2.6.5.0-292.