I want to allot different amounts of memory to different workers of a Spark cluster. When specifying the conf, we can write `.set("spark.executor.memory", "2g")`, which allots 2 GB to every worker alike. But how can I allot different memory to different workers?
For example, I want to give 1 GB to one worker and 2 GB to the remaining workers. Is this possible? If yes, how?
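For context, this is a minimal sketch of the setup I am referring to; the app name and master URL are placeholders for my actual cluster:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object MemoryConfExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("memory-example")          // placeholder app name
      .setMaster("spark://master:7077")      // placeholder standalone master URL
      .set("spark.executor.memory", "2g")    // same 2 GB applied to all executors/workers
    val sc = new SparkContext(conf)
    // ... job logic ...
    sc.stop()
  }
}
```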
Possibly a duplicate of this.