apache-spark, pyspark, master-slave

How to allot different amounts of memory to different workers


I want to allot different amounts of memory to the different workers of a Spark cluster. When building the conf we can write .set("spark.executor.memory", "2g"), which allots 2 GB to every worker. But how can I allot a different amount of memory to each worker?

For example, I want to give 1g to one worker and 2g to the remaining workers. Is this possible? If so, how?
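For context, this is how the uniform setting is applied today; a minimal PySpark sketch, with a placeholder app name and master URL:

```python
from pyspark import SparkConf, SparkContext

# Placeholder master URL; adjust to your own standalone cluster.
conf = (SparkConf()
        .setAppName("memory-demo")
        .setMaster("spark://master-host:7077")
        # Applies to every executor alike; there is no per-worker
        # variant of this configuration key.
        .set("spark.executor.memory", "2g"))

sc = SparkContext(conf=conf)
```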


Solution

  • Possible duplicate of this question.

    1. It seems it is not possible to set different memory constraints for different workers.
    2. However, the documentation says that if you do not set any constraint, a worker uses all of the memory available on its machine. So if your machines have different amounts of RAM, each worker can end up with a different amount of memory, as sketched below.
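In Spark's standalone mode that per-machine behavior can also be made explicit: each worker's memory budget comes from SPARK_WORKER_MEMORY in that machine's conf/spark-env.sh, so different machines can advertise different amounts. A sketch, assuming two hypothetical machines with 1 GB and 2 GB to spare:

```sh
# conf/spark-env.sh on the smaller machine (hypothetical worker A)
export SPARK_WORKER_MEMORY=1g   # this worker offers 1 GB to executors

# conf/spark-env.sh on the larger machine (hypothetical worker B)
export SPARK_WORKER_MEMORY=2g   # this worker offers 2 GB to executors
```

Note that spark.executor.memory remains a single cluster-wide value, so an executor requesting 2g would simply not fit on the 1 GB worker; the worker setting caps what each machine offers rather than giving true per-worker executor sizes.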