Can I change Spark's executor memory at runtime?


No, you can't.

Each executor runs in its own JVM, and a JVM's heap size cannot be changed at runtime. See for reference: Setting JVM heap size at runtime
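Executor memory therefore has to be fixed before the application launches. A minimal sketch of the two usual ways to set it (the application class and jar names here are hypothetical placeholders):

```shell
# Set executor memory on the command line when submitting the job:
spark-submit \
  --executor-memory 4g \
  --class com.example.MyApp \
  myapp.jar

# Or set it in conf/spark-defaults.conf before the application starts;
# changing this property after the SparkContext is created has no effect:
#   spark.executor.memory   4g
```

If you need a different executor size, stop the application and resubmit it with the new value.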