I am running Spark on YARN with 2 executors, each on a separate physical machine.
I have spark.executor.memory set to '40g' because I want each machine to dedicate 40g of memory to Spark; there is one executor per machine.
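In case the exact configuration matters, this is roughly equivalent to what I'm doing (the app name is a placeholder, and the job is submitted with --master yarn):

```
import org.apache.spark.{SparkConf, SparkContext}

// Placeholder app name; the two settings below mirror my deployment:
// two executors, 40g of executor memory each.
val conf = new SparkConf()
  .setAppName("MyApp")
  .set("spark.executor.memory", "40g")   // 40g heap per executor
  .set("spark.executor.instances", "2")  // one executor per machine
val sc = new SparkContext(conf)
```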
When I run my application, 'top' shows both executor processes using the full 40g of memory I allocated to them.
The 'Executors' tab in the Spark UI tells a different story: it reports memory used out of a total of roughly 20GB per executor, e.g. x / 20.3GB. This makes it look like only 20GB is available per executor, when I should really have 40GB available.
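For what it's worth, I believe the same total can be queried from the driver. A minimal sketch, run in spark-shell (where sc is the SparkContext it creates); getExecutorMemoryStatus reports, per executor, the maximum memory available for caching in bytes, which seems to be the figure the UI uses as the total:

```
// Map of block manager -> (max memory available for caching,
// remaining memory available for caching), both in bytes.
sc.getExecutorMemoryStatus.foreach { case (executor, (maxMem, remainingMem)) =>
  println(f"$executor: ${maxMem / math.pow(1024, 3)}%.1f GB max for caching")
}
```

This also prints a value around 20GB per executor, not 40GB.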