When submitting a spark-shell application, the executor-side classloader is set to ExecutorClassLoader.
However, it appears that when ExecutorClassLoader is used, the spark.executor.userClassPathFirst parameter is not honored.
It turns out that, because of the way the ExecutorClassLoader class is declared,
its parent classloader is actually the system default classloader (set by ClassLoader's no-argument constructor) rather than the "parent" classloader passed into ExecutorClassLoader's constructor.
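A minimal, self-contained sketch of the problem (the BrokenLoader/FixedLoader classes below are hypothetical stand-ins, not Spark's actual ExecutorClassLoader, which takes additional constructor parameters):

```scala
// Stand-in for the intended parent, e.g. a ChildFirstURLClassLoader.
class ChildFirstLoader extends ClassLoader

// Mirrors ExecutorClassLoader's problematic shape: the `intendedParent`
// parameter is stored as a field but never passed to ClassLoader's
// constructor, so `extends ClassLoader` invokes the no-argument
// constructor, which wires the SYSTEM classloader in as the parent.
class BrokenLoader(val intendedParent: ClassLoader) extends ClassLoader

// Passing the parameter through to ClassLoader's constructor makes it
// the real JVM-level parent seen by getParent() and loadClass().
class FixedLoader(parent: ClassLoader) extends ClassLoader(parent)

object Demo {
  def main(args: Array[String]): Unit = {
    val intended = new ChildFirstLoader
    val broken = new BrokenLoader(intended)
    val fixed = new FixedLoader(intended)
    // The no-arg ClassLoader constructor always uses the system loader:
    println(broken.getParent eq ClassLoader.getSystemClassLoader) // true
    println(broken.getParent eq intended)                         // false
    println(fixed.getParent eq intended)                          // true
  }
}
```

Because the default ClassLoader.loadClass implementation delegates to getParent() before calling findClass, the "parent" held only in a constructor field never participates in delegation at all.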
As a result, when spark.executor.userClassPathFirst is set to true, even though the "parent" classloader is a ChildFirstURLClassLoader, ExecutorClassLoader.getParent() still returns the system default classloader.
Thus, when ExecutorClassLoader tries to load a class, it first attempts to load it through the system default classloader, which breaks the spark.executor.userClassPathFirst behavior.
A simple fix would be to pass the constructor's "parent" parameter through to ClassLoader's constructor when defining ExecutorClassLoader.