Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Not A Problem
- Affects Version/s: 2.2.0
- Fix Version/s: None
- Component/s: None
- Environment: Spark 2.2.0, YARN, 2 Ubuntu 16.04 nodes, openjdk 1.8.0_151
Description
I have never reported anything before, and I hope this is the right place, as I think I have come across a bug. If I missed the solution, please feel free to correct me.
I set up Spark 2.2.0 on a two-node Ubuntu cluster and use a Jupyter notebook to access the pyspark shell. However, the UI at http://IP:4040/ is broken. Has anyone ever seen something like this?
When I inspect the page in Chrome, it reports "Failed to load resource: net::ERR_EMPTY_RESPONSE" for various .js and .css files.
I did a fresh install and added my configurations back one by one until the problem occurred again. Everything worked fine until I edited spark-defaults.conf to contain the following lines:
{{spark.driver.extraClassPath /usr/local/phoenix/phoenix-4.13.0-HBase-1.3-client.jar}}
{{spark.executor.extraClassPath /usr/local/phoenix/phoenix-4.13.0-HBase-1.3-client.jar}}
How can I add these jars to my classpath without breaking the UI? If I just supply them using the --jars parameter in the terminal, everything works fine. But I'd like to have them configured permanently, as explained in the manual: https://phoenix.apache.org/phoenix_spark.html
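For reference, the command-line workaround that does not break the UI looks roughly like this (a sketch: the jar path is the one from my spark-defaults.conf above, and I'm assuming pyspark is launched directly from the terminal):

```shell
# Supply the Phoenix client jar per-session via --jars instead of
# spark-defaults.conf; Spark ships the jar to the executors and adds
# it to the classpath of the driver and executors for this session only.
pyspark --jars /usr/local/phoenix/phoenix-4.13.0-HBase-1.3-client.jar
```

This works, but it has to be repeated on every launch, which is why I wanted the extraClassPath settings in the config file instead.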
I posted the question on Stack Overflow some time ago here, and apparently I'm not the only one affected (here).