Description
We should provide a mechanism for including additional jars in jobs run from the Spark shell, since addJar() doesn't work there (see https://github.com/mesos/spark/pull/359).
There's a proposal and patch at https://groups.google.com/forum/?fromgroups#!searchin/spark-users/ADD_JAR/spark-users/IBgbLoFWbxw/9AzTrN_iwz4J, but someone needs to test it and submit it as a pull request.
Spark should also emit a warning or error when addJar() is called from within spark-shell, rather than failing silently.