Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Incomplete
- Affects Version: 2.3.0
- Fix Version: None
Description
For some reason, Spark downloads the javadoc artifact of a package instead of the jar.
Steps to reproduce:
1. Delete (or move) your local ~/.ivy2 cache to force Spark to resolve and fetch the artifacts from Central:
rm -rf ~/.ivy2
2. Run:
~/dev/spark-2.3.0-bin-hadoop2.7/bin/spark-shell --packages org.scalanlp:breeze_2.11:0.13.2
3. Spark downloads the javadoc instead of the jar:
downloading https://repo1.maven.org/maven2/net/sourceforge/f2j/arpack_combined_all/0.1/arpack_combined_all-0.1-javadoc.jar ...
[SUCCESSFUL ] net.sourceforge.f2j#arpack_combined_all;0.1!arpack_combined_all.jar (610ms)
4. Later, Spark complains that it cannot find the jar:
Warning: Local jar /Users/thesamet/.ivy2/jars/net.sourceforge.f2j_arpack_combined_all-0.1.jar does not exist, skipping.
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Note: the dependency of breeze on f2j_arpack_combined seems fine: http://central.maven.org/maven2/org/scalanlp/breeze_2.11/0.13.2/breeze_2.11-0.13.2.pom
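The download log above suggests Ivy fetched the `-javadoc` artifact but recorded it under the main artifact's name. One quick way to confirm which kind of jar actually landed in the cache is to inspect its entries: a javadoc jar contains HTML pages, while a binary jar contains `.class` files. A minimal sketch (the helper name is mine, and the path would be whatever file Ivy actually cached, e.g. the one from the warning above):

```python
import zipfile

def looks_like_javadoc(jar_path):
    """Heuristic check: a javadoc jar ships HTML pages, a binary jar ships .class files."""
    with zipfile.ZipFile(jar_path) as jar:
        names = jar.namelist()
    has_classes = any(n.endswith(".class") for n in names)
    has_html = any(n.endswith(".html") for n in names)
    return has_html and not has_classes
```

Running this against the cached arpack_combined_all-0.1 jar should return True if the javadoc mix-up described above occurred.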