Description
The comments in https://github.com/apache/spark/pull/9575 say that the Datanucleus-related jars will still be copied to lib_managed/jars, but I don't see any jars under lib_managed/jars. The strange thing is that I do see the jars on another machine, but not on my laptop, even after deleting the whole Spark project and starting from scratch. Could this be related to the environment? I added the following code to the deployDatanucleusJars task in SparkBuild.scala to track the issue, and it shows that jars is empty.
deployDatanucleusJars := {
  val jars: Seq[File] = (fullClasspath in assembly).value.map(_.data)
    .filter(_.getPath.contains("org.datanucleus"))
  // this is what I added
  println("*********************************************")
  println("fullClasspath:" + fullClasspath)
  println("assembly:" + assembly)
  println("jars:" + jars.map(_.getAbsolutePath()).mkString(","))
  //
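One way to narrow this down is to print the filtered entries from a standalone task rather than from inside deployDatanucleusJars (note that println("fullClasspath:" + fullClasspath) above prints the task key itself, not its resolved value). Below is a minimal sketch, assuming the sbt 0.13-style build Spark used at the time, with the sbt-assembly plugin's assembly key in scope; printDatanucleusJars is a hypothetical key added only for illustration:

import sbt._
import sbt.Keys._
// assumption: the `assembly` task key comes from the sbt-assembly plugin
import sbtassembly.AssemblyKeys.assembly

// Hypothetical debug task: list the Datanucleus entries on the assembly classpath.
lazy val printDatanucleusJars =
  taskKey[Unit]("List Datanucleus jars on the assembly classpath")

printDatanucleusJars := {
  // Resolve the full assembly classpath, then keep only Datanucleus entries,
  // mirroring the filter used by deployDatanucleusJars above.
  val cp: Seq[File] = (fullClasspath in assembly).value.map(_.data)
  val dnJars = cp.filter(_.getPath.contains("org.datanucleus"))
  if (dnJars.isEmpty) println("No Datanucleus jars on the assembly classpath")
  else dnJars.foreach(jar => println(jar.getAbsolutePath))
}

Running this task on both machines should show whether the difference lies in the resolved classpath itself or in the step that copies jars to lib_managed/jars.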
Issue Links
- depends upon SPARK-7841 Spark build should not use lib_managed for dependencies (Resolved)