Spark / SPARK-11798

Datanucleus jars are missing under lib_managed/jars


Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Cannot Reproduce
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: Build, SQL
    • Labels: None

    Description

      I noticed the comments in https://github.com/apache/spark/pull/9575 saying that the Datanucleus-related jars will still be copied to lib_managed/jars, but I don't see any jars under lib_managed/jars. The weird thing is that I see the jars on another machine, but not on my laptop, even after deleting the whole Spark project and starting from scratch. Could this be related to the environment? I tried adding the following code in SparkBuild.scala to track the issue, and it shows that `jars` is empty.

      deployDatanucleusJars := {
            val jars: Seq[File] = (fullClasspath in assembly).value.map(_.data)
              .filter(_.getPath.contains("org.datanucleus"))
            // this is what I added for debugging
            println("*********************************************")
            println("fullClasspath:" + fullClasspath)
            println("assembly:" + assembly)
            println("jars:" + jars.map(_.getAbsolutePath()).mkString(","))
            // ... (rest of the task unchanged)
      }

      Attachments

        Issue Links

          Activity

            People

              Assignee: Unassigned
              Reporter: zjffdu Jeff Zhang
              Votes: 0
              Watchers: 4

              Dates

                Created:
                Updated:
                Resolved: