SPARK-12666: spark-shell --packages cannot load artifacts which are publishLocal'd by SBT

Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.5.1, 1.6.0
    • Fix Version/s: 2.0.1, 2.1.0
    • Component/s: Spark Submit
    • Labels: None

    Description

      Symptom:

      I cloned the latest master of spark-redshift, then used sbt publishLocal to publish it to my Ivy cache. When I tried running ./bin/spark-shell --packages com.databricks:spark-redshift_2.10:0.5.3-SNAPSHOT to load this dependency into spark-shell, I received the following cryptic error:

      Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: com.databricks#spark-redshift_2.10;0.5.3-SNAPSHOT: configuration not found in com.databricks#spark-redshift_2.10;0.5.3-SNAPSHOT: 'default'. It was required from org.apache.spark#spark-submit-parent;1.0 default]
      	at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1009)
      	at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:286)
      	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:153)
      	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
      	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
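
      For reference, the reproduction boils down to the two steps described above (the first run from the spark-redshift checkout, the second from the Spark directory):

        # in the spark-redshift clone: publish the artifact to the local Ivy cache
        sbt publishLocal

        # from the Spark directory: try to pull that artifact into the shell
        ./bin/spark-shell --packages com.databricks:spark-redshift_2.10:0.5.3-SNAPSHOT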
      

      I think the problem here is that Spark declares its dependency on the spark-redshift artifact using the default Ivy configuration. Based on my admittedly limited understanding of Ivy, a module that declares no configurations of its own implicitly gets a single configuration named default. Maven artifacts never declare Ivy configurations, so they always carry that implicit default configuration, and it ends up mapping to Maven's regular JAR dependency. Ivy artifacts that explicitly define their own configurations, however, may not have any configuration named default, so resolving them through a default mapping can fail exactly as above.
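
      To make this concrete, here is a hedged sketch of what an sbt-published module descriptor can look like. The configuration names below are illustrative, not copied from an actual spark-redshift ivy.xml; the point is that the configurations are declared explicitly and none of them is named default, whereas a POM-derived module declares none and therefore gets the implicit default:

        <!-- hypothetical ivy.xml as written by sbt publishLocal -->
        <ivy-module version="2.0">
          <info organisation="com.databricks" module="spark-redshift_2.10" revision="0.5.3-SNAPSHOT"/>
          <configurations>
            <conf name="compile" visibility="public"/>
            <conf name="runtime" visibility="public" extends="compile"/>
            <conf name="test" visibility="private" extends="runtime"/>
          </configurations>
          <!-- publications and dependencies omitted -->
        </ivy-module>

      Asking Ivy to resolve such a module through a default -> default configuration mapping fails with precisely the "configuration not found ... 'default'" error shown above.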

      I spent a bit of time playing around with the SparkSubmit code to see if I could fix this but wasn't able to completely resolve the issue.
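
      For illustration, here is a sketch of the kind of change that might help, using Ivy's conf(fallback) mapping syntax: map Spark's default configuration to default(runtime), so a dependency that lacks a default configuration falls back to its runtime configuration. This is untested and the wiring below is my own, though the Ivy classes are the ones SparkSubmitUtils already works with:

        import org.apache.ivy.core.module.descriptor.{DefaultDependencyDescriptor, DefaultModuleDescriptor}
        import org.apache.ivy.core.module.id.ModuleRevisionId

        // Sketch of how each --packages coordinate could be wired into the
        // resolver. "default(runtime)" is Ivy's fallback syntax: use the
        // dependency's "default" conf when it exists, otherwise fall back to
        // "runtime", which sbt-generated descriptors do declare.
        val md = DefaultModuleDescriptor.newDefaultInstance(
          ModuleRevisionId.newInstance("org.apache.spark", "spark-submit-parent", "1.0"))
        val dep = new DefaultDependencyDescriptor(
          md,
          ModuleRevisionId.newInstance("com.databricks", "spark-redshift_2.10", "0.5.3-SNAPSHOT"),
          /* force = */ false, /* changing = */ true, /* transitive = */ true)
        dep.addDependencyConfiguration("default", "default(runtime)")
        md.addDependency(dep)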

      /cc brkyvz (ping me offline and I can walk you through the repo in person, if you'd like)

          People

            Assignee: Bryan Cutler (bryanc)
            Reporter: Josh Rosen (joshrosen)
            Votes: 0
            Watchers: 4
