Spark / SPARK-27205

spark-shell with --packages option fails to load transitive dependencies even though Ivy successfully pulls jars


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 3.0.0
    • Fix Version/s: 3.0.0
    • Component/s: Spark Core
    • Labels: None

    Description

      I found this bug while testing my patch for the Spark SQL Kafka module - I tend to open spark-shell and link the Kafka module via `--packages`.

      When we run

      ./bin/spark-shell --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.4.0

      we should be able to import "org.apache.kafka" in spark-shell, but it doesn't work on the current master branch.
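
      For concreteness, a minimal check inside the shell session might look like the sketch below. The ConsumerRecord import is just one example of a class that arrives transitively via kafka-clients, and the broker address and topic name are placeholders, not part of this report:

      // On a working build the transitive kafka-clients jar is on the session classpath
      // and this import resolves; on the broken commit the REPL rejects it with an error
      // along the lines of "object kafka is not a member of package org.apache".
      import org.apache.kafka.clients.consumer.ConsumerRecord

      // The Kafka source itself relies on the same transitive jars at runtime.
      val df = spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "localhost:9092")  // placeholder broker
        .option("subscribe", "test")                           // placeholder topic
        .load()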

      There isn't much evidence to go on, and even with the `--verbose` option I had no idea what was happening, so I had to spend a couple of hours with git bisect.

      It turned out that the commit introducing the bug was SPARK-26977 (81dd21fda99da48ed76adb739a07d1dabf1ffb51).

    People

      Assignee: Jungtaek Lim (kabhwan)
      Reporter: Jungtaek Lim (kabhwan)
      Votes: 0
      Watchers: 1