[SPARK-6047] pyspark - class loading on driver failing with --jars and --packages


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Duplicate
    • Affects Version/s: 1.3.0
    • Fix Version/s: None
    • Component/s: PySpark, Spark Submit
    • Labels: None

    Description

      Because py4j uses the system ClassLoader instead of the thread's contextClassLoader, jars added dynamically through Spark Submit (`--jars`, `--packages`) can't be loaded in the driver.

      This causes `Py4JError: Trying to call a package` errors.
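
      For concreteness, here is a minimal reproduction sketch (assumed, not taken from this report; the package coordinate and class name are illustrative):

      ```python
      # Hypothetical reproduction. Assumes the job was submitted with a
      # dynamically resolved package, e.g.:
      #   spark-submit --packages com.databricks:spark-csv_2.10:1.0.3 repro.py
      from pyspark import SparkContext

      sc = SparkContext(appName="packages-classloading-repro")

      # py4j resolves this name through the system ClassLoader, which never
      # sees the dynamically added jars; the name therefore resolves to a
      # package stub rather than a class, and calling it raises
      # "Py4JError: Trying to call a package".
      sc._jvm.com.databricks.spark.csv.DefaultSource()
      ```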

      Because `--packages` are usually downloaded from a remote repository before runtime, adding them explicitly to `--driver-class-path` is not an option, as it is with `--jars`. One solution is to move the fetching of `--packages` into the SparkSubmitDriverBootstrapper and add them to the driver classpath there.

      A more complete solution can be achieved through SPARK-4924.
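
      Since the thread's contextClassLoader does see these jars, a stop-gap on the Python side (a sketch under the same assumptions as above; the class name is again illustrative) is to load classes through it explicitly rather than through py4j's implicit name lookup:

      ```python
      # Hypothetical workaround: load the class via the context ClassLoader,
      # which (unlike the system ClassLoader that py4j consults) knows about
      # jars added via --jars / --packages.
      loader = sc._jvm.java.lang.Thread.currentThread().getContextClassLoader()
      clazz = loader.loadClass("com.databricks.spark.csv.DefaultSource")
      instance = clazz.newInstance()
      ```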


People

    Assignee: Unassigned
    Reporter: Burak Yavuz (brkyvz)
    Votes: 2
    Watchers: 5
