Spark / SPARK-50281

pyspark local session `spark.jars` configuration does not work


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 4.0.0
    • Fix Version/s: 4.0.0
    • Component/s: PySpark
    • Labels: None

    Description

      The `spark.jars` configuration does not work for a local PySpark session.

       

      Reproducing code:

      If I run the following in a Python shell:

      from pyspark.sql import SparkSession
      spark = SparkSession.builder.config("spark.jars", "<my_jar_file_path>").master("local[*]").getOrCreate()

      then do:

      spark.sparkContext._jvm.aaa.bbb.ccc  # aaa.bbb.ccc is a class in the jar file

      it cannot access the Java class correctly.
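
      One way to confirm whether the jar actually reached the driver JVM classpath (a quick diagnostic sketch; `aaa.bbb.ccc` stands for the class in the jar, as above) is to ask the JVM to load the class explicitly through the Py4J gateway:

      # Raises ClassNotFoundException (wrapped in a Py4JJavaError) if the jar
      # is missing from the driver classpath; returns the Class object otherwise.
      spark.sparkContext._jvm.java.lang.Class.forName("aaa.bbb.ccc")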

       

      But if I start PySpark with:

      bin/pyspark --jars=<my_jar_file_path>

      then in the PySpark shell the issue does not occur.
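
      On builds that do not yet include the fix, a possible workaround (a sketch assuming the standard PYSPARK_SUBMIT_ARGS mechanism; the jar path is a placeholder) is to pass the jar through the submit arguments before the JVM starts, mirroring what bin/pyspark --jars does:

      import os

      # Must be set before the first SparkSession is created, since PySpark
      # reads it when launching the JVM; the trailing "pyspark-shell" is required.
      os.environ["PYSPARK_SUBMIT_ARGS"] = "--jars <my_jar_file_path> pyspark-shell"

      from pyspark.sql import SparkSession
      spark = SparkSession.builder.master("local[*]").getOrCreate()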

       

      This issue only happens on Spark master, and it causes our CI failure:

      https://github.com/mlflow-automation/mlflow/actions/runs/11765501211/job/32775243482#step:12:291

       


          People

            Assignee: Hyukjin Kwon (gurwls223)
            Reporter: Weichen Xu (weichenxu123)