Spark / SPARK-48417

Filesystems do not load with spark.jars.packages configuration

Details

    • Type: Bug
    • Status: Closed
    • Priority: Major
    • Resolution: Not A Problem
    • Affects Version/s: 3.5.1
    • Fix Version/s: None
    • Component/s: Input/Output
    • Labels: None

    Description

      When we use the spark.jars.packages configuration parameter in the Python SparkSession builder (PySpark), it appears that the filesystems are not loaded when the session starts. Because of this, Spark fails to read files from a Google Cloud Storage (GCS) bucket (with the GCS connector).

      I tested this with different packages, so the problem does not appear to be specific to a particular package. I will attach the sample code and debug logs.
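
      For reference, a minimal sketch of the kind of session setup described above (the package coordinates and bucket path are illustrative, not taken from the attached script):

      from pyspark.sql import SparkSession

      # Illustrative coordinates; any spark.jars.packages entry triggers the same code path.
      spark = (
          SparkSession.builder
          .appName("gcs-read-test")
          .config("spark.jars.packages", "ml.combust.mleap:mleap-spark_2.12:0.23.0")
          .getOrCreate()
      )

      # Reading from GCS then fails because the gs:// filesystem is not loaded.
      df = spark.read.csv("gs://some-bucket/some-file.csv", header=True)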

      Attachments

        1. pyspark_mleap.py
          3 kB
          Ravi Dalal
        2. pyspark_spark_jar_package_config_logs.txt
          30 kB
          Ravi Dalal
        3. pyspark_without_spark_jar_package_config_logs.txt
          6 kB
          Ravi Dalal

        Activity

          Ravi Dalal added a comment -

          Apologies. We missed a configuration parameter. Found it after creating this bug. Resolving the bug now.

          Ravi Dalal added a comment -

          For anyone facing this issue, use the following configuration to read files from GCS when spark.jars.packages is used:

          config("spark.jars", "https://storage.googleapis.com/hadoop-lib/gcs/gcs-connector-hadoop3-2.2.22.jar")
          config("spark.hadoop.fs.AbstractFileSystem.gs.impl", "com.google.cloud.hadoop.fs.gcs.GoogleHadoopFS")               config("spark.hadoop.fs.gs.impl", "com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem")

          When spark.jars.packages is not used, the following configuration alone works:

          config("spark.jars", "https://storage.googleapis.com/hadoop-lib/gcs/gcs-connector-hadoop3-2.2.22.jar")
          config("spark.hadoop.fs.AbstractFileSystem.gs.impl", "com.google.cloud.hadoop.fs.gcs.GoogleHadoopFS") 
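          Put together, a minimal session builder applying this workaround might look like the following (the app name, extra package, and read path are illustrative; the spark.jars URL and filesystem settings are from the comment above):

          from pyspark.sql import SparkSession

          spark = (
              SparkSession.builder
              .appName("gcs-with-packages")
              # Illustrative extra package; any spark.jars.packages entry applies.
              .config("spark.jars.packages", "ml.combust.mleap:mleap-spark_2.12:0.23.0")
              # Load the GCS connector jar directly...
              .config("spark.jars", "https://storage.googleapis.com/hadoop-lib/gcs/gcs-connector-hadoop3-2.2.22.jar")
              # ...and register its filesystem implementations explicitly, since
              # they are not picked up when spark.jars.packages is set.
              .config("spark.hadoop.fs.AbstractFileSystem.gs.impl", "com.google.cloud.hadoop.fs.gcs.GoogleHadoopFS")
              .config("spark.hadoop.fs.gs.impl", "com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem")
              .getOrCreate()
          )

          df = spark.read.csv("gs://some-bucket/some-file.csv", header=True)  # hypothetical path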

          People

            Assignee: Unassigned
            Reporter: Ravi Dalal
            Votes: 0
            Watchers: 1
