[SPARK-27966] input_file_name empty when listing files in parallel


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Minor
    • Resolution: Incomplete
    • Affects Version/s: 2.4.0
    • Fix Version/s: None
    • Component/s: Input/Output
    • Environment: Databricks 5.3 (includes Apache Spark 2.4.0, Scala 2.11)
      Worker Type: 14.0 GB Memory, 4 Cores, 0.75 DBU (Standard_DS3_v2)
      Workers: 3
      Driver Type: 14.0 GB Memory, 4 Cores, 0.75 DBU (Standard_DS3_v2)

    Description

      I ran into an issue similar to, and probably related to, SPARK-26128: org.apache.spark.sql.functions.input_file_name sometimes returns an empty string.

       

      df.select(input_file_name()).show(5,false)
      

       

      +-----------------+
      |input_file_name()|
      +-----------------+
      |                 |
      |                 |
      |                 |
      |                 |
      |                 |
      +-----------------+
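
      For context, a minimal sketch of how a DataFrame like df above could be built, assuming a Parquet source with enough files to cross the listing threshold; the path "/mnt/data/events", the app name, and the file count are placeholders, not the actual dataset:

      import org.apache.spark.sql.SparkSession
      import org.apache.spark.sql.functions.input_file_name

      // Hypothetical reproduction: the source directory holds more leaf paths than
      // spark.sql.sources.parallelPartitionDiscovery.threshold (32 by default), so
      // InMemoryFileIndex lists the files in parallel on the executors.
      val spark = SparkSession.builder().appName("input-file-name-repro").getOrCreate()

      // "/mnt/data/events" stands in for a source with 100+ files.
      val df = spark.read.parquet("/mnt/data/events")

      // On the affected cluster this shows empty strings instead of file paths.
      df.select(input_file_name()).show(5, false)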
      

      My environment is Databricks, and debugging the log4j output showed that the issue occurs when the files are listed in parallel, e.g. when:

      19/06/06 11:50:47 INFO InMemoryFileIndex: Start listing leaf files and directories. Size of Paths: 127; threshold: 32
      19/06/06 11:50:47 INFO InMemoryFileIndex: Listing leaf files and directories in parallel under:

       

      Everything is fine as long as the number of paths stays below the threshold and the files are listed serially:

      19/06/06 11:54:43 INFO InMemoryFileIndex: Start listing leaf files and directories. Size of Paths: 6; threshold: 32
      19/06/06 11:54:43 INFO InMemoryFileIndex: Start listing leaf files and directories. Size of Paths: 0; threshold: 32
      19/06/06 11:54:43 INFO InMemoryFileIndex: Start listing leaf files and directories. Size of Paths: 0; threshold: 32
      19/06/06 11:54:43 INFO InMemoryFileIndex: Start listing leaf files and directories. Size of Paths: 0; threshold: 32
      19/06/06 11:54:43 INFO InMemoryFileIndex: Start listing leaf files and directories. Size of Paths: 0; threshold: 32
      19/06/06 11:54:43 INFO InMemoryFileIndex: Start listing leaf files and directories. Size of Paths: 0; threshold: 32
      19/06/06 11:54:43 INFO InMemoryFileIndex: Start listing leaf files and directories. Size of Paths: 0; threshold: 32
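
      As a quick sanity check, the threshold referenced in these log lines can be read back from the SQL configuration at runtime (a sketch; spark is the active SparkSession):

      // The threshold from the log output above; listing switches to the parallel
      // path once the number of paths exceeds this value (32 by default).
      println(spark.conf.get("spark.sql.sources.parallelPartitionDiscovery.threshold"))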
      

       

      Setting spark.sql.sources.parallelPartitionDiscovery.threshold to 9999 resolves the issue for me.
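
      For reference, a sketch of that workaround applied at runtime; the same value can also be set in the cluster's Spark config, and the read below reuses the placeholder path from the sketch above:

      // Raise the threshold so InMemoryFileIndex never takes the parallel
      // (distributed) listing path; 9999 is just a "large enough" value here.
      spark.conf.set("spark.sql.sources.parallelPartitionDiscovery.threshold", "9999")

      // Re-read the data after changing the config; with serial listing,
      // input_file_name() returns the expected paths again.
      val dfSerial = spark.read.parquet("/mnt/data/events")
      dfSerial.select(input_file_name()).show(5, false)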

       

      edit: the problem is not exclusively linked to listing files in parallel. I've set up a larger cluster on which input_file_name returned the correct filename even after parallel file listing. After inspecting the log4j output again, I assume the problem is linked to some kind of metastore being full. I've attached a section of the log4j output that I think should indicate why it's failing. If you need more, please let me know.


      Attachments

        1. input_file_name_bug (27 kB) - Christian Homberg

      Issue Links

      Activity

      People

        Assignee: Unassigned
        Reporter: Christian Homberg (Chr_96er)
        Votes: 0
        Watchers: 2

      Dates

        Created:
        Updated:
        Resolved: