Spark / SPARK-39252

Flaky Test: pyspark.sql.tests.test_dataframe.DataFrameTests test_df_is_empty


Details

    • Type: Test
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 3.1.3, 3.3.0, 3.2.2
    • Fix Version/s: 3.1.3, 3.3.0, 3.2.2
    • Component/s: PySpark
    • Labels: None

    Description

      test_df_is_empty is flaky. For example, it failed in a recent PR: https://github.com/apache/spark/pull/36580
      https://github.com/panbingkun/spark/runs/6525997469?check_suite_focus=true

      Possibly introduced by SPARK-39084.

      test_df_is_empty (pyspark.sql.tests.test_dataframe.DataFrameTests) ... 
      [Stage 6:>                                                          (0 + 1) / 1]
      #
      # A fatal error has been detected by the Java Runtime Environment:
      #
      #  SIGSEGV (0xb) at pc=0x00007fd84a1486ff, pid=4021, tid=0x00007fd8016a2700
      #
      # JRE version: OpenJDK Runtime Environment (Zulu 8.62.0.19-CA-linux64) (8.0_332-b09) (build 1.8.0_332-b09)
      # Java VM: OpenJDK 64-Bit Server VM (25.332-b09 mixed mode linux-amd64 compressed oops)
      # Problematic frame:
      # J 9116 C2 org.apache.spark.unsafe.UnsafeAlignedOffset.getSize(Ljava/lang/Object;J)I (51 bytes) @ 0x00007fd84a1486ff [0x00007fd84a1486e0+0x1f]
      #
      # Core dump written. Default location: /__w/spark/spark/core or core.4021
      #
      # An error report file with more information is saved as:
      # /__w/spark/spark/hs_err_pid4021.log
      #
      # If you would like to submit a bug report, please visit:
      #   http://www.azul.com/support/
      #
      ----------------------------------------
      Exception happened during processing of request from ('127.0.0.1', 36358)
      Traceback (most recent call last):
        File "/usr/lib/pypy3/lib-python/3/socketserver.py", line 316, in _handle_request_noblock
          self.process_request(request, client_address)
        File "/usr/lib/pypy3/lib-python/3/socketserver.py", line 347, in process_request
          self.finish_request(request, client_address)
        File "/usr/lib/pypy3/lib-python/3/socketserver.py", line 360, in finish_request
          self.RequestHandlerClass(request, client_address, self)
        File "/usr/lib/pypy3/lib-python/3/socketserver.py", line 720, in __init__
          self.handle()
        File "/__w/spark/spark/python/pyspark/accumulators.py", line 281, in handle
          poll(accum_updates)
        File "/__w/spark/spark/python/pyspark/accumulators.py", line 253, in poll
          if func():
        File "/__w/spark/spark/python/pyspark/accumulators.py", line 257, in accum_updates
          num_updates = read_int(self.rfile)
        File "/__w/spark/spark/python/pyspark/serializers.py", line 595, in read_int
          raise EOFError
      
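      Note that the trailing EOFError is a secondary symptom, not the root cause: the SIGSEGV kills the JVM, the accumulator server's socket closes, and pyspark.serializers.read_int hits an empty read. A minimal sketch of that failure mode (a simplified re-implementation for illustration, not the exact PySpark code):

      ```python
      import io
      import struct

      def read_int(stream):
          # Sketch of pyspark.serializers.read_int: a 4-byte big-endian
          # signed int framed at the start of each accumulator update.
          data = stream.read(4)
          if len(data) < 4:
              # The peer (here, the crashed JVM) closed the stream mid-frame.
              raise EOFError
          return struct.unpack("!i", data)[0]

      # Normal case: a well-formed frame decodes cleanly.
      buf = io.BytesIO(struct.pack("!i", 7))
      assert read_int(buf) == 7

      # Crash case: an empty stream surfaces as the EOFError seen above.
      try:
          read_int(io.BytesIO(b""))
      except EOFError:
          print("EOFError: peer closed the stream")
      ```

      So the traceback only tells us the JVM died; the real fault is in the native frame reported by the JRE (UnsafeAlignedOffset.getSize).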

      Attachments

      Activity

      People

        Assignee: Ivan Sadikov (ivan.sadikov)
        Reporter: Hyukjin Kwon (gurwls223)
        Votes: 0
        Watchers: 1

      Dates

        Created:
        Updated:
        Resolved: