Pig / PIG-5318

Unit test failures on Pig on Spark with Spark 2.2


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 0.18.0
    • Component/s: spark
    • Labels: None

    Description

      There are several failing cases when executing the unit tests with Spark 2.2:

       org.apache.pig.test.TestAssert#testNegativeWithoutFetch
       org.apache.pig.test.TestAssert#testNegative
       org.apache.pig.test.TestEvalPipeline2#testNonStandardDataWithoutFetch
       org.apache.pig.test.TestScalarAliases#testScalarErrMultipleRowsInInput
       org.apache.pig.test.TestStore#testCleanupOnFailureMultiStore
       org.apache.pig.test.TestStoreInstances#testBackendStoreCommunication
       org.apache.pig.test.TestStoreLocal#testCleanupOnFailureMultiStore
      

      All of these are related to fixes/changes in Spark.

      The TestAssert, TestScalarAliases and TestEvalPipeline2 failures can be fixed by asserting on the message of the exception's root cause: on Spark 2.2 the exception appears to be wrapped in an additional layer.
      The TestStore and TestStoreLocal failures are also test-related problems: it looks like SPARK-7953 is fixed in Spark 2.2.
      The root cause of the TestStoreInstances failure is yet to be found.
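      The root-cause fix for the TestAssert, TestScalarAliases and TestEvalPipeline2 cases can be sketched as follows. This is a hypothetical illustration, not code from the attached patches: the class and helper names are made up, and the "Spark 2.2 wrapping" is simulated with plain RuntimeExceptions.

```java
// Illustrative sketch: on Spark 2.2 the original exception arrives wrapped
// in an extra layer, so asserting on the top-level message breaks. Walking
// the cause chain and asserting on the root cause's message is version-stable.
// RootCauseExample and getRootCause are hypothetical names, not Pig APIs.
public class RootCauseExample {

    // Follow getCause() links until the innermost throwable is reached.
    static Throwable getRootCause(Throwable t) {
        Throwable cause = t;
        while (cause.getCause() != null) {
            cause = cause.getCause();
        }
        return cause;
    }

    public static void main(String[] args) {
        // Simulated original failure, as a Pig ASSERT might raise it.
        Exception original =
                new RuntimeException("Assertion violated: i should be > 1");
        // Simulated Spark 2.2 behavior: the failure is wrapped one layer deeper.
        Exception wrapped =
                new RuntimeException("Job aborted", new RuntimeException(original));

        // Asserting on the root cause works whether or not Spark adds a layer.
        String msg = getRootCause(wrapped).getMessage();
        if (!msg.contains("Assertion violated")) {
            throw new AssertionError("unexpected message: " + msg);
        }
        System.out.println(msg);
    }
}
```

      The same pattern is what Apache Commons Lang's ExceptionUtils.getRootCause offers, if that dependency is already on the test classpath.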

      Attachments

        1. PIG-5318_1.patch
          11 kB
          Nándor Kollár
        2. PIG-5318_2.patch
          12 kB
          Nándor Kollár
        3. PIG-5318_3.patch
          12 kB
          Nándor Kollár
        4. PIG-5318_4.patch
          12 kB
          Nándor Kollár
        5. PIG-5318_5.patch
          12 kB
          Nándor Kollár
        6. PIG-5318_6.patch
          12 kB
          Nándor Kollár


          People

            Assignee: Nándor Kollár (nkollar)
            Reporter: Nándor Kollár (nkollar)
            Votes: 0
            Watchers: 4
