Spark / SPARK-45311

Encoder fails with many "NoSuchElementException: None.get" errors since 3.4.x when searching for an encoder for a generic type, and since 3.5.x because it isn't "an expression encoder"


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Versions: 3.4.0, 3.4.1, 3.5.0
    • Fix Versions: 3.4.2, 4.0.0, 3.5.1
    • Component: Spark Core
    • Labels: None
    • Environment: Debian 12, Java 17, underlying Spring-Boot 2.7.14

    Description

      If you find it convenient, you can clone the https://gitlab.com/territoirevif/minimal-tests-spark-issue project (it performs many operations around cities, local authorities and accounting with open data), where I've extracted from my work what's needed for a set of 35 tests that run correctly with Spark 3.3.x and show the troubles encountered with 3.4.x and 3.5.x.

       

      It works well with Spark 3.2.x and 3.3.x. But as soon as I select Spark 3.4.x, where the encoder seems to have changed deeply, the encoder fails with two problems:

       

      1) It throws java.util.NoSuchElementException: None.get messages everywhere.

      Asking around on the Internet, I found I'm not alone in facing this problem. Reading the question below, you'll see that I attempted to debug it, but my Scala skills are low.

      https://stackoverflow.com/questions/76036349/encoders-bean-doesnt-work-anymore-on-a-java-pojo-with-spark-3-4-0

      By the way, if possible, the encoder and decoder functions should forward a parameter carrying the name of the field being handled, as soon as that name is known and all along their process, so that wherever the encoder has to throw an exception it knows which field it is working on and can produce a message like:
      java.util.NoSuchElementException: None.get when encoding [the method or field it was targeting]
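A minimal sketch of that suggestion (all names here are hypothetical, not actual Spark internals): thread the path of the field currently being encoded through the recursive step, so any failure can name the offending field instead of raising a bare "None.get":

```java
import java.util.Map;
import java.util.NoSuchElementException;

public class FieldPathDemo {
    // Hypothetical recursive encoder step: it carries the path of the
    // field currently being processed so exceptions can name it.
    static Object encodeField(String path, Map<String, Object> row, String field) {
        String fullPath = path.isEmpty() ? field : path + "." + field;
        Object value = row.get(field);
        if (value == null) {
            // Instead of a bare "None.get", report the offending field.
            throw new NoSuchElementException("None.get when encoding " + fullPath);
        }
        return value;
    }

    public static void main(String[] args) {
        try {
            encodeField("city", Map.of("name", "Paris"), "population");
        } catch (NoSuchElementException e) {
            System.out.println(e.getMessage());
        }
    }
}
```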

       

      2) Not found an encoder of the type RS to Spark SQL internal representation. Consider to change the input type to one of supported at (...)
      Or: Not found an encoder of the type OMI_ID to Spark SQL internal representation (...)

       
      where RS and OMI_ID are generic types. This is strange.
      https://stackoverflow.com/questions/76045255/encoders-bean-attempts-to-check-the-validity-of-a-return-type-considering-its-ge
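My reading of it (an assumption, not a confirmed diagnosis; the class names below are invented for illustration): when the bean analysis looks only at the declaring class, a getter inherited from a generic superclass reports the unresolved type variable (RS, OMI_ID) rather than the concrete type the subclass binds it to. Plain reflection reproduces this:

```java
import java.lang.reflect.Method;
import java.lang.reflect.ParameterizedType;

public class GenericBeanDemo {
    // A generic base class, standing in for the ones in the project.
    public static abstract class AbstractEntity<OMI_ID> {
        public abstract OMI_ID getId();
    }

    // The concrete bean actually handed to the encoder.
    public static class City extends AbstractEntity<String> {
        @Override public String getId() { return "75056"; }
    }

    public static void main(String[] args) throws Exception {
        // Seen from the declaring class alone, the return type is the
        // unresolved type variable "OMI_ID"...
        Method m = AbstractEntity.class.getMethod("getId");
        System.out.println(m.getGenericReturnType().getTypeName());

        // ...while the subclass carries the information needed to
        // resolve it to java.lang.String.
        ParameterizedType sup =
                (ParameterizedType) City.class.getGenericSuperclass();
        System.out.println(sup.getActualTypeArguments()[0].getTypeName());
    }
}
```

An encoder resolving type variables against the concrete bean class, as in the second lookup, would find java.lang.String where the error message currently reports OMI_ID.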

       

      3) When I switch to Spark 3.5.0, the same problems remain, but another one adds itself to the list:
      "Only expression encoders are supported for now", on code that was accepted and working before.
       

      Attachments

        1. JavaTypeInference_116.png
          300 kB
          Marc Le Bihan
        2. sparkIssue_02.png
          521 kB
          Marc Le Bihan

            People

              Assignee: Unassigned
              Reporter: mlebihan Marc Le Bihan
              Votes: 0
              Watchers: 2
