SPARK-21255: NPE when creating encoder for enum


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 2.1.0
    • Fix Version/s: 2.3.0
    • Component/s: Java API
    • Labels: None
    • Environment: org.apache.spark:spark-core_2.10:2.1.0,
      org.apache.spark:spark-sql_2.10:2.1.0

    Description

      When you try to create an encoder for an Enum type (or a bean with an enum property) via Encoders.bean(...), it fails with a NullPointerException at TypeToken:495.
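      A minimal reproduction sketch (assuming a Spark 2.1.0 classpath; java.util.concurrent.TimeUnit stands in here for any Java enum, and a bean with a TimeUnit-typed property fails the same way):

      import org.apache.spark.sql.Encoders
      import java.util.concurrent.TimeUnit

      // Building a bean encoder for an enum class triggers the failure.
      val enc = Encoders.bean(classOf[TimeUnit])
      // => NullPointerException thrown from TypeToken (line 495), as described above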
      I did a little research, and it turns out that the following code at JavaTypeInference:126

      val beanInfo = Introspector.getBeanInfo(typeToken.getRawType)
      val properties = beanInfo.getPropertyDescriptors.filterNot(_.getName == "class")
      val fields = properties.map { property =>
        val returnType = typeToken.method(property.getReadMethod).getReturnType
        val (dataType, nullable) = inferDataType(returnType)
        new StructField(property.getName, dataType, nullable)
      }
      (new StructType(fields), true)
      

      filters out the property named "class", because we wouldn't want to serialize that. But enum types have another property of type Class, named "declaringClass", which gets inspected recursively. Eventually we try to inspect the ClassLoader class, which has a property "defaultAssertionStatus" with no read method, and that leads to the NPE at TypeToken:495.
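      A quick way to see this (illustrative sketch; TimeUnit is again just an example enum): the JavaBeans introspector reports a read-only "declaringClass" property on every enum class, alongside "class", and that is what the inference code then recurses into.

      import java.beans.Introspector
      import java.util.concurrent.TimeUnit

      // Lists the bean properties that the inference code iterates over for an enum.
      val names = Introspector.getBeanInfo(classOf[TimeUnit])
        .getPropertyDescriptors
        .map(_.getName)
        .sorted
      println(names.mkString(", "))  // expected output: class, declaringClass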

      I think adding the property name "declaringClass" to the filter will resolve this.
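      As a sketch, that suggestion would amount to something like the change below in the snippet above (this is only the change proposed in this report; the fix that actually shipped in 2.3.0 may take a different approach):

      val beanInfo = Introspector.getBeanInfo(typeToken.getRawType)
      // Also skip "declaringClass", which every enum exposes and which otherwise
      // pulls java.lang.Class and ClassLoader into the recursive inference.
      val properties = beanInfo.getPropertyDescriptors
        .filterNot(p => p.getName == "class" || p.getName == "declaringClass")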


    People

      Assignee: mike0sv Mike
      Reporter: mike0sv Mike
      Votes: 0
      Watchers: 3

    Dates

      Created:
      Updated:
      Resolved: