[SPARK-16548] java.io.CharConversionException: Invalid UTF-32 character prevents me from querying my data


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version: 1.6.1
    • Fix Versions: 2.2.0, 2.3.0
    • Component: SQL
    • Labels: None

    Description

      Basically, when I query my JSON data I get:

      java.io.CharConversionException: Invalid UTF-32 character 0x7b2265(above 10ffff)  at char #192, byte #771)
      	at com.fasterxml.jackson.core.io.UTF32Reader.reportInvalid(UTF32Reader.java:189)
      	at com.fasterxml.jackson.core.io.UTF32Reader.read(UTF32Reader.java:150)
      	at com.fasterxml.jackson.core.json.ReaderBasedJsonParser.loadMore(ReaderBasedJsonParser.java:153)
      	at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._skipWSOrEnd(ReaderBasedJsonParser.java:1855)
      	at com.fasterxml.jackson.core.json.ReaderBasedJsonParser.nextToken(ReaderBasedJsonParser.java:571)
      	at org.apache.spark.sql.catalyst.expressions.GetJsonObject$$anonfun$eval$2$$anonfun$4.apply(jsonExpressions.scala:142)
      

      This is undesirable. If one JSON record out of many cannot be processed, please return null for that record instead of failing the whole query. I have a dirty one-line fix, and I understand how to make it more reasonable. What is our position: what behaviour do we want?
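      The "return null instead of failing" behaviour requested above can be sketched as a catch-and-null wrapper around the parse step. This is a minimal illustration, not Spark's actual patch; `parseRecord` and `SafeJsonEval` are hypothetical stand-ins for Jackson's parser, which throws `CharConversionException` from `UTF32Reader` on undecodable input:

      ```java
      import java.io.CharConversionException;
      import java.util.Optional;

      public class SafeJsonEval {
          // Stand-in (an assumption) for Jackson's parsing: throws the same
          // CharConversionException seen in the stack trace on bad input.
          static String parseRecord(String json) throws CharConversionException {
              if (!json.startsWith("{")) {
                  throw new CharConversionException("Invalid UTF-32 character");
              }
              return json;
          }

          // Catch only the decode failure; unrelated errors still surface.
          // A corrupt record yields empty (null in SQL terms) and the
          // query continues over the remaining records.
          static Optional<String> safeGetJsonObject(String json) {
              try {
                  return Optional.of(parseRecord(json));
              } catch (CharConversionException e) {
                  return Optional.empty();
              }
          }
      }
      ```

      The design choice debated here is exactly the scope of that catch: swallowing the exception per-record keeps the job alive, at the cost of silently dropping values for corrupt input.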

      Attachments

        1. corrupted.json
          3.24 MB
          Bijith Kumar


            People

              Assignee: Unassigned
              Reporter: Egor Pahomov (epahomov)
              Votes: 0
              Watchers: 7
