Spark / SPARK-18772

Unnecessary conversion try for special floats in JSON


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Minor
    • Resolution: Fixed
    • Affects Version/s: 2.0.2, 2.2.0
    • Fix Version/s: 2.2.0
    • Component/s: SQL
    • Labels: None

    Description

      It looks like we can avoid some unnecessary conversion attempts for special floats in JSON parsing.

      scala> import org.apache.spark.sql.types._
      import org.apache.spark.sql.types._
      
      scala> spark.read.schema(StructType(Seq(StructField("a", DoubleType)))).option("mode", "FAILFAST").json(Seq("""{"a": "nan"}""").toDS).show()
      17/05/12 11:30:41 ERROR Executor: Exception in task 0.0 in stage 2.0 (TID 2)
      java.lang.NumberFormatException: For input string: "nan"
      ...
      
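      The failure above happens because the string "nan" falls through to a plain numeric parse, which throws `NumberFormatException` instead of being rejected up front. One way to avoid the unnecessary conversion attempt is to match the exact special-float spellings first and only then fall back to `toDouble`. A minimal sketch of the idea (`parseJsonDouble` is a hypothetical helper, not Spark's actual parser code):

      ```scala
      import scala.util.Try

      // Hypothetical sketch: accept only the exact special-float spellings,
      // then fall back to a plain numeric parse for everything else.
      def parseJsonDouble(s: String): Option[Double] = s match {
        case "NaN"                => Some(Double.NaN)
        case "Infinity" | "+INF"  => Some(Double.PositiveInfinity)
        case "-Infinity" | "-INF" => Some(Double.NegativeInfinity)
        case other =>
          // Plain numeric strings still parse via toDouble; anything else
          // (e.g. "nan") yields None instead of throwing deep inside the
          // conversion, so FAILFAST mode can report a malformed record.
          Try(other.toDouble).toOption
      }
      ```

      With this shape, `parseJsonDouble("nan")` returns `None`, and the reader can surface a malformed-record error rather than an uncaught `NumberFormatException`.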

      People

        Assignee: gurwls223 Hyukjin Kwon
        Reporter: NathanHowell Nathan Howell
        Votes: 0
        Watchers: 3
