Description
Currently,
scala> import org.apache.spark.sql.types.DataType
import org.apache.spark.sql.types.DataType

scala> DataType.fromJson(""""abcd"""")
java.util.NoSuchElementException: key not found: abcd
  at scala.collection.MapLike$class.default(MapLike.scala:228)
  at scala.collection.AbstractMap.default(Map.scala:59)
  at scala.collection.MapLike$class.apply(MapLike.scala:141)
  at scala.collection.AbstractMap.apply(Map.scala:59)
  at org.apache.spark.sql.types.DataType$.nameToType(DataType.scala:118)
  at org.apache.spark.sql.types.DataType$.parseDataType(DataType.scala:132)
  at org.apache.spark.sql.types.DataType$.fromJson(DataType.scala:104)
  ... 48 elided

scala> DataType.fromJson("""{"abcd":"a"}""")
scala.MatchError: JObject(List((abcd,JString(a)))) (of class org.json4s.JsonAST$JObject)
  at org.apache.spark.sql.types.DataType$.parseDataType(DataType.scala:130)
  at org.apache.spark.sql.types.DataType$.fromJson(DataType.scala:104)
  ... 48 elided

scala> DataType.fromJson("""{"fields": [{"a":123}], "type": "struct"}""")
scala.MatchError: JObject(List((a,JInt(123)))) (of class org.json4s.JsonAST$JObject)
  at org.apache.spark.sql.types.DataType$.org$apache$spark$sql$types$DataType$$parseStructField(DataType.scala:169)
  at org.apache.spark.sql.types.DataType$$anonfun$parseDataType$1.apply(DataType.scala:150)
  at org.apache.spark.sql.types.DataType$$anonfun$parseDataType$1.apply(DataType.scala:150)
  at scala.collection.immutable.List.map(List.scala:273)
  at org.apache.spark.sql.types.DataType$.parseDataType(DataType.scala:150)
  at org.apache.spark.sql.types.DataType$.fromJson(DataType.scala:104)
  ... 48 elided
DataType.fromJson throws unreadable errors for malformed JSON input. Instead of letting internal exceptions such as scala.MatchError or java.util.NoSuchElementException escape to the caller, we could raise an exception with a message that describes which JSON string failed to parse.
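One possible shape for the fix is a catch-all case in the json4s pattern match that echoes the offending JSON back in the error message. The sketch below is illustrative, not Spark's actual implementation: `DataTypeJson`, `primitiveTypes`, and the exact message wording are assumptions, and only the structure (a readable `IllegalArgumentException` instead of a leaked `MatchError`) mirrors the proposal.

```scala
import org.json4s._
import org.json4s.jackson.JsonMethods.{compact, parse, render}

// Hypothetical sketch of a fromJson that fails with a readable message.
// The object name, the primitive-type set, and the message text are all
// illustrative assumptions, not Spark's real code.
object DataTypeJson {
  private val primitiveTypes = Set("string", "integer", "boolean", "double")

  def fromJson(json: String): String = parseDataType(parse(json))

  private def parseDataType(json: JValue): String = json match {
    case JString(name) if primitiveTypes.contains(name) =>
      name
    case other =>
      // Instead of a bare MatchError, report the JSON that failed to parse.
      throw new IllegalArgumentException(
        s"Failed to convert the JSON string '${compact(render(other))}' to a data type.")
  }
}
```

With this structure, `DataTypeJson.fromJson(""""abcd"""")` raises an `IllegalArgumentException` whose message names the unsupported input, rather than the opaque `key not found: abcd` shown above.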