Details
- Type: Bug
- Status: Closed
- Priority: Major
- Resolution: Fixed
- Affects Version/s: 1.37.0
Description
scala> val df = spark.sql("select map_keys(map(null,0))");
df: org.apache.spark.sql.DataFrame = [map_keys(map(NULL, 0)): array<void>]
scala> df.show();
org.apache.spark.SparkRuntimeException: [NULL_MAP_KEY] Cannot use null as map key.
at org.apache.spark.sql.errors.QueryExecutionErrors$.nullAsMapKeyNotAllowedError(QueryExecutionErrors.scala:445)
at org.apache.spark.sql.catalyst.util.ArrayBasedMapBuilder.put(ArrayBasedMapBuilder.scala:56)
at org.apache.spark.sql.catalyst.expressions.CreateMap.eval(complexTypeCreator.scala:248)
scala> val df = spark.sql("select map_values(map(cast(null as int),0, 'foo', 1))");
df: org.apache.spark.sql.DataFrame = [map_values(map(CAST(NULL AS INT), 0, foo, 1)): array<int>]
scala> df.show()
org.apache.spark.SparkRuntimeException: [NULL_MAP_KEY] Cannot use null as map key.
at org.apache.spark.sql.errors.QueryExecutionErrors$.nullAsMapKeyNotAllowedError(QueryExecutionErrors.scala:445)
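For context, the `[NULL_MAP_KEY]` errors above come from Spark rejecting a null key eagerly while building the map, before `map_keys`/`map_values` ever runs. Below is a minimal, self-contained sketch of that rule; the class name `NullKeyMapBuilder` and its methods are hypothetical illustrations, not Spark's actual `ArrayBasedMapBuilder` API.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch of the null-key check behind the NULL_MAP_KEY error
// shown in the stack traces above; for illustration only.
public class NullKeyMapBuilder {
    private final Map<Object, Object> entries = new LinkedHashMap<>();

    // Rejects null keys at insertion time, mirroring the behavior
    // that map(null, 0) fails as soon as the map is evaluated.
    public NullKeyMapBuilder put(Object key, Object value) {
        if (key == null) {
            throw new IllegalStateException("[NULL_MAP_KEY] Cannot use null as map key.");
        }
        entries.put(key, value);
        return this;
    }

    public Map<Object, Object> build() {
        return entries;
    }
}
```

Note that in the second repro, `map(cast(null as int), 0, 'foo', 1)` fails on the first entry even though a non-null key follows, consistent with the check firing per `put` call rather than on the finished map.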
Issue Links
- blocks
  - CALCITE-6582 Release Calcite 1.38.0 (Closed)
- relates to
  - CALCITE-6300 Function MAP_VALUES/MAP_KEYS gives exception when mapValueType and mapKeyType do not equal map biggest mapKeytype or mapValueType (In Progress)