Details
- Type: Sub-task
- Status: Resolved
- Priority: Major
- Resolution: Incomplete
- Affects Version/s: 2.3.1
- Fix Version/s: None
- Labels: Important
Description
When I try to write a UDF that returns the calendar interval type, I get the following error:

java.lang.UnsupportedOperationException: Schema for type org.apache.spark.unsafe.types.CalendarInterval is not supported
    at org.apache.spark.sql.catalyst.ScalaReflection$$anonfun$schemaFor$1.apply(ScalaReflection.scala:781)
    at org.apache.spark.sql.catalyst.ScalaReflection$$anonfun$schemaFor$1.apply(ScalaReflection.scala:715)
    at scala.reflect.internal.tpe.TypeConstraints$UndoLog.undo(TypeConstraints.scala:56)
    at org.apache.spark.sql.catalyst.ScalaReflection$class.cleanUpReflectionObjects(ScalaReflection.scala:825)
    ...
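A minimal sketch of a reproduction, assuming Spark 2.3.x (where `CalendarInterval` takes a `(months, microseconds)` constructor) and a local Spark session; the object and variable names are illustrative:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.udf
import org.apache.spark.unsafe.types.CalendarInterval

object IntervalUdfRepro {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("interval-udf-repro")
      .getOrCreate()

    // ScalaReflection.schemaFor has no Catalyst mapping for
    // CalendarInterval, so defining a UDF with this return type
    // throws the UnsupportedOperationException shown above.
    val makeInterval = udf((months: Int) => new CalendarInterval(months, 0L))

    spark.stop()
  }
}
```

Registering the same function via `spark.udf.register` fails the same way, since both paths go through `ScalaReflection.schemaFor` to derive the return schema.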
Issue Links
- blocks
  - SPARK-28492 Expose and support calendar interval type in PySpark (Resolved)
  - SPARK-28493 Expose and support calendar interval type in SparkR (Resolved)
  - SPARK-28491 Expose and support calendar interval type in SparkSQL (Resolved)