Spark / SPARK-28494 Expose CalendarIntervalType and CalendarInterval in Spark / SPARK-24695

Move `CalendarInterval` to org.apache.spark.sql.types package


Details

    • Type: Sub-task
    • Status: Resolved
    • Priority: Major
    • Resolution: Incomplete
    • Affects Version/s: 2.3.1
    • Fix Version/s: None
    • Component/s: SQL
    • Labels: Important

Description

    When I try to write a UDF that returns a `CalendarInterval`, I get the following error:

      Schema for type org.apache.spark.unsafe.types.CalendarInterval is not supported
      java.lang.UnsupportedOperationException: Schema for type org.apache.spark.unsafe.types.CalendarInterval is not supported
      at org.apache.spark.sql.catalyst.ScalaReflection$$anonfun$schemaFor$1.apply(ScalaReflection.scala:781)
      at org.apache.spark.sql.catalyst.ScalaReflection$$anonfun$schemaFor$1.apply(ScalaReflection.scala:715)
      at scala.reflect.internal.tpe.TypeConstraints$UndoLog.undo(TypeConstraints.scala:56)
      at org.apache.spark.sql.catalyst.ScalaReflection$class.cleanUpReflectionObjects(ScalaReflection.scala:825)
      ...
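
    The report does not include the UDF itself; the sketch below is an illustrative reproduction written against Spark 2.3.x, where `CalendarInterval` still lives in org.apache.spark.unsafe.types and takes (months, microseconds). The object and UDF names are hypothetical. Simply defining the UDF is enough to hit the ScalaReflection.schemaFor path shown in the stack trace above.

      import org.apache.spark.sql.SparkSession
      import org.apache.spark.sql.functions.udf
      import org.apache.spark.unsafe.types.CalendarInterval

      object CalendarIntervalUdfRepro {
        def main(args: Array[String]): Unit = {
          val spark = SparkSession.builder()
            .master("local[*]")
            .appName("calendar-interval-udf-repro")
            .getOrCreate()

          // ScalaReflection.schemaFor has no mapping for
          // org.apache.spark.unsafe.types.CalendarInterval, so this call throws
          // java.lang.UnsupportedOperationException ("Schema for type ... is not supported")
          // before the UDF is ever applied to a column.
          val monthsToInterval = udf((months: Int) => new CalendarInterval(months, 0L))

          spark.stop()
        }
      }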

       

       


People

    Assignee: Unassigned
    Reporter: Priyanka Garg (priyankagargnitk)
    Votes: 0
    Watchers: 3


Time Tracking

    Estimated: 24h
    Remaining: 24h
    Logged: Not Specified