Description
Hello,
Spark does not recognize the `interval` type as `numeric`, which means `interval` columns cannot be used in aggregate functions. For instance, the following query works in PostgreSQL but fails in Spark:
```sql
SELECT i,
       AVG(CAST(v AS interval)) OVER (ORDER BY i ROWS BETWEEN CURRENT ROW AND UNBOUNDED FOLLOWING)
FROM (VALUES (1, '1 sec'), (2, '2 sec'), (3, NULL), (4, NULL)) t(i, v);
```

Spark fails with:

```
cannot resolve 'avg(CAST(`v` AS INTERVAL))' due to data type mismatch: function average requires numeric types, not interval; line 1 pos 9
```
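For reference, the PostgreSQL behaviour the query relies on can be sketched in plain Python (an illustration only, not Spark code): each row's frame runs from the current row to the end, AVG skips NULLs, and an all-NULL frame yields NULL. The `parse_interval` helper is hypothetical and only handles the `'N sec'` literals from the example.

```python
from datetime import timedelta

# The VALUES clause from the query above.
rows = [(1, "1 sec"), (2, "2 sec"), (3, None), (4, None)]

def parse_interval(s):
    # Minimal stand-in for CAST(v AS interval); handles only 'N sec' literals.
    if s is None:
        return None
    n, unit = s.split()
    assert unit == "sec"
    return timedelta(seconds=int(n))

intervals = [parse_interval(v) for _, v in rows]

def suffix_avg(vals, start):
    # Frame: CURRENT ROW to UNBOUNDED FOLLOWING. AVG ignores NULLs;
    # an all-NULL frame produces NULL (None here).
    present = [v for v in vals[start:] if v is not None]
    if not present:
        return None
    return sum(present, timedelta()) / len(present)

result = [(i, suffix_avg(intervals, idx)) for idx, (i, _) in enumerate(rows)]
# Row 1 averages 1s and 2s → 1.5s; row 2 → 2s; rows 3 and 4 → NULL.
```

A common workaround in Spark today is to cast the interval to a numeric number of seconds before averaging, then convert back, which is essentially what this sketch does internally with `timedelta`.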
Issue Links
- is related to SPARK-29688: Support average with interval type values (Reopened)