Details
- Type: Sub-task
- Status: Resolved
- Priority: Major
- Resolution: Invalid
- Affects Version/s: 3.0.0
- Fix Version/s: None
- Labels: None
Description
In PostgreSQL, a NULL constant is accepted in LIMIT and is simply ignored.
But in Spark, it throws the exception below:
select * from int8_tbl limit (case when random() < 0.5 then bigint(null) end);
org.apache.spark.sql.AnalysisException: The limit expression must evaluate to a constant value, but got CASE WHEN (`_nondeterministic` < CAST(0.5BD AS DOUBLE)) THEN CAST(NULL AS BIGINT) END;
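
For comparison, a minimal sketch of the PostgreSQL behavior (assuming int8_tbl is the bigint test table from PostgreSQL's regression suite; null::bigint is PostgreSQL's cast syntax, standing in for Spark's bigint(null)):

-- In PostgreSQL, LIMIT NULL is documented to behave the same as omitting
-- the LIMIT clause, so this returns all rows instead of failing:
select * from int8_tbl limit null;

-- The same holds when the limit expression only evaluates to NULL at runtime:
select * from int8_tbl limit (case when random() < 0.5 then null::bigint end);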