Details

    • Type: Sub-task
    • Status: Resolved
    • Priority: Major
    • Resolution: Invalid
    • Affects Version/s: 3.0.0
    • Fix Version/s: None
    • Component/s: SQL
    • Labels: None

    Description

      In PostgreSQL, a NULL constant is accepted in LIMIT and it's simply ignored.

      But in Spark, it throws the exception below:

      select * from int8_tbl limit (case when random() < 0.5 then bigint(null) end);
      
      org.apache.spark.sql.AnalysisException
      The limit expression must evaluate to a constant value, but got CASE WHEN (`_nondeterministic` < CAST(0.5BD AS DOUBLE)) THEN CAST(NULL AS BIGINT) END; 
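
      For comparison, a minimal sketch of the equivalent PostgreSQL session (assuming the same int8_tbl table as in the query above; note that PostgreSQL's cast syntax is null::bigint rather than bigint(null)):

      -- In PostgreSQL, a NULL limit is treated like LIMIT ALL, so every row is returned
      select * from int8_tbl limit null;

      -- The CASE form from the report is accepted as well; when it evaluates
      -- to NULL, the limit is simply ignored
      select * from int8_tbl limit (case when random() < 0.5 then null::bigint end);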

    People

      Assignee: Unassigned
      Reporter: Takeshi Yamamuro (maropu)
      Votes: 0
      Watchers: 3
