SPARK-12558: AnalysisException when multiple functions applied in GROUP BY clause

    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: 1.6.0
    • Fix Version/s: 1.6.1, 2.0.0
    • Component/s: SQL
    • Labels: None

      Description

      Hi,
      I have the following issue when trying to use functions in a GROUP BY clause.
      Example:

      from pyspark.sql import HiveContext

      sqlCtx = HiveContext(sc)
      rdd = sc.parallelize([{'test_date': 1451400761}])
      df = sqlCtx.createDataFrame(rdd)
      df.registerTempTable("df")
      

      When I use a single function, it works fine:

      sqlCtx.sql("select cast(test_date as timestamp) from df group by cast(test_date as timestamp)").collect()
      
      [Row(test_date=datetime.datetime(2015, 12, 29, 15, 52, 41))]
      

      When I use more than one function, I get an AnalysisException:

      sqlCtx.sql("select date(cast(test_date as timestamp)) from df group by date(cast(test_date as timestamp))").collect()
      
      Py4JJavaError: An error occurred while calling o38.sql.
      : org.apache.spark.sql.AnalysisException: expression 'test_date' is neither present in the group by, nor is it an aggregate function. Add to group by or wrap in first() (or first_value) if you don't care which value you get.;
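
      A possible workaround until the fix is available (this is a sketch added here, not part of the original report): compute the nested expression under an alias in a subquery, so that the outer GROUP BY references a plain column instead of nested function calls.

      # Workaround sketch (assumption, not from the report): alias the nested
      # expression in a subquery so the outer GROUP BY sees a simple column.
      sqlCtx.sql("""
          select test_day
          from (select date(cast(test_date as timestamp)) as test_day from df) t
          group by test_day
      """).collect()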
      

            People

            • Assignee: Dilip Biswal (dkbiswal)
            • Reporter: Maciej Bryński (maver1ck)
            • Votes: 0
            • Watchers: 6
