Description
Hi,
I have the following issue when trying to use functions in a GROUP BY clause.
Example:
sqlCtx = HiveContext(sc)
rdd = sc.parallelize([{'test_date': 1451400761}])
df = sqlCtx.createDataFrame(rdd)
df.registerTempTable("df")
When I'm using a single function it works fine:
sqlCtx.sql("select cast(test_date as timestamp) from df group by cast(test_date as timestamp)").collect()
[Row(test_date=datetime.datetime(2015, 12, 29, 15, 52, 41))]
When I'm using more than one (nested) function I get an AnalysisException:
sqlCtx.sql("select date(cast(test_date as timestamp)) from df group by date(cast(test_date as timestamp))").collect()
Py4JJavaError: An error occurred while calling o38.sql.
: org.apache.spark.sql.AnalysisException: expression 'test_date' is neither present in the group by, nor is it an aggregate function. Add to group by or wrap in first() (or first_value) if you don't care which value you get.;