This is an umbrella ticket to track new expressions we are adding to SQL/DataFrame.
For each new expression, we should:
1. Add a new Expression implementation in org.apache.spark.sql.catalyst.expressions
2. If applicable, implement the code generated version (by implementing genCode).
3. Add comprehensive unit tests covering all the data types the expression supports.
4. If applicable, add a corresponding DataFrame function in org.apache.spark.sql.functions for Scala, and in python/pyspark/sql/functions.py for Python.
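To make the shape of steps 1-3 concrete, here is a simplified sketch of the interpreted-evaluation pattern such expressions follow. The classes below are hypothetical stand-ins written in Python for illustration only; the real implementations are Scala classes under org.apache.spark.sql.catalyst.expressions, and HourOfDay is an invented example expression, not an actual Catalyst one.

```python
# Simplified stand-in for an expression tree with interpreted evaluation.
# Illustrative only: the real Expression hierarchy lives in Scala under
# org.apache.spark.sql.catalyst.expressions.

class Expression:
    def eval(self, row):
        raise NotImplementedError

class Literal(Expression):
    """Leaf expression that returns a constant value."""
    def __init__(self, value):
        self.value = value

    def eval(self, row):
        return self.value

class HourOfDay(Expression):
    """Hypothetical date/time expression: hour of day from epoch seconds."""
    def __init__(self, child):
        self.child = child

    def eval(self, row):
        seconds = self.child.eval(row)
        if seconds is None:  # propagate nulls, as Catalyst expressions typically do
            return None
        return (seconds % 86400) // 3600

# 3661 seconds past midnight is 01:01:01, so the hour is 1.
print(HourOfDay(Literal(3661)).eval(row={}))  # prints 1
```

Unit tests for a real expression would then exercise eval (and the generated-code path) across every supported input type, including nulls.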
For date/time functions, put the implementations in expressions/datetime.scala and add their tests to a new DateTimeFunctionSuite.scala.