Description
Before we take Spark SQL out of alpha, we need to audit the APIs and stabilize them.
As a general rule, nothing under org.apache.spark.sql.catalyst should be exposed to users.
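One way to enforce this rule is Scala's package-qualified visibility: classes under the catalyst package can be marked `private[sql]` so they remain usable from the rest of org.apache.spark.sql but disappear from the public API. The sketch below is illustrative only; `InternalHelper`, `PublicFacade`, and `normalizeColumnName` are hypothetical names, not actual Spark classes.

```scala
// Hypothetical sketch of hiding internals with package-private visibility.
package org.apache.spark.sql.catalyst {
  // Visible only within org.apache.spark.sql and its subpackages;
  // external user code cannot reference this class.
  private[sql] class InternalHelper {
    def normalize(name: String): String = name.toLowerCase
  }
}

package org.apache.spark.sql {
  // The stable, public surface delegates to the hidden internals.
  object PublicFacade {
    def normalizeColumnName(name: String): String =
      new catalyst.InternalHelper().normalize(name)
  }
}
```

With this layout, user code compiles against `PublicFacade` only; attempting to instantiate `catalyst.InternalHelper` from outside org.apache.spark.sql fails at compile time, which is exactly the stability boundary the audit is after.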
Issue Links
- is related to
  - SPARK-6116 DataFrame API improvement umbrella ticket (Spark 1.5) (Resolved)
- relates to
  - SPARK-4988 "Create table ..as select ..from..order by .. limit 10" report error when one col is a Decimal (Resolved)
  - SPARK-2096 Correctly parse dot notations for accessing an array of structs (Resolved)
  - SPARK-5180 Data source API improvement (Spark 1.5) (Resolved)
  - SPARK-5135 Add support for describe [extended] table to DDL in SQLContext (Resolved)
  - SPARK-5061 SQLContext: overload createParquetFile (Closed)
  - SPARK-4945 Add overwrite option support for SchemaRDD.saveAsParquetFile (Closed)