Details
- Type: Wish
- Status: Resolved
- Priority: Minor
- Resolution: Not A Problem
- Affects Version/s: 2.1.0
- Fix Version/s: None
- Important
- Related to: SPARK-8480 - https://issues.apache.org/jira/browse/SPARK-8480
Description
We can cache and uncache any table by name in Spark SQL:
df.createTempView("myTable")
sqlContext.cacheTable("myTable")
sqlContext.uncacheTable("myTable")
Likewise, it would be very useful to have some notion of unique names for Datasets, with an abstraction similar to the one we already have for tables:
scala> val df = sc.range(1, 1000).toDF
df: org.apache.spark.sql.DataFrame = [value: bigint]

scala> df.setName("MyDataset")
res0: df.type = MyDataset

scala> df.cache
res1: df.type = MyDataset

sqlContext.getDataSet("MyDataset")
sqlContext.uncacheDataSet("MyDataset")
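The core of the wish is a name-to-Dataset registry with unique keys, analogous to what cacheTable/uncacheTable give tables. Below is a minimal, Spark-free sketch of such a registry in plain Scala, just to make the proposed semantics concrete; the object name DatasetRegistry and the methods register/lookup/unregister are hypothetical, not part of any Spark API.

```scala
import scala.collection.concurrent.TrieMap

// Hypothetical registry mapping a unique name to a value, mirroring
// the name-based lookup that cacheTable/uncacheTable provide for tables.
object DatasetRegistry {
  private val registry = TrieMap.empty[String, Any]

  // Returns false if the name is already taken, enforcing the
  // uniqueness of names the wish asks for.
  def register(name: String, ds: Any): Boolean =
    registry.putIfAbsent(name, ds).isEmpty

  // Name-based lookup, the analogue of the proposed getDataSet.
  def lookup(name: String): Option[Any] = registry.get(name)

  // Name-based removal, the analogue of the proposed uncacheDataSet.
  def unregister(name: String): Boolean = registry.remove(name).isDefined
}
```

In actual Spark this bookkeeping is what the catalog does for temporary views, which is presumably why registering a temp view and caching it by name already covers the use case.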