Details
- Type: Bug
- Status: Resolved
- Priority: Trivial
- Resolution: Fixed
- Affects Version/s: 3.3.0
Description
The docstring of the `DataFrame.cache()` method currently states that it uses a serialized storage level:

    Persists the :class:`DataFrame` with the default storage level (`MEMORY_AND_DISK`). [...]
    The default storage level has changed to `MEMORY_AND_DISK` to match Scala in 2.0.

while the docstring of `DataFrame.persist()` states that it uses a deserialized storage level:

    If no storage level is specified defaults to (`MEMORY_AND_DISK_DESER`) [...]
    The default storage level has changed to `MEMORY_AND_DISK_DESER` to match Scala in 3.0.
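A quick way to confirm the quoted wording against an installed PySpark is to print the two docstrings directly; the comments below only sketch what to look for, the exact text depends on the installed version:

    from pyspark.sql import DataFrame

    print(DataFrame.cache.__doc__)    # quotes MEMORY_AND_DISK as the default
    print(DataFrame.persist.__doc__)  # quotes MEMORY_AND_DISK_DESER as the default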
However, in practice both `.cache()` and `.persist()` use deserialized storage levels:
import pyspark
from pyspark.sql import SparkSession
from pyspark import StorageLevel

print(pyspark.__version__)  # 3.3.0

spark = SparkSession.builder.master("local[2]").getOrCreate()

df = spark.createDataFrame(zip(["A"] * 1000, ["B"] * 1000), ["col_a", "col_b"])
df = df.cache()
df.count()
# Storage level in Spark UI: "Disk Memory Deserialized 1x Replicated"

df = spark.createDataFrame(zip(["A"] * 1000, ["B"] * 1000), ["col_a", "col_b"])
df = df.persist()
df.count()
# Storage level in Spark UI: "Disk Memory Deserialized 1x Replicated"

df = spark.createDataFrame(zip(["A"] * 1000, ["B"] * 1000), ["col_a", "col_b"])
df = df.persist(StorageLevel.MEMORY_AND_DISK)
df.count()
# Storage level in Spark UI: "Disk Memory Serialized 1x Replicated"
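The same behaviour can also be checked without the Spark UI by reading `DataFrame.storageLevel`, whose `deserialized` flag distinguishes the two levels. The sketch below is an illustrative variant of the reproduction above; the expected values in the comments follow from the UI observations in the report, not from a separately verified run:

    from pyspark.sql import SparkSession
    from pyspark import StorageLevel

    spark = SparkSession.builder.master("local[2]").getOrCreate()

    cached = spark.createDataFrame(zip(["A"] * 1000, ["B"] * 1000), ["col_a", "col_b"]).cache()
    print(cached.storageLevel.deserialized)     # expected: True  (MEMORY_AND_DISK_DESER)

    persisted = spark.createDataFrame(zip(["A"] * 1000, ["B"] * 1000), ["col_a", "col_b"]).persist()
    print(persisted.storageLevel.deserialized)  # expected: True  (MEMORY_AND_DISK_DESER)

    explicit = spark.createDataFrame(zip(["A"] * 1000, ["B"] * 1000), ["col_a", "col_b"]) \
        .persist(StorageLevel.MEMORY_AND_DISK)
    print(explicit.storageLevel.deserialized)   # expected: False (MEMORY_AND_DISK, serialized)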