pyspark.sql.DataFrame.unpersist
DataFrame.unpersist(blocking: bool = False) → pyspark.sql.dataframe.DataFrame
Marks the DataFrame as non-persistent, and removes all blocks for it from memory and disk.

New in version 1.3.0.

Changed in version 3.4.0: Supports Spark Connect.

Parameters
blocking : bool
Whether to block until all blocks are deleted.
 
Returns
DataFrame
Unpersisted DataFrame.
 
Notes

The blocking default has changed to False to match Scala in 2.0.

Examples

>>> df = spark.range(1)
>>> df.persist()
DataFrame[id: bigint]
>>> df.unpersist()
DataFrame[id: bigint]
>>> df = spark.range(1)
>>> df.unpersist(True)
DataFrame[id: bigint]
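A minimal end-to-end sketch (not part of the original examples, and assuming a locally created SparkSession rather than the pre-existing spark session used above) that persists a DataFrame, confirms it is cached via is_cached, and then unpersists it with blocking=True so the call only returns once all blocks have been deleted:

from pyspark.sql import SparkSession

# Assumed local session for illustration; in the doctest examples `spark` already exists.
spark = SparkSession.builder.master("local[1]").appName("unpersist-demo").getOrCreate()

df = spark.range(10)

df.persist()                 # mark the DataFrame for caching
df.count()                   # run an action so the blocks are actually materialized
print(df.is_cached)          # True

df.unpersist(blocking=True)  # block until all blocks are removed from memory and disk
print(df.is_cached)          # False

spark.stop()

Passing blocking=True is useful when freeing memory before an immediately following memory-heavy stage; with the default blocking=False the blocks are removed asynchronously.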