pyspark.sql.functions.variance
Aggregate function: alias for var_samp (the sample variance of the values in a group).
New in version 1.6.0.
Changed in version 3.4.0: Supports Spark Connect.
Parameters
col : Column or str
    target column to compute on.

Returns
Column
    variance of the given column.
Examples
>>> df = spark.range(6)
>>> df.select(variance(df.id)).show()
+------------+
|var_samp(id)|
+------------+
|         3.5|
+------------+
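Because variance is an alias for var_samp, the two functions return identical results. The minimal sketch below (assuming an active SparkSession named spark, as in the example above) checks this and relates the value to the sample-variance formula sum((x - mean)**2) / (n - 1).

>>> from pyspark.sql.functions import variance, var_samp
>>> df = spark.range(6)                        # ids 0..5, mean 2.5
>>> row = df.select(variance("id").alias("v"),
...                 var_samp("id").alias("vs")).first()
>>> row.v == row.vs                            # alias: same result as var_samp
True
>>> row.v                                      # 17.5 / (6 - 1)
3.5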