pyspark.sql.functions.try_avg

pyspark.sql.functions.try_avg(col)

Returns the mean calculated from the values of a group; the result is null on overflow.

New in version 3.5.0.

Parameters
col : Column or str
    target column to compute the average on.

Examples

Example 1: Calculating the average age

>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([(1982, 15), (1990, 2)], ["birth", "age"])
>>> df.select(sf.try_avg("age")).show()
+------------+
|try_avg(age)|
+------------+
|         8.5|
+------------+

Example 2: Calculating the average age, ignoring None values

>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([(1982, None), (1990, 2), (2000, 4)], ["birth", "age"])
>>> df.select(sf.try_avg("age")).show()
+------------+
|try_avg(age)|
+------------+
|         3.0|
+------------+
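
try_avg ignores null values, as shown above; when every value in the group is null, the result itself is NULL. A minimal sketch of that case, assuming the same running SparkSession named spark (a DDL schema string is used so the all-null column gets an explicit type):

>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([(1982, None), (1990, None)], "birth INT, age INT")
>>> df.select(sf.try_avg("age")).show()
+------------+
|try_avg(age)|
+------------+
|        NULL|
+------------+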

Example 3: Overflow results in NULL when ANSI mode is on

>>> from decimal import Decimal
>>> import pyspark.sql.functions as sf
>>> origin = spark.conf.get("spark.sql.ansi.enabled")
>>> spark.conf.set("spark.sql.ansi.enabled", "true")
>>> try:
...     df = spark.createDataFrame(
...         [(Decimal("1" * 38),), (Decimal(0),)], "number DECIMAL(38, 0)")
...     df.select(sf.try_avg(df.number)).show()
... finally:
...     spark.conf.set("spark.sql.ansi.enabled", origin)
+---------------+
|try_avg(number)|
+---------------+
|           NULL|
+---------------+
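
Because try_avg is an aggregate over the values of a group, it can also be used inside a groupBy aggregation. A minimal sketch, assuming the same running SparkSession named spark and illustrative column names group and value:

>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([("a", 1), ("a", 2), ("b", 10)], ["group", "value"])
>>> df.groupBy("group").agg(sf.try_avg("value")).orderBy("group").show()
+-----+--------------+
|group|try_avg(value)|
+-----+--------------+
|    a|           1.5|
|    b|          10.0|
+-----+--------------+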