pyspark.sql.functions.array_append
pyspark.sql.functions.array_append(col, value)
Array function: returns a new array column by appending value to the existing array col.
New in version 3.4.0.
- Parameters
col – Column or str; name of the column containing the array.
value – a literal value, or a Column expression, to append to the array.
- Returns
Column
A new array column with value appended to the original array.
Notes
Supports Spark Connect.
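The null-handling behavior shown in the examples below can be sketched in plain Python. This is a hypothetical model of the semantics (`array_append_model` is not part of PySpark): a NULL array propagates to a NULL result, while a NULL value is simply appended.

```python
from typing import Any, List, Optional

def array_append_model(arr: Optional[List[Any]], value: Any) -> Optional[List[Any]]:
    """Plain-Python sketch of array_append's null semantics.

    Hypothetical model for illustration, not Spark's implementation.
    """
    # A NULL (None) array propagates: the result is NULL.
    if arr is None:
        return None
    # Otherwise the value is appended, even when the value itself is NULL (None).
    return arr + [value]

print(array_append_model([1, 2, 3], 4))     # [1, 2, 3, 4]
print(array_append_model([1, 2, 3], None))  # [1, 2, 3, None]
print(array_append_model(None, 4))          # None
print(array_append_model([], 1))            # [1]
```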
Examples
Example 1: Appending a column value to an array column
>>> from pyspark.sql import Row, functions as sf
>>> df = spark.createDataFrame([Row(c1=["b", "a", "c"], c2="c")])
>>> df.select(sf.array_append(df.c1, df.c2)).show()
+--------------------+
|array_append(c1, c2)|
+--------------------+
|        [b, a, c, c]|
+--------------------+
Example 2: Appending a numeric value to an array column
>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame([([1, 2, 3],)], ['data'])
>>> df.select(sf.array_append(df.data, 4)).show()
+---------------------+
|array_append(data, 4)|
+---------------------+
|         [1, 2, 3, 4]|
+---------------------+
Example 3: Appending a null value to an array column
>>> from pyspark.sql import functions as sf
>>> df = spark.createDataFrame([([1, 2, 3],)], ['data'])
>>> df.select(sf.array_append(df.data, None)).show()
+------------------------+
|array_append(data, NULL)|
+------------------------+
|         [1, 2, 3, NULL]|
+------------------------+
Example 4: Appending a value to a NULL array column
>>> from pyspark.sql import functions as sf
>>> from pyspark.sql.types import ArrayType, IntegerType, StructType, StructField
>>> schema = StructType([
...     StructField("data", ArrayType(IntegerType()), True)
... ])
>>> df = spark.createDataFrame([(None,)], schema=schema)
>>> df.select(sf.array_append(df.data, 4)).show()
+---------------------+
|array_append(data, 4)|
+---------------------+
|                 NULL|
+---------------------+
Example 5: Appending a value to an empty array
>>> from pyspark.sql import functions as sf
>>> from pyspark.sql.types import ArrayType, IntegerType, StructType, StructField
>>> schema = StructType([
...     StructField("data", ArrayType(IntegerType()), True)
... ])
>>> df = spark.createDataFrame([([],)], schema=schema)
>>> df.select(sf.array_append(df.data, 1)).show()
+---------------------+
|array_append(data, 1)|
+---------------------+
|                  [1]|
+---------------------+