pyspark.sql.functions.map_concat

pyspark.sql.functions.map_concat(*cols)
Returns the union of all the given maps.
New in version 2.4.0.
Parameters
cols : Column or str
    column names or Columns
Examples
>>> from pyspark.sql.functions import map_concat
>>> df = spark.sql("SELECT map(1, 'a', 2, 'b') as map1, map(3, 'c') as map2")
>>> df.select(map_concat("map1", "map2").alias("map3")).show(truncate=False)
+------------------------+
|map3                    |
+------------------------+
|{1 -> a, 2 -> b, 3 -> c}|
+------------------------+
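
As a further illustration (not part of the original docs), the same union can be built without a SQL expression by constructing the input maps with create_map and lit from the DataFrame API. This sketch assumes an active SparkSession bound to the name spark, as in the example above.

>>> from pyspark.sql.functions import create_map, lit, map_concat
>>> # Build the two map columns directly instead of via spark.sql
>>> df = spark.range(1).select(
...     create_map(lit(1), lit("a"), lit(2), lit("b")).alias("map1"),
...     create_map(lit(3), lit("c")).alias("map2"),
... )
>>> df.select(map_concat("map1", "map2").alias("map3")).show(truncate=False)
+------------------------+
|map3                    |
+------------------------+
|{1 -> a, 2 -> b, 3 -> c}|
+------------------------+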