pyspark.sql.functions.split_part
- pyspark.sql.functions.split_part(src, delimiter, partNum)
Splits src by delimiter and returns the requested part of the split (1-based). If any input is null, returns null. If partNum is out of range of the split parts, returns an empty string. If partNum is 0, throws an error. If partNum is negative, the parts are counted backward from the end of the string. If the delimiter is an empty string, src is not split.
New in version 3.5.0.
- Parameters
- src – Column or column name: the string column to be split.
- delimiter – Column or column name: the delimiter used to split src.
- partNum – Column or column name: the 1-based index of the requested part of the split.
Examples
>>> df = spark.createDataFrame([("11.12.13", ".", 3,)], ["a", "b", "c"])
>>> df.select(split_part(df.a, df.b, df.c).alias('r')).collect()
[Row(r='13')]
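A further sketch of the edge cases described above, assuming the same SparkSession spark as in the previous example; lit is used to pass the delimiter and partNum as literal values. A negative partNum counts parts from the end of the string, and an out-of-range partNum yields an empty string:

>>> from pyspark.sql.functions import split_part, lit
>>> df = spark.createDataFrame([("11.12.13",)], ["a"])
>>> df.select(split_part(df.a, lit("."), lit(-1)).alias('r')).collect()
[Row(r='13')]
>>> df.select(split_part(df.a, lit("."), lit(5)).alias('r')).collect()
[Row(r='')]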