pyspark.sql.functions.unix_seconds

pyspark.sql.functions.unix_seconds(col)

Returns the number of seconds since 1970-01-01 00:00:00 UTC, truncating any sub-second precision in the input timestamp.

New in version 3.5.0.

Examples

>>> from pyspark.sql.functions import unix_seconds, to_timestamp
>>> spark.conf.set("spark.sql.session.timeZone", "America/Los_Angeles")
>>> df = spark.createDataFrame([('2015-07-22 10:00:00',)], ['t'])
>>> df.select(unix_seconds(to_timestamp(df.t)).alias('n')).collect()
[Row(n=1437584400)]
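
Because sub-second precision is truncated rather than rounded, a fractional-second input produces the same result. A minimal sketch, assuming the same session, time zone setting, and imports as above:

>>> df2 = spark.createDataFrame([('2015-07-22 10:00:00.999',)], ['t'])
>>> df2.select(unix_seconds(to_timestamp(df2.t)).alias('n')).collect()
[Row(n=1437584400)]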
>>> spark.conf.unset("spark.sql.session.timeZone")