pyspark.sql.functions.timestamp_seconds
pyspark.sql.functions.timestamp_seconds(col)
Converts the number of seconds from the Unix epoch (1970-01-01T00:00:00Z) to a timestamp.
New in version 3.1.0.
Changed in version 3.4.0: Supports Spark Connect.
Examples
>>> spark.conf.set("spark.sql.session.timeZone", "UTC")
>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([(1230219000,), (1280219000,)], ['seconds'])
>>> df.select('*', sf.timestamp_seconds('seconds')).show()
+----------+--------------------------+
|   seconds|timestamp_seconds(seconds)|
+----------+--------------------------+
|1230219000|       2008-12-25 15:30:00|
|1280219000|       2010-07-27 08:23:20|
+----------+--------------------------+
>>> spark.conf.unset("spark.sql.session.timeZone")
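As an illustrative sketch (not part of the original example set), the conversion can be reversed with pyspark.sql.functions.unix_timestamp, which returns the number of seconds since the epoch for a timestamp column; the alias 'roundtrip' below is only for display.

>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([(1230219000,)], ['seconds'])
>>> df.select(
...     sf.unix_timestamp(sf.timestamp_seconds('seconds')).alias('roundtrip')
... ).show()
+----------+
| roundtrip|
+----------+
|1230219000|
+----------+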