SparkSession.conf
Runtime configuration interface for Spark.
This is the interface through which the user can get and set all Spark and Hadoop configurations that are relevant to Spark SQL. When getting the value of a config, this defaults to the value set in the underlying SparkContext, if any.
New in version 2.0.0.
Returns: pyspark.sql.conf.RuntimeConfig
Examples
>>> spark.conf
<pyspark.sql.conf.RuntimeConfig object ...>
Set and get a runtime configuration for the session:
>>> spark.conf.set("key", "value")
>>> spark.conf.get("key")
'value'