pyspark.ml.util.DefaultParamsReader

class pyspark.ml.util.DefaultParamsReader(cls)

Specialization of MLReader for Params types.

Default MLReader implementation for transformers and estimators that contain basic (json-serializable) params and no data. This will not handle more complex params or types with data (e.g., models with coefficients).

New in version 2.3.0.
Methods

getAndSetParams(instance, metadata[, skipParams])
    Extract Params from metadata, and set them in the instance.

isPythonParamsInstance(metadata)

load(path)
    Load the ML instance from the input path.

loadMetadata(path, sc[, expectedClassName])
    Load metadata saved using DefaultParamsWriter.saveMetadata().

loadParamsInstance(path, sc)
    Load a Params instance from the given path, and return it.

session(sparkSession)
    Sets the Spark Session to use for saving/loading.
Attributes

sc
    Returns the underlying SparkContext.

sparkSession
    Returns the user-specified Spark Session or the default.
Methods Documentation

static getAndSetParams(instance, metadata[, skipParams])
    Extract Params from metadata, and set them in the instance.

static isPythonParamsInstance(metadata)

load(path)
    Load the ML instance from the input path.

static loadMetadata(path, sc[, expectedClassName])
    Load metadata saved using DefaultParamsWriter.saveMetadata().

    Parameters
        path : str
        sc : pyspark.SparkContext
        expectedClassName : str, optional
            If non-empty, this is checked against the loaded metadata.
static loadParamsInstance(path, sc)
    Load a Params instance from the given path, and return it. This assumes the instance inherits from MLReadable.

session(sparkSession)
    Sets the Spark Session to use for saving/loading.
Attributes Documentation

sc
    Returns the underlying SparkContext.

sparkSession
    Returns the user-specified Spark Session or the default.