explain {SparkR}	R Documentation

Explain

Description

Print the logical and physical Catalyst plans to the console for debugging.

Usage

explain(x, ...)

## S4 method for signature 'SparkDataFrame'
explain(x, extended = FALSE)

## S4 method for signature 'StreamingQuery'
explain(x, extended = FALSE)

Arguments

x

a SparkDataFrame or a StreamingQuery.

...

further arguments to be passed to or from other methods.

extended

Logical. If TRUE, prints the parsed, analyzed, and optimized logical plans in addition to the physical plan; if FALSE (the default), prints only the physical plan.
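
A minimal sketch of the two modes (not runnable outside a Spark environment; assumes a running Spark session, and uses the built-in faithful dataset only for illustration):

```r
## Not run:
library(SparkR)
sparkR.session()

# Create a small SparkDataFrame from a local R data.frame
df <- createDataFrame(faithful)

# Default: prints only the physical plan
explain(df)

# Extended: also prints the parsed, analyzed, and optimized logical plans
explain(df, extended = TRUE)
## End(Not run)
```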

Note

explain since 1.4.0

explain(StreamingQuery) since 2.2.0

See Also

Other SparkDataFrame functions: SparkDataFrame-class, agg(), alias(), arrange(), as.data.frame(), attach,SparkDataFrame-method, broadcast(), cache(), checkpoint(), coalesce(), collect(), colnames(), coltypes(), createOrReplaceTempView(), crossJoin(), cube(), dapplyCollect(), dapply(), describe(), dim(), distinct(), dropDuplicates(), dropna(), drop(), dtypes(), exceptAll(), except(), filter(), first(), gapplyCollect(), gapply(), getNumPartitions(), group_by(), head(), hint(), histogram(), insertInto(), intersectAll(), intersect(), isLocal(), isStreaming(), join(), limit(), localCheckpoint(), merge(), mutate(), ncol(), nrow(), persist(), printSchema(), randomSplit(), rbind(), rename(), repartitionByRange(), repartition(), rollup(), sample(), saveAsTable(), schema(), selectExpr(), select(), showDF(), show(), storageLevel(), str(), subset(), summary(), take(), toJSON(), unionAll(), unionByName(), union(), unpersist(), withColumn(), withWatermark(), with(), write.df(), write.jdbc(), write.json(), write.orc(), write.parquet(), write.stream(), write.text()

Other StreamingQuery methods: awaitTermination(), isActive(), lastProgress(), queryName(), status(), stopQuery()

Examples

## Not run: 
##D sparkR.session()
##D path <- "path/to/file.json"
##D df <- read.json(path)
##D explain(df, TRUE)
## End(Not run)
## Not run:  explain(sq)  # sq is a StreamingQuery
[Package SparkR version 3.2.4 Index]