PySpark
Class Hierarchy
pyspark.storagelevel.StorageLevel: Flags for controlling the storage of an RDD.
object: The most base type.
    pyspark.accumulators.Accumulator: A shared variable that can be accumulated, i.e., has a commutative and associative "add" operation.
    pyspark.accumulators.AccumulatorParam: Helper object that defines how to accumulate values of a given type.
    pyspark.broadcast.Broadcast
    pyspark.rdd.RDD: A Resilient Distributed Dataset (RDD), the basic abstraction in Spark.
    pyspark.context.SparkContext: Main entry point for Spark functionality.
    pyspark.files.SparkFiles: Resolves paths to files added through SparkContext.addFile().
    pyspark.statcounter.StatCounter
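
The Accumulator entry above requires a commutative and associative "add" operation, which AccumulatorParam supplies through its zero() and addInPlace() methods. A minimal sketch of that contract in plain Python (no Spark installation assumed; VectorAccumulatorParam is a hypothetical example type, not part of PySpark):

```python
class VectorAccumulatorParam:
    """Hypothetical AccumulatorParam for accumulating lists element-wise."""

    def zero(self, value):
        # Return a "zero value" with the same shape as the given value.
        return [0.0] * len(value)

    def addInPlace(self, val1, val2):
        # Combine two accumulated values; this operation must be
        # commutative and associative so partial results from different
        # tasks can be merged in any order.
        return [a + b for a, b in zip(val1, val2)]

param = VectorAccumulatorParam()
acc = param.zero([1.0, 2.0, 3.0])             # [0.0, 0.0, 0.0]
acc = param.addInPlace(acc, [1.0, 2.0, 3.0])
acc = param.addInPlace(acc, [0.5, 0.5, 0.5])
print(acc)                                    # [1.5, 2.5, 3.5]
```

With a live SparkContext, such a class would be passed to sc.accumulator(initial_value, VectorAccumulatorParam()) so tasks can add to the shared variable.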
Generated by Epydoc 3.0.1 on Tue Dec 10 15:25:55 2013 (http://epydoc.sourceforge.net)