package spark
Available imports for the spark and RDD contexts.
Type Members
case class Keyspace(bucket: Option[String] = None, scope: Option[String] = None, collection: Option[String] = None) extends Product with Serializable
The keyspace reflects a triple/coordinate of bucket, scope and collection.
Note that not all APIs need all three values to be set. Depending on the context in which the keyspace is used, or on the type of service (i.e. kv vs. query), it might be sufficient to provide only a subset. See the individual semantics of each operation if in doubt.
- bucket: the bucket name, if present.
- scope: the scope name, if present.
- collection: the collection name, if present.
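For illustration, a minimal sketch of constructing Keyspace coordinates at different levels of precision. Only the constructor shown above is used; the import path com.couchbase.spark is an assumption:

```scala
import com.couchbase.spark.Keyspace // package path assumed

// Fully qualified coordinate: bucket, scope and collection all set.
val full = Keyspace(
  bucket = Some("travel-sample"),
  scope = Some("inventory"),
  collection = Some("airline")
)

// Bucket-only coordinate, enough for APIs that operate at bucket level.
val bucketOnly = Keyspace(bucket = Some("travel-sample"))

// All parameters default to None, so an empty keyspace is also constructible.
val empty = Keyspace()
```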
class RDDFunctions[T] extends Serializable
Functions which can be performed on an RDD if the matching implicit evidence is in scope (see the sketch below).
- T: the generic RDD type to operate on.
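RDDFunctions follows Spark's common enrich-my-library pattern: a wrapper class plus an implicit conversion. Below is a self-contained sketch of that pattern; MyRDDFunctions and saveToKeyValueStore are hypothetical names used for illustration, not the connector's API:

```scala
import scala.language.implicitConversions
import org.apache.spark.rdd.RDD

// A wrapper class exposing extra methods on top of a plain RDD.
// `saveToKeyValueStore` is hypothetical and only illustrates the shape.
class MyRDDFunctions[T](rdd: RDD[T]) extends Serializable {
  def saveToKeyValueStore(): Unit =
    rdd.foreachPartition(_.foreach(record => println(s"persisting $record")))
}

object MyImplicits {
  // The implicit conversion makes the extra method callable on any RDD[T].
  implicit def toMyRDDFunctions[T](rdd: RDD[T]): MyRDDFunctions[T] =
    new MyRDDFunctions(rdd)
}
```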
class SparkContextFunctions extends Serializable
Brings RDD-related functions into the SparkContext when loaded as an import.
Value Members
- implicit def toRDDFunctions[T](rdd: RDD[T]): RDDFunctions[T]
- implicit def toSparkContextFunctions(sc: SparkContext): SparkContextFunctions
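A wildcard import of the package members activates both conversions, so the enriched methods become available directly on SparkContext and RDD instances. A sketch assuming the package path com.couchbase.spark; the enriched method name couchbaseGet is a hypothetical placeholder:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import com.couchbase.spark._ // package path assumed; brings both conversions into scope

object Example extends App {
  val conf = new SparkConf().setMaster("local[*]").setAppName("keyspace-example")
  val sc = new SparkContext(conf)

  // With toSparkContextFunctions in scope, sc gains the connector's extensions;
  // `couchbaseGet` stands in for such an enriched method.
  // val docs = sc.couchbaseGet(Seq("airline_10"))

  // With toRDDFunctions in scope, any RDD gains the connector's RDD helpers.
  val ids = sc.parallelize(Seq("airline_10", "airline_11"))
  // ids.couchbaseGet() // hypothetical enriched call on an RDD

  sc.stop()
}
```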
object DefaultConstants
Default constants used across the whole spark connector project.
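As a purely hypothetical sketch (this page does not list the object's members), a constants object of this kind usually bundles shared defaults as stable vals:

```scala
// Hypothetical illustration only: the actual members of DefaultConstants
// are not shown on this page.
object HypotheticalDefaultConstants {
  val DefaultBucketName: String = "default"
  val DefaultFormat: String = "json"
}
```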