Packages

package spark

Available imports for the Spark and RDD contexts.

Linear Supertypes
AnyRef, Any

Type Members

  1. case class Keyspace(bucket: Option[String] = None, scope: Option[String] = None, collection: Option[String] = None) extends Product with Serializable

    The keyspace is a coordinate made up of bucket, scope and collection.

    Note that not all APIs need all three values to be set. Depending on the context in which the keyspace is used, or on the type of service (e.g. KV vs. query), providing only a subset can be sufficient. See the individual semantics of each operation if in doubt.

    bucket: the bucket name, if present.

    scope: the scope name, if present.

    collection: the collection name, if present.
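
    For illustration, a minimal sketch of constructing keyspaces from the signature above. The package path (com.couchbase.spark) and the sample names (travel-sample, inventory, airline) are assumptions, not part of this API listing:

        import com.couchbase.spark.Keyspace  // assumed package path

        // Fully qualified coordinate, e.g. for KV operations:
        val full = Keyspace(
          bucket = Some("travel-sample"),   // assumed sample bucket
          scope = Some("inventory"),        // assumed sample scope
          collection = Some("airline")      // assumed sample collection
        )

        // A query-level operation might only need the bucket:
        val bucketOnly = Keyspace(bucket = Some("travel-sample"))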

  2. class RDDFunctions[T] extends Serializable

    Functions which can be performed on an RDD if the evidence matches.

    T: the generic RDD type to operate on.
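
    The conversion into RDDFunctions is supplied by the implicit toRDDFunctions listed under Value Members. A minimal sketch of it being applied, assuming the com.couchbase.spark package path and a local Spark setup:

        import org.apache.spark.{SparkConf, SparkContext}
        import org.apache.spark.rdd.RDD
        import com.couchbase.spark._  // assumed package path; brings the implicits into scope

        val sc = new SparkContext(new SparkConf().setMaster("local[*]").setAppName("rdd-functions"))
        val rdd: RDD[String] = sc.parallelize(Seq("id-1", "id-2"))

        // The implicit conversion applies wherever an RDDFunctions[String] is expected:
        val withFunctions: RDDFunctions[String] = rdd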

  3. class SparkContextFunctions extends Serializable

    Brings RDD-related functions into the SparkContext when loaded as an import.
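
    A minimal sketch of the SparkContext pickup, again assuming the com.couchbase.spark package path; the connector methods themselves vary by version, so only the implicit conversion is shown:

        import org.apache.spark.{SparkConf, SparkContext}
        import com.couchbase.spark._  // assumed package path

        val sc = new SparkContext(new SparkConf().setMaster("local[*]").setAppName("sc-functions"))

        // Via the implicit toSparkContextFunctions, the SparkContext can stand in
        // wherever SparkContextFunctions is expected:
        val fns: SparkContextFunctions = sc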

Value Members

  1. implicit def toRDDFunctions[T](rdd: RDD[T]): RDDFunctions[T]
  2. implicit def toSparkContextFunctions(sc: SparkContext): SparkContextFunctions

    Both conversions are demonstrated in the sketch after this list.
  3. object DefaultConstants

    Default constants used across the whole Spark connector project.
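
Because the two extension classes above are wired up through these implicit conversions, the conversions can also be invoked as plain functions. A minimal sketch under the same assumed com.couchbase.spark package path:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.rdd.RDD
    import com.couchbase.spark._  // assumed package path

    val sc = new SparkContext(new SparkConf().setMaster("local[*]").setAppName("implicits"))
    val rdd: RDD[Int] = sc.parallelize(1 to 10)

    // Explicit calls, equivalent to letting the compiler apply the conversions:
    val rddFns: RDDFunctions[Int] = toRDDFunctions(rdd)
    val scFns: SparkContextFunctions = toSparkContextFunctions(sc)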