Java API

In addition to the primary Scala API, the connector provides convenience APIs when accessed from Java.

To use the Java API in Spark, you first need to initialize a JavaSparkContext:

SparkConf conf = new SparkConf()
    .set("", ""); // placeholder: connector settings go here (for example, which bucket to open)

JavaSparkContext sc = new JavaSparkContext(conf);

Since Java does not support Scala's implicit conversions and imports, the connector provides a helper class to achieve the same functionality:

// The Couchbase-enabled Spark context
CouchbaseSparkContext csc = couchbaseContext(sc);

The couchbaseContext helper is a static import. In general, you want to statically import the following:

import static com.couchbase.spark.japi.CouchbaseDocumentRDD.couchbaseDocumentRDD;
import static com.couchbase.spark.japi.CouchbaseSparkContext.couchbaseContext;

Now you can create RDDs through Key/Value, Views, or N1QL:

// Load docs through K/V
List<JsonDocument> docs = csc
    .couchbaseGet(Arrays.asList("airline_10226", "airline_10748"))
    .collect();

// Perform an N1QL query
List<CouchbaseQueryRow> results = csc
    .couchbaseQuery(N1qlQuery.simple("SELECT * FROM `travel-sample` LIMIT 10"))
    .collect();

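Views are queried the same way through the context. The following is a minimal sketch; it assumes the bucket contains an airlines design document with a by_name view (both names are illustrative, substitute your own):

// Query a view (assumes a design document "airlines" with a view "by_name")
List<CouchbaseViewRow> viewRows = csc
    .couchbaseView(ViewQuery.from("airlines", "by_name").limit(10))
    .collect();

As with the K/V and N1QL variants, the call returns an RDD, so collect() materializes the rows on the driver.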

If you want to store documents in Couchbase, wrap the RDD with the couchbaseDocumentRDD helper and call saveToCouchbase:

couchbaseDocumentRDD(
    sc.parallelize(Arrays.asList(JsonDocument.create("doc1", JsonObject.empty())))
).saveToCouchbase();