com.couchbase.spark.query

QueryTableProvider

class QueryTableProvider extends TableProvider with Logging with DataSourceRegister with CreatableRelationProvider

Linear Supertypes
CreatableRelationProvider, DataSourceRegister, Logging, TableProvider, AnyRef, Any

Instance Constructors

  1. new QueryTableProvider()

Value Members

  1. def createRelation(ctx: SQLContext, mode: SaveMode, properties: Map[String, String], data: DataFrame): BaseRelation
    Definition Classes
    QueryTableProvider → CreatableRelationProvider
  2. def getTable(schema: StructType, partitioning: Array[Transform], properties: Map[String, String]): Table

    Returns the "Table", either with an inferred schema or a user-provided schema.

    schema

    the schema, either inferred or provided by the user.

    partitioning

    partitioning information.

    properties

    the properties for customization.

    returns

    the table instance which performs the actual work inside it.

    Definition Classes
    QueryTableProvider → TableProvider
  3. def inferPartitioning(arg0: CaseInsensitiveStringMap): Array[Transform]
    Definition Classes
    TableProvider
  4. def inferSchema(options: CaseInsensitiveStringMap): StructType

    InferSchema is always called if the user does not pass in an explicit schema.

    options

    the options provided from the user.

    returns

    the inferred schema, if possible.

    Definition Classes
    QueryTableProvider → TableProvider
  5. def isWrite: Boolean

    This is a hack: schema inference is invoked even from the DataFrameWriter, although any schema is accepted on a write.

    So we inspect the call stack to determine where we are coming from, which lets us bail out early, since the schema does not matter at all for a write operation.

    returns

    true if we are in a write operation (detected via the stack-inspection hack).

  6. def readConfig(properties: Map[String, String]): QueryReadConfig
  7. def shortName(): String
    Definition Classes
    QueryTableProvider → DataSourceRegister
  8. def supportsExternalMetadata(): Boolean

    We allow a user passing in a custom schema.

    Definition Classes
    QueryTableProvider → TableProvider
  9. def writeConfig(properties: Map[String, String], conf: CouchbaseConfig): QueryWriteConfig
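The members above come together when the connector is used through the DataFrame API. A hedged sketch of the flow (the format alias `couchbase.query` and the option-free `load()`/`save()` calls are assumptions for illustration; the actual alias is whatever `shortName()` returns, and real use requires Couchbase connection configuration):

```scala
import org.apache.spark.sql.{SaveMode, SparkSession}

// Sketch: how Spark exercises QueryTableProvider. All names below that
// are not on this page (the app name, the format alias) are assumptions.
val spark = SparkSession.builder().appName("query-example").getOrCreate()

// Read path: no explicit schema is passed, so Spark calls
// inferSchema(options), then getTable(schema, partitioning, properties)
// to obtain the Table that performs the actual scan.
val df = spark.read
  .format("couchbase.query")
  .load()

// Write path: createRelation(ctx, mode, properties, data) handles the
// save; isWrite lets the provider skip schema inference here, since the
// schema does not matter for a write.
df.write
  .format("couchbase.query")
  .mode(SaveMode.Overwrite)
  .save()
```

Because `supportsExternalMetadata()` returns true, a user-provided schema (via `spark.read.schema(...)`) bypasses `inferSchema` and is handed to `getTable` directly.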
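The stack-inspection hack behind `isWrite` can be sketched as follows (an assumption about the implementation, not the connector's actual code; the class-name substring checked is likewise an assumption):

```scala
// Sketch (assumption): detect a write operation by scanning the current
// call stack for Spark's DataFrameWriter. If any frame comes from that
// class, we are on a write path and schema inference can be skipped.
def isWrite: Boolean =
  Thread.currentThread.getStackTrace
    .exists(_.getClassName.contains("DataFrameWriter"))
```

Called from anywhere outside a `DataFrameWriter` invocation, no frame matches and the method returns false.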