GoogleApi.Dataproc.V1.Model.SparkSqlJob (google_api_dataproc v0.48.0)

A Dataproc job for running Apache Spark SQL (http://spark.apache.org/sql/) queries.

Attributes

  • jarFileUris (type: list(String.t), default: nil) - Optional. HCFS URIs of jar files to be added to the Spark CLASSPATH.
  • loggingConfig (type: GoogleApi.Dataproc.V1.Model.LoggingConfig.t, default: nil) - Optional. The runtime log config for job execution.
  • properties (type: map(), default: nil) - Optional. A mapping of property names to values, used to configure Spark SQL's SparkConf. Properties that conflict with values set by the Dataproc API may be overwritten.
  • queryFileUri (type: String.t, default: nil) - The HCFS URI of the script that contains SQL queries.
  • queryList (type: GoogleApi.Dataproc.V1.Model.QueryList.t, default: nil) - A list of queries.
  • scriptVariables (type: map(), default: nil) - Optional. Mapping of query variable names to values (equivalent to the Spark SQL command: SET name="value";).
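As a sketch of how these attributes fit together, the struct below builds a job from an inline query list with a substituted script variable (the bucket name, query text, and property values are illustrative, not from the source):

```elixir
alias GoogleApi.Dataproc.V1.Model.{SparkSqlJob, QueryList}

# Hypothetical example values; scriptVariables fills in ${level}
# the same way the Spark SQL command `SET level="ERROR";` would.
job = %SparkSqlJob{
  queryList: %QueryList{queries: ["SELECT * FROM logs WHERE level = '${level}'"]},
  scriptVariables: %{"level" => "ERROR"},
  properties: %{"spark.executor.memory" => "4g"}
}
```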

Summary

Functions

Unwrap a decoded JSON object into its complex fields.

Types

Specs

t() :: %GoogleApi.Dataproc.V1.Model.SparkSqlJob{
  jarFileUris: [String.t()] | nil,
  loggingConfig: GoogleApi.Dataproc.V1.Model.LoggingConfig.t() | nil,
  properties: map() | nil,
  queryFileUri: String.t() | nil,
  queryList: GoogleApi.Dataproc.V1.Model.QueryList.t() | nil,
  scriptVariables: map() | nil
}

Functions

Specs

decode(struct(), keyword()) :: struct()

Unwrap a decoded JSON object into its complex fields.
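In practice, `decode/2` is invoked through the Poison decoder that these generated clients use, so nested fields such as `queryList` come back as their model structs rather than plain maps. A minimal sketch, assuming Poison is available and using a hypothetical bucket path:

```elixir
# Illustrative JSON payload; the URI is a made-up example.
json = ~s({"queryFileUri": "gs://my-bucket/query.sql"})

job = Poison.decode!(json, as: %GoogleApi.Dataproc.V1.Model.SparkSqlJob{})
# job.queryFileUri is now "gs://my-bucket/query.sql"
```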