GoogleApi.Dataproc.V1.Model.PySparkBatch (google_api_dataproc v0.54.0)

A configuration for running an Apache PySpark (https://spark.apache.org/docs/latest/api/python/getting_started/quickstart.html) batch workload.

Attributes

  • archiveUris (type: list(String.t), default: nil) - Optional. HCFS URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
  • args (type: list(String.t), default: nil) - Optional. The arguments to pass to the driver. Do not include arguments that can be set as batch properties, such as --conf, since a collision can occur that causes an incorrect batch submission.
  • fileUris (type: list(String.t), default: nil) - Optional. HCFS URIs of files to be placed in the working directory of each executor.
  • jarFileUris (type: list(String.t), default: nil) - Optional. HCFS URIs of jar files to add to the classpath of the Spark driver and tasks.
  • mainPythonFileUri (type: String.t, default: nil) - Required. The HCFS URI of the main Python file to use as the Spark driver. Must be a .py file.
  • pythonFileUris (type: list(String.t), default: nil) - Optional. HCFS file URIs of Python files to pass to the PySpark framework. Supported file types: .py, .egg, and .zip.
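As a minimal sketch of how these attributes fit together (assuming the google_api_dataproc package is a dependency; the bucket paths below are hypothetical examples, not real resources):

```elixir
# Build a PySparkBatch struct for a batch workload. mainPythonFileUri is the
# only required field; the others default to nil when omitted.
batch = %GoogleApi.Dataproc.V1.Model.PySparkBatch{
  mainPythonFileUri: "gs://my-bucket/jobs/wordcount.py",
  args: ["--input", "gs://my-bucket/data/input.txt"],
  pythonFileUris: ["gs://my-bucket/libs/helpers.py"]
}
```

Note that per the args description above, Spark configuration such as --conf should be set through batch properties rather than passed here.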

Summary

Functions

decode(value, options)

Unwrap a decoded JSON object into its complex fields.

Types

@type t() :: %GoogleApi.Dataproc.V1.Model.PySparkBatch{
  archiveUris: [String.t()] | nil,
  args: [String.t()] | nil,
  fileUris: [String.t()] | nil,
  jarFileUris: [String.t()] | nil,
  mainPythonFileUri: String.t() | nil,
  pythonFileUris: [String.t()] | nil
}

Functions

decode(value, options)

@spec decode(struct(), keyword()) :: struct()

Unwrap a decoded JSON object into its complex fields.
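A hedged sketch of where decode/2 fits: google_api_* model modules are typically decoded from JSON with Poison, after which decode/2 unwraps any nested model fields (this example assumes the google_api_dataproc and poison packages are available; the URI is illustrative):

```elixir
# Decode a JSON payload into the struct, then unwrap complex fields.
json = ~s({"mainPythonFileUri": "gs://my-bucket/jobs/wordcount.py", "args": ["--verbose"]})

batch =
  json
  |> Poison.decode!(as: %GoogleApi.Dataproc.V1.Model.PySparkBatch{})
  |> GoogleApi.Dataproc.V1.Model.PySparkBatch.decode([])
```

For PySparkBatch all fields are scalars or lists of strings, so decode/2 has no nested models to unwrap and returns the struct unchanged.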