google_api_dataproc v0.6.0 GoogleApi.Dataproc.V1.Model.PySparkJob
A Cloud Dataproc job for running Apache PySpark (https://spark.apache.org/docs/0.9.0/python-programming-guide.html) applications on YARN.
Attributes
- archiveUris ([String.t]): Optional. HCFS URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip. Defaults to: null.
- args ([String.t]): Optional. The arguments to pass to the driver. Do not include arguments, such as --conf, that can be set as job properties, since a collision may occur that causes an incorrect job submission. Defaults to: null.
- fileUris ([String.t]): Optional. HCFS URIs of files to be copied to the working directory of Python drivers and distributed tasks. Useful for naively parallel tasks. Defaults to: null.
- jarFileUris ([String.t]): Optional. HCFS URIs of jar files to add to the CLASSPATHs of the Python driver and tasks. Defaults to: null.
- loggingConfig (LoggingConfig): Optional. The runtime log config for job execution. Defaults to: null.
- mainPythonFileUri (String.t): Required. The HCFS URI of the main Python file to use as the driver. Must be a .py file. Defaults to: null.
- properties (%{optional(String.t) => String.t}): Optional. A mapping of property names to values, used to configure PySpark. Properties that conflict with values set by the Cloud Dataproc API may be overwritten. Can include properties set in /etc/spark/conf/spark-defaults.conf and classes in user code. Defaults to: null.
- pythonFileUris ([String.t]): Optional. HCFS file URIs of Python files to pass to the PySpark framework. Supported file types: .py, .egg, and .zip. Defaults to: null.
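As a minimal sketch, the struct can be built directly with these fields; the bucket paths, file names, and property values below are hypothetical placeholders:

```elixir
# Hypothetical example: a PySpark job struct with the required driver file
# and a few optional fields. All gs:// URIs here are made up.
job = %GoogleApi.Dataproc.V1.Model.PySparkJob{
  # Required: HCFS URI of the main Python driver; must be a .py file.
  mainPythonFileUri: "gs://my-bucket/jobs/word_count.py",
  # Optional: arguments passed to the driver.
  args: ["--input", "gs://my-bucket/data/input.txt"],
  # Optional: extra Python files distributed to the PySpark framework.
  pythonFileUris: ["gs://my-bucket/libs/helpers.py"],
  # Optional: Spark properties; conflicting values may be overwritten by the API.
  properties: %{"spark.executor.memory" => "4g"}
}
```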
Summary
Functions
decode(value, options)
Unwrap a decoded JSON object into its complex fields.
Functions
decode(value, options)
Unwrap a decoded JSON object into its complex fields.
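For context, a hedged usage sketch, assuming Poison as the JSON codec (the library these generated models hook into via the Poison.Decoder protocol); the JSON payload is invented:

```elixir
# Made-up payload containing a nested complex field.
json = ~s({"mainPythonFileUri": "gs://my-bucket/jobs/word_count.py",
           "loggingConfig": {"driverLogLevels": {"root": "INFO"}}})

# Poison decodes the JSON into the struct; decode/2 is invoked through the
# Poison.Decoder protocol to unwrap complex fields, e.g. turning the
# "loggingConfig" map into a %GoogleApi.Dataproc.V1.Model.LoggingConfig{}.
job = Poison.decode!(json, as: %GoogleApi.Dataproc.V1.Model.PySparkJob{})
```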