google_api_dataproc v0.6.0 GoogleApi.Dataproc.V1.Model.OrderedJob

A job executed by the workflow.

Attributes

  • hadoopJob (HadoopJob): Job is a Hadoop job. Defaults to: null.
  • hiveJob (HiveJob): Job is a Hive job. Defaults to: null.
  • labels (%{optional(String.t) => String.t}): Optional. The labels to associate with this job. Label keys must be between 1 and 63 characters long and must conform to the regular expression `\p{Ll}\p{Lo}{0,62}`. Label values must be between 1 and 63 characters long and must conform to the regular expression `[\p{Ll}\p{Lo}\p{N}_-]{0,63}`. No more than 32 labels can be associated with a given job. Defaults to: null.
  • pigJob (PigJob): Job is a Pig job. Defaults to: null.
  • prerequisiteStepIds ([String.t]): Optional. The optional list of prerequisite job step_ids. If not specified, the job will start at the beginning of the workflow. Defaults to: null.
  • pysparkJob (PySparkJob): Job is a PySpark job. Defaults to: null.
  • scheduling (JobScheduling): Optional. Job scheduling configuration. Defaults to: null.
  • sparkJob (SparkJob): Job is a Spark job. Defaults to: null.
  • sparkSqlJob (SparkSqlJob): Job is a SparkSql job. Defaults to: null.
  • stepId (String.t): Required. The step id. The id must be unique among all jobs within the template. The step id is used as a prefix for the job id, as the job's goog-dataproc-workflow-step-id label, and in the prerequisiteStepIds field of other steps. The id must contain only letters (a-z, A-Z), numbers (0-9), underscores (_), and hyphens (-), cannot begin or end with an underscore or hyphen, and must be between 3 and 50 characters long. Defaults to: null.
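As a sketch of how these attributes fit together, an OrderedJob for a Spark step might be built like this. The module and field names follow the attribute list above; the bucket path, step ids, and label are hypothetical placeholders, and SparkJob's mainJarFileUri field is assumed to mirror the REST API's main_jar_file_uri:

```elixir
# Hypothetical workflow step; field names follow the attribute list above.
spark_step = %GoogleApi.Dataproc.V1.Model.OrderedJob{
  stepId: "compute-stats",              # required; 3-50 chars of letters, digits, _ and -
  prerequisiteStepIds: ["ingest-data"], # this step runs only after "ingest-data" completes
  labels: %{"team" => "analytics"},     # lowercase key/value per the regexes above
  sparkJob: %GoogleApi.Dataproc.V1.Model.SparkJob{
    mainJarFileUri: "gs://my-bucket/jobs/stats.jar"  # placeholder artifact path
  }
}
```

A step with no prerequisiteStepIds would instead start at the beginning of the workflow.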

Summary

Functions

Unwrap a decoded JSON object into its complex fields.


Functions

decode(value, options)
decode(struct(), keyword()) :: struct()

Unwrap a decoded JSON object into its complex fields.
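A minimal sketch of how decode/2 is typically reached in this generation of the library, assuming the Poison-based deserialization these generated clients use; the JSON string is illustrative:

```elixir
json = ~s({"stepId": "compute-stats", "prerequisiteStepIds": ["ingest-data"]})

# Poison.decode!/2 with :as builds the OrderedJob struct; decode/2 is then
# invoked through the Poison.Decoder protocol to unwrap nested model fields
# (e.g. sparkJob) into their own structs rather than leaving them as plain maps.
job = Poison.decode!(json, as: %GoogleApi.Dataproc.V1.Model.OrderedJob{})
```

You would not normally call decode/2 directly; it is a hook for the deserializer.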