Google Cloud Native is in preview. Google Cloud Classic is fully supported.

Google Cloud Native v0.32.0 published on Wednesday, Nov 29, 2023 by Pulumi

google-native.datapipelines/v1.getPipeline

Looks up a single pipeline. Returns a “NOT_FOUND” error if no such pipeline exists. Returns a “FORBIDDEN” error if the caller doesn’t have permission to access it.

Using getPipeline

Two invocation forms are available. The direct form accepts plain arguments and, depending on the language, either blocks until the result value is available or returns a Promise-wrapped result. The output form accepts Input-wrapped arguments and returns an Output-wrapped result.

function getPipeline(args: GetPipelineArgs, opts?: InvokeOptions): Promise<GetPipelineResult>
function getPipelineOutput(args: GetPipelineOutputArgs, opts?: InvokeOptions): Output<GetPipelineResult>
def get_pipeline(location: Optional[str] = None,
                 pipeline_id: Optional[str] = None,
                 project: Optional[str] = None,
                 opts: Optional[InvokeOptions] = None) -> GetPipelineResult
def get_pipeline_output(location: Optional[pulumi.Input[str]] = None,
                        pipeline_id: Optional[pulumi.Input[str]] = None,
                        project: Optional[pulumi.Input[str]] = None,
                        opts: Optional[InvokeOptions] = None) -> Output[GetPipelineResult]
func LookupPipeline(ctx *Context, args *LookupPipelineArgs, opts ...InvokeOption) (*LookupPipelineResult, error)
func LookupPipelineOutput(ctx *Context, args *LookupPipelineOutputArgs, opts ...InvokeOption) LookupPipelineResultOutput

> Note: This function is named LookupPipeline in the Go SDK.

public static class GetPipeline 
{
    public static Task<GetPipelineResult> InvokeAsync(GetPipelineArgs args, InvokeOptions? opts = null)
    public static Output<GetPipelineResult> Invoke(GetPipelineInvokeArgs args, InvokeOptions? opts = null)
}
public static CompletableFuture<GetPipelineResult> getPipeline(GetPipelineArgs args, InvokeOptions options)
public static Output<GetPipelineResult> getPipeline(GetPipelineArgs args, InvokeOptions options)
fn::invoke:
  function: google-native:datapipelines/v1:getPipeline
  arguments:
    # arguments dictionary
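As a minimal TypeScript sketch of both forms (the project, region, and pipeline ID values are placeholders, and the module path assumes the @pulumi/google-native Node.js SDK):

import * as google_native from "@pulumi/google-native";

// Output form: accepts Input-wrapped arguments and returns an Output<GetPipelineResult>,
// so it composes directly with other resources and stack exports.
const pipeline = google_native.datapipelines.v1.getPipelineOutput({
    location: "us-central1",     // placeholder region
    pipelineId: "my-pipeline",   // placeholder pipeline ID
    project: "my-project",       // optional; placeholder project ID
});
export const pipelineState = pipeline.state;

// Direct form: plain arguments, Promise-wrapped result.
google_native.datapipelines.v1
    .getPipeline({ location: "us-central1", pipelineId: "my-pipeline" })
    .then(result => console.log(result.displayName));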

The following arguments are supported:

Location This property is required. string
PipelineId This property is required. string
Project string
Location This property is required. string
PipelineId This property is required. string
Project string
location This property is required. String
pipelineId This property is required. String
project String
location This property is required. string
pipelineId This property is required. string
project string
location This property is required. str
pipeline_id This property is required. str
project str
location This property is required. String
pipelineId This property is required. String
project String

getPipeline Result

The following output properties are available:

CreateTime string
Immutable. The timestamp when the pipeline was initially created. Set by the Data Pipelines service.
DisplayName string
The display name of the pipeline. It can contain only letters ([A-Za-z]), numbers ([0-9]), hyphens (-), and underscores (_).
JobCount int
Number of jobs.
LastUpdateTime string
Immutable. The timestamp when the pipeline was last modified. Set by the Data Pipelines service.
Name string
The pipeline name. For example: projects/PROJECT_ID/locations/LOCATION_ID/pipelines/PIPELINE_ID. * PROJECT_ID can contain letters ([A-Za-z]), numbers ([0-9]), hyphens (-), colons (:), and periods (.). For more information, see Identifying projects. * LOCATION_ID is the canonical ID for the pipeline's location. The list of available locations can be obtained by calling google.cloud.location.Locations.ListLocations. Note that the Data Pipelines service is not available in all regions. It depends on Cloud Scheduler, an App Engine application, so it's only available in App Engine regions. * PIPELINE_ID is the ID of the pipeline. Must be unique for the selected project and location.
PipelineSources Dictionary<string, string>
Immutable. The sources of the pipeline (for example, Dataplex). The keys and values are set by the corresponding sources during pipeline creation.
ScheduleInfo Pulumi.GoogleNative.Datapipelines.V1.Outputs.GoogleCloudDatapipelinesV1ScheduleSpecResponse
Internal scheduling information for a pipeline. If this information is provided, periodic jobs will be created per the schedule. If not, users are responsible for creating jobs externally.
SchedulerServiceAccountEmail string
Optional. A service account email to be used with the Cloud Scheduler job. If not specified, the default compute engine service account will be used.
State string
The state of the pipeline. When the pipeline is created, the state is set to 'PIPELINE_STATE_ACTIVE' by default. State changes can be requested by setting the state to stopping, paused, or resuming. State cannot be changed through UpdatePipeline requests.
Type string
The type of the pipeline. This field affects the scheduling of the pipeline and the type of metrics to show for the pipeline.
Workload Pulumi.GoogleNative.Datapipelines.V1.Outputs.GoogleCloudDatapipelinesV1WorkloadResponse
Workload information for creating new jobs.
CreateTime string
Immutable. The timestamp when the pipeline was initially created. Set by the Data Pipelines service.
DisplayName string
The display name of the pipeline. It can contain only letters ([A-Za-z]), numbers ([0-9]), hyphens (-), and underscores (_).
JobCount int
Number of jobs.
LastUpdateTime string
Immutable. The timestamp when the pipeline was last modified. Set by the Data Pipelines service.
Name string
The pipeline name. For example: projects/PROJECT_ID/locations/LOCATION_ID/pipelines/PIPELINE_ID. * PROJECT_ID can contain letters ([A-Za-z]), numbers ([0-9]), hyphens (-), colons (:), and periods (.). For more information, see Identifying projects. * LOCATION_ID is the canonical ID for the pipeline's location. The list of available locations can be obtained by calling google.cloud.location.Locations.ListLocations. Note that the Data Pipelines service is not available in all regions. It depends on Cloud Scheduler, an App Engine application, so it's only available in App Engine regions. * PIPELINE_ID is the ID of the pipeline. Must be unique for the selected project and location.
PipelineSources map[string]string
Immutable. The sources of the pipeline (for example, Dataplex). The keys and values are set by the corresponding sources during pipeline creation.
ScheduleInfo GoogleCloudDatapipelinesV1ScheduleSpecResponse
Internal scheduling information for a pipeline. If this information is provided, periodic jobs will be created per the schedule. If not, users are responsible for creating jobs externally.
SchedulerServiceAccountEmail string
Optional. A service account email to be used with the Cloud Scheduler job. If not specified, the default compute engine service account will be used.
State string
The state of the pipeline. When the pipeline is created, the state is set to 'PIPELINE_STATE_ACTIVE' by default. State changes can be requested by setting the state to stopping, paused, or resuming. State cannot be changed through UpdatePipeline requests.
Type string
The type of the pipeline. This field affects the scheduling of the pipeline and the type of metrics to show for the pipeline.
Workload GoogleCloudDatapipelinesV1WorkloadResponse
Workload information for creating new jobs.
createTime String
Immutable. The timestamp when the pipeline was initially created. Set by the Data Pipelines service.
displayName String
The display name of the pipeline. It can contain only letters ([A-Za-z]), numbers ([0-9]), hyphens (-), and underscores (_).
jobCount Integer
Number of jobs.
lastUpdateTime String
Immutable. The timestamp when the pipeline was last modified. Set by the Data Pipelines service.
name String
The pipeline name. For example: projects/PROJECT_ID/locations/LOCATION_ID/pipelines/PIPELINE_ID. * PROJECT_ID can contain letters ([A-Za-z]), numbers ([0-9]), hyphens (-), colons (:), and periods (.). For more information, see Identifying projects. * LOCATION_ID is the canonical ID for the pipeline's location. The list of available locations can be obtained by calling google.cloud.location.Locations.ListLocations. Note that the Data Pipelines service is not available in all regions. It depends on Cloud Scheduler, an App Engine application, so it's only available in App Engine regions. * PIPELINE_ID is the ID of the pipeline. Must be unique for the selected project and location.
pipelineSources Map<String,String>
Immutable. The sources of the pipeline (for example, Dataplex). The keys and values are set by the corresponding sources during pipeline creation.
scheduleInfo GoogleCloudDatapipelinesV1ScheduleSpecResponse
Internal scheduling information for a pipeline. If this information is provided, periodic jobs will be created per the schedule. If not, users are responsible for creating jobs externally.
schedulerServiceAccountEmail String
Optional. A service account email to be used with the Cloud Scheduler job. If not specified, the default compute engine service account will be used.
state String
The state of the pipeline. When the pipeline is created, the state is set to 'PIPELINE_STATE_ACTIVE' by default. State changes can be requested by setting the state to stopping, paused, or resuming. State cannot be changed through UpdatePipeline requests.
type String
The type of the pipeline. This field affects the scheduling of the pipeline and the type of metrics to show for the pipeline.
workload GoogleCloudDatapipelinesV1WorkloadResponse
Workload information for creating new jobs.
createTime string
Immutable. The timestamp when the pipeline was initially created. Set by the Data Pipelines service.
displayName string
The display name of the pipeline. It can contain only letters ([A-Za-z]), numbers ([0-9]), hyphens (-), and underscores (_).
jobCount number
Number of jobs.
lastUpdateTime string
Immutable. The timestamp when the pipeline was last modified. Set by the Data Pipelines service.
name string
The pipeline name. For example: projects/PROJECT_ID/locations/LOCATION_ID/pipelines/PIPELINE_ID. * PROJECT_ID can contain letters ([A-Za-z]), numbers ([0-9]), hyphens (-), colons (:), and periods (.). For more information, see Identifying projects. * LOCATION_ID is the canonical ID for the pipeline's location. The list of available locations can be obtained by calling google.cloud.location.Locations.ListLocations. Note that the Data Pipelines service is not available in all regions. It depends on Cloud Scheduler, an App Engine application, so it's only available in App Engine regions. * PIPELINE_ID is the ID of the pipeline. Must be unique for the selected project and location.
pipelineSources {[key: string]: string}
Immutable. The sources of the pipeline (for example, Dataplex). The keys and values are set by the corresponding sources during pipeline creation.
scheduleInfo GoogleCloudDatapipelinesV1ScheduleSpecResponse
Internal scheduling information for a pipeline. If this information is provided, periodic jobs will be created per the schedule. If not, users are responsible for creating jobs externally.
schedulerServiceAccountEmail string
Optional. A service account email to be used with the Cloud Scheduler job. If not specified, the default compute engine service account will be used.
state string
The state of the pipeline. When the pipeline is created, the state is set to 'PIPELINE_STATE_ACTIVE' by default. State changes can be requested by setting the state to stopping, paused, or resuming. State cannot be changed through UpdatePipeline requests.
type string
The type of the pipeline. This field affects the scheduling of the pipeline and the type of metrics to show for the pipeline.
workload GoogleCloudDatapipelinesV1WorkloadResponse
Workload information for creating new jobs.
create_time str
Immutable. The timestamp when the pipeline was initially created. Set by the Data Pipelines service.
display_name str
The display name of the pipeline. It can contain only letters ([A-Za-z]), numbers ([0-9]), hyphens (-), and underscores (_).
job_count int
Number of jobs.
last_update_time str
Immutable. The timestamp when the pipeline was last modified. Set by the Data Pipelines service.
name str
The pipeline name. For example: projects/PROJECT_ID/locations/LOCATION_ID/pipelines/PIPELINE_ID. * PROJECT_ID can contain letters ([A-Za-z]), numbers ([0-9]), hyphens (-), colons (:), and periods (.). For more information, see Identifying projects. * LOCATION_ID is the canonical ID for the pipeline's location. The list of available locations can be obtained by calling google.cloud.location.Locations.ListLocations. Note that the Data Pipelines service is not available in all regions. It depends on Cloud Scheduler, an App Engine application, so it's only available in App Engine regions. * PIPELINE_ID is the ID of the pipeline. Must be unique for the selected project and location.
pipeline_sources Mapping[str, str]
Immutable. The sources of the pipeline (for example, Dataplex). The keys and values are set by the corresponding sources during pipeline creation.
schedule_info GoogleCloudDatapipelinesV1ScheduleSpecResponse
Internal scheduling information for a pipeline. If this information is provided, periodic jobs will be created per the schedule. If not, users are responsible for creating jobs externally.
scheduler_service_account_email str
Optional. A service account email to be used with the Cloud Scheduler job. If not specified, the default compute engine service account will be used.
state str
The state of the pipeline. When the pipeline is created, the state is set to 'PIPELINE_STATE_ACTIVE' by default. State changes can be requested by setting the state to stopping, paused, or resuming. State cannot be changed through UpdatePipeline requests.
type str
The type of the pipeline. This field affects the scheduling of the pipeline and the type of metrics to show for the pipeline.
workload GoogleCloudDatapipelinesV1WorkloadResponse
Workload information for creating new jobs.
createTime String
Immutable. The timestamp when the pipeline was initially created. Set by the Data Pipelines service.
displayName String
The display name of the pipeline. It can contain only letters ([A-Za-z]), numbers ([0-9]), hyphens (-), and underscores (_).
jobCount Number
Number of jobs.
lastUpdateTime String
Immutable. The timestamp when the pipeline was last modified. Set by the Data Pipelines service.
name String
The pipeline name. For example: projects/PROJECT_ID/locations/LOCATION_ID/pipelines/PIPELINE_ID. * PROJECT_ID can contain letters ([A-Za-z]), numbers ([0-9]), hyphens (-), colons (:), and periods (.). For more information, see Identifying projects. * LOCATION_ID is the canonical ID for the pipeline's location. The list of available locations can be obtained by calling google.cloud.location.Locations.ListLocations. Note that the Data Pipelines service is not available in all regions. It depends on Cloud Scheduler, an App Engine application, so it's only available in App Engine regions. * PIPELINE_ID is the ID of the pipeline. Must be unique for the selected project and location.
pipelineSources Map<String>
Immutable. The sources of the pipeline (for example, Dataplex). The keys and values are set by the corresponding sources during pipeline creation.
scheduleInfo Property Map
Internal scheduling information for a pipeline. If this information is provided, periodic jobs will be created per the schedule. If not, users are responsible for creating jobs externally.
schedulerServiceAccountEmail String
Optional. A service account email to be used with the Cloud Scheduler job. If not specified, the default compute engine service account will be used.
state String
The state of the pipeline. When the pipeline is created, the state is set to 'PIPELINE_STATE_ACTIVE' by default. State changes can be requested by setting the state to stopping, paused, or resuming. State cannot be changed through UpdatePipeline requests.
type String
The type of the pipeline. This field affects the scheduling of the pipeline and the type of metrics to show for the pipeline.
workload Property Map
Workload information for creating new jobs.
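For example, a short TypeScript sketch that surfaces a few of these properties as stack outputs, using the camelCase names from the TypeScript listing above (the lookup arguments are placeholders):

import * as google_native from "@pulumi/google-native";

const pipeline = google_native.datapipelines.v1.getPipelineOutput({
    location: "us-central1",    // placeholder region
    pipelineId: "my-pipeline",  // placeholder pipeline ID
});

// Each exported value is an Output projected from the lookup result.
export const pipelineName = pipeline.name;
export const pipelineState = pipeline.state;   // e.g. "PIPELINE_STATE_ACTIVE"
export const jobCount = pipeline.jobCount;
export const scheduleInfo = pipeline.scheduleInfo;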

Supporting Types

GoogleCloudDatapipelinesV1FlexTemplateRuntimeEnvironmentResponse

AdditionalExperiments This property is required. List<string>
Additional experiment flags for the job.
AdditionalUserLabels This property is required. Dictionary<string, string>
Additional user labels to be specified for the job. Keys and values must follow the restrictions specified in the labeling restrictions. An object containing a list of key/value pairs. Example: { "name": "wrench", "mass": "1kg", "count": "3" }.
EnableStreamingEngine This property is required. bool
Whether to enable Streaming Engine for the job.
FlexrsGoal This property is required. string
Set FlexRS goal for the job. https://cloud.google.com/dataflow/docs/guides/flexrs
IpConfiguration This property is required. string
Configuration for VM IPs.
KmsKeyName This property is required. string
Name for the Cloud KMS key for the job. Key format is: projects//locations//keyRings//cryptoKeys/
MachineType This property is required. string
The machine type to use for the job. Defaults to the value from the template if not specified.
MaxWorkers This property is required. int
The maximum number of Compute Engine instances to be made available to your pipeline during execution, from 1 to 1000.
Network This property is required. string
Network to which VMs will be assigned. If empty or unspecified, the service will use the network "default".
NumWorkers This property is required. int
The initial number of Compute Engine instances for the job.
ServiceAccountEmail This property is required. string
The email address of the service account to run the job as.
Subnetwork This property is required. string
Subnetwork to which VMs will be assigned, if desired. You can specify a subnetwork using either a complete URL or an abbreviated path. Expected to be of the form "https://www.googleapis.com/compute/v1/projects/HOST_PROJECT_ID/regions/REGION/subnetworks/SUBNETWORK" or "regions/REGION/subnetworks/SUBNETWORK". If the subnetwork is located in a Shared VPC network, you must use the complete URL.
TempLocation This property is required. string
The Cloud Storage path to use for temporary files. Must be a valid Cloud Storage URL, beginning with gs://.
WorkerRegion This property is required. string
The Compute Engine region (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g. "us-west1". Mutually exclusive with worker_zone. If neither worker_region nor worker_zone is specified, defaults to the control plane region.
WorkerZone This property is required. string
The Compute Engine zone (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g. "us-west1-a". Mutually exclusive with worker_region. If neither worker_region nor worker_zone is specified, a zone in the control plane region is chosen based on available capacity. If both worker_zone and zone are set, worker_zone takes precedence.
Zone This property is required. string
The Compute Engine availability zone for launching worker instances to run your pipeline. In the future, worker_zone will take precedence.
AdditionalExperiments This property is required. []string
Additional experiment flags for the job.
AdditionalUserLabels This property is required. map[string]string
Additional user labels to be specified for the job. Keys and values must follow the restrictions specified in the labeling restrictions. An object containing a list of key/value pairs. Example: { "name": "wrench", "mass": "1kg", "count": "3" }.
EnableStreamingEngine This property is required. bool
Whether to enable Streaming Engine for the job.
FlexrsGoal This property is required. string
Set FlexRS goal for the job. https://cloud.google.com/dataflow/docs/guides/flexrs
IpConfiguration This property is required. string
Configuration for VM IPs.
KmsKeyName This property is required. string
Name for the Cloud KMS key for the job. Key format is: projects//locations//keyRings//cryptoKeys/
MachineType This property is required. string
The machine type to use for the job. Defaults to the value from the template if not specified.
MaxWorkers This property is required. int
The maximum number of Compute Engine instances to be made available to your pipeline during execution, from 1 to 1000.
Network This property is required. string
Network to which VMs will be assigned. If empty or unspecified, the service will use the network "default".
NumWorkers This property is required. int
The initial number of Compute Engine instances for the job.
ServiceAccountEmail This property is required. string
The email address of the service account to run the job as.
Subnetwork This property is required. string
Subnetwork to which VMs will be assigned, if desired. You can specify a subnetwork using either a complete URL or an abbreviated path. Expected to be of the form "https://www.googleapis.com/compute/v1/projects/HOST_PROJECT_ID/regions/REGION/subnetworks/SUBNETWORK" or "regions/REGION/subnetworks/SUBNETWORK". If the subnetwork is located in a Shared VPC network, you must use the complete URL.
TempLocation This property is required. string
The Cloud Storage path to use for temporary files. Must be a valid Cloud Storage URL, beginning with gs://.
WorkerRegion This property is required. string
The Compute Engine region (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g. "us-west1". Mutually exclusive with worker_zone. If neither worker_region nor worker_zone is specified, defaults to the control plane region.
WorkerZone This property is required. string
The Compute Engine zone (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g. "us-west1-a". Mutually exclusive with worker_region. If neither worker_region nor worker_zone is specified, a zone in the control plane region is chosen based on available capacity. If both worker_zone and zone are set, worker_zone takes precedence.
Zone This property is required. string
The Compute Engine availability zone for launching worker instances to run your pipeline. In the future, worker_zone will take precedence.
additionalExperiments This property is required. List<String>
Additional experiment flags for the job.
additionalUserLabels This property is required. Map<String,String>
Additional user labels to be specified for the job. Keys and values must follow the restrictions specified in the labeling restrictions. An object containing a list of key/value pairs. Example: { "name": "wrench", "mass": "1kg", "count": "3" }.
enableStreamingEngine This property is required. Boolean
Whether to enable Streaming Engine for the job.
flexrsGoal This property is required. String
Set FlexRS goal for the job. https://cloud.google.com/dataflow/docs/guides/flexrs
ipConfiguration This property is required. String
Configuration for VM IPs.
kmsKeyName This property is required. String
Name for the Cloud KMS key for the job. Key format is: projects//locations//keyRings//cryptoKeys/
machineType This property is required. String
The machine type to use for the job. Defaults to the value from the template if not specified.
maxWorkers This property is required. Integer
The maximum number of Compute Engine instances to be made available to your pipeline during execution, from 1 to 1000.
network This property is required. String
Network to which VMs will be assigned. If empty or unspecified, the service will use the network "default".
numWorkers This property is required. Integer
The initial number of Compute Engine instances for the job.
serviceAccountEmail This property is required. String
The email address of the service account to run the job as.
subnetwork This property is required. String
Subnetwork to which VMs will be assigned, if desired. You can specify a subnetwork using either a complete URL or an abbreviated path. Expected to be of the form "https://www.googleapis.com/compute/v1/projects/HOST_PROJECT_ID/regions/REGION/subnetworks/SUBNETWORK" or "regions/REGION/subnetworks/SUBNETWORK". If the subnetwork is located in a Shared VPC network, you must use the complete URL.
tempLocation This property is required. String
The Cloud Storage path to use for temporary files. Must be a valid Cloud Storage URL, beginning with gs://.
workerRegion This property is required. String
The Compute Engine region (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g. "us-west1". Mutually exclusive with worker_zone. If neither worker_region nor worker_zone is specified, defaults to the control plane region.
workerZone This property is required. String
The Compute Engine zone (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g. "us-west1-a". Mutually exclusive with worker_region. If neither worker_region nor worker_zone is specified, a zone in the control plane region is chosen based on available capacity. If both worker_zone and zone are set, worker_zone takes precedence.
zone This property is required. String
The Compute Engine availability zone for launching worker instances to run your pipeline. In the future, worker_zone will take precedence.
additionalExperiments This property is required. string[]
Additional experiment flags for the job.
additionalUserLabels This property is required. {[key: string]: string}
Additional user labels to be specified for the job. Keys and values must follow the restrictions specified in the labeling restrictions. An object containing a list of key/value pairs. Example: { "name": "wrench", "mass": "1kg", "count": "3" }.
enableStreamingEngine This property is required. boolean
Whether to enable Streaming Engine for the job.
flexrsGoal This property is required. string
Set FlexRS goal for the job. https://cloud.google.com/dataflow/docs/guides/flexrs
ipConfiguration This property is required. string
Configuration for VM IPs.
kmsKeyName This property is required. string
Name for the Cloud KMS key for the job. Key format is: projects//locations//keyRings//cryptoKeys/
machineType This property is required. string
The machine type to use for the job. Defaults to the value from the template if not specified.
maxWorkers This property is required. number
The maximum number of Compute Engine instances to be made available to your pipeline during execution, from 1 to 1000.
network This property is required. string
Network to which VMs will be assigned. If empty or unspecified, the service will use the network "default".
numWorkers This property is required. number
The initial number of Compute Engine instances for the job.
serviceAccountEmail This property is required. string
The email address of the service account to run the job as.
subnetwork This property is required. string
Subnetwork to which VMs will be assigned, if desired. You can specify a subnetwork using either a complete URL or an abbreviated path. Expected to be of the form "https://www.googleapis.com/compute/v1/projects/HOST_PROJECT_ID/regions/REGION/subnetworks/SUBNETWORK" or "regions/REGION/subnetworks/SUBNETWORK". If the subnetwork is located in a Shared VPC network, you must use the complete URL.
tempLocation This property is required. string
The Cloud Storage path to use for temporary files. Must be a valid Cloud Storage URL, beginning with gs://.
workerRegion This property is required. string
The Compute Engine region (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g. "us-west1". Mutually exclusive with worker_zone. If neither worker_region nor worker_zone is specified, defaults to the control plane region.
workerZone This property is required. string
The Compute Engine zone (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g. "us-west1-a". Mutually exclusive with worker_region. If neither worker_region nor worker_zone is specified, a zone in the control plane region is chosen based on available capacity. If both worker_zone and zone are set, worker_zone takes precedence.
zone This property is required. string
The Compute Engine availability zone for launching worker instances to run your pipeline. In the future, worker_zone will take precedence.
additional_experiments This property is required. Sequence[str]
Additional experiment flags for the job.
additional_user_labels This property is required. Mapping[str, str]
Additional user labels to be specified for the job. Keys and values must follow the restrictions specified in the labeling restrictions. An object containing a list of key/value pairs. Example: { "name": "wrench", "mass": "1kg", "count": "3" }.
enable_streaming_engine This property is required. bool
Whether to enable Streaming Engine for the job.
flexrs_goal This property is required. str
Set FlexRS goal for the job. https://cloud.google.com/dataflow/docs/guides/flexrs
ip_configuration This property is required. str
Configuration for VM IPs.
kms_key_name This property is required. str
Name for the Cloud KMS key for the job. Key format is: projects//locations//keyRings//cryptoKeys/
machine_type This property is required. str
The machine type to use for the job. Defaults to the value from the template if not specified.
max_workers This property is required. int
The maximum number of Compute Engine instances to be made available to your pipeline during execution, from 1 to 1000.
network This property is required. str
Network to which VMs will be assigned. If empty or unspecified, the service will use the network "default".
num_workers This property is required. int
The initial number of Compute Engine instances for the job.
service_account_email This property is required. str
The email address of the service account to run the job as.
subnetwork This property is required. str
Subnetwork to which VMs will be assigned, if desired. You can specify a subnetwork using either a complete URL or an abbreviated path. Expected to be of the form "https://www.googleapis.com/compute/v1/projects/HOST_PROJECT_ID/regions/REGION/subnetworks/SUBNETWORK" or "regions/REGION/subnetworks/SUBNETWORK". If the subnetwork is located in a Shared VPC network, you must use the complete URL.
temp_location This property is required. str
The Cloud Storage path to use for temporary files. Must be a valid Cloud Storage URL, beginning with gs://.
worker_region This property is required. str
The Compute Engine region (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g. "us-west1". Mutually exclusive with worker_zone. If neither worker_region nor worker_zone is specified, defaults to the control plane region.
worker_zone This property is required. str
The Compute Engine zone (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g. "us-west1-a". Mutually exclusive with worker_region. If neither worker_region nor worker_zone is specified, a zone in the control plane region is chosen based on available capacity. If both worker_zone and zone are set, worker_zone takes precedence.
zone This property is required. str
The Compute Engine availability zone for launching worker instances to run your pipeline. In the future, worker_zone will take precedence.
additionalExperiments This property is required. List<String>
Additional experiment flags for the job.
additionalUserLabels This property is required. Map<String>
Additional user labels to be specified for the job. Keys and values must follow the restrictions specified in the labeling restrictions. An object containing a list of key/value pairs. Example: { "name": "wrench", "mass": "1kg", "count": "3" }.
enableStreamingEngine This property is required. Boolean
Whether to enable Streaming Engine for the job.
flexrsGoal This property is required. String
Set FlexRS goal for the job. https://cloud.google.com/dataflow/docs/guides/flexrs
ipConfiguration This property is required. String
Configuration for VM IPs.
kmsKeyName This property is required. String
Name for the Cloud KMS key for the job. Key format is: projects//locations//keyRings//cryptoKeys/
machineType This property is required. String
The machine type to use for the job. Defaults to the value from the template if not specified.
maxWorkers This property is required. Number
The maximum number of Compute Engine instances to be made available to your pipeline during execution, from 1 to 1000.
network This property is required. String
Network to which VMs will be assigned. If empty or unspecified, the service will use the network "default".
numWorkers This property is required. Number
The initial number of Compute Engine instances for the job.
serviceAccountEmail This property is required. String
The email address of the service account to run the job as.
subnetwork This property is required. String
Subnetwork to which VMs will be assigned, if desired. You can specify a subnetwork using either a complete URL or an abbreviated path. Expected to be of the form "https://www.googleapis.com/compute/v1/projects/HOST_PROJECT_ID/regions/REGION/subnetworks/SUBNETWORK" or "regions/REGION/subnetworks/SUBNETWORK". If the subnetwork is located in a Shared VPC network, you must use the complete URL.
tempLocation This property is required. String
The Cloud Storage path to use for temporary files. Must be a valid Cloud Storage URL, beginning with gs://.
workerRegion This property is required. String
The Compute Engine region (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g. "us-west1". Mutually exclusive with worker_zone. If neither worker_region nor worker_zone is specified, defaults to the control plane region.
workerZone This property is required. String
The Compute Engine zone (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g. "us-west1-a". Mutually exclusive with worker_region. If neither worker_region nor worker_zone is specified, a zone in the control plane region is chosen based on available capacity. If both worker_zone and zone are set, worker_zone takes precedence.
zone This property is required. String
The Compute Engine availability zone for launching worker instances to run your pipeline. In the future, worker_zone will take precedence.
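These environment fields are only reachable through the pipeline's workload. A hedged TypeScript sketch, assuming the workload wrapper exposes the Flex Template request as dataflowFlexTemplateRequest (that wrapper type is not reproduced in this section; launchParameter, environment, maxWorkers, and tempLocation are documented above):

import * as google_native from "@pulumi/google-native";

const pipeline = google_native.datapipelines.v1.getPipelineOutput({
    location: "us-central1",         // placeholder region
    pipelineId: "my-flex-pipeline",  // placeholder pipeline ID
});

// Assumption: the workload carries the Flex Template request under
// dataflowFlexTemplateRequest; adjust the path if your SDK differs.
const flexEnvironment = pipeline.workload.apply(
    w => w?.dataflowFlexTemplateRequest?.launchParameter?.environment);
export const maxWorkers = flexEnvironment.apply(e => e?.maxWorkers);
export const tempLocation = flexEnvironment.apply(e => e?.tempLocation);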

GoogleCloudDatapipelinesV1LaunchFlexTemplateParameterResponse

ContainerSpecGcsPath This property is required. string
Cloud Storage path to a file with a JSON-serialized ContainerSpec as content.
Environment This property is required. Pulumi.GoogleNative.Datapipelines.V1.Inputs.GoogleCloudDatapipelinesV1FlexTemplateRuntimeEnvironmentResponse
The runtime environment for the Flex Template job.
JobName This property is required. string
The job name to use for the created job. For an update job request, the job name should be the same as the existing running job.
LaunchOptions This property is required. Dictionary<string, string>
Launch options for this Flex Template job. This is a common set of options across languages and templates. This should not be used to pass job parameters.
Parameters This property is required. Dictionary<string, string>
The parameters for the Flex Template. Example: {"num_workers":"5"}
TransformNameMappings This property is required. Dictionary<string, string>
Use this to pass transform name mappings for streaming update jobs. Example: {"oldTransformName":"newTransformName",...}
Update This property is required. bool
Set this to true if you are sending a request to update a running streaming job. When set, the job name should be the same as the running job.
ContainerSpecGcsPath This property is required. string
Cloud Storage path to a file with a JSON-serialized ContainerSpec as content.
Environment This property is required. GoogleCloudDatapipelinesV1FlexTemplateRuntimeEnvironmentResponse
The runtime environment for the Flex Template job.
JobName This property is required. string
The job name to use for the created job. For an update job request, the job name should be the same as the existing running job.
LaunchOptions This property is required. map[string]string
Launch options for this Flex Template job. This is a common set of options across languages and templates. This should not be used to pass job parameters.
Parameters This property is required. map[string]string
The parameters for the Flex Template. Example: {"num_workers":"5"}
TransformNameMappings This property is required. map[string]string
Use this to pass transform name mappings for streaming update jobs. Example: {"oldTransformName":"newTransformName",...}
Update This property is required. bool
Set this to true if you are sending a request to update a running streaming job. When set, the job name should be the same as the running job.
containerSpecGcsPath This property is required. String
Cloud Storage path to a file with a JSON-serialized ContainerSpec as content.
environment This property is required. GoogleCloudDatapipelinesV1FlexTemplateRuntimeEnvironmentResponse
The runtime environment for the Flex Template job.
jobName This property is required. String
The job name to use for the created job. For an update job request, the job name should be the same as the existing running job.
launchOptions This property is required. Map<String,String>
Launch options for this Flex Template job. This is a common set of options across languages and templates. This should not be used to pass job parameters.
parameters This property is required. Map<String,String>
The parameters for the Flex Template. Example: {"num_workers":"5"}
transformNameMappings This property is required. Map<String,String>
Use this to pass transform name mappings for streaming update jobs. Example: {"oldTransformName":"newTransformName",...}
update This property is required. Boolean
Set this to true if you are sending a request to update a running streaming job. When set, the job name should be the same as the running job.
containerSpecGcsPath This property is required. string
Cloud Storage path to a file with a JSON-serialized ContainerSpec as content.
environment This property is required. GoogleCloudDatapipelinesV1FlexTemplateRuntimeEnvironmentResponse
The runtime environment for the Flex Template job.
jobName This property is required. string
The job name to use for the created job. For an update job request, the job name should be the same as the existing running job.
launchOptions This property is required. {[key: string]: string}
Launch options for this Flex Template job. This is a common set of options across languages and templates. This should not be used to pass job parameters.
parameters This property is required. {[key: string]: string}
The parameters for the Flex Template. Example: {"num_workers":"5"}
transformNameMappings This property is required. {[key: string]: string}
Use this to pass transform name mappings for streaming update jobs. Example: {"oldTransformName":"newTransformName",...}
update This property is required. boolean
Set this to true if you are sending a request to update a running streaming job. When set, the job name should be the same as the running job.
container_spec_gcs_path This property is required. str
Cloud Storage path to a file with a JSON-serialized ContainerSpec as content.
environment This property is required. GoogleCloudDatapipelinesV1FlexTemplateRuntimeEnvironmentResponse
The runtime environment for the Flex Template job.
job_name This property is required. str
The job name to use for the created job. For an update job request, the job name should be the same as the existing running job.
launch_options This property is required. Mapping[str, str]
Launch options for this Flex Template job. This is a common set of options across languages and templates. This should not be used to pass job parameters.
parameters This property is required. Mapping[str, str]
The parameters for the Flex Template. Example: {"num_workers":"5"}
transform_name_mappings This property is required. Mapping[str, str]
Use this to pass transform name mappings for streaming update jobs. Example: {"oldTransformName":"newTransformName",...}
update This property is required. bool
Set this to true if you are sending a request to update a running streaming job. When set, the job name should be the same as the running job.
containerSpecGcsPath This property is required. String
Cloud Storage path to a file with a JSON-serialized ContainerSpec as content.
environment This property is required. Property Map
The runtime environment for the Flex Template job.
jobName This property is required. String
The job name to use for the created job. For an update job request, the job name should be the same as the existing running job.
launchOptions This property is required. Map<String>
Launch options for this Flex Template job. This is a common set of options across languages and templates. This should not be used to pass job parameters.
parameters This property is required. Map<String>
The parameters for the Flex Template. Example: {"num_workers":"5"}
transformNameMappings This property is required. Map<String>
Use this to pass transform name mappings for streaming update jobs. Example: {"oldTransformName":"newTransformName",...}
update This property is required. Boolean
Set this to true if you are sending a request to update a running streaming job. When set, the job name should be the same as the running job.

GoogleCloudDatapipelinesV1LaunchFlexTemplateRequestResponse

LaunchParameter This property is required. Pulumi.GoogleNative.Datapipelines.V1.Inputs.GoogleCloudDatapipelinesV1LaunchFlexTemplateParameterResponse
Parameter to launch a job from a Flex Template.
Location This property is required. string
The regional endpoint (https://cloud.google.com/dataflow/docs/concepts/regional-endpoints) to which to direct the request. For example, us-central1, us-west1.
Project This property is required. string
The ID of the Cloud Platform project that the job belongs to.
ValidateOnly This property is required. bool
If true, the request is validated but not actually executed. Defaults to false.
LaunchParameter This property is required. GoogleCloudDatapipelinesV1LaunchFlexTemplateParameterResponse
Parameter to launch a job from a Flex Template.
Location This property is required. string
The regional endpoint (https://cloud.google.com/dataflow/docs/concepts/regional-endpoints) to which to direct the request. For example, us-central1, us-west1.
Project This property is required. string
The ID of the Cloud Platform project that the job belongs to.
ValidateOnly This property is required. bool
If true, the request is validated but not actually executed. Defaults to false.
launchParameter This property is required. GoogleCloudDatapipelinesV1LaunchFlexTemplateParameterResponse
Parameter to launch a job from a Flex Template.
location This property is required. String
The regional endpoint (https://cloud.google.com/dataflow/docs/concepts/regional-endpoints) to which to direct the request. For example, us-central1, us-west1.
project This property is required. String
The ID of the Cloud Platform project that the job belongs to.
validateOnly This property is required. Boolean
If true, the request is validated but not actually executed. Defaults to false.
launchParameter This property is required. GoogleCloudDatapipelinesV1LaunchFlexTemplateParameterResponse
Parameter to launch a job from a Flex Template.
location This property is required. string
The regional endpoint (https://cloud.google.com/dataflow/docs/concepts/regional-endpoints) to which to direct the request. For example, us-central1, us-west1.
project This property is required. string
The ID of the Cloud Platform project that the job belongs to.
validateOnly This property is required. boolean
If true, the request is validated but not actually executed. Defaults to false.
launch_parameter This property is required. GoogleCloudDatapipelinesV1LaunchFlexTemplateParameterResponse
Parameter to launch a job from a Flex Template.
location This property is required. str
The regional endpoint (https://cloud.google.com/dataflow/docs/concepts/regional-endpoints) to which to direct the request. For example, us-central1, us-west1.
project This property is required. str
The ID of the Cloud Platform project that the job belongs to.
validate_only This property is required. bool
If true, the request is validated but not actually executed. Defaults to false.
launchParameter This property is required. Property Map
Parameter to launch a job from a Flex Template.
location This property is required. String
The regional endpoint (https://cloud.google.com/dataflow/docs/concepts/regional-endpoints) to which to direct the request. For example, us-central1, us-west1.
project This property is required. String
The ID of the Cloud Platform project that the job belongs to.
validateOnly This property is required. Boolean
If true, the request is validated but not actually executed. Defaults to false.

GoogleCloudDatapipelinesV1LaunchTemplateParametersResponse

Environment This property is required. Pulumi.GoogleNative.Datapipelines.V1.Inputs.GoogleCloudDatapipelinesV1RuntimeEnvironmentResponse
The runtime environment for the job.
JobName This property is required. string
The job name to use for the created job.
Parameters This property is required. Dictionary<string, string>
The runtime parameters to pass to the job.
TransformNameMapping This property is required. Dictionary<string, string>
Map of transform name prefixes of the job to be replaced to the corresponding name prefixes of the new job. Only applicable when updating a pipeline.
Update This property is required. bool
If set, replace the existing pipeline with the name specified by jobName with this pipeline, preserving state.
Environment This property is required. GoogleCloudDatapipelinesV1RuntimeEnvironmentResponse
The runtime environment for the job.
JobName This property is required. string
The job name to use for the created job.
Parameters This property is required. map[string]string
The runtime parameters to pass to the job.
TransformNameMapping This property is required. map[string]string
Map of transform name prefixes of the job to be replaced to the corresponding name prefixes of the new job. Only applicable when updating a pipeline.
Update This property is required. bool
If set, replace the existing pipeline with the name specified by jobName with this pipeline, preserving state.
environment This property is required. GoogleCloudDatapipelinesV1RuntimeEnvironmentResponse
The runtime environment for the job.
jobName This property is required. String
The job name to use for the created job.
parameters This property is required. Map<String,String>
The runtime parameters to pass to the job.
transformNameMapping This property is required. Map<String,String>
Map of transform name prefixes of the job to be replaced to the corresponding name prefixes of the new job. Only applicable when updating a pipeline.
update This property is required. Boolean
If set, replace the existing pipeline with the name specified by jobName with this pipeline, preserving state.
environment This property is required. GoogleCloudDatapipelinesV1RuntimeEnvironmentResponse
The runtime environment for the job.
jobName This property is required. string
The job name to use for the created job.
parameters This property is required. {[key: string]: string}
The runtime parameters to pass to the job.
transformNameMapping This property is required. {[key: string]: string}
Map of transform name prefixes of the job to be replaced to the corresponding name prefixes of the new job. Only applicable when updating a pipeline.
update This property is required. boolean
If set, replace the existing pipeline with the name specified by jobName with this pipeline, preserving state.
environment This property is required. GoogleCloudDatapipelinesV1RuntimeEnvironmentResponse
The runtime environment for the job.
job_name This property is required. str
The job name to use for the created job.
parameters This property is required. Mapping[str, str]
The runtime parameters to pass to the job.
transform_name_mapping This property is required. Mapping[str, str]
Map of transform name prefixes of the job to be replaced to the corresponding name prefixes of the new job. Only applicable when updating a pipeline.
update This property is required. bool
If set, replace the existing pipeline with the name specified by jobName with this pipeline, preserving state.
environment This property is required. Property Map
The runtime environment for the job.
jobName This property is required. String
The job name to use for the created job.
parameters This property is required. Map<String>
The runtime parameters to pass to the job.
transformNameMapping This property is required. Map<String>
Map of transform name prefixes of the job to be replaced to the corresponding name prefixes of the new job. Only applicable when updating a pipeline.
update This property is required. Boolean
If set, replace the existing pipeline with the name specified by jobName with this pipeline, preserving state.

GoogleCloudDatapipelinesV1LaunchTemplateRequestResponse

GcsPath This property is required. string
A Cloud Storage path to the template from which to create the job. Must be a valid Cloud Storage URL, beginning with 'gs://'.
LaunchParameters This property is required. Pulumi.GoogleNative.Datapipelines.V1.Inputs.GoogleCloudDatapipelinesV1LaunchTemplateParametersResponse
The parameters of the template to launch. This should be part of the body of the POST request.
Location This property is required. string
The regional endpoint (https://cloud.google.com/dataflow/docs/concepts/regional-endpoints) to which to direct the request.
Project This property is required. string
The ID of the Cloud Platform project that the job belongs to.
ValidateOnly This property is required. bool
If true, the request is validated but not actually executed. Defaults to false.
GcsPath This property is required. string
A Cloud Storage path to the template from which to create the job. Must be a valid Cloud Storage URL, beginning with 'gs://'.
LaunchParameters This property is required. GoogleCloudDatapipelinesV1LaunchTemplateParametersResponse
The parameters of the template to launch. This should be part of the body of the POST request.
Location This property is required. string
The regional endpoint (https://cloud.google.com/dataflow/docs/concepts/regional-endpoints) to which to direct the request.
Project This property is required. string
The ID of the Cloud Platform project that the job belongs to.
ValidateOnly This property is required. bool
If true, the request is validated but not actually executed. Defaults to false.
gcsPath This property is required. String
A Cloud Storage path to the template from which to create the job. Must be a valid Cloud Storage URL, beginning with 'gs://'.
launchParameters This property is required. GoogleCloudDatapipelinesV1LaunchTemplateParametersResponse
The parameters of the template to launch. This should be part of the body of the POST request.
location This property is required. String
The regional endpoint (https://cloud.google.com/dataflow/docs/concepts/regional-endpoints) to which to direct the request.
project This property is required. String
The ID of the Cloud Platform project that the job belongs to.
validateOnly This property is required. Boolean
If true, the request is validated but not actually executed. Defaults to false.
gcsPath This property is required. string
A Cloud Storage path to the template from which to create the job. Must be a valid Cloud Storage URL, beginning with 'gs://'.
launchParameters This property is required. GoogleCloudDatapipelinesV1LaunchTemplateParametersResponse
The parameters of the template to launch. This should be part of the body of the POST request.
location This property is required. string
The regional endpoint (https://cloud.google.com/dataflow/docs/concepts/regional-endpoints) to which to direct the request.
project This property is required. string
The ID of the Cloud Platform project that the job belongs to.
validateOnly This property is required. boolean
If true, the request is validated but not actually executed. Defaults to false.
gcs_path This property is required. str
A Cloud Storage path to the template from which to create the job. Must be a valid Cloud Storage URL, beginning with 'gs://'.
launch_parameters This property is required. GoogleCloudDatapipelinesV1LaunchTemplateParametersResponse
The parameters of the template to launch. This should be part of the body of the POST request.
location This property is required. str
The regional endpoint (https://cloud.google.com/dataflow/docs/concepts/regional-endpoints) to which to direct the request.
project This property is required. str
The ID of the Cloud Platform project that the job belongs to.
validate_only This property is required. bool
If true, the request is validated but not actually executed. Defaults to false.
gcsPath This property is required. String
A Cloud Storage path to the template from which to create the job. Must be a valid Cloud Storage URL, beginning with 'gs://'.
launchParameters This property is required. Property Map
The parameters of the template to launch. This should be part of the body of the POST request.
location This property is required. String
The regional endpoint (https://cloud.google.com/dataflow/docs/concepts/regional-endpoints) to which to direct the request.
project This property is required. String
The ID of the Cloud Platform project that the job belongs to.
validateOnly This property is required. Boolean
If true, the request is validated but not actually executed. Defaults to false.
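A pipeline's workload typically carries one of the two request shapes above: the Flex Template request or this classic template request. A hedged TypeScript sketch, assuming the workload wrapper names them dataflowFlexTemplateRequest and dataflowLaunchTemplateRequest (the workload type itself is not reproduced in this section; gcsPath is documented above):

import * as google_native from "@pulumi/google-native";

const pipeline = google_native.datapipelines.v1.getPipelineOutput({
    location: "us-central1",    // placeholder region
    pipelineId: "my-pipeline",  // placeholder pipeline ID
});

// Assumption: field names on the workload wrapper.
export const templateKind = pipeline.workload.apply(w =>
    w?.dataflowFlexTemplateRequest ? "flex-template"
        : w?.dataflowLaunchTemplateRequest ? "classic-template"
        : "unknown");
export const classicTemplatePath = pipeline.workload.apply(
    w => w?.dataflowLaunchTemplateRequest?.gcsPath);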

GoogleCloudDatapipelinesV1RuntimeEnvironmentResponse

AdditionalExperiments This property is required. List<string>
Additional experiment flags for the job.
AdditionalUserLabels This property is required. Dictionary<string, string>
Additional user labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. An object containing a list of key/value pairs. Example: { "name": "wrench", "mass": "1kg", "count": "3" }.
BypassTempDirValidation This property is required. bool
Whether to bypass the safety checks for the job's temporary directory. Use with caution.
EnableStreamingEngine This property is required. bool
Whether to enable Streaming Engine for the job.
IpConfiguration This property is required. string
Configuration for VM IPs.
KmsKeyName This property is required. string
Name for the Cloud KMS key for the job. The key format is: projects//locations//keyRings//cryptoKeys/
MachineType This property is required. string
The machine type to use for the job. Defaults to the value from the template if not specified.
MaxWorkers This property is required. int
The maximum number of Compute Engine instances to be made available to your pipeline during execution, from 1 to 1000.
Network This property is required. string
Network to which VMs will be assigned. If empty or unspecified, the service will use the network "default".
NumWorkers This property is required. int
The initial number of Compute Engine instances for the job.
ServiceAccountEmail This property is required. string
The email address of the service account to run the job as.
Subnetwork This property is required. string
Subnetwork to which VMs will be assigned, if desired. You can specify a subnetwork using either a complete URL or an abbreviated path. Expected to be of the form "https://www.googleapis.com/compute/v1/projects/HOST_PROJECT_ID/regions/REGION/subnetworks/SUBNETWORK" or "regions/REGION/subnetworks/SUBNETWORK". If the subnetwork is located in a Shared VPC network, you must use the complete URL.
TempLocation This property is required. string
The Cloud Storage path to use for temporary files. Must be a valid Cloud Storage URL, beginning with gs://.
WorkerRegion This property is required. string
The Compute Engine region (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g. "us-west1". Mutually exclusive with worker_zone. If neither worker_region nor worker_zone is specified, default to the control plane's region.
WorkerZone This property is required. string
The Compute Engine zone (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g. "us-west1-a". Mutually exclusive with worker_region. If neither worker_region nor worker_zone is specified, a zone in the control plane's region is chosen based on available capacity. If both worker_zone and zone are set, worker_zone takes precedence.
Zone This property is required. string
The Compute Engine availability zone for launching worker instances to run your pipeline. In the future, worker_zone will take precedence.
AdditionalExperiments This property is required. []string
Additional experiment flags for the job.
AdditionalUserLabels This property is required. map[string]string
Additional user labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. An object containing a list of key/value pairs. Example: { "name": "wrench", "mass": "1kg", "count": "3" }.
BypassTempDirValidation This property is required. bool
Whether to bypass the safety checks for the job's temporary directory. Use with caution.
EnableStreamingEngine This property is required. bool
Whether to enable Streaming Engine for the job.
IpConfiguration This property is required. string
Configuration for VM IPs.
KmsKeyName This property is required. string
Name for the Cloud KMS key for the job. The key format is: projects/<project>/locations/<location>/keyRings/<key_ring>/cryptoKeys/<key>
MachineType This property is required. string
The machine type to use for the job. Defaults to the value from the template if not specified.
MaxWorkers This property is required. int
The maximum number of Compute Engine instances to be made available to your pipeline during execution, from 1 to 1000.
Network This property is required. string
Network to which VMs will be assigned. If empty or unspecified, the service will use the network "default".
NumWorkers This property is required. int
The initial number of Compute Engine instances for the job.
ServiceAccountEmail This property is required. string
The email address of the service account to run the job as.
Subnetwork This property is required. string
Subnetwork to which VMs will be assigned, if desired. You can specify a subnetwork using either a complete URL or an abbreviated path. Expected to be of the form "https://www.googleapis.com/compute/v1/projects/HOST_PROJECT_ID/regions/REGION/subnetworks/SUBNETWORK" or "regions/REGION/subnetworks/SUBNETWORK". If the subnetwork is located in a Shared VPC network, you must use the complete URL.
TempLocation This property is required. string
The Cloud Storage path to use for temporary files. Must be a valid Cloud Storage URL, beginning with gs://.
WorkerRegion This property is required. string
The Compute Engine region (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g. "us-west1". Mutually exclusive with worker_zone. If neither worker_region nor worker_zone is specified, the job defaults to the control plane's region.
WorkerZone This property is required. string
The Compute Engine zone (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g. "us-west1-a". Mutually exclusive with worker_region. If neither worker_region nor worker_zone is specified, a zone in the control plane's region is chosen based on available capacity. If both worker_zone and zone are set, worker_zone takes precedence.
Zone This property is required. string
The Compute Engine availability zone for launching worker instances to run your pipeline. In the future, worker_zone will take precedence.
additionalExperiments This property is required. List<String>
Additional experiment flags for the job.
additionalUserLabels This property is required. Map<String,String>
Additional user labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. An object containing a list of key/value pairs. Example: { "name": "wrench", "mass": "1kg", "count": "3" }.
bypassTempDirValidation This property is required. Boolean
Whether to bypass the safety checks for the job's temporary directory. Use with caution.
enableStreamingEngine This property is required. Boolean
Whether to enable Streaming Engine for the job.
ipConfiguration This property is required. String
Configuration for VM IPs.
kmsKeyName This property is required. String
Name for the Cloud KMS key for the job. The key format is: projects/<project>/locations/<location>/keyRings/<key_ring>/cryptoKeys/<key>
machineType This property is required. String
The machine type to use for the job. Defaults to the value from the template if not specified.
maxWorkers This property is required. Integer
The maximum number of Compute Engine instances to be made available to your pipeline during execution, from 1 to 1000.
network This property is required. String
Network to which VMs will be assigned. If empty or unspecified, the service will use the network "default".
numWorkers This property is required. Integer
The initial number of Compute Engine instances for the job.
serviceAccountEmail This property is required. String
The email address of the service account to run the job as.
subnetwork This property is required. String
Subnetwork to which VMs will be assigned, if desired. You can specify a subnetwork using either a complete URL or an abbreviated path. Expected to be of the form "https://www.googleapis.com/compute/v1/projects/HOST_PROJECT_ID/regions/REGION/subnetworks/SUBNETWORK" or "regions/REGION/subnetworks/SUBNETWORK". If the subnetwork is located in a Shared VPC network, you must use the complete URL.
tempLocation This property is required. String
The Cloud Storage path to use for temporary files. Must be a valid Cloud Storage URL, beginning with gs://.
workerRegion This property is required. String
The Compute Engine region (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g. "us-west1". Mutually exclusive with worker_zone. If neither worker_region nor worker_zone is specified, the job defaults to the control plane's region.
workerZone This property is required. String
The Compute Engine zone (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g. "us-west1-a". Mutually exclusive with worker_region. If neither worker_region nor worker_zone is specified, a zone in the control plane's region is chosen based on available capacity. If both worker_zone and zone are set, worker_zone takes precedence.
zone This property is required. String
The Compute Engine availability zone for launching worker instances to run your pipeline. In the future, worker_zone will take precedence.
additionalExperiments This property is required. string[]
Additional experiment flags for the job.
additionalUserLabels This property is required. {[key: string]: string}
Additional user labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. An object containing a list of key/value pairs. Example: { "name": "wrench", "mass": "1kg", "count": "3" }.
bypassTempDirValidation This property is required. boolean
Whether to bypass the safety checks for the job's temporary directory. Use with caution.
enableStreamingEngine This property is required. boolean
Whether to enable Streaming Engine for the job.
ipConfiguration This property is required. string
Configuration for VM IPs.
kmsKeyName This property is required. string
Name for the Cloud KMS key for the job. The key format is: projects/<project>/locations/<location>/keyRings/<key_ring>/cryptoKeys/<key>
machineType This property is required. string
The machine type to use for the job. Defaults to the value from the template if not specified.
maxWorkers This property is required. number
The maximum number of Compute Engine instances to be made available to your pipeline during execution, from 1 to 1000.
network This property is required. string
Network to which VMs will be assigned. If empty or unspecified, the service will use the network "default".
numWorkers This property is required. number
The initial number of Compute Engine instances for the job.
serviceAccountEmail This property is required. string
The email address of the service account to run the job as.
subnetwork This property is required. string
Subnetwork to which VMs will be assigned, if desired. You can specify a subnetwork using either a complete URL or an abbreviated path. Expected to be of the form "https://www.googleapis.com/compute/v1/projects/HOST_PROJECT_ID/regions/REGION/subnetworks/SUBNETWORK" or "regions/REGION/subnetworks/SUBNETWORK". If the subnetwork is located in a Shared VPC network, you must use the complete URL.
tempLocation This property is required. string
The Cloud Storage path to use for temporary files. Must be a valid Cloud Storage URL, beginning with gs://.
workerRegion This property is required. string
The Compute Engine region (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g. "us-west1". Mutually exclusive with worker_zone. If neither worker_region nor worker_zone is specified, the job defaults to the control plane's region.
workerZone This property is required. string
The Compute Engine zone (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g. "us-west1-a". Mutually exclusive with worker_region. If neither worker_region nor worker_zone is specified, a zone in the control plane's region is chosen based on available capacity. If both worker_zone and zone are set, worker_zone takes precedence.
zone This property is required. string
The Compute Engine availability zone for launching worker instances to run your pipeline. In the future, worker_zone will take precedence.
additional_experiments This property is required. Sequence[str]
Additional experiment flags for the job.
additional_user_labels This property is required. Mapping[str, str]
Additional user labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. An object containing a list of key/value pairs. Example: { "name": "wrench", "mass": "1kg", "count": "3" }.
bypass_temp_dir_validation This property is required. bool
Whether to bypass the safety checks for the job's temporary directory. Use with caution.
enable_streaming_engine This property is required. bool
Whether to enable Streaming Engine for the job.
ip_configuration This property is required. str
Configuration for VM IPs.
kms_key_name This property is required. str
Name for the Cloud KMS key for the job. The key format is: projects/<project>/locations/<location>/keyRings/<key_ring>/cryptoKeys/<key>
machine_type This property is required. str
The machine type to use for the job. Defaults to the value from the template if not specified.
max_workers This property is required. int
The maximum number of Compute Engine instances to be made available to your pipeline during execution, from 1 to 1000.
network This property is required. str
Network to which VMs will be assigned. If empty or unspecified, the service will use the network "default".
num_workers This property is required. int
The initial number of Compute Engine instances for the job.
service_account_email This property is required. str
The email address of the service account to run the job as.
subnetwork This property is required. str
Subnetwork to which VMs will be assigned, if desired. You can specify a subnetwork using either a complete URL or an abbreviated path. Expected to be of the form "https://www.googleapis.com/compute/v1/projects/HOST_PROJECT_ID/regions/REGION/subnetworks/SUBNETWORK" or "regions/REGION/subnetworks/SUBNETWORK". If the subnetwork is located in a Shared VPC network, you must use the complete URL.
temp_location This property is required. str
The Cloud Storage path to use for temporary files. Must be a valid Cloud Storage URL, beginning with gs://.
worker_region This property is required. str
The Compute Engine region (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g. "us-west1". Mutually exclusive with worker_zone. If neither worker_region nor worker_zone is specified, the job defaults to the control plane's region.
worker_zone This property is required. str
The Compute Engine zone (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g. "us-west1-a". Mutually exclusive with worker_region. If neither worker_region nor worker_zone is specified, a zone in the control plane's region is chosen based on available capacity. If both worker_zone and zone are set, worker_zone takes precedence.
zone This property is required. str
The Compute Engine availability zone for launching worker instances to run your pipeline. In the future, worker_zone will take precedence.
additionalExperiments This property is required. List<String>
Additional experiment flags for the job.
additionalUserLabels This property is required. Map<String>
Additional user labels to be specified for the job. Keys and values should follow the restrictions specified in the labeling restrictions page. An object containing a list of key/value pairs. Example: { "name": "wrench", "mass": "1kg", "count": "3" }.
bypassTempDirValidation This property is required. Boolean
Whether to bypass the safety checks for the job's temporary directory. Use with caution.
enableStreamingEngine This property is required. Boolean
Whether to enable Streaming Engine for the job.
ipConfiguration This property is required. String
Configuration for VM IPs.
kmsKeyName This property is required. String
Name for the Cloud KMS key for the job. The key format is: projects/<project>/locations/<location>/keyRings/<key_ring>/cryptoKeys/<key>
machineType This property is required. String
The machine type to use for the job. Defaults to the value from the template if not specified.
maxWorkers This property is required. Number
The maximum number of Compute Engine instances to be made available to your pipeline during execution, from 1 to 1000.
network This property is required. String
Network to which VMs will be assigned. If empty or unspecified, the service will use the network "default".
numWorkers This property is required. Number
The initial number of Compute Engine instances for the job.
serviceAccountEmail This property is required. String
The email address of the service account to run the job as.
subnetwork This property is required. String
Subnetwork to which VMs will be assigned, if desired. You can specify a subnetwork using either a complete URL or an abbreviated path. Expected to be of the form "https://www.googleapis.com/compute/v1/projects/HOST_PROJECT_ID/regions/REGION/subnetworks/SUBNETWORK" or "regions/REGION/subnetworks/SUBNETWORK". If the subnetwork is located in a Shared VPC network, you must use the complete URL.
tempLocation This property is required. String
The Cloud Storage path to use for temporary files. Must be a valid Cloud Storage URL, beginning with gs://.
workerRegion This property is required. String
The Compute Engine region (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g. "us-west1". Mutually exclusive with worker_zone. If neither worker_region nor worker_zone is specified, the job defaults to the control plane's region.
workerZone This property is required. String
The Compute Engine zone (https://cloud.google.com/compute/docs/regions-zones/regions-zones) in which worker processing should occur, e.g. "us-west1-a". Mutually exclusive with worker_region. If neither worker_region nor worker_zone is specified, a zone in the control plane's region is chosen based on available capacity. If both worker_zone and zone are set, worker_zone takes precedence.
zone This property is required. String
The Compute Engine availability zone for launching worker instances to run your pipeline. In the future, worker_zone will take precedence.
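As a rough illustration, the sketch below exports a few of these runtime-environment settings for a looked-up pipeline. It assumes the environment is nested under the classic launch request's launchParameters, as in the Data Pipelines API shape; the project, location, and pipeline IDs are placeholders.

import * as google_native from "@pulumi/google-native";

const pipeline = google_native.datapipelines.v1.getPipelineOutput({
    project: "my-project",      // placeholder
    location: "us-central1",    // placeholder
    pipelineId: "my-pipeline",  // placeholder
});

// Surface selected worker settings from the runtime environment, if present.
export const workerSettings = pipeline.workload.apply(w => {
    const env = w?.dataflowLaunchTemplateRequest?.launchParameters?.environment;
    return {
        machineType: env?.machineType,
        maxWorkers: env?.maxWorkers,
        tempLocation: env?.tempLocation,
        workerRegion: env?.workerRegion,
    };
});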

GoogleCloudDatapipelinesV1ScheduleSpecResponse

NextJobTime This property is required. string
When the next Scheduler job is going to run.
Schedule This property is required. string
Unix-cron format of the schedule. This information is retrieved from the linked Cloud Scheduler.
TimeZone This property is required. string
Timezone ID. This matches the timezone IDs used by the Cloud Scheduler API. If empty, UTC time is assumed.
NextJobTime This property is required. string
When the next Scheduler job is going to run.
Schedule This property is required. string
Unix-cron format of the schedule. This information is retrieved from the linked Cloud Scheduler.
TimeZone This property is required. string
Timezone ID. This matches the timezone IDs used by the Cloud Scheduler API. If empty, UTC time is assumed.
nextJobTime This property is required. String
When the next Scheduler job is going to run.
schedule This property is required. String
Unix-cron format of the schedule. This information is retrieved from the linked Cloud Scheduler.
timeZone This property is required. String
Timezone ID. This matches the timezone IDs used by the Cloud Scheduler API. If empty, UTC time is assumed.
nextJobTime This property is required. string
When the next Scheduler job is going to run.
schedule This property is required. string
Unix-cron format of the schedule. This information is retrieved from the linked Cloud Scheduler.
timeZone This property is required. string
Timezone ID. This matches the timezone IDs used by the Cloud Scheduler API. If empty, UTC time is assumed.
next_job_time This property is required. str
When the next Scheduler job is going to run.
schedule This property is required. str
Unix-cron format of the schedule. This information is retrieved from the linked Cloud Scheduler.
time_zone This property is required. str
Timezone ID. This matches the timezone IDs used by the Cloud Scheduler API. If empty, UTC time is assumed.
nextJobTime This property is required. String
When the next Scheduler job is going to run.
schedule This property is required. String
Unix-cron format of the schedule. This information is retrieved from the linked Cloud Scheduler.
timeZone This property is required. String
Timezone ID. This matches the timezone IDs used by the Cloud Scheduler API. If empty, UTC time is assumed.
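A short sketch of how this schedule information might be consumed, assuming it is exposed on the lookup result as scheduleInfo (as on the Pipeline resource). The IDs are placeholders, and the UTC fallback mirrors the behavior described above.

import * as google_native from "@pulumi/google-native";

const pipeline = google_native.datapipelines.v1.getPipelineOutput({
    project: "my-project",      // placeholder
    location: "us-central1",    // placeholder
    pipelineId: "my-pipeline",  // placeholder
});

// Export the cron expression and time zone reported by the linked Cloud Scheduler job.
export const cronSchedule = pipeline.scheduleInfo.apply(s => s?.schedule);
export const scheduleTimeZone = pipeline.scheduleInfo.apply(s => s?.timeZone || "UTC");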

GoogleCloudDatapipelinesV1WorkloadResponse

DataflowFlexTemplateRequest This property is required. Pulumi.GoogleNative.Datapipelines.V1.Inputs.GoogleCloudDatapipelinesV1LaunchFlexTemplateRequestResponse
Template information and additional parameters needed to launch a Dataflow job using the flex launch API.
DataflowLaunchTemplateRequest This property is required. Pulumi.GoogleNative.Datapipelines.V1.Inputs.GoogleCloudDatapipelinesV1LaunchTemplateRequestResponse
Template information and additional parameters needed to launch a Dataflow job using the standard launch API.
DataflowFlexTemplateRequest This property is required. GoogleCloudDatapipelinesV1LaunchFlexTemplateRequestResponse
Template information and additional parameters needed to launch a Dataflow job using the flex launch API.
DataflowLaunchTemplateRequest This property is required. GoogleCloudDatapipelinesV1LaunchTemplateRequestResponse
Template information and additional parameters needed to launch a Dataflow job using the standard launch API.
dataflowFlexTemplateRequest This property is required. GoogleCloudDatapipelinesV1LaunchFlexTemplateRequestResponse
Template information and additional parameters needed to launch a Dataflow job using the flex launch API.
dataflowLaunchTemplateRequest This property is required. GoogleCloudDatapipelinesV1LaunchTemplateRequestResponse
Template information and additional parameters needed to launch a Dataflow job using the standard launch API.
dataflowFlexTemplateRequest This property is required. GoogleCloudDatapipelinesV1LaunchFlexTemplateRequestResponse
Template information and additional parameters needed to launch a Dataflow job using the flex launch API.
dataflowLaunchTemplateRequest This property is required. GoogleCloudDatapipelinesV1LaunchTemplateRequestResponse
Template information and additional parameters needed to launch a Dataflow job using the standard launch API.
dataflow_flex_template_request This property is required. GoogleCloudDatapipelinesV1LaunchFlexTemplateRequestResponse
Template information and additional parameters needed to launch a Dataflow job using the flex launch API.
dataflow_launch_template_request This property is required. GoogleCloudDatapipelinesV1LaunchTemplateRequestResponse
Template information and additional parameters needed to launch a Dataflow job using the standard launch API.
dataflowFlexTemplateRequest This property is required. Property Map
Template information and additional parameters needed to launch a Dataflow job using the flex launch API.
dataflowLaunchTemplateRequest This property is required. Property Map
Template information and additional parameters needed to launch a Dataflow job using the standard launch API.
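In practice a workload is expected to populate only one of these two request shapes, so callers typically branch on which field is set. The sketch below reports which Dataflow launch path a looked-up pipeline targets; the project, location, and pipeline IDs are placeholders.

import * as google_native from "@pulumi/google-native";

const pipeline = google_native.datapipelines.v1.getPipelineOutput({
    project: "my-project",      // placeholder
    location: "us-central1",    // placeholder
    pipelineId: "my-pipeline",  // placeholder
});

// Report which Dataflow launch API the pipeline's workload uses.
export const launchApi = pipeline.workload.apply(w =>
    w?.dataflowFlexTemplateRequest ? "flex template" :
    w?.dataflowLaunchTemplateRequest ? "classic template" : "unknown");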

Package Details

Repository
Google Cloud Native pulumi/pulumi-google-native
License
Apache-2.0
