airbyte.DestinationDatabricks
DestinationDatabricks Resource
Example Usage
package generated_program;

import com.pulumi.Context;
import com.pulumi.Pulumi;
import com.pulumi.core.Output;
import com.pulumi.airbyte.DestinationDatabricks;
import com.pulumi.airbyte.DestinationDatabricksArgs;
import com.pulumi.airbyte.inputs.DestinationDatabricksConfigurationArgs;
import com.pulumi.airbyte.inputs.DestinationDatabricksConfigurationAuthenticationArgs;
import com.pulumi.airbyte.inputs.DestinationDatabricksConfigurationAuthenticationPersonalAccessTokenArgs;
import java.util.List;
import java.util.ArrayList;
import java.util.Map;
import java.io.File;
import java.nio.file.Files;
import java.nio.file.Paths;

public class App {
    public static void main(String[] args) {
        Pulumi.run(App::stack);
    }

    public static void stack(Context ctx) {
        var myDestinationDatabricks = new DestinationDatabricks("myDestinationDatabricks", DestinationDatabricksArgs.builder()
            .configuration(DestinationDatabricksConfigurationArgs.builder()
                .acceptTerms(false)
                .authentication(DestinationDatabricksConfigurationAuthenticationArgs.builder()
                    .personalAccessToken(DestinationDatabricksConfigurationAuthenticationPersonalAccessTokenArgs.builder()
                        .personalAccessToken("...my_personal_access_token...")
                        .build())
                    .build())
                .database("...my_database...")
                .hostname("abc-12345678-wxyz.cloud.databricks.com")
                .httpPath("sql/1.0/warehouses/0000-1111111-abcd90")
                .port("443")
                .purgeStagingData(false)
                .rawSchemaOverride("...my_raw_schema_override...")
                .schema("default")
                .build())
            .definitionId("fb6a88f5-a304-46f5-ab8b-4280a6d91f99")
            .workspaceId("2615758c-c904-459e-9fd6-c8a55cba9327")
            .build());
    }
}
resources:
    myDestinationDatabricks:
        type: airbyte:DestinationDatabricks
        properties:
            configuration:
                accept_terms: false
                authentication:
                    personalAccessToken:
                        personalAccessToken: '...my_personal_access_token...'
                database: '...my_database...'
                hostname: abc-12345678-wxyz.cloud.databricks.com
                http_path: sql/1.0/warehouses/0000-1111111-abcd90
                port: '443'
                purge_staging_data: false
                raw_schema_override: '...my_raw_schema_override...'
                schema: default
            definitionId: fb6a88f5-a304-46f5-ab8b-4280a6d91f99
            workspaceId: 2615758c-c904-459e-9fd6-c8a55cba9327
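The registry does not yet publish TypeScript, Python, C#, or Go versions of this example. The following Python sketch mirrors the Java and YAML programs above; it is hand-assembled from the constructor reference below rather than generated output:

import pulumi_airbyte as airbyte

# Same placeholder values as the Java and YAML examples above.
my_destination_databricks = airbyte.DestinationDatabricks("myDestinationDatabricks",
    configuration={
        "accept_terms": False,
        "authentication": {
            "personal_access_token": {
                "personal_access_token": "...my_personal_access_token...",
            },
        },
        "database": "...my_database...",
        "hostname": "abc-12345678-wxyz.cloud.databricks.com",
        "http_path": "sql/1.0/warehouses/0000-1111111-abcd90",
        "port": "443",
        "purge_staging_data": False,
        "raw_schema_override": "...my_raw_schema_override...",
        "schema": "default",
    },
    definition_id="fb6a88f5-a304-46f5-ab8b-4280a6d91f99",
    workspace_id="2615758c-c904-459e-9fd6-c8a55cba9327")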
Create DestinationDatabricks Resource
Resources are created with functions called constructors. To learn more about declaring and configuring resources, see Resources.
Constructor syntax
new DestinationDatabricks(name: string, args: DestinationDatabricksArgs, opts?: CustomResourceOptions);
@overload
def DestinationDatabricks(resource_name: str,
args: DestinationDatabricksArgs,
opts: Optional[ResourceOptions] = None)
@overload
def DestinationDatabricks(resource_name: str,
opts: Optional[ResourceOptions] = None,
configuration: Optional[DestinationDatabricksConfigurationArgs] = None,
workspace_id: Optional[str] = None,
definition_id: Optional[str] = None,
name: Optional[str] = None)
func NewDestinationDatabricks(ctx *Context, name string, args DestinationDatabricksArgs, opts ...ResourceOption) (*DestinationDatabricks, error)
public DestinationDatabricks(string name, DestinationDatabricksArgs args, CustomResourceOptions? opts = null)
public DestinationDatabricks(String name, DestinationDatabricksArgs args)
public DestinationDatabricks(String name, DestinationDatabricksArgs args, CustomResourceOptions options)
type: airbyte:DestinationDatabricks
properties: # The arguments to resource properties.
options: # Bag of options to control resource's behavior.
Parameters
- name string
- The unique name of the resource.
- args DestinationDatabricksArgs
- The arguments to resource properties.
- opts CustomResourceOptions
- Bag of options to control resource's behavior.
- resource_name str
- The unique name of the resource.
- args DestinationDatabricksArgs
- The arguments to resource properties.
- opts ResourceOptions
- Bag of options to control resource's behavior.
- ctx Context
- Context object for the current deployment.
- name string
- The unique name of the resource.
- args DestinationDatabricksArgs
- The arguments to resource properties.
- opts ResourceOption
- Bag of options to control resource's behavior.
- name string
- The unique name of the resource.
- args DestinationDatabricksArgs
- The arguments to resource properties.
- opts CustomResourceOptions
- Bag of options to control resource's behavior.
- name String
- The unique name of the resource.
- args DestinationDatabricksArgs
- The arguments to resource properties.
- options CustomResourceOptions
- Bag of options to control resource's behavior.
Constructor example
The following reference example uses placeholder values for all input properties.
var destinationDatabricksResource = new Airbyte.DestinationDatabricks("destinationDatabricksResource", new()
{
    Configuration = new Airbyte.Inputs.DestinationDatabricksConfigurationArgs
    {
        Authentication = new Airbyte.Inputs.DestinationDatabricksConfigurationAuthenticationArgs
        {
            OAuth2Recommended = new Airbyte.Inputs.DestinationDatabricksConfigurationAuthenticationOAuth2RecommendedArgs
            {
                ClientId = "string",
                Secret = "string",
            },
            PersonalAccessToken = new Airbyte.Inputs.DestinationDatabricksConfigurationAuthenticationPersonalAccessTokenArgs
            {
                PersonalAccessToken = "string",
            },
        },
        Database = "string",
        Hostname = "string",
        HttpPath = "string",
        AcceptTerms = false,
        Port = "string",
        PurgeStagingData = false,
        RawSchemaOverride = "string",
        Schema = "string",
    },
    WorkspaceId = "string",
    DefinitionId = "string",
    Name = "string",
});
example, err := airbyte.NewDestinationDatabricks(ctx, "destinationDatabricksResource", &airbyte.DestinationDatabricksArgs{
	Configuration: &airbyte.DestinationDatabricksConfigurationArgs{
		Authentication: &airbyte.DestinationDatabricksConfigurationAuthenticationArgs{
			OAuth2Recommended: &airbyte.DestinationDatabricksConfigurationAuthenticationOAuth2RecommendedArgs{
				ClientId: pulumi.String("string"),
				Secret:   pulumi.String("string"),
			},
			PersonalAccessToken: &airbyte.DestinationDatabricksConfigurationAuthenticationPersonalAccessTokenArgs{
				PersonalAccessToken: pulumi.String("string"),
			},
		},
		Database:          pulumi.String("string"),
		Hostname:          pulumi.String("string"),
		HttpPath:          pulumi.String("string"),
		AcceptTerms:       pulumi.Bool(false),
		Port:              pulumi.String("string"),
		PurgeStagingData:  pulumi.Bool(false),
		RawSchemaOverride: pulumi.String("string"),
		Schema:            pulumi.String("string"),
	},
	WorkspaceId:  pulumi.String("string"),
	DefinitionId: pulumi.String("string"),
	Name:         pulumi.String("string"),
})
var destinationDatabricksResource = new DestinationDatabricks("destinationDatabricksResource", DestinationDatabricksArgs.builder()
    .configuration(DestinationDatabricksConfigurationArgs.builder()
        .authentication(DestinationDatabricksConfigurationAuthenticationArgs.builder()
            .oAuth2Recommended(DestinationDatabricksConfigurationAuthenticationOAuth2RecommendedArgs.builder()
                .clientId("string")
                .secret("string")
                .build())
            .personalAccessToken(DestinationDatabricksConfigurationAuthenticationPersonalAccessTokenArgs.builder()
                .personalAccessToken("string")
                .build())
            .build())
        .database("string")
        .hostname("string")
        .httpPath("string")
        .acceptTerms(false)
        .port("string")
        .purgeStagingData(false)
        .rawSchemaOverride("string")
        .schema("string")
        .build())
    .workspaceId("string")
    .definitionId("string")
    .name("string")
    .build());
destination_databricks_resource = airbyte.DestinationDatabricks("destinationDatabricksResource",
    configuration={
        "authentication": {
            "o_auth2_recommended": {
                "client_id": "string",
                "secret": "string",
            },
            "personal_access_token": {
                "personal_access_token": "string",
            },
        },
        "database": "string",
        "hostname": "string",
        "http_path": "string",
        "accept_terms": False,
        "port": "string",
        "purge_staging_data": False,
        "raw_schema_override": "string",
        "schema": "string",
    },
    workspace_id="string",
    definition_id="string",
    name="string")
const destinationDatabricksResource = new airbyte.DestinationDatabricks("destinationDatabricksResource", {
    configuration: {
        authentication: {
            oAuth2Recommended: {
                clientId: "string",
                secret: "string",
            },
            personalAccessToken: {
                personalAccessToken: "string",
            },
        },
        database: "string",
        hostname: "string",
        httpPath: "string",
        acceptTerms: false,
        port: "string",
        purgeStagingData: false,
        rawSchemaOverride: "string",
        schema: "string",
    },
    workspaceId: "string",
    definitionId: "string",
    name: "string",
});
type: airbyte:DestinationDatabricks
properties:
    configuration:
        acceptTerms: false
        authentication:
            oAuth2Recommended:
                clientId: string
                secret: string
            personalAccessToken:
                personalAccessToken: string
        database: string
        hostname: string
        httpPath: string
        port: string
        purgeStagingData: false
        rawSchemaOverride: string
        schema: string
    definitionId: string
    name: string
    workspaceId: string
DestinationDatabricks Resource Properties
To learn more about resource properties and how to use them, see Inputs and Outputs in the Architecture and Concepts docs.
Inputs
In Python, inputs that are objects can be passed either as argument classes or as dictionary literals.
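As a quick illustration, the two forms are interchangeable (a minimal sketch; resource names and placeholder values are invented for this example):

import pulumi_airbyte as airbyte

# 1. Nested inputs as typed argument classes
dest_a = airbyte.DestinationDatabricks("dest-a",
    configuration=airbyte.DestinationDatabricksConfigurationArgs(
        authentication=airbyte.DestinationDatabricksConfigurationAuthenticationArgs(
            personal_access_token=airbyte.DestinationDatabricksConfigurationAuthenticationPersonalAccessTokenArgs(
                personal_access_token="...token...",
            ),
        ),
        database="...my_database...",
        hostname="abc-12345678-wxyz.cloud.databricks.com",
        http_path="sql/1.0/warehouses/0000-1111111-abcd90",
    ),
    workspace_id="2615758c-c904-459e-9fd6-c8a55cba9327")

# 2. The same nested inputs as dictionary literals with snake_case keys
dest_b = airbyte.DestinationDatabricks("dest-b",
    configuration={
        "authentication": {
            "personal_access_token": {"personal_access_token": "...token..."},
        },
        "database": "...my_database...",
        "hostname": "abc-12345678-wxyz.cloud.databricks.com",
        "http_path": "sql/1.0/warehouses/0000-1111111-abcd90",
    },
    workspace_id="2615758c-c904-459e-9fd6-c8a55cba9327")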
The DestinationDatabricks resource accepts the following input properties:
- Configuration DestinationDatabricksConfiguration
- WorkspaceId string
- DefinitionId string - The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
- Name string - Name of the destination e.g. dev-mysql-instance.
- Configuration DestinationDatabricksConfigurationArgs
- WorkspaceId string
- DefinitionId string - The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
- Name string - Name of the destination e.g. dev-mysql-instance.
- configuration DestinationDatabricksConfiguration
- workspaceId String
- definitionId String - The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
- name String - Name of the destination e.g. dev-mysql-instance.
- configuration DestinationDatabricksConfiguration
- workspaceId string
- definitionId string - The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
- name string - Name of the destination e.g. dev-mysql-instance.
- configuration DestinationDatabricksConfigurationArgs
- workspace_id str
- definition_id str - The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
- name str - Name of the destination e.g. dev-mysql-instance.
- configuration Property Map
- workspaceId String
- definitionId String - The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
- name String - Name of the destination e.g. dev-mysql-instance.
Outputs
All input properties are implicitly available as output properties. Additionally, the DestinationDatabricks resource produces the following output properties:
- CreatedAt double
- DestinationId string
- DestinationType string
- Id string - The provider-assigned unique ID for this managed resource.
- ResourceAllocation DestinationDatabricksResourceAllocation - Actor or actor definition specific resource requirements. If default is set, these are the requirements that should be set for all jobs run for this actor definition. It is overridden by the job type specific configurations. If not set, the platform will use defaults. These values will be overridden by configuration at the connection level.
- CreatedAt float64
- DestinationId string
- DestinationType string
- Id string - The provider-assigned unique ID for this managed resource.
- ResourceAllocation DestinationDatabricksResourceAllocation - Actor or actor definition specific resource requirements. If default is set, these are the requirements that should be set for all jobs run for this actor definition. It is overridden by the job type specific configurations. If not set, the platform will use defaults. These values will be overridden by configuration at the connection level.
- createdAt Double
- destinationId String
- destinationType String
- id String - The provider-assigned unique ID for this managed resource.
- resourceAllocation DestinationDatabricksResourceAllocation - Actor or actor definition specific resource requirements. If default is set, these are the requirements that should be set for all jobs run for this actor definition. It is overridden by the job type specific configurations. If not set, the platform will use defaults. These values will be overridden by configuration at the connection level.
- createdAt number
- destinationId string
- destinationType string
- id string - The provider-assigned unique ID for this managed resource.
- resourceAllocation DestinationDatabricksResourceAllocation - Actor or actor definition specific resource requirements. If default is set, these are the requirements that should be set for all jobs run for this actor definition. It is overridden by the job type specific configurations. If not set, the platform will use defaults. These values will be overridden by configuration at the connection level.
- created_at float
- destination_id str
- destination_type str
- id str - The provider-assigned unique ID for this managed resource.
- resource_allocation DestinationDatabricksResourceAllocation - Actor or actor definition specific resource requirements. If default is set, these are the requirements that should be set for all jobs run for this actor definition. It is overridden by the job type specific configurations. If not set, the platform will use defaults. These values will be overridden by configuration at the connection level.
- createdAt Number
- destinationId String
- destinationType String
- id String - The provider-assigned unique ID for this managed resource.
- resourceAllocation Property Map - Actor or actor definition specific resource requirements. If default is set, these are the requirements that should be set for all jobs run for this actor definition. It is overridden by the job type specific configurations. If not set, the platform will use defaults. These values will be overridden by configuration at the connection level.
Look up Existing DestinationDatabricks Resource
Get an existing DestinationDatabricks resource’s state with the given name, ID, and optional extra properties used to qualify the lookup.
public static get(name: string, id: Input<ID>, state?: DestinationDatabricksState, opts?: CustomResourceOptions): DestinationDatabricks
@staticmethod
def get(resource_name: str,
id: str,
opts: Optional[ResourceOptions] = None,
configuration: Optional[DestinationDatabricksConfigurationArgs] = None,
created_at: Optional[float] = None,
definition_id: Optional[str] = None,
destination_id: Optional[str] = None,
destination_type: Optional[str] = None,
name: Optional[str] = None,
resource_allocation: Optional[DestinationDatabricksResourceAllocationArgs] = None,
workspace_id: Optional[str] = None) -> DestinationDatabricks
func GetDestinationDatabricks(ctx *Context, name string, id IDInput, state *DestinationDatabricksState, opts ...ResourceOption) (*DestinationDatabricks, error)
public static DestinationDatabricks Get(string name, Input<string> id, DestinationDatabricksState? state, CustomResourceOptions? opts = null)
public static DestinationDatabricks get(String name, Output<String> id, DestinationDatabricksState state, CustomResourceOptions options)
resources:
    _:
        type: airbyte:DestinationDatabricks
        get:
            id: ${id}
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
- resource_name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
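For instance, a hedged Python lookup using the signature above (the ID shown is a hypothetical placeholder for a real provider-assigned UUID):

import pulumi
import pulumi_airbyte as airbyte

# Hypothetical provider-assigned ID; substitute the real destination UUID.
existing = airbyte.DestinationDatabricks.get("existing-databricks",
    id="00000000-0000-0000-0000-000000000000")

pulumi.export("destinationType", existing.destination_type)

The following state properties can be supplied to qualify the lookup: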
- Configuration DestinationDatabricksConfiguration
- CreatedAt double
- DefinitionId string - The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
- DestinationId string
- DestinationType string
- Name string - Name of the destination e.g. dev-mysql-instance.
- ResourceAllocation DestinationDatabricksResourceAllocation - Actor or actor definition specific resource requirements. If default is set, these are the requirements that should be set for all jobs run for this actor definition. It is overridden by the job type specific configurations. If not set, the platform will use defaults. These values will be overridden by configuration at the connection level.
- WorkspaceId string
- Configuration DestinationDatabricksConfigurationArgs
- CreatedAt float64
- DefinitionId string - The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
- DestinationId string
- DestinationType string
- Name string - Name of the destination e.g. dev-mysql-instance.
- ResourceAllocation DestinationDatabricksResourceAllocationArgs - Actor or actor definition specific resource requirements. If default is set, these are the requirements that should be set for all jobs run for this actor definition. It is overridden by the job type specific configurations. If not set, the platform will use defaults. These values will be overridden by configuration at the connection level.
- WorkspaceId string
- configuration DestinationDatabricksConfiguration
- createdAt Double
- definitionId String - The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
- destinationId String
- destinationType String
- name String - Name of the destination e.g. dev-mysql-instance.
- resourceAllocation DestinationDatabricksResourceAllocation - Actor or actor definition specific resource requirements. If default is set, these are the requirements that should be set for all jobs run for this actor definition. It is overridden by the job type specific configurations. If not set, the platform will use defaults. These values will be overridden by configuration at the connection level.
- workspaceId String
- configuration DestinationDatabricksConfiguration
- createdAt number
- definitionId string - The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
- destinationId string
- destinationType string
- name string - Name of the destination e.g. dev-mysql-instance.
- resourceAllocation DestinationDatabricksResourceAllocation - Actor or actor definition specific resource requirements. If default is set, these are the requirements that should be set for all jobs run for this actor definition. It is overridden by the job type specific configurations. If not set, the platform will use defaults. These values will be overridden by configuration at the connection level.
- workspaceId string
- configuration DestinationDatabricksConfigurationArgs
- created_at float
- definition_id str - The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
- destination_id str
- destination_type str
- name str - Name of the destination e.g. dev-mysql-instance.
- resource_allocation DestinationDatabricksResourceAllocationArgs - Actor or actor definition specific resource requirements. If default is set, these are the requirements that should be set for all jobs run for this actor definition. It is overridden by the job type specific configurations. If not set, the platform will use defaults. These values will be overridden by configuration at the connection level.
- workspace_id str
- configuration Property Map
- createdAt Number
- definitionId String - The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
- destinationId String
- destinationType String
- name String - Name of the destination e.g. dev-mysql-instance.
- resourceAllocation Property Map - Actor or actor definition specific resource requirements. If default is set, these are the requirements that should be set for all jobs run for this actor definition. It is overridden by the job type specific configurations. If not set, the platform will use defaults. These values will be overridden by configuration at the connection level.
- workspaceId String
Supporting Types
DestinationDatabricksConfiguration, DestinationDatabricksConfigurationArgs
- Authentication DestinationDatabricksConfigurationAuthentication - Authentication mechanism for staging files and running queries
- Database string - The name of the unity catalog for the database
- Hostname string - Databricks Cluster Server Hostname.
- HttpPath string - Databricks Cluster HTTP Path.
- AcceptTerms bool - You must agree to the Databricks JDBC Driver Terms & Conditions to use this connector. Default: false
- Port string - Databricks Cluster Port. Default: "443"
- PurgeStagingData bool - Defaults to 'true'. Switch it to 'false' for debugging purposes. Default: true
- RawSchemaOverride string - The schema to write raw tables into (default: airbyte_internal). Default: "airbyte_internal"
- Schema string - The default schema tables are written to. If not specified otherwise, "default" will be used. Default: "default"
- Authentication DestinationDatabricksConfigurationAuthentication - Authentication mechanism for staging files and running queries
- Database string - The name of the unity catalog for the database
- Hostname string - Databricks Cluster Server Hostname.
- HttpPath string - Databricks Cluster HTTP Path.
- AcceptTerms bool - You must agree to the Databricks JDBC Driver Terms & Conditions to use this connector. Default: false
- Port string - Databricks Cluster Port. Default: "443"
- PurgeStagingData bool - Defaults to 'true'. Switch it to 'false' for debugging purposes. Default: true
- RawSchemaOverride string - The schema to write raw tables into (default: airbyte_internal). Default: "airbyte_internal"
- Schema string - The default schema tables are written to. If not specified otherwise, "default" will be used. Default: "default"
- authentication DestinationDatabricksConfigurationAuthentication - Authentication mechanism for staging files and running queries
- database String - The name of the unity catalog for the database
- hostname String - Databricks Cluster Server Hostname.
- httpPath String - Databricks Cluster HTTP Path.
- acceptTerms Boolean - You must agree to the Databricks JDBC Driver Terms & Conditions to use this connector. Default: false
- port String - Databricks Cluster Port. Default: "443"
- purgeStagingData Boolean - Defaults to 'true'. Switch it to 'false' for debugging purposes. Default: true
- rawSchemaOverride String - The schema to write raw tables into (default: airbyte_internal). Default: "airbyte_internal"
- schema String - The default schema tables are written to. If not specified otherwise, "default" will be used. Default: "default"
- authentication DestinationDatabricksConfigurationAuthentication - Authentication mechanism for staging files and running queries
- database string - The name of the unity catalog for the database
- hostname string - Databricks Cluster Server Hostname.
- httpPath string - Databricks Cluster HTTP Path.
- acceptTerms boolean - You must agree to the Databricks JDBC Driver Terms & Conditions to use this connector. Default: false
- port string - Databricks Cluster Port. Default: "443"
- purgeStagingData boolean - Defaults to 'true'. Switch it to 'false' for debugging purposes. Default: true
- rawSchemaOverride string - The schema to write raw tables into (default: airbyte_internal). Default: "airbyte_internal"
- schema string - The default schema tables are written to. If not specified otherwise, "default" will be used. Default: "default"
- authentication DestinationDatabricksConfigurationAuthentication - Authentication mechanism for staging files and running queries
- database str - The name of the unity catalog for the database
- hostname str - Databricks Cluster Server Hostname.
- http_path str - Databricks Cluster HTTP Path.
- accept_terms bool - You must agree to the Databricks JDBC Driver Terms & Conditions to use this connector. Default: false
- port str - Databricks Cluster Port. Default: "443"
- purge_staging_data bool - Defaults to 'true'. Switch it to 'false' for debugging purposes. Default: true
- raw_schema_override str - The schema to write raw tables into (default: airbyte_internal). Default: "airbyte_internal"
- schema str - The default schema tables are written to. If not specified otherwise, "default" will be used. Default: "default"
- authentication Property Map - Authentication mechanism for staging files and running queries
- database String - The name of the unity catalog for the database
- hostname String - Databricks Cluster Server Hostname.
- httpPath String - Databricks Cluster HTTP Path.
- acceptTerms Boolean - You must agree to the Databricks JDBC Driver Terms & Conditions to use this connector. Default: false
- port String - Databricks Cluster Port. Default: "443"
- purgeStagingData Boolean - Defaults to 'true'. Switch it to 'false' for debugging purposes. Default: true
- rawSchemaOverride String - The schema to write raw tables into (default: airbyte_internal). Default: "airbyte_internal"
- schema String - The default schema tables are written to. If not specified otherwise, "default" will be used. Default: "default"
DestinationDatabricksConfigurationAuthentication, DestinationDatabricksConfigurationAuthenticationArgs
DestinationDatabricksConfigurationAuthenticationOAuth2Recommended, DestinationDatabricksConfigurationAuthenticationOAuth2RecommendedArgs
DestinationDatabricksConfigurationAuthenticationPersonalAccessToken, DestinationDatabricksConfigurationAuthenticationPersonalAccessTokenArgs
- PersonalAccessToken string
- PersonalAccessToken string
- personalAccessToken String
- personalAccessToken string
- personal_access_token str
- personalAccessToken String
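The authentication object wraps exactly one of the two variants above. A minimal Python sketch of each, following the constructor reference (placeholder values are invented):

import pulumi_airbyte as airbyte

# OAuth2 (recommended) variant: client ID and secret
oauth_auth = airbyte.DestinationDatabricksConfigurationAuthenticationArgs(
    o_auth2_recommended=airbyte.DestinationDatabricksConfigurationAuthenticationOAuth2RecommendedArgs(
        client_id="...client_id...",
        secret="...secret...",
    ),
)

# Personal access token variant: a single token string
pat_auth = airbyte.DestinationDatabricksConfigurationAuthenticationArgs(
    personal_access_token=airbyte.DestinationDatabricksConfigurationAuthenticationPersonalAccessTokenArgs(
        personal_access_token="...my_personal_access_token...",
    ),
)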
DestinationDatabricksResourceAllocation, DestinationDatabricksResourceAllocationArgs
- Default DestinationDatabricksResourceAllocationDefault - optional resource requirements to run workers (blank for unbounded allocations)
- JobSpecifics List<DestinationDatabricksResourceAllocationJobSpecific>
- Default DestinationDatabricksResourceAllocationDefault - optional resource requirements to run workers (blank for unbounded allocations)
- JobSpecifics []DestinationDatabricksResourceAllocationJobSpecific
- default_ DestinationDatabricksResourceAllocationDefault - optional resource requirements to run workers (blank for unbounded allocations)
- jobSpecifics List<DestinationDatabricksResourceAllocationJobSpecific>
- default DestinationDatabricksResourceAllocationDefault - optional resource requirements to run workers (blank for unbounded allocations)
- jobSpecifics DestinationDatabricksResourceAllocationJobSpecific[]
- default DestinationDatabricksResourceAllocationDefault - optional resource requirements to run workers (blank for unbounded allocations)
- job_specifics Sequence[DestinationDatabricksResourceAllocationJobSpecific]
- default Property Map - optional resource requirements to run workers (blank for unbounded allocations)
- jobSpecifics List<Property Map>
DestinationDatabricksResourceAllocationDefault, DestinationDatabricksResourceAllocationDefaultArgs
- CpuLimit string
- CpuRequest string
- EphemeralStorageLimit string
- EphemeralStorageRequest string
- MemoryLimit string
- MemoryRequest string
- CpuLimit string
- CpuRequest string
- EphemeralStorageLimit string
- EphemeralStorageRequest string
- MemoryLimit string
- MemoryRequest string
- cpuLimit String
- cpuRequest String
- ephemeralStorageLimit String
- ephemeralStorageRequest String
- memoryLimit String
- memoryRequest String
- cpuLimit string
- cpuRequest string
- ephemeralStorageLimit string
- ephemeralStorageRequest string
- memoryLimit string
- memoryRequest string
- cpu_limit str
- cpu_request str
- ephemeral_storage_limit str
- ephemeral_storage_request str
- memory_limit str
- memory_request str
- cpuLimit String
- cpuRequest String
- ephemeralStorageLimit String
- ephemeralStorageRequest String
- memoryLimit String
- memoryRequest String
DestinationDatabricksResourceAllocationJobSpecific, DestinationDatabricksResourceAllocationJobSpecificArgs
- JobType string - Enum that describes the different types of jobs that the platform runs. Must be one of ["get_spec", "check_connection", "discover_schema", "sync", "reset_connection", "connection_updater", "replicate"].
- ResourceRequirements DestinationDatabricksResourceAllocationJobSpecificResourceRequirements - optional resource requirements to run workers (blank for unbounded allocations)
- JobType string - Enum that describes the different types of jobs that the platform runs. Must be one of ["get_spec", "check_connection", "discover_schema", "sync", "reset_connection", "connection_updater", "replicate"].
- ResourceRequirements DestinationDatabricksResourceAllocationJobSpecificResourceRequirements - optional resource requirements to run workers (blank for unbounded allocations)
- jobType String - Enum that describes the different types of jobs that the platform runs. Must be one of ["get_spec", "check_connection", "discover_schema", "sync", "reset_connection", "connection_updater", "replicate"].
- resourceRequirements DestinationDatabricksResourceAllocationJobSpecificResourceRequirements - optional resource requirements to run workers (blank for unbounded allocations)
- jobType string - Enum that describes the different types of jobs that the platform runs. Must be one of ["get_spec", "check_connection", "discover_schema", "sync", "reset_connection", "connection_updater", "replicate"].
- resourceRequirements DestinationDatabricksResourceAllocationJobSpecificResourceRequirements - optional resource requirements to run workers (blank for unbounded allocations)
- job_type str - Enum that describes the different types of jobs that the platform runs. Must be one of ["get_spec", "check_connection", "discover_schema", "sync", "reset_connection", "connection_updater", "replicate"].
- resource_requirements DestinationDatabricksResourceAllocationJobSpecificResourceRequirements - optional resource requirements to run workers (blank for unbounded allocations)
- jobType String - Enum that describes the different types of jobs that the platform runs. Must be one of ["get_spec", "check_connection", "discover_schema", "sync", "reset_connection", "connection_updater", "replicate"].
- resourceRequirements Property Map - optional resource requirements to run workers (blank for unbounded allocations)
DestinationDatabricksResourceAllocationJobSpecificResourceRequirements, DestinationDatabricksResourceAllocationJobSpecificResourceRequirementsArgs
- CpuLimit string
- CpuRequest string
- EphemeralStorageLimit string
- EphemeralStorageRequest string
- MemoryLimit string
- MemoryRequest string
- CpuLimit string
- CpuRequest string
- EphemeralStorageLimit string
- EphemeralStorageRequest string
- MemoryLimit string
- MemoryRequest string
- cpuLimit String
- cpuRequest String
- ephemeralStorageLimit String
- ephemeralStorageRequest String
- memoryLimit String
- memoryRequest String
- cpuLimit string
- cpuRequest string
- ephemeralStorageLimit string
- ephemeralStorageRequest string
- memoryLimit string
- memoryRequest string
- cpu_limit str
- cpu_request str
- ephemeral_storage_limit str
- ephemeral_storage_request str
- memory_limit str
- memory_request str
- cpuLimit String
- cpuRequest String
- ephemeralStorageLimit String
- ephemeralStorageRequest String
- memoryLimit String
- memoryRequest String
Import
$ pulumi import airbyte:index/destinationDatabricks:DestinationDatabricks my_airbyte_destination_databricks ""
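The trailing quoted argument is the resource ID to import, left blank above; for this provider it is presumably the Airbyte destination's UUID (the destinationId output). A hypothetical example:

$ pulumi import airbyte:index/destinationDatabricks:DestinationDatabricks my_airbyte_destination_databricks "00000000-0000-0000-0000-000000000000"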
To learn more about importing existing cloud resources, see Importing resources.
Package Details
- Repository: airbyte airbytehq/terraform-provider-airbyte
- License
- Notes: This Pulumi package is based on the airbyte Terraform Provider.