
We recommend new projects start with resources from the AWS provider.

AWS Cloud Control v1.32.0 published on Wednesday, Aug 13, 2025 by Pulumi

aws-native.sagemaker.ProcessingJob


    Resource Type definition for AWS::SageMaker::ProcessingJob

    Create ProcessingJob Resource

    Resources are created with functions called constructors. To learn more about declaring and configuring resources, see Resources.

    Constructor syntax

    new ProcessingJob(name: string, args: ProcessingJobArgs, opts?: CustomResourceOptions);
    @overload
    def ProcessingJob(resource_name: str,
                      args: ProcessingJobArgs,
                      opts: Optional[ResourceOptions] = None)
    
    @overload
    def ProcessingJob(resource_name: str,
                      opts: Optional[ResourceOptions] = None,
                      app_specification: Optional[ProcessingJobAppSpecificationArgs] = None,
                      processing_resources: Optional[ProcessingJobProcessingResourcesArgs] = None,
                      role_arn: Optional[str] = None,
                      environment: Optional[ProcessingJobEnvironmentArgs] = None,
                      experiment_config: Optional[ProcessingJobExperimentConfigArgs] = None,
                      network_config: Optional[ProcessingJobNetworkConfigArgs] = None,
                      processing_inputs: Optional[Sequence[ProcessingJobProcessingInputsObjectArgs]] = None,
                      processing_job_name: Optional[str] = None,
                      processing_output_config: Optional[ProcessingJobProcessingOutputConfigArgs] = None,
                      stopping_condition: Optional[ProcessingJobStoppingConditionArgs] = None,
                      tags: Optional[Sequence[_root_inputs.CreateOnlyTagArgs]] = None)
    func NewProcessingJob(ctx *Context, name string, args ProcessingJobArgs, opts ...ResourceOption) (*ProcessingJob, error)
    public ProcessingJob(string name, ProcessingJobArgs args, CustomResourceOptions? opts = null)
    public ProcessingJob(String name, ProcessingJobArgs args)
    public ProcessingJob(String name, ProcessingJobArgs args, CustomResourceOptions options)
    
    type: aws-native:sagemaker:ProcessingJob
    properties: # The arguments to resource properties.
    options: # Bag of options to control resource's behavior.
    
    
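    As a rough illustration of the Python overload above, the keyword arguments can also be collected as a plain dictionary, since the Python SDK accepts dictionary literals for object inputs (see the note under Inputs below). This is a sketch only: the image URI, role ARN, and the nested cluster_config shape are hypothetical placeholders based on the SageMaker API, not values taken from this page.

    ```python
    # Hypothetical minimal argument set for a ProcessingJob. The nested
    # cluster_config shape is an assumption from the SageMaker API, and
    # every ARN/URI below is a placeholder.
    processing_job_args = {
        "app_specification": {
            "image_uri": "123456789012.dkr.ecr.us-east-1.amazonaws.com/my-processor:latest",
        },
        "processing_resources": {
            "cluster_config": {
                "instance_count": 1,
                "instance_type": "ml.m5.xlarge",
                "volume_size_in_gb": 30,
            },
        },
        "role_arn": "arn:aws:iam::123456789012:role/MySageMakerProcessingRole",
        "processing_job_name": "example-processing-job",
    }
    ```

    The three entries listed first in the overload (app_specification, processing_resources, role_arn) are the ones the first overload treats as required via ProcessingJobArgs.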

    Parameters

    name string
    The unique name of the resource.
    args ProcessingJobArgs
    The arguments to resource properties.
    opts CustomResourceOptions
    Bag of options to control resource's behavior.
    resource_name str
    The unique name of the resource.
    args ProcessingJobArgs
    The arguments to resource properties.
    opts ResourceOptions
    Bag of options to control resource's behavior.
    ctx Context
    Context object for the current deployment.
    name string
    The unique name of the resource.
    args ProcessingJobArgs
    The arguments to resource properties.
    opts ResourceOption
    Bag of options to control resource's behavior.
    name string
    The unique name of the resource.
    args ProcessingJobArgs
    The arguments to resource properties.
    opts CustomResourceOptions
    Bag of options to control resource's behavior.
    name String
    The unique name of the resource.
    args ProcessingJobArgs
    The arguments to resource properties.
    options CustomResourceOptions
    Bag of options to control resource's behavior.

    ProcessingJob Resource Properties

    To learn more about resource properties and how to use them, see Inputs and Outputs in the Architecture and Concepts docs.

    Inputs

    In Python, inputs that are objects can be passed either as argument classes or as dictionary literals.

    The ProcessingJob resource accepts the following input properties:

    AppSpecification Pulumi.AwsNative.SageMaker.Inputs.ProcessingJobAppSpecification
    Configuration to run a processing job in a specified container image.
    ProcessingResources Pulumi.AwsNative.SageMaker.Inputs.ProcessingJobProcessingResources
    Identifies the resources, ML compute instances, and ML storage volumes to deploy for a processing job. In distributed training, you specify more than one instance.
    RoleArn string
    The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to perform tasks on your behalf.
    Environment Pulumi.AwsNative.SageMaker.Inputs.ProcessingJobEnvironment
    Sets the environment variables in the Docker container.
    ExperimentConfig Pulumi.AwsNative.SageMaker.Inputs.ProcessingJobExperimentConfig
    Associates a SageMaker job as a trial component with an experiment and trial. Specified when you call the CreateProcessingJob API.
    NetworkConfig Pulumi.AwsNative.SageMaker.Inputs.ProcessingJobNetworkConfig
    Networking options for a job, such as network traffic encryption between containers, whether to allow inbound and outbound network calls to and from containers, and the VPC subnets and security groups to use for VPC-enabled jobs.
    ProcessingInputs List<Pulumi.AwsNative.SageMaker.Inputs.ProcessingJobProcessingInputsObject>
    An array of inputs configuring the data to download into the processing container.
    ProcessingJobName string
    The name of the processing job. The name must be unique within an AWS Region in the AWS account.
    ProcessingOutputConfig Pulumi.AwsNative.SageMaker.Inputs.ProcessingJobProcessingOutputConfig
    Output configuration for the processing job, specifying where results from the processing container are uploaded and, optionally, the KMS key used to encrypt them.
    StoppingCondition Pulumi.AwsNative.SageMaker.Inputs.ProcessingJobStoppingCondition
    Configures conditions under which the processing job should be stopped, such as how long the processing job has been running. After the condition is met, the processing job is stopped.
    Tags List<Pulumi.AwsNative.Inputs.CreateOnlyTag>
    (Optional) An array of key-value pairs. For more information, see Using Cost Allocation Tags (https://docs.aws.amazon.com/awsaccountbilling/latest/aboutv2/cost-alloc-tags.html#allocation-what) in the AWS Billing and Cost Management User Guide.
    AppSpecification ProcessingJobAppSpecificationArgs
    Configuration to run a processing job in a specified container image.
    ProcessingResources ProcessingJobProcessingResourcesArgs
    Identifies the resources, ML compute instances, and ML storage volumes to deploy for a processing job. In distributed training, you specify more than one instance.
    RoleArn string
    The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to perform tasks on your behalf.
    Environment ProcessingJobEnvironmentArgs
    Sets the environment variables in the Docker container.
    ExperimentConfig ProcessingJobExperimentConfigArgs
    Associates a SageMaker job as a trial component with an experiment and trial. Specified when you call the CreateProcessingJob API.
    NetworkConfig ProcessingJobNetworkConfigArgs
    Networking options for a job, such as network traffic encryption between containers, whether to allow inbound and outbound network calls to and from containers, and the VPC subnets and security groups to use for VPC-enabled jobs.
    ProcessingInputs []ProcessingJobProcessingInputsObjectArgs
    An array of inputs configuring the data to download into the processing container.
    ProcessingJobName string
    The name of the processing job. The name must be unique within an AWS Region in the AWS account.
    ProcessingOutputConfig ProcessingJobProcessingOutputConfigArgs
    Output configuration for the processing job, specifying where results from the processing container are uploaded and, optionally, the KMS key used to encrypt them.
    StoppingCondition ProcessingJobStoppingConditionArgs
    Configures conditions under which the processing job should be stopped, such as how long the processing job has been running. After the condition is met, the processing job is stopped.
    Tags CreateOnlyTagArgs
    (Optional) An array of key-value pairs. For more information, see Using Cost Allocation Tags (https://docs.aws.amazon.com/awsaccountbilling/latest/aboutv2/cost-alloc-tags.html#allocation-what) in the AWS Billing and Cost Management User Guide.
    appSpecification ProcessingJobAppSpecification
    Configuration to run a processing job in a specified container image.
    processingResources ProcessingJobProcessingResources
    Identifies the resources, ML compute instances, and ML storage volumes to deploy for a processing job. In distributed training, you specify more than one instance.
    roleArn String
    The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to perform tasks on your behalf.
    environment ProcessingJobEnvironment
    Sets the environment variables in the Docker container.
    experimentConfig ProcessingJobExperimentConfig
    Associates a SageMaker job as a trial component with an experiment and trial. Specified when you call the CreateProcessingJob API.
    networkConfig ProcessingJobNetworkConfig
    Networking options for a job, such as network traffic encryption between containers, whether to allow inbound and outbound network calls to and from containers, and the VPC subnets and security groups to use for VPC-enabled jobs.
    processingInputs List<ProcessingJobProcessingInputsObject>
    An array of inputs configuring the data to download into the processing container.
    processingJobName String
    The name of the processing job. The name must be unique within an AWS Region in the AWS account.
    processingOutputConfig ProcessingJobProcessingOutputConfig
    Output configuration for the processing job, specifying where results from the processing container are uploaded and, optionally, the KMS key used to encrypt them.
    stoppingCondition ProcessingJobStoppingCondition
    Configures conditions under which the processing job should be stopped, such as how long the processing job has been running. After the condition is met, the processing job is stopped.
    tags List<CreateOnlyTag>
    (Optional) An array of key-value pairs. For more information, see Using Cost Allocation Tags (https://docs.aws.amazon.com/awsaccountbilling/latest/aboutv2/cost-alloc-tags.html#allocation-what) in the AWS Billing and Cost Management User Guide.
    appSpecification ProcessingJobAppSpecification
    Configuration to run a processing job in a specified container image.
    processingResources ProcessingJobProcessingResources
    Identifies the resources, ML compute instances, and ML storage volumes to deploy for a processing job. In distributed training, you specify more than one instance.
    roleArn string
    The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to perform tasks on your behalf.
    environment ProcessingJobEnvironment
    Sets the environment variables in the Docker container.
    experimentConfig ProcessingJobExperimentConfig
    Associates a SageMaker job as a trial component with an experiment and trial. Specified when you call the CreateProcessingJob API.
    networkConfig ProcessingJobNetworkConfig
    Networking options for a job, such as network traffic encryption between containers, whether to allow inbound and outbound network calls to and from containers, and the VPC subnets and security groups to use for VPC-enabled jobs.
    processingInputs ProcessingJobProcessingInputsObject[]
    An array of inputs configuring the data to download into the processing container.
    processingJobName string
    The name of the processing job. The name must be unique within an AWS Region in the AWS account.
    processingOutputConfig ProcessingJobProcessingOutputConfig
    Output configuration for the processing job, specifying where results from the processing container are uploaded and, optionally, the KMS key used to encrypt them.
    stoppingCondition ProcessingJobStoppingCondition
    Configures conditions under which the processing job should be stopped, such as how long the processing job has been running. After the condition is met, the processing job is stopped.
    tags CreateOnlyTag[]
    (Optional) An array of key-value pairs. For more information, see Using Cost Allocation Tags (https://docs.aws.amazon.com/awsaccountbilling/latest/aboutv2/cost-alloc-tags.html#allocation-what) in the AWS Billing and Cost Management User Guide.
    app_specification ProcessingJobAppSpecificationArgs
    Configuration to run a processing job in a specified container image.
    processing_resources ProcessingJobProcessingResourcesArgs
    Identifies the resources, ML compute instances, and ML storage volumes to deploy for a processing job. In distributed training, you specify more than one instance.
    role_arn str
    The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to perform tasks on your behalf.
    environment ProcessingJobEnvironmentArgs
    Sets the environment variables in the Docker container.
    experiment_config ProcessingJobExperimentConfigArgs
    Associates a SageMaker job as a trial component with an experiment and trial. Specified when you call the CreateProcessingJob API.
    network_config ProcessingJobNetworkConfigArgs
    Networking options for a job, such as network traffic encryption between containers, whether to allow inbound and outbound network calls to and from containers, and the VPC subnets and security groups to use for VPC-enabled jobs.
    processing_inputs Sequence[ProcessingJobProcessingInputsObjectArgs]
    An array of inputs configuring the data to download into the processing container.
    processing_job_name str
    The name of the processing job. The name must be unique within an AWS Region in the AWS account.
    processing_output_config ProcessingJobProcessingOutputConfigArgs
    Output configuration for the processing job, specifying where results from the processing container are uploaded and, optionally, the KMS key used to encrypt them.
    stopping_condition ProcessingJobStoppingConditionArgs
    Configures conditions under which the processing job should be stopped, such as how long the processing job has been running. After the condition is met, the processing job is stopped.
    tags Sequence[CreateOnlyTagArgs]
    (Optional) An array of key-value pairs. For more information, see Using Cost Allocation Tags (https://docs.aws.amazon.com/awsaccountbilling/latest/aboutv2/cost-alloc-tags.html#allocation-what) in the AWS Billing and Cost Management User Guide.
    appSpecification Property Map
    Configuration to run a processing job in a specified container image.
    processingResources Property Map
    Identifies the resources, ML compute instances, and ML storage volumes to deploy for a processing job. In distributed training, you specify more than one instance.
    roleArn String
    The Amazon Resource Name (ARN) of an IAM role that Amazon SageMaker can assume to perform tasks on your behalf.
    environment Property Map
    Sets the environment variables in the Docker container.
    experimentConfig Property Map
    Associates a SageMaker job as a trial component with an experiment and trial. Specified when you call the CreateProcessingJob API.
    networkConfig Property Map
    Networking options for a job, such as network traffic encryption between containers, whether to allow inbound and outbound network calls to and from containers, and the VPC subnets and security groups to use for VPC-enabled jobs.
    processingInputs List<Property Map>
    An array of inputs configuring the data to download into the processing container.
    processingJobName String
    The name of the processing job. The name must be unique within an AWS Region in the AWS account.
    processingOutputConfig Property Map
    Output configuration for the processing job, specifying where results from the processing container are uploaded and, optionally, the KMS key used to encrypt them.
    stoppingCondition Property Map
    Configures conditions under which the processing job should be stopped, such as how long the processing job has been running. After the condition is met, the processing job is stopped.
    tags List<Property Map>
    (Optional) An array of key-value pairs. For more information, see Using Cost Allocation Tags (https://docs.aws.amazon.com/awsaccountbilling/latest/aboutv2/cost-alloc-tags.html#allocation-what) in the AWS Billing and Cost Management User Guide.
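    Because ProcessingJobName must be unique within a Region and account, it can be worth validating a candidate name before a deployment. The sketch below checks the naming pattern; the pattern itself (1-63 alphanumeric characters and hyphens, starting and ending alphanumeric) is an assumption taken from the SageMaker API reference, not from this page.

    ```python
    import re

    # Assumed SageMaker naming constraint: 1-63 chars, alphanumerics and
    # hyphens, must start and end with an alphanumeric character.
    _NAME_RE = re.compile(r"^[a-zA-Z0-9](-*[a-zA-Z0-9]){0,62}$")

    def is_valid_processing_job_name(name: str) -> bool:
        """Return True if `name` satisfies the assumed naming constraints."""
        return bool(_NAME_RE.match(name))
    ```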

    Outputs

    All input properties are implicitly available as output properties. Additionally, the ProcessingJob resource produces the following output properties:

    AutoMlJobArn string
    The ARN of an AutoML job associated with this processing job.
    CreationTime string
    The time at which the processing job was created.
    ExitMessage string
    An optional string, up to one KB in size, that contains metadata from the processing container when the processing job exits.
    FailureReason string
    A string, up to one KB in size, that contains the reason a processing job failed, if it failed.
    Id string
    The provider-assigned unique ID for this managed resource.
    LastModifiedTime string
    The time at which the processing job was last modified.
    MonitoringScheduleArn string
    The ARN of a monitoring schedule for an endpoint associated with this processing job.
    ProcessingEndTime string
    The time at which the processing job completed.
    ProcessingJobArn string
    The Amazon Resource Name (ARN) of the processing job.
    ProcessingJobStatus Pulumi.AwsNative.SageMaker.ProcessingJobStatus
    Provides the status of a processing job.
    ProcessingStartTime string
    The time at which the processing job started.
    TrainingJobArn string
    The ARN of a training job associated with this processing job.
    AutoMlJobArn string
    The ARN of an AutoML job associated with this processing job.
    CreationTime string
    The time at which the processing job was created.
    ExitMessage string
    An optional string, up to one KB in size, that contains metadata from the processing container when the processing job exits.
    FailureReason string
    A string, up to one KB in size, that contains the reason a processing job failed, if it failed.
    Id string
    The provider-assigned unique ID for this managed resource.
    LastModifiedTime string
    The time at which the processing job was last modified.
    MonitoringScheduleArn string
    The ARN of a monitoring schedule for an endpoint associated with this processing job.
    ProcessingEndTime string
    The time at which the processing job completed.
    ProcessingJobArn string
    The Amazon Resource Name (ARN) of the processing job.
    ProcessingJobStatus ProcessingJobStatus
    Provides the status of a processing job.
    ProcessingStartTime string
    The time at which the processing job started.
    TrainingJobArn string
    The ARN of a training job associated with this processing job.
    autoMlJobArn String
    The ARN of an AutoML job associated with this processing job.
    creationTime String
    The time at which the processing job was created.
    exitMessage String
    An optional string, up to one KB in size, that contains metadata from the processing container when the processing job exits.
    failureReason String
    A string, up to one KB in size, that contains the reason a processing job failed, if it failed.
    id String
    The provider-assigned unique ID for this managed resource.
    lastModifiedTime String
    The time at which the processing job was last modified.
    monitoringScheduleArn String
    The ARN of a monitoring schedule for an endpoint associated with this processing job.
    processingEndTime String
    The time at which the processing job completed.
    processingJobArn String
    The Amazon Resource Name (ARN) of the processing job.
    processingJobStatus ProcessingJobStatus
    Provides the status of a processing job.
    processingStartTime String
    The time at which the processing job started.
    trainingJobArn String
    The ARN of a training job associated with this processing job.
    autoMlJobArn string
    The ARN of an AutoML job associated with this processing job.
    creationTime string
    The time at which the processing job was created.
    exitMessage string
    An optional string, up to one KB in size, that contains metadata from the processing container when the processing job exits.
    failureReason string
    A string, up to one KB in size, that contains the reason a processing job failed, if it failed.
    id string
    The provider-assigned unique ID for this managed resource.
    lastModifiedTime string
    The time at which the processing job was last modified.
    monitoringScheduleArn string
    The ARN of a monitoring schedule for an endpoint associated with this processing job.
    processingEndTime string
    The time at which the processing job completed.
    processingJobArn string
    The Amazon Resource Name (ARN) of the processing job.
    processingJobStatus ProcessingJobStatus
    Provides the status of a processing job.
    processingStartTime string
    The time at which the processing job started.
    trainingJobArn string
    The ARN of a training job associated with this processing job.
    auto_ml_job_arn str
    The ARN of an AutoML job associated with this processing job.
    creation_time str
    The time at which the processing job was created.
    exit_message str
    An optional string, up to one KB in size, that contains metadata from the processing container when the processing job exits.
    failure_reason str
    A string, up to one KB in size, that contains the reason a processing job failed, if it failed.
    id str
    The provider-assigned unique ID for this managed resource.
    last_modified_time str
    The time at which the processing job was last modified.
    monitoring_schedule_arn str
    The ARN of a monitoring schedule for an endpoint associated with this processing job.
    processing_end_time str
    The time at which the processing job completed.
    processing_job_arn str
    The Amazon Resource Name (ARN) of the processing job.
    processing_job_status ProcessingJobStatus
    Provides the status of a processing job.
    processing_start_time str
    The time at which the processing job started.
    training_job_arn str
    The ARN of a training job associated with this processing job.
    autoMlJobArn String
    The ARN of an AutoML job associated with this processing job.
    creationTime String
    The time at which the processing job was created.
    exitMessage String
    An optional string, up to one KB in size, that contains metadata from the processing container when the processing job exits.
    failureReason String
    A string, up to one KB in size, that contains the reason a processing job failed, if it failed.
    id String
    The provider-assigned unique ID for this managed resource.
    lastModifiedTime String
    The time at which the processing job was last modified.
    monitoringScheduleArn String
    The ARN of a monitoring schedule for an endpoint associated with this processing job.
    processingEndTime String
    The time at which the processing job completed.
    processingJobArn String
    The Amazon Resource Name (ARN) of the processing job.
    processingJobStatus "Completed" | "InProgress" | "Stopping" | "Stopped" | "Failed"
    Provides the status of a processing job.
    processingStartTime String
    The time at which the processing job started.
    trainingJobArn String
    The ARN of a training job associated with this processing job.
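    The processingJobStatus output takes one of the five string values listed above. When polling a job, it is common to treat Completed, Stopped, and Failed as terminal; that grouping is an interpretation of the status names, not something stated on this page.

    ```python
    # Status values as listed for processingJobStatus above.
    PROCESSING_JOB_STATUSES = {"Completed", "InProgress", "Stopping", "Stopped", "Failed"}

    # Assumed terminal states: no further transitions expected once reached.
    TERMINAL_STATUSES = {"Completed", "Stopped", "Failed"}

    def is_terminal(status: str) -> bool:
        """Return True once a job has reached an (assumed) terminal status."""
        if status not in PROCESSING_JOB_STATUSES:
            raise ValueError(f"unknown processing job status: {status}")
        return status in TERMINAL_STATUSES
    ```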

    Supporting Types

    CreateOnlyTag, CreateOnlyTagArgs

    Key string
    The key name of the tag
    Value string
    The value of the tag
    Key string
    The key name of the tag
    Value string
    The value of the tag
    key String
    The key name of the tag
    value String
    The value of the tag
    key string
    The key name of the tag
    value string
    The value of the tag
    key str
    The key name of the tag
    value str
    The value of the tag
    key String
    The key name of the tag
    value String
    The value of the tag

    ProcessingJobAppSpecification, ProcessingJobAppSpecificationArgs

    ImageUri string
    The container image to be run by the processing job.
    ContainerArguments List<string>
    The arguments for a container used to run a processing job.
    ContainerEntrypoint List<string>
    The entrypoint for a container used to run a processing job.
    ImageUri string
    The container image to be run by the processing job.
    ContainerArguments []string
    The arguments for a container used to run a processing job.
    ContainerEntrypoint []string
    The entrypoint for a container used to run a processing job.
    imageUri String
    The container image to be run by the processing job.
    containerArguments List<String>
    The arguments for a container used to run a processing job.
    containerEntrypoint List<String>
    The entrypoint for a container used to run a processing job.
    imageUri string
    The container image to be run by the processing job.
    containerArguments string[]
    The arguments for a container used to run a processing job.
    containerEntrypoint string[]
    The entrypoint for a container used to run a processing job.
    image_uri str
    The container image to be run by the processing job.
    container_arguments Sequence[str]
    The arguments for a container used to run a processing job.
    container_entrypoint Sequence[str]
    The entrypoint for a container used to run a processing job.
    imageUri String
    The container image to be run by the processing job.
    containerArguments List<String>
    The arguments for a container used to run a processing job.
    containerEntrypoint List<String>
    The entrypoint for a container used to run a processing job.

    ProcessingJobAthenaDatasetDefinition, ProcessingJobAthenaDatasetDefinitionArgs

    Catalog string
    The name of the data catalog used in Athena query execution.
    Database string
    The name of the database used in the Athena query execution.
    OutputFormat Pulumi.AwsNative.SageMaker.ProcessingJobAthenaDatasetDefinitionOutputFormat
    The data storage format for Athena query results.
    OutputS3Uri string
    The location in Amazon S3 where Athena query results are stored.
    QueryString string
    The SQL query statements to be executed.
    KmsKeyId string
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data generated from an Athena query execution.
    OutputCompression Pulumi.AwsNative.SageMaker.ProcessingJobAthenaDatasetDefinitionOutputCompression
    The compression used for Athena query results.
    WorkGroup string
    The name of the workgroup in which the Athena query is being started.
    Catalog string
    The name of the data catalog used in Athena query execution.
    Database string
    The name of the database used in the Athena query execution.
    OutputFormat ProcessingJobAthenaDatasetDefinitionOutputFormat
    The data storage format for Athena query results.
    OutputS3Uri string
    The location in Amazon S3 where Athena query results are stored.
    QueryString string
    The SQL query statements to be executed.
    KmsKeyId string
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data generated from an Athena query execution.
    OutputCompression ProcessingJobAthenaDatasetDefinitionOutputCompression
    The compression used for Athena query results.
    WorkGroup string
    The name of the workgroup in which the Athena query is being started.
    catalog String
    The name of the data catalog used in Athena query execution.
    database String
    The name of the database used in the Athena query execution.
    outputFormat ProcessingJobAthenaDatasetDefinitionOutputFormat
    The data storage format for Athena query results.
    outputS3Uri String
    The location in Amazon S3 where Athena query results are stored.
    queryString String
    The SQL query statements to be executed.
    kmsKeyId String
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data generated from an Athena query execution.
    outputCompression ProcessingJobAthenaDatasetDefinitionOutputCompression
    The compression used for Athena query results.
    workGroup String
    The name of the workgroup in which the Athena query is being started.
    catalog string
    The name of the data catalog used in Athena query execution.
    database string
    The name of the database used in the Athena query execution.
    outputFormat ProcessingJobAthenaDatasetDefinitionOutputFormat
    The data storage format for Athena query results.
    outputS3Uri string
    The location in Amazon S3 where Athena query results are stored.
    queryString string
    The SQL query statements to be executed.
    kmsKeyId string
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data generated from an Athena query execution.
    outputCompression ProcessingJobAthenaDatasetDefinitionOutputCompression
    The compression used for Athena query results.
    workGroup string
    The name of the workgroup in which the Athena query is being started.
    catalog str
    The name of the data catalog used in Athena query execution.
    database str
    The name of the database used in the Athena query execution.
    output_format ProcessingJobAthenaDatasetDefinitionOutputFormat
    The data storage format for Athena query results.
    output_s3_uri str
    The location in Amazon S3 where Athena query results are stored.
    query_string str
    The SQL query statements to be executed.
    kms_key_id str
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data generated from an Athena query execution.
    output_compression ProcessingJobAthenaDatasetDefinitionOutputCompression
    The compression used for Athena query results.
    work_group str
    The name of the workgroup in which the Athena query is being started.
    catalog String
    The name of the data catalog used in Athena query execution.
    database String
    The name of the database used in the Athena query execution.
    outputFormat "PARQUET" | "AVRO" | "ORC" | "JSON" | "TEXTFILE"
    The data storage format for Athena query results.
    outputS3Uri String
    The location in Amazon S3 where Athena query results are stored.
    queryString String
    The SQL query statements to be executed.
    kmsKeyId String
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data generated from an Athena query execution.
    outputCompression "GZIP" | "SNAPPY" | "ZLIB"
    The compression used for Athena query results.
    workGroup String
    The name of the workgroup in which the Athena query is being started.

    ProcessingJobAthenaDatasetDefinitionOutputCompression, ProcessingJobAthenaDatasetDefinitionOutputCompressionArgs

    Gzip
    GZIP
    Snappy
    SNAPPY
    Zlib
    ZLIB
    ProcessingJobAthenaDatasetDefinitionOutputCompressionGzip
    GZIP
    ProcessingJobAthenaDatasetDefinitionOutputCompressionSnappy
    SNAPPY
    ProcessingJobAthenaDatasetDefinitionOutputCompressionZlib
    ZLIB
    Gzip
    GZIP
    Snappy
    SNAPPY
    Zlib
    ZLIB
    Gzip
    GZIP
    Snappy
    SNAPPY
    Zlib
    ZLIB
    GZIP
    GZIP
    SNAPPY
    SNAPPY
    ZLIB
    ZLIB
    "GZIP"
    GZIP
    "SNAPPY"
    SNAPPY
    "ZLIB"
    ZLIB

    ProcessingJobAthenaDatasetDefinitionOutputFormat, ProcessingJobAthenaDatasetDefinitionOutputFormatArgs

    Parquet
    PARQUET
    Avro
    AVRO
    Orc
    ORC
    Json
    JSON
    Textfile
    TEXTFILE
    ProcessingJobAthenaDatasetDefinitionOutputFormatParquet
    PARQUET
    ProcessingJobAthenaDatasetDefinitionOutputFormatAvro
    AVRO
    ProcessingJobAthenaDatasetDefinitionOutputFormatOrc
    ORC
    ProcessingJobAthenaDatasetDefinitionOutputFormatJson
    JSON
    ProcessingJobAthenaDatasetDefinitionOutputFormatTextfile
    TEXTFILE
    Parquet
    PARQUET
    Avro
    AVRO
    Orc
    ORC
    Json
    JSON
    Textfile
    TEXTFILE
    Parquet
    PARQUET
    Avro
    AVRO
    Orc
    ORC
    Json
    JSON
    Textfile
    TEXTFILE
    PARQUET
    PARQUET
    AVRO
    AVRO
    ORC
    ORC
    JSON
    JSON
    TEXTFILE
    TEXTFILE
    "PARQUET"
    PARQUET
    "AVRO"
    AVRO
    "ORC"
    ORC
    "JSON"
    JSON
    "TEXTFILE"
    TEXTFILE
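The two enums above accept only the raw string values listed. A small helper (not part of the provider; shown here only as a sketch) can pre-validate these values before they reach a deployment:

```python
# Raw string values for the Athena output enums, as documented above.
OUTPUT_FORMATS = {"PARQUET", "AVRO", "ORC", "JSON", "TEXTFILE"}
OUTPUT_COMPRESSIONS = {"GZIP", "SNAPPY", "ZLIB"}

def validate_athena_output(output_format, output_compression=None):
    """Raise ValueError unless the values match the documented enum members."""
    if output_format not in OUTPUT_FORMATS:
        raise ValueError(f"unsupported output format: {output_format!r}")
    if output_compression is not None and output_compression not in OUTPUT_COMPRESSIONS:
        raise ValueError(f"unsupported output compression: {output_compression!r}")
```

Each SDK also exposes these values as typed enum members (e.g. `ProcessingJobAthenaDatasetDefinitionOutputFormat.Parquet`), which avoids the need for manual validation.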

    ProcessingJobClusterConfig, ProcessingJobClusterConfigArgs

    InstanceCount int
    The number of ML compute instances to use in the processing job. For distributed processing jobs, specify a value greater than 1. The default value is 1.
    InstanceType Pulumi.AwsNative.SageMaker.ProcessingJobClusterConfigInstanceType
    The ML compute instance type for the processing job.
    VolumeSizeInGb int
    The size of the ML storage volume in gigabytes that you want to provision. You must specify sufficient ML storage for your scenario.
    VolumeKmsKeyId string
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance(s) that run the processing job.
    InstanceCount int
    The number of ML compute instances to use in the processing job. For distributed processing jobs, specify a value greater than 1. The default value is 1.
    InstanceType ProcessingJobClusterConfigInstanceType
    The ML compute instance type for the processing job.
    VolumeSizeInGb int
    The size of the ML storage volume in gigabytes that you want to provision. You must specify sufficient ML storage for your scenario.
    VolumeKmsKeyId string
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance(s) that run the processing job.
    instanceCount Integer
    The number of ML compute instances to use in the processing job. For distributed processing jobs, specify a value greater than 1. The default value is 1.
    instanceType ProcessingJobClusterConfigInstanceType
    The ML compute instance type for the processing job.
    volumeSizeInGb Integer
    The size of the ML storage volume in gigabytes that you want to provision. You must specify sufficient ML storage for your scenario.
    volumeKmsKeyId String
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance(s) that run the processing job.
    instanceCount number
    The number of ML compute instances to use in the processing job. For distributed processing jobs, specify a value greater than 1. The default value is 1.
    instanceType ProcessingJobClusterConfigInstanceType
    The ML compute instance type for the processing job.
    volumeSizeInGb number
    The size of the ML storage volume in gigabytes that you want to provision. You must specify sufficient ML storage for your scenario.
    volumeKmsKeyId string
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance(s) that run the processing job.
    instance_count int
    The number of ML compute instances to use in the processing job. For distributed processing jobs, specify a value greater than 1. The default value is 1.
    instance_type ProcessingJobClusterConfigInstanceType
    The ML compute instance type for the processing job.
    volume_size_in_gb int
    The size of the ML storage volume in gigabytes that you want to provision. You must specify sufficient ML storage for your scenario.
    volume_kms_key_id str
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance(s) that run the processing job.
    instanceCount Number
    The number of ML compute instances to use in the processing job. For distributed processing jobs, specify a value greater than 1. The default value is 1.
    instanceType "ml.t3.medium" | "ml.t3.large" | "ml.t3.xlarge" | "ml.t3.2xlarge" | "ml.m4.xlarge" | "ml.m4.2xlarge" | "ml.m4.4xlarge" | "ml.m4.10xlarge" | "ml.m4.16xlarge" | "ml.c4.xlarge" | "ml.c4.2xlarge" | "ml.c4.4xlarge" | "ml.c4.8xlarge" | "ml.c5.xlarge" | "ml.c5.2xlarge" | "ml.c5.4xlarge" | "ml.c5.9xlarge" | "ml.c5.18xlarge" | "ml.m5.large" | "ml.m5.xlarge" | "ml.m5.2xlarge" | "ml.m5.4xlarge" | "ml.m5.12xlarge" | "ml.m5.24xlarge" | "ml.r5.large" | "ml.r5.xlarge" | "ml.r5.2xlarge" | "ml.r5.4xlarge" | "ml.r5.8xlarge" | "ml.r5.12xlarge" | "ml.r5.16xlarge" | "ml.r5.24xlarge" | "ml.g4dn.xlarge" | "ml.g4dn.2xlarge" | "ml.g4dn.4xlarge" | "ml.g4dn.8xlarge" | "ml.g4dn.12xlarge" | "ml.g4dn.16xlarge" | "ml.g5.xlarge" | "ml.g5.2xlarge" | "ml.g5.4xlarge" | "ml.g5.8xlarge" | "ml.g5.16xlarge" | "ml.g5.12xlarge" | "ml.g5.24xlarge" | "ml.g5.48xlarge" | "ml.r5d.large" | "ml.r5d.xlarge" | "ml.r5d.2xlarge" | "ml.r5d.4xlarge" | "ml.r5d.8xlarge" | "ml.r5d.12xlarge" | "ml.r5d.16xlarge" | "ml.r5d.24xlarge" | "ml.g6.xlarge" | "ml.g6.2xlarge" | "ml.g6.4xlarge" | "ml.g6.8xlarge" | "ml.g6.12xlarge" | "ml.g6.16xlarge" | "ml.g6.24xlarge" | "ml.g6.48xlarge" | "ml.g6e.xlarge" | "ml.g6e.2xlarge" | "ml.g6e.4xlarge" | "ml.g6e.8xlarge" | "ml.g6e.12xlarge" | "ml.g6e.16xlarge" | "ml.g6e.24xlarge" | "ml.g6e.48xlarge" | "ml.m6i.large" | "ml.m6i.xlarge" | "ml.m6i.2xlarge" | "ml.m6i.4xlarge" | "ml.m6i.8xlarge" | "ml.m6i.12xlarge" | "ml.m6i.16xlarge" | "ml.m6i.24xlarge" | "ml.m6i.32xlarge" | "ml.c6i.xlarge" | "ml.c6i.2xlarge" | "ml.c6i.4xlarge" | "ml.c6i.8xlarge" | "ml.c6i.12xlarge" | "ml.c6i.16xlarge" | "ml.c6i.24xlarge" | "ml.c6i.32xlarge" | "ml.m7i.large" | "ml.m7i.xlarge" | "ml.m7i.2xlarge" | "ml.m7i.4xlarge" | "ml.m7i.8xlarge" | "ml.m7i.12xlarge" | "ml.m7i.16xlarge" | "ml.m7i.24xlarge" | "ml.m7i.48xlarge" | "ml.c7i.large" | "ml.c7i.xlarge" | "ml.c7i.2xlarge" | "ml.c7i.4xlarge" | "ml.c7i.8xlarge" | "ml.c7i.12xlarge" | "ml.c7i.16xlarge" | "ml.c7i.24xlarge" | "ml.c7i.48xlarge" | "ml.r7i.large" | "ml.r7i.xlarge" | "ml.r7i.2xlarge" | "ml.r7i.4xlarge" | "ml.r7i.8xlarge" | "ml.r7i.12xlarge" | "ml.r7i.16xlarge" | "ml.r7i.24xlarge" | "ml.r7i.48xlarge"
    The ML compute instance type for the processing job.
    volumeSizeInGb Number
    The size of the ML storage volume in gigabytes that you want to provision. You must specify sufficient ML storage for your scenario.
    volumeKmsKeyId String
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data on the storage volume attached to the ML compute instance(s) that run the processing job.
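Putting the cluster properties together, a minimal sketch in plain Python data form might look like the following; the instance count, type, and volume size are illustrative values, not recommendations:

```python
# An illustrative ClusterConfig in Python (snake_case) casing.
# instance_count > 1 makes the job a distributed processing job.
cluster_config = {
    "instance_count": 2,               # placeholder; default is 1
    "instance_type": "ml.m5.xlarge",   # must be one of the documented instance types
    "volume_size_in_gb": 50,           # placeholder ML storage volume size
    # "volume_kms_key_id": "...",      # optional KMS key for the storage volume
}
```

In the typed SDKs the same values go into `ProcessingJobClusterConfigArgs`, nested under the `processing_resources` argument of the resource constructor shown at the top of this page.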

    ProcessingJobClusterConfigInstanceType, ProcessingJobClusterConfigInstanceTypeArgs

    MlT3Medium
    ml.t3.medium
    MlT3Large
    ml.t3.large
    MlT3Xlarge
    ml.t3.xlarge
    MlT32xlarge
    ml.t3.2xlarge
    MlM4Xlarge
    ml.m4.xlarge
    MlM42xlarge
    ml.m4.2xlarge
    MlM44xlarge
    ml.m4.4xlarge
    MlM410xlarge
    ml.m4.10xlarge
    MlM416xlarge
    ml.m4.16xlarge
    MlC4Xlarge
    ml.c4.xlarge
    MlC42xlarge
    ml.c4.2xlarge
    MlC44xlarge
    ml.c4.4xlarge
    MlC48xlarge
    ml.c4.8xlarge
    MlC5Xlarge
    ml.c5.xlarge
    MlC52xlarge
    ml.c5.2xlarge
    MlC54xlarge
    ml.c5.4xlarge
    MlC59xlarge
    ml.c5.9xlarge
    MlC518xlarge
    ml.c5.18xlarge
    MlM5Large
    ml.m5.large
    MlM5Xlarge
    ml.m5.xlarge
    MlM52xlarge
    ml.m5.2xlarge
    MlM54xlarge
    ml.m5.4xlarge
    MlM512xlarge
    ml.m5.12xlarge
    MlM524xlarge
    ml.m5.24xlarge
    MlR5Large
    ml.r5.large
    MlR5Xlarge
    ml.r5.xlarge
    MlR52xlarge
    ml.r5.2xlarge
    MlR54xlarge
    ml.r5.4xlarge
    MlR58xlarge
    ml.r5.8xlarge
    MlR512xlarge
    ml.r5.12xlarge
    MlR516xlarge
    ml.r5.16xlarge
    MlR524xlarge
    ml.r5.24xlarge
    MlG4dnXlarge
    ml.g4dn.xlarge
    MlG4dn2xlarge
    ml.g4dn.2xlarge
    MlG4dn4xlarge
    ml.g4dn.4xlarge
    MlG4dn8xlarge
    ml.g4dn.8xlarge
    MlG4dn12xlarge
    ml.g4dn.12xlarge
    MlG4dn16xlarge
    ml.g4dn.16xlarge
    MlG5Xlarge
    ml.g5.xlarge
    MlG52xlarge
    ml.g5.2xlarge
    MlG54xlarge
    ml.g5.4xlarge
    MlG58xlarge
    ml.g5.8xlarge
    MlG516xlarge
    ml.g5.16xlarge
    MlG512xlarge
    ml.g5.12xlarge
    MlG524xlarge
    ml.g5.24xlarge
    MlG548xlarge
    ml.g5.48xlarge
    MlR5dLarge
    ml.r5d.large
    MlR5dXlarge
    ml.r5d.xlarge
    MlR5d2xlarge
    ml.r5d.2xlarge
    MlR5d4xlarge
    ml.r5d.4xlarge
    MlR5d8xlarge
    ml.r5d.8xlarge
    MlR5d12xlarge
    ml.r5d.12xlarge
    MlR5d16xlarge
    ml.r5d.16xlarge
    MlR5d24xlarge
    ml.r5d.24xlarge
    MlG6Xlarge
    ml.g6.xlarge
    MlG62xlarge
    ml.g6.2xlarge
    MlG64xlarge
    ml.g6.4xlarge
    MlG68xlarge
    ml.g6.8xlarge
    MlG612xlarge
    ml.g6.12xlarge
    MlG616xlarge
    ml.g6.16xlarge
    MlG624xlarge
    ml.g6.24xlarge
    MlG648xlarge
    ml.g6.48xlarge
    MlG6eXlarge
    ml.g6e.xlarge
    MlG6e2xlarge
    ml.g6e.2xlarge
    MlG6e4xlarge
    ml.g6e.4xlarge
    MlG6e8xlarge
    ml.g6e.8xlarge
    MlG6e12xlarge
    ml.g6e.12xlarge
    MlG6e16xlarge
    ml.g6e.16xlarge
    MlG6e24xlarge
    ml.g6e.24xlarge
    MlG6e48xlarge
    ml.g6e.48xlarge
    MlM6iLarge
    ml.m6i.large
    MlM6iXlarge
    ml.m6i.xlarge
    MlM6i2xlarge
    ml.m6i.2xlarge
    MlM6i4xlarge
    ml.m6i.4xlarge
    MlM6i8xlarge
    ml.m6i.8xlarge
    MlM6i12xlarge
    ml.m6i.12xlarge
    MlM6i16xlarge
    ml.m6i.16xlarge
    MlM6i24xlarge
    ml.m6i.24xlarge
    MlM6i32xlarge
    ml.m6i.32xlarge
    MlC6iXlarge
    ml.c6i.xlarge
    MlC6i2xlarge
    ml.c6i.2xlarge
    MlC6i4xlarge
    ml.c6i.4xlarge
    MlC6i8xlarge
    ml.c6i.8xlarge
    MlC6i12xlarge
    ml.c6i.12xlarge
    MlC6i16xlarge
    ml.c6i.16xlarge
    MlC6i24xlarge
    ml.c6i.24xlarge
    MlC6i32xlarge
    ml.c6i.32xlarge
    MlM7iLarge
    ml.m7i.large
    MlM7iXlarge
    ml.m7i.xlarge
    MlM7i2xlarge
    ml.m7i.2xlarge
    MlM7i4xlarge
    ml.m7i.4xlarge
    MlM7i8xlarge
    ml.m7i.8xlarge
    MlM7i12xlarge
    ml.m7i.12xlarge
    MlM7i16xlarge
    ml.m7i.16xlarge
    MlM7i24xlarge
    ml.m7i.24xlarge
    MlM7i48xlarge
    ml.m7i.48xlarge
    MlC7iLarge
    ml.c7i.large
    MlC7iXlarge
    ml.c7i.xlarge
    MlC7i2xlarge
    ml.c7i.2xlarge
    MlC7i4xlarge
    ml.c7i.4xlarge
    MlC7i8xlarge
    ml.c7i.8xlarge
    MlC7i12xlarge
    ml.c7i.12xlarge
    MlC7i16xlarge
    ml.c7i.16xlarge
    MlC7i24xlarge
    ml.c7i.24xlarge
    MlC7i48xlarge
    ml.c7i.48xlarge
    MlR7iLarge
    ml.r7i.large
    MlR7iXlarge
    ml.r7i.xlarge
    MlR7i2xlarge
    ml.r7i.2xlarge
    MlR7i4xlarge
    ml.r7i.4xlarge
    MlR7i8xlarge
    ml.r7i.8xlarge
    MlR7i12xlarge
    ml.r7i.12xlarge
    MlR7i16xlarge
    ml.r7i.16xlarge
    MlR7i24xlarge
    ml.r7i.24xlarge
    MlR7i48xlarge
    ml.r7i.48xlarge
    ProcessingJobClusterConfigInstanceTypeMlT3Medium
    ml.t3.medium
    ProcessingJobClusterConfigInstanceTypeMlT3Large
    ml.t3.large
    ProcessingJobClusterConfigInstanceTypeMlT3Xlarge
    ml.t3.xlarge
    ProcessingJobClusterConfigInstanceTypeMlT32xlarge
    ml.t3.2xlarge
    ProcessingJobClusterConfigInstanceTypeMlM4Xlarge
    ml.m4.xlarge
    ProcessingJobClusterConfigInstanceTypeMlM42xlarge
    ml.m4.2xlarge
    ProcessingJobClusterConfigInstanceTypeMlM44xlarge
    ml.m4.4xlarge
    ProcessingJobClusterConfigInstanceTypeMlM410xlarge
    ml.m4.10xlarge
    ProcessingJobClusterConfigInstanceTypeMlM416xlarge
    ml.m4.16xlarge
    ProcessingJobClusterConfigInstanceTypeMlC4Xlarge
    ml.c4.xlarge
    ProcessingJobClusterConfigInstanceTypeMlC42xlarge
    ml.c4.2xlarge
    ProcessingJobClusterConfigInstanceTypeMlC44xlarge
    ml.c4.4xlarge
    ProcessingJobClusterConfigInstanceTypeMlC48xlarge
    ml.c4.8xlarge
    ProcessingJobClusterConfigInstanceTypeMlC5Xlarge
    ml.c5.xlarge
    ProcessingJobClusterConfigInstanceTypeMlC52xlarge
    ml.c5.2xlarge
    ProcessingJobClusterConfigInstanceTypeMlC54xlarge
    ml.c5.4xlarge
    ProcessingJobClusterConfigInstanceTypeMlC59xlarge
    ml.c5.9xlarge
    ProcessingJobClusterConfigInstanceTypeMlC518xlarge
    ml.c5.18xlarge
    ProcessingJobClusterConfigInstanceTypeMlM5Large
    ml.m5.large
    ProcessingJobClusterConfigInstanceTypeMlM5Xlarge
    ml.m5.xlarge
    ProcessingJobClusterConfigInstanceTypeMlM52xlarge
    ml.m5.2xlarge
    ProcessingJobClusterConfigInstanceTypeMlM54xlarge
    ml.m5.4xlarge
    ProcessingJobClusterConfigInstanceTypeMlM512xlarge
    ml.m5.12xlarge
    ProcessingJobClusterConfigInstanceTypeMlM524xlarge
    ml.m5.24xlarge
    ProcessingJobClusterConfigInstanceTypeMlR5Large
    ml.r5.large
    ProcessingJobClusterConfigInstanceTypeMlR5Xlarge
    ml.r5.xlarge
    ProcessingJobClusterConfigInstanceTypeMlR52xlarge
    ml.r5.2xlarge
    ProcessingJobClusterConfigInstanceTypeMlR54xlarge
    ml.r5.4xlarge
    ProcessingJobClusterConfigInstanceTypeMlR58xlarge
    ml.r5.8xlarge
    ProcessingJobClusterConfigInstanceTypeMlR512xlarge
    ml.r5.12xlarge
    ProcessingJobClusterConfigInstanceTypeMlR516xlarge
    ml.r5.16xlarge
    ProcessingJobClusterConfigInstanceTypeMlR524xlarge
    ml.r5.24xlarge
    ProcessingJobClusterConfigInstanceTypeMlG4dnXlarge
    ml.g4dn.xlarge
    ProcessingJobClusterConfigInstanceTypeMlG4dn2xlarge
    ml.g4dn.2xlarge
    ProcessingJobClusterConfigInstanceTypeMlG4dn4xlarge
    ml.g4dn.4xlarge
    ProcessingJobClusterConfigInstanceTypeMlG4dn8xlarge
    ml.g4dn.8xlarge
    ProcessingJobClusterConfigInstanceTypeMlG4dn12xlarge
    ml.g4dn.12xlarge
    ProcessingJobClusterConfigInstanceTypeMlG4dn16xlarge
    ml.g4dn.16xlarge
    ProcessingJobClusterConfigInstanceTypeMlG5Xlarge
    ml.g5.xlarge
    ProcessingJobClusterConfigInstanceTypeMlG52xlarge
    ml.g5.2xlarge
    ProcessingJobClusterConfigInstanceTypeMlG54xlarge
    ml.g5.4xlarge
    ProcessingJobClusterConfigInstanceTypeMlG58xlarge
    ml.g5.8xlarge
    ProcessingJobClusterConfigInstanceTypeMlG516xlarge
    ml.g5.16xlarge
    ProcessingJobClusterConfigInstanceTypeMlG512xlarge
    ml.g5.12xlarge
    ProcessingJobClusterConfigInstanceTypeMlG524xlarge
    ml.g5.24xlarge
    ProcessingJobClusterConfigInstanceTypeMlG548xlarge
    ml.g5.48xlarge
    ProcessingJobClusterConfigInstanceTypeMlR5dLarge
    ml.r5d.large
    ProcessingJobClusterConfigInstanceTypeMlR5dXlarge
    ml.r5d.xlarge
    ProcessingJobClusterConfigInstanceTypeMlR5d2xlarge
    ml.r5d.2xlarge
    ProcessingJobClusterConfigInstanceTypeMlR5d4xlarge
    ml.r5d.4xlarge
    ProcessingJobClusterConfigInstanceTypeMlR5d8xlarge
    ml.r5d.8xlarge
    ProcessingJobClusterConfigInstanceTypeMlR5d12xlarge
    ml.r5d.12xlarge
    ProcessingJobClusterConfigInstanceTypeMlR5d16xlarge
    ml.r5d.16xlarge
    ProcessingJobClusterConfigInstanceTypeMlR5d24xlarge
    ml.r5d.24xlarge
    ProcessingJobClusterConfigInstanceTypeMlG6Xlarge
    ml.g6.xlarge
    ProcessingJobClusterConfigInstanceTypeMlG62xlarge
    ml.g6.2xlarge
    ProcessingJobClusterConfigInstanceTypeMlG64xlarge
    ml.g6.4xlarge
    ProcessingJobClusterConfigInstanceTypeMlG68xlarge
    ml.g6.8xlarge
    ProcessingJobClusterConfigInstanceTypeMlG612xlarge
    ml.g6.12xlarge
    ProcessingJobClusterConfigInstanceTypeMlG616xlarge
    ml.g6.16xlarge
    ProcessingJobClusterConfigInstanceTypeMlG624xlarge
    ml.g6.24xlarge
    ProcessingJobClusterConfigInstanceTypeMlG648xlarge
    ml.g6.48xlarge
    ProcessingJobClusterConfigInstanceTypeMlG6eXlarge
    ml.g6e.xlarge
    ProcessingJobClusterConfigInstanceTypeMlG6e2xlarge
    ml.g6e.2xlarge
    ProcessingJobClusterConfigInstanceTypeMlG6e4xlarge
    ml.g6e.4xlarge
    ProcessingJobClusterConfigInstanceTypeMlG6e8xlarge
    ml.g6e.8xlarge
    ProcessingJobClusterConfigInstanceTypeMlG6e12xlarge
    ml.g6e.12xlarge
    ProcessingJobClusterConfigInstanceTypeMlG6e16xlarge
    ml.g6e.16xlarge
    ProcessingJobClusterConfigInstanceTypeMlG6e24xlarge
    ml.g6e.24xlarge
    ProcessingJobClusterConfigInstanceTypeMlG6e48xlarge
    ml.g6e.48xlarge
    ProcessingJobClusterConfigInstanceTypeMlM6iLarge
    ml.m6i.large
    ProcessingJobClusterConfigInstanceTypeMlM6iXlarge
    ml.m6i.xlarge
    ProcessingJobClusterConfigInstanceTypeMlM6i2xlarge
    ml.m6i.2xlarge
    ProcessingJobClusterConfigInstanceTypeMlM6i4xlarge
    ml.m6i.4xlarge
    ProcessingJobClusterConfigInstanceTypeMlM6i8xlarge
    ml.m6i.8xlarge
    ProcessingJobClusterConfigInstanceTypeMlM6i12xlarge
    ml.m6i.12xlarge
    ProcessingJobClusterConfigInstanceTypeMlM6i16xlarge
    ml.m6i.16xlarge
    ProcessingJobClusterConfigInstanceTypeMlM6i24xlarge
    ml.m6i.24xlarge
    ProcessingJobClusterConfigInstanceTypeMlM6i32xlarge
    ml.m6i.32xlarge
    ProcessingJobClusterConfigInstanceTypeMlC6iXlarge
    ml.c6i.xlarge
    ProcessingJobClusterConfigInstanceTypeMlC6i2xlarge
    ml.c6i.2xlarge
    ProcessingJobClusterConfigInstanceTypeMlC6i4xlarge
    ml.c6i.4xlarge
    ProcessingJobClusterConfigInstanceTypeMlC6i8xlarge
    ml.c6i.8xlarge
    ProcessingJobClusterConfigInstanceTypeMlC6i12xlarge
    ml.c6i.12xlarge
    ProcessingJobClusterConfigInstanceTypeMlC6i16xlarge
    ml.c6i.16xlarge
    ProcessingJobClusterConfigInstanceTypeMlC6i24xlarge
    ml.c6i.24xlarge
    ProcessingJobClusterConfigInstanceTypeMlC6i32xlarge
    ml.c6i.32xlarge
    ProcessingJobClusterConfigInstanceTypeMlM7iLarge
    ml.m7i.large
    ProcessingJobClusterConfigInstanceTypeMlM7iXlarge
    ml.m7i.xlarge
    ProcessingJobClusterConfigInstanceTypeMlM7i2xlarge
    ml.m7i.2xlarge
    ProcessingJobClusterConfigInstanceTypeMlM7i4xlarge
    ml.m7i.4xlarge
    ProcessingJobClusterConfigInstanceTypeMlM7i8xlarge
    ml.m7i.8xlarge
    ProcessingJobClusterConfigInstanceTypeMlM7i12xlarge
    ml.m7i.12xlarge
    ProcessingJobClusterConfigInstanceTypeMlM7i16xlarge
    ml.m7i.16xlarge
    ProcessingJobClusterConfigInstanceTypeMlM7i24xlarge
    ml.m7i.24xlarge
    ProcessingJobClusterConfigInstanceTypeMlM7i48xlarge
    ml.m7i.48xlarge
    ProcessingJobClusterConfigInstanceTypeMlC7iLarge
    ml.c7i.large
    ProcessingJobClusterConfigInstanceTypeMlC7iXlarge
    ml.c7i.xlarge
    ProcessingJobClusterConfigInstanceTypeMlC7i2xlarge
    ml.c7i.2xlarge
    ProcessingJobClusterConfigInstanceTypeMlC7i4xlarge
    ml.c7i.4xlarge
    ProcessingJobClusterConfigInstanceTypeMlC7i8xlarge
    ml.c7i.8xlarge
    ProcessingJobClusterConfigInstanceTypeMlC7i12xlarge
    ml.c7i.12xlarge
    ProcessingJobClusterConfigInstanceTypeMlC7i16xlarge
    ml.c7i.16xlarge
    ProcessingJobClusterConfigInstanceTypeMlC7i24xlarge
    ml.c7i.24xlarge
    ProcessingJobClusterConfigInstanceTypeMlC7i48xlarge
    ml.c7i.48xlarge
    ProcessingJobClusterConfigInstanceTypeMlR7iLarge
    ml.r7i.large
    ProcessingJobClusterConfigInstanceTypeMlR7iXlarge
    ml.r7i.xlarge
    ProcessingJobClusterConfigInstanceTypeMlR7i2xlarge
    ml.r7i.2xlarge
    ProcessingJobClusterConfigInstanceTypeMlR7i4xlarge
    ml.r7i.4xlarge
    ProcessingJobClusterConfigInstanceTypeMlR7i8xlarge
    ml.r7i.8xlarge
    ProcessingJobClusterConfigInstanceTypeMlR7i12xlarge
    ml.r7i.12xlarge
    ProcessingJobClusterConfigInstanceTypeMlR7i16xlarge
    ml.r7i.16xlarge
    ProcessingJobClusterConfigInstanceTypeMlR7i24xlarge
    ml.r7i.24xlarge
    ProcessingJobClusterConfigInstanceTypeMlR7i48xlarge
    ml.r7i.48xlarge
    MlT3Medium
    ml.t3.medium
    MlT3Large
    ml.t3.large
    MlT3Xlarge
    ml.t3.xlarge
    MlT32xlarge
    ml.t3.2xlarge
    MlM4Xlarge
    ml.m4.xlarge
    MlM42xlarge
    ml.m4.2xlarge
    MlM44xlarge
    ml.m4.4xlarge
    MlM410xlarge
    ml.m4.10xlarge
    MlM416xlarge
    ml.m4.16xlarge
    MlC4Xlarge
    ml.c4.xlarge
    MlC42xlarge
    ml.c4.2xlarge
    MlC44xlarge
    ml.c4.4xlarge
    MlC48xlarge
    ml.c4.8xlarge
    MlC5Xlarge
    ml.c5.xlarge
    MlC52xlarge
    ml.c5.2xlarge
    MlC54xlarge
    ml.c5.4xlarge
    MlC59xlarge
    ml.c5.9xlarge
    MlC518xlarge
    ml.c5.18xlarge
    MlM5Large
    ml.m5.large
    MlM5Xlarge
    ml.m5.xlarge
    MlM52xlarge
    ml.m5.2xlarge
    MlM54xlarge
    ml.m5.4xlarge
    MlM512xlarge
    ml.m5.12xlarge
    MlM524xlarge
    ml.m5.24xlarge
    MlR5Large
    ml.r5.large
    MlR5Xlarge
    ml.r5.xlarge
    MlR52xlarge
    ml.r5.2xlarge
    MlR54xlarge
    ml.r5.4xlarge
    MlR58xlarge
    ml.r5.8xlarge
    MlR512xlarge
    ml.r5.12xlarge
    MlR516xlarge
    ml.r5.16xlarge
    MlR524xlarge
    ml.r5.24xlarge
    MlG4dnXlarge
    ml.g4dn.xlarge
    MlG4dn2xlarge
    ml.g4dn.2xlarge
    MlG4dn4xlarge
    ml.g4dn.4xlarge
    MlG4dn8xlarge
    ml.g4dn.8xlarge
    MlG4dn12xlarge
    ml.g4dn.12xlarge
    MlG4dn16xlarge
    ml.g4dn.16xlarge
    MlG5Xlarge
    ml.g5.xlarge
    MlG52xlarge
    ml.g5.2xlarge
    MlG54xlarge
    ml.g5.4xlarge
    MlG58xlarge
    ml.g5.8xlarge
    MlG516xlarge
    ml.g5.16xlarge
    MlG512xlarge
    ml.g5.12xlarge
    MlG524xlarge
    ml.g5.24xlarge
    MlG548xlarge
    ml.g5.48xlarge
    MlR5dLarge
    ml.r5d.large
    MlR5dXlarge
    ml.r5d.xlarge
    MlR5d2xlarge
    ml.r5d.2xlarge
    MlR5d4xlarge
    ml.r5d.4xlarge
    MlR5d8xlarge
    ml.r5d.8xlarge
    MlR5d12xlarge
    ml.r5d.12xlarge
    MlR5d16xlarge
    ml.r5d.16xlarge
    MlR5d24xlarge
    ml.r5d.24xlarge
    MlG6Xlarge
    ml.g6.xlarge
    MlG62xlarge
    ml.g6.2xlarge
    MlG64xlarge
    ml.g6.4xlarge
    MlG68xlarge
    ml.g6.8xlarge
    MlG612xlarge
    ml.g6.12xlarge
    MlG616xlarge
    ml.g6.16xlarge
    MlG624xlarge
    ml.g6.24xlarge
    MlG648xlarge
    ml.g6.48xlarge
    MlG6eXlarge
    ml.g6e.xlarge
    MlG6e2xlarge
    ml.g6e.2xlarge
    MlG6e4xlarge
    ml.g6e.4xlarge
    MlG6e8xlarge
    ml.g6e.8xlarge
    MlG6e12xlarge
    ml.g6e.12xlarge
    MlG6e16xlarge
    ml.g6e.16xlarge
    MlG6e24xlarge
    ml.g6e.24xlarge
    MlG6e48xlarge
    ml.g6e.48xlarge
    MlM6iLarge
    ml.m6i.large
    MlM6iXlarge
    ml.m6i.xlarge
    MlM6i2xlarge
    ml.m6i.2xlarge
    MlM6i4xlarge
    ml.m6i.4xlarge
    MlM6i8xlarge
    ml.m6i.8xlarge
    MlM6i12xlarge
    ml.m6i.12xlarge
    MlM6i16xlarge
    ml.m6i.16xlarge
    MlM6i24xlarge
    ml.m6i.24xlarge
    MlM6i32xlarge
    ml.m6i.32xlarge
    MlC6iXlarge
    ml.c6i.xlarge
    MlC6i2xlarge
    ml.c6i.2xlarge
    MlC6i4xlarge
    ml.c6i.4xlarge
    MlC6i8xlarge
    ml.c6i.8xlarge
    MlC6i12xlarge
    ml.c6i.12xlarge
    MlC6i16xlarge
    ml.c6i.16xlarge
    MlC6i24xlarge
    ml.c6i.24xlarge
    MlC6i32xlarge
    ml.c6i.32xlarge
    MlM7iLarge
    ml.m7i.large
    MlM7iXlarge
    ml.m7i.xlarge
    MlM7i2xlarge
    ml.m7i.2xlarge
    MlM7i4xlarge
    ml.m7i.4xlarge
    MlM7i8xlarge
    ml.m7i.8xlarge
    MlM7i12xlarge
    ml.m7i.12xlarge
    MlM7i16xlarge
    ml.m7i.16xlarge
    MlM7i24xlarge
    ml.m7i.24xlarge
    MlM7i48xlarge
    ml.m7i.48xlarge
    MlC7iLarge
    ml.c7i.large
    MlC7iXlarge
    ml.c7i.xlarge
    MlC7i2xlarge
    ml.c7i.2xlarge
    MlC7i4xlarge
    ml.c7i.4xlarge
    MlC7i8xlarge
    ml.c7i.8xlarge
    MlC7i12xlarge
    ml.c7i.12xlarge
    MlC7i16xlarge
    ml.c7i.16xlarge
    MlC7i24xlarge
    ml.c7i.24xlarge
    MlC7i48xlarge
    ml.c7i.48xlarge
    MlR7iLarge
    ml.r7i.large
    MlR7iXlarge
    ml.r7i.xlarge
    MlR7i2xlarge
    ml.r7i.2xlarge
    MlR7i4xlarge
    ml.r7i.4xlarge
    MlR7i8xlarge
    ml.r7i.8xlarge
    MlR7i12xlarge
    ml.r7i.12xlarge
    MlR7i16xlarge
    ml.r7i.16xlarge
    MlR7i24xlarge
    ml.r7i.24xlarge
    MlR7i48xlarge
    ml.r7i.48xlarge
    MlT3Medium
    ml.t3.medium
    MlT3Large
    ml.t3.large
    MlT3Xlarge
    ml.t3.xlarge
    MlT32xlarge
    ml.t3.2xlarge
    MlM4Xlarge
    ml.m4.xlarge
    MlM42xlarge
    ml.m4.2xlarge
    MlM44xlarge
    ml.m4.4xlarge
    MlM410xlarge
    ml.m4.10xlarge
    MlM416xlarge
    ml.m4.16xlarge
    MlC4Xlarge
    ml.c4.xlarge
    MlC42xlarge
    ml.c4.2xlarge
    MlC44xlarge
    ml.c4.4xlarge
    MlC48xlarge
    ml.c4.8xlarge
    MlC5Xlarge
    ml.c5.xlarge
    MlC52xlarge
    ml.c5.2xlarge
    MlC54xlarge
    ml.c5.4xlarge
    MlC59xlarge
    ml.c5.9xlarge
    MlC518xlarge
    ml.c5.18xlarge
    MlM5Large
    ml.m5.large
    MlM5Xlarge
    ml.m5.xlarge
    MlM52xlarge
    ml.m5.2xlarge
    MlM54xlarge
    ml.m5.4xlarge
    MlM512xlarge
    ml.m5.12xlarge
    MlM524xlarge
    ml.m5.24xlarge
    MlR5Large
    ml.r5.large
    MlR5Xlarge
    ml.r5.xlarge
    MlR52xlarge
    ml.r5.2xlarge
    MlR54xlarge
    ml.r5.4xlarge
    MlR58xlarge
    ml.r5.8xlarge
    MlR512xlarge
    ml.r5.12xlarge
    MlR516xlarge
    ml.r5.16xlarge
    MlR524xlarge
    ml.r5.24xlarge
    MlG4dnXlarge
    ml.g4dn.xlarge
    MlG4dn2xlarge
    ml.g4dn.2xlarge
    MlG4dn4xlarge
    ml.g4dn.4xlarge
    MlG4dn8xlarge
    ml.g4dn.8xlarge
    MlG4dn12xlarge
    ml.g4dn.12xlarge
    MlG4dn16xlarge
    ml.g4dn.16xlarge
    MlG5Xlarge
    ml.g5.xlarge
    MlG52xlarge
    ml.g5.2xlarge
    MlG54xlarge
    ml.g5.4xlarge
    MlG58xlarge
    ml.g5.8xlarge
    MlG516xlarge
    ml.g5.16xlarge
    MlG512xlarge
    ml.g5.12xlarge
    MlG524xlarge
    ml.g5.24xlarge
    MlG548xlarge
    ml.g5.48xlarge
    MlR5dLarge
    ml.r5d.large
    MlR5dXlarge
    ml.r5d.xlarge
    MlR5d2xlarge
    ml.r5d.2xlarge
    MlR5d4xlarge
    ml.r5d.4xlarge
    MlR5d8xlarge
    ml.r5d.8xlarge
    MlR5d12xlarge
    ml.r5d.12xlarge
    MlR5d16xlarge
    ml.r5d.16xlarge
    MlR5d24xlarge
    ml.r5d.24xlarge
    MlG6Xlarge
    ml.g6.xlarge
    MlG62xlarge
    ml.g6.2xlarge
    MlG64xlarge
    ml.g6.4xlarge
    MlG68xlarge
    ml.g6.8xlarge
    MlG612xlarge
    ml.g6.12xlarge
    MlG616xlarge
    ml.g6.16xlarge
    MlG624xlarge
    ml.g6.24xlarge
    MlG648xlarge
    ml.g6.48xlarge
    MlG6eXlarge
    ml.g6e.xlarge
    MlG6e2xlarge
    ml.g6e.2xlarge
    MlG6e4xlarge
    ml.g6e.4xlarge
    MlG6e8xlarge
    ml.g6e.8xlarge
    MlG6e12xlarge
    ml.g6e.12xlarge
    MlG6e16xlarge
    ml.g6e.16xlarge
    MlG6e24xlarge
    ml.g6e.24xlarge
    MlG6e48xlarge
    ml.g6e.48xlarge
    MlM6iLarge
    ml.m6i.large
    MlM6iXlarge
    ml.m6i.xlarge
    MlM6i2xlarge
    ml.m6i.2xlarge
    MlM6i4xlarge
    ml.m6i.4xlarge
    MlM6i8xlarge
    ml.m6i.8xlarge
    MlM6i12xlarge
    ml.m6i.12xlarge
    MlM6i16xlarge
    ml.m6i.16xlarge
    MlM6i24xlarge
    ml.m6i.24xlarge
    MlM6i32xlarge
    ml.m6i.32xlarge
    MlC6iXlarge
    ml.c6i.xlarge
    MlC6i2xlarge
    ml.c6i.2xlarge
    MlC6i4xlarge
    ml.c6i.4xlarge
    MlC6i8xlarge
    ml.c6i.8xlarge
    MlC6i12xlarge
    ml.c6i.12xlarge
    MlC6i16xlarge
    ml.c6i.16xlarge
    MlC6i24xlarge
    ml.c6i.24xlarge
    MlC6i32xlarge
    ml.c6i.32xlarge
    MlM7iLarge
    ml.m7i.large
    MlM7iXlarge
    ml.m7i.xlarge
    MlM7i2xlarge
    ml.m7i.2xlarge
    MlM7i4xlarge
    ml.m7i.4xlarge
    MlM7i8xlarge
    ml.m7i.8xlarge
    MlM7i12xlarge
    ml.m7i.12xlarge
    MlM7i16xlarge
    ml.m7i.16xlarge
    MlM7i24xlarge
    ml.m7i.24xlarge
    MlM7i48xlarge
    ml.m7i.48xlarge
    MlC7iLarge
    ml.c7i.large
    MlC7iXlarge
    ml.c7i.xlarge
    MlC7i2xlarge
    ml.c7i.2xlarge
    MlC7i4xlarge
    ml.c7i.4xlarge
    MlC7i8xlarge
    ml.c7i.8xlarge
    MlC7i12xlarge
    ml.c7i.12xlarge
    MlC7i16xlarge
    ml.c7i.16xlarge
    MlC7i24xlarge
    ml.c7i.24xlarge
    MlC7i48xlarge
    ml.c7i.48xlarge
    MlR7iLarge
    ml.r7i.large
    MlR7iXlarge
    ml.r7i.xlarge
    MlR7i2xlarge
    ml.r7i.2xlarge
    MlR7i4xlarge
    ml.r7i.4xlarge
    MlR7i8xlarge
    ml.r7i.8xlarge
    MlR7i12xlarge
    ml.r7i.12xlarge
    MlR7i16xlarge
    ml.r7i.16xlarge
    MlR7i24xlarge
    ml.r7i.24xlarge
    MlR7i48xlarge
    ml.r7i.48xlarge
    ML_T3_MEDIUM
    ml.t3.medium
    ML_T3_LARGE
    ml.t3.large
    ML_T3_XLARGE
    ml.t3.xlarge
    ML_T32XLARGE
    ml.t3.2xlarge
    ML_M4_XLARGE
    ml.m4.xlarge
    ML_M42XLARGE
    ml.m4.2xlarge
    ML_M44XLARGE
    ml.m4.4xlarge
    ML_M410XLARGE
    ml.m4.10xlarge
    ML_M416XLARGE
    ml.m4.16xlarge
    ML_C4_XLARGE
    ml.c4.xlarge
    ML_C42XLARGE
    ml.c4.2xlarge
    ML_C44XLARGE
    ml.c4.4xlarge
    ML_C48XLARGE
    ml.c4.8xlarge
    ML_C5_XLARGE
    ml.c5.xlarge
    ML_C52XLARGE
    ml.c5.2xlarge
    ML_C54XLARGE
    ml.c5.4xlarge
    ML_C59XLARGE
    ml.c5.9xlarge
    ML_C518XLARGE
    ml.c5.18xlarge
    ML_M5_LARGE
    ml.m5.large
    ML_M5_XLARGE
    ml.m5.xlarge
    ML_M52XLARGE
    ml.m5.2xlarge
    ML_M54XLARGE
    ml.m5.4xlarge
    ML_M512XLARGE
    ml.m5.12xlarge
    ML_M524XLARGE
    ml.m5.24xlarge
    ML_R5_LARGE
    ml.r5.large
    ML_R5_XLARGE
    ml.r5.xlarge
    ML_R52XLARGE
    ml.r5.2xlarge
    ML_R54XLARGE
    ml.r5.4xlarge
    ML_R58XLARGE
    ml.r5.8xlarge
    ML_R512XLARGE
    ml.r5.12xlarge
    ML_R516XLARGE
    ml.r5.16xlarge
    ML_R524XLARGE
    ml.r5.24xlarge
    ML_G4DN_XLARGE
    ml.g4dn.xlarge
    ML_G4DN2XLARGE
    ml.g4dn.2xlarge
    ML_G4DN4XLARGE
    ml.g4dn.4xlarge
    ML_G4DN8XLARGE
    ml.g4dn.8xlarge
    ML_G4DN12XLARGE
    ml.g4dn.12xlarge
    ML_G4DN16XLARGE
    ml.g4dn.16xlarge
    ML_G5_XLARGE
    ml.g5.xlarge
    ML_G52XLARGE
    ml.g5.2xlarge
    ML_G54XLARGE
    ml.g5.4xlarge
    ML_G58XLARGE
    ml.g5.8xlarge
    ML_G516XLARGE
    ml.g5.16xlarge
    ML_G512XLARGE
    ml.g5.12xlarge
    ML_G524XLARGE
    ml.g5.24xlarge
    ML_G548XLARGE
    ml.g5.48xlarge
    ML_R5D_LARGE
    ml.r5d.large
    ML_R5D_XLARGE
    ml.r5d.xlarge
    ML_R5D2XLARGE
    ml.r5d.2xlarge
    ML_R5D4XLARGE
    ml.r5d.4xlarge
    ML_R5D8XLARGE
    ml.r5d.8xlarge
    ML_R5D12XLARGE
    ml.r5d.12xlarge
    ML_R5D16XLARGE
    ml.r5d.16xlarge
    ML_R5D24XLARGE
    ml.r5d.24xlarge
    ML_G6_XLARGE
    ml.g6.xlarge
    ML_G62XLARGE
    ml.g6.2xlarge
    ML_G64XLARGE
    ml.g6.4xlarge
    ML_G68XLARGE
    ml.g6.8xlarge
    ML_G612XLARGE
    ml.g6.12xlarge
    ML_G616XLARGE
    ml.g6.16xlarge
    ML_G624XLARGE
    ml.g6.24xlarge
    ML_G648XLARGE
    ml.g6.48xlarge
    ML_G6E_XLARGE
    ml.g6e.xlarge
    ML_G6E2XLARGE
    ml.g6e.2xlarge
    ML_G6E4XLARGE
    ml.g6e.4xlarge
    ML_G6E8XLARGE
    ml.g6e.8xlarge
    ML_G6E12XLARGE
    ml.g6e.12xlarge
    ML_G6E16XLARGE
    ml.g6e.16xlarge
    ML_G6E24XLARGE
    ml.g6e.24xlarge
    ML_G6E48XLARGE
    ml.g6e.48xlarge
    ML_M6I_LARGE
    ml.m6i.large
    ML_M6I_XLARGE
    ml.m6i.xlarge
    ML_M6I2XLARGE
    ml.m6i.2xlarge
    ML_M6I4XLARGE
    ml.m6i.4xlarge
    ML_M6I8XLARGE
    ml.m6i.8xlarge
    ML_M6I12XLARGE
    ml.m6i.12xlarge
    ML_M6I16XLARGE
    ml.m6i.16xlarge
    ML_M6I24XLARGE
    ml.m6i.24xlarge
    ML_M6I32XLARGE
    ml.m6i.32xlarge
    ML_C6I_XLARGE
    ml.c6i.xlarge
    ML_C6I2XLARGE
    ml.c6i.2xlarge
    ML_C6I4XLARGE
    ml.c6i.4xlarge
    ML_C6I8XLARGE
    ml.c6i.8xlarge
    ML_C6I12XLARGE
    ml.c6i.12xlarge
    ML_C6I16XLARGE
    ml.c6i.16xlarge
    ML_C6I24XLARGE
    ml.c6i.24xlarge
    ML_C6I32XLARGE
    ml.c6i.32xlarge
    ML_M7I_LARGE
    ml.m7i.large
    ML_M7I_XLARGE
    ml.m7i.xlarge
    ML_M7I2XLARGE
    ml.m7i.2xlarge
    ML_M7I4XLARGE
    ml.m7i.4xlarge
    ML_M7I8XLARGE
    ml.m7i.8xlarge
    ML_M7I12XLARGE
    ml.m7i.12xlarge
    ML_M7I16XLARGE
    ml.m7i.16xlarge
    ML_M7I24XLARGE
    ml.m7i.24xlarge
    ML_M7I48XLARGE
    ml.m7i.48xlarge
    ML_C7I_LARGE
    ml.c7i.large
    ML_C7I_XLARGE
    ml.c7i.xlarge
    ML_C7I2XLARGE
    ml.c7i.2xlarge
    ML_C7I4XLARGE
    ml.c7i.4xlarge
    ML_C7I8XLARGE
    ml.c7i.8xlarge
    ML_C7I12XLARGE
    ml.c7i.12xlarge
    ML_C7I16XLARGE
    ml.c7i.16xlarge
    ML_C7I24XLARGE
    ml.c7i.24xlarge
    ML_C7I48XLARGE
    ml.c7i.48xlarge
    ML_R7I_LARGE
    ml.r7i.large
    ML_R7I_XLARGE
    ml.r7i.xlarge
    ML_R7I2XLARGE
    ml.r7i.2xlarge
    ML_R7I4XLARGE
    ml.r7i.4xlarge
    ML_R7I8XLARGE
    ml.r7i.8xlarge
    ML_R7I12XLARGE
    ml.r7i.12xlarge
    ML_R7I16XLARGE
    ml.r7i.16xlarge
    ML_R7I24XLARGE
    ml.r7i.24xlarge
    ML_R7I48XLARGE
    ml.r7i.48xlarge
    "ml.t3.medium"
    ml.t3.medium
    "ml.t3.large"
    ml.t3.large
    "ml.t3.xlarge"
    ml.t3.xlarge
    "ml.t3.2xlarge"
    ml.t3.2xlarge
    "ml.m4.xlarge"
    ml.m4.xlarge
    "ml.m4.2xlarge"
    ml.m4.2xlarge
    "ml.m4.4xlarge"
    ml.m4.4xlarge
    "ml.m4.10xlarge"
    ml.m4.10xlarge
    "ml.m4.16xlarge"
    ml.m4.16xlarge
    "ml.c4.xlarge"
    ml.c4.xlarge
    "ml.c4.2xlarge"
    ml.c4.2xlarge
    "ml.c4.4xlarge"
    ml.c4.4xlarge
    "ml.c4.8xlarge"
    ml.c4.8xlarge
    "ml.c5.xlarge"
    ml.c5.xlarge
    "ml.c5.2xlarge"
    ml.c5.2xlarge
    "ml.c5.4xlarge"
    ml.c5.4xlarge
    "ml.c5.9xlarge"
    ml.c5.9xlarge
    "ml.c5.18xlarge"
    ml.c5.18xlarge
    "ml.m5.large"
    ml.m5.large
    "ml.m5.xlarge"
    ml.m5.xlarge
    "ml.m5.2xlarge"
    ml.m5.2xlarge
    "ml.m5.4xlarge"
    ml.m5.4xlarge
    "ml.m5.12xlarge"
    ml.m5.12xlarge
    "ml.m5.24xlarge"
    ml.m5.24xlarge
    "ml.r5.large"
    ml.r5.large
    "ml.r5.xlarge"
    ml.r5.xlarge
    "ml.r5.2xlarge"
    ml.r5.2xlarge
    "ml.r5.4xlarge"
    ml.r5.4xlarge
    "ml.r5.8xlarge"
    ml.r5.8xlarge
    "ml.r5.12xlarge"
    ml.r5.12xlarge
    "ml.r5.16xlarge"
    ml.r5.16xlarge
    "ml.r5.24xlarge"
    ml.r5.24xlarge
    "ml.g4dn.xlarge"
    ml.g4dn.xlarge
    "ml.g4dn.2xlarge"
    ml.g4dn.2xlarge
    "ml.g4dn.4xlarge"
    ml.g4dn.4xlarge
    "ml.g4dn.8xlarge"
    ml.g4dn.8xlarge
    "ml.g4dn.12xlarge"
    ml.g4dn.12xlarge
    "ml.g4dn.16xlarge"
    ml.g4dn.16xlarge
    "ml.g5.xlarge"
    ml.g5.xlarge
    "ml.g5.2xlarge"
    ml.g5.2xlarge
    "ml.g5.4xlarge"
    ml.g5.4xlarge
    "ml.g5.8xlarge"
    ml.g5.8xlarge
    "ml.g5.16xlarge"
    ml.g5.16xlarge
    "ml.g5.12xlarge"
    ml.g5.12xlarge
    "ml.g5.24xlarge"
    ml.g5.24xlarge
    "ml.g5.48xlarge"
    ml.g5.48xlarge
    "ml.r5d.large"
    ml.r5d.large
    "ml.r5d.xlarge"
    ml.r5d.xlarge
    "ml.r5d.2xlarge"
    ml.r5d.2xlarge
    "ml.r5d.4xlarge"
    ml.r5d.4xlarge
    "ml.r5d.8xlarge"
    ml.r5d.8xlarge
    "ml.r5d.12xlarge"
    ml.r5d.12xlarge
    "ml.r5d.16xlarge"
    ml.r5d.16xlarge
    "ml.r5d.24xlarge"
    ml.r5d.24xlarge
    "ml.g6.xlarge"
    ml.g6.xlarge
    "ml.g6.2xlarge"
    ml.g6.2xlarge
    "ml.g6.4xlarge"
    ml.g6.4xlarge
    "ml.g6.8xlarge"
    ml.g6.8xlarge
    "ml.g6.12xlarge"
    ml.g6.12xlarge
    "ml.g6.16xlarge"
    ml.g6.16xlarge
    "ml.g6.24xlarge"
    ml.g6.24xlarge
    "ml.g6.48xlarge"
    ml.g6.48xlarge
    "ml.g6e.xlarge"
    ml.g6e.xlarge
    "ml.g6e.2xlarge"
    ml.g6e.2xlarge
    "ml.g6e.4xlarge"
    ml.g6e.4xlarge
    "ml.g6e.8xlarge"
    ml.g6e.8xlarge
    "ml.g6e.12xlarge"
    ml.g6e.12xlarge
    "ml.g6e.16xlarge"
    ml.g6e.16xlarge
    "ml.g6e.24xlarge"
    ml.g6e.24xlarge
    "ml.g6e.48xlarge"
    ml.g6e.48xlarge
    "ml.m6i.large"
    ml.m6i.large
    "ml.m6i.xlarge"
    ml.m6i.xlarge
    "ml.m6i.2xlarge"
    ml.m6i.2xlarge
    "ml.m6i.4xlarge"
    ml.m6i.4xlarge
    "ml.m6i.8xlarge"
    ml.m6i.8xlarge
    "ml.m6i.12xlarge"
    ml.m6i.12xlarge
    "ml.m6i.16xlarge"
    ml.m6i.16xlarge
    "ml.m6i.24xlarge"
    ml.m6i.24xlarge
    "ml.m6i.32xlarge"
    ml.m6i.32xlarge
    "ml.c6i.xlarge"
    ml.c6i.xlarge
    "ml.c6i.2xlarge"
    ml.c6i.2xlarge
    "ml.c6i.4xlarge"
    ml.c6i.4xlarge
    "ml.c6i.8xlarge"
    ml.c6i.8xlarge
    "ml.c6i.12xlarge"
    ml.c6i.12xlarge
    "ml.c6i.16xlarge"
    ml.c6i.16xlarge
    "ml.c6i.24xlarge"
    ml.c6i.24xlarge
    "ml.c6i.32xlarge"
    ml.c6i.32xlarge
    "ml.m7i.large"
    ml.m7i.large
    "ml.m7i.xlarge"
    ml.m7i.xlarge
    "ml.m7i.2xlarge"
    ml.m7i.2xlarge
    "ml.m7i.4xlarge"
    ml.m7i.4xlarge
    "ml.m7i.8xlarge"
    ml.m7i.8xlarge
    "ml.m7i.12xlarge"
    ml.m7i.12xlarge
    "ml.m7i.16xlarge"
    ml.m7i.16xlarge
    "ml.m7i.24xlarge"
    ml.m7i.24xlarge
    "ml.m7i.48xlarge"
    ml.m7i.48xlarge
    "ml.c7i.large"
    ml.c7i.large
    "ml.c7i.xlarge"
    ml.c7i.xlarge
    "ml.c7i.2xlarge"
    ml.c7i.2xlarge
    "ml.c7i.4xlarge"
    ml.c7i.4xlarge
    "ml.c7i.8xlarge"
    ml.c7i.8xlarge
    "ml.c7i.12xlarge"
    ml.c7i.12xlarge
    "ml.c7i.16xlarge"
    ml.c7i.16xlarge
    "ml.c7i.24xlarge"
    ml.c7i.24xlarge
    "ml.c7i.48xlarge"
    ml.c7i.48xlarge
    "ml.r7i.large"
    ml.r7i.large
    "ml.r7i.xlarge"
    ml.r7i.xlarge
    "ml.r7i.2xlarge"
    ml.r7i.2xlarge
    "ml.r7i.4xlarge"
    ml.r7i.4xlarge
    "ml.r7i.8xlarge"
    ml.r7i.8xlarge
    "ml.r7i.12xlarge"
    ml.r7i.12xlarge
    "ml.r7i.16xlarge"
    ml.r7i.16xlarge
    "ml.r7i.24xlarge"
    ml.r7i.24xlarge
    "ml.r7i.48xlarge"
    ml.r7i.48xlarge

    ProcessingJobDatasetDefinition, ProcessingJobDatasetDefinitionArgs

    AthenaDatasetDefinition Pulumi.AwsNative.SageMaker.Inputs.ProcessingJobAthenaDatasetDefinition
    Configuration for Athena Dataset Definition input.
    DataDistributionType Pulumi.AwsNative.SageMaker.ProcessingJobDatasetDefinitionDataDistributionType
    Whether the generated dataset is FullyReplicated or ShardedByS3Key (default).
    InputMode Pulumi.AwsNative.SageMaker.ProcessingJobDatasetDefinitionInputMode
    Whether to use File or Pipe input mode. In File (default) mode, Amazon SageMaker copies the data from the input source onto the local Amazon Elastic Block Store (Amazon EBS) volumes before starting your training algorithm. This is the most commonly used input mode. In Pipe mode, Amazon SageMaker streams input data from the source directly to your algorithm without using the EBS volume.
    LocalPath string
    The local path where you want Amazon SageMaker to download the Dataset Definition inputs to run a processing job. LocalPath is an absolute path to the input data. This is a required parameter when AppManaged is False (default).
    RedshiftDatasetDefinition Pulumi.AwsNative.SageMaker.Inputs.ProcessingJobRedshiftDatasetDefinition
    Configuration for Redshift Dataset Definition input.
    AthenaDatasetDefinition ProcessingJobAthenaDatasetDefinition
    Configuration for Athena Dataset Definition input.
    DataDistributionType ProcessingJobDatasetDefinitionDataDistributionType
    Whether the generated dataset is FullyReplicated or ShardedByS3Key (default).
    InputMode ProcessingJobDatasetDefinitionInputMode
    Whether to use File or Pipe input mode. In File (default) mode, Amazon SageMaker copies the data from the input source onto the local Amazon Elastic Block Store (Amazon EBS) volumes before starting your training algorithm. This is the most commonly used input mode. In Pipe mode, Amazon SageMaker streams input data from the source directly to your algorithm without using the EBS volume.
    LocalPath string
    The local path where you want Amazon SageMaker to download the Dataset Definition inputs to run a processing job. LocalPath is an absolute path to the input data. This is a required parameter when AppManaged is False (default).
    RedshiftDatasetDefinition ProcessingJobRedshiftDatasetDefinition
    Configuration for Redshift Dataset Definition input.
    athenaDatasetDefinition ProcessingJobAthenaDatasetDefinition
    Configuration for Athena Dataset Definition input.
    dataDistributionType ProcessingJobDatasetDefinitionDataDistributionType
    Whether the generated dataset is FullyReplicated or ShardedByS3Key (default).
    inputMode ProcessingJobDatasetDefinitionInputMode
    Whether to use File or Pipe input mode. In File (default) mode, Amazon SageMaker copies the data from the input source onto the local Amazon Elastic Block Store (Amazon EBS) volumes before starting your training algorithm. This is the most commonly used input mode. In Pipe mode, Amazon SageMaker streams input data from the source directly to your algorithm without using the EBS volume.
    localPath String
    The local path where you want Amazon SageMaker to download the Dataset Definition inputs to run a processing job. LocalPath is an absolute path to the input data. This is a required parameter when AppManaged is False (default).
    redshiftDatasetDefinition ProcessingJobRedshiftDatasetDefinition
    Configuration for Redshift Dataset Definition input.
    athenaDatasetDefinition ProcessingJobAthenaDatasetDefinition
    Configuration for Athena Dataset Definition input.
    dataDistributionType ProcessingJobDatasetDefinitionDataDistributionType
    Whether the generated dataset is FullyReplicated or ShardedByS3Key (default).
    inputMode ProcessingJobDatasetDefinitionInputMode
    Whether to use File or Pipe input mode. In File (default) mode, Amazon SageMaker copies the data from the input source onto the local Amazon Elastic Block Store (Amazon EBS) volumes before starting your training algorithm. This is the most commonly used input mode. In Pipe mode, Amazon SageMaker streams input data from the source directly to your algorithm without using the EBS volume.
    localPath string
    The local path where you want Amazon SageMaker to download the Dataset Definition inputs to run a processing job. LocalPath is an absolute path to the input data. This is a required parameter when AppManaged is False (default).
    redshiftDatasetDefinition ProcessingJobRedshiftDatasetDefinition
    Configuration for Redshift Dataset Definition input.
    athena_dataset_definition ProcessingJobAthenaDatasetDefinition
    Configuration for Athena Dataset Definition input.
    data_distribution_type ProcessingJobDatasetDefinitionDataDistributionType
    Whether the generated dataset is FullyReplicated or ShardedByS3Key (default).
    input_mode ProcessingJobDatasetDefinitionInputMode
    Whether to use File or Pipe input mode. In File (default) mode, Amazon SageMaker copies the data from the input source onto the local Amazon Elastic Block Store (Amazon EBS) volumes before starting your training algorithm. This is the most commonly used input mode. In Pipe mode, Amazon SageMaker streams input data from the source directly to your algorithm without using the EBS volume.
    local_path str
    The local path where you want Amazon SageMaker to download the Dataset Definition inputs to run a processing job. LocalPath is an absolute path to the input data. This is a required parameter when AppManaged is False (default).
    redshift_dataset_definition ProcessingJobRedshiftDatasetDefinition
    Configuration for Redshift Dataset Definition input.
    athenaDatasetDefinition Property Map
    Configuration for Athena Dataset Definition input.
    dataDistributionType "FullyReplicated" | "ShardedByS3Key"
    Whether the generated dataset is FullyReplicated or ShardedByS3Key (default).
    inputMode "File" | "Pipe"
    Whether to use File or Pipe input mode. In File (default) mode, Amazon SageMaker copies the data from the input source onto the local Amazon Elastic Block Store (Amazon EBS) volumes before starting your training algorithm. This is the most commonly used input mode. In Pipe mode, Amazon SageMaker streams input data from the source directly to your algorithm without using the EBS volume.
    localPath String
    The local path where you want Amazon SageMaker to download the Dataset Definition inputs to run a processing job. LocalPath is an absolute path to the input data. This is a required parameter when AppManaged is False (default).
    redshiftDatasetDefinition Property Map
    Configuration for Redshift Dataset Definition input.
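As a sketch of how the properties above fit together in the Python SDK, a Dataset Definition backed by an Athena query might be declared as follows. The catalog, database, query, and S3 URI are hypothetical placeholders, and the Athena field names are assumptions based on the underlying SageMaker CreateProcessingJob API; set exactly one of athena_dataset_definition or redshift_dataset_definition.

```python
import pulumi_aws_native as aws_native

# Hypothetical Athena-backed dataset definition; all names and URIs below
# are placeholders, not values from this reference.
dataset_definition = aws_native.sagemaker.ProcessingJobDatasetDefinitionArgs(
    # Required when AppManaged is False (the default): where SageMaker
    # downloads the generated dataset inside the processing container.
    local_path="/opt/ml/processing/input/athena",
    data_distribution_type="ShardedByS3Key",  # or "FullyReplicated"
    input_mode="File",  # or "Pipe" to stream directly from the source
    athena_dataset_definition=aws_native.sagemaker.ProcessingJobAthenaDatasetDefinitionArgs(
        catalog="AwsDataCatalog",
        database="my_database",
        query_string="SELECT * FROM my_table",
        output_s3_uri="s3://my-bucket/athena-output/",
        output_format="PARQUET",
    ),
)
```

Because a Dataset Definition input must name exactly one source, supplying both the Athena and Redshift members is invalid.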

    ProcessingJobDatasetDefinitionDataDistributionType, ProcessingJobDatasetDefinitionDataDistributionTypeArgs

    FullyReplicated
    FullyReplicated
    ShardedByS3Key
    ShardedByS3Key
    ProcessingJobDatasetDefinitionDataDistributionTypeFullyReplicated
    FullyReplicated
    ProcessingJobDatasetDefinitionDataDistributionTypeShardedByS3Key
    ShardedByS3Key
    FullyReplicated
    FullyReplicated
    ShardedByS3Key
    ShardedByS3Key
    FullyReplicated
    FullyReplicated
    ShardedByS3Key
    ShardedByS3Key
    FULLY_REPLICATED
    FullyReplicated
    SHARDED_BY_S3_KEY
    ShardedByS3Key
    "FullyReplicated"
    FullyReplicated
    "ShardedByS3Key"
    ShardedByS3Key

    ProcessingJobDatasetDefinitionInputMode, ProcessingJobDatasetDefinitionInputModeArgs

    File
    File
    Pipe
    Pipe
    ProcessingJobDatasetDefinitionInputModeFile
    File
    ProcessingJobDatasetDefinitionInputModePipe
    Pipe
    File
    File
    Pipe
    Pipe
    File
    File
    Pipe
    Pipe
    FILE
    File
    PIPE
    Pipe
    "File"
    File
    "Pipe"
    Pipe

    ProcessingJobExperimentConfig, ProcessingJobExperimentConfigArgs

    ExperimentName string
    The name of an existing experiment to associate with the trial component.
    RunName string
    The name of the experiment run to associate with the trial component.
    TrialComponentDisplayName string
    The display name for the trial component. If this key isn't specified, the display name is the trial component name.
    TrialName string
    The name of an existing trial to associate the trial component with. If not specified, a new trial is created.
    ExperimentName string
    The name of an existing experiment to associate with the trial component.
    RunName string
    The name of the experiment run to associate with the trial component.
    TrialComponentDisplayName string
    The display name for the trial component. If this key isn't specified, the display name is the trial component name.
    TrialName string
    The name of an existing trial to associate the trial component with. If not specified, a new trial is created.
    experimentName String
    The name of an existing experiment to associate with the trial component.
    runName String
    The name of the experiment run to associate with the trial component.
    trialComponentDisplayName String
    The display name for the trial component. If this key isn't specified, the display name is the trial component name.
    trialName String
    The name of an existing trial to associate the trial component with. If not specified, a new trial is created.
    experimentName string
    The name of an existing experiment to associate with the trial component.
    runName string
    The name of the experiment run to associate with the trial component.
    trialComponentDisplayName string
    The display name for the trial component. If this key isn't specified, the display name is the trial component name.
    trialName string
    The name of an existing trial to associate the trial component with. If not specified, a new trial is created.
    experiment_name str
    The name of an existing experiment to associate with the trial component.
    run_name str
    The name of the experiment run to associate with the trial component.
    trial_component_display_name str
    The display name for the trial component. If this key isn't specified, the display name is the trial component name.
    trial_name str
    The name of an existing trial to associate the trial component with. If not specified, a new trial is created.
    experimentName String
    The name of an existing experiment to associate with the trial component.
    runName String
    The name of the experiment run to associate with the trial component.
    trialComponentDisplayName String
    The display name for the trial component. If this key isn't specified, the display name is the trial component name.
    trialName String
    The name of an existing trial to associate the trial component with. If not specified, a new trial is created.
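A minimal Python sketch of an experiment configuration; the experiment and trial names below are hypothetical, and the named experiment must already exist:

```python
import pulumi_aws_native as aws_native

# Hypothetical names; experiment_name must refer to an existing experiment.
experiment_config = aws_native.sagemaker.ProcessingJobExperimentConfigArgs(
    experiment_name="my-experiment",
    trial_name="my-trial",  # if omitted, a new trial is created
    trial_component_display_name="data-preprocessing",
)
```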

    ProcessingJobFeatureStoreOutput, ProcessingJobFeatureStoreOutputArgs

    FeatureGroupName string
    The name of the Amazon SageMaker FeatureGroup to use as the destination for processing job output. Note that your processing script is responsible for putting records into your Feature Store.
    FeatureGroupName string
    The name of the Amazon SageMaker FeatureGroup to use as the destination for processing job output. Note that your processing script is responsible for putting records into your Feature Store.
    featureGroupName String
    The name of the Amazon SageMaker FeatureGroup to use as the destination for processing job output. Note that your processing script is responsible for putting records into your Feature Store.
    featureGroupName string
    The name of the Amazon SageMaker FeatureGroup to use as the destination for processing job output. Note that your processing script is responsible for putting records into your Feature Store.
    feature_group_name str
    The name of the Amazon SageMaker FeatureGroup to use as the destination for processing job output. Note that your processing script is responsible for putting records into your Feature Store.
    featureGroupName String
    The name of the Amazon SageMaker FeatureGroup to use as the destination for processing job output. Note that your processing script is responsible for putting records into your Feature Store.

    ProcessingJobNetworkConfig, ProcessingJobNetworkConfigArgs

    EnableInterContainerTrafficEncryption bool
    Whether to encrypt all communications between distributed processing jobs. Choose True to encrypt communications. Encryption provides greater security for distributed processing jobs, but the processing might take longer.
    EnableNetworkIsolation bool
    Whether to allow inbound and outbound network calls to and from the containers used for the processing job.
    VpcConfig Pulumi.AwsNative.SageMaker.Inputs.ProcessingJobVpcConfig
    Specifies an Amazon Virtual Private Cloud (VPC) that your SageMaker jobs, hosted models, and compute resources have access to. You can control access to and from your resources by configuring a VPC. For more information, see Give SageMaker Access to Resources in your Amazon VPC.
    EnableInterContainerTrafficEncryption bool
    Whether to encrypt all communications between distributed processing jobs. Choose True to encrypt communications. Encryption provides greater security for distributed processing jobs, but the processing might take longer.
    EnableNetworkIsolation bool
    Whether to allow inbound and outbound network calls to and from the containers used for the processing job.
    VpcConfig ProcessingJobVpcConfig
    Specifies an Amazon Virtual Private Cloud (VPC) that your SageMaker jobs, hosted models, and compute resources have access to. You can control access to and from your resources by configuring a VPC. For more information, see Give SageMaker Access to Resources in your Amazon VPC.
    enableInterContainerTrafficEncryption Boolean
    Whether to encrypt all communications between distributed processing jobs. Choose True to encrypt communications. Encryption provides greater security for distributed processing jobs, but the processing might take longer.
    enableNetworkIsolation Boolean
    Whether to allow inbound and outbound network calls to and from the containers used for the processing job.
    vpcConfig ProcessingJobVpcConfig
    Specifies an Amazon Virtual Private Cloud (VPC) that your SageMaker jobs, hosted models, and compute resources have access to. You can control access to and from your resources by configuring a VPC. For more information, see Give SageMaker Access to Resources in your Amazon VPC.
    enableInterContainerTrafficEncryption boolean
    Whether to encrypt all communications between distributed processing jobs. Choose True to encrypt communications. Encryption provides greater security for distributed processing jobs, but the processing might take longer.
    enableNetworkIsolation boolean
    Whether to allow inbound and outbound network calls to and from the containers used for the processing job.
    vpcConfig ProcessingJobVpcConfig
    Specifies an Amazon Virtual Private Cloud (VPC) that your SageMaker jobs, hosted models, and compute resources have access to. You can control access to and from your resources by configuring a VPC. For more information, see Give SageMaker Access to Resources in your Amazon VPC.
    enable_inter_container_traffic_encryption bool
    Whether to encrypt all communications between distributed processing jobs. Choose True to encrypt communications. Encryption provides greater security for distributed processing jobs, but the processing might take longer.
    enable_network_isolation bool
    Whether to allow inbound and outbound network calls to and from the containers used for the processing job.
    vpc_config ProcessingJobVpcConfig
    Specifies an Amazon Virtual Private Cloud (VPC) that your SageMaker jobs, hosted models, and compute resources have access to. You can control access to and from your resources by configuring a VPC. For more information, see Give SageMaker Access to Resources in your Amazon VPC.
    enableInterContainerTrafficEncryption Boolean
    Whether to encrypt all communications between distributed processing jobs. Choose True to encrypt communications. Encryption provides greater security for distributed processing jobs, but the processing might take longer.
    enableNetworkIsolation Boolean
    Whether to allow inbound and outbound network calls to and from the containers used for the processing job.
    vpcConfig Property Map
    Specifies an Amazon Virtual Private Cloud (VPC) that your SageMaker jobs, hosted models, and compute resources have access to. You can control access to and from your resources by configuring a VPC. For more information, see Give SageMaker Access to Resources in your Amazon VPC.
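A Python sketch of a network configuration that places the job in a VPC; the subnet and security group IDs are hypothetical placeholders, and the VpcConfig field names are assumptions based on the SageMaker VpcConfig shape:

```python
import pulumi_aws_native as aws_native

# Hypothetical subnet and security group IDs.
network_config = aws_native.sagemaker.ProcessingJobNetworkConfigArgs(
    # Encrypts traffic between instances of a distributed job; may slow
    # the processing down.
    enable_inter_container_traffic_encryption=True,
    enable_network_isolation=False,
    vpc_config=aws_native.sagemaker.ProcessingJobVpcConfigArgs(
        subnets=["subnet-0123456789abcdef0"],
        security_group_ids=["sg-0123456789abcdef0"],
    ),
)
```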

    ProcessingJobProcessingInputsObject, ProcessingJobProcessingInputsObjectArgs

    InputName string
    The name for the processing job input.
    AppManaged bool
    When True, input operations such as data download are managed natively by the processing job application. When False (default), input operations are managed by Amazon SageMaker.
    DatasetDefinition Pulumi.AwsNative.SageMaker.Inputs.ProcessingJobDatasetDefinition
    Configuration for Dataset Definition inputs. The Dataset Definition input must specify exactly one of either AthenaDatasetDefinition or RedshiftDatasetDefinition types.
    S3Input Pulumi.AwsNative.SageMaker.Inputs.ProcessingJobS3Input
    Configuration for downloading input data from Amazon S3 into the processing container.
    InputName string
    The name for the processing job input.
    AppManaged bool
    When True, input operations such as data download are managed natively by the processing job application. When False (default), input operations are managed by Amazon SageMaker.
    DatasetDefinition ProcessingJobDatasetDefinition
    Configuration for Dataset Definition inputs. The Dataset Definition input must specify exactly one of either AthenaDatasetDefinition or RedshiftDatasetDefinition types.
    S3Input ProcessingJobS3Input
    Configuration for downloading input data from Amazon S3 into the processing container.
    inputName String
    The name for the processing job input.
    appManaged Boolean
    When True, input operations such as data download are managed natively by the processing job application. When False (default), input operations are managed by Amazon SageMaker.
    datasetDefinition ProcessingJobDatasetDefinition
    Configuration for Dataset Definition inputs. The Dataset Definition input must specify exactly one of either AthenaDatasetDefinition or RedshiftDatasetDefinition types.
    s3Input ProcessingJobS3Input
    Configuration for downloading input data from Amazon S3 into the processing container.
    inputName string
    The name for the processing job input.
    appManaged boolean
    When True, input operations such as data download are managed natively by the processing job application. When False (default), input operations are managed by Amazon SageMaker.
    datasetDefinition ProcessingJobDatasetDefinition
    Configuration for Dataset Definition inputs. The Dataset Definition input must specify exactly one of either AthenaDatasetDefinition or RedshiftDatasetDefinition types.
    s3Input ProcessingJobS3Input
    Configuration for downloading input data from Amazon S3 into the processing container.
    input_name str
    The name for the processing job input.
    app_managed bool
    When True, input operations such as data download are managed natively by the processing job application. When False (default), input operations are managed by Amazon SageMaker.
    dataset_definition ProcessingJobDatasetDefinition
    Configuration for Dataset Definition inputs. The Dataset Definition input must specify exactly one of either AthenaDatasetDefinition or RedshiftDatasetDefinition types.
    s3_input ProcessingJobS3Input
    Configuration for downloading input data from Amazon S3 into the processing container.
    inputName String
    The name for the processing job input.
    appManaged Boolean
    When True, input operations such as data download are managed natively by the processing job application. When False (default), input operations are managed by Amazon SageMaker.
    datasetDefinition Property Map
    Configuration for Dataset Definition inputs. The Dataset Definition input must specify exactly one of either AthenaDatasetDefinition or RedshiftDatasetDefinition types.
    s3Input Property Map
    Configuration for downloading input data from Amazon S3 into the processing container.
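A Python sketch of an S3-backed processing input; the S3 URI is a hypothetical placeholder, and the S3Input field names and values are assumptions based on the SageMaker CreateProcessingJob API:

```python
import pulumi_aws_native as aws_native

# Hypothetical S3 URI; SageMaker manages the download because
# app_managed is False (the default).
processing_input = aws_native.sagemaker.ProcessingJobProcessingInputsObjectArgs(
    input_name="raw-data",
    app_managed=False,
    s3_input=aws_native.sagemaker.ProcessingJobS3InputArgs(
        s3_uri="s3://my-bucket/raw/",
        s3_data_type="S3Prefix",
        s3_input_mode="File",
        local_path="/opt/ml/processing/input/raw",
    ),
)
```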

    ProcessingJobProcessingOutputConfig, ProcessingJobProcessingOutputConfigArgs

    Outputs List<Pulumi.AwsNative.SageMaker.Inputs.ProcessingJobProcessingOutputsObject>
    An array of outputs configuring the data to upload from the processing container.
    KmsKeyId string
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt the processing job output. KmsKeyId can be an ID of a KMS key, ARN of a KMS key, or alias of a KMS key. The KmsKeyId is applied to all outputs.
    Outputs []ProcessingJobProcessingOutputsObject
    An array of outputs configuring the data to upload from the processing container.
    KmsKeyId string
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt the processing job output. KmsKeyId can be an ID of a KMS key, ARN of a KMS key, or alias of a KMS key. The KmsKeyId is applied to all outputs.
    outputs List<ProcessingJobProcessingOutputsObject>
    An array of outputs configuring the data to upload from the processing container.
    kmsKeyId String
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt the processing job output. KmsKeyId can be an ID of a KMS key, ARN of a KMS key, or alias of a KMS key. The KmsKeyId is applied to all outputs.
    outputs ProcessingJobProcessingOutputsObject[]
    An array of outputs configuring the data to upload from the processing container.
    kmsKeyId string
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt the processing job output. KmsKeyId can be an ID of a KMS key, ARN of a KMS key, or alias of a KMS key. The KmsKeyId is applied to all outputs.
    outputs Sequence[ProcessingJobProcessingOutputsObject]
    An array of outputs configuring the data to upload from the processing container.
    kms_key_id str
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt the processing job output. KmsKeyId can be an ID of a KMS key, ARN of a KMS key, or alias of a KMS key. The KmsKeyId is applied to all outputs.
    outputs List<Property Map>
    An array of outputs configuring the data to upload from the processing container.
    kmsKeyId String
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt the processing job output. KmsKeyId can be an ID of a KMS key, ARN of a KMS key, or alias of a KMS key. The KmsKeyId is applied to all outputs.
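A Python sketch of an output configuration with one S3 output; the KMS alias and S3 destination are hypothetical placeholders, and the S3Output field names are assumptions based on the SageMaker CreateProcessingJob API:

```python
import pulumi_aws_native as aws_native

# Hypothetical KMS alias and S3 destination.
output_config = aws_native.sagemaker.ProcessingJobProcessingOutputConfigArgs(
    kms_key_id="alias/my-processing-key",  # ID, ARN, or alias; applied to all outputs
    outputs=[
        aws_native.sagemaker.ProcessingJobProcessingOutputsObjectArgs(
            output_name="features",
            s3_output=aws_native.sagemaker.ProcessingJobS3OutputArgs(
                s3_uri="s3://my-bucket/features/",
                local_path="/opt/ml/processing/output/features",
                s3_upload_mode="EndOfJob",
            ),
        ),
    ],
)
```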

    ProcessingJobProcessingOutputsObject, ProcessingJobProcessingOutputsObjectArgs

    OutputName string
    The name for the processing job output.
    AppManaged bool
    When True, output operations such as data upload are managed natively by the processing job application. When False (default), output operations are managed by Amazon SageMaker.
    FeatureStoreOutput Pulumi.AwsNative.SageMaker.Inputs.ProcessingJobFeatureStoreOutput
    Configuration for processing job outputs in Amazon SageMaker Feature Store.
    S3Output Pulumi.AwsNative.SageMaker.Inputs.ProcessingJobS3Output
    Configuration for uploading output data to Amazon S3 from the processing container.
    OutputName string
    The name for the processing job output.
    AppManaged bool
    When True, output operations such as data upload are managed natively by the processing job application. When False (default), output operations are managed by Amazon SageMaker.
    FeatureStoreOutput ProcessingJobFeatureStoreOutput
    Configuration for processing job outputs in Amazon SageMaker Feature Store.
    S3Output ProcessingJobS3Output
    Configuration for uploading output data to Amazon S3 from the processing container.
    outputName String
    The name for the processing job output.
    appManaged Boolean
    When True, output operations such as data upload are managed natively by the processing job application. When False (default), output operations are managed by Amazon SageMaker.
    featureStoreOutput ProcessingJobFeatureStoreOutput
    Configuration for processing job outputs in Amazon SageMaker Feature Store.
    s3Output ProcessingJobS3Output
    Configuration for uploading output data to Amazon S3 from the processing container.
    outputName string
    The name for the processing job output.
    appManaged boolean
    When True, output operations such as data upload are managed natively by the processing job application. When False (default), output operations are managed by Amazon SageMaker.
    featureStoreOutput ProcessingJobFeatureStoreOutput
    Configuration for processing job outputs in Amazon SageMaker Feature Store.
    s3Output ProcessingJobS3Output
    Configuration for uploading output data to Amazon S3 from the processing container.
    output_name str
    The name for the processing job output.
    app_managed bool
    When True, output operations such as data upload are managed natively by the processing job application. When False (default), output operations are managed by Amazon SageMaker.
    feature_store_output ProcessingJobFeatureStoreOutput
    Configuration for processing job outputs in Amazon SageMaker Feature Store.
    s3_output ProcessingJobS3Output
    Configuration for uploading output data to Amazon S3 from the processing container.
    outputName String
    The name for the processing job output.
    appManaged Boolean
    When True, output operations such as data upload are managed natively by the processing job application. When False (default), output operations are managed by Amazon SageMaker.
    featureStoreOutput Property Map
    Configuration for processing job outputs in Amazon SageMaker Feature Store.
    s3Output Property Map
    Configuration for uploading output data to Amazon S3 from the processing container.
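    As a concrete illustration, an output entry can be sketched as a plain Python mapping using the snake_case names from the Python listing above. The output name, bucket, and paths are hypothetical values, not defaults.

```python
# Hypothetical processing output entry; field names follow the Python
# property listing above, values are illustrative only.
processing_output = {
    "output_name": "train",                  # name for this processing job output
    "app_managed": False,                    # default: SageMaker manages the upload
    "s3_output": {
        "s3_uri": "s3://my-bucket/processing/output/",  # hypothetical bucket
        "local_path": "/opt/ml/processing/output",
        "s3_upload_mode": "EndOfJob",        # "Continuous" | "EndOfJob"
    },
}
```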

    ProcessingJobProcessingResources, ProcessingJobProcessingResourcesArgs

    ClusterConfig Pulumi.AwsNative.SageMaker.Inputs.ProcessingJobClusterConfig
    The configuration for the resources in a cluster used to run the processing job.
    ClusterConfig ProcessingJobClusterConfig
    The configuration for the resources in a cluster used to run the processing job.
    clusterConfig ProcessingJobClusterConfig
    The configuration for the resources in a cluster used to run the processing job.
    clusterConfig ProcessingJobClusterConfig
    The configuration for the resources in a cluster used to run the processing job.
    cluster_config ProcessingJobClusterConfig
    The configuration for the resources in a cluster used to run the processing job.
    clusterConfig Property Map
    The configuration for the resources in a cluster used to run the processing job.
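    A minimal sketch of a processing_resources value as a plain Python mapping. The cluster-config field names follow the SageMaker ClusterConfig shape (instance_count, instance_type, volume_size_in_gb); the instance type and sizes shown are illustrative, not recommendations.

```python
# Hypothetical cluster configuration for a processing job; values are
# illustrative and should be sized to the workload.
processing_resources = {
    "cluster_config": {
        "instance_count": 1,              # number of ML compute instances
        "instance_type": "ml.m5.xlarge",  # illustrative instance type
        "volume_size_in_gb": 30,          # EBS storage attached to each instance
    }
}
```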

    ProcessingJobRedshiftDatasetDefinition, ProcessingJobRedshiftDatasetDefinitionArgs

    ClusterId string
    The Redshift cluster identifier.
    ClusterRoleArn string
    The IAM role attached to your Redshift cluster that Amazon SageMaker uses to generate datasets.
    Database string
    The name of the Redshift database used in Redshift query execution.
    DbUser string
    The database user name used in Redshift query execution.
    OutputFormat Pulumi.AwsNative.SageMaker.ProcessingJobRedshiftDatasetDefinitionOutputFormat
    The data storage format for Redshift query results.
    OutputS3Uri string
    The location in Amazon S3 where the Redshift query results are stored.
    QueryString string
    The SQL query statements to be executed.
    KmsKeyId string
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data from a Redshift execution.
    OutputCompression Pulumi.AwsNative.SageMaker.ProcessingJobRedshiftDatasetDefinitionOutputCompression
    The compression used for Redshift query results.
    ClusterId string
    The Redshift cluster identifier.
    ClusterRoleArn string
    The IAM role attached to your Redshift cluster that Amazon SageMaker uses to generate datasets.
    Database string
    The name of the Redshift database used in Redshift query execution.
    DbUser string
    The database user name used in Redshift query execution.
    OutputFormat ProcessingJobRedshiftDatasetDefinitionOutputFormat
    The data storage format for Redshift query results.
    OutputS3Uri string
    The location in Amazon S3 where the Redshift query results are stored.
    QueryString string
    The SQL query statements to be executed.
    KmsKeyId string
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data from a Redshift execution.
    OutputCompression ProcessingJobRedshiftDatasetDefinitionOutputCompression
    The compression used for Redshift query results.
    clusterId String
    The Redshift cluster identifier.
    clusterRoleArn String
    The IAM role attached to your Redshift cluster that Amazon SageMaker uses to generate datasets.
    database String
    The name of the Redshift database used in Redshift query execution.
    dbUser String
    The database user name used in Redshift query execution.
    outputFormat ProcessingJobRedshiftDatasetDefinitionOutputFormat
    The data storage format for Redshift query results.
    outputS3Uri String
    The location in Amazon S3 where the Redshift query results are stored.
    queryString String
    The SQL query statements to be executed.
    kmsKeyId String
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data from a Redshift execution.
    outputCompression ProcessingJobRedshiftDatasetDefinitionOutputCompression
    The compression used for Redshift query results.
    clusterId string
    The Redshift cluster identifier.
    clusterRoleArn string
    The IAM role attached to your Redshift cluster that Amazon SageMaker uses to generate datasets.
    database string
    The name of the Redshift database used in Redshift query execution.
    dbUser string
    The database user name used in Redshift query execution.
    outputFormat ProcessingJobRedshiftDatasetDefinitionOutputFormat
    The data storage format for Redshift query results.
    outputS3Uri string
    The location in Amazon S3 where the Redshift query results are stored.
    queryString string
    The SQL query statements to be executed.
    kmsKeyId string
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data from a Redshift execution.
    outputCompression ProcessingJobRedshiftDatasetDefinitionOutputCompression
    The compression used for Redshift query results.
    cluster_id str
    The Redshift cluster identifier.
    cluster_role_arn str
    The IAM role attached to your Redshift cluster that Amazon SageMaker uses to generate datasets.
    database str
    The name of the Redshift database used in Redshift query execution.
    db_user str
    The database user name used in Redshift query execution.
    output_format ProcessingJobRedshiftDatasetDefinitionOutputFormat
    The data storage format for Redshift query results.
    output_s3_uri str
    The location in Amazon S3 where the Redshift query results are stored.
    query_string str
    The SQL query statements to be executed.
    kms_key_id str
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data from a Redshift execution.
    output_compression ProcessingJobRedshiftDatasetDefinitionOutputCompression
    The compression used for Redshift query results.
    clusterId String
    The Redshift cluster identifier.
    clusterRoleArn String
    The IAM role attached to your Redshift cluster that Amazon SageMaker uses to generate datasets.
    database String
    The name of the Redshift database used in Redshift query execution.
    dbUser String
    The database user name used in Redshift query execution.
    outputFormat "PARQUET" | "CSV"
    The data storage format for Redshift query results.
    outputS3Uri String
    The location in Amazon S3 where the Redshift query results are stored.
    queryString String
    The SQL query statements to be executed.
    kmsKeyId String
    The AWS Key Management Service (AWS KMS) key that Amazon SageMaker uses to encrypt data from a Redshift execution.
    outputCompression "None" | "GZIP" | "SNAPPY" | "ZSTD" | "BZIP2"
    The compression used for Redshift query results.
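    A sketch of a Redshift dataset definition as a plain Python mapping, using the snake_case names from the Python listing above. The cluster identifier, role ARN, database, and query are all hypothetical; the enum values come from the OutputFormat and OutputCompression listings below.

```python
# Hypothetical Redshift dataset definition; identifiers and the query
# are placeholders, enum strings match the listings in this section.
redshift_dataset_definition = {
    "cluster_id": "my-redshift-cluster",     # hypothetical cluster identifier
    "cluster_role_arn": "arn:aws:iam::111122223333:role/RedshiftSageMakerRole",  # hypothetical
    "database": "analytics",
    "db_user": "sagemaker_user",
    "query_string": "SELECT * FROM events",
    "output_s3_uri": "s3://my-bucket/redshift-results/",
    "output_format": "PARQUET",              # "PARQUET" | "CSV"
    "output_compression": "GZIP",            # optional; "None" | "GZIP" | "SNAPPY" | "ZSTD" | "BZIP2"
}
```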

    ProcessingJobRedshiftDatasetDefinitionOutputCompression, ProcessingJobRedshiftDatasetDefinitionOutputCompressionArgs

    None
    None
    Gzip
    GZIP
    Snappy
    SNAPPY
    Zstd
    ZSTD
    Bzip2
    BZIP2
    ProcessingJobRedshiftDatasetDefinitionOutputCompressionNone
    None
    ProcessingJobRedshiftDatasetDefinitionOutputCompressionGzip
    GZIP
    ProcessingJobRedshiftDatasetDefinitionOutputCompressionSnappy
    SNAPPY
    ProcessingJobRedshiftDatasetDefinitionOutputCompressionZstd
    ZSTD
    ProcessingJobRedshiftDatasetDefinitionOutputCompressionBzip2
    BZIP2
    None
    None
    Gzip
    GZIP
    Snappy
    SNAPPY
    Zstd
    ZSTD
    Bzip2
    BZIP2
    None
    None
    Gzip
    GZIP
    Snappy
    SNAPPY
    Zstd
    ZSTD
    Bzip2
    BZIP2
    NONE
    None
    GZIP
    GZIP
    SNAPPY
    SNAPPY
    ZSTD
    ZSTD
    BZIP2
    BZIP2
    "None"
    None
    "GZIP"
    GZIP
    "SNAPPY"
    SNAPPY
    "ZSTD"
    ZSTD
    "BZIP2"
    BZIP2

    ProcessingJobRedshiftDatasetDefinitionOutputFormat, ProcessingJobRedshiftDatasetDefinitionOutputFormatArgs

    Parquet
    PARQUET
    Csv
    CSV
    ProcessingJobRedshiftDatasetDefinitionOutputFormatParquet
    PARQUET
    ProcessingJobRedshiftDatasetDefinitionOutputFormatCsv
    CSV
    Parquet
    PARQUET
    Csv
    CSV
    Parquet
    PARQUET
    Csv
    CSV
    PARQUET
    PARQUET
    CSV
    CSV
    "PARQUET"
    PARQUET
    "CSV"
    CSV

    ProcessingJobS3Input, ProcessingJobS3InputArgs

    S3DataType Pulumi.AwsNative.SageMaker.ProcessingJobS3InputS3DataType
    Whether you use an S3Prefix or a ManifestFile for the data type. If you choose S3Prefix, S3Uri identifies a key name prefix. Amazon SageMaker uses all objects with the specified key name prefix for the processing job. If you choose ManifestFile, S3Uri identifies an object that is a manifest file containing a list of object keys that you want Amazon SageMaker to use for the processing job.
    S3Uri string
    The URI of the Amazon S3 prefix from which Amazon SageMaker downloads the data required to run a processing job.
    LocalPath string
    The local path in your container to which you want Amazon SageMaker to write input data. LocalPath is an absolute path to the input data and must begin with /opt/ml/processing/. LocalPath is a required parameter when AppManaged is False (default).
    S3CompressionType Pulumi.AwsNative.SageMaker.ProcessingJobS3InputS3CompressionType
    Whether to GZIP-decompress the data in Amazon S3 as it is streamed into the processing container. Gzip can only be used when Pipe mode is specified as the S3InputMode. In Pipe mode, Amazon SageMaker streams input data from the source directly to your container without using the EBS volume.
    S3DataDistributionType Pulumi.AwsNative.SageMaker.ProcessingJobS3InputS3DataDistributionType
    Whether to distribute the data from Amazon S3 to all processing instances with FullyReplicated, or whether the data from Amazon S3 is sharded by Amazon S3 key, downloading one shard of data to each processing instance.
    S3InputMode Pulumi.AwsNative.SageMaker.ProcessingJobS3InputS3InputMode
    Whether to use File or Pipe input mode. In File mode, Amazon SageMaker copies the data from the input source onto the local ML storage volume before starting your processing container. This is the most commonly used input mode. In Pipe mode, Amazon SageMaker streams input data from the source directly to your processing container into named pipes without using the ML storage volume.
    S3DataType ProcessingJobS3InputS3DataType
    Whether you use an S3Prefix or a ManifestFile for the data type. If you choose S3Prefix, S3Uri identifies a key name prefix. Amazon SageMaker uses all objects with the specified key name prefix for the processing job. If you choose ManifestFile, S3Uri identifies an object that is a manifest file containing a list of object keys that you want Amazon SageMaker to use for the processing job.
    S3Uri string
    The URI of the Amazon S3 prefix from which Amazon SageMaker downloads the data required to run a processing job.
    LocalPath string
    The local path in your container to which you want Amazon SageMaker to write input data. LocalPath is an absolute path to the input data and must begin with /opt/ml/processing/. LocalPath is a required parameter when AppManaged is False (default).
    S3CompressionType ProcessingJobS3InputS3CompressionType
    Whether to GZIP-decompress the data in Amazon S3 as it is streamed into the processing container. Gzip can only be used when Pipe mode is specified as the S3InputMode. In Pipe mode, Amazon SageMaker streams input data from the source directly to your container without using the EBS volume.
    S3DataDistributionType ProcessingJobS3InputS3DataDistributionType
    Whether to distribute the data from Amazon S3 to all processing instances with FullyReplicated, or whether the data from Amazon S3 is sharded by Amazon S3 key, downloading one shard of data to each processing instance.
    S3InputMode ProcessingJobS3InputS3InputMode
    Whether to use File or Pipe input mode. In File mode, Amazon SageMaker copies the data from the input source onto the local ML storage volume before starting your processing container. This is the most commonly used input mode. In Pipe mode, Amazon SageMaker streams input data from the source directly to your processing container into named pipes without using the ML storage volume.
    s3DataType ProcessingJobS3InputS3DataType
    Whether you use an S3Prefix or a ManifestFile for the data type. If you choose S3Prefix, S3Uri identifies a key name prefix. Amazon SageMaker uses all objects with the specified key name prefix for the processing job. If you choose ManifestFile, S3Uri identifies an object that is a manifest file containing a list of object keys that you want Amazon SageMaker to use for the processing job.
    s3Uri String
    The URI of the Amazon S3 prefix from which Amazon SageMaker downloads the data required to run a processing job.
    localPath String
    The local path in your container to which you want Amazon SageMaker to write input data. LocalPath is an absolute path to the input data and must begin with /opt/ml/processing/. LocalPath is a required parameter when AppManaged is False (default).
    s3CompressionType ProcessingJobS3InputS3CompressionType
    Whether to GZIP-decompress the data in Amazon S3 as it is streamed into the processing container. Gzip can only be used when Pipe mode is specified as the S3InputMode. In Pipe mode, Amazon SageMaker streams input data from the source directly to your container without using the EBS volume.
    s3DataDistributionType ProcessingJobS3InputS3DataDistributionType
    Whether to distribute the data from Amazon S3 to all processing instances with FullyReplicated, or whether the data from Amazon S3 is sharded by Amazon S3 key, downloading one shard of data to each processing instance.
    s3InputMode ProcessingJobS3InputS3InputMode
    Whether to use File or Pipe input mode. In File mode, Amazon SageMaker copies the data from the input source onto the local ML storage volume before starting your processing container. This is the most commonly used input mode. In Pipe mode, Amazon SageMaker streams input data from the source directly to your processing container into named pipes without using the ML storage volume.
    s3DataType ProcessingJobS3InputS3DataType
    Whether you use an S3Prefix or a ManifestFile for the data type. If you choose S3Prefix, S3Uri identifies a key name prefix. Amazon SageMaker uses all objects with the specified key name prefix for the processing job. If you choose ManifestFile, S3Uri identifies an object that is a manifest file containing a list of object keys that you want Amazon SageMaker to use for the processing job.
    s3Uri string
    The URI of the Amazon S3 prefix from which Amazon SageMaker downloads the data required to run a processing job.
    localPath string
    The local path in your container to which you want Amazon SageMaker to write input data. LocalPath is an absolute path to the input data and must begin with /opt/ml/processing/. LocalPath is a required parameter when AppManaged is False (default).
    s3CompressionType ProcessingJobS3InputS3CompressionType
    Whether to GZIP-decompress the data in Amazon S3 as it is streamed into the processing container. Gzip can only be used when Pipe mode is specified as the S3InputMode. In Pipe mode, Amazon SageMaker streams input data from the source directly to your container without using the EBS volume.
    s3DataDistributionType ProcessingJobS3InputS3DataDistributionType
    Whether to distribute the data from Amazon S3 to all processing instances with FullyReplicated, or whether the data from Amazon S3 is sharded by Amazon S3 key, downloading one shard of data to each processing instance.
    s3InputMode ProcessingJobS3InputS3InputMode
    Whether to use File or Pipe input mode. In File mode, Amazon SageMaker copies the data from the input source onto the local ML storage volume before starting your processing container. This is the most commonly used input mode. In Pipe mode, Amazon SageMaker streams input data from the source directly to your processing container into named pipes without using the ML storage volume.
    s3_data_type ProcessingJobS3InputS3DataType
    Whether you use an S3Prefix or a ManifestFile for the data type. If you choose S3Prefix, S3Uri identifies a key name prefix. Amazon SageMaker uses all objects with the specified key name prefix for the processing job. If you choose ManifestFile, S3Uri identifies an object that is a manifest file containing a list of object keys that you want Amazon SageMaker to use for the processing job.
    s3_uri str
    The URI of the Amazon S3 prefix from which Amazon SageMaker downloads the data required to run a processing job.
    local_path str
    The local path in your container to which you want Amazon SageMaker to write input data. LocalPath is an absolute path to the input data and must begin with /opt/ml/processing/. LocalPath is a required parameter when AppManaged is False (default).
    s3_compression_type ProcessingJobS3InputS3CompressionType
    Whether to GZIP-decompress the data in Amazon S3 as it is streamed into the processing container. Gzip can only be used when Pipe mode is specified as the S3InputMode. In Pipe mode, Amazon SageMaker streams input data from the source directly to your container without using the EBS volume.
    s3_data_distribution_type ProcessingJobS3InputS3DataDistributionType
    Whether to distribute the data from Amazon S3 to all processing instances with FullyReplicated, or whether the data from Amazon S3 is sharded by Amazon S3 key, downloading one shard of data to each processing instance.
    s3_input_mode ProcessingJobS3InputS3InputMode
    Whether to use File or Pipe input mode. In File mode, Amazon SageMaker copies the data from the input source onto the local ML storage volume before starting your processing container. This is the most commonly used input mode. In Pipe mode, Amazon SageMaker streams input data from the source directly to your processing container into named pipes without using the ML storage volume.
    s3DataType "ManifestFile" | "S3Prefix"
    Whether you use an S3Prefix or a ManifestFile for the data type. If you choose S3Prefix, S3Uri identifies a key name prefix. Amazon SageMaker uses all objects with the specified key name prefix for the processing job. If you choose ManifestFile, S3Uri identifies an object that is a manifest file containing a list of object keys that you want Amazon SageMaker to use for the processing job.
    s3Uri String
    The URI of the Amazon S3 prefix from which Amazon SageMaker downloads the data required to run a processing job.
    localPath String
    The local path in your container to which you want Amazon SageMaker to write input data. LocalPath is an absolute path to the input data and must begin with /opt/ml/processing/. LocalPath is a required parameter when AppManaged is False (default).
    s3CompressionType "None" | "Gzip"
    Whether to GZIP-decompress the data in Amazon S3 as it is streamed into the processing container. Gzip can only be used when Pipe mode is specified as the S3InputMode. In Pipe mode, Amazon SageMaker streams input data from the source directly to your container without using the EBS volume.
    s3DataDistributionType "FullyReplicated" | "ShardedByS3Key"
    Whether to distribute the data from Amazon S3 to all processing instances with FullyReplicated, or whether the data from Amazon S3 is sharded by Amazon S3 key, downloading one shard of data to each processing instance.
    s3InputMode "File" | "Pipe"
    Whether to use File or Pipe input mode. In File mode, Amazon SageMaker copies the data from the input source onto the local ML storage volume before starting your processing container. This is the most commonly used input mode. In Pipe mode, Amazon SageMaker streams input data from the source directly to your processing container into named pipes without using the ML storage volume.
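    A sketch of an S3 input as a plain Python mapping, using the snake_case names from the Python listing above. The bucket and prefix are hypothetical; the enum strings match the S3DataType, S3InputMode, S3DataDistributionType, and S3CompressionType listings below. Note the constraint from the description above: Gzip compression is only valid with Pipe input mode.

```python
# Hypothetical S3 input; bucket/prefix are placeholders, enum values
# match this section's listings.
s3_input = {
    "s3_data_type": "S3Prefix",                      # "ManifestFile" | "S3Prefix"
    "s3_uri": "s3://my-bucket/input-data/",          # hypothetical prefix
    "local_path": "/opt/ml/processing/input",        # must begin with /opt/ml/processing/
    "s3_input_mode": "File",                         # "File" | "Pipe"
    "s3_data_distribution_type": "FullyReplicated",  # or "ShardedByS3Key"
    "s3_compression_type": "None",                   # "Gzip" requires Pipe mode
}
```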

    ProcessingJobS3InputS3CompressionType, ProcessingJobS3InputS3CompressionTypeArgs

    None
    None
    Gzip
    Gzip
    ProcessingJobS3InputS3CompressionTypeNone
    None
    ProcessingJobS3InputS3CompressionTypeGzip
    Gzip
    None
    None
    Gzip
    Gzip
    None
    None
    Gzip
    Gzip
    NONE
    None
    GZIP
    Gzip
    "None"
    None
    "Gzip"
    Gzip

    ProcessingJobS3InputS3DataDistributionType, ProcessingJobS3InputS3DataDistributionTypeArgs

    FullyReplicated
    FullyReplicated
    ShardedByS3Key
    ShardedByS3Key
    ProcessingJobS3InputS3DataDistributionTypeFullyReplicated
    FullyReplicated
    ProcessingJobS3InputS3DataDistributionTypeShardedByS3Key
    ShardedByS3Key
    FullyReplicated
    FullyReplicated
    ShardedByS3Key
    ShardedByS3Key
    FullyReplicated
    FullyReplicated
    ShardedByS3Key
    ShardedByS3Key
    FULLY_REPLICATED
    FullyReplicated
    SHARDED_BY_S3_KEY
    ShardedByS3Key
    "FullyReplicated"
    FullyReplicated
    "ShardedByS3Key"
    ShardedByS3Key

    ProcessingJobS3InputS3DataType, ProcessingJobS3InputS3DataTypeArgs

    ManifestFile
    ManifestFile
    S3Prefix
    S3Prefix
    ProcessingJobS3InputS3DataTypeManifestFile
    ManifestFile
    ProcessingJobS3InputS3DataTypeS3Prefix
    S3Prefix
    ManifestFile
    ManifestFile
    S3Prefix
    S3Prefix
    ManifestFile
    ManifestFile
    S3Prefix
    S3Prefix
    MANIFEST_FILE
    ManifestFile
    S3_PREFIX
    S3Prefix
    "ManifestFile"
    ManifestFile
    "S3Prefix"
    S3Prefix

    ProcessingJobS3InputS3InputMode, ProcessingJobS3InputS3InputModeArgs

    File
    File
    Pipe
    Pipe
    ProcessingJobS3InputS3InputModeFile
    File
    ProcessingJobS3InputS3InputModePipe
    Pipe
    File
    File
    Pipe
    Pipe
    File
    File
    Pipe
    Pipe
    FILE
    File
    PIPE
    Pipe
    "File"
    File
    "Pipe"
    Pipe

    ProcessingJobS3Output, ProcessingJobS3OutputArgs

    S3UploadMode Pulumi.AwsNative.SageMaker.ProcessingJobS3OutputS3UploadMode
    Whether to upload the results of the processing job continuously or after the job completes.
    S3Uri string
    A URI that identifies the Amazon S3 bucket where you want Amazon SageMaker to save the results of a processing job.
    LocalPath string
    The local path of a directory whose contents you want Amazon SageMaker to upload to Amazon S3. LocalPath is an absolute path to a directory containing output files. This directory is created by the platform and exists when your container's entrypoint is invoked.
    S3UploadMode ProcessingJobS3OutputS3UploadMode
    Whether to upload the results of the processing job continuously or after the job completes.
    S3Uri string
    A URI that identifies the Amazon S3 bucket where you want Amazon SageMaker to save the results of a processing job.
    LocalPath string
    The local path of a directory whose contents you want Amazon SageMaker to upload to Amazon S3. LocalPath is an absolute path to a directory containing output files. This directory is created by the platform and exists when your container's entrypoint is invoked.
    s3UploadMode ProcessingJobS3OutputS3UploadMode
    Whether to upload the results of the processing job continuously or after the job completes.
    s3Uri String
    A URI that identifies the Amazon S3 bucket where you want Amazon SageMaker to save the results of a processing job.
    localPath String
    The local path of a directory whose contents you want Amazon SageMaker to upload to Amazon S3. LocalPath is an absolute path to a directory containing output files. This directory is created by the platform and exists when your container's entrypoint is invoked.
    s3UploadMode ProcessingJobS3OutputS3UploadMode
    Whether to upload the results of the processing job continuously or after the job completes.
    s3Uri string
    A URI that identifies the Amazon S3 bucket where you want Amazon SageMaker to save the results of a processing job.
    localPath string
    The local path of a directory whose contents you want Amazon SageMaker to upload to Amazon S3. LocalPath is an absolute path to a directory containing output files. This directory is created by the platform and exists when your container's entrypoint is invoked.
    s3_upload_mode ProcessingJobS3OutputS3UploadMode
    Whether to upload the results of the processing job continuously or after the job completes.
    s3_uri str
    A URI that identifies the Amazon S3 bucket where you want Amazon SageMaker to save the results of a processing job.
    local_path str
    The local path of a directory whose contents you want Amazon SageMaker to upload to Amazon S3. LocalPath is an absolute path to a directory containing output files. This directory is created by the platform and exists when your container's entrypoint is invoked.
    s3UploadMode "Continuous" | "EndOfJob"
    Whether to upload the results of the processing job continuously or after the job completes.
    s3Uri String
    A URI that identifies the Amazon S3 bucket where you want Amazon SageMaker to save the results of a processing job.
    localPath String
    The local path of a directory whose contents you want Amazon SageMaker to upload to Amazon S3. LocalPath is an absolute path to a directory containing output files. This directory is created by the platform and exists when your container's entrypoint is invoked.
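    A sketch of an S3 output as a plain Python mapping, using the snake_case names from the Python listing above. The bucket is hypothetical; the upload-mode strings come from the S3UploadMode listing below.

```python
# Hypothetical S3 output destination; the bucket is a placeholder.
s3_output = {
    "s3_uri": "s3://my-bucket/processing/results/",  # hypothetical bucket
    "local_path": "/opt/ml/processing/output",       # created by the platform
    "s3_upload_mode": "EndOfJob",                    # "Continuous" | "EndOfJob"
}
```

    Continuous upload is useful for long-running jobs whose partial results should be visible before the job finishes; EndOfJob uploads once at completion.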

    ProcessingJobS3OutputS3UploadMode, ProcessingJobS3OutputS3UploadModeArgs

    Continuous
    Continuous
    EndOfJob
    EndOfJob
    ProcessingJobS3OutputS3UploadModeContinuous
    Continuous
    ProcessingJobS3OutputS3UploadModeEndOfJob
    EndOfJob
    Continuous
    Continuous
    EndOfJob
    EndOfJob
    Continuous
    Continuous
    EndOfJob
    EndOfJob
    CONTINUOUS
    Continuous
    END_OF_JOB
    EndOfJob
    "Continuous"
    Continuous
    "EndOfJob"
    EndOfJob

    ProcessingJobStatus, ProcessingJobStatusArgs

    Completed
    Completed
    InProgress
    InProgress
    Stopping
    Stopping
    Stopped
    Stopped
    Failed
    Failed
    ProcessingJobStatusCompleted
    Completed
    ProcessingJobStatusInProgress
    InProgress
    ProcessingJobStatusStopping
    Stopping
    ProcessingJobStatusStopped
    Stopped
    ProcessingJobStatusFailed
    Failed
    Completed
    Completed
    InProgress
    InProgress
    Stopping
    Stopping
    Stopped
    Stopped
    Failed
    Failed
    Completed
    Completed
    InProgress
    InProgress
    Stopping
    Stopping
    Stopped
    Stopped
    Failed
    Failed
    COMPLETED
    Completed
    IN_PROGRESS
    InProgress
    STOPPING
    Stopping
    STOPPED
    Stopped
    FAILED
    Failed
    "Completed"
    Completed
    "InProgress"
    InProgress
    "Stopping"
    Stopping
    "Stopped"
    Stopped
    "Failed"
    Failed

    ProcessingJobStoppingCondition, ProcessingJobStoppingConditionArgs

    MaxRuntimeInSeconds int
    Specifies the maximum runtime in seconds.
    MaxRuntimeInSeconds int
    Specifies the maximum runtime in seconds.
    maxRuntimeInSeconds Integer
    Specifies the maximum runtime in seconds.
    maxRuntimeInSeconds number
    Specifies the maximum runtime in seconds.
    max_runtime_in_seconds int
    Specifies the maximum runtime in seconds.
    maxRuntimeInSeconds Number
    Specifies the maximum runtime in seconds.
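    The stopping condition is a single field; as a sketch in plain Python, using the snake_case name from the listing above (the one-hour value is illustrative):

```python
# Stop the processing job after at most one hour (illustrative value).
stopping_condition = {
    "max_runtime_in_seconds": 3600,
}
```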

    ProcessingJobVpcConfig, ProcessingJobVpcConfigArgs

    SecurityGroupIds List<string>
    The VPC security group IDs, in the form 'sg-xxxxxxxx'. Specify the security groups for the VPC that is specified in the 'Subnets' field.
    Subnets List<string>
    The IDs of the subnets in the VPC to which you want to connect your processing job. For information about the availability of specific instance types, see https://docs.aws.amazon.com/sagemaker/latest/dg/regions-quotas.html
    SecurityGroupIds []string
    The VPC security group IDs, in the form 'sg-xxxxxxxx'. Specify the security groups for the VPC that is specified in the 'Subnets' field.
    Subnets []string
    The IDs of the subnets in the VPC to which you want to connect your processing job. For information about the availability of specific instance types, see https://docs.aws.amazon.com/sagemaker/latest/dg/regions-quotas.html
    securityGroupIds List<String>
    The VPC security group IDs, in the form 'sg-xxxxxxxx'. Specify the security groups for the VPC that is specified in the 'Subnets' field.
    subnets List<String>
    The IDs of the subnets in the VPC to which you want to connect your processing job. For information about the availability of specific instance types, see https://docs.aws.amazon.com/sagemaker/latest/dg/regions-quotas.html
    securityGroupIds string[]
    The VPC security group IDs, in the form 'sg-xxxxxxxx'. Specify the security groups for the VPC that is specified in the 'Subnets' field.
    subnets string[]
    The IDs of the subnets in the VPC to which you want to connect your processing job. For information about the availability of specific instance types, see https://docs.aws.amazon.com/sagemaker/latest/dg/regions-quotas.html
    security_group_ids Sequence[str]
    The VPC security group IDs, in the form 'sg-xxxxxxxx'. Specify the security groups for the VPC that is specified in the 'Subnets' field.
    subnets Sequence[str]
    The IDs of the subnets in the VPC to which you want to connect your processing job. For information about the availability of specific instance types, see https://docs.aws.amazon.com/sagemaker/latest/dg/regions-quotas.html
    securityGroupIds List<String>
    The VPC security group IDs, in the form 'sg-xxxxxxxx'. Specify the security groups for the VPC that is specified in the 'Subnets' field.
    subnets List<String>
    The IDs of the subnets in the VPC to which you want to connect your processing job. For information about the availability of specific instance types, see https://docs.aws.amazon.com/sagemaker/latest/dg/regions-quotas.html
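    A sketch of a VPC configuration as a plain Python mapping, using the snake_case names from the Python listing above. The security group and subnet IDs are hypothetical placeholders in the documented 'sg-xxxxxxxx' / subnet ID formats.

```python
# Hypothetical VPC configuration; both IDs are placeholders.
vpc_config = {
    "security_group_ids": ["sg-0123456789abcdef0"],  # hypothetical security group
    "subnets": ["subnet-0123456789abcdef0"],         # hypothetical subnet
}
```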

    Package Details

    Repository
    AWS Native pulumi/pulumi-aws-native
    License
    Apache-2.0