Working with Pipelines
Scheduling Pipelines
Note You can find the Default object on the Architect page in the Other section.
Note The minimum scheduling interval is 15 minutes.
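To make the interval constraint concrete, the following is a minimal sketch (pipeline objects in Python dict form, mirroring the definition format) of a Schedule object using the smallest allowed 15-minute period, referenced by the Default object. The object IDs, names, and start time are illustrative placeholders, not values from this guide.

# Sketch: a Schedule object with the minimum 15-minute interval,
# referenced by the Default object. IDs, names, and the start time are placeholders.
schedule_objects = [
    {
        "id": "MySchedule",
        "name": "Every15Minutes",
        "type": "Schedule",
        "period": "15 minutes",                # intervals shorter than 15 minutes are rejected
        "startDateTime": "2024-01-01T00:00:00",
    },
    {
        "id": "Default",                       # the Default object (Architect page, Other section)
        "name": "Default",
        "scheduleType": "cron",
        "schedule": {"ref": "MySchedule"},     # reference to the Schedule object above
    },
]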
Cloning Your Pipeline
Note You can't clone a pipeline using the command line interface (CLI).
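Although the console's Clone feature has no CLI equivalent, a comparable result can be approximated programmatically by copying a pipeline's definition into a new pipeline. The following is a sketch using boto3 (the AWS SDK for Python); the pipeline ID, name, and uniqueId are placeholders, and this is a workaround, not the console Clone operation itself.

import boto3

client = boto3.client("datapipeline")

# Fetch the definition of an existing pipeline (placeholder ID).
source = client.get_pipeline_definition(pipelineId="df-SOURCE_PIPELINE_ID")

# Create a new, empty pipeline and copy the definition objects into it.
new = client.create_pipeline(
    name="copy-of-my-pipeline",            # placeholder name
    uniqueId="copy-of-my-pipeline-001",    # placeholder idempotency token
)
client.put_pipeline_definition(
    pipelineId=new["pipelineId"],
    pipelineObjects=source["pipelineObjects"],
    parameterObjects=source.get("parameterObjects", []),
    parameterValues=source.get("parameterValues", []),
)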
Deleting Your Pipeline
Important You can't restore a pipeline after you delete it, so be sure that you won't need the pipeline in the future before you delete it.
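Because a deleted pipeline cannot be restored, one cautious pattern is to save the pipeline definition before deleting it. A short sketch with boto3 (the AWS SDK for Python); the pipeline ID and backup file name are placeholders.

import json
import boto3

client = boto3.client("datapipeline")
pipeline_id = "df-EXAMPLE_PIPELINE_ID"  # placeholder

# Keep a copy of the definition first, since a deleted pipeline cannot be restored.
definition = client.get_pipeline_definition(pipelineId=pipeline_id)
with open("pipeline-definition-backup.json", "w") as f:
    json.dump(definition["pipelineObjects"], f, indent=2)

client.delete_pipeline(pipelineId=pipeline_id)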
Staging Data and Tables with Activities
Note Staging only functions when the stage field is set to true on an activity, such as ShellCommandActivity. For more information, see ShellCommandActivity.
Note This scenario only works as described if your data inputs and outputs are S3DataNode objects. Additionally, output data staging is allowed only when directoryPath is set on the output S3DataNode object.
Note This scenario only works as described if your data inputs and outputs are S3DataNode or MySqlDataNode objects. Table staging is not supported for DynamoDBDataNode.
Note In this example, the table name variable has the # (hash) character prefix because AWS Data Pipeline uses expressions to access the tableName or directoryPath. For more information about how expression evaluation works in AWS Data Pipeline, see Expression Evaluation.
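The following sketch (pipeline objects in Python dict form) illustrates data staging: the ShellCommandActivity sets stage to true, the input and output are S3DataNode objects, and the output node sets directoryPath, as the notes above require. The object IDs, bucket paths, and command are illustrative placeholders.

# Sketch: data staging with ShellCommandActivity. Input and output are
# S3DataNode objects; the output uses directoryPath, as output staging requires.
staging_objects = [
    {
        "id": "InputData",
        "type": "S3DataNode",
        "directoryPath": "s3://example-bucket/input/",   # illustrative path
    },
    {
        "id": "OutputData",
        "type": "S3DataNode",
        "directoryPath": "s3://example-bucket/output/",  # directoryPath enables output staging
    },
    {
        "id": "StagedCommand",
        "type": "ShellCommandActivity",
        "stage": "true",                                  # staging only works when stage is true
        "input": {"ref": "InputData"},
        "output": {"ref": "OutputData"},
        # With staging on, the staged locations are exposed to the command
        # as environment variables.
        "command": "cp ${INPUT1_STAGING_DIR}/* ${OUTPUT1_STAGING_DIR}/",
    },
]

For the table-staging case described above, an activity instead reaches the table through an expression such as #{input.tableName}, which is why the hash prefix appears in the note.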
Using Resources in Multiple Regions
Note The following list includes regions in which AWS Data Pipeline can orchestrate workflows and launch Amazon EMR or Amazon EC2 resources, even though AWS Data Pipeline itself may not be supported in those regions. For information about the regions in which AWS Data Pipeline is supported, see AWS Regions and Endpoints.
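As a sketch of using resources in another region, a resource object such as Ec2Resource can carry a region field naming where the instance is launched, while the pipeline itself runs in its home region. The ID, region, and instance type below are illustrative placeholders.

# Sketch: an Ec2Resource launched in a region other than the pipeline's own.
remote_resource = {
    "id": "RemoteEc2Resource",
    "type": "Ec2Resource",
    "region": "eu-west-1",          # region where the EC2 instance is launched (illustrative)
    "instanceType": "m5.large",     # illustrative instance type
    "terminateAfter": "2 Hours",
}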
Pipeline Definition File Syntax
Note For user-defined fields, AWS Data Pipeline checks only that references to other pipeline components are valid; it does not check any custom field string values that you add.
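The following sketch shows the general shape of a definition file generated from Python: objects sit under a top-level objects key, user-defined fields carry the my prefix, and references to other components use the ref form. The IDs and values are illustrative; as the note says, only the references are validated, not custom string values.

import json

# Sketch: a definition file with a user-defined field (myComment) and a
# reference to another component. Only the reference is checked by the service.
definition = {
    "objects": [
        {
            "id": "Default",
            "name": "Default",
            "scheduleType": "cron",
            "schedule": {"ref": "MySchedule"},          # checked: must point at a real component
            "myComment": "This object sets defaults",   # user-defined field; value is not checked
        },
        {
            "id": "MySchedule",
            "name": "MySchedule",
            "type": "Schedule",
            "period": "1 day",
            "startDateTime": "2024-01-01T00:00:00",
        },
    ]
}

with open("pipeline-definition.json", "w") as f:
    json.dump(definition, f, indent=2)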
Working with the API
Note If you are not writing programs that interact with AWS Data Pipeline, you do not need to install any of the AWS SDKs. You can create and run pipelines using the console or the command line interface. For more information, see Setting up for AWS Data Pipeline.
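For programs that do interact with the service, the SDKs expose the pipeline lifecycle directly. A minimal sketch with boto3 (the AWS SDK for Python); the pipeline name, uniqueId, and the one-object definition are placeholders and would need a complete definition to run a real workload.

import boto3

client = boto3.client("datapipeline")

# Create an empty pipeline; uniqueId guards against accidental duplicates.
created = client.create_pipeline(
    name="my-example-pipeline",
    uniqueId="my-example-pipeline-001",
)
pipeline_id = created["pipelineId"]

# Upload a (placeholder) definition, then activate the pipeline.
client.put_pipeline_definition(
    pipelineId=pipeline_id,
    pipelineObjects=[
        {
            "id": "Default",
            "name": "Default",
            "fields": [{"key": "scheduleType", "stringValue": "ondemand"}],
        },
    ],
)
client.activate_pipeline(pipelineId=pipeline_id)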