Executing remote commands using AWS Data Pipeline

One of the best parts of working with Data Pipeline is the versatility of tasks you can achieve with just this one tool. In this section, we will look at a relatively simple pipeline definition that lets you execute remote scripts and commands on EC2 instances.

How does this setup work? To start with, we need an S3 bucket (it can be in any AWS region) that will act as a repository for all our shell scripts. Once the bucket is created, simply create and upload a shell script to it. Note that in this case, the shell script is named simplescript.sh, and the same name is used in the following pipeline definition.
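A minimal sketch of such a script follows; the commands themselves are purely illustrative, and you can replace them with whatever you want the instance to run:

```bash
#!/bin/bash
# simplescript.sh -- illustrative commands only; substitute your own.
# Prints basic host details so you can verify where the script ran.
echo "Hello from $(hostname) at $(date)"
uname -a
df -h
```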
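With the script uploaded, a pipeline definition can reference it through a ShellCommandActivity whose scriptUri points at the object in S3, running on an Ec2Resource that Data Pipeline provisions and terminates for you. The following is a minimal, illustrative definition, not the book's exact one: the bucket name my-scripts-bucket, the log path, the schedule values, and the instance settings are placeholders, and the DataPipelineDefaultRole and DataPipelineDefaultResourceRole IAM roles are assumed to already exist in your account:

```json
{
  "objects": [
    {
      "id": "Default",
      "name": "Default",
      "failureAndRerunMode": "CASCADE",
      "scheduleType": "cron",
      "schedule": { "ref": "RunOnceSchedule" },
      "role": "DataPipelineDefaultRole",
      "resourceRole": "DataPipelineDefaultResourceRole",
      "pipelineLogUri": "s3://my-scripts-bucket/logs/"
    },
    {
      "id": "RunOnceSchedule",
      "name": "RunOnceSchedule",
      "type": "Schedule",
      "period": "1 Day",
      "startAt": "FIRST_ACTIVATION_DATE_TIME",
      "occurrences": "1"
    },
    {
      "id": "MyEC2Resource",
      "name": "MyEC2Resource",
      "type": "Ec2Resource",
      "instanceType": "t1.micro",
      "terminateAfter": "30 Minutes"
    },
    {
      "id": "RunRemoteScript",
      "name": "RunRemoteScript",
      "type": "ShellCommandActivity",
      "runsOn": { "ref": "MyEC2Resource" },
      "scriptUri": "s3://my-scripts-bucket/simplescript.sh"
    }
  ]
}
```

Once saved to a file, a definition like this can be registered and run with the AWS CLI using the aws datapipeline create-pipeline, put-pipeline-definition, and activate-pipeline commands.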
