AWS CodePipeline is a fully managed continuous delivery service that helps you automate your release pipelines for fast and reliable application and infrastructure updates.
CodePipeline automates the build, test, and deploy phases of your release process every time there is a code change, based on the release model you define.
This enables you to rapidly and reliably deliver features and updates.
You can easily integrate AWS CodePipeline with third-party services such as GitHub or with your own custom plugin.
With AWS CodePipeline, you only pay for what you use.
Webhooks are used to trigger the pipeline when the source is a GitHub repository.
CodePipeline now supports CloudFormation StackSets operations, simplifying automation of CloudFormation deployments across accounts.
You can integrate CodePipeline with GitHub Enterprise, Bitbucket, GitLab, or any other Git server that supports the webhooks functionality available in most Git software.
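For a GitHub (version 1) source, the webhook can be registered through the CodePipeline API itself. A minimal sketch using boto3 is shown below; the pipeline name, source action name, and secret token are placeholder assumptions.

```python
# Sketch: register a webhook so a GitHub (version 1) source action starts the
# pipeline on push. Pipeline name, action name, and secret are placeholders.
import boto3

codepipeline = boto3.client("codepipeline")

codepipeline.put_webhook(
    webhook={
        "name": "my-pipeline-webhook",
        "targetPipeline": "my-pipeline",   # pipeline to start
        "targetAction": "Source",          # name of the GitHub source action
        "filters": [
            {   # start only when the push is to the branch tracked by the source action
                "jsonPath": "$.ref",
                "matchEquals": "refs/heads/{Branch}",
            }
        ],
        "authentication": "GITHUB_HMAC",
        "authenticationConfiguration": {"SecretToken": "replace-with-a-secret"},
    }
)

# Create the corresponding webhook on the GitHub repository itself.
codepipeline.register_webhook_with_third_party(webhookName="my-pipeline-webhook")
```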
CodePipeline can deploy changes using the following AWS services:
CodeDeploy
Elastic Beanstalk
Amazon ECS
AWS Fargate
In AWS CodePipeline, an action is a task performed on an artifact in a stage. If an action or a set of parallel actions is not completed successfully, the pipeline stops running.
In AWS CodePipeline, an action is part of the sequence in a stage of a pipeline.
It is a task performed on the artifact in that stage.
Pipeline actions occur in a specified order, in sequence or in parallel, as determined in the configuration of the stage.
CodePipeline supports six default types of actions:
Source
Build
Test
Deploy
Approval
Invoke
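To illustrate how stages and actions fit together, here is a minimal sketch of a three-stage pipeline (Source, Build, Deploy) created with boto3; the role ARN, bucket names, CodeBuild project, and CodeDeploy application and deployment group are placeholder assumptions.

```python
# Sketch: a minimal Source -> Build -> Deploy pipeline expressed as a boto3
# create_pipeline call. All names and ARNs are placeholders.
import boto3

codepipeline = boto3.client("codepipeline")

codepipeline.create_pipeline(
    pipeline={
        "name": "my-pipeline",
        "roleArn": "arn:aws:iam::111122223333:role/CodePipelineServiceRole",
        "artifactStore": {"type": "S3", "location": "my-codepipeline-artifact-bucket"},
        "stages": [
            {
                "name": "Source",
                "actions": [{
                    "name": "S3Source",
                    "actionTypeId": {"category": "Source", "owner": "AWS",
                                     "provider": "S3", "version": "1"},
                    "configuration": {"S3Bucket": "my-source-bucket",
                                      "S3ObjectKey": "source.zip"},
                    "outputArtifacts": [{"name": "SourceOutput"}],
                }],
            },
            {
                "name": "Build",
                "actions": [{
                    "name": "CodeBuild",
                    "actionTypeId": {"category": "Build", "owner": "AWS",
                                     "provider": "CodeBuild", "version": "1"},
                    "configuration": {"ProjectName": "my-build-project"},
                    "inputArtifacts": [{"name": "SourceOutput"}],
                    "outputArtifacts": [{"name": "BuildOutput"}],
                }],
            },
            {
                "name": "Deploy",
                "actions": [{
                    "name": "CodeDeploy",
                    "actionTypeId": {"category": "Deploy", "owner": "AWS",
                                     "provider": "CodeDeploy", "version": "1"},
                    "configuration": {"ApplicationName": "my-app",
                                      "DeploymentGroupName": "my-deployment-group"},
                    "inputArtifacts": [{"name": "BuildOutput"}],
                }],
            },
        ],
    }
)
```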
Custom Action: If your release process includes activities that are not included in the default actions, such as an internally developed build process or a test suite, you can create a custom action for that purpose and include it in your pipeline.
You can use the AWS CLI to create custom actions in pipelines associated with your AWS account.
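The equivalent SDK operation is CreateCustomActionType. A hedged boto3 sketch is below; the provider name and configuration property are purely illustrative.

```python
# Sketch: declare a custom Build action type (e.g. for an in-house build system).
# The provider name and the ProjectName property are illustrative only.
import boto3

codepipeline = boto3.client("codepipeline")

codepipeline.create_custom_action_type(
    category="Build",
    provider="MyInternalBuilder",      # hypothetical provider name
    version="1",
    configurationProperties=[
        {
            "name": "ProjectName",     # shown to pipeline authors when they add the action
            "required": True,
            "key": True,
            "secret": False,
            "type": "String",
            "description": "Name of the internal build project",
        }
    ],
    inputArtifactDetails={"minimumCount": 1, "maximumCount": 1},
    outputArtifactDetails={"minimumCount": 0, "maximumCount": 1},
)
```

A job worker for the custom action (for example, the internal build system) would then poll CodePipeline for jobs of this action type with PollForJobs and report the result back with PutJobSuccessResult or PutJobFailureResult.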
There are two ways to configure server-side encryption for Amazon S3 artifacts:
Default AWS managed key: CodePipeline creates an S3 artifact bucket and a default AWS managed key when you create a pipeline using the Create Pipeline wizard. This key is managed by AWS and is used to encrypt the artifact object data.
Customer managed key: you can create and manage your own customer managed KMS key and configure the pipeline's artifact store to use it.
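To use a customer managed key, the pipeline's artifact store is pointed at that key. A small sketch is below; the bucket name and key ARN are placeholders.

```python
# Sketch: point the pipeline's S3 artifact store at a customer managed KMS key.
# Bucket name and key ARN are placeholders.
artifact_store = {
    "type": "S3",
    "location": "my-codepipeline-artifact-bucket",
    "encryptionKey": {
        "id": "arn:aws:kms:us-east-1:111122223333:key/1234abcd-12ab-34cd-56ef-1234567890ab",
        "type": "KMS",
    },
}

# Pass this dict as the "artifactStore" field of the pipeline definition in
# create_pipeline() or update_pipeline(); omit "encryptionKey" to fall back to
# the default AWS managed key.
```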
To automatically trigger a pipeline when the source is an Amazon S3 bucket, an Amazon CloudWatch Events rule and an AWS CloudTrail trail must be configured.
When an object changes in the source S3 bucket, the API call is logged by AWS CloudTrail, the matching CloudWatch Events rule detects the event, and the rule starts the pipeline. This event-based method is the default and is faster than polling, so periodic checks should be disabled to keep CodePipeline triggering purely event-based.
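A boto3 sketch of such a rule is below. It assumes a CloudTrail trail is already logging S3 data events for the source bucket; the bucket, object key, pipeline ARN, and event role ARN are placeholders, and the role must allow CloudWatch Events to call codepipeline:StartPipelineExecution.

```python
# Sketch: a CloudWatch Events rule that starts the pipeline when a new source.zip
# is uploaded to the source bucket. Assumes CloudTrail is logging S3 data events
# for that bucket; names and ARNs are placeholders.
import json
import boto3

events = boto3.client("events")

events.put_rule(
    Name="start-pipeline-on-s3-source-change",
    State="ENABLED",
    EventPattern=json.dumps({
        "source": ["aws.s3"],
        "detail-type": ["AWS API Call via CloudTrail"],
        "detail": {
            "eventSource": ["s3.amazonaws.com"],
            "eventName": ["PutObject", "CompleteMultipartUpload", "CopyObject"],
            "requestParameters": {
                "bucketName": ["my-source-bucket"],
                "key": ["source.zip"],
            },
        },
    }),
)

events.put_targets(
    Rule="start-pipeline-on-s3-source-change",
    Targets=[{
        "Id": "codepipeline",
        "Arn": "arn:aws:codepipeline:us-east-1:111122223333:my-pipeline",
        # Role that allows events.amazonaws.com to start the pipeline execution.
        "RoleArn": "arn:aws:iam::111122223333:role/start-pipeline-event-role",
    }],
)
```

With the rule in place, the PollForSourceChanges setting on the S3 source action should be set to false so the pipeline is started only by events.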
You can use the following tools to monitor your CodePipeline pipelines and their resources:
Amazon CloudWatch Events — Use Amazon CloudWatch Events to detect and react to pipeline execution state changes (for example, send an Amazon SNS notification or invoke a Lambda function).
AWS CloudTrail — Use CloudTrail to capture API calls made by or on behalf of CodePipeline in your AWS account and deliver the log files to an Amazon S3 bucket. You can choose to have CloudTrail publish Amazon SNS notifications when new log files are delivered so you can take quick action.
Console and CLI — You can use the CodePipeline console and CLI to view details about the status of a pipeline or a particular pipeline execution.
Note that the periodic source check (polling) is a slower way to trigger CodePipeline than the event-based approach described above.
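As an example of the CloudWatch Events option, the rule sketched below publishes to an SNS topic whenever a pipeline execution fails; the topic ARN is a placeholder, and its access policy must allow CloudWatch Events to publish to it.

```python
# Sketch: notify an SNS topic whenever any pipeline execution fails.
import json
import boto3

events = boto3.client("events")

events.put_rule(
    Name="notify-on-pipeline-failure",
    State="ENABLED",
    EventPattern=json.dumps({
        "source": ["aws.codepipeline"],
        "detail-type": ["CodePipeline Pipeline Execution State Change"],
        "detail": {"state": ["FAILED"]},
    }),
)

events.put_targets(
    Rule="notify-on-pipeline-failure",
    Targets=[{"Id": "sns", "Arn": "arn:aws:sns:us-east-1:111122223333:pipeline-alerts"}],
)
```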
As a best practice, when you use a Jenkins build provider for your pipeline’s build or test action, install Jenkins on an Amazon EC2 instance and configure a separate EC2 instance profile.
Make sure the instance profile grants Jenkins only the AWS permissions required to perform tasks for your project, such as retrieving files from Amazon S3.
The instance profile provides applications running on an Amazon EC2 instance with the credentials to access other AWS services. As a result, you do not need to configure AWS credentials (AWS access key and secret key).
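A sketch of such a minimal instance profile created with boto3 follows; the role, policy, and bucket names are placeholders, and the inline policy should be widened only with the specific actions your Jenkins jobs actually need.

```python
# Sketch: a dedicated instance profile for the Jenkins EC2 instance, scoped to the
# artifact bucket. Role, policy, and bucket names are placeholders.
import json
import boto3

iam = boto3.client("iam")

ROLE_NAME = "jenkins-codepipeline-worker"

# Trust policy so EC2 can assume the role on the instance's behalf.
iam.create_role(
    RoleName=ROLE_NAME,
    AssumeRolePolicyDocument=json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"Service": "ec2.amazonaws.com"},
            "Action": "sts:AssumeRole",
        }],
    }),
)

# Least-privilege inline policy: only the artifact bucket access the build needs.
iam.put_role_policy(
    RoleName=ROLE_NAME,
    PolicyName="jenkins-artifact-access",
    PolicyDocument=json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject"],
            "Resource": "arn:aws:s3:::my-codepipeline-artifact-bucket/*",
        }],
    }),
)

# Wrap the role in an instance profile and attach it to the Jenkins instance
# (at launch time or via associate-iam-instance-profile).
iam.create_instance_profile(InstanceProfileName=ROLE_NAME)
iam.add_role_to_instance_profile(InstanceProfileName=ROLE_NAME, RoleName=ROLE_NAME)
```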