CI/CD on AWS: Building an Automated Image Resizing Pipeline with Lambda and CodePipeline

The goal of this project is to build a serverless application that automatically resizes images uploaded to an S3 bucket and stores the resized versions in another S3 bucket. The application will leverage AWS services such as AWS Lambda, Amazon S3, and AWS CodePipeline for a streamlined CI/CD workflow. The resizing functionality will be implemented using a Python script and packaged as a Docker container.


What is CI/CD?

CI/CD stands for Continuous Integration and Continuous Deployment. It is a set of practices and principles in software development that aims to automate and streamline the process of building, testing, and deploying code changes.

  1. Continuous Integration
    • CI is the practice of merging code changes from multiple developers into one central repository, think GitHub or AWS CodeCommit. Each commit triggers an automated build process which compiles the new code and runs a number of security and performance checks against it.
  2. Continuous Deployment
    • CD is the practice of automatically deploying changes to a staging environment, or straight to production if you’re brave enough. Changes are only deployed via the CD stage if all of the tests defined in the CI stage pass.

We’ll be implementing CI/CD using AWS CodePipeline, CodeBuild, and CodeCommit. Code changes trigger the pipeline, which automatically builds the updated Docker image, pushes it to ECR, and deploys it to our Lambda function. This streamlined process enables faster development cycles and reduces the risk of errors by eliminating manual intervention.

Link to project code
GitHub link

S3 Buckets

Let’s get started!

  • First of all, you’ll need to create two S3 buckets: one for image uploads and one for storing the resized images.
  • Go ahead and create these two buckets, leaving all settings at their defaults.
    • Bucket 1 = image-upload-bucket-404
    • Bucket 2 = resized-images-bucket-404
    • Remember that bucket names have to be globally unique; I’ve just appended the day and the month onto the end of mine.

Create an Amazon ECR Repository

  1. Open the AWS Management Console and navigate to the ECR service.
  2. Click on “Create repository” and provide a name for your Docker image repository. I’ve called mine “image-resizer”.
  3. Note down the repository URI, as you’ll need it later. (If you prefer the command line, a CLI sketch covering both of these steps follows below.)
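
If you’d rather script these two steps, the AWS CLI equivalents look roughly like the commands below. This is just a sketch: the bucket names are the examples from this walkthrough and must be replaced with your own globally unique names, and the region should match your own.

    # Create the upload and destination buckets (names must be globally unique)
    aws s3 mb s3://image-upload-bucket-404 --region ap-southeast-2
    aws s3 mb s3://resized-images-bucket-404 --region ap-southeast-2

    # Create the ECR repository and print its URI for later
    aws ecr create-repository --repository-name image-resizer --query 'repository.repositoryUri' --output text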

Set up AWS CodeCommit Repository

  1. Open the AWS Management Console and navigate to the CodeCommit service.
  2. Click on “Create repository” and provide a name for your code repository. To keep the naming consistent, I’ve named my repo image-resizer-code.
  3. You’ll now need to install Git on your local PC. Follow the instructions provided by AWS.
  4. Set up your Git credentials – https://docs.aws.amazon.com/codecommit/latest/userguide/setting-up-gc.html?icmpid=docs_acc_console_connect_np
  5. In the AWS CodeCommit console, navigate to your repository. Click on the “Clone URL” button and copy the HTTPS clone URL.

  6. In your terminal or command prompt, navigate to the directory where you want to clone the repository. Run the following command to clone the repository:

    git clone https://git-codecommit.ap-southeast-2.amazonaws.com/v1/repos/image-resizer-code

    Replace the URL with the one you copied from the AWS CodeCommit console and, when prompted, sign in with the Git credentials you created in step 4.

  7. Navigate to the cloned repository: change your current directory into it:

    cd image-resizer-code

    Add your files: Copy or move the files you want to push into the cloned repository directory.

  8. Stage the changes: Run the following command to stage all the changes:

    git add .
  9. Commit the changes: Run the following command to commit the staged changes with a meaningful commit message:

    git commit -m "Initial commit"

    Replace “Initial commit” with a descriptive message for your commit. 

  10. Push the changes to AWS CodeCommit: Run the following command to push the changes to the AWS CodeCommit repository:

    git push -u origin master

    This command pushes the changes from your local “master” branch to the “master” branch of the remote repository. The -u flag sets the upstream reference for future pushes.

Create a Docker Image and Push to ECR

Here are the steps to install Docker and create a Docker image for your project:

  1. Install Docker:

    • If you don’t already have it, install Docker Desktop (or Docker Engine) for your operating system by following the official Docker documentation.
  2. Create the necessary files for your Docker image:

    • Make sure you have the following files in your repository (a minimal sketch of the Dockerfile and requirements.txt follows at the end of this section):
      1. Dockerfile
      2. requirements.txt
      3. resize-image-lambda-function.py
  3. Build the Docker image:

    • Open a terminal or command prompt and navigate to the directory where your files are located.
    • Run the following command to build the Docker image:
      docker build -t image-resizer .
  4. Authenticate with Amazon Elastic Container Registry (ECR):

    • Replace us-east-1 with your desired AWS region and 123456789012 with your AWS account ID in the following commands.
    • Run the command to authenticate with ECR:
      aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin 123456789012.dkr.ecr.us-east-1.amazonaws.com
  5. Tag the Docker image:

    • Tag the Docker image with the ECR repository URL:
      docker tag image-resizer:latest 123456789012.dkr.ecr.us-east-1.amazonaws.com/image-resizer:latest
  6. Push the Docker image to ECR:

    • Push the tagged Docker image to your ECR repository:
      docker push 123456789012.dkr.ecr.us-east-1.amazonaws.com/image-resizer:latest

Note: You can also find these commands in the ECR repository by clicking on the “View Push Commands” button. Make sure to replace the placeholders (region and account ID) with your specific values.

After completing these steps, your Docker image will be built and pushed to your ECR repository.
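
For reference, here is a minimal sketch of what the Dockerfile and requirements.txt could look like, following AWS’s standard Python container-image layout. It is an assumption, not the project’s exact code: it presumes the handler function is called lambda_handler and copies the hyphenated source file to app.py so the handler string is a valid Python module name. Adjust the base image tag and file names to your own project.

    # Dockerfile (sketch) – AWS Lambda Python base image
    FROM public.ecr.aws/lambda/python:3.12

    # Install dependencies into the Lambda task root
    COPY requirements.txt ${LAMBDA_TASK_ROOT}
    RUN pip install -r requirements.txt

    # Copy the function code; renamed to app.py so the handler module name has no hyphens
    COPY resize-image-lambda-function.py ${LAMBDA_TASK_ROOT}/app.py

    # Tell Lambda which handler to invoke: <module>.<function>
    CMD [ "app.lambda_handler" ]

And a matching requirements.txt sketch (boto3 is already present in the base image, but pinning it here does no harm):

    # requirements.txt (sketch)
    boto3
    Pillow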

Create an AWS Lambda Function

  1. Open the AWS Management Console and navigate to the Lambda service.
  2. Click on “Create function” and choose “Container image” as the package type.
  3. Provide a name for your Lambda function (e.g., “image-resizer-function”).
  4. Select the ECR repository and the latest image version.

  5. Leave the rest as default and click “Create function”.
  6. Within the Function overview panel, click “Add trigger”.

  7. Set the trigger source to S3 and select the upload bucket we created right at the start (image-upload-bucket-404).
  8. Set the Event type to “All object create events”. Once configured, click “Add”.

  9. Next, we need to tell the container where to grab the uploaded image from and where to put the resized version.
  10. Go to the Configuration tab, find Environment variables on the left, and click Edit -> Add environment variable.
  11. Set the Key to DESTINATION_BUCKET and the Value to the name of the bucket where the resized images should go.
  12. Set the Key to SOURCE_BUCKET and the Value to the name of the source bucket. A minimal sketch of a handler that uses these variables follows this list.
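
For context, here is a minimal sketch of what resize-image-lambda-function.py could look like. It is not the project’s exact code: it assumes the S3 trigger configured above, reads the SOURCE_BUCKET and DESTINATION_BUCKET environment variables, and uses Pillow for the resize. The 200x200 target size is illustrative, and the handler name (lambda_handler) must match whatever your Dockerfile’s CMD points at.

    import io
    import os
    import urllib.parse

    import boto3
    from PIL import Image  # Pillow – listed in requirements.txt

    s3 = boto3.client("s3")

    # Bucket names come from the environment variables configured above
    SOURCE_BUCKET = os.environ["SOURCE_BUCKET"]
    DESTINATION_BUCKET = os.environ["DESTINATION_BUCKET"]
    TARGET_SIZE = (200, 200)  # illustrative target size, not from the original project


    def lambda_handler(event, context):
        for record in event.get("Records", []):
            # S3 event keys are URL-encoded (spaces arrive as '+')
            key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

            # Download the original image from the upload bucket
            response = s3.get_object(Bucket=SOURCE_BUCKET, Key=key)
            image = Image.open(io.BytesIO(response["Body"].read()))

            # Resize in memory, keeping the original format where possible
            image.thumbnail(TARGET_SIZE)
            buffer = io.BytesIO()
            image.save(buffer, format=image.format or "PNG")

            # Upload the resized image to the destination bucket under the same key
            s3.put_object(Bucket=DESTINATION_BUCKET, Key=key, Body=buffer.getvalue())

        return {"statusCode": 200}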

IAM Role and Policy

We need to grant our Lambda function permission to read images from the source bucket, write resized images to the destination bucket, and write logs to CloudWatch. We’ll do that with a dedicated IAM role and policy:

  1. Open the IAM console:

    • Sign in to the AWS Management Console.
    • Navigate to the IAM service by clicking on “Services” in the top navigation bar and searching for “IAM”.
  2. Create a new IAM role:

    • In the IAM dashboard, click on “Roles” in the left sidebar.
    • Click on the “Create role” button.
  3. Select the trusted entity:

    • Under “Select type of trusted entity,” choose “AWS service”.
    • In the “Choose a use case” section, select “Lambda” from the list of services.
    • Click on the “Next: Permissions” button.
  4. Attach permissions to the role:

    • On the “Attach permissions policies” page, you can choose to attach existing policies or create a new policy.
    • To create a new policy, click on the “Create policy” button.
  5. Create the IAM policy:

    • In the “Create policy” window, click on the “JSON” tab.
    • Paste the following IAM policy JSON into the text area:

{
 "Version": "2012-10-17",
 "Statement": [
   {
     "Effect": "Allow",
     "Action": [
       "s3:GetObject"
     ],
     "Resource": "arn:aws:s3:::image-upload-bucket-404/*"
   },
   {
     "Effect": "Allow",
     "Action": [
       "s3:PutObject"
     ],
     "Resource": "arn:aws:s3:::resized-images-bucket-404/*"
   },
   {
     "Effect": "Allow",
     "Action": [
       "logs:CreateLogGroup",
       "logs:CreateLogStream",
       "logs:PutLogEvents"
     ],
     "Resource": "arn:aws:logs:*:*:*"
   }
 ]
}
    • Replace image-upload-bucket-404 and resized-images-bucket-404 with the actual names of your source and destination S3 buckets, respectively.
    • Click on the “Review policy” button.
  6. Name and create the policy:
    • Give the policy a name (e.g., “LambdaImageResizingPolicy”).
    • Click on the “Create policy” button to create the IAM policy.
  7. Attach the policy to the role:
    • Go back to the “Create role” window and refresh the list of policies.
    • Search for the policy you just created (e.g., “LambdaImageResizingPolicy”) and select it by checking the box next to it.
    • Click on the “Next: Tags” button.
  8. (Optional) Add tags to the role:
    • If desired, add any tags to the role for identification or organizational purposes.
    • Click on the “Next: Review” button.
  9. Review and create the role:
    • Review the role details, including the trusted entity and attached permissions.
    • Give the role a name (e.g., “LambdaImageResizingRole”) and an optional description.
    • Click on the “Create role” button to create the IAM role.
  10. To attach this role to the Lambda function, open the function’s Configuration -> Permissions tab:
    • Click Edit.
    • At the bottom of the page, under Existing role, click the drop-down arrow and select the role you’ve just created.
    • Click Save.
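
If you prefer the CLI, swapping the execution role can also be scripted. A sketch, assuming the example function and role names used above and the placeholder account ID 123456789012:

    # Point the Lambda function at the new execution role (account ID and names are placeholders)
    aws lambda update-function-configuration \
      --function-name image-resizer-function \
      --role arn:aws:iam::123456789012:role/LambdaImageResizingRole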

Code Pipeline

  1. Navigate to the CodePipeline dashboard and select Create pipeline.
  2. Leave everything at its default, apart from giving your pipeline (and its service role) a name.
  3. Click Next.
  4. Set the source provider to AWS CodeCommit and the repository name to the repo where the container code is stored.
  5. Click Next.
  6. Set the build provider to AWS CodeBuild and select your region.
  7. We don’t have a build project set up yet, so click the “Create project” button.

Creating the Build Project

  1. Set the Provisioning model to On-demand. This option automatically provisions build infrastructure in response to new builds.
  2. For Environment image, select Managed image. This uses an image managed by AWS CodeBuild.
  3. Set Compute to EC2.
  4. Choose Amazon Linux as the Operating system.
  5. Select Standard for the Runtime.
  6. For the Image, choose aws/codebuild/amazonlinux2-x86_64-standard:5.0 or a later version.
  7. Set the Image version to “Always use the latest image for this runtime version”.

  8. Expand Additional configuration and scroll down to Environment variables. (If you’re building Docker images with a managed image, you’ll typically also need the Privileged flag enabled here so the build can run Docker commands.)
  9. Create the following variables: AWS_DEFAULT_REGION, ECR_REGISTRY, ECR_REPOSITORY and LAMBDA_FUNCTION_NAME, setting each one’s value to match your own configuration. These are the variables the buildspec relies on.

  10. We want to create a new role to assign to this project, so select New service role and give it a name.
  11. Select “Use a buildspec file” (a minimal buildspec sketch follows at the end of this section).

  12. After you’ve created the project, it should automatically appear in the Project name field back in the pipeline wizard.

  13. Select “Skip deploy stage”.
  14. Review and click “Create pipeline”.
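
Since we told CodeBuild to use a buildspec file, the repository needs a buildspec.yml at its root. Here is a minimal sketch of what it might contain, assuming the environment variables created above (AWS_DEFAULT_REGION, ECR_REGISTRY, ECR_REPOSITORY, LAMBDA_FUNCTION_NAME); the aws lambda update-function-code step at the end is one way of pointing the function at the freshly pushed image — adapt it to your own project layout.

    # buildspec.yml (sketch)
    version: 0.2

    phases:
      pre_build:
        commands:
          # Log in to ECR using the registry supplied via environment variables
          - aws ecr get-login-password --region $AWS_DEFAULT_REGION | docker login --username AWS --password-stdin $ECR_REGISTRY
      build:
        commands:
          # Build the container image and tag it for ECR
          - docker build -t $ECR_REPOSITORY:latest .
          - docker tag $ECR_REPOSITORY:latest $ECR_REGISTRY/$ECR_REPOSITORY:latest
      post_build:
        commands:
          # Push the image and update the Lambda function to use it
          - docker push $ECR_REGISTRY/$ECR_REPOSITORY:latest
          - aws lambda update-function-code --function-name $LAMBDA_FUNCTION_NAME --image-uri $ECR_REGISTRY/$ECR_REPOSITORY:latest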

Code Deploy

  1. Open the IAM Console
    • Go to the AWS Management Console and navigate to the IAM service.
    • In the navigation pane, click on “Roles”.
  2. Find the “Resize_Image_Project” Role
    • In the list of roles, look for the “Resize_Image_Project” role.
    • Click on the role name to open the role details.
  3. Update the Trust Policy
    • In the role details page, click on the “Trust relationships” tab.
    • Click on the “Edit trust relationship” button.
    • In the “Policy Document” section, you should see a JSON document that defines the trust relationship. Look for the “Statement” section.
    • Add another statement to allow CodeDeploy to assume the role. The updated policy should look like this:
      
      {
        "Version": "2012-10-17",
        "Statement": [
          {
            "Effect": "Allow",
            "Principal": {
              "Service": "lambda.amazonaws.com"
            },
            "Action": "sts:AssumeRole"
          },
          {
            "Effect": "Allow",
            "Principal": {
              "Service": "codedeploy.amazonaws.com"
            },
            "Action": "sts:AssumeRole"
          }
        ]
      }
      
    • Click on the “Update Trust Policy” button to save the changes.
  4. Attach the CodeDeploy Policy
    • In the role details page, click on the “Permissions” tab.
    • Click on the “Add permissions” button and select “Attach policies”.
    • In the search box, type “AWSCodeDeployRoleForLambda” and check the box next to the policy with the same name.
    • Click on the “Attach policies” button at the bottom of the page.
  5. Copy the Role ARN – you’ll need it when creating the deployment group below.
  6. Load up CodeDeploy and Click Create Application
    • Open the AWS CodeDeploy console.
    • Click on “Create application”.
    • Provide a name for your application (e.g., “Image_Resize”).
    • Set the compute platform to AWS Lambda.
  7. Create Deployment Group
    • After entering the application details, click “Create Deployment Group”.
    • Enter a group name (e.g., “Image_Resize_Group”).
    • In the “Service role” field, enter the ARN (Amazon Resource Name) of the “Resize_Image_Project” role you updated earlier.
    • Leave the deployment settings at their defaults.
    • Create the deployment group.

CodePipeline

  1. Creating a CodePipeline
    • Go to the AWS Management Console and navigate to CodePipeline.
    • Click Create pipeline
    • Name your pipeline
    • Leave the remaining settings default and click next.
  2. Add Source Stage
    • Set your source provider to AWS CodeCommit.
    • Choose your repository and branch.
    • Set the change detection option to Amazon CloudWatch Events.
  3. Add Build Stage
    • Choose AWS CodeBuild as the build provider
    • Select your CodeBuild project
    • Configure build type to single build.
  4. Add Deploy Stage
    • Select AWS Lambda as the deploy provider
    • Choose your Lambda function
    • Specify the deployment configuration
  5. Review and Create
    • Review all pipeline settings
    • Click “Create pipeline” to finalize

Testing the Pipeline

Now that we’ve set up all the different parts of the pipeline, it’s time to test it end-to-end to ensure all the components work as expected.

  1. Push a Code Change to CodeCommit
    • Open your local clone of the CodeCommit repository.
    • Make a small change to the Notes.txt file.
    • Option A: Using Git Command Line
      • Commit and push your changes using Git commands (a command-line sketch of all four testing steps appears after this list).
    • Option B: Using Visual Studio Code
      • Open your project folder in VS Code.
      • Make your desired change in the file.
      • Use the Source Control view to commit and push changes.
  2. Monitor the Pipeline Execution in the AWS Console
    • Navigate to the AWS CodePipeline console.
    • Select your pipeline from the list.
    • Watch as the pipeline progresses through the stages.
    • If any stage fails, click on the “Details” link to view logs and diagnose the issue.

  3. Upload a Test Image to the Source S3 Bucket
    • Open the AWS Management Console and navigate to S3.
    • Select your source bucket (e.g., image-upload-bucket-404).
    • Click “Upload” and select a test image from your local machine.
    • Click “Upload” to finish the process.
  4. Verify the Resized Image and Check Lambda Logs
    • Return to the S3 console.
    • Select your destination bucket (e.g., resized-images-bucket-404).
    • Look for a new object with the same name as your original image.
    • Download the new object and verify that it has been resized correctly.
    • To check the Lambda function logs:
      • Navigate to the CloudWatch console.
      • In the left sidebar, click on “Log groups” under the “Logs” section.
      • In the search bar, enter “/aws/lambda/” followed by your Lambda function name (e.g., /aws/lambda/image-resizer-function).
      • Click on the log group that matches your Lambda function name.
      • You’ll see a list of log streams. Click on the most recent one to view the latest execution logs.
      • Review the logs to ensure the function executed successfully and to troubleshoot any issues if needed.
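
All of these testing steps can also be driven from the terminal. Here is a sketch using Git and the AWS CLI; Notes.txt, test.jpg, and the log group name are the example names used in this walkthrough, so substitute your own.

    # Steps 1–2: trigger the pipeline with a commit and push, then watch it in the console
    git add Notes.txt
    git commit -m "Test pipeline trigger"
    git push

    # Step 3: upload a test image to the source bucket
    aws s3 cp test.jpg s3://image-upload-bucket-404/

    # Step 4: confirm the resized copy appeared, then tail the function's logs (AWS CLI v2)
    aws s3 ls s3://resized-images-bucket-404/
    aws logs tail /aws/lambda/image-resizer-function --follow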

Troubleshooting Tips

I ran into a few issues during this project; here are a couple of pointers.

  • Ensure your CodeCommit repository is correctly linked to your pipeline.
  • Check that your S3 event trigger is properly configured on the source bucket.
  • Review the IAM roles and policies to ensure they have the necessary permissions.
  • Examine the CloudWatch Logs for your Lambda function to identify any runtime errors.
  • If using VS Code, ensure you’ve set up AWS credentials correctly in your environment.

 

Conclusion

In this project, we’ve successfully built a serverless, automated image resizing pipeline using various AWS services. Let’s recap what we’ve accomplished:

  1. We set up a CI/CD pipeline using AWS CodePipeline, integrating source control (CodeCommit), build processes (CodeBuild), and deployment (CodeDeploy).
  2. We created a serverless application using AWS Lambda that automatically resizes images uploaded to an S3 bucket.
  3. We containerized our application using Docker and stored our image in Amazon Elastic Container Registry (ECR).
  4. We implemented event-driven architecture, triggering our Lambda function automatically when new images are uploaded to S3.
  5. We applied least-privilege IAM roles and policies to keep the pipeline robust and secure.

This solution offers several benefits:

  • Scalability: Being serverless, it can handle varying loads without manual intervention.
  • Cost-effectiveness: You only pay for the compute resources you use.
  • Automation: The CI/CD pipeline automates the entire process from code commit to deployment.
  • Flexibility: The containerized approach allows for easy updates and modifications to the image processing logic.

While this project focused on image resizing, the same architecture can be adapted for various image processing tasks such as format conversion, applying filters, or even more complex operations using machine learning models.

As next steps, you might consider:

  1. Expanding the image processing capabilities.
  2. Implementing a web frontend for manual image uploads.
  3. Adding monitoring and alerting for better observability.
  4. Exploring multi-region deployment for improved global performance.