Serverless Made Easy: Run Custom Code in AWS Lambda Without Writing New Code!
Case Study

June 16, 2024
Deploying an AWS Lambda that Runs bash

AWS Lambda is a serverless computing platform that allows you to run your code without provisioning or managing infrastructure.

In this article, we will deploy an AWS Lambda function that runs a Docker image, executing a bash script.

By running scripts within Docker images in AWS Lambda, you can run almost any workload serverlessly, without writing new application code.

In this example, we will configure AWS and write a script that checks the number of files in an S3 bucket and updates a DynamoDB table with the number.

Prerequisites:

  • AWS account
  • AWS CLI installed and configured
  • Docker installed

Building and Publishing the Docker Image

  1. Create a new directory for your project and navigate into it.
  2. Create a bash script named lambda-entrypoint.sh. This script will be executed as the entry point when the Docker image runs: it checks the number of files in an S3 bucket and updates a DynamoDB table with the count:
#!/bin/bash

echo "Checking the number of files in the S3 bucket..."

# Set BUCKET_NAME as a Lambda environment variable
file_count=$(aws s3 ls s3://"${BUCKET_NAME}" --recursive | wc -l)

# Set TABLE_NAME as a Lambda environment variable
aws dynamodb update-item --table-name="${TABLE_NAME}" --key='{"id": {"S": "'"${BUCKET_NAME}"'" }}' --attribute-updates '{"count": {"Action":"PUT", "Value":{"N":"'"${file_count}"'"}}}'

echo "DynamoDB table updated with the number of files in ${BUCKET_NAME} S3 bucket."
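Before containerizing, you can smoke-test the script locally. This assumes the AWS CLI is configured and that the bucket and table already exist; the names below are placeholders:

```shell
# Placeholder names -- replace with your own bucket and table
export BUCKET_NAME=my-demo-bucket
export TABLE_NAME=my-demo-table

# Make the script executable and run it once
chmod +x lambda-entrypoint.sh
./lambda-entrypoint.sh
```

If the script prints the closing message and the DynamoDB item is updated, it is ready to be packaged.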

3. Create a file named Dockerfile in the new directory, which will be used to build the Docker image. A Dockerfile is a script that contains instructions on how to build a Docker image:

# Use an official Python lambda runtime as the base image
FROM public.ecr.aws/lambda/python:3.8

# Install aws-cli, used by the bash script
RUN pip3 install --no-cache-dir awscli

# Overwrite the base image's /lambda-entrypoint.sh with our script, so the
# container runs our bash script instead of the default Lambda bootstrap,
# and make it executable
COPY lambda-entrypoint.sh /lambda-entrypoint.sh
RUN chmod +x /lambda-entrypoint.sh

4. Create an S3 bucket and upload a few files to it.
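For example, this step can be done from the CLI (the bucket name below is a placeholder; S3 bucket names must be globally unique):

```shell
# Create the bucket and upload a sample file
aws s3 mb s3://my-demo-bucket --region <your-region>
echo "hello" > hello.txt
aws s3 cp hello.txt s3://my-demo-bucket/
```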

5. Create a DynamoDB table with the partition key “id” (type String, since the script writes the bucket name as the key).
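If you prefer the CLI, here is a minimal sketch of the table creation (the table name is a placeholder; the partition key type must be String to match the script's update-item call):

```shell
aws dynamodb create-table \
  --table-name my-demo-table \
  --attribute-definitions AttributeName=id,AttributeType=S \
  --key-schema AttributeName=id,KeyType=HASH \
  --billing-mode PAY_PER_REQUEST
```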

6. Build the Docker image using the Dockerfile. Open a terminal window in the directory containing the Dockerfile and run the following command:

docker build -t lambda-count-s3-files .
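Because the image's entry point is now our script, you can optionally run the container once locally to verify it works, passing your AWS credentials and the two environment variables (the names below are placeholders):

```shell
docker run --rm \
  -e BUCKET_NAME=my-demo-bucket \
  -e TABLE_NAME=my-demo-table \
  -e AWS_ACCESS_KEY_ID -e AWS_SECRET_ACCESS_KEY -e AWS_DEFAULT_REGION \
  lambda-count-s3-files
```

The container runs the script once and exits, updating the DynamoDB item as a side effect.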
  7. Pushing the Docker image to AWS ECR:

Create a new repository in AWS ECR:

aws ecr create-repository \
--repository-name <name> \
--region <your-region>

Tag the Docker image with the repository URI:

docker tag lambda-count-s3-files:latest <your-repository-uri>:latest

Log in to AWS ECR using the AWS CLI:

aws ecr get-login-password --region <your-region> | docker login --username AWS --password-stdin <your-account-id>.dkr.ecr.<your-region>.amazonaws.com

Push the Docker image to AWS ECR:

docker push <your-repository-uri>:latest

Running the Docker Image in AWS Lambda

Deploy via the AWS Console:

  1. Go to the AWS Lambda console.
  2. Click on the “Create function” button.
  3. Choose “Container Image”, enter a function name, and click “Browse images” to select the image you pushed to ECR.

4. In the Configuration tab, add the BUCKET_NAME and TABLE_NAME environment variables.

5. Create an IAM role and attach an IAM policy that allows access to S3, DynamoDB, and CloudWatch. The policy below grants full permissions over the chosen S3 bucket and DynamoDB table; for production use, it is recommended to create a policy with fine-grained permissions.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:GetBucketLocation",
        "s3:ListAllMyBuckets"
      ],
      "Resource": "arn:aws:s3:::*"
    },
    {
      "Effect": "Allow",
      "Action": "s3:*",
      "Resource": [
        "arn:aws:s3:::<bucket_name>",
        "arn:aws:s3:::<bucket_name>/*"
      ]
    },
    {
      "Sid": "SpecificTable",
      "Effect": "Allow",
      "Action": [
        "dynamodb:*"
      ],
      "Resource": "arn:aws:dynamodb:*:*:table/<table_name>"
    },
    {
      "Effect": "Allow",
      "Action": [
        "logs:CreateLogGroup",
        "logs:CreateLogStream",
        "logs:PutLogEvents",
        "ec2:CreateNetworkInterface",
        "ec2:DescribeNetworkInterfaces",
        "ec2:DeleteNetworkInterface",
        "ec2:AssignPrivateIpAddresses",
        "ec2:UnassignPrivateIpAddresses"
      ],
      "Resource": "*"
    }
  ]
}
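Alternatively, the console steps above can be sketched with the CLI, assuming the policy JSON is saved as policy.json (the role and function names below are placeholders):

```shell
# Create an execution role that Lambda can assume
aws iam create-role \
  --role-name count-s3-files-role \
  --assume-role-policy-document '{
    "Version": "2012-10-17",
    "Statement": [{
      "Effect": "Allow",
      "Principal": {"Service": "lambda.amazonaws.com"},
      "Action": "sts:AssumeRole"
    }]
  }'

# Attach the permissions policy shown above
aws iam put-role-policy \
  --role-name count-s3-files-role \
  --policy-name count-s3-files-policy \
  --policy-document file://policy.json

# Create the function from the container image
aws lambda create-function \
  --function-name count-s3-files \
  --package-type Image \
  --code ImageUri=<your-repository-uri>:latest \
  --role arn:aws:iam::<your-account-id>:role/count-s3-files-role \
  --environment "Variables={BUCKET_NAME=my-demo-bucket,TABLE_NAME=my-demo-table}"
```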


Test the function by providing a test event and clicking the “Test” button. After the execution, check your DynamoDB table for changes!
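You can also invoke the function and inspect the table from the CLI (the function and table names are the placeholders used earlier):

```shell
# Invoke the function with an empty event
aws lambda invoke \
  --function-name count-s3-files \
  --cli-binary-format raw-in-base64-out \
  --payload '{}' \
  response.json

# Read back the item the script wrote
aws dynamodb get-item \
  --table-name my-demo-table \
  --key '{"id": {"S": "my-demo-bucket"}}'
```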

In conclusion, deploying an AWS Lambda function that runs a Docker image with a bash script as the entry point is a straightforward process. You can use this setup to run your applications in a serverless environment and take advantage of the benefits of AWS Lambda.
