AWS Lambda Tutorial: Working with Amazon S3
This step-by-step tutorial guides you through creating and configuring a Lambda function that resizes images added to an Amazon Simple Storage Service (Amazon S3) bucket. When you add an image file to your bucket, Amazon S3 invokes your Lambda function. The function then creates a thumbnail version of the image and outputs it to a different Amazon S3 bucket.
Prerequisites
Before you get started, ensure that you have the following prerequisites in place:
- An AWS account
- The AWS CLI installed and configured
- Basic knowledge of AWS Lambda and Amazon S3
Steps
1. Create two Amazon S3 buckets
To create the Amazon S3 buckets (console)
1. Open the Buckets page of the Amazon S3 console.
2. Choose Create bucket.
3. Under General configuration, do the following:
   1. For Bucket name, enter a globally unique name that meets the Amazon S3 bucket naming rules. Bucket names can contain only lowercase letters, numbers, dots (.), and hyphens (-).
   2. For AWS Region, choose the AWS Region closest to your geographical location. Later in the tutorial, you must create your Lambda function in the same AWS Region, so make a note of the Region you chose.
4. Leave all other options set to their default values and choose Create bucket.
5. Repeat steps 1 to 4 to create your destination bucket. For Bucket name, enter `SOURCEBUCKET-resized`, where `SOURCEBUCKET` is the name of the source bucket you just created.
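You can check a candidate name against these rules locally before opening the console. A minimal sketch; the regex covers the character and length rules quoted above, not every additional S3 restriction (for example, names formatted like IP addresses):

```python
import re

def is_valid_bucket_name(name: str) -> bool:
    """Check the basic S3 bucket naming rules: 3-63 characters,
    lowercase letters, numbers, dots, and hyphens, starting and
    ending with a letter or number."""
    return re.fullmatch(r"[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]", name) is not None

print(is_valid_bucket_name("my-source-bucket"))          # True
print(is_valid_bucket_name("MyBucket"))                  # False: uppercase
print(is_valid_bucket_name("my-source-bucket-resized"))  # True
```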
2. Upload a test image to your source bucket
To upload a test image to your source bucket (console)
1. Open the Buckets page of the Amazon S3 console and choose your source bucket.
2. Choose Upload.
3. Choose Add files and use the file selector to choose the image you want to upload.
4. Choose Open, then choose Upload.
3. Create a permissions policy
To create the policy (console)
1. Open the Policies page of the AWS Identity and Access Management (IAM) console.
2. Choose Create policy.
3. Choose the JSON tab, and then paste the following custom policy into the JSON editor.

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "logs:PutLogEvents",
                "logs:CreateLogGroup",
                "logs:CreateLogStream"
            ],
            "Resource": "arn:aws:logs:*:*:*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetObject"
            ],
            "Resource": "arn:aws:s3:::*/*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject"
            ],
            "Resource": "arn:aws:s3:::*/*"
        }
    ]
}
```

4. Choose Next: Tags.
5. Choose Next: Review.
6. Under Review policy, for Name, enter `LambdaS3Policy`.
7. Choose Create policy.
4. Create an execution role
To create an execution role and attach your permissions policy (console)
1. Open the Roles page of the IAM console.
2. Choose Create role.
3. For Trusted entity type, select AWS service, and for Use case, select Lambda.
4. Choose Next.
5. Add the permissions policy you created in the previous step by doing the following:
   1. In the policy search box, enter `LambdaS3Policy`.
   2. In the search results, select the check box for `LambdaS3Policy`.
   3. Choose Next.
6. Under Role details, for Role name, enter `LambdaS3Role`.
7. Choose Create role.
5. Create the function deployment package
1. Save the example code as a file named `lambda_function.py`.
```python
import uuid
from urllib.parse import unquote_plus

import boto3
from PIL import Image

s3_client = boto3.client('s3')

def resize_image(image_path, resized_path):
    # Shrink the image to half its original dimensions,
    # preserving the aspect ratio
    with Image.open(image_path) as image:
        image.thumbnail(tuple(x / 2 for x in image.size))
        image.save(resized_path)

def lambda_handler(event, context):
    for record in event['Records']:
        bucket = record['s3']['bucket']['name']
        # Object keys in S3 event records are URL-encoded
        key = unquote_plus(record['s3']['object']['key'])
        tmpkey = key.replace('/', '')
        # /tmp is the only writable path in the Lambda execution environment
        download_path = '/tmp/{}{}'.format(uuid.uuid4(), tmpkey)
        upload_path = '/tmp/resized-{}'.format(tmpkey)
        s3_client.download_file(bucket, key, download_path)
        resize_image(download_path, upload_path)
        # Write the thumbnail to the destination bucket (SOURCEBUCKET-resized)
        s3_client.upload_file(upload_path,
                              '{}-resized'.format(bucket),
                              'resized-{}'.format(key))
```
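Before packaging, you can sanity-check the resize logic locally. A quick, self-contained sketch, assuming Pillow is installed (`pip install pillow`):

```python
from PIL import Image

# Build a 200x100 test image in memory and apply the same
# halving logic used in resize_image above
img = Image.new("RGB", (200, 100), "blue")
img.thumbnail(tuple(x / 2 for x in img.size))
print(img.size)  # (100, 50)
```

If this prints half the original dimensions, the deployed function should produce thumbnails the same way.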
2. In the same directory in which you created your `lambda_function.py` file, create a new directory named `package` and install the Pillow (PIL) library and the AWS SDK for Python (Boto3). Although the Lambda Python runtime includes a version of the Boto3 SDK, we recommend that you add all of your function's dependencies to your deployment package, even if they are included in the runtime. For more information, see Runtime dependencies in Python.
```shell
mkdir package
pip install \
    --platform manylinux2014_x86_64 \
    --target=package \
    --implementation cp \
    --python-version 3.9 \
    --only-binary=:all: --upgrade \
    pillow boto3
```
3. Create a .zip file containing your application code and the Pillow and Boto3 libraries. In Linux or macOS, run the following commands from your command line interface.
In Windows, use your preferred zip tool to create the `lambda_function.zip` file. Make sure that your `lambda_function.py` file and the folders containing your dependencies are all at the root of the .zip file.
6. Create the Lambda function
To create the function (console)
To create your Lambda function using the console, you first create a basic function containing some 'Hello world' code. You then replace this code with your own function code by uploading the .zip file you created in the previous step.
1. Open the Functions page of the Lambda console.
2. Make sure you're working in the same AWS Region in which you created your Amazon S3 buckets. You can change your Region using the drop-down list at the top of the screen.
3. Choose Create function.
4. Choose Author from scratch.
5. Under Basic information, do the following:
   1. For Function name, enter `CreateThumbnail`.
   2. For Runtime, choose Python 3.9.
   3. For Architecture, choose x86_64.
6. In the Change default execution role tab, do the following:
   1. Expand the tab, then choose Use an existing role.
   2. Select the `LambdaS3Role` you created earlier.
7. Choose Create function.
To upload the function code (console)
1. In the Code source pane, choose Upload from.
2. Choose .zip file (this tutorial uses the Python runtime).
3. Choose Upload.
4. In the file selector, select your .zip file and choose Open.
5. Choose Save.
6. Next, add a layer that provides the Pillow module. In the Klayers repository at https://github.com/keithrozario/Klayers/tree/master/deployments/python3.9, open the CSV file for your Region and copy the ARN listed for the Pillow package. Then, in the Layers pane of your function page, choose Add a layer, select Specify an ARN, and enter that ARN.
7. Configure Amazon S3 to invoke the function
To configure the Amazon S3 trigger (console)
1. Open the Functions page of the Lambda console and choose your function (`CreateThumbnail`).
2. Choose Add trigger.
3. Select S3.
4. Under Bucket, select your source bucket.
5. Under Event types, select All object create events.
6. Under Recursive invocation, select the check box to acknowledge that using the same Amazon S3 bucket for input and output is not recommended. You can learn more about recursive invocation patterns in Lambda by reading Recursive patterns that cause run-away Lambda functions in Serverless Land.
7. Choose Add.
8. Test your Lambda function with a dummy event
To test your Lambda function with a dummy event (console)
1. Open the Functions page of the Lambda console and choose your function (`CreateThumbnail`).
2. Choose the Test tab.
3. To create your test event, in the Test event pane, do the following:
   1. Under Test event action, select Create new event.
   2. For Event name, enter `myTestEvent`.
   3. For Template, select S3 Put.
   4. Replace the values for the following parameters with your own values:
      - For `awsRegion`, replace `us-east-1` with the AWS Region in which you created your Amazon S3 buckets.
      - For `name`, replace `example-bucket` with the name of your own Amazon S3 source bucket.
      - For `key`, replace `test%2Fkey` with the filename of the test object you uploaded to your source bucket in the step Upload a test image to your source bucket.
```json
{
    "Records": [
        {
            "eventVersion": "2.0",
            "eventSource": "aws:s3",
            "awsRegion": "us-east-1",
            "eventTime": "1970-01-01T00:00:00.000Z",
            "eventName": "ObjectCreated:Put",
            "userIdentity": {
                "principalId": "EXAMPLE"
            },
            "requestParameters": {
                "sourceIPAddress": "127.0.0.1"
            },
            "responseElements": {
                "x-amz-request-id": "EXAMPLE123456789",
                "x-amz-id-2": "EXAMPLE123/5678abcdefghijklambdaisawesome/mnopqrstuvwxyzABCDEFGH"
            },
            "s3": {
                "s3SchemaVersion": "1.0",
                "configurationId": "testConfigRule",
                "bucket": {
                    "name": "example-bucket",
                    "ownerIdentity": {
                        "principalId": "EXAMPLE"
                    },
                    "arn": "arn:aws:s3:::example-bucket"
                },
                "object": {
                    "key": "test%2Fkey",
                    "size": 1024,
                    "eTag": "0123456789abcdef0123456789abcdef",
                    "sequencer": "0A1B2C3D4E5F678901"
                }
            }
        }
    ]
}
```
4. Choose Save.
5. In the Test event pane, choose Test.
6. To ensure the function has enough time to complete, increase the timeout to 15 seconds (the default is 3 seconds): choose the Configuration tab, choose General configuration, choose Edit, and set Timeout to 15 seconds.
7. To check that your function has created a resized version of your image and stored it in your target Amazon S3 bucket, do the following:
   1. Open the Buckets page of the Amazon S3 console.
   2. Choose your target bucket and confirm that your resized file is listed in the Objects pane.
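One detail worth noting in the test event: Amazon S3 URL-encodes object keys in event records (hence the sample value `test%2Fkey`), which is why the handler decodes the key with `unquote_plus` before calling Amazon S3:

```python
from urllib.parse import unquote_plus

# Keys arrive URL-encoded in S3 event records; decode them before use
print(unquote_plus("test%2Fkey"))    # test/key
print(unquote_plus("my+photo.jpg"))  # my photo.jpg
```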
Clean up your resources
You can now delete the resources that you created for this tutorial, unless you want to retain them. By deleting AWS resources that you're no longer using, you prevent unnecessary charges to your AWS account.
To delete the Lambda function
1. Open the Functions page of the Lambda console.
2. Select the function that you created.
3. Choose Actions, Delete.
4. Type `delete` in the text input field and choose Delete.
To delete the policy that you created
1. Open the Policies page of the IAM console.
2. Select the policy that you created (`LambdaS3Policy`).
3. Choose Policy actions, Delete.
4. Choose Delete.
To delete the execution role
1. Open the Roles page of the IAM console.
2. Select the execution role that you created.
3. Choose Delete.
4. Enter the name of the role in the text input field and choose Delete.
To delete the S3 buckets
1. Open the Amazon S3 console.
2. Select the bucket you created. If the bucket isn't empty, you must empty it before you can delete it.
3. Choose Delete.
4. Enter the name of the bucket in the text input field.
5. Choose Delete bucket.
6. Repeat steps 2 to 5 for your destination bucket.
Conclusion
Congratulations! Your Lambda function now automatically resizes each image added to your source bucket and saves the thumbnail to your destination bucket.