Amazon S3 is a web-based cloud storage service provided by AWS (Amazon Web Services) that stores files in buckets. Amazon S3 provides durability and security for the files stored within the buckets, and there is no limit on the number of files you can store. Amazon S3 charges only for the resources you use; there are no hidden charges. There are different ways a user can upload and download content stored in an S3 bucket. In addition, a Lambda function can be triggered whenever a file is uploaded to an S3 bucket.
One way to trigger an AWS Lambda function from an S3 upload is through Amazon EventBridge; this article instead uses the native S3 event notification integration.
How to trigger a Lambda on S3 upload
Prerequisites
- Basic understanding of AWS SAM
- AWS credentials with permission to access the S3 bucket
- Basic understanding of Amazon S3
The following steps show the basic interaction between Amazon S3, AWS Lambda, and Amazon CloudWatch.
- A file is uploaded to an Amazon S3 bucket.
- Once the file is successfully uploaded, S3 generates an event that triggers a Lambda function.
- The Lambda function writes a log message that can be viewed in Amazon CloudWatch.
Upload a file in the S3 bucket
There are different ways to upload a file to an S3 bucket. One of them is to generate a pre-signed upload URL for the file and upload the file using that URL. Follow this link to create a bucket and upload a file to it.
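For a quick test, the file can also be uploaded with the AWS CLI, or with curl against an already generated pre-signed URL. The bucket name and prefix below match the example used in this article; adjust them for your setup, and note that `PRESIGNED_URL` is assumed to have been generated separately.

```shell
# Upload directly with the AWS CLI (uses your configured credentials).
aws s3 cp ./report.pdf s3://test-bucket/test/report.pdf

# Or, given a pre-signed upload URL, PUT the file with curl.
curl --request PUT --upload-file ./report.pdf "$PRESIGNED_URL"
```

Either upload lands the object under the test/ prefix, which is what the event filter in the next section matches.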
Trigger a lambda function
For this step, we will use AWS SAM to create a sample application in which the S3 event triggers the Lambda function.
A sample AWS SAM template to handle the S3 event is provided below.
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: simple lambda function
Globals: ## Include different parameters used in lambda function
  Function:
    Timeout: 30 ## Max execution time of the lambda in seconds
Resources:
  SourceBucket: ## The bucket must be declared in the same template so the event can reference it
    Type: AWS::S3::Bucket
    Properties:
      BucketName: test-bucket ## Name of the bucket
  LambdaTriggerFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: trigger-lambda/
      Handler: app.handler
      Runtime: nodejs14.x
      Policies:
        - AmazonS3FullAccess ## Policy to access the S3 bucket inside lambda
      Events:
        S3EventName:
          Type: S3 ## Type of the event
          Properties:
            Bucket: !Ref SourceBucket ## Reference to the bucket defined above
            Events: s3:ObjectCreated:*
            Filter:
              S3Key:
                Rules:
                  - Name: prefix
                    Value: test/ ## Only objects under this prefix trigger the function
The template.yml file is a declaration of AWS resources that determines how each resource is configured. In the above sample template file, the LambdaTriggerFunction is executed whenever a file is uploaded to the test folder of the bucket named test-bucket.
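Assuming the AWS SAM CLI is installed and your AWS credentials are configured, the template can be built and deployed with the standard SAM workflow:

```shell
sam build            # packages the function code in trigger-lambda/
sam deploy --guided  # prompts for stack name, region, and deployment settings
```

The --guided flag is only needed on the first deployment; it saves your answers to samconfig.toml for subsequent deploys.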
Generate log in Amazon Cloudwatch
To log a sample message to Amazon CloudWatch, the lambda function includes the following code.
exports.handler = function (event) {
  // The incoming event contains the S3 notification record(s).
  console.log("Incoming Event: ", event);
  // Read the bucket name from the first record.
  const bucket = event.Records[0].s3.bucket.name;
  const message = `File is uploaded in - ${bucket}`;
  console.log(message);
  return "";
};
The event parameter in the above code contains the detail object that the S3 event provides when a file is uploaded to the bucket. This lambda function is located in the app.js file inside the trigger-lambda folder, and it logs the message File is uploaded in - test-bucket to Amazon CloudWatch.
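Beyond the bucket name, handlers usually need the object key as well. Keys in S3 event notifications arrive URL-encoded, with spaces encoded as "+", so they must be decoded before use. The helper below is a hypothetical addition, not part of the article's handler, and the sample event is a trimmed-down version of the payload shown later:

```javascript
// Hypothetical helper: extracts the bucket name and the decoded object key
// from the first record of an S3 event notification.
function parseS3Event(event) {
  const record = event.Records[0];
  const bucket = record.s3.bucket.name;
  // S3 URL-encodes object keys in event payloads, using "+" for spaces.
  const key = decodeURIComponent(record.s3.object.key.replace(/\+/g, " "));
  return { bucket, key };
}

// Minimal sample event with only the fields the helper reads.
const sampleEvent = {
  Records: [
    {
      s3: {
        bucket: { name: "example-bucket" },
        object: { key: "test/my+file.txt" },
      },
    },
  ],
};

console.log(parseS3Event(sampleEvent)); // { bucket: 'example-bucket', key: 'test/my file.txt' }
```

Skipping the decoding step is a common bug: a key like "test/my file.txt" would otherwise come through as "test/my+file.txt" and fail any subsequent GetObject call.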
A sample payload contained inside the event parameter is given below:
{
  "Records": [
    {
      "eventVersion": "2.0",
      "eventSource": "aws:s3",
      "awsRegion": "us-east-1",
      "eventTime": "1970-01-01T00:00:00.000Z",
      "eventName": "ObjectCreated:Put",
      "userIdentity": {
        "principalId": "EXAMPLE"
      },
      "requestParameters": {
        "sourceIPAddress": "127.0.0.1"
      },
      "responseElements": {
        "x-amz-request-id": "EXAMPLE123456789",
        "x-amz-id-2": "EXAMPLE123/5678abcdefghijklambdaisawesome/mnopqrstuvwxyzABCDEFGH"
      },
      "s3": {
        "s3SchemaVersion": "1.0",
        "configurationId": "testConfigRule",
        "bucket": {
          "name": "example-bucket",
          "ownerIdentity": {
            "principalId": "EXAMPLE"
          },
          "arn": "arn:aws:s3:::example-bucket"
        },
        "object": {
          "key": "test/key",
          "size": 1024,
          "eTag": "0123456789abcdef0123456789abcdef",
          "sequencer": "0A1B2C3D4E5F678901"
        }
      }
    }
  ]
}
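To exercise the handler locally without deploying, the SAM CLI can generate a sample S3 event like the one above and invoke the function with it. The function name below matches the template in this article, and local invocation requires Docker:

```shell
# Generate a sample "ObjectCreated:Put" event for the bucket and key used here.
sam local generate-event s3 put --bucket test-bucket --key test/key > event.json

# Invoke the function locally with that event.
sam local invoke LambdaTriggerFunction --event event.json
```

The function's console.log output appears directly in the terminal, which makes it easy to verify the log message before checking CloudWatch.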
This article showed how an S3 event can trigger a lambda function when a file is uploaded to a specific S3 bucket.