This project is adapted directly from the official AWS tutorials and has been tested and proven to work. Before you begin, familiarize yourself with the three services used here: Amazon S3, AWS Lambda, and Amazon CloudWatch.
To complete this tutorial, you carry out the following steps:
Create an Amazon S3 bucket.
Create a Lambda function that returns the object type of objects in an Amazon S3 bucket.
Configure a Lambda trigger that invokes your function when objects are uploaded to your bucket.
Test your function, first with a dummy event, and then using the trigger.
By completing these steps, you’ll learn how to configure a Lambda function to run whenever objects are added to or deleted from an Amazon S3 bucket. You can complete this tutorial using only the AWS Management Console.
Create an Amazon S3 bucket
First create an Amazon S3 bucket using the AWS Management Console.
To create an Amazon S3 bucket
Open the Amazon S3 console and select the Buckets page.
Choose Create bucket.
Under General configuration, do the following:
For Bucket name, enter a globally unique name that meets the Amazon S3 Bucket naming rules. Bucket names can contain only lowercase letters, numbers, dots (.), and hyphens (-).
For AWS Region, choose a Region. Later in the tutorial, you must create your Lambda function in the same Region.
Leave all other options set to their default values and choose Create bucket.
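If you prefer to script this step instead of using the console, the following is a minimal sketch using the AWS SDK for JavaScript v2. The bucket name my-s3-trigger-tutorial-bucket and the Region us-west-2 are placeholders; substitute a globally unique name and the Region you plan to use for your Lambda function.

const AWS = require('aws-sdk');

const region = 'us-west-2'; // placeholder: use the Region you'll create the Lambda function in
const s3 = new AWS.S3({ region });

async function createBucket(bucketName) {
    // For us-east-1, omit CreateBucketConfiguration entirely.
    const params = {
        Bucket: bucketName,
        CreateBucketConfiguration: { LocationConstraint: region },
    };
    const result = await s3.createBucket(params).promise();
    console.log('Created bucket at', result.Location);
}

createBucket('my-s3-trigger-tutorial-bucket').catch(console.error);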
Upload a test object to your bucket
Later in the tutorial, you’ll test your Lambda function in the Lambda console. To confirm that your function’s code is working correctly, your Amazon S3 bucket needs to contain a test object. This object can be any file you choose (for example, HappyFace.jpg).
To upload a test object
Open the Buckets page of the Amazon S3 console and choose the bucket you created during the previous step.
Choose Upload.
Choose Add files and use the file selector to choose the object you want to upload.
Choose Open, then choose Upload.
When you test your function code later in the tutorial, you pass it data containing the file name of the object you uploaded, so make a note of it now.
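As an optional alternative to the console upload, this sketch uploads a local file with the AWS SDK for JavaScript v2; the bucket name, Region, and file path are placeholders.

const fs = require('fs');
const AWS = require('aws-sdk');

const s3 = new AWS.S3({ region: 'us-west-2' }); // placeholder Region

async function uploadTestObject(bucketName, filePath) {
    const key = filePath.split('/').pop(); // use the file name as the object key
    await s3.upload({
        Bucket: bucketName,
        Key: key,
        Body: fs.createReadStream(filePath),
    }).promise();
    console.log(`Uploaded ${key} to ${bucketName}`);
}

uploadTestObject('my-s3-trigger-tutorial-bucket', './HappyFace.jpg').catch(console.error);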
Create a permissions policy
Before you can create an execution role for your Lambda function, you first create a permissions policy that gives your function permission to access the required AWS resources. For this tutorial, the policy allows Lambda to get objects from an Amazon S3 bucket and to write to Amazon CloudWatch Logs.
To create the policy
Open the Policies page of the IAM console.
Choose Create policy.
Choose the JSON tab, and then paste the following custom policy into the JSON editor.
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "logs:PutLogEvents",
                "logs:CreateLogGroup",
                "logs:CreateLogStream"
            ],
            "Resource": "arn:aws:logs:*:*:*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetObject"
            ],
            "Resource": "arn:aws:s3:::*/*"
        }
    ]
}
Choose Next: Tags.
Choose Next: Review.
Under Review policy, for the policy name, enter s3-trigger-tutorial.
Choose Create policy.
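The same policy can also be created programmatically. The sketch below, assuming the AWS SDK for JavaScript v2, submits the policy document shown above and prints the policy ARN, which you’ll need if you also script the execution role in the next step.

const AWS = require('aws-sdk');

const iam = new AWS.IAM();

// Same permissions as the JSON policy above: CloudWatch Logs writes and S3 GetObject.
const policyDocument = {
    Version: '2012-10-17',
    Statement: [
        {
            Effect: 'Allow',
            Action: ['logs:PutLogEvents', 'logs:CreateLogGroup', 'logs:CreateLogStream'],
            Resource: 'arn:aws:logs:*:*:*',
        },
        {
            Effect: 'Allow',
            Action: ['s3:GetObject'],
            Resource: 'arn:aws:s3:::*/*',
        },
    ],
};

async function createPolicy() {
    const result = await iam.createPolicy({
        PolicyName: 's3-trigger-tutorial',
        PolicyDocument: JSON.stringify(policyDocument),
    }).promise();
    console.log('Policy ARN:', result.Policy.Arn); // keep this ARN for the execution role step
}

createPolicy().catch(console.error);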
Create an execution role
An execution role is an AWS Identity and Access Management (IAM) role that grants a Lambda function permission to access AWS services and resources. To enable your function to get objects from an Amazon S3 bucket, you attach the permissions policy you created in the previous step.
To create an execution role and attach your custom permissions policy
Open the Roles page of the IAM console.
Choose Create role.
For the type of trusted entity, choose AWS service, then for the use case, choose Lambda.
Choose Next.
In the policy search box, enter s3-trigger-tutorial.
In the search results, select the policy that you created (s3-trigger-tutorial), and then choose Next.
Under Role details, for Role name, enter lambda-s3-trigger-role, then choose Create role.
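If you’re scripting these steps, the following sketch (AWS SDK for JavaScript v2) creates the role with a trust policy that lets the Lambda service assume it, then attaches your custom policy. The policy ARN and the 111122223333 account ID are placeholders.

const AWS = require('aws-sdk');

const iam = new AWS.IAM();

// Trust policy that allows the Lambda service to assume this role.
const trustPolicy = {
    Version: '2012-10-17',
    Statement: [
        {
            Effect: 'Allow',
            Principal: { Service: 'lambda.amazonaws.com' },
            Action: 'sts:AssumeRole',
        },
    ],
};

async function createExecutionRole(policyArn) {
    const role = await iam.createRole({
        RoleName: 'lambda-s3-trigger-role',
        AssumeRolePolicyDocument: JSON.stringify(trustPolicy),
    }).promise();

    await iam.attachRolePolicy({
        RoleName: 'lambda-s3-trigger-role',
        PolicyArn: policyArn,
    }).promise();

    console.log('Role ARN:', role.Role.Arn); // needed if you also script the function creation
}

createExecutionRole('arn:aws:iam::111122223333:policy/s3-trigger-tutorial').catch(console.error);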
Create the Lambda function
In this example, you create a Lambda function in the console using the Node.js 16.x runtime. The function you create in the console contains some basic ‘Hello World’ code. In the next step, you’ll replace this with the function code to get an object from your Amazon S3 bucket.
To create the Lambda function
Open the Functions page of the Lambda console.
Make sure you're working in the same AWS Region you created your Amazon S3 bucket in. You can change your Region using the drop-down list at the top of the screen.
Choose Create function.
Choose Author from scratch.
Under Basic information, do the following:
For Function name, enter s3-trigger-tutorial.
For Runtime, choose Node.js 16.x.
For Architecture, choose x86_64.
In the Change default execution role tab, do the following:
Expand the tab, then choose Use an existing role.
Select the lambda-s3-trigger-role you created earlier.
Choose Create function.
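For reference, the console steps above are roughly equivalent to the following sketch with the AWS SDK for JavaScript v2, which creates the function from a deployment package (a function.zip containing index.js). The role ARN, account ID, and Region are placeholders; x86_64 is the default architecture, so it isn’t set explicitly.

const fs = require('fs');
const AWS = require('aws-sdk');

const lambda = new AWS.Lambda({ region: 'us-west-2' }); // placeholder Region

async function createFunction() {
    const result = await lambda.createFunction({
        FunctionName: 's3-trigger-tutorial',
        Runtime: 'nodejs16.x',
        Handler: 'index.handler', // index.js exporting a "handler" function
        Role: 'arn:aws:iam::111122223333:role/lambda-s3-trigger-role', // placeholder ARN
        Code: { ZipFile: fs.readFileSync('function.zip') },
    }).promise();
    console.log('Function ARN:', result.FunctionArn);
}

createFunction().catch(console.error);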
Deploy the function code
Your Lambda function retrieves the key name of the uploaded object and the name of the bucket from the event parameter it receives from Amazon S3. The function then uses the HeadObject API call in the AWS SDK for JavaScript to get the object type of the uploaded object.
This tutorial uses the Node.js 16.x runtime, but the original AWS tutorial also provides example code for other runtimes. The JavaScript code you’ll deploy is shown first, followed by a C# (.NET) version of the same function for comparison.
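The Node.js code below is a minimal sketch reconstructed from the description above, assuming the AWS SDK for JavaScript v2 that is bundled with the Node.js 16.x runtime. It reads the bucket name and object key from the S3 event and calls headObject to retrieve the object’s content type; it mirrors what the original tutorial’s JavaScript example does, but treat it as a sketch rather than a verbatim copy.

const aws = require('aws-sdk');

// Amazon S3 client from the AWS SDK for JavaScript v2 (included in the Node.js 16.x runtime).
const s3 = new aws.S3({ apiVersion: '2006-03-01' });

exports.handler = async (event) => {
    // The bucket name and object key arrive in the S3 event notification.
    const bucket = event.Records[0].s3.bucket.name;
    const key = decodeURIComponent(event.Records[0].s3.object.key.replace(/\+/g, ' '));
    try {
        // HeadObject returns the object's metadata, including its content type.
        const { ContentType } = await s3.headObject({ Bucket: bucket, Key: key }).promise();
        console.log('CONTENT TYPE:', ContentType);
        return ContentType;
    } catch (err) {
        console.log(err);
        const message = `Error getting object ${key} from bucket ${bucket}. Make sure they exist and your bucket is in the same Region as this function.`;
        console.log(message);
        throw new Error(message);
    }
};

The C# (.NET) version of the same function follows.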
using System.Threading.Tasks;
using Amazon.Lambda.Core;
using Amazon.S3;
using System;
using Amazon.Lambda.S3Events;
using System.Web;

// Assembly attribute to enable the Lambda function's JSON input to be converted into a .NET class.
[assembly: LambdaSerializer(typeof(Amazon.Lambda.Serialization.SystemTextJson.DefaultLambdaJsonSerializer))]

namespace S3Integration
{
    public class Function
    {
        private static AmazonS3Client _s3Client;

        public Function() : this(null)
        {
        }

        internal Function(AmazonS3Client s3Client)
        {
            _s3Client = s3Client ?? new AmazonS3Client();
        }

        public async Task<string> Handler(S3Event evt, ILambdaContext context)
        {
            try
            {
                if (evt.Records.Count <= 0)
                {
                    context.Logger.LogLine("Empty S3 Event received");
                    return string.Empty;
                }

                // The bucket name and object key arrive in the S3 event notification.
                var bucket = evt.Records[0].S3.Bucket.Name;
                var key = HttpUtility.UrlDecode(evt.Records[0].S3.Object.Key);

                context.Logger.LogLine($"Request is for {bucket} and {key}");

                var objectResult = await _s3Client.GetObjectAsync(bucket, key);

                // Return the object's content type, which is what you verify later in CloudWatch Logs.
                context.Logger.LogLine($"Returning {objectResult.Headers.ContentType}");

                return objectResult.Headers.ContentType;
            }
            catch (Exception e)
            {
                context.Logger.LogLine($"Error processing request - {e.Message}");
                return string.Empty;
            }
        }
    }
}
To deploy the function code
Open the Functions page of the Lambda console.
Choose the function you created in the previous step (s3-trigger-tutorial).
Choose the Code tab.
Copy and paste the provided JavaScript code into the index.js tab in the Code source pane.
Choose Deploy.
Create the Amazon S3 trigger
Now that you’ve deployed your function code, create the Amazon S3 trigger that will invoke your function.
To create the Amazon S3 trigger
In the Function overview pane of your function’s console page, choose Add trigger.
Select S3.
Under Bucket, select the bucket you created earlier in the tutorial.
Under Event types, select All object create events. You can also configure a trigger to invoke Lambda when an object is deleted, but we won’t be using that option in this tutorial.
Under Recursive invocation, select the check box to acknowledge that using the same Amazon S3 bucket for input and output is not recommended. You can learn more about recursive invocation patterns in Lambda by reading Recursive patterns that cause run-away Lambda functions in Serverless Land.
Choose Add.
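Under the hood, the console performs two operations here: it adds a resource-based policy statement that allows your bucket to invoke the function, and it writes a notification configuration on the bucket. The sketch below, assuming the AWS SDK for JavaScript v2, does the same thing; the bucket name, Region, and account ID in the ARNs are placeholders.

const AWS = require('aws-sdk');

const region = 'us-west-2'; // placeholder
const lambda = new AWS.Lambda({ region });
const s3 = new AWS.S3({ region });

async function addS3Trigger(bucketName, functionArn) {
    // Allow this specific bucket to invoke the function.
    await lambda.addPermission({
        FunctionName: 's3-trigger-tutorial',
        StatementId: 's3-trigger-tutorial-permission',
        Action: 'lambda:InvokeFunction',
        Principal: 's3.amazonaws.com',
        SourceArn: `arn:aws:s3:::${bucketName}`,
    }).promise();

    // Equivalent to selecting "All object create events" in the console.
    await s3.putBucketNotificationConfiguration({
        Bucket: bucketName,
        NotificationConfiguration: {
            LambdaFunctionConfigurations: [
                {
                    LambdaFunctionArn: functionArn,
                    Events: ['s3:ObjectCreated:*'],
                },
            ],
        },
    }).promise();
}

addS3Trigger(
    'my-s3-trigger-tutorial-bucket',
    'arn:aws:lambda:us-west-2:111122223333:function:s3-trigger-tutorial'
).catch(console.error);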
Test your Lambda function with a dummy event
Now that you’ve created and configured your Lambda function, you’re ready to test it. You first test your function by sending it a dummy Amazon S3 event to confirm it’s working correctly.
To test the Lambda function with a dummy event
In the Lambda console page for your function, choose the Code tab.
In the Code source pane, choose Test.
In the Configure test event box, do the following:
For Event name, enter MyTestEvent.
For Template, choose S3 Put.
In the Event JSON, replace the following values:
Replace us-east-1 with the Region you created your Amazon S3 bucket in.
Replace both instances of my-bucket with the name of your own Amazon S3 bucket.
Replace test%2Fkey with the name of the test object you uploaded to your bucket earlier (for example, HappyFace.jpg).
Choose Save.
{
    "Records": [
        {
            "eventVersion": "2.0",
            "eventSource": "aws:s3",
            "awsRegion": "us-east-1",
            "eventTime": "1970-01-01T00:00:00.000Z",
            "eventName": "ObjectCreated:Put",
            "userIdentity": {
                "principalId": "EXAMPLE"
            },
            "requestParameters": {
                "sourceIPAddress": "127.0.0.1"
            },
            "responseElements": {
                "x-amz-request-id": "EXAMPLE123456789",
                "x-amz-id-2": "EXAMPLE123/5678abcdefghijklambdaisawesome/mnopqrstuvwxyzABCDEFGH"
            },
            "s3": {
                "s3SchemaVersion": "1.0",
                "configurationId": "testConfigRule",
                "bucket": {
                    "name": "my-bucket",
                    "ownerIdentity": {
                        "principalId": "EXAMPLE"
                    },
                    "arn": "arn:aws:s3:::my-bucket"
                },
                "object": {
                    "key": "test%2Fkey",
                    "size": 1024,
                    "eTag": "0123456789abcdef0123456789abcdef",
                    "sequencer": "0A1B2C3D4E5F678901"
                }
            }
        }
    ]
}
In the Code source pane, choose Test.
If your function runs successfully, the Execution results tab shows the content type of your test object (for example, image/jpeg for a .jpg file), along with the function’s log output.
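You can also run the same dummy-event test outside the console. The sketch below, assuming the AWS SDK for JavaScript v2 and a local testEvent.json file containing the edited event from above, invokes the function and prints its response.

const fs = require('fs');
const AWS = require('aws-sdk');

const lambda = new AWS.Lambda({ region: 'us-west-2' }); // placeholder Region

async function invokeWithTestEvent() {
    const result = await lambda.invoke({
        FunctionName: 's3-trigger-tutorial',
        Payload: fs.readFileSync('testEvent.json'), // the saved dummy S3 event
    }).promise();
    console.log('Status:', result.StatusCode);
    // The function returns the content type of the test object, for example "image/jpeg".
    console.log('Response:', result.Payload.toString());
}

invokeWithTestEvent().catch(console.error);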
Test the Lambda function with the Amazon S3 trigger
To test your function with the configured trigger, you upload an object to your Amazon S3 bucket using the console. To verify that your Lambda function has been invoked correctly, you then use CloudWatch Logs to view your function’s output.
To upload an object to your Amazon S3 bucket
Open the Buckets page of the Amazon S3 console and choose the bucket you created earlier.
Choose Upload.
Choose Add files and use the file selector to choose an object you want to upload. This object can be any file you choose.
Choose Open, then choose Upload.
To verify correct operation using CloudWatch Logs
Open the CloudWatch console.
Make sure you're working in the same AWS Region you created your Lambda function in. You can change your Region using the drop-down list at the top of the screen.
Choose Logs, then choose Log groups.
Choose the log group for your function (/aws/lambda/s3-trigger-tutorial).
Under Log streams, choose the most recent log stream.
If your function has been invoked correctly in response to your Amazon S3 trigger, the log stream contains a CONTENT TYPE entry logged by your function. The content type you see depends on the type of file you uploaded to your bucket (for example, image/jpeg for a .jpg file).
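If you’d rather read the logs programmatically, this sketch (AWS SDK for JavaScript v2) pulls the last 15 minutes of events from the function’s log group; the Region is a placeholder.

const AWS = require('aws-sdk');

const logs = new AWS.CloudWatchLogs({ region: 'us-west-2' }); // placeholder Region

async function showRecentLogs() {
    const result = await logs.filterLogEvents({
        logGroupName: '/aws/lambda/s3-trigger-tutorial',
        startTime: Date.now() - 15 * 60 * 1000, // last 15 minutes
    }).promise();
    for (const event of result.events) {
        console.log(event.message.trim());
    }
}

showRecentLogs().catch(console.error);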
Clean up your resources
You can now delete the resources that you created for this tutorial, unless you want to retain them. By deleting AWS resources that you're no longer using, you prevent unnecessary charges to your AWS account.
To delete the Lambda function
Open the Functions page of the Lambda console.
Select the function that you created.
Choose Actions, Delete.
Type delete in the text input field and choose Delete.
To delete the execution role
Open the Roles page of the IAM console.
Select the execution role that you created.
Choose Delete.
Enter the name of the role in the text input field and choose Delete.
To delete the S3 bucket
Open the Amazon S3 console.
Select the bucket you created.
Choose Delete.
Enter the name of the bucket in the text input field.
Choose Delete bucket.
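The console steps above can also be scripted. The sketch below, assuming the AWS SDK for JavaScript v2, deletes the function, detaches and deletes the role and policy, and removes the test object before deleting the bucket. All names, ARNs, and the object key are placeholders, and the bucket must be empty before it can be deleted.

const AWS = require('aws-sdk');

const region = 'us-west-2'; // placeholder
const lambda = new AWS.Lambda({ region });
const iam = new AWS.IAM();
const s3 = new AWS.S3({ region });

async function cleanUp() {
    await lambda.deleteFunction({ FunctionName: 's3-trigger-tutorial' }).promise();

    const policyArn = 'arn:aws:iam::111122223333:policy/s3-trigger-tutorial'; // placeholder ARN
    await iam.detachRolePolicy({ RoleName: 'lambda-s3-trigger-role', PolicyArn: policyArn }).promise();
    await iam.deleteRole({ RoleName: 'lambda-s3-trigger-role' }).promise();
    await iam.deletePolicy({ PolicyArn: policyArn }).promise();

    // The bucket must be empty; delete the test object(s) first.
    await s3.deleteObject({ Bucket: 'my-s3-trigger-tutorial-bucket', Key: 'HappyFace.jpg' }).promise();
    await s3.deleteBucket({ Bucket: 'my-s3-trigger-tutorial-bucket' }).promise();
}

cleanUp().catch(console.error);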