Here is some information about this architecture.
This solution will show you how to set up a DynamoDB table that streams changes to a Lambda function.
The Terraform configuration sets up a Lambda function, a DynamoDB table, and the IAM resources needed to operate the application. Whenever an item in the DynamoDB table is created, updated, or deleted, the change is sent to a stream. A Lambda function monitors this stream and is triggered with a payload that includes the details of the altered item.
Here are the steps to build this architecture on your own.
If you're using the Skillmix Labs feature, open the lab settings (the beaker icon) on the right side of the code editor. Then, click the Start Lab button to start the lab environment.
Wait for the credentials to load. Then run this in the terminal:
$ aws configure --profile smx-lab
AWS Access Key ID [None]: AKIA3E3W34P42CSHXDH5
AWS Secret Access Key [None]: vTmqpOqefgJfse8i6QwzgpjgswPjHZ6h/oiQq4zf
Default region name [None]: us-west-2
Default output format [None]: json
Be sure to name your credentials profile 'smx-lab'.
Note: If you're using your own AWS account you'll need to ensure that you've created and configured a named AWS CLI profile named smx-lab.
We'll be doing all of our work in one Terraform file. Create a new directory on your computer somewhere, and then create a file named main.tf in it.
To start the coding process, open the main.tf file and add the terraform and provider blocks. This configuration specifies the version of the AWS provider and the version of Terraform that we want to use, along with the AWS profile and region. It ensures that compatible versions of Terraform and the AWS provider are used, and that the AWS provider is configured correctly.
Append this code to the main.tf file:
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 3.27"
    }
  }

  required_version = ">= 0.14.9"
}

provider "aws" {
  profile = "smx-lab"
  region  = "us-west-2"
}
Next, we will create two data sources that give us access to information about the current AWS account and region. The first, data "aws_caller_identity" "current" {}, provides information about the current AWS account, such as the account ID and ARN. The second, data "aws_region" "current" {}, provides information about the current AWS region, such as the region name.
Append this code to the main.tf file:
data "aws_caller_identity" "current" {}
data "aws_region" "current" {}
Next, we will create an Amazon DynamoDB table. This code creates a table called "UsersIds" with a hash key of "UserId" and a billing mode of "PROVISIONED". It sets the read and write capacity to 5 units each, enables streams, and sets the stream view type to "NEW_AND_OLD_IMAGES", so stream records include both the old and new versions of a changed item. Finally, it adds two tags to the table, "Name" and "Environment".
Append this code to the main.tf file:
resource "aws_dynamodb_table" "dynamodb_table_users" {
  name             = "UsersIds"
  billing_mode     = "PROVISIONED"
  read_capacity    = 5
  write_capacity   = 5
  stream_enabled   = true
  stream_view_type = "NEW_AND_OLD_IMAGES"
  hash_key         = "UserId"

  attribute {
    name = "UserId"
    type = "S"
  }

  tags = {
    Name        = "dynamodb-test-table"
    Environment = "dev"
  }
}
Next, we will create an AWS Lambda function called "process-usersids-records" using the Terraform code below. This function will process records from the DynamoDB stream. The code specifies the filename, source code hash, handler, role, and runtime for the Lambda function. The filename and source code hash come from an archive file (defined shortly), the handler is set to "index.handler", the role is set to the ARN of an IAM role, and the runtime is set to "nodejs14.x".
Append this code to the main.tf file:
resource "aws_lambda_function" "lambda_dynamodb_stream_handler" {
  function_name    = "process-usersids-records"
  filename         = data.archive_file.lambda_zip_file.output_path
  source_code_hash = data.archive_file.lambda_zip_file.output_base64sha256
  handler          = "index.handler"
  role             = aws_iam_role.iam_for_lambda.arn
  runtime          = "nodejs14.x"
}
Next, we will create the Lambda handler function, which logs the event object passed to it. The code uses the 'use strict' directive so it runs in strict mode, then defines an exports.handler function that takes an event object as an argument. Inside the function, console.log() logs the event as a stringified JSON object.
In your working directory, create a folder named ./src. In that folder, create a file named index.js and add this code to it:
/*! Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
* SPDX-License-Identifier: MIT-0
*/
'use strict'
exports.handler = async (event) => {
  // Log the incoming DynamoDB stream event
  console.log(JSON.stringify(event, null, 2))
}
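Before deploying, you can sanity-check the handler locally with Node.js. The stream event below is a trimmed, hand-written illustration of the payload shape Lambda delivers for this setup; the field values are made up, not captured from a real stream:

```javascript
'use strict'

// Same logic as the handler in ./src/index.js
const handler = async (event) => {
  console.log(JSON.stringify(event, null, 2))
}

// Hypothetical sample event, modeled on the DynamoDB stream record format
const sampleEvent = {
  Records: [
    {
      eventName: 'INSERT',
      eventSource: 'aws:dynamodb',
      dynamodb: {
        Keys: { UserId: { S: 'user-123' } },
        NewImage: { UserId: { S: 'user-123' } },
        StreamViewType: 'NEW_AND_OLD_IMAGES'
      }
    }
  ]
}

handler(sampleEvent).then(() => console.log('handler completed'))
```

Running this with node prints the stringified event, which is exactly what the deployed function will write to CloudWatch Logs.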
Next, we will create an archive file data source in Terraform. It takes the source file, the index.js file in the src directory, and creates a zip file from it, written to lambda.zip in the module directory. This zip file is used to deploy the Lambda function.
Append this code to the main.tf file:
data "archive_file" "lambda_zip_file" {
  type        = "zip"
  source_file = "${path.module}/src/index.js"
  output_path = "${path.module}/lambda.zip"
}
Next, we will create an event source mapping between the DynamoDB table and the Lambda function. This mapping causes the Lambda function to be triggered whenever there is a change in the DynamoDB table.
The code below maps the stream of the table created by the 'aws_dynamodb_table' resource to the function created by the 'aws_lambda_function' resource. The 'starting_position' parameter is set to 'LATEST', which means the function only receives changes made after the mapping is created.
Append this code to the main.tf file:
resource "aws_lambda_event_source_mapping" "lambda_dynamodb" {
  event_source_arn  = aws_dynamodb_table.dynamodb_table_users.stream_arn
  function_name     = aws_lambda_function.lambda_dynamodb_stream_handler.arn
  starting_position = "LATEST"
}
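Each record delivered through this mapping carries the item images in DynamoDB's attribute-value format (for example { "S": "user-123" } for a string). Here is a minimal sketch of converting such an image into a plain JavaScript object; it handles only string and number attributes for illustration, and the unmarshalImage name and sample data are our own, not part of any AWS API:

```javascript
'use strict'

// Convert a DynamoDB attribute-value map into a plain object.
// Only the S (string) and N (number) types are handled here; a real
// implementation would cover the full type set, or use a library helper.
const unmarshalImage = (image) => {
  const result = {}
  for (const [key, value] of Object.entries(image)) {
    if ('S' in value) result[key] = value.S
    else if ('N' in value) result[key] = Number(value.N) // N arrives as a string
  }
  return result
}

// Example: the NewImage portion of a stream record
const newImage = { UserId: { S: 'user-123' }, LoginCount: { N: '7' } }
console.log(unmarshalImage(newImage)) // { UserId: 'user-123', LoginCount: 7 }
```

Because the stream view type is NEW_AND_OLD_IMAGES, each MODIFY record exposes both record.dynamodb.OldImage and record.dynamodb.NewImage in this format.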
Next, we will create an IAM role for our Lambda function using the Terraform code below. This code creates an IAM role named "iam_for_lambda" with an assume role policy that allows the Lambda service to assume the role. The assume role policy is written in JSON format and defines which service is allowed to assume the role.
Append this code to the main.tf file:
resource "aws_iam_role" "iam_for_lambda" {
  # uncomment the 'permissions_boundary' argument if running this lab on skillmix.io
  # permissions_boundary = "arn:aws:iam::${data.aws_caller_identity.current.account_id}:policy/LabUserNewResourceBoundaryPolicy"
  name = "iam_for_lambda"

  assume_role_policy = <<EOF
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": "sts:AssumeRole",
      "Principal": {
        "Service": "lambda.amazonaws.com"
      },
      "Effect": "Allow",
      "Sid": ""
    }
  ]
}
EOF
}
Next, we will create an IAM role policy that allows our Lambda function to work with the DynamoDB stream. This policy grants permission to write CloudWatch logs, allows function invocation from the stream, and grants read access to the DynamoDB stream APIs.
Append this code to the main.tf file:
resource "aws_iam_role_policy" "dynamodb_lambda_policy" {
  name = "lambda-dynamodb-policy"
  role = aws_iam_role.iam_for_lambda.id

  policy = <<EOF
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowLambdaFunctionToCreateLogs",
      "Action": ["logs:*"],
      "Effect": "Allow",
      "Resource": ["arn:aws:logs:*:*:*"]
    },
    {
      "Sid": "AllowLambdaFunctionInvocation",
      "Effect": "Allow",
      "Action": ["lambda:InvokeFunction"],
      "Resource": ["${aws_dynamodb_table.dynamodb_table_users.arn}/stream/*"]
    },
    {
      "Sid": "APIAccessForDynamoDBStreams",
      "Effect": "Allow",
      "Action": [
        "dynamodb:GetRecords",
        "dynamodb:GetShardIterator",
        "dynamodb:DescribeStream",
        "dynamodb:ListStreams"
      ],
      "Resource": "${aws_dynamodb_table.dynamodb_table_users.arn}/stream/*"
    }
  ]
}
EOF
}
Next, we will create two outputs: the ARN of the DynamoDB UsersIds table and the ARN of the Lambda function processing the DynamoDB stream. The first output reads the table ARN from the aws_dynamodb_table resource, and the second reads the function ARN from the aws_lambda_function resource.
Append this code to the main.tf file:
output "dynamodb_usersIds_arn" {
  value       = aws_dynamodb_table.dynamodb_table_users.arn
  description = "The ARN of the DynamoDB Users Ids table"
}

output "lambda_processing_arn" {
  value       = aws_lambda_function.lambda_dynamodb_stream_handler.arn
  description = "The ARN of the Lambda function processing the DynamoDB stream"
}
Source
This project was sourced from the AWS Repo: https://github.com/aws-samples/serverless-patterns