Here is some information about this architecture.
This solution shows how to invoke a Lambda function from an EventBridge rule. It demonstrates how to create powerful event-driven architectures using EventBridge and other AWS services such as Lambda.
In this solution, we create the following resources:
A Lambda function that logs some information to CloudWatch (a demo function)
An EventBridge rule and target that process incoming events and send them to the Lambda function if there is a match
Related resources such as IAM roles and policies
This solution was adapted from: https://github.com/aws-samples/serverless-patterns/tree/main/eventbridge-lambda-terraform
Here are the steps you can follow to build this solution on your own.
If you're using the Skillmix Labs feature, open the lab settings (the beaker icon) on the right side of the code editor. Then, click the Start Lab button to start the lab environment.
Wait for the credentials to load. Then run this in the terminal:
$ aws configure --profile smx-lab
AWS Access Key ID [None]: <lab access key>
AWS Secret Access Key [None]: <lab secret key>
Default region name [None]: us-west-2
Default output format [None]: json
Be sure to name your credentials profile 'smx-lab'. Remember to replace <lab access key> and <lab secret key> with the access key and secret key you get from the "Lab Environment" you started above.
Note: If you're using your own AWS account you'll need to ensure that you've created and configured a named AWS CLI profile named smx-lab.
We'll be doing all of our work in one Terraform file. Create a new directory on your computer somewhere, and then create a file named main.tf in it.
The first step is to create the Terraform and provider blocks. These blocks bootstrap our project with the required plugins and provider configuration.
Append this code to the main.tf file:
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 4.22"
    }
  }

  required_version = ">= 0.14.9"
}

provider "aws" {
  profile = "smx-lab"
  region  = "us-west-2"
}
The required_providers block specifies that the AWS provider is required, with a source of "hashicorp/aws" and a version constraint of "~> 4.22". The required_version argument specifies the minimum version of Terraform required to run this code, ">= 0.14.9". The provider block configures the AWS provider, with the profile set to "smx-lab" and the region set to "us-west-2".
Next, we will create two data sources that allow us to access information about the current AWS account and region. The first, data "aws_caller_identity" "current" {}, provides information about the current AWS account, such as the account ID and ARN. The second, data "aws_region" "current" {}, provides information about the current AWS region, such as the region name.
Append this code to the main.tf file:
data "aws_caller_identity" "current" {}
data "aws_region" "current" {}
Next, let's create the Lambda function resource. This resource defines the following:
The function name
The local file that contains the function code
A hash that uniquely identifies a specific version of the function file
The path to the handler (filename.function_name)
The IAM Role resource (created later on)
The runtime
Append this code to the main.tf file:
resource "aws_lambda_function" "lambda_function" {
  function_name    = "ConsumerFunction"
  filename         = data.archive_file.lambda_zip_file.output_path
  source_code_hash = data.archive_file.lambda_zip_file.output_base64sha256
  handler          = "app.handler"
  role             = aws_iam_role.lambda_iam_role.arn
  runtime          = "nodejs14.x"
}
Next, let's create the actual JS file that contains the function we want Lambda to run. For demo purposes, we have a very simple function that just logs the event object. Logging like this will send log statements to CloudWatch, which we can look at later on.
Create a new folder in your directory named src, and put a file in it called app.js. Add this code to the file.
/*! Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
* SPDX-License-Identifier: MIT-0
*/
exports.handler = async (event) => {
  console.log(JSON.stringify(event, null, 2))
}
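To get a feel for what this handler will log, here is a sketch of the event envelope EventBridge delivers to Lambda. The field names follow the standard EventBridge event structure, but the id, account, and time values below are made-up placeholders, and the handler is reproduced locally just so you can run it:

```javascript
// Hypothetical sample of the EventBridge event envelope; id, account,
// and time are placeholder values, not real identifiers.
const sampleEvent = {
  "version": "0",
  "id": "00000000-0000-0000-0000-000000000000",
  "detail-type": "transaction",
  "source": "custom.myApp",
  "account": "123456789012",
  "time": "2023-01-01T00:00:00Z",
  "region": "us-west-2",
  "resources": [],
  "detail": { "location": "EUR-" }
};

// Same logic as app.js: log the whole event as pretty-printed JSON.
const handler = async (event) => {
  console.log(JSON.stringify(event, null, 2));
};

handler(sampleEvent);
```

Running this with Node prints the full envelope, which is exactly what you will see later in the CloudWatch log stream for the function.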
This solution works by zipping the Lambda file and sending it straight to the Lambda service for deployment. To do that, we need to create a zip file from our JS file. This Terraform resource does that. We need to point to the .js file in our project, and provide it an output path.
Append this code to the main.tf file:
data "archive_file" "lambda_zip_file" {
  type        = "zip"
  source_file = "${path.module}/src/app.js"
  output_path = "${path.module}/lambda.zip"
}
Note: for files larger than 50MB you should upload to S3 first.
Next, we create the Lambda execution role. This role has two major configuration parts:
The permissions from the managed policy named AWSLambdaBasicExecutionRole will be attached to this role.
The lambda.amazonaws.com service will be able to assume the role and execute functions on our behalf.
Append this code to the main.tf file:
data "aws_iam_policy" "lambda_basic_execution_role_policy" {
  name = "AWSLambdaBasicExecutionRole"
}

resource "aws_iam_role" "lambda_iam_role" {
  # uncomment the 'permissions_boundary' argument if running this lab on skillmix.io
  # permissions_boundary = "arn:aws:iam::${data.aws_caller_identity.current.account_id}:policy/LabUserNewResourceBoundaryPolicy"
  name_prefix         = "EventBridgeLambdaRole-"
  managed_policy_arns = [data.aws_iam_policy.lambda_basic_execution_role_policy.arn]

  assume_role_policy = <<EOF
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": "sts:AssumeRole",
      "Principal": {
        "Service": "lambda.amazonaws.com"
      },
      "Effect": "Allow",
      "Sid": ""
    }
  ]
}
EOF
}
Now we move on to creating the EventBridge event rule. EventBridge grew out of CloudWatch Events, and some things still carry the old name, like the Terraform resource names below.
In this Terraform configuration we create the rule and specify an event_pattern. The event pattern is evaluated on all events coming into this event bus. If there's a match, EventBridge will send it to the Event Target (which we'll create next).
Append this code to the main.tf file:
resource "aws_cloudwatch_event_rule" "event_rule" {
  name_prefix = "eventbridge-lambda-"

  event_pattern = <<EOF
{
  "detail-type": ["transaction"],
  "source": ["custom.myApp"],
  "detail": {
    "location": [{
      "prefix": "EUR-"
    }]
  }
}
EOF
}
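For intuition, here is a simplified re-implementation of how this particular pattern is evaluated. This is a sketch only: EventBridge's real matcher supports many more operators than the exact-match arrays and "prefix" rule used here, and the function name matchesRule is our own invention:

```javascript
// Simplified sketch of this rule's matching logic (not the real matcher).
// Each pattern field must match: detail-type and source against exact-match
// arrays, and detail.location against the "EUR-" prefix filter.
function matchesRule(event) {
  const detailTypeOk = ["transaction"].includes(event["detail-type"]);
  const sourceOk = ["custom.myApp"].includes(event.source);
  const location = event.detail && event.detail.location;
  const locationOk = typeof location === "string" && location.startsWith("EUR-");
  return detailTypeOk && sourceOk && locationOk;
}

console.log(matchesRule({
  "detail-type": "transaction",
  source: "custom.myApp",
  detail: { location: "EUR-paris" }
})); // true: all three fields match, including the "EUR-" prefix

console.log(matchesRule({
  "detail-type": "transaction",
  source: "custom.myApp",
  detail: { location: "USD-nyc" }
})); // false: location does not start with "EUR-"
```

Events that fail any field of the pattern are simply ignored by the rule; only matching events are forwarded to the target.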
Next we'll create the event target. There are many event targets supported. Here, we are sending the event to the Lambda function that we created previously.
We do this by creating an aws_cloudwatch_event_target resource and specifying the rule (created above) and our Lambda function.
Append this code to the main.tf file:
resource "aws_cloudwatch_event_target" "target_lambda_function" {
  rule = aws_cloudwatch_event_rule.event_rule.name
  arn  = aws_lambda_function.lambda_function.arn
}
We need to give EventBridge permission to invoke the Lambda function. We do that by creating an aws_lambda_permission resource and applying the right permissions.
Append this code to the main.tf file:
resource "aws_lambda_permission" "allow_cloudwatch" {
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.lambda_function.function_name
  principal     = "events.amazonaws.com"
  source_arn    = aws_cloudwatch_event_rule.event_rule.arn
}
We will output the function ARN to the console. Append this code to the main.tf file:
output "ConsumerFunction" {
  value       = aws_lambda_function.lambda_function.arn
  description = "ConsumerFunction function ARN"
}
Let's deploy this thing! If you haven't done so, start the Skillmix lab session and get the account credentials. Configure your Terraform environment to use those credentials.
Then, open a terminal or command prompt, navigate to the folder with your Terraform file, and execute these commands:
# initialize the project
$ terraform init
# show the plan
$ terraform plan
# apply the changes
$ terraform apply
Wait for the changes to be applied before proceeding.
Let's test the solution! We will test using the AWS CLI. You should have it installed and configured with your credentials.
First, we need to create a file that contains the payload we will send to EventBridge. In your project directory, create a file named event.json. Then add this data to it:
[
  {
    "DetailType": "transaction",
    "Source": "custom.myApp",
    "Detail": "{\"location\":\"EUR-\"}"
  }
]
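Note that a PutEvents entry uses capitalized field names, while the rule matches against the lowercase fields of the delivered event: EventBridge maps DetailType to "detail-type", Source to "source", and parses the Detail string as JSON into the "detail" object. A small sketch of that mapping, using the entry above:

```javascript
// The PutEvents entry from event.json.
const entry = {
  DetailType: "transaction",
  Source: "custom.myApp",
  Detail: "{\"location\":\"EUR-\"}"
};

// EventBridge parses the Detail string into the "detail" object the
// rule's pattern inspects.
const detail = JSON.parse(entry.Detail);
console.log(detail.location);                     // "EUR-"
console.log(detail.location.startsWith("EUR-"));  // true, so the rule's prefix filter matches
```

Because "EUR-" itself starts with the prefix "EUR-", this test event will match the rule and be delivered to the Lambda function.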
Then, at your terminal, execute this command. Note that the --profile flag needs to be set to your local profile name.
$ aws events put-events --entries file://event.json --profile smx-lab
After you do that, you can log into the CloudWatch console to see the event logs for the Lambda execution.
Source
This project was sourced from the AWS Repo: https://github.com/aws-samples/serverless-patterns