This solution shows how to build an API Gateway HTTP API that integrates directly with EventBridge. The gateway accepts inbound requests and forwards their payloads to EventBridge, which evaluates each event against its rules. If a rule matches, EventBridge sends the event to that rule's predefined target.
In this case, the target is a Lambda function that simply logs the request to CloudWatch.
Here are the steps to build this architecture on your own.
If you're using the Skillmix Labs feature, open the lab settings (the beaker icon) on the right side of the code editor. Then, click the Start Lab button to start the lab environment.
Wait for the credentials to load. Then run this in the terminal:
$ aws configure --profile smx-lab
AWS Access Key ID [None]: AKIA3E3W34P42CSHXDH5
AWS Secret Access Key [None]: vTmqpOqefgJfse8i6QwzgpjgswPjHZ6h/oiQq4zf
Default region name [None]: us-west-2
Default output format [None]: json
Be sure to name your credentials profile 'smx-lab'.
Note: If you're using your own AWS account, you'll need to ensure that you've created and configured an AWS CLI profile named smx-lab.
We'll be doing all of our work in one Terraform file. Create a new directory on your computer somewhere, and then create a file named main.tf in it.
Next, we will create the Terraform configuration block and the AWS provider. This pins the Terraform and AWS provider versions and points the provider at the profile and region we want to use, ensuring Terraform runs with the correct versions and a correctly configured provider.
Append this code to the main.tf file:
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 3.27"
    }
  }

  required_version = ">= 0.14.9"
}

provider "aws" {
  profile = "smx-lab"
  region  = "us-west-2"
}
Next, we will create two data sources that expose information about the current AWS account and region. The first, data "aws_caller_identity" "current" {}, provides information about the current AWS account, such as the account ID and caller ARN. The second, data "aws_region" "current" {}, provides information about the current AWS region, such as its name. We'll use both later to build resource ARNs.
Append this code to the main.tf file:
data "aws_caller_identity" "current" {}
data "aws_region" "current" {}
Next, we will create an archive file resource. This resource will take a source file, in this case a Python file called LambdaFunction.py, and compress it into a zip file. The output path is set relative to the module path, and the resulting zip file will be named LambdaFunction.zip.
Append this code to the main.tf file:
data "archive_file" "LambdaZipFile" {
  type        = "zip"
  source_file = "${path.module}/src/LambdaFunction.py"
  output_path = "${path.module}/LambdaFunction.zip"
}
Next, we will create an IAM role for our API Gateway. This code will create an IAM role with an assume role policy that allows the API Gateway service to assume the role. This will allow the API Gateway to access other AWS services on our behalf.
Append this code to the main.tf file:
resource "aws_iam_role" "APIGWRole" {
  # uncomment the 'permissions_boundary' argument if running this lab on skillmix.io
  # permissions_boundary = "arn:aws:iam::${data.aws_caller_identity.current.account_id}:policy/LabUserNewResourceBoundaryPolicy"
  assume_role_policy = <<POLICY1
{
  "Version" : "2012-10-17",
  "Statement" : [
    {
      "Effect" : "Allow",
      "Principal" : {
        "Service" : "apigateway.amazonaws.com"
      },
      "Action" : "sts:AssumeRole"
    }
  ]
}
POLICY1
}
Next, we will create an AWS IAM policy. This policy will allow the API Gateway role to put events onto the default event bus in the current AWS region and account. The code defines the policy and its statement, which includes the action, effect, and resource.
Append this code to the main.tf file:
resource "aws_iam_policy" "APIGWPolicy" {
  policy = <<POLICY2
{
  "Version" : "2012-10-17",
  "Statement" : [
    {
      "Effect" : "Allow",
      "Action" : [
        "events:PutEvents"
      ],
      "Resource" : "arn:aws:events:${data.aws_region.current.name}:${data.aws_caller_identity.current.account_id}:event-bus/default"
    }
  ]
}
POLICY2
}
Next, we will create an IAM role policy attachment that will attach the policy we just created to the IAM role. This will allow the IAM role to access the resources that are specified in the policy.
Append this code to the main.tf file:
resource "aws_iam_role_policy_attachment" "APIGWPolicyAttachment" {
  role       = aws_iam_role.APIGWRole.name
  policy_arn = aws_iam_policy.APIGWPolicy.arn
}
Next, we will create an IAM role for the Lambda function. This code will create an IAM role with an assume role policy that allows the Lambda service to assume the role. This will allow Lambda to access other AWS services on your behalf.
Append this code to the main.tf file:
resource "aws_iam_role" "LambdaRole" {
  # uncomment the 'permissions_boundary' argument if running this lab on skillmix.io
  # permissions_boundary = "arn:aws:iam::${data.aws_caller_identity.current.account_id}:policy/LabUserNewResourceBoundaryPolicy"
  assume_role_policy = <<POLICY3
{
  "Version" : "2012-10-17",
  "Statement" : [
    {
      "Effect" : "Allow",
      "Principal" : {
        "Service" : "lambda.amazonaws.com"
      },
      "Action" : "sts:AssumeRole"
    }
  ]
}
POLICY3
}
Next, we will create an AWS IAM policy for the Lambda role above. This policy will allow the Lambda function to create log streams and put log events in the log group. The policy will use the AWS region and caller identity data to generate the resource ARN for the log group.
Append this code to the main.tf file:
resource "aws_iam_policy" "LambdaPolicy" {
  policy = <<POLICY4
{
  "Version" : "2012-10-17",
  "Statement" : [
    {
      "Effect" : "Allow",
      "Action" : [
        "logs:CreateLogStream",
        "logs:PutLogEvents"
      ],
      "Resource" : "arn:aws:logs:${data.aws_region.current.name}:${data.aws_caller_identity.current.account_id}:log-group:/aws/lambda/${aws_lambda_function.MyLambdaFunction.function_name}:*:*"
    }
  ]
}
POLICY4
}
Next, we will create an IAM role policy attachment that will attach the Lambda IAM policy to its role.
Append this code to the main.tf file:
resource "aws_iam_role_policy_attachment" "LambdaPolicyAttachment" {
  role       = aws_iam_role.LambdaRole.name
  policy_arn = aws_iam_policy.LambdaPolicy.arn
}
Next, we will create the API Gateway HTTP API itself. Its OpenAPI body defines a POST route with an EventBridge-PutEvents integration, which lets API Gateway send events to EventBridge directly. The integration's request parameters set the event's source, detail type, and detail.
Append this code to the main.tf file:
resource "aws_apigatewayv2_api" "MyApiGatewayHTTPApi" {
  name          = "Terraform API Gateway HTTP API to EventBridge"
  protocol_type = "HTTP"
  body = jsonencode(
    {
      "openapi" : "3.0.1",
      "info" : {
        "title" : "API Gateway HTTP API to EventBridge"
      },
      "paths" : {
        "/" : {
          "post" : {
            "responses" : {
              "default" : {
                "description" : "EventBridge response"
              }
            },
            "x-amazon-apigateway-integration" : {
              "integrationSubtype" : "EventBridge-PutEvents",
              "credentials" : "${aws_iam_role.APIGWRole.arn}",
              "requestParameters" : {
                "Detail" : "$request.body.Detail",
                "DetailType" : "MyDetailType",
                "Source" : "demo.apigw"
              },
              "payloadFormatVersion" : "1.0",
              "type" : "aws_proxy",
              "connectionType" : "INTERNET"
            }
          }
        }
      }
    }
  )
}
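To make the mapping concrete, here's a hypothetical Python sketch of the transformation the EventBridge-PutEvents integration performs on an incoming request body. This is a simplification; the actual service adds more fields (such as the event timestamp).

```python
import json

def to_put_events_entry(request_body: str) -> dict:
    """Build a PutEvents entry the way the integration's
    requestParameters describe: the request body's "Detail" becomes
    the event detail, and Source/DetailType are fixed values."""
    body = json.loads(request_body)
    return {
        "Source": "demo.apigw",
        "DetailType": "MyDetailType",
        "Detail": json.dumps(body["Detail"]),
    }

entry = to_put_events_entry('{"Detail": {"message": "Hello From API Gateway"}}')
```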
Next, we will create an API Gateway stage for our API. This stage will be named "$default" and will be automatically deployed when the Terraform code is applied. The stage is associated with the API we created earlier via the aws_apigatewayv2_api.MyApiGatewayHTTPApi.id attribute reference.
Append this code to the main.tf file:
resource "aws_apigatewayv2_stage" "MyApiGatewayHTTPApiStage" {
  api_id      = aws_apigatewayv2_api.MyApiGatewayHTTPApi.id
  name        = "$default"
  auto_deploy = true
}
Next, we will create an EventBridge rule (via the aws_cloudwatch_event_rule resource). The rule matches events whose source is "demo.apigw" and whose account is the current account ID. The event pattern in the PATTERN heredoc determines which events trigger the rule.
Append this code to the main.tf file:
resource "aws_cloudwatch_event_rule" "MyEventRule" {
  event_pattern = <<PATTERN
{
  "account": ["${data.aws_caller_identity.current.account_id}"],
  "source": ["demo.apigw"]
}
PATTERN
}
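As a rough illustration of how EventBridge evaluates this pattern: an event matches when, for every key in the pattern, the event's value appears in the pattern's list of allowed values. A simplified Python sketch (real patterns also support nesting, prefix matching, and other operators):

```python
def matches(event: dict, pattern: dict) -> bool:
    """Simplified EventBridge pattern matching: every pattern key must
    be present in the event with a value from the allowed list."""
    return all(event.get(key) in allowed for key, allowed in pattern.items())

pattern = {"account": ["123456789012"], "source": ["demo.apigw"]}

assert matches({"account": "123456789012", "source": "demo.apigw"}, pattern)
assert not matches({"account": "123456789012", "source": "other.app"}, pattern)
```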
Next, we will create an EventBridge target named "MyRuleTarget". This resource links our Lambda function (referenced by ARN) to the event rule (referenced by ID), so the function is invoked whenever the rule matches an event.
Append this code to the main.tf file:
resource "aws_cloudwatch_event_target" "MyRuleTarget" {
  arn  = aws_lambda_function.MyLambdaFunction.arn
  rule = aws_cloudwatch_event_rule.MyEventRule.id
}
We need to create a log group for the Lambda function.
Append this code to the main.tf file:
resource "aws_cloudwatch_log_group" "MyLogGroup" {
  name              = "/aws/lambda/${aws_lambda_function.MyLambdaFunction.function_name}"
  retention_in_days = 60
}
Next, we will create our Lambda Function resource. This will use our Python function code as its source, and set some other key settings for this project.
Append this code to the main.tf file:
resource "aws_lambda_function" "MyLambdaFunction" {
  function_name    = "apigw-http-eventbridge-terraform-demo-${data.aws_caller_identity.current.account_id}"
  filename         = data.archive_file.LambdaZipFile.output_path
  source_code_hash = filebase64sha256(data.archive_file.LambdaZipFile.output_path)
  role             = aws_iam_role.LambdaRole.arn
  handler          = "LambdaFunction.lambda_handler"
  runtime          = "python3.9"
  layers           = ["arn:aws:lambda:${data.aws_region.current.name}:017000801446:layer:AWSLambdaPowertoolsPython:15"]
}
We need to create a Python file for our function. We'll use a very simple Python file that just logs some information to CloudWatch.
In your working directory, create a folder named src. In that folder, create a file named LambdaFunction.py.
In that file, add the following code:
# Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
# SPDX-License-Identifier: MIT-0
import json
from aws_lambda_powertools import Logger

logger = Logger()

def lambda_handler(event, context):
    logger.info(f"Received event: {event}")
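If you want to sanity-check the handler logic locally without the Powertools layer, a stand-in using the standard library's logging module looks like this. This is just an illustration; keep the Powertools Logger in the deployed function.

```python
import io
import logging

# Capture log output in memory so we can inspect it.
stream = io.StringIO()
logging.basicConfig(stream=stream, level=logging.INFO, force=True)
logger = logging.getLogger(__name__)

def lambda_handler(event, context):
    # Same logic as the deployed handler, minus Powertools.
    logger.info(f"Received event: {event}")

# Simulate an EventBridge invocation with a minimal event.
lambda_handler({"detail": {"message": "Hello From API Gateway"}}, None)
```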
Next, we will create an AWS Lambda permission that allows EventBridge (the events.amazonaws.com principal) to invoke our Lambda function. This is done with the aws_lambda_permission resource, setting the statement_id, action, function_name, principal, and source_arn.
Append this code to the main.tf file:
resource "aws_lambda_permission" "EventBridgeLambdaPermission" {
  statement_id  = "AllowExecutionFromCloudWatch"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.MyLambdaFunction.function_name
  principal     = "events.amazonaws.com"
  source_arn    = aws_cloudwatch_event_rule.MyEventRule.arn
}
We will need several values from our deployed resources. Append the following outputs to the main.tf file.
output "APIGW-URL" {
  value       = aws_apigatewayv2_stage.MyApiGatewayHTTPApiStage.invoke_url
  description = "The API Gateway invocation URL"
}

output "LambdaFunctionName" {
  value       = aws_lambda_function.MyLambdaFunction.function_name
  description = "The Lambda Function name"
}

output "CloudWatchLogName" {
  value       = "/aws/lambda/${aws_lambda_function.MyLambdaFunction.function_name}"
  description = "The Lambda Function Log Group"
}
Now that we have all of our code written, we can deploy the project. Open a terminal, navigate to the project, and run these commands.
# initialize the project
$ terraform init
# plan the project
$ terraform plan
# apply the project
$ terraform apply
To test the solution we will send an event to the API gateway. The event will be sent to EventBridge and then to the Lambda function. The Lambda function will log the event to CloudWatch.
At your command prompt, enter the following snippet. Change the endpoint value to the one from your outputs above.
curl --location --request POST '<your api endpoint>' \
  --header 'Content-Type: application/json' \
  --data-raw '{ "Detail": { "message": "Hello From API Gateway" } }'
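If you prefer Python to curl, the same request can be built with the standard library. The https://example.com/ URL below is a placeholder; substitute the APIGW-URL value from your Terraform outputs.

```python
import json
import urllib.request

def build_request(url: str) -> urllib.request.Request:
    """Build the same POST request the curl command sends."""
    payload = {"Detail": {"message": "Hello From API Gateway"}}
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request("https://example.com/")  # placeholder URL
# urllib.request.urlopen(req)  # uncomment to actually send the request
```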
After you've submitted that request, you can go to the CloudWatch console, open the log group we created, and you'll see the request message.
Source
This project was sourced from the AWS Repo: https://github.com/aws-samples/serverless-patterns