Recently, I started learning Terraform (not by choice) because one of my clients needs it. It’s a neat tool to manage your infra as code, no doubt about it. But if you ask my personal opinion as a dev, I would say go with the native cloud solution, for example AWS CDK.
The native solution not only has all the required features to work with your cloud provider but also streamlines the development experience. You will also find a lot of tools that work with the native solutions and not with Terraform. That said, Terraform is a powerful tool, and if you don’t have the luxury of a native solution (like me), it is a great choice.
While I was learning to deploy infrastructure with it, I struggled a little with the AWS Lambda deployment and learned quite a lot along the way. That is what I want to share with you in this article.
By the end of this article, we will have deployed a simple Python Lambda API and accessed it via Postman.
How does Terraform work?
Terraform was built to solve a crucial problem that arises when you manage cloud infrastructure manually. Nowadays, applications are so big that they require massive infrastructure and many services, and managing all of that by hand is error prone and at times confusing.
Another reason was to keep the infrastructure cloud agnostic (well, on paper at least). As more and more cloud providers enter the market, you want to define your infrastructure in such a way that you can migrate from one cloud to another without changing a lot of stuff.
Terraform solves these two problems.
Terraform is essentially a tool for building, changing and versioning infrastructure in a safe (again controversial, because if you mess up the state then best of luck to you) and efficient manner. The files used in Terraform are declarative: you only describe what you want, and Terraform itself works out how to get there.
Under the hood, Terraform keeps a record of the state of your infrastructure and then automatically creates, updates and deletes resources to match your configuration. Every change you make is recorded in the terraform.tfstate file, so whenever your configuration changes, Terraform changes the infrastructure to bring it back in line with the desired state.
Therefore, it is very important not to mess with the state file. If you do, it is quite hard to recover. That is why the common practice is to keep the Terraform state in a cloud bucket rather than on your local machine.
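For example, a minimal remote-state setup on AWS could look like the snippet below (the bucket name is a placeholder for a bucket you create yourself; in this article we simply keep the state locally for simplicity):

terraform {
  backend "s3" {
    bucket = "my-terraform-state-bucket" # placeholder: an S3 bucket you own
    key    = "lambda-demo/terraform.tfstate"
    region = "us-east-1"
  }
}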
Terraform’s Basic Construct
The basic construct of Terraform is its configuration files, which are written in HashiCorp Configuration Language (HCL) or JSON (whichever you prefer). Once you have defined these configuration files, Terraform does a dry run that outlines the changes it will make to bring the infrastructure into the desired state. The user then reviews and approves the plan, and Terraform applies those changes.
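In practice, this workflow boils down to three commands, run from the folder that holds your .tf files:

terraform init    # download the provider plugins and set up the backend
terraform plan    # dry run: show what would change, without touching anything
terraform apply   # apply the changes after your approval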
Project Structure
Below is the simple structure of the project. I will walk you through each of the files.
LambdaDemo.py
This is the main lambda that will be called once we are done with the setup.
main.tf
This file contains all the main configurations required to deploy our infrastructure.
provider.tf
There are multiple cloud providers; this file defines the provider info that Terraform will use during deployment.
var.tf
This file contains the variable declarations that will be used in your script.
var.tfvars
This file defines the values for the variables declared in the var.tf file.
python
└── LambdaDemo.py
terraform
├── main.tf
├── provider.tf
├── var.tf
└── var.tfvars
LambdaDemo.py
This is going to be a simple piece of Python code that we will call from the API. It will simply return a 200 status code.
def handler(event, context):
    return {
        "statusCode": 200
    }
Yes, that is enough for this demo 😀
provider.tf
provider "aws" {
profile = var.aws_profile
region = var.aws_region
}
var.tf
variable "aws_profile" {
description = "AWS Profile For Deployment"
}
variable "aws_region" {
description = "AWS Region For Deployment"
}
variable "project" {
type = string
description = "name of the project"
}
variable "account_id" {
type = string
}
variable "lambda_policy_arn" {
type = string
}
variable "stage" {
type = string
}
var.tfvars
aws_region = "us-east-1"
aws_profile = "<your-aws-profile-in-credentials-file>"
account_id = "<your-aws-account-id>"
project = "demo-terraform-deployment"
lambda_policy_arn = "arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole"
stage = "dev"
aws_profile: this is the basic information Terraform needs to access your account. You can find (or create) the access keys in the AWS console.
Once you have them, store them in the credentials file located at ~/.aws/credentials.
The credentials file will look like the following:
[profile-name]
aws_access_key_id = <YOUR_ACCESS_KEY_ID>
aws_secret_access_key = <YOUR_SECRET_ACCESS_KEY>
aws_session_token = <YOUR_SESSION_TOKEN>
I leave it to you to set this part up correctly.
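If you want to double-check that the profile works before running Terraform (assuming you have the AWS CLI installed), a quick sanity check is:

aws sts get-caller-identity --profile <your-profile-name>

If this prints your account ID, Terraform will be able to use the same profile.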
main.tf
This is the juicy part of our Terraform configuration, where we set up the actual infra for our Lambda function and API Gateway. Before we build something, it’s always a good idea to visualize what our infrastructure is going to look like. This is what we will be building here:
Now that we know what we need to build, let’s build it.
Create an AWS IAM Role for Lambda Execution
Let’s create a role with the following name: demo-terraform-deployment-lambda-role
resource "aws_iam_role" "lambda_exec" {
name = "${var.project}-lambda-role"
assume_role_policy = jsonencode({
Version = "2012-10-17"
Statement = [
{
Action = "sts:AssumeRole"
Effect = "Allow"
Sid = ""
Principal = {
Service = "lambda.amazonaws.com"
}
}
]
})
}
We will also need to attach a policy to this role, so let’s do that as well.
The policy that we are using here is the following:
arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole
################################################################################
# Attach an IAM policy to the Lambda execution role #
################################################################################
resource "aws_iam_role_policy_attachment" "lambda_policy" {
role = aws_iam_role.lambda_exec.id
policy_arn = var.lambda_policy_arn
}
Next, we will create the Lambda function and assign the above role to it.
Zip and Upload Lambda Package
Before we define the Lambda, we also have to zip our code and put it into an S3 bucket. That information is then needed to create the Lambda function. So, let’s do that quickly.
If you remember, our project structure was simple:
python
└── LambdaDemo.py
terraform
├── main.tf
├── provider.tf
├── var.tf
└── var.tfvars
Our Lambda resides inside the python folder, so we zip that folder and push it to the S3 bucket. The following configuration does exactly that.
################################################################################
# Create a ZIP archive of the function code #
################################################################################
data "archive_file" "lambda-demo" {
type = "zip"
source_dir = "../python"
output_path = "../lambda-demo.zip"
}
################################################################################
# Create an S3 bucket to store the Lambda function code #
################################################################################
resource "aws_s3_bucket" "lambda_bucket" {
bucket = "${var.project}-lambda-demo"
}
################################################################################
# Upload the Lambda function code to S3 #
################################################################################
resource "aws_s3_object" "lambda_demo" {
bucket = aws_s3_bucket.lambda_bucket.id
key = "lambda-demo"
source = data.archive_file.lambda-demo.output_path
# Calculate the ETag of the ZIP archive using the MD5 hash of its contents
etag = filemd5(data.archive_file.lambda-demo.output_path)
}
Once everything is done, this is how the file will show up in the S3 bucket:
Create AWS Lambda Function
The main parts of this resource are:
- role: Lambda requires an execution role, so this is the same role that we created above and attached the policy to
- handler: provide the path to the handler from your project’s artifact root (the zipped file)
- s3_bucket: the bucket where you have stored the file
- s3_key: the object key of the artifact
Everything else is basic boilerplate that you can adjust as per your needs.
################################################################################
# Create an AWS Lambda function #
################################################################################
resource "aws_lambda_function" "lambda_demo" {
function_name = "${var.project}-lambda_demo"
role = aws_iam_role.lambda_exec.arn
architectures = ["arm64"]
runtime = "python3.10"
handler = "LambdaDemo.handler"
source_code_hash = data.archive_file.lambda-demo.output_base64sha256
# Use the base64-encoded SHA-256 hash of the function code to determine if it has changed
s3_bucket = aws_s3_bucket.lambda_bucket.id
s3_key = aws_s3_object.lambda_demo.id
}
Once you have created the Lambda function resource, it’s time to create the API Gateway and define our resources. So far, this code will create a Lambda function in your AWS account.
Go ahead and run terraform plan -var-file=var.tfvars.
It will list all the changes that will be made to your infra.
API Gateway Setup
There are six parts to the API Gateway creation:
- API Gateway REST API
  - This is the root resource, the very first thing that shows up when you create a new API Gateway
- API Gateway Resource
  - This is the REST resource. It could be anything based on your application, for example: /books, /dashboards, /movies
- API Gateway Method
  - This is the REST method/verb (GET, PUT, POST, PATCH, DELETE)
  - Here, in our example, we will only define a single GET method
- API Gateway Lambda Integration
  - Now, this API Gateway needs to talk to the Lambda function that we created. It requires an additional integration step.
  - After this integration, you will start to see the complete picture in the AWS console
- API Gateway Deployment
  - You deploy this API to a stage (like dev, stage, prod etc.)
- API Gateway Permission to Invoke Lambda
  - This is required so that API Gateway can invoke the Lambda
###########################################################
# Create an AWS API Gateway REST API #
###########################################################
resource "aws_api_gateway_rest_api" "lambda_demo" {
name = "${var.project}-lambda-demo-api"
endpoint_configuration {
types = ["REGIONAL"]
}
}
###########################################################
# Create a resource for the Lambda function in the API #
# Gateway REST API #
###########################################################
resource "aws_api_gateway_resource" "lambda_demo" {
parent_id = aws_api_gateway_rest_api.lambda_demo.root_resource_id
path_part = "lambda-demo"
rest_api_id = aws_api_gateway_rest_api.lambda_demo.id
}
###########################################################
# Create a method for the HTTP verb on the Lambda #
# function resource #
###########################################################
resource "aws_api_gateway_method" "lambda_demo_get_method" {
authorization = "NONE"
http_method = "GET"
resource_id = aws_api_gateway_resource.lambda_demo.id
rest_api_id = aws_api_gateway_rest_api.lambda_demo.id
}
###########################################################
# Create an integration for the Lambda function in the API #
# Gateway REST API #
###########################################################
resource "aws_api_gateway_integration" "lambda_demo_get_method_integration" {
http_method = aws_api_gateway_method.lambda_demo_get_method.http_method
resource_id = aws_api_gateway_resource.lambda_demo.id
rest_api_id = aws_api_gateway_rest_api.lambda_demo.id
integration_http_method = "POST"
type = "AWS_PROXY"
uri = aws_lambda_function.lambda_demo.invoke_arn
}
###########################################################
# Deploy the API Gateway REST API #
###########################################################
resource "aws_api_gateway_deployment" "lambda_demo" {
depends_on = [
aws_api_gateway_integration.lambda_demo_get_method_integration
]
rest_api_id = aws_api_gateway_rest_api.lambda_demo.id
lifecycle {
create_before_destroy = true
}
triggers = {
redeployment = sha1(jsonencode([
jsonencode(aws_api_gateway_integration.lambda_demo_get_method_integration),
jsonencode(aws_api_gateway_integration.lambda_demo_post_method_integration)
]))
}
}
###########################################################
# Create a stage for the API Gateway REST API deployment #
###########################################################
resource "aws_api_gateway_stage" "lambda_demo" {
deployment_id = aws_api_gateway_deployment.lambda_demo.id
rest_api_id = aws_api_gateway_rest_api.lambda_demo.id
stage_name = var.stage
}
#################################################################################
# APIGATEWAY PERMISSION FOR LAMBDA INVOKE #
#################################################################################
resource "aws_lambda_permission" "lambda_demo_get_method_permission" {
statement_id = "AllowExecutionFromAPIGateway"
action = "lambda:InvokeFunction"
function_name = aws_lambda_function.lambda_demo.function_name
principal = "apigateway.amazonaws.com"
# More: http://docs.aws.amazon.com/apigateway/latest/developerguide/api-gateway-control-access-using-iam-policies-to-invoke-api.html
source_arn = "arn:aws:execute-api:${var.aws_region}:${var.account_id}:${aws_api_gateway_rest_api.lambda_demo.id}/*/${aws_api_gateway_method.lambda_demo_get_method.http_method}${aws_api_gateway_resource.lambda_demo.path}"
}
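As an optional extra (not required for the setup above), you can ask Terraform to print the base URL of the deployed stage so you don’t have to fish it out of the console later; the aws_api_gateway_stage resource exposes an invoke_url attribute:

output "invoke_url" {
  # Base URL of the deployed stage; append the resource path (/lambda-demo) to call the API
  value = aws_api_gateway_stage.lambda_demo.invoke_url
}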
Once you have all of this written down correctly, all you need to do is run
terraform apply -var-file=var.tfvars
This tells Terraform to validate, plan and, with your approval, apply the changes to your cloud infrastructure.
Post Deployment
Once the deployment is successful, you will see your Lambda in the console with the API Trigger attached.
After this, click on the API Gateway trigger and copy the URL:
Now, head to Postman and hit that URL with a GET request. You should receive a 200 OK status like I do below.
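If you prefer the terminal over Postman, a plain curl against the same URL works just as well (the URL below is only a placeholder; use the one from your own deployment):

curl -i https://<rest-api-id>.execute-api.us-east-1.amazonaws.com/dev/lambda-demo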
Conclusion
This was a very basic Lambda setup with Terraform, where we covered what Terraform is and how it works, and then configured and deployed a Lambda function and an API Gateway to see it in action.
To add more, it’s worth noting that Terraform offers a lot of flexibility in terms of infrastructure management. It supports multiple cloud providers, and can even be used to manage on-premises infrastructure.
As for the lambda setup, it’s cool that we were able to deploy a working function and API Gateway. Terraform makes it easy to automate infrastructure management and deployment, saving time and reducing errors. With Terraform, you can version your infrastructure like you would code, and collaborate with others on infrastructure changes.
Overall, Terraform is a powerful tool that can simplify infrastructure management, and this example is just a glimpse of what it can do.
Do let me know your thoughts or suggestions for the future in the comments below.