Deploy to Terraform

This guide covers deploying Strands agents using Terraform infrastructure as code. Terraform enables consistent, repeatable deployments across AWS, Google Cloud, Azure, and other cloud providers.

Terraform supports multiple deployment targets. This guide illustrates four deployment options from different cloud service providers; the example below walks through AWS App Runner.

Cloud deployment requires your containerized agent to be available in a container registry. The following assumes you have completed the Docker deployment guide and pushed your image to the appropriate registry:

Docker Tutorial Project Structure:

Project Structure (Python):

my-python-app/
├── agent.py          # FastAPI application (from Docker tutorial)
├── Dockerfile        # Container configuration (from Docker tutorial)
├── pyproject.toml    # Created by uv init
└── uv.lock           # Created automatically by uv

Project Structure (TypeScript):

my-typescript-app/
├── index.ts             # Express application (from Docker tutorial)
├── Dockerfile           # Container configuration (from Docker tutorial)
├── package.json         # Created by npm init
├── tsconfig.json        # TypeScript configuration
└── package-lock.json    # Created automatically by npm

Deploy-specific Docker configurations

Image Requirements:

  • Standard Docker images supported

Container Registry Requirements:
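
  • Image must be pushed to a registry that the target service can pull from (for the AWS App Runner example in this guide, Amazon ECR)

A minimal push sketch for ECR, assuming a repository named my-image and the placeholder account ID 123456789012 (substitute your own values); the AWS CLI and Docker commands are shown commented since they require live credentials:

```shell
ACCOUNT_ID="123456789012"   # placeholder; substitute your AWS account ID
REGION="us-east-1"
REPO="my-image"
ECR_URI="${ACCOUNT_ID}.dkr.ecr.${REGION}.amazonaws.com/${REPO}"
echo "${ECR_URI}:latest"

# Authenticate, tag, and push (requires AWS credentials and Docker):
# aws ecr get-login-password --region "${REGION}" | \
#   docker login --username AWS --password-stdin "${ACCOUNT_ID}.dkr.ecr.${REGION}.amazonaws.com"
# docker tag my-agent:latest "${ECR_URI}:latest"
# docker push "${ECR_URI}:latest"
```

The resulting URI is what goes into the agent_image variable later in this guide.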

Docker Deployment Guide Modifications:

  • No special base image required (standard Docker images work)
  • Ensure your app listens on port 8080 (or configure port in terraform)
  • Build with: docker build --platform linux/amd64 -t my-agent .
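
Before provisioning cloud infrastructure, it can help to smoke-test the container locally. A hedged sketch, assuming the my-agent image from the build command above and the /ping health route used later in this guide:

```shell
PORT=8080          # the port App Runner is configured to hit
IMAGE="my-agent"   # image tag from the build command above

# Only run the container if Docker and the image are actually available locally.
if command -v docker >/dev/null 2>&1 && docker image inspect "${IMAGE}" >/dev/null 2>&1; then
  docker run --rm -d -p "${PORT}:${PORT}" --name agent-smoke "${IMAGE}"
  sleep 2
  curl -sf "http://localhost:${PORT}/ping" && echo "health check passed"
  docker stop agent-smoke
else
  echo "image ${IMAGE} not available locally; skipping smoke test"
fi
```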
Optional: AWS App Runner Setup All-in-One Bash Command

Copy and paste this bash script to create all the necessary Terraform files and skip the remaining "Cloud Deployment Setup" steps below:
generate_aws_apprunner_terraform() {
  mkdir -p terraform

  # Generate main.tf
  cat > terraform/main.tf << 'EOF'
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0"
    }
  }
}

provider "aws" {
  region = var.aws_region
}

resource "aws_iam_role" "apprunner_ecr_access_role" {
  name = "apprunner-ecr-access-role"
  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        Action = "sts:AssumeRole"
        Effect = "Allow"
        Principal = {
          Service = "build.apprunner.amazonaws.com"
        }
      }
    ]
  })
}

resource "aws_iam_role_policy_attachment" "apprunner_ecr_access_policy" {
  role       = aws_iam_role.apprunner_ecr_access_role.name
  policy_arn = "arn:aws:iam::aws:policy/service-role/AWSAppRunnerServicePolicyForECRAccess"
}

resource "aws_apprunner_service" "agent" {
  service_name = "strands-agent-v4"

  source_configuration {
    image_repository {
      image_identifier = var.agent_image
      image_configuration {
        port = "8080"
        runtime_environment_variables = {
          OPENAI_API_KEY = var.openai_api_key
        }
      }
      image_repository_type = "ECR"
    }
    auto_deployments_enabled = false
    authentication_configuration {
      access_role_arn = aws_iam_role.apprunner_ecr_access_role.arn
    }
  }

  instance_configuration {
    cpu    = "0.25 vCPU"
    memory = "0.5 GB"
  }
}
EOF

  # Generate variables.tf
  cat > terraform/variables.tf << 'EOF'
variable "aws_region" {
  description = "AWS region"
  type        = string
  default     = "us-east-1"
}

variable "agent_image" {
  description = "Container image for Strands agent"
  type        = string
}

variable "openai_api_key" {
  description = "OpenAI API key"
  type        = string
  sensitive   = true
}
EOF

  # Generate outputs.tf
  cat > terraform/outputs.tf << 'EOF'
output "agent_url" {
  description = "AWS App Runner service URL"
  value       = aws_apprunner_service.agent.service_url
}
EOF

  # Generate terraform.tfvars template
  cat > terraform/terraform.tfvars << 'EOF'
agent_image    = "your-account.dkr.ecr.us-east-1.amazonaws.com/my-image:latest"
openai_api_key = "<your-openai-api-key>"
EOF

  echo "✅ AWS App Runner Terraform files generated in terraform/ directory"
}

generate_aws_apprunner_terraform

Step by Step Guide

Create terraform directory

mkdir terraform
cd terraform

Create main.tf

terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0"
    }
  }
}

provider "aws" {
  region = var.aws_region
}

resource "aws_iam_role" "apprunner_ecr_access_role" {
  name = "apprunner-ecr-access-role"
  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        Action = "sts:AssumeRole"
        Effect = "Allow"
        Principal = {
          Service = "build.apprunner.amazonaws.com"
        }
      }
    ]
  })
}

resource "aws_iam_role_policy_attachment" "apprunner_ecr_access_policy" {
  role       = aws_iam_role.apprunner_ecr_access_role.name
  policy_arn = "arn:aws:iam::aws:policy/service-role/AWSAppRunnerServicePolicyForECRAccess"
}

resource "aws_apprunner_service" "agent" {
  service_name = "strands-agent-v4"

  source_configuration {
    image_repository {
      image_identifier = var.agent_image
      image_configuration {
        port = "8080"
        runtime_environment_variables = {
          OPENAI_API_KEY = var.openai_api_key
        }
      }
      image_repository_type = "ECR"
    }
    auto_deployments_enabled = false
    authentication_configuration {
      access_role_arn = aws_iam_role.apprunner_ecr_access_role.arn
    }
  }

  instance_configuration {
    cpu    = "0.25 vCPU"
    memory = "0.5 GB"
  }
}

Create variables.tf

variable "aws_region" {
  description = "AWS region"
  type        = string
  default     = "us-east-1"
}

variable "agent_image" {
  description = "Container image for Strands agent"
  type        = string
}

variable "openai_api_key" {
  description = "OpenAI API key"
  type        = string
  sensitive   = true
}

Create outputs.tf

output "agent_url" {
  description = "AWS App Runner service URL"
  value       = aws_apprunner_service.agent.service_url
}

Create terraform/terraform.tfvars (or update the one generated by the all-in-one script above) with your image and credentials:

agent_image = "your-account.dkr.ecr.us-east-1.amazonaws.com/my-image:latest"
openai_api_key = "<your-openai-api-key>"

This example uses OpenAI, but any supported model provider can be configured. See the Strands documentation for all supported model providers.

Note: Bedrock model provider credentials are automatically passed using App Runner’s IAM role and do not need to be specified in Terraform.
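
If you use Amazon Bedrock as the model provider, the service needs an instance role (distinct from the ECR access role above) that is allowed to invoke models. A sketch of what that might look like; the role and policy names here are illustrative, and instance_role_arn belongs inside instance_configuration:

```hcl
# Illustrative instance role for Bedrock access; not part of the files above.
resource "aws_iam_role" "apprunner_instance_role" {
  name = "apprunner-instance-role"
  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Action    = "sts:AssumeRole"
      Effect    = "Allow"
      Principal = { Service = "tasks.apprunner.amazonaws.com" }
    }]
  })
}

resource "aws_iam_role_policy" "bedrock_invoke" {
  name = "bedrock-invoke"
  role = aws_iam_role.apprunner_instance_role.id
  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Action   = ["bedrock:InvokeModel", "bedrock:InvokeModelWithResponseStream"]
      Effect   = "Allow"
      Resource = "*"
    }]
  })
}

# Then reference it from the service in main.tf:
#   instance_configuration {
#     cpu               = "0.25 vCPU"
#     memory            = "0.5 GB"
#     instance_role_arn = aws_iam_role.apprunner_instance_role.arn
#   }
```

Scoping the Resource to specific model ARNs instead of "*" is tighter and usually preferable in production.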

# Initialize Terraform
terraform init
# Review the deployment plan
terraform plan
# Deploy the infrastructure
terraform apply
# Get the endpoints
terraform output
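
The agent_url output is the bare App Runner hostname without a scheme. A small sketch of composing a request URL from it; the hostname below is a placeholder standing in for what terraform output -raw agent_url would print:

```shell
# Placeholder for: SERVICE_URL="$(terraform output -raw agent_url)"
SERVICE_URL="example.us-east-1.awsapprunner.com"
echo "https://${SERVICE_URL}/ping"
```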

Test the endpoints using the output URLs (App Runner serves traffic over HTTPS):

# Health check
curl https://<your-service-url>/ping

# Test agent invocation
curl -X POST https://<your-service-url>/invocations \
  -H "Content-Type: application/json" \
  -d '{"input": {"prompt": "What is artificial intelligence?"}}'

When you modify your code, redeploy with:

# Rebuild and push image
docker build --platform linux/amd64 -t <your-registry>/my-image:latest .
docker push <your-registry>/my-image:latest

# Update infrastructure
terraform apply
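
Note that with auto_deployments_enabled set to false, pushing over the same :latest tag does not change the Terraform configuration, so terraform apply may detect nothing to update. One workaround, sketched here under the assumption that agent_image is the variable defined earlier, is to tag each build uniquely and pass the new tag to Terraform (alternatively, aws apprunner start-deployment triggers a manual deployment of the existing tag):

```shell
# Generate a unique tag per build (timestamp-based; a git SHA works too).
TAG="$(date +%Y%m%d%H%M%S)"
REGISTRY="your-account.dkr.ecr.us-east-1.amazonaws.com"  # assumed registry
IMAGE="${REGISTRY}/my-image:${TAG}"
echo "${IMAGE}"

# Build, push, and apply with the new tag (requires Docker, AWS CLI, Terraform):
# docker build --platform linux/amd64 -t "${IMAGE}" .
# docker push "${IMAGE}"
# terraform apply -var "agent_image=${IMAGE}"
```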

Remove the infrastructure when done:

terraform destroy