# Model Providers
## What are Model Providers?

A model provider is a service or platform that hosts and serves large language models through an API. The Strands Agents SDK abstracts away the complexity of working with different providers, offering a unified interface that makes it easy to switch between models or use multiple providers in the same application.
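As a purely illustrative sketch of that unified-interface idea (the class and function names here are hypothetical, not the SDK's actual API), any object that satisfies a common protocol can be dropped into the same calling code:

```python
from typing import Protocol


class Model(Protocol):
    """Hypothetical provider interface: every provider exposes the same call shape."""

    def generate(self, prompt: str) -> str: ...


class EchoProvider:
    """Stand-in for one hosted model; a real provider would call an HTTP API."""

    def generate(self, prompt: str) -> str:
        return f"echo: {prompt}"


class ShoutProvider:
    """A second, interchangeable stand-in provider."""

    def generate(self, prompt: str) -> str:
        return prompt.upper()


def run_agent(model: Model, prompt: str) -> str:
    # The calling code is identical no matter which provider backs `model`
    return model.generate(prompt)


print(run_agent(EchoProvider(), "hi"))   # echo: hi
print(run_agent(ShoutProvider(), "hi"))  # HI
```

Swapping providers then means constructing a different model object, with no change to the agent logic itself, which is the pattern the SDK examples below follow.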
## Supported Providers

The following table shows all model providers supported by the Strands Agents SDK and their availability in Python and TypeScript:
| Provider | Python Support | TypeScript Support |
|---|---|---|
| Custom Providers | ✅ | ✅ |
| Amazon Bedrock | ✅ | ✅ |
| Amazon Nova | ✅ | ❌ |
| OpenAI | ✅ | ✅ |
| Anthropic | ✅ | ❌ |
| Gemini | ✅ | ❌ |
| LiteLLM | ✅ | ❌ |
| llama.cpp | ✅ | ❌ |
| LlamaAPI | ✅ | ❌ |
| MistralAI | ✅ | ❌ |
| Ollama | ✅ | ❌ |
| SageMaker | ✅ | ❌ |
| Writer | ✅ | ❌ |
| Cohere | ✅ | ❌ |
| CLOVA Studio | ✅ | ❌ |
| FireworksAI | ✅ | ❌ |
## Getting Started

### Installation

Most providers are available as optional dependencies. Install the provider you need:
```shell
# Install with specific provider
pip install 'strands-agents[bedrock]'
pip install 'strands-agents[openai]'
pip install 'strands-agents[anthropic]'

# Or install with all providers
pip install 'strands-agents[all]'
```

```shell
# Core SDK includes BedrockModel by default
npm install @strands-agents/sdk

# To use OpenAI, install the openai package
npm install openai
```

Note: All model providers except Bedrock are listed as optional dependencies of the SDK. This means npm will attempt to install them automatically, but won't fail if they're unavailable. You can explicitly install them when needed.
### Basic Usage

Each provider follows a similar pattern for initialization and usage. Models are interchangeable, so you can switch providers simply by swapping the model instance:
```python
from strands import Agent
from strands.models.bedrock import BedrockModel
from strands.models.openai import OpenAIModel

# Use Bedrock
bedrock_model = BedrockModel(
    model_id="anthropic.claude-sonnet-4-20250514-v1:0"
)
agent = Agent(model=bedrock_model)
response = agent("What can you help me with?")

# Alternatively, use OpenAI by just switching the model provider
openai_model = OpenAIModel(
    client_args={"api_key": "<KEY>"},
    model_id="gpt-4o",
)
agent = Agent(model=openai_model)
response = agent("What can you help me with?")
```

```typescript
import { Agent } from '@strands-agents/sdk'
import { BedrockModel } from '@strands-agents/sdk/bedrock'
import { OpenAIModel } from '@strands-agents/sdk/openai'

// Use Bedrock
const bedrockModel = new BedrockModel({
  modelId: 'anthropic.claude-sonnet-4-20250514-v1:0',
})
let agent = new Agent({ model: bedrockModel })
let response = await agent.invoke('What can you help me with?')

// Alternatively, use OpenAI by just switching the model provider
const openaiModel = new OpenAIModel({
  apiKey: process.env.OPENAI_API_KEY,
  modelId: 'gpt-4o',
})
agent = new Agent({ model: openaiModel })
response = await agent.invoke('What can you help me with?')
```

## Next Steps
### Explore Model Providers
Section titled “Explore Model Providers”- Amazon Bedrock - Default provider with wide model selection, enterprise features, and full Python/TypeScript support
- OpenAI - GPT models with streaming support
- Custom Providers - Build your own model integration
- Anthropic - Direct Claude API access (Python only)