Multi-Provider Setup
Agent Air supports configuring multiple LLM providers, allowing you to switch between providers or use different models for different purposes. This flexibility enables scenarios like using Claude for complex reasoning, GPT-4 for general tasks, and Gemini for long-context work, all within the same agent. The LLMRegistry manages provider configurations and provides methods for accessing them at runtime.
Provider configuration can come from YAML files, environment variables, or programmatic setup. The registry supports all five built-in providers (Anthropic, OpenAI, Google, Cohere, Bedrock) plus OpenAI-compatible APIs and Azure OpenAI.
LLMRegistry
The LLMRegistry manages multiple provider configurations:
pub struct LLMRegistry {
    configs: HashMap<String, LLMSessionConfig>,
    default_provider: Option<String>,
}
It stores configurations by provider name and tracks which provider is the default.
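As a rough sketch of how those two fields likely back the accessors documented below (the method names are real, but these bodies are an assumption about the implementation, not the crate's actual code):

impl LLMRegistry {
    /// Look up a configuration by provider name, e.g. "anthropic".
    pub fn get(&self, provider: &str) -> Option<&LLMSessionConfig> {
        self.configs.get(provider)
    }

    /// Resolve the default provider name, then look up its configuration.
    pub fn get_default(&self) -> Option<&LLMSessionConfig> {
        self.default_provider
            .as_deref()
            .and_then(|name| self.configs.get(name))
    }
}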
YAML Configuration
The primary way to configure multiple providers is through a YAML config file. Place this file at the path specified by your AgentConfig::config_path(), typically ~/.myagent/config.yaml.
Complete Example
providers:
  # Anthropic - Claude models
  - provider: anthropic
    api_key: sk-ant-api03-...
    model: claude-sonnet-4-20250514
    system_prompt: "You are a helpful assistant."

  # OpenAI - GPT models
  - provider: openai
    api_key: sk-...
    model: gpt-4-turbo

  # Google - Gemini models
  - provider: google
    api_key: AIza...
    model: gemini-2.5-pro

  # Cohere - Command models
  - provider: cohere
    api_key: co-...
    model: command-r-plus

  # Amazon Bedrock
  - provider: bedrock
    bedrock_access_key_id: AKIA...
    bedrock_secret_access_key: secret...
    bedrock_region: us-east-1
    model: anthropic.claude-3-sonnet-20240229-v1:0

  # Azure OpenAI
  - provider: openai
    api_key: azure-key...
    azure_resource: my-resource
    azure_deployment: gpt-4-deployment
    azure_api_version: "2024-10-21"

  # OpenAI-compatible (Groq)
  - provider: openai
    api_key: gsk_...
    model: llama-3.1-70b-versatile
    base_url: https://api.groq.com/openai/v1
    context_limit: 128000

default_provider: anthropic
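A quick way to confirm a file like this parses is to load it and print what the registry found (the path here is an example; load_from_file, providers(), and default_provider_name() are covered below):

use agent_air::agent::config::LLMRegistry;
use std::path::PathBuf;

fn main() {
    // Example path; use the location from your AgentConfig::config_path().
    let path = PathBuf::from("/home/user/.myagent/config.yaml");
    match LLMRegistry::load_from_file(&path, "You are a helpful assistant.") {
        Ok(registry) => {
            println!("Providers: {:?}", registry.providers());
            println!("Default:   {:?}", registry.default_provider_name());
        }
        Err(e) => eprintln!("Config error: {}", e),
    }
}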
Provider-Specific Configuration
Anthropic
- provider: anthropic
  api_key: sk-ant-api03-...
  model: claude-sonnet-4-20250514
  system_prompt: "You are helpful."
OpenAI
- provider: openai
  api_key: sk-...
  model: gpt-4-turbo
  system_prompt: "You are helpful."
Google (Gemini)
- provider: google
  api_key: AIza...
  model: gemini-2.5-pro
  system_prompt: "You are helpful."
Cohere
- provider: cohere
  api_key: co-...
  model: command-r-plus
  system_prompt: "You are helpful."
Amazon Bedrock
- provider: bedrock
  bedrock_access_key_id: AKIA...
  bedrock_secret_access_key: secret...
  bedrock_region: us-east-1
  bedrock_session_token: optional...  # For temporary credentials
  model: anthropic.claude-3-sonnet-20240229-v1:0
  system_prompt: "You are helpful."
Azure OpenAI
- provider: openai
  api_key: your-azure-key
  azure_resource: my-resource
  azure_deployment: gpt-4-deployment
  azure_api_version: "2024-10-21"
  system_prompt: "You are helpful."
OpenAI-Compatible
- provider: openai
  api_key: gsk_...
  model: llama-3.1-70b-versatile
  base_url: https://api.groq.com/openai/v1
  context_limit: 128000
  system_prompt: "You are helpful."
ProviderConfig Structure
Each provider entry in the YAML maps to a ProviderConfig:
#[derive(Debug, Deserialize)]
pub struct ProviderConfig {
    pub provider: String,
    pub api_key: Option<String>,
    pub model: Option<String>,
    pub system_prompt: Option<String>,
    pub base_url: Option<String>,
    pub context_limit: Option<i32>,
    // Azure fields
    pub azure_resource: Option<String>,
    pub azure_deployment: Option<String>,
    pub azure_api_version: Option<String>,
    // Bedrock fields
    pub bedrock_access_key_id: Option<String>,
    pub bedrock_secret_access_key: Option<String>,
    pub bedrock_session_token: Option<String>,
    pub bedrock_region: Option<String>,
}
The provider field determines which provider type to use:
"anthropic"- Anthropic Claude"openai"- OpenAI, Azure, or compatible"google"- Google Gemini"cohere"- Cohere Command"bedrock"- Amazon Bedrock
Loading Configuration
From Config File
Load directly from a YAML file:
use agent_air::agent::config::LLMRegistry;
use std::path::PathBuf;
let path = PathBuf::from("/home/user/.myagent/config.yaml");
let registry = LLMRegistry::load_from_file(&path, "Default system prompt")?;
Using load_config
The load_config() function implements the standard loading pattern:
use agent_air::agent::config::{load_config, AgentConfig};

struct MyConfig;

impl AgentConfig for MyConfig {
    fn config_path(&self) -> &str { ".myagent/config.yaml" }
    fn default_system_prompt(&self) -> &str { "You are helpful." }
    fn log_prefix(&self) -> &str { "myagent" }
    fn name(&self) -> &str { "MyAgent" }
}

let registry = load_config(&MyConfig);
This function:

- Tries to load from ~/{config_path}
- Falls back to environment variables if the file does not exist
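A minimal sketch of that flow (the function name and the use of the third-party dirs crate are assumptions for illustration; the real load_config() lives in the crate and may differ):

use agent_air::agent::config::{AgentConfig, LLMRegistry};

// Illustrative only: roughly the flow described above, not the crate's internals.
fn load_config_sketch<C: AgentConfig>(config: &C) -> LLMRegistry {
    let prompt = config.default_system_prompt();
    if let Some(home) = dirs::home_dir() {
        let path = home.join(config.config_path());
        if path.exists() {
            if let Ok(registry) = LLMRegistry::load_from_file(&path, prompt) {
                return registry;
            }
        }
    }
    // No usable file: the real load_config() falls back to environment
    // variables (next section); an empty registry stands in for that here.
    LLMRegistry::new()
}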
Environment Variable Fallback
When no config file is found, load_config() checks environment variables in order:
# Anthropic (checked first)
ANTHROPIC_API_KEY=sk-ant-...
ANTHROPIC_MODEL=claude-sonnet-4-20250514
# OpenAI (checked second)
OPENAI_API_KEY=sk-...
OPENAI_MODEL=gpt-4-turbo
# Google (checked third)
GOOGLE_API_KEY=AIza...
GOOGLE_MODEL=gemini-2.5-pro
# Cohere (checked fourth)
COHERE_API_KEY=co-...
COHERE_MODEL=command-r-plus
# Bedrock (checked fifth)
AWS_ACCESS_KEY_ID=AKIA...
AWS_SECRET_ACCESS_KEY=secret...
AWS_REGION=us-east-1
BEDROCK_MODEL=anthropic.claude-3-sonnet-20240229-v1:0
The first provider with valid credentials becomes the default.
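That order can be pictured as a helper that returns the first configuration it can build. This is purely illustrative: the fallback models shown are just the examples from this page, and the real logic lives inside load_config():

use agent_air::controller::LLMSessionConfig;
use std::env;

// Illustrative: pick the first provider whose credentials are present,
// in the same order load_config() checks them.
fn first_env_config() -> Option<LLMSessionConfig> {
    if let Ok(key) = env::var("ANTHROPIC_API_KEY") {
        let model = env::var("ANTHROPIC_MODEL")
            .unwrap_or_else(|_| "claude-sonnet-4-20250514".to_string());
        return Some(LLMSessionConfig::anthropic(key, model.as_str()));
    }
    if let Ok(key) = env::var("OPENAI_API_KEY") {
        let model = env::var("OPENAI_MODEL").unwrap_or_else(|_| "gpt-4-turbo".to_string());
        return Some(LLMSessionConfig::openai(key, model.as_str()));
    }
    if let Ok(key) = env::var("GOOGLE_API_KEY") {
        let model = env::var("GOOGLE_MODEL").unwrap_or_else(|_| "gemini-2.5-pro".to_string());
        return Some(LLMSessionConfig::google(key, model.as_str()));
    }
    if let Ok(key) = env::var("COHERE_API_KEY") {
        let model = env::var("COHERE_MODEL").unwrap_or_else(|_| "command-r-plus".to_string());
        return Some(LLMSessionConfig::cohere(key, model.as_str()));
    }
    if let (Ok(id), Ok(secret)) =
        (env::var("AWS_ACCESS_KEY_ID"), env::var("AWS_SECRET_ACCESS_KEY"))
    {
        let region = env::var("AWS_REGION").unwrap_or_else(|_| "us-east-1".to_string());
        let model = env::var("BEDROCK_MODEL")
            .unwrap_or_else(|_| "anthropic.claude-3-sonnet-20240229-v1:0".to_string());
        return Some(LLMSessionConfig::bedrock(id, secret, region.as_str(), model.as_str()));
    }
    None
}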
Accessing Providers
Get Default Provider
let registry = load_config(&my_config);

if let Some(config) = registry.get_default() {
    println!("Using model: {}", config.model);
}
Get Specific Provider
if let Some(anthropic) = registry.get("anthropic") {
    // Use Anthropic config
}

if let Some(openai) = registry.get("openai") {
    // Use OpenAI config
}

if let Some(google) = registry.get("google") {
    // Use Gemini config
}

if let Some(cohere) = registry.get("cohere") {
    // Use Cohere config
}

if let Some(bedrock) = registry.get("bedrock") {
    // Use Bedrock config
}
List Available Providers
let providers = registry.providers(); // Vec<&str>
println!("Available: {:?}", providers);
// ["anthropic", "openai", "google", "cohere", "bedrock"]
Get Default Provider Name
if let Some(name) = registry.default_provider_name() {
    println!("Default provider: {}", name);
}
Check If Empty
if registry.is_empty() {
    eprintln!("No providers configured!");
}
Registry Methods Reference
| Method | Returns | Description |
|---|---|---|
| new() | LLMRegistry | Create empty registry |
| load_from_file(path, prompt) | Result<LLMRegistry, ConfigError> | Load from YAML file |
| get_default() | Option<&LLMSessionConfig> | Get default config |
| get(provider) | Option<&LLMSessionConfig> | Get config by name |
| default_provider_name() | Option<&str> | Get default provider name |
| providers() | Vec<&str> | List provider names |
| is_empty() | bool | Check if no providers |
Programmatic Multi-Provider Setup
Create provider configurations programmatically without YAML and pass them straight to the controller:
use agent_air::controller::{LLMController, LLMSessionConfig};

// Create configurations for each provider
let anthropic_config = LLMSessionConfig::anthropic(
    std::env::var("ANTHROPIC_API_KEY").unwrap(),
    "claude-sonnet-4-20250514",
)
.with_system_prompt("You are helpful.");

let openai_config = LLMSessionConfig::openai(
    std::env::var("OPENAI_API_KEY").unwrap(),
    "gpt-4-turbo",
)
.with_system_prompt("You are helpful.");

let google_config = LLMSessionConfig::google(
    std::env::var("GOOGLE_API_KEY").unwrap(),
    "gemini-2.5-pro",
)
.with_system_prompt("You are helpful.");

let cohere_config = LLMSessionConfig::cohere(
    std::env::var("COHERE_API_KEY").unwrap(),
    "command-r-plus",
)
.with_system_prompt("You are helpful.");

let bedrock_config = LLMSessionConfig::bedrock(
    std::env::var("AWS_ACCESS_KEY_ID").unwrap(),
    std::env::var("AWS_SECRET_ACCESS_KEY").unwrap(),
    "us-east-1",
    "anthropic.claude-3-sonnet-20240229-v1:0",
)
.with_system_prompt("You are helpful.");

// Use configs directly when creating sessions
let controller = LLMController::new(None);
let claude_session = controller.create_session(anthropic_config).await?;
let gpt_session = controller.create_session(openai_config).await?;
Switching Providers at Runtime
Create sessions with different providers dynamically:
let registry = load_config(&my_config);

// Create a Claude session for complex reasoning
if let Some(anthropic) = registry.get("anthropic") {
    let session_id = controller.create_session(anthropic.clone()).await?;
    // Use for complex multi-step tasks
}

// Create a Gemini session for long-context work
if let Some(google) = registry.get("google") {
    let session_id = controller.create_session(google.clone()).await?;
    // Use for analyzing large documents
}

// Create a GPT session for general tasks
if let Some(openai) = registry.get("openai") {
    let session_id = controller.create_session(openai.clone()).await?;
    // Use for standard interactions
}
Provider Selection Strategy
Different providers have different strengths:
| Use Case | Recommended Provider | Why |
|---|---|---|
| Complex reasoning | Anthropic Claude | Strong at multi-step reasoning |
| Long documents | Google Gemini | 1M+ token context |
| General chat | OpenAI GPT-4 | Well-rounded performance |
| Enterprise/AWS | Amazon Bedrock | VPC, CloudWatch, IAM |
| Enterprise/Azure | Azure OpenAI | Azure compliance, managed identity |
| RAG workloads | Cohere Command-R | Optimized for retrieval |
| Fast inference | Groq (OpenAI-compatible) | Hardware-accelerated |
| Local/private | Ollama (OpenAI-compatible) | No data leaves your machine |
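One way to turn a policy like this into code is a small routing helper over the registry. The TaskKind enum and the preferred-provider mapping below are illustrative; only get() and get_default() come from Agent Air:

use agent_air::agent::config::LLMRegistry;
use agent_air::controller::LLMSessionConfig;

// Illustrative task categories; adjust to your agent's needs.
enum TaskKind {
    ComplexReasoning,
    LongDocument,
    GeneralChat,
    Retrieval,
}

// Prefer a provider per task, falling back to the registry default
// when the preferred provider is not configured.
fn config_for_task<'a>(registry: &'a LLMRegistry, task: TaskKind) -> Option<&'a LLMSessionConfig> {
    let preferred = match task {
        TaskKind::ComplexReasoning => "anthropic",
        TaskKind::LongDocument => "google",
        TaskKind::GeneralChat => "openai",
        TaskKind::Retrieval => "cohere",
    };
    registry.get(preferred).or_else(|| registry.get_default())
}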
ConfigError
Configuration loading can fail with these errors:
pub enum ConfigError {
    NoHomeDirectory,
    ReadError { path: String, source: String },
    ParseError { path: String, source: String },
    UnknownProvider { provider: String },
}
Handle errors when loading:
match LLMRegistry::load_from_file(&path, "prompt") {
    Ok(registry) => { /* use registry */ }
    Err(ConfigError::ReadError { path, source }) => {
        eprintln!("Could not read {}: {}", path, source);
    }
    Err(ConfigError::ParseError { path, source }) => {
        eprintln!("Invalid YAML in {}: {}", path, source);
    }
    Err(ConfigError::UnknownProvider { provider }) => {
        eprintln!("Unknown provider: {}", provider);
    }
    Err(e) => {
        eprintln!("Config error: {}", e);
    }
}
Complete Example
use agent_air::AgentAir;
use agent_air::agent::config::{load_config, AgentConfig};

struct MyConfig;

impl AgentConfig for MyConfig {
    fn config_path(&self) -> &str { ".myagent/config.yaml" }
    fn default_system_prompt(&self) -> &str { "You are helpful." }
    fn log_prefix(&self) -> &str { "myagent" }
    fn name(&self) -> &str { "MyAgent" }
}

fn main() -> std::io::Result<()> {
    // Load configuration (file or environment)
    let registry = load_config(&MyConfig);

    if registry.is_empty() {
        eprintln!("No LLM providers configured!");
        eprintln!("Options:");
        eprintln!("  1. Create ~/.myagent/config.yaml");
        eprintln!("  2. Set ANTHROPIC_API_KEY");
        eprintln!("  3. Set OPENAI_API_KEY");
        eprintln!("  4. Set GOOGLE_API_KEY");
        eprintln!("  5. Set COHERE_API_KEY");
        eprintln!("  6. Set AWS_ACCESS_KEY_ID + AWS_SECRET_ACCESS_KEY");
        return Ok(());
    }

    println!("Available providers: {:?}", registry.providers());
    println!("Default: {:?}", registry.default_provider_name());

    // AgentAir uses the registry internally
    let mut agent = AgentAir::new(&MyConfig)?;
    agent.run()
}