# Anthropic Provider
The Anthropic provider enables integration with Claude models. It supports both synchronous and streaming message completion with full tool use capabilities.
## AnthropicProvider Struct

The provider is defined in `src/client/providers/anthropic/mod.rs`:

```rust
pub struct AnthropicProvider {
    pub api_key: String,
    pub model: String,
}

impl AnthropicProvider {
    pub fn new(api_key: String, model: String) -> Self {
        Self { api_key, model }
    }
}
```
## API Configuration
The Anthropic provider uses the following API settings:
| Setting | Value |
|---|---|
| Endpoint | https://api.anthropic.com/v1/messages |
| Version | 2023-06-01 |
| Content-Type | application/json |
Request headers:

```text
x-api-key: <api_key>
anthropic-version: 2023-06-01
Content-Type: application/json
```
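As a rough sketch of how these headers come together (the helper name `anthropic_headers` is hypothetical; the real provider builds them inside its request code):

```rust
// Sketch: assemble the three required Anthropic request headers as
// (name, value) pairs. The api_key argument stands in for
// AnthropicProvider::api_key.
fn anthropic_headers(api_key: &str) -> Vec<(String, String)> {
    vec![
        ("x-api-key".to_string(), api_key.to_string()),
        ("anthropic-version".to_string(), "2023-06-01".to_string()),
        ("Content-Type".to_string(), "application/json".to_string()),
    ]
}

fn main() {
    let headers = anthropic_headers("sk-ant-example");
    assert_eq!(headers.len(), 3);
    assert_eq!(headers[1].1, "2023-06-01");
}
```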
## LLMSessionConfig Builder
Create an Anthropic session configuration using the builder:
```rust
use agent_air::controller::LLMSessionConfig;

let config = LLMSessionConfig::anthropic("sk-ant-...", "claude-sonnet-4-20250514");
```
The `anthropic()` method sets these defaults:
| Option | Default Value |
|---|---|
| max_tokens | 4096 |
| streaming | true |
| context_limit | 200,000 |
| compaction | Threshold (default) |
### Builder Methods
Customize the configuration using builder methods:
```rust
let config = LLMSessionConfig::anthropic("sk-ant-...", "claude-sonnet-4-20250514")
    .with_max_tokens(8192)
    .with_system_prompt("You are a helpful assistant.")
    .with_temperature(0.7)
    .with_streaming(true)
    .with_context_limit(200_000);
```
Available methods:
| Method | Description |
|---|---|
| `with_max_tokens(u32)` | Set maximum response tokens |
| `with_system_prompt(impl Into<String>)` | Set the system prompt |
| `with_temperature(f32)` | Set sampling temperature |
| `with_streaming(bool)` | Enable or disable streaming |
| `with_context_limit(i32)` | Set context window size |
| `with_threshold_compaction(config)` | Configure compaction |
| `without_compaction()` | Disable compaction |
## Streaming Support

The Anthropic provider fully supports streaming via Server-Sent Events (SSE). When streaming is enabled, responses arrive incrementally as `StreamEvent` values:
```rust
pub enum StreamEvent {
    TextDelta(String),
    ToolUse { id: String, name: String, input: Value },
    MessageComplete,
    Error(String),
}
```
The provider parses the SSE stream and converts Anthropic-specific events to the generic `StreamEvent` type.
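A consumer typically folds these events into a final response. The sketch below uses a simplified subset of the enum (the `ToolUse` variant and its `serde_json::Value` payload are omitted so the example stands alone), and `collect_text` is a hypothetical helper, not provider API:

```rust
// Simplified subset of StreamEvent, reproduced so this compiles standalone.
enum StreamEvent {
    TextDelta(String),
    MessageComplete,
    Error(String),
}

// Sketch: accumulate streamed text deltas into the complete response text,
// stopping at MessageComplete and surfacing any stream error.
fn collect_text(events: Vec<StreamEvent>) -> Result<String, String> {
    let mut text = String::new();
    for event in events {
        match event {
            StreamEvent::TextDelta(delta) => text.push_str(&delta),
            StreamEvent::MessageComplete => break,
            StreamEvent::Error(msg) => return Err(msg),
        }
    }
    Ok(text)
}

fn main() {
    let events = vec![
        StreamEvent::TextDelta("Hel".into()),
        StreamEvent::TextDelta("lo".into()),
        StreamEvent::MessageComplete,
    ];
    assert_eq!(collect_text(events).unwrap(), "Hello");
}
```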
## System Message Handling

The Anthropic API uses a dedicated `system` field rather than including system messages in the message array. The provider automatically extracts system messages and formats them correctly:
```json
{
  "model": "claude-sonnet-4-20250514",
  "system": "You are a helpful assistant.",
  "messages": [
    {"role": "user", "content": "Hello"}
  ],
  "max_tokens": 4096
}
```
## Tool Use Format
Tools are sent in the Anthropic format:
```json
{
  "tools": [
    {
      "name": "get_weather",
      "description": "Get weather for a location",
      "input_schema": {
        "type": "object",
        "properties": {
          "location": {"type": "string"}
        },
        "required": ["location"]
      }
    }
  ]
}
```
Tool choice options:
"auto"- Let the model decide"any"- Force tool use{"type": "tool", "name": "..."}- Force specific tool
## Environment Variables
Configure Anthropic via environment variables:
| Variable | Description | Default |
|---|---|---|
| `ANTHROPIC_API_KEY` | API key (required) | None |
| `ANTHROPIC_MODEL` | Model identifier | `claude-sonnet-4-20250514` |
When using environment variables, the configuration is loaded automatically:
```rust
// Reads ANTHROPIC_API_KEY and ANTHROPIC_MODEL from the environment
let registry = load_config(&my_agent_config);
let config = registry.get_default();
```
## Available Models
Common Claude model identifiers:
| Model | Context | Description |
|---|---|---|
| `claude-sonnet-4-20250514` | 200K | Latest Sonnet model |
| `claude-opus-4-20250514` | 200K | Most capable model |
| `claude-3-5-sonnet-20241022` | 200K | Previous Sonnet |
| `claude-3-opus-20240229` | 200K | Previous Opus |
| `claude-3-haiku-20240307` | 200K | Fast, efficient model |
## YAML Configuration
Configure in your agent’s config file:
```yaml
providers:
  - provider: anthropic
    api_key: sk-ant-api03-...
    model: claude-sonnet-4-20250514
    system_prompt: "You are a helpful coding assistant."

default_provider: anthropic
```
## Complete Example
```rust
use agent_air::{AgentAir, AgentConfig};

struct MyConfig;

impl AgentConfig for MyConfig {
    fn config_path(&self) -> &str { ".myagent/config.yaml" }
    fn default_system_prompt(&self) -> &str { "You are helpful." }
    fn log_prefix(&self) -> &str { "myagent" }
    fn name(&self) -> &str { "MyAgent" }
}

fn main() -> std::io::Result<()> {
    let mut agent = AgentAir::new(&MyConfig)?;
    // Configuration is loaded automatically from:
    // 1. ~/.myagent/config.yaml (if it exists)
    // 2. ANTHROPIC_API_KEY environment variable (fallback)
    agent.run()
}
```
## Programmatic Configuration
For direct configuration without files:
```rust
use agent_air::controller::{LLMSessionConfig, LLMController};

// Inside an async context (create_session is awaited below)
let config = LLMSessionConfig::anthropic(
    std::env::var("ANTHROPIC_API_KEY").expect("API key required"),
    "claude-sonnet-4-20250514"
)
.with_system_prompt("You are a helpful assistant.")
.with_max_tokens(4096)
.with_streaming(true);

// Create a session with this config
let controller = LLMController::new(None);
let session_id = controller.create_session(config).await?;
```
## Error Handling

Anthropic API errors are converted to `LlmError`:
```rust
pub struct LlmError {
    pub error_code: String,
    pub error_message: String,
}
```
Common error codes from Anthropic:
- `authentication_error` - Invalid API key
- `rate_limit_error` - Too many requests
- `overloaded_error` - API temporarily unavailable
- `invalid_request_error` - Malformed request
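These codes can drive a retry decision. A minimal sketch, assuming the transient codes are `rate_limit_error` and `overloaded_error` (the `is_retryable` helper and the retry policy itself are illustrative, not part of the provider):

```rust
// Mirrors the LlmError struct above.
struct LlmError {
    error_code: String,
    error_message: String,
}

// Sketch: treat rate-limit and overload errors as transient (worth
// retrying with backoff); everything else is fatal.
fn is_retryable(err: &LlmError) -> bool {
    matches!(err.error_code.as_str(), "rate_limit_error" | "overloaded_error")
}

fn main() {
    let transient = LlmError {
        error_code: "rate_limit_error".to_string(),
        error_message: "Too many requests".to_string(),
    };
    assert!(is_retryable(&transient));

    let fatal = LlmError {
        error_code: "authentication_error".to_string(),
        error_message: "Invalid API key".to_string(),
    };
    assert!(!is_retryable(&fatal));
}
```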
