AgentConfig Internals
This page documents the internal configuration system including the AgentConfig trait, configuration file loading, environment variable fallbacks, and the LLMRegistry that manages LLM provider configurations.
AgentConfig Trait
The AgentConfig trait defines the configuration interface that agents must implement:
pub trait AgentConfig {
    /// Path to the configuration file relative to home directory
    fn config_path(&self) -> &str;

    /// Default system prompt for LLM sessions
    fn default_system_prompt(&self) -> &str;

    /// Prefix for log files
    fn log_prefix(&self) -> &str;

    /// Display name for the agent
    fn name(&self) -> &str;
}
Implementation Example
struct MyAgentConfig;

impl AgentConfig for MyAgentConfig {
    fn config_path(&self) -> &str {
        ".myagent/config.yaml"
    }

    fn default_system_prompt(&self) -> &str {
        "You are a helpful coding assistant. \
         Be concise and accurate in your responses."
    }

    fn log_prefix(&self) -> &str {
        "myagent"
    }

    fn name(&self) -> &str {
        "MyAgent"
    }
}
Config Path Resolution
The config_path() return value is joined with the user’s home directory:
let home = dirs::home_dir()?;
let config_file = home.join(config.config_path());
// Result: /Users/username/.myagent/config.yaml
Configuration Loading
Configuration is loaded during AgentAir::new():
fn load_config<C: AgentConfig>(config: &C) -> Option<LLMRegistry> {
    // 1. Try loading from YAML file
    if let Some(home) = dirs::home_dir() {
        let config_path = home.join(config.config_path());
        if config_path.exists() {
            match LLMRegistry::load_from_file(&config_path, config.default_system_prompt()) {
                Ok(registry) => return Some(registry),
                Err(e) => tracing::warn!("Failed to load config: {}", e),
            }
        }
    }

    // 2. Fall back to environment variables
    LLMRegistry::from_env(config.default_system_prompt())
}
Load Priority
- YAML Configuration File: First attempts to load from ~/{config_path}
- Environment Variables: Falls back to ANTHROPIC_API_KEY, OPENAI_API_KEY, etc.
- No Configuration: Returns None if neither source is available
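As an illustration of the environment-variable fallback, a shell session might export the variables checked by from_env before launching the agent (the key value below is a placeholder; the model override is optional):

```shell
# Configure the Anthropic provider via environment variables.
# ANTHROPIC_MODEL is optional; from_env supplies a built-in default model.
export ANTHROPIC_API_KEY="sk-ant-placeholder"
export ANTHROPIC_MODEL="claude-sonnet-4-20250514"

# Count how many of the two variables are visible to child processes.
env | grep -cE '^ANTHROPIC_(API_KEY|MODEL)='
```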
LLMRegistry
The LLMRegistry manages configured LLM providers:
pub struct LLMRegistry {
    configs: HashMap<String, LLMSessionConfig>,
    default_provider: Option<String>,
}
Creating from YAML
impl LLMRegistry {
    pub fn load_from_file(
        path: &PathBuf,
        default_system_prompt: &str,
    ) -> Result<Self, ConfigError> {
        let contents = fs::read_to_string(path)?;
        let yaml: YamlConfig = serde_yaml::from_str(&contents)?;

        let mut configs = HashMap::new();
        let mut default_provider = None;

        // Process each provider in the YAML
        for (name, provider_config) in yaml.providers {
            let session_config = LLMSessionConfig::from_yaml(
                &provider_config,
                default_system_prompt,
            )?;

            if provider_config.default.unwrap_or(false) {
                default_provider = Some(name.clone());
            }

            configs.insert(name, session_config);
        }

        Ok(Self { configs, default_provider })
    }
}
Creating from Environment Variables
impl LLMRegistry {
    pub fn from_env(default_system_prompt: &str) -> Option<Self> {
        let mut configs = HashMap::new();
        let mut default_provider = None;

        // Check for Anthropic
        if let Ok(api_key) = env::var("ANTHROPIC_API_KEY") {
            let model = env::var("ANTHROPIC_MODEL")
                .unwrap_or_else(|_| "claude-sonnet-4-20250514".to_string());
            let config = LLMSessionConfig::anthropic(&api_key, &model)
                .with_system_prompt(default_system_prompt);
            configs.insert("anthropic".to_string(), config);
            default_provider = Some("anthropic".to_string());
        }

        // Check for OpenAI
        if let Ok(api_key) = env::var("OPENAI_API_KEY") {
            let model = env::var("OPENAI_MODEL")
                .unwrap_or_else(|_| "gpt-4o".to_string());
            let config = LLMSessionConfig::openai(&api_key, &model)
                .with_system_prompt(default_system_prompt);
            configs.insert("openai".to_string(), config);

            // OpenAI becomes default only if Anthropic not configured
            if default_provider.is_none() {
                default_provider = Some("openai".to_string());
            }
        }

        if configs.is_empty() {
            None
        } else {
            Some(Self { configs, default_provider })
        }
    }
}
Registry Methods
impl LLMRegistry {
    /// Get the default provider configuration
    pub fn get_default(&self) -> Option<LLMSessionConfig> {
        self.default_provider
            .as_ref()
            .and_then(|name| self.configs.get(name))
            .cloned()
    }

    /// Get configuration by provider name
    pub fn get(&self, provider: &str) -> Option<LLMSessionConfig> {
        self.configs.get(provider).cloned()
    }

    /// List available provider names
    pub fn providers(&self) -> Vec<&str> {
        self.configs.keys().map(|s| s.as_str()).collect()
    }

    /// Check if any providers are configured
    pub fn is_empty(&self) -> bool {
        self.configs.is_empty()
    }

    /// Get the default provider name
    pub fn default_provider(&self) -> Option<&str> {
        self.default_provider.as_deref()
    }
}
YAML Configuration Format
The configuration file uses YAML format:
providers:
  anthropic:
    provider: anthropic
    api_key: ${ANTHROPIC_API_KEY}  # Environment variable substitution
    model: claude-sonnet-4-20250514
    max_tokens: 8192
    temperature: 0.7
    streaming: true
    context_limit: 200000
    default: true
    compaction:
      type: threshold
      threshold: 0.8
      summary_model: claude-haiku-3-20240307
  openai:
    provider: openai
    api_key: ${OPENAI_API_KEY}
    model: gpt-4o
    max_tokens: 4096
    streaming: true
    context_limit: 128000
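A much smaller file also works. Assuming LLMSessionConfig::from_yaml falls back to its built-in defaults for omitted fields (max_tokens, streaming, context_limit, and so on), a minimal single-provider configuration might look like:

```yaml
providers:
  anthropic:
    provider: anthropic
    api_key: ${ANTHROPIC_API_KEY}
    model: claude-sonnet-4-20250514
    default: true
```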
Environment Variable Substitution
The YAML parser supports ${VAR_NAME} syntax for environment variable substitution:
fn substitute_env_vars(value: &str) -> String {
    let re = Regex::new(r"\$\{(\w+)\}").unwrap();
    re.replace_all(value, |caps: &Captures| {
        env::var(&caps[1]).unwrap_or_default()
    }).to_string()
}
This allows keeping API keys in environment variables while specifying other settings in YAML.
LLMSessionConfig
The LLMSessionConfig struct holds configuration for a single LLM session:
pub struct LLMSessionConfig {
    pub provider: LLMProvider,
    pub api_key: String,
    pub model: String,
    pub max_tokens: Option<u32>,
    pub system_prompt: Option<String>,
    pub temperature: Option<f32>,
    pub streaming: bool,
    pub context_limit: i32,
    pub compaction: Option<CompactorType>,
}
LLMProvider Enum
pub enum LLMProvider {
    Anthropic,
    OpenAI,
}
Builder Methods
LLMSessionConfig provides builder methods for programmatic construction:
impl LLMSessionConfig {
    /// Create Anthropic configuration
    pub fn anthropic(api_key: &str, model: &str) -> Self {
        Self {
            provider: LLMProvider::Anthropic,
            api_key: api_key.to_string(),
            model: model.to_string(),
            max_tokens: Some(8192),
            system_prompt: None,
            temperature: None,
            streaming: true,
            context_limit: 200000,
            compaction: None,
        }
    }

    /// Create OpenAI configuration
    pub fn openai(api_key: &str, model: &str) -> Self {
        Self {
            provider: LLMProvider::OpenAI,
            api_key: api_key.to_string(),
            model: model.to_string(),
            max_tokens: Some(4096),
            system_prompt: None,
            temperature: None,
            streaming: true,
            context_limit: 128000,
            compaction: None,
        }
    }

    /// Set streaming mode
    pub fn with_streaming(mut self, streaming: bool) -> Self {
        self.streaming = streaming;
        self
    }

    /// Set maximum tokens for response
    pub fn with_max_tokens(mut self, max_tokens: u32) -> Self {
        self.max_tokens = Some(max_tokens);
        self
    }

    /// Set system prompt
    pub fn with_system_prompt(mut self, prompt: impl Into<String>) -> Self {
        self.system_prompt = Some(prompt.into());
        self
    }

    /// Set temperature
    pub fn with_temperature(mut self, temperature: f32) -> Self {
        self.temperature = Some(temperature);
        self
    }

    /// Set context limit
    pub fn with_context_limit(mut self, limit: i32) -> Self {
        self.context_limit = limit;
        self
    }

    /// Enable threshold-based compaction
    pub fn with_threshold_compaction(mut self, config: CompactionConfig) -> Self {
        self.compaction = Some(CompactorType::Threshold(config));
        self
    }

    /// Enable LLM-based compaction
    pub fn with_llm_compaction(mut self, config: LLMCompactorConfig) -> Self {
        self.compaction = Some(CompactorType::LLM(config));
        self
    }

    /// Disable compaction
    pub fn without_compaction(mut self) -> Self {
        self.compaction = None;
        self
    }
}
Compaction Configuration
Two compaction strategies are supported:
Threshold Compaction
pub struct CompactionConfig {
    pub threshold: f32,          // 0.0-1.0, trigger when context exceeds this ratio
    pub keep_recent: usize,      // Number of recent messages to preserve
    pub summary_max_tokens: u32, // Max tokens for summary
}

impl Default for CompactionConfig {
    fn default() -> Self {
        Self {
            threshold: 0.8,
            keep_recent: 4,
            summary_max_tokens: 2000,
        }
    }
}
LLM Compaction
pub struct LLMCompactorConfig {
    pub threshold: f32,
    pub summary_model: String,   // Model to use for summarization
    pub summary_max_tokens: u32,
}
Configuration Errors
#[derive(Error, Debug)]
pub enum ConfigError {
    #[error("Failed to read config file: {0}")]
    FileRead(#[from] io::Error),

    #[error("Failed to parse YAML: {0}")]
    YamlParse(#[from] serde_yaml::Error),

    #[error("Invalid provider: {0}")]
    InvalidProvider(String),

    #[error("Missing required field: {0}")]
    MissingField(String),
}
Usage in AgentAir
The configuration is used during AgentAir::new():
pub fn new<C: AgentConfig>(config: &C) -> io::Result<Self> {
    // Load configuration
    let llm_registry = load_config(config);

    // Store for later use
    Ok(Self {
        name: config.name().to_string(),
        llm_registry,
        // ...
    })
}
And when creating sessions:
pub fn create_initial_session(&mut self) -> Result<(i64, String, i32), AgentError> {
    let registry = self.llm_registry.as_ref()
        .ok_or_else(|| AgentError::NoConfiguration(
            "No LLM configuration found".to_string()
        ))?;
    let config = registry.get_default()
        .ok_or_else(|| AgentError::NoConfiguration(
            "No default provider configured".to_string()
        ))?;

    // Capture these before `config` is moved into the session call
    let model = config.model.clone();
    let context_limit = config.context_limit;

    // Create session with this config
    let session_id = self.runtime.block_on(
        Self::create_session_internal(&self.controller, config, &self.tool_definitions)
    )?;

    Ok((session_id, model, context_limit))
}
Next Steps
- Agent Lifecycle - How configuration is used during startup
- Builder Pattern - Programmatic configuration
- LLM Providers - Provider-specific details
