LLM Configuration

Agent Air loads LLM provider configuration from a YAML file or environment variables. The YAML file takes precedence when it exists and contains valid provider definitions.

Configuration File Location

The configuration file path is specified by config_path() in your AgentConfig implementation:

fn config_path(&self) -> &str {
    "~/.myagent/config.yaml"
}

Paths starting with ~/ are expanded to the user’s home directory. The framework does not create the configuration directory or file automatically. If the path does not exist, the framework falls back to environment variables.
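Because the framework creates neither the directory nor the file, you can set them up by hand. A minimal shell sketch, assuming the ~/.myagent path from the example above:

```shell
# Create the config directory and an empty config file;
# the framework will not do this for you.
mkdir -p "$HOME/.myagent"
chmod 700 "$HOME/.myagent"               # keep the directory private
touch "$HOME/.myagent/config.yaml"
chmod 600 "$HOME/.myagent/config.yaml"   # the file will hold API keys
```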

YAML File Format

The configuration file has two top-level fields:

providers:
  - provider: anthropic
    api_key: sk-ant-...
    model: claude-sonnet-4-20250514

  - provider: openai
    api_key: sk-...
    model: gpt-4-turbo-preview

default_provider: anthropic

providers (required)

An array of provider configurations. Each entry contains:

  • provider: The provider identifier (see Supported Providers below); required
  • api_key: Authentication key for the provider API; required, though it may be empty for local providers
  • model: Model identifier to use for requests; optional, falling back to the provider's default model

default_provider (optional)

Specifies which provider to use when no specific provider is requested. If omitted, the first provider in the list becomes the default.
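For example, with default_provider omitted, the first entry in the list is used by default (API keys here are placeholders):

```yaml
# No default_provider: openai (first in the list) becomes the default.
providers:
  - provider: openai
    api_key: sk-xxxxx
    model: gpt-4o

  - provider: groq
    api_key: gsk_xxxxx
```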

Supported Providers

Native Providers

These providers have dedicated implementations with full feature support:

Provider    YAML Name   Default Model              Context Limit
Anthropic   anthropic   claude-sonnet-4-20250514   200,000
OpenAI      openai      gpt-4-turbo-preview        128,000
Google      google      gemini-2.5-flash           1,000,000

OpenAI-Compatible Providers

These providers use the OpenAI API format and are automatically configured with appropriate defaults:

Provider       YAML Name    Default Model                                       Context Limit
Groq           groq         llama-3.3-70b-versatile                             131,072
Together AI    together     meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo        131,072
Fireworks AI   fireworks    accounts/fireworks/models/llama-v3p1-70b-instruct   131,072
Mistral AI     mistral      mistral-large-latest                                128,000
Perplexity     perplexity   llama-3.1-sonar-large-128k-online                   128,000
DeepSeek       deepseek     deepseek-chat                                       64,000
OpenRouter     openrouter   anthropic/claude-3.5-sonnet                         200,000
xAI            xai          grok-2-latest                                       131,072
AI21 Labs      ai21         jamba-1.5-large                                     256,000
Anyscale       anyscale     meta-llama/Meta-Llama-3.1-70B-Instruct              128,000
Cerebras       cerebras     llama3.1-70b                                        128,000
SambaNova      sambanova    Meta-Llama-3.1-70B-Instruct                         128,000

Local Providers

These providers run locally and do not require API keys:

Provider    YAML Name   Default Model   Context Limit
Ollama      ollama      llama3.1        128,000
LM Studio   lmstudio    local-model     128,000

Configuration Examples

Single Provider

providers:
  - provider: anthropic
    api_key: sk-ant-api03-xxxxx
    model: claude-sonnet-4-20250514

Multiple Providers

providers:
  - provider: anthropic
    api_key: sk-ant-api03-xxxxx
    model: claude-sonnet-4-20250514

  - provider: groq
    api_key: gsk_xxxxx
    model: llama-3.3-70b-versatile

default_provider: anthropic

Local Provider

providers:
  - provider: ollama
    api_key: ""
    model: llama3.1

For local providers, the api_key field can be empty but must be present.
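An equivalent entry for LM Studio (local-model is the framework's default placeholder name; substitute whatever model you have loaded locally):

```yaml
providers:
  - provider: lmstudio
    api_key: ""        # required field, may be empty for local providers
    model: local-model
```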

Environment Variables

When no YAML configuration file exists or when it is empty, the framework falls back to environment variables. Set the appropriate API key variable to enable a provider.
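For example, to enable the Anthropic provider via the environment (key value is a placeholder):

```shell
# Enable the Anthropic provider without a YAML file.
export ANTHROPIC_API_KEY="sk-ant-api03-xxxxx"
# Optional: override the default model.
export ANTHROPIC_MODEL="claude-3-haiku-20240307"
```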

Native Providers

Provider    API Key Variable    Model Variable    Default Model
Anthropic   ANTHROPIC_API_KEY   ANTHROPIC_MODEL   claude-sonnet-4-20250514
OpenAI      OPENAI_API_KEY      OPENAI_MODEL      gpt-4-turbo-preview
Google      GOOGLE_API_KEY      GOOGLE_MODEL      gemini-2.5-flash

OpenAI-Compatible Providers

Provider       API Key Variable     Model Variable
Groq           GROQ_API_KEY         GROQ_MODEL
Together AI    TOGETHER_API_KEY     TOGETHER_MODEL
Fireworks AI   FIREWORKS_API_KEY    FIREWORKS_MODEL
Mistral AI     MISTRAL_API_KEY      MISTRAL_MODEL
Perplexity     PERPLEXITY_API_KEY   PERPLEXITY_MODEL
DeepSeek       DEEPSEEK_API_KEY     DEEPSEEK_MODEL
OpenRouter     OPENROUTER_API_KEY   OPENROUTER_MODEL
xAI            XAI_API_KEY          XAI_MODEL
AI21 Labs      AI21_API_KEY         AI21_MODEL
Anyscale       ANYSCALE_API_KEY     ANYSCALE_MODEL
Cerebras       CEREBRAS_API_KEY     CEREBRAS_MODEL
SambaNova      SAMBANOVA_API_KEY    SAMBANOVA_MODEL

Local Providers

Provider    Host Variable
Ollama      OLLAMA_HOST
LM Studio   LMSTUDIO_HOST

Setting the host variable enables the local provider. No API key is required.
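For example (the URLs below are the usual local endpoints for these tools, stated here as assumptions rather than framework defaults):

```shell
# Point the framework at a locally running Ollama instance.
export OLLAMA_HOST="http://localhost:11434"    # 11434 is Ollama's usual port
# Or at LM Studio's local server.
export LMSTUDIO_HOST="http://localhost:1234"   # 1234 is LM Studio's usual port
```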

Precedence

Environment variables are only used when:

  • The YAML config file does not exist
  • The YAML config file cannot be read
  • The YAML config file contains no providers

If the YAML file contains valid provider configuration, environment variables are ignored. There is no merging of configuration sources.

When multiple API keys are set via environment variables, the default provider is selected in this order: Anthropic, OpenAI, Google, then OpenAI-compatible providers.
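For example, with both of these set (keys are placeholders), Anthropic becomes the default provider under the order above:

```shell
export GROQ_API_KEY="gsk_xxxxx"
export ANTHROPIC_API_KEY="sk-ant-api03-xxxxx"
# Both providers are enabled; Anthropic is selected as the default
# because native providers precede OpenAI-compatible ones in the
# selection order.
```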

Available Models

Anthropic

Model                        Description
claude-sonnet-4-20250514     Latest Sonnet model (default)
claude-opus-4-20250514       Latest Opus model
claude-3-5-sonnet-20241022   Claude 3.5 Sonnet
claude-3-opus-20240229       Claude 3 Opus
claude-3-haiku-20240307      Claude 3 Haiku (fast)

OpenAI

Model                 Description
gpt-4-turbo-preview   GPT-4 Turbo (default)
gpt-4o                GPT-4o
gpt-4o-mini           GPT-4o Mini (fast)
gpt-4                 GPT-4
gpt-3.5-turbo         GPT-3.5 Turbo

Google

Model              Description
gemini-2.5-flash   Gemini 2.5 Flash (default)
gemini-2.5-pro     Gemini 2.5 Pro
gemini-1.5-pro     Gemini 1.5 Pro

Groq

Model                     Description
llama-3.3-70b-versatile   Llama 3.3 70B (default)
llama-3.1-70b-versatile   Llama 3.1 70B
mixtral-8x7b-32768        Mixtral 8x7B

For other OpenAI-compatible providers, refer to the provider’s documentation for available models.

Model-Specific Capabilities

Some features vary by model:

Streaming support:

  • Anthropic: Enabled by default
  • OpenAI: Disabled by default (can be enabled)
  • Google: Enabled by default
  • OpenAI-compatible: Varies by provider

Tool use:

  • Most modern models support tool use (function calling)
  • Check provider documentation for specific model capabilities

Model identifiers are not validated by the framework. Invalid model names result in errors from the provider API when making requests.

API Key Security

File Permissions

Restrict access to configuration files containing API keys:

chmod 600 ~/.myagent/config.yaml

Do Not Commit Keys

Add configuration files to .gitignore:

# Agent configuration with API keys
.myagent/config.yaml
config.yaml

Production Deployments

For production, prefer environment variables or secrets management systems:

  • Container orchestration (Kubernetes secrets, Docker secrets)
  • Cloud provider secrets managers (AWS Secrets Manager, GCP Secret Manager)
  • CI/CD pipeline secrets
  • Process managers (systemd environment files)
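As one sketch, a Kubernetes Secret injected into the agent container as environment variables (the secret name, pod name, and image are illustrative, not part of the framework):

```yaml
apiVersion: v1
kind: Secret
metadata:
  name: agent-llm-keys
type: Opaque
stringData:
  ANTHROPIC_API_KEY: sk-ant-api03-xxxxx   # placeholder value
---
apiVersion: v1
kind: Pod
metadata:
  name: my-agent
spec:
  containers:
    - name: agent
      image: my-agent:latest   # illustrative image name
      envFrom:
        - secretRef:
            name: agent-llm-keys
```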

Key Formats

Provider     Key Prefix   Example
Anthropic    sk-ant-      sk-ant-api03-xxxxx
OpenAI       sk-          sk-xxxxx
Groq         gsk_         gsk_xxxxx
OpenRouter   sk-or-       sk-or-xxxxx

Error Handling

Configuration errors are represented by ConfigError:

Error             Description
ReadError         The file could not be read (missing, permissions)
ParseError        The file contains invalid YAML
UnknownProvider   A provider name is not recognized

When configuration loading fails, the framework logs a warning and the agent starts without LLM capabilities.

Changing Configuration

Configuration is loaded once at startup. To change providers, models, or API keys:

  1. Update the YAML configuration file or environment variable
  2. Restart the agent