Session Creation

This page documents how LLM sessions are created, configured, and registered with the session manager. Sessions are the core abstraction for managing conversations with LLM providers.

Creation Overview

┌─────────────────────────────────────────────────────────────────┐
│                    AgentAir                                     │
│  create_initial_session() or create_session()                   │
└─────────────────────────────────────┬───────────────────────────┘
                                      │ LLMSessionConfig

┌─────────────────────────────────────────────────────────────────┐
│                    LLMController                                │
│  create_session(config, tools)                                  │
└─────────────────────────────────────┬───────────────────────────┘
                                      │

┌─────────────────────────────────────────────────────────────────┐
│                    LLMSessionManager                            │
│  create_session(config, from_llm_tx)                            │
│  - Create LLMSession                                            │
│  - Store in sessions map                                        │
│  - Spawn session.start() task                                   │
└─────────────────────────────────────┬───────────────────────────┘
                                      │

┌─────────────────────────────────────────────────────────────────┐
│                    LLMSession                                   │
│  new(config, from_llm_tx)                                       │
│  - Generate unique ID                                           │
│  - Create LLMClient                                             │
│  - Initialize state                                             │
└─────────────────────────────────────────────────────────────────┘

Session ID Generation

Each session receives an ID that is unique within the process:

static SESSION_COUNTER: AtomicI64 = AtomicI64::new(0);

impl LLMSession {
    pub fn new(...) -> Result<Self, LlmError> {
        let id = SESSION_COUNTER.fetch_add(1, Ordering::SeqCst);
        // ...
    }
}

IDs are:

  • Monotonically increasing i64 values
  • Unique across all sessions in the process
  • Used for session lookup and event correlation
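
The counter scheme above can be demonstrated in a minimal standalone sketch (`next_session_id` is a hypothetical helper, not part of the real API): fetch_add returns the previous value, so IDs come out as 0, 1, 2, … even under concurrent session creation.

```rust
use std::sync::atomic::{AtomicI64, Ordering};

// Process-wide counter, mirroring SESSION_COUNTER in LLMSession::new.
static COUNTER: AtomicI64 = AtomicI64::new(0);

// fetch_add returns the value *before* the increment, so each caller
// observes a distinct, monotonically increasing ID.
fn next_session_id() -> i64 {
    COUNTER.fetch_add(1, Ordering::SeqCst)
}

fn main() {
    let first = next_session_id();
    let second = next_session_id();
    assert_eq!(second, first + 1); // strictly increasing
    println!("monotonic ok");
}
```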

LLMSessionConfig

Sessions are configured via LLMSessionConfig:

pub struct LLMSessionConfig {
    pub provider: LLMProvider,
    pub api_key: String,
    pub model: String,
    pub max_tokens: Option<u32>,
    pub system_prompt: Option<String>,
    pub temperature: Option<f32>,
    pub streaming: bool,
    pub context_limit: i32,
    pub compaction: Option<CompactorType>,
}

Provider Configuration

pub enum LLMProvider {
    Anthropic,
    OpenAI,
}

Factory Methods

impl LLMSessionConfig {
    pub fn anthropic(api_key: impl Into<String>, model: impl Into<String>) -> Self {
        Self {
            provider: LLMProvider::Anthropic,
            api_key: api_key.into(),
            model: model.into(),
            max_tokens: Some(4096),
            system_prompt: None,
            temperature: None,
            streaming: true,
            context_limit: 200_000,
            compaction: Some(CompactorType::default()),
        }
    }

    pub fn openai(api_key: impl Into<String>, model: impl Into<String>) -> Self {
        Self {
            provider: LLMProvider::OpenAI,
            api_key: api_key.into(),
            model: model.into(),
            max_tokens: Some(4096),
            system_prompt: None,
            temperature: None,
            streaming: false,
            context_limit: 128_000,
            compaction: Some(CompactorType::default()),
        }
    }
}

Builder Methods

impl LLMSessionConfig {
    pub fn with_streaming(mut self, streaming: bool) -> Self {
        self.streaming = streaming;
        self
    }

    pub fn with_max_tokens(mut self, max_tokens: u32) -> Self {
        self.max_tokens = Some(max_tokens);
        self
    }

    pub fn with_system_prompt(mut self, prompt: impl Into<String>) -> Self {
        self.system_prompt = Some(prompt.into());
        self
    }

    pub fn with_temperature(mut self, temperature: f32) -> Self {
        self.temperature = Some(temperature);
        self
    }

    pub fn with_context_limit(mut self, context_limit: i32) -> Self {
        self.context_limit = context_limit;
        self
    }

    pub fn with_threshold_compaction(mut self, config: CompactionConfig) -> Self {
        self.compaction = Some(CompactorType::Threshold(config));
        self
    }

    pub fn with_llm_compaction(mut self, config: LLMCompactorConfig) -> Self {
        self.compaction = Some(CompactorType::LLM(config));
        self
    }

    pub fn without_compaction(mut self) -> Self {
        self.compaction = None;
        self
    }
}
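
Because each builder method takes and returns `self`, calls chain naturally. The sketch below is a trimmed, self-contained replica of the pattern (only `streaming` and `max_tokens`, and a hypothetical `Config::new` constructor) rather than the real `LLMSessionConfig`:

```rust
// Trimmed replica of the consuming-builder pattern used by LLMSessionConfig.
#[derive(Debug)]
struct Config {
    streaming: bool,
    max_tokens: Option<u32>,
}

impl Config {
    fn new() -> Self {
        Self { streaming: true, max_tokens: Some(4096) }
    }

    // Each method consumes self, mutates one field, and returns self,
    // so calls can be chained in a single expression.
    fn with_streaming(mut self, streaming: bool) -> Self {
        self.streaming = streaming;
        self
    }

    fn with_max_tokens(mut self, max_tokens: u32) -> Self {
        self.max_tokens = Some(max_tokens);
        self
    }
}

fn main() {
    let cfg = Config::new().with_streaming(false).with_max_tokens(8192);
    assert!(!cfg.streaming);
    assert_eq!(cfg.max_tokens, Some(8192));
    println!("{cfg:?}");
}
```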

Default Configuration Values

Field           Anthropic Default    OpenAI Default
max_tokens      4096                 4096
streaming       true                 false
context_limit   200,000              128,000
compaction      Threshold (0.75)     Threshold (0.75)

Creating Initial Session

The most common path is creating the initial session from configuration:

impl AgentAir {
    pub fn create_initial_session(&mut self) -> Result<(i64, String, i32), AgentError> {
        // Get registry
        let registry = self.llm_registry.as_ref()
            .ok_or_else(|| AgentError::NoConfiguration(
                "No LLM configuration found".to_string()
            ))?;

        // Get default provider config
        let config = registry.get_default()
            .ok_or_else(|| AgentError::NoConfiguration(
                "No default provider configured".to_string()
            ))?;

        // Create session via controller
        let session_id = self.runtime.block_on(
            Self::create_session_internal(
                &self.controller,
                config.clone(),
                &self.tool_definitions,
            )
        )?;

        Ok((session_id, config.model.clone(), config.context_limit))
    }
}

Creating Additional Sessions

Create sessions with specific configurations:

impl AgentAir {
    pub fn create_session(&self, config: LLMSessionConfig) -> Result<i64, AgentError> {
        let session_id = self.runtime.block_on(
            Self::create_session_internal(
                &self.controller,
                config,
                &self.tool_definitions,
            )
        )?;

        Ok(session_id)
    }
}

Controller Session Creation

The controller delegates to the session manager:

impl LLMController {
    pub async fn create_session(
        &self,
        config: LLMSessionConfig,
        tools: &[LLMTool],
    ) -> Result<i64, LlmError> {
        // Create session via manager
        let session_id = self.session_mgr
            .create_session(config, self.from_llm_tx.clone())
            .await?;

        // Add tools to the session
        if let Some(session) = self.session_mgr.get_session_by_id(session_id).await {
            session.set_tools(tools.to_vec()).await;
        }

        Ok(session_id)
    }
}

Session Manager Creation

The session manager creates and stores sessions:

impl LLMSessionManager {
    pub async fn create_session(
        &self,
        config: LLMSessionConfig,
        from_llm: mpsc::Sender<FromLLMPayload>,
    ) -> Result<i64, LlmError> {
        // Create session instance
        let session = Arc::new(LLMSession::new(config, from_llm)?);
        let session_id = session.id();

        // Store in sessions map
        self.sessions.write().await.insert(session_id, session.clone());

        // Spawn processing task
        tokio::spawn(async move {
            session.start().await;
        });

        Ok(session_id)
    }
}
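
Note the ownership pattern: one `Arc` clone is stored in the map while the original moves into the spawned task, so both the manager and the task keep the session alive. A standalone sketch of that pattern, with `std::thread::spawn` standing in for `tokio::spawn` and a stub `Session` type:

```rust
use std::sync::Arc;
use std::thread;

// Stub session; start() stands in for the async processing loop.
struct Session { id: i64 }

impl Session {
    fn start(&self) -> i64 { self.id }
}

fn main() {
    let session = Arc::new(Session { id: 1 });
    let stored = session.clone();                         // kept in the sessions map
    let handle = thread::spawn(move || session.start());  // task owns the other Arc
    assert_eq!(handle.join().unwrap(), 1);
    assert_eq!(stored.id, 1); // map's reference still valid after the task ran
    println!("spawned and joined");
}
```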

LLMSession Initialization

The session constructor performs initialization:

impl LLMSession {
    pub fn new(
        config: LLMSessionConfig,
        from_llm: mpsc::Sender<FromLLMPayload>,
    ) -> Result<Self, LlmError> {
        // Generate unique ID
        static SESSION_COUNTER: AtomicI64 = AtomicI64::new(0);
        let id = SESSION_COUNTER.fetch_add(1, Ordering::SeqCst);

        // Create LLM client for provider
        let client = match config.provider {
            LLMProvider::Anthropic => LLMClient::anthropic(&config.api_key)?,
            LLMProvider::OpenAI => LLMClient::openai(&config.api_key)?,
        };

        // Create internal channel
        let (to_llm_tx, to_llm_rx) = mpsc::channel(DEFAULT_CHANNEL_SIZE);

        // Initialize compactor
        let compactor = Self::create_compactor(&config.compaction);
        let llm_compactor = Self::create_llm_compactor(&client, &config.compaction)?;

        Ok(Self {
            id: AtomicI64::new(id),
            client,
            to_llm_tx,
            to_llm_rx: Mutex::new(to_llm_rx),
            from_llm,
            config,
            // ... initialize all fields
        })
    }
}

Compactor Initialization

Compactors are created based on configuration:

fn create_compactor(config: &Option<CompactorType>) -> Option<Box<dyn Compactor>> {
    match config {
        Some(CompactorType::Threshold(cfg)) => {
            Some(Box::new(ThresholdCompactor::new(cfg.clone())))
        }
        Some(CompactorType::LLM(_)) => None, // LLM compactor stored separately
        None => None,
    }
}

fn create_llm_compactor(
    client: &LLMClient,
    config: &Option<CompactorType>,
) -> Result<Option<LLMCompactor>, LlmError> {
    match config {
        Some(CompactorType::LLM(cfg)) => {
            Ok(Some(LLMCompactor::new(client.clone(), cfg.clone())?))
        }
        _ => Ok(None),
    }
}

Session Lookup

Sessions are retrieved by ID:

impl LLMSessionManager {
    pub async fn get_session_by_id(&self, session_id: i64) -> Option<Arc<LLMSession>> {
        self.sessions.read().await.get(&session_id).cloned()
    }
}

Session Storage

Sessions are stored in a thread-safe map:

pub struct LLMSessionManager {
    sessions: RwLock<HashMap<i64, Arc<LLMSession>>>,
}

  • RwLock: Allows concurrent reads, exclusive writes
  • HashMap<i64, Arc<LLMSession>>: Maps session ID to session
  • Arc<LLMSession>: Enables shared ownership across tasks
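
The insert/lookup pattern can be sketched with std's synchronous `RwLock` (the real manager uses `tokio::sync::RwLock` with async locking, and `Manager`/`Session` here are trimmed stand-ins):

```rust
use std::collections::HashMap;
use std::sync::{Arc, RwLock};

struct Session { id: i64 }

struct Manager {
    // Session ID -> shared session handle, behind a reader/writer lock.
    sessions: RwLock<HashMap<i64, Arc<Session>>>,
}

impl Manager {
    // Write lock: exclusive access while inserting.
    fn insert(&self, session: Arc<Session>) {
        self.sessions.write().unwrap().insert(session.id, session);
    }

    // Read lock: many lookups may proceed concurrently.
    // Cloning the Arc hands out shared ownership without copying the session.
    fn get(&self, id: i64) -> Option<Arc<Session>> {
        self.sessions.read().unwrap().get(&id).cloned()
    }
}

fn main() {
    let mgr = Manager { sessions: RwLock::new(HashMap::new()) };
    mgr.insert(Arc::new(Session { id: 7 }));
    assert!(mgr.get(7).is_some());
    assert!(mgr.get(8).is_none());
    println!("lookup ok");
}
```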

Session Removal

Sessions can be explicitly removed:

impl LLMSessionManager {
    pub async fn remove_session(&self, session_id: i64) -> bool {
        if let Some(session) = self.sessions.write().await.remove(&session_id) {
            session.shutdown().await;
            true
        } else {
            false
        }
    }
}

Multi-Session Support

The manager supports multiple concurrent sessions:

// Create sessions for different providers (agent is an AgentAir instance)
let anthropic_session = agent.create_session(
    LLMSessionConfig::anthropic(&api_key, "claude-sonnet-4-20250514")
)?;

let openai_session = agent.create_session(
    LLMSessionConfig::openai(&api_key, "gpt-4o")
)?;

// Sessions operate independently
// Each has its own conversation history, token tracking, etc.

Post-Creation Configuration

After creation, sessions can be configured:

// Get session reference (get_session_by_id returns an Option)
let session = controller.session_mgr
    .get_session_by_id(session_id)
    .await
    .expect("session not found");

// Set runtime overrides
session.set_max_tokens(8192);
session.set_system_prompt("You are a code reviewer.").await;
session.set_tools(tool_definitions).await;

Error Handling

Creation can fail for several reasons:

pub enum LlmError {
    // Client creation failures
    Http(reqwest::Error),
    Config(String),

    // API key issues
    Api { status: u16, message: String },
}

Common errors:

  • Invalid API key
  • Network configuration issues
  • Unsupported provider
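
Callers typically match on the error variant to produce a user-facing message. The sketch below uses a trimmed `LlmError` (omitting the `Http(reqwest::Error)` variant so it needs no external crate) and a hypothetical `describe` helper:

```rust
// Trimmed replica of LlmError with the variants shown above.
#[derive(Debug)]
enum LlmError {
    Config(String),
    Api { status: u16, message: String },
}

// Map an error to a short user-facing description.
fn describe(err: &LlmError) -> String {
    match err {
        LlmError::Config(msg) => format!("configuration error: {msg}"),
        // HTTP 401 from the provider almost always means a bad API key.
        LlmError::Api { status: 401, .. } => "invalid API key".to_string(),
        LlmError::Api { status, message } => format!("API error {status}: {message}"),
    }
}

fn main() {
    let err = LlmError::Api { status: 401, message: "unauthorized".into() };
    assert_eq!(describe(&err), "invalid API key");
    println!("{}", describe(&err));
}
```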

Next Steps