Message Flow
This page describes how messages flow through the agent-air system, from user input through the LLM and back to the display. Understanding this flow is essential for debugging, extending, and customizing agent behavior.
Overview
Messages flow through four layers:
- TUI Layer - User input and display
- Agent Layer - Routing and conversion
- Controller Layer - Session management and orchestration
- Client Layer - HTTP communication with LLM providers
Each layer communicates with adjacent layers through typed channels, maintaining clear boundaries and enabling asynchronous operation.
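As a rough sketch of this wiring, the example below models one layer boundary in each direction as a typed channel. The payload types are deliberately stripped down, and std::sync::mpsc stands in for the bounded tokio channels the real system uses:

```rust
use std::sync::mpsc;

// Hypothetical simplified payload types for illustration; the real ones
// carry more fields (turn ids, control commands, and so on).
#[derive(Debug)]
pub struct ControllerInputPayload {
    pub session_id: i64,
    pub content: String,
}

#[derive(Debug, PartialEq)]
pub enum UiMessage {
    TextChunk(String),
}

fn main() {
    // One typed channel per layer boundary. std::sync::mpsc stands in
    // here for the bounded tokio::sync::mpsc channels used in practice.
    let (to_controller_tx, to_controller_rx) = mpsc::channel::<ControllerInputPayload>();
    let (from_controller_tx, from_controller_rx) = mpsc::channel::<UiMessage>();

    // TUI layer -> Controller layer.
    to_controller_tx
        .send(ControllerInputPayload { session_id: 1, content: "hello".into() })
        .unwrap();

    // Controller layer -> TUI layer.
    let input = to_controller_rx.recv().unwrap();
    from_controller_tx
        .send(UiMessage::TextChunk(format!("echo: {}", input.content)))
        .unwrap();

    assert_eq!(
        from_controller_rx.recv().unwrap(),
        UiMessage::TextChunk("echo: hello".to_string())
    );
}
```

Because every channel carries exactly one payload type, a layer can only ever receive messages it knows how to handle, which is what keeps the boundaries clear.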
User Input to LLM Response
The primary message flow handles user input and LLM responses:
┌─────────────────────────────────────────────────────────────────────────────┐
│ 1. User types message in TextInput, presses Enter │
└─────────────────────────────────────────┬───────────────────────────────────┘
│
▼
┌─────────────────────────────────────────────────────────────────────────────┐
│ 2. App creates ControllerInputPayload │
│ - input_type: InputType::Data │
│ - session_id: current session │
│ - content: user message text │
│ - turn_id: TurnId::new_user_turn(n) (e.g., u1, u2, u3) │
└─────────────────────────────────────────┬───────────────────────────────────┘
│ to_controller_tx.send()
▼
┌─────────────────────────────────────────────────────────────────────────────┐
│ 3. InputRouter receives payload, forwards to LLMController │
│ - router.run() loops on to_controller_rx │
│ - calls controller.send_input(payload) │
└─────────────────────────────────────────┬───────────────────────────────────┘
│ input_tx.send()
▼
┌─────────────────────────────────────────────────────────────────────────────┐
│ 4. LLMController receives in select! loop from input_rx │
│ - handle_input() dispatches based on input_type │
│ - handle_data_input() for user messages │
│ - Gets session from session_manager │
└─────────────────────────────────────────┬───────────────────────────────────┘
│ session.send(ToLLMPayload)
▼
┌─────────────────────────────────────────────────────────────────────────────┐
│ 5. LLMSession queues message for processing │
│ - Creates Message with role: User, content: text │
│ - Adds to conversation history │
│ - Calls LLMClient.send_message_stream() │
└─────────────────────────────────────────┬───────────────────────────────────┘
│ HTTP POST (streaming)
▼
┌─────────────────────────────────────────────────────────────────────────────┐
│ 6. LLMClient sends request to provider API (Anthropic/OpenAI) │
│ - Includes conversation history, system prompt, tools │
│ - Returns Stream<StreamEvent> │
└─────────────────────────────────────────┬───────────────────────────────────┘
│ Streamed response chunks
▼
┌─────────────────────────────────────────────────────────────────────────────┐
│ 7. LLMSession processes stream, emits FromLLMPayload events │
│ - FromLLMPayload::StreamStart when response begins │
│ - FromLLMPayload::TextChunk for each text delta │
│ - FromLLMPayload::Complete when done │
└─────────────────────────────────────────┬───────────────────────────────────┘
│ from_llm_tx.send()
▼
┌─────────────────────────────────────────────────────────────────────────────┐
│ 8. LLMController receives from from_llm_rx in select! loop │
│ - handle_llm_response() processes each payload │
│ - Emits ControllerEvent via event_func callback │
└─────────────────────────────────────────┬───────────────────────────────────┘
│ event_func(ControllerEvent)
▼
┌─────────────────────────────────────────────────────────────────────────────┐
│ 9. Event handler converts ControllerEvent to UiMessage │
│ - convert_controller_event_to_ui_message() │
│ - ControllerEvent::TextChunk -> UiMessage::TextChunk │
└─────────────────────────────────────────┬───────────────────────────────────┘
│ from_controller_tx.try_send()
▼
┌─────────────────────────────────────────────────────────────────────────────┐
│ 10. App receives UiMessage in event loop │
│ - handle_controller_message() processes message │
│ - Updates ChatView with streamed text │
│ - Triggers re-render │
└─────────────────────────────────────────────────────────────────────────────┘
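Step 9's conversion is essentially a total match from one enum onto the other. The sketch below assumes two simplified variants on each side; the variant names on the UiMessage side are illustrative, and the real enums carry many more cases:

```rust
// Simplified event/message enums; the real ones carry many more variants.
#[derive(Debug, PartialEq)]
pub enum ControllerEvent {
    TextChunk { session_id: i64, text: String },
    Complete { session_id: i64 },
}

#[derive(Debug, PartialEq)]
pub enum UiMessage {
    TextChunk { session_id: i64, text: String },
    StreamComplete { session_id: i64 },
}

// Each controller event maps onto exactly one UI message.
pub fn convert_controller_event_to_ui_message(ev: ControllerEvent) -> UiMessage {
    match ev {
        ControllerEvent::TextChunk { session_id, text } => UiMessage::TextChunk { session_id, text },
        ControllerEvent::Complete { session_id } => UiMessage::StreamComplete { session_id },
    }
}

fn main() {
    let ev = ControllerEvent::TextChunk { session_id: 1, text: "Hi".into() };
    assert_eq!(
        convert_controller_event_to_ui_message(ev),
        UiMessage::TextChunk { session_id: 1, text: "Hi".into() }
    );
}
```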
Key Types in User Input Flow
ControllerInputPayload - Sent from TUI to controller:
```rust
pub struct ControllerInputPayload {
    pub input_type: InputType,           // Data or Control
    pub session_id: i64,
    pub content: String,                 // User message text
    pub control_cmd: Option<ControlCmd>, // For Control inputs
    pub turn_id: Option<TurnId>,         // e.g., u1, u2
}
```
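A hypothetical convenience constructor for a user-turn payload might look like the following; the InputType, ControlCmd, and TurnId stubs are placeholders for the real definitions:

```rust
// Stub types so the example compiles stand-alone; the real definitions
// of InputType, ControlCmd, and TurnId live in the controller crate.
#[derive(Debug, PartialEq)]
pub enum InputType { Data, Control }
#[derive(Debug, PartialEq)]
pub enum ControlCmd { Interrupt, Clear, Shutdown }
#[derive(Debug, PartialEq)]
pub struct TurnId(pub String);

pub struct ControllerInputPayload {
    pub input_type: InputType,
    pub session_id: i64,
    pub content: String,
    pub control_cmd: Option<ControlCmd>,
    pub turn_id: Option<TurnId>,
}

// Hypothetical helper: build the payload the App sends for a user message.
pub fn user_message(session_id: i64, turn: u32, text: &str) -> ControllerInputPayload {
    ControllerInputPayload {
        input_type: InputType::Data,
        session_id,
        content: text.to_string(),
        control_cmd: None,
        turn_id: Some(TurnId(format!("u{turn}"))),
    }
}

fn main() {
    let payload = user_message(1, 1, "explain the message flow");
    assert_eq!(payload.input_type, InputType::Data);
    assert_eq!(payload.turn_id, Some(TurnId("u1".to_string())));
}
```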
FromLLMPayload - Sent from LLMSession to controller:
```rust
pub enum FromLLMPayload {
    StreamStart { session_id: i64, message_id: String, model: String },
    TextChunk { session_id: i64, text: String },
    ToolUse { session_id: i64, tool: ToolUseInfo },
    ToolBatch { session_id: i64, tools: Vec<ToolUseInfo> },
    Complete { session_id: i64, stop_reason: Option<String> },
    TokenUpdate { session_id: i64, input_tokens: i64, output_tokens: i64 },
    Error { session_id: i64, error: String },
}
```
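Consumers of this stream typically fold the TextChunk deltas into the full assistant message. A minimal sketch, assuming only the three variants it needs:

```rust
// Simplified FromLLMPayload with just the variants used here.
#[derive(Debug)]
pub enum FromLLMPayload {
    StreamStart { session_id: i64, message_id: String, model: String },
    TextChunk { session_id: i64, text: String },
    Complete { session_id: i64, stop_reason: Option<String> },
}

// Fold the streamed deltas into one message, stopping at Complete.
pub fn accumulate_text(payloads: &[FromLLMPayload]) -> String {
    let mut out = String::new();
    for p in payloads {
        match p {
            FromLLMPayload::TextChunk { text, .. } => out.push_str(text),
            FromLLMPayload::Complete { .. } => break,
            _ => {} // StreamStart and friends carry no text
        }
    }
    out
}

fn main() {
    let payloads = vec![
        FromLLMPayload::StreamStart { session_id: 1, message_id: "m1".into(), model: "demo".into() },
        FromLLMPayload::TextChunk { session_id: 1, text: "Hello, ".into() },
        FromLLMPayload::TextChunk { session_id: 1, text: "world".into() },
        FromLLMPayload::Complete { session_id: 1, stop_reason: Some("end_turn".into()) },
    ];
    assert_eq!(accumulate_text(&payloads), "Hello, world");
}
```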
Tool Execution Flow
When the LLM responds with tool use, an additional flow handles tool execution:
┌─────────────────────────────────────────────────────────────────────────────┐
│ 1. LLM response includes tool_use content block │
│ - stop_reason: "tool_use" │
│ - ContentBlock::ToolUse { id, name, input } │
└─────────────────────────────────────────┬───────────────────────────────────┘
│
▼
┌─────────────────────────────────────────────────────────────────────────────┐
│ 2. LLMSession emits FromLLMPayload::ToolBatch │
│ - Contains all tool uses from the response │
│ - Each tool has id, name, and JSON input │
└─────────────────────────────────────────┬───────────────────────────────────┘
│ from_llm_tx.send()
▼
┌─────────────────────────────────────────────────────────────────────────────┐
│ 3. LLMController receives, calls ToolExecutor │
│ - handle_llm_response() extracts tool requests │
│ - executor.execute_batch() spawns parallel tasks │
└─────────────────────────────────────────┬───────────────────────────────────┘
│
┌───────────────────────────┼───────────────────────────┐
│ │ │
▼ ▼ ▼
┌─────────────────────┐ ┌─────────────────────┐ ┌─────────────────────┐
│ 4a. Tool Task 1 │ │ 4b. Tool Task 2 │ │ 4c. Tool Task N │
│ - Lookup in registry│ │ - Lookup in registry│ │ - Lookup in registry│
│ - Create ToolContext│ │ - Create ToolContext│ │ - Create ToolContext│
│ - Call execute() │ │ - Call execute() │ │ - Call execute() │
└──────────┬──────────┘ └──────────┬──────────┘ └──────────┬──────────┘
│ │ │
│ tool_result_tx │ │
└───────────────────────────┼───────────────────────────┘
│
▼
┌─────────────────────────────────────────────────────────────────────────────┐
│ 5. Individual ToolResults sent to tool_result_rx │
│ - LLMController emits ControllerEvent::ToolResult for each │
│ - TUI shows tool progress in real-time │
└─────────────────────────────────────────────────────────────────────────────┘
│
▼
┌─────────────────────────────────────────────────────────────────────────────┐
│ 6. When all tools complete, ToolBatchResult sent to batch_result_rx │
│ - Contains all tool results with success/error status │
│ - LLMController collects results │
└─────────────────────────────────────────┬───────────────────────────────────┘
│
▼
┌─────────────────────────────────────────────────────────────────────────────┐
│ 7. LLMController sends tool results back to LLM │
│ - Creates ToLLMPayload::ToolResult │
│ - Includes all tool outputs as ContentBlock::ToolResult │
│ - LLMSession sends to API for continued conversation │
└─────────────────────────────────────────┬───────────────────────────────────┘
│
▼
┌─────────────────────────────────────────────────────────────────────────────┐
│ 8. LLM processes tool results, may respond or request more tools │
│ - If stop_reason: "end_turn" -> conversation complete │
│ - If stop_reason: "tool_use" -> repeat from step 1 │
└─────────────────────────────────────────────────────────────────────────────┘
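Steps 3-6 can be sketched with plain threads and a result channel standing in for spawned tokio tasks; the execute_batch below is an illustrative shape, not the real executor:

```rust
use std::sync::mpsc;
use std::thread;

#[derive(Debug)]
pub struct ToolResult {
    pub tool_use_id: String,
    pub content: String,
}

// Illustrative batch executor: one thread per tool request, individual
// results streamed through a channel, then collected into the batch.
pub fn execute_batch(requests: Vec<(String, String)>) -> Vec<ToolResult> {
    let (tool_result_tx, tool_result_rx) = mpsc::channel();
    let n = requests.len();
    for (tool_use_id, input) in requests {
        let tx = tool_result_tx.clone();
        thread::spawn(move || {
            // A real task would look the tool up in the registry, build a
            // ToolContext, and call execute(); here we fake the work.
            let result = ToolResult { tool_use_id, content: format!("ran with {input}") };
            tx.send(result).unwrap();
        });
    }
    drop(tool_result_tx); // close our copy so only task senders remain
    // Individual results arrive as each task finishes (step 5); collecting
    // all n of them forms the batch result (step 6).
    (0..n).map(|_| tool_result_rx.recv().unwrap()).collect()
}

fn main() {
    let batch = execute_batch(vec![
        ("toolu_1".to_string(), "input-a".to_string()),
        ("toolu_2".to_string(), "input-b".to_string()),
    ]);
    assert_eq!(batch.len(), 2);
}
```

Streaming individual results before the batch completes is what lets the TUI show per-tool progress while slower tools are still running.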
Tool Execution Context
Each tool receives a ToolContext with execution metadata:
```rust
pub struct ToolContext {
    pub session_id: i64,
    pub tool_use_id: String,
    pub turn_id: Option<TurnId>,
}
```
Tool Result Handling
Tools return ToolResult with execution outcome:
```rust
pub struct ToolResult {
    pub tool_use_id: String,
    pub tool_name: String,
    pub status: ToolResultStatus, // Success, Error, Timeout
    pub content: String,          // Result content
    pub error: Option<String>,    // Error message if failed
}
```
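When results are sent back to the LLM (step 7 of the tool flow), each ToolResult is flattened into a tool_result content block. The helper below is a hypothetical sketch of that mapping, using a plain tuple in place of the real ContentBlock type:

```rust
#[derive(Debug, PartialEq)]
pub enum ToolResultStatus { Success, Error, Timeout }

pub struct ToolResult {
    pub tool_use_id: String,
    pub tool_name: String,
    pub status: ToolResultStatus,
    pub content: String,
    pub error: Option<String>,
}

// Hypothetical mapping to (tool_use_id, content, is_error); the real code
// builds a ContentBlock::ToolResult instead of a tuple.
pub fn to_content_block(r: &ToolResult) -> (String, String, bool) {
    let is_error = r.status != ToolResultStatus::Success;
    // Prefer the error message when the tool failed.
    let content = r.error.clone().unwrap_or_else(|| r.content.clone());
    (r.tool_use_id.clone(), content, is_error)
}

fn main() {
    let ok = ToolResult {
        tool_use_id: "toolu_1".into(),
        tool_name: "read_file".into(),
        status: ToolResultStatus::Success,
        content: "file contents".into(),
        error: None,
    };
    assert_eq!(to_content_block(&ok), ("toolu_1".into(), "file contents".into(), false));
}
```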
Control Command Flow
Control commands (interrupt, shutdown, clear) follow a simplified path:
┌─────────────────────────────────────────────────────────────────────────────┐
│ 1. User triggers control action (e.g., Ctrl+C, /clear command) │
└─────────────────────────────────────────┬───────────────────────────────────┘
│
▼
┌─────────────────────────────────────────────────────────────────────────────┐
│ 2. App creates ControllerInputPayload │
│ - input_type: InputType::Control │
│ - control_cmd: Some(ControlCmd::Interrupt/Clear/Shutdown) │
└─────────────────────────────────────────┬───────────────────────────────────┘
│
▼
┌─────────────────────────────────────────────────────────────────────────────┐
│ 3. LLMController handles control command │
│ - handle_control_input() dispatches by ControlCmd │
│ - Interrupt: cancels current streaming │
│ - Clear: clears conversation history │
│ - Shutdown: initiates graceful shutdown │
└─────────────────────────────────────────┬───────────────────────────────────┘
│
▼
┌─────────────────────────────────────────────────────────────────────────────┐
│ 4. Controller emits ControllerEvent::CommandComplete │
│ - success: bool │
│ - message: Optional status message │
└─────────────────────────────────────────────────────────────────────────────┘
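A minimal sketch of the dispatch in handle_control_input, with the side effects reduced to a history vector and a running flag (the real handler also cancels in-flight streams and coordinates shutdown with the async runtime):

```rust
#[derive(Debug)]
pub enum ControlCmd { Interrupt, Clear, Shutdown }

#[derive(Debug)]
pub struct CommandComplete {
    pub success: bool,
    pub message: Option<String>,
}

// Illustrative dispatch only: each command maps to one state change
// and a CommandComplete event for the TUI.
pub fn handle_control_input(
    cmd: &ControlCmd,
    history: &mut Vec<String>,
    running: &mut bool,
) -> CommandComplete {
    match cmd {
        ControlCmd::Interrupt => {
            CommandComplete { success: true, message: Some("stream interrupted".into()) }
        }
        ControlCmd::Clear => {
            history.clear();
            CommandComplete { success: true, message: Some("history cleared".into()) }
        }
        ControlCmd::Shutdown => {
            *running = false;
            CommandComplete { success: true, message: None }
        }
    }
}

fn main() {
    let mut history = vec!["u1".to_string()];
    let mut running = true;
    let done = handle_control_input(&ControlCmd::Clear, &mut history, &mut running);
    assert!(done.success && history.is_empty());
}
```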
User Interaction Flow
Tools that need user input (AskUserQuestions, AskForPermissions) use a special flow:
┌─────────────────────────────────────────────────────────────────────────────┐
│ 1. Tool calls user_interaction_registry.request_user_input() │
│ - Creates pending question with unique tool_use_id │
│ - Emits ControllerEvent::UserInteractionRequired │
└─────────────────────────────────────────┬───────────────────────────────────┘
│
▼
┌─────────────────────────────────────────────────────────────────────────────┐
│ 2. Event forwarded to TUI via interaction_event_rx │
│ - Converted to UiMessage::UserInteractionRequired │
│ - App shows QuestionPanel or PermissionPanel │
└─────────────────────────────────────────┬───────────────────────────────────┘
│ User responds
▼
┌─────────────────────────────────────────────────────────────────────────────┐
│ 3. App calls registry.submit_response(tool_use_id, response) │
│ - Registry resolves pending request │
│ - Waiting tool task receives response │
└─────────────────────────────────────────┬───────────────────────────────────┘
│
▼
┌─────────────────────────────────────────────────────────────────────────────┐
│ 4. Tool completes execution with user-provided data │
│ - Returns ToolResult with response │
│ - Flow continues with normal tool result handling │
└─────────────────────────────────────────────────────────────────────────────┘
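The registry pattern in steps 1 and 3 amounts to a map of pending response channels keyed by tool_use_id. A stdlib sketch, with std::sync::mpsc standing in for the oneshot channels an async implementation would likely use:

```rust
use std::collections::HashMap;
use std::sync::mpsc;

// Minimal sketch of the pending-question registry: each request parks a
// sender keyed by tool_use_id; submit_response resolves it.
pub struct UserInteractionRegistry {
    pending: HashMap<String, mpsc::Sender<String>>,
}

impl UserInteractionRegistry {
    pub fn new() -> Self {
        Self { pending: HashMap::new() }
    }

    // Called by the tool task; it then blocks on the returned receiver.
    pub fn request_user_input(&mut self, tool_use_id: &str) -> mpsc::Receiver<String> {
        let (tx, rx) = mpsc::channel();
        self.pending.insert(tool_use_id.to_string(), tx);
        rx
    }

    // Called by the TUI once the user answers; false if no request is pending.
    pub fn submit_response(&mut self, tool_use_id: &str, response: String) -> bool {
        match self.pending.remove(tool_use_id) {
            Some(tx) => tx.send(response).is_ok(),
            None => false,
        }
    }
}

fn main() {
    let mut registry = UserInteractionRegistry::new();
    let rx = registry.request_user_input("toolu_42");
    // The TUI answers later, keyed by the same tool_use_id.
    assert!(registry.submit_response("toolu_42", "approved".to_string()));
    assert_eq!(rx.recv().unwrap(), "approved");
}
```

Keying on tool_use_id is what lets several tools wait for answers at once without their responses getting crossed.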
Channel Summary
| Channel | Direction | Carries | Buffer Size |
|---|---|---|---|
| to_controller | TUI -> Controller | ControllerInputPayload | 100 |
| from_controller | Controller -> TUI | UiMessage | 100 |
| from_llm | Session -> Controller | FromLLMPayload | 100 |
| tool_result | Executor -> Controller | ToolResult | 100 |
| batch_result | Executor -> Controller | ToolBatchResult | 100 |
| interaction_event | Registry -> TUI | ControllerEvent | 100 |
| permission_event | Registry -> TUI | ControllerEvent | 100 |
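The bounded buffers matter because the event handler uses try_send toward the TUI: when a buffer fills, try_send fails fast instead of blocking the sender. The closest stdlib analogue is sync_channel, shown here with a tiny capacity so the behavior is visible:

```rust
use std::sync::mpsc;

fn main() {
    // Bounded channel of capacity 2 (the real channels use capacity 100).
    let (tx, _rx) = mpsc::sync_channel::<u32>(2);
    tx.try_send(1).unwrap();
    tx.try_send(2).unwrap();
    // Buffer full: try_send returns an error instead of blocking, so a
    // slow consumer can never stall the sending loop.
    assert!(tx.try_send(3).is_err());
}
```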
Turn ID Tracking
Every message in a conversation has a turn ID that identifies its position:
- User turns: u1, u2, u3, …
- Assistant turns: a1, a2, a3, …
Turn IDs enable:
- Filtering stale responses when interrupting
- Correlating tool results with requests
- Debugging message flow
- Session recovery
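One plausible shape for TurnId, with labels matching the u1/a1 scheme above; the is_stale rule is illustrative only:

```rust
// Sketch of a turn identifier; the real type may differ.
#[derive(Debug, Clone, Copy, PartialEq)]
pub enum TurnId {
    User(u32),
    Assistant(u32),
}

impl TurnId {
    pub fn new_user_turn(n: u32) -> Self { TurnId::User(n) }
    pub fn new_assistant_turn(n: u32) -> Self { TurnId::Assistant(n) }

    // Render the "u1" / "a1" labels used throughout this page.
    pub fn label(&self) -> String {
        match self {
            TurnId::User(n) => format!("u{n}"),
            TurnId::Assistant(n) => format!("a{n}"),
        }
    }

    // Illustrative staleness rule: anything numbered before the latest
    // user turn is discarded after an interrupt.
    pub fn is_stale(&self, latest_user_turn: u32) -> bool {
        match self {
            TurnId::User(n) | TurnId::Assistant(n) => *n < latest_user_turn,
        }
    }
}

fn main() {
    assert_eq!(TurnId::new_user_turn(1).label(), "u1");
    assert_eq!(TurnId::new_assistant_turn(2).label(), "a2");
    assert!(TurnId::Assistant(1).is_stale(2));
}
```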
Next Steps
- Async Runtime - How Tokio manages these concurrent flows
- Controller Events - All events emitted during message flow
- Message Handling - The 6-channel select! pattern
