All Domains
1587 entities found
Agent Memory Store
Agent Memory Store is implemented via PostgreSQL tables which store cross-session memory for the agent.
Agent migration file 003_agent_tables.sql
The 003_agent_tables.sql migration (backend/migrations/003_agent_tables.sql) creates the agent session database tables required for the IronClaw agent. Jesper (DevOps) is responsible for executing it to enable the IronClaw agent database functionality; the SQL migration execution script applies it to the PostgreSQL database on theo, and the migration has been applied.
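Applying such a migration amounts to executing the DDL file against the target database. A minimal sketch, using in-memory SQLite as a stand-in for the PostgreSQL connection on theo (the table definition here is illustrative, not the real schema):

```python
import sqlite3   # in-memory stand-in; production uses PostgreSQL on theo

# Illustrative DDL only; the real file defines the full agent schema.
DDL = """
CREATE TABLE IF NOT EXISTS agent_sessions (
    id INTEGER PRIMARY KEY,
    project_id INTEGER NOT NULL
);
"""

def apply_migration(conn, sql: str) -> None:
    """Execute a migration file's statements and commit."""
    conn.executescript(sql)   # a psycopg-based script would use cursor.execute()
    conn.commit()

conn = sqlite3.connect(":memory:")
apply_migration(conn, DDL)
tables = [row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")]
```

A real execution script would read the SQL from backend/migrations/003_agent_tables.sql and connect with PostgreSQL credentials instead.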
Agent Mode chat UI
The ChatPane component is part of the frontend agent interface for IronClaw-powered Agent Mode.
Agent Mode header
Agent orchestration system
The IronClaw agent feature includes the Agent orchestration system as a core capability.
Agent page
Agent selection UI
Users can pick a data analysis agent or let one be selected automatically, see which agent was used, and view confidence scores, supporting a transparent agent strategy in DataLens.
Agent Selector
DataLens provides a user-facing Agent Selector dropdown in the frontend that lets users pick an AI agent for query answering or auto-route among them. The selector directs queries to the Budget Analyzer, Policy Researcher, and Efficiency Analyzer agents, or to Direct SQL for raw structured queries.
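The pick-or-auto-route behaviour can be pictured as a simple dispatch over the four agents. This is a hypothetical sketch; the real routing rules in DataLens are not documented here, so the keyword matching below is invented for illustration:

```python
from typing import Optional

# The four options exposed by the Agent Selector dropdown.
AGENTS = ["Budget Analyzer", "Policy Researcher",
          "Efficiency Analyzer", "Direct SQL"]

def route(query: str, selected: Optional[str] = None) -> str:
    """Honour an explicit user selection; otherwise auto-route.
    The keyword rules below are illustrative only."""
    if selected in AGENTS:
        return selected
    q = query.lower()
    if "budget" in q:
        return "Budget Analyzer"
    if "policy" in q:
        return "Policy Researcher"
    if "efficien" in q:
        return "Efficiency Analyzer"
    return "Direct SQL"        # fall back to raw structured queries
```

An explicit selection always wins, which matches the UI contract of letting the user override auto-routing.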
Agent session SQLAlchemy error
Fixed in commit 39e9d50 by converting session instances to scalars before response serialization; this resolved the SQLAlchemy detached instance error and the 'relation does not exist' errors during agent session startup, enabling reliable agent session creation.
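The fix pattern (converting query results to ORM instances via .scalars() before serialization) can be sketched as follows, with a simplified stand-in model and in-memory SQLite; the real models and columns may differ:

```python
from sqlalchemy import Column, Integer, String, create_engine, select
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class AgentSession(Base):          # simplified stand-in for the real model
    __tablename__ = "agent_sessions"
    id = Column(Integer, primary_key=True)
    project_id = Column(Integer)
    title = Column(String)

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)

with Session(engine) as db:
    db.add(AgentSession(id=1, project_id=14, title="demo"))
    db.commit()
    # db.execute(select(...)) yields Row tuples; serializing those, or
    # instances already detached from the session, fails. Calling
    # .scalars() first gives ORM instances that serialize cleanly
    # while still attached.
    sessions = db.execute(select(AgentSession)).scalars().all()
    payload = [{"id": s.id, "project_id": s.project_id} for s in sessions]
```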
Agent Skills Integration
Agent Skills Integration modifies the _run_query method to generate findings from query results and stream them as insights, rendering each streamed finding as a separate IronClawMessage. The process_message() method in agent_skills.py calls _run_query() asynchronously to execute queries and yields IronClawMessage instances for communicating agent responses and errors. Opus analysis investigates the behavior of process_message() to diagnose async exception issues, and the Logging framework instruments it to capture errors and execution flow.
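The streaming shape described above, an async generator that yields one IronClawMessage per finding and surfaces errors as messages rather than silent async exceptions, can be sketched like this (class and function bodies are illustrative stand-ins):

```python
import asyncio
from dataclasses import dataclass

@dataclass
class IronClawMessage:             # simplified stand-in for the real class
    type: str
    content: str

async def _run_query(text: str):
    """Stand-in for the real query execution; returns fake findings."""
    await asyncio.sleep(0)
    return [{"metric": "spend", "value": 42}]

async def process_message(text: str):
    """Yield one IronClawMessage per streamed finding; errors are
    surfaced as messages instead of killing the generator silently."""
    try:
        for finding in await _run_query(text):
            yield IronClawMessage("finding", str(finding))
    except Exception as exc:
        yield IronClawMessage("error", repr(exc))

async def main():
    return [m async for m in process_message("total spend?")]

messages = asyncio.run(main())
```

Wrapping the body in try/except is what makes async exception diagnosis tractable: failures become observable messages in the stream.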
agent.py event_generator
The agent.py event_generator() function invokes get_ironclaw_client() to obtain the appropriate agent client for message processing. StreamingResponse wraps the agent.py event_generator() function to stream data asynchronously to the client. The POST /api/v1/agent/sessions/{id}/message endpoint handles requests by executing the agent.py event_generator() function.
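A minimal sketch of the event-generator pattern, using plain asyncio and SSE-formatted strings; in the real endpoint the generator is wrapped in a StreamingResponse and the agent client comes from get_ironclaw_client(), both stubbed out here:

```python
import asyncio
import json

async def event_generator(session_id: int, text: str):
    """Yield server-sent events for one agent reply. The real function
    obtains its client via get_ironclaw_client(), stubbed here."""
    chunks = ["Analyzing", " budget data"]   # stand-in for agent output
    for chunk in chunks:
        yield f"data: {json.dumps({'delta': chunk})}\n\n"
    yield "data: [DONE]\n\n"

# In agent.py the POST /api/v1/agent/sessions/{id}/message handler wraps
# this generator roughly as:
#   StreamingResponse(event_generator(id, text),
#                     media_type="text/event-stream")

async def collect(session_id: int, text: str):
    return [event async for event in event_generator(session_id, text)]

events = asyncio.run(collect(1, "total spend?"))
```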
agent.py send_message endpoint
agent_findings
The `agent_findings` table persistently stores findings generated in agent analysis. Findings are linked to sessions via the session_id column and to projects via the project_id column; the AgentSession data entity contains AgentFinding, AgentFinding uses the AgentSkillLog data entity to log skill-related events, and the Insight physical table is derived from AgentFinding. Logging is necessary when an AgentFinding is created to verify that findings are saved and streamed to the frontend properly. Session 56 (3 messages: text, thinking, thinking) has no entries in agent_findings for project 14, indicating that query execution paths stop before findings are created.
agent_messages
The `agent_messages` table holds conversation history for cross-session memory. Each message is linked to a particular session via the session_id column, and the AgentSession data entity includes AgentMessage as a related component.
agent_sessions
The `agent_sessions` table stores records of agent analysis sessions, linked to projects by `project_id`; each session is associated with GDPR flags, messages, findings, and skill logs. The table is created by the `003_agent_tables.sql` migration and is a physical PostgreSQL table with `id` as primary key.
agent_skill_log
The `agent_skill_log` table stores skill execution logs associated with records in `agent_sessions`. The AgentSession data entity incorporates AgentSkillLog in its information model, and AgentFinding uses AgentSkillLog to log skill-related events.
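The relationships among agent_sessions, agent_messages, agent_findings, and agent_skill_log can be sketched as SQLAlchemy models. This is a hypothetical reconstruction; the actual column lists in 003_agent_tables.sql may differ:

```python
from sqlalchemy import Column, ForeignKey, Integer, Text, create_engine
from sqlalchemy.orm import Session, declarative_base, relationship

Base = declarative_base()

class AgentSession(Base):
    __tablename__ = "agent_sessions"
    id = Column(Integer, primary_key=True)
    project_id = Column(Integer, nullable=False)  # session belongs to a project
    messages = relationship("AgentMessage", backref="session")
    findings = relationship("AgentFinding", backref="session")
    skill_logs = relationship("AgentSkillLog", backref="session")

class AgentMessage(Base):
    __tablename__ = "agent_messages"
    id = Column(Integer, primary_key=True)
    session_id = Column(Integer, ForeignKey("agent_sessions.id"))
    role = Column(Text)
    content = Column(Text)

class AgentFinding(Base):
    __tablename__ = "agent_findings"
    id = Column(Integer, primary_key=True)
    session_id = Column(Integer, ForeignKey("agent_sessions.id"))
    project_id = Column(Integer)        # findings also relate to a project
    summary = Column(Text)

class AgentSkillLog(Base):
    __tablename__ = "agent_skill_log"
    id = Column(Integer, primary_key=True)
    session_id = Column(Integer, ForeignKey("agent_sessions.id"))
    detail = Column(Text)

engine = create_engine("sqlite://")     # stand-in for PostgreSQL
Base.metadata.create_all(engine)

with Session(engine) as db:
    s = AgentSession(project_id=14)
    s.messages.append(AgentMessage(role="user", content="hello"))
    db.add(s)
    db.commit()
    # Mirrors the Session 56 observation: messages exist, findings do not.
    finding_count = db.query(AgentFinding).filter_by(project_id=14).count()
    message_count = db.query(AgentMessage).count()
```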
agents directory
Contains extraction agents, such as the CSV, Excel, and PDF extractors, each responsible for handling its respective data format as per the plan.
AGENTS.md
Agent configuration includes the AGENTS.md file.
agentStore.findings
The agentStore.findings data is connected to the FindingsPanelNew component for frontend rendering.
AgentWarmingService
AgentWarmingService assembles warm context that is used by SkillExecutor in agent sessions.
AI
The Architecture includes all Python and AI components running on elin.
AI analysis quality
AI Cataloging
Batch upload is part of the enhanced AI cataloging process in DataLens. The FastAPI backend implements AI cataloging, automatically discovering table structures on file upload via integration with the DS-STAR FileAnalyzer, which runs automatically when a file is uploaded.
AI Core
The AI Core capability requires the DS-STAR autonomous extraction, Text-to-SQL with Ollama, and Document RAG capabilities, which the DataLens Platform uses for autonomous extraction, text-to-SQL, and document retrieval-augmented generation. DS-STAR autonomous extraction comprises the PlannerAgent, VerifierAgent, RouterAgent, and Orchestrator components; Document RAG is built on nomic-embed-text and Qdrant for vector search.
AI Generated Goals
AI Generated Goals are derived using the Hybrid (manual + AI-Assist) Approach combining user brief input with AI expansion via Claude.
AI summaries generation
Verifiable requirement with successful Phase 2 implementation; summaries are generated asynchronously.
AI summary background queue
Existing requirement to generate AI summaries asynchronously; now operational with background tasks.
AI Summary Generation
AI Summary Generation runs after file extraction completes in the cataloging workflow, triggered by the Extraction coordinator. Summaries are generated asynchronously via the RQ job queue, so file-list HTTP responses never block, and are stored in the ai_summary column of the FileUpload record. The Files API exposes the endpoints and data fields needed to store and retrieve AI summaries. Direct communicator Jesper approved progressing with AI Summary Generation and the vectorize progress-tracking fixes in the same session for accuracy.
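The enqueue-after-extraction pattern can be sketched as follows; generate_summary and enqueue_summary are hypothetical names, and a FakeQueue stands in for the RQ Queue so the sketch runs without Redis:

```python
def generate_summary(file_id: int) -> str:
    """Background job: summarize an extracted file and persist the
    result in FileUpload.ai_summary (DB write omitted here)."""
    return f"Summary for file {file_id}"

def enqueue_summary(queue, file_id: int):
    """Called after extraction completes, so the HTTP response that
    lists files never blocks on LLM latency."""
    return queue.enqueue(generate_summary, file_id)

# In production this would be an RQ Queue backed by Redis, e.g.:
#   from redis import Redis
#   from rq import Queue
#   enqueue_summary(Queue(connection=Redis()), upload.id)

class FakeQueue:
    """In-process stand-in so the sketch runs without Redis."""
    def enqueue(self, fn, *args):
        return fn(*args)

result = enqueue_summary(FakeQueue(), 7)
```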
AI Summary Generation Fix
Phase 2 requirement: moved AI summary generation to async background queue, resolving performance and timeout issues.
AI-assisted goal generation
Claude is used to power the AI-assisted project goal generation feature. The AI-assisted goal generation capability is realized through the generate-goal API endpoint.