All Domains
1587 entities found
memory/2026-03-02.md
Memory document recording the migration from DuckDB to PostgreSQL on 2026-03-02, covering code changes, data migration, and verification.
memory/2026-03-10.md
Memory: DataLens SVGV Analysis Timeout Issue
Analysis requests time out due to a missing ANTHROPIC_API_KEY in the Coolify environment, preventing Claude responses. Solution: add the API key environment variable, then redeploy.
Message streaming format
FindingsGenerator emits its generated findings via the message streaming format, which the frontend uses to render agent messages and findings.
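The wire format itself is not documented here; a minimal sketch of a newline-delimited JSON stream (the event types and field names below are illustrative assumptions, not the actual DataLens format) might look like:

```python
import json

def stream_events(findings):
    """Yield newline-delimited JSON events for the frontend to render.

    Event types ("message", "finding") and field names are assumptions
    for illustration, not the documented DataLens streaming format.
    """
    yield json.dumps({"type": "message", "text": "Analysis complete."}) + "\n"
    for f in findings:
        yield json.dumps({"type": "finding", "title": f["title"], "detail": f["detail"]}) + "\n"

events = list(stream_events([{"title": "Revenue up", "detail": "+12% YoY"}]))
```

Newline-delimited JSON keeps each event independently parseable, which suits incremental rendering while the agent is still working.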
metadata queries
Metric definitions
MetricCard.svelte
The Frontend Components include MetricCard.svelte, a UI component for KPI display with statistics; it is part of the Full Findings Visualization Layer.
Metrics
migrate_duckdb_to_postgresql.py
mistralai
Pydantic-ai-slim integrates with mistralai for AI model support.
Mobile responsiveness
mode-watcher
Third-party npm dependency: mode-watcher@^1.1.0, used in the frontend package.json; no license info available.
Monitoring
Monitoring Cluster
Monitoring Cluster requires the Consolidation Mechanism to unify monitoring-related datasets for accurate analysis.
MSG extractor
The batch upload pipeline depends on new extractors, including the MSG extractor, which wraps the third-party extract-msg component and processes message files via background workers.
MSG files
Phase 2 file types include MSG files. The Phase 2 Strategy Research & Decision Point treats processing the 2 MSG files as a low-priority Phase 2 task; Opus 4.6 recommends processing both, as they provide governance signals.
multi-sheet workbooks
Excel files with multiple sheets, handled through normalization during extraction.
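As a hedged sketch of what this normalization could look like (the sheet dict below stands in for a parsed workbook; the real extractor's data structures are not shown in this catalog):

```python
def normalize_workbook(sheets):
    """Flatten a {sheet_name: list-of-row-dicts} workbook into one row list,
    tagging each row with its source sheet. Illustrative stand-in for the
    real extraction step, which would first parse the Excel file."""
    rows = []
    for name, sheet_rows in sheets.items():
        for row in sheet_rows:
            rows.append({"sheet": name, **row})
    return rows

workbook = {
    "Budget": [{"account": "4100", "amount": 1200}],
    "Actuals": [{"account": "4100", "amount": 1150}],
}
flat = normalize_workbook(workbook)
```

Tagging each row with its sheet of origin preserves provenance once the sheets are merged into a single table.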
Multi-Stage Text-to-SQL Architecture
The architecture routes simple queries to Arctic-Text2SQL-R1-7B and complex queries to Qwen3, which handles schema selection, join discovery, and answer synthesis. A query-complexity classifier drives the routing, and optimized stages such as schema compression, reranking, and error recovery allow scaling to 490+ tables. The Arctic model serves as the specialized SQL generation stage and has a limited context window.
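A minimal routing sketch (the keyword heuristic below is a placeholder assumption; the real classifier presumably does more than keyword matching):

```python
ARCTIC = "Arctic-Text2SQL-R1-7B"  # specialized SQL generator, limited context window
QWEN3 = "Qwen3"                   # schema selection, join discovery, answer synthesis

def route_query(question: str, candidate_tables: int) -> str:
    """Pick a model by a toy complexity heuristic (keywords + table count).
    Both the markers and the threshold are illustrative assumptions."""
    complex_markers = ("join", "trend", "compare", "per capita", "why")
    is_complex = candidate_tables > 3 or any(m in question.lower() for m in complex_markers)
    return QWEN3 if is_complex else ARCTIC

model = route_query("total budget for 2026", candidate_tables=1)
```

The point of the split is cost: the small specialized model handles the common easy case, and the larger model is reserved for queries that need multi-table reasoning.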
Multi-step extraction plan
Planner produces multi-step extraction plans for data processing.
Multi-Tenant Architecture
The DataLens Platform implements Multi-Tenant Architecture capability.
Multi-tenant auth
The Backend integrates multi-tenant authentication: the DataLens Platform's auth system supports registration, login, and session management, and uses opaque Bearer tokens rather than JWTs for simpler session management.
multi-tenant data model
The DataLens backend implements a multi-tenant (multi-org) data model for organization isolation.
Multi-tenant isolation
Provides organization-level data isolation supporting multiple tenants, essential for scalable deployment.
Multi-tenant PostgreSQL schema
The DataLens Platform uses a multi-tenant PostgreSQL schema for organizations, users, projects, files, and catalog data, supporting both the data model and authentication. Each project's extracted data gets its own PostgreSQL schema (a distinct namespace); Project 14 will transition to its own schema for extracted data storage. The backend follows a multi-org data model.
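Per-project schemas as namespaces can be sketched as follows (the schema naming convention, table, and columns are illustrative assumptions, not the actual DataLens DDL):

```python
def project_schema_ddl(project_id: int) -> list[str]:
    """Generate illustrative DDL giving one project its own PostgreSQL
    schema (namespace) for extracted data. All names here are assumptions."""
    schema = f"project_{project_id}"
    return [
        f"CREATE SCHEMA IF NOT EXISTS {schema};",
        f"CREATE TABLE IF NOT EXISTS {schema}.extracted_rows ("
        "id BIGSERIAL PRIMARY KEY, source_file TEXT, payload JSONB);",
    ]

statements = project_schema_ddl(14)
```

Keeping each project in its own schema means table names never collide across projects, and dropping a project is a single `DROP SCHEMA ... CASCADE`.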
Multi-tenant SaaS
The DataLens platform backend uses a multi-org data model.
Multilingual Support (Danish)
The Multilingual Support (Danish) epic includes the User Language Preference capability plus the LLM Prompt Injection and Frontend UI Translation use cases. It modifies Backend files to implement language preference and LLM prompt injection, with language selection exposed through the API auth endpoint. Danish Language Support lets the Admin User interact with the system in Danish; the backend service query_enhancer.py uses Danish keywords and entity recognition to extract columns and values from Danish questions.
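LLM prompt injection of a language preference can be sketched roughly as follows (the directive wording and function name are assumptions; the real implementation lives in the Backend files this epic modifies):

```python
def build_system_prompt(base_prompt: str, language: str) -> str:
    """Prepend a language directive when the user's stored preference is
    Danish ("da"). The directive text is an illustrative assumption."""
    if language == "da":
        return "Svar altid på dansk.\n\n" + base_prompt
    return base_prompt

prompt = build_system_prompt("You are a budget analyst.", "da")
```

Injecting the preference into the system prompt keeps the per-user setting out of the model weights and out of every individual user message.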
Municipal Finance Data Research
The Real-World Validation System uses datasets sourced from Municipal Finance Data Research for testing.
MVP recommendation
PHASE2_SCOPE_DECISION.md recommends starting with the MVP option (5-7 days) and expanding to the Ambitious option if policy questions arise. The SVGV Budget Analysis Phase 2 epic requires the Phase 2 MVP to answer 33 of 35 questions. The MVP covers batch upload of all 150 SVGV budget files, partial extraction of 44 files (Excel and PDF) into DuckDB, and an analysis report documented in ANALYTICAL_RESULTS.md. Opus 4.6 recommends the MVP approach as a minimum viable product completing 25 of 35 analytical questions in 5-7 days with selective ingestion.
MVP vs Ambitious comparison
PHASE2_SCOPE_DECISION.md contains a comparison between the MVP and Ambitious options; expanding to the Ambitious plan adds support for Policy Questions.