Project: datalens
81 entity types
Matrix/All Domains


1587 entities found

Entity

Data coverage indicator

Epic (Intent)

Data Discovery feature

The Data Discovery feature includes backend services and UI components such as DiscoveryFlow.svelte, and supports the SVGV dataset (132 files, 473 tables, ~351K rows). It performs semantic matching, search, and consolidation to improve data analysis success rates, and is validated by extensive E2E tests. It exposes /discovery, /tables, and /validate APIs and is a key part of DataLens, validated by the team and by stakeholder Arne Hauge, who is expected to use the system after deployment. The feature uses the Discovery service, integrates with the PostgreSQL database for storing and retrieving data, and depends on the RQ Worker to process extraction jobs asynchronously. The feature was delivered in commit 68764d5.

Entity

data flow problem

A data flow issue is causing queries to stop after thinking messages, likely due to a silent exception in an async generator. Investigation is ongoing using recent logs, with deployment fixes pending.

BusinessProcess (Intent)

data pipelines

DataLens helps build and run data pipelines; the DataLens agent does this as part of its analysis workflows.

DataEntity (Data Model)

Data queryable tables

473 tables created in PostgreSQL for Project 14 after full dataset extraction, totaling ~351,842 rows. Tables are queryable and include Danish field names, supporting analysis and natural language questions.

AcceptanceCriteria (Intent)

Data validation test suite

Capability (Intent)

Data volume

ThirdPartyComponent (Architecture)

Data Wizard

SystemBoundary (Architecture)

data/uploads/

The DS-STAR Intelligence Layer processes files from the 'data/uploads/' directory for autonomous extraction.

BusinessRule (Intent)

DATA_BACKEND feature flag

The DATA_BACKEND feature flag can toggle between using PostgreSQL (PgDataService) and DuckDBService as the data backend.
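As a minimal sketch of how such a flag could select a backend at startup: the class names PgDataService and DuckDBService come from the entry above, but reading the flag from an environment variable, the default value, and the service interface are all assumptions, not the documented implementation.

```python
import os


class PgDataService:
    """Stand-in for the PostgreSQL-backed data service (hypothetical interface)."""

    def query(self, sql: str) -> list:
        return []  # placeholder


class DuckDBService:
    """Stand-in for the DuckDB-backed data service (hypothetical interface)."""

    def query(self, sql: str) -> list:
        return []  # placeholder


def get_data_service():
    # DATA_BACKEND toggles between PostgreSQL and DuckDB;
    # defaulting to PostgreSQL here is an assumption.
    backend = os.environ.get("DATA_BACKEND", "postgres").lower()
    if backend == "duckdb":
        return DuckDBService()
    return PgDataService()
```

Centralizing the toggle in one factory function keeps the rest of the backend unaware of which engine is active.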

SecurityConstraint (Security)

database connection

TechConstraint (Architecture)

Database connectivity for Coolify

Reviewed in a recent deployment; performance constraints were noted.

Threat (Security)

database lock

DataEntity (Data Model)

database schema

Represents the data structure used during data management; no specific details were provided in messages.

ThirdPartyComponent (Architecture)

Database services

Project services cooperate with Database services in the backend. Database services cooperate with the Qdrant service, which is running with its firewall open.

BusinessProcess (Intent)

Database session management

Database session management integrates with API endpoints to provide backend data operations for the agent.
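One common shape for per-request session management is a context manager that commits on success and rolls back on error. This is only an illustrative sketch under that assumption; the Session class below is a stand-in, not the actual DataLens session implementation.

```python
from contextlib import contextmanager


class Session:
    """Hypothetical database session with commit/rollback/close semantics."""

    def __init__(self):
        self.committed = False
        self.rolled_back = False
        self.closed = False

    def commit(self):
        self.committed = True

    def rollback(self):
        self.rolled_back = True

    def close(self):
        self.closed = True


@contextmanager
def get_session():
    # One session per API request: commit if the handler succeeds,
    # roll back if it raises, and always close the session.
    session = Session()
    try:
        yield session
        session.commit()
    except Exception:
        session.rollback()
        raise
    finally:
        session.close()
```

An API endpoint would wrap its data operations in `with get_session() as session:` so that transaction boundaries match request boundaries.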

IntegrationEndpoint (Integrations)

DATABASE_URL environment variable

Contains the PostgreSQL URL for authentication and project tracking. The Backend API requires the database connection string to be configured in the DATABASE_URL environment variable.
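A fail-fast check at startup is one simple way to enforce this requirement. The sketch below is an assumption about how the variable might be validated, not the backend's actual code; the accepted URL schemes are guesses based on the entry saying it holds a PostgreSQL URL.

```python
import os
from urllib.parse import urlparse


def load_database_url() -> str:
    # Fail fast if the backend is started without its connection string.
    url = os.environ.get("DATABASE_URL")
    if not url:
        raise RuntimeError("DATABASE_URL environment variable is not set")
    parsed = urlparse(url)
    if parsed.scheme not in ("postgres", "postgresql"):
        raise RuntimeError(f"Expected a PostgreSQL URL, got scheme {parsed.scheme!r}")
    return url
```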

BusinessProcess (Intent)

DataFusion Mechanism

Proposed process for combining data from multiple tables into consolidated views, enhancing query relevance.

IntegrationEndpoint (Integrations)

DataLens

DataLens uses PostgreSQL, DuckDB, Ollama (qwen3-coder-next model), and Qdrant for semantic search, supporting hybrid search methods. It handles cross-file queries with standard schemas, and is scalable, multi-tenant, privacy-focused, and primarily local, with document extraction and autonomous AI cataloging via DS-STAR and a Text-to-SQL endpoint, integrated with file analysis agents. DataLens implements the batch upload pipeline and smart processing as part of its data handling capabilities, and provides the unified ask interface for routing natural language questions over structured and semantic data.
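The unified ask interface implies some routing decision between the Text-to-SQL path and semantic search. The entry does not document that logic, so the sketch below is purely illustrative: the keyword heuristic and the route names are invented stand-ins, not DataLens behavior.

```python
def route_question(question: str) -> str:
    """Illustrative router: send aggregate-style questions to a Text-to-SQL
    path and everything else to semantic (vector) search. The marker list
    and route labels are assumptions for demonstration only."""
    sql_markers = ("how many", "sum", "average", "total", "count")
    q = question.lower()
    if any(marker in q for marker in sql_markers):
        return "text-to-sql"
    return "semantic-search"
```

A production router would more plausibly use an LLM or schema-aware classifier, but the single-dispatch-point shape is the same.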

Capability (Intent)

DataLens Agent Mode

DataLens Agent Mode uses IronClaw as the AI framework for autonomous data analysis, running as a sidecar Docker service, enabling persistent sessions with real-time communication via WebSocket, and designed for GDPR compliance by choosing secure, Rust-based IronClaw over OpenClaw.

ExternalSystem (Integrations)

DataLens backend

The Discovery API router, the IronClaw Agent backend endpoints, and Coolify auto-deploy (which deploys code changes automatically) all integrate with the DataLens backend. The backend depends on SQL migration files to create and update the necessary database tables, including those for agent features. The platform backend uses the DS-STAR pipeline, Text-to-SQL capabilities, and Document RAG functionality; it contains 11 API endpoints, includes the schema.sql database schema and an Agent config, and uses a multi-org data model. It integrates the brainstorming, shadcn-ui, ui-audit, test-engineering, and docker-essentials skills, plus the coolify skill for deployment. INFRASTRUCTURE.md reserves port 8002 for the DataLens backend.

Epic (Intent)

DataLens Fresh Test Project 8

Test project with 8 files uploaded, extracted, and validated with functional core processes; summaries pending.

Epic (Intent)

DataLens Master Implementation Plan

The plan outlines Epics for DS-STAR intelligence, Text-to-SQL, Document RAG, and DataLens Skill, detailing components like extraction, GPU infrastructure, and AI models such as Qwen2.5-Coder-14B-AWQ, with integration of Qdrant, DuckDB, and LlamaIndex, for autonomous data analysis and system orchestration. The DataLens DS-STAR Implementation Plan includes the Planner component for creating extraction plans. The plan includes Extractor Agents to extract and structure various data types. The plan includes a Verifier agent to check data quality after extraction. Router agent manages fixes and extensions to the extraction plan. The implementation uses DuckDB as a unified database for storing extracted data. After SQL execution, results are visualized as part of the pipeline. RAGAgent optionally searches unstructured text in the data analysis process. The plan uses vLLM for large language model inference on elin GPU. The implementation requires Python environment setup with all dependencies.

Epic (Intent)

DataLens Phase 2 GPU-First Implementation

Epic (Intent)

DataLens Session

Holistic validation and implementation of DataLens features including data discovery, language support, schema consolidation, and production deployment with full system testing across multiple phases. DataLens Session 2026-03-10 contains detailed status and progress about the IronClaw Agent Feature.

ThirdPartyComponent (Architecture)

DataLens Session 2026-03-03

Post-launch feature discussion on Danish language support for UI and summaries, including backend prompts and frontend i18n framework implementation.

Epic (Intent)

DataLens Skill

The DataLens Skill epic encompasses a Telegram bot for user interactions, using the OpenClaw framework for managing conversation state, uploading files, and executing skills. It integrates with backend services and OpenClaw memory to provide a conversational data-querying environment. The DataLens Orchestrator on theo handles user requests and coordinates elin agents; it uses the DataLens Skill running on theo to handle Telegram user interactions, with Telegram UI as the front-end communication channel for users. The DataLens Master Implementation Plan includes a DataLens Skill development phase: creating an OpenClaw skill following best practices, building a Telegram bot for user interaction, developing a Telegram upload handler that interfaces with elin services, a query handler for natural language questions, and conversation memory to maintain context in dialogs.

Stakeholder (Intent)

DataLens Subagent

Epic (Intent)

DataLens SVGV Budget analysis system

Page (User Interface)

Datalens UI