Architecture
232 entities found
.env environment file
DataLens Platform deployment uses .env file for configuring environment variables including secrets and URLs.
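Loading such a file at startup can be sketched as follows; this is a minimal stdlib-only sketch, and the variable names in the example file are illustrative assumptions, not the actual deployment keys:

```python
import os

def load_env_file(path: str) -> dict:
    """Parse KEY=VALUE lines from a .env file, skipping blanks and # comments."""
    values = {}
    with open(path) as fh:
        for raw in fh:
            line = raw.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # Drop optional surrounding quotes around the value.
            values[key.strip()] = value.strip().strip('"').strip("'")
    return values

# Hypothetical example file; real keys and URLs depend on the deployment.
with open("example.env", "w") as fh:
    fh.write("# DataLens settings\n"
             "DATABASE_URL=postgresql://localhost/datalens\n"
             "SECRET_KEY='s3cret'\n")

env = load_env_file("example.env")
os.environ.update(env)  # make the values visible to the rest of the process
```

In practice a library such as python-dotenv does the same job with more edge-case handling.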
/app/storage/
/home/ops/datalens mount (read-only)
/home/ops/datalens/agents/
/home/ops/datalens/venv/
100% local platform
The DataLens Project is a 100% local platform with zero cloud API costs and full data privacy.
@sveltejs/adapter-auto
The npm dev dependency @sveltejs/adapter-auto depends on svelte in the frontend project.
@sveltejs/adapter-node
The npm dev dependency @sveltejs/adapter-node depends on svelte in the frontend project.
@sveltejs/kit
The npm dev dependency @sveltejs/kit depends on @sveltejs/vite-plugin-svelte in the frontend project.
@sveltejs/vite-plugin-svelte
The npm dev dependency @sveltejs/kit depends on @sveltejs/vite-plugin-svelte in the frontend project.
_classify_via_keywords
Falls back to regex keyword patterns to classify questions.
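A fallback of that shape might look like the sketch below; the categories and patterns are illustrative assumptions, not the actual ones used by `_classify_via_keywords`:

```python
import re

# Illustrative keyword patterns; the real classifier's categories are not documented here.
KEYWORD_PATTERNS = {
    "sql_query": re.compile(r"\b(how many|count|average|total|sum of|top \d+)\b", re.I),
    "schema": re.compile(r"\b(columns?|tables?|schema|data types?)\b", re.I),
}

def classify_via_keywords(question: str) -> str:
    """Return the first category whose pattern matches, else a generic fallback."""
    for category, pattern in KEYWORD_PATTERNS.items():
        if pattern.search(question):
            return category
    return "general"
```

Dictionary order gives the patterns a fixed priority, so more specific categories should be listed first.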
Agent Gateway
The Agent Gateway is a FastAPI module that bridges the frontend and the IronClaw Service. It manages agent sessions and multi-tenant context through metadata and agent session tables in PostgreSQL, and uses Redis for caching and background jobs. It depends on the IronClaw Service, over HTTP and WebSocket, for reasoning loops, skill execution, and memory management. It also integrates with existing services: DuckDB for query execution, the Qdrant vector database for data operations, and the Anthropic API as a cloud LLM backend.
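In outline, the gateway's session bookkeeping could look like the following framework-free sketch; the real module is a FastAPI app backed by PostgreSQL and an HTTP/WebSocket client, and every name below is an assumption for illustration:

```python
import uuid
from dataclasses import dataclass, field

@dataclass
class AgentSession:
    """Multi-tenant session context, persisted in agent session tables in the real system."""
    tenant_id: str
    session_id: str = field(default_factory=lambda: str(uuid.uuid4()))
    history: list = field(default_factory=list)

class AgentGateway:
    """Bridges frontend requests to a downstream reasoning service (IronClaw)."""

    def __init__(self, ironclaw_client):
        self.ironclaw = ironclaw_client       # HTTP/WebSocket client in the real system
        self.sessions = {}                    # PostgreSQL-backed in the real system

    def open_session(self, tenant_id: str) -> AgentSession:
        session = AgentSession(tenant_id=tenant_id)
        self.sessions[session.session_id] = session
        return session

    def execute_skill(self, session_id: str, skill: str, payload: dict):
        session = self.sessions[session_id]   # rejects unknown sessions with KeyError
        result = self.ironclaw(session.tenant_id, skill, payload)
        session.history.append((skill, result))
        return result
```

Keeping tenant_id on every session is what lets the downstream service scope skill execution per tenant.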
agents directory
Contains extraction agents like CSV, Excel, and PDF extractors responsible for handling respective data formats as per the plan.
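Routing a file to the right extraction agent usually keys on the file extension; a hedged sketch (the function names and return values are placeholders, not the actual agents):

```python
from pathlib import Path

# Placeholder agents standing in for the CSV/Excel/PDF extractors under agents/.
def extract_csv(path):
    return f"csv-extract:{path}"

def extract_excel(path):
    return f"excel-extract:{path}"

def extract_pdf(path):
    return f"pdf-extract:{path}"

EXTRACTORS = {
    ".csv": extract_csv,
    ".xlsx": extract_excel,
    ".pdf": extract_pdf,
}

def dispatch(path: str):
    """Route a file to the extraction agent that handles its format."""
    suffix = Path(path).suffix.lower()
    try:
        return EXTRACTORS[suffix](path)
    except KeyError:
        raise ValueError(f"no extraction agent for {suffix!r}")
```

A registry dict like this keeps adding a new format down to one entry plus one agent function.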
aiofiles
aiofiles is used in the File upload feature of the DataLens Platform to handle asynchronous file operations.
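The chunked async-write pattern that aiofiles enables looks roughly like this; the sketch substitutes `asyncio.to_thread` over plain file I/O so it runs without the aiofiles dependency (with aiofiles the loop body would use `async with aiofiles.open(...)` instead), and the function names are assumptions:

```python
import asyncio

async def save_upload(chunks, dest: str) -> int:
    """Write an async stream of byte chunks to dest without blocking the event loop."""
    written = 0
    with open(dest, "wb") as fh:
        async for chunk in chunks:
            # aiofiles would await an async file handle here; a worker thread
            # stands in so the example has no third-party dependency.
            await asyncio.to_thread(fh.write, chunk)
            written += len(chunk)
    return written

async def fake_upload():
    # Simulated incoming upload stream.
    for part in (b"col_a,col_b\n", b"1,2\n", b"3,4\n"):
        yield part

total = asyncio.run(save_upload(fake_upload(), "upload.csv"))
```

The point in either variant is the same: file writes never stall the event loop serving other requests.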
Alembic
Alembic can be used to manage SQL migration files for database schema evolution in DataLens. Alembic depends on SQLAlchemy for database migrations.
Alembic migration tool
Not currently set up; migrations are applied manually via SQL files. Implementing automated migration on startup is recommended to prevent deployment issues, especially for agent-related tables.
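The recommended startup auto-migration could be sketched along these lines, with sqlite3 standing in for PostgreSQL and an invented migration file name; the real system would run this against its metadata database (or adopt Alembic outright):

```python
import sqlite3
from pathlib import Path

def apply_pending_migrations(conn, migrations_dir: str) -> list:
    """Apply .sql files in lexical order, recording each so restarts are no-ops."""
    conn.execute("CREATE TABLE IF NOT EXISTS schema_migrations (name TEXT PRIMARY KEY)")
    applied = {row[0] for row in conn.execute("SELECT name FROM schema_migrations")}
    ran = []
    for sql_file in sorted(Path(migrations_dir).glob("*.sql")):
        if sql_file.name in applied:
            continue
        conn.executescript(sql_file.read_text())
        conn.execute("INSERT INTO schema_migrations (name) VALUES (?)", (sql_file.name,))
        ran.append(sql_file.name)
    conn.commit()
    return ran

# Demonstration with an invented agent-table migration.
Path("migrations").mkdir(exist_ok=True)
Path("migrations/001_agents.sql").write_text(
    "CREATE TABLE IF NOT EXISTS agent_sessions (id TEXT PRIMARY KEY, tenant TEXT);"
)
conn = sqlite3.connect(":memory:")
first = apply_pending_migrations(conn, "migrations")
second = apply_pending_migrations(conn, "migrations")  # idempotent on restart
```

The bookkeeping table is what prevents a redeploy from re-running migrations that already shaped the schema.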
Analytical Framework
Anthropic
Anthropic is a third-party AI service provider, mentioned alongside OpenAI and Google, offering hosted models or API access for AI models in DataLens. pydantic-ai-slim integrates with anthropic for AI model support.
Architecture comparison with Vanna 2.0 and WrenAI
The architecture comparison document benchmarks the design against Vanna 2.0 and WrenAI, validating the architectural decisions made.
Architecture Diagram
Arctic Context Window Limitation
Performance constraint identified in the system, restricting effective query size and requiring schema reduction.
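Schema reduction here usually means selecting only the tables relevant to a question and emitting a compact representation; a hedged sketch in which the token-overlap scoring and the `table(col, ...)` format are assumptions standing in for the real heuristic:

```python
def compress_schema(schema: dict, question: str, max_tables: int = 3) -> str:
    """Keep the tables most relevant to the question and render them compactly.

    schema maps table name -> list of column names; relevance is a naive
    token-overlap score between the question and table/column names.
    """
    tokens = set(question.lower().replace("?", "").split())

    def score(item):
        table, columns = item
        names = {table.lower(), *[c.lower() for c in columns]}
        return len(tokens & names)

    kept = sorted(schema.items(), key=score, reverse=True)[:max_tables]
    # One line per table keeps the prompt inside a small context window.
    return "\n".join(f"{t}({', '.join(cols)})" for t, cols in kept)

schema = {
    "orders": ["id", "customer_id", "total"],
    "customers": ["id", "name", "region"],
    "audit_log": ["id", "event", "ts"],
}
compact = compress_schema(schema, "What is the total per region?", max_tables=2)
```

Dropping irrelevant tables is typically where most of the context savings come from; compressing column type annotations away is a further step.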
Arctic-Text2SQL-R1-7B
A production model deployed on elin that generates SQL queries, replacing SQLCoder-7B. It handles DuckDB schemas, uses compressed schema representations, and integrates via the Ollama API, optimized for efficient and accurate SQL generation over Project 14 data. The model runs on Ollama on elin and is integrated with the backend, where question_router.py routes questions to it and text_to_sql.py uses it to generate SQL.
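Integration via the Ollama API typically means POSTing a prompt to the local `/api/generate` endpoint; the sketch below only assembles the request, since the prompt template and the exact model tag are assumptions, and the HTTP call is left commented out so the snippet runs without a server:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default generate endpoint

def build_sql_request(question: str, compact_schema: str) -> dict:
    """Assemble a text-to-SQL request for the Arctic model served by Ollama."""
    prompt = (
        "Given the DuckDB schema:\n"
        f"{compact_schema}\n\n"
        f"Write a SQL query answering: {question}\nSQL:"
    )
    # Model tag is assumed; check `ollama list` on elin for the actual name.
    return {"model": "arctic-text2sql-r1-7b", "prompt": prompt, "stream": False}

payload = build_sql_request("How many orders per region?", "orders(id, region)")
body = json.dumps(payload).encode()
request = urllib.request.Request(
    OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
)
# response = urllib.request.urlopen(request)  # requires a running Ollama instance
```

Setting `"stream": False` returns the whole completion in one JSON body rather than line-delimited chunks.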
Authentication services
Authentication services cooperate with Project services in the backend.
autonomous extraction
Part of the DS-STAR pipeline for data analysis, involving extraction, planning, verification, and routing agents.
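The agent sequence above can be sketched as a simple staged pipeline; every stage body here is a placeholder, since the real DS-STAR agents do substantially more:

```python
def extraction_agent(raw: str) -> dict:
    """Pull structured records out of raw input (placeholder logic)."""
    return {"records": [r for r in raw.split(",") if r]}

def planning_agent(data: dict) -> dict:
    """Attach an analysis plan for downstream stages (placeholder plan)."""
    data["plan"] = ["load", "aggregate"]
    return data

def verification_agent(data: dict) -> dict:
    """Check that extraction produced something usable."""
    data["verified"] = bool(data["records"])
    return data

def routing_agent(data: dict) -> str:
    """Decide the next destination based on the verification result."""
    return "analysis" if data["verified"] else "retry-extraction"

def run_pipeline(raw: str) -> str:
    data = extraction_agent(raw)
    data = planning_agent(data)
    data = verification_agent(data)
    return routing_agent(data)
```

The routing stage is what makes the pipeline autonomous: a failed verification loops back to extraction instead of surfacing an error.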
autoprefixer
npm dependency: autoprefixer@^10.4.24, used in the frontend; license type unknown, no approval or risk assessment specified.
AWS Bedrock
AWS Bedrock is among the third-party AI providers set for hosting or API access for AI models in DataLens, supporting scalable inference solutions.
Azure
Azure cloud services are noted as part of third-party AI component options available for DataLens model hosting and inference, but specific deployment details are not given.
Backend Storage Volume
backend/app/services/consolidation.py
Contains the consolidation logic for intelligent table merging and the creation of transient views; the analysis pipeline integrates backend/app/services/consolidation.py to handle consolidated views.
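Merging same-shaped tables behind a transient view can be sketched like this, with sqlite3 standing in for DuckDB; the table names and the UNION ALL strategy are illustrative, not the actual consolidation.py logic:

```python
import sqlite3

def create_consolidated_view(conn, view_name: str, tables: list) -> None:
    """Expose several same-shaped tables as one transient (TEMP) view."""
    union = " UNION ALL ".join(f"SELECT * FROM {t}" for t in tables)
    conn.execute(f"CREATE TEMP VIEW {view_name} AS {union}")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales_2023 (region TEXT, amount REAL)")
conn.execute("CREATE TABLE sales_2024 (region TEXT, amount REAL)")
conn.execute("INSERT INTO sales_2023 VALUES ('north', 10.0)")
conn.execute("INSERT INTO sales_2024 VALUES ('south', 20.0)")
create_consolidated_view(conn, "sales_all", ["sales_2023", "sales_2024"])
rows = conn.execute("SELECT region, amount FROM sales_all ORDER BY amount").fetchall()
```

A TEMP view costs nothing to create and vanishes with the connection, which is what makes it suitable as a transient consolidation layer for a single analysis session.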
backend/app/workers/catalog.py
The scope field replaces hardcoded text in backend/app/workers/catalog.py to generate more accurate file summaries. The LLM Prompt Injection use case modifies prompts in catalog.py to include a language parameter for file summaries.
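Replacing hardcoded prompt text with scope and language parameters might look like the following; the template wording is an assumption, and only the parameterization mirrors the change described:

```python
def build_summary_prompt(file_name: str, scope: str, language: str = "en") -> str:
    """Build a file-summary prompt parameterized by scope and output language.

    Previously both values would have been hardcoded into the prompt string.
    """
    return (
        f"Summarize the file '{file_name}' for the scope: {scope}.\n"
        f"Respond in language code: {language}.\n"
        "Keep the summary to two sentences."
    )

prompt = build_summary_prompt(
    "sales_2024.csv", scope="quarterly revenue analysis", language="de"
)
```

Pulling scope and language out as parameters lets the catalog worker produce per-tenant, per-locale summaries without touching the template.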