Operations
97 entities found
Async generator exception
An async generator exception is likely occurring silently within the running container e8f4cd9, causing queries to stop after thinking messages. The Coolify deployment status, which may be stuck or slow, potentially masks or contributes to this problem.
Backend Application Services
Phase C Integration modifies Backend Application Services to use consolidated views for query analysis within the DataLens System.
Backend container
The backend container of DataLens Development hosts the API services and is deployed on the server theo, where it is currently running old code due to deployment issues; a fix is pending a successful deployment of commit 52eab9a. It is orchestrated with Docker Compose, and the backend (Docker) service is defined in both docker-compose.yml and docker-compose.coolify.yml. The container integrates with the OpenClaw Gateway to forward user analysis requests and receive responses, and runs the SVGV bulk extraction process to parse files and store data. Coolify manages the container lifecycle, handling deployments and code reloads for new features, and provides the necessary environment variables during deployment, including ANTHROPIC_API_KEY, which the backend requires to function with OpenClaw and Claude. The Phase 2 Strategy Research & Decision Point deployment depends on this container, which needs a restart to load the new code for Phase 2 activation.
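The relationships described above can be sketched as a minimal compose fragment. This is an illustrative assumption of how the pieces fit together: only the ANTHROPIC_API_KEY variable, the backend_storage volume, and the backend service name come from the text; the build context, mount path, and everything else is invented for the sketch.

```yaml
# Illustrative sketch only -- the real docker-compose.yml / docker-compose.coolify.yml
# will differ. Shows how Coolify-provided env vars and the persistent volume attach
# to the backend service described above.
services:
  backend:
    build: ./backend            # assumed build context
    environment:
      # Injected by Coolify at deploy time; required for OpenClaw/Claude calls.
      - ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}
    volumes:
      # Persists user uploads and per-project DuckDB files across deploys.
      - backend_storage:/app/storage   # mount path is an assumption

volumes:
  backend_storage:
```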
backend container environment
Backend Env
backend-qg4kk8ccggw844wscsossogs
Docling is installed and running on the backend-qg4kk8ccggw844wscsossogs server container to support batch processing.
backend-qg4kk8ccggw844wscsossogs-093750218425
backend/app/services/table_catalog.py
backend/app/services/table_catalog.py implements the TableCatalog use case with Danish translations and join hints for the Multi-Stage Text-to-SQL Architecture.
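A minimal sketch of what a TableCatalog entry with Danish translations and join hints might look like; the field names, method names, and sample data are assumptions for illustration, not the real schema of table_catalog.py.

```python
from dataclasses import dataclass, field

# Hypothetical shape of a catalog entry; only the class name TableCatalog and the
# ideas of Danish translations and join hints come from the description above.
@dataclass
class TableEntry:
    name: str
    danish_name: str                      # Danish translation surfaced to the LLM
    join_hints: list[str] = field(default_factory=list)

class TableCatalog:
    def __init__(self) -> None:
        self._tables: dict[str, TableEntry] = {}

    def register(self, entry: TableEntry) -> None:
        self._tables[entry.name] = entry

    def describe(self, name: str) -> str:
        """Render one table's metadata for inclusion in a text-to-SQL prompt."""
        e = self._tables[name]
        hints = "; ".join(e.join_hints) or "none"
        return f"{e.name} ({e.danish_name}) -- join hints: {hints}"

catalog = TableCatalog()
catalog.register(TableEntry("orders", "ordrer", ["orders.customer_id = customers.id"]))
print(catalog.describe("orders"))
```

Feeding such rendered descriptions into each stage of the multi-stage pipeline keeps the SQL-generation prompt compact while still carrying the Danish vocabulary users actually query with.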
BACKEND_CRASH_DIAGNOSIS.md
Documents backend crash analysis; discusses lazy-loading Qdrant in QuestionRouter to prevent startup timeouts, with fallbacks if Qdrant is unavailable.
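The lazy-loading pattern the diagnosis describes can be sketched in a few lines: the Qdrant client is created on first use rather than at startup, and a failure flips a flag so the router falls back instead of retrying on every question. The connect callable and the "sql" fallback route are assumptions for illustration.

```python
# Sketch of lazy-loading Qdrant in QuestionRouter, per BACKEND_CRASH_DIAGNOSIS.md.
# The connect() factory and the fallback route name are hypothetical.
class QuestionRouter:
    def __init__(self, connect):
        self._connect = connect      # callable returning a Qdrant client
        self._client = None
        self._unavailable = False

    def _qdrant(self):
        # Deferred connection: nothing happens at startup, so no startup timeout.
        if self._client is None and not self._unavailable:
            try:
                self._client = self._connect()
            except Exception:
                self._unavailable = True   # remember failure; don't retry per call
        return self._client

    def route(self, question: str) -> str:
        if self._qdrant() is None:
            return "sql"        # fallback when the vector store is unreachable
        return "semantic"

def failing_connect():
    raise ConnectionError("Qdrant not reachable")

router = QuestionRouter(connect=failing_connect)
print(router.route("Hvad er omsaetningen?"))  # falls back since connect failed
```

The key property is that a Qdrant outage degrades routing instead of crashing the backend at startup.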
backend_storage
Docker container with no exposed ports, used for backend data storage and application services. The backend service requires the backend_storage volume to persist user uploads and DuckDB files. The backend_storage (Docker) service is defined in both docker-compose.yml and docker-compose.coolify.yml.
container-per-project
Coolify container
Coolify daemon
The Coolify daemon integrates with the GitHub API to fetch repository source code during deployment builds.
Coolify deployment
The Coolify deployment uses the Docker Compose configuration for service orchestration. Commit 52eab9a requires deployment via Coolify to enable updated debug logging in the process_message function; container e8f4cd9 runs old code and will be replaced once that commit is successfully deployed. Coolify manages the backend container lifecycle for deployments and code reloads for new features. The deployment status, which may be stuck or slow, potentially mitigates or affects the async generator exception causing query execution to stop.
Coolify frontend container
Deployment container for frontend; setup ongoing.
Coolify-ready configuration
The Docker deployment configuration is designed to be ready for deployment with Coolify. The Docker deployment includes this Coolify-ready configuration, which is documented in DEPLOYMENT.md.
Danish summaries
Full Danish language support implemented across LLM summaries, analysis, and UI translations. User preferences stored in database, with Danish UI and Danish summaries now active for admin and general users.
DeepAnalyze-8B
DeepAnalyze-8B model download failed (requires 22-26GB VRAM, only 20GB available). Alternative models like SQLCoder-7B deployed for faster SQL generation (3-5x speedup) on elin GPU. Current setup uses Ollama on elin to serve SQLCoder-7B, improving inference from 40-50s to 2-3s.
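The Ollama setup above exposes SQLCoder-7B over Ollama's HTTP API. A sketch of building a request for the `/api/generate` endpoint follows; the endpoint and payload keys (`model`, `prompt`, `stream`) are part of Ollama's documented API, while the model tag, host name, and prompt template are assumptions about this particular setup.

```python
import json
import urllib.request

# Build (but do not send) a request to Ollama's /api/generate endpoint.
# "sqlcoder:7b", the elin host, and the prompt layout are assumptions.
def build_sql_request(question: str, schema: str,
                      model: str = "sqlcoder:7b",
                      host: str = "http://elin:11434") -> urllib.request.Request:
    payload = {
        "model": model,
        "prompt": f"### Schema:\n{schema}\n### Question:\n{question}\n### SQL:\n",
        "stream": False,   # one JSON response instead of a token stream
    }
    return urllib.request.Request(
        f"{host}/api/generate",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )

req = build_sql_request("Total revenue per customer?",
                        "orders(customer_id, amount)")
print(req.full_url)
# Sending it would be: urllib.request.urlopen(req) -- omitted here since it
# requires a running Ollama server.
```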
Deploy_gpu_extractors.sh
The Deploy_gpu_extractors.sh script installs and configures the Docling extraction system and related GPU-first extraction components.
DEPLOYMENT_HYBRID.md
Hybrid deployment guide combining the backend on elin (with GPU-accelerated DataLens AI) and the frontend on theo via Coolify, keeping the infrastructure separated and providing detailed setup instructions. The hybrid architecture runs all Python and AI workloads on elin, with the frontend on theo; DEPLOYMENT_HYBRID.md provides the complete guide for this deployment.
designs/phase2-gpu-first-implementation.md
Disk space
Disk space is a monitored infrastructure specification; the plan tracks disk space usage for DuckDB and Qdrant data storage.
Docker
DataLens Agent Mode deployment involves running IronClaw Service as a Docker container. The DataLens platform uses Docker for deployment including backend, frontend, PostgreSQL, and Redis containers. Docker deployment is defined using the docker-compose.yml configuration file.
Docker Compose Stack
The DataLens Platform backend and dependencies are deployed using a Docker Compose Stack specification. The Docker Compose Stack includes the Platform Backend service as one of its containers alongside PostgreSQL and Redis.
Docker Deployment Guide
The DataLens Platform includes Docker Deployment for its full stack, integrated with the Coolify Application deployment system. Docker Deployment contains the docker-compose.yml file for full-stack orchestration, plus the acceptance documents DEPLOYMENT.md (guiding deployment via Coolify) and COOLIFY_DEPLOY.md (deployment instructions). The deployment environment comprises the Frontend, Backend, PostgreSQL, and Redis services, with health checks for components, volume configurations for persistent storage, and network configurations for service communication. The configuration is designed to be ready for deployment with Coolify.
docker-compose.coolify.yml
docker-compose.coolify.yml is part of the Docker Compose configuration used in deployment. The RQ Worker service depends on the service definition in docker-compose.coolify.yml for backend container deployment and job processing. The file defines the postgres, redis, backend, worker, frontend, postgres_data, redis_data, and backend_storage (Docker) services.
docker-essentials
Docker essentials are used within the DataLens platform backend for containerization; the backend includes and integrates the docker-essentials skill.
DOCXExtractorTET
Refactored for semantic, section-based chunking with optional Docling GPU-accelerated extraction on elin. It uses the heading hierarchy to define chunk boundaries, embeds tables as JSON, improves text quality, and integrates with batch processing. The DOCX Extractor is deployed as part of the backend and produces text chunks and table-parsing outputs.
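The heading-hierarchy chunking described above can be sketched in plain Python. The `(level, text)` block representation and the split level are assumptions about how the extractor models a parsed DOCX; the real interface may differ.

```python
# Sketch of section-based chunking: start a new chunk at every heading whose
# level is <= split_level, so deeper headings stay inside their parent section.
# The (level, text) tuples (level is None for body text) are an assumed input shape.
def chunk_by_headings(blocks, split_level=2):
    chunks, current = [], []
    for level, text in blocks:
        if level is not None and level <= split_level and current:
            chunks.append("\n".join(current))   # close the previous section
            current = []
        current.append(text)
    if current:
        chunks.append("\n".join(current))
    return chunks

doc = [(1, "Intro"), (None, "Some body text."),
       (2, "Methods"), (None, "More text."),
       (3, "Details"), (None, "Nested text stays in the same chunk.")]
for chunk in chunk_by_headings(doc):
    print(chunk.replace("\n", " | "))
```

Because the H3 "Details" heading is below the split level, its text stays in the "Methods" chunk, which is what keeps chunks semantically coherent sections rather than fixed-size windows.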
DuckDB (per-project analytical data)
DuckDB is used for storing structured extracted data and semantic text chunks for each project, currently containing over 700 entities with detailed tabular and textual information from document processing. Per-project isolation via separate .duckdb files is a design decision implemented within DataLens Development.
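The per-project isolation decision amounts to deriving one .duckdb path per project. A stdlib-only sketch of that path scheme follows; the storage root and sanitization rule are assumptions, and the actual connection would then be opened with `duckdb.connect(str(path))` (omitted here to keep the sketch dependency-free).

```python
from pathlib import Path

# Sketch of DuckDB isolation by separate .duckdb files per project.
# "storage/projects" and the character whitelist are illustrative assumptions.
def project_db_path(project_id: str, root: Path = Path("storage/projects")) -> Path:
    # Drop anything that could escape the storage root (slashes, dots, etc.).
    safe = "".join(ch for ch in project_id if ch.isalnum() or ch in "-_")
    return root / f"{safe}.duckdb"

print(project_db_path("demo-project").as_posix())
```

Keeping one file per project means a corrupted or oversized database affects only that project, and deleting a project is a single file removal.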
DuckDB database system
The Qdrant vector index depends on the DuckDB database for text chunk storage and embedding data source in the DataLens platform.