Project: datalens
81 entity types
All Domains

1587 entities found

IntegrationEndpointIntegrations

POST /upload-batch

The upload-batch endpoint extends single-file upload by handling multiple files asynchronously.
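The asynchronous multi-file handling described above could be sketched as follows; the function and field names are hypothetical, not taken from the actual endpoint code:

```python
import asyncio

async def process_file(name: str, data: bytes) -> dict:
    # Stand-in for the real single-file pipeline (store the file, queue extraction).
    await asyncio.sleep(0)  # simulate async I/O
    return {"filename": name, "size": len(data), "status": "queued"}

async def upload_batch(files: dict[str, bytes]) -> list[dict]:
    # Process all files concurrently rather than one at a time.
    tasks = [process_file(name, data) for name, data in files.items()]
    return await asyncio.gather(*tasks)

results = asyncio.run(upload_batch({"a.pdf": b"1234", "b.xlsx": b"12"}))
```

`asyncio.gather` preserves input order, so each result maps back to its file even though processing is concurrent.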

IntegrationEndpointIntegrations

POST /validate

Validation API endpoint defined in backend/api/discovery.py; status unspecified.

IntegrationEndpointIntegrations

POST /{file_id}/extract

In files.py, uploading a file and extracting data from it are sequential operations in the data-ingestion flow.

IntegrationEndpointIntegrations

POST /{project_id}/recommendations

Generates analysis recommendations for a project; status unspecified.

Entity

POST request

QuestionRouter handles POST requests, which previously hung during Qdrant connection negotiation before the lazy-loading fix.
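The lazy-loading fix likely follows the standard pattern of deferring the client connection until first use, so constructing the router never blocks on network negotiation. A minimal sketch, with all names and the connection stand-in assumed (the real code would create a Qdrant client here):

```python
class QuestionRouter:
    """Sketch: defer the vector-store connection until it is actually needed."""

    def __init__(self, qdrant_url: str):
        self._qdrant_url = qdrant_url
        self._client = None  # no connection attempted at construction time

    @property
    def client(self):
        # Connect lazily on first access instead of in __init__.
        if self._client is None:
            self._client = self._connect()
        return self._client

    def _connect(self):
        # Stand-in for a real QdrantClient(self._qdrant_url) call.
        return {"url": self._qdrant_url, "connected": True}

router = QuestionRouter("http://localhost:6333")
eager = router._client is not None   # False: nothing connected yet
lazy = router.client["connected"]    # connection happens on first use
```

With this shape, a slow or unreachable Qdrant instance delays only the first request that needs it, not server startup.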

ThirdPartyComponentArchitecture

postcss

npm dependency postcss@^8.5.6 (resolved version 8.5.6), used in the frontend; license type unknown; no approval or risk assessment noted.

ServerOperations

postgres-qg4kk8ccggw844wscsossogs-093750206021

The postgres (Docker) service is defined in both docker-compose.yml and docker-compose.coolify.yml.

ServerOperations

postgres_data

Docker volume with no exposed ports, used for PostgreSQL data storage. The backend service requires the postgres_data volume to persist PostgreSQL database files. The postgres_data (Docker) entry is defined in both docker-compose.yml and docker-compose.coolify.yml.
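A hypothetical sketch of the compose fragment described above: a postgres service with no published ports, persisting data through the postgres_data named volume. The service and volume names come from the source; the image tag, mount path, and environment wiring are assumptions.

```yaml
services:
  postgres:
    image: postgres:16
    environment:
      POSTGRES_DB: ${POSTGRES_DB}
      POSTGRES_USER: ${POSTGRES_USER}
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
    volumes:
      - postgres_data:/var/lib/postgresql/data
    # no "ports:" section -> reachable only on the compose network

volumes:
  postgres_data:
```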

SystemBoundaryArchitecture

postgres_data volume

RequirementIntent

POSTGRES_DB environment variable

RequirementIntent

POSTGRES_PASSWORD environment variable

RequirementIntent

POSTGRES_USER environment variable

ServerOperations

PostgreSQL

DataLens migrated project data from DuckDB to PostgreSQL for enhanced concurrency; backend interactions, data storage, and metadata management rely on PostgreSQL, which supports multilingual features and hosts the project_14 schema with 72 tables. After the migration, the Extraction Worker writes extracted data to PostgreSQL schemas instead of DuckDB, the Query Router reads extracted data from PostgreSQL schemas, and Arctic SQL generates SQL queries targeting PostgreSQL with minor translation.

PgDataService manages extracted project data stored in PostgreSQL schemas. Each project schema, such as Project 14's schema, is a distinct namespace within PostgreSQL. Text chunks for all projects are stored in a unified table in PostgreSQL's public schema, and the extracted_tables table tracks metadata about extracted tables per project.

Phase 2 Strategy Research & Decision Point uses PostgreSQL for file storage and metadata management, as indicated by files being stored in PostgreSQL during deployment tests. The backend server uses PostgreSQL for tracking file metadata, users, projects, and authentication; the database stores user information for authentication and project tracking, as well as project metadata such as org_id and the created_by user.

The PostgreSQL database stores the project_14 schema used for SVGV data storage and extraction results; it catalogs metadata for SVGV files but does not contain extracted budget tables after a system restart. The Data Discovery system integrates with PostgreSQL for storing and retrieving data. IronClaw agent tables are physical tables mapped within the PostgreSQL database for persistent storage of agent sessions and related data. The OpenClaw Skill API references the PostgreSQL database for metadata of the 132 budget files, and the theo Backend accesses it to manage budget-file metadata as part of the data platform.
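The schema-per-project layout described above can be sketched with a small helper that derives a schema name and fully qualified table names. The project_14 schema, extracted_tables table, and omkostningsberegning table appear in the source; the helper functions and the public-schema table names are illustrative assumptions:

```python
def project_schema(project_id: int) -> str:
    # Each project gets its own PostgreSQL schema as a namespace, e.g. project_14.
    return f"project_{project_id}"

def qualified_table(project_id: int, table: str) -> str:
    # Fully qualified name for SQL generated against the project's schema.
    return f'{project_schema(project_id)}."{table}"'

# Shared metadata lives in the public schema; per-project data in its own schema.
CHUNKS_TABLE = "public.text_chunks"      # unified chunk table (name assumed)
META_TABLE = "public.extracted_tables"   # tracks extracted tables per project

sql = f"SELECT * FROM {qualified_table(14, 'omkostningsberegning')} LIMIT 10"
```

Quoting the table name guards against mixed-case or reserved-word identifiers; the schema prefix keeps identically named tables from different projects from colliding.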

ServerOperations

PostgreSQL (metadata storage)

The POST /files/upload?project_id=4 endpoint stores uploaded files in the PostgreSQL database.

GDPRRecordSecurity

PostgreSQL (Metadata)

PostgreSQL stores project-related metadata, session info, and historical data for the DataLens platform, supporting agent sessions, conversation logs, findings, and GDPR flags.

ExternalSystemIntegrations

PostgreSQL 16

Backend API uses PostgreSQL 16 database service as its metadata store. The DataLens Platform backend depends on PostgreSQL 16 as the database technology. PostgreSQL 16 database server is integrated within the Coolify deployment environment for DataLens Development.

ServerOperations

PostgreSQL 16 service

The backend service and Backend API depend on the PostgreSQL 16 service for metadata storage, with the connection configured via an environment variable. The production deployment on theo uses PostgreSQL 16 for metadata storage.
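The environment-variable wiring mentioned above might look like the following sketch, which assembles a connection URL from the POSTGRES_* variables listed elsewhere in this catalog. The URL shape, variable fallbacks, and default host/port are assumptions, not confirmed configuration:

```python
import os

def database_url(env=os.environ) -> str:
    # Build a PostgreSQL connection string from POSTGRES_* environment
    # variables; defaults are illustrative (e.g. host "postgres" matches
    # a typical docker-compose service name).
    user = env.get("POSTGRES_USER", "postgres")
    password = env.get("POSTGRES_PASSWORD", "")
    host = env.get("POSTGRES_HOST", "postgres")
    port = env.get("POSTGRES_PORT", "5432")
    db = env.get("POSTGRES_DB", "datalens")
    return f"postgresql://{user}:{password}@{host}:{port}/{db}"

url = database_url({"POSTGRES_USER": "app", "POSTGRES_PASSWORD": "s3cret",
                    "POSTGRES_DB": "datalens"})
```

Centralizing the URL construction in one function keeps the compose files and the application agreeing on a single set of variable names.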

ServerOperations

PostgreSQL container

The Production Deployment on theo includes running the PostgreSQL container. The PostgreSQL container runs on the server named theo as part of the Coolify infrastructure.

SystemBoundaryArchitecture

PostgreSQL Data Volume

ServerOperations

PostgreSQL database

A PostgreSQL database system is employed as the backend database platform, serving as the core store for metadata, user, and project data management in DataLens. The PostgreSQL database stores FileUpload records, with the ai_summary column holding AI-generated summaries of files. The query subsystem uses the PostgreSQL database to persist query history and metadata about user questions and projects.

ServerOperations

PostgreSQL database system

DataEntityData Model

PostgreSQL Decimal type

FindingsGenerator requires support for PostgreSQL Decimal type to correctly process numeric data in findings. The Live Backend supports PostgreSQL Decimal type for findings generation.
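PostgreSQL NUMERIC/DECIMAL columns arrive in Python drivers as `decimal.Decimal`, which `json.dumps` rejects, so a findings generator needs a conversion step before serializing results. A minimal sketch of that support (the conversion policy shown, ints preserved and other values downcast to float, is an assumption about how FindingsGenerator handles it):

```python
import json
from decimal import Decimal

def to_jsonable(value):
    # Recursively convert Decimal values into JSON-serializable numbers.
    if isinstance(value, Decimal):
        # Keep integral values exact as int; downcast the rest to float
        # (a deliberate precision trade-off for JSON output).
        return int(value) if value == value.to_integral_value() else float(value)
    if isinstance(value, dict):
        return {k: to_jsonable(v) for k, v in value.items()}
    if isinstance(value, list):
        return [to_jsonable(v) for v in value]
    return value

finding = {"total": Decimal("1250.50"), "count": Decimal("3")}
payload = json.dumps(to_jsonable(finding), sort_keys=True)
```

An alternative that avoids float precision loss entirely is serializing Decimals as strings, at the cost of clients having to parse them back into numbers.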

AcceptanceCriteriaIntent

PostgreSQL Health Check

BusinessProcessIntent

PostgreSQL metadata storage

Uses PostgreSQL for storing system metadata; deployment-ready in production.

CapabilityIntent

PostgreSQL MVCC

Entity

PostgreSQL on theo

Frontend deployment on theo is containerized and runs together with PostgreSQL on theo via Coolify.

InfrastructureSpecOperations

PostgreSQL project_14 schema

The SVGV Full Reset includes dropping and recreating the PostgreSQL project_14 schema as part of the reset. Arne Hauge accesses data stored in the PostgreSQL project_14 schema for budget analysis. The Data Discovery System requires the PostgreSQL project_14 schema as the primary data store for consolidated tables and query execution. The omkostningsberegning table is part of the PostgreSQL project_14 schema storing extracted SVGV data.

PhysicalTableData Model

PostgreSQL schema

The DataLens Platform uses a multi-tenant PostgreSQL schema for organizations, users, projects, files, and cataloging data. The DataLens Platform uses SQLAlchemy to interact with the PostgreSQL schema for database operations. The Backend uses a PostgreSQL schema for data storage.

PhysicalTableData Model

PostgreSQL theo:5433 proxy

ThirdPartyComponentArchitecture

PPTX extractor

The batch upload pipeline depends on new extractors, including the PPTX extractor. The PPTX extractor uses the python-pptx third-party component to extract slide-based chunks during DataLens Phase 2, implementing slide-based chunking with potential sub-slide splits for dense content. It operates via background workers to perform extraction tasks, and its capability is validated by the test_extractors test case.
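The slide-based chunking with sub-slide splits described above might look like the following sketch. In the real extractor the slide text would come from python-pptx; here the slides are plain strings, and the split heuristic (greedy packing on blank-line boundaries) and the size limit are assumptions:

```python
def chunk_slides(slides: list[str], max_chars: int = 500) -> list[dict]:
    # One chunk per slide; dense slides are split into sub-slide parts.
    chunks = []
    for idx, text in enumerate(slides):
        if len(text) <= max_chars:
            chunks.append({"slide": idx, "part": 0, "text": text})
            continue
        # Dense slide: pack paragraphs greedily into parts under max_chars.
        part, buf = 0, ""
        for para in text.split("\n\n"):
            if buf and len(buf) + len(para) + 2 > max_chars:
                chunks.append({"slide": idx, "part": part, "text": buf})
                part, buf = part + 1, para
            else:
                buf = f"{buf}\n\n{para}" if buf else para
        if buf:
            chunks.append({"slide": idx, "part": part, "text": buf})
    return chunks

chunks = chunk_slides(["short slide", ("para\n\n" * 60).strip()], max_chars=100)
```

Keeping the slide index and part number on each chunk lets downstream retrieval cite the originating slide even after sub-slide splitting.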