All Domains
async event_generator()
async exception visibility
Implement diagnostic logging in question_router.py to track whether event_generator is called, identify where execution stops, and pinpoint silent failures in async generators, especially in _run_query() or the route() method, to improve visibility of exceptions.
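The diagnostic logging described above can be sketched as a decorator that surfaces exceptions from async generators instead of letting them die silently. This is a minimal illustration, not the actual question_router.py code; the `event_generator` body and logger name are assumptions.

```python
# Sketch: wrap an async generator so entry, exit, and any exception are
# logged before the exception propagates, instead of failing silently.
# Names (event_generator, "question_router") are illustrative assumptions.
import asyncio
import functools
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("question_router")

def log_async_gen(fn):
    """Decorator that logs entry, exceptions, and exit of an async generator."""
    @functools.wraps(fn)
    async def wrapper(*args, **kwargs):
        logger.info("entering %s", fn.__name__)
        try:
            async for item in fn(*args, **kwargs):
                yield item
        except Exception:
            # This is the visibility the notes call for: the traceback is
            # logged here even if the caller swallows the exception.
            logger.exception("async generator %s failed", fn.__name__)
            raise
        finally:
            logger.info("exiting %s", fn.__name__)
    return wrapper

@log_async_gen
async def event_generator():
    yield "thinking"
    raise RuntimeError("boom")  # simulated silent failure after a thinking event

async def main():
    events = []
    try:
        async for ev in event_generator():
            events.append(ev)
    except RuntimeError:
        events.append("error-surfaced")
    return events

print(asyncio.run(main()))  # ['thinking', 'error-surfaced']
```

With the decorator applied, a query that "stops after thinking messages" leaves a full traceback in the logs rather than simply ending the stream.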
Async generator exception
An async generator exception is likely occurring silently within the running container e8f4cd9, causing queries to stop after thinking messages. This failure may be masked or aggravated by the Coolify deployment status, which may be stuck or slow.
Audit logs
Audit Logs
Vanna 2.0 maintains audit logs that track every query per user for compliance purposes.
audit_logs table
DataLens needs to add an audit_logs table and related query-tracking middleware to support compliance audit logging and record which user ran each query.
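The audit_logs table and query-tracking helper described above could look roughly like the following. This is a minimal sketch using sqlite3 for illustration; the real DataLens schema, column names, and middleware wiring are assumptions.

```python
# Minimal sketch of an audit_logs table plus a query-recording helper.
# In production this helper would run inside request middleware so every
# query is captured automatically; sqlite3 stands in for PostgreSQL here.
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE audit_logs (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        user_id TEXT NOT NULL,
        query TEXT NOT NULL,
        created_at TEXT NOT NULL
    )
""")

def record_query(user_id: str, query: str) -> None:
    """Insert one audit row: who ran what, and when."""
    conn.execute(
        "INSERT INTO audit_logs (user_id, query, created_at) VALUES (?, ?, ?)",
        (user_id, query, datetime.now(timezone.utc).isoformat()),
    )
    conn.commit()

record_query("alice", "SELECT * FROM budgets")
record_query("alice", "SELECT count(*) FROM projects")

rows = conn.execute(
    "SELECT query FROM audit_logs WHERE user_id = ?", ("alice",)
).fetchall()
print(len(rows))  # 2
```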
Auth
The auth system is successfully integrated with login/auth endpoints. User login was fixed with correct credentials, enabling secure access. Frontend and backend authentication are functional, supporting Danish language and user sessions in production.
Auth API Endpoints
auth guards
The frontend uses auth guards for authentication.
Auth token
Authentication
API endpoints are governed by authentication constraints to ensure secure access. The DataLens Platform includes an authentication system supporting registration, login, and session management. The Auth system uses python-jose for token management and passlib for secure password hashing.
Authentication services
Authentication services cooperate with Project services in the backend
Authentication Token 2033781553447d00cb54bd110e2f14c623860f24eceb7421
Authorization Bearer tokens
The DataLens Platform implements Bearer token authentication, using the standard Authorization: Bearer header for session management and authorization.
authStore.init
authStore.user
auto-queue extraction
autonomous data analysis
IronClaw-powered Agent Mode provides the capability for autonomous data analysis.
autonomous extraction
Part of the DS-STAR pipeline for data analysis, involving extraction, planning, verification, and routing agents.
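The four stages named above compose naturally as a pipeline. A toy sketch; the function bodies are placeholders, not the real DS-STAR agents, and the stage outputs are assumptions:

```python
# Toy sketch of the DS-STAR-style pipeline stages named in the notes:
# extraction -> planning -> verification -> routing. Each stage takes the
# previous stage's state dict and enriches it.
def extract(question: str) -> dict:
    """Extraction agent: pull entities from the question (placeholder)."""
    return {"question": question, "entities": ["budget"]}

def plan(state: dict) -> dict:
    """Planning agent: decide the analysis steps (placeholder)."""
    return {**state, "plan": ["generate_sql", "run_query"]}

def verify(state: dict) -> dict:
    """Verification agent: sanity-check the plan (placeholder)."""
    return {**state, "verified": bool(state["plan"])}

def route(state: dict) -> str:
    """Routing agent: pick the downstream handler."""
    return "sql_agent" if state["verified"] else "fallback_agent"

def pipeline(question: str) -> str:
    return route(verify(plan(extract(question))))

print(pipeline("What is the 2024 budget?"))  # sql_agent
```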
autoprefixer
npm dependency: autoprefixer@^10.4.24, used in the frontend; license type unknown, no approval or risk assessment specified.
AWS Bedrock
AWS Bedrock is among the third-party AI providers set for hosting or API access for AI models in DataLens, supporting scalable inference solutions.
Azure
Azure cloud services are noted as part of third-party AI component options available for DataLens model hosting and inference, but specific deployment details are not given.
Azure OpenAI
B-tree index
Backend
The backend uses FastAPI, supports multi-tenant auth, and integrates DS-STAR, Text-to-SQL, PostgreSQL, Redis, Qdrant, and Ollama on elin. Critical components include question_router, findings_generator, and the document extractors. It manages project data, files, and the analysis API, with fallback mechanisms for Qdrant. It currently uses DuckDBService; after migration, PgDataService will replace it to manage data in PostgreSQL.
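The DuckDBService-to-PgDataService migration mentioned above is least invasive when callers depend on a shared interface and only the wiring changes. A sketch under that assumption; the class and method names here are illustrative, not the actual DataLens API:

```python
# Illustrative sketch: backend code depends on a DataService protocol,
# so swapping DuckDBService for PgDataService touches only the wiring.
from typing import Protocol

class DataService(Protocol):
    def run_query(self, sql: str) -> list[tuple]: ...

class DuckDBService:
    def run_query(self, sql: str) -> list[tuple]:
        return [("duckdb", sql)]      # stand-in for a real DuckDB call

class PgDataService:
    def run_query(self, sql: str) -> list[tuple]:
        return [("postgres", sql)]    # stand-in for a real psycopg call

def answer_question(service: DataService, sql: str) -> list[tuple]:
    """Caller code is unchanged by the migration."""
    return service.run_query(sql)

print(answer_question(DuckDBService(), "SELECT 1")[0][0])   # duckdb
print(answer_question(PgDataService(), "SELECT 1")[0][0])   # postgres
```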
Backend API
The DataLens Platform provides a Backend API to support the frontend and other clients. The frontend's /ask-stream switch uses the backend /ask-stream endpoint to stream responses for long-running queries and prevent client timeout errors. The Backend API interacts with the RQ worker to manage extraction job queues and status for SVGV files, and exposes Extraction API endpoints to trigger extraction of individual SVGV files; it queries the DuckDB tables produced by SVGV extraction for budget data. The backend container hosts the Backend API, which is functional and ready to handle requests for DataLens. The Data Discovery feature uses Backend API endpoints including /discovery, /tables, and /validate, and the Backend API integrates with the Discovery backend service via these defined routes; the Data Discovery System also uses Backend API endpoints for entity extraction, ranking, and join-discovery services. The Backend API integrates with the Ollama GPU service to run the LLM models qwen3-coder-next and nomic-embed-text for query classification and embedding generation, and uses the OpenClaw Gateway WebSocket client to communicate with the OpenClaw Gateway for agent chat operations.
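The /ask-stream pattern above works by yielding incremental events while the query runs, keeping the client connection alive. A minimal sketch of the idea; in DataLens this would sit behind a FastAPI streaming response, and the event names below are assumptions:

```python
# Minimal sketch of the /ask-stream idea: yield progress events while a
# long-running query executes so the client never hits its idle timeout.
import asyncio

async def ask_stream(question: str):
    """Yield thinking/progress events, then the final answer chunk."""
    yield "event: thinking\n"
    await asyncio.sleep(0)          # stand-in for LLM / SQL latency
    yield "event: sql_generated\n"
    await asyncio.sleep(0)
    yield f"data: answer to {question!r}\n"

async def main():
    return [chunk async for chunk in ask_stream("total budget?")]

chunks = asyncio.run(main())
print(len(chunks))  # 3
```

Because each chunk is flushed as soon as it is yielded, the client sees activity within milliseconds even if the final answer takes minutes.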
Backend Application Services
Phase C Integration modifies Backend Application Services to use consolidated views for query analysis within the DataLens System.
Backend container
The backend container hosting API services is configured on theo and is currently running old code due to deployment issues; it relies on docker-compose, with deployment pending a successful update to the relevant commit (52eab9a) to fix ongoing errors. The container connects to the OpenClaw Gateway to forward user analysis requests and receive responses, and runs the SVGV bulk extraction process to parse files and store data. The DataLens Development backend container is deployed on the server named theo. The Phase 2 Strategy Research & Decision Point deployment depends on this container, which needs a restart to load the new code for Phase 2 activation. Coolify deploy manages the backend container lifecycle for deployments and code reloads for new features, and Coolify environment variables must provide the necessary settings, including the ANTHROPIC_API_KEY, for the container to function correctly with OpenClaw and Claude. The backend Docker service is defined in both docker-compose.yml and docker-compose.coolify.yml.
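The dependency on Coolify-provided environment variables can be expressed in the compose file by passing the variable through to the container. A hypothetical fragment, assuming a service simply named `backend`; the actual docker-compose.coolify.yml layout is not shown in these notes:

```yaml
# Hypothetical docker-compose fragment: Coolify injects ANTHROPIC_API_KEY
# into the deploy environment, and the compose file forwards it into the
# backend container so OpenClaw/Claude calls can authenticate.
services:
  backend:
    build: .
    environment:
      - ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}
```

If the variable is unset in Coolify, the container starts with an empty key, which matches the failure mode described above.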