Server Operations
Ollama (Qwen3)
The ollama user runs the Ollama inference service, which hosts the Qwen3-coder-next model used as a fallback LLM for query generation.
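A minimal sketch of how a client might reach this fallback model over Ollama's HTTP generate API; it assumes the service listens on Ollama's default local port 11434 and that the model tag is `qwen3-coder-next` as named above (both are assumptions, not confirmed configuration):

```python
import json
import urllib.request

# Assumed endpoint: Ollama's default local API port.
OLLAMA_URL = "http://localhost:11434/api/generate"
# Assumed model tag, taken from the service description above.
MODEL = "qwen3-coder-next"

def build_payload(prompt: str) -> dict:
    """Build a non-streaming generate request for the fallback model."""
    return {"model": MODEL, "prompt": prompt, "stream": False}

def generate_query(prompt: str) -> str:
    """Send the prompt to the local Ollama instance and return its response text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=60) as resp:
        return json.load(resp)["response"]
```

In a fallback arrangement, `generate_query` would only be invoked after the primary LLM fails or times out.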