Integration: Open WebUI
Use Open WebUI as a chat frontend for your Haystack apps through Hayhooks
Overview
Open WebUI is an open-source chat UI for LLM apps. By exposing your Haystack pipelines and agents through Hayhooks as OpenAI-compatible endpoints, you can use Open WebUI as the frontend: run Hayhooks and Open WebUI (separately or via Docker Compose), then connect Open WebUI to Hayhooks in Settings. You get streaming, optional status and notification events, and optional OpenAPI tool server integration.
For full details, see the Hayhooks Open WebUI integration guide.
Installation
Install Hayhooks:
pip install hayhooks
Install and run Open WebUI separately, e.g. with Docker:
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -e WEBUI_AUTH=False -v open-webui:/app/backend/data --name open-webui ghcr.io/open-webui/open-webui:main
For a pre-wired setup, use the Hayhooks + Open WebUI Docker Compose (see Quick Start with Docker Compose).
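If you want to wire the two services yourself, a minimal compose file could look like the sketch below. The `deepset/hayhooks` image name and the `HAYHOOKS_PIPELINES_DIR` variable are assumptions based on the Hayhooks docs; ports and the Open WebUI settings mirror the docker run command above. Prefer the official Docker Compose setup linked above.

```yaml
# Illustrative sketch only; verify image names and env vars against the docs.
services:
  hayhooks:
    image: deepset/hayhooks:main        # assumed image/tag
    ports:
      - "1416:1416"
    volumes:
      - ./pipelines:/pipelines
    environment:
      HAYHOOKS_PIPELINES_DIR: /pipelines
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    environment:
      WEBUI_AUTH: "False"
    volumes:
      - open-webui:/app/backend/data
volumes:
  open-webui:
```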
Usage
Connect Open WebUI to Hayhooks
1. Start Hayhooks with your pipelines:
   hayhooks run --pipelines-dir ./pipelines
2. In Open WebUI go to Settings → Connections and add a connection:
   - API Base URL: http://localhost:1416 (or http://hayhooks:1416/v1 when using Docker Compose)
   - API Key: any value (Hayhooks does not require auth)
3. In a new chat, select your deployed pipeline as the model.
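Because Hayhooks exposes an OpenAI-compatible API, you can also verify the connection outside Open WebUI by calling the chat completions endpoint directly. The sketch below builds such a request with the standard library; the pipeline name "my_pipeline" is an illustrative assumption, and the exact endpoint path (with or without the /v1 prefix) should be checked against the Hayhooks docs.

```python
import json
import urllib.request

def build_chat_request(base_url: str, pipeline_name: str, prompt: str) -> urllib.request.Request:
    # Same payload shape Open WebUI sends to any OpenAI-compatible backend:
    # the deployed pipeline plays the role of the "model".
    payload = {
        "model": pipeline_name,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )

req = build_chat_request("http://localhost:1416", "my_pipeline", "Hello!")
# urllib.request.urlopen(req) would return an OpenAI-style chat completion
# once Hayhooks is running with a deployed pipeline.
```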
Pipeline wrappers must support chat completion (e.g. implement run_chat_completion or run_chat_completion_async). See OpenAI compatibility and the pipeline examples for implementation details.
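As a rough illustration of that requirement, a wrapper could look like the sketch below. The BasePipelineWrapper import and the run_chat_completion signature follow the Hayhooks docs, but the pipeline wiring is illustrative, not a definitive implementation; the import fallback only exists so the sketch runs without hayhooks installed.

```python
try:
    from hayhooks import BasePipelineWrapper
except ImportError:  # fallback stub so this sketch runs without hayhooks
    class BasePipelineWrapper:
        pass

class EchoWrapper(BasePipelineWrapper):
    def setup(self) -> None:
        # Normally you would build or load your Haystack pipeline here.
        self.pipeline = None

    def run_chat_completion(self, model: str, messages: list, body: dict) -> str:
        # Pull the latest user message and return a reply string.
        question = messages[-1]["content"]
        # Real code would run the pipeline instead, e.g. (illustrative):
        # result = self.pipeline.run({"prompt": {"query": question}})
        return f"echo: {question}"
```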
Optional: Open WebUI events
For status updates, notifications, and tool-call feedback in the UI, use helpers from hayhooks.open_webui in your pipeline and stream OpenWebUIEvent objects. See Open WebUI Events and the open_webui_agent_events and open_webui_agent_on_tool_calls examples.
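A streaming wrapper can interleave such events with answer chunks. In the sketch below, the helper name create_status_event and its signature are assumptions based on the Hayhooks Open WebUI events guide; check the real names there. The fallback stub mirrors Open WebUI's "status" event payload shape so the sketch runs without hayhooks installed.

```python
try:
    from hayhooks.open_webui import create_status_event
except ImportError:  # fallback stub; shape mirrors Open WebUI's status event
    def create_status_event(description: str, done: bool = False) -> dict:
        return {"type": "status", "data": {"description": description, "done": done}}

def run_chat_completion(model, messages, body):
    # Yield a status event first, then stream answer chunks,
    # then mark the status as done.
    yield create_status_event("Running pipeline...")
    yield "Hello "
    yield "world"
    yield create_status_event("Done", done=True)

chunks = list(run_chat_completion("m", [], {}))
```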
License
hayhooks is distributed under the terms of the Apache-2.0 license. Open WebUI is subject to its own license.
