Conduit
An honest look at what you get when you replace your Open-WebUI PWA with a real native app.
TL;DR
- What it is: Open-source (GPL-3.0) native iOS and Android app for Open-WebUI — the Flutter-built mobile client your self-hosted AI stack was missing [README].
- Who it’s for: Anyone already running Open-WebUI on a home server or VPS who’s tired of the mobile browser experience and wants native file sharing, voice input, Siri shortcuts, and proper auth handling without wrestling with PWA quirks [website][README].
- Cost: Free app download. Requires an existing Open-WebUI instance. No subscription, no per-query pricing.
- Key strength: The authentication layer. Conduit handles the messy reality of self-hosted setups — Cloudflare Tunnel, oauth2-proxy, Authelia, Tailscale, and a half-dozen other proxy configurations — without requiring you to punch holes in your firewall or allowlist specific endpoints [README].
- Key weakness: It’s a mobile skin on top of Open-WebUI’s API. It doesn’t extend the AI capabilities themselves, and it lives or dies with Open-WebUI’s own reliability. At 1,169 GitHub stars, the community is small and it appears to be a primarily solo-developer project [merged profile].
What is Conduit
Conduit is a Flutter-based iOS and Android client for Open-WebUI, the popular self-hosted frontend that lets you talk to local LLMs (Ollama, LM Studio) and cloud models (OpenAI, Anthropic) through a browser interface. The tagline on the website — “Your Open-WebUI stack, designed for your pocket” — is accurate and understated [website].
The core premise is simple: Open-WebUI’s web interface works on mobile, but it’s a web interface. Progressive Web Apps on iOS are a concession Apple made grudgingly, and it shows. No Siri integration. No native share sheet. No access to the platform keychain. No way to send a voice note and have it transcribed before hitting send. Conduit replaces all of that with a proper native app that speaks the same Open-WebUI API.
It’s available on the App Store and Google Play, and the source code is on GitHub under GPL-3.0. The README’s quickstart is three commands: clone the repo, flutter pub get, flutter run [README]. If you’ve ever built a Flutter app, the setup is familiar. If you haven’t, the store releases are the easier path.
What’s not obvious from the description is how much work went into the authentication layer — more on that below. The developer clearly spent time on the part that actually makes self-hosted setups painful: connecting to your server from outside your local network, through the various authentication proxies that security-conscious self-hosters run.
Why people choose it
No independent tech reviews of this specific app were available in the sources provided for this article. What exists is a small set of user quotes on the Conduit website — four snippets pulled from App Store and Google Play reviews — and the GitHub README. This is itself a signal: Conduit is a niche tool for a niche use case, and the niche is small enough that tech media hasn’t noticed it yet.
The user quotes that do exist cluster around the same three themes:
It works where the PWA doesn’t. “UI is far superior to using a PWA. Everything picks up perfectly over Tailscale.” (Pixel 9 Pro, Android) [website]. This is the core value prop in one sentence. Tailscale is the standard way self-hosters expose services to their phones without opening ports, and the fact that the Tailscale + Conduit path works cleanly matters.
It’s fast and the offline history is real. “Incredibly responsive and fast. Offline chat history is a major win.” (Pixel Tablet, Android) [website]. Open-WebUI’s web interface requires a live connection to display your history. A native app can cache conversation lists locally.
It just works against a self-hosted server. “Worked out of the box with my self-hosted server. Exactly what I needed.” (SpikefishX, iOS) [website]. The bar here is low — a lot of “mobile clients for Open-WebUI” have been abandoned or incomplete — and Conduit clears it.
The fourth quote is the most useful for evaluating the project’s actual quality: “Easy companion app for Open-WebUI. The developer clearly focused on usability.” (ythefly, iOS) [website]. “Companion app” is the right framing. Conduit doesn’t try to be a standalone AI client or replace Open-WebUI — it extends it to your pocket.
Features
Based on the GitHub README:
Core chat:
- Real-time response streaming — tokens appear as they generate, not in a block at the end [README]
- Model selection — pick from whatever models your Open-WebUI instance has configured [README]
- Conversation management: create, search, browse history, organize into folders [README]
- Full markdown rendering with syntax highlighting [README]
- Light, dark, and system-matched themes [README]
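The streaming bullet above is worth unpacking. Open-WebUI exposes an OpenAI-compatible chat API, and streaming responses in that convention arrive as server-sent-event chunks containing token deltas. A minimal sketch of consuming such a stream — illustrative only, not Conduit’s actual Dart code, and using a simulated chunk list in place of a live HTTP response:

```python
import json

def stream_tokens(sse_lines):
    """Yield content tokens from OpenAI-style SSE chunks.

    `sse_lines` is any iterable of 'data: {...}' lines; a real client
    would read these from the HTTP response body as they arrive,
    rendering each token immediately instead of waiting for the end.
    """
    for line in sse_lines:
        if not line.startswith("data: "):
            continue
        payload = line[len("data: "):]
        if payload == "[DONE]":
            break
        delta = json.loads(payload)["choices"][0]["delta"]
        if "content" in delta:
            yield delta["content"]

# Simulated stream -- a live client would iterate the response instead.
chunks = [
    'data: {"choices": [{"delta": {"content": "Hel"}}]}',
    'data: {"choices": [{"delta": {"content": "lo"}}]}',
    "data: [DONE]",
]
print("".join(stream_tokens(chunks)))  # Hello
```

This is the difference users notice: tokens render as they generate rather than landing in one block at the end.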
Input modes:
- Voice input via speech-to-text — hands-free queries, transcribed before sending [README][website]
- File and image uploads for RAG — send documents into conversations [README]
- Multi-modal support for vision models [README]
- Share sheet integration — share content from other apps directly into a conversation [website]
Authentication (this is the depth feature):
This is where Conduit earns its keep. Most “connect to your self-hosted server” apps assume you’re hitting a plain HTTP or HTTPS endpoint with no auth. Conduit was clearly built by someone who actually runs self-hosted infrastructure:
- Username + password against Open-WebUI’s native login [README]
- SSO / OAuth via an in-app WebView — handles Google, Microsoft, GitHub, OIDC, and any other provider your Open-WebUI instance is configured for [README]
- Reverse proxy authentication — detects and handles Cloudflare Tunnel, oauth2-proxy, Authelia, Authentik, Pangolin, and similar setups. Conduit captures proxy session cookies from the native cookie store and includes them in subsequent API and WebSocket requests. No endpoint allowlisting required, no server-side changes needed [README]
- LDAP credentials [README]
- JWT token (paste directly) [README]
- Custom HTTP headers — add X-API-Key, Authorization, or any other header and Conduit will include it on all requests [README]
The auth options are displayed dynamically based on what your server actually exposes, rather than showing a fixed form [README]. This is table stakes for a tool aimed at real self-hosted deployments.
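The custom-header and proxy-cookie options boil down to one pattern: merge a user-configured header set and any captured session cookies into every outgoing request. A sketch of that pattern in Python — Conduit itself is Dart/Flutter, and all names below are hypothetical illustrations, not its real implementation:

```python
def outgoing_headers(base, custom_headers, proxy_cookies):
    """Build the header set for an API or WebSocket request:
    user-configured headers (e.g. X-API-Key) merged over defaults,
    plus any proxy session cookies captured during the login flow.
    Illustrative sketch only, not Conduit's actual code."""
    headers = dict(base)
    headers.update(custom_headers)  # user-configured values win
    if proxy_cookies:
        headers["Cookie"] = "; ".join(
            f"{name}={value}" for name, value in proxy_cookies.items()
        )
    return headers

# Hypothetical values for illustration.
headers = outgoing_headers(
    base={"Accept": "application/json"},
    custom_headers={"X-API-Key": "secret-123"},
    proxy_cookies={"authelia_session": "abc"},
)
print(headers["Cookie"])  # authelia_session=abc
```

The reason this matters: the proxy (Authelia, oauth2-proxy, etc.) rejects any request missing its session cookie, so the client has to replay it on every API and WebSocket call, not just the initial login.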
What it doesn’t add: Conduit is a client, not a model router. It doesn’t add RAG pipelines, agents, or model capabilities on top of Open-WebUI. If Open-WebUI doesn’t support something, Conduit can’t offer it.
Pricing: what this actually costs
Conduit itself is free — download from the App Store or Google Play, no paywall, no in-app purchases listed [website][README].
The cost is the backend it requires. Conduit is useless without a running Open-WebUI instance. Here’s what a realistic full-stack costs:
Running Open-WebUI (the backend):
- A Hetzner CX22 (2 vCPU, 4GB RAM): €4.35/month — handles Open-WebUI itself fine without GPU inference
- If you want local model inference on the server: you need significantly more RAM (16–32GB) and ideally a GPU, which pushes into $50–100+/month territory or requires a home server
- If you connect Open-WebUI to OpenAI or Anthropic APIs: you pay those providers per token, but you’re not paying for ChatGPT Plus ($20/month) or Claude Pro ($20/month) just for the chat interface
The actual savings comparison: The relevant SaaS comparison is not “Conduit vs. some SaaS alternative” — Conduit has no direct SaaS equivalent. The comparison is:
- ChatGPT Plus: $20/month for mobile access to GPT-4o, includes the ChatGPT iOS/Android app
- Claude Pro: $20/month for mobile access to Claude, includes the Claude iOS/Android app
- Open-WebUI + Conduit + OpenAI API: Open-WebUI on a ~$5/month VPS + Conduit (free) + OpenAI API pay-per-use. At typical light usage (a few hundred queries/month), API costs are often $2–5/month. Total: ~$7–10/month for comparable capability.
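The arithmetic above can be checked in a few lines. The figures are this article’s estimates, not measurements:

```python
# Monthly cost comparison using this article's own estimates.
chatgpt_plus = 20.0              # ChatGPT Plus or Claude Pro, $/month
vps = 5.0                        # small VPS running Open-WebUI
api_low, api_high = 2.0, 5.0     # light pay-per-use API spend
total_low, total_high = vps + api_low, vps + api_high
print(f"self-hosted: ${total_low:.0f}-${total_high:.0f}/mo "
      f"vs managed: ${chatgpt_plus:.0f}/mo")
```

Heavier API usage closes the gap, so the comparison only holds at the light-usage end.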
Where the savings compound: if you’re routing to a local model (Ollama on a home server you already own), the marginal cost of additional queries is zero. The ChatGPT and Claude apps charge you whether you use them or not.
Pricing data for Conduit-specific tiers: not applicable. The app is free, source is free, the only cost is your infrastructure.
Deployment reality check
“Deployment” for Conduit means two things: running the backend and installing the app.
The app install: Download from App Store or Google Play. Standard mobile app install. No configuration required before launch [website][README].
The backend (Open-WebUI): This is the real deployment. Conduit’s README states the requirement plainly: “A running Open-WebUI instance” [README]. Open-WebUI itself runs on Docker and has its own setup requirements: a Linux host, Docker, enough RAM to run the model you want. If you’re already running Open-WebUI, adding Conduit is just downloading the app.
If you’re not running Open-WebUI yet:
- Open-WebUI Docker setup: 30–60 minutes for someone comfortable with a terminal
- Getting Ollama + a model running alongside it: another 30 minutes
- Exposing it securely to your phone (Tailscale or Cloudflare Tunnel): 15–30 minutes
- Total realistic estimate for a technical user: half an afternoon
What Conduit adds to that: Once the backend is running, Conduit’s connection flow is: enter server URL, validate connectivity, authenticate. The reverse proxy detection is the part that saves time — if you’re behind Authelia or oauth2-proxy, Conduit handles the redirect flow without you configuring anything on the server side [README].
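The first step of that flow — validating a user-entered server URL — is roughly this shape. The exact checks Conduit performs are not documented in the sources used here; this is a plausible minimum, with hypothetical names:

```python
from urllib.parse import urlparse

def normalize_server_url(raw: str) -> str:
    """Validate and normalize a user-entered Open-WebUI server URL,
    defaulting to HTTPS when no scheme is given. A plausible sketch
    of the connect flow's first step, not Conduit's actual code."""
    url = raw.strip().rstrip("/")
    if "://" not in url:
        url = "https://" + url
    parts = urlparse(url)
    if parts.scheme not in ("http", "https") or not parts.netloc:
        raise ValueError(f"not a usable server URL: {raw!r}")
    return f"{parts.scheme}://{parts.netloc}{parts.path}"

print(normalize_server_url("chat.example.com/"))  # https://chat.example.com
```

After normalization, a real client would probe the endpoint for reachability and inspect what auth the server exposes before showing a login form.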
Potential friction points:
- You need to know your Open-WebUI instance’s URL and how your auth is configured
- If your VPN or tunnel goes down, Conduit loses its connection — this is a backend reliability problem, not a Conduit problem
- The app requires Android 6.0 (API 23)+ or iOS 12.0+ [README] — effectively every phone made in the last 7 years
What’s less clear: The website claims “Offline Ready — History and context available” [website]. The exact scope of offline behavior (can you browse history, or can you also continue typing against a cached context?) isn’t specified in the primary sources. One App Store reviewer specifically called out offline history as “a major win,” which suggests at minimum conversation browsing works without a connection [website].
Pros and cons
Pros
- Free, genuinely open source (GPL-3.0). No subscription, no limits, no API key for Conduit itself. The source is on GitHub and buildable from scratch [README].
- The authentication depth is real. Handling reverse proxy auth (Cloudflare Tunnel, oauth2-proxy, Authelia, Pangolin) natively, with automatic detection and no server-side changes, is the feature that makes this viable for real self-hosted setups [README]. Most mobile clients for similar tools skip this entirely.
- Built with Flutter — genuinely cross-platform. iOS and Android from one codebase, both available in their respective stores [README][website].
- Voice input and share sheet integration. Not available in the Open-WebUI PWA. These are native capabilities Conduit actually implements [README][website].
- Folder management and local conversation caching. Quality-of-life features for people who use their AI assistant frequently [README].
- App Store and Google Play availability. You don’t have to sideload or build from source for a standard install [website].
Cons
- GPL-3.0, not MIT or Apache. If you want to embed Conduit in a commercial product or fork it into a branded client for your business, the GPL license creates obligations. Not a problem for personal use; potentially a problem if you want to build on top of it [merged profile].
- Solo developer risk. At 1,169 stars, this is a small project. The website has no team page and the README lists no org backing. If the developer moves on, the project likely stalls. The GPL license means anyone can fork it, but small forks rarely gain traction.
- No standalone value. Conduit is entirely dependent on Open-WebUI. If Open-WebUI’s API changes in a breaking way, Conduit breaks until it’s updated. If Open-WebUI is down, Conduit shows you nothing.
- No third-party reviews exist yet. The absence of coverage from tech publications is a signal about how new and niche this tool is. The reviews that do exist are four store quotes on a website — not enough to triangulate reliability problems or long-term behavior [website].
- The star count is modest. 1,169 GitHub stars [merged profile] suggests a small user base. That’s not a quality verdict — most self-hosted mobile clients are niche by definition — but it does mean bug discovery and community-driven fixes are slower.
Who should use this / who shouldn’t
Use Conduit if:
- You’re already running Open-WebUI on a server and accessing it via a mobile browser or PWA, and the experience annoys you.
- You access your Open-WebUI through Cloudflare Tunnel, Tailscale, oauth2-proxy, or another auth proxy — Conduit handles this better than any browser-based approach.
- You want voice input and native share sheet on your phone without opening your AI stack to a third-party cloud service.
- You’re comfortable with the idea of connecting a mobile app to your own infrastructure.
Skip it if:
- You haven’t set up Open-WebUI yet and have no plans to. This app does nothing without the backend.
- You’re evaluating AI tools and want a beginner-friendly path. Open-WebUI + Conduit is not that — start with a managed service and come back to self-hosting once you know what you actually want.
- You want a fully offline, no-server AI experience on mobile. Conduit requires a live backend.
- The GPL license is a problem for your use case.
Consider the alternatives instead if:
- You want to self-host but don’t want to run Open-WebUI specifically — some alternatives ship their own mobile access.
- You want ChatGPT-quality hosted models without managing infrastructure — the official apps are polished and Conduit is not competing with them.
Alternatives worth considering
Open-WebUI mobile browser / PWA: The zero-install option. It works, requires no additional software, and needs no app store. Trade-offs: no Siri integration, no share sheet, inconsistent PWA reliability on iOS, no native keychain integration.
Open-WebUI itself, with a different frontend: If your complaint is about mobile access specifically, consider whether a self-hosted Nginx or Caddy reverse proxy with proper HTTPS gets you close enough via mobile browser to skip the native app entirely.
AnythingLLM: Competing self-hosted AI chat platform with its own architecture. Doesn’t have an official native mobile app as of this writing; Conduit only targets Open-WebUI.
Ollama with a dedicated app: For iOS, there are several Ollama-native clients (not Open-WebUI clients). If you want to talk directly to local models without Open-WebUI’s middleware layer, that’s a different stack entirely.
ChatGPT app / Claude app: The managed service comparison. You pay $20/month, you get a polished native app with no server to maintain. Conduit + Open-WebUI beats this on cost and data ownership; loses on polish and zero-maintenance operation.
Bottom line
Conduit solves a specific, real problem: Open-WebUI’s mobile experience is mediocre, and fixing it by building a proper native client is the correct approach. The authentication depth — handling reverse proxies, Cloudflare Tunnel, oauth2-proxy, and LDAP without requiring server-side configuration — is the feature that separates it from the graveyard of abandoned “Open-WebUI mobile clients” on GitHub. For someone already running Open-WebUI through Tailscale or a similar setup, Conduit probably just works, and the four App Store and Play Store quotes on the website reinforce that.
The honest caveats are equally specific: this is a small project with a small community, GPL-licensed, with no org backing, and no third-party coverage. It’s not the kind of tool you build critical workflow automation on. It’s the kind of tool you install on a Saturday afternoon, point at your server, and either delete because it doesn’t work with your setup or keep because it does. At zero cost and five minutes to install, the risk-reward math is obvious.
Sources
Primary sources used in this article:
- Conduit GitHub Repository and README — https://github.com/cogwheel0/conduit (GPL-3.0, 1,169 stars)
- Conduit Official Website — http://conduit.cogwheel.app (product description, feature list, workflow documentation)
- Conduit Website — User Feedback Section (App Store and Google Play review quotes) — http://conduit.cogwheel.app
Note: The third-party article URLs provided for this review (relay.fm/conduit, cafleurebon.com, storagecafe.com, academy.fpblock.com, conduitpay.com) contain no coverage of Conduit the Open-WebUI client. They reference an unrelated podcast, a perfume, a storage facility, a Haskell streaming library, and a stablecoin payments company respectively. No independent tech reviews of this product were available at the time of writing.