I recently built my own AI chat platform that combines the power of Open WebUI with n8n workflow automation, all securely accessible through Cloudflare Tunnel. What started as a quest for more control over my AI interactions has evolved into a personalized command center that lets me chat with my automations in a seamless, secure environment. The best part? I didn’t have to compromise on flexibility, security, or user experience—because I built it exactly the way I needed it.
Why I Needed More Control: The Motivation Behind Building My Chat Platform
My journey toward building a custom AI chat hub started with a growing frustration. I was already using n8n for various automations, but the interaction experience was purely mechanical—trigger a workflow, wait for results, rinse, repeat. What I really wanted was a more conversational way to interact with these automations, something that felt natural and integrated.
At the same time, I was using several different AI chat interfaces, each with its own limitations. Some lacked persistence between sessions, others didn’t give me control over my data, and none of them integrated well with my existing tools. The commercial options either cost too much for my needs or didn’t offer the customization I wanted.
I realized what I needed was a centralized, self-hosted chat platform that could not only connect to commercial AI models but also—and this was crucial—deeply integrate with my n8n workflows. I wanted to be able to simply chat with my automations and have them respond intelligently. Plus, I wanted to lay the groundwork for eventually hosting my own local AI models to reduce dependency on external services.
This wasn’t just about building something cool—it was about creating a practical tool that would make my daily workflows more efficient and give me complete control over my AI interactions.
Choosing My Stack: The Components and Their Roles
After researching various options, I settled on three core components for my custom chat platform:
- Open WebUI: I chose this as my chat interface because it offers a clean, modern UI similar to commercial chat platforms but with significantly more flexibility. It supports multiple AI models, has built-in function calling capabilities, and is actively developed. Compared to alternatives like text-generation-webui or LM Studio’s basic interface, Open WebUI offered the right balance of polish and extensibility I needed.
- n8n: I was already using n8n for automation, and its webhook capabilities made it perfect for integration. Rather than adding another automation tool, I wanted to leverage what I already had and make it accessible through a conversational interface. The ability to design complex workflows visually and then trigger them through chat messages was exactly what I needed.
- Cloudflare Tunnel: For secure access, I needed something that wouldn’t require opening ports on my router or dealing with dynamic IP issues. Cloudflare Tunnel provided a secure way to expose my self-hosted services to the internet with automatic TLS encryption and without the security risks of port forwarding. Compared to alternatives like Tailscale or self-managed reverse proxies, Cloudflare offered the best combination of simplicity and security.
These choices weren’t arbitrary—each addressed a specific requirement while ensuring the overall system would be maintainable and future-proof. I wanted components that would work well together now but also support my plans to eventually add local AI models and more complex integrations.
Architectural Foundation: Designing for Maintainability with Docker Compose
I knew from past projects that a clean, well-organized infrastructure would save me countless headaches down the road. Docker Compose was the obvious choice for orchestrating the various services, allowing me to define the entire stack as code and make it easily reproducible.
I designed the system around a shared Docker network called “ai-net” that allows all containers to communicate while remaining isolated from other services on my server. This network architecture is simple but effective—services can find each other by container name, eliminating the need for complex service discovery.
For data persistence, I set up several named volumes:
- A volume for Open WebUI to store settings, conversation history, and user accounts
- Another for Cloudflare Tunnel credentials to persist across restarts
- A dedicated “ollama” volume, provisioned for future local AI model storage
Initially, my Compose file was cluttered with unnecessary options and variables, but I iteratively simplified it. I learned that hard-coding specific image tags rather than using “latest” prevented unexpected updates from breaking my setup. I also removed services I wasn’t actively using to keep the system lean and focused.
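To make the layout above concrete, here is a minimal Compose sketch of the kind of structure described — one shared network, named volumes, pinned tags, and secrets injected at deploy time. The image tag, volume names, and data path are illustrative, not my exact file:

```yaml
# docker-compose.yml -- illustrative sketch, not the exact production file
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:0.5.4   # pin a specific tag, never "latest"
    networks:
      - ai-net
    volumes:
      - open-webui-data:/app/backend/data        # settings, chat history, user accounts
    environment:
      - WEBUI_SECRET_KEY=${WEBUI_SECRET_KEY}     # injected at deploy time, not hard-coded

networks:
  ai-net:
    driver: bridge    # containers on this network resolve each other by service name

volumes:
  open-webui-data:
  cloudflared-data:   # tunnel credentials persist across restarts
  ollama:             # provisioned for future local model storage
```

The payoff of the shared network is that later services (n8n, Ollama, cloudflared) just declare `networks: [ai-net]` and immediately become reachable by name, with no port publishing between containers.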
This approach to infrastructure design reflects a pragmatic philosophy: build for tomorrow, but don’t over-engineer today. The foundation is solid enough to support future expansion but simple enough to maintain without becoming a second job.
Integration Breakthrough: Connecting n8n and Open WebUI for Seamless Automation
The real magic of this project emerged when I figured out how to make n8n a first-class citizen in the chat interface. This wasn’t just about launching workflows—I wanted them to feel like natural extensions of the conversation.
Open WebUI has a Functions feature that lets you plug custom Python code directly into the chat pipeline. I leveraged this interface to create a custom “Pipe” class that would:
- Listen for specific function calls in chat messages
- Forward these requests to n8n via webhook endpoints
- Process n8n’s response and present it back in the chat
The elegance of this solution is that it required no additional middleware or custom servers. When I type something like “What’s on my calendar today?” in the chat, my function detects the intent, calls the appropriate n8n workflow, and returns the calendar entries directly in the conversation—all seamlessly integrated into the chat experience.
This integration transformed n8n from a behind-the-scenes automation tool to an interactive assistant I could converse with. I no longer need to switch contexts or remember specific trigger phrases—I can simply chat naturally and let the system figure out which automation to run.
This breakthrough was particularly satisfying because it created a powerful synergy between two tools that weren’t originally designed to work together. It’s a perfect example of how custom integration can create something greater than the sum of its parts.
Securing Access: Implementing Cloudflare Tunnel for Safe, Public Connectivity
Security was non-negotiable for this project. I needed to access my chat platform from anywhere without compromising my home network. Cloudflare Tunnel provided the perfect solution.
The setup process was straightforward but required attention to detail:
- Created a new tunnel in the Cloudflare Zero Trust dashboard
- Added the cloudflared service to my Docker Compose file, being careful to use the “tunnel run” command with the “--no-autoupdate” flag (a crucial detail for container stability)
- Created a CNAME record pointing “my.domain.ai” to my tunnel
- Configured the Service URL to point to “http://open-webui:8080” (using the internal Docker network hostname)
What makes this approach superior to alternatives is that it requires no open ports on my router, automatically handles TLS certificate renewal, and provides a layer of protection through Cloudflare’s edge network.
I also used Portainer to inject sensitive environment variables like my WEBUI_SECRET_KEY, ensuring session security without hardcoding secrets in configuration files. This approach to managing secrets is something I’ve learned through experience—convenience is never worth compromising security.
Navigating DNS and Proxying: Ensuring Reliable Service Discovery
DNS configuration proved to be more nuanced than I initially expected. Getting everything working smoothly required careful adjustment of records and understanding how Cloudflare’s proxy interacts with tunneled services.
I created a CNAME record for my.domain.ai pointing to my tunnel’s domain, ensured my apex domain was properly configured, and cleaned up some extraneous GoDaddy NS records that were causing conflicts. These steps were crucial for achieving reliable routing to my self-hosted service.
What I learned is that DNS is rarely “set it and forget it”—it requires thoughtful configuration and occasional maintenance. Removing unnecessary records was just as important as adding the right ones, a lesson in minimalism that applies to many aspects of system administration.
The attention to detail paid off with a rock-solid DNS setup that reliably routes traffic to my application, regardless of my home IP address changes or ISP issues.
Insights from the Trenches: Major Challenges and Lessons Learned
Building this system wasn’t without its challenges, each providing valuable lessons for future projects:
- Docker Compose Nuances: I spent hours troubleshooting issues caused by empty tags and whitespace in environment variables. The lesson was clear—in configuration files, precision matters as much as correctness. Now I’m meticulously careful with syntax in all my infrastructure-as-code.
- Cloudflared Container Behavior: The Docker container for cloudflared behaves differently than the Windows installer version. I initially tried using the same commands I’d used on Windows, only to discover that the container needed the native “tunnel run” command instead. This taught me to always consult container-specific documentation rather than assuming command compatibility across platforms.
- Cloudflare Error 1033: I initially configured my Service URL to point to the public hostname rather than the internal container name, resulting in a redirect loop. This reinforced the importance of understanding service discovery in proxied setups—inside the container network, services should reference each other by their Docker hostnames, not public URLs.
- Cloudflare Error 1000: DNS conflicts between my CNAME and proxy settings led to mysterious “DNS points to prohibited IP” errors. Resolving this required ensuring DNS purity—removing conflicting records and verifying correct delegation. I now approach DNS changes with careful planning and incremental verification.
Each of these challenges taught me something valuable about complex system integration. The most important meta-lesson was patience—sometimes stepping away from a problem and returning with fresh eyes reveals solutions that were invisible before.
The Impact: Capabilities Delivered by My Custom Chat Hub
The completed system exceeded my expectations, delivering a powerful platform that transformed how I interact with my automations:
I now have a secure, publicly accessible chat interface at https://my.domain.ai, protected by TLS and with no exposed ports on my home network. My persistent admin account means I never lose conversation history or need to reconfigure preferences.
The seamless n8n workflow integration has been transformative for productivity. I can now trigger complex automations through natural conversation, get responses directly in the chat, and even have multi-turn interactions with my workflows. Tasks that used to require multiple steps now happen in a single conversational flow.
Perhaps most valuable is the flexibility—I can easily add new integrations, adjust how my automations respond, and adapt the entire system to my evolving needs. This level of control simply isn’t possible with commercial platforms.
The satisfaction of building something perfectly tailored to my requirements can’t be overstated. Every time I use the system to accomplish something that would have been cumbersome before, I’m reminded of why custom solutions are worth the investment of time and effort.
What’s Next: Future Plans for My AI Chat Platform
This project is far from finished—it’s a foundation for ongoing enhancement. My roadmap includes:
- Activating local model hosting using the Ollama integration (the volume is already provisioned for this)
- Integrating additional external LLM APIs (Claude, Gemini, OpenAI) via Open WebUI’s built-in connectors
- Implementing more granular Cloudflare Access policies for enhanced security
- Adding Watchtower for automated container updates to reduce maintenance overhead
- Integrating vector databases to enable more context-aware responses and document retrieval
What excites me most is the potential for local AI models. As these continue to improve in capability while reducing their resource requirements, having the infrastructure ready to deploy them is a strategic advantage. I’m particularly interested in fine-tuning models specifically for interacting with my workflows.
The modular design of the system means I can implement these enhancements incrementally, prioritizing based on what delivers the most value at any given time.
Final Thoughts: The Value of Building Your Own Integration Hub
This project reinforced my belief in the power of custom, self-hosted solutions. While commercial platforms offer convenience, nothing matches the flexibility, control, and satisfaction of building something that works exactly the way you need it to.
The strategic advantages are clear: deep integration between services like n8n and chat interfaces, complete control over the user experience, and the ability to create truly personalized application frontends. These benefits go beyond technical curiosity—they translate to real productivity gains and reduced friction in daily workflows.
There’s also an intangible benefit to understanding every component of a system you’ve built yourself. When issues arise (and they always do), troubleshooting is faster and more effective because there’s no black box to reverse-engineer.
For anyone considering a similar project, my advice is simple: start small, iterate often, and don’t be afraid to refactor when needed. The journey of building is as valuable as the final product, providing insights and skills that transfer to countless other technical challenges.
Community Connection: Your Thoughts and Experiences?
Have you built your own custom integration hub or self-hosted AI solution? I’d love to hear about your experiences, challenges, and creative solutions. Or if you’re considering a similar project, what questions do you have about the approach I’ve taken?
The community of builders creates tremendous value through shared knowledge and experiences. Whether it’s suggesting improvements to my setup or sharing your own journey, your insights could help others on similar paths.