
OpenBerth

AI-assistant-native, self-hosted deployment platform

OpenBerth is a self-hosted deployment platform that AI assistants can use natively. Tell Claude, your agent, or any AI tool on your phone or desktop to deploy an app, and it does. Secrets are stored and used securely: the AI references them by name without ever seeing the values. Works from the CLI, an AI chat, or a single file drop. Your server, your rules. Connect any AI tool to the MCP server and start building.
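The "secrets by name" idea can be sketched in a few lines: the assistant only ever sees secret names, and the platform substitutes real values server-side at deploy time. This is a minimal illustration of the pattern, not OpenBerth's actual API; the class, the `{{secret:NAME}}` placeholder syntax, and all names here are assumptions for the sketch.

```python
import re

class SecretStore:
    """Holds secret values; anything outside deploy time sees only names."""
    def __init__(self):
        self._values = {}

    def put(self, name, value):
        self._values[name] = value

    def names(self):
        # This list is all an AI assistant would be shown.
        return sorted(self._values)

    def resolve(self, template):
        # Replace {{secret:NAME}} placeholders with real values,
        # which happens only server-side at deploy time.
        return re.sub(
            r"\{\{secret:(\w+)\}\}",
            lambda m: self._values[m.group(1)],
            template,
        )

store = SecretStore()
store.put("STRIPE_KEY", "sk_live_abc123")

config = "STRIPE_API_KEY={{secret:STRIPE_KEY}}"
print(store.names())          # ['STRIPE_KEY'] — names only
print(store.resolve(config))  # value injected at deploy time
```

The design point is the indirection: the model can pick the right secret by name or description without the value ever entering the conversation.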

Top comment

I deployed an app from my phone last week, not by SSHing anywhere, but by telling Claude "build me a feedback form and deploy it with my database credentials." Thirty seconds later I had a live HTTPS URL. That moment made me realize this thing is ready to share.

**The backstory:** I got tired of paying platform taxes on side projects and internal tools. Vercel and Railway are great, but I have servers sitting there. I just wanted code in → URL out. No YAML, no Docker Compose, no 47-step CI pipeline, and even my non-technical friends able to use the result instead of me sharing localhost:3000 ;) So I built OpenBerth. One binary on any Linux server. That's the whole setup.

**What surprised me** was how much better it got once I connected it to AI. Claude (on my phone, desktop, or Cursor) can:

→ Write code and deploy it in the same conversation
→ Look up my encrypted secrets by description ("Stripe key", "production database") and use them without ever seeing the values
→ Push live updates with hot reload, no rebuild wait
→ Pull logs and debug issues right in the chat

I basically have a DevOps engineer in my pocket now.

**It's not AI-only though.** `berth deploy` from your terminal works exactly how you'd expect. Drop in a React component, a Python app, a Go service, even a Jupyter notebook: auto-detected, sandboxed, HTTPS, done.

Every deployment shows up in a **built-in app gallery**, a dashboard of everything you and your team/friends/family have deployed, all in one place. Click through to any app, check its status, or share it. It's like your own private app store.

Each app also gets **persistent storage** out of the box: a built-in key-value data API that survives redeploys. Your app can store and retrieve data without setting up a separate database. Great for prototypes, internal tools, or anything that needs to remember state.

Fully open source, self-hosted, no usage limits, no surprise bills.
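To make the persistent-storage idea concrete: state lives outside the app process, so a fresh instance after a redeploy still sees it. This stand-in uses a JSON file in place of the platform's key-value API; the `AppKV` class and its methods are illustrative assumptions, not OpenBerth's actual interface.

```python
import json
import os
import tempfile

class AppKV:
    """Toy key-value store backed by a file, standing in for a platform API."""
    def __init__(self, path):
        self.path = path

    def set(self, key, value):
        data = self._load()
        data[key] = value
        with open(self.path, "w") as f:
            json.dump(data, f)

    def get(self, key, default=None):
        return self._load().get(key, default)

    def _load(self):
        if not os.path.exists(self.path):
            return {}
        with open(self.path) as f:
            return json.load(f)

path = os.path.join(tempfile.gettempdir(), "openberth_kv_demo.json")

# First "deploy": the app writes some state.
AppKV(path).set("visit_count", 41)

# "Redeploy": a brand-new instance still sees the data and updates it.
kv = AppKV(path)
kv.set("visit_count", kv.get("visit_count", 0) + 1)
print(kv.get("visit_count"))  # 42
```

The point is that the counter survives the process being torn down and recreated, which is exactly what an app needs across redeploys.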
**One more thing** that I think matters: OpenBerth doesn't lock you into any specific AI tool. Today I use Claude and Cursor. Tomorrow a better model comes out? Just point it at my server. My infrastructure, my secrets, my deployments all stay. The AI is interchangeable; the platform is yours.

I think this is where things are heading. AI coding tools will keep getting better and cheaper. The last thing you want is your production apps trapped inside a platform that picked one model for you. Own your deployment layer, swap the brains whenever you want.

I've been using this daily for months and I'm genuinely curious: if you had an AI that could deploy to YOUR server, what's the first thing you'd build?