Torrix
Self-hosted LLM observability. Every token. Every dollar.
Most LLM observability tools send your prompts to their cloud. Torrix runs on your server. Add 2 lines of Python, or route any HTTP client through the proxy. No code changes needed. Every AI call is logged instantly: tokens, cost, latency, and the full prompt trace. Works with OpenAI, Anthropic, Gemini, Groq, Azure, Mistral, SAP AI Core, n8n, and any HTTP API. Community edition is free forever. Your data never leaves your infrastructure.
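The "no code changes" claim rests on a standard forward-proxy pattern: point an HTTP client at the local proxy and every request flows through it. A minimal sketch with Python's stdlib, assuming the proxy listens on localhost:8080 (the port is an assumption — check the Torrix docs for the real address):

```python
import urllib.request

# Route all stdlib HTTP traffic through a local forward proxy.
# "localhost:8080" is an assumed address, not confirmed by Torrix docs.
proxy = urllib.request.ProxyHandler({
    "http": "http://localhost:8080",
    "https": "http://localhost:8080",
})
opener = urllib.request.build_opener(proxy)

# After this, every urllib.request.urlopen() call in the process
# goes via the proxy -- no changes at the call sites themselves.
urllib.request.install_opener(opener)
```

Most HTTP libraries (requests, httpx) honor the equivalent `HTTP_PROXY`/`HTTPS_PROXY` environment variables, which is how a proxy-based tool can observe traffic without any application code changes.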
I am Adarsh, a developer and integration consultant at SAP, building Torrix as a side project.
The frustration: every LLM observability tool I tried wanted to store my prompts on their servers. For enterprise work, that's a non-starter.
So I built Torrix, a self-hosted proxy that logs every AI call locally in SQLite. Two lines of Python, or zero code changes via the HTTP proxy. Supports 300+ LLM models. Deploys in 60 seconds with Docker. Free forever.
Would love your honest feedback, especially from anyone running LLMs in production. What's missing? What would make this a daily driver for you?
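To make the "logs every AI call locally in SQLite" idea concrete, here is a toy sketch of per-call logging with cost computed from token counts. The table name, columns, `log_call` helper, and prices are all hypothetical illustrations, not the actual Torrix schema or pricing:

```python
import sqlite3
import time

# Hypothetical schema sketch -- one row per AI call.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE calls (
        ts REAL, model TEXT,
        prompt_tokens INTEGER, completion_tokens INTEGER,
        cost_usd REAL, latency_ms REAL, prompt TEXT
    )
""")

def log_call(model, prompt, prompt_tokens, completion_tokens,
             price_per_1k_in, price_per_1k_out, latency_ms):
    """Record one AI call; cost is derived from token counts and
    per-1k-token prices (illustrative numbers, not real rates)."""
    cost = (prompt_tokens * price_per_1k_in +
            completion_tokens * price_per_1k_out) / 1000
    conn.execute(
        "INSERT INTO calls VALUES (?, ?, ?, ?, ?, ?, ?)",
        (time.time(), model, prompt_tokens, completion_tokens,
         cost, latency_ms, prompt),
    )
    return cost

# 120 prompt + 30 completion tokens at example rates.
cost = log_call("gpt-4o", "hello", 120, 30, 0.005, 0.015, 412.0)
```

A local SQLite file keeps the whole trace queryable with plain SQL while guaranteeing the data never leaves the machine.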
About Torrix on Product Hunt
“Self-hosted LLM observability. Every token. Every dollar.”
Torrix was submitted on Product Hunt and earned 3 upvotes and 1 comment, placing #110 on the daily leaderboard.
Torrix was featured in Open Source (68.4k followers), Developer Tools (511.7k followers), Artificial Intelligence (467.3k followers) and GitHub (41.2k followers) on Product Hunt. Together, these topics include over 188.4k products, making this a competitive space to launch in.
Who hunted Torrix?
Torrix was hunted by Adarsh Rao. A “hunter” on Product Hunt is the community member who submits a product to the platform — uploading the images, the link, and tagging the makers behind it. Hunters typically write the first comment explaining why a product is worth attention, and their followers are notified the moment they post. Around 79% of featured launches on Product Hunt are self-hunted by their makers, but a well-known hunter still acts as a signal of quality to the rest of the community. See the full all-time top hunters leaderboard to discover who is shaping the Product Hunt ecosystem.
Want to see how Torrix stacked up against nearby launches in real time? Check out the live launch dashboard for upvote speed charts, proximity comparisons, and more analytics.
Try it: https://torrix.ai