
TensorZero

Open-source stack for industrial-grade LLM applications

Build industrial-grade LLM applications: one API for every LLM, observability, optimization (prompts, models, etc.), evaluations, and A/B testing — all open source. Turn metrics and human feedback into smarter, faster, and cheaper LLMs. Get started in minutes.

Top comment

Hi Product Hunt - we're the team behind TensorZero, an open-source LLM infrastructure project.

What is TensorZero?

TensorZero is an open-source stack for industrial-grade LLM applications:

  • Gateway: access every LLM provider through a unified API, built for performance (<1ms p99 latency)

  • Observability: store inferences and feedback in your database, available programmatically or in the UI

  • Optimization: collect metrics and human feedback to optimize prompts, models, and inference strategies

  • Evaluation: benchmark individual inferences or end-to-end workflows using heuristics, LLM judges, etc.

  • Experimentation: ship with confidence with built-in A/B testing, routing, fallbacks, retries, etc.

Take what you need, adopt incrementally, and complement it with other tools.

https://github.com/tensorzero/tensorzero
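To make the experimentation bullet concrete, here is a minimal sketch of the fallback-and-retry pattern it describes: try one provider, retry transient failures, then fall back to the next. This is an illustration of the concept only, not TensorZero's actual API; the provider functions are hypothetical stubs.

```python
# Hypothetical provider stubs -- stand-ins for real LLM provider calls.
def call_openai(prompt: str) -> str:
    raise TimeoutError("provider unavailable")  # simulate an outage

def call_anthropic(prompt: str) -> str:
    return f"answer to: {prompt}"

def infer_with_fallback(prompt, providers, retries=2):
    """Try each provider in order, retrying transient failures
    before falling back to the next provider in the list."""
    last_error = None
    for provider in providers:
        for _ in range(retries):
            try:
                return provider(prompt)
            except TimeoutError as exc:
                last_error = exc
    raise RuntimeError("all providers failed") from last_error

# The first provider fails twice, so the call falls through to the second.
result = infer_with_fallback("hello", [call_openai, call_anthropic])
print(result)  # answer to: hello
```

A gateway that sits between your application and the providers can apply this policy once, centrally, instead of every caller reimplementing it.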

Why should you use a tool like this?

Over time, these components enable you to set up a principled feedback loop for your LLM application. The data you collect is tied to your KPIs, ports across model providers, and compounds into a competitive advantage for your business.
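The feedback-loop idea above can be sketched in a few lines: log each inference with the prompt variant that produced it, attach a KPI-linked feedback signal, and aggregate to see which variant performs best. All field names here are illustrative, not TensorZero's actual schema.

```python
from collections import defaultdict

# Illustrative inference log: each record notes which prompt variant ran.
inferences = [
    {"id": 1, "variant": "prompt_a"},
    {"id": 2, "variant": "prompt_b"},
    {"id": 3, "variant": "prompt_a"},
]
# KPI-linked feedback, e.g. "did the user accept the generated draft?"
feedback = {1: True, 2: False, 3: True}

def success_rate_by_variant(inferences, feedback):
    """Aggregate feedback per prompt variant to compare performance."""
    wins, totals = defaultdict(int), defaultdict(int)
    for inf in inferences:
        totals[inf["variant"]] += 1
        wins[inf["variant"]] += int(feedback.get(inf["id"], False))
    return {v: wins[v] / totals[v] for v in totals}

print(success_rate_by_variant(inferences, feedback))
# {'prompt_a': 1.0, 'prompt_b': 0.0}
```

Because the metric is defined by your business (not the provider), the same dataset keeps its value even when you switch models.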

We hope TensorZero will be useful to many of you Hunters!

How is TensorZero different from other tools?

1. TensorZero enables you to optimize complex LLM applications based on production metrics and human feedback.

2. TensorZero supports the needs of industrial-grade LLM applications: low latency (thanks to Rust 🦀), high throughput, type safety, self-hosted, GitOps, customizability, etc.

3. TensorZero unifies the entire LLMOps stack, creating compounding benefits. For example, LLM evaluations can be used for fine-tuning models alongside AI judges.

And it's all open source!
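Point 3 above can be illustrated with a small sketch: evaluation results scored by an AI judge can double as a curated fine-tuning dataset. This is a conceptual illustration, not TensorZero's API; the record fields and threshold are assumptions.

```python
# Illustrative evaluation records with a judge score per inference.
records = [
    {"prompt": "p1", "output": "o1", "judge_score": 0.92},
    {"prompt": "p2", "output": "o2", "judge_score": 0.41},
    {"prompt": "p3", "output": "o3", "judge_score": 0.85},
]

def build_finetune_set(records, threshold=0.8):
    """Keep only judge-approved pairs, in a typical
    (prompt, completion) shape for fine-tuning."""
    return [
        {"prompt": r["prompt"], "completion": r["output"]}
        for r in records
        if r["judge_score"] >= threshold
    ]

dataset = build_finetune_set(records)
print(len(dataset))  # 2
```

This is the compounding effect: data collected for one component (evaluations) directly improves another (optimization) without a separate pipeline.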

How much does TensorZero cost?

Nothing. TensorZero is 100% self-hosted and open-source (Apache 2.0). There are no paid features.

("But really, how do you plan to make money?" PH sneak peek: next year, we're planning to launch an optional, complementary paid service focused on automated LLM optimization, abstracting away all the GPUs needed to handle that. The developer tool we're working on today will remain open source.)

How can I help?

We'd love to get your feedback: features you like, features that are missing, anything confusing in the docs, etc.

TensorZero is 100% open source, so feedback from the builder community helps us prioritize the roadmap, improve the developer experience, fill any gaps in the docs, and so on.

Thank you! Please let us know if you have any questions or feedback.