Langfuse 2.0

An open source LLM engineering platform

Open Source
Developer Tools
Artificial Intelligence

Langfuse is the open source LLM Engineering Platform. It provides observability, tracing, evaluations, prompt management, a playground, and metrics to debug and improve LLM apps. Langfuse is open: it works with any model or framework, and you can export all of your data.

Top comment

Hi Product Hunt 👋👋👋 I’m Clemens, co-founder of Langfuse (with @marc_klingen and @max_deichmann). We are so excited to be live on Product Hunt again today, launching Langfuse 2.0!

We originally launched Langfuse right here in late August 2023 and made it to #1 Product of the Day. Earlier this year we were lucky enough to be awarded a Golden Kitty in the AI Infra category for 2023! Product Hunt holds a special place in our hearts and company history.

Langfuse has come a long way. Last year, we set out to build open source observability and analytics for LLM apps. We saw a gap for a dev-focused, open source, and self-hostable product that allows for complex tracing and nesting of LLM and other workloads.

We are excited to introduce Langfuse 2.0 today. We have moved from an observability tool to a platform for LLM engineering. The scope is now wider, and the core product has matured so much since last year. A few things you should know about:

👣 Tracing: TS & Python SDKs (new: decorator) + OpenAI, LlamaIndex, LangChain, LiteLLM and more
🤨 Eval Service: Automatically run evals against all new incoming traces
💾 Datasets: Collect and run fine-tuning, testing, and golden datasets in Langfuse
💯 Metrics: Dashboards and analytics on cost, latency, and quality
🤖 Prompt Management: Version and deploy prompts from within Langfuse
🕹️ LLM Playground: Engineer your prompts with context in Langfuse
🏎️ Export & Fine-Tune: Open GET API and CSV/JSON exports to build downstream use cases
🚄 Scale: We’ve invested significantly in scaling and resilience as we’ve grown to thousands of users and handle millions of events a day
🧑‍🤝‍🧑 Community: Join our thousands of users on GitHub Discussions and Discord

See for yourself:
⭐ GitHub: https://github.com/langfuse/lang...
⏯️ Demo (includes data, no credit card): https://langfuse.com/demo
📚 Docs: https://langfuse.com/docs

We’re excited to launch the all-new Langfuse on Product Hunt today. We will be in the comments the whole day and can’t wait to hear your thoughts & feedback!
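For anyone wondering what the new decorator-based tracing mentioned above looks like in code, here is a minimal sketch based on the Langfuse Python SDK; the function names, model, and prompt are illustrative, and it assumes the standard LANGFUSE_PUBLIC_KEY / LANGFUSE_SECRET_KEY environment variables are set.

```python
# Minimal sketch of decorator-based tracing with the Langfuse Python SDK.
# Assumes LANGFUSE_PUBLIC_KEY, LANGFUSE_SECRET_KEY (and optionally LANGFUSE_HOST)
# are set in the environment; function names and prompts are illustrative.
from langfuse.decorators import observe
from langfuse.openai import openai  # drop-in wrapper that traces OpenAI calls


@observe()  # creates a nested span for this function inside the current trace
def summarize(text: str) -> str:
    response = openai.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": f"Summarize in one sentence: {text}"}],
    )
    return response.choices[0].message.content


@observe()  # the outermost decorated function becomes the root of the trace
def handle_request(user_input: str) -> str:
    return summarize(user_input)


if __name__ == "__main__":
    print(handle_request("Langfuse is an open source LLM engineering platform."))
```

Nested calls show up as spans within a single trace, which is what powers the cost and latency breakdowns in the dashboards.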

Comment highlights

This is very cool! Will share it with my team at QueryPal (https://www.querypal.com). As we start benchmarking and evaluating, we are currently looking into WandB.

Hey guys! Congrats! Can you explain the significance of the LLM Playground within Langfuse?

We've been using Langfuse since v1 — and have been loving it! The pace of improvements is crazy. Also highly recommend their Posthog integration to keep all your analytics — including LLMs — in one place.

Langfuse is a production-ready application for LLM observability and LLMOps. Thank you for this awesome open source application!

@lan An open-source LLM (Language Model) engineering platform is a dynamic digital infrastructure that revolutionizes how language models are developed, customized, and deployed within engineering contexts. By embodying the principles of open-source philosophy, these platforms empower developers, researchers, and organizations to collaborate, innovate, and democratize access to advanced language technologies.

At its core, an open-source LLM engineering platform serves as a versatile ecosystem for experimenting with language models, leveraging machine learning algorithms to comprehend, generate, and process natural language text. It provides a collaborative environment where users can access pre-trained models, datasets, and tools, facilitating rapid prototyping, fine-tuning, and evaluation of language models for diverse applications.

Key features of such platforms include a centralized model repository, enabling easy access to state-of-the-art models and datasets, along with customization tools for tailoring models to specific tasks and domains. Deployment infrastructure ensures seamless integration of language models into production environments, whether through cloud-based solutions or on-premises deployments. Collaboration frameworks foster community-driven development, version control, and knowledge sharing, while robust evaluation and monitoring capabilities enable users to assess model performance and track usage metrics.

Moreover, an open-source LLM engineering platform encourages transparency, accountability, and inclusivity in language model development, addressing concerns related to bias, fairness, and ethical considerations. By promoting open access to tools, resources, and best practices, these platforms foster innovation, accelerate research, and drive positive societal impact across various domains, from natural language processing and software development to healthcare, finance, and beyond.

In essence, an open-source LLM engineering platform embodies the ethos of collaboration, democratization, and responsible innovation, empowering individuals and organizations to harness the power of language models for positive change in an increasingly interconnected world. https://answertenant.com/real-es...

A great product to productionize your LLM applications. Once you create a prototype, the next step is understanding the cost structure and tracing the paths - and Langfuse helps you solve those challenges. The new decorators make it easy to track anything from your code. Good luck with the launch!

Langfuse is a well-designed and feature-rich platform that addresses a significant need in the rapidly evolving LLM ecosystem. Its open-source nature and comprehensive capabilities make it a highly compelling offering for the community. Congratulations on the launch!

Congrats on launching Langfuse 2.0! @clemo_sf An open-source LLM Engineering Platform providing observability, tracing, evaluations, prompt management, playground, and metrics. Debug and improve LLM apps easily with Langfuse. It's open, works with any model or framework, and allows exporting all data. How does Langfuse ensure compatibility with different frameworks and models?

Congrats on the launch! Langfuse is really an amazing tool for getting LLM apps production-ready. Glad to see it getting so much traction on here.

When I found Langfuse, I was so impressed that I made it my priority to integrate it into CVToBlind's product stack. Having the ability to not only track the cost of my prompts, but also compare them across different LLMs, group calls by various IDs, and track individual user usage - this is a killer set of features, all while staying well within the free tier. Their recent addition of a PostHog integration allowed us to track our product usage even better. I did miss one feature I needed to fully support my use case, but since it's open source I could just add that code myself - and that's exactly what I did. Working with @marc_klingen has been great, the code is in great shape, and they definitely have my confidence in serving our prompts for production use. We need more products like this. Best of luck, keep it up!
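On the per-user tracking and grouping mentioned in the comment above: with the Python decorator SDK this roughly comes down to attaching identifiers to the current trace. A minimal sketch, assuming the SDK's langfuse_context helper; the identifiers and tags are placeholders.

```python
# Sketch: attaching user, session, and tag metadata to the current trace so usage
# and cost can be grouped per user/session in Langfuse. Identifiers are placeholders.
from langfuse.decorators import observe, langfuse_context


@observe()
def answer_question(question: str, user_id: str, session_id: str) -> str:
    # Attach attribution metadata to the trace created by @observe()
    langfuse_context.update_current_trace(
        user_id=user_id,        # enables per-user usage and cost breakdowns
        session_id=session_id,  # groups related calls into one session
        tags=["production"],    # free-form tags for filtering in the UI
    )
    # ... call your model of choice here and return its answer ...
    return f"(stub answer to: {question})"
```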