Convo

Memory & observability for LLM apps

Developer Tools
Artificial Intelligence
Tech

Hunted by Sunny ☀️

Convo is the fastest way to log, debug, and personalize AI conversations. Capture every message, extract long-term memory, and build smarter LLM agents - with one drop-in SDK.

Top comment

Hey folks! 👋

We built Convo SDK after months of wrangling LangGraph checkpointers and database infra just to get persistent memory working.

Every time we added memory to an agent, we ended up knee-deep in connection pools, schema migrations, and random production crashes 😵‍💫

So we made something simple:

One line to replace any LangGraph checkpointer.

No Postgres. No Mongo. No ops.

Just:

checkpointer: convo.checkpointer()


It’s TypeScript-first, handles multi-user threads out of the box, and gives you time-travel debugging + bulletproof persistence, all without spinning up a DB.
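For anyone new to LangGraph-style checkpointers: the idea is that after each step, the agent's state is snapshotted per thread so a conversation can be resumed or rewound. Here is a rough, self-contained TypeScript sketch of that interface; the types and method names are our own illustration, not the Convo or LangGraph API.

```typescript
// Illustrative only: a tiny in-memory checkpointer showing the kind of
// component that convo.checkpointer() stands in for.
type Checkpoint = { threadId: string; step: number; state: unknown };

class InMemoryCheckpointer {
  private store = new Map<string, Checkpoint[]>();

  // Snapshot the agent state for a thread after each step.
  put(cp: Checkpoint): void {
    const history = this.store.get(cp.threadId) ?? [];
    history.push(cp);
    this.store.set(cp.threadId, history);
  }

  // Latest checkpoint for a thread, or a specific step ("time travel").
  get(threadId: string, step?: number): Checkpoint | undefined {
    const history = this.store.get(threadId) ?? [];
    if (step === undefined) return history[history.length - 1];
    return history.find((cp) => cp.step === step);
  }
}

const saver = new InMemoryCheckpointer();
saver.put({ threadId: "user-1", step: 1, state: { messages: ["hi"] } });
saver.put({ threadId: "user-1", step: 2, state: { messages: ["hi", "hello!"] } });

console.log(saver.get("user-1")?.step); // → 2 (latest)
console.log(saver.get("user-1", 1)?.step); // → 1 (time-travel)
```

An in-memory version like this loses everything on restart; the point of a hosted checkpointer is getting the same interface with durable, multi-user storage behind it.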

We’d love feedback from other agent builders, LangGraph users, or anyone tired of “database hell.”

Excited to hear what you think!

_________________

🔧 Want to try it in your own stack?

We put together a quick-start cookbook with real-world recipes to get going fast:

👉 https://www.notion.so/Convo-gett...

📦 Install via NPM

Grab the SDK and start hacking in under 2 minutes:

👉 https://www.npmjs.com/package/co...

🧠 Join our live workshop (July 26)

We’re running a hands-on LangGraph memory session where we’ll build from stateless to fully persistent agents:

👉 https://lu.ma/fl29ul0l?utm_sourc...

Comment highlights

@sunnyjoshi one line to replace LangGraph checkpointer? Brilliant - I’ve been struggling with similar memory issues in my Claude Code sessions. Does this work directly with Anthropic’s API or would I need a wrapper? Database hell struggle is real 😆

The drop-in SDK and one-liner setup is beautifully dev-friendly :) Congratulations @sunnyjoshi and team!

I like that it’s TypeScript-first and supports multi-user threads without extra config. That alone is a reason for me to give it a shot.

Giving LLM apps memory and observability is key to making them smarter and more reliable — Convo looks like a must-have for any AI dev stack. Congrats on the launch 🚀

Logging every convo AND adding long-term memory? That’s huge for making LLM apps feel way smarter, tbh. You folks nailed it with this one!

We're also doing a virtual event covering memory and persistent graph state using checkpointers; please feel free to join - https://lu.ma/fl29ul0l

About Convo on Product Hunt

Memory & observability for LLM apps

Convo launched on Product Hunt on July 22nd, 2025 and earned 148 upvotes and 11 comments, placing #8 on the daily leaderboard. Convo is the fastest way to log, debug, and personalize AI conversations. Capture every message, extract long-term memory, and build smarter LLM agents - with one drop-in SDK.

Convo was featured in Developer Tools (511k followers), Artificial Intelligence (466.2k followers) and Tech (621.5k followers) on Product Hunt. Together, these topics include over 313.7k products, making this a competitive space to launch in.

Who hunted Convo?

Convo was hunted by Sunny ☀️. A “hunter” on Product Hunt is the community member who submits a product to the platform — uploading the images, the link, and tagging the makers behind it. Hunters typically write the first comment explaining why a product is worth attention, and their followers are notified the moment they post. Around 79% of featured launches on Product Hunt are self-hunted by their makers, but a well-known hunter still acts as a signal of quality to the rest of the community. See the full all-time top hunters leaderboard to discover who is shaping the Product Hunt ecosystem.

Want to see how Convo stacked up against nearby launches in real time? Check out the live launch dashboard for upvote speed charts, proximity comparisons, and more analytics.