
Convo

Memory & observability for LLM apps

Developer Tools · Artificial Intelligence · Tech

Convo is the fastest way to log, debug, and personalize AI conversations. Capture every message, extract long-term memory, and build smarter LLM agents - with one drop-in SDK.

Top comment

Hey folks! 👋

We built Convo SDK after months of wrangling LangGraph checkpointers and database infra just to get persistent memory working.

Every time we added memory to an agent, we ended up knee-deep in connection pools, schema migrations, and random production crashes 😵‍💫

So we made something simple:

One line to replace any LangGraph checkpointer.

No Postgres. No Mongo. No ops.

Just:

checkpointer: convo.checkpointer()


It’s TypeScript-first, handles multi-user threads out of the box, and gives you time-travel debugging + bulletproof persistence, all without spinning up a DB.
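
If you're wondering where that one-liner slots in, here's a rough sketch of a LangGraph JS graph compiled with Convo's checkpointer. Treat it as an assumption-heavy example, not official docs: the "convo-sdk" import and the `new Convo(...)` constructor are placeholders (the real package name is behind the NPM link below), and the only call taken directly from this post is `convo.checkpointer()`.

```ts
// Sketch only, assuming a LangGraph JS app. Names NOT from the post:
// the "convo-sdk" package name and the `new Convo(...)` constructor are
// placeholders; the real package is behind the (truncated) NPM link below.
// The only call taken from the post itself is convo.checkpointer().
import { StateGraph, MessagesAnnotation, START, END } from "@langchain/langgraph";
import { AIMessage, HumanMessage } from "@langchain/core/messages";
import { Convo } from "convo-sdk"; // hypothetical import path

const convo = new Convo({ apiKey: process.env.CONVO_API_KEY }); // assumed constructor

// A trivial one-node graph; in a real agent this node would call your LLM.
const graph = new StateGraph(MessagesAnnotation)
  .addNode("agent", async () => ({
    messages: [new AIMessage("placeholder reply")],
  }))
  .addEdge(START, "agent")
  .addEdge("agent", END);

// The one-liner from the post: hand LangGraph Convo's checkpointer instead of
// a Postgres/Mongo-backed one when compiling the graph.
const app = graph.compile({ checkpointer: convo.checkpointer() });

// Multi-user threads: each user's conversation is keyed by a thread_id, so
// state persists across invocations without standing up a database yourself.
await app.invoke(
  { messages: [new HumanMessage("Hi!")] },
  { configurable: { thread_id: "user-123" } }
);
```

The point of the design is that the checkpointer is the only thing you swap: the graph definition and thread_id-based invocation are plain LangGraph, so an existing Postgres or Mongo checkpointer can be replaced without touching the rest of the agent.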

We’d love feedback from other agent builders, LangGraph users, or anyone tired of “database hell.”

Excited to hear what you think!

_________________

🔧 Want to try it in your own stack?

We put together a quick-start cookbook with real-world recipes to get going fast:

👉 https://www.notion.so/Convo-gett...

📦 Install via NPM

Grab the SDK and start hacking in under 2 minutes:

👉 https://www.npmjs.com/package/co...

🧠 Join our live workshop (July 26)

We’re running a hands-on LangGraph memory session where we’ll build from stateless to fully persistent agents:

👉 https://lu.ma/fl29ul0l?utm_sourc...

Comment highlights

@sunnyjoshi one line to replace LangGraph checkpointer? Brilliant - I’ve been struggling with similar memory issues in my Claude Code sessions. Does this work directly with Anthropic’s API or would I need a wrapper? Database hell struggle is real 😆

The drop-in SDK and one-liner setup are beautifully dev-friendly :) Congratulations @sunnyjoshi and team!

I like that it’s TypeScript-first and supports multi-user threads without extra config. That alone is a reason for me to give it a shot.

Giving LLM apps memory and observability is key to making them smarter and more reliable — Convo looks like a must-have for any AI dev stack. Congrats on the launch 🚀

Logging every convo AND adding long-term memory? That’s huge for making LLM apps feel way smarter, tbh. You folks nailed it with this one!

We're also doing a virtual event covering memory and persistent graph state using checkpointers. Please feel free to join: https://lu.ma/fl29ul0l