SLM Mesh

Your AI coding sessions can finally talk to each other

SLM Mesh is an open-source MCP server that gives AI coding agents peer-to-peer communication. Parallel agent sessions are normally isolated from one another; SLM Mesh fixes this with 8 MCP tools:

- Peer discovery (scoped by machine, directory, or git repo)
- Direct messaging + broadcast
- Shared key-value state
- File locking with auto-expire
- Event bus for real-time coordination

Works with Claude Code, Cursor, Aider, Windsurf, Codex, VS Code, and any other MCP-compatible agent.

npm install -g slm-mesh
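As an illustration, after the global install you would point an MCP-compatible client at the server. This is a minimal sketch of a `.mcp.json`-style client config; the config file name, format, and the assumption that the package exposes a `slm-mesh` command are all guesses here, so check the project README for the exact setup:

```json
{
  "mcpServers": {
    "slm-mesh": {
      "command": "slm-mesh",
      "args": []
    }
  }
}
```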

Top comment

Hi PH! I'm Varun, and I built SLM Mesh because I was tired of being the message bus between my AI coding sessions.

The core insight: every AI coding agent (Claude, Cursor, Aider) has isolated sessions. When you run 3+ in parallel, they have no way to coordinate, so you end up copy-pasting context between terminals. SLM Mesh adds 8 MCP tools that let agents discover each other, send messages, share state, and lock files. It works with ANY MCP-compatible agent, not just one vendor.

Technical highlights:

- SQLite + WAL mode for persistence
- Unix Domain Sockets for <100ms push delivery
- Bearer token auth on every endpoint
- 480 tests, 100% line coverage
- Auto-start broker, auto-shutdown when idle

MIT licensed. Would love your feedback. GitHub: https://github.com/qualixar/slm-...
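To make the "file locking with auto-expire" idea concrete: a lock that is not refreshed within its TTL can be reclaimed, so a crashed agent never deadlocks its peers. This is a generic in-memory sketch of that pattern, not SLM Mesh's actual code or API; the class and method names are invented for illustration:

```typescript
// Hypothetical illustration of lock-with-auto-expire, not the SLM Mesh API.
type Lock = { holder: string; expiresAt: number };

class LockTable {
  private locks = new Map<string, Lock>();

  // Try to lock `path` for `holder`. A lock past its TTL is treated as
  // stale and silently reclaimed; re-acquiring refreshes the TTL.
  acquire(path: string, holder: string, ttlMs: number, now = Date.now()): boolean {
    const current = this.locks.get(path);
    if (current && current.expiresAt > now && current.holder !== holder) {
      return false; // held by a live peer
    }
    this.locks.set(path, { holder, expiresAt: now + ttlMs });
    return true;
  }

  release(path: string, holder: string): void {
    if (this.locks.get(path)?.holder === holder) this.locks.delete(path);
  }
}

// Agent A takes the lock; B is refused until A's TTL lapses.
const table = new LockTable();
const t0 = Date.now();
console.log(table.acquire("src/app.ts", "agent-a", 5000, t0));        // true
console.log(table.acquire("src/app.ts", "agent-b", 5000, t0));        // false
console.log(table.acquire("src/app.ts", "agent-b", 5000, t0 + 6000)); // true (stale lock reclaimed)
```

The design upside is that no cleanup step is required when an agent dies mid-edit: its locks simply age out.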