11.ai is a new voice-first AI assistant from ElevenLabs. It connects to tools like Perplexity and Linear via MCP to take action on your behalf. Features 5,000+ voices and is now open in a free alpha.
A voice assistant that sounds perfectly natural is one thing, but the real value comes when it can actually do things for you. The more tools that voice interface connects to, the more you can get done without touching a keyboard.
ElevenLabs' new experimental alpha, 11.ai, is built on this idea. It combines their top-tier conversational AI with a growing list of integrations via MCP, connecting to tools like Perplexity for research, Linear for tasks, and Slack for updates.
This means you can go from just asking questions to actually managing your workflow—planning your day, researching topics, creating tickets—all through a natural conversation. It's a huge boost for real-world productivity.
You can even personalize it with thousands of voices. As a proof of concept for a voice-first assistant, this is very interesting to see.
Since tools like Hacker News are being integrated, it makes you wonder when we'll be able to just speak to find goodies on PH, right? 🤔
Congrats on the launch! I’ve tested ElevenLabs before and it actually works really well — super natural voices and fast processing. Curious what’s next on the roadmap!
I have used ElevenLabs in the past and always loved both the interface and the final output. Excited about the upcoming updates.
Some friends recommended ElevenLabs to me, and since I’m planning to build AI-powered YouTube and TikTok channels, it looks like the perfect tool to try out.
Can I use it with my cloned voice? Then I'd have myself as an assistant!
Hey, Louis from ElevenLabs here.
For years, voice assistants could do little more than set alarms and timers. Now, with technology that can reason, plan, and act - while conveying the full emotional nuance of human speech - your voice becomes the most natural interface to move work forward.
11.ai blends ElevenLabs' Conversational AI with the Model Context Protocol (MCP) for tools and integrations.
In plain English, that means you can speak to 11.ai and it can instantly:
Plan your day: “Plan my day and push the top three tasks to Linear.”
Research & summarise: “Use Perplexity to gather the latest on Acme Corp and give me a 30-second briefing.”
Catch up the team: “Summarise yesterday’s Slack chatter in #engineering and post the key decisions.”
All of this happens in real time, even in your own cloned voice if you prefer. We’re launching new integrations every week, so let us know what would make your daily workflow better.
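For the curious: MCP requests are JSON-RPC 2.0 messages, so a spoken command like “push the top three tasks to Linear” ultimately becomes a `tools/call` request to an MCP server. A minimal sketch of that message shape (the tool name `create_issue` and its arguments are hypothetical; real tool names depend on each server):

```python
import json

def make_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Build an MCP tools/call request as a JSON-RPC 2.0 message."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Hypothetical Linear-style tool call, for illustration only.
msg = make_tool_call(1, "create_issue", {
    "title": "Prepare Acme Corp briefing",
    "team": "engineering",
})
print(msg)
```

The assistant's job is translating your speech into requests like this and reading the results back to you.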
Who’s it for?
Anyone whose workload lives in tools - and wishes those tools felt as natural as conversation:
Founders & operators who hop between tasks all day
PMs & engineers who need hands-free triage on the go
RevOps & sales who live in CRMs but hate updating them
Anyone who prefers talking to tapping.
What’s next?
Today we’re opening 11.ai (alpha), free for everyone. Give it a try.
Thanks for sharing @zaczuo! Excited to see all the new workflows this unlocks. We're iterating quickly based on feedback, so please comment below with any suggestions or ideas.
Have a favorite integration in mind that you’d like to have built-in? Let @louis_jordan know in the comments!
I think live weather is a no-brainer for that morning outfit brainstorm.
Congratulations on the launch🎉
Can we use these features with the current "create agent" API?
I tried this and it was crazy good. I was able to get information about an upcoming meeting, have Eleven send a calendar invite with the information it discovered in the meeting details, and create the invite. The invite appeared instantly in my Google Cal. The future is going to be insane.
Saw the intro video yesterday, looks promising. Combine it with a wearable and it could be one of the best AI assistants.