Aya Vision, from Cohere For AI, is a pair of open-weights, multilingual, multimodal models (8B & 32B) that outperform larger models on multilingual vision tasks. Available on Hugging Face and Kaggle.
About Aya Vision on Product Hunt
“Multilingual, Multimodal AI from Cohere”
Aya Vision launched on Product Hunt on March 9th, 2025, earning 118 upvotes and 5 comments and placing #7 on the daily leaderboard.
On the analytics side, Aya Vision competes within Open Source, Artificial Intelligence and Photo & Video — topics that collectively have 536.5k followers on Product Hunt.
Who hunted Aya Vision?
Aya Vision was hunted by Zac Zuo. A “hunter” on Product Hunt is the community member who submits a product to the platform — uploading the images, the link, and tagging the makers behind it. Hunters typically write the first comment explaining why a product is worth attention, and their followers are notified the moment they post. Around 79% of featured launches on Product Hunt are self-hunted by their makers, but a well-known hunter still acts as a signal of quality to the rest of the community. See the full all-time top hunters leaderboard to discover who is shaping the Product Hunt ecosystem.
Hi everyone!
Check out Aya Vision, a new set of open-weights models from Cohere For AI. It's a significant step towards making AI truly global! Most vision-language models are heavily biased towards English; Aya Vision tackles this head-on by supporting 23 languages spoken by over half the world's population.
Here's why it's important:
🌍 Multilingual by Design: Understands images/videos and generates text across a wide range of languages.
🖼️ Multimodal: Handles both images/videos and text.
🚀 Outperforms Larger Models: Cohere claims Aya Vision (8B and 32B versions) outperforms models many times its size (like Llama 3.2 90B Vision!) on multilingual multimodal tasks.
🔓 Open Weights: Available on Hugging Face and Kaggle.
📱 Free on WhatsApp: You can even try Aya for free on WhatsApp!
They're also releasing a new benchmark, Aya Vision Benchmark, specifically for evaluating multilingual multimodal performance. The goal is to build AI that understands the nuances of different cultures and languages, not just add more languages.
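Since the weights are open on Hugging Face, querying the model locally is possible with the `transformers` library. The sketch below builds a multimodal chat message in the common `transformers` chat-template format and shows (commented out) what the inference call might look like; the checkpoint name `CohereForAI/aya-vision-8b` and the exact API are assumptions, so check the Hugging Face model card for the definitive usage.

```python
# Sketch: querying Aya Vision via Hugging Face transformers.
# Model ID and inference API below are assumptions based on the
# Hugging Face release; verify against the official model card.

def build_chat(image_url: str, question: str) -> list:
    """Build one user turn containing an image part and a text part,
    in the chat-template message format used by transformers."""
    return [
        {
            "role": "user",
            "content": [
                {"type": "image", "url": image_url},
                {"type": "text", "text": question},
            ],
        }
    ]

# Ask about an image in Spanish -- the model is multilingual by design.
messages = build_chat(
    "https://example.com/photo.jpg",  # placeholder URL
    "¿Qué se ve en esta imagen?",
)
print(messages[0]["role"])

# The actual inference step (requires downloading the weights):
# from transformers import AutoProcessor, AutoModelForImageTextToText
# model_id = "CohereForAI/aya-vision-8b"  # assumed checkpoint name
# processor = AutoProcessor.from_pretrained(model_id)
# model = AutoModelForImageTextToText.from_pretrained(model_id)
# inputs = processor.apply_chat_template(
#     messages, add_generation_prompt=True, tokenize=True,
#     return_dict=True, return_tensors="pt")
# out = model.generate(**inputs, max_new_tokens=200)
# print(processor.tokenizer.decode(out[0], skip_special_tokens=True))
```

The message-building step runs without the model; only the commented-out portion needs the multi-gigabyte weight download.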