Hello ProductHunt! 👋
The new Qwen 3.5 small models are now available for iPhone and iPad in Locally AI.
The new models beat models four times their size, and they support vision and a reasoning toggle. Four sizes are available: 0.8B, 2B, 4B, and 9B (the 9B is available on supported iPads).
Enjoy!
Qwen 3.5 2B vision on an iPhone 16 Pro is astonishing. This is absolutely the future of on-device AI. I can't wait for OSes that use AI as the kernel for everything.
"Offline + private + no login = the holy trinity that 90% of AI apps ignore because cloud is easier to monetize. Locally AI is betting on the right side of the privacy conversation.
As someone building Fillix, a Chrome extension that makes job hunting embarrassingly easy, the 'no login required' UX decision hits close to home. The best tools get out of your way instantly. Apple Silicon optimization is the cherry on top. Congrats on shipping!
Great launch 👏
Running powerful models locally on iPhone and iPad is exactly where things are heading — privacy-first, fully offline, no logins, no cloud dependency.
Excited to see Qwen integrated here. Strong reasoning + vision capabilities, and having that fully on-device is a big step forward in user control and data ownership.
Curious about a few things:
– inference speed across different devices
– memory usage and optimization
– how the model download and UX flow are handled
If performance holds up, this could be a serious alternative to cloud-based AI apps.
Congrats on the launch 🚀
Literally the best app to experience the latest @Qwen3 local AI models on your phone! 🚀