Build with Apple's on-device AI, now open to developers
Apple's new Foundation Models framework gives developers direct access to its ~3B-parameter on-device language model. It offers deep Swift integration, strong privacy from local processing, and cost-free AI inference for apps on Apple silicon.
Hi everyone!
I know a lot of the talk after WWDC has been about the new Liquid Glass design, but I think the long-term impact of Apple's progress with its on-device models is being underestimated.
With the new Foundation Models framework, Apple is giving developers direct access to the ~3B-parameter on-device model that powers many Apple Intelligence features. The Swift integration makes it remarkably simple to bring generative features into an app, all while running locally, privately, and at no inference cost.
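To make that concrete, here is a minimal sketch of what a call into the framework can look like, based on the LanguageModelSession API Apple showed at WWDC. The helper name, instructions text, and error type are my own illustrative choices, so check the exact signatures against Apple's current documentation.

```swift
import FoundationModels

enum TitleError: Error { case modelUnavailable }

// Ask the on-device model for a short photo title.
// Inference runs locally on Apple silicon: no network call, no per-request cost.
func suggestTitle(for description: String) async throws -> String {
    // Bail out gracefully if Apple Intelligence / the model isn't available
    // on this device (unsupported hardware, model not yet downloaded, etc.).
    guard case .available = SystemLanguageModel.default.availability else {
        throw TitleError.modelUnavailable
    }

    // A session wraps the ~3B-parameter system model; instructions steer its tone.
    let session = LanguageModelSession(
        instructions: "You write short, friendly titles for photos."
    )

    // Send the prompt and return the generated text.
    let response = try await session.respond(to: description)
    return response.content
}
```

From SwiftUI or any other async context, using it is a one-liner, e.g. `let title = try await suggestTitle(for: "A sunset over the Golden Gate Bridge")`. The same session type also supports streaming responses and typed, guided output, which is where the "remarkably simple" part really shows.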
Given Apple's massive hardware install base, this framework could unleash a huge amount of potential. While on-device AI still has a way to go to catch up with the most advanced cloud models, I personally believe this gap will close faster than many expect. Once on-device AI crosses that key capability threshold, we're likely to see a whole new wave of app innovation.