But with these new libraries, Apple is showing how its users could have the best of both worlds.
So what does it all mean anyway?
AI models on devices are typically smaller than those in the cloud.

Not all robots are giant world-crushing machines. Phillip Glickman / Unsplash
Their reduced size means fewer calculations, which translates into faster responses and less energy consumed when answering user queries.
Moreover, queries and their responses do not leave the device, enhancing user privacy.
Crash Course
An AI is pretty much a really fast guesser. Given some text, a language model guesses which word is most likely to come next, over and over.

AI models come in all kinds of shapes and sizes. Eric Krull / Unsplash
It’s the same for images: the model guesses which pixel comes next until it has an entire picture of a shrimp deity.
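That "fast guessing" can be sketched in a few lines. The toy Python example below is nothing like a real LLM, which predicts over a huge vocabulary using billions of parameters; it just guesses the next word by counting which word most often follows the current one in a tiny training text:

```python
# Toy next-word guesser: count which word most often follows each word.
from collections import Counter, defaultdict

training_text = "the cat sat on the mat the cat ate the fish".split()

# "Training": tally word-pair frequencies.
follows = defaultdict(Counter)
for current, nxt in zip(training_text, training_text[1:]):
    follows[current][nxt] += 1

def guess_next(word):
    """Return the most frequent follower of `word` in the training text."""
    return follows[word].most_common(1)[0][0]

print(guess_next("the"))  # "cat" — it follows "the" twice, more than any other word
```

A real model does the same job with probabilities rather than raw counts, which is why it can produce words it never saw in exactly that order.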
The key part is the training.

Training models takes a lot of work. Daniel K Cheung / Unsplash
To make a large language model (LLM), you train a neural network on a massive body of text.
Humans help along the way, perhaps by telling the computer which tiles contain pictures of bridges or bicycles.
The result is a soup of simple math equations, built from billions of adjustable numbers called parameters, that represents the entirety of the training data.
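Training, at its simplest, means nudging those numbers until the equations fit the examples. Here is a deliberately tiny sketch that fits a single made-up parameter to data following the pattern y = 2x; real training does the same thing across billions of parameters at once:

```python
# Toy "training": nudge one parameter of y = w * x until it fits the data.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # examples following y = 2x

w = 0.0                    # the single "parameter", starting from a bad guess
learning_rate = 0.05
for _ in range(200):       # repeat many times, as real training does
    for x, y in data:
        error = w * x - y                 # how wrong the current guess is
        w -= learning_rate * error * x    # nudge w to shrink the error

print(round(w, 2))  # 2.0 — the parameter now encodes the pattern in the data
```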
Then, when you want the LLM to write your term paper, you describe it.
The machine uses its model to interpret that input and create the output.
The more parameters, the better the model is at doing the AI thing.
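Once trained, running the model just means plugging your input into those stored equations. In this toy sketch, two hand-picked parameters stand in for the billions in a real model; the point is that the "model" is nothing but stored numbers applied to new input:

```python
# Toy "inference": a trained model is just stored numbers applied to input.
# Here the two parameters happen to encode Celsius-to-Fahrenheit conversion,
# a pattern a model could have learned from examples.
weight, bias = 1.8, 32.0      # the stored "soup of math": y = weight * x + bias

def model(celsius):
    return weight * celsius + bias

print(model(100.0))  # 212.0
```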
So why is this important?
Because it lets Apple run AI on devices that have relatively limited computing power and very limited batteries.
By optimizing for specific purposes, models can be much smaller and more efficient.
This all happens in fractions of a second, thanks to the hardware and software running together.
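One common way to shrink a model so it fits on a phone is quantization: storing each parameter as a small integer instead of a full-precision float. The sketch below is a bare-bones illustration, not Apple's pipeline; production frameworks quantize per layer with careful calibration:

```python
# Toy quantization: store float parameters as int8 values (4x smaller).
weights = [0.12, -0.87, 0.45, 0.99]           # 32-bit floats, 4 bytes each

scale = max(abs(w) for w in weights) / 127    # map the range onto -127..127
quantized = [round(w / scale) for w in weights]   # 1 byte each

restored = [q * scale for q in quantized]     # close to the originals
print(quantized)
```

The restored values differ from the originals by less than one quantization step, which is usually a tolerable trade for a model a quarter of the size.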
Running optimized AI locally on the device has several advantages.
One is power consumption, although training those models in the first place still uses a lot of electricity.
Another is privacy because many tasks can be done without contacting a server.
And yet another is autonomy.
You won’t need an internet connection to do many tasks.
For example, think again about how Apple handles your photos.
All that processing is done on-device, without sending images to Apple for person recognition.
Likewise, you don’t need the whole of ChatGPT to improve Siri.
And on-device “AI” has been powering your predictive spell check since iOS 17 showed up.
That’s the exciting part because it has the power to make our computers seem smarter and more magical.