Blind or visually impaired users will be able to socially distance, and even navigate queues.

Apple added People Detection to the Magnifier app in the latest iOS 14.2 update.

LiDAR

People Detection uses two key features of the iPhone 12.

[Image: The iPhone 12 Pro in four colors, showing off its three cameras]

One is the LiDAR sensor, a kind of laser radar built into the camera array of the iPhone 12 Pro and Pro Max.
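Apple doesn't publish Magnifier's internals, but the raw ingredients are available to any developer. Here is a minimal Swift sketch, built on ARKit's scene-depth API and Vision's human-rectangle detector, that estimates how far away a detected person is; the PersonDistanceEstimator class name and overall structure are illustrative assumptions, not Apple's implementation.

```swift
import ARKit
import Vision

// Minimal sketch: read the LiDAR scene depth through ARKit and estimate the
// distance to a person that Vision detects in the camera image. The class
// name and structure are illustrative; this is not Apple's Magnifier code.
final class PersonDistanceEstimator: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        // Scene depth requires a LiDAR-equipped device (iPhone 12 Pro / Pro Max).
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }
        config.frameSemantics.insert(.sceneDepth)
        session.delegate = self
        session.run(config)
    }

    // Called for every camera frame; a real app would throttle this work.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depthMap = frame.sceneDepth?.depthMap else { return }

        // Ask Vision for human bounding boxes in the captured image.
        let request = VNDetectHumanRectanglesRequest { [weak self] request, _ in
            guard let people = request.results as? [VNDetectedObjectObservation],
                  let person = people.first else { return }
            // Vision coordinates are normalized with the origin at the lower left,
            // so flip Y before sampling the depth buffer.
            let point = CGPoint(x: person.boundingBox.midX,
                                y: 1 - person.boundingBox.midY)
            if let meters = self?.depth(at: point, in: depthMap) {
                print(String(format: "Detected person is about %.1f meters away", meters))
            }
        }
        let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage, options: [:])
        try? handler.perform([request])
    }

    // Read one Float32 depth value (in meters) from the LiDAR depth buffer.
    private func depth(at normalizedPoint: CGPoint, in buffer: CVPixelBuffer) -> Float? {
        CVPixelBufferLockBaseAddress(buffer, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(buffer, .readOnly) }
        guard let base = CVPixelBufferGetBaseAddress(buffer) else { return nil }
        let width = CVPixelBufferGetWidth(buffer)
        let height = CVPixelBufferGetHeight(buffer)
        let rowBytes = CVPixelBufferGetBytesPerRow(buffer)
        let x = min(max(Int(normalizedPoint.x * CGFloat(width)), 0), width - 1)
        let y = min(max(Int(normalizedPoint.y * CGFloat(height)), 0), height - 1)
        let row = base.advanced(by: y * rowBytes)
        return row.assumingMemoryBound(to: Float32.self)[x]
    }
}
```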

One point to note is that People Detection does not work in the dark.

Early adopters of hardware who are blind or visually impaired will doubtless benefit from this new feature.

How Can People Detection Help?

Walking around without being able to see isn't just about avoiding walls, traffic, and other hazards.

Even if you know a place well, joining a queue can be tricky.

People Detection can't replace human interaction, but it can certainly augment it.

“I am excited to use the feature and get back some freedom and safety.”

Hang In There

There are some caveats to this technology.

One is that the iPhone must be awake and the Magnifier app open for People Detection to work.

This could drain the battery pretty fast.

That proves to be a challenge for this kind of technology.

Despite these barriers, People Detection is an example of Apple at its best.

Camera, sensor, haptic feedback, audio feedback via AirPods.

It's all there, almost on day one.
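The feedback half of that pipeline is just as straightforward to sketch. The snippet below, again with a hypothetical name (ProximityFeedback) and made-up distance thresholds, turns a measured distance into haptic taps and a spoken announcement, the same kinds of cues People Detection provides.

```swift
import UIKit
import AVFoundation

// Minimal sketch: convert a measured distance into haptic and spoken feedback.
// The thresholds and phrasing are made up for illustration.
final class ProximityFeedback {
    private let haptics = UIImpactFeedbackGenerator(style: .heavy)
    private let speech = AVSpeechSynthesizer()

    func report(distanceInMeters distance: Double) {
        // Closer people trigger stronger haptic taps.
        if distance < 2.0 {
            haptics.impactOccurred(intensity: 1.0)
        } else if distance < 4.0 {
            haptics.impactOccurred(intensity: 0.5)
        }

        // Spoken distance, heard through the speaker or a connected pair of AirPods.
        let utterance = AVSpeechUtterance(
            string: String(format: "Person %.0f meters away", distance))
        speech.speak(utterance)
    }
}

// Usage: feedback.report(distanceInMeters: 1.5)
```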

Now, imagine how cool this will be when integrated with Apple's long-rumored glasses.