The iPhone 14 Pro camera has one big headline feature: its brand-new 48-megapixel sensor.

Even the flash is now useful.

Or more useful, anyway.

Close-up of the iPhone 14 Pro camera array on the back of the phone. (Image: Apple)

Lenses and Cameras

All the cameras on the iPhone 14 Pro are improved.

The most important change may be the wider aperture of the primary camera, which lets in more light.

Lens apertures are measured in stops.

A close view of the cameras on the front and back of the iPhone 14 Pro. (Image: Apple)

Adding or subtracting a stop doubles or halves the amount of light entering the camera.
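As a concrete illustration, here's a minimal sketch of the stop arithmetic. The f-numbers below are generic examples, not Apple's published specs:

```python
import math

def stops_between(f_old, f_new):
    # Light gathered scales with aperture area, i.e. with 1/N^2 for
    # f-number N, so the difference in stops is 2 * log2(f_old / f_new).
    return 2 * math.log2(f_old / f_new)

def light_ratio(f_old, f_new):
    # How many times more light the new (wider) aperture admits.
    return (f_old / f_new) ** 2

# Opening up from f/2.0 to f/1.4 -- one standard stop -- roughly doubles
# the light reaching the sensor.
print(round(stops_between(2.0, 1.4), 1))  # ~1.0 stop
print(round(light_ratio(2.0, 1.4), 1))    # ~2.0x the light
```

Standard f-numbers like 1.4 and 2.8 are rounded from powers of the square root of two, which is why the results land at roughly, rather than exactly, one stop.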

Unfortunately, only the primary camera gets a wider aperture.

Here’s how the iPhone 14 Pro compares to the iPhone 13 Pro from last year.

An image captured with the iPhone 14 Pro camera. (Image: Apple)

That’s not the whole story, though.

All three cameras get physically bigger sensors, which means more light-gathering capability and less noise.

That means you, the subject, will be sharper.

A person wrapped in a blanket of hand-crocheted wool, taken with the iPhone 14 Pro. (Image: Apple)

Not everything is good news, though.

The primary camera is now seriously wide.

It now has a 24mm-equivalent focal length, which many photographers already consider ultra-wide.

A portrait captured with the iPhone 14 Pro. (Image: Apple)

This makes even less sense given that the phone already incorporates an ultra-wide camera.

For comparison, the ultra-wide camera is a light-bending 13mm equivalent, and the 3x camera is 77mm.

Speaking of ‘standards,’ the standard lens, considered neither wide nor telephoto, is 50mm.

A portrait captured in low light with the 2x zoom feature on the iPhone 14 Pro. (Image: Apple)

That’s more than double the focal length of the iPhone’s primary camera.
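Putting the three focal lengths side by side, here's a quick sketch of the zoom factors relative to the 24mm primary camera, using the equivalent focal lengths quoted above:

```python
# Equivalent focal lengths quoted in the article, in mm.
PRIMARY_MM = 24
lenses = {"ultra-wide": 13, "primary": 24, "telephoto": 77}

for name, mm in lenses.items():
    # At the same sensor format, the zoom factor is simply the
    # ratio of equivalent focal lengths.
    print(f"{name}: {mm}mm ~= {mm / PRIMARY_MM:.1f}x")
```

Note that Apple markets the 77mm lens as 3x, although 77/24 is closer to 3.2.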

48MP

48 megapixels is absurd for a phone.

If left unprocessed, the resulting images would come in at around 50-60MB each.

A zoomed-in photograph taken with the iPhone 14 Pro. (Image: Apple)

Compare that to the 12MP pictures from, say, the iPhone 12, which come in at 2-3MB.
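The back-of-the-envelope math behind those file sizes looks like this, assuming an uncompressed sensor readout (the bit depths are assumptions for illustration):

```python
def raw_size_mb(megapixels, bits_per_pixel):
    # Uncompressed sensor readout: pixel count times bit depth,
    # ignoring headers and metadata.
    return megapixels * 1e6 * bits_per_pixel / 8 / 1e6

# 48MP lands in the article's 50-60MB ballpark at 10-bit packing,
# and higher still at 12-bit.
print(raw_size_mb(48, 10))  # 60.0 MB
print(raw_size_mb(48, 12))  # 72.0 MB
print(raw_size_mb(12, 12))  # 18.0 MB -- even 12MP needs compression to hit 2-3MB
```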

But the story here is that these images are processed, and heavily so.

And that pipeline is impressive.

The camera doesn't just capture one image and then process it; it captures several frames and merges them using a series of computational tricks.

One of those tricks is the Photonic Engine.

Low-Light Photonic Engine

A camera, whether film or digital, captures light.

Both film and digital sensors react to photons falling on them.

In low light, like indoors or at night, far fewer photons reach the sensor, so the signal is weak.

Amplifying that weak signal into a finished picture is what leads to ugly, noisy images.

The Photonic Engine is Apple’s answer to noisy low-light images.

Essentially, it's an extension of the existing Sweater Mode, aka Deep Fusion.

In itself, this is an incredible feat.

That’s a ton of data to get through, and multiple captures only add to the workload.

Even at normal light levels, the detail is astonishing.

However, sometimes this isn't the case.

But it also has a new option: the primary camera can cut a 12-megapixel rectangle out of the center of the 48MP sensor, effectively giving a 2x zoom.

It's a very smart move and will probably still give better images than the 3x zoom in low light.

This sensor-cropping trick also works for video, enabling 4K video at a 2x zoom.

It's impressive and a very compelling reason to upgrade to this camera.
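The arithmetic behind the crop, as a quick sketch. The exact sensor dimensions below are an assumption based on the published 48MP and 12MP figures:

```python
# Hypothetical 4:3 readout for a 48MP sensor; the actual readout may differ.
full_w, full_h = 8064, 6048
crop_w, crop_h = full_w // 2, full_h // 2  # center crop at half width/height

full_mp = full_w * full_h / 1e6
crop_mp = crop_w * crop_h / 1e6
# Halving the field of view doubles the effective focal length: a 2x zoom.
zoom = full_w / crop_w

print(f"{full_mp:.1f}MP -> {crop_mp:.1f}MP at {zoom:.0f}x zoom")
```

Because every pixel in the crop comes straight off the sensor, no upscaling is involved, which is why the result can beat a digitally zoomed 12MP image.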