In Google’s case, the bump is a bar that stretches across the width of the unit.

It looks cool and houses an impressive array of new lenses and sensors.

But, also like the iPhone, it’s what’s inside that counts.

Four Google Pixel 6 smartphones lined up, face down. (Image: Google)

Computational photography is taking over photography, but where is it going?

The tiny lenses and sensors in phones have long struggled in low light and had trouble capturing intricate detail.

On the other hand, photographers don’t always want a “perfect” shot.

Example of what Magic Eraser can do in a photograph.

The Pixel 6

The cameras inside the new phones are impressive.

For example, Magic Eraser lets you remove distracting elements from the photo.

Not only that, but the camera detects these elements automatically and suggests removal.

Photography examples from the Google Pixel 6. (Image: Google)

You just confirm with a tap.


Or how about Face Unblur?

If your subject is moving fast in low light, this feature attempts to unblur their face.

It’s perfect for snapshots of fast-moving kids (all kids who aren’t sleeping) indoors.

And Motion Mode does the opposite, deliberately blurring elements that are moving for effect.

Perhaps the best feature is the most subtle. Real Tone lets cameras render any skin tone properly.

Given the ethnic bias that has been built into photography since the early film days, this is a big deal.

Better Pictures, Less Effort

Computational photography seems to have two purposes right now.

One is to give you an amazing photo, no matter the conditions.

In some ways, this risks making all our photos look the same.