The iPhone could already be regarded as a camera with a computer and screen attached, but the iPhone 14 Pro takes this to ridiculous levels. 

  • For the first time, the iPhone has a massive 48MP sensor, up from 12MP. The processing pipeline has been upgraded to thrive on all that extra data. The primary camera is now really an ultra-wide camera.

The iPhone 14 Pro’s camera has one big headline feature: its brand-new 48-megapixel sensor. And this, plus a few other significant hardware improvements, is at the heart of a whole new camera system. If you’re interested in smartphone photography, this is Apple’s most exciting new camera in years, and almost no part has gone untouched. Even the flash is now useful. Or more useful, anyway.

“You could, in fact, print out a billboard-size image from a 48MP sensor, which is quite remarkable from a smartphone camera,” Mark Condon, the CEO and founder of Shotkit, told Lifewire via email. 

Lenses and Cameras

All the cameras on the iPhone 14 Pro are improved. The primary camera gets the biggest change, jumping from 12MP to 48MP, but the other cameras also get essential updates. Most important could be the larger sensor behind the primary camera, which gathers more light. That improves low-light performance and also lets you use a faster shutter speed in any given situation, leading to less motion blur.

Lens apertures are measured in ƒ-stops. If you’ve used a DSLR camera with an aperture ring around the lens, you’ll know that each click corresponds to one ‘stop.’ Opening up or closing down by a stop doubles or halves the amount of light entering the camera. It’s exactly the same, light-wise, as changing the shutter speed from, say, 1/1000th of a second to 1/500th of a second: double the time means double the light.

Unfortunately, none of the lenses actually get a wider aperture. Here’s how the iPhone 14 Pro compares to the iPhone 13 Pro from last year. The primary camera goes from ƒ1.5 to ƒ1.78 (lower numbers, confusingly, mean bigger holes, so that’s a slightly smaller opening), the ultra-wide goes from ƒ1.8 to a narrower ƒ2.2, and the 3x telephoto remains at ƒ2.8.
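To put rough numbers on those changes, here’s a quick back-of-the-envelope sketch in Swift. The formula is standard optics (the light a lens admits scales with 1 over the f-number squared), and the f-numbers are the ones quoted above:

```swift
import Foundation

// How many 'stops' of light does each aperture change cost?
// Light admitted scales with aperture area, proportional to 1/(f-number)^2.
// One stop = a doubling or halving of light, so we take log2 of the ratio.
func stopsChanged(from oldF: Double, to newF: Double) -> Double {
    let lightRatio = pow(oldF / newF, 2) // < 1 means less light gets in
    return log2(lightRatio)              // negative = stops lost
}

print(stopsChanged(from: 1.5, to: 1.78)) // ≈ -0.49: about half a stop less light
print(stopsChanged(from: 1.8, to: 2.2))  // ≈ -0.58: a bit more than half a stop less
```

So the new primary lens gives up roughly half a stop, which the larger sensor has to claw back.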

That’s not the whole story, though. All three cameras get physically bigger sensors, which means more light-gathering capability and less noise.

And the front-facing selfie camera is now, for the first time, an autofocus model instead of a fixed-focus unit. That means you, the subject, will be sharper. 

Not everything is good news, though. The primary camera is now seriously wide. For a long time, the primary camera had a 26mm equivalent lens, which is already very wide. Now it has a 24mm equivalent, which many photographers consider the edge of ultra-wide territory.

“While Apple has made consistent advances with the primary camera in terms of image quality, they have quietly compromised the main lens focal length by going wider and wider, now at 24mm equivalent,” says photography enthusiast Phototo on the DP Review forums. “IMO, it’s a bad choice given that most subjects are people shots which are best taken with longer focal lengths of 30mm and up. This makes even less sense since incorporating the super wide camera.”

For comparison, the ultra-wide camera is a light-bending 13mm equivalent, and the 3x camera is 77mm. ‘Equivalent’ means the numbers are converted to what they would be on a film or full-frame camera, giving us a standard reference point. Speaking of ‘standards,’ the standard lens, considered neither wide nor telephoto, is 50mm. That’s more than double the focal length of the iPhone’s primary camera.
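If you’re curious how those ‘equivalent’ numbers are derived, here’s a sketch. The 43.3mm full-frame diagonal is the standard reference; the sensor diagonal and actual focal length below are illustrative placeholders, not Apple’s published specs:

```swift
// Equivalent focal length = actual focal length x crop factor, where the
// crop factor is the full-frame diagonal divided by the sensor's diagonal.
let fullFrameDiagonal = 43.3 // mm, the standard 35mm-format reference

func equivalentFocalLength(actual: Double, sensorDiagonal: Double) -> Double {
    let cropFactor = fullFrameDiagonal / sensorDiagonal
    return actual * cropFactor
}

// A hypothetical ~6.9mm lens over a ~12.5mm-diagonal sensor:
print(equivalentFocalLength(actual: 6.9, sensorDiagonal: 12.5)) // ≈ 24mm
```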

48MP

48 megapixels is absurd for a phone. Left unprocessed, those images would come in at around 50-60MB each. Compare that to the 12MP pictures from, say, the iPhone 12, which come in at 2-3MB.
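The arithmetic behind those file sizes is easy to sketch. The 12-bit sample depth here is an assumption for illustration; real raw files add metadata and usually some lossless compression:

```swift
// One raw sample per sensor pixel, before any compression.
let pixels = 48_000_000.0
let bitsPerPixel = 12.0 // assumed bit depth, for illustration only
let megabytes = pixels * bitsPerPixel / 8 / 1_000_000
print(megabytes) // 72MB uncompressed; lossless compression brings it near 50-60MB
```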

But the story here is that these images are processed. A lot. If iPhone 14 Pro images were food, they’d be those wet pink flaps of ‘ham’ in the shape of teddy bear faces. That’s how processed they are. 

Apple takes those 48 million pixels, jams them through its imaging pipeline, and pops out regular old 12MP images. The result is an image with way more detail, thanks to the increased data from the initial capture, but that is still small enough to store and share. 

“We do expect the 12-megapixel images to be much better, in all conditions, because: 12 MP shots will be ‘binned’; with signal collected from 4 pixels vs. 1, there’s less noise and higher image quality; its sensor is almost twice the size; new Photonic Engine pipeline,” says camera app developer Halide on Twitter. 
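To make ‘binning’ concrete, here’s a minimal sketch of 2x2 binning in Swift. It ignores the color filter array (real quad-pixel sensors bin same-color pixels), so treat it as the idea rather than Apple’s implementation:

```swift
// Combine each 2x2 block of sensor values into one output pixel.
// Random noise partly cancels across the four samples, so the binned
// pixel is cleaner than any single one.
func bin2x2(_ sensor: [[Double]]) -> [[Double]] {
    let rows = sensor.count / 2
    let cols = sensor[0].count / 2
    var out = Array(repeating: Array(repeating: 0.0, count: cols), count: rows)
    for r in 0..<rows {
        for c in 0..<cols {
            // Averaging here; summing is also common, it's just a gain choice.
            out[r][c] = (sensor[2*r][2*c] + sensor[2*r][2*c+1]
                       + sensor[2*r+1][2*c] + sensor[2*r+1][2*c+1]) / 4
        }
    }
    return out
}
```

Averaging four samples cuts random noise roughly in half (noise falls with the square root of the sample count), which is exactly the low-light win Halide describes.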

And that pipeline is impressive. The camera doesn’t just capture one image and then process that. Depending on the mode you’re in, it might capture several, or even dozens, of images, compare the differences between them, extract the best parts, and then use all that data to perform different tricks. These tricks include smile and blink detection, and therefore the magic ability to choose the capture where the subject looks the least goofy.
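A heavily simplified sketch of that selection step might look like the following. The Frame type and its scores are hypothetical stand-ins; Apple doesn’t publish how its detection actually works:

```swift
// Hypothetical per-frame scores produced by blink and smile detection.
struct Frame {
    let eyesOpenScore: Double // 0...1, higher = eyes open
    let smileScore: Double    // 0...1, higher = smiling
}

// Keep the frame with the best combined score: the least goofy capture.
func bestFrame(_ candidates: [Frame]) -> Frame? {
    candidates.max { ($0.eyesOpenScore + $0.smileScore) <
                     ($1.eyesOpenScore + $1.smileScore) }
}
```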

Another of those tricks is the Photonic Engine.

Low Light Photonic Engine

A camera, whether film or digital, captures light: both film and digital sensors react to photons falling on them. That means it’s harder to take pictures in low light, like indoors or at night, partly because less light means less data (the same reason our eyes see less at night), and partly because when the signal level drops, it gets closer to the noise floor of the sensor. That leads to ugly, noisy images when you try to amplify everything into a picture.
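A simplified noise model makes that ‘noise floor’ point concrete. Photon arrival is random (shot noise grows roughly with the square root of the signal), and the sensor adds a fixed floor of read noise; the read-noise figure below is an illustrative guess:

```swift
// Signal-to-noise ratio under a basic shot-noise-plus-read-noise model.
func snr(photons: Double, readNoise: Double = 5.0) -> Double {
    photons / (photons + readNoise * readNoise).squareRoot()
}

print(snr(photons: 10_000)) // ≈ 99.9: bright light, noise is invisible
print(snr(photons: 100))    // ≈ 8.9: dim light, the signal sits near the noise floor
```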

Low light can mean darkness, which is what Night Mode is for, but low-light noise also strikes in anything but sunlight or bright overcast conditions.

The Photonic Engine is Apple’s answer to noisy low-light images. Essentially, it’s an extension of the existing Sweater Mode, aka Deep Fusion. Deep Fusion took several images, compared them to sift the noise from the real detail, and then combined the results to make a sharper, more detailed image than would otherwise be possible.

The Photonic Engine does the same, but instead of combining already-processed images, it starts work further back along the pipeline, chomping through the unprocessed data from those 48MP captures. 
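The core idea of merging multiple captures, so that random noise cancels while real detail reinforces, can be sketched in a few lines. A real pipeline also aligns frames and rejects moving subjects, which this skips entirely:

```swift
// Average several same-sized captures of the same scene, pixel by pixel.
// Noise is random frame to frame, so it averages toward zero; detail doesn't.
func mergeFrames(_ frames: [[Double]]) -> [Double] {
    guard let first = frames.first else { return [] }
    var merged = Array(repeating: 0.0, count: first.count)
    for frame in frames {           // assumes all frames have equal length
        for i in 0..<frame.count {
            merged[i] += frame[i]
        }
    }
    return merged.map { $0 / Double(frames.count) }
}
```

Averaging N frames cuts random noise by roughly the square root of N, which is why the more captures the pipeline can afford to merge, the cleaner the result.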

In itself, this is an incredible feat. That’s a ton of data to get through, and multiple captures only add to the workload. But the results, judging by Apple’s promo shots, are clearly impressive: even at very normal light levels, the detail is astonishing.

Magic Zoom

Apple has pulled off one other very neat trick, something only possible with a big sensor like the 48MP model used here: the 2x zoom.

Usually, if you want to zoom into a scene, you need a zoom lens or a telephoto lens. Or you can use a digital zoom, which is essentially the same as taking the center section of the image and enlarging it. Digital zoom results in lower-quality images, which is why we have the extra cameras on the backs of our phones, the equivalent of changing lenses on a DSLR or mirrorless camera.

However, the phone doesn’t always use those extra cameras. On previous iPhones, the low-light performance of the telephoto cameras has been relatively poor compared to the primary camera. In low light, when you take a 2x or 3x picture (depending on the iPhone model), the phone will prefer a digital crop from the primary camera over using the telephoto.

As we have seen, the iPhone 14 Pro has a primary camera, a 0.5x ultra-wide, plus a 3x telephoto. But it also has a new option: that 2x zoom. The primary camera cuts a 12-megapixel rectangle out of the center of the 48MP sensor.

This is not technically a digital zoom because every pixel in the final image corresponds to a pixel on the sensor, whereas a digital zoom is a lower-resolution image scaled up to a larger size. It’s a very smart move and will probably still give better images than the 3x zoom in low light.
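The arithmetic is worth sketching. The 8064x6048 dimensions are the figures commonly quoted for a 48MP sensor; halving each axis leaves exactly a quarter of the pixels:

```swift
// A 2x 'crop zoom': take the central quarter of the full sensor frame.
let fullWidth = 8064, fullHeight = 6048
let cropWidth = fullWidth / 2, cropHeight = fullHeight / 2 // 4032 x 3024
let cropX = (fullWidth - cropWidth) / 2                    // 2016
let cropY = (fullHeight - cropHeight) / 2                  // 1512
print(cropX, cropY, cropWidth, cropHeight)
print(cropWidth * cropHeight) // 12,192,768: about 12MP, with no upscaling at all
```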

This sensor-cropping trick also works for video, enabling 4K video at a 2x zoom. And with video, you can let that cut-out frame move to further compensate for camera shake, adding extra image stabilization. 
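Here’s why that crop still covers 4K with headroom to spare for stabilization, using the same illustrative sensor dimensions as above:

```swift
// The 2x crop is bigger than a 4K UHD frame, so the 4K window can slide
// around inside it to counter camera shake.
let crop = (width: 4032, height: 3024) // 2x crop from the sketch above
let uhd = (width: 3840, height: 2160)  // 4K UHD video frame
let slackX = crop.width - uhd.width    // 192 pixels of horizontal headroom
let slackY = crop.height - uhd.height  // 864 pixels of vertical headroom
print(slackX, slackY)
```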

Light Reading

There are other improvements, too, like the new Adaptive True Tone flash, which uses an array of nine LEDs and changes the illumination pattern to match the selected lens for better coverage.

But really, this update comes down to bigger sensors and an incredibly powerful computing engine that doesn’t choke on all that data. It’s impressive, and a very compelling reason to upgrade.
