If you’ve taken a selfie on the iPhone XS, you might think it looks a little different from other selfies you’ve taken in the past, especially those from previous iPhones.
But some users on Reddit and YouTube have claimed that the camera on the iPhone XS has a “beauty mode” effect on faces that smooths imperfections. Several apps like Snapchat, Instagram and FaceTune actively offer filters that enhance or retouch facial features.
Here’s why selfies look different on the XS, but not for the reason you might think.
Cameras don’t see like our eyes do
If you’re taking photos of a high-contrast scene, it’s hard for a camera sensor to capture all the detail in highlights and shadows. Think of a photo taken from indoors, looking out to a window with lots of light streaming in from the outside. Most cameras end up exposing for either the indoor light (which means the window light blows out completely) or the outside light (meaning the indoor scene is dark and underexposed).
One solution is high dynamic range (HDR) images. HDR blends multiple exposures — usually an underexposed, overexposed and correctly metered photo — into one. This helps capture a greater dynamic range in photos, so shadow and highlight detail is evened out. Take this photo from the iPhone XS with HDR on (left) and off (right) and notice the extra detail retained in the window.
Without HDR, phone cameras can struggle to expose for both shadow and highlight detail, so you often end up with blown highlights or muddy-looking shadows.
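The exposure-blending idea can be sketched in a few lines of Python. This is a toy version of generic exposure fusion, not Apple’s actual algorithm, and every name in it (`well_exposedness`, `fuse_exposures`, the sample `scene`) is invented for illustration: pixels closest to mid-gray get the highest weight, and the final image is the per-pixel weighted average of the bracketed frames.

```python
import numpy as np

def well_exposedness(frame, sigma=0.2):
    """Gaussian weight: highest at mid-gray (0.5), falling off toward 0 and 1."""
    return np.exp(-((frame - 0.5) ** 2) / (2 * sigma ** 2))

def fuse_exposures(frames):
    """Blend a stack of bracketed grayscale frames ([0, 1]) into one image."""
    frames = np.stack(frames)                      # shape: (n, h, w)
    weights = well_exposedness(frames)
    weights /= weights.sum(axis=0, keepdims=True)  # normalize per pixel
    return (weights * frames).sum(axis=0)

# Underexposed, "correct" and overexposed versions of the same tiny scene.
scene = np.array([[0.1, 0.5, 0.9]])
under, normal, over = scene * 0.5, scene, np.clip(scene * 2.0, 0, 1)
fused = fuse_exposures([under, normal, over])
```

Each output pixel lands between the darkest and brightest version of that pixel, leaning toward whichever frame exposed it best — exactly the “best parts of each frame” intuition behind HDR merging.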
Why HDR looks different
Many photographers have used HDR techniques to achieve what can look like “hyper-real” photos. With heavy enough processing, at the most extreme end of the scale, photos can look oversaturated, almost like illustrations or airbrushed versions of reality.
Apple’s version of HDR on the iPhone XS, XS Max and the forthcoming iPhone XR is called Smart HDR. It’s on by default for photos taken on both the front and rear cameras. (If you want to turn it off, go to Settings > Camera.)
At Apple’s September launch event, Phil Schiller used the example of a photo of a moving subject to explain how Smart HDR works. The A12 Bionic chip first captures four frames as a buffer, then takes additional “inter frames” at different exposures to bring out highlight detail. It also takes a long exposure to grab shadow detail. The camera then analyzes all the frames and works out how to merge the best parts into one photo.
With Smart HDR turned on, the XS generates a blended image. But even with Smart HDR turned off, the XS is already using computational photography to merge exposures, perform local tone mapping (a technique that adjusts brightness and contrast region by region) and recover highlight detail in regular photos.
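What local tone mapping does to an image can also be sketched with a toy example. The snippet below is a generic illustration, not Apple’s implementation (the function names are invented): it pulls each pixel toward its neighborhood average, which lifts shadows and tames highlights at the cost of local contrast — the very softness people notice.

```python
import numpy as np

def local_mean(img, k=3):
    """Mean of each pixel's k x k neighborhood, with edge padding."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    h, w = img.shape
    return np.array([[padded[i:i + k, j:j + k].mean()
                      for j in range(w)] for i in range(h)])

def tone_map(img, strength=0.5):
    """Blend each pixel with its local mean: strength=0 leaves the image
    untouched, strength=1 flattens it to the local average."""
    m = local_mean(img)
    return m + (1.0 - strength) * (img - m)

# A hard edge between a shadow (0.1) and a highlight (0.9)...
img = np.array([[0.1, 0.1, 0.9, 0.9]] * 4)
mapped = tone_map(img)
# ...the edge survives, but the step is smaller: local contrast dropped.
```

Nothing is clipped — shadow and highlight detail are both retained — but the reduced step across the edge is why the result can read as “less sharp.”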
It’s also important to note that merging multiple exposures and blending images isn’t unique to Apple. Google Pixel and Samsung Galaxy phones do similar things in their own HDR modes.
So what’s with the smoothing effect?
Two things. First, an HDR image can simply look “airbrushed,” particularly when you compare it to a photo taken on a phone without HDR turned on. Take, for example, the portrait below, taken on the rear cameras of both the iPhone XS (left) and iPhone X (right). The XS image may look softer to you because blending exposures has toned down the glowing highlights and reduced contrast.
Second, to make an HDR image you need at least three photos taken at virtually the same moment. Unless you hold the phone incredibly steady or ask your subject to hold their expression (try that with kids), you’ll likely introduce some sort of shake. The way around this is to have the camera shoot at incredibly fast shutter speeds.
But to get a good exposure at a fast shutter speed of hundredths of a second, especially in low light, the camera needs to crank up the ISO (light sensitivity). This introduces a lot of noise, which can look like speckles or grain in your photos. The effect is only magnified by a small sensor like the one in the front-facing camera.
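The shutter/ISO tradeoff is plain reciprocity arithmetic — standard photography math, nothing iPhone-specific (the helper name is invented for this sketch):

```python
# At a fixed aperture, total exposure is roughly proportional to
# shutter_time * ISO. Cutting the shutter time by some factor means
# raising ISO by the same factor -- and higher ISO means more noise.

def iso_needed(base_iso, base_shutter, new_shutter):
    """ISO required to keep exposure constant after a shutter change."""
    return base_iso * (base_shutter / new_shutter)

# A 1/30 s exposure at ISO 100, re-shot at 1/240 s to freeze motion
# for a burst, needs ISO 800:
print(iso_needed(100, 1/30, 1/240))  # -> 800.0
```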
Cameras often apply noise reduction to get rid of this noise, but the tradeoff is photos can look smoothed out. Below is an example of a photo taken on a DSLR in low light at ISO 3200, with a lot of noise (left). On the right, the same photo with heavy noise reduction applied in Lightroom. As you can see, the image on the right looks a lot smoother and loses some detail. It’s an extreme example, but gives you an idea of what noise reduction can do.
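That tradeoff can be demonstrated with a generic mean filter — a crude stand-in for real denoising algorithms, not what the iPhone actually runs: averaging neighbors suppresses random noise, but it flattens genuine fine detail at the same time.

```python
import numpy as np

def mean_filter(signal, k=5):
    """Sliding-window average over a 1-D row of pixels, with edge padding."""
    pad = k // 2
    padded = np.pad(signal, pad, mode="edge")
    return np.array([padded[i:i + k].mean() for i in range(len(signal))])

rng = np.random.default_rng(0)

# A flat gray patch corrupted by sensor noise...
flat = 0.5 + rng.normal(0, 0.05, 200)
# ...and genuine fine detail: alternating light/dark texture.
detail = 0.5 + 0.2 * np.where(np.arange(200) % 2 == 0, 1, -1)

print(flat.std(), mean_filter(flat).std())      # noise variation shrinks
print(detail.std(), mean_filter(detail).std())  # but so does real texture
```

The filter can’t tell noise from texture: both come out smoother, which is the “smoothed out” look in a nutshell.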
Here’s an important caveat with the iPhone XS: if you’re taking selfies or photos from the front-facing camera in good lighting, the camera doesn’t appear to apply much noise reduction at all, at least from my tests. In low light, the noise reduction seems to be more aggressive, hence a smoothing effect.
And it’s not just faces where this applies: if you look at photos of other subjects taken in low light, especially with the front-facing camera, you may notice the same effect.
Sebastiaan de With, designer of popular third-party camera app Halide, explains the changes in the XS camera in this piece. One important conclusion to glean from his deep dive is this:
“The iPhone XS merges exposures and reduces the brightness of the bright areas and reduces the darkness of the shadows. The detail remains, but we can perceive it as less sharp because it lost local contrast.”
What about shooting in raw?
Since iOS 10, iPhones have been able to shoot in raw. Raw files are photos captured straight from the image sensor, with no processing applied. That means no HDR effects, no noise reduction and an untouched image.
However, de With found that if you’re shooting in raw on the XS, the sensor noise is stronger than it was on the X, so the noise reduction is more aggressive. Third-party apps will need to optimize specifically for the new camera, or users will need to shoot in manual mode and deliberately underexpose.
Where to from here?
One way that this effect could potentially be tweaked is with a software update to offer different levels of Smart HDR, or to reduce the intensity of the noise-reduction algorithm for all photos.
Turning off Smart HDR makes more of a difference for photos taken with the rear camera than it does the front camera. And as already discussed, the XS camera is taking photos in a different way than earlier iPhones, through computational photography and merging exposures. So photos already look different, even without Smart HDR on.
But a big part of this entire discussion is how we see photos differently — especially of ourselves. Many people I showed selfies to preferred the XS because their photos looked more even and slightly softer. Others preferred the X because it appeared to retain more detail to their eye, even if the image had more noise. As always, your personal taste may sway you one way or the other. But neither image is wrong: they’re just different.