Monday, 9 November 2020

Apple Reveals Its Design Philosophy Behind the iPhone Camera

Since the iPhone 12 Pro Max marks the first time in a while that Apple has changed the size of its camera sensor, PetaPixel spoke to two Apple executives who outlined the company's vision and design philosophy behind camera development.


In the interview, Francesca Sweet, Apple's Product Line Manager for iPhone, and Jon McCormack, its Vice President of Camera Software Engineering, both made clear that the company thinks of camera development holistically: it's not just the sensor and lenses, but everything from Apple's A14 Bionic chip, to the image signal processing, to the software behind its computational photography.

iPhone Camera Design Philosophy

Apple says that its main goal for smartphone photography is to let people live their lives and capture photos of that life without being distracted by the technology.


"As photographers, we tend to have to think a lot about things like ISO, subject motion, et cetera," McCormack said "And Apple wants to take that away to allow people to stay in the moment, take a great photo, and get back to what they're doing."

He explained that while more serious photographers want to take a photo and then make it their own in editing, Apple is doing what it can to compress that process into the single action of capturing a frame, all with the goal of removing distractions that could take a person out of the moment.

"We replicate as much as we can to what the photographer will do in post," McCormack continued. "There are two sides to taking a photo: the exposure, and how you develop it afterwards. We use a lot of computational photography in exposure, but more and more in post and doing that automatically for you. The goal of this is to make photographs that look more true to life, to replicate what it was like to actually be there."

McCormack said that Apple achieves this by using machine learning to break down a scene into more easily understood pieces.

"The background, foreground, eyes, lips, hair, skin, clothing, skies. We process all these independently like you would in Lightroom with a bunch of local adjustments," he explained. "We adjust everything from exposure, contrast, and saturation, and combine them all together."

Speaking specifically about Apple's Smart HDR technology, McCormack explained how we already see the benefits of this kind of computational photography.
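
Smart HDR captures several frames at different exposures and merges them. As a rough illustration of the merging step only (a generic exposure-fusion sketch, not Apple's algorithm), well-exposed pixels can be weighted most heavily:

import numpy as np

def exposure_weight(frame):
    """Favor well-exposed pixels; down-weight clipped shadows and highlights."""
    return np.exp(-((frame - 0.5) ** 2) / (2 * 0.2 ** 2))

def merge_brackets(frames):
    """Naive exposure fusion: per-pixel weighted average of bracketed frames.

    frames: list of HxWx3 float arrays in [0, 1], same scene at different exposures.
    """
    weights = [exposure_weight(f) for f in frames]
    total = np.sum(weights, axis=0) + 1e-8     # guard against divide-by-zero
    fused = np.sum([w * f for w, f in zip(weights, frames)], axis=0)
    return fused / total

A real pipeline would also align the frames to handle subject motion before merging; this sketch assumes perfectly registered inputs.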
