In parallel, the advancement of iPhone’s camera has also played a significant role in encouraging upgrades each year. While camera sensors have incrementally improved, the image processing capabilities built into Apple’s Ax silicon have advanced even faster, providing the logic to capture better photos more rapidly with sharper focus and more accurate color, tone, and exposure.
This year, Apple’s high-end iPhone 7 model is expected to deliver dual camera sensors, building upon technology Apple gained through a series of acquisitions, including PrimeSense, LinX, Metaio, Emotient, Faceshift, and Perceptio.
Those purchases amount to one of the largest outlays Apple has ever made for outside technology. Perhaps that spending was deliberate, made with a specific goal in mind.
Yet again, analysts poring over these rumored Chinese iPhone case designs—as if they were magically meaningful tea leaves portending that nothing new would happen this year—have overlooked all of this, even with the apertures of two lenses staring right at them.
There are incredible things you can do with two camera sensors: depth perception, image correction, expanded dynamic range in exposure and color gamut—just ask anyone who has ever worn an eyepatch. Two eyes are better than one.
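To illustrate why two sensors enable depth perception: a point in the scene appears shifted (a "disparity") between the two images, and the closer the point, the larger the shift. The standard stereo triangulation formula recovers depth from that shift. This is a minimal sketch of the geometry, not Apple's implementation; the focal length, lens baseline, and disparity values below are hypothetical.

```python
def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Depth (in meters) of a point seen by two horizontally offset cameras.

    focal_px:     focal length expressed in pixels
    baseline_m:   distance between the two lenses, in meters
    disparity_px: horizontal shift of the point between the two images
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    # Similar triangles: depth Z = f * B / d
    return focal_px * baseline_m / disparity_px

# A nearby subject shifts far more between the two images than a distant one
# (illustrative numbers: ~2800 px focal length, 1 cm lens baseline):
near = depth_from_disparity(focal_px=2800.0, baseline_m=0.01, disparity_px=40.0)
far = depth_from_disparity(focal_px=2800.0, baseline_m=0.01, disparity_px=4.0)
print(round(near, 2), round(far, 2))  # 0.7 7.0
```

Computing a disparity for every pixel yields a depth map, which is what makes effects like background separation and refocusing possible in software.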
Combine new lenses with advanced imaging silicon and you have really exciting potential for entirely new kinds of image capture. But Apple has also already demonstrated something else related to iOS photography: vastly advanced image cataloging by subject, location and image type, now being done directly on the phone in iOS 10 Photos. It shouldn't be surprising that the best and fastest implementation of this will be on Apple's latest A10-powered iPhone 7 hardware.