1. Advanced 3D sensing function
Apple has repeatedly expressed interest in augmented reality (AR). The release of the ARKit development framework has made AR applications increasingly popular, and users are beginning to experience AR technology on their iPhone or iPad. But this is only the beginning.
The addition of a third lens is likely to significantly improve the iPhone's 3D spatial recognition capability. Last year it was reported that the rear sensing system would be more advanced than the front TrueDepth camera system on the iPhone X series. This would also open the door to a variety of new AR applications.
The report also claims that the three-lens array will be capable of stereoscopic imaging: two sensors capture images of the same object from slightly different angles, and triangulation is then used to obtain the distance between the iPhone and the subject.
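The triangulation described above follows the standard pinhole stereo relation: depth is proportional to the lens separation (baseline) and inversely proportional to the pixel offset (disparity) between the two images. A minimal sketch, with purely illustrative numbers rather than any actual iPhone specs:

```python
# Illustrative sketch (not Apple's implementation): depth from stereo
# triangulation. Two lenses separated by a baseline B observe the same
# point; the pixel offset (disparity) between the two images gives depth.

def stereo_depth(focal_length_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth Z = f * B / d, the standard pinhole stereo relation."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# Hypothetical numbers: 2800 px focal length, 12 mm lens baseline,
# 28 px disparity -> subject roughly 1.2 m from the camera.
print(round(stereo_depth(2800, 0.012, 28), 2))  # -> 1.2
```

Note how a larger disparity (the object shifting more between the two views) means the subject is closer, which is why two physically separated lenses are needed at all.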
In 2017, Bloomberg reported that Apple's goal was to use rear 3D sensing technology in an iPhone launched in 2019. At the time, however, Apple was reportedly evaluating a time-of-flight (ToF) approach, which computes a 3D image of the environment from the time it takes laser light to reflect off surrounding objects.
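The ToF calculation itself is simple physics: the emitted pulse covers the camera-to-object path twice, so distance is half the round-trip time multiplied by the speed of light. A sketch of the idea (not Apple's design, and the timing value below is only illustrative):

```python
# Illustrative sketch of time-of-flight ranging: a laser pulse travels
# to the object and back, so distance = c * t / 2.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_distance(round_trip_time_s: float) -> float:
    """Distance in metres from the measured round-trip time in seconds."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2

# A round trip of about 6.67 nanoseconds corresponds to roughly 1 metre,
# which shows why ToF sensors need picosecond-scale timing precision.
print(round(tof_distance(6.67e-9), 2))  # -> 1.0
```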
Whichever approach Apple takes, the third lens should improve the iPhone's AR capabilities. It may also lay the groundwork for the Apple glasses rumored for 2020.
2. Enhanced optical zoom function
The third lens may also bring 3x optical zoom to the iPhone for the first time, making the camera more versatile. The current iPhone 7 Plus, iPhone 8 Plus, iPhone X, iPhone XS, and iPhone XS Max support only 2x optical zoom.
The Huawei P20 Pro was the first smartphone with a rear triple-camera system, comprising a 40-megapixel main lens, a 20-megapixel monochrome lens, and an 8-megapixel telephoto lens that provides 3x optical zoom.
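The "2x" and "3x" figures quoted above come from a simple ratio: the telephoto lens's (35 mm equivalent) focal length divided by the main wide lens's. A small sketch, with focal lengths that are illustrative rather than confirmed specifications:

```python
# Sketch of how a phone's quoted optical zoom factor is derived:
# the ratio of the telephoto focal length to the main (wide) focal
# length, both in 35 mm equivalent terms. Numbers are illustrative.

def optical_zoom(tele_focal_mm: float, wide_focal_mm: float) -> float:
    """Zoom factor relative to the main lens."""
    return tele_focal_mm / wide_focal_mm

print(optical_zoom(52, 26))  # -> 2.0, a typical dual-lens pairing
print(optical_zoom(78, 26))  # -> 3.0, a hypothetical 3x telephoto
```

This is why 3x zoom requires a longer (and physically thicker) telephoto module, one of the engineering constraints a triple-camera design has to absorb.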
3. Improved low-light performance
Adding a third lens would capture more light and improve the iPhone's low-light performance.
Apple has invested heavily in computational photography. The Smart HDR feature on the latest iPhones, for example, gives photos more highlight and shadow detail, while Depth Control significantly improves Portrait mode.
Google's Night Sight feature for the Pixel 3 and Pixel 3 XL sets a new benchmark for low-light shooting. It uses machine learning to select the right colors based on image content, producing bright photos in low light without the need for a flash.
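A core idea behind burst-based low-light modes like the one described above (this is a conceptual sketch, not Google's actual pipeline) is merging many short exposures: averaging N frames preserves brightness while shrinking random sensor noise by roughly a factor of sqrt(N).

```python
# Conceptual sketch of burst averaging for low-light photography:
# average many noisy short exposures pixel-by-pixel so random sensor
# noise cancels while the true scene brightness is preserved.

import random

def average_burst(frames):
    """Average N frames pixel-by-pixel (each frame is a list of pixel values)."""
    n = len(frames)
    return [sum(px) / n for px in zip(*frames)]

# Simulate 16 noisy captures of a 4-pixel scene whose true value is 50.
random.seed(0)
true_value = 50.0
frames = [[true_value + random.gauss(0, 10) for _ in range(4)]
          for _ in range(16)]
merged = average_burst(frames)
# The merged pixels cluster much closer to 50 than any single frame does.
```

Real pipelines add alignment, tone mapping, and learned color processing on top, but frame merging is what makes a flash unnecessary in the first place.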
If Apple adds a third lens and pairs it with more advanced machine learning techniques, it may be able to deliver a shooting mode this year that competes with Night Sight.