New information regarding the dual-camera setup leaks

Aug 8, 2016 12:07 GMT  ·  By

We’ve known for a while that the next iPhone model (whether it’s called iPhone 7, 2016 iPhone, or iPhone 6 SE) will come with camera upgrades on both the standard version and the bigger Plus, but the latter is definitely more interesting because it’ll feature a dual-lens configuration for the first time in Apple’s history.

It goes without saying that a dual-camera setup should provide better-quality photos, but despite the number of leaks so far, nobody knows for sure what the purpose of this new hardware on the upcoming iPhone actually is.

According to a report from Bloomberg, however, the new iPhone Plus will come with a camera system that uses two separate sensors to shoot different photos, which are then combined for a higher-quality result. The aim is to achieve brighter and sharper photos, according to the report, while also offering an increased zoom level.

At first glance, such information doesn’t reveal much about the next iPhone, but when looking at the dual-camera configurations already on the market, it seems that Apple is going to pick something similar to Huawei’s P9.

How the dual cameras could work

So basically, the purpose is to offer two different cameras on the next iPhone that work independently but remain connected to each other, combining their shots for better quality. So how would they do that?

First of all, it all comes down to the sensors they’re going to use. While one of the cameras will use a regular RGB sensor that shoots color photos just like today’s, the other sensor will be monochrome, so it’ll take pictures in black and white. The whole idea is very simple: using complex software optimizations (the CPU and RAM will play a key role here, which is why hardware upgrades in this area are necessary), the two photos are combined into one better result with sharper colors and better brightness.
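To make the idea concrete, here is a minimal sketch of one common way to fuse such a pair of shots: keep the color information from the RGB frame but replace its luminance with the cleaner luminance recorded by the monochrome sensor. This is a generic technique, not Apple’s actual pipeline, and the function name is our own.

```python
import numpy as np

def fuse_luminance(rgb, mono):
    """Fuse a color frame with a monochrome frame by transferring
    the mono sensor's luminance onto the color image.

    rgb:  H x W x 3 float array in [0, 1] from the color sensor
    mono: H x W     float array in [0, 1] from the monochrome sensor
    """
    # Luminance of the color frame (standard Rec. 601 luma weights)
    luma = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    # Per-pixel gain that maps the color frame's luminance onto the
    # mono frame's; clip avoids division by zero in dark pixels
    scale = mono / np.clip(luma, 1e-6, None)
    # Scale all three channels equally so hue and saturation survive
    fused = rgb * scale[..., None]
    return np.clip(fused, 0.0, 1.0)
```

The key point the sketch illustrates is that brightness and color can be treated as separate layers: the mono sensor supplies the former, the RGB sensor the latter.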

The black-and-white photo plays a more important role than you might think, because it contributes to the better brightness Apple might be aiming for. The underlying principle, simplified, is that a monochrome sensor captures more light than a similar color one.

When shooting a color photo, the RGB sensor isn’t trying to accurately measure the amount of light; instead, it filters the light in order to reproduce colors and determine which color is which. Ideally, that filtering wouldn’t affect brightness too much, but the loss is practically impossible to prevent in a camera as small as the ones on smartphones. This is the reason contre-jour (backlit) photos taken with smartphones aren’t necessarily the best and usually have awful brightness.

On the other hand, the monochrome sensor is focused specifically on the amount of light: because it doesn’t need to do any filtering, it doesn’t lose detail, providing a sharper end result.
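A rough back-of-the-envelope calculation shows why the filtering matters so much. On a typical color sensor, each photosite sits behind a filter that passes only one color band, so roughly a third of the incoming light reaches it; a monochrome photosite has no such filter. The numbers below are illustrative, not Apple’s specifications.

```python
# Illustrative comparison of per-pixel light capture
incoming_light = 1.0

# Color sensor: each photosite sees only one color band of the light
bayer_transmission = 1 / 3
color_signal = incoming_light * bayer_transmission

# Monochrome sensor: no color filter, nearly all light is recorded
mono_signal = incoming_light

print(f"color sensor signal per pixel: {color_signal:.2f}")
print(f"mono sensor signal per pixel:  {mono_signal:.2f}")
print(f"mono advantage: ~{mono_signal / color_signal:.0f}x more light")
```

Under these simplified assumptions, the monochrome pixel collects roughly three times the signal, which is exactly the brightness headroom the fusion step can exploit.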

By mixing the two sensors, Apple could get the best of both: a sharper and brighter photo. And with post-processing, the new iPhone could refine these pictures further, providing better quality than the existing models.

Without a doubt, there might be more technology behind Apple’s new dual-camera system, and we’re going to find out everything about it next month, when the overhauled iPhone launches. Until then, this scenario seems the most likely, and it would really make sense given all the information that has leaked so far.

Alleged dual-camera module for the next iPhone
