When it comes to the crazy megapixel competition, the ante was upped once again in 2022. After the first 108 MP smartphone hit the market in late 2019, the 200 MP mark was crossed this month with the Motorola Edge 30 Ultra (review) and the Xiaomi 12T Pro (hands-on).
Even the previously mentioned iPhone 14 Pro jumped to a 48 MP camera this year, which is quite unusual for Apple. However, the interesting aspect of ultra-high-resolution sensors is not 100 MB photos for wall-sized prints at home, but zooming.
How does 2x zoom work on the iPhone 14 Pro?
Unlike full-fledged digital cameras, most smartphones do not have zoom lenses that allow changing the focal length, simply due to space constraints. Instead, they carry separate sensors for different zoom levels. The iPhone 14 Pro has one sensor for ultra-wide-angle (0.5x), one for wide-angle (1x), and one for telephoto (3x).
When you zoom using the two-finger gesture in the camera app, the smartphone digitally crops the image from one camera until the zoom level of the next camera module is reached. As the digital zoom increases, quality naturally decreases; how much it decreases depends on the camera in question.
The iPhone 14 Pro's main camera has a 1/1.28-inch image sensor that measures 9.8 by 7.3 millimeters and offers a resolution of 48 MP. At 2x zoom, Apple simply crops a 12-megapixel section from the center of the sensor. This crop still measures 4.9 by 3.7 millimeters and corresponds to a 1/3-inch format. That is still good enough for decent photos in good lighting conditions.
At 3x zoom, Apple switches to a dedicated 12-megapixel telephoto sensor (strictly speaking, its 77-millimeter focal length corresponds to 3.2x). At 1/3.5 inch, or 4.0 by 3.0 millimeters, this sensor is again a bit smaller than the "2x sensor" cropped from the center of the main camera. It's small.
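To make these numbers tangible, here is a minimal Python sketch of the crop-and-switch logic described above. The sensor dimensions are the ones quoted in the text; the clean 3x switching point and the simple center crop are simplifying assumptions for illustration, not Apple's actual pipeline:

```python
# Minimal sketch of crop-and-switch digital zoom on the iPhone 14 Pro.
# Sensor dimensions are the ones quoted in the text; the 3x switching
# point and the pure center crop are simplifying assumptions.

MAIN_W, MAIN_H, MAIN_MP = 9.8, 7.3, 48   # 1/1.28" main camera (24 mm)
TELE_W, TELE_H, TELE_MP = 4.0, 3.0, 12   # 1/3.5" telephoto (77 mm)

def effective_sensor(zoom):
    """Camera plus effective sensor size and resolution at a zoom level."""
    if zoom >= 3.0:
        # From 3x on, the dedicated telephoto sensor takes over in full.
        return "telephoto", TELE_W, TELE_H, TELE_MP
    # Below 3x, the main sensor is center-cropped: 2x zoom uses half the
    # width and half the height, i.e. a quarter of the pixels.
    return "main (crop)", MAIN_W / zoom, MAIN_H / zoom, MAIN_MP / zoom**2

for z in (1.0, 2.0, 2.9, 3.0):
    cam, w, h, mp = effective_sensor(z)
    print(f"{z}x -> {cam}: {w:.2f} x {h:.2f} mm, {mp:.0f} MP")
```

Running this also shows the dip just before the switch: at 2.9x, the cropped main sensor is down to roughly 3.4 by 2.5 millimeters and around 6 MP before the full telephoto sensor takes over at 3x.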
In the following chart, you can see how much sensor area is available to the iPhone 14 Pro and the iPhone 13 Pro at different focal lengths. The widest angle (13 millimeters) starts on the left side. The main camera (24 millimeters) shows a significant increase. Up to the telephoto camera (77 millimeters), the iPhone 14 Pro always has a larger sensor area than the iPhone 13 Pro. The telephoto camera itself, finally, remains the same size.
The same comparison is also interesting with megapixels (MP) rather than sensor area on the vertical axis. While the ultra-wide and telephoto sensors remain unchanged at 12 MP each, the jump to 48 MP for the main camera stands out. The iPhone 14 Pro thus has significantly more resolution to play with for digital zoom between 1x and 3.2x.
How big and how high-resolution should it be?
The iPhone 14 Pro discussed so far is not even the smartphone with the highest resolution or the biggest sensor. This week, Xiaomi introduced the 12T Pro, which offers a 200 MP sensor but in return does without a telephoto camera entirely. But how much more room do more megapixels provide for digital zoom? Let's compare it with the iPhone 14 Pro:
Apart from the resolution, however, the most important factor is the sensor area available at a given focal length. The 1/1.22-inch Isocell HP1 installed in the Xiaomi 12T Pro is slightly larger than the main camera of the iPhone 14 Pro, but it falls behind in sensor area as soon as the Apple smartphone switches to its telephoto camera.
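A rough back-of-the-envelope comparison in Python illustrates this. The Isocell HP1 dimensions of roughly 10.5 by 7.9 millimeters are approximated from the 1/1.22-inch format, and everything else follows the center-crop assumption from above:

```python
# Rough comparison of the usable sensor area under zoom, assuming a
# simple center crop. The HP1 dimensions (about 10.5 x 7.9 mm for the
# 1/1.22-inch format) are approximations.

def xiaomi_area(zoom):
    # The 12T Pro always crops its 200 MP main sensor digitally.
    return (10.5 / zoom) * (7.9 / zoom)

def iphone_area(zoom):
    # From 3.2x (77 mm), the full 4.0 x 3.0 mm telephoto sensor is used;
    # beyond that, it is cropped further in turn.
    crop = max(zoom / 3.2, 1.0)
    return (4.0 / crop) * (3.0 / crop)

for z in (3.2, 5.0, 10.0):
    print(f"{z}x: Xiaomi {xiaomi_area(z):.1f} mm^2 vs. iPhone {iphone_area(z):.1f} mm^2")
```

At 3.2x, the iPhone's dedicated telephoto sensor offers about 12 mm² versus roughly 8 mm² of cropped HP1, and the gap persists at higher zoom levels.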
And what about Quad-Bayers?
We've overlooked one thing so far: the color masks that sit on top of the sensor. To explain Quad-Bayer, we must first look at how image sensors work. An image sensor consists of many tiny light sensors that only measure the amount of incident light, without being able to distinguish colors. 12 MP means 12 million of these light sensors.
To turn this black-and-white sensor into a color sensor, a color mask is placed on top of it that filters the incident light into green, red, and blue. The Bayer mask used in most image sensors always divides each group of four pixels into two green pixels, one red pixel, and one blue pixel. A 12 MP sensor therefore has six million green pixels and three million each of blue and red pixels.
During demosaicing, also known as de-Bayering, image processing algorithms use the brightness values of the surrounding pixels of different colors to estimate the RGB value of each pixel. A very bright green pixel surrounded by "dark" blue and red pixels ends up fully green, while a green pixel next to fully exposed blue and red pixels ends up white. And so on, until we have an image with 12 million RGB pixels.
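For illustration, here is a minimal de-Bayering sketch in Python (using NumPy and SciPy) that mosaics an RGB image onto an RGGB Bayer pattern and reconstructs it with simple bilinear interpolation. Real camera pipelines use far more sophisticated, edge-aware algorithms:

```python
import numpy as np
from scipy.ndimage import convolve

# Minimal bilinear demosaicing sketch for an RGGB Bayer pattern.
# Real ISPs use far more sophisticated, edge-aware algorithms.

def bayer_masks(h, w):
    """Boolean masks marking the R, G, and B positions of an RGGB mask."""
    y, x = np.mgrid[0:h, 0:w]
    r = (y % 2 == 0) & (x % 2 == 0)
    b = (y % 2 == 1) & (x % 2 == 1)
    g = ~r & ~b                       # two green pixels per 2x2 block
    return r, g, b

def mosaic(rgb):
    """Simulate the sensor: keep only one color value per pixel."""
    r, g, b = bayer_masks(*rgb.shape[:2])
    return rgb[..., 0] * r + rgb[..., 1] * g + rgb[..., 2] * b

def demosaic(raw):
    """Estimate full RGB by averaging the nearest same-color neighbors."""
    r, g, b = bayer_masks(*raw.shape)
    k_g  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4   # cross for green
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4   # full for red/blue
    return np.dstack([convolve(raw * r, k_rb, mode="mirror"),
                      convolve(raw * g, k_g,  mode="mirror"),
                      convolve(raw * b, k_rb, mode="mirror")])

# A uniformly green image survives the round trip unchanged.
img = np.zeros((8, 8, 3))
img[..., 1] = 1.0
print(np.allclose(demosaic(mosaic(img)), img))  # True
```

The averaging kernels are the textbook bilinear ones: green is estimated from the four cross neighbors, red and blue from their nearest same-color neighbors in the full 3x3 neighborhood.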
With high-resolution sensors, however, the color mask looks different. In a so-called Quad-Bayer sensor, typically in the 50 MP range, there are four brightness pixels under each red, green, or blue color area. 108 MP sensors have nine (3×3) pixels per color area, and 200 MP sensors have 16 (4×4). Sony calls this Quad-Bayer, while Samsung uses the names Tetracell, Nonacell, or Tetra²pixel.
So while the image sensors resolve up to 200 MP in brightness, the color mask stops at around 12 MP. This is usually not a problem, because to human perception, brightness resolution is more important than color resolution. With extreme digital zoom, however, image errors occur because the color resolution eventually becomes too low.
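This relationship is easy to check. The following Python sketch builds the color layouts described above and works out the effective color resolution for each sensor class; the exact layouts vary between manufacturers, so treat this as an illustration:

```python
import numpy as np

# Sketch of the color masks described above: each color area of the
# Bayer super-pattern covers n x n brightness pixels (n = 2 for
# Quad-Bayer, 3 for 108 MP sensors, 4 for 200 MP sensors).

def cfa_pattern(h, w, n):
    """Color filter layout with n x n brightness pixels per color area."""
    tile = np.array([["G", "R"], ["B", "G"]])        # classic Bayer order
    big = tile.repeat(n, axis=0).repeat(n, axis=1)   # blow each area up to n x n
    reps = (h // big.shape[0] + 1, w // big.shape[1] + 1)
    return np.tile(big, reps)[:h, :w]

print(cfa_pattern(4, 4, 2))   # one Quad-Bayer super-tile

for mp, n in ((12, 1), (48, 2), (108, 3), (200, 4)):
    print(f"{mp:>3} MP brightness resolution -> ~{mp / n**2:.1f} MP color resolution")
```

Whether 48, 108, or 200 MP: the color information always ends up at roughly 12 MP.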
As an example, we have prepared a small image of the Android mascot. On the left you can see (1) a grayscale image at full resolution, on the right (3) an RGB image at a quarter of the resolution. The image in the middle (2) is a mix of the left and right images, and at first glance the result looks great. On closer inspection, however, the transition between green and blue at the top of the Android is not clean.
Such artifacts occur at exactly these transitions when the color mask has a much lower resolution than the sensor itself. Incidentally, this is probably why Samsung used an unusual 64 MP sensor with a regular RGB matrix for the much-maligned 1.1x telephoto cameras in the S20 Plus and S21 Plus.
The bottom line is that it is hard to judge from hardware specifications alone how good a camera's image quality will be, especially since the manufacturer's algorithms ultimately play an even more critical role. And so-called remosaicing is another big challenge for sensors with 2×2, 3×3, or 4×4 Bayer masks: unlike conventional demosaicing, color values must be interpolated across larger sensor areas and in a more complex manner.
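To get a feeling for why remosaicing is harder, here is a tiny sketch comparing a Quad-Bayer layout with the standard Bayer pattern at full resolution. The share of pixels that already sit under the "right" color filter is small; everything else has to be interpolated:

```python
import numpy as np

# How many pixels of a Quad-Bayer layout already match the standard
# Bayer pattern at full resolution? The rest must be interpolated
# during remosaicing. Illustrative layouts only.

quad = np.array([["G", "G", "R", "R"],
                 ["G", "G", "R", "R"],
                 ["B", "B", "G", "G"],
                 ["B", "B", "G", "G"]])

bayer = np.array([["G", "R", "G", "R"],
                  ["B", "G", "B", "G"],
                  ["G", "R", "G", "R"],
                  ["B", "G", "B", "G"]])

match = (quad == bayer).mean()
print(f"{match:.0%} of pixels match; {1 - match:.0%} must be interpolated")
# 38% of pixels match; 62% must be interpolated
```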
On the other hand, very large sensors bring their own problems. Because manufacturers have only limited space for the lenses that supply these sensors with light, chromatic aberrations and other artifacts often occur, especially at the edges of the image. And the shallow depth of field becomes a problem at close range.
So things remain exciting, and I hope you enjoyed this journey into the world of the largest and highest-resolution image sensors. What would your dream smartphone camera look like? I look forward to your comments!