For the first time on an iPhone, the iPhone 14 Pro features a 48-megapixel wide-angle rear camera. Apple had been using 12-megapixel rear cameras since the iPhone 6s. However, taking 48MP photos requires enabling ProRAW in the Apple Camera app or using third-party applications.
How the iPhone 14 Pro uses its 48MP camera
You may ask yourself: if the iPhone 14 Pro has a 48MP sensor, how come it doesn't take 48MP pictures by default?
Apple uses a technique known as pixel binning, which combines data from four pixels into one. This is because using all 48 megapixels individually often results in noisy low-light photos. Instead, the iPhone uses the 48MP sensor to produce a 12MP picture with better quality and less noise.
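To illustrate the idea, here is a minimal Swift sketch of 2x2 binning on a plain grayscale buffer. It is only a toy model: the real sensor bins in hardware across a Bayer color filter array, not on luminance values like this.

```swift
import Foundation

// Toy sketch of 2x2 pixel binning on a grayscale buffer.
// Averaging each 2x2 block turns a (width x height) image into a
// (width/2 x height/2) image with a better signal-to-noise ratio.
func binPixels2x2(_ pixels: [[Double]]) -> [[Double]] {
    var binned: [[Double]] = []
    var y = 0
    while y + 1 < pixels.count {
        var row: [Double] = []
        var x = 0
        while x + 1 < pixels[y].count {
            // Each output pixel is the average of four neighbors.
            let sum = pixels[y][x] + pixels[y][x + 1]
                + pixels[y + 1][x] + pixels[y + 1][x + 1]
            row.append(sum / 4.0)
            x += 2
        }
        binned.append(row)
        y += 2
    }
    return binned
}

// Example: a 4x4 "sensor" becomes a 2x2 image.
let sample: [[Double]] = [
    [1, 3, 5, 7],
    [1, 3, 5, 7],
    [2, 4, 6, 8],
    [2, 4, 6, 8],
]
print(binPixels2x2(sample)) // [[2.0, 6.0], [3.0, 7.0]]
```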
The iPhone 14 Pro's Camera app has also gained a 2x zoom option that crops the central 12 megapixels out of the original 48MP image. This allows users to zoom in digitally without losing definition and without having to switch to the telephoto lens with its 3x zoom.
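Conceptually, that 2x mode is just a center crop. The short sketch below works out the arithmetic, assuming the sensor's full 8064x6048 (48MP) output:

```swift
import CoreGraphics

// Cropping the central quarter of a 48MP (8064x6048) frame yields a
// 12MP (4032x3024) image with no upscaling involved.
let fullFrame = CGRect(x: 0, y: 0, width: 8064, height: 6048)
let cropSize = CGSize(width: fullFrame.width / 2, height: fullFrame.height / 2)
let centerCrop = CGRect(
    x: (fullFrame.width - cropSize.width) / 2,
    y: (fullFrame.height - cropSize.height) / 2,
    width: cropSize.width,
    height: cropSize.height
)
// centerCrop is (2016, 1512, 4032, 3024): a native 12MP region.
```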
Apple has limited how users can take full-resolution pictures with the new sensor: if you want a 48MP photo, you must enable Apple ProRAW in the Camera app.
RAW vs. non-RAW
For those unfamiliar, a RAW photo is basically the original image taken from the sensor, with little or no post-processing. It includes all the information about brightness, shadows, and colors that can be altered later in image editing software like Adobe Lightroom. Because of this, a RAW image file can be 15 times larger than a compressed image.
With regular photos, the camera takes the picture and then discards some of this information, resulting in a smaller file that takes up less space.
Apple implemented RAW capture natively in the iOS Camera app with Apple ProRAW, and now with the iPhone 14 Pro, the company has decided to tie 48MP photos to ProRAW.
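For context, third-party camera apps reach the full sensor through AVFoundation's ProRAW support (iOS 14.3 and later). Here is a rough Swift sketch of the opt-in flow; session setup, permissions, and saving the file are omitted, and the RAW pixel formats only become available after the output is attached to a configured capture session:

```swift
import AVFoundation

final class ProRAWCapture: NSObject, AVCapturePhotoCaptureDelegate {
    let photoOutput = AVCapturePhotoOutput()

    func configure() {
        // ProRAW must be enabled on the output before capturing.
        photoOutput.isAppleProRAWEnabled = photoOutput.isAppleProRAWSupported
    }

    func takeProRAWPhoto() {
        // Pick a ProRAW pixel format from the formats the output offers.
        guard let rawFormat = photoOutput.availableRawPhotoPixelFormatTypes
            .first(where: { AVCapturePhotoOutput.isAppleProRAWPixelFormat($0) })
        else {
            return // ProRAW not available on this device
        }
        let settings = AVCapturePhotoSettings(rawPixelFormatType: rawFormat)
        photoOutput.capturePhoto(with: settings, delegate: self)
    }

    // The DNG data arrives here once the capture finishes.
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard error == nil, let dngData = photo.fileDataRepresentation() else { return }
        _ = dngData // Save to the photo library or to disk.
    }
}
```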
The downside of these photos is that they consume a lot of the iPhone's internal storage. Apple says each 48MP RAW file can be approximately 75MB. There's no doubt that real pro users will end up enabling this option anyway, but everyone should be able to take advantage of the full 48 megapixels of the iPhone 14 Pro camera.
48MP compressed photos
I wish I could take 48MP pictures with my iPhone without having to enable ProRAW, since right now that's the only way to do it. But why does this matter? I've done some experiments to demonstrate how 48-megapixel photos are noticeably better even when compressed.
Check out some of the highlights below. The images have been cropped so you can see the details more clearly:
Here's another example that shows the level of detail in a 48MP photo, even after compression:
I used a shortcut created by Gabriele Trabucco (via Vadim Yuryev) that quickly converts ProRAW 48MP photos to the compressed HEIF (High Efficiency Image File Format). HEIF is a format capable of preserving image quality in substantially smaller files.
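For the curious, here is a rough Swift sketch of what such a conversion can look like using Core Image (CIRAWFilter requires iOS 15 or macOS 12 and later). The file paths are hypothetical examples, and this is not necessarily how the shortcut itself works:

```swift
import Foundation
import CoreImage
import ImageIO

// Decode a ProRAW DNG and re-encode it as compressed HEIF.
let rawURL = URL(fileURLWithPath: "/tmp/IMG_0001.dng")   // hypothetical input
let heifURL = URL(fileURLWithPath: "/tmp/IMG_0001.heic") // hypothetical output

guard let rawFilter = CIRAWFilter(imageURL: rawURL),
      let image = rawFilter.outputImage else {
    fatalError("Could not decode the RAW file")
}

let context = CIContext()
let quality = CIImageRepresentationOption(
    rawValue: kCGImageDestinationLossyCompressionQuality as String
)
do {
    // Quality 0.8 keeps most of the detail at a fraction of the size.
    try context.writeHEIFRepresentation(
        of: image,
        to: heifURL,
        format: .RGBA8,
        colorSpace: CGColorSpace(name: CGColorSpace.displayP3)!,
        options: [quality: 0.8]
    )
} catch {
    print("HEIF encoding failed: \(error)")
}
```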
While each 48MP RAW file in these examples is around 60MB, the HEIF-compressed version is only 3.2MB, even slightly less than the 3.3MB of the 12MP JPEG image.
When it comes to video, iPhone users can already choose between shooting in 720p, 1080p, or 4K. Why not use the same approach for photos? I'm sure that a lot of users would choose to take compressed pictures in 48MP resolution.
Plus: I want to get rid of Smart HDR
Smart HDR was introduced with the iPhone XS back in 2018, and it uses AI to enhance photos by making several post-processing adjustments. Remember the "Beautygate" of the iPhone XS? That was Smart HDR's fault.
Back then, Smart HDR was not as aggressive as it is today, and users could still turn it off. Since the iPhone 12, Smart HDR can no longer be turned off. While it attempts to improve photos, it ends up ruining some of them by making them look extremely unnatural.
I've seen a lot of people complaining about their iPhone pictures recently, and I don't like how many of mine turn out either.
An iOS feature request: An option to turn off Smart HDR. Sometimes, it just ruins the photos (in this case, it destroyed the sky compared to the Live Photo without the same processing). pic.twitter.com/Zb4cPS6qO4
Filipe Esposito (@filipeesposito) October 5, 2022
YouTuber Max Tech has produced a fantastic video demonstrating how the iPhone 14's Smart HDR is falling behind the post-processing of the recently released Google Pixel 7 Pro.
I understand that post-processing is essential to alleviate hardware limitations (in this case, small camera lenses and sensors), but it has reached a point where some of this post-processing is simply too much.
If someone from Apple is reading this, please bring back the option to disable Smart HDR. I don't want to have to take RAW pictures just to take advantage of the 48-megapixel resolution or to avoid this excessive post-processing.
What about you? Would you like to see Apple expand the capabilities of the native iPhone Camera app? Let me know in the comments below.