
Why New Smartphone Cameras Feel Worse
AI Summary
The speaker tested every iPhone generation from the original through the iPhone 17, taking the same picture with each. The results showed that newer generations are not always better. Early smartphone cameras, like the original iPhone's 2-megapixel pinhole-style camera, were rudimentary. As demand for quick, high-quality photos grew, cameras improved drastically and became a key selling point.
Around the 2010s, the pace of dramatic improvement slowed as phone thickness and camera bumps ran into physical limits. Without pixel-peeping, a photo from an iPhone 17 looks much like one from an iPhone 11. The iPhone 17's larger sensor produces more background blur, while the 11 keeps more of the scene in focus. In ideal daylight, most phones from the past five years take perfectly usable photos.
Modern cameras now focus on difficult scenarios: low light, fast-moving subjects, and long zoom. Any current phone can take a good photo in good lighting, so the real differentiation comes in challenging conditions. For example, a Pixel 10 can combine computational photography, multi-frame HDR, and tone mapping to capture a well-exposed backlit scene, something an older phone like the Nexus 4 simply cannot do.
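The multi-frame HDR idea can be sketched in a few lines. This is a minimal illustration of Mertens-style exposure fusion plus a global Reinhard tone-mapping curve, not the actual pipeline Google or Apple ships; the weighting function, sigma value, and sample frames are all illustrative assumptions.

```python
import numpy as np

def well_exposedness(img, sigma=0.2):
    # Weight each pixel by how close it is to mid-gray (0.5); near-black
    # or blown-out pixels get low weight (Mertens-style fusion heuristic).
    return np.exp(-((img - 0.5) ** 2) / (2 * sigma ** 2))

def fuse_exposures(frames):
    # frames: grayscale images in [0, 1] shot at different exposures.
    # Each output pixel is a weighted blend favoring well-exposed frames.
    weights = [well_exposedness(f) for f in frames]
    total = np.sum(weights, axis=0) + 1e-8
    return sum(w * f for w, f in zip(weights, frames)) / total

def reinhard_tonemap(x):
    # Simple global Reinhard operator: compresses highlights toward 1
    # so the fused result fits a displayable range.
    return x / (1.0 + x)

# Toy backlit scene: the short exposure keeps the sky, the long
# exposure keeps the shadowed subject (values are made up).
short_exp = np.array([[0.05, 0.10], [0.45, 0.95]])
long_exp = np.array([[0.40, 0.55], [0.90, 1.00]])
fused = fuse_exposures([short_exp, long_exp])
result = reinhard_tonemap(fused)
```

The fusion pulls each pixel toward whichever frame exposed it best, which is why a backlit shot can keep both a bright sky and a visible subject.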
However, this pursuit of the "perfect" photo, where computational tricks can rescue any situation, has made ordinary daylight photos slightly worse. The speaker observes an "overprocessed" look, particularly on Samsung Galaxy phones. Comparing the Galaxy S23 to the S26, the S23's photo has less shadow detail and weaker HDR, yet looks more natural, without the unnatural glow and flatness of the S26's. Many viewers agree.
The challenge for manufacturers is deciding when to apply this heavy processing: it is essential for rescuing bad photos but can degrade good ones. The visible "snap" in the viewfinder just after a shot is taken, especially in difficult scenarios, is that post-processing kicking in. Dialing post-processing down, or turning it off, can sometimes yield more natural-looking results, even on older phones.