People use their phones for all sorts of things, but ‘photography’ is top of the list for many. Apple knows this, and points out that iPhone users took more than 3 trillion photos in the past year. It stands to reason that the company continues to focus on making its cameras better while also leaning into the computational photography that helps photographers take better photos more easily.
The company claims that the iPhone 14’s cameras are even better in all lighting situations than before. The phone gets a new 12-megapixel main camera with a larger sensor and an f/1.5 aperture. It still incorporates Apple’s ‘sensor shift’ optical image stabilization, and the phone gains a few new tricks as well.
The larger aperture helps when taking photos of fast-moving subjects, and the company claims a 49% improvement in low-light capture. The phone also packs an ultra-wide camera that helps you get photos from a new perspective, fitting more of the landscape in the frame. The wide-angle camera also comes in handy for the new image stabilization toys, which means that you can now record video while running at full tilt.
The front camera is new too, and introduces autofocus for the first time, which makes it easier to get you in focus. Neat.
The company also highlighted improvements to its ‘Deep Fusion’ algorithms, which Apple claims represent a huge leap forward in computational photography.
“It uses the powerful neural engine to combine multiple frames into a single image. This delivers extraordinary detail and preserves even the subtlest textures in these mid to lower-light environments,” a spokesperson for Apple said at today’s event. “Now we are taking our image pipeline further by applying Deep Fusion much earlier in the process on uncompressed images. This retains much more information and detail and enables the rendering of more colors and brighter colors. This new process unlocks our biggest step forward yet for low-light performance. We call it Photonic Engine.”
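For readers curious what “combining multiple frames into a single image” can mean in the abstract, here is a minimal sketch of burst merging, assuming simple averaging of already-aligned raw frames. Apple has not published how the Photonic Engine actually works, and the function and variable names below are hypothetical illustrations, not its implementation.

```python
import numpy as np

def merge_frames(raw_frames):
    """Illustrative multi-frame merge: average a burst of aligned raw frames.

    Averaging N frames reduces random sensor noise by roughly sqrt(N), which is
    the basic idea behind burst-based low-light pipelines. This is NOT Apple's
    Photonic Engine; real pipelines also align frames, reject motion, and weight
    pixels before demosaicing and tone mapping.
    """
    stack = np.stack([f.astype(np.float32) for f in raw_frames], axis=0)
    return stack.mean(axis=0)

# Hypothetical usage: a burst of nine noisy frames of the same (downscaled) scene.
rng = np.random.default_rng(0)
scene = rng.uniform(0.0, 1.0, size=(756, 1008))                  # stand-in "true" scene
burst = [scene + rng.normal(0.0, 0.1, scene.shape) for _ in range(9)]
merged = merge_frames(burst)
print(float(np.abs(burst[0] - scene).mean()),                    # noise in a single frame
      float(np.abs(merged - scene).mean()))                      # noise after merging
```

The point of the sketch is only the quoted claim above: merging several noisy exposures recovers detail that no single frame contains, and doing it on uncompressed data preserves more information than merging after compression.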