What are Focus Pixels?

2023-07-29 18:00:00

In recent years, Apple has invested heavily in new camera technologies for the iPhone and also the iPad.

Examples of this, of course, abound: they range from software features, such as Portrait mode and Photographic Styles, to capabilities that are only possible thanks to advances in hardware, such as the True Tone flash and the Photonic Engine.

In this article, we're going to talk a little about Focus Pixels. Below is everything you need to know about this technology!

Focus Pixels first appeared with the iPhone 6 and 6 Plus, released in 2014.

Thanks to an image signal processor (ISP) enhanced by Apple, more focus information can be read from the camera's sensor, resulting in better and even faster autofocus, which can be seen in the live preview of a photo or video on the device's screen.

In practice, this technique is already well known in the photography industry as phase-detect autofocus (PDAF), and is present even in professional DSLR and mirrorless cameras.

It works by masking part of the sensor's pixels: the light that "enters" a pixel covered on one side is compared with the light reaching a pixel covered on the opposite side, and the difference between the two signals indicates whether the subject is in focus or not.
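The comparison described above can be sketched as follows. This is a simplified illustration of the phase-detection idea, not Apple's actual implementation; it assumes the readings from left-masked and right-masked pixels arrive as two 1-D intensity lists:

```python
def phase_offset(left, right, max_shift=8):
    """Estimate the phase difference between left- and right-masked
    focus-pixel readings by finding the shift that best aligns them.
    A shift of 0 means the subject is in focus; the sign and magnitude
    of the shift tell the lens which way, and roughly how far, to move."""
    best_shift, best_score = 0, float("inf")
    for shift in range(-max_shift, max_shift + 1):
        # Sum of squared differences over the overlapping region.
        score, count = 0.0, 0
        for i in range(len(left)):
            j = i + shift
            if 0 <= j < len(right):
                score += (left[i] - right[j]) ** 2
                count += 1
        # Require a reasonable overlap so empty edges don't win.
        if count >= len(left) // 2 and score / count < best_score:
            best_score, best_shift = score / count, shift
    return best_shift

# Example: the right-masked signal is the left one displaced by 3 pixels,
# as happens when the image is out of focus.
left = [0, 0, 1, 5, 9, 5, 1, 0, 0, 0, 0, 0]
right = [0, 0, 0, 0, 0, 1, 5, 9, 5, 1, 0, 0]
print(phase_offset(left, right))  # 3: lens needs to move to refocus
print(phase_offset(left, left))   # 0: already in focus
```

A real ISP does this in hardware, across many focus points at once, but the principle is the same: the displacement between the two masked views is a direct measure of defocus.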

To make this even better, Apple has improved the feature over the years. Want examples? On the iPhone 6/6 Plus, the ratio was about 1 focus pixel for every 64 pixels; on the iPhone 6s/6s Plus and 7/7 Plus, it dropped to 1 in 32; on the iPhone 8 Plus, to 1 in 20; and on the XS/XS Max, to just 1 in 16. The smaller that number, the denser the focus-pixel coverage, and the better the end result.
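To put those ratios in perspective, here is a quick back-of-the-envelope calculation. The 12-megapixel sensor size is an assumption for illustration only; the article does not state actual sensor resolutions:

```python
# Approximate focus-pixel ratios per generation, taken from the text above.
ratios = {
    "iPhone 6 / 6 Plus": 64,      # ~1 focus pixel per 64 pixels
    "iPhone 6s / 7 series": 32,
    "iPhone 8 Plus": 20,
    "iPhone XS / XS Max": 16,
}

SENSOR_PIXELS = 12_000_000  # hypothetical 12 MP sensor, for illustration

for model, ratio in ratios.items():
    focus_pixels = SENSOR_PIXELS // ratio
    coverage = 100 / ratio
    print(f"{model}: ~{focus_pixels:,} focus pixels "
          f"({coverage:.2f}% of the sensor)")
```

Going from 1 in 64 to 1 in 16 quadruples the number of phase-detection points on the same sensor, which is why autofocus got noticeably faster across those generations.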


via AppleToolBox, TechCrunch

