In Focus: The Galaxy S5’s Phase Detection, LG G3’s Laser Autofocus, and HTC One M8’s Duo Camera

Published 29 May 2014

Smartphone manufacturers have been pushing the envelope lately on photography and camera technology, employing new ways to help you get the best out of that small sensor that’s strapped to the back of your phone. In the past few months, we’ve seen several different takes on the camera’s focus problem, including HTC’s Duo Camera Refocus, Samsung’s Phase Detection Autofocus, and LG’s Laser Autofocus. Does this jargon sound too technical to you? If so, we’ll do our best to explain it in simpler terms below.

Autofocus

Let’s start by explaining plain autofocus. Most phones and cameras use “passive contrast autofocus,” a way for the camera to detect how well-defined the subject you’re pointing at is and nudge the lens slightly forward or backward to improve the result. Essentially, this method relies on algorithms that calculate the contrast between neighboring pixels in the area you tap to focus on; if the contrast is low, the camera knows the subject probably isn’t as sharp as it should be.
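To make that concrete, here’s a minimal sketch of the idea in Python. It’s illustrative only: `contrast_score` and `capture_at` are hypothetical names, and real cameras do this in dedicated hardware on a small region of interest rather than on full frames.

```python
import numpy as np

def contrast_score(region: np.ndarray) -> float:
    """Score sharpness as the average squared difference between
    neighboring pixels: a blurry region has low pixel-to-pixel contrast."""
    pixels = region.astype(float)
    dx = np.diff(pixels, axis=1)  # horizontal neighbor differences
    dy = np.diff(pixels, axis=0)  # vertical neighbor differences
    return float((dx ** 2).mean() + (dy ** 2).mean())

def contrast_autofocus(capture_at, lens_positions):
    """Trial and error: step the lens through candidate positions,
    grab a frame at each, and keep the position that scored sharpest.
    `capture_at(pos)` stands in for the camera firmware's capture hook."""
    best_position, best_score = None, -1.0
    for position in lens_positions:
        score = contrast_score(capture_at(position))
        if score > best_score:
            best_position, best_score = position, score
    return best_position
```

That loop of capture, score, and compare is exactly the trial and error the next paragraph complains about.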

The downside of passive autofocus is that it is slow, relying on a trial-and-error system until it gets the focus right, and it doesn’t work as accurately in low light, because darkness reduces the contrast between pixels even if they are in focus. You can read more about contrast autofocus here.

Galaxy S5’s Phase Detection Autofocus

Phase Detection Autofocus (PDAF for short) uses a different kind of technology to find its focus point. Essentially, it compares the light coming through two opposite sides of the lens (two “phases”). If the two beams converge at a single point on the sensor, the image is in focus. If they don’t, the camera knows exactly whether to push the lens forward or pull it back to achieve focus. This takes the guesswork out of regular autofocus. You can read more about the Galaxy S5’s specific flavor of PDAF here.
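Here’s a rough sketch of that logic in Python, again with made-up names: the two “phase” signals are correlated, and the offset between them gives both the direction and the size of the lens move from a single measurement. The `pixels_per_step` conversion factor stands in for a camera-specific calibration.

```python
import numpy as np

def phase_offset(left: np.ndarray, right: np.ndarray) -> int:
    """Find the pixel shift that best aligns the signals sampled from
    opposite sides of the lens. Zero means the image is in focus;
    the sign says which way the lens has to move."""
    correlation = np.correlate(left - left.mean(),
                               right - right.mean(), mode="full")
    return int(correlation.argmax()) - (len(right) - 1)

def lens_move(left: np.ndarray, right: np.ndarray,
              pixels_per_step: float = 4.0) -> float:
    """Convert the measured offset directly into a signed lens move:
    one measurement, no trial and error. `pixels_per_step` would come
    from factory calibration; 4.0 is an arbitrary placeholder."""
    return phase_offset(left, right) / pixels_per_step
```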

In diagram 2, the two reflected light rays converge on the same point. This is the basis of PDAF.

While PDAF is faster than the passive contrast autofocus explained above, it is still of little use in dark scenes, because it relies heavily on incoming light to find its focus point.

LG G3’s Laser Autofocus

Let’s get this out of the way first: it’s not exactly a laser. It’s an infrared beam that doesn’t disperse (hence “laser”), emitted toward the spot you tap on the screen. By measuring the time it takes the beam to reach the subject and bounce back (think RADAR or SONAR), the phone can tell the distance between the camera and the subject, and move the lens accordingly to achieve perfect focus. Not much is known about the G3’s Laser Autofocus, but you can read more explanation (and speculation) here and here.
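The arithmetic behind that is simple time-of-flight: the beam travels out and back, so the distance is the speed of light times half the round-trip time. A quick Python sketch (LG hasn’t published the sensor’s actual timing details, so the example numbers are purely illustrative):

```python
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Halve the round trip, since the beam goes out and comes back."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# Example: a roughly 4-nanosecond round trip means the subject is
# about 0.6 meters away.
print(distance_from_round_trip(4e-9))  # ~0.5996 meters
```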

Laser Autofocus takes the trial and error out of the contrast focus system, especially when the subject is close. If the focus point is farther away, the camera defaults back to contrast autofocus, but the laser reading should at least help by quickly ruling out the closest focus points and restricting the search to distant ones. And since it doesn’t rely on contrast or ambient light, it should theoretically focus faster and more accurately in the dark, without leaning on the LED flash to find the subject.
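Since LG hasn’t detailed how that fallback works, here’s a speculative sketch of how a hybrid system like the one described above might decide: trust the laser up close, and use its reading to narrow the contrast search when the subject is far away. The 0.6 m cutoff and every name here are assumptions, not LG’s actual design.

```python
LASER_RANGE_METERS = 0.6  # assumed useful range of the IR beam

def choose_focus(laser_distance, focus_by_distance,
                 contrast_search, all_positions, far_positions):
    """Pick a focus strategy from a (possibly absent) laser reading."""
    if laser_distance is not None and laser_distance <= LASER_RANGE_METERS:
        # Close subject: the laser gives the distance directly.
        return focus_by_distance(laser_distance)
    if laser_distance is not None:
        # Far subject: contrast search, but skip the near lens positions.
        return contrast_search(far_positions)
    # No usable reading at all: plain contrast autofocus.
    return contrast_search(all_positions)
```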

HTC One M8’s Duo Camera

HTC went in a completely different direction with its Duo Camera setup. By using a second camera, the phone gathers more information about the scene and lets you change the focus point of a photo even after you take it. Essentially, it uses regular contrast autofocus on its main camera sensor, but also attaches to the photo a second image taken by the secondary camera. This second image usually has a wider field of view and contains depth information (how far away objects are). After taking the photo, you can edit it to sharpen the foreground and give the background a blurred, out-of-focus “bokeh” effect, or the opposite. AnandTech demonstrates it best in their review of the M8.
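To illustrate the idea (not HTC’s actual pipeline), here’s a toy Python version of depth-based refocus for a grayscale image: every pixel whose depth is far from the chosen focus depth gets swapped for a blurred copy.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def refocus(image: np.ndarray, depth: np.ndarray,
            focus_depth: float, tolerance: float = 0.1,
            blur_sigma: float = 5.0) -> np.ndarray:
    """Keep pixels near `focus_depth` sharp and blur the rest to fake
    a shallow depth of field. `image` and `depth` are same-shaped 2-D
    arrays; all thresholds here are illustrative."""
    blurred = gaussian_filter(image.astype(float), sigma=blur_sigma)
    in_focus = np.abs(depth - focus_depth) <= tolerance
    return np.where(in_focus, image.astype(float), blurred)
```

Note the hard in-focus mask: it produces exactly the abrupt sharp-to-blurry boundary criticized in the next paragraph, whereas a better pipeline would scale the blur smoothly with depth.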

While refocus has the advantage of letting you shoot without really worrying about the point of focus, it only improves focus speed a little to begin with. It also brings no improvement to low-light photography, and HTC’s algorithms still need honing: the current ones draw a very distinct line between sharpness and blurriness, almost like a poorly Photoshopped image.

Your preference?

Which focus system do you prefer? I admit that I’m fascinated by Laser Autofocus, not only because of its name or my personal preference for LG, but also because I’ve always liked the idea of SONAR, and I’m tired of trying to take pictures in low light only to watch the LED flash fire uselessly, hunting for a focus point and failing.