Are you struggling to understand passive autofocus in photography? Do you want to know how passive autofocusing is used to create amazing photos?
That’s what this article is all about.
Because I’m going to tell you everything you need to know about passive autofocus:
What it is.
The two key passive AF systems.
And the common problems with passive AF.
By the time you’ve finished this article, you’ll be a passive autofocus expert–guaranteed.
So let’s get started.
What is Passive Autofocus?
Passive autofocus is the most common AF system in consumer cameras today.
Now, the goal for most AF systems is acquiring focus as accurately and quickly as possible.
And passive autofocus works, well, passively; it determines the proper focus point for your camera by analyzing data from a camera sensor (more on this in the next section).
In other words:
Passive autofocus doesn’t affect the environment. The camera doesn’t send out and receive any light. The camera doesn’t send out and receive any sound waves.
Instead, a camera passively autofocuses by taking in light from the surrounding environment, analyzing it, and then deciding where to focus for the sharpest images.
This is in contrast to active autofocus, where your camera sends out a signal (for example, an infrared light pulse or a sonar sound pulse). The signal bounces back, and the camera uses the signal’s data to determine the distance to the subject and where to focus.
That’s active autofocus, not passive autofocus.
While both active and passive autofocus have limitations, active autofocus is much less common in today’s camera lineups. This is because active AF requires an unobstructed path between the camera and the subject; the signal bounces off transparent barriers such as windows, which makes it impossible to focus through them.
So passive autofocus is the go-to AF method in today’s cameras.
Now let’s take a closer look at the mechanics of passive autofocusing:
How Does Passive Autofocus Work?
As I explained above, passive autofocus uses data collected by a camera sensor to determine the correct point of focus.
This is done in two main ways:
Either using contrast detection autofocus, or using phase detection autofocus.
Let’s look at each of these AF types in turn:
Contrast Detection Autofocus
Contrast detection autofocus generally works by using data from the camera’s main sensor–that is, the sensor that is used to capture images.
In mirrorless bodies, your camera’s sensor is constantly taking in data; with DSLRs, the same is true (assuming that you’re using Live View).
Then, you select an autofocus point–or your camera selects it for you, depending on your AF mode.
Next, when you half-press the shutter button on your camera, the contrast detection AF system will start analyzing the data coming into the camera sensor.
The system will identify your selected autofocus point, and it will look to see whether there is an area of high contrast in the relevant part of your image.
High contrast areas indicate that the identified part of the image is in focus.
And low contrast areas indicate that the identified part of the image is out of focus.
So the AF system will look to see whether your lens is focusing in the right place–and if it determines that your lens has misfocused, it will adjust the lens’s focus until things become sharp (that is, until there is a high contrast area under the selected autofocus point).
Now, because contrast detection autofocus generally uses data that comes directly from the sensor, it’s highly accurate. But while contrast detection AF technology has advanced rapidly over the past few years, it’s still very slow–because your camera must rack the lens back and forth until it nails focus.
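To make that racking-back-and-forth process concrete, here’s a toy sketch of contrast detection AF as a hill-climb over lens positions. Everything here is illustrative–the `contrast_at` function simulates the contrast metric a real camera would measure under the selected AF point, and real systems are far more sophisticated:

```python
# Toy sketch of contrast-detection AF as a hill-climb over lens positions.
# The "scene" and contrast metric are simulated; a real camera measures
# contrast in the image data under the selected AF point.

def contrast_at(lens_position, sharp_position=42):
    """Simulated contrast metric: highest when the lens sits at the
    position that renders the subject sharp, falling off with distance."""
    return 1.0 / (1.0 + abs(lens_position - sharp_position))

def contrast_detect_focus(start=0, step=8, min_step=1):
    """Rack the lens back and forth, reversing direction and shrinking
    the step each time contrast stops improving."""
    pos = start
    direction = 1
    while step >= min_step:
        if contrast_at(pos + direction * step) > contrast_at(pos):
            pos += direction * step   # contrast improved: keep going
        else:
            direction = -direction    # overshot: reverse...
            step //= 2                # ...and take finer steps
    return pos

print(contrast_detect_focus())  # converges on the sharp position, 42
```

Notice that the loop has no idea which direction to move until it tries a step and checks whether contrast improved–which is exactly why contrast detection AF is slow compared to phase detection.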
Because of this, camera manufacturers are shifting away from contrast detection autofocus, or at least from AF systems that rely entirely on contrast detection AF. One thing to bear in mind is that contrast detection AF requires an unobstructed sensor, so DSLRs can only use contrast detection AF when shooting in Live View mode–though mirrorless cameras can potentially use contrast detection AF continuously.
On the other hand, phase detection autofocus generally couldn’t be used when shooting in Live View; it only worked when the camera’s mirror was down to direct light to a dedicated AF sensor–as you find in DSLRs.
But given how Live View was being used with increasing frequency (especially given the addition of impressive video capabilities to DSLRs), many manufacturers attempted to incorporate phase-detection AF into sensor-based AF systems.
Phase Detection Autofocus
Phase detection autofocus is a popular alternative to contrast detection autofocus, because of its speed; phase-detection AF allows photographers to lock onto their subjects in moments, compared to the plodding pace of a contrast-detection system.
In past iterations, phase detection autofocus worked through a sensor in the bottom of DSLRs (separate from the main imaging sensor).
Light would go into the camera, and while some would reflect off the mirror and up into the optical viewfinder, some would be sent to the AF sensor.
The AF sensor would perceive two separate images–and only once the scene was in focus would the two images settle into one.
Because it was relatively easy for cameras to identify whether the images were in and out of alignment and by how much, phase-detection AF systems were fast. They could determine the adjustments required in order to make the image sharp, and hence could quickly focus the image.
But phase-detection autofocus came with a few issues.
First, because the camera sensor and the autofocus sensor were separate, even the slightest misalignment between the two would result in back focusing or front focusing issues.
In order to focus with perfect accuracy, the phase-detection system required that the AF sensor sit at exactly the same optical distance from the lens as the image sensor.
That way, if the phase detection sensor identified an out-of-focus image, the image would also be out of focus at the imaging sensor–and once the AF sensor saw a sharp image, the imaging sensor would, too.
So misalignment between the sensor and the AF system resulted in focusing inconsistencies and, ultimately, failed images.
Note that lenses could contribute to this issue, as well. A lens might front-focus or back-focus consistently, causing you to miss images while in the field.
Hence, companies such as Canon and Nikon developed a focusing calibration method, which they then built into their higher-end phase-detection AF cameras. That way, users could test their camera with each lens, then program in focusing adjustments in order to deal with these AF issues.
However, old phase-detection systems had another issue, which I mentioned briefly in the previous section:
They couldn’t work in Live View.
Cameras relied on the mirror to send light to the phase-detection AF system. But in Live View, the mirror is flipped up, exposing the imaging sensor to the light. So cameras relied on phase-detection AF only when using the optical viewfinder, and contrast-detection AF when using Live View (most commonly used by video shooters, but also by some landscape, architectural, and street photographers, to name a few). This was a serious issue, because contrast-detection AF systems were slow, and didn’t seem to be improving very quickly.
Plus, mirrorless cameras technically only ever use Live View. They include no mirror, and instead offer the viewer a feed straight from the camera sensor. So mirrorless cameras could only use the slow contrast-detection AF systems…
…until on-sensor phase-detection AF systems came to consumer cameras.
An on-sensor phase-detection AF system incorporates the phase-detection AF right into the imaging sensor. For instance, a camera with this type of system might dedicate some sensor pixels to phase-detection data, as in the case of Canon’s early hybrid phase-detection technology. Or a camera might include sensor pixels that work both as imaging pixels and as AF pixels, which is how Canon’s famed Dual-Pixel AF technology works.
Regardless, on-sensor phase-detection systems revolutionized autofocus. With them, Live View focusing became much faster, making video shooting easier and mirrorless cameras far more marketable.
Problems With Passive Autofocus
Passive autofocus comes with two main problems, even at its best.
First, passive autofocus is far less reliable in low light conditions, and will eventually stop working entirely. As autofocus technology improves, the limits of passive AF in low light expand, but even the best hobbyist cameras cannot focus in pitch darkness.
This is true for both contrast detection AF systems and phase detection AF systems.
Active autofocus, on the other hand, can find focus even in the darkest conditions (though it’s plagued with other problems, as explained previously).
That’s why, if you’re doing photography in near darkness, you’ll want to deactivate the autofocus system and focus manually instead. Your eyes are better than the AF system when it comes to this type of low-light work.
Second, passive autofocus requires high-contrast edges to be able to grab focus. This is true whether you’re using contrast-detection AF or phase-detection AF; without contrast, the autofocus system will hunt for critical seconds while in the field (and may never find focus in the end).
While AF systems are getting better and better at detecting contrast, certain subjects, such as pure white snow, a black wall, or a gray building, can still give AF systems trouble. In order to nail the shot of a low-contrast subject, you’ll often have to find a high-contrast element near your subject, grab focus, then recompose.
For instance, a dimly lit scene with few high-contrast edges can spell trouble for a passive autofocus system.
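Here’s a simple way to see why a flat subject defeats passive AF: score the “contrast” of two image patches as the standard deviation of their pixel values. The patch values and the specific thresholds are illustrative, not taken from any real camera:

```python
# Why a flat subject defeats passive AF: a simple contrast score
# (standard deviation of pixel values, 0-255 scale) for two patches.

def contrast_score(patch):
    """Standard deviation of pixel values -- a stand-in for the
    contrast metric an AF system evaluates under the focus point."""
    mean = sum(patch) / len(patch)
    return (sum((p - mean) ** 2 for p in patch) / len(patch)) ** 0.5

snow_patch = [250, 252, 251, 250, 253, 252, 251, 250]  # near-uniform white
edge_patch = [20, 25, 30, 200, 210, 215, 205, 30]      # strong edge

print(contrast_score(snow_patch))  # tiny: nothing for AF to lock onto
print(contrast_score(edge_patch))  # large: plenty of contrast to track
```

Both contrast-detection and phase-detection systems depend on this kind of variation in the image data; when every pixel under the AF point looks the same, there is simply nothing to measure.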
Passive Autofocus in Photography: The Next Steps
Now that you’ve finished this article, you know all about passive autofocus–what it is and how it works.
So the next time you’re trying to understand how your camera does its focusing…
…just remember this article.
It’ll get you back on track!
Passive autofocus is a system for capturing sharp images purely through the analysis of data. Your camera doesn’t actually do anything to its surroundings–it doesn’t send out sound waves or light waves–instead, it takes in data and makes adjustments to its focusing mechanism based on that data.
In other words, passive autofocus doesn’t require any work from the camera, aside from data processing. This is in contrast to active autofocus, which sends out signals to determine the distance from camera to subject, and only then focuses.
Contrast detection autofocus is a type of passive autofocus. It works by analyzing data from the camera sensor, then adjusting the focus mechanism to create a higher contrast area under the selected autofocus point. Unfortunately, this is a relatively slow process, which is why camera manufacturers have moved away from contrast detection AF systems (and are using on-sensor phase detection systems, instead).
Phase detection autofocus is a type of passive autofocus. It works by comparing two images of the scene, and attempts to align them with one another. Phase detection autofocus is very fast, but it has had accuracy problems in the past (when coupled with DSLRs). However, the advent of on-sensor phase detection has increased the accuracy of PDAF, while maintaining its stellar speed.
Pretty much all of today’s popular cameras use passive autofocus, from DSLRs to compact cameras to smartphones to mirrorless models.
All current DSLRs use phase detection autofocus when working with the optical viewfinder, and most switch to contrast detection autofocus when shooting in Live View.
As for mirrorless models: some use contrast detection AF exclusively, while others use a mix of contrast detection AF and phase detection AF (and still others use only phase detection AF).