These University of Texas scientists took a cue from Mother Nature when searching for the most advanced autofocus design on Earth. Our eyes focus better than anything else we know of, so why not model a camera's autofocus on our own visual system?
“Johannes Burge, a postdoctoral fellow at the University of Texas, and his advisor Wilson Geisler, wondered how it was that the eyes of humans and many other animals were able to focus so much more efficiently than most digital cameras. In a traditional autofocus system, the camera uses only one piece of information about a scene to determine whether or not an object is in focus—its level of contrast. Contrast, says Burge, isn’t always a perfect proxy for focus. But it’s worse than that: To determine in which direction to re-focus, a camera must first change its point of focus and compare the new image it captures with the old one, to determine whether or not the object in question has a higher or lower level of contrast. Often, the camera isn’t even re-focusing in the correct direction when it captures this second image. This method of “guessing and checking” is “slow and not particularly accurate,” says Burge.
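The "guessing and checking" described above is essentially hill-climbing on a contrast score. The sketch below is my own illustration, not any camera vendor's code: the toy `contrast_at` metric and the simulated lens positions are assumptions made purely for demonstration.

```python
def contrast_at(lens_pos, true_focus):
    # Toy contrast metric: peaks (at 0) when the lens sits at the true focus.
    # A real camera would compute contrast from the captured image instead.
    return -abs(lens_pos - true_focus)

def contrast_autofocus(start, true_focus, step=1.0, max_iters=50):
    """Guess-and-check autofocus: nudge the lens, capture again, and compare
    contrast. If contrast fell, the initial direction was wrong -- reverse."""
    pos, direction = start, +1   # the first direction is a blind guess
    best = contrast_at(pos, true_focus)
    for n in range(1, max_iters + 1):
        trial = pos + direction * step
        c = contrast_at(trial, true_focus)   # the second, comparison image
        if c > best:
            pos, best = trial, c             # improved: keep this direction
        else:
            direction = -direction           # worsened: reverse and refine
            step /= 2
        if best == 0:                        # contrast peaked: in focus
            break
    return pos, n
```

Note how many captures the loop burns through before settling, and that a wrong initial guess costs an extra wasted exposure — exactly the slowness the article describes.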
Burge and Geisler’s approach is different. As they outlined in a recent paper in the Proceedings of the National Academy of Sciences, their software algorithm can analyze any still image captured from a scene and instantly know how to re-focus the lens to bring the scene into focus. It requires no before-and-after comparison. It works by taking an inventory of the features in a scene.”
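The paper's actual algorithm learns statistical defocus signatures from natural images; the toy below only gestures at the core idea in 1-D, with a box blur standing in for optical defocus. Matching an observed signal against templates pre-blurred by known amounts yields a defocus estimate from a single capture, with no second comparison image. All function names and the blur model here are my assumptions, and this simplified version recovers only the magnitude of defocus, whereas the real method also recovers its sign.

```python
def box_blur(signal, radius):
    # Crude 1-D box blur standing in for optical defocus of a given magnitude.
    if radius == 0:
        return list(signal)
    n = len(signal)
    return [sum(signal[max(0, i - radius):min(n, i + radius + 1)])
            / (min(n, i + radius + 1) - max(0, i - radius))
            for i in range(n)]

def estimate_defocus(observed, sharp_reference, candidate_radii):
    """One-shot estimate: pick the defocus level whose blurred template best
    matches the single observed signal -- no before-and-after comparison."""
    def err(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(candidate_radii,
               key=lambda r: err(observed, box_blur(sharp_reference, r)))
```

For example, blurring a sharp step pattern with radius 3 and handing only that blurred signal to `estimate_defocus` recovers 3 in a single pass — the "inventory of features" idea, here reduced to template matching against a known reference.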