
The brain's autofocusing abilities could inspire better cameras

Researchers are beginning to understand how our visual system avoids the repetitive guess-and-check technique of digital cameras. It's all about estimating blur.
Written by Janet Fang, Contributor

You know how sometimes you just can’t get your camera to focus on exactly what you want?

We, on the other hand, can look at an object, near or far, and instantly bring it into focus. Stare at your laptop, and the fall scenery blurs. Look up at a tree, and the laptop's anti-glare screen drifts out of focus.

Researchers are getting closer to understanding how our brains accomplish this brilliant feat of accuracy. We can now precisely predict focus error: the difference between the distance to the target and the distance at which the lens is focused, ScienceNOW explains.

The discovery could advance our understanding of how nearsightedness develops… or even help engineer digital cameras with better focusing abilities.

To understand how biological visual systems avoid the guess-and-check technique of digital cameras, Johannes Burge and Wilson Geisler of the University of Texas at Austin have developed an algorithm that estimates focus error from a single blurry image.

So first, here's how a camera's repetitive autofocusing works: it changes the focal distance, measures the contrast in the image it sees, and then repeats the process until it has maximized the contrast. "This search procedure is slow, often begins its search in the wrong direction, and relies on the assumption that maximum contrast equals best focus – which is not strictly true," Burge says.
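
That guess-and-check loop is easy to sketch in code. Below is a minimal, self-contained illustration; the SimulatedCamera and its blur model are invented stand-ins, not a real camera API.

```python
# A minimal sketch of contrast-based "guess-and-check" autofocus. The
# SimulatedCamera and its blur model are invented stand-ins, not a real
# camera API.
import numpy as np
from scipy import ndimage

class SimulatedCamera:
    def __init__(self, scene, true_dist_m):
        self.scene = scene
        self.true_dist_m = true_dist_m
        self.focus_m = 1.0                     # current focal distance

    def set_focus(self, dist_m):
        self.focus_m = dist_m

    def capture(self):
        # Blur grows with focus error (in diopters): a crude stand-in
        # for real defocus optics.
        error = abs(1.0 / self.true_dist_m - 1.0 / self.focus_m)
        return ndimage.gaussian_filter(self.scene, sigma=1.0 + 5.0 * error)

def autofocus_by_search(camera, candidate_dists):
    """Guess-and-check: try each focal distance, keep the max-contrast one."""
    best_d, best_contrast = None, -np.inf
    for d in candidate_dists:
        camera.set_focus(d)                    # guess a focal distance
        contrast = camera.capture().std()      # check contrast at that guess
        if contrast > best_contrast:
            best_d, best_contrast = d, contrast
    camera.set_focus(best_d)                   # settle on the sharpest setting
    return best_d

scene = np.random.default_rng(0).standard_normal((64, 64))
cam = SimulatedCamera(scene, true_dist_m=2.0)
print(autofocus_by_search(cam, np.linspace(0.25, 10.0, 40)))  # prints ~2.0
```

Note that the loop has to sample many focal distances before it can commit to one, which is exactly the slowness Burge describes.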

In order to see something clearly, you need an accurate estimate of blur. We and other animals instinctively extract key features from a blurry image, use them to determine our distance from an object, and then instantly focus our eyes at precisely the right distance.

(The chameleon, for instance, relies on this method to pinpoint the location of a flying insect and snap its tongue to that exact spot. Placing a lens in front of its eye alters the amount of blur and causes it to misjudge the distance.)
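
A rough thin-lens calculation (a textbook approximation, not the researchers' model) shows why a lens trick like that works: the angular size of the defocus blur grows roughly linearly with focus error measured in diopters, so any change in blur reads as a change in distance.

```python
# Back-of-the-envelope only: a thin-lens approximation, not the model
# from the paper. Blur-circle angular size scales with pupil diameter
# times focus error in diopters (1/meters).

def blur_angle_rad(pupil_diameter_m, target_dist_m, focused_dist_m):
    """Approximate angular blur-circle size for a simple thin-lens eye."""
    focus_error_diopters = abs(1.0 / target_dist_m - 1.0 / focused_dist_m)
    return pupil_diameter_m * focus_error_diopters

# Eyes focused on a laptop at 0.5 m while glancing at a tree 10 m away,
# with a 4 mm pupil: about 0.0076 rad of blur, easily noticeable.
print(blur_angle_rad(0.004, target_dist_m=10.0, focused_dist_m=0.5))
```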

But exactly how we, and they, use blur to estimate distance so accurately has been unclear. Here's how the pair tackled the question:

  1. The duo created a computer simulation of the human visual system.
  2. They presented the computer with digital images of natural scenes like faces, flowers, and scenery. The content varied a lot, but many features of the images remained the same – like patterns of sharpness and blurriness and relative amounts of detail.
  3. Then, to mimic how our visual system might be processing the images, they added a set of filters to the model designed to detect these features.
  4. When they blurred the images by changing the focus error, they discovered that they could predict the exact amount of focus error from the pattern of responses in the feature detectors (a toy sketch of this decoding step follows below).
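
Their actual model uses optics-accurate defocus and filters matched to natural-image statistics, but a toy version of steps 2 through 4 conveys the flavor: blur synthetic scenes by known amounts, record the responses of a small filter bank, and check that the focus error can be decoded from those responses. Everything here (the difference-of-Gaussians filters, the random textures, the linear decoder) is an invented stand-in, not Burge and Geisler's code.

```python
# A toy stand-in for steps 2-4, not Burge & Geisler's actual algorithm.
import numpy as np
from scipy import ndimage
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

def filter_responses(image, sigmas=(1, 2, 4, 8)):
    # Difference-of-Gaussians bands at several scales play the role of
    # the model's blur-sensitive feature detectors (step 3).
    return [
        (ndimage.gaussian_filter(image, s) - ndimage.gaussian_filter(image, 2 * s)).std()
        for s in sigmas
    ]

# Step 2 stand-in: random smooth textures instead of photos of faces,
# flowers, and scenery.
X, y = [], []
for _ in range(200):
    scene = ndimage.gaussian_filter(rng.standard_normal((64, 64)), 1.0)
    focus_error = rng.uniform(0.1, 3.0)   # known defocus, arbitrary units
    X.append(filter_responses(ndimage.gaussian_filter(scene, focus_error)))
    y.append(focus_error)

# Step 4: decode the focus error from the pattern of filter responses.
model = LinearRegression().fit(X, y)
print("R^2:", round(model.score(X, y), 3))  # high: blur is decodable
```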

In other words, there’s enough information in a static image to determine if something is too close or too far away.

The work was published in Proceedings of the National Academy of Sciences yesterday.

From ScienceNOW.

Image: Johannes Burge

This post was originally published on Smartplanet.com
