Pure Genius

Q&A: Christopher Chabris, psychology professor, on everyday illusions

Posted in Science | From Issue 02, November 25, 2013

Chabris explains how we can be oblivious to things that later seem obvious -- and why people who think they're funny have the worst sense of humor.

When a politician tells a personal story that turns out to be false, does that make him a liar? When an employee exudes confidence, does that make her the smartest person in the room?

Despite our intuition about the way our minds work, the answers might turn out to be no, according to Christopher Chabris, a psychology professor at Union College. (To see why, watch the video at the bottom to test your own mind before you finish the rest of this story.)

I spoke with Chabris, co-author with Dan Simons of The Invisible Gorilla, about why we don't see a gorilla right in front of us, why we trust our memories too much and how these everyday illusions cause some sticky situations. Below are excerpts from our interview.

The video that inspired your book -- and received more than eight million YouTube views -- shows six people passing basketballs back and forth. The viewer is instructed to count how many times one group passes the ball. In the middle of the clip, a person dressed in a gorilla suit walks through the group and thumps her chest. The idea is that most people will be so engrossed in counting passes that they won't notice the gorilla. How did you know the experiment would work and what did you find?

We didn't know it was going to work. It was a surprise to me and, in some ways, it's still a surprise. Once you've seen this thing walk across the screen, it seems so obvious. It's hard to imagine anyone could miss it.

Dan Simons was a new faculty member in the Harvard psychology department and I was a graduate student. We were teaching a course on experimental methods in cognitive psychology. Dan decided the class should collaborate on one experiment. It was his idea to recreate a study done by Ulric Neisser, one of the pioneers of cognitive psychology. Neisser made a video in the '70s, using a trick with mirrors to superimpose three video streams. He had people passing basketballs around -- two teams of three people. But he had a woman carrying an umbrella walk through the middle of the action. Neisser found that if people paid attention to what the basketball players were doing -- counting the number of times they threw the ball -- a lot of people would miss the woman with the umbrella.

We made lots of different versions of the video. In one of them, we had the person wearing the gorilla suit face the camera and thump her chest. We wanted to push the limits. We said, 'If a lot of people miss something that just walks through, what if it's on the screen longer and thumps its chest? Can we finally get it to be noticed?' Even then, only half the people we tested noticed it. I was surprised by the results. I've gotten over it now, but for several years when I would show the video in class or at a talk I would worry everyone would see the gorilla. It seemed so obvious. It took a long time for us to get over the belief everyone has, which is that we should see obvious things.

That experiment shows what you call 'inattentional blindness,' a term that sounds a bit dangerous. What does it mean and why does it happen?

Inattentional blindness was not a term we coined. I believe it was coined by Arien Mack and Irvin Rock, who wrote a book by that name. They did experiments like this. They would ask people to stare at a computer screen. A giant plus sign would flash at the center of the screen. Either the vertical or horizontal part of the plus sign would be longer than the other. You had to decide which was longer. You had to pay close attention to get it right. After a few of these judgments, another little object would flash somewhere on the screen at the same time as the plus sign. A quarter of the time or more, people didn't notice the other object. They had this idea that when your attention is absorbed by a demanding visual task, it can be as though you're blind to other things. Those other objects were visible if you had been looking where they were, but with your attention directed to the plus sign it was as though they were invisible. That's why it's called inattentional blindness. It's as though you have a blind spot there.

We tried with our work to show it's not just briefly flashed objects on computer screens that can cause inattentional blindness. It can happen in longer events and in a more naturalistic setting where you're watching human beings. In the paper we called "Sustained Inattentional Blindness," the idea was that inattentional blindness could last as long as it took the event to play out. As long as your attention was focused elsewhere, you could be blind to salient events.

Why does this happen? Is it just the ways our minds work?

You could call it a side effect of the way the mind is designed. Other researchers have found that when you pay attention to some aspect of your visual world, the systems in the brain responsible for processing what you're paying attention to increase in activity. That's part of what attention is. It's devoting more information processing power to whatever you're paying attention to. It lets you do things with that stimulus you couldn't do otherwise. For example, you can follow a ball that's moving around at very high speeds. You can't do that without paying attention.

The fact we can pay attention gains us a lot. It increases the computational power of the brain with respect to what we're paying attention to. But the cost is that it decreases the information processing done elsewhere. Attention is a wonderful ability. But in a way, it's like a zero-sum game. It takes our power away from noticing unexpected objects. That surprises people. Their intuition is that those things will grab their attention. It turns out attention can be devoted so strongly to one thing that you don't notice other things at all.

Inattentional blindness and the other examples in your book are known as 'everyday illusions.' How do you define that term?

They're illusions about the way our minds work. Thinking you're going to notice everything important that happens is an illusion about how your mind works. Thinking you're going to remember things in more detail and with more accuracy than you do, that's an illusion about how your mind works. The more we thought about them, the more we realized the big role they play in everyday behavior and decisions. You don't need special equipment to experience these. They happen all the time in everyday life.

The media has been known to lambast politicians who tell stories about their past that didn't actually happen to them. But according to your work, we might want to go easier on them. They could be suffering from what you call 'the illusion of memory.' Describe that.

When politicians say something about their own past that's not true, the debate always flares up. Are they lying? Are they losing their minds? But this is the way memory works. It's not nearly as reliable as we think. The illusion of memory is our way of summarizing decades of research findings on how memory works. Memory does not work like a video camera. It doesn't record everything that happens to you. It's biased in systematic ways. Usually our memories place us closer to the center of the action than we actually were. They make us look better than we did, tend to conform to what we expected to happen, and lose unimportant but distinguishing details.

A great example of this from recent news was the George Zimmerman trial in Florida. A large part of the prosecution's case was based on inconsistencies in the statements Zimmerman gave about the incident. The prosecution argued these inconsistencies were lies. That might or might not have been true. I don't have an opinion on that -- and science can't answer those questions dependably about individual cases. But what we know about memory is enough to say it's entirely possible that someone's memory of an event changes each time they recount it. This has been demonstrated in research on events like people remembering the assassination of J.F.K. or the explosion of a space shuttle.

This would all be fine if people acted in their everyday lives in accordance with how memory works. But people get into arguments just because their memories differ. They vote for or against the politician because that person's memory might have been inaccurate. Hillary Clinton's whole campaign was bogged down for weeks because she had this false memory about being shot at in Bosnia. She might be president today if it weren't for the illusion of memory.

You've also found that the least skilled among us tend to be the most overconfident. Can you give an example of this?

Often when you're interacting with somebody, you don't have that much information about how skilled they are or how accurate their memories are. Our natural tendency as human beings seems to be to take someone else's confidence as an indicator of those more unobservable facts about that person's skill level and knowledge and memory. We believe the confident and we don't believe the unconfident. Justin Kruger and David Dunning, two social psychologists, did a study where they compared how skilled someone was at a task with how skilled they thought they were. The skill they chose was sense of humor. They had people evaluate 30 different jokes on how funny they were. Then, they compared people's ratings of the jokes with ratings by professional comedians. They found that the people whose ratings diverged the most from the professional comedians' ratings still thought they were above average in sense of humor. Almost everybody in their study thought they were above average, but the biggest gap was in the people who had the worst sense of humor.

Is there anything we can do to improve how our minds work?

We're working on a new book, which is going to try to answer that question. Knowing our minds are limited in these ways, can we see what we're missing? Can we know when we might be in dangerous cognitive territory and need to pay extra attention or rethink assumptions or question our knowledge? Part of the answer is learning more about how your mind works. The next step is to be more honest with yourself about your own level of knowledge. Lots of bad decisions are made when people overestimate their own knowledge. Adopting an attitude of humility about your own knowledge is crucial.

Then you can start to think about specific ways to see what you're missing -- maybe by searching for more information relevant to your decision. You can reflect on your own decision-making processes. Was I using all the information I should have? Was I jumping to conclusions? This isn't easy. There aren't any shortcuts. But we do have an extraordinary capacity to learn new skills and become experts in tasks that seem undoable. Just watch two chess masters. They're moving at light speed and making good moves, despite only thinking about them for a couple of seconds. If you don't know how to play chess, what they're doing would seem alien. But you can learn step-by-step. Developing expertise in particular areas is a surprisingly good way to improve your decision-making capability. There aren't any get-smart-quick schemes that will suddenly solve these problems for you. But once you know the pitfalls, you can figure out how to avoid them.

Photo: Christopher Chabris / By Matt Milless

Christina Hernandez Sherwood

Contributing Writer

Christina Hernandez Sherwood has written for the Los Angeles Times, Newsday, the Philadelphia Inquirer, Diverse: Issues in Higher Education and Columbia Journalism Review. She holds degrees from the University of Delaware and Columbia University's Graduate School of Journalism. She is based in New Jersey.