By Laura Shin
Posted in Healthcare
Research on the way computer use affects our memories has turned up surprising findings.
Like the television and telephone before it, the Internet instigates feelings of love and loathing.
We are addicted to it, but resent our dependence on it. We can't wait to see what innovations it will bring us, but wonder what it has caused us to lose. And like the phone and TV, it leads us to wring our hands over one crucial question: Is it making us dumber?
A new study shows that the Internet is affecting our memories, though whether or not you think it's making us dumber depends on your definition of dumber.
In a series of experiments, Columbia University psychologist Betsy Sparrow and her co-researchers demonstrated that people are more likely to remember things when they think they won't be able to find them using a computer and vice versa.
“Participants did not make the effort to remember when they thought they could later look up the trivia statement they had read,” the authors write in Science (only the abstract is available without a subscription). (Hm, whether or not the Internet is making us dumber, it does seem to be making us lazier.)
The researchers also showed that people are even better at remembering where facts are stored than they are at remembering the facts themselves.
Dr. Sparrow and her collaborators, Daniel M. Wegner of Harvard and Jenny Liu of the University of Wisconsin, Madison, conducted four experiments.
Looking to the computer for answers
In the first experiment, Dr. Sparrow showed that when confronted with difficult questions, people automatically start thinking about looking up the answers online.
The researchers asked 46 Harvard undergraduates easy and difficult questions, such as "Does 2 plus 3 equal 5?" or "Does Denmark contain more square miles than Costa Rica?" Afterward, they were shown general words, such as "table" or "telephone," as well as computer-related words, such as "modem" or "Google," in red or blue. They were then asked to identify the color of each word.
Subjects who had just tried to answer difficult questions were slower to identify the colors of the computer-related words than those who hadn't. Sparrow says this is because they were thinking about using the computer to find the answers. In fact, participants took longest to respond to the color question when the word "Google" came up.
How computers affect what we decide to remember
Next, Sparrow and her colleagues aimed to determine whether having access to a computer affects what we remember.
In two experiments, almost 90 Harvard and Columbia undergraduates typed information into a computer. Half were told the information would be saved and the other half were told it would be erased. When asked to recall the statements, the students who thought the computer would erase their work remembered the statements better. (Okay, maybe the Internet isn't making us that lazy.)
Remembering where better than what
The last experiment posed trivia questions to 34 Columbia undergrads, who were told that all the information would be saved in six different files with generic names such as "FACTS," "DATA," "NAMES" or "INFO." When asked to recall the facts and their locations, they remembered the locations better.
Technology becomes external memory
These experiments show that people are using technology as external memory storage. As the Los Angeles Times says, quoting the researchers:
[W]e've come to use our laptops, tablets and smartphones as a 'form of external or transactive memory, where information is stored collectively outside of ourselves. ... We are becoming symbiotic with our computer tools, growing into interconnected systems that remember less by knowing information than by knowing where information can be found.'
Sparrow believes that this new trend might make us smarter, because we don't waste energy trying to memorize facts, thereby reserving brainpower for understanding the big picture.
As she told U.S. News and World Report, "If you take away the mindset of memorization, it might be that people get more information out of what they are reading, and they might better remember the concept."
Expanding external memory from people to computers
Though we may think it's very new to turn to our iPhones to look up information we can't remember, it turns out that people have long relied on outside sources for memory.
Twenty-five years ago, co-author Wegner and his now-wife, Toni, were looking for the sponge they used to wash the car. He thought she knew where it was, since she remembered everything about their washing and cleaning chores. She thought he knew, since he was in charge of all the facts about their garage and car.
Their inability to find the sponge led to a better finding: the concept of transactive memory, in which people depend on others to remember things in areas about which the other person is more knowledgeable. For instance, you might always turn to your history buff dad when you have a question about the Civil War, while he always turns to you, his tech-savvy daughter, when he has a computer question.
As we rely less on dad and more on Google and Wikipedia, let's hope that if this new trend isn't making us smarter, it is at least making us better at figuring out whether or not to believe what we read on the Internet.
Editor's Note: This post has been updated to reflect that the words in the first study were presented in red or blue.
Jul 14, 2011
Hi! I am a student at Valparaiso University, and in my English class we are currently discussing the use of technology and how it has begun to possibly shift us humans into more technologically dependent beings. Your blog post regarding the topic of Google affecting our memory is one that is extremely relevant to the current class discussions. If at all possible, could you check us out at newculturesofwriting.tumblr.com? We would greatly appreciate it! Thank you!
I learned way back in college that taking notes as you go boosts retention from about 10% to 25% just from writing them down once; imagine what happens when you study those notes? There is a lot more to say, but I don't think this can handle that much.
One of the first things I was taught while getting my Business Computer Systems degree was that we would learn about 10% of what we needed to know, but we would know how to find the other 90%. This was the goal of the college. What is wrong with using reference material? Our brains can't remember everything. Long ago they found that the human mind can only hold about seven things at once on average; try to stuff more in, and something has to be thrown out. This came from a great English professor I had starting in 1980. I can even remember her name, Lois Rolph; I haven't thought of her name in many years, and she taught me over 30 years ago. They also knew that if you only heard something, you would remember 5 to 10 percent. That jumped to something like 25 percent simply by writing it down, like in notes. That is not from studying your notes, just from the act of writing the information down. Who is the genius who taught that taking notes causes you to remember less? That is one of the stupidest things I have ever heard. Btw, I came out of those classes as an honor grad, later was top in my class in accounting, and still later, while taking Business Computer Systems classes for a two-year degree, I was well on top of the rest of the class. I applied what I learned earlier about how to study. It is a shame that I wasn't taught how to study back in elementary, junior, or senior high school. Parents didn't know either, as they had never been taught how to study. Hope this enlightens someone.
Some things you should always entrust to your memory, while most you can look up. When I did taxes years ago, I had a client who was 94 and into dementia. He did not know where he lived or his age. His granddaughter brought him in. I asked him if he knew his Social Security number. He didn't. I asked his granddaughter if he was in the service, and she said yes. I asked him his service number, and he gave it to me easily. As you all know, we had it on every piece of clothing and said it about 50 times a day. I'm 84 and I still remember the phone number of the first phone I used when I was a child, as my mother made me memorize it. Use the computer for most everything else.
Good article on a good study. I know my memory has declined from relying on Google for looking things up, and I know almost no one now who remembers phone numbers since they're all in our phones. And I don't like it. I view memorization as part of mental training, so I lament this trend as something that makes our minds weaker. I suppose it's similar to the industrial revolution, in which machines could suddenly do hard labor. Now we all have to go to the gym to keep our bodies from falling apart.
Let's not forget the origins of the Internet: the connecting of places of learning for the sharing of information, much like a world library. Like any library around the world, there is certain subject matter written by different authors, each having gained their knowledge from third parties. Their interpretation of that gained information presents itself as their own unique knowledge. Intelligence is what we do with knowledge, not the raw information itself. Surely everyone can see the Internet is simply an indexing/referencing system that works at the speed of light.
Ms. Shin's article only confirms something that old computer hands knew way back when there were only IBM mainframes around. These machines were so complex that you simply could not remember everything about CICS, VTAM, VSAM, Cobol, Assembler, etc. The only thing you had to remember was where to look it up. Now, with the Internet, looking anything up has become easier. You don't remember because you don't have to remember!
While I was studying, I tried not to take notes so that I would get better at remembering things. It did not always work; I found that even a small hint in the notes helped make you remember the rest. So notes are prompts or indexes; otherwise the brain appears to become confused while sorting things out.
I was told this tale back when I was in college (yes, we had colleges back in those days). Einstein was working with his students on a problem, and one mentioned difficulty with the equation. Einstein put it up on the chalkboard and began to run through it; at one point he turned to the class and asked them what the speed of sound was. They were flabbergasted and said, duhwah? They mentioned to him the cause of their surprise: the smartest guy ever does not know what every fifth grader knows? "Why should I when it is so easy to look up," he said. "Save your brain for the things nobody else knows the answer to!" I thought that was hysterical, but after raising a child with Asperger's Syndrome, I have come to think it is part of their genius. (I am assuming the story is true but do not know it for a fact.) So perhaps we should ask if computers are actually turning us into unorthodox geniuses?
When we come to rely too much on reference materials, or other people, or technology, instead of forcing ourselves into the recall habit, we can become mentally lazy. That's right, lazy. Why bother remembering who signed the Declaration if you can just Google the answer? While technology does help us, in terms of enabling us to rapidly access, store, and retrieve facts and information, now that we've become computer-dependent and our figurative synapses have grown and attached themselves to the network cable like some nightmarish H.R. Giger painting or a scene out of Star Trek, all you have to do to KILL us is shut off the systems upon which we now rely so heavily. Imagine someone being able to throw a switch and shut half your brain off. Maybe not such a great development, in some ways? Maybe not.
Extending our data memory and our math processing speeds can equal faster and far better problem solving. However, it doesn't extend our critical thinking skills, and as we have seen with big media (Foxy Murdoch), there is the very real possibility that our perceptions will be purposefully manipulated by big media and even in science. Now it is more and more important that we view everything we read on the net (especially "expert" opinions) with a certain level of skepticism, and that at the very least requires us to verify it with multiple sources, which fortunately is usually equally quick and easy.
I'm sure that, when people started writing things down as complex urban societies began to develop around 5000 years ago, there were those who said that this was the end of civilization as they knew it. Not entirely coincidentally, those nay-sayers would have been the traditional knowledge/lore/law keepers whose power and influence depended on their prodigious memories. They eventually lost out to the scribes, the new knowledge keepers, who in their turn tried to dominate society by restricting entry into their profession. Now the scribes are under threat (and I speak as one myself, as are all the diversified knowledge workers who read and comment here). However, as society becomes more complex, more effective skills and techniques are required, and must be developed if that society is not to stagnate. The trick is to accept the good ideas and reject the bad, and that requires thought and effort by all members of society. You can't keep a good meme down (and some bad ones as well, it seems)!
The results of the study support the traditional view of intelligence. It is good netiquette not to sweat the little stuff. It's no big deal if you forget miscellaneous facts that you can find on the Internet, as long as you know the philosophy and theory behind what you're saying.
The brain is a great storage device; ALL of our neural development (including cognitive understanding) comes from the integration of information in our memories. The problem is we have no consistent, consciously usable techniques for FORGETTING once we have integrated information, so that we could reset our RAM. Eventually, integrated "understanding" would clog all available memory capacity, though we certainly get more than the few decades we now get before we stop learning "little" stuff easily. Fairly soon (20 years? maybe a lot less?) ordinary computers will have the memory capacity to match our brains, and will equal or surpass our pattern recognition skills, possibly even exceeding our meta-pattern class recognition capability. Functionally, there is almost no limit to potential computer capacity, as each circuit-element size limit is worked around (maybe we'll hit it at the three-atom size, but we might be working purely photonically by then); we don't have that capability. There is an opportunity for a true cooperative existence for a short time before the electronic capability so far outstrips us that there is no longer any possibility for interaction.
This study is useless for most people. She only tested Harvard and Columbia undergraduates in an academic atmosphere. What about those of us whose memories are rapidly failing, like mine? The www and Google and Wikipedia have become my second memory. This is especially good for those "tip of the tongue" situations, and I'm usually able to find what I want in a reasonable amount of time. I'm not competing on Jeopardy. Also, she appears to presume that using an external remembering mechanism will dumb us all down because the www is not to be trusted. Give us some credit, Laura. I, for one, have never ordered a copper bracelet to cure some malady and have never rushed to the city council to express my concerns about fluoridated water or flu shots.
It's about time we moved away from memorization and toward cognitive understanding. Computers have the ability to recall data and facts at a moment's notice, but they are terrible at pattern recognition. The human brain has an incredible ability to recognize patterns from multiple angles, but it is terrible at recalling data and facts. This is like peanut butter and chocolate... two great things that are greater together!
I rarely accept the first answer to the question I am looking up on the web unless it is obvious. I usually look at several sources, and if they all say basically the same thing, then you can be more assured it is correct. But even back in high school (which was MANY years ago), our teacher would often say it's not imperative to remember the answer, but to know where the answer can be found. Your brain can only store so much and needs constant "refreshing" to remember facts in detail. As time goes by, more and more of the knowledge about a certain thing degrades unless you are working with it or viewing it regularly. I remember hardly anything about the geometry I learned in high school, for instance, simply because I have never had to use the knowledge, so it has mostly disappeared.
Even back in the 1970s and '80s, when libraries were the great repositories of information, a wise teacher told me, "If you can't remember the answer 3 years from now, know where to find the answer." The Internet caveat I have added to his statement is: "Go to hard copy if the Internet gives conflicting answers. Find the root data."
Whether that oversight was from the researchers or the author of this article, the fact that subjects took longer to summon up the color of the word "Google" is NOT necessarily because Google is a search engine, and therefore must create some sort of conflict when the subject attempts to remember a feature NOT about searching. Rather, YOU search for "Google logos" and go through PAGES of old/celebratory changes of the Google logo, and then I'll ask YOU, "What color is the Google logo?" And I GUARANTEE you'll take an extra second or two hemming (or hawing) about it. Really, there's a reason they refer to psychology as a "soft" science...
This study reflects a finding similar to what Wertsch and his colleagues are working on, namely mediational means. Mediational means are tools that allow people to participate in practice. They influence people's agency, both assisting and constraining it. In the case of the computer, it is allowing people greater access to information; however, it comes with a price: without a personal choice to remember what one has read, he or she will likely become more dependent on the computer. I personally think this study reflects the greater need for modern learners to assume responsibility to engage in some kind of active processing after reading something online, whether through a blog or a user comment system. Active processing allows learners to analyze and reflect on what they read, making mental connections between the new fact and some fact they already know. Additional articles: Wertsch, J. V. (2002). Computer mediation, PBL, and dialogicality. Distance Education, 23(1), 105-108. Wertsch, J. V., & Rupert, L. J. (1993). The authority of cultural tools in a sociocultural approach to mediated agency. Cognition and Instruction, 11(3&4), 227-239. Hatano, G., & Wertsch, J. V. (2001). Sociocultural approaches to cognitive development: The constitutions of culture in mind. Human Development, 44, 77-83.
Now when I'm out with friends and I turn to my iPhone to answer a question we are all pondering, I can tell them, "Be like Einstein -- save your brain for the things nobody else knows the answer to!" Laura
Hi Dangnad, I'm not sure what points you are disputing with the article. The article, like you, said that Google and Wikipedia and the Web are becoming our second memory. When you say "she appears to presume that using an external remembering mechanism will dumb us all down because the www is not to be trusted" -- are you talking about me or the researcher, Betsy Sparrow? Dr. Sparrow said she thought that this is making us smarter, actually, as I pointed out here: "Sparrow believes that this new trend might make us smarter, because we don't waste energy trying to memorize facts, thereby reserving brainpower for understanding the big picture." If you meant me when you said "she appears to presume ... ," I don't think that the Internet will make us dumber or that it is not to be trusted. I was just making the point that as we depend on it more instead of memorizing facts on our own, we need to be discerning about the sources we trust. As someone who regularly posts stories online, I actually have a stake in people trusting the Internet and using it more, so I definitely don't think it is not to be trusted! Laura
Sorry, Laura. I did, indeed, mean Betsy Sparrow. You are a fine journalist, as evidenced by this well-written article.
While the words were shown in those colors, any recollection of the word "Google" would be littered with a large array of associations, including a history of the Google logo appearing in many colors and shapes over time, some of which have been very striking, depending on the individual. This would slow down most people's responses to this word, as was found, whereas "modem" would have fewer associations in general. I think that computer words would in general reflect a person's history of use, as they now have a large imprint in our lives. Possibly 50 years ago these words would have had fewer meanings and associations to process, so the color would have been more memorable and taken less time to recall. Perhaps a more neutral word would have been a better choice. All studies are fraught with such traps. Barry
I wasn't fishing for a compliment! I just wanted to clarify her opinion or mine, depending on which one of us you were referring to. But I'm glad to see that although you rely on the Internet for information (as do I), you are discerning about which sources you trust. Laura