A Message From Your Brain: I’m Not Good At Remembering What I Hear

A new study shows that we are far better at remembering what we see and touch than what we hear.

BY DIANE COLE, FOR NATIONAL GEOGRAPHIC

NEXT TIME SOMETHING you hear goes in one ear and out the other, you have a built-in excuse. Just blame it on your Achilles’ ear—a weakness that lies not in a mythical hero’s heel, but in the real-life way the brain processes sound and memory.

That’s the suggestion of a University of Iowa study comparing how well we recall something, depending on whether we see it, hear it, or touch it.

Associate professor of psychology and neuroscience Amy Poremba and graduate student James Bigelow asked a hundred undergraduates to participate in two related experiments. In the first, students listened to sounds, looked at images, and held objects. Then, after an interval ranging from 1 to 32 seconds, they were asked whether various stimuli were the same as or different from the originals. In a second experiment, the students were asked to recall sounds, images, and objects after an hour, a day, and then a week. In both experiments, the students’ auditory recall came in last, lagging far behind the tactile and visual memories, which proved roughly equally strong. And the more time that elapsed, the wider the gap grew, with auditory memory falling farther and farther behind the other types of memory.

“Our auditory memory isn’t as robust as we might like to think it is,” says Poremba. “We think that we are great at integrating all the senses,” but the experiment showed that tactile and visual memory easily trumped auditory memory.

The results further suggest that the brain processes tactile and visual memories through a similar mechanism, but that auditory memory is processed differently. This has potential implications for understanding the evolution of the human brain, says Bigelow, since the auditory memory of monkeys and chimpanzees also lags behind their tactile and visual memory.

See Me, Feel Me

As for the here and now, the study holds possible applications for teaching and learning. “This reinforces the importance of multisensory learning and shows that the tactile can be very important,” says John Black, Cleveland E. Dodge Professor in the Department of Human Development at Teachers College, Columbia University. Current technology that combines multisensory and multimedia components—such as iPads, tablets, and e-textbooks—requires students to touch and move their fingers over the screen to access videos, voiceovers, and additional text, which can enable multisensory processing. “This is not to underplay the importance of the verbal, but it emphasizes that we should not forget about the other aspects. You need them all.”

Indeed, the study is a reminder that we need to engage all the senses “to promote learning and memory,” says Janet Brain, a learning disabilities specialist in New York. That approach is already “the hallmark of much of the reading instruction that’s done with dyslexic children.”

Technology Can Help

Along with Black, she finds that technology provides many possibilities for multisensory learning. Interactive computer graphics and videos that add more senses to the mix can “make visual cues much stronger” and “improve visual memory,” she says—and can also increase attention span. In other words, the more varied the ways in which you are exposed to and interact with the material, the more likely you are to remember it.

And if you want a practical example of what can happen when you use primarily one—as opposed to multiple—senses in teaching, Brain points to the once-ubiquitous approach to teaching foreign languages known as the audio-lingual method. One reason for its mixed success, Brain suggests, is that the language labs central to the approach could create a kind of auditory vacuum, with students spending hours listening to audio recordings and sentence drills. Less emphasis was given to connecting the words students heard to objects that would help give those words and sentences meaning—like passing around an apple when students learn the French word for the fruit, pomme.

The final takeaway may come from a Chinese proverb, say Poremba and Bigelow: “I hear and I forget; I see and I remember.”