Scientists explain why our memories are so inaccurate
I’ve been re-experiencing my youth lately as movies that replay at random times inside my own head.
Sometimes, it’s the one where I’m rushed to the emergency room, age 10, after accidentally stabbing myself in the guts while climbing the spiky gate in front of my childhood home. Or when my bike-riding brother was hit by a car. Or when my father, home from the hospital, gathered his children in the back bedroom to tell us that our cancer-stricken mother, age 45, had “gone home to God.” It’s not surprising that my brain films usually contain highly emotional or fear-charged content, since those are the ones we’re all said to recall most vividly, and in the greatest detail.
But recently I’ve wondered: Are these memories even accurate?
Consider the Brian Williams episode.
Even if you discard the notion that he lied outright, the former NBC news anchor’s inability to distinguish fact from fiction in recounting his 2003 helicopter-in-Iraq tale drew a near-universal shrug from memory scientists (if not from media pundits), who said the episode was not that big a deal. Turns out, Ronald Reagan, George W. Bush and Hillary Clinton—to name just a few White House types—have all been caught on tape embellishing an emotional tale from the past, a phenomenon known as a “prestige-enhanced memory distortion.”
But when New York Times media critic David Carr died in February, less than a week after Williams went off the air, I took it as a sign that it was time to explore the topic of memory. I knew the renowned columnist and former alternative weekly editor a bit, and his newsroom death sent me back to his 2008 memoir, The Night of the Gun. In it, Carr investigated his own volatile life’s events and basically discovered that his memories were vastly unreliable.
“Memory,” he concluded, “uses the building blocks of fiction—physical detail, arc, character, and consequence—to help us explain ourselves to ourselves and to others.”
Well, the possibility that our memories suffer from truthiness seems especially weighty today, what with the world’s population living longer than ever, baby boomers confronting age-related memory loss in record numbers, and disorders like Alzheimer’s disease readying to avalanche the health care system.
There’s no doubt that the human brain—and how it encodes, stores and retrieves information—has evolved over hundreds of thousands of years and is surely one of the most remarkably complex marvels in the universe. Memory itself is a work of genius, serving as a GPS (with astonishing spatial-navigation capacity), an interactive Rolodex (cross-indexing people, places and things with dexterity) and a personal encyclopedia (sparing us from having to learn the same things over and over).
But even given all that—just how flawed are our memories? Is it possible that some of our most treasured ones never really happened? Can false memories really be planted inside our brains? Finally, in this age of exponential self-documentation, will our memories’ fallibility become more or less relevant?

The stuff inside the brain
Jordan Crivelli-Decker—a graduate student with a giant white smile and face like Michael Cera, the teenage love interest in Juno—sits inside what looks like a double-wide refrigerator on the second floor of the UC Davis Center for Neuroscience. He’s inside this soundproof booth to demonstrate what it’s like to undergo a particular memory experiment: an EEG, or electroencephalogram.
Two researchers from the Davis Dynamic Memory Lab team apply hair gel-type goo to dozens of small, metal discs attached to a stretchy purple cap on Crivelli-Decker’s head. They fasten wires and plug everything into a computer, with the gel serving as a kind of conductor between the electrodes and the subject’s brain. Seated at a keyboard, closed into the cubicle, Crivelli-Decker begins tapping out responses to a standard working memory test.
The goal of the EEG is to monitor the tiny electrical currents the brain produces and, sure enough, we’re soon looking at a computer readout of a memory at work. The trial is just one of several used “to measure stuff inside the brain” in the lab of prefrontal cortex rock star Charan Ranganath, a cognitive neuroscientist and UC Davis psychology professor.
“The currents are very, very tiny,” Ranganath says. “But when there’s populations of tens of thousands of cells operating in sync,” you can see patterns of brain activity.
Ranganath—a charismatic young scientist who loves Twitter, guitar bands (noise rock, math rock) and cognitive neuroscience—won a prestigious Guggenheim Fellowship recently for his work on Alzheimer’s disease.
“We’re trying to figure out how different brain areas code for different pieces of information,” he explains. In addition to the EEG, Ranganath’s crew uses eye-tracking techniques and fMRIs (functional magnetic resonance imaging) to track areas of activity in the brain so as to better learn how it files and retrieves information.
There’s something experts agree on: Memories are not as accurate as most of us believe they are.
Basically, every time a person recalls an experience, that memory is opened up for potential change. In remembering, you are actually reconstructing a memory.
Once an event is over, it’s gone, says Ranganath. “All you have is the story you created. It’s more like bits of sights, sounds and words stored in different parts of the brain that you put together into a story.
“You ask anybody who does memory work and they’ll tell you that,” he says. “Nothing’s true in memory.”
Ranganath calls himself “part of a movement” of scientists who believe that the brain makes a “pretty fundamental distinction between two halves … the ‘who’ and the ‘what’ parts, and the ‘when,’ ‘where’ and ‘how’ parts.” One half concerns itself with information about people and things; the other is devoted to the context in which you encounter those people and things.
“When you track different features of memories, there are parts that track the objective part of what really happened [who and what], and then there’s brain areas that follow much more the mental construction [when, where, how]—the story we tell ourselves.”
The latter realm, the one holding context, is where the brain gets a bit creative in recall and tends to generate a “story” around what actually happened.
When I mention to Ranganath that I’d often experienced a memory from childhood as a video, he agrees that’s how most people think it works: The mind records an event, then plays it back.
But that’s wrong, he says.
“It’s not like you’re pulling a book off the shelf and putting it back on the shelf,” he said. “It’s more like pulling it off, doing a little bit of scribbling in it, then putting it back.”
And what about out-and-out memory failures, beyond our tendency to “reconstruct” some of the context?
Memory can fall short for various reasons, he explained. Alzheimer’s, schizophrenia, stroke, brain trauma—all can seriously disrupt recall processes. And plenty of people experience run-of-the-mill memory failure simply because they didn’t pay enough attention at the time of “encoding,” when the brain takes the first step in processing a new memory.
But simply getting older is the main culprit when it comes to most failing memories. In normal aging, the hippocampus shrinks naturally, resulting in an increased loss of details, such as names or where you last placed an object (like your keys).
Is there hope for the average person facing memory loss due to aging?
Ranganath’s advice is simple: Make sure to get exercise (30 minutes a day), try to get enough sleep and reduce chronic stress.
“Those are as good as we have right now,” he shrugs. “Our brain cells require energy to do what they do, and you want to be able to deliver that as quickly as possible. A brain that can rapidly get the glucose it needs—that brain has the advantage.”
Still, for most people, memory “improves until you hit college age,” he says. The professor holds his hand up high, as if it’s an airplane that’s taken flight. “Then it starts to decline,” he laughs. “From 30 on, it’s …”
Ranganath moves his hand swiftly downward, crashing the plane.

Famous blue Icee
When you were 5 years old, you got lost in the shopping mall. Remember? Your mom gave you some money to buy a blue Icee, you ran ahead and somehow got turned around. This elderly Chinese lady found you, right? Then your mom appeared and took you to get the Icee.
The above is a fictional tale, spun by experimental psychologist Beth Loftus, one of the world’s leading authorities on memory. Loftus, now a professor of psychology and of law at the University of California, Irvine, has testified in hundreds of courtrooms, mostly for the defense in criminal cases, as an expert on the human mind’s ability to distort memories.
Loftus invented the “lost in the mall” story—and many other such narratives—to demonstrate that she could implant false memories in human minds. The result: She could. When she merely suggested the “lost” scenario to 24 adults, six came to fully believe that they’d actually been lost in the mall.
“We can easily distort memories and implant memories,” Loftus said flat out. “There is no question that memories can be contaminated.”
In fact, Loftus has a three-step “recipe” for how to plant a fake memory, whether purposefully or not. (Hint: It’s not how Leonardo DiCaprio did it in Inception.) First, you must possess the subject’s trust. Second, you plant the seed of an incident with specific detail (like the blue Icee). Third, you coax the subject to imagine the scene unfolding.
In the mid-1980s and early ’90s, the recipe was what some therapists unknowingly followed to “recover” incest memories from their clients’ childhoods. Loftus and others eventually proved that many of these recollections were false and had been unwittingly “implanted.”
Her work has perhaps a more verifiable parallel in mice.
Just last March, French neuroscientists successfully implanted false memories into the brains of sleeping mice, confirming findings from previous studies. Using electrodes to stimulate and monitor the activity of nerve cells in the mice, they created fake memories that changed mice behavior when they awoke.
UC Davis neuroscientist Brian Wiltgen has done similar work, using light to actually erase specific memories in mice (as with that memory-erasing stick in Men in Black, I imagine).
Another expert in the recall field is Jianjian Qin, a Sacramento State psychology professor who studies the mechanisms of memory in humans. Qin conducted research that indicated it was indeed possible to get adults to believe an event occurred in their childhood that actually did not. His work, co-published in 2008 with others in The Journal of Experimental Psychology, went on to show that people are basically no good at differentiating between their “fake” and “real” memories.
“Confidence does not equate with accuracy,” Qin said.
“If you have an eyewitness sitting there in a courtroom, pointing to the defendant, saying, ‘He’s the one,’ that’s the most convincing, powerful evidence you can have,” said Qin.
But it’s often not good evidence, he said. The Innocence Project (a nonprofit legal organization dedicated to overturning wrongful convictions with DNA evidence) found that more than 70 percent of convictions overturned through DNA testing had relied on eyewitness testimony.
Qin conducts a basic exercise in his cognitive psychology class. “I ask them to remember the last time they had a fight with their significant other,” he said. “Then I say, ‘How many of you remember something that happened in a different way than your significant other did?’”
All the hands go up.
“At some level,” he said, “people know memory is not always trustworthy.”
According to Ranganath, the fallibility of our recall system is not a flaw, just a fact. “Even when people think they’re telling you what actually happened, they’re making it up,” he said. “And you have to. If you didn’t do that, you’d have a memory disorder.”
Ranganath believes our brains developed the way they did to best “predict our future based on our past.”
“It’s not like our brains evolved to remember with precision a 16th birthday party or a first date or something,” he said. “Those things don’t give you much survival power.”
Ranganath theorizes that the human brain is simply not designed to enable us to accurately remember each and every detail about events that are nonessential from an evolutionary or “survival” standpoint. However, brain functions and memories that enhance our ability to adapt and survive—say, by enabling us to size up an individual or situation that may pose a threat—tend to be more truthful.
“What gives you survival power is the ability to use information to do things in the future that are adaptive.
“Our brains are designed to take experiences we have and strip them like somebody might strip a car,” he said. And then figure out, “How can I use this in my world?”

The future of memory
Each of us has two selves, writes Nobel laureate Daniel Kahneman. The behavioral economist used a now-famous 2010 TED Talk to unpack how our “experiencing self” differs from our “remembering self.”
“There’s a difference between being happy in your life and being happy with your life,” said the author of Thinking, Fast and Slow. He described how the self “which does the living” (the experiencing self) is completely different than the self that “maintains the story of our life” (the remembering self).
The experiencing self lives in the present, he said. The remembering self compiles “stories” we reconstruct from our less-than-trustworthy memories for future reference.
Kahneman’s theory inspires a thoroughly modern question: How might social media—with its 24/7 ability to document what we are experiencing—shape our individual and collective recall in the years ahead?
In other words, will it matter that we remember poorly if there’s backup evidence as to what really happened on Facebook or Instagram?
Jesse Drew, a self-described “social media fanatic” who teaches media archeology as an associate professor of technocultural studies at UC Davis, thinks the answer is equal parts yes and no.
The most accurate way of digging up history, i.e., remembering the narrative of our past events, used to be through people’s correspondence and personal diaries, he said. But both these modes are “almost dead,” replaced by Facebook and social media.
“As we all know, Facebook is PR,” he said. “It’s used primarily to boast and show off the good side of things. That’s not very helpful when you’re trying to uncover what really happened.”
Still, Drew believes the technology may hold its own cure. There’s a tendency, especially among the young, to engage in “memory testing,” he said. And the trend may catch fire enough to aid our collective memory going forward.
“People throw out a memory of an event on social media, and people chime in and say, ‘Yes, it happened that way,’ or ‘No, it didn’t happen that way really, but I remember this,’” he said. “Social media doesn’t have to be PR. It can actually help hone your memory.”
Drew said he’s seen a tendency, too, toward young people using social media for oral histories and to build “memory sites.” He referenced the many Facebook pages now dedicated to displaying collective memories, for example, of the ever-gentrifying Mission District in San Francisco, where he has a personal history.
“People are starting to preserve memories more and test memories with other people,” he said. “There are a lot of great history projects out there.”
When asked for thoughts about science’s ability to “incept,” or implant, memories in human minds, Drew said it’s a case of been there, done that.
“The Nazis knew that, too,” he said. “History is what people write about history. It’s not really what happened.”
Acutely aware of the failures of memory since his own mother suffers from Alzheimer’s, Drew encourages his young students to dig up some history of their own, find the oldest member of their extended family, “sit down and interview them” about their life while it’s possible. If some of the memories are not entirely accurate, he said, so be it.
“Video is cheap,” he said, “and you can counter the fluff of Facebook with deeper storytelling and oral histories. … I think it’s starting to swing this way. People want more authenticity, a more in-depth understanding of their world and the people in their lives.”

You must remember this
In the spirit of memory testing, I sought the help of my sister and brothers. I asked them to describe the details about that shared, traumatic event from our childhood that I mentioned earlier: the one where we were told our mother had died.
It turned out our memories varied, but only slightly.
It happened in daylight, we all agreed.
The sun poured in through the louvered windows in the back bedroom, where my father asked his six children, all under the age of 12, to gather. Our grandmother and an older cousin were also present, though a few of us don’t remember these two in attendance. My brothers recall spatial aspects, like where they were seated in the room, while my sister and I do not. We all remember our father delivering the awful news that our mother was never coming home again. We all recall crying together as a family. I’m the only one with a memory of my father’s exact words (“Your mother has gone home to God”), though after researching this story I realize that maybe the writer in me “reconstructed” the memory over the years with added dialogue from our Catholic upbringing.
After sharing his memory, one of my brothers thanked me for dredging up his recall on this defining moment in our lives—“thanks, this has been healthy for me”—and, like a key in a lock, his words illuminated a paradox that had been hiding from me.
There’s something more essential than the veracity of old memories. It’s the importance of recalling them, however reconstructed, with others, especially loved ones, in the present.
In The Art of the Novel, Milan Kundera writes: “There would seem to be nothing more obvious, more tangible and palpable than the present moment. And yet it eludes us completely. All the sadness of life lies in that fact.”
Or maybe our remembering selves don’t deserve that much credit. Perhaps it is simply a fact of our existence that we must struggle so fervently to experience life more frequently in the “tangible and palpable” here and now.