Researchers may have identified a brain quirk that promotes pessimism
by Markham Heid
Your brain is like a faultless movie projector. Sights, sounds, and a jumble of other sensory information pass into it via the spinning reel of your existence, and your brain reconstitutes that hodgepodge into an objective, lossless experience that you call consciousness.
Of course, that’s wrong.
Your brain is actually not a faultless projector. The reality it makes for you is biased and suggestible. Expectation, experience, emotion, and many other variables shape the world that your brain creates.
In his 2019 book Rethinking Consciousness, the Princeton psychologist and neuroscientist Michael Graziano explains that the brain’s interpretation of reality is built upon internal models that are patchy, subjective, and skewed — “like impressionistic or cubist paintings of reality,” he writes.
“Our intuitive understanding of the world around us and our understanding of ourselves, always distorted and simplified, are dependent on those internal models,” he adds.
A lot of recent scientific inquiry has explored how the brain constructs these internal models, and how their flaws may get us into trouble.
Some of that work has examined the brain’s heavy reliance on predictions born of experience. While helpful in some contexts, those predictions — and, by extension, your reality — may be imbalanced in deeply problematic ways.
Take a look at this dot:
Is it blue or is it purple?
For a 2018 study in the journal Science, researchers posed this question again and again. Over a series of five experiments, they found that people’s answers were surprisingly vulnerable to manipulation.
In one experiment, the researchers showed people hundreds of dots that ranged in hue from solidly blue to solidly purple. At first, the proportion of blue- to purple-colored dots was equal, and the people’s answers reflected this split. But after a while, the researchers showed some of the people fewer and fewer blue dots. Invariably, these people started to label more of the purple-shaded dots as blue.
“When the prevalence of blue dots decreased, participants’ concept of blue expanded,” the study authors wrote. The people’s brains seemed intent on balancing out the proportion of dots based on prior experience. Even when the researchers explicitly told people that they would see fewer blue-shaded dots, the people continued to make the same error.
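The shrinking-prevalence effect can be sketched as a toy simulation. This is my own illustrative model, not the study’s actual procedure or analysis: it assumes an observer whose blue/purple boundary is set relative to the hues seen recently, rather than fixed. Once blue dots become rare, that relative boundary drifts downward and purple-ish hues start getting labeled blue.

```python
import random

def simulate(trials=800, window=100, seed=0):
    """Toy model of prevalence-induced concept change (an assumption
    for illustration, not the study's method): the observer's
    blue/purple boundary is the mean of recently seen hues, so the
    category 'blue' expands when blue dots become rare."""
    rng = random.Random(seed)
    recent = []                      # sliding window of recent hues
    boundary_early, boundary_late = [], []
    for t in range(trials):
        # first half: blue (hue > 0.5) and purple dots equally likely;
        # second half: blue dots become rare, as in the experiment
        p_blue = 0.5 if t < trials // 2 else 0.06
        if rng.random() < p_blue:
            hue = rng.uniform(0.5, 1.0)   # blue-ish dot
        else:
            hue = rng.uniform(0.0, 0.5)   # purple-ish dot
        recent.append(hue)
        recent = recent[-window:]
        # relative judgment: the boundary tracks recent experience,
        # not a fixed tape-measure cutoff
        boundary = sum(recent) / len(recent)
        (boundary_early if t < trials // 2 else boundary_late).append(boundary)
    return (sum(boundary_early) / len(boundary_early),
            sum(boundary_late) / len(boundary_late))

early, late = simulate()
# the boundary drops once blue dots grow scarce, so dots the observer
# once called purple now count as blue
assert early > late
```

In this sketch, the only moving part is the sliding window of recent hues; swapping a fixed cutoff of 0.5 for the window mean removes the drift entirely, which is the contrast Levari draws between absolute and relative judgment.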
Dots are one thing. But in two additional experiments, the study team found that the same effect turned up when people assessed threatening faces or weighed in on ethical matters.
“For all of these different types of judgments, what counts as blue or threatening or unethical is affected by the prevalence of what you’ve seen before,” says David Levari, Ph.D., first author of the study and a post-doctoral research associate at Harvard University. “So when the threatening faces or blue dots go away, you still see them.”
He and his colleagues termed this phenomenon “prevalence-induced concept change.” Basically, the reality your brain constructs is susceptible to a kind of experience-based inertia.
“Your brain’s not like a tape measure that measures exactly how long something is,” Levari says. “The brain makes relative judgments, and those can be pushed around in ways people don’t appreciate.”
In a lot of situations, this tendency to make relative judgments is helpful and efficient. It can facilitate quick decisions or assessments that are context-appropriate. But as Levari’s research showed, this tendency can also lead us to perceive threats or unethical behavior where none exists.
In their study paper, Levari and his colleagues wrote that the brain’s apparent tendency to mold new information to align with its prior expectations may have some “sobering” implications.
“Although modern societies have made extraordinary progress in solving a wide range of social problems, from poverty and illiteracy to violence and infant mortality, the majority of people believe that the world is getting worse,” they wrote. “The fact that concepts grow larger when their instances grow smaller may be one source of that pessimism.”
There are big problems out there that demand our attention and energies. No one is suggesting otherwise. But Levari’s work indicates that even when things get objectively better — a happy event that should gratify and inspire us — our minds may struggle to acknowledge that improvement.
He says that prevalence-induced concept change can work in both directions — meaning it can pull us toward overly sanguine viewpoints. But some of our habits may be nudging (or shoving) our minds toward excessively negative representations of the world.
According to a 2020 Nielsen report, the average American adult spends somewhere between 11 and 12.5 hours a day consuming some type of media. That’s up from about seven hours in 1980.
As our media diet has swelled, the rise of machine learning and targeted-content algorithms has created “filter bubbles” (a.k.a. information echo chambers) that steer us toward content that reinforces our beliefs, feelings, or proclivities. Levari’s research suggests that even when we know what we’re seeing is imbalanced or inaccurate, our views are nonetheless influenced.
“The things you see on social media or in other media may not be appropriately representative of things in the real world, or in your world, but they can loom very powerfully in your mind,” he says.
If you’re exposing your brain to a steady stream of content that is excessively angry, snarky, aggrieved, despondent, politically preoccupied, or otherwise lopsided, these attributes will gradually saturate your reality. Your brain will detect them everywhere. Blue dots will become purple.
“The main takeaway,” Levari adds, “is that we are very sensitive to what we get exposed to, and especially to things we see over and over again.”