Chapter 4. We Are What We Seek

“A fanatic is one who can’t change his mind and won’t change the subject.”

Winston Churchill (attributed)

I hate to bring up partisan politics; it generally doesn’t do much good when one is trying to make any form of argument. But it turns out that if I look back at the times in my life when I have had a recognizably bad information diet, they’re the times when I’ve been knee-deep in politics. Working in politics is an amazing opportunity to try to effect change, sure. But it’s also a great way to pick up a disease called delusion.

In the summer of 2003, I packed my bags and headed up to New England to work as the lead programmer for the insurgent presidential candidate, Howard Dean. The staff was reasonably kind—mostly native Vermonters and interns at the time. They liked to pick on this poor Southerner, though; at one point, someone warned me that if I spent too much time outside with my eyes open in the winter, the fluid in my eyeballs would freeze over. I remember shutting my eyes hard and sprinting out across the ice to my car and grasping for the door handle blindly on several occasions. Yankees are tricky, I tell you what.

Cults, startups, Apple keynotes, and political campaigns all have one thing in common: a group of people with delusional loyalty to the mission they’re trying to accomplish. Those of us on the Dean campaign feasted on a diet consisting of the narrative that we would be the ones to remove the evil George W. Bush from office. I ended up gaining a lot from that campaign: about 32 pounds from the constant supply of campaign-contributed Ben and Jerry’s ice cream, and a healthy dose of crazy.

Each morning, the media miners—the folks in charge of watching all the cable news—would feed us clips that told us how well we were doing. The afternoon was filled with blog posts from across the Internet talking about how revolutionary our campaign was. Evenings were filled with watching the latest and greatest episodes of “The West Wing,” starring President Bartlet—the fictional president we assured ourselves was based on Howard Dean, despite producer and writer Aaron Sorkin donating twice as much money in 2004 to the presidential campaigns of Dick Gephardt, Wesley Clark, and John Edwards.

There was also constant speculation: Republican strategist Karl Rove had said, gleefully, that Dean was the candidate he wanted to win the Democratic nomination. We were emboldened by his claim. They were afraid of us—Karl Rove never says what he means. He must be giving us his endorsement because he doesn’t want to face us! We’d try to find as many facts as we could to support this idea.

That CNN cut to Donald Rumsfeld instead of showing Howard Dean’s speech on tax policy? Certainly evidence that the White House was using whatever it could to keep us off the air. Obviously CNN, too, had become an instrument of this evil Republican regime.

The week before the Iowa caucuses, I remember asking the campaign’s pollster, Paul Ford, by how much we were going to win Iowa. His response was: “We’re not. John Kerry is going to win it by 18 points.”

My jaw dropped. I wasn’t sad so much as mad at Paul, and a little disappointed in him. How could he be such a traitor? Hadn’t he seen the news? He clearly was incompetent. Any fool could see that we’d correctly leveraged the Internet in Iowa and this puppy was in the bag. Howard Dean would win Iowa and go on to beat George W. Bush.

But Paul was right and we were crazy. You know the rest of the story: Howard Dean lost the Iowa caucuses by nearly 20 points, and would go on to give a concession speech with a yell that became his defining moment. Only the political intelligentsia would remember his use of the Web; the rest of the electorate would remember him for that terrible scream.

The morning after the caucuses, our Burlington, Vermont, offices were filled with more delusion. One of my colleagues ran up to me as I walked into the office and said, “Clay, did you see the Governor’s speech last night? It was awesome. He’s totally back. We’re going to win this thing.”

We redoubled our efforts—though Dean was down by double digits in New Hampshire, we could make a comeback. With every primary and caucus after that, we convinced ourselves we still had it. As the weeks went by and the sinking feeling got stronger that we would lose to John Kerry, we got hungrier and hungrier for any poll that would give us even a slim chance of winning.

If, a month later, you had polled the staff to ask who would win the Wisconsin primary—our line in the sand—we’d have told you it was Howard Dean. And we’d believe, out of desperation, anything that told us we were right.

We came in third.

Reality Dysmorphia

In eating disorder treatment centers, a physician will often ask the patient to draw an outline of her own body on a large chalkboard. Then, the doctor will ask the patient to place her back against the wall, and trace the actual outline of her body. For many patients, the outlines that they draw are quite exaggerated, sometimes twice as large as their actual bodies.

It’s a phenomenon called body dysmorphia: a person’s self-image becomes detached from reality. The phenomenon goes beyond patients merely thinking they’re a different shape than they really are, though: when the victims of this disorder look in the mirror, they literally see something different from what everybody else sees.

During the Dean campaign, the delusion that resulted from my poor information diet was a cognitive version of this disease: reality dysmorphia. I haven’t met a single campaign operative here in Washington, D.C., on either side, who didn’t have at least a mild case of it.

This kind of delusion comes from psychological phenomena like heuristics, confirmation bias, and cognitive dissonance.

It turns out our brains are remarkable energy consumers. Though it typically represents only 2% of the human body’s weight, the brain consumes about 20% of the body’s energy resources.[40] As such, we’ve evolved—both to conserve the brain’s energy and to aid our social survival—to use shortcuts that let us handle more complex thoughts.

Think of a heuristic as a rule of thumb: a mental shortcut, or the thing you get once you burn your hand on a hot pan and learn that you shouldn’t touch hot pans anymore. You needn’t bother testing that hypothesis again; you know it. Heuristics exist so that you don’t have to keep thinking about such things, and can spend your brain’s energy thinking about something else.

Heuristics have a dark side, though: they give us unconscious biases towards things we’re familiar with, and lead us to keep doing the same thing we’ve always done rather than try something new that may be more efficient.

They cause us to make logical leaps that take us to false conclusions. For instance, these mental shortcuts underpin our capacity for racism, sexism, and other forms of discrimination.

One such nefarious heuristic is called confirmation bias. It’s the psychological hypothesis that once we begin to believe something, we unconsciously begin seeking out information to reinforce that belief, often in the absence of facts. In fact, our biases can grow to be so strong that facts to the contrary will actually strengthen our wrong beliefs.

In 2005, Emory University professor Drew Westen and his colleagues recruited 15 self-described strong Democrats and 15 strong Republicans for a sophisticated test. They used a functional magnetic resonance imaging (fMRI) machine to study how partisan voters reacted to negative remarks about their party or candidate. Westen and his colleagues found that when these subjects processed “emotionally threatening information” about their preferred candidates, the parts of the brain associated with reasoning shut down and the parts responsible for emotions flared up.[41] Westen’s research indicates that once we grow biased enough, we lose our capacity to change our minds.

Following Westen’s study, social scientists Brendan Nyhan and Jason Reifler conducted a new test,[42] and discovered what they believe is a “backfire effect.”

Nyhan and Reifler provided the subjects with sample articles claiming that President Bush stated that tax cuts would create such economic growth that they would increase government revenues. The same articles included corrective statements from the 2003 Economic Report of the President and various other official sources indicating that this was implausible. The researchers then showed the subjects the actual tax revenues as a proportion of GDP, which declined after Bush’s tax cuts were enacted.

The results were fascinating: after reading the correction, the conservatives in the study were even more inclined to believe that tax cuts increase revenue. Hearing the truth made conservatives more likely to agree with the misperception. The facts backfired.

We already know that things like confirmation bias make us seek out information that we agree with. But it’s also the case that once we’re entrenched in a belief, the facts will not change our minds.

Politics is the area in which scientists have studied the psychological causes of bias the most. It’s easy to get people to self-identify, and universities tend to have more of an interest in political science than in other realms of social studies. But you can also see the results of this kind of bias in areas other than politics: talk to a Red Sox fan about whether or not the Yankees are the best team in baseball’s history, and you’ll see strong bias come out. Talk to MacBook owners about the latest version of Windows and you may see the same phenomenon.

We’ve likely evolved this way because it’s safer. Forming a heuristic means survival: watching your caveman friend eat some berries and die doesn’t make you want to conduct a test to see if those berries kill people. It makes you want to not eat those berries anymore, and to tell your friends not to eat those berries either.

Cognitive scientists Hugo Mercier and Dan Sperber took reasoning and turned it on its head. After all, if all the evidence around reasoning shows that we’re actually pretty bad at using it to make better choices, then maybe that’s not reason’s primary function. In their paper “Why do humans reason?,”[43] they argue instead that “reasoning does exactly what can be expected of an argumentative device: Look for arguments that support a given conclusion, and, ceteris paribus, favor conclusions for which arguments can be found.” Mercier and Sperber argue that our minds may have evolved to value persuasion over truth. It certainly is plausible—human beings are social animals, and persuasion is a form of social power.

The seeds of opinion can be dangerous things. Once we begin to be persuaded of something, we not only seek out confirmation for it, but we also reject contrary facts, even in the face of incontrovertible evidence. With confirmation bias and Nyhan and Reifler’s backfire effect in full force, we find ourselves both addicted to more information and vulnerable to misinformation for the sake of our egos.

This MSNBC Is Going Straight to My Amygdala

Neuroscience is the new outer space. It’s a vacuum of promise and fantasy waiting to be filled with science and data. There’s no greater, no more mysterious, no more misunderstood organ in our bodies than the brain. If one weighed the pages of mythology around the brain against the pages of all the scientific papers ever written about it, the scale would likely tip towards myth.

The fields of psychology and neuroscience are filled with misinformation, disagreement, untested hypotheses, and the occasional consensus-based, verifiable, and repeatedly tested theory. And so it’s a struggle for me: on one hand, I’m preaching about information diets, but—in trying to synthesize my own research in the field—I run the risk of accidentally feeding you junk information myself. On the other hand, so much of both fields is applicable to an information diet that it’s impossible not to draw on them.

Banting had an advantage over me. When he wrote his Letter on Corpulence, the Calorie was a unit used to measure the energy consumption of steam engines. Science had not yet scratched the surface of what he’d touched on. I’m writing at the dawn of the science in this field; we know some, but not a lot. It’s as scientifically accurate to say, “This MSNBC is going straight to my amygdala,” as it is to say, “This ice cream is going straight to my thighs.” Only now we have more information and more accurate research about how ice cream actually affects your thighs.

Let’s start with some acknowledgement that our brains are not exactly like the digestive and endocrine systems. Direct comparisons tend to be ridiculous: the rules for how our minds store and process information are different from how our bodies store and process food. Food consumption has immediate effects: drink an extraordinary amount of water, and you may get a fatal case of water intoxication. The same is not true for information; few people have died directly from reading too much PerezHilton.com in a given day.

Cognitive processing does, however, cause physiological changes just like our food does—only not in the same way. Up until a few years ago, it was thought that the human brain became fixed at some point during early childhood. Now science has shown that this isn’t the case; our brains constantly adapt and change their physiological structure. Every time we learn something (according to neuroscientists), it results in a physiological change in the brain.

This phenomenon is called neuroplasticity, and a maxim commonly attributed to Dr. Donald Hebb sums it up: “neurons that fire together, wire together.” Hebb himself put it this way:

“Let us assume that the persistence or repetition of a reverberatory activity (or “trace”) tends to induce lasting cellular changes that add to its stability.… When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A’s efficiency, as one of the cells firing B, is increased.”[44]
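To make Hebb’s postulate a little more concrete, here is a minimal illustrative sketch (mine, not the book’s, and not a model of real neurons) of the simplest textbook form of a Hebbian learning rule: a connection weight grows whenever two simulated units are active at the same time. The variable names, trial values, and learning rate are arbitrary choices for the example.

    # Illustrative sketch of the simplest textbook Hebbian learning rule:
    # when two units are active together, the connection between them strengthens.
    # The names, values, and learning rate here are arbitrary, for illustration only.

    learning_rate = 0.1
    weight = 0.0  # strength of the connection from unit A to unit B

    # Each pair is (activity of A, activity of B) on one trial.
    trials = [(1, 1), (1, 1), (0, 1), (1, 0), (1, 1)]

    for a, b in trials:
        # Hebb's postulate in rule form: the weight grows in proportion
        # to how often the two units fire at the same time.
        weight += learning_rate * a * b

    print(f"Connection strength after repeated co-activation: {weight:.2f}")

Run this and the weight ends up at 0.30: only the three trials where both units were active together strengthened the connection, which is the “fire together, wire together” idea in miniature.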

The human brain is constantly adapting to experiences and the choices the mind makes. In London, taxi drivers must pass a comprehensive exam known as “The Knowledge,” which requires them to instantly create routes for passengers without the use of a GPS or a map. It’s considered the world’s most comprehensive taxi driver’s test, and it takes up to four years to prepare for and pass it.

According to scientists at University College London, this is why London cab drivers have a differently shaped hippocampus than “regular” people.[45] The hippocampus is important to the brain’s ability to move short-term memories into long-term memory and to help with spatial navigation, the skills London’s cab drivers need the most. As a cab driver exercises this part of the brain more, the brain adjusts and lends more neurons to the region. When that happens, old circuits are replaced by new ones.

That’s one example of how doing things changes the physical composition of the brain. What about just reading something? Could something with a lower cognitive load—like watching your favorite television program—alter your brain’s structure?

The answer is likely yes. Every time you learn something new, it results in a physiological change in your brain. In 2005, in the paper “Invariant visual representation by single neurons in the human brain,” Quiroga et al. found that a single cell in the human brain fired only when a picture of Jennifer Aniston alone was shown to a test subject. A different neuron fired when the subject was shown a picture of Halle Berry, another for a picture of the Sydney Opera House, and another for the Baha’i temple.[46]

They even found that the neuron that fired when a subject viewed a picture of a given landmark or celebrity also fired when the subject read the corresponding name spelled out as text. In other words: a picture of the Sydney Opera House fired off the same neuron that seeing the string “Sydney Opera” did.

But that’s about memory. What about beliefs?

Dr. Ryota Kanai and colleagues at University College London’s Institute of Cognitive Neuroscience took self-described liberals and conservatives and studied their brain structure using MRI (magnetic resonance imaging). They found something remarkable: the liberal brains tended to have more gray matter in the anterior cingulate cortex—the region of the brain associated with empathy and conflict monitoring.[47] Conservative brains, by contrast, tended to have an enlarged right amygdala—the part of the brain responsible for picking up threatening facial expressions and responding to threatening situations aggressively.

The science is admittedly sketchy. Kanai’s test group was rather limited—just a small group of students. It also doesn’t explicitly prove that environmental factors had anything to do with the increased sizes of the respective brain regions; this could be genetic.

That said, I contacted Kanai and asked him if these differences in brain region sizes could be the result of media consumption and other environmental factors. Here’s what he had to say:

“From our study, it’s hard to resolve the chicken or the egg causality of brain structure and political orientation. I think this needs to be further explored with additional empirical work. As you suggested, exposure to politically tinged information could have influenced people’s political opinions, and it would be very interesting to see if such changes are reflected in brain structure. This is an empirical question we have to answer by more experiments.”

I also contacted another respected neuroscientist in the field, Dr. Marco Iacoboni at UCLA, to see what he had to say:

“I think it’s plausible, although unprovable at this stage. I mean, any decision we make is based on neurophysiological activity, it doesn’t come from the gods. If people, on average, become more or less liberal, in some way something must have happened in their brain. The tricky issue is the chain of causes and effects.”

Whether or not media consumption could physically alter your brain to be more partisan is unknown. But what’s known is that whenever you learn something new, the result is a physiological change in the body—just like whenever you eat. Another similarity is that we’re not in direct control over what changes get made to our brains, and where.

Search Frenzy

Back in 1954, psychologist James Olds found that if he allowed a rat to pull a lever and administer a shock to its own lateral hypothalamus, a shock that produced intense pleasure, the rat would keep pressing the lever, over and over again, until it died. He found that “the control exercised over the animal’s behavior by means of this reward is extreme, possibly exceeding that exercised by any other reward previously used in animal experimentation.”[48] This launched the study of brain stimulation reinforcement, which has been shown to exist in all species tested, including humans. At the heart of brain stimulation reinforcement is a neurotransmitter called dopamine.

Dopamine makes us seek, which causes us to receive more dopamine, which causes us to seek more.[49] That jolt you feel when you get a new email in your inbox, or hear the sound of your cell phone’s ding? That’s dopamine, and it puts you in a frenzy. This used to be helpful: our dopamine systems helped us, as a species, to find resources, acquire knowledge, and innovate. But in an age of abundance, there are new consequences.

Dopamine often puts us in a loop. With all the inputs available to us today—all the places notifications come from: our email inboxes, our text messages, our various social network feeds, and the blogs we read—our brains get thrown into a runaway loop in which we’re unable to focus on the task at hand. Instead, we keep pursuing new dopamine reinforcement from the deluge of notifications headed our way.

We got this way because of evolution. We’re wired to seek. For thousands of years, those who sought out information lived longer, got to have sex, and passed on their genes. We’re information-consumption machines that evolved in a world where information about survival was scarce. But now it’s abundant. With cheap information all around us, if we don’t consume it responsibly, it could have serious health consequences.
