What's a brain bias?

Cognitive biases interrupt our ability to make rational decisions in our personal and financial lives.

We come into this world with a hidden repertoire of biases that seduce us into acting irrationally in a variety of common situations. These behaviors were adaptive: by producing fast decisions and strong actions, they enabled us to survive in a hunter-gatherer environment. Most biases operate outside conscious awareness. The short articles that follow are mainly about identifying them. That's the hard part. But once a bias has been identified, correcting it becomes a far more tractable, almost mechanical exercise. Each one becomes a point of inquiry. Cognitive biases are also called "effects," "errors," "fallacies," "glitches," and "illusions."

The biases are energy efficient: there's only so much information our brains can analyze before a cognitive-load maximum is reached, and in the lead-up to that load, our critical-thinking faculties get sloppier. That's when mental shortcuts, like the brain biases, are useful. The history of "Cognitive Load Theory" can be traced back to the beginnings of cognitive science and the work of G. A. Miller (1956), who was one of the first to suggest that our working-memory capacity is limited.




The Confirmation Bias

The tendency to seek evidence that agrees with our position and dismiss evidence that does not.

Instinctively, most humans avoid evidence that contradicts their opinions. Contrary information is upsetting and confusing. We don't want to admit our beliefs may be wrong. Admitting to thinking errors feels like a put-down. Wallowing in self-righteousness feels warm and fuzzy. The Confirmation Bias is so powerful that even when we understand it deeply and witness our intransigence, we find it hard to correct.

Michael Shermer, author of The Mind of the Market, says: "Confirmation bias is where we look for and find confirmatory evidence for what we already believe and ignore disconfirmatory evidence."

Lewis Wolpert, author of Six Impossible Things before Breakfast, says: "Beliefs, once acquired, have a kind of inertia in that there is a preference to alter them as little as possible.  There is a tendency to reject evidence or ideas that are inconsistent with our beliefs." (page 85)

The Confirmation Bias sways us to...
•  favor evidence that agrees with our position
•  believe the future will bring new evidence to support it
•  cling stubbornly and passionately to our stance
•  adopt positions from traditions, religions and ideologies

Synonyms:  The Semmelweis Effect, the belief bias, belief preservation, biased assimilation, belief overkill, hypothesis locking, polarization effect, the Tolstoy syndrome, selective thinking, myside bias, law of fives, and Morton's demon.


We all seem convinced we're right about politics, religion or science these days. What makes us so sure of ourselves?

By Robert Burton, author of On Being Certain


The Self-Serving Bias

The tendency to take credit for desirable outcomes and blame others for undesirable ones.

A student who gets a good grade on an exam might say, "I got an A because I am intelligent and I studied hard!" whereas a student who does poorly on an exam might say, "The teacher gave me an F because he doesn't like me! It's not my fault."

Three reasons have been proposed to explain the self-serving bias. The first is motivational: people are motivated to protect their self-esteem, so they create explanations that make them feel better. The second focuses on making impressions on others: although people may not believe the content of a self-serving utterance, they may offer it to others to create a favorable impression. The third focuses on the mechanisms of memory: reasons for success might be more memorable than reasons for failure.

The Contrast Bias

The tendency to mentally upgrade or downgrade an object when comparing it to a contrasting object.

We constantly compare things, people and situations. We deem them bad, good or neutral depending on what we've recently experienced in the same category. We voted for Obama because we compared him to Bush. Contrast effects are common in human thinking.

• A hefted weight is perceived as heavier than normal when "contrasted" with a lighter weight. It is perceived as lighter than normal when contrasted with a heavier weight.
• An animal works harder than normal for a given amount of reward when that amount is contrasted with a lesser amount and works less energetically for that given amount when it is contrasted with a greater amount.
• A person appears more appealing than normal when contrasted with a person of less appeal and less appealing than normal when contrasted with one of greater appeal.

So if you were to compare your car to a clunker and not a Maserati, you'd feel better.

The Planning Fallacy

The tendency to underestimate the time it takes to complete a task.

The Denver International Airport opened 16 months late, at a cost overrun of $2 billion.  The Eurofighter Typhoon, a joint defense project of several European countries, was delivered 54 months late at a cost of £19 billion, instead of £7 billion. The Sydney Opera House may be the most legendary construction overrun of all time, originally estimated to be completed in 1963 for $7 million, and finally completed in 1973 for $102 million. Planners tend to focus on the project itself and underestimate time for sickness, vacation, meetings, and other "overhead" tasks. 

In one study, 37 students were asked to estimate the completion times for their senior theses. The average estimate was 33.9 days. Only about 30 percent of the students completed their thesis in the time they predicted, and the average actual completion time was 55.5 days.
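The study's numbers suggest a simple rule of thumb: inflate your own estimate by the observed ratio of actual to predicted time. A minimal sketch, where the 33.9- and 55.5-day averages come from the study above and the 10-day input is just an illustrative example:

```python
# Planning-fallacy correction: scale a naive estimate by the
# actual/predicted ratio observed in the senior-thesis study.
predicted_avg = 33.9  # average predicted completion time (days)
actual_avg = 55.5     # average actual completion time (days)

overrun_factor = actual_avg / predicted_avg  # ~1.64

def corrected_estimate(naive_days: float) -> float:
    """Inflate a naive time estimate by the observed overrun factor."""
    return naive_days * overrun_factor

print(round(overrun_factor, 2))          # 1.64
print(round(corrected_estimate(10), 1))  # 16.4
```

In other words, the students' projects ran roughly 64 percent longer than predicted, which is why padding an estimate by half again is a common heuristic.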

On the opening date for the St. Regis in Kauai: "Originally slated to accept reservations as early as March, the St. Regis public relations department released an official announcement Tuesday evening saying that the opening has been delayed until October 1." (The Garden Island; May 1, 2009)

The Just-World Bias

The tendency for people to believe the world is "just" and therefore people "get what they deserve."

People fantasize that the world is "just," so when they witness something bad happening to someone, they often rationalize it by searching for things the victim might have done to deserve it. "She brought it on herself because she..." This deflects their anxiety and lets them continue to believe that the world is a just place, but at the expense of blaming victims for things that weren't, objectively, their fault. It's a not-nice, but natural, way to judge people in need. We must learn to distinguish between true karma and false karma. The caste system in India is a classic case of false karma because it sentences millions of the unborn to a slummy slot in life.

The Loss Aversion Bias

The tendency to find losses twice as painful as we find gains pleasurable.

“People hate losses (and their Automatic Systems can get pretty emotional about them). Roughly speaking, losing something makes you twice as miserable as gaining the same thing makes you happy....Consequently loss aversion produces inertia, meaning a strong desire to stick with your current holdings,” says Richard Thaler, author of Nudge: Improving Decisions About Health, Wealth and Happiness. Loss aversion was first convincingly demonstrated by Amos Tversky and Daniel Kahneman.
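Kahneman and Tversky formalized this asymmetry in prospect theory's value function: v(x) = x^α for gains and v(x) = -λ(-x)^α for losses. A minimal sketch using their commonly cited median parameter estimates (α ≈ 0.88, λ ≈ 2.25); the $100 amounts are just examples:

```python
# Prospect-theory value function: losses loom larger than gains.
ALPHA = 0.88   # diminishing sensitivity to magnitude
LAMBDA = 2.25  # loss-aversion coefficient

def subjective_value(x: float) -> float:
    """Felt value of a gain (x > 0) or loss (x < 0)."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** ALPHA)

gain = subjective_value(100)
loss = subjective_value(-100)
print(round(gain, 1))          # 57.5
print(round(loss, 1))          # -129.5
print(round(-loss / gain, 2))  # 2.25 -- the loss "hurts" 2.25x the equal gain
```

The λ parameter is exactly Thaler's "twice as miserable" in quantitative form: a $100 loss feels roughly twice as bad as a $100 gain feels good.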

LOSS-AVERSION & MONEY MANAGEMENT: "Loss aversion also explains one of the most common investment mistakes: investors evaluating their stock portfolios are most likely to sell stocks that have increased in value. Unfortunately, this means that they end up holding on to their depreciating stocks. Over the long term, this strategy is exceedingly foolish, since ultimately it leads to a portfolio composed entirely of shares that are losing money. Even professional money managers are vulnerable to this bias and tend to hold losing stocks twice as long as winning stocks. Why does an investor do this? Because he is afraid to take a loss—it feels bad—and selling shares that have decreased in value makes the loss tangible. We try to postpone the pain for as long as possible...The only people who are immune to this mistake are neurologically impaired people who can't feel any emotions at all. In most situations, these people have very damaged decision-making abilities. And yet, because they don't feel the extra sting of loss, they are able to avoid the costly emotional errors brought on by loss aversion," writes Jonah Lehrer in his new book, How We Decide.

LOSS-AVERSION AND WAR: Nations have gone to war until their doom because of loss aversion: refusing to admit a mistake. "Once we have committed a lot of time or energy to a cause, it is nearly impossible to convince us that it is unworthy." The real question is, "How bad do your losses have to be before you change course?" (Social Psychology, Fourth Edition, Aronson et al., p. 175)

"The Neural Basis of Loss Aversion in Decision-Making Under Risk"

We are risk-loving over losses and risk-averse over gains.

The GroupThink Bias

The tendency to do (or believe) things because many other folks do. 

Decades of research show people tend to go along with the majority view, even if that view is objectively incorrect. Now, scientists are supporting those theories with brain images. A new study in the journal Neuron shows when people hold an opinion differing from others in a group, their brains produce an error signal. A zone of the brain popularly called the "oops area" becomes extra active, while the "reward area" slows down, making us think we are too different.

The two leading theories of conformity are that people look to the group because they're unsure of what to do, and that people go along with the norm because they are afraid of being different, said Dr. Gregory Berns, professor of psychiatry and behavioral sciences at Emory University School of Medicine in Atlanta, Georgia. Berns' research, which he describes in the book Iconoclast: A Neuroscientist Reveals How to Think Differently, found that brain mechanisms associated with fear and anxiety do play a part in situations where a person feels his or her opinion goes against the grain. 

Also called "Herd Behavior" and "The Bandwagon Effect" and "The Conformity Effect."

MANTRA: To act more rationally, I must not get swept up into group acting, thinking and feeling frenzies.

The Beautiful People Bias

The tendency for beautiful people to receive more rewards than less attractive people.

Do pretty people earn more? Studies show attractive students get more attention and higher evaluations from their teachers, good-looking patients get more personalized care from their doctors, and handsome criminals receive lighter sentences than less attractive convicts. But how much do looks matter at work? The ugly truth, according to economics professors Daniel Hamermesh of the University of Texas and Jeff Biddle of Michigan State University, is that plain people earn 5 percent to 10 percent less than people of average looks, who in turn earn 3 percent to 8 percent less than those deemed good-looking. These findings agree with other research that shows the penalty for being homely exceeds the premium for beauty and that across all occupations, the effects are greater for men than women.

Size matters, too. A study released last year by two professors at the University of Florida and University of North Carolina found that tall people earn considerably more money throughout their careers than their shorter co-workers, with each inch adding about $789 a year in pay. A survey of male graduates of the University of Pittsburgh found that the tallest students' average starting salary was 12 percent higher than their shorter colleagues'. The London Guildhall study showed that overweight women are more likely to be unemployed and that those who are working earn on average 5 percent less than their trimmer peers.

According to Dr. Gordon Patzer, who has spent more than three decades studying and writing about physical attractiveness, human beings are hard-wired to respond more favorably to attractive people. Even studies of babies show they will look more intently and longer at prettier faces.

"Good-looking men and women are generally judged to be more talented, kind, honest and intelligent than their less attractive counterparts," Patzer says. "Controlled studies show people go out of their way to help attractive people—of the same and opposite sex—because they want to be liked and accepted by good-looking people."

These conclusions may not sound too pretty to those of us who were dealt a bad hand in the looks department. But don't rush off to try out for the next round of "Extreme Makeover" just yet. Despite what the research says, some of the world's most successful people have been ordinary looking at best, and you would never mistake the faces in Fortune for those in Esquire or Entertainment Weekly. Business legends are often of average height (Bill Gates at 5 feet 9 inches) or even diminutive (Jack Welch, 5 feet 8 inches, and Ross Perot, 5 feet 7 inches). What's more, many folks who are lovely to look at complain that they lose out on jobs because people assume they are vacuous or lightweights.

According to Gordon Wainwright, author of Teach Yourself Body Language, anyone can increase their attractiveness to others if they maintain good eye contact, act upbeat, dress well (with a dash of color to their wardrobe) and listen well. He also stresses the importance of posture and bearing and suggests that for one week you stand straight, tuck in your stomach, hold your head high and smile at those you meet.

The Von Restorff Effect

The tendency to recall an item that "stands out like a purple cow" more easily than other items in a group.

A phenomenon of memory in which radically different things are more easily recalled than ordinary things. Memorizing isn't simply a matter of repetition. Attention plays a role in organizing material in ways that influence its later recall. For example, in any given list of items to be learned, an item that differs from the rest in size, color, or other basic characteristics will be more readily recalled, such as a word printed in differently colored ink or highlighted on a grocery list.

This bias is named after the German psychologist Hedwig von Restorff (1906–1962) who first reported it in 1933. Also known as The Isolation Effect.

The Omission Bias

The tendency to judge harmful actions as worse, or less moral, than equally harmful inactions.

Sam, a tennis player, would be facing a tough opponent the next day in a decisive match. He knows his opponent is allergic to a food substance. Suppose Sam recommends the food containing the allergen to hurt his opponent’s performance, or the opponent himself orders the allergenic food, and Sam says nothing. Most people judge Sam’s action of recommending the allergenic food as being more immoral than Sam’s inaction of not informing the opponent of the allergenic substance. Which stance would you take?

The Neglect-of-Probability Bias

The tendency to marvel over coincidences and ignore probabilities.

Why does probability-neglect happen? It's partly a psychological thing; imagining something happening and how we would feel about it is easy, natural, intuitive, but thinking about probability is difficult, mathematical, unfamiliar. And it's also partly a media problem; we have much more news reporting now than we did, say, thirty years ago. There were no 24-hour news channels then, only a small number of TV channels, and no internet. The news machine is voracious, and so when there is an accident or a disaster or any sort of human tragedy, it is reported and analysed endlessly. This makes us think that events which are actually very rare happen frequently, and perversely, events which are relatively common are under-reported precisely because they're not news.

Quiz: What's the safest way to travel? How much safer do you think it is to travel by car than to walk? A bit? A lot? Is a train safer than a plane? The best numbers I could find were fatalities per billion passenger kilometres for 1999:

Mode of Travel     Deaths Per Billion Passenger Km
Air                0.02
Boat               0.3
Rail               0.9
Car                2.8
Bicycle            41
Pedestrian         49
Motorcycle         112
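The table rewards a little arithmetic: dividing the rates gives relative risks that most people's intuitions get badly wrong. A quick sketch using the 1999 figures above:

```python
# Relative risk per passenger-km, from the 1999 fatality table above.
deaths_per_billion_km = {
    "air": 0.02, "boat": 0.3, "rail": 0.9, "car": 2.8,
    "bicycle": 41, "pedestrian": 49, "motorcycle": 112,
}

def relative_risk(mode_a: str, mode_b: str) -> float:
    """How many times deadlier mode_a is than mode_b, per km travelled."""
    return deaths_per_billion_km[mode_a] / deaths_per_billion_km[mode_b]

print(round(relative_risk("pedestrian", "car"), 1))  # 17.5 -- walking vs. driving
print(round(relative_risk("car", "air")))            # 140 -- driving vs. flying
```

Per kilometre travelled, walking is about 17 times deadlier than driving, and driving is roughly 140 times deadlier than flying, yet plane crashes dominate the headlines.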

The Information Overload Bias

The tendency to pay too much attention to information, even when it's barely relevant.

People often think, "The more information I acquire to make a decision, the better." But often, seeking extra information is a flagrant waste of time and money. Curiosity and confusion lead us into information binges. And, with Google continually beckoning and rewarding us, the info-gathering habit is harder to break than ever.

When you confront the brain with too much information, the quality of the decision tends to decline.

The Anchoring Effect

The tendency to "anchor" (rely too heavily) on one piece of information when making a decision. 

During normal decision-making, we often anchor, or overly rely, on specific information at the expense of other important information. We narrow our sights. Usually, once the anchor is set, there is a bias toward that information. Take, for example, my mom shopping for a boat. She focused excessively on the pretty blue curtains of an older Chris-Craft and didn't think about how well the engine worked. She paid $30,000 for the vessel, which sat in a slip for a year while a mechanic fiddled with the engine. A year later, she dumped the boat (because it needed a new engine) and muttered something like, "Boats are holes in the water where you throw money...and the happiest days of a boat owner's life are the day they buy the boat and the day they sell it." This bias is also known as "The Focusing Effect."

The Impact Bias

The tendency for people to overestimate the duration or intensity of their future feelings.

This bias says it's not going to be as bad—or as fab!—as you might imagine, so relax. In other words, people seem to think that if disaster strikes, it will take longer to recover emotionally than it actually does. Conversely, if a happy event occurs, people overestimate how long they will feel elated, thrilled and titillated. The mind moves on. Or does it? What about obsessing over bad things?


The Look-Alikes Bias

The tendency for people to cozy up to people who look like themselves and pick on those who don't.

We migrate toward people who remind us of ourselves. We perch with look-alikes because we feel comfortable and validated when other folks share our physical appearance—nose shape, coloring, height, taste in clothes—as well as our opinions. Have you ever noticed that most couples look like brother and sister? Few people have made this observation, yet it's so in-our-faces.

"From the little we do know, we can say that good people (we, us) are capable of incredible harm to others and even themselves. The daily moral decisions we make are not based on the principles of justice that we think they are, but are often a result of the familiarity and similarity of the other to oneself. These two simple types of bias happen because the mind and its workings remain invisible to us, and until we unmask its meanderings, the disparity between what we do and what we think we do will remain murky," says Mahzarin R. Banaji, professor of social ethics, Department of Psychology, Harvard University.

The Denial Bias

The tendency to discount or disbelieve an important and uncomfortable fact.

When faced with a fact that's uncomfortable to accept, this bias says, "Reject it! It's not true, despite overwhelming evidence." The subject may deny the reality of the unpleasant fact (simple denial), admit the fact but deny its seriousness (minimization), or admit both the fact and its seriousness but deny responsibility (transference). Denial is a generic defense mechanism.

Denial of Fact
Avoiding a fact by lying. Lying can take the form of an outright falsehood (commission), leaving out certain details to tailor a story (omission), or by falsely agreeing to something (assent). Someone who is in denial of fact is typically using lies to avoid facts they think may be painful to themselves or others.

Denial of Responsibility
Avoiding personal responsibility by blaming, minimizing or justifying. Blaming is a direct statement shifting culpability. Minimizing is an attempt to make the effects of an action seem less harmful than they are. Justifying presents reasons why someone is right. Denial of responsibility is a ploy to avoid potential harm or pain by shifting attention away from oneself.

Denial of Impact
Avoiding thinking about or understanding the harms one's behavior has caused to themselves or others. By doing this, the person is able to avoid feeling a sense of guilt and it can prevent that person from developing remorse or empathy for others. Denial of impact reduces or eliminates a sense of pain or harm from poor decisions.

Denial of Awareness
Avoiding pain and harm by claiming to have been in a different state of awareness (such as alcohol or drug intoxication, or occasionally a mental-health episode). This type of denial often overlaps with denial of responsibility.

Denial of Cycle
Avoiding looking at the decisions leading up to an event, or at one's pattern of decision-making and how harmful behavior gets repeated. What this type of denial avoids is the effort of shifting focus from a singular event to the chain of events preceding it. Many who use this type of denial say things such as, "It just happened."

Denial of Denial
Denial of denial involves thoughts, actions and behaviors which bolster confidence that nothing needs to be changed in one's personal behavior. This form of denial typically overlaps with all of the other forms of denial, but involves more self-delusion.

BrainTip:  When you're in denial, ask yourself, "Is it true that denying the facts is less painful in the long run? How does denial make me feel? And how would I feel if I faced, and heartily embraced, the truth?"

The Déformation Professionnelle Bias

The tendency to look at things from the point of view of your profession and forget a broader perspective.

A French phrase, Déformation Professionnelle, is a pun on the expression "formation professionnelle," meaning "professional training." The implication is that all (or most) professional training results to some extent in a distortion of the way the professional views the world. "When the only tool you have is a hammer, every problem looks like a nail," is an adage describing this phenomenon.

The Endowment Effect

The tendency to demand much more to give up an object than you would be willing to pay to acquire it.

In one experiment, people demanded a higher price for a coffee mug that had been given to them, but put a lower price on one they did not yet own. People value a good or service more once their property right to it has been established. In other words, people place a higher value on objects they own than objects that they don't. The endowment effect was described as inconsistent with standard economic theory which states that a person's willingness to pay for a good should be equal to their willingness to accept compensation to be deprived of the good.

This is from Jonah Lehrer 's blog, The Frontal Cortex:

JUNE 22, 2009
I went jean shopping this weekend. Actually, I went to the mall to return a t-shirt but ended buying a pair of expensive denim pants. What happened? I made the mistake of entering the fitting room. And then the endowment effect hijacked my brain. Let me explain.
The endowment effect is a well-studied by-product of loss aversion, which is the fact that losing something hurts a disproportionate amount. (In other words, a loss hurts more than a gain feels good.) First diagnosed by Richard Thaler and Daniel Kahneman, the endowment effect stipulates that once people own something - they have an established or imagined "property right" to the object - that something dramatically increases in subjective value. Wikipedia has an excellent summary of an experiment documenting the endowment effect by Dan Ariely and Ziv Carmon:
Duke University has a very small basketball stadium and the number of available tickets is much smaller than the number of people who want them, so the university has developed a complicated selection process for these tickets that is now a tradition. Roughly one week before a game, fans begin pitching tents in the grass in front of the stadium. At random intervals a university official sounds an air-horn which requires that the fans check in with the basketball authority. Anyone who doesn't check in within five minutes is cut from the waiting list. At certain more important games, even those who remain on the list until the bitter end aren't guaranteed a ticket, only an entry in a raffle in which they may or may not receive a ticket. After a final four game, Carmon and Ariely called all the students on the list who had been in the raffle. Posing as ticket scalpers, they probed those who had not won a ticket for the highest amount they would pay to buy one and received an average answer of $170. When they probed the students who had won a ticket for the lowest amount they would sell, they received an average of about $2,400. This showed that students who had won the tickets placed a value on the same tickets roughly fourteen times as high as those who had not won the tickets.
What does this have to do with fitting rooms and jeans? Once I tried on the pants, I became an implicit owner of them. I stared at myself in the mirror and admired the fit, the wash, etc. I thought about how good they would look with my shoes. I contemplated wearing them to various upcoming events and all the strangers who would look at my pants and think "Those are nice pants!" In other words, I spent a few minutes imagining my life with these new jeans and, once that happened, the pants suddenly became much more valuable. I mentally endowed myself with the object and didn't want to lose something that I didn't even own. As a result, the ridiculous price tag ($170 for Levis!) no longer seemed so ridiculous. The lesson? Don't try something on that you don't want to buy.
Update: Via a reader (thanks Alon!) comes a study demonstrating that merely touching an item can trigger the endowment effect.
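The Duke numbers make the gap easy to quantify: divide the owners' willingness to accept by the non-owners' willingness to pay. A one-liner's worth of arithmetic, using the figures from the Carmon and Ariely study described above:

```python
# WTA/WTP gap from the Duke basketball-ticket study.
wtp = 170    # avg. price non-winners would pay for a ticket ($)
wta = 2400   # avg. price winners would demand to sell theirs ($)

ratio = wta / wtp
print(round(ratio, 1))  # 14.1 -- owners value the same ticket ~14x higher
```

Standard economic theory predicts the ratio should be close to 1; a fourteenfold gap is the endowment effect in its purest form.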

The Illusory-Correlation Bias

The tendency to inaccurately link an action and an effect.

It's so easy to overestimate a link between two variables—a cause with an effect—even when the link is slight to non-existent. For example, we might link being slim with happiness, assuming heavy women are less happy. Haven't you had the thought, "If I lose my pot belly, I'll be happier"? But is it true? Isn't happiness dependent on myriad variables, including our brain state and moods? And what about astrology, linking distant planets to earthly personalities? Really far-fetched, yet so many fall for it.

Redelmeier and Tversky (1996) assessed 18 arthritis patients over 15 months, while also taking comprehensive meteorological data. Virtually all of the patients were certain that their condition was correlated with the weather. In fact, the actual correlation was close to zero.

The opposite of an "illusory correlation" is an "invisible correlation," where an actual correlation is not seen. For instance, the link between smoking and cancer was not noticed for years.

IT COULD HAPPEN TO YOU
After a delightful birthday lunch at Poggio’s in Sausalito, my friend Dawn (a fitness trainer who has hiked the 2,000 mile Pacific Coast Trail twice) and I went for a casual stroll in my hilly neighborhood.
   Back at the house, she gasped, “Your fingers are blue! You might not be getting enough oxygen to your extremities. Take off your shoes, let’s look at your toes.”
    They weren’t blue, which baffled us even more. We talked about calling a doctor, but first I needed to go to the bathroom. While sitting on the john, I glanced down at my hands, which were resting on my new unwashed blue jeans.
     “The jeans did it!” I screamed.
     “No way.”
    “Let’s see if it comes off,” I said, lathering my hands with soap. Then I ran them across my white kitchen counter and, voila, it turned blue. We howled, realizing we had been snookered by an illusory correlation.
   According to David Ludden, associate professor of psychology at Lindsey Wilson College in Kentucky, there is a strong urge in humans to seek causes for events in their lives, and we fall for the ones we manufacture even if they’re wrong.
   “No other species makes causal inferences like we do,” says Ludden. “This ability clearly gave humans a strong evolutionary advantage, allowing them to understand and—and hence control—their environments to a greater degree than any other animal. Still, our ability to ascertain causality is not that reliable. For one thing, we are prone to causal illusions—we infer causal relationships where none exist. Furthermore, when we are unable to explain why events occur, we feel distressed, so we tend to make up explanations with little or no evidence to support them. Nevertheless, the advantages of this ability to construct beliefs about causal relationships must have outweighed the negative side effects for early humans.”
   So now I know why Dawn and I quickly fabricated a link between oxygen-deprivation and my fingers. What if we’d gone to the hospital? Would the doctors have figured out the true cause? Or would they have made up an illusory correlation, too?
   My friend Lynn calls this cognitive bias “The Jumping-to-Conclusions Bias,” because conclusions appear quickly and with a feeling of certainty. To learn more about certainty, peruse neurologist Robert A. Burton’s keen tome, On Being Certain: Believing You're Right When You’re Not.