Discovering and Overcoming Cognitive Illusions

By Ryan McGreal
Published June 01, 2009

How different would our society look if we understood our cognitive limitations as well as we understand our physical limitations? That's the question behavioural economist Dan Ariely poses in his recently posted TED talk.

Through a series of highly entertaining illustrations, Ariely demonstrates that human decision-making is wracked by cognitive illusions - analogous to optical illusions - that seriously impair our ability to make rational decisions.

Humans can't think clearly: hilarity ensues.

Illusions Inescapable, Repeatable

Ariely makes two crucial observations about our cognitive limitations:

First, we can discover our illusions through careful study, but we cannot stop ourselves from experiencing them, even when we understand that they are illusions.

He showed the well-known picture of two tables of apparently different lengths:

These two tables are the same length. No, seriously. (Image Credit: Cool Optical Illusions)

He superimposed two red measuring lines to demonstrate that both tables are the same length. On removing the lines he said:

The interesting thing about this is, when I take these lines away, it's as if you haven't learned anything in the last minute. You can't look at this and say, "Okay, now I see reality as it is." It's impossible to overcome the sense that this [gesturing to the table on the left] is indeed longer.

So we can take our illusions into account, but we cannot free ourselves from experiencing those illusions.

His second observation is that these mistakes are consistently repeatable and predictable. We cannot escape the perception of the illusion itself, but because the errors recur so reliably, careful study can reveal that an illusion is happening, and we can adjust our thinking to take it into account.

Ariely used optical illusions to illustrate the principle, then moved from optical to cognitive illusions to demonstrate that they operate in an analogous manner. Since "vision is one of the best things we do," if we make predictable mistakes even there, it seems reasonable to conclude that we will also make predictable mistakes in cognitive processes for which we do not have a specially adapted brain structure.

In fact, Ariely argues that our cognitive illusions are not only bigger and more serious than our optical illusions, but also harder to demonstrate, since we cannot simply hold a ruler up to the way we process data and form conclusions.

Manipulating Choices

Using a comparison of organ donation rates across European countries as a demonstration, Ariely shows that the way questions are framed - especially questions about intellectually or morally complex issues - has a huge influence on the choices people make.

In the case of organ donation, European countries divide into two groups: a few with extremely low donor rates and others with donor rates approaching 100 percent. The difference is not social or cultural, and it is not related to incentives. Instead, the difference is that in some countries, citizens must check a box on a form to opt in to organ donation, while in the others, citizens must check a box to opt out.

The issue is so complex, Ariely argues, that people opt to go with the default choice - whatever that default happens to be - rather than figure out what to think about it.
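
To make the arithmetic of defaults concrete, here is a minimal sketch (the donor_rate function and the 85 percent figure are illustrative assumptions of mine, not Ariely's data): if most people simply leave the form alone, the default determines the outcome almost by itself.

```python
# Toy model of default effects on organ donor registration.
# Assumption (invented for illustration): 85% of people never
# touch the checkbox, whichever way the default points.

def donor_rate(default_is_donor: bool, keep_default: float = 0.85) -> float:
    """Return the fraction of the population registered as donors."""
    if default_is_donor:
        # Opt-out country: you are a donor unless you act.
        return keep_default
    # Opt-in country: you are a donor only if you act.
    return 1.0 - keep_default

print(f"Opt-in form:  {donor_rate(False):.0%} donors")  # ~15%
print(f"Opt-out form: {donor_rate(True):.0%} donors")   # ~85%
```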

It gets more interesting. Given a choice between an all-expenses-paid trip to Paris and one to Rome, respondents are roughly evenly split. However, when a third choice is added - a trip to Rome where all expenses are paid except for a morning cup of coffee - a majority of respondents choose the full Rome trip, coffee included.

Adding a slightly inferior version of the Rome trip - a decoy - makes the full Rome trip look better than the Paris trip by comparison.

He demonstrates the same principle at work in a few other cases, concluding with some helpful advice on how to increase your chance of picking up a date the next time you go bar-hopping.

Cognitive Safety Features

We are generally well aware of our physical limitations (notwithstanding the Lake Wobegon Effect). For example, automobiles are designed to compensate for our slow reaction time, our tendency to distraction, and above all for the fragility of our bodies.

However, we generally ignore our cognitive limitations when we design physical and social systems. To the extent that awareness of our persistent cognitive deficits exists at all, it is used to exploit and manipulate our limitations, not to ameliorate them.

It feels illiberal, or at least condescending, even to bring the subject up - a fact that advertisers exploit shamelessly in their defence of aggressive, manipulative marketing tactics: You don't think people are capable of making good decisions? I guess we need the nanny state to do our thinking for us now!

It's a damned tricky business, because a major premise of authoritarianism in all its forms is, in fact, the belief that people are infantile and need to be governed.

The goal of a clearer understanding of our cognitive limitations should be empowerment to make better decisions, not dependency on some leader (who, after all, is subject to exactly the same cognitive limitations).

Overcoming Cognitive Limitations

Luckily, we already have a fairly reliable method of identifying and overcoming our cognitive limitations in our reasoning and decision-making processes: the scientific method. One of my favourite quotes is by Stuart Chase, quoted in S.I. Hayakawa's classic Language in Thought and Action:

Common sense is that which tells us the world is flat.

As a society, it's high time we started cultivating a lifelong habit of questioning our assumptions, revisiting received wisdom, subjecting our hypotheses to rigorous empirical testing, and debating public policy openly, based not on conformance to this or that narrow ideology or dogma but on demonstrable, repeatable evidence.

One thing we need to cultivate as part of this is a willingness to be proven wrong. Instead of wasting our time trying to save face by stubbornly defending ridiculous, indefensible positions, it would be great if we could all step back and learn from intellectually honest debate. Another favourite quote is from John Maynard Keynes, recently back in fashion among economists:

When the facts change, I change my mind. What do you do, sir?

Can you imagine a society of people willing to change their minds when the facts change? We could stop wasting our time reacting fearfully to ideas. We could free ourselves from eternal judgment against our own past decisions. We could arrange the framework in which we make decisions to ensure that our choices better reflect what we really want.

Ryan McGreal, the editor of Raise the Hammer, lives in Hamilton with his family and works as a programmer, writer and consultant. Ryan volunteers with Hamilton Light Rail, a citizen group dedicated to bringing light rail transit to Hamilton. Ryan writes a city affairs column in Hamilton Magazine, and several of his articles have been published in the Hamilton Spectator. He also maintains a personal website and has been known to post passing thoughts on Twitter @RyanMcGreal. Recently, he took the plunge and finally joined Facebook.


Comments

By JonC (registered) | Posted June 02, 2009 at 14:08:44

I read a related piece on complex decision making: http://www.juliansanchez.com/2009/04/06/...

The article discusses the argument-from-authority fallacy and then gets into arguments between experts: "Sometimes the arguments are such that the specialists can develop and summarize them to the point that an intelligent layman can evaluate them. But often—and I feel pretty sure here—that's just not the case." This creates an opportunity for the disingenuous to state opinions or falsehoods while ensuring only that the statement sounds plausible.

He then goes on to coin the term "one-way hash argument". The term is explained in better detail at the link, but in short, it is a quick argument that takes a great deal of explaining to counter. In essence, a short, intuitive and plausible (but wrong) argument tends to carry an undue amount of weight because it is so easy to grasp, while the correct rebuttal rests on concepts above the layperson's understanding.

The author then ties it into the Dunning-Kruger effect, which leads to a situation where "people with less competence will rate their ability more highly than people with relatively more competence". http://en.wikipedia.org/wiki/Dunning-Kru...

By Ryan (registered) - website | Posted June 02, 2009 at 16:25:32

Great find, JonC.

The Dunning-Kruger effect should strike self-doubt into anyone willing to wade into complex issues.

It's as Yeats warned: "The best lack all conviction, while the worst / Are full of passionate intensity" - though like so many pithy observations Shakespeare was there first: "The fool doth think he is wise, but the wise man knows himself to be a fool."

Combine the Dunning-Kruger effect (and the related Lake Wobegon effect) with what some psychologists call "depressive realism" and this overconfidence starts to make sense as a necessary corollary of the mindset required for basic functioning in society.

Several studies suggest that depressed people actually have more realistic and accurate (albeit narrower and more constrained in scope) perceptions - of themselves, their circumstances, their importance to others and their control over events - than happy, psychologically 'well-adjusted' people (I'm over-generalizing, and the data are more nuanced than I'm presenting here, but you get the general idea).

That is, it seems to require an unwarranted over-assessment of one's own abilities just to get out of bed in the morning. Among the predictable side-effects of such widespread over-confidence we may count the Dunning-Kruger and Lake Wobegon effects.

Yet the alternative seems to be an incapacitating, and therefore self-fulfilling, depression.

If not for us all thinking we're better than we are, hardly anything would get done at all. :)

In this context, one of the most important hallmarks of a successful society may well be its ability to aggregate individual contributions so as to maximize the diffusion and acceptance of good ideas and their implementations, while containing the harm caused by people who refuse to acknowledge their own incompetence - and hence cannot ameliorate it through any kind of error-correction or self-improvement.

Getting back to the essay on climate change that you cited: our civilization will fail if it cannot find a way to maximize the diffusion and acceptance of the reality of this looming crisis while containing the damage done by deniers (both professional and amateur) and their tactics.

The one-way hash argument (a great metaphor!) acts as a form of intellectual vandalism: cheap and easy to commit, expensive and difficult to clean up. What we need is some kind of argumentative rainbow table.
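
To unpack the cryptographic metaphor for anyone unfamiliar with it, here is a minimal sketch (the claims in the table are invented, and a real rainbow table uses hash chains to save space - a plain dictionary captures the same precomputation idea): computing a hash is cheap, reversing one from scratch is expensive, but a precomputed table turns reversal into a cheap lookup.

```python
import hashlib

def one_way_hash(message: str) -> str:
    """Cheap to compute forward; expensive to reverse by brute force."""
    return hashlib.sha256(message.encode()).hexdigest()

# Committing the 'vandalism' is a one-liner:
digest = one_way_hash("tax cuts always pay for themselves")

# Rebutting it from scratch means searching a huge space of
# possible arguments - the long, careful counter-argument.

# Precompute responses to the claims you expect to see
# (these entries are invented for illustration):
known_claims = [
    "tax cuts always pay for themselves",
    "the climate has always changed",
    "it's just a theory",
]
lookup_table = {one_way_hash(claim): claim for claim in known_claims}

# Now 'reversal' is a cheap dictionary lookup:
print(lookup_table.get(digest, "no precomputed rebuttal"))
```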


Or, to switch metaphors for a moment, we need somehow to reconfigure our debating platforms to work in a manner more analogous to Wikipedia, which reverses the usual logic of vandalism - on Wikipedia, it's difficult to vandalize an entry but easy to clean it up. That's how a self-regulating community manages to avoid the Broken Windows dynamic.

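
And to unpack the Wikipedia point, here is a minimal sketch of the revert mechanic (a toy Page class of my own, not MediaWiki's actual implementation): because every edit is kept in the history, undoing vandalism is a single cheap step, while the vandal has to keep producing new edits.

```python
class Page:
    """Toy wiki page that keeps its full revision history."""

    def __init__(self, text: str):
        self.revisions = [text]  # oldest first

    def edit(self, text: str) -> None:
        # Vandalism costs the vandal a brand-new edit every time...
        self.revisions.append(text)

    def revert(self, steps: int = 1) -> None:
        # ...but cleanup just restores a known-good earlier revision.
        self.revisions.append(self.revisions[-1 - steps])

    @property
    def current(self) -> str:
        return self.revisions[-1]

page = Page("Broken windows theory links visible disorder to further crime.")
page.edit("nothing on this page is true lol")  # cheap to commit
page.revert()                                   # just as cheap to undo
print(page.current)  # original text restored
```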
By highwater (registered) | Posted June 02, 2009 at 21:31:02

"Meanwhile us NO BS people are busy running the economy"

And a fine job you've been doing too. Maybe if you'd spent a little time wondering if you might be wrong about something, we wouldn't be in the mess we're in today. Thanks for nothing.

By zookeeper (registered) | Posted June 02, 2009 at 23:10:31

Don't feed the troll. Just downvote and move along.

By Ryan (registered) - website | Posted June 03, 2009 at 11:16:04

I just came across an interesting study that bears on what I wrote in my earlier comment:


"Good moods enhance the literal size of the window through which we see the world. The upside of this is that we can see things from a more global, or integrative perspective. The downside is that this can lead to distraction on critical tasks that require narrow focus, such as operating dangerous machinery or airport screening of passenger baggage. Bad moods, on the other hand, may keep us more narrowly focused, preventing us from integrating information outside of our direct attentional focus."

By Ryan (registered) - website | Posted June 17, 2009 at 15:12:05

New study: "we prefer advice from a confident source, even to the point that we are willing to forgive a poor track record."

