
Discovering and Overcoming Cognitive Illusions

By Ryan McGreal
Published June 01, 2009

How different would our society look if we understood our cognitive limitations as well as we understand our physical ones? That's the question behavioural economist Dan Ariely poses in his recently posted TED talk.

Through a series of highly entertaining illustrations, Ariely demonstrates that human decision-making is wracked by cognitive illusions (in an analogy to optical illusions) that seriously impair our ability to make rational decisions.

Humans can't think clearly: hilarity ensues.

Illusions Inescapable, Repeatable

Ariely makes two crucial observations about our cognitive limitations:

First, we can discover our illusions through careful study, but we cannot stop ourselves from experiencing them, even when we understand that they are illusions.

He showed the well-known picture of two tables of apparently different lengths:

These two images are the same length. No, seriously. (Image Credit: Cool Optical Illusions)

He superimposed two red measuring lines to demonstrate that both tables are the same length. On removing the lines he said:

The interesting thing about this is, when I take these lines away, it's as if you haven't learned anything in the last minute. You can't look at this and say, "Okay, now I see reality as it is." It's impossible to overcome the sense that this [gesturing to the table on the left] is indeed longer.

So we can take our illusions into account, but we cannot free ourselves from experiencing those illusions.

His second observation is that these mistakes are consistently repeatable and predictable. We cannot escape perceiving the illusion itself, but with careful study we can discover that the illusion is happening and adjust our thinking to take it into account.

Ariely used optical illusions to illustrate the principle, then moved from optical to cognitive illusions to show that they operate in an analogous manner. Since "vision is one of the best things we do" - a large part of the brain is dedicated to it - and we still make predictable errors there, it seems reasonable to conclude that we will also make predictable mistakes in cognitive processes for which we do not have a specially adapted brain structure.

In fact, Ariely argues that our cognitive illusions are not only bigger and more serious than our optical illusions, but also harder to demonstrate, since we cannot simply hold a ruler up to the way we process data and form conclusions.

Manipulating Choices

Using a comparison of organ donation rates across European countries as a demonstration, Ariely shows that the way questions are framed - especially questions about intellectually or morally complex issues - has a huge influence on the choices people make.

In the case of organ donation, European countries divide into a few with extremely low donor rates and others with nearly 100 percent participation. The difference is not social or cultural, and it is not related to incentives. Instead, in some countries citizens must check a box on a form to authorize organ donation, while in the others citizens must check a box on a form to opt out of organ donation.

The issue is so complex, Ariely argues, that people opt to go with the default choice - whatever that default happens to be - rather than figure out what to think about it.

It gets more interesting. Given a choice between an all-expenses-paid trip to Paris and an all-expenses-paid trip to Rome, respondents are roughly evenly split. However, adding a third choice - a trip to Rome where all expenses are paid except for a morning cup of coffee - leads a majority of respondents to choose the trip to Rome that includes coffee.

Adding a slightly inferior version of the Rome trip makes the full Rome trip look even better than the Paris trip by comparison.

He demonstrates the same principle at work in a few other cases, concluding with some helpful advice on how to increase your chance of picking up a date the next time you go bar-hopping.

Cognitive Safety Features

We are generally well aware of our physical limitations (notwithstanding the Lake Wobegon Effect). For example, automobiles are designed to compensate for our slow reaction time, our tendency to distraction, and above all for the fragility of our bodies.

However, we generally ignore our cognitive limitations when we design physical and social systems. To the extent that awareness of our persistent cognitive deficits exists at all, it is used to exploit and manipulate our limitations, not to ameliorate them.

It actually feels illiberal, or at least condescending, even to bring the subject up - a fact that advertisers exploit shamelessly when defending aggressive, manipulative marketing tactics: "You don't think people are capable of making good decisions? I guess we need the nanny state to do our thinking for us now!"

It's a damned tricky business, because a major premise of authoritarianism in all its forms is, in fact, the belief that people are infantile and need to be governed.

The goal of a clearer understanding of our cognitive limitations should be empowerment to make better decisions, not dependency on some leader (who, after all, is subject to exactly the same cognitive limitations).

Overcoming Cognitive Limitations

Luckily, we already have a fairly reliable method of identifying and overcoming our cognitive limitations in our reasoning and decision-making processes: the scientific method. One of my favourite quotes is by Stuart Chase, quoted in S.I. Hayakawa's classic Language in Thought and Action:

Common sense is that which tells us the world is flat.

As a society, it's high time we cultivated a lifelong habit of questioning our assumptions, revisiting received wisdom, subjecting our hypotheses to rigorous empirical testing, and openly debating public policy based not on conformance to this or that narrow ideology or dogma, but on demonstrable and repeatable evidence.

One thing we need to cultivate as part of this is a willingness to be proven wrong. Instead of wasting our time trying to 'save face' by stubbornly defending ridiculous, indefensible positions, it would be great if we could all step back and learn from intellectually honest debate. Another favourite quote is from John Maynard Keynes, recently back in fashion among economists:

When the facts change, I change my mind. What do you do, sir?

Can you imagine a society of people willing to change their minds when the facts change? We could stop wasting our time reacting fearfully to ideas. We could free ourselves from eternal judgment against our own past decisions. We could arrange the framework in which we make decisions to ensure that our choices better reflect what we really want.

Ryan McGreal, the editor of Raise the Hammer, lives in Hamilton with his family and works as a programmer, writer and consultant. Ryan volunteers with Hamilton Light Rail, a citizen group dedicated to bringing light rail transit to Hamilton. Ryan wrote a city affairs column in Hamilton Magazine, and several of his articles have been published in the Hamilton Spectator. His articles have also been published in The Walrus, HuffPost and Behind the Numbers. He maintains a personal website, has been known to share passing thoughts on Twitter and Facebook, and posts the occasional cat photo on Instagram.

4 Comments




By JonC (registered) | Posted June 02, 2009 at 14:08:44

I read a related piece on complex decision making: http://www.juliansanchez.com/2009/04/06/...

The article discusses the argument-from-authority fallacy and then gets into arguments between experts. "Sometimes the arguments are such that the specialists can develop and summarize them to the point that an intelligent layman can evaluate them. But often—and I feel pretty sure here—that's just not the case." This creates an opportunity for the disingenuous to state opinions or falsehoods while ensuring only that the statement sounds plausible.

He then goes on to coin the term "one-way hash argument". The term is explained in better detail at the link, but in short, it is a quick argument that requires a great amount of explaining to counter. In essence, a short, intuitive and plausible (but wrong) argument tends to carry an undue amount of weight because it is easy to understand (as long as the underlying concepts are above the layperson's understanding).

The author then ties it into the Dunning-Kruger effect, which leads to a situation where "people with less competence will rate their ability more highly than people with relatively more competence". http://en.wikipedia.org/wiki/Dunning-Kru...




By highwater (registered) | Posted June 02, 2009 at 21:31:02

"Meanwhile us NO BS people are busy running the economy"

And a fine job you've been doing too. Maybe if you'd spent a little time wondering if you might be wrong about something, we wouldn't be in the mess we're in today. Thanks for nothing.



By zookeeper (registered) | Posted June 02, 2009 at 23:10:31

Don't feed the troll. Just downvote and move along.

