How much different would our society look if we understood our cognitive limitations as well as we understand our physical limitations? That's the question behavioural economist Dan Ariely poses in his recently posted talk at the TED conference.
Through a series of highly entertaining illustrations, Ariely demonstrates that human decision-making is riddled with cognitive illusions (analogous to optical illusions) that seriously impair our ability to make rational decisions.
Humans can't think clearly: hilarity ensues.
Ariely makes two crucial observations about our cognitive limitations:
First, we can discover our illusions through careful study, but we cannot stop ourselves from experiencing them, even when we understand that they are illusions.
He showed the well-known picture of two tables of apparently different lengths:
These two tables are the same length. No, seriously. (Image Credit: Cool Optical Illusions)
He superimposed two red measuring lines to demonstrate that both tables are the same length. On removing the lines he said:
The interesting thing about this is, when I take these lines away, it's as if you haven't learned anything in the last minute. You can't look at this and say, "Okay, now I see reality as it is." It's impossible to overcome the sense that this [gesturing to the table on the left] is indeed longer.
So we can take our illusions into account, but we cannot free ourselves from experiencing those illusions.
His second observation is that these mistakes are consistently repeatable and predictable. We cannot escape the perception of the illusion itself, but with careful study, we can recognize when the illusion is occurring and adjust our thinking to take it into account.
Ariely used optical illusions to illustrate the principle, but moved from optical to cognitive illusions to demonstrate that they operate in an analogous manner. Since "vision is one of the best things we do," it seems reasonable to conclude that we will also make predictable mistakes in cognitive processes for which we do not have a specially adapted brain structure.
In fact, Ariely argues that not only are our cognitive illusions bigger and more serious than our optical illusions, but they are also harder to demonstrate, since we cannot simply hold a ruler up to the errors in how we process data and form conclusions.
Using a comparison of organ donation rates across European countries as a demonstration, Ariely shows that the way questions are framed - especially questions about intellectually or morally complex issues - has a huge influence on which choices people make.
In the case of organ donation, European countries fall into two groups: a few with extremely low donor rates and others with nearly 100 percent participation. The difference is not social or cultural, and it is not related to incentives. Instead, the difference is that in some countries, citizens must check a box on a form to opt in to organ donation, while in the other countries, citizens must check a box on a form to opt out of it.
The issue is so complex, Ariely argues, that people opt to go with the default choice - whatever that default happens to be - rather than figure out what to think about it.
It gets more interesting. Given a choice between an all-expenses paid trip to Paris or Rome, respondents are roughly evenly split. However, by adding a third choice - a trip to Rome where all expenses are paid except for a morning cup of coffee - a majority of respondents choose the trip to Rome including coffee.
Adding a slightly inferior version of the Rome trip makes the full Rome trip look even better than the Paris trip by comparison.
He demonstrates the same principle at work in a few other cases, concluding with some helpful advice on how to increase your chance of picking up a date the next time you go bar-hopping.
We are generally well aware of our physical limitations (notwithstanding the Lake Wobegon Effect). For example, automobiles are designed to compensate for our slow reaction time, our tendency toward distraction, and above all the fragility of our bodies.
However, we generally ignore our cognitive limitations when we design physical and social systems. To the extent that awareness of our persistent cognitive deficits exists at all, it is used to exploit and manipulate our limitations, not to ameliorate them.
It feels illiberal, or at least condescending, even to bring the subject up - a fact that advertisers exploit shamelessly in their defence of aggressive, manipulative marketing tactics. You don't think people are capable of making good decisions? I guess we need the nanny state to do our thinking for us now!
It's a damned tricky business, because a major premise of authoritarianism in all its forms is, in fact, the belief that people are infantile and need to be governed.
The goal of a clearer understanding of our cognitive limitations should be empowerment to make better decisions, not dependency on some leader (who, after all, is subject to exactly the same cognitive limitations).
Luckily, we already have a fairly reliable method of identifying and overcoming our cognitive limitations in our reasoning and decision-making processes: the scientific method. One of my favourite quotes is by Stuart Chase, quoted in S.I. Hayakawa's classic Language in Thought and Action:
Common sense is that which tells us the world is flat.
As a society, it's high time we start to cultivate a lifelong habit of questioning our assumptions, revisiting received wisdom, subjecting our hypotheses to vigorous empirical testing, and openly debating public policy not based on conformance to this or that narrow ideology or dogma, but to demonstrable and repeatable evidence.
One thing we need to cultivate as part of this is a willingness to be proven wrong. Instead of wasting our time trying to 'save face' by stubbornly defending ridiculous, indefensible positions, it would be great if we could all step back and learn from intellectually honest debate. Another favourite quote is from John Maynard Keynes, recently back in fashion among economists:
When the facts change, I change my mind. What do you do, sir?
Can you imagine a society of people willing to change their minds when the facts change? We could stop wasting our time reacting fearfully to ideas. We could free ourselves from eternal judgment against our own past decisions. We could arrange the framework in which we make decisions to ensure that our choices better reflect what we really want.