Derren Brown, whom some may know from his Netflix series or live shows, is also the author of excellent books on the human mind and its limitations: what Nobel laureate in economics Herbert Simon called bounded rationality, the limits that make us “suckers, idiots” when it comes to evaluating things and making decisions.
An example: 1) a fatal disease affects 1 in 10,000 people; 2) there is a test for it that is 99% accurate; 3) I take the test and the result is positive. Question: how likely am I to have the disease? Answering 99% is the immediate tendency, and it is wrong.
Imagine that a million people take the test. Since 1 in 10,000 has the disease, 100 of them have it. And since the test is 99% accurate, the results for these 100 people are: 99 positive results, which are true (they have the disease), and 1 negative result, which is false (that person also has the disease).
Among the same million test-takers there are 999,900 who do not have the disease (it affects only 1 in 10,000). Again, the test is 99% accurate, so for these 999,900 people there are 989,901 negative (and true) results and 9,999 positive (and false) results.
That is, two types of people receive positive results: the 99 who actually have the disease and the 9,999 who do not, but for whom the test (which is only 99% accurate) was wrong.
So, on receiving the bad news of a positive test, I belong either to the first group of 99 or to the second group of 9,999.
Because 9,999 is about 100 times 99, it is roughly 100 times more likely that I do not have the disease, even with a positive test result.
That is, the answer to the question of how likely I am to have the disease is not 99% but less than 1%: 99 true positives out of 10,098 positives in total, roughly 0.98%.
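The arithmetic above can be sketched in a few lines of Python. The variable names are my own; the numbers are the article's (1-in-10,000 prevalence, a test that is 99% accurate both ways):

```python
population = 1_000_000
sick = population // 10_000          # 100 people have the disease
healthy = population - sick          # 999,900 do not

accuracy = 0.99
true_positives = sick * accuracy             # 99 sick people test positive
false_positives = healthy * (1 - accuracy)   # 9,999 healthy people test positive

# Of everyone who tests positive, how many are actually sick?
p_sick_given_positive = true_positives / (true_positives + false_positives)
print(f"{p_sick_given_positive:.2%}")  # roughly 0.98%
```

This is just Bayes' theorem done with a concrete population instead of a formula, which is why counting a million imaginary test-takers gives the same answer.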
Conclusion: the next time you take a medical test and receive bad news, remember to ask how frequent the disease is in the population. If it is rare, you are more likely to be healthy with a mistaken test than to be ill with a correct one, provided, of course, that the test is less than 100% accurate and the disease is infrequent.
After Herbert Simon, the Nobel laureates Kahneman and Thaler categorized the limitations of the human mind into some thirty types of systematic error, known as heuristics. The tendency to rationalize after the event, seeking information that confirms our decisions, is one example. This is the field of behavioral economics.
Imagine four cards, each with a letter on one side and a number on the other, showing on the table: A / D / 3 / 7. Question: which cards, and the fewest of them, must I flip to test (confirm or refute) the rule that if one side shows A, the other side always shows 3?
The tendency is to answer: flip cards A and 3. However, this proves nothing.
Even if I flip card A and find a 3, the card showing 7 might still hide an A on its other side, which would refute the rule.
Moreover, flipping 3 in search of an A is irrelevant: the rule under test says that the other side of A is always 3; it does not forbid 3 from appearing with other letters as well.
Hence the right answer, contrary to our tendency to focus on A and 3 in search of confirmation, is to flip cards A and 7.
Let us check. The D is irrelevant. So is the 3, because the rule says that A always comes with 3, not that 3 comes only with A. However, if flipping A reveals a number other than 3, or flipping 7 reveals an A, the rule is refuted. Hence I must flip cards A and 7.
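The check above can also be done by brute force. A minimal sketch (names and structure are my own) that asks, for each visible face, whether anything on the hidden side could refute the rule:

```python
letters = "AD"
digits = "37"

def violates(letter, digit):
    """A card breaks the rule only if it has A on one side and a non-3 digit on the other."""
    return letter == "A" and digit != "3"

# For each visible face, could the hidden side falsify the rule?
visible_faces = {
    "A": [violates("A", d) for d in digits],   # hidden side is some digit
    "D": [violates("D", d) for d in digits],
    "3": [violates(l, "3") for l in letters],  # hidden side is some letter
    "7": [violates(l, "7") for l in letters],
}

informative = [face for face, outcomes in visible_faces.items() if any(outcomes)]
print(informative)  # ['A', '7'] — only these cards can refute the rule
```

Only A and 7 can possibly produce a violation, which is exactly why flipping D or 3 tells us nothing.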
Marcus Marcellus, a five-time Roman consul, always traveled in a litter with the curtains drawn, so as not to see any omen that his superstition might count against a decision already made.
Changing people's consumption patterns in the market is difficult because even when they risk trying something new, they often return to the original brand to avoid the feeling of having chosen badly in the past: I did badly. It was stupid. I feel insecure. My survival instinct is affected.
Therefore, with the mind playing such tricks on us, and doing so systematically, always in the same ways (the so-called heuristics), ignoring them is disastrous.
Hence, the problem is not what we ignore (conscious ignorance). It is more what we ignore that we ignore (unconscious ignorance). And above all it is what we think is so, and is not: that we are rational, when we are really just ... homo sapiens.
Article published by our speaker Professor J. Sá in his editorial column (September 2019), based on one of his conferences on behavioral economics.