Here’s a quote about Daniel Kahneman’s ideas, from an article I’ll link below.
“For a long time, the most widely-accepted view of how people make decisions under uncertainty was expected utility theory. This theory imagines that people are totally rational and, when faced with a decision, they weigh the benefits and risks of each possible outcome, considering how likely each outcome is to happen.”
I don’t like the use of the word “rational” in this context. I’ll explain that later.
Kahneman, together with Amos Tversky, developed “prospect theory,” which says we don’t evaluate gains and losses by strict utility – the way an accountant might – and that our perception of the potential gains or losses affects our decisions.
Part of this is that we’re more afraid of loss than we are attracted by gain.
Consider a homeowner who is wondering about buying insurance for a house that’s worth $200,000. The insurance policy is priced at $2,000 per year, and it covers damages from a natural disaster that has a 1% chance of occurring each year.
According to traditional economic theory, the expected loss from the disaster is 1% of $200,000, or $2,000, which is exactly the cost of the insurance. A so-called “rational person” might be indifferent to buying insurance in this case, since he’d be paying $2,000 to eliminate an expected loss of $2,000.
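Here’s that arithmetic as a quick Python sketch, using the figures from the example:

```python
# Expected-loss arithmetic for the insurance example above.
house_value = 200_000   # value of the house, in dollars
disaster_prob = 0.01    # 1% chance of total loss each year
premium = 2_000         # annual cost of the policy

expected_loss = disaster_prob * house_value
print(f"Expected annual loss: ${expected_loss:,.0f}")  # $2,000
print(f"Annual premium:       ${premium:,.0f}")        # $2,000
# The premium exactly equals the expected loss ("actuarially fair"),
# so a pure expected-value calculation is indifferent to buying it.
```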
The problem with this analysis is that the typical homeowner is not an accountant at a huge corporation, where losing a house is just a number on a spreadsheet. Losing the house would be devastating to the homeowner, and there’s real value in knowing he’ll still have a house even if disaster strikes.
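Standard economics can actually accommodate this through risk aversion: with a concave utility function (each extra dollar matters less than the last), even an actuarially fair policy is worth buying. Here’s a minimal sketch assuming log utility and an invented $50,000 of cash on top of the house – the cash figure is mine, not from the example:

```python
import math

# Risk-aversion sketch: log utility of total wealth.
# The $50,000 cash figure is an assumption for illustration;
# house value, premium, and disaster odds come from the example above.
cash = 50_000
house_value = 200_000
premium = 2_000
disaster_prob = 0.01

# Without insurance: 99% keep everything, 1% lose the house.
eu_uninsured = ((1 - disaster_prob) * math.log(cash + house_value)
                + disaster_prob * math.log(cash))

# With insurance: pay the premium, keep the house either way.
eu_insured = math.log(cash + house_value - premium)

print(f"Expected utility, uninsured: {eu_uninsured:.4f}")  # ~12.4131
print(f"Utility, insured:            {eu_insured:.4f}")    # ~12.4212
# The insured outcome wins even though the premium is actuarially fair.
```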
Prospect theory also relates to the idea of “throwing good money after bad,” better known as the sunk cost fallacy: making today’s decision based on money you spent yesterday.
One of the central tenets of prospect theory is that losses loom larger than gains, which can push people to keep investing in a losing proposition just to avoid realizing a loss. The sunk cost fallacy is the tendency to continue a course of action even when abandoning it would be more beneficial, because the time, energy, or money already invested would feel wasted if you quit.
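To make “losses loom larger” concrete, here’s a sketch of the Kahneman–Tversky value function. The parameter values (alpha ≈ 0.88, lambda ≈ 2.25) are the estimates Tversky and Kahneman published in 1992; treat them as illustrative rather than definitive:

```python
def prospect_value(x, alpha=0.88, lam=2.25):
    """Kahneman-Tversky value function: concave for gains,
    convex for losses, and steeper for losses by the
    loss-aversion factor lam."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha

# A $100 loss hurts roughly twice as much as a $100 gain feels good:
print(prospect_value(100))   # ~57.5
print(prospect_value(-100))  # ~-129.5
```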
Imagine you have an old clunker of a car that needs an expensive repair. You pay to have the repair done. Then a week later the car needs another expensive repair. To avoid the sunk cost fallacy, you can’t allow the money you’ve already spent to influence today’s decision about the new repair.
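In code, avoiding the fallacy simply means the money already spent never enters the comparison. A toy sketch, with all dollar figures invented for illustration:

```python
# Toy sunk-cost decision: repair the clunker again, or replace it?
# All figures are invented for illustration.
previous_repair = 1_500    # spent last week -- a sunk cost
new_repair_cost = 2_000    # today's estimate
replacement_cost = 3_500   # a comparable used car

# The only relevant comparison is money still to be spent.
if new_repair_cost < replacement_cost:
    print("Repair: it's the cheaper path forward.")
else:
    print("Replace: past spending doesn't change that.")
# Note that previous_repair never appears in the decision.
```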
To wrap up, I want to comment on this business of “rationality.” We humans have a lot of cognitive biases that influence the way we think, and it’s popular to say that these things make us act irrationally. I’m uncomfortable with that analysis. These biases developed for a reason. Instead of analyzing them by some allegedly “rational” standard – based on numbers and probabilities and all that – I think it makes more sense to first figure out why that bias might be a good idea.
It’s like Chesterton’s fence. If you find a fence in the woods, don’t tear it down until you know why it was put there.
I’m not suggesting that you should gleefully go around affirming cognitive biases and committing logical fallacies, but I think we need to tone down this rational vs. irrational stuff and figure out whether some things might make sense in one context even if they don’t make sense in another.
For example, when evaluating the sunk cost fallacy, remember to consider the old saying, “in for a penny, in for a pound.” The tricky thing is to know when a particular model or mental framework applies and when it doesn’t.
Links
Remembering Daniel Kahneman: 7 theories that can help you understand how you think