Predictable Irrationality
Some of the most interesting stuff that I've encountered in social science has to do with the way that people make decisions (this work usually stems from a weird intersection of cognitive psychology, behavioral economics, and organizational theory). I mentioned Herb Simon a while back, with his "bounded rationality" theory. Another really interesting set of theories comes from Kahneman and Tversky (both psychologists) and their work on cognitive biases and utility theory.
What these guys say is that, ok, we all know that people are not always rational. Duh. Sometimes, people make bad decisions. There's often an underlying assumption that people make bad decisions because they don't have enough information. These guys said, no, that's not true. Even with all the relevant information, people still make bad decisions, and what's more, they do it in predictable ways. These are what they called cognitive biases.
I just find this shit totally fascinating. Here's a rough example of what I'm talking about.
So, if you were entering into a raffle, say, or a lottery, the "rational" amount you should be willing to bet would be the expected value of the return. Expected value is the sum of all the possible outcomes, each multiplied by its probability. So, let's say there's a 30% chance of winning 100 dollars (and a 70% chance of winning nothing, which contributes zero to the sum):

$100 × 0.30 = $30.00

So, that means you should be willing to bet up to $30 for a 30% chance at winning $100.
However, most people aren't willing to go that high. That's called "risk aversion." People who would be willing to bet even more than $30 are risk-takers. Most people are biased toward security and away from risk, at least as compared to a purely mathematical assessment of rationality.
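If you like seeing the arithmetic spelled out, here's a minimal sketch of that expected-value calculation. The outcomes and probabilities are just the hypothetical lottery from above, not anything from a real study:

```python
# Expected value of a simple lottery: the sum of each payoff
# multiplied by its probability.
# Hypothetical lottery: 30% chance of $100, 70% chance of nothing.
outcomes = [(100.0, 0.30), (0.0, 0.70)]

expected_value = sum(payoff * prob for payoff, prob in outcomes)
print(expected_value)  # → 30.0
```

A purely "rational" bettor would pay anything up to that $30; paying less is risk aversion, paying more is risk-seeking.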
This is just one of a ton of cognitive biases that have been identified in the literature. Here's an ironclad source for some others. A few other notable ones include:
Loss aversion: People tend to value a loss at a higher level than a gain of identical face value. In other words, they hate losing 5 bucks more than they like gaining 5 bucks.
Sucking at extreme probabilities (sorry, can't remember the proper name for this): overestimating the danger presented by events with really small probabilities (think: fears of flying) and underestimating the danger from events with relatively large probabilities of occurring (think: getting into a car wreck).
Anchoring Bias: When, in the face of uncertainty, people use whatever information is presented to produce an answer, even if it's irrational. Like, when charities put a "suggested donation" on their fundraising stuff; that's supposed to plant a little seed that fills in this gap of uncertainty.
Optimism Bias: A tendency to underestimate the danger of doing things we want to do.
Children: A tendency to overestimate dangers to children.
Strangers: A tendency to underestimate dangers to strangers and to overestimate them for people we know.
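The loss-aversion idea above actually has a famous formalization: Kahneman and Tversky's prospect-theory value function, where losses are weighted roughly twice as heavily as gains of the same size. Here's a sketch using the parameter estimates from their 1992 paper (the specific numbers are from that study, not from this post):

```python
# Prospect-theory value function (Tversky & Kahneman, 1992),
# illustrating loss aversion: losing $5 feels worse than
# gaining $5 feels good.
ALPHA = 0.88   # curvature for gains (diminishing sensitivity)
BETA = 0.88    # curvature for losses
LAMBDA = 2.25  # loss-aversion coefficient: losses loom ~2.25x larger

def value(x):
    """Subjective value of gaining (x > 0) or losing (x < 0) x dollars."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** BETA)

print(value(5))   # subjective value of gaining $5
print(value(-5))  # about 2.25 times as intense, and negative
```

With ALPHA equal to BETA, the felt magnitude of a $5 loss is exactly LAMBDA times the felt magnitude of a $5 gain, which is the "hate losing 5 bucks more than they like gaining 5 bucks" point in numbers.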
Anyway...once you hear about this stuff, you start seeing it EVERYWHERE!!! Including in policy and politics. I'm more and more convinced that these biases are really central to understanding the political process, as strategic politicians try to exploit them, either to win more votes or to pass the policies they favor.
