Source: Business Insider | By Shana Lebowitz and Drake Baer
You make thousands of decisions every day — from what to eat for breakfast to which job offer to take.
And you might think you approach all those decisions rationally.
Yet research suggests there are a huge number of cognitive stumbling blocks that can affect our behavior, preventing us from acting in our own best interests.
Here, we"ve rounded up some of the most commonly cited biases that screw up our decision-making.
Anchoring bias
People are overreliant on the first piece of information they hear.
In a salary negotiation, for instance, whoever makes the first offer establishes a range of reasonable possibilities in each person's mind.
Any counteroffer will naturally be anchored by that opening offer.
Availability heuristic
When people overestimate the importance of information that is available to them.
For instance, a person might argue that smoking is not unhealthy on the basis that his grandfather lived to 100 and smoked three packs a day.
Bandwagon effect
The probability of one person adopting a belief increases based on the number of people who hold that belief. This is a powerful form of groupthink — and it's a reason meetings are often unproductive.
Blind-spot bias
Failing to recognize your cognitive biases is a bias in itself.
Notably, Princeton psychologist Emily Pronin has found that "individuals see the existence and operation of cognitive and motivational biases much more in others than in themselves."
Choice-supportive bias
When you choose something, you tend to feel positive about it, even if the choice has flaws. You think that your dog is awesome — even if it bites people every once in a while — and that other dogs are stupid, since they're not yours.
Clustering illusion
This is the tendency to see patterns in random events. It is central to various gambling fallacies, like the idea that red is more or less likely to turn up on a roulette wheel after a string of reds.
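To make the roulette point concrete, here is a minimal Python sketch, not part of the original article, that simulates spins of an assumed European-style wheel (18 red, 18 black, 1 green) and estimates the chance of red on the next spin after a run of reds. Because spins are independent, the conditional estimate stays at roughly 18/37 no matter how long the streak.

```python
import random

# A minimal, illustrative simulation (not from the article): it estimates the
# chance of red on the next spin, with and without conditioning on a streak of
# reds. The wheel layout (18 red, 18 black, 1 green) is an assumption.
POCKETS = ["red"] * 18 + ["black"] * 18 + ["green"]

def p_red_after_streak(streak_len, samples=10_000):
    """Estimate P(next spin is red | the previous streak_len spins were all red)."""
    hits = reds_next = 0
    while hits < samples:
        run = [random.choice(POCKETS) for _ in range(streak_len + 1)]
        if all(colour == "red" for colour in run[:streak_len]):
            hits += 1
            reds_next += run[-1] == "red"
    return reds_next / hits

print(p_red_after_streak(0))  # no streak: roughly 18/37, about 0.486
print(p_red_after_streak(5))  # after five reds: still roughly 0.486
```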
Confirmation bias
We tend to listen only to the information that confirms our preconceptions — one of the many reasons it's so hard to have an intelligent conversation about climate change.
Conservatism bias
Where people favor prior evidence over new evidence or information that has emerged. People were slow to accept the fact that the Earth was round because they maintained their earlier understanding that the planet was flat.
Information bias
The tendency to seek information when it does not affect action. More information is not always better. Indeed, with less information, people can often make more accurate predictions.
Ostrich effect
The decision to ignore dangerous or negative information by "burying" one's head in the sand, like an ostrich. Research suggests that investors check the value of their holdings significantly less often during bad markets.
But there's an upside to acting like a big bird, at least for investors. When you have limited knowledge about your holdings, you're less likely to trade, which generally translates to higher returns in the long run.
Outcome bias
Judging a decision based on the outcome rather than on how the decision was made in the moment. Just because you won a lot in Vegas doesn't mean gambling your money was a smart decision.
Overconfidence
Some of us are too confident about our abilities, and this causes us to take greater risks in our daily lives.
Perhaps surprisingly, experts are more prone to this bias than laypeople. An expert might make the same inaccurate prediction as someone unfamiliar with the topic — but the expert will probably be convinced that he's right.