
Cognitive Biases: 10 Thinking Errors That Prevent Ethical Decisions
Cognitive biases are systematic thinking errors that quietly govern our moral decisions. We are convinced that we act rationally and ethically, but that very conviction is the trap. Daniel Kahneman, Nobel laureate in economics, and his colleague Amos Tversky spent decades studying how our minds systematically err — and these errors carry a heavy price: broken relationships, unjust decisions, and accumulated karmic debt.
Before diving into specific biases, it helps to see how your thinking works in practice. Take the moral compass test at karm.top — it reveals patterns in your decision-making across real situations.
System 1 and System 2 (Kahneman)
In his landmark work "Thinking, Fast and Slow" (2011), Daniel Kahneman describes two modes of thinking. System 1 operates automatically, quickly, and effortlessly: this is intuition. System 2 is slow, analytical, and demands concentration. The problem is that most of our decisions, including moral ones, are made by System 1, which is riddled with biases.
Fast thinking as a source of errors
When you instantly decide whether someone is guilty, whether you like a person, or whether they deserve help, that is System 1 at work. It conserves energy but pays for it in accuracy. Evolution built us not for deep moral analysis but for quick decisions under threat. Cognitive biases are a byproduct of that evolutionary history.
According to the American Psychological Association, most people overestimate the quality of their moral judgments: we think we are making balanced decisions when we are actually following automatic heuristics.
10 Biases That Affect Ethical Decisions
1. Confirmation Bias
We seek out information that confirms what we already believe and ignore what contradicts our convictions. In a moral context this means we tend to justify our own actions while condemning identical actions by others: if you were late, traffic was to blame; if a colleague was late, they are irresponsible. Peter Wason's experiments in the 1960s, beginning with his 1960 rule-discovery task, showed that people reject information that disproves their hypotheses even when it is clearly in their interest to accept it.
2. Moral Licensing
One of the most dangerous biases in ethics. Anna Merritt and her colleagues at Stanford, in a 2010 review published in Social and Personality Psychology Compass, showed that people who had just performed a virtuous act were significantly more likely to permit themselves a selfish or immoral act immediately afterward. Donated to charity, then snapped at the waiter. The brain seems to keep a "moral ledger", and when it is in the black, we allow ourselves to spend the accumulated credit.
3. Diffusion of Responsibility
The more people present in a situation requiring action, the less personal responsibility each individual feels. Classic experiments by Bibb Latané and John Darley (1968) demonstrated this: when subjects believed they were the only witness to an emergency, they helped in 85% of cases; when they believed four other witnesses were also present, only 31% did. More on this in our article on the bystander effect and the karma of inaction.
4. The Halo Effect
One positive trait creates a "halo" through which we perceive all of a person's other qualities as positive too. Psychologist Edward Thorndike described this in 1920: attractive people seem smarter, charismatic politicians seem more honest, successful entrepreneurs seem more virtuous. Morally, this means we tend to forgive people we like for things we would not forgive strangers.
5. In-Group Bias
We systematically evaluate people from our own group, whether national, religious, or professional, more favorably than "outsiders". Henri Tajfel, creator of social identity theory, showed that even a random division of people into groups was enough to make them favor their own side. In ethics this translates into double standards: we judge "our own" differently from "them".
6. Sunk Cost Fallacy
We continue investing resources in a failing project, relationship, or decision simply because we have already invested a lot. Morally, this means we justify past unethical decisions to avoid admitting we were wrong: "I have already invested so much, I can't quit now." This trap leads us to compound our errors rather than correct them.
7. Just-World Hypothesis
We want to believe the world is fair and that people get what they deserve. Psychologist Melvin Lerner described the phenomenon in the 1960s: subjects began to evaluate a victim of injustice negatively simply because they could not help her. "If something bad happened to her, she must have done something wrong." This bias fuels callousness disguised as logic.
8. Optimism Bias
We underestimate the likelihood of negative consequences from our own actions. Tali Sharot of University College London, in "The Optimism Bias" (2011), reports that roughly 80% of people believe they are less likely than average to get divorced, fall ill, or have an accident. In ethical terms, this means we underestimate the harm we may cause through our own actions.
9. Fundamental Attribution Error
When explaining other people's behavior, we overestimate the role of personal traits and underestimate circumstances. When someone else is late, they are irresponsible; when we are late, circumstances were to blame (traffic, emergencies). Lee Ross of Stanford described this phenomenon in 1977. Morally, it leads to unjust condemnation: we judge others by their "character" while explaining our own behavior through context.
10. Bias Blind Spot
We see cognitive biases in others but not in ourselves. Research by Emily Pronin of Princeton University (2002) showed that people acknowledge biases are real and affect others, yet remain convinced they themselves are immune. This meta-bias makes all the preceding ones especially dangerous: we cannot correct what we cannot see in ourselves. More on how this manifests in group dynamics in our article on conformism and personal ethics.
How to Recognize Your Own Biases
The first step is to acknowledge that you have them. This is not weakness — it is honesty. Cognitive biases are not a sign of stupidity; they are a feature of the human brain. Nobel laureates are subject to them no less than anyone else.
The Red Team Practice
This method is borrowed from military strategy and intelligence. Before making an important moral decision, consciously assume the role of "devil's advocate": what arguments speak against your position? What might someone who thinks differently say? What facts might you have ignored because they were inconvenient?
Psychologist Gary Klein developed the "pre-mortem" method: imagine your decision has already failed. Why did it happen? This thought experiment surfaces blind spots before they become real consequences.
Tools for Cleaner Thinking
Cognitive science research offers several proven strategies:
- Slowing down: when a question has a moral dimension, consciously switch from System 1 to System 2. Ask yourself: "What would I think about this situation if it involved a stranger?"
- Third-person perspective: Ethan Kross of the University of Michigan showed that thinking about yourself in the third person leads to more balanced, less emotionally distorted decisions.
- The 10-10-10 rule: before deciding, ask yourself how you will view this choice in 10 minutes, in 10 months, and in 10 years.
- Diverse perspectives: deliberately seek out people whose views differ from yours and ask them questions — not to persuade them, but to test your own assumptions.
- Pause before reacting: especially in conflict situations. Viktor Frankl is often credited with the observation that there is a space between stimulus and response, and it is in that space that conscious ethical choice becomes possible.
Cognitive Biases and Karma
From a karmic perspective, cognitive biases are the mechanism that allows us to act unethically while maintaining a positive self-image. Moral licensing allows us to «offset» good with bad. Fundamental attribution error allows us to judge others more harshly than ourselves. The just-world hypothesis helps us rationalize indifference to others' suffering.
Working with cognitive biases is working on honesty with yourself. It is one of the most difficult, but also the most valuable, karmic labors. The connection between clarity of thinking and quality of action is direct: the more accurately we see reality, the more precise our moral decisions.
Learn more about how your inner compass works in specific situations in our article on the moral compass.
Check Your Moral Compass
Knowing about cognitive biases is only the beginning. The next step is to see how they specifically affect your decisions. Take the moral compass test at karm.top: it presents real situations and shows which thinking patterns drive your reactions. This is not a judgment — it is a mirror.
Frequently Asked Questions
Can cognitive biases be completely eliminated?
No. Cognitive biases are a built-in, evolutionarily conditioned feature of the human brain. But awareness of their existence and the application of specific practices, such as slowing down, the third-person perspective, and pre-mortem analysis, can significantly reduce their influence on important decisions.
Which of the 10 biases is most dangerous for ethics?
Many researchers consider moral licensing particularly dangerous precisely because it disguises itself as virtue. A person feels entitled to spend the "moral credit" earned by a good deed on a bad one. This creates an illusion of morality while actually diminishing it.
How are cognitive biases related to conformism?
In-group bias and confirmation bias directly feed conformism: we follow our group's opinion, ignore contradictory information, and adopt others' decisions as our own. More in the article on conformism and personal ethics.
Did you enjoy this article? Share it with others; it might improve someone's life.


