
The Karma of Borrowed Certainty: Why We Adopt Others' Beliefs as Our Own
Consider: how many of the beliefs you call your own have you actually stress-tested? Political views, ideas about healthy living, notions of what success looks like — most people discover that these supposedly personal convictions came from parents, teachers, beloved authors, or charismatic strangers on the internet. This isn't a character flaw. It's the normal operation of a brain that evolved for group survival, not independent philosophical investigation. But understanding how this mechanism works gives you something valuable: the ability to choose what to keep and what to reconsider.
We live in an era when information updates faster than we can process it, and that overload makes epistemic dependence not just convenient but practically inevitable. The question isn't whether we borrow beliefs; we all do. The question is whether we're aware of the process.
Epistemic Dependence as the Norm
Philosopher John Hardwig published a provocative 1985 paper, "Epistemic Dependence," demonstrating that in modern society, most of what we "know" we know through trust in others. A doctor trusts the biochemist, the biochemist trusts the physicist, the physicist trusts the mathematician. This chain of delegation is inevitable — we can't personally verify everything.
Psychologists call this cognitive offloading. The brain is an extraordinarily energy-intensive organ: it consumes about 20% of the body's total energy while comprising only 2% of its mass. Evolution favored a powerful energy-saving shortcut: trust those who have thought more about a subject than you have. Social learning, following the experienced members of a group, allowed our ancestors to accumulate knowledge across generations without reinventing the wheel.
Cognitive dependence is therefore not a bug but a feature. The problem begins when delegation becomes unconditional.
When Delegation Becomes Harmful
Social psychologist Robert Cialdini describes the authority heuristic as one of six key influence mechanisms. When someone is perceived as an expert or authority, we tend to accept their claims without critical analysis. In most situations this is reasonable — we can't become experts in everything. But this same mechanism makes us vulnerable to those who merely appear to be experts.
The classic Milgram experiments (1960s) and Solomon Asch's conformity studies (1951) demonstrated how deeply social pressure penetrates our cognitive processes. In Asch's experiments, people agreed with an obviously wrong majority answer 37% of the time — not because they couldn't see the right answer, but because they unconsciously concluded: if everyone thinks differently, maybe I'm the one who's wrong.
This mechanism is particularly alarming in the context of ideological groups, religious movements, and contemporary information bubbles. Cognitive scientist Steven Pinker describes ideological capture: someone who accepts a few key group beliefs gradually absorbs the entire worldview, not because they agree with every element, but because leaving the system is too socially costly.
The Authority Heuristic: Why We Trust Confident Voices
An interesting phenomenon: confidence and competence often correlate in our perception, though the actual relationship between them is weaker than it appears. Psychologist David Dunning described this in the context of the famous Dunning-Kruger effect: people with minimal knowledge in a domain often demonstrate maximum confidence. True experts, conversely, are typically more cautious in their judgments — they understand the complexity of the problem better.
In the media environment, this creates a paradox: the voices that sound most compelling and confident often belong to people with the least actual expertise. Social media algorithms amplify this: confident, emotionally charged statements receive more engagement than measured and nuanced ones. When you constantly see confident voices, the brain begins treating that confidence as a signal of truth.
Neuroimaging research (Berns et al., 2010) found that when we're presented with opinions from authority figures, activation decreases in brain regions associated with independent judgment. In effect, we "switch off" critical thinking when we hear an expert.
Epistemic Cowardice: Avoiding the Discomfort of Updating Beliefs
Philosopher Linda Zagzebski developed the concept of epistemic virtue: a set of thinking qualities that help us better understand reality. Its opposite is epistemic cowardice: the tendency to cling to comfortable beliefs and avoid information that might challenge them.
Updating beliefs is a psychologically costly process. The cognitive dissonance that arises when new information contradicts an existing worldview feels literally threatening. The brain responds to it similarly to physical pain — the anterior cingulate cortex activates to process both kinds of "discomfort."
This is why people engage in confirmation bias: we actively seek information that confirms what we already believe. This isn't mere cognitive laziness — it's emotional self-protection. Acknowledging that a belief you've carried for ten years was mistaken means experiencing something like the small death of a part of your identity.
You can examine your own beliefs through the Oracle — sometimes an unexpected question from outside helps you see what you actually think, rather than what you think you think.
Building Epistemic Autonomy
Epistemic autonomy isn't the rejection of all authorities or perpetual skepticism where nothing can be known. It's the skill of holding beliefs provisionally: accepting them as working hypotheses, ready for revision when new evidence arrives.
Philosopher Jonathan Quong proposes the concept of autonomous trust: you can trust an expert while retaining the right — and responsibility — to ask questions about methodology, conflicts of interest, and the context of their claims. The key question isn't "who said this" but "how did they come to know it."
Critical thinking researchers identify several key skills: distinguishing factual claims from value judgments; tracking the origins of beliefs; evaluating evidence quality; and resisting appeals to authority as the sole argument. None of these skills is innate — all of them can be trained.
Interestingly, the psychology of self-deception often walks hand-in-hand with epistemic dependence: we don't just adopt others' beliefs, we convince ourselves we arrived at them independently.
Practical Exercise: The Source Audit
Here's a concrete exercise for developing epistemic autonomy. Set aside an hour and conduct a "source audit" of five of your core beliefs — about politics, health, relationships, career, or any other significant domain.
For each belief, answer the following questions:
- Where did it come from? Who specifically shaped this belief — a parent, teacher, book, or social media figure?
- When did you last check it? Have you looked for arguments against this belief as actively as arguments for it?
- What evidence would change it? If you can't name such evidence — that's a signal the belief has become unfalsifiable and is serving more of an identity function than a cognitive one.
- Who benefits from you believing this? This isn't a call to paranoia, but a useful question for identifying potential conflicts of interest.
- What would change in your life if this belief turned out to be wrong?
The goal isn't to discard all your beliefs. Many will turn out to be well-founded. But the very process of examination changes your relationship to the belief: it stops being part of you and becomes a tool — useful as long as it works.
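If it helps to make the audit repeatable, here's a minimal sketch in Python; the script, the output file name, and the sample beliefs are illustrative assumptions, not part of the exercise as described above. It walks through the five questions for each belief and saves the answers for later review.

```python
# A small interactive script for the source audit.
# The questions mirror the list above; answers are saved for later review.
import json

QUESTIONS = [
    "Where did it come from?",
    "When did you last check it?",
    "What evidence would change it?",
    "Who benefits from you believing this?",
    "What would change in your life if it turned out to be wrong?",
]

def run_audit(beliefs: list[str], path: str = "source_audit.json") -> None:
    """Ask every audit question for each belief and save the answers as JSON."""
    audit = {}
    for belief in beliefs:
        print(f"\nBelief: {belief}")
        audit[belief] = {q: input(f"  {q} ") for q in QUESTIONS}
    with open(path, "w", encoding="utf-8") as f:
        json.dump(audit, f, indent=2)
    print(f"\nSaved to {path}. Revisit it in six months and see what has shifted.")

if __name__ == "__main__":
    # Replace these with five of your own core beliefs.
    run_audit([
        "Hard work is always rewarded",
        "My news sources give me an accurate picture of the world",
    ])
```

Writing the answers down matters: a saved audit turns a one-off reflection into something you can compare against yourself a year later.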
Related practice: the next time you hear a confident claim in media or conversation, ask yourself: "How does this person know this?" Not "are they right" but specifically "how do they know." This simple question creates a small epistemic pause that, over time, becomes a habit. And the habit of asking the right questions is what philosophers call intellectual virtue.
A few questions for reflection:
- Which belief did you accept from someone else and never seriously examine?
- Is there someone in your life whose opinion you accept without scrutiny, and why?
- When did you last change an important belief, and what convinced you?
- What do you fear discovering if you audit your own views?
- How would your life change if you held your core beliefs just a little more provisionally?
Probabilistic Thinking as Antidote
One of the most practical tools against borrowed certainty is probabilistic thinking: the habit of expressing beliefs not in absolute categories ("this is true" / "this is false") but in probabilities ("I'm roughly 70% confident about this"). This is the technique professional forecasters rely on, people whose jobs depend on accurate probability estimates.
Psychologist Philip Tetlock, who studied forecasters for over 20 years, found that those who think in probabilities and willingly revise their estimates as new data arrives consistently outperform experts who hold confident categorical positions. Probabilistic language also creates a psychological buffer: lowering a belief from 70% confidence to 50% when new data appears is far easier than going from "this is absolute truth" to "I was wrong."
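To make that kind of revision concrete, here's a minimal sketch in Python; the function name and all the numbers are illustrative assumptions, not figures from Tetlock's research. It applies Bayes' rule to show how a belief held at 70% shifts after one piece of disconfirming evidence.

```python
def update_belief(prior: float, p_evidence_if_true: float, p_evidence_if_false: float) -> float:
    """Bayes' rule: return P(belief is true | evidence was observed)."""
    numerator = p_evidence_if_true * prior
    return numerator / (numerator + p_evidence_if_false * (1 - prior))

# Start at 70% confidence, then observe evidence that is twice as likely
# if the belief is false (0.6) as if it is true (0.3).
posterior = update_belief(0.70, p_evidence_if_true=0.3, p_evidence_if_false=0.6)
print(f"70% -> {posterior:.0%}")  # prints: 70% -> 54%
```

A belief that lives at a number invites exactly this kind of routine adjustment; a belief held as absolute truth can only survive or shatter.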
In practice: for one week, try stating your beliefs with explicit confidence levels. "I'm about 80% confident that this policy works the way they say." "I'm maybe 60% confident that this person is trustworthy." It sounds unusual and slightly awkward, but that awkwardness is a sign you're beginning to deal more honestly with your own uncertainty.
Probabilistic thinking is covered in more depth in a dedicated article, including the cognitive biases that make us worse at estimating probabilities than we think.
Finally, it's important to understand: epistemic autonomy doesn't mean isolation from intellectual community. On the contrary, it enables higher-quality participation in it. When you critically evaluate sources and examine beliefs, you become a more valuable participant in any discussion: instead of simply reproducing others' viewpoints, you bring your own processed, verified position. That's what makes genuine intellectual dialogue productive: not agreement, but the capacity for independent judgment.

Also worth remembering: changing a belief is not an admission of defeat. It's evidence of intellectual honesty and willingness to grow. Those who never change their views aren't demonstrating resilience; they're demonstrating cognitive inflexibility.


