
Fake News: Are You Responsible for What You Share?
How Misinformation Works: The Mechanics of Virality
In 2018, researchers at MIT Media Lab published a landmark study in the journal Science on how information spreads on Twitter. Soroush Vosoughi and his co-authors analyzed about 126,000 story cascades shared between 2006 and 2017. The conclusion was unambiguous: false news spreads faster, farther, and more broadly than true news. True stories took about six times as long as false ones to reach 1,500 people, and falsehoods were 70% more likely to be retweeted than verified facts.
Why? The authors found that humans, not bots, are the primary spreaders of misinformation, and the main driver is emotion. False stories were more likely to evoke fear, disgust, and surprise; true stories were more likely to evoke trust, anticipation, and sadness. The intensity of the emotional response predicted the speed of spread.
MIT Media Lab Study 2018: Lies Spread 6 Times Faster Than Truth
This research overturned conventional thinking about who is responsible for spreading misinformation. It's commonly assumed that the problem lies with social media algorithms or organized influence campaigns. Vosoughi and colleagues showed that even with bots removed from the analysis, humans spread falsehoods more effectively than truths.
This raises a direct question about every user's responsibility. If you share something unverified, you're participating in this mechanism, even if you don't realize it.
Emotion as the Engine of Misinformation Virality
Fake news is constructed to generate the maximum emotional response. Messages that appeal to fear ("This is killing people"), righteous anger ("They're hiding the truth"), disgust ("Look at what they're doing"), or surprise ("You won't believe this, but it's true") get more engagement because emotional arousal reduces critical thinking.
The brain works differently under emotional arousal: the prefrontal cortex, the region responsible for critical analysis, receives fewer cognitive resources. This is precisely why the most viral fake stories are built to trigger a strong emotion before rational thinking kicks in.
The Psychology of Believing Lies: Cognitive Biases
It would be unfair to assume that only "uneducated" or "naive" people fall for fake news. Research shows that the cognitive biases that make us vulnerable to misinformation are fundamental features of human perception, regardless of education level.
Confirmation Bias
We tend to seek out, remember, and trust information that confirms what we already believe. If a fake story aligns with our political views, our fears, or our worldview, we're less likely to verify it. The Reuters Institute Digital News Report finds that people are 2–3 times less likely to fact-check a story that aligns with their existing beliefs.
The Illusory Truth Effect: Repetition Creates Trust
Research in cognitive psychology shows that simply repeating a statement increases the subjective probability that it is true, a phenomenon known as the "illusory truth effect." This is precisely why organized misinformation campaigns persistently repeat the same narratives: each repetition increases perceived plausibility.
Motivated Reasoning
When information threatens our identity or beliefs, we don't process it objectively; instead, we search for reasons not to believe it. Conversely, when information confirms our worldview, we accept it without scrutiny. This is called motivated reasoning, and it's what turns us into voluntary spreaders of misinformation.
The Karma of Sharing: You Become the Source
In the era of social media, every user has become a publisher. Your repost is a publication. Your share is an endorsement. When you spread unverified information, you take on part of the responsibility for its consequences.
The Moral Responsibility of the Spreader
Classical ethics distinguishes between action and inaction, between direct harm and indirect participation. Sharing fake news is not a "neutral button press." It is active participation in spreading potentially harmful information. If that fake news leads to real consequences, such as discrimination, violence, panic, or wrong medical decisions, your share is part of the causal chain.
Important: ignorance is not a complete excuse when you had the opportunity to verify the information. This is what ethics calls "reckless disregard": ignoring obvious risks.
The Collective Harm of Misinformation
Misinformation has documented social consequences. During the COVID-19 pandemic, the "infodemic," the wave of false information, led to real deaths: people refused vaccinations, took dangerous "remedies," and delayed seeking medical help. The WHO officially declared the infodemic a public health threat.
Everyone who spread unverified claims about the virus or vaccines contributed to this system — regardless of their intentions.
5 Steps to Verify Information
Media literacy is a skill you can practice. First Draft News, an organization specializing in information verification, identifies a set of basic checks that anyone can master.
1. Check the Source
Who published this information? Is it a well-known outlet with established editorial standards? Or a website you've never seen before? Check the «About» page, look at the site's other content, and search for reviews of its reputation.
2. Find the Original Source
Most news stories reference a primary source. Follow the link. Read the original document, study, or statement — not a paraphrase of it. Paraphrases often substantially distort the meaning of the original.
3. Find Multiple Independent Confirmations
If an event really happened, multiple independent sources will report it. If you can only find the information on one website, that's a red flag. Use search engines with the story's key terms to find other publications.
4. Check the Date
Old news is often republished without context, creating a false impression of currency. Always check the publication date. If an event occurred years ago, presenting it as current is misinformation, whether deliberate or not.
5. Use Fact-Checking Resources
Specialized fact-checking resources exist: Snopes, FactCheck.org, Reuters Fact Check, PolitiFact, and local fact-checking organizations. If the information you want to share concerns an important event, spend 2 minutes checking it.
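The five steps above can be sketched as a pre-share checklist. This is an illustrative sketch only: the names (ShareCheck, ok_to_share), the two-source minimum, and the one-year freshness window are assumptions for the example, not rules from any fact-checking standard.

```python
from dataclasses import dataclass
from datetime import date


@dataclass
class ShareCheck:
    """One answer per verification step (illustrative names)."""
    source_known: bool        # 1. established outlet with editorial standards?
    original_found: bool      # 2. read the primary source, not a paraphrase?
    independent_reports: int  # 3. how many independent confirmations?
    published: date           # 4. publication date of the story
    fact_checked: bool        # 5. looked it up on a fact-checking resource?


def ok_to_share(check: ShareCheck, today: date, max_age_days: int = 365) -> bool:
    """Return True only if every step of the checklist passes."""
    is_recent = (today - check.published).days <= max_age_days
    return (check.source_known
            and check.original_found
            and check.independent_reports >= 2   # assumed threshold
            and is_recent
            and check.fact_checked)
```

For example, a story from a known outlet with three independent confirmations, a recent date, and a fact-check lookup passes; drop any one step and `ok_to_share` returns False.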
Your Online Honesty: The Karma of Information Behavior
Honesty in the information space is the same form of karmic responsibility as honesty in personal relationships. By spreading unverified information, you lower the overall level of trust in society. By making the effort to verify, you invest in the information health of your community.
A simple question before every share: "Have I verified this?" It takes 2–5 minutes. It's your contribution to the karmic balance of the information space.
To explore your patterns of honesty and responsibility in everyday situations, take the test at karm.top. Also read about the psychology of honesty and lies.
Frequently Asked Questions
Am I responsible if I didn't know the information was false?
Ignorance reduces but doesn't eliminate responsibility when you had the opportunity to verify. A sensational headline that evokes a strong emotion is itself a signal to verify before sharing. Conscious inattention is also a choice.
Don't algorithms bear more responsibility than users?
Algorithms create an environment that amplifies the virality of misinformation. But the MIT study shows that humans, not bots, are the primary spreaders of false information. Responsibility is distributed: platforms, algorithms, and users share it jointly.
What should I do if I've already shared a fake story?
Acknowledge the mistake publicly, delete the share, and post a correction. This requires courage, but it's the only honest response. Research suggests that people who publicly admit mistakes are trusted more than those who conceal them.
Did you enjoy this article? Share it with others! Sharing it with even one person might improve their life!


