
The science behind information diversity

We didn't discover the filter bubble problem — researchers have been documenting it for over a decade. We're just building the tools to help you do something about it.

75%
Overestimate their news discernment
39%
Actively avoid news
71%
Consume little or no foreign news
65%
Feel exhausted by politics

The Filter Bubble Problem

In 2011, Eli Pariser coined the term "filter bubble" to describe how personalization algorithms create isolated information environments. Since then, researchers have been documenting just how bad it's gotten.

Modern recommendation systems optimize for engagement, not understanding. They learn what makes you click, share, and stay — then show you more of exactly that. The result is a feedback loop that narrows your worldview while making you feel more informed than ever.

75%

Overconfidence in News Judgment

Three in four Americans overestimate their ability to distinguish legitimate news from misinformation, placing themselves 22 percentiles higher than warranted.

PNAS, 2021
39%

News Avoidance Rising

Nearly 4 in 10 people actively avoid news sometimes or often — up 10 percentage points since 2017 as news fatigue spreads globally.

Reuters Institute Digital News Report, 2024
71%

Global News Gap

Seven in ten Americans consume little to no foreign news — with 24% never consuming any international news coverage at all.

SmartNews Survey, 2025
65%

Political Exhaustion

Nearly two-thirds of Americans always or often feel exhausted when thinking about politics — while only 10% feel hopeful.

Pew Research Center, 2023

The Social Media Factor

Social media has fundamentally changed how we consume information — and not always for the better. These platforms now sit at the center of our information ecosystem, amplifying both connection and division.

54%

Get News From Social Media

More than half of U.S. adults now get news from social media at least sometimes — Facebook and YouTube lead at 33% and 32% respectively.

Pew Research Center, 2024
70%

False News Spreads Faster

Falsehoods are 70% more likely to be retweeted than accurate information — and reach 1,500 people six times faster than truth.

MIT/Science, 2018
74%

Concerned About Misinformation

Nearly three-quarters of Americans are "very concerned" about the spread of misinformation online — and 79% say they've encountered false news.

Gallup/Knight Foundation, 2024
43%

Gen Z Gets News on TikTok

Among adults under 30, 43% regularly get news from TikTok, up from just 9% in 2020. The platform's role as a news source has more than quadrupled.

Pew Research Center, 2024

It's Not Just the Algorithm — It's Us

A landmark MIT study found something surprising: humans, not bots, are primarily responsible for spreading false information. When researchers removed all bot activity from their dataset, the patterns remained the same. We share misinformation because it is novel and surprising and because it triggers emotional responses like disgust and outrage, exactly the qualities that make content engaging.

This is why awareness matters. You can't outsource critical thinking to platform moderation. The only sustainable solution is helping people see their own information patterns — which is exactly what we built Mindspan to do.

How We're Applying This Research

The research is clear: filter bubbles are real, most people don't realize they're in one, and awareness is the first step to breaking out. We built Mindspan to make that awareness accessible to everyone.

The Filter Bubble Index (FBI)

Based on frameworks from media diversity research, we created the FBI Score, a single, actionable metric that captures the diversity of your information diet. It's calculated from four components identified in the literature as key indicators of information diversity (a brief scoring sketch follows the component list):

📊

Filter Bubble Index (FBI)

A 0-100 score measuring information diet diversity

📰

Content Variety

Distribution across content types: news, analysis, opinion, research, entertainment. Measures format diversity.

Weight: 25%
⚖️

Perspective Balance

Ratio of supportive to critical viewpoints on topics you follow. Measures ideological diversity.

Weight: 30%
🌍

Geographic Scope

Coverage across world regions: North America, Europe, Asia-Pacific, Africa, Latin America, Middle East.

Weight: 25%
🏷️

Topic Breadth

Range of subject areas: politics, technology, science, culture, business, health, environment, etc.

Weight: 20%
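To make the weighting concrete, here is a minimal sketch (in TypeScript) of how four component scores, each on a 0-100 scale, could be combined into a single FBI Score using the weights listed above. The interface, function names, and example values are illustrative assumptions, not Mindspan's actual implementation.

```typescript
// Illustrative only: combine four 0-100 component scores into one FBI Score
// using the weights listed above (25% / 30% / 25% / 20%).
interface ComponentScores {
  contentVariety: number;     // format diversity, 0-100
  perspectiveBalance: number; // ideological diversity, 0-100
  geographicScope: number;    // regional diversity, 0-100
  topicBreadth: number;       // subject diversity, 0-100
}

const WEIGHTS = {
  contentVariety: 0.25,
  perspectiveBalance: 0.3,
  geographicScope: 0.25,
  topicBreadth: 0.2,
} as const;

function fbiScore(scores: ComponentScores): number {
  const weighted =
    scores.contentVariety * WEIGHTS.contentVariety +
    scores.perspectiveBalance * WEIGHTS.perspectiveBalance +
    scores.geographicScope * WEIGHTS.geographicScope +
    scores.topicBreadth * WEIGHTS.topicBreadth;
  return Math.round(weighted); // stays on the 0-100 scale
}

// Example: broad formats and topics, but a one-sided perspective mix.
console.log(fbiScore({
  contentVariety: 82,
  perspectiveBalance: 35,
  geographicScope: 60,
  topicBreadth: 74,
})); // ~61
```

Because each component is already normalized to 0-100 and the weights sum to 1, the combined score stays on the same 0-100 scale, and Perspective Balance, the most heavily weighted component, moves it the most.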

How It Works

We use on-device machine learning to analyze your content without compromising privacy. Here's the process:

1

Content Classification

A lightweight NLP model classifies page content into types (news, opinion, etc.) using linguistic patterns — not URL matching.

2

Topic Extraction

Named entity recognition and topic modeling identify the subjects discussed; the source text is then immediately discarded.

3

Sentiment Analysis

We analyze framing and perspective (critical vs. supportive) to measure viewpoint diversity on topics you follow.

4

Geographic Detection

Location entities and publication metadata determine the regional focus of each item for geographic diversity scoring.

5

Aggregation

All classifications are aggregated into percentages. Only these anonymized ratios ever leave your device.

6

Scoring

The FBI Score is calculated using weighted entropy measures that reward balanced, diverse consumption patterns.
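As an illustration of steps 5 and 6, here is a minimal sketch assuming the aggregation produces simple per-category ratios and the scoring uses normalized Shannon entropy, one common entropy-based diversity measure. The function names, category labels, and exact formula are assumptions for illustration; the production weighting may differ.

```typescript
// Step 5 (sketch): turn raw per-page category counts into anonymized ratios.
// Only an object like { news: 0.6, opinion: 0.3, research: 0.1 } would ever
// leave the device -- never URLs or page text.
function toRatios(counts: Record<string, number>): Record<string, number> {
  const total = Object.values(counts).reduce((sum, n) => sum + n, 0);
  const ratios: Record<string, number> = {};
  for (const [category, n] of Object.entries(counts)) {
    ratios[category] = total > 0 ? n / total : 0;
  }
  return ratios;
}

// Step 6 (sketch): normalized Shannon entropy of the ratios, scaled to 0-100.
// Balanced consumption across categories scores high; a single dominant
// category scores low.
function entropyScore(ratios: Record<string, number>, categoryCount: number): number {
  const entropy = Object.values(ratios)
    .filter((p) => p > 0)
    .reduce((h, p) => h - p * Math.log2(p), 0);
  const maxEntropy = Math.log2(categoryCount); // entropy of a perfectly uniform spread
  return maxEntropy > 0 ? Math.round((entropy / maxEntropy) * 100) : 0;
}

// Example: 60 news pages, 30 opinion pieces, and 10 research articles read,
// out of 5 possible content types.
const ratios = toRatios({ news: 60, opinion: 30, research: 10 });
console.log(entropyScore(ratios, 5)); // ~56: a heavy tilt toward straight news
```

In this example, a diet dominated by straight news scores around 56 on the content-variety dimension; an even spread across all five content types would score 100, and reading only one type would score 0.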

The Research That Convinced Us to Build This

These are some of the key studies that shaped our understanding of the problem — and convinced us that a privacy-first awareness tool was needed:

Foundational

The Filter Bubble: What the Internet Is Hiding from You

Eli Pariser's seminal work introducing the concept of personalized information environments and their effects on democratic discourse.

Pariser, E. (2011) • Penguin Press →
Large-Scale Study

Exposure to ideologically diverse news and opinion on Facebook

Landmark study of 10.1 million Facebook users finding that user choice reduces cross-cutting content by 17% for conservatives vs. ~7% algorithmic reduction, evidence that individual behavior drives filter bubbles more than algorithms alone.

Bakshy, Messing & Adamic (2015) • Science • 2,735+ citations • DOI: 10.1126/science.aaa1160 →
Experimental Study

Selective Exposure in the Age of Social Media

Stanford researchers found that social endorsements can reduce partisan selective exposure to "levels indistinguishable from chance" — showing that context and presentation can override filter bubble tendencies.

Messing & Westwood (2012) • Communication Research • DOI: 10.1177/0093650212466406 →
Recent Research

Breaking the Loop: Causal Learning to Mitigate Echo Chambers

Analysis of Twitter, Google+, and Facebook showing how algorithmic filtering and user behavior create self-reinforcing bubbles — and potential interventions to break the cycle.

Yu et al. (2025) • ACM Transactions on Information Systems • DOI: 10.1145/3757738 →
Overconfidence Study

Overconfidence in news judgments is associated with false news susceptibility

Study showing 75% of Americans overestimate their news discernment ability by 22 percentiles on average — the overconfident are more likely to share misinformation.

Lyons et al. (2021) • PNAS →
Misinformation Study

The Spread of True and False News Online

Analysis of 126,000 news stories on Twitter found falsehoods are 70% more likely to be retweeted and reach 1,500 people 6x faster than truth. Crucially, humans — not bots — drive this spread.

Vosoughi, Roy & Aral (2018) • Science • DOI: 10.1126/science.aap9559 →

Why We Built This

The research paints a clear picture: most of us are stuck in information bubbles we can't see. But reading about filter bubbles doesn't break you out of one — you need to see your own patterns.

The Gap We're Filling

Researchers have spent years documenting the problem. The landmark Bakshy study found that user choice reduces exposure to diverse viewpoints by 17% — more than double the algorithm's ~7% effect. Media literacy advocates tell us to "diversify our sources." But until now, there hasn't been a simple, private way to actually see what you're consuming and what you're missing.

That's what Mindspan does. We take the insights from this research and turn them into a tool you can use every day — without sacrificing your privacy to yet another company that wants to track everything you read.

What the Research Suggests Works

  • Awareness alone helps: Studies show that simply making people aware of their consumption patterns leads to more diverse information seeking. Messing & Westwood found that context cues can reduce partisan selectivity to chance levels.
  • It's you, not just the algorithm: The Bakshy study found that individual behavior drives filter bubbles more than algorithms do. That means you have the power to change it, if you can see it.
  • Privacy matters: People won't use tools that feel invasive. A privacy-first approach isn't just ethical — it's essential for adoption.

See your own patterns

The research is clear. Soon, you'll be able to find out what you've been missing.