Why More Information Makes You Dumber

This chapter’s title is deliberately provocative, and I should clarify it before we go further: more information doesn’t literally make you dumber. Your IQ doesn’t drop when you open a new browser tab.

What happens is more subtle and, in some ways, worse.

More information, consumed without the right systems, makes you feel more informed while actually degrading the quality of your decisions, predictions, and understanding.

This is not intuitive. It seems obvious that more information should lead to better outcomes. After all, decisions made with relevant data should beat decisions made without it.

And in many cases, that’s true — up to a point.

Beyond that point, additional information doesn’t help. It actively hurts, through mechanisms that are well-documented but poorly understood by most people navigating the information landscape.

Understanding these mechanisms is essential for building an effective information triage system, because the goal isn’t to consume the maximum amount of information. It’s to consume the right amount, which is often considerably less than feels comfortable.

The Paradox of Choice, Applied to Information

In 2004, psychologist Barry Schwartz published The Paradox of Choice, arguing that an abundance of options, rather than being liberating, can be paralyzing and ultimately unsatisfying.

His evidence came primarily from consumer decisions — jam selections, retirement plans, dating options — but the underlying mechanism applies directly to information consumption.

The core finding: when the number of options exceeds a certain threshold (which varies by individual and context), three things happen.

First, decision quality declines.

With too many options, people struggle to evaluate them systematically and default to simplistic heuristics or arbitrary criteria. They pick the first thing that seems good enough, or the thing that’s most familiar, or the thing that requires the least effort to evaluate — none of which reliably correspond to the best option.

Second, decision speed declines.

More options mean more comparisons, and the number of possible pairwise comparisons grows quadratically with the number of options. Facing a wall of possibilities, people often defer the choice entirely, which in the context of information means “save for later” (i.e., never).
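The arithmetic behind that claim is simple enough to sketch: with n options, the number of head-to-head comparisons is “n choose 2,” or n(n − 1)/2. The little helper below is mine, not anything from the research, but it makes the scaling vivid: going from 3 articles to 300 doesn’t multiply the comparison burden by 100, it multiplies it by nearly 15,000.

```python
# Hypothetical helper: pairwise comparisons among n options
# is "n choose 2" = n * (n - 1) / 2, which grows quadratically.
def pairwise_comparisons(n: int) -> int:
    return n * (n - 1) // 2

for n in (3, 10, 30, 300):
    print(f"{n} options -> {pairwise_comparisons(n)} pairwise comparisons")
# 3 -> 3, 10 -> 45, 30 -> 435, 300 -> 44850
```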

Third, satisfaction declines.

Even when people make a good choice from many options, they’re less satisfied with it than they would have been if they’d chosen from fewer options. The reason is opportunity cost awareness: with many options, you’re more aware of what you didn’t choose and more likely to second-guess yourself.

Now apply this to information consumption.

When you have three articles to read about a topic, you read the one that seems best and you’re satisfied.

When you have three hundred, you spend twenty minutes trying to figure out which ones are worth reading, eventually pick three that may or may not be the best ones, skim them anxiously because you’re aware of the 297 you’re not reading, and come away feeling less informed and less confident than if you’d had fewer choices.

This isn’t hypothetical. This is the literal experience of trying to “research” anything on the modern internet. The information is there — more of it than you could ever need — and having it makes the experience worse.

Decision Fatigue and the Information Consumer

Decision fatigue is the well-documented phenomenon where the quality of your decisions degrades over the course of a day as you make more of them.

It’s the reason judges grant more paroles after lunch than before it, the reason you’re more likely to impulse-buy something at the end of a shopping trip, and the reason your willpower feels depleted by evening.

Every piece of information you consume presents you with a cascade of micro-decisions:

  • Is this worth reading?
  • How carefully should I read it?
  • Should I save this for later?
  • Does this change what I think about the topic?
  • Should I share this with someone?
  • Does this require any action on my part?
  • How does this fit with what I already know?
  • Is the source reliable?
  • Am I being manipulated?

These decisions feel trivial individually, but they accumulate.

A knowledge worker processing a hundred emails, scrolling through a social media feed, reading a few articles, and monitoring a few Slack channels is making hundreds of these micro-decisions per hour. By mid-afternoon, their decision-making capacity is measurably depleted.

The insidious thing about decision fatigue in the context of information consumption is that it degrades exactly the capacity you need most: the ability to evaluate quality, relevance, and reliability.

When you’re decision-fatigued, you’re more likely to:

  • Accept information at face value
  • Miss inconsistencies or questionable claims
  • Rely on surface cues (who shared it, how many likes it has, whether the headline confirms your priors) rather than deeper evaluation
  • Default to “save for later” instead of making a real decision about the content’s value

In other words, the more information you process, the worse you get at processing information.

The curve doesn’t just flatten; it inverts. There’s a sweet spot — enough information to be well-informed, not so much that your evaluation capacity collapses — and most people are well past it.

The Dunning-Kruger Effect, Supercharged

The Dunning-Kruger effect is the observation that people with limited knowledge or competence in a domain tend to overestimate their abilities, while genuine experts tend to underestimate theirs.

The original 1999 paper by David Dunning and Justin Kruger has been contested on methodological grounds, and the effect is probably less dramatic than the popular “Mount Stupid” graph suggests. But the core insight — that a little knowledge can produce unwarranted confidence — is well-supported by subsequent research.

The modern information environment is a Dunning-Kruger accelerator.

Consider what happens when you read a few articles about, say, monetary policy.

You learn some terminology: quantitative easing, inflation targeting, the Phillips curve.

You encounter some arguments: hawks vs. doves, the debate about Modern Monetary Theory.

You develop opinions, perhaps strong ones, about what the Federal Reserve should do.

Now consider what a monetary economist knows.

They’ve spent years studying the mathematical models that underlie these concepts. They’ve read hundreds of papers on the empirical evidence. They understand the limitations and assumptions of each framework. They know which arguments are well-supported and which are fringe. They know the history — which policies were tried, what happened, why the results were ambiguous.

The gap between “read a few articles” and “studied for a decade” is enormous, but it doesn’t feel enormous from the “read a few articles” side.

You know the terminology. You can follow the arguments. You can hold your own in a conversation. From the inside, this feels like being informed.

It’s not. It’s being familiar, which is a very different thing.

The modern information environment makes this worse in two ways.

First, it provides surface-level coverage of everything.

A hundred years ago, if you wanted to learn about monetary policy, you had to find a textbook, go to a library, and invest substantial time. The effort required created a natural filter: you only learned about things you cared enough about to invest in.

Now, you can absorb the superficial version of almost any topic in fifteen minutes of scrolling. The barrier to surface-level exposure has essentially disappeared.

Second, the social media layer creates feedback loops that reinforce premature confidence.

You share your hot take on monetary policy. People who agree with you like and retweet it. People who disagree argue with you, which feels like engagement rather than correction. Your view gets reinforced, your confidence grows, and the gap between your confidence and your actual understanding widens.

This is not a problem that affects stupid people. It affects everyone, including very smart people who are used to learning quickly and forming competent opinions.

The smartest people may actually be more vulnerable, because they’re accustomed to their intuitions being correct and less likely to recognize when they’ve crossed into a domain where their intuitions are unreliable.

The Headline Illusion

A specific and particularly damaging version of the Dunning-Kruger problem is what I’ll call the headline illusion: the sense of understanding that comes from reading headlines, summaries, and abstracts without engaging with the underlying content.

This has always existed — people have always skimmed newspapers — but the modern information environment has elevated it to the primary mode of information consumption for many people.

Consider the pathway by which most people encounter information:

  1. A headline appears in a social media feed or news aggregator.

  2. The headline is designed to convey the conclusion without the nuance: “Study Shows X Causes Y” or “Why Z Is Dead.”

  3. You read the headline. You now have a data point: X causes Y. Z is dead.

  4. You may or may not click through. If you do, you probably skim the first few paragraphs, which typically reinforce the headline’s claim.

  5. You move on. The data point is filed away. X causes Y. Z is dead.

What you didn’t get:

  • The methodology of the study
  • The effect size
  • The confidence intervals
  • The author’s caveats
  • The replications (or lack thereof)
  • The competing interpretations
  • The context that makes the finding meaningful or trivial

What you did get: a confident-sounding conclusion stripped of everything that would let you evaluate it.

Multiply this by dozens of headlines per day, across many topics, over months and years, and you build up an impressive-seeming structure of knowledge that’s actually a house of cards.

You “know” hundreds of things. Very few of them are things you understand.

This matters because headline-level knowledge is systematically biased.

Headlines emphasize the novel, the dramatic, and the conclusive. They don’t say “Study Finds Small, Statistically Ambiguous Effect That May or May Not Replicate.” They say “Scientists Discover Key to Longer Life.”

The version of reality you construct from headlines is more dramatic, more certain, and simpler than actual reality. And because you’ve been exposed to so much information, you may have more confidence in this distorted picture than someone who’s read less but read more carefully.

The research on this is troubling.

A 2019 study published in the Journal of Experimental Psychology found that people who read only headlines subsequently expressed higher confidence in their understanding of the topic than people who read the full articles — even though their actual understanding was significantly worse.

Reading the headline didn’t just fail to inform; it created the illusion of being informed, which preempted the motivation to learn more.

This is worth sitting with for a moment. The most common mode of information consumption in the modern world — reading headlines — is not just ineffective. It’s actively counterproductive, because it replaces the motivation to learn with the feeling of having already learned.

Information Is Not Knowledge

This is perhaps the most fundamental point in this chapter, and it’s one that the information-saturated environment actively obscures: information and knowledge are not the same thing.

Information is data. It’s facts, claims, observations, and descriptions. It can be transmitted, stored, and copied without loss. It’s the raw material of understanding.

Knowledge is what you get when you process information through experience, reflection, and integration. It includes not just the facts but the connections between them, the context that makes them meaningful, the understanding of their limitations, and the ability to apply them in novel situations.

Knowledge cannot be transmitted directly; it must be constructed by each individual through the process of learning.

The modern information environment is spectacularly good at delivering information and spectacularly bad at facilitating the conversion of information into knowledge.

This is because the conversion process requires exactly the things the information environment disrupts:

  • Time
  • Reflection
  • Sustained attention
  • Integration with existing understanding
  • The willingness to sit with ambiguity while your brain does its slow work of making sense

Consider two people preparing for a decision about, say, adopting a new technology for their team.

Person A spends six hours reading everything they can find: blog posts, documentation, comparison articles, Twitter threads, forum discussions, vendor materials. They process a huge volume of information and can recite many facts about the technology.

Person B spends two hours reading three carefully selected sources: the official documentation, one critical review from a trusted expert, and one case study from a team in a similar context. They then spend an hour thinking about how the technology would fit their specific situation, discussing it with a colleague, and mapping it against their past experience with similar technologies.

Person A has consumed more information.

Person B has more knowledge.

And in my experience — which is the experience of someone who has been Person A many, many times and slowly, painfully learned to be Person B — Person B makes better decisions.

The difference isn’t intelligence or discipline. It’s the ratio of consumption to processing. Person A spent all their time inputting and no time integrating. Person B spent less time inputting but more time integrating, and the integration is where the value is.

When More Data Makes Worse Predictions

One of the most counterintuitive findings in decision science is that, in some domains, additional information makes predictions worse rather than better.

The classic demonstration comes from Paul Slovic’s 1973 study of horse racing handicappers.

Slovic gave experienced handicappers access to varying amounts of information about each race — from five variables to forty variables.

More information increased the handicappers’ confidence in their predictions but did not increase their accuracy. With forty variables, they were just as wrong as with five, but they were much more certain about it.

That finding deserves a moment of reflection: more data didn’t help them get the right answer. It helped them feel better about the wrong answer.

This finding has been replicated across many domains.

Philip Tetlock’s landmark study of expert political judgment, published in 2005 as Expert Political Judgment: How Good Is It? How Can We Know?, found that experts who consumed more information and considered more variables made worse predictions than those who relied on simpler models.

His famous finding that the average expert was roughly as accurate as a “dart-throwing chimpanzee” has been somewhat mischaracterized in popular accounts (the actual finding was more nuanced), but the core insight stands: more information did not equal better predictions, and the experts who performed best were those who updated their views most readily in response to new evidence rather than those who consumed the most evidence.

Why does this happen? Several mechanisms are at play.

Overfitting.

In statistics, overfitting occurs when a model is so well-adapted to historical data that it fails to generalize to new situations. The same thing happens with human judgment.

When you have a huge amount of information, you can construct a compelling narrative that explains everything you’ve seen. But the narrative may be fitting the noise rather than the signal, and when new data arrives that doesn’t match the narrative, you’re stuck.
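The statistical version of this is easy to demonstrate. The sketch below (NumPy, with noise levels and polynomial degrees I chose arbitrarily for illustration) fits the same noisy data twice: a straight line, and a polynomial flexible enough to pass through every training point. The flexible model explains everything it has seen — its training error is essentially zero — but that’s precisely the “compelling narrative fit to noise” problem, and its error on fresh data from the same process is far larger.

```python
import numpy as np

rng = np.random.default_rng(0)

# True process: y = x plus noise. A degree-9 polynomial through 10 training
# points can fit the noise exactly; a straight line cannot.
x_train = np.linspace(0, 1, 10)
y_train = x_train + rng.normal(0, 0.2, x_train.size)
x_test = np.linspace(0, 1, 200)
y_test = x_test + rng.normal(0, 0.2, x_test.size)

errors = {}
for degree in (1, 9):
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    errors[degree] = (train_mse, test_mse)
    print(f"degree {degree}: train MSE {train_mse:.4f}, test MSE {test_mse:.4f}")
```

The degree-9 fit “knows” every data point and generalizes worse — the numerical analogue of the handicappers with forty variables.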

Dilution.

In psychology, the dilution effect is the finding that adding irrelevant information to relevant information reduces the influence of the relevant information on judgment.

If I tell you that a student got an A on the math exam, you’ll predict they’re good at math. If I also tell you that the student has brown hair, drives a Honda, and enjoys cooking, your prediction of their math ability becomes less extreme — even though the additional information is completely irrelevant.

More information doesn’t just add signal; it adds noise that dilutes the signal.

Narrative construction.

Human beings are storytelling animals. Give us data and we’ll construct a story that explains it.

The more data we have, the more elaborate the story — but elaborate stories aren’t necessarily more accurate. They’re just more convincing, which is dangerous when you’re the one convincing yourself.

Anchoring and adjustment failure.

When you have a lot of information, you tend to anchor on the first pieces you encounter and adjust insufficiently as new information arrives.

The result is that early information has disproportionate influence, and the sheer volume of subsequent information creates the illusion that you’ve fully incorporated it when you haven’t.

The Expert vs. the Well-Read Amateur

Given everything above, you might wonder what actually distinguishes genuine expertise from well-informed amateurism. If more information doesn’t automatically lead to better understanding, what does?

The research on expertise, which spans several decades and many domains, points to a few key factors.

Structured knowledge.

Experts don’t just have more information; they have information organized into efficient, interconnected structures.

A chess grandmaster doesn’t remember more individual chess positions than an amateur; they have learned to recognize patterns that compress many positions into a few meaningful chunks.

A medical expert doesn’t have a larger database of symptoms; they have mental models that connect symptoms to causes in ways that allow efficient diagnosis.

This structured knowledge comes from sustained engagement with a domain over time, not from breadth of information consumption. Reading a thousand articles about chess won’t produce the pattern recognition that playing a thousand games will.

Calibrated uncertainty.

Experts are better at knowing what they don’t know. They can accurately assess their own confidence levels: when they’re sure, they’re usually right; when they’re unsure, they acknowledge it.

This calibration comes from repeated feedback — making predictions, seeing outcomes, and adjusting one’s confidence accordingly.

Amateurs, including well-read amateurs, tend to be poorly calibrated. They’re overconfident about topics they’ve read a lot about and underconfident about topics they haven’t. This miscalibration is actively worsened by consuming large amounts of surface-level information, which increases confidence without improving accuracy.

Mental models.

Experts have internalized the deep structure of their domain — the causal relationships, the constraints, the typical patterns of failure and success. These mental models allow them to reason about novel situations by analogy, to predict consequences, and to identify the variables that actually matter.

Mental models are built through a combination of study and practice, with practice being the critical ingredient. Reading about mental models doesn’t give you mental models, any more than reading about swimming teaches you to swim.

The information is necessary but not sufficient.

The ability to ignore.

Perhaps counterintuitively, experts are better at ignoring irrelevant information.

When a doctor examines a patient, they don’t give equal weight to every piece of information available. They focus on the diagnostically relevant signs and symptoms and suppress the irrelevant ones. This selective attention is a skill, and it’s one that develops through experience, not through consuming more information.

This is the central irony of information overload from the perspective of expertise: the thing that separates experts from amateurs isn’t how much they consume but how effectively they filter.

And the modern information environment, by presenting everything with equal prominence and urgency, actively undermines the filtering that expertise depends on.

When Depth Beats Breadth (and Vice Versa)

Not all information consumption decisions are the same. Sometimes you need to go deep; sometimes you need to go broad. Understanding when each approach is appropriate is one of the most important skills in information triage.

Depth Wins When:

You need to make a specific decision.

If you’re choosing a technology, evaluating a job offer, or diagnosing a problem, depth in the relevant domain will serve you far better than breadth across many domains. Read three sources carefully rather than skimming thirty.

The topic is complex and context-dependent.

Some topics can’t be meaningfully understood from summaries. Monetary policy, climate science, legal reasoning, and most engineering problems fall into this category. If you’re not willing to go deep enough to understand the nuances, you’re probably better off trusting an expert than forming your own opinion.

The stakes are high.

When errors are costly, surface-level understanding is dangerous. This is true whether you’re making investment decisions, evaluating medical options, or assessing security risks.

You’re building expertise.

If this is your domain — the thing you do professionally, the field you want to master — depth is non-negotiable. You need the structured knowledge, the calibrated uncertainty, and the mental models that only come from sustained, deep engagement.

Breadth Wins When:

You’re exploring.

In the early stages of investigating a new area, breadth helps you map the territory. You don’t know enough yet to know what to go deep on, so sampling widely helps you identify the most important sub-topics.

You need creative connections.

Innovation often comes from connecting ideas across domains. Breadth of exposure increases the probability of finding unexpected connections.

But note: this only works if you’re also going deep enough in at least one domain to recognize which connections are meaningful. Broad and shallow is a recipe for false analogies.

You’re maintaining general awareness.

Not everything deserves deep attention. For topics that are tangentially relevant to your work or interests, a headline-level understanding may be genuinely sufficient — as long as you recognize that it is headline-level and don’t mistake it for real understanding.

You’re scanning for threats and opportunities.

Strategic awareness — knowing roughly what’s happening in adjacent fields, markets, or domains — requires breadth. The goal isn’t to understand everything deeply; it’s to notice when something important is happening that warrants deeper investigation.

The Balance

The key insight is that depth and breadth serve different purposes, and the right balance depends on your goals.

Most people’s information consumption is too broad and too shallow, not because breadth is wrong but because the information environment makes breadth the path of least resistance and depth requires deliberate effort.

Scrolling is easy. Reading is harder. Thinking is hardest of all.

The environment is optimized for scrolling.

The “Informed but Not Wise” Failure Mode

There’s a particular failure mode that afflicts people who consume large amounts of information, and it’s worth naming explicitly because it’s both common and hard to see from the inside.

I call it the “informed but not wise” pattern.

It looks like this:

The person reads voraciously. They can speak intelligently about many topics. They have opinions — often strong ones — about politics, technology, business, science, culture. In conversation, they’re impressive. They know things. They can cite studies, reference articles, name-drop experts.

But when it comes to actually making decisions, predicting outcomes, or navigating complex situations, their performance is mediocre.

They’re paralyzed by the complexity they can see. They’re pulled in different directions by the competing arguments they’ve internalized. They constantly revise their views based on the latest thing they read, never settling into a stable-enough framework to act on.

I can describe this pattern in detail because I’ve lived it. There was a period in my career where I was consuming information at an impressive rate and producing decisions at a terrible one. I could explain every side of every issue. I could not, for the life of me, decide what to actually do about any of them.

The problem isn’t that they’re unintelligent. The problem is that they’ve substituted information consumption for thinking.

They’ve read a thousand opinions about what to do and formed none of their own. They know what everyone else thinks and have never sat quietly long enough to figure out what they think.

This is the deepest cost of information overload: not the time it consumes, not the attention it fragments, but the way it can crowd out the slow, effortful, uncomfortable process of developing your own judgment.

Judgment isn’t the same as knowledge.

Judgment is knowing what to do with knowledge — which pieces matter, how to weigh competing considerations, when to act and when to wait. It develops through reflection, experience, and the willingness to be wrong.

It does not develop through reading more articles.

The wisest people I know, across many domains, share a common trait: they consume information selectively and think about it extensively. They read less than you’d expect and reflect more. They’re comfortable with not knowing things, uncomfortable with pretending to know things, and ruthless about distinguishing what they actually understand from what they’ve merely been exposed to.

This is the standard to aim for. Not “maximally informed” but “wisely informed” — knowing enough to act well, and having the judgment to recognize when you don’t know enough.

Practical Implications

If this chapter has done its job, you’re now slightly less confident about the value of consuming more information and slightly more interested in consuming it better.

Here are some practical principles that follow from the research we’ve discussed.

Set information budgets for decisions.

Before researching a decision, decide in advance how much information you’ll consume: how many sources, how much time, what types. Then stop when you hit the budget, even if you feel like you “should” read more.

The research suggests that your early sources will provide most of the value and additional sources will mainly increase your confidence without improving your accuracy.

Distinguish information from knowledge.

When you finish reading something, ask yourself: what do I actually understand that I didn’t before? Can I explain it in my own words? Can I apply it to a new situation?

If you can only repeat what you read, you’ve acquired information. If you can use it, you’ve acquired knowledge.

Be honest about your level of understanding.

For any topic, honestly assess whether you have headline-level familiarity, working knowledge, or genuine expertise. Act accordingly: defer to experts on topics where you have only headline-level familiarity, even if you have strong feelings.

Especially if you have strong feelings.

Invest in depth for topics that matter.

Choose a few areas where you’ll develop real understanding, and protect the time and attention required to do so. Accept that this means being ignorant about many other things. This is a feature, not a bug.

Create processing time.

After consuming information, give yourself time to think about it before consuming more. Even fifteen minutes of reflection after a reading session is worth more than fifteen minutes of additional reading.

This is the hardest habit to build, because reflection doesn’t feel productive. You’re not producing anything. You’re not consuming anything. You’re just… thinking.

It feels like wasting time. It’s the opposite.

Track your prediction accuracy.

One of the best ways to calibrate your confidence is to make predictions and check whether they come true. This reveals the gap between what you think you know and what you actually know, and it’s humbling in a productive way.

Write down what you think will happen. Wait. Check. Repeat. The pattern of your errors will teach you more about your information processing than any amount of additional reading.
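One lightweight way to score such a log is the Brier score: the mean squared gap between the probabilities you stated and what actually happened. Zero is perfect; always saying “50/50” scores 0.25; confident misses are punished hardest. The function below is a minimal sketch, and the prediction log in it is hypothetical.

```python
def brier_score(log):
    """Mean squared error between forecast probabilities and outcomes
    (1 = happened, 0 = didn't). Lower is better; 0.25 is what always
    guessing 50% earns; confident wrong calls are punished hardest."""
    return sum((p - outcome) ** 2 for p, outcome in log) / len(log)

# Hypothetical log of (stated probability, actual outcome).
log = [(0.9, 1), (0.8, 1), (0.7, 0), (0.6, 1), (0.9, 0)]
print(f"Brier score: {brier_score(log):.3f}")  # the 0.9-confident miss dominates
```

A few dozen entries like this, scored honestly, will tell you more about your calibration than any amount of additional reading.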

Key Takeaways

  • More information increases confidence without necessarily increasing accuracy. The gap between feeling informed and being informed is widened by high-volume consumption.

  • The paradox of choice applies to information: too many sources lead to worse selection, slower decisions, and less satisfaction with the result.

  • Decision fatigue from processing large volumes of information degrades your ability to evaluate the very information you’re consuming.

  • Surface-level familiarity with many topics creates a Dunning-Kruger trap: enough knowledge to be confident, not enough to be competent.

  • Reading headlines creates an illusion of understanding that preempts the motivation to learn more deeply.

  • In many domains, additional information beyond a certain threshold makes predictions worse, not better. The mechanisms include overfitting, dilution, and narrative construction.

  • What distinguishes experts from well-read amateurs is structured knowledge, calibrated uncertainty, internalized mental models, and the ability to selectively ignore irrelevant information.

  • The deepest cost of information overload isn’t time or attention — it’s the crowding out of reflection and judgment development.