Illustrated grid of cognitive bias icons representing decision-making errors, social biases, and memory distortions
Psychology • 14 min read

Cognitive Biases: The Complete List
With Examples You'll Recognize

April 2026 • by NerdSip Team

TL;DR

A categorized reference of 25 cognitive biases covering decision-making, social judgment, memory, and perception. Each bias includes a clear explanation and a real-world example you'll recognize from your own life. Research citations from Kahneman, Tversky, Loftus, Milgram, and more.


Your brain lies to you. Not occasionally, not in edge cases, but constantly, confidently, and without any warning signal that it's happening. Every decision you make passes through a filter of shortcuts, distortions, and pattern-matching errors that evolution built into your cognition millions of years ago.

These are cognitive biases. They're not personality flaws. They're features of human hardware, baked into the neural architecture of every person who has ever lived. Kahneman and Tversky spent decades mapping them. Behavioral economists built an entire discipline around them. And yet most people can't name more than three.

This is the reference list. Twenty-five biases, grouped by category, each with a clear definition and an example you'll recognize from your own life. Bookmark it. You'll need it more often than you think.

Decision-Making Biases

These biases distort how you evaluate options, weigh evidence, and choose between alternatives. They operate fastest when the stakes are highest.

1. Confirmation Bias

You don't search for truth. You search for evidence that supports what you already believe, and you dismiss evidence that contradicts it. This isn't laziness. It's the default operating mode of human cognition. Peter Wason demonstrated it in 1960 with his famous rule-discovery ("2-4-6") task, and thousands of studies since have confirmed: people treat supporting evidence and opposing evidence with radically different levels of scrutiny.

Example: You suspect a coworker is underperforming. From that moment, you notice every missed deadline and ignore every successful delivery. By month's end, you have a mental case file that feels airtight, built entirely from selective attention.

2. Anchoring Bias

The first number you encounter in any decision disproportionately shapes every number that follows. Tversky and Kahneman (1974) showed that even random, irrelevant anchors shift people's estimates by 20-40%. The anchor doesn't need to be meaningful. It just needs to arrive first.

Example: A car dealer shows you a vehicle listed at $45,000, then offers it at $38,000. You feel like you're getting a deal. The car's actual market value is $33,000, but that number never entered your head because $45,000 got there first.

3. The Sunk Cost Fallacy

You continue investing in something because of what you've already spent, not because of what you'll gain. Hal Arkes and Catherine Blumer (1985) showed that people who paid full price for a theater subscription attended more shows than those who got a discount, even when neither group particularly enjoyed the performances. Past investment hijacks future decisions.

Example: You've watched three seasons of a show you stopped enjoying in season two. You keep watching because you've "already invested this much time." The time is gone regardless. But your brain won't let you walk away from a sunk cost.

4. Loss Aversion

Losing something hurts roughly twice as much as gaining the equivalent feels good. Kahneman and Tversky's Prospect Theory (1979) proved this asymmetry is fundamental to human decision-making, not a quirk. Your brain treats losses and gains on completely different scales.

Example: You find $20 on the sidewalk. Nice. You lose a $20 bill from your wallet an hour later. The day feels like a net negative, even though you're exactly where you started.

5. The Framing Effect

The same information, presented differently, produces different decisions. Building on Tversky and Kahneman's (1981) framing research, McNeil and colleagues (1982) showed that people chose a medical treatment described as having a "90% survival rate" far more often than the identical treatment described as having a "10% mortality rate." Same data. Opposite reactions. How something is framed changes what you choose.

Example: A yogurt labeled "95% fat-free" feels healthy. A yogurt labeled "contains 5% fat" feels indulgent. You're reading the same nutritional fact and reaching different conclusions.

6. The Availability Heuristic

You judge how likely something is based on how easily you can recall an example. If it comes to mind quickly, you assume it's common. Tversky and Kahneman (1973) demonstrated this with a simple question: are there more English words that start with "K" or have "K" as the third letter? Most people say "start with K" because those words are easier to recall. The correct answer is the opposite.

Example: After watching a documentary about plane crashes, you feel genuinely nervous about an upcoming flight, even though you drove to the airport through statistically far more dangerous traffic without a second thought.

7. The Overconfidence Effect

People consistently overestimate the accuracy of their own judgments. When asked to give 90% confidence intervals for numerical estimates, people capture the true value only about half the time (Lichtenstein, Fischhoff, and Phillips, 1982). You are far less right than you feel.

Example: You're "absolutely certain" you know the way to the restaurant without GPS. Twenty minutes later, you're parked in front of a building that closed in 2019, still reluctant to admit you were wrong.

Social and Group Biases

These biases warp how you perceive other people, assign blame, and navigate group dynamics. They are especially dangerous because they feel like accurate social perception.

8. The Halo Effect

One positive trait causes you to assume everything else about a person is also positive. Edward Thorndike identified this in 1920 when he noticed that military officers who rated a soldier as physically attractive also rated them as more intelligent, more capable, and more honest, with no evidence for any of those additional traits.

Example: A candidate walks into a job interview with a firm handshake, great posture, and a well-tailored suit. Before they've answered a single question, you've already decided they're competent.

9. In-Group Bias

You automatically favor people you perceive as belonging to your group and view outsiders with suspicion. Henri Tajfel's minimal group experiments in the 1970s showed that even arbitrary, meaningless group assignments (like preferring one abstract painting over another) were enough to trigger favoritism and discrimination.

Example: You meet someone at a conference who went to the same university as you. Instantly, you trust them more, laugh at their jokes more readily, and assume shared values. You've known them for four minutes.

10. The Fundamental Attribution Error

When someone else makes a mistake, you attribute it to their character. When you make the same mistake, you attribute it to circumstances. Lee Ross coined the term in 1977. This asymmetry is so consistent across cultures and contexts that it earned the word "fundamental" in its name.

Example: A driver cuts you off in traffic. They're reckless, selfish, probably a terrible person. When you cut someone off because you almost missed your exit, you had a perfectly good reason.

11. The Bandwagon Effect

The probability that you'll adopt a belief increases with the number of people who already hold it. This isn't just social conformity. It's a deep cognitive shortcut: if many people believe something, your brain treats that as evidence of its truth. Solomon Asch's conformity experiments (1951) showed that people will deny the evidence of their own eyes rather than disagree with a unanimous group.

Example: A new restaurant opens. It's empty for weeks. Then one night it's full, and suddenly there's a line out the door every evening. Nothing changed except visibility. Crowds attract crowds.

12. The Authority Bias

You give disproportionate weight to the opinion of someone perceived as an authority figure, even when their expertise is irrelevant to the topic at hand. Stanley Milgram's obedience experiments (1963) showed this taken to its extreme: ordinary people administered what they believed were dangerous electric shocks simply because a man in a lab coat told them to continue.

Example: A celebrity endorses a financial product. They have no background in finance. You consider the investment anyway, because the endorsement came from someone famous.

13. The Just-World Hypothesis

You want to believe the world is fair, that good things happen to good people and bad things happen to bad people. Melvin Lerner (1980) showed this belief is so strong that people will blame victims of misfortune rather than accept that bad things can happen randomly. It's a defense mechanism: believing the world is just means believing you're safe.

Example: You hear about someone who was laid off and your first instinct is to wonder what they did wrong, rather than consider that layoffs are often arbitrary and structural.

Memory and Recall Biases

Your memory is not a video recorder. It's a reconstruction engine that edits, distorts, and fabricates with total confidence. These biases explain why.

14. The Misinformation Effect

Exposure to misleading information after an event alters your memory of the original event. Elizabeth Loftus demonstrated this in her landmark 1974 car crash study: participants who were asked how fast the cars were going when they "smashed" into each other gave higher speed estimates and were more likely to "remember" seeing broken glass (there was none) than those asked about cars that "contacted" each other. A single word rewrote their memory.

Example: A friend recounts a party you both attended, including a detail that didn't happen. Two weeks later, you remember that fabricated detail as something you personally witnessed.

15. Hindsight Bias

After an event occurs, you believe you predicted it all along. Baruch Fischhoff (1975) called this the "knew-it-all-along" effect. The problem isn't just smugness. It's that hindsight bias makes you unable to accurately evaluate your past predictions, which means you can't improve your judgment over time.

Example: A startup fails. Everyone who expressed doubt says, "I always knew it wouldn't work." Check their text messages from launch day. You'll find congratulations and excitement, not skepticism.

16. The Peak-End Rule

You judge an experience almost entirely by its most intense moment and how it ended, not by the sum of every moment within it. Kahneman demonstrated this with patients undergoing colonoscopies: adding extra mild discomfort at the end (making the procedure longer) actually improved patients' ratings of the experience, because it created a less painful ending.

Example: A two-week vacation was mostly unremarkable, but the final dinner at a rooftop restaurant was stunning. You remember it as the best trip of your life.

17. The Rosy Retrospection Bias

You remember past events as more positive than they actually were. Mitchell et al. (1997) tracked people before, during, and after vacations and found that their post-trip memories were consistently rosier than their real-time reports. Your brain applies a warm filter to the past.

Example: You remember college as the best years of your life. Your journal from that period tells a different story: stress, loneliness, financial anxiety, and at least one existential crisis per semester.

18. The Zeigarnik Effect

Incomplete tasks occupy more mental space than completed ones. Bluma Zeigarnik (1927) noticed that waiters remembered active orders perfectly but forgot completed ones almost immediately. Your brain keeps unfinished business in a persistent open loop, demanding attention until closure arrives.

Example: You can't stop thinking about an email you forgot to send at work, but you've completely forgotten the ten emails you successfully sent that morning.

Perception and Attention Biases

These biases shape what you notice, how you interpret it, and what you believe is real. They operate before conscious thought even begins.

19. The Dunning-Kruger Effect

People with limited knowledge in a domain overestimate their competence, while genuine experts underestimate theirs. Kruger and Dunning (1999) found that the least competent performers in logic, grammar, and humor estimated their ability in the 60th-70th percentile when they actually scored at around the 12th. Ignorance doesn't just limit knowledge. It limits the ability to recognize what you don't know.

Example: After watching a 20-minute YouTube tutorial on investing, you feel ready to pick individual stocks. Someone with a finance PhD hedges every statement with caveats and uncertainty.

20. The Baader-Meinhof Phenomenon (Frequency Illusion)

Once you notice something for the first time, you suddenly see it everywhere. This is selective attention combined with confirmation bias, not a change in reality. Arnold Zwicky coined the linguistic term "frequency illusion" in 2005. The thing was always there. Your brain simply wasn't filtering it in.

Example: You learn a new word, "defenestration." Within a week, you encounter it in a podcast, a novel, and a crossword puzzle. You wonder if the universe is trying to tell you something. It isn't. You just have a new filter.

21. The Spotlight Effect

You believe other people notice your appearance, behavior, and mistakes far more than they actually do. Thomas Gilovich et al. (2000) had students wear embarrassing T-shirts to class and estimate how many classmates noticed. They predicted about 50%. The real number was under 25%. Everyone is starring in their own movie. Nobody is watching yours as closely as you think.

Example: You spill coffee on your shirt before a meeting and spend the entire hour convinced everyone is staring at the stain. After the meeting, nobody mentions it. Nobody noticed.

22. Attentional Bias

Your emotional state determines what you notice. Anxious people detect threats faster. Depressed people notice negative stimuli more readily. MacLeod, Mathews, and Tata (1986) demonstrated this with the dot-probe task: anxious patients shifted their attention toward threat-related words faster than controls did. Your attention is not neutral. It's shaped by what you feel.

Example: You're worried about money. Suddenly, every conversation seems to be about finances. Every headline mentions recession. You're not imagining it, exactly. You're just filtering the world through a lens of financial anxiety.

23. The IKEA Effect

You overvalue things you helped create, regardless of their objective quality. Norton, Mochon, and Ariely (2012) found that people valued self-assembled IKEA furniture nearly as much as professionally built pieces, even when the self-assembled versions were visibly inferior. Labor creates love. Effort inflates perceived worth.

Example: You spend three hours on a homemade birthday cake that looks, honestly, like a geological accident. You still think it's better than anything from the bakery. Your family politely disagrees.

24. The Curse of Knowledge

Once you understand something, you find it almost impossible to imagine not understanding it. This makes experts terrible at explaining their field to beginners. Camerer, Loewenstein, and Weber coined the term in 1989, and Elizabeth Newton's 1990 "tapping" experiment made it vivid: people who tapped the rhythm of a song estimated that listeners would identify it 50% of the time. The actual identification rate was 2.5%.

Example: A software engineer explains a bug to a non-technical colleague using three acronyms and a metaphor that assumes knowledge of database architecture. The colleague nods blankly. The engineer genuinely can't understand what's confusing.

25. Status Quo Bias

You prefer things to stay the way they are, even when change would objectively benefit you. Samuelson and Zeckhauser (1988) documented this across insurance choices, investment portfolios, and policy decisions. The current state feels safer because it's known. Change introduces uncertainty, and your brain treats uncertainty as threat.

Example: You've been with the same phone carrier for eight years. A competitor offers better coverage at a lower price. You stay anyway. Switching feels like effort, and the current plan feels "fine," even though you complain about it monthly.

Why Knowing the List Isn't Enough

Here's the uncomfortable truth about cognitive biases: awareness alone doesn't fix them. Knowing that confirmation bias exists doesn't stop you from falling for it. Recognizing loss aversion in a textbook doesn't prevent it from hijacking your next negotiation.

The gap between knowing and noticing is enormous. Biases operate below conscious thought, faster than deliberation. They feel like clear perception, not errors. The only thing that reliably reduces their influence is repeated, spaced practice in identifying them across different contexts.

That's exactly what NerdSip is built for. It generates spaced-repetition lessons on individual biases, forcing you to identify them in realistic scenarios: conversations, business decisions, relationship dynamics, media consumption. Not passive reading. Active pattern recognition, with enough repetition that spotting biases becomes reflexive rather than theoretical.

Download NerdSip free and start a course on cognitive biases. Five minutes a day. The goal isn't to memorize a list. It's to build the instinct to catch your own brain in the act.


About the author: The NerdSip Team writes research-backed guides on psychology, learning science, and cognitive performance. NerdSip is a gamified micro-learning app that turns complex topics into 5-minute AI-generated lessons with spaced repetition, progression systems, and social features. Learn more at nerdsip.com.

Frequently Asked Questions

How many cognitive biases are there?

Researchers have cataloged over 180 distinct cognitive biases, though the exact number depends on how broadly you define a "bias." Many overlap or emerge from the same underlying heuristic. The most influential taxonomy comes from Buster Benson's Cognitive Bias Codex, which groups them into four categories: too much information, not enough meaning, need to act fast, and what should we remember. This article covers 25 of the most well-documented biases with the strongest research support.

What is the most common cognitive bias?

Confirmation bias is widely considered the most pervasive cognitive bias. It affects virtually every domain of thought, from political opinions to medical diagnoses to hiring decisions. Kahneman and other researchers have found it operates automatically and is extremely difficult to override, even when you know about it. Other strong contenders include the availability heuristic and the anchoring effect.

Can you overcome cognitive biases?

You can't eliminate them. They're hardwired into how your brain processes information. But you can reduce their impact through structured decision-making frameworks, actively seeking disconfirming evidence, slowing down before important decisions, and regular practice identifying biases in real scenarios. Research by Morewedge et al. (2015) found that interactive training games reduced cognitive biases measurably, and the effect persisted for at least 8 weeks.

What is the difference between a cognitive bias and a logical fallacy?

A cognitive bias is a systematic pattern of deviation from rationality in judgment, rooted in how your brain processes information. It happens automatically, often unconsciously. A logical fallacy is a specific error in reasoning within an argument, like a straw man or false dichotomy. Biases are psychological; fallacies are structural. You can commit a logical fallacy without any cognitive bias driving it, and biases can distort your thinking without producing a formal fallacy.

Learn to Spot Your Blind Spots

NerdSip generates spaced-repetition lessons on each cognitive bias so you actually recognize them in conversations, negotiations, and decisions. Download free and start your first course in under a minute.