Thinking Errors or Cognitive Distortions
Thinking errors, also known as cognitive distortions, are mistakes in our thought processes that we all make from time to time. These patterns are generally unhelpful, though context matters. Many thinking errors were once adaptive in our ancestral environment, but now cause problems in modern contexts. What counts as a “thinking error” may also depend on your goals and values.
When we get good at spotting these potential mistakes, we can start disputing or challenging them in our daily lives. The fewer thinking errors we make, the less likely we are to form irrational or self-defeating beliefs, and the less likely we are to engage in problematic behaviors or experience distressing emotions.
This handout uses the perspectives of Stoic philosophy and Rational Emotive Behavior Therapy (REBT) to address mental health concerns; within this combined perspective we look for ways of thinking that are irrational or unhelpful. Also see the ABCDE model of REBT, which formalizes how these thinking errors are tied to problematic consequences in our lives (depressive symptoms, anxiety symptoms, relationship problems, anger, behavioral problems, etc.) and how we can replace them with more rational/helpful ways of thinking.
How to Use This Handout:
1. Read through all entries, starring the ones you recognize in yourself
2. Focus on the Foundational Errors you starred—addressing these reduces multiple specific errors
3. Notice the cross-references—they reveal how errors interact and compound
4. Use this as a reference: when you catch yourself in distress, consult your starred items
5. Request deeper handouts on specific errors from your therapist for detailed work
Foundational Errors
Before diving into specific thinking errors, let’s look at some of the larger meta-level philosophical or logical errors that underlie them. Understanding these foundational patterns helps explain why we fall into the specific traps listed in the next section—and why addressing one foundational error can reduce multiple specific errors at once.
Treating impressions like reality — We often treat automatic thoughts and emotions (impressions) as facts, when these are just interpretations of our situation or of reality. When we experience sadness, for example, we think and act as if this feeling is all there is to say about the situation. This is especially common with anxiety (feeling anxious makes us think danger is present) and anger (feeling wronged makes us certain we were wronged). Reality is more complex. Automatic thoughts and emotions are temporary, automatic evaluations. How we evaluate something today will often differ from how we evaluate it in a few days, a month, a year, or longer.
If we treat impressions like reality, we become overly reactive to them—that is, we act before we have time to think critically about the best move, much like a chess player who loses games by moving instantly instead of studying the board. When we experience impressions, it is in our best interest to ask if they are rational or helpful. If they are not rational/helpful we can then dispute and replace them, transform them into something more helpful, or simply observe them without engaging (decentering). A common adage used in psychotherapy is “Don’t believe everything you think; don’t trust everything you feel”—remembering this can help you learn not to accept impressions uncritically.
Related cognitive biases: Emotional reasoning bias, affect heuristic (using emotional responses as information about reality)
Category error (reification/concretizing) — A category error occurs when we treat something as belonging to one category or type when it actually belongs to another. In the context of thinking errors, the most common forms are:
- Treating processes as things: Experiencing ourselves or others as unchanging, concrete entities rather than as ongoing processes. We are more like rivers than rocks—constantly flowing and changing. When we say “I am an anxious person” or “He is a failure,” we’re freezing a process into a thing, creating the illusion that traits or behaviors define our permanent essence, when in reality we are always capable of change and growth.
- Mistaking maps for territory: Treating abstract explanatory models, diagnoses, or conceptual frameworks as if they were concrete realities rather than useful descriptions. A diagnosis like “depression” or “ADHD” is a map—a descriptive tool—not the territory of your actual experience. When we say “I am bipolar” or “My OCD made me do it,” we reify the model into a force or entity that controls us, rather than recognizing it as a shorthand description of patterns we experience.
Philosophical context: Buddhism and process philosophy both emphasize this fundamental insight: what we call a “self” is actually a dynamic process, not a fixed entity. This aligns with Heraclitus’ panta rhei (everything flows), Buddhist anatman (no-self doctrine), and Whitehead’s process philosophy, which treats entities as temporal occasions rather than enduring substances. Similarly, general semantics (Korzybski) warns against confusing the map (our models and language) with the territory (reality itself).
Related cognitive biases: Essentialism bias, correspondence bias (attributing behavior to fixed traits rather than situations), nominal fallacy (believing that naming something explains it)
Examples:
- Process reification: Tim labels himself as “a bad person” based on past actions, treating his identity as a permanent fact rather than recognizing he is a person who has done both good and bad things and continues to evolve.
- Diagnostic reification: Marcus thinks “My ADHD makes me procrastinate” rather than “I have attention regulation patterns that make certain tasks more difficult, and I can develop strategies to work with these patterns.”
Overgeneralization (hasty generalization) — Drawing broad conclusions from limited evidence or small samples. This works in both directions: assuming what’s true of a few cases is true of all cases (specific to general), or assuming what’s true in general must be true in a specific case (general to specific). Our brains naturally want to find patterns quickly, but this leads us to make faulty generalizations from insufficient data. Sample size and representativeness matter—one experience, or even several experiences, may not represent the full picture.
Related cognitive biases: Availability/salience heuristic (overweighting easily recalled examples), representativeness heuristic (assuming small samples represent populations)
Example: After three difficult conversations with his mother-in-law, David concludes “She hates me and always will.” He’s overgeneralizing from a small sample to a universal truth. Conversely, someone might think “Most people recover from depression, so I don’t need to take mine seriously”—overgeneralizing from the general to their specific case without considering individual factors.
Worrying about externals — “Externals” are things we cannot control. We cannot control the weather or many events in the world; although we can try to be a likeable person, we are ultimately not in control of what other people think of us; we cannot control what we did in the past or other things that already happened; and we cannot fully control what other people believe, do, or say. Worrying about these things takes away from our ability to focus on what we can control, such as our preparation, how we think and what we believe, and what we do.
While we can’t control externals, we will inevitably have emotional responses to them—the key is not to ruminate or let these responses dominate our thinking. This distinction comes from Epictetus’ Enchiridion: “Some things are up to us, some are not up to us.” The dichotomy of control is fundamental to Stoic ethics and psychology.
Related cognitive biases: Illusion of control (overestimating control over external events)
Example: Sid is extremely sad because a girl he likes does not like him back. He ruminates on her feelings, which he cannot control, instead of focusing on what he can control, such as taking care of himself and investing in his goals and other relationships.
Dualistic or black-and-white thinking (aka absolutistic thinking) — Seeing things as either totally true or totally false, totally good or totally bad, without seeing shades of grey or nuance. Often involves using words like “always,” “never,” “every,” and “all.”
Classical logic assumes every statement is either completely true or completely false—but reality is often more nuanced. Most qualities exist on a spectrum (intelligence, attractiveness, morality) rather than as binary categories. Aristotle recognized this with his “doctrine of the mean”—virtue and wisdom typically lie between extremes on a spectrum, not at absolute poles. The sorites paradox (heap paradox) illustrates why sharp boundaries are illusory: if you remove grains of sand from a heap one by one, at what exact point does it stop being a heap? There’s no clear line because “heap” admits of degrees.
Related cognitive biases: Binary bias, false dichotomy, splitting (in psychodynamic terms)
Example: Larry believes that his political views are 100% true and that everyone else is completely wrong, failing to recognize that complex issues often involve trade-offs and multiple valid perspectives.
Magnifying our pain — Pain, whether psychological or physical, is unpleasant by definition. Pain is also unavoidable, and it is often necessary for growth and meaning. But while pain is inevitable, we do not have to multiply it unnecessarily—for example, we do not have to turn discomfort into suffering. The thinking errors in this handout represent ways we often amplify our pain into intense suffering: catastrophizing (see the Common Thinking Errors section) is one way we increase our pain, non-acceptance is another, and so on. The Parable of the Two Arrows from the Buddhist tradition provides a helpful way of remembering this: the first arrow is the unavoidable pain of an event; the second arrow is the suffering we add through our own reaction to it.
Over-confidence in our perceptions / knowledge — Sometimes we trust our perceptions and our knowledge claims too much. This can be a problem because our perceptions are not always reliable (optical illusions demonstrate this), and because sometimes we are wrong about the things we think we know. To counter this foundational error, we should seek to 1) confirm or reality-test our perceptions (e.g., Do others report having the same perceptions? Do instruments confirm our perceptions? Do our perceptions seem logical, rational, and believable?) and 2) practice epistemic humility (epistemic = related to epistemology—that is, the philosophical study of how we can know things).
Epistemic humility roughly means being appropriately humble about what we claim to know. This includes: not claiming certainty when we can’t be certain, being willing to revise beliefs when presented with better evidence, and recognizing that strong or extraordinary claims require strong evidence. The Chesterton’s fence principle (do not remove a fence until you understand why it was put up) provides an illustrative example of why epistemic humility is crucial when making reforms. This over-confidence thinking error relates to fallibilism in epistemology—the view that we can be wrong even about beliefs we’re highly confident in. As Socrates noted, acknowledging what we don’t know is the beginning of wisdom.
Related cognitive biases: Dunning-Kruger effect, overconfidence bias, hindsight bias
Monocausal thinking (single-cause fallacy) — Assuming complex phenomena have single causes when reality is almost always multicausal. We’re drawn to simple explanations because they’re easier to grasp, but oversimplifying causation leads us to miss important factors and pursue ineffective solutions. This is especially problematic when explaining mental health issues, relationship problems, or personal struggles—these almost always result from multiple interacting causes (genetics, environment, choices, circumstances, relationships, biology, etc.).
Related cognitive biases: Fundamental attribution error (in causal attribution), narrative fallacy (preferring simple causal stories)
Example: Marcus believes “I’m depressed because of my childhood trauma,” while overlooking other contributing factors like his current isolation, poor sleep habits, lack of exercise, recent job stress, and possible biological factors. By focusing on a single cause, he misses opportunities to address factors within his control.
Violating the principle of parsimony (Occam’s Razor) — Accepting complex, elaborate explanations when simpler explanations fit the evidence just as well or better. The principle of parsimony holds that when multiple explanations could account for something, we should prefer the one that requires the fewest assumptions or the least complexity. This doesn’t mean the simplest explanation is always right—but it means we shouldn’t accept unfounded conspiracy theories or complex explanations without good reason. Overly complicated explanations often smuggle in assumptions we cannot rationally justify with evidence or sound logical inference.
When someone doesn’t text you back, it’s more parsimonious (requires fewer assumptions) to think they’re busy than to think they’re orchestrating an elaborate plan to avoid you.
NOTE: While monocausal thinking oversimplifies causation, violating parsimony does the opposite—it overcomplicates explanation. The goal is appropriate complexity: explanations should be as simple as possible, but no simpler (Einstein’s refinement of Occam’s Razor).
Related cognitive biases: Conjunction fallacy (judging complex scenarios as more probable than simple ones), clustering illusion (seeing patterns in randomness)
Example: When Arthur doesn’t get promoted, he develops an elaborate theory involving office politics, favoritism, and people conspiring against him. The more parsimonious explanation—that other candidates had more experience or better interview performance—requires far fewer assumptions but feels less satisfying because it doesn’t make him the center of a narrative.
Confirmation bias — Seeking out information that confirms our beliefs while avoiding information that challenges them, often because we tie parts of our identity to our beliefs or are emotionally invested in them being true.
Related cognitive biases: Myside bias, belief perseverance, selective exposure
Example: Jennifer only reads news sources that align with her political views and dismisses any contradictory information as “fake news,” preventing her from considering other perspectives.
Non-acceptance — Refusing to accept reality on some level: for example, that people (including ourselves) are fallible and make mistakes, or that the world is complicated and messy and that bad things happen. Non-acceptance may increase or prolong our distress.
REBT recommends unconditional acceptance as an antidote to this type of thinking. This means accepting ourselves as fallible and imperfect, but still deserving of kindness and respect; accepting others as fallible and imperfect, but still deserving of kindness and respect; and accepting life/the world as complicated and messy, while also seeing the good and the beauty.
NOTE: Acceptance means recognizing reality as it is, not approving of what happened or giving up on change. We can accept that something occurred, and was or is outside of our control, while still working to prevent future harm. (See fundamental attribution error and just world hypothesis in the Common Thinking Errors section.)
Related philosophical concepts: Radical acceptance (DBT), amor fati (Nietzsche/Stoicism)
Example: Maria was mistreated by a coworker six months ago. She remains intensely angry and upset because she cannot accept that people sometimes behave badly, preventing her from moving forward.
Common Thinking Errors
Catastrophizing — Taking a relatively minor event and interpreting it as a major catastrophe, often by imagining only the worst possible outcomes. Often involves overgeneralization (extrapolating from limited evidence), dualistic thinking (if something bad happens, everything is ruined), and magnifying our pain. See Foundational Errors section.
Example: Annie makes a mistake at work that upsets several coworkers. She concludes that no one likes her, that she is a failure, and that she will never be good at anything.
Jumping to conclusions — Believing a certain conclusion is true without sufficient evidence. This often involves overgeneralization—see Foundational Errors section.
Example: Sarah has felt her heart beating harder and faster than usual for two days. Because of this, she is convinced that she has a life-threatening heart condition.
Personalization — Interpreting another person’s or persons’ statements or behaviors as directed at oneself, without sufficient evidence. Often involves jumping to conclusions and overconfidence in our perceptions. See Foundational Errors section.
Example: Jeff thinks his girlfriend was more brief with him than usual on the phone, and that she seemed upset. Jeff believes she acted in this way because she is mad at him.
Emotional thinking — Thinking under the influence of powerful emotions often leads us to jump to conclusions and react automatically without rational thought, frequently in ways that run counter to our values or long-term goals. Involves treating impressions like reality—mistaking emotional reactions for facts about the situation. See Foundational Errors section.
Example: Bill is in a monogamous relationship with Tracy. Bill saw Tracy hug another man in a restaurant parking lot. Bill flew into a rage and drove his car into the other man’s car.
Bandwagon fallacy — Believing something just because a lot of other people do, without thinking critically about whether it’s actually true. Bandwagon thinking is not just potentially factually wrong; it may also indicate a lack of moral courage to go against the majority.
Example: Ted has heard dozens of people say that he is unintelligent. Because of this, Ted believes this to be true.
Groupthink — Going along with what others in our group believe or do in order to fit in, often leading people to do things they would never do on their own.
Example: Terrence does not know much about Sally, except that his group of friends hates her. To fit in, Terrence starts to be mean to Sally and tells everyone that he hates her.
Paranoid thinking and extreme mistrust — Taking healthy skepticism too far by rejecting expert consensus or well-supported information without good counter-evidence.
Example: Despite overwhelming scientific consensus and evidence, Marcus refuses to believe any information from medical experts because he thinks they are all part of a conspiracy.
Fundamental attribution error — Attributing problematic behavior in others to their personality while attributing our own problematic behavior to our circumstances, making us harder on other people than we are on ourselves. Involves overgeneralization (generalizing from single instances to character traits) and category error (treating dynamic persons as fixed entities). See Foundational Errors section.
Example: When Amy gets cut off in traffic by John, she believes it is because he is a rude, careless person. However, when she does the same thing a few days later, she says it was only because she was in a hurry.
Rigid expectations (including “should statements”) — Having very rigid or unrealistic demands about how things have to be, often expressed through “should,” “must,” or “ought” statements. Often involves dualistic thinking (things must be exactly one way) and non-acceptance (refusing to accept reality as it is). See Foundational Errors section.
Example: Craig thinks he must always be seen as nice and friendly. Because of this Craig is a people pleaser who gets pushed around at his job.
Magical thinking — Believing that there is a causal relationship between two or more things when there is no solid evidence to support this belief. Involves overconfidence in our perceptions and violating parsimony (preferring complex causal explanations over simpler ones). See Foundational Errors section.
Example: Derek thinks he can influence what happens to him by thinking in certain ways. Because of this Derek starts to worry when he has anxious thoughts about something bad happening, thinking that his anxious thoughts will cause those bad things to happen to him.
Mind reading — Assuming we know what someone is thinking based on limited clues, without sufficient evidence.
Example: Elise has known Valerie for over a year and considers her a good friend. Elise finds out that Valerie is having a party, but she has not been invited. Elise thinks that Valerie hates her because she did not invite her to the party.
Negativity bias (discounting the positive) — Over-emphasizing negative things about ourselves, our lives, and the world in general, while failing to see the positive things. Involves confirmation bias—selectively attending to information that confirms our negative view. See Foundational Errors section.
Example: Eric thinks he is a horrible person. He thinks of all the things he has done wrong in his life, and in his mind he labels himself in all kinds of negative ways.
Positivity bias (discounting the negative) — Unconsciously ignoring bad things and only focusing on the good, which can cause us to ignore important warning signs or to be naïve. Involves confirmation bias—selectively attending to information that confirms our positive view. See Foundational Errors section.
Example: Mary doesn’t like to be exposed to news that upsets her, so she is unaware that there have been many burglaries in her neighborhood. Mary leaves her doors unlocked at night and her valuables are stolen.
Global labeling — Labeling ourselves or others in a global way that supposedly defines our essence, rather than evaluating individual actions. This thinking error involves a category error or overgeneralization—see Foundational Errors section.
Example: Tim has done some things in the past that he regrets. Tim has done a lot of good things too, but he does not think about these things very much. He thinks about the bad things all the time and labels himself as a bad person. Because Tim thinks he is a bad person, he gives up on trying to become better.
Believing or identifying with our automatic thoughts — Accepting unwanted or distressing thoughts that pop into our minds as true or as defining who we are, rather than recognizing them as just automatic mental events. See treating impressions like reality in the Foundational Errors section.
Example: Shawn keeps having intrusive thoughts that go against his values. Shawn thinks he is a monster for having these thoughts.
Nirvana fallacy — Believing that if we cannot do something perfectly, then we should not do it at all, or that only perfect solutions are worth considering, leading to perfectionism and paralysis. This often involves dualistic thinking—see the Foundational Errors section.
Example: Jessica wants to start exercising but believes that unless she can work out for an hour every day, there’s no point in trying. Because she can’t commit to that much time, she doesn’t exercise at all.
Over-valuing folk wisdom — Applying pithy statements or aphorisms to every situation in life, even when they may be factually wrong, counterproductive, or harmful in certain contexts. This thinking error involves overgeneralization (applying lessons from specific contexts universally) and often dualistic thinking (the wisdom is either always true or always false). See Foundational Errors section.
Example: Gary lives by the idea that “Nice guys finish last.” He is extremely competitive and does not care much about what happens to other people or if they get hurt emotionally or physically. Because of this, very few people want to be friends with Gary, or even associate with him. Martha lives by the idea that “A leopard never changes its spots.” Because of this she does not believe people can change, and cuts people off even if they are earnestly trying to improve.
Just world hypothesis — Believing that the world is just (fair), and that people who are “good” have only or mostly good things happen to them, while “bad” people have only or mostly bad things happen to them. This way of thinking may lead us to blame ourselves or others when bad things happen. This often involves confirmation bias or violating the principle of parsimony—see Foundational Errors section.
Example: After losing his job due to company-wide layoffs, Marcus believes he must have done something wrong or been a bad employee, even though the layoffs were due to economic factors beyond his control.
Fortune telling — Believing we can accurately predict how things are going to go in the future, which may cause us to stop trying to reach our goals, making our pessimistic predictions become self-fulfilling prophecies. This often involves overgeneralization—see Foundational Errors section.
Example: Sasha frequently gets turned down when she shows romantic interest in someone. She believes no one will ever like her because she has been turned down so many times. Because of this Sasha stops trying to attract a romantic partner and she doesn’t approach anyone.
________________________________________________________________________________
Garden View Mental Health
www.gvmentalhealth.com
© 2025 Garden View Mental Health.
This handout may be reproduced for therapeutic and educational purposes with attribution. Commercial use prohibited without permission.
