Unit 1 Tutorials Introduction To Critical Thinking
INSIDE UNIT 1
Closed-Minded Thinking
Ideology
Implicit Bias
Cognitive Bias
Introduction to Fallacies
Formal Fallacies
Introduction to Informal Fallacies
Fallacies of Misdirection (Part 1)
Fallacies of Misdirection (Part 2)
Fallacies of Irrelevance (Part 1)
Fallacies of Irrelevance (Part 2)
WHAT'S COVERED
© 2024 SOPHIA Learning, LLC. SOPHIA is a registered trademark of SOPHIA Learning, LLC. Page 1
In this lesson, you will learn what critical thinking is and how it evolved as a term and concept. In
particular, you will learn:
1. The Historical Development of Critical Thinking
2. A Working Definition of Critical Thinking
3. Thinking Processes Involved in Critical Thinking
4. A Refined Definition of Critical Thinking
When you decided to study critical thinking, you might not have imagined that you would first have to define
critical thinking. "It's just thinking clearly and carefully!" you might guess. Or it's thinking outside the box, or
being open-minded, or making decisions based on logic instead of emotion.
THINK ABOUT IT
Do you have a working definition of “critical thinking”? How would you describe it to someone?
In fact, all of these definitions are correct but incomplete. The idea of critical thinking has been in development
for thousands of years, and it defies easy summary. In order to fully understand what we mean when we speak
of critical thinking, we have to begin with the origins of the idea.
Critical thinking, as an explicit term, has its roots in John Dewey’s work from the early 20th century. Dewey
introduced the term (also called “reflective thinking”) as an educational goal, which he understood to be a
scientific attitude of mind. He defined critical thinking specifically as:
[A]ctive, persistent, and careful consideration of any belief or supposed form of knowledge in the light of
the grounds that support it, and the further conclusions to which it tends (Dewey 1910: 6).
PEOPLE TO KNOW
John Dewey
John Dewey (1859-1952) is sometimes called “the father of critical thinking.” His works of philosophy
range across many disciplines, but he is best known for his work in educational reform. His vision for
both how students learn and how teachers are trained radically changed the field of education.
While the term "critical thinking" was new, the concept was not. Dewey quoted older sources like Francis Bacon
(1500s), John Locke (1600s), and John Stuart Mill (1800s), implicitly acknowledging the long history of the
development of critical, scientific methods of thinking.
To develop our own working definition of critical thinking, let's briefly review the history of logic and
critical thought. Western critical thought and logic (at least as far as we know) began with the
teachings of Socrates almost 2,500 years ago.
PEOPLE TO KNOW
Socrates
Socrates was an ancient Greek philosopher who lived in the city-state of Athens in the fifth century
B.C.E. He is often called the father of Western philosophy for his method of inquiry, conducted through
dialogues in which he engaged in conversation on topics of morals and ethics and, through questioning,
revealed truths. Though he never authored a single text, his work is known because his protégé, Plato,
wrote down many of his dialogues.
Socrates established the importance of asking deep questions that probed into standard assumptions and
patterns of thinking. He established the need for evidence-based thinking aided by closely examined
reasoning.
Socrates' student Plato went on to be the teacher of Aristotle, who is credited with developing one of the
earliest components of critical thinking: logic. Modern critical thinking is essentially tied to logic. Logic is
reasoning by way of a set of rigid rules. Logic has a long history both in the Western and Eastern traditions.
Historically, some of the earliest formal logics were developed in ancient Greece, India, and China. In ancient
Greece, Aristotle was the first Western formal logician. His logical theories are called categorical because they rely
on grouping and classification, such as “all dogs are mammals, and all mammals are animals. Therefore, dogs
are animals.” He analyzed the logical structure of arguments and articulated the rules that govern good
arguments. Modern categorical logic is derived from Aristotle’s original teachings.
PEOPLE TO KNOW
Aristotle
Born in 384 B.C.E., Aristotle was a student of Plato and a critically important ancient Greek philosopher.
His writings covered a vast number of subjects, from biology to ethics to logic.
Meanwhile, in ancient India in the sixth century B.C.E., independent of the scholastic developments in ancient
Greece, the Nyaya school introduced a form of logical analysis called the five-membered schema. Like
Aristotle’s categorical logic, this was a way of analyzing logical structure for the rules of good argumentation.
Further, logic was developed in the ancient Chinese tradition in the fifth century B.C.E. by Mozi. Mozi
founded the Mohist school, whose logic tradition focused on solving logical puzzles. The Mohists were also the
first to use notation for logical arguments.
TERM TO KNOW
Logic
Reasoning by way of a set of rigid rules.
We can now state a working definition: critical thinking is reasoning about any belief, the evidence that
supports that belief, and the conclusions that it supports. Here, belief is confidence that a statement is true
or that something exists; there are richer, more complex meanings of belief that will be described later.
We can think critically about any belief or piece of knowledge that we have. Suppose we think critically about
our belief that all people should have the right to vote. This would involve us thinking about the evidence we
have that supports the idea that all people should have the right to vote and what conclusions that belief would
entail. For example, if all people had the right to vote, that would mean that felons should be allowed to vote,
that voting days shouldn’t conflict with people’s work schedules, and that polls shouldn’t close until everybody
has had the chance to vote—none of which is currently true in the United States.
We will refine our definition further below, but first we will look at what critical thinking entails.
TERMS TO KNOW
Critical Thinking
Reasoning about any belief, the evidence that supports that belief, and the conclusions that it supports.
Reasoning
The act of thinking in an active, persistent, and careful way.
Belief
Confidence that a statement is true or that something exists.
1. Questioning: The art of interrogating a subject, idea, or belief. Good questioning involves curiosity and
creativity. There are two main kinds of questions that one can ask: open questions and specific questions.
Open questions require the respondent to elaborate on a given point. These include questions such as:
“What do you think of…?”, “How does that relate to…?”, and “Why do you think… is true?” Specific questions, on
the other hand, ask for a concrete answer. They are not open-ended and usually focus on clarification or
description. They include questions such as: “Is… true?”, “What does… mean?”, and “How are you
defining…?” Good questioning ultimately encourages creativity (thoughtful and original approaches to an
issue), and introspection (careful interrogation of our own beliefs and thinking).
EXAMPLE Consider the simple statement from the Declaration of Independence, “All men are created
equal.” This statement can be questioned with open-ended questions like, “Why do you think this is true?”
or "What do you hope to achieve by stating this?" It can also be questioned with specific questions like,
“What do you mean by ‘equal’?” and even “What do you mean by ‘men’?” The answer to the second
question has particularly evolved since Thomas Jefferson wrote this famous statement!
2. Analyzing: The process of examining a subject or belief, understanding it, and being able to explain both
the belief and its implications. Analyzing belief is at the core of critical thinking, and there are a variety of
ways we can go about this analysis. We can compare and contrast, break an idea down into its parts, and
consider implications and evidence for that belief. We can also analyze the questions we have about a
particular idea or subject by determining what may or may not be a good answer to each question and what
answering the question involves. Analysis is the substance of critical thinking, and you will find as we
proceed with the course that the formal and informal logical approaches we discuss are geared toward
analysis.
EXAMPLE In the previous example, we questioned the meaning of the statement, “All men are created
equal.” In analyzing this statement, we might ask questions like, “What role does the government play, if
any, in protecting this equality?”
3. Concluding: Forming an opinion, belief, or attitude, or determining a novel fact about a subject or idea.
Drawing a conclusion can involve evaluating the quality of an idea to form a value judgment, inferring
new information from what is given in the subject or idea, synthesizing given information to determine a
novel fact, and creating new ideas and arguments or broadening one’s horizons in a particular subject
matter. Crucially, concluding relies on sound analysis and open-mindedness.
Open-mindedness requires monitoring our own thinking to make sure that we are open to every possibility,
every kind of information, and every kind of alternative or approach, and are not wedded to our own
preconceived opinions.
EXAMPLE Throughout American history, different people have drawn different conclusions about the
implications of “all [people]” being “created equal.” Some may think it means only that everybody is treated
the same under the law. Others have concluded that to support this statement, we must ensure that all
people have equal opportunities, such as access to education. Either conclusion requires rigorous
questioning and analysis to support it.
TERMS TO KNOW
Questioning
The art of interrogating a subject, idea, or belief.
Open Questions
Questions that require the respondent to elaborate on a given point. These include questions such as,
“What do you think of…?”, “How does that relate to…?”, and “Why do you think… is true?”
Specific Questions
Questions that require a concrete answer and may have a factual answer. These include questions like,
“Where were you on the night of June 5th, 2017?” and “What is the capital of North Dakota?”
Analyzing
The process of examining a subject, idea, or belief in question, understanding it, and being able to
explain both the information and its implications.
Concluding
Forming an opinion, belief, or attitude, or determining a novel fact about the subject or idea in question.
Open-Mindedness
Monitoring your own thinking to make sure that you are open to every possibility, every kind of
information, and every kind of alternative or approach, and are not wedded to your preconceived
opinions.
We can now refine our definition of critical thinking: thinking in an active, persistent, and careful way
about any belief, the evidence that supports that belief, and the conclusions that the belief leads to,
which is in part directed at finding out as much relevant, truthful information as practically possible.
When critical thinking is defined as active, persistent, and careful, we mean that critical thinking is a process of
one’s own mind that takes considerable time, focus of attention, and mental and personal energy. Critical
thinking is not easy, quickly performed, or without cost. Good critical thinking is essential to the humanities and
sciences, and it takes considerable training and experience to become skilled at it.
When we say that critical thinking includes finding out as much truth as possible, note first that
everything we think about requires information. When a problem must be solved, it is important to know
what the problem is, which means gathering information about the nature of the problem itself. When a good
decision needs to be made, it is important to gather all possible information about every available alternative,
so we have a good chance of making the best decision.
EXAMPLE A regular use of critical thinking is in managing money and setting a budget. We need
information (such as about our expenses and our income) and must make sure the information is accurate.
We constantly ask questions (Do we need this item? Can we get it cheaper?). Anytime something changes—
a raise, a rent hike, or something else—we have to revise the budget.
Information may be true, false, or misleading. A misleading statement (or misinformation) is often false, but it
may also be factually true while leading to the wrong conclusion. For example, a politician may declare that their
opponent voted against a bill to compensate veterans. In fact, the opponent voted against the bill because they
favored a stronger one that would do more to compensate veterans. The statement is true, but intentionally
misleading. To have our best chance of successfully analyzing our subject and coming to good conclusions, we
need information that is both truthful and accurate.
EXAMPLE Imagine you are a general at war with a neighboring country. During a key battle, it is
reported to you that the enemy’s forces appear to be making an all-out attack on your army’s flank, so you
call all your troops to the flank to repel the attack. Unfortunately, the apparent all-out attack was a
diversion: most of the enemy’s forces attacked your rear and won the battle. You lost the war, lost your
country, lost the lives of many troops, and maybe lost your own life by relying on misleading information. If
you had known that the attack on your flank was truly a diversion, you would have had your best chance of
making the right decisions to win the battle and the war.
Although few of us will be making key military decisions, we are constantly bombarded with misleading
information that can lead to poor decisions.
It is also important to make sure that you have sufficient truthful information to think critically about a subject. It
is not unusual to find that the truthful information you have been able to gather is not enough to solve the
problem, make the right decision, make a quality evaluation, or reach a new discovery. If our critical
thinking tells us that we need more truthful information to make the right decision, we should seriously
consider deferring that decision until we have the information we need.
EXAMPLE At the start of the COVID-19 pandemic in 2020, every business and institution faced
tough decisions about how to respond to the public health threat. There was a lot of confusion and
disagreement, and the biggest reason was that we had little information about the virus at the time.
Unfortunately, those decisions had to be made before we knew more, and guidance changed rapidly as
new information emerged.
THINK ABOUT IT
Imagine that you are hearing unusual noises in your car engine just before a long road trip. Is it serious
enough to delay or cancel the trip? How can you make a decision without more information?
Now that we have a definition, why is critical thinking important in your life? And how do you practice good
critical thinking? These questions will be answered throughout the course.
TERMS TO KNOW
Critical Thinking
Thinking in an active, persistent, and careful way about any belief, the evidence that supports that
belief, and the conclusions that it leads to, which is in part directed at finding out as much relevant,
truthful information as practically possible.
Misinformation
Information that may be outright false or misleading, being factually true but leading to a false
conclusion.
SUMMARY
In this lesson, you began to learn the historical development of critical thinking and learned a working
definition of critical thinking. After considering the thinking processes that are involved in critical
thinking, you saw a refined definition of critical thinking that will continue to guide us throughout the
class.
REFERENCES
Hitchcock, David, "Critical thinking", The Stanford Encyclopedia of Philosophy (Fall 2020 Edition), Section 3.
Edward N. Zalta (ed.), URL = plato.stanford.edu/archives/fall2020/entries/critical-thinking
TERMS TO KNOW
Analyzing
The process of examining a subject, idea, or belief in question, understanding it, and being able to explain
both the information and its implications.
Belief
Confidence that a statement is true or that something exists.
Concluding
Forming an opinion, belief, or attitude, or determining a novel fact about the subject or idea in question.
Critical Thinking
Reasoning about any belief, the evidence that supports that belief, and the conclusions that it supports.
Critical Thinking
Thinking in an active, persistent, and careful way about any belief, the evidence that supports that belief,
and the conclusions that it leads to, which is in part directed at finding out as much relevant, truthful
information as practically possible.
Logic
Reasoning by way of a set of rigid rules.
Misinformation
Information that may be outright false or misleading, being factually true but leading to a false conclusion.
Open Questions
Questions that require the respondent to elaborate on a given point. These include questions such as,
“What do you think of…?”, “How does that relate to…?”, and “Why do you think… is true?”
Open-Mindedness
Monitoring your own thinking to make sure that you are open to every possibility, every kind of
information, and every kind of alternative or approach, and are not wedded to your preconceived
opinions.
Questioning
The art of interrogating a subject, idea, or belief.
Reasoning
The act of thinking in an active, persistent, and careful way.
Specific Questions
Questions that require a concrete answer and may have a factual answer. These include questions like,
“Where were you on the night of June 5th, 2017?” and “What is the capital of North Dakota?”
The Dangers of Not Using Critical Thinking
by Sophia
WHAT'S COVERED
In this lesson, you will learn why critical thinking is important to your professional and personal lives.
Specifically, you will learn the dangers of not using critical thinking, particularly the poor decisions that
follow from:
1. Not Asking Questions
2. Letting Emotions Interfere
3. Making Assumptions
4. A Complete Breakdown in Critical Thinking
There are many dangers to not employing good critical thinking. Failing to think critically can lead to
unfounded opinions, belief in false and misleading sources, and assumptions that are unsupported and
poorly argued. Perhaps most importantly, not using good critical thinking can lead us to make
poor decisions and fail to solve problems effectively. It also prevents us from making new discoveries and
innovations.
Consider your average Joe (we’ll even call him Joe!). He has a college degree and a job that requires sharp
analysis skills, but he might not use those skills outside of work. During an ordinary day, Joe makes several
decisions—some important, some not.
Joe did ask one right question: “Why are these labels present on this food?” However, he jumped to a
conclusion instead of considering other answers to that question. The alternative, evidence-based answer is
that some people can’t eat gluten or high amounts of fat due to diseases or allergies, but those ingredients
don’t affect people without those health conditions. Notice that this possibility escaped Joe because he fell
for an assumption that marketing exploits: that labels indicate benefits or warnings for everyone. If he had
analyzed that conclusion, Joe would have realized that it couldn’t possibly be true. Some labels apply only to
children (e.g., “choking hazard to children under three”), and others to people with particular health issues
(e.g., “Cheerios help reduce cholesterol”). Joe’s assumptions interfered with his ability to come to the best
conclusion. Fortunately for Joe, this mistake is unlikely to change his life; he just might not enjoy his lunch
as much as he could have.
Joe returns to his job as a manager of a medium-sized office to find an urgent memo. He is tasked by upper
management with altering office functions to cut expenses. Joe studies the current budget and realizes that he
can meet the new budget goal if he eliminates one half-time position—and thankfully, there is one half-time
employee who has already given her notice. Joe closes out that half-time position, meaning he will not hire a
replacement. He has met the new budget goal and feels glad he didn't have to let anyone go.
However, there will be unforeseen consequences to Joe’s decision. He has to reassign the duties of the half-
time position, which increases the workload for other members of the team. This lowers morale and the quality
of their work. In fact, before too long, the overburdened staff begins to make mistakes that lead to lost revenue.
In the end, they still can’t balance the budget.
Joe used poor critical thinking again but in a different way. He asked the right questions and didn’t jump to any
conclusions, but instead of considering all of his options, he made the decision that caused the least discomfort.
He ignored overall workload capacity for the whole team, and though he met his new budget requirement, he
failed to make the best decision.
3. Making Assumptions
Assumptions and unanalyzed concepts run rampant in cases of critical thinking that involve personal ethics.
Personal ethics are an individual’s understanding of which actions are “right” and which actions are “wrong,”
not as facts, but as moral principles.
Personal ethics can be highly loaded. Our ethical judgments often entail universal assumptions about what
people should or should not do. By engaging critical thinking in developing our personal ethics, we develop a
more nuanced and often truer picture of what is and is not a right action. We call this interaction between critical
thinking and personal ethics moral reasoning. We will return to moral reasoning in much more detail in Unit 5.
For the time being, let’s consider Joe again.
On his way to the train station to go home, Joe sees a man bundled in blankets sitting on the corner. As he
passes by, the man asks for spare change. Joe feels affronted. He has worked all day to earn a living, and this
man just expected a handout? Joe guesses the man probably made bad decisions to end up on the streets.
Rather than telling the man that he doesn’t have any money, or that he doesn’t feel comfortable giving him any,
Joe hurries past and refuses to look the man in the eye. He jumps on the train, worried the man might follow
him.
This example is full of assumptions about someone else’s life and decisions, made on the basis of Joe’s
personal ethics. Joe values hard work and financial prudence. He assumes that if someone is unhoused and
asking for money, they must not have worked hard or made good decisions. He thinks relying on other
people’s charity is morally bad.
However, Joe has not considered all of the possibilities that would explain why the man was living on the
streets and asking for money. The man may have health issues that make it almost impossible to hold onto a
job. He may in fact have a job—more than 40% of unhoused adults in the U.S. do—but not earn enough
to cover housing costs in Joe's expensive city. He may have had insurmountably bad luck.
Moreover, even if the reality were that the man had made bad decisions that directly led to his losing housing,
Joe would have to consider whether he should judge his own family and friends who might have made similar
bad decisions, but avoided homelessness. It’s important that our conclusions be consistent across all cases. In
this case, Joe lacks consistency.
Finally, Joe’s moral reasoning lacks compassion. Compassion is when we acknowledge the humanity in others
and empathize with it. This is critical as it reminds us to be thoughtful in our moral judgment. Joe demonstrated
compassion earlier when he opted to solve his budget crisis without firing anyone, but he is less inclined to do
so for a stranger.
Keep in mind that while compassion can be essential to the best decisions, emotion can cloud our judgment.
We need to strike just the right balance in order to achieve optimal critical thinking.
TERMS TO KNOW
Personal Ethics
An individual’s understanding of which actions are “right” and which actions are “wrong,” not as facts,
but as moral principles.
Moral Reasoning
A more nuanced and often truer picture of what is and is not a right action, achieved by applying critical
thinking to personal ethics.
Compassion
Acknowledging the humanity in others and empathizing with it.
Joe is again acting on his emotions rather than asking good questions and doing good analysis. As in the
previous example, Joe assumes that his son’s behavior is due to moral failings. He doesn’t stop to consider
alternatives, and he doesn’t ask his son the questions that would reveal these possibilities.
If Joe had asked the right questions, he might have learned that Junior studied for several hours before Joe
passed by his room that night, and was only taking a gaming break to wind down. He might have learned that
Junior struggled with the material due to learning differences, not laziness. He might have learned that over
half the class failed the test, and that the teacher has a reputation for being tough and unreasonable. Further,
he might have learned that “screen time” is how his son copes with frustration and anxiety, and that taking it
away will actually worsen his performance at school!
In short, Joe’s critical thinking has a complete breakdown. He failed to ask the right questions, made unfair
assumptions, and let his emotions take over. This doesn’t make Joe a bad person; it merely makes him human.
But training the mind to ask questions, challenge assumptions, and set aside personal feelings can lead to
better decisions.
THINK ABOUT IT
Have you made any recent decisions—big or small—that would have been different if you applied more
critical thinking? Did you make any assumptions or snap judgments? Were there more questions you could
have asked or information you could have used in making that decision? Write it down on a sticky note or
save it to a notes app. We will come back to it later.
SUMMARY
In this lesson, you saw some of the dangers of not using critical thinking, such as not asking questions,
letting emotions interfere, and making assumptions. When these failures happen all at once, the result
is a complete breakdown in critical thinking.
TERMS TO KNOW
Compassion
Acknowledging the humanity in others and empathizing with it.
Moral Reasoning
A more nuanced and often truer picture of what is and is not a right action, achieved by applying critical
thinking to personal ethics.
Personal Ethics
An individual’s understanding of which actions are “right” and which actions are “wrong,” not as facts, but
as moral principles.
The Benefits of Using Critical Thinking
by Sophia
WHAT'S COVERED
In this lesson, you will learn why critical thinking is important to your professional and personal lives.
Specifically, you will learn the benefits of using critical thinking, particularly for:
1. Identifying Unsupported Assumptions
2. Critically Evaluating Information
3. Making Good Decisions
The dangers of not using critical thinking are clear. But critical thinking is not merely the avoidance of bad
choices and incorrect arguments. There are many benefits to using critical thinking as well.
Some assumptions are packed into the words we use; we assume the meaning of the word is universal and
unarguable. Consider the word “family.” At first it seems like a simple word that everybody understands: a mom,
a dad, a couple of kids, maybe a dog or cat. But are married couples without children families? Single parents
and their children? A group of people who live together but are not related or romantically involved? What
about married people with children who do not live together? Each of these questions challenges assumptions
about what families can look like, or should look like, and impacts how we see people in our community.
BIG IDEA
Major discoveries and revolutionary acts can start by questioning the most basic and seemingly safest
assumptions. To be successful, these questions have to be followed with rigorous gathering and analysis of
information to form and defend a new position.
Consider an article from Mail Online dated May 6, 2020. The title of the article reads “Coronavirus IS
causing deadly new inflammatory fever in some children.” However, when the article is read in full, it
becomes apparent that the “is” in the title is misleading. The article actually states that “a study of eight
cases has suggested that the syndrome may be caused by coronavirus” (emphasis ours). The title claims
that a causal link between coronavirus and the inflammatory fever is already established, but the article says
it is only a possibility, based on too small a sample to be certain. Headlines often use such ambiguous
wording to make the article more tempting to readers. Moreover, many reports on scientific findings tend to
overstate or exaggerate those findings to attract interest. But this sort of “clickbait” overstatement leads to
panic and confusion if people don’t click through and read the entire story.
Headlines can also mislead by misstating cause and effect. Consider an example from the U.S. Sun dated
September 14, 2020: “If you snore you could be THREE TIMES more likely to die of coronavirus.” However,
the actual claim in the article is that “those who have been diagnosed with the loud snoring condition,
obstructive sleep apnoea, are being urged to take extra precautions.” Obstructive sleep apnea may cause
snoring, but snoring is not what makes you more likely to die of coronavirus; having obstructive sleep apnea
is what puts you at higher risk. These two things are different because many more people snore than have
obstructive sleep apnea. As before, the headline may be intentionally misleading to make people click
through.
For decades, the tobacco industry combated medical studies showing the link between cigarettes and cancer with bad-faith arguments such as:
There is insufficient proof that smoking causes cancer.
Moderate smoking is not damaging to one's health.
Some people have been known to smoke all of their lives without developing cancer.
There are many powerful interests bombarding us with such bad-faith arguments. Critical thinking helps us see through fallacies that intentionally mislead us and tempt us to make bad decisions.
Memes (easily shared graphics, usually combining images and text, that spread across social networks) are notorious for spreading misinformation, often on emotionally charged topics. While it should seem obvious that memes are a poor source of information, they come from people we trust, which
makes us more likely to take them seriously. For example, you might see a meme about the contents of vaccines, shared by your Aunt Betty. You assume Betty wouldn’t post the meme without good reason. You are likely to forget the details but remember the impression that vaccines are potentially harmful and have strange ingredients. Like headlines, memes may present factual information that is exaggerated or taken out of context, but they may also contain statements that are simply false.
TRY IT
Read through your news feed or social media feed, following links to news articles. Do the headlines and
comments on social media match the articles they link to?
These are just some of the ways that critical thinking—asking the right questions—can make you a smarter
consumer of the news and less likely to believe misinformation. We will cover these various ways of misleading
people, or jumping to false conclusions, throughout the class.
Notice how similar this is to the steps to good critical thinking that we outlined above. First, you have to ask
yourself what questions are relevant to the decision you want to make, then you analyze the information
relevant to those questions you asked, and lastly, make a decision based on the conclusion you reach.
In the first section of this tutorial, Joe made a series of decisions, from what he would have for lunch to how he
would deal with his son’s bad grades. Let’s walk through the day again and see what happens if Joe uses his
critical thinking skills.
Recall that Joe first deliberated over lunch. This time he asks the clerk, “I’ve seen a lot of these meals are
gluten-free. Does it matter?” The clerk says it matters to him, because he has Crohn's Disease, but if Joe has
never been sick after eating bread, he’s fine. After verifying this on his phone, Joe opts for a tastier option and
goes back to his office.
Unfortunately, the memo still waits for him: he has to cut his budget. Joe considers cutting that open position,
but first he calls in people who would take on the additional workload. All three of them balk. They are already
stretched, and aren’t well-trained for the departing person’s duties. Moreover, one explains, it makes no sense
to divide those duties among different people, as it will create more work and a need for meetings to keep
everyone on the same page.
By collecting all the relevant information, and setting his feelings aside, Joe arrives at a combination of measures to meet the goal: canceling all travel and conferences for the next year, doing mailings in-house instead of using a contractor, and other budget cuts. While some of these decisions are unpopular with staff, they are less unpopular than cutting an entire position.
As Joe walks by the unhoused person who asks for money, he initially has the same reaction: that the person
has failed in some way, and may be a threat. But he then challenges his own assumptions. “My brother-in-law
has been out of work for two years, but my sister is keeping him off the streets,” he remembers. “Am I in a
position to really understand what’s happening in this person’s life?” He realizes that his assumption is not
based on evidence, but rather on his own discomfort. He does not give the man money, but offers him a granola bar and decides to make a donation to the nearby soup kitchen.
When Joe gets home, he finds out that his son failed the math test. Joe still feels the initial disappointment, but
pushes past it. He asks how everyone else did, and finds out Junior is far from alone. Junior reminds him that
he’s doing well in all his other classes; he’s just been struggling with math as it gets more conceptual, and the
teacher does not slow down when students ask her to. With more evidence, and by setting his gut feelings
aside, Joe realizes his son might need additional help with math, and that one test won’t make a big difference
in the long run.
Note that Joe doesn’t have to have special intellectual gifts in any of these cases. He has to ask the right
questions, challenge his own assumptions, and (sometimes) set aside his gut feelings. These aren’t gifts; they
are good habits.
THINK ABOUT IT
In the last tutorial, we asked you to think about a big decision you made where you could have applied more critical thinking. Pretend that decision is still in front of you.
What questions will you ask?
What assumptions do you need to challenge?
What emotions need to be set aside?
SUMMARY
In this lesson, you saw some of the benefits of critical thinking, such as identifying unsupported
assumptions and critically evaluating information. Ultimately, the benefits of critical thinking are all
about making good decisions.
Real-World Examples of Critical Thinking
by Sophia
WHAT'S COVERED
In this lesson, you will learn the application of critical thinking through important real-life examples,
including:
1. Katherine Johnson
2. Billy Beane
3. The Mount Everest Disaster
4. Careers That Require Critical Thinking
We’ve discussed the importance of critical thinking in everyday life, but what about the big picture? Can
critical thinking help you advance in your career? In this section, we’ll look at a couple of success stories by
people who’ve demonstrated critical thinking at a high level. We’ll also look at a worst-case scenario in which critical thinking was abandoned. In each of these cases, we will look at the core elements of good critical
thinking:
Gathering and analyzing the best information available, which often comes from asking the right
questions.
Setting aside personal feelings and biases that may cloud your judgment.
Challenging assumptions and being open-minded to new ways of thinking.
1. Katherine Johnson
One icon of critical thinking is the mathematician Katherine Johnson. At a time when “computers” meant people
doing longhand calculations, Johnson was hired by NASA to do this work and became essential to the U.S.
Space Program. One of Johnson’s special skills was working backwards from a desired landing site to make
navigation plans for astronauts returning to earth.
Besides her talent for performing mathematical calculations, Johnson had a knack for making sure there were
always back-up plans and safety checks. For example, Johnson performed calculations to help astronauts
navigate their way home by the stars even if all of their electronic equipment failed. As digital computer-aided
calculations became the norm, Johnson double-checked the calculations to make sure they were correct. In
one of her most famous moments, astronaut John Glenn refused to fly a mission until she verified the
computer’s projections. However, she was open-minded about the possibilities of non-human computers. Her trust in them helped build confidence in digital technology at NASA.
Johnson’s career exemplified the core elements of good critical thinking:
Gathering, analyzing, and verifying information was at the heart of Johnson’s work.
She set aside personal feelings that may have clouded her judgment, such as seeing the possibilities for
non-human computers even though they challenged her own position and future at NASA.
She challenged assumptions by asking questions such as, “What will happen if this expertly engineered
system fails?”
Johnson’s story (along with colleagues Dorothy Vaughan and Mary Jackson) is captured in the book Hidden
Figures by Margot Lee Shetterly and in the movie based on the book.
2. Billy Beane
Another example of critical thinking in action is Billy Beane and the 2002 Oakland Athletics baseball team
(usually called the A’s). As general manager of the A’s, Beane faced an enormous budget shortfall in comparison
to big-market teams like the New York Yankees. To make the A’s competitive, Beane had to take a whole new
approach to the game, including both how to win games and how to win over fans. In so doing, he built a
competitive and popular team, and radically changed the way every team does business.
The story of Billy Beane and the 2002 Oakland A’s is captured in the book Moneyball by Michael Lewis, and
in the movie by the same name.
For decades, the prized measures (or “stats”) that told a player’s worth were the “triple crown” categories: batting average, home runs, and runs batted in. Players who reached high marks in all three were superstars. Superstars, in turn, filled the bleachers with fans. Since the days of Babe Ruth, that was the quintessential rule of how baseball worked. That meant the basic job of a general manager was to sign superstar players.
But Beane took a different approach. He was inspired by a growing movement in baseball to look at completely different measures of player ability, like slugging percentage and VORP (value over replacement player).
Notably, these stats had only recently become practical: the Internet made the data easy to find and use, and personal computers made it easy to run high-powered calculations. According to these stats, some
superstar players—like Derek Jeter of the Yankees—were overrated, and others, like Omar Vizquel of
Cleveland, were underrated. To baseball fans, the idea that Omar Vizquel was actually more valuable than
Derek Jeter seemed ridiculous. Vizquel was well-liked and respected, but he was no Derek Jeter!
Beane built the 2002 team around these principles, signing underrated (and therefore, less expensive!) talent.
He filled the roster with non-marquee players who had high value according to the new stats. The A’s turned into a juggernaut, winning over 100 games and reaching the playoffs, all at a third of the salary of the New York
Yankees. Fans love winning more than anything, so the A’s not only filled the bleachers in Oakland but also became a big draw on the road. Over the next 20 years, every team came to replicate the “moneyball method.”
Beane’s approach exemplified the same elements of good critical thinking:
Gathering and analyzing the best and most accurate information available, in this case, new statistics
measuring player performance.
Trusting evidence-based analysis over gut feelings and instincts.
Challenging basic, long-held assumptions about the “right” way to build a baseball team.
3. The Mount Everest Disaster
In 1996, Rob Hall, an elite mountaineer, led an expedition to the summit of Mount Everest. Climbing the highest peak in the world requires not only climbing expertise but also significant resources. Hundreds of climbers have lost their lives in the attempt, most of them highly skilled and experienced.
Climbing to the summit (about 29,000 feet) is an extremely difficult and dangerous undertaking. Climbers spend weeks ascending to higher and higher altitudes, giving the body time to adjust to the steadily decreasing oxygen. Camps are established along the way, where the climbers stay while they acclimate to the reduced oxygen. The final push to the summit involves an 18-hour round trip from the last base camp.
The 1996 disaster on Mount Everest was told in the book and subsequent movie Into Thin Air. The book
was written by Jon Krakauer, one of the climbers in the expedition who survived.
One rule Rob Hall’s team established was that if they did not reach the summit by 2 P.M., they would return to base camp to avoid descending in the dark. However, the rule was broken: many in the expedition reached the summit after 4 P.M. To make matters worse, a raging blizzard hit as the climbers descended. They
had not anticipated a storm, and no contingency plan was in place. In total, eight climbers, including Rob Hall,
lost their lives on the mountain. (Leger, 2016)
Hall’s team failed to use good critical thinking skills in several different ways:
Hall made assumptions based on limited information. On previous climbs at that time of year, he had never seen such bad weather, so he assumed weather would not be a problem and made no contingency plan.
The team let emotions guide them. Besides the driving ambition to reach the peak, the team had a lot
invested in the climb, in both time and money. Though they knew intellectually that they should reach the
peak by 2 P.M., they pressed on when they missed the deadline.
This also demonstrated closed-mindedness; the grit and determination that make someone a world-class mountaineer can also be a deadly failing.
THINK ABOUT IT
What individuals (famous or not) do you think exemplify critical thinking? Do you know of anyone personally
or through media who has demonstrated the following skills?
Gathering the best and most accurate information available, which often comes from asking the right
questions.
Setting aside personal feelings and biases that may cloud their judgment.
Challenging assumptions and being open-minded to new ways of thinking.
4. Careers That Require Critical Thinking
As you read in the previous examples, critical thinking is essential in many professional fields, even sports. Some jobs have critical thinking at their core. For example:
Police detectives have to ask the right questions, consider all the evidence, and analyze all possible
answers to those questions to come up with the most likely conclusion based on that evidence.
Medical staff must use critical thinking to ensure they make the right diagnosis, provide needed care at the
first signs of a crisis, and give patients the correct treatments.
Security analysts protect an organization’s networks and systems against intruders. In order to create the
best protection plan, they must ask the right questions about how bad actors might get in, then analyze the
data to find the best ways to protect the organization.
Journalists must scrutinize their sources and ask the right questions to get true and accurate information.
Managers at any level must solve problems and make decisions, and at their best, they foresee problems and opportunities by asking new questions.
Educational administrators look to continuous improvement by collecting and analyzing information and
asking key questions.
Scientists and engineers are tasked with solving complex problems by first ascertaining the real nature of
the problem and then deciding on the right and best solution.
In fact, critical thinking is a fundamental part of almost any job and is an immensely valuable skill to employers; employees with strong critical thinking skills often become high performers. One of the reasons employers like to hire college graduates, even ones with majors unrelated to the job, is that throughout the curriculum, college trains students in critical thinking. The skill sets college cultivates build fundamentally upon critical thinking and benefit any employer, and you personally, inside and outside of work.
SUMMARY
In this lesson, you learned about the application of good critical thinking in high-profile cases involving
Katherine Johnson, whose work at NASA was essential to the early years in the space program, and
Billy Beane and his reliance on a more evidence-based and analytical approach to building a baseball
team, as well as the critical thinking breakdown leading to the Mount Everest disaster. You also learned
about specific careers that require critical thinking and how these skills can be a benefit to any
profession.
REFERENCES
Leger, C. (2016, December 31). The 1996 Everest Disaster - The Whole Story. Base Camp Magazine.
basecampmagazine.com/2016/12/31/the-1996-everest-disaster-the-whole-story/
Roberto, M. (2009). The Art of Critical Decision Making. Chantilly: The Great Courses.
www.thegreatcourses.com/courses/art-of-critical-decision-making.html
Steinberg, L. (2015, August 18). Changing the Game: The Rise of Sports Analytics. Forbes. www.forbes.com/sites/leighsteinberg/2015/08/18/changing-the-game-the-rise-of-sports-analytics
Closed-Minded Thinking
by Sophia
WHAT'S COVERED
In this lesson, you will learn about the ability to consider other ideas and opinions and why it can be so
difficult to do so. Specifically, you will learn about:
1. Why People Are Closed-Minded
2. How to Be More Open-Minded
3. Limits on Open-Mindedness
Of the many barriers to good critical thinking, perhaps the most harmful is being closed-minded. Closed-minded thinking means having a fixed point of view and refusing to consider other ideas or opinions. It can mean refusing even to accept credible and important information that doesn’t support one’s prior opinion. We know from the last challenge that critical
thinking requires us to question our assumptions and set aside emotions so we can make better decisions.
Closed-minded thinking is refusing to do so, essentially refusing to think critically. While closed-mindedness overlaps considerably with ideology and implicit bias, usually unintentional errors that we will discuss in subsequent tutorials, it is a willful and conscious decision. It is what happens when a person decides to ignore information and ideas that they don’t wish to hear.
But challenging assumptions and setting aside emotions are easier said than done! By the time we reach
adulthood we have many values, beliefs, attitudes, and affinities that influence the way we think. It may not
even be possible to challenge every assumption, or set aside every emotion, because our minds don’t work
that way.
EXAMPLE Valerie makes a commitment to read a wide range of books and articles by authors with
diverse backgrounds and opinions. While she tries to begin each book with a completely open mind, some
are so contrary to her values that she can’t read them without getting upset. She can’t simply shut her
feelings off regarding the issues that matter to her.
Closed-minded thinking may mean surrounding yourself with friends who share your opinions, or seeking out
articles and videos that interpret the world in the same way that you do. It may mean readily believing
information that supports your opinions, and being more skeptical of information that contradicts them. If you have the uncomfortable feeling that this describes you, you are not alone. While few people think of themselves as closed-minded, most of us make choices that keep us in our comfort zone. In fact, people may even be upset at the idea of watching a news program or reading an article by someone they know disagrees with them.
TRY IT
Find commentary that runs contrary to your own thinking. It can be a blog, a video, or a TV talk show. Engage with it for twenty minutes as if you had no prior knowledge or opinion on the topic. You don’t have to accept or believe what you read or hear, but try to find at least one persuasive point the person makes.
1. Why People Are Closed-Minded
People generally know that being closed-minded is “bad,” so why is anyone closed-minded in the first place? Closed-mindedness may be an inherent human trait: protecting our own thought processes, memories, and sense of self may have made it evolutionarily beneficial. In many ways, closed-mindedness functions like our other self-protective habits. It is a way of shielding ourselves from perceived threats to our belief system.
However, closed-mindedness becomes counterproductive to our own growth and our community when we lose
the ability to work with or even talk to people who disagree with us. This degree of closed-mindedness can be
exacerbated by spending a lot of time in “echo chambers,” or communities that repeat our own beliefs back to
us and may even denigrate our opponents.
TERM TO KNOW
Closed-Minded Thinking
Having a narrow point of view and refusing to consider other ideas or opinions.
2. How to Be More Open-Minded
Let’s look at one process by which we can develop open-mindedness. This isn’t the only way to develop open-mindedness, but it is a step-by-step process you can rely on to work toward more open-minded ways of thinking.
1. The first thing we must do is take time to be careful in our own thought process. Here’s a step-by-step guide:
a. First, orient yourself towards seeking truth and understanding instead of winning arguments or getting
credit for being “right.”
b. Second, ask questions and challenge all information and assumptions that come to your attention in
considering a particular topic.
c. Third, make judgments, conclusions, and beliefs only after exercising good critical thinking. Similarly,
take sufficient time to think before deciding on a course of action. Be sure to reserve judgment when
you are uncertain.
2. Next, once you have tentatively arrived at a belief or perspective you want to endorse:
a. You must first review your thoughts for bias, prejudice, beliefs, ideologies, ideas, and experiences which
can undermine your critical thinking skills.
b. Next, develop good and substantial arguments against your own belief or perspective. This will help you
test how well your views hold up.
c. Lastly, be willing to challenge and modify your own beliefs.
3. Finally, we must be considerate and careful in how we engage with others about each of our beliefs and
perspectives.
a. First, cultivate a continued interest in hearing what others think. Actively seek out, ask about, and listen to other people’s thoughts, arguments, and beliefs.
b. Second, when others do share their perspectives, inquire deeply about how they arrived at those
opinions, beliefs, and arguments.
4. Being able to incorporate what you have learned from others into your own belief system will require:
a. Not feeling threatened by disagreement but having respect for and learning from it.
b. Being comfortable with ambiguity, where neither side is “completely right” or “completely wrong.”
Notice that even though we placed the steps for engaging with others after coming to terms with your own beliefs and perspective, we can use these steps for thoughtful conversation before forming a conclusion about a particular topic. Usually, however, we come into a conversation with some preformed opinions, conclusions, and beliefs about a subject, even if we haven’t yet deeply grappled with them.
Crucially, the most important ways we exercise open-mindedness in service of good critical thinking lie in how we respond to and acknowledge the arguments of others. We must be open to new points, never too committed to our own arguments, and always evaluating the quality of others’ arguments and beliefs and the ways in which they convey them. Being in touch with our own emotions, assumptions, and evidence, and being willing to engage with others about them in a non-judgmental and thoughtful manner, is key to open-minded critical thinking.
BIG IDEA
The hallmark of open-mindedness is being able to consider the best possible argument that directly
contradicts your own beliefs. This does not mean that you accept that argument, only that you understand
how people with good intentions can arrive at that opinion.
IN CONTEXT
Amy brings her new partner Cynthia to dinner with her parents. Amy’s parents have always been quite
progressive on a variety of political topics—they are pro-choice, in support of government-funded
safety net programs, against aggressive deportation and restricted immigration programs, and in
support of physician-assisted suicide, to name a few. However, at the beginning of dinner, it becomes clear that Amy’s new partner Cynthia comes from a substantially different background than Amy’s family and holds some markedly different beliefs. The primary difference that comes up over dinner is that, unlike Amy’s parents, Cynthia is against physician-assisted suicide (PAS). Amy’s parents believe that PAS is an important aspect of individual autonomy.
Amy’s parents argue that PAS represents a “right to die with dignity,” giving control over one’s death in
the same way we have control over when and how we live. Cynthia disagrees. Amy’s parents are
initially shocked that Cynthia is against PAS. However, they set aside their initial shock and
defensiveness at the disagreement. Importantly, they don’t assume Cynthia is ill-informed on the issue
or wrongheaded.
They ask Cynthia in great detail about her beliefs. She argues that PAS laws are not well-enough
regulated and are being used as an economic replacement to quality end-of-life care for disabled
people. She says that the problem is not that PAS is inherently wrong, but that in our current economic
and political landscape, major improvements to our healthcare system must be made first, primarily
moving healthcare away from its model of cost-saving at the expense of patients. Amy’s parents listen to Cynthia without clinging to their preconceived notions, trusting that she has important perspectives to share with them.
At the end of the conversation, they find themselves moved by Cynthia's points about the poor state
of the healthcare system and the ways in which disabled people may wrongly feel economic pressure
to engage in PAS. They realize that their perspective was informed by a bias which assumes that the
healthcare system treats terminally ill patients fairly. They end the conversation with a deeper
understanding of some of the worries around PAS laws and the ways in which their own beliefs and perspectives were limited. They decide to reserve judgment on PAS laws until they better understand the complex issues that Cynthia highlighted for them.
Open-mindedness is of the utmost importance to good critical thinking. Any limitation on our thinking results in fewer alternatives in decision making, fewer avenues for solving a problem, less likelihood of making an important discovery, and less information and evidence than would otherwise be available. However, open-mindedness is not easy. It takes training and practice to listen carefully, suspend judgment, and analyze an argument with care.
TRY IT
Is there somebody in your life who holds radically different opinions but can still have a calm conversation about their beliefs? Try opening a dialogue in which you truly seek to understand how they formed their beliefs instead of trying to convince them they are wrong. This isn’t easy when you’re emotional about a topic, but with practice you can get better!
3. Limits on Open-Mindedness
It’s important to note that open-mindedness requires considering new ideas, information, evidence, arguments, and possibilities. It does not require taking every idea seriously. Rather, an open-minded person doesn’t dismiss things out of hand without sound reasoning and justification. An open-minded person is willing to talk to people they disagree with, but may not be easily converted.
EXAMPLE An open-minded person will consider the possibility that extraterrestrial beings have been visiting the earth, but will not believe their neighbor has seen them without evidence.
Furthermore, open-minded people are willing to listen to ideas and evidence that are different from their own,
but will recognize if the ideas are themselves problematic, such as dehumanizing or delegitimizing marginalized
people, or pushing theories that are completely unsupported by evidence, and may end those discussions
rather than engage further.
EXAMPLE Historians should be open to new historical data, new interpretations of historical evidence,
and new theories of historical events, but are not required to take seriously people who deny the Holocaust
or trivialize the practice of slavery.
Finally, an open-minded person may choose to listen only to people who argue in good faith, meaning they are sincere in their arguments and have the best intentions. Engagement in good faith is closely
connected to open-mindedness because when we engage with others, we assume the other person has good
intentions, is presenting information honestly, and is trying to come to the best conclusions. Even if we end up
disagreeing, such discussions can still be worthwhile, productive, and enlightening.
However, many arguments are made in bad faith, meaning a person has a hidden agenda. They may want to
dominate the discussion, to coerce the other parties into acquiescing to their point in some way or another, or
simply to embarrass their opponents. A person arguing in bad faith may intentionally use common misleading
tactics or otherwise be dishonest or deceptive. Being open-minded does not obligate you to continue engaging
with someone who isn’t arguing in good faith.
IN CONTEXT
Fiona gets into an argument with Jackie, a colleague, about immigration laws in the United States.
Jackie’s position is that the U.S. needs the strictest possible immigration laws in order to protect the
integrity of the country. She refers to immigrants by racial slurs and accuses Fiona of not caring
whether American citizens lose their jobs or are impoverished. She further accuses Fiona of being
“un-American” and unpatriotic. Fiona says she doesn’t want to talk about it anymore.
Is Fiona being closed-minded? In this case, Jackie is making an argument in bad faith. She is belittling Fiona, using inflammatory and derogatory language about immigrants, and demanding that Fiona change her position on the grounds that it is morally bad. She has given Fiona few reasons and little evidence for her own position. Worst of all, her arguments clearly have a racial component that Fiona finds not only offensive, but hurtful. Fiona may need to walk away from this discussion as an act of self-care.
Now suppose Jackie does not use derogatory language for immigrants, and instead talks about
immigrants as people who needed to seek a better life, and acknowledges that many are refugees
© 2024 SOPHIA Learning, LLC. SOPHIA is a registered trademark of SOPHIA Learning, LLC. Page 37
from war-torn countries or are fleeing other dangerous situations. She demonstrates empathy and
compassion for the people entering the country. She states that despite knowing all this, her
reasoning for her position on immigration comes from a deep-seated concern about the loss of jobs for
Americans and a potential decrease in quality of life for those born here. She provides economic
evidence that laxer immigration policy can lead to higher unemployment.
Fiona may still find this conversation upsetting. She has deep feelings on the topic, and it affects
people she knows and cares about. However, Jackie is engaging in good faith. She is not intimidating,
belittling, or coercing anyone. She’s laying out what she takes to be evidence with an
acknowledgment of the evidence that may come from the other side. She is genuinely interested in
hearing the evidence and reasoning behind Fiona’s own beliefs. Fiona can demonstrate open-mindedness
by participating in the discussion, even if it is difficult. The conversation is unlikely to change her
position, but it will deepen her understanding of the issue by engaging with the opposing point of view.
TERMS TO KNOW
Good Faith
An argument or discussion in which all parties present are honest, respectful of each other’s dignity,
and genuinely want to hear what the other person has to think and say.
Bad Faith
An argument or discussion in which one or more of the parties has a hidden, unrevealed agenda,
leading to dishonesty and a refusal to accept the basic dignity of their opponent.
SUMMARY
In this tutorial, you learned about closed-minded thinking and how this trait hinders good critical
thinking. You first learned why people are closed-minded and how to become more open-minded by
adopting good habits in how we seek out and understand other points of view. You also learned about the
limits on open-mindedness that ensure our time and energy are well spent.
TERMS TO KNOW
Bad Faith
An argument or discussion in which one or more of the parties has a hidden, unrevealed agenda, leading
to dishonesty and a refusal to accept the basic dignity of their opponent.
Closed-Minded Thinking
Having a fixed point of view and refusing to consider other ideas or opinions.
Good Faith
An argument or discussion in which all parties present are honest, respectful of each other’s dignity, and
genuinely want to hear what the other person has to think and say.
Ideology
by Sophia
WHAT'S COVERED
In this lesson, you will learn the ways ideology can both bolster and limit good critical thinking. In
particular, you will learn about:
1. Ideology as a Benefit to Critical Thinking
2. Ideology as a Barrier to Critical Thinking
EXAMPLE Parents might be open to considering other points of view on raising children but wouldn’t
abandon the guiding principle that they want their children to grow up healthy and happy.
Ideology is a deeply-rooted set of interrelated core beliefs that guides a person’s commitment to other values,
principles, and decisions. Often ideology is a key part of our identity and is held by many members of the same
group. For example, capitalism and socialism are ideologies that entail a set of beliefs and ideas about how the
economic activities of the world should be organized and regulated. Christianity, Buddhism, and Humanism are
religious ideologies about the origins and purpose of our existence. While most ideologies seem to be religious
or political “-isms,” not all are. For example, many professions have ideologies that describe the guiding
principles of their practice and may even have an oath or affirmation that states it explicitly, like the Hippocratic
Oath in medicine. As you might infer from this, a person can hold multiple ideologies at once, and they may
even be contradictory. For example, a strong sense of loyalty and service in the military may be in conflict with a
religious ideology that one should never hurt others.
BIG IDEA
Ideology is not the same as having a personal opinion. By definition, it refers to a collection of shared
beliefs among a group, not a single belief belonging to one person. Furthermore, ideology is never a single
value, but a complex set of ideas. Finally, ideology is about who you are, what you do, and why and how
you do it. You can have opinions about issues that only tangentially concern you, but ideology is
fundamentally about how you live and participate in the world.
1. Ideology as a Benefit to Critical Thinking
Ideology can be both a benefit and a barrier to good critical thinking. When used in a beneficial way, an
ideology gives us a framework for having discussions or making decisions. It gives us integrity by making sure
our actions and opinions are consistent with our stated values. In this way, ideology can bolster good critical
thinking by requiring us to interrogate our other beliefs and giving us a standard by which to measure them.
EXAMPLE Because she believes in minimal government, Kara often bases her political decisions on
that ideology. When the city proposes a new dog park, she is initially excited, knowing she would take her
own dog there daily. She then returns to her ideological principles and asks herself tough questions: Should
people without dogs have to pay taxes to subsidize her pets? Should local government even be involved in
recreational pursuits? She knows she would be opposed to the plan if she did not have a dog, and so feels
she should be against the plan based on her ideology. She feels this gives her integrity, letting core
principles guide her choices instead of personal interest.
Having the common ground of a shared ideology can also guide thoughtful, constructive debates. All
professional and academic fields are guided by ideologies about the goals, values, and methods they pursue.
While there can be brisk, even intense, arguments within any field, this shared ideology makes argument part of
the process of discovery and analysis, an essential part of critical thinking.
EXAMPLE Teachers of young children can disagree about the best way to teach children to read, but
share the ideology that they should teach children to read, and will pursue whatever method is proven to
work best.
Similarly, one of the foundational American ideologies is the importance of open debate on issues of public
importance. Many laws and legal decisions over the centuries of the American experiment have honed the idea
that Americans should be allowed to freely debate their political opinions without censorship. Though
disagreements may be intense, all sides are usually committed to that ideology of the public debate. This
makes ideology essential to critical thinking, at least at a social level.
THINK ABOUT IT
What is one of your guiding ideologies? How would you describe it to someone else?
TERMS TO KNOW
Ideology
A deeply-rooted set of interrelated core beliefs that guides a person’s commitment to other values,
principles, and decisions. Often ideology is a key part of our identity and is held by many members of
the same group.
Integrity
The quality of making sure your actions and opinions are consistent with your stated values.
2. Ideology as a Barrier to Critical Thinking
When we do not critically analyze what assumptions our ideologies contain, or fail to recognize the assumptions
upon which our beliefs are built, our ideologies negatively affect our critical thinking. This is because our
ideologies trick us into believing these assumptions are justified simply because they are consistent with the
ideology. We may even come to think of ideology as universal truth instead of beliefs. The ideology itself
becomes the “all good” that must be defended instead of a guiding principle.
EXAMPLE Ken is a business owner and an avowed capitalist. Capitalism’s guiding principle is private
ownership of industries and their operation for profit. Ken believes that a free and unregulated market leads
to economic strength. He argues with his brother that a new government program is wrong because it
violates the main tenets of capitalism. His brother argues that the program has already had a positive
measurable impact. Ken argues that even if the program does do some good, it is still wrong because it is
“socialism.” Socialism is an ideology favoring government intervention in the running and regulation of industries,
and in the distribution of wealth. In reality, most modern economies (including the U.S.) have aspects of
capitalism and of socialism. But instead of using his ideology to guide or justify his decisions, Ken has a rigid
commitment to it that shuts down critical thinking.
A person with such rigid adherence to a set of principles is sometimes called an ideologue, and the practice of
insisting on those principles as the only possible good is described as dogmatic. Both of these terms essentially reflect that a
person can no longer get along with or talk to people with other beliefs. They might begin to see people with
conflicting ideologies as morally corrupt. Indeed, you don’t have to go far on the Internet to find people utterly
convinced that socialists are lazy leeches on working people, or that capitalists are motivated entirely by greed
and avarice.
Indeed, an ideologue may even begin to see members of their own group as corrupt because they are not as
devoted to the ideological principles as they should be. This is common in politics and religion, where internal
struggles are often between idealism, or devotion to the ideology, and pragmatism, which emphasizes practical
and workable solutions that may compromise the ideology. However, idealism is not synonymous with being an
ideologue; any ideology is strengthened by these kinds of arguments. One is only an ideologue when they
become unwilling to consider different points of view.
BIG IDEA
Ideologies can even be directly and explicitly against critical thinking. For example, some education systems
emphasize rote memorization over creativity and analysis. While not an ideology in itself, this practice is grounded
in ideologies that emphasize the discipline of children, and thus see education only in the context of discipline. This
means that students in these systems are not encouraged or trained to think or debate and do not learn that
there is more than one way to deal with a single issue. Their ability to think critically is stifled at an early age.
Adherents to the ideology will insist that such practice is best because it is consistent with their ideology, not
because they have any evidence that it is effective. Some ideologies have even made intolerance of subversion
or lack of consideration of other opinions a central tenet, essentially criminalizing critical thinking.
EXAMPLE Karen joins a religious group that forbids her to interact with people who don’t support her
“spiritual journey,” which can mean anybody who questions or challenges the authority of the group itself.
This means cutting off contact with her family and friends.
This is why we have to be careful with our ideologies. They can give us guiding principles for coming to
decisions that are consistent with core beliefs, and common ground to make our debates with opponents
constructive. But we have to take care that ideologies aren’t used to justify unquestioned assumptions or bad
decisions.
THINK ABOUT IT
Earlier you identified one of your guiding ideologies. Now think about how that ideology applies to your
own life and thinking. What beliefs do you have because of the ideology? How has your ideology guided
your decisions?
TERMS TO KNOW
Ideologue
One with rigid adherence to a set of principles.
Dogmatic
The practice of insisting on one’s principles as the only possible good or correct beliefs.
Idealism
Devotion to an ideology.
Pragmatism
Focus on practical and workable solutions that may entail compromise.
SUMMARY
In this lesson, we learned about ideology, a complex set of beliefs that defines who we are and how we
live and participate in society. We learned about ideology as a benefit to critical thinking, such as when
it supports ethical decision-making and gives common ground for public debate. However, we also learned
about ideology as a barrier to critical thinking, when we stop questioning our core assumptions or,
worse, presume that conflicting ideologies are formed with bad intent and refuse to engage with
people who hold them.
REFERENCES
Davidson, B. (1994, April 15). Obstacles and Opportunities for Critical Thinking in Japan. Retrieved from
search-ebscohost-com.ezproxy.snhu.edu/login.aspx?direct=true&db=eric&AN=ED377216&site=eds-live&scope=site
TERMS TO KNOW
Dogmatic
The practice of insisting on one’s principles as the only possible good or correct beliefs.
Idealism
Devotion to an ideology.
Ideologue
One with rigid adherence to a set of principles.
Ideology
A deeply-rooted set of interrelated core beliefs that guides a person’s commitment to other values, principles,
and decisions. Often ideology is a key part of our identity and is held by many members of the same group.
Integrity
The quality of making sure your actions and opinions are consistent with your stated values.
Pragmatism
Focus on practical and workable solutions that may entail compromise.
Implicit Bias
by Sophia
WHAT'S COVERED
In this tutorial, you will learn how to recognize implicit bias that affects how you judge and treat other
people and explore some of the forms this bias can take. Specifically, you will learn about:
1. Recognizing Implicit Bias
2. Kinds of Implicit Bias
2a. The Halo Effect/Horns Effect
2b. Affinity Bias
2c. Confirmation Bias
2d. Attribution Bias
3. Combating Implicit Bias
EXAMPLE A husband in a heterosexual marriage may honestly believe in gender equality and still
presume his wife is primarily responsible for housework and childcare.
Implicit bias can be any kind of bias that automatically and unintentionally affects critical thinking. Implicit bias is
based on assumptions, beliefs, or attitudes that we are typically not consciously aware of, but which affect and
limit our thinking. In use, implicit bias usually refers to how we judge and treat other people.
TRY IT
You can check your own implicit bias at Project Implicit, which provides many self-assessments.
Many studies show that people have implicit bias even if they don’t know it (that’s the definition of implicit!).
They may even actively try to avoid bias, while still exhibiting implicit bias in testing. For example, people will
expressly say that they are not racist and have no racist beliefs of any kind, but when asked a series of
questions, they show a strong tendency to rely on stereotypes to make snap decisions. One study asked
people to pick the most “suspicious” person from a row of photos. As the test sped up, people were more and
more inclined to quickly click on the person of a different race. Another study asked people to quickly sort a
series of words into categories like “work” and “family.” These words were paired with words like “male” and
“female,” sometimes with the traditional gender roles aligned with each category, other times misaligned.
People showed they were much quicker to place a word like “suitcase” in the “work” category when it was
aligned with “male,” but took a moment longer when it was paired with “female.” They consciously knew that
the word suitcase belonged in the “work” category but were slowed by the subconscious feeling that putting it
into the category also labeled “female” was somehow wrong.
This doesn’t mean those people lied about not being racist or sexist. They don’t want to be racist or sexist, and
may even be passionate about avoiding these biases, but their biases are not conscious. This makes implicit
bias one of the most insidious and difficult forms of flawed thinking to overcome.
Implicit bias can be either favorable or unfavorable, and may be linked to almost any trait, such as assuming a
tall person is good at basketball. This kind of implicit bias is mostly (but not completely) harmless, but implicit
bias also leads to widespread discrimination based on race, religion, gender, national origin, age, sexual
orientation, and ability (among other things), affecting access and opportunities for members of marginalized
groups in employment, education, and housing. Note that this is again unconscious bias; even people acting in
what they believe to be a neutral manner can make unfair generalizations.
EXAMPLE A 2017 Harvard study found that resumes written to suggest a white candidate fared markedly
better than otherwise identical resumes written to suggest a nonwhite candidate.
Studies like the ones described above show that implicit bias affects all of us, including those of us who
sincerely profess to be against prejudice and discrimination of any kind. Although it is socially unacceptable to
admit to holding such ugly stereotypes, they are still there subconsciously. Claiming to be “color blind,” or
otherwise uninfluenced by implicit bias is itself a lack of critical thinking. To truly avoid such influence, it takes
the self-awareness to know that these biases are ingrained by our exposure to them throughout our lives.
These biases can manifest themselves in many ways. While they are not always directed at historically
oppressed groups, they often are. All of the implicit biases in this section involve prejudging people, making
assumptions about them based on one or more traits, and can lead to discrimination.
TERM TO KNOW
Implicit Bias
Any kind of bias that automatically and unintentionally affects critical thinking. Implicit bias is based on
assumptions, beliefs, or attitudes that we are typically not consciously aware of, but which affect and
limit our thinking. In use, implicit bias usually refers to how we judge and treat other people.
2a. The Halo Effect/Horns Effect
The halo effect is the tendency to let a positive perception of an individual color our opinions of many other
characteristics of that person. It's called the halo effect because you can imagine a person's one good quality
putting an angelic halo over them, making their whole person seem to be angelic as well. One example of the
halo effect is the perception of a person's intelligence being greatly influenced by the prestige of the school
they attended. People who attend Ivy League schools or who have advanced degrees (especially PhDs) are
more likely to be treated as productive, thoughtful, and intelligent than their peers, despite—and sometimes in
direct contradiction of—their actual behaviors and actions in the workplace. Another common example of the
halo effect is being more forgiving of wrongdoing if a person is perceived as being religious, especially if they
share one's specific religious beliefs.
Perhaps unsurprisingly, research has shown that one of the most common factors that influences the halo effect
is attractiveness. Perception of attractiveness also affects perception of success in life and personality.
Attractive people are seen as more charming or given the benefit of the doubt if they make mistakes.
EXAMPLE The famous presidential debate between John F. Kennedy and Richard Nixon in 1960
provides a potential example of the halo effect. Prior to the debate—the first presidential debate to ever be
televised—the two candidates were neck-and-neck. But according to popular wisdom, Kennedy's
handsome face and fashionable grooming stood in stark contrast to Nixon's sweaty appearance and five
o'clock shadow. Kennedy took the lead in the race after the televised debates, and the halo effect might
very well have contributed to the impression that Kennedy decisively won those debates. In fact, there is a
popular myth that voters who listened to the debates on the radio thought that Nixon performed better!
The horns effect is the opposite phenomenon, letting one negative quality color our perceptions of other (even
completely unrelated) qualities. The metaphor in the name of the effect is the reverse of the angelic halo; in this
case, we are allowing a negative quality to give someone devilish horns that make us perceive their whole
person negatively as well. For example, research has shown that if people do not like the way a product looks,
they are less likely to buy it, even if it is potentially beneficial to them and the packaging has no impact on its
effectiveness. Much worse is when this carries over to people, judging them based on appearance or assuming
the worst of them because they have opposing views that shouldn’t affect their job. For example, there is
measurable and proven discrimination against unattractive people in hiring practices.
The horns effect is a serious, sometimes dangerous impediment to good critical thinking.
EXAMPLE One area in which the horns effect can even sometimes be life-threatening is medicine.
Physicians regularly fall into the trap of judging patients based on their appearance or character qualities
without conducting a medical examination first, or choose which tests to run based on these initial
judgments. For example, a general practitioner may presume a heavyset patient’s stomach pains are due to
overeating and recommend a better diet, and miss a much more serious issue.
THINK ABOUT IT
In what ways are you affected by the halo and horns effects?
TERM TO KNOW
Halo Effect/Horns Effect
A form of implicit bias where favorable or unfavorable opinions on one trait color our perceptions of
other traits.
2b. Affinity Bias
Affinity bias is an implicit bias that leads us to favor people who we perceive as similar to us or whom we feel connected to.
The affinity bias can radically change our behavior in the same situations depending on whether we feel affinity
for the person or not. For example, when a person towards whom we feel an affinity appears
uncomfortable, we might smile or approach them to be inviting. For a person towards whom we do not feel an
affinity, we might not make the same effort. This can create a snowball effect because other people will react to
our extra warmth with their own warmth, which can create a deeper affinity. At the same time, those with whom
we don’t have affinity may respond to our blasé attitude with distance of their own, which will further widen the
affinity gap.
EXAMPLE A colleague tells a slightly inappropriate joke. You enjoy chatting with him because you have
many common interests. You don’t want to ruin that casual friendship, so you avoid rebuking him for the
joke as you might with another colleague.
THINK ABOUT IT
Imagine that you are traveling in another country where there are very few tourists at all, let alone people
from the same place as you. Walking down the street, you happen to see a person wearing a t-shirt from
your favorite coffee shop back home! Do you approach them? Do you think about them differently, even for
a moment, than you think about all the other strangers around you?
TERM TO KNOW
Affinity Bias
An implicit bias that leads us to favor people who either we perceive as similar to us or who we feel
connected to.
2c. Confirmation Bias
Confirmation bias is the tendency to search for, focus on, analyze, and remember information that aligns with our preconceived perceptions and opinions.
EXAMPLE A study at Stanford University gave several people the same study on capital punishment,
then interviewed them about what they recalled and concluded from the study. Although all people read
the exact same study, people who oppose capital punishment felt the article supported their position and
recalled the facts that confirmed this. People supporting capital punishment recalled different facts, also in
the article, that they felt supported their position.
One of the most important ways in which this shows up implicitly is in job and school recruitment. Recruiters
have to be careful with this potential for implicit bias. If a recruiter makes a judgment about a candidate based
on the affinity bias or halo/horns effect, they will subconsciously look for evidence that backs up their first
opinion. The danger in confirmation bias is that it can and does result in systemic loss of good job candidates
and even sometimes the hiring of bad ones.
TERM TO KNOW
Confirmation Bias
The tendency to search for, focus on, analyze, and remember information that aligns with our
preconceived perceptions and opinions.
2d. Attribution Bias
Attribution bias leads us to attribute our own successes, and those of people we are close to, to hard work and talent, and our setbacks to bad luck or unfairness.
However, when we assess people we are not close to, we tend to do the opposite. We consider their
achievements due to good luck or receiving special favors, and their failures a result of their personal limitations
or character flaws. This can cause a significant issue for our perceptions of others and how we interact with
them. Worse, we can be even less likely to attribute people’s successes to their hard work and talent when they
are from marginalized groups, and similarly see their failures as due to central character flaws.
EXAMPLE Influencers on social media demonstrate the effectiveness of the halo effect. Because they
are charismatic and savvy with social media, they are also perceived as having great insight and taste, so
their followers purchase the products they promote or endorse.
TERM TO KNOW
Attribution Bias
A form of implicit bias where we tend to attribute the success of ourselves and the people we’re close
to, to hard work and talent, and the setbacks to bad luck or unfairness; we then do the opposite to
people we do not feel connected to.
3. Combating Implicit Bias
Another thing to try is developing at least two starkly different perceptions of someone when you first meet
them. Over time, as you get to know the person better, one perception might look increasingly more correct
than the other, but the two perceptions will have kept you from prematurely putting a person into a box.
We can also keep these biases in mind and remind ourselves of them when we find ourselves strongly pulled
toward one person instead of another, or we find that the evidence is piling up in our head for one version of a
story. We can give all people the benefit of the doubt in judging both their success and setbacks, and
acknowledge both our wins and losses as complex and due to many factors.
Interrogating our thinking for these biases, slowing down, being thorough, and learning all of the ways in
which such effects can arise, sharpens our thinking and is the most important thing we can do to combat implicit bias.
For the kinds of implicit bias that lead to stereotypes and discrimination, accepting their existence in our own
thinking can only take us so far. We often need to set up objective standards and methods to ensure that our
decision-making and assessments are fair and free of bias.
EXAMPLE To combat implicit bias in college admissions, an application management system is sometimes
used that conceals information revealing or implying an applicant’s identity, such as name and zip code.
This helps reviewers judge applications more fairly.
THINK ABOUT IT
Think very open-mindedly about what implicit biases you may have when considering different people and
groups that you do not belong to. Consider how these subconscious stereotypes could affect your thinking
if you were called upon to assess a member of such a group.
SUMMARY
In this lesson, we learned that implicit bias works unconsciously but adversely affects good critical
thinking, particularly in the way we judge and treat other people. We learned about recognizing implicit
bias, beginning by accepting that all people are affected by it, as research has proven. We learned
about specific kinds of implicit bias, such as the halo effect/horns effect, and the related affinity bias,
confirmation bias, and attribution bias. Finally, we learned about combating implicit bias.
REFERENCES
Miller, C. C. (2020, February 11). Young Men Embrace Gender Equality, but They Still Don’t Vacuum. The New
York Times. https://www.nytimes.com/2020/02/11/upshot/gender-roles-housework.html
Gerdeman, D. (2017, May 17). Minorities Who "Whiten" Job Resumes Get More Interviews. Harvard Business
School. hbswk.hbs.edu/item/minorities-who-whiten-job-resumes-get-more-interviews
TERMS TO KNOW
Affinity Bias
An implicit bias that leads us to favor people who either we perceive as similar to us or who we feel
connected to.
Attribution Bias
A form of implicit bias where we tend to attribute the success of ourselves and the people we’re close to,
to hard work and talent, and the setbacks to bad luck or unfairness; we then do the opposite to people we
do not feel connected to.
Confirmation Bias
The tendency to search for, focus on, analyze, and remember information that aligns with our
preconceived perceptions and opinions.
Implicit Bias
Any kind of bias that automatically and unintentionally affects critical thinking. Implicit bias is based on
assumptions, beliefs, or attitudes that we are typically not consciously aware of, but which affect and limit
our thinking. In use, implicit bias usually refers to how we judge and treat other people.
Cognitive Bias
by Sophia
WHAT'S COVERED
In this lesson, we will learn of some cognitive biases that interfere with good analysis of information. In
particular, we will learn about:
1. Illusory Correlation Bias
2. Illusory Causation Bias
3. Hindsight Bias
4. Egocentric Bias
5. Sunk Cost Bias
In the previous tutorial, you learned about many forms of implicit bias. These forms of implicit bias often
stem from, or correspond to, socially conditioned biases against marginalized groups of people. They
also include more general biases that favor people “like us” and color our perceptions of people “not like
us.”
In this tutorial, we will look at cognitive bias, or errors in the way we gather and process information.
Though still implicit (meaning we are not aware of them), cognitive biases tend to be less socially
constructed and more rooted in psychological tendencies.
However, you may begin to see that biases can come in clusters, such as a heavy political leaning that
leads to a halo effect with like-minded people, confirmation bias that sustains and deepens the belief, and
an attribution bias that sees all politicians from other parties as illegitimate. Any of these biases can further
cement those implicit biases, so while we have names for different tendencies, these are not mutually
exclusive.
BIG IDEA
Being aware of bias isn’t about naming and categorizing each error; rather, it gives us a way to reflect on
our own thinking.
1. Illusory Correlation Bias
You may know that correlation is an established relationship between two variables or sets of data. For
example, height has a logical correlation to weight—on the whole, the taller a person is, the more they weigh.
Illusory correlation bias is when we jump to conclusions about a relationship between two events or conditions
when no relationship exists. In other words, an observer sees a relationship that isn’t there, usually on the
basis of limited data.
EXAMPLE Athletes are famously prone to illusory correlation bias, in the form of superstitions. Michael
Jordan wore a pair of lucky shorts under his uniform for every game with the Chicago Bulls, while Serena
Williams ties her shoes in exactly the same way before each tennis match. When an athlete attributes their
big win to a specific routine or a lucky object, that's the illusory correlation bias at work.
Things like superstitions seem mostly harmless, so why is it important to notice and eliminate illusory
correlations? First of all, we often make risk calculations and decisions on the basis of correlations. For example,
we might avoid driving when roads are at their busiest, because we know there is a correlation between busy
roads and traffic collisions. But in order to make good decisions, we need our correlations to be accurate. For example, if we know several people who learned they had major health issues when they saw a doctor, it would be foolish to avoid seeing a doctor out of the belief that checkups cause health issues.
Illusory correlations can also have damaging effects on a political level. For example, politicians have led the
American public to connect violent crime with immigrant communities. This has resulted in harsh immigration
restrictions and strict deportation methods. However, many studies have shown there is no positive correlation between immigration and crime. In fact, studies show a weak negative correlation, meaning that immigrants are slightly less likely to commit a violent crime than native-born citizens.
To prevent illusory correlation bias, we need ample data and a clear understanding of what kind of relationship qualifies as a genuine correlation. In some cases, it may take training in data analysis to determine whether a correlation is really present.
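To see how easily limited data produces phantom correlations, here is a small simulation (a Python sketch, not part of the tutorial's source material). It correlates two streams of random numbers that are, by construction, completely unrelated:

```python
import random

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(42)  # fixed seed so the demonstration is repeatable

# Two completely independent streams of random numbers.
big_x = [random.random() for _ in range(10_000)]
big_y = [random.random() for _ in range(10_000)]

# With plenty of data, the correlation of unrelated variables sits near zero.
print("10,000 observations:", round(pearson(big_x, big_y), 3))

# With only five observations per trial, chance alone regularly produces
# correlations that look impressive.
small = [
    pearson([random.random() for _ in range(5)],
            [random.random() for _ in range(5)])
    for _ in range(20)
]
print("largest of twenty 5-observation trials:", round(max(abs(c) for c in small), 2))
```

The variables are unrelated in both cases; only the sample size changes. This is the statistical core of illusory correlation: small samples make coincidences look like patterns.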
EXAMPLE Some scientific studies claiming a correlation even pass peer review and get published before closer statistical analysis reveals the correlation is illusory. One such paper is the famous Dunning-Kruger paper, “Unskilled and Unaware of It,” which showed (at a glance) a high correlation between low performance on a task and high self-assessment; in short, that people who think they are the best are really the worst! This paper became well known outside academic contexts, with people on social media often invoking the Dunning-Kruger effect when they see someone whose understanding of a topic is low but whose self-perception is high. While such individuals exist, the statistical correlation in the paper was debunked by multiple subsequent studies.
TERMS TO KNOW
Cognitive Bias
An error in the way we gather and process information.
Correlation
An established relationship between two variables or sets of data.
Illusory Correlation Bias
Perceiving a relationship between two events or conditions when no relationship exists. In other words,
an observer sees a relationship that isn’t there, usually based on limited data.
2. Illusory Causation Bias
EXAMPLE Ice cream sales and drownings have a high statistical correlation, with each going up and down at the same times and to the same degrees throughout the year. One might see the correlation and draw the false conclusion that ice cream hampers swimming ability. Of course, the real reason for the statistical correlation is that people both eat ice cream and go swimming in warmer weather, but neither action causes the other.
It is common for people to make reasoning errors by presuming that when two data sets correlate, one event must cause the other. This is illusory causation bias, and it’s important to note that the data itself, even the correlation, can be fully accurate. The illusion is seeing causation.
EXAMPLE In 2020, viral memes showed a high correlation between 5G networks and Covid-19 infections. Comparing two heat maps (geographic maps with colors to show “hot zones”) seemed to offer striking proof that the two phenomena were concentrated in the same areas. This led people to see causation between 5G networks and coronavirus outbreaks, when the explanation for the correlation was simply that both the virus and cell phone use were more prevalent in densely populated areas.
To avoid being tricked by illusory causation bias, we can look for simpler explanations of a correlation, such as a third factor (like population density) that drives both variables.
THINK ABOUT IT
Both illusory correlation and illusory causation depend on the human ability to recognize patterns, and are
thus closely connected to the ability to discover authentic correlations or causations. How do we determine
if a correlation or causation is real or illusory?
The best way to test a theory of correlation or causation is to see if it can lead to accurate predictions. Such tests usually require large data sets and skilled analysis, but it's safest to treat any such theory as unproven until it can be verified.
TERMS TO KNOW
Causation
When one event or action directly causes another.
3. Hindsight Bias
Hindsight bias is the tendency to perceive past events as being more predictable than they were. In other words, people tend to believe that they would have anticipated an event or outcome, or claim that they suspected or foretold a historical event they lived through.
EXAMPLE After the terrorist attacks on September 11, 2001, in New York City, many New Yorkers
claimed they had an eerie feeling that morning, that they called in sick to work because they had a bad
dream foretelling the attack, or otherwise suspected something was wrong. But firsthand testimony after an emotionally fraught event is highly suspect, and there is no historical evidence that people communicated that uneasy feeling before the attacks. This doesn’t mean people are lying, but that as they recall the events, they have a tendency to see themselves as having had more foresight than they did.
This bias also leads us to judge others. When others make decisions that don’t work out as planned, we can claim we would have known better. For example, you have probably heard people second-guessing the decisions coaches make during a game. In politics, any presidential administration is attacked for its failure to prepare for some unforeseen emergency. In law, people can be unfairly convicted of criminal negligence because both prosecutors and juries know the outcome and find it much more predictable than it was. Hindsight makes it easy to find the hints that someone supposedly should have acted on, and that the critic believes they themselves would have acted on.
Research suggests that people exhibit the hindsight bias even when they’re aware of the bias and try to
eliminate it, which makes it more difficult to combat than other biases. However, we can still be open-minded,
consider alternative explanations, think about different perspectives, and be honest with ourselves as we reflect
on past events. Reminding ourselves that “hindsight is 20/20” will help keep our own biases in check.
TRY IT
Is there a recent event in your life that felt fated to occur or "you knew would happen"? It may be on a
personal level or on a national level. Reconsider the event and what would need to change to arrive at a
different outcome. This practice often leads to seeing how an outcome could be quite different based on
small factors, and how the event was thus not as predictable as we thought.
TERM TO KNOW
Hindsight Bias
The tendency to perceive past events as being more predictable than they were.
4. Egocentric Bias
The egocentric bias (or egocentrism) is the tendency to have a higher opinion of oneself than is realistic or to
give unusual weight to one’s own self-interest. While people often fail to consider situations from other people’s
perspectives, egocentric bias is an extreme version, where a person not only fails to consider those
perspectives, but considers them irrelevant or less important. In some ways, this bias makes sense, since people are more aware of their own behaviors and have direct access to their own thoughts and emotions in a way they do not have for other people. But egocentric bias can distort ethical judgments, leading people to believe outright that self-interested outcomes are not only preferred but morally superior.
EXAMPLE The “NIMBY” factor, or “not in my back yard,” is a common example of egocentric thinking.
People may see the need for public projects like housing for low-income people or halfway houses for
people who have recently left prison, but don’t want them in their own neighborhoods.
Another effect of egocentrism is believing one’s contributions to a collaborative project are greater than those of the other team members. People may also overestimate how much others have negatively impacted the group (e.g., starting arguments) and underestimate how often they did so themselves. Both of these errors come from having a heightened awareness of one’s own experiences while not necessarily witnessing what others have done, but they may also be due to failing to reflect accurately on the entire experience.
EXAMPLE Men in heterosexual partnerships tend to believe they do as much or more housework than
their partners, but studies show they overestimate their own contributions and underestimate their partners’
contributions.
Unlike the hindsight bias, the egocentric bias can be combated by making people aware of the bias. You can
remind yourself that you may not have all the information you need, or that you have a natural instinct to inflate
your own work.
TERM TO KNOW
Egocentric Bias
The tendency to rely too heavily on one’s own perspective, not considering the perspective of others,
and/or to have a higher opinion of oneself than is realistic. Hindsight bias and attribution bias are two
forms of egocentric bias.
5. Sunk Cost Bias
The sunk cost bias is the tendency to continue an endeavor because of resources already invested (time, money, or effort), even when the future costs outweigh the future benefits. Consider an example.
EXAMPLE Joan buys a non-refundable ticket to a meditation retreat in the woods. On the day of the
retreat, she feels nauseous with a low-grade fever. Further, it’s foggy, humid, and thunderstorms are on the
horizon. Joan knows that traffic and the trails are going to be worse because of the storm, and going out in
the woods will pose a significant risk to getting sicker. She no longer expects the retreat to be enjoyable,
because she doesn’t feel up to it. Furthermore, if her sickness is contagious, she might pass it on to others.
It seems like the current risks and drawbacks definitely outweigh the benefits. However, Joan is reluctant to
give up on the trip because of her investment. Besides the money she’s spent, she has been looking
forward to it for weeks, taken time off work, and arranged her life around it. It seems silly to do all that and
then just stay home! The sunk cost bias explains why she’s so likely to go on the retreat even when she
knows she won’t have a good time.
Acting rationally requires that we take only future benefits and costs into account—namely, the ones that can still be changed. But psychologically it is hard to accept a certain loss. The sunk cost bias can have many bad effects on our lives, from staying in an unhappy relationship because of the time we’ve already spent with our partner, to keeping a clunker car because we’ve already spent so much money on repairs. The sunk cost bias also impacts large-scale decisions that governments and companies make.
EXAMPLE In 1962, British and French engine manufacturers and governments started work on the Concorde airplane, which would be the first supersonic passenger plane. The plane itself was a masterpiece of engineering, but it became evident during development that costs were greatly surpassing expectations and would never be earned back in profits. Nonetheless, the project continued because manufacturers and governments had already made significant financial and time investments in it. Moreover, the project had received a lot of positive press, and they feared the embarrassment of abandoning it midway. Such decisions cost taxpayers millions of dollars and can undermine other crucial public investments.
Psychologically, the sunk cost bias is caused by an unwillingness to take an intentional loss. But when we
become aware of the sunk cost bias, we can be more intentional in our thinking, setting aside feelings of pride
or fear of embarrassment and looking only at future costs and benefits.
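The rational rule, “count only future costs and benefits,” can be made concrete with a toy decision function (a Python sketch with made-up numbers, not from the tutorial):

```python
def should_continue(future_benefit, future_cost, sunk_cost=0):
    """A rational decision rule: sunk_cost is accepted as an argument
    but deliberately ignored, because money already spent cannot be
    recovered either way."""
    return future_benefit > future_cost

# Joan's retreat (hypothetical numbers): little expected enjoyment,
# real risk of getting sicker, and a non-refundable ticket.
print(should_continue(future_benefit=10, future_cost=50, sunk_cost=200))   # False

# The answer is identical whether the ticket cost 200 or 2,000 dollars.
print(should_continue(future_benefit=10, future_cost=50, sunk_cost=2000))  # False
```

The sunk cost bias amounts to letting the deliberately ignored third argument change the answer.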
THINK ABOUT IT
We have all fallen prey to egocentric bias or sunk cost bias at some time in our lives, whether the stakes were high or low. Can you think of examples? How would you approach similar situations differently with these biases in mind?
TERM TO KNOW
Sunk Cost Bias
The tendency to continue an endeavor because of resources already invested, rather than weighing only future costs and benefits.
SUMMARY
In this lesson, you learned about more forms of implicit bias, particularly ones that involve interpreting data, such as illusory correlation bias and illusory causation bias. Other poor decisions are driven by a tendency to overrate our own abilities or foresight, as with hindsight bias and egocentric bias. Similarly, the sunk cost bias prevents us from admitting we made a mistake.
REFERENCES
Jarry, J. (2020). The Dunning-Kruger effect is probably not real. McGill University Office for Science and Society. www.mcgill.ca/oss/article/critical-thinking/dunning-kruger-effect-probably-not-real
Lord, C. G., Ross, L., & Lepper, M. R. (1979). Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence. Journal of Personality and Social Psychology, 37(11), 2098–2109.
McNiff, E., & Phillips, H. (2012, October 24). Disaster predictions: People claim premonitions of Sept. 11 attacks, Japanese tsunami. ABC News. abcnews.go.com/US/sept-11-terrorist-attacks-japanese-tsunami-people-claim/story?id=17553825
Oeberst, A., & Goeckenjan, I. (2016). When being wise after the event results in injustice: Evidence for hindsight bias in judges’ negligence assessments. Psychology, Public Policy, and Law, 22(3), 271–279.
Pew Research Center. (2015). Raising kids and running a household: How working parents share the load. www.pewresearch.org/social-trends/2015/11/04/raising-kids-and-running-a-household-how-working-parents-share-the-load
TERMS TO KNOW
Causation
When one event or action directly causes another.
Cognitive Bias
An error in the way we gather and process information.
Correlation
An established relationship between two variables or sets of data.
Egocentric Bias
The tendency to rely too heavily on one’s own perspective, not considering the perspective of others,
and/or to have a higher opinion of oneself than is realistic. Hindsight bias and attribution bias are two
forms of egocentric bias.
© 2024 SOPHIA Learning, LLC. SOPHIA is a registered trademark of SOPHIA Learning, LLC. Page 61
Hindsight Bias
The tendency to perceive past events as being more predictable than they were.
Illusory Correlation Bias
Perceiving a relationship between two events or conditions when no relationship exists.
Sunk Cost Bias
The tendency to continue an endeavor because of resources already invested, rather than weighing only future costs and benefits.
Formal Fallacies
by Sophia
WHAT'S COVERED
In this tutorial, you will learn about fallacies, or errors in reasoning, with a focus on formal fallacies. In
particular, you will learn about:
1. Formal Fallacies
2. Denying the Antecedent
3. Affirming the Consequent
1. Formal Fallacies
In the previous challenge, we learned about implicit and cognitive biases, which are errors in thinking based on underlying assumptions or mistakes in the way we gather and process information. We are now ready to begin talking about fallacies, which are also errors in reasoning, but ones that usually take shape only when you begin to make an argument.
So, what is a fallacy? A fallacy is the use of faulty reasoning in the construction of an argument, which may make the argument appear stronger than it really is (particularly if the fallacy is not spotted). Biases are deeply rooted in the way you think and are hard to eradicate; fallacies are usually easier to spot and counter. And while biases are by definition unintentional, fallacies can be committed willfully to skew an argument in bad faith.
Throughout this class we will learn how to construct and analyze arguments in both natural language (meaning arguments constructed in plain English or other spoken languages) and in formal logic, a kind of structured reasoning that follows a set of rules and often uses mathematical symbols. The goal of formal logic is to study the “pure structure” of arguments, understanding how one or more claims (or premises) lead to a result (or conclusion). Both premises and conclusions are called statements, which simply means they are declarative sentences that assert something to be true.
We call this set of statements a logical argument or formal argument. It looks quite different from our usual idea of an argument, such as an editorial in the newspaper or a spat at the playground. A logical argument is a series of statements, including one or more premises and a conclusion, where the conclusion can be arrived at from the premises.
As you will see, some logical arguments can be similar in scope to opinion pieces; that is, they start with agreed-upon premises and try to show how those premises lead to a conclusion. Other logical arguments are purely factual, showing that with one set of premises we must arrive at a certain conclusion. And in many of the
examples you will see, there is no “argument” in the sense that we are trying to change someone’s mind, justify
a belief, or convince someone to take action. They may be purely hypothetical statements about nonexistent
people. The purpose of these arguments (to use the logical term) is to show the mechanics of logical reasoning,
so they can be applied to more sophisticated arguments later.
However, some basic formal logic is easy to follow without special training, such as this example.
1. All fish live in water.
2. Salmon are fish.
3. Therefore, salmon live in water.
This is a valid argument, meaning the conclusion is deducible from the premises. You will learn a lot more about
formal logic in subsequent units. For now, you can see that this argument has a conclusion, starting with
“therefore,” that follows logically from the premises. Note that you can replace the words “fish” and “water” with
any plant, animal, or biome. This shows that the structure (or form) of the argument works even as you swap out
the content. In fact, as you will learn later, you can have a logical argument without any words at all.
HINT
The word formal is used a lot in logic: formal logic, formal argument, formal language. This doesn’t mean
“formal” in the sense of wearing a tux to a posh event, but that they take a particular form or follow strict
rules in the way they are composed.
Examples like this reveal a basic logical rule. If we know something leads to something else (e.g., if…then…), and we know that the first thing is true (that which comes after “if”), then we can conclude the second thing (that which comes after “then”).
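This rule, known in logic as modus ponens, can be checked mechanically. The short Python sketch below (not part of the tutorial's source material) tries every combination of true and false for the two statements and looks for a counterexample, a case where the premises are true but the conclusion is false:

```python
from itertools import product

def implies(p, q):
    """The conditional 'if p then q' is false only when p is true and q is false."""
    return (not p) or q

# Form: premise 1 is 'if P then Q', premise 2 is 'P', conclusion is 'Q'.
# A form is valid if NO assignment makes both premises true and the conclusion false.
counterexamples = [
    (p, q)
    for p, q in product([True, False], repeat=2)
    if implies(p, q) and p and not q
]
print(counterexamples)  # [] -- no counterexample exists, so the form is valid
```

Because the check covers every possible truth assignment, it confirms the form works no matter what content (“fish,” “water,” or anything else) we plug in.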
TRY IT
Presuming each of the following arguments is valid, see if you can guess the missing statement.
1. All elephants have big ears.
2. ______________________________
3. Therefore, Dumbo has big ears.
Dumbo is an elephant.
1. ______________________________
2. Peter is a good dog.
3. Therefore, Peter never bites anyone.
Good dogs never bite anyone.
Using formal logic is also a good way to reveal errors in logic, or formal fallacies. Formal fallacies are errors in
the structure of an argument, leading to a conclusion that does not follow from the premises, as you will see in
the examples below.
TERMS TO KNOW
Fallacies
Errors in reasoning; unlike biases, these are not based on underlying assumptions or the way you think,
but flaws in the argument itself. Fallacies may also be used intentionally when arguing in bad faith.
Formal Logic
Structured reasoning that follows a set of rules and may use mathematical symbols.
Premises
The statements that should logically lead to (or be the evidentiary support for) the conclusion.
Conclusion
The logical result of the argument.
Statement
A declarative sentence that asserts something to be true.
Valid
The conclusion follows logically from the premises.
Formal Fallacy
An error in the structure of an argument.
2. Denying the Antecedent
The first formal fallacy we will consider is denying the antecedent, which misuses a conditional (if/then) statement. Consider this argument:
1. If it is raining, then I need an umbrella.
2. It is not raining.
3. Therefore, I do not need an umbrella.
Drawing the conclusion that you do not need an umbrella from the conditional and the fact that it is not raining
is fallacious—that is, based on a fallacy—because rain is not the only situation that requires an umbrella. Maybe
it’s not raining, but it is snowing, sleeting, or hailing, where an umbrella would be useful; or maybe I am going as
Mary Poppins to a costume party. The logical error this fallacy is based on is presuming the consequent is false
because the antecedent is false. When we engage in this fallacy, we fail to consider the conditions where the
antecedent is false, but the consequent is true.
EXAMPLE This error happens fairly frequently in everyday life. For example, one might argue that if you see a rise in doctor visits for the flu, then cases of the flu are spiking. Say we accept that argument. Since doctor visits for the flu are not rising, the argument goes, the flu must not be spiking. But in truth, people may be treating the flu at home instead of going to the doctor; in fact, they may be doing exactly as advised, because the doctor cannot give any additional advice or treatment in the office, and going to the clinic spreads the virus further. This happened in the H1N1 flu pandemic of 2009.
Even if conclusions are true, a poorly formed or fallacious argument is not valid.
1. If Batman is an Avenger, then Batman is in the Marvel Universe.
2. Batman is not an Avenger.
3. Therefore, Batman is not in the Marvel Universe.
You probably know that the caped crusader is associated with the DC Universe, so the conclusion is a true
statement. However, it does not follow logically from the premises. There are Marvel heroes who are not
Avengers (at least when this tutorial was written)! If we consider the form of the argument removed from some
of its content, the argument about Batman has the same form as this argument:
1. If B is an Avenger, then B is in the Marvel Universe.
2. B is not an Avenger.
3. Therefore, B is not in the Marvel Universe.
However, this conclusion is simply not warranted. As we pointed out earlier, supposing B is Jessica Jones/Luke
Cage/Daredevil, etc., then B is in the Marvel Universe without being an Avenger.
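We can verify that denying the antecedent is invalid the same exhaustive way: list every truth-value combination and look for a case where the premises hold but the conclusion fails (a Python sketch, not from the tutorial):

```python
from itertools import product

def implies(p, q):
    """The conditional 'if p then q' is false only when p is true and q is false."""
    return (not p) or q

# Form: premise 1 is 'if P then Q', premise 2 is 'not P', conclusion is 'not Q'.
counterexamples = [
    (p, q)
    for p, q in product([True, False], repeat=2)
    if implies(p, q) and (not p) and q  # premises true, conclusion 'not Q' false
]
print(counterexamples)  # [(False, True)] -- P false, Q true defeats the argument
```

The counterexample P = False, Q = True is exactly the Jessica Jones case: not an Avenger, yet in the Marvel Universe.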
TRY IT
1. All mice love cheese.
2. Chewy loves cheese.
3. Therefore, Chewy is a mouse.
TERMS TO KNOW
Conditional
An if/then statement (other words may be used, but the sentence can be rephrased as an if/then
statement).
Antecedent
The “if” part of an “if/then” statement.
Consequent
The “then” part of an “if/then” statement.
3. Affirming the Consequent
The second formal fallacy, affirming the consequent, comes from concluding the truth of the antecedent from affirming the truth of the consequent. Here is an example in natural language.
1. If it is raining, then I need an umbrella.
2. I need an umbrella.
3. Therefore, it is raining.
The error here is assuming the antecedent is true because the consequent is true. As described above, there
are a handful of reasons we might carry an umbrella that aren’t due to rain, so this is not a valid conclusion. You
might notice that both of these fallacies follow from the same simple error: assuming that a conditional
statement works both ways. By this, we mean the error comes from assuming that if one portion of the
conditional is true, you can conclude the other portion is true in either direction. However, as we will learn in
Unit 3, conditionals are much more complicated than this.
EXAMPLE A common example of affirming the consequent is the feeling people have after adopting a
new diet. The person on the diet might think, “If this diet is a good one, then I’ll start to see weight loss
soon.” After a couple of weeks, the person does lose weight, which affirms the consequent, so they assume
the diet is a good one and may tell all their friends to try it. However, many diets produce initial weight loss, followed by a plateau or even a “bounce back” to the original weight. So, it is not proven that the diet is really a good one simply because there is initial weight loss.
As with denying the antecedent, this form of argument is always a fallacy, regardless of the meaning of the
terms involved, and even if the conclusion is true. The structure of the argument—and so the logic itself—is
flawed. Let’s return to the Marvel Universe.
1. If Spider-Man is a Marvel superhero, then movies with Spider-Man will be streaming on Disney+.
2. Spider-Man is in a movie streaming on Disney+.
3. Therefore, Spider-Man is a Marvel superhero.
Notice that even though the conclusion is true here, the argument itself is a fallacy. This is because the truth of 1 and 2 doesn’t force us to conclude 3. This is easiest to see if we look at the structure of the argument with its content removed. Let’s do this by generalizing the argument beyond Spider-Man. When we take out every
instance of Spider-Man in the argument and replace it with a variable, in this case the letter B, we get this
argument:
1. If B is a Marvel superhero, then movies with B will be streaming on Disney+.
2. B is in a movie streaming on Disney+.
3. Therefore, B is a Marvel superhero.
Once we focus on the structure of the argument, we can see that even though the conclusion happens to be true for Spider-Man (at least as of the writing of this tutorial), the form does not guarantee a true conclusion for every potential superhero. Suppose the superhero was Underdog. In this case, the premises would be true, but the conclusion would be false: even though Underdog is streaming on Disney+, Underdog is not a member of the Marvel Universe. Our Spider-Man conclusion was true but did not follow from the first two statements. This
is because the logic of the argument itself is flawed. We can tell that the logic is flawed when we can substitute
another term which keeps the first two statements true but renders the conclusion false. We will learn more
about how to analyze arguments like these in future units.
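The substitution test we just performed with Underdog can also be run exhaustively over truth values (a Python sketch, not part of the tutorial's source material):

```python
from itertools import product

def implies(p, q):
    """The conditional 'if p then q' is false only when p is true and q is false."""
    return (not p) or q

# Form: premise 1 is 'if P then Q', premise 2 is 'Q', conclusion is 'P'.
counterexamples = [
    (p, q)
    for p, q in product([True, False], repeat=2)
    if implies(p, q) and q and not p  # premises true, conclusion false
]
print(counterexamples)  # [(False, True)] -- Q can be true while P is false
```

P = False, Q = True is the Underdog case: streaming on Disney+ without being a Marvel superhero.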
TERM TO KNOW
Affirming the Consequent
A formal fallacy in which the antecedent of a conditional is concluded to be true because the consequent is true.
SUMMARY
In this lesson, you began to learn about fallacies, or errors in reasoning. Like arguments, fallacies can
be formal or informal. Formal fallacies can be represented in formal terms, and the error is always a
logical one. Two classic examples of formal fallacies are denying the antecedent and affirming the
consequent. Though slightly different in form, both of these rely on assuming an if/then statement
works both ways.
Source: THIS CONTENT HAS BEEN ADAPTED FROM Introduction to Logic and Critical Thinking
TERMS TO KNOW
Antecedent
The “if” part of an “if/then” statement.
Conclusion
The logical result of the argument.
Conditional
An if/then statement (other words may be used but the sentence can be rephrased as an if/then
statement).
Consequent
The “then” part of an “if/then” statement.
Fallacies
Errors in reasoning; unlike biases, these are not based on underlying assumptions or the way you think,
but flaws in the argument itself. Fallacies may also be used intentionally when arguing in bad faith.
Formal Fallacy
An error in the structure of an argument.
Formal Logic
Structured reasoning that follows a set of rules and may use mathematical symbols.
Premises
The statements that should logically lead to (or be the evidentiary support for) the conclusion.
Statement
A declarative sentence that asserts something to be true.
Valid
The conclusion follows logically from the premises.
Introduction to Informal Fallacies
by Sophia
WHAT'S COVERED
In this tutorial, you will continue to learn about fallacies, turning now to informal fallacies, which rely more on a misunderstanding of the terms involved than on the logical form of the argument. In particular, you will learn about these fallacies:
1. Composition Fallacy
2. Division Fallacy
3. Begging the Question
4. False Dichotomy
5. Equivocation
1. Composition Fallacy
Informal fallacies, in contrast to formal fallacies, cannot be captured in formal logic. To identify these errors, we have to rely on the substance of the argument: the meanings of the words, and the broader context of the statements within the argument.
Two kinds of fallacy concern the relationship between parts and the whole to which they belong. Take, for example, the game Jenga. The goal of the game is to build a tower, and so in discussions throughout the game we can talk about the tower as a whole or the blocks used to build the tower (e.g., the parts of the tower).
Similarly, think about a very common animal like a dog. We have the capacity to talk about the dog as a whole
or different parts of the dog, like the dog’s coat, teeth, eyes, nails, etc.
Now think about the kinds of properties we attribute to objects. We can attribute a property to either an object’s
part or the whole of the object. To attribute a property to all of an object’s parts is to attribute that property
distributively. If we attribute a property to the object as a whole, then we are attributing that property
collectively. Let’s consider a few examples.
Think back to our example of a Jenga tower. If someone comments that the tower is wobbly, the property of
wobbliness is being attributed collectively. That is, we’re not saying that each individual Jenga block alone is
wobbly. Rather, we’re saying that when all those individual parts are stacked in this particular way, that the
collective tower as a whole is wobbly.
On the flip side, there are some properties that are distributive. Consider the case where you’re leaving the house for vacation, and your spouse turns to you and says, “The house is locked.” Being locked is a distributive
property. What your spouse is saying is that each door (and window, depending on how particular they are) is
locked. In other words, the parts of the house that give entrance to it are locked. It doesn’t make sense to talk
about a house as having the collective property “locked.” For another example, we can consider a bouquet. If
someone says the bouquet is white, they are using white as a distributive property. That is, each flower in the
bouquet is white. In all likelihood, the stems of the flowers in the bouquet are not white (stems are green), and
so it can’t be the case that the bouquet collectively has the property of being completely white. Rather, that
property of whiteness is distributed amongst each of the bouquet’s parts—namely the flowers.
Now, why does this matter for fallacies? Because poor reasoning about the role of distributive and collective properties in arguments can lead to fallacies. For example, an argument commits the composition fallacy when and only when a term used distributively in the premises is interpreted collectively in the conclusion. In other words, the composition fallacy is the mistake of attributing a property collectively that is only true distributively (i.e., of the parts) and drawing an inference on the basis of that misattribution.
1. Every member of the gymnastics team weighs less than 110 lbs.
2. Therefore, the whole gymnastics team weighs less than 110 lbs.
This argument commits the composition fallacy. Arguments that commit the composition fallacy claim that
since each part of the whole has a certain feature, it follows that the whole has that same feature. But as we can
see, this might lead to ridiculous conclusions. Take another clear example:
1. You understand the meaning of each word in this article.
2. Therefore, you understand the article’s overall meaning.
Again, the idea is clear: just because you understand the meaning of each word in an article, it doesn’t follow that
you understand the article’s overall meaning. Crucially, however, the composition fallacy is informal because it is
not purely about the structure of the argument. Depending on the content, sometimes the inference from part
to whole can be acceptable. Consider:
1. Every part of this table is made of wood.
2. Therefore, the whole table is made of wood.
This argument has the same form as the ones above, but it is a fair, logical argument while the others are
obvious fallacies. The difference between the arguments is not their form, but their content.
Again, that’s what makes this an informal fallacy. We have to rely on our understanding of the meanings of the
words or concepts involved.
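The gymnastics argument can be made concrete with a quick arithmetic check. The sketch below is purely illustrative (the individual weights are hypothetical), but it shows how a property can hold of each part distributively while failing of the whole collectively:

```python
# Hypothetical individual weights, in lbs, for a five-member team.
weights = [98, 104, 101, 107, 95]

# Distributive claim: each part, taken individually, weighs less than 110 lbs.
each_under_110 = all(w < 110 for w in weights)

# Collective claim: the team as a whole weighs less than 110 lbs.
team_weight = sum(weights)
whole_under_110 = team_weight < 110

print(each_under_110)   # True: the premise holds distributively
print(team_weight)      # 505
print(whole_under_110)  # False: the conclusion fails collectively
```

The fallacy lies in sliding from the first claim to the second; as noted above, whether that slide is legitimate depends on the content of the property, not the form of the argument.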
TRY IT
The first statement is not a fallacy. The second statement is an example of composition fallacy. It
sounds like the family shares one car, whereas they might have several.
Composition Fallacy
It’s a fallacy when… A statement that only applies to individual parts is applied to the whole.
It’s not a fallacy if… A statement made about individual parts does apply to the whole.
TERMS TO KNOW
Informal Fallacy
An error in reasoning which cannot be identified without understanding the meanings behind the
statements involved in the argument.
Distributively
To attribute a quality of something to its parts. This may or may not be fallacious depending on context.
Collectively
To attribute a quality of something to the whole. This may or may not be fallacious depending on
context.
Composition Fallacy
An error in reasoning that since each part of the whole has a certain feature, it follows that the whole
has that same feature.
2. Division Fallacy
The division fallacy can be understood as the reverse of the composition fallacy and so they are easy to
confuse. The division fallacy argues that since the whole has some feature, each part must also have that
feature. (The composition fallacy, as we have just seen, goes in the opposite direction: since each part has
some feature, the whole must have that same feature.) In our previous terminology, the division fallacy happens
when someone makes the mistake of attributing distributively a property that is only true collectively and then
making an inference on that basis. For example:
1. This house costs one million dollars.
2. Therefore, each part of this house costs one million dollars.
This is clearly a fallacy. Just because the whole house costs one million dollars, it doesn’t follow that each part
of the house costs one million dollars.
Here is another example:
1. This dog is soft.
2. Therefore, the dog’s teeth are soft.
Here, the conclusion is based on one explicit and one hidden premise: that the dog is soft, and if the dog
collectively is soft, then every one of its parts is soft. Thus, the mistaken arguer concludes that a very un-soft
part of the dog must be soft (since according to the division fallacy, collectivity implies distributivity).
Nonetheless, here is a similar argument that does hold up:
1. This house is located in Minnesota.
2. Therefore, the kitchen of this house is located in Minnesota.
As with the composition fallacy, the forms of the arguments are the same, but one leads to a reasonable
conclusion and the other does not. We can only use the contents or meanings of the premises to determine
whether the conclusion is sound.
TRY IT
The first statement is an example of the division fallacy. When we say a family is originally from
another country, we may mean the ancestors came over a hundred years ago, before any living
members were born. The second statement is not a fallacy.
Division Fallacy
It’s a fallacy when… A statement that only applies to the whole is applied to individual parts.
It’s not a fallacy if… A statement made about the whole is applicable to each part.
TERM TO KNOW
Division Fallacy
An error in reasoning that since the whole has some feature, each part must also have that feature.
3. Begging the Question
An argument begs the question when one of its premises simply assumes or restates the conclusion it is supposed to support. Consider this statement:
“We regret that we cannot issue a refund after thirty days because our store policy requires a return within thirty
days.”
This does not explain why they have the policy; it simply restates it. Simply put, the premise and the conclusion
are the same. We can reconstruct them as follows:
1. Our store policy requires a return within thirty days.
2. Therefore, we cannot issue a refund after thirty days.
The argument is valid, but it fails to give us a meaningful explanation. Here are a couple of other examples of
begging the question.
We know the Bible is divinely inspired because the third chapter of II Timothy says that “all scripture is
given by divine inspiration of God.”
Smoking causes cancer because the smoke from cigarettes is a carcinogen.
When a prosecutor says to a defendant: “So, how did it feel when you killed your business partner?” (in a
murder trial about the homicide of the business partner).
In some cases, an argument may feel like this fallacy but isn’t. Consider this example, which may arise from a
discussion about children misbehaving in class.
Yes, a lot of kids squirm, fidget, can’t sit still, and would rather talk to their friends than do school work,
because that’s the way kids are.
This might feel like a circular argument or “begging the question,” because it uses the premise as an
explanation for the question, “Why are kids misbehaving in class?” But in the context of a discussion about
ADHD, it may be a reality check, reminding the other person to have realistic expectations about child behavior
before assuming kids need a diagnosis. In that case, it is not begging the question because the premise is more
or less universally accepted; the person making the argument is reminding the person of an established fact,
not asking them to accept a conclusion without support.
TRY IT
The first statement is an example of begging the question. The second sentence is not a fallacy,
though the speaker may need to provide further evidence of their claim.
Begging the Question
It’s a fallacy when… The premise merely restates the conclusion rather than supporting it.
It’s not a fallacy if… The premise is a well-established fact offered as a reminder or reality check, not as proof of a contested conclusion.
TERM TO KNOW
Begging the Question
An error in reasoning in which the conclusion of an argument is assumed or restated in one of its premises.
4. False Dichotomy
A false dichotomy is simply a presentation of two options (often extreme) as if they were the only ones
available. Consider a conversation between two neighbors.
Neighbor B: “You should join our neighborhood Facebook group. It’s the easiest way to keep up with what’s going on.”
Neighbor A: “No thanks. I’m not handing all of my personal information over to Mark Zuckerberg just to see neighborhood news.”
The problematic argument is that either A avoids the group altogether or is required to give complete personal
information to the Facebook founder and CEO. Obviously, there are other options: joining Facebook under a
false name, for example, or simply not participating in any way outside of the group. The false dichotomy, like
other informal fallacies, is not obvious from its logical structure but from the meaning of its premise.
Crucially, a false dichotomy arises when we reason from an either/or stance, and we haven’t considered all the
relevant possibilities. Here are a couple more examples:
Be my friend or be my enemy.
In a capitalist economy, you either win big or you lose big.
You are either pro-war or you hate the troops.
The problem lies in presenting two options, often one as an extreme, when there are many other more nuanced
options that aren’t being considered. Needless to say, there may be cases where two options really are the only
options, or represent the best options. Consider this statement.
Jenny seems to really like you, so if you don’t feel the same way, you should cut things off instead of
stringing her along.
In this case, the two options being presented are breaking up with Jenny or getting serious. The other options
aren’t being concealed; rather, the dichotomy is presented because those other options are worse (for
Jenny) than the two given.
TRY IT
Well, we can either stand here arguing about it for the rest of the night or you can admit I’m right.
I don’t want to drive around looking for a place that’s open, so let’s just eat here.
The first statement is an example of a false dichotomy, presuming the only options are to “admit I’m
right” or argue forever. The speaker omits alternatives like agreeing to disagree or even
admitting the other person is right. The second is not a false dichotomy; it presents two alternatives
that seem to be the only available options.
False Dichotomy
It’s a fallacy when… Two options are presented as the only ones available when other relevant possibilities exist.
It’s not a fallacy if… The two options presented really are the only ones, or clearly the best ones, available.
TERM TO KNOW
False Dichotomy
An error in reasoning that presents two options (often extreme) as if they were the only ones available.
5. Equivocation
Consider the following argument:
1. Filing taxes is a headache.
2. Aspirin relieves headaches.
3. Therefore, aspirin can relieve filing taxes.
This is a silly argument, but it illustrates the fallacy of equivocation, or using two different meanings of a word in
the same argument. Here, the word “headache” is used equivocally. In the first premise, “headache” is used
figuratively to mean “irritating” or “difficult,” whereas in the second premise, “headache” is used literally. As with
every informal fallacy we have examined in this section, equivocation can only be identified by understanding
the meanings of the words involved. However, fallacies of equivocation can be hard to spot because words can
have so many nuances and meanings. Equivocation is often used to humorous effect as well. Puns are basically
equivocations, surprising readers by using an unexpected meaning of a word.
Note that some words have specialized meanings within certain fields. For example, in this class we will talk
about “valid” in a special way. In fact, you will learn that an argument can be valid even if the premises and
conclusion are all false, whereas our common understanding of “valid” usually suggests that a thing is true and
accurate, such as a valid statement being a fair representation of the facts, or a valid passport being legal and
real. Because of this, there may be confusion if people aren’t used to a specialized meaning, but this does not
make the use itself a fallacy unless the two meanings of the term are intentionally conflated by the person
making the argument.
TRY IT
The first statement is an example of equivocation, shifting meaning from “save” as in “set aside and
keep safe,” to the commercialized meaning of save, meaning “spend less than you might elsewhere.”
The second example is not equivocation because although it uses two meanings of a word, it actually
clarifies which meaning you should understand for the discussion.
Equivocation
It’s a fallacy when… A word or phrase shifts between two different meanings within the same argument.
It’s not a fallacy if… A word with multiple meanings is used consistently, or the intended meaning is made clear.
TERM TO KNOW
Equivocation
An error in reasoning that relies on using different meanings of the same word or phrase within the
same argument.
SUMMARY
In this lesson, you continued to learn about fallacies, now focusing on informal fallacies, where the error
is only discernible from the meaning of the words used. Informal fallacies may be logically valid in form
but still contain errors in reasoning, such as misinterpreting a premise (as in the composition fallacy and
its reverse, the division fallacy), presenting the conclusion as a premise (begging the question), or
misrepresentation through false dichotomies or equivocation.
Source: THIS CONTENT HAS BEEN ADAPTED FROM Introduction to Logic and Critical Thinking
TERMS TO KNOW
Begging the Question
An error in reasoning in which the conclusion of an argument is assumed or restated in one of its premises.
Collectively
To attribute a quality of something to the whole. This may or may not be fallacious depending on context.
Composition Fallacy
An error in reasoning that since each part of the whole has a certain feature, it follows that the whole has
that same feature.
Distributively
To attribute a quality of something to its parts. This may or may not be fallacious depending on context.
Division Fallacy
An error in reasoning that since the whole has some feature, each part must also have that feature.
Equivocation
An error in reasoning that relies on using different meanings of the same word or phrase within the same
argument.
False Dichotomy
An error in reasoning that presents two options (often extreme) as if they were the only ones available.
Informal Fallacy
An error in reasoning which cannot be identified without understanding the meanings behind the
statements involved in the argument.
Fallacies of Misdirection (Part 1)
by Sophia
WHAT'S COVERED
In this tutorial, we learn about more informal fallacies that can affect rational thinking, in particular ones
with strong emotional appeal that ultimately offer information that isn’t relevant to the argument. You will
learn about the following fallacies:
1. Ad Hominem
2. Straw Person
3. Tu Quoque
What all fallacies of misdirection have in common is that they make an argument or response to an
argument that simply changes the subject. These arguments can be tempting or compelling to consider, so
it is important to recognize that although they are emotionally or psychologically compelling, they are not
rational, and undermine good critical thinking. Recall that a fallacy is flawed reasoning, and that good critical
thinking may require setting aside our feelings. This doesn’t mean we have to become inhuman robots, just
that for the purposes of analyzing an argument, we can separate its reasoning from the various ways it
might play on our feelings.
1. Ad Hominem
“Ad hominem” is a Latin phrase that can be translated into English as the phrase, “against the person.” In an ad
hominem fallacy, instead of responding to (or refuting) the argument a person has made, one attacks the
person advancing the argument.
EXAMPLE A well-known public intellectual named Peter Singer makes an argument that it is morally
wrong to spend money on luxuries for oneself rather than give all of your disposable income to charity. The
essence of his argument is that there are children who die preventable deaths, that there are charities who
could save the lives of these children if they had the resources, and that they would have enough resources
if wealthier people gave excess income to charity instead of making unnecessary purchases. Singer argues
that since all of these premises are true, our frivolous purchases actually kill children.
In response to this argument, an angry audience member might respond: “Your coffee drink must cost five
or six bucks, and I bet you get one or two a day. That must add up to thousands of dollars a year. So I say
your argument is bunk!” (This is hypothetical, but Singer is frequently challenged or interrogated on his own
spending habits.)
This is an example of ad hominem fallacy, because it attacks Singer instead of his argument. If Singer is
shown to be a hypocrite, then the argument seems to lose credibility. But whether or not Singer is a
hypocrite is irrelevant to whether his argument has merit. The argument stands on its own and it is that
argument we need to assess.
You will encounter ad hominem attacks throughout politics and media, even (maybe especially) online.
EXAMPLE A politician who puts forth arguments to take action against global warming is attacked for
taking a fossil fuel-burning jet between her home state and Washington DC; the argument is that if global
warming were a legitimate issue, she would not be traveling so much.
EXAMPLE A prosecutor counters expert testimony that undermines her case; the expert asserted that an
automobile accident was due to vehicle malfunction and not driver error. The prosecutor starts to ask
questions about the expert’s background, revealing that he was well paid to do his analysis and give
testimony, and has furthermore done similar work dozens of times in other cases, each with substantial pay.
She then argues to the jury that he is “paid to do this,” and they should “make of that what [you] will.”
EXAMPLE A person in an online argument responds, “I’d expect rubbish arguments like that from a guy
who still uses an AOL account!”
In each of these cases, the attack on the person does not provide evidence for the conclusion. Nevertheless,
they can be effective on an emotional level. It is satisfying to see an opponent as a hypocrite, a cheater, or a liar.
For people on the fence, it is proof that the person can’t be trusted. But good critical thinking requires judging
an argument on its own merits, not the character of the person making the argument.
Note that in each of these cases, the purpose of the attack is to undermine the person’s argument. The
underlying concept is that if the person isn’t credible, the argument must be false. However, there are cases
where attacking a person’s credibility is not an ad hominem attack.
Suppose a witness is on the stand testifying against a defendant in a court of law. The defense lawyer may dig
up things about the witness’s past such as her own criminal record. In this, the defense lawyer has good reason
to paint the witness as untrustworthy. The jury is trusting her statements as true. Revealing facts that show her
to be dishonest makes her testimony unreliable. Note that in this case, the witness is not making an argument,
per se, but simply making claims about her experience. Similarly, it may be that attacking a senator whose
actions are inconsistent with her words is not to discredit the bigger argument about global warming, but to
question her commitment to environmental issues. In any of these examples, the attack may still be unkind,
unprincipled, or unfair; they simply do not meet the definition of ad hominem fallacy.
EXAMPLE The 2020 novel American Dirt paints a sympathetic picture of Mexican migrants entering the
U.S. illegally. At the time it was released, the author was attacked not for the overall purpose of the book,
but because she had no personal connection to Mexican Americans or to undocumented immigrants. In this
case, it is not an ad hominem attack, because critics were not disagreeing with the overall “argument” of
the book, but rather the author’s authenticity and credibility as an expert on the topic; moreover, they
resented her profiting from telling a story that she had not lived through personally. You may or may not
agree with that criticism, but it is not an example of an ad hominem attack.
TRY IT
The first example is a fallacy because the person’s background does not have any relevance to their
claims. The second is not a fallacy because the reliability of a person is relevant to whether they should
hold public office, and compulsive or addictive behavior compromises their reliability. It may be unfair if
the person has set those behaviors aside long ago, but it is not a fallacy.
Ad Hominem Fallacy
It’s a fallacy when… An attack on the person making an argument is used in place of a response to the argument itself.
It’s not a fallacy if… The person’s credibility or reliability is itself what is at issue, as with a witness’s testimony.
TERM TO KNOW
Ad Hominem
A fallacy where one attacks the person making an argument instead of refuting the argument itself.
2. Straw Person
If you have ever taken a class on persuasive writing or been on a debate team, you might have learned that an
effective way to build an argument is to present the opposing argument, then refute it. But the temptation can
be strong to then present the opposing argument in an exaggerated way, which is much easier to refute. This is
a straw person fallacy, misrepresenting the opposing argument so it is easier to refute.
The name (originally “straw man” or “strawman”) is quite old—over five hundred years!—and comes from the
practice of training soldiers for battles by having them knock down enemies made of straw. While this might
be good practice for striking targets from a distance, it goes without saying that it is much easier to
vanquish enemies made of straw than the real thing.
EXAMPLE Two candidates for political office are having a debate in which one has proposed a plan for
putting more money into healthcare and education, and the other has laid out his plan for deep tax cuts.
The second responds: “You won’t be satisfied until you’ve squeezed every last penny out of the taxpayers.
If he’s elected, you'd better kiss that summer vacation goodbye, folks, because he’s going to tax you into
poverty.” The second has wildly exaggerated the first’s position so it is easier to refute. Nonetheless, this
kind of straw person argument is ubiquitous in political discourse because it works. Like the ad hominem
attack, it can be emotionally compelling and therefore effective, but it is fallacious reasoning.
EXAMPLE Consider another political debate between two candidates for a local school board. The first
has proposed that the schools should provide middle schoolers with sex education classes, including how
to use contraceptives. The second responds: “I can’t believe you’re encouraging middle schoolers, some as
young as eleven years old, to have sex!” This is another straw person fallacy. It exaggerates and
misrepresents the proposal and the purposes of sex education and chooses the youngest end of the age
spectrum to be the standard.
As with other fallacies of relevance, straw person fallacies can be compelling on some level, even though they
are irrelevant. We are easily swayed by arguments that demonize our opponents, and swayed further by fear
of the unknown when exaggerated consequences of a policy decision are presented to us. Straw person arguments are
thus popular with politicians because they play on people’s emotions, and because there are rarely bad political
consequences for doing so.
TRY IT
The first statement is not a straw person argument, as it uses exact words to represent the opponent’s
prior argument. The second example, though a famous zinger, is a straw person argument. Senator
Quayle made no claims of being John F. Kennedy; he only used Kennedy as an example of someone
who sought higher office at a young age.
Straw Person Fallacy
It’s a fallacy when… Your presentation of the opposing argument is intentionally exaggerated to make it easier to discredit or refute.
It’s not a fallacy if… Your presentation of the opposing argument is honest and accurate, ideally using direct quotes from your opponents.
TERM TO KNOW
Straw Person
A fallacy where one misrepresents or exaggerates an opponent’s argument so that it is easier to refute.
3. Tu Quoque
“Tu quoque” (pronounced “tyoo kwoh-kwee”) is a Latin phrase that can be translated into English as “you too”
or “you, also.” The tu quoque fallacy is a way of avoiding a criticism by bringing up a similar criticism of your
opponent rather than answering it. In other words, you avoid talking about your own flaws by bringing up and
focusing the audience on your opponent’s flaws. This is a fallacy because it entirely avoids giving evidence for
or against the issue at hand.
EXAMPLE Suppose that a conservative politician is revealed to have had extramarital affairs, and an
opponent takes it up in a debate. “Maybe you can explain why the voters should trust you and your
conviction in your ‘family values’ when you betrayed your own family and values with your mistress,” he
says. His opponent replies, “I won’t take marital advice from a guy who’s on his fourth wife!” The fallacy is
used as a way of avoiding having to answer a tough question. This is yet another fallacy that is really
common in political discourse, and one with emotional appeal to the audience. It’s easy to watch a political
debate as a kind of rhetorical punching match, where both are expected to simply get their licks in.
There are circumstances in which a “you too” response is not fallacious. For example, suppose that a politician
criticizes the incumbent for taking money from special interest groups. In this case, the one under attack might
respond that his challenger has also taken money from special interest groups, and so has every other
candidate for every office higher than city dog catcher. He might say it is a fact of life in American politics. This
isn’t really a tu quoque fallacy because it is not avoiding the criticism, but giving context for the behavior.
Tu quoque is often used as a follow-up to an ad hominem attack (and is often itself an ad hominem attack). An
interlocutor attacks your character and so you respond by cutting your interlocutor down to size, so to speak.
However, it’s worth noting that just because the fallacy is used as a response to a fallacy doesn’t make it any
more permissible or any less of a fallacy.
TRY IT
The first is an example of tu quoque fallacy; the second is not an example of tu quoque because it
does not evade the question.
Tu Quoque Fallacy
It’s a fallacy when… You deflect a criticism by pointing to a similar flaw in your opponent instead of answering it.
It’s not a fallacy if… Pointing out that others do the same provides genuine context rather than evading the criticism.
TERM TO KNOW
Tu Quoque
A fallacy of avoiding answering a criticism by bringing up a similar criticism of your opponent rather than
answering the criticism.
SUMMARY
In this lesson, you learned about three fallacies of misdirection; all three involved the way you
characterize your opponent and their argument. In the ad hominem fallacy, a personal attack against
your opponent is used to undermine their argument. In a straw person fallacy, an exaggerated form of
their argument is given that is easier to refute. The tu quoque fallacy avoids criticism by deflecting it
back to the opponent; in fact, it may be used in response to their own ad hominem fallacy. While there
may be times when it is not fallacious to lodge an attack that questions a person’s honesty, sincerity, or
authenticity, it is fallacious to conclude from such an attack that their argument is invalid; the argument
must be taken on its own merits.
Source: THIS CONTENT HAS BEEN ADAPTED FROM Critical Thinking: Analysis and Evaluation of Argument
TERMS TO KNOW
Ad Hominem
A fallacy where one attacks the person making an argument instead of refuting the argument itself.
Straw Person
A fallacy where one misrepresents or exaggerates an opponent’s argument so that it is easier to refute.
Tu Quoque
A fallacy of avoiding answering a criticism by bringing up a similar criticism of your opponent rather than
answering the criticism.
Fallacies of Misdirection (Part 2)
by Sophia
WHAT'S COVERED
In this section, we will continue to learn about common informal fallacies that affect our reasoning.
Specifically, we will learn about:
1. Genetic Fallacy
2. Appeal to Consequences
3. Red Herring
1. Genetic Fallacy
The genetic fallacy occurs when one argues (or, more commonly, implies) that the origin of a theory or idea is a
reason for rejecting (or accepting) it. Although the roots of an idea can be interesting, they are not always
relevant to the actual argument at hand. For a strong objection to an argument, opponents should explain what
is wrong with the idea itself rather than focus on its origins. When origins are discussed, the critic should
explicitly explain how those origins affect the current argument.
EXAMPLE When smoking bans began to be passed, the tobacco lobby argued that such bans had
origins in Nazi Germany, which is historically accurate but irrelevant to why smoking bans were being
proposed decades later in the U.S. The hope was to suggest that there was something sinister about the
bans by linking them to Nazis. In this case, there were no clear connections made between smoking bans
and other aspects of Nazi Germany, and the argument was not being made out of a good faith concern.
Genetic fallacies can also be used to argue for something. For example, someone might argue that Jesus
himself said that “the poor will always be with us,” so it is impossible to end poverty, and in turn, use that to
reject a social welfare program designed to end poverty. In this case, the reasoning is set aside for the
emotional link to a person millions worship and admire. But ideas must be assessed on their own merits, not
their origins.
As with the other fallacies listed in this tutorial, there can be great emotional pull toward this fallacy. If you are
reading the arguments of Thomas Jefferson, for example, you may be unable to forget that however lofty his
ideas, he enslaved people and failed to acknowledge their humanity. Such deep flaws must surely be ingrained
in his arguments, you might think. However, this is an error in reasoning unless you can expose the flaws in the
arguments themselves. At the same time, it is a fallacy to presume the words of Jefferson are perfect or above
criticism because he is so lionized in American history. We can and should (for the sake of good critical thinking)
carefully evaluate his documents for biases and ideologies which might be linked to their origin.
TRY IT
The first statement is not a fallacy, at least not yet. It is understandable to be skeptical of claims that
come from people or organizations with a vested interest in something. In this case, the speaker is right
to be skeptical, but note that the speaker doesn’t dismiss the findings outright, and may still read the
study and examine it for errors. The second statement rejects the claims outright because of their
source, so it is an example of the genetic fallacy.
Genetic Fallacy
It’s a fallacy when… An idea is rejected (or accepted) solely because of where it came from.
It’s not a fallacy if… The origin of a claim is directly relevant to its reliability, and that relevance is explained.
TERM TO KNOW
Genetic Fallacy
A fallacy of rejecting (or accepting) an argument solely on its origin, and negative (or positive)
associations people have that are not relevant to the soundness of the argument.
2. Appeal to Consequences
The appeal to consequences fallacy is the reverse of the genetic fallacy: while the genetic fallacy judges the
truth or reasonableness of an idea by its origins, the appeal to consequences fallacy is the mistake of judging
the truth or reasonableness of an idea by the consequences of accepting that conclusion.
For example, suppose that a study proves a link between women not having access to legal abortion and
increased child poverty, abuse, and neglect. Perhaps the researchers had set out hoping to prove the opposite
and are stunned by the results. As staunch opponents of abortion, they may refuse to accept the legitimacy of
the study and look back to find mistakes in the methodology. If it leads to the public reconsidering a ban on
abortion, they reason, there must be something wrong with the study, or some flaw in its design. However, the
fact that the results would have an unwanted effect is not a reason for rejecting the conclusions of the study as
false. The consequences of an idea (good or bad) are irrelevant to the truth or reasonableness of that idea.
Notice that the researchers might choose not to publish the study since it does not support their goals. While
this is neither good scientific practice nor academic integrity, choosing not to publish disagreeable results is not
itself a fallacy. The fallacy is deciding that the results themselves are false because they have negative
implications. The fact is, sometimes truth can have negative consequences and falsehoods can have positive
consequences.
This does not mean it is always a fallacy to measure the consequences of a specific action or decision. In fact,
this is crucial to moral reasoning (as we will discuss in a later unit) and is part of good critical thinking. It is only a
fallacy when you reject a reasonable or factual conclusion because the consequences of it being true may
undermine other positions.
THINK ABOUT IT
You are part of a nonprofit organization that traps feral cats, neuters them, and re-releases them (commonly
called TNR for trap, neuter, release). The purpose of TNR is to gradually reduce the number of feral cats
without exterminating them outright. You are confronted with data that feral cats pose a critical risk to local
wildlife. You suspect that if this data is widely read, it will undermine support for your program. In fact, it may
even change what people consider to be “pro animal.”
While it would be tempting to treat the potential consequences of this data as bad for your mission and
therefore reject or suppress it, TNR groups and wildlife groups have collaborated under the common goal
of reducing the feral cat population. In this case, resisting the temptation of the appeal to consequences
leads to better critical thinking, a better solution, and makes an ally of an organization you might otherwise
see as an opponent.
The appeal to consequences is used implicitly and explicitly in public policy. Abundant studies demonstrating
global warming are dismissed not on the science, but because of the lost jobs and other repercussions of trying
to slow or stop climate change. There is great emotional pull to reject a study that challenges your deeply held
beliefs and might even change your way of life. Conversely, it takes strength to accept new information that might
force you to revise those beliefs or even change your habits. That strength is the true sign of a good critical
thinker.
Appeal to Consequences
It’s a fallacy when… Legitimate conclusions or information are rejected because they might lead to unwanted consequences.
It’s not a fallacy if… You look realistically at consequences as a direct result of your actions or decisions.
BIG IDEA
Neither the genetic fallacy nor the appeal to consequences means it’s wrong to consider or investigate
the origins or consequences of a belief; doing so is essential to good critical thinking. They are
only fallacies when a conclusion is reached based on an emotional response to origins or consequences
instead of careful reasoning.
TERM TO KNOW
Appeal to Consequences
The fallacy of trying to assess the truth or reasonableness of an idea based on the consequences of
accepting that idea.
3. Red Herring
The red herring fallacy involves diverting someone’s attention from the actual issue at hand by drawing the
attention instead to an issue that has only a surface relevance to the argument. Red herrings may be an
intentional or accidental diversion away from the core issues in the argument.
Though most associated with false clues in mystery stories, the term “red herring” comes from a practice in
training bloodhounds to stay on the true trail. A cured herring (which turns red in the curing process)
would be dragged across the trail to try to throw the hounds off the scent.
Consider a common colloquial retort to complaints about money. If a child says to their parent, “Dad, it’s really hard
to make a living on my salary,” a common rejoinder is for the dad to reply, “Consider yourself lucky! When I was
your age, I made $40 a week.” Though, on the surface, this reply is tangentially relevant (it also talks of salaries),
it takes attention away from the core issue, which is that the child has difficulty living on their salary. The dad’s
salary being less than the child’s is irrelevant, particularly as the cost of living was significantly lower when the
dad was growing up.
Another common example of a red herring is the “starving kids in Africa” reply. Suppose that Tom broke up with
his boyfriend and is expressing sadness to you. You reply, “Compared to the troubles of the starving kids in
Africa, your problems are pretty small. So just think of that when you start getting sad.” This is a red herring: the
difficulties that poor children in Africa or elsewhere may face are not at all relevant to the difficulty Tom is
experiencing right now.
In a more political context, red herrings are often used to divert attention from the issue at hand to another
issue constituents may feel strongly about. Suppose a senator gets pushed on the issue of abortion. A reporter
asks, “Why won’t you support the anti-abortion amendment? Aren’t you concerned at all for the unborn
children’s lives who have been blotted out?” The senator replies, “Why do you people get so worked up about
the lives being blotted out by abortion when you don’t seem to care at all about the thousands of lives blotted
out by poor handgun laws? Why don’t we focus on that?” Diverting from the issue of abortion to gun laws is a
red herring as it is not directly related to the debate around abortion.
Here's an example from an actual politician shortly after a school shooting in Uvalde, Texas, in 2022. When
responding to the deaths of the students killed by the shooter, Virginia Lt. Governor Winsome Earle-Sears used
several red herrings to avoid engaging with the argument that our gun laws are a core issue in the number of
school shootings in the U.S. She stated that “we took prayer out of schools…we have emasculated our men…we
have fatherless homes….” Though it can be argued that these are, in fact, societal problems, they are red
herrings in the question around gun laws and mass shootings. She made these comments in response to a call
for gun control after a school shooting. The statements quoted above are irrelevant to that question.
However, what feels like a red herring may not be one after all, if the person can show a direct connection to the
current argument. For example, if the dad who brings up his original salary can show that, adjusted for inflation,
it is comparable to the child’s salary, and can follow up with financial planning for how to live on a budget as he
did, his point is more relevant than it seems at a glance.
TRY IT
Only the third is a red herring, since the fact other people wanted the job isn’t relevant to the decision
the person has to make. The first two are not red herrings; they do shift discussion but are relevant to
the decision.
TERM TO KNOW
Red Herring
Diverting someone’s attention from the actual issue at hand by drawing the attention instead to an
issue that has only a surface relevance to the argument.
SUMMARY
In this lesson, you learned three more fallacies of misdirection. These three fallacies do not attack the
person making the opposing argument but still introduce irrelevant information to the argument. The
genetic fallacy looks at origins of an idea to color people’s opinions, regardless of whether those
origins are relevant to the current situation. An appeal to consequences may cause you to reject an
idea or information because of the larger implications of it being true. A red herring is raising any point
that is not directly related to the topic under discussion with the goal of changing the subject or shifting
focus.
Source: THIS CONTENT HAS BEEN ADAPTED FROM Critical Thinking: Analysis and Evaluation of Argument.
REFERENCES
Clark, A. (2019, August 7). We’ve been trapping and sterilizing stray cats for decades. Does it work? University of
Florida News. news.ufl.edu/2019/08/does-sterilizing-stray-cats-work/
Vargas, T. (2022, June 1). The danger of a politician blaming mass shootings on ‘emasculated’ men. The
Washington Post. www.washingtonpost.com/dc-md-va/2022/06/01/emasculated-men-mass-shootings/
TERMS TO KNOW
Appeal to Consequences
The fallacy of trying to assess the truth or reasonableness of an idea based on the consequences of
accepting that idea.
Genetic Fallacy
A fallacy of rejecting (or accepting) an argument solely based on its origin and the negative (or positive)
associations people have with that origin, which are not relevant to the soundness of the argument.
Red Herring
Diverting someone’s attention from the actual issue at hand by drawing the attention instead to an issue
that has only a surface relevance to the argument.
Fallacies of Irrelevance (Part 1)
by Sophia
WHAT'S COVERED
In this section, you will learn about fallacies of irrelevance, which give false support to an argument
through irrelevant appeals. Fallacies of irrelevance may be made in good faith but upon consideration
are not legitimate support. In this tutorial, we specifically talk about trying to support an argument with
irrelevant information, such as:
1. Appeal to Authority
2. Inconsistency Fallacy
3. Loaded Question
What all fallacies of irrelevance have in common is that they make an argument or response to an argument
that is irrelevant to the topic at hand. Similar to fallacies of misdirection, fallacies of irrelevance
fundamentally rely on shifting the topic of the argument to something tangential to the core points of the
argument. Though these fallacies don’t often have strong emotional pull the way fallacies of misdirection
do, if we aren’t careful to watch out for them, these fallacies can lead us down unhelpful conversational
rabbit holes. These fallacies tend to fall into two categories: either taking irrelevant considerations into
account or failing to take relevant considerations into account.
The first of these, taking irrelevant considerations into account, includes defending a conclusion by
appealing to irrelevant reasons, such as through inappropriate appeals to pity, popular opinion, or tradition.
Though similar to red herrings at a glance, the difference is that a red herring changes the focus of the
discussion, while the appeals in this case are believed to actually support the argument but do not do so.
Irrelevant appeals can be effective when they inject emotion into a decision. An example would be a student
who failed a course asking the teacher to give him a pass instead because his “parents will be upset.”
Since grades should be given on the basis of performance, the reason being given is irrelevant.
However, the teacher may feel sympathetic to the student and change the grade. Remember that setting
emotions aside is an important part of critical thinking.
EXAMPLE Suppose an American politician has been arguing for an improvement in human rights for
minorities in another country with which the U.S. has a strong relationship. Now imagine someone criticizing
these efforts in the following way: “These arguments for human rights in another country have no merit
because they are advanced by a politician who naturally stands to gain from the positive media coverage.”
This criticism is fallacious because whether the person advancing the argument has something to gain from
being seen to have improved human rights conditions is a completely different issue from whether there
ought to be improved human rights conditions.
Fallacies that take irrelevant considerations into account can be broken down into three types.
1. Appeal to Authority
An appeal to authority is the use of the opinions of well-known and well-regarded people (authority figures) to
support an argument that is not related to said authority's field of expertise. As you know from writing academic
papers where we cite sources, experts are crucial to any argument. The problem comes when we invoke
someone whose expertise is not relevant to the issue for which we are invoking it, or when they are an outlier
of expert opinion; it is thus sometimes called an appeal to false authority.
A common way you might encounter the appeal to authority fallacy is when the opinions of scientists and
medical professionals are used to support ideas outside their specializations. People who were hesitant to
receive the COVID-19 vaccines when the vaccines were first introduced in 2021 often pointed to the many
examples of nurses and other medical staff who shared their vaccine hesitancy. However, though these medical
professionals were undoubtedly well-trained for their specific jobs, they were not specialists in the formulation
and evaluation of vaccines. Thus, invoking the opinions of medical professionals who do not work directly with
vaccines is a form of the appeal to authority fallacy.
Similarly, if you try hard enough, you can find someone with a doctorate who will tell you that the earth is flat, or
with a medical degree who can vouch for leeches as a medical intervention. The appeal to authority fallacy
occurs when we mistake the authority's advanced qualifications in another field or specialization as relevant to
the opinion they express on a different topic altogether. Other times, the authority may have the right
credentials but be an outlier in their field and not producing quality scholarship in support of their opinion.
At the extreme end of the fallacy are celebrity opinions being pushed as evidence on areas wildly outside of
their field. An Oscar winner might promote a health fad, or a beloved musician might endorse a political
candidate. You might see the overlap here with the halo effect, which we described in the last challenge. The
emotional appeal of a person we hold in high regard can be strong because we feel that such successful
people couldn’t be mistaken, or that people who have moved us with their art must be wise, but they are human
and as prone to bad judgment as anyone else.
Trusting or quoting experts is not itself a fallacy, and is even necessary in the world we live in. Nobody can be
an expert on every topic from astrophysics to microbiology, and we largely rely on trust in experts to make
decisions. It is fallacious only when we grant a person more authority on a subject than they really have, when
they speak outside their expertise, or when they are at odds with the general consensus in the field.
TRY IT
Which of the following is an example of an appeal to authority fallacy?
I’m buying this car because it got an excellent review in Automotive Monthly.
I’m buying this car because my favorite actor said it’s awesome in an interview I read.
I’m buying this car despite all the bad reviews because my Facebook friend loves hers.
The second argument is the most fallacious, since the actor has no authority on the topic. The third is
also fallacious, although it is a human trait to trust a friend more than strangers; even if the friend is an
automotive engineer, their experience with the car should be considered in context of the bad reviews.
The first is not fallacious, presuming that the reviews are conducted by impartial experts on evaluating
cars.
TERM TO KNOW
Appeal to Authority
The fallacy of supporting our arguments with the claims of famous or well-regarded people who are either
not experts on the matter being argued and/or do not represent the consensus of experts in the field.
Also known as the appeal to false authority.
2. Inconsistency Fallacy
The inconsistency fallacy happens when an argument includes a contradiction. That is, the argument is
inherently flawed because it promotes two distinct beliefs which cannot both be true at the same time. This is
exemplified in a classic quip from baseball player Yogi Berra, “I never said most of the things I said.” Though
tongue-in-cheek, the example is clear; the conclusion (that I never said most of those things) is in clear
contradiction to the premise (I said those things). Another funny Yogi Berra inconsistency is, “Nobody goes
there anymore. It’s too crowded.” Again, this is an inconsistency because if nobody actually went there, then
there is no way it could be crowded.
Inconsistency fallacies can show up more perniciously in everyday reasoning. For example, a person might
generally be thrifty, and even admonish others for spending money, and at the same time talk about investing
money in risky endeavors, living up to the expression “penny wise, pound foolish.”
Inconsistency often occurs when someone attempts to support a position with an ideological premise that
actually contradicts their position. For example, suppose a nature preserve is going to be turned into a hotel
and golf course, and the person in charge of the plan claims the project is “great for nature, great for preserving
the environment.” For another example, someone might justify cutting revenue to schools by arguing that they
support education. Such arguments are surprisingly common in politics.
THINK ABOUT IT
What about the reverse? Is it inconsistent to say, "I believe that murder can sometimes be justified," and
then to oppose abortion and support the death penalty, or to support abortion and oppose the death
penalty?
Interpreting inconsistency takes care, however; accusing a person of inconsistency may entail creating a
straw man argument of your own. For example, you might accuse a person of inconsistency if they say they love
animals and also eat meat. This suggests that their “love for animals” is so absolute they will not harm any
under any circumstances, when in fact they may just mean they enjoy having pets and going to the zoo. You
can still disagree with them over meat-eating, but it is not inconsistent if they simply have a more practical view
of things. Keep in mind that while people may be inconsistent in their views, they may hold both beliefs in good
faith.
TRY IT
The first answer is inconsistent, since the term “cancel culture” suggests silencing or censoring
opinions you disagree with, which is exactly what the person proposes doing in the schools. The
second is not self-contradictory but recognizes there are reasonable limits on free expression when
the rights of other people are taken into consideration.
TERM TO KNOW
Inconsistency Fallacy
When an argument includes a self-contradiction.
3. Loaded Question
The loaded question fallacy occurs when someone asks a question that contains a controversial or unjustified
assumption in making an argument. A common loaded question (particularly in the political sphere) is to ask a
question like, “Have you stopped cheating on your wife?” The problem with such a question is that it makes the
assumption that you have cheated on your wife, and either way you answer (i.e., “yes” or “no”) implies that you
did cheat on your wife. When this assumption is unjustified, the question is loaded. Keep in mind that this fallacy
is different from begging the question, because begging the question involves assuming the premises in the
conclusion, whereas a loaded question involves asking a question based on an unjustified assumption.
Crucially, responding to this fallacy often involves not answering the question directly (with a simple yes/no
response), but rather challenging the question’s underlying assumption, for example, “In my view, a fetus
isn’t a baby” or “I have never cheated on my wife.”
Finally, remember that not all questions which carry assumptions are loaded questions. For example, if the spouse
is known to have committed infidelity, asking, “Have you stopped cheating on your wife?” is a legitimate
question which carries an evidence-backed assumption (it is true that you did cheat on your wife).
THINK ABOUT IT
A pollster calls for your opinion and asks if you want “four more years of the same stagnant economy,” or
support “bold actions that will revitalize the economy.” Is this a loaded question? What assumptions are
being implied by the questioner? How might the question be asked in a fair way?
TERM TO KNOW
Loaded Question
Asking a question that contains a controversial or unjustified assumption.
SUMMARY
In this lesson, you began learning about fallacies of irrelevance, which use appeals in an argument that
do not actually support the argument. One way of doing this is taking irrelevant considerations into
account, like the appeal to authority, citing a person or group who does not have expertise on the topic,
or the inconsistency fallacy, where elements of a position contradict each other. Another form is the
loaded question, framing a question in a way that forces the opponent to accept a premise that has not
been proven.
Source: THIS CONTENT HAS BEEN ADAPTED FROM Critical Thinking: Analysis and Evaluation of Argument.
TERMS TO KNOW
Appeal to Authority
The fallacy of supporting our arguments with the claims of famous or well-regarded people who are either
not experts on the matter being argued and/or do not represent the consensus of experts in the
field. Also known as the appeal to false authority.
Inconsistency Fallacy
When an argument includes a self-contradiction.
Loaded Question
Asking a question that contains a controversial or unjustified assumption.
Fallacies of Irrelevance (Part 2)
by Sophia
WHAT'S COVERED
In this section, you will continue to learn about fallacies of irrelevance, which give false support to an
argument through irrelevant appeals. Fallacies of irrelevance may be made in good faith but upon
consideration are not legitimate support. In this tutorial, we will focus on fallacies where a person fails
to take all of the available information into account. These include:
1. Appeal to Ignorance
2. Appeal to Incredulity
3. Appeal to the Stone
In the previous tutorials, we talked about fallacies where someone tries to change the subject or support an
argument with irrelevant information. The other category of fallacies of irrelevance is when we fail to take
relevant considerations into account. For example, it is not unusual for us to ignore or downplay criticisms
because we do not like them, even when those criticisms are justified. Or sometimes we might be tempted
to make a snap decision, believing our knee-jerk reactions are correct when, in fact, we should be
investigating the situation more carefully and doing more research. Of course, if we fail to consider a
relevant fact simply because we are accidentally ignorant of it—if we don't know what we don't know—then
this lack of knowledge does not constitute a fallacy.
1. Appeal to Ignorance
The first fallacy of irrelevance we will consider in this category is appeals to ignorance. An appeal to ignorance
happens when one argues that a statement is true because there is no evidence that it isn’t true, or the
statement hasn’t yet been proven false. A famous example of an appeal to ignorance comes from Bertrand
Russell, a prominent British philosopher of the first half of the 20th century. Russell asks his audience to imagine
that there is a teapot orbiting the sun between Earth and Mars. The teapot is too small to be seen by even the
most powerful telescopes. Russell’s hypothetical interlocutor thus concludes that since we cannot prove there
isn’t a teapot there, we must assume that it is there orbiting the sun. Obviously, this is patently ridiculous, and a
straightforward reply is to demand positive evidence for us to believe the teapot claim.
Nonetheless, it is quite common for people to engage in this fallacy in their everyday lives. People speak as if it
is the business of others to disprove their beliefs rather than their own responsibility to prove them. Conspiracy
theories, for example, often rely on appeals to ignorance. People will argue that because scientists cannot
disprove the existence of UFOs, they must exist. Carl Sagan summed up this issue nicely when he observed in
The Demon-Haunted World: Science as a Candle in the Dark:
The claim that whatever has not been proven false must be true, and vice versa. (e.g., There is no
compelling evidence that UFOs are not visiting Earth; therefore, UFOs exist, and there is intelligent life
elsewhere in the Universe. Or: There may be seventy kazillion other worlds, but not one is known to have
the moral advancement of the Earth, so we’re still central to the Universe.) This impatience with ambiguity
can be criticized in the phrase: absence of evidence is not evidence of absence.
Appeals to ignorance even happen in religion, where people argue that since we cannot prove that God
doesn’t exist, he must actually exist—or alternatively, that since we cannot prove God does exist, that he must
not. Appeals to ignorance go both directions. Not only do people argue that if there is no evidence against
something, it must be true, people also argue that if there is no evidence to support something, it must be false.
Both forms of appeal to ignorance are fallacious.
THINK ABOUT IT
Would you hire an exterminator if you had not seen any vermin in your house? Probably not, and it’s
reasonable to make that decision if you haven’t seen any pests. In fact, we rely on what we don’t know to
be true to guide us to decisions all the time; it is only a fallacy when you insist that a lack of evidence is
absolute proof, especially when we are not in a position to have that evidence.
Remember that informal fallacies are largely about context and interpretation, however. There are cases where
an appeal to ignorance may at least contribute to an argument, if not providing airtight proof. For example,
suppose the claim is one that, if true, people would have to know about it. If you claim that aliens had invaded
the earth and were everywhere, it would be fair for someone to respond that they haven’t seen any. The lack of
evidence in this case challenges the claim that the aliens are everywhere.
Consider the argument, “Kids today are so disrespectful. It’s because they don’t get the same discipline they
used to.” A person might then respond, “Well, I’ve been teaching for thirty years and have not seen any
difference in the way kids behave.” This does present a counterargument; although it is based on a lack of
evidence, the person responding should have seen evidence of kids behaving differently if they knew many
children over a long span of time and could observe behavioral shifts between generations.
TERM TO KNOW
Appeal to Ignorance
An argument is dismissed because it has not yet been proven true, or is accepted because it has not
been proven false.
2. Appeal to Incredulity
Appeals to incredulity are fallacious arguments in which the arguer presumes that the true position must be
the one that is easy to understand or imagine. Often, appeals to incredulity take the form of claiming
that because you cannot imagine how something could be true, it can’t possibly be true. Sometimes, this fallacy
shows up as someone undermining the authority of an expert by appealing to their own failure to understand
the argument. It often appears in disputes over expert opinion and in debates about science.
A common example is challenging the theory of evolution. Opponents often use “difficult-to-explain” features
of biology as proof that there must be a creator. For example, they may say, “Scientists cannot explain how the
platypus evolved by the theory of Darwinian evolution. Therefore, the platypus didn’t evolve.” However, the
problem with this sort of argument is that even if there is no non-design-based explanation for how a certain
organism evolved in the present moment, one could be discovered in the future. The inexplicable is simply
inexplicable at the present moment and does not provide evidence for an alternative explanation.
This doesn’t mean you have to believe any theory or accept it as true if you don’t understand it, just that you
should not dismiss it as false. Perhaps a person explains dark matter, and you have trouble understanding
exactly what dark matter is or why they believe in it. You might be skeptical. That doesn’t make it a fallacy; it is
only a fallacy if you state, “There’s no such thing as dark matter,” because you don’t understand the concept.
BRAINSTORM
You might not understand aerodynamics, but your airplane is staying in the sky with or without your
understanding. What are some other topics or ideas that might be prone to the appeal to incredulity fallacy
because they are too wild or complicated to be easily understood?
TERM TO KNOW
Appeal to Incredulity
An argument is dismissed because it is too difficult to imagine or comprehend.
3. Appeal to the Stone
Appeal to the stone is a fallacy in which an arguer dismisses a claim as absurd without actually demonstrating
why it is absurd. The term comes from a story in which the English writer Samuel Johnson (famous for his
landmark dictionary of English) was listening to a friend argue that the physical universe is illusory, a claim
that is seemingly impossible to refute. Johnson said, “I refute it thus!” and kicked a stone.
These arguments are fallacious because they fail to consider the merits of the claim under dispute. Appeals to
the stone happen in all sorts of contexts and function as straight-up dismissals of ideas, beliefs, and theories.
For example, someone may say, “You barely even need to hear anything about the theory of evolution to know
it is ridiculous. They basically just think we are all monkeys.” Such a dismissal is an appeal to the stone. There is
no engagement with the argument itself, and no demonstration as to why the theory is absurd or false.
Remember that in critical thinking, we should support any conclusion with evidence. In this case, we may ask
that the other person do so but suspend judgment ourselves unless we can provide evidence to justify our
dismissive attitude.
As with the appeal to incredulity, this does not mean you have to accept the argument as true, or even worth
your time, if it seems ridiculous and no evidence is given to support it. For example, when hearing the
argument that the entire universe is illusory, Johnson might have said, “Without proof or even a way to test this
theory, I don’t see the point in pondering it,” which would not have been a fallacy. The fallacy comes from
presuming that an argument is refuted by your claim of ridiculousness.
It’s a fallacy when… An argument is considered false or impossible because it is absurd.
It’s not a fallacy if… The argument is considered unprovable or not useful, but no presumption is made that it is therefore false.
TRY IT
Imagine that you like playing racing games, but your roommate is concerned that these types of video
games make people into bad drivers. “No, they won’t,” you say. He replies, “You don’t know that.” You say,
“You’re one to talk, you play Grand Theft Auto all the time and were in an accident last week.”
What fallacies are present in your argument? What fallacies are present in your friend's argument?
The first fallacy, made by your roommate, is an appeal to ignorance, “You don’t know that.” An
argument should not be presumed true until you have evidence against it. Your response, “You were in
an accident last week,” is a red herring, simply changing the subject to avoid the real topic. You might
have also called it an ad hominem attack or a “tu quoque” fallacy, since you turned the discussion to
his own driving accident. In any case, bringing up your roommate’s accident doesn’t refute his
argument.
TERM TO KNOW
Appeal to the Stone
Dismissing a claim as absurd without actually demonstrating why it is absurd.
SUMMARY
In this lesson, you continued to learn about fallacies of irrelevance, which refers to using appeals in an
argument that do not actually support the argument. Another way of doing this is failing to take relevant
considerations into account, such as an appeal to ignorance (stating that because nobody can
disprove your argument, it must be true), an appeal to incredulity (dismissing an argument because you
find it hard to believe or hard to understand), and the appeal to the stone (essentially dismissing an
argument as ridiculous).
Source: THIS CONTENT HAS BEEN ADAPTED FROM Critical Thinking: Analysis and Evaluation of Argument.
REFERENCES
Sagan, C. (1995). The Demon-Haunted World: Science as a Candle in the Dark. New York: Random House.
TERMS TO KNOW
Appeal to Ignorance
An argument is dismissed because it has not yet been proven true, or is accepted because it has not
been proven false.
Appeal to Incredulity
An argument is dismissed because it is too difficult to imagine or comprehend.
Terms to Know
Ad Hominem
A fallacy where one attacks the person making an argument instead of refuting the argument
itself.
Affinity Bias
An implicit bias that leads us to favor people who either we perceive as similar to us or who
we feel connected to.
Analyzing
The process of examining a subject, idea, or belief in question, understanding it, and being
able to explain both the information and its implications.
Antecedent
The “if” part of an “if/then” statement.
Appeal to Authority
The fallacy of supporting our arguments with the claims of famous or well-regarded people
who either are not experts on the matter being argued or do not represent the consensus of
experts in the field. Also known as the appeal to false authority.
Appeal to Consequences
The fallacy of trying to assess the truth or reasonableness of an idea based on the
consequences of accepting that idea.
Appeal to Ignorance
An argument is dismissed because it has not yet been proven true, or is accepted because it
has not been proven false.
Appeal to Incredulity
An argument is dismissed because it is too difficult to imagine or comprehend.
Attribution Bias
A form of implicit bias where we tend to attribute our own successes, and those of the people
we’re close to, to hard work and talent, and our setbacks to bad luck or unfairness; we then
do the opposite for people we do not feel connected to.
Bad Faith
An argument or discussion in which one or more of the parties has a hidden, unrevealed
agenda, leading to dishonesty and a refusal to accept the basic dignity of their opponent.
Belief
Confidence that a statement is true or that something exists.
Causation
When one event or action directly causes another.
Closed-Minded Thinking
Having a fixed point of view and refusing to consider other ideas or opinions.
Cognitive Bias
An error in the way we gather and process information.
Collectively
To attribute a quality to something as a whole. This may or may not be fallacious depending
on context.
Compassion
Acknowledging the humanity in others and empathizing with it.
Composition Fallacy
An error in reasoning that since each part of the whole has a certain feature, it follows that
the whole has that same feature.
Concluding
Forming an opinion, belief, or attitude, or determining a novel fact about the subject or idea
in question.
Conclusion
The logical result of the argument.
Conditional
An if/then statement (other words may be used but the sentence can be rephrased as an
if/then statement).
Confirmation Bias
The tendency to search for, focus on, analyze, and remember information that aligns with our
preconceived perceptions and opinions.
Consequent
The “then” part of an “if/then” statement.
Correlation
An established relationship between two variables or sets of data.
Critical Thinking
Thinking in an active, persistent, and careful way about any belief, the evidence that supports
that belief, and the conclusions that it leads to, which is in part directed at finding out as
much relevant, truthful information as practically possible.
Distributively
To attribute a quality of something to its parts. This may or may not be fallacious depending
on context.
Division Fallacy
An error in reasoning that since the whole has some feature, each part must also have that
feature.
Dogmatic
The practice of insisting on one’s principles as the only possible good or correct beliefs.
Egocentric Bias
The tendency to rely too heavily on one’s own perspective, not considering the perspective
of others, and/or to have a higher opinion of oneself than is realistic. Hindsight bias and
attribution bias are two forms of egocentric bias.
Equivocation
An error in reasoning that relies on using different meanings of the same word or phrase
within the same argument.
Fallacies
Errors in reasoning; unlike biases, these are not based on underlying assumptions or the way
you think, but flaws in the argument itself. Fallacies may also be used intentionally when
arguing in bad faith.
False Dichotomy
An error in reasoning that presents two options (often extreme) as if they were the only ones
available.
Formal Fallacy
An error in the structure of an argument.
Formal Logic
Structured reasoning that follows a set of rules and may use mathematical symbols.
Genetic Fallacy
A fallacy of rejecting (or accepting) an argument based solely on its origin and on the
negative (or positive) associations people have with that origin, which are not relevant to the
soundness of the argument.
Good Faith
An argument or discussion in which all parties present are honest, respectful of each other’s
dignity, and genuinely want to hear what the other person has to think and say.
Hindsight Bias
The tendency to perceive past events as being more predictable than they were.
Idealism
Devotion to an ideology.
Ideologue
One with rigid adherence to a set of principles.
Ideology
A deeply-rooted core belief that guides a person’s commitment to other values, principles,
and decisions. Often ideology is a key part of our identity and is held by many members of
the same group.
Implicit Bias
Any kind of bias that automatically and unintentionally affects critical thinking. Implicit bias is
based on assumptions, beliefs, or attitudes that we are typically not consciously aware of, but
which affect and limit our thinking. In use, implicit bias usually refers to how we judge and
treat other people.
Inconsistency Fallacy
When an argument includes a self-contradiction.
Informal Fallacy
An error in reasoning which cannot be identified without understanding the meanings behind
the statements involved in the argument.
Integrity
The quality of making sure your actions and opinions are consistent with your stated values.
Loaded Question
Asking a question that contains a controversial or unjustified assumption.
Logic
Reasoning by way of a set of rigid rules.
Misinformation
Information that is outright false, or that is misleading (factually true but leading to a false
conclusion).
Moral Reasoning
A more nuanced and often truer picture of what is and is not a right action, achieved by
applying critical thinking to personal ethics.
Open Questions
Questions that require the respondent to elaborate on a given point. These include questions
such as, “What do you think of…?”, “How does that relate to…?”, and “Why do you think… is true?”
Open-Mindedness
Monitoring your own thinking to make sure that you are open to every possibility, every kind
of information, and every kind of alternative or approach, and are not wedded to your
preconceived opinions.
Personal Ethics
An individual’s understanding of which actions are “right” and which actions are “wrong,” not
as facts, but as moral principles.
Pragmatism
Focus on practical and workable solutions that may entail compromise.
Premises
The statements that should logically lead to (or be the evidentiary support for) the conclusion.
Questioning
The art of interrogating a subject, idea, or belief.
Reasoning
The act of thinking in an active, persistent, and careful way.
Red Herring
Diverting someone’s attention from the actual issue at hand by drawing attention instead to
an issue that has only a surface relevance to the argument.
Specific Questions
Questions that require a concrete answer and may have a factual answer. These include
questions like, “Where were you on the night of June 5th, 2017?” and “What is the capital of
North Dakota?”
Statement
A declarative sentence that asserts something to be true.
Tu Quoque
A fallacy of avoiding a criticism by raising a similar criticism of your opponent rather than
answering the original criticism.
Valid
The conclusion follows logically from the premises.