Education Trends, Podcast Articles | Sep 1, 2024

Why humans fall for misinformation & creative ways to teach information literacy skills

By Angela Watson

Founder and Writer


Even young students can learn how to understand and combat misinformation, and it’s a key information literacy skill in 2024.

In this episode, I’m talking to Melanie Trecek-King of Thinking is Power, because I love how her approach to the topic of misinformation is characterized by compassion and empathy. Melanie emphasizes that everyone is susceptible to misinformation, and falling for it doesn’t indicate a lack of intelligence. This perspective humanizes those who believe in conspiracy theories or disinformation, so we can view them as people who, like ourselves, have unknowingly accepted false information.

Melanie is an Associate Professor of Biology at Massasoit Community College in Massachusetts, where she teaches a general-education science course designed to equip college students with critical thinking, information literacy, and science literacy skills. An active speaker and consultant, Trecek-King loves to share her “teach skills, not facts” approach with other science educators and to help organizations meet their goals through better thinking. Trecek-King is also the Education Director for the Mental Immunity Project and CIRCE (Cognitive Immunology Research Collaborative), which aim to advance and apply the science of mental immunity to inoculate minds against misinformation.

Melanie and I discuss three primary reasons we fall for misinformation:

1) Confirmation Bias: Our tendency to interpret information in ways that confirm our existing beliefs. Once we believe something, we see evidence for it everywhere, reinforcing that belief. Skepticism is crucial for protecting oneself from misinformation, but it’s most challenging when information confirms our biases.

2) Appeals to Emotion: Emotions, particularly anger, outrage, and fear, can trigger the part of our brain that hinders critical thinking. Many forms of misinformation specifically appeal to our emotions to convince us without evidence. When we feel emotionally triggered, it’s a good time to slow down and practice emotional skepticism.

3) Reiteration Effect: Also known as the illusory truth effect, the reiteration effect means that the more we hear something repeated, the more likely we are to think it’s true, even if it isn’t. Our brain equates ease of processing with truth, so repeated exposure to false information can lead us to believe it.

We also discuss the problem with “doing your own research,” why Melanie sees 2024 as the post-trust era rather than the post-truth era, and how we can respond.

The remainder of our conversation is centered on how to teach information literacy to students. Melanie provides actionable tips and ready-to-use resources to help you:

1. Demonstrate to students that they can be fooled (e.g., through personality reading exercises).
2. Discuss how beliefs are formed using non-triggering examples (e.g., historical witch trials).
3. Include misinformation in lessons to help students recognize its characteristics.
4. Use tools like the FLOATER toolkit to help students evaluate claims systematically.
5. Have students create misinformation to understand its techniques better.

While these concepts are typically taught at the college level, they can be introduced as early as middle school. Even elementary students can begin to understand concepts like author’s purpose and recognizing persuasive techniques.

Understanding misinformation is crucial in our daily lives, yet it’s often absent from educational standards. Check out Melanie’s site for lots of free resources to teach about misinformation using humor and non-triggering approaches to help students recognize it in the real world.

 

Listen to episode 309 below,
or subscribe in your podcast app

Sponsored by Opportunity Gap

Key quotes

  • “We are all prone to falling for various types of misinformation. It’s just a matter of what kind of misinformation we’re most susceptible to.”
  • “The most important time to be skeptical is when something confirms our biases because it fits how we think the world works.”
  • “Appeals to emotion, like anger and fear, trigger the part of our brain that is highly emotional, making us less able to think critically.”
  • “The reiteration effect: the more we hear something repeated, the more likely we are to think it’s true, even if it’s not.”
  • “Doing your own research often means looking for information that confirms what you already thought was true. True research involves systematically collecting and evaluating evidence.”
  • “We can’t be independent thinkers on everything and we shouldn’t aim to be. Trusting experts who have dedicated their lives to studying specific topics is essential.”
  • “Disinformation purveyors don’t need to prove the consensus is wrong; they just need to create doubt.”
  • “People who fall for disinformation think they know better and feel empowered, but it’s incredibly difficult to break through that certainty and confidence.”
  • “Understanding misinformation is crucial to our daily lives and often absent from our standards. Bring various forms of misinformation into your classrooms to help students recognize it in the real world.”
  • “One important skill is lateral reading: open a new tab and search for the claim or source with terms like ‘fact check’ or ‘reliable.’”

The Importance of Empathy in Addressing Misinformation

ANGELA: So Melanie, one of the things that I really loved about your approach when I heard you on the Divorcing Religion podcast is how compassionate and empathetic you are and your insistence that we all fall for misinformation. It’s just a matter of what kind of misinformation we’re most susceptible to. So it doesn’t mean that a person is stupid if they believe in a conspiracy theory or other types of disinformation. And I think this is a really important place to begin because it really humanizes the other side so they are no longer idiots or horrible people. They’re just folks who have believed some things that they don’t realize aren’t true. Can you say more about that?

MELANIE: We are all prone to falling for various types of misinformation. Whenever I talk about misinformation with students, everybody seems to understand that misinformation is a problem, but we all think it’s somebody else’s problem. Like, we’re too smart to fall for misinformation, but of course, we know other people who fall for it. The problem with that is that we can all fall for it, and there’s not necessarily a link between intelligence and what you fall for. We all have blind spots, things we want to believe or don’t want to believe, and so part of not falling for misinformation means recognizing your own vulnerabilities.

Three Primary Reasons for Falling for Misinformation

You’ve identified three primary reasons that we fall for misinformation: confirmation bias, appeals to emotion, and the reiteration effect. I’d like to unpack each one of those with you, one by one. Let’s start with confirmation bias.

Confirmation Bias

Confirmation bias is our tendency to interpret information in ways that confirm what we already think is true. And so once we think something is true, we see evidence for it everywhere. And every time we do, it reconfirms that belief. Skepticism is the single most important factor in protecting yourself from misinformation. But the most important time to be skeptical, and the hardest time to be skeptical, is when something confirms our biases. Because if I’m scrolling through a newsfeed and I see something that fits with how I think the world works, then why would I question it? I mean, it makes sense. And so it just seeps in there and I accept it as true.

Funny story here. My husband and I like to play card games, and I don’t like to lose. So we have this game that we play, and we’ve played it a lot, and every time we pull out the game I’m like, “Oh gosh, you’re going to beat me again, you know, here we go”.

“What are you talking about? You win all the time.”

“No, I don’t. You beat me all the time.”

Now, the interesting part is that he’s been keeping score. So he knows exactly how many games he’s won and how many I’ve won. And I think right now we’re at 90-something to 60-something. I do win more than he does. But if he didn’t have that data, I would still think that I lost more, because losing matters more to me. When I lose, it confirms that bias. And if I win, oh, it’s just this one time. All of that is a way to continue to interpret information in a way that confirms how we see the world. This is even why we fall for satire. Satire makes us laugh while making a point, but we’ve all seen people who fall for satire. And it’s because it fits. Of course that’s the way it works, and so they don’t question it.

How can we be skeptical in those moments when we first encounter information? What would it look like to be skeptical about something that is confirming our existing worldview?

Yeah, this is the hardest time to be skeptical, because you can’t be skeptical about everything all the time; it would take too much energy. So to me, it’s more about knowing your own vulnerabilities and having a healthy level of awareness that allows the question to enter your mind. Because if you don’t stop and question whether something is true, then you won’t check. I teach my students how to fact-check, but you won’t fact-check if you already think something is true. So that little check in your head: just check in with yourself. Is this true?

Another story here. So last year at Halloween, this image was going around of “Halloweiners,” black licorice-flavored hot dogs. And I saw people sharing it who, honestly, I would have thought should know better. They were PhDs, academics, nutritionists, people who fight misinformation, and they’re like, Look at American diets, this is absolutely disgusting. And I thought, Wait a minute, that doesn’t look right. So I fact-checked it. And in fact, it was an altered image. It was satire. And they fell for it because it confirms that American diets are terrible. So of course, Americans would eat black licorice-flavored hot dogs.

So this is just about being aware of your confirmation bias, and maybe not going into everything with skepticism because, as you mentioned, it’s a drain on our resources. Our brain uses confirmation bias to help us think more efficiently; if we questioned everything we learned, that just wouldn’t be efficient. It’s something that’s set up to help us. But can we get more curious about it? You know, what else might be true? What might be another perspective? Is that the kind of direction you’re going with confirmation bias?

You’re right. We literally can’t be skeptical about everything, so it’s about knowing the times and places. And sometimes there’s more at stake if you’re not skeptical. Especially if you’re going to share something, like it, or interact with it in some way, then check.

Appeals to Emotion

Another reason that we fall for misinformation, in addition to confirmation bias, is appeals to emotion. How does that work?

Yeah, appeals to emotion. This is a broad group of fallacies that can involve any number of emotions. But in particular, anger, outrage, and fear trigger the part of our brain that is highly emotional, the fight, flight, or freeze part, and when that happens, we’re less able to think critically. A lot of forms of misinformation specifically appeal to our emotions to try to convince us without evidence. And so they’ll use hyperbolic or overly sensationalized language, name-calling, yelling.

If you feel yourself being emotionally triggered, it’s a pretty good sign that you might want to slow down and practice some emotional skepticism. It’s also why we fall for satire, again: satire can confirm our biases, and it makes us laugh. So there’s another emotion at work, and once we’ve got that emotional part of the brain going, our critical thinking faculties are less online.

Reiteration Effect

So we may fall for misinformation due to confirmation bias or appeals to emotion. Another reason is the reiteration effect. Let’s talk about that one.

Its other name is the illusory truth effect. I like “the reiteration effect” because it better describes the concept: basically, the more we hear something repeated, the more likely we are to think it’s true, even if it’s not. And actually, in studies, if you tell people something is not true but then repeat the piece of misinformation again and again, they’re going to remember it as true. So this is really a problem if you’re in an echo chamber where the same false information is pinging around and around. Our brain equates ease of processing with truth. We hear something again and again, and every time we hear it, our brain doesn’t have to think about it quite as much. So over time, the brain just equates that with, Well, it must be true.

I definitely feel like I see that a lot in politics, where there are certain talking points or certain party lines and folks will repeat them over and over. And I realize, I’ve heard this so many times, and I don’t actually think it’s true. I don’t think there’s any evidence to back this up at all. I think this is their own spin, but it’s become such a talking point. I’ve seen that in myself: I’d nearly believed that something like that was true, even though I knew it was actually just a talking point, just based on the number of times I heard it spoken and saw it in print.

Yeah, and I commend you for doing that because once you’ve heard it so many times, it’s even hard to step back and question if it’s true.

The Problem with “Doing Your Own Research”

You have a really amazing article on your site called “The problem with doing your own research.” I love that because I feel like that’s become such a buzzword over the last few years: do your own research. What’s the problem with that?

The problem is — and this is a broad generalization here — that oftentimes when this phrase is used, what it means is, I went to Google and I looked for information that confirmed what I already thought was true, and that is not how research works. There is a difference between primary and secondary research, and one can do research by looking at existing research, but it is still a systematic way of collecting and evaluating evidence. When someone is doing their own research in an area where they don’t have expertise, there’s limited knowledge: they don’t know what they’re looking for, and they don’t know the body of evidence. There’s a giant hole, a lack of understanding, but a desire to fill it, and it’s usually filled with something that we want to believe or don’t want to believe. And so that motivation fuels us.

We go to a search engine and we use it. We might even find a study that says it, and you can find a study that says almost anything. I could go to Google Scholar, type something in, and come away with “science.” But to me, this highlights the importance of knowing what you don’t know and recognizing your limits. Ironically, I often see this from people who fashion themselves as some sort of independent thinker. And this phrase bothers me a bit, because none of us are independent thinkers. Even if we can think critically and deeply about a particular topic, it is impossible to do on every topic. None of us can have that kind of knowledge.

And the result is, you’re ultimately gonna have to trust someone else. So, if I’m not doing primary research, if I’m doing secondary research and I’m looking at other existing sources, who do I choose to trust and why? And oftentimes when our biases are leading us, we’re picking the sources that tell us what we want to hear and not necessarily the ones that are most reliable.

Yeah, we’re not doing our own research; we’re doing our own Google search. That’s number one, a big part of it. Another thing that comes to mind when I think about people doing their own research and our confirmation bias is that the lack of trust in our society is a big problem. Because you can find an article that confirms pretty much anything, and you can find a study that has been interpreted in any kind of way. It might be a study with eight participants, but you know, these scientists have reached a conclusion that X, Y, and Z is true.

And so we have a situation right now, specifically in the US, in which we do not trust our experts. We don’t trust our teachers to make good decisions for kids. We don’t trust our public schools. We don’t trust our doctors. We don’t trust our politicians. There’s just such a lack of trust, and I think we have to be really careful not to abandon the idea that truth exists, that truth can be uncovered and we can understand it. There are people who benefit from us giving up and just saying, well, I guess we’ll never really know. There are a lot of things that we can know, and I think we can approach them with intellectual humility, as you’re saying: knowing what we don’t know and always being open to more information. We cannot be independent thinkers on everything, and that shouldn’t even be the goal, necessarily, I don’t think.

Because being an “independent thinker” can mean that I don’t trust anybody else who has dedicated their life’s work to studying this, understanding this, is credentialed in it, is living and breathing this thing. And I think it’s especially important for teachers to consider, because we in education know what it’s like to not be trusted. We know what it’s like for some stranger on the internet to think they know better than we do about what we should be doing in our classrooms. I’m always trying to be mindful of, not necessarily deferring to people who have more expertise, but certainly listening to them and considering them. Because if you have expertise and skills and experience in an area that I don’t, your opinion should weigh a little bit more in my mind than, you know, Joe Schmo on Facebook leaving a comment about something he found in doing his own research.

Post-Truth Era and Trust Issues

I am so glad you brought that up, because there’s a lot of talk about post-truth, and I don’t think we’re in a post-truth era, actually. I think people want the truth. I do think that some have so much cynicism at this point that they doubt our ability to find the truth. But I think you’re right that this is actually a function of the lack of trust, and sometimes that’s not by accident. So I think it’s more of a post-trust era, in that our trust has broken down. And instead of trusting the most reliable sources of information, or the most reliable processes of knowledge production, we’re trusting, air quotes, those in our tribes or in our groups who confirm our identities or share similar beliefs. Disinformation pushes distrust partly for that reason. So misinformation is information that’s not true; disinformation is deliberately deceitful. Disinformation is a subcategory of misinformation.

Disinformation purveyors have a strategy: they don’t have to prove to you that the consensus is wrong, they just have to sow enough doubt to get you to go, Well, I don’t know. We don’t know enough, so maybe we shouldn’t do anything. And then they, almost by definition, have to appeal to conspiracy theories, because that is really the only way to explain why basically all the world’s experts don’t hold their position. So they’re all conspiring, and of course, you can’t trust any of them. They’re out to get you. We’re on your side, right? And that opens the door for whatever it is that they want to sell you. So that breakdown of trust is part of a strategy. It’s sometimes a side product of other things, but disinformation purposefully sows distrust for that reason.

You also said that “independent thinker” means I don’t trust others’ expertise, and that’s totally what it means: I trust myself and not experts. And I will say, for myself at least, I know what I know within a relatively narrow area, I know there’s more to know within that area, and I know there’s a whole lot more that I don’t know. But importantly, I know where to go to find that information. I don’t have to know everything; I just need to know who knows it so that I can have the best access to that information. And that’s why human societies are so successful, in part: because we specialize. We divide things up, and I can study critical thinking while somebody else grows my food and somebody else treats my wastewater and fixes my car and flies my plane. That’s how societies are built, and that trust breakdown is a serious problem.

It is. And understanding that it is an intentional strategy by people who have ulterior motives. I often hear phrases like, I’m just asking questions. That’s sort of code for, I’m just going to sow some seeds of doubt and poke at this a little bit, but I don’t actually have any answers. I don’t have any research. I don’t have any expertise. I’m just going to make you distrust other sources besides me. We have to recognize that as a strategy. The goal is to get people to give up on ever discerning truth, to think that it’s just not possible to arrive at it. And as you mentioned, that’s a way to control. It’s a way to keep you within that bubble of information, trusting them and not anyone else.

Ironically, the way that it’s often packaged is as a false sense of empowerment, meaning people who have fallen for this think that they know better, that they are armed with knowledge, even special knowledge, and that they know who they can’t trust. It feels enlightened, but it is the opposite of that. And that feeling, the feeling of certainty, the feeling of knowing, that feeling of empowerment is incredibly difficult to break through.

Right. And we’ve got those three reasons that we fall for misinformation at play. We’ve got confirmation bias: this person is questioning an expert I don’t like, or questioning someone who sees the world differently than me. It’s appealing to my emotions: it’s making me feel empowered, and it’s also making me feel angry because I’m being lied to. And it’s the reiteration effect: you’re continually telling me that this can’t be trusted, or this isn’t right. And those three things together, hmm, yeah, it starts to make sense. Maybe that conspiracy is true.

So let’s get into how to teach these skills to students. You have tons of really clever and engaging free lessons on your site, and my favorite is Wake Up, Sheeple, which I feel ties into what we’re talking about here so perfectly. It’s a design-your-own-conspiracy lesson, and it’s designed to show kids how conspiracy theories originate and how easily they spread. I’m going to link to your lesson on how to inoculate against misinformation, and something you have called FLOATER, a toolkit against misinformation. Those are some of the gems that I found on your site, but tell me some of your favorite ways to make the kinds of topics we’ve talked about accessible to kids, and to make information literacy meaningful for them.

So I love this question, and you’re probably going to have to shut me up at some point. One of the things that I’ve discovered in teaching these skills is that the most important first step is to get students to recognize that they can be fooled. And so I start class by fooling them. I tell them that I have a friend who is an astrologer and she’s going to do free personality readings for them. I give them a few questions and I say, “Next time, I’ll give you each a reading.” The next class, I hand out the readings and we vote: “Silently read this, and on a scale of 1 to 5, how accurate is she?” I’ve been doing this for years, and she averages about 4.3 to 4.5 out of 5. She’s super accurate. The students are really excited. “Okay, now talk to the person next to you about what you thought was accurate and why.” And sometimes it can take them 10 to 15 minutes before they realize they all got the same reading.

Now, I didn’t come up with this. It was originally done by Bertram Forer in the late 1940s, and James Randi made it famous. It’s based on Barnum statements, things like: You have a need for people to admire you. You’ve wondered whether you’ve done or said the right things. You’ve questioned your life choices. Pretty general statements that basically apply to all of us. But I do this because I want students to realize that they can be fooled, and if you don’t want to be fooled, that’s what critical thinking is for. If I just told them they could be fooled, they’d be like, Gosh, I’m smarter than that. But no, you can be fooled, so I do it to prove that to them. And then we start to talk about how we come to our beliefs.

I start with what I call the critical thinking training wheels approach. I purposefully start with beliefs that aren’t triggering, because I want students to think about the process, to get used to how it feels to evaluate beliefs and the evidence for them without being emotionally triggered. And so I cover the witchcraft trials in Europe from the 15th to the 17th century. We talk about the things people were accused of and the things people confessed to. Why did they confess? Well, let me show you. So I show them some of the torture, and it’s horrible, right? I would confess to basically anything, and a lot of people did. Was that good evidence? The accusers were obviously convinced they were right. I mean, they were so convinced that they were torturing and killing hundreds of thousands of people. But was their evidence good?

Now, I want them to turn that in on themselves. From this bird’s-eye view, where they’re not personally invested in the belief, they can look at the accusers and go, Well, they thought they were right, but how good was their evidence? So as we go, I’m moving through basic epistemology and metacognition, and then I include a lot of misinformation in class. I think this is really key, because if we don’t show students what bad science looks like, science denial, pseudoscience, conspiracy theories, fake news, whatever it is, and only show them the good stuff, that’s not how the real world works. They’re going to walk out of our classrooms, get on their phones, and then how can they tell the difference? Seeing the misinformation helps them understand the characteristics of the more reliable information.

The toolkit is really helpful because I realized I was asking students to evaluate claims, but where do you start? If I give them a claim about Reiki or crystal healing and say, okay, think critically through this, it’s a mountain, right? Where do you start? So the toolkit is intended to break it down, starting at the beginning. Can I test this claim? The F is falsifiability: can I prove it wrong? Is it logical? What kinds of fallacies are being committed? Am I being honest with myself? That’s the objectivity. Are there other ways to explain it? Those are the alternative explanations. Tentative conclusions, evidence, reproducibility: all of it is a way to help students think through claims. I start early in the semester, I give them lots of opportunities to practice, and I dive into different rules at different levels of detail as the semester progresses. And I have them create misinformation.

So the conspiracy theory lesson that you’re referencing is an example. I teach my students how to cold read like psychics. I teach them how to argue illogically; actually, this is one of my favorite assignments. I teach a course on critical thinking, and one semester, around finals time, I was getting emails from students arguing that they should pass the class, and I was reading them going, if you had learned what I was hoping to teach you, you wouldn’t be sending me this argument. So now it’s an assignment early in the semester, after teaching some logical fallacies. I say, “Okay, pretend it’s the end of the semester and you’re failing, because you deserve to, right? In this scenario, you should not pass. Send me an email arguing for why you should pass the class, using at least four fallacies from class.”

And so they’ll tell me, “Well, my dog got hit by a car and I’m really sad, and if you fail me, I’m going to be living in a tent and be homeless for the rest of my life,” or, “My parents think I should pass.” Or they love to insult my character, so lots of ad hominem attacks. But the point is, it’s always fun, right? These are in jest, humor, so they send me these emails, and then they have to find the fallacies that other students used in their arguments. So yeah, it’s a riot. And then there’s another assignment where I have them create advertisements for fake pseudoscience products. I teach them how to be science deniers. The whole point is that they learn to recognize the techniques of that particular type of misinformation so that they don’t fall for it in the real world.

Right, if you recognize it, then you understand what’s happening, and I find it so interesting. As someone who runs an online business, I see the marketing hacks and the things that marketers do. People in my industry see right through it, but the average person has no idea that this whole thing is just a way to get someone onto an email list, or whatever the thing is. But the more that you see it, the more you can’t unsee it. And that’s really powerful.

That’s exactly what my students tend to tell me. After the psychic cold reading, I’ll show them how to do it, and then show them psychics, and they’ll go, Wait, what? They’ll say, “I can’t unsee this now.” And I bring in my lab coats and let students take pictures of themselves in lab coats for their ads. So: “Dr. So-and-So says it works,” “certified by,” and sometimes they’ll just put “certified” on it, no certification body or anything. Technobabble like “phytochemical enzyme electron vibrations at the cellular level,” or something. It’s science mad libs, but you know, it makes it sound sciencey. And yeah, the students will all say, “Well, look at this. I saw this on Instagram, professor. This is exactly like what we…” Yes, it is exactly the same.

Recognizing this technique is so powerful. At what age do you think it’s appropriate to start introducing these concepts?

So I teach college students, I’ve taught this class to high school students, and aspects of it have been taught in middle school. I would definitely say middle school is right for this. Students at that age are curious and able to handle the more challenging concepts. And you know, I teach college students, so it’s never too late, I should say that. But starting earlier is better.

How to Teach Information Literacy Skills

The first crucial step in teaching these skills is getting students to recognize that they can be fooled. One effective method is to start the class by fooling the students, for example, with a fake astrology reading.

Critical Thinking Training Wheels Approach

Start with beliefs that aren’t emotionally triggering. This allows students to practice evaluating beliefs and evidence without being emotionally invested. Historical examples, like the European witchcraft trials, can be useful for this purpose.

Exposing Students to Misinformation

It’s essential to show students what bad science, science denial, pseudoscience, and conspiracy theories look like. This helps them understand the characteristics of more reliable information.

The FLOATER Toolkit

The FLOATER toolkit breaks down the process of evaluating claims:

– F: Falsifiability (Can the claim be proven wrong?)
– L: Logic (Are there any logical fallacies?)
– O: Objectivity (Are we being honest with ourselves?)
– A: Alternative explanations
– T: Tentative conclusions
– E: Evidence
– R: Reproducibility

Creating Misinformation

Having students create their own misinformation can be a powerful learning tool. Examples include:
– Teaching students how to cold read like psychics
– Creating advertisements for fake pseudoscience products
– Arguing illogically (e.g., writing an email arguing to pass a class using logical fallacies)

Age-Appropriate Information Literacy Education

While Melanie primarily teaches college students, these concepts can be introduced as early as middle school. Students at this age are curious and capable of handling more challenging concepts. Starting earlier is generally better, but it’s never too late to learn these skills.

Lateral Reading: A Key Skill

Lateral reading is a crucial skill for evaluating online information. Instead of staying on a single website to determine its reliability, students should:

1. Open a new tab
2. Search the claim or source along with terms like “fact check,” “reliable,” or “credible”
3. Use the broader information ecosystem to assess trustworthiness

This method is faster and more effective than traditional website evaluation techniques.

Understanding Social Media Algorithms

It’s important for students to understand why they see certain content on social media platforms. Having students compare their social media feeds can help them realize how different their online ecosystems can be, and how this impacts their worldview.

Key Takeaway

The most important takeaway is the necessity of incorporating misinformation education into classrooms. While it’s crucial to our daily lives, it’s often absent from educational standards. Educators should strive to bring various forms of misinformation into their classrooms in a humorous, light-hearted, and non-triggering way to help students recognize it in the real world.

Resources for Educators

You can find more resources and lesson plans from Melanie at the “Thinking is Power” website:

– The Foundations in Critical Thinking section covers basic concepts
– The Topics section allows you to explore by skill/topic
– The Educators tab provides info on how to use the teaching resources

The Truth for Teachers Podcast

Our weekly audio podcast is one of the top K-12 broadcasts in the world, featuring our writers collective and tons of practical, energizing ideas. Support our work by subscribing in your favorite podcast app. Everything is free!

Explore all podcast episodes

Angela Watson

Founder and Writer

Angela is a National Board Certified educator with 11 years of teaching experience and more than a decade of experience as an instructional coach. She started this website in 2003, and now serves as Editor-in-Chief of the Truth for Teachers...
