On Misinformation and Truth Sandwiches—Lisa Fazio, Vanderbilt University

[Image: a series of dimly lit gray doors with one red door. Photo credit: Arek Socha from Pixabay]

Episode Notes

Lisa Fazio is an assistant professor of psychology and human development at Vanderbilt University. She and her team in the Building Knowledge Lab study how children and adults learn new information, true and false, and how to correct errors in people’s knowledge.

Lisa’s research has applications in both the educational environment of the classroom and out in the world, including when it comes to how our brains process the inexhaustible stream of headlines, stories, videos, memes, likes, shares, and whatever else the Internet serves up to us at all hours of the day and night.

We framed our conversation around a study Lisa and one of her colleagues at Vanderbilt published in the September 2020 issue of the journal Psychological Science. It’s a paper that builds on previous work by her and many others related to how the number of times we hear a statement repeated impacts whether we think it’s true … even if it’s not.

In addition to implications for how we consume information on social media and elsewhere, this illusory truth effect has a throughline to Lisa’s research in education. And as she explains, all of us—no matter our age or our beliefs—must navigate the same internal mechanism that associates repetition with truth.

Whether this inclination serves us well or causes problems depends on the circumstances. But when it is problematic, such as in the case of misinformation, Lisa suggests a counter strategy that befits this, a podcast founded on the idea of brunch. It’s called:

The truth sandwich.

LINK

Episode Transcript

*Note: We do our best to make these transcripts as accurate as we can. That said, if you want to quote from one of our episodes, particularly the words of our guests, please listen to the audio whenever possible. Thanks.

Ted Fox  0:00  
(voiceover) From the University of Notre Dame, this is With a Side of Knowledge. I'm your host, Ted Fox. Before the pandemic, we were the show that invited scholars, makers, and professionals out to brunch for informal conversations about their work. And we look forward to being that show again one day. But for now, we're recording remotely to maintain physical distancing. If you like what you hear, you can leave us a rating on Apple Podcasts or wherever you're listening. Thanks for stopping by.

Lisa Fazio is an assistant professor of psychology and human development at Vanderbilt University. She and her team in the Building Knowledge Lab study how children and adults learn new information, true and false, and how to correct errors in people's knowledge. Lisa's research has applications in both the educational environment of the classroom and out in the world, including when it comes to how our brains process the inexhaustible stream of headlines, stories, videos, memes, likes, shares, and whatever else the internet serves up to us at all hours of the day and night. We framed our conversation around a study Lisa and one of her colleagues at Vanderbilt published in the September 2020 issue of the journal Psychological Science. It's a paper that builds on previous work by her and many others related to how the number of times we hear a statement repeated impacts whether we think it's true ... even if it's not. In addition to implications for how we consume information on social media and elsewhere, the illusory truth effect has a throughline to Lisa's research in education. As she explains, all of us--no matter our age or our beliefs--must navigate the same internal mechanism that associates repetition with truth. Whether this inclination serves us well or causes problems depends on the circumstances. But when it is problematic, such as in the case of misinformation, Lisa suggests a counter strategy that befits this, a podcast founded on the idea of brunch. It's called: The truth sandwich. (end voiceover)

Lisa Fazio, welcome to With a Side of Knowledge.

Lisa Fazio  2:21  
Thank you. I'm excited to be here.

Ted Fox  2:23  
So you and your colleague at Vanderbilt, Carrie Sherry, you just published a really interesting study in the September 2020 issue of the journal Psychological Science. The title is "The Effect of Repetition on Truth Judgments Across Development," and it deals with something called the illusory or illusory--not sure the right way to pronounce it--truth effect. What is the illusory truth effect?

Lisa Fazio  2:50  
Yeah, so the illusory truth effect is this finding that people tend to believe repeated statements more than they do something that they've only heard once. So by simply stating something twice, I can get you to believe it more.

Ted Fox  3:06  
And is there--does that relationship hold up with kind of diminishing returns beyond the repetition of just two times? Like, if I told you the same thing 10 times, presumably that's going to stick around more than if you just heard it twice?

Lisa Fazio  3:22  
Yeah. So this question of kind of what happens with multiple repetitions is really interesting. In the currently published literature, people have gone up to nine times.

Ted Fox  3:32  
Oh, wow.

Lisa Fazio  3:32  
And what you see is kind of a big increase from one to two, smaller from two to three, smaller, smaller, but still increasing each time.

Ted Fox  3:44  
So one finding of your study that really--and I shared this with you beforehand--but it kind of caused me to stop and say, Whoa, was you summarized it as: "Repetition increased adults' truth judgments even for falsehoods that contradicted preschool-level knowledge." And I think when I was first reading about your research, when I was first reading the paper, I was thinking, Okay, you know, for this illusory truth effect to be present, maybe the ideas presented have to be kind of complicated, or they're at least kind of opaque. But it sounds like the two of you showed that's not the case. Can you give us an example of something along those lines, where we see this kind of surprising result?

Lisa Fazio  4:31  
Yeah, so I'll say first off, you are similar to many decades of psychologists who are looking at this study. (Ted laughs) So this illusory truth effect was first demonstrated way back in 1977. There have been dozens of replications since, and there was just this assumption in the field that the effect only happened when people didn't have anything else to go on. So it happened for statements that were kind of plausible, but I don't really know if they're true or false. So a common statement people would use was something like, French horn players get a bonus to stay in the U.S. Army. Turns out that's true. But you probably didn't have any direct knowledge of its truth before hearing it. 

Ted Fox  5:15  
Right. 

Lisa Fazio  5:16  
What my co-authors and I have shown is that that's not the case. In fact, repetition increases truth even when it directly contradicts what you already know. So among people who can tell you that the skirt that Scottish men wear is called a kilt, when they hear the skirt that Scottish men wear is called a sari twice, they think it's more true than if they've only heard it once.

Ted Fox  5:43  
I mean, has there been--is there a lot in the literature about assumed credibility of the source? Has that been studied where it makes a big difference of--I mean, again, if I'm putting my very amateur psychologist hat on, I would think like, Oh okay, there's something to do with, I assume this source is credible. But does this effect happen just by sheer fact of the accumulation of the repetitions?

Lisa Fazio  6:15  
Unfortunately, people don't seem to pay as much attention to source credibility as you might hope. So it seems that just kind of raw repetitions, regardless of source credibility, are what matters. So some people have shown this effect using fake news headlines--so political fake news, things like Clinton was crying in her basement after the election or other kind of outlandish, fake headlines that have been put out. And what you find is that repetition increases belief even for those kind of implausible statements. And it doesn't matter that they're coming from these kind of disreputable sources.

Ted Fox  6:57  
And you touched on this a minute ago, but it sounds like even if it's an area where I presumably am somewhat conversant, even then, like, even if it's contradicting--and one that I thought was, if I'm remembering this right, an example in the paper, and I guess this goes back more towards the preschool knowledge piece--but it was this idea if you were saying to people, it was something to do with wasps, and I think it was either they produce honey or something like that. And (Ted laughs) it was just kind of incredible to me. And I was trying to put myself in that position of, Well, yeah, I guess if someone, if I heard that a few times in a row, I might [think], Well, did I remember that wrong? So I mean, it really seems like our own existing knowledge, it's difficult at times for it to be a bulwark against just the accumulated pressure of the repetitions.

Lisa Fazio  7:53  
Yeah, so we use this term "knowledge neglect," which is the idea that we have a lot of knowledge in our head, but we don't often, we don't always use it in a given situation. So you know that bees are the insect that makes honey. But when you've heard wasps are the insect that make honey twice, kind of in short succession, you're more likely to believe it than if you've only heard it once.

Ted Fox  8:17  
Right. So the main point of the Psychological Science paper that we're talking about, it wasn't to show that the illusory truth effect exists. You mentioned it here, you pointed out in the paper it's been shown in over 100 studies in the last 40 years, including your own work, and you talked about it there where it's something that contradicts people's existing knowledge. But in this study, you wanted to investigate how and why we start associating repetition with truth. And we'll link to the full article in the episode notes for people who really want to dig into that. And there are some things that will be over your head if you're like me, but it actually--it's a very readable paper, even for a layperson kind of audience. But in general, how do you set out to study something like this?

Lisa Fazio  9:06  
Our question here was really kind of, we know that people use repetition as a cue for truth; why do they start doing it? Is it because they kind of reflect on the feelings they have when they read these statements and kind of consciously realize--or at least kind of implicitly realize--that the statements that are more easily processed that they've heard multiple times are more likely to be true? Or is it instead this really automatic thing, where just your brain's calculating statistics of how many times you've heard things and then how likely it is to be true? So does this truth effect require metacognition or reflecting on your own thoughts, reflecting on how you feel when you're experiencing something? Or is it something that just happens on its own? And so one of the ways to test that is to look at development, and to look at how kids learn or start to use this effect. And if it requires this kind of complicated metacognition and thinking about your own thinking, then young kids--like five-year-olds--shouldn't show the effect. It should only emerge later in childhood or in adulthood. And so that's what we aim to test in the paper.

Ted Fox  10:24  
So when you had folks come into the lab to test this, what are you asking them to do in order to kind of tease this out of them, so to speak?

Lisa Fazio  10:33  
Yeah, so we had five-year-olds, 10-year-olds, and adults that came into the study. The first phase of the experiment, we call the exposure phase, so everyone listened to the computer tell them some nature facts, some true and some false. And the nature facts also varied on kind of whether or not you have any prior knowledge about them. So there are some facts that even the five-year-olds should know if they're true or false. There were some in the middle that kind of the 10-year-olds should know. And then there were some that even the adults likely don't know if they're true or false. And so we introduced this little character that we called Rosie the robot, and she was going to tell them these nature facts. And for each fact, we just asked them to rate if they thought it was interesting or not so interesting.

Ted Fox  11:23  
Okay.

Lisa Fazio  11:24  
So here, we're just exposing them to some of the statements. And then after kind of a short delay where they did some mazes as a little filler task, participants entered the truth phase. And this time, they again heard statements from Rosie the robot, and we again warned them that Rosie knows a lot about some things but not so much about other things. So some of what she says won't be correct, will be false. And this time, what they judged was whether what Rosie said was true or not true. And then if they were really sure, or not so sure, of that judgment. And so this time, some of what they heard was repeated from the exposure phase, and other statements were new. And so we can look at what proportion of statements people say are true when they're either new or repeated.

Ted Fox  12:15  
And what did you find doing this? Did you find that, Oh, this is something that we develop as we come to think more complexly about ourselves and our place in it [the world]? Or is it maybe something a little more innate?

Lisa Fazio  12:31  
Yeah, so our results suggest that this is something that kind of happens pretty automatically and implicitly, so even the five-year-olds showed an effect of repetition; they were more likely to believe the repeated statements than the new statements. And that effect was really similar across the five-year-olds, the 10-year-olds, and the adults. So by the age of five, kids are already using repetition as a cue for truth.

Ted Fox  12:57  
You made mention of this earlier, but unfortunately, researchers such as yourself aren't the only ones who are aware of how this illusory truth effect works, even if the term is not common in most people's vocabulary. But when we think about misinformation and propaganda and false headlines--all of which are easier to spread now than they've ever been because of the internet, and they do so much damage because people believe them--how do we guard against what can easily become errors in our own and others' knowledge? Because that, I mean, that was one thing that was really clear to me in reading the paper is that ... we're susceptible to this as human beings. It's not that, Oh, this kind of person [is susceptible]; it's, No, this is evolutionarily or whatever else, this is how we come to associate things with the truth. So are there things that we can do to, I guess, not be impervious to it but better guard ourselves against it?

Lisa Fazio  13:59  
Yeah. First, I want to echo what you said that these are examples of just, this is part of what makes us human. These are just human biases. It's not that it only happens to other people. Each of us can get fooled, each of us can end up believing some misinformation. In terms of guarding yourself against it, some things that have been shown to be effective are really thinking about accuracy when you're viewing the statements. Rather than just judging the truth at the repetition, if people judge truth both at exposure and repetition, then you get a smaller increase, a smaller boost. And there's also some evidence that getting people to explain how they know that a statement is true or false can also reduce the effects of repetition. So one way I think about it is that there are two routes to determine what's true. You can rely on kind of this gut feeling, these feelings of familiarity and fluency, and that's what we do most of the time. Or you can take the slower, more contemplative route of actually thinking through your prior knowledge, thinking about who the source is, what you already know, and trying to evaluate it. And trying to go through that second path is going to be more effective.

Ted Fox  15:13  
Right. The flip side of that, the case where someone has already come to believe the false statements, I think we all know--you, me, anyone listening to this--convincing someone that they've gotten something wrong, it's usually a lot harder than saying, Hey, here's the correct fact! And, Hey, I'm going to show it to you a second time, and there's this repetition there. Do we know anything about the persistence of information, true or false, and how it relates to the illusory truth effect? Or, you know, put another way, do we start running up [against] information having its limits to kind of elbow its way into our brains even if it's something that's being repeated to us?

Lisa Fazio  15:57  
Yeah, so there's a few things there. So one thing is that there's some things that you kind of believe so strongly that no matter how many times I repeat it, you're not going to believe the opposite. So my favorite one is like, Your name is Ted; no matter how many times I tell you that your name is Russell, you're not going to ...

Ted Fox  16:14  
(laughing) I'm going to push back.

Lisa Fazio  16:15  
... believe that, right. And so when you're trying to correct errors in other people's knowledge, there's a few things we've found that are more helpful, that kind of psychologists know are helpful. One is something that we call a truth sandwich. So one way of doing this debunking is to start with the true fact, then mention kind of the false information along with why it's wrong--you can mention the motivation of kind of why people are spreading the false information--and then end by repeating the true stuff again. And what you really want to make sure during this whole thing is that you're making the truth more memorable than the misinformation. And that can be hard to do because true information is complicated, there are nuances to it. Whereas if you're just making stuff up, you can make it as simple as you'd like. And so it really is a challenge. But there are ways to make it easier.

Ted Fox  17:13  
And I mean, I don't know if your work specifically has touched on this, but I imagine that's part of what makes something like propaganda or misinformation so sticky, because it can be really simplified. And it can just be kind of this bold statement. And if you don't bother yourself with getting into the weeds of the nuance and the detail, then that simpler thing is a lot easier for someone to remember, true or false. It's just an easier thing to remember; it's like, Oh, here's this really strong statement, and I don't really need to pick at it, it's just this thing, and okay, I can internalize that and walk away with it.

Lisa Fazio  17:52  
Exactly. We saw that a lot in the early days of the pandemic when scientists were just figuring out how this virus works and how it was transmitted. And there was this big void of information. And so if you're listening to the scientists, they had kind of partial explanations, and there were changes over time. And it's all things that you would expect for something new that scientists are still investigating. But you can contrast that to the people who are portraying false narratives, who can just have a very simple statement like, This is a hoax, or, It's no more dangerous than the flu. And so if you're not careful at contradicting those, it can be a real uphill battle.

Ted Fox  18:36  
I mean, in that example in particular, too, then it even becomes--like you said, that's the natural scientific process with a novel virus--but then it becomes, Well, six months ago, you didn't say that, and now you're saying this. And it's, Well, no, that's how science works, we know more about the virus than we did before. But it becomes, again, that's that level of granularity that, like you said, it seems like it's a lot harder for us to hold onto.

Lisa Fazio  18:59  
Yup.

Ted Fox  19:00  
So we've been spending--you know, we've talked about how this effect works, and how it kind of develops in us at a very young age. And, you know, obviously elections and everything are on everyone's mind right now, so we think about information and misinformation. But a lot of your work is in the area of education. And I mean, I can definitely see some connections there. But I'd like to hear from you how the illusory truth effect and this idea of repetition and things like that, how does that come to play--or relatedly come to play, I guess--in the area of education and how students learn and how we can correct problems in how they're learning and things like that.

Lisa Fazio  19:43  
Yeah, so my research really focuses on how people learn true and false information from the world around them. And so one way that people learn information is just through this simple repetition and the types of things we see with the illusory truth effect. Another thing I've done a lot of research on is this knowledge neglect. So, how do people fail to notice errors in what they read? So we've done a series of studies where we have college students or even young children read fictional stories. And the stories contain true and false facts about the world. So you might read a story about a summer job at a planetarium. And the main character had to dress up as an astronaut and says either, Maybe I was supposed to be Neil Armstrong, the first man on the moon, or, Maybe I was supposed to be John Glenn, the first man on the moon. And then after they've read the stories and some time has gone by, we give them just a trivia quiz, and some of the questions pertain to the information in the stories. So we'll ask them, Who was the first man on the moon? And what you find is that people often rely on the information in the stories, even if they've been warned that they were fictional, even if we have them kind of going through specifically trying to detect errors in there. It's just really hard for our brains to notice these contradictions between what we're reading and what we already know. And so that has obvious implications for educational settings, where teachers like to use fictional materials, either books or movies, because they're more engaging to students. And so what research has suggested is that these materials can be a great tool in classrooms to encourage engagement and kind of dive into the material, but you need to be aware that students are also picking up on this misinformation from the stories and actively do something to kind of warn them about it and correct it at the end.

Ted Fox  21:44  
That example--I mean, our oldest child is in first grade, and I'm just kind of putting myself in those shoes of helping him. I mean, he's doing school at home right now, so helping him with his schoolwork. And I can envision myself sitting there reading something like that, and then reading the question and me going, Well, I think the story's wrong; like, it's Neil Armstrong, but I don't know, do they want you to say John Glenn because that was in there? And I wonder, does that point to--and again, I'm not, feel free to tell me, Okay, you're going way beyond what I've looked at--but I almost wonder if, you know, part of it with repetition is that there is some sort of kind of assumed social contract that if someone is relaying this information to me, and especially if I'm hearing it more than once, then okay, well, they're not naturally trying to mislead me, so I must be wrong, and they must be right. I don't know how much of that might be at play here. But I can definitely, with that elementary school knowledge, see how my own sense of doubt would start to creep in.

Lisa Fazio  22:47  
Yeah, so one thing researchers have emphasized is that in order to kind of exist in a society and talk to people and understand what they're saying, we just have this assumption that other people are telling us the truth. And in general, this is really useful. So people often misspeak, people often repeat themselves. Like, if you actually do a transcript of how people talk in a conversation, there's all sorts of different errors happening there. And most of the time, we just skip over them and assume that the person said what we expected them to say. And so in kind of our daily life, in general, this assumption that other people are going to be truthful works really well. But where it goes awry is in situations where people aren't truthful. So you can see that in different online communities, or when you're dealing with propaganda, or people who are intentionally trying to mislead. And that's when our natural inclinations kind of hit this stopgap.

Ted Fox  23:47  
It's funny you say that about transcripts because I do transcripts of these episodes. And when I go back--I mean, it's one thing, it's hard enough to go back and listen to yourself as a recording. And then when you go back and actually read a transcript of all your words printed out, and it's exactly what you're saying of, here is--here is this false start of this sentence. And you understood what I was going to say and why I changed tack mid-course, and people listening get that. But when you read it written out, it looks like, Wow, this guy doesn't know what he's talking about. (laughing) He's all over the place. That's, that's really interesting. You mentioned, you know, talking about students, saying to teachers, Be aware, when you use these things, these fictionalized situations, of how students learn this information, and that they could take that in as, you know, this is the true-to-life fact. Has your work pointed to anything in terms of--like, I'm thinking of something like, if a student has learned how to do addition in a certain way, and they're doing it in a way that maybe either isn't the ideal way because they were taught by someone that didn't do a good job teaching them, or they just figured it out on their own. Has your work pointed to anything in terms of interventions in terms of how students can unlearn bad learning habits that they've had?

Lisa Fazio  25:10  
Yeah, so there's a few things there. One thing we find is you can take evidence from science education, where they often have to do kind of big revisions to the way students think about the world. So normally, young kids, like if you're just walking around the world, you have an assumption that the Earth is flat because you never see any evidence of it being curved. And so you have to kind of change your mental model to round Earth. Similarly, for the way that the solar system is set up, there's also a bunch of physics that kind of goes against our natural intuitions that students have to unlearn. And what researchers have found is that a lot of the techniques that work well for debunking misinformation are also helpful here. So these refutation texts is what they call them, where they talk about how kind of the solar system works. But then they also talk about the common misconception students have, and explicitly talk about why that's wrong before going on to explain in more depth how the system actually works.

Ted Fox  26:20  
It's that truth sandwich.

Lisa Fazio  26:21  
Yeah.

Ted Fox  26:22  
(Ted laughs) So do you--I mean, I'm certainly not asking you to give political advice or anything like that--but just in general, as we, you know, this episode is coming out on Thursday, October 22nd. So we're going into, you know, right before the election. Just kind of in general terms, when people are--especially online, it's not just online, but especially online--we're bombarded with so much information that comes so quickly. Do you have any general advice or any kind of general parting thoughts you would give to people when you're kind of, you know, you're either looking at that tweet, or looking at that Facebook post, and thinking, Ah, yeah, I knew it, this is exactly right! What [should] our better angels be telling us to do, anyway?

Lisa Fazio  27:08  
Slow down.

Ted Fox  27:10  
(laughing) Slow down.

Lisa Fazio  27:11  
That's my big advice. So social media is designed to get you to scroll, and to keep scrolling, and to scroll fast. It is not designed to encourage kind of deep contemplation of what you're reading. But if you want to notice falsehoods and notice misinformation, that's what you've got to do. So especially before you share something with others, you should pause, think about: How do I know that this is true? Am I sharing with my friends and family accurate information? None of us wants to be the one who is conveying false information to everyone we know. So take that pause, check it out before you hit share.

Ted Fox  27:50  
And I've talked about this with several different guests over the years that it's just this, the promise and the peril of the internet. Of on the one hand, it's this great democratization of information, but the gatekeepers of information--for as problematic as that can be because it can keep information away from people--but now it's, we're all like our own mini-gatekeepers and our audience is--I mean, it's not even limited to the size of our following if it's something that, you know, gets spread around a lot. And it just seems like our mentality hasn't caught up to that yet of like, Oh, no, it's like it's my own little news--if this were 1985, it's like my own little newspaper sending this information out. Like you said, you don't want to be the person who is putting that bad information out there for other people--and frankly, other people you care about--to consume.

Lisa Fazio  28:40  
Yeah, a colleague of mine uses the term that you're the editor of other people's news feeds. So your interactions with posts, whether you're liking them or sharing them, because of the algorithms, affect what other people see. And so you want to make sure that your actions are changing the news feed for good and promoting accurate information, not promoting misinformation.

Ted Fox  29:01  
Lisa Fazio, thanks so much for taking some time to talk to me today. I really enjoyed it.

Lisa Fazio  29:06  
Definitely. Happy to be here.

Ted Fox  29:08  
(voiceover) With a Side of Knowledge is a production of the Office of the Provost at the University of Notre Dame. Our website is withasideofpod.nd.edu.