Why do people believe false stories on social media? Who is most susceptible to misinformation or to the more malicious version, disinformation?
In this episode, we speak with Nikita Salovich, a Ph.D. candidate in cognitive psychology at Northwestern University. Her research examines how “fake news” spreads and how we can slow or stop it.
Spark Science encourages our listeners to get vaccinated and to use the information in this episode to have conversations with hesitant family and friends. Another great resource is the Debunking Handbook.
[♪♪♪]
Host: Welcome to Spark Science. I’m your host, Regina Barber DeGraff. I’m an astrophysicist and educator at Western Washington University. In this episode, we get to talk about something we’ve all become more familiar with throughout the past year: misinformation. Our guest today is a cognitive scientist named Nikita Salovich. And she’s here to help us understand how to keep our friends and ourselves from falling for lies that sound like truths.
Last season, we had an episode talking about: how do you convince your friend that maybe they’re being told things that aren’t quite full of facts? And we had Dr. [?]. And today we have one of Dr. [?]’s friends, a PhD candidate in cognitive psychology at Northwestern, Nikita Salovich. Welcome to the show.
Nikita: Thank you so much for having me.
Host: I just want to really thank you for talking to us because I know that there’s a lot of stuff in the media about people talking about the vaccines, things that are true, things that are not true, things that are put out in the media to scare people, to give them hope. So, what exactly is your research?
Nikita: Yeah, so generally I’m trying to better understand and support the way that people learn and interact with information. I specifically study the consequences of being exposed to false or misleading information, including why it influences people’s judgments and decisions even in cases when people should know better, like where they have the prior knowledge or an outside resource available to determine that a given thing is false.
I find that people often fail to think critically about the accuracy of information, even though they are both capable of doing so and benefit from it. I use a combination of lab experiments, surveys, and social media data in my research.
Host: Tell our listeners a little more about what you mean about lab experiments. Because when we’re talking about psychology, I think when people think “lab,” people think beakers, telescopes, and microscopes, but that’s not exactly what you’re talking about, I’m guessing.
Nikita: No. Not exactly. We have a pretty cool eye tracker in our lab that would probably give some telescopes a run for their money. But the type of lab experiments I do– instead of it being with molecules or with particular substances that you may see in some biological or other physical sciences, we study people. So I have participants come into the lab. And we do randomized trials of showing people particular kinds of information, or testing different interventions to get people to think critically or in a different way. And compare those interventions or those groups of people with control groups, just like you would in traditional lab experiments that you may be thinking of.
I mostly use these lab experiments- so these are materials that I create, or my advisor creates, or other graduates–
Host: Oh; and you see how far they go. I see!
Nikita: Exactly. So we create these statements, so let’s say a list of 80 true and false statements about the world. And we randomly present participants with true and false statements, and ask them to rate how interesting these statements are, one at a time.
And then we give them open-answer, general-knowledge questions. Some of the questions ask about the statements that they saw previously. And then we look at the extent to which people reproduce the false information that they had seen, especially in cases where we had already tested them and confirmed they had the correct prior knowledge, but they regurgitate this false information anyway.
So, I guess, in this sense, we use these kinds of canned true-false general-knowledge statements because it lets us exercise a degree of control as cognitive scientists, and have the sort of lab experimental design that studying misinformation in the real world often does not allow.
And so we’re able to craft these statements in a particular way, and manipulate what participants see, how much justification we give them, and so on, so we can draw causal conclusions about how what people see affects what they think, or at least what they report to be true later on.
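For listeners who like to see a design in code, here is a minimal sketch of the kind of experiment flow Nikita describes: participants are randomly shown true or false versions of statements, and we measure how often the false version is later reproduced. The statements, error rates, and exposure effect below are all illustrative assumptions, not values from her actual studies.

```python
import random

# Toy model of the lab design described above: each participant is randomly
# assigned true or false versions of statements, then we count how often the
# false fact would be reproduced on a later general-knowledge test.
STATEMENTS = [
    # (topic, true version, false version) -- illustrative examples only
    ("largest ocean", "The Pacific is the largest ocean.",
     "The Atlantic is the largest ocean."),
    ("red planet", "Mars is called the red planet.",
     "Venus is called the red planet."),
]

def run_participant(rng, exposure_effect=0.3):
    """Simulate one participant. `exposure_effect` is an assumed bump in the
    chance of reproducing a false fact after having seen it (made up here)."""
    reproduced_false = 0
    for topic, true_version, false_version in STATEMENTS:
        saw_false = rng.random() < 0.5  # random assignment per statement
        shown = false_version if saw_false else true_version
        base_error = 0.1                # assumed baseline error rate
        p_error = base_error + (exposure_effect if saw_false else 0.0)
        if rng.random() < p_error:
            reproduced_false += 1
    return reproduced_false

rng = random.Random(42)
scores = [run_participant(rng) for _ in range(1000)]
print(sum(scores) / len(scores))  # mean false reproductions per participant
```

The point of the control-group logic is visible in `p_error`: because exposure to the false version is randomly assigned, any difference in reproduction rates can be attributed causally to what participants saw.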
Host: I want to talk about you, Nikita. How did you get into your field?
Nikita: Yeah. The short answer is that I kind of accidentally became a cognitive scientist. Don’t get me wrong; I very intently and wholeheartedly prepared my PhD application. But it was an opportunity that just fell into my lap. Pursuing a PhD in cognitive psychology, or studying cognitive science, was definitely unexpected, but a good unexpected turn of events.
My major in undergrad was biology. It was my favorite subject growing up, and I liked the idea of going to medical school, at least until I took organic chemistry. But I realized after a couple of years on this path that I was more interested in the people in my class than I was in the molecules, how people learned vs. what we were learning.
So I went to the University of Minnesota, which, for those of you that don’t know, is a really huge school. The size of the campus and the student body is great if you have, like, an awkward date in the dining hall and never want to see that person again. But it was also hard to make connections in classes with a few hundred students in the lecture hall. And I learned that pretty quickly. So I used to challenge myself to make at least one friend or buddy in all of my classes. And I would sit by them during lecture. I was always nervous about getting the cold shoulder, but people were generally pretty friendly, and plus, I always shared my notes.
My first day of Earth science class, I set out on my mission to find my buddy. I saw someone sitting by themselves near the back of the auditorium. And they seemed friendly enough. We can call him Sam.
[Laughing.]
He turned out to be wicked at biology– “they” was gender-neutral, but Sam was a man– I think that he could have used my flash cards as coffee coasters and still aced every exam.
But what I remember most vividly about that class is how, mid-semester, one of our lectures was interrupted by a protest of flat-Earthers, who stormed the stage and refused to let my professor continue teaching. Just to make sure we’re on the same page, what I mean by flat-Earthers is people who genuinely believe that the Earth is flat, like it has an edge and you can fall off of it, and not a sphere.
Well, guess who was on the stage, with a picket sign and all? My buddy Sam. The Sam who was acing Earth science. So, as you can imagine, I felt a lot of emotions during this experience. But most of all, I was really intrigued. How could this budding biologist, my dear friend, have this incorrect belief about the state of the natural world? How did he form this conception? And would I ever be able to get him to change his mind?
Honestly, I don’t know if Sam ever changed his mind.
Host: You like, just cut ties?
Nikita: Yeah. He fell off the face of the Earth, pun intended, after that class. But it did open up a can of worms of questions about how people misunderstand the world, and it ultimately led me to study what I’m studying now: the science of how people are influenced by misinformation.
Host: Were there any other instances, as you were going through undergrad, or now in grad school, that you did follow all the way through? Where you had an interaction and saw some amount of change, even if it was a little, using what you’ve learned?
Nikita: Yeah. Absolutely. I think that’s a great question. I think as most can relate, I have family members that have different beliefs about the world, and I think as I have become more and more confident about the best ways to talk to these individuals that hold beliefs that are different from mine– I also have made more traction, gained more ground in terms of being able to influence their thoughts and behaviors, and do so strategically vs. emotionally, which is definitely something I have struggled with, having my blood boil a bit when I disagreed with someone from a scientific perspective. But I’m able to cool my jets now and make more of an impact in that way. But it takes a while.
Host: To cool your jets!
Nikita: It does; it takes patience.
Host: Well, I mean, speaking of, I think, when we all see misinformation in front of us, we get really, you know, emotional. But I also get emotional when I see things that happened in the past, where I’m like, oh my gosh, I can’t believe this happened. So, in your studies or in your research, can you give us a story of something that happened in the past, before the pandemic, of like, an instance of misinformation that really spread wildly in society.
Nikita: Yeah. That’s a great question. Misinformation existed far before the pandemic, unfortunately, and even before the time I started graduate school, and even before either of us was born.
Host: Right. I mean there’s propaganda, right?
Nikita: Absolutely. We use this term “fake news,” and it gained a lot of traction, along with a surge of interest in misinformation, around the time of the 2016 presidential election. But before then, for example, Nazi propaganda was used during World War II to spread anti-Semitic hate. And the tabloid tradition of publishing rumors as facts has been a popular attention-grabbing tactic for years now.
And I think it’s important, with that question, to recognize that we don’t get our information just from sources that were intended to inform us, like news articles. We also pick it up from sources whose main goal is to entertain us, like fictional books and movies, and from sources that are kind of in between entertainment and information, like podcasts or social media. And research from my lab finds that, fiction or not, exposure to false information influences what we believe to be true about the real world. It’s not just fake news and disinformation that we need to worry about.
That said, misinformation has been shown to affect the way that we think and navigate the world. Very much so. Some researchers have even attributed major world events to the circulation of false information, particularly online, including that 2016 election, the 2019 measles outbreak in the US, and of course, the recent surge in COVID-19 cases and hesitancy around the vaccine.
Host: You’re listening to Spark Science, and we’re discussing the foundations of misinformation.
Nikita: Research has shown that misinformation travels faster and wider than true news, so that means that it reaches more people, and it reaches more people quicker. And this is because a lot of times, misinformation is surprising. It’s something that we wouldn’t expect, or that’s really enticing to believe. And because of that, it plays into these biases of what we want to be true.
I had a college roommate who one time told me excitedly that she had read somewhere that redheads need more sleep. And so therefore I could not play music in the morning while I took a shower. And guess what color hair she had.
Host: Red!
Nikita: Red? Yeah. And so wherever she heard that information– I think it was from one of the People magazines that we had lying on our living room table, to be completely honest– she was more likely to accept that information as true, or conversely, less likely to scrutinize it, right. So we have this bias toward accepting things that we already agree with. And we have this bias toward not rejecting, or not thinking critically about, things that we want to be true, because they sound really great, and it would be great to live in a universe where redheads did need more sleep. But to our knowledge, that is not true.
Host: It just reminds me of all of those news articles that are like, “oh, coffee is good for you. Coffee is bad for you. Red wine is good for you. Red wine is bad for you.” You know, you have those people who are like, I drink a lot of red wine, or, I’m a coffee drinker, so thank God this is actually going to help me ward off dementia, or something like that.
Nikita: Yeah. Absolutely. And because we live in an environment where we are faced with so much information, where a single Google search can come back with hundreds of thousands of hits, the way that we even search for information affects what we view. If you’re a chocolate lover, you may be more inclined to google the benefits of chocolate vs. the cons of eating chocolate, right? And so, in doing so, you are going to be faced with information that’s confirmatory vs. disconfirmatory. Also, a little bit of a scary thing, but these search engines remember our searches.
Host: Right. It’s going to give you more of these positive about chocolate stories.
Nikita: Exactly. It’s a feedback loop. That would be a combination of confirmation bias, and motivated reasoning.
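That feedback loop can be made concrete with a toy simulation: a user clicks confirmatory results more often, the engine reinforces whatever gets clicked, and confirmatory content comes to dominate the ranking. The click probabilities and update rule below are illustrative assumptions, not how any real search engine works.

```python
import random

# Toy sketch of the confirmation-bias feedback loop described above:
# the engine shows "pro-chocolate" or "con-chocolate" results in proportion
# to learned weights, and clicks (biased by motivated reasoning) feed back
# into those weights. All numbers here are made-up assumptions.
def simulate_feedback(rounds=50, seed=0):
    rng = random.Random(seed)
    weight_pro, weight_con = 1.0, 1.0   # engine's learned weights
    history = []
    for _ in range(rounds):
        # Engine shows the result type with the higher weight more often.
        p_show_pro = weight_pro / (weight_pro + weight_con)
        shown_pro = rng.random() < p_show_pro
        # Motivated reasoning: the user clicks confirmatory results more.
        p_click = 0.8 if shown_pro else 0.2
        if rng.random() < p_click:       # engine reinforces clicked content
            if shown_pro:
                weight_pro += 1.0
            else:
                weight_con += 1.0
        history.append(p_show_pro)
    return history

history = simulate_feedback()
# Share of confirmatory results shown, at the start vs. the end of the loop.
print(round(history[0], 2), round(history[-1], 2))
```

Even though the engine starts out neutral (a 50/50 split), the asymmetric click rates are enough to tilt what it shows over time, which is the feedback loop combining confirmation bias with motivated reasoning.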
Host: Right.
Nikita: People also generally like things that are familiar to them. They tend to find solace in people that look like them, talk like them, have similar interests. And that can be a bad thing when we use that as a cue for whether a source is reliable or accurate.
Host: Mhmm.
Nikita: And so for example, if someone learned that so-and-so has the same birthday as them, or is attractive, or went to the same university for undergrad, or has a cat with the same name as yours, whatever– you may be more likely to trust them, even if they’re not a reliable source. Like, perhaps going to the University of Minnesota makes you a reliable source, because you have a college education, but having a cat named Lentil probably doesn’t, even though it makes me more likely to think positively of you.
Host: This is Spark Science and we’re talking with cognitive scientist Nikita Salovich about misinformation and the context of the COVID-19 pandemic. We apologize in advance for the internet connection issues mid-interview.
[♪♪♪]
Let’s get to, like, the pandemic itself. What have you seen in your field that’s been very concerning during this pandemic?
Nikita: I think that, like with many world events, there’s a surge of information circulating. But also, unique to situations such as the pandemic, people are experiencing a heightened level of fear, or very strong emotional responses, which can lead to less scrutiny of information and an abandonment of logic. And as humans, we’re already not very logical. But when we are experiencing intense emotions, our reliance on these biases, heuristics, and quick rules of thumb for how we process information and jump to conclusions becomes even more extreme.
And so I think that that is a large concern right now, and has been for scientists: people are being exposed to false information amid extreme fear, and amid the current divides and polarization in the political sphere, across the world but particularly in the US right now. We’re experiencing one of the greatest divides in political ideology ever. And right now, with the pandemic, health is becoming a partisan issue. Even though that has been seen in the past, it can be extremely dangerous. If you view taking an action that is objectively and scientifically shown to be the healthier decision, and the better decision for the population as a whole, as a political decision, or a decision where you’re abandoning the values of one party or another, then that becomes extremely dangerous, because it’s dissuading people from making decisions that could very much save their lives.
Host: That’s a really good point. And as someone who teaches a science communication class– and I’m sure you, Nikita, you’ve seen this in your labs– the solution for that, though, is not to just give people facts, right? Like, that’s not going to work either. It seems to be the gut reaction a lot of scientists want to have, or are taught.
But what does your research show? What can we do to counteract that misinformation? Because I get a lot of push-back from scientists that are like: well we can’t appeal to people’s feelings, because that’s cheating. And I’m like: mmm, people have feelings, so . . .
Nikita: I think that that’s a really great question. There is definitely an art to helping people realize that what they may be seeing, believing, or sharing is incorrect. And even though it’s not something that we deliberately think about, there is an important difference between telling someone that they’re wrong and helping them learn the correct information. We should be putting more emphasis on building people’s knowledge, vs. telling people that they’re wrong, in our conversations.
And I think the main difference between those two things is that the former, telling someone that they’re wrong, can be accusatory, and can even cause reactance. Being wrong is something that none of us wants to hear, whether it’s getting a question wrong on an exam or anything else. It is particularly important to acknowledge this when attempting to correct misinformation that is politically or ideologically loaded. That information may be important in maintaining the coherence of other beliefs they hold, or even central to their worldview. Approaching the situation as “you’re wrong and I’m right” can scare people away, and make them tune you out.
In some extreme cases, it can even make them dig their heels in deeper, insisting further on the misinformed belief in order to defend themselves and prevent their worldview from crumbling, or to protect themselves from feeling silly or incompetent. So if someone starts tuning you out because the emphasis of how you approach the conversation is on the ways they’re wrong and incompetent, it doesn’t allow you the opportunity to do what you likely care about most, which is telling them what is right. The flip side to that potentially accusatory approach is that you can instead approach a conversation with a misinformed individual with the primary objective of correcting their misinformed idea or belief, vs. compiling a laundry list of all the ways that they are ignorant.
So this completely changes the tone of the conversation. You no longer want to alienate this individual, nor can you afford to. You have to win them over with patience and enthusiasm for the truth so that they listen to what you have to say. So if someone shares with you that they are hesitant about getting the COVID-19 vaccine, instead of jumping in immediately with how ludicrous their idea is, take a deep breath, let your blood come down from a boil to a simmer, and try to open the door for conversation rather than slamming it in their face.
One way is to let them know that you’re listening, and that, even though their belief is wrong, they’re not alone in holding it. So, for example, you could start out by telling them, “I’ve heard other people have that concern.” Or, “I actually had a similar conversation with my cousin the other day.” Or, “I actually used to think that for a while, too.” You may be thinking, isn’t it bad to reaffirm that others hold their beliefs? Well, it’s important not to let the conversation end at the affirmation that they’re not alone in their hesitancy. Establishing that your following response is not aimed at belittling them, but instead at offering new, more reliable information, is very important. If your cousin could change their mind about the COVID vaccine, then maybe your childhood friend could, too.
Host: But there’s another side, too, right? Like, yes, we as scientists, or science-minded people who are for the vaccine, have this conversation, and we have to take a deep breath and really listen. But I think there’s a lot that goes into all of these conversations. For instance, about anyone who doesn’t want to take the vaccine, I hear my colleagues say things like: that’s just an awful person; they’re a monster; they’re dumb.
And especially, I have a lot of white colleagues who don’t understand why communities of color are suspicious of the vaccine, and suspicious of medical people. And the reason is history, right? Communities of color, especially Black Americans, have been used as guinea pigs in the past. And so, not trusting medicine has some evidence behind it in our society, in our country. And a lot of my colleagues don’t know that. So really not knowing your audience is, I think, the problem with a lot of these conversations.
Nikita: Absolutely. I couldn’t agree more. And I think that’s why making sure it’s a conversation and not a monologue is so important. That’s why you would be able to– you know– maybe even ask questions about why someone feels hesitant, vs. assuming that it’s because they don’t have the correct information.
I think that we have to be empathetic in these situations, even if we may want to jump to these accusatory conclusions, or accusatory monologues vs. dialogues.
Host: I think it’s really also important for our listeners, because there are people that you will not turn. You just won’t. Or you don’t even know, right? Like, you don’t have a connection to– but like, we’re talking about people who are part of your life, that you are willing or interested in changing. That’s a different story. There are some conversations you will not win.
Nikita: Exactly. And I think that sometimes we find ourselves in these conversations even if we don’t want to. And if you are in the middle, and you’re not exactly in a comfortable position to express your beliefs or new information to someone, another go-to for fighting misinformation to add to your toolkit is starting with a question rather than an assertion. It may help the person realize that their argument or idea doesn’t make as much sense as they thought. So in a situation, for example, where you may be eating with– I don’t know– a great aunt of sorts who has a certain conspiracy theory about how the world works.
You can ask her, like: where did you get that information? Are you sure that is a reliable source of information? Have you seen the latest information released by so-and-so source? In this way, asking people to question their own beliefs does a lot of the hard work for you, with a lower risk of alienating the person you’re working hard to have an open conversation with.
[♪♪♪]
Host: You’re listening to Spark Science. It’s now one of our favorite portions of the show, where we discuss how the field of cognitive science is portrayed in pop culture.
[♪♪♪]
Can you think of a time where there is a cognitive scientist in a movie, or a comic book, or something like that? How is your field portrayed?
Nikita: I would say that there’s a general perception of misinformation researchers, or people who study misinformation, as being political scientists. And while it’s no secret that lots of the mis- and disinformation we hear about is related to political topics, there is just so much more, and so much research on the detrimental effects of non-partisan inaccurate and misleading information.
And so, in my opinion, there’s an over-emphasis on pitting one political party vs. another, and “who is more misinformed?” Or “have you studied republicans vs. democrats?” And it’s a little-known fact that, over and above people’s political ideologies, an even better predictor of falling for false information, and sharing fake news, is how cognitively lazy people are.
Host: [Laughing.] Tell us more about that.
Nikita: Yeah. So people are generally cognitive misers. We don’t like giving much thought to anything. So the stuff that we see, for example, online– a lot of people don’t think critically about it. It doesn’t matter your particular demographics. In a way, people just don’t want to evaluate it.
And so I think that we need to start getting into the mindset that everyone, even ourselves, even me as a misinformation researcher, is influenced by inaccurate or misleading claims, because of the cognitive science behind it. And once we accept that, I think that we can become more humble in how we approach our information sphere, but also in how we react to others who may be misinformed.
Host: Is there anything I haven’t asked you that you would like to share with our listeners? Like you were saying, the misconceptions people have about cognitive scientists, or anything else?
Nikita: Yeah; I’m not sure. I guess I would say– I would just want to emphasize that we have to take it on ourselves to be more deliberate consumers of information.
I think that a lot of people put the emphasis on, let’s say, interventions put in place by social media companies, news outlets, and journalists. I think that people should take matters into their own hands, as the information consumers. Because, let’s be honest, there’s no way that we can get rid of every single inaccurate claim, especially with the internet. But what we can do is help people learn and recognize when and how to evaluate the information that they see and hear day to day. Also, I would want to emphasize that cognitive science and psychology are definitely sciences.
I think that the variables that we study are a lot different, and often, the variables that I study are invisible, like people’s preferences or intelligence.
Host: Right; they’re not tangible.
Nikita: Exactly. They’re not tangible. But just because we can’t see them in the traditional sense, or measure them as consistently, as like, with a thermometer in measuring temperature, they’re still there. And they still impact every decision that we make.
[♪♪♪]
Host: We’d like to thank Nikita Salovich for taking time away from her PhD studies to talk to us about fake news. We wish her a smooth dissertation and defense in the future.
Spark Science is produced in collaboration with KMRE and Western Washington University. Today’s episode was recorded in Bellingham, Washington, in my house, on my computer, during the great pandemic that’s still going on as of May 2021. Our producers are Suzanne Blais and myself, Regina Barber DeGraff. Our audio engineers for this episode are Julia Thorpe and Sarah Coakley. The script writer for today’s episode was also Sarah Coakley. If you missed any of the show, go to our website, sparksciencenow.com. And if there’s a science idea you’re curious about, send us a message on Twitter or Facebook at SparkScienceNow. Thank you for listening to Spark Science.
[♪♪♪]