The science behind why fake news is so hard to wipe out
It's time for Facebook and Google to pay attention to the psychology of the illusory truth effect.
In the immediate aftermath of the Sunday night massacre in Las Vegas, Facebook and Google — the two largest distributors of news and information in the world — helped to spread misinformation.
In its "top stories," Google featured a 4chan forum — an anonymous message board notorious for fueling conspiracy theories — that misidentified the shooter as a Democrat with ties to leftist, anti-fascist groups, as BuzzFeed's Ryan Broderick found out. On Facebook, "trending stories" featured articles about the shooter from Sputnik, a Russian propaganda outlet, a New York Times reporter noted.
And that's just the start: An untold number of other pieces of user-generated misinformation and hoaxes on the shooting spread freely on social media. (Broderick compiled many of the hoaxes in a list here.)
The fringe conspiracy theory website Infowars ran a headline that suggested the killer specifically targeted conservatives. And as Broderick chronicled, the far-right website Gateway Pundit ran a headline that said the shooter was associated with an "anti-Trump army" (the post has since been removed).
None of these stories checked out. And the killer's motives still have not been verified in the days since the shooting.
But each time a reader encounters one of these stories on Facebook, Google, or really anywhere, it makes a subtle impression. Each time, the story grows more familiar. And that familiarity casts the illusion of truth.
Recent and historical work in psychology shows that mere exposure to fake news makes it more believable. To understand why, and to see how deeply false stories seep into our brains, we need to understand the psychology of the illusory truth effect.
The more we hear a piece of information repeated, the more we're likely to believe it. "Even things that people have reason not to believe, they believe them more" if the claims are repeated, Gord Pennycook, a psychologist who studies the spread of misinformation at Yale University, says.
And recent research shows the illusory truth effect is in play when we hear or read fake news claims repeated, regardless of how ridiculous or illogical they sound.
It's research Google and Facebook must wrestle with as the world's most powerful media organizations. When they do, it will be clear that it's time for them to get serious about editing out falsehoods.
The illusory truth effect, explained
The illusory truth effect has been studied for decades — the first citations date back to the 1970s. Typically, experimenters in these studies ask participants to rate a series of trivia statements as true or false. Hours, weeks, or even months later, the experimenters bring the participants back again for a quiz.
On that second visit, some of the statements are new and some are repeats. And it's here that the effect shows itself: Participants are reliably more likely to rate statements they've seen before as being true — regardless of whether they are.
When you're hearing something for the second or third time, your brain becomes faster to respond to it. "And your brain misattributes that fluency as a signal for it being true," says Lisa Fazio, a psychologist who studies learning and memory at Vanderbilt University. The more you hear something, the more "you'll have this gut-level feeling that maybe it's true."
Most of the time, this mental heuristic — a thinking shortcut — helps us. We don't need to rack our brains every time we hear "the Earth is round" to decide if it's true or not. Most of the things we hear repeated over and over again are, indeed, true. But falsehoods can hijack this mental tic as well.
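To make the experimental design concrete, here is a toy simulation in Python of the second-session truth ratings. The 1-to-6 scale echoes the studies described below, but the noise level and the size of the "fluency boost" are invented for illustration; this is a sketch of the paradigm, not any researcher's actual code or effect sizes.

```python
import random

random.seed(42)

def rate(repeated: bool) -> float:
    """Simulated truth rating on a 1-6 scale for one statement.

    Baseline ratings are noisy judgments around the scale midpoint;
    repetition adds a small "fluency" boost, mirroring the
    misattribution Fazio describes. Both numbers are invented.
    """
    baseline = random.gauss(3.5, 1.0)
    boost = 0.4 if repeated else 0.0  # hypothetical effect size
    return min(6.0, max(1.0, baseline + boost))

repeated_ratings = [rate(True) for _ in range(1000)]
new_ratings = [rate(False) for _ in range(1000)]

print(f"mean rating, repeated statements: {sum(repeated_ratings) / 1000:.2f}")
print(f"mean rating, new statements:      {sum(new_ratings) / 1000:.2f}")
```

Averaged over many statements, the repeated ones come out rated reliably more truthful, even though nothing about their content changed between sessions.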
The more we encounter fake news, the more likely we are to believe it
Historically, psychologists have studied the illusory truth effect with topics of trivial importance. One study in the 1970s tested the phrase "French horn players get cash bonuses to stay in the U.S. Army."
Pennycook and his colleague David Rand at Yale are updating these tests to better understand the spread of misinformation in the real world, recreating these classic experiments with fake news headlines ripped from the 2016 presidential campaign.
In a recent study, participants were shown six real and six fake news headlines — and were asked about how accurate they were. The headlines were made to look like Facebook posts.
After evaluating the headlines, the participants were distracted with another task (not relevant to the experiment) for a while. Afterward, they were given a list of 24 headlines to evaluate, including all of the fake news stories they had seen earlier.
Pennycook was able to replicate the classic finding: When participants had been exposed to a fake news headline previously, they were more likely to accept it as truth later on.
"We found essentially the same effect, which was surprising because the stories that we're using are really quite implausible, like 'Mike Pence's marriage was saved by gay conversion therapy,'" Pennycook says. The effect was not limited to Republicans or Democrats in the study's large sample. And a follow-up test revealed the effect persisted a week later.
The effect isn't huge, but it's meaningful.
One of the fake news headlines used in the study was "Trump to Ban All TV Shows that Promote Gay Activity Starting with Empire as President."
If a group of participants hadn't seen it before, about 5 percent said it was accurate. If they had seen it in an earlier stage of the experiment, around 10 percent said it was accurate. That's twice as many people agreeing an outlandish headline is truthful.
And while the change is small, think about this: Facebook and Google reach just about every person in the United States. A 5 percentage point increase in the number of people calling a fake news headline true represents millions of people.
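To see why those margins matter, here is the back-of-the-envelope arithmetic in Python. The audience figure is a hypothetical round number, not a statistic from the study or from Facebook.

```python
# Hypothetical platform reach; per-headline US audiences aren't published,
# so this is a round illustrative number.
us_audience = 200_000_000

rate_unseen = 0.05  # ~5% called the headline accurate without prior exposure
rate_seen = 0.10    # ~10% called it accurate after one prior exposure

extra_believers = (rate_seen - rate_unseen) * us_audience
print(f"Additional people rating the headline accurate: {extra_believers:,.0f}")
# -> 10,000,000: a 5 percentage point shift at platform scale
```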
Though have some faith: Pennycook found truly, truly outrageous statements like "the Earth is a square" didn't gain acceptance with repetition.
I should mention: Pennycook's work has only been published in preprint form, which means it has not yet been through peer review. So treat these findings as preliminary. His team did preregister the study design, which is one safeguard in ensuring objective results. But other studies have found similar results.
In 2012, a small-scale paper in Europe's Journal of Psychology found that "exposure to false news stories increased the perceived plausibility and truthfulness of those stories." The study had participants read made-up (but not totally outlandish) news stories — like one on a California bill to limit the number of credit cards an in-debt person could own. Five weeks later, they were more likely to rate these false stories as being truthful as compared to a group of participants who had never seen those stories before.
Studying up on a topic isn't likely to help
The frustrating truth about the illusory truth effect is that it happens to us unthinkingly. Even people who are highly knowledgeable about topics can fall prey to it.
And it can happen whether or not we have prior knowledge about a subject. In 2015, Fazio and co-authors published a paper that found prior knowledge about a topic doesn't inoculate you against the effect.
In her study, participants who knew facts like "kilts are the skirts that Scottish men wear" grew more doubtful of that knowledge after reading "Saris are the skirts Scottish men wear," and grew more doubtful still after reading the false statement a second time. (Participants rated the truthfulness of the statements on a scale of 1 to 6.)
Fazio stresses that it's not that people completely change their understanding of Scottish fashion customs by reading one sentence. But doubt begins to creep in. "They moved from 'definitely false' to 'probably false,'" she says. Every time a lie is repeated, it appears slightly more plausible to some people.
Our memories are very prone to mixing up real and false information
The research here suggests that even when there are fact-checks around bullshit claims, the illusory truth effect still influences our memories to confuse fact and fiction.
It's because our memories aren't so great. Recently I had a conversation with Roddy Roediger, one of the nation's foremost experts on learning and memory. In his experiments, he shows how even small suggestions from others can push us to remember whole scenes and experiences differently. And we tend to sloppily remember events like news reports.
"When you see a news report that repeats the misinformation and then tries to correct it, you might have people remembering the misinformation because it's really surprising and interesting, and not remembering the correction," Roediger, a psychologist at Washington University in Saint Louis, said.
Many of the fake news claims and hoaxes that followed the Las Vegas shooting implied that the shooter specifically targeted conservative-minded Trump supporters. That claim may prove especially sticky, because it fits an existing stereotype: we may think of country music fans as Trump supporters.
There isn't an easy fix to this problem
In one arm of his experiment, Pennycook even attached a warning to the fake news headlines when participants first read them. "Disputed by 3rd Party Fact-Checkers," the note read, which is Facebook's exact wording for how it labels dubious stories. The warning made no difference.
"We basically said, 'This is something you shouldn't believe,'" he says. But participants later on still rated those headlines as being more accurate than ones they had never seen before.
Pennycook and Rand followed up with another paper looking at whether Facebook's warnings could have any effect on whether readers perceive a news article as being accurate. Rand explains that the warnings did slightly decrease accuracy ratings, but not enough to overcome the illusory truth effect. "The size of that decrease is smaller than the increase you get from just having seen it," he says. "So what that means is seeing an article with a disputed tag on it still leaves you a little bit more inclined to believe it's true than not having seen it at all."
The experiment was pretty simple: Participants saw an array of real and fake news headlines either without warnings or with the warnings added. They were simply asked to state how accurate they thought the headlines were. (One caveat here: This study was not performed on Facebook itself, but on a web survey designed to look like Facebook. But as Pennycook says, Facebook hasn't made data on the effectiveness of its warnings public.)
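Rand's point comes down to simple arithmetic: the warning subtracts less perceived accuracy than a prior exposure adds, so the net effect of a seen-and-tagged headline is still positive. A minimal sketch, with placeholder effect sizes that are invented for illustration rather than taken from the paper:

```python
# Placeholder effect sizes, invented for illustration; the paper
# reports its own estimates.
exposure_boost = 0.05    # gain in perceived accuracy from one prior view
warning_penalty = 0.03   # drop from the "Disputed by 3rd Party Fact-Checkers" tag

net = exposure_boost - warning_penalty
print(f"Net change in perceived accuracy vs. never seeing it: {net:+.2f}")
# Positive: a tagged, previously seen headline still reads as more
# accurate than one never seen at all.
```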
Facebook and Google need to step up in their role as news publishers
The stakes here are extremely high, with democracy itself under attack. Increasingly there's evidence that the Russian government used Facebook to target Americans with misinformation and messaging to sow unrest during the 2016 election. Facebook made it easy.
"These companies are the most powerful information gatekeepers that the world has ever known, and yet they refuse to take responsibility for their active role in damaging the quality of information reaching the public," Alexis Madrigal writes in the Atlantic. He asks us to imagine: What if a newspaper had done this?
Facebook, Google, Twitter, and other forms of social media are the newspapers of today. They need to take the spread of misinformation on their platforms more seriously. They need to step up in their role as near-ubiquitous news publishers.
We're not sheep. It's not like we'll believe anything we read on Facebook. The effect misinformation has on our minds is much subtler; it works on the margins. But in today's world, where a few platforms dominate information sharing, the margins are huge: they contain millions of people, and they are influential.