New Research Suggests Online Search Can Increase Belief in Misinformation

Prithvi Iyer / Dec 20, 2023

Much has been written about how misinformation threatens democratic discourse around the world, with social media companies often held responsible for the problem. Comparatively little attention has been paid to the role of search engines in exacerbating the spread of, and belief in, misinformation. A new paper by Kevin Aslett, Zeve Sanderson, William Godel, Nathaniel Persily, Jonathan Nagler, and Joshua A. Tucker published in Nature seeks to address this gap.

The paper, titled “Online searches to evaluate misinformation can increase its perceived veracity,” examines how the results that search engines such as Google deliver to users seeking to evaluate news affect their belief in misinformation. Although many media literacy interventions encourage people to verify news by doing their own research via search engines, the authors note that there is limited evidence that this strategy reduces belief in misinformation. Understanding if and to what extent using search engines reduces belief in misinformation is crucial because most internet users report relying on search engines rather than traditional news sources to learn about politics and form opinions about current affairs.

The answer, according to this research, seems clear. Across five experiments, the researchers find that for most users, “online search to evaluate the truthfulness of false news articles actually increases the probability of believing them.”

The first experiment examined the effect of searching online to evaluate news on belief in misinformation via a randomized controlled trial in which participants were asked to evaluate the truthfulness of three news articles. Individuals in the treatment group were encouraged to use search engines to aid their evaluations, while those in the control group were told not to do so. The findings cut against conventional wisdom: those encouraged to search online were more likely to rate a false or misleading article as true.

This raises the question: Is the effect of searching online to verify news articles strong enough to change users' prior opinions? The researchers address this by asking the same pool of participants to evaluate news articles with and without online search. They found that “17.6% changed their evaluation to true after being prompted to search online (for comparison, among those who first incorrectly rated the article as true, only 5.8% changed their evaluation to false/misleading after being required to search online).” This finding undercuts the idea that users hold rigid opinions that cannot be changed. Rather, using online search engines to evaluate false news can “falsely raise confidence in its veracity.”

Thus, the overarching finding is that online searches to evaluate false news can exacerbate rather than reduce belief in misinformation. However, these studies only evaluated recently published articles (up to 48 hours after initial publication). Since misinformation can go viral weeks after it is first published, it is possible that search engines surface more credible sources of information as time passes. The researchers tested the robustness of their findings over a longer timeframe (three to six months after initial publication) and found that 18% more respondents “rated the same false/misleading story as true after they were asked to re-evaluate the article after treatment, even months after the article was published.”

So why does searching online lead some people to believe false news stories? The authors attribute their findings to the concept of a “data void,” or “informational spaces in which there is corroborating evidence from low-quality sources.” When people conduct online searches to determine whether a news article is true or false, especially around breaking news events, search engines often return less credible information. Data voids often arise because low-quality news publishers utilize search engine optimization (SEO) and encourage users to “use specific search queries when searching online by consistently using distinct phrases in their stories.” This can lead to a “propaganda feedback loop,” as when Google’s search engine was shown to “interact with conservative elite messaging strategies to push audiences towards extreme and, at times, false views.” This feedback loop also leads large networks of news outlets to republish the same misinformation, which, because of its increased traction, can flood search engines and bury credible information. One study shows that over 25% of Google users click the first search result. Thus, if data voids keep credible information from reaching the top of search results, misinformation will likely go unchecked.

While data voids can exacerbate the problem of misinformation, their effect is not universal: some users are more likely than others to conduct searches that return low-quality news sources. The authors test whether a user’s political ideology or degree of digital literacy predicts the likelihood of exposure to these data voids. As expected, they find a strong correlation between low digital literacy and exposure to data voids, even after controlling for other demographic characteristics. Ideological congruence with a news outlet also increases the likelihood of exposure to data voids. Since search engine results are highly personalized, user-level characteristics like ideology, digital literacy, and demographics play a pronounced role in shaping exposure to misinformation.

Takeaways

Across these five experiments, the researchers provide compelling evidence that “doing your own research” is not a reliable remedy for misinformation. Online search engines, much like social media platforms, can lead people to believe falsehoods. Data voids, which bury credible information under a slew of low-quality sources, likely contribute to this phenomenon.

These findings caution against putting blind faith in online search engines as rigorous tools for verifying news. The online news ecosystem has few barriers to entry, with blogs from unverified sources co-existing with reputable news outlets. This makes the challenge of discerning truth even more complex and requires search engines to conduct frequent quality checks and set up guardrails that prevent low-quality news from reaching the top of the results page. Google’s policy of displaying warnings when no credible information exists for a search query is an encouraging step in the right direction, according to the researchers. But search engines must do more to ensure they are not contributing to the spread of, and belief in, false information.

Along with changes to search engines, there is an urgent need to invest in digital literacy programs. The Civic Online Reasoning Curriculum is one such endeavor the researchers point to, and more should follow suit. It is not realistic to expect search engines to be entirely free of misinformation, so the best bet is to cultivate a well-informed public that is better able to spot it.
