Perspective

In an Age of Information Gatekeeping, Don't Just Google It

Dia Kayyali / Aug 11, 2025

Dia Kayyali is a fellow at Tech Policy Press.

Ground Up and Spat Out / Janet Turra & Cambridge Diversity Fund / Better Images of AI

In an online world marked by censorship, manipulation, and information filtering, access to meaningful knowledge on the Internet is harder than ever. It’s time to reassess our relationships with the biggest information filters out there—for-profit tech companies. It’s also time to prepare for increasing government censorship in the “Global North,” to strengthen our ability to find information through alternative means, and to open ourselves to the idea that by seeking information from human sources, we can strengthen relationships in the face of rising authoritarianism.

It’s clear that abandoning or even reducing the use of today's tech giants’ services is not simple, but there are many reasons why now is the time to consider it. These services have helped many people access a world of information previously unimaginable. Social media platforms, search engines, and recommendation sites have made it easier to navigate new places, build community, and discover new experiences.

But the internet, which opened up information to billions across the world, has changed, sometimes slowly and sometimes in leaps. While tech companies once expanded access to diverse information and knowledge, information from commercial sources is now heavily filtered. Search personalization, Search Engine Optimization (SEO), and paid results have become key features of the ecosystem, limiting access to unfiltered information through major search engines. More recently, AI-generated overviews are further obscuring diverse perspectives, as users begin to rely on these summaries rather than navigating to the original sources.

At the same time, misinformation is impacting, and even sometimes targeting, search engines, and governments around the world are increasingly restricting access to content related to controversial subjects such as LGBTQ+ people and racism, with a number of restrictions advancing in the US via state legislation. Now more than ever, it’s essential to strengthen critical thinking and media literacy skills, connect with people in person, seek out trusted sources directly, read books, check citations, share information independently, and avoid the constant offerings of big tech.

First, it is important to note that regardless of what search engines or social media sites do, information can’t be safely consumed without a certain degree of critical thinking and a lot of caution. For example, in 2023 research demonstrated that “online search to evaluate the truthfulness of false news articles actually increases the probability of believing them,” especially when using keywords taken from those articles. The study recommended media literacy programs, and this kind of education is clearly needed, not only for kids but also for adults.

Some of the information filters mentioned above have been around for a long time, predating the more recent explosion of generative AI. SEO, for example, refers to the techniques used to make content more likely to appear in search results. It can be manipulated in a variety of ways, and there are professionals who specialize in doing just that. Pages created by people who can’t afford these services or lack the technical skills land lower in search results, even on non-commercial search engines. Similarly, paid search results push advertisements to the top. These ads are not always clearly labeled, but even when they are, they push “organic” search results further down.

Google News deserves a special mention. It functions as a news “aggregator,” pulling in sources from across the web to create a news landing page, personalized to varying degrees. The site displays “top stories” and offers further sources when a user clicks on that option. While Google has some basic content guidelines for inclusion in News and Search, its methods for ranking stories and sources remain largely opaque — yet it is incredibly popular. According to Press Gazette, in May 2025, Google News was the sixth most-visited English-language news website in the world. Inclusion in Google News not only lends an outlet a sense of legitimacy but also drives significant traffic.

But not all of the outlets featured on the aggregator are created equal, a fact for which Google takes no real responsibility. In fact, news results also now include posts from X, bringing social media content directly into search results for people searching for news on specific topics. This has resulted in Google Search amplifying misleading content. For example, an AI-generated video of Hasan Nasrallah claiming responsibility for destroying Lebanon was tweeted out by Israel’s official account after Israel bombed Lebanon. The post appeared when I tested Google News search results on the topic at the time, and it was likely displayed to others seeking reliable information about the situation.

Unfortunately, AI-generated images and videos are not the only way that AI is making it harder to access reliable information. Generative AI is increasingly playing a central role, with concerning consequences. Some users are going directly to ChatGPT or Grok instead of traditional search engines, while search engines themselves are integrating generative AI in their results. For example, Google’s “AI Overviews” and Bing’s AI provide short answers with links, which are distinct from the more standard list of results.

This brings with it a host of problems, including AI hallucinations, built-in bias, and potentially intentional manipulation, as in the case of Grok. Research from the University of Washington examining generative AI search engines found that one “search engine manufactured information that did not exist in the first place” through mistakes such as combining quotes from two different sources. And in two separate recent instances, ostensibly private conversations with Claude, ChatGPT, and Grok were leaked and indexed by Google Search, with deeply concerning implications for privacy.

As troubling is the impact on how people interact with the information. Most users are unlikely to scroll past the AI-generated responses to verify the information or access the original sources. Industry research from last year indicated that “AI Overviews can cause a whopping 15-64% decline in organic traffic” and that “60% of searches now terminate without the users clicking through to another website.” A study from this year suggested that a site “previously ranked first in a search result could lose about 79% of its traffic for that query if results were delivered below an AI overview.” As longtime tech journalist Jason Koebler put it in a recent critique of AI’s negative impact on journalism, “This general dynamic—plummeting traffic because of AI snippets, ChatGPT, AI slop, Twitter no workie so good no more—has been called the ‘traffic apocalypse’ and has all but killed some smaller websites and has been blamed by executives for hundreds of layoffs at larger ones.”

This filtering by platforms, alongside increasing government censorship, is making it harder to access information and connect with communities engaged with some of today’s most urgent geopolitical and sociocultural issues. People have relied on the internet for information about the genocide in Gaza, immigrants’ rights, the rise of authoritarianism and the far-right, trans healthcare, and broader LGBTQ topics. It has also served as a vital space for finding solidarity and advocating for human rights. Movements such as Black Lives Matter, the Arab Spring, and the farmers’ protests in India have all been facilitated, supported, and amplified by the internet.

So what now? The first thing that needs to shift is our mindset. Not many people can simply stop using major platforms, but we should use them with our eyes wide open. It’s important to understand the ways in which the information we see is being filtered.

Media literacy, specifically focused on the online information environment, can help as well. We don’t have to wait for schools to do this. Civil society can consider partnering with influencers to reach younger audiences and collaborating with rights organizations outside the digital rights space to engage broader communities. With at least some media literacy, it becomes easier to judge information sources and develop the habit of consulting them directly whenever possible. And of course, there are some corners of the internet that still prioritize transparency. For example, Wikipedia makes it easy to find citations that allow you to assess the quality of the information you are consuming.

Perhaps the hardest thing to change now will be breaking the instinct to “Google it,” or “check a Facebook group,” or “go on X to get immediate news about a developing situation.” We need to start asking where we can turn for communication that’s mediated as little as possible by technology. In a time of rising authoritarianism and social isolation, connection is essential, but we have to consider seeking it more offline and modifying how we pursue it online. This might mean returning to uncomfortable conversations, rebuilding neglected relationships, and working harder to find trustworthy information and supportive communities.

That’s not necessarily a bad thing. The World Health Organization recently created a “Commission on Social Connection” to fight the “loneliness epidemic.” But we can fight it ourselves by investing in forms of connection and understanding that an algorithm could never provide.

Authors

Dia Kayyali
Dia Kayyali (they/them) is a member of the Core Committee of the Christchurch Call Advisory Network, a technology and human rights consultant, and a community organizer. As a leader in the content moderation and platform accountability space, Dia’s work has focused on the real-life impact of policy ...
