Ellen P. Goodman is a Professor at Rutgers Law School, Co-Director of the Rutgers Institute for Information Policy & Law, and a Senior Fellow at the Digital Innovation & Democracy Institute at the German Marshall Fund. Marc Hand is CEO and founder of Public Media Venture Group and co-founder and former CEO of Public Media Company (PMC). Derek Slater is a tech policy strategist and co-founder of consulting firm Proteus Strategies.
Every day, there is a new spike of anger, recrimination, or legislation directed at “information disorder,” with concerns that people are steered towards harmful misinformation, radicalized by conspiracies, hived off into polarized media, and worse. To many, major social media services are the primary cause and amplifier of these ills, an inevitable consequence of market pressures and platforms’ current commercial incentives. Even though the research on social media’s harms is inconclusive, contested, and complex, and even though the “disorder” has roots in traditional media and elsewhere, social media must improve to foster a better information ecosystem.
Many proposals to reimagine social media focus on regulating and rearchitecting today’s commercial services, particularly very large players with significant market share. Other (not mutually exclusive) proposals seek to bring more competition to the market. Rather than acting to directly restrict or shape what today’s platforms do, they seek to drive better market outcomes by giving consumers more choices.
Those choices should not simply be other commercial alternatives. Expecting commercial social media to fill our society’s most important information needs is like expecting a chain bookstore to do a library’s job. We should not count on the market to provide for all of our information needs. Just as market actors’ decisions will not fully account for negative externalities, they won’t account for positive externalities either, including the benefits of an informed public. Noncommercial alternatives, including public serving media institutions and libraries, have historically been a key part of meeting civic objectives.
A digital public sphere has another feature long associated with public serving media: resilience. Freedom and democracy depend on diverse control of communications systems. Today, essential information produced by mayors and governors, the outreach of independent community organizations, and the work of universities all rely on the same concentrated commercial systems and infrastructure (including cloud services and distribution), making the system brittle and potentially undemocratic.
There are a number of efforts underway to create a new digital public sphere that supports communities with communications that foster constructive dialog, rather than poisoning it. To varying extents, these efforts rely on cooperative models and take inspiration from Wikipedia, Mozilla, and open source software, as well as from traditional public service institutions like local libraries and public broadcasting stations.
Could public broadcasters in the US play a greater role in the new digital public sphere and perhaps even in reimagining social media? They have long been innovators in new digital services that serve their communities and are capable of doing much more. Below, we’ll talk about one example that inspired this post. While we don’t prescribe how broadcasters can or should operate, we think it’s an important area for further exploration, and we’d welcome other pointers to people thinking along these lines.
Moments of crisis prompted investments in noncommercial media
If we find these adolescent years of social media to be disruptive and unstable, so were the salad days of radio and television in the twentieth century. To address fears that these new media would hurt democracy, the US government made bold moves to provide optionality in the form of investments in noncommercial media and infrastructure. In the 1930s, it reserved radio spectrum to support the growth of noncommercial radio, ensuring there could be a trusted place for farmers and students to get relevant information. In the 1960s, it invested in technologies and structures that put public television into every community, controlled by local entities responsible to their publics and not the market.
Over time, US investments in public media have tried to respond to technological changes by ensuring that public media kept up and could continue to provide independent, noncommercial, locally-based, and trusted services. Those investments included satellite interconnection and a transition to digital broadcasting. This is on top of investments the government has made in often subtle ways to foster a robust press that fights for truth.
Investments have also gone towards ensuring media institutions serve broader civic roles within their communities; for instance, just as the internet was coming online, the major public broadcaster WHYY embarked on a “civic space” strategy, focused on convening and supporting organizations and people within its local communities, and then extending those engagements into the digital realm.
That said, it’s fair to say that there has been a more limited effort in the US to strengthen a nonmarket civic communication infrastructure in the digital era than in connection with past innovations. This isn’t because of data suggesting that public media is no longer important; in fact, research shows that countries with more robust public service media are better able to address concerns around political polarization. Moreover, this isn’t because of a lack of will among broadcasters; to the contrary, with relatively limited resources, broadcasters continue to work to innovate in ways that serve their communities.
US public broadcasters’ continued evolution in providing information services
Today, the introduction of ATSC 3.0 is an important example of how broadcasters continue to innovate in information services and could provide more vital digital services in the future.
ATSC 3.0 (NextGen TV) is the world’s first IP-based television standard, meaning it can transmit IP data files — just like the internet — and deliver internet services alongside traditional over-the-air broadcast signals. It can deliver high-quality video, immersive audio, and data files to any device in any location using the existing television infrastructure. ATSC 3.0 is also more efficient than the current broadcast standard, increasing delivery capacity and allowing broadcasters to expand their number of program streams.
One of the most important elements of 3.0 technology is datacasting, an efficient, reliable method of delivering content through the television spectrum. Dubbed “Broadcast Internet” by the FCC, 3.0 datacasting is secure and addressable, and there are no per-person or per-device costs to deliver materials — so whether a station is reaching 100 users or 1,000, the cost is the same. And unlike traditional broadband, hardened broadcast transmission sites are designed to operate during natural disasters and other crises. What’s more, public media can help reach people who are un- or under-served by broadband with data services.
Potential use cases for this technology are wide and varied, and public media has been innovating in 3.0 for years now, almost completely independent of commercial station groups. Just a few examples: The NextGen Media Innovation Lab at WKAR/Michigan State University has been experimenting with interactive educational content and services since 2019. The Information Equity Initiative, a nonprofit created in 2021 by three PBS stations, is working with teachers, students, and school districts to bridge the digital divide using datacasting. Public station VPM Media in Richmond, VA, is piloting education datacasting for correctional centers with Southside Virginia Community College’s Campus Within Walls project.
For public broadcasters, ATSC 3.0 presents important opportunities that speak directly to their public service mission. Public media’s nationwide system reaches 98% of the US population, and with more than 25 million American households lacking internet access, NextGen TV could be a game-changer for economic opportunity, civic engagement, news and information, health and wellness, public safety, and education. Public media’s innovations in new digital services using NextGen TV technology are helping the industry build resiliency, communications independence, and the non-market alternatives that could sustain public media and allow it to continue providing essential services to the American public.
Could broadcasters play a greater role in new social media and online public spaces?
The deployment of ATSC 3.0 offers the only digital space for engagement independent of the internet and other intermediaries. Innovations like the Information Equity Initiative show how public media can serve communities in new ways, but that is only one example. Imagine the same idea applied to other information services, delivered for free at high speed to everyone, with local accountability and the stewardship of still-trusted public media.
In Europe, broadcasters are already leading such innovation. In his work making the case for reimagining social media and public media, Ethan Zuckerman highlights the Public Spaces Coalition, a group of Dutch broadcasters working together to both serve their communities and ensure they are not unduly dependent on commercial intermediaries to reach their constituencies. One outgrowth of this coalition is a project called PubHubs, a set of tools for community institutions that want to create communication services for their members. Specifically, their tools are built around Matrix, an open, decentralized communication standard that functions a bit like Slack or Discord.
Will this work be successful, and could similar innovation develop in the US (perhaps in cooperation with a global network of public service media institutions)? Perhaps. Certainly, funding is a major constraint, given that public media’s existing activities are already relatively underfunded.
Along with funding, public policy could enable this sort of innovation in other ways, most notably through interoperability. Today’s social media benefits from strong network effects: billions of people are already on today’s services, and not on a new entrant’s. Switching costs also matter; even though toggling between apps is trivial, the cognitive load of managing myriad apps adds up, as do default settings and the loss of access to content on existing commercial services. Interoperability requirements (or at least clearing legal barriers to potential interoperating services) could help overcome these obstacles; if one could use an alternative social network while retaining the ability to connect with users on other services, new public service media offerings could more easily gain users. Even if only a minority of users took advantage of this capability, it could still affect the overall ecosystem, creating a competitive check on today’s large platforms and a proof of concept for new internet structures.
Generations ago, the US media landscape was described as a “vast wasteland.” Public broadcasting was one of the responses. It created a public service component to mass media that delivered more variety, independence, access, and service than the market had done before. While public media has been in the digital business for decades now, it has been a marginal player. Concentrated commercial control of the internet has made all legacy media dependent on the logic of digital platforms. Now, with new broadcast technology, public media has a chance to take a leap forward, carrying public media values into new broadband communications on its own infrastructure. The vision is there. What’s needed is an ecosystem of partners and funding.