On October 7th, Tech Policy Press hosted a mini-conference, Reconciling Social Media and Democracy.
While various solutions to problems at the intersection of social media and democracy are under consideration, from regulation to antitrust action, some experts are enthusiastic about the opportunity to create a new social media ecosystem that relies less on centrally managed platforms like Facebook and more on decentralized, interoperable services and components.
The first discussion at the event took on the notion of ‘middleware’ for content moderation, and the second looked at this question through the lens of what it might mean for the fight against misinformation. The third discussion featured Cory Doctorow, who discussed the notion of competitive compatibility. And the fourth was a discussion of what projects point to the possibility of a more decentralized future, featuring Tracy Chou, founder and CEO of Block Party, a company that builds anti-harassment tools against online abuse, and Michael Masnick, the editor of Techdirt.
This fifth discussion features Dr. Ramesh Srinivasan, Professor in the UCLA Department of Information Studies and Director of the UC Digital Cultures Lab, who, with Dipayan Ghosh, authored the article Reining in Big Tech in the Journal of Democracy series on the future of platform power that served as the basis for some of the discussion at the event. At the end of the session, you will hear Dr. Nathalie Maréchal and Richard Reisman, two participants in the first session of the day, offer thoughts in reply, along with Bryan Jones, who is chairman and cofounder of Tech Policy Press.
Below is a rough transcript of the discussion.
I’m so very pleased that Ramesh can join us today. Ramesh Srinivasan is a professor at UCLA’s Department of Information Studies and director of the UC Digital Cultures Lab, and co-author, with Dipayan Ghosh, of Reforming the Business Model.
And I don’t want to put too many words in your mouth, and I don’t want to suggest that you’re going to throw cold water on this whole idea of decentralization. But maybe just as a prompt for an opening salvo on your point of view: one of the things that you do is essentially critique Fukuyama’s proposal, so bringing us back to where we started today, from the perspective of the market and of where the industry is at the moment.
Absolutely, Justin, and thanks for having me and also for spotlighting the work that Dipayan and I did on this particular piece. It’s been quite a week, as we all know. I’ve been nonstop talking with media all around the world about what we can do about these challenges we’re facing. And I just want to underscore maybe the two main points that Dipayan Ghosh and I made in the piece we contributed to the Journal of Democracy that you are convening us around, Justin.
First of all, it’s absolutely essential that the algorithms that drive the visibility and virality of content be engaged with in a highly direct manner, particularly given what many of us have suspected and written about based on heuristic evidence. But now we see corroborating evidence from inside of Facebook.
Facebook is just a symptom of a much wider problem amongst, I would say, most big tech companies, where the algorithm is privileging content that’s negative, that’s hateful, that’s arousal-optimized. So farming out content moderation, no matter the model, whether it’s a market-based model or the model that we argue for, won’t resolve that. The moderation component of this, or what Dr. Fukuyama describes as a middle layer, won’t resolve that particular issue.
So you may have egregious content that might be moderated out with a greater multi-stakeholder market model around moderation, but that doesn’t resolve, ultimately, the fact that particularly heinous content tends to bubble up to the top. And this is all based on correlation, as many of you know, rather than an actual semantic analysis of that content or a semantic analysis of oneself.
So that’s point one.
Point two: dramatically missing from Dr. Fukuyama’s proposal, though we do like some components of it and appreciate his diagnosis of the larger challenge at hand, is a public-focused model toward moderation and toward any regulatory action, whether that means broadening content moderation, which we all agree is important, or addressing the algorithm itself.
So audit, design, content moderation: these all need to be in the hands of organizations that are dedicated to a certain notion of the public interest. Now, of course there is no single public. This is a debate that I’ll probably have, or that I have silently but not directly, with Jürgen Habermas, one of the great scholars of the public sphere, whose work I have deeply appreciated.
But when we talk about the reach of a technology platform, such as the portfolio of technologies owned by the company Facebook: WhatsApp, Instagram, and so on, we’re talking about very different kinds of global publics. The teenage girl public that Instagram has been connected to depression among is a different kind of public than people involved in human rights in Myanmar.
So Dr. Fukuyama is absolutely right that different kinds of stakeholders need to be involved in the moderation business, if you will. But it matters enormously who those organizations are; we can’t just naively trust a market-based solution to this problem when, in many cases, market failure, actually the absence of market behavior, is what we’re seeing when it comes to tech these days.
If anything, I would argue that what we see occurring today is a certain zombified aspect of capitalism rather than an extension of the free market. Probably Dr. Fukuyama would agree with that, given his proposals. But the reality is that the free market has turned into this because the free market is not simply free: it’s interwoven with the state, and it’s not really regulated. A free market failure led to this situation, so it’s hard to believe that free market reintroduction would somehow resolve these situations when, all the while, the public interest is what has been ignored, particularly vulnerable people and communities: queer communities, global south communities, Black and brown communities, women, etc.
Those are the main points. This is, of course, my relay of the main points that Dipayan Ghosh and I made in our piece. But it’s a point I’ve been making a million times in everything I write for the mainstream and progressive media, and in the policy I work on. I’m currently working on a couple of pieces of policy with members of the Congressional Progressive Caucus and so on. So I just wanted to start with that. Thanks, Justin.
So maybe I’ll just ask you about this a little bit: you two focused particularly on privacy and the concerns around it, and maybe I’ll push you for a little bit more on that particular angle and how it relates to this general proposal around decentralization or unbundling.
So again, just to make sure I understood: our critique is one related to privacy. Is your question how that’s connected to the decentralization proposal of Dr. Fukuyama?
Oh, I see. I don’t see Dr. Fukuyama’s proposal as actually engaging in a true type of decentralization. I think it’s saying that there should be more market entrants in a market that’s been completely corroded by an oligopolistic system, which is what we see right now. It isn’t really decentralizing.
I think there’s a little bit of an appropriation of that term, ‘decentralizing.’ Of course, we love using that term when it comes to the internet, but if we’re not intervening with the algorithm, if we’re not intervening with Facebook’s dominant market share, and not just Facebook’s, think about Amazon here as well, we’re not actually doing anything that’s decentralizing.
So it’s not privacy alone; that’s part of our proposal, but it’s an understanding that privacy, and the sovereignty of the various types of communities that have been so adversely affected by this latest stage of techno-capitalism, is where we want to go. I was watching a good conversation the last couple of days on Democracy Now, which I’ve also appeared on before, between my colleague Jessica Gonzalez from Free Press and Roger McNamee. And they were making the really important point, especially Jessica, that what we’re looking at here is not just a data-privacy-oriented form of intervention, but also a form of intervention that is more largely public interest based.
So I would call it more a set of interlocking digital rights proposals that we need. One of the proposals that Dipayan Ghosh and I make in the piece is that we may want to explore utility-based regulatory models and/or antitrust-based regulatory models as we try to sandbox what to do about this situation.
The reason being that, as we know, many utilities that have been taken over and are guided by private interests tend to fail at the times when the public most needs them. A very spectacular example of this was the failure of the electrical grid when brutal weather hit Texas and Louisiana. Another great example is what we just saw with the Facebook outage.
So a utility-based regulatory model would ensure that the public is taken care of in whatever regulatory actions are taken. Those are the lenses through which we want to look at this: individual data privacy is intimately connected to, obviously, political disinformation, which is also connected to this incredibly, and ever more, unequal digital economy. So all of these things need to be considered concurrently, in what I would say should be a much more expansive piece of legislation.
And I also say this as a progressive, and I think it’s important that we go hard when we enter and then we think about the ways we want to amend policy accordingly. That’s my opinion.
I’ve been involved, and so has Dipayan Ghosh, in several different bills that were introduced to cut into Section 230 liability. But to me, that doesn’t deal with the much larger issue at play, which is the corporate, private-interest-driven takeover of all things digital, which, as we all know, is the basis by which almost everything in our lives is now mediated and expressed.
So we have to wean the internet off ad dollars?
That’s a big part. It’s this ad-focused model, but it’s not an ad-focused model that’s selling me better soap. It’s looking into my experiences of depression and arousal, in what’s called engagement, and targeting me in ways that might reflect a nightmare I may have had, or am yet to have. I’ll give you an example. I’m very, very healthy; I just went to the physician last month. Yet I get ads for cancer treatments, I get ads for diabetes treatments. These are not predictions of what is likely to be my reality. They’re predictions of scary nightmares, introducing them into my life and my mind. I’m not giving them any credence, but at the same time, I think it’s important to note what the nature of this is. It’s not simply a certain kind of teleological prediction; it’s much more about what will get you freaked out.
So, let me put it to you this way, to maybe elicit a little more color around your idea here. You see privacy as the most urgent need; you want to counter this digital economy; you want to find a way to treat attention in a different manner, one more in line with the types of values that you describe. If decentralization doesn’t do any of that, do you see any value in pursuing it? Is there any value to democracy if we can’t address those things?
So, I think Dr. Fukuyama’s proposal is important as one component of a much larger strategy that has the public interest, or publics’ interests, as our North Star. I mean, if you look at content moderation, as we all know, the majority of it tends to be obfuscated, disguised labor, very similar to what we saw with call centers. And content moderation is dominantly focused on English-language content, despite English apparently, and this is what Jessica Gonzalez was saying, being only about 10% of the actual traffic for a company like Facebook. So, let’s take the really important components of Dr. Fukuyama’s proposal and build upon them, which would be to think about expanding content moderation to actually support journalists and folks who have an understanding of local publics, politics, and cultures in the different parts of the world to which these companies have expanded, and into which they want to expand even more if they want to blanket the entire wider internet.
What if we hired good journalists in Burma to be content moderators, instead of relying on obfuscated, exploited, PTSD-affected laborers in the Philippines, awake in the middle of the night? Let’s have those people be front and center, driving decisions around, and working with, these algorithms in these different places. So I think there’s a huge opportunity here to resolve some of the digital economy issues. The intersecting proposal in this particular case is to find ways to get this trillion-plus-dollar-valued company to fork over some money to hire people to do the auditing, design, and moderation work in these different parts of the world, in the non-English-language places to which these networks have expanded. That’s a great thing to do. So in that sense, there could be some good resonance there with Dr. Fukuyama’s proposal.
Are there other things going on in the world that, I don’t know, give you some hope that we will get beyond this particular moment? You’ve brought back the sense of urgency, I think, about the problems we face that we started the event off with.
Yeah. I think it’s important to know that in much of the United States, and we can see other examples globally, this status quo is considered unacceptable by the vast majority of people. That said, there’s a reliance on these platforms, and the developers and investors and executives at these companies deserve some credit for building some really amazing technologies. I’m not trying to dispute that. I think it’s important for them to see this as a great design and engineering exercise intersecting with questions from the humanities and social sciences: how do we design in a way that really supports multivocality? How do we design in a way that supports more of a people’s-based democracy while remaining profitable, given that that’s the nature of the enterprise here? Is this toxic, sympathetic-nervous-system-on-steroids model the only way to be profitable and have the kind of traction you have? Let’s do some A/B testing.
Let’s see this as the new disruption. So, I want to encourage my friends and colleagues who work at these companies, many of them friends of mine. I was at Stanford in the late ’90s, so this was my world. And I just want to encourage my colleagues there: let’s think about a certain notion of design that’s sociotechnical, that’s inclusive, that’s truly collaborative. And I think it’s extremely important in a world where the people whose internet access is most rapidly ticking up are dominantly in the global south, for example the African continent; the youngest people in the world are in these parts of the world. They are in many ways the future of where the internet will be going. So, let’s have them front and center helping drive the decisions that we make, rather than objectifying or patronizing these people.
So, there may be opportunities to see experiments thrive in other parts of the world that we could maybe even bring back here.
Particularly given the now very mainstream understanding of bias in technology: those biases are geographic, they’re gendered, they’re racial, et cetera. I mean, I give Google credit, with all the failures of their algorithms on the African continent: they did set up a lab and they are interacting with Black in AI in Accra, Ghana. Where that goes is an interesting question. Or, rather than simply thinking that the resolution to racist facial recognition technologies or algorithms should just be making them better or more inclusive, let’s ask questions about whether those technologies should be applied or be present to begin with, and in what contexts, following the lead of, for example, the city of San Francisco, which banned facial recognition.
So, I think we need to think about design not just as, “Oh, my gosh, we’ve got to get a few Black people to help us be inclusive.” That’s a very neoliberal, accumulative approach. What I would argue instead is: let’s really think about design as a way of conceptualizing what kinds of technologies are made, by whom, for whom, and whether they should even be developed or applied, and in what cases. So I want us to think on that level; let’s start by thinking about this concept of public spheres and then design based on that.
We’re close to the end of our conversation. Ramesh, do you have a few more minutes?
Great. Okay. So Nathalie, if you want to turn your camera back on. I don’t know, Dick, if you’re still there; Rob, you’re welcome as well, and Bryan. I just want to invite Nathalie in particular to share any thoughts in reply. You’ve been with us the whole day, so you’ve watched the whole arc of the conversation, and I thought I’d bring you back in with any response to Ramesh or to our earlier speakers.
Thanks so much, Justin. And Ramesh, it’s great to meet you. I’m a big fan of your work, and I have to say, I agree with all the points that you made today and in your piece with Dipayan, who I know very well also. There are three big thoughts that are front of mind. One is, it’s been said a lot, but I think it bears repeating: this is really a whole-of-society problem, and it’s going to take a whole-of-society solution. So there’s a ton of room for lots of people to think about and iterate on different aspects of the problem. I think to a large extent, experts tend to gravitate to the part of the problem that is closest to their area of expertise, which is why you don’t hear me talking about tech design or antitrust, because I’m like, “Huh? Sure, do that.”
I don’t know. So, I think it would be a mistake to assume that just because somebody is focusing on part A of the problem, they think that part B, which you are working on, is dumb or unproductive. So, for me, this conversation is super interesting, super educational. It’s probably not going to change the focus of my work or of Ranking Digital Rights’ work, but that doesn’t mean I don’t value the innovation and research and hard thinking being done on other aspects of this problem. The second thing that really comes to mind is that there’s a ton of super, super important questions about this middleware proposal that I identified as huge questions in my piece in the Journal of Democracy, and that I’m still not hearing answers to, or even a path toward answering, starting with: how are we making today’s big platforms cooperate with this plan?
And second, how are we paying these middleware providers? I can’t help feeling that until we address these two questions, everything else is super interesting intellectual discussion, but I don’t see it going any further than that. And then my third big thought, which anyone who follows me on Twitter is going to roll their eyes at because I’ve been harping on it a lot, is the central necessity of corporate governance reform. It’s not nearly as sexy as poking holes in Section 230. I mean, I live in DC, a couple of miles from Capitol Hill, so we have a very particular definition of sexy down here. But things like abolishing, or at least setting a sunset clause for, dual share structures, what is the actual phrase?
I always mess it up. But the thing where there’s A, B, and C stock with different voting classes. We need to deal with that. We need to deal with the fact that Zuckerberg and a lot of the others are both chairman of the board of directors and CEO. That means they’re only accountable to themselves. Those two facts together are, to my mind, an insurmountable barrier to literally everything else, because we cannot get anything through without approval from the guy who’s appointed himself dictator. And so, I’m really interested right now in thinking through legal mechanisms to compel changes to how massive multinational corporations that are headquartered in the U.S. and traded on the New York Stock Exchange are governed. It’s really far afield from what I thought I was getting into when I started working on these topics 10-plus years ago. I have degrees in International Relations and Human Rights and Communication. And here I am thinking about how to make the SEC do stuff.
I don’t have answers yet, but that’s what I’m super excited about talking about. But again, that doesn’t mean that other people shouldn’t be talking about the things that they’re super interested in talking about. I’ll stop there.
Ramesh, any thoughts by reply?
Oh, I just completely agree. And I’m a big fan of your work also, Nathalie. So, it’s really nice to meet you this way, and Dipayan has told me great things. There’s also a political economy issue here, and that’s actually what I feel you’re alluding to when you talk about these internal mechanisms of shareholder structuring: even if you thought of the shareholders as some democratic body of governance, that’s not even possible in this case. So every single aspect of this needs to be peeled back. We need to look at the mechanisms by which the status quo is being constructed, worsened, and perpetuated, and then intervene there. And it’s all hands on deck; all of us should be doing our own things. It’s so great that we’re all talking with one another, thanks to Justin bringing us together. We need to do things like this and present these proposals to the various representatives or legislators we are in contact with, both in this country and around the world.
Dick, I want to bring you into the conversation. We’re almost finished up here, we’re going to be done in just a minute, folks, but just a couple of thoughts on that last point from Ramesh.
Sure. Yeah. So nice to see you. I think your points make sense; I have somewhat different views on a few of them. But working backwards: on the governance reform that Nathalie mentioned, I agree completely that any company beyond a certain scale should not have that kind of governance structure. It’s just totally insane, especially for something with the amount of dominance that Facebook has. On the paying question, well, the other simple answer is, yeah, you’re not going to force Facebook to do it except by government regulation of some kind. So, that’s the challenge.
One thing that struck me, just to clarify how this thing might work: I worked for the Bell System before the breakup. The Bell System breakup to me was a huge success, because it took something that was a utility but didn’t have any competition and didn’t innovate: the labs did wonderful things, but they didn’t turn into products, and you couldn’t connect things. The government figured out how to break them up in a very well-architected way that unleashed all kinds of competition and all kinds of innovation. And so, to me, the answer is you don’t get rid of the free market, but you regulate it so that it is a truly level free market, which is the kind of stuff that Cory [Doctorow] is talking about in particular. And I think that’s what the whole proposal here is: to get that sensibly regulated competition, not laissez-faire, profit-at-all-costs things.
One way to do it with Facebook might be to spin off the filtering side of Facebook as a separate entity and leave Facebook as the underlying connectivity utility: make the code for the filtering side open source, give the spun-off entity the staff to operate it, but allow other big players to take that code, fork it, and build competing services that do a better job, or a different job, creating that kind of middleware function. And then with that base, there may be enough infrastructure so that little guys can play the same game and deal with the resource issues. So, the issue is to try and find a hybrid solution that works, because you’ve got lots of valid points about why we can’t just do it instantly. And then, on the business model, I think people have been blind in digital as to how you charge for digital services.
And one of the things I’ve been working on, some of which has appeared in marketing journals and in HBR, is ways to get pricing that’s much more variable and much more related to the value people get, so that you could do this reverse metering: people would have more control over the ads they see, and these companies would profit in a way that serves users, where the user has a seat at the table, basically.
Ramesh, did you want to jump in on that?
Lots of important points, and really nice to meet you. I think of Tim Wu’s book The Master Switch when you discuss the breakup of the Bell System and the greater competition introduced as a function of that action. And that’s interesting to think about, given that Tim Wu is in the Biden administration, and what that would mean exactly. But the main point I’m trying to make is that I understand that, moving forward, we need a more multiplicitous market here; I just don’t necessarily have faith that more market entrants would mean that some of those entities, or the market at large, will keep a kind of North Star focus on these public interest issues.
So I think, again, we are talking about public and private actions, and we have to be realistic about that. And I love the idea of some of these reforms being able to support sustainable small and medium-size businesses that can then help take up some of the problems that are occurring. I’m not sure that we could ever convince Facebook, or any of these guys, to make their code open source per se, because the code seems to be a pretty proprietary intervention. However, they could give us lots of heuristics and lots of mechanisms. And we could also cut off their reach in particular ways, or we could even, on a heuristic level, influence the way the algorithms and data collection systems and so on are designed.
I think in terms of competition with Facebook itself, and a few reporters have asked me about this, and I’m sure you all have too: the network effects issue is such that it would be hard to imagine. I know we’ve had these little stories over the past 10, 15 years where these social media companies form, and then everybody flocks to them and they crash on a server level, or people flock to them and then never use them again. Remember Ello, everybody? E-L-L-O. That was a great example of that. So, I’m on Facebook not because I’m really interested in anything that Mark, no offense intended, or Sheryl Sandberg have to say or post, other than hopefully a better response to what’s occurring than what I’ve seen so far, but because I want to see what Justin has to say, and what you have to say, and what Nathalie has to say. We’re on there for one another. So it’s really, really important to recognize, given that across Facebook-owned products we have 3.7 or so billion people, that it’s hard to imagine an Ello, or almost any other bottom-up organization, being able to compete on that level. So I think that’s the main thing. I think we have a lot of overlap in our positions on these things, which is great.
Well, I think we are almost out of time, so I want to draw this to a close and thank all of the folks who are on. Ramesh, I want to thank you for joining us to represent your work with Dipayan, and I’m very grateful to you. Nathalie, thank you for being part of this the whole day; you’ve given us a substantial part of your afternoon, and I’m very grateful to you for both your essay and your ongoing activism in this space. And Dick, of course, we’ve talked about this event in more depth than any of the others; I appreciate very much all your leadership on this and your enthusiasm for it. Thank you for helping to put this together and being part of the afternoon as well. I’m very grateful to all of you. And Bryan and Rob, who are still up there, thank you so much for all your help with this event. And Dick, a last thought from you?
Yeah. I just wanted to thank you for organizing this and providing the platform for it because I think it was a great event that was badly needed and hopefully will influence progress going forward.
Many complicated aspects to this, we’ve got to keep going. Bryan, any last thought there?
I just want to echo Dick and thank everybody for joining, and thank all the panelists. The conversation today was fantastic, much needed, and appreciated, and I think it was a good starting point for future conversations. So thank you to everybody for participating today.
Thank you all; I’m very grateful, and we will talk to you soon. And please do listen to the podcast, where this audio will show up in due time. Thank you. Have a good day.
Justin Hendrix is CEO and Editor of Tech Policy Press, a new nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & Innovation. He is an associate research scientist and adjunct professor at NYU Tandon School of Engineering. Opinions expressed here are his own.