Governing the Fediverse: A Field Study

Justin Hendrix / Oct 20, 2024

Audio of this conversation is available via your favorite podcast service.

A lot of folks frustrated with major social media platforms are migrating to alternatives like Mastodon and Bluesky, which operate on decentralized protocols. This summer, Erin Kissane and Darius Kazemi released a significant report on the governance of fediverse microblogging servers and the moderation practices of the people who run them. I caught up with Erin Kissane about their findings, including the emerging forms of diplomacy between different server operators, the types of political and policy decisions moderators must make, and the need for more resources and tooling to enable better governance across the fediverse.

Below is a lightly edited transcript of the discussion.

Erin Kissane:

I'm Erin Kissane. I am unaffiliated and independent. I am a writer and researcher and a long-time internet community nerd.

Justin Hendrix:

Thank you so much for joining me. We're going to dig into this report that you've produced on the fediverse, but I suppose I should just let my listeners know it's not the first time we've talked. We have a connection in the past through something that you were very involved in and helped to lead, the COVID Tracking Project. What have you been up to since the COVID Tracking Project?

Erin Kissane:

Since the COVID Tracking Project, first I took some time off to put my brain back together, because that was super intense, and put my body back together. And then I started in on a lot of fairly arcane research on the history of internet communities, and then started digging into some of the newer networks, the new school networks is how I think about this cluster. So you've got the fediverse, which I'm thinking of as this network of systems united largely by the ActivityPub protocol. Some folks are using fediverse to mean different things, but that's mostly how I've been defining it. We also have things like Bluesky, and there are others like Nostr that I haven't really got my head around yet except at the most basic level, but I got really interested in these decentralized or non-centralized or federated platforms as meaningful alternatives to the centralized platforms we've all been living with, in varying degrees of misery, for the last decade or so.

Justin Hendrix:

And I understand that's where this project comes from. You say in your report that you proposed to create this report and do this set of interviews that informed it. You say, "Based on our shared sense that the fediverse's history of resilience and expansion positions it as one of our best chances to allow more people to maintain strong social connections online while escaping the behavioral manipulation, pervasive surveillance, and capricious governance that characterizes large-scale centralized social platforms." We've talked about a variety of things in this same vein on this show in the past. Some of the people that you even refer to in the report, folks that are running servers that you interviewed, have appeared on this podcast in the past. I'm thinking of folks like Nathan Schneider. Tell me about how you came together on this project. How did it happen, and who is your collaborator?

Erin Kissane:

I've known Darius Kazemi, who is my co-researcher on this project, for a long time on the internet. He's definitely been dug into the fediverse for a lot longer than I have, and he is the maintainer of the Hometown fork of Mastodon, which is meaningfully different in a few ways that I think are really cool. So he has a pretty strong grounding in the mechanics of how this system works and also a critical perspective on networks.

It seemed like a pretty natural fit. When a friend mentioned that there was a grant happening through the Digital Infrastructure Insights Fund, Darius and I got together and thought about what both of us conceived of as the most useful thing we could do in the time allotted for a grant like that. And we both felt like there's a lot of theory about how things work on the fediverse, but there's not a ton of research based on actually going out and finding out what's happening specifically on the people level.

There's a lot of documentation, not necessarily easy to find, but it's around. About how ActivityPub works, how the protocol works. It's a standard protocol. You can find that. You can find information about how the various systems within ActivityPub work like Mastodon. But the servers in the fediverse are all run by people according to their own lights and their local norms. And although you could, and people have, I think usefully scraped all of the public server rules, things like that, you can collect those, you can diff them. That's interesting.

But there was just still this big question mark for me and I think for Darius, about how is it actually going? How do you run a server? What's it like, what's hard, what's great? Especially for him, he was really interested in what are the tooling gaps, what are the software problems or what's missing in the tool chain? And I'm especially interested in the cultural side of those systems. What do you need to do this the way you think would be best and how is it working for your members of your server?

So those were all, to me, open questions. We started talking about this back last October, got the grant and kicked off the project in January, and we spoke with people spread across 11 different Mastodon and Hometown servers, as well as some scholars and providers of other kinds of fediverse infrastructure, like the folks at IFTAS, Independent Federated Trust and Safety, who are convening moderators and filling some of the gaps that we actually talk about in the report, including things like helping with cross-server moderation of various kinds and clarifying reporting for illegal content like CSAM, that kind of thing.

So we were able to dig in pretty deeply with those folks, which was great. We had long conversations with multiple people on a lot of server teams and were able to put together what I think is a usefully heterogeneous set of responses. We do a lot of quotation in the report and to me, I was really looking for, we have to get deep enough into this that we find meaningful disagreement even within our subset.

Justin Hendrix:

For listeners of mine who may not be terribly familiar with the fediverse still, who are thinking, they're a few minutes into this podcast and they're wondering, why does this matter? Why should I care about it? You actually spend a little time on that in the report. You talk about the stakes, you talk about what's at play here. Why should folks care about the fediverse and the way it's moderated?

Erin Kissane:

I have a couple answers. One is that I think probably, I don't know all that much about your listeners, but I'm guessing very few of them are perfectly contented with the recent history of social networks. The fediverse for me is one of the plausible challenges to the current order and the recent past, the way things have gone. There are a few other models, but I think the fediverse is particularly interesting because it completely deviates from the norm of centralized cross-platform moderation, which also means it is working around the idea of centralized surveillance.

A lot of people who come to the fediverse come because there is no centralized surveillance, there is no central telemetry. Those things don't exist on the fediverse, they are impossible to implement at scale as things stand now. It's not that all of the fediverse is immune to scrapers and things like that, but it is just really hard to scrape everything or meaningfully be able to determine what's happening across the whole fediverse.

And I should say Mastodon and the fediverse are probably synonymous for a lot of casual users. Mastodon is one thing on this interconnected set of systems, but it's the biggest. There are also systems like Lemmy that work more like Reddit. There are file-sharing systems and blogging things, but Mastodon works more like Twitter used to work. That's the elephant. It's certainly the big charismatic megafauna.

Justin Hendrix:

Some of my listeners have probably heard other news about fediverse related things. They've heard about the idea that Threads is meant to be on a decentralized model or they've thought about Bluesky. How do those things fit together in this current fediverse ecosystem?

Erin Kissane:

So Threads is currently implementing really limited federation using ActivityPub. So Threads is technically partially part of the fediverse, but it's a very limited mode of federation and it's also opt-in on the Threads side. So it's a pretty tiny fraction of Threads users. Most Threads users are continuing to experience Threads as a closed platform. It is possible to follow Threads accounts from Mastodon. I don't know where we are in the "you can follow Mastodon accounts from Threads" progression. I think you can do some of that, but it's super, super limited, and there are a lot of theories about why that is, and the Threads folks have been going around to conferences talking about them. But right now the presence of Threads in the fediverse is really limited.

Bluesky runs on the AT protocol, it's their own system. That's a parallel to ActivityPub, but it works in a meaningfully different way. They wanted to build something that's different. They wrote their own protocol. It hasn't gone through the standardization process yet, but they do intend to take it to a standards body and develop it into sort of a peer protocol.

It is possible using some bridging tools to bridge back and forth between, for instance, Mastodon and Bluesky. So you can do some of that, but technically it's its own sort of circle with a little bit of overlap. And I think meaningfully Threads is the same way. Threads is a very large centralized service compared to the number of accounts on the fediverse, which is coming up on 12 million total accounts. I think Bluesky just hit 10 million. I'm not sure where Threads is, but it's an order of magnitude larger. So right now the fediverse is still pretty small, and although there is one really large sort of flagship Mastodon server, Mastodon.social, most of the fediverse runs on really small servers, a lot of single-user servers, a lot of really little ones with a dozen accounts or a few dozen.

And then the group that we were especially interested in for our research is the layer of servers on the fediverse, and specifically Mastodon and Hometown, that had what we called medium size. So this is folks who have more than a few dozen members. I think our smallest was around 80 users or members, up to the 10,000-ish mark, and there are a lot of those. Servers of that size just behave really differently culturally, especially in the way they handle moderation and the way they handle governance, than a giant centralized platform. And also probably meaningfully differently than a really large quarter-million-user, huge Mastodon server, which, probably because of its size, we hypothesize acts a little more like a centralized platform. But we didn't study those servers, so that's still a little bit more of a question mark. We dug into these medium-sized servers because they're doing things that are meaningfully different from anything you can accomplish on a gigantic platform.

Now, these medium-sized servers, big servers, tiny servers can for the most part all talk to each other. So you can still be in community with people on many servers, with some diplomatic exceptions, and in terms of your experience of the network, you'll have a broad view. You'll be able to connect with people across the fediverse, but your own membership and also the kinds of things you see will be determined by local norms. So if you have a server that's really focused on keeping out as much offensive, illegal, obnoxious stuff as possible, you're going to have one kind of experience. If you choose a server that has a more open-doors policy, where they're only going to take out of your stream the most obvious or illegal kinds of content, then you're going to have a different experience of the fediverse. So there are many experiences of the fediverse, and the server you're on is one slider on that board for what you're going to see and who you're going to be able to interact with, things like that.

Justin Hendrix:

So you've gone essentially on a kind of qualitative, almost anthropological journey here, talking to these various leaders of the servers and the folks who are essentially setting the moderation policies for their servers. Talk to me about some of the big themes, in particular around how the folks leading these different servers set community norms, set policies, how they go about essentially behaving like, I don't want to make this sound pejorative, but mini Mark Zuckerbergs, right? They're each responsible ultimately for being the deciders about how their server will operate and how it will function.

Erin Kissane:

I think first, I should take a step back and mention something I should have maybe mentioned early on, which is that the assumption that I went into this work with based on my experience of and research on the large centralized platforms is that model of content moderation is essentially failing, and that crystallized in something like Mike Masnick's paper, "Protocols, Not Platforms." It's essentially an admission that this model that we were working with, it works to a point and then it really doesn't and it collapses in a variety of ways depending on, as you say, who is the person making those decisions. I think very frequently you've got a person who's setting a culture and then you have a lot of really opaque decision-making systems that then take whatever that line is from the top and turn it into whatever kind of failure mode on a given centralized platform.

To me, something like the fediverse or, on the other hand, something like Bluesky, these are experiments with alternate modes. If this centralized global experience, mostly run by giant US-based tech corporations, often run by billionaires with their own personality issues, if this model doesn't work, what can we try? And the proposition of the fediverse is: let's try a whole lot of these different little jurisdictions, each with essentially home rule, and then if you don't like the way it works in one jurisdiction, you can move to a different one and improve your experience, and each of these jurisdictions can decide who else they're going to talk to.

The fediverse defaults open as it stands right now. You're going to have a list of servers that you don't federate with, and almost all the fediverse servers that are governed essentially at all have a list of the known worst actors. Those are the servers that are known hate farms; they have a lot of illegal content, they do network abuse and brigading and all of those things. Those are very widely defederated, to the point that they're really their own system. I've heard people talk about it as the dark fediverse.

And the proposition of the fediverse in terms of governance and content moderation is okay, so what does it look like when anyone can run a server? And the answer is there are a lot of kinds of people, so they run them in really different ways. You can absolutely have a server that turns out to be run by someone who doesn't want to be accountable to anyone and who wants to govern by their own whim. And the open source world has a propensity toward what is semi-affectionately called the benevolent dictator for life model. I think most dictators think they're benevolent and not all of them actually are.

And there are also some servers where you've got one person who's not particularly accountable to anyone, but they make mostly really pretty good choices so people on their servers are happy. There are some servers that are organized as full-on legal cooperatives and they have councils and boards and a pretty extensive... You mentioned Nathan Schneider is someone you've spoken with before, and he's affiliated with Social.coop which is one of the, I think furthest along of all the cooperative servers in terms of the sophistication of its governance. So you have all of these different little republics or dictatorships that may or may not talk to each other. Mostly they do, and what we wanted to do is dig in and see what are the commonalities people experience across these models and what are the big differences.

Justin Hendrix:

You're seeing all of these experiments with community norms, all of these experiments, as you say, with modes of governance, between ones that are more collaborative versus ones that are that benevolent dictator model. You talk about the idea that most of these servers have three to five active moderators, the folks who are actually doing the work on a regular basis, and I take from the report that a lot of what folks are doing is still very emergent, very reactive to the phenomena they're facing. You write that vibes matter. There's still a kind of sense of people feeling their way through it. What's your general take on where we're at with norms across the fediverse when it comes to moderation right now?

Erin Kissane:

Yeah, so I'll take for example one of the server teams that we spoke with. We actually spoke to one person from a very small team who runs a server that's focused on a sexual subculture. And that's not to say that if you're a member of that server, all you do is post something fetish-related or something that would be appropriate in a leather bar, but it does mean that those things are explicitly fine. If you are a member of that server, you can rest assured that your server moderators are never going to bust you down for posting things that would not be acceptable on, for instance, most centralized platforms. Certain kinds of nudity are fine; it's explicitly allowed. That team gets reports all the time from other servers that are like, "We don't like having butts and you have butts and it's a problem for us," and that server's team can be like, "That's what we do. It's okay here."

That's an easy example. But there are a lot of people who are talking about things that are frequently moderated down very hard in a lot of places, a lot of kinds of political conversations. They're still often controversial on a lot of fediverse servers, but you can have a server that says, for instance, we are going to focus on a particular kind of leftist politics that has room for this position on the war in Gaza and not that position, and that is a local norm. And as a member of that server, you know that you're not going to get your account suspended for talking about your political position.

That particular issue, the Gaza-Israel conflict, is something that came up a lot in our conversations, because it started right at the beginning of our research. A lot of sort of general-purpose servers were having to confront the fact that a lot of their members who otherwise got along pretty well and were happy together were reporting each other, or the servers were receiving reports from other servers about someone who was posting about genocide in Gaza or someone who was posting in support of the IDF. So that brought a lot of things to a head.

People in these situations who are running these servers have to make decisions and they have to make them explicitly and they have to make them for their own membership, which is I think the huge strength of the fediverse in moderation terms. You don't have to make a choice that's right for the entire network. You are not making a choice for a nation of people or the international community. You're making a choice for people on your server. So you can choose to do the right thing for a given group of people instead of an average best or something that is fair to an average rather than to specific human beings who are part of a community on your server.

Different servers handle that in different ways. There are many servers that are general interest, or they're, for instance, regional servers that exist in order to serve a French-speaking population, or people in Brazil, or people in the Bay Area of the United States, something like that. Those servers sometimes come with a strong sense from their leadership team of, also, these are our political positions, these are our lines about different kinds of content.

I will say that we spoke with servers who all practiced the Mastodon minimum viable moderation, which includes, we don't allow racist speech, ableist speech, we don't allow open transphobia, those kinds of things. How that sort of minimum set gets interpreted on different servers, obviously it's handled differently, but a regional server may not necessarily have a stated position on a new geopolitical conflict and they have to figure that out. What is the right position for them for their community?

And then inevitably, if you make a new policy or you change a policy, some of your members are probably going to be unhappy, at which point they can move to a different server that suits them and meets their needs. It's not true that you can move everything. You can't right now move your posts on Mastodon, which is a real sticking point. You can move your social graph, so you can bring your followers with you and maintain the people that you followed and some of the metadata and things like that. You can't move your posts, but you can still move your identity and your connection to groups of people. So there is more exit, in the exit-voice sense. There are more exit options on something like Mastodon than there are on Facebook, Instagram, Twitter, any of those things.

Justin Hendrix:

Which so many people are dealing with right now. Of course with X, the feeling of not being able to leave because they've built up such a social network or such a valuable source of information from all the folks they follow and interact with there. That's of course the great promise of the fediverse.

I want to ask you specifically about something I found really interesting in the report. I suppose on some level the big social media networks do communicate with one another. They do often take decisions seemingly together, or at the same time, often in response to world events. They do coordinate around things like terrorism in forums like the GIFCT, etc. But you looked at the very different phenomenon of what you call federation diplomacy, which I found really interesting: the kind of emerging behaviors of the different server operators in terms of how they actively get on with one another, the types of decisions they have to take about who to connect with, who not to connect with, whose rules to observe and whose rules to reject.

Erin Kissane:

We specifically talked about this dynamic in openly political terms, as federation diplomacy: what's happening when one server, one jurisdiction, decides whether or not to connect with another jurisdiction. We made that choice in part because we think that's actually what's happening at a systems level. These are political conversations. They have obvious parallels in offline diplomacy, but what they look like in interpersonal terms is drama, because drama is the interpersonal level of politics. So this is a good thing and a bad thing in fediverse terms.

If you are on a server and the server team, or perhaps just the individual running your server, gets into a fight with someone who's running another server and cuts the connections between your two servers, that can obviously feel unfair and upsetting, and then you have to move to talk to your old friends again. Not ideal; that feels like drama. But then as the fediverse continues to expand and evolve and increase in sophistication, it becomes clear that sometimes, when people do make choices to federate or not federate with other servers, to maintain or cut their connections, it really is pretty explicitly a form of political organization.

And where we saw a lot of this was with the very early stages of Meta's Threads platform starting to federate. Threads announced that it was going to do this a long way before anything actually happened, and the reaction on the fediverse was really polarized. There were a lot of people who were really excited about that: it's good for ActivityPub as a protocol to have it be picked up by Meta, and it's good for the network to make it so that people who want to have their home base on Mastodon are still able to follow friends or family members or celebrities or political figures on Threads.

And there were a lot of other people whose response was that involving your server in connections with a Meta project, given Meta's history of surveillance and various kinds of meaningfully terrible behavior, is fundamentally an unethical thing to do and is in fact in opposition to the root ethics of the fediverse. There's no consensus on what the fediverse means or what the fediverse should be, but there are a lot of people who have really strong convictions about what it is and what it should be, and everything else is wrong.

The classic thing happened that happens in any sort of political debate of this kind, or frankly religious debate of this kind, and you get schisms. So there were servers that hived off into what's called the Fedipact, servers that agreed that they would never federate with Threads or any other Meta product, and other servers that were obviously open to that and excited about it and saw it as a real path forward.

So what that did for a lot of the servers that didn't instantly know which way they were going to go is it forced more communal decision-making, and more open and transparent decision-making, about this diplomatic function than we had seen before. Many servers that normally just made decisions within their own small team of moderators and administrators wound up needing to find ways to talk to their membership, to open up and have a conversation and say, how many of you really feel one way or another about this? What is the right thing for us to do?

There was more deliberative democracy happening on some of these servers than they'd ever seen before, so that made it a really interesting time to do some ethnography on how those decisions got made. For some of the servers, that's going to be a one-time or rare occurrence, and for some servers it's turned into: maybe we need to be more consensus-driven or more deliberative about these decisions, which we now understand to be as much about politics and as much about ethics as about, oh, that server's letting spam through, or this server made an obviously poor choice to allow someone who posts hate speech to keep an account there. It was a really fun and interesting time for us as researchers to be able to watch that stuff happen and to hear accounts of these very recent conversations.

It really does feel like being present at a critical moment in the development of a kind of civilization, as these teams of just people, with many kinds of backgrounds, differentiate themselves and develop more sophisticated senses of what kind of jurisdiction do we want to run? What kind of local norms do we want to uphold, and ultimately, which people do we want to provide a safe home for? Those are difficult decisions. It's the kind of thing that you can't see on a centralized platform, because that kind of decision is first of all happening internally and in really opaque ways. And second, the group is never a small community or even a medium-sized community. It's everyone, and those central platforms are also dealing with jawboning and all kinds of political pressures and economic pressures and so on.

So this is something that I think can only happen, the kinds of conversations we saw and the kinds of cultural choices, governance choices these folks are making. It's really only possible in a federated system where you can make a call to just say, "Actually, no, we are going to opt out of the whole Meta scene and we're going to federate with servers that work in the same way so that we have a safer community for our members who feel really strongly about this."

Justin Hendrix:

There are so many different interesting phenomena that you document here: the desire for federation of moderation actions, so possibly making it so that different servers, I assume, could band together and maybe agree on certain terms about how moderation will be conducted across multiple different servers. In particular, you mentioned Darius was interested in this subject, the desire for more funding for development of tooling, but the fact that that might be hard to do in some cases because there's just not a lot of money in the fediverse.

Erin Kissane:

There's a very tiny amount of money in the fediverse. Yeah, it's a huge and well-acknowledged problem. It's interesting because, with the moderation tooling, as soon as our report came out, some of the folks who are working on this tooling, people who are not part of Mastodon the company but who are still working on the code base and things around it, got in touch with us within hours to say, "So we've made this proposal and that proposal and have this plan to solve that kind of thing." So there are some kinds of tooling problems that are going to be, not easy to solve, but easier to solve, because they can be handled by a small number of people.

Some of the more ambitious proposals for things people need, things more of the people who are running these servers want, involve particular kinds of communication. It is really hard right now to communicate with the leaders of another server. If you don't know those people, it's not obvious in some cases how even to get in touch with them. You can send a DM and hope that they see it, but they might not. So there are other layers of both technical and institutional capabilities that people who run these servers explicitly want.

Some of them are going to be relatively easy to implement. Some of them are not at all. And the people actually making the tooling, it's a tiny number of people working on ActivityPub-based services, with small amounts of funding, and a lot of the funding is coming from foundations and various kinds of government institutions, in the case of European developers. It's slow money. It's not the kind of money you get in a giant chunk upfront; there's not venture capital happening on the fediverse. So it's slow and frustrating, and the people who are doing the development work are frequently overwhelmed. This is open source work we're talking about, so it has the same problems that other open source systems often do. So that part is really tricky.

There are also just a lot of cultural things that aren't necessarily dependent on tooling, which people on the fediverse are figuring out how to solve for themselves and for each other. For instance, I mentioned IFTAS, Independent Federated Trust and Safety. They started a network of moderators and made a place for those people to communicate together and to standardize some of their processes and some of their work. That's not actually dependent on tooling. That's culture work, and there's just a huge layer of it that needs to be done. IFTAS is shouldering a disproportionate amount of that burden right now, but we're going to need more institutions and more of those kinds of projects.

Justin Hendrix:

Well, that leads me to one of the last things I wanted to ask you about. You cover the issue of moderator mental health. It's not the same, of course, as one of these large centralized platforms, where there may be tens of thousands of moderators operating in some sweatshop in Kenya or the Philippines, but there are still people having to deal with terrible content that folks post on the internet. Many of them, as you say, are volunteers still bearing the kind of emotional scars that come from that activity. What did you learn about that? What did you learn about what the fediverse might mean for moderator mental health?

Erin Kissane:

What we found was really interesting, because I knew from other kinds of surveys and research work that moderator mental health was a real issue on fediverse servers, but the people that we talked to, for the most part, were not experiencing really intense mental health and stability challenges for the reasons that you just outlined. And so we tried to push on that: if that's the case, then what is it about what you are doing that is perhaps more protective?

And one of the things that I think, I hope, will be most useful in the findings that we've published is how these well-governed servers are taking care of their moderators, what the things are that they're doing right. A lot of it is doing some really active technical administration work right up front, where you immediately eliminate the known worst of the worst servers. You get rid of the known sources of terrible things so that your moderators never see those. In a lot of cases, you take out the big known clusters of terrible things, and you have a moderation team that's more than a couple of people, so that people can rotate in and out and don't experience immediate burnout.

Even just doing those two basic things buys you an enormous amount of headroom compared to folks who come in and maybe don't know who to start defederating from. They don't have a starting list, they open up federation, and they're immediately drowned in just the worst of the internet. This is the open internet. There's no filter by default. You have to apply a filter. These folks are protecting their moderators by putting in a lot of proactive technical administrative work at the very beginning and then continuing to just pay attention: where are the problems coming from? Blocking those leaks quickly, ideally before your moderators hit them, but especially before your members hit them. Because that's the worst, right? It's bad enough if your volunteer moderators have to see horrible things, but it's even worse if your entire membership encounters something awful.

Again, institutions like IFTAS are doing a lot of education work about the early things you can do to level up and take protective action so that your moderators don't have to confront so much of that. They're also building tooling for known actions like graying out things that are likely to be traumatizing, gore and various kinds of child exploitation content, things like that, and doing hash matching to take as much of that out programmatically as possible. This stuff is available, but it's not available out of the box when you start a server.

The community is building these things for itself, and that's incredible to see, because this is not a network that started out with anywhere near the kind of resources or technical sophistication or infrastructure of even a small venture-funded platform. So it's really heartening to me to see all of these protective measures emerging and becoming available. Part of what we wanted to do in this report is say, hey, here's a bunch of servers that are having a better time, just as human beings running things, and these are the practices they think are important in maintaining that. Maybe if you're starting a server, you want to start here instead of starting from zero.

Justin Hendrix:

You mentioned that the report contains a lot of verbatim quotes from the various moderators you spoke to. And you end with one that I think is a nice way of maybe summing up what people see in the fediverse and what these folks are all up to together. From Johanna B, a moderator for Wandering Shop and CoSocial.ca, "And at least in the fediverse, the signal is still very high compared to the noise, whereas the collapsing legacy platforms are all noise at this point, doing more damage than good. I want to see things like federated social media set a better standard. I think it can. I don't know that it will."

I think that's a good place to stop. There are still a lot of open questions that your report raises. We'll see where things get to. I do want to ask you, though, as someone who I now think of as a person who studies interesting ways that we can use digital media to mediate collective behavior: what's next for Erin?

Erin Kissane:

There are a number of things that come directly out of the report that I would like to work on, that I'm taking around to different people to see... possible. There are far more open questions than there are answers. There's a lot of user research that I think would be tremendously helpful, and something that doesn't exist right now that I really want to work on is helping people who are maybe curious about the fediverse find the right place to go. It is not at all clear; it certainly wasn't to me when I wasn't using the fediverse regularly. Does it matter where I go, what server I pick?

When Twitter, X, first started imploding, there was a lot of advice about this, a flourishing of folk advice about how to Mastodon: it doesn't matter where you go, or it really matters. The truth is, it matters in that where you land will be the governance system you live in. You need to pick a jurisdiction, you need to pick your own locality, one that will be a good and fruitful place for you. It's really hard for people outside the fediverse to find those places. There are a lot of really good places, for many values of good. So that's something I would really like to work on next. But God, there are two dozen open questions I would like to take on. We'll see how the fall and winter go.

Justin Hendrix:

Erin, when there is another artifact of your effort to answer those questions, I hope maybe we can talk again. Thank you so much for speaking to me today.

Erin Kissane:

Thank you, Justin.

Authors

Justin Hendrix
Justin Hendrix is CEO and Editor of Tech Policy Press, a new nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & ...
