Reconciling Social Media & Democracy: Tracy Chou and Mike Masnick

Justin Hendrix / Oct 16, 2021

On October 7th, Tech Policy Press hosted a mini-conference, Reconciling Social Media and Democracy.

While various solutions to problems at the intersection of social media and democracy are under consideration, from regulation to antitrust action, some experts are enthusiastic about the opportunity to create a new social media ecosystem that relies less on centrally managed platforms like Facebook and more on decentralized, interoperable services and components.

The first discussion at the event took on the notion of ‘middleware’ for content moderation, and the second discussion looked at this question through the lens of what it might mean for the fight against misinformation. The third discussion featured Cory Doctorow, who discussed the notion of competitive compatibility.

This fourth session features Tracy Chou, founder and CEO of Block Party, a company that builds consumer tools to combat online abuse and harassment, and Michael Masnick, the editor of Techdirt and the author of Protocols, Not Platforms: A Technological Approach to Free Speech, a paper we discussed during the event.

Below is a rough transcript of the discussion.

Justin Hendrix:

Welcome, Tracy Chou, the founder and CEO of Block Party-- a software engineer and a diversity advocate. I've invited her here today to tell us a little bit about the company she's starting, the thinking that's gone into it, the strategy she's got, and her broader thoughts on this space. So I'm going to let you have an opening salvo, and then I'll ask you a couple of questions about it. And then after that, I'm going to bring in Michael Masnick, who of course is founder and editor of Techdirt. He has a couple of examples in mind of what's going on out there in the world that indicate what this potential decentralized future might look like, as well.

So thank you for joining us in the evening in London, Tracy. Tell us about Block Party.

Tracy Chou:

Sure. Thank you. So with Block Party, we are building consumer tools for online safety and anti-harassment. The way that our current product works is that you'd sign up for Block Party, link your Twitter account, set some filtering rules, and then Block Party runs in the background to filter out things that you may not want to see. Anything that's filtered out ends up in a folder on Block Party. You can go check it out later if you want to. So a lot of the premise of our company is giving people back control, being able to control what they see and when they want to see it. The immediate use case is solving harassment, which is a very big pain point. But the generalized version of this is wherever you're going online, you should be able to be in control of that experience.
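As a rough illustration of what those user-set filtering rules might look like in practice, here is a hypothetical sketch; Block Party's actual rule set and internals are not public, so every field and name below is an assumption.

```python
# Hypothetical sketch of user-configurable filtering rules in the spirit
# of what Chou describes. Block Party's real rule set is not public.
from dataclasses import dataclass

@dataclass
class FilterRules:
    allow_followed_accounts: bool = True  # people you follow always get through
    min_follower_count: int = 100         # screens out throwaway accounts
    require_profile_photo: bool = True    # default-avatar accounts get filtered

def should_filter(rules: FilterRules, author: dict) -> bool:
    """Return True if a mention should be routed to the review-later
    folder instead of being shown to the user."""
    if rules.allow_followed_accounts and author.get("is_followed", False):
        return False
    if author.get("followers_count", 0) < rules.min_follower_count:
        return True
    if rules.require_profile_photo and not author.get("has_profile_photo", True):
        return True
    return False
```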

Justin Hendrix:

One of the things that's been discussed quite a lot today is this idea of a power struggle between firms like yours and the platforms-- that you're attempting to work with much larger entities and figure out a way to create value in a context where there's clearly an asymmetry. How have you navigated that in trying to build what is effectively a middleware solution?

Tracy Chou:

When I was first starting the company, it was a big concern of mine what the relationship with the platforms would be like. It's actually been much better than expected. Right now, we're working with Twitter. We do want to go cross-platform eventually. But Twitter has actually been quite helpful, and there seems to have been a bit of a sea change internally, where they're recognizing how important it is to have other folks working on solving this problem with them. Obviously, there is still some responsibility internally for them to address this. But they're now starting to think about how to empower developers and third-party applications to give consumers more choice. And the framing, I think, is that with applications like Block Party, which are built on top of the Twitter API, some folks are going to have a better experience than they could ever get through what Twitter would build internally, given what makes sense for Twitter to prioritize. Some of these applications may simply be too niche for Twitter itself to prioritize.

And in other cases, they actually prefer a third party to build some of these things. So I'm hopeful that other platforms will follow Twitter's lead in terms of opening up their APIs a little bit more. Specifically, when I'm talking about the APIs, I mean the moderation and safety constructs-- things like muting, blocking, and reporting. When these are available through the API, it allows third-party applications like Block Party to build on top of them and give users a lot more options in how they want to engage.

Justin Hendrix:

So can you maybe give us a little more depth on the technical interface specifically-- what you've built at the moment, how complex it is, where the problems might lie? And to some extent, what could the platform do to make it easier, if it were in its interest to do so?

Tracy Chou:

Yeah, the way our integration works right now is pretty straightforward, on top of the publicly available API endpoints. To synchronize data from Twitter, we're just pulling, very regularly, all of the @-mentions, replies, and a bunch of other data from Twitter, like the lists of follows, mutes, and blocks, and the user data that we're going to make filtering decisions on. And if a user that has been tweeting at you doesn't pass your filtering rules, we'll issue a mute call through the Twitter API, which means it will carry through to all of the interfaces you might be accessing Twitter through. You continue using Twitter as normal. The things that are critical there are that we can access all the data we need to make the decisions, and that we can issue the mute calls. Additionally, once we pull all this into our system, users can come to Block Party and choose to block all those folks as well. So being able to programmatically block is important.

The fact that we can pull the data and issue these mute and block calls is kind of critical for this. There are some other things that would be helpful to have. Right now, the reporting API endpoint is not very powerful; it's mainly for spam, not for the kind of use case we are thinking of. In general, what's critical for platforms to implement, for Block Party to be able to build on top of them, is... Well, internally, they need to have these sorts of trust and safety constructs, these moderation constructs like muting and blocking.

There are some places where they have not thought these through. For example, with Twitter direct messages (DMs), the abstractions around what is a request and what is in the main inbox-- and whether or not you can move things back and forth-- are not very well thought through, so it's very difficult to build on top of that. There's a core product piece that the platforms need to implement first, and then exposing it through the API is what lets third parties build on top of it.
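To make the sync-filter-mute flow Chou describes concrete, here is a minimal sketch against Twitter's public v2 API. This is not Block Party's actual code: the rule in passes_filter_rules() and the token handling are hypothetical stand-ins.

```python
# Minimal sketch of the sync-and-mute loop described above, using the
# public Twitter v2 mentions and muting endpoints. Not Block Party's
# actual implementation.
import requests

API = "https://api.twitter.com/2"
# Muting on a user's behalf requires a user-context OAuth token.
HEADERS = {"Authorization": "Bearer USER_CONTEXT_TOKEN"}  # placeholder

def passes_filter_rules(author: dict) -> bool:
    # Hypothetical rule: filter out accounts with under 100 followers.
    # A real product would apply much richer, user-configured rules here.
    return author.get("public_metrics", {}).get("followers_count", 0) >= 100

def sync_and_filter(user_id: str) -> list:
    # Pull recent @-mentions, expanding author objects so rules can run on them.
    resp = requests.get(
        f"{API}/users/{user_id}/mentions",
        headers=HEADERS,
        params={"expansions": "author_id", "user.fields": "public_metrics"},
    )
    resp.raise_for_status()
    payload = resp.json()
    authors = {u["id"]: u for u in payload.get("includes", {}).get("users", [])}

    filtered = []
    for tweet in payload.get("data", []):
        author = authors[tweet["author_id"]]
        if not passes_filter_rules(author):
            # Mute through the API so the author disappears from every
            # client the user reads Twitter in, not just our interface.
            requests.post(
                f"{API}/users/{user_id}/muting",
                headers=HEADERS,
                json={"target_user_id": tweet["author_id"]},
            ).raise_for_status()
            filtered.append(tweet)  # stash for the review-later folder
    return filtered
```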

Justin Hendrix:

So tell us just a little bit more about the inspiration for the company, and the type of community you're building around it. Because I think it's important for people to think about: a lot of people imagine that this realm of middleware, if we move more substantially in this direction, will represent the interests of a variety of specific communities.

Tracy Chou:

Sure. On the background that led me to working on Block Party: I think the number one thing is that I deal with a lot of online harassment myself, so it was a very personal pain point I wanted to address. One of the frustrations has been that it's very difficult to get things actioned unless you have a personal connection. I'm privileged enough to know people who work at some of these companies. But to see that I could get special access and get accounts taken down just because I knew folks-- it feels very unfair. This should be something that is widely available.

The other part of my experience that is pretty relevant here is that I've worked as an engineer at a number of different social media platform companies. I interned at Facebook way back in 2008, and I was on the very early engineering teams at Quora and at Pinterest. I worked not only on moderation tools but also on engineering projects across these platforms-- everything from home feed, search, and recommendations to other product features. So I have an understanding of how product decisions are made at platform companies.

Interestingly, one thing that I identified quite early as an engineer in some of these companies, was that there was not very much diversity on these teams. That led to a different kind of prioritization around features and sometimes, a lack of perspective around what abuse-- or potential misuse-- of the technology might be. One example I like to give is when I joined Quora, the first thing I wanted to build was the block button. Because I was already getting harassed by somebody on the site, despite it only having a few thousand users. And I was very motivated to make this person the first person ever blocked on Quora.

But it was that very personal motivation which also made me realize that if I hadn't been there, it was very unlikely that the team would have prioritized building this. I've also done quite a bit of diversity and inclusion work related to this-- which, ironically or not, ties into why I get so much more harassment now, talking about these things. But I truly believe it is the lack of diversity, inclusion, and representation on many of these early teams-- among the engineers, product managers, and designers who were building these platforms-- that led to complete misses in thinking through the potential impacts of the technology, how it might be misused, and what safeguards to build in against that.

Many of the folks that we are helping the most with our product now do come from marginalized communities. If you count women as a marginalized community, that is a big one-- a lot of women get tons of harassment online. We're looking at journalists, especially female journalists, politicians, activists. Other people from marginalized communities often also receive elevated levels of harassment. So it is really important to us that we are helping to give people back the ability to participate online, where online abuse or harassment might otherwise silence them. And oftentimes it is the folks that we most want to hear from. Think about, during the pandemic, the scientists, doctors, and health experts who are leading us through this getting tons of harassment. Climate activists who are trying to bring attention to how important it is that we act urgently. Journalists who are speaking truth to... It's all these people who are being targeted, often from marginalized communities. And we want to build these defenses and give people back this control.

Justin Hendrix:

And there's a question in the chat, actually, that leads into my next question, which is whether you're a for-profit or non-profit company. Just want to clarify that for the record.

Tracy Chou:

We are for-profit. Our thinking around this is that we are building something that we think is useful to people and that creates value. It's a consumer product, and to build something that is ultimately scalable and sustainable... For this product, we want to build a for-profit entity.

Justin Hendrix:

So I'm going to bring in Mike Masnick here in just a second to talk about some of the other types of experiments that we see out there in the world. But just to follow up on that: is there a vision of the future-- with regard to this question of the potential decentralization of some aspects of social media, unbundling, or middleware-- that you or your investors have that you can describe here? Do you have a sense of a kind of zeitgeist at the moment, or a direction where these things might be going? Maybe slightly informed by what's happening in governments, but also, more generally, by the interests of other entrepreneurs and investors?

Tracy Chou:

I think I'm hearing more of it on the policy side, with people pushing for the big platforms to give up some of the ultimate control they have right now. It does feel like we are moving towards a world where people want to have more control over their experiences, and it should not be fully dictated by the platform companies deciding what is okay or not okay. One analogy I've sometimes used to describe this: some of the platforms will say, "There's freedom of speech. We don't want to be in the business of censoring people." At the same time, individuals should have the freedom to not listen to what they don't want to listen to-- particularly when it's abuse and harassment being hurled at them.

So there are a few things here. The platforms may not want to institute very strict standards around what is allowed or not allowed, and there's a pretty big space between what is definitely not okay-- like child sexual exploitation or death threats-- and what somebody may or may not want to engage with. So putting more of that choice into the hands of consumers seems to be the way to go, as opposed to having the platforms determine in all cases what is okay and what is not. If they apply one standard across the entire platform, it's just not going to fit all cases; it will feel wrong. Whereas if you put more of that control back in the hands of individuals, things can be calibrated to the communities they're a part of, or to what they're most comfortable with personally.

Justin Hendrix:

So just one last question for you and this is maybe just more specific to Block Party. But it's got this name, Block Party. It almost sounds like a party on some level, but this is serious, serious, serious stuff you're dealing with. How do you contend with that as a founder and then also in building a team?

Tracy Chou:

Yeah. When picking the name, one of the things we were thinking through was that it is a very serious topic, but we also want to bring some levity or lightness to it and make social media a better place-- it should be a place that is pleasant to be on. And right now, our online activity cannot be separated from the offline world anymore; everything is so integrated. To be a part of this world in any meaningful way, you have to have an online presence. So we're acknowledging that it is a part of life. Not everything we do is just dealing with the really negative stuff; we are thinking about the positive value that people get from social media and trying to let them access that. So we wanted to have that positive note as well.

That played into a lot of our branding. If you look at the branding of some of the other anti-harassment services or companies out there, it's very black and red, very warlike. We wanted to move away from that-- not to treat it as if it isn't a serious topic, but it doesn't have to be so dark and negative, either.

Justin Hendrix:

Excellent. And just really quickly before we bring Mike up, is there a place where people can find you?

Tracy Chou:

I am on Twitter too much @triketora, T-R-I-K-E-T-O-R-A. You can also find Block Party on Twitter @blockpartyapp_.

Justin Hendrix:

Excellent. Thank you. So I'm going to invite Mike up. Most of you will know Michael Masnick from his work at Techdirt. And of course, he's also on Twitter quite a lot, like Tracy. And unfortunately, like myself. But I'm going to post into the chat here an essay that he referred to obliquely in the discussion earlier, Protocols, Not Platforms. It's a good essay to read when you get a chance, if you haven't already, to get a sense of his thinking in this regard. I wanted to invite Mike to do two things. One, obviously, share his thoughts on this potential future. But then also to talk a little bit about some of the experiments he sees in the world, like Tracy's, that point to green shoots or possibilities. So Mike, I'll turn it over to you.

Michael Masnick:

Sure. Yeah. And I want to reiterate that what Tracy is doing is really interesting, and it's really worth checking out. I think it's one of the most interesting examples that we've seen. I use it. I like it. It's hard to describe how much careful thought appears to have gone into the product and how well it's designed to accomplish what it's setting out to do. So if you weren't convinced already by Tracy, it's a really thoughtful design of a product that builds on a tool that so many of us use and spend so much time on. It's a really great example, and a useful one to be thinking about. What Tracy and her team have done is take a platform that we're all sort of stuck with, for better or for worse, and say, "There are problems here." And for a variety of reasons, some good, some potentially bad, Twitter itself is not solving these problems. But Tracy and her colleagues have a view on how it could be better.

And rather than having to go apply to work at Twitter and get a job with the power to change Twitter, they're saying, "We can build this as a third-party app and go in and make the Twitter experience better." What I'd like to see, in a world that I think would be a better world, is that happening on a much wider scale: that across the different services, platforms, and tools you use, rely on, and both like and hate, if you see a way to build a better part of one, or to make use of the services it provides and build something better on top of it, you could do that-- and do it easily. Right now, as Tracy was explaining, she's somewhat reliant on the nature of the API.

And some of what Twitter allows is pretty good. Lots of people know the history where Twitter was good about allowing third-party developers, then suddenly was not good about it, and now has sort of gone back to being good again. Hopefully, they've learned their lesson. It feels like they have, and that they're going to continue to be relatively good about these things, but there's always the risk and the fear that they could backslide and go in the other direction. And then you have other platforms. Facebook is sort of notoriously bad. I don't know how many people saw it, but there was a Slate article today, which is somewhat horrifying, about a guy who developed a service to let you unfollow people on Facebook. It was actually another really clever intervention to make your Facebook experience better.

And Facebook banned him for life and threatened to sue him, which is not a good experience. Facebook has a fairly long history of these kinds of reactions: when anyone tries to make the Facebook platform better, Facebook might interfere with them or do bad stuff to them. And I think that's really, really problematic.

And it's a point that Cory [Doctorow] has raised plenty of times as well, and a lot of his ideas around adversarial interoperability-- competitive compatibility, however you want to describe it-- are about getting past that and not letting the companies themselves be so bad about this. So I think there's a lot of really important stuff that can be done. Some of this goes back to the example that both Cory and I talk about a lot, the Power case-- or power.com, or Power Ventures, however you want to refer to it-- where this company tried to build an interface, in the late 2000s era, for all of your different social media properties. So you wouldn't have to go to Facebook and Twitter-- and there were others around at that time-- and pay attention to each one; you could have this sort of centralized interface to all of them. And Facebook sued them, claiming that it was a CFAA violation for Power to log into your Facebook account with your permission, with you providing the credentials. They said that was an unauthorized login. And unfortunately, they won in court. There have been a bunch of other CFAA decisions since that might impact that if it's ever challenged again.

But there were a couple of very problematic aspects to that, that I think have really, really limited how much these kinds of companies can do to sort of provide these better services. As long as you're using the API and you're using it in a way that is allowed, that's okay. But that leaves out a lot of things. And I'm sure that... Speaking for Tracy a little bit here. I'm sure there are things that she would love to be able to do with Twitter that Twitter doesn't currently allow with the API, or with some other platforms as they expand past just Twitter. And I think that should not only be allowed, but it should be encouraged and that lots of people and companies should be able to build on it. So that's some of the thinking behind the Protocols, Not Platforms paper, sort of an ideal approach in sort of a different world.

If we were not in the world that we're in today, wouldn't it be nice if everything were built in this way, where anyone could build on them and anyone could build different layers and different tools that fit into it? But recognizing the world that we're in today, I would like to see things move in a way that allows for much more of that. Now, at the same time, to actually answer the question which I've sort of been dancing around and didn't quite get to: I think there are a lot of really interesting projects going on right now to try and build that better world. And a lot of them, for better or for worse-- and in many cases, probably for worse-- are happening in the crypto space. There are people who are building these new projects that are really interesting and, theoretically, really cool.

And if we didn't have a world with Facebook and Twitter and all these other things already out there, where all the users are, that might be really interesting. The problem is that everybody is on these other platforms. Nobody, for the most part, is using these mostly crypto-based platforms. There are some projects that are getting some traction: Mastodon, which obviously is not a crypto project, some projects around Scuttlebutt, and a few other things. And now the potentially interesting one is Twitter experimenting with this stuff with Bluesky.

That is a project Twitter announced two years ago: they're going to experiment with building a protocol setup for social media. They said they were setting up a team of engineers to work on it, and assuming something good comes out of it, Twitter wants to adopt this protocol. And then anyone else, in theory, could also adopt that protocol. It took them a really long time; they went through a very long process of talking to different people and trying to gather visions. Finally, just a couple of months ago, they announced the head of that project, Jay Graber. If folks don't know Jay, Jay is wonderful, incredibly thoughtful, and has a vision for all of this that is incredibly compelling.

And I'll say, I've seen Jay's vision for what she wants Bluesky to be. It's somewhat different than I think my own vision was, but she was compelling enough that I'm sort of convinced that she's probably right and I was not. So that's just getting started now, in terms of how they're setting up Bluesky. They've set up a community, in Discord and Matrix, where people are discussing stuff, and it's a little bit messy, as all communities can be. But it's really compelling for a few reasons. One of which is that, as I said, a lot of these interesting projects don't have any users, and a lot of the projects with all the really interesting users aren't willing to do anything interesting. In theory, if Bluesky is successful, Twitter can bring the users. And then I think that becomes a really interesting experiment, to have both of those components together. And with that, I think I kind of got to your question somewhere in there.

Justin Hendrix:

Yes, you did. Is there anything else out there you're excited about in the crypto space? We haven't talked much about crypto today. No one's brought up decentralized autonomous organizations, or any of these newfangled things that some investors are excited about. Anything else on the blockchain or in that future?

Michael Masnick:

Yeah. I think there are a lot of interesting things; there is a lot of junk, too. The problem right now is you can't just dive in and say, "Oh yeah, crypto is going to solve it," or, "Blockchain is going to solve it." A lot of those projects are disasters and will be. And I think the really interesting thing-- and I'm doing some writing now on it that hopefully should be out soon-- is trying to find the middle ground between the extremely mercenary nature of most of the blockchain world and the more altruistic, open internet, open source, building-a-better-society kind of world. Because there is a meeting point. The Venn diagram overlap might be small, but there's something really interesting in there, where you can build better tools and better setups that are not driven entirely by the purely mercenary, speculative nonsense that, unfortunately, is involved with most of the crypto world right now.

And so I think... I'm not going to name particular projects. There are different ones that I've seen that... All of them are so early stage or so sort of just being implemented now, that it's hard to say which ones are actually going to catch on. But I do know that there are people who are working on interesting projects. And I'm sort of trying to pay attention to which ones are actually interesting, and which ones are just sort of garbage fires like a lot of the crypto projects become.

Justin Hendrix:

Great. So Tracy, I'll bring you back in as well, just to bring the two of you into conversation with one another. One of the things, Mike, that you said in your piece on Protocols, Not Platforms is that, "This is an approach that would bring us back to the way the internet used to be." And of course, you're a prototypical blogger. I'm a blogger too, so of course we'd be into that. But is that kind of primordial state really something that we should be looking for at this point? Was that all so great?

Michael Masnick:

No, it wasn't all great. There were all sorts of problems in the world at that time, too-- some of which I try to address in the paper as well; there were issues with the way that played out. To some extent, I see it as a pendulum that swings back and forth between more centralized systems and more decentralized systems. There are good things about each of them, and there are lessons to be learned from each of them. So my hope is that in moving back to a more decentralized, protocol-based world, we can take the lessons of both what worked in the early days of the internet and what didn't, and try to get more of what did work: more user empowerment, more control by individuals, more power at the edges of the network, less siloed data, less control by just a few individual companies. But also take the lessons of what has worked well in the last 10 to 15 years: much better user interfaces, much better user interaction, the different interesting tools and services that have come out of it, and also more interesting business models.

One of the problems of the original world of protocols was that you were almost wholly reliant on people who were truly passionate believers in it, and that can get you a certain distance, but it gets trampled when a giant company comes in and says, "Well, I can build my own version of this, lock it up, have all the data, extract all the value from it." And that's a problem. Whereas if you can also put in place better business models that are more aligned with the end users-- having the control and the power themselves, rather than making sure it all goes to the central provider-- I think you can create something a lot more interesting. But a lot of that is theoretical at this point.

Justin Hendrix:

Tracy, is there an ideal state that you have in your mind, either past or future? What are you working towards? What do you want to work towards?

Tracy Chou:

This is slightly beyond the scope of Block Party, but I've been working through this theory of what the ideal online society might look like. It's not even purely online, it overlaps with the offline world.

When I look at the internet right now, there's really no governance, and it feels like a state of lawlessness: there are no rules that are commonly agreed upon, and there's no enforcement of the rules we do know exist. The only type of justice we have is mob justice. People are left pretty defenseless, and it seems like we're just lacking the kind of governance that helps keep society functioning normally in the offline world.

Obviously, there are challenges to just trying to port a governance model from the offline world into the online sphere, but I think we need to be thinking at that scale: "We need a whole different governance model."

When I think about the role that Block Party is playing right now, it's basically this: in this lawlessness, we give you a little bit of defense. You have a private security person who comes along with you. But it would be better if there were generally more safety, as opposed to you needing private security. I don't actually know what the answers are, but that is the world I'd rather get to.

People talk a lot about things like accountability without really defining what that means. And to me, accountability means that you must have some set of agreed upon rules, and then some idea of what punishment there is for violating those rules. And the rules can be at the level of laws in the offline world where there are things that are very clearly not okay or okay.

And there's also things like norms, social norms. Even if something is not illegal, if you're going to be shunned by society for doing weird things, you may not want to do them. So the cost to you for the transgression is ostracization.

I don't think we really have any of this online right now. The rules that different platforms have are usually pretty hidden, not transparent at all, and they change all the time. Really, only the moderators-- who are given these books of rules that they have to moderate based on-- know what is in there. Enforcement is very uneven, which makes it so the rules basically don't apply. And the social norms are terrible. It feels like the social norms right now are, "Be really rude all the time and it's okay."

So I'd like a lot of these things to shift towards a slightly more civil society like we have offline. Not to say the offline society is perfect by any means, but there's slightly more structure there that I would like to see online.

Justin Hendrix:

Mike, do you think this direction gets us towards that on some level?

Michael Masnick:

Yeah, I think it's really interesting. One of the things that I hope-- again, a lot of this is speculative and I'm guessing too-- is that if we had a world that allowed for more experimentation, more people being able to build on these platforms and provide better tools for people and their social media experiences, you'd also get much more experimentation and innovation to go along with it, so that we're not reliant on just Twitter setting the rules or just Facebook setting the rules. You can have a lot of experiments. A lot of them are likely to fail, but as you get those experiments, you might begin to get a better sense of what actually works.

And it might be different. Different things will work for different communities, and different people, communities, countries, and cultures will have different levels of what they're happy with and what they're not happy with. But allowing that experimentation to happen, rather than having a set of rules that are all set within 50 miles of where I'm sitting right now, I think leads to a better solution.

I can't remember when, or what book it was in, but Clay Shirky many years ago wrote this thing where he talked about the rise of the printing press 500 years ago. People know what happened before the printing press, and people have this sense of what happened after the printing press, but in between there was something like a hundred-year period in which everyone was trying to come to terms with the fact that the printing press exists and suddenly things can happen. And it was crazy. There was a lot of societal upheaval in that time as people were coming to terms with the fact that the printing press was allowing people other than the scribes in the church to write stuff down.

And it took a while for society itself to come to terms with the fact that the printing press exists and what that means. I feel like we're in that period right now with the internet-- what Tracy was saying about not having these rules or a sense of governance. There are attempts and there are things happening, but we haven't settled on anything, and everything is changing so rapidly. I'm hoping that we're beginning to see the light out of the crazy revolutionary period where nothing makes sense, towards one where it settles down a little bit and people begin to get a sense of, "These are the social norms. This is what is acceptable, and this is what is not acceptable, and how it's all going to work."

And so again, I think more rapid experimentation and allowing for more people to try more things, hopefully gets us to that point much faster, but who knows.

Justin Hendrix:

Well, obviously it's later in Tracy's evening, and we've been talking in this conversation for nearly three hours, so we can go into some slightly uncharted territory. Clearly democracies need to evolve as well. That seems to be the case.

I really want to bring it back to the news of the week, with the Facebook whistleblower and the revelations that came out of those discussions. Looking through the whistleblower's disclosures to the SEC, there's really some fascinating stuff in there about the types of studies and reports Facebook is doing with regard to content moderation and addressing online harms.

There are two ways of looking at that. One is: good Lord, this is an incredible amount of social engineering and manipulation of content going on behind the scenes, with no accountability and no transparency.

And on the other hand, of course, this is an incredibly expensive endeavor-- Nick Clegg likes to say there are a thousand PhDs involved in it. Can we imagine that in this world of middleware or unbundling, we can get to a place where it can support that kind of endeavor, that kind of social science-backed effort?

Maybe it's too hard a question.

Michael Masnick:

Yeah, it's a big question.

Justin Hendrix:

You can definitely try to cut it down or critique the question.

Michael Masnick:

I think there are a number of different elements within that question and it might be interesting or useful to break them apart. I'm not sure we want to take all the time that would be necessary to fully break them down, but in theory, it shouldn't be up to Facebook to have to do all of that work and to employ all those people.

A better setup in a better world-- and this has come up a lot in the discussion of the whistleblower and the research that they've done-- is one in which Facebook had made that information and data more open, specifically to researchers and academics who could then take it, go through it, do an analysis, and point out what is happening, without it having to go through the Facebook filter or, in some cases, be locked down and kept within Facebook. I think that would be really, really valuable.

Unfortunately, while Facebook does have its research programs-- and they always throw their comms people out to say, "Well, of course, we have this program. We work with all of these different academics," and all of that kind of stuff-- they're still picking and choosing who they're working with, and how they're working with them, and that's not enough.

The other thing is, if you move to a more decentralized world in which people had more control over their own information-- the information that they're producing and creating-- then it also becomes much easier to route that information to different researchers, or make it available to them, and allow that work to happen without being reliant on one giant company to make all the decisions around it, or to have to hire the PhDs it would otherwise need to hire.

There are other ways in which those services could be offered, and it shouldn't be up to the companies, who have a really strong bias for what they want that research to say-- or at least for which parts of it get out to the public.

Justin Hendrix:

So the open versus the closed model. Tracy, is that part of your roadmap? Are you thinking about that kind of thing-- how to bring in the social sciences, work with outside researchers, things of that nature? Are those the types of insights you'd eventually like to port into Block Party?

Tracy Chou:

Absolutely. One of the things I find most distressing about the tech industry is that it doesn't incorporate all this knowledge coming from the social sciences and from all these other researchers. So I would love to be attuned to what's happening in terms of research and where people are finding possibilities to do better. Also, there are a lot of researchers who have been calling out some of the issues that we see, and the tech companies have been able to largely ignore them.

I would like us to be more in a push and pull, where there's a bit more steering of where technology is going, instead of just having technologists-- who don't quite know what their impact on society is going to be-- making all the decisions.

Michael Masnick:

I think it's a really good point that Tracy was making and I think that if we had things more open and allowed for more people to be able to build stuff like what Tracy is doing, we can see different kinds of solutions. And right now, so many of the solutions are, again, created by a bunch of guys who look like me, more or less, within 50 miles of where I'm sitting, and that's not great.

And so if we had more ability to experiment and provide different solutions, and even just to show... Part of the problem is a lot of the people in the tech industry have this belief that they know what's best, and they're often wrong. Nobody knows what's best. So if we could just see a lot more experiments and a lot more different ways of going about things, we could inspire other people as well. And if it's built in a way that is more easily accessible, then it doesn't have to be just people around here, either. It could be people around the world who say, "I have a better way to do this, and I can just build a better solution and begin to inspire others." You get this flywheel of creativity going and create solutions that take into account not just the views of tech industry guys, but the wider world that we actually live in.

Justin Hendrix:

Tracy, are you at all operating in any countries or have users in places that aren't free at the moment, or that are partly free or that have other kinds of issues to do with internet freedoms? Is that a complication to your business in any way yet?

Tracy Chou:

It's not an issue right now. We have no limits on who can sign up. I don't think we have any users in those places, but right now it's really just like, if you can use Twitter, then you can use Block Party on top of it.

Justin Hendrix:

Mike, do you see any complication to this type of thinking outside of democracies, or outside of the places that score well in things like the Internet Freedom Survey?

Michael Masnick:

Yeah. There are all sorts of challenges, and some of them, I think, Daphne [Keller] brought up earlier in her talk, and Cory [Doctorow] brought up some of these ideas as well: there are risks involved in all of these different things. When you give different levels of control and power to different users, not all of them are going to use it for good. So there is a risk, in empowering more third parties, that you end up in certain countries where they basically say, "Well, you have to use our solution. And that is the one where we can spy on everything and we have control over who is saying what. And if we want to harass someone, we're going to enable that harassment and not allow it to be blocked," or something along those lines. There are all sorts of things and, certainly, risks for more marginalized populations and those without power that could be abused.

And that's a real concern that I think people definitely don't always take as seriously as they should, and it needs to be a part of any of these conversations. To go back to the crypto world, there are a lot of people in that world who don't think about that stuff at all and have no concern whatsoever about how what they're building can be abused and will be abused. And that concerns me.

There are people now, more and more, who are coming into that world and saying, "Wait, we have to think this through and we have to have more thoughtful approaches to this." So I'm hopeful that that sort of thinking will begin to permeate, but I do think that there are real risks. There are risks to any approach, obviously, when you have authoritarian governments or people trying to crack down on marginalized groups. But having that thinking from early on, I think is really important.

Tracy Chou:

One of the specific tensions I've come across a lot in working on anti-harassment is that many of the anti-harassment features you can build are bad for misinformation. So for example, being able to disable replies to your tweets: it's good in the sense that if you think of the space beneath your tweet as your space, you should be able to control it, and harassers can't post things there. But the replies to tweets are often where misinformation gets debunked as well. I think the solution Twitter built for that was still allowing quote tweets. So you can disable replies to your tweet, but people can still quote tweet you and add commentary. You have to balance these different features in that way.

And on things like Facebook pages, one potential anti-harassment solution might be disabling certain sets of words from being posted. If you don't want people to post insults or profanity at you, you might want to enable block lists. But that also means there can be censorship: politicians who don't want certain topics discussed on their page could try to use that feature to block the discussion. So there is this tension between giving people tools to defend their own space and those tools also allowing misinformation to proliferate. And that's just one example of these different tensions.
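As a concrete illustration of the reply-restriction mechanism Chou refers to, Twitter's v2 API exposes a reply_settings field at tweet-creation time; quote tweets are unaffected by it, which is the escape valve for debunking she describes. A minimal sketch (the token handling is a placeholder):

```python
# Minimal sketch of posting a tweet with restricted replies via the
# Twitter v2 API. Quote tweets remain possible regardless of this
# setting, which preserves the debunking path discussed above.
import requests

def post_with_limited_replies(text: str, user_token: str) -> dict:
    resp = requests.post(
        "https://api.twitter.com/2/tweets",
        headers={"Authorization": f"Bearer {user_token}"},  # user-context token
        json={
            "text": text,
            # "mentionedUsers" or "following"; omit the field to allow everyone.
            "reply_settings": "mentionedUsers",
        },
    )
    resp.raise_for_status()
    return resp.json()
```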

Michael Masnick:

Yeah. And I think that's a really good point. Almost all of these things have trade-offs, and it's often difficult to think through the trade-offs before you get started.

Another example of this in the anti-harassment space is how some anti-harassment tools have themselves been used to harass people-- in some cases, brigading against someone for calling out bigotry of some form or another. There have been attacks on people, trying to get them kicked off platforms or suspended, by claiming that they're harassing or abusing people when it's the opposite. So, again, thinking through all of the different trade-offs and challenges of each of these is really difficult. I don't think any human being can possibly think through them all.

But coming into all of this with an open mind and allowing for more experimentation, you begin to see what works and what doesn't really quickly. And the ability to iterate on that and change-- to recognize that when a tool meant for anti-harassment is being used for harassment, something probably needs to be fixed or changed-- I think is important. But that takes humility in building these tools and in how you go about them.

Justin Hendrix:

Mike and Tracy, I want to thank you so much for taking us, Tracy, inside your company and what it's doing, and Mike inside some of these incentives and some of what else is happening out there. Thank you both very much for joining this conversation today.
