Color Of Change Demands Breakup of Big Tech, Action on Disinfo

Justin Hendrix / Apr 29, 2021

Earlier this week, I spoke to Jade Magnus Ogunnaike, senior director on the campaigns team at Color Of Change, the nation's largest racial justice organization with more than seven million members, about the tech policy priorities the organization published this month.

A recording of this conversation is part of the Tech Policy Press podcast on Sunday; subscribe here.

Jade Magnus Ogunnaike, Color Of Change

Justin Hendrix:

Color Of Change has been advocating on tech issues fairly prominently over the last few years. Why has tech become such a major concern for the organization?

Jade Magnus Ogunnaike:

In many ways, social media platforms like Twitter and Facebook and business tools like Google and Gmail have become our town squares, our meeting places. Especially during COVID, we saw more than ever that social events and community were being built via the Internet.

Because of that, some of the same problems and the same systemic racism that pop up in real life are popping up in those online spaces as well. I think, above anything, we see these tech corporations as businesses that need to properly regulate themselves, considering all of the civil rights abuses that happen on them daily.

Justin Hendrix:

So, is your membership particularly keen on these issues? Do you find that they respond to the tech issues particularly, given all the things that you're advocating for?

Jade Magnus Ogunnaike:

Yes. Color Of Change is an organization that runs campaigns on everything from economic justice, to climate justice, to criminal justice. I think tech is one of those spaces that at first can be a little bit challenging to access or understand. I think there's a great swath of us in this world who, due to gender bias or racial bias, sort of shut off when we hear "tech." That was me.

Before I worked at Color Of Change, I never thought issues of tech were for me or related to me in any way. It's really been through this work that I developed a politic around why this work is so important. One of our top-performing mailers last year was about Kyle Rittenhouse and how he shot three people at a protest, killing two. The most important thing in that story was the way Facebook had enabled it. Kyle Rittenhouse and his friends had planned it on Facebook, the posts had been flagged multiple times, and Facebook had refused to take down the posts and the event listings. So that's how you see how tech companies like Facebook can enable hate, racism, and actual real-world violence.

Justin Hendrix:

I count seven different priorities that are in your 2021 tech accountability priorities. Number one is breaking up the monopolies. So, Color Of Change is getting into the antitrust business. What's your concern with the kind of monopoly power of these tech companies?

Jade Magnus Ogunnaike:

Color Of Change has been leading the drumbeat calling on Facebook and Google and Twitter to regulate themselves: "Hey, put in civil rights infrastructure," "Hey, pay your content moderators a living wage." At every level of these companies, there is systemic, I want to stop using the word racism, so I'm going to say inequity. You look at a company like Google: it's touted as a great place to work, with incredible benefits, the best food, the best childcare leave policy, and yet the people who work in the cafeteria and the security guards who keep the campus safe don't even have access to those benefits.

So, we really see that federal regulation is the only solution to rein in the outsized power of big tech. It's really the only way forward. These companies have no accountability to us. What we've seen is that they will only make moves and make big changes under incredible grassroots pressure. So what we really need is for the Biden Administration to break them up.

Justin Hendrix:

So, you've got two priorities around Internet access: one on net neutrality, and one on closing the digital divide. Clearly these are huge problem areas, especially in the time of COVID. How do these two particularly affect Black communities?

Jade Magnus Ogunnaike:

Well, we know that the digital divide is highly concentrated in low-income communities, and Black communities are more likely to be low income. What we see is that these telecom companies, these tech companies, are operating with broken business models. Above all, they're going to prioritize profit. During COVID, for pretty much the first time ever, we saw children in public school have access to hotspots and computers, because that was the only way they could continue their education. It's made a huge difference. Having Internet in your home today is like having a phone line when I was younger: if you didn't have a phone line, you couldn't access most things in the community. So, yeah, the digital divide is concentrated in low-income communities, and Black communities are greatly impacted by not having access to broadband.

Justin Hendrix:

Let's spend just a minute on the next two. One is protecting privacy and preventing algorithmic discrimination, and then you also get into disinformation and prioritizing content moderation. I see these things as connected, of course. Just tell me a little bit about your thinking around content moderation specifically. You make some mention here of Section 230 reform, how you might like to see that unfold, and what the exemptions should be, particularly to do with civil rights violations, and there are some other stipulations here. What are you thinking about with regard to disinformation and content moderation?

Jade Magnus Ogunnaike:

We saw how disinformation became a subject of national conversation. After the 2016 election, after Donald Trump was elected, all of a sudden there was this hyper-focus on the fake news that circulated on platforms like Facebook and Twitter. When you hear that one of the most popular articles of 2016 claimed, with no sort of fact checking, that the Pope had endorsed Donald Trump, you see the real-world implications of disinfo.

We've seen it spread beyond presidential elections over the past four years. For me, the most important thing is what happened during COVID. When COVID first hit in March of 2020, there was disinformation claiming that herbs could prevent you from getting COVID-19, or that Black people didn't get COVID-19. Then we saw that Black people were the most impacted by the coronavirus: more likely to die, more likely to be hospitalized, and more likely to contract it in the first place. When I was growing up, we had newspapers and we had the evening news. Not that they didn't have their biases, but they definitely went through a fact-checking process.

People interact with social media the same way. They believe that if it's written down and it's on a professional-looking website, it's true and it's real. The problem is that this has horrific real-world implications. We're seeing that with the vaccine. Color Of Change released a Black Patient's Guide to COVID-19 this April on World Health Day, April 7th, because what we were seeing were people saying that the vaccine would make you grow an extra arm, or cause you to miscarry, or give you cancer.

The goal of disinformation is to get people to no longer trust institutions. As Black people, we totally understand why people don't trust institutions. But as a functioning community, as a functioning country, we have nothing if we don't have institutions and organizations that we can trust. Disinfo is a violent threat to our health as a society, to be honest.

Justin Hendrix:

Let's talk a little bit about biometric surveillance. You've got a set of concerns here around a moratorium on facial recognition, and you're promoting a particular bill.

Jade Magnus Ogunnaike:

What we see with facial recognition is that, flat out, it doesn't work. There have been so many cases. The one that comes to mind happened in Detroit a couple of years ago. A Black father was hanging out with his daughter on the lawn of his home and was arrested because facial recognition had supposedly matched him to the robbery of a diamond store earlier that day. Obviously, that wasn't true. The police, though, didn't ask questions. They arrested him and took him down to the station. It was a horrific and traumatizing experience for his daughter, and it's something that's happening at large scale all across this country.

I live in New York, in Harlem, and I can't walk down the street without being on a hundred cameras. Someone is taping me at all times of the day. The problem is that it's not as if a centralized, regulated authority is keeping track of this and we know what's happening with the data. Private businesses and the police, who we know do not have Black people's interests at heart, are filming all of us all day. It has to end. Biometric facial surveillance is truly a violation of all of our privacy. And, like I said earlier, the most important thing we know is that it just doesn't work. So let's just stop doing it. It's fake science.

Justin Hendrix:

I appreciate that. Is there, I guess, anything else that you'd want to emphasize about Color of Change's accountability priorities for the year ahead?

Jade Magnus Ogunnaike:

I think the most important thing to share is that, above all, these tech companies are often viewed as progressive. They're viewed as free and Californian, as the next step in our world, right? These companies, and technology, are going to solve all the world's ills. The truth is that they are corporations at the end of the day. They have the same priorities, which is valuing profit over the health of all of us, and it's important that we view them as that and not as hippie communes just because they have foosball tables and long maternity leave. These corporations must be regulated. It's incredibly important that the Biden Administration regulates these corporations and breaks up Facebook, Twitter, and Google.

Justin Hendrix:

Thank you, Jade.

Jade Magnus Ogunnaike:

Thank you so much.

Authors

Justin Hendrix
Justin Hendrix is CEO and Editor of Tech Policy Press, a new nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & ...
