
Evaluating the Algorithmic Justice & Online Transparency Act

Last week, Senator Edward J. Markey (D-MA) and Representative Doris Matsui (CA-06) proposed the Algorithmic Justice and Online Transparency Act, a bill that would “prohibit the discriminatory use of personal information by online platforms in any algorithmic process” and “require transparency in the use of algorithmic processes and content moderation.”

To better understand the bill, I spoke to Carmen Scurato, senior policy counsel at Free Press Action, the lobbying arm of the media advocacy organization Free Press.

Justin Hendrix:

Why did your organization advocate for this particular set of issues?

Carmen Scurato:

We got involved because we really think that the platforms need to be held accountable for their data practices, and we need additional transparency about what they’re doing with our content and with our data. This bill really hits on that. It asks, for instance: what are the companies doing with our data? How are they collecting it? How is it being processed? How is it being used in their algorithms? And it also has some really great content moderation transparency report requirements. So it goes through and requires that these companies actually tell us what they’re doing with our content and with our data.

Justin Hendrix:

Can you describe what you know of the history of this bill, where it came from, who were the voices involved and how long it’s been in coming to the fore?

Carmen Scurato:

Senator Markey has always been a champion for marginalized communities. And you can tell that this particular bill is aimed at addressing some of the issues that we’re seeing: the disparate, discriminatory impact that’s happening across the platforms in their advertising and in their content moderation. Since he has been a champion of those issues, he really took this up. We talked to him about what’s happening on the platforms in a way that gets at the heart of that and ensures that marginalized communities are not being discriminated against.

Justin Hendrix:

In some of the drafting of it, I can almost read the headlines of news articles: things from The Markup or from other coverage of the tech industry that have found various forms of discrimination and bias in particular areas. Can you just describe some of the specific ills that this bill is meant to address?

Carmen Scurato:

Part of the bill ensures that the platforms are places of public accommodation, or are treated as such. That means that, in the way they structure their algorithms and their advertising, they can’t discriminate in terms of housing, employment, or any sort of economic opportunity. That’s one of the ills this bill is meant to address: it makes it unlawful for the platforms to do that. We were seeing so many cases brought up where advertising for housing, employment and other economic opportunities was discriminating against people of color.

Justin Hendrix:

So if I go to a company’s website and it doesn’t serve me an ad because I’m Black, that would be wrong. And basically we’re saying that that would be wrong on platforms, as well as in algorithmically generated results.

Carmen Scurato:

Exactly. So this bill prohibits online platforms from using AI to violate Title II. It actually makes that explicit here.

Justin Hendrix:

So there are a couple of specific mechanisms in this. It creates a task force, which is a “whole of government” task force. How do you think that task force would work?

Carmen Scurato:

It’s an interagency task force. It lists a lot of different agencies that should have purview over these issues and really understand how the platforms are impacting people across our economy. Number one, the FTC is at the head of it. The bill asks the task force to put out a report 180 days after it’s formed to start understanding what these issues are. It says, “A task force shall submit to Congress a report containing the results of the study conducted…” So it’s really asking them: What is the discriminatory use of this personal information? How is this data being collected? How is it impacting employment opportunities, education opportunities, insurance opportunities, hiring and screening practices? I think at one point it even talks about housing, credit and lending, and even the way that proctoring is happening in academia, and how invasive some of those tools are. So it really is trying to get this task force to take a holistic view of what’s happening on the platforms.

Justin Hendrix:

The bill refers to the idea of Dark Patterns, which is something we know the FTC is studying in other ways as well. Is this the first time Dark Patterns has shown up in a piece of legislation?

Carmen Scurato:

I’m not sure. I think the Dark Patterns language gets to the design elements and how those design elements are actually perpetuating harm in a way that some people are just unaware of, honestly. So Dark Patterns can show up in numerous ways. And I think there is this element of deception that’s happening right now. And so that’s why, again, this transparency, this need to understand how the data is being used, can really bring to light what those discriminatory practices are and then how to remedy them.

Justin Hendrix:

So, this puts the onus on the FTC and then allows state attorneys general to do enforcement. Do we think the FTC is well enough resourced to do this work? Or do you think it would need additional resources or support to take on this type of enforcement?

Carmen Scurato:

The FTC will need some additional resources. Obviously they are starting their office of rulemaking, and this bill calls for the FTC to really drill down and get some rules on the books for these transparency reports, for the data collection reports as well. The FTC is an expert agency, and with a task force and with all these rulemaking powers, they’ll need some additional resources.

Justin Hendrix:

What do you know about the prospects for this bill? Senator Markey introduced it in the Senate, but it also has the backing of Representative Matsui. What do we think will happen now? What’s your hope for it?

Carmen Scurato:

Well, my hope for it is that members of Congress, both in the Senate and in the House, start co-sponsoring this bill. I really think that this is something that addresses issues we have been raising in the public interest community for several years. And it really drills down again into this need for transparency from these companies about what they’re doing with our data and what they’re doing with content moderation, because we’re seeing the disparate impact of their practices. And now it’s time for them to really show us what that means, so that we can start finding solutions. And so I really do hope that members in both houses start looking at this, get behind it, and start co-sponsoring it as well.

Justin Hendrix:

What will you do to make sure that that’s the case?

Carmen Scurato:

Well, we’re going to keep highlighting that these issues exist on the platforms, and show that we’ve done a lot of analysis on content moderation practices and on these data abuses. And we are going to do our best to make sure that this bill gets through, because it is something that we absolutely need in the public interest community and in academia. We’ve seen the impact of what platform practices are doing to our communities. So now it’s time for them to be held accountable and to really give us the transparency necessary to move forward and to find solutions.

Justin Hendrix:

Have you seen any response yet from industry or from industry associations, trade groups, others that may have a bone to pick?

Carmen Scurato:

Not yet, I haven’t. I’m sure there will be some reactions to this. The platforms are transparent to a degree; I think this is asking for some more, and that’s absolutely necessary.


Justin Hendrix:

It bears some similarities to the Schakowsky-Castor bill from a few weeks ago, which also calls on the FTC to do various forms of rulemaking and thinking around content moderation. Do you see any similarity, or are you familiar with that one at all?

Carmen Scurato:

I think the similarity there is to hold the platforms to their word. And again, it really addresses this need for transparency, for meaningful transparency, from the companies. I feel like the Markey and Matsui bill really drills down into some of these particular reports that are necessary when it comes to the data collection and the content moderation. And I think putting this task force together is also going to be really helpful, to have this interagency look into how all these opportunities are being impacted, whether it’s economic, housing, employment, and some of the others that we discussed earlier.

Justin Hendrix:

In Europe, of course, they have this more omnibus approach: things like the GDPR, or the Digital Services Act they’re pursuing. Can you see these various strains coming together potentially into some kind of omnibus legislation in the United States? Or is that a fantasy?

Carmen Scurato:

Yeah, what I like about this bill in particular is that it really talks about, again, data, the collection of data. It is not explicitly a privacy bill, but it still captures a lot of the elements of what we need to protect individuals, and to protect them from the abuses that are happening with information that is ours. And so that’s why I really like the way this bill is framed and the way it really centers marginalized communities.

Justin Hendrix:

Thank you!
