AI & Criminal Justice: A Conversation with Renée Cummings

Justin Hendrix / Jul 3, 2022

Audio of this conversation is available via your favorite podcast service.

One of the areas where applications of machine learning and artificial intelligence are most fraught with ethical concerns is in law enforcement and criminal justice. To learn more about the opportunities and the concerns, I spoke to Renée Cummings, who joined the University of Virginia’s School of Data Science in 2020 as the School’s first Data Activist in Residence. In addition to being an AI ethicist, she is also a Criminologist and Criminal Psychologist.

Below is a lightly edited transcript of the discussion.

Justin Hendrix:

Renée, how does one come to be a data activist? How did you come into that role?

Renée Cummings:

I'm still trying to figure that out. How did I really become a data activist? I certainly know that I've always been committed to justice. And I also know that where there is injustice, I always find myself in that particular space. For me, it was the criminal justice system. It was this curiosity that I had about these risk assessment tools that were being deployed in the criminal justice system to decide on sentences and parole, and just generally being used as a way to assess public safety. And of course, just safety and security in general. So I felt that these algorithms were behaving badly. I was questioning the kinds of sentences that we were seeing. I was questioning the fairness of these tools and whether or not they were equitable and whether or not they were paying attention to issues such as diversity and inclusion in the criminal justice space and whether or not these concepts were being used to design these tools.

I was also thinking about, who's designing this? What is their knowledge about the criminal justice system, about criminology, about psychology and sociology? And whether or not they've been thinking about the populations that they are designing for. And for me, coming from that space of working in homicide and gun and gang violence and just violence for many years, and understanding trauma, I kept asking myself, these tools that we are deploying in the criminal justice system, are they justice oriented and are they trauma informed? And I think it is that intellectual curiosity that moved me across the criminal justice sort of pipeline, and really directed me into the world of algorithms and AI and data science.

Justin Hendrix:

So let's talk a little bit just about your background in more depth. I mean, Hunter College, a bachelor's degree in political science and philosophy. I see media studies and creative writing. And then you take a turn. You go and get a master's first in rehabilitation science with a focus on substance abuse, and then on to John Jay, where you take a master's in criminal justice and look at terrorism studies. Just describe your trajectory. It's very interesting to me. Rarely do we see someone with this type of humanities background take this turn toward data science.

Renée Cummings:

Certainly. I began my career as a journalist working in print, and then I moved into broadcasting and worked in broadcasting for several years. I was also a sportscaster. So when I entered university, I entered with that excitement for media and broadcasting and journalism. I still love writing and creative writing; I think it inspires me to always think creatively and always position things in a way in which they could be juxtaposed for greatest effectiveness. And then I have also always been fascinated with addiction and how addiction impacts individuals and families and communities, and whether or not we were really understanding the impact of addiction on the brain. And my work in rehabilitation counseling looked at disability studies, including addiction as a disability. So I spent a lot of time working and interning in New York at several of our therapeutic communities, which are 24-hour substance abuse treatment programs, and also working with individuals with disabilities.

And I think it is, again, that intellectual curiosity about neurodiversity, neurological impacts, those relationships or the intersection of the brain and addiction. And really, that is what led me to criminal justice, because I realized that all of my clients... So I was working at a therapeutic community. All my clients were coming out of the two programs that were deployed on the streets at that time, in the nineties, in New York. They were both alternative-to-incarceration programs. So these were individuals who had come into contact with the criminal justice system because of substance abuse. And I realized that they all had a criminal record. So I was dealing with their disabilities, getting them ready psychologically and emotionally, working with them through sobriety and recovery, and also doing a lot of rehabilitation, vocational rehabilitation. But then this thing about relapse really got me thinking, and the intersection of relapse and recidivism.

So you're relapsing. And then you're finding yourself back in the criminal justice system. And that took me to John Jay. And while I was there, I was working in criminal justice, thinking about criminal justice, thinking a lot about forensic psychology. And of course I got myself involved in our terrorism studies, because we had 9/11, and at that time terrorism was something that we were thinking about a lot more. And even in terrorism studies, my work was around the psychodynamics of terrorism. So all of my work has really been in the space of justice, and then social justice, and understanding the interaction between the brain and society and really what happens. And I think that's what has me here in data science, because it, again, comes back to that question for me: always rights, freedoms, values, and how they intersect with the kinds of experiences that we have in society.

Justin Hendrix:

So clearly across the country, we do see police departments, law enforcement agencies, and federal agencies adopting various computational systems: to do criminal justice, to find potential threats, to handle the workflows of our criminal justice system, to allow for forensics and investigation, and the rest of that sort of thing. Do you think right now these tools are fit to understand the kind of complexities that you're talking about with regard to what we know of how the brain works and what we know of... I mean, it seems to me there's a kind of disconnect right now between what we're learning about the human mind and how it operates in context and the way our criminal justice system generally treats people. Is that right?

Renée Cummings:

I think it is, because I think the challenge that we are seeing in particular with the design, development, deployment, and adoption of many of these predictive analytic tools, predictive policing tools that are being deployed in preemptive situations, would be that they are not being designed with the kinds of data sets that would give us the kind of accuracy in decision making that is required. What we are seeing always would be the overreliance on historical data. And we've got to appreciate, when we think about justice in the American context, when we think about the history of justice and the intersection of justice and race, and justice and socioeconomic status, and just the color of justice that has been deployed in this country, we've got to think critically about those data sets. And one of the things that I speak a lot about would be the transmission of intergenerational trauma through our data sets.

And when we think about the experience of people of color in the United States, and the experience or the memory of that data, we've got to think, always, of enslavement and Reconstruction and Jim Crow, and many of the systemic challenges and biases and discriminatory practices, and then systemic practices that have undermined these communities. And now we find those traumas baked into our data sets. So if we are using that to create the kinds of tools that we see being deployed on the streets... And what we're also seeing is that these tools are being deployed in spaces that have already been traumatized and victimized by our criminal justice system, particularly law enforcement. So we're definitely not getting the results. That's one of the reasons that, at UVA, I have this project where we have designed what we call the Digital Force Index. And the Digital Force Index is really a compilation and a corralling of all the surveillance technologies being deployed in the United States.

It comes up with a score that looks at a particular community using your zip code, for the individual to see: a public interest tech tool, an accountability tool for policing, to bring more transparency and accountability and legitimacy to the ways in which we're deploying algorithms or just doing data science in criminal justice. And what it looks at, and what we realize, is that we continue to deploy an extraordinary amount of technology in spaces that are already vulnerable, and we're not getting the kinds of results that we're hoping for. So surveillance tech is not reducing crime in real time. It's not reducing violence. It is not building community resilience. But what it is doing is providing a forum for a lot of tech companies and vendors to come up with these tools that they think are going to be the answer to crime in a community, and we're not seeing it.

And I think that's one of the reasons as well that I am so passionate about the work that I do, because I always say, for me, data activism is about bringing more accuracy to the decision making process. It's about how we use that critical understanding of data to improve decision making. And that's what I'm passionate about: improving the data quality, encouraging responsible innovation. And I always say data activism is real time auditing, real time checks and balances of what we are doing. And for me, it is about bringing people into the conversation about data, and understanding that we have got to have those conversations that move beyond privacy and data protection and data management. Conversations that are situated around rights, freedoms, and values, and looking beyond the legal to engage the social and the psychological and those many facets of those dynamics that create who we are as individuals.

Justin Hendrix:

Some folks might listen to what you just said and hear the kinds of concerns about the current state of these technologies and their application, and wonder why you're not, full stop, an abolitionist when it comes to the use of certain artificial intelligence or different surveillance systems in criminal justice or in law enforcement. How do you find yourself on that line? It sounds like you're in many ways more of a reformist, or a believer that we can potentially apply these technologies well if we actually, I don't know, do it differently, or have a more robust perspective. Is that the right way to characterize the way you look at it?

Renée Cummings:

In a particular way. I look at it like this: there are some things we definitely need not be doing, and I am committed to ensuring that we don't do those things, because we've just not been doing them well. And they've just been creating more trauma. So anything that is designed to track, to trace, to terrorize, or to traumatize an individual or a community, I'm not supporting that. But I'm also committed to the ways in which we can use data science and technology to transform the world. I'm committed to the ways in which we could use AI. I'm very passionate about AI and very passionate about the things that we can do. Its transformative ability, its ability to create sustainable legacies, its ability to bring that kind of precision decision making into the policy space so we can really get the equity and the inclusion and the diversity that we continue to speak about in tech.

And if we are really committed to responsible innovation, then I think we've got to also be responsible in our thinking. Our data science is very powerful. Data has the ability to really reimagine and redesign the kind of experience I think we deserve as a society. So while there are some things that I'm totally against, there are some things that we can definitely do. And this is why these conversations are so critical. And this is why data activism is so critical, because I see myself as that bridge between companies and government and civil society and bringing the kind of consciousness that is required, the kind of critical thinking and creative thinking around data, around innovation, around technology, in particular AI, and engaging us in the conversations that ensure that we are all part of the process when it comes to understanding everything from data collection to data analysis.

Justin Hendrix:

So just for my listeners' sake, is there an example perhaps that you would give of one application that if you were in charge for a day of a particular law enforcement agency, that you would simply abolish, that you would do away with? And maybe one application where you would invest, where you think that there's a particular opportunity to expand justice?

Renée Cummings:

So what I would invest in would be providing law enforcement with that critical knowledge of data science, and understanding data visualization, and understanding how to really use data that is justice oriented and trauma informed to build the kinds of systems that are required and to deploy the kinds of strategies that are needed. So data for community resilience is what I would definitely encourage. I would definitely invest and spend in knowledge building, building those knowledge assets of our police officers. What I would not invest in would definitely be, at this moment, the deployment of any kind of facial recognition technology into those spaces. But then, to be honest with you, anything that really looks at predictive policing. I think it is just not at the place that it needs to be, in particular because of that historic data that we continue to use to build these systems.

And one of the things that I would always say is that I've been in policing and criminal justice and criminology for quite some time. And police officers and law enforcement, I think they have all the tools that are required for them to get their job done. I think when we combine lazy policing and cavalier police practices with technology, we're creating a danger zone. And I talk a lot about us creating what I call the digital choke hold, which is using data science and using technology and using algorithms to create this choke hold that is also a deployment of force. So although we are not seeing it, data, when weaponized against us, could be used in a very forceful way, because it is often so challenging to reverse the impacts of technology on society, or to pull back the deployment of an algorithm that has already impacted a community.

So I would definitely want to bring more criminologists and criminal justice practitioners and professionals and law enforcement professionals into the data science space for them to really get that critical understanding within the context of justice, equity, diversity, and inclusion, and how powerful those concepts are as creative design concepts. And of course, ethics. Ethics as design. Using ethics as a creative way of thinking about design. Definitely invest my money there. Anything that's preemptive and anything that's predictive and anything built on historic data sets, I would say put that on hold and let's focus on doing things right in real time.

Justin Hendrix:

So obviously in the last couple of months, we've had a couple of major events, big shootings: a racist-inspired shooter in Buffalo, New York, and in Uvalde, Texas, unfortunately, a school shooter who killed small children. And there have been a lot of calls for more social media surveillance after these particular violent events, as there have been around a variety of other events that have taken place. And yet, across the country, we're seeing police departments hoover up social media data, often in ways that are increasingly problematic. How do you look at that conundrum: that in some cases we have people putting evidence into the public domain about potential crimes, and yet law enforcement still feels it doesn't have the tools necessary to kind of catch the signal in all the noise? How do you look at that problem?

Renée Cummings:

It comes back to the knowledge, and whether law enforcement is best equipped to really translate those kinds of messages that we are seeing in social media. So that's why, even in law enforcement, it is so important for us to have that interdisciplinary thinking. I know Dr. Desmond Patton has done an extraordinary amount of work in that space, looking at the ways in which violence on social media could be filtered onto the streets. And he is definitely the expert in that space. So while there is great value to the things that we can decipher from social media, there's also great concern: if law enforcement is the one that's actually finding that, how are they interpreting it? So there are many things there, but that requires really pulling a team together that includes your social workers, your psychologists, your forensic psychologists, your criminologists, your data scientists, and really understanding what those messages are.

I think when we think about the kinds of violence that we're seeing, we are realizing that we need to invest in mental health. And we are still not focusing on teen or juvenile mental health, which is so critical, and really putting the kinds of resources in communities, because families and communities interact with these individuals long before the nation has an opportunity to be victimized, or the nation has an opportunity to be traumatized, by those individuals. So really, we've got to look at families, we've got to look at communities, we've got to look at schools. School teachers come into contact with these individuals. And if we provide those support systems and we provide the kind of wraparound services that are required, that multidisciplinary, particularly multidisciplinary family therapy and services, I think those are the places that we need to invest. And not just into surveillance technology or any deployment of an algorithm, saying that an algorithm is going to stop this or do this in real time.

Justin Hendrix:

When you said 'digital choke hold,' it reminded me of Desmond Patton's phrase, 'digital stop and frisk,' which I know he's talked about, particularly in the context of the behavior of the New York City Police Department. One of the other kind of concerning things that I've been tracking a little bit is the use of false profiles on social media by law enforcement to kind of infiltrate different groups, often groups of teens or suspected gang activity and the like. And of course that behavior is more common when looking for potential pedophiles or terrorists on the internet. And yet, I don't know how much you look at the kind of frontiers of AI. We start to see potential applications of conversational agents or large language models. Do you imagine that we're going to see those types of things employed? Will we eventually have police departments trying to deploy kind of, for lack of a better word, bots that are snooping around looking for potential crimes? Do you worry about those types of things?

Renée Cummings:

Well, I think it's already happening. I can definitely say that. Whether or not it worries me, it is of concern, but I also know that there's a very robust ethical AI and ethical data science and responsible tech community out there, including myself and yourself in that space. And I know that these things will not go unaddressed. So people are experimenting. And let's be honest, law enforcement is not an easy job. And it is a job that often requires a very creative way of thinking because perpetrators and bad actors are also very creative thinkers when it comes to deploying or just engaging in criminal activity or that criminality mindset. So things are definitely happening. But I think once we ramp up the robustness of the ethical AI and ethical data science and ethical tech community, I think we're going to be okay because we are having the conversations.

And it comes back always to public trust and accountability and transparency. And whether any of these tools that are being used are legal or are going to get the kind of legitimacy required to make them part of a standard operating procedure. I don't really see that. But what I do see is that we do have the kind of ethical vigilance and the kind of ethical resilience among this responsible tech community to keep law enforcement in check.

Justin Hendrix:

Do you have particular kinds of concerns about either constitutional issues or a legislative agenda that you'd like to see pursued? Are there things we need to change now that have become apparent to you, about either our legal protections or constitutional protections, that you think are crucial?

Renée Cummings:

Definitely. I think, just in the data science space, the way we are thinking about big data: we've got to think about that within the context of civil rights and civil liberties and human rights. And we've got to understand those rights and those freedoms and those values attached to data. And that's why I think I'm so passionate about having conversations about data. So definitely data rights, a big issue at this moment. And we are going to see more and more and more questions around data rights, given where we are at this moment in this society, and given the fact that your data could be easily weaponized against you and can create really challenging situations for individuals. So I think more and more, we are all going to realize that we are data activists, because once our data is being accessed, we have got to be able to voice our concerns and ensure that our freedoms and our rights and our liberties are in place.

Justin Hendrix:

In the wake of the Supreme Court's decision essentially overturning Roe v. Wade, there's been a lot of writing about concerns around privacy, the ways that data from data brokerages and social media may be used by law enforcement seeking to criminalize women seeking reproductive health services. Do you think that creates a substantial new opening to address some of these issues in a way that perhaps some of the other concerns around racial justice in the past have not? Can you see these things coming together and potentially producing a new movement in this country?

Renée Cummings:

I think what it's going to do is amplify questions and concerns, because when we think about the critical situation that we're in at the moment because of what has happened recently, what we are realizing is the intersections, the women who are also going to be impacted. And again, we are going to see that intersection when it comes to healthcare equity, or the lack of equity in healthcare, and who is going to be impacted. And we are going to feel the impact more in our underserved, high-needs, minoritized, vulnerable communities. Those are the women who are going to be impacted the most. So it comes back to that question of equity. It comes back to the question of justice and the question of trauma. So I think what we are going to see would be more intersections when it comes to understanding the power of data, and understanding the privilege and the prejudice attached to data.

Justin Hendrix:

I want to ask you maybe a question to kind of finish things up here. And I don't know quite how to put it, but I ask a lot of the folks that I talk to some version of: are you presently... I don't want to say optimistic or pessimistic, but are you presently kind of... You've obviously been optimistic in this conversation, talking about the ability of this movement to potentially produce different outcomes. But in the broader trajectory of the US right now, with AI, data science, surveillance, law enforcement, that intersection, are we headed in the right direction, in the wrong direction? What's your sort of threat vector, threat valence at the moment? How concerned are you?

Renée Cummings:

I think, for me, it always comes back to the duality of data, in that data is as powerful as it is vulnerable. So we've got to ensure those protections, those robust guardrails, are there to ensure that we bring the requisite levels of ethical vigilance and due diligence when we are thinking about data. I am always very optimistic, always optimistic, because for me, it is always about knowledge. Where there is knowledge, there is no fear, because we understand the work that needs to be done. I think my work, when it comes to AI ethics and data activism, is about building that kind of confidence where people feel comfortable to have those critical conversations around data. It's about public trust. It's about public confidence that gives technology the requisite legitimacy it needs to be designed and to be deployed and to be adopted.

I come from that space of responsible innovation. I come from that place of understanding that we've got to do data responsibly. We've got to bring an ethical approach to the ways in which we think about AI. And we've got to understand the long term impacts of data and society. So I'm excited. I'm optimistic because I believe at this moment, data is just the one thing that we can use to really create the sustainable legacies required to make those critical decisions about how we are going to use the resources that we have that may not be renewable and really understanding the importance of accountability. And I think if people have the knowledge, I think if people have the understanding about the power of data, of course, and understanding what we can do with it, I think we are going to be okay. So for me, it is always about being optimistic because I think in optimism, there is hope. And in hope we have the ability to really build those sustainable futures that are required. And just to build a world that we could all be proud of.

Justin Hendrix:

You sound like an educator, somebody investing in other people's understanding and hoping that they'll create that better society. Tell me what's next for you? Will you remain at UVA for a period of time? And hopefully we'll see you again here in New York City as well.

Renée Cummings:

Definitely. I'm going to be at UVA. I love what we're doing with the School of Data Science. I love the kinds of connections and intersections that we are creating. And I think working in the space of data activism, working as an AI ethicist and still keeping my hold on that criminology, criminal justice space, it's just really exciting. And as I said, for me, it's always about accountability. And one of the things that I really want us to do with data would be to use it equitably and to govern it with the kind of consciousness that's required. So to move just beyond the compliance model that we've been using for a very long time and bring in that critical thinking and lift and raise the kind of consciousness that's required to ensure that we not only inform with data, but we use data to empower ourselves and each other, and to really build a sustainable world.

Justin Hendrix:

Great. Renée, thank you so much for speaking to me today.

Renée Cummings:

Thank you. Thank you very much, Justin.

Authors

Justin Hendrix
Justin Hendrix is CEO and Editor of Tech Policy Press, a new nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & ...
