Through to Thriving: Protecting Our Privacy with Chris Gilliard
Anika Collier Navaroli / Nov 16, 2025
Audio of this conversation is available via your favorite podcast service.
Thanks for listening to another episode of the special podcast series, Through to Thriving. This week, we spoke about protecting our privacy with Chris Gilliard. Chris is co-director of the Critical Internet Studies Institute and the author of the forthcoming book from MIT Press, Luxury Surveillance.
Chris and I talked about how he protects his own privacy while maintaining his online presence, how the pandemic reset work norms, how growing up as a kid in Detroit started his pathway to protecting privacy, how technology can alienate us in our own neighborhoods, and how AI is being used for surveillance.
We also talked about the changes from Twitter to X and what it meant for Chris:
I owe a ton of my notoriety to being on there for a certain moment in time. It's how some people know who I am and how some of my ideas got out there and things like that. You know, I wound up testifying before Congress because someone knew my work from Twitter.
So there was a time when it really drove the narratives, particularly in media. It had a very outsized influence in the ideas that were put into the public sphere. That may or may not still be true, but it's mainly true to the extent that those narratives are right wing. I think, because of the way that it's been altered and tweaked, gaining notoriety for sort of leftist or progressive ideas is not really a thing on there like it used to be.
Chris also explained the concept of “luxury surveillance”:
I argue that an ankle monitor and an Apple Watch are essentially the same technology. They're essentially the same thing. And that we can really understand some of the things that tech companies are up to when we tease out those parallels. That there's a segment of devices often chosen by people who have the ability to say yes to surveillance where they think that they are likely to gain some benefits and they don't understand them as surveillance devices. And that ultimately a big part of this is helping to normalize surveillance for all of us by the fact that people embrace these technologies and understand them as aspirational.
During our conversation, he detailed his abolitionist stance toward technologies that are designed to surveil:
I am of the firm belief that many of these technologies should not exist and that we should reject them in any and all ways possible and available to us. I know that some people don't consider that practical. But, I think that the negative social impacts of these technologies have already been well established. I mean, look around.
I think for some of these technologies, there is not a legitimate use case, or at the very least for something like the Meta Ray-Bans, I would argue the main reason they exist is to normalize ever-present surveillance and lower the barrier for all kinds of harassment and antisocial behavior.
Chris also discussed the impacts of surveillance technologies on society and the goal of his work:
Right now I'd love to live in a society where if you said, it harms this segment of people, then people's response wouldn't be, well, but I still need to get my packages or whatever. I would love for that to be true. Unfortunately, in my experience, mostly not. And so part of what I have set out to do, or what I hope some of my projects do, is show people that the negative effects of these things are not isolated to the most vulnerable. Now, it's true that they hit them the hardest, the earliest, the most often. But they eventually come for everyone. It's also coming for you.
Finally, he discussed his hopes for the future of privacy and technology:
I have a phrase that I say often, which is: every future imagined by a tech company is worse than a previous iteration… And the reason I say that is because the imagination of the tech company is driven by capitalism and the need to extract maximum value from us. That in order for these things to exist in a way that didn't do that and actually benefited us, we kind of have to rewrite a bunch of the ways that things work. I think that's possible. I think it is actually super dark right now.
But, I think, in a lot of ways, the tech barons are really overplaying their hand. There's not really any pretense anymore about whose side they're on, whether they believe in things like racial justice or equity or even democracy. I don't have that question anymore.
I mean, again, I think some of them should not exist. But if you wanted a device that you thought helped you better stick to your fitness goals that didn't send that to a company that then sends it to an authoritarian government, that would be nice, you know?
What follows is a lightly edited transcript of the conversation.
Anika Collier Navaroli:
Hey y’all. Thank you for joining us for another episode of Through to Thriving. This is a special podcast series where I'm talking to a bunch of brilliant tech policy folks about how we can build futures beyond our current moment, and today we are going to be talking about protecting our privacy with Chris Gilliard. Chris, would you mind introducing yourself to our listeners?
Chris Gilliard:
My name is Chris Gilliard. I am the co-director of the Critical Internet Studies Institute and the author of the forthcoming book Luxury Surveillance.
Anika Collier Navaroli:
I'm glad you mentioned your book, because we're definitely going to get into that and talk about some of the things that you are working on, but we are here today again to talk a little bit about protecting our privacy. Our viewers do not know this, well, they don't see this, but I can actually see your face right now, which is why I said, "Wow, I've never seen your face before." Because you are a notoriously private person. Yes?
Chris Gilliard:
Yeah, absolutely. I do my best.
Anika Collier Navaroli:
Can you tell me a little bit about what it means and how it is to be so online while also maintaining your privacy?
Chris Gilliard:
It is difficult, because I think that there's an imperative driven by tech companies, but I think a lot of us have helped drive that imperative, that we're supposed to give away everything online, supposed to put pictures of ourselves, talk about our hobbies, show our pets, our families, all those things. I generally resist those things. I have some ideological reasons for doing so, but it's difficult.
Anika Collier Navaroli:
I would love to talk to you about your ideological reasons for resisting, but I also would just say, as a personal point, I'm incredibly jealous. I think one of the things in my life that I've hated the most about the past couple of years is how visible I have had to be, and how much my face has been in various different places, as someone who is also notoriously a private person and didn't have my face on the internet ever, for many, many years, and now have to be in this place. It's a very, very interesting sort of thought. As you said, there's sort of an imperative to give away everything online, and I would love to hear a little bit more about your ideological resistance to that. Tell me more about that.
Chris Gilliard:
Shorthand, I would say that I try whenever possible to not feed the machine, and so that means when I have an option, and sometimes you don't, but when I have an option, I do not show my face or I do not give away intimate details about myself, things like that. I was in the Washington Post, Will Oremus did a profile of me several years ago.
Anika Collier Navaroli:
Right. Yes. I remember this.
Chris Gilliard:
It was quite the negotiation to get them to have pictures of me, but not pictures of my face. It was very difficult, but whenever possible, I try to not feed the machine.
Anika Collier Navaroli:
He's the one who also did a profile of me in the Washington Post, and my face has been everywhere on that. I remember seeing this and being so jealous and being like, oh, man, what kind of negotiation did you get?
Chris Gilliard:
The piece almost didn't run, for that simple fact.
Anika Collier Navaroli:
That's really funny. I think this not showing of your face in so many ways is such a power move, to be really honest with you. I remember I was working at a tech company that shall not be named, and I was really burnt out, and I had just decided, I'm not showing my face on camera anymore, and I just started showing up to meetings, no camera on. And I remember the sort of resistance that people had to that, and how no one really wanted to say anything. No one wanted to be the person to be like, "Can we please invade your privacy and be inside of your house right now while we're all working from home?" But it is definitely a shifting of power dynamics to be able to be in that space.
Chris Gilliard:
Yeah, absolutely. I'm glad you brought that up, because the height of the pandemic really exacerbated some of the worst impulses there. I thought, and have been thinking, a lot about this in terms of educational technologies, where remote proctoring services and things like that were forcing students to show images of their face and do room scans and all these other things. But during that time when we were all Zooming everything, there were even different accounts, Twitter accounts I think, that would rate people's rooms.
Anika Collier Navaroli:
Yes, yes, yes, yes.
Chris Gilliard:
Yeah. We were telling everyone we had to have a ring light and all these things, and I thought that that was so problematic, and so under-investigated, that we were being conditioned to accept the idea that strangers had a right to peer into our houses. I mean, I think GenAI is changing that now too, because now we're being told that we have to optimize our looks with GenAI, which is a different kind of intrusion, but I really think these things need to be more investigated.
Anika Collier Navaroli:
I'm so glad you brought up this invasion of privacy and COVID, because I remember there were so many different points at so many different times when I was working that I was like, I don't really want to see your bed. There were times when people's partners would be in the back in various states of dress and distress, and I would just be like, who okayed this? This sort of complete invasion and this sort of complete resetting of norms? There was never a world in which I would ever have co-workers, no offense to my co-workers, in my house. My home was sacred, that was my home. And yet we were in a space where these folks that would never be in my home were in my home every day, and that just felt so strange.
Chris Gilliard:
Yeah. And if you think about all the things that could be gleaned just from the background of individuals’ settings, because a lot of people don't have a dedicated space where they podcast or Zoom from. So given the world we live in, there's reasons why people wouldn't want everyone to have that kind of access. And there was a lot of work done to normalize that, I think to our detriment.
Anika Collier Navaroli:
Talking about learning from the backgrounds of people, I've talked about this on the podcast before, but one of my favorite things during the pandemic, when we were stuck inside of people's homes and were in all-hands meetings inside of these tech companies, is that we would have to go inside of the CEOs' and the executives' homes. And they were not the folks that you were mentioning who didn't have a separate place to work. I will never forget, and I know some folks who listen will remember this: we had a person at one place that we worked at that, during the pandemic, we could have sworn was at a spa, right?
Just like the breeze blowing and the little curtain in the background. Meanwhile, people were in alleyways, you know what I mean? Huddled into a computer. I mean, the disparity that showed at that time was just something that, as you said, is completely under-investigated, right? This complete disparity between who had the ability to have these amazing places that were dedicated to work and who was forced to let people stare inside their homes.
Chris Gilliard:
Yeah. Absolutely.
Anika Collier Navaroli:
I haven't thought a lot about this. I appreciate you for bringing this up.
Chris Gilliard:
And that's a big part of the reason I started saying no, because I have the option to say no. I don't want to overstate my importance in any way, but I thought that if I can say no, I should say no, because it gives other people the option. They can say, well, there are some people who don't do it, and here are some of the reasons why. So more and more, when I was being interviewed for podcasts or for television shows or whatever the case may have been, I would not show my face, and I just tried to establish it as a norm that gave people the permission to not do it if they didn't desire to.
Anika Collier Navaroli:
Well, I appreciate you setting that standard, and I want to encourage all of our listeners who want to follow in your footsteps to go ahead and do that. I feel like we need to have a revolution of the camera off, a revolution over here of, you may not be inside of my home. You said you don't want to overstate your importance, but everybody listening to this podcast, and myself, part of the reason why I wanted you to be on here is because you are very important in this space, and you're a person who has helped shape our nomenclature and the way that we think and talk about tech policy. And you do that a lot by being online, which I think is a kind of dichotomy with being extremely private. Would you consider yourself very online? Or, on a scale of offline to very online, how online would you consider yourself?
Chris Gilliard:
Well, that's an interesting question because I mainly don't have social media.
Anika Collier Navaroli:
Okay.
Chris Gilliard:
I mean, I use Bluesky. It used to be Twitter, but I mainly don't use social media. I've never had Instagram. I quit Facebook in 2016, 2015. So if being online means being on platforms, I guess I'm not really, but I do spend a good chunk of my day reading and researching articles that I mainly access online.
Anika Collier Navaroli:
Right.
Chris Gilliard:
Yeah.
Anika Collier Navaroli:
Right. And I think what I see of your posting is consistently about the sort of reading and research that you're doing: here is this thing that's in the news, or here is this thing that's happening in the world that you might not necessarily see. You might've seen it, but you also add your own commentary and your context to it, bringing it sort of to the foreground in this space. And I think you have been one of the folks that people have constantly gone to to see what is happening in the space. What should we be thinking about, what should we be talking about? Where do you go to find all of your reading and your research? Because, I mean, there are times you'll post an article that I'm like, oh, where? Where did you find that? That's interesting. And also, where did you get this from?
Chris Gilliard:
I mean, kind of everywhere. I have an RSS feed, I do check in on Bluesky. I mean, I try to follow other people who are doing and saying interesting things. I do unfortunately still read some of the legacy media.
Anika Collier Navaroli:
We all do.
Chris Gilliard:
Yeah. I mean we have to.
Anika Collier Navaroli:
It's kind of part of the job, right?
Chris Gilliard:
Yeah. I think there's been some incredible work by independent media like 404, I should shout out ProPublica, there are so many, and individual journalists like Brian Merchant, researchers like DAIR. I mean, especially in the last three years, I've had the ability to spend a good chunk of my time every day reading and writing about this stuff, in ways I don't think are mostly available to people. And so yeah.
Anika Collier Navaroli:
It's your job. It's your job.
Chris Gilliard:
Yeah. So I try to get sources from outside the US, from the rest of the world. I mean, I check the Financial Times every day. All over.
Anika Collier Navaroli:
Okay. Can you tell me what you check every day? Is there a rotation that you look up in the morning and pull out the paper, or how do you do this?
Chris Gilliard:
Well, I check my RSS feed.
Anika Collier Navaroli:
Okay. You said you had to get-
Chris Gilliard:
I use Feedly. I mean, it's a little bit problematic.
Anika Collier Navaroli:
Everything's problematic.
Chris Gilliard:
Yeah. Yeah. I mean, I also have Apple News. I have a variety of terms that I set up alerts for. I still use Google Alerts.
Anika Collier Navaroli:
So do I.
Chris Gilliard:
Yeah, so I have a whole kind of system set up, and checking it is part of what I do when I get up. It's not the first thing I do, though.
Anika Collier Navaroli:
I was going to say, is that the first, open your eyes and it's like, all right, tech policy, what's going on?
Chris Gilliard:
No, no, right.
Anika Collier Navaroli:
You have a life.
Chris Gilliard:
It's a little bit healthier. I go for a walk first.
Anika Collier Navaroli:
Okay, okay, okay. We love a healthy balance in the tech policy world.
Chris Gilliard:
Yeah.
Anika Collier Navaroli:
Okay. All right. Good to know that that is not the thing that is opening your eyes and making you go. I appreciate you sharing that. I'm one of these folks too. One of the first things I do in the morning is read through my three to four different newspapers. I feel like my dad, who used to wake up every morning with the hard newspaper and his cup of coffee and be like, "Let's see what's happening in the news." It's still an interesting part of life, but I think now we do so much of that through social or through RSS feeds or through the various digital ways that we can. And you mentioned moving from X to Bluesky. So many people have made that move. Can you tell me a little bit about your exodus journey?
Chris Gilliard:
Yeah.
Anika Collier Navaroli:
Transition, whatever we want to call it?
Chris Gilliard:
Well, I left actually the day Musk acquired it.
Anika Collier Navaroli:
Okay.
Chris Gilliard:
He walked into headquarters with the toilet, or the sink. I'm sorry, the sink.
Anika Collier Navaroli:
The sink. A toilet might have been better.
Chris Gilliard:
I wonder why I thought it was a toilet. Yeah.
Anika Collier Navaroli:
It's always going to be a toilet in my head from here on out.
Chris Gilliard:
It was an actual slip up, right. I knew it was a sink, really. I loved that day.
Anika Collier Navaroli:
Yeah, okay.
Chris Gilliard:
I wrote an article with a dear friend and brilliant colleague, Kishonna Gray. We wrote a piece in Wired, and I believe the title was “Digital Migration is Nothing New for Black Folks.”
Anika Collier Navaroli:
Yeah. Okay. Tell me a little bit about that.
Chris Gilliard:
And I just explained that I don't think I had any special insight, but I knew it would become what it has become, that it exists to amplify right wing narratives and poison our information sphere. That happened a little differently than I anticipated, but that's what was going to happen. And so I could not in good conscience stay on there. But also, I think even to this day, there are debates about whether or not people should be on there, which I think kind of fundamentally misunderstand how it works now.
Anika Collier Navaroli:
Tell me more.
Chris Gilliard:
And what I mean by that is, I owe a ton of my notoriety to being on there for a certain moment in time. It's how some people know who I am and how some of my ideas got out there and things like that. I wound up testifying before Congress because someone knew my work from Twitter. So there was a time when it really drove the narratives, particularly in media. It had a very outsized influence in the ideas that were put into the public sphere. That may or may not still be true, but it's mainly true to the extent that those narratives are right wing. I think, because of the way that it's been altered, tweaked, gaining notoriety for sort of leftist or progressive ideas is not really a thing on there like it used to be. I think people who argue that you should stay on there don't understand that very well, and I feel weird saying this to you.
Anika Collier Navaroli:
And tell the audience. I'm the choir that you're preaching to; the audience is the congregation.
Chris Gilliard:
I mean, hopefully I'm representing it accurately, but I'm not the expert on that in this conversation. And so it was obvious that that's what was going to happen, and so I left. Now, I wound up on Bluesky. Again, like all social media, it has its problems, and it does not have the reach at all that Twitter did, but I think it is a better alternative for the moment.
Anika Collier Navaroli:
Yeah. Well, I appreciate you talking us through that. And I believe this could be, I think that this is true, let's say this: you were talking a little bit about progressive, leftist ideas being able to have a voice and a notoriety and a sort of outsized influence at a point on Twitter, and that wasn't always the case. I think a lot of the work that I ended up doing was thinking about shifting power balances and thinking about whose voices were being amplified, especially by the algorithm or the platform. I'm pretty sure that you were one of the folks that I worked with to get verified and then-
Chris Gilliard:
Absolutely. Yeah.
Anika Collier Navaroli:
Okay. My brain is like, I blocked it out. I have blocked out a good amount of that. I think the year was 2020, and I have blocked out a good amount of what happened in the bird app between 2020 and 2021. But yes, I appreciate you talking a little bit about that, because that was work that so many of us inside of the company were sitting around looking at and seeing, and we were saying, "Whose voices are actually being amplified here? And why is it so often the white men who happen to know somebody who works here, and not necessarily the folks who have the ability to write an article that says something like Digital Migration is Nothing New for Black Folks?" Right?
Chris Gilliard:
Yeah, no, I mean, it's how I testified before Congress. It's how I was able to write for Wired and The Atlantic. It's how I was able to talk about video doorbells on NBC. I mean, a lot of that came from that exposure.
Anika Collier Navaroli:
Well, shout out to everybody at Twitter who worked on those various campaigns. There was a lot of work done whose fruit we now see in ways that we didn't necessarily think about while we were sitting there doing it. So I appreciate that. But talk to me-
Chris Gilliard:
Yeah, no, it was incredible. It was absolutely imperative to me sitting here right now, actually.
Anika Collier Navaroli:
Well, I'm glad that you're sitting here right now, and I'm glad that you are talking to me and talking to us about privacy. And I know that we've been sitting here talking a lot about this idea of privacy, and I want to talk a little bit about your work and get into some things that you have done, but I want to ask you just generally, what does privacy actually mean to you, or how would you define it?
Chris Gilliard:
Yeah. I rarely get asked this question.
Anika Collier Navaroli:
Really?
Chris Gilliard:
Yeah, so I reached out to a friend of mine who's also a surveillance scholar, Sava Saheli Singh, and I was like, "Oh, I know this is a bad question, but how would you define privacy?" Well, I mean, one of the things Sava is so good at is putting things very concisely, and I'm sure I could have come up with a definition. I mean, it's what I do. But I took her definition because it was so good.
Anika Collier Navaroli:
Okay, share it with us. Share this definition.
Chris Gilliard:
And so this is what she said, "The right and agency a person or community should have to reasonably avoid/evade scrutiny, be it of their body, data or property."
Anika Collier Navaroli:
So the right to avoid scrutiny.
Chris Gilliard:
Yeah.
Anika Collier Navaroli:
Okay. Well, tell me, how did you become interested or become kind of focused on this thing called privacy, or this right to avoid scrutiny?
Chris Gilliard:
Well, I think there are a couple of different ways I can think about that. Part of it is that I grew up Black in Detroit. And when we think about Simone Browne, I have her book Dark Matters right in front of me.
Anika Collier Navaroli:
I mean, the classic, the canon I should say.
Chris Gilliard:
Right. Now, I didn't know her work when I was a kid growing up in Detroit, but the way she talks about anti-Blackness being at the root of surveillance and carceral technologies, I think even as a kid I kind of understood some of that. Or it might not even be fair to say I understood it, but I felt it. I talk about this in the book, and I've used this example a ton of times. I apologize for people who've heard it before, but...
Anika Collier Navaroli:
They can hear it again.
Chris Gilliard:
I think about the advent of electronic locks, and walking down the street as a kid in the city, and although I've come to find out that electronic locks first came out in the 1910s, or 1912.
Anika Collier Navaroli:
Oh interesting, something like that.
Chris Gilliard:
That was the first car that had them. But I think they started to appear in cars en masse in the 70s, so lots of Black people I know have had the experience of walking down the street and hearing someone lock their car.
Anika Collier Navaroli:
Locking the door.
Chris Gilliard:
This is before they were made to automatically lock when you shifted into drive. And that experience of that click, being sort of an audible representation of how you were viewed, and this is when I was a child. It always stood out to me. I always kind of felt that. I came up in Detroit shortly after the disbandment of a vice unit called STRESS, which stood for Stop the Robberies, Enjoy Safe Streets, which was this-
Anika Collier Navaroli:
What an acronym.
Chris Gilliard:
... real brutal vice unit in Detroit that wound up killing, I think in a very short span of time, 13 people, 12 of whom were Black. And a big part of that was the surveillance of communities, Black communities in the city. Part of what wound up taking them down was a Black woman forensic scientist. It's a really interesting story.
Anika Collier Navaroli:
I think there's a documentary about this.
Chris Gilliard:
Yes.
Anika Collier Navaroli:
Yes, okay. I love a documentary. I'm like, "I'm seeing this." Yes.
Chris Gilliard:
There's a podcast about it too, which is amazing. And so part of it is that. But again, part of my origin story, I think, is coming to social media, coming to Twitter and finding really incredible scholars who were having these discussions. Frank Pasquale always stands out as someone that, in the early days when I was just on these platforms, I was like, oh my God, these people are so smart. Some of them I mentioned, Safiya Noble. Just seeing that they were doing and talking about this stuff was really inspiring.
And I think thirdly, it was my teaching. At the college where I was at the time, they were doing some really shady stuff with filtering the internet, which required me to learn a lot about how that happens, why that happens, how tech companies were surveilling people, all those things. What kinds of decisions went into who gets what kind of information, what kinds of information go out about you, all these things. Because, in short, my students weren't able to do the kinds of research I thought they should be able to do on campus, because the internet was filtered. And I started a campaign against my school at the time to try and get them to reverse that policy. So I think those three things. This is a part of the story I think is important in terms of how I started to think about this stuff.
Anika Collier Navaroli:
Yeah. Your origin story. So you mentioned something about that moment, that feeling so visceral of walking down the street and hearing the click, and the doors locking. You mentioned that you felt that. What did it feel like to be in that moment?
Chris Gilliard:
It feels like being watched. And this is the other thing that I think is really important: it was never someone from the neighborhood. It was never my neighbor. It was someone who was there for some other reason, or someone who was going from point A to point B but wasn't from around there. And so it feels like a sign that I'm the one who's not supposed to be there, even though it's my neighborhood. It feels alienating, it feels insulting.
One of the jokes I make is that often people would do this even when their windows were down. It's like, if I was going to snatch you out the car.
Anika Collier Navaroli:
That's not going to solve the problem.
Chris Gilliard:
But I mean, I know a lot of people will not think of an electronic lock as something in that category, but I think it's a good symbol for how technologies, and particularly surveillance technologies, can alienate us in our own neighborhoods and places that we have every right to be.
Anika Collier Navaroli:
Thank you for sharing that and articulating that sort of feeling for me and for the listeners. I want to talk a little bit more about the work that you've been able to do, because I mentioned, I think when we were doing the intro, that you really have shaped a lot of the way that we talk about many different things in tech policy. You've coined some of the terms that we've used. One of them, for instance, of course, as you know, is digital redlining. Would you mind just kind of sharing what that means, for folks who for some reason might have no idea?
Chris Gilliard:
Yeah, so I did not invent it. I'm pretty sure Jathan Sadowski and Astra Taylor did an article that used the term before my piece came out, but I did help popularize it. And where it came from is the instance I just mentioned, where I was teaching and my students weren't able to do research because the internet was filtered. And for those who don't know, I taught at a community college in suburban Detroit. A lot of my students did not have internet at home, or were working multiple jobs and things like that, and were in situations where the time they were on campus was when they were able to do their work.
And for people who don't know a lot about how filtering works, traditionally one of the problems with it, just on a sort of technical level, is that it lets in the things you're trying to keep out anyway, if the person is dedicated enough, but it also keeps out things that would be legitimate arenas of inquiry.
Now, to my taste, at a college almost everything's a legitimate arena of inquiry, but I'm talking about from the perspective of someone who's installing or maintaining a filter. So it would keep out elements of the Bible. It would keep out a June Jordan poem or an e.e. cummings poem, things like that. Playboy used to do interviews of-
Anika Collier Navaroli:
Great interviews.
Chris Gilliard:
... political figures. Yeah, it did. There's an interview of Jimmy Carter in Playboy. You couldn't get it because of the filter. And so I started thinking about that not only in terms of what my students were not able to access, but what the class and race implications of that were, as someone from Detroit who's very clued into what redlining meant, not only in the 40s and 50s, but in its remnants now. I grew up two miles away from 8 Mile, where you could see the very strong distinction between an area where investments and loans were allowed and encouraged, and the other side of the street, where they were not.
Anika Collier Navaroli:
They weren't, yeah.
Chris Gilliard:
And then later on in my life, I lived in another kind of suburb, Grosse Pointe, where we could see the same things, right? And so the more I came to understand it, the implications of this were drawn not only along class and race lines in terms of the digital, but they also kind of mirrored the exact geographies that we tend to see. I remember, I don't know if I've told this story before, but I once lived in a loft downtown, and the ISP said that broadband wasn't available to me where I lived, and it was only a matter of the telephone pole being a couple feet this way or a couple feet that way to get broadband.
Now, I mean, I was able to kind of socially engineer a little bit and get the guy to give me broadband. But everybody is not able to do that. And I mean, we've seen that with educational outcomes, health outcomes, political outcomes: there's a strong connection between whether or not some of those things are positive or negative and people's ability to access information on the internet. And so, yeah, I began speaking and writing about that as digital redlining.
Anika Collier Navaroli:
Well, the thing that you have been talking a lot about recently, and I know that you're writing a book about is luxury surveillance. Could you tell us a little bit about what that is and tell us a little bit about your book that you're working on?
Chris Gilliard:
Yeah. So the basic root of it is that I argue that an ankle monitor and an Apple Watch are essentially the same technology.
Anika Collier Navaroli:
Oh my God. I'm wearing an Apple Watch, and I will tell you, I got this Apple Watch because I was trying to be healthy, or some shit, and I literally got it and told my best friend, and her response was, "I didn't think we were allowed to do that." And I was like, "Screw you." Yes. Anyways, as you're saying-
Chris Gilliard:
They are essentially the same thing. And that we can really understand some of the things that tech companies are up to when we tease out those parallels, that there's a segment of devices often chosen by people who have the ability to say yes to surveillance, where they think that they are likely to gain some benefits, accrue some benefits, and they don't understand them as surveillance devices. And that ultimately, a big part of this is helping to normalize surveillance for all of us by the fact that people embrace these technologies and understand them as aspirational.
Anika Collier Navaroli:
Yeah. I mean, you mentioned the Apple Watch, which, again, I feel like I should have taken off before I talked to you. Is there any particular piece of technology out there that really just pushes your buttons or really irks you, the real luxury surveillance technology?
Chris Gilliard:
Yeah, I mean, the big ones now are the video glasses, whether it's the Meta Ray-Bans or what have you, but also all the wearables that are coming out that are stuffed with GenAI. Video doorbells too. I mean, I've hated those since the beginning. I think those are also hugely detrimental to society. But the one that sort of pokes me the most in the worst way at this moment would be glasses, video glasses.
Anika Collier Navaroli:
Like the Meta glasses, those ones. I mean, I think it's really fascinating that we basically brought back Google Glass when I thought we had shamed people away from it.
Chris Gilliard:
Yeah, we can again, right?
Anika Collier Navaroli:
Yeah, I think we can bring back that piece of shame. So I would love to talk to you a little bit. I know that you describe yourself as an abolitionist, right?
Chris Gilliard:
Yeah.
Anika Collier Navaroli:
And I think that there is something there. Would you describe or tell us what that means, especially in the tech policy space?
Chris Gilliard:
Yeah, I mean, I am of the firm belief that many of these technologies should not exist, and that we should reject them in any and all ways possible and available to us. I know that some people don't consider that practical, but yeah, I think that the negative social impacts of these technologies have already been well established. I mean, look around. And one of the pieces of pushback I often get is talk about their legitimate use cases. I think for some of these technologies, there is not a legitimate use case. Or, at the very least, for something like the Meta Ray-Bans, I would argue the main reason they exist is to normalize ever-present surveillance and lower the barrier for all kinds of harassment and antisocial behavior. Now, people can use them for other things.
Anika Collier Navaroli:
Like what?
Chris Gilliard:
Well, somebody on social media the other day was like, "I use it to do this thing." And it's like, look, okay, sure, you can use your AK as a doorstop, right? That's not what it's for. That's not what it's for.
Anika Collier Navaroli:
That's not what they made that for.
Chris Gilliard:
And so yeah, I think we should reject these things. Now, again, and I've talked about this before, there's a lot of investment in making sure that we're not able to. So, for instance, Amazon is also coming out with eyewear for workers.
Anika Collier Navaroli:
Oh, wonderful.
Chris Gilliard:
And they mostly, I imagine, will not be able to say no to these. And this is why I came up with that formulation of luxury surveillance. When I would talk about these things, a lot of times, if you brought up how it harms marginalized people, how it is used to target Black and Brown folks, trans people, people seeking reproductive care, people would often invoke some kind of nothing-to-hide argument, or say that it didn't affect them, or things like that. Now, I'd love to live in a society where, if you said it harms this segment of people, then people's response wouldn't be, "Well, but I still need to get my packages," or whatever.
Anika Collier Navaroli:
Who cares?
Chris Gilliard:
I would love for that to be true. Unfortunately, in my experience, it's mostly not. And so part of what I have set out to do, or what I hope some of my projects do, is show people that the negative effects of these things are not isolated to the most vulnerable. Now, it's true that they hit them the hardest, the earliest, the most often, but they eventually come for everyone.
Anika Collier Navaroli:
It's also coming for you.
Chris Gilliard:
And also, the supposed benefits of these things are highly overrated.
Anika Collier Navaroli:
Oh, right. I mean, yeah, Apple Watch for health, right? Okay.
Chris Gilliard:
Yeah, right? It's not. Just go for your walk. Just go to bed on time. You'll be fine. Most of us will.
Anika Collier Navaroli:
Right. I mean, I don't know that we all need to see our sleep categorized into colors and broken down by the minute. And I look at it, and I'm like, I was definitely not asleep. You know what I mean? I was awake, so you're wrong. You are so right in that. And you mentioned this piece that I think is so fascinating, around tech that just simply shouldn't exist and the way that we should reject these things. I was reading the Washington Post article that you were talking about, and you said no one would look at asbestos and say, "We can't outlaw chemistry." But they look at facial recognition and say, "You can't outlaw math." And I thought that was so right.
Chris Gilliard:
Yeah, yeah. I mean, recently, and I apologize because I don't have the quote in front of me, but Emily Bender said something. She doesn't use the term GenAI, I forget the term she uses, but she was talking about what is commonly called GenAI, and in the piece that she wrote, she said we should view it in the same way we view plastic.
Anika Collier Navaroli:
Oh, wow. Okay. Yeah.
Chris Gilliard:
That an important thing would be to avoid it when we can. If we had avoided plastic and understood the harms, right? Now we all have microplastics in our bodies, and it's at the bottom of the ocean. If we had thought about it differently when it was becoming more prevalent, we might be looking at a different landscape now. And so, I mean, this article was talking about GenAI, but there are a lot of things that we've let, and I use that term loosely, it's not exactly the right term, but the trajectory of a lot of these technologies has been established by companies who do not have our best interests at heart. And frankly, saying they do not have our best interests at heart is the lightest touch I can use. They're openly oppressive and aligned with authoritarianism is probably a better way to say it.
But we've let them set the terms for what society should look like and what technologies are out in the open and in the world. I would argue that the result of that has not been positive, and that for some of these things, yeah, we should say they should not exist. A good friend of mine, David Golumbia, before he passed away, wrote an article called ChatGPT Should Not Exist.
Anika Collier Navaroli:
Yeah.
Chris Gilliard:
Now, this was three years ago. This was shortly after the initial version of ChatGPT was launched. He got a lot of pushback on that, a lot of negative feedback. It's turned out to be the right call, and I think we should more openly articulate that when that's the case. I was talking to another friend who was reading Joseph Weizenbaum, and I forget the name of the work, but one of his early books where he talks about the negative effects of some of these technologies and how we can and should respond to them. And I think, what's the right way to say this? I think a lot of people have been force-fed this notion of innovation that serves the best interests of the tech companies, who say they should be able to do whatever they want, cause whatever harms they want, and that not letting them do that is somehow bad for society. I don't think the results of that have worked out very well for us at all, in the last couple decades certainly.
Anika Collier Navaroli:
Yeah. Hard agree. And question for you, do you use ChatGPT, do you use GenAI at all?
Chris Gilliard:
I have never, so the short answer is no.
Anika Collier Navaroli:
Good for you.
Chris Gilliard:
If you pulled up the website or the app or whatever, I don't even know what it looks like. You could tell me, and I would have no idea. It's never been on my computer. Now someone listening is going to be like, "Well, there's GenAI in your phone."
Anika Collier Navaroli:
Right. Right, right, right.
Chris Gilliard:
Correct, or whatever it is.
Anika Collier Navaroli:
But you've never gone to ChatGPT and been like, "Tell me about myself."
Chris Gilliard:
Mm-mm. No. I never will. I never have and I never will.
Anika Collier Navaroli:
Okay. I love that, because working in academia, there's just this sense that everybody's using AI, therefore we have to adopt it, we have to figure it out. You remind me so much of my friend's mom the other day when we were talking about ChatGPT, and their response was, "Chat who? What is this thing that we are talking about?" And so there is definitely still a way to resist, as you're talking about, and to believe in this piece of abolition. I'd love to talk a little bit about the future and what you're thinking, what you're seeing. One of the questions I have for you is, what are the sort of biggest disparities in the realm of privacy that you are seeing right now today?
Chris Gilliard:
Yeah. Can I go back for one second?
Anika Collier Navaroli:
Please. Of course. All the time.
Chris Gilliard:
So, another part of my origin story, or another thing I think is so interesting, is how people have come to embrace this tech. There's a study, I think out of U of M, that talked about how marginalized and minoritized people were much less likely to sort of trust GenAI. I've been thinking a lot about this, and at some point I hope to write something; Damien Williams and I have been talking about it online. Because I think about the history of literacy as I understand it, not only in my own journey, but in terms of Blackness. Very recently, someone who was Black, not me personally, could not go to the library and take out a book.
Anika Collier Navaroli:
Right. Right. Ancestors, yeah.
Chris Gilliard:
It was illegal to teach Black people to read, things like that. That is in recent memory. And now these billionaires, and soon-to-be trillionaires in some cases, have come along and said, "Oh, we have a thing that's going to make it so that you don't have to read. We have a thing that will give you the answer." And I think it's such a weird thing. I'm a little bit surprised at the extent to which academics have embraced it. I mean, I understand it from the point of view of administrators, but academics, all kinds of people who are thinking and being creative.
Anika Collier Navaroli:
And writing.
Chris Gilliard:
And writing.
Anika Collier Navaroli:
Reading and writing.
Chris Gilliard:
And I can't help but think about it just in terms of the struggles for literacy that were very bloody and pronounced, just to be able to read. And now a small cadre of super rich people are telling you, "You don't need it anymore."
Anika Collier Navaroli:
You don't need it.
Chris Gilliard:
No, I'm not doing that.
Anika Collier Navaroli:
They have all the money, therefore we don't need to read.
Chris Gilliard:
Yeah. Yeah. We have the machine that'll give you the one answer. No, I don't think so. I've read that story before. It doesn't work out.
Anika Collier Navaroli:
Yeah. Thank you for bringing us back to talk about that, because I think that that is a really great point that you're making, especially the, we've read that book before, and yet still we're living in the future of the books that warned us about our present, which feels a little fascinating. Okay. So, the biggest disparities in the realm of privacy. What's going on?
Chris Gilliard:
I think GenAI is a good kind of use case or example of that, that often it's not understood or talked about as a surveillance mechanism.
Anika Collier Navaroli:
Yeah, it's not.
Chris Gilliard:
I think a lot about what Meredith Whittaker was saying about agentic AI and things like that. Now, I don't know how much it works, or if it'll ever fully work, or anything like that. I have some questions, I have some skepticism. But if it were to work, the extent to which it would surveil you is unprecedented. So I have this kind of model, or whatever, that I'm working on for the book, and I think there have been three different important eras, for lack of a better term, in terms of algorithms and what people say their algorithms are going to do, but also what tech companies promise they're going to do. The deal that they used to offer is, "We'll give you this service, and to pay for it you let us surveil you a little bit."
Anika Collier Navaroli:
The Facebook model.
Chris Gilliard:
Yeah. People would say, "Well, Mark, why can't we just pay for a Facebook where you're not surveilled?" They're like, "No, no, no."
Anika Collier Navaroli:
That's the whole point.
Chris Gilliard:
Now, that deal was a lie. Consider the level of surveillance that is deployed against people, the data broker system. There are data brokers who have categories like parents whose child just died of cancer, things like that. A lot of people don't know that. The extent of that surveillance, Facebook Pixels on your doctor's office website and things like that. So that deal was sort of always a lie. But how they've tried to alter that deal, I think, is really important, in that now the wave of GenAI promises you all these things, but the surveillance has now become the thing, the surveillance is the benefit. That if you let these devices watch you sleep, listen to you talk to your partner, record you on your way to and from things, all these things, it will help you optimize your life. That pervasive and always-on surveillance will somehow benefit you in untold ways.
Now, you asked about the disparity. I think there are going to be some people who want and like these things. Unfortunately, I used to say, "Who wants this?" And it seems like there are people who want that.
Anika Collier Navaroli:
There's a market somewhere.
Chris Gilliard:
But there are also people who will not be able to say no. Axon now has a body cam product for retail workers and hospital workers.
Anika Collier Navaroli:
Wow.
Chris Gilliard:
As I mentioned, Amazon's coming out with smart glasses, surveillance glasses, for workers. More and more, there'll be a segment of the population who is forced to endure a degree of this surveillance, not only in their working lives; then they'll come home or whatever, and it'll be part of their life as well. I think that that's where some of the disparity is going to come from.
And again, I think that as this becomes normalized, and I don't want this to be the future, I hope that this is not the future, but I think the current trajectory kind of lends itself to that. As it becomes normalized, and we started talking about this from the very beginning, I've seen, unfortunately, that people are adopting and accepting these things with the belief that they will benefit them in some way. And again, this is apart from when it's enforced on people for work, for school, as a condition for employment or benefits or whatever the case may be. So yeah, part of what GenAI promises, again, whether or not it can deliver, is that something will be watching and parsing all this data, right?
Anika Collier Navaroli:
Yeah, yeah.
Chris Gilliard:
Where in the past, I mean, there are untold hours of body cam footage that have never been watched, right? Because a human can't; they're not going to dedicate the resources for a human. But the promise, in quotation marks, of GenAI is that you're going to allow all that stuff to be watched and parsed some way or another.
Anika Collier Navaroli:
Yeah.
Chris Gilliard:
That was a long explanation.
Anika Collier Navaroli:
No, no, no. I appreciate you. You mentioned the Axon cameras for retail workers, and it struck me because I remember, was it like 10 years ago, I was working on policies for Axon body cameras for police officers. And so thinking about how the policies for these things that we wrote in such a different context are now streaming down to everyday folks is something that I think is fascinating. So, you mentioned something that is not your hope for the future, but I have one last question for you, which is actually, what is your hope for the future of privacy and what it could look like for all of us?
Chris Gilliard:
Yeah. I mean, so I always think a lot about Ruha Benjamin, who talks about-
Anika Collier Navaroli:
Me too.
Chris Gilliard:
Yeah. Everyone should, right?
Anika Collier Navaroli:
Right. Especially in this space. I think that's a good place to start.
Chris Gilliard:
Yeah. I mean, who talks a lot about imagination and the future we're living in. I hate to say what I'm about to say, because I'm not going to phrase it the right way. I think a lot of times people are like, "Ah, well, I'm not a Luddite." Which is not true for me. I would actually probably be better categorized as one. I do like technology. I think that it is possible to have some of these things, but ones that don't, I mean, we'd have to have them sort of divorced from capitalism. I have a phrase that I say often, which is, "Every future imagined by a tech company is worse than a previous iteration."
Anika Collier Navaroli:
Yes. Yes. That is on T-shirts, on stickers, on everything. Yes.
Chris Gilliard:
Yeah. I think I need to do another T-shirt drop, right?
Anika Collier Navaroli:
Yes. I think you do too, please.
Chris Gilliard:
And the reason I say that is because the imagination of the tech company is driven by capitalism and the need to extract maximum value from us. In order for these things to exist in a way that didn't do that and actually benefited us, we'd kind of have to rewrite a bunch of the ways that things work. I think that's possible. I think it's actually super dark right now. But I was just talking to a collection of college students, and I think in a lot of ways, the tech barons are really overplaying their hand. And what I mean by that is that it's very clear where their alliances are. There's not really any pretense anymore about whose side they're on, whether they believe in things like racial justice or equity or even democracy. I don't have that question anymore. I personally never did.
Anika Collier Navaroli:
Some people did.
Chris Gilliard:
Yeah. It's very obvious where they are. History says that that usually doesn't work out well for that crew of people. And so I think that it's possible to have some of these things in ways that don't feed directly into authoritarianism. I mean, again, I think some of them should not exist. But if you wanted a device that you thought helped you better stick to your fitness goals, that didn't send that to a company that then sends it to an authoritarian government? Nice.
Anika Collier Navaroli:
It would be. That'd be ideal, right?
Chris Gilliard:
Yeah.
Anika Collier Navaroli:
A dream.
Chris Gilliard:
I mean, yeah, there's going to have to be a sort of reckoning with where we are, but I do think it's more and more common that people understand some of these things, that pushback. I've been talking about luxury surveillance for maybe 10 years, and when I initially said, "Oh, this thing's like an ankle monitor," people were scandalized that I would make such a comparison. Nobody fights me on it anymore. They're like, "Oh, yeah, I kind of see that." It's clear what the imperatives and impulses of these companies are. And so as more and more people understand that, I think more and more pushback is inevitable.
Anika Collier Navaroli:
Well, I agree with you. I hope that that pushback comes, and I want to reiterate something that you said: we can rewrite the ways that things work, and it is possible. I appreciate you, Chris, for joining us on this podcast as we think about the ways that we are able to rewrite these things, so that privacy can be possible for all of us together in the future. Thank you so much for joining us today.
Chris Gilliard:
Thank you for having me. I really appreciate it.