The Dangerous Combination of Technology and Capitalism
Justin Hendrix / Feb 2, 2025

Audio of this conversation is available via your favorite podcast service.
Just days before he left office, former US President Joe Biden gave a farewell address from the Oval Office. Staring directly into the camera, Biden said he wanted to warn the country about things that gave him great concern.
And that’s the dangerous concentration of power in the hands of a very few ultrawealthy people, and the dangerous consequences if their abuse of power is left unchecked. Today, an oligarchy is taking shape in America of extreme wealth, power and influence that literally threatens our entire democracy, our basic rights and freedoms and a fair shot for everyone to get ahead.
He hearkened back to another farewell speech delivered by President Dwight D. Eisenhower on the same date in 1961. In that speech, Eisenhower famously warned about the emergence of the military-industrial complex.
Six decades later, I'm equally concerned about the potential rise of a tech industrial complex that could pose real dangers.
Biden pointed to problems at the intersection of social media and democracy, warning that we must hold major technology platforms to account.
The truth is smothered by lies told for power and for profit. We must hold the social platforms accountable to protect our children, our families, and our very democracy from the abuse of power.
But then he turned to AI and the dangers it may pose, especially to our economy and how we work.
Meanwhile, artificial intelligence is the most consequential technology of our time, perhaps of all time. Nothing offers more profound possibilities and risks for our economy, and our security, our society, for humanity. Artificial intelligence even has the potential to help us answer my call to end cancer as we know it. But unless safeguards are in place, AI could spawn new threats to our rights, our way of life, to our privacy, how we work and how we protect our nation.
A couple of days after Biden delivered his speech, and on the eve of President Donald Trump's inauguration, I spoke with Jathan Sadowski, a senior lecturer in the Faculty of Information Technology at Monash University in Melbourne, Australia, and co-host of This Machine Kills, a weekly podcast on technology and political economy. Sadowski is the author of the new book The Mechanic and the Luddite: A Ruthless Criticism of Technology and Capitalism, from the University of California Press. He says that right now, technology escapes even the bare minimum of public accountability–let alone public control–that we demand from other forms of power.
What follows is a lightly edited transcript of the discussion.
Justin Hendrix:
Good morning. I'm Justin Hendrix, Editor of Tech Policy Press, a non-profit media venture intended to provoke new ideas, debate, and discussion at the intersection of technology and democracy. Just days before he left office, former US President Joe Biden gave a farewell address from the Oval Office. Staring directly into the camera, Biden said he wanted to warn the country about things that gave him great concern.
President Joe Biden:
And that's the dangerous concentration of power in the hands of a very few ultra-wealthy people, the dangerous consequences if their abuse of power is left unchecked. Today, an oligarchy is taking shape in America of extreme wealth, power, and influence that literally threatens our entire democracy, our basic rights and freedoms, and a fair shot for everyone to get ahead.
Justin Hendrix:
He hearkened back to another farewell speech delivered by President Dwight D. Eisenhower on the same date in 1961. In that speech, Eisenhower famously warned about the emergence of the military-industrial complex.
President Joe Biden:
Six decades later, I'm equally concerned about the potential rise of a tech-industrial complex that could pose real dangers.
Justin Hendrix:
Biden pointed to problems at the intersection of social media and democracy, warning that we must hold major technology platforms to account.
President Joe Biden:
The truth is smothered by lies told for power and for profit. We must hold the social platforms accountable to protect our children, our families, and our very democracy from the abuse of power.
Justin Hendrix:
Then he turned to AI and the dangers that it may pose, especially to our economy and how we work.
President Joe Biden:
Meanwhile, artificial intelligence is the most consequential technology of our time, perhaps of all time. Nothing offers more profound possibilities and risks for our economy and our security, our society, for humanity. Artificial intelligence even has the potential to help us answer my call to end cancer as we know it. But unless safeguards are in place, AI could spawn new threats to our rights, our way of life, to our privacy, how we work, and how we protect our nation.
Justin Hendrix:
Today's guest is the author of a book that looks directly at the intersection of technology and capitalism, and he argues that right now technology escapes even the bare minimum of public accountability, let alone public control that we demand from other forms of power. I spoke to him the day before Donald Trump was sworn in, flanked as he was by the billionaire founders and CEOs of the biggest technology firms in the world.
Jathan Sadowski:
I'm Jathan Sadowski. I'm a senior lecturer in the Faculty of Information Technology at Monash University in Melbourne, Australia, and I'm also the author of The Mechanic and the Luddite: A Ruthless Criticism of Technology and Capitalism.
Justin Hendrix:
Jathan, I'm so excited to speak to you today about this book. A little ways into the introductory chapter, you talk about the growing cottage industry of work that casts a skeptical eye on Silicon Valley, and about the kind of formula for people who work in that space: identifying harms, thinking about solutions, maybe coming up with various policy considerations. But you're trying to do something a little deeper than that with this book. You're trying to talk about the river of ideas that, I suppose, have come together into this combination of technology and capitalism. What's this project for?
Jathan Sadowski:
When I set out to write this book, I decided I didn't want to just write another "give a person a fish" book, which is what my first book, Too Smart, is. A lot of the books in this space are really about giving you the conclusions: here's why bias is bad, or here's why tech is causing inequality. Here are the conclusions you need, so you can walk away from this book or this article and be more critical of the things the tech sector claims to be doing, or the products and systems they're foisting on our lives. And I think that kind of work is really important. But in this book, I really wanted to go beyond conclusions. I wanted to get to the fundamentals, the dynamics, the material relations that actually make the tech sector work in the way that it does, and the other systems and structures that it emerges out of and is embedded in, capitalism being a really important one, but also things like the military-industrial complex.
So I set out thinking about this book as the "teach a person to fish" book. I don't want to just give you the conclusions and have you walk away. I want to teach you how to think about tech in the way that I think about it: as a product of these material relationships, as fundamentally a product of political economy. You can't really understand technology, understand its role in our society and our lives, without getting to those more fundamental relations. In other words, learning how to fish, not just walking away with a fish.
Justin Hendrix:
There are a lot of ideas in this book, of course, and we should talk a little bit about Marx and some of the other people whose names appear in this book again and again. But I also want to ask you about how your personal experience led to this book. When you're framing up the idea of the mechanic, you say you were born on the Gulf Coast of Mississippi and went to high school in the plains state of South Dakota, with its Badlands canyons. How did your experience lend itself to helping you craft this book?
Jathan Sadowski:
This really gets to that first part of the book's title, the mechanic. The book kind of takes these figures of the mechanic and the Luddite as these two models or these two archetypes for how I think we need to understand the relationship between technology and capitalism. And for me, the mechanic really embodies that idea of understanding how the technologies actually work, what they actually do, what work they do in our lives, piercing the hype and the marketing claims around what people claim they can do or what they hope they can do, but really how do they actually work? And part of that is about having a kind of what I think of as a mechanical curiosity about the world. This isn't me saying everyone needs to go out and learn how to code. I don't think that's going to solve any of our problems in any major way.
But what I want is for people to have, and be able to act on, this mechanical curiosity about the world. And I opened that section talking about my own upbringing, where I grew up. I'm the first person in my family to go to college, and I really stuck with it and went all the way to get a PhD and become an academic. But as the first person in my family to go to college, I grew up outside of urban areas, very much in small towns or rural areas. I grew up around a lot of people who were tradespeople, who were craftspeople. My dad, he's retired now, but he was an electrician at a shipyard. My stepfather was a carpenter and a mason. The same goes for my brothers: I have two, one of whom is an HVAC technician and one of whom is a delivery truck driver.
And my mom was really an expert in the lost art of homemaking. Any chemical for any stain, she knew what it was; she knew how to sew a patch on a pair of jeans or hem a shirt or cook up an amazing meal from scratch, all that kind of stuff. That was the environment I grew up in, one where having mechanical skills, knowledge, and curiosity was just normal. My science teacher in South Dakota was also a cattle rancher. And I decided in high school that I really wanted to learn how to weld, because it was one of the skills that a lot of the people I grew up with had; a family friend of ours was a welder. I always found it really interesting to learn how pieces of metal are joined together to make something else. And so I approached my science teacher and asked, "Could you teach me during lunch hours how to weld?"
It took a little bit of convincing, but he was an easy sell on it. And so we would go off during lunch hours to the shop for the school district. He introduced me to all the maintenance guys, and he'd show me the ropes of MIG welding and TIG welding as a kind of extracurricular activity in high school. This was the kind of environment I grew up in, where it was normal to take apart some appliance I found at a thrift shop, or get a computer and put Linux on it so I could get closer to the root of my laptop. I was encouraged to have that mechanical curiosity, with nobody telling me this is not something you can learn how to do, that this is forbidden knowledge, or you don't have access to this, or why are you taking that apart, you're just going to break it, or that you should be focusing on other things, endeavors of the mind, not just endeavors of the hand.
Growing up in this environment gave me this real curiosity about how the world works, and taking that curiosity seriously is what gave me the kind of foundations that I want everybody to have access to, foundations that I think are also taken away from people in a very intentional way by the tech sector. We hear so much about this now. For example, the Federal Trade Commission in the US has just filed a large lawsuit against John Deere over what's called the right to repair. You have farmers whose equipment is fundamental to their livelihoods, to the forms of agricultural production that they do, and they have no ability to actually get in and modify or repair that farm equipment. Not because they don't have the skills to, but because the companies have shut them off, have said, "You're not allowed to do this legally. Digitally, we're putting up gatekeeping and locks on things."
And that is, to me, emblematic of how the tech sector has de-skilled and mystified technology for the vast majority of people. We are now just users of technology. Even if you have the curiosity to understand how something works, the companies that make these technologies put up all kinds of legal and technical barriers to prevent you from ever exercising that curiosity. And so for me, growing up in a time where those kinds of barriers were not present, or at least not so entrenched, growing up in a place and with family and friends who encouraged this kind of curiosity and skills acquisition, this to me was really necessary to building the foundations for a more critical material understanding of technology. And it's what I want everybody to have access to.
Justin Hendrix:
Well, you write that, "If the mechanic knows how a machine operates, how it's put together, and how it can be repaired or engineered, the Luddite knows why the machine was built, whose purposes it serves and when it should be disassembled or destroyed." We've spent time on this podcast, I suppose, rehabilitating the memory of the Luddites with folks like Brian Merchant, but you claim you are a Luddite. If the listener hasn't read, for instance, Brian's book or some of the other books that explain in more detail what the Luddites were actually about, what do you think it's important for them to know?
Jathan Sadowski:
For me, these are two sides of the same coin, the mechanic and the Luddite. They give us that socio-technical analysis of these devices, of these systems, of this industry. And the Luddite to me really represents the sharp edge of the socio- part of that socio-technical analysis. Luddism has been disparaged since the days of the Luddites 200 years ago. There were people contemporaneous with the original Luddites who disparaged them as ignorant technophobes who wanted to stall progress, who wanted to move society backwards, all the kinds of things that we associate with Luddism today, and that couldn't be further from the truth. And so in the book, I really want to contribute to this rising tide of rehabilitating the Luddites as a banner, as a symbol for the kind of sharp-edged, ruthless criticism of technology and the tech sector. So, to dispel a few of the common myths of the Luddite: there's the idea that they were indiscriminate in their machine breaking, that they'd see a machine and, like Frankenstein's monster, yell, "Ah, fire! You've got to smash the machine." That wasn't the case at all.
What they were doing was very picky and choosy, very targeted, about which machines they took the hammer to and which they didn't. In these big original industrial factories, you would have multiple capitalists, multiple owners, with their machines next to each other. So one owner might have a form of a power loom in one part of the factory, and right next to it a different owner might have the same machine, and those owners would treat their workers in different ways. What the Luddites did was pick and choose to smash the machines owned by bosses and capitalists who were notorious for underpaying their workers, for speeding up the pace of work, for degrading the quality of work, for making the work more dangerous.
And so they would choose: this machine is representative of, and owned by, a boss who is notorious for treating workers in terrible ways and degrading the community. So they weren't indiscriminate, nor were they ignorant. These Luddites were also mechanics; they were artisans, they were craftspeople. They had really in-depth, expert knowledge about the work that they were doing. A lot of the machines they were using had been around for a very long time as well, and they had used them for a very long time. So they were not against innovation as such, but they were against the noxious use of that innovation. Technologies like the gig mill were around for 100-plus years before the Luddites rose up and smashed those machines, and they had used those machines in their work for 100-plus years. It wasn't until those machines were appropriated by and redeployed within an exploitative capitalist context that the Luddites rose up and said, "These machines are no longer beneficial to us. They're no longer good for society and for our lives and for the work that we're doing, and thus they don't deserve to exist."
And so to me, that really is taking this ruthless criticism of technology and capitalism to its logical conclusion. If you understand how the machines work in this real mechanical way, you understand what the machines mean for your livelihood, for your community, for society in this real social way. And if the result of that analysis is that these machines are causing irreparable harm, that there's no way to feasibly re-engineer or modify them, then the conclusion there is then they don't deserve to exist. And the Luddites took that conclusion to its logical endpoint.
Justin Hendrix:
The listener might be thinking about artificial intelligence. You get to that after considering venture capital and data, some of the lifeblood or, I guess, foundational systems that ultimately inform modern AI. And you call AI "the latest in this quest to build a perpetual value machine." What's a perpetual value machine? What are folks trying to do?
Jathan Sadowski:
What I mean here by the perpetual value machine is, in short, a machine that would be a way to create and capture an infinite amount of surplus value without needing any labor to produce that value. It really is this quixotic quest by capital, something that I think underpins a lot of the dreams and visions and hopes of automation: that you can finally have a way to create value in the world, economic, financial, social value, without needing any of these pesky workers who get in the way. They are necessary for that value creation, but they get in capital's way because they want things like wages or time off, and they have complaints when they're worked to the bone for 16 hours a day. All these things stand in the way of a form of perpetual value creation.
And so this is really, I think, what underpins so much of the investment and the hype in AI right now. So much of AI is not consumer-driven. It is supply-driven, not demand-driven. There is nowhere near the level of consumer demand for all of these AI products to justify billions and billions of dollars. Goldman Sachs says it's close to a trillion dollars of capital investment in AI infrastructure over the next handful of years. That's just the data centers, the electrical grid upgrades, and so on needed to power these AI systems, let alone the actual products themselves. There's nowhere near that level of consumer demand to justify that. The demand is coming from the supply side. It's the corporations building the technologies, it's the management consultants telling everybody, "You need to have this. This is going to be the future. You're going to be left behind if you don't get on board." It's the old Steve Jobs [inaudible 00:19:56] that people don't know what they want; you have to tell them what they want. And that is very much what's driving AI right now.
And I think what underpins that is the hope that AI will finally crack the nut of the perpetual value machine for capital. It will finally be this way to do everything that humans do in terms of producing value, but without needing the humans to produce that value. And in other words here, I don't think you can understand AI without understanding it within this context of political economy and without understanding it within this kind of historical development of capitalist technologies, technologies that capital has invested in, designed and used for the very purpose of increasing their ability to exploit and extract value from the world and from people.
Justin Hendrix:
So we've talked about many of these types of issues on this podcast on occasion, and I suspect many Tech Policy Press listeners are, if not fully bought into your way of describing the world, at least sympathetic to it. But I want to ask you about what we're observing right now. I'm talking to you on the eve of the inauguration of the second Trump administration in the United States. It's been a crazy weekend here, with the Supreme Court ruling on a law banning TikTok and the President-elect somehow appearing to be the hero of, I'm sure, many millions of young kids who don't want to lose access to their TikTok apps.
Trump seems to have many of the leaders of Silicon Valley right where he wants them: Mark Zuckerberg, Jeff Bezos, Elon Musk... Google is probably afraid of the Department of Justice coming after it in this antitrust context. It seems like we're witnessing a kind of strange merger of state power and corporate power, and a lot of it has to do with artificial intelligence. Which of today's events are you interpreting through the ideas in your book? And what do those ideas tell you about those events?
Jathan Sadowski:
I think what we're seeing here is the mask dropping from Silicon Valley. I don't think it's just that these people, Mark Zuckerberg or Elon Musk or Jeff Bezos, or the companies that they own and the larger sector that they represent, have changed their minds, that they've been radicalized by the information ecosystems they've helped to create. I don't think they're just victims of this right-wing radicalization pipeline of posting online all the time. That's a story that gets told: wow, how could Elon Musk become such a right-wing ghoul in the way that he is? How could Mark Zuckerberg become such a misogynist tech bro, going on Joe Rogan to talk about how Facebook, or Meta, needs more masculine energy? How could this happen? And I think what we're actually seeing here is that it has always been the case.
They've just been, to varying degrees, very good at keeping up a facade, wearing a mask that tried to draw attention away from the undercurrents of right-wing thought and belief that are really crucial to Silicon Valley. They now feel more comfortable stepping out of the shadows and saying things that they think will cynically curry them favor with Trump, and that's obviously a lot of what's going on here. But they also really see themselves as masters of the universe, in the way that Tom Wolfe described Wall Street in the '80s as composed of financiers who saw themselves as masters of the universe. Now it's the technologists of Silicon Valley, and they have long been ready to ascend to that throne at the top of capital. And now they are doing it in very public ways, and in ways that are very much in partnership with politics, because what we're seeing here is yet another instance of a long-standing partnership between capital and the state.
Capital has always relied on the state for capital to accumulate, to exploit, to extract, to continue to grow, to maintain the stability of markets, to open up new markets. Capital has always relied on the state to do that. And in return, the state has long relied on capital to provide it with resources, with other forms of soft power, with a revolving door of influence and wealth. And what we're seeing here is that partnership between Silicon Valley and the state becoming closer and closer.
Peter Thiel, who has long expressed these kinds of beliefs, was the original Trump supporter in Silicon Valley during the first campaign. People like Thiel, who used to be seen as on the fringes because they said the quiet part out loud, are now moving toward the mainstream. Peter Thiel is no longer an aberration in Silicon Valley; he is the model of Silicon Valley. And similarly, with Elon Musk, we shouldn't be surprised that the son of a South African emerald miner, someone who has long had these aspirations of wealth and power, has found ways to realize them by getting buddy-buddy with Trump and buying a social media website in Twitter.
In other words, there's been, I think, this mystification of Silicon Valley as somehow separate from the normal dynamics of capitalism, as somehow representing a different form of corporation, something a bit more progressive or nicer, not so focused on such harmful practices. It wasn't the tobacco industry, it wasn't the oil and gas industry, it wasn't the defense contractors, Boeing and Lockheed Martin and so on. It was something different. It was better; it represented the better angels of our nature. Silicon Valley benefited from that facade for a very, very long time, but now it has its sights quite explicitly on usurping many of these paragons of the worst excesses and harms of capital. In Silicon Valley, you've got companies like [inaudible 00:27:07] and Anduril, which have long been defense contractors in the tech sector, now partnering with OpenAI to create a kind of defense tech consortium to sell hardware and software systems to the Department of Defense and the Department of Homeland Security, because those are lucrative contracts, with the explicit aim of overtaking Lockheed Martin and Boeing and these other defense contractors.
You have companies like Microsoft and Amazon selling AI systems to BP and ExxonMobil and Shell and the rest of the oil and gas industry for the purpose of finding more efficient ways to produce more barrels of oil, in addition to the emissions that come from the AI technologies and systems and infrastructures themselves.
So what I think we see here, and what I hope my book provides some context for, is understanding that technology and capitalism are not two systems that exist separately from each other, as if technology is being corrupted by capital. That is the argument that underpins, I think, the surveillance capitalism thesis from Zuboff: that these companies have been corrupted by capitalism. My book argues that's not the case at all, that these two systems have always been intertwined and interconnected with each other, that you cannot disentangle them analytically or materially. You have to understand that these tech companies are the apex of capital right now. They are the apex capitalists, and that means they're going to act in the way that capitalists have always acted for the last 300 years. We shouldn't be surprised when they do.
Justin Hendrix:
I found it striking to hear outgoing US President Joe Biden, in his farewell address from the White House, in some ways using the same kind of language that you're using here: talking about oligarchy, about the military-industrial complex, about the tech-industrial complex that he fears. He was clearly trying to leave Americans with the impression that something is deeply wrong here, referencing the problems of artificial intelligence, the problems of social media, and the titans who operate these companies. Do you think there is a kind of awakening going on to some of the ideas that you explore here?
Jathan Sadowski:
Absolutely. There's a real growing tide of, at the very least, skepticism, if not outright criticism, of the tech sector, many of its leaders, and its products and fixations. It is really strange to see where it is now. I remember being a critic of tech in the pre-2016 days, when, writing or talking about it, I had to spend a fair amount of time justifying why we should actually be critical of tech, why these products or these companies deserve criticism, and expending a lot of that kind of background effort on justifying this view. And now it's weird to enter these spaces where that's just table stakes. Everyone says, "Yeah, of course. Now let's go on to how exactly we should be critical of them, or what we should do about it." And I think that's great. I wish I could take credit for it. I wish that I and my cadre of other Luddites and tech-critical writers and speakers and thinkers could take credit for this.
And I think we collectively have provided some tools for people to understand this, but I honestly think the people who deserve the most credit are the tech sector themselves. It is this point at which the mask drops that I was just talking about, where they become so-
Justin Hendrix:
Cartoonish.
Jathan Sadowski:
So cartoonish. Absolutely. They become so cartoonish that they lay bare their own folly, and then people have no option but to push back against it. No one has done more to criticize Elon Musk than Elon Musk himself. He has been a one-man wrecking crew for his own reputation, more than a whole army of critics could possibly have been. And similarly, I think the tech sector has really kneecapped themselves by going so fast and so heavy on AI immediately after spending years pushing Web3, NFTs, crypto, blockchain, stuff that people thought was absolutely ridiculous and silly, but cartoonish in the sense of harmless, like Looney Tunes.
But then to go from there, immediately after FTX implodes in one of the largest frauds in history, to pivot so explicitly and cynically to being all in on AI: AI is going to change the world, we're spending billions of dollars on this, it's going to upend every single part of your life and your work and everything else, you have no choice but to get on board, and we're on the path to creating the techno-cyber god of artificial general intelligence. They had already laid the path for people to take them less seriously with Web3, and once you see someone you don't take seriously start talking about world domination a minute later, I think they really kneecapped themselves by getting too far ahead of themselves. They got too comfortable.
I think, collectively, the public got too comfortable with Silicon Valley in the platform age of Ubers and Airbnbs and so on. There were a lot of harms that came out of that era, but I think the public largely felt, "Well, it's a land of contrasts. There are benefits and there are downsides." The tech sector made some grandiose statements then too, but they were about disrupting sectors like taxis or hotels, sectors that people didn't really have much attachment to anyway. Silicon Valley got so comfortable making grand statements about disrupting everything, about investing billions of dollars into these technologies, that they got ahead of themselves. Then it got silly, and then people saw how it got dangerous.
Justin Hendrix:
Well, and certainly we'll see how far things go in this combination of far-right politics in the United States and Silicon Valley's interests. There are similar dynamics playing out across the world, not just here. One thing I like about this book, you do spend a bit of time thinking about the extent to which notions of the future are essentially kind of colonized or otherwise manipulated by technology interests, Silicon Valley interests. And you point to other thinkers, folks like David Graeber and others who've written so compellingly on those types of things.
I kind of want to ask you a question that's always in the back of my mind when I get to the end of a book like this. You have this last bit on the wrench and the hammer. You say, "What must be maintained can always be destroyed." You talk about different forms of resistance. You begin to kind of use language that perhaps almost sounds like fighting words. At what point do you think that many people will perhaps look at the machinery of artificial intelligence or the machinery of data centers, surveillance cameras, the rest of it, and begin to think maybe it's time to smash that?
Jathan Sadowski:
Well, I think we are already starting to see some of those tendencies boil up in events that are still anomalous but spectacular. Spectacular in the sense of a spectacle. Not that long ago, I think maybe last year, a bunch of people set upon a Waymo autonomous vehicle in the Bay Area and began smashing it with hammers and whatever they could get their hands on, and then ultimately set it ablaze. So this was really a cathartic, I think, outburst of violent energy attacking a Waymo autonomous vehicle by people who were fed up. They'd had enough of these technologies infiltrating their neighborhoods, their communities, their cities, their lives, and so on. So I think we do see some of that violent reaction to the technologies, but this is where I think the Luddite becomes a bit more nuanced, and the idea of grabbing a hammer becomes a bit more varied.
We are not limited only to the direct physical violence of grab hammer, smash objects. For me, the Luddite recognizes that we have moved past the point at which the problems of a technology can be solved by simply smashing that technology, because the problems are collective, at a societal level, so you have to act in ways that treat them as political problems that demand political solutions. And for me, that comes in a number of ways. On a lower scale, a more individual or group scale, technologies like AI benefit a lot from what's called in the sociological literature the "business of expectations." This is the business of selling you a future, selling you a vision, but treating it as if it's already happened, or as if it's inevitable, that you can't resist it. In fact, it's already here. And the point of that is to buy time for the technology to actually get here, to become standardized, to become normalized, to actually work in anything approximating the way people claim that it works.
But the point there is that it's easier to get people to start behaving as if the technology already works, already exists, is already inevitable before that fact. But we're at this point with AI where norms are still becoming norms and standards are still becoming standards. In other words, it's still in the process of becoming. And that, to me, provides a lot of opportunity for Luddite action. It can be as simple as rejecting the use of some AI tool like Microsoft Copilot when your boss tells you, "This is how we do our spreadsheets now," or "This is how we write emails," or whatever it is. And you say, "No, actually, I don't want to use this, and here's why. It makes the work worse. It's more expensive. We're investing tons of money into this. There's no return on investment, there's no feasible business model." It's about providing that business case against the business use of the technology before it just becomes a norm. Whether it's better or not, once it becomes a norm, it becomes really hard to resist. Once it becomes standard business practice, it becomes hard to resist.
But we're still in that moment of becoming, and I think a lot of power can actually be taken by refusing to use things, and refusing in really thoughtful ways, making the case for why this technology should not be used. So that's one form of Luddite action that's not direct physical violence against the technology, but is much more a kind of cultural and institutional resistance to it. I think as well that a lot of inspiration should be drawn from people like Lina Khan, who during Biden's administration was the Chair of the Federal Trade Commission and really spearheaded what was seen, for that agency, as absolutely radical and unprecedented action: she actually enforced laws that were on the books.
And for the FTC, which is meant to be the federal market regulator but is so often industry friendly, Lina Khan actually took the mandate of regulating competition in the market, regulating against unfair and deceptive practices in the market, took the rules and policies that are on the books, and said, "My job is to enforce these. My job is to investigate these kinds of alleged wrongdoings and then act accordingly in judgment." And that was seen as really radical. I think the Wall Street Journal ran something like a hundred op-eds and editorials that were anti-Lina Khan, because the business community could not understand why the FTC was actually enforcing laws and regulations against them.
That, to me, is a Luddite action, because she also targeted the tech sector a lot. She did things like making technology companies that built algorithms on stolen data, or data that was otherwise acquired in illicit and illegal ways, not just delete that data, but delete any algorithmic products created from that data. That is the literal destruction and disassembly of technologies based on a social, political, and economic analysis of those technologies. That's the kind of thing that should be motivating us. And when Biden talks about the rising oligarchy and the rising threat of the tech-industrial complex, I always find it weird when a high-level politician, let alone the President of the United States, calls out problems and then says, "Somebody should really do something about this." You're the one who really has the power to do something about this. But until that happens, the responsibility does fall on all of us, in the ways that we can, to do something about it. And that does not necessarily mean picking up the hammer in a physical way, but it can mean picking up the hammer in institutional, cultural, and political ways.
Justin Hendrix:
I had the same sort of feeling about Biden's speech on some level. There's certainly always that sense that there's more that could have been done. I always think privacy and privacy legislation wasn't a significant legislative priority, even though it would have struck at the heart of what goes on in Silicon Valley. On the other hand, he did hire Lina Khan. I suppose we can save our retrospective on Biden's approach to technology for some other time. I want to thank you so much for joining me to talk about this book, The Mechanic and the Luddite: A Ruthless Criticism of Technology and Capitalism by Jathan Sadowski, from the University of California Press. You're also the author of another book, Too Smart, and host of the podcast This Machine Kills, which many Tech Policy Press listeners may well be aware of already. Definitely go check out both. Jathan, thanks so much for speaking to me. I hope we can do it again.
Jathan Sadowski:
Yeah, absolutely. This was great. Thanks Justin.
Justin Hendrix:
That's it for this episode. I hope you'll send your feedback. You can write to me at justin@techpolicy.press. Thanks to my guest, thanks to my co-founder Bryan Jones, and thank you for listening.