
Douglas Rushkoff on the Escape Fantasies of the Tech Billionaires

Justin Hendrix / Sep 6, 2022

Audio of this conversation is available via your favorite podcast service.

On this podcast, we try to give critical consideration to visions of the future that a certain set of Silicon Valley tech and venture accelerationists are working hard to advance.

In this episode we hear from author and scholar Douglas Rushkoff about his latest book, Survival of the Richest: Escape Fantasies of the Tech Billionaires, which lampoons and deflates these characters, offering instead a humanist approach to defining the future by how we comport ourselves in the present.

What follows is a lightly edited transcript of our discussion.

Justin Hendrix:

Doug, my listeners have probably read some of your prior books; they will at least remember the titles: Throwing Rocks at the Google Bus: How Growth Became the Enemy of Prosperity, Present Shock: When Everything Happens Now, Program or Be Programmed. You've written a lot of books. A couple of novels.

Douglas Rushkoff:

Yeah.

Justin Hendrix:

A couple of graphic novels.

Douglas Rushkoff:

Keeps me off the street. Yeah.

Justin Hendrix:

Let me ask you this, because it's probably important upfront for me to tell the listener that you and I teach together on occasion and have had that opportunity, so we know one another reasonably well... but let me ask for the sake of the listener. What was little Doug up to? How did you become this sort of... I suppose multi-talented writer, artist?

Douglas Rushkoff:

Little Doug watched a lot of TV. I was one of those typical 1970s, late '60s, '70s latchkey kids. I sucked on the glass teat of mainstream media every afternoon. And ultimately, for better and for worse, I kind of went meta on television. I would watch sitcoms and then look at the evolution of the sitcom swing set, from doors on the right side of the house to doors on the left side of the house. And I realized that sitcoms that had doors on the left tended to be about single or divorced people and doors on the right tended to be about still-whole nuclear families, and wondered what that was all about. And so I was just analyzing media from a really early age. And then my breakthrough experience was... It's weird to talk about this, but it was during Return of the Jedi, you know, the Ewoks? They initially take Luke and Han Solo prisoner. They think that bad-

Justin Hendrix:

Absolutely.

Douglas Rushkoff:

Right? They take them prisoner, tie them all up. And eventually, C-3PO and R2-D2 tell the story of the rebels against the Empire. And R2-D2 is making all these little 3D audio sounds of the spaceships crashing and little... I think he made a couple of holographic projections. And C-3PO, because he speaks all the languages, came up with this great story and wonderful rhetoric, explaining how the Empire is so bad and the rebels are so good. And you see the little Ewoks' eyes moving back and forth, watching this story. And then the Ewoks are so moved by this story and the technology through which the story is told, that they not only release Han Solo and Luke Skywalker and make them friends of the tribe, but they fight a war on their behalf in which Ewoks die.

Douglas Rushkoff:

And there's this one moment when one little Ewok realizes the other Ewok is dead, and you hear him cry. And that was the moment that little Doug said, "Wait a minute. If Darth Vader had gotten down to the moon of Endor first and told his story with his technology about these horrible rebels that are ruining the empire, would they have fought on his behalf?" And that's when I became a media theorist and decided I needed to understand whether the essential, God's honest truth and the real ethical reality of a situation has any advantage over the fake one, given rhetoric and technology. And if so, how do we make things transparent and real enough for media to serve the truth rather than fiction?

Justin Hendrix:

And C-3PO's many talents of course come into focus in that moment, his cultural sensitivities, his language capabilities, all the rest. If I remember correctly though, they elevate him to the status of a sort of deity.

Douglas Rushkoff:

Of a god. Exactly. So he's like a Margaret Mead and Gregory Bateson going to the South Pacific islands in advance of FDR and the army, to understand. I mean, sure, they wanted to understand how girls' culture worked in those places, but they were also doing advance intelligence, just like the... Or Christian missionaries who went before the conquistadors in South America. I mean, it's a cynical way to look at it, but then you come all the way back around to what we're doing today. Our media and technology are interactive, and boy, those of us who pioneered this space were the fringe counterculture, psychedelic weirdos, but what were we ultimately? We ended up being just the advance scouts of the yuppie scum who came in and took this place over.

Justin Hendrix:

In this book, Doug, you call yourself a humanist mistaken for a futurist. You reveal you are, in fact, a Marxist media theorist, but you start the book in the desert. Why do you start us there?

Douglas Rushkoff:

I got invited out there for a ton of money. I mean, it's a story some people are familiar with at this point, because when I told it, I didn't know it would be a big deal, but it turns out it really upset people. I got invited to do a talk for the typical group of bankers who pay you too much money to come out, but too much money is enough for me. So I went and did this, what I thought was going to be a talk for these bankers about the digital future, and it turned out they brought five dudes into the dressing room. There was no stage. They just brought five guys into the green room, sat down at this table. Two of them were definitely billionaires and the other three, probably at least close enough. And they were peppering me originally with kind of investor-like questions about the net: Bitcoin or Ethereum, virtual reality, augmented reality.

And then eventually they got around to Alaska or New Zealand, and spent the whole hour with me, paying me honestly between a third and half of my annual CUNY professor salary, to sit there for an hour with them at this resort and talk about how do you maintain control of your security force after your money's worthless? And it set me on this mission. I would say, if the wealthiest and most powerful technologists and investors and people I've ever sat in a room with feel powerless to influence the future, if the best they can do is prepare for the inevitable event, the climate catastrophe or pandemic or social unrest or economic upheaval or nuclear war, whatever it is that destroys the world, it's pretty frigging grim.

And I came up with this... Well, really I observed this phenomenon, what I've come to call the insulation equation, which is that they're living with this idea of, "How much money do I need to earn in order to insulate myself from the reality I'm creating by earning money in this way?" And there's no way out. I just read Cory Doctorow's piece... Oh, you published Cory Doctorow's piece. That's where I read it. On the Epson printers. Did you link to that?

Justin Hendrix:

I may have put it on Twitter.

Douglas Rushkoff:

Yeah. Oh, my God. So he writes this thing about Epson and how they have this sort of thing where it counts the number of pages and then just bricks your printer after that. And the justification is that it's preventing ink from leaking onto your desk, because there's a little sponge in there that might be full by then. So you've got to just throw out the printer. And I'm thinking the guy who makes that decision today surely understands about climate change, right? He surely understands he is accelerating the rate at which the planet will end by coming up with a technology like that. But what he's thinking is, "I will make enough money through this evil technology that it will give me a competitive advantage in outrunning the disaster that I'm actually creating by doing this." For my money, that's just insane.

Justin Hendrix:

Tesla founder Elon Musk, colonizing Mars; Palantir's Peter Thiel, reversing the aging process; AI developers Sam Altman and Ray Kurzweil, uploading their minds into supercomputers. You go from there to luxury underground apartments and converted Cold War munitions storage facilities, missile silos, other fortified locations around the world, miniature Club Med resorts. And then you explain to us this idea that you call The Mindset, and perhaps that's how we got here. What is The Mindset?

Douglas Rushkoff:

Yeah. Well, The Mindset takes a while to explain, in a way, because it's what this whole book ends up being about. I'm trying not to blame it on individual people so much as this mindset that they've internalized. So yeah, The Mindset is this Silicon Valley belief that human beings are the problem and technology is the solution, and that, yes, they contend that they can outrun the externalized damage of their enterprises. And there's a number, I guess, of tenets of The Mindset. It's fun because I'm just starting to talk about this book now. So it's like, "So what are they?" I'm asking myself. I mean, there's kind of a staunch atheistic scientism: there's nothing going on here, move right along. No soul, no people, no, nothing. We are just computers running on selfish genes. There's a techno-solutionism that these guys have, where they believe that whatever the problem, there's going to be some big technology that you can throw at it.

Even if you need a $100 million MacArthur award to do it, you can do it. There's an adherence and kind of a surrender to the biases of digital code, that sort of one, zero, yes, no, everything can be resolved into one of these quantized levels. There's an understanding of human relationships as market phenomena, that at best, altruism is just a form of long-term self-interest. There's a fear of women and nature and, I guess, a need to neutralize the unpredictability and the unknown by dominating it or de-animating it. And then finally, I guess there's this need to see one's own contributions as just utterly unique and without precedent. I've gone down to Burning Man, I did ayahuasca and I saw the truth: the climate is imperiled. And I'm going to go, "So dude, we know this. We've known this for a while." It's as if, when they're not the ones coming up with it, then somehow it doesn't exist. The other thing is they're addicted to going meta on stuff.

Whenever they reach a problem, whenever they reach... It's like Zuckerberg reaches the peak of his subscriber base on Facebook and social, the public is turning on him. So what does he do? He goes meta. "I'm going to invent Meta." Oh, it's sort of Web3, crypto, virtual reality, augmented reality, everything. I'm just going meta. Or Peter Thiel's idea of going from zero to one is meta, right? It's like you have to rise one order of magnitude above everybody else. That Web 2.0, Stewart Brand thing: we are as gods, so we may as well get good at it. I'm going to be... Even a self-sovereign individual is what? It's like you're going meta on... I'm king of me. So I'm both me and I'm this sort of sovereign over me at the same time, and that's the sort of going meta that they all do. So those are the main, I guess, features of this Silicon Valley mindset that's getting us into all this trouble.

Justin Hendrix:

There is, I suppose, an indictment of capitalism that we would expect from a Marxist media theorist.

Douglas Rushkoff:

Yeah. And I'm not really. At the time, I think I might have been a Marxist theorist. If I'm anything now, I'm an [inaudible 00:13:12] media theorist. But yeah, I would also criticize capitalism, for sure. And for a long time, and even in the first, almost-draft of the outline of this book, I basically blamed capitalism for technology's woes, to say that the internet sold out to the mob in the same way a restaurant sells out to the mob: the restaurant is just kind of a front for money laundering. So the internet, which was about unleashing the collective human imagination and creativity and all that, just became a poster child for the NASDAQ stock exchange, and they just burned these businesses like pop stars, just to financialize something.

But as I worked on it, I realized it's not just capital... Capitalism dovetailed perfectly with digital technology's ability to go abstract and to, again, go meta. What financialization is, what capitalism is, is going meta on the market. There's a real market. Then there are stock shares that represent the market. Now there are derivatives that represent stock shares, and derivatives of derivatives. And it's that part of capitalism that got amplified so easily by digital, which just loves to scale and scale.

Justin Hendrix:

In the book, you illustrate this by telling the story of how you got mugged in Park Slope, which is not something I knew. Why did that particular account sort of feed into your thesis?

Douglas Rushkoff:

Well, it's funny. Yeah. I mean, we lived in Park Slope a short time, between the Lower East Side and out here in the suburbs, which we could actually afford. So yeah, we were living in this apartment. I couldn't really afford to be in Park Slope, hoping to be able to stay there long enough for my kid to get into... What was that school? P.S. 321, the famous good elementary school in Park Slope. We didn't last. But I'm taking out the garbage in front of the apartment there and I got mugged at gunpoint. And I went back and posted on this list. It's called the Park Slope Parents list, this great, crunchy, how we're all going to know each other-

Justin Hendrix:

Famous list. Absolutely.

Douglas Rushkoff:

Yeah. And it's a good list and all that. I posted, "Oh, my God, I got mugged on 7th Street and 6th Avenue. And this is what happened, and look out." And the first two emails I got after that were from people who were mad at me that I had mentioned the location of where I was mugged, that it would adversely impact their property values. And I'm like, "Oh, dude, I'm sorry. You were meant to sell..." Neither of these people were actually selling their homes, but they were dependent on their home value going up so they could keep refinancing their mortgage at bigger valuations, so that they would own more of the house over time. Everyone was doing these interest-only loans that lasted like five years and then you had to buy a new one, but it only worked... You only got to stay in your house if the value of the house was going up.

So then you could get in at better valuations. If it wasn't, then you'd be screwed. And when that happened, I was like, wait a minute. So these people are in a situation where they care more about the abstracted asset value of their home than the quality of life they're actually living, right? It's like, "La, la, la, don't tell me." And that's the mindset right there. "La, la, la, it's not happening. La, la, la, la, la, la, la." I'm on the Stanford campus where it's beautiful. I'm not going to look over at the east gate of Stanford where there's a tent village of people who can't even afford a home, while I'm here in... in Utopia. And I get it. We want to close it out, but we've gotten to the point where you can't. That's the sort of Trumpian... Yeah, we could build a wall between us and Mexico, but at some point, you've got to share vaccines with the world so that bugs don't get you.

Justin Hendrix:

There are so many of what I think of as Doug Rushkoff's greatest hits in this book, but where you seem to push into new ground is, to some extent, this combination of the role of tech and money, where these two things are sort of in conversation with one another. But you also take us, I suppose, on another tangent into your theater-kid past. And apparently you were in close proximity to Paul Reubens, Pee-wee Herman, at one point.

Douglas Rushkoff:

Yeah, I went to CalArts. After I went to college, I went to CalArts to get an MFA in theater directing, which I still want to go back to. And in some ways, this book has made me value theater again. I've been writing these serious non-fiction books for a long time. And as I reread this book, I realized that it's a comedy, that it's really a black comedy. And the fact that it's a comedy is not a bad thing but a good thing, that it's empowering to be able to read about the visions of Musk and Thiel and Bezos and those guys and laugh at it. It's like, "Oh, there but for the grace of God go we. Thank God I'm not addicted to this mindset. Oh, I just get to live a happy life. I don't need to spend all this energy and time and money to try to insulate myself from the world. Oh, good. Ha, ha."

So it's made me actually like theater again and feel less guilty about what seemed like self-indulgent artistic fun. I mean, making theater is just so fun. Are we allowed to do that when the climate's burning? Yes. It turns out we are, because it can actually help change the way we think about the world. And I went to CalArts with all those kinds of guys. I mean, Tim Burton left right before I got there, but it was a crazy, wonderful, wonderful school. But the theater department was really old-school, classical theater training. And everything that they taught us was based on Aristotle's narrative structure: beginning, middle, and end; crisis, climax, relief. It's that sort of male orgasm curve of narrative fiction. Rising action, reversal, and... As if the whole point of telling a story is so people have this catharsis and they can go back to their humdrum, middle-class lives and smile.

Whereas for me, I was much more kind of Brechtian or David Lynch in my thinking: "I like narratives that are open-ended, that require people to think longer and have arguments and go out and do something." The resolution... Being addicted to resolution is part of the problem. That's what the billionaires are. They need the event, they need the climax. They've lived with these business plans that have exit strategies so much that they think we need an exit strategy as a civilization to get out of this thing.

So at CalArts, me and my friend Bernie staged a bit of a revolt against Aristotelian narrative. And we're like, "There are other story structures, there are other narrative forms. And if we as a civilization can adopt a narrative form other than Avengers: Endgame, then maybe, just maybe, we can grow into a sustainable middle age rather than burning out." So yeah, The Mindset, again, is addicted to endings. It's addicted to the IPO, the release, the thing, the proof. And that's not it; life is about living in that weird, squishy in-between. "Here we are, man. What the fuck is going on here?"

Douglas Rushkoff with a precariously balanced spoon.

Justin Hendrix:

One of the people who comes in for a bit of caricature that I would link to this perspective in the book is Steven Pinker, the cognitive scientist and also very prolific author, perhaps a Rushkoffian nemesis. But you bring up the critique of Pinker that's offered by David Graeber and David Wengrow in The Dawn of Everything, which I must admit I am slogging through; I have not completed the book.

Douglas Rushkoff:

Yeah. It's weird. It's so interesting to me, because that book is both the greatest book ever written and a slog. Isn't that interesting? Because it's like, you've got to be... And that's the thing, I think it's a book you buy and then go to when you're in the mood for that. And when you are, it's frigging profound, texturally, historically, but you've got to be... It's like-

Justin Hendrix:

Yeah, absolutely.

Douglas Rushkoff:

It's a thing. So don't try to read it when you're not in the mood for that. It's like, don't take mushrooms when you're in a bad mood either, right? But it may stand up as, really, when push comes to shove... This is like The Origin of Consciousness in the Breakdown of the Bicameral Mind. This is the real deal. Mumford, Technics and Civilization. I think this book's going to stand up, even with some wrong stuff in there maybe, as a profoundly world-shifting thing. And for me, it was. It was like, "Oh, my God, there are finally people doing the anthropological work to say no, this story of Western civilization incrementally getting better with new technologies, and then the new technologies allowed for the origins of consciousness, and then with consciousness came the individual, then we got the Enlightenment and individual rights and democracy, and now in the West, Alexander Hamilton." And that thing, it's like, "No, wait a minute, there were democratic societies. There were advanced forms of senates and representation." What do they call it? At Occupy, when we used to do... Not the mic check, but what's the form of government that it was-

Justin Hendrix:

Well, very much distributed.

Douglas Rushkoff:

Yeah. Basically a different kind of consensus-building technique that was used at Occupy, different than parliamentary left-right debate. And their book is great because it does say, no, society is not an arrow moving in this one direction towards progress. It's very circular or spiral stair-casey, and things repeat, and we do have dead ends. And it gets us out of this Western emancipation movement where science, technology, capitalism, democracy, individualism, liberty, Westward Ho, here we go.

And we get all the way to California. We take over these lands from all these other people. We extract their resources, we enslave their children and we ruin them. And then it's all as if... Oh, but things are getting better now, because you can walk down the street reasonably assured that someone's not going to attack you with a sword. Yeah, things are better like that if I am a white, Western, upper-middle-class Manhattan person walking in a good neighborhood. You're right. You're right, but at what cost to everybody else and to our environment? It's like, "Yes, I feel better today because I injected 30 grams of steroids and I'm hitting baseballs really fast," but what's that doing to my liver and my kidneys and my babies?

Justin Hendrix:

Well, certainly the first hundreds of pages of that book, if that's all you get through, are likely to make you more anarchy-curious, if nothing else, which I suppose may connect also to Occupy, a form of governance on some level. But I noted that part in the book in particular because I feel like that book, like yours, asks you to rethink some basics of this linear plot line, which you, of course, are now attempting to take apart. But let's get a little further into it here, because the book doesn't just present caricature and critique; it also suggests some ways that we might go about perhaps reclaiming this sort of civilizational effort. Where do you think we have to start?

Douglas Rushkoff:

I mean, I do like policy, and I know that's part of the whole frame of this organization: policy. I do. And I think that a very few intelligent people can work on policy, ideally without the Twitterverse dominating the conversation. It was weird, it was after Biden withdrew from Afghanistan and Twitter was going all nuts about it. And people started direct messaging me, "Doug, why haven't you weighed in on Biden's Afghanistan withdrawal policy?" And I'm like, "Honestly, I know very little about how you withdraw from a war, much less... I just don't know. I don't know the logistics. I don't know. I really don't know." And I think maybe, if just 100,000 people have that argument, there's probably enough brain power. Just 100,000 of the top thinkers on that; we don't need it to be a million or a hundred million people having that conversation.

So with policy, I'm so happy, I am delighted to serve as a friend to a policymaker who might want to understand the biases of a particular medium, because that's what I'm really good at. But I'm not the one to take it all the way to the policy level, and most of us aren't. So it's weird, I'm all for policy, but policy done by people who understand how policy works, with the rest of us staying out of it, would be really cool. And what we can do as people, as the other 99.999% of people, is... I hate to say it like this, but it's: just do less, go local, become more social, meet your neighbors, share things. It's really as simple as, if you live in a suburban neighborhood, everybody doesn't need a lawnmower. The current strategy is for everyone to go to Home Depot and get the minimum viable product lawnmower, which is made in a crappy way, extracting the worst resources, and it lasts just a few years and then you throw it out and it sits on a big pile and then you get another one.

Like the Epson printer, the smarter thing would be if everyone on the block chips in and you get one good lawnmower that actually can last for 50 years-- my dad's still works-- a nice, good, great lawnmower, or a manual lawnmower for that matter, which is quieter and leaves the grass on the lawn, which it turns out is better than taking it away and dumping it somewhere else. You shouldn't even have grass anyway, but let's say you have one lawnmower for the block. You all share it. You make a schedule, because no one's really mowing more than a couple hours a week. It turns out one is enough for 10 or even 20 houses. Is that bad for the lawnmower company? In the short term, it is, because now they're selling fewer lawnmowers and there are fewer jobs, but you need less money because you're not buying as much stuff and we don't have as much pollution, as many externalities.

So I would say go local, go social, and accept that moving towards degrowth is not a bad thing but a good thing. Degrowth means we have more time to spend with our families, making love and playing cards and tutoring kids and learning to read, doing mutual aid. The argument against that, the only argument when people yell at me for it, is they say, "Well, yeah, but the economy has to keep growing. The economy will not keep growing if you let people do that." And I would say, "Since when are we obligated to keep the market growing?" That's the same as the idea that technology should be using people instead of people using technology. We've got the cart and the horse reversed, the cause and effect, the figure and the ground. Once we can, as individuals, kind of transcend this game that we're in, the Elon Musk, Bezos game of winning, of "I've got to win the game"...

Once you can get over that, read James Carse, Finite and Infinite Games. It'll take you two hours and change your life. It's basically saying you're not playing in order to win. You play in order to keep the game going. And if you look at things that way, rather than trying to fix the future world and this ends justifies the means thing, sacrificing the people of today for the sake of future generations in space, you instead say, "No, no, we're doing it right now. I'm going to help other people right now. I'm going to make life better right now at all flips."

Justin Hendrix:

So you are actually arguing for a cultural transformation and a reorganization of the way we live. And I'm certainly more keen, perhaps, to follow your particular version of things and follow that set of ideas towards its logical conclusion than I am, perhaps, Peter Thiel's vision. But I'm thinking back to your original comment on who got down to Endor first. I mean, the cultural transformation that you are suggesting, maybe it's the sort of C-3PO rebel version of it, and we might characterize the sort of Musk, Thiel, et cetera, as the Darth Vader version of it. But I'm afraid Darth Vader made it to Endor first. It's not going to play very well in Peoria.

Douglas Rushkoff:

Right. We're living in a society where people believe... I mean, who's closer to Darth Vader than Peter Thiel, right? I mean, that's like it. Who's closer than that? It's kind of perfect. But the advantage we have is that we don't need a story for people to get this. All you got to do is say, "Look into your lover's eyes, hold your child's hand, put your baby on your bare chest." And it's like, "Oh, right." You're reminded. Luckily this is not, "Okay. We've got to all hold hands and hold a revolution against Trump and these guys, it's going to be hard. There's going to be death. There's going to be pain. But at the end of the thing, follow me up the hill." It's that striving thing that's kind of the opposite energy of the... Maybe the alternative narrative structure I'm talking about, but it's a more experiential form of propaganda.

So maybe it still is. And maybe we do. So yeah, if I write a book, what is a book but propaganda? But at least I'm disclosing it: here's this problem, and I think there's a different way. But again, that's why the arts are so powerful, that if you use the arts to have a sort of experiential thing, you can move out of this other way. But I hear you. I mean, I look at QAnon, the conspiracy theorists, and what are they doing but a form of media studies? Do the research, look online, connect the dots, use your own judgment and figure it all out. What are they doing but that? The difference is... And someone actually accused me with this book. They said, "Oh, you're going to give more ammunition to all the QAnon people who want to see the bad of these guys." It's like yes and no.

But I think the QAnon people are among the few people who take these people literally, who actually believe them. If you believe in the Great Reset, and that they can genetically re-modify humanity and redo this... Then yeah, if you take these dudes at face value, the only place you could go is crazy conspiracy theory. But the fact is, none of their bunkers are going to work. They're not getting off the planet. It'd be harder to put up a dome and live on Mars than to live on Earth even after a nuclear disaster. I mean, it's beyond comprehension. These guys are crazy. They are crazy. They're not close to the stuff that they're describing. I mean, Pokémon GO is state of the art. That's where we're at right now. And it's beautiful. It's wonderful. I mean, play with the 3D audio on your thing.

It's cool. It's great. These devices are wonderful. I'm not saying technology is bad. It's wonderful. You know I love it. Here we are talking on this thing. This is great. This is the greatest thing since sliced bread; better, I think, because I like to slice my own bread personally. But this is great, great stuff. It's just that the people who are making it are doing it in such a way as to destroy the world; they know they're destroying the world, and they are actively making preparations for the disaster of their own making. And if you realize that, then you realize, "Oh, well, maybe this is not the way."

Justin Hendrix:

One of the technologies that you talk about specifically, of course, which is perhaps core to that, is artificial intelligence: the fact that a lot of these individuals are, of course, both heavily invested in the development of AI technologies and also appear to be terrified about what it will mean.

Douglas Rushkoff:

I know. Have you heard of... What was it called? Foo Camp. It's Friends of O'Reilly.

Justin Hendrix:

Sure.

Douglas Rushkoff:

I got invited to that one year. And it's a lot of big techies. They're people who invented the streak feature on Snapchat, like, real names. And one guy, we're talking about AI, and he had seen something I had written that was critical of AI. And he's like, "You know, Doug, you probably shouldn't be doing that." Like, "Why?" He goes, "Once the AIs take over, they're going to see what you said. They're going to see what you said, and who knows what they'll do at that point, because they'll be in charge." And I'm like, "What? Oh, so you don't tweet or say..." He said, "No, I don't tweet anything about AI because I don't want them to know." And I'm like, "Well, if they're AIs, then won't they be able to do statistical analysis and machine learning and figure out, from what you didn't do, that you're a person who doesn't like them and also deserves to be killed?"

Justin Hendrix:

So he's already hiding from the AI.

Douglas Rushkoff:

He's already hiding proactively from AI, without the knowledge that, of course, the AI is going to figure out he's one of the people that was hiding from AI and is probably the most dangerous. They're going to come for him first. So I freaked him out. He ran to the bathroom after I said that. I unleashed his colon or something with fear by saying that. But the great thing about the AI thing, and speaking with you, I realize this is really the whole thing, is what we're saying about The Mindset: that these guys want to go meta, that they feel safe if they go meta on us or look at us from above or go into the virtual reality version or create the derivative of the stock. The reason they're afraid of AI is because AI is the one thing that could go meta on them.

It's the thing, but it's the next level, and we can't join them there. It is up there. So yeah, if you really do believe that going meta is the way to dominate the other, then sure, AI is going to dominate you. I'm not so scared of AI so much as... And this is where I need you, and policy people. So even if an AI has the intelligence of an ant, it's not that I'm afraid of it being smarter than me. I'm afraid of us surrendering vital societal functions and decisions to that ant, right? Just like we turn over decisions now to shareholders who are deciding whether to create pollution or whatever. Don't give... It's what we voluntarily surrender to the AI that's the problem, not the AI itself. These are babies, they're babies, not even.

Justin Hendrix:

In this book, you invite folks to join you in listening more carefully to the promises of the tech titans and the billionaire investors, as well as the world leaders in their thrall. In each and every one of their grand plans, technology solutions and great resets, there's always an "and" or a "but", some element of profit, some temporary compromise or cruelty, some externality to be solved at a later date or some personal safety valve for the founder alone, along with his promise to come back for us on the next trip.

Douglas Rushkoff:

Yeah.

Justin Hendrix:

A word that doesn't appear in this book that I've seen discussed quite a lot lately is longtermism, which is perhaps emerging as a... I guess maybe not unified, but a set of ideas that a lot of these folks espouse.

Douglas Rushkoff:

Yeah. I mean, some long-term thinking is a beautiful thing. If you understand that, that's why I always say I'm not a futurist. The tech bros' problem is that they look at the future as this inevitability, that the best thing you could do is predict it and prepare for it. Whereas me, as more of a hippy anarchist person, I look at the future as this thing that we're creating in the present with our actions. The future is fungible, not fixed, but I'm not betting on it, which is why I don't need it to be predictable. I'm hoping and I'm ready for unpredictable outcomes. I like that. I like novelty. The last thing an investor wants [inaudible 00:38:42] was novelty. No, no. I just bet on that. I bet on colonialism, I bet on this particular profit margin. But the longtermism that some of these folks are talking about...

If you read Musk's tweets, or Thiel, or one of these guys, they believe that there are only 8 billion people alive today, but after we migrate to space, there'll be like 90 trillion people out in the cosmos. So it's okay to sacrifice the lives and quality of life of people today in order to promote the joy of those people, because the joy of 90 trillion people far outweighs the pain of 8 billion. But I challenge that whole notion. First, whenever you're sacrificing people in the present for some future fictional thing, you are going to end up in a worse place, not a better one. There's no "ends justify the means"; it never works. So whether you're a libertarian accelerationist who says, "Let's rip the bandaid off now," or you're more of a longtermist who says, "We'll slowly let these people die today so we can do this other thing in the future."

No, no, no, no. The only way is to look at the now. Yes, we can put sandbags up. Yes, we can work together to mitigate disasters and stuff. I'm not saying ignore the future, but we address the future through the present. And most importantly, through our comportment in the present. The way that we work together now is the future. It's not whether we put up the wall. It's how we build the wall. And who are we building the wall against? Are we putting the wall up against the tide that's going to come in and destroy Miami? Or are we putting in the wall against the Mexicans who are going to come walking up here when they're flooded out by climate change down there? Two very different walls, put up in two very different ways.

Justin Hendrix:

Doug, I'll ask you a last question. Do you remain hopeful? Do you remain sort of optimistic, perhaps? And I realize optimism and hope perhaps are different terms, but do you believe that we can make the set of decisions that you would like to see here, that your version of vast cultural transformation at this point is possible?

Douglas Rushkoff:

Yeah, I'm actually... Oddly enough, I'm more hopeful after writing and just rereading this book than I was beforehand. When I started this book, I was like, "We're all going to die. I'm going to witness the end of the world in my lifetime. And if I don't, at least my daughter will." And now I'm looking at it as, "Oh, wow. These loser, capitalist, technocratic, billionaire idiots are so addicted to their way of building technology and doing business that they would actually rather the world end in an apocalypse than change their ways, that they are in some ways actively wishing for the endgame to put them out of the misery of worrying about what it is they're doing. Just bring it on already." It's a Steve Bannon kind of thing: let's just wipe the slate clean so we can get from game A to game B, reboot the civilization and just put out a whole bunch of humane technologies that raise people like nice, cage-free, free-range chickens.

And I'm like, "No", they're crazy. They're crazy. So I challenge the underlying foundation of the whole thing and I challenge the underlying foundation that climate change is not a much easier thing to solve through massive reduction in our energy expenditure, a change of our expectations about how many times you're going to get to go to Europe in your lifetime. Maybe everyone gets to go once and that's it. And then eventually we can go on little helium if there's still helium left or whatever, hot air drones or something, we'll find other ways in the future, but just everybody just shut up, calm down, play baseball, meet your kids. They're really cool. And there's going to be a whole lot more for everybody.

Justin Hendrix:

The book is called Survival of the Richest: Escape Fantasies of the Tech Billionaires, the latest from Doug Rushkoff, but there are many more on the list to get through once you've read that one. So Doug, thank you so much for talking to me. I hope we'll do it again.

Douglas Rushkoff:

Thank you. Thanks for doing Tech Policy Press. This is an important frigging thing.

Authors

Justin Hendrix
Justin Hendrix is CEO and Editor of Tech Policy Press, a new nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & ...
