Contemplating YouTube's Rise: A Conversation with Mark Bergen
Justin Hendrix / Sep 4, 2022

Audio of this conversation is available via your favorite podcast service.
In a June report that I helped co-author with Paul Barrett, Deputy Director of the NYU Stern School of Business Center for Business and Human Rights, we noted something that other observers of the major tech firms often point out about YouTube: that the massive social video-sharing platform has received disproportionately less scrutiny from journalists, social scientists, politicians, and civil society groups. Legal scholar evelyn douek calls YouTube’s uncanny ability to escape the kind of attention that Facebook draws “magic dust.”
But there are those who do pay very close attention to the company, and today we're going to hear from one of them: Bloomberg journalist Mark Bergen. He's the author of Like, Comment, Subscribe: Inside YouTube's Chaotic Rise to World Domination, available Tuesday from Viking Press.
This is a business book, a history, and a contemplation of YouTube's role in society all in one. Bergen explores how the company evolved into the massive juggernaut it is today, and along the way gives insight into concerning phenomena that we've discussed on this podcast in the past, such as the relationship between YouTube and violent extremism, misogyny, racism, white nationalism, and a variety of other ills. The book pulls the curtain back on the internal dynamics and decisions that bring us to today. And it asks us to contemplate whether anyone, from Google's leadership to regulators in any of the world's governments, can truly get their heads or hands around YouTube.
What follows is a lightly edited transcript of the discussion.
Justin Hendrix:
So just after a very sweet dedication to your partner, and before the table of contents in this book, there are two quotes. One's from Mary Shelley's Frankenstein: "So much has been done, exclaimed the soul of Frankenstein, more, far more, will I achieve: treading in the steps already marked, I will pioneer a new way, explore unknown powers, and unfold to the world the deepest mysteries of creation."
And the second one's from someone that many of my listeners may not be familiar with, Logan Paul. And that quote is, "It was going to be a joke. This was all going to be a joke. Why did it become so real?" Why did you start with these two quotes?
Mark Bergen:
I wanted to hit it, maybe a little too on the nose. I think the story of YouTube in many ways is a Frankenstein story. They created this monster that they couldn't control, the monster being the platform, and I think what's most fascinating about YouTube is that unlike a lot of other social companies and networks, it's really not run by one founder or controlled by a monolith. It's gone through three different chief executives. Two founders left really early.
So just structurally, how it worked, the platform is this monster that the company's always trying to tame. And then the Logan Paul incident, which I'm happy to get into, was one of these; it happened during the holidays. The scandal was that he showed a dead body hanging in what's called the suicide forest in Japan, and the video briefly became one of the most popular on YouTube, and a real inflection point for the company trying to handle out-of-control creators. At that point they made a series of changes from that incident that still have repercussions today.
Justin Hendrix:
So we'll get a little bit into, of course, how we got there, but when I read that Logan Paul quote, it reminded me of a scene in the HBO documentary on QAnon, Q: Into the Storm, where Jim Watkins is marching down Pennsylvania Avenue with a bunch of other MAGA supporters on January 6th, marveling at how all these online phenomena have played out in real life. And in this book, I feel like there's this constant thing of, A, it's an amazing history of, of course, the company and the business of YouTube, but it's also just an endless set of reminders of these crazy pop culture phenomena that come in and out.
Mark Bergen:
The Logan Paul thing, there could be an entire book just about Logan Paul, but what was so fascinating to me is he's just this really prime example of someone who was sort of raised on YouTube, which is interesting. The early generation of YouTube was this response to TV and traditional media, and Logan Paul's part of this generation, Gen Z, not the TikTok generation, that came up with this idea that influencers and being a creator and making money on the internet was a given and something you could strive for, and who have this innate understanding of the spectacle and what works, and operate with this level of ironic detachment that makes it really difficult to see when they're being sincere.
He's very much a YouTube star in the sense of what we talk about with parasocial relations. A lot of teens that watch people like Logan Paul think they know him, and it's not a social network in the sense that the posts you see are not from people you actually know, but they are from people that you have these, I think, more meaningful relationships with than watching George Clooney. And YouTube grew in this sense because they'd let these creators run wild and do things that never would've appeared on broadcast TV, which unleashed all sorts of wonderful new forms of creativity and invention, and then had these consequences because the company didn't have safeguards in place, and it's just one of many examples where that becomes pretty obvious.
Justin Hendrix:
So the book is, in many ways, a history, a chronology, of YouTube across the last, well, now 17 years, but you take us back to 2005 when the founders of YouTube, like Mark Zuckerberg, were apparently inspired by a website, Hot or Not, as well as other nascent social media. What was in the cultural and engineering stew in those early days of YouTube?
Mark Bergen:
Yeah, certainly online dating just launched so much of the modern internet. This was a time, I think there's a phrase in the book, when the companies were sort of like sharks turning in the water at the smell of blood. The YouTube founders, all three of them, had worked at PayPal, and PayPal was famous for its pivot. They started off as mobile security software, they pivoted to payments, and they became PayPal, and then there was this famous PayPal mafia: Elon Musk, the founders of LinkedIn and Yelp. It was this really early stage of Web 2.0, and I think something that wasn't really appreciated, that I didn't fully understand, is we talk a lot about Uber as this classic example of, we're going to operate in this legal gray zone, we're going to do something and ask forgiveness later rather than permission.
I think YouTube's a really great example of this, where these companies were operating in these gray zones around copyright and IP, but just leaned in and went really hard, and that was the history of, "We're going to put things up." When they first operated, they didn't have a lawyer on staff and had only a rough understanding of copyright law, but they went in with this gusto, while the big companies that had competing products, in this case Google and Microsoft, were much more reserved, and YouTube could take these risks that arguably led to their success.
At the time there was also this culture, very LiveJournal, early internet, hacker culture, where the internet was this respite from corporate media, from government censorship. This was the Bush era, right? And I get into this a little bit in the book: some of the founding team who worked at YouTube came up in this culture, either worked for or around or with the EFF and that scene, and I think were really important in shaping the platform. Then later on, it gets much thornier and changes a little bit when YouTube joins Google.
Justin Hendrix:
So one of the things I was struck by in looking back that far, at 2005, was the fact that YouTube's form came into being almost at once, and it really is somewhat the same today. There must be, and we'll talk about this, a lot more going on in the back end, but you talk about how the trio, Chad Hurley, Steve Chen, and Jawed Karim, essentially added a set of features: the ability for people to leave comments, a small button to share the videos, and then this idea that when someone clicks a video, a row of related ones appears, prompting them to watch some more. In many ways the form was there from the beginning.
Mark Bergen:
I think that's true. The core product hasn't really changed in its entire 17 years of existence. That being said, obviously a lot is different than back then. I did think there was something there, though some of this, with the people I talked to, may be a bit of historical revisionism. Think about Flickr, another company around that time that got sold to Yahoo before YouTube joined Google; it was sort of an inspiration for YouTube, and certainly for YouTube's investors it validated the business. But there was an idea that this was an online community of people. The assumption, and early on I think it was largely the case, was that the people making videos were also the ones watching, and it was a pretty tight-knit community, very different from today, where most people using YouTube are not uploading videos or trying to be online creators.
There was a much bigger overlap, and there was more of a sense that these were online communities, and I think just that framing and thinking really had a lot of impact on the decisions they made. One interesting example is a feature that YouTube spun up over a weekend, according to people I talked to: video responses, actually putting video replies in comments, which today I imagine would be a nightmare to moderate, but at the time it was sort of a social feature. A lot of those social features and interactions were stripped away, and perhaps that's, in a way, partly why YouTube's not really seen as social media today, but I think there was certainly a different outlook and approach at the company back then.
Justin Hendrix:
It's not only that form and function that seem to be there from the beginning, but right around the time that Sequoia comes in with a big investment, we start to see some of the first real problematic content arrive on YouTube, and YouTube's first go at content moderation. You tell the story of this woman, Julie Mora-Blanco, who sees a video she would never forget.
Mark Bergen:
Yeah, they were kind of the first front line of content moderation; it didn't really exist on the internet. Julie is a really good example; these people were part of the YouTube squad. There's a great anecdote I'll share from the book: in YouTube's first office, they hired these people to do content moderation and happened to seat them at the front of the office, and then they actually had a few visitors and realized, "Okay, we need to buy those industrial privacy screens so no one else can see the screen, and maybe we shouldn't put these people at the front of the office." I thought it was just this really great metaphor, because flash forward a few years and now the screeners are sitting in the Philippines and India and Ireland, working for companies not called Google, right?
I think that was valuable in the sense that they were much more integrated with decision making, especially when YouTube was a smaller team. Admittedly the processes weren't in place as they are today, with this massive operation with lots of lawyers and protocols, but they were writing some of these initial rules and I think being fairly thoughtful at the time about how they wanted to deal with things. This was before misinformation was really an idea. Another example I mention in the book is videos about phrenology, basically shot as documentaries, talking about the science behind it, and at the time I think Julie decided to mark that video for deletion; YouTube early on built this exception for documentary, scientific, and educational videos.
There were a lot of really fascinating debates, and I think it's kind of hard to even imagine that there was no guidebook or precedent for a lot of these decisions they were making. Micah Schaffer, one of the early YouTube policy leaders they hired, had worked around the EFF, and some of his experience, my understanding is, was in early LiveJournal culture.
YouTube was actually pretty early in setting a policy around eating disorders and content around eating disorders, because Micah had seen that on LiveJournal and in those online discussions. And I think the team would peruse 4chan, and there was a moment, I forget when exactly, when on 4chan a bunch of people planned, like, "We're going to do a coordinated attack and just flood YouTube with porn." Early on, porn was a really big problem for YouTube, before they built up the system for detection, and the team was able to flag that because they spent the time; they knew where to hang out on these backwater forums on the internet.
Justin Hendrix:
So you take us through the early history of YouTube. It's everything from Hollywood to Viacom, ads, ad tech, the copyright wars, and then ultimately the acquisition. It's interesting to me that the strategy to buy YouTube emerges from Google's failures around video, and in particular Susan Wojcicki's failures around video.
Mark Bergen:
Yeah, there's the Silicon Valley phrase, "You can buy or build," or I guess it's, "Build or buy," and Google initially tried to build with Google Video, which was a product that actually started off as a very Google-y thing. It was just going to do captioned TV, make a searchable index of TV captions, and then they moved over to user-generated video, and pretty gingerly. At the time, they were actually screening, which is hard to believe, but they were screening the videos in advance because they just thought that was the only way to avoid liability.
Susan Wojcicki was, in the book, one of a few product managers the staff would kind of jokingly call mini CEOs. Marissa Mayer was another one. They basically ran a lot of the day-to-day operations, and Wojcicki oversaw a bunch of different products, including Google Video. I think there's some question, and she talked about this later on in interviews, whether she always had the foresight to identify YouTube as this cultural phenomenon. Eric Schmidt, who was the CEO, was also instrumental in that deal.
I think the most interesting thing to me was Larry and Sergey, the founders of Google, obviously back then much more involved than they are today. For them, YouTube was a search property, and a growing one, and they, like everyone else involved in tech, knew that after text, video could be the next big thing on the internet. It's not talked about enough today, but YouTube remains the world's second biggest search engine, behind Google, and that was, I think, one of the primary driving interests in why Google bought them.
Justin Hendrix:
And right around this moment, you talk about this idea that one of the early founders, Hurley, who you say embodied YouTube's brand, someone who used YouTube quite a lot, steps aside, and things start to happen Google's way: spreadsheets, algorithms, and the company's now more run by people who don't actually spend much time watching YouTube. But right around this time, a street vendor in Tunis sets himself ablaze.
Mark Bergen:
Yeah, the Arab Spring was a fascinating moment for a variety of reasons. So to set the stage: I think there are two different versions of the story, and they're both slightly true. One is that people that came in from Google were much more sort of Google-y, right, spreadsheets and algorithms, scale, not as familiar with internet cultures and certainly not familiar with the [inaudible 00:18:57] media. But YouTube was not a commercial success. There was no clear business model. It was just sort of an also-ran; there was the heavy Viacom lawsuit still lingering until 2010, and these new managers that came over from Google were trying to whip it into commercial shape and actually make it a viable product, and I think there was legitimate fear there that, "If we don't do this, Google might let this thing die on the vine."
It was definitely a staple of pop culture by then, but that's no guarantee of success. The Arab Spring was one of these major turning points. Like Facebook and Twitter, YouTube saw, "We are basically the only media that's coming out of many of these countries," and I think it did a couple things. One is it legitimized them in the face of traditional media. There's a great anecdote where a CNN producer calls up someone at YouTube and asks, "How did you get that footage?" which demonstrates they didn't quite understand YouTube, and then CNN couldn't get into... this was in Iran, so before the Arab Spring.
That was another... and I'd love to talk about this more, but this was still a time at YouTube when they were kind of experimenting and a little bit more willing to do what I would call editorial work. So there's a storyline in the book that I think is really important: YouTube partnered with this company called Storyful, which some of your audience might remember, this digital newsroom that came up around the Arab Spring and did a lot of work on verifying footage, and invested the time to sift through the extreme uptick in social media output around the Arab Spring and actually confirm the validity of the YouTube footage or Facebook post or tweet, which was a heavy lift.
I think so much of the book is this alternative vision of what could have been for YouTube and the internet, and I really think it's worth asking what could have been if that Storyful model was actually given resources and more attention. Eventually Google dissolved that partnership and Storyful went to News Corp, but at the time they were leaning in a little bit on this, and the Storyful team was actually curating videos that YouTube would try to promote more, put on its page, and feed into the algorithm.
Justin Hendrix:
So at this Arab Spring moment, you really get a sense that this is when warring parties start to see YouTube as a platform to game; they're going to have their information battle on YouTube as a platform. All kinds of things start to appear in the book at this point: guns, FPSRussia, the beginning of mentions of PewDiePie, a character who lurks in the background throughout this book. And then you have these other characters, of course, who come onto the stage, interestingly, right around the same time. Jonah Peretti, who you describe as a "weedy hipster" who ran a fledgling website called BuzzFeed, is on the same page, in the same paragraph, as the emergence of Russia Today on YouTube.
Mark Bergen:
My apologies for the description, Jonah. I only meant it with the utmost respect.
Yeah, I think this was the era of rose-colored glasses and the internet, a lot of what someone in the book describes as the "internet of awesome," and I think the Arab Spring was certainly part of that. This is before Benghazi, when things took a turn, and took a turn for YouTube for sure.
You mentioned Russia Today. To take a step back a little bit, YouTube made a very intentional push to expand internationally, with Google pushing them, and a lot of this, I think, hasn't really been fully understood. They went into countries and really wanted to get people making videos in native languages, in places where they didn't have offices; some didn't, like Pakistan, which blocked YouTube for a while. They certainly didn't have moderation teams or people familiar with the politics and the actual cultures and languages, and yet they were very strongly encouraging people to upload videos. And Russia Today is just a great example. I think YouTube and Google at the time didn't even think much of it. There's Robert Kyncl, who's now the Chief Business Officer at YouTube, who went on Russia Today; I think you and Paul discussed this on your prior podcast.
I didn't talk to... YouTube didn't let me talk to Robert for this book. Someone who worked with him told me they sent him on this trip to Russia and were like, "Here are the top-performing YouTube channels that you need to meet with," and didn't think much of it. At that time Russia Today was leaning in on digital and YouTube, and for YouTube, which had spent so much of its life trying to convince traditional media of the value of YouTube, seeing a big media operation go all in on YouTube was super exciting. This was before 2016, so we can excuse some of their naivete, but they didn't even process the political implications of that.
Justin Hendrix:
So there's so much here, and I would push the listener toward the book, of course, in order to get more, but you go from this period of Loose Change and 9/11 truther conspiracies and ISIS, which gives way to Stefan Molyneux and Donald Trump and Milo Yiannopoulos...
Mark Bergen:
Like, who are my favorites of that motley crew, is that what you're asking?
Justin Hendrix:
No.... We're sort of in a different patch at this point. This is when the kind of dark side starts to become clear.
Mark Bergen:
Yeah. I'm curious to get your thoughts on this too. One of the interesting things when I was reporting this was Milo and the Breitbart army, and there's been a lot written about that, like the rise of the alt-right. One of the strange subcultures, the storylines on YouTube, was this group called the skeptics. If you recall, this was a big moment for atheists on YouTube, like 2010, 2011: Richard Dawkins and Christopher Hitchens. Not that they were uploading YouTube videos, but people would upload videos of Christopher Hitchens, discover the algorithm, and title them "Christopher Hitchens destroys creationist," that sort of thing, and that became this... So much of YouTube is pattern matching. It's this really interesting game; that's how YouTube's algorithms work and that's how YouTubers work. They're like, "Oh, this thing works? I'm going to test this. I saw someone else doing this, this thing is taking off on the site, I'm going to do this."
There's this really interesting flow there that leads in these unexpected directions, and one of them, and I'm sure there are people that have studied this and gone more in depth about this transition, is that in the online atheist community there was this big split, and one consequence of that is that there was just so much more misogyny, and it turned on... this is a throwback phrase, but social justice warriors. Early on, before Trump's election, this was a big moment on the internet, this ongoing fight between feminists and the people who opposed them.
I briefly touch on Gamergate, which is this explosive moment in internet culture and our modern politics, which people associate with Reddit, and I think YouTube actually played a not insignificant part in that. A major reason is that YouTube was, at the time, and still is, the only place where online creators can make money, and you've talked about this on prior podcasts: what are the incentives for some of these creators to make videos? Money was clearly one of them.
And I think that this was a blind spot for YouTube for a long time, but misogyny and a lot of these battles... Early on, creationists were this interesting target, and then it became social justice warriors, and then it became YouTube itself. But there's always... the videos that tend to do well have these tensions, like any type of media narrative, and have really good foils, and women on the internet became this fantastic foil for a lot of YouTubers. Now it's getting this new wave of attention, but for a long time, it was just treated as, "This is inevitable, this is part of the internet, and there's nothing we can do about it."
Justin Hendrix:
There are so many different interesting anecdotes and characters in here. One that stuck out to me was the scene of a bus ride that Guillaume Chaslot took in Paris, where he encounters a man watching videos about a purported plot to exterminate a quarter of the world's population. That also struck me as happening at a moment in time which says something about where YouTube was and its role in society.
Mark Bergen:
The history of conspiracies and misinformation on YouTube is a really fascinating one, and the book touches on this a little bit. You mentioned Loose Change, which was the really early 9/11 truther video that actually took off, I think, on Google Video of all places, and people I talked to at early YouTube were like, "We would just assume that viewers would be able to sort out that this was..." That viewers would have enough sensibility and intelligence to identify this as bunk, as conspiracy. I think with the 9/11 truther [inaudible 00:31:17], at the time there was no sense that this had any harm.
One interesting thing I discovered in the book, and I'm curious what people who read it thought of it. So YouTube was starting to build up this creator team. They were the first platform on the internet to pay online creators, but they were relatively late to actually coming around to their commercial potential. It was like, "Oh, we're going to have maybe a few that'll be big hits, but they're amateurs; advertisers aren't going to get behind it. We really need A-listers in Hollywood."
And then around 2014, they're like, "Oh wow, these YouTube stars are actually bigger than Hollywood stars in many ways. Maybe this is our business." So they started assigning people to work with creators and help them shoot video, and came up with all sorts of teams and directions, working with beauty creators and video games. A person told me they had this idea: they saw all these paranormal videos on YouTube, which is some of what the Discovery Channel and History Channel do, we'd see this on TV, but on YouTube it's got much more of a YouTube flavor. "Why don't we work with these creators and build, effectively, a vertical around paranormal?" In some ways you can go, "Oh my God, flat earth." This would be YouTube embracing what ended up being total conspiracy theories.
On the other hand, what if YouTube had done that? What if it had leaned in and paid more attention to the fringes of the site, these strange worlds, like videos around paranormal activity and aliens and UFOs? With a little bit more handholding and touch, I think there's an argument that it could have seen the issues with, say, flat-earthers or QAnon earlier. Who's to say anything would've been done differently, but I thought that was a really telling example of this thread in the story that I try to drive, which is that YouTube is just... Google is not good at managing people, and typically, for a variety of reasons, avoided hands-on management of their creator class, and that had real consequences for these knotty issues around misinformation. This was a really interesting example.
Justin Hendrix:
There's a sense that in the background, this business model chugs forward no matter what decisions YouTube makes; people make mistakes, things come and go. I was reminded of the phenomenon of multichannel networks, for instance, and how big a deal those seemed just a few years ago, but YouTube just keeps getting bigger, it keeps growing, its creators keep earning, and things plod ahead.
Mark Bergen:
Not without major crises, and some of the key moments in the book are when these collisions happen, like when YouTube faced an extended brand advertising boycott and put in a lot of resources to fix those adjacency problems. And we saw it, what was it, two summers ago, I think, when advertisers boycotted Facebook around hate speech. As far as I know they did not leave YouTube at all, which is remarkable. I think it speaks to just how lousy Facebook's reputation was, but YouTube had been through the wringer with this in 2017 and put a lot of pieces in place; there was a big internal effort. The criticism here is they put, I think, a lot more resources and effort into combating this, because it was critical for their business, than into dealing with some other issues that critics have pointed out for years, which we talked about, like harassment of women. Later on they dealt more with health misinformation during the pandemic, but people had been pointing that out for a long time.
Anyway, sorry, they did bring in a lot of resources. Google is exceptionally good at monetization, and it never really slowed down. A lot of the book deals with kids' content too, which we haven't touched on; by far, the top channels on YouTube are designed for toddlers and for kids, and that's unrelenting, even with this new competition from TikTok.
And we're doing a podcast here; YouTube is now making this big push into podcasting, and I wouldn't be surprised if they made very easy ways to upload your podcast as a YouTube video with one click. Podcasts are just hours and hours of content that are going to be added to an already gigantic corpus.
Justin Hendrix:
There are so many subplots and side stories and minor characters who walk on and off the stage in this book that we don't have time to talk about, everything from the Google walkout to Christchurch and the pandemic. You've mentioned health misinformation. You take us through moments like the murder of George Floyd on through to the January 6th insurrection, and you, at that point, leave us with this question that seems to be, I guess, still unresolved: are the problems on YouTube a reflection of society's problems, or to what extent is YouTube playing a role in exacerbating some of the dynamics in politics and culture, not just in the United States but also abroad?
And there's a phrase you mention in here that apparently has become popular in Silicon Valley: "Don't blame the mirror."
Mark Bergen:
Yeah, love that one. I think it's a worthwhile question, in part because I do think that people have to understand that's a sentiment across the company. You mentioned Guillaume Chaslot, the former engineer who's now doing a lot of work around algorithmic transparency. One thing he told me is when he initially brought some of his findings about conspiracy theories to people at YouTube, they were like, "Yeah, well, people watch it. It's not our fault. Who are we to tell people what to watch and what not to watch?" which is, sure, valid if you assume that that's the case, but YouTube does tell people what to... YouTube does do programming, which is, I think, one of the points I try to hit in the book: YouTube dictates not just what people watch, but what videos get made.
That's the other side of the algorithm that people don't talk about a lot. There have been a lot of interesting conversations about radicalism and rabbit holes on YouTube, but I think what's equally interesting is that what type of videos get made is dictated by the way that YouTube sets its algorithms and makes its decisions. So in that way, they are programming, and they're not a mirror to society; they're a gigantic and powerful media programmer operating in this relatively opaque... not relatively, this outright very opaque way.
So I think that the mirror argument doesn't really hold water, in part for that reason. Second, if you're going to be a reflection of society, you can't also be a $20 billion a year advertising platform, which is what it is. Like Susan Wojcicki at one point in 2018 was like, "We're kind of like a library," which is a preposterous notion if you think about it. A library is not a for-profit, gigantic advertising institution. But it also really, I think, gets into the mindset a little bit, both of how Google presents itself and how Google thinks about itself.
I think another interesting and important thing not to forget is that YouTube is part of Google. Google thinks about indexing the web, and I think a lot of the way that YouTube has approached its business, and a lot of problems around misinformation and propaganda and speech, is the way that the search engine works too. With borderline content, it's basically putting it on page 12 of Google search. No one goes there, and I think that's a really valuable tool that Facebook and Twitter don't necessarily have. YouTube developed that tool, and [inaudible 00:40:21] one, demonetization, which we haven't really talked much about. YouTube can take away or reduce someone's ability to make money on the platform, and there are all sorts of problems with that.
I think that the mirror argument is telling because it reveals how the company thinks about itself, but I do genuinely believe that it gives them cover for a lot of the economics and incentives that they're responsible for.
Justin Hendrix:
And you do take us on that trajectory from Larry Page, going on about neural networks at a TED conference, on through to the time, I suppose, when YouTube is decomposing what happened in the 2020 election, Stop the Steal, et cetera. And there seems to be a sense, from the book, from the folks that you talked to there, that they're satisfied that they've fixed the algorithms to some extent, that they're now not promoting false claims in quite the way that perhaps they were in the past. Do you think they're somewhat, I don't know, resting on their laurels at the moment?
Mark Bergen:
I think part of that is a little bit of a really savvy political strategy, where YouTube, as you talked about, does not get the scrutiny that Facebook and Twitter get. In some ways it doesn't even get the scrutiny that TikTok gets right now, but that's a little bit separate, given that TikTok is a Chinese company. It's because YouTube is really good at staying out of these debates, literally just not participating in them.
I would say confidently they realize they've made a lot of changes, and they have made a lot of changes since 2019, certainly since 2016, 2017. And mind you, at almost every turn of the crises, they've said something like, "We're putting the structures in place," and now it's, "We have the structures in place." They have the machinery, the literal machinery, where they'll say, I forget the stat, but it's something on the order of close to 100% of the violative videos are taken down with machine learning alone. It doesn't even have to go to a human screener; very few people watch those videos, relatively speaking, although we don't have exact numbers. They have these systems in place to downrank borderline videos, and all these sorts of tools.
I think there is a sense that, "Okay, now we have the guardrails in place and now we can handle a lot of what's thrown at us." Their response time, to be fair, has gotten much faster, and at times in the past they spent days or weeks laboring over these decisions. I think the nature of the problems is changing and getting more interesting, and outside the U.S. is a really interesting example. Russia, India, I think, is fascinating. India is YouTube's biggest market, and you talked about this, but now, just because of the way they put these protocols in place, it's a very different company than it was when it fought tooth and nail to keep up videos that governments wanted down.
Now it's sort of like, "Okay." They're much more willing to take these videos down in order to continue smooth operations in a place like India and avoid any controversies. And they're more willing to, say, moderate to appease something like the Indian BJP, which I think does have very real consequences for... and the book doesn't really get into this too much, because there's only so much space, but India is a really fascinating test case. There's no TikTok there, so that's one of the reasons that YouTube is massive. I've talked to people there, and they have staff there, but God, dozens of languages and all sorts of complicated issues around how you define caste discrimination and hate speech, things that people at YouTube admitted they're still not prepared to handle.
Justin Hendrix:
Mark, there's so much in this book. We could spend another hour going through the history and all the specifics of YouTube's evolution, of this problem, the extent to which the company may or may not be organized to deal with problems abroad. But I want to ask you, maybe just stepping back from it: you paint this picture of a hulking beast, an enormous chaotic entity that makes an enormous amount of money, that perhaps no one can control. Perhaps no one can really get their minds around all of what's going on with YouTube, and certainly couldn't control it. You even suggest that Susan Wojcicki could perhaps not truly steer the thing, at least certainly not quickly, if she chose a particular direction.
Does that also mean, from your perspective, that Congress, the FTC, Europe through its DSA, any democratic mechanism... can you imagine anyone in a position of, I don't know, governance, getting their heads around YouTube?
Mark Bergen:
So, hot take alert, I guess: one place, at least in the U.S., where regulators have actually gone after big tech is the FTC, with YouTube's violation of the children's online privacy law. That was 2019. It was, at the time, a record fine. It was chump change for Google, but it was significant for the agency to accuse YouTube of knowingly violating online privacy rules and serving targeted ads to children under 13.
That has had consequential changes in the way that YouTube Kids and YouTube operate with kids' media. So basically YouTube is split in two right now, and most viewers might not know this, but every video is either marked as made for kids or not, and if it's marked as made for kids, it can't serve targeted ads and has a lot more restrictions on it. This is partially because of the FTC and partially because of this thing we've been talking about, advertising pressure. I think this is a particular issue where people at YouTube have kids and they're like, "Wow, we actually have the world's biggest kids' entertainment service. Maybe we should be more thoughtful about how we operate this."
But I think it's a really interesting case study in that... Listen, I'm sure some at the FTC got the changes they want, but YouTube was criticized for a long time for overtly commercial toy unboxing videos. Some of the most popular content on YouTube was effectively just a nonstop ad, in a world where the FCC actually regulates TV, so TV can't do that and has to have enough educational programming. I mean, YouTube's moving in that direction in a very YouTube-y way. It's sort of encouraging, through its algorithms, videos that get kids to do things outside, and educational videos. Like, who knows?
Arguably... how much is it doing that at scale? So yeah, I do think that in the one instance when regulators in the U.S. actually took action against YouTube, it had some marked change in the way the site operates, not just around the issue of online privacy and ad targeting, but in some of these broader, important issues about what type of YouTube content is made and viewed.
That being said, and I'm not an expert on DC and policy, it doesn't seem like a DOJ case is going to force Google to spin off YouTube. I think it might have some consequences for the way their ads business and ad tech operations work, and Google's already moving in that direction, anticipating those changes. I actually think Apple has had a bigger impact on YouTube and other social networks and their advertising model than the DOJ or any regulator ever will.
Justin Hendrix:
Are there any final thoughts you have on this company as you step back from the book and think about how to cover it going forward? One thing we've talked a little bit about here, of course, is TikTok as a phenomenon, which I suppose many people will wonder about: is that ultimately going to somehow disrupt YouTube's dominance?
Mark Bergen:
Yeah, YouTube is certainly now putting so many resources into Shorts, their competitor to TikTok, and I'm sure anyone who uses the YouTube app can see them. It's not like Instagram, which is remaking its entire product. I think YouTube has the benefit of a built-in audience and it's a little savvier about that kind of stuff, but they are certainly driving a lot more towards Shorts. I think Shorts are going to have their own content moderation and policy implications that we haven't quite explored.
One thing I'll land on there is that, it's interesting, so much of YouTube's response to a lot of the pressure they've had in the past few years has been like, "We're going to start being more responsible," it's the word responsibility, and "We're going to promote valuable" content, whether it's legitimate news sources around COVID or just creators that don't get them into trouble with advertisers or politicians. I think they can still operate that way with Shorts, but the thing about Shorts is that in some ways you as a viewer have more feedback, because you're flipping through your phone a lot more and seeing a lot more videos than you would watching a 12-minute YouTube video. But in other ways there are fewer mechanisms for us to give actual feedback, if that makes sense.
YouTube has started to run these surveys, and that's something they're plugging into their algorithm as far as which videos to promote, and there's this world in which TikTok is driving a bit more mindless consumption. So maybe that's where... I think part of the whole book is a plea to pay attention to this Oz behind the curtain of this really powerful company, and in some ways the shift toward... I think people will be talking a lot more about YouTube and its competition with TikTok, but just in the way the structure of the platform works, if we're flipping through minute-long clips, that's a different way of engaging.
So I think it's sort of a plea for... You talked about this a lot too. YouTube is just not very transparent, and I think the more pressure it faces, the more transparency there will be, and more transparency always has good outcomes.
Justin Hendrix:
Well, you've certainly done an enormous amount to reveal Oz. I don't know if that makes you the Toto of the story, but is this going to continue to be your beat? Are you going to keep looking at YouTube going forward? Is there another book in your future?
Mark Bergen:
I am actually taking a little bit of a side step. I'm certainly still looking at YouTube and Google, but I'm starting to cover climate tech with Bloomberg, which feels like an equally important, albeit very different, story about the way technology is interacting with society and politics. That being said, I still think YouTube's fascinating, so we'll keep tabs on it.
Justin Hendrix:
Mark, thank you so much, and I would encourage folks to go out and read this book, which is on sale September 6th.
Mark Bergen:
Awesome. Thanks for having me.