Transcript: YouTube CEO Susan Wojcicki on online speech, government regulation, and Donald Trump’s suspended account

Event transcript

Speaker

Susan Wojcicki
CEO, YouTube

Moderator

Frederick Kempe
President & CEO, Atlantic Council

FREDERICK KEMPE: Hello and welcome across the world. I’m Fred Kempe, president and CEO of the Atlantic Council.

I’m delighted to welcome you to Atlantic Council Front Page, our premier ideas platform for global leaders, and to this session with Susan Wojcicki, the CEO of YouTube. You can also follow us online through hashtag #ACFrontPage. On this platform, we’ve hosted heads of state and government, leading lawmakers, chief executives, and innovators in the private sector. President Macron of France recently spoke to our global audience and tomorrow we’ll host Malala Yousafzai, the Pakistani human rights activist and Nobel Prize laureate.

But seldom have I ever been so looking forward to one of our Atlantic Council Front Page discussions as I am today. And that’s not just because Susan has been a leader in the tech industry for more than twenty years or that she runs the leading video-first platform in the world with two billion users. Pause on that for a moment: two billion users.

It’s not just because of her particular focus on the growth of YouTube’s creator economy. Over the past three years, YouTube has paid more than thirty billion dollars to creators, artists, and media companies, supporting 345,000 jobs. And in this year of COVID-19, we know how important that is to people. We’ll get to that.

We’ll also get to the burdens and responsibilities of running such a company in our world of disinformation, online harm, and growing digital dangers to democracy. The events of January 6 in Washington, DC have brought a reckoning in many respects.

There is also no topic more relevant at this moment than that of technology’s role in society. As more of the world comes online, especially during this year of coronavirus, we are grappling with complex debates over the role of private industry in governing the internet. What’s the meaning of human rights in digital spaces? How do all of these issues shape the world we want to live in?

And in that spirit, join me in welcoming the chief executive officer of YouTube, Susan Wojcicki. A humanities major, she took her first computer class in college as a senior. She sold spice ropes when she was eleven. So, Susan, there’s a reason I tell this story. My thirteen-year-old daughter was not impressed that we were hosting President Macron, but when I told her we were hosting you she was beside herself.

SUSAN WOJCICKI: (Laughs.)

FREDERICK KEMPE: And she is learning animation now because of the inspiration of YouTube.

It was also in Susan’s garage that Larry Page and Sergey Brin spawned what became Google. She was its first marketing manager, and as employee number sixteen, went on to help launch and grow AdSense, Google Analytics, Google Books, Google Images, and even the famous Google Doodle. In 2006, Susan convinced Google to acquire a fast-growing video-sharing service called YouTube. The rest, as they say, is history.

Since taking over as CEO in 2014, she has helped grow the company to its now estimated ninety-billion-dollar valuation, boasting, again, the two billion monthly global users and an estimated ad revenue of fifteen billion dollars in 2019. She is one of only two women leading a company listed in the top ten of most-visited websites. And she has used her position to address the challenges women face in the tech industry, leading to changes in her company and Google, while speaking out publicly for paid family leave at the national level.

So, wow, Susan, that’s really a lot, and that just scratches the surface. We’re all looking forward to getting to know you better.

Before we get started, let me just share a little bit of context about the Atlantic Council’s work in the tech-and-society space through our GeoTech Center, which focuses on harnessing technology for good; through our Cyber Statecraft Initiative, which today began its ninth annual Cyber [9/12] student contest across thirty-five countries and four continents; and in particular through our remarkable Digital Forensic Research Lab, which is hosting this event. So let me salute Graham Brookie, who leads it so capably; and also Rose Jackson, the director of policy initiatives at DFRL.

DFRL has been at the forefront of the effort to better understand, document, and confront online harms, build resilience, and develop a community of practitioners globally—we call them digital Sherlocks. In so doing, our team has extensively researched and documented [misinformation] and disinformation spread in many parts of the world. As they have demonstrated, confronting the very serious and real offline consequences of online harms requires policy, legal, product, and societal changes.

Finally, I’d be remiss if I didn’t acknowledge that as we raise all these questions about technology platforms, we are operating off of them today. You may be watching us on YouTube, commenting about it on Twitter, sharing reactions through Facebook. Plenty of evidence to suggest that social networks have become an essential part of business, government, and life in the world. Indeed, one of the biggest differences between our pandemic now—the worst in a century—and the one that took place a century ago is we had a virtual world to which we could retreat, a digital world that would help sustain economies and so many livelihoods. So setting and advancing the positive vision for tech in the world requires us to have the sort of direct and honest conversation we’ll have today and across industry, governments, and civil society about the rules and incentives that govern online spaces.

So I’m sorry for that long of a windup, but I wanted to set the context for what we’ll be doing now. I’m so grateful for Susan’s willingness to join us today to discuss how she views these issues and the role of YouTube. We welcome the audience to ask questions via Zoom Q&A if you’re on Zoom or by tweeting at #ACFrontPage.

So, Susan, after that introduction, let me set the stage—which I hope sets the stage for the first question—so, how big was the garage where Google was born? And why did you let these clowns—no, that’s not really the first question. I’m really—

SUSAN WOJCICKI: It was small. It was small.

FREDERICK KEMPE: (Laughs.)

SUSAN WOJCICKI: But thank you for having me here. I’m delighted to be here and looking forward to the conversation. But, yeah, their garage was small. They actually entered through the garage. They had a small part of our house that they worked [out of]. The whole house was tiny.

FREDERICK KEMPE: So do tell us that story. Since you’re on it, tell us.

SUSAN WOJCICKI: (Laughs.)

FREDERICK KEMPE: I mean, did you have any idea what they were—what any of this could have led to and—

SUSAN WOJCICKI: No. No, I can’t say I thought, oh, they have this amazing idea, I’m going to rent them the garage because I want to have equity or anything. I just wanted the rent. (Laughter.) I wanted the rent to make sure I could cover the mortgage of our house that we had just bought. And I had known Sergey beforehand. And so then they moved in. They had one employee. They got up to seven employees in the house and it got pretty crowded at that point.

But I did realize when they were there—they would be there all night and I would come over and talk to them—I did realize what they were working on and how compelling it was. And at the time, this wasn’t seen as a very interesting part of the tech market and it didn’t seem like it was really going anywhere, but I realized, like, wow, this is really making a difference and it can help me find information. And I grew up in an academic house and being able to find information was really something I knew the importance of, and I saw what a good job they were doing and that’s, ultimately, what convinced me to join them.

FREDERICK KEMPE: So the first lesson of this is if you live in Menlo Park, rent out your garage or parts of your house because it could pay off big time.

So a question that comes off of that, why were you so convinced, because you said what they were working on wasn’t so recognized at that time. But why were you so convinced when you joined Google as the number sixteen employee? Why were you so convinced they should acquire YouTube? What was unique about it as a platform, product, and approach that attracted you and, ultimately, convinced them?

SUSAN WOJCICKI: So we were working on a product, a Google video product, and so we had a product that was similar to YouTube. And so as a result of that, we actually saw how people were using it and what they were uploading, and I would say there were two really important insights that we gained from working on it.

And the first is that people want to share their story. So we didn’t know [if people were] going to upload their videos, and what we saw is that people—like, right away, we saw millions of videos being uploaded, people [were driven by] just the desire to share their story. And, of course, lots of people want to become famous or talk about what was important to them. But then there was this question of does everyone want to hear from everyday people about what’s meaningful to them or about their hobby, and what we saw is that they do.

And there was actually a video I remember in particular that really cemented that for me, which was of these two students in their dorm room. Their roommate is in the background doing homework and they sing this song [by] the Backstreet Boys, and it’s so funny. And that was really the first big hit that we had and I just realized, wow, you can have hits here in user-generated content. People want to watch it and people want to see [content] from other people like them, and I realized what a powerful medium it could be for entertainment but also for information.

FREDERICK KEMPE: That’s really interesting, and it’s so interesting to talk to somebody that was there so much at the beginning of all of this, which gets to my next question, which is, I think we all saw promise in the internet. We all saw promise in things like Google and YouTube and Twitter.

But did you ever think it would become such an essential part of life, business, [governance] even, all around the world? So I think that’s question one, and then question two, now that it is that, how does that change the way you look at your responsibilities compared to, say, when you started as CEO [in] 2014?

SUSAN WOJCICKI: Well, I didn’t really see that when we first started. I mean, when I first joined, just to put it in perspective, Google had sixteen people, right. And I was in place sixteen and I did see the promise right away of information and the way people were writing to us and discovering new places to go or new doctors, new treatments, new information, new songs, [and] new music. I saw that upfront.

But I couldn’t have anticipated how it was going to grow into the platform that it is today. And, again, it’s been over twenty years. I think most people—it’s hard to see twenty years into the future and predict what that’s going to be. So, I mean, it’s hard for us now, right, to think [about] what’s technology going to be in 2040.

But now that we are where we are and I see the role we play, I’ve been very focused on the responsibility [aspect], and because I’ve been at Google for over twenty years, [and] because I’ve been now [YouTube’s] CEO for over seven years, I feel the responsibility to—because I know how our systems work, I know how to build them, I know how to change them—to take everything I know and to make sure we apply them to these hard questions that we’re dealing with as a society and make sure that we are a responsible platform. And it is some of the hardest work that I’ve ever done.

But I also know that the combination of consulting with experts, talking with policymakers, and being able to translate that into the right policies and products for YouTube and for Google, I know that we can do that and we’re on a journey to do that. But we’ve come tremendous—we’ve made tremendous progress.

And so, really, the responsibility work for me really started around 2016. I mean, of course, we had always talked about doing the right thing for users. That’s been there since day one of Google. Do the right thing for users. But in 2016, we really started talking about responsibility and the role that we played and I can answer more questions about that because that’s a complex area. What is the right thing to do? What is responsibility and what does that mean for a platform that is global and deals with so many hard issues?

FREDERICK KEMPE: Let’s stay there for a second because—and talk about one specific category of harmful content and that’s [misinformation] and disinformation, linked with everything from people refusing to take the COVID-19 vaccine, to ethnic violence in Myanmar, and to the attack on the Capitol.

YouTube announced… site-wide policies to moderate against extremist content in 2017. Why did you make the decision then, what have you learned through that period of time, and where do you go now?

SUSAN WOJCICKI: So you referenced 2017, and the first category that we worked really hard on was violent extremism, and there were a number of events that happened in 2016 and 2017. Whether it was the attack in Nice or the London Bridge attack, there were a number of attacks that we spent a lot of time self-reflecting [about]: what role does our platform play, and how can we make sure that we are being very careful with regard to violent extremism?

And there was actually, like, a particular event—I believe it was the London Bridge one—where there were some accusations of the role that YouTube had played. And I went home that night and I looked at the content that we were being accused of leading to violent extremism and I talked to our reviewers, and there was nothing in the content that technically violated our policy.

So there were videos from various—I won’t say who, but various individuals that technically met the bar. But when I started researching, you could see from people who were experts in the field that they felt that this led to violent extremism. And, really, what I discovered was [that there were] a lot of dog whistles that were going on that a normal person wouldn’t hear but yet were potentially problematic.

And at that point, we went and we hired a large number of violent extremism experts and worked on a plan… We made changes in terms of our policies and approach, and we made significant progress in removing that content. And, again, these were not groups that were on the foreign terrorist organization or [proscribed] list, right, because if they are, then that’s content that we understand has been clearly marked by a government as not being appropriate to have on the platform.

But it was other content, other messages, and we had to get really, really detailed. And so that was the first [hurdle] that we really tackled and made tremendous progress, and then we realized that we needed to take that same approach and apply it to many, many other areas, whether it was hate, child safety, dangerous tricks, many different areas, and we have done that and I’m really proud of the work that we have done since that time, and ongoing. [There’s] still more work to do.

FREDERICK KEMPE: There are policy decisions and there are design decisions. How do you weigh each of these? Because sometimes it looks like the design may be responsible for a person going from one thing to another and being led inadvertently to content that the person wasn’t necessarily starting with. And then there are also the policy decisions where drawing the line between censorship and dangerous speech and inciting of violence is sometimes a difficult one to draw.

SUSAN WOJCICKI: Well, we need to work on all of them. So you need to have a comprehensive solution between your policies, your product, how your systems work. And so we’ve actually come up with what we call the four Rs of responsibility, and pretty much everything we do is captured in one of these four Rs.

So the first one is “remove.” And that’s generally where we update our policies. That’s a very high-leverage move, to make a new policy, because then [the content] becomes… no longer allowed on our platform. So last quarter, for example, we removed approximately nine million videos. Those were all violations of our policy. We do so very quickly. Ninety percent of those removals are automated. And the vast majority are done within a few views. So that’s “remove.”

The second one would be “raise.” I’ll use COVID-19 as an example because you talked about it. We passed ten different policies associated [with] COVID-19 and we removed that content very quickly and were able to make sure that that content was not being viewed as a result.

But then there’s also “raise.” So we wanted to make sure that the authoritative information that came from different health authorities on COVID-19, that we could raise it up, whether that was on searches [or] on the watch page: for people who did any kind of video watching around COVID-19, we [raised] information that came from authoritative sources.

Then we look at “reduce.” And, by the way, we served over four hundred billion impressions [related to] COVID-19 that came from authoritative sources, which is probably one of the largest campaigns we’ve ever run.

Then there’s “reduce,” which is content that technically meets the letter of the policy but doesn’t really meet the spirit or is very low-quality content, like “aliens are in my backyard” or “aliens caused COVID-19,” right; that’s not content we’re going to promote. So that’s under the “reduce,” meaning it’s not content we’re going to recommend.

And then, lastly, “reward,” how we handle and then work with our advertisers, because once you enable monetization of content, then that creates an environment where there’s more and more of that content created. So saying that you’re not going to monetize it reduces any incentive to create that content. It’s also something advertisers wouldn’t ever want to be on.

So then the last one is [“reward.”] So it’s really a combination of our policies, our recommendation systems, our search, how that works, and then, of course, our monetization policies.

FREDERICK KEMPE: Thank you for those four Rs. That’s really, really useful. Speaking of removal, although I know you were talking about videos being removed, one user was removed after January 6, and that was the president of the United States, Donald Trump. Could you talk about why you did that? And will we see him again on YouTube?

SUSAN WOJCICKI: Yeah. Well, so, first of all, before I go into that, I think it’s really important to just be clear about how YouTube works and how we make decisions. So, first of all, we have a three-strikes system. We have had that system since the very beginning of YouTube. And those three strikes, they apply to everybody. So there are no exceptions. Politicians aren’t excepted. Famous people aren’t excepted. CEOs aren’t excepted. Everybody follows the same rules.

And so after the Capitol attack on January 6, we did see videos that were uploaded by the Donald Trump channel that were a violation of our incitement-to-violence policy. As a result, we removed those videos very quickly. And when we see a violation of our policy, we suspend the channel for seven days. So that is standard procedure about how our strike system works. And again, it works that way for everyone.

Now, it’s a minimum of seven days. And the channel remains suspended due to the risk of incitement to violence. And given just the warnings by the Capitol Police yesterday about a potential attack today, I think it’s pretty clear that that elevated-violence risk still remains.

So, however, I do want to confirm that… we will lift the suspension of the Donald Trump channel when we determine that the risk of violence has decreased. That’s per our policies. That’s how our three-strike system works. But when the channel is reinstated, it will be subject to the same policies that every other channel is.

So if we see content that is uploaded that in any way violates any of our policies, incitement-to-violence or any kind of election-integrity policy violations, then a second strike would be issued. And when there are three strikes within a ninety-day period, then the channel is removed. And again, these are policies that we are public with. Everybody follows them. Our policies are public. And we’ve designed a system that we will continue to make sure is fair, clear, and applies consistently to everybody.

FREDERICK KEMPE: So thank you. That’s really a useful answer.

Let me follow up on that—

SUSAN WOJCICKI: Sure.

FREDERICK KEMPE:—because it was a little bit vague when President Trump and when that channel could come back. You said when the danger would be reduced. How will you measure that? And then, as I understand it, the former president would then come in with one strike against him. I guess if I want to be provocative, I can say January 6 was a pretty big strike and maybe it should be one and a half or two or more.

SUSAN WOJCICKI: (Laughs.)

FREDERICK KEMPE: But… not all strikes are created equal. So I wonder if you could deal with both of those things.

SUSAN WOJCICKI: Sure. Well, the way we would determine whether or not that risk of violence has—and where we are with that—is by looking at government statements and government warnings. We certainly would look at increased law enforcement around the country. We also would look at any kind of violent rhetoric that we’re seeing on our platform.

We have an intelligence desk where we look and try to understand everything and get ahead of what’s happening on our platform. So those are all different signals that we would look at. And where we stand today, it’s hard for me to say when that’s going to be. But it’s pretty clear where we stand right now: there still is that elevated risk of violence.

I think you had a second question.

FREDERICK KEMPE: It was whether some strikes are larger than others.

SUSAN WOJCICKI: Yeah. So definitely… yes, there are some variations. So, for example, there are certain categories where even one strike results in a termination of the account. So, like, child safety or if we saw something that was inappropriate with regard to kids, violent extremism—there are a number of different categories that just result immediately in a termination.

I think with this one it was the first strike that we had issued. It wasn’t like we just turned it back on in seven days… but continued to make sure that we’re monitoring the situation. And so we will turn the account back on, but it will be when we see reduced law enforcement in capitols in the United States and we don’t see different warnings coming out of government agencies. Those would all be signals to us that it would be safe to turn the channel back on.

FREDERICK KEMPE: And related to that, are all global leaders created equal? I think what I mean more broadly on this is you have to operate in very different legal jurisdictions, and you’re operating all over the world. Can you really set a set of standards and set of rules that work everywhere, or do you find that to be one of the more difficult things to do? So leaders in other parts of the world, if they had those strikes, would you pursue the same? And have you been in a position where you had to decide whether to pursue the same approach?

SUSAN WOJCICKI: We have applied strikes to other world leaders. So, like, in Brazil, [President Jair] Bolsonaro, for example, received a strike for the COVID-19 misinformation policy. Right now in Myanmar, there are various channels that we have taken down. So although there are global standards—our policies are all global standards—we work with experts, first of all, to make sure that we are interpreting them and understanding them as clearly and precisely as we can. But then there are going to be certain things—like, with COVID-19 it was very clear in some cases that these are violations of our policies.

And we have to be consistent. So the policies apply to all global leaders consistently. There are no exceptions. And we’ve spent a long time thinking about this. This isn’t a question I ever thought that I would be asked about or something I would be doing. And we spent a long time initially thinking about how we wanted to handle this. Did we want to have exceptions for policymakers, for politicians? What was the right approach?

And I think it’s a very dangerous path to say that some people have a free pass and that they can say whatever they want, the rules don’t apply to them, and… so from my experience and from looking at this, and also knowing where do you draw the line—like, what if you give a special exception to heads of state? Then what about members of Congress? What about governors? Where does the line end? And what happens when those people violate some of the policies? Is that OK? Why would I give a pass to someone who’s an elected official but not to an everyday citizen?

So, in general our approach has been to have the policies apply to everybody consistently. Now, that said, when you do have an elected official and they do say something, that is also covered by all of the news organizations. And so it will most likely be reuploaded to YouTube, but there will be context around it, meaning it will be uploaded by CNN, by Fox News, [and] that will cover what that politician said, but they will give context around it.

And so, for that reason, we have an educational-documentary-scientific-artistic exception, EDSA, so there can be content that is uploaded that way. So people would still know what the politician said. It’s just that it would be given the context by a journalist [or] by a news organization, as opposed to just the direct words of the politician.

FREDERICK KEMPE: Susan, I know all of these things you could probably answer in hour-long answers because you know so much about it. So thank you so much for these really brief but also very focused and informative answers.

One question from our audience. I’ve got a couple lined up here. One question, probably the last question on this issue.

SUSAN WOJCICKI: Sure.

FREDERICK KEMPE: Did Trump not spread things like COVID-19 misinformation and [electoral]-fraud misinformation before January 6? Did he have no strikes before January 6? It’s from the audience.

SUSAN WOJCICKI: So good question. So one thing I want to point out is we did implement a policy and enforcement starting when the states certified the election. So that was on December 9, I believe. So we actually at that point made it very clear that you couldn’t upload content that alleged that the outcome of the election was due to widespread fraud. Like, you could say, hey, I saw my neighbor committed fraud, but you couldn’t say the whole election was due to widespread fraud. And starting a month before the Capitol attack, we started removing content that alleged that the outcome was due to fraud. And, yes, there were a number of videos uploaded by the Donald Trump channel that were violations of that and that we did remove.

So any new policy that we implement, we do have a grace period for giving strikes because we don’t want to just say overnight, hey, new policy, strike; you lost your channel for seven-plus days. So we have a grace period where we remove the content but we don’t issue a strike. And so many videos were uploaded that were removed, but there wasn’t a strike. We are now giving strikes for that, of course, and we changed that in January. So, yes, there were. And again, the Trump channel has always been subject to the same rules as everybody else.

FREDERICK KEMPE: Thank you so much for that.

A good question from journalist David Lynch. And Susan, I worked for the Wall Street Journal for more than twenty-five years, so I like this question a lot about authoritative content.

SUSAN WOJCICKI: Sure.

FREDERICK KEMPE: I’d like to know what Susan—from Susan what kind of new tools and products for news and journalism YouTube is working on to make sure people can see reliable news and cutting-edge visual journalism.

SUSAN WOJCICKI: Great question. And we’re very focused on making sure that authoritative information, especially from different news organizations around the world, can be viewed, seen, and also that there’s a revenue model behind it. So we actually have news—breaking-news shelves. For example, in COVID-19, we’ve had a breaking-news shelf that has been very prominent on our service where we’ve highlighted news that comes from various authoritative sources. We do that every single time there is some kind of breaking news to make sure that our users are seeing news from reputable sources, which is incredibly important both for the organizations but also for our users to make sure that they get the facts.

We also have a series of different tools and services that we do with journalists to make sure that they can use our technology to be able to easily have more video on their site. We have different monetization programs. So we will continue to invest in the work that we’re doing with news organizations. And we are seeing more and more news being delivered via video. I see that as an increasingly important format. And we will continue to invest and work with news organizations on that.

FREDERICK KEMPE: One other question I have here that’s quite interesting, and it gets to the global nature of YouTube. And it comes from Evelyn Perez-Verdia. And she talks about the many Spanish-language disinformation campaigns that one saw in the 2020 elections, denigrating candidates, calling the media “enemy of the people,” QAnon conspiracy theories. And her question is that since Spanish is the second-most-spoken language in the United States, how is YouTube focusing on this challenge, and is it equally the same as English? And I guess one could extrapolate that across the world. It’s a lot of languages to deal with.

SUSAN WOJCICKI: Mhm. So we do have policies across all those different areas. So there were a number of different subjects that you mentioned there, whether it was election integrity, where we certainly had policies around spreading false information about candidates. And again, this was information about candidates where there’s a very clear answer, like where they were born, whether they’re eligible to be elected, et cetera. So we do have very clear policies associated with that.

But we do work in every language. And we have reviewers, we have enforcement. And one thing I want to point out, too, is that when we roll out a new policy, we need to make sure that we can enforce that policy globally and we can do so in all languages. And it’s not just in Spanish. We need to do that in all languages around the world—all the different Indic languages, European languages—and we have reviewers and enforcement and all of that.

That said, we try to do a good job in all those languages, and it’s possible that there are some languages that we have a slightly higher enforcement in. But we work incredibly hard to do all of those—and we measure. We measure our enforcement to make sure that we are at a very high level on all of these different topics.

FREDERICK KEMPE: We could go on for hours, such a great conversation, but we’re going to run out of time relatively soon. I’d like to go to regulation.

SUSAN WOJCICKI: Sure.

FREDERICK KEMPE: Regulation’s coming. What changes do you see on the horizon?

And in general terms, what role do you think governments should play in the collective responsibility to mitigate online harms? Where does government responsibility rest? Where does YouTube and private-company responsibility rest?

SUSAN WOJCICKI: Well, I see that regulation is here, it’s not just coming. So there are many regulations that we follow right now, whether we’re talking about the e-Commerce Directive, GDPR, [or] NetzDG. I mean, there are many, many regulations across the board. Copyright. And we do follow all local laws. So whenever the governments have any kind of area where they want more enforcement—like, whether that’s copyright or content—there are laws and we do follow those laws.

What I have seen is that there [are] a lot of [challenges] not around… the most egregious content; I’d say there’s general agreement we want to remove it. Governments want to remove it. They have some very clear laws about content that is illegal. But what I would say the challenge has really been is content that is harmful but legal.

So I’ll give an example: “COVID-19 is caused by 5G towers.” Like, do we want governments passing laws saying you can’t say that? And also, can governments even do that? These conspiracies come very quickly. And so we as a platform need to be able to address them extremely quickly to be able to remove them. And so I think that this category of legal but harmful is really [what] we are grappling with.

And look, we’ll certainly work with all governments around the world, but what we see is that—and there are a lot of forces right now that influence and work with us. So I’d certainly say one of the largest is—of course, there’s press that covers us and criticizes us if they find anything that they think doesn’t meet their bar or society’s bar. But then there’s also that whole advertising [industry], which has actually come up with a whole set of rules of what they want us to monetize, agreed as a global set of advertisers, and publish[ed] it, and we report on our compliance [with] those. And that’s certified by a third party, too.

So, again, this category is also very contradictory. We see government officials—let’s just say half of them say hey, you remove too much content, [and] half say you didn’t remove enough. And so we don’t receive consistent feedback from governments in terms of what they actually want with regard to that regulation on content moderation.

FREDERICK KEMPE: So what is your preference? What’s the preference regarding… regulation? Should governments be giving private companies more clear guidance on what types of speech are deemed harmful? I would say libertarians in most parts of the world would say no. But in Germany, where the government does take more of a leading role regarding hate speech, clearly growing out of German history, Chancellor [Angela] Merkel actually criticized the de-platforming decisions some had taken and thinks the government should take more of a role. How would you weigh that? What’s your preference?

SUSAN WOJCICKI: I do think for content that is clearly harmful that it makes sense for governments to be able to give guidance around that. But I [also] think there’s going to be a very, very broad range of this legal-but-potentially-harmful [content], and there I am concerned about government’s ability to move quickly. I’m also concerned about some of the chilling effects when governments start to overregulate speech and what the implications for that would be. So, again, I want to point out there are these other market forces. Whether it’s press, advertising, our brand reputation, there are many other forces that cause us to move very quickly to do what we think is the right thing for society and the right thing for users, and to balance all of these different conflicting needs.

FREDERICK KEMPE: So we’re down to the last three to five minutes. I’m going to take three questions from the audience. You can answer them really quickly or you could go a little longer.

SUSAN WOJCICKI: Yeah, I’ll answer them quickly.

FREDERICK KEMPE: The first is whether you think it’s useful to speak directly to government about regulation. So do you want a direct conversation?

Related to that is hearings later this month will take place again with Sundar Pichai of Google, Mark Zuckerberg of Facebook, Jack Dorsey of Twitter. Shouldn’t they be talking to you, not just for gender balance but because it’s a different question altogether?

SUSAN WOJCICKI: (Laughs.)

FREDERICK KEMPE: And third and finally, what about the monetization model? Isn’t that also something beyond regulation that could get at this issue?

And I’m sorry to throw all that at the end.

SUSAN WOJCICKI: Sure.

FREDERICK KEMPE: But I wanted to get the audience in a little bit more before we were out.

SUSAN WOJCICKI: Sure. What was the first one?

FREDERICK KEMPE: The first one is whether you think it would be useful for the companies—you personally—

SUSAN WOJCICKI: Speak directly.

FREDERICK KEMPE:—you know, others—to speak directly to government about regulation.

SUSAN WOJCICKI: Yes, and we do. So we do speak. There’s so much potential ongoing and new regulation that we really had to staff up our teams to make sure that we can reach out, we can talk to governments so that we can preserve what’s good about the internet—preserve the jobs that we’re creating, the revenue, the educational content that we’re producing—but also make sure that we’re being responsible. So we are speaking to governments, and we really do appreciate when governments work with us to figure out how can we do this in a productive way.

I think you had a second question around the testimony. And—

FREDERICK KEMPE: Basically, are you eager to get in front of the congressional committees?

SUSAN WOJCICKI: (Laughs.) Well, I can’t say it’s something I’m dying to do right now. But I would always, of course—if asked, I certainly would always be there and answer any question. Sundar does represent all of Alphabet. Google, of course, is part of that and YouTube is part of that. So Sundar does answer a lot of questions associated with YouTube and I feel confident that he can represent all the different questions that he’s asked. But if asked, I certainly would do my best to answer them in whatever format is appropriate for various governments.

Ready for the last question.

FREDERICK KEMPE: Yeah. So let’s—it was on monetization, but let’s wrap up with something a little bit bigger and broader.

SUSAN WOJCICKI: OK.

FREDERICK KEMPE: You’ve watched technological changes. Let’s end with this one. As you look to the future—you talked about twenty years ago. As you look to twenty years in the future, what technology do you see that’s exciting you the most? And what issue do you see that…

SUSAN WOJCICKI: Oh, gosh. Well, what technology am I most excited about? Well, I mean, maybe I can just tell you a little bit about what I think about—it’s hard to know about all the different technology.

I’ll say that I think health can be transformed. I’m very optimistic that there will be a lot more cures, that we’ll have better information about how to treat patients. I think if I were just getting started today I’d probably want to go into something in terms of medical and health and internet because I think there’s just a huge amount of opportunity. I feel like it’s how I saw the internet twenty years ago, to be able to grow and expand and offer people treatments and live longer and healthier lives.

But if I had to talk about YouTube a little bit, which is where I have more expertise… I compare YouTube sometimes to a public library because I see us being a huge public library that is available to everybody for free. Every video is available. So it’s not just focused on what people said today, but it’s, like, the full history of videos that we have there. And you could look up speeches and news events and how to fix anything, how to learn anything. And so part of my goal is to continue to grow that library to be [an] even more essential, important, and useful resource for our users, and to be able to enable the next generation of storytellers.

So we see storytellers coming who otherwise never would have had a voice, never would have been able to tell their story, never be able to share their craft or their music or their passion, or talk about issues that we didn’t use to talk about. We didn’t use to talk about mental health. There weren’t a lot of [places] to talk about all different kinds of hair, talk about so many different types of music. We just didn’t have those resources. And to have that all be available and to bring millions more storytellers, and bring them from places we didn’t necessarily have them—we don’t have a lot of storytellers right now that we’re hearing from who come from various African countries or Asian [countries]. We’re just seeing—it’s amazing to see the growth we’re seeing across Asia. But I’m very excited about new technologies on storytelling and how we’ll grow the YouTube library—and the number of jobs and the number of storytellers to continue to enable storytelling going forward.

FREDERICK KEMPE: So, Susan, I hope you’ll indulge me. I promise this is the last one.

SUSAN WOJCICKI: Sure.

FREDERICK KEMPE: But it’s from ambassador—the EU ambassador, Stavros Lambrinidis. And he’ll probably kill me if I don’t ask it.

SUSAN WOJCICKI: (Laughs.) Sure, go ahead. (Laughs.)

FREDERICK KEMPE: No, Stavros is a great friend. Of course he wouldn’t. But it’s on authoritarianism.

SUSAN WOJCICKI: Sure.

FREDERICK KEMPE: What about content that an authoritarian government might [deem] legal that violates human rights?

And I think an addendum to this is there may be domestic laws in certain countries where you can’t give opponents of that government a voice on YouTube. How do you handle that sort of local situation?

SUSAN WOJCICKI: Yeah. So good question.

So, yes, we do follow the local laws. But I would say that if we see something involving human rights or political speech in that country that is being suppressed, then that’s content that we do leave up on the platform. So we have seen various cases where we may get a request from a government to remove political speech coming from a dissenting group and they may say, look, this is illegal in our country. But if it is political speech coming from a valued—or a dissenting group that has an opinion about that, we’re going to leave that up.

We’ve also seen different human rights situations where there might be content not allowed there that is about human rights, whether it’s LGBTQ or women or different protected groups in those countries. We will leave that up. And those are often tough discussions that we have with that government.

FREDERICK KEMPE: So, Susan, what a wonderful conversation, from a garage in Menlo Park to the global stage.

SUSAN WOJCICKI: (Laughs.)

FREDERICK KEMPE: From the number of questions that have come in, I think there would be a lot of demand for regular townhalls of this sort.

SUSAN WOJCICKI: (Laughs.)

FREDERICK KEMPE: Let me end by, first of all, praising you—you’ve got a really hard job—and thanking you. Thank you for letting us into the complexities of the world and what you’re dealing with, and for so frankly and openly answering so many different questions. We look forward to working with you in the future. And of course, we’ll all be eager members of that two billion community.

SUSAN WOJCICKI: (Laughs.)

FREDERICK KEMPE: So, Susan, thanks so much from our end.

SUSAN WOJCICKI: Well, thank you so much for having me, Fred, and I look forward to coming back. And hopefully, when COVID-19 is better I can come and be live at Atlantic Council. So thank you for having me and look forward to continuing the conversation.

Image: YouTube CEO Susan Wojcicki attends a conference at the Cannes Lions International Festival of Creativity, in Cannes, France, June 19, 2018. REUTERS/Eric Gaillard