
The Central Question Behind Facebook: 'What Does Mark Zuckerberg Believe In?'

A talk with New Yorker staff writer EVAN OSNOS about the crisis at Facebook. Serious data breaches and the 2016 Russian disinformation campaign have put the company and its founder, Mark Zuckerberg, under scrutiny as the mid-term elections approach.


Transcript

TERRY GROSS, HOST:

This is FRESH AIR. I'm Terry Gross. Last week, Facebook announced the most serious security breach in the company's history in which an unknown hacker was able to log onto the accounts of at least 50 million Facebook users. But that's just one element of the crisis facing the world's largest social media platform.

Facebook was the conduit for a Russian-backed disinformation campaign in the 2016 election that reached tens of millions of its users. And the political consulting firm Cambridge Analytica got access to the personal information of 87 million Facebook users, which they used to target messages for the Trump campaign. Facebook is now under investigation by the FBI, the Securities and Exchange Commission, the Federal Trade Commission and authorities in Europe and Australia.

Our guest, New Yorker staff writer Evan Osnos, explores Facebook's history and profiles its 34-year-old founder and chief executive Mark Zuckerberg in a recent issue of the magazine. He says Zuckerberg's relentless drive to expand Facebook's reach has jeopardized the privacy of its users and made it vulnerable to political manipulation. His story is titled "Ghost In The Machine: Can Mark Zuckerberg Fix Facebook Before It Breaks Democracy?" He spoke with FRESH AIR's Dave Davies.

DAVE DAVIES, BYLINE: Well, Evan Osnos, welcome back to FRESH AIR. You know, we all know Facebook is a very big deal. Give us a sense of its size and reach.

EVAN OSNOS: There's really nothing quite like it in the history of American business. It has now 2.2 billion monthly active users, meaning it's larger than any country in the world. It's really got no natural precedent when you look at the history of enterprise. It's really closer, in terms of scale and reach, to a political ideology or a religious faith. I mean, just in literal terms, it now has as many adherents as Christianity. And that has all been built in the last 14 years, since it was founded in 2004.

DAVIES: For this piece, you visited Mark Zuckerberg at his home, at his office. You had numerous meetings. A lot of access from a guy who is pretty careful about, you know, journalists getting connected to him. How did you convince him to share so much time with you?

OSNOS: Well, he did it, I should say, reluctantly. You know, this was a long process that started a year ago when I first approached them about the idea of this kind of story. I said, look, we do in The New Yorker these very long, detailed profiles. And I think initially Facebook's view was, we don't need to do this. We're a big, powerful company. And I continued to work on it, basically started interviewing a lot of other people around Facebook, people who worked there in the past, people who work there in the present. And as that accumulated, I continued to say, look; this fundamentally should be about how Mark Zuckerberg sees the world.

And over time, they came to, I think, sort of grudgingly - but I'm grateful for it - accept the idea that, to an extraordinary degree, Mark Zuckerberg is Facebook. I mean, that is the reality. He's the chairman. He's the CEO. He owns, he controls 60 percent of the voting shares. So if you're going to understand Facebook in any meaningful way, the conversation really has to start with him and end with him. And for that, I think they ultimately recognized he had to be a part of this project.

The company has come up against a growing and really serious decline of public trust, both among politicians and among the general public. And I think they recognize that at the core of that - and I think he recognizes that - there is this profound question mark around what does Mark Zuckerberg believe in? What does he stand for? What does he care about? How does he see his role in the country? What does he see for the role of technology? And this project was my attempt to try to answer some of those questions for myself and for readers.

DAVIES: Zuckerberg started Facebook when he was at Harvard. It takes off, and it grows and it grows because they're determined, you know, to get more users, to connect more people, as Zuckerberg likes to say. And you write that in about 2007, it plateaued at about 50 million users, which is where a lot of other similar platforms had stalled. What did the company do to break through that ceiling?

OSNOS: Well, they first panicked, really. There was a sense internally that they wondered whether they'd sort of hit the wall and whether this thing that had grown so fast was now over. And what they did was they discovered something important, which is that in order to punch through that wall, they had to come up with a whole range of new ways of accessing new populations, people who wouldn't otherwise have been on Facebook.

The first thing they did was they discovered they simply had to translate the site into other languages, make it available to other countries around the world and allow people to be able to post in Spanish and so on. And by doing that, they crossed this barrier, this moat, which had prevented other social media sites from growing. But it did something even more important, which was that it established the sacred importance - really, sacred is the word - of growth internally.

They created a team called the Growth Team, which became, as a former Facebook executive said to me, really the cool crowd. This was the team everyone wanted to be on. It was about really sanctifying the idea of growth as an end in itself - that if Facebook stopped growing, then its whole reason for being would cease to exist. And that idea ended up becoming the dominant fact of the next decade, from 2007 until today, when Facebook became extraordinarily focused on whatever it could do to continue growing year after year.

DAVIES: And one of the things they did was that they made it a platform for outside developers so that - maybe you want to explain what this means. So that other software developers could hitch into Facebook and use it.

OSNOS: They made a big choice, which was that, instead of saying we're just going to be our own site, we're actually going to try to become a platform - almost like an operating system, the way that Windows used to be the thing everybody went onto on their personal computer, and then you would build applications on top of that. And that decision to open themselves up as a platform meant that they were then sharing data with a much larger group of developers, of programmers, than they otherwise would.

And this turned out to be a fateful decision because if you fast-forward many years later to, you know, what we all now know as the Cambridge Analytica scandal, that was - it was directly a result of this decision to open themselves up as a platform. Because what they'd done was they'd allowed an academic researcher to use some of the data on the platform to build a personality quiz. But then he sold that data to the political consultancy known as Cambridge Analytica. And that sort of back door, the way in which that data went out the door, has created much of the crisis that now engulfs the company today.
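To make that mechanism concrete, here is a minimal sketch of the kind of permission model at issue: an app installed by one user that can also read that user's friends' data. Everything here - the names, the data shapes - is a hypothetical illustration, not Facebook's actual platform API.

```python
# Hypothetical sketch: under early platform rules, an app authorized by one
# user could also read profile data belonging to that user's friends, who
# never installed the app themselves. Illustrative only.
from dataclasses import dataclass, field

@dataclass
class User:
    name: str
    profile: dict
    friends: list["User"] = field(default_factory=list)

def app_visible_data(installer: User) -> dict:
    """Data a third-party app could collect after a single user installs it."""
    data = {installer.name: installer.profile}
    # The controversial part: friends' profiles come along too, even though
    # those friends never consented to this particular app.
    for friend in installer.friends:
        data[friend.name] = friend.profile
    return data

alice = User("alice", {"likes": ["hiking"]})
bob = User("bob", {"likes": ["jazz"]})
alice.friends.append(bob)
print(app_visible_data(alice))  # includes bob's profile as well
```

Under rules like these, a quiz installed by a comparatively small number of users could yield data on their much larger circles of friends - which is how the Cambridge Analytica data set reached 87 million people.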

DAVIES: And you write that an executive within Facebook, Sandy Parakilas, was put in charge of looking into what these outside, you know, app developers were doing with the data that they got from the Facebook users who, you know, plugged into their games. What did he find?

OSNOS: What he found unnerved him. Sandy Parakilas had joined Facebook in 2011 and was one of the people responsible for going out and figuring out whether the data that they were giving to developers was being misused in any way. You know, were people violating privacy? Were they taking more data than they were supposed to? And what he found was that they were. In some cases, programmers, for instance, were siphoning off people's pictures and their private messages. In other cases, he found a developer that was building essentially shadow profiles - profiles for people who'd never given their consent, including children. And this was a case in which there was just this feeling of it being the Wild West. This data was out there, and nobody was paying attention. And he raised alarms internally.

As he tells the story, what he did was he said, look, we need to do a major audit to go out and figure out where is our data, who has it and how are they using it. And as he says he was told, that's not going to happen because if you do it, you may not want to know what you're going to find. Meaning that they may have already lost control of so much of that data that they didn't really want to discover the full reach. And Sandy Parakilas left, but in many ways his warnings turned out to be prophetic because exactly the kind of undisciplined use of data which he had warned about and tried to raise greater alarms about internally turned out to be the origins of the Cambridge Analytica scandal, which became so consuming for the company this year.

DAVIES: So a lot of resources within the company devoted to pushing growth, not so much into doing it responsibly.

OSNOS: Yes. Exactly. As he said, you know, I needed people, I needed engineers to help me try to police where this data was going, but all of the resources were going to growth. At some point, he told me, the growth team had engineers coming out of their ears. They had everything they wanted. Everything was devoted to growing, and it was not devoted to making sure that what they were doing at the time was safe.

DAVIES: What are some of the other things the company did to keep growing that kind of pushed the boundaries of privacy?

OSNOS: Well, they began to build in lots of ingenious details into the design of the site, things like autoplay videos. This is something as simple as changing the nature of Facebook so that when you scroll down your page, that the videos would begin to play without you having to click on them. What that was was essentially taking advantage of some of our own psychological wiring. That means that just eliminating that small obstacle makes you much more likely to stay on the site, to watch that ad and ultimately to consume whatever advertising is around it.

And they did other things. They, for instance, you know, you remember in the very beginning Facebook used to have pages - single pages - you'd have to click onto the next page. They got rid of that, so it's just a continuous scroll. And all of these little tiny details were, in a way, making Facebook into a new generation of behavioral experts. They figured out how do you tweak people's vanities and their passions and their susceptibilities and their desires in order to keep them on the site.

The most important thing Facebook could do - and this is how they measured it - was make sure that people were signing on and staying on. Whenever you joined the company, in your orientation, you were taught about something very important, a metric known as L6 of 7, which means the number of people who logged in six of the last seven days. And whatever you could do to try to raise that L6 of 7, that was the priority.
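For illustration, here is a minimal sketch of how a metric like L6 of 7 could be computed from raw login records; the function name and data shapes are assumptions, not Facebook's internal code.

```python
# Hypothetical sketch of an "L6 of 7" engagement metric: the fraction of
# users who logged in on at least 6 of the last 7 days. Illustrative only.
from datetime import date, timedelta

def l6_of_7(login_log: dict[str, set[date]], today: date) -> float:
    """login_log maps a user id to the set of dates that user logged in."""
    window = {today - timedelta(days=i) for i in range(7)}
    if not login_log:
        return 0.0
    qualified = sum(1 for days in login_log.values() if len(days & window) >= 6)
    return qualified / len(login_log)

# Example: alice logged in 6 of the last 7 days, bob only twice.
log = {
    "alice": {date(2018, 10, 1) + timedelta(days=i) for i in range(6)},
    "bob": {date(2018, 10, 1), date(2018, 10, 3)},
}
print(l6_of_7(log, today=date(2018, 10, 7)))  # 0.5
```

On this picture, anything that nudges a user to log in one more day a week moves the metric directly - which is why small design tweaks mattered so much internally.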

But as, you know, somebody put it to me, the problem with that is that eventually you exhaust the positive ways of boosting that engagement, and eventually you start to look at what this person described as the dark patterns, the ways that you can use anxiety or vanity to try to get people to sign on.

One of the things they discovered, for instance, was that if you send somebody an email that says that a Facebook friend has uploaded a picture of them to Facebook, that people are almost incapable of resisting the temptation to look. And that sort of tweak, that just minor behavioral nudge, turned out to be really hugely important to the growth of Facebook.

DAVIES: Evan Osnos is a staff writer for The New Yorker. We'll talk some more after a short break. This is FRESH AIR.

(SOUNDBITE OF MICHAEL BELLAR AND THE AS-IS ENSEMBLE'S "HOT BOX MAGIC")

DAVIES: This is FRESH AIR, and we're speaking with New Yorker staff writer Evan Osnos. He has a piece in the magazine called "Ghost In The Machine" about Mark Zuckerberg, Facebook and its current controversies. It's in a recent issue. Now, Facebook is in a lot of trouble these days because of data breaches and its role in elections.

But you write that some former Facebook executives are voicing doubts about the company's role in other things - exacerbating isolation, outrage, addictive behaviors. You want to give us an example?

OSNOS: This is a big change for Facebook. Traditionally, its former executives have been silent. They leave the company and very often don't say much about it. Starting last year, Sean Parker, who was Facebook's first president - he's a sort of well-known Silicon Valley figure - gave an interview in which he said that, as he put it, he's become a conscientious objector to social media. He said, God knows what it's doing to our children's brains.

And then a few days later, there was another very prominent former Facebook executive named Chamath Palihapitiya, who had been the head of growth, a vice president of user growth, which was an absolutely central position at the company for a number of years. And he came out and said, you know, in the back of our minds we all had a sense that something might be going wrong, that our product, the thing that we were building - I'm paraphrasing here - was, as he put it, contributing to the breakdown of social discourse, the breakdown of society. And he said he would never let his children use this kind of product.

I think because the public face of the company has been so on message about how they contribute, in their minds, to doing good in the world, to have some very senior former executives come out and talk about what the public has come to believe, which is that there are these very serious side effects to the company's growth, was a real wakeup call, externally and I think somewhat internally.

DAVIES: Yes, side effects in terms of increasing social tensions and, in some cases, violent political activity. What about isolation? Addictive behaviors?

OSNOS: There has been a growing body of research - published in the major scientific and academic journals - that shows there is a correlation, in some cases, between heavy Facebook use and a decreased sense of well-being and of connection. People feel lonely. There is a lot that they're trying to figure out about this, but it's gotten to the point where the data is unimpeachable, and the company has begun to acknowledge it.

You heard for the first time early this year that Facebook said, look, we recognize that there are different ways that you can use this, and if you're just using it passively, if you're just sitting there scrolling with glazed eyes, hour after hour, we recognize this is not good for you. And so what they did is they said we're going to try to change it a bit, so that people are more actively engaged, so that they're, you know, talking to friends, actively communicating with their family members and stuff like that.

But that was a key difference from how they used to talk about it, which was that if you didn't agree with the fundamental premise that Facebook was basically good, then it was a kind of heresy, and they wouldn't want to have the conversation. They are now slowly acclimating to the idea that people just don't buy that anymore.

DAVIES: Yeah, you talked to Zuckerberg a fair amount about this, you know, pushing of boundaries of privacy to grow. How does he regard this kind of experimentation?

OSNOS: Well, he absorbed a central belief early on in his career, and I think it's become just key to understanding why he's made choices that he has and some of the mistakes that he has, which is that he decided early on that he was very often going to be criticized. He said, look, this is just a fact of what we do. As he said, look, we're not selling dog food here, we're doing something intensely, inherently controversial.

It's at the intersection of psychology and technology. And so when people criticize Facebook for being too casual about their privacy, for allowing data to be - to get out into the world, very often what he came to believe was that they would criticize him at the time, but over time, eventually they would accept it, they would get used to it, and they would keep signing on. They would keep growing.

And that idea about criticism really hardened and became a central governing principle at Facebook. The sense that they were leaders, they were pioneers, they were forging ahead. They had to push the public beyond its comfort zone when it came to being less private. Because if they didn't do it, then people wouldn't go there.

But they were convinced that even in the cases of controversy, when you had civil libertarians or regulators or politicians or ordinary members of the public complaining about Facebook, that that was simply a sign that they were being bold, and that idea continued, really, until the present day.

DAVIES: Is it true that in 2010, he said privacy is no longer a social norm?

OSNOS: He did. And it caused a big uproar at the time. He said, look; this is a generational difference. We don't feel the same way about privacy that our parents and grandparents did. And people said that's wild. That's not right. Privacy is built into the very nature of the United States. It's really embedded in the Bill of Rights. And his belief was that it was, as it was often described, an antique, and that we needed to push people further.

There was - in the early days of Facebook, there was a theme, a phrase that was bandied about called radical transparency, the idea that you had to be aggressively transparent in order to be modern. The sense was, as one person had put it, you know, that in the future, because of Facebook and other things like it that were exploding the boundaries of privacy, that extramarital affairs would become impossible. People couldn't hide things like that. They could no longer hide their lives outside of work from their lives in work.

And they believed that to be a virtue, this sense that there would be this fusion, this union of our private selves and our public selves. But that put them at odds with the public. And the key fact I think was that over and over again, Mark Zuckerberg believed that being at odds with the public was not a sign you were doing something wrong; it was a sign that you were doing something innovative. And their mantra, their motto of course became move fast and break things. And that motto really captured the way that they see the world.

DAVIES: Yeah, almost, like, redefining what it is to be human.

OSNOS: Yeah, they believed that this tool, Facebook, had that kind of power. And they came to being at a time in Silicon Valley where you had this almost messianic sense of ambition - this belief that you weren't just building computer applications. You were actually building tools that were fundamentally reshaping society. And they embraced that wholeheartedly.

DAVIES: There was a recent data breach at Facebook where 50 million users' information was taken by somebody. How serious a problem is this?

OSNOS: This is a serious one. This is the largest security breach in Facebook's history. And what was unusual about this and what sets it apart from other cases, like Cambridge Analytica, was that this was outright theft. This was a case of hackers or hacker - we still don't know who it was - finding essentially an under-protected door and walking through it and taking control of at least 50 million Facebook user accounts. Facebook also, to be safe, took another 40 million users and kicked them off, forced them to log back in. So it may be as many as 90 million or more that were affected by this.

And in this case, the hackers were able to get total control of the accounts. So they were able to get control of your privacy settings. They could go into your messages. They could post things on your behalf. At this point, Facebook says they haven't found any evidence of these hackers doing that. So that only heightens the mystery. They don't know why they did it. They don't know if this was a foreign government or if this were individuals - if this was a criminal act.

But what's interesting about this particular case, and why it really leaps out to people who study the history of Facebook, is that a few years ago, Facebook might not have gone public with this as fast as they did. They would probably have investigated it more internally. But under the new rules that have been imposed by the European Union, they were required to announce this very fast. And as a result, they had to talk about this breach really before they knew very much about it. So it's raised as many questions as it's answered at this point.

GROSS: We're listening to the interview FRESH AIR's Dave Davies recorded with Evan Osnos, a staff writer for The New Yorker. Osnos' article about Facebook is titled "Ghost In The Machine: Can Mark Zuckerberg Fix Facebook Before It Breaks Democracy?" We'll hear more of the interview after a break. I'm Terry Gross, and this is FRESH AIR.

(SOUNDBITE OF MOLE'S "STONES")

GROSS: This is FRESH AIR. I'm Terry Gross. Let's get back to the interview FRESH AIR's Dave Davies recorded with Evan Osnos about his recent New Yorker article, "Ghost In The Machine: Can Mark Zuckerberg Fix Facebook Before It Breaks Democracy?" Osnos profiles Zuckerberg and writes about how Zuckerberg's relentless drive to expand Facebook's reach has jeopardized the privacy of its users and made it vulnerable to political manipulation.

DAVIES: There's been lots of discussion about Facebook being used to exacerbate social and political tensions in the United States. There are also serious questions about it being a catalyst for political violence in other parts of the world - Myanmar, for example. What are we seeing?

OSNOS: Once Facebook in effect saturated the Western world, it began to move more aggressively into developing countries. And as it did - as it moved into places like India and Sri Lanka and Myanmar - it really became a powerful new ingredient, and in some cases a dangerous one, in these longstanding ethnic and religious rivalries.

In Myanmar, for instance, when Facebook arrived, it really became in effect the Internet. There was only about 1 percent of the population that was online before Facebook was there. And then, over the course of the next few years, it grew rapidly. And people who were interested in fomenting this kind of ethnic hatred between Buddhists and Muslims figured out how to use Facebook to great effect. And they would spread hoaxes or rumors. And in - there were cases where a rumor or a hoax that was spread on Facebook directly contributed to the outbreak of a riot.

And what you heard over that period was that people in Myanmar began to talk to the company, warned the company that this was a problem. As early as 2013, people began visiting Facebook's headquarters in Menlo Park, giving slide presentations, talking about the problem of ethnic hatred being trafficked on Facebook. And the company would listen. And in some cases, they would give a hearing to that view. Over the years, activists and technology entrepreneurs from Myanmar continued to visit. But they didn't see any fundamental change.

And every time they went, they would very often hear the same message, which was, we're going to hire dozens of Burmese language speakers to be able to police this kind of thing more effectively. But still, fundamentally, it didn't change.

And eventually the role of Facebook as a catalyst in this violence became so pronounced, so serious, that earlier this year, the U.N. investigator in charge of probing the persecution of the Rohingya Muslim minority in Myanmar described Facebook as, in her words, a beast. She said it has become something that it was never intended to be. And it is actively contributing to what the U.N. now considers a genocide.

DAVIES: Did you talk to Mark Zuckerberg about this?

OSNOS: I did. And I asked him about it. And at first he was frankly a little glib about it. He said, look; this is a problem that is similar in a lot of places, and it's one that we're dealing with. I'm paraphrasing there. And I pushed him on it. I said, look; I talked to people in Myanmar just yesterday in fact, and they're baffled about why a company as big and as rich and as innovative as Facebook has been unable to deal with this problem.

And what he said was, we take this seriously. We really do understand this is a problem, but it's not something in which you can just snap your fingers and solve overnight. It takes a process. You have to build out these systems in order to allow artificial intelligence to detect hate speech and then hire the kinds of people who can solve the problem. He said that they're going to have over a hundred people - Burmese speakers - who are going to be policing Facebook in Myanmar.

But I think it speaks to a broader dynamic at work here, which is that for a long time, as one of Mark Zuckerberg's friends said to me, when there were complaints about the company, he either thought that these were just Luddites - people who were slow to embrace technology - or, in other cases, that they were exaggerating or overstating the role that Facebook was playing. He looked at a place like Myanmar and said, well, they were probably going to be fighting anyway. I'm putting words in his mouth there. Those are not words he said to me.

But I think over time, he has come to this realization - and by his own description, a belated realization - that Facebook is not just a tool on the table. It is not just a new implement. It is in fact a fundamental and very - by its own nature, it has to be responsible for the forces that it unleashes. But then building the systems to try to get control of this is hard, and they have moved much more slowly than they should have, in some cases by their own admission.

DAVIES: Let's talk about the 2016 elections. Did Facebook when the election was approaching see this as a big revenue opportunity?

OSNOS: They did. They saw this as an important moment. Sheryl Sandberg, in a call with investors and analysts, compared it to the World Cup or the Super Bowl, which was...

DAVIES: She's the chief operating officer, yeah.

OSNOS: She is, yeah. She's sort of arguably the second most powerful person at Facebook. And what she said was this was going to be a major opportunity for them to sell ads - political ads. Projections at the time were that as candidates and political organizations became more aware of the importance of the Internet, that they were going to shift a lot of their spending from television into the Internet and that you were going to see a nine or tenfold increase in how much spending was going to be available. And Facebook wanted to be a big recipient of that.

DAVIES: Right. It had gotten a special exemption from rules that would prevent ads which didn't disclose who paid for them, right? So it was kind of a wide-open opportunity.

OSNOS: Yeah. Facebook had used its lobbying power. It had argued to the Federal Election Commission that it should be exempted from rules that require television advertising to be identified by the source of the funding - you know, that point at the end where...

DAVIES: Right.

OSNOS: ...They always say who paid for the ad. They said, we shouldn't have to follow those rules because we're a new technology. And in their filings, they said, you don't want to stifle the growth of new innovation. But as a result, that meant that it was in a sense a very dark terrain, that things that were being posted on Facebook that were ads around politics were in many cases of mysterious origin. It was very hard to know who was posting them and why.

DAVIES: And Facebook offered to embed a Facebook employee with both the Clinton and Trump campaigns to help the campaigns use the platform effectively. How did they respond?

OSNOS: Well, the Clinton campaign rejected the offer. They thought they had more or less enough of their own technical capability to do it. But the Trump campaign embraced it eagerly. They were a much smaller, almost sort of shoestring operation. They had very little of the seasoned political expertise that was rallying around other presidential candidates. And so Facebook moved employees into the Trump campaign headquarters.

And they helped them craft their messages. They helped them figure out how to reach the largest possible audience and how to test different messages - many, many messages a day - to figure out how small differences, like changing the background color or the text or the font, would impact the number of people who would click on it and ultimately might give money and support the candidate.

So in the end, after Donald Trump won the election, the senior campaign strategists were very clear. As one of them, Theresa Hong, said to an interviewer, without Facebook, we would not have won. It played an absolutely essential role in the process.

DAVIES: How did the Trump campaign itself use the platform to affect things like turnout?

OSNOS: Well, one of the things they did was the Trump campaign bought an ad campaign on Facebook that was designed to suppress turnout among constituencies that they expected to be important to the Democrats, including African-Americans and young liberals and white women. And by targeting that population using these incredibly powerful tools of persuasion that Facebook has, which have been engineered to optimize, to get people to respond - in the view of the Trump campaign, that was an important piece of their success. And they've talked about it ever since.

DAVIES: Evan Osnos is a staff writer for The New Yorker. We'll talk some more after a short break. This is FRESH AIR.

(SOUNDBITE OF DANILO PEREZ'S "THE SAGA OF RITA JOE")

DAVIES: We're speaking with Evan Osnos. He's a staff writer for The New Yorker. His piece about Mark Zuckerberg and Facebook, "Ghost In The Machine," is in a recent issue of the magazine.

So when word began to emerge about the spread of false information on Facebook, some of it by Russian actors - we now know about the Internet Research Agency in Russia - how did Zuckerberg respond to this?

OSNOS: Initially, he rejected it. He said it just seems, as he put it, pretty crazy that the presence of fake news might have affected the outcome of the 2016 election. He said that just a few days after the results were in. And for a while after that, Facebook remained really reluctant to embrace the idea that they played a meaningful role in the election. Mark Warner, the senator from Virginia who's the ranking Democrat on the Senate Intelligence Committee, contacted Facebook shortly after the election and said he really wanted to talk about the role of Russian interference on Facebook.

And as he put it to me, they were completely dismissive. They just didn't believe that they had a serious role to play here. Over time, they have come to understand that that's simply not the case. Initially, they'd estimated that fewer than 10 million Facebook users might have been affected by Russian disinformation, and they later had to revise that in preparation for testimony in Congress. And they said, actually, as many as 150 million Facebook users were affected by Russian disinformation.

And what's remarkable about that is how efficient it was, actually, as a conduit for disinformation because the Russian Internet Research Agency, which was reporting to the Kremlin, had fewer than a hundred members of its staff on this project, and yet they were able to reach a size, 150 million Facebook users, that is extraordinary. And it was - I think to this day Facebook is struggling with that fundamental paradox, which is that on the one hand their business and their success depends on their ability to tout their powers of persuasion. They are telling advertisers, we can encourage users to listen to you, to believe in you and to act on what you're telling them. And yet at the same time, they're trying to say that they have not had this dispositive effect on our politics. And that is a contradiction.

DAVIES: Right. And then there was the Cambridge Analytica scandal in which, you know, it emerged that a firm working for the Trump campaign had acquired the personal data of, what - 87 million Facebook users?

OSNOS: That's right.

DAVIES: So the company was in big trouble. Zuckerberg went before Congress, carefully prepped, of course. How did he do?

OSNOS: Well, there was a lot riding on that appearance. You know, in many ways it kind of felt like a trial. Here he was on behalf of the company, going in front of Congress, and there were growing calls for regulation. And he, in some ways, vastly exceeded expectations - largely because Congress showed itself to be really extraordinarily unprepared to deal with the complexity of Facebook. They just simply didn't ask the kinds of questions that would have really gotten to the heart of Facebook's operations and how it makes choices.

So much so that at one point, Orrin Hatch, senator from Utah, said to Mark Zuckerberg, if you don't charge customers then how do you make any money? And Zuckerberg kind of gave a little smile and said, Senator, we run ads. It was such an obvious fact to anybody who's paid attention to technology that it really, I think, underscored the mismatch between the scale and investment and sophistication of these companies and Congress's inability to come up with the laws and the rules that can respond to them in real time.

DAVIES: Maybe you could just explain that a bit. I mean, Facebook makes a fortune from digital ads. How does it work?

OSNOS: Yeah. In some ways, Facebook is actually a little bit like a newspaper in the sense that the way that it pays for itself is by running ads alongside the content that people post and look for on there. So on any given moment when you go on Facebook, you will find these highly targeted ads. These are things that are chosen just for you based on your browsing behavior around the Internet, based on the posts that you've clicked on, the things that you look for.

They choose ads that you are much more likely to click on than you would if they were just sending the same ad to everybody else. And that formula, that ability to micro-target, as it's known, ads to specific users has been this extraordinary geyser of business success. They just stumbled on something that was able to generate returns for companies that kept them coming back over and over again and advertising on Facebook.
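As a rough illustration of that micro-targeting formula, here is a sketch in which candidate ads are ranked for a user by a toy predicted click-through score multiplied by the advertiser's bid. The scoring rule and all the names are assumptions made for illustration, not Facebook's actual ad system.

```python
# Illustrative sketch of micro-targeting: pick the ad with the highest
# expected revenue (predicted click-through rate * advertiser bid) for a
# given user. Toy model; not Facebook's actual ranking system.
from dataclasses import dataclass, field

@dataclass
class Ad:
    advertiser: str
    target_interests: set
    bid: float  # what the advertiser pays per click

@dataclass
class UserProfile:
    interests: set = field(default_factory=set)

def predicted_ctr(user: UserProfile, ad: Ad) -> float:
    # Toy model: CTR rises with the share of the ad's targets the user matches.
    if not ad.target_interests:
        return 0.01
    overlap = len(user.interests & ad.target_interests)
    return 0.01 + 0.1 * (overlap / len(ad.target_interests))

def select_ad(user: UserProfile, ads: list) -> Ad:
    # Show whichever ad maximizes expected revenue for this particular user.
    return max(ads, key=lambda ad: predicted_ctr(user, ad) * ad.bid)

user = UserProfile(interests={"hiking", "camping", "politics"})
ads = [
    Ad("OutdoorCo", {"hiking", "camping"}, bid=0.50),
    Ad("NewsSite", {"politics"}, bid=0.30),
]
print(select_ad(user, ads).advertiser)  # OutdoorCo
```

In a real system, each click would feed back into the user's interest profile, sharpening the targeting further - which is the loop that makes the model so lucrative.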

DAVIES: And the company is taking steps - certainly says it's taking steps - to monitor the content that political players are using the platform for, and that, you know, is arguably critical as we approach the midterms. It's also pretty tricky, right? I mean, how do you distinguish spin from fakery, or dangerous content from distasteful? How's the company doing this?

OSNOS: It's become this consuming effort. You know, over the last year, as the controversy has grown, Facebook has undertaken one initiative after another. So when it comes to political election advertising, for instance, what they said was, OK, even though for years we argued that we shouldn't have to disclose the funding sources for political ads, now we're not only going to do that, but we're going to go farther than TV does. We're going to let users click on an ad and know not only who paid for it but what other ads those people paid for and who they're targeting. They're also saying, we're going to do more to defend against these kinds of disinformation campaigns - what they call coordinated inauthentic behavior - essentially misinformation that's distributed to try to shape elections, not only in the United States but in other countries.

And they've had to - in many ways, Dave, it's almost like they've begun to take on some of the qualities of a government. You know, they've had to hire people who actually worked in the U.S. government on things like misinformation in order to try to ferret out efforts by Russian officials - or in one case, there was an Iranian campaign - to try to spread misinformation. But what we don't know and really won't know until after the midterm elections in the U.S. and the elections that come to follow are whether or not that's working.

And I think what's interesting is Facebook has become much more public these days in terms of talking about when it finds examples of disinformation. It'll announce it. It will say, we took down a group of Russian impostors who were seeking to affect American voting behavior, for instance. But that's either a sign that they're winning the battle, or it's a sign that the battle has grown so much that they're going to continue to face this. And what I'm struck by is how much the integrity and the credibility of elections now rests on the shoulders of individual employees inside a private company in California. That's a very unusual situation for our democracy to be in.

DAVIES: 'Cause they can actually make distinctions about what the public sees and what it doesn't.

OSNOS: Yes, yeah. I mean, they have to make very subtle choices. Take, for example, the, you know, misinformation. What is the definition of misinformation? When is somebody being wrong by accident, and when is somebody being wrong on purpose? When are they trying to deceive large numbers of the public? Those kinds of very subtle things which are usually the province of the Supreme Court or of lawmakers are now being handled in conference rooms at Facebook, and that's very complicated.

DAVIES: Evan Osnos is a staff writer for The New Yorker. His piece "Ghost In The Machine," about Mark Zuckerberg and Facebook, appeared in a recent issue. We'll talk some more after a short break. This is FRESH AIR.

(SOUNDBITE OF THE ADAM PRICE GROUP'S "STORYVILLE")

DAVIES: This is FRESH AIR, and we're speaking with Evan Osnos. He's a staff writer for The New Yorker. His story "Ghost In The Machine," about Mark Zuckerberg and Facebook, appears in a recent issue of the magazine.

They recently took down posts from Alex Jones of Infowars. You want to tell us that - tell us about that?

OSNOS: Yeah. This was really an important case study in how Facebook's going to deal with its most complicated problem in some sense, which is content. What do you simply do with the fact that people are posting a billion items of content to Facebook every day? That's the actual number. And what they've tried to do is to say, OK, we're going to punish hate speech; we're going to prevent hate speech from being on here. But for things that are less than hate speech - if it's just misinformation or things that appear to be wrong by accident, well, then we're not going to ban that person from Facebook. We're going to try to use other tools. We may make their posts less visible. We might share them less.

But then there's the case of Infowars, which, as we all know, is a conspiracy website led by Alex Jones. For years, it has promoted in particular the falsehood - a false conspiracy theory - that the massacre at Sandy Hook Elementary School was staged, that it was a hoax designed to advance an anti-gun agenda. That's the theory. And for years, people have complained to Facebook about it. They said, this really has risen to the level of harassment of the parents whose children were killed at Sandy Hook.

And this summer, people started to criticize the company more publicly and said, look; you need to remove Infowars. This is no longer normal content. They've disqualified themselves from being a part of civil discourse here by directing all of this harassment at the parents. And then there was - the parents of one of the children at Sandy Hook wrote an open letter, quite candid and very tough, directed at Mark Zuckerberg and said that they have been driven into hiding because of their, as they put it, inexplicable battle with Facebook to try to get this kind of material taken down so that they don't have to deal with trolls and people who are, you know, discovering their address and then threatening them online.

And finally this summer, Apple took down Infowars' podcasts. And then very soon thereafter, Facebook and other companies followed suit. And I talked to Mark Zuckerberg about it. I said, why did you wait so long? Why did you wait so long to take down this thing that people had so clearly complained about? And he said, well, we don't want to punish people who are just wrong. We don't want to ban them from Facebook. What we're trying to do is figure out how to shape this thing. And he acknowledged, in fact, that because Apple had moved on this - he said, at that point, we realized we had to make a decision. We had to get rid of this, and so we did it.

But from my perspective, what was interesting about this was that this is the very beginning of an issue for Facebook. This is not the end. I mean, this is just the front edge of an unbelievably complex problem, which is, what are the bounds of free speech? What do we actually want to be able to have, and what do we consider to be out of bounds? What is, in effect, shouting fire in a crowded theater, and what is legitimate provocative, unsavory speech? And these are some of the hardest problems that we face, and they're now in the hands of - let's face it. They're in the hands of the engineers, the people who created this incredibly powerful application.

DAVIES: Yeah, you know, there's lots of First Amendment law that says the government can't, you know, distinguish types of speech and prohibit it. Is it clear that a private company like Facebook can when it's this big?

OSNOS: Well, actually, Facebook, as a private company, can do whatever it wants on speech. If they decided tomorrow that you couldn't talk about golden retrievers on Facebook, they could put that rule in place. And I think for some reason, that - you know, we find ourselves really torn. Even if you're not a fan of Infowars - and God knows I'm not - it has to make a person uneasy to know that there is now a company which is capable of deciding not only what kind of information it's going to suppress but also which kind of information it's going to promote. And on any given day, there are people who are going to be offended by those choices.

But the tools by which Facebook is held accountable are not the tools that we use in politics. It's not like you vote the bums out. It's not like people are appointed to Facebook's board as if they were Supreme Court justices. This is a case in which a private company is making profound choices about the contours and the boundaries of political expression. And we don't have obvious tools with which to regulate them.

DAVIES: You know, when you look at the arc of the story, I mean, this company founded by Mark Zuckerberg has this astonishing growth, is deeply committed to growth and, in doing so, you know, compromises privacy and ends up, you know, sharing data it shouldn't about its users and gets into some trouble. And a question arises about whether the company has the ability - has the capacity for self-reflection, whether it can take, you know, adverse information and re-examine its assumptions and practices. And in many respects, this really comes down to Zuckerberg. What did you find about that?

OSNOS: I found that he is insulated, to I think an unhealthy degree, from this kind of criticism. And if he was sitting with me right now, I would say this directly. The reality is he's built the company in his own image. He's had the luxury of sculpting an organization to his liking - I mean, quite literally, the blue that Facebook uses as its signature color was chosen because he is red-green colorblind, and he prefers to look at blue. He can see it very distinctly. So in every way, both physical and spiritual, this company reflects his sensibilities.

But in order to be able to continue to grow and evolve and respond to the problems that it's encountered, he needs people sitting in the room with him who will tell him, Mark, I think you're not seeing this the right way; you're not seeing this clearly; you're wrong. And I was struck that in our interviews, I got the sense from him that he knows that on some level.

He's tried over the years to make these choices, to get outside what he described as the bubble. He's got five people who report directly to him. And they are all people who he has in effect chosen and installed in those positions. And there are very few people at Facebook who are willing to stick their neck out and say, I fundamentally disagree; we need to do things differently.

DAVIES: You know, at the end of this piece, you write that some people think of Mark Zuckerberg as an automaton with little regard for the human dimensions of his work. And you say, not exactly. The truth is something else. What's the truth?

OSNOS: The truth is that he is at peace with what he has done, with the choices that he has made. I came to really understand that Mark Zuckerberg, in his own conception of his place in history, believes that no change happens painlessly and that change is difficult. And in many ways, it's like his inspiration - Augustus Caesar. He believes that he's made tradeoffs, that he has - in order to grow, he had to give up perfection. If he wanted to be vastly influential, then he couldn't always be quite as safe as people wanted him to be.

And in his mind and in the mind of the people around him, they are vindicated by their sheer scale and success. And for that reason, it's very hard for them to accept that the public is howling, in many cases, for real change because they believe if we had given in to the critics at every step along the way and made changes, then we wouldn't be as big as we are today.

DAVIES: Evan Osnos, thanks so much for speaking with us.

OSNOS: Thanks very much for having me, Dave.

GROSS: Evan Osnos is a staff writer for The New Yorker. His article about Facebook is titled "Ghost In The Machine: Can Mark Zuckerberg Fix Facebook Before It Breaks Democracy?" If you'd like to catch up on FRESH AIR interviews you missed, like our interview with Washington Post national security correspondent Greg Miller, author of the new book "The Apprentice: Trump, Russia And The Subversion Of American Democracy," check out our podcast. You'll find lots of FRESH AIR interviews, including my recent interview with pianist, composer and singer Jon Batiste, who leads the house band on "The Late Show With Stephen Colbert" and was at the piano for our interview.

(SOUNDBITE OF THE ROB DIXON TRIO'S "SAN LEANDRO")

GROSS: FRESH AIR's executive producer is Danny Miller. Our interviews and reviews are produced and edited by Amy Salit, Phyllis Myers, Sam Briger, Lauren Krenzel, Heidi Saman, Therese Madden, Mooj Zadie, Thea Chaloner and Seth Kelley. I'm Terry Gross.

(SOUNDBITE OF THE ROB DIXON TRIO'S "SAN LEANDRO")

Transcripts are created on a rush deadline, and accuracy and availability may vary. This text may not be in its final form and may be updated or revised in the future. Please be aware that the authoritative record of Fresh Air interviews and reviews are the audio recordings of each segment.
