STEVE INSKEEP, HOST:
Near the highway in Menlo Park, Calif., is the headquarters of Facebook. It's an immense building, wide open inside with bare steel girders holding up the ceiling. Makes you think of an old auto factory. Shortly after we arrived, Mark Zuckerberg strolled by. And then we met his chief operating officer, Sheryl Sandberg. We spoke in her personal conference room, which has a name plate by the door that says "Only Good News." But Sandberg was in the uncharacteristic position of giving news that was not good at all.
SHERYL SANDBERG: We know that we did not do enough to protect people's data. I'm really sorry for that. Mark's really sorry for that. And what we're doing now is taking really firm action.
INSKEEP: Sandberg spoke as Mark Zuckerberg prepares to testify before Congress. He'll be asked next week about Facebook's sharing of user data without permission. To ease the way, Facebook says it will make it easier for users to keep their information from being spread. And...
SANDBERG: As part of that, if your data might have been accessed by Cambridge Analytica, you're going to be notified.
INSKEEP: Sandberg says up to 87 million individual users will be told their data was vacuumed up by Cambridge. That company later worked for President Trump's campaign. Sheryl Sandberg is accustomed to a different sort of publicity. She's the author of two books, including "Lean In," a phrase that entered the language as advice for women in the workplace. Mark Zuckerberg says he hired Sandberg to do things he did not want to do at Facebook.
I want people to know that you have been credited with being part of the reason that Facebook is so profitable. This is a story that's told about you, that the company was popular before you came, that it became much more profitable afterward. Have the events of the last year or two, though, shown that the business model of this company is part of the problem?
SANDBERG: We have an ads-based business model just like TV, just like radio. Our content's available to anyone for free because it's ad-supported, and that we feel really proud of. And we're really - we think it's really important. We're trying to connect the whole world. Two billion people use our service. A lot of them would not be able to if they had to pay for the content itself. And privacy and ads are not at odds.
INSKEEP: Not necessarily, but the data collection linked to that advertising has drawn criticism. Years ago, Sandberg considered selling subscriptions to Facebook. Instead, it's supported by ads, which means users are sold to advertisers. And Facebook gathers enormous amounts of data about users in order to target those ads. The company is not about to stop studying you, but it says it is trying harder to keep your data from falling into the wrong hands.
SANDBERG: But I think what we weren't good enough at doing is thinking in advance about the ways misuse could happen on the platform. But what's different now is that we're taking a much broader view and a much more proactive approach. A few weeks ago, when the Cambridge Analytica thing happened, we made a commitment that we would go through and find data uses that were potentially too risky or even ones we didn't want to do. And we have been proactively bringing things to the surface, shutting things down, telling people. And we're going to continue to do that.
INSKEEP: Let's talk about what you discovered with Cambridge Analytica. Some people will know that you've disclosed this week that it was 87 million people whose information was shared, the overwhelming majority of them without having given explicit permission.
SANDBERG: Yeah. That's not exactly right, so let me explain. The 87 million is anyone whose data Cambridge Analytica might have accessed. We're being super conservative and careful. This is anyone who might have been connected, or might have been connected to someone who connected to them.
INSKEEP: So you still don't know what the number is, is what you're saying.
SANDBERG: We still don't know.
INSKEEP: Well, let me ask if you know something else. Were there other firms, political or otherwise, who used data in the same way that Cambridge Analytica did?
SANDBERG: We don't know. What we announced when we talked about Cambridge Analytica is we're doing a thorough investigation and an audit. That is ongoing, and as we find those, we're going to notify people as we did with Cambridge Analytica.
INSKEEP: Talk to me as somebody who's been on Facebook for more than a decade, though. I think about Cambridge Analytica. I think about the fact this company that I had no idea about was gathering perhaps my data or somebody that I know and using it in ways I'd have no idea about. There's an issue there with consent. But even beyond consent, from the average person's point of view, doesn't Facebook and anybody who deals with Facebook essentially do the same thing Cambridge Analytica did? They gather lots of data about people and use them in ways that, whether we formally consent or not, we really don't understand.
SANDBERG: One of the things we're very focused on is making sure you do understand how all of your information is used and you do understand what information you've shared with Facebook. We announced we're rolling out privacy shortcuts. The controls have been there and most have been there for a long time, but you're right, they're hard to understand and hard to find. So this is a very simple way to see where those controls are - and control them, including ad preferences. And we're taking very assertive steps to make it easier, simpler, clearer so the controls are in one place and people can take the steps they want to take.
INSKEEP: But the business is still the same, right? You're still going to have lots and lots of information about me in ways that might make me uncomfortable.
SANDBERG: Well, we want you to know all the information we have about you. We want you to know all the controls we have. And we want to make sure you're not uncomfortable.
INSKEEP: I'm curious. When you're talking with Mark Zuckerberg or whoever else you may talk with around this company, have you had moments when you've asked the question, are we, as a company, too powerful?
SANDBERG: It's an important question. And people have that question about us and others, particularly at our size and scope. And we've had a lot of long and thoughtful conversations about what that means. We know that a lot of regulators have that question. We know that consumers around the world have that question.
INSKEEP: Do you take it seriously, or does it seem ridiculous to you?
SANDBERG: We take it very seriously. We've always had a deep responsibility for people, but at our size and scope with billions of people using our products, we have a very deep responsibility. We're having conversations with regulators around the world, but we're not even waiting for regulation.
INSKEEP: There is one bill in Congress that would make Internet firms disclose who is paying for ads.
SANDBERG: We're not waiting for it. We built a tool that shows every ad that any page is running on Facebook. It's live in Canada. It will be live in the U.S. before the election.
INSKEEP: Although when it comes to data, privacy advocates contend that Facebook has been more reluctant.
Given that the Federal Trade Commission reached a consent agreement with Facebook in 2011 to better protect people's privacy, should you have taken these steps years ago?
SANDBERG: Well, we're in constant conversation with the FTC. And that consent decree was important. And we've taken every step we know how to make sure we're in accordance with it. But the bigger answer is, should we have taken these steps years ago anyway? And the answer to that is yes, like a very clear, a very firm yes. We really believed in social experiences. We really believed in protecting privacy. But we were way too idealistic. We did not think enough about the abuse cases.
INSKEEP: That is some of our talk here in California with Sheryl Sandberg, one of the top executives at Facebook. Her boss, Mark Zuckerberg, testifies before Congress next week. Transcript provided by NPR, Copyright NPR.