After going MIA for several days following the explosive Cambridge Analytica controversy, Facebook CEO and co-founder Mark Zuckerberg faced the music and gave press interviews addressing the misuse of users’ private data. It took five full days for him to finally take charge of the situation and own up to the company’s missteps. Any crisis PR firm would tell you that’s five days too long to regain your users’ trust.
Zuckerberg apologized for a “major breach of trust,” admitted to mistakes made and outlined steps to protect users following Cambridge’s data grab.
His mea culpa on cable television came a few hours after he acknowledged his company’s mistakes in a Facebook post, but without saying he was sorry.
Zuckerberg and Facebook’s No. 2 executive, Sheryl Sandberg, had been quiet since news broke Friday that Cambridge may have used data improperly obtained from roughly 50 million Facebook users to try to sway elections. Cambridge’s clients included Donald Trump’s general-election campaign.
Facebook shares have dropped some 8 percent, lopping about $46 billion off the company’s market value, since the revelations were first published.
Zuckerberg did say he was open to testifying before Congress, but he quickly hedged, saying he would do so only if he was the right person to speak on the subject. He was quick to let it be known that Facebook is fixing the current privacy issues and that it will cost the company “many millions of dollars.”
Many users have grumbled that if Facebook had promptly dealt with the privacy issues that have hounded it for a decade, the billionaire wouldn’t be whining about having to spend millions.
“You can bet that if his stock price hadn’t taken such a hit, Zuckerberg would have kept out of sight,” one user said Wednesday. “His apology is a decade too late.”
MTTG: Can you talk a little bit about the things you announced on Wednesday? Let’s have you explain each of them very briefly.
Mark Zuckerberg: Sure. At a high level, this is a major breach-of-trust issue, and our high-level responsibility is to make sure that this doesn’t happen again. So, if you look at the problem, it kind of breaks down into a couple of areas. One is making sure that, going forward, developers can’t get access to more data than they should. And the good news there is that we actually made the most important changes to the platform in 2014, three or four years ago, to restrict apps like [researcher Aleksandr Kogan’s] from being able to access a person’s friends’ data in addition to their own.
So, that was the most important thing, but then what we’re also doing on our platform is closing down a number of other policies. Like, for example, if you haven’t used an app in three months, the app will lose the ability to access your data without you reconfirming it, and a number of things like that. So, that’s kind of category 1, which is going forward. And again, the good news there is that as of three or four years ago, new apps weren’t able to do what happened here. So this issue is largely resolved going forward.
Then there’s going backwards, which is: before 2014, what are all the apps that got access to more data than people would be comfortable with? And which of them were good actors, like legitimate companies and good-intent developers, and which of them were scams, right? Like what Aleksandr Kogan was doing, basically using the platform to gather a bunch of information and sell it or share it in some sketchy way. So, what we announced there is, we’re going to do a full investigation of every single app that had access to a large amount of people’s data before 2014, when we locked down the platform, and if we detect anything suspicious, we’re basically going to send in a team to do a full forensic audit, to confirm that no Facebook data is being used in an improper way.
And of course, any developer that isn’t comfortable with that, we’ll just ban from the platform. If we find anything that is bad, then of course we’ll also ban the developer, but we will also notify everyone whose data has been affected. Which we’re also going to do here.
MTTG: So that begs the question … this started off in 2007, 2008, when you were [launching] Facebook Connect. A lot of this stuff started very early, and I remember being at that event where you talked about this. Open and sharing, and it was helpful to growing your platform, obviously. Why wasn’t this done before? What’s in the mentality of the engineers at Facebook that you didn’t suspect this could be a problem?
Mark Zuckerberg: Well, I don’t think it’s engineers.
MTTG: Well, whatever. People [at Facebook].
Mark Zuckerberg: So, in 2007 we launched the platform.
Mark Zuckerberg: The vision, if you remember is to help make apps social.
Mark Zuckerberg: So, the examples we had were, you know, your calendar should have your friends’ birthdays. Your address book should have your friends’ pictures. In order to do that, you basically need to make it so a person can log into an app and not just port their own data over, but also be able to bring some data from their friends as well. That was the vision, and a bunch of good stuff got created. There were a bunch of games that people liked. Music experiences, things like Spotify. Travel, you know, things like Airbnb were using it. But there was also a lot of scammy stuff.
You know, there’s this values tension playing out between the value of data portability, right? Being able to take your data and some social data … To be able to create new experiences on the one hand, and privacy on the other hand, and just making sure that everything is as locked down as possible.
You know, frankly, I just got that wrong. I was maybe too idealistic on the side of data portability, that it would create more good experiences. And it created some, but I think the clear feedback from our community was that people value privacy a lot more. And they would rather have their data locked down and be sure that nothing bad will ever happen to it than be able to easily take it and have social experiences in other places. So, over time we have been just kind of narrowing it down. And 2014 was a really big …
MTTG: I get that. In 2014 you absolutely did that. But I’m talking about the … you know, and I’ve argued with [Facebook executives] about this, this anticipation of problems, of possible bad actors on this platform. Do you all have enough of that mentality, or do you not see … I want to understand what happens within Facebook that you don’t see that this is so subject to abuse. How do you think about that, and what is your responsibility?
Mark Zuckerberg: Yeah. Well, I hope we’re getting there. I think we remain idealistic, but I think we also understand what our responsibility is to protect people now. And I think the reality is that in the past we didn’t have a good enough appreciation of some of this stuff. And, some of it was that we were a smaller company, so some of these issues and some of these bad actors just targeted us less, because we were smaller. We certainly weren’t a target of nation states trying to influence elections back when we only had 100 million people in the community.
But, I do think part of this comes from these idealistic values of openness and data portability, things that I think the tech community holds really dear, but that are in some conflict with other values, like protecting people’s privacy, right? And a lot of the most sensitive issues that we face today are conflicts between real values, right? Freedom of speech and hate speech and offensive content. Where is the line, right? And the reality is that different people draw the line in different places, and we serve people in a lot of countries around the world, with a lot of different opinions on that.
MTTG: Right, so where’s your opinion right now? Sorry to interrupt.
Mark Zuckerberg: On that one specifically?
Mark Zuckerberg: You know, what I would really like to do is find a way to get our policies set in a way that reflects the values of the community, so I’m not the one making those decisions. Right? I feel fundamentally uncomfortable sitting here in California at an office, making content policy decisions for people around the world. So, there are going to be things that we never allow, right, like terrorist recruitment and … We do, I think, in terms of the different issues that come up, a relatively good job of making sure that terrorist content is off the platform. But things like, where is the line on hate speech? I mean, who chose me to be the person that …
MTTG: Well. Okay …
Mark Zuckerberg: I have to, because [I lead Facebook], but I’d rather not.
MTTG: I’m going to push back on that, because values are what we argue about. And companies have values; you know, The New York Times has a set of values that they won’t cross, and they make decisions. Why are you so uncomfortable making those value decisions? You run the platform. It is more than just a benign platform that is neutral. It just isn’t. I don’t know; we can disagree on that, and we obviously disagree on this. But why are you uncomfortable doing that?
Mark Zuckerberg: Well, I just want to make the decisions as well as possible, and I think that there is likely a better process, which I haven’t figured out yet. So, for now, it’s my job, right? And I am responsible for it. But I just wish that there were a way … a process where we could more accurately reflect the values of the community in different places. And then in the community standards, have that be more dynamic in different places. But I haven’t figured it out yet. So, I’m just giving this as an example of a tension that we debate internally, but clearly, until we come up with a reasonable way to do that, it is our job, and I’ll do as well at it as I can.
MTTG: I’m curious you talked about going back and trying to figure out if there were other developers that had used your API before 2014, and checking were there any other bad actors that maybe you guys missed at the time. I’m curious how you actually go about doing that, and if it’s actually possible at this point to go out and detect, you know, if someone collected data in 2012, if that data still exists.
Mark Zuckerberg: Well, the short answer is the data isn’t on our servers, so it would require us sending out forensic auditors to different apps. So, the basic process that we’ve worked out … and this is a lot of what we were trying to figure out over the last couple of days, and why it took a little while to get this post out … is, we do know all the apps that registered for the Facebook platform and all the people on Facebook who registered for those apps, and we have a log of the different data requests that each developer has made.
So, we can get a sense of what are reputable companies, what are companies that were doing unusual things … Like, that either requested data in spurts, or requested more data than it seemed like they needed to have. And, anyone who either has a ton of data, or something unusual, we’re going to take the next step of having them go through an audit. And, that is not a process that we can control, they will have to sign up for it. But, we’ll send in teams, who will go through their servers and just see how they’re handling data. If they still have access to data that they’re not supposed to, then we’ll shut them down and notify … and tell everyone whose data was affected.
But, this is a complex process. It’s not going to happen overnight. It’s going to be expensive for us to run, and it’s going to take a while. But look, given the situation here, where we had a developer that signed a legal certification saying that they deleted the data, and now, two years later, we’re back here and it seems like they didn’t, what choice do we have? It is our responsibility to our community to make sure that we go out and do this. So, even though it’s going to be hard, and not something that our engineers can just do sitting in their offices here, I still think we have to go do it.
MTTG: Did you ever think of doing these kinds of audits before 2014? Or, even when you got that signed contract from … or excuse me, signed statement I guess, from Cambridge Analytica, did you think, “Well, we need to actually go out and check to make sure that they’re telling us the truth.” Why didn’t you do this kind of stuff earlier, or did you think about doing this earlier?
Mark Zuckerberg: In retrospect, it was clearly a mistake. Right? The basic chronology here is: in 2015, a journalist from The Guardian pointed out to us that it seemed like the developer Aleksandr Kogan had shared or sold data to Cambridge Analytica and a few other firms. So, as soon as we learned that, we took down the app, and we demanded that Kogan, Cambridge Analytica and all the other folks give us formal, legal certification that they didn’t have any of the data. And, at the time, Cambridge Analytica told us not only that they didn’t have the data and had deleted it, but that they actually never got access to raw Facebook data. Right? So, what they said was that this app Kogan built was a personality quiz app, and instead of raw data they got access to some derived data, some personality scores for people. And, they said that they used it in some models, and it ended up not being useful, so they just got rid of it.
So, given that they said they never had the data and had deleted what derivative data they had, at the time it didn’t seem like we needed to go further on that. But look, in retrospect it was clearly a mistake. I’m explaining to you the situation at the time and the actions that we took, but I’m not trying to say it was the right thing to do. I think given what we know now, we clearly should have followed up, and we’re never going to make that mistake again.
I think we let the community down, and I feel really bad and I’m sorry about that. So, that’s why we’re going to go and do these broad audits.
MTTG: Alright, when you think about that idea of … it’s not exactly a “mistakes were made” kind of argument, but you are kind of making that idea. I want to understand what systems are going to be in place, but it’s sort of, you know, the horses are out of the barn door. Can you actually go get that data back from them? It’s everywhere, I would assume. I’ve been told by many, many people that a lot of companies have access to your data … I was thinking of companies like RockYou and all kinds of things from a million years ago that have a lot of your data. Can you actually get it back? I don’t think you can. I can’t imagine you can.
Mark Zuckerberg: Not always. But, you know the goal isn’t to get the data back from RockYou. You know people gave their data to RockYou. So RockYou has the right to have the data. What RockYou does not have the right to do is share the data or sell it to someone without people’s consent. And, part of the audits and what we’re going to do is see whether those business practices were in place, and if so we can kind of follow that trail and make sure that developers who might be downstream of that comply or they’re going to get banned from our platform overall.
So, it isn’t perfect. But I do think that this is going to be a major deterrent going backwards. I think it will clean up a lot of data, and going forward the more important thing is just preventing this from happening in the first place, and that’s going to be solved by restricting the amount of data that developers can have access to. So, I feel more confident that that’s going to work, starting in 2014 and going forward. So again, for the last few years already it hasn’t been possible for developers to get access to that much.
MTTG: Let me ask just two more quick questions.
Mark Zuckerberg: Alright, I’m talking to you while walking over there for Q&A.
MTTG: Alright, the cost of this and are you going to testify in front of Congress? And if so, when?
Mark Zuckerberg: You know, I’m open to doing that. I think that the way that we look at testifying in front of Congress is that … We actually do this fairly regularly, right? There are high profile ones like the Russian investigation, but there are lots of different topics that Congress needs and wants to know about. And the way that we approach it, is that our responsibility is to make sure that they have access to all the information that they need to have. So, I’m open to doing it.
MTTG: What is “open,” is that a “yes” or a “no”?
Mark Zuckerberg: Well.
MTTG: They want you, Mark.
Mark Zuckerberg: Well look, I am not 100 percent sure that’s right. But the point of congressional testimony is to make sure that Congress gets the data and information, in context, that it needs. Typically, there is someone at Facebook whose full-time job is to be focused on whatever the area is, whether it’s legal compliance or security. So, I think most of the time, if what they’re really focused on is getting access to the person who is most knowledgeable on that thing, there will be someone better. But I’m sure that someday there will be a topic on which I am the person with the most knowledge, and I would be happy to do it then.
MTTG: Mark, can you give us a sense of the timing and cost for this? Like, the audits that you’re talking about, is there any sense of how quickly you could do it and what kind of cost it would be to the company?
Mark Zuckerberg: I think it depends on what we find. But we’re going to be investigating and reviewing tens of thousands of apps from before 2014, and assuming that there’s some suspicious activity, we’re probably going to be doing a number of formal audits, so I think this is going to be pretty expensive. You know, one of the conversations we have been having internally on this is, “Are there enough people who are trained auditors in the world to do the number of audits that we’re going to need quickly?” But, I think this is going to cost many millions of dollars and take a number of months, and hopefully not longer than that, to get fully complete.
MTTG: Okay, last question Mark, and then you can go. How badly do you think Facebook has been hurt by this, and you yourself, the reputation of Facebook?
Mark Zuckerberg: I think it’s been a pretty big deal. The number one thing that people care about is privacy and the handling of their data. You know, if you think about it, the most fundamental thing about our services, whether it’s Facebook or WhatsApp or Instagram, is this question of, “Can I put content into it?” Right? Whether it’s a photo, or a video, or a text message. And, will that go to the people I want to send it to, and only those people? And, whenever there is a breach of that, it undermines the fundamental point of these services. So, I think it’s a pretty big deal, and that’s why we’re trying to make sure we fully understand what’s going on, and make sure that this doesn’t happen again. I’m sure there will be different mistakes in the future, but let’s not make this one again.
MTTG: Yes, let’s not. Okay, Mark, I really appreciate you talking to us. I know you now have to talk to your employees …
Mark Zuckerberg: I’m walking into my Q&A now, alright, see ya.
Below is the full statement released by Zuckerberg Wednesday:
I want to share an update on the Cambridge Analytica situation — including the steps we’ve already taken and our next steps to address this important issue.
We have a responsibility to protect your data, and if we can’t then we don’t deserve to serve you. I’ve been working to understand exactly what happened and how to make sure this doesn’t happen again. The good news is that the most important actions to prevent this from happening again today we have already taken years ago. But we also made mistakes, there’s more to do, and we need to step up and do it.
Here’s a timeline of the events:
In 2007, we launched the Facebook Platform with the vision that more apps should be social. Your calendar should be able to show your friends’ birthdays, your maps should show where your friends live, and your address book should show their pictures. To do this, we enabled people to log into apps and share who their friends were and some information about them.
In 2013, a Cambridge University researcher named Aleksandr Kogan created a personality quiz app. It was installed by around 300,000 people who shared their data as well as some of their friends’ data. Given the way our platform worked at the time, this meant Kogan was able to access tens of millions of their friends’ data.
In 2014, to prevent abusive apps, we announced that we were changing the entire platform to dramatically limit the data apps could access. Most importantly, apps like Kogan’s could no longer ask for data about a person’s friends unless their friends had also authorized the app. We also required developers to get approval from us before they could request any sensitive data from people. These actions would prevent any app like Kogan’s from being able to access so much data today.
In 2015, we learned from journalists at The Guardian that Kogan had shared data from his app with Cambridge Analytica. It is against our policies for developers to share data without people’s consent, so we immediately banned Kogan’s app from our platform, and demanded that Kogan and Cambridge Analytica formally certify that they had deleted all improperly acquired data. They provided these certifications.
Last week, we learned from The Guardian, The New York Times and Channel 4 that Cambridge Analytica may not have deleted the data as they had certified. We immediately banned them from using any of our services. Cambridge Analytica claims they have already deleted the data and has agreed to a forensic audit by a firm we hired to confirm this. We’re also working with regulators as they investigate what happened.
This was a breach of trust between Kogan, Cambridge Analytica and Facebook. But it was also a breach of trust between Facebook and the people who share their data with us and expect us to protect it. We need to fix that.
In this case, we already took the most important steps a few years ago in 2014 to prevent bad actors from accessing people’s information in this way. But there’s more we need to do, and I’ll outline those steps here:
First, we will investigate all apps that had access to large amounts of information before we changed our platform to dramatically reduce data access in 2014, and we will conduct a full audit of any app with suspicious activity. We will ban any developer from our platform that does not agree to a thorough audit. And if we find developers that misused personally identifiable information, we will ban them and tell everyone affected by those apps. That includes people whose data Kogan misused here as well.
Second, we will restrict developers’ data access even further to prevent other kinds of abuse. For example, we will remove developers’ access to your data if you haven’t used their app in 3 months. We will reduce the data you give an app when you sign in — to only your name, profile photo, and email address. We’ll require developers to not only get approval but also sign a contract in order to ask anyone for access to their posts or other private data. And we’ll have more changes to share in the next few days.
Third, we want to make sure you understand which apps you’ve allowed to access your data. In the next month, we will show everyone a tool at the top of your News Feed with the apps you’ve used and an easy way to revoke those apps’ permissions to your data. We already have a tool to do this in your privacy settings, and now we will put this tool at the top of your News Feed to make sure everyone sees it.
Beyond the steps we had already taken in 2014, I believe these are the next steps we must take to continue to secure our platform.
I started Facebook, and at the end of the day I’m responsible for what happens on our platform. I’m serious about doing what it takes to protect our community. While this specific issue involving Cambridge Analytica should no longer happen with new apps today, that doesn’t change what happened in the past. We will learn from this experience to secure our platform further and make our community safer for everyone going forward.
I want to thank all of you who continue to believe in our mission and work to build this community together. I know it takes longer to fix all these issues than we’d like, but I promise you we’ll work through this and build a better service over the long term.