Better late than never!
Or is it a case of “Too Late the Hero”?
Well, it took Mark Zuckerberg five long days to speak out on the raging controversy surrounding the alleged misappropriation of Facebook’s user information by British political consultancy firm Cambridge Analytica.
In a 937-word Facebook post, the billionaire CEO has laid out a series of corrective measures to secure user data, while taking full responsibility for everything that takes place on the social networking site.
“I started Facebook, and at the end of the day I’m responsible for what happens on our platform. I’m serious about doing what it takes to protect our community,” wrote Zuckerberg.
“We have a responsibility to protect your data, and if we can’t then we don’t deserve to serve you,” he said.
All this, however, is somewhat contrary to what his vice president and General Counsel, Paul Grewal, said over the weekend. Grewal’s statement appeared to be a blatant attempt to absolve Facebook of all responsibility for the data leak in question.
“The claim that this is a data breach is completely false,” Grewal had said. “Aleksandr Kogan requested and gained access to information from users who chose to sign up to his app, and everyone involved gave their consent. People knowingly provided their information, no systems were infiltrated, and no passwords or sensitive pieces of information were stolen or hacked.”
While Zuckerberg’s version is not much different from Grewal’s, he has at least owned up to lapses on Facebook’s part and pledged to take the steps necessary to address privacy and security issues, in an attempt to make Facebook a breach-proof platform, if you will.
But, before Zuckerberg went on to outline those steps, he tried to explain the company’s position on what had actually transpired between his company and Cambridge Analytica – the subject of the ongoing crisis.
According to Zuckerberg, Aleksandr Kogan – a Cambridge University professor and researcher – developed an app and used it on Facebook to gather information from some 300,000 users under the pretext of a personality test. The users voluntarily shared their own data along with their friends’ data, thereby giving Kogan access to tens of millions of Facebook accounts.
“In 2013, a Cambridge University researcher named Aleksandr Kogan created a personality quiz app. It was installed by around 300,000 people who shared their data as well as some of their friends’ data. Given the way our platform worked at the time this meant Kogan was able to access tens of millions of their friends’ data,” Zuckerberg wrote in his post.
However, in a clear breach of trust, Kogan shared not only the data of the willing participants but also the personal information of their friends with Cambridge Analytica, which, in turn, used the information to unfairly benefit Donald Trump’s 2016 presidential campaign.
Zuckerberg further said that the company made significant changes to the platform in 2014 to rein in abusive apps, by limiting the amount of information such offending programs could access.
“In 2014, to prevent abusive apps, we announced that we were changing the entire platform to dramatically limit the data apps could access,” reads the post.
“Most importantly, apps like Kogan’s could no longer ask for data about a person’s friends unless their friends had also authorized the app. We also required developers to get approval from us before they could request any sensitive data from people. These actions would prevent any app like Kogan’s from being able to access so much data today,” he wrote, explaining the steps taken at the time.
However, the horse had already bolted with the personal data of 50 million users, before Mr. Zuckerberg got down to locking the stable door.
And, here’s a harsh reminder of that: Donald Trump did become, and is, the President of the United States of America.
According to Zuckerberg, it wasn’t until 2015 that the company came to know from The Guardian that the data collected by Kogan somehow found its way to Cambridge Analytica, after which, the company put pressure on Kogan and Analytica to provide written guarantees that all user information had been wiped clean from their records.
“In 2015, we learned from journalists at The Guardian that Kogan had shared data from his app with Cambridge Analytica,” reads Zuckerberg’s post.
“It is against our policies for developers to share data without people’s consent, so we immediately banned Kogan’s app from our platform, and demanded that Kogan and Cambridge Analytica formally certify that they had deleted all improperly acquired data. They provided these certifications.”
However, the latest developments suggest that the data may not actually have been deleted by the parties concerned, which, again, would be a breach of the formal certifications they gave Facebook in 2015.
“Cambridge Analytica claims they have already deleted the data and has agreed to a forensic audit” by a company Facebook has hired to investigate the matter, Zuckerberg said.
“We’re also working with regulators as they investigate what happened,” he added.
The Facebook CEO acknowledged that not only was this a “breach of trust between Kogan, Cambridge Analytica and Facebook,” but also a violation of the trust between the company and the people who expect Facebook to protect their data. “We need to fix that,” he asserted.
Here are the key points of his plans to prevent “bad actors from accessing people’s information.”
One: Investigate old apps that had access to huge amounts of data and thoroughly audit any app showing questionable activity. Developers who do not agree to the audit, or who are found to have “misused personally identifiable information,” will be banned from the platform.
Two: Apps that a user has not accessed for three or more months will be blocked from accessing that user’s information. Also, the data an app can request at sign-in will be limited to username, profile photo, and email address. Any developer seeking more information will be required to get prior approval and sign a contract.
Three: The Facebook privacy settings tool, which allows users to revoke apps’ access to their data, will now appear at the top of the News Feed, making it easier for users to keep track of the apps they have allowed to access their data.
While these steps may prevent new apps from pulling off another Cambridge Analytica, it “doesn’t change what happened in the past,” said the beleaguered CEO.
“We will learn from this experience to secure our platform further and make our community safer for everyone going forward,” he pledged – and we hope.