Shaken Trust: The Aftermath of the Cambridge Analytica Scandal
You have probably heard the story: a data analytics company called Cambridge Analytica harvested more than 50 million Facebook profiles without the users' knowledge and built a system that could target each of them with the political advertisements most likely to work on their psychological profile.
As it turns out, hundreds of millions of the social network's users may have had their data harvested without their consent, and the practice had been going on for years – Facebook had been warned about Cambridge Analytica as early as 2015, but only suspended the company's access this year. The data was allegedly used to sway the results of the 2016 US election and the Brexit vote, among others.
The information
What exactly was the information the company harvested from Facebook?
Well, there's no way of knowing for sure, but data scientists have made an educated guess based on Facebook's Graph API version 1.0, in use until 2014. The list includes your birthday, location, hometown, check-ins, interests, likes, games, notes, and political and religious views – probably even how many times you visited websites like onlinecasinoreview.co.nz looking for some no-strings-attached gaming. What makes things worse, the company collected similar data about the people on your friend list, without their consent.
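To make that concrete, here is a minimal sketch in Python of the kind of request a third-party quiz app could make under Graph API v1.0. It is illustrative only: the access token is a placeholder, and the exact field and permission names are assumptions based on public descriptions of the v1.0 API, not a verified reproduction of what Cambridge Analytica ran.

```python
# Illustrative sketch of a Graph API v1.0 data pull (not a verified
# reproduction). Field names and permissions are assumptions based on
# public descriptions of the pre-2014 API.
import requests

GRAPH_V1 = "https://graph.facebook.com/v1.0"
ACCESS_TOKEN = "<user-access-token>"  # placeholder obtained via the app's OAuth flow

# Profile fields an app could request once the user granted permissions
# such as user_birthday, user_likes, and user_religion_politics.
FIELDS = "birthday,location,hometown,interests,likes,religion,political"

# Pull the profile of the user who installed the app.
me = requests.get(
    f"{GRAPH_V1}/me",
    params={"fields": FIELDS, "access_token": ACCESS_TOKEN},
).json()

# Crucially, v1.0 also exposed similar data for the user's friends
# (via friends_* permissions) -- people who never installed the app
# and never consented to anything.
friends = requests.get(
    f"{GRAPH_V1}/me/friends",
    params={"fields": FIELDS, "access_token": ACCESS_TOKEN},
).json()

print(me)
print(f"Friends exposed: {len(friends.get('data', []))}")
```

The key design flaw this sketch illustrates is the second call: one consenting user could open a window onto their entire friend list, which is how a few hundred thousand quiz takers ballooned into tens of millions of harvested profiles.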
The aftermath
The immediate aftermath of the scandal breaking out was a global uproar against Facebook. High-profile brands like Tesla and Playboy deleted their Facebook pages, along with celebrities like Will Ferrell, Cher, and Farhan Akhtar and the English trip-hop band Massive Attack – and the list will probably continue to grow.
As another painful result, Facebook has lost $100 billion in market value since this February, most of it after the Cambridge Analytica story broke. And it has become the bogeyman of the month.
What will change?
Facebook's Mark Zuckerberg has acknowledged the service's share of the blame and issued a formal apology. But admitting that things went wrong is just the start – something has to be done to prevent similar events from happening in the future. Looker Chief Data Evangelist Daniel Mintz had the following to say about the incident:
"Technologists, in general, need to grapple with this. We're building tools that can quite clearly be used for evil or for less good. It's not enough to say, 'We didn't mean for them to be used this way.' We need to have a community discussion of what does ethical technology use look like and people can subscribe to them or not."
Facebook has to go beyond updating its user agreement. While it should still make anonymized data available to academic researchers, it will have to rework its contracts with the companies that use its data, making them far stricter and far more transparent. And this is just the necessary first step in a long chain of events that may one day lead to a more transparent future.