Businesses that make money by collecting and selling detailed records of private lives were once plainly described as “surveillance companies”. (Edward Snowden)
“In total, we believe the Facebook information of up to 87 million people – mostly in the US – may have been improperly shared with Cambridge Analytica.” With these words, Facebook’s chief technology officer, Mike Schroepfer, admitted to the massive data breach that affected Facebook users around the world. The scandal was exposed by a whistleblower, Christopher Wylie, a data scientist from Cambridge Analytica, who took the story to The Guardian and The New York Times. Cambridge Analytica used the data to “play with the psychology of an entire country without their consent or awareness (…) in the context of the democratic process,” said Wylie.
Before explaining how this happened and what we, as civil society, can do about it, I believe it is important to highlight that the business of personal data is nothing new. Activists and civil society have long warned of the danger posed by social media platforms and their core business: “Facebook makes money, in other words, by profiling us and then selling our attention to advertisers, political actors and others. These are Facebook’s true customers, whom it works hard to please.” Perhaps what makes the Cambridge Analytica case so critical is that we have only just realised the massive power and value of our data and our privacy.
The Guardian and The New York Times reported that Cambridge Analytica used personal data from a Facebook app called thisisyourdigitallife, built by the academic Aleksandr Kogan in 2013, to create psychological profiles and manipulate voters through political propaganda. The app was a personality test that collected personal data not only from the people who accepted its terms but also from their Facebook friends, without their consent or awareness. This explains how an app accepted by 265,000 people could gather the personal data of tens of millions. Wylie revealed that Kogan sold this information to Cambridge Analytica in violation of Facebook’s policies. The data included likes, shares, status updates and even private messages.
Although many people around the world have started a campaign calling on others to delete their Facebook profiles (#DeleteFacebook), I do not believe that will solve the problem. Many people use Facebook regularly to communicate with friends, colleagues or family; they rely on it as part of their job or education; and, moreover, Facebook is interconnected with many other apps and social media platforms. That is why deleting Facebook is not a solution.
In the era of surveillance capitalism, there is no single, magical solution. As I said above, we have to understand that Facebook and Cambridge Analytica exist within an opaque ecosystem built to gather personal data, which companies can then use to profile people for political and other purposes. The problem, as some activists have already said, is not Facebook (or Cambridge Analytica) itself; it is their core business. Privacy is a human right, and we have to regulate this market in order to prevent our privacy (and other public values, such as democracy) from being sacrificed to companies’ interests.
Firstly, companies have a corporate responsibility to understand the impact of their products on users’ human rights before bringing them to market. In this case, Facebook waited more than two years before revealing this unprecedented data harvesting and did not notify the affected users; this shows an enormous recklessness with regard to its duty to prevent and mitigate the adverse effects of its core business. Global companies like Facebook have an international obligation to prevent and mitigate harm, and to provide redress when anyone’s human rights are violated as a consequence of their actions.
Secondly, we have to understand that contractual terms are not enough. There is usually no such thing as equal and informed consent in the terms and conditions of most apps. As a US congresswoman highlighted during the Zuckerberg hearing, app terms and conditions are typically ambiguous, vague and hard for average people to understand. Companies should be more transparent about their operations (for example, explaining the difference between “sell my data” and “use my data for ad targeting”) and, in particular, about how their opaque algorithms work. That way, people could decide, from an equal and informed position, whether they really want to be part of Facebook, Instagram or any other platform.
Finally, we have to bear in mind that, beyond regulation, this is also a political matter. It is necessary to empower average people to demand technological solutions that respect human rights; to support civil society organisations demanding better regulatory solutions; and to spread the message that the Cambridge Analytica scandal is a human rights (and democracy) matter. Facebook stands to lose profits under greater constraints and supervision. This is why, in response to the question from Congresswoman Anna Eshoo, “Are you willing to change your business model in the interest of protecting individual privacy?”, Mark Zuckerberg answered: “Congresswoman, I’m not sure what that means.”
Photo: facebook celular 3 by Esther Vargas via Flickr (CC BY-SA 2.0)