A Sting in 68 Likes

When we use social networks, search engines or other online services on our computers or smartphones, we leave behind a long digital trail.

A LOGO of Twitter is pictured next to the logo of Facebook. (photo credit: REUTERS/DADO RUVIC/ILLUSTRATION/FILE PHOTO)
The Facebook and Cambridge Analytica scandal should serve as a wake-up call to us all. According to suspicions, Cambridge Analytica, which was hired by the Trump campaign ahead of the presidential election, made use of personal data taken from around 50 million Facebook users to analyze the behavior of American voters in order to influence their political views.
When we use social networks, search engines or other online services on our computers or smartphones, we leave behind a long digital trail that can be used to create a fairly accurate profile of our personalities.
Until now, we knew that this profile was being used to personalize advertisements for us, and we have become accustomed to seeing adverts that reflect our search queries, posts we have uploaded onto social networks, or even conversations we have had while the microphone of our smartphone was on. But this data breach proves that it is possible to learn far more about us. For example, by reviewing an average of 68 “likes” made by any given user on Facebook, it is possible to accurately predict a range of personality characteristics about the user, ranging from sexual orientation to political outlook.
Any attempt to minimize the dangers inherent in the use of personal data to create individual personality profiles must be a combined effort. As citizens, we should improve our digital awareness and take a more critical stance toward the digital services we use.
When we use the services provided by technology companies such as Amazon, Google, Facebook and others, we hand over information about ourselves that can be used to identify the personality strengths and weaknesses of each and every one of us – not just in order to personalize advertising that will influence what we buy and where we eat, but also to affect our views and our beliefs. For example, if an analysis of my likes and search queries shows that I suffer from anxiety, then I can be targeted with election propaganda that plays on that attribute, showering me with terrifying possible future scenarios.
Continued exposure to such content over time may be used to persuade me to vote for a particular candidate as a “savior” figure, though this would not have been my rational choice had my levels of anxiety not been raised. This type of use can even affect those of us who claim to have “nothing to hide” and are therefore unconcerned about protecting their privacy in the digital environment.
This is no longer about monitoring users in order to show them advertisements. This is about sophisticated psychological warfare being waged against us without our knowledge. All this, even before mentioning the effective use that could be made of this information by terrorist organizations or hostile entities.
Therefore, we need to check what information is being collected about us through different applications and avoid using those that collect more data about users than is needed for their operation.
This may not seem an effective strategy, but as the current Facebook episode shows, making data breaches public knowledge (that is, requiring Facebook to notify the public when their privacy is breached) may in itself be enough to force Facebook or other giant corporations to change the way they treat our data. In this way, we can bring about a change in the existing app market and encourage technology developers to be sensitive to user privacy as well.
For its part, the state must revise the outdated privacy protection law and demand that technology companies improve the security and oversight of the personal data they collect. For example, it could be made a legal requirement for app developers to adopt a privacy-by-design strategy and embed organizational and technological safeguards at the development stage, and subsequently during the data collection and use stages, to ensure proper protection of the right to privacy. Furthermore, it is possible to require companies to use personal data only for the time required and for the purpose stated to the user when downloading the app, and to delete the data from their servers once this purpose has been attained.
This type of combined effort – by citizens and the state – would make it possible to continue to develop and use new information technology while also protecting democracy and freedom of choice, and would limit the dangers inherent in the use of data to influence our social and political outlooks. Now is the time to act.
The author is an attorney and a researcher in the Democracy in the Information Age program at the Israel Democracy Institute.