With all the palaver about Facebook and Cambridge Analytica in the media, I found myself wondering exactly what had been done that was so wrong. Sure, they collected data on millions of Facebook users and used it to target those users with adverts, but this is what Facebook does. We all know this. What seems to have made the difference this time is that there was some kind of security “breach” and the data may have been used to encourage people to vote for Trump.
I had a hunt around and found most of the press more interested in the politics and the personalities than in the facts of what actually happened. However, an article in Wired covers the facts, so I’ll précis it here and throw in my own interpretation.
Like many “platform” providers, Facebook provide normal access for Joe Public and a second “interface” for developers, usually known as an Application Programming Interface (API). One example of its use is when you see a poll on Facebook: you answer the questions and the information is channelled back to the organisation which developed the poll. It’s worth noting here that many other companies besides Facebook do this too. I am continually irritated by iPhone apps which demand to know my location and to read my contact list even though this is irrelevant to their function (I usually decline).
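To make the idea concrete, here is a minimal sketch of how an app with an API account might request a user’s profile data. The host name, endpoint shape and field names are purely illustrative, not Facebook’s actual Graph API surface; the point is simply that a developer token plus a short request yields structured user data.

```python
# Hypothetical sketch of a platform API request, not Facebook's real API.
from urllib.parse import urlencode

def build_profile_request(base_url, user_id, fields, access_token):
    """Construct the URL an app would use to fetch a user's profile fields."""
    query = urlencode({"fields": ",".join(fields), "access_token": access_token})
    return f"{base_url}/{user_id}?{query}"

url = build_profile_request(
    "https://graph.example.com/v2.0",  # placeholder API host
    "user123",                         # placeholder user id
    ["name", "likes"],                 # in the early API, friends' likes were reachable too
    "APP_TOKEN",                       # placeholder developer access token
)
print(url)
# → https://graph.example.com/v2.0/user123?fields=name%2Clikes&access_token=APP_TOKEN
```

The crucial point for the story that follows is what the platform lets such a request reach: before 2014, answering a quiz could expose not just your own “likes” but many of your friends’ as well.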
In November 2013 Aleksandr Kogan, a psychology professor at the University of Cambridge, applied for an API account on Facebook explaining that he wanted access to Facebook’s data for research. Kogan then created a personality quiz and collected responses from 270,000 Americans. Back then the Facebook API was very open and by agreeing to take part in the quiz users also gave access to many of their Facebook friends’ “likes” and interests as well. Kogan ended up obtaining information for roughly 87 million people at the last count!
About five months after Kogan obtained this data, Facebook tightened up security so that developers could not see information on friends; however, this was not finalised for about a year, and Kogan already had the data anyway. Kogan then sold the data to Cambridge Analytica in violation of his agreement with Facebook.
In 2015 The Guardian newspaper published a story saying that American Republican politician Ted Cruz had used Facebook to get information on users, but this story didn’t get a lot of attention. However, Facebook reacted by obtaining written assertions from Christopher Wylie, the Director of Research at Cambridge Analytica, and from Aleksandr Kogan that the data had been deleted. It seems that these assertions were false and that the data had been used in PR campaigns by Cambridge Analytica.
So, what are we to make of this?
It seems to me that it was Kogan who broke the law by selling the data to Cambridge Analytica and thereby violating his agreement with Facebook. But wait! Aren’t companies which subscribe to the Facebook API allowed to use the data to target messages? Surely this is the point of it? Perhaps Facebook don’t allow companies to resell the data, but in that case Cambridge Analytica could have set up the access themselves. It’s all a bit vague. Facebook are also to blame for security which was so lax in the first place, and for allowing their data to pass to Kogan so freely without any checks on his work. Those multi-page agreements one assents to when signing up to a site are absurdly inadequate and, in my view, should not be acceptable in a court of law.
Cambridge Analytica, Wylie and Kogan are also responsible, as they appear to have lied about the data being deleted. Wylie claims that he was under pressure from more senior Cambridge Analytica figures, but he was a Director himself, so it’s hard to see how he can avoid responsibility.
If we’re fair we should remember that when all this whiz-bang digital media emerged, few of us foresaw the downside. We raved about the opportunities, and new PR and marketing companies were formed to take advantage of Facebook and other digital media.
Laws on personal data have not kept pace with technology, and it is only in 2018 that the first law which really seeks to place a legislative framework around personal data, the GDPR, comes into force. I know of many organisations which are nowhere near ready and will remain non-compliant for some time to come.
We should also consider this in a broader context. The general tendency now is to become apoplectic over data privacy, but in the past this was just not an issue. I recall when the scandal broke about journalists dialling into people’s mobile phone voicemail to replay their messages. Yes, it was immoral, but up until then I doubt anyone would have considered it criminal. It is only now that we understand how information technology can be abused that our morals and laws are changing.
Democracies have evolved various mechanisms to keep the public informed. In the 19th century we had newspapers, in the 20th century television. In the 21st century it looked as though digital media would allow us to bypass the powerful and distribute information freely. The reality is now dawning. Just as the rich and the powerful fought for control of newspapers and television they are now fighting to control digital media.