Your bank buys your social network profile and turns down your loan, because the patterns in the data do not fit the lending profile it wants. A prospective employer buys your social network data to check your pattern for a match against the culture and values specified by the company's leadership. When? Now.
I do not share my social network profile publicly, but that does not prevent analysis by the social network's engineers, working to criteria supplied by a data analytics company that has handed over my email address among many others (encrypted) to be searched against its database of everybody's patterns. Only the 'hits' are then selected further for a job, a loan, marketing, or suspicion of a crime; the rest are never viewed. Within the criteria, a range of validities and predictive power may apply. If a search for murderers in a specific area has low predictive validity, it may still help save someone's life, with the few false positives excluded on corroboration; false positives carry less weight against true positives in that case. But when a job is refused under low predictive validity, those false positives are potential employees turned down. Jobs are offered to fewer people, those with a predicted higher profit value, based on the attributes and associations you gave away by engaging with and posting data on the social network platform, in addition to your financial record. Or possibly these are more predictive than your financial record.
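The matching step described above can be sketched in a few lines. This is a minimal illustration, not any platform's actual protocol: it assumes, hypothetically, that both sides hash normalised email addresses with SHA-256 and intersect the results, so raw addresses are never exchanged; real systems use keyed hashes and more elaborate matching.

```python
import hashlib

def hash_email(email: str) -> str:
    """Normalise and hash an email address so it can be matched
    without exchanging the raw address (illustrative only)."""
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

# The analytics company submits hashed addresses of the people to screen.
submitted = {hash_email("alice@example.com"), hash_email("bob@example.com")}

# The platform hashes its own user records the same way; invented records.
platform_users = {
    hash_email("alice@example.com"): {"associations": 412},
    hash_email("carol@example.com"): {"associations": 97},
}

# Only the 'hits' go forward for further profiling; the rest are never viewed.
hits = submitted & platform_users.keys()
print(len(hits))  # one match
```

The point is how little the two parties need to share for the selection to work: a list of hashes in, a list of hits out.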
Let’s not do this. In Philip K. Dick’s story ‘The Minority Report’, the minority report is the vision of one of three clairvoyants who does not see the predicted crime while the other two do see the pattern. The minority report is discarded and the person is jailed for pre-crime despite the lack of consensus. That is essentially what corporations are trying to do, because predictability increases profit, despite the consequences of false positives. That level of predictability could have a significantly negative effect on society, not to mention the false positives: people discriminated against because of an algorithm.
In another example, the police can search for the patterns of people who may be psychopaths, perhaps because a raised incidence of murder is associated with that attribute. Profiling: profiling of whole lives, looking for particular attributes and patterns of association with others. Criteria are set with a low threshold for the likelihood of the behaviour occurring, which by default raises checks against everyone in that group. Many psychopaths do no criminal harm, even if they can create havoc and suffering all around them. Intrusive policing; intrusive marketing; intrusive and discriminatory personnel selection. A large quantity of information about your attributes and your associations with others provides a fingerprint of behaviours that indicates what kind of person you are – to an employer. You give them your email address and it is simple. But the potential consequences are worse still.
How bad could it get? When we work for a company we work for a psychopath: the company is not its directors but a set of objectives and capabilities that follow the money. In Joel Bakan’s The Corporation, Robert D. Hare, a University of British Columbia psychology professor and consultant to the FBI, uses the diagnostic criteria in the DSM-IV to compare the profile of the contemporary profitable business corporation to that of a clinically diagnosed psychopath. http://en.wikipedia.org/wiki/The_Corporation_(film)
OK, let me state it clearly, because I am finding it hard to believe that we are doing this: psychopaths and sociopaths are able to choose the kinds of people they need to build a team that exploits opportunities, people and the environment for profit.
Still not concerned? An area I have researched is geospatial patterns. The movements of your mobile phone as you go about your daily life are a fingerprint, and the way those movements interact with those of others defines you and the way you live even more concretely than your patterns of association on social media. If you do something that connects with activities of interest (an opposition political group, for example) then it will show. We can pinpoint on the map where the good people are and where the persons of interest are, where they are going, and whom they will meet. We know where to place the road blocks in order to control society strictly. Easily.
What have we walked into?
We will only discover the extent to which analytics are affecting our lives some time after the grid has been welded in place – perhaps up to five years after this seismic shift in the relationship between customers and corporations, citizens and state, enters public awareness.
Privacy can be achieved, for example, by using what is known as homomorphic encryption http://www.npr.org/blogs/alltechconsidered/2013/12/13/250737120/a-movement-to-bake-online-privacy-into-modern-life-by-design . Patterns are analysed while still encrypted, and if a suspicious pattern of terrorist activity is spotted, a court order can permit decryption of the names and addresses of the suspects; when weighing the predictive power of the pattern against the seriousness of the crime, most applications for a warrant would never even be made. The point is that proportionality, and the risks of minority reports, would be tested in court.
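To give a flavour of how computation on encrypted data works, here is a toy sketch of the Paillier cryptosystem, an additively homomorphic scheme (this is my illustrative choice, not the scheme in the linked article, and the tiny primes make it hopelessly insecure): anyone can combine ciphertexts, and the result decrypts to the sum of the hidden values, so aggregates can be computed without exposing individual records.

```python
from math import gcd
from random import randrange

# Toy Paillier keypair. Real keys use primes thousands of bits long.
p, q = 293, 433
n = p * q
n2 = n * n
g = n + 1
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)  # lcm(p-1, q-1)

def L(x):
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)  # private decryption constant

def encrypt(m):
    """Encrypt integer m (0 <= m < n) with fresh randomness r."""
    r = randrange(1, n)
    while gcd(r, n) != 1:
        r = randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

# Homomorphic property: multiplying ciphertexts adds the plaintexts.
c = (encrypt(12) * encrypt(30)) % n2
print(decrypt(c))  # 42
```

The analyst who multiplies the ciphertexts never sees 12 or 30; only the key holder, after the court order in the scenario above, can decrypt.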
And back to corporations being able to choose the people they like, rather than the people who fulfill the job description: that concerns me far more even than state intrusions into privacy. We are setting up a system in which psychopaths can gain an even tighter grip on the world.
Privacy is something we need as a Magna Carta right.