Could you put a figure on the value of your data?
You’ll likely have a good idea of the value of some of your most personal information: how much is in your bank account, how much you pay in taxes, what your business invoices look like, your customer relationship data. But what about Big Data?
Thanks to digital applications and wearables, there’s a good chance you’ll also have information on how your body is working: how well you sleep, your heart rate, how hard you’re training, how many calories you’re eating, how far you’ve run.
The collection of data in our day-to-day lives – the so-called “Internet of Things” – has transformed how we live, work, play and consume. Whether it’s figures on which customers are most influenced by an advert or data on optimising your workout, we’ve grown accustomed to measuring every aspect of our lives. Collecting the information is relatively simple, but there are harder moral questions over who should be allowed access to it.
Facebook is now one of the world’s most comprehensive customer relationship management tools. Users willingly supply demographic information, defining themselves by what they Like, where they visit, who they interact with and – thanks to Facebook reactions – what they are angered, saddened, shocked or enamoured by.
This is a dream come true for us digital marketers, who can now target our products and services with laser-sighted accuracy. Do you think your service will play well with single 25-year-old fans of Game of Thrones living in Ipswich? Has focus group feedback shown that you play well to forty-something divorcees with a completely unrelated interest in football management simulators? You can target those exact groups with your posts.
For some, the realisation that they’ve opted in to targeted advertising is horrifying. It wasn’t so long ago that films such as Minority Report showcased a seemingly distant future in which commercials are tailored to each individual. We’ve all heard stories like that of the woman who received adverts for newborn baby products before she knew she was pregnant. The future arrived without our noticing.
While that may seem like the sign of an impending Skynet-inspired apocalypse, it’s actually very sensible. Targeting may seem dehumanising, but it’s simply a way of making sure that the adverts you see are relevant to your interests. If we accept that adverts are required to pay for the content we consume, it’s surely better to keep them relevant. Big Consumer Data held by Big Companies helps them target those adverts; it’s far from sinister. Data is a vital competitive advantage, so it tends to be closely guarded.
The role, reach and responsibilities of government are less clear. Should the authorities have the right to examine your health statistics? What about your web purchasing history? How often you exercise? What your personal eating or sleeping patterns are like? All of this information is likely to be held on the modern smartphone. There are brewing battles over who should be able to access this information.
A new report from Facebook suggests that there were over 45,000 requests for profile information from authorities in the second half of 2015, with over 30,000 of those made under “gag orders” prohibiting the company from informing individuals that their data was being passed on. This is not census data or police records; this is information held by a publicly listed company.
There have already been high-profile cases testing this moral dilemma. In the wake of the San Bernardino shootings last year, the US Government requested access to a secure phone belonging to one of the perpetrators, in the hope of gathering information for national security. In an unprecedented move, Apple refused to grant access, with much of Silicon Valley voicing its support. The disagreement only avoided the courts when the authorities found an alternative way of accessing the data.
The case, described by FBI director James Comey as the “hardest question I have ever seen in government”, is a moral dilemma straight out of dystopian sci-fi. The authorities want a “back door key” to access information that may help them prevent serious crime or acts of terror. Meanwhile, the digital companies argue that this would set a dangerous precedent for civil liberties and leave their devices vulnerable to hackers; once there’s a back door key, there has to be a back door. Phone data is far more than contacts, messages and emails. Every aspect of our lives is held in our hands, much of it in the form of metrics.
Your data is a commodity, and the right to see it may just become one of the most difficult civil liberties questions of the 21st century.