Big Data Demands Big Picture Thinking, Part 2

In Big Data, Data Management by IRM UK


Current discussions around big data continue to focus either on specific business (usually marketing) benefits or technology platforms. However, they often skirt the bigger issues raised by this pervasive and rapidly evolving environment. Here, I paint the bigger picture; part 1 focused on business reinvention and disruption. Part 2 homes in on issues of privacy.

Concerns about the privacy of personal big data, and its use in surveillance by governmental agencies, have been well aired (although largely unresolved) since Edward Snowden’s dramatic revelations beginning in June 2013. By contrast, thinking and real debate about the privacy of personal data collected and used by commercial organisations have been remarkable largely for their absence.
In essence, a business exists—ideally, perhaps—to satisfy, to the best of its ability, its customers’ and society’s needs through the products and/or services it develops and offers. An implicit agreement exists between the parties that, in order to best meet those customer needs, the business must understand something about the customers themselves, their uses of and opinions about existing products, and so on. Furthermore, customers also agree to receive (hopefully relevant and timely) marketing information about the business’ offerings. Both activities necessitate that the customer (or prospect) willingly relinquishes some measure of privacy in return for better products or service. Since the 1980s, this agreement has been broadly shaped by various privacy codes based on declaration of information usage by the business and the customer’s ongoing and informed consent.
With the advent of big data and, in particular, the smartphone as harbinger of the Internet of Things, both transparency of use and ongoing, informed consent have been seriously compromised. Transparency of use is almost non-existent: data brokers now gather thousands of measurable attributes about unsuspecting consumers to build mailing lists and scoring algorithms, enabling targeted marketing that is often largely indistinguishable from blatant discrimination. And as anybody who has ever tried to install a smartphone app without accepting every data collection permission knows, consent has been reduced to a formality. Indeed, as marketing moves to always-on and location-aware, the idea of ongoing consent may become a distant memory.
Addressing privacy concerns requires consideration of both business and technology. On the business side, monetisation models that fund “free” services through targeted advertising are particularly prone to abuse of users’ privacy. Of course, Internet behemoths like Google and Facebook are highly dependent on this approach and owe much of their success to it. But at what cost to personal privacy, which lies at the heart of democracy? At a more detailed level, the ethics of collecting and using particular types of data must be considered. What are the potential negative implications of holding particular data about people? Even if you have data, or the ability to combine existing data for new insights, that does not mean you should use it. Above all, you must be transparent about the planned uses of the personal data you collect, and avoid using data that has been gathered by dubious means.
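In practice, restricting use of personal data to purposes the customer actually agreed to can be made an explicit, checkable gate in code rather than a policy document. The sketch below is a minimal illustration with an entirely hypothetical schema (the record fields, purpose names, and helper function are my own invention, not from the article); it shows the general idea of filtering records by consented purpose before any processing happens.

```python
# Minimal sketch (hypothetical schema): gate every use of personal data
# on the purpose the individual consented to, not on mere possession.
from dataclasses import dataclass, field


@dataclass
class CustomerRecord:
    customer_id: str
    email: str
    # Purposes the customer explicitly consented to, e.g. {"service", "marketing"}
    consented_purposes: set = field(default_factory=set)


def records_usable_for(records, purpose):
    """Return only the records whose owners consented to this purpose."""
    return [r for r in records if purpose in r.consented_purposes]


customers = [
    CustomerRecord("c1", "a@example.com", {"service", "marketing"}),
    CustomerRecord("c2", "b@example.com", {"service"}),
]

# Only c1 may receive marketing email; both may receive service notices.
print([r.customer_id for r in records_usable_for(customers, "marketing")])
```

The design point is that consent is stored per purpose and enforced at every point of use, so adding a new use of the data forces an explicit consent check rather than silently reusing what was collected for something else.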
From a technology viewpoint, it is clear that strong data security is a sine qua non for even basic protection of privacy. Observe that when data from multiple sources is combined, the context offered by one source may expose behaviours obscured in another. At the extreme, some researchers maintain that anonymised data in one source can be de-anonymised relatively easily when combined with as few as three other data sets. IT is also responsible for ensuring that data is managed and used in accordance with widely varying local privacy laws; such considerations must be reflected right back into the logical architecture of corporate data warehouses and BI systems. The legal and financial consequences of noncompliance can be severe: for example, the proposed new EU privacy regulation would allow fines of up to 5% of global revenue or €100m. Furthermore, best legal and ethical practice suggests that the use of personal data must be regulated according to individual privacy preferences.
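The re-identification risk described above can be made concrete with a tiny sketch. All of the data and field names below are hypothetical; the technique is the classic linkage attack, in which an “anonymised” data set (names removed, but quasi-identifiers such as postcode, birth date, and gender retained) is joined against a public data set sharing those same attributes.

```python
# Hedged illustration (hypothetical data): a linkage attack re-identifies
# "anonymised" records by joining on quasi-identifiers shared with a
# public data set such as an electoral roll.

# An "anonymised" data set: names removed, quasi-identifiers kept.
anonymised = [
    {"zip": "02138", "dob": "1945-07-31", "sex": "F", "diagnosis": "hypertension"},
    {"zip": "02139", "dob": "1962-03-12", "sex": "M", "diagnosis": "asthma"},
]

# A public data set containing the same quasi-identifiers plus names.
public = [
    {"name": "A. Example", "zip": "02138", "dob": "1945-07-31", "sex": "F"},
    {"name": "B. Sample", "zip": "02139", "dob": "1962-03-12", "sex": "M"},
]


def link(anon, pub, keys=("zip", "dob", "sex")):
    """Match anonymised records to named ones on the quasi-identifier tuple."""
    index = {tuple(p[k] for k in keys): p["name"] for p in pub}
    matches = {}
    for rec in anon:
        key = tuple(rec[k] for k in keys)
        if key in index:
            matches[index[key]] = rec["diagnosis"]
    return matches


print(link(anonymised, public))
```

With only two records the match is trivial, but the same join works at scale whenever the quasi-identifier combination is unique or near-unique per person, which is precisely why simply dropping names does not constitute anonymisation.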
Privacy is a topic whose application is viable at the level of individual businesses and their IT staff. In the final part of this series, I’ll focus on an issue with broader scope of impact and resolution: the potential economic and social consequences of analytics and automation enabled by big data.

Image via ‘Privacy’ by Zabou, Chance Street, London, 2014

Read Part 3 of this article series here.

Barry Devlin, 9sight Consulting.

Barry will be delivering multiple sessions at the Enterprise Data & BI Conference 2015, 2-5 November.
