
Breaking Down Big Data: Legal Concerns of Data Monetization

This article is the third part of the “Breaking Down Big Data” series, written by members of the ACC big data subcommittee. The first article of this series provided an overview of select laws that govern the de-identification of personal information. The second article focused on the practical challenges of meeting de-identification standards. In this article, we discuss the legal issues and concerns surrounding data monetization.

According to a paper from IBM, 90 percent of the data in the world today has been created in the last two years, reaching the staggering amount of 2.5 quintillion bytes of data created per day. This number is certain to grow as the number of connected devices increases.

Big data is becoming essential to the development of data-driven technologies, and the European Union’s digital single market strategy recognizes the potential of big data as a driver of the economy and innovation.

Organizations are also seeking to capitalize on this growing trend, since data analysis can help businesses better adapt themselves to ever-changing market needs, create web content that will draw more visitors, or gain insight into buying behaviors of customers. For example, the growing number of “smart devices” (also known as the Internet of Things) means that location data and other data generated from sensors, smart meters, and mobile devices can now be gathered and collated against other databases, thereby increasing the value of both data sources.


It is therefore no surprise that more and more businesses are realizing that data is an asset that can be sold, like any other product. The Center for Information Systems Research (CISR) at MIT Sloan defines data monetization as “the act of exchanging information-based products or services for legal tender or something of perceived equivalent value.”[1]

While some commentators have recommended against selling any data at all, most notably because of reputational concerns, this does not appear to be the market trend. Some organizations are seeing untapped value in the data they hold. According to an article published in the Harvard Business Review, “even data that may seem trivial to your business, when seen longitudinally over time, could be a prism of insight for another company.” Before rushing to monetize that data, however, organizations should carefully weigh the social, legal, and ethical considerations at stake, including the privacy and data protection rights of the affected individuals.

In March 2017, the European Parliament passed a non-legislative resolution on the fundamental rights implications of big data. The resolution stresses that “the prospects and opportunities of big data” can only be realized “when public trust in these technologies is ensured by a strong enforcement of fundamental rights and in compliance with current EU data protection law.”

One issue raised in the resolution is the risk of “algorithmic discrimination”: for example, price discrimination, where consumers are offered different prices for a product based on data collected from their previous internet behavior; unlawful discrimination against, and targeting of, groups or persons defined by their race, color, ethnic or social origin, religion, or political views; or the denial of social benefits, education, or employment opportunities. The MEPs highlighted the need for greater accountability and transparency of algorithms with regard to data processing and analytics, and warned that low-quality data or low-quality procedures could result in biased algorithms.
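To make the price discrimination concern concrete, the following toy sketch shows how a pricing function could quietly charge two consumers different amounts for the same product based solely on their browsing history. The page names, markups, and function are invented for illustration only:

```python
# Hypothetical illustration of the pricing practice the resolution warns
# about; all names and figures here are invented for this example.

BASE_PRICE = 100.0

def personalized_price(browsing_history):
    """Return a price adjusted by signals inferred from past browsing.

    Visits to premium pages are treated as a proxy for higher willingness
    to pay -- the kind of "algorithmic discrimination" described above.
    """
    markup = 0.0
    if any("luxury" in page for page in browsing_history):
        markup += 0.15  # inferred high willingness to pay
    if any("discount" in page for page in browsing_history):
        markup -= 0.10  # inferred price sensitivity
    return round(BASE_PRICE * (1 + markup), 2)

# Two consumers see different prices for the same product:
print(personalized_price(["/luxury-watches", "/home"]))  # 115.0
print(personalized_price(["/discount-codes", "/home"]))  # 90.0
```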


The second issue to consider is privacy and data protection. For example, the European legal framework for data protection applies to the processing of personal data in big data operations. Currently, European Directive 95/46/EC of 24 October 1995 relating to the processing of personal data applies, together with related EU rules, ensuring the protection of individuals by providing them with specific rights that cannot be waived.

Beginning May 25, 2018, the European General Data Protection Regulation (GDPR) will apply. It provides that “the data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her” (article 22 of the GDPR).

The GDPR defines profiling in Article 4(4) as “any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements.”

The Article 29 Data Protection Working Party, which is composed of representatives of the national supervisory authorities from the EU member states, has recently clarified that “simply assessing or classifying individuals based on characteristics such as their age, sex, and height could be considered profiling, regardless of any predictive purpose.” Accordingly, a data broker would be carrying out “profiling” according to the GDPR simply by placing a person into a certain category according to their clients’ interests, even if there is no predictive purpose.
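To illustrate how low this bar is, consider the minimal sketch below, in which the segment names and rules are invented. Even a rule-based classification with no predictive model at all evaluates personal aspects of a natural person and can therefore amount to profiling under article 4(4):

```python
# Hypothetical sketch: rule-based segmentation with no model and no
# prediction can still be "profiling" within the meaning of Art. 4(4).

def assign_segment(age, sex):
    """Place an individual into a marketing category from basic attributes."""
    if age < 30:
        return "young-urban"
    if sex == "F":
        return "established-female"
    return "general"

# Per the Article 29 Working Party, classifying a person this way can
# already constitute profiling, regardless of any predictive purpose.
print(assign_segment(age=25, sex="M"))  # "young-urban"
```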

In addition, article 35(3)(a) of the GDPR requires a data protection impact assessment (DPIA) to be carried out to assess the risks involved in automated decision-making. This includes profiling, since a DPIA is mandatory in the case of “a systematic and extensive evaluation of personal aspects relating to natural persons which is based on automated processing, including profiling, and on which decisions are based that produce legal effects concerning the natural person or similarly significantly affect the natural person.”

The GDPR also grants individuals a number of rights. For example, a company should inform the individual about the purposes for which a profile is used (article 14(1)(c) of the GDPR) and the source from which it obtained the information (article 14(2)(f) of the GDPR). The company must advise the individual of their right to object to processing, including profiling, for direct marketing purposes (article 21(2) of the GDPR). The individual has the right to access the information relating to them, for example the segments or categories in which they have been placed (article 15 of the GDPR), to correct any erroneous information (article 16 of the GDPR), and, in certain circumstances, to erase the profile or the personal data used to create it (article 17 of the GDPR).
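For engineering teams, those rights translate into concrete operations over stored profiles. The following is a minimal, hypothetical sketch of how articles 15 through 17 might map onto code; the storage layer and field names are invented for illustration:

```python
# Hypothetical mapping of Art. 15-17 rights onto operations over a profile
# store. A real system would add authentication, logging, and audit trails.

profiles = {"user-42": {"segments": ["young-urban"], "source": "data broker X"}}

def access(user_id):
    """Art. 15: return the profile, including the segments assigned."""
    return profiles[user_id]

def rectify(user_id, field, value):
    """Art. 16: correct erroneous information in the profile."""
    profiles[user_id][field] = value

def erase(user_id):
    """Art. 17: erase the profile in the circumstances the GDPR allows."""
    del profiles[user_id]
```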

Additionally, the increase in data flows implies further vulnerabilities and new security challenges, as evidenced by the data breaches regularly making headlines. To address these risks, organizations should consider safeguards such as de-identification and anonymization techniques (see the first article of this series for further details), encryption, and privacy by design and by default.
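As one illustration of such safeguards, the sketch below shows keyed hashing (HMAC), a common pseudonymization technique. The secret key and its management are hypothetical, and keyed hashing alone does not make data anonymous, since anyone holding the key can re-identify individuals:

```python
# Minimal pseudonymization sketch using keyed hashing (HMAC-SHA256).
# The key below is a placeholder: in practice it must be generated
# securely, rotated, and stored separately from the pseudonymized data.
import hashlib
import hmac

SECRET_KEY = b"rotate-and-store-me-separately"  # hypothetical key management

def pseudonymize(identifier):
    """Return a stable pseudonym for a direct identifier such as an email."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

print(pseudonymize("alice@example.com"))  # same input -> same pseudonym
```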


Recommended best practices:

To manage compliance with legal and regulatory requirements, consider implementing the following best practices:

  • Conduct an assessment of all data activities that may qualify as “profiling” and determine the applicable legal basis: (i) consent, (ii) necessity for the entry into or performance of a contract, or (iii) authorization by law (a simple register along these lines is sketched after this list);
  • Identify any decision that relates to sensitive data or children; in both cases, bear in mind that further analysis may be needed;
  • To the extent that consent is likely to be required, identify the most appropriate mechanism for obtaining it and how to deploy it in practice; and
  • Assess applicable works council agreements and begin negotiations as soon as possible for any changes in data collection, use, or retention needed to align with the GDPR.
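As a starting point for the first two items, a profiling register could be as simple as the following hypothetical sketch, in which all field and activity names are invented for illustration:

```python
# Hypothetical register for the assessment in the first bullet: each
# profiling activity is recorded with its legal basis and the flags that
# trigger further analysis or a DPIA under Art. 35(3)(a).
from dataclasses import dataclass
from enum import Enum

class LegalBasis(Enum):
    CONSENT = "consent"
    CONTRACT = "entry into or performance of a contract"
    LAW = "authorized by law"

@dataclass
class ProfilingActivity:
    name: str
    legal_basis: LegalBasis
    involves_sensitive_data: bool  # triggers further analysis
    involves_children: bool        # triggers further analysis
    dpia_completed: bool           # where Art. 35(3)(a) requires one

register = [
    ProfilingActivity("customer segmentation", LegalBasis.CONSENT,
                      involves_sensitive_data=False, involves_children=False,
                      dpia_completed=True),
]
```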

[1] Anne Buff, Barbara Wixom, and Paul Tallon, Foundations for Data Monetization, MIT Sloan CISR Working Paper, 17 August 2015, available here (login required). See also Barbara Wixom, Cashing In on Your Data, MIT Sloan CISR Research Briefing, 21 August 2014, available here (login required).

About the Author

Paul Lanois is vice president and senior legal counsel at Credit Suisse.

 

Views expressed are entirely his own and are not necessarily those of his employer.



The information in any resource collected in this virtual library should not be construed as legal advice or legal opinion on specific facts and should not be considered representative of the views of its authors, its sponsors, and/or ACC. These resources are not intended as a definitive statement on the subject addressed. Rather, they are intended to serve as a tool providing practical advice and references for the busy in-house practitioner and other readers.