In recent years, information gleaned from the analysis of large data sets has led to scientific innovation, critical research, and advances that benefit society. Commercially, data mining and analytics have become a key aspect of efficient marketing and production practices. In addition, data collection and processing have become an important part of protecting consumers against fraud and other digital dangers. While these increased collection and processing capabilities have served to showcase the value of data, advancements in data mining, often termed “Big Data,” have also created new privacy concerns.
How can we maximize the benefits of data collection and retention while confronting its challenges to individual privacy? Through projects on big data, de-identification, and consumer wearables, FPF is exploring the intersection between the benefits of data and its privacy implications.
The Future of Privacy Forum’s big data work highlights the increasingly challenging intersection among data analytics, privacy, and ethics. In 2014, the Obama Administration conducted a comprehensive “Big Data Review,” which explored how data can be used to generate new insights across a range of fields, from health care to the environment, while emphasizing the potential risks of data to discriminate and undermine equal opportunity.
In support of this effort, FPF recently released a whitepaper that encourages organizations to engage in a more thorough “data benefit analysis” to consider how the benefits of big data can be achieved. Currently, FPF is engaged in a project exploring the development of “Consumer Subject Review Boards” to evaluate ethical and privacy issues raised by novel uses of information.
In the era of big data, the debate over the definitions of personal information, de-identification, and re-identification has never been more important. Privacy regimes often rely on data being considered personal in order to require the application of privacy rights and protections. Data that is anonymous is considered free of privacy risk and available for public use.
Yet much data that is collected and used exists somewhere on a spectrum between these stages. FPF’s De-ID Project seeks to describe a practical framework for applying privacy restrictions to data based on the nature of data that is collected, the risks of de-identification, and the additional legal and administrative protections that may be applied. Important questions FPF hopes to consider include:
- What weight should be given to non-technical factors, such as legal commitments not to make data public or not to attempt to re-identify data?
- What weight should be given to the impact of de-identification techniques on the utility of data?
- What status should be accorded to linkable or pseudonymous data?
The FPF Consumer Wellness & Wearables Working Group comprises industry leaders and stakeholders from across the consumer wellness and wearables ecosystems. Participants include leading mobile operating systems, wearables manufacturers, healthcare organizations, app developers, chipmakers, researchers, and others focused on consumer-generated health and wellness data.
The goal of the working group is to build on best practices that support consumer trust, as well as to develop responsible guidelines for appropriate research and other secondary uses of such data. This will enable companies to address consumer and advocate concerns and demonstrate accountability. Ultimately, by shining a light on responsible privacy practices we hope to ensure continued innovation and consumer trust within the wearables ecosystem.
Some key issues for this group to address include: sharing consumer health-related data with researchers and third parties, including employer wellness programs; advertising and marketing restrictions; appropriate notice and consent mechanisms; access, correction, and deletion rights for health-related data; and data security standards.