
Framing the “Big Data Industry”

For all its hype, discussions about Big Data often still devolve into debates about buzzwords and concepts like business intelligence, data analytics, and machine learning. Hidden in each of these terms are important privacy and ethical considerations. A recent article by Kirsten Martin in MIS Quarterly Executive attempts to bring these considerations to the surface by moving past framing Big Data as merely some business asset or computational technique. Instead, Martin suggests analyzing risks and rewards at a macro-level by looking at the entire Big Data ecosystem, which she terms the Big Data Industry (BDI).

Yes, her paper still largely focuses on the negative impacts of Big Data, but instead of offering a general sense of doom and gloom, she provides a systematic analysis of where the data industry faces specific challenges. Though the article is peppered with examples of privacy-invading headlines, like Target’s purported ability to predict pregnancy, her framing is particularly helpful because it largely divorces the “risks” posed by Big Data from individualized company practices, anecdotes, and hypotheticals. Instead, she describes the entire Big Data information supply chain, from upstream data sources to downstream data uses. Consumer-facing firms, tracking companies, and data aggregators — or data brokers — work together to exchange information and add value to different data sources.

Martin breaks down the different negative effects that can impact individuals at different points in the supply chain. She highlights some of the existing concerns around downstream uses of Big Data. For example, she notes that both incorrect and correct inferences about individuals could limit individuals’ opportunities, encourage consumer manipulation, and ultimately be viewed as disrespectful of individual concerns. While these sorts of Big Data harms have long been debated, Martin places them on a spectrum alongside concerns raised by upstream suppliers of data, including poor data quality, biases in the data, and privacy issues in the collection and sharing of information. Analogizing to how food providers have become responsible for everything from labor conditions to how products are farmed, she argues that Big Data Industry players, by choosing and creating supply chains, similarly become “responsible for the conduct and treatment of users throughout the chain.”

By looking at Big Data as one complete supply chain, Martin appears to believe it will be easier for members of the Big Data Industry to identify and monitor economic and ethical issues in the supply chain. Yet problems also exist across this nascent industry. Even if we can effectively understand data supply chains, Martin is perhaps more concerned with the systemic issues she sees in the BDI. Specifically, the norms and practices currently being established throughout the entire data supply chain give rise to “everyone does it” ethical questions, and the BDI, in particular, poses two pivotal ethical considerations.

First, data supply chains may create negative externalities, especially in aggregate. Air pollution, for example, can become a generalized societal problem through global warming, and the harm from actions across the manufacturing industry can be considerably greater than the pollution caused by any individual company. Martin posits that the Big Data Industry presents a similar dynamic, wherein every member that captures, aggregates, or uses information creates costs to society in the form of surveillance. By contributing to a “larger system of surveillance” and by frequently remaining invisible and out of sight to individuals, the BDI may be generating an informational power imbalance. Perhaps because individual companies within the BDI fail to see themselves as part of a larger data ecosystem, few have been in a position to account for, or even consider, the possibility that their data practices may give rise to such a negative externality.

Second, the Big Data Industry may foster “destructive demand” for consumer-facing companies to collect and sell increasing amounts of consumer data with lower standards. According to Martin, demand can become destructive (1) when a primary market that promises a customer-facing relationship becomes a front for a secondary market, (2) when the standards and quality of the secondary market are lower than those of the primary market, and (3) when consumer-facing companies have limited accountability to consumers for their transactions and dealings in the secondary market. Martin sees a cautionary tale for the BDI in the recent mortgage crisis and the role that mortgage-backed securities played in warping the financial industry. She warns that problems are inevitable as the buying and selling of consumer data becomes more important than “selling an application or providing a service.” Invoking the specter of the data broker boogeyman, Martin argues that consumer-facing organizations lack accountability for their activities in the secondary data market, particularly so long as consumers remain in the dark as to what is going on behind the scenes in the greater BDI.

So how can the Big Data Industry address these concerns? Martin places much of her faith in the hope that organizations with “unique influence,” like the Census Bureau, as well as “providers of key products within the Big Data Industry, such as Palantir, Microsoft, SAP, IBM,” can help shape sustainable industry practices moving forward. These practices would embody a number of different solutions under the rubrics of data stewardship, data integrity, and data due process. Many of the proposals under the first two amount to endorsing additional transparency mechanisms. For example, publicly linking companies through a larger supply chain could create “a vested interest in ensuring others in the chain uphold data stewardship and data due process practices.”

Data due process, on the other hand, would help firms “internalize the cost of surveillance.” Additional internal oversight and due process procedures would, according to Martin, “increase the cost of holding individualized yet comprehensive data and internalize the cost of contributing to surveillance.” As to what these mechanisms could look like, Martin points to ideas like consumer subject review boards, a concept first popularized at a Future of Privacy Forum event two years ago and one we have continued to expand upon. Her call for data integrity professionals mirrors the notion of “algorithmists” who could monitor not just the quality of upstream data sources but also downstream data uses. (As an aside, she chastises business schools, which, even as they race to train Big Data professionals, do not require business students to take courses in ethics.) Effective ethical reviews would require such professionals, who could potentially mitigate some of the risks inherent in the data supply chain.

While Martin’s proposals are not a panacea, industry and regulators alike should take her suggestions seriously. Her framing of a greater Big Data Industry provides a path forward for companies — and regulators and watchdogs — to better target their efforts to promote public trust in Big Data. She has identified places in the information supply chain where certain industry segments may need to get “more skin in the game,” so to speak. And, at the very least, Martin has moved Big Data from amorphous buzzword to a full-fledged ecosystem with some shape to it.

-Joseph Jerome, Policy Counsel

Customer Privacy and the National Labor Relations Act

Last month, an Administrative Law Judge for the National Labor Relations Board ruled that Macy’s employee handbook contained overly broad confidential information policies. The decision continues efforts by the NLRB to police employer confidentiality policies, but it also demonstrates how industry efforts to protect privacy can inadvertently run afoul of Section 7 of the NLRA, which gives employees various rights to organize and engage in collective bargaining.

In this instance, Macy’s employee handbook repeatedly emphasized the importance of taking the privacy of fellow employees and customers seriously. It explained that the company holds “personal data of its present and former associates, customers and vendors,” and that the company is committed to using this information in a way that respects privacy. As a result, Macy’s required all employees with access to this personal data to protect it against unauthorized use, disclosure, or access. The ALJ found that the handbook’s repeated invocations about protecting information would make employees reasonably believe they were restricted from discussing certain terms and conditions of their employment, which runs contrary to the NLRA.

Yet whenever personal information is shared, privacy issues arise. Cases like this pose a clash between competing values, and moving forward, it will be important for the NLRB to recognize that protecting employee organizing rights may come at a cost to employees’ privacy expectations, and vice versa. As this case demonstrates, employee rights can also challenge the privacy rights of customers and vendors.

What makes this decision so problematic from a privacy perspective is that the ALJ also found Macy’s “restrictions on the use of information regarding customers and vendors” violated the NLRA. In certain circumstances, that may be true, but here, the ALJ largely relied on a single NLRB decision, Trinity Protection Services, Inc., 357 NLRB No. 117 (2011), which found that “employees’ concerted communications regarding matters affecting their employment with their employer’s customers or with other third parties, such as governmental agencies, are protected by Section 7 and, with some exceptions not applicable here, cannot lawfully be banned.”

Trinity was a very different case. It involved a dispute between a security contractor and newly hired guards. During their training, the guards came to believe Trinity Protection Services used inadequate security measures and threatened to notify the Department of Homeland Security, which had contracted Trinity, about the inadequacies. Trinity told the guards any information they received was confidential and any disclosure would violate a non-disclosure agreement. This sort of policy was found unlawful because it “inhibit[ed] employees from bringing work-related complaints to, and seeking redress from, entities other than the Respondent.”

This case could also be distinguished on the grounds that it extends Trinity’s protection of communications to a whole host of private customer information. The ALJ does concede that his concerns about Macy’s restrictions on the use of customer information should apply “to a lesser extent,” but this case demonstrates how legitimate efforts to protect privacy can push up against labor laws.

As David Katz explains, the decision should “concern employers, as the provisions attacked here are likely quite similar to other employers’ policies. Employers should be mindful of the Board’s recent crusade against overbroad handbook provisions, and should review their policies—including those not typically associated with NLRB scrutiny (such as confidentiality and privacy policies)—with an eye towards the Board’s recent rulings.”

Beyond the burdens this decision could place on companies, it also highlights the need for the NLRB and its ALJs to be aware of the privacy implications of their decisions. Certain types of information sharing may indeed be necessary to protect employee rights, but let’s be clear: companies need to stress to their employees the importance of taking privacy seriously. With new data breaches making headlines daily, it’s important to remember that human error can be the biggest threat to data privacy and security. For that matter, employee snooping can also present serious privacy concerns.

The NLRB’s priority is no doubt to protect workers, but it is important to consider how the privacy of customers, and indeed of other employees, can be impacted by the disclosure and sharing of information. We would encourage the NLRB, as it attempts to rein in employer policies, to consult with privacy experts and to understand more deeply the need for clear employee rules and guidelines around personal information.

-Joseph Jerome, Policy Counsel

Future of Privacy Forum Releases New Survey on Privacy and Trust Issues in the “Sharing Economy”

Whitepaper Examines Benefits and Challenges of Reputation Management in Peer-to-Peer Services and Provides an Overview of Market Leaders in Key Sharing Economy Sectors

WASHINGTON, D.C. – Monday, June 8, 2015 – As peer-to-peer services comprising the “Sharing Economy” continue to gain wide acceptance with U.S. consumers, the Future of Privacy Forum (FPF) today released a timely whitepaper that focuses on the reputational, trust and privacy challenges users and providers face concerning the management and accuracy of shared information.

Released in advance of a June 9 workshop focused on the sharing economy, the FPF paper, titled “User Reputation: Building Trust and Addressing Privacy Issues in the Sharing Economy,” comes at a time when the sharing economy, especially in the hospitality and transportation sectors, is growing in popularity at breakneck speed. The total value of global sharing economy transactions was estimated at $26 billion in 2013 and is projected to reach as much as $110 billion in coming years.

At the same time, consumers are recognizing the benefits of shared services: a recent study notes 86 percent of adults in the U.S. believe such services make life more affordable, while 83 percent believe they make life more convenient and efficient.

Sharing economy services – such as Uber, Airbnb, Etsy, and TaskRabbit, among others – rely heavily on online and mobile platforms for transactions and the peer-to-peer sharing of critical, ‘reputational’ information. This includes data regarding recommendations, ratings, profile access, review challenges, account deletion, and more. How access to and control of this data is managed by sharing economy brands and services is essential to building user trust, and has important privacy implications as well.

“Uber’s new option that provides riders with access to their ratings is an important step forward,” said Jules Polonetsky, FPF’s Executive Director. “If consumer access to services is dependent on ratings and reviews, consumers need transparency into their scores and into how these systems work.”

The FPF survey provides an overview of how reputation-building and trust are frequently essential assets to a successful peer-to-peer exchange, and how ratings, peer reviews, and user comments serve as core functions of such services. It examines the commonly used mechanisms to build reputation, as well as issues surrounding identity and anonymity, and the role of social network integration.

The highlight of the group’s study is a section entitled “Maintaining Reputation: Privacy Challenges of Rating Systems.” How sharing economy and peer-to-peer platforms implement Fair Information Practices concerning user-generated data, especially access and correction capabilities for users and providers, has tangible privacy implications.

As a result, FPF surveyed a number of market leaders in the sharing economy sectors of transportation (Lyft, Sidecar, Uber), hospitality (Airbnb, HomeAway, Couchsurfing), retail goods (Etsy, NeighborGoods, eBay), and general services (TaskRabbit, Instacart, Handy) to review how these platforms implement access and correction capabilities. Brands were surveyed to see how they implement access rights, correction and response mechanisms, and whether they provide clear guidance for deleting account information.

The report concludes with a call to action for many companies in the sharing economy marketplace, encouraging them to strive to provide more guidance to users about reputation and to be more transparent about access and control over information. Such moves will not only amplify consumer trust, but also help ensure fair treatment of consumers.

This is especially important for the future growth of the sharing economy sector, as the report notes:

“While platforms need to have good and reliable reputational systems in place in order to create trust between users, they will also have to ensure their users trust them. It is very likely that…users will rely on the platform’s reputation, in addition to user reputation alone.”

The survey was authored by FPF staffers Joseph Jerome, Benedicte Dambrine, and Ben Ambrose.

About Future of Privacy Forum

The Future of Privacy Forum (FPF) is a Washington, DC-based think tank that seeks to advance responsible data practices. The forum is led by Internet privacy experts Jules Polonetsky and Christopher Wolf and includes an advisory board comprising leading figures from industry, academia, law, and advocacy groups. For more information, visit www.futureofprivacy.org.

Media Contact

Nicholas Graham, for Future of Privacy Forum

fpfmedia@futureofprivacy.org

571-291-2967

Balancing Free Expression and Social Media Monitoring

Last week, central Florida’s largest school district announced that it would begin monitoring a number of social media sites for posts “that may impact students and staff.” As more and more school districts look to social media to monitor and track students, big privacy questions arise. Certainly, many schools have reacted to school shootings, student suicides, and bullying concerns by connecting with social-media-monitoring companies to help them identify problems for which school personnel, parents, or even law enforcement may need to take action. In fact, when tragedies have taken place, the first reaction has often been to scour social media to see whether there were clues that should have led to action or intervention.

Parents appear to have largely accepted this general practice, but the limits on schools’ tracking and monitoring of their students remain unclear. As Jules and I explored in an op-ed for Education Week last month, we don’t yet know how to strike the right balance between monitoring and tracking students and allowing individuals to vent, blow off steam, and otherwise freely express themselves online without feeling surveilled.

While this story deals largely with monitoring students, the public at large holds contradictory and conflicting views about social listening. According to a 2013 Netbase study, 51% of consumers want to be able to talk about companies without those companies listening, yet 58% want companies to respond to complaints and 64% want companies to respond when spoken to. To avoid the notorious “creepy” label, schools — and indeed any organization — ought to be open and transparent about why they’re listening and what they’re listening for.

We hope to explore this issue further, and we welcome any thoughts and feedback from anyone out there . . . listening.

-Joseph Jerome, Policy Counsel

Talking Cars and the Internet of Things at TRUSTe’s IoT Privacy Summit


Future of Privacy Forum is excited to partner with TRUSTe to provide attendees with a full day of case studies, workshops, and panels at the second IoT Privacy Summit on June 17th in Menlo Park, California. This year’s Summit focuses on practical solutions to the privacy challenges brought on by the Internet of Things, with sessions on key FPF priorities like connected cars, smart cities and homes, wearable devices, and more.

FPF’s Joseph Jerome will participate in a panel titled “How the Automobile Industry Took the Lead in Industry Self-Regulation,” along with representatives from General Motors and Hogan Lovells. The panel will discuss how car makers came together to address privacy issues head-on as vehicles become increasingly connected — and data-fueled. The group will also discuss how a set of automotive privacy principles was developed, and what industry is doing to implement the principles ahead of their 2016 start date. Click here to view a current list of other speakers.

Ahead of the Summit, on June 16th, FPF will also participate in the IoT Privacy Tech Working Group. The group will meet to identify both the technical standards and best practices necessary to help enhance consumer privacy in the IoT. More information about the IoT Privacy Summit 2015 is available here.

Most importantly, to register, click here. We look forward to discussing the Internet of Things next month!

