
Call for Papers: Beyond IRBs

CALL FOR PAPERS

Beyond IRBs: Designing Ethical Review Processes for Big Data Research

In the age of Big Data, innovative uses of information are continuously emerging in a wide variety of contexts. Increasingly, researchers at companies, not-for-profit organizations, and academic institutions use individuals’ personal data as raw material for analysis and research. For research on data subject to the Common Rule, institutional review boards (IRBs) provide an essential ethical check on experimentation. Still, even academic researchers lack standards for the collection and use of online data sources, and data held by companies or by organizations that do not receive federal funding is not subject to such procedures. As a result, research standards for data vary widely. Companies and non-profits have faced public criticism and may elect to keep research results confidential to avoid public scrutiny or potential legal liability.

To prevent unethical data research or experimentation, experts have proposed a range of solutions, including the creation of “consumer subject review boards,”[1] formal privacy review boards,[2] private IRBs,[3] and other ethical processes implemented by individual companies.[4] Organizations and researchers are increasingly encouraged to pursue internal or external review mechanisms to vet, approve, and monitor data experimentation and research. However, many questions remain concerning the desirable structure of such review bodies as well as the content of the ethical frameworks governing data use. In addition, considerable debate lingers around the proper role of consent in data research and analysis, particularly in an online context, and it is unclear how to apply basic principles of fairness to the populations selected for research.

To address these challenges, the Future of Privacy Forum (FPF) is hosting an academic workshop supported by the National Science Foundation, which will discuss ethical, legal, and technical guidance for organizations conducting research on personal information. Authors are invited to submit papers for presentation at a full-day program to take place on December 10, 2015. Successful submissions may address the following issues:

  • What are the key ethical questions surrounding data-driven research and consumer testing? Should analysis of online data be treated differently than subject testing generally? Which issues warrant additional review?
  • What lessons can be learned from existing institutional review boards (IRBs), including both academic panels and standalone private entities? Which features of existing IRB structures are applicable to corporate and non-profit big data research?
  • Which existing government or industry practices offer insights into how researchers can approach ethical questions for Big Data?
  • How should organizations structure an ethical review process? Which substantive requirements and principles should govern it? How should organizations ensure independent review? Could review mechanisms be wholly internal or must external stakeholders or committees be involved?
  • What is the proper scope of review? While IRBs currently focus on human-subject testing, Big Data raises a different set of concerns centered on subsequent analysis and novel uses of data. Which factors are essential for consideration in an ethical process?
  • What is the proper role of consent in a data-driven research environment?

Papers for presentation will be selected by an academic advisory board and published in the online edition of the Washington and Lee Law Review. Four papers will be selected to serve as “firestarters” for the December workshop, and each of their authors will receive a $1,000 stipend.

Submissions must be 2,500 to 3,500 words, with minimal footnotes and in a readable style accessible to a wide audience.

Submissions must be made no later than October 25, 2015, at 11:59 PM ET, to papersubmissions@futureofprivacy.org. Publication decisions and workshop invitations will be sent in November.


 

[1] Ryan Calo, Consumer Subject Review Boards: A Thought Experiment, 66 Stan. L. Rev. Online 97 (2013).

[2] White House Consumer Privacy Bill of Rights Discussion Draft, Section 103(c) (2015).

[3] Jules Polonetsky, Omer Tene, & Joseph Jerome, Beyond the Common Rule: Ethical Structures for Data Research in Non-Academic Settings, 13 Colo. Tech. L. J. 333 (2015).

[4] Mike Schroepfer, CTO, Research at Facebook (Oct. 2, 2014), http://newsroom.fb.com/news/2014/10/research-at-facebook/.

Beyond the Common Rule: IRBs for Big Data and Beyond?

In the wake of last year’s news about the Facebook “emotional contagion” study and the subsequent public debate about the role of A/B testing and ethical concerns around the use of Big Data, FPF Senior Fellow Omer Tene participated in a December symposium on corporate consumer research hosted by Silicon Flatirons. This past month, the Colorado Technology Law Journal published a series of papers that emerged out of the symposium, including “Beyond the Common Rule: Ethical Structures for Data Research in Non-Academic Settings.”

“Beyond the Common Rule,” by Jules Polonetsky, Omer Tene, and Joseph Jerome, continues the Future of Privacy Forum’s effort to build on the notion of consumer subject review boards first advocated by Ryan Calo at FPF’s 2013 Big Data symposium. It explores how researchers, increasingly in corporate settings, are analyzing data and testing theories using often sensitive personal information. Many of these new uses of PII are simply natural extensions of current practices and fall either within the expectations of individuals or within the bounds of the FIPPs. Yet many of these projects could involve surprising applications or uses of data that exceed user expectations, and offering notice and obtaining consent may not be feasible.

This article expands on ideas and suggestions put forward in connection with the recent discussion draft of the White House Consumer Privacy Bill of Rights, which espouses “Privacy Review Boards” as a safety valve for non-contextual data uses. It explores how existing institutional review boards within the academy and for human-subject research could offer lessons for guiding principles, providing accountability, and enhancing consumer trust, and it offers suggestions for how companies and researchers alike can pursue both knowledge and data innovation responsibly and ethically.

The Future of Privacy Forum intends to continue the conversation about Big Data review boards. Joseph Jerome will be leading a panel discussion on the topic at the IAPP’s fall Privacy Academy, and FPF will be hosting an invite-only workshop this winter with leading researchers, ethicists, and corporate policymakers to address how to build an ethical framework for Big Data research.

Click here to read “Beyond the Common Rule: Ethical Structures for Data Research in Non-Academic Settings.”

Peter Swire on Encryption and Mandated Access

Senate Committee on the Judiciary

Questions for the Record from Senator Grassley

To: Peter Swire

Huang Professor of Law and Ethics

Scheller College of Business

Georgia Institute of Technology

  1. Global Competitiveness

In my opening statement, one of the concerns I expressed was that, in considering solutions to the “Going Dark” problem, we carefully consider the impact on the global competitiveness of American technology companies. You testified that if U.S. companies were required to give U.S. law enforcement access to encrypted communications and devices, U.S. companies would find themselves at a disadvantage in the global marketplace. Yet it appears that countries like the United Kingdom, France, and China are considering laws that would move in this direction.

  1. Do you agree that these foreign governments may be moving in this direction? If so, how would the global competitiveness of U.S. companies be damaged if foreign governments mandate the same sort of access?

 

Swire: I agree that other countries have been considering laws concerning mandated access. My view is that the position of the United States government is highly relevant to the likelihood of other countries adopting such laws, especially for close allies such as the United Kingdom and France. If the United States were to mandate access legally, which I hope it will not, my view is that the U.S. decision would substantially increase the likelihood of such laws being enacted by our allies. By contrast, if the United States maintains the status quo of no such mandates, then that fact becomes an important and relevant precedent against enactment of such measures by our allies.

I believe the U.S. position would also have a large impact on other countries around the world, especially authoritarian or dictatorial regimes that would like to use mandated access to clamp down on political dissent, religious activity, and other activities. If the U.S. resists mandates, then U.S.-based technology companies have a much greater ability to resist demands for mandated access in such countries. Being able to resist such demands will protect the devices and sensitive data of Americans and American businesses in those countries. By contrast, if the U.S. requires access, then it will be much more difficult for U.S.-based technology companies to push back against requests from China or other foreign governments.

My initial point, therefore, is that U.S. actions in this area have a very important impact on whether other countries adopt mandated access. As I stated during the hearing, I also believe that mandates in the U.S. would harm U.S.-based technology companies because of the suspicion around the world that their products and services are not secure and that information is shared with U.S. government agencies.

In terms of mandates in another country, such as a U.S. ally, there would be multiple effects, and the overall outcome depends on the circumstances. For instance, if a small-market country mandates access, then that might aid local companies that comply with the local law, while U.S. companies may decide not to take the reputational risk of doing business in that jurisdiction. In that event, U.S. companies might lose access to a small market but face less competition in other markets from companies based there. If the country is seen globally as having a weak human rights record, mandated access may push U.S. companies, consistent with the Global Network Initiative principles, not to continue doing business there, thus losing market access. Such company decisions to eschew a market, however, may send a strong signal globally about the importance of customer security to the U.S.-based companies, with offsetting gains in other markets.

In addition, there is a crucial dynamic aspect to such mandates. The small country, or the country with a weak human rights record, might find the consequences negative if it loses access to cutting-edge technology from U.S.-based companies. It thus might reconsider its decision to mandate access in order to bring U.S.-based companies back into the jurisdiction. In such an event, a clear U.S. policy of not requiring access is crucial: the good long-term outcome of U.S. company participation without mandates occurs only if the U.S. retains its policy of imposing no mandates.

Tackling Privacy, One Carnegie Mellon Project at a Time


CMU Privacy Researchers Norman Sadeh, Lorrie Cranor, Lujo Bauer, Travis Breaux, and Jason Hong (l-r). Photo by JC Cannon.

Last Thursday, the Future of Privacy Forum hosted a conversation among five of CMU’s leading privacy researchers. While the panelists discussed a number of their current privacy projects, I wanted to highlight a few takeaways from the presentation that stood out to me.

Many of the researchers focused on how subtle nudges can be used to change people’s behaviors. While this is frequently done to encourage users to share more data, the CMU researchers expressed an interest in exploring how nudges can be “used for good.” Discussing efforts by hotels to get patrons to reuse wash towels, Jason Hong explained how subtle changes in the wording of reminders, from “please recycle” to “75% of guests in this room,” could have significant impacts on patrons’ behavior.

Lujo Bauer explained how these sorts of nudges could be applied to password composition meters. Increasingly, online services detail password requirements to users and either show colored strength bars or outright classify a user’s proposed password as “weak” or “strong.” According to Bauer, people typically do not try very hard to get to the point where a meter tells them the password is excellent, but “they will avoid it if a meter tells them their password sucks.” His takeaway: when it comes to security measures, avoid giving users too much positive feedback.
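For readers unfamiliar with these meters, the sketch below is a minimal, hypothetical illustration (in Python) of how a composition meter might score a password and map it to a label. The scoring rules and thresholds are assumptions made purely for illustration; they are not the logic of any particular service or of the CMU research.

    import re

    def password_strength(password: str) -> str:
        """Map a candidate password to a coarse label, as a typical meter might."""
        score = 0
        if len(password) >= 8:
            score += 1  # meets a minimum length
        if len(password) >= 12:
            score += 1  # extra credit for longer passwords
        if re.search(r"[a-z]", password) and re.search(r"[A-Z]", password):
            score += 1  # mixed case
        if re.search(r"\d", password):
            score += 1  # contains a digit
        if re.search(r"[^A-Za-z0-9]", password):
            score += 1  # contains a symbol
        if score <= 2:
            return "weak"
        if score <= 4:
            return "fair"
        return "strong"

    print(password_strength("password"))        # -> weak
    print(password_strength("Tr0ub4dor&3xyz"))  # -> strong

The behavioral point Bauer makes is about the labels rather than the scoring itself: the “weak” label is what users act to avoid.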

Bauer lamented that the online ecosystem is forcing users to engage in insecure behaviors. Of course, while nudges could be used to reinforce positive behaviors, this raises the question of what counts as “positive” behavior. When it comes to security issues like passwords, promoting better security may be a no-brainer, but things are much less binary when it comes to privacy. Privacy-protective nudges can push toward privacy paternalism, which may be no more ethical than the alternative.

Travis Breaux highlighted the continuing challenge of translating privacy policy into engineering objectives. He noted that many mobile app developers still do not understand the privacy implications of connecting their apps to outside services and social networks, which underscores the need to map out the entire data supply chain. Breaux explored the potential of rich data collection and use descriptions that could be more detailed and useful than generic privacy policies, and, describing a case study involving applications on Facebook, he explained how these sorts of tools could help developers understand more accurately how they are collecting, using, and repurposing information.

Lorrie Cranor discussed the difficulties of communicating data use in the Internet of Things, whether through visual, auditory, or haptic channels, or by making information “machine readable” (if you remember P3P and DNT). She also highlighted one study that looked at the timing dimension of providing users with notice. A student developed a simple history quiz app that displayed a privacy notice at different points: (1) in the app store, (2) as soon as the app was opened, (3) in the middle of the history quiz, (4) at the quiz’s end, or (5) never at all. “We invited people to take our quiz, but didn’t tell them it was about privacy,” she explained.

When users were then asked about the contents of that privacy notice, the study found that people who “saw” the policy in the app store could not recall it any better than people who did not see it at all. According to Cranor, at the time a user is downloading an app, they are not paying attention to other information in the app store. This “doesn’t suggest you don’t put that info in the app store . . . but suggests that sort of timing may not be sufficient. Also suggests it’s really important to test these things.”

Norman Sadeh further criticized the state of our overly complicated privacy policies. “It’s not the case that every single sentence in a privacy policy matters,” he stated, discussing his effort to extract from privacy policies the key points of interest to users.

Last but not least, the group described its Bank Privacy Project. The researchers explained that larger banks tend to collect more information and use it for more purposes, while smaller banks do the exact opposite. “If you don’t want your bank sharing,” Cranor explained, “you need to find a bank you’ve never heard of.” Because that is nigh-impossible for the average consumer to do, enter the Bank Privacy Project.

-Joseph Jerome, Policy Counsel

Peter Swire Testifies on Encryption and “Going Dark”

This morning, FPF Senior Fellow Peter Swire presented testimony before the Senate Judiciary Committee on encryption and the balance between public safety and privacy. Swire highlights the concerns raised by a diverse coalition of cybersecurity and privacy experts, tech companies, and human rights activists about law enforcement’s “going dark” argument.

“We can respect the heartfelt concerns of law enforcement officials facing new challenges while respectfully disagreeing with proposed policies,” he concludes. You can read his full testimony here.

