
New FPF Survey Shows Parents Overwhelmingly Support Using Student Data to Improve Education




WASHINGTON D.C., September 21, 2015 – Today, as students, parents, teachers and school administrators begin another academic year, the Future of Privacy Forum released new survey data showing that a majority of parents support using student data to improve education. While support for using data in the classroom is strong, parents remain concerned about the privacy and security of student data in U.S. K-12 schools.

Few topics in education have generated as much discussion as the potential for data and technology to transform teaching and learning. The public discourse has been dominated by technology proponents and critics alike. But less time has been spent working to understand how most parents of school-aged children view the risks and opportunities of using data and technology in the classroom.

To help answer this question—and to better inform decisions made by educators, school leaders, product developers and policymakers—the Future of Privacy Forum and Harris Poll conducted a national survey of parents with children in public school grades K-12.

The survey, which was conducted online by Harris Poll on behalf of the Future of Privacy Forum from March 26 – April 2, 2015, included 1,002 parents in the United States with children between the ages of 0 and 17 (of which 672 have children in public school grades K-12). It focused on understanding what parents know about the way technology is being used in schools and how they view the collection, use, and protection of student data.

“Parents are one of the most important stakeholders in the discussions around using student data to improve classroom education,” said Future of Privacy Forum Executive Director Jules Polonetsky. “Yet not nearly enough work has been done to bring parents into the conversation. This survey is an important first step.”

The survey asked parents to outline their goals and fears about the use of technology and student data. The findings are summarized below.


Key Survey Takeaways


  • What Do Parents Know About the Use of Technology in Schools?


According to the survey, most parents (71%) say their child uses technology provided by the school, and over half (58%) say they have used school-related technology themselves. A majority of parents (76%) understand what data are being collected and how they are used. These results demonstrate a strong baseline of knowledge and communication between schools and parents.

  • When Do Parents Support Access to and Use of Student Data Within Schools or the Educational System?


The vast majority of parents express comfort with schools using student data such as grades (97%), attendance records (94%), special needs status (91%), and standardized test scores (88%) to improve teaching and learning. But they want a strong justification of administrative need or educational benefit. For example, parents say they are less likely to be comfortable with schools sharing student data with private companies that create educational software, websites, or apps (42%). However, if companies are providing a service that parents perceive to directly benefit students, the results are more favorable: a majority support companies using student data to improve their products and services (57%).

  • What Do Parents See as Benefits from Additional Uses of Student Data?


Parents support many uses of individual and aggregate-level student data to improve education. According to the survey, parents are strongly in favor of using individual student data both to identify struggling students so they can receive appropriate support earlier (84%) and to personalize learning by identifying the strengths and weaknesses of individual students (79%). By wide margins, parents say that using aggregate-level student data to improve the effectiveness of teacher instruction (78%) and to help schools measure and hold teachers accountable for their effectiveness in the classroom (73%) are convincing reasons to use student information.

  • Where Do Parents Stand on the Creation of Electronic Education Records Amid Security Concerns?


While most parents worry about student data being hacked or stolen (87%) or about an electronic record being used in the future against their child by a college or an employer (68%), a majority are comfortable with the creation of electronic education records for their child as long as those records are properly protected (71%). That support level increases when parents know that schools are required to ensure student data security (85%) and when parents are aware that student data use is restricted to educational purposes (87%).

  • What Protections Do Parents Want; What Do They Know About Existing Laws and Policies?


More than half of parents (54%) say they know nothing about existing federal laws regulating the use of student data. This finding may explain why parents say the best way to ensure student data privacy is the adoption of new state or federal laws (57%). Nearly half of parents (47%) say they want companies to adopt better contracting practices and publish enforceable privacy policies. Most parents appear to be unaware of the benefits of complementary strategies to ensure the responsible use of student data, such as industry codes and best practices.



In general, parents are very aware of, engaged with, and concerned about technology and student data use in schools. While they are eager for the individual learning benefits that educational data can provide, parents are also concerned about the security of their child’s personal information. Educators, education service providers, advocates, and policymakers should embrace the opportunity to work with parents as partners in addressing these issues.

“This survey makes it clear that we must do a better job of explaining to parents how their children benefit from improving the effectiveness of education products based on things learned in the classroom,” Polonetsky said. “And parents want a commitment that their student data will never be exploited. I think that’s a commitment they deserve.”


About The Student Privacy Symposium

The survey is being released in conjunction with the National Student Privacy Symposium in Washington, D.C. on Monday, September 21, 2015. The symposium features research experts, education leaders, privacy and security professionals, advocacy groups, parents and government leaders in an open and honest debate about how to best serve our youth.

In addition to core student data privacy issues, education, privacy, security, and civil rights leaders will also discuss the benefits and risks of data use for underserved student populations and its impact on inequality and discrimination.

The Keynote speaker is Kati Haycock, President of the Education Trust, a leading national nonprofit education advocacy organization. Haycock is a civil rights champion and one of the nation’s top advocates for high academic achievement for all students, especially low-income students and students of color.

Support and funding for the event have been provided by: the Bill & Melinda Gates Foundation; the Digital Trust Foundation; Data Quality Campaign; the National Association of State School Boards of Education; the Consortium for School Networking (CoSN); the Houston Independent School District; iKeepSafe; the Software & Information Industry Association (SIIA); and AASA: The School Superintendents Association.

For more information, please visit


About Harris Poll Methodology

This survey was conducted online within the United States by Harris Poll on behalf of Future of Privacy Forum from March 26 – April 2, 2015 among 1,002 parents ages 18 and older who have children ages 0-17 in the household (of which 672 have children in public/charter school grades K-12). This online survey is not based on a probability sample and therefore no estimate of theoretical sampling error can be calculated. For complete survey methodology, including weighting variables, please contact


About the Future of Privacy Forum

The Future of Privacy Forum (FPF) is a Washington, DC-based think tank that seeks to advance responsible data practices. The forum is led by Internet privacy experts Jules Polonetsky and Christopher Wolf and includes an advisory board of leading figures from industry, academia, law, and advocacy groups. It facilitates discussions with privacy advocates, industry leaders, regulators, legislators (and their staffs), and international representatives.

FPF expanded into the student data policy area in 2014 with the introduction of FERPA|Sherpa, a compilation of education privacy resources and tools with sections aimed at parents/students, schools, service providers, and policymakers. In addition to original tools and resources, the site aggregates many of the references made available by government agencies, education advocates, academic centers, and commercial partners. FPF continued its work in education with the announcement of the Student Privacy Pledge for K-12 Ed Tech Providers, developed in partnership with the Software & Information Industry Association (SIIA) and built around a dozen commitments regarding the collection, maintenance, and use of student personal information. Started with 14 market leaders, the Pledge has since expanded to over 170 signers and has been endorsed by President Obama.


Call for Papers: Beyond IRBs


Beyond IRBs: Designing Ethical Review Processes for Big Data Research

In the age of Big Data, innovative uses of information are continuously emerging in a wide variety of contexts. Increasingly, researchers at companies, not-for-profit organizations and academic institutions use individuals’ personal data as raw material for analysis and research. For research on data subject to the Common Rule, institutional review boards (IRBs) provide an essential ethical check on experimentation. Still, even academic researchers lack standards around the collection and use of online data sources, and data held by companies or in non-federally funded organizations is not subject to such procedures. Research standards for data can vary widely as a result. Companies and non-profits have become subject to public criticism and may elect to keep research results confidential to avoid public scrutiny or potential legal liability.

To prevent unethical data research or experimentation, experts have proposed a range of solutions, including the creation of “consumer subject review boards,”[1] formal privacy review boards,[2] private IRBs,[3] and other ethical processes implemented by individual companies.[4] Organizations and researchers are increasingly encouraged to pursue internal or external review mechanisms to vet, approve and monitor data experimentation and research. However, many questions remain concerning the desirable structure of such review bodies as well as the content of ethical frameworks governing data use. In addition, considerable debate lingers around the proper role of consent in data research and analysis, particularly in an online context; and it is unclear how to apply basic principles of fairness to selective populations that are subject to research.

To address these challenges, the Future of Privacy Forum (FPF) is hosting an academic workshop supported by the National Science Foundation, which will discuss ethical, legal, and technical guidance for organizations conducting research on personal information. Authors are invited to submit papers for presentation at a full-day program to take place on December 10, 2015. Successful submissions may address the following issues:

  • What are the key ethical questions surrounding data-driven research and consumer testing? Should analysis of online data be treated differently than subject testing generally? Which issues warrant additional review?
  • What lessons can be learned from existing institutional review boards (IRBs), including both academic panels and standalone private entities? Which features of existing IRB structures are applicable to corporate and non-profit big data research?
  • Which existing government or industry practices offer insights into how researchers can approach ethical questions for Big Data?
  • How should organizations structure an ethical review process? Which substantive requirements and principles should govern it? How should organizations ensure independent review? Could review mechanisms be wholly internal or must external stakeholders or committees be involved?
  • What is the proper scope of review? While IRBs currently focus on human-subject testing, Big Data raises a different set of concerns focusing on subsequent analysis and novel uses of data. Which factors are essential for consideration in an ethical process?
  • What is the proper role of consent in a data-driven research environment?

Papers for presentation will be selected by an academic advisory board and published in the online edition of the Washington and Lee Law Review. Four papers will be selected to serve as “firestarters” for the December workshop, with each author receiving a $1,000 stipend.

Submissions must be 2,500 to 3,500 words, with minimal footnotes and in a readable style accessible to a wide audience.

Submissions must be made no later than October 25, 2015, at 11:59 PM ET, to Publication decisions and workshop invitations will be sent in November.


[1] Ryan Calo, Consumer Subject Review Boards: A Thought Experiment, 66 Stan. L. Rev. Online 97 (2013).

[2] White House Consumer Privacy Bill of Rights Discussion Draft, Section 103(c) (2015).

[3] Jules Polonetsky, Omer Tene, & Joseph Jerome, Beyond the Common Rule: Ethical Structures for Data Research in Non-Academic Settings, 13 Colo. Tech. L. J. 333 (2015).

[4] Mike Schroepfer, CTO, Research at Facebook (Oct. 2, 2014),

Beyond the Common Rule: IRBs for Big Data and Beyond?

In the wake of last year’s news about the Facebook “emotional contagion” study and the subsequent public debate about the role of A/B testing and ethical concerns around the use of Big Data, FPF Senior Fellow Omer Tene participated in a December symposium on corporate consumer research hosted by Silicon Flatirons. This past month, the Colorado Technology Law Journal published a series of papers that emerged out of the symposium, including “Beyond the Common Rule: Ethical Structures for Data Research in Non-Academic Settings.”

“Beyond the Common Rule,” by Jules Polonetsky, Omer Tene, and Joseph Jerome, continues the Future of Privacy Forum’s effort to build on the notion of consumer subject review boards first advocated by Ryan Calo at FPF’s 2013 Big Data symposium. It explores how researchers, increasingly in corporate settings, are analyzing data and testing theories using often sensitive personal information. Many of these new uses of PII are simply natural extensions of current practices, and are either within the expectations of individuals or the bounds of the FIPPs. Yet many of these projects could involve surprising applications or uses of data that exceed user expectations, and offering notice and obtaining consent may not be feasible.

This article expands on ideas and suggestions put forward around the recent discussion draft of the White House Consumer Privacy Bill of Rights, which espouses “Privacy Review Boards” as a safety valve for noncontextual data uses. It explores how existing institutional review boards within the academy and in human subjects research could offer lessons for guiding principles, providing accountability, and enhancing consumer trust, and it offers suggestions for how companies and researchers can pursue both knowledge and data innovation responsibly and ethically.

The Future of Privacy Forum intends to continue the conversation about Big Data review boards. Joseph Jerome will be leading a panel discussion on the topic at the IAPP’s fall Privacy Academy, and FPF will be hosting an invite-only workshop this winter with leading researchers, ethicists, and corporate policymakers to address how to build an ethical framework for Big Data research.

Click here to read “Beyond the Common Rule: Ethical Structures for Data Research in Non-Academic Settings.”

Peter Swire on Encryption and Mandated Access

Senate Committee on the Judiciary

Questions for the Record from Senator Grassley

To: Peter Swire

Huang Professor of Law and Ethics

Scheller College of Business

Georgia Institute of Technology

  1. Global Competitiveness

In my opening statement, one of the concerns I expressed was that, in considering solutions to the “Going Dark” problem, we carefully consider the impact on the global competitiveness of American technology companies. You testified that if U.S. companies were required to give U.S. law enforcement access to encrypted communications and devices, U.S. companies would find themselves at a disadvantage in the global marketplace. Yet it appears that countries like the United Kingdom, France and China are considering laws that would move in this direction.

  1. Do you agree that these foreign governments may be moving in this direction? If so, how would the global competitiveness of U.S. companies be damaged if foreign governments mandate the same sort of access?


Swire: I agree that other countries have been considering laws concerning mandated access. My view is that the position of the United States government is highly relevant to the likelihood of other countries adopting such laws, especially for close allies such as the United Kingdom and France. If the United States were to mandate access legally, which I hope it will not, my view is that the U.S. decision would substantially increase the likelihood of such laws being enacted by our allies. By contrast, if the United States maintains the status quo of no such mandates, then that fact becomes an important and relevant precedent against enactment of such measures by our allies.

I believe the U.S. position would also have a large impact on other countries around the world, especially for authoritarian or dictatorial regimes that would like to use mandated access to clamp down on political dissent, religious activity, and other activities. If the U.S. resists mandates, then U.S.-based technology companies have a much greater ability to resist demands for mandated access in such countries. Being able to resist such demands will protect devices and sensitive data of Americans and American businesses in those countries. By contrast, if the U.S. requires access, then it will be much more difficult for U.S.-based technology companies to push back against requests from China or other foreign governments.

My initial point, therefore, is that U.S. actions in this area have a very important impact on whether other countries adopt mandated access. As I stated during the hearing, I also believe that mandates in the U.S. would harm U.S.-based technology companies because of the suspicion around the world that their products and services are not secure and that information is shared with U.S. government agencies.

In terms of mandates in another country, such as a U.S. ally, there would be multiple effects, and the overall outcome depends on the circumstances. For instance, if a small-market country mandates access, then that might aid local companies that comply with the local law, while U.S. companies may decide not to take the reputational risk of doing business in that jurisdiction. In that event, U.S. companies might lose access to a small market but face less competition from companies based there in other markets. If the country is seen globally as having a weak human rights record, mandated access may push U.S. companies, consistent with the Global Network Initiative principles, not to continue doing business there, thus losing market access. Such company decisions to eschew a market, however, may send a strong signal globally about the importance of customer security to U.S.-based companies, with offsetting gains in other markets.

In addition, there is a crucial dynamic aspect to such mandates. The small country, or a country with a weak human rights record, might find the consequences negative if it loses access to cutting-edge technology from U.S.-based companies. It might thus reconsider its decision to mandate access in order to bring U.S.-based companies back into the jurisdiction. In such an event, a clear U.S. policy of not requiring access is crucial: the good long-term outcome of U.S. company participation without mandates occurs only if the U.S. retains its policy of imposing no mandates.

Tackling Privacy, One Carnegie Mellon Project at a Time


CMU Privacy Researchers Norman Sadeh, Lorrie Cranor, Lujo Bauer, Travis Breaux, and Jason Hong (l-r). Photo by JC Cannon.

Last Thursday, the Future of Privacy Forum hosted a conversation among five of CMU’s leading privacy researchers. While the panelists discussed a number of their leading privacy projects, I wanted to highlight some of the most interesting takeaways from the presentation.

Many of the researchers focused on how subtle nudges can be used to change people’s behaviors. While this is frequently done to encourage users to share more data, the CMU researchers expressed an interest in exploring how nudges can be “used for good.” Discussing efforts by hotels to get patrons to reuse wash towels, Jason Hong explained how subtle changes in the wording of reminders — from “please recycle” to “75% of guests in this room” — could have significant impacts on patron recycling behaviors.

Lujo Bauer explained how these sorts of nudges could be applied to password composition meters. Increasingly, online services spell out password requirements to users and either show colored bars or outright classify a user’s proposed password as “weak” or “strong.” According to Bauer, people typically do not try very hard to get to the point where a meter tells them their password is excellent, but “they will avoid it if a meter tells them their password sucks.” His takeaway: when it comes to security measures, avoid giving users too much positive feedback.
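To make the password-meter idea concrete, here is a minimal, hypothetical sketch of a rule-based composition meter in Python. The scoring rules and thresholds are illustrative assumptions only; they are not the meters Bauer’s group actually studies, which rely on far more sophisticated models.

```python
# Illustrative sketch of a rule-based password composition meter.
# The heuristics and thresholds below are hypothetical assumptions,
# not a real deployed meter.
import re


def password_strength(password: str) -> str:
    """Classify a proposed password as 'weak', 'fair', or 'strong'."""
    score = 0
    if len(password) >= 8:
        score += 1          # minimum length
    if len(password) >= 12:
        score += 1          # extra credit for longer passwords
    if re.search(r"[a-z]", password) and re.search(r"[A-Z]", password):
        score += 1          # mixed case
    if re.search(r"\d", password):
        score += 1          # contains a digit
    if re.search(r"[^A-Za-z0-9]", password):
        score += 1          # contains a symbol

    if score <= 2:
        return "weak"
    if score <= 4:
        return "fair"
    return "strong"


if __name__ == "__main__":
    for pw in ["password", "Tr0ub4dor", "correct-Horse-Battery-42!"]:
        print(pw, "->", password_strength(pw))
```

In a real interface, the returned label would drive the colored bar or “weak”/“strong” text shown to the user as they type.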

Bauer lamented that the online ecosystem is forcing users to engage in insecure behaviors. Of course, while nudges could be used to reinforce positive behaviors, that raises the question of what counts as “positive” behavior. When it comes to security issues like passwords, promoting better security may be a no-brainer, but things are much less binary when it comes to privacy. Privacy-protective nudges can push towards privacy paternalism, which may be no more ethical than the alternative.

Travis Breaux highlighted the continuing challenge of translating privacy policy into engineering objectives. He noted that many mobile app developers still do not understand the privacy implications of connecting their apps to outside services and social networks, which points to the need to detail the entire data supply chain. Breaux explored the potential of rich data collection and use descriptions that could be more detailed and useful than generic privacy policies. Describing a case study involving applications on Facebook, he explained how these sorts of tools could help developers understand more accurately how they are collecting, using, and repurposing information.

Lorrie Cranor discussed the difficulties of communicating data use in the Internet of Things, whether through visual, auditory, or haptic channels, or by making the information “machine readable (if you remember P3P and DNT).” She also highlighted one study that looked at the timing dimension of providing users with notice. A student developed a simple history quiz app that displayed a privacy notice in different places: (1) in the app store, (2) as soon as the app was opened, (3) in the middle of the history quiz, (4) at the quiz’s end, or (5) never at all. “We invited people to take our quiz, but didn’t tell them it was about privacy,” she explained.

When users were then asked about the contents of that privacy notice, the study found that people who “saw” the policy in the app store could not recall it any better than people who did not see it at all. According to Cranor, at the time a user is downloading an app, they are not paying attention to other information in the app store. This “doesn’t suggest you don’t put that info in the app store . . . but suggests that sort of timing may not be sufficient. Also suggests it’s really important to test these things.”

Norman Sadeh further criticized the state of our overly complicated privacy policies. “It’s not the case that every single sentence in a privacy policy matters,” he stated, discussing his effort to extract the key points of interest to users from privacy policies.

Last but not least, the group described its Bank Privacy Project. The researchers described how larger banks tend to collect more information and use it for more purposes, while smaller banks do the exact opposite. “If you don’t want your bank sharing,” Cranor explained, “you need to find a bank you’ve never heard of.” Because this is nigh-impossible for an average consumer to do, enter the Bank Privacy Project.

-Joseph Jerome, Policy Counsel
