Author Archive

FERPA | SHERPA: Providing a Guide to Education Privacy Issues

Education is changing. New technologies are allowing information to flow within classrooms, schools, and beyond, enabling new learning environments and new tools to understand and improve the way teachers teach and students learn. At the same time, however, the confluence of enhanced data collection with highly sensitive information about children and teens makes for a combustible mix from a privacy perspective. Even the White House recognizes this challenge! Its recent Big Data Review specifically highlighted the need for responsible innovation in education.

There are many organizations – many of which we’ve partnered with – working tirelessly to address privacy issues in education and provide the best experience for students. So too is the Department of Education. Yet these resources are scattered, and the need for an education privacy clearinghouse is clear. With “back to school” now in full swing, we thought it a great time to launch FERPA|SHERPA. The site – named after the core federal law that governs education privacy – aims to provide a one-stop shop for education privacy-related offerings of interest to parents and schools, as well as education service providers and the policymakers grappling with the legal landscape.

Everyone in the educational ecosystem has a role to play here, lest legitimate privacy concerns combine with other worries to overwhelm the benefits of education technologies and the expanded use of student data. One need only look at the recent collapse of inBloom – a new technology platform that school systems were clamoring for until a combination of poor communication and privacy fears came to dominate any and all conversations about the underlying technologies – as an example of the need for schools and the companies they partner with to better address education privacy issues.

To ensure parents have a voice in the ongoing privacy debate, the site will also host a blog written by parent privacy advocate Olga Garcia-Kaplan, a Brooklyn, NY public school parent of three children.

Additionally, we’re releasing an education privacy whitepaper by Jules Polonetsky, our executive director, and Omer Tene, Vice President, Research & Education, IAPP, that analyzes the opportunities and challenges of data-driven education technologies and how key stakeholders should address them. The piece – “The Ethics of Student Privacy: Building Trust for Ed Tech” – was recently published in a special issue of the International Review of Information Ethics, “The Digital Future of Education.”

We hope FERPA | SHERPA will help get everyone on the same page when it comes to privacy issues around student data.  We would love your feedback and thoughts on the new site, and we look forward to helping to jump start conversations about education privacy in the new school year.  If we’ve missed something or you’d like to join our effort, please reach out to

Future of Privacy Forum Launches One-Stop Shop Website for Student Privacy


FPF Urges Parents, Teachers and Policymakers to Follow the “ABC’s” of Education Privacy & Make “D” for Data Protection

WASHINGTON, D.C. – August 21, 2014 – As schools increasingly rely on data to improve education, and as teachers increasingly rely on technology in the classroom to improve the learning experience, privacy concerns are being raised about the collection and use of student data. With ‘back to school’ now in full swing, and to address both the promise and challenges surrounding privacy and data in education, the Future of Privacy Forum (FPF) today unveiled a first-of-its-kind, one-stop shop resource website providing parents, school officials, policymakers, and service providers easy access to the laws, standards and guidelines that are essential to understanding student privacy issues and navigating a responsible path to managing student data with trust, integrity, and transparency.

More than at any other time in the evolution of education, data-driven innovations and use of emerging technologies – such as online textbooks, apps, tablets and mobile devices, and internet-based learning – are bringing advances and critical improvements in teaching and learning, with profound implications.

At the same time, the increased use of vendors and data is matched by the need for heightened responsibility to manage and safeguard student data and implement policies that benefit education and minimize risk. Concerns are being raised about how student data is collected and used in a next-stage learning ecosystem buzzing with social media, mobile devices, central databases, student records, Big Data, and an array of vendors and software.

The new, resource-rich FERPA|SHERPA website – named after the core federal law that governs education privacy – seeks to address these opportunities and concerns. The unique site hosts a comprehensive, digital dashboard of quality education privacy-related offerings for four distinct audiences: parents, service providers, schools, and policymakers.

To ensure parents have a voice in the ongoing privacy debate, the site will also host a blog written by parent privacy advocate Olga Garcia-Kaplan, a Brooklyn, NY public school parent of three children.

Some of the assets available at FERPA|SHERPA include:

    • Vendor quick tips – for app and software developers
    • Overview and explanation of relevant federal laws and policies on student data – such as FERPA, COPPA, PPRA, ESRA and CIPA
    • Policy papers about education privacy
    • Clearinghouse of education websites and resources for parents and school administrators
    • Topical blog that brings a parent’s perspective to the many facets of privacy issues related to learning and education
    • Ongoing expertise provided by FPF staff, partners, and other stakeholders to help shape and guide understanding of data privacy and responsible use

“Getting privacy right in student education requires a partnership of trust between families, teachers and schools, technology companies and education officials,” said Jules Polonetsky, executive director, FPF. “Any weak link in this chain of responsibility could undermine education and risk student data. With FERPA|SHERPA, we are making sure that the laws and best practices are easy to find.”

“Since our creation, Edmodo has been focused on safeguarding user privacy, and we’re excited to partner with FPF on this effort to provide schools, teachers, and parents with great resources about student privacy issues,” said Aden Fine, chief privacy officer of Edmodo. “Education is critical to addressing questions about privacy, and we think the FERPA|SHERPA website will really help the public better understand these complicated issues.”

“Parents have to sort through a tremendous amount of information issued about student data privacy to learn how and why data compiled pertaining to their children may be used. As a parent, the FERPA|SHERPA site is an invaluable resource for obtaining timely, accurate and impartial information necessary to understand this evolving landscape,” said Olga Garcia-Kaplan, parent and advocate for student data privacy.

“Educational leaders, service providers, parents and policy makers increasingly need accurate and reliable information on privacy issues.  For too long, it has been a real challenge to find that information.  The Future of Privacy Forum’s new FERPA|SHERPA is a great starting place to find what you need,” said Keith Krueger, CEO, Consortium for School Networking.

“Protecting student data and privacy involves navigating myriad regulations, policies, and practices,” said Marsali Hancock, CEO & President, iKeepSafe. “iKeepSafe has worked with schools, parents, students, and industry to promote safe and effective use of technology, and we are thrilled that FERPA|SHERPA is providing these stakeholders with additional resources on important laws and best practices to protect student data.”

The FERPA|SHERPA website initiative – which began in the fall of 2013 – is the first of many planned FPF offerings on education privacy. The effort began when FPF applied its privacy expertise and staff talent to education issues, and it has since grown into a comprehensive education privacy campaign with wide stakeholder engagement – including parents, teachers, school administrators, trade associations, and leading education and technology companies in the private sector.

In addition, the FPF today released an education privacy whitepaper that has been published in a special issue of the International Review of Information Ethics, “The Digital Future of Education.” The piece – “The Ethics of Student Privacy: Building Trust for Ed Tech” – is authored by Polonetsky and Omer Tene, Vice President, Research & Education, IAPP, and analyzes the opportunities and challenges of data-driven education technologies and how key stakeholders should address them.

About the Future of Privacy Forum

The Future of Privacy Forum (FPF) is a Washington, DC-based think tank that seeks to advance responsible data practices. The forum is led by internet privacy experts Jules Polonetsky and Christopher Wolf and includes an advisory board comprised of leading figures from industry, academia, law and advocacy groups. Visit

Media Contact:

Nicholas Graham, for Future of Privacy Forum

FPF Statement on Today’s Safe Harbor Complaint

Today, the Center for Digital Democracy filed a complaint with the Federal Trade Commission, alleging that companies are violating the U.S.-EU Safe Harbor agreement. CDD’s filing came with a report criticizing the practices of thirty companies.

“We are carefully reviewing the report’s claims, but the dozen we have examined so far seem to reflect the authors’ distaste for marketing, rather than legal safe harbor violations,” said Jules Polonetsky, Executive Director, Future of Privacy Forum.

The Future of Privacy Forum has long focused on the value of the Safe Harbor agreement, and issued a comprehensive report on the framework last fall.

Cross Border Privacy Rules Advance at Beijing Meetings

APEC’s Data Privacy Subgroup concluded its 2014 meetings in Beijing, China earlier this week. The Future of Privacy Forum participated in these meetings as a member of the U.S. delegation. The biggest development of the week was Canada’s submission of its Notice of Intent to participate in the Cross Border Privacy Rules (CBPR) system. After a favorable determination by APEC’s Joint Oversight Panel, Canada will become the fourth country to join the system, along with the United States, Mexico, and Japan. In addition, TRUSTe, an APEC-approved Accountability Agent, announced that 14 companies are in the process of seeking certification. Taken together, these developments, along with Mexico’s recent steps toward interoperability, have provided promising momentum in the establishment of an international privacy framework.

Still, much work remains before the true potential of the system can be fully realized. In July, FPF hosted officials from Privacy Thailand, a university-based consortium that advises the Thai Prime Minister’s office on data privacy and security issues. During their week-long visit, FPF and Privacy Thailand met with representatives from the Department of Commerce, the Federal Trade Commission, and the U.S. Department of State to consider Thailand’s accession to the system. FPF will continue to work with interested APEC members to provide capacity-building assistance.

On August 8, APEC Economies and representatives from the EU’s Article 29 Working Party met to discuss next steps on the jointly developed Common Referential.  This document identifies points of commonality between the CBPR system and the EU’s system of Binding Corporate Rules (BCRs).  APEC members agreed to take this work forward by developing case studies that demonstrate the practical interoperability of these two systems and a checklist outlining the combined obligations for a company seeking certification under both.

On August 10, APEC Economies agreed to establish a working group to consider the applicability of the APEC Privacy Framework to Big Data.  This group will consider, among other things, appropriate administrative and policy safeguards when de-identifying personal information.  FPF plans to participate in this working group.

Participants also continued the development of a CBPR certification system for data processors. In July, FPF hosted a meeting of this working group to develop the program requirements for this certification. Completion of this project is expected in advance of the next APEC Data Privacy Subgroup meetings in Clark, Philippines, in January 2015.

Comments to NTIA on Big Data

Today, FPF submitted comments to the NTIA as it begins its exploration of how big data impacts the Consumer Privacy Bill of Rights. While the NTIA sought comment on over a dozen key questions, our filing focuses largely on four issues: (1) the need for additional clarity surrounding the flexible application of the Consumer Privacy Bill of Rights’ privacy principles, (2) challenges to the “notice and choice” model and using context to inform a use-based approach to data use, (3) practical de-identification, and (4) what internal review boards might look like and consider in the age of big data.

Much of our filing builds upon FPF’s thinking on how to develop a benefit-risk analysis for data projects, with big data concerns of particular importance. Industry increasingly faces ethical considerations over how to minimize data risks while maximizing benefits to all parties. As the White House’s earlier Big Data Report acknowledged, there is a potential tension between socially beneficial and privacy-invasive uses of information in everything from educational technology to consumer-generated health data. The advent of big data requires active engagement by both internal and external stakeholders to increase transparency, accountability, and trust.

FPF believes that a documented review process could serve as an important tool to infuse ethical considerations into data analysis without requiring radical changes to the business practices of innovators or of industry in general. Institutional review boards (IRBs), which remain the chief regulatory response to decades of questionable ethical decisions in the field of human subject testing, provide a useful precedent for focusing on good process controls as a way to address potential privacy concerns. While IRBs have become a rigid compliance device and would be inappropriate for wholesale use in big data decision-making, they could provide a useful template for how projects can be evaluated based on prevailing community standards and subjective determinations of risks and benefits, particularly in cases involving greater privacy risks. Using the IRB model as inspiration, big data may warrant the creation of new advisory processes within organizations to more fully consider the ethical questions it poses.

Moving forward, broader big data ethics panels could provide a commonsense response to public concerns about data misuse. While these panels would further expand the role of privacy professionals within organizations, they could also provide a forum for a diversity of viewpoints inside and outside of organizations. Ethics reviews could include members with different backgrounds, training, and experience, and could seek input from outside actors including consumer groups and regulators. While these panels will vary between the public and private sectors, and between businesses and researchers, they could provide an important check on data misuse.

Organizations and privacy professionals have become experienced at evaluating risk, but they should also engage in a rigorous data benefit analysis in conjunction with traditional privacy risks assessments. FPF suggests that organizations could develop procedures to assess the “raw value” of a data project, which would require organizations to identify the nature of a project, its potential beneficiaries, and the degree to which those beneficiaries would benefit from the project. Our guidance for this process is included in our filing for the first time.

Of course, big data hasn’t changed all the rules, and not every use of big data implicates privacy. Many uses of big data are machine-to-machine or highly aggregated, and many new uses of data pose only marginal risks, which our current processes for mitigating risk can address well.

Privacy Calendar

Sep 11, 12:00 pm – 1:00 pm — ABA: Big Data and Privacy Frameworks: Perspectives from Government, Industry and Policy Experts
Large-scale data analytics – often referred to as “Big Data” – is transforming lives in areas such as transportation, education, and health care. It is [...]

Sep 15, all day — Big Data: A Tool for Inclusion or Exclusion? @ Constitution Center
The Federal Trade Commission will host a public workshop entitled “Big Data: A Tool for Inclusion or Exclusion?” in Washington on September 15, 2014, to [...]

Sep 17 – Sep 19, all day — IAPP Privacy Academy and CSA Congress 2014 @ San Jose Convention Center
This fall, the International Association of Privacy Professionals (IAPP) and Cloud Security Alliance (CSA) are bringing together the IAPP Privacy Academy and the CSA Congress [...]

Oct 21, 6:00 pm – 8:00 pm — Consumer Action’s 43rd Annual Awards Reception @ Google
To mark its 43rd anniversary, Consumer Action’s Annual Awards Reception on October 21, 2014, will celebrate the theme of “Train the Trainer.” Through the power of [...]

Jan 28, all day — Data Privacy Day
“Data Privacy Day began in the United States and Canada in January 2008, as an extension of the Data Protection Day celebration in Europe. The [...]