Author Archive

Google Taps the YubiKey for Better Account Security

With identity theft and cybersecurity issues in the news seemingly on a daily basis, better tools to protect our data – and our privacy – are always welcome. For some time, FPF has endorsed the use of two-factor authentication as an “extra” step consumers can take to protect their accounts across a variety of online services. While everyone at FPF uses two-factor authentication for everything from our email accounts to our social media networks, two-factor authentication can be cumbersome and inconvenient. Every time one logs into a different account on a different machine, a code has to be retrieved from a mobile phone. Lose the phone, and you have to hope you have a set of paper-based fallback authentication codes.

Enter the physical security key. Yubico’s new “YubiKey” physical security key, for example, supports both USB and NFC communications and makes two-factor authentication as simple (and as fun) as tapping the device itself. The device is supported by the FIDO Alliance, a non-profit group dedicated to creating strong, interoperable authentication standards. The FIDO protocols that the YubiKey follows use standard public key cryptography and build in privacy by design. This means that the security key cannot track you across services as you log in, and that local authentication information never leaves the device.
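To make the privacy-by-design point concrete, the sketch below shows a FIDO-style challenge-response flow in Python (using the pyca/cryptography library). The structure and names are illustrative assumptions rather than the actual U2F wire protocol, but they capture the two properties described above: the device mints an unrelated key pair for every service, so sites cannot correlate you by key, and only public keys and signatures ever leave the device.

import os
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

# Registration: the device generates a fresh key pair per service (origin),
# so different sites see unrelated public keys and cannot track the user.
keys_by_origin = {}

def register(origin):
    private_key = ec.generate_private_key(ec.SECP256R1())
    keys_by_origin[origin] = private_key          # stays on the device
    return private_key.public_key()               # only this leaves it

# Authentication: the service sends a random challenge; the device signs
# it locally. The private key is never transmitted.
def sign_challenge(origin, challenge):
    return keys_by_origin[origin].sign(challenge, ec.ECDSA(hashes.SHA256()))

# Server side: verify against the public key stored at registration.
origin = "https://accounts.example.com"
server_key = register(origin)
challenge = os.urandom(32)
signature = sign_challenge(origin, challenge)
try:
    server_key.verify(signature, challenge, ec.ECDSA(hashes.SHA256()))
    print("log-in approved")
except InvalidSignature:
    print("log-in rejected")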

Today Google announced its support for the key, making its use with the Chrome browser and Google Accounts enrolled in two-factor authentication much easier.

Google has been interested in physical security keys for a while. The company has been using them internally for years, and last year used its experiences to publish an in-depth look at “Authentication at Scale.” In it, Google explains that investing in password alternatives like public-key-based technologies will help users more easily protect sensitive accounts – like their primary email account or a company’s spokesperson account – from common security threats.

After registering the device with your Google Account, the YubiKey can be used to sign into Google services simply by inserting the key into a USB port and tapping it when prompted. No more reaching for your cell phone every time you encounter a new computer or clear your cookies. Instead, just plug in and tap! When you’re done, just stick the YubiKey on your keychain and go – it’s waterproof, battery-free and nearly indestructible.

In addition to being potentially easier to use, there are also significant security benefits to using a physical key like YubiKey. Physical security keys can’t be phished and can’t be fooled by fake look-alike websites. They also reduce the threat of man-in-the-middle attacks. And because YubiKey is touch-based, malware can’t silently activate it when you’re looking away. The key can also help users keep an eye out for suspicious activity. For example, when you log onto a computer for the first time, you’ll be prompted to tap. If your account suddenly receives a log-in request from a new location, it will also trigger a tap request. If your YubiKey doesn’t authenticate the request, then your account stays locked and Google can flag the failed log-in attempt. Although this may be a small step for security, it is a huge leap for usability.
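The phishing resistance comes from origin binding. Continuing the illustrative sketch above (again an assumption about the design, not the literal protocol), the browser, not the page, reports the origin it actually loaded, and that origin determines which key, if any, gets used:

def authenticate(seen_origin, challenge):
    # A look-alike site such as "https://accounts.examp1e.com" triggers a
    # lookup under *its* origin. The device holds no key for it, so the
    # attempt fails before anything is signed; even a stolen signature
    # would not verify against the real site's stored public key.
    private_key = keys_by_origin[seen_origin]     # KeyError for impostors
    payload = seen_origin.encode() + challenge    # origin bound into payload
    return private_key.sign(payload, ec.ECDSA(hashes.SHA256()))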

The benefits from an enterprise standpoint are obvious, but physical keys also point toward a more secure future for consumers, as well. Online, passwords are roughly analogous to the keys we all use to lock our front doors, but the proliferation of online services and the need for ever-stronger and more varied passwords have overwhelmed consumers. Two-factor authentication has helped to make our cell phones our de facto “keys” to the Internet, but permanent security keys may offer even better online security – and convenience. With Google’s support, FPF looks forward to seeing how devices like the YubiKey develop in the future.

-Kelsey Finch & Joseph Jerome, Policy Counsels

“Databuse” as the Future of Privacy?

Is “privacy” such a broad concept as to be meaningless from a legal and policy perspective? On Tuesday, October 14th, the Center for Democracy & Technology hosted a conversation with Benjamin Wittes and Wells Bennett, frequently of the national security blog Lawfare, to discuss their recent scholarship on “databuse” and the scope of corporate responsibilities for personal data.

Coming from a world of FISA and ECPA, and the detailed statutory guidance that accompanies privacy in the national security space, Wittes noted that privacy law on the consumer side is vague and amorphous, and largely “amounts to don’t be deceptive and don’t be unfair.” Part of the challenge, as a number of privacy scholars have noted, is that privacy encompasses a range of different social values and policy judgments. “We don’t agree what value we’re protecting,” Wittes said, explaining that government privacy policies rest on values and distinctions, such as national borders and citizen/non-citizen status, that mean something.

Important distinctions are much harder to find in consumer privacy. Wittes’ initial work on “databuse” in 2011 was considerably broader and more provocative, applying to all data controllers, first and third party alike, but his follow-up work with Bennett limits its scope to the duties owed to consumers exclusively by first parties. According to the pair, this core group of duties “lacks a name in the English language” but “describe[s] a relationship best seen as a form of trusteeship.”

Looking broadly at law and policy around data use, including FTC enforcement actions, the pair argue that there is broad consensus that corporate custodians face certain obligations when holding personal data, including (1) obligations to keep it secure, (2) obligations to be candid and straightforward with users about how their data is being exploited, (3) obligations not to materially misrepresent their uses of user data, and (4) obligations not to use data in ways injurious to, or materially adverse to, users’ interests without their explicit consent. According to Wittes, this core set of requirements better describes reality than any sort of “grandiose conception of privacy.”

“When you talk in the broad language of privacy, you promise consumers more than the legal and enforcement system can deliver,” Wittes argued. “If we want useful privacy policy, we should focus on this core,” he continued, noting that most of these requirements are not directly required by statute.

Bennett detailed how data uses fall into three general categories. The first, a “win/win” category, describes cases where the interests of business and consumers align; he cited the many uses of geolocation information on mobile devices as a good example. The second category covers cases where businesses directly benefit but consumers face a neutral value proposition, and Bennett suggested online behavioral advertising fits here. The third category covers uses where businesses benefit at consumers’ expense, and he argued that regulatory action would be appropriate to limit these behaviors.

Bennett further argued that this categorization fits well with FTC enforcement actions, if not the agency’s privacy rhetoric. “FTC reports often hint at subjective harms,” Bennett explained, but most of the Commission’s actions target objective harms to consumers by companies.

However, the broad language of “privacy” distorts what harms the pair believe regulators — and consumers, as well — are legitimately concerned about. Giving credit to CDT for initially coining the term “databuse,” Wittes defines the term as follows:

[T]he malicious, reckless, negligent, or unjustified handling, collection, or use of a person’s data in a fashion adverse to that person’s interests and in the absence of that person’s knowing consent. . . . It asks not to be left alone, only that we not be forced to be the agents of our own injury when we entrust our data to others. We are asking not necessarily that our data remain private; we are asking, rather, that they not be used as a sword against us without good reason.

CDT’s Justin Brookman, who moderated the conversation, asked whether (or when) price discrimination could turn into databuse.

“Everyone likes [price discrimination] when you call it discounts,” Wittes snarked, explaining that he was “allergic to the merger of privacy and antidiscrimination laws.” Where personal data was being abused or unlawful discrimination was transpiring, Wittes supported regulatory involvement, but he was hesitant to see both problems as falling into the same category of concern.

The conversation quickly shifted to a discussion of the obligations of third parties — or data brokers generally — and Wittes and Bennett acknowledged they dealt with the obligations of first parties because it’s an easier problem. “We punted on third parties,” they conceded, though Wittes’ background in journalism forced him to question how “data brokers” were functionally different from the press. “I haven’t thought enough about the First Amendment law,” he admitted, but he wasn’t sure what principle would allow advocates to divine “good” third parties and “bad” third parties.

But if the pair’s theory of “databuse” can’t answer every question about privacy policy, at least we might admit the term should enter the privacy lexicon.

-Joseph Jerome, Policy Counsel

Thoughts on the Data Innovation Pledge

Yesterday, as he accepted the IAPP Privacy Vanguard award, Intel’s David Hoffman made a “data innovation pledge” that he would work only to promote ethical and innovative uses of data. As someone who only relatively recently entered the privacy world by diving headfirst into the sea of challenges surrounding big data, I think an affirmative pledge of the sort David is proposing is a great idea.

While pledges can be accused of being mere rhetorical flourishes, words do matter. A simple pledge can communicate a good deal — and engage the public in a way that drives conversation forward. Think of Google’s early motto, “don’t be evil.” For years this commitment fueled a large reservoir of trust that many have for the company. Every new product and service that Google releases is viewed through the lens of whether or not it’s “evil.” That places a high standard on the folks at Google, and for that, we should be pleased.

Of course, pledges present obligations and challenges. Using data only for good presents a host of new questions. As FPF explored in our whitepaper on benefit-risk analysis for Big Data, there are different aspects to consider when evaluating the benefits of data use — and some of these factors are largely subjective. Ethics is a broad field, and it also exposes the challenging philosophical underpinnings of privacy.

The very concept of privacy has always been a philosophical conundrum, but so much of the rise of the privacy profession has focused on compliance issues and the day-to-day reality of data protection. But today, we’re swimming in a sea of data, and all of this information makes us more and more transparent to governments, industry, and each other. It’s the perfect catalyst to consider what the value of “privacy” truly is. Privacy as an information on/off switch may be untenable, but privacy as a broader ethical code makes a lot of sense.

There are models to learn from. As David points out, other professions are bound by ethical codes, and much of that seeps into how we think about privacy. Doctors not only pledge to do no harm, but they also pledge to keep our confidences about our most embarrassing or serious health concerns. Questionable practices around innovation and data in the medical field led to review boards to protect patients and human test subjects and reaffirmed the role of every medical professional to do no evil.

Similar efforts are needed today as everything from our wristwatches to our cars is “datafied.” In particular, I think about all of the debates that have swirled around the use of technology in the classroom in recent years. A data innovation pledge could help reassure worried parents. If Monday’s FTC workshop is any indication, similar ethical conversations may even be needed for everyday marketing and advertising.

The fact is that there are a host of different data uses that could benefit from greater public confidence. A data innovation pledge is a good first step. There is no question that companies need to do more to show the public how they are promoting innovative and ethical uses of data. Getting that balance right is tough, but here’s to privacy professionals helping to lead that effort!

-Joseph Jerome, Policy Counsel

FTC Wants Tools to Increase Transparency and Trust in Big Data

However we want to define “Big Data” – and the FTC’s latest workshop on the subject suggests a consensus definition remains elusive – the path forward seems to call for more transparency and the establishment of firmer frameworks on the use of data. As Chairwoman Ramirez suggested in her opening remarks, Big Data calls for a serious conversation about “industry’s ethical obligations as stewards of information detailing nearly every facet of consumers’ lives.”

Part of that challenge is that many Big Data uses are, in a sense, “discriminatory.” Highlighting findings from his paper on Big Data and discrimination, Solon Barocas began the workshop by noting that the whole point of data mining is to differentiate and to draw distinctions. In effect, Big Data is a rational form of discrimination, driven by apparent statistical relationships rather than any capriciousness. When humans introduce unintentional biases into the data, there is no ready solution at a technical or legal level. Barocas called for lawyers and public policy makers to have a conversation with the technologists and computer scientists working directly with data analytics – a sentiment echoed when panelists realized a predictive analytics conference was going on simultaneously across town.

But the key takeaway from the workshop wasn’t that Big Data could be used as a tool to exclude or include. Everyone in the civil rights community agreed that data could be a good thing, and a number of examples were put forth to suggest once more that data has the potential to be used for good or for ill. Pam Dixon of the World Privacy Forum argued that classifying individuals creates a “data paradox,” where the same data can be used to help or to harm that individual. For our part, FPF released a report alongside the Anti-Defamation League detailing Big Data’s ability to combat discrimination. Instead of relitigating that point, there was considerable desire to understand more about industry’s approach to big data. FTC staff repeatedly asked not just for more positive uses of big data by the private sector, but inquired as to what degree of transparency would help policy makers understand Big Data decision-making.

FTC Chief Technologist Latanya Sweeney, whose earlier study suggested that web searches for African-American names were more likely than searches for white-sounding names to return ads suggesting the person had an arrest record, followed up with a look at credit card advertising and website demographics. Sweeney presented evidence that advertisements for harshly criticized credit cards were often directed to the homepage of Omega Psi Phi, a popular black fraternity.

danah boyd observed that there was a general lack of transparency about how Big Data is being used within industry, for a variety of complex reasons. FTC staff and Kristin Amerling from Senate Commerce singled out the opacity surrounding the practices of data brokers when describing some of the obstacles policy makers face when trying to understand how Big Data is being used.

Moreover, while consumers and policy makers are trying to grapple with what companies are doing with their streams of data, industry is also placed in the difficult position of making huge decisions about how that data can be used. For example, boyd cited the challenges JPMorgan Chase faces when using analytics to evaluate human trafficking. She applauded the positive work the company was doing, but noted that expecting it to have the ability or expertise to effectively intervene in trafficking perhaps asks too much. They don’t know when to intervene or whether to contact law enforcement or social services.

These questions are outside the scope of their expertise, but even general use of Big Data can prove challenging for companies. “A lot of the big names are trying their best, but they don’t always know what the best practices should be,” she concluded.

FTC Commissioner Brill explained that her support for a legislative approach to increase transparency and accountability among data brokers, their data sources, and their consumers was meant to help consumers and policy makers “begin to understand how these profiles are being used in fact, and whether and under what circumstances they are harming vulnerable populations.” In the meantime, she encouraged industry to take more proactive steps. Specifically, she recommended again that data brokers explore how their clients are using their information and take steps to prevent any inappropriate uses and further inform the public. “Companies can begin this work now, and provide all of us with greater insight into – and greater assurances about – their models,” she concluded.

A number of legal regimes may already apply to Big Data, however. Laws that govern the provision of credit, housing, and employment will likely play a role in the Big Data ecosystem. Carol Miaskoff of the Equal Employment Opportunity Commission suggested there was real potential for Big Data to gather information about successful employees and use it to screen candidates in a way that exacerbates prejudices built into the data. Emphasizing his recent white paper, Peter Swire suggested there were analogies to be made between sectoral regulation in privacy and sectoral legislation in anti-discrimination law. With these laws already in place, he argued that it was past time to “go do the research and see what those laws cover” in the context of Big Data.

“Data is the economic lubricant of the economy,” the Better Business Bureau’s C. Lee Peeler argued, and he supported the FTC’s continued efforts to explore the subject of Big Data. He cited earlier efforts by the Commission to examine inner-city marketing practices, which produced a number of best practices still valid today. He encouraged the FTC to look at what companies are doing with Big Data on a self-regulatory basis as a basis for developing workable solutions to potential problems.

So what is the path forward? Because Big Data is, in the words of Promontory’s Michael Spadea, a nascent industry, there is a very real need for guidelines on not just how to evaluate the risks and benefits of Big Data but also how to understand what is ethically appropriate for business. Chris Wolf highlighted FPF’s recent Data-Benefit Analysis and suggested companies were already engaged in detailed analysis of the use of Big Data, though everyone recognized that business practices and trade secrets precluded making much of this public.

FTC staff noted there was a “transparency hurdle” to get over in Big Data. Recognizing that “dumping tons of information” onto consumers would be unhelpful, staff picked up on Swire’s suggestion that industry needed some mechanism to justify what is going on to either regulators or self-regulatory bodies. Spadea argued that “the answer isn’t more transparency, but better transparency.” The Electronic Frontier Foundation’s Jeremy Gillula recognized the challenge companies face revealing their “secret sauce,” but encouraged them to look at more ways to give consumers more general information about what is going on. Otherwise, he recommended, consumers ought to collect big data on big data and turn data analysis back on data brokers and industry at large through open-source efforts.

At the same time, Institutional Review Boards, which are used in human subjects research, were again proposed as a model for how companies can begin affirmatively working through these problems. Citing a KPMG report, Chris Wolf insisted that strong governance regimes, including “a strong ethical code, along with process, training, people, and metrics,” were essential to confront the many ethical and philosophical challenges that ran through the day’s discussions.

Jessica Rich, Director of the FTC’s Bureau of Consumer Protection, cautioned that the FTC would be watching. In the meantime, industry is on notice. The need for clearer data governance frameworks is clear, and careful consideration of Big Data projects should be both reflexive and something every industry privacy professional talks about.

-Joseph Jerome, Policy Counsel


A Path Forward for Big Data

How should privacy concerns be weighed against the benefits of big data? This question has been at the heart of policy debates about big data all year, from the President’s announcement of the White House Big Data review in January to the FTC’s latest workshop looking at big data’s ability to exclude or include. Answering this question could very well present the biggest public policy challenge of our time, and the need to face that challenge is growing.

Increasingly, there are new worries that big data is being used in ways that are unfair to some people or classes of people. Resolving those worries and ensuring that big data is used fairly and legitimately is a challenge that should be a top priority for industry and government alike.

Today, FPF is releasing two papers that we hope will help frame the big data conversation moving forward and promote better understanding of how big data can shape our lives. These papers provide a practical guide for how benefits can be assessed in the future, but they also show how data is already being used in the present. FPF Co-Chairman Christopher Wolf will discuss key points from these papers at the Federal Trade Commission public workshop entitled “Big Data: A Tool for Inclusion or Exclusion?” in Washington on Monday, September 15.

We are also releasing a White Paper based on comments that will be presented at the FTC Workshop by Peter Swire, Nancy J. & Lawrence P. Huang Professor of Law and Ethics, Georgia Institute of Technology. Swire, also a Senior Fellow at FPF, draws lessons from fair lending law that are relevant for online marketing related to protected classes.

The papers are described in turn below.

*                      *                      *

The world of big data is messy and challenging. The very term “big data” means different things in different contexts. Any successful approach to the challenge of big data must recognize that data can be used in a variety of different ways. Some of these uses are clearly beneficial, some are clearly problematic, some are viewed as beneficial by some and harmful by others, and some have no real impact on individuals at all. We hope these documents can offer new ways to look at big data in order to ensure that it is only being used for good.

Big Data: A Benefit and Risk Analysis

Privacy professionals have become experts at evaluating risk, but moving forward with big data will require rigorous analysis of project benefits to go along with traditional privacy risk assessments. We believe companies and researchers need tools that can help evaluate the case for the benefits of significant new data uses. Big Data: A Benefit and Risk Analysis is intended to help companies assess the “raw value” of new uses of big data. Particularly when data projects involve the use of health information or location data, more detailed benefit analyses are needed: ones that clearly identify the beneficiaries of a data project and its size and scope, and that take into account the probability of success and evolving community standards. We hope this guide will be a helpful tool to ensure that projects go through a process of careful consideration.
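As a purely illustrative sketch of how such an assessment might be structured (the factors, scales, and weights below are hypothetical, not the methodology in FPF’s paper), a project’s expected benefit can be discounted by its probability of success before being weighed against residual privacy risk:

from dataclasses import dataclass

@dataclass
class DataProjectAssessment:
    # Hypothetical factors and scales, for illustration only.
    benefit_magnitude: float       # 0-10: how large is the benefit?
    beneficiary_breadth: float     # 0-10: individual, community, or society?
    probability_of_success: float  # 0-1: odds the project delivers
    privacy_risk: float            # 0-10: residual risk after mitigations

    def expected_benefit(self):
        # Discount the raw benefit by the chance the project actually works.
        raw_value = (self.benefit_magnitude + self.beneficiary_breadth) / 2
        return raw_value * self.probability_of_success

    def proceed(self):
        return self.expected_benefit() > self.privacy_risk

# A health-data project with broad benefits but uncertain success:
project = DataProjectAssessment(benefit_magnitude=8, beneficiary_breadth=9,
                                probability_of_success=0.5, privacy_risk=3)
print(project.expected_benefit(), project.proceed())   # 4.25 True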

Identifying both benefits and risks is a concept grounded in existing law. For example, the Federal Trade Commission weighs the benefits to consumers when evaluating whether business practices are unfair or not. Similarly, Europe’s Article 29 Data Protection Working Party has applied a balancing test to evaluate the legitimacy of data processing under the European Data Protection Directive. Big data promises to be a challenging balancing act.

Big Data: A Tool for Fighting Discrimination and Empowering Groups

Even as big data uses are examined for evidence of facilitating unfair and unlawful discrimination, data can help to fight discrimination. It is already being used in myriad ways to protect and to empower vulnerable groups in society. In partnership with the Anti-Defamation League, FPF prepared a report that looked at how businesses, governments, and civil society organizations are leveraging data to provide access to job markets, to uncover discriminatory practices, and to develop new tools to improve education and provide public assistance.  Big Data: A Tool for Fighting Discrimination and Empowering Groups explains that although big data can introduce hidden biases into information, it can also help dispel existing biases that impair access to good jobs, good education, and opportunity.

Lessons from Fair Lending Law for Fair Marketing and Big Data

Where discrimination presents a real threat, big data need not necessarily lead us to a new frontier. Existing laws, including the Equal Credit Opportunity Act and other fair lending laws, provide a number of protections that are relevant when big data is used for online marketing related to lending, housing, and employment. In comments to be presented at the FTC public workshop, Professor Peter Swire will discuss his work in progress entitled Lessons from Fair Lending Law for Fair Marketing and Big Data. Swire explains that fair lending laws already provide guidance as to how to approach discrimination that allegedly has an illegitimate, disparate impact on protected classes. Data actually plays an important role in being able to assess whether a disparate impact exists! Once a disparate impact is shown, the burden shifts to creditors to show that their actions serve a legitimate business need and that no reasonable, less discriminatory alternative exists. Fair lending enforcement has encouraged the development of rigorous compliance mechanisms, self-testing procedures, and a range of proactive measures by creditors.
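Swire’s point that data can reveal a disparate impact is straightforward to make concrete. One common screening heuristic, the EEOC’s “four-fifths rule,” flags a practice when any group’s selection rate falls below 80 percent of the most-favored group’s rate. A minimal sketch, with hypothetical data and function names:

def selection_rates(outcomes):
    # outcomes maps each group to (selected, total applicants).
    return {g: sel / total for g, (sel, total) in outcomes.items()}

def disparate_impact_ratios(outcomes):
    rates = selection_rates(outcomes)
    best = max(rates.values())
    # Four-fifths rule: a ratio below 0.8 signals potential disparate impact.
    return {g: rate / best for g, rate in rates.items()}

# Hypothetical loan-approval data: (approved, applicants) per group.
outcomes = {"group_a": (480, 600), "group_b": (270, 450)}
for group, ratio in disparate_impact_ratios(outcomes).items():
    flag = "potential disparate impact" if ratio < 0.8 else "ok"
    print(f"{group}: ratio {ratio:.2f} ({flag})")   # group_b: 0.75, flagged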

*                      *                      *

There is no question that big data will require hard choices, but there are plenty of avenues for obtaining the benefits of big data while avoiding – or minimizing – any risks. We hope these documents can help shift the conversation to a more nuanced and balanced analysis of the challenges at hand.

To discuss any of the papers or issues related to privacy and big data, contact the authors at FPFMedia@futureofprivacy.org.


Privacy Calendar

Oct 29 (Wed), 4:00 pm – 7:00 pm: Big Data and Privacy: Navigating... @ Schulze Hall
The rapid emergence of “big data” has created many benefits and risks for businesses today. As data is collected, stored, analyzed, and deployed for various business purposes, it is particularly important to develop responsible data[...]

Oct 30 (Thu), 9:00 am – 5:30 pm: The Privacy Act @40: A Celebrati... @ Georgetown Law
The Privacy Act @40: A Celebration and Appraisal on the 40th Anniversary of the Privacy Act and the 1974 Amendments to the Freedom of Information Act. October 30, 2014. Agenda: 9 – 9:15 a.m. Welcome[...]

Nov 7 (Fri) – Nov 8, all day: George Washington Law Review 201... @ George Washington University Law School
Save the date for the GW Law Review’s Annual Symposium, The FTC at 100: Centennial Commemorations and Proposals for Progress, which will be held on Saturday, November 8, 2014, in Washington, DC. This year’s symposium, hosted in[...]

Nov 11 (Tue), 10:15 am: You Are Here: GPS Location Track... @ Mauna Lani Bay Hotel & Bungalows
EFF Staff Attorney Hanni Fakhoury will present twice at the Oregon Criminal Defense Lawyers Association’s Annual Sunny Climate Seminar. He will give a presentation on government location tracking issues and then participate in a panel[...]

Nov 12 (Wed), all day: PCLOB Public Meeting on “Definin... @ Washington Marriott Hotel
The Privacy and Civil Liberties Oversight Board will conduct a public meeting with industry representatives, academics, technologists, government personnel, and members of the advocacy community, on the topic: “Defining Privacy.” While the Board will[...]

Nov 20 (Thu) – Nov 21, all day: W3C Workshop on Privacy and User... @ Berlin, Germany
The Workshop on User Centric App Controls intends to further the discussion among stakeholders of the mobile web platform, including researchers, developers and service providers. This workshop serves to investigate strategies toward better privacy protection[...]

Dec 2 (Tue) – Dec 3, all day: IAPP Practical Privacy Series 2014
Government and FTC and Consumer Privacy return to Washington, DC. For more information, click here.