Author Archive

Moving the Internet of Things Forward Without Hard Numbers on Risks

Today’s release of the FTC’s long-awaited report on the Internet of Things concludes that connected devices are “likely to meld the virtual and physical worlds together in ways that are currently difficult to comprehend.” It is in this great unknown that many of the revolutionary benefits – and more abstract risks – of connectivity seem to lie. While the Future of Privacy Forum was largely supportive of the report, the separate statement of Commissioner Ohlhausen and dissent from Commissioner Wright reflect the reality that it is still early days for the Internet of Things and the path forward is not crystal clear.

Both commissioners caution the FTC against focusing on speculative harms and urge a more rigorous cost-benefit analysis. For Commissioner Ohlhausen, this perspective dictates that the Commission approach new technologies with a degree of regulatory humility, while Commissioner Wright is more blunt: “Paying lip service to the obvious fact that the various best practices and proposals discussed in the Workshop Report might have both costs and benefits, without in fact performing such an analysis, does nothing to inform the recommendations made in the Workshop Report.”

There is some merit to this criticism. At the end of the day, I always want hard numbers to back up my policy positions and lament the fact that I often don’t have them. The problem is that when it comes to privacy – particularly privacy harms – evaluating costs is difficult. Academics have been struggling for years, through efforts both serious and silly, to quantify the value of privacy, and the Internet of Things appears to have only added to the perception that there are new, more abstract threats to privacy. Existing risk assessment frameworks are well geared to identify and address tangible harms, such as financial loss or security vulnerabilities, but it is much harder to measure the chilling effects, if any, of smart thermostats and televisions.

It is not sufficient to simply dismiss concerns as too abstract or inchoate. By now, it should be increasingly clear that the public’s sense that individuals have lost control over their information is problematic. It’s unclear whether the FTC alone is in the position to ensure privacy and security concerns about connectivity will not undermine public confidence in the wider Internet of Things. So what can be done?

In some spaces, the Internet of Things seems to finally open the door for companies to compete on privacy in a way that consumers can understand. The Association of Global Automakers, for example, views its recently released automotive privacy principles as a floor for its members and fully expects car companies to be able to compete on privacy, due both to our close relationships with our cars and to preeminent security concerns. There are plenty of opportunities for companies to be proactive on privacy. Industry efforts are already underway to establish a baseline for how wearable devices should be treated and how algorithms should be governed. Recently, FPF has called for companies to engage in a more detailed and concrete analysis of the benefits of their data projects. Part of the aim of this effort is to encourage industry to develop more elaborate and more structured processes, such as review panels and “consumer” IRBs, that can seriously consider the ethical challenges posed by some innovative uses of data.

Beyond that, the Internet of Things promises new opportunities for users to engage with their information. Connectivity tools that offer users controls will be important not just for privacy but for basic functionality. I was pleased to see the FTC report highlight our calls for consumer profile management portals. Implemented correctly, better user dashboards will give individuals the ability to make their own decisions about what information about them is collected, used, and shared. These tools respect user autonomy, letting people weigh the benefits they want against the risks they’re willing to tolerate. Better, more individualized control obviously doesn’t resolve collective privacy challenges, but it is one option to look at alongside codes of conduct, traditional benefit-risk assessments, and review mechanisms. Until we can find a better way to measure and analyze the societal value of privacy, this multi-pronged approach is the best way forward on the Internet of Things.

-Joseph Jerome, Policy Counsel

FPF Statement on FTC Internet of Things Report

The Report, which appropriately does not call for IoT-specific legislation, reflects the fact that the Internet of Things is in its infancy, and strongly supports context as a way to assess appropriate uses. The staff recognized concerns that a notice-and-choice approach could restrict unexpected new uses of data with potential societal benefits. They sensibly incorporated certain elements of the use-based model into their approach, linking the idea that choices should be keyed to context with consideration of how the data will be used.

However, the report is overly cautious: it recognizes that some beneficial uses can strain data minimization or warrant out-of-context uses, but worries that allowing companies alone to judge the bounds of such uses, without legislation, would lead to mistakes. In many cases, the FTC already has the ability to use its deception or unfairness authority to take action when a company creates risk to consumers without countervailing benefit. We hope the Administration’s soon-to-be-released Consumer Bill of Rights charts options that can frame the parameters for out-of-context uses or data retention, by looking to codes of conduct and consumer subject review boards.

-Jules Polonetsky & Christopher Wolf, Future of Privacy Forum

Paper on Video Games and Privacy Released

At the start of the new year, one of the most anticipated video games was Watch_Dogs, an open-world experience in which players took the role of a hacker living in near-future Chicago, racing around the city using a mobile device to retrieve sensitive data and harnessing augmented reality feeds to pick up non-player characters’ demographic data or potential in-game behavior. The game not only highlighted current concerns about privacy, but it got me thinking about the many privacy issues at play in the world of video games.

Many of these issues are similar to work FPF has already done in the mobile space with regard to apps and online services, but as a longtime gamer, these data collection and use issues were things I didn’t really think about when picking up a controller in front of my television. Modern games collect data such as a player’s physical characteristics (including facial features, body movement, and voice data), location and nearby surroundings, biometrics, and information gleaned from one’s social networks, to start. Additionally, within the game environment itself, data analysts monitor in-game behavior in order to discover a great deal about a gamer’s mind: from their temperament to their leadership skills; from their greatest fears to their political leanings.

The use of data is rapidly changing the gaming landscape, leading to a whole host of innovative new ways to play, but also potentially giving this gamer some pause. I teamed up with Joe Newman, now at Electronic Arts, and Chris Hazard, a game developer and researcher, to survey privacy issues in video gaming. Our paper, Press Start to Track?, was presented at the 2014 Privacy Law Scholars Conference, and was published this week in the journal of the American Intellectual Property Law Association.

If interested in the subject (but unable to do a deep dive), Joe and I blogged about some of our thoughts on The Escapist and our thoughts for how data-hungry game developers can build trust with gamers at Gamasutra.

-Joseph Jerome, Policy Counsel

Onwards and Upwards

Today, Joe Newman, our former legal and policy fellow, started working as a privacy attorney at Electronic Arts, one of the largest video game companies in the world. While at FPF, Joe was vital to our projects reviewing the U.S.-EU Safe Harbor and the implementation of “Do Not Track,” but he identified early on some of the interesting legal and privacy issues in the gaming space. I was pleased to be able to collaborate with Joe on a paper scoping out some of these issues, and the end result, Press Start to Track: Privacy Questions Posed by Video Game Tech, is slated for publication this month. There’s no question Joe is well-positioned to take the privacy world by storm at EA.

He also joins the long list of FPF alumni who have gone on to interesting positions across technology and privacy. Joe’s co-fellow, Sarah Gordon, left FPF to work in-house at Zillow, the West Coast-based online real estate database, and previous fellows have gone on to work at American Express, Nielsen, and Promontory, among other ports of call.

But as our alumni move onward and upward, FPF is always looking for new law graduates and would-be policy wonks to join us! If you are interested in joining us and applying to work as a fellow at FPF, please get in touch at jjerome@futureofprivacy.org.

-Joseph Jerome, Policy Counsel

Travis LeBlanc on the FCC’s New Privacy Role

At today’s FCBA brown bag lunch, FCC Enforcement Bureau Chief Travis LeBlanc discussed the Commission’s recent entrance into privacy enforcement and fielded questions as to what companies might do to avoid running afoul of the Enforcement Bureau. LeBlanc emphasized that innovation continues to outpace regulators, noting that much of the Commission’s investigative and enforcement work is a five- to seven-year process. “We’re at the point where we’d be having the Supreme Court judge [problems] with first-generation smartphones,” he mused. He highlighted the Commission’s recent decision to join the Global Privacy Enforcement Network as an effort to help keep pace with change in technology.

Kelley Drye’s John Heitmann pressed LeBlanc on the FCC’s notices of apparent liability (NALs) against TerraCom and YourTel, which he suggested interpreted Sections 222(a) and 201(b) of the Telecommunications Act in novel ways to protect consumer privacy. Section 222(a) states that “[e]very telecommunications carrier has a duty to protect the confidentiality of proprietary information of, and relating to …customers.” This has long been the basis for the FCC’s security rules around CPNI, but LeBlanc argued that Section 222 does not limit the duty of carriers to protecting only CPNI. He admitted that for “folks in the industry, in the media, and in the privacy community, there was an ‘uh huh, interesting’ moment” regarding the Commission’s interpretation, but he suggested this interpretation has been used to support other privacy work within the FCC, “if not squarely in the enforcement context.” He argued that Section 222’s protection of proprietary information was designed “to encompass the protection of information customers intended to keep private, which includes PII” and is more than just CPNI as defined by the FCC. “Going forward, fair to say, that’s the concept we’ll be using in our work,” LeBlanc stated.

LeBlanc also explained that Section 201(b), which prohibits carriers from engaging in unjust or unreasonable business practices, must be viewed as being co-extensive with Section 5 of the FTC Act. “It’s a basic consumer protection tool that we use to ensure carriers can’t engage in unjust practices,” he said, citing a recent settlement against AT&T for “cramming” extra charges onto consumer bills as an example of how to apply Section 201. He explained that the application of this interpretation within the context of policing privacy practices is “an iteration of that view and not a transformation.” Echoing the FTC’s actions on privacy policies, LeBlanc emphasized that the FCC hoped “to marry [companies’] language with their practices.” He added that the cramming settlement shows that the FCC is focused on conduct that directly harms consumers. The Enforcement Bureau, he suggested, was not interested in technical rules violations where no one was harmed or impacted. He also suggested it was important to differentiate between breaches of personal information that can be remedied, such as credit cards, and those that cannot, such as Social Security number breaches. “In that circumstance, [a person’s] identity may be stolen or it may not, but no one’s going to re-issue you a Social Security number.”

LeBlanc spoke at length about the differences between the FCC and the Federal Trade Commission, the nation’s primary privacy cop. “We’re a regulatory agency with rule-making authority in contrast to the FTC, which is primarily a policing agency,” he explained. “The benefit of having a law enforcement unit in the same agency as the one making the rules [is that] we can go talk to them before we do an enforcement action. If we’re going to do anything, we need to pick up the phone first. . . . It is impossible for anyone writing laws or rules to anticipate every circumstance out there you intend to bar, so you leave some part of it ambiguous. That’s an advantage over doing enforcement independently. There are risks that an enforcer could exploit a small error in the language of a statute.” He suggested housing both rule-making and enforcement in one entity improves effectiveness and efficiency.

The ramifications of the Commission’s recent $7.4 million settlement against Verizon for its past failure to notify consumers of their opportunity to opt out of marketing using CPNI were also a key topic of discussion. LeBlanc suggested the more interesting parts of the settlement were its non-financial terms. He applauded Verizon’s decision to include a notice of consumer opt-out rights in every monthly bill going forward. He suggested more notices like this give consumers the ability to evaluate (and rethink) their decisions to share information. He also suggested that CPNI rules move away from unclear “reasonableness” standards and place stronger protections on customers’ proprietary information.

LeBlanc also reiterated his desire to see companies admit to wrongdoing in settlement actions. He suggested that negotiations with Verizon were already ongoing at the time the Enforcement Bureau announced a practice of seeking admissions of liability or facts in settlements. He explained that FCC settlements were designed to provide guidance to others engaging in similar conduct, and that “the only way to effectively do that is to provide some detail into what a company did that was wrong.” He was also dismissive of notions that admissions of wrongdoing would impede the ability of companies either to retain business or gain government contracts. “I don’t think that’s true,” he said, suggesting settlements could be narrowly worded enough to protect companies from that sort of sanction.

Turning to emerging privacy issues, LeBlanc emphasized that he hoped to prevent industry mistakes rather than to respond after the fact. “Where I can provide guidance to the industry to operate in compliance with the law, I’d like to do that,” he said. His chief recommendation was for companies to do better with their privacy policies. He admitted that the lack of baseline federal privacy law forced him, as well as other agencies, to “work on the representations industry makes,” pointing to existing FTC practice. He suggested that the SEC will be interested in this moving forward, as well.

“We understand that sometimes companies are victims,” he said. “They are targets — no pun intended.”  He pointed to some of the “mitigating practices” companies could pursue in the event of breaches, including (1) notifications when information was compromised, (2) credit monitoring services, and (3) providing hotlines or websites to consumers. He also highlighted the importance of chief privacy officers, training, and the adoption of industry best practices and security audits. That said, he also appeared skeptical of some common “excuses” for breaches such as (1) errant employees, (2) technological glitches, and (3) contractor practices. “The company that collects personal information from the consumer, that has that relationship with the consumer, is responsible for protecting it [downstream],” he said. “That duty cannot be out-sourced.”

Finally, Heitmann could not avoid asking LeBlanc whether all of his comments might apply to broadband services in the event the FCC reclassifies broadband under Title II. “Wouldn’t you like to know?” LeBlanc laughed. “I cannot speculate on what the Commission is going to do in this context . . . We will stand ready and prepared to meet the Commission’s goals.”

-Joseph Jerome, Policy Counsel


Privacy Calendar

Mar 4 – Mar 6, all day: Global Privacy Summit 2015. For more information, click here.

Mar 10, 6:00 pm – 9:00 pm: CDT Annual Dinner “TechProm” 2015. Featuring the most influential minds of the tech policy world, CDT’s annual dinner, TechProm, highlights the issues your organization will be facing in the future and provides the networking opportunities that can help you tackle[...]

Mar 13, all day: BCLT Privacy Law Forum. This program will feature leading academics and practitioners discussing the latest developments in privacy law. UC Berkeley Law faculty and conference panelists will discuss cutting-edge scholarship and explore ‘real world’ privacy law problems. Click here[...]

May 27, all day: PL&B’s Asia-Pacific Roundtable. Professor Graham Greenleaf, Asia-Pacific Editor, Privacy Laws & Business International Report, will lead a roundtable on the countries of most interest to business in the Asia-Pacific region. Click here for more information.

Jul 6 – Jul 8, all day: PL&B’s 28th Annual International Conference. The Privacy Laws & Business 27th Annual International Conference featured more than 40 speakers and chairs from many countries over 3 intensive days. At the world’s longest running independent international privacy event participants gained professionally by[...]

Jan 28 – Jan 29, all day: Data Privacy Day. “Data Privacy Day began in the United States and Canada in January 2008, as an extension of the Data Protection Day celebration in Europe. The Day commemorates the 1981 signing of Convention 108, the first[...]