Author Archive

Paper on Video Games and Privacy Released

One of the most anticipated video games of this past year was Watch_Dogs, an open-world experience in which players take the role of a hacker living in near-future Chicago, racing around the city using a mobile device to retrieve sensitive data and harnessing augmented reality feeds to pick up information about non-player characters’ demographics and likely in-game behavior. The game not only highlighted current concerns about privacy, but it also got me thinking about the many privacy issues at play in the world of video games.

Many of these issues are similar to work FPF has already done in the mobile space with regard to apps and online services, but as a longtime gamer, I hadn’t really thought about these data collection and use issues when picking up a controller in front of my television. Modern games collect data such as a player’s physical characteristics (including facial features, body movement, and voice data), location and nearby surroundings, biometrics, and information gleaned from one’s social networks, to start. Additionally, within the game environment itself, data analysts monitor in-game behavior to discover a great deal about a gamer’s mind: from their temperament to their leadership skills; from their greatest fears to their political leanings.

The use of data is rapidly changing the gaming landscape, leading to a whole host of innovative new ways to play, but also potentially giving this gamer some pause. I teamed up with Joe Newman, now at Electronic Arts, and Chris Hazard, a game developer and researcher, to survey privacy issues in video gaming. Our paper, Press Start to Track?, was presented at the 2014 Privacy Law Scholars Conference and was published this week in the journal of the American Intellectual Property Law Association.

If you’re interested in the subject but unable to do a deep dive, Joe and I blogged about some of our thoughts on The Escapist, and we shared our thoughts on how data-hungry game developers can build trust with gamers at Gamasutra.

-Joseph Jerome, Policy Counsel

Onwards and Upwards

Today, Joe Newman, our former legal and policy fellow, started working as a privacy attorney at Electronic Arts, one of the largest video game companies in the world. While at FPF, Joe was vital to our projects reviewing the U.S.-EU Safe Harbor and the implementation of “Do Not Track,” but he identified early on some of the interesting legal and privacy issues in the gaming space. I was pleased to be able to collaborate with Joe on a paper scoping out some of these issues, and the end result, Press Start to Track: Privacy Questions Posed by Video Game Tech, is slated for publication this month. There’s no question Joe is well-positioned to take the privacy world by storm at EA.

He also joins the long list of FPF alumni who have gone on to interesting positions across technology and privacy. Joe’s co-fellow, Sarah Gordon, left FPF to work in-house at Zillow, the West Coast-based online real estate database, and previous fellows have gone on to work at American Express, Nielsen, and Promontory, among other ports of call.

But as our alumni move onward and upward, FPF is always looking for new law graduates and would-be policy wonks to join us! If you are interested in joining us and applying to work as a fellow at FPF, please get in touch at jjerome@futureofprivacy.org.

-Joseph Jerome, Policy Counsel

Travis LeBlanc on the FCC’s New Privacy Role

At today’s FCBA brown bag lunch, FCC Enforcement Bureau Chief Travis LeBlanc discussed the Commission’s recent entrance into privacy enforcement and fielded questions as to what companies might do to avoid running afoul of the Enforcement Bureau. LeBlanc emphasized that innovation continues to outpace regulators, noting that much of the Commission’s investigative and enforcement work is a five to seven year process. “We’re at the point where we’d be having the Supreme Court judge [problems] with first-generation smartphones,” he mused. He highlighted the Commission’s recent decision to join the Global Privacy Enforcement Network as an effort to help keep pace with change in technology.

Kelley Drye’s John Heitmann pressed LeBlanc on the FCC’s notices of apparent liability (NALs) against TerraCom and YourTel, which he suggested interpreted Sections 222(a) and 201(b) of the Telecommunications Act in novel ways to protect consumer privacy. Section 222(a) states that “[e]very telecommunications carrier has a duty to protect the confidentiality of proprietary information of, and relating to …customers.” This has long been the basis for the FCC’s security rules around CPNI, but LeBlanc argued that Section 222 does not limit the duty of carriers to protecting only CPNI. He admitted that for “folks in the industry, in the media, and in the privacy community, there was an ‘uh huh, interesting’ moment” regarding the Commission’s interpretation, but he suggested this interpretation has been used to support other privacy work within the FCC, “if not squarely in the enforcement context.” He argued that Section 222’s protection of proprietary information was designed “to encompass the protection of information customers intended to keep private, which includes PII” and is more than just CPNI as defined by the FCC. “Going forward, fair to say, that’s the concept we’ll be using in our work,” LeBlanc stated.

LeBlanc also explained that Section 201(b), which prohibits carriers from engaging in unjust or unreasonable business practices, must be viewed as co-extensive with Section 5 of the FTC Act. “It’s a basic consumer protection tool that we use to ensure carriers can’t engage in unjust practices,” he said, citing a recent settlement against AT&T for “cramming” extra charges onto consumer bills as an example of how to apply Section 201. He explained that the application of this interpretation within the context of policing privacy practices is “an iteration of that view and not a transformation.” Echoing the FTC’s actions on privacy policies, LeBlanc emphasized that the FCC hoped “to marry [companies’] language with their practices.” He added that the cramming settlement shows that the FCC is focused on conduct that directly harms consumers. The Enforcement Bureau, he suggested, was not interested in technical rules violations where no one was harmed or impacted. He also suggested it was important to differentiate between breaches of personal information, such as credit cards, that can be remedied and those that cannot, such as Social Security number breaches. “In that circumstance, [a person’s] identity may be stolen or it may not, but no one’s going to re-issue you a Social Security number.”

LeBlanc spoke at length about the differences between the FCC and the Federal Trade Commission, the nation’s primary privacy cop. “We’re a regulatory agency with rule-making authority in contrast to the FTC, which is primarily a policing agency,” he explained. “The benefit of having a law enforcement unit in the same agency as the one making the rules [is that] we can go talk to them before we do an enforcement action. If we’re going to do anything, we need to pick up the phone first. . . . It is impossible for anyone writing laws or rules to anticipate every circumstance out there you intend to bar, so you leave some part of it ambiguous. That’s an advantage over doing enforcement independently. There are risks that an enforcer could exploit a small error in the language of a statute.” He suggested housing both rule-making and enforcement in one entity improves effectiveness and efficiency.

The ramifications of the Commission’s recent $7.4 million settlement against Verizon for its past failure to notify consumers of their opportunity to opt out of marketing using CPNI were also a key topic of discussion. LeBlanc suggested the more interesting parts of the settlement were its non-financial terms. He applauded Verizon’s decision to include a notice of consumer opt-out rights in every monthly bill going forward. He suggested more notices like this give consumers the ability to evaluate (and rethink) their decisions to share information. He also suggested that CPNI rules move away from unclear “reasonable standards” and place stronger protections on customers’ proprietary information.

LeBlanc also reiterated his desire to see companies admit to wrongdoing in settlement actions. He suggested that negotiations with Verizon were already ongoing at the time the Enforcement Bureau announced a practice of seeking admissions of liability or facts in settlements. Explaining that FCC settlements were designed to provide guidance to others engaging in similar conduct, he said “the only way to effectively do that is to provide some detail into what a company did that was wrong.” He was also dismissive of notions that admissions of wrongdoing would impede the ability of companies either to retain business or gain government contracts. “I don’t think that’s true,” he said, suggesting settlements could be narrowly worded enough to protect companies from that sort of sanction.

Turning to emerging privacy issues, LeBlanc emphasized that he hoped to prevent industry mistakes rather than to respond after the fact. “Where I can provide guidance to the industry to operate in compliance with the law, I’d like to do that,” he said. His chief recommendation was for companies to do better with their privacy policies. He admitted that the lack of baseline federal privacy law forced him, as well as other agencies, to “work on the representations industry makes,” pointing to existing FTC practice. He suggested that the SEC will be interested in this moving forward, as well.

“We understand that sometimes companies are victims,” he said. “They are targets — no pun intended.”  He pointed to some of the “mitigating practices” companies could pursue in the event of breaches, including (1) notifications when information was compromised, (2) credit monitoring services, and (3) providing hotlines or websites to consumers. He also highlighted the importance of chief privacy officers, training, and the adoption of industry best practices and security audits. That said, he also appeared skeptical of some common “excuses” for breaches such as (1) errant employees, (2) technological glitches, and (3) contractor practices. “The company that collects personal information from the consumer, that has that relationship with the consumer, is responsible for protecting it [downstream],” he said. “That duty cannot be out-sourced.”

Finally, Heitmann could not avoid asking LeBlanc whether all of his comments might apply to broadband services in the event the FCC reclassifies broadband under Title II. “Wouldn’t you like to know?” LeBlanc laughed. “I cannot speculate on what the Commission is going to do in this context . . . We will stand ready and prepared to meet the Commission’s goals.”

-Joseph Jerome, Policy Counsel

Big Data Papers

FPF and the Local Search Association released a primer that explains how Bluetooth devices work.

Discussing the Merits of Device Encryption

In the wake of Apple and Google’s recent decision to implement “whole device encryption” on their latest mobile operating systems, the FBI has warned that the tech giants’ actions will force law enforcement to “go dark” when it comes to keeping tabs on criminals. FPF has previously explored the question of encryption and law enforcement access, and encourages efforts by tech companies to make their devices and services more secure.

In the wake of Snowden’s revelations about government surveillance last year, there has been a renewed conversation about whether communications technology is sufficiently secure. At minimum, encryption helps to protect users against unauthorized access to their personal information. The question now facing policymakers is whether improvements in technical security must be sacrificed to enable lawful government access.
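To make concrete what “whole device encryption” protects against, here is a minimal sketch of authenticated encryption of data at rest, written in Go with the standard library’s AES-GCM. This is purely illustrative, not Apple’s or Google’s actual implementation; on a real phone the key would be derived from the user’s passcode and bound to hardware, rather than generated randomly as it is here.

```go
package main

import (
	"crypto/aes"
	"crypto/cipher"
	"crypto/rand"
	"fmt"
)

// encrypt seals plaintext with AES-256-GCM under key, prepending the random nonce.
func encrypt(key, plaintext []byte) ([]byte, error) {
	block, err := aes.NewCipher(key) // 32-byte key selects AES-256
	if err != nil {
		return nil, err
	}
	gcm, err := cipher.NewGCM(block)
	if err != nil {
		return nil, err
	}
	nonce := make([]byte, gcm.NonceSize())
	if _, err := rand.Read(nonce); err != nil {
		return nil, err
	}
	// Seal appends ciphertext+auth tag to the nonce slice.
	return gcm.Seal(nonce, nonce, plaintext, nil), nil
}

// decrypt reverses encrypt; any tampering with the sealed bytes returns an error.
func decrypt(key, sealed []byte) ([]byte, error) {
	block, err := aes.NewCipher(key)
	if err != nil {
		return nil, err
	}
	gcm, err := cipher.NewGCM(block)
	if err != nil {
		return nil, err
	}
	nonce, ciphertext := sealed[:gcm.NonceSize()], sealed[gcm.NonceSize():]
	return gcm.Open(nil, nonce, ciphertext, nil)
}

func main() {
	key := make([]byte, 32) // in practice, passcode-derived and hardware-bound
	if _, err := rand.Read(key); err != nil {
		panic(err)
	}
	sealed, err := encrypt(key, []byte("contacts, messages, photos"))
	if err != nil {
		panic(err)
	}
	plaintext, err := decrypt(key, sealed)
	if err != nil {
		panic(err)
	}
	fmt.Println(string(plaintext))
}
```

The point the policy debate turns on is visible in the sketch: without the key, the sealed bytes are indistinguishable from random data, so anyone who seizes the storage (thief or investigator alike) gets nothing, and the authentication tag means the data cannot even be silently altered.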

Kicking off a conversation on the merits of device encryption, Chris Wolf wondered whether today’s debate was simply a repeat of the crypto wars of the 1990s, or whether a new security balance ought to be struck. Wolf discussed that and more with Georgetown Law’s Carrie Cordero, Amie Stepanovich from Access Now, and Cato’s Julian Sanchez, who stepped away from planning a full-day symposium on the larger issue of government surveillance.

A Renewed Conversation about “Going Dark”

Cordero noted that the concept of “going dark” is nothing new, but stressed that there were significant differences between how the debate was waged in the 1990s versus today. Whereas previously the FBI was concerned about the ability to engage in real-time surveillance, it now has very real concerns about its ability to lawfully obtain stored information. This has changed since Snowden, with the aggressive implementation of encryption and other technologies by tech companies.

“Why are we talking about encryption now?” Stepanovich mused. “Computers have had default encryption on hard drives for many years without anyone raising an eyebrow, but now because it’s on a phone it’s different?” She argued that the current debate is inexorably tied to concerns about surveillance in the wake of the Snowden revelations. “The conversation we’re having isn’t because governments were going after bad actors, but because they were going after everybody. [We now know] how robust the efforts are to get access to your data when access can be gotten. If there is any vulnerable point . . . somebody is probably going to break in and get the data,” she stated. “[Encryption] comes from an abuse of gathering information.”

Wolf pushed back, asking whether such a decision ought to be made as a matter of public policy and not by device manufacturers. Stepanovich countered by suggesting one take a larger view: “These devices are sold around the world. If we start looking at the risk to the user worldwide, it becomes unacceptable . . . not to offer the most security they can offer.” Encryption should be viewed not as an unnecessary obstruction, but rather as an additional protection from unauthorized access to personal information.

However, Cordero cautioned against abandoning efforts to work on technical solutions that protect users against bad actors while allowing compliance with law enforcement. She stressed that there remained a societal interest in preserving the capacity of law enforcement to serve lawful process to investigate crimes and national security threats. “What the government is talking about now is the ability to serve a court order,” she said.

What’s the Honest Impact?

Sanchez was skeptical of the government’s ability to calculate how encryption actually impacts law enforcement. “We’ve been ‘going dark’ for a long time according to the government,” he stated. He highlighted many different ways that law enforcement can gain access to information without physically accessing a mobile device, and suggested that it was quite possible for an individual to be held in contempt of court and jailed for refusing to unlock an encrypted phone. While all conceded the Fifth Amendment protections against self-incrimination are murky at best when it comes to being compelled to unlock an encrypted device, Cordero cautioned that holding individuals in contempt was not a useful mechanism when time is of the essence. “Contempt proceedings aren’t going to be particularly satisfying for law enforcement,” she explained.

“We basically need magic,” Sanchez responded, critiquing the government’s position. He cautioned against treating tech companies like “magicians” and highlighted The Washington Post editorial board’s recent call for “golden keys” that would only work for law enforcement. Technical experts and security researchers largely agree that implementing any sort of hidden access feature also introduces exploitable vulnerabilities, he explained.

He also made the point that Apple’s “soup-to-nuts” business model, with its walled gardens and closed systems, is largely unique. “A general premise in computing is that someone will sell you a computer that comes pre-installed with things like Windows, and you could install other software like Linux,” he explained. “That’s an important value that’s given rise to a tremendous amount of innovation.” Comparing Apple’s mobile device business model to Android’s, which is largely open-source, Sanchez explained that the government’s position effectively wages a war on open computing. “It’s not possible to force people to keep a backdoor they don’t want, or any attempt would be extraordinarily destructive,” he explained.

Looking Forward on Device Encryption

Wolf asked each panelist to preview where the conversation would be a year from now. Sanchez flippantly suggested public discourse would continue to be filled with “hypotheticals cribbed from The Blacklist.” Stepanovich noted that this debate has been ongoing in some form for decades, and we will likely be in the exact same place a year from now. She argued the only positive change could come from revisiting the logic behind the Communications Assistance for Law Enforcement Act (CALEA). She suggested that privacy advocates were largely playing defense rather than offense. “We need to put a law on the books [that states] government cannot force companies to put in a backdoor that makes users less secure,” she stated.

Cordero offered a different perspective. “If law enforcement is serious about pursuing this issue, they’re going to have to make the case.” Noting that many of the FBI’s most recent anecdotal examples of “going dark” have been debunked, she suggested that law enforcement needs to develop a more comprehensive factual record. “In the 1990s, the FBI presented a range of statistics and data that demonstrated factually that there was a situation requiring legislation, as well as GAO reports and independent studies. We need additional facts.”

At its core, she continued, this debate is the same one that was waged over CALEA in 1994. “We made a judgment then [that forcing companies to comply with law enforcement] was a valid purpose,” she explained. If companies are no longer required to preserve that capability in the future, it will become costly for government to adapt as technology rapidly evolves.

Sanchez disagreed with comparisons to CALEA. He explained that CALEA applied to a small number of telecoms with centralized hubs, and there is a huge difference between what CALEA accomplished and what is being proposed now. “What we’re talking about now is forcing an architecture used by hundreds of millions of consumers that would preclude devices from running arbitrary code,” he argued.

Stepanovich returned to Cordero’s point that device encryption could prove costly to law enforcement. She noted that “tech has trended the other way.” Instead, technology has largely decreased the cost of government surveillance (which FPF Senior Fellow Peter Swire has also explained as leading to a “golden age of surveillance”). “Things like encryption counter that dip in price by forcing law enforcement to invest in more targeted surveillance,” Stepanovich said, which should be encouraged.

A Big Policy Choice: To Kill Encryption or Not?

Encryption, Stepanovich concluded, “gives users the ability to control their own data and gives them an option.” Highlighting what has been called “the least trusted country problem,” she argued that the costs of encryption must also be weighed against the effects of surveillance in other countries, which lack the legal safeguards of the United States.

Tech companies are responding to market pressures to do more to secure information, and additional encryption options are the result. The panel largely agreed that law enforcement still has alternative ways of accessing most of the information being encrypted on a device. “Nobody wants perfect encryption,” Sanchez concluded. “We forget our complicated pass phrases, and then everything is irretrievably lost.”

More discussion on the matter is clearly needed. As Cordero explained, “Law enforcement and national security may continue to stress this issue.” However, she also acknowledged that the issue may well be “politically impossible” to address.

-Joseph Jerome, Policy Counsel
