Looking at Privacy Protections for Facial Recognition

On Sunday, Google announced that it would not allow facial recognition applications on Google Glass until “strong privacy protections” were in place. But the announcement raises an obvious question: what sort of privacy protections can actually be put in place for this kind of technology?

Thus far, concerns about facial recognition technology have arisen largely in the context of “tagging” images on Facebook or its potential to transform marketing, but those interactions occur between users and service providers. Facial recognition at the scale offered by wearable technology such as Google Glass can change how we navigate the outside world. As one commentator has noted, notice and consent mechanisms can protect the Glass user, but they say nothing about how that user turns the device on everyone around him.

Many suggestions have focused on signaling to the outside world that Glass is at work, such as blinking lights or other audiovisual cues. This resembles earlier efforts to require cameras to go “click” whenever a photo is taken in order to make surreptitious photography more difficult. However, these mechanisms shift the burden onto non-users, who must constantly monitor their surroundings lest they be recognized without their consent.

In its report last year on best practices for facial recognition technology, the FTC specifically addressed scenarios where companies use facial recognition to identify anonymous images of a consumer to someone who could not otherwise identify him or her, pointing to mobile apps that could permit users to surreptitiously discover information about people on the street. Noting “the significant privacy and safety risks that such an app would raise,” the FTC suggested that “only consumers who have affirmatively chosen to participate in such a system should be identified.”

As a practical matter, for now, facial recognition on Glass could be tied to a user’s social network. Information that a user has access to about people out in the world would reflect information shared on that social network. Though a heads-up display could be permitted to recognize only “friends,” it seems inevitable that this technology will creep beyond this sort of artificial barrier. Drawing the line will be incredibly difficult. For example, what reason would there be to exclude professional email contacts or prominent public figures from being identified? With some work, almost anyone who has set foot in a public space can be visually identified. Facial recognition on wearable devices simply lowers this already-diminishing bar. Empowering the general public to affirmatively choose to participate in broad-based, public facial recognition on the scale offered by wearable technologies poses a tremendous challenge to many of our traditional privacy protection tools.
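To make the “friends-only” idea concrete, here is a minimal sketch of how such a gate might work. Everything here is hypothetical: the names, the three-dimensional embeddings, and the threshold are invented stand-ins for the feature vectors a real face-recognition model would produce. The point is only that a device could refuse to attach a label to anyone outside the user’s own network.

```python
import math

# Hypothetical "friends-only" matcher. Embeddings are invented stand-ins
# for the vectors a real face-recognition model would output.

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def identify(embedding, friends, threshold=0.9):
    """Return a friend's name only if the face matches someone in the
    user's own network; everyone else stays anonymous (None)."""
    best_name, best_score = None, threshold
    for name, reference in friends.items():
        score = cosine_similarity(embedding, reference)
        if score > best_score:
            best_name, best_score = name, score
    return best_name  # None means "not in your network": show no label

friends = {"Alice": [0.9, 0.1, 0.2], "Bob": [0.1, 0.8, 0.3]}
print(identify([0.88, 0.12, 0.21], friends))  # close to Alice's vector: "Alice"
print(identify([0.0, 0.1, 0.9], friends))     # a stranger: None
```

The artificiality of the barrier is visible in the code itself: nothing but the contents of the `friends` dictionary separates a “friend” from a stranger, and expanding that dictionary to email contacts or public figures is a one-line change.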

Stopping the collection of this information may prove impossible. Even as Google has pledged to limit facial recognition abilities on Glass, Lambda Labs, which provides facial recognition services, has indicated that facial recognition is “a core feature” of wearable technology and that “Google will allow it or be replaced with something that does.” A comprehensive opt-out program is one potential solution, but such a system could create further privacy problems by requiring the collection of facial information so that the application “knows” to ignore that face in the future. Another option could be for other wearable tech to send signals not to identify an individual’s face, creating a Google Glass duel of sorts.
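The opt-out paradox can be illustrated with a short sketch. Assume, hypothetically, that the registry stores only a hash of a coarsely quantized face embedding rather than the face itself; even then, honoring a “do not identify me” request requires retaining something derived from that person’s face, so the opt-out list is itself a collection of facial data. All names and values below are invented.

```python
import hashlib

# Illustrative sketch of the opt-out paradox: to ignore a face later,
# the system must keep a fingerprint derived from that face now.

def face_fingerprint(embedding, precision=1):
    """Quantize an embedding so near-identical readings hash alike,
    then hash the result."""
    quantized = tuple(round(x, precision) for x in embedding)
    return hashlib.sha256(repr(quantized).encode()).hexdigest()

opt_out_registry = set()

def register_opt_out(embedding):
    # The registry itself now holds facial data, albeit hashed.
    opt_out_registry.add(face_fingerprint(embedding))

def may_identify(embedding):
    return face_fingerprint(embedding) not in opt_out_registry

register_opt_out([0.31, 0.72, 0.18])
print(may_identify([0.31, 0.72, 0.18]))  # False: this face opted out
print(may_identify([0.55, 0.20, 0.90]))  # True: not in the registry
```

Hashing softens but does not resolve the paradox: the registry still had to observe each opted-out face once, and anyone holding the registry can test whether a given face is in it.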

However, the difficulty of stopping or restricting facial data collection suggests that regulating potential uses could be more productive. We could attempt to draw distinctions among what facial recognition is being used to accomplish: is it being used to assist or augment the user’s memory? For example, using facial recognition technology to help recall a long-absent relative could be distinguished from pulling in additional data sources to learn about a stranger as you sit across from them at a table. Further, facial recognition applications could provide information based on contextual cues, such as identifying managers and staff at a restaurant while ignoring other patrons. In the end, applications will need to specifically enumerate how they will use the facial data they collect.
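The idea of enumerated uses can be sketched as a simple policy gate: an application declares, up front, the purposes for which it will use facial data, and every lookup must name one of them. The purpose strings and function names below are hypothetical illustrations, not any real API.

```python
# Hypothetical use-based enumeration: every lookup of facial data must
# cite a purpose the application declared in advance.

ALLOWED_PURPOSES = {"recall_known_contact", "identify_venue_staff"}

def lookup_identity(face_id, purpose):
    if purpose not in ALLOWED_PURPOSES:
        raise PermissionError(f"undeclared use of facial data: {purpose}")
    # ... the actual, permitted lookup would happen here ...
    return f"lookup of {face_id} permitted for {purpose}"

print(lookup_identity("face-123", "recall_known_contact"))
try:
    lookup_identity("face-123", "targeted_advertising")
except PermissionError as err:
    print(err)  # undeclared use of facial data: targeted_advertising
```

A gate like this only works, of course, if the declared purposes are audited against what the application actually does; the enumeration is a regulatory hook, not a technical guarantee.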

Both software developers and device manufacturers need to think creatively about how to establish guidelines around facial recognition technology. The alternative is a complete loss of anonymity in public, or a complete transformation of the public sphere into a place where individuals must cover up, lower their gazes, and avert their eyes—all actions that seem contrary to Google Glass’ effort to present individuals with new ways to experience our world.

-Joseph Jerome, Legal & Policy Fellow

