Looking at Privacy Protections for Facial Recognition

On Sunday, Google announced that it would not allow facial recognition applications on Google Glass until “strong privacy protections” were in place. But this announcement raises the very question: what sort of privacy protections can actually be put in place for this sort of technology?

Thus far, concerns about facial recognition technology have appeared within the context of “tagging” images on Facebook or its potential to transform marketing, but these interactions are largely between users and service providers. Facial recognition on the scale offered by wearable technology such as Google Glass can change how we navigate the outside world. As one commenter put it, notice and consent mechanisms can protect Glass users, but they cannot govern how the user himself puts the device to use on the people around him.

Many suggestions have focused on sending signals to the outside world that Glass is at work, such as blinking lights or other visual and audio cues. This is similar to efforts such as requiring cameras to go “click” whenever a photo is taken in order to make surreptitious photography more difficult. However, these sorts of mechanisms place the responsibility on non-users to constantly be aware of their surroundings lest they be recognized without their approval.

In its report last year on best practices for facial recognition technology, the FTC specifically addressed scenarios where companies use facial recognition to identify anonymous images of a consumer to someone who could not otherwise identify him or her, pointing to mobile apps that could permit users to surreptitiously discover information about people on the street. Noting “the significant privacy and safety risks that such an app would raise,” the FTC suggested that “only consumers who have affirmatively chosen to participate in such a system should be identified.”

As a practical matter, for now, facial recognition on Glass could be tied to a user’s social network. Information that a user has access to about people out in the world would reflect information shared on that social network. Though a heads-up display could be permitted to recognize only “friends,” it seems inevitable that this technology will creep beyond this sort of artificial barrier. Drawing the line will be incredibly difficult. For example, what reason would there be to exclude professional email contacts or prominent public figures from being identified?  With some work, almost anyone who has set foot in a public space can be visually identified. Facial recognition on wearable devices simply lowers this already-diminishing bar. Empowering the general public to affirmatively choose to participate in broad-based, public facial recognition on the scale offered by wearable technologies poses a tremendous challenge to many of our traditional privacy protection tools.
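
To make the “friends only” boundary concrete, the following is a minimal sketch, in Python, of how such a matching step might be limited to a user’s confirmed contacts. Everything here is an assumption for illustration: the Contact structure, the precomputed embeddings, the cosine-similarity comparison, and the 0.6 threshold are hypothetical and do not reflect any actual Glass or Google API.

```python
# Illustrative sketch only: restrict recognition to contacts who are
# confirmed "friends." The data model and threshold are assumptions.
from dataclasses import dataclass
from typing import List, Optional

import numpy as np


@dataclass
class Contact:
    name: str
    is_friend: bool          # has this person opted in as a "friend"?
    embedding: np.ndarray    # precomputed face embedding (hypothetical)


def identify(face: np.ndarray,
             contacts: List[Contact],
             threshold: float = 0.6) -> Optional[str]:
    """Return a name only if the best match is a confirmed friend."""
    best_name, best_score = None, threshold
    for c in contacts:
        if not c.is_friend:
            continue         # enforce the "friends only" boundary
        score = float(np.dot(face, c.embedding) /
                      (np.linalg.norm(face) * np.linalg.norm(c.embedding)))
        if score > best_score:
            best_name, best_score = c.name, score
    return best_name         # None means "leave this person unlabeled"
```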

Stopping the collection of this information may prove impossible. Even as Google has pledged to limit facial recognition abilities on Glass, Lambda Labs, which provides facial recognition services, has indicated that facial recognition is “a core feature” of wearable technology and that “Google will allow it or be replaced with something that does.” A comprehensive opt-out program is one potential solution, but such a system could create further privacy problems by requiring the collection of facial information in order for the application to “know” to ignore that face in the future. Another option could be for other wearable tech to send signals not to identify an individual’s face, creating a Google Glass duel of sorts.
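
The opt-out paradox in the paragraph above can be made concrete with a short, purely hypothetical sketch: in order to “ignore” someone, the device must already hold enough facial data to recognize them on sight. The registry of stored embeddings and the similarity threshold are illustrative assumptions, not a description of any real system.

```python
# Illustrative sketch of the opt-out paradox: honoring an opt-out
# requires storing a facial template for every person who opts out.
from typing import List

import numpy as np


def should_suppress(face: np.ndarray,
                    opt_out_registry: List[np.ndarray],
                    threshold: float = 0.6) -> bool:
    """Return True if the face matches someone on the opt-out registry."""
    for stored in opt_out_registry:
        similarity = float(np.dot(face, stored) /
                           (np.linalg.norm(face) * np.linalg.norm(stored)))
        if similarity > threshold:
            # Identification is suppressed, but only because the system
            # already holds this person's facial data.
            return True
    return False
```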

However, the difficulty of stopping or restricting facial data collection suggests that a focus on regulating potential uses could be more productive. We could attempt to draw distinctions based on what facial recognition is being used to accomplish: is it being used to assist or augment the user’s memory? For example, using facial recognition technology to help recall a distant, long-absent relative could be distinguished from pulling in additional data sources to learn about someone as you sit across from them at a table. Further, facial recognition applications could provide information based on contextual cues, such as identifying the manager and staff at a restaurant while ignoring the other diners. In the end, applications will need to specifically enumerate how they will use the facial data they collect.
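
As a rough sketch of what use-based regulation might look like at the platform level, the snippet below gates identification on an app’s declared purpose and the current context. The purpose categories, roles, and venue model are hypothetical, chosen only to illustrate the idea of enumerated uses.

```python
# Illustrative sketch only: gate identification on a declared purpose
# and contextual cues. Purposes, roles, and contexts are assumptions.
from enum import Enum, auto


class Purpose(Enum):
    MEMORY_ASSIST = auto()      # help recall people the user already knows
    CONTEXTUAL_STAFF = auto()   # identify staff within a declared venue
    OPEN_LOOKUP = auto()        # unrestricted lookup of strangers


ALLOWED_PURPOSES = {Purpose.MEMORY_ASSIST, Purpose.CONTEXTUAL_STAFF}


def may_identify(purpose: Purpose, role: str, venue: str) -> bool:
    """Allow identification only for enumerated, context-appropriate uses."""
    if purpose not in ALLOWED_PURPOSES:
        return False             # e.g. OPEN_LOOKUP is never permitted
    if purpose is Purpose.CONTEXTUAL_STAFF:
        return venue == "restaurant" and role in {"manager", "staff"}
    return True                  # MEMORY_ASSIST defers to friends-only matching


# The manager can be identified; a fellow diner cannot.
assert may_identify(Purpose.CONTEXTUAL_STAFF, "manager", "restaurant")
assert not may_identify(Purpose.CONTEXTUAL_STAFF, "diner", "restaurant")
```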

Both software developers and device manufacturers need to think creatively about how to establish guidelines around facial recognition technology. The alternative is a complete loss of anonymity in public, or a complete transformation of the public sphere into a place where individuals must cover up, lower their gazes, and avert their eyes—all actions that seem contrary to Google Glass’ effort to present individuals with new ways to experience our world.

-Joseph Jerome, Legal & Policy Fellow

Comments

Posted by Sizwe on Jul 04, 2013:

Looking at the recent Snowden revelations, e.g. at http://www.wonkie.com/2013/07/04/privacy-vs-gambling-with-security/, I find it increasingly difficult to trust a company like Google. While the technology may make for great social interactions, the potential for it to be misused for spying and the like is enormous.
