Posts Tagged ‘privacy’

Public Perceptions on Privacy

Today’s new report by the Pew Research Center gives the lie to the notion that privacy is unimportant to the average American. Instead, the big takeaway is that individuals feel they lack any control over their personal information. These feelings are directed at the public and private sectors alike, and they suggest a profound trust gap is emerging in the age of big data.

While Pew has framed its report as a survey of Americans’ attitudes post-Snowden, the report presents a number of alarming statistics of which businesses ought to take note. Advertisers take the brunt of the criticism, and the report as a whole suggests that public concerns about data brokers and the opacity of data collection are only growing. Seventy-one percent of respondents say that advertisers can be trusted only some of the time, and 16% say they never can. These numbers hold across every demographic group and, indeed, get worse among lower-income households. Eighty percent of social network users are concerned about their information being shared with unknown third parties. Even as Americans worry about government access to personal information, they increasingly support more regulation of advertisers, and that support is strong across an array of demographic groups.

Further, even as consumers remain willing to trade personal information for access to free Internet services, two-thirds of consumers reject the suggestion that online services work because of the increased access those services have to personal information. More problematic, however, is that 91% of Americans now believe that “consumers have lost control over how personal information is collected and used by companies.” Though this Pew study does not show that privacy values are trumping digital services — and every indication suggests that they are not — it is a likely topic for Pew to return to in the future, and it will be interesting to see whether this anxiety translates into action.

However, in the meantime, this anxiety about privacy suggests an opportunity for companies to win with consumers simply by providing them with more control. Fully 61% of respondents “would like to do more” to protect their online privacy. We have repeatedly called for efforts to “featurize” data and have supported initiatives that help consumers engage with their personal information. Many companies already provide meaningful controls over the collection and use of personal information; the challenge is both making consumers aware of these options and ensuring that using these dashboards and toggles is as fun as using a simple app.

So we need more tools to make privacy fun. And industry may also need to do a better job of staying attuned to consumer preferences. Pew reiterates how context-dependent privacy is: the value consumers place on privacy and their interest in protecting it can vary widely from person to person, across contexts and transactions, and, perhaps most pointedly, in response to current events. “[U]sers bounce back and forth between different levels of disclosure depending on the context,” the report argues.

The challenge is ensuring that context is understood similarly by all parties. Part of this is understanding where and when personal information is sensitive. This is a debate that was highlighted at the FTC’s recent big data workshop, and is a theme that increasingly arises in conversations about big data and civil rights. Aside from Social Security numbers, which 95% of respondents considered to be sensitive information, data ranging from health information and phone and email message content to location information and birth date could be viewed as sensitive depending upon the context.

Depending upon context, everything is sensitive or nothing is. Obviously, this can be a tricky balancing act for consumers to manage: information management requires users to juggle different online personas, platforms, and audiences. Thus, the door is open for companies either to take certain information off the table or to make a better case for why some sensitive information is invaluable for certain services.

While Pew has not shown whether these privacy anxieties trump other pressing economic or social concerns, the report does suggest that Americans’ perceptions of privacy are heavily intertwined with their understanding of security. Privacy may be amorphous, but security is less so, and being proactive on one can often be a boon to the other. Positive, proactive public action on privacy is essential if we are to reverse Americans’ doubts that they can trust sharing their personal information.

-Joseph Jerome, Policy Counsel

“Databuse” as the Future of Privacy?

Is “privacy” such a broad concept as to be meaningless from a legal and policy perspective? On Tuesday, October 14th, the Center for Democracy & Technology hosted a conversation with Benjamin Wittes and Wells Bennett, frequent contributors to the national security blog Lawfare, to discuss their recent scholarship on “databuse” and the scope of corporate responsibilities for personal data.

Coming from a world of FISA and ECPA, and the detailed statutory guidance that accompanies privacy in the national security space, Wittes noted that privacy law on the consumer side is vague and amorphous, and largely “amounts to don’t be deceptive and don’t be unfair.” Part of the challenge, as a number of privacy scholars have noted, is that privacy encompasses a range of different social values and policy judgments. “We don’t agree what value we’re protecting,” Wittes said, explaining that government privacy policies rest on values and distinctions, such as national borders and the citizen/non-citizen divide, that actually mean something.

Important distinctions are much harder to find in consumer privacy. Wittes’ initial work on “databuse” in 2011 was considerably broader and more provocative, applying to all data controllers, first and third party alike, but his follow-up work with Bennett attempted to limit its scope to the duties owed to consumers exclusively by first parties. According to the pair, this core group of duties “lacks a name in the English language” but “describe[s] a relationship best seen as a form of trusteeship.”

Looking broadly at law and policy around data use, including FTC enforcement actions, the pair argue that there is broad consensus that corporate custodians face certain obligations when holding personal data, including (1) obligations to keep it secure, (2) obligations to be candid and straightforward with users about how their data is being exploited, (3) obligations not to materially misrepresent their uses of user data, and (4) obligations not to use that data in ways injurious to or materially adverse to the users’ interests without their explicit consent. According to Wittes, this core set of requirements better describes reality than any sort of “grandiose conception of privacy.”

“When you talk in the broad language of privacy, you promise consumers more than the legal and enforcement system can deliver,” Wittes argued. “If we want useful privacy policy, we should focus on this core,” he continued, noting that most of these requirements are not directly required by statute.

Bennett detailed how data uses fall into three general categories. The first, a “win/win” category, describes cases where the interests of businesses and consumers align; he cited the many uses of geolocation information on mobile devices as a good example. The second category covers cases where businesses directly benefit but consumers face a neutral value proposition, and Bennett suggested online behavioral advertising fits here. The third category covers uses where businesses benefit at consumers’ expense, and he argued that regulatory action would be appropriate to limit these behaviors.

Bennett further argued that this categorization fits well with FTC enforcement actions, if not the agency’s privacy rhetoric. “FTC reports often hint at subjective harms,” Bennett explained, but most of the Commission’s actions target objective harms caused to consumers by companies.

However, the broad language of “privacy” distorts what harms the pair believe regulators — and consumers, as well — are legitimately concerned about. Giving credit to CDT for initially coining the term “databuse,” Wittes defines the term as follows:

[T]he malicious, reckless, negligent, or unjustified handling, collection, or use of a person’s data in a fashion adverse to that person’s interests and in the absence of that person’s knowing consent. . . . It asks not to be left alone, only that we not be forced to be the agents of our own injury when we entrust our data to others. We are asking not necessarily that our data remain private; we are asking, rather, that they not be used as a sword against us without good reason.

CDT’s Justin Brookman, who moderated the conversation, asked whether (or when) price discrimination could turn into databuse.

“Everyone likes [price discrimination] when you call it discounts,” Wittes snarked, explaining that he was “allergic to the merger of privacy and antidiscrimination laws.” Where personal data was being abused or unlawful discrimination was transpiring, Wittes supported regulatory involvement, but he was hesitant to see both problems as falling into the same category of concern.

The conversation quickly shifted to a discussion of the obligations of third parties — or data brokers generally — and Wittes and Bennett acknowledged they dealt with the obligations of first parties because it’s an easier problem. “We punted on third parties,” they conceded, though Wittes’ background in journalism led him to question how “data brokers” are functionally different from the press. “I haven’t thought enough about the First Amendment law,” he admitted, but he wasn’t sure what principle would allow advocates to distinguish “good” third parties from “bad” ones.

But if the pair’s theory of “databuse” can’t answer every question about privacy policy, at least we might admit the term should enter the privacy lexicon.

-Joseph Jerome, Policy Counsel

Interest-Based Ads and More Transparency

Facebook Ads

Facebook wasn’t doing interest-based advertising until now? Huh?

Most users of Facebook know that the ads they see are selected by Facebook based on information in their profile, what they have “liked,” and interests they have selected. Most have also noticed that if they visit a website off Facebook, like Zappos, they may get “retargeted” ads for Zappos on Facebook. Similarly, Facebook works with online and offline retailers to help them buy ads on Facebook aimed at users who have been their customers.

Today Facebook, with much fanfare, announced that it is launching an interest-based advertising program. What’s new? Well, the one thing Facebook hasn’t been doing is selling ads targeted based on the websites and apps you use outside of Facebook. An individual advertiser could buy an ad based on your visit to its particular site, but advertisers couldn’t buy an ad based on your visits to many sites across the web. Now they can.

Got it? Ads on Facebook are selected in an attempt to make them relevant, based on your profile and your activity off Facebook. And now they will use even more of that off-Facebook activity.

What is also new is a major effort to show users extensive detail about the many categories used to select ads, and to let users add or edit many of those interest categories. This is one of the most extensive moves we have seen to give users a deep look at the data used to target ads, and it should make some users feel more in control of the experience.

Don’t like it? Click on the icon on every targeted ad and turn off the interest-based targeting. On mobile, use the limit ad tracking settings on iOS or Android (which will actually tell all apps, not just Facebook, that you don’t want interest-based ads).
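For readers curious how that mobile setting reaches apps, here is a minimal sketch, assuming an Android app that bundles the Google Play Services ads library; it reads the device-level “limit ad tracking” flag through the AdvertisingIdClient API. The helper class name is our own invention, and this is only an illustration of how an app could honor the setting, not a description of Facebook’s implementation.

import android.content.Context;

import com.google.android.gms.ads.identifier.AdvertisingIdClient;
import com.google.android.gms.ads.identifier.AdvertisingIdClient.Info;

// Hypothetical helper (the class name is ours); it reads the user's
// advertising ID settings through the Google Play Services ads library.
public final class LimitAdTrackingChecker {

    // Returns true if the user has asked to limit ad tracking on this device.
    // Call this off the main thread; it may block while talking to Play Services.
    public static boolean isLimitAdTrackingEnabled(Context context) {
        try {
            Info adInfo = AdvertisingIdClient.getAdvertisingIdInfo(context);
            // When this flag is true, an app should not use the advertising ID
            // to build interest profiles or serve interest-based ads.
            return adInfo.isLimitAdTrackingEnabled();
        } catch (Exception e) {
            // If Play Services is unavailable, err on the side of not tracking.
            return true;
        }
    }
}

An app that respects the flag would check this value before requesting or serving interest-based ads.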

Privacy legal fights should focus on intrusion, not hurt feelings

Please see FPF Advisory Board member Neil M. Richards in “Privacy legal fights should focus on intrusion, not hurt feelings,” an article from the Washington University in St. Louis Newsroom by Jessica Martin. Richards discusses how American privacy law, created in the 19th and 20th centuries, is an inadequate guide for 21st-century privacy battles. Richards, JD, is a privacy law expert and professor at Washington University in St. Louis School of Law.

For the full article, click here.

Privacy Legislation Low on Legislators’ List of Priorities

Rep. Anna Eshoo (D-CA) participated in the State of the Net West conference on Tuesday of this week, where she said prospects were bleak for any privacy legislation to make it through Congress this year. Even though online privacy tops Eshoo’s list of technology priorities, she believes it ranks much lower on Congress’ list. With debates centering on the national economy, jobs, and the European economy, legislators most likely will not have time to rewrite privacy legislation. However, the congresswoman did say she wants companies to work toward adopting transparent, user-friendly privacy policies, including policies that protect children.

For the full article on the conference from Palo Alto Online, click here.

Privacy Calendar

Jan 26 @ 8:30 am – Jan 27 @ 4:15 pm: Privacy as a Profit Center: Leve... @ Old Slip by Convene
Learn how those on the leading edge of privacy governance and digital innovation from companies including Cigna, Cisco Systems, eBay Inc. Public Policy Lab, FocusMotion, Ghostery, Goodyear Tire & Rubber Company, Google, HP Enterprise Security Products, JPMorgan[...]

Jan 28 – Jan 29 (all day): Data Privacy Day
“Data Privacy Day began in the United States and Canada in January 2008, as an extension of the Data Protection Day celebration in Europe. The Day commemorates the 1981 signing of Convention 108, the first[...]

Mar 4 – Mar 6 (all day): Global Privacy Summit 2015
For more information, click here.

Mar 10 @ 6:00 pm – 9:00 pm: CDT Annual Dinner “TechProm” 2015
Featuring the most influential minds of the tech policy world, CDT’s annual dinner, TechProm, highlights the issues your organization will be facing in the future and provides the networking opportunities that can help you tackle[...]

Mar 13 (all day): BCLT Privacy Law Forum
This program will feature leading academics and practitioners discussing the latest developments in privacy law. UC Berkeley Law faculty and conference panelists will discuss cutting-edge scholarship and explore ‘real world’ privacy law problems. Click here[...]

May 27 (all day): PL&B’s Asia-Pacific Roundtable (...
PROFESSOR GRAHAM GREENLEAF, Asia-Pacific Editor, Privacy Laws & Business International Report, will lead a roundtable on the countries of most interest to business in the Asia-Pacific region. Click here for more information.

Jul 6 – Jul 8 (all day): PL&B’s 28th Annual International...
The Privacy Laws & Business 27th Annual International Conference featured more than 40 speakers and chairs from many countries over 3 intensive days. At the world’s longest running independent international privacy event participants gained professionally by[...]

View Calendar