
To Track or 'Do Not Track': Advancing Transparency and Individual Control in Online Behavioral Advertising

TL;DR: The past decade's proliferation of online data collection has led businesses to employ increasingly sophisticated technologies to track and profile individual users; the use of online behavioral tracking for advertising purposes has drawn criticism from journalists, privacy advocates, and regulators.
Abstract: The past decade has seen a proliferation of online data collection, processing, analysis and storage capacities leading businesses to employ increasingly sophisticated technologies to track and profile individual users. The use of online behavioral tracking for advertising purposes has drawn criticism from journalists, privacy advocates and regulators. Indeed, the behavioral tracking industry is currently the focus of the online privacy debate. At the center of the discussion is the Federal Trade Commission’s Do Not Track (DNT) proposal. The debate raging around DNT and the specific details of its implementation disguises a more fundamental disagreement among stakeholders about deeper societal values and norms. Unless policymakers address this underlying normative question – is online behavioral tracking a social good or an unnecessary evil – they may not be able to find a solution for implementing user choice in the context of online privacy. Practical progress advancing user privacy will be best served if policymakers and industry focus their debate on the desirable balance between efficiency and individual rights and if businesses implement tracking mechanisms fairly and responsibly. Policymakers must engage with these underlying normative questions; they cannot continue to sidestep these issues in the hope that “users will decide” for themselves.

Summary (5 min read)

To Track or “Do Not Track”: Advancing Transparency and Individual Control in Online Behavioral Advertising

  • Online advertising is greatly enhanced by the ability to analyze and measure the effectiveness of ad campaigns and by online behavioral tracking, which tracks users’ online activities in order to deliver tailored ads to that user.
  • Granted, both sides of the online behavioral tracking debate may be guilty of policy laundering: the industry, for holding out users’ vacuous, uninformed consent as a basis for depicting tracking as a voluntary practice; and privacy advocates, for proposing opt-in rules in order to decimate the data-for-service value exchange.
  • Rosch recently suggested that the potential downsides of regulatory initiatives include “the loss of relevancy, the loss of free content, the replacement of current advertising with even more intrusive advertising.”
  • While several years ago the only way to disable an airbag was to have it generally deactivated at a garage, techniques are offered today that allow simple deactivation and reactivation.

II. ONLINE TRACKING DEVICES

  • Online tracking technologies have been progressing rapidly, from cookies to “super cookies,” to browser fingerprinting and device identifiers.
  • In addition, today information can be collected and stored with considerable ease and at low costs.
  • This Part describes the main tracking technologies, noting their relative transparency to users and how amenable they are to user control. See Kenneth Cukier, Data, Data Everywhere, ECONOMIST (SPECIAL REPORT), Feb. 27, 2010, at 2 (“The amount of digital information increases tenfold every five years.”).

A. COOKIES

  • Today, many people may be aware that their web browsing activity over time and across sites can be tracked using browser, or HTTP, cookies.
  • In a series of empirical research projects, Joseph Turow, Chris Hoofnagle, Jennifer King and others have uncovered a striking degree of “privacy illiteracy” on the part of online users.
  • Hence, a first-party website may use tracking technology, such as cookies or other means, to observe the person’s interaction.
  • Browsers now provide users a degree of control over cookies, including allowing users to block all cookies, or only those cookies that were shared with third parties, and to selectively enable or disable cookies on a site-by-site basis.
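The cross-site mechanics described in these bullets can be sketched in a few lines. This is a hypothetical model, not code from the article: the `AdServer` and `Browser` classes and the domain names are invented for illustration, but the flow is the one described above, where a third party embedded on many sites sets an identifier once and receives it back on every subsequent page view.

```python
# Hypothetical sketch: a third-party ad server correlates one browser's
# visits across unrelated first-party sites, using only a cookie it set
# on its first response.
import uuid

class AdServer:
    def __init__(self):
        self.profiles = {}  # cookie id -> pages on which its ad was served

    def serve_ad(self, cookie, page):
        if cookie is None:                     # first sighting: mint an id
            cookie = str(uuid.uuid4())
        self.profiles.setdefault(cookie, []).append(page)
        return cookie                          # "Set-Cookie" on the response

class Browser:
    def __init__(self):
        self.cookie_jar = {}  # third-party domain -> stored cookie value

    def visit(self, page, ad_server):
        # The page embeds an ad; the browser replays the stored cookie.
        stored = self.cookie_jar.get("ads.example")
        self.cookie_jar["ads.example"] = ad_server.serve_ad(stored, page)

ads = AdServer()
user = Browser()
for page in ["news.example/politics", "shop.example/shoes", "health.example/flu"]:
    user.visit(page, ads)

# One identifier now links browsing across three unrelated sites.
profile = next(iter(ads.profiles.values()))
print(profile)  # ['news.example/politics', 'shop.example/shoes', 'health.example/flu']
```

Blocking third-party cookies, in this model, simply means never replaying `stored`, so each visit looks like a fresh, unlinkable browser.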

B. FLASH COOKIES

  • Recent news reports, as well as class action lawsuits, alleged online advertisers misused Flash cookies, or “local shared objects,” to store information about users’ web browsing history, employing Flash cookies in a way unrelated to the delivery of content through the Flash Player.
  • “[E]rasing HTTP cookies, clearing history, erasing the cache,” or even using the “Private Browsing” mode added to most browsers, still allows Flash cookies to operate fully.
  • While Flash cookies have been the focus of litigation, similar tracking results can be obtained with other types of local storage, such as Microsoft’s Silverlight framework, HTML5 databases, and ETags.
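The “respawning” behavior alleged in those reports and lawsuits can be illustrated with a minimal sketch. The names and the identifier below are hypothetical; the point is only that an identifier mirrored into Flash local storage survives the browser’s “clear cookies” control and can silently restore the deleted HTTP cookie.

```python
# Hypothetical sketch of cookie "respawning": a tracking id copied into a
# Flash local shared object (LSO) outlives deletion of the HTTP cookie
# and is used to recreate it on the next visit.
http_cookies = {"tracker_id": "user-7f3a"}   # browser cookie store
flash_lso = {}                                # Flash local storage

def on_page_load():
    if "tracker_id" in http_cookies:
        # Normal visit: mirror the HTTP cookie into the Flash LSO.
        flash_lso["tracker_id"] = http_cookies["tracker_id"]
    elif "tracker_id" in flash_lso:
        # Respawn: the user cleared HTTP cookies, so restore from the LSO.
        http_cookies["tracker_id"] = flash_lso["tracker_id"]

on_page_load()
http_cookies.clear()   # user "clears cookies" in the browser UI;
                       # the Flash LSO sits outside that control
on_page_load()
print(http_cookies)    # {'tracker_id': 'user-7f3a'} — the id is back
```

The same two-step pattern applies to the other storage mechanisms mentioned above (Silverlight isolated storage, HTML5 databases, ETags): any store the browser’s cookie controls do not reach can serve as the backup copy.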

C. BROWSER FINGERPRINTING

  • Initially deployed by banks to prevent identity fraud and by software companies to preclude illegal copying of computer software, browser fingerprinting is also a powerful tracking technique.
  • By gathering seemingly innocuous bits of information, such as a browser’s version number, plugins, operating system, and language, websites can uniquely identify (“fingerprint”) a browser and, by proxy, its user.
  • The information gathered “could include a user’s location, time zone, photographs, text from blogs, shopping cart contents, e-mails and a history of the Web pages visited.”
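A minimal sketch of the fingerprinting idea follows. Each attribute is common on its own, but their combination, hashed into a single identifier, can be distinctive enough to recognize one browser without storing anything on the user’s machine. The attribute names and values are illustrative, not taken from any real fingerprinting service.

```python
# Hypothetical sketch: combine "innocuous" browser attributes into a
# stable identifier, as browser fingerprinting does.
import hashlib

def fingerprint(attrs: dict) -> str:
    # Order the attributes deterministically and hash the concatenation,
    # so the same configuration always yields the same id.
    canonical = "|".join(f"{k}={attrs[k]}" for k in sorted(attrs))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

visitor = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64) Firefox/8.0",
    "language": "en-US",
    "timezone": "UTC-6",
    "screen": "1920x1080x24",
    "plugins": "Flash 11.1;QuickTime;Java 1.6",
}

fp = fingerprint(visitor)
# The same configuration is recognized on a return visit, cookie or not...
assert fp == fingerprint(dict(visitor))
# ...while a single differing attribute yields a different identifier.
assert fp != fingerprint(dict(visitor, timezone="UTC+1"))
```

Unlike a cookie, there is nothing for the user to delete here, which is why the article groups fingerprinting with the least transparent, least controllable techniques.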

D. MOBILE DEVICES

  • Mobile browsing is expected to surpass fixed Internet use in the next few years, rendering the tracking of users of mobile devices, including phones and tablets, increasingly important.
  • Mobile apps thus replace browsers and search engines as the main entry gate to the mobile Internet.
  • The logic underpinning the blanket immunity granted to online intermediaries under Section 230 of the Communications Decency Act76 applies in similar force here.

E. DEEP PACKET INSPECTION

  • One technology that has created significant concern when used for online behavioral tracking is deep packet inspection (DPI).
  • As a result, the leading United States company in the DPI business, NebuAd, folded.

F. HISTORY SNIFFING

  • Browser history sniffing exploits the functionality of browsers that display hyperlinks of visited and non-visited sites in different colors (blue for unvisited sites; purple for visited).
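The attack logic behind history sniffing can be modeled in a few lines. This sketch only simulates the browser behavior described above — in a real attack, a page inserts links to URLs it wants to probe and reads back each link’s rendered color via JavaScript; the URLs and the `rendered_color` stand-in here are hypothetical.

```python
# Hypothetical model of history sniffing: the visited/unvisited link
# colors leak whether a probed URL is in the browser's history.
VISITED_COLOR, UNVISITED_COLOR = "purple", "blue"

def rendered_color(url, browser_history):
    # Stand-in for reading a link's computed style from the page.
    return VISITED_COLOR if url in browser_history else UNVISITED_COLOR

def sniff_history(probe_urls, browser_history):
    # The sniffing page never sees the history directly; it only
    # observes which probe links the browser painted purple.
    return [u for u in probe_urls
            if rendered_color(u, browser_history) == VISITED_COLOR]

history = {"bank.example", "news.example"}
probes = ["bank.example", "shop.example", "news.example"]
print(sniff_history(probes, history))  # ['bank.example', 'news.example']
```

Modern browsers close this channel by lying to scripts about the computed style of visited links, which is equivalent to making `rendered_color` return the unvisited color unconditionally.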

III. USES OF TRACKING

  • The collection, retention, use and transfer of information about online users come in many guises.
  • In order to maintain a stable equilibrium between user expectations and the legitimate needs of online businesses, the market must reinforce mechanisms for transparency and user control over online behavioral tracking, while at the same time not overly impeding the fundamental business model of the Internet economy.
  • In a recent research paper, Howard Beales, former Director of the Bureau of Consumer Protection at the FTC, asserted that the price of behaviorally targeted advertising was 2.68 times greater than the price of untargeted ads.

A. FIRST PARTY TRACKING

  • This concept of a first party has largely been the result of users’ perception as to what constitutes a first party.
  • Examples of first parties include websites that track users to support billing, complete online transactions, personalize user experience and website design, provide product recommendations and shopping cart services, tailor content and target their own products or services.
  • When a user signs on to Amazon and enters a username and password, the system will match that sign-on information to saved preferences and personalize the experience for that user, maintaining her shopping cart and providing personalized product recommendations.
  • The self-regulatory principles proposed by the Federal Trade Commission also exclude from their scope any non-advertising behavioral targeting, contextual advertising, and first party tracking.

B. ANALYTICS

  • Many website owners use third-party analytics tools to evaluate traffic on their own websites.
  • This activity is not considered “online behavioral tracking”—even though the data is collected by a third party—because the information collected relates exclusively to traffic on the first party’s site.

C. MEASUREMENT

  • Given that the online ecosystem is supported by advertising, websites, advertisers and ad intermediaries must use various tools to measure user engagement and the effectiveness of ad campaigns.
  • Many ad networks use the same cookie for web measurement that they do for online behavioral tracking, so the opt-out they provide for tracking does limit collection for measurement as well.

D. NETWORK SECURITY

  • Websites and ISPs have multiple reasons to log and track the traffic that comes through their systems, including limiting malicious activity such as denial of service attacks and viruses.

E. FRAUD PREVENTION AND LAW ENFORCEMENT

  • Various laws and regulations allow, or even require, websites and online intermediaries to track users and maintain profiles for purposes of fraud prevention, anti-money laundering, national security and law enforcement.
  • In the European Union, “providers of publicly available electronic communications services or of a public communications network” must retain “traffic data and location data and the related data necessary to identify” subscribers or users for a period no less than six months and no more than twenty-four months.

IV. REGULATING ONLINE TRACKING

  • The regulatory framework for both online and offline privacy is currently in flux.
  • This led governments, regulators, and industry leaders in the European Union and United States to introduce new regulatory and self-regulatory frameworks applicable to online behavioral tracking.

A. EUROPE

  • In Europe, the legal framework applying to online behavioral tracking consists of the European Data Protection Directive—which regulates the collection, processing, storage and transfer of personal data—and the European e-Privacy Directive, which regulates data privacy on communication networks.

C. SELF-REGULATION

  • Partly due to sparse legislation and partly a deliberate policy choice, the FTC has over the years promoted industry self-regulation in the field of online behavioral tracking.
  • At this point in time, it appears that self-regulation has not yet been successful in relaxing consumers’ concerns about privacy, fulfilling businesses’ interest in clarity, and satisfying regulators’ calls for additional enforcement tools.

V. PROPOSALS FOR REGULATORY REFORM

  • Additional criticism is pointed at the EASA recommended compliance and enforcement mechanism.
  • Wired magazine noted in August 2009 that attempts at self-regulation by the online behavioral tracking and advertising industry “have conspicuously failed to make the industry transparent about when, how and why it collects data about Internet users.”
  • It has been anchored by the FTC Preliminary Report, followed by a swift response from industry, and reinvigorated by a slew of legislative bills.
  • It included the creation for the first time of a dedicated Senate Sub-Committee on Privacy, Technology and the Law, headed by Senator Al Franken (D-MN) and charged with “[o]versight of laws and policies governing the collection, protection, use, and dissemination of commercial information by the private sector, including online behavioral advertising.”

C. DRAFT LEGISLATION

  • The renewed public interest in privacy and online behavioral tracking, spurred by the Wall Street Journal “What They Know” series, FTC and Department of Commerce engagement with the topic, and occasional front-page privacy snafu (e.g., Google Buzz, iPhone location tracking), has led to an unprecedented flurry of activity and legislative proposals on the Hill.
  • The bill would require a covered entity “to offer individuals a clear and conspicuous” opt-out mechanism for any “unauthorized use” of covered information, except for any use requiring opt-in consent.

VI. MOVING FORWARD

  • The general public, meanwhile, often expresses in opinion polls an interest in privacy and aversion towards online behavioral tracking.
  • Why force users to pay, or force them to see the ad before the content (or both)?

B. ENHANCING NOTICE

  • See GHOSTERY, http://www.ghostery.com (last visited Oct. 7, 2011) (“Ghostery tracks the trackers and gives you a roll-call of the ad networks, behavioral data providers, web publishers, and other companies interested in your activity.”).
  • Absent such consensus, labels and privacy notices, visceral or not, will continue to fail in the eyes of those who dispute the merit of the direction users are “nudged.”

C. SHIFTING THE BURDEN TO BUSINESS

  • A better focus for policymakers to take may be shifting the burden of online privacy from users to business, by dimming the highlight on user choice while focusing on businesses’ obligations under the FIPs.
  • Consider, for example, patients’ social networking website PatientsLikeMe.com, which explicitly, conspicuously, and unmistakably holds out to its users a philosophy of openness and use of medical data not only for commercial purposes but also for medical research.
  • Privacy advocates note that children are increasingly subjected to a wide array of behavioral targeting practices through social networks, games, mobile services, and other digital platforms that use techniques evading current legal restrictions.


7 TENE POLONETSKY FINAL_JAD (DO NOT DELETE) 2/28/2012 11:25 AM
To Track or “Do Not Track”: Advancing
Transparency and Individual Control in
Online Behavioral Advertising
Omer Tene* and Jules Polonetsky**
Introduction .............................................................................. 282
II. Online Tracking Devices ...................................................... 288
A. Cookies ........................................................................ 289
B. Flash Cookies ............................................................. 292
C. Browser Fingerprinting ............................................. 294
D. Mobile Devices ........................................................... 296
E. Deep Packet Inspection .............................................. 298
F. History Sniffing .......................................................... 299
III. Uses of Tracking ................................................................. 300
A. First Party Tracking .................................................. 301
B. Analytics ..................................................................... 302
C. Measurement ............................................................. 303
D. Network Security ....................................................... 304
E. Fraud Prevention and Law Enforcement.................. 305
IV. Regulating Online Tracking ............................................... 307
A. Europe ........................................................................ 307
B. United States ............................................................. 313
C. Self-Regulation ........................................................... 314
V. Proposals for Regulatory Reform ......................................... 319
A. The FTC Do Not Track Proposal ............................... 320
© 2012 Omer Tene & Jules Polonetsky
* Associate Professor, College of Management Haim Striks School of Law, Israel; Affiliate Scholar, Stanford Center for Internet and Society; Senior Fellow, Future of Privacy Forum. I would like to thank the College of Management Haim Striks School of Law research fund and the College of Management Academic Studies research grant for supporting research for this article.
** Co-chair and Director, Future of Privacy Forum. The authors would like to thank Christopher Wolf, Michael Birnhack, Boris Segalis and the participants at the Privacy Law Scholars Conference in Berkeley for their helpful comments.

282 MINN. J. L. SCI. & TECH. [Vol. 13:1
B. Industry Proposals ..................................................... 322
C. Draft Legislation ........................................................ 327
1. The Best Practices Act ....................................... 327
2. Commercial Privacy Bill of Rights Act of
2011. .................................................................... 329
3. Consumer Privacy Protection Act of 2011. ........ 331
VI. Moving Forward .................................................................. 332
A. Demystifying Consent ................................................ 335
B. Enhancing Notice ....................................................... 342
C. Shifting the Burden to Business ............................... 347
1. Sensitive Data .................................................... 349
2. Children’s Data ................................................... 351
3. Anonymization/Pseudonymization .................... 352
4. No Discriminatory Non-Marketing Related
Uses ..................................................................... 352
5. Retention ............................................................ 353
6. Access and Rectification ..................................... 355
7. Data Security ...................................................... 355
8. Accountability ..................................................... 356
Conclusion ................................................................................. 356
INTRODUCTION
For many years, Internet users considered online activity to be confidential, their whereabouts protected by a veil of anonymity. This approach was best captured by the famous New Yorker cartoon-cum-adage, “On the Internet, nobody knows you’re a dog.”1 The reality, alas, is quite different. Every search, query, click, page view, and link is logged, retained, analyzed, and used by a host of third parties, including websites (also known as “publishers”), advertisers, and a multitude of advertising intermediaries, including ad networks, ad exchanges, analytics providers, re-targeters, market researchers, and more. Although users may expect that many of their online activities are anonymous, the architecture of the Internet allows multiple parties to collect data and compile user profiles with various degrees of identifying information.2

1. Peter Steiner, On the Internet, Nobody Knows You’re a Dog, NEW YORKER, July 5, 1993, at 61.
2. See generally Omer Tene, Privacy: The New Generations, 1 INT’L DATA PRIVACY LAW 15 (2011), available at http://idpl.oxfordjournals.org/content/1/1/15.full (explaining that online profiling may relate information not to an identified user but rather to an IP address, cookie or device—which, in turn, permit re-identification with various levels of difficulty); Arvind Narayanan & Vitaly Shmatikov, Robust De-anonymization of Large Sparse Datasets, in 2008 IEEE SYMPOSIUM ON SECURITY & PRIVACY 111, 112 (2008) (showing how an “adversary” could identify anonymous Netflix subscribers).

The value created by online advertising, which fuels the majority of free content and services available online, has been immense.3 Online advertising is greatly enhanced by the ability to analyze and measure the effectiveness of ad campaigns and by online behavioral tracking, which tracks users’ online activities in order to deliver tailored ads to that user.4 The more finely tailored the ad, the higher the conversion or “clickthrough” rate and, thus, the revenues of advertisers, publishers, and ad intermediaries.5

3. See FED. TRADE COMM’N, SELF-REGULATORY PRINCIPLES FOR ONLINE BEHAVIORAL ADVERTISING i (2009) [hereinafter OBA REPORT], available at http://www.ftc.gov/os/2009/02/P085400behavadreport.pdf (“This expanding [online] marketplace has provided many benefits to consumers, including free access to rich sources of information . . . .”).
4. Id. at 2.
5. Tene, supra note 2, at 16.

Increasingly, users are voluntarily posting large amounts of data online, on social networking services, web forums, blogs and personal web pages. The harvesting and use of such data, while raising significant privacy issues, are beyond the scope of tracking discussed in this paper.6 The paradigmatic tracking activity examined here involves a third party, largely unfamiliar to the user, collecting and processing information about her based on her browsing activity on various unrelated websites in order to compile an individual profile, which will be used to facilitate the targeting of ads.7 We call this type of activity, which studies indicate has created an uneasy feeling among many users, “online behavioral tracking.”8

6. For a discussion of the use of voluntarily-posted data, see Julia Angwin & Steve Stecklow, ‘Scrapers’ Dig Deep for Data on Web, WALL ST. J., Oct. 12, 2010, at A1. See generally Facebook, Inc. v. Power Ventures Inc., No. C08-05780 JW, 2010 WL 3291750 (N.D. Cal. July 20, 2010) (rejecting defendant’s contention that it was not bound by Facebook’s Terms of Use because the information collected was posted by the user voluntarily).
7. Cf. CTR. FOR DEMOCRACY & TECH., WHAT DOES “DO NOT TRACK” MEAN? 3, 5 (Jan. 31, 2011), http://www.cdt.org/files/pdfs/CDT-DNT-Report.pdf [hereinafter CDT WHAT DOES “DO NOT TRACK” MEAN?] (defining “tracking” as “the collection and correlation of data about the Internet activities of a particular user, computer, or device, over time and across non-commonly branded websites, for any purpose other than fraud prevention or compliance with law enforcement requests” and “third-party online behavioral advertising” as “the collection of data about a particular user, computer, or device, regarding web usage over time and across non-commonly branded websites for the purpose of using such data to predict user preferences or interests and to deliver advertising to that individual or her computer or device based on the preferences or interests inferred from such web viewing behaviors”).
8. See JOSEPH TUROW ET AL., AMERICANS REJECT TAILORED ADVERTISING AND THREE ACTIVITIES THAT ENABLE IT 17 (2009), http://repository.upenn.edu/cgi/viewcontent.cgi?article=1138&context=asc_papers (reporting that sixty-six percent of adults in the United States do not want websites to show them tailored advertising; seventy-five percent say that tailoring ads to them based on their age and the website they are currently visiting is not “OK”; and eighty-seven percent say that tailoring ads to them based on their age and other websites they have previously visited is not “OK”).

“In the past decade, the number and quality of online data collection technologies have increased.”9 The collection and use of large amounts of data to create detailed personal profiles have clear privacy implications. Users have remained largely oblivious to the mechanics of the market for online information, including data collection processes, prospective data uses, and the identity of the myriad actors involved.10 While users clearly benefit from the rich diversity of online content and services provided without charge, such benefits need to be weighed against the costs imposed on users’ privacy.

9. Eric C. Bosset et al., Private Actions Challenging Online Data Collection Practices Are Increasing: Assessing the Legal Landscape, INTELL. PROP. & TECH. L.J., Feb. 2011, at 3, 3.
10. See, e.g., TUROW, supra note 8, at 4 (“When asked true-false questions about companies’ rights to share and sell information about their activities online and off, respondents on average answer only 1.5 of 5 online laws and 1.7 of the 4 offline laws correctly because they falsely assume government regulations prohibit the sale of data.”).

Behavioral tracking is currently a major issue in the online privacy debate. At the center of the discussion is the Federal Trade Commission’s Do Not Track (DNT) proposal. This is because the simplicity of DNT crystallizes the deep ideological divide about right and wrong in online activities. The debate raging around DNT and the specific details of its implementation (opt-in; opt-out; browser, cookie or black list based; etc.) disguises a more fundamental disagreement among stakeholders about deeper societal values and norms. Unless policymakers address this underlying normative question—is online behavioral tracking a social good or an unnecessary evil?—they may not be able to find a solution for implementing user choice in the context of online privacy. Practical progress advancing user privacy will be better served if policymakers and industry focus their debate on the desirable balance between efficiency and individual rights, and on whether businesses implement tracking mechanisms fairly and responsibly.

By emphasizing “transparency and user consent,” in European data protection terms, or “notice and choice,” in United States parlance, the current legal framework imposes a burden on business and users that both parties struggle to lift. Users are ill placed to make responsible decisions about their online data—given, on the one hand, their cognitive biases and low stake in each data- (or datum-) transaction and, on the other hand, the increasing complexity of the online information ecosystem.11 Indeed, even many privacy professionals would be hard pressed to explain the inner workings of the online market for personal information, the parties involved, and the actual or potential uses of information. Imposing this burden on users places them at an inherent disadvantage and ultimately compromises their rights. It is tantamount to imposing the burden of health care decisions on patients instead of doctors.

11. See GCA SAVVIAN, DISPLAY ADVERTISING TECHNOLOGY LANDSCAPE: DYNAMIC ENVIRONMENT RIPE FOR CONSOLIDATION, available at http://tmblr.co/Z-WxPy56mpet (last visited Oct. 27, 2011) (flowcharting the relationship between various advertising and publishing companies); Before You Even Click . . . ., FUTURE OF PRIVACY FORUM (Apr. 29, 2010), www.futureofprivacy.org/2010/04/29/before-you-even-click (graphically illustrating the complexity of the online ecosystem).

Granted, both sides of the online behavioral tracking debate may be guilty of policy laundering: the industry, for holding out users’ vacuous, uninformed consent as a basis for depicting tracking as a voluntary practice; and privacy advocates, for proposing opt-in rules in order to decimate the data-for-service value exchange. Instead of repeatedly passing the buck to users, the debate should focus on the limits of online behavioral tracking practices by considering which activities are socially acceptable and spelling out default norms accordingly. At the end of the day, it is not the size of the font in privacy notices or the location of check-boxes in advanced browser settings which will legitimize or delegitimize online behavioral tracking. Rather, it will be the boundaries set by policymakers—either in law, regulation, or self-regulation12—for tracking practices

12. Danny Weitzner, Associate Administrator at the National Telecommunications and Information Administration (NTIA), recently suggested the United States would seek a framework for online “privacy law without regulation.” Declan McCullagh, White House Pledges New Net Privacy Approach, CNET (Aug. 22, 2011, 4:54 PM), http://news.cnet.com/8301-31921_3-20095730-

Abstract: The growing popularity and development of data mining technologies bring serious threats to the security of individuals' sensitive information. An emerging research topic in data mining, known as privacy-preserving data mining (PPDM), has been extensively studied in recent years. The basic idea of PPDM is to modify the data in such a way as to perform data mining algorithms effectively without compromising the security of sensitive information contained in the data. Current studies of PPDM mainly focus on how to reduce the privacy risk brought by data mining operations, while in fact, unwanted disclosure of sensitive information may also happen in the process of data collecting, data publishing, and information (i.e., the data mining results) delivering. In this paper, we view the privacy issues related to data mining from a wider perspective and investigate various approaches that can help to protect sensitive information. In particular, we identify four different types of users involved in data mining applications, namely, data provider, data collector, data miner, and decision maker. For each type of user, we discuss his privacy concerns and the methods that can be adopted to protect sensitive information. We briefly introduce the basics of related research topics, review state-of-the-art approaches, and present some preliminary thoughts on future research directions. Besides exploring the privacy-preserving approaches for each type of user, we also review the game theoretical approaches, which are proposed for analyzing the interactions among different users in a data mining scenario, each of whom has his own valuation on the sensitive information. By differentiating the responsibilities of different users with respect to security of sensitive information, we would like to provide some useful insights into the study of PPDM.

528 citations


Cites methods from "To Track or 'Do Not Track': Advanci..."

  • ...A user’s opt-out preference is signaled by an HTTP header field named DNT: if DNT=1, it means the user does not want to be tracked (opt out)....

    [...]

  • ...The DNT technology seems to be a good solution to privacy problems, considering that it helps users to regain the control over ‘‘who sees what you are doing online’’....

    [...]

  • ...The W3C Tracking Protection Working Group [11] is now trying to standardize how websites should respond to users’ DNT requests....

    [...]

  • ...There is no compulsion for the server to look for the DNT header and honor the DNT request....

    [...]

  • ...A major technology used for antitracking is called Do Not Track (DNT) [10], which enables users to opt out of tracking by websites they do not visit....

    [...]
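The excerpts above describe the DNT mechanism from the server's side: the header travels with every request, but honoring it is voluntary, so a server must explicitly check for it. A minimal sketch of such a check is below; the function names and response strings are illustrative, not drawn from any cited paper or real framework.

```python
# Hypothetical sketch of a server honoring the DNT request header.
# "DNT: 1" means the user opts out of tracking, per the excerpts above.

def wants_opt_out(headers: dict) -> bool:
    """Return True if the request carries DNT=1 (user opts out of tracking)."""
    # HTTP header names are case-insensitive, so normalize keys first.
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    return normalized.get("dnt") == "1"

def handle_request(headers: dict) -> str:
    # There is no compulsion to check DNT; a server that honors it
    # must branch on the header explicitly, as here.
    if wants_opt_out(headers):
        return "serving untargeted ads (tracking disabled)"
    return "serving behaviorally targeted ads"
```

For example, `handle_request({"DNT": "1"})` takes the opt-out branch, while a request with no DNT header falls through to the targeted-ads default, mirroring the opt-out (rather than opt-in) design discussed in the excerpts.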

Journal ArticleDOI
TL;DR: In this article, the authors use a field experiment informed by behavioral economics and decision research to investigate individual privacy valuations and find evidence of endowment and order effects, which highlight the sensitivity of privacy valuation to contextual, nonnormative factors.
Abstract: Understanding the value that individuals assign to the protection of their personal data is of great importance for business, law, and public policy. We use a field experiment informed by behavioral economics and decision research to investigate individual privacy valuations and find evidence of endowment and order effects. Individuals assigned markedly different values to the privacy of their data depending on (1) whether they were asked to consider how much money they would accept to disclose otherwise private information or how much they would pay to protect otherwise public information and (2) the order in which they considered different offers for their data. The gap between such values is large compared with that observed in comparable studies of consumer goods. The results highlight the sensitivity of privacy valuations to contextual, nonnormative factors.

477 citations


Cites background from "To Track or 'Do Not Track': Advanci..."

  • ...The importance of privacy defaults is perhaps nowhere more apparent than in the current debate over the so-called Do Not Track list (see Tene and Polonetsky 2012)....

    [...]

Book ChapterDOI
01 Jun 2014
TL;DR: In this paper, the authors focus on attempts to avoid or mitigate the conflicts that may arise, taking as a given that big data implicates important ethical and political values, and they do so because the familiar pair of anonymity and informed consent continues to strike many as the best and perhaps only way to escape the need to actually resolve these conflicts one way or the other.
Abstract: Introduction Big data promises to deliver analytic insights that will add to the stock of scientific and social scientific knowledge, significantly improve decision making in both the public and private sector, and greatly enhance individual self-knowledge and understanding. They have already led to entirely new classes of goods and services, many of which have been embraced enthusiastically by institutions and individuals alike. And yet, where these data commit to record details about human behavior, they have been perceived as a threat to fundamental values, including everything from autonomy, to fairness, justice, due process, property, solidarity, and, perhaps most of all, privacy. Given this apparent conflict, some have taken to calling for outright prohibitions on various big data practices, while others have found good reason to finally throw caution (and privacy) to the wind in the belief that big data will more than compensate for its potential costs. Still others, of course, are searching for a principled stance on privacy that offers the flexibility necessary for these promises to be realized while respecting the important values that privacy promotes. This is a familiar situation because it rehearses many of the long-standing tensions that have characterized each successive wave of technological innovation over the past half-century and their inevitable disruption of constraints on information flows through which privacy had been assured. It should come as no surprise that attempts to deal with new threats draw from the toolbox assembled to address earlier upheavals. Ready-to-hand, anonymity and informed consent remain the most popular tools for relieving these tensions – tensions that we accept, from the outset, as genuine and, in many cases, acute. Taking as a given that big data implicates important ethical and political values, we direct our focus instead on attempts to avoid or mitigate the conflicts that may arise. 
We do so because the familiar pair of anonymity and informed consent continues to strike many as the best and perhaps only way to escape the need to actually resolve these conflicts one way or the other.

199 citations

Proceedings ArticleDOI
21 Apr 2018
TL;DR: Exposing users to their algorithmically derived attributes led to algorithm disillusionment: users found that advertising algorithms they thought were perfect were far from it. The authors propose design implications for effectively communicating information about advertising algorithms.
Abstract: Advertisers develop algorithms to select the most relevant advertisements for users. However, the opacity of these algorithms, along with their potential for violating user privacy, has decreased user trust and preference in behavioral advertising. To mitigate this, advertisers have started to communicate algorithmic processes in behavioral advertising. However, how revealing parts of the algorithmic process affects users' perceptions towards ads and platforms is still an open question. To investigate this, we exposed 32 users to why an ad is shown to them, what advertising algorithms infer about them, and how advertisers use this information. Users preferred interpretable, non-creepy explanations about why an ad is presented, along with a recognizable link to their identity. We further found that exposing users to their algorithmically-derived attributes led to algorithm disillusionment---users found that advertising algorithms they thought were perfect were far from it. We propose design implications to effectively communicate information about advertising algorithms.

121 citations


Cites background from "To Track or 'Do Not Track': Advanci..."

  • ...with their potential customers, advertisers must clearly communicate their algorithmic ad curation process [51, 53]....

    [...]

References
Proceedings ArticleDOI
18 May 2008
TL;DR: This work applies the de-anonymization methodology to the Netflix Prize dataset, which contains anonymous movie ratings of 500,000 subscribers of Netflix, the world's largest online movie rental service, and demonstrates that an adversary who knows only a little bit about an individual subscriber can easily identify this subscriber's record in the dataset.
Abstract: We present a new class of statistical de-anonymization attacks against high-dimensional micro-data, such as individual preferences, recommendations, transaction records and so on. Our techniques are robust to perturbation in the data and tolerate some mistakes in the adversary's background knowledge. We apply our de-anonymization methodology to the Netflix Prize dataset, which contains anonymous movie ratings of 500,000 subscribers of Netflix, the world's largest online movie rental service. We demonstrate that an adversary who knows only a little bit about an individual subscriber can easily identify this subscriber's record in the dataset. Using the Internet Movie Database as the source of background knowledge, we successfully identified the Netflix records of known users, uncovering their apparent political preferences and other potentially sensitive information.

2,241 citations

MonographDOI
TL;DR: Arguing that privacy concerns should not be limited solely to concern about control over personal information, Helen Nissenbaum counters that information ought to be distributed and protected according to norms governing distinct social context, be it workplace, health care, schools, or among family and friends.
Abstract: Privacy is one of the most urgent issues associated with information technology and digital media. This book claims that what people really care about when they complain and protest that privacy has been violated is not the act of sharing information itself (most people understand that this is crucial to social life) but the inappropriate, improper sharing of information. Arguing that privacy concerns should not be limited solely to concern about control over personal information, Helen Nissenbaum counters that information ought to be distributed and protected according to norms governing distinct social contexts, whether it be workplace, health care, schools, or among family and friends. She warns that basic distinctions between public and private, informing many current privacy policies, in fact obscure more than they clarify. In truth, contemporary information systems should alarm us only when they function without regard for social norms and values, and thereby weaken the fabric of social life.

1,887 citations

Posted Content
TL;DR: It is necessary to respond to the surprising failure of anonymization, and this Article provides the tools to do so.
Abstract: Computer scientists have recently undermined our faith in the privacy-protecting power of anonymization, the name for techniques for protecting the privacy of individuals in large databases by deleting information like names and social security numbers. These scientists have demonstrated they can often 'reidentify' or 'deanonymize' individuals hidden in anonymized data with astonishing ease. By understanding this research, we will realize we have made a mistake, labored beneath a fundamental misunderstanding, which has assured us much less privacy than we have assumed. This mistake pervades nearly every information privacy law, regulation, and debate, yet regulators and legal scholars have paid it scant attention. We must respond to the surprising failure of anonymization, and this Article provides the tools to do so.

927 citations

Proceedings ArticleDOI
20 Apr 2009
TL;DR: This paper reports on a four-year longitudinal study, built from multiple snapshots, of how users' private information diffuses to third-party aggregators as users visit various Web sites.
Abstract: For the last few years we have been studying the diffusion of private information for users as they visit various Web sites triggering data gathering aggregation by third parties. This paper reports on our longitudinal study consisting of multiple snapshots of our examination of such diffusion over four years. We examine the various technical ways by which third-party aggregators acquire data and the depth of user-related information acquired. We study techniques for protecting privacy diffusion as well as limitations of such techniques. We introduce the concept of secondary privacy damage. Our results show increasing aggregation of user-related data by a steadily decreasing number of entities. A handful of companies are able to track users' movement across almost all of the popular Web sites. Virtually all the protection techniques have significant limitations highlighting the seriousness of the problem and the need for alternate solutions.

308 citations

Posted Content
TL;DR: In this article, the authors argue that the debate about data privacy protection should be grounded in an appreciation of the conditions necessary for individuals to develop and exercise autonomy in fact, and that meaningful autonomy requires a degree of freedom from monitoring, scrutiny, and categorization by others.
Abstract: In the United States, proposals for informational privacy have proved enormously controversial. On a political level, such proposals threaten powerful data processing interests. On a theoretical level, data processors and other data privacy opponents argue that imposing restrictions on the collection, use, and exchange of personal data would ignore established understandings of property, limit individual freedom of choice, violate principles of rational information use, and infringe data processors' freedom of speech. In this article, Professor Julie Cohen explores these theoretical challenges to informational privacy protection. She concludes that categorical arguments from property, choice, truth, and speech lack weight, and mask fundamentally political choices about the allocation of power over information, cost, and opportunity. Each debate, although couched in a rhetoric of individual liberty, effectively reduces individuals to objects of choices and trades made by others. Professor Cohen argues, instead, that the debate about data privacy protection should be grounded in an appreciation of the conditions necessary for individuals to develop and exercise autonomy in fact, and that meaningful autonomy requires a degree of freedom from monitoring, scrutiny, and categorization by others. The article concludes by calling for the design of both legal and technological tools for strong data privacy protection.

228 citations

Frequently Asked Questions (8)
Q1. What are the contributions in "To track or “do not track”: advancing transparency and individual control in online behavioral advertising" ?

In this paper, the authors examine the debate over online behavioral tracking and the FTC's Do Not Track proposal, arguing that policymakers must confront the underlying normative question of whether behavioral tracking is a social good or an unnecessary evil, rather than leaving the choice entirely to users. 

Some activities are value creating, socially desirable, and minimally intrusive; they should be permitted to exist as default options. 

The value of data collection and use to broader society includes ease of obtaining credit, support of free web content, encouraging users to conserve energy, and more. 

Rosch recently suggested that the potential downsides of regulatory initiatives include “the loss of relevancy, the loss of free content, the replacement of current advertising with even more intrusive advertising.” 

In April 2011, the European Advertising Standards Alliance (EASA), a Brussels-based NGO bringing together national advertising self-regulatory organizations and organizations representing the advertising industry in Europe, submitted its own best practice recommendation on online behavioral advertising. 

It states that: “[t]he most practical method of providing uniform choice for online behavioral advertising would likely involve placing a setting similar to a persistent cookie on a consumer’s browser and conveying that setting to sites that the browser visits, to signal whether or not the consumer wants to be tracked or receive targeted advertisements.” 
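The browser-side half of the mechanism described here is a persistent setting conveyed to every site the browser visits, which in practice became the DNT request header. The sketch below illustrates that idea with Python's standard library; the URL is a placeholder and no network call is made.

```python
import urllib.request

# Illustrative sketch of the mechanism described above: a persistent
# browser-side preference conveyed to sites via the DNT request header.
TRACKING_PREF = "1"  # "1" = the user does not want to be tracked

req = urllib.request.Request("https://example.com/")
req.add_header("DNT", TRACKING_PREF)

# The preference now travels with the request; whether the receiving
# site honors it is up to the site. (urllib stores header keys in
# capitalized form, hence "Dnt" when reading the header back.)
print(req.get_header("Dnt"))
```

This is only the signaling half: as the citing papers note, nothing compels the server to look for the header, which is why the policy debate centers on whether and how sites must honor it.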

Aleecia McDonald and Lorrie Cranor calculated that it would take the average user 40 minutes per day to read through all of the privacy policies she encounters online. 
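To put the 40-minutes-per-day figure in perspective, a back-of-the-envelope annualization is shown below; this is rough arithmetic for illustration, not McDonald and Cranor's actual methodology.

```python
# Rough annualization of the 40-minutes-per-day policy-reading estimate.
minutes_per_day = 40
hours_per_year = minutes_per_day * 365 / 60  # 14,600 minutes -> hours
print(round(hours_per_year))  # roughly 243 hours of policy-reading per year
```

On this arithmetic, reading every privacy policy encountered would consume roughly 243 hours a year, which is the scale of burden that motivates a single uniform signal like Do Not Track.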

In fact, in order for companies to qualify under the FTC Safe Harbor program contained in my bill, they would have to set up a ‘Do-Not-Track like’ mechanism for consumers to allow them to opt-out of having the personal information they provide, both online and offline, to third parties.”