See Lynne G. Zucker, 'Production of Trust: Institutional Sources of Economic Structure, 1840–1920,' Research in Organizational Behavior 8 (1986): 53.

37.

Price discrimination is the ability to charge different prices for the same good. Airplane tickets are the best example — the same seat can cost hundreds of dollars more for a traveler who cannot stay over Saturday night. See, for example, Joseph Gregory Sidak, 'Debunking Predatory Innovation,' Columbia Law Review 83 (1983): 1121, 1132–35; see also Easterbrook, 'Intellectual Property Is Still Property'; Fisher, 'Reconstructing the Fair Use Doctrine,' 1659; but see Janusz A. Ordover et al., 'Predatory Systems Rivalry: A Reply,' Columbia Law Review 83 (1983): 1150, 1158–64.

38.

Viviana A. Zelizer, The Social Meaning of Money, 2d ed. (Princeton: Princeton University Press, 1994), 94–95 (footnote omitted).

39.

Susan Brenner puts the point very powerfully. As she frames the question, 'is it reasonable to translate the values incorporated in the Fourth Amendment into a context created and sustained by technology?' Susan Brenner, 'The Privacy Privilege: Law Enforcement, Technology and the Constitution,' Journal of Technology Law and Policy 7 (2002): 123, 162. The question isn't simply whether anonymity has a value — plainly it does. The question instead is 'how to translate rights devised to deal with real world conduct into a world where greater degrees of anonymity are possible. . . .' Ibid., 139–40. 'Because the technology alters the contours of the empirical environment in which the right to remain anonymous is exercised, it creates a tension between this aspect of the right to be let alone and the needs of effective law enforcement.' Ibid., 144.

40.

Shawn C. Helms, 'Translating Privacy Values with Technology,' Boston University Journal of Science and Technology Law 7 (2001): 288, 314 ('We should approach the translation of anonymity on the Internet through "code" by developing and implementing privacy-enhancing technologies.').

41.

As William McGeveran writes, Marc Rotenberg, one of privacy's most important advocates, doesn't view P3P as a PET 'because Rotenberg defines a PET as technology that inherently reduces transfer of personal data.' William McGeveran, 'Programmed Privacy Promises: P3P and Web Privacy Law,' New York University Law Review 76 (2001): 1813, 1826–27 n.80. I share McGeveran's view that P3P is a PET. If privacy is control over how information about you is released, then a technology that enhances that control is a PET even if it doesn't 'reduce[] transfer of personal data' — so long as that reduction is consistent with the preferences of the individual. No doubt, a PET could be a bad PET to the extent it fails to enable choice. But it isn't a bad PET because it fails to enable the choice of someone other than the consumer. For a wonderful account of how norms have arisen to change data privacy practice, see Steven A. Hetcher, 'Norm Proselytizers Create a Privacy Entitlement in Cyberspace,' Berkeley Technology Law Journal 16 (2001): 877.

42.

See U.S. Department of Health, Education and Welfare, Secretary's Advisory Committee on Automated Personal Data Systems, Records, Computers, and the Rights of Citizens viii (1973), cited at http://www.epic.org/privacy/consumer/code_fair_info.html (cached: http://www.webcitation.org/5J6lfi8l6).

43.

Ibid.

44.

Lior Jacob Strahilevitz nicely explores this fundamentally 'empirical' question in 'A Social Networks Theory of Privacy,' University of Chicago Law Review 72 (2005): 919, 921.

45.

See Guido Calabresi and A. Douglas Melamed, 'Property Rules, Liability Rules, and Inalienability: One View of the Cathedral,' Harvard Law Review 85 (1972): 1089, 1105–6. 'Property rules involve a collective decision as to who is to be given an initial entitlement but not as to the value of the entitlement. . . . Liability rules involve an additional stage of state intervention: not only are entitlements protected, but their transfer or destruction is allowed on the basis of a value determined by some organ of the state rather than by the parties themselves' (1092).

46.

Ibid.

47.

See, e.g., Mark A. Lemley, 'Private Property,' Stanford Law Review 52 (2000): 1545, 1547; Paul M. Schwartz, 'Beyond Lessig's Code for Internet Privacy: Cyberspace Filters, Privacy-Control, and Fair Information Practices,' Wisconsin Law Review 2000 (2000): 743; Julie E. Cohen, 'DRM and Privacy,' Berkeley Technology Law Journal 18 (2003): 575, 577; Marc Rotenberg, 'Fair Information Practices and the Architecture of Privacy: (What Larry Doesn't Get),' Stanford Technology Law Review (2001): 1, 89–90. Andrew Shapiro discusses a similar idea in The Control Revolution, 158–65.
