10.

There is no standard reference model for the TCP/IP layers. Hunt refers to the four layers as the 'network access,' 'internet,' 'host-to-host transport,' and 'application' layers; TCP/IP: Network Administration, 9. Loshin uses the terminology I follow in the text; TCP/IP: Clearly Explained, 13–17. Despite the different monikers, the functions performed in each of these layers are consistent. As with any protocol stack model, data are 'passed down the stack when it is being sent to the network, and up the stack when it is being received from the network.' Each layer 'has its own independent data structures,' with one layer 'unaware of the data structures used by' other layers; Hunt, TCP/IP: Network Administration, 9.
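The mechanics of 'passing down the stack' can be pictured with a short sketch of my own (not drawn from Hunt or Loshin, and heavily simplified): each layer wraps the data handed to it in its own header, treating what it receives from the layer above as an opaque payload.

```c
#include <stdio.h>
#include <string.h>

/* Hypothetical, stripped-down headers for illustration only; real TCP and
   IP headers carry many more fields. Each layer defines its own data
   structure and never inspects the structures of the layers above it. */
struct transport_header { unsigned short src_port, dst_port; };
struct internet_header  { unsigned char  src_addr[4], dst_addr[4]; };

int main(void) {
    /* Application layer: the data to be sent. */
    const char app_data[] = "GET / HTTP/1.0";

    /* Transport (host-to-host) layer: prepend a transport header. */
    struct transport_header th = { 1025, 80 };
    unsigned char segment[128];
    memcpy(segment, &th, sizeof th);
    memcpy(segment + sizeof th, app_data, sizeof app_data);
    size_t segment_len = sizeof th + sizeof app_data;

    /* Internet layer: prepend an internet header, treating the whole
       transport segment as an opaque payload. */
    struct internet_header ih = { {10, 0, 0, 1}, {10, 0, 0, 2} };
    unsigned char datagram[160];
    memcpy(datagram, &ih, sizeof ih);
    memcpy(datagram + sizeof ih, segment, segment_len);
    size_t datagram_len = sizeof ih + segment_len;

    printf("app: %zu bytes, segment: %zu bytes, datagram: %zu bytes\n",
           sizeof app_data, segment_len, datagram_len);
    return 0;
}
```

Receiving reverses the process: each layer strips the header it understands and passes the remaining payload up the stack, again without knowing the data structures the other layers use.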

11.

Hunt, TCP/IP: Network Administration, 9; Loshin, TCP/IP: Clearly Explained, 13–17.

12.

As Hafner and Lyon explain: 'The general view was that any protocol was a potential building block, and so the best approach was to define simple protocols, each limited in scope, with the expectation that any of them might someday be joined or modified in various unanticipated ways. The protocol design philosophy adopted by the NWG [Network Working Group] broke ground for what came to be widely accepted as the "layered" approach to protocols'; Where Wizards Stay Up Late, 147.

13.

The fights over encryption at the link level, for example, are fights over the TCP/IP protocols. Some within the network industry have proposed that encryption be done at the gateways, with a method for dumping plain text at the gateways if there were proper legal authority — a kind of 'private doorbell' for resolving the encryption controversy; see Elizabeth Kaufman and Roszel Thomsen II, 'The Export of Certain Networking Encryption Products Under ELAs,' available at http://www.cisco.com/web/about/gov/downloads/779/govtaffs/archive/CiscoClearZone.doc (cached: http://www.webcitation.org/5J6iFdx8M). This has been opposed by the Internet Architecture Board (IAB) as inconsistent with the 'end-to-end' architecture of the Internet; see IAB statement on 'private doorbell' encryption, available at http://www.iab.org/documents/docs/121898.html (cached: http://www.webcitation.org/5J6iHv95Y). Since Code v1, there has been an explosion of excellent work extending 'layer theory.' Perhaps the best academic work in this has been Lawrence B. Solum and Minn Chung, 'The Layers Principle: Internet Architecture and the Law,' University of San Diego Public Law Research Paper No. 55, available at http://ssrn.com/abstract=416263 (cached: http://www.webcitation.org/5J6iKFgUO). Solum and Chung have used the idea of Internet layers to guide regulatory policy, locating appropriate and inappropriate targets for regulatory intervention. This is an example of some of the best work integrating technology and legal policy, drawing interesting and important implications from the particular, often counterintuitive, interaction between the two. I introduce 'layers' in my own work in The Future of Ideas: The Fate of the Commons in a Connected World (New York: Random House, 2001), 23–25. See also Yochai Benkler, The Wealth of Networks: How Social Production Transforms Markets and Freedom (New Haven: Yale University Press, 2006), 391–97. For other very useful work extending this analysis, see Craig McTaggart, 'A Layered Approach to Internet Legal Analysis,' McGill Law Journal 48 (2003): 571; Thomas A. Lane, 'Of Hammers and Saws: The Toolbox of Federalism and Sources of Law for the Web,' New Mexico Law Review 33 (2003): 115; Jane Bailey, 'Of Mediums and Metaphors: How a Layered Methodology Might Contribute to Constitutional Analysis of Internet Content Regulation,' Manitoba Law Journal 30 (2004): 197.

14.

See Hafner and Lyon, Where Wizards Stay Up Late, 174.

15.

A 1994 HTML manual lists twenty-nine different browsers; see Larry Aronson, HTML Manual of Style (Emeryville, Cal.: Ziff-Davis Press, 1994), 124–26.

16.

Source code is the code that programmers write. It sometimes reads like a natural language, but it is obviously not. A program is (ordinarily) written in source code, but to be run it must be converted into a language the computer can process. This is what a 'compiler' does. Some source code is converted on the fly — BASIC, for example, is usually interpreted, meaning the computer translates the source code as the program runs. 'Object code' is machine-readable. It is an undifferentiated string of 0s and 1s that instructs the machine about the tasks it is to perform.
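To make the distinction concrete, here is a sketch of my own (not taken from any source cited in these notes): a trivial program written in C source code.

```c
/* Source code: human-readable instructions written by a programmer. */
#include <stdio.h>

int main(void) {
    /* Ask the machine to print a short greeting. */
    printf("Hello, world\n");
    return 0;
}
```

Feeding this file to a compiler (for example, cc hello.c -o hello on most Unix systems) produces object code: the string of 0s and 1s the processor actually executes. An interpreted language such as BASIC skips the separate compilation step and translates the source as the program runs.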

17.

Hypertext is text that is linked to another location in the same document or in another document located either on the Net or on the same computer.

18.

T. Berners-Lee and R. Cailliau, WorldWideWeb: Proposal for a HyperText Project, 1990, available at http://www.w3.org/Proposal (cached: http://www.webcitation.org/5J6iMja8s).

19.

Of course, not always. When commercial production of computers began, software was often a free addition to the computer. Its development as proprietary commercial software came only later; see Ira V. Heffan, 'Copyleft: Licensing Collaborative Works in the Digital Age,' Stanford Law Review 49 (1997): 1487, 1492–93.
