practically free. So long as you have access to cyberspace and a scanner, you can scan a picture of “the girl next door” and then distribute the digital image across USENET to many more than one million people for just the cost of an Internet connection.

With the costs of production so low, a much greater supply of porn is produced for cyberspace than for real space. And indeed, a whole category of porn exists in cyberspace that doesn’t in real space — amateur porn, or porn produced for noncommercial purposes. That category of supply simply couldn’t survive in real space.

And then there is demand. Porn in cyberspace can be accessed — often and in many places — for free. Thousands of commercial sites make porn available for free, as a tease to draw in customers. Even more porn is distributed in noncommercial contexts, such as USENET, or free porn websites. Again, this low price translates into much greater demand.

Much of this supply and demand is for a market that, at least in the United States, is constitutionally protected. Adults have a constitutional right in the United States to access porn, in the sense that the government may do nothing that burdens (or at least unreasonably burdens) access to it. But there is another market for porn in the United States that is not constitutionally protected. Governments in the United States have the power to block access by kids to porn.

As we saw in the previous section, for that regulation to work, however, there needs to be a relatively simple way to know who is a kid. But as we’ve seen throughout this book, this is an architectural feature that cyberspace doesn’t have. It’s not that kids in cyberspace can easily hide that they are kids. In cyberspace, there is no fact to disguise. You enter without an identity and you identify only what you want — and even that can’t be authenticated with any real confidence. Thus, a kid in cyberspace need not disclose that he is a kid. And therefore he need not suffer the discriminations applied to a child in real space. No one needs to know that Jon is Jonny; therefore, the architecture does not produce the minimal information necessary to make regulation work.

The consequence is that regulations that seek selectively to block access to kids in cyberspace don’t work, and they don’t work for reasons that are very different from the reasons they might not work well in real space. In real space, no doubt, there are sellers who want to break the law or who are not typically motivated to obey it. But in cyberspace, even if the seller wants to obey the law, the law can’t be obeyed. The architecture of cyberspace doesn’t provide the tools to enable the law to be followed.

A similar story can be told about spam. Spam is an economic activity: people send it to make money. The frictions of real space significantly throttle that desire. The costs of sending spam in real space mean that only mailings expected to yield a significant return get sent. As I said, even then, laws and norms add another layer of restriction. But the most significant constraint is cost.

But the efficiency of communication in cyberspace means that the cost of sending spam is radically lower, which radically increases the quantity of spam that it is rational to send. Even if only .01 percent of recipients respond, if the cost of sending the spam is close to zero, you still make money.
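
To make the arithmetic concrete, consider a back-of-the-envelope sketch (the figures are hypothetical, chosen only to illustrate the point): suppose a spammer pays $0.00001 per message in bandwidth, one recipient in ten thousand (.01 percent) responds, and each response nets $10. Expected revenue per message is then 0.0001 × $10 = $0.001, one hundred times the cost of sending it; a run of ten million messages costs about $100 and returns about $10,000. In real space, where each piece of mail might cost a dollar to print and post, the same run would cost $10,000,000 against the same $10,000 in returns, which is why only mailings with far higher expected response rates ever get sent.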

Thus, as with porn, a different architectural constraint means a radically different regulation of behavior. Both porn and spam are reasonably regulated in real space; in cyberspace, this difference in architecture means neither is effectively regulated at all.

And thus the question that began this section: Is there a way to “regulate” spam and porn to at least the same level of regulation that both face in real space?

Regulating Net-Porn

Of all the possible speech regulations on the Net (putting copyright to one side for the moment), the United States Congress has been most eager to regulate porn. That eagerness, however, has not yet translated into success. Congress has passed two major pieces of legislation. The first was struck down completely. The second continues to be battered in its struggle through the courts.

The first statute was the product of a scare. Just about the time the Net was coming into popular consciousness, one of its seedier aspects came into view: porn on the Net. This concern became widespread in the United States early in 1995[36]. Its source was an extraordinary rise in the number of ordinary users of the Net, and therefore a rise in use by kids, and an even more extraordinary rise in the availability of what many call porn on the Net. An extremely controversial (and deeply flawed) study published in the Georgetown Law Journal reported that the Net was awash in porn[37]. Time ran a cover story about its availability[38]. Senators and congressmen were bombarded with demands to do something to regulate “cybersmut.”

Congress responded in 1996 with the Communications Decency Act (CDA). A law of extraordinary stupidity, the CDA practically impaled itself on the First Amendment. The law made it a felony to transmit “indecent” material on the Net to a minor or to a place where a minor could observe it. But it gave speakers on the Net a defense: if they took good-faith, “reasonable, effective” steps to screen out children, then they could speak “indecently”[39].

There were at least three problems with the CDA, any one of which should have doomed it to well-deserved extinction[40]. The first was the scope of the speech it addressed: “Indecency” is not a category of speech that Congress has the power to regulate (at least not outside the context of broadcasting)[41]. As I have already described, Congress can regulate speech that is “harmful to minors,” or Ginsberg speech, but that is very different from speech called “indecent.” Thus, the first strike against the statute was that it reached too far.

Strike two was vagueness. The form of the allowable defenses was clear: So long as there was an architecture for screening out kids, the speech would be permitted. But the architectures that existed at the time for screening out children were relatively crude, and in some cases quite expensive. It was unclear whether, to satisfy the statute, they had to be extremely effective or just reasonably effective given the state of the technology. If the former, then the defenses were no defense at all, because an extremely effective block was extremely expensive; the cost of a reasonably effective block would not have been so high.

Strike three was the government’s own doing. In arguing its case before the Supreme Court in 1997, the government did little either to narrow the scope of the speech being regulated or to expand the scope of the defenses. It stuck with the hopelessly vague, overbroad definition Congress had given it, and it displayed a poor understanding of how the technology might have provided a defense. As the Court considered the case, there seemed to be no way that an identification system could satisfy the statute without creating an undue burden on Internet speakers.

Congress responded quickly by passing a second statute aimed at protecting kids from porn. This was the Child Online Protection Act (COPA) of 1998[42]. This statute was better tailored to the constitutional requirements. It aimed at regulating speech that was harmful to minors. It allowed commercial websites to provide such speech so long as the website verified the viewer’s age.
