Censorship/filtering/Markets/Lessig: the LAW creates an incentive (…) for sites with “harmful to minors” material to change their ARCHITECTURE (by adding …), which in turn creates an incentive for browser manufacturers (new markets) to add filtering to their code, so that parents can protect their kids. The only burden created by this solution is on the speaker; this solution does not burden the rightful consumer of porn at all. To that consumer, there is no change in the way the Web is experienced, because without a browser that looks for the (…)
But why not simply rely upon filters that parents and libraries install on their computers? Voluntary filters don’t require any new laws, and they therefore don’t require any state-sponsored censorship to achieve their ends. It is this view that I want to work hardest to dislodge, because built within it are all the mistakes that a pre-cyberlaw understanding brings to the question of regulation in cyberspace.
But then what about public filtering technologies, like PICS? Wouldn’t PICS be a solution that avoided the “secret list problem” you identified? PICS is an acronym for the World Wide Web Consortium’s Platform for Internet Content Selection. We have already seen a relative (actually, a child) of PICS in the chapter about privacy: P3P. Like PICS, P3P is a protocol for rating and filtering content on the Net. In the context of privacy, the content was made up of assertions about privacy practices, and the regime was designed to help individuals negotiate those practices. With online speech the idea is much the same. PICS divides the problem
of filtering into two parts—labeling (rating content) and then filtering.
PICS would be neutral among ratings and neutral among filters; the system would simply provide a language with which content on the Net could be rated, and with which decisions about how to use that rated material could be made from machine to machine. (1)
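The two-part split the passage describes (labels as neutral, machine-readable data; filtering as a separate, local policy applied to them) can be sketched in a few lines of Python. The rating categories and numeric ceilings below are invented for illustration; real PICS labels name a third-party rating service and use that service's own vocabulary.

```python
# Minimal sketch of the PICS split between labeling and filtering.
# The categories ("violence", "nudity") and thresholds are invented
# for illustration; PICS itself is neutral about what they mean.

# Labeling: a site (or a rating bureau) attaches machine-readable
# ratings to content.
labels = {
    "http://example.com/news": {"violence": 1, "nudity": 0},
    "http://example.com/art":  {"violence": 0, "nudity": 2},
}

# Filtering: a separate, local policy decides what to do with the
# rated material. PICS is equally neutral about who applies it:
# a browser, a proxy, or an upstream server.
policy = {"violence": 2, "nudity": 1}   # maximum acceptable levels

def allowed(url: str) -> bool:
    rating = labels.get(url)
    if rating is None:
        return True          # unrated content: this policy permits it
    return all(rating.get(cat, 0) <= ceiling
               for cat, ceiling in policy.items())

print(allowed("http://example.com/news"))  # True
print(allowed("http://example.com/art"))   # False: nudity 2 exceeds ceiling 1
```

Note that the label data and the policy never have to come from the same party, which is exactly the neutrality the quoted passage describes.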
Neutrality sounds like a good thing. It sounds like an idea that policymakers should embrace. Your speech is not my speech; we are both free to speak and listen as we want.
But PICS contains more “neutrality” than we might like. […] PICS is also vertically neutral—allowing the filter to be imposed at any level in the distributional chain. […] Nothing in the design of PICS, that is, requires that such filters announce themselves. Filtering in an architecture like PICS can be invisible. Indeed, in some of its implementations invisibility
is part of its design. (2)
If content is labeled, then it is possible to monitor who gets what without even blocking access. That might well raise greater concerns than blocking, since blocking at least puts the user on notice.
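The vertical neutrality and monitoring concerns above can be made concrete with a small sketch. The function names, the fake "404" response, and the log are invented for illustration and are not part of the PICS specification: the point is only that the same label check behaves very differently depending on where in the distribution chain it runs.

```python
# Sketch of vertically neutral filtering: one label check, applied at
# different points in the distribution chain. Names and responses are
# illustrative, not part of the PICS spec.

access_log = []

def browser_fetch(url, allowed, fetch):
    # Filtering in the browser: the block announces itself.
    if not allowed(url):
        return "Blocked by your filter settings"
    return fetch(url)

def proxy_fetch(url, allowed, fetch):
    # Filtering upstream: the user cannot tell a block from a dead link.
    if not allowed(url):
        return "404 Not Found"
    return fetch(url)

def monitoring_proxy_fetch(url, allowed, fetch):
    # Labels also enable monitoring without blocking: who requested
    # what gets recorded, and the user is never put on notice.
    access_log.append((url, allowed(url)))
    return fetch(url)
```

With a policy that excludes some page, the browser's response tells the user that filtering happened while the proxy's does not, and the monitoring proxy interferes with nothing while recording everything; this is the asymmetry behind the argument that invisible upstream filtering raises distinct concerns.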
So what values should we choose? In my view, we should not opt for perfect filtering. We should not design for the most efficient system of censoring—or at least, we should not do this in a way that allows invisible upstream filtering. Nor should we opt for perfect filtering so long as the tendency worldwide is to overfilter speech.
I would opt for a zoning regime even if it required a law and the filtering solution required only private choice. If the state is pushing for a change in the mix of law and architecture, I do not care that it is pushing with law in one context and with norms in the other. From my perspective, the question is the result, not the means—does the regime produced by these changes protect free speech values? […] The values of speech are different from the values of privacy; for the same reasons that we disable some of the control over intellectual property, we should disable some of the control over speech. A little bit of messiness or friction in the context of speech is a value, not a cost. But are these values different just because I say they are? No. They are only different if we say they are different. In real space we treat them as different. My core argument is that we choose how we want to treat them in cyberspace.
1. Paul Resnick, “PICS-Interest@w3.org, Moving On,” January 20, 1999, available at link #89; Paul Resnick, “Filtering Information on the Internet,” Scientific American 106 (March 1997), also available at link #90; Paul Resnick, “PICS, Censorship, and Intellectual Freedom FAQ,” available at link #91; Paul Resnick and Jim Miller, “PICS: Internet Access Controls Without Censorship,” Communications of the ACM 39 (1996): 87, also available at link #92; Jim Miller, Paul Resnick, et al., “PICS 1.1 Rating Services and Rating Systems—and Their Machine-Readable Descriptions,” October 31, 1996, available at link #93; Tim Krauskopf, Paul Resnick, et al., “PICS 1.1 Label Distribution—Label Syntax and Communication Protocols,” October 31, 1996, available at link #94; Christopher Evans, Paul Resnick, et al., “W3C Recommendation: PICSRules 1.1, REC-PICSRules-971229,” December 29, 1997, available at link #95.
2. See Jonathan Weinberg, “Rating the Net,” Hastings Communications and Entertainment Law Journal 19 (1997): 453, 478 n.108.
Code: Version 2.0 New York 2006ff