A Background Paper[1]
Introduction
"If you fail in industry self-regulation we will have to go the legislative route."
Ira Magaziner, White House Internet adviser, in a November 1997 speech
"Regulations that 'drive certain ideas or viewpoints from the marketplace' for the benefit of children risk destroying the very political system and cultural life' that they will inherit when they come of age."
United States District Court, in ACLU v. Reno, June 11, 1996*
Free and unfettered access to information and ideas is a hallmark of an open society. That access is threatened today by government efforts, via legislation and bully pulpit urging, to put a brown paper bag around "objectionable" material on the Internet and television. Industry, responding both to government pressure and consumer demand, has compounded the problem by creating easy tools for censorship. Government is privatizing censorship, contracting it out.
We have recently witnessed efforts to make television and the Internet "family friendly" by giving parents tools to create "kid-safe" environments. But the consequences may be far-reaching, as the tools used in the home to shield children from certain ideas and images increasingly appear in schools and libraries, where they limit the ability of the entire community to gain access to valuable information. The promise of the electronic media – to create a more vibrant and democratic "marketplace of ideas" than has ever before been possible – could be undermined if rating and filtering are widely embraced.
By directing a bright light at these censorship tools du jour, we may persuade viewers and users to reject ratings and filters in favor of open and unfettered access to information and freedom of expression – to demand "more speech, not enforced silence," in Justice Brandeis' famous words.
The Problem in a Nutshell
The technological details about filters and rating systems can be daunting. However, almost all efforts to restrict access to information, or to label it based on content, share some basic problems:
Oversimplification. How to distinguish "good" sex (or violence) from "bad"? Filters and labels assume that television programs and Internet sites can be reduced to a single letter or symbol, or a combination. People disagree about what kind of violence or sexually suggestive material is "appropriate" and valuable, and the answer may be different depending on the age and maturity of the viewer.
Overbreadth. Ratings and filters often ignore context and, thus, inevitably exclude material that users might want to have, along with material they may not want. Some keyword filters screen out "Mars exploration" and "breast cancer awareness." Filters for violent content will screen out documentaries of historical events, and movies like Schindler's List, along with hate sites and The Texas Chainsaw Massacre. Similarly, blocking Internet sites that contain sexual content will limit access to information about biology, anthropology, and contraception, along with "pornography."
Feasibility. What about better descriptions of television programming and Internet sites? It sounds like a good idea, but it isn't feasible. There are thousands of television programs, content changes daily, and each new program would require a new description. The Internet is many times vaster, and the task of describing its contents is virtually unimaginable.
Subjectivity. Any rating system that classifies or describes content is dependent on the subjective judgment of the rater. Even if all participants voluntarily agreed to self-rate, which is highly unlikely, different raters would describe or rate the same content differently. Governmental standards are not likely to avoid the subjectivity problem and will increase First Amendment concerns.
Full disclosure. Few Internet filters disclose what you lose by using them. The makers of these products claim that information is proprietary and its disclosure would provide a roadmap to objectionable material. While those may be good reasons from their point of view, the end result is that users don't know what they're missing. Likewise, TV sets equipped with the V-chip can be programmed to automatically block programs with certain ratings, and viewers will remain in the dark about those programs.
Security. Filters and ratings give a false sense of security, by suggesting that all parents need to do to protect children is to block disturbing ideas and images. But this material, and the threats parents fear, exist in and out of cyberspace, and children need help learning to deal with them.
These concerns are exemplified by a 1997 study by the Electronic Privacy Information Center (EPIC). EPIC conducted 100 searches for family friendly terms like "American Red Cross," "National Basketball Association" and "Boy Scouts of America" on the search engine Alta Vista, then ran the same searches using the filter NetShepherd Family Search. The filtered searches blocked out more than 90% of the sites identified by Alta Vista.
TV Ratings
The Telecommunications Act of 1996[2] creates a statutory scheme to regulate the content of television programming. It mandates that new television sets be equipped with a V-chip capable of blocking programs, and it requires the FCC to prescribe guidelines to identify programming containing sexual, violent or other indecent material to help parents limit children's viewing – unless the television/cable industry "voluntarily" devises its own "acceptable" rules for labeling programs. The terms "violent," "sexual," and "indecent" are not defined in the Act.
In July 1997, the major networks, with the exception of NBC, agreed to begin using TV Parental Guidelines, a television rating system, to supplement the previously introduced movie-style age-based ratings: TV-G (general audiences), TV-PG (parental guidance suggested), TV-Y7 and TV-14 (programs unsuitable for children under 7 and 14, respectively) and TV-MA (mature audiences only). The supplemental content labels include S for sex, V for violence, L for foul language, D for suggestive dialogue and FV for fantasy violence (in children's programming). News and sports programming are exempt from the rating system.
The FCC has invited public comment on the TV Parental Guidelines to determine whether the new ratings are "acceptable," as mandated by the Act. Lurking in the shadows is the threat of government oversight if the "voluntary" industry-administered system is not "acceptable." Although it has refused to rate content, NBC has begun to add advisories to shows, such as "This program has some violent scenes" or "This episode has a level of violence unusual for the series."
NBC's resistance to ratings has not been without consequences. Almost immediately, Senator John McCain, chairman of the Senate Committee on Commerce, Science and Transportation, threatened:
I will pursue a series of alternative ways of safeguarding, by law and regulation, the interests that NBC refuses to safeguard voluntarily. These will include, but not be limited to, the legislation offered by Senator Hollings to channel violent programming to later hours, as well as urging the Federal Communications Commission to examine in a full evidentiary hearing the renewal application of any television station not implementing the revised TV ratings system.[3]
Congressional efforts to restrict TV content are responding to pressure from groups like the American Family Association, which called on advertisers to boycott programs rated TV-14 or TV-MA. This year, the AFA is touting a new product called TVGuardian that automatically detects and mutes profanity and other "offensive" language from broadcast television and video movies.[4]
No one knows whether this ratings system can in fact work as intended. Some critics suggest that ratings will encourage more graphic sex and violence, on the theory that anyone who doesn't want to see such programming can block it out. Others claim that ratings may draw young viewers to shows rated for sex and violence. Few observers have addressed the fact that a half million hours of television programming a year is a lot to rate, and the absence of criteria is likely to make the effort hopelessly subjective. Is one punch enough to classify a show as "violent"? Is partial nudity acceptable for all audiences, but full nudity only acceptable for children over 14? Which swear words will earn an "L" label for a show? Inconsistencies and mistakes are inevitable.
Some proponents, such as Senator Joseph Lieberman (D-Conn.), candidly acknowledge that they hope ratings will affect content. Once a program is tagged with a rating such as "V" or "TV-14," it may well be abandoned by advertisers. Producers will be more likely to stay with "safe" programs with the biggest possible audience and the broadest financial backing, and the medium will become increasingly sanitized and homogenized. The V-chip, expected to be available in sets later this year, will escalate these problems; when activated, it will automatically block programs with certain ratings, ensuring the decline of controversial programs, no matter how worthwhile, because advertisers will be less willing to support programs with a limited audience.
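To make the blocking mechanics concrete, here is a minimal sketch, in Python, of the kind of threshold decision a V-chip might make. It is an illustration under stated assumptions, not the actual V-chip logic: a real set reads rating data transmitted with the broadcast signal, the children's ratings (TV-Y, TV-Y7) form a separate track rather than one ladder with the general-audience ratings, and the sample parental settings are hypothetical.

```python
# Sketch of V-chip-style blocking using the TV Parental Guidelines described
# above. Assumptions: the ratings are treated as a single ordered ladder (a
# simplification), and the household settings below are hypothetical.

AGE_ORDER = ["TV-Y", "TV-Y7", "TV-G", "TV-PG", "TV-14", "TV-MA"]

def v_chip_blocks(age_rating, content_labels, max_rating, blocked_labels):
    """Block when the program's age rating exceeds the parental ceiling, or
    when it carries any content label (S, V, L, D, FV) the parent screens out."""
    too_mature = AGE_ORDER.index(age_rating) > AGE_ORDER.index(max_rating)
    return too_mature or bool(set(content_labels) & set(blocked_labels))

# A household that permits up to TV-PG and blocks anything labeled for violence:
print(v_chip_blocks("TV-14", ["V", "L"], "TV-PG", ["V"]))  # True: blocked
print(v_chip_blocks("TV-PG", ["D"], "TV-PG", ["V"]))       # False: allowed
```

However the thresholds are chosen, the decision is mechanical and wholesale: every program above the line disappears, regardless of its merit.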
While efforts to rate television are not new, today's technological advances make it possible to block programs in a wholesale fashion. Once we travel down this road, how easy will it become to rate other material, such as books, magazines, recordings, videos, and video games? After ratings, what is next? An instructive case in point is the music industry, which has been forced to adopt "voluntary" labeling under the threat of regulation and is now facing proposals to criminalize the sale of music with warning labels to minors.
Internet Filters
The panic that greets the prospect of children browsing the Internet unsupervised is even greater than the fear about sex and violence on TV. Confirmed cases of Internet-facilitated solicitation of minors are rare but well-publicized; they have turned even progressive parents into advocates for content control and galvanized school and community administrators to embrace filtering.
Although the Communications Decency Act, which would have made it a crime to transmit indecent speech to minors on-line, was declared unconstitutional by the United States Supreme Court in Reno v. ACLU (or perhaps because of that ruling), the Internet has become the lightning rod for the enemies of free speech. In the aftermath of a ruling that gave on-line speech the same protection as books and other printed matter, there has been an explosion of proposals and devices designed to send controversial but protected expression to the outer regions of cyberspace.
Immediately following the Court's decision in Reno v. ACLU, the White House announced that it would sponsor a "family friendly Internet summit" to encourage "voluntary" industry efforts to rate and block controversial on-line speech. Industry leaders have been all too willing to comply. Here is what is on the menu of censorship tools du jour:
Stand-alone filters. Filtering software with descriptive names such as Cyber Patrol, CYBERSitter, SurfWatch, and Net Nanny is now widely marketed to individual users as well as to schools and libraries. These programs block access to websites, or edit out words and phrases when websites are downloaded, based on criteria determined by a third party: the software vendor.
Keyword searching is crude, especially because words are not judged contextually; for example, filters cannot discriminate between pornography and sex education. Numerous examples show that filters block an extensive body of useful information – about breast cancer, contraception, and AIDS, critiques of pornography, material with gay and lesbian themes, and so forth. Site blocking has different but equally disturbing problems: blocking is based on undisclosed criteria, and information about what is blocked is considered proprietary and kept secret by most software vendors. As a result, consumers don't know what they're missing or why it's been deleted.
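The context-blindness of keyword filtering is easy to demonstrate. The sketch below, in Python, uses a hypothetical blocked-word list and sample pages (vendors' actual lists are secret); it shows how bare substring matching sweeps up health information and innocent words along with the material it targets.

```python
# Sketch of context-blind keyword filtering. The blocked-word list and the
# sample pages are hypothetical illustrations, not any vendor's actual list.

BLOCKED_WORDS = ["sex", "breast", "nude"]

def is_blocked(page_text):
    """Block a page if any banned substring appears anywhere in its text."""
    text = page_text.lower()
    return any(word in text for word in BLOCKED_WORDS)

pages = [
    "Breast cancer awareness and early screening",  # blocked: contains "breast"
    "Visit Sussex county's historic villages",      # blocked: "sex" hides in "Sussex"
    "Gardening tips for beginners",                 # allowed
]

for text in pages:
    print("BLOCKED" if is_blocked(text) else "allowed", "-", text)
```

Commercial products refine this approach with phrase lists and databases of sites, but the underlying weakness – matching strings rather than meaning – is the same.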
It is difficult to determine the popularity of these stand-alone filters. Often they are "bundled" with a particular Internet service, so that a subscriber might also get SurfWatch, for example, at no extra cost. While promising "parental controls," few of the products actually deliver real control. Instead they maintain undisclosed lists of sites considered inappropriate for children, including many that might be useful, depriving parents of the information to make meaningful choices.
PICS (Platform for Internet Content Selection). Unlike stand-alone filters, PICS does not contain information regarding which sites should be blocked. It is a protocol for exchanging rating information. To utilize PICS, a user must first choose a rating system and then install software to filter content.[5]
PICS has been adopted by industry giants such as Disney and Microsoft, which have embedded it in their Web browsers. An equipped browser can read V-chip-type ratings from vendor software (third-party rating) or from website publishers (self-rating). As of September 1997, there were only three major third-party PICS rating services: RSACi, SafeSurf and NetShepherd. Moreover, most sites remain unrated and risk being blocked by major search engines. Although an individual theoretically could develop his or her own rating system, that is impractical and expensive, involving either rating countless sites or convincing websites to rate themselves under that system.
More flexible than stand-alone filters, PICS still has serious flaws. The problems of subjectivity and lack of uniformity in rating TV content apply here but in magnified form: how much violence is too much? Which violent/sexual images are "appropriate" and which are "gratuitous"? Should categories such as history, docu-drama, great work, or news be exempt? If so, who decides what qualifies?
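Mechanically, a PICS-aware browser reads a machine-readable label attached to a page and compares its category scores against user-set ceilings. The sketch below, in Python, parses a label written in an RSACi-style vocabulary (n, s, v, l, each scored 0-4); the label string, the simple regular-expression parsing, and the ceilings are illustrative assumptions, not a full implementation of the PICS specification.

```python
import re

# Sketch of PICS-style filtering: read category scores from a rating label
# and block if any score exceeds a user-chosen ceiling. The label mimics the
# RSACi vocabulary (n=nudity, s=sex, v=violence, l=language); the exact
# string and the thresholds are illustrative assumptions.

SAMPLE_LABEL = '(PICS-1.1 "http://www.rsac.org/ratingsv01.html" l r (n 0 s 0 v 3 l 1))'

CEILINGS = {"n": 0, "s": 0, "v": 1, "l": 2}  # hypothetical parental settings

def parse_scores(label):
    """Extract the 'letter number' pairs from the final rating group."""
    group = re.search(r"\(([a-z] \d(?: [a-z] \d)*)\)", label).group(1)
    tokens = group.split()
    return {tokens[i]: int(tokens[i + 1]) for i in range(0, len(tokens), 2)}

def should_block(label, ceilings):
    scores = parse_scores(label)
    return any(scores.get(cat, 0) > limit for cat, limit in ceilings.items())

print(parse_scores(SAMPLE_LABEL))            # {'n': 0, 's': 0, 'v': 3, 'l': 1}
print(should_block(SAMPLE_LABEL, CEILINGS))  # True: violence score 3 exceeds 1
```

Even in this toy version, every number rests on a human judgment: someone decided the site's violence merits a 3, and someone decided the household's ceiling should be 1.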
* * *
Federal and state proposals portend unabated efforts to censor the Internet. Senator Patty Murray (D-Wash.) proposed legislation to require or induce websites to self-rate. Senator Coats (R-Ind.) has proposed a "son-of-CDA" to penalize on-line distribution of commercial material that is "harmful to minors." Senator John McCain (R-Ariz.) has proposed linking federal money for Internet technology in schools to the use of filters, a proposal opposed by the National Education Association, among others. Congress is additionally considering bills that would eliminate or regulate unsolicited commercial e-mail. Finally, states around the country have enacted laws censoring on-line speech, by banning material that is "indecent" or "harmful to minors," by penalizing "spam" or unsolicited on-line advertising, and by requiring libraries to install filtering software. Many are being challenged; the ACLU has already been successful in defeating state laws in New York and Georgia.
Is it realistic to think that the vast Internet can be regulated? There are a hundred million pages already on the Web, with thousands being added daily. As of the end of 1997, only about 50,000 out of more than a million sites had rated themselves; about 5,000 new sites rate themselves each month. Use of PICS technology could cause many sites to become invisible – if unrated, they may automatically be blocked. And this is not counting chat rooms: on any given day, about 100,000 AOL customers visit chat rooms, and chat rooms across all servers draw about a million visitors a night. All of these would have to be rated to avoid being inaccessible on many systems.
News organizations have almost uniformly declined to conform to a rating system that screens out sex or violence, even if it means that their on-line editions will be blocked for Internet customers using PICS. The Recreational Software Advisory Council has proposed a special "news" label to rate by category, but this raises the difficult question of what qualifies as a "news agency."
While labeling poses significant problems, some parents welcome it as a way to protect their children, fearing that pedophiles and stalkers abound on the Internet. However, these fears are not justified by the little data that exist, which suggest that Internet-facilitated crimes against children are rare. Moreover, existing laws already criminalize stalking, harassment, child pornography, obscenity, and solicitation – wherever they occur. Parents wishing to guide their children's Internet activities can refer to 50+ Great Web Sites for Parents and Kids, from the American Library Association, and other resources promoted by many groups. And, of course, there is no substitute for parental attention and guidance in helping children acquire the skills they need to protect themselves in cyberspace and real space.
Libraries and Schools
"The library is not a safe haven. It's a place for ideas," according to Santa Clara Librarian Susan Fuller. The role of libraries is to provide unlimited access to information, within the traditional limits of space and budget. How does the Internet impact on that mission?
The American Library Association is opposed to filtering in libraries because it does not provide for individual choice, imposes filtering on everyone, treats users of all ages identically, and blocks access to valuable information. To address the needs of those concerned about children's use of the Internet, the ALA has published The Librarian's Guide to Cyberspace for Parents and Kids. Nevertheless, some libraries around the country are installing blocking software, in response to concerns about children's exposure to pornography and other "inappropriate" material.
Oklahoma City's libraries have installed Bess, a filtering service that, according to the ACLU, blocks access to a wide variety of sites that contain valuable information for minors as well as adults, such as AIDS and HIV-related speech, safe sex information, art sites with classic nudes, gay and lesbian issues and literature. The ACLU has joined other plaintiffs in challenging the use of the filter X-Stop in the Loudoun County, Virginia, public library – the library says it acted to protect children from pornography and to protect female library workers from a "hostile workplace." The library's director says that he knew the software would have flaws: his own test determined that the filter allowed access to nine pornographic sites and blocked 57 sites with no objectionable material.
Public libraries, as government institutions, must respect First Amendment rights and obligations. Adults are entitled to override any filtering software in public libraries, so that they can access lawful information, even if libraries can limit minors' access to material that is "harmful to minors" under state law. As the Supreme Court reiterated in ACLU v. Reno, "'the level of discourse reaching a mailbox simply cannot be limited to that which would be suitable for a sandbox.'"
As technology develops, libraries may allow users to filter their own content according to individual preferences, without implementing library-wide filtering. Many libraries have purchased the Library Channel, software that allows access to 18,000 Web sites pre-selected by librarians, but which also allows unlimited access to the Web using a traditional browser. The Library Channel may be viewed as a search engine that filters by making recommendations, equivalent to a librarian's guide, and might be unobjectionable if unlimited access to the Internet were also available to users who choose it. Other libraries are requiring children to have parental permission for unrestricted access to the Internet.
Attitudes about the use of Internet filtering and blocking devices in libraries may depend on whether they are viewed more as the removal of materials from library shelves, as opposed to judgments about the selection of materials to acquire that librarians have always made. But neither analogy is entirely apt, as the library of the 21st century is not the same as the library of the past. The Internet offers each library the opportunity to be comprehensive in its holdings, without regard to space and budgetary restraints. Viewed in that way, the decision to install filters represents a choice to limit information available to patrons, because of a judgment about content.
All of these debates occur in the context of schools as well, complicated by Supreme Court decisions[6] giving school administrators substantial discretion in making pedagogical decisions. Nonetheless, conflicts are bound to arise over the extent to which students' access can be restricted.
Summary
Perhaps some day technology will allow each of us to customize the information and entertainment coming into our homes. Today, however, the tools available for rating and blocking content on television and the Internet are imprecise, denying access to constitutionally protected material. Yet the technological ability to "customize" information will only deepen the inevitable clash of values, as individuals with diverse views seek to have their personal preferences reflected in their schools and libraries. Once again, advances in information technology force us to re-evaluate our commitment to First Amendment freedoms. The challenge is not a new one:
"I thank God we have not free schools nor printing; and I hope we shall not these hundred years. For learning has brough disobedience and heresy, and sects into the world; and printing has divulged them and libels against the government. God keep us from both."7
__________________________
* American Civil Liberties Union v. Reno, 929 F. Supp. 824, 882 (E.D. Pa. June 11, 1996), aff'd, Reno v. ACLU, 521 U.S. 844 (1997).
[1] Prepared by Marilyn C. Mazur, NCAC cooperating attorney.
[2] 47 U.S.C. sec. 330 (1996).
[3] As this suggests, the Telecommunications Act has spawned a series of legislative proposals. Some would shift "violent" programming to late-night Siberia for networks that refuse to implement program-content ratings. "Violent" programming is not defined and could therefore include a wide range, from Shakespearean plays and films about slavery or the civil rights movement, to animated cartoons such as Popeye and documentaries about domestic or gang violence.
[4] Efforts to clean up the airwaves are not new. In the 1970s, Congress tried to create a "voluntary" "Family Viewing Hour." The effort was found unconstitutional: "threats, and the attempted securing of commitments coupled with the promise to publicize noncompliance… constituted per se violations of the First Amendment," according to the federal judge who decided the case.
[5] PICSRules technology, developed late last year, raises PICS to a higher level by combining pieces of different rating systems. Although PICSRules allows for customized ratings, the technology remains theoretical: no software employing it has yet come on the market. Experts suggest that it is too complicated and cumbersome for widespread use.
[6] See, e.g., Hazelwood School District v. Kuhlmeier, 484 U.S. 260 (1988).
[7] Sir William Berkeley (1671), quoted in Ingelhart, Press and Speech Freedoms in America, 1619-1995: A Chronology (Westport, CT: Greenwood Press, 1997), p. 9.