
Update: a new edition of Internet Filters: A Public Policy Report has been published and is available from the Brennan Center for Justice.

The report documents how the widespread use of filters limits the free exchange of ideas necessary in a healthy democracy. Despite some manufacturers’ claims of improved technology, filters still must operate by “keywords,” and they block massive amounts of valuable information about politics, religion, public health, and myriad other subjects. The report analyzes almost 100 tests and studies of filtering products and includes hundreds of examples of egregious overblocking.

The Children’s Internet Protection Act (CIPA) requires filters in most schools and libraries, for adults and minors alike. As the report concludes: “Although some may say that the debate is over and that filters are now a fact of life, it is never too late to rethink bad policy choices.”

Original report follows:

Copyright 2001 National Coalition Against Censorship. Any part of this report may be reproduced without charge so long as acknowledgment is given to the Free Expression Policy Project. For additional copies, contact [email protected].

In the spring and summer of 2001, the Free Expression Policy Project of the National Coalition Against Censorship surveyed all of the studies and tests that it was able to locate describing the actual operation of 19 products or software programs that are commonly used to filter out World Wide Web sites and other communications on the Internet. This report summarizes the results of that survey. Its purpose is to provide a resource for policymakers and the general public as they grapple with the difficult, often hotly contested issues raised by the now-widespread use of Internet filters.

The existing studies and tests vary widely. They range from anecdotal accounts to extensive tests applying social-science methodologies. In some instances, we located only one or two test reports; in other cases (for example, Cyber Patrol, SmartFilter, and X-Stop) we found a great many. Most tests simply describe the actual sites that a particular product blocked when Web searches were conducted. Nearly every one, however, revealed massive over-blocking by filtering software.

This problem stems from the very nature of filtering, which must, because of the sheer number of Internet sites, rely to a large extent on mindless mechanical blocking through identification of key words and phrases. Where human judgment does come into play, filtering decisions are based on different companies’ broad and varying concepts of offensiveness, “inappropriateness,” or disagreement with the political viewpoint of the manufacturer. A few examples of over-blocking from the more than 70 studies or tests summarized in this report are:


  • BESS blocked the home pages of the Traditional Values Coalition and Massachusetts Congressman Edward Markey.
  • Cyber Patrol blocked MIT’s League for Programming Freedom, part of the City of Hiroshima Web site, Georgia O’Keeffe and Vincent Van Gogh sites, and the monogamy-advocating Society for the Promotion of Unconditional Relationships.
  • CYBERsitter blocked virtually all gay and lesbian sites and, after detecting the phrase “least 21,” blocked a news item on the Amnesty International Web site (the offending sentence read, “Reports of shootings in Irian Jaya bring to at least 21 the number of people in Indonesia and East Timor killed or wounded”).
  • I-Gear blocked an essay on “Indecency on the Internet: Lessons from the Art World,” the United Nations report “HIV/AIDS: The Global Epidemic,” and the home pages of four photography galleries.
  • Net Nanny, SurfWatch, CYBERsitter, and BESS, among other products, blocked House Majority Leader Richard “Dick” Armey’s official Web site upon detecting the word “dick.”
  • SafeSurf blocked the home pages of the Wisconsin Civil Liberties Union and the National Coalition Against Censorship.
  • SmartFilter blocked the Declaration of Independence, Shakespeare’s complete plays, Moby Dick, and Marijuana: Facts for Teens, a brochure published by the National Institute on Drug Abuse (a division of the National Institutes of Health).
  • SurfWatch blocked such human-rights sites as the Commissioner of the Council of the Baltic Sea States and Algeria Watch, as well as the University of Kansas’s Archie R. Dykes Medical Library (upon detecting the word “dykes”).
  • WebSENSE blocked the Jewish Teens page and the Canine Molecular Genetics Project at Michigan State University.
  • X-Stop blocked the National Journal of Sexual Orientation Law, Carnegie Mellon University’s Banned Books page, “Let’s Have an Affair” catering company, and, through its “foul word” function, searches for Bastard Out of Carolina and “The Owl and the Pussy Cat.”


The still new, revolutionary medium of the Internet contains a wealth of information, images, and ideas; as the U.S. Supreme Court observed in 1997, “the content on the Internet is as diverse as human thought.”1 Unsurprisingly, not all of this online expression is accurate, pleasant, or inoffensive. Virtually since the arrival of the Internet, concerns have been expressed about minors’ access to online pornography, about the proliferation of Web sites advocating racial hatred, and about other online content deemed to be offensive or dangerous. Congress and the states responded in the late 1990s with censorship laws, but most of these have been struck down by the courts. Partly as a result, individual parents, employers, school districts, and other government entities have turned with increasing frequency to privately manufactured Internet rating and filtering programs.

Early Internet filtering was based on either “self-rating” by those who published online communications or “third-party rating” by filter manufacturers. Because of the Internet’s explosive growth (now more than a billion Web sites, many of which change daily), and the consequent inability of filtering companies to review and evaluate even a fraction of it, third-party rating had to rely largely on mechanical blocking by key words or phrases such as “over 18,” “breast,” “sex,” or “pussy.” The results were not difficult to predict: large quantities of valuable information and literature, particularly about sexuality, feminism, gay and lesbian issues, civil rights, and other politically important subjects, were blocked.
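The mechanical key-word blocking described above can be sketched in a few lines. This is an illustrative reconstruction, not any vendor’s actual code; the term list and the simple substring-matching rule are assumptions chosen to mirror the examples documented in this report.

```python
# Illustrative sketch of naive key-word filtering (not any vendor's code).
# A page is blocked if any listed term appears anywhere in its text,
# with no attention to context; that context-blindness is the source of
# the over-blocking the report documents.

BLOCKED_TERMS = {"over 18", "breast", "sex", "dick", "dykes"}

def is_blocked(page_text: str) -> bool:
    """Return True if any blocked term occurs as a substring of the page."""
    text = page_text.lower()
    return any(term in text for term in BLOCKED_TERMS)

# Context-blind matching trips over innocuous pages:
print(is_blocked("Home page of House Majority Leader Dick Armey"))  # True
print(is_blocked("Archie R. Dykes Medical Library catalog"))        # True
print(is_blocked("City council meeting schedule"))                  # False
```

No amount of list curation fixes this: any term short enough to catch pornographic pages will also appear, as a substring, in legitimate ones.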

Even where company employees did review Web sites, there arose massive problems of subjectivity. The political attitudes of the different filter manufacturers were reflected in blocking decisions, particularly with respect to such subjects as homosexuality, human rights, and criticism of filtering software. The alternative, self-rating, did not suffer these disadvantages, but it proved impossible to persuade the great majority of online speakers to self-rate their sites. Online news organizations, for example, were among those that steadfastly refused to reduce their content to decontextualized, simplistic letters or codes through self-rating.

Third-party rating and filtering systems have thus become the industry standard, at least in the United States. Private software companies actively market such products as SurfWatch and Cyber Patrol, which contain multiple categories of potentially offensive, “inappropriate,” or “objectionable” material. Internet service providers such as America Online provide “parental control” options that block Web sites based on technological word or phrase identification, augmented by the judgments of the company or its subcontractor about age-appropriateness. Some manufacturers market products that essentially block all of the Internet, with only a few hundred or thousand preselected sites accessible (so-called whitelists). One company, later the subject of a First Amendment lawsuit, erroneously claimed that its “X-Stop” software was able to identify and block only “illegal” obscenity and child pornography: an impossible task, since legal judgments in both categories are subjective, and under the Supreme Court’s three-part obscenity test, determinations of legality vary depending on different communities’ standards of “prurience” and “patent offensiveness.”2

The late 1990s saw political battles in many communities over the use of filtering products in public libraries. New groups such as Family Friendly Libraries attacked the American Library Association (ALA) for adhering to a no-censorship and no-filtering policy, even for minors. (The ALA and other champions of intellectual freedom objected to the over-blocking propensities of filtering software, and advocated noncensorial approaches such as privacy screens and “acceptable use” policies.) Online anti-censorship groups such as the Censorware Project and Peacefire began to publish reports documenting the blocking of numerous valuable, educational sites by different filters. In December 2000, Congress passed the Children’s Internet Protection Act (“CIPA”), mandating filters in all schools and libraries that receive federal financial assistance through the E-rate or “universal service” program, or through the Library Services and Technology Act.3 This amounted to about 60% of the nation’s libraries and public schools.

Thus, although initially promoted as a voluntary alternative to coercive government censorship, Internet filtering is now embraced by government at both the federal and local levels. Reports of over-blocking, of vague and subjective standards, and of politically biased blocking decisions continue, while industry spokespersons assert that their methodologies are improving and that new software programs designed to distinguish between acceptable and unacceptable material will soon be on the market. But no filtering technology, no matter how sophisticated, can make contextualized judgments about the value, offensiveness, or age-appropriateness of online expression.

Internet filtering has thus become a major public policy issue, and is likely to remain so. In the interests of advancing informed debate on this important issue, the Free Expression Policy Project has collected and summarized all of the studies and tests that it has been able to locate on the actual operation of Internet filters. The report presents this information in one place and in readily accessible form, so that the ongoing policy debate will be better informed about what Internet filters actually do, and their ultimate impact on free expression.

The report is organized by filtering product. Necessarily, there is some overlap, since many studies have sampled more than one product. A bibliography of all the studies is included, along with an appendix listing blocked sites according to subject: artistic and literary sites; sexuality education; gay and lesbian information; political topics; and sites relating to censorship itself. (Another appendix, describing the blocking categories used by different products, is available in the online version of this report.)

Where the study gives Web addresses or URLs, we have included these and checked their accuracy whenever possible. (Some Web addresses are now obsolete.) If we have not given Web addresses, it is because they were not supplied in the underlying report.

We hope that Internet Filters: A Public Policy Report will prove a useful resource for policymakers, parents, teachers, librarians, and all others concerned with the Internet, intellectual freedom, or the education of youth. Internet filtering is popular, despite its unreliability, because many parents, political leaders, and educators feel that the alternative, unfettered Internet access, is even worse. But to make these policy choices, it is necessary to have complete and accurate information about what filters actually do. Ultimately, less censorial approaches such as media literacy, sexuality education, and Internet acceptable-use training may be better policy choices than Internet filters in addressing concerns about young people’s access to “inappropriate” content or disturbing ideas.


America Online Parental Controls

AOL offers three levels of Parental Controls: “Kids Only,” for children aged 12 and under; “Young Teen,” for ages 13-15; and “Mature Teen,” for ages 16-17, which allows access to “all content on AOL and the Internet, except certain sites deemed for an adult (18+) audience.” At one time AOL employed Cyber Patrol’s block list; at another point it stated it was using SurfWatch. As of 2001 the Parental Controls information page provided no specific information about its filtering categories or methodology other than its use of a user-recommended database of sites; on May 2, 2001, however, AOL announced that Parental Controls had integrated the RuleSpace Company’s “Contexion Services,” which identifies “objectionable” sites “by analyzing both the words on a page and the context in which they are used.”4

Access Denied, Version 2.0: The Continuing Threat Against Internet Access and Privacy and its Impact on the Lesbian, Gay, Bisexual and Transgender Community, Gay and Lesbian Alliance Against Defamation (GLAAD), 1999.
This 1999 report was a follow-up to GLAAD’s 1997 report Access Denied: The Impact of Internet Filtering Software on the Lesbian and Gay Community, which described the potential defects of various filtering products without identifying particular blocked sites. Access Denied, Version 2.0 addressed AOL Parental Controls only in its introduction, where it reported that AOL’s “Kids Only” setting blocked the Web site of Children of Lesbians and Gays Everywhere (COLAGE), as well as a number of “family, youth and national organization Web sites with lesbian and gay content,” none of which were specifically named or described in the report.

Brian Livingston, “AOL’s ‘youth filters’ protect kids from Democrats,” CNet, Apr. 24, 2000.
This news report described Livingston’s investigation of AOL’s blocking decisions for signs of political bias. He found that the “Kids Only” setting blocked the Web sites of the Democratic National Committee, the Green Party, and Ross Perot’s Reform Party, but not those of the Republican National Committee or of the conservative Constitution and Libertarian parties. Livingston also reported that AOL’s “Young Teen” setting blocked the home pages of the Coalition to Stop Gun Violence, Safer Guns Now, and the Million Mom March, but neither the NRA site nor the commercial sites for Colt and Browning firearms.

“AOL Parental Controls error rate for the first 1,000 .com domains,” Peacefire, Oct. 23, 2000.
Peacefire Webmaster Bennett Haselton selected 1,000 dot-com domains he had compiled for a similar test of SurfWatch 2 months earlier (see p. 39), and attempted to access each site on AOL 5.0 adjusted to its “Mature Teen” setting. Five of the 1,000 working domains were blocked, including a site on which vinegar and seasonings were sold. Haselton decided the 4 others were “pornographic” and thus accurately blocked. This produced an “error rate” of 20%, the lowest, by Peacefire’s calculation, of the 5 filters it tested. AOL also “blocked far fewer pornographic sites than any of the other programs,” however. Haselton stated that 5 blocked domains was an insufficient sample to gauge the efficacy of AOL Parental Controls accurately, and that the true error rate could fall anywhere between 5 and 75%.
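Peacefire’s “error rate,” as described here, is the share of blocked sites that were blocked wrongly, not the share of all sites tested. A minimal sketch of the arithmetic (the function name is ours, not Peacefire’s):

```python
# Peacefire-style "error rate": wrongly blocked sites as a fraction of
# all blocked sites (not of all sites tested). Function name is ours.

def error_rate(wrongly_blocked: int, total_blocked: int) -> float:
    return wrongly_blocked / total_blocked

# AOL "Mature Teen": 5 of 1,000 domains blocked, 1 of them a mistake.
print(f"{error_rate(1, 5):.0%}")   # 20%
# BESS, in Peacefire's later test: 26 substantive blocks, 7 mistakes.
print(f"{error_rate(7, 26):.0%}")  # 27%
```

Because the denominator is so small, a single reclassified site swings the figure dramatically, which is why Haselton cautioned that the true rate could fall anywhere in a wide range.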

“Digital Chaperones for Kids,” Consumer Reports, Mar. 2001.
Consumer Reports published its assessments of AOL’s “Young Teen” and “Mature Teen” settings in this review of various filtering technologies. Through each, Consumer Reports attempted to access 86 Web sites it deemed objectionable because they contained “sexually explicit content or violently graphic images” or promoted “drugs, tobacco, crime, or bigotry,” and 53 it deemed legitimate because they “featured serious content on controversial subjects.” The “Mature Teen” setting left 30% of the “objectionable” sites unblocked; the “Young Teen” filter failed to block 14%, the lowest such error rate of all products reviewed as far as underinclusive filtering was concerned. But “Young Teen” also blocked 63% of the “legitimate” sites, including an online guide to lesbian politics, history, arts, and culture; the Web sites of the Citizens’ Committee for the Right to Keep and Bear Arms and the Southern Poverty Law Center; and SEX, Etc., a sex-education site written by and for teenagers and hosted by Rutgers University.

Miscellaneous Reports
  • In “BabelFish blocked by censorware” (Feb. 27, 2001), Peacefire reported that AOL’s “Mature Teen” setting barred access to BabelFish, AltaVista’s foreign-language translation service.


BESS

BESS, manufactured by N2H2, provides its Internet-filtering services in one of 2 ways: either as a proxy server, whereby each Web request is passed through a server located at N2H2 itself, or in the form of a dedicated server called the “Internet Filtering Manager,” installed on a local computer or system. Dedicated-server administrators can enable or disable any of BESS’s blocking categories, as well as BESS’s keyword filtering features; users on BESS proxy servers cannot. In both scenarios, BESS provides 29 categories of blocked content under its “Typical School Filtering” setting, ranging from “Adults Only” and “Alcohol” to “Gambling,” “Lingerie,” “Personals,” and “Tasteless/Gross.” (See appendix B for a complete list.) N2H2 states that 4 of the 29 classifications (“History,” “Medical,” “Moderated,” and “Text/Spoken Only”) are designed to distinguish between sites falling squarely into BESS’s blocking categories and those that may contain sexually oriented, violent, or other questionable content but also some educational merit, such as the Starr report to Congress on President Clinton’s sexual transgressions.

Under the “Maximum Filtering” setting, all 29 categories, as well as employment sites, message and bulletin boards, investment-related sites, images of individuals wearing swimsuits, and all Web searches are blocked. Configured for “Minimal Filtering,” N2H2’s Internet Filtering Manager blocks sites falling into the categories of “Adults Only,” “Hate/Discrimination,” “Illegal,” “Pornography,” “Sex,” and “Violence.”

Karen Schneider, A Practical Guide to Internet Filters, 1997.
From April to September 1997, Karen Schneider supervised a nationwide team of librarians in testing 13 filtering technologies, including BESS. The results of the Internet Filter Assessment Project, or TIFAP, were published later that year in Schneider’s Practical Guide to Internet Filters.

The researchers began by seeking answers to some 100 common research queries on the Web, on both unfiltered computers and ones equipped with BESS (and the various other filters) configured for maximum blocking, including keyword blocking. Each query fell into one of 11 categories: “sex and pornography,” “anatomy,” “drugs, alcohol, and tobacco,” “gay issues,” “crimes (including pedophilia and child pornography),” “obscene or ‘racy’ language,” “culture and religion,” “women’s issues,” “gambling,” “hate groups and intolerance,” and “politics.” The queries were purposely devised to gauge filters’ handling of controversial issues, for instance: “I’d like some information on safe sex”; “I want to do some research on Robert Mapplethorpe”; “I want information on the legalization of marijuana”; “I want information on PFLAG (Parents and Friends of Lesbians and Gays)”; “Is the Aryan Nation the same thing as Nazis?”; and “Who are the founders of the Electronic Frontier Foundation and what does it stand for?” In some cases, the queries contained potentially provocative terms “intended to trip up keyword-blocking mechanisms,” such as “How do beavers make their dams?”; “Can you find me some pictures from Babes in Toyland?”; “I need some information and a picture of the Enola Gay”; “I’m a farmer and want to research rape, the plant used to make canola oil”; and “I’m trying to find out about the Paul Newman movie The Hustler.”

Schneider used Web sites, blocked and unblocked, that arose from these searches to construct her testing sample of 240 URLs. Researchers tested these URLs against a version of BESS configured for “Maximum Filtering,” but with keyword filtering disabled. TIFAP found that “several” (Schneider did not say how many) nonpornographic sites were blocked, including a page discussing X-rated videos but not containing any pornographic imagery, and an informational page on trichomoniasis, a vaginal disease. Upon notification and review, BESS later unblocked the trichomoniasis site. A Practical Guide included neither the names nor the Web addresses of the blocked sites.

Passing Porn, Banning the Bible: N2H2’s Bess in Public Schools, Censorware Project, 2000.
From July 23-26, 2000, the Censorware Project tested “thousands” of URLs against 10 BESS proxy servers, 7 of which were in use in various public schools across the United States. Among the blocked Web sites were Friends of Lulu, a site promoting comic books to girls; a page from Mother Jones magazine’s site; the Institute of Australasian Psychiatry; the nonprofit effort Stop Prisoner Rape; and a portion of the Columbia University Health Education Program site on which users are invited to submit “questions about relationships; sexuality; sexual health; emotional health; fitness; nutrition; alcohol, nicotine, and other drugs; and general health” ( BESS also blocked several sites opposing censorship, including the Web site of the United Kingdom-based Feminists Against Censorship, the personal site of a librarian opposing Internet-filter use in libraries, and Time magazine’s “Netly News,” which has reported, positively and negatively, on filtering software.

The report noted that BESS does not, contrary to what its published filtering criteria imply, review home pages hosted by such free site providers as Angelfire, Geocities, and Tripod (owing, it seems, to their sheer number). Instead, users must configure the software to block none or all of these sites; some schools opt for the latter, thus prohibiting access to such sites as The Jefferson Bible, a compendium of Biblical passages selected by Thomas Jefferson, and the Web site of the Eustis Panthers, a high-school baseball team. Though each proxy was configured to filter out pornography to the highest degree, Censorware was able to access “hundreds” of pornographic Web sites, of which 46 are listed in Passing Porn. Of the total unblocked pornographic URLs, some 285 were listed on, and of these, 28 were accessible through all 7 of the proxies in use in public schools.

“‘BESS, the Internet Retriever’ Examined,” Peacefire, 2000.
This report consists of a list of 15 sites that Peacefire deemed inappropriately blocked by BESS during the first half of 2000. These included itself, which was blocked for “Profanity” when the word “piss” appeared on the site (within a quotation from a letter written by Brian Milburn, president of CYBERsitter’s manufacturer, Solid Oak Software, to journalist Brock Meeks). Also blocked were two portions of the Web site of Princeton University’s Office of Population Research, both resources on contraception methods (one on emergency contraception pills, the other on IUDs); the Safer Sex page; 5 gay-interest sites, including the home page of the Illinois Federation for Human Rights, which “works to preserve the equal rights of lesbian and gay Illinoisians,” and 2 online magazines devoted to gay topics; 2 Web sites providing resources on eating disorders; and 3 sites discussing breast cancer.

“Mandated Mediocrity: Blocking Software Gets a Failing Grade,” Peacefire & Electronic Privacy Information Center (EPIC), Oct. 2000.
“Mandated Mediocrity” describes another 23 Web sites inappropriately blocked by BESS. The URLs were tested against an N2H2 proxy as well as a trial copy of the N2H2 Internet Filtering Manager set to “Typical School Filtering.” Among the blocked sites were the home page of the Traditional Values Coalition; Hillary for President; The Smoking Gun (, an online selection of primary documents relating to current events (“obtained from government and law enforcement sources, via Freedom of Information requests, and from court files nationwide”); a selection of travel photographs of Utah’s national parks; “What Is Memorial Day?”, an essay lamenting the “capitalistic American” conception of the holiday as nothing more than an occasion for a 3-day weekend; the home page of “American Government and Politics,” a course at St. John’s University; and the Circumcision Information and Research Pages, a site that contained no nudity and was designated a “Select Parenting Site.”

“BESS error rate for 1,000 .com domains,” Peacefire, Oct. 23, 2000.
This October 2000 test involving a sample of 1,000 active dot-com domains has already been described (see p. 6). Because N2H2 had evidently reviewed Peacefire’s earlier SurfWatch report and prepared for a similar test of its own software by unblocking any of those 1,000 sites that BESS inappropriately filtered,6 Peacefire selected the second 1,000 dot-com domains and tested them against a BESS proxy server in use at a school, where a student had agreed to help run the test.

The filter was configured to block sites in the categories of “Adults Only,” “Alcohol,” “Chat,” “Drugs,” “Free Pages,” “Gambling,” “Hate/Discrimination,” “Illegal,” “Lingerie,” “Nudity,” “Personals,” “Personal Information,” “Porn Site,” “Profanity,” “School Cheating Info,” “Sex,” “Suicide/Murder,” “Tasteless/Gross,” “Tobacco,” “Violence,” and “Weapons.” The program’s keyword-blocking features were also enabled. The BESS proxy blocked 176 of the 1,000 domains; among these, 150 were “under construction.” Of the remaining 26 sites, Peacefire deemed 7 wrongly blocked:,,,,,, and

The report said the resulting “error rate” of 27% was unreliable given how small a sample was examined; the true error rate “could be as low as 15%.” Peacefire’s Bennett Haselton also noted that the dot-com domains tested here were “more likely to contain commercial pornography than, say, .org domains. … [W]e should expect the error rate to be even higher for .org sites” (Haselton’s emphasis), and added that the results called into question N2H2 CEO Peter Nickerson’s claim, in 1998 testimony before the House Subcommittee on Telecommunications, Trade, and Consumer Protection, that “[a]ll sites that are blocked are reviewed by N2H2 staff before being added to the block lists.”7

“Blind Ballots: Web Sites of U.S. Political Candidates Censored by Censorware,” Peacefire, Nov. 7, 2000.
“Blind Ballots” was published on Election Day, 2000. Peacefire obtained a random sample of U.S. political candidates’ Web sites from, a site providing information on political campaigns nationwide, and set out to see which sites BESS’s (and Cyber Patrol’s) “Typical School Filtering” would allow users to access. (Around the start of the 2000 school year, BESS and Cyber Patrol asserted that together they were providing filtered Internet access to more than 30,000 schools nationwide.8)

BESS’s wholesale blocking of free Webpage hosting services caused the sites of one Democratic candidate, 5 Republicans, 6 Libertarians (as well as the entire Missouri Libertarian Party site), and 13 other third-party candidates to be blocked. Report coauthor Bennett Haselton commented that, as “many of our political candidates run their campaigns on a shoestring, and use free-hosting services to save money,” BESS’s barring of such hosts gives it an inadvertent bias toward wealthy or established politicians’ sites. Congressional incumbent Edward Markey (a Democrat from Massachusetts) also had his site blocked; unlike the others, it was not hosted by Geocities or Tripod, but was blocked because BESS categorized its content as “Hate, Illegal, Pornography, and/or Violence.” “While blocking software companies often justify their errors by pointing out that they are quickly corrected,” Haselton wrote, “this does not help any of the candidates listed above. . . . [C]orrections made after Election Day do not help them at all.”

“Amnesty Intercepted: Global Human Rights Groups Blocked by Web Censoring Software,” Peacefire, Dec. 12, 2000.
In response to complaints from students barred from the Amnesty International Web page, among others, at their school computer stations, Peacefire undertook an examination of various filters’ treatment of human rights sites. Peacefire found that BESS’s “Typical School Filtering” blocked the home pages of the International Coptic Congress, which tracked human rights violations against Coptic Christians living in Egypt, and Friends of Sean Sellers, which contained links to the works of the Multiple Personality Disorder-afflicted writer who was executed in 1999 for murders he had committed as a 16-year-old (the site opposed capital punishment). “Typical School Filtering” also denied access to the official sites of recording artists Suzanne Vega and the Art Dogs; both contained statements that portions of their proceeds would be donated to Amnesty International. Peacefire also reported that BESS’s “Minimal Filtering” configuration blocked the Web sites of Human Rights & Tamil People, which tracks government and police violence against Hindu Tamils in Sri Lanka, and Casa Alianza, which documents the condition of homeless children in the cities of Central America.

Miscellaneous Reports

  • In its survey of “Winners of the Foil the Filter Contest” (Sept. 28, 2000), the Digital Freedom Network reported that BESS blocked House Majority Leader Richard “Dick” Armey’s official Web site upon detecting the word “dick.”
  • Peacefire reported, in “BabelFish blocked by censorware” (Feb. 27, 2001), that BESS blocked the URL-translation site BabelFish.
  • In “Teen Health Sites Praised in Article, Blocked by Censorware” (Mar. 23, 2001), Peacefire noted that BESS blocked portions of TeenGrowth (, a teen-oriented health education site recognized by the New York Times in the article “Teenagers Find Health Answers with a Click.”9


ClickSafe

Rather than relying on lists of objectionable URLs, ClickSafe is designed to review each requested page in real time. According to company cofounder Richard Schwartz’s outline for testimony submitted in 2000 to the commission created by the 1998 Child Online Protection Act (the COPA Commission), ClickSafe “uses state-of-the-art, content-based filtering software that combines cutting edge graphic, word and phrase-recognition technology to achieve extraordinarily high rates of accuracy in filtering pornographic content,” and “can precisely distinguish between appropriate and inappropriate sites.”

“Sites blocked by ClickSafe,” Peacefire, July 2000.
Upon learning that ClickSafe blocked the home page of cyberlaw scholar Lawrence Lessig (,1157,1739,00.html), who was to testify before the COPA Commission, Peacefire attempted to access various pages on the COPA Commission site, as well as the Web sites of organizations and companies with which the commissioners were affiliated, through a computer equipped with ClickSafe. On the COPA Commission’s site, ClickSafe blocked the Frequently Asked Questions page (; the biographies of commission members Stephen Balkam (, Donna Rice Hughes (, and John Bastian (; a list of “technologies and methods within the scope” of the commission’s inquiry (; the commission’s Scope and Timeline Proposal (; and two versions of the statute itself ( and

As for groups with representatives on the commission, Peacefire found that ClickSafe blocked several organizations’ and companies’ sites, at least partially: Network Solutions (; the Internet Content Rating Association; Security Software’s information page on its signature filtering product, Cyber Sentinel (; FamilyConnect (, a brand of blocking software (the page blocked was one on which users could submit URLs to be reviewed as potential blocks or unblocks); the National Law Center for Children and Families; the Christian site; and the Center for Democracy and Technology ( In addition to the CDT, ClickSafe blocked the home pages of the ACLU (, the Electronic Frontier Foundation (, and the American Family Association, as well as part of the official site of Donna Rice Hughes’s book, Kids Online: Protecting Your Children in Cyberspace.

Cyber Patrol

Cyber Patrol, currently owned by SurfControl, operates with 12 default blocking categories, including “Partial Nudity,” “Intolerance,” “Drugs/Drug Culture,” and “Sex Education.” (See appendix B.) According to the manufacturer’s Web site, “Cyber Patrol employs a team of professional researchers at least 21 years of age including parents and teachers” to determine whether sites are to be blocked. Any page that “contains more than 3 instances in 100 messages or any easily accessible pages with graphics, text or audio that fall within the definition” of any of the 12 categories “will be considered sufficient to place the source in that category.” As with most filtering products, Cyber Patrol’s list of prohibited sites is not made public, but SurfControl offers the CyberNOT search engine, a feature on its Web site through which users can enter URLs and receive immediate responses as to whether or not those pages are on the filter’s block list. SurfControl adds, “Internet sites that contain information or software programs designed to hack into filtering software, including Cyber Patrol, are added to the CyberNOT list in ALL categories as a measure of protection for the parents, educators and businesses that rely on Cyber Patrol to screen Internet content.”

Brock Meeks and Declan McCullagh, “Jacking in from the ‘Keys to the Kingdom’ Port,” CyberWire Dispatch, July 3, 1996.
The first evaluation of Cyber Patrol appeared in this early report on the problems of Internet filtering by journalists Brock Meeks and Declan McCullagh. Meeks and McCullagh viewed a decrypted version of Cyber Patrol’s block list (along with those of CYBERsitter and Net Nanny), and noticed that Cyber Patrol stored the Web addresses it blocked only partially, truncating the final segment of each URL to its first few characters. For instance, the software was meant to block, a Carnegie Mellon student home page containing information on the occult; yet on its block list Cyber Patrol recorded only ~sha, thereby blocking every site beginning with that URL segment and leaving, at the time of the report’s publication, 23 unrelated sites on the CMU server blocked.

The authors also found that with all default categories enabled, Cyber Patrol barred multiple sites concerning cyberliberties – the Electronic Frontier Foundation’s censorship archive, for example, and the home page of MIT’s League for Programming Freedom. Also blocked was the Queer Resources Directory, which counts among its resources information from the Centers for Disease Control and Prevention, the AIDS Book Review Journal, and AIDS Treatment News. Cyber Patrol also blocked a number of newsgroups dealing with homosexuality and gender issues, such as,, alt.feminism, and soc.feminism, as well as

Karen Schneider, A Practical Guide to Internet Filters, 1997.
The Internet Filter Assessment Project tested Cyber Patrol configured to block only “Full Nudity” and “Sexual Acts.” Schneider reported that the software “blocked ‘good sites’ 5–10% of the time, depending on the tester, and pornographic sites slipped through about 10% of the time.” One of the “good sites” was, described by Schneider as a site “devoted to debunking propaganda.”

“Cyber Patrol: The Friendly Censor,” Censorware Project, Nov. 22, 1997.
Censorware Project member Jonathan Wallace tested his personal collection of approximately 270 Web sites on ethics, politics, and law – all “containing controversial speech but no obscenity or illegal material” – against the CyberNOT search engine after learning that the Web pages of Sex, Laws, and Cyberspace, the 1996 book he co-authored with Mark Mangan, were blocked by Cyber Patrol. Wallace found 12 of his chosen sites were barred, including Deja News (, a searchable archive of USENET materials, and the Web page of the Society for the Promotion of Unconditional Relationships ( town/estate/xgv92/spur2.htm), an organization advocating monogamy whose site includes such articles as “The Role of Faith in Relationships.”


Wallace reported that Cyber Patrol also blocked several sites featuring politically loaded content, such as the Flag Burning Page (formerly; now, which examines the issue of flag burning from a constitutional perspective; Interactivism (, a site inviting users to engage in political activism by corresponding with politicians on issues such as campaign-finance reform and Tibetan independence; Newtwatch (no longer active; formerly, a Democratic Party-funded page that consisted of reports and satires on the former Speaker of the House; Dr. Bonzo, another now-inactive page ( .htm), which featured “satirical essays on religious matters”; and the Web site of the Second Amendment Foundation ( As Wallace noted, Cyber Patrol did not block other gun-related sites, such as that of the National Rifle Association.


“Gay sites netted in Cyber Patrol sting,” press release, GLAAD, Dec. 19, 1997.
Cyber Patrol’s evident bias against homosexuals was reported by the Gay and Lesbian Alliance Against Defamation (GLAAD) in a Dec. 1997 press release stating that Cyber Patrol was blocking the entire “WestHollywood” subdirectory of Geocities. WestHollywood, at that time, was home to more than 20,000 gay- and lesbian-interest sites, such as that of the National Black Lesbian and Gay Leadership Forum’s Young Adult Program. When contacted, Cyber Patrol’s then-manufacturer Microsystems Software cited, by way of explanation, the high potential for WestHollywood sites to contain nudity or pornographic imagery. GLAAD’s press release pointed out, however, that Geocities expressly prohibited “nudity and pornographic material of a