Conclusions

to

Access Denied, Version 2.0

(September, 1999)

by

John E. Bowes

School of Communications

Box 353740

University of Washington

Seattle, WA 98195-3740

jbowes@u.washington.edu

 

Draft Copy 

 


In the two years since publication of the original Access Denied in 1997, many changes have taken place in the fast-moving world of filtering software and the Internet environment surrounding it. In a sense this follow-up report serves as a marker, showing rapid growth in filtering, the emergence of threats predicted earlier and some entirely new developments. In this section we consolidate and amplify implications from the themes stressed by our contributors to this report. These themes both set an agenda for GLAAD’s future work on filtering and serve as a continuing critique for manufacturers and consumers of this intrusive software.

Recent Growth and Consolidation in Filtering: In its 1998 report, Censorship in a Box,[i] the ACLU suggests that use of filtering software has experienced “explosive growth” since the defeat of the Communications Decency Act of 1996 in Reno v. ACLU. The report notes that in 1997 some $14 million in filtering software was sold, with projections that “blocking” software products would grow to $75 million within three years – by the year 2000. The American Library Association reports an Internet access rate of 60% in US libraries by 1999, up from 28% in 1996. The universal access mandates of the Telecommunications Act of 1996 help assure that this figure will climb to near totality over the next few years. The constitutional problems apparent in “supply side” control of Internet content, as seen with the Child Online Protection Act (COPA), leave filtering as the main “protective” product available to schools, libraries and parents.

Paralleling the growth of filtering products is their growing availability on the servers of Internet access providers. This “upstream” filtering requires no user implementation; offending websites never appear past the filter placed on the network server. One filter program, Cyber Patrol, has served America Online, CompuServe, Prodigy, AT&T, Bell-AtlanticNet, and Scholastic Net, among others – over 24 million subscribers. A half decade ago, the variety of Internet providers might have offered more choices, from no censoring to heavy censoring. They were local and relatively small, reflecting regional norms. But since the mid 1990s, there has been a rapid consolidation of ISPs. Local “mom & pop” providers are increasingly bought up or competitively flanked by national-scale organizations.[ii] Some 160 Internet service providers were consolidated in 1998, with another 70 meeting a similar fate in the first three months of 1999.[iii] America Online has attained the scale of a large international telecommunications organization with over 19 million subscribers. Portal page providers like Yahoo have become billion-dollar organizations with customers across a half dozen nations.
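To make the mechanics of “upstream” filtering concrete, the sketch below is a purely illustrative stand-in – the host names are hypothetical and no actual provider’s software is described – but it shows the basic shape of provider-side blocking: the check runs on the provider’s server, so the subscriber has nothing to install, configure or turn off.

# Illustrative sketch of "upstream" filtering: the check happens at the
# provider's server, so the subscriber never sees the blocked page and has
# no switch to flip. Host names below are hypothetical.

BLOCKED_HOSTS = {"www.example-blocked.org"}   # list maintained by the provider

def proxy_fetch(host, path):
    """Stand-in for a provider-side proxy: refuse blocked hosts before fetching."""
    if host in BLOCKED_HOSTS:
        # The subscriber receives only a refusal; the page "never appears."
        return "403 Forbidden: blocked by your service provider"
    return f"(contents of http://{host}{path})"  # real code would fetch the page

print(proxy_fetch("www.example-blocked.org", "/"))
print(proxy_fetch("www.example-community.org", "/resources.html"))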

As these newly large corporations face anxious politicians and a tangle of differing local and national criteria for acceptable content, they veer towards the safest and most conservative standards. They have a lot to lose from protracted legal problems with government and policy problems with frightened, angry parents. In one extreme case, following a complaint of Internet pornography transmission laid by the Bavarian state prosecutor in 1995, CompuServe was obliged to disconnect customers across Europe from over 200 Usenet newsgroups. Felix Somm, a CompuServe manager, was sentenced to a heavy fine and a two-year suspended prison term because he had “abused” the Internet and “allowed” child pornography and Nazi literature – both of which are illegal in Germany – to be available to German CompuServe users.[iv] It made little sense to the court that Somm had no real ability to monitor or control the flood of international items coursing through the Internet.

It is seductively easy for AOL, CompuServe and others to hand over “vigilance” by buying a popular commercial filter and giving their subscribers the choice to turn it on. The problem is that software selection may be geared to minimal price, low supervisory overhead and a lowest-common-denominator blandness. In short, as size, consolidation and corporate standardization of software increase, filtering has grown in impact. In contrast to the landscape of thousands of small Internet providers several years ago, each with a particular local outlook and varied policies, consolidation narrows the choices.

Far from an abstract problem, these trends spell a progressive denial of access by GLBT citizens to information about their community and themselves at the hands of clumsy, over-cautious filter software. Gay high school students, gay and lesbian Hispanics, parents of lesbians and gays, transsexual and transgendered people all disappear from the Internet forum of ideas and information.

Filtering Legislation as a First Amendment Work-around: In the two years since GLAAD’s last report, there is ample evidence that governments increasingly see filtering as their quickest fix for calming those worried about Internet content. The defeats of outright censorship in ACLU v. Reno, and later in a second round over the Child Online Protection Act, have increased reliance on filters, most recently seen in the Internet School Filtering Act (S. 1619) proposed by Sen. John McCain. As if Federal efforts weren’t enough, the ACLU reported in 1998 that 10 Internet censorship bills had been proposed in state legislatures, 5 of them specifically requiring filter or blocking software in schools and libraries.[v] By recommending filtering software, legislators transfer the main gate-keeping task from provider and regulator to parents, schools and libraries. More importantly, in the hands of vulnerable local institutions, filters become an easy pressure point for special interests, local and state politicians and any parent at odds with the Internet.

GLAAD’s position is that censorship is not an appropriate function for libraries and schools. The recent Mainstream Loudoun v. Loudoun County Library decision held that filtering all of a library’s machines was equivalent to “removing books from the shelves.”[vi] The decision, handed down in late 1998, found a number of problems with indiscriminate use of filtering software: adults were denied access to constitutionally protected material because it was unfit for children; standards were not spelled out; and there was a lack of safeguards or process for judicial review of policy. This decision, however, has not slowed legislative efforts to mandate filtering software. As Karen Schneider describes in this report, over 15% of libraries now filter.

GLAAD reaffirms its 1996-7 recommendation against filtering software: parental oversight, school supervision, and training of youthful web-surfers are preferable to the mechanistic censorship of filters. If filtering or blocking is unavoidable, then (a) blocking by specific URL is preferable to blocking by keywords; and (b) filters should have the ability to adjust for multiple users, invoking different criteria for different ages, rather than subjecting all to the most restrictive censorship. But we feel both techniques afford scant mitigation of the overall, long-term negative consequences of blocking and filtering.
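The difference between these two fallback recommendations can be made concrete in a few lines of code. The sketch below is purely illustrative – the URLs, keywords and profile names are hypothetical and describe no actual product – but it shows how a reviewed URL list and per-age profiles behave differently from blanket keyword matching.

# Illustrative sketch only -- not the logic of any actual filtering product.
# It contrasts blocking by a reviewed URL list with blanket keyword matching,
# and shows per-user profiles that apply different criteria by age.

# Hypothetical, human-reviewed list of specific URLs judged unsuitable for minors.
BLOCKED_URLS = {"www.example-adult-site.com/index.html"}

# Blunt keyword list of the kind that sweeps up health and community sites.
BLOCKED_KEYWORDS = ["sex", "breast", "gay"]

# Different criteria for different ages, rather than one most-restrictive setting.
PROFILES = {
    "adult": {"use_url_list": False, "use_keywords": False},
    "teen":  {"use_url_list": True,  "use_keywords": False},
    "child": {"use_url_list": True,  "use_keywords": True},
}

def is_blocked(url, page_text, profile):
    """Return True if this profile's rules would block the page."""
    rules = PROFILES[profile]
    if rules["use_url_list"] and url in BLOCKED_URLS:
        return True
    if rules["use_keywords"]:
        text = page_text.lower()
        return any(word in text for word in BLOCKED_KEYWORDS)
    return False

# A page on breast-cancer screening survives URL-list filtering for a teen,
# but keyword filtering for a child blocks it outright.
page = "Early breast cancer screening saves lives."
print(is_blocked("www.example-health.org/screening", page, "teen"))   # False
print(is_blocked("www.example-health.org/screening", page, "child"))  # True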

GLAAD’s position aligns closely with those of many concerned groups that have arrived at similar conclusions. Prominent among these is the Global Internet Liberty Campaign, an international union of human rights organizations favoring unfettered access to electronic information. Their member statement to the Internet Content Summit at Munich this September (1999) is worth repeating here:

“The creation of an international rating and filtering system for Internet content has been proposed as an alternative to national legislation regulating online speech. Contrary to their original intent, such systems may actually facilitate governmental restrictions on Internet expression. Additionally, rating and filtering schemes may prevent individuals from discussing controversial or unpopular topics, impose burdensome compliance costs on speakers, distort the fundamental cultural diversity of the Internet, enable invisible "upstream" filtering, and eventually create a homogenized Internet dominated by large commercial interests. In order to avoid the undesirable effects of legal and technical solutions that seek to block the free flow of information, alternative educational approaches should be emphasized as less restrictive means of ensuring beneficial uses of the Internet.”[vii]

 

It is important to note that fears of filtering extend well beyond exclusion of minorities. With wide, uncritical adoption of filtering, the Internet becomes bland, branded and standardized. All of us lose a vibrant, difficult and inclusive medium for discussion and education.

Faulty Filters: A recurrent theme in this report is that filters have unintended consequences. While it is heartening that GLAAD and other concerned groups have prevailed on software manufacturers to fix glaring problems, a variety of forces can make the best-intentioned efforts go wrong. In this report, Karen Schneider described in grim detail just how bad this software can be for GLBT-oriented information. As she aptly concludes, “. . . It isn’t realistic to expect that a $40 piece of software could effectively manage something as huge as the Internet – and it’s equally unrealistic to expect a piece of software to instantly customize itself to the values of each person who uses it.” The evidence for her conclusions abounds.

In Passing Porn, Banning the Bible, the Censorware Project tested one popular blocking product, N2H2’s Bess. As the title suggests, this popular filter software had many unintended consequences, despite the firm’s assurances to Congress that human editors reviewed all banned sites. Web sites advocating celibacy, offering information on cats, ghosts or Serbia, or criticizing Internet censorship were all caught in Bess’s filters. All free web page hosts were banned as a matter of policy; these servers, where individuals may post their own web sites without cost, are among the largest sources of Internet content. On the other hand, many obviously pornographic sites slipped by Bess, such as “stripshowlive.com,” “legal-sex.com” and “digitaldesires.com.”[viii] It should satisfy no one that safe and risqué topics seem mishandled by filtering in apparently equal measure.

These errors clearly fault the idea that presently available software can adroitly filter the massive 18 terabytes of data that make up the Internet. A 1996 study estimated the Internet at 1.5 terabytes (then) and calculated that more than a third of web pages changed per month.[ix] One current estimate is that new pages are now added to the web at a rate exceeding 25 pages per second. Filters are swiftly blind-sided by changes in keyword usage, semantics and the onrush of new web pages. They demand continual maintenance by editors who should quickly see the categorical difference between, say, Asian wildlife and “hotasianfoxes.com.”

Will institutions pay the costs of continual maintenance? And will parents who have installed blocking software on home machines do likewise? Can any software organization maintain massive review and vigilance for the sake of a $40 product? Data can’t be found on these questions, but we suspect that for many – individuals and organizations – the task may fall into the class of “to do” computer chores that never quite get done. It’s just too time-consuming and troublesome. Editors must also anticipate inventive countermeasures by those most likely to be screened out, such as deliberately loading sites with tame keywords to frustrate filters. A site like wwwmen.com, despite the innocuous name, is about gay porn, not about men’s health or fashion.
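Both failure modes – over-blocking innocent pages and being gamed by keyword padding – follow directly from the arithmetic of naive keyword matching. The sketch below is a deliberately crude, hypothetical stand-in for a filter’s scoring heuristic; the word lists and page texts are invented, and no actual product is implied to work this way.

# Illustrative sketch of why naive keyword matching fails in both directions.
# The word lists and page texts below are invented for this example.

BANNED_WORDS = {"hot", "xxx", "foxes"}                   # hypothetical banned terms
TAME_WORDS = {"health", "fashion", "recipes", "family"}  # innocuous padding terms

def keyword_score(text):
    """Crude ratio of banned to recognized words, standing in for a filter's heuristic."""
    words = text.lower().split()
    banned = sum(w in BANNED_WORDS for w in words)
    tame = sum(w in TAME_WORDS for w in words)
    return banned / max(banned + tame, 1)

# False positive: a wildlife page tripped up by its incidental vocabulary.
wildlife_page = "Asian foxes survive hot summers across the steppe"

# Evasion: an adult site padded with tame keywords to dilute its score.
padded_page = "xxx hot " + "health fashion recipes family " * 10

print(keyword_score(wildlife_page))  # 1.0 -- an innocent page looks maximally bad
print(keyword_score(padded_page))    # about 0.05 -- padding slips under a threshold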

Most filters and blocking software are proprietary or secret in terms of their precise functioning, and so too are their keyword lists of censored terms. One must test filters with benchmark programs to learn, for example, that Searchopolis, the engine underlying Bess, returns the same website list on “testicular cancer” as it does on “cancer.”[x] Only through experience do we discover that “testicular” is banned, having no influence on the search. While many filter software firms are open to citizen criticism and have advisory boards to adjudicate tough categorical issues (such as OK-for-kids GLBT websites), few reveal the details of their products’ operations.
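Because the keyword lists are secret, such benchmarking can only work from the outside, comparing what a filtered engine returns for related queries. The sketch below shows the shape of that comparison; the search function and its canned results are placeholders, not the behavior of Searchopolis or any other real engine.

# Black-box benchmark sketch: infer a silently banned term by comparing result
# sets. search() is a placeholder; a real benchmark would query the filtered
# search engine being tested and parse its result pages.

def search(query):
    canned_results = {
        "cancer": ["site-a", "site-b", "site-c"],
        "testicular cancer": ["site-a", "site-b", "site-c"],  # identical list
        "breast cancer": ["site-d", "site-e"],
    }
    return canned_results.get(query, [])

def term_appears_ignored(base_query, narrowed_query):
    """If narrowing the query changes nothing, the added term was likely dropped."""
    return search(base_query) == search(narrowed_query)

print(term_appears_ignored("cancer", "testicular cancer"))  # True -- suspicious
print(term_appears_ignored("cancer", "breast cancer"))      # False -- term had effect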

A growing number of studies have tested filters, but few with anything approaching a comprehensive benchmark representative of the vast range of material making up the Internet. It is estimated that the best commercial search engines rarely cover more than about 20% of the web’s contents. The analytic task of not only surveying web content representatively but also gauging its use accurately is the focus of intense commercial competition and debate. In many respects, the magnitude of this task is like that of a national census – expensive to do, open to continual political controversy and likely to be biased, however keenly safeguards are employed.

Policies, Monitoring and Privacy: Tim McVeigh’s account in this report of his ruined naval career through the carelessness of an America Online employee is one of a growing list of problems with the Internet industry at large, and specifically with issues of user privacy and clear commercial policy. As John Aravosis details in his chilling account of threats to personal privacy on the Internet, developments such as data mining, “cookies,” reverse DNS look-up and smart logging of users put all of us on guard for unseen surveillance and a loss of anonymity or privacy.
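Of the techniques Aravosis names, reverse DNS look-up is perhaps the easiest to demonstrate: a single library call turns an IP address from a server log back into a host name that can reveal a visitor’s employer, school or service provider. The sketch below is a minimal illustration; the address used is a documentation placeholder, not a real visitor.

# Minimal sketch of a reverse DNS look-up. Given an IP address recorded in a
# web server's log, the registered host name may identify the visitor's
# organization or provider. The address below is a documentation placeholder.

import socket

def reverse_lookup(ip_address):
    """Return the host name registered for an IP address, if any."""
    try:
        hostname, _aliases, _addresses = socket.gethostbyaddr(ip_address)
        return hostname
    except OSError:
        return "no reverse DNS record"

# A site that logs visitor IP addresses could resolve each one this way.
print(reverse_lookup("192.0.2.1"))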

In data gathered for the Georgetown Internet Privacy Policy Survey (GIPPS) of the 7,500 busiest servers on the World Wide Web, ninety-three percent of the sites surveyed collect personal information from consumers; some 66% post at least one disclosure about their information practices.[xi] The Online Privacy Alliance, a coalition of industry groups, found that 99% of sites in their study collect personal information.[xii] A 1999 report by the Forrester Group claims that 90% of commercial websites are deficient in protecting the privacy of their users.[xiii] Beyond commercial interests, government has persisted since the Reagan years in expanding Federal surveillance powers through electronic means, whether to counter “threats to the infrastructure” or to keep pace with the electronic security counter-measures of organized crime.[xiv] While Americans have often been careless with their privacy in exchange for a world of easy credit and marketing promotions, the Internet offers new threats and little by way of protective policy.

What then are the hallmarks of good privacy on the Internet? Aside from the important self-protection steps outlined in John Aravosis’ sidebar, “Top Tips for Safe Surfing,” what should we be asking of websites and Internet service providers? While it is difficult to rate any one item as more important than others, several key steps are clear:[xv]

 

Monitoring software is sold as an adjunct to filters, either as a separate package (Cybersnoop) or as a capability included with a filter (CyberSitter 99). It is almost always clandestine in operation, since its purpose is a kind of parental entrapment for wayward youth or employees. It inherently undermines trust and presumes wrongful behavior.

The point of all this is that on-line privacy in operation is a mix of web-based features and proper notification. Yet, as the McVeigh case so painfully shows, even with mechanisms in place and consumer notification, one poorly trained, inattentive employee can breach the best of security plans. Operations must be based on clear, published policy and carried out by carefully trained personnel.

Research and Measurement Needs: For GLAAD and other organizations threatened by filtering, there is recurrent need for tracking data. Have filters become more subtle, worse or more pervasive? Are they targeting new key terms and web sites? Have once-banned terms become acceptable? This information must be solid and credible to industry, academe and government policy-makers. Certainly it must earn the attention of those who build and promote filtering products. Some issues:

Will assessment aid come from without? Software manufacturers serve themselves by sidestepping evidence of their products’ clumsiness. Government seems ready to embrace this software – faulty or not – as an alternative to First Amendment tangles and constituent pressure. And increasingly large Internet service providers seek easy, low-maintenance solutions to their oversight responsibilities. Few of the proposals mandating filtering or blocking consider the quality of the software to be used in much detail, if at all. Useful criticism will probably come from consumer groups, academics and the organizations that stand to be harmed by the status quo.

Until widely accepted quality checks on filtering are available, GLAAD’s recommendations remain largely unchanged from our 1997 report and embrace those points listed above by John Spear in his call for “Back to Basics.” In brief, instilling honor, integrity, morals and ethics in children will be a better safeguard from Internet excesses than any software product. 

A Growing Threat to Minority Voices? Multicultural content on the web has grown as improved access and training broaden the range of people able to create websites. No longer are GLBT matters confined to a few chat groups or mainstream organizations; gay fathers, gay and lesbian high school students, parents of gays, and elderly gays have newly found voices on the Internet. Groups that once had only a local reach now have a world audience – a process known as “glocalization,” in which web media are financially viable for small, local groups as well as global audiences. In this way, as shown in our report, a Texas site for gay Latino/Latina youth and another for Pacific Islanders have both a local and an international reach, at a cost no other medium can approach. These subtleties are lost on filter software that blocks a world of good information and social support at the appearance of a few disfavored words. In this terminological triage, diversity and the volume of useful information are reduced with cleaver-like haste.

It is useful to note that the Internet affords a content diversity that has perhaps not been seen since the birth of radio in the 1920s. As media historians know, much of that variety – from churches, schools and community groups – was lost through early government favor toward commercial interests, which went on to homogenize content into a handful of profitable formulas for mainstream audiences.[xviii] As web commerce becomes dominant – again assisted by Federal government policies – a similar risk arises.

A Final Word: Regulators – from legislatures to the courts – find it difficult to categorize the novel character of the web, complicating policies for socially constructive management of the Internet. Is the web broadcast, telephone or publication? It is, confoundingly, all of these, yet not any one of them. The strength of the Internet lies in its mutability as multiple media, shifting quickly from one form to another as a one- or two-way device available to almost anyone with modest equipment and a telephone line. With about 40 million Internet-connected US households, it offers a mass-reach platform for each of thousands of groups denied access to costly commercial media like television. Just as importantly, it is a way for minorities to reach out across great distances and gather their community on a world scale. Filtering, blocking and monitoring stall these efforts, handing control of cyberspace to commercial forces or special interests that owe little to anyone save their investors and content ideologues.

Notes:

[i] ACLU, Censorship In a Box: Why Blocking Software is Wrong for Public Libraries. Internet document. http://www.aclu.org/issues/cyber/box.html#appendix1. 1998.

[ii] Amanda Long. “More ISP Consolidation Expected” Financial Post. 11 Nov. 1997.

[iii] Yardena Arar. “How to avoid getting chewed up when your ISP is swallowed.” PCWorld Online. Internet document, http://japan.cnn.com/TECH/computing/9905/10/isp.idg/, 11 May 1999.

[iv] Reuters. “Prosecutors Appeal Somm’s Case.” 3 June 1998.

[v] ACLU. “Online Censorship in the States.” Internet document, http://www.aclu.org/issues/cyber/censor/stbills.html.

[vi] ACLU. Testimony Before the US Senate Committee on Commerce, Science and Transportation, Hearing on Internet Indecency, February 10, 1998. Internet document.

[vii] Global Internet Liberty Campaign. Member Statement to the Internet Content Summit, Munich, Germany, September 9-11th, 1999. Internet document, http://www.gilc.org/speech/ratings/gilc-munich.html

[viii] The Censorware Project. Passing Porn, Banning the Bible: N2H2’s Bess in Public Schools. 1999. Internet document.

[ix] Brewster Kahle. “Preserving the Internet” Scientific American, March 1997, p. 82. Available as an Internet document, http://www.archive.org/.

[x] Ibid.

[xi] Culnan, M. (1999). Georgetown Internet Privacy Policy Study. Available: http://www.msb.edu/faculty/culnanm/gippshome.html

[xii] Landesberg, M., et al. (1999). Self-Regulation and Privacy Online: A Report to Congress. Washington, D.C.: Federal Trade Commission.

[xiii] Paul Hagan with Stan Dolberg. “Privacy Wake-up Call” Forrester Brief. September, 1999.

[xiv] Electronic Privacy Information Center (EPIC). Critical Infrastructure Protection and Endangerment of Civil Liberties: An Assessment on the President’s Commission on Critical Infrastructure Protection. Washington, DC, 1998. Internet document  http://www.epic.org/security/infowar/epic-cip.html

[xv] Jeanette Burkette and John Bowes. The Transaction Snarl: Can eCommerce be Tamed Across Regulatory Frontiers? Paper presented to the annual meeting of the International Association for Mass Communication Research, Leipzig, July 24-27th, 1999.

[xvi] See The Internet Archive Project, http://www.archive.org/

[xvii] GetNetWise.org. “Authors & Contributors” Safety Guide. Internet document, http://www.getnetwise.org/safetyguide/authors.shtml

[xviii]Robert W. McChesney. Telecommunications, Mass Media, and Democracy : The Battle for the Control of U.S. Broadcasting, 1928-1935. New York: Oxford, 1993.