MANDATORY INTERNET FILTERING IN PUBLIC LIBRARIES: THE DISCONNECT BETWEEN LAW AND TECHNOLOGY

By

BARBARA H. SMITH

A DISSERTATION PRESENTED TO THE GRADUATE SCHOOL OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF DOCTOR OF PHILOSOPHY

UNIVERSITY OF FLORIDA

2009


© 2009 Barbara H. Smith


I dedicate this dissertation to my Mom, Dorothy Marie Butterworth Smith, who contributed the most to my success as a human being with her unconditional love, ongoing support and daily inspiration.


ACKNOWLEDGMENTS

I would like to thank Professor Bill F. Chamberlin, Eminent Scholar Emeritus in Mass Communications and chairman of my dissertation committee, for his wisdom, steadfast support, encouragement and patience as I prepared this dissertation. I also would like to thank my other committee members for their helpful insights: Professor Laurence Alexander, Professor Charles Collier and Professor John W. Wright. In addition, I would like to thank colleagues and friends for their ongoing support during the dissertation process, especially Stuart Blacklaw, Fred Brock, Tom Caswell, Irina Dmitrieva, Rick Donnelly, Tony Fargo, Jennifer Freer, Martin Halstuk, Jody Hedge, Michael Hoefges, Kimberly Lauffer, Christina Locke, Charles Lubbers, Michelle O’Malley and Steve Smethers. Finally, I owe special thanks to Dorothy Marie Butterworth Smith, my mother, my friend, my role model and my beacon in the night, for her continued support throughout the dissertation process, and for her inspiration and unconditional love throughout my life.


TABLE OF CONTENTS

page

ACKNOWLEDGMENTS ...... 4

ABSTRACT ...... 8

CHAPTER

1 INTERNET ACCESS IN PUBLIC LIBRARIES ...... 10

Introduction ...... 10
Relevance of Study ...... 19
Research Questions ...... 19
Contributions to the Field ...... 20
Theoretical Framework ...... 20
Review of the Literature ...... 21
The Children’s Internet Protection Act of 2000 ...... 22
Relevant Articles Before the CIPA Was Adopted ...... 34
Methodology ...... 38
Outline of Dissertation ...... 41

2 THE PUBLIC LIBRARY AND THE FIRST AMENDMENT ...... 44

Introduction ...... 44
The Mission and Role of the Public Library ...... 46
The Applicability of Public Forum and the Right to Receive Doctrines to Public Libraries ...... 56
Case Analysis—The Application of Public Forum and Right to Receive Information Doctrines to Public Libraries ...... 59
Case Analysis—Historical Overview of the Right to Receive Ideas and Information ...... 77
Commentator Analysis on Public Forum Doctrine ...... 93
The Public Library as a Traditional Public Forum ...... 93
The Public Library as a Designated/Limited Public Forum ...... 94
The Public Library as a Nonpublic Forum ...... 96
The Public Library as Mixture of Fora ...... 96
The Inapplicability of Public Forum Doctrine to Public Libraries ...... 98
Commentator Analysis on the Right to Receive Ideas and Information Doctrine ...... 101
First Amendment Theory ...... 104
Minors’ Access to Public Library Material ...... 110
Conclusion ...... 118

3 INTERNET FILTERING TECHNOLOGY ...... 120

Introduction ...... 120
History of Internet Technology ...... 121
Internet Content and Usage ...... 124
How Filtering Technology Works ...... 129
Internet Filtering in Public Libraries ...... 135
Filtering Software Studies ...... 138
Conclusion ...... 152

4 THE PROTECTION OF MINORS FROM MATERIAL DEEMED HARMFUL ...... 155

Introduction ...... 155
The Changing View of the Child ...... 156
The Parental Role in Childrearing ...... 165
Sexually Explicit Material and the Law ...... 170
Children, the Variable Obscenity Standard and Indecency ...... 174
Social Science Research on the Effects of Pornography ...... 184
Pornography and the Third Person Effect Theory ...... 195
Pornography and Children ...... 196
Conclusion ...... 199

5 FEDERAL ATTEMPTS AT PROTECTING MINORS FROM ONLINE MATERIAL DEEMED HARMFUL ...... 201

Introduction ...... 201
Communications Decency Act of 1996 ...... 202
Child Online Protection Act of 1998 ...... 209
Round 1: U.S. District Court for the Eastern District of Pennsylvania—1998 ...... 214
Round 2: U.S. District Court for the Eastern District of Pennsylvania—1999 ...... 215
Round 3: Third Circuit of the U.S. Court of Appeals—2000 ...... 216
Round 4: U.S. Supreme Court—2002 ...... 218
Round 5: Third Circuit of the U.S. Court of Appeals—2003 ...... 221
Round 6: U.S. Supreme Court—2004 ...... 223
Round 7: U.S. District Court for the Eastern District of Pennsylvania—2007 ...... 228
Conclusion ...... 230

6 THE LEGISLATIVE HISTORY OF THE CHILDREN’S INTERNET PROTECTION ACT ...... 232

Introduction ...... 232
The Emergence of the Children’s Internet Protection Act ...... 238
Proposed Legislation in 1998 ...... 241
Proposed Legislation in 1999 ...... 255
Proposed Legislation in 2000 ...... 273
The Enactment of Mandatory Filtering Legislation in 2000 ...... 276
The McCain Amendment ...... 277
The Santorum Amendment ...... 280
The Final Legislation ...... 281
Conclusion ...... 285


7 COURT DECISIONS ON THE CHILDREN’S INTERNET PROTECTION ACT ...... 287

Introduction ...... 287
Federal District Court Holds the CIPA Unconstitutional ...... 287
Role of Public Libraries ...... 289
The Use of Filtering Technology ...... 289
The First Amendment and Public Forum Doctrine ...... 292
Congress’ Spending Clause and Unconstitutional Conditions Doctrine ...... 301
Supreme Court Upholds the CIPA ...... 306
The Use of Filtering Technology ...... 308
The Congressional Spending Clause ...... 309
The First Amendment and Public Forum Doctrine ...... 312
Concurrences ...... 314
Dissents ...... 317
Conclusion ...... 327

8 BRIDGING THE GAP BETWEEN LAW AND TECHNOLOGY ...... 329

Introduction ...... 329
The First Amendment Right to Receive Information and the Role of the Public Library ...... 331
Internet Filtering and Public Libraries ...... 341
The Supreme Court, the Protection of Minors, and Competing Interests ...... 348
Suggestions for Future Research ...... 359
Conclusion ...... 361

LIST OF REFERENCES ...... 366

Government Statutes & Reports ...... 366
Regulatory Agency Materials ...... 369
Letters ...... 370
Principal Cases ...... 370
Secondary References ...... 374

BIOGRAPHICAL SKETCH ...... 395


Abstract of Dissertation Presented to the Graduate School of the University of Florida in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy

MANDATORY INTERNET FILTERING IN PUBLIC LIBRARIES: THE DISCONNECT BETWEEN LAW AND TECHNOLOGY

By

Barbara H. Smith

December 2009

Chair: Bill F. Chamberlin
Major: Journalism and Mass Communication

The Children’s Internet Protection Act of 2000 requires public libraries receiving certain types of federal funding to install filtering software or another “technology protection measure” on all computers connected to the Internet. The Children’s Internet Protection Act (CIPA) prohibits access to three types of content: 1) “visual depictions” that are obscene; 2) “visual depictions” that involve child pornography; and 3) to any person under the age of 17, “visual depictions” that are considered “harmful to minors.” In 2002, a federal district court held that the Children’s Internet Protection Act was unconstitutional, a ruling that the Supreme Court reversed in 2003 in United States v. American Library Association.

The Children’s Internet Protection Act has significant implications for the role that public libraries play in providing patrons with access to information. The purposes of this dissertation are to 1) further public understanding of the role of the public library; 2) analyze the legal and practical aspects of implementing mandatory Internet filtering in public libraries; and 3) determine whether the CIPA is capable of doing what Congress is asking of it.

In this dissertation, the author will use First Amendment theory and the mission and role of the American public library to examine the Children’s Internet Protection Act and the two court decisions reviewing the CIPA. The author of this study will trace the legislative history of the Children’s Internet Protection Act of 2000 and its predecessors to try to determine whether the CIPA can do what Congress expected of it. The author will analyze studies on the strengths and limitations of Internet filtering and the implications of mandatory filtering on the librarian’s role and on patrons’ access to information in public libraries.


CHAPTER 1
INTERNET ACCESS IN PUBLIC LIBRARIES

Introduction

Nearly 99% of public libraries in the United States offered Internet access to the public in

2008.1 That percentage contrasts with 25% in 1997,2 73% in 1998,3 and 95% in 2000, 4 the year that Congress enacted the Children’s Internet Protection Act.5 From 2006 to 2008, in-person visits to public libraries increased by 10%, with 8% of patrons visiting the library to gain Internet access.6

To receive federal funding7 for Internet access, public libraries and most schools must, under the Children’s Internet Protection Act of

1 From 2006 to 2008, libraries providing Internet access remained constant at just under 99%. See AM. LIBRARY ASS’N, THE STATE OF AMERICAN LIBRARIES 5 (April 2007) and JOHN CARLO BERTOT, CHARLES R. MCCLURE, ET AL., FLORIDA STATE UNIVERSITY INFORMATION INSTITUTE, PUBLIC LIBRARIES AND THE INTERNET 2008: STUDY RESULTS AND FINDINGS (2008). In fiscal year 2006, the latest year for which data are available, there were 9,208 public library systems or administrative units, and 16,592 central library outlets and branch library outlets in the fifty states and the District of Columbia. See INSTITUTE OF MUSEUM AND LIBRARY SERVICES, PUBLIC LIBRARIES SURVEY: FISCAL YEAR 2006 4 (2008).

2 THE STATE OF AMERICAN LIBRARIES (2007), supra note 1, at 5.

3 Press Release, Am. Library Ass’n, New Report Shows More Libraries Connect to the Internet; Access Still Limited (Nov. 17, 1998), available at http://bubl.ac.uk/archive/journals/alawon/v07n149.htm. The survey of library connectivity was conducted by John Carlo Bertot, associate professor in the School of Information Science and Policy at the University of Albany, State University of New York, and Charles R. McClure, distinguished professor formerly in the School of Information Studies at Syracuse (N.Y.) University and now a distinguished professor in the School of Information Studies at Florida State University.

4 Of the 9,129 public library systems, 7,352 were single-outlet libraries, 1,776 were multiple-outlet libraries, and one had no public-service outlets (providing books-by-mail service only). The number of library systems or administrative units encompassed a total of 16,241 buildings, including central and branch locations. ADRIENNE CHUTE ET AL., U.S. DEP’T OF EDUC., NATIONAL CENTER FOR EDUCATION STATISTICS, PUBLIC LIBRARIES IN THE UNITED STATES: FISCAL YEAR 2001, NCES 2003–399 (2003), available at http://nces.ed.gov/pubs2003/2003399.pdf.

5 See infra Chapter 6, pp. 281 to 286, and Chapter 7 for a complete discussion of the Children’s Internet Protection Act of 2000.

6 See AM. LIBRARY ASS’N, THE STATE OF AMERICAN LIBRARIES 4 (April 2009). The report did not include raw numbers.

7 For an explanation of the two programs—the Library Services Technology Act and the universal service program, also known as the E-rate—see infra notes 36 to 41 and accompanying text.


2000,8 implement an Internet safety policy and install “a technology protection measure,” such as blocking or filtering software, on all computers connected to the Internet to block or filter sexually explicit images.9 The safety policy requires libraries and schools to monitor minors’ online activities and to monitor the operation of “a technology protection measure.”10 As of 2009, the only “technology protection measure” available was blocking and filtering software.11

Blocking and filtering software operate differently, even though software designers usually combine the two programs into one package. Blocking software and filtering software

8 Pub. L. No. 106-554, signed into law on Dec. 21, 2000, 114 Stat. 2763, 2763A-335 (2000) (codified as amended at 20 U.S.C. 9134(f) (2000) and 47 U.S.C. 254(h) (2000)). In passing the Children’s Internet Protection Act, Congress added two different amendments to the appropriations bill, the Children’s Internet Protection Act and the Neighborhood Children’s Internet Protection Act. Dep’t of Labor, Health & Human Servs. & Educ. Appropriations Act, 2001, H.R. 4577, 106 Cong., 2d Sess. (2000), Pub. L. No. 106-554, 114 Stat. 2763, 2763A-335 (2000) (codified as amended at 20 U.S.C. 9134(f) and 47 U.S.C. 254(h)). The Children’s Internet Protection Act was sponsored by Senator John McCain. See S.Amdt. No. 3610 to H.R.4577, 106th Cong. (2d Sess. 2000). The text of Senator McCain’s amendment to the Consolidated Appropriations Act was based on S.97, the Children’s Internet Protection Act, that McCain introduced in 1999. See S.97, 106th Cong. (1st Sess. 1999). The Neighborhood Children’s Internet Protection Act was sponsored by Senator Rick Santorum. See Neighborhood Children’s Internet Protection Act, S.Amdt. No. 3635 to H.R.4577, Consol. Appropriations Act, 2001, 106th Cong. (2d Sess. 2000). The text of Senator Rick Santorum’s amendment was based on S.1545, the Neighborhood Children’s Internet Protection Act, which he had introduced in 1999 (see S.1545, 106th Cong. (1st Sess. 1999)). After the appropriations bill passed, Congress referred to both Internet protection acts as the Children’s Internet Protection Act. For an analysis of McCain’s and Santorum’s bills and the court cases on the Children’s Internet Protection Act, see Chapter 6.

9 See Pub. L. No. 106-554, 114 Stat. 2763, 2763A-335 (2000) (codified as amended at 20 U.S.C. § 9134(f), 47 U.S.C. § 254(h)(6)), mandating that “a blocking technology measure” be installed on “any” computer connected to the Internet at libraries receiving E-rate funding and Library Services and Technology Act (LSTA) funding.

10 20 U.S.C. § 9134(f)(1)(A) and (B), 47 U.S.C. § 254(h)(5)(A) and 47 U.S.C. §254(h)(6)(A). The Internet safety policy applies only to minors and not to adults. The safety policy “includes monitoring the online activities of minors and the operation of a technology protection measure with respect to any [library or school] computers with Internet access that protects against access through such computers to visual depictions that are obscene, child pornography, or harmful to minors.” The Internet safety policy must “address” minors’ access to “inappropriate matter” on the Internet and “measures designed to restrict minors' access to materials harmful to minors.” The Internet safety policy also must “address” minors’ safety and security when using e-mail, chat rooms, and other forms of direct communications. The Internet safety policy also must address minors’ unauthorized access to the Internet, including “hacking,” and minors’ unauthorized use of personal information regarding minors. Libraries and schools must “provide reasonable public notice and hold at least one public hearing or meeting to address the Internet safety policy.” 47 U.S.C. § 254 (l) (1). For a more detailed discussion of the Children’s Internet Protection Act, see infra Chapter 6, pp. 281-286, and Chapter 7.

11 For a discussion of filtering technology, terminology, strengths and limitations, see Chapter 3.


are both installed on the user’s computer and allow the user, such as a parent, to choose which types of content to avoid. Blocking software prevents access to specific Web sites, such as playboy.com, and to other types of Internet applications, such as social networking sites, chat rooms and e-mail. In contrast, filtering software prevents access to Web sites based on key words or rating systems,12 or allows access to a pre-selected list of Web sites only.13 Although blocking and filtering technologies operate differently, the terms “filter,” “filters,” “filtering,” and “blocking” often are used interchangeably by software manufacturers, commentators and courts. Therefore, to avoid confusion and to improve sentence clarity, the author of this dissertation will use the terms “filters” and “filtering” in referring to both blocking and filtering technology when discussing the function of the software as a whole. When the two terms need to be differentiated, such as in Chapter 3 on filtering technology, the author will do so.
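To make the distinction concrete, the short sketch below contrasts the three approaches just described: blocking access to named sites, keyword-based filtering of page text, and whitelist-only access. It is a toy illustration written for this discussion, not the design of any actual filtering product; apart from the playboy.com example mentioned above, the site names, keywords, and function names are hypothetical placeholders.

    # Toy illustration of the three approaches described in the text; all lists are hypothetical.
    BLOCKED_SITES = {"playboy.com"}              # blocking: deny specific named sites
    BLOCKED_KEYWORDS = {"keyword1", "keyword2"}  # filtering: deny pages containing flagged terms
    ALLOWED_SITES = {"loc.gov", "usa.gov"}       # filtering (whitelist mode): allow only pre-selected sites

    def blocking_allows(host: str) -> bool:
        """Blocking software: deny access to sites on a fixed blocklist."""
        return host not in BLOCKED_SITES

    def keyword_filter_allows(page_text: str) -> bool:
        """Keyword-based filtering: deny any page whose text contains a flagged term."""
        text = page_text.lower()
        return not any(term in text for term in BLOCKED_KEYWORDS)

    def whitelist_allows(host: str) -> bool:
        """Whitelist filtering: allow access only to a pre-selected list of sites."""
        return host in ALLOWED_SITES

    print(blocking_allows("playboy.com"))                  # False: host is on the blocklist
    print(keyword_filter_allows("a page about keyword1"))  # False: flagged term appears in the text
    print(whitelist_allows("example.org"))                 # False: host is not on the pre-selected list

Commercial packages typically bundle several of these techniques, along with vendor-maintained category and rating lists, which is one reason the terms blur in everyday usage.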

Congress tacked the filtering legislation onto a major appropriations bill14 that President

Bill Clinton reluctantly signed into law in 2000.15 The law combined the Children’s Internet

Protection Act and the Neighborhood Children’s Internet Protection Act.16 The federal filtering

12 See Children’s Internet Protection Act, Report of the Senate Committee on Commerce, Science and Transportation on S.97, S. Rep. No. 106-141 106th Cong. (1st Sess. 1999), Aug. 5, 1999, at 6. See Chapter 6 for a discussion of hearings and reports on the Children’s Internet Protection Act.

13 See Richard J. Peltz, Use “the Filter You Were Born With”: The Unconstitutionality of Mandatory Internet Filtering for the Adult Patrons of Public Libraries, 77 WASH. L. REV. 397, 404 (2002); see also Adam Horowitz, The Constitutionality of the Children’s Internet Protection Act, 13 ST. THOMAS L. REV. 425, 429-35 (2000).

14 H.R. 4577, 106th Cong. (2d Sess. 2000).

15 Pub. L. No. 106-554, signed into law on Dec. 21, 2000. President Clinton said he was “very disappointed” that Congress passed the CIPA. He said that while his Administration “actively promoted the protection of children from harmful materials on the Internet,” a local Internet-usage plan was more appropriate and effective. See Statement on Signing the Consolidated Appropriations Act, FY 2001, 36 WEEKLY COMP. PRES. DOC. 52 3171-72 (Dec. 29, 2000).

16 CIPA, Pub. L. No. 106-554, 1(a)(4), 114 Stat. 2763, 2763 (2000). On the day Congress passed the CIPA, the Act was copied into 1701-1741 of H.R. 5666, 106th Cong. (2d Sess. 2000), which in turn was added as Appendix D, including CIPA, 1701-1741, 114 Stat. 2763A-336 to -351, to H.R. 4577, 106th Cong. (2d session 2000), the final omnibus budget bill. CIPA's provisions were enacted at 20 U.S.C. 9134 (2000) and 47 U.S.C. 254 (Supp. 2001). See also Neighborhood Children's Internet Protection Act, Pub. L. No. 106-554, app. D, 1731-1741, 114 Stat. 2763A- 350 to -351 (2000).


law was to have become effective on April 20, 2001, but the American Civil Liberties Union

(ACLU) and the American Library Association (ALA) independently filed suit to block its implementation.17 A federal district court consolidated the cases, and in 2002 held that the challenged sections of the Children’s Internet Protection Act were unconstitutional,18 a ruling that the Supreme Court reversed in 2003.19 However, the Court ruled that the CIPA could be challenged on an “as applied” basis,20 which would allow a library patron to file a lawsuit alleging that the law was improperly administered under a specific set of circumstances.21

In this dissertation, the author will use First Amendment theory and the mission and role of the American public library to examine the Children’s Internet Protection Act and the two court decisions reviewing the CIPA. The author will discuss three compelling and sometimes competing government interests: protecting minors from constitutionally-protected material deemed harmful, protecting adults’ First Amendment rights, and allowing parents to raise their children as the parents see fit. The author of this study will trace the legislative history of the

17 See AM. CIV. LIBERTIES UNION, LIBRARY INTERNET ACCESS IS STILL FREE FROM CENSORSHIP AS LAW GOES INTO EFFECT, ACLU TELLS LIBRARIES, PATRONS (April 19, 2001), available at http://www.aclu.org/Privacy/Privacy.cfm?ID=7224&c=252. See also Am. Library Ass’n, The Children’s Internet Protection Act (CIPA), http://www.ala.org/ala/aboutala/offices/wo/woissues/civilliberties/cipaweb/legalhistory/legalhistory.cfm (last visited July 20, 2009).

18 Am. Library Ass’n v. United States, 201 F. Supp. 2d 401 (E.D. Pa. 2002). The U.S. District Court for the Eastern District of Pennsylvania held that the Children’s Internet Protection Act was facially invalid. Id. at 411.

19 United States v. Am. Library Ass’n, 539 U.S. 194 (2003), rev’g Am. Library Ass’n v. United States, 201 F. Supp. 2d 401 (E.D. Pa. 2002). See Chapter 6 for the legislative history of the Children’s Internet Protection Act and see Chapter 7 for a discussion of the two court cases on the CIPA.

20 United States v. Am. Library Ass’n, 539 U.S. at 215 (Kennedy, J., concurring).

21 The CIPA contains a disabling provision that allows librarians to disable the filter for adults for “bona fide research or other lawful purposes.” See 47 U.S.C. § 254(h)(5)(D) and 47 U.S.C. § 254(h)(6)(D)). If a librarian refuses to disable the filter or is unable to disable the filter in a timely manner, or if an adult patron’s access to constitutionally-protected online content is “burdened in some other substantial way,” the patron would be able to challenge the CIPA on an “as applied” basis. See United States v. Am. Library Ass’n, 539 U.S. at 215 (Kennedy, J., concurring). For a discussion of the Children’s Internet Protection Act and its disabling provision, see infra Chapter 6, pp. 281 to 286.


Children’s Internet Protection Act of 2000 and its predecessors to try to determine whether the

CIPA can do what Congress expected of it. She will analyze studies on the strengths and limitations of Internet filtering and the implications on the librarian’s role and on patrons’ access to information in public libraries.

The CIPA prohibits access to three types of content: 1) “visual depictions” that are obscene to any patron; 2) “visual depictions” that involve child pornography to any patron; and

3) “visual depictions” that are considered “harmful to minors” to any person under the age of seventeen.22

Congress defined “harmful to minors” in the Children’s Internet Protection Act as

any picture, image, graphic image file, or other visual depiction that (i) taken as a whole and with respect to minors, appeals to a prurient interest in nudity, sex, or excretion; (ii) depicts, describes, or represents, in a patently offensive way with respect to what is suitable for minors, an actual or simulated sexual act or sexual contact, actual or simulated normal or perverted sexual acts, or a lewd exhibition of the genitals; and (iii) taken as a whole, lacks serious literary, artistic, political, or scientific value as to minors.23

The CIPA’s definition of “harmful to minors” parallels in many ways the longtime three-part obscenity test that the Supreme Court established in Miller v. California in 1973,24 which is still the current precedent.

22 Children’s Internet Protection Act, Pub. L. No. 106-554, 114 Stat. 2763, 2763A-335 (2000) (codified at 20 U.S.C. § 9134(f)(1)(A)(i) and (B)(i); 47 U.S.C. § 254(h)(6)(D)).

23 Pub. L. No. 106-554, 114 Stat. 2763, 2763A-335 (2000) (codified as 20 U.S.C. § 9134 (f)(7)(B) and 47 U.S.C. § 254 (h)(7)(G)).

24 Miller v. California, 413 U.S. 15, 24 (1973). In 1957, the Supreme Court held that obscenity was not protected by the First Amendment in Roth v. United States, 354 U.S. 476 (1957). In Miller, the Court established the current three-part obscenity test: 1) “whether ‘the average person, applying contemporary community standards,’ would find the work, taken as a whole, appeals to the prurient interest; 2) whether the work depicts or describes, in a patently offensive way, sexual conduct specifically defined by the applicable state law; and 3) whether the work, taken as a whole, lacks serious literary, artistic, political or scientific value.”


In Miller, the Court said that sexually oriented content was only obscene, and therefore not protected by the First Amendment, when all three criteria were met:

1) whether ‘the average person, applying contemporary community standards,’ would find the work, taken as a whole, appeals to the prurient interest; 2) whether the work depicts or describes, in a patently offensive way, sexual conduct specifically defined by the applicable state law; and 3) whether the work, taken as a whole, lacks serious literary, artistic, political or scientific value.25

In contrast to the Miller obscenity test, the CIPA applies only to “visual depictions.”26 The Miller obscenity test applies to all types of content and not just images. Unlike Miller, the CIPA does not apply a “contemporary community standards” provision to determine a “prurient interest,” but instead the CIPA states that the image must appeal to the prurient interest of minors,27 suggesting all minors adhere to the same standard. The CIPA also uses terms related to sex and excretion that were not used in Miller. According to the CIPA’s “prurient interest” prong, images must appeal to a prurient interest in nudity, sex, or excretion.28 The Miller test did not list nudity, sex and excretion as part of the prurient interest prong. Under the CIPA’s “patently offensive” prong, the images must be “an actual or simulated sexual act or sexual contact, actual or simulated normal or perverted sexual acts, or a lewd exhibition of the genitals.”29 In contrast, the Miller test’s “patently offensive” prong states that sexual conduct must be patently offensive, but Miller does not provide examples of sexual conduct.30

25 Miller, 413 U.S. at 24.

26 Pub. L. No. 106-554, 114 Stat. 2763, 2763A-335 (2000) (codified at 20 U.S.C. § 9134 (f)(7)(B) and 47 U.S.C. § 254 (h)(7)(G)).

27 Id.

28 Id.

29 Id.

30 Miller, 413 U.S. at 24.


The CIPA seems to reflect a variable obscenity standard, which means that what is obscene for a minor might not be obscene for an adult. The Supreme Court upheld New York state’s variable obscenity standard in 1968, in Ginsberg v. New York.31 A luncheonette owner had been convicted under the New York statute for selling a sixteen-year-old boy a “girlie” magazine on two separate occasions.32 However, the Court did not preclude parents from purchasing sexual material to share with their children.33 The magazines would not have been considered obscene by adult standards.34 For a magazine to be obscene by adult standards in

1968, the Court said, the magazine, as a whole, would need to appeal to the prurient interest and be “utterly without redeeming social value,” as determined by an average person applying contemporary national standards.35

The Children’s Internet Protection Act amended the E-rate program by requiring libraries and schools to install Internet filters in order to purchase Internet access, telecommunications technology, and computer systems and equipment at a discounted rate.36 The E-rate program was established under the Telecommunications Act of 1996,37 which amended the Communications Act of

31 Ginsberg v. New York, 390 U.S. 629 (1968). The Supreme Court held that material considered obscene for minors may not necessarily be considered obscene for adults. Id. at 636.

32 Id. at 636-38, 645.

33 Id. at 639.

34 Id. at 636-38, 645.

35 Memoirs v. Massachusetts, 383 U.S. 413, 418-20 (1966) and Roth v. United States, 354 U.S. 476, 488-89 (1957). See also FREDERICK SCHAUER, THE LAW OF OBSCENITY 30-48 (1976) (discussing the U.S. Supreme Court cases dealing with obscenity law in the twentieth century). Schauer explained that the Roth and Memoirs courts used a national standard, which the Miller Court rejected when it established the local community standard. See SCHAUER at 120-24, 139 and Miller v. California, 413 U.S. 15 (1973). Schauer also said that the Miller Court limited the kinds of works that would be protected under obscenity law by changing the “social value” prong of the test. The Miller Court deleted the phrase “utterly without redeeming social value” and substituted the phrase “whether the work, taken as a whole, lacks serious literary, artistic, political or scientific value.” SCHAUER at 140-47.

36 United States v. Am. Library Ass’n, 539 U.S. at 199. See also Telecomm. Act of 1996, 47 U.S.C. § 254; FCC, E- rate, http://www.fcc.gov/learnnet/ (last visited July 20, 2009).

37 Telecomm. Act of 1996, 47 U.S.C. §254.


1934 by providing libraries and schools with “access to advanced telecommunications and information services” at a discounted rate.38 The Children’s Internet Protection Act also amended the Library Services and Technology Act by requiring libraries to implement filtering technology and Internet use policies to receive funding.39 The Library Services and Technology Act (LSTA) provides funding only to libraries so that they can expand services for learning, access to information and educational resources in a variety of formats.40 The LSTA allows libraries to use funds to access information electronically and acquire or share computer systems and telecommunication technology, including the purchase of Internet access.41

To receive funding for computers and network access through the LSTA42 and discounts on telecommunication services through the E-rate,43 public libraries would need to install filtering software either on each computer in the library or on the library’s server that links computers in the library together.44
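One way to picture the server option mentioned above, and described in note 44, is as a single check applied at the point where every patron workstation reaches the Internet. The sketch below is a minimal, hypothetical illustration of that deployment only; the hostnames, workstation names, and blocklist are placeholders, and real products rely on vendor-maintained category databases rather than a hard-coded set.

    from urllib.parse import urlparse

    # Minimal sketch of server-based filtering: one list and one check live on the
    # library server, and every patron workstation's traffic passes through them.
    SERVER_BLOCKLIST = {"blockedexample.com"}  # hypothetical list maintained once, on the server

    def server_allows(request_url: str) -> bool:
        """Run on the library server for each outgoing request from any workstation."""
        host = urlparse(request_url).hostname or ""
        return host not in SERVER_BLOCKLIST

    # Simulated requests from several patron computers, all funneled through the server.
    for station, url in [("workstation-1", "http://example.org/catalog"),
                         ("workstation-2", "http://blockedexample.com/page")]:
        print(station, url, "allowed" if server_allows(url) else "blocked")

Installing and updating the filter once on the server, rather than separately on each public computer, gives the library a single point of control over every Internet-connected machine.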

Libraries must have in place, and enforce, filtering on all computers connected to the

Internet, even those not funded under the two programs.45 In addition to mandating Internet

38 Id.

39 Children’s Internet Protection Act, 20 U.S.C. § 9134(f) (2000).

40 Library Servs. & Tech. Act, 20 U.S.C. § 9141(a)(1)(C) (2003).

41 United States v. Am. Library Ass’n, 539 U.S. 194, 199 (2003) (citing Library Servs. & Tech. Act, 20 U.S.C. § 9141(a)(1)(C) (2003)).

42 Title III of the Elementary & Secondary Educ. Act, 20 U.S.C. § 9121, 9134 (2000).

43 47 U.S.C. § 254. See also H.R. 4577, 106th Cong. (2d Sess. 2000).

44 A server is a computer that delivers information and software to other computers linked by a network. When individual computers are configured in such a way that they must go through the server to connect to the Internet, a software filter installed on the server computer would block selected content from reaching the individual computers connected to it. Libraries typically also have “internal” computers that allow patrons to search the library system’s holdings but that are not connected to the Internet. See PETER BUCKLEY & DUNCAN CLARK, THE ROUGH GUIDE TO THE INTERNET 20, 323 (2007).

45 Children’s Internet Protection Act, 20 U.S.C. § 9134(f)(1)(B); 47 U.S.C. § 254(h)(6)(B) and (C). See also United States v. Am. Library Ass’n, 539 U.S. at 230-31. Neither the legislation nor the legislative history distinguishes staff


filtering technology, the Children’s Internet Protection Act also requires libraries and schools to

implement an Internet safety policy for minors.46

In 2007-2008, 44% of public libraries did not apply for E-rate funding. According to a national survey, the three most common reasons that libraries listed for not applying for the funding were 1) the application process was too complicated (40.4%); 2) the E-rate discount was fairly low and not worth the time needed to participate in the program (38.8%); and 3) the libraries were not willing to comply with the CIPA’s filtering requirements (31.6%). Suburban and low-poverty libraries were least likely to apply for E-rate funding, whereas high-poverty and medium-poverty libraries were most likely to apply.47

The Children’s Internet Protection Act states that librarians “may disable” the filtering

technology “to enable access for bona fide research or other lawful purpose(s),”48 although the

law does not require librarians to do so. Under the E-rate program, adults can request that filters be disabled for adult use only, whereas under the LSTA program, adults and minors can request that the filtering technology be turned off and then use the computer without filtering software.49

However, if a library receives both E-rate and LSTA funds, librarians cannot disable the filters when minors are using computers.50
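Read together, the disabling provisions described above form a small decision rule. The function below is a toy sketch of one reading of that paragraph (E-rate: disabling for adult use only; LSTA alone: for adults and minors; both programs: adults only). It omits the “bona fide research or other lawful purpose(s)” condition for brevity, the function and parameter names are hypothetical, and it illustrates the logic of the text rather than stating the law.

    # Toy decision helper: one reading of the disabling rules summarized in the text above.
    def filter_may_be_disabled(receives_erate: bool, receives_lsta: bool,
                               patron_is_minor: bool) -> bool:
        """Could a librarian disable the filter for this patron, on the reading above?"""
        if not (receives_erate or receives_lsta):
            return True   # neither subsidy received, so the CIPA's filtering conditions do not attach
        if not patron_is_minor:
            return True   # adult patrons may ask for disabling under either funding program
        # Minors: only the LSTA provision, without E-rate funds, permits disabling.
        return receives_lsta and not receives_erate

    # Example: a library receiving both E-rate and LSTA funds, minor patron -> False
    print(filter_may_be_disabled(True, True, True))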

and patron computers; therefore, anyone reading the legislation could assume filtering software must also be installed on nonpublic computers.

46 20 U.S.C. § 9134(f)(1)(A), 47 U.S.C. § 254(h)(5)(A) and 47 U.S.C. § 254(h)(6)(A). For an explanation of the Internet safety policy, see supra note 10.

47 Bertot et al., supra note 1, at 46-47.

48 Children’s Internet Protection Act, 47 U.S.C. § 254(h)(6)(D); 20 U.S.C. § 9134(f)(3). In the E-rate provision, 47 U.S.C. § 254(h)(6)(d), the singular term “purpose” is used, whereas in the LSTA provision, 20 U.S.C. § 9134(f)(3), the plural term “purposes” is used.

49 United States v. Am. Library Ass’n, 539 U.S. at 201. See also Pub. L. No. 106-554 (codified as 20 U.S.C. § 9134(f)(3)).

50 47 U.S.C. § 254(h)(6)(d).


Relevance of Study

The Children’s Internet Protection Act has significant implications for the role that public libraries play in providing patrons with access to information. This dissertation will examine the important role that the CIPA plays in shaping the online content that can be seen in public libraries. Because of the long history of obscenity and indecency law in the United States, obscenity and indecency issues have played powerful roles in how Internet content is regulated.

The purposes of this dissertation are to 1) further public understanding of the role of the public library; 2) analyze the legal and practical aspects of implementing mandatory Internet filtering in public libraries; and 3) examine the technological and regulatory scheme of the CIPA to determine whether it is capable of doing what Congress is asking of it.

To that end, the author will analyze the Children’s Internet Protection Act and previous Internet filtering bills, examine the congressional process that led to the CIPA, and examine the constitutional and technical issues surrounding Internet filters in public libraries. She will also analyze the Supreme Court’s decision upholding the CIPA. By examining the role of the public library in providing access to information and trying to ascertain whether the CIPA will accomplish what Congress asked of it, the author will fill some of the gaps in the collective body of literature.

Research Questions

This dissertation will try to illuminate the significance and implications of the Children’s Internet Protection Act by answering the following research questions: 1) What is the role of American public libraries? 2) How do the traditions and jurisprudence of the First Amendment relate to the role of American public libraries? 3) What is the tradition in the United States of protecting children from sexually oriented materials, and have the reasons behind the motivation to protect children been substantiated by social science research? 4) What could Internet filtering/blocking technologies do to protect children from sexually oriented material on the Internet before the adoption of the Children’s Internet Protection Act? 5) What is the legal history of attempts to protect children from harmful sexually oriented content on the Internet? 6) What legislative developments led to the adoption of the CIPA? 7) How has the CIPA been treated by the courts? 8) Is the regulatory and technological system established by the CIPA capable of doing what Congress is asking of it?

Contributions to the Field

While the secondary literature has addressed some of the constitutional issues surrounding the mandatory installation of filtering technology on public library computers, no one has done a thorough legislative analysis of the Children’s Internet Protection Act or of past Internet filtering bills, or an examination of current filtering technology. This dissertation will fill a void in the literature by examining the Act in the context of First Amendment theory, discussing the mission and role of the public library, examining the status of current filtering technology, and analyzing the Children’s Internet Protection Act of 2000 to try to determine whether the CIPA does what Congress expected it to do.

Theoretical Framework

Because public libraries are designed to provide patrons with access to information, First Amendment theory, public forum doctrine, and the mission and role of the public library provide a useful and appropriate foundation for this dissertation.51 Several First Amendment theories are helpful in understanding the Supreme Court’s emphasis on the First Amendment in library cases, even though these theories are seldom developed in the opinions themselves. The First Amendment theories most relevant to the issues of this dissertation—other than the legal concept

51 For a discussion of the history, mission and role of public libraries, see Chapter 2.


of public fora, which will be discussed separately—are those that deal with 1) the right to receive information and ideas, 2) personal self-fulfillment and self-development, 3) a commitment to an unfettered marketplace of ideas where truth can perhaps best be discovered, 4) the need for information in order to participate effectively in our system of government and to monitor government performance, and 5) the need to monitor or “check” the abuse of government power. In Chapter 2, the author of this dissertation will discuss the public library’s mission and role and First Amendment theory as it applies to public libraries.

Review of the Literature

More than thirty commentators have written about the Children’s Internet Protection Act

of 2000. At least a dozen commentators have written about proposed legislation prior to the

CIPA and Internet filtering in public libraries in general. Most of the authors focused on the

constitutionality of the Children’s Internet Protection Act and argued against one or more

provisions of the CIPA. However, no one has done an extensive legislative and judicial study of

the issue, nor looked at the technology and traditions of the public library and the historical

regulations on behalf of children. In addition, no one has examined whether Internet filtering can

accomplish what Congress expected of it, or the state of current filtering technology nearly a

decade after Congress enacted the CIPA. Commentators writing about the CIPA in recent years

reported on the older filtering technology studies that had been cited in the 2002 district court

case and the 2003 Supreme Court case. An analysis of studies on newer filtering software is

important because the courts and the government addressed the potential for improved

technology in the future.52

52 In United States v. American Library Association, the Supreme Court acknowledged that filtering software “in the foreseeable future” will contain “fundamental defects.” 539 U.S. at 221 (Stevens, J., dissenting). In American Library Association v. United States, 201 F. Supp. 2d 401, 431 (2002), the District Court wrote, “Image recognition technology is immature, ineffective, and unlikely to improve substantially in the near future.” A congressionally


The Children’s Internet Protection Act of 2000

In 2009, professors Paul Jaeger and Zheng Yan stated that very little research has been done on the social, economic and political ramifications of the Children’s Internet Protection Act.53 Because the CIPA applies to public libraries as well as schools, Jaeger and Yan questioned the “appropriateness” of adult library patrons and librarians being restricted to the same level of Internet access as school children.54 The two researchers also said that all minors, regardless of age, are treated alike under the CIPA, even though older and younger children do not have “the same level of maturity.”55 They recommended that future researchers study the “societal effects” of the CIPA, such as whether the CIPA has changed the education and information-provision roles of libraries and schools, and whether minor patrons’ “information behaviors” have changed.56

A year earlier, in 2008, Lisa Schneider Finsness briefly summarized the CIPA in a doctoral dissertation that examined whether Internet content filters blocked information that high school students needed to meet the Minnesota Academic Standards.57 She found that the level of

commissioned study by the National Research Council found that filters will continue to suffer from overblocking and underblocking, in part because human beings who are judging content will bring diverse (and inconsistent) perspectives to the job. See Youth, Pornography, and the Internet (Dick Thornburgh & Herbert S. Lin, eds., 2002), available at http://bob.nap.edu/html/youth_internet/. For a discussion of filtering technology, see Chapter 3. For a discussion of Congressional hearings and reports on filtering technology, see Chapter 6.

53 Paul Jaeger & Zheng Yan, One Law with Two Outcomes: Comparing the Implementation of CIPA in Public Libraries and Schools, 28 INFO. TECH. AND LIBRARIES 6, 12 (March 2009).

54 Id. at 12.

55 Id.

56 Id.

57 Lisa Schneider Finsness, The Implication of Internet Content Filters in Secondary Schools 57 (May 2008) (unpublished Ph.D. dissertation, University of Minnesota) (on file with author). The Minnesota Academic Standards for grades 9 through 12 requires students to compare and contrast viewpoints and to use critical analysis. Id. at 68. Students are required to meet state standards in nine subject areas, including language arts, history, math, health education, arts and science. Some subject areas require state testing, while other subject areas are assessed by the schools. See Minnesota Dept. of Education, Graduation Requirements (2007-08), available at http://cfl.state.mn.us/MDE/Academic_Excellence/Academic_Standards/Graduation_Requirements/index.html.


filtering can affect a student’s ability to access the information needed to meet Minnesota’s state standards, and that different filtering software manufacturers categorize the same content on the same Web site differently.58 In studying nine school districts, Finsness found that only three of the technology administrators and none of the teachers understood the CIPA.59 “It is possible that districts would choose less restrictive Internet content settings and that teachers would be more confident about reporting blocked sites if everyone had a better understanding of the law,” she wrote.60

In 2007, in a master’s thesis on social networking sites and the Deleting Online Predators Act (DOPA), Anna Forslund summarized the Children’s Internet Protection Act and the Supreme Court’s decision upholding the CIPA.61 The Deleting Online Predators Act, which was not enacted, would have required schools and libraries receiving E-rate funding to block access to “commercial social networking Web sites” and “chat rooms.”62 The

Finsness studied two content areas—history and health education. She said that she chose history and health education because each course “requires the study and analysis of multiple perspectives and opposing viewpoints.” Finsness at 68. Finsness studied nine of Minnesota’s 345 school districts, which she listed as a limitation of the study. However, she reported that rural and metropolitan districts were included in the study. Eight of the nine schools used filtering software, and seven different brands of Internet filters were represented in the study. Id. at 71, 153. In conducting her research, Finsness surveyed technology administrators and interviewed teachers and administrators. She also requested access to Web sites blocked by each school’s filtering software in an effort to determine how the filtering companies rated the content on the Web sites. Seven of the nine districts involved in the study used Internet content filters. Finsness reviewed each of the seven different products and checked twenty-one URLs to find out how each blocked site was categorized and whether the site would be blocked in other schools. Id. at 71.

58 Finsness, supra note 57, at 127.

59 Finsness, supra note 57, at 147.

60 Finsness, supra note 57, at 148.

61 Anna Forslund, Protecting America’s Youth Online: A Legal and Ethical Analysis (December 2007) (unpublished master’s thesis, Southern Illinois University at Carbondale) (on file with author).

62 Id. at 49. The DOPA was introduced in Congress in both 2006 and 2007, but was not enacted in either year. In 2006, the bill passed in the House, but died in the Senate Commerce Committee. See Deleting Online Predators Act of 2006, H.R.5319, 109th Cong. (2d Sess. 2006). See also Susan Hanley Duncan, MySpace Is Also Their Space: Ideas for Keeping Children Safe from Sexual Predators on Social--Networking Sites, 96 KY. L.J. 527, 547 (2007-


bill did not define “commercial social networking Web sites” or “chat rooms,” but rather left it

up to the Federal Communications Commission to develop specific definitions at a later date.63

Forslund stated that the text of the DOPA indicated Congress’ concern with protecting minors from online predators.64 Similar to the Children’s Internet Protection Act, the Deleting Online

Predators Act contained a disabling provision for adults and for minors when computers were used for educational purposes.65 Forslund’s thesis did not include a legislative history of the

CIPA. Her thesis also did not include a discussion of the history of public libraries, filtering technology studies, or the tradition in the United States of protecting children from sexually oriented materials. She argued against Internet filtering, stating that “[l]awmakers must resist the temptation to panic and instead focus on striking a balance between the expression of and the protection of America’s youth.”66

Also in 2007, Tonnis Venhuizen, a law student, argued that the Supreme Court, in

upholding the CIPA in 2003, missed the opportunity to reconsider the limits of Congressional

spending power67 that were established in South Dakota v. Dole.68 Venhuizen said that the Dole

08). In 2007, the bill died in the House Subcommittee on Telecommunications and the Internet. See Deleting Online Predators Act of 2007, H.R. 1120, 110th Cong. (1st Sess. 2007).

63 Forslund, supra note 61, at 49.

64 Forslund, supra note 61, at 49. Forslund wrote, “According to DOPA’s text, Congress has found that ‘(1) sexual predators approach minors on the Internet using chat rooms and social networking Web sites, and, according to the United States Attorney General, one in five children has been approached sexually on the Internet; (2) sexual predators can use these chat rooms and Web sites to locate, learn about, befriend, and eventually prey on children by engaging them in sexually explicit conversations, asking for photographs, and attempting to lure children into a face to face meeting; and (3) with the explosive growth of trendy chat rooms and social networking Web sites, it is becoming more and more difficult to monitor and protect minors from those with devious intentions, particularly when children are away from parental supervision’ (Deleting Online Predators Act, 2006, 2007).”

65 Forslund, supra note 61, at 49.

66 Forslund, supra note 61, at 77.

67 U.S. CONST., art. I, § 8, cl.1. “The Congress shall have Power To lay and collect Taxes, Duties, Imposts and Excises, to pay the Debts and provide for the common Defence and general Welfare of the United States.”


decision gave Congress a “plenary power” under the spending clause, thus allowing Congress to enact legislation that it could not enact directly.69 The CIPA Court reinforced the perception that

“[Congress’] spending power is without significant limit,” Venhuizen said.70 He argued that the

CIPA Court should have replaced the Dole standard or imposed “significant new limits on congressional (spending) power by redefining the elements of the Dole standard.”71 The CIPA

Court viewed the CIPA case “primarily” as a spending clause case, rather than a First

Amendment case, he said.72 If the CIPA Court had revised the Dole standard, the Court might have upheld the CIPA for computers purchased with federal funds, but not for computers purchased with non-federal funds, Venhuizen said.73

At least three authors argued that the Children’s Internet Protection Act was constitutional.74 Prior to the Supreme Court’s decision to uphold the CIPA, Mark Nadel, an attorney with the Federal Communications Commission, stated that the First Amendment allows libraries to use Internet filters as long as the filters are not used “to favor one socio-political

68 Tonnis Venhuizen, United States v. American Library Association: The Supreme Court Fails to Make the South Dakota v. Dole Standard a Meaningfull [sic] Limitation on the Congressional Spending Powers, 52 S.D. L. REV. 565, 597, 604 (2007). See also South Dakota v. Dole, 483 U.S. 203 (1987) (upholding the National Minimum Drinking Age Amendment of 1984, which required the federal government to withhold 5% of highway funds from states that allowed persons under twenty-one to purchase or possess alcoholic beverages). For a discussion of the Supreme Court’s decision on the Children’s Internet Protection Act and the application of the Dole test to the CIPA, see Chapter 7.

69 Venhuizen, supra note 68, at 604. For a discussion of the Supreme Court’s decision on the Children’s Internet Protection Act and the application of the Dole test to the CIPA, see Chapter 7.

70 Venhuizen, supra note 68, at 604.

71 Venhuizen, supra note 68, at 604.

72 Venhuizen, supra note 68, at 604.

73 Venhuizen, supra note 68, at 603. See supra notes 8-10 for a summary of the Children’s Internet Protection Act. See Chapter 7 for a discussion of the Supreme Court’s decision on the CIPA.

74 Mark S. Nadel, The First Amendment Limitations on the Use of Internet Filtering in Public and School Libraries: What Content Can Libraries Exclude? 78 TEX. L. REV. 1117 (2000); Susannah Malen, Protecting Children in the Digital Age: A Comparison of Constitutional Challenges to CIPA and COPA, 26 COLUM. J.L. & ARTS 217 (2003); Patrick Garry, The Flip Side of the First Amendment: A Right to Filter, 2004 MICH. ST. L. REV. 57, 68 (2004).


viewpoint over another.”75 “Librarians may install filters to help their patrons use the library's computer terminals and Internet links to gain easier access to those categories of content the librarians choose to include in their collections,” Nadel said.76 Before the Supreme Court upheld the CIPA, a law student stated that when Congress allocates funding, it has the right to restrict how those funds can be used as long as it does not use its Spending Power to violate constitutionally protected rights.77 After the Supreme Court upheld the CIPA, a law professor stated that the Supreme Court was correct in viewing librarians as editors, rather than selectors, of information. He compared Internet filtering to editing and said the Court has recognized a

“constitutional right to edit.”78

Several commentators stated that the CIPA is unconstitutional.79 They argued that the

CIPA is a content-based restriction, and therefore, the Supreme Court should have applied the

75 Nadel, supra note 74, at 1119.

76 Nadel, supra note 74, at 1119.

77 Malen, supra note 74, at 250.

78 Garry, supra note 74, at 68. However, Garry’s analysis does not seem to be directly applicable to the CIPA. In United States v. Am. Library Ass’n, 539 U.S. 194, 208 (2003), the Supreme Court stated that librarians are making acquisition or collection decisions when filtering Internet content and not removal decisions. Also, the two cases Garry used as support for his argument also do not seem to be directly applicable to the CIPA. He cited Miami Herald v. Tornillo, 418 U.S. 241 (1974), and he correctly stated that the Court upheld the right of newspapers to control and edit content as they saw fit. However, the roles of a commercial newspaper and public library are different. A newspaper is not publicly funded to serve citizens, nor are its facilities open to the public for obtaining information. Tornillo, 418 U.S. at 258. Garry also cited Denver Area Educational Telecommunications Consortium v. FCC, 518 U.S. 727 (1996), in which he said the Court affirmed the right of cable operators to edit out sexually explicit programming. However, that statement is not totally accurate. In Denver, the Court upheld one of three challenged parts of the Cable Act of 1992 and struck down the other two on First Amendment grounds. The Court affirmed the rights of cable operators to decide whether or not to broadcast programming depicting sexual or excretory activities on leased access channels, that is, channels leased to other program providers. The Court struck down a provision requiring leased channel operators to segregate and to block such programming. The Court also struck down a provision allowing cable operators to ban offensive or indecent programming on public access channels. See Denver Area Educ. Telecomm. Consortium, 518 U.S. at 727, 733, 752-54, 766-68.

79 Michael Cassidy, Note, To Surf and Protect: The Children’s Internet Protection Act Polices Material Harmful to Minors and a Whole Lot More, 11 MICH. TELECOMM. TECH. L. REV. 437, 440 (2005); Katherine Miltner, Note, Discriminatory Filtering: CIPA’s Effect on Our Nation’s Youth and Why the Supreme Court Erred in Upholding the Constitutionality of the Children’s Internet Protection Act, 57 FED. COMM. L.J. 555, 578 (2005); Larissa Piccardo, Note, Filtering the First Amendment: The Constitutionality of Internet Filters in Public Libraries Under the Children’s Internet Protection Act, 41 HOUS. L. REV. 1437, 1467 (2004); Leah Wardak, Note, Internet Filters


strict scrutiny test.80 In discussing the CIPA’s provision mandating a technology blocking mechanism, several authors have argued that the CIPA violates the First Amendment because filtering software blocks too much constitutionally protected speech.81 Others stated that filtering is more analogous to library removal decisions than acquisition decisions,82 and therefore, filtering would require the use of the strict scrutiny test.83

Some commentators also argued that the Supreme Court incorrectly labeled Internet access in public libraries as a nonpublic forum.84 The designation of the type of forum

and the First Amendment: Public Libraries After United States v. Am. Library Association, 35 LOY. U. CHI. L.J. 657, 725-26 (2004); Darin Siefkes, Note and Comment, Explaining United States v. American Library Association: Strictly Speaking, a Flawed Decision, 57 BAYLOR L. REV. 327, 357 (2005).

80 Cassidy, supra note 79, at 440; Miltner, supra note 79, at 578; Piccardo, supra note 79, at 1467; Wardak, supra note 79, at 725-26; Siefkes, supra note 79, at 357. For a content-based restriction to pass the strict scrutiny test, the restriction “must be narrowly tailored to serve a compelling Government interest” and must be the “least restrictive alternative.” See United States v. Playboy Entm’t Group, 529 U.S. 803, 813 (2000).

81 Wardak, supra note 79, at 726; J. Adam Skaggs, Note, Burning the Library to Roast the Pig? Online Pornography and Internet Filtering in the Free Public Library, 68 BROOKLYN L. REV. 809, 847 (2003); Peltz, supra note 13, at 478-79.

82 The term “acquisition” refers to a library’s method of selecting materials to include in its collection. In 1933, the Public Library Association (PLA), a division of the American Library Association (ALA), established general standards for deciding the amount and kinds of materials to acquire. In the 1970s, the PLA shifted from a quantitative standard to a community standard. Today, public libraries collect materials in a variety of formats (such as books, musical scores, photographs and microform) to meet individual and group needs for information, education, self-realization, recreation and cultural growth, as well as to assist patrons in carrying out their duties as citizens and community members. See JEAN KEY GATES, INTRODUCTION TO LIBRARIANSHIP146-47 (1990). See also BRUCE A. SHUMAN, FOUNDATIONS AND ISSUES IN LIBRARY AND INFORMATION SCIENCES 26-34 (1992). The American Library Association’s Bill of Rights states: “Books and other library resources should be provided for the interest, information, and enlightenment of all people of the community the library serves. Materials should not be excluded because of the origin, background, or views of those contributing to their creation.” AM. LIBRARY ASS’N, LIBRARY BILL OF RIGHTS, available at http://www.ala.org/ala/aboutala/offices/oif/statementspols/statementsif/librarybillrights.cfm.

83 Miltner, supra note 79, at 557, 570; Piccardo, supra note 79, at 1459; Wardak, supra note 79, at 668; Siefkes, supra note 79, at 327; Raizel Liebler, Institutions of Learning or Havens for Illegal Activities: How the Supreme court Views Libraries, 25 N. ILL. U. L. REV. 1, 3, 39 (2004).

84 Liebler, supra note 83, at 3-5, 70; Cassidy, supra note 79, at 465-67; Paul Jaeger & Charles McClure, Potential Legal Challenges to the Application of the Children’s Internet Protection Act (CIPA) in Public Libraries: Strategies and Issues, FIRST MONDAY (non-paginated online publication) (2004), http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/issue/view/167 and http://firstmonday.org/issues/issue9_2/jaeger/index.html (last visited July 20, 2009); Derrick Stomberg, Note, United States v. American Library Association, Inc.: The Internet as an Inherently Public Forum, 45 JURIMETRICS J. 59, 71 (2004).


determines the level of scrutiny because content-based speech restrictions in traditional and limited public fora are subject to strict scrutiny.85 A traditional public forum is one that historically has been open for public discourse and debate86 and does not require official government designation.87 Examples include streets, parks, sidewalks, and town squares.88 In the second type of forum, a limited or designated public forum, the government opens its public property for specific expressive activity or discussion of specific subjects89 or for use by certain groups, such as opening public university meeting rooms for registered student groups90 and municipal theaters for performers.91 The third type of forum, a nonpublic forum,92 generally is not open for public use because the property has an intended primary purpose that is not

85 Wardak, supra note 79, at 667.

86 Cornelius v. NAACP Legal Def. & Educ. Fund, 473 U.S. 788, 800 (1985) and Perry Educ. Ass'n v. Perry Local Educators' Ass'n, 460 U.S. 37, 45 (1983).

87 See Hague v. CIO, 307 U.S. 496, 515 (1939) and Perry, 460 U.S. at 45.

88 Hague, 307 U.S. at 515; Perry, 460 U.S. at 44-45; United States v. Am. Library Ass’n, 539 U.S. 194, 202 (2003).

89 Perry, 460 U.S. at 45-46. See also City of Madison Joint Sch. Dist. v. Wisc. Employment Relations Comm’n, 429 U.S. 167, 175 (1976) (holding that school board meetings were open to employees as well as the public, and stating, “Where the State has opened a forum for direct citizen involvement, it is difficult to find justification for excluding teachers.”).

90 Widmar v. Vincent, 454 U.S. 263, 269, 270-71, 277 (1981). The Supreme Court held that state regulation of speech must be content neutral and that a state university’s refusal to grant a student religious group access to university facilities was an unjustifiable, content-based exclusion of religious speech because the facilities were generally open to other student groups. According to the Court’s opinion, “[T]he campus of a public university, at least for its students, possesses many of the characteristics of a public forum.” Widmar, 454 U.S. at 269. See also Perry, 460 U.S. at 37, 45-47.

91 Se. Promotions, Ltd. v. Conrad, 420 U.S. 546 (1975). In a per curiam opinion expressing the view of five members of the Supreme Court, the Court held that a city board had engaged in prior restraint by refusing to grant the use of community facilities to a theater group wanting to perform the musical “Hair.” The Court said the board did not have the constitutionally required procedural safeguards in place to decide on usage requests and had based its decision only on outside reports that the play involved nudity and was obscene. See also Perry, 460 U.S. at 45-47.

92 Int’l Soc’y for Krishna Consciousness v. Lee, 505 U.S. 672, 679 (1992). Justice William Rehnquist, who wrote the majority opinion, used the term “nonpublic fora” to describe airport terminals. “Like the Court of Appeals, we conclude that the terminals are nonpublic fora and that the regulation reasonably limits solicitation.” Id. at 679. The International Society for Krishna Consciousness, a religious group, had invoked the First Amendment in challenging the law that prohibited solicitation of money and the distribution of literature inside airport terminals. The law allowed the solicitation of money and distribution of literature on sidewalks outside the airport terminals. Id. at 674- 75.


consistent with public use93 and has not been designated a public forum by the government.94

Examples of a nonpublic forum include airport terminals,95 military bases,96 prisons and jails,97 and teacher mailboxes.98

One law student argued that if forum analysis is applicable at all, the Court should have viewed the Internet as a traditional public forum.99 Several authors stated that the Supreme Court should have designated Internet access in public libraries as a limited public forum, consistent with the Court’s previous decisions involving libraries.100

Other commentators, because of the Supreme Court’s plurality opinion upholding the CIPA, have stated the Court still has not “definitively settled”101 or “clarified”102 the issue concerning what type of forum a public library really is. Derrick Stomberg, a law student, said that the Supreme Court has been using a nineteenth-century paradigm in applying public forum

93 Perry, 460 U.S. at 46.

94 Lee, 505 U.S. at 678-79.

95 Id. at 680-81.

96 Greer v. Spock, 424 U.S. 828, 838-39 (1976).

97 Adderley v. Florida, 385 U.S. 39, 41 (1966) and Jones v. N.C. Prisoners’ Labor Union, 433 U.S. 119 (1977).

98 Perry, 460 U.S. at 49. For a more thorough discussion of public fora doctrine and its applicability to public libraries, see Chapter 2.

99 Siefkes, supra note 79, at 353. Other commentators also have argued that the Internet should be treated as a traditional public forum. See Dawn C. Nunziato, The Death of the Public Forum in Cyberspace, 20 BERKELEY TECH. L.J. 1115, 1164 (2005); Aaron Jacobson, Note, United States v. American Library Association: Software Filters, Free Speech, and the Shrinking Public Forum, 38 U.C. DAVIS L. REV. 1345, 1361 (2005); Liebler, supra note 83, at 74; Stomberg, supra note 84, at 71; Peltz, supra note 13, at 463. See also Bernard Bell, Filth, Filtering, and the First Amendment: Rumination on Public Libraries’ Use of Internet Filtering Software, 53 FED. COMM. L.J. 191, 206-08 (2001). Professor Bell has argued that a library is, at the very least, a limited public forum and perhaps even a traditional public forum for receiving information. See also Steven Gey, Reopening the Public Forum—From Sidewalks to Cyberspace, 58 OHIO ST. L. J. 1535, 1611 (1998). Professor Gey said that the Supreme Court implied that the Internet was a traditional public forum when it struck down the Communications Decency Act in Reno v. ACLU, 521 U.S. 844 (1997).

100 Cassidy, supra note 79, at 465; Liebler, supra note 83, at 7-8; Piccardo, supra note 79, at 1459, 1467.

101 Jaeger & McClure, supra note 84.

102 Liebler, supra note 83, at 70.


doctrine and that an “inherently public forum” may need to be added as a fourth type of forum.103 Courts then could evaluate objective characteristics of non-traditional properties to determine if they promote the goals of public forum doctrine, and if so, designate those properties as inherently public fora.

As Stomberg noted in his article, the concept of an inherently public forum is not new. In 1992, Supreme Court Justice Anthony Kennedy said the Court should adopt a more modern and objective standard for the public forum doctrine, one that extends beyond the historical designation of streets, parks and sidewalks because their role is diminishing.104 Stomberg argued that the Internet has become the modern public forum as it provides speakers and recipients with a primary way of exchanging ideas and information in a centralized place.105

Several law students said that the Supreme Court neither applied nor addressed the First Amendment right to receive ideas and information in the CIPA case, even though that right was clearly established in its earlier cases.106 In addition, two library scholars argued that the Constitution protects the right to receive information, and yet the imprecision in filtering technology limits access to protected information.107 Several commentators recognized that the Supreme Court has stated that minors have a lesser First Amendment right to receive information

103 Stomberg, supra note 84, at 70.

104 Stomberg, supra note 84, at 69-71 (citing Int’l Soc’y for Krishna Consciousness v. Lee, 505 U.S. 672, 697-98 (Kennedy, J., concurring in the judgment)).

105 Stomberg, supra note 84, at 71; see also Gey, supra note 99, at 1618.

106 Cassidy, supra note 79, at 465; Barbara Sanchez, United States v. American Library Association: The Choice Between Cash and Constitutional Rights, 38 AKRON L. REV. at 485-87; Piccardo, supra note 79, at 1451-54 (2004); Wardak, supra note 79, at 732 (2004). See Chapter 2 for a discussion of the First Amendment right to receive ideas and information.

107 Jaeger & McClure, supra note 84. At the time this article was published, Jaeger, who has a J.D. and master’s degree in Information Studies, was a Ph.D. student at Florida State University and serving as a senior research associate at the School of Information Studies. McClure was professor and director of the Information Use Management and Policy Institute of the School of Information Studies at Florida State University.


than adults.108 However, the Court failed to address that right at all when ruling on the CIPA, according to a law student.109

Several authors, in analyzing the text of the Children’s Internet Protection Act, criticized Congress’ choice of wording in the statute. The commentators argued that terms such as “bona fide research,” “technology protection measure,” and “harmful to minors” are ambiguous.110 A law student said that, at the time Congress enacted the CIPA, filtering technology was incapable of blocking “visual depictions” deemed “obscene,” “child pornography” or “harmful to minors,”111 which were terms used in the CIPA.112 Moreover, the category descriptions used by filtering software did not match these three definitions, according to an attorney who advised the House Committee on Science and Technology.113 A law student argued that filters cannot differentiate between content that is obscene and that which is harmful to

108 Sidne Koenigsberg, Print Symposium, Contract Options for Individual Artists: Library Records Open to Parental Scrutiny: A New Set of Internet Access Controls for Minors, 29 COLUM. J.L. & ARTS 361, 374 (2006); Cassidy, supra note 79, at 675; Nunziato, supra note 99, at 155-57 (2004); Gregory Laughlin, Sex, Lies and Library Cards: The First Amendment Implications of the Use of Software Filters to Control Access to Internet Pornography in Public Libraries, 51 DRAKE L. REV. 253 (2003); Horowitz, supra note 13, at 426-27. See also Ginsberg v. New York, 390 U.S. 629, 638 (1968).

109 Cassidy, supra note 79, at 465.

110 Miltner, supra note 79, at 63; Jared Chrislip, Filtering the Internet Like a Smokestack: How the Children's Internet Protection Act Suggests a New Internet Regulation Analogy, 5 J. HIGH TECH. L. 261, 278-79 (2005). See also Michael Birnhack & Jacob Rowbottom, Symposium, Do Children Have the Same First Amendment Rights as Adults?: Shielding Children: The European Way, 79 CHI.-KENT L. REV. 175, 217 (2004) (discussing Internet filtering software in general and stating, “[I]n many cases it is unclear, and it cannot be made clear, in advance whether the content at stake is harmful to children or not. A court can decide so in retrospect, but what is the line between that which is harmful to a child and that which is not?”).

111 Chrislip, supra note 110, at 278.

112 For a discussion of more recent Internet filtering technology, see Chapter 3.

113 Mitchell Goldstein, Congress And The Courts Battle Over The First Amendment: Can The Law Really Protect Children From Pornography On The Internet? 21 J. MARSHALL J. COMPUTER & INFO. L. 141, 187 (2003). Examples of filtering categories are “adults only”, “sexually explicit”, “sex education”, “nudity” and “violence”. For a discussion of how filtering software works, see Chapter 3.


minors,114 an important distinction since adults legally would be allowed to access the nonobscene material that could be considered harmful to minors.

Under the current Miller test, obscenity is evaluated, in part, on local community standards.115 However, community and geographic boundaries do not exist in cyberspace, as one commentator pointed out.116 In addition, proprietary filtering software, which is developed for national and international markets, cannot take into account local community standards when blocking images deemed obscene.117 In contrast to the Supreme Court’s argument that filters could easily be disabled, the process can be time-consuming and difficult, according to several prominent library scholars.118 They also stated that the cost of filters sometimes exceeds the government funding for library computers and technology.119
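
The technological mismatch these commentators describe can be seen in a minimal sketch of category-based blocking. The Python example below is purely illustrative: the category labels echo those mentioned in note 113, but the URLs, the category list and the decision logic are hypothetical and do not represent any actual vendor’s product. The point is that a single, nationally distributed category list offers no place to express the Miller test’s community-standards or serious-value prongs, or the legal line between material that is obscene and material that is merely harmful to minors.

    # Hypothetical sketch of category-based filtering; not any vendor's actual product.
    # The same proprietary category list is shipped to every library nationwide, so it
    # cannot reflect any one community's standards or the CIPA's legal definitions.
    VENDOR_CATEGORIES = {
        "example-museum.org/classical-nudes": "nudity",             # protected art
        "example-clinic.org/teen-sex-ed": "sex education",          # protected health information
        "example-adult-site.com/explicit": "sexually explicit",
    }

    # Categories a library might switch on. None of these labels corresponds to
    # "obscene," "child pornography" or "harmful to minors" as the law defines them.
    BLOCKED_CATEGORIES = {"adults only", "sexually explicit", "nudity"}

    def is_blocked(url: str) -> bool:
        """Block a page solely because of the vendor-assigned category label."""
        return VENDOR_CATEGORIES.get(url) in BLOCKED_CATEGORIES

    for url in VENDOR_CATEGORIES:
        print(url, "blocked" if is_blocked(url) else "allowed")

Because the decision turns entirely on the vendor’s label, the museum page in this sketch is blocked on the same terms as the adult site, which is the overblocking problem the commentators identify.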

Several commentators questioned the Supreme Court’s interpretation of the role of public librarians. Professor Felix Wu argued that the plurality in the CIPA case did not characterize librarians the way the public does, as information managers rather than gatekeepers.120 Librarian Jill Ratzan said the CIPA did not differentiate public librarians from school librarians. The “CIPA appears to establish new, contradictory, and perhaps undesirable roles for public librarians,” she wrote. For example, under the CIPA, librarians would be “unblockers” and “deniers of access,”

114 Chrislip, supra note 110, at 269 and n.47.

115 See supra note 25 and accompanying text.

116 Todd Nist, Note, Finding the Right Approach: A Constitutional Alternative for Shielding Kids from Harmful Materials Online, 65 OHIO ST. L. J. 451, 459 (2004).

117 Amitai Etzioni, Symposium, Do Children Have the Same First Amendment Rights as Adults?: On Protecting Children from Speech, 79 CHI.-KENT L. REV. 3, 51 (2004); Goldstein, supra note 113, at 187.

118 Paul Jaeger, Charles McClure, John Bertot & Lesley Langa, CIPA: Decisions, Implementation, and Impacts, 44 PUB. LIBRARIES 105, 106 (2005). See also Miltner, supra note 79, at 563.

119 Jaeger, McClure, Bertot & Langa, supra note 118, at 107; see also Liebler, supra note 83, at 67.

120 Felix Wu, Note, United States v. American Library Ass’n: The Children’s Internet Protection Act, Library Filtering, and Institutional Roles, 19 BERKELEY TECH. L.J. 555, 575 (2004).


which are antithetical to librarians’ traditional roles as providers of access, Ratzan said.121 Law Professor Jim Chen argued that “decisions to acquire material should lie beyond judicial challenge . . . and legislative mandates to exclude material should draw strict scrutiny and should be presumed unconstitutional.”122

Two commentators have said that the Supreme Court’s decision on the CIPA shows that the Court needs to reconcile its contradictory decisions. Professor Wu stated that the Court’s decisions regarding overall First Amendment issues seem “ad hoc.” He said that the Court needs to reconcile the various doctrines and lines of cases under a broader framework so as to understand the relevant factors before applying the doctrines to a specific case.123 He suggested the Court look at both the government entity making the restriction and the place being restricted to evaluate whether the “proposed restriction is consistent with the level and type of discretion entrusted to that entity.”124 Law librarian Raizel Liebler stated that the Supreme Court plurality opinion ignored library case precedent. Liebler said the Court plurality did not even mention Brown and Pico, two key library cases.125 She noted that only two opinions in the CIPA case—

121 Jill Ratzan, CIPA and the Roles of Public Librarians, 43 PUB. LIBRARIES 285, 286 (2004).

122 Jim Chen, Mastering Eliot's Paradox: Fostering Cultural Memory in an Age of Illusion and Allusion, 89 MINN. L. REV. 1361, 1362 (2005). See also Marc Blitz, Constitutional Safeguards for Silent Experiments in Living: Libraries, the Right to Read, and a First Amendment Theory for an Unaccompanied Right to Receive Information, 74 UMKC L. REV. 799, 843-44 (2006) (arguing in support of a library autonomy model).

123 Wu, supra note 120, at 568.

124 Wu, supra note 120 at 568.

125 Liebler, supra note 83, at 66. In Brown, the Supreme Court acknowledged that a public library is “a place dedicated to quiet, to knowledge, and to beauty.” See Brown v. Louisiana, 383 U.S. 131, 142 (1966). In Pico, a plurality held that a student’s “right to receive ideas” was infringed when school board members voted to have books removed from the junior and senior libraries that they considered to be “anti-American, anti-Christian, anti- Semitic, and just plain filthy.” See Bd. of Educ. v. Pico, 457 U.S. 853, 857-58 (1982). For a thorough discussion of Brown and Pico, see Chapter 2, infra notes 141-168 and accompanying text.


Justice Stephen Breyer’s concurrence and Justice David Souter’s dissent—addressed Pico, with Justice Breyer only making a “passing reference” to the case.126

Relevant Articles Before the CIPA Was Adopted

Several authors wrote about the 1998 and 1999 versions of bills that were similar to the Children’s Internet Protection Act of 2000, but none compiled a legislative history of those bills. Although none of the bills were enacted, summaries of the authors’ works are included in this literature review because of the prior bills’ similarities to the CIPA.127

Three authors reviewed Sen. John McCain’s 1998 filtering bill, the Internet School Filtering Act,128 which, like the CIPA, would have required schools and libraries to install Internet filters to receive federal funding for online technology. Two authors concluded that filtering is the least restrictive alternative to achieve the government’s compelling interest of keeping material deemed harmful to minors away from juveniles.129 They argued that filters would be constitutional because they could be turned off for adults or installed on only some computers that minors would be limited to using.130 The third author questioned whether the bill was necessary since the legislative history indicated only anecdotal rather than statistical support showing material deemed “inappropriate” was actually harmful to minors.131

126 Liebler, supra note 83 at 66-67. For an analysis of the Supreme Court’s decision on the Children’s Internet Protection Act, see Chapter 7.

127 For the legislative history of the Children’s Internet Protection Act of 2000, see Chapter 6.

128 See S. 1619, 105th Cong. (2d Sess. 1998).

129 Kimberly S. Keller, Comment, From Little Acorns Great Oaks Grow: The Constitutionality of Protecting Minors from Harmful Internet Material in Public Libraries, 30 ST. MARY’S L. J. 549, 608-09 (1999); Matthew Thomas Kline, III, First Amendment: 1. Limiting Internet Access: a) Public Libraries: Mainstream Loudoun v. Board of Trustees of Loudoun County Library, 14 BERKELEY TECH. L. J. 347, 370 (1999).

130 Keller, supra note 129, at 609; Kline, supra note 129, at 370.

131 See Christopher G. Newell, Chalk Talk: The Internet School Filtering Act: The Next Possible Challenge in the Development of Free Speech and the Internet, 28 J. L. & EDUC. 129, 137 (1999). Under S. 1619, 105th Cong. (2d


In an article on the Children’s Internet Protection Act of 1999,132 a bill also introduced by Sen. McCain, a law student argued that the bill was the first step down the “‘slippery slope’ of arbitrary censorship”133 and that the bill was flawed because it treated schools and libraries the same.134 Another commentator concluded that Congress jumped too quickly in trying to regulate the Internet before its full potential had been realized.135

Several commentators discussed library Internet filtering in general and argued that mandatory Internet filtering in public libraries is, or could be, constitutional. First Amendment scholar R. Polk Wagner stated that the evolution of more precise and accurate filters and the increase in sites rating themselves could allow the government to mandate Internet filtering.136 A law student contended that, because of the dynamic nature of the Internet compared to the static nature of books, filtering is not equivalent to removing books from the shelf, but rather is an acquisition decision. Thus, it would fall under a librarian’s editorial discretion.137 Another law student stated that filtering would be constitutional in public libraries if both filtered and unfiltered computers were made available to the public.138 However, under the CIPA, all library

Sess. 1998), local school officials and librarians would determine what content was “inappropriate.” Id. at 136-37. Newell did not cite any sources of legislative history.

132 S. 97, 106th Cong. (1st Sess. 1999).

133 Elizabeth Shea, Note, The Children’s Internet Protection Act of 1999: Is Internet Filtering Software the Answer? 24 SETON HALL LEGIS. J. 167, 205 (1999).

134 Id.

135 Jennifer Zwick, Comment, Casting a Net Over the Net: Attempts to Protect Children in Cyberspace, 10 SETON HALL CONST. L. J. 1133, 1179 (2000).

136 See R. Polk Wagner, Essay: Filters and the First Amendment, 83 MINN. L. REV. 755, 777 (1999). For a discussion of more recent filtering technology, see Chapter 3.

137 See Brent VanNorman, Comment and Note, The Library Internet Filter: On the Computer or in the Child? 11 REGENT U. L. REV. 425, 437 (1998/1999).

138 See Laurann Sage, Note, Mainstream Loudoun v. Board of Trustees: Restricting Internet Access in Public Libraries, 67 UMKC L. REV. 731, 743 (1999). However, under the E-rate, all library computers must have filtered Internet access, including staff computers.


computers must have filtered Internet access, including staff computers, as the federal district court noted.139

Other commentators argued that mandatory Internet filters are not constitutional. Professor Wagner concluded that direct government filtering most likely would be unconstitutional in view of the limitations of filtering technology in 1999, when he was writing.140 In analyzing the limitations of filtering software and the possibility of content providers rating their own material, Wagner stated that librarians would be limited in making content decisions, if permitted to make them at all, which he viewed as a disadvantage.141

In a 1998-99 law review article, Brent VanNorman, a law clerk, challenged the application of the forum doctrine to library holdings. He argued that a public library is not a traditional or limited public forum when choosing content because librarians use editorial discretion in acquiring library materials.142 FCC attorney Mark Nadel said that if libraries were classified as a limited public forum for content selection, librarians would be forced to have compelling reasons for using content-based justifications to reject donations of books.143

In 2000, the author of this dissertation analyzed the Communications Decency Act, the Child Online Protection Act and key Supreme Court cases that dealt with protecting minors from

139 United States v. Am. Library Ass’n, 539 U.S. 194, 230-31 (2003). See Pub. L. No. 106-554, 114 Stat. 2763, 2763A-335 (2000) (codified as amended at 20 U.S.C. § 9134(f)(1)(A) and (f)(1)(B), 47 U.S.C. § 254(h)(6)(B) and (h)(6)(C)).

140 See Wagner, supra note 136, at 758-59. See also Eileen M. Candia, Comment, The Information Super Highway – Caution – Road Blocks Ahead: Is the Use of Filtering Technology to Prevent Access to “Harmful” Internet Sites Constitutional? 9 TEMPLE POL. & CIV. RTS. L. R. 85, 89, 108 (1999). Candia wrote that while protecting children from material deemed harmful on the Internet is a compelling government interest, “the use of site blocking technology does not meet the exacting standard of precision required in achieving the narrow tailoring of a statute because it permits exclusion of protected speech.” Id. at 108.

141 See Wagner, supra note 136, at 777.

142 See VanNorman, supra note 137, at 430-32.

143 Nadel, supra note 74, at 1132. Because libraries have limited shelving and storage facilities, as well as acquisition policies, they do not accept all donations. Id.


sexually explicit material. This author suggested that libraries adopt a three-pronged approach that would be less restrictive and more narrowly tailored than a mandatory filtering policy.144 First, privacy walls and screens could be used to avoid offending patrons and librarians.145 Second, a library card bar code could contain a yes/no field to indicate whether parents wanted their children to use filtered or unfiltered computers. Third, librarians could install Internet filtering software that either could be turned off for adults or that was loaded onto only some of the computers.146 In her conclusion, this author emphasized the role of the parent, rather than the librarian or government, in determining whether minors should be allowed unfiltered Internet access.147 Similarly, other commentators have emphasized parental choice, arguing that parents, not librarians, are responsible for protecting minors from material deemed harmful.148 Attorney Junichi Semitsu suggested that librarians should assist parents by offering optional filters, but they should not become censors.149 Professor Catherine Ross, an attorney and historian, has argued that older minors should have access to information without parental consent.150 However, none of these options is available to public libraries accepting E-rate or LSTA funding.151
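
The second prong of that approach lends itself to a brief illustration. The Python sketch below is a hypothetical mock-up, under assumed field names, of how a parent’s filtered/unfiltered choice could be stored in a minor’s library-card record and consulted when a session begins; it is not drawn from any actual integrated library system.

    # Hypothetical sketch of a parent-controlled filtering flag on a library-card
    # record; the field names and logic are assumptions made for illustration only.
    from dataclasses import dataclass

    @dataclass
    class PatronRecord:
        card_number: str
        is_minor: bool
        parent_allows_unfiltered: bool = False  # the yes/no field a parent would set

    def session_is_filtered(patron: PatronRecord) -> bool:
        """Adults receive unfiltered access; a minor's access follows the parent's choice."""
        if not patron.is_minor:
            return False
        return not patron.parent_allows_unfiltered

    # Example: a minor whose parent has opted for unfiltered access.
    child = PatronRecord(card_number="0001", is_minor=True, parent_allows_unfiltered=True)
    print(session_is_filtered(child))  # prints False; the parent's choice controls

Under the CIPA’s funding conditions, however, no such per-patron choice is available, because the statute requires a technology protection measure on every computer with Internet access.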

144 See Barbara H. Smith, To Filter or Not to Filter: The Role of Public Librarians in Determining Internet Access, 5 COMM. LAW & POL’Y 385, 419 (2000).

145 To ensure privacy, computers could be housed in cubicles, and plastic screens could be placed over the monitor that would make images and text less visible to passersby.

146 Smith, supra note 144, at 419.

147 Smith, supra note 144, at 421.

148 See Junichi Semitsu, Note, Burning Books in Public Libraries: Internet Filtering Software vs. the First Amendment, 52 STAN. L. REV. 509, 545 (2000); Nunziato, supra note 99, at 164; Cassidy, supra note 79, at 472.

149 Semitsu, supra note 148, at 545.

150 Catherine J. Ross, An Emerging Right for Mature Minors to Receive Information, 2 U. PA. J. CONST. L. 223, 224- 25, 275 (1999).

151 See Pub. L. No. 106-554, 114 Stat. 2763, 2763A-335 (2000) (codified as amended at 20 U.S.C. § 9134(f)(1)(A) and (f)(1)(B), 47 U.S.C. § 254(h)(6)(B) and (h)(6)(C)), mandating that filtering software be installed on any computers connected to Internet for those libraries receiving E-rate funding and Library Services and Technology Act (LSTA) funding.


Methodology

The principal methodology for this project involves legal research to analyze legislation, legislative histories and court cases. The author of this study will use legal research methods to analyze the legislative history of the Children’s Internet Protection Act of 2000 and its predecessors, including preliminary bills, committee hearings and reports, legislative debates, bill sponsors’ statements and public law. The author also will analyze the court cases that reviewed the CIPA. All materials are available online through Lexis or Thomas (the Library of Congress Web site), or in paper or microform format at the Government Documents Library at the University of Florida, Kansas State University, or the University of Rochester. Common paper and microform sources are available from the Congressional Information Service (CIS), Government Printing Office (GPO), Congressional Record, and Statutes at Large.

The author will use the Lexis computerized database and Thomas (the Library of Congress online database) to locate relevant legislative documents and court cases on the Children’s Internet Protection Act. In addition, the author will use Lexis and Thomas to locate federal court cases reviewing patrons’ access to public library buildings and content. The author will use the following key words in conducting database searches: “Children’s Internet Protection Act”; “filter* w/p librar*”; “obscenity + minors”; “sexually explicit + minors”; “variable obscenity + minors”; “harm* w/p minors”; “pornography + minors”; “pornography + children”; “library + access”; “libraries + access”; “library + minors”; “libraries + minors”; “library + access + minors”; “libraries + access + minors”; “library + children”; “library + access + children”; “libraries + children”; and “libraries + access + children.”
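
Because many of the library-related strings above are simple combinations of a subject term and one or two qualifiers, they can be generated systematically rather than typed individually. The short Python sketch below merely illustrates that combinational structure using the strings already listed; it is not an interface to Lexis or Thomas, and the helper is a hypothetical convenience.

    # Hypothetical helper that rebuilds the "library"/"libraries" search strings
    # listed above by pairing each subject with each qualifier.
    from itertools import product

    subjects = ["library", "libraries"]
    qualifiers = ["access", "minors", "children", "access + minors", "access + children"]

    def build_queries(subjects, qualifiers):
        """Join each subject with each qualifier using the '+' connector used above."""
        return [f"{subject} + {qualifier}" for subject, qualifier in product(subjects, qualifiers)]

    for query in build_queries(subjects, qualifiers):
        print(query)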

This dissertation also will rely on secondary source materials, including books, library journals, social science journals, and legal periodicals, such as law journals and law reviews. The secondary sources are available in hard copy or electronic form through the University of Florida, Kansas State University, or University of Rochester libraries.

The author of this dissertation will use secondary sources to discuss the history and role of the public library. To find these sources, the author will run keyword searches in university library book catalogs, the Lexis law review database, and the Wilson Select library literature database. The keywords will be: “public + library + mission”; “public + library + role”; “public + library + history”; and “public + library + access.” The author also will use the American Library Association’s (ALA) Web site because the ALA is the world’s oldest and largest professional organization for librarians.152 The ALA’s stated mission is “To provide leadership for the development, promotion and improvement of library and information services and the profession of librarianship in order to enhance learning and ensure access to information for all.”153 The ALA also has adopted a bill of rights for public libraries and patrons.154

The author of this dissertation will rely on studies evaluating filtering software in an effort to determine if the Children’s Internet Protection Act establishes a regulatory regime that is physically capable of doing what Congress intended for it to do: preventing minors from accessing online “visual depictions” that are considered obscene, child pornography, or harmful to minors. To find the filtering technology studies, the author will use secondary sources. She will run keyword searches in the following databases that contain social science studies:

152 “The American Library Association (ALA) is the oldest and largest library association in the world, with members in academic, public, school, government, and special libraries.” Am. Library Ass’n, Frequently Answered Questions, http://www.ala.org/Template.cfm?Section=alafaq&template=/cfapps/faq/faq.cfm. (last visited July 20, 2009).

153 Am. Library Ass’n, Mission and Priorities, http://ala.org/ala/aboutala/missionhistory/mission/index.cfm. (last visited July 20, 2009).

154 AM. LIBRARY ASS’N, LIBRARY BILL OF RIGHTS, available at http://www.ala.org/ala/aboutala/offices/oif/statementspols/statementsif/librarybillrights.cfm. For a discussion of the history and role of public libraries, see Chapter 2.


Academic Search Elite, Academic Search Premier, FirstSearch, and Wilson Omnifile. The author also will use the Google search engine to find citations or links to other scientific studies on filtering technology. The keywords will be: “Internet + filter + studies”; “Internet + filter + software”; “Internet + filtering + software”; “Internet + filtering + software + studies”; “Internet + filtering + software + evaluation”; “Internet + filter + children”; “Internet + filter + minors”; and “Internet + filter + pornography.”

The author of this dissertation will use secondary sources to find social science studies on the effects of pornography. For the purposes of this dissertation, pornography is defined as follows: nonviolent material, such as writings, photographs, movies or Internet sites, depicting sexual activity or erotic behavior between consenting adults in a way that is designed to arouse sexual excitement.155 To find the pornography studies, the author will run keyword searches in the following databases that contain social science studies: Academic Search Elite, Academic Search Premier, FirstSearch, and Wilson Omnifile. The author also will use the Google search engine to find citations or links to other scientific studies on pornography. The keywords will be:

155 The definition of pornography is based on the definition in Black’s Law Dictionary, which reads: “material (such as writings, photographs, or movies) depicting sexual activity or erotic behavior in a way that is designed to arouse sexual excitement.” BLACK’S LAW DICTIONARY (8th ed. 2004). The author of this dissertation has added the terms “Internet sites,” “nonviolent” and “between consenting adults” to Black’s definition for clarification and to distinguish consensual nonviolent pornography from nonconsensual and/or violent pornography. The U.S. Supreme Court has never defined the term “pornography,” which is a “vague” term because it includes both protected sexual material and unprotected sexual material, such as obscenity and rape. See ROBERT TRAGER, JOSEPH RUSSOMANO & SUSAN DENTE ROSS, THE LAW OF JOURNALISM AND MASS COMMUNICATION 384 (2007). The term “pornography” does not have a common definition or meaning. See DON R. PEMBER & CLAY CALVERT, MASS MEDIA LAW 12 (2005). The Supreme Court defined obscenity in 1973 in Miller v. California, 413 U.S. 15, as “works which, taken as a whole, appeal to the prurient interest in sex, which portray sexual conduct in a patently offensive way, and which, taken as a whole, do not have serious literary, artistic, political, or scientific value.” Miller, 413 U.S. at 24. Indecency, as defined by the Federal Communications Commission in 1975, is “language that describes, in terms patently offensive as measured by contemporary community standards for the broadcast medium, sexual or excretory activities and organs.” FCC v. Pacifica Found., 56 F.C.C.2d 94, 97 (1975). However, the FCC in 2001 revised the definition of indecency to consider “context” and examine how explicit or graphic the material is, whether the material dwells on sexual activities, and whether the material is meant to shock or sexually arouse the audience. FCC Enforcement Policy Regarding Broadcast Indecency, 16 F.C.C.R. 7999, 8002-03 (2001).


“pornography + effects”; “pornography + effects + children”; “pornography + effects + minors”; “pornography + online + effects”; “pornography + Internet + effects”; “pornography + online + effects + minors”; “pornography + Internet + effects + minors”; “pornography + online + effects + children”; and “pornography + Internet + effects + children.”

Although the CIPA applies to public libraries and schools, an analysis of the law’s applicability to schools is beyond the scope of this dissertation.

Outline of Dissertation

This research project on the Children’s Internet Protection Act will be organized into eight chapters.

Ch. 1: Internet Access in Public Libraries. This chapter serves as an introduction to the dissertation and provides an overview of topics that will be covered. This chapter summarizes the Children’s Internet Protection Act of 2000 and discusses the purpose and relevance of the dissertation. This chapter reviews relevant literature on mandatory Internet filtering in public libraries, explains the author’s methodology, and provides a summary of chapters.

Ch. 2: The Public Library and the First Amendment. This chapter provides an overview of the mission and role of the American public library. Chapter Two examines the public forum doctrine as it applies to public libraries and the right to receive information in a public library. This chapter discusses minors’ access to information in public libraries. Chapter Two examines First Amendment theory in the context of mandatory Internet filtering in public libraries.

Ch. 3: Internet Filtering Technology. To help readers understand the theoretical and practical issues involved with Internet filtering, this chapter discusses how filtering technology works, including the strengths and weaknesses of filtering technology. Expert analysis in secondary sources is used for this chapter.

Ch. 4: The Protection of Minors from Material Deemed Harmful. This chapter provides an overview of American society’s changing view of childhood over time, the parental role in childrearing, and the law’s treatment of sexually explicit material and minors’ access to sexually explicit material. Chapter Four discusses social science research on the effects of pornography.

Ch. 5: Federal Attempts at Protecting Minors from Online Material Deemed Harmful. This chapter examines the Communications Decency Act (CDA) and the Child Online Protection Act (COPA). The CDA and COPA were the only two statutes that Congress passed in an effort to prevent minors from accessing sexually explicit content on the Internet before enacting the Children’s Internet Protection Act of 2000. This chapter examines the court cases that reviewed the Communications Decency Act and the Child Online Protection Act.

Ch. 6: The Legislative History of the Children’s Internet Protection Act. This chapter discusses the legislative history of the Children’s Internet Protection Act to try to determine Congressional intent in enacting the law. An understanding of the legislative history of previous bills is important to this study because the Children’s Internet Protection Act is based on the text of these previous bills, as well as on House and Senate hearings and reports on these bills.

Ch. 7: Court Decisions on the Children’s Internet Protection Act. This chapter analyzes the federal district court’s and the Supreme Court’s opinions on the Children’s Internet Protection Act.

Ch. 8: Conclusion: Bridging the Gap Between Law and Technology. This chapter draws on information from the preceding chapters to analyze the Children’s Internet Protection Act of 2000 in the context of First Amendment theory, the mission and role of the American public library, Internet filtering technology, and legal precedent. The author will compare the Children’s Internet Protection Act with the status of filtering technology to try to determine whether it is technologically feasible for the so-called filtering and blocking software to accomplish what Congress is asking of it. This chapter will also evaluate whether the CIPA is consistent with our current understanding of the harm to minors of sexually oriented content, the role of libraries, and the mandate of the First Amendment.


CHAPTER 2 THE PUBLIC LIBRARY AND THE FIRST AMENDMENT

Introduction

Librarians and at least one First Amendment scholar, Rodney Smolla, believe that the mission of the public library has been to support the educational, socialization and information needs of society, to promote self-education, and to satisfy the tastes of the popular culture.1 Public libraries often have provided open and equal access to all materials and to all users,2 with the goal of combating censorship and thus preserving everybody’s individual right to choose their own reading and viewing materials.3

Public libraries in the United States have existed for more than 150 years. The nation’s first large tax-supported municipal library, the Boston Public Library, opened in 1854 and served as a model for future public libraries in three major ways: by providing open access to all individuals, by allowing all residents to borrow materials, and by designing a separate children’s reading room.4 The number of public libraries greatly expanded during the 1890s and early 1900s after philanthropist Andrew Carnegie began donating millions of dollars for the

1 See RICHARD E. RUBIN, FOUNDATIONS OF LIBRARY AND INFORMATION SCIENCE 244 (2000); Rodney Smolla, Freedom of Speech for Libraries and Librarians, 85 LAW LIBR. J. 71, 73 (1993) (“Librarians must fight those who seek to destroy the critical role of the library as the free and open marketplace of ideas, turning it instead into an arbiter of conventional mainstream tastes and sensibilities.”).

2 See THE BOWKER ANNUAL LIBRARY AND BOOK TRADE ALMANAC 243 (Dave Bogart ed., 44th ed. 1999). But see Daniel J. Boorstin, The Indivisible Community, in LIBRARIES AND THE LIFE OF THE MIND IN AMERICA 119 (1977), explaining that many early librarians were torn between the preservation of books by protecting them from the public and the diffusion of ideas, or making books accessible. In addition, librarians were concerned over the dress and demeanor of patrons, particularly the “‘laboring classes’. . . who might soil the books and were unlikely to show them the respect that they were entitled to.” Boorstin at 119.

3 See BRUCE A. SHUMAN, FOUNDATIONS AND ISSUES IN LIBRARY AND INFORMATION SCIENCES 122 (1992).

4 See JEAN KEY GATES, INTRODUCTION TO LIBRARIANSHIP 72 (1990). The Boston Public Library opened in a former schoolhouse in 1854 and moved to its present site in 1895. Id. at 3. See also EVELYN GELLER, FORBIDDEN BOOKS IN AMERICAN PUBLIC LIBRARIES, 1876-1939 at 72 (1984) and Boston Public Library, A Brief History and Description, http://www.bpl.org/general/history.htm (last visited July 20, 2009). For a history of libraries in America, see generally C. SEYMOUR THOMPSON, EVOLUTION OF THE AMERICAN PUBLIC LIBRARY: 1653-1876 (1952) and LOUISE ROBBINS, CENSORSHIP AND THE AMERICAN LIBRARY: THE AMERICAN LIBRARY ASSOCIATION’S RESPONSE TO THREATS TO INTELLECTUAL FREEDOM: 1939-1969 (1996). See also GATES at 47-90.


construction of municipal libraries across the nation, according to library scholar and historian Evelyn Geller.5 In most states, public libraries today are supported by taxes, primarily local property taxes.6 Sources of funding for libraries also can come from federal and state appropriations, intangible taxes, government lotteries, and private family, foundation and corporate funds.7 A local board of trustees oversees most public libraries. The board determines the overall purposes, objectives and policies of the library; provides budgetary advice; and works with public officials, library associations and local residents to provide quality library services.8 According to the American Library Association, trustees also are expected to support librarians in resisting an individual’s or group’s censorship attempts.9

Under the Children’s Internet Protection Act of 2000, public libraries—and most schools10—receiving federal funding are required to implement an Internet safety policy11 and install filtering technology on all computers connected to the Internet to block images that are

5 GELLER, supra note 4, at 40.

6 See SHUMAN, supra note 3, at 78-80. See also RUBIN, supra note 1, at 231.

7 See SHUMAN, supra note 3, at 75-80.

8 GATES, supra note 4, at 140. See also AMERICAN LIBRARY TRUSTEE ASSOCIATION, FIVE WAYS ALTA CAN HELP YOUR LIBRARY, available at http://www.ala.org/ala/mgrps/divs/alta/links/PDF5waysaltacanhelpyou.pdf.

9 AMERICA’S LIBRARY AND TRUSTEES ADVOCATES, ETHICS STATEMENT FOR PUBLIC LIBRARY TRUSTEES, available at http://www.ala.org/ala/mgrps/divs/alta/links/ethicsstatement.pdf.

10 The Federal Communications Commission concluded that for a school to be eligible for universal service discounts, “a school must meet the statutory definition of an elementary or secondary school found in the Elementary and Secondary Education Act of 1965, must not operate as a for-profit business, and must not have an endowment exceeding $50 million. Both public and non-public elementary and secondary schools that meet these criteria will be eligible to receive discounts on eligible services.” See FCC, Frequently Asked Questions on Universal Service and the Snowe-Rockefeller Amendment (released July 2, 1997), http://www.fcc.gov/learnnet/ (last visited July 20, 2009).

11 20 U.S.C. § 9134(f)(1)(A) and (B), 47 U.S.C. §254(h)(5)(A) and (h)(6)(A). For a discussion of the CIPA, see infra Chapter 6, pp. 281 to 286. For an explanation of the LSTA and E-rate programs, see Chapter 1, supra notes 36 to 41 and accompanying text.


obscene, involve child pornography, or are considered harmful to minors.12 To understand the role of the public library in providing information to patrons, it is important to analyze the mission of the public library, how the First Amendment public forum and right to receive ideas and information doctrines apply to public libraries, and the role the public library has played in making information accessible to minors.

The Mission and Role of the Public Library

Even before the emergence of the Internet, public libraries faced varying degrees of trouble with censorship, community control, and funding.13 During the latter part of the nineteenth century, intellectual freedom was not yet a goal of tax-funded public libraries. In the 1880s, most librarians focused on the occupational and cultural needs of patrons and did not oppose moral censorship, though mostly out of fear of losing their positions, according to Geller.14 As the profession of librarianship became more established in the early 1900s, public librarians began to embrace neutrality and impartiality. However, in 1917, during a time of war and crisis, national values of patriotism sometimes led to policies endorsing censorship. Librarians either voluntarily advocated, or were expected to advocate, societal values.15 In the 1920s, public library trustees16 and librarians joined forces to oppose attempts at censorship by

12 See Pub. L. No. 106-554, 114 Stat. 2763, 2763A-335 (2000) (codified as amended at 20 U.S.C. § 9134(f), 47 U.S.C. § 254(h)(6)), mandating that filtering software be installed on all computers connected to the Internet for those libraries receiving E-rate funding and Library Services and Technology Act (LSTA) funding.

13 SHUMAN, supra note 3, at 75.

14 GELLER, supra note 4, at 28-29.

15 GELLER, supra note 4, at 184. Library trustees and community officials expected the support of common societal values during war time.

16 Most public libraries are administered by a board of trustees, which determines the policies of the library, provides budgetary advice, and works with public officials, library associations and citizens to provide quality library services. GATES, supra note 4, at 140.


the local government and community.17 During the 1930s, an ideology of freedom emerged in the library profession.18 The American Library Association (ALA) began promoting “man’s freedom to seek the truth where and how he will” in the 1930s, according to Jean Key Gates, the author of a leading textbook on librarianship.19

During the 1930s, the American Library Association began developing a set of resolutions emphasizing the importance of providing a variety of viewpoints in library materials and challenging censorship.20 The ALA formally adopted a bill of rights in 1939, according to Gates.21 The ALA’s Library Bill of Rights, which was amended several times in the following decades, states that “libraries should challenge censorship in the fulfillment of their responsibility to provide information and enlightenment”22 and “libraries should cooperate with all persons and groups concerned with resisting abridgment of free expression and free access to ideas.”23

17 GELLER, supra note 4, at 185.

18 GELLER, supra note 4, at 143, 156; ROBBINS, supra note 4, at 151. Supreme Court Justice David Souter cited Geller in his dissenting opinion in United States v. American Library Ass’n, in emphasizing the evolution of freedom of choice in accessing library materials in the twentieth century. See United States v. Am. Library Ass’n, 539 U.S. 194, 238 (2003) (Souter, J., dissenting).

19 GATES, supra note 4, at 90 (citing David K. Berninghausen, The History of the ALA Intellectual Freedom Committee, 27 WILSON LIBRARY BULLETIN 813 (1953)).

20 See AM. LIBRARY ASS’N, LIBRARY BILL OF RIGHTS, available at http://www.ala.org/ala/aboutala/offices/oif/statementspols/statementsif/librarybillrights.cfm and http://www.ala.org/ala/aboutala/offices/oif/statementspols/statementsif/librarybillofrights.pdf. The Library Bill of Rights was adopted in 1948. For a discussion of the evolution of the Library Bill of Rights, see also OFFICE FOR INTELLECTUAL FREEDOM OF THE AMERICAN LIBRARY ASS’N, INTELLECTUAL FREEDOM MANUAL 5-17 (5th ed., 1996).

21 GATES, supra note 4, at 87.

22 AM. LIBRARY ASS’N, LIBRARY BILL OF RIGHTS, supra note 20, at art. III. The Library Bill of Rights was adopted on June 18, 1948, and amended on Feb. 2, 1961 and on Jan. 23, 1980 by the ALA Council. See also RUBIN, supra note 1, at 160 (reprint of Library Bill of Rights).

23 See AM. LIBRARY ASS’N, LIBRARY BILL OF RIGHTS, supra note 20 at art. IV. See also RUBIN, supra note 1, at 160 (reprint of Library Bill of Rights).


In 1953, the American Library Association and the American Book Publishers Association jointly prepared and adopted the “Freedom to Read Statement,” which they have since revised several times.24 The statement stresses the importance of reading in a democratic society25 and states that librarians and publishers have the responsibility of protecting people’s freedom to read, providing access to a diversity of ideas, and opposing censorship.26 Library scholar Bruce Shuman said that a public library’s ongoing goal should be “the preservation of everybody’s right to choose,” and that librarians should not give in to community pressure to remove materials from the shelves.27

In addressing minors’ access to public library materials, the ALA’s Library Bill of Rights, with its provision on age reaffirmed in 1996, states that a person’s right to use a public library should not be denied or abridged because of age.28 Many librarians have argued passionately for intellectual freedom, which they interpret to mean that individuals in a pluralistic society have the freedom and right to think, read, view and experience what they want to, without interference from others.29

24 AM. LIBRARY ASS’N, FREEDOM TO READ STATEMENT, available at http://www.ala.org/ala/aboutala/offices/oif/statementspols/ftrstatement/freedomreadstatement.cfm and http://www.ala.org/ala/aboutala/offices/oif/statementspols/ftrstatement/freedomtoreadstatement.pdf. The Freedom to Read Statement was adopted on June 25, 1953 and revised on January 28, 1972, on January 16, 1991, on July 12, 2000, and on June 30, 2004, by the ALA Council and the AAP Freedom to Read Committee. See also RUBIN, supra note 1, at 159.

25 FREEDOM TO READ STATEMENT, supra note 24, which states, “The power of a democratic system to adapt to change is vastly strengthened by the freedom of its citizens to choose widely from among conflicting opinions offered freely to them.” See also RUBIN, supra note 1, at 159.

26 See FREEDOM TO READ STATEMENT, supra note 24. In part, the statement reads, “It is in the public interest for publishers and librarians to make available the widest diversity of views and expressions, including those that are unorthodox, unpopular, or considered dangerous by the majority.” See also RUBIN, supra note 1, at 161.

27 SHUMAN, supra note 3, at 122.

28 See LIBRARY BILL OF RIGHTS, supra note 20. The Library Bill of Rights was adopted on June 18, 1948, and amended on February 2, 1961, and on January 23, 1980, with the inclusion of “age” reaffirmed on January 23, 1996, by the ALA Council. Art. V reads: “A person’s right to use a library should not be denied or abridged because of origin, age, background, or views.”

29 See SHUMAN, supra note 3, at 120-21. See also MICHAEL GORMAN, OUR ENDURING VALUES 27 (2000).


The mission of the public library today, as it was in most of the twentieth century, is to support the educational, social and information needs of society, to promote self-education, and to satisfy the tastes of the popular culture, according to library scholar Richard Rubin.30 Gates has written that the function of the public library is to provide materials to meet its constituents’ needs for information, education, self-realization, recreation, and cultural growth.31 Gates added that the role of most public librarians is to assist patrons in carrying out their duties as citizens and community members by organizing and interpreting information for patrons and guiding them in the use of materials.32 Rubin, who also authored a key textbook on library and information science, said librarians tend to hold common values, including commitment to service, respect for truth and the search for truth (including protecting and defending many points of view), and tolerance for several perspectives on any given subject.33 Rubin stated that public libraries continue to serve as multi-purpose social institutions for both self-improvement and entertainment.34 Author W.J. Murison wrote that libraries are places where readers can “fulfil [sic] any personal, professional or political aspiration.”35 Rubin wrote that one role of public libraries is to provide “healthy entertainment,” which is carried out, in part, with the availability of fiction, romance and travel books.36

30 RUBIN, supra note 1, at 244.

31 GATES, supra note 4, at 146.

32 GATES, supra note 4, at 146.

33 RUBIN, supra note 1, at 248-61.

34 RUBIN, supra note 1, at 258. But see Boorstin, supra note 2, at 120, stating that librarians in the late 1800s and early 1900s disagreed as to whether public libraries existed for the purpose of instruction or entertainment or both.

35 W.J MURISON, THE PUBLIC LIBRARY 85 (3d ed. 1988).

36 RUBIN, supra note 1, at 258.


Libraries are central to political and social freedom, as well as intellectual freedom, according to library scholars Walt Crawford and Michael Gorman, who wrote, “A society without uncensored libraries is a society open to tyranny.”37 In analyzing the role of the public library, Crawford and Gorman stated, “Libraries exist to acquire, give access to, and safeguard carriers of knowledge and information in all forms and to provide instruction and assistance in the use of the collections to which their users have access.”38 Historian and Librarian of Congress Emeritus Daniel Boorstin said that “the great American library movement” was founded, in part, on individual autonomy, where librarians encourage patrons to help themselves in pursuing independence, self-governance, and specialized interests.39

Patrons using the Internet in public libraries have been able to find a variety of information online, including intellectual, political, social and entertainment materials.40 However, commentators disagree as to whether the use of Internet filters in public libraries is an acquisition decision or censorship. To understand their arguments, it is important to first understand the concept of acquisitions or collections.

The terms “acquisition,” “collection” and “selection,” which are used synonymously, refer to a library’s objectives and methods in choosing materials to include in its holdings. Typically, acquisition policies are general in nature. For example, the American Library Association’s Bill of Rights reads: “Books and other library resources should be provided for

37 See WALT CRAWFORD & MICHAEL GORMAN, FUTURE LIBRARIES: DREAMS, MADNESS AND REALITY 11 (1995).

38 Id. at 3.

39 See Boorstin, supra note 2, at 126, 130.

40 About 73% of public libraries provided Internet access in 1998. See Press Release, Am. Library Ass’n, New Report Shows More Libraries Connect to the Internet; Access Still Limited (Nov. 1998) available at http://bubl.ac.uk/archive/journals/alawon/v07n149.htm. From 2006 to 2008, the percent of libraries providing Internet access remained constant at 99%. See AM. LIBRARY ASS’N, THE STATE OF AMERICAN LIBRARIES 5 (April 2007); Bertot et al., supra note 1.


the interest, information, and enlightenment of all people of the community the library serves. Materials should not be excluded because of the origin, background, or views of those contributing to their creation.”41

To help public libraries develop an acquisition policy, the Public Library Association, a division of the ALA, provides links on its Web site to sample policies from a variety of public libraries in the United States.42 While the objectives of the acquisition policies typically are general, the policies all emphasize selecting books and other resources that reflect a variety of viewpoints and address patron interest. The lists of objectives also include other factors to consider, such as the cost of materials, the ease of acquiring the materials, and the maintenance of a balanced collection of traditional and contemporary works.43 For example, a Wisconsin sample policy reads:

The purpose of the ______ Public Library is to provide all individuals in the community with carefully selected books and other materials to aid the individual in the pursuit of education, information, research, pleasure, and the creative use of leisure time.44

A Louisiana library policy includes a paragraph on the principles of collection development:

Calcasieu Parish Public Library supports the individual's right to access ideas and information representing all points of view. To this end, the library welcomes and solicits patron suggestions, comments and ideas about the collection and its development. The Collection Development Coordinator and staff of the Library in making selections should do so in a manner based upon principle

41 FREEDOM TO READ STATEMENT, supra note 24.

42 See PUB. LIBRARY ASS’N, BEST PRACTICES IN PUBLIC LIBRARIES, available at http://www.ala.org/ala/shadows/pla/resources/bestpractices.cfm.

43 Id.

44 See SMALL LIBRARY COMMITTEE OF THE WISCONSIN ASSOCIATION OF PUBLIC LIBRARIANS, SAMPLE LIBRARY POLICIES FOR THE SMALL PUBLIC LIBRARY (2d ed., revised by David L. Polodna), available at http://www.owlsweb.info/L4L/policies/VIII.asp.

51

rather than personal opinion, reason rather than prejudice, and judgment rather than censorship.45

Librarians typically adhere to professional standards and local acquisition development policies when they purchase materials for the library’s collections.46 Librarians also use “selection aids,” such as bibliographies and review journals, to acquire materials that meet the library’s collection development criteria.47 Librarians sometimes delegate their selection decisions to third-party vendors, who acquire print and video resources for the library based on the library’s collection development criteria.48 A public library’s collection development criteria typically reflect the library’s evaluation of the material’s quality and patrons’ demand for material.49

In contrast, when librarians select Internet filtering software, they would not be able to base their choices on the library’s collection development policies because filtering software is proprietary.50 Librarians would not be able to determine if commercial filters met their needs because software companies do not disclose their standards.51 Therefore, librarians would have

45 CALCASIEU PARISH PUBLIC LIBRARY, COLLECTION DEVELOPMENT POLICY, available at http://www.calcasieu.lib.la.us/CollectionDevelopment.htm. The mission statement of the library reads as follows: “The Calcasieu Parish Public Library serves all the people who live in the parish with materials, information, and services through a network of branches that are conveniently located and easy to use. The library strives to help people make informed decisions, enjoy their free time, and continue learning all their lives. The Library Board and Staff are committed to providing high quality, cost effective, equitable service that meets the needs of all parish residents.”

46 Am. Library Ass’n v. United States, 201 F. Supp. 2d 401, 421 (E.D. Pa. 2002).

47 Id.

48 Id.

49 Id. at 462.

50 Id. at 462-64. See also Legislative Proposals to Protect Children from Inappropriate Materials on the Internet: Hearing on H.R. 3783, H.R. 774, H.R. 1180, H.R. 1964, H.R. 3177, and H.R. 3442 before the Subcomm. on Telecomm., Trade and Consumer Protection of the H. Comm. on Commerce, 105th Cong. 40 (Sept. 11, 1998) (statement of Jerry Berman, executive director for the Center for Democracy and Technology, referring to S.1619, the Internet School Filtering Act).

51 See Legislative Proposals to Protect Children from Inappropriate Materials on the Internet, supra note 50.

52

no way of knowing exactly which content has been blocked in any given category, as the district court noted in its opinion on the Children’s Internet Protection Act of 2000.52

Prior to the passage of the Children’s Internet Protection Act of 2000, library scholar Walt Crawford stated that almost all libraries did “filter” library materials by choosing some materials over others, which he said was a common selection practice.53 When a public library buys only 5% of what is published in a given year, it is considered selection and not censorship, according to Crawford.54 He has contended that the same approach should be used for Internet access, with librarians offering selective access to Internet materials. Crawford argued that Internet access is a finite resource because Internet bandwidth is limited,55 and libraries have a set number of computers connected to the Internet and have to pay for computers, Web servers and online storage.56 “Open access to the Internet and World Wide Web is almost the antithesis of selection,” Crawford wrote. “The Web and Internet are full of junk: bad ‘information,’ deliberate lies, and some freely available pornography.”57

52 Am. Library Ass’n v. United States, 201 F. Supp. 2d 401, 462-64 (E.D. Pa. 2002). For a discussion of Internet filtering technology, see Chapter 3. For a discussion of the court cases deciding the Children’s Internet Protection Act, see Chapter 7.

53 WALT CRAWFORD, BEING ANALOG 225 (1999).

54 Id. See also Brent VanNorman, Comment & Note, The Library Internet Filter: On the Computer or in the Child? 11 REGENT U. L. REV. 425, 430-32 (1998/1999).

55 Bandwidth is the total capacity (volume and speed) of a wire or the maximum rate at which data can flow between computers. Bandwidth also refers to the amount of data that can be transmitted in a fixed period of time, e.g., how fast data flows through the path that it travels to a computer. Bandwidth is usually measured in kilobits, megabits or gigabits per second. The amount of bandwidth a Web server requires depends on the applications that will be running on the Web server. Large files—and audio and video files—use more bandwidth than text files, thus slowing down the system for other users. An online communications path usually consists of a succession of links, each with its own bandwidth, which can be slower than others in the succession. PETER BUCKLEY & DUNCAN CLARK, THE ROUGH GUIDE TO THE INTERNET 18, 30, 316 (2008). See also ‘BANDWIDTH FAQ’ in University of Wisconsin-Eau Claire LTS Online Help Documentation, http://www.uwec.edu/Help/Internet/bandwidth.htm (last visited July 20, 2009); ‘BANDWIDTH’ 2008, in Encyclopedia Britannica Online (Chicago: Encyclopedia Britannica, Inc., 2008), http://www.search.eb.com.lp.hscl.ufl.edu/eb/article-280194 (last visited July 20, 2009).

56 CRAWFORD, supra note 53, at 225-26.

57 CRAWFORD, supra note 53, at 225.

53

The American Library Association distinguishes between “selection” and “censorship,” stating,

No library can make everything available, and selection decisions must be made. Selection is an inclusive process, where the library affirmatively seeks out materials which will serve its mission of providing a broad diversity of points of view and subject matter. By contrast, censorship is an exclusive process, by which individuals or institutions seek to deny access to or otherwise suppress ideas and information because they find those ideas offensive and do not want others to have access to them.58

Whether Internet filtering can be considered acquisition or censorship may depend on how the libraries initially set up their computers, according to Crawford. He argued that selection occurs when libraries have always used filtering mechanisms and thus have not removed content that was once available. In contrast, censorship occurs when libraries first have had full access to the Internet and then restrict that access through the use of filters.59 When material that was once available has been removed based on content rather than age or wear, that removal could be considered a form of censorship, Crawford stated.60 In other media, librarians do not selectively delete portions of the material once they choose to acquire it.61 For example, librarians would not go through past copies of local newspapers and blacken out advertisements for adult movies or provocative personal ads62 or selectively remove sections of microfiche, audio or video materials.

58 Am. Library Ass’n, Intellectual Freedom and Censorship Q & A, http://www.ala.org/ala/aboutala/offices/oif/basics/ifcensorshipqanda.cfm (last visited July 20, 2009).

59 CRAWFORD, supra note 53, at 226.

60 CRAWFORD, supra note 53, at 226.

61 Other media examples include books, magazines, CDs, DVDs, and audio and visual tapes.

62 CRAWFORD, supra note 53, at 227.

54

One law professor suggested that Internet filtering could be an effective tool for librarians, as long as filtering use was voluntary. In a 2001 article, two years before the U.S. Supreme Court upheld the CIPA,63 Bernard Bell wrote:

[L]ibraries could assert that Internet filters may further public libraries' mission of helping the readers negotiate a vast array of materials on subjects that vary greatly in quality. Librarians have traditionally assisted patrons in negotiating a wealth of materials on various subjects. . . .There is little reason, however, to prevent those who wish to access the Internet without filters from doing so if filters are justified only as a means for assisting patrons in winnowing material.64

Conversely, other commentators have argued that Internet filtering, if mandated by the government, is censorship, regardless of how the computers were originally set up.65 Some commentators have argued that the use of mandatory Internet filters in public libraries is a form of censorship that is both misguided and ineffective.66 Kiera Meehan argued that mandatory filtering constitutes censorship, stating,

Such censorship will severely hinder the primary function of public libraries: to provide free, public access to books, knowledge, and the Internet. . . .Forcing libraries to install filtering

63 United States v. Am. Library Ass’n, 539 U.S. 194 (2003) (upholding the CIPA, which requires libraries receiving E-rate and LSTA funding to implement an Internet use policy and install filters on all computers connected to the Internet).

64 Bernard Bell, Filth, Filtering, and the First Amendment: Rumination on Public Libraries’ Use of Internet Filtering Software, 53 FED. COMM. L.J. 191, 224-25 (2001).

65 See generally Richard J. Peltz, Use “the Filter You Were Born With”: The Unconstitutionality of Mandatory Internet Filtering for the Adult Patrons of Public Libraries, 77 WASH. L. REV. 397, 397-400 (2002); Barbara A. Sanchez, Note, United States v. American Library Association: The Choice Between Cash and Constitutional Rights, 38 AKRON L. REV. 463, 487-88 (2005). See also Junichi P. Semitsu, Note, Burning Cyberbooks in Public Libraries: Internet Filtering Software vs. The First Amendment, 52 STAN. L. REV. 509, 533-35 (2000); J. Adam Skaggs, Note, Burning the Library to Roast the Pig? Online Pornography and Internet Filtering in the Free Public Library, 68 BROOKLYN L. REV. 809, 809 (2003); Steven E. Merlis, Preserving Internet Expression While Protecting Our Children: Solutions Following Ashcroft v. ACLU, 4 NW. J. TECH. & INTELL. PROP. 117, 128 (2005), (arguing “when the government mandates filters in public settings (such as public libraries) significant censorship issues arise”); Kiera Meehan, Note, Installation of Internet Filters in Public Libraries: Protection of Children and Staff vs. the First Amendment, 12 B.U. PUB. INT. L.J. 483, 484, 503 (2003).

66 Skaggs, supra note 65, at 809.

55

software, by either government mandate or intimidation, impedes the library's right to determine what materials to acquire.67

The American Library Association views Internet filtering as censorship, rather than acquisition.68 In 1997, the ALA passed an anti-filtering resolution, stating that the use of filtering software to block constitutionally protected speech violates the Library Bill of Rights.69

The controversy over the use of mandatory filtering software in public libraries involves not only the mission of the public library and whether filtering constitutes censorship, but also how the public forum and right to receive information doctrines apply to the public library.

The Applicability of Public Forum and the Right to Receive Doctrines to Public Libraries

Although legal commentators disagree on the applicability of First Amendment public forum and right to receive information doctrines to public libraries,70 between 1992 and 2002

67 Meehan, supra note 65, at 484, 503.

68 At the top of its Web page on Internet filtering, which provides information on the CIPA and on Internet filtering options, the ALA posted the censorship provision from the Library Bill of Rights: “Libraries should challenge censorship in the fulfillment of their responsibility to provide information and enlightenment.” See Am. Library Ass’n, Filters and Filtering, http://www.ala.org/ala/aboutala/offices/oif/ifissues/filtersfiltering.cfm (last visited July 20, 2009).

69 AM. LIBRARY ASS’N, RESOLUTION ON THE USE OF FILTERING SOFTWARE IN LIBRARIES, available at http://www.ala.org/ala/aboutala/offices/oif/statementspols/ifresolutions/resolutionuse.cfm.

70 See Dawn C. Nunziato, The Death of the Public Forum in Cyberspace, 20 BERKELEY TECH. L.J. 1115, 1163 (2005) (stating that the public forum doctrine needs to be reexamined in light of new technology); Barbara A. Sanchez, supra note 65, at 486-87 (arguing that the Supreme Court has, in a variety of contexts, referred to a First Amendment right to receive information and ideas and Internet access in public libraries could reasonably be construed to be a public forum); Aaron Jacobson, Note, United States v. American Library Association: Software Filters, Free Speech, and the Shrinking Public Forum, 38 U.C. DAVIS L. REV. 1345, 1361 (2005) (arguing that Internet terminals in public libraries are a traditional public forum); Raizel Liebler, Institutions of Learning or Havens for Illegal Activities: How the Supreme court Views Libraries, 25 N. ILL. U. L. REV. 1, 74 (2004) (arguing that public libraries are the modern version of a traditional public forum); Larissa Piccardo, Note, Filtering the First Amendment: The Constitutionality of Internet Filters in Public Libraries Under the Children’s Internet Protection Act, 41 HOUS. L. REV. 1437, 1459 (2004) (arguing that public libraries and Internet access within those libraries are a public forum); Leah Wardak, Note, Internet Filters and the First Amendment: Public Libraries After United States v. Am. Library Association, 35 LOY. U. CHI. L.J. 657, 724-25 (2004) (arguing “The government took an affirmative step to designate Internet access as a public forum when it announced that the reason libraries provide Internet access is to give their patrons another medium for research.” Wardak did not elaborate further on public forum doctrine as it applied to Internet access); Bell, supra note 64, at 207 (arguing that the public library is at the very least a limited public forum for those desiring physical access to obtain information and might even be a traditional public forum for receiving information); Semitsu, supra note 65, at 533-35 (2000) (arguing that the Internet deserves limited public forum status); Mark S. Nadel, The First Amendment Limitations on the Use of Internet Filtering in

56

three federal courts held that public libraries are, at the very least, a limited public forum.71 The three courts also applied the right to receive information doctrine to public libraries. However, in 2003 a U.S. Supreme Court plurality upheld the Children’s Internet Protection Act, stating that Internet access in public libraries is not a traditional public forum or a designated public forum.72

To understand the Court’s rationale, it is important to explain how the Court developed the concept of public forum doctrine. In a series of cases decided between 1939 and 1992, the Supreme Court recognized three types of forums: a traditional public forum, a limited or designated public forum, and a nonpublic forum.73

A traditional public forum is one that historically has been open for public discourse and debate74 and does not require official government designation.75 Examples include streets, parks, sidewalks, and town squares.76 In a traditional public forum, the government may not prohibit all communicative activity and must subject any content-based speech restrictions to the strict

Public and School Libraries: What Content Can Libraries Exclude? 78 TEX. L. REV. 1117, 1132 (2000) (arguing public library buildings are a limited public forum, but library content—including online content—is a nonpublic forum); VanNorman, supra note 54, at 430-32 (arguing that a public library is not a traditional public forum because patrons cannot give speeches and arguing that the library is not a limited public forum because no expressive activities have been admitted).

71 Kreimer v. Bureau of Police, 958 F. 2d. 1242 (3d Cir. 1992); Mainstream Loudoun v. Bd. of Trustees of Loudoun County, 24 F. Supp. 2d. 552 (E.D. Va. 1998); Am. Library Ass’n v. United States, 201 F. Supp. 2d 401 (E.D. Pa. 2002). But see United States. v. Am. Library Ass’n, 539 U.S. 194, 205-06 (2003), in which the Supreme Court stated that Internet access in public libraries is neither a traditional public forum nor limited or designated public forum. For a discussion of public forum doctrine, see infra notes 73-90 and accompanying text.

72 See United States v. Am. Library Ass’n, 539 U.S. at 205-06 (2003). The Court emphasized that public forum doctrine historically has applied to speakers, rather than listeners or recipients of the communication. For a summary of the Supreme Court’s opinion on Internet access in public libraries, see supra notes 170-171 and accompanying text. For an analysis of the Supreme Court’s opinion, see Chapter 7.

73 See Hague v. CIO, 307 U.S. 496, 515 (1939); Carey v. Brown, 447 U.S. 455, 461 (1980); Widmar v. Vincent, 454 U.S. 263, 269, 270-71, 277 (1981); Perry Educ. Ass'n v. Perry Local Educators' Ass'n, 460 U.S. 37 (1983); Cornelius v. NAACP Legal Def. & Educ. Fund, 473 U.S. 788 (1985); Int’l Soc’y for Krishna Consciousness v. Lee, 505 U.S. 672 (1992).

74 Cornelius, 473 U.S. at 800 and Perry, 460 U.S. at 45.

75 See Hague, 307 U.S. at 515 and Perry, 460 U.S. at 45.

76 Hague, 307 U.S. at 515; Perry, 460 U.S. at 44-45; United States v. Am. Library Ass’n, 539 U.S. at 202.

57

scrutiny test—that is, the regulation must be necessary to serve a compelling state interest and it must be narrowly drawn to achieve that end.77 However, the government may enforce time/place/manner regulations that are content-neutral as long as the three-part intermediate scrutiny standard is met: the regulations must serve a “significant” government interest, be narrowly tailored to achieve that interest, and leave open ample alternative channels of communication.78 Under the time/place/manner test, a narrowly tailored regulation need not be the “least-restrictive or least-intrusive means” of furthering the government’s interest as long as the regulation promotes a substantial government interest that would not be achieved as effectively without the regulation.79

In the second type of forum, a limited or designated public forum, the government opens its public property for specific expressive activity or discussion of specific subjects80 or for use by certain groups, such as opening public university meeting rooms for registered student groups81 and municipal theaters for performers.82 The Supreme Court has held that in a limited

77 Perry, 460 U.S. at 45 (citing Carey v. Brown, 447 U.S. 455, 461 (1980)).

78 Perry, 460 U.S. at 45 (citing U.S. Postal Serv. v. Council of Greenburgh Civic Ass’ns, 453 U.S. 114, 132 (1981); Consol. Edison Co. v. Pub. Serv. Comm'n, 447 U.S. 530, 535-536 (1980); Grayned v. City of Rockford, 408 U.S. 104, at 115 (1972); Cantwell v. Connecticut, 310 U.S. 296 (1940); Schneider v. State, 308 U.S. 147 (1939)).

79 Kreimer v. Bureau of Police, 958 F.2d 1242, 1264 (1992) (citing Ward v. Rock Against Racism, 491 U.S. 781, 797 (1989)).

80 Perry, 460 U.S. at 45-46. See also City of Madison Joint Sch. Dist. v. Wisc. Employment Relations Comm’n, 429 U.S. 167, 175 (1976), in which the Supreme Court held that school board meetings were open to employees as well as the public, stating, “Where the State has opened a forum for direct citizen involvement, it is difficult to find justification for excluding teachers.”

81 Widmar v. Vincent, 454 U.S. 263, 269, 270-71, 277 (1981). The Supreme Court held that state regulation of speech must be content neutral and that a state university's refusal to grant a student religious group access to university facilities was an unjustifiable, content-based exclusion of religious speech, as the facilities were generally open to other student groups. According to the Court opinion, “[T]he campus of a public university, at least for its students, possesses many of the characteristics of a public forum.” Widmar, 454 U.S. at 269. See also Perry, 460 U.S. at 45-47.

82 Se. Promotions, Ltd. v. Conrad, 420 U.S. 546 (1975). In a per curiam opinion expressing the view of five members of the Supreme Court, the Court held that a city board had engaged in prior restraint by refusing to grant the use of community facilities to a theater group wanting to perform the musical “Hair.” The Court said the board

58

public forum, content-based restrictions also must meet the strict scrutiny standard, but content-neutral restrictions are permissible. In explaining the parameters of the limited public forum, the Court held that, “[a]lthough a State is not required to indefinitely retain the open character of the facility, as long as it does so it is bound by the same standards as apply in a traditional public forum. Reasonable time, place, and manner regulations are permissible, and a content-based prohibition must be narrowly drawn to effectuate a compelling state interest.”83

The third type of forum, a nonpublic forum,84 generally is not open for public use because the property has an intended primary purpose that is not consistent with public use85 and has not been designated a public forum by the government.86 Examples of a nonpublic forum include airport terminals,87 military bases,88 prisons and jails,89 and teacher mailboxes.90

Case Analysis—The Application of Public Forum and Right to Receive Information Doctrines to Public Libraries

In 1992, prior to Internet access in most public libraries, a federal appellate court applied public forum and the right to receive information doctrines to a municipal library. Richard

did not have constitutionally required procedural safeguards in place to decide on usage requests and had based its decision only on outside reports that the play involved nudity and was obscene. See also Perry, 460 U.S. at 45-47.

83 Perry, 460 U.S. at 46 (citing Widmar v. Vincent, 454 U.S. 263, 269-70 (1981)).

84 Int’l Soc’y for Krishna Consciousness v. Lee, 505 U.S. 672, 679 (1992). Chief Justice William Rehnquist, who wrote the majority opinion, used the term “nonpublic fora” to describe airport terminals. “Like the Court of Appeals, we conclude that the terminals are nonpublic fora and that the regulation reasonably limits solicitation.” Id. at 679. The International Society for Krishna Consciousness, a religious group, had invoked the First Amendment in challenging the law that prohibited solicitation of money and the distribution of literature inside airport terminals. The law allowed the solicitation of money and distribution of literature on sidewalks outside the airport terminals. Id. at 674-75.

85 Perry, 460 U.S. at 46.

86 Lee, 505 U.S. at 678-79.

87 Id. at 680-81.

88 Greer v. Spock, 424 U.S. 828, 838-39 (1976).

89 Adderley v. Florida, 385 U.S. 39, 41 (1966) and Jones v. N.C. Prisoners’ Labor Union, 433 U.S. 119 (1977).

90 Perry, 460 U.S. at 49.

59

Kreimer, a homeless man with poor hygiene, wanted access to the library to read. However, library patrons and staff said Kreimer’s body odor, staring, and loud talking disrupted other patrons and staff.91 In Kreimer v. Bureau of Police,92 the court held that the local government in Morristown and Morris Township, New Jersey, had made the library a limited public forum for the right to receive information, but not for personal expression, such as making speeches.93

In determining whether a venue is a limited public forum or a nonpublic forum, the Kreimer court considered three factors: 1) government intent in opening a non-traditional forum for expressive activity; 2) the extent of use granted by the government in allowing patrons access to the library; and 3) the nature of the forum and “its compatibility with expressive activity.”94

First, the court said the local government’s intent was to open the library during regularly scheduled hours to all members of the public for specific purposes—“for expressive activity,

91 Kreimer v. Bureau of Police, 958 F. 2d. 1242, 1247 (3d Cir. 1992).

92 Id.

93 Id. at 1256-63. The Kreimer court emphasized the right to receive doctrine when stating, “The recognition of a constitutional right protecting public access to information and ideas is simply the threshold of our analysis.” Id. at 1255. In addressing the hygiene issue, the U.S. Court of Appeals for the Third Circuit upheld the library’s policy requiring that patrons have non-offensive body odor. The court held that the policy was narrowly tailored to meet a significant interest because it prevented one patron from interfering with other patrons’ use of the library and promoted the library’s interest in keeping its facilities clean and attractive: “The Library's goal is served by its requirement that its patrons have non-offensive bodily hygiene, as this rule prohibits one patron from unreasonably interfering with other patrons' use and enjoyment of the Library; it further promotes the Library's interest in maintaining its facilities in a sanitary and attractive condition.” Id. at 1264. Other library policies mandated that patrons engage in activities associated with the use of a public library, such as reading, studying and using library materials, and prohibited loitering, food and drink, and noisy and boisterous activities. See also Kim R. Helper, Note, Kreimer v. Bureau of Police for Morristown: The Sterilization of the Local Library, 23 STETSON L. REV. 521 (1994).

94 Kreimer, 958 F. 2d. at 1259-62.

60

namely ‘the communication of the written word.’” The court said that the library could only exclude patrons after they violated the library’s rules.95

Second, in addressing the extent of use, the court noted that the government does not need to open a limited or designated public forum to the public at large, but rather can open it “to a specific class of people or for the discussion of certain subject matter.”96 In the Kreimer case, the court found that the library’s policies clearly state that the library is open to the public for specific purposes only—reading, studying, and using library materials. The local government does not need to grant permission to each person who wishes to enter the library, but the library can ask patrons to leave if they violate the rules.97

Third, in addressing the nature of the forum, the court stated that a public library is a limited public forum, even though some of its characteristics seem inconsistent with such a forum. The court said the purpose of a public library is to pursue knowledge through “reading, writing and quiet contemplation” and “the exercise of other oral and interactive First Amendment activities is antithetical to the nature of the Library.”98 However, the court reasoned that the library only needed to allow the public to exercise First Amendment rights that are consistent

95 Id. at 1259. The Third Circuit did not elaborate further on government intent, stating, “We could make a more definitive determination concerning the extent of use if, for example, the record included the Library's charter. However, the record is barren of this information, and we are bound to decide the case on the basis of the facts before the district court. We also point out that in its letter of July 14, 1989, to the ACLU-NJ, the Library indicated that access to it was granted ‘to all.’” Id. at 1260.

96 Id. at 1260. The court cited two Supreme Court decisions in reaching this conclusion: Widmar v. Vincent, 454 U.S. 263, 267 (1981), which held that a university is a designated public forum, but it does not need to make its facilities equally available to students and nonstudents or grant open access to campus grounds and buildings, and Perry Educ. Ass’n v. Perry Local Educators Ass’n, 460 U.S. at 45 (1983), which held that “[a] public forum may be created for a limited purpose such as use by certain groups . . . or for the discussion of certain subjects.”

97 Kreimer v. Bureau of Police, 958 F. 2d. 1242, 1260 (3d Cir. 1992).

98 Id. at 1260-61.

61

with the government’s intent in designating the library as a limited public forum and that are consistent with the nature and purpose of the library.99

In 1998, in Mainstream Loudoun v. Board of Trustees of Loudoun County,100 a federal district court judge in Virginia stated that a library board’s sexual harassment policy mandating Internet filtering on all computers was content-based and therefore subject to the strict scrutiny standard.101 Judge Leonie Brinkema, a former librarian, also held that library buildings and Internet access inside the libraries were limited public fora.102 Although not binding in other jurisdictions and presumably invalidated by the Supreme Court’s 2003 decision to uphold the Children’s Internet Protection Act,103 the two related Mainstream Loudoun cases104 were the first court cases that fully applied First Amendment principles to Internet access in public libraries. In the first case, decided in April 1998, Judge Brinkema denied the defendant library board’s request for dismissal, stating that material factual issues remained.105 In the second case, decided in November 1998, Judge Brinkema issued a summary judgment in favor of the

99 Id. at 1261.

100 Mainstream Loudoun v. Bd. of Trustees of Loudoun County, 24 F. Supp. 2d. 552 (E.D. Va. 1998).

101 Id. at 563.

102 Id.

103 United States v. Am. Library Ass’n, 539 U.S. 194 (2003).

104 Mainstream Loudoun v. Bd. of Trustees of Loudoun County, 2 F. Supp. 2d 783 (E.D. Va. 1998) and Mainstream Loudoun v. Bd. of Trustees of Loudoun County, 24 F. Supp. 2d. 552, 570 (E.D. Va. 1998).

105 Mainstream Loudoun, 2 F. Supp. 2d at 790. The material fact issues included the board’s justification for the policy, the Internet sites blocked by X-Stop, and the degree of the board’s knowledge and control over the Web sites that X-Stop blocks. The library board had moved to dismiss individual board members and to dismiss the case for failure to state a claim. Judge Brinkema granted the motion to dismiss individual board members as defendants. She also granted in part the motion to dismiss for failure to state a claim. In granting the motion, she said that the board was entitled to legislative immunity for adopting the policy but not for enforcing the policy. The library board contended it was immune from litigation under the Communications Decency Act of 1996, which granted absolute immunity to good faith users of filtering software. See Mainstream Loudoun, 24 F. Supp. 2d. at 561, in which the library board requested that the court reconsider its decision that the board was not immune from litigation. Judge Brinkema said that the CDA provided defendants with immunity from actions for damages, but did not grant immunity to defendants when plaintiffs sought declaratory and injunctive relief. Mainstream Loudoun, 24 F. Supp. 2d. at 561.

62

plaintiffs, made up of a local nonprofit organization, its members and individual county residents. The plaintiffs had sought a permanent injunction prohibiting the library board from mandating the use of Internet filters on all computers.106

In 1997, the Board of Trustees for Loudoun County Library adopted a “policy on sexual harassment” that stated that Internet access would be provided to library patrons subject to several restrictions. First, the policy stated that patrons would not be allowed to access e-mail, chat rooms, or pornography. Second, the policy required that “site-blocking software” be installed on all computers to block access to Web sites containing child pornography, obscenity, and “material deemed harmful to juveniles.”107 A Loudoun County nonprofit organization and a group of Loudoun County residents challenged the site-blocking provision of the sexual harassment policy.

The library installed X-Stop, a commercial filtering software product. Because the software was proprietary, librarians did not know the methods used to block Web sites or which sites were blocked. Patrons wishing to view a blocked Web site could submit a written request asking that the site be unblocked and stating the reason they wanted access to the specific site. A librarian would then review the Web site to determine if it should be unblocked under the library’s policy.108

106 See Mainstream Loudoun, 24 F. Supp. 2d. at 570. Judge Brinkema also issued a summary judgment in favor of the intervenor Web sites, who opposed the filtering policy. In addition, she issued a summary judgment in favor of the Mainstream Loudoun Library Board on an issue of standing.

107 Id. at 556. The policy also contained two other restrictions: 1) all library computers would be installed near and in full view of library staff; and 2) patrons would not be permitted to access pornography and, if they did so and refused to stop, the police may be called to intervene. Id. However, the plaintiffs did not challenge these provisions.

108 Id.

63

Judge Brinkema issued a permanent injunction, which prohibited the Loudoun County Library Board from implementing the filtering policy.109 In the first Loudoun case, the library board had argued that the First Amendment did not apply because the use of Internet filtering was an acquisition decision rather than a removal decision.110 It was not until the second Loudoun case that the board argued that libraries are nonpublic fora and therefore subject only to the intermediate scrutiny standard. Under the intermediate scrutiny standard, the policy only has to be reasonably related to an important government interest.111 The plaintiffs, who opposed the Internet filtering policy, argued that the implementation of the policy was a removal decision, similar to blacking out selected portions of a set of encyclopedias.112 The plaintiffs did not address public forum doctrine.

Judge Brinkema held that Internet access in public libraries is a limited public forum113 and the implementation of the filtering policy was a removal decision.114 In addressing the library board’s policy as a removal decision, she wrote,

The Internet therefore more closely resembles plaintiffs’ analogy of a collection of encyclopedias from which defendants have laboriously redacted portions deemed unfit for library patrons. As such, the Library Board’s action is more appropriately characterized as a removal decision.115

109 Id.

110 For a discussion of acquisition and removal concepts, see supra pp. 50 to 52.

111 Mainstream Loudoun, 24 F. Supp. 2d. at 561-62. See supra notes 73-90 and accompanying text for a discussion of public forum doctrine.

112 Mainstream Loudoun, 2 F. Supp. 2d at 793-95. For a discussion of acquisition policies, see supra pp. 50-52. For the American Library Association’s definition of censorship, see supra note 58 and accompanying text.

113 Mainstream Loudoun, 24 F. Supp. 2d. at 563.

114 Mainstream Loudoun 2 F. Supp. 2d at 794-95.

115 Id.

64

Brinkema held that the library board’s policy was subject to strict scrutiny analysis “[b]ecause the Policy at issue limits the receipt and communication of information through the Internet based on the content of that information.”116 In holding the library board’s policy to the strict scrutiny standard, Judge Brinkema stated that the policy would have to serve compelling government interests, be necessary to further those interests, and be narrowly drawn to achieve those interests.117 She stated that the policy restricted adults’ access to constitutionally protected speech because it required filters on all computers, even those used by adults.118 Judge Brinkema stated that less restrictive methods were available for enforcing a sexual harassment policy than mandating the installation of filtering software to block online obscenity, child pornography and material deemed harmful to minors. She suggested the use of privacy screens and the installation of filtering software on only some of the computers.119

In holding that the Loudoun County libraries were limited public fora, Brinkema applied the three factors the Kreimer court considered: government intent, the extent of use the government has allowed patrons, and the nature of the forum.120 First, Brinkema held that the government intended to create a public forum when it authorized its public library system and when it adopted and reaffirmed a resolution stating that its "primary objective . . . [is] that the people have access to all avenues of ideas" in a variety of media forms.121 In applying the intent factor to the Internet, Brinkema wrote that the government “intended to designate the

116 Mainstream Loudoun, 24 F. Supp. 2d at 561-62 (citing Perry Educ. Ass’n v. Perry Local Educators’ Ass’n, 460 U.S. 37, 45 (1983)).

117 Id. at 564-65.

118 Id. at 570.

119 Id. at 562, 567.

120 Id. at 562-63 (citing Kreimer v. Bureau of Police, 958 F. 2d. 1242, 1259-61 (3d Cir. 1992)).

121 Mainstream Loudoun, 24 F. Supp. 2d at 562-63.

65

Loudoun County libraries as public fora for the limited purposes of the expressive activities they provide, including the receipt and communication of information through the Internet. This includes both the right to provide information and the right to receive information.”122

Second, in analyzing the extent of use of the library that the government has allowed patrons, Brinkema said the government opened the library for use by the Loudoun County public at large and “has significantly limited its own discretion to restrict access, thus indicating that it has created a limited public forum.” Brinkema stated that the Loudoun Library Board of Trustees “has declared that ‘library access and use will not be restricted nor denied to anyone because of age, race, religion, origin, background or views.’”123

Third, in reviewing the nature of the forum, Brinkema said that the library explicitly offers expressive activity through its Internet access. “While the nature of the public library would clearly not be compatible with many forms of expressive activity, such as giving speeches or holding rallies, we find that it is compatible with the expressive activity at issue here, the receipt and communication of information through the Internet,” she wrote.124

In the Mainstream Loudoun ruling, Brinkema determined that the mandatory installation of Internet filtering in public libraries was not an acquisition decision, but rather was a removal decision.125 As a removal decision, mandatory filtering constituted a content-based decision because the filters prevented patrons from accessing material based on its subject matter, she said. Therefore, the strict scrutiny test applied to Internet filters, Brinkema said.126 She stated

122 Id. at 563.

123 Id.

124 Id.

125 Id. at 561.

126 Id. at 564-65.

66

that Internet access in public libraries constituted a designated or limited public forum because the right to receive information through the Internet was “an expressive activity.”127

Four years after the Loudoun decision, in 2002, the federal district court for the Eastern District of Pennsylvania struck down the Children’s Internet Protection Act.128 The CIPA mandated that public libraries and most schools receiving certain federal technology funds implement an Internet safety policy and install filtering software on all computers connected to the Internet to block visual depictions that are obscene, contain child pornography, or are deemed harmful to minors.129 In its opinion, the district court stated that the right to receive information and ideas is fundamental to a free society.130 In applying the right to receive information doctrine to libraries, the court wrote, “The right to receive information is vigorously enforced in the context of a public library.”131

The court vacillated between labeling Internet access in a public library as a limited (or designated) public forum and the modern version of a public forum,132 but spent more time likening online access to a traditional public forum.133 The court first determined that the proper forum for analysis was Internet access provided by libraries, rather than the entire collection of print and electronic resources.134 The court then stated that the government has created a

127 Id. at 563.

128 Am. Library Ass’n v. United States, 201 F. Supp. 2d 401 (E.D. Pa. 2002).

129 See Children’s Internet Protection Act, Pub. L. No. 106-554, 114 Stat. 2763, 2763A-335 (2000) (codified as amended at 20 U.S.C. § 9134(f), 47 U.S.C. § 254(h)(6)).

130 Am. Library Ass’n v. United States, 201 F. Supp. 2d at 451 (citing Stanley v. Georgia, 394 U.S. 557, 564 (1969)).

131 Id. at 466. For a complete discussion of the court’s opinion, see Chapter 7.

132 Id. at 409, 457, 466. The CIPA mandated that public libraries and most schools receiving certain types of federal technology funds implement an Internet safety policy and install Internet blocking technology on all computers.

133 Id. at 466-71.

134 Id. at 456.

67

designated public forum when it provides Internet access in a public library.135 The court also said that Internet access in public libraries “promotes First Amendment values in an analogous manner to traditional public fora, such as streets, parks and sidewalks, in which content-based restrictions on speech are always subject to strict scrutiny.”136 However, a U.S. Supreme Court plurality reversed the lower court’s decision, stating, “Internet access in public libraries is neither a ‘traditional’ nor a ‘designated’ public forum.”137

The U.S. Supreme Court has heard only three cases involving libraries, with each case decided roughly twenty years apart. The first case, decided in 1966, involved physical access to public libraries.138 The second case, decided in 1982, involved the removal of books in public school libraries.139 The third case, decided in 2003, involved Internet access in public libraries.140 All three cases resulted in plurality opinions.

In Brown v. Louisiana,141 decided in 1966 during the Civil Rights era, the Court overturned the breach-of-peace convictions of five African-American men who had staged a peaceful and silent protest against segregation in the Clinton, Louisiana

135 Id. (citing Mainstream Loudoun v. Bd. of Trustees of the Loudoun County Library, 24 F. Supp. 2d 552, 563 (E.D. Va. 1998)); cf. Kreimer v. Bureau of Police, 958 F.2d 1242, 1259 (3d Cir. 1992) (holding that a public library is a limited public forum).

136 Am. Library Ass’n v. United States, 201 F. Supp. 2d at 466. For a complete discussion of the court’s opinion, see Chapter 7.

137 United States v. Am. Library Ass’n, 539 U.S. 194, 205-06 (2003). The Court emphasized that public forum doctrine historically has applied to speakers, rather than listeners or recipients of the communication. For a summary of the Court’s decision, see infra notes 169-187. For a complete discussion of the Court’s opinion, see Chapter 7.

138 Brown v. Louisiana, 383 U.S. 131 (1966).

139 Bd. of Educ. v. Pico, 457 U.S. 853 (1982).

140 United States v. Am. Library Ass’n, 539 U.S. 194 (2003).

141 Brown, 383 U.S. at 131.

68

branch of the Audubon Regional Library.142 The five men stayed in the library for ten or fifteen minutes and refused to leave when librarians asked them to do so.143 African-Americans were not allowed to use the libraries or the red bookmobile that served whites.144 They were only allowed to use the blue bookmobile, which served “Negroes.”145

The Court issued four opinions: the plurality, authored by Justice Abe Fortas and joined by Chief Justice Earl Warren and Justice William Douglas; two separate concurrences by Justices William Brennan, who concurred in the judgment, and Byron White, who concurred in the result; and one dissent, authored by Justice Hugo Black and joined by Justices Tom Clark, John Marshall Harlan and Potter Stewart. In writing the plurality opinion, Justice Fortas stated that the convictions violated the men’s First and Fourteenth Amendment rights. He said the government may regulate the use of libraries, but it cannot do so in a discriminatory manner. In referring to the role of the public library, he wrote, “It is an unhappy circumstance that the locus of these events was a public library—a place dedicated to quiet, to knowledge, and to beauty.”146 Although Justice Fortas’ statement did not directly refer to the right to receive information doctrine, his reference to “knowledge” implies the right to receive information and ideas in public libraries.

The concurring opinions did not specifically address the mission or role of the public library, but Justice William Brennan’s concurrence emphasized the petitioners’ First Amendment rights of speech, petition and assembly.147 He briefly mentioned the concept of a forum when he

142 Id.

143 Id. at 137-38.

144 Id. at 137. The plurality opinion stated, “It is a permissible inference that no Negroes used the branch libraries.”

145 Id. At the time, segregated libraries were commonplace in the South. See Liebler, supra note 70, at 9.

146 Brown v. Louisiana, 383 U.S. 131, 142 (1966).

147 Id. at 146-47 (Brennan, J., concurring).

69

wrote, “Public buildings often provide a forum for more traditional forms of First Amendment activity, such as verbal expression.”148 Justice Byron White’s concurrence focused on the lack of equal protection under the Constitution. He said that it would be unlikely that whites would have been asked to leave after only ten or fifteen minutes.149

On behalf of the dissent, Justice Hugo Black wrote that the Constitution does not prevent state government from enacting laws prohibiting “sit-ins” or “stand-ins.” He indirectly referred to the public library’s role when he wrote, “Short of physical violence, petitioners could not have more completely upset the normal, quiet functioning of the . . . [l]ibrary.”150 However, Justice Black’s comments do not address the right to receive information in a public library. Outside of Justice Brennan’s brief reference to public buildings providing a forum,151 none of the opinions addressed public forum doctrine.

Two decades later, a Supreme Court plurality used the right to receive information doctrine in a 1982 case involving a public school library. In Board of Education v. Pico,152 the Court voted 5-4 in holding that a school board violated the First Amendment when it ordered the removal of books from a junior high school library and senior high school library. The school board described the books as "anti-American, anti-Christian, anti-[Semitic], and just plain filthy."153 The Court issued seven separate opinions: the plurality, two concurrences and four

148 Id. at 147.

149 Id. at 150-51 (White, J., concurring).

150 Id. at 163 (Black, J., dissenting).

151 Id. at 147 (Brennan, J., concurring).

152 Bd. of Educ. v. Pico, 457 U.S. 853 (1982).

153 Pico, 457 U.S. at 857. The nine books in the High School library were: Slaughter House Five, by Kurt Vonnegut, Jr.; The Naked Ape, by Desmond Morris; Down These Mean Streets, by Piri Thomas; Best Short Stories of Negro Writers, edited by Langston Hughes; Go Ask Alice, of anonymous authorship; Laughing Boy, by Oliver LaFarge; Black Boy, by Richard Wright; A Hero Ain't Nothin' But A Sandwich, by Alice Childress; and Soul On Ice, by Eldridge Cleaver. The book in the Junior High School library was A Reader for Writers, edited by Jerome Archer.

70

dissents. Justice William Brennan wrote the plurality opinion and was joined by Justices Thurgood Marshall and John Paul Stevens. Justice Harry Blackmun joined the plurality in part, concurred in the judgment and also wrote a concurring opinion.154 Justice Byron White, who concurred in the judgment, authored the second concurring opinion. Chief Justice Warren Burger, Justice William Rehnquist, Justice Lewis Powell, Jr., and Justice Sandra Day O’Connor each wrote separate dissenting opinions.

The plurality emphasized the right to receive information and ideas, stating that “the right to receive ideas follows ineluctably from the sender's First Amendment right to send them . . . and is a necessary predicate to the recipient's meaningful exercise of his own rights of speech, press, and political freedom.” In authoring the plurality opinion, Justice Brennan wrote, “[T]he special characteristics of the school library make that environment especially appropriate for the recognition of the First Amendment rights of students.”155 Justice Brennan also made a distinction between the school curriculum and books in the school library. He stated that school boards have the right to determine curriculum and have “a legitimate role to play” in determining school library content, but they cannot remove books because they disagree with the ideas in those books.156 Justice Brennan referred to public libraries and the right to receive ideas and information when he wrote, “A school library, no less than any other public library, is ‘a place dedicated to quiet, to knowledge, and to beauty’ (where) 'students must always remain free to

154 Pico, 457 U.S. at 855, 858.

155 Id. at 867.

156 Id. at 869-71.

71

inquire, to study and to evaluate, to gain new maturity and understanding.' The school library is the principal locus of such freedom.”157

Justice Brennan said that the school board could have followed the board’s previously established policy for reviewing books. Under the policy, the school superintendent would appoint a committee to study any objections to books and then make a recommendation.158 Brennan found it problematic that the school board ignored the advice of literary experts, librarians and teachers within the school system, and publications that rate books for junior and senior high school students.159 Justice Brennan also said the Pico case would have been “a very different case” if the school board “had employed established, regular, and facially unbiased procedures for the review of controversial materials,” but the board had not done so.160

The two concurring opinions, though agreeing with the judgment, focused on different aspects. Justice Byron White, who concurred in the judgment only, said he would have remanded the case so that the trial court could resolve factual issues, including the board’s reasoning for removing the books. White argued that the Supreme Court plurality went beyond the factual issue at hand to extensively discuss a constitutional issue when it was not necessary for the Court to do so. The plurality “seemed compelled to go further and issue a dissertation on the extent to which the First Amendment limits the discretion of the school board to remove books from the school library,” White wrote.161

157 Id. at 868-69 (citing Brown v. Louisiana, 383 U.S. 131, 142 (1966) and Keyishian v. Bd. of Regents, 385 U.S. 589, 603 (1967)).

158 Pico, 457 U.S. at 857, 874-75.

159 Id. at 874.

160 Id.

161 Id. at 883-84 (White, J., concurring).

72

Justice Harry Blackmun concurred in part with Justice Brennan’s opinion and concurred in part in the judgment. He emphasized the wrongful suppression of ideas rather than the right to receive information. He wrote,

The Court therefore made it clear that imposition of ‘ideological discipline’ was not a proper undertaking for school authorities . . . . As the plurality notes, it is difficult to see how a school board, consistent with the First Amendment, could refuse for political reasons to buy books written by Democrats or by Negroes, or books that are ‘anti-American’ in the broadest sense of that term. 162

The dissenting opinions emphasized the importance of local control of education and stated that local school officials should be able to make decisions about library holdings and curriculum.163 Chief Justice Warren Burger, who was joined in his dissent by Justices Lewis Powell, William Rehnquist and Sandra Day O’Connor, said that local officials and not federal judges or teenage students should administer schools. He argued that the First Amendment right to receive ideas does not carry with it the obligation for the government to provide access to these ideas.164

In a second dissenting opinion, Justice Lewis Powell, Jr. stated that school boards have a responsibility to parents and citizens of the district and therefore should determine educational policy. He said there is no First Amendment “right to receive ideas” in public schools, which he viewed as “a new constitutional right” stated by the plurality.165 Justice O’Connor, who also authored an individual dissent, said she did not agree with the school board’s action in removing all of the books, but added that the board, serving in its role as educator, is the proper

162 Id. at 877-78 (Blackmun, J., concurring).

163 Id. at 889-93 (Burger, C.J., dissenting), 909-10 (Rehnquist, J., dissenting), 921 (O’Connor, J., dissenting).

164 Pico, 457 U.S. at 885-88 (Burger, C.J., dissenting).

165 Id. at 893-95 (Powell, J., dissenting).

73

government entity to make such a decision. She said the board has the right to remove books from the school library as long as it does not prevent students from reading or discussing them.166

Justice William Rehnquist, who was joined in his dissent by Chief Justice Burger and Justice Powell, said the right to receive information in schools is not supported by Court precedent.167 He contrasted school libraries with public libraries, a distinction directly on point for this dissertation. Justice Rehnquist disagreed with the decision reached in Pico, but his statements supported the public library’s mission in providing access to a wide variety of information and ideas, the very information and ideas that filtering software limits. Rehnquist wrote,

Unlike university or public libraries, elementary and secondary school libraries are not designed for freewheeling inquiry; they are tailored, as the public school curriculum is tailored, to the teaching of basic skills and ideas.168

Nearly twenty years later, in 2003, the Supreme Court voted 6-3 to uphold the Children’s Internet Protection Act of 2000, reversing a district court’s decision.169 Because this case is discussed in Chapter 7, only a brief summary is included here. Five separate opinions were issued: the plurality, two concurrences, and two dissents. Chief Justice William Rehnquist, who authored the plurality opinion, stated that forum doctrine did not apply and wrote that “Internet

166 Id. at 921 (O’Connor, J., dissenting). O’Connor’s statement implied, but did not explicitly state, that students could obtain copies of the removed books elsewhere, and then read and discuss the books on school property.

167 Id. at 887-88 (Rehnquist, J., dissenting).

168 Id. at 915.

169 See United States v. Am. Library Ass’n, 539 U.S. 194 (2003). Chief Justice William Rehnquist wrote the plurality opinion and was joined by Justices Sandra Day O’Connor, Antonin Scalia and Clarence Thomas. Justices Stephen Breyer and Anthony Kennedy wrote separate concurring opinions. Justices John Paul Stevens and David Souter wrote separate dissents, with Justice Ruth Bader Ginsburg joining in Justice Souter’s dissent. For a detailed discussion of the case, see Chapter 7.

74

access in public libraries is neither a ‘traditional’ nor a ‘designated’ public forum.”170 Chief Justice Rehnquist said Internet access in public libraries is not a traditional public forum because it has not been “immemorially held in trust” for use by the public for communication. He also said that Internet access in public libraries is not a limited public forum because the government did not intentionally take action to designate it as such.171 The plurality opinion did not address the right to receive ideas and information doctrine and did not refer to Brown.172 Neither did the Chief Justice mention any of the opinions issued in Pico, including his own.173

The Court reversed the judgment of the federal district court,174 stating that enforcement of the CIPA did not violate the First Amendment175 and was within Congress’ spending power.176 The plurality also found that prior restraint doctrine does not extend to collection decisions made by public libraries.177 The Court plurality reasoned that a library's decision to use filtering software is not a restraint on private speech, but rather a collection decision, and public

170 See United States v. Am. Library Ass’n, 539 U.S. at 205-06. The Court emphasized that public forum doctrine historically has applied to speakers, rather than listeners or recipients of the communication. For a discussion of public forum doctrine, see supra notes 73-90 and accompanying text.

171 United States v. Am. Library Ass’n, 539 U.S. at 205-07.

172 Brown v. Louisiana, 383 U.S. 131 (1966). See United States v. Am. Library Ass’n, 539 U.S. at 198-214 (plurality opinion).

173 Bd. of Educ. v. Pico, 457 U.S. 853, 904-20 (1982) (Rehnquist, J., dissenting). See United States v. Am. Library Ass’n, 539 U.S. at 198-214 (plurality opinion).

174 United States v. Am. Library Ass’n, 539 U.S. at 194, rev’g Am. Library Ass’n v. United States, 201 F. Supp. 2d 401 (E.D. Pa. 2002). For an analysis of the district court and Supreme Court cases, see Chapter 7.

175 United States v. Am. Library Ass’n, 539 U.S. at 199.

176 Id. at 203, 205.

177 Id. at 209. For a discussion of acquisition policies, see pp. 50 to 52. For the American Library Association’s definition of censorship, see supra note 58 and accompanying text.


libraries are not required to add material to their collections just because the material is

constitutionally protected.178

In a concurring opinion, Justice Stephen Breyer agreed with the plurality that Internet

access in public libraries is not a public forum. He briefly mentioned the right to receive

doctrine, stating that the Court has long recognized the right to receive information and ideas.179

Justice Breyer stated that he would apply “heightened scrutiny,” rather than strict scrutiny or rational basis, in analyzing the CIPA.180 In a separate concurrence, Justice Anthony Kennedy

wrote that “there is little to this case” because the CIPA’s disabling provision would allow

libraries to disable filtering software and unblock specific Web sites for adults.181 Kennedy said

that although the CIPA is not unconstitutional on its face, an adult could challenge the statute on

an “as applied” basis if the filtering could not be removed in a timely manner.182

The justices writing the dissenting opinions did not address public forum doctrine, but

stated that the CIPA violated the First Amendment. Justice John Paul Stevens argued that the

CIPA constitutes “a significant prior restraint” because some libraries’ procedures would make it difficult for patrons to have the filtering disabled.183 Justice Stevens wrote that most of the

178 United States v. Am. Library Ass’n, 539 U.S. at 209.

179 Id. at 216-17 (Breyer, J., concurring).

180 Id. at 216-17. Breyer said that heightened scrutiny “supplements” strict scrutiny “with an approach that is more flexible but nonetheless provides the legislature with less than ordinary leeway in light of the fact that constitutionally protected speech is at issue.” Id. at 218. For an analysis of the plurality, concurring and dissenting opinions, see Chapter 7.

181 Id. at 214 (Kennedy, J., concurring).

182 Id. at 214-15.

183 Id. at 224 (Stevens, J., dissenting). Stevens wrote, “It is as though the statute required a significant part of every library's reading materials to be kept in unmarked, locked rooms or cabinets, which could be opened only in response to specific requests.” Id.


filtered content was “constitutionally protected speech,”184 and the Government could not suppress lawful speech as the means to suppress unlawful speech.185 Justice David Souter argued that mandatory filtering constitutes censorship because filtering technology blocks access to electronic resources that the library has already acquired.186 Souter wrote,

There is no good reason, then, to treat blocking of adult enquiry as anything different from the censorship it presumptively is. For this reason, I would hold in accordance with conventional strict scrutiny that a library's practice of blocking would violate an adult patron's First and Fourteenth Amendment right to be free of Internet censorship.187

Case Analysis—Historical Overview of the Right to Receive Ideas and Information

The U.S. Supreme Court had stated repeatedly that the First Amendment protects the right to receive ideas and information,188 although the Court did not address that right in 2003 in upholding the Children’s Internet Protection Act.189 The Court first clearly articulated a right to receive ideas and information during the 1960s, according to Professor William E. Lee.190

184 Id. at 220.

185 Id.

186 Id. at 237 (Souter, J., dissenting). Souter contrasted library acquisition policies with censorship. “In the instance of the Internet, what the library acquires is electronic access, and the choice to block is a choice to limit access that has already been acquired. Thus, deciding against buying a book means there is no book (unless a loan can be obtained), but blocking the Internet is merely blocking access purchased in its entirety and subject to unblocking if the librarian agrees. The proper analogy therefore is not to passing up a book that might have been bought; it is either to buying a book and then keeping it from adults lacking an acceptable ‘purpose,’ or to buying an encyclopedia and then cutting out pages with anything thought to be unsuitable for all adults,” Souter wrote. Id. at 237. Justice Ginsburg joined in Justice Souter’s dissent.

187 Id. at 242.

188 See Martin v. Struthers, 319 U.S. 141, 143 (1943); Lamont v. Postmaster Gen., 381 U.S. 301, 308 (1965) (Brennan, J., concurring); Griswold v. Connecticut, 381 U.S. 479, 482-83 (1965); Stanley v. Georgia, 394 U.S. 557, 564 (1969); Red Lion v. FCC, 395 U.S. 367, 390 (1969); Kleindienst v. Mandel, 408 U.S. 753, 762-63 (1972); Va. State Bd. of Pharmacy v. Va. Citizens Consumer Council, 425 U.S. 748 (1976); First Nat’l Bank of Boston v. Bellotti, 435 U.S. 765, 783 (1978); Bd. of Educ. v. Pico, 457 U.S. 853, 866-67 (1982).

189 United States v. Am. Library Ass’n, 539 U.S. 194 (2003).

190 William E. Lee, The Supreme Court and the Right to Receive Expression, 1987 SUP. CT. REV. 303, 303 (1987).


However, the Court directly referred to the right to receive ideas in two cases decided in the

1940s, as well as in subsequent cases.

In 1943, the Supreme Court held that the First Amendment protects the right to both

distribute and receive information.191 In a 5-4 vote in Martin v. Struthers,192 the Court

invalidated a municipal ordinance that prohibited persons from knocking on doors or ringing

doorbells to distribute handbills, circulars or advertisements.193 A Jehovah’s Witness, who went

door to door to distribute leaflets advertising a religious meeting, was convicted and fined for

violating the ordinance.194 Justice Hugo Black wrote the majority opinion on behalf of five

members of the Court and stated the ordinance violated the First Amendment. He specifically addressed the right to receive ideas and information when he wrote:

The authors of the First Amendment knew that novel and unconventional ideas might disturb the complacent, but they chose to encourage a freedom which they believed essential if vigorous enlightenment was ever to triumph over slothful ignorance. This freedom embraces the right to distribute literature, and necessarily protects the right to receive it.195

Justice Felix Frankfurter dissented, stating that homes are “sanctuaries from intrusions

upon privacy,” and the due process clause of the Fourteenth Amendment does not prevent cities

from passing ordinances that preserve the privacy of the home.196 Justice Stanley Reed also

dissented and was joined by Justices Owen Roberts and Robert Jackson. Justice Reed argued that

the ordinance did not violate the First Amendment. “No ideas are being suppressed. No

191 Martin, 319 U.S. at 143.

192 Id. at 141.

193 Id. at 143, 149.

194 Id. at 142.

195 Id. at 143.

196 Id. at 152-53 (Frankfurter, J., dissenting).


censorship is involved. The freedom to teach or preach by word or book is unabridged, save only

the right to call a householder to the door of his house to receive the summoner's message,” Reed

wrote.197

In 1945, in Thomas v. Collins,198 the Court again upheld the right to receive expression. In

another 5-4 vote, the Court struck down as unconstitutional a Texas statute requiring union organizers to register

and obtain an organizer’s card before soliciting members.199 A union leader attempted to recruit

members at a rally at the city hall, despite a court order issued earlier in the day prohibiting him from doing so unless he registered as the statute required.200 A Texas court found the organizer in contempt and ordered

him fined and imprisoned.201 The Supreme Court reversed the lower court’s judgment, stating

that the state law interfered with the rights of free speech and free assembly. In authoring the

majority opinion, Justice Wiley Rutledge referred to the right to receive ideas and information

when he wrote: “That there was restriction upon Thomas’ (the organizer’s) right to speak and

the rights of the workers to hear what he had to say, there can be no doubt.”202 Justice Stanley

Reed wrote the dissenting opinion and was joined by Chief Justice Harlan Stone and Justices

Felix Frankfurter and Owen Roberts. Justice Reed stated that the statute did not violate the First

Amendment because the law did not prevent union organizers from speaking. Justice Reed

argued that the labor union was a business association and the state could mandate that union

197 Id. at 154-55 (Reed, J., dissenting).

198 Thomas v. Collins, 323 U.S. 516 (1945).

199 Id. at 534, 543.

200 Id. at 520-24.

201 Id. at 524-25.

202 Id. at 534.


organizers identify themselves as paid solicitors because the organizers were engaging in a business transaction.203

During the 1960s, the Supreme Court affirmed the First Amendment right to receive ideas and information in four key cases. In a unanimous decision in Lamont v. Postmaster

General204 in May 1965, the Court held unconstitutional a federal statute mandating that

“communist propaganda” mailed from abroad be held at a post office until the addressee requested it.205 The Postal Service and Federal Employees Salary Act of 1962 required that the recipients of communist literature return a reply card before the material could be delivered.

Several individuals who received these postcards sued to prohibit the enforcement of the statute.206 Justice William Douglas, writing on behalf of the eight members of the Court hearing the case,207 stated that the law violated First Amendment rights. He wrote,

The addressee carries an affirmative obligation which we do not think the Government may impose on him. This requirement is almost certain to have a deterrent effect . . . . [A]ny addressee is likely to feel some inhibition in sending for literature which federal officials have condemned as "communist political propaganda."208

In a concurring opinion, Justice William Brennan emphasized the right to receive ideas, stating, “[T]he right to receive publications is such a fundamental right. The dissemination of

203 Id. at 551-57. (Reed, J., dissenting). Reed wrote, “[T]he Act and the injunction which he (Thomas) disobeyed say nothing of speech; they are aimed at a transaction—that of solicitation of members for a union.” Id. at 551-52.

204 Lamont v. Postmaster Gen., 381 U.S. 301 (1965).

205 Id. at 301.

206 Id. at 304-05.

207 Justice Byron White took no part in the consideration or decision of the two consolidated cases.

208 Lamont, 381 U.S. at 307.


ideas can accomplish nothing if otherwise willing addressees are not free to receive and consider

them.”209

Two weeks later, in Griswold v. Connecticut,210 the Court voted 7-2 to strike down a

Connecticut law that criminalized the use of contraceptives, as well as the distribution of

information on contraception.211 The Court reversed the convictions of the executive director of

Planned Parenthood and its licensed physician,212 who were found to be in violation of the law

by providing contraception information and advice to married persons.213 In authoring the

majority opinion, Justice William Douglas wrote that the statute violated married persons’ privacy rights214 and the First Amendment.215 Justice Douglas addressed the right to receive

ideas and information when he wrote, “The right of freedom of speech and press includes not

only the right to utter or to print, but the right to distribute, the right to receive, the right to read

and freedom of inquiry, freedom of thought, and freedom to teach.”216

Justice Arthur Goldberg wrote a concurring opinion and was joined by Chief Justice Earl

Warren and Justice William Brennan. Justice Goldberg said the Fourteenth Amendment protects

fundamental personal rights and the concept of liberty is not confined to the specific terms

209 Id. at 308 (Brennan, J., concurring).

210 Griswold v. Connecticut, 381 U.S. 479 (1965).

211 Griswold, 381 U.S. at 482-83. The statutes struck down were §§ 53-32 and 54-196 of the General Statutes of Connecticut (1958 rev.). Section 53-32 states: "Any person who uses any drug, medicinal article or instrument for the purpose of preventing conception shall be fined not less than fifty dollars or imprisoned not less than sixty days nor more than one year or be both fined and imprisoned." Section 54-196 states: "Any person who assists, abets, counsels, causes, hires or commands another to commit any offense may be prosecuted and punished as if he were the principal offender.”

212 Griswold, 381 U.S. at 486.

213 Id. at 480.

214 Id. at 485.

215 Id. at 482.

216 Id. (citing Martin v. Struthers, 319 U.S. 141, 143 (1943) and Wieman v. Updegraff, 344 U.S. 183, 195 (1952)).


contained in the Bill of Rights.217 Justice Byron White also concurred, stating that the statute, as applied to married couples, deprives those couples of liberty without due process of law.218

Justice John Marshall Harlan concurred in the judgment, but did not agree with the majority’s reasoning. He said the proper inquiry under the Due Process Clause of the Fourteenth Amendment was not whether the statute violated “some right assured by the letter or penumbra of the Bill of Rights.” He agreed that the appellants’ convictions should be reversed, but on the ground that the statute “violates basic values ‘implicit in the concept of ordered liberty.’”219

Justices Potter Stewart and Hugo Black wrote dissenting opinions and also joined in each other’s dissents. Justice Stewart said he could find nothing in the Connecticut statute that violated the federal Constitution.220 Justice Black also stated that the statute did not violate any

provision of the Constitution.221 Justice Black specifically emphasized that there is no

constitutional right of privacy.222

In Stanley v. Georgia,223 a 1969 obscenity case, the Supreme Court voted 9-0 to

reverse the conviction of a man for possession of obscene material within the privacy of his

home.224 Justice Thurgood Marshall, who authored the Court’s opinion, wrote, “It is now well established that the Constitution protects the right to receive information and ideas . . . (and) this right to receive information and ideas, regardless of their social worth, is fundamental to our free

217 Id. at 486-87 (Goldberg, J., concurring).

218 Id. at 502 (White, J., concurring).

219 Id. at 500 (Harlan, J., concurring) (quoting Palko v. Connecticut, 302 U.S. 319, 325 (1937)).

220 Id. at 527-30 (Stewart, J., dissenting).

221 Id. at 520-21 (Black, J., dissenting).

222 Id. at 508.

223 Stanley v. Georgia, 394 U.S. 557 (1969).

224 Id. at 557.


society.”225 In addressing the First Amendment, Justice Marshall said that freedom of speech and of the press protects the right to receive ideas and information.226 In a concurring opinion, Justice

Hugo Black wrote that he agreed with the Court “that the mere possession of reading matter or movie films, whether labeled obscene or not, cannot be made a crime by a state without violating the First Amendment.”227 Justice Potter Stewart also wrote a concurring opinion and was joined in his concurrence by Justices William Brennan and Byron White. Justice Stewart said he agreed with the result, but would reverse the conviction because the seizure of the films violated the

Fourth Amendment, and therefore the films were not admissible as evidence at trial.228 He said the majority, in its haste to address “new constitutional frontiers,” disregarded the Fourth

Amendment’s protection against search and seizure.229

In Red Lion Broadcasting v. FCC,230 a 1969 case involving broadcast restrictions, the

Supreme Court again supported the right to receive information when it upheld, with a unanimous vote, the personal attack rule, a corollary to the Fairness Doctrine.231 The personal

225 Id. at 564.

226 Id.

227 Id. at 568-69 (Black, J., concurring).

228 Id. at 572 (Stewart, J., concurring). The Fourth Amendment protects individuals from unreasonable searches and seizures and reads: “The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.” U.S. CONST. AMEND. IV. The Fourteenth Amendment makes the First Amendment applicable to the states. U.S. CONST. AMEND. XIV.

229 Stanley, 394 U.S. at 569 (Stewart, J., concurring).

230 Red Lion Broad. v. FCC, 395 U.S. 367 (1969).

231 In 1949, the Federal Communications Commission (FCC) imposed a "fairness doctrine" on radio and television broadcasters. See In re Editorializing by Broadcast Licensees, 13 F.C.C. 1246 (1949). The fairness doctrine required broadcasters to provide coverage of important controversial issues in the community and to provide an opportunity for the presentation of contrasting viewpoints on those controversial issues. Id. See also Report Concerning General Fairness Doctrine Obligations of Broadcast Licensees, 102 F.C.C. 2d 143, 146 (1985). The FCC repealed the fairness doctrine in 1987, stating that it violated the First Amendment and did not serve the public interest. 2 FCC Rcd 5043, 5057 (1987). In 1989, the U.S. Court of Appeals for the D.C. Circuit upheld the FCC’s decision to repeal the fairness doctrine. See Syracuse Peace Council v. FCC, 867 F. 2d 654 (D.C. Cir. 1989), cert. denied, 493 U.S. 1019 (1990). Although the FCC eliminated the Fairness Doctrine in 1987, the FCC did not eliminate two corollaries of the fairness doctrine: the personal attack rule and the political editorial rule, which the FCC adopted in 1967. See Personal Attacks and Political Editorials, 8 FCC 2d at 722, 725 (1967); see also Radio-Television News Directors Ass’n v. FCC, 229 F.3d 269, 269 (D.C. Cir. 2000). The personal attack rule required broadcasters to offer reply time to a person whose honesty, character or integrity was attacked during the discussion of a controversial public issue. The political editorial rule required broadcasters to offer reply time to a candidate when a station’s editorial opposed that candidate or supported a competing legally qualified candidate. In 2000, the Court of Appeals for the D.C. Circuit ordered the FCC to abolish the personal attack and political editorializing rules. See Radio-Television News Directors Ass’n, 229 F.3d at 271 (stating that the FCC had not provided “adequate justification” for the personal attack and political editorializing rules).


attack rule required broadcasters to offer free reply time to a person whose honesty, character, or

integrity was attacked during the discussion of a controversial issue.232 The Red Lion

controversy began during a religious show aired on a Pennsylvania radio station owned by Red

Lion Broadcasting when the host of the show stated that a book author had been fired from a newspaper. The host said the author had been fired for making false charges against

government officials, had worked for a Communist-affiliated publication, and had written a book

to “smear and destroy” a political candidate.233 The book author demanded that the radio station provide him with free reply time because he had been personally attacked, but the station refused.234 Justice Byron White, expressing the view of the members of the U.S. Supreme

Court who heard the case, held that the personal attack rule was authorized by Congress and did not violate the First Amendment freedoms of speech and press.235 White emphasized the right to

receive ideas and information when he wrote: “It is the right of the viewers and listeners, not the

right of the broadcasters, which is paramount.”236 However, White’s language came in the

context of a federal licensing scheme for all radio and television broadcasters. In other words,


232 Personal Attacks and Political Editorials, 8 FCC 2d at 725 (1967).

233 Red Lion Broad. v. FCC, 395 U.S. 367, 371 (1969).

234 Id. at 371-72.

235 Id. at 375. Justice William O. Douglas did not hear oral arguments and did not participate in the decision.

236 Id. at 390.


the right to receive information, in this case, was based on the fact that Congress, in the Communications Act, had defined broadcasters as fiduciaries of the public. Congress had emphasized the importance of broadcasters providing programming “in the public interest.”237 Therefore, for the

Court, the right to receive information in this case was built into a statute. This became a First

Amendment issue when the broadcasters claimed federal regulations infringed on their First

Amendment programming rights.238 Ordinarily, the Court would have treated this kind of program regulation as content control and therefore subject to strict scrutiny.239 However, the Court in Red Lion examined only the reasonableness of this federal regulation interpreting the "public interest" standard, consistent with the courts' tradition of examining broadcast content regulation.240

In 1976, the Supreme Court for the first time applied the right to receive information to purely commercial speech in Virginia State Bd. of Pharmacy v. Virginia Citizens Consumer

Council.241 A Virginia resident and two nonprofit organizations challenged a state statute that prohibited pharmacists from advertising prescription drug prices.242 Justice Harry Blackmun authored the 7-1 majority opinion,243 stating that advertising deserves First Amendment protection. He said the First Amendment protected consumers’ interest in the free flow of

237 Id. at 373, 379-80. See also 47 USC § 307 (a).

238 Red Lion, 395 U.S. at 386.

239 For a content-based restriction to pass the strict scrutiny test, the restriction “must be narrowly tailored to serve a compelling Government interest” and must be the “least restrictive alternative.” United States v. Playboy Entm’t Group, 529 U.S. 803, 813 (2000).

240 See Trinity Methodist Church, South v. FRC, 62 F.2d 850, cert. denied, 288 U.S. 599 (1933); Nat’l Broad. Co. v. United States, 319 U.S. 190, 216-217 (1943).

241 Va. State Bd. of Pharmacy v. Va. Citizens Consumer Council, 425 U.S. 748 (1976).

242 Id. at 749-50.

243 Justice John Paul Stevens did not take part in the consideration or decision of the case.


information, as such information was indispensable to well-informed private economic decisions.244 Justice Blackmun wrote, “This Court has referred to a First Amendment right to

‘receive information and ideas’ . . . . If there is a right to advertise, there is a reciprocal right to receive the advertising.”245

Chief Justice Warren Burger and Justice Potter Stewart wrote concurring opinions, while

Justice William Rehnquist wrote a dissenting opinion. Chief Justice Burger said that he agreed with the Court’s decision because the Court dealt specifically with the state’s power to prohibit pharmacists from advertising drug prices; however, he said different factors would apply if the

Court were evaluating laws prohibiting lawyers and doctors from advertising.246 Justice Potter

Stewart said he agreed with the Court’s decision since it did not prevent the state or federal government from regulating false or deceptive advertising.247

Justice William Rehnquist dissented, stating that the Court had not followed precedent.

He wrote,

[The Court’s opinion] extends standing to raise First Amendment claims beyond the previous decisions of this Court. It also extends the protection of that Amendment to purely commercial endeavors which its most vigorous champions on this Court had thought to be beyond its pale.248

244 Va. State Bd. of Pharmacy, 425 U.S. at 765.

245 Id. at 757 (citing Lamont v. Postmaster Gen., 381 U.S. 301 (upholding the rights of citizens to receive political mail from abroad)). In Virginia State Board of Pharmacy, Justice Blackmun wrote that there is a First Amendment right to "receive information and ideas," and that freedom of speech “‘necessarily protects the right to receive.’” Id. (citing Martin v. City of Struthers, 319 U.S. 141, 143 (1943) and Stanley v. Georgia, 394 U.S. 557, 564 (1969)).

246 Id. at 775. Chief Justice Burger wrote, “I doubt that we know enough about evaluating the quality of medical and legal services to know which claims of superiority are ‘misleading’ and which are justifiable. Nor am I sure that even advertising the price of certain professional services is not inherently misleading, since what the professional must do will vary greatly in individual cases. It is important to note that the Court wisely leaves these issues to another day.” Id. (Burger, C.J., concurring).

247 Id. at 776-78 (Stewart, J., concurring).

248 Id. at 781 (Rehnquist, J., dissenting).


Rehnquist further argued that the plaintiffs were not asserting their right to receive information, but rather were asserting the right of a third party to publish information. He said the statute did not prohibit anyone from receiving information about prescription drug prices in person or by phone.249

In addition to applying the right to receive information and ideas to commercial speech, the Supreme Court also has recognized a recipient’s right to receive corporate political communications.250 In 1940, in Thornhill v. Alabama,251 the Court, in an 8-1 decision, reversed an Alabama appellate court ruling that had upheld a state statute prohibiting loitering and picketing on or near business properties.252 Justice Frank Murphy, who authored the majority opinion, wrote, “The freedom of speech and of the press guaranteed by the Constitution embraces at the least the liberty to discuss publicly and truthfully all matters of public concern without previous restraint or fear of subsequent punishment.”253 Murphy said that free speech

“suppl[ies] the public need for information and education with respect to the significant issues of the times.”254 Justice James McReynolds dissented, stating that the lower court’s judgment should be affirmed.255 However, McReynolds did not write a dissenting opinion.

In 1978, in First National Bank of Boston v. Bellotti,256 the Supreme Court, on a 5-4 vote, struck down a state statute that prohibited nonpress corporations from using corporate funds to

249 Id. at 782.

250 William E. Lee, The First Amendment Doctrine of Overbreadth, 71 WASH. U. L. Q. 637, 676-77 (1993).

251 310 U.S. 88 (1940).

252 Id. at 91-92.

253 Id. at 101-02.

254 Id.

255 Id. at 106.

256 435 U.S. 765 (1978).


support or oppose referenda issues unless those issues directly related to a corporation’s

business.257 Justice Lewis Powell, Jr., who was joined by Chief Justice Warren Burger and

Justices Potter Stewart, Harry Blackmun and John Paul Stevens, authored the Court’s opinion.

Powell supported the listener’s right to receive information when he wrote, “[T]he Court's

decisions involving corporations in the business of communication or entertainment are based

not only on the role of the First Amendment in fostering individual self-expression but also on its

role in affording the public access to discussion, debate, and the dissemination of information

and ideas.”258 Powell emphasized the importance of citizens’ access to information in order for

self-governance to work in a democracy. The government is prohibited from restricting the

public’s access to information “lest the people lose their ability to govern themselves,” he

said.259

Chief Justice Warren Burger concurred in the opinion, stating that “the First Amendment does not ‘belong’ to any definable category of persons or entities: It belongs to all who exercise its freedoms.”260

Justice Byron White authored one of two dissenting opinions and was joined by Justices

William Brennan and Thurgood Marshall. White said that one function of the “self-expression”

value of the First Amendment was to protect the “right to hear or receive information.”261

However, White argued that the right to receive information that is financed by corporate

257 Id. at 767-68.

258 Id. at 783.

259 Id. at 792.

260 Id. at 802 (Burger, C.J., concurring).

261 Id. at 806-07 (White, J., dissenting).


funding is not “of the same dimension as that to hear other forms of expression.”262 “Such

expenditures may be viewed as seriously threatening the role of the First Amendment as a

guarantor of a free marketplace of ideas,” he wrote.263 White said that measures are needed to prevent corporations from dominating the political process.264

Justice William Rehnquist authored a separate dissenting opinion,265 stating that the

statute did not violate the First or Fourteenth Amendments to the U.S. Constitution.266

Corporations are not entitled to “all the rights of free expression enjoyed by natural persons,”

Rehnquist said.267 “The free flow of information is in no way diminished by the

Commonwealth's decision to permit the operation of business corporations with limited rights of political expression,” he wrote.268

In 1980, in Consolidated Edison of New York v. Public Service Commission of New

York,269 the Supreme Court, in a 7-2 vote, struck down a state statute that prohibited the utility

company from including political brochures in its monthly bills.270 In reversing the New York

Court of Appeals ruling, the Supreme Court said that the ban on political messages was an

unconstitutional content-based regulation that violated the First and Fourteenth Amendments.271

262 Id. at 807.

263 Id. at 810.

264 Id. at 811-12.

265 Id. at 822 (Rehnquist, J., dissenting).

266 Id. at 827-28.

267 Id. at 825.

268 Id. at 828.

269 447 U.S. 530 (1980).

270 Id. at 544.

271 Id. at 533-36.


Justice Powell, writing for the Court, referred to the right to receive information when he wrote,

“Freedom of speech is ‘indispensable to the discovery and spread of political truth,’272 and ‘the

best test of truth is the power of the thought to get itself accepted in the competition of the

market.’”273 Powell said that "the First Amendment means that government has no power to

restrict expression because of its message, its ideas, its subject matter, or its content."274

Justice Thurgood Marshall concurred with the opinion, but stated that the Court did not

address the issue of whether the Public Service Commission could exclude the costs of the bill

inserts from the utility company’s rate base or the appropriateness of any allocation of those costs that the Commission might make.275

Justice John Paul Stevens concurred in the judgment, stating that the Public Service

Commission’s “censorial regulation . . . was motivated by nothing more than a desire to curtail

expression of a particular point of view on controversial issues of general interest” and therefore violated the First Amendment.276

Justice Harry Blackmun, joined by Justice William Rehnquist, dissented, stating that the

regulation did not violate the First and Fourteenth Amendments. The state was “legitimately

concerned with preventing the utility from taking advantage of this monopoly power to force

consumers to subsidize dissemination of its viewpoint on political issues,” Blackmun wrote.277

272 Id. at 534 (quoting Whitney v. California, 274 U.S. 357, 375 (1927) (Brandeis, J., concurring).

273 Id. (quoting Abrams v. United States, 250 U.S. 616, 630 (1919) (Holmes, J., dissenting).

274 Id. at 537 (quoting Police Dept. v. Mosley, 408 U.S. 92, 95 (1972)).

275 Id. at 544 (Marshall, J., concurring). For example, as Justice Blackmun wrote in his dissent, the Public Service Commission could “charge the utility's shareholders all the costs of the envelopes and postage and of creating and maintaining the mailing list, and charge the consumers only the cost of printing and inserting the bill and the consumer service insert.” Id. at 556 (Blackmun, J., dissenting).

276 Id. at 546-47 (Stevens, J., concurring).

277 Id. at 553 (Blackmun, J., dissenting).


He said that the state was attempting to protect ratepayers from unwillingly financing the utility

company’s speech and to preserve the billing envelope “for the sole benefit of the customers who

pay for it.”278

Six years later, in 1986, in Pacific Gas & Electric Co. v. Public Utilities Commission,279 the

Supreme Court, in a 5-3 vote,280 held that a California utility company could not be forced to

include a consumer group’s newsletter in the company’s billing envelopes.281 The Public Utilities

Commission of California had ruled that a monopoly must carry its critics’ messages.282 Justice

Lewis Powell, Jr. authored the plurality opinion and was joined by Chief Justice Warren Burger

and Justices William Brennan and Sandra Day O’Connor. Powell wrote that the Commission’s

order “impermissibly burdens” the utility company’s First Amendment rights because the order

“forces” the company “to associate with the views of other speakers, and because it selects the

other speakers on the basis of their viewpoints.”283 Powell compared Pacific Gas to Bellotti284 and Consolidated Edison,285 writing that “the critical considerations were that the State sought to

abridge speech that the First Amendment is designed to protect, and that such prohibitions limited the range of information and ideas to which the public is exposed.”286 Powell again

referred to the public’s right to receive information and ideas, stating that, “By protecting those

278 Id. at 555.

279 475 U.S. 1 (1986).

280 Justice Harry Blackmun did not take part in the consideration of the case.

281 Id. at 20-21.

282 Id. at 4-6.

283 Id. at 20-21.

284 435 U.S. 765 (1978).

285 447 U.S. 530 (1980).

286 Pacific Gas, 475 U.S. at 8.


who wish to enter the marketplace of ideas from government attack, the First Amendment

protects the public's interest in receiving information.”287

Chief Justice Warren Burger concurred with the plurality opinion, stating, “I would not

go beyond the central question presented by this case, which is the infringement of Pacific's right

to be free from forced association with views with which it disagrees.”288

Justice Thurgood Marshall concurred in the judgment only, and said that the State of

California had unconstitutionally redefined a property right in the extra space in the utility's

billing envelope. He said that the First Amendment would not allow burdening the speech of one

party in order to enhance the speech of another.289

Justice William Rehnquist wrote one of two dissenting opinions and was joined by Justices Byron White and John Paul Stevens. Rehnquist said that negative free speech rights should not be extended to corporations in general. He also said that the consumer group’s access to the utility company’s billing envelopes would not have “any noticeable deterrent effect” on the utility’s right to speak.290

Justice Stevens wrote a separate dissenting opinion, stating that the Public Utilities

Commission’s requirement that Pacific Gas carry a third party’s message differed little from a

variety of other commercial communications that “have rarely been challenged” and “never

invalidated on First Amendment grounds.”291

287 Id.

288 Id. at 21 (Burger, C.J., concurring).

289 Id. at 22-26 (Marshall, J., concurring).

290 Id. at 26 (Rehnquist, J., dissenting).

291 Id. at 35-36 (Stevens, J., dissenting).


Commentator Analysis on Public Forum Doctrine

Although the U.S. Supreme Court stated that Internet access in a public library is neither

a traditional nor a limited public forum,292 legal commentators are divided on how forum

doctrine applies to public libraries. The commentators do not agree on whether a public library should be considered a traditional public forum, a limited or designated public forum, a

nonpublic forum, or a mixture of fora. Several commentators argue that public forum doctrine

does not apply to libraries at all.

The Public Library as a Traditional Public Forum

Several legal commentators have argued that the public library is a traditional public

forum where the right to receive information and ideas is paramount. Law professor Richard

Peltz said that public libraries are like public parks and one role of libraries is to foster public

inquiry. In an article on adults’ access to online materials in public libraries, Peltz wrote, “The

right to receive is critically important in a public library.”293 Law librarian Raizel Liebler argued

that libraries have replaced public parks as a traditional public forum.294 “Libraries have become

the epicenter of a physical space for public discourse, both through the items in the library and

through library meeting rooms, and when Internet access is added to libraries, they serve an even

more important role for the free exchange of ideas,” Liebler wrote.295 A law student, Aaron

Jacobson, contended that library Internet terminals are a traditional public forum. He stated that

courts characterize a traditional public forum as one that has been historically open to the public.

292 United States v. Am. Library Ass’n, 539 U.S. 194, 205 (2003). For a discussion of the Supreme Court’s opinion on library Internet access and public forum doctrine, see Chapter 7.

293 Peltz, supra note 65, at 440-41.

294 Liebler, supra note 70, at 71-73.

295 Liebler, supra note 70, at 73.


Jacobson argued that public libraries have been open to the public for more than a century and therefore meet the requirements of being a traditional public forum.296 “Internet terminals in public libraries fit within the definition of a traditional public forum; the terminals are extensions of the libraries that house them, and the libraries themselves are traditional public forums,”

Jacobson wrote.297

The Public Library as a Designated/Limited Public Forum

Some commentators have argued that, for the purposes of receiving information, Internet access in a public library may be considered a limited or designated public forum. Leah Wardak, a law student, stated that the government took an affirmative step to “designate Internet access as a (limited) public forum when it announced that the reason libraries provide Internet access is to give their patrons another medium for research.”298 Another author, Abigail Holland, argued that the government treated public libraries as a limited public forum. “The government opens libraries and provides Internet access to promote the dissemination and receipt of information.

This affirmative government act supports the premise that Internet access in a public library is a limited public forum and subject to strict scrutiny,” Holland wrote.299

296 Jacobson, supra note 70, at 1361.

297 Jacobson, supra note 70, at 1361 (citing United States v. Kokinda, 497 U.S. 720,726 (1990) (defining traditional public forum as governmental property traditionally open to public for expressive activity) and Perry Educ. Ass’n v. Perry Local Educators’ Ass’n, 460 U.S. 37, 45 (1983) (describing traditional public forums as places that, by long tradition or government fiat, have been devoted to assembly and debate)).

298 Wardak, supra note 70, at 724-25 (citing United States v. Am. Library Ass'n, 123 S. Ct. 2297, 2305 (2003)). Wardak added that, “Internet access in public libraries is a designated public forum; therefore, the First Amendment does include a right to receive, and CIPA should have been subject to strict scrutiny.” Id.

299 Abigail K. Holland, Comment, Constitutional Law - Constitutionality of Mandatory Filters on Federally Funded Internet Access in Public Libraries - United States v. American Library Association, Inc., 539 U.S. 194 (2003), 38 SUFFOLK U. L. REV. 217, 221 (2004) (internal citations omitted).


If Internet access in a public library were a limited public forum, any regulations limiting access to the Internet would be subject to strict scrutiny, two commentators noted.300 However, if a library’s Internet policy were not content-based, the intermediate scrutiny standard would apply. For example, filtering policies might be content-neutral time, place and manner regulations if a library prohibited the viewing of all images and graphics because of the effect on computer resources, such as the amount of bandwidth and length of time it takes to download images and other multimedia content, in contrast to text.301

Professor Bernard Bell has argued that the public library is at the very least a limited public forum for those desiring physical access to obtain information and might even be a traditional public forum for receiving information.302 The interests of recipients of information are “paramount” in a public library setting, according to Bell.303 He argued that librarians should have limited power in preventing patrons’ access to online “materials that satisfy their patrons' intellectual interests.”304 However, he contends that libraries can place “lesser value on materials that are not primarily focused on intellectual enlightenment, such as sexually explicit material directed toward the audience's prurient interests.” 305

300 Peltz, supra note 65, at 478-79; Holland, supra note 299, at 221. See also Semitsu, supra note 65, at 533.

301 Semitsu, supra note 65, at 528.

302 Bell, supra note 64, at 207.

303 Bell, supra note 64, at 195.

304 Bell, supra note 64, at 195.

305 Bell, supra note 64, at 207.


The Public Library as a Nonpublic Forum

Public libraries possibly could be viewed as a nonpublic forum, according to two

authors.306 If forum doctrine is even applicable, the library is a nonpublic forum by default,

according to one author.307 First, the government has not opened the library as a traditional

forum for public speeches; and second, the library is not a designated public forum “because no

expressive activities have been admitted to the forum.”308 Another author stated that in a

nonpublic forum, only a “reasonableness standard” is required when reviewing filtering

technology to examine “whether it is rationally related to any legitimate government interest.”309

The Public Library as Mixture of Fora

Two commentators have said that applying public forum and right to receive ideas and information doctrines to the public library is problematic because public libraries are a mixture of fora.310

Librarians make two types of decisions: which patrons to admit to the library and which

content to select for the library, according to attorney Mark Nadel.311 Because of libraries’ acquisition policies and limited shelving and storage facilities, librarians do not accept all donations. Nadel argued that a public library is a designated or limited public forum for physical access, but a nonpublic forum for the right to receive information and ideas.312 Libraries are a

306 Semitsu, supra note 65, at 533; VanNorman, supra note 54, at 430-32.

307 VanNorman, supra note 54, at 431.

308 VanNorman, supra note 54, at 431.

309 Semitsu, supra note 65, at 533.

310 Mark S. Nadel, The First Amendment Limitations on the Use of Internet Filtering in Public and School Libraries: What Content Can Libraries Exclude? 78 TEX. L. REV. 1117, 1132 (2000); Julie M. Tedjeske, Note, Mainstream Loudoun and Access to Internet Resources in Public Libraries, 60 U. PITT. L. REV. 1265, 1290 (1999).

311 Nadel, supra note 310, at 1132.

312 Nadel, supra note 310, at 1132; see Tedjeske, supra note 310, at 1290.


designated or limited public forum when deciding on which patrons to admit because libraries cannot be very selective about those decisions, Nadel said. In contrast, libraries are a nonpublic forum when choosing which materials to add to their collections because libraries have the discretion to be “quite selective,” according to Nadel.313 If libraries were classified as a designated or limited public forum for the purposes of selecting content, libraries would need to have compelling reasons for rejecting book donations based on the content of those books, Nadel said.314

A law student noted that the Kreimer court315 used a First Amendment right to receive information as the basis for the court’s public forum analysis.316 However, the issue before the court was physical access to the library and not access to the library’s holdings.317 In Kreimer v. Bureau of Police,318 the court held that the local government in Morristown and

Morris Township, New Jersey, had made the library a limited public forum for the right to receive information, but not for personal expression, such as making speeches.319 The Kreimer

313 Nadel, supra note 310, at 1132. For a discussion of library collection development, see supra pp. 50 to 52.

314 In discussing public forum doctrine, Nadel only used the book donation example and did not discuss acquisition or selection decisions that librarians routinely make in building their collection. Nadel said that the strict scrutiny test would apply to libraries rejecting book donations based on content. Nadel, supra note 310, at 1132-33. For a content-based restriction to pass the strict scrutiny test, the restriction “must be narrowly tailored to serve a compelling Government interest” and must be the “least restrictive alternative.” See United States v. Playboy Entm’t Group, 529 U.S. 803, 813 (2000).

315 Kreimer v. Bureau of Police, 958 F. 2d. 1242 (3d Cir. 1992).

316 Tedjeske, supra note 310, at 1289-90.

317 Kreimer, 958 F. 2d. at 1247, 1261, 1269. Richard Kreimer, a homeless man with poor hygiene, wanted access to the library to read. However, library patrons and staff said Kreimer’s body odor, staring, and loud talking disrupted other patrons and staff. Id.

318 Id. at 1242.

319 Id. at 1256-63. The Kreimer Court emphasized the right to receive information doctrine when stating, “The recognition of a constitutional right protecting public access to information and ideas is simply the threshold of our analysis.” Id. at 1255. In addressing the hygiene issue, the U.S. Court of Appeals for the Third Circuit upheld the library’s policy requiring that patrons have non-offensive body odor. The court held that the policy was narrowly tailored to meet a significant interest: “The Library's goal is served by its requirement that its patrons have non-offensive bodily hygiene, as this rule prohibits one patron from unreasonably interfering with other patrons' use and enjoyment of the Library; it further promotes the Library's interest in maintaining its facilities in a sanitary and attractive condition.” Id. at 1264. Other library policies mandated that patrons engage in activities associated with the use of a public library, such as reading, studying and using library materials, and prohibited loitering, food and drink, and noisy and boisterous activities. See also Helper, supra note 93.


court held that a library’s policies on patron hygiene and conduct placed some restrictions on patrons’ rights to receive information, but the restrictions were “narrowly tailored,” and the restrictions did “not improperly restrict patrons from exercising the constitutionally protected right to receive information.”320

In an article on Internet access in public libraries, the law student stated that public forum analysis is not appropriate for a library’s holdings or collections because librarians make collection decisions that are consistent with their mission and policies.321 The law student also argued that the Perry court322 focused on access to, or more precisely exclusion from, the forum or building itself, and not on the right to receive information inside the building.323 In Perry, the Supreme Court held that teacher mailboxes are a nonpublic forum.324

The Inapplicability of Public Forum Doctrine to Public Libraries

Other commentators have argued that public forum analysis is not applicable to libraries at all.325 Because librarians use editorial discretion in acquiring library materials326 and cannot


320 Kreimer, 958 F. 2d. at 1266. See supra notes 91- 99 and accompanying text for a discussion of Kreimer.

321 Tedjeske, supra note 310, at 1289-90. The author likened tax-funded Internet access to government subsidies in Rust v. Sullivan, 500 U.S. 173 (1991). She wrote the Supreme Court in Rust correctly upheld speech restrictions preventing family planning clinic health care workers from providing abortion counseling since the funding was limited to preconception counseling. See Rust v. Sullivan, 500 U.S. 173, 179, 181, 194 (1991). For a discussion of the public library’s mission, role and acquisition policies, see supra pp. 46-52.

322 Perry Educ. Ass’n v. Perry Local Educators Ass’n, 460 U.S. 37 (1983).

323 See Tedjeske, supra note 310, at 1290.

324 Perry, 460 U.S. at 55.

325 See Robert Corn-Revere, United States v. American Library Association: A Missed Opportunity for the Supreme Court to Clarify Application of First Amendment Law to Publicly Funded Expressive Institutions, 2003 CATO SUP. CT. REV. 105, 127-28 (2003); VanNorman, supra note 54, at 432; Marc Blitz, Constitutional Safeguards for Silent Experiments in Living: Libraries, the Right to Read, and a First Amendment Theory for an Unaccompanied Right to Receive Information, 74 UMKC L. REV. 799, 844 (2006). See generally Jim Chen, The Faegre & Benson Symposium: Law, Information and Freedom of Expression: Article: Mastering Eliot's Paradox: Fostering Cultural Memory in an Age of Illusion and Allusion, 89 MINN. L. REV. 1361 (2005).


cater to every patron’s demands, public libraries should not be subject to forum analysis,

according to Brent VanNorman, a law student.327 VanNorman argued that public libraries could

be compared to public broadcasting, which the U.S. Supreme Court in 1998 said generally is not

subject to forum analysis.328 In Arkansas Educational Television Commission v. Forbes,329 the Supreme Court noted that Congress in 1927 rejected the idea that broadcast stations should be open to all persons wanting to discuss public issues.330 Justice Anthony Kennedy,

who authored the majority opinion, wrote, “In the case of television broadcasting, however,

broad rights of access for outside speakers would be antithetical, as a general rule, to the

discretion that stations and their editorial staff must exercise to fulfill their journalistic purpose

and statutory obligations.”331

First Amendment scholar Robert Corn-Revere argued that public forum doctrine is not a

good fit for libraries because the doctrine revolves around speakers rather than listeners or

readers. In a 2003 article, Corn-Revere chastised the Supreme Court for not revising and


326 For a discussion of library acquisition decisions, see supra pp. 50 to 52.

327 VanNorman, supra note 54, at 432.

328 VanNorman, supra note 54, at 432 (citing Ark. Educ. Television Comm’n v. Forbes, 118 S. Ct. 1633 (1998)).

329 Ark. Educ. Television Comm’n v. Forbes, 523 U.S. 666 (1998).

330 Id. at 673-74.

331 Id. at 672-73. However, the Court also stated that, “Although public broadcasting as a general matter does not lend itself to scrutiny under the forum doctrine, candidate debates present the narrow exception to the rule.” Id. at 675.


clarifying public forum doctrine and the doctrine of unconstitutional conditions332 in the CIPA

case.333 He said the Court should have adopted a different constitutional framework,334 although

he did not make specific recommendations. Corn-Revere wrote that the Court should have

developed a “coherent theory for analyzing speech restrictions imposed on public

institutions.”335

Derrick Stomberg, a law student, argued that a new category, “inherently public fora,” should be adopted. This designation would provide protection to new types of properties whose

principal purpose is to promote free speech. The Internet should then be classified as an

“inherently public forum” because the Internet is a centralized place where people can look for

information and discuss ideas.336 Stomberg noted that the concept of an inherently public forum

is not new. In his law review article, Stomberg cited a 1992 concurring opinion written by

Supreme Court Justice Anthony Kennedy. According to Justice Kennedy, the Court should adopt

a more modern and objective standard for public forum doctrine, one that extends beyond the

historical designation of streets, parks and sidewalks, whose role as venues for public discourse is diminishing. Justice

Kennedy said that the Court needs to recognize the possibility of other types of traditional public

332 The unconstitutional-conditions doctrine states that “the government cannot condition a person’s receipt of a government benefit on the waiver of a constitutionally protected right, (esp. a right under the First Amendment.”). BLACK’S LAW DICTIONARY (8th ed. 2004). For a discussion of the Supreme Court’s discussion of the unconstitutional conditions doctrine in the CIPA case, see Chapter 7.

333 Corn-Revere, supra note 325, at 129.

334 Corn-Revere, supra note 325, at 127-28.

335 Corn-Revere, supra note 325, at 126.

336 Derrick Stomberg, Note, United States v. American Library Association, Inc.: The Internet as an Inherently Public Forum, 45 JURIMETRICS J. 59, 71-73 (2004).


fora, “whatever their historical pedigree and without concern for a precise classification of the

property.” 337 In his concurrence, Justice Kennedy wrote:

Without this recognition our forum doctrine retains no relevance in times of fast-changing technology and increasing insularity. In a country where most citizens travel by automobile, and parks all too often become locales for crime rather than social intercourse, our failure to recognize the possibility that new types of government property may be appropriate forums for speech will lead to a serious curtailment of our expressive activity.338

Commentator Analysis on the Right to Receive Ideas and Information Doctrine

In addition to discussing the applicability of public forum doctrine to public libraries,

commentators also have discussed the applicability of the right to receive ideas and information

doctrine to libraries.339 Although some commentators examined the right to receive information in the context of public forum doctrine, as discussed above, other authors focused on the right to

receive information without discussing forum doctrine.

Julie M. Tedjeske, a law student, argued that the right to receive information cannot be

extended to a library’s collections. If a right to receive information exists in a public library, then that right is most likely limited to physical access rather than access to information.340 While the

337 Id. at 69-71 (citing Int’l Soc’y for Krishna Consciousness v. Lee, 505 U.S. 672, 697-98 (1992) (Kennedy, J., concurring in the judgment)). The Krishnas, a religious group, challenged the enforcement of a regulation prohibiting their solicitation of money and distribution of literature inside airport terminals. The Court held that an airport was a nonpublic forum and therefore the regulation need only meet the reasonableness standard and not the strict scrutiny standard. The Court held that prohibiting solicitation of money and distribution of literature inside airport terminals was reasonable to promote efficient air travel. The ordinance allowed solicitation and distribution of information on sidewalks outside airport terminals. Lee, 505 U.S. at 675-83.

338 Lee, 505 U.S. at 697-98.

339 See Blitz, supra note 325, at 810; Wardak, supra note 70, at 669-71; Peltz, supra note 65, at 440-41. See also Michael Cassidy, Note, To Surf and Protect: The Children’s Internet Protection Act Polices Material Harmful to Minors and a Whole Lot More, 11 MICH. TELECOMM. TECH. L. REV. 437, 464 (2005); Adam Horowitz, The Constitutionality of the Children’s Internet Protection Act, 13 ST. THOMAS L. REV. 425, 426, 444 (2000); Gregory Laughlin, Sex, Lies and Library Cards: The First Amendment Implications of the Use of Software Filters to Control Access to Internet Pornography in Public Libraries, 51 DRAKE L. REV. 213, 254-55 (2003).

340 Tedjeske, supra note 310, at 1289.

101

First Amendment right to receive information was the foundation for the public forum analysis in

Kreimer,341 “the issue in that case was general access to the library, not a claim of access to particular resources that the library did not otherwise make available,” Tedjeske said.342

In 1984, Shirley Echelman, former executive director of the Association for Research

Libraries and of the Medical Library Association, said a “relationship” exists “between the

Constitutional guarantee of free speech and the obligations of the public library to provide readers with maximum access to all points of view.”343 In a 1992 law review article, First

Amendment scholar Rodney Smolla said that librarians have to show “commitment” in

maintaining “intellectual openness” in public libraries.344 “Librarians play a pivotal role in

maintaining the free flow of information in American society,” Smolla said.345 He argued that

“courts have not gone very far in devising First Amendment doctrines that more fully protect the intellectual neutrality of libraries.”346 He stated, “[N]ew communications technologies carry with

them increased censorship pressures” and urged librarians to fight against censorship in what he

called “the never ending struggle to maintain the free flow of information in a wide-open and robust democracy.”347 Although Internet access was not available in public libraries when

Smolla’s article was published, his argument applies directly to Internet access in public libraries

341 Kreimer v. Bureau of Police, 958 F. 2d. 1242 (3d Cir. 1992). For a discussion of the Kreimer opinion, see supra notes 91-99 and accompanying text.

342 Tedjeske, supra note 310, at 1289.

343 Shirley Echelman, The Right to Know: The Librarian’s Responsibilities, in THE RIGHT TO INFORMATION: LEGAL QUESTIONS AND POLICY ISSUES 56 (Jana Varlejs, ed., 1984).

344 Smolla, supra note 1, at 79.

345 Smolla, supra note 1, at 71.

346 Smolla, supra note 1, at 73.

347 Smolla, supra note 1, at 79.

102

because the Internet is a new communication technology that promotes a free flow of information and ideas.

In a 2006 article, law professor Marc Blitz stated that libraries play a crucial role in the right to receive information and in the exchange of ideas. He described the public library as a place “where open access for the public follows logically from the mission of the institution.”348

Blitz argued that the right to receive information, including online content,349 applies to public

libraries because patrons can retrieve information that leads to self-fulfillment and autonomy.350

Law student Michael Cassidy wrote that “filtering violates patrons’ rights to receive information

and ideas and therefore violates their First Amendment rights.”351

Commentators also have argued that the right to receive information doctrine applies to

public libraries because they are autonomous institutions. The library autonomy or

professionalism principle protects the acquisition decisions352 of librarians from government intervention and therefore supports patrons’ right to receive information and ideas.353 Under the

library autonomy principle, librarians’ decisions on acquiring materials would not be subject to

judicial challenge, according to law professor Jim Chen. He argued that legislation forcing

librarians to exclude material should be presumed unconstitutional.354 Rodney Smolla argued for a similar principle, which he labeled the “professional principle.” According to Smolla, content decisions in public libraries

348 Blitz, supra note 325, at 816.

349 Blitz, supra note 325, at 847, 849-50.

350 Blitz, supra note 325, at 817-18 (citing C. Edwin Baker’s liberty theory of the First Amendment).

351 Cassidy, supra note 339, at 464.

352 For a discussion of public library acquisition decisions, see supra pp. 50 to 52.

353 Blitz, supra note 325, at 844; Chen, supra note 325, at 1361. See also Corn-Revere, supra note 325, at 127-28.

354 Chen, supra note 325, at 1362. See generally, Smolla, supra note 1.

103

should be insulated from partisan political influence by committing them to the sound discretion of professionals in the field. These professionals judge the merits of a work from perspectives limited to the professional criteria that have evolved within their areas of expertise.355

Under the autonomy or professionalism principle, librarians, rather than the government, would make decisions on the use of Internet filtering.

First Amendment Theory

Several theories of the First Amendment apply to patrons’ access to materials in public libraries. The right to receive ideas theory is perhaps the most directly applicable to public library patrons. First Amendment scholar William Lee stated that the right to receive ideas and information doctrine is important and useful because it “restricts the government’s power to interfere with the recipient of the communication.”356 However, Lee said that the Supreme Court has failed to develop “a cohesive theory of free speech” or to tie the right to receive information and ideas to the recipient, independent of the speaker.357 Although the Supreme Court has stated the “right to receive” is well established in Stanley v. Georgia in 1969,358 Lee said the Court has never explained the theoretical foundation of that right.359 In a 1987 article, Lee wrote that part of the problem is that the Court developed the right to receive information doctrine in various contexts360 and for “disparate purposes.”361 For example, the Supreme Court upheld the right to

355 Smolla, supra note 1, at 73.

356 Lee, supra note 190, at 343.

357 Lee, supra note 190, at 343.

358 Stanley v. Georgia, 394 U.S. 557, 564 (1969).

359 Lee, supra note 190, at 307.

360 Lee, supra note 190, at 306-07.

361 Lee, supra note 190, at 342.

104

receive political speech, even though it was communist propaganda from abroad,362 and the right to receive commercial speech in the form of advertising prescription drug prices.363 The Court also upheld the right of students in public schools to have access to school library books by striking down a school board decision to remove certain books from the library.364 Lee argued it would make more sense to recognize the right to receive information only in situations where a speaker has a right to speak.365 Under Lee’s analysis, the right to receive ideas and information most likely would not apply in public libraries because speakers do not have the right to speak in that venue.366 However, the U.S. Supreme Court and legal commentators have specifically applied the right to receive information and ideas to public libraries, which is discussed above.367

First Amendment scholar Thomas Emerson has emphasized the rights of the listener or audience, stating that there is a First Amendment right to receive and obtain communication, independent of or supplemental to the right of the speaker to communicate information.368 He

362 Lee, supra note 190, at 307-11 (citing Lamont v. Postmaster Gen., 381 U.S. 301 (1965) (holding as unconstitutional a federal statute that mandated “communist propaganda” from abroad be held at a post office until the addressee requested it)). For a discussion of Lamont, see supra notes 204-209 and accompanying text.

363 Lee, supra note 190, at 314-18 (citing Va. State Bd of Pharmacy v. Va. Citizens Consumer Council, 425 U.S. 748 (1976) (holding as unconstitutional a state statute that prohibited pharmacists from advertising prescription drug prices)). For a discussion of Virginia State Board, see supra notes 241-249 and accompanying text.

364 See Lee, supra note 190, at 323-27 (citing Bd. of Educ. v. Pico, 457 U.S. 853 (1982) (holding that a school board violated the First Amendment when it ordered the removal of books from a junior high school library and senior high school library)). For a discussion of Pico, see supra notes 152-168 and accompanying text.

365 Lee, supra note 190, at 344.

366 See Brown v. Louisiana, 383 U.S. 131, 142 (1966) (acknowledging that a public library is “a place dedicated to quiet, to knowledge, and to beauty.”). See supra notes 141-151 for a discussion of Brown v. Louisiana. See also Kreimer v. Bureau of Police, 958 F. 2d. 1242, 1260-61 (3d Cir. 1992) (stating that the purpose of a public library is to pursue knowledge through “reading, writing and quiet contemplation” and “the exercise of other oral and interactive First Amendment activities is antithetical to the nature of the Library.”) See supra notes 91-99 for a discussion of Kreimer.

367 For the Supreme Court’s decisions on the right to receive information and ideas, see supra pp. 77 to 92. For commentators’ discussion on the right to receive information and ideas, see supra pp. 101-104.

368 See Thomas I. Emerson, Symposium, The First Amendment and the Right to Know—Legal Foundations of the Right to Know, 1976 WASH. U. L. Q. 1, 2 (1976).

105

has argued that the “right to read, listen, or see is so elemental” that it deserves full First

Amendment protection.369 This “right to know” is of “vital importance in a democratic

society”370 and is an affirmative right, in contrast to the negative right of being free from

government interference.371 In Professor Emerson’s view, the right to know is necessary for self-

fulfillment and should receive “direct constitutional protection.”372 To attain self-fulfillment, or

self-realization, Emerson said individuals have “the right to form their own beliefs and opinions

. . . and the right to express these beliefs and opinions.”373 According to Emerson, the

“suppression of belief, opinion and expression is an affront to the dignity of man, a negation of man’s essential nature.”374

First Amendment scholar C. Edwin Baker, who was a student of Thomas Emerson’s at

Yale University,375 has argued for the “liberty theory” of the First Amendment, which is based

on Emerson’s theory.376 According to the liberty theory, the First Amendment protects two key

values: self-fulfillment and social change. 377 The liberty theory fosters self-fulfillment and self-

determination,378 thus allowing the listener to “use speech” for self-realization and change.379

369 Id. at 6.

370 Id. at 1.

371 Id. at 2.

372 Id. at 2, 6.

373 THOMAS EMERSON, TOWARD A GENERAL THEORY OF THE FIRST AMENDMENT 5 (New York, Random House: 1966).

374 Id.

375 JOSEPH HEMMER, JR., THE FIRST AMENDMENT 149 (Cresskill, NJ: Hampton Press, 2006).

376 C. Edwin Baker, Scope of the First Amendment Freedom of Speech, 25 UCLA L. REV. 964 (1978).

377 Id. at 990-92, 1040.

378 Id. at 966.

379 Id. at 1007.

106

Baker views change as public participation in collective decision making in society.380 Baker has

argued that the “solitary uses of speech contribute to self-fulfillment,”381 and listeners have a

right to demand that the government not prevent them from either receiving or using

information.382 Speech is protected because of the value to the individual rather than the

“collective good.”383 However, the “solitary uses” of information, such as outlining material,

engaging in problem solving, or writing a note to oneself, may contribute to social change, as

well as to self-fulfillment.384

The marketplace of ideas theory supports the exposure of individuals to a wide variety of

ideas and viewpoints and is often attributed to John Milton385 and John Stuart Mill.386 Listeners

will come to a clearer and more accurate perception of the truth by struggling to reconcile new

ideas with old assumptions, even if a new idea or opinion turns out to be wrong, according to

Mill.387 For example, listeners will gain a “livelier impression of truth, produced by its collision with error” when they are exposed to false information.388 Mill said that the problem with silencing any opinion is that society is deprived of an opportunity of exchanging error for truth if the opinion turns out to be right.389 In addition, Mill argued that it is important to hear the

380 Id. at 990-92, 1040. This is a paraphrase, not a quotation; Baker used these terms separately at different places in his article.

381 Id. at 995-96.

382 Id. at 1007.

383 Id. at 966.

384 Id. at 993.

385 See JOHN MILTON, AREOPAGITICA.

386 See JOHN STUART MILL, ON LIBERTY 16 (1859).

387 Id.

388 Id.

389 Id.

107

pros and cons of issues to fully understand their meaning.390 Moreover, the basis and meanings of opinions become lost without robust discussion.391

U.S. Supreme Court Justice Oliver Wendell Holmes’ famous dissent in 1919 in Abrams v. United States392 also supported the marketplace of ideas theory. Holmes, who was joined by

Justice Louis Brandeis, disagreed with the Court’s decision to uphold the convictions of Russian immigrants on charges of inciting resistance to the war by publishing anti-American leaflets during World War I. Holmes said that the government did not show that the immigrants had displayed an “immediate evil or an intent” to disrupt the war or start a revolution.393 Holmes wrote,

[T]he best test of truth is the power of the thought to get itself accepted in the competition of the market….[W]e should be eternally vigilant against attempts to check the expression of opinions that we loathe and believe to be fraught with death, unless they so imminently threaten immediate interference with the lawful and pressing purposes of the law that an immediate check is required to save the country.394

Eight years later, Holmes and Brandeis again supported the marketplace of ideas in

Whitney v. California.395 Brandeis, who was joined by Holmes in a concurring opinion, said that the U.S. Constitution guarantees free speech and assembly.396 Brandeis wrote,

390 Id. at 42-43.

391 Id. at 39.

392 250 U.S. 616 (1919). The Supreme Court majority voted to uphold the Espionage Act convictions of Jacob Abrams and his co-defendants who distributed leaflets to encourage resistance to the United States in the war with Germany, to protest American troops sent into Russia, and to advocate worker strikes in U.S. ammunition factories. Id. at 616-20. Justice John Clarke wrote that the “manifest purpose of such a publication was to create an attempt to defeat the war plans of the Government of the United States.” Id. at 623-24.

393 Id. at 628 (Holmes, J., dissenting).

394 Id. at 630.

395 274 U.S. 357 (1927). The Court, in a 9-0 vote, upheld the conviction of a member of the Communist Labor Party, stating that the freedom of speech does not protect the “advocacy and use of criminal and unlawful methods” to accomplish political change. Id. at 371-73. The Court wrote, “[T]hat a State in the exercise of its police power may

108

[The founders] believed that freedom to think as you will and to speak as you think are means indispensable to the discovery and spread of political truth . . . [and] it is hazardous to discourage thought, hope and imagination.397

Another theory of the First Amendment, Alexander Meiklejohn’s self-governance theory,

seems to provide absolute protection for speech related to the government process.398 In a shared

governance system, where the “rulers” and the “ruled” are the same individuals,399 “the Congress

of the United States has a heavy and basic responsibility to promote the freedom of speech,”

Meiklejohn wrote in 1948.400 He said that while the government may abridge speech, it may not

abridge the freedom of speech.401 For example, at a town meeting, the local government can abridge speech by regulating the number of speakers and time limits for comments, as long as “everything worth saying be said.”402 However, the local

government cannot abridge freedom of speech by preventing alternative viewpoints from being

presented at a meeting.403 “The freedom of ideas shall not be abridged,” he wrote.404 According

to Meiklejohn, citizens need information and discussion to understand issues and make decisions

punish those who abuse this freedom by utterances inimical to the public welfare, tending to incite to crime, disturb the public peace, or endanger the foundations of organized government and threaten its overthrow by unlawful means, is not open to question.” Id. at 371.

396 Id. at 376 (Brandeis, J., concurring). Brandeis and Holmes concurred in the result because of Fourteenth Amendment issues, stating that lower court testimony indicated that Whitney and members of the Communist Labor Party of California intended to commit serious crimes. Id. at 379.

397 Id. at 375 (Brandeis, J., concurring).

398 See ALEXANDER MEIKLEJOHN, FREE SPEECH AND ITS RELATION TO SELF-GOVERNMENT (1948).

399 Id. at 6.

400 Id. at 17.

401 Id. at 19.

402 Id. at 25.

403 Id.

404 Id. at 27.

109

in a self-governing system.405 In the 1960s, Meiklejohn expanded his self-governance theory to

include education, philosophical and scientific achievements, literature, and the arts.406

“Education, in all its phases, is the attempt to so inform and cultivate the mind and will of a

citizen that he shall have the wisdom, the independence, and therefore, the dignity of a governing

citizen,” he wrote.407

Professor Vincent Blasi has argued that the First Amendment plays an important role in

“checking” on the abuse of power by government officials. Under Blasi’s “checking value”

theory, the press plays the role of watchdog in monitoring the activities of government and “the

misuse of official power.”408 Blasi stated that the “abuse of official power is an especially

serious evil,” largely because of the government’s power “to employ legitimized violence,” such

as killing innocent people during a war.409 Blasi said that Meiklejohn’s self-governance theory and the checking theory are similar because both theories give “special protection” to political speech. However, the self-governance theory covers a broad range of communication, whereas the checking theory focuses specifically on government misconduct.410

Minors’ Access to Public Library Material

The Supreme Court411 and commentators412 have stated that minors have a First

Amendment right to receive information, though to a lesser extent than adults.413 Historian and

405 Id. at 24-25.

406 Alexander Meiklejohn, The First Amendment Is an Absolute, 1961 SUP. CT. REV. 245, 257 (1961).

407 Id. at 257.

408 Vincent Blasi, The Checking Value in First Amendment Theory, 1977 AM. BAR FOUND. RESEARCH J. 521, 527 (1977).

409 Id. at 538. When discussing war, Blasi was referring to the Vietnam War. Id.

410 Id.

411 Ginsberg v. New York, 390 U.S. 629, 636 (1968) (upholding by a 6-3 vote a variable obscenity standard that prohibited the sale to minors of magazines that would be considered obscene for minors but not obscene for adults.

110

attorney Catherine Ross has stated that, despite the Supreme Court’s recognition of minors’ First

Amendment rights, the Court has not provided much guidance concerning the age and circumstances under which a minor may receive information.414 Professor Ross and other commentators have argued that older minors, whom they define as teenagers, have greater First

Amendment rights than younger ones.415 Ross said that teenagers have the right to receive information,416 with or without parental approval,417 and that Internet filters in public libraries

The majority in Ginsberg stated, "[T]he power of the state to control the conduct of children reaches beyond the scope of its authority over adults."). See also Bd. of Educ. v. Pico, 457 U.S. 853, 867-68 (1982) (rejecting a school board’s claim of absolute discretion to remove books from its school libraries). However, the Pico Court stated: “Of course all First Amendment rights accorded to students must be construed ‘in light of the special characteristics of the school environment.’” (citing Tinker v. Des Moines Sch. Dist., 393 U.S. 503, 506 (1969)). Pico, 457 U.S. at 867-68. See also Reno v. ACLU, 521 U.S. 844, 875, 878-79 (1997) (striking down the Communications Decency Act). The CDA imposed criminal penalties for the “knowing transmission,” by means of a telecommunications device, of any "obscene or indecent" communications to any recipient aged 17 years or younger. The CDA also prohibited the electronic transmission of communications that depicted or described, in terms "patently offensive" as measured by contemporary community standards, sexual or excretory activities or organs. The Court majority stated, “It is true that we have repeatedly recognized the governmental interest in protecting children from harmful materials. But that interest does not justify an unnecessarily broad suppression of speech addressed to adults." Reno, 521 U.S. at 875. The Court majority added, “It is at least clear that the strength of the Government's interest in protecting minors is not equally strong throughout the coverage of this broad statute.” Reno, 521 U.S. at 878. For an analysis of government attempts to protect minors from content deemed harmful, see Chapters 4 and 5.

412 See Sidne Koenigsberg, Print Symposium, Contract Options for Individual Artists: Library Records Open to Parental Scrutiny: A New Set of Internet Access Controls for Minors, 29 COLUM. J.L. & ARTS 361, 376 (2006); Cassidy, supra note 339, at 444; Nunziato, supra note 70, at 121-22; Laughlin, supra note 339, at 254; Horowitz, supra note 339, at 425, 426-27; Catherine Ross, An Emerging Right for Mature Minors to Receive Information, 2 U. PA. J. CONST. L. 223, 223-26 (1999).

413 See, e.g., Ginsberg v. New York, 390 U.S. 629, 636 (1968); Bd. of Educ. v. Pico, 457 U.S. 853, 867-68 (1982); Reno v. ACLU, 521 U.S. 844, 875, 878-79 (1997); infra Chapters 4 and 5.

414 Ross, supra note 412, at 223-26 (citing Bellotti v. Baird, 443 U.S. 622, 643-44 (1979) (holding that mature minors have a constitutional right to obtain abortions without parental consent under certain circumstances. The Court stated, “A pregnant minor is entitled in such a [court] proceeding to show either: (1) that she is mature enough and well enough informed to make her abortion decision, in consultation with her physician, independently of her parents' wishes; or (2) that even if she is not able to make this decision independently, the desired abortion would be in her best interests.)) Ross said the Supreme Court did not provide any guidance to the lower courts about how to ascertain maturity, which has resulted in an ad hoc application of the concept.

415 Ross, supra note 412, at 224-25; Amitai Etzioni, Symposium, Do Children Have the Same First Amendment Rights as Adults?: On Protecting Children from Speech, 79 CHI.-KENT. L. REV. 3, 43 (2004); Koenigsberg, supra note 412, at 376 (2006); Laughlin, supra note 339, at 254; Nunziato, supra note 70, at 121-22.

416 Ross, supra note 412, at 224-25.

417 Ross, supra note 412, at 275.

111

interfere with that right.418 According to Ross, minors’ right to receive information is most applicable when they have autonomy rights regardless of parental preferences. Ross explained that minors have autonomy rights in instances in which they legally are able to make their own decisions without parental permission. Examples of these autonomy rights include the right to exercise individual religious beliefs, the right to contraception and sexuality, and the right to an abortion without parental notice or consent.419 Ross and commentator Dawn Nunziato said that although the right to receive information does not seem to fully apply to minors, mature minors in particular still need access to diverse information for individual self-exploration, to develop values and autonomy, and to acquire the tools they will need for self-governance when they reach adulthood.420

Columbia University law student Sidne Koenigsberg argued that Ross’ analysis, while doctrinally sound, is too limited. Koenigsberg stated that minors should have freedom in “more mundane circumstances,” such as when they want to read books or visit Internet sites that advocate ideas or beliefs their parents do not share.421 Two law professors proposed an age-based Internet filtering solution to address concerns about minors’ access to inappropriate materials. The professors suggested that public libraries could implement three tiers of filtering:

1) the most restrictive setting would apply to those aged twelve and under; 2) a “less restricted”

418 Ross, supra note 412, at 262.

419 Ross, supra note 412, at 253-54.

420 Ross, supra note 412, at 223-26; Nunziato, supra note 70, at 155, 161-62. See also Laughlin, supra note 339, at 254.

421 Koenigsberg, supra note 412, at 376.

112

setting would apply to minors aged thirteen through sixteen; and 3) a much less restricted or unrestricted setting would apply for adults aged seventeen and older.422
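The tiered proposal can be illustrated with a short sketch. The following Python fragment is only an illustration of the three tiers described above, written for this discussion; the profile numbers, age cutoffs and blocked-category labels are assumptions, not the settings of any actual filtering product.

    # Hypothetical sketch of the proposed three-tier, age-based filtering scheme.
    # The category labels are illustrative assumptions, not real product settings.
    def filtering_profile(age):
        """Return an illustrative filtering profile for a patron of the given age."""
        if age <= 12:
            # Tier 1: most restrictive setting, for patrons aged twelve and under.
            return {"tier": 1, "blocked": ["sexually explicit", "nudity", "violence", "chat"]}
        elif age <= 16:
            # Tier 2: "less restricted" setting, for patrons aged thirteen through sixteen.
            return {"tier": 2, "blocked": ["sexually explicit", "nudity"]}
        else:
            # Tier 3: much less restricted or unrestricted setting, for patrons seventeen and older.
            return {"tier": 3, "blocked": []}

    # Example: a fourteen-year-old patron would receive the middle tier.
    print(filtering_profile(14))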

However, the American Library Association noted that an age policy does not take into account differing levels of intellectual development, family backgrounds or childrearing practices.423 The age-based recommendations also would pose three technical problems. First, libraries that choose to have filtering software installed on servers, rather than on individual computers, would not be able to alter settings on individual computers.424 Second, librarians do not necessarily have the expertise to alter the default settings that come with the filtering software packages. Third, the age-based tier does not meet the requirements of the Children’s

Internet Protection Act. Filtering software is unable to block the specific content prohibited by the CIPA, including material deemed harmful to minors.425 In addition, an unrestricted setting for adults would not meet the requirements of the CIPA.426

422 Etzioni, supra note 415, at 43-44 (2004); Nunziato, supra note 70, at 163-64. The “harmful to minors” clause of the Children’s Internet Protection Act applies to those under the age of seventeen.

423 INTELLECTUAL FREEDOM MANUAL, supra note 20, at 159.

424 A server is a computer that delivers information and software to other computers linked by a network. When individual computers are configured in such a way that they must go through the server to connect to the Internet, a software filter installed on the server computer would block selected content from reaching the individual computers connected to it. Libraries typically also have “internal” computers that allow patrons to search the library system’s holdings but that are not connected to the Internet. See BUCKLEY & CLARK, supra note 55, at 20, 323 (2008). See also ‘PROXY SERVER’, available at http://www.techterms.com/definition/proxyserver (last visited July 20, 2009); 'FILE SERVER' (2003), Encyclopedia of Computer Science, http://www.credoreference.com/entry/encyccs/file_server (last visited July 20, 2009).

425 Mitchell Goldstein, Congress And The Courts Battle Over The First Amendment: Can The Law Really Protect Children From Pornography On The Internet? 21 J. MARSHALL J. COMPUTER & INFO. L. 141, 187 (2003). Examples of filtering categories are “adults only,” “sexually explicit,” “sex education,” “nudity” and “violence.” The CIPA prohibits access to three major types of content: access by all patrons to “visual depictions” that are obscene, access by all patrons to “visual images” containing child pornography, and access by persons under age seventeen to “visual depictions” that are considered “harmful to minors.” See Children’s Internet Protection Act, Pub. L. No. 106- 554, 114 Stat. 2763, 2763A-335 (2000) (codified at 20 U.S.C. § 9134(f)(1)(A) and (B); 47 U.S.C. § 254(h)(6)(B) and (h)(6)(C)). For a discussion of how filtering software works, see Chapter 3.

426 See Children’s Internet Protection Act, Pub. L. No. 106-554, 114 Stat. 2763, 2763A-335 (2000) (codified at 20 U.S.C. §§ 9134(f)(1)(A) & (B); 47 U.S.C. §§ 254(h)(6)(B) & (C)), mandating the use of filters in public libraries to block access by all patrons, both adults and minors, to online obscenity and child pornography.

113

Some proponents of Internet filtering genuinely want to protect children from text and images that they consider “morally harmful,” according to library scholar Michael Gorman.

Typically, filtering proponents would define “morally harmful” material as material that contains sexual and violent content, Gorman wrote.427 However, he stated that filtering proponents do not agree on the type or level of sexual content to block. For example, Gorman stated that some people find any writing about sex offensive, while others object only to writings about “sexual variations” and visual images depicting sexual activity.428 Moreover, he said that individuals find different things repulsive and define “harmful” sexual content differently.429

Public librarians generally see their primary role as providing free and equal access to all information for all patrons, regardless of age, and without serving as censors or surrogate parents. Librarians who advocate this “open access” policy emphasize the role of parents in guiding their own children’s use of the Internet.430 According to the American Library

Association (ALA), librarians should not “assume, abrogate, or overrule the rights and responsibilities of parents.”431

Library scholar Richard Rubin stated that most librarians strongly believe that children should be protected from harm432 and a critical purpose of the library is to advance the education

427 See MICHAEL GORMAN, OUR ENDURING VALUES 93-94 (2000).

428 Gorman did not define or explain sexual variations.

429 See GORMAN, supra note 427, at 93.

430 THOMAS E. SHANKS & BARRY J. STENGER, ACCESS, INTERNET, AND PUBLIC LIBRARIES (originally published in 1997 and updated in 2002), available at http://www.scu.edu/ethics/practicing/focusareas/technology/libraryaccess/ (last visited July 20, 2009). This report was updated in March 2002 by Tamar Weber, research assistant at the Markkula Center for Applied Ethics at Santa Clara University.

431 AM. LIBRARY ASS’N, FREE ACCESS TO LIBRARIES FOR MINORS: AN INTERPRETATION OF THE LIBRARY BILL OF RIGHTS, available at http://www.ala.org/ala/aboutala/offices/oif/statementspols/statementsif/interpretations/freeaccesslibraries.cfm. The ALA Council adopted the interpretation statement in 1972 and amended it in 1981, 1991 and 2004.

432 RUBIN, supra note 1, at 148.

114

of the young.433 However, librarians do not necessarily agree on their role in the process, particularly if that role involves restricting access to information.434 Many librarians have argued

that children who are free to explore ideas become better educated citizens and healthy adults,

while others have argued that access to some materials should be restricted.435 Prior to the

implementation of the Children’s Internet Protection Act, some librarians already had limited

minors’ access to some materials because they wanted to forestall criticism of the library’s

collection and services or to avoid confrontations with adults who were concerned with minors’

exposure to “harmful” material.436

Although many librarians think that parents should be responsible for determining what

their children see,437 parents do not agree on the role of the public librarian. For instance, some parents see the public library as a “bastion of free speech,” where access to information should be completely unrestricted.438 Other parents view the public library as a “safe haven” where they

can send their children without fear that they will be exposed to sexually explicit material.439

These parents argue that the protection of their children (and, for some parents, other people's

433 RUBIN, supra note 1, at 150.

434 RUBIN, supra note 1, at 150.

435 RUBIN, supra note 1, at 150. See also INTELLECTUAL FREEDOM MANUAL, supra note 20, at 27.

436 JOHN ROBOTHAM & GERALD SHIELDS, FREEDOM OF ACCESS TO LIBRARY MATERIALS 36-49 (1982). In the past, some librarians placed controversial and young-adult materials in the adult sections of libraries.

437 RUBIN, supra note 1, at 150.

438 See Markkula Center for Applied Ethics, Libraries on the Information Superhighway: Ethics Center Facilities Discussion on Internet Access, 9 ISSUES IN ETHICS (No. 1) (Winter 1998), available at http://www.scu.edu/ethics/publications/iie/v9n1/libraries.html.

439 Id.

115

children) is a core family value, and therefore minors should be protected from accessing

obscene and pornographic materials in the library.440

In addressing minors’ access to public library materials, the ALA amended its Library

Bill of Rights in 1967,441 stating that a person’s right to use a public library should not be denied or abridged because of age.442 As further support for minors’ access to library materials, the

ALA in 1972 adopted a “Free Access to Libraries for Minors” policy, which is an interpretation

of the Library Bill of Rights. The policy states that “library policies and procedures that

effectively deny minors equal and equitable access to all library resources available to other

users violate the Library Bill of Rights.”443 In explaining its rationale, the ALA statement reads:

“Librarians cannot predict what resources will best fulfill the needs and interests of any individual user based on a single criterion such as chronological age, educational level, literacy skills, or legal emancipation.”444

Over the years, the ALA issued several other interpretations of the Library Bill of Rights that

applied to minors. In the 1981 interpretation, the document stated that librarians should not act

in loco parentis and that “[m]aterial selection decisions are often made and restrictions are often

440 Shanks & Stenger, supra note 430.

441 INTELLECTUAL FREEDOM MANUAL, supra note 20, at 67.

442 AM. LIBRARY ASS’N, LIBRARY BILL OF RIGHTS, art. V, available at http://www.ala.org/ala/aboutala/offices/oif/statementspols/statementsif/librarybillrights.cfm and http://www.ala.org/ala/aboutala/offices/oif/statementspols/statementsif/librarybillofrights.pdf. The Library Bill of Rights was adopted on June 18, 1948, and amended on February 2, 1961, and on January 23, 1980, with the inclusion of “age” reaffirmed on January 23, 1996, by the ALA Council. Art. V states, “A person’s right to use a library should not be denied or abridged because of origin, age, background, or views.”

443 AM. LIBRARY ASS’N, FREE ACCESS TO LIBRARIES FOR MINORS: AN INTERPRETATION OF THE LIBRARY BILL OF RIGHTS, available at http://www.ala.org/ala/aboutala/offices/oif/statementspols/statementsif/interpretations/freeaccesslibraries.cfm.

444 Id. The policy was adopted on June 30, 1972 and amended on July 1, 1981, on July 3, 1991, and on June 30, 2004 by the ALA Council. The ALA Council is the governing body of the American Library Association and determines all policies of the Association. See http://www.ala.org/ala/aboutala/governance/council/index.cfm. See also AM. LIBRARY ASS’N, LIBRARY BILL OF RIGHTS, art. V, stating, "A person's right to use a library should not be denied or abridged because of origin, age, background, or views." See also RUBIN, supra note 1, at 163.

116

initiated under the assumption that certain materials may be ‘harmful’ to minors, or in an effort to avoid controversy with parents.” The document also includes a section stating that a uniform age policy does not take into account differing levels of intellectual development, family backgrounds or childrearing practices.445 In its 1996 interpretation of access to electronic sources, the ALA stated that “libraries and librarians should not deny or limit access to information via electronic resources because of allegedly controversial content or because of the librarian’s personal beliefs or fear of confrontation.”446 The ALA also stated that access to electronic resources should not be restricted because of age.447 In 1997, the ALA passed an anti- filtering resolution, stating that the use of filtering software to block constitutionally protected speech violates the Library Bill of Rights.448

In the Interpretations of the Library Bill of Rights in 2002, the ALA reiterated its commitment to providing open access to minors:

Libraries should not limit the selection and development of library resources simply because minors will have access to them. Institutional self-censorship diminishes the credibility of the library in the community, and restricts access for all library users. Children and young adults unquestionably possess First Amendment rights, including the right to receive information in the library . . . . Lack of access to information can be harmful to minors. Librarians and library governing bodies have a public and professional obligation to ensure that all members of the community they serve have free, equal, and equitable access to the entire range of library resources regardless of content, approach, format, or

445 INTELLECTUAL FREEDOM MANUAL, supra note 20, at 159.

446 AM. LIBRARY ASS’N, ACCESS TO ELECTRONIC INFORMATION, SERVICES, AND NETWORKS: AN INTERPRETATION OF THE LIBRARY BILL OF RIGHTS, available at http://www.ala.org/Template.cfm?Section=interpretations&Template=/ContentManagement/ContentDisplay.cfm&C ontentID=31872. This section was adopted on January 24, 1996 and amended on January 19, 2005, by the ALA Council.

447 Id.

448 AM. LIBRARY ASS’N, RESOLUTION ON THE USE OF FILTERING SOFTWARE IN LIBRARIES, available at http://www.ala.org/ala/aboutala/offices/oif/statementspols/ifresolutions/resolutionuse.cfm.

117

amount of detail. This principle of library service applies equally to all users, minors as well as adults.449

The ALA’s commitment to providing minors with access to all materials extends to

online content, too:

The American Library Association's principles protect minors' access to sound, images, data, games, software, and other content in all formats such as tapes, CDs, DVDs, music CDs, computer games, software, databases, and other emerging technologies.450

The American Library Association has clearly stated that all library patrons should have

“unrestricted” access to all of the materials and services that libraries offer. In interpreting the Library Bill of Rights, the ALA wrote, “Every restriction on access to, and use of, library resources, based solely on the chronological age, educational level, literacy skills, or legal

emancipation of users violates Article V [of the Library Bill of Rights].”

The ALA also included a reference to the relationship between the role of the librarian

and the role of the judicial system: “Librarians and library governing bodies should not resort to

age restrictions in an effort to avoid actual or anticipated objections, because only a court of law

can determine whether material is not constitutionally protected.”451

Conclusion

The mission of the public library has remained fairly constant for more than seventy

years: to support intellectual freedom and to provide open and equal access to all patrons and to

449 AM. LIBRARY ASS’N, FREE ACCESS TO LIBRARIES FOR MINORS: AN INTERPRETATION OF THE LIBRARY BILL OF RIGHTS, available at http://www.ala.org/ala/aboutala/offices/oif/statementspols/statementsif/interpretations/freeaccesslibraries.cfm.

450 AM. LIBRARY ASS’N, ACCESS FOR CHILDREN AND YOUNG ADULTS TO NONPRINT MATERIALS: AN INTERPRETATION OF THE LIBRARY BILL OF RIGHTS, available at http://www.ala.org/ala/aboutala/offices/oif/statementspols/statementsif/interpretations/accesschildren.cfm.

451 AM. LIBRARY ASS’N, FREE ACCESS TO LIBRARIES FOR MINORS: AN INTERPRETATION OF THE LIBRARY BILL OF RIGHTS, available at http://www.ala.org/ala/aboutala/offices/oif/statementspols/statementsif/interpretations/freeaccesslibraries.cfm.

118

all users,452 with the goal of combating censorship and preserving everybody’s right to choose their own reading and viewing materials,453 regardless of format or medium.454 Librarians continue to determine which materials to acquire for the library’s collection, including online resources. However, because Internet filtering software is proprietary, librarians do not have input into online acquisition decisions as they do with other materials. Most software companies do not explain what they filter or how they filter,455 and therefore, unlike in the book selection process, librarians do not know what they are excluding from the online collection of resources. In addition, filtering technology is imprecise in that it does not block all sexually explicit material, while at the same time it blocks constitutionally protected speech.
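The imprecision described here is commonly expressed as two error rates: underblocking (sexually explicit material that the filter fails to block) and overblocking (constitutionally protected material that the filter blocks anyway). The short Python sketch below computes both rates from purely hypothetical counts; the figures are invented for illustration and are not drawn from any study cited in this dissertation.

    # Hypothetical counts from an imagined test of a filter against a sample of Web pages.
    explicit_pages_total = 1000        # sexually explicit pages in the sample
    explicit_pages_blocked = 870       # of those, the number the filter actually blocked
    protected_pages_total = 9000       # constitutionally protected pages in the sample
    protected_pages_blocked = 450      # protected pages the filter blocked anyway

    underblocking_rate = 1 - explicit_pages_blocked / explicit_pages_total
    overblocking_rate = protected_pages_blocked / protected_pages_total

    print("Underblocking: {:.1%} of explicit pages slipped through".format(underblocking_rate))
    print("Overblocking: {:.1%} of protected pages were wrongly blocked".format(overblocking_rate))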

452 See AM. LIBRARY ASS’N, LIBRARY BILL OF RIGHTS, art. V, available at http://www.ala.org/ala/aboutala/offices/oif/statementspols/statementsif/librarybillrights.cfm and http://www.ala.org/ala/aboutala/offices/oif/statementspols/statementsif/librarybillofrights.pdf; Bowker, supra note 2. But see Boorstin, supra note 2, at 119 (explaining that many early librarians were torn between the preservation of books by protecting them from the public and the diffusion of ideas, or making books accessible. In addition, librarians were concerned over the dress and demeanor of patrons, particularly the “‘laboring classes’ . . . who might soil the books and were unlikely to show them the respect that they were entitled to.”).

453 See AM. LIBRARY ASS’N, LIBRARY BILL OF RIGHTS, art. V, available at http://www.ala.org/ala/aboutala/offices/oif/statementspols/statementsif/librarybillrights.cfm and http://www.ala.org/ala/aboutala/offices/oif/statementspols/statementsif/librarybillofrights.pdf; SHUMAN, supra note 3, at 122.

454 AM. LIBRARY ASS’N, ACCESS TO ELECTRONIC INFORMATION, SERVICES, AND NETWORKS: AN INTERPRETATION OF THE LIBRARY BILL OF RIGHTS, available at http://www.ala.org/Template.cfm?Section=interpretations&Template=/ContentManagement/ContentDisplay.cfm&C ontentID=31872. This section was adopted on January 24, 1996 and amended on January 19, 2005, by the ALA Council.

455 CRAWFORD, supra note 53, at 227.

119

CHAPTER 3 INTERNET FILTERING TECHNOLOGY

Introduction

Since the U.S. Supreme Court upheld the Children’s Internet Protection Act (CIPA) in

2003,1 public library administrators and librarians have been required to install filtering software

on all computers connected to the Internet if they want to receive Library Services and

Technology Act (LSTA) and universal service (E-rate) technology funding from the federal

government. These funds for technology are available under the LSTA and the E-rate, a part of

the Telecommunications Act of 1996.2 The CIPA mandates that libraries certify that they are

using technological measures that prevent patrons of all ages from accessing “visual depictions”

that are obscene or that are child pornography, and for patrons who are minors, visual depictions

considered “harmful to minors.”3 Many public librarians, as part of the profession’s mission,

already had established some controls on access to the Internet such as setting time limits for

each user and establishing protocol blocking, which denies access to types of online platforms,

regardless of content, such as usenets, bulletin boards, chat rooms and e-mail.4 This chapter

explains the evolution and architecture of the Internet and the technology behind both the

Internet and filtering.
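Protocol blocking of the kind mentioned above operates on the type of service rather than on the content carried over it. The Python sketch below is a rough illustration, not a description of any particular library's configuration; the protocol names and well-known port numbers are standard conventions used here as an assumed example.

    # Illustrative protocol blocking: deny whole classes of services regardless of content.
    BLOCKED_PROTOCOLS = {
        "nntp": 119,   # Usenet newsgroups
        "irc": 6667,   # chat rooms
        "smtp": 25,    # e-mail
    }

    def connection_allowed(protocol):
        """Deny any connection whose protocol is on the blocked list; content is never examined."""
        return protocol.lower() not in BLOCKED_PROTOCOLS

    for proto in ("http", "irc", "nntp"):
        print(proto, "allowed" if connection_allowed(proto) else "blocked")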

1 See United States v. Am. Library Ass’n, 539 U.S. 194 (2003), rev’g Am. Library Ass’n v. United States, 201 F. Supp. 2d 401 (E.D. Pa. 2002).

2 For an explanation of the LSTA and the E-rate, see Chapter 1, supra notes 36-41 and accompanying text.

3 Pub. L. No. 106-554, 114 Stat. 2763 (2000) (codified at 20 U.S.C. § 9134(f); 47 U.S.C. § 254 (h)). The CIPA also requires libraries to implement a safety policy to monitor minors’ online activities and to monitor the operation of “a technology protection measure.” 20 U.S.C. §9134(f)(1)(A) and (B), 47 U.S.C. §254(h)(5)(A) and 47 U.S.C. §254(h)(6)(A). See Chapter 4 for a discussion of the concept of “harmful to minors,” Chapter 5 for federal attempts to protect minors from sexually explicit material, and Chapter 7 for a discussion of the components of the CIPA.

4 See Ann Curry & Ken Haycock, Filtered or Unfiltered?, 47 SCH. LIBR. J. 47 (Jan. 2001).

120

Nearly 99% of public libraries in the United States offered Internet access to the public in

2008,5 in contrast to 95.7% in 20006 and 25% in 1997.7 Early in 2000, prior to the passage of the

CIPA, nearly 17% of public libraries were using filtering software on some computers and 7% of public libraries were using filtering software on all computers.8

History of Internet Technology

Courts and researchers have described the Internet as ever-changing and unparalleled in human communication9 because the Internet connects millions of users via linked computer networks and with no physical boundary lines.10 The Internet can be traced to a Department of

Defense program in 1969 to link government computers at military installations.11 Today’s

Internet is an international network of interconnected computers, with no land-based or

5 From 2006 to 2008, libraries providing Internet access remained constant at just under 99%. See AM. LIBRARY ASS’N, THE STATE OF AMERICAN LIBRARIES 5 (April 2007) and JOHN CARLO BERTOT, CHARLES R. MCCLURE, ET AL., FLORIDA STATE UNIVERSITY INFORMATION INSTITUTE, PUBLIC LIBRARIES AND THE INTERNET 2008: STUDY RESULTS AND FINDINGS (2008). In fiscal year 2006, the latest year for which data are available, there were 9,208 public library systems or administrative units, and 16,592 central library outlets and branch library outlets in the fifty states and the District of Columbia. See INSTITUTE OF MUSEUM AND LIBRARY SERVICES, PUBLIC LIBRARIES SURVEY: FISCAL YEAR 2006 4 (2008).

6 JOHN CARLO BERTOT & CHARLES R. MCCLURE, PUBLIC LIBRARIES AND THE INTERNET 2000: SUMMARY FINDINGS AND DATA TABLES (2000), available at http://www.eric.ed.gov/ERICDocs/data/ericdocs2sql/content_storage_01/0000019b/80/16/9a/6f.pdf.

7 THE STATE OF AMERICAN LIBRARIES, supra note 5, at 5.

8 United States v. Am. Library Ass’n, 539 U.S. 194, 200 (2003) (citing a study conducted by the Library Research Center at the University of Illinois).

9 See Mohammed Hammami et al., WebGuard: A Web Filtering Engine Combining Textual, Structural, and Visual Content-Based Analysis, 18 IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 272, 272 (2006); Brian M. Werst, Comment, Legal Doctrine and Its Inapplicability to Internet Regulation: A Guide for Protecting Children from Internet Indecency after Reno v. ACLU, 33 GONZ. L. REV. 207, 218 (1997/1998). See also Reno v. ACLU, 521 U.S. 844, 850 (1997).

10 Werst, supra note 9, at 218; ACLU v. Reno, 929 F. Supp. 824, 830 (E.D. Pa. 1996), aff’d. in Reno. v. ACLU, 521 U.S. 844 (1997).

11 Werst, supra note 9, at 217; Scott Winstead, The Application of the “Contemporary Community Standard” to Internet Pornography: Some Thoughts and Suggestions, 3 LOY. INTELL. PROP. & HIGH TECH. J. 28, 29-30 (2000).

121

geographic limitations.12 The Internet is comprised of host computers, which store and relay

data, and user computers, which upload, view or download data over the Internet.13 Professors

Lawrence Lessig and Paul Resnick, both experts on cyberlaw, have explained that Internet

communication and regulation involve three types of actors: the sender or speaker, the receiver or listener, and the intermediary who stands between the two.14
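That three-actor model can be sketched schematically: a page travels from a sender to a receiver only if the intermediary standing between them relays it, and filtering in a library operates at that intermediary. The Python fragment below is a schematic illustration of the model as described here, not code from Lessig and Resnick's article; the URLs and the blocklist entry are assumptions.

    # Schematic of the sender / intermediary / receiver model.
    def sender(url):
        """The speaker: a remote site returning a page for the requested URL."""
        return "content of " + url

    def intermediary(url, blocklist):
        """The intermediary: relays the page unless the URL is on its blocklist."""
        if url in blocklist:
            return None        # filtered; nothing reaches the receiver
        return sender(url)

    def receiver(url, blocklist):
        """The listener: a patron's workstation requesting a page through the intermediary."""
        page = intermediary(url, blocklist)
        return page if page is not None else "blocked by intermediary"

    library_blocklist = {"http://example.test/explicit"}   # assumed example entry
    print(receiver("http://example.test/explicit", library_blocklist))
    print(receiver("http://example.test/reference-desk", library_blocklist))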

In the 1990s, the Internet became more expansive and user-friendly when the World

Wide Web (WWW) emerged as a platform, providing a decentralized, rather than a hierarchical,

be traced to research by computer programmer Tim Berners-Lee of CERN, the European

Laboratory for Particle Physics in Geneva, Switzerland.15 Berners-Lee was a primary mover in

creating what today is known as cyberspace.16 In 1989, he proposed using hypertext markup

language (HTML) at CERN to allow linking between documents in a non-hierarchical manner.17

WWW sites contain not only text, but also colorful graphics, sound, animation and video, and each site may be linked to many other Web sites.18 According to Forbes magazine, Berners-Lee

“transformed the Internet from an obscure Pentagon-funded communications project into the

12 See A. John Harper III, Traditional Free-Speech Law: Does It Apply on the Internet? 6 COMP. L. REV. & TECH. J. 265, 269 (2002).

13 Id. at 266.

14 Lawrence Lessig & Paul Resnick, Zoning Speech on the Internet: A Legal and Technical Model, 98 MICH. L. REV. 395, 399 (1999).

15 Spencer Reiss, St. Tim of the Web, FORBES, Nov. 15, 1999, at 314.

16 Id.

17 See Tim Berners-Lee, Information Management: A Proposal, available at http://www.w3.org/History/1989/proposal.html. See also Robert Cailliau, A Little History of the World Wide Web from 1945 to 1995, http://www.w3.org/History.html (last visited July 20, 2009); Reiss, supra note 15, at 314-16.

18 See generally PETER MORVILLE & LOUIS ROSENFELD, INFORMATION ARCHITECTURE FOR THE WORLD WIDE WEB (3d ed., 2006).

122

most spectacular technology launch in history.”19 He developed a do-it-yourself global address system and the first Web browser and offered the package for free. Without his set of rules for linking and displaying information on computers, Berners-Lee said, the Internet might not have taken off.20

The Internet grew exponentially with the Mosaic Web browser, which provided a graphical user interface. Other browsers followed, such as Netscape and Internet Explorer, and commercial Internet

Service Providers (ISPs) such as America Online and Earthlink emerged.21 As the Internet exploded in popularity and content, search engines such as Alta Vista, Yahoo! and Google allowed users to sift through billions of Web pages to find information on almost any subject. In addition, online users could post messages on bulletin boards and usenet sites,22 participate in chat rooms23 and conferencing systems,24 send instant messages,25 access e-mail accounts,26 subscribe to listservs,27 and play in MUDs or MOOs28— all from anywhere in the world. Today,

19 Reiss, supra note 15, at 314.

20 Reiss, supra note 15, at 315.

21 See Winstead, supra note 11, at 30-32.

22 A bulletin board, or BBS, allows users to post messages to other users, either privately or publicly. Publicly posted messages are preserved over time, unlike chat rooms. A usenet is another form of a BBS that consists of a set of individuals interested in seeking and posting information on a specific topic, such as ham radio, dog breeding or video games. In BBSs and usenets, there are postings on specific topics, where the comments—and sometimes images—follow one another in a bulletin board style.

23 Chat rooms allow live conversations and are divided into special interest groups, such as theater, gardening, car restoration and cybersex (such as women seeking women or group sex). Remarks appear in “real time” as participants type them in. Generally, these conversations are not preserved over time.

24 Conferencing systems are high-capacity BBSs that can offer other services, such as real-time chatting.

25 Instant messages can be sent privately between persons online at the same time.

26 Electronic mail (e-mail) allows communication between two or more people using an e-mail program, such as Outlook.

27 Listservs are e-mail lists to which users subscribe to obtain mass mailings on specific topics.

28 Multi-User Dungeons (MUDs) and MUD object-oriented sites (MOOs) are fictitious user-created online worlds in which players invent roles and interact with other online characters.

123

computer users can engage in peer-to-peer file sharing, post audio and video clips, watch Internet

television, and talk over the Internet using voice over Internet protocol (“VoIP”).29 Internet users

also can post content instantaneously on interactive sites, such as Facebook and MySpace.30

With the advances in technology, the number of Web sites, Web pages and online users

grew. A Web site is made up of a home or index page plus a number of individual pages. A Web site is similar to a file folder that contains and organizes documents, whereas Web pages are similar to the documents inside the folder. The number of Web sites and

Web pages is difficult to measure precisely because no one entity is in charge of the Web and no one central server runs all the Web sites in the world.

Internet Content and Usage

In 1993, about 130 sites existed on the Web, compared to an estimated 3.7 million sites in

2000.31 By July 2007, more than 125 million Web sites existed.32 In 2009, more than 231 million

29 ACLU v. Gonzales, 478 F. Supp. 2d 775, 791 (2007). A VoIP service converts a speaker’s voice into a digital signal that travels over the Internet. If the caller is calling a regular phone number, the signal is converted to a regular telephone signal before it reaches the destination. See Federal Communications Commission, Frequently Asked Questions: How VoIP/Voice Internet Works, http://www.fcc.gov/voip/ (last visited July 20, 2009).

30 DELOITTE ENTERPRISE RISK SERVICES, SAFER INTERNET: SYNTHESIS REPORT 14 (2008), available at http://www.sip-bench.org/Reports2008/sip_bench_2008_synthesis_report_en.pdf.

31 See Winstead, supra note 11, at 30.

32 Netcraft, July 2007 Web Server Survey (July 2007) http://news.netcraft.com/archives/web_server_survey.html. (last visited July 20, 2009). Netcraft is an Internet services company based in Bath, England, and provides Internet security and research data and analysis to clients around the world, including Microsoft, Visa, Credit Suisse and Lloyds of London.

124

Web sites existed.33 In 2002, the number of Web pages was estimated at two billion.34 In 2007, about 29.7 billion pages existed, according to the latest estimate released by boutell.com.35

In striking down the CIPA in 2002, the U.S. District Court for the Eastern District of

Pennsylvania noted that estimates had indicated that “no more than 1-2% of the content on the

Web is pornographic or sexually explicit,”36 but the number of Web sites that offered free sexually explicit content approximated 100,000 sites.37 According to the National Research

Council (NRC),38 sexually explicit material could be found on only 1.5% of the public Web sites and Web pages in 2002,39 the year the federal district court struck down the CIPA.40 Even a small percentage of such large numbers still represents hundreds of thousands of pornographic

Web sites. In 2005, about 34% of ten- to seventeen-year-olds said they had come across online

33 Netcraft, April 2009 Web Server Survey (2009) http://news.netcraft.com/archives/web_server_survey.html. (last visited July 20, 2009).

34 Youth, Pornography, and the Internet Sec. 3.1 at 39 (Dick Thornburgh & Herbert S. Lin, eds., 2002), available at http://bob.nap.edu/html/youth_internet/.

35 How Many Web Sites Exist? (Feb. 15, 2007) http://www.boutell.com/newfaq/misc/sizeofweb.html (last visited July 20, 2009). Boutell.com is a software company.

36 Am. Library Ass’n v. United States, 201 F. Supp. 2d 401, 419 (E.D. Pa. 2002).

37 Id. As stated above, the Supreme Court upheld the CIPA in 2003, thus reversing the lower court’s opinion (see United States v. American Library Assn., 539 U.S. 194 (2003)). See also Youth, Pornography and the Internet, supra note 34, Sec. 3.1, at 72.

38 According to its Web site, the NRC was organized by the National Academy of the Sciences in 1916 to associate the broad community of science and technology with the National Academy of Science’s purposes of furthering knowledge and advising the federal government. The NRC has become the principal operating agency of both the National Academy of Sciences and the National Academy of Engineering that provides services to the government, the public, and the scientific and engineering communities. See http://sites.nationalacademies.org/nrc/index.htm (last visited July 20, 2009).

39 Youth, Pornography and the Internet, supra note 34.

40 Am. Library Ass’n v. United States, 201 F. Supp. 2d 401 (E.D. Pa. 2002).

125

pornography that they did not wish to see, an increase from 25% in 2000, according to a survey done by the University of New Hampshire. 41

A 2006 study conducted for the Department of Justice indicated the portion of sexually explicit Web sites remained at an estimated 1.1 to 1.7%, about the same portion that was reported in 2002.42 However, the number of sexually explicit Web pages in 2006 was estimated at 275 million to 700 million.43

Because the total number of Web sites had grown to roughly 231 million by 2009, applying the same 1 to 2% estimates yields between 2.3 million and 4.6 million sexually explicit Web sites.44 The number of sexually explicit Web pages would be much higher, but no estimates on pages could be found in 2009.
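The estimate in the preceding paragraph, like the related figures in note 44, is simple proportional arithmetic. The short sketch below reproduces the range, assuming the roughly 231 million Web sites reported for 2009 and the 1 to 2% prevalence estimates discussed above.

```python
# Illustrative arithmetic only: the site total and prevalence rates are the
# estimates discussed in the text, not independent measurements.
total_sites_2009 = 231_000_000        # estimated total Web sites in 2009
low_rate, high_rate = 0.01, 0.02      # 1% to 2% estimated to be sexually explicit

low_estimate = total_sites_2009 * low_rate     # 2,310,000 sites
high_estimate = total_sites_2009 * high_rate   # 4,620,000 sites

print(f"Estimated sexually explicit sites in 2009: "
      f"{low_estimate:,.0f} to {high_estimate:,.0f}")
```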

In the twenty-first century, minors’ use of both computers and the Internet grew. In 2001 and 2003, the National Telecommunications and Information Administration (NTIA) conducted its last two studies on Americans’ Internet usage and reported that fourteen- to seventeen-year-

41 Janis Wolak et al., Unwanted and Wanted Exposure to Online Pornography in a National Sample of Youth Internet Users, 119 PEDIATRICS 247, 248 (2007). Thirteen percent of youth in the same age group reported that they had visited pornographic sites intentionally.

42 Transcript of day 11 of Nov. 8, 2006 non-jury trial at 114, ACLU v. Gonzales, 478 F. Supp. 2d 775 (E.D. Pa. 2007), available at http://www.aclu.org/pdfs/freespeech/copatranscript_20061108.pdf. See also Maryclaire Dale, 1 Percent of Web Sites Deemed Pornographic, www.msnbc.com/id/15721799 (last visited July 20, 2009). Philip Stark, a statistics professor at the University of California-Berkeley, testified that he estimated 1.7% of Web sites accessed through Time Warner AOL, MSN and Yahoo, Inc., are sexually explicit and 1.1% of Web sites accessed through Google and MSN are sexually explicit. See also Youth, Pornography and the Internet, supra note 34.

43 ACLU v. Gonzales, 478 F. Supp. 2d 775, 788-89 (2007). The court accepted estimates that 1% of Web pages were sexually explicit. Id. The number of sexually explicit pages would be much higher than the number of Web sites because each Web site is made up of many individual pages. But see How Many Web Sites Exist? (Feb. 15, 2007) http://www.boutell.com/newfaq/misc/sizeofweb.html (last visited July 20, 2009). The Boutell group estimated about 29.7 billion pages existed in 2007, which would indicate that between 297 million and 594 million pages would be sexually explicit (1 to 2% of 29.7 billion pages). However, no one really knows exactly how many Web sites or Web pages are on the Internet because no central authority coordinates the Internet.

44 Multiplying the 231,000,000 total Web sites by the estimated 1% and 2% rates of sexually explicit content yields between 2.3 million and 4.6 million sexually explicit sites in 2009. In 2007, the number of sexually explicit sites ranged from 1.26 million to 2.5 million. The number of sexually explicit pages would be much higher because Web sites contain numerous different pages, but most estimates focus on Web sites rather than Web pages.


olds used the Internet more than any other age group in both years. In 2001, 76.4% of fourteen-to seventeen-year-olds used the Internet, and in 2003, 78.8% used the Internet.45 In 2003, 19.9% of one- to four-year-olds used the Internet, 42% of five- to nine-year-olds used the Internet, and

67.3% of ten- to thirteen-year-olds used the Internet.46 In reporting the overall U.S. population’s use of the Internet, including all age groups, the NTIA found that 55.1% of the population used the Internet in 2001,47 compared to 58.7% in 2003.48

As of late 2004, a Pew Internet and American Life Project survey reported that 87% of

Americans aged twelve to seventeen—about 21 million—went online.49 Another 2004 study, conducted by Nielsen/Net Ratings, looked at children’s access to the Internet and found that, in contrast to other Internet activities such as e-mail, instant messaging and social chat rooms, children two to eleven were increasing their Web usage at a faster rate than the overall population. On average, that age group viewed 106% more Web pages in October 2004 than in

45 See NAT’L TELECOMM. & INFO. ADMIN., A NATION ONLINE: ENTERING THE BROADBAND AGE at A-1 (2004), available at http://www.ntia.doc.gov/reports/anol/NationOnlineBroadband04.pdf. The largest percentage of adults using the Internet in 2003 was the eighteen- to twenty-four-year old group at 70.6%. About 68% of the twenty-five- to forty-nine-year olds used the Internet, while 44.8% of those fifty and above used the Internet. Id. Studies conducted before 2001 used different age ranges than the studies conducted in 2001 and 2003 and therefore are not included in the text. However, information is included in this footnote as a reference. According to the study conducted in 2000, children three to eight had the lowest Internet use rate in 2000 (15.3%) and the smallest increase in use since 1998 (4.3 percentage points). In 2000, 53.4% of youths nine to seventeen used the Internet, compared to 43% in 1998, a 24% increase in the use rate. See NAT’L TELECOMM. & INFO. ADMIN., FALLING THROUGH THE NET: TOWARD DIGITAL INCLUSION—A REPORT ON AMERICANS’ ACCESS TO TECHNOLOGY TOOLS, PART II, INTERNET USE AMONG INDIVIDUALS(NON-PAGINATED) (October 2000), available at http://www.ntia.doc.gov/ntiahome/fttn00/Falling.htm#36.

46 A NATION ONLINE: ENTERING THE BROADBAND AGE, supra note 45, at A-1.

47 See NAT’L TELECOMM. & INFO. ADMIN., EXECUTIVE SUMMARY: A NATION ONLINE: HOW AMERICANS ARE EXPANDING THEIR USE OF THE INTERNET (Feb. 2002), available at http://www.ntia.doc.gov/ntiahome/dn/html/execsum.htm. Full report available at http://www.ntia.doc.gov/opadhome/digitalnation/index_2002.html. The results are based on data collected in 2001.

48 A NATION ONLINE: ENTERING THE BROADBAND AGE, supra note 45, at A-1. For an index to the six Internet use studies conducted by the NTIA, see http://www.ntia.doc.gov/reports/anol/index.html. The results are based on data collected in 2003.

49 PEW INTERNET AND AMERICAN LIFE PROJECT, PROTECTING TEENS ONLINE, at 4 (March 17, 2005), available at http://www.pewinternet.org/Reports/2005/Protecting-Teens-Online.aspx.


October 2002, compared to a 15% growth for the population overall.50 Another Pew Internet and

American Life Project survey, taken from November 2007 through February 2008, showed that

93% of twelve- to seventeen-year-olds used the Internet.51

As World Wide Web usage increased, so too did Web page designers’ savvy use of embedded text that would keep visitors at a company’s or organization’s Web site, even if they tried to leave. Web designers use two common methods to keep viewers on their site:

“mousetrapping” and “pagejacking.”52 Mousetrapping occurs when a Web site includes

embedded and hidden text that disables the “back” and “close” features on a user’s browser.

Similarly, pagejacking occurs when users cannot leave the Web site they are on and are exposed

to pop-up pages that they did not request. Mousetrapping and pagejacking have been particularly problematic with pornographic Web sites. When users try to use the “back” button or attempt to

link to a URL53 of their choice, they instead see more pornographic sites.54

Concerns about children inadvertently or deliberately accessing online pornography from

public library and school computers led to the passage of the Children’s Internet Protection Act

in 200055 and the United States Supreme Court’s decision to uphold the CIPA in 2003.56

50 The Youngest Surfers, PC MAGAZINE, Jan. 2005, at 28 (citing a study by Nielsen/Net Ratings).

51 PEW INTERNET AND AMERICAN LIFE PROJECT, GENERATIONS ONLINE IN 2009 at 6 (2009), available at http://www.pewinternet.org/Reports/2009/Generations-Online-in-2009.aspx. The study did not report the number of twelve- to seventeen-year-olds in the U.S. population in 2007 to 2008.

52 Rebecca L. Covell, Note, Problems With Government Regulation of the Internet: Adjusting the Court’s Level of First Amendment Scrutiny, 42 ARIZ. L. REV. 777, 777-78, 792 (2000).

53 A URL is a uniform resource locator, the Internet equivalent of an address, such as http://www.cnn.com. In this example, “http” is the protocol, and “www.cnn.com” is the server address or domain. A forward slash and file name may follow the “.com” suffix to direct users to particular indices and subcategories within the URL.

54 Covell, supra note 52, at 792.

55 See Chapter 6, supra notes 268-290 and accompanying text, and Chapter 7 for a discussion of the Children’s Internet Protection Act of 2000, Pub. L. No. 106-554, 114 Stat. 2763 (2000) (codified at 20 U.S.C. § 9134(f); 47 U.S.C. § 254 (h)).


To comply with the Children’s Internet Protection Act, librarians must use filtering

software—commonly referred to as user-based filtering—to block pornographic Internet

material. The CIPA focuses on receivers, not senders,57 and covers images, not words. However,

several attorneys and legal scholars have written that filtering technology is not equipped to

block access to images.58 Moreover, these software programs fail to filter all sexually explicit text and, at the same time, block constitutionally protected speech.59

How Filtering Technology Works

Attorney and professor Lawrence Lessig has written that in order to understand filtering technology and choices, one must understand that both nonuser-based and user-based filtering exists.60 In discussing nonuser-based online filtering, Lessig explained that “architectures” in

cyberspace can either zone speech or filter speech.

In a zoning construct, Web site operators would be required to deny minors’ access to

content deemed “harmful to minors,”61 such as pornography. Under this scenario, the

government would require manufacturers to modify their browsers so that users could set up

56 United States v. Am. Library Ass’n, 539 U.S. 194 (2003).

57 Prior legislation, such as the Communications Decency Act and the Child Online Protection Act, was directed at senders’ rather than receivers’ expression. See Chapter 5 for an analysis of the CDA, the COPA and Supreme Court cases on protecting minors from material deemed harmful in the media.

58 See generally MARJORIE HEINS ET AL., BRENNAN CENTER FOR JUSTICE AT N.Y. UNIV. SCHOOL OF LAW, INTERNET FILTERS: A PUBLIC POLICY REPORT (2d ed., 2006); Richard J. Peltz, Use “the Filter You Were Born With”: The Unconstitutionality of Mandatory Internet Filtering for the Adult Patrons of Public Libraries, 77 WASH. L. REV. 397, 397 (2002); Kathleen Conn, Commentary, Protecting Children from Internet Harm (Again): Will the Children’s Internet Protection Act Survive Judicial Scrutiny?, 153 ED. LAW. REP. 469 (July 5, 2001); Adam Horowitz, The Constitutionality of the Children’s Internet Protection Act, 13 ST. THOMAS L. REV. 425 (2000).

59 See generally HEINS, supra note 58; Peltz, supra note 58, at 397; Conn, supra note 58, at 469; Horowitz, supra note 58.

60 LAWRENCE LESSIG, CODE AND OTHER LAWS OF CYBERSPACE 175-82 (1999).

61 Id. at 176. In Ginsberg v. New York, 390 U.S. 629 (1968), the U.S. Supreme Court ruled that a variable obscenity standard is constitutional, stating that what is obscene as to minors may not be obscene as to adults, and there is a compelling state interest in protecting the health and welfare of children.


profiles, including a check-off box to identify minor users.62 Lessig has argued that this approach would be constitutional, whereas a version requiring adults to identify themselves would be unconstitutional,63 as well as time-consuming and expensive.64
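Lessig’s zoning construct can be pictured as a simple exchange in which the browser profile discloses that the user is a minor and the Web site operator withholds content it has zoned as harmful to minors. The sketch below is purely hypothetical: no such browser flag or request header exists, and the names used here are invented solely for illustration.

```python
# Hypothetical sketch of Lessig's zoning construct. The "X-Minor-User" header
# and the paths below are invented for illustration; no such standard exists.
HARMFUL_TO_MINORS_PATHS = {"/adult/gallery", "/adult/video"}  # operator-zoned content

def serve(path: str, request_headers: dict) -> str:
    """Deny a minor's profile access to content zoned as harmful to minors."""
    is_minor = request_headers.get("X-Minor-User") == "true"
    if is_minor and path in HARMFUL_TO_MINORS_PATHS:
        return "403 Forbidden: unavailable to minor profiles"
    return "200 OK: content served"

print(serve("/adult/gallery", {"X-Minor-User": "true"}))  # refused for a minor profile
print(serve("/adult/gallery", {}))                        # served when no minor flag is set
```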

In a filtering construct, material is prevented either from being sent or from being received. The PICS model (Platform for Internet Content Selection), which has existed since 1995, facilitates the labeling and filtering of content.65 Web content managers voluntarily fill out a

PICS-created form that embeds a small piece of HTML66 within the Web site that tells browsers what type of content can be found on the site.67 PICS is regarded as neutral because it does not rate anything but rather allows different groups to apply their own ratings. Internet researchers

Paul Resnick and James Miller wrote that PICS “is analogous to specifying where on a package a label should appear, and in what font it should be printed, without specifying what it should

say.”68 PICS was not designed as a policy proposition but rather as a technical solution for rating

content on the Internet69 based on a set of computer industry standards designed to establish a

value-neutral labeling infrastructure for the Internet.70

62 LESSIG, CODE AND OTHER LAWS OF CYBERSPACE, supra note 60, at 176.

63 LESSIG, CODE AND OTHER LAWS OF CYBERSPACE, supra note 60, at 176.

64 Lawrence Lessig & Paul Resnick, Zoning Speech on the Internet: A Legal and Technical Model, 987 MICH. L. REV. 395, 420-21 (1999).

65 See id. at 411-12; LESSIG, CODE AND OTHER LAWS OF CYBERSPACE, supra note 60, at 177.

66 HTML is HyperText Markup Language, the coding language used to create hypertext documents for use on the World Wide Web.

67 See PROTECTING TEENS ONLINE, supra note 49.

68 Paul Resnick & James Miller, PICS: Internet Access Controls Without Censorship, 39 COMMC’NS OF THE ACM 87, 87 (1996).

69 Thomas B. Nachbar, Paradox and Structure: Relying on Government Regulation to Preserve the Internet’s Unregulated Character, 85 MINN. L. REV. 215, 226 (2000).

70 See Werst, supra note 9, at 218.


PICS, the World Wide Web Consortium’s universal protocol for rating and filtering sites

on the Internet,71 is a specification that uses metadata72 to label Web pages in an effort to help

control what minors can access on the Internet. PICS does not provide software73 but rather sets

technical specifications so that ratings from any source will work with all the filtering software.74

Law professor Junichi Semitsu explained that PICS allows variable ratings and ranges: “Instead of merely rating a site ‘adults-only’ or ‘block,’ PICS is multidimensional, allowing for variable ratings, from one to ten for example, under criterion ranging from ‘religious content’ to ‘graphic sex content.’ PICS-compatible filters and browsers can read the labels and use their own filtering criteria to decide whether to block the site.”75
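Because PICS defines only a common label format, the blocking decision itself belongs to whatever filter or browser reads the label. The sketch below is a minimal, hypothetical illustration of that division of labor; the category names, the one-to-ten scale and the threshold values are drawn loosely from Semitsu’s description and are not part of the PICS specification or of any actual rating service.

```python
# Hypothetical PICS-style filtering: a label supplies numeric ratings per
# category; the reader applies its own, locally chosen thresholds.
def should_block(label_ratings: dict, max_allowed: dict) -> bool:
    """Block a page if any labeled category exceeds the reader's threshold."""
    for category, rating in label_ratings.items():
        if rating > max_allowed.get(category, 10):  # unlisted categories default to allowed
            return True
    return False

# A label supplied by the content provider (illustrative 1-10 scale).
page_label = {"graphic sex content": 8, "violence": 2, "religious content": 1}

# Thresholds chosen by a parent, library, or other label consumer.
local_policy = {"graphic sex content": 3, "violence": 5}

print(should_block(page_label, local_policy))  # True: the sex rating (8) exceeds the limit (3)
```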

Prior to PICS, no standard format for labels existed, so companies wanting to control access to material had to both develop the software and provide the labels.76 PICS provides a

common format for labels, so that any PICS-compliant selection software can process any PICS-

compliant label. Under the PICS system, online users can choose to view content based on any

content-ratings system they think would best serve them.77 The ratings are done voluntarily by a

variety of organizations and groups, such as religious organizations, children’s groups and other

71 PICS originally was designed to help parents and teachers control what children access on the Internet. However, it also facilitates other uses for labels, including code signing and privacy. Other rating services and filtering software have been built on the PICS platform. For a complete discussion of PICS, see http://www.w3.org/PICS/. (last visited July 20, 2009).

72 Metadata is encoded data that provides descriptions of information. More simply, it can be defined as data about data or information about content on a Web page.

73 Junichi P. Semitsu, Note, Burning Cyberbooks in Public Libraries: Internet Filtering Software vs. The First Amendment, 52 STAN. L. REV. 509, 517-18 (2000).

74 W3C, PICS Frequently-Asked Questions, http://www.w3.org/2000/03/PICS-FAQ/ (last visited July 20, 2009).

75 Semitsu, supra note 73, at 517.

76 See generally Resnick & Miller, supra note 68, at 87.

77 Nachbar, supra note 69, at 226.


special interest groups.78 For instance, the Christian Coalition could choose to develop a ratings

system that reflected its values, as could the American Civil Liberties Union.79 Although the

PICS architecture is in place, software manufacturers would have to write the necessary code to

filter material80 and organizations would have to actively rate Web sites before users could choose which content to block.

Lawrence Lessig has stated that no technology is ever truly content-neutral and envisions

PICS as having an adverse effect on free speech because it can be installed anywhere in the distribution chain, including at the level of the nation-state, the Internet service provider, the proxy server,81 or the individual user.82 “PICS is a labeling standard that establishes a consistent

way to rate and block online content. PICS doesn't target any particular category of speech.

Instead, private agencies will use PICS to develop their own content rating schemes,” Lessig

wrote.83 He argued that although PICS is viewpoint-neutral, “PICS makes censorship easy

because it embeds the tools of censorship into the root architecture of online publishing.”84

According to Lessig, libraries and schools can prevent patrons from viewing controversial sites,

78 LESSIG, CODE AND OTHER LAWS OF CYBERSPACE, supra note 60, at 177.

79 See Lawrence Lessig, Tyranny in the Infrastructure (July 1997) http://www.wired.com/wired/archive/5.07/cyber_rights.html. (last visited July 20, 2009).

80 LESSIG, CODE AND OTHER LAWS OF CYBERSPACE, supra note 60, at 177.

81 A proxy server is a program that runs on a computer other than the individual user’s machine and is configured to handle a large number of requests. See PETER BUCKLEY & DUNCAN CLARK, THE ROUGH GUIDE TO THE INTERNET 20, 323 (2007).

82 Lessig, Tyranny in the Infrastructure, supra note 79.

83 Lessig, Tyranny in the Infrastructure, supra note 79.

84 Lessig, Tyranny in the Infrastructure, supra note 79.


companies can control what employees can access online, and countries such as China and

Singapore can “clean up” the Internet for their residents.85

Despite the potential downside of PICS, the computer software industry has used the

PICS platform since 1995. Organizations that have relied on PICS include the Software

Publishers Association (SPA), the Entertainment Software Rating Board (ESRB), the

Recreational Software Advisory Council (RSAC) and Internet Content Rating Association

(ICRA). The SPA system applies to computer games, which is beyond the scope of this dissertation. The ESRB, which assigns age-based ratings to video games86 similar to ratings used by the Motion Picture Association of America, also is beyond the scope of this dissertation. Finally,

RSAC, which no longer exists as such, used a content-based system and based its ratings on the extent or level of four criteria: violence, nudity, sex, and language.87 Each of the four criteria was assigned a level from zero to four, based on the Web publisher's evaluation of content, with level four containing the most violence, nudity, sex, or crude language. For example, under the "sex" criterion, Web publishers assigned "zero" to content that contained "romance, no sex" and "four" to content that contained "explicit sexual activity; sex crimes."88

85 Lessig, Tyranny in the Infrastructure, supra note 79.

86 The Entertainment Software Rating Board (ESRB) is a non-profit, self-regulatory body established in 1994 by the Entertainment Software Association (ESA), formerly known as the Interactive Digital Software Association (IDSA). The ESRB commissions consumer research to measure parental agreement with the ratings. See http://www.esrb.org/index-js.jsp. (last visited July 20, 2009). See also Diane Roberts, The Jurisprudence of Ratings Symposium Part I: On the Plurality of Ratings, 15 CARDOZO ARTS & ENT. LJ 105, 113-14 (1997).

87 See Roberts, supra note 86, at 113-14.

88 C. Dianne Martin & Joseph M. Reagle, Jr., An Alternative to Government Regulation and Censorship: Content Advisory Systems for the Internet, available at http://penta2.ufrgs.br/gereseg/censura/rsac/dianne1.htm. Under the sex criterion, a “2” was assigned to “passionate kissing,” a “3” was assigned to “clothed sexual touching,” and a “4” was assigned to “non-explicit sexual activity.” Martin was RSAC president and Reagle was a member of the World Wide Web Consortium (W3C). According to the W3C Web site, the W3C is an international consortium where the W3C staff, member organizations, and the public “work together to develop Web standards.” The mission of W3C is “to lead the World Wide Web to its full potential by developing protocols and guidelines that ensure long-term growth for the Web.” http://www.w3.org/Consortium/ (last visited July 20, 2009).


In 1999, the RSAC was folded into Internet Content Rating Association (ICRA), which

later became a part of the Family Online Safety Institute. The RSAC’s aim, to protect children

from potentially harmful Internet content while preserving free speech, serves as the main focus

of the Family Online Safety Institute.89 The ICRA ratings system, incorporated by the Family

Online Safety Institute,90 noted whether foul language, nudity or sexual content was present. In

addition, the ratings added categories for drugs, alcohol, tobacco and weapons and took into account the context within which such content appeared.91 Content labels generated by ICRA conformed to the industry-wide

PICS standard.92 The ICRA ratings are in contrast to the RSAC system, in which only "levels" of

nudity, sex, violence and language were set. However, ICRA ratings often are not effective

because many sexually explicit sites that are not rated would not be blocked, and if a browser is

set to block all unrated sites, many non-pornographic Web sites, including government sites,

would be blocked as well.93 In addition, ICRA ratings do not work well for user-generated

content, such as YouTube, MySpace and Facebook because ICRA ratings have “great difficulty

distinguishing innocent from harmful content in these highly dynamic environments.”94

Parents can choose to use a combination of PICS and a ratings system based on content to

determine which content will be blocked from the family’s computer system. The issue is more

complex for librarians, who must balance allowing adults’ access to constitutionally protected

89 Recreational Software Advisory Council, http://www.rsac.org/ (last visited July 20, 2009).

90 Internet Content Rating Association, About ICRA, http://www.fosi.org/icra/#glance (last visited July 20, 2009).

91 Digital Chaperones for Kids, CONSUMER REPORTS, March 2001, at 20, 21. See also Internet Content Rating Association, About ICRA, http://www.fosi.org/icra/ (last visited July 20, 2009).

92 See Internet Content Rating Association, About ICRA, http://www.fosi.org/icra/ (last visited July 20, 2009).

93 See Digital Chaperones for Kids, supra note 91 at 20, 21. For the most recent Consumer Reports study conducted in 2005, see infra notes 156-159 and accompanying text.

94 DELOITTE, SAFER INTERNET: SYNTHESIS REPORT, supra note 30, at 11.


speech with protecting minors from sexually explicit material that could be deemed harmful.95

Because filtering software is proprietary, librarians have no way of knowing what content is being blocked.96 Unless they electronically or physically monitor patrons’ Web usage, they do not know whether sexually explicit Web sites made it through the filters.

Internet Filtering in Public Libraries

Most libraries have relied on proprietary filtering software,97 rather than attempt to develop their own software. Librarians can choose protocol blocking (the media of communication, such as social chat rooms, usenet, e-mail, instant messaging), “whitelisting”

(establishing an “allow” list of Web sites that can be accessed), and blacklisting (prohibiting access to Web sites based on the site’s URL or address, the host and/or keywords).98 It is not uncommon for libraries to block access to the Internet via protocol because some uses, such as social chat rooms, e-mail communications and instant messaging, are not necessarily consistent with the library’s mission of providing access to research rather than communication99 or recreation.100 Moreover, protocol blocking is not content-based because all chat rooms or e-mail correspondence would be blocked, regardless of the subject matter.101

95 For a discussion of the mission and role of the public library, see Chapter 2.

96 See Am. Library Ass’n v. United States, 201 F. Supp. 2d 401, 462-64 (E.D. Pa. 2002). See also Legislative Proposals to Protect Children from Inappropriate Materials on the Internet: Hearing on H.R. 3783, H.R. 774, H.R. 1180, H.R. 1964, H.R. 3177, and H.R. 3442 before the Subcomm. on Telecomm., Trade and Consumer Protection of the H. Comm. on Commerce, 105th Cong. 40 (Sept. 11, 1998) (statement of Jerry Berman, executive director for the Center for Democracy and Technology, referring to S.1619, the Internet School Filtering Act).

97 See History and Development of Filters, 40 LIBRARY TECH. REPORTS 8 (March/April 2004).

98 See Peltz, supra note 58, at 403. See also Whitney A. Kaiser, The Use of Internet Filters in Public Schools: Double Click on the Constitution, 34 COLUM. J.L. & SOC. PROBS. 49, 53-55 (2000), and Horowitz, supra note 58.

99 See Peltz, supra note 58, at 404.

100 A study conducted in 2000 indicated that 53% of public libraries had configured their software to block chat sites and 22% had blocked e-mail access. Curry & Haycock, supra note 4, at 42, 47.

101 See Peltz, supra note 58, at 403-04.


Another option for librarians is to choose filtering software that “white lists” Web sites, allowing access to material best suited for a children’s section, which is similar to selection decisions that librarians routinely make.102 However, “white-list” filtering is constitutionally suspect when applied to adults (and possibly to mature minors).103 In addition, the immense size of the Internet, as well as ongoing Web page changes, makes it impossible for humans to assess every Web site, resulting in many Web sites being blocked.104 Another problem with

“whitelisting” can occur when Web sites change ownership or content and post content that is obscene or sexually explicit. Because Web site reviewers typically do not re-review sites that had previously been white-listed, library patrons would have access to the obscene or sexually explicit content.105

Finally, librarians may choose filtering software that blacklists content in one of three ways: by Web site address, host or keywords.106 Under the first option of site blocking, Web sites can be filtered according to their addresses or URLs, with individual human reviewers selecting the blocked sites. The problems that exist with “whitelisting” also exist with site blocking—the size of the Internet makes it impossible for humans to review each Web site, and the fast-changing Internet results in outdated site-blocking decisions. A second method is host blocking, which is similar to Web site blocking. Under host blocking, librarians can block all

102 See Peltz, supra note 58, at 402-03. See also Gregory K. Laughlin, Sex, Lies and Library Cards: The First Amendment Implications of the Use of Software Filters to Control Access to Internet Pornography in Public Libraries, 51 DRAKE L. REV. 213, 272-75 (2003).

103 See Laughlin, supra note 102, at 275. See Chapter 4 for a discussion on balancing adults’ constitutional rights with the protection of minors and Chapter 8 for a discussion of constitutional issues surrounding Internet filtering in public libraries.

104 See Hammami et al., supra note 9, at 272. See also Laughlin, supra note 102, at 275.

105 Am. Library Ass’n v. United States, 201 F. Supp. 2d 401, 435 (E.D. Pa. 2002).

106 See Peltz, supra note 58, at 404; see also Horowitz, supra note 58, at 429-35.


Web sites published by a specific host. Under this system, when a Web host features pornographic content, all Web sites of all users of that host would be blocked, including non-pornographic sites.107 A third option under blacklisting is keyword blocking, which results in the blocking of Web sites containing objectionable words.108
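The whitelisting and blacklisting approaches described above can be reduced to a few list-membership checks. The sketch below is a simplified illustration using assumed, made-up lists; commercial filtering products keep their actual URL, host and keyword lists proprietary, and protocol blocking, which operates on applications such as chat or e-mail rather than on content, is omitted.

```python
# Simplified sketch of whitelist and blacklist filtering decisions.
# All URLs, hosts and keywords below are hypothetical examples.
from urllib.parse import urlparse

ALLOW_LIST = {"www.ala.org", "kids.example.org"}        # whitelisting: only these pass
BLOCKED_URLS = {"http://porn.example.com/index.html"}   # blacklisting by exact URL
BLOCKED_HOSTS = {"porn.example.com"}                     # blacklisting by host
BLOCKED_KEYWORDS = {"xxx", "hardcore"}                   # blacklisting by keyword

def allowed_by_whitelist(url: str) -> bool:
    """Whitelisting: permit only sites on the approved list."""
    return urlparse(url).netloc in ALLOW_LIST

def blocked_by_blacklist(url: str, page_text: str) -> bool:
    """Blacklisting: block by URL, by host, or by keywords found in the page text."""
    host = urlparse(url).netloc
    if url in BLOCKED_URLS or host in BLOCKED_HOSTS:
        return True
    text = page_text.lower()
    return any(keyword in text for keyword in BLOCKED_KEYWORDS)

print(allowed_by_whitelist("http://www.ala.org/issues"))                    # True
print(blocked_by_blacklist("http://news.example.com", "hardcore workout"))  # True
```

The last call also illustrates the overblocking problem discussed below: a crude keyword match treats an innocuous phrase containing a blocked word no differently from pornographic text.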

One problem with all filtering software is that software vendors, rather than librarians, make the filtering decisions because filtering software is proprietary.109 When librarians select

Internet filtering software, they are not able to base their choices on the library’s collection development policies, as they do in acquiring books and other materials.110 In choosing commercial filtering software, librarians would not be able to determine if the filters met their needs because software companies do not disclose their standards.111 Therefore, librarians would have no way of knowing which content has been blocked, as the district court noted in its opinion on the Children’s Internet Protection Act of 2000.112

After the Supreme Court upheld the Children’s Internet Protection Act in 2003, public libraries receiving E-rate and LSTA technology funding were required to install Internet filters

107 See Peltz, supra note 58, at 404-05.

108 See Peltz, supra note 58, at 405-06.

109 Am. Library Ass’n v. United States, 201 F. Supp. 2d 401, 462-64 (E.D. Pa. 2002). See also Legislative Proposals to Protect Children from Inappropriate Materials on the Internet: Hearing on H.R. 3783, H.R. 774, H.R. 1180, H.R. 1964, H.R. 3177, and H.R. 3442 before the Subcomm. on Telecomm., Trade and Consumer Protection of the H. Comm. on Commerce, 105th Cong. 40 (Sept. 11, 1998) (statement of Jerry Berman, executive director for the Center for Democracy and Technology, referring to S.1619, the Internet School Filtering Act).

110 See AM. LIBRARY ASS’N, BEST PRACTICES IN PUBLIC LIBRARIES, available at http://www.ala.org/ala/shadows/pla/resources/bestpractices.cfm. For a discussion of library collection decisions, see supra Chapter 2, pp. 50-52.

111 See Legislative Proposals to Protect Children from Inappropriate Materials on the Internet: Hearing on H.R. 3783, H.R. 774, H.R. 1180, H.R. 1964, H.R. 3177, and H.R. 3442 before the Subcomm. on Telecomm., Trade and Consumer Protection of the H. Comm. on Commerce, 105th Cong. 40 (Sept. 11, 1998) (statement of Jerry Berman, executive director for the Center for Democracy and Technology, referring to S.1619, the Internet School Filtering Act).

112 Am. Library Ass’n v. United States, 201 F. Supp. 2d 401, 462-64 (E.D. Pa. 2002). For a discussion of the court cases deciding the Children’s Internet Protection Act, see Chapter 7.


on all of their computers, not just those funded by the federal government or accessible to

minors.113 With the ever-changing nature of the Web and more than 231 million Web sites in

existence in 2009,114 software providers are unable to keep up with the white list and blacklist

methods.115 The manual classification involved with each method has become “impractical and inefficient,” even for companies that design filtering software, and the classifications would not be accurate.116 While keyword blocking has improved,117 it does not necessarily address the

CIPA, which only deals with “visual depictions,”118 unless descriptions of the images are listed

on the Web page and those descriptions contain keywords that have been blocked.

Filtering Software Studies

At the time Congress passed the Children’s Internet Protection Act in 2000,119 and also

when the Supreme Court was hearing the case in early 2003,120 the available Internet filtering

software programs were much less effective at blocking sexually explicit images than were later

programs. Filtering technology improved from 2003 through 2008,121 but filtering software

113 See United States v. Am. Library Ass’n, 539 U.S. 194, 230-31 (2003).

114 Netcraft, April 2009 Web Server Survey (2009) http://news.netcraft.com/archives/web_server_survey.html. (last visited July 20, 2009).

115 See Hammami et al., supra note 9, at 272.

116 See Hammami et al., supra note 9, at 272.

117 Early filtering software did not consider context, such as naked breasts and breast cancer, but newer software seems to be more effective at blocking the former and not blocking the latter. See infra notes 163-168 and accompanying text for a discussion of WebGuard filtering software and infra notes 184-204 and accompanying text for a discussion of the Deloitte study.

118 Pub. L. No. 106-554, 114 Stat. 2763 (2000) (codified at 20 U.S.C. § 9134(f) and 47 U.S.C. 254 (h)).

119 Id.

120 See United States v. Am. Library Ass’n, 539 U.S. 194 (2003).

121 See DELOITTE, SAFER INTERNET: SYNTHESIS REPORT, supra note 30, at 11. See also HEINS, supra note 58, at 45; Filtering Software: Better, But Still Fallible, CONSUMER REPORTS, June 2005, at 36.


continues to block content that would not be considered sexually explicit or harmful to minors and fails to block content that is sexually explicit or harmful to minors.122

From 2000 to 2002, some of the software programs filtered out many sites that would not be considered pornographic or harmful to minors. During this time period, four Internet filtering research studies were undertaken in the United States, though none focused exclusively on images, which is the sole focus of the Children’s Internet Protection Act.123 The results of all of the four studies showed that, depending on the level of filtering, the software blocked some pornographic material but still let thousands of pornographic Web sites through. The research was conducted for Consumer Reports,124 the School Library Journal,125 the Kaiser

Foundation,126 and the National Research Council.127

In its study of Internet filters in 2000, the same year Congress passed the Children’s

Internet Protection Act, Consumer Reports tested six of the most widely used software

122 See DELOITTE, SAFER INTERNET: SYNTHESIS REPORT, supra note 30, at 11. According to the Deloitte study, “overblocking” occurs when the filter blocks “good content,” and “underblocking” occurs when the filter does not block “unwanted content.” See also HEINS, supra note 58, at 45; Filtering Software: Better, But Still Fallible, CONSUMER REPORTS, June 2005, at 36.

123 Libraries receiving federal funding through the Library Services and Technology Act or the E-rate are required to install a “technology protection measure,” such as filtering software, on all computers connected to the Internet to prevent access by all patrons to “visual depictions” that are obscene, access by all patrons to “visual depictions” that contain child pornography, and access by persons under the age of seventeen to “visual depictions” that are considered “harmful to minors.” See 20 U.S.C. §9134(f)(1) and 47 U.S.C. 254 §(h)(5)(B) & (h)(5)(C) and 47 U.S.C. 254 §(h)(6)(B) & (h)(6)(C). For a discussion of the Children’s Internet Protection Act, see Chapter 7.

124 Consumer Reports is a public interest magazine that rates a variety of consumer products and services, including appliances, automobiles, home and garden equipment, and electronic equipment. http://www.consumerreports.org/cro/index.htm. (last visited July 20, 2009).

125 Librarians Ann Curry & Ken Haycock surveyed librarians and released their findings in 2001.

126 The Kaiser Foundation, which is not affiliated with Kaiser Industries or Kaiser Permanente, is an independent, national health philanthropic organization that provides information and analysis on health issues to policymakers, the media and the general public. For further information, see http://www.kff.org/. (last visited July 20, 2009).

127 The National Research Council (NRC) was organized by the National Academy of Sciences in 1916 to associate the broad community of science and technology with the Academy’s purpose of furthering knowledge and advising the federal government. The Council has become the principal operating agency of both the National Academy of Sciences and the National Academy of Engineering. http://sites.nationalacademies.org/nrc/index.htm (last visited July 20, 2009).


packages,128 as well as America Online’s (AOL) parental controls. The study defined protection as “the ability to block Web sites containing objectionable material, such as sexual content or promotion of crime, bigotry, violence, tobacco, or drugs,” with failure to block at least 35% defined as “poor.”129 The partial blockage of a site—showing images but no words, or words but no images—was considered “less effective” than total blockage.130 When software analysis (the rapid content analysis of a Web site by software) was used, most software tested by Consumer

Reports blocked both words and images on 80% of objectionable sites,131 which contradicted the test run in 1999 on the Basic Artificial Intelligence Routine (BAIR).132 BAIR used artificial intelligence and was touted as being capable of recognizing pornographic images with 99% accuracy.133 However, in one independent test, the BAIR system correctly labeled only two-thirds of a set of pornographic images,134 thus missing one-third of the sexually explicit images.

In another test, the BAIR system failed to block any of fifty pornographic images being tested.135

The Consumer Reports study in 2000 found human analysis and site labeling less effective, with

128 The stand-alone software programs tested were CyberPatrol—version 4, Cybersitter2000, Cyber Snoop, Internet Guard Dog, Net Nanny—version 4, and Norton Internet Security 2001 Family Edition.

129 Digital Chaperones for Kids, supra note 91, at 20, 23. Consumer Reports assigned a “poor” rating to filtering software that failed to block 35% or more of porn sites.

130 Digital Chaperones for Kids, supra note 91, at 20, 23.

131 Digital Chaperones for Kids, supra note 91, at 20-23.

132 See Geoffrey Nunberg, The Internet Farce: Why Blocking Software Doesn’t—and Can’t—Work as Promised, AM. PROSPECT (Jan. 1-15, 2001) at 32. The Basic Artificial Intelligence Routine was developed by Exotrope, a company in Elmira, New York.

133 Id. at 28, 32.

134 Id.

135 See Bennett Haselton, BAIR "Image filtering" Has 0% Accuracy Rate, available at http://www.copa commission.org/papers. Peacefire conducted the test on BAIR and presented its findings to the COPA Commission, which issued information and a final report on the Child Online Protection Act. See Chapter 5 for a discussion of the Child Online Protection Act. Peacefire.org is an anti-filtering organization, which was created in 1996 “to represent the interests of people under 18 in the debate over freedom of speech on the Internet.” See peacefire.org.


the former being time consuming and subject to the Web’s volatility and the latter being dependent on the honesty of the Web site manager.136 The failure to block objectionable sites ranged from 14% (AOL’s Young Teen) to 90% (CyberSnoop).137 While AOL’s Young Teen or

Kids Only setting provided the best protection, the study indicated it would likely restrict access to Web sites addressing political or social issues, too.138 In its summary, Consumer Reports stated that most of the products tested failed to block one objectionable site in five.139

In 2000, the same year that Consumer Reports tested filtering software, two librarians sent a survey to a random sample of School Library Journal subscribers.140 The study found that

96% of both school and public libraries had an Acceptable Use Policy whether or not they filtered access to the Internet.141 A significant percentage of the members of the public and school library staffs understood little about how their filtering software worked, although public librarians were slightly more knowledgeable. Nineteen percent of public librarians lacked information about their library’s keyword blocking, 27% lacked information about site blocking, and 60% lacked information about Web rating systems.142 Because of the confusion surrounding the various levels and ranges of filtering that users could select, the authors reported that the

136 See Digital Chaperones for Kids, supra note 91, at 20, 21.

137 Norton Internet Security 2001 failed to block 20% of objectionable sites, Cybersitter 2000 failed to block 22% of objectionable sites, Cyber Patrol/version 4 failed to block 23% of objectionable sites, Internet Guard Dog failed to block 30% of objectionable sites, and Net Nanny version 4 failed to block 52% of objectionable sites. See Digital Chaperones for Kids, supra note 91, at 20, 23.

138 Digital Chaperones for Kids, supra note 91, at 20, 22.

139 Digital Chaperones for Kids, supra note 91, at 20, 23.

140 In April 2000, surveys were mailed to 2,000 school librarians and 1,000 public librarians. A total of 731 surveys (266 of those from public libraries) were returned, for a response rate of 24%. Curry & Haycock, supra note 4, at 42, 47.

141 Curry & Haycock, supra note 4, at 42, 44.

142 Curry & Haycock, supra note 4, at 42, 45.


results of the study were more informative than conclusive. The study indicated that 55% of

public libraries used vendor-supplied keywords or phrases for keyword blocking, while 15%

added their own words and phrases to the list.143 However, more than half of the librarians did

not know whether they had access to the list of blocked sites.144

In 2002, the Kaiser Foundation’s comprehensive study on how Internet filters affected

the search for online health information found that the Internet filters most frequently used by

schools and libraries can effectively block pornography without significantly impeding access to

online health information—but only if those filters aren’t set at their most restrictive levels.145

The results showed that filters set at higher levels blocked access to a substantial amount of health information, with only a minimal increase in blocked pornographic content.146

Researchers tested the six most commonly used filters at the “least,” “intermediate,” and “most”

restrictive settings.147 At the least restrictive level, the filters incorrectly blocked an average of

1.4% of health sites. When set at the most restrictive level, filters blocked 24% of health sites.

Blocking of sites on sexual health issues, such as condoms and safe sex, was higher at all levels:

from 9% at the least restrictive setting to as much as 50% of all sites at the most restrictive

setting. The amount of pornographic content blocked was found to increase only marginally,

143 Curry & Haycock, supra note 4, at 42, 45.

144 Curry & Haycock, supra note 4, at 42, 45, 47.

145 VICTORIA RIDEOUT ET AL., KAISER FAMILY FOUNDATION, SEE NO EVIL: HOW INTERNET FILTERS AFFECT THE SEARCH FOR ONLINE HEALTH INFORMATION 1, 6-9 (2002), available at http://www.kff.org/entmedia/20021210a- index.cfm.

146 Id.

147 The study was conducted for the Henry J. Kaiser Family Foundation by Dr. Caroline Richardson of the University of Michigan Medical School, Dr. Paul Resnick at the University of Michigan School of Information and Victoria Rideout, MA, with the results published in the December 11, 2002 issue of The Journal of the American Medical Association (JAMA). See Caroline Richardson et. al., Does Pornography-Blocking Software Block Access to Health Information on the Internet? 288 JAMA 2887-2894 (Dec. 11, 2002).


from 87% at the least restrictive configuration to 91% at the most restrictive level.148 Meanwhile,

researchers found that incidental exposure to online pornography during health information

searches did not appear to be a “substantial problem” and “filters can reduce but not eliminate

such incidental exposure.”149

In a 2002 report on youth, pornography and the Internet, a National Research Council

committee concluded that all filters both blocked protected material (a false positive) and failed

to block material that could be considered harmful to minors (a false negative).150 The committee

reported that filters could be highly effective in reducing minors’ exposure to “inappropriate

content” if librarians and teachers were willing to accept the inaccessibility of large amounts of

appropriate and acceptable material.151 The committee stated that an effective framework for protecting children from “inappropriate” online materials would require a “balanced composite” of technical, legal, economic and educational approaches.152

A 2004 PC Magazine review that rated seven software filtering packages did not assign

“excellent”153 ratings to any of the filtering software packages but did rate two of the seven as

“very good.”154 The study found that none blocked all “inappropriate content.”155 By 2005,

148 RIDEOUT, supra note 145, at 6-9. See also Richardson et al., supra note 147, at 2887-2894.

149 RIDEOUT, supra note 145, at 1, 7 (2002). See also Richardson et al., supra note 147, at 2887-94.

150 See Youth, Pornography and the Internet, supra note 34, Sec. 12.1 at 275-80. The committee studied tools and technology for protecting children from online pornography and other “inappropriate” content on the Internet.

151 See Youth, Pornography and the Internet, supra note 34, Executive Summary at 10.

152 See Youth, Pornography and the Internet, supra note 34, Executive Summary at 13.

153 Jay Munro, Filtering Software, PC MAGAZINE, Aug. 3, 2004, at 103.

154 The reviewers had five scores from which to choose: “poor”, “fair”, “good”, “very good” and “excellent.” Reviewers gave “very good” ratings to Cybersitter 9.0 and Net Nanny 5.0. PC Magazine also evaluated CyberPatrol 6.2, EnoLogic NetFilter Home 3.0, iProtectYou Pro Web Filter 6.03, Norton Parental Control, and Safe Eyes Platinum. Id. at 102.

155 Id. at 103.


filtering technology had become better at blocking pornography but was still “fallible,”

according to a 2005 Consumer Reports article,156 the most recent year the magazine conducted a study. Consumer Reports evaluated eleven filtering software packages157 as “very good” to

“excellent” at keeping out most pornography,158 but the best filters also blocked sites containing

information on health issues, sex education, civil rights and politics.159

In a 2006 public policy report prepared for the Brennan Center for Justice at New York

University School of Law, researchers concluded that while filtering software studies from 2001

through 2005 tended to be less anecdotal and more statistical,160 "filters continue to block large amounts of valuable information."161 The authors reported that even a 1% error rate can result in

millions of sites being erroneously blocked.162

A filtering technology study conducted in 2006 combined textual, structural and visual

content analysis. A team of computer scientists developed a system called WebGuard, which

they described as “an automatic, machine learning-based pornographic Web site classification

and filtering system.”163 Visual content analysis in combination with “skin color modeling”

156 See Filtering Software: Better, But Still Fallible, CONSUMER REPORTS, June 2005, 36.

157 Consumer Reports reviewed SafeEyes 2005, Microsoft Parental Controls 9.1, CyberPatrol 7.0, Norton Internet Security 2005, McAfee Privacy Service 2005/Version 7, CyberSitter 9.4, AOL Parental Controls 9.0, ContentBarrier X 10.1, Net Nanny 5.1, iProtectYou Pro 7.1 and KidsNet. Id. at 38.

158 Id. at 36. The reviewers had five scores from which to choose: “poor”, “fair”, “good”, “very good” and “excellent.”

159 Id. at 36-37.

160 HEINS, supra note 58, at 45.

161 HEINS, supra note 58, at 73.

162 HEINS, supra note 58, at 46.

163 Hammami et al., supra note 9, at 272.


distinguished this filtering program from previous ones.164 The software relies on text, including

300 keywords extracted from six languages,165 visual images and "skin color modeling."166 The researchers reported that WebGuard's effectiveness ranged from 95.6% to 97.4% in correctly classifying Web sites as pornographic or non-pornographic.167 For one study, the researchers selected 200 adult sites and 200 non-pornographic sites, and for another study, they used 12,311 adult Web sites manually classified by the French Ministry of Education.168
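Skin color modeling of the kind described in note 164 can be approximated, in its simplest form, by classifying each pixel as skin-toned or not and flagging images whose skin-toned fraction exceeds a threshold. The sketch below illustrates only that basic idea, using the Pillow imaging library; the color rule and threshold are arbitrary assumptions, and WebGuard's actual classifier combined visual cues of this sort with textual and structural analysis and machine learning.

```python
# Crude illustration of skin-color-based image screening; NOT WebGuard's
# algorithm. The RGB rule and the 40% threshold are arbitrary assumptions.
from PIL import Image  # Pillow imaging library

def looks_like_skin(r: int, g: int, b: int) -> bool:
    """Very rough rule of thumb for skin-toned pixels in RGB space."""
    return r > 95 and g > 40 and b > 20 and r > g and r > b and abs(r - g) > 15

def skin_ratio(path: str) -> float:
    """Fraction of the image's pixels that match the skin-tone rule."""
    image = Image.open(path).convert("RGB")
    pixels = list(image.getdata())
    skin_pixels = sum(1 for r, g, b in pixels if looks_like_skin(r, g, b))
    return skin_pixels / len(pixels)

def flag_for_review(path: str, threshold: float = 0.40) -> bool:
    """Flag the image if skin-toned pixels exceed the threshold."""
    return skin_ratio(path) > threshold

# Usage with a hypothetical file name:
# print(flag_for_review("page_image.jpg"))
```

A ratio alone obviously cannot distinguish a swimsuit or underwear advertisement from pornography, which is why WebGuard combined this kind of visual signal with textual and structural features.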

In 2007, the U.S. District Court for the Eastern District of Pennsylvania stated that

Internet filtering technology had improved.169 In reviewing testimony provided in 2006, the court said that filtering software could block any Internet application, including Web sites, e-mail, chat rooms, instant messaging, peer-to-peer file sharing, newsgroups, streaming video and audio, Internet television and voice over Internet protocol ("VoIP").170 Also by 2006, the court said that filtering software designers had developed "dynamic" filtering products, which used artificial intelligence to screen content in "real-time" by analyzing Web site content as it was

164 “Skin color modeling” is a way of looking at skin-color pixels to determine if an image is likely to be pornographic or not, with the goal of distinguishing, for example, health related and underwear advertisements from sexually explicit images. Hammami et al., supra note 9, at 273-77.

165Hammami et al., supra note 9, at 283. The authors did not clarify the keyword classification.

166 Hammami et al., supra note 9, at 273-77.

167 Hammami et al., supra note 9, at 283.

168 Hammami et al., supra note 9, at 272-73. The Web sites chosen had been black listed by the French Ministry of Education.

169 ACLU v. Gonzales, 478 F. Supp. 2d 775 (E.D. Pa. 2007). The court’s discussion of filtering technology was based on witnesses’ testimony delivered in 2006. Id. at 795-97. In Gonzales, the court struck down the Child Online Protection Act, stating that it violated the First Amendment. Id. at 821. The Child Online Protection Act would have prohibited Web sites from “knowingly” distributing, for “commercial purposes,” any material deemed “harmful to minors” to persons under the age of seventeen. If found guilty, the Web site operator could be fined up to $50,000, imprisoned for up to six months, or both. See 47 USCS § 231 (a) and (e) (7). In striking down the Child Online Protection Act, the court wrote, “Although filters are not perfect and are prone to some over and under blocking, the evidence shows that they are at least as effective, and in fact, are more effective than COPA in furthering Congress' stated goal (of preventing minors from accessing online material deemed “harmful to minors.”) Gonzales, 478 F. Supp. 2d at 815. For a discussion of the Child Online Protection Act and the cases deciding the Act, see Chapter 5.

170 Gonzales, 478 F. Supp. 2d at 791.


being requested. 171 The filters could evaluate different parts of the content to determine whether

it should be blocked,172 based on categories of material that the computer user or parent wanted

blocked, such as sexually explicit content.173

Dynamic filters evaluate content the user can see, as well as content the user cannot

see.174 The filters analyze a number of components on the Web page, including words on the

page, the file names for images, the size of images, the links on a page, the formatting of the

page, and the URLs. The filtering products analyze “statistical pattern recognition features,” such

as the spatial patterns between certain words and images, which often can help filters categorize

content even if the actual words are not recognized. The filters also evaluate “metadata,” the

hidden information contained in the software code. Dynamic filters consider the context of the

page in an effort to determine if the filtering analysis is accurate in blocking access to content.175

“Many companies” developed templates to allow the software to recognize context. For example,

the word “breast” would be blocked when used with the word “sexy,” but not with the words

“chicken” or “cancer.”176
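The court's "breast" example can be illustrated with a small context rule: a sensitive word triggers blocking only when it appears near words on a suspect list and not near words on an innocuous list. The word lists and window size below are illustrative assumptions rather than any vendor's actual template, and commercial dynamic filters weigh many more signals than this.

```python
# Toy illustration of context-sensitive keyword filtering with made-up word
# lists; real "dynamic" filters combine many more signals than shown here.
SENSITIVE = {"breast"}
SUSPECT_CONTEXT = {"sexy", "nude", "xxx"}
INNOCENT_CONTEXT = {"cancer", "chicken", "feeding", "exam"}
WINDOW = 5  # neighboring words examined on each side of a sensitive word

def block_page(text: str) -> bool:
    """Block only when a sensitive word appears in a suspect, non-innocent context."""
    words = text.lower().split()
    for i, word in enumerate(words):
        if word in SENSITIVE:
            neighbors = set(words[max(0, i - WINDOW):i] + words[i + 1:i + 1 + WINDOW])
            if neighbors & SUSPECT_CONTEXT and not neighbors & INNOCENT_CONTEXT:
                return True
    return False

print(block_page("sexy breast photos"))             # True: suspect context
print(block_page("breast cancer screening guide"))  # False: innocent context
print(block_page("grilled chicken breast recipe"))  # False: innocent context
```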

Many filtering packages allowed parents to block access to content based on age groups.

For example, the district court said that AOL's filtering product enabled parents to choose from four different age settings: “kids only,” “young teen,” “mature teen,” and “general” (unrestricted

171 Id. at 790-91.

172 Id.

173 Id. at 795.

174 Id. at 790-91.

175 Id.

176 Id. at 791.


access).177 Parents also could set up individual accounts in the filtering program to block

different content for each member of the family.178 Software products would allow parents to

choose a “more restrictive setting” for younger children, which would block 95 to 99% of

categories of content parents might want to block, including pornography.179

The district court stated that filtering programs contained “built-in mechanisms to prevent

children from bypassing or circumventing the filters,” such as password protection devices and

tamper detection devices. When a computer user, such as a minor, tried to uninstall or disable the

filtering software, the filter could be configured to cut off all Internet access until someone, such

as a parent, reconfigured the software.180

The district court stated that the “vast majority” of filtering software products blocked “at

least 95 percent" of "sexually explicit pages,"181 a percentage that Professor Cheryl Preston said

the district court saw as acceptable. However, Preston argued that filters blocking 95% of

sexually explicit content would allow access to as many as thirty-five million pages of sexually

explicit content.182 At the same time, filters continued to block constitutionally-protected speech, she said.183

177 Id. The court said that AOL defined “kids” as those under the age of twelve, “young teens” as thirteen- to fifteen-years-old, and “mature teens” as sixteen-years-old. Twelve-year-olds were not listed in either category. Id. The Deloitte study, conducted for the European Union in 2008, used two age categories when evaluating filtering products: ten and under and eleven to sixteen. See DELOITTE, SAFER INTERNET: SYNTHESIS REPORT, supra note 30, at 12, infra notes 189-190 and accompanying text.

178 Gonzales, 478 F. Supp. 2d at 791.

179 Id. at 795. The percentages of content blocked varied by filtering product. Id. A less restrictive setting, which the court said could be used by parental discretion for teenagers, could block 90% of pornographic content.

180 Id.

181 Id. at 796.

182 Cheryl Preston, Zoning the Internet: A New Approach to Protecting Children Online, 2007 B.Y.U. L. REV. 1417, 1450 (2007). Preston said that the district court accepted testimony that estimates of sexually explicit pages in 2006 ranged from 275 million to 700 million. If filtering software failed to block 5% of the 700,000,000 sexually explicit pages, the software would allow 35,000,000 sexually explicit pages through. Id. at 1450, n. 193. But see How Many


In a 2008 study conducted by Deloitte for the European Union, the researchers reported that the accuracy of Internet filters improved when compared to filters evaluated in 2006 and

2007.184 The researchers evaluated the twenty-six filtering products that they said were the most used, but they did not include WebGuard in their study.185 A graph included in the Deloitte study indicated that the better the filtering software was at blocking access to "harmful" material,

Web Sites Exist? (Feb. 15, 2007) http://www.boutell.com/newfaq/misc/sizeofweb.html. (last visited July 20, 2009). In 2007, about 29.7 billion pages existed, according to the latest estimate released by boutell.com. Id. If 1% of Web pages are estimated to be pornographic, then 297,000,000 Web pages would be pornographic in 2007. If filters failed to block 5% of those pages, about 14.8 million pornographic pages would still be accessible. However, that number is less than half the number that the federal district court accepted as fact during the trial for the Child Online Protection Act. See ACLU v. Gonzales, 478 F. Supp. 2d 775, 788 (E.D. Pa. 2007) (holding unconstitutional a statute providing both criminal and civil penalties for Web site owners transmitting sexually explicit materials and communications on Internet sites available to minors.) For a discussion of the Child Online Protection Act, see Chapter 5. According to Preston, 14.8 million pornographic pages is still a high number that would not be blocked. See Preston at 1450. If the computer user adjusted the filtering software to a more restrictive setting, and all but 1% of pornographic content was blocked, about 2.9 million pornographic Web pages would be accessible.

183 Preston, supra note 182, at 1451. Preston argued in favor of Internet zoning, in place of filtering. Zoning would separate, rather than block, content. Id. at 1432. In a zoning construct, different content would be assigned to different ports, similar to the channel line-up on cable television. Id. at 1429. She said that more than 65,000 ports exist in cyberspace, but only ten to twenty of the ports are used. Id. at 1427. Currently all Web content uses one port. Id. at 1431. Under the zoning system, Web site operators with content that was pornographic or “harmful to minors” would add a free programming code to their server, and the content would be assigned to the “adult” port. Id. at 1433, 1468-69. Content that was not pornographic or harmful to minors would be accessible on “community ports.” Id. at 1431-34. Parents could then decide which ports they wanted to block from their homes. Id. at 1437. Preston said that Congress would need to pass legislation to enforce a zoning construct. Id. at 1434-36. Preston did not discuss how Web site operators would determine if material was “harmful to minors.”

184 DELOITTE, SAFER INTERNET: SYNTHESIS REPORT, supra note 30, at 4 (2008). The authors also reported that filters were easier to operate in 2008. The researchers evaluated twenty-six filtering products: AOL Parental Controls, Blueprint Data Kidsnet, Cogilab Surfpass 4, Computer Associates Internet Security Suite 2008, Computer Associates Secure Content Manager, Easybits Magic Desktop, Editions Profil Parental Filter, F-Secure Internet Security 2008, Intego Internet Security Barrier X5 (Platinum Edition), Internetsafety.com Ethershield, Internetsafety.com SafeEyes, McAfee Security Suite, McAfee Total Protection, Microsoft Vista Ultimate, MicroWorld eScan Internet Security Suite, Norman Security Suite, Open Source DansGuardian, Open Source Poesia, Optenet Internet Security Suite (Appliance), Optenet Web Filter PC, Point Clark Networks Clark Connect, Smoothwall School Guardian 2008, SoftForYou Cyberserve, Symantec Norton Internet Security Suite 2008, Telocator Brightfilter, and Trend Micro Internet Security Suite 2008. DELOITTE, SAFER INTERNET: SYNTHESIS REPORT, supra note 30, at 12.

185 The Deloitte researchers chose the most commonly used filtering software. They did not identify the “best” program, stating that the goal of the study was not to provide guidance to prospective customers, but rather to “benchmark the main functionalities” of the most used filtering software. DELOITTE, SAFER INTERNET: SYNTHESIS REPORT, supra note 30, at 2.

148

the more it also blocked access to “harmless” material.186 For example, if 80% of

“bad” content was blocked, nearly 30% of “good” content was blocked. If 90% of “bad” content was blocked, nearly 70% of “good” content was blocked.187
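This tradeoff can be illustrated with a simple score-and-threshold model of how a content filter classifies pages. The sketch below is purely illustrative and is not the Deloitte methodology; the suspicion scores, threshold values and page lists are hypothetical, chosen only to show why lowering the threshold to catch more of the harmful pages also sweeps in more of the harmless ones whenever the scores of the two classes overlap. The resulting percentages differ from the Deloitte figures; only the shape of the tradeoff matters here.

# Hypothetical suspicion scores a filter might assign (0 = clearly harmless, 1 = clearly harmful).
bad_pages = [0.95, 0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.35, 0.3, 0.2]    # truly harmful pages
good_pages = [0.05, 0.1, 0.15, 0.2, 0.3, 0.35, 0.4, 0.5, 0.6, 0.7]  # truly harmless pages

def block_rates(threshold):
    # Return the share of bad pages and of good pages blocked at a given threshold.
    blocked_bad = sum(score >= threshold for score in bad_pages) / len(bad_pages)
    blocked_good = sum(score >= threshold for score in good_pages) / len(good_pages)
    return blocked_bad, blocked_good

for t in (0.6, 0.4, 0.3):
    bad, good = block_rates(t)
    print(f"threshold {t}: blocks {bad:.0%} of bad pages and {good:.0%} of good pages")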

The researchers stated that filtering software programs released in 2008 did a better job at blocking access to pornography sites than past versions of the software.188 The “best performing” filtering program in 2008 scored a 4.0, the top score, for blocking pornography from children ten and under, and a 3.5 out of 4.0 for blocking pornography from minors aged eleven to sixteen.189 In contrast, when all twenty-six filtering tools were included, the “average” score

186 DELOITTE, SAFER INTERNET: SYNTHESIS REPORT, supra note 30, at 5. The problem of failing to block “bad” content was labeled “underblocking.” The problem of erroneously blocking “good” content was labeled “overblocking.”

187 DELOITTE, SAFER INTERNET: SYNTHESIS REPORT, supra note 30, at 5. The European Union funded a study on the effectiveness of filtering software, which was conducted in 2008 by Deloitte. The study evaluated filtering programs’ ability to block text or pictures in eight categories, with each category containing five to eleven subcategories. Content was labeled “bad” if text or pictures 1) “could impair” minors’ moral or social development, and “may have a traumatising [sic] effect on youngsters,” such as “blood scenes or scenes with seriously injured people” and “pictures and stories about cruelty to animals”; 2) “could impair” minors’ sexual development, and “may have a traumatising [sic] effect on youngsters,” such as “sex accompanied by pain, injury or humiliation” and “hardcore sex, ejaculation, erection, defecation, urination, bestiality, [and] necrophilia”; 3) “could impair” minors’ emotional and mental development, and “may have a traumatising [sic] effect on youngsters,” such as “coarse language” and “scenes that confuse facts and fiction with respect to violence”; 4) “could instigate damage to another,” and “may encourage youngsters to commit acts of rape or harassment,” including “inspiring snuff videos that show (played or reality) rape,” “games that simulate rape or harassment” and “positioning rape, torture, sadistic violence, [or] terrorism as ‘cool’”; 5) “could instigate damage to another’s life,” and “may encourage youngsters to commit acts of murder or terrorism,” such as “inspiring snuff videos that show (played or reality) murder,” “stories that convince youngsters to solve their problems by mass murder,” “games that simulate murder or terrorism” and “video of actual acts of terrorism”; 6) “could instigate damage to another’s freedom and rights,” and “may encourage youngsters to commit acts of strike, sabotage, hacking, theft, disclosure, or racism,” such as “calls to youngsters for sabotage or strikes,” “stories that claim hacking or theft is harmless,” “calls to disclose information about the parents,” and “stories that claim other races are inferior”; 7) “could instigate damage to him/herself,” and “may encourage youngsters” to abuse drugs, meet strangers or gamble, such as “stories that claim that medicines or drugs are harmless,” “calls to join a religious sect,” “invitations to meet a person they only know remotely,” and “invitations to gamble or bet”; 8) “could instigate damage to his/her life,” and “may encourage youngsters to death kicks or to commit suicide or abortion,” such as “stories that convince youngsters to solve their problems by committing suicide,” “stories, pictures and videos of death races or other death kicks,” and “stories to solve unplanned pregnancy by self-made abortion.” DELOITTE ENTERPRISE RISK SERVICES, SAFER INTERNET: TEST AND SCORING METHODOLOGY at 9-13 (2008), available at http://www.sip-bench.eu/Reports2008/sip_bench_2008_methodology_report_en.pdf.

188 DELOITTE, SAFER INTERNET: SYNTHESIS REPORT, supra note 30, at 6, 9, 31.

189 DELOITTE, SAFER INTERNET: SYNTHESIS REPORT, supra note 30, at 12. The researchers did not identify the “best” program, stating that the goal of the study was not to provide guidance to prospective customers, but rather to

149

for blocking pornography was lower: a 2.8 for children ten and under, and a 2.8 for eleven- to sixteen-year-olds.190 The researchers said that software vendors were having a harder time keeping up with all of the changes in pornography sites than in keeping up with changes in dating, drug, and hatred sites.191 The researchers also said that filtering software designed in

2008 was not as effective at blocking access to pornography sites established in 2006 as it was at blocking access to pornography sites established in 2007 and 2008.192

Because of the problem of erroneously blocking “harmless” content in the eight categories included in the study, the researchers gave the “best performing” software program only a score of 2.5 out of 4.0, or a “fairly good” rating, in 2006, 2007 and 2008.193 The researchers faulted the choice of keywords used to block content, stating that the filtering products provided “little or no clarity on the type of words that trigger the content to be

“benchmark the main functionalities” of the most used filtering software. Id. at 2. The values assigned to the same rating were different for the two age groups studied, with more restrictive parameters used for younger children. For children six to ten, a 4.0 rating meant that one out of thirty-two “bad” (pornographic) pages was not blocked or one out of eight “good” (non-pornographic) pages was blocked. For children eleven to sixteen, a 4.0 rating meant that one out of sixteen “bad” (pornographic) pages was not blocked or one out of sixteen “good” (non-pornographic) pages was erroneously blocked. The “best” filtering package for eleven- to sixteen-year-olds scored a 3.5, meaning that one out of twelve “bad” (pornographic) pages was not blocked or one out of twelve “good” (non-pornographic) pages was erroneously blocked. Id. at 27-29. See supra note 187 for the list of content categories used in the study.

190 DELOITTE, SAFER INTERNET: SYNTHESIS REPORT, supra note 30, at 12. For children six to ten, a 2.8 rating meant that about one out of sixteen “bad” pages was not blocked or about one out of four “good” pages was blocked. For children eleven to sixteen, a 2.8 rating meant that about one out of eight “bad” pages was not blocked or one out of eight “good” pages was erroneously blocked. Id. at 27-29.

191 DELOITTE, SAFER INTERNET: SYNTHESIS REPORT, supra note 30, at 31. The researchers did not explain why filtering programs had a harder time keeping up with changes on pornographic Web sites. But see infra notes 196 to 197 and accompanying text for a discussion of the problems with Web sites containing user-generated content.

192 DELOITTE, SAFER INTERNET: SYNTHESIS REPORT, supra note 30, at 31. The researchers did not discuss why the filtering software was not as effective at detecting the pornography sites established in 2006 as it was at detecting sites established in 2007 and 2008.

193 DELOITTE, SAFER INTERNET: SYNTHESIS REPORT, supra note 30, at 27. The rating evaluated all the defined “harmful” content, and not just pornography. For the list of categories, see supra note 187. The value assigned to a 2.5 rating was different for the two age groups studied. For children six to ten, a 2.5 rating meant that one out of sixteen “bad” pages was not blocked or three out of eight “good” pages were blocked. For children eleven to sixteen, a 2.5 rating meant that one out of six “bad” pages was not blocked or one out of six “good” pages was erroneously blocked.

150

blocked.”194 The study also showed that individual users of a computer could not customize the

filtering software, so the same filtering criteria were applied whether the user was a young child,

teen-ager, parent or teacher.195

In contrast to other researchers’ studies on filtering software, the Deloitte group

addressed the effects of user-generated online content on the effectiveness of filtering software

programs. Because Internet users can post content on interactive sites instantaneously, filtering

software vendors cannot keep up with the changes.196 For example, Internet users can post video

clips on YouTube and MySpace, or post “harmful scenes” in Second Life.197

The Deloitte group recommended that filtering software vendors develop a product that

would allow the product’s end users to classify content.198 The Deloitte researchers stated that a

“filtering policy” or model would need to be developed that would include a standard set of

criteria that the users would be able to evaluate.199 The researchers said that a user-based model

needed to be implemented so that Internet users would not rely solely on the software vendors’

assessment of content.200 “Parents and teachers should be able to select and combine those

194 DELOITTE, SAFER INTERNET: SYNTHESIS REPORT, supra note 30, at 35. The researchers did not elaborate further or provide examples.

195 DELOITTE, SAFER INTERNET: SYNTHESIS REPORT, supra note 30, at 35.

196 DELOITTE, SAFER INTERNET: SYNTHESIS REPORT, supra note 30, at 14.

197 DELOITTE, SAFER INTERNET: SYNTHESIS REPORT, supra note 30, at 14. YouTube is a Web site that allows visitors to post videos. See YouTube.com. MySpace is a Web site that allows users to set up a profile and post information, photos and videos. See myspace.com. Second Life is a 3D, virtual community, where visitors design avatars to interact with other users’ avatars. See http://secondlife.com/ (last visited July 20, 2009). A major social networking site not mentioned in the study is Facebook, which is similar to MySpace. See facebook.com. For a description of “harmful” content and a list of examples used in the Deloitte study, see supra note 187.

198 DELOITTE, SAFER INTERNET: SYNTHESIS REPORT, supra note 30, at 5, 15. Deloitte defined end users as people who are using Internet filters to access or block online content.

199 DELOITTE, SAFER INTERNET: SYNTHESIS REPORT, supra note 30, at 33. The researchers did not list examples of criteria. However, for the criteria used in their study, see supra note 187.

200 DELOITTE, SAFER INTERNET: SYNTHESIS REPORT, supra note 30, at 5.

151

criteria that best suit the needs of their own children and not have to assume that ‘the vendor

knows what is best for their children,’” the researchers wrote.201

User-based software classification would involve three steps, according to the Deloitte

study. First, individual users would classify or rate content against a number of standard or

“agreed upon” criteria.202 Second, “like-minded people,” such as parents and teachers, could

select the criteria most important to them and block content they did not want their children or

students to view.203 Third, users accessing others’ classifications would be able to rate the

accuracy of the classifications, similar to how consumers rate online vendors, thus attaching a

“confidence rating” to the user who originally classified the content.204
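One way to picture how these three steps might fit together is the following minimal sketch in Python. The criteria names, data structures, confidence values and weighting rule are hypothetical illustrations added here for clarity; the Deloitte report describes the steps only in general terms and does not specify an implementation.

# Step 1: individual users classify content against a set of agreed-upon criteria.
classifications = [
    {"url": "http://example.org/page1", "rater": "alice",
     "criteria": {"pornography": True, "violence": False}},
    {"url": "http://example.org/page1", "rater": "bob",
     "criteria": {"pornography": False, "violence": False}},
]

# Step 3: other users rate each rater's past accuracy, producing a per-rater confidence rating.
confidence = {"alice": 0.9, "bob": 0.4}

def blocked(url, selected_criteria, threshold=0.5):
    # Step 2: a parent or teacher selects the criteria that matter to them; a page is blocked
    # when the confidence-weighted vote that it matches any selected criterion exceeds the threshold.
    for criterion in selected_criteria:
        votes = [c for c in classifications if c["url"] == url]
        if not votes:
            continue
        weight_yes = sum(confidence[v["rater"]] for v in votes if v["criteria"].get(criterion))
        weight_all = sum(confidence[v["rater"]] for v in votes)
        if weight_all and weight_yes / weight_all > threshold:
            return True
    return False

print(blocked("http://example.org/page1", ["pornography"]))  # True: the higher-confidence rating prevails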

Conclusion

During the first decade of the twenty-first century, researchers have reported that while

Internet filtering technology can be effective in blocking access to a large percentage of pornography, it is not a panacea.205 In 2003, the American Library Association stated that filters

still were not sophisticated enough to distinguish pornography from art or literature.206 In 2006,

WebGuard’s textual and visual content analysis features, including the use of skin color modeling to distinguish pornography from non-pornographic visuals, were an improvement.

201 DELOITTE, SAFER INTERNET: SYNTHESIS REPORT, supra note 30, at 7.

202 A family, school or other group could determine a set of criteria, according to the researchers. DELOITTE, SAFER INTERNET: SYNTHESIS REPORT, supra note 30, at 5, 15. See supra note 187 for a list of the categories used in the Deloitte study.

203 DELOITTE, SAFER INTERNET: SYNTHESIS REPORT, supra note 30, at 5, 7.

204 DELOITTE, SAFER INTERNET: SYNTHESIS REPORT, supra note 30, at 15.

205 See RIDEOUT et al., supra note 145, at 1, 6-9; Filtering Software: Better, But Still Fallible, CONSUMER REPORTS, June 2005, at 36; HEINS, supra note 58, at ii; DELOITTE, SAFER INTERNET: SYNTHESIS REPORT, supra note 30, at 10, 27. See also Peltz, supra note 58, at 478-79. See Chapter 8 for an analysis of the technical and legal aspects of Internet filtering in the public library setting.

206 See Am. Library Ass’n, Libraries & the Internet Toolkit (last updated Dec. 1, 2003), at 10, http://www.ala.org/ala/aboutala/offices/oif/iftoolkits/litoolkit/default.cfm (last visited July 20, 2009).

152

However, WebGuard was in the testing stage in 2006, and no further studies have been released

to determine its effectiveness on more recent content.

In 2008, the Deloitte research team stated that filtering technology had improved since

2006, especially in blocking pornography. Nonetheless, its report showed that while a less

restrictive filter setting would allow access to more “good” content, the filter would not block as

much “bad” content. Conversely, a more restrictive setting would block more “bad” content, but

at the same time erroneously prevent access to more “good” content.207 The Deloitte researchers

also found a problem with keyword blocking because the filtering software designers did not

clarify how keywords were selected and used to block content.208 They suggested that Internet filters would be more effective if software designers developed a way for the products’ users to classify content according to family, religious or social group values.209

Meanwhile, anecdotal reports on the use of filtering programs in K-12 classrooms

suggest problems with relying on proprietary software, whether in schools or public libraries.

For example, some students in 2005 were unable to conduct research in school because the

filtering programs blocked Web sites on their topics, including the Motion Picture Association of

America’s movie rating system, book banning, smoking, medicinal uses of marijuana, sexually

transmitted diseases, and the dangers of illegal drug use.210 That same year, reporter Lauren

207 DELOITTE, SAFER INTERNET: SYNTHESIS REPORT, supra note 30, at 10. For a list of the types of content measured in the study, see supra note 187.

208 DELOITTE, SAFER INTERNET: SYNTHESIS REPORT, supra note 30, at 35. The researchers did not elaborate further or provide examples.

209 DELOITTE, SAFER INTERNET: SYNTHESIS REPORT, supra note 30, at 5, 15.

210 See HEINS, supra note 58, at 70; Lauren Barack, Filters Impede Learning, 51 SCH. LIBRARY J. 24, 24 (2005); Rebecca Meeder, Access Denied: Internet Filtering Software in K-12 Classrooms, 49 TECHTRENDS 56, 56 (2005).

153

Barack stated that some school librarians had used the default setting on filtering software

without exploring whether that setting was the most appropriate.211

In 2008, students were still experiencing problems in conducting online research. High

school students in the Midwest complained that all Google search results and all blogs were

blocked.212 A middle school teacher in the Midwest, who had assigned animal research papers,

reported that the San Diego Zoo and University of Michigan Zoological Museum were blocked.

The teacher also reported that a student could not access a map of Canada when trying to do a social studies assignment.213 A Wisconsin library media specialist reported that students doing

research on their favorite guitar players in 2008 found everything about their musicians

blocked.214

Before the CIPA was enacted, Congress and the courts had grappled with how to protect

minors from sexually explicit material, both online and in other media, while at the same time ensuring that adults had access to constitutionally-protected sexual material. The solution settled on by Congress may be less than satisfactory for many regulators and critics of the law, parents and children, and librarians and teachers. One fascinating part of the debate about how to protect children from sexually explicit materials is whether there is any evidence that exposure to the content seen on the Internet harms children, and if so, at what age? The assumption of harm by many seems so intuitive that little attention is paid to the fact there is some available relevant research, research to be discussed in the next chapter.

211 See Barack, supra note 210, at 24.

212 Helen R. Adams, Filters and Access to Information, Part II, 25 SCH. LIBRARY MEDIA ACTIVITIES 54, 54 (October 2008).

213 Id. at 55.

214 Id.

154

CHAPTER 4 THE PROTECTION OF MINORS FROM MATERIAL DEEMED HARMFUL

Introduction

The concern with protecting children from sexually explicit material is not new.

However, the concepts of childhood and what is considered harmful have vacillated during the

course of several centuries. In the United States, statutory and case law generally support the

role of parents to raise their children as they see fit, although the government has intervened on

numerous occasions to protect the physical, psychological and emotional well-being of minors, including through the regulation of sexually explicit content, such as pornography.

For the purposes of this dissertation, pornography is defined as follows: nonviolent material, such as writings, photographs, movies or Internet sites, depicting sexual activity or erotic behavior between consenting adults in a way that is designed to arouse sexual excitement.1

Obscenity, as articulated by the U.S. Supreme Court in 1973 in Miller v. California,2 is defined

as “works which, taken as a whole, appeal to the prurient interest in sex, which portray sexual

conduct in a patently offensive way, and which, taken as a whole, do not have serious literary,

artistic, political, or scientific value.”3 Indecency, as defined by the Federal Communications

Commission in 1975, is “language that describes, in terms patently offensive as measured by

1 The definition of pornography is based on the definition in Black’s Law Dictionary, which reads: “material (such as writings, photographs, or movies) depicting sexual activity or erotic behavior in a way that is designed to arouse sexual excitement.” BLACK’S LAW DICTIONARY (8th ed. 2004). The author of this dissertation has added the terms “Internet sites,” “nonviolent” and “between consenting adults” to Black’s definition for clarification and to distinguish consensual nonviolent pornography from nonconsensual and/or violent pornography. The U.S. Supreme Court has never defined the term “pornography,” which is a “vague” term because it includes both protected sexual material and unprotected sexual material, such as obscenity and rape. See ROBERT TRAGER, JOSEPH RUSSOMANO & SUSAN DENTE ROSS, THE LAW OF JOURNALISM AND MASS COMMUNICATION 384. The term “pornography” does not have a common definition or meaning. See DON R. PEMBER & CLAY CALVERT, MASS MEDIA LAW 12 (2005).

2 Miller v. California, 413 U.S. 15 (1973).

3 Id. at 24. The Miller obscenity test includes a local standard for judging whether a work is obscene, stating that “the average person, applying contemporary community standards” would find that the work, taken as a whole, appeals to the prurient interest. Id. See infra notes 127-128 and accompanying text for a discussion of Miller.

155

contemporary community standards for the broadcast medium, sexual or excretory activities and

organs.”4 However, the FCC in 2001 clarified the definition of indecency and adapted the

application of the definition to consider “context” and to examine how explicit or graphic the

material is, whether the material dwells on sexual activities, and whether the material is meant to

shock or sexually arouse the audience.5

In this chapter, the author will provide an overview of the changing view of the child and of material deemed harmful, as well as the government’s view of the role that parents play in the

childrearing process. The author also will explain the evolution of the regulation of obscenity and sexually explicit material, examine social science studies that have tried to measure the

effects of pornography on adults and children, and analyze the federal government’s first two

attempts at trying to prevent minors from accessing sexually explicit material on the Internet.

The Changing View of the Child

For thousands of years—from the time of the ancient Greeks to the present—many

societies have raised concerns about the potential harmful effects of sexual content in fiction on

children.6 In 400 B.C. Socrates strongly objected to the dissemination of Hesiod’s and Homer’s

“ugly and immoral” stories.7 Though Socrates did not specifically mention sex,8 he believed that

children should only be exposed to wholesome ideas and lessons because the earliest ideas a

4 FCC v. Pacifica Found., 56 F.C.C.2d 94, 97 (1975).

5 In the Matter of Industry Guidance On the Commission's Case Law Interpreting 18 U.S.C. § 1464 and Enforcement Policies Regarding Broadcast Indecency, 16 F.C.C.R. 7999, 8002-03 (2001). The FCC issued a policy statement to provide guidance to broadcast licensees regarding compliance with the Commission's indecency regulations. Id. at 8016-17.

6 See Richard Jackson Harris, The Impact of Sexually Explicit Media, in MEDIA EFFECTS 247, 249 (Jennings Bryant & Dolf Zillmann eds., 1994).

7 WALTER KENDRICK, THE SECRET MUSEUM 35 (1996).

8 Id. at 36.

156

child took in would become “indelibly fixed” within the child’s mind.9 Ironically, Socrates was condemned to death for corrupting the youth in Athens10 by engaging in "impiety"—irreverence towards the Supreme Being and contempt of the divine character and authority.11 The Greeks believed children needed to be “tamed and educated,” but they did not keep children ignorant of sex and sexuality.12 For example, in ancient Athens a sexual relationship between an adult man and adolescent boy was not viewed as a “crime against nature”13 but rather as crucial to male socialization. The experience was considered a highly valued sexual relationship that served as a pathway for learning and for intellectual development.14

The modern concept of childhood as a time of innocence did not come about in western society until sometime between the late 1500s and 1800s,15 although at least one historian has argued that emotional attachments were evident among English families as early as the thirteenth

9 Id. at 35.

10 Id. at 95.

11 David W. Allan, Socrates and Democracy (July 29, 2001), available at http://www.allanstime.com/Government/socrates_democracy.htm.

12 MARJORIE HEINS, NOT IN FRONT OF THE CHILDREN: INDECENCY, CENSORSHIP, AND THE INNOCENCE OF YOUTH 15 (2001). Heins used the word “tamed,” but did not define it.

13 JOHN G. GAGNON, HUMAN SEXUALITIES 14 (1977).

14 Id.

15 The history and evolution of childhood are beyond the scope of this dissertation. For an overview of historians’ analyses of the evolution of childhood, see Margaret L. King, Concepts of Childhood: What We Know and Where We Might Go, 60 RENAISSANCE Q. 371 (2007). Historians disagree on the exact time period when the concept of childhood was invented or recognized. Some historians have argued that childhood was recognized as a separate stage of life in the late 1500s and early 1600s. During this time period, children had individual effigies on tombstones for the first time. See PHILIPPE ARIES, CENTURIES OF CHILDHOOD: A SOCIAL HISTORY OF FAMILY LIFE 42 (Robert Baldick, trans., 1962). During the 1600s, parents began to dress children in age-appropriate clothing, rather than miniature adult “costumes,” and adults no longer allowed children to gamble. BARBARA GREENLEAF, CHILDREN THROUGH THE AGES: A HISTORY OF CHILDHOOD 46 (1978). Other historians have stated that the concept of childhood emerged in the 1700s, due in part to the writings of Jean-Jacques Rousseau and others who described children as a symbol of nature, goodness and innocence. GREENLEAF at 62. Still others have argued that the view of children as innocent beings needing protection came about in the 1800s when children were compared to women, who at that time were viewed as powerless, weak and objects of veneration needing protection. See MARIE WINN, CHILDREN WITHOUT CHILDHOOD 112-13 (1983). See also HEINS, supra note 12, at 18-19; GREENLEAF at 32-40; ARIES at 18-19, 34-49, 411-15.

157

century.16 However, prior to the fifteenth century, children often were viewed as property rather than as human beings. It was not uncommon for them to be killed, abandoned or sold into brothels.17 In ancient times, Moabites, Phoenicians and Ammonites believed that the gods liked

human food, and they killed their children as a sacrifice.18 In 100 BC, Egyptian and Greek

parents killed their infants to control population, prevent property disputes among sons, get rid of

weak boys, and eliminate girls, who were considered weak and therefore less valuable.19 During the Middle Ages, most parents expected children to work in order to contribute to the household.

By the 1700s, many children planted, weeded and harvested the family farm or were loaned out to help other farmers. Some children served as apprentices so that they could contribute to society and learn a useful skill,20 while other children became servants for another family.21

Historians have noted that the worlds of the child and adult were integrated in many

cultures as late as the sixteenth century. Parents could not shield children from the realities of

life, even if they wanted to. Because of crowding in homes, children often slept with their

16 See generally ALAN MACFARLANE, ORIGINS OF ENGLISH INDIVIDUALISM: THE FAMILY, PROPERTY, AND SOCIAL TRANSITION (1978), and ALAN MACFARLANE, MARRIAGE AND LOVE IN ENGLAND: MODES OF REPRODUCTION (1986).

17 See HEINS, supra note 12, at 16. See also WINN, supra note 15, at 207; King, supra note 15, at 384-90. But see MACFARLANE, ORIGINS OF ENGLISH INDIVIDUALISM, supra note 16 and MACFARLANE, MARRIAGE AND LOVE IN ENGLAND, supra note 16, in which he argues that emotionally-rich families cared about their children in the thirteenth century.

18 GREENLEAF, supra note 15, at 18, 26. The Moabites, who flourished in the 9th century BC, were West-Semitic people who lived in the highlands east of the Dead Sea in what is now west-central Jordan. Scholarly research suggests the Moabites lived from the 14th century B.C. to 582 BC. Phoenicians lived in the Mediterranean from 1200 B.C. to 900 B.C. The Ammonites, a people who were closely related to the Hebrews, established a kingdom east of Jordan in the 13th century BC. Ammonites were Semites, who were ethnically close to modern north-Jordanians and were in sporadic conflicts with the Israelites. Their period as a distinct people lasted from the 13th century BC until the 3rd century AD; however, their kingdom lasted only until the 6th century BC.

19 GREENLEAF, supra note 15, at 18-20. Greenleaf noted that Israelites did not engage in infanticide.

20 GREENLEAF, supra note 15, at 26-32.

21 DARRETT RUTMAN, AMERICAN PURITANISM 8 (1970).

158

parents and learned about sexual intercourse from witnessing it firsthand.22 Parents dressed their children as miniature adults and treated them as adults.23 Adults and children played the same games24 and gambled by betting on card games, backgammon, chess and cockfights.25 Adults and children joked about the same things, including sex and sexuality,26 and adults did not protect children from coarse language.27

During the sixteenth and early seventeenth centuries, few parents bonded with their children.28 Because of high infant mortality rates in England and America in the 1500s and

1600s, parents did not become attached to their children.29 In the seventeenth century in

America, Puritans30 viewed children as carriers of Original Sin31 and believed children needed to

22 See HEINS, supra note 12, at 17; GREENLEAF, supra note 15, at 37. Greenleaf also wrote that parents could not prevent children from witnessing fights and deaths on the battlefield.

23 GREENLEAF, supra note 15, at 40.

24 See Conspectus—The Association of American Law Schools, Section on Mass Communications Law 1997 Annual Conference Panel: Sex, Violence, Children and the Media: Legal, Historical and Empirical Perspectives, 5 COMMLAW CONSPECTUS 341, 350 (1997) (comments of Catherine Ross).

25 GREENLEAF, supra note 15, at 39.

26 See Sex, Violence, Children and the Media: Legal, Historical and Empirical Perspectives, supra note 24, at 341, 350.

27 See ARIES, supra note 15, at 103-09.

28 See LAWRENCE STONE, THE FAMILY, SEX AND MARRIAGE IN ENGLAND: 1500-1800 at 117 (1977).

29 See King, supra note 15, at 371, 372, who stated that infant mortality rates ranged from 25% to 50%; GREENLEAF, supra note 15, at 32, 82, who wrote that parents did not form attachments with their children because of the high child mortality rate, which exceeded 50% at times during this period. See also HEINS, supra note 12, at 38.

30 RUTMAN, supra note 21, at 4-10. Rutman stated that Puritanism is difficult to define and goes beyond “simple religiosity.” In discussing the concept of Puritanism, Rutman described it as a way of living in which “men defined themselves, their society, their activities, and their institutions in terms of God.” RUTMAN at 4. He wrote that English Puritanism was brought to the Colonies by immigrants who believed in the sovereignty of God, the Bible as paramount in revealing God’s will, and the absolute necessity of submitting to the will of God. RUTMAN at 10. See also Perry Miller, The Puritan Way of Life, in PURITANISM IN EARLY AMERICA: PROBLEMS IN AMERICAN CIVILIZATION 4-12 (George M. Waller, ed., 1950). Similar to Rutman, Miller stated that Puritanism is more easily described than defined. He described Puritanism as a philosophy of life and code of values that revolved around the glory of God and the belief that God only communicated with humans through the Bible. See also THE PURITAN TRADITION IN AMERICA: 1620-1730, at 14-20 (Alden T. Vaughan, ed., 1972), which discussed the Puritans’ belief in predestination or selective redemption and in the glorification of God.

159

be controlled and indoctrinated.32 During the eighteenth century, sexuality took another turn when a reaction against Puritanism resulted in more relaxed sexual standards. But by the nineteenth century, the new evangelical Christianity movement emerged, spawning moral-purity crusades.33 In the late nineteenth century, educators and social reformers emphasized both the

importance of teaching children Puritan values and the need to prolong a child’s innocence as

long as possible.34 The middle and upper classes became increasingly worried about adolescents

as literacy spread and as sex education materials and cheap novels became available.35 Because

Christians viewed children as vessels of depravity and Original Sin, their virginity took on an

interior and spiritual value.36

By the nineteenth century, youth’s interest in sex became a matter of public concern in

the United States, and the public goal of protecting the morality of youth was central to the

support of all censorship campaigns.37 Historian Alison Parker explained that from 1873 through

31 Puritans believe in Original Sin, a Christian doctrine that holds that all people are born sinners because Adam (the first man) sinned in the Garden of Eden when he ate the apple offered by Eve. According to this doctrine, the apple was the forbidden fruit of the knowledge of good and evil, which God told Adam and Eve to refrain from eating. See Original Sin, Encyclopedia Britannica, http://www.britannica.com/eb/article-9057375/original-sin (last visited July 20, 2009). See also Original Sin, MSN ENCARTA ENCYCLOPEDIA, http://encarta.msn.com/encnet/refpages/search.aspx?q=original+sin (last visited July 20, 2009). The Puritans believe that God, in his infinite mercy, has chosen to save only a few. See THE PURITAN TRADITION, supra note 30, at 14-15.

32 HEINS, supra note 12, at 20.

33 See HEINS, supra note 12, at 25. See also PAUL BOYER, PURITY IN PRINT: THE VICE-SOCIETY MOVEMENT AND BOOK CENSORSHIP IN AMERICA 1-52 (1968); JOSEPH KETT, RITES OF PASSAGE: ADOLESCENCE IN AMERICA, 1790 TO THE PRESENT 41-54 (1977); Lynn Hunt, Introduction: Obscenity and the Origins of Modernity: 1500-1800, in THE INVENTION OF PORNOGRAPHY 12 (Lynn Hunt, ed., 1996).

34 See Sex, Violence, Children and the Media, supra note 24, at 341, 350.

35 See HEINS, supra note 12, at 26.

36 See HEINS, supra note 12, at 16.

37 See ALISON M. PARKER, PURIFYING AMERICA: WOMEN, CULTURAL REFORM, AND PRO-CENSORSHIP ACTIVISM, 1873-1933 at 19 (1997).

160

the 1930s, censorship was viewed as “a useful tool for social change,”38 and the regulation of

both printed materials and the new motion picture medium was at its most restrictive level to

date.39 It remains unclear whether the general public supported the widespread censorship

laws, according to Parker,40 but in the early 1900s, censorship gained approval among a variety

of middle-class groups,41 including women’s groups, religious organizations and some

professional organizations.42 Attorney and historian Catherine Ross said that adults generally promoted Puritan values and self-denial, at least for children, in an effort to preserve their innocence.43

Despite this focus on the mental and emotional health and well being of minors, many

children were exposed to the harsh realities of contemporary life in the United States until the

late 1930s. They attended public hangings as a form of family entertainment,44 witnessed early

deaths within their families and communities, and worked long hours on the family farm or in

coal mines and factories.45 Industrialization was a strong force in increasing the number of

38 Id. at 1.

39 Id. at 2.

40 Id. at 3.

41 Historian Stuart Blumin wrote that American society recognized the concept of a middle class in the early 1800s. He described the middle class as a group that experienced “work, consumption, residential location, [and] formal and informal voluntary association.” STUART BLUMIN, THE EMERGENCE OF THE MIDDLE CLASS: SOCIAL EXPERIENCE IN THE AMERICAN CITY, 1760-1900, at 312 (1989).

42 See PARKER, supra note 37, at 4.

43 See Sex, Violence, Children and the Media, supra note 24, at 341, 350.

44 See Sex, Violence, Children and the Media, supra note 24, at 341, 350.

45 See CAROL SALLER, WORKING CHILDREN 8, 11, 16-27 (1998); Sex, Violence, Children and the Media, supra note 24, at 349-52; see generally, KENDRICK, supra note 7.

161

children who worked outside of the family farm.46 Beginning in the mid-1800s, children often worked ten to fourteen hours a day in stuffy, rat-infested meat packing plants and dark coal mines, sometimes losing hands or arms in machinery.47 By 1900 more than two million U.S.

children under the age of 16 worked in factories, mines, fields, and in the streets.48 These working children, who often were exhausted, hungry and sometimes ill,49 labored long hours to

help support their poor families.50 In the early 1900s, children frequently would work twelve

hours a day, six days a week throughout the year and would earn very little pay for their work.51

While children working in factories and the garment industry were kept inside all day long,

children who worked the fields spent long, hot days in the sun or went barefoot in mud and rain.52 Factory children, however, did not get outdoor breaks and therefore were unable to go out

for fresh air or play.53 These working children could not attend school, rarely knew how to read

or write and therefore did not develop the skills to find better jobs later, thus perpetuating the

cycle of poverty.54

46 See PARKER, supra note 37, at 54-55. See also Priscilla Ferguson Clement, The City and The Child, 1860-1885, in AMERICAN CHILDHOOD: A RESEARCH GUIDE AND HISTORICAL HANDBOOK (JOSEPH M. HAWES & N. RAY HINER, EDS.) (1985) at 249.

47 GREENLEAF, supra note 15, at 103.

48 See SALLER, supra note 45, at 8, 11, 16-27. See also DANIEL T. RODGERS, THE WORK ETHIC IN INDUSTRIAL AMERICA: 1850-1920, 115 (1978); KATHLEEN THOMPSON & HILARY MAC AUSTIN, AMERICA’S CHILDREN 130-52 (2003).

49 See SALLER, supra note 45, at 15.

50 See GREENLEAF, supra note 15, at 103-04; SALLER, supra note 45, at 9. See also DAVID PARKER, STOLEN DREAMS 71.

51 See SALLER, supra note 45, at 8-9.

52 See SALLER, supra note 45, at 25.

53 GREENLEAF, supra note 15, at 103-04. See also STOLEN DREAMS, supra note 50, at 71.

54 See GREENLEAF, supra note 15, at 103-04; SALLER, supra note 45, at 18, 34. For an overview of child labor throughout history and today, see University of Iowa Labor Center, Child Labor Public Education Project, available at http://www.continuetolearn.uiowa.edu/laborctr/child_labor/

162

During the first three decades of the 1900s, many people thought hard work was good for children,55 even though, paradoxically, they were sympathetic to the plight of working children.56 Congress passed laws in 191657 and 191958 forcing states to end child labor, but the

United States Supreme Court struck down the laws59 and held that Congress could not tell the states what to do.60 It was not until the 1930s that Americans began to view the economic and social value of children differently.61 In 1938, Congress passed the Fair Labor Standards Act,62 which prohibited young children from working and prohibited older children from working during school hours.63

55 See SALLER, supra note 45, at 31. See also GREENLEAF, supra note 15, at 103-04, stating that Americans believed putting children to work in factories was good training for the life of drudgery they would face as adults. See also Clement, supra note 46, at 248, stating that since the Colonists settled in America, most children had worked, and most felt children should work if for no other reason than to prevent delinquency.

56 GREENLEAF, supra note 15, at 103.

57 Fed’l Child Labor Law of 1916, Pub. L. No. 249, 39 Stat. 675 (ch. 432) (1916).

58 Child Labor Tax Act, 65 Pub. L. No. 254, 40 Stat. 1057, 1138 (1919), which, as a way to ban the use of child labor, imposed a 10% tax on the profits of all businesses engaged in interstate commerce that employed children. In Bailey v. Drexel Furniture Co., 259 U.S. 20, 43-44 (1922), the Supreme Court struck down the Child Labor Tax Act of 1919, holding that the law was not an allowable federal government excise tax but rather a penalty enacted to coerce state citizens to act in accordance with Congress' views on child labor and therefore was an infringement on state power.

59 See Hammer v. Dagenhart, 247 U.S. 251, 271-72 (1918) (striking down the Federal Child Labor Law of 1916, ch. 432, Pub. L. No. 249, 39 Stat. 675, 675-76). However, the Supreme Court overruled its Hammer decision in U.S. v. Darby, 312 U.S. 100, 116-17, 125-26 (1941), stating that Congress, under the Commerce Clause, has the power to establish limits on minimum wages and maximum hours. See also Bailey, 259 U.S. at 43-44 (striking down the Child Labor Tax Act of 1919, 65 Pub. L. No. 254, 40 Stat. 1057, 1138 (1919)).

60 See SALLER, supra note 45, at 37.

61 See generally VIVIANA A. ROTMAN ZELIZER, PRICING THE PRICELESS CHILD: THE CHANGING SOCIAL VALUE OF CHILDREN (1985). For a brief overview of the changing view of the child, see ZELIZER, at 5-21.

62 See Fair Labor Standards Act, 29 U.S.C. § 212 (1938).

63 For a history of federal child labor laws from 1916-29 and child labor reform during President Franklin Roosevelt's New Deal administration during the 1930s, see DAVID E. KYVIG, EXPLICIT AND AUTHENTIC ACTS: AMENDING THE U.S. CONSTITUTION 1776-1995, at 255-61, 307-14 (1996). Following the stock market crash of 1929 and the ensuing Depression, Pres. Roosevelt greatly expanded the scope of the federal government by introducing strong government regulation, control, and intervention. Between 1933 and 1939, Pres. Roosevelt’s administration took action to bring about immediate economic relief and reforms in industry, agriculture, finance, waterpower,

163

Prior to the early twentieth century, child protection efforts had centered around

children’s spirituality and exposure to printed information on obscenity, sex and sexually explicit

materials. At the turn of the twentieth century, American society began looking at children as

objects of scientific study, reform and control, according to historian Alison Parker.64 In the

early 1900s, Americans began to focus on the child’s physical well being. A few years later, the

focus shifted back to sexuality and morals. As the transportation, entertainment and mass media

industries expanded during the 1910s and 1920s, moral crusaders contended that each new development would lead to immoral conduct.65 In urban areas, adults began to worry about the

moral impact of automobiles and movie theaters, while residents in small towns lamented the

fact that young boys were hanging around confectioneries and train stations.66 The public and

special interest groups argued that entertainment mechanisms and the media, including

nickelodeons, pinball machines, silent movies, radio and television, were harmful to minors.67

In the 1920s and 1930s, the Woman’s Christian Temperance Union (WCTU) organized

censorship campaigns against movies, stating that movies were “the greatest factor in the education of

youth,” albeit a negative one.68 During the same period, the WCTU continued its fight,

labor and housing. The New Deal included a package of massive public works projects designed to re-employ Americans and to build a more modern infrastructure.

64 See PARKER, supra note 37, at 10. See also Hamilton Cravens, Child-Saving in the Age of Professionalism, 1915-1930, in AMERICAN CHILDHOOD: A RESEARCH GUIDE AND HISTORICAL HANDBOOK (JOSEPH M. HAWES & N. RAY HINER, EDS.) 415-68 (1985).

65 See Sex, Violence, Children and the Media, supra note 24, at 341, 349.

66 See Sex, Violence, Children and the Media, supra note 24, at 341, 351. See also Richard Butsch, A History of Research on Movies, Radio, and Television, 29 J. POPULAR FILM & TELEVISION 112, 112-20 (2001), stating that some adults feared that boys were learning to steal and girls were learning about sex from movies.

67 See Sex, Violence, Children and the Media, supra note 24, at 341, 349.

68 See PARKER, supra note 37, at 135 (citing Alba B. Norton, “Motion Pictures,” Ohio A.R., 1930, at 127). For a brief history of the WCTU, see generally Alison M. Parker, "Hearts Uplifted and Minds Refreshed": The Woman's Christian Temperance Union and the Production of Pure Culture in the United States, 1880-1930, 11 J. WOMEN’S HISTORY 135 (1999).

164

begun in the 1880s, against crime-story papers and “immoral” literature, magazines, art, advertisements, prizefights, ballet, kinetoscopes, plays and gambling.69 The WCTU’s stated purpose was to protect innocent children by promoting “pure” culture and legal censorship.70

The WCTU was not alone in its quest to protect children’s morality. In the 1950s, the

United States Senate Committee on the Judiciary concluded that the “crime and horror” genre of comic books contributed to the rise in juvenile delinquency by offering “short courses in murder, mayhem, robbery, rape, cannibalism . . . sex, sadism, masochism, and virtually every other form of crime.”71 During the last half of the twentieth century, as television, music, video games and computers evolved, the controversies surrounding their effects on minors continued.72

The Parental Role in Childrearing

Despite public and government interest in protecting children from materials deemed harmful, under American law, parents—not the government—have the primary responsibility of raising their children, according to the U.S. Supreme Court.73 Yet the government has to balance

69 See PARKER, supra note 37, at 7, 11-12.

70 Parker, "Hearts Uplifted and Minds Refreshed”, supra note 68, at 135-36. In discussing the goals of the Woman’s Christian Temperance Union, Parker wrote: “Moral transformation of youth, activists argued, could only occur through the positive influence of a wholesome culture . . . . Asserting their right to be arbiters of culture themselves, WCTU women insisted upon a tie between art and morals.”

71 Catherine J. Ross, Anything Goes: Examining the State’s Interest in Protecting Children from Controversial Speech, 53 VAND. L. REV. 427, 445 (2000) (citing S. REP. NO. 84-62, at 2, 7 (1955)).

72 Id.

73 See Prince v. Massachusetts, 321 U.S. 158, 166 (1944) (holding that while the custody, care and nurture of the child resides first with parents, the state has some authority over children’s activities. The Court held that the lower court was correct in convicting the parent for violating the state child labor laws by engaging her child in street preaching and selling religious pamphlets); Pierce v. Soc’y of Sisters, 268 U.S. 510 (1925) (invalidating a law requiring parents to send their children to public schools, holding that such a law unreasonably interfered with parents’ rights to send their children to private schools); Meyer v. Nebraska, 262 U.S. 390, 399, 403 (1923) (invalidating a state statute prohibiting German language instruction in public schools to children who had not passed eighth grade. The Court held that the statute unconstitutionally interfered with parents’ right to raise their children without state interference). See also Jack Balkin, Comment, Media Filters, the V-Chip, and the Foundations of Broadcast Regulation, 45 DUKE L.J. 1131, 1138-39 (stating that the parent plays the primary role in childrearing in the United States).

165

two interests that sometimes compete: reinforcing parental authority in the childrearing process and shielding children from speech that the government believes could harm them,74 or, more to the point for this dissertation, the potential tensions between parental values and government values regarding sexuality and minors’ exposure to sexually explicit materials.75

The Supreme Court has made clear that protecting the physical, psychological and emotional well-being of minors is a compelling government interest,76 while a federal appellate court has held that the government also has a compelling interest in helping parents supervise their children.77 Congress also has recognized that while parents are primarily responsible for childrearing, “parental control or guidance cannot always be provided and society’s transcendent interest in protecting the welfare of children justify reasonable regulation of the sale of material to them.”78

Most parents do not want the government deciding what is best for their children;79 nor do they want to be sanctioned for exposing their children to material the government may think

74 Ross, supra note 71, at 472.

75 See Ashutosh Bhagwat, What If I Want My Kids to Watch Pornography?: Protecting Children from “Indecent” Speech, 11 WM. & MARY BILL OF RTS. J. 671, 696-701 (2003). Bhagwat, a law professor, discussed other problems with the indecency standard: conflicts that might erupt as parents disagree with the government’s value choices, the use of the same “indecency” standard for both toddlers and teens, the imposition of the views of the local majority over parental views, and the false assumption that all parents within a geographic region share the same government values. Id. at 698-701.

76 See Ginsberg v. New York, 390 U.S. 629, 639, 640 (1968) (upholding a state statute prohibiting the sale of pornographic magazines to minors under the age of seventeen); New York v. Ferber, 458 U.S. 747, 756-57 (1982) (affirming that using children in pornographic films harms their physical and emotional well being and upholding a state law intended to stop the use of children in porn films); Prince v. Massachusetts, 321 U.S. 158, 167 (1944) (upholding appellant's conviction for violating state child labor and compulsory school laws by engaging her child in street preaching and selling religious pamphlets).

77 See Action for Children’s Television v. FCC, 11 F.3d 170, 177 (D.C. Cir. 1993). The court wrote: “The government's asserted interest in protecting children also includes its independent interest in protecting the well-being of vulnerable youth and in shielding them from physical and psychological abuse.” Id.

78 See H.R. REP. NO. 775, 105th Cong., 2d Sess., at 12 (1998) (citing People v. Kahan, 15 N.Y.2d 311, 312, 206 N.E.2d 333, 334 (1965)).

79 See Balkin, supra note 73, at 1138-39.

166

is harmful.80 Jack Balkin, a constitutional law professor at Yale, has argued that different people

may have very different views about what is and is not harmful to children.81 For example, some

parents may want to shield their children from evolutionary views, discussions of homosexuality

and violent depictions of war, whereas others may want to keep their children away from PBS,

viewing it as “the great Satan of secular humanism.”82

The Supreme Court has supported the role of parents when it comes to exposing their

children to sexually explicit material. In 1968, in Ginsberg v. New York,83 the justices, in a 6-3

vote, stated that “constitutional interpretation has consistently recognized that the parents’ claim

to authority in their own household to direct the rearing of their children is basic in the structure

of our society.”84 Although the Court upheld a New York statute that prohibited store clerks

from selling pornography to minors under the age of seventeen, the Court did not preclude

parents from sharing the material with their children.85 In a 1978 radio broadcasting case

involving the afternoon airing of comedian George Carlin’s “seven dirty words” monologue,86 the Court noted that the government has an interest in supporting the role of parents in child rearing.87

80 Sex, Violence, Children and the Media, supra note 24, at 354 (comments of Jack Balkin).

81 Sex, Violence, Children and the Media, supra note 24, at 354 (comments of Jack Balkin).

82 Sex, Violence, Children and the Media, supra note 24, at 354 (comments of Jack Balkin).

83 Ginsberg v. New York, 390 U.S. 629 (1968).

84 Id. at 639. As support, the Court cited Prince v. Massachusetts, 321 U.S. 158, 166 (1944), in which it wrote, “It is cardinal with us that the custody, care and nurture of the child reside first in the parents, whose primary function and freedom include preparation for obligations the state can neither supply nor hinder.”

85 Ginsberg, 390 U.S. at 639. For further discussion of Ginsberg and the variable obscenity standard, see infra notes 129-143 and accompanying text.

86 The seven words were shit, piss, fuck, cunt, motherfucker, cocksucker and tits.

87 FCC v. Pacifica Found., 438 U.S. 726, 749 (1978). The Court held that the Federal Communications Commission had the authority to regulate the airing of indecent programming in the broadcast media to times when children most likely would not be in the audience. Pacifica, 438 U.S. at 729-31, 751.

167

On the other side of the issue, several legal experts have stated that the Supreme Court’s

harm-to-minors holdings and rationale raise problems. Attorney and historian Catherine Ross has

argued that the state itself has an interest in reinforcing family, but that authority “flounders” in a pluralist society in which families have diverse values88 and adolescents should not be treated

the same as toddlers.89 Law professor Eugene Volokh wrote that any kind of balancing test,

including one that attempts to balance protecting free speech with shielding children from material deemed harmful, begs the question—how do we balance these competing interests and which government interest is of greater “constitutional gravity”?90 In an article on media filters

and broadcast regulation, professor Balkin wrote, “[The] control of filters may be one of the

most important forms of power over human thought and human expression.”91 He stated that if indecency and violence are really bad for children and if society believes the protection of children is “paramount,” then steps should be taken to prevent parents from sharing sexually explicit and violent materials with their children.92 He argued that harm from other influences,

such as racism, sexism, homophobia and other forms of intolerance, could be equally harmful

to children and questioned why the government has singled out sexual material, violence and bad

language as content which needed to be rated. He stated that the government has not prevented

parents from sharing sexually explicit or violent material with their children.93

88 Ross, supra note 71, at 473.

89 Ross, supra note 71, at 498.

90 Eugene Volokh, Freedom of Speech, Shielding Children, and Transcending Balance, 1997 SUP. CT. REV. 141, 167-68 (1997).

91 Balkin, supra note 73, at 1132.

92 Balkin, supra note 73, at 1138-39.

93 Balkin, supra note 73, at 1167.

168

In analyzing the impact of media on audiences, legal scholars Clay Calvert and Robert

Richards wrote that fears about the media’s effects on behavior are “fanciful and unfounded.”94

Calvert and Richards state that the magic bullet or hypodermic needle theory—which holds that messages directed and fired at the target audience would result in an expected response—has now been discredited.95 Proponents of this theory believed that messages could deliberately alter or control people’s behavior. The magic bullet theory came about in the first half of the twentieth century during the rise and popularization of radio and television and Hitler’s monopolization of the mass media in Germany during WWII.96 Richards and Calvert, in an article on cyberspace law, stated that people believe that emerging media tend to have “powerful effects,” often unwarranted, on audiences.97 They also stated that many people today believe the Internet is

“lawless and immoral.”98

94 Clay Calvert & Robert D. Richards, Essay, New Millennium, Same Old Speech: Technology Changes, but the First Amendment Issues Don’t, 79 B.U.L. REV. 959, 979 (1999). Calvert and Richards did not specify the ages of audience members to which they were referring; however, as minors access the World Wide Web and go to theaters to watch movies, including R-rated films, it is likely that the authors were referring to both minors and adults in their essay.

95 Id. See also ARTHUR BERGER, ESSENTIALS OF MASS COMMUNICATION THEORY 174 (1995), stating that the hypodermic needle theory, which Berger reports is “generally discredited now,” holds that all audience members “‘read’ the text the same way and get the same things out of it.” Berger wrote that the hypodermic needle metaphor referred to an assumption that media injected all members of the audience with the same message. See also STANLEY BARAN & DENNIS DAVIS, MASS COMMUNICATION THEORY: FOUNDATIONS, FERMENT, AND FUTURE 72-75 (2000), stating that behavioral psychology has never been able to demonstrate that external stimuli, such as a media message, could condition the recipients of the message to behave the way a “master propagandist” wanted or that the recipients were powerless to consciously resist manipulation. Id.

96 See JULIA T. WOOD, COMMUNICATION MOSAICS: AN INTRODUCTION TO THE FIELD OF COMMUNICATION 303 (2006). In the 1920s, Freudianism and behaviorism were combined to produce a “simplistic propaganda theory,” according to social science theorists Stanley Baran and Dennis Davis. See BARAN & DAVIS, supra note 95, at 77.

97 Robert D. Richards & Clay Calvert, Essay: The “True Threat” to Cyberspace: Shredding the First Amendment for Faceless Fears, 7 COMM. LAW CONSPECTUS 291 (1999).

98 Id. at 293. Richards and Calvert conclude that although society has been both excited and apprehensive about the emergence of new communication technologies, “fear alone cannot justify dissolving First Amendment protections.” Id. at 295.

169

When it comes to shielding minors from sexually explicit material, the government has balanced the protection of minors against the First Amendment rights of adults to view such materials. As access to information and new technologies expands, the government has continued to try to balance two compelling and sometimes competing interests: supporting the rights of parents to raise their children as they see fit and protecting children from material deemed harmful.99 As stated above, parents—not the government—have the primary responsibility for raising their children under U.S. law.100 Parental views of childrearing may conflict with the government’s view. For example, according to one commentator, parents might object to the government’s value choices, to the use of the same “indecency” standard for toddlers and teens, to the imposition of the local majority’s views over parental views, and to the false assumption that all parents within a geographic region share the government’s values.101

Sexually Explicit Material and the Law

Although obscenity and child pornography have no constitutional protection,102 the First

Amendment does protect indecent speech.103 Since the 1800s, the United States government has

99 See Ross, supra note 71, at 472.

100 Balkin, supra note 73, at 1138-39. See also Prince v. Massachusetts, 321 U.S. 158, 166 (1944) (holding that while the custody, care and nurture of the child resides first with parents, the state has some authority over children’s activities. The U.S. Supreme Court held that the lower court was correct in convicting a parent for violating state child labor laws by engaging her child in street preaching and selling religious pamphlets); Pierce v. Soc’y of Sisters, 268 U.S. 510 (1925) (invalidating a law requiring parents to send their children to public schools, holding that such a law unreasonably interfered with parents’ rights to send their children to private schools); Meyer v. Nebraska, 262 U.S. 390, 399, 403 (1923) (invalidating a state statute prohibiting the teaching of modern foreign languages, including German, to children who had not passed the eighth grade. The Court held that the statute unconstitutionally interfered with parents’ rights to raise their children without state interference).

101 Bhagwat, supra note 75, at 696-701.

102 In Roth v. United States, 354 U.S. 476 (1957), the Supreme Court held that obscenity was outside of First Amendment protection. In Miller v. California, 413 U.S. 15 (1973), the Court revised the obscenity test established in Roth and authored a three-part obscenity test that is still in effect today. The Court defined obscenity as “works which, taken as a whole, appeal to the prurient interest in sex, which portray sexual conduct in a patently offensive way, and which, taken as a whole, do not have serious literary, artistic, political, or scientific value.” Miller, 413

170

made numerous attempts to prevent the distribution of either obscene materials or sexually explicit materials. In 1865, Congress expanded the federal obscenity ban to include restrictions on the domestic use of the mails.104 A decade later, Anthony Comstock formed the Society for the Suppression of Vice in New York and pushed a new obscenity law through Congress in two months105 that led to widespread censorship.106 Meanwhile, federal courts in the United States imported the country’s first obscenity test from Regina v. Hicklin,107 decided in England in 1868.

Under the Hicklin test, courts judged obscenity by the effect of isolated passages on the most susceptible individuals.108

It was not until the second half of the twentieth century that the Supreme Court focused on children’s access to obscene materials and sexually explicit nonobscene materials. The Court tried to distinguish between content that was obscene for minors and content that was sexually explicit but not obscene for minors. In 1957, in Butler v. Michigan,109 the Supreme Court directly

U.S. at 24. In New York v. Ferber, 458 U.S. 747 (1982), the Supreme Court held that child pornography is not protected by the First Amendment. However, in Ashcroft v. Free Speech Coalition, 535 U.S. 234 (2002), the Court held that a ban on virtual child pornography—the use of computer-generated images that look like children—is unconstitutional.

103 See Reno v. ACLU, 521 U.S. 844, 874-75 (1997); FCC v. Pacifica Found., 438 U.S. 726, 732, 745-46 (1978); Sable Commc’ns of Calif. Inc. v. FCC, 492 U.S. 115, 124, 131 (1989).

104 Union soldiers had been receiving salacious literature, and the Postmaster General had been confiscating these packages without statutory authority. See HEINS, supra note 12, at 27.

105 HEINS, supra note 12, at 31.

106 HEINS, supra note 12, at 31. See also Ginsberg v. New York, 390 U.S. 629, 651 (1968) (Douglas, J., dissenting): “Comstock, of course, operated on the theory that every human has an ‘inborn tendency toward wrongdoing which is restrained mainly by fear of the final judgment.’ In his view any book which tended to remove that fear is a part of the ‘trap’ which Satan created. Hence, Comstock would have condemned a much wider range of literature than the present Court is apparently inclined to do.”

107 See Regina v. Hicklin, L. R. 3 Q. B. 360 (1868).

108 See Marion D. Hefner, Note, “Roast Pigs” and Miller-Light: Variable Obscenity in the Nineties, 1996 U. ILL. L. REV. 843, 846 (1996). See also Regina v. Hicklin, [1868] L. R. 3 Q. B. 360.

109 Butler v. Michigan, 352 U.S. 380 (1957).

171

confronted the problem of regulating obscenity for the first time.110 The Michigan Penal Code made it a misdemeanor to sell or make available to the general reading public any book containing obscene language "tending to the corruption of the morals of youth."111 The defendant sold a police officer a book that the trial judge described as "containing obscene, immoral, lewd, lascivious language, or descriptions, tending to incite minors to violent or depraved or immoral acts, manifestly tending to the corruption of the morals of youth."112 The

Court held that a state court improperly convicted the defendant of violating a Michigan obscenity law because the law was “not reasonably restricted to the evil with which it is said to deal.”113 The Court reasoned that the result of the legislation was “to reduce the adult population of Michigan to reading only what is fit for children.”114

In 1957, in Roth v. United States,115 the Supreme Court for the first time held that obscenity was outside the protection of the First Amendment.116 In a 6-3 vote, the Court affirmed the obscenity test that some lower courts already had adopted in the 1930s, 1940s and 1950s:117

"whether to the average person, applying contemporary community standards, the dominant theme of the material taken as a whole appeals to prurient interest."118 The Supreme Court noted

110 See Hefner, supra note 108, at 847.

111 Butler, 352 U.S. at 381 (citing section 343, Michigan Penal Code).

112 Id.

113 Id. at 383.

114 Id.

115 Roth v. United States, 354 U.S. 476 (1957).

116 Id. at 481.

117 Id. at 489.

118 Id. at 488-89. Although the Court included “community standards” in the definition of obscenity in the Roth case, the concept of community standards seems to have first been articulated by Judge Learned Hand in 1913. See Jacobellis v. Ohio, 378 U.S. 184, 192 (1964) (plurality opinion of Brennan, J.) (citing United States v. Kennerley, 209 F.119, 121 (S.D.N.Y. 1913)). The Court also acknowledged that the “early leading standard of obscenity

172

that the lower courts that had used the new obscenity standard had rejected the Hicklin standard.119 In Roth, the Court replaced the Hicklin test,120 saying it was unconstitutional.121 The

Court ruled that obscene speech could be prosecuted, but sexually explicit speech that fell outside the definition of obscenity had at least some constitutional protection.122 Seven years later, the Court affirmed the Roth obscenity test but questioned whether the concept of a community standard was a local or a national standard,123 an issue the Supreme Court resolved in 1973 in

Miller v. California.124

allowed material to be judged merely by the effect of an isolated excerpt upon particularly susceptible persons.” Regina v. Hicklin, [1868] L. R. 3 Q. B. 360. The Roth Court wrote, “The Hicklin test, judging obscenity by the effect of isolated passages upon the most susceptible persons, might well encompass material legitimately treating with sex, and so it must be rejected as unconstitutionally restrictive of the freedoms of speech and press." Roth, 354 U.S. at 488-89.

119 Roth, 354 U.S. at 488-89.

120 See Hefner, supra note 108, at 847.

121 Roth, 354 U.S. at 489. The Court said that the Hicklin test was “unconstitutionally restrictive of the freedoms of speech and press.” Id.

122 Id. at 487-88.

123 Jacobellis v. Ohio, 378 U.S. 184, 192-93 (1964) (plurality opinion of Brennan, J.). Justice Brennan wrote:

It has been suggested that the ‘contemporary community standards’ aspect of the Roth test implies a determination of the constitutional question of obscenity in each case by the standards of the particular local community from which the case arises. This is an incorrect reading of Roth. The concept of ‘contemporary community standards’ was first expressed by Judge Learned Hand in United States v. Kennerley, 209 F. 119, 121 (D.C.S.D.N.Y. 1913).

In his opinion on community standards and obscenity, as cited by Brennan, Judge Hand wrote:

If there be no abstract definition, such as I have suggested, should not the word ‘obscene’ be allowed to indicate the present critical point in the compromise between candor and shame at which the community may have arrived here and now? If letters must, like other kinds of conduct, be subject to the social sense of what is right, it would seem that a jury should in each case establish the standard much as they do in cases of negligence. To put thought in leash to the average conscience of the time is perhaps tolerable, but to fetter it by the necessities of the lowest and least capable seems a fatal policy.

See also FREDERICK SCHAUER, THE LAW OF OBSCENITY 117 (1976) (arguing that although there was no discussion of the definition of “community” in the Roth decision, one can infer from that decision that the justices envisioned a local community because the trial judge in Roth instructed jurors to “ ‘determine the impact [of the material] upon the average person in the community’” and to judge the content “‘by present-day standards of the community.’”)

173

First, however, a plurality of justices in 1966 extended the protection given sexually explicit material. In Memoirs v. Massachusetts,125 the Court narrowed the definition of obscenity, holding that material could be deemed obscene only if it was utterly without redeeming social value.126 Seven years later, in Miller v. California,127 the Court again revised the definition of obscenity and authored the three-part obscenity test that remains the controlling precedent. The Miller test defines obscenity as

1) whether ‘the average person, applying contemporary community standards,’ would find the work, taken as a whole, appeals to the prurient interest; 2) whether the work depicts or describes, in a patently offensive way, sexual conduct specifically defined by the applicable state law; and 3) whether the work, taken as a whole, lacks serious literary, artistic, political or scientific value.128

Children, the Variable Obscenity Standard and Indecency

The Miller obscenity test was written with adults in mind, but a different standard exists

for minors. Legal scholars William Lockhart and Robert McClure, who first articulated the

concept of and coined the term “variable obscenity” in a seminal 1960 law review article,

defined the standard as follows: “Material may be obscene when directed to one class of persons

but not when directed at another.”129 They advocated a two-part variable obscenity standard, stating that some material is obscene for both adults and minors, while other material is obscene

124 Miller v. California, 413 U.S. 15, 24 (1973). In 1973, the Court established that the community standard was a local standard. Id. See infra notes 127-128 and accompanying text for a summary of the Miller obscenity standard, which is still the controlling precedent.

125 Memoirs v. Massachusetts, 383 U.S. 413 (1966).

126 Id. at 418. In Memoirs, the Court noted that in cases decided after Roth it had elaborated on the Roth definition of obscenity, stating that “three elements must coalesce: it must be established that (a) the dominant theme of the material taken as a whole appeals to a prurient interest in sex; (b) the material is patently offensive because it affronts contemporary community standards relating to the description or representation of sexual matters; and (c) the material is utterly without redeeming social value.” Id.

127 Miller, 413 U.S. at 15.

128 Id. at 24.

129 William Lockhart & Robert McClure, Censorship of Obscenity: The Developing Constitutional Standards, 45 MINN. L. REV. 5, 77 (1960).

174

for minors but not for adults.130 At the same time, the authors stated that the implementation of a

variable obscenity standard might create unmanageable problems. Lockhart and McClure identified

two specific problems with the standard: 1) material labeled “adults only” would tend to attract minors, and 2) booksellers would not have the necessary background to determine what would

be considered obscene as to minors.131

During the 1960s, when the Supreme Court was attempting to define obscenity and trying

to balance the constitutional rights of adults with the protection of minors, the Court decided a

case that focused on the constitutionality of a variable obscenity standard. In a case concerning a

Long Island luncheonette owner’s sale of “girlie” magazines to a sixteen-year-old boy on two

separate occasions, the Court held that material that is obscene for minors may not necessarily be

obscene for adults.132 In Ginsberg v. New York,133 the Court held that New York’s “harmful to minors” statute, which barred the sale of sexually explicit materials to persons under the age of

seventeen, was constitutional.134

The New York legislature had defined “harmful to minors” as the

“quality of any description or representation, in whatever form, of nudity, sexual conduct, sexual excitement, or sado-masochistic abuse, when it: (i) predominantly appeals to the prurient, shameful or morbid interest of minors, and

130 Id. at 77.

131 Id.

132 Ginsberg v. New York, 390 U.S. 629, 645-66 (1968).

133 390 U.S. 629 (1968).

134 Id. at 645. Justice William Brennan, who delivered the opinion of the Court, wrote, “The concept of variable obscenity is developed in Lockhart & McClure, Censorship of Obscenity: The Developing Constitutional Standards, 45 MINN. L. REV. 5, 85 (1960). Lockhart and McClure stated: ‘Variable obscenity . . . furnishes a useful analytical tool for dealing with the problem of denying adolescents access to material aimed at a primary audience of sexually mature adults. For variable obscenity focuses attention upon the make-up of primary and peripheral audiences in varying circumstances, and provides a reasonably satisfactory means for delineating the obscene in each circumstance.’” Id. at 636.

175

(ii) is patently offensive to prevailing standards in the adult community as a whole with respect to what is suitable material for minors, and (iii) is utterly without redeeming social importance for minors.135

Justice William Brennan, who delivered the opinion of the Court, wrote, “The concept of variable obscenity is developed in Lockhart & McClure, Censorship of Obscenity:

The Developing Constitutional Standards, 45 Minn. L. Rev. 5, 85 (1960).”136

In upholding the variable obscenity standard, the Ginsberg Court based its ruling on its

1944 decision in Prince v. Massachusetts,137 in which it recognized the governmental interest in protecting children. The Ginsberg Court wrote, “The State has an interest ‘to protect the welfare of children’ and to see that they are ‘safeguarded from abuses’ which might prevent their

‘growth into free and independent well-developed men and citizens.’”138 The justices stated that the Constitution does not require “scientifically certain criteria of legislation,”139 reasoning that led them to uphold New York’s “harmful to minors” statute. The variable obscenity standard remains in effect today, and the Ginsberg Court did not prohibit parents from sharing sexually explicit material with their children.140 New York’s variable obscenity

135 Id. at 646 (App. A).

136 Id. at 636.

137 Prince v. Massachusetts, 321 U.S. 158, 167 (1944) (upholding appellant's conviction for violating state child labor and compulsory school laws by engaging her child in street preaching and selling religious pamphlets).

138 Ginsberg, 390 U.S. at 640-41 (citing Prince v. Massachusetts, 321 U.S. 158, 165 (1944)).

139 Id. at 643. The Court cited its 1911 decision in Noble State Bank v. Haskell, 219 U.S. 104, 110, in which the Court said, “We do not demand of legislatures ‘scientifically certain criteria of legislation.’” Haskell, 219 U.S. at 110.

140 Ginsberg, 390 U.S. at 639.

176

standard, upheld by the Ginsberg Court in 1968,141 was a modified version of the definition of obscenity the Supreme Court adopted in 1957142 and refined in 1966.143

Legal analyst Marion Hefner said the Ginsberg Court used the rational relationship test in upholding the New York statute.144 Courts use the rational basis test or “reasonableness standard” to determine whether the regulation “is rationally related to any legitimate government interest.”145 Legal scholar Kevin Saunders wrote that the variable obscenity standard is, in effect, a national standard,146 unlike the Miller obscenity test, which uses a local standard.147 Under

New York’s obscene-for-minors standard, the second prong uses the term “adult community,” rather than Miller’s average person standard within a geographic community. In Ginsberg, the obscene-for-minors test ends with the phrase “for minors,” regardless of locale. Saunders argued that people in all parts of the country—urban and rural, east and west, north and south—are more

141 Id.

142 Id. at 635.

143 Memoirs v. Massachusetts, 383 U.S. 413, 418 (1966). The Court narrowed the definition of obscenity, requiring that material be utterly without redeeming social value before it could be deemed obscene.

144 Hefner, supra note 108, at 857.

145 Junichi P. Semitsu, Note, Burning Cyberbooks in Public Libraries: Internet Filtering Software vs. The First Amendment, 52 STAN. L. REV. 509, 533 (2000). In determining whether to uphold the government’s regulation on content, courts choose one of three standards: strict scrutiny, intermediate scrutiny, or rational basis. The strict scrutiny standard requires that a content-based regulation serve a compelling state interest and be narrowly drawn to achieve that interest. Perry Educ. Ass’n v. Perry Local Educators Ass’n, 460 U.S. 37, 45 (1983) (citing Carey v. Brown, 447 U.S. 455, 461 (1980)); Sable v. FCC, 492 U.S. 115, 126 (1989). The intermediate scrutiny standard requires that regulations that incidentally restrict speech serve a “significant” government interest, be narrowly tailored to achieve that interest, and leave open ample alternative channels of communication. Perry, 460 U.S. at 45 (citing U.S. Postal Serv. v. Council of Greenburgh Civic Assns., 453 U.S. 114, 132 (1981); Consol. Edison Co. v. Pub. Serv. Comm'n, 447 U.S. 530, 535-536 (1980); Grayned v. City of Rockford, 408 U.S. 104, 115 (1972); Cantwell v. Connecticut, 310 U.S. 296 (1940); Schneider v. State, 308 U.S. 147 (1939)). The rational basis or reasonableness standard requires only that a regulation be rationally related to a legitimate government interest. See Perry, 460 U.S. at 49.

146 Kevin Saunders, Electronic Indecency: Protecting Children in the Wake of Cable and Internet Cases, 46 DRAKE L. REV. 1, 47 (1997).

147 See supra note 128 and accompanying text for the Miller obscenity standard, established in Miller v. California, 413 U.S. 15, 24 (1973).

177

likely to agree on what would be considered obscene as to minors than as to what might be

considered obscene as to adults.148

In 1989, more than twenty years after deciding Ginsberg,149 the Supreme Court again addressed the

variable obscenity standard in Sable v. FCC.150 The case involved a challenge to a federal statute

that banned indecent as well as obscene commercial telephone messages and conversations,

often referred to as “dial-a-porn.” The Court held that the First Amendment protects indecent

dial-a-porn messages but not obscene ones.151 The Court reaffirmed the government’s

compelling interest in protecting children from nonobscene material deemed harmful to them,152 while at the same time stating that regulations must be narrowly drawn so as not to interfere with the First Amendment freedoms of adults.153 The Court wrote, “There is a compelling interest in

protecting the physical and psychological well-being of minors. This interest extends to shielding

minors from the influence of literature that is not obscene by adult standards.”154

The Court distinguished the Sable case from the Pacifica case it had decided a decade earlier, which involved a New York radio station’s afternoon broadcast of George Carlin’s “Filthy Words” monologue at a time when children were likely to be in the audience.155 In

148 Saunders, supra note 146, at 47.

149 Ginsberg v. New York, 390 U.S. 629 (1968).

150 Sable Commc’ns Inc. v. FCC, 492 U.S. 115 (1989).

151 Id. at 124, 131.

152 Id. at 126.

153 Id. (citing Schaumburg v. Citizens for Better Env't, 444 U.S. 620, 637 (1980) (Schaumburg cited Hynes v. Mayor of Oradell, 425 U.S., 610, 620 (1976) and First Nat’l Bank of Boston v. Bellotti, 435 U.S. 765, 786 (1978)). The Sable Court wrote that, “It is not enough to show that the Government’s ends are compelling; the means must be carefully tailored to achieve those ends.” Id. at 126.

154 Id. at 126.

155 FCC v. Pacifica Found., 438 U.S. 726, 729-30 (1978).

178

Pacifica, the Court held that the Federal Communications Commission had the authority to

channel the airing of indecent programming in the broadcast media to times of day when children most

likely would not be in the audience.156

In pointing out the differences between the two cases, the Sable Court wrote:

Pacifica is readily distinguishable from this case, most obviously because it did not involve a total ban on broadcasting indecent material. The FCC rule was not ‘intended to place an absolute prohibition on the broadcast of this type of language, but rather sought to channel it to times of day when children most likely would not be exposed to it.’157

The Court also emphasized that the Pacifica opinion relied on attributes of broadcasting not shared by telephone communication—broadcasting’s unique pervasiveness, its intrusion into the privacy of the home without prior warning as to content, and its unique accessibility to children.158

The Sable Court concluded that “[T]he private commercial telephone communications at issue

. . . are substantially different from the public radio broadcast at issue in Pacifica.”159

After the Supreme Court decided the Pacifica and Sable cases, the U.S. Court of Appeals

for the D.C. Circuit heard a case challenging a statute (and the related FCC regulations) that

required radio and television broadcasters to channel indecent programming between midnight

and 6 a.m. In 1995, in Action for Children’s Television v. FCC,160 the court, sitting en banc,

voted 7-4 to uphold the FCC's daytime ban on indecent material, stating that the content-based

156 Id. at 729-31, 751.

157 Sable Commc’ns, Inc. v. FCC, 492 U.S. at 127 (1989).

158 Id. at 127 (citing FCC v. Pacifica Found., 438 U.S. at 748-49 (1978)). In Red Lion Broadcasting v. FCC, 395 U.S. 367, 388-89 (1969), the Supreme Court had earlier established that the physical scarcity of the electromagnetic spectrum in broadcasting justified lesser First Amendment protections. Red Lion, 395 U.S. at 388-89.

159 Sable, 492 U.S. at 127.

160 Action for Children’s Television v. FCC, 58 F.3d 654 (D.C. Cir. 1995).

179

regulations of speech met the strict scrutiny test161 because they were narrowly tailored to serve the government’s interests in facilitating parental supervision and protecting children.162 However, the court held that the midnight-to-6 a.m. safe harbor was not narrowly tailored because the statute allowed some public stations to broadcast indecent programming after 10 p.m., and Congress had failed to explain any relationship between the compelling government interest and the disparate treatment of public and nonpublic stations.163

In 1996, the Supreme Court decided its first indecency case pertaining to the Cable

Television Consumer Protection and Competition Act of 1992.164 The Court examined the constitutionality of three provisions of the Cable Act of 1992165 that were designed to protect children from exposure to indecent materials on cable.166 In the 1996 plurality opinion in Denver Area

161 To pass the strict scrutiny test, the Supreme Court has stated that a regulation must promote a compelling government interest and be the least restrictive means to further that articulated interest. Sable, 492 U.S. at 126.

162 Action for Children’s Television, 58 F.3d at 660-61. In the court’s Reconsideration Order in Action for Children’s Television v. FCC, 852 F. 2d 1332 (D.C. Cir. 1988), the court rejected broadcasters’ arguments that the Federal Communications Commission’s definition of indecency was unconstitutionally vague and overbroad. However, the court vacated the imposition of new safe harbor hours of midnight to 6 a.m. and remanded the case for further consideration. 852 F. 2d at 1338-44. In Action for Children’s Television v. FCC, 932 F. 2d 1504 (D.C. Cir. 1991), the court again rejected the vagueness and overbreadth arguments of broadcasters, industry associations and public interest groups. However, the court struck down the FCC’s twenty-four hour ban on indecent broadcasts, stating, "Our holding in ACT I that the Commission must identify some reasonable period of time during which indecent material may be broadcast necessarily means that the Commission may not ban such broadcasts entirely." 932 F. 2d at 1509. The Supreme Court denied certiorari in ACT II, 503 U.S. 913 (1992), after which Congress passed the Public Telecommunications Act of 1992, Pub. L. No. 102-356, 106 Stat. 949 (1992). Section 16(a) of the Act required the FCC to prohibit public radio and television stations from broadcasting indecent programming between 6 a.m. and 10 p.m. for stations that sign off the air at or before 12 midnight. Section 16(a) also required the FCC to prohibit commercial broadcast radio and television stations from transmitting indecent programming between 6 a.m. and 12 midnight. 58 F. 3d 658-59.

163 The exemption applied to public stations that signed off the air at midnight. Action for Children’s Television v. FCC, 58 F.3d 654, 668-69 (D.C. Cir. 1995).

164 Denver Area Educ. Telecomm. Consortium, Inc. v. FCC, 518 U.S. 727 (1996).

165 Cable Television Consumer Protection & Competition Act of 1992, Pub. L. No. 102-385 §§ 9, 10(a), (b), 106 Stat. 1484, 1486, §§ 10(a), 10(b), and 10(c) (1992) (codified at 47 U.S.C. § 532(h), 532(j)).

166 Denver Area Educ. Telecomm. Consortium, 518 U.S. at 734, 753.

180

Educational Telecommunications Consortium v. FCC,167 the Court held that the act’s grant of authority under Section 10(a)—allowing cable operators to restrict the transmission of “patently offensive” or indecent programming on leased access channels—is consistent with the First Amendment.168 The Court held the other two sections being challenged—Sections 10(b) and 10(c)—to be unconstitutional.169 The Court found Section 10(b), which required cable operators to segregate and block “patently offensive” sex-related material on a single channel and to obtain a written request from a subscriber before unblocking that channel, to be overly restrictive of adult viewers, cable system operators and programmers.170 Finally, the Court found Section 10(c)—which allowed cable operators to ban indecent programming on public access channels—to be unconstitutional. The government had failed to show that the segregating and blocking provisions of the Cable Act were the least restrictive means to further its interest in protecting children from sexually explicit programming. The Court said that a “lockbox,” which

167 Id.

168 Id. at 733.

169 Id.

170 Id. at 752-54. The Court plurality also stated that the Cable Act of 1992 allowed cable companies up to 30 days to respond to a consumer’s request to unlock a restricted channel, which the plurality said was too restrictive. According to the plurality opinion,

The . . . delays, along with single channel segregation, mean that a subscriber cannot decide to watch a single program without considerable advance planning and without letting the ‘patently offensive’ channel in its entirety invade his household for days, perhaps weeks, at a time. These restrictions will prevent programmers from broadcasting to viewers who select programs day by day (or, through ‘surfing,’ minute by minute) . . . and to viewers who would like occasionally to watch a few, but not many, of the programs on the ‘patently offensive’ channel. Moreover, the ‘written notice’ requirement will further restrict viewing by subscribers who fear for their reputations should the operator, advertently or inadvertently, disclose the list of those who wish to watch the ‘patently offensive’ channel. Id. at 754.

181

would allow parents to block programming they did not want their children to see, was a less

restrictive alternative.171

In 2000, the Supreme Court again dealt with the cable industry and the telecasting of

sexually explicit material. In United States v. Playboy,172 the Court held unconstitutional a statute173 that required cable operators either to fully scramble or block channels “primarily dedicated to sexually-oriented programming” or to limit the transmission of those channels to the “safe harbor” hours of 10 p.m. to 6 a.m., when young children were not likely to be watching.174 The

purpose of the statute was to protect non-subscribers and their children from “signal bleed,”

which occurs when portions of the scrambled programs might be heard or seen. The Court held

that the statute violated the First Amendment because the law was a content-based restriction,

and the government failed to show that the statute was the least restrictive means of serving the

government’s compelling interest.175 The Court said a less restrictive alternative existed: cable operators could notify subscribers of their right to have unwanted channels blocked at individual households upon request.176

As Congress and the Supreme Court tried to tackle the problem of protecting children from material deemed harmful, several commentators noted that problems in defining “children” and “harm” would be inherent in any proposed law, and that applying a single standard to all age groups can be problematic. For example, Professor Lili Levi said it is

171 Id. at 758.

172 United States v. Playboy Entm't Group, Inc., 529 U.S. 803 (2000).

173 Telecommunications Act of 1996, Pub. L. 104-104, Title V, Subtitle A, § 505(b), 110 Stat. 136 (1996) (codified as 47 U.S.C. § 561).

174 Playboy, 529 U.S. at 806, 826-27.

175 Id. at 813, 826-27.

176 Id. at 823-24.

182

difficult to treat young children and teenagers the same, while Professor Dale Kunkel pointed out

that educators would not use the same book to teach all age groups because children have

different capabilities and different needs.177 Similarly, commentator Marion Hefner wrote that

the variable obscenity standard would be difficult to apply because the determination of what

would be “patently offensive” would differ depending on a minor’s age.178 Moreover, Hefner stated that the “prurient interest” of a child has not been defined. “Babies and young children have no adult-like interest in sex, shameful or otherwise,” he argued.179

Attorney and historian Catherine Ross stated that the government’s use of the variable

obscenity standard does not answer the key question of whether the state is entitled to make

value judgments about speech on behalf of minors or whether parents should be making such

decisions.180 She noted that the variable obscenity standard, in the case of children, raises all of the issues of subjectivity that trouble the efforts to define indecent speech directed at adults.181

Several scholars have written that children are used as a focus when legislators and the courts want to restrict sexually explicit and violent speech.182 For instance, Professor Jack Balkin

wrote that children are “problematized”—that is, as with other issues over time, children have

become a “recurrent object for analysis, discussion, worry, and concern.”183 He said that adults

focus on children, sometimes obsessively and sometimes in an effort to solve societal ills.184

177 Sex, Violence, Children and the Media, supra note 24, at 346.

178 Hefner, supra note 108, at 870.

179 Hefner, supra note 108, at 871.

180 Ross, supra note 71, at 521.

181 Ross, supra note 71, at 521.

182 See generally Sex, Violence, Children and the Media, supra note 24, at 349.

183 Balkin, supra note 73, at 1138-39.

184 Sex, Violence, Children and the Media, supra note 24, at 352 (comments of Jack Balkin).

183

Ross has said that the government has used children as a shield from the First Amendment.185

She wrote,

The contemporary anxiety about the nexus between child development and popular culture, whether it’s rock lyrics, TV or the Internet, is sometimes couched in the secular language of the social sciences and sometimes in overt religious concern about morality. But in each instance, the underlying concern invests a great deal in an image of childhood that does not always conform to social realities.186

Thus far, federal legislation aimed at protecting minors from sexually explicit material

does not differentiate among age groups. Moreover, social scientists disagree on whether a link

exists between minors’ exposure to sexually explicit material and their attitudes and behavior.

Social Science Research on the Effects of Pornography

In the 1960s, social scientists began studying the effects of sexually explicit

material, or pornography, on young adults and mature adults.187 Many of the seminal social

science studies on the effects of exposure to pornography were conducted prior to the advent of

the Internet. Research into the content, use and effects of online pornography is in its infancy,

and therefore not much is known about the behaviors and attitudes of people who access

pornography online.188 What is known, however, is that the number of online pornography sites

185 Sex, Violence, Children and the Media, supra note 24, at 351 (comments of Catherine Ross).

186 Sex, Violence, Children and the Media, supra note 24, at 351 (comments of Catherine Ross).

187 Because of ethical constraints, researchers cannot conduct experiments in which they expose minors to sexually explicit material. However, researchers can survey minors on their access to sexually explicit material and then attempt to measure differences in attitudes between groups exposed to sexually explicit material and groups not exposed to such content. See infra notes 194-198 and 267-273 accompanying text.

188 See Patricia Goodson, Deborah McCormick & Alexandra Evans, Searching for Sexually Explicit Materials on the Internet: An Exploratory Study of College Students’ Behaviors and Attitudes, 30 ARCHIVES OF SEXUAL BEHAVIOR 101, 103, 115 (2001); see also Azy Barak & William A. Fisher, Effects of Interactive Computer Erotica on Men’s Attitudes and Behavior Toward Women: An Experimental Study 13 COMPUTERS IN HUMAN BEHAVIOR 353 (1997).

184

continues to grow.189

Thus far, studies measuring the effects of exposure to pornography on adults’ and children’s attitudes and behavior have been contradictory and inconclusive.190 One study, conducted in the mid-1980s, suggested that sexually explicit films had undesirable effects on behavior only when violence was included.191 At least five studies, conducted in the 1980s, indicated that men developed negative attitudes toward women and engaged in aggressive behavior toward them after being exposed to sexually explicit material that portrayed women as receptive, nondiscriminating and available sexual objects.192 On the other hand, another six studies, conducted in the 1980s and 1990s, indicated that men were not easily influenced or affected by such sexually explicit materials and therefore did not develop negative attitudes and behaviors toward women as a result of exposure.193

189 See Chapter 3 for a discussion of the increase in the number of online pornographic sites and filtering technology.

190 See Barak & Fisher, supra note 188, at 354. See also Aletha C. Huston, Ellen Wartella & Edward Donnerstein, Measuring the Effects of Sexual Content in the Media: A Report to the Kaiser Family Foundation at 4 (May 2003), http://www.kff.org/entmedia/1389-content.cfm; John S. Lyons, Rachel L. Anderson & David B. Larson, A Systematic Review of the Effects of Aggressive and Nonaggressive Pornography, in MEDIA, CHILDREN, AND THE FAMILY: SOCIAL, SCIENTIFIC, PSYCHODYNAMIC, & CLINICAL PERSPECTIVES 301 (Dolf Zillmann & Aletha C. Huston, eds., 1994); Azy Barak, William A. Fisher, Sandra Belfry & Darryl R. Lashambe, Sex, Guys & Cyberspace: Effects of Internet Pornography and Individual Differences on Men’s Attitudes Toward Women, 11 J. PSYCHOLOGY & HUMAN SEXUALITY 63, 65 (1999).

191 EDWARD DONNERSTEIN, DANIEL LINZ & STEVEN PENROD, THE QUESTION OF PORNOGRAPHY: RESEARCH FINDINGS AND POLICY IMPLICATIONS 176 (1987). See also Edward Donnerstein & Leonard Berkowitz, Victim Reactions in Aggressive Erotic Films as a Factor in Violence Against Women, 41 J. PERSONALITY & SOCIAL PSYCH. 710 (1981); Margaret E. Thompson, Steven H. Chaffee & Hayg H. Oshagan, Regulating Pornography: A Public Dilemma, 40 J. COMM. 73, 74 (1990).

192 J.V.P. Check & T.H. Guloien, Reported Proclivity for Coercive Sex Following Repeated Exposure to Sexually Violent Pornography, Nonviolent Dehumanizing Pornography, and Erotica, in PORNOGRAPHY: RESEARCH ADVANCES AND POLICY CONSIDERATIONS 159-84 (Dolf Zillman & Jennings Bryant, eds., 1989); E. Donnerstein, Pornography: Its Effect on Violence Against Women, in PORNOGRAPHY AND SEXUAL AGGRESSION 53-81 (N.M. Malamuth & E. Donnerstein, eds., 1984); N.M. Malamuth & J.V.P. Check, The Effects of Mass Media Exposure On Acceptance of Violence Against Women, 15 J. RESEARCH IN PERSONALITY 436-46 (1981); R.S. Wyer, G.V. Bodenhausen & T.F. Gorman, Cognitive Mediators of Reactions to Rape, 48 J. PERSONALITY & SOC. PSYCHOLOGY 324-38 (1984); and D. Zillmann, Effects of Prolonged Consumption of Pornography, in PORNOGRAPHY: RESEARCH ADVANCES AND POLICY CONSIDERATIONS 127-57 (Dolf Zillman & Jennings Bryant, eds., 1989).

193 Barak & Fisher, supra note 188, at 353-69; J. Becker & R.M. Stein, Is Sexual Erotica Associated With Sexual Deviance in Adolescent Males? 14 INT’L J. L. & POL’Y 85-95 (1991); W.A. Fisher & G. Grenier, Violent

185

Because it would be unethical for researchers to intentionally expose children to pornography, social scientists have not used minors in experiments. Moreover, virtually no empirical studies have been done on the presumed psychological, moral or developmental harm to minors that would result from their exposure to sexually explicit materials.194 However, a

2002 survey195 found that 25% of minors were inadvertently or unintentionally exposed to sexually explicit pictures on the Internet.196 While the majority did not react negatively to the exposure, one quarter reported that they were either extremely upset or very upset.197 One limitation of the study, as pointed out by the researchers, is that they did not measure the level of upset and therefore were not able to determine whether those levels were comparable to a major upset, such as being assaulted or involved in an accident, or a minor upset, such as getting a bad grade.198

Pornography, Antiwoman Thoughts, and Anti Woman Acts: In Search of Reliable Effects, 31 J. SEX RESEARCH 23- 38 (1994); R. Langevin, et. al, Pornography and Sexual Offenses, 1 ANNALS OF SEX RESEARCH 335-62 (1988); N.M. Malamuth & J. Ceniti, Repeated Exposure to Violent and Nonviolent Pornography: Likelihood of Raping Ratings and Laboratory Aggression Against Women, 12 AGGRESSIVE BEHAVIOR 129-37 (1986); V.R. Padgett, J.A. Brislin-Slutz & J.A. Neal, Pornography, Erotica, and Attitudes Toward Women: The Effects of Repeated Exposure, 26 J. SEX RESEARCH 479-91 (1989). See generally I.L. REISS, JOURNEY INTO SEXUALITY: AN EXPLORATORY VOYAGE (1986).

194 See Kimberly J. Mitchell, David Finkelhor & Janis Wolak, The Exposure of Youth to Unwanted Sexual Material on the Internet: A National Survey of Risk, Impact, and Prevention, 34 YOUTH & SOC’Y 330, 334 (2003). See also Huston, Wartella & Donnerstein, supra note 190.

195 Mitchell, Finkelhor & Wolak, supra note 194, at 336. The national sample contained 1,501 Internet-using youth, ages 10-17, with 796 boys and 705 girls in the study. The mean age for youth was 14.14 years, with a standard deviation of 1.96. Twenty percent of youths lived in single-parent households, and 46% lived in households with an income of more than $50,000. Although the sample was representative of Internet-using youth, the authors pointed out that it is not representative of all youth in the United States because Internet use is not evenly distributed among the population. Mitchell, Finkelhor & Wolak, supra note 194, at 336.

196 Mitchell, Finkelhor & Wolak, supra note 194, at 330. The terms “inadvertently” and “unintentional” mean that minors accidentally came across sexually explicit online photographs while looking for nonsexual material. Mitchell, Finkelhor & Wolak, supra note 194, at 337.

197 Mitchell, Finkelhor & Wolak, supra note 194, at 330.

198 Mitchell, Finkelhor & Wolak, supra note 194, at 350.

186

During the 1960s and the 1980s, the United States government commissioned two studies

on pornography that reached different conclusions. In 1967, Congress established an advisory

commission to study ways to deal effectively with obscenity and pornography trafficking and to conduct a “thorough study . . . of the causal relationship of such materials [obscenity and

pornography] to antisocial behavior.”199 President Lyndon Johnson appointed nineteen members to the Presidential Commission on Obscenity and Pornography in 1968.200 The commission

examined national surveys measuring the opinions of the public and professional workers,201

experimental and quasi-experimental studies on the effects of pornography, and data on the national rates of sex offenses and illegitimacy.202

In its 1970 report, the Commission wrote that it did not find sufficient evidence that

exposure to pornography caused either juveniles or adults to commit sex crimes203 or to engage in delinquent or criminal behavior.204 The Commission stated that most subjects reported no

change in sexual behavior after being exposed to pornography.205 According to the report, some

people engaged in increased masturbation or sexual intercourse, but those behaviors generally

disappeared within forty-eight hours.206 The Commission wrote, “In general, established patterns

199 Commission on Obscenity and Pornography, REPORT OF THE COMMISSION ON OBSCENITY & PORNOGRAPHY 1 (1970).

200 Id. The Commission was composed mostly of college professors but also included attorneys, ministers and a medical doctor. Id. at 634-39.

201 Professional workers included psychiatrists, psychologists, sex educators, and counselors. Id. at 160-63.

202 Id. at 23.

203 Id. at 242. The authors also noted that similar analyses in Denmark indicated that as the availability of erotica increased in that country, the number of sex crimes decreased. Id.

204 Id. at 27.

205 Id. at 25.

206 Id.

187

of sexual behavior were found to be very stable and not altered substantially by exposure to

erotica.”207

In 1985, Attorney General Edwin Meese, who served under President Ronald Reagan,

established a Commission on Pornography. The commission’s objectives were to “determine the

nature, extent, and impact on society of pornography” and to recommend “effective ways” to

“contain” the “spread of pornography, consistent with constitutional guarantees.”208 Meese

appointed twelve members to the commission,209 which examined social science studies on the effects of pornography and government reports on organized crime and the pornography

industry.210 The commission also heard testimony from thirty witnesses who said that they had

been harmed by pornography.211

The Meese Commission disagreed with the findings of the 1970 Commission on Obscenity and Pornography and was not persuaded by the earlier commission’s conclusion that pornography did not cause harm. “Many of the arguments made against

regulation . . . rest on claims of harmlessness that, as we have explained . . . are simply

erroneous,” according to the Meese report.212 The Meese Commission stated that pornography

had become more readily available213 and more graphic.214 Although the Meese Commission

207 Id.

208 Attorney General’s Commission on Pornography, Final Report of the Attorney General’s Commission on Pornography, as reprinted in FINAL REPORT OF THE ATTORNEY GENERAL’S COMMISSION ON PORNOGRAPHY li (Nashville: Rutledge Hill Press, 1986). Some deletions, “mostly inconsequential,” were made in publishing the Report, according to the publisher.

209 The Meese Commission was composed of three college professors, three attorneys, one judge, one medical doctor, one journalist, one minister, one child abuse council president, and one city council member. Id. at 477-82.

210 Id. at 246-302.

211 Id. at 197.

212 Id. at 50.

213 Id. at 24; see also Lyons, Anderson & Larson, supra note 190.

188

acknowledged the problem of “multiple causation,” with most actions and consequences caused

by several factors,215 it recommended stronger enforcement of obscenity regulations and the

enactment of new and more restrictive measures.216

During the 1980s, social science experiments on the effects of pornography led to

contradictory conclusions. In a 1982 study, Dolf Zillmann and Jennings Bryant reported that

exposure to nonviolent pornographic films caused sexual callousness toward women and

trivialization of rape.217 Five years later, in 1987, Edward Donnerstein, Daniel Linz and Steven Penrod reached a different conclusion, stating that sexually explicit films had undesirable effects on behavior only when violence was included.218

In a 1987 study, Jennings Bryant and Steven Rockwell examined the impact of sexually

explicit, though not pornographic, television shows on young teenagers. The researchers found

that 13- and 14-year-old boys and girls who were exposed to several hours of sexually-oriented

prime-time television vignettes viewed sexual indiscretions as “less bad” than did peers who were not exposed.219 The researchers also found that teenagers from homes in which families communicated openly

214 FINAL REPORT OF THE ATTORNEY GENERAL’S COMMISSION ON PORNOGRAPHY, supra note 208, at 38-41. See also Lyons, Anderson & Larson, supra note 190.

215 FINAL REPORT OF THE ATTORNEY GENERAL’S COMMISSION ON PORNOGRAPHY, supra note 208, at 34.

216 Id. at 49-65. See also SUSAN M. EASTON, THE PROBLEM OF PORNOGRAPHY 12 (1994).

217 Dolf Zillmann & Jennings Bryant, Pornography, Sexual Callousness, and the Trivialization of Rape, 32 J. COMM. 10, 10-18 (1982).

218 DONNERSTEIN, LINZ & PENROD, supra note 191, at 176; Donnerstein & Berkowitz, supra note 191, at 710; See also Thompson, Chaffee & Oshagan, supra note 191, at 73, 74.

219 Jennings Bryant & Steven Rockwell, Effects of Massive Exposure to Sexually Oriented Prime-Time Television Programming on Adolescents’ Moral Judgment, in MEDIA, CHILDREN, AND THE FAMILY: SOCIAL SCIENTIFIC, PSYCHODYNAMIC, & CLINICAL PERSPECTIVES 187-88 (Dolf Zillmann & Aletha C. Huston, eds. 1994). The researchers used the phrase “less bad” in reporting their findings, but they did not define the term in their article. The experiment used a 3 x 2 factorial design, and the subjects were asked to rate programs on spurious items designed to mask the real purpose of the study. The show episodes, aired from 1984-87, included “Dynasty”, “Hotel”, “Dallas”, “Falcon Crest”, “Knots Landing” and “Emerald Point.” The adolescents viewed vignettes dealing with sexual relations between married partners, sexual relations among unmarried partners, and nonsexual relations between adults.

189

were not greatly affected by exposure to fifteen hours of television programming showing an

alternate value culture, which they defined as “sexual relations between unmarried adults.”220

The study concluded that young teenagers’ moral judgment can be altered after they have been heavily exposed to prime-time television shows featuring sexual intimacy between unmarried persons. However, the researchers noted that several individual and family factors can mitigate, if not eliminate, any unwanted shifts in values.221

In a study released in 1988, Dolf Zillmann and Jennings Bryant reported that repeated

and prolonged exposure to nonviolent pornography can alter perceptions of sexuality and relationships.222 For example, the authors reported that the adult research subjects who were

shown pornography viewed sexual promiscuity and nonexclusive sexual relationships as more

prevalent and more acceptable to others.223 They concluded that those who consume

pornography regularly could, for better or worse, be less likely to want a monogamous marriage

and to have children.224 They also found that exposure to non-violent pornography changed an individual’s perception of rape victims,225 but later studies have not replicated these findings.226

In a different study released in 1988, other researchers repeated earlier experiments on

the effects of nonviolent pornography, including the study conducted by Zillmann and Bryant in

220 Id. at 190-91. In this 3 x 2 x 2 factorial design, programming included sexual relations between unmarried adults, nonsexual relations between adults, and no exposure to television.

221 The researchers listed one mitigating factor in their study: open family communications. Id. at 191-94.

222 See Dolf Zillmann & Jennings Bryant, Effects of Prolonged Consumption of Pornography on Family Values, 9 J. FAMILY ISSUES 518 (1988). The authors did not report the dates that they conducted their research.

223 Id. at 541.

224 Id.

225 Id. at 520.

226 Daniel G. Linz, Edward Donnerstein & Steven Penrod, Effects of Long-Term Exposure to Violent and Sexually Degrading Depictions of Women, 55 J. PERSONALITY & SOC. PSYCHOLOGY 758, 760, 766 (1988).

190

1982, and reached different conclusions. The 1988 study failed to support the premise that long-term exposure to degrading images of women in sexually explicit films would affect “subsequent beliefs and attitudes about women.”227 “In none of these studies do we find antisocial effects for exposure to pornographic material that is not overtly violent,” the researchers wrote.228

In a 1995 meta-analysis,229 which focused on adults in laboratory settings, researchers found that some connection probably exists between exposure to violent and nonviolent pornography and subsequent behavioral aggression.230 However, they acknowledged three major problems with their meta-analysis. First, the researchers were unable to generate enough data. Second, the underlying data came from experimental designs, which raises questions about

227 Id. at 765-67. In the study by Linz, Donnerstein and Penrod, researchers divided subjects into three groups, each watching a different full feature film. The first group watched films that were “overtly violent” (often referred to as “slasher” films, with violent scenes often juxtaposed to mildly erotic scenes). The second group watched films that were “not overly violent but sexually explicit” (typically containing depictions of heterosexual intercourse, lesbian activity, fellatio, and cunnilingus, but usually without rape or other forms of sexual violence). The third group watched films that were “not sexually explicit, but . . . (which) nearly always portray women as sexual objects” (often characterized by the “inordinate amount of time major character—nearly always teenagers—spend pursuing, talking about, or engaging in sex.”). Id. at 759-60. Linz, Donnerstein and Penrod did not find that long-term exposure to degrading images of women in nonviolent, sexually explicit full-length films affected the subjects’ subsequent beliefs and attitudes about women. In contrast, Zillmann’s study relied only on excerpts from an X-rated film rather than on the full film. Donnerstein, Linz and Penrod repeated Zillmann and Bryant’s experiments, conducted in 1982 and 1984, on the effects of nonviolent pornography. Donnerstein, Linz and Penrod also repeated their own experiment, conducted in 1984, on the effects of nonviolent pornography. However, the researchers were unable to replicate the findings of those studies. See also Daniel Linz & Edward Donnerstein, The Methods and Merits of Pornography Research, 38 J. COMM. 180, 180-82 (1988).

228 Linz & Donnerstein, supra note 227, at 180, 180-82.

229 A meta-analysis is a quantitative statistical analysis of several separate but similar experiments or studies in order to test the pooled data for statistical significance. A meta-analysis also can be defined as a set of statistical procedures designed to accumulate experimental and correlational results across independent studies that address a related set of research questions. See GLENN G. SPARKS, MEDIA EFFECTS RESEARCH 173 (2002) and GERALD STONE, MICHAEL SINGLETARY, & VIRGINIA P. RICHMOND, CLARIFYING COMMUNICATION THEORIES 263 (1999).
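To illustrate the idea of pooling described in note 229 (a generic sketch, not drawn from the studies cited in this chapter), a simple fixed-effect meta-analysis weights each study’s effect-size estimate by the inverse of its sampling variance:

\[ \bar{\theta} \;=\; \frac{\sum_{i=1}^{k} w_i\,\hat{\theta}_i}{\sum_{i=1}^{k} w_i}, \qquad w_i = \frac{1}{\hat{\sigma}_i^{2}}, \]

where \(\hat{\theta}_i\) is the effect size reported by study \(i\), \(\hat{\sigma}_i^{2}\) is its sampling variance, and \(k\) is the number of studies. The standard error of the pooled estimate is \((\sum_{i} w_i)^{-1/2}\), which allows the combined effect \(\bar{\theta}\) to be tested for statistical significance across the accumulated studies.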

230 See Mike Allen, Dave D’Alessio & Keri Brezgel, A Meta-Analysis Summarizing the Effects of Pornography II: Aggression After Exposure, 22 HUMAN COMMC’N RESEARCH 258, 274 (1995).

191

the generalizability231 of the findings. Third, the aggression took place in an artificial setting rather than a real-world setting.232

In 1999, a group of researchers published the findings of two independent studies on the effects of online pornography.233 The researchers, who used male undergraduate students in their samples, were unable to document negative effects of online pornography on men’s attitudes toward women.234 However, the researchers did find a relationship between preexisting individual differences and attitudes toward women, suggesting the possibility that

“hypermasculine”235 men may already have misogynist attitudes.236 While both studies were internally and externally valid,237 two limitations were evident: the small sample sizes (twenty-four and thirty-four subjects, respectively) and samples consisting only of male university undergraduates rather than males of other ages outside a university setting.238

231 Generalizability refers to external validity—whether the results of the study can be generalized to the population as a whole, to other situations and to other time periods. See ROGER WIMMER & JOSEPH DOMINICK, MASS MEDIA RESEARCH 23.

232 Allen, Alessio & Brezgel, supra note 230, at 275.

233 Azy Barak et. al, Sex, Guys & Cyberspace: Effects of Internet Pornography and Individual Differences on Men’s Attitudes Toward Women, 11 J. PSYCHOLOGY & HUMAN SEXUALITY 63, 84-86 (1999).

234 Id.

235 The researchers did not define the term “hypermasculine.”

236 See Barak et al., supra note 233, at 84-86.

237 Validity is an important component of scientific research. Internal validity refers to the ability of a study to measure what it is intended to measure, i.e., the extent to which the change in the dependent variable was actually due to the independent variable. External validity refers to the extent to which the relationship observed between the independent and dependent variables during the experiment—i.e., the results—is generalizable to the real world. See Wimmer & Dominick, supra note 231, at 29-34.

238 The authors acknowledged the limitations concerning the age and type of the samples but did not discuss the small sample sizes. See Barak et al., supra note 233, at 86.

In an article released in 2000, a team of researchers who conducted a meta-analysis reported that “exposure to pornography produces a variety of substantial negative outcomes.”239 The researchers stated that themes of aggression, gratification and objectification “may reinforce and/or justify similar attitudes and behaviours [sic] in everyday human-life contacts.”240 The team noted, though, that the relationship between the consumption of pornography and subsequent behavior does not exist in a vacuum. The researchers wrote, “While likely not a solitary influence, it appears that exposure to pornography is one important factor which contributes directly to the development of sexually dysfunctional attitudes and behaviours.” [sic]241

With the advent and expansion of the Internet, researchers have questioned whether Internet pornography can be conceptualized as traditional pornography viewed through a new medium242 or whether online pornography is quite different from that found in other media.243 For instance, some researchers view Internet pornography as distinct from other media because of its interactivity. Users participate in an active and interactive environment involving animated

239 Elizabeth Oddone-Paolucci, Mark Genuis & Claudio Violato, A Meta-Analysis of the Published Research on the Effects of Pornography, in THE CHANGING FAMILY AND CHILD DEVELOPMENT 52 (Claudio Violato, Elizabeth Oddone-Paolucci & Mark Genuis, eds., 2000). The researchers analyzed forty-six studies published from 1962 to 1995, most of which were conducted in the United States. A total of 12,323 individuals participated in the studies.

240 Id.

241 Id.

242 See Michael D. Mehta & D. Plaza, Content Analysis of Pornographic Images Available on the Internet, 13 THE INFO. SOC’Y 153, 154, 161 (1997).

243 See Chad Mahood, Sriram Kalyanaraman & S. Shyam Sundar, The Effects of Erotica and Dehumanizing Pornography in an Online Interactive Environment, paper presented at 83rd annual convention of the Association for Education in Journalism and Mass Communication, Phoenix, Ariz. (Aug. 2000); S. S. Sundar, Technological Issues in Internet Pornography, paper presented at annual convention of the Association for Education in Journalism and Mass Communication, New Orleans, La. (Aug. 1999).

images, sequences and games.244 Users also can produce their own online pornography.245 Finally, users have privacy (in contrast to visiting adult bookstores), the ability to download select images (rather than having to buy a whole magazine or video), easy and discreet storage on their own computers, and ever-expanding access to pornographic Web sites.246 Even with the interactive environment of computer pornography, Azy Barak and William Fisher reported that males’ use of computer pornography did not affect their attitudes and behavior toward women.247 Chad Mahood, Sriram Kalyanaraman and Shyam Sundar argued that further research is needed to determine whether interactive pornography has a more negative impact on consumers than non-interactive pornography or even non-interactive obscenity.248

Despite numerous studies on the effects of pornography consumption on users, a group of researchers has said that a key question remains: Under what conditions does behavior

244 Ven-hwei Lo & Ran Wei, Third-Person Effect, Gender, and Pornography on the Internet, 46 J. BROAD. & ELEC. MEDIA 13, 13-14 (2002).

245 Id.

246 Marty Rimm, Marketing Pornography on the Information Superhighway, 83 GEORGETOWN L.J. 1849, 1852 (1995).

247 Barak & Fisher, supra note 188, at 357-66. The researchers used the experimental method and the survey method in their study. The researchers conducted an experiment using 100 first-year university males and divided them into four groups: one control group viewing neutral (non-sexual) stimulus/passive viewing pictures and three experimental groups—erotic stimulus/passive viewing, erotic stimulus/moderately interactive viewing and erotic stimulus/highly interactive viewing groups. Men in the experimental groups were shown 56 sexually explicit color pictures that portrayed various sexual acts and nude women, including close-ups of women’s breasts and genitals. In the passive viewing group, the men viewed each picture for 30 seconds. In the other two experimental groups, the men were able to interact with the computer at two different levels: moderately interactive viewing and highly interactive viewing. At the moderately interactive viewing level, the men could determine the viewing time for each picture, browse back and forth among the pictures, and control the brightness of each picture. At the highly interactive viewing level, the men had the same options as the moderately interactive group, but also could control four additional parameters. First, the men could change the color tones of the pictures. Second, the men could change and control size and proportion of every picture, making it larger or smaller or skewing it vertically or horizontally. Third, men could change and control the color style of all of the pictures, such as creating pop-art presentations in which the pictures had checkered or other color patterns. Fourth, the men could zoom in and zoom out on one third of the photos in the group, thus allowing the men to magnify specific areas of those individual pictures. (The researchers did not discuss how the photos were selected for the zooming feature). The researchers used a survey to measure the men’s attitudes after viewing the pictures.

248 Mahood, Kalyanaraman & Sundar, supra note 243.

change?249 For example, if a person is predisposed to aggressive behavior, the arousal produced by exposure to sexual materials may enhance the aggressive predispositions.250 Conversely, researchers have found that exposure to nudity diminished subsequent aggressive behavior.251

Pornography and the Third Person Effect Theory

In studies analyzing the connection between pornography and support for censorship,252 researchers have looked at the “third-person effect” theory—the tendency for people to think others are more affected by mass media messages than they themselves are.253 Several studies have found that people perceive pornography as having a greater negative influence on others than on themselves.254

In a study published in 1990, Margaret Thompson, Steven Chaffee and Hayg Oshagan found that the largest predictor of attitudes toward regulation of pornography was the perceived effect of pornography on others.255 A few years later, Hernando Rojas, Dhavan Shah and Ronald Faber surveyed students at a large Midwestern university and concluded that people would judge the media, pornography, and violence on television as having a greater impact on others than on

249 Allen, D’Alessio & Brezgel, supra note 230, at 274.

250 This is known as excitation transfer theory. Allen, D’Alessio & Brezgel, supra note 230, at 274.

251 Allen, D’Alessio & Brezgel, supra note 230, at 261.

252 See Hernando Rojas, Dhavan V. Shah & Ronald J. Faber, For the Good of Others: Censorship and the Third-Person Effect, 8 INT’L J. PUB. OPINION RESEARCH 163 (1996); Albert C. Gunther, Overrating the X-Rating: The Third Person Perception and Support for Censorship of Pornography, 45 J. COMMC’N 27 (1995); Lo & Wei, supra note 244.

253 The third-person effect is a theory of communication that was first asserted by W. Phillips Davison in 1983, and a number of subsequent studies, including experiments and surveys, have supported this theory. For a concise summary of third-person effect studies, see Rojas, Shah & Faber, supra note 252, at 165.

254 See Gunther, supra note 252; Rojas, Shah & Faber, supra note 252; Ven-hwei Lo & Anna R. Paddon, How Sexual Strategies Theory, Gender, and the Third-Person Effect Explain Attitudes About Pornography, paper presented at annual convention of the Association for Education in Journalism and Mass Communication, New Orleans, La. (Aug. 1999).

255 Thompson, Chaffee & Oshagan, supra note 191, at 81.

themselves, thus supporting the third-person effect model.256 Research participants who strongly believed that pornography affected others more than themselves were more likely to display pro-censorship attitudes.257 The authors pointed out several limitations of the study, including the non-random sample and therefore the lack of generalizability to the population as a whole.258

Renowned researcher Albert Gunther emphasized that a complexity of circumstances—and not just perceived harmful effects of pornography—would lead to pro-censorship attitudes, such as “gender differences, political ideology, attitudes toward freedom of expression and . . . one’s own exposure to pornography.”259 While studies continue to support the existence of a third-person effect, researchers point out a major problem in trying to implement policy based on this effect: either people overestimate the effects of pornography and other messages on others, or else they underestimate the effects on themselves, or both.260

Pornography and Children

Although studies on the effects of pornography have been inconclusive, professionals who work with children have stated that pornography harms children.

In testimony before a U.S. House of Representatives subcommittee in 1998, Dr. Gary Brooks, a psychologist, said that a child’s sexual development occurs gradually throughout childhood, and exposure to pornography does not provide children with “a normal sexual perspective.” “Unlike learning provided in an educational or home setting, exposure to pornography is counterproductive to the goal of healthy and appropriate sexual development in

256 Rojas, Shah & Faber, supra note 252, at 174-75.

257 Rojas, Shah & Faber, supra note 252, at 181-82.

258 Rojas, Shah & Faber, supra note 252, at 182.

259 Gunther, supra note 252, at 29-30.

260 Rojas, Shah & Faber, supra note 252, at 182; Gunther, supra note 252, at 35.

children,” Brooks said. Pornography harms children because it inundates their minds with “graphic messages about their bodies, their own sexuality, and those of adults and children around them,” Brooks testified.261

Dr. Mary Anne Layden, a psychotherapist, testified at the same House subcommittee hearing and said that pornography is harmful to minors and produces “permission-giving beliefs” for sexual pathology and sexual violence. Therefore, she said, children exposed to pornography can become victims or victimizers.262

Dr. Victor Cline, a clinical psychologist who treats children and adults who have been “injured by exposure to pornography,”263 stated that several experimental, field and clinical studies indicated that pornography could result in harm.264 However, he said that not everyone exposed to pornography would be adversely affected by it, although some people can suffer eventual harmful effects from repeated exposure to pornography.265 “[A]s with using alcohol or even some of the highly addictive drugs, not everyone exposed will become alcoholic or addictive, at least in the early stages of use,” Cline said.266

261 See Committee on Commerce’s Report on Child Online Protection Act (to accompany H.R. 3783), H.R. REP. NO. 105-775 (2d Sess. 1998), at 11 (statement of Dr. Gary Brooks, Assistant Chief of Psychology Services, Dept. of Veterans Affairs, The Centerfold Syndrome). For a discussion of the Child Online Protection Act, see Chapter 5.

262 See Comm. on Commerce’s Report on Child Online Protection Act (to accompany H.R. 3783), H.R. REP. NO. 105-775 (2d Sess. 1998), at 11 (statement of Mary Anne Layden, Ph.D., Director of Education, Dept. of Psychiatry, Center for Cognitive Therapy, Univ. of Pennsylvania).

263 Victor B. Cline, Pornography Effects: Empirical and Clinical Evidence, in MEDIA, CHILDREN, AND THE FAMILY: SOCIAL SCIENTIFIC, PSYCHODYNAMIC, & CLINICAL PERSPECTIVES 245 (Dolf Zillmann & Aletha C. Huston, eds., 1994). Cline said he has treated more than 200 sex offenders, as well as sex addicts, and victims of rape or sexual abuse. He treated a brother and sister (ages 14 and 13, respectively) who began having a sexual relationship after finding their father’s pornography collection. Id. at 233, 243. He treated two brothers, ages 9 and 10, who watched their parents’ X-rated videos and forced younger siblings and neighbor children to strip naked, after which they forced dirt, sticks, and small rocks into their rectums. Id. at 244.

264 Id. at 232-41.

265 Id. at 245.

266 Id. at 245.

In a 1998 report to the Kaiser Family Foundation, researchers stated that “[s]cripts and schemas learned in childhood have particular importance because children do not have well-developed ideas and understandings of sexuality. Content viewed later may modify such schemas or reinforce them, but will not have quite the ‘primacy’ of what was initially learned.”267 In a report on the effect of media messages on children, a team of researchers wrote that individuals react differently, as do subgroups, based on a variety of factors.268 They also stated that while media can facilitate change, change is not easy. For example, exposure to depictions of “risky” sexual behavior is not likely to change people who have been socialized to practice safe sex.269 They concluded that exposure to sexually explicit media messages rarely leads to antisocial behavior or reactions, but rather a combination of factors—including exposure to mass media—leads to a behavioral response.270

In a 1999 article, Dr. Elissa Benedek wrote that an unpublished study indicated that children who had been exposed to televised pornography were adversely affected.271 In contrast to children who had been sexually abused, children who were exposed to televised pornography displayed sexually reactive behaviors, including excessive masturbation with an object, insertion of an object into one’s own anus or vagina, or simulation of sexual intercourse or oral copulation

267 Huston, Wartella & Donnerstein, supra note 190, at 13. Huston researches child development, Wartella researches child development and mass media violence, and Donnerstein researches mass media violence and media policy.

268 Huston, Wartella & Donnerstein, supra note 190, at 35.

269 Huston, Wartella & Donnerstein, supra note 190, at 35-36.

270 Huston, Wartella & Donnerstein, supra note 190, at 42.

271 Elissa P. Benedek & Catherine F. Brown, No Excuses: Televised Pornography Harms Children, 7 HARV. REV. PSYCHIATRY 236, 237 (1999) (citing an unpublished manuscript by D.M. Elliott, a government lawyer). No further citation information was provided in the article. Benedek is a psychiatrist specializing in child, adolescent and forensic psychiatry and served as president of the American Psychiatric Association from 1990-91 and as director of research and training at the Center for Forensic Psychiatry from 1980-1997. Brown, M.Ed., is executive editor of Psychiatric News, the newspaper of the American Psychiatric Association.

with a same-age child.272 Benedek said that children may be harmed by televised pornography because “television is a primary and effective sex educator, regardless of whether the information is accurate or wanted.”273

Conclusion

Congress has stated that while parents are primarily responsible for childrearing, they cannot always monitor their children, and therefore the government has the right to regulate the sale of sexually explicit material.274 A federal appellate court has held that the government has a compelling interest in helping parents supervise their children.275 In Ginsberg v. New York,276 the Supreme Court upheld a variable obscenity standard, under which material that is obscene for minors may not be obscene for adults.277 The Court did not, however, prohibit parents from sharing sexually explicit material with their children.278

Although social science studies measuring the effects of pornography have been inconclusive, Congress and the Supreme Court have acknowledged a compelling government interest in protecting minors from material deemed harmful, including sexually explicit content.

Historically, the government’s interest in protecting minors has focused on regulating obscenity

272 Id. at 237.

273 Id. at 238.

274 See H.R. REP. NO. 105-775 (2d Sess. 1998), at 12 (citing People v. Kahan, 15 N.Y.2d 311, 312, 206 N.E.2d 333, 334 (1965) (as cited in Ginsberg v. New York, 390 U.S. 629, 640 (1968))).

275 Action for Children’s Television v. FCC, 11 F.3d 170, 177 (D.C. Cir. 1993). The Court wrote, “The government's asserted interest in protecting children also includes its independent interest in protecting the well-being of vulnerable youth and in shielding them from physical and psychological abuse.” Id.

276 Ginsberg v. New York, 390 U.S. 629 (1968).

277 Id. at 645.

278 Id. at 639.

and indecency. With the advent and growth of the Internet, however, Congress became increasingly concerned with minors’ ready access to online pornography.

CHAPTER 5
FEDERAL ATTEMPTS AT PROTECTING MINORS FROM ONLINE MATERIAL DEEMED HARMFUL

Introduction

Before Congress passed the Children’s Internet Protection Act in 2000,1 a law that required public libraries and schools to install Internet filters on user computers, it had tried twice to restrict minors’ access to online pornography.

Congress first attempted to regulate sexually explicit online content when it passed the Communications Decency Act of 1996, one portion of the Telecommunications Act of 1996.2 The CDA criminalized the intentional online transmission of child pornography and “obscene,” “indecent,”3 and “patently offensive” material4 to anyone under the age of 18.5 The Supreme Court struck down the statute in 1997.6

A year after the Supreme Court struck down the CDA, Congress passed the Child Online Protection Act of 1998,7 restricting minors’ access to sexually explicit commercial materials on

1 Pub. L. No. 106-554, § 1(a)(4), 114 Stat. 2763 (2000) (codified at 20 U.S.C. § 9134(f)(1)(A) & 47 U.S.C. § 254(h)(6)(B) & (C)). The CIPA was challenged and subsequently upheld by the U.S. Supreme Court. See United States v. Am. Library Ass'n, 539 U.S. 194 (2003). See Chapter 7 for a complete discussion of the CIPA.

2 See Commc’ns Decency Act of 1996, Pub. L. No. 104-104, § 551, 110 Stat. 56, 133-39 (1996).

3 47 U.S.C. § 223(a)(1)(b).

4 47 U.S.C. § 223(d)(1).

5 47 U.S.C. § 223 (d)(1)(b) (1996). In addition, the CDA mandated that televisions with a diagonal screen size of thirteen inches or greater (and manufactured after 1997) contain a V-chip, which allows viewers to block television programs based on content, including sexual and violent content. See Commc’ns Decency Act of 1996, Pub. L. No. 104-104, 110 Stat. 56, 133-39 (1996); Telecommunications Act of 1996, Pub. L. No. 104-104 551, 110 Stat. 56, 139 (1996). For such a system to work, programmers had to use a uniform rating system, and in 1996, the entertainment industries agreed to use the Television Parental Guidelines (“TVPG”). See Otilio Gonzalez, Regulating Objectionable Content in Multimedia Platforms: Will Convergence Require a Balance of Responsibilities Between Senders and Receivers?, 20 SANTA CLARA COMPUTER & HIGH TECH. L.J. 609, 626-29 (2004). See infra notes 13 - 60 and accompanying text for a discussion of the CDA.

6 Reno v. ACLU, 521 U.S. 844 (1997).

7 Pub. L. No. 105-277, 112 Stat. 2681 (1998) (codified at 47 U.S.C. § 231).

the World Wide Web8 and prohibiting the sale of sexually explicit materials on the Internet to minors.9 The Supreme Court barred the government from enforcing the Child Online Protection Act (COPA) in 2002 and 2004.10 In 2002, the Supreme Court remanded the case to the U.S. Court of Appeals for the Third Circuit for further examination and clarification. In 2004, the Supreme Court remanded the case to the U.S. District Court for the Eastern District of Pennsylvania to determine whether software filters would work as well as a criminal law in shielding children from online pornography.11 In 2007, a U.S. District Court held that the COPA facially violated the First Amendment because it was a content-based statute that did not meet the strict scrutiny test—that is, the COPA was not narrowly tailored to serve the compelling government interest of protecting minors from material deemed harmful.12

Communications Decency Act of 1996

The CDA originated in the concerns of Sen. James Exon (D-Neb.) about what he viewed as the ready availability of pornography and indecency on the Internet13 and his desire to protect children from such content.14 To garner support for the bill, Exon compiled a “Blue Book,” a collection of pornographic material that he had downloaded from the Internet, put in

8 See H.R. REP. NO. 105-775 (2d Sess. 1998), at 10.

9 Id. at 5, 12.

10 See Ashcroft v. ACLU, 535 U.S. 564, 566 (2002); Ashcroft v. ACLU, 542 U.S. 656, 668 (2004).

11 See Ashcroft v. ACLU, 542 U.S. at 673.

12 ACLU v. Gonzales, 478 F. Supp. 2d 775, 809-10, 821 (E.D. Pa. 2007). For a summary of the district court’s opinion, see infra notes 196-213 and accompanying text.

13 See 141 CONG. REC. 15503 (June 9, 1995) (statement of Sen. Exon).

14 See id.

a blue folder and made available to other senators for review.15 Newt Gingrich, a prominent Republican and Speaker of the House at the time, stated that the CDA violated free speech.16 Senator Patrick Leahy (D-Vt.) also opposed the CDA, saying the government should not take additional steps to regulate the Internet.17

A House Conference Report on the Telecommunications Act of 1996 stated that “the federal government has a compelling interest in shielding minors from indecency.”18 The CDA contained two sections that Congress wrote in an effort to protect minors from online pornography.19 The CDA criminally prohibited the intentional transmission, by means of “any interactive computer service,” of any communication containing child pornography and “obscene or indecent” material20 to anyone under the age of 18.21 The CDA also criminally prohibited the intentional transmission, by means of “any interactive computer service,” of any communication that, “in context, depicts or describes, in terms patently offensive as measured by contemporary community standards, sexual or excretory activities or organs”22 to

15 See 141 CONG. REC. 15503-04 (June 9, 1995) (statement of Sen. Exon). See also Joel Sanders, The Regulation of Indecent Material Accessible to Children on the Internet: Is it Really Alright to Yell Fire in a Crowded Chat Room? 39 CATHOLIC L. 125, 133 (1999).

16 Robert Cannon, The Legislative History of Senator Exon’s Communication Decency Act: Regulating Barbarians on the Information Superhighway, 40 FED. COMM. L.J. 51, 67 (1996).

17 See id. at 65.

18 House Conference Report on the Telecommunications Act of 1996, H. REP. NO. 104-458 (2d Sess. 1996), at 188.

19 See id. Congress also wanted to promote the development of the Internet, protect Internet Service Providers that tried to screen objectionable material, and “remove disincentives” for developing and using blocking and filtering technology that would empower parents to restrict their children’s access to objectionable or inappropriate online material. See 47 U.S.C. §§ 230(b) & (c).

20 47 U.S.C. § 223(a).

21 47 U.S.C. § 223(d)(1)(a).

22 47 U.S.C. § 223(d)(1)(b).

anyone under the age of 18.23 Those found in violation of the CDA could be fined, imprisoned, or both.24

Immediately after President Bill Clinton signed the Communications Decency Act into law on Feb. 8, 1996, the American Civil Liberties Union and other plaintiffs filed suit, requesting a preliminary injunction against the enforcement of the CDA and stating that it was unconstitutional because it was overbroad and vague.25 The plaintiffs challenged only the “indecency” and “patently offensive” provisions of the CDA and not the obscenity or child pornography provisions.26

Federal District Court Judge Ronald L. Buckwalter granted a temporary restraining order enjoining the government from enforcing the CDA.27 He said that the use of the term “indecent” might be unconstitutionally vague because it could be applied to “a whole range of conduct not encompassed by ‘patently offensive.’”28

Four months later, a three-judge federal district court panel granted a preliminary injunction, stating that the CDA violated the First Amendment.29 Chief Judge Dolores Sloviter authored the panel’s opinion on behalf of Judge Ronald L. Buckwalter and herself. Sloviter stated that Congress did not define the terms “indecency” and “patently offensive.”30

23 47 U.S.C. § 223 (d)(1)(b) (1996).

24 47 U.S.C. § 223(d)(2) (1996).

25 ACLU v. Reno, 1996 U.S. Dist. LEXIS 1617 at *3 and *9-10; 24 Media L. Rep. 1379 (E.D. Pa. 1996).

26 ACLU v. Reno, 929 F. Supp. 824, 829 (1996). Obscenity and child pornography were prohibited by law prior to the enactment of the CDA. Id.

27 ACLU v. Reno, 1996 U.S. Dist. LEXIS 1617 at *9-10; 24 Media L. Rep. 1379 (E.D. Pa. 1996).

28 Id. at *6-7.

29 ACLU v. Reno, 929 F. Supp. 824 (E.D. Pa. 1996).

30 Id. at 857.

She said that those provisions of the CDA would have a “chilling effect” on free expression31 and held that those two provisions were unconstitutional.32 She wrote that the “indecency” and “patently offensive” provisions were a content-based restriction on speech and were not narrowly tailored,33 and therefore could not pass the strict scrutiny test.34 Under strict scrutiny, any content-based speech regulation must serve a compelling government interest and be narrowly tailored to further that interest.35 The court found that existing technology could not effectively prohibit minors from accessing online content without also denying access to adults.36

On direct appeal, as stipulated under a special review provision of the Communications Decency Act,37 the Supreme Court affirmed the federal district court’s holding, stating that two sections of the CDA were unconstitutional.38 In a 7-2 decision, the Supreme Court struck down the provisions that would have restricted minors from accessing online pornography. The Court found the provisions overbroad and vague, and therefore in violation of the First Amendment.39

31 Id. at 851.

32 Id. at 849.

33 Id. at 851.

34 Id.

35 Id. To pass the strict scrutiny test, the Supreme Court stated that a regulation must promote a compelling government interest and be the least restrictive means to further that articulated interest. See Sable Commc’ns of Calif., Inc. v. FCC, 492 U.S. 115, 126 (1989).

36 ACLU v. Reno, 929 F. Supp. at 845.

37 Under the special review provisions of the CDA, the government could—but was not required to—appeal directly to the United States Supreme Court. See 47 U.S.C. § 561(b).

38 Reno v. ACLU, 521 U.S. 844, 885 (1997).

39 Id. at 864.

Justice John Paul Stevens, writing the Court’s opinion, acknowledged that the government has a compelling interest in protecting children from harmful materials,40 but said that the government has a “heavy burden” in explaining why a less restrictive law would not be as effective.41 Stevens said that the CDA was not narrowly tailored to meet the government interest.42 He reasoned that “that interest does not justify an unnecessarily broad suppression of speech addressed to adults.43 As we [Supreme Court justices] have explained, the Government may not ‘reduce the adult population . . . to . . . only what is fit for children.’”44 In applying the strict scrutiny standard to the CDA, the Court said that while some other media, including broadcasting, received a lower level of scrutiny for various reasons—such as the history of extensive government regulation, the scarcity of available frequencies, and its “invasive” nature—those factors were not present in cyberspace.45

In finding the CDA overbroad, Justice Stevens wrote:

[The CDA] lacks the precision that the First Amendment requires when a statute regulates the content of speech. In order to deny minors access to potentially harmful speech, the CDA effectively suppresses a large amount of speech that adults have a constitutional right to receive and to address to one another. That burden on adult speech is unacceptable if less restrictive alternatives would be at least as effective in achieving the legitimate purpose that the statute was enacted to serve.46

40 Id. at 875.

41 Id. at 879.

42 Id.

43 Id. at 875.

44 Id. (quoting Sable Commc’ns of Calif., Inc. v. FCC, 492 U.S. at 128 (1989) and Butler v. Michigan, 352 U.S. at 383 (1957)).

45 Id. at 868-69. According to the Court, in contrast to a time/place/manner regulation, “[T]he CDA is a content-based blanket restriction on speech” that requires “the application of the most stringent review of its provisions.” Id. at 868.

46 Id. at 874.

Justice Stevens said that the CDA was vague because it did not define the terms “indecent” and “patently offensive.”47 Stevens said the Court was troubled by the CDA’s vagueness for two reasons. First, he said that the lack of precision in the statute’s wording could lead to a “chilling effect” on speech, which would violate the First Amendment. Precise wording is needed when a regulation restricts speech, Stevens wrote. Second, he said that the fear of criminal prosecution could silence potential speakers.48 Even parents could be prosecuted under the statute, Stevens said. For example, parents who allowed their children Internet access or sent their 17-year-old college freshmen birth control information in an e-mail message could be fined or imprisoned or both, he said.49

Justice Stevens contrasted the Supreme Court’s decision on the CDA with its 1968 decision in Ginsberg v. New York.50 The Ginsberg Court upheld a New York state statute that

47 Id. at 871-72. Under the Radio Act of 1927 (the precursor to the Federal Communications Act of 1934), the Federal Radio Commission (the predecessor to the Federal Communications Commission) prohibited broadcasters from airing obscenity, indecency and profanity. The Federal Communications Act of 1934 established the FCC and gave it the authority to sanction broadcasters that aired obscene, indecent or profane content. In 1948, the Criminal Code was revised to include provisions that had previously been located in other titles of the United States Code. At that time, the prohibition against obscene, indecent, and profane broadcasts was removed from the Communications Act and re-enacted as 18 U.S.C. § 1464: “Whoever utters any obscene, indecent, or profane language by means of radio communication shall be fined under this title or imprisoned not more than two years, or both.” In FCC v. Pacifica Foundation, the Supreme Court upheld the FCC’s power to punish broadcasters for airing indecent content because broadcasting was considered intrusive and uniquely accessible to children. See FCC v. Pacifica Found., 438 U.S. 726, 748-49 (1978). The legislative history of the CDA indicates that the definition of “indecency” in the act was meant to be the same definition established by the FCC and upheld by the United States Supreme Court, i.e., “[The] concept of 'indecent' is intimately connected with the exposure of children to language that describes, in terms patently offensive as measured by contemporary community standards for the broadcast medium, sexual or excretory activities and organs, at times of the day when there is a reasonable risk that children may be in the audience." See Pacifica, 438 U.S. at 731-32 (citing 56 F.C.C. 2d at 98). A conference committee for the CDA acknowledged that although the definitions of indecency have varied slightly depending on communications medium, the essence of the definition of indecency has been consistent as “patently offensive descriptions of sexual and excretory activities.” See H.R. Conf. Rep. No. 458, 104th Cong. 2d Sess. 188-89 (1996).

48 Reno v. ACLU, 521 U.S. 844, 871-74 (1997).

49 Id. at 878.

50 Ginsberg v. New York, 390 U.S. 629 (1968). See Chapter 4, supra notes 129 to 143, for a discussion of Ginsberg.

prohibited the sale of pornography to minors under the age of 17.51 In the CDA opinion, Stevens wrote that there were four key differences between the New York state statute and the Communications Decency Act. First, the Ginsberg Court did not prohibit parents from buying sexually explicit material for their children. The CDA, however, made parental participation and consent irrelevant. Second, the New York statute applied to commercial transactions only, whereas the CDA applied to both commercial and non-commercial transactions. Third, the New York statute defined material that was harmful to minors as that which is “utterly without redeeming social importance for minors.” In contrast, the CDA did not define the terms “indecent” or “patently offensive.”52 Fourth, the New York statute defined minors as persons under the age of 17, whereas the CDA defined minors as persons under the age of 18.53

The Supreme Court agreed with the district court’s findings that existing technology, as of 1996, did not allow Web sites to verify age. Even if age verification were possible, it would be very expensive for non-commercial and some commercial Web sites to try to verify that users were adults.54

In declaring the CDA unconstitutional, the Supreme Court reasoned that although sexually explicit material was widely available online in 1997, Internet users would rarely encounter such material accidentally. Justice Stevens wrote,

Unlike communications received by radio or television, “the receipt of information on the Internet requires a series of affirmative steps more deliberate and directed than merely turning a dial. A child requires some sophistication

51 Ginsberg, 390 U.S. at 639.

52 Reno v. ACLU, 521 U.S. 844, 865-66 (1997).

53 Id. at 865-66.

54 Id. at 876-77.

and some ability to read to retrieve material and thereby to use the Internet unattended.”55

Justice Sandra Day O’Connor, joined by Chief Justice William Rehnquist, authored an opinion concurring in the judgment in part and dissenting in part.56 O’Connor wrote that the creation of “adult zones” on the Internet would be constitutionally sound, arguing that the states have denied minors access to adult establishments and to speech deemed “harmful to minors.”57 However, O’Connor agreed with the Court that it was impossible to restrict online speech to an “adult zone,” and therefore speakers could not be assured that their speech would only reach adults.58 She also agreed with the Court that the CDA’s prohibition of indecent transmissions could not constitutionally be applied to chat rooms because there was no way to exclude minors from online chat rooms.59 However, O’Connor said she would have upheld the CDA as it applied to “indecent speech” between adults and minors when the adults knew they were communicating with minors.60

Child Online Protection Act of 1998

In an attempt to remedy the deficiencies of the Communications Decency Act, Congress enacted the Child Online Protection Act61 in 1998, despite warnings from the Justice Department concerning the constitutionality of the COPA.62 President Bill Clinton signed the COPA into law

55 Id. at 854 (quoting ACLU v. Reno, 929 F. Supp. 824, 844-45 (E.D. Pa. 1996)).

56 Id. at 886 (O’Connor, J., dissenting).

57 Id. at 887-88.

58 Id. at 891.

59 Id. at 893.

60 Id. at 897.

61 See Child Online Protection Act, Pub. L. No. 105-277, 112 Stat. 2681 (1998) (codified at 47 U.S.C. § 231).

62 See Letter from L. Anthony Sutin, Acting Assistant Attorney General, to Thomas Bliley, Chairman of the Comm. on Commerce, U.S. H.R. (Oct. 5, 1998) available at http://www.eff.org/legal/cases/ACLU_v_Reno_II/HTML/19981005_doj_congress.letter.html (last visited July 20, 2009). See also American Civil Liberties Union, Censorship in a Box: Why Blocking Software Is Wrong for Public Libraries (2002), http://www.aclu.org/privacy/speech/14915pub20020916.html (last visited July 20, 2009).

on October 21, 1998, as part of the Omnibus Appropriations Act.63 The COPA, which was to go into effect on November 29, 1998,64 was challenged immediately. The COPA was brought before the federal courts on seven different occasions between 1998 and 2007.

Based on Supreme Court precedent establishing the government’s compelling interest in protecting children,65 Congress passed the COPA in an effort to restrict minors’ access to sexually explicit commercial materials on the World Wide Web,66 including “teasers,”67 and to prohibit the sale of sexually explicit materials on the Internet to minors.68 A 1998 House report stated that minors had ready access to pornographic materials online,69 and parental control protections70 and self-regulation had not provided a national solution to protect minors from online pornography.71 In the debate on the proposed bill, Congressman Billy Tauzin (R-La.) said that the bill addressed the Supreme Court’s concerns in Reno,72 including “a narrow prohibition, tighter definition, and a realization that the applicability of the law may change as technology is involved.”73 In its 1998 report, the Commerce Committee recommended passage of the

63 Pub. L. No. 105-277, 112 Stat. 2681 (enacting H.R. 4328, 105th Cong. (1998)).

64 Id.

65 See H.R. REP. NO. 105-775 (2d Sess. 1998), at 11. See also Ginsberg v. New York, 390 U.S. 629 (1968); New York v. Ferber, 458 U.S. 747 (1982).

66 See H.R. REP. NO. 105-775 (2d Sess. 1998), at 10.

67 Teasers are free semi-nude or nude images that are designed to encourage Internet users to enter a site containing more sexually explicit content, usually for a fee. See ACLU v. Reno, 31 F. Supp. 2d 473, 476 (1999).

68 H.R. REP. NO. 105-775 (2d Sess. 1998), at 5, 12.

69 Id. at 8-10.

70 Id. at 9.

71 Id. at 17.

72 See Reno v. ACLU, 521 U.S. 844, 865 (1997).

73 See 144 CONG. REC. H9906 (1998) (statement of Rep. Tauzin).

COPA, stating that an international news service had reported that nearly 50% of material on the Web was “unsuitable” for children.74

The COPA was somewhat more narrowly tailored than the CDA. The COPA defined minors as persons under the age of 17, just as the New York statute upheld by the Supreme Court in Ginsberg75 did, whereas the CDA defined minors as persons under the age of 18.76 Unlike the CDA, the COPA would not prohibit parents from obtaining online pornography for their children.77 In addition, the COPA legislation did not include the “indecent” and “patently offensive” language of the CDA.78 The COPA restricted a “narrower category” of content,79 material considered harmful to minors, which was defined as:

[A]ny communication, picture, image, graphic image file, article, recording, writing, or other matter of any kind that—
(A) the average person, applying contemporary community standards, would find, taking the material as a whole and with respect to minors, is designed to appeal to, or is designed to pander to, the prurient interest;
(B) depicts, describes, or represents, in a manner patently offensive with respect to minors, an actual or simulated sexual act or sexual contact, an actual or simulated normal or perverted sexual act, or a lewd exhibition of the genitals or post-pubescent female breast; and

74 See H.R. REP. NO. 105-775 (2d Sess. 1998), at 10 (citing “Half of ‘Net Content Said Unsuitable for Children,” REUTERS FINANCIAL SERV. (Jan. 10, 1996)).

75 Ginsberg v. New York, 390 U.S. 629 (1968). See supra notes 50-53 and accompanying text for a summary of Ginsberg. For a discussion of Ginsberg and the variable obscenity standard, see Chapter 4, supra notes 129-143 and accompanying text.

76 See Reno v. ACLU, 521 U.S. at 865-66 (1997).

77 Timothy Zick, Congress, the Internet, and the Intractable Pornography Problem: The Child Online Protection Act of 1998, 32 CREIGHTON L. REV. 1147, 1178 (1999).

78 See supra notes 20-24 and accompanying text for the text of the CDA.

79 Ashcroft v. ACLU, 535 U.S. 564, 569-70 (2002). The U.S. Supreme Court said that the COPA’s restriction on “material that is harmful to minors” was “narrower” than the CDA’s restrictions on “indecent” and “patently offensive” communications. Id.

(C) taken as a whole, lacks serious literary, artistic, political, or scientific value for minors.80

In amending an earlier version of the COPA bill, the House Committee on Commerce adopted the “variable obscenity” standard.81 The committee stated that its intention was “for the definition of material harmful to minors to parallel the Ginsberg82 and Miller83 definitions of obscenity and harmful to minors.”84 Similar to Ginsberg and unlike Miller, however, the committee defined the online “community standard” as a “variable obscenity” standard, rather than the geographic community standard that the Miller obscenity test established.85 Therefore, the COPA, in practice, would create a uniform national standard of variable obscenity.86

Unlike the CDA, the COPA would have applied only to those materials found on the World Wide Web and not to material located in other places on the Internet, such as in e-mail messages, listservs, news groups or live chat rooms.87 The COPA also would have applied only to commercial Web sites,88 whereas the CDA would have applied to commercial,

80 See 47 U.S.C. § 231(e)(6).

81 See H.R. REP. NO. 105-775 (2d Sess. 1998), at 27-28.

82 Ginsberg v. New York, 390 U.S. 629 (1968).

83 Miller v. California, 413 U.S. 15 (1973).

84 See H.R. REP. NO. 105-775 (2d Sess. 1998), at 27.

85 See id. at 28. In its report on H.R. 3783, a House bill on the COPA, the Commerce Committee wrote, “The Committee recognizes that the availability of community standards in the context of the Web is controversial, but understands it as an ‘adult’ standard, rather than a ‘geographic’ standard, and one that is reasonably constant among adults in America with respect to what is suitable for minors.”

86 See id.

87 See id. at 12.

88 See id.

noncommercial and nonprofit online communications, as well as individuals’ online communications.89

The COPA stated:

Whoever knowingly and with knowledge of the character of the material, in interstate or foreign commerce, by means of the World Wide Web, makes any communication for commercial purposes that is available to any minor and that includes any material that is harmful to minors shall be fined not more than $50,000, imprisoned not more than 6 months, or both.90

Defendants would have an affirmative defense against prosecution, however, if they, in good faith, required identification such as a credit card, debit account, adult access code or adult personal identification number (PIN)91 or a digital certificate that verifies age.92

The COPA also established a temporary Commission on Online Child Protection to study methods to help reduce minors’ access to certain sexually explicit materials online.93 The commission released its report in 2000 and noted that the Internet, unlike the broadcast media, was “inherently multi-directional and interactive,”94 an attribute that was both promising to users and challenging to policy-makers.95 The commission made major recommendations in four key areas in an effort to protect children from Internet content deemed

89 47 U.S.C. § 223(d).

90 47 U.S.C. § 231(a)(1).

91 47 U.S.C. § 231(c)(1)(A).

92 47 U.S.C. § 231(c)(1)(B).

93 See Commission on Child Online Protection Act, Report to Congress, at 7 (October 2000) available at http://www.copacommission.org/report/COPAreport.pdf. The Commission was a temporary, 19-member organization composed of members of industry and government. See also Pub. L. No. 105-277, Div C, Title XIV, § 1405, 112 Stat. 2681-739 (1998).

94 Commission on Child Online Protection Act, Report to Congress, at 7 (October 2000) available at http://www.copacommission.org/report/COPAreport.pdf.

95 Id. The Commission also pointed out that the global nature of the Internet presents additional concerns to law enforcement agencies because a great deal of material deemed harmful to minors, as well as obscenity and child pornography, originates from overseas.

harmful to minors: 1) to make aggressive efforts toward public education, including the promotion of acceptable use policies; 2) to empower consumers, for example through independent evaluation of child protection technologies;96 3) to increase the resources to enforce existing obscenity laws; and 4) to encourage Internet Service Providers and adult-content Web sites to take voluntary steps to reduce minors’ access to online adult content.97

Round 1: U.S. District Court for the Eastern District of Pennsylvania—1998

The Child Online Protection Act worked its way through the federal court system seven times in nine years, from 1998 to 2007. On November 20, 1998, federal district court Judge Lowell Reed, Jr. granted a temporary restraining order98 prohibiting the government from implementing the COPA.99 The American Civil Liberties Union and other plaintiffs had argued that the COPA was unconstitutionally vague and would infringe upon the protected speech of both adults and minors.100 The government argued that it had a compelling interest in protecting minors from online materials that were “not obscene by adult standards.”101 Reed said he had to balance the public interest in protecting minors from online content with the First Amendment rights of adults.102 He said that those challenging the COPA had shown the likelihood that they would succeed on at least some of their claims,103 and the balance of interests weighed in favor

96 The COPA Commission stated that there was a lack of information about how technologies work, and a lack of transparency about what information they might block. Id. at 41.

97 Id. 39-46.

98 A temporary restraining order forbids an opposing party from taking action until an application for a preliminary or permanent injunction can be heard.

99 See ACLU v. Reno, 1998 U.S. Dist. LEXIS 18546 at *11 (E.D. Pa. 1998).

100 Id. at *2.

101 Id. at *6.

102 Id. at *10-11.

103 Id. at *11.

of granting the restraining order.104 Reed said the law would chill online speech and lead to self-censorship.105

Round 2: U.S. District Court for the Eastern District of Pennsylvania—1999

Less than three months later, Reed issued a preliminary injunction106 against enforcement of the COPA,107 one that was affirmed by the U.S. Court of Appeals for the Third Circuit.108 Reed again said the law would chill online speech and that such self-censorship of constitutionally protected speech could result in “an irreparable harm to the plaintiffs.”109 He acknowledged that there is a compelling government interest in protecting children from harm,110 but he also stated that “[w]hile the public has an interest in protecting its minors, the public interest is not served by the enforcement of an unconstitutional law.”111

The American Civil Liberties Union and other plaintiffs had argued that the COPA was invalid on its face, violated the First Amendment rights of adults and minors, and was unconstitutionally vague.112 Reed held that the strict scrutiny standard would apply to the COPA because it was a content-based regulation of speech.113 He also said the law abridged the free speech rights of

104 Id.

105 Id. at *10.

106 A preliminary injunction is a temporary court order commanding or prohibiting an action.

107 ACLU v. Reno, 31 F. Supp. 2d 473, 498-99 (E.D. Pa. 1999).

108 ACLU v. Reno, 217 F.3d 162, 166, 181 (3d Cir. 2000).

109 ACLU v. Reno, 31 F. Supp. 2d at 497.

110 Id. at 498.

111 Id.

112 Id. at 477.

113 Id. at 492.

adults.114 He stated that less-restrictive blocking or filtering technology—although imperfect—could protect children from online pornography just as well as the COPA could.115 Moreover, unlike the COPA, filtering software could block sexually explicit Web sites emanating from other countries, Reed said.116

Round 3: U.S. Court of Appeals for the Third Circuit—2000

In 2000, the U.S. Court of Appeals for the Third Circuit affirmed the district court’s holding, but on different grounds.117 In upholding Judge Reed’s preliminary injunction,118 the Third Circuit looked only at the community standards aspect of the Child Online Protection Act.

In authoring the opinion for the court, Senior Judge Leonard Garth ruled that the statute was constitutionally overbroad because of its “reliance on ‘contemporary community standards’ in the context of the electronic medium of the Web to identify material that is harmful to minors.”119 Garth said that the community standards provision would subject Internet Service Providers (ISPs) in the most tolerant communities to the decency standards of the most restrictive or “puritanical” communities.120 What is harmful to minors in one community may not be harmful to minors in another community, Garth said.121

114 Id. at 495.

115 Id. at 497.

116 Id.

117 ACLU v. Reno, 217 F.3d 162, 166, 172, 174 (3d Cir. 2000).

118 Id. at 166, 172.

119 Id. at 166, 173. However, the court pointed out that, in contexts other than the Internet and Web, it remained satisfied with the three-part obscenity test the Supreme Court established in Miller v. California, 413 U.S. 15 (1973). See ACLU v. Reno, 217 F.3d at 180.

120 ACLU v. Reno, 217 F.3d at 166, 175.

121 Id. at 177.

The Third Circuit recognized that by focusing solely on community standards, it had used a different analysis from that of the district court and had not addressed the issues raised by the district court.122 Judge Garth wrote:

[A] contemporary community standards clause . . . so concerns us that we are persuaded that this aspect of COPA, without reference to its other provisions, must lead inexorably to a holding of a likelihood of unconstitutionality of the entire COPA statute. Hence we base our opinion entirely on the basis of the likely unconstitutionality of this clause, even though the District Court relied on numerous other grounds.123

The appellate court distinguished the COPA from the Supreme Court’s holdings in cases involving the mailing of pornography124 and telephone dial-a-porn.125 Judge Garth stated that, in those instances, “the defendants could limit their exposure to liability by avoiding those communities with particularly restrictive standards, while continuing to provide the controversial material in more liberal-minded communities,”126 unlike Web publishers prosecuted under the COPA.127 Garth said that the Miller obscenity test could not be applied to the Web because Web publishers are unable to control the “geographic scope of the recipients of their communications.”128

122 Id. at 173-74.

123 Id.

124 Id. at 166, 175-76 (citing Hamling v. United States, 418 U.S. 87 (1974)).

125 Id. (citing Sable Commc’ns. of Calif., Inc. v. FCC, 492 U.S. 115 (1989)).

126 Id.

127 Id. at 176.

128 Id. at 180.

Round 4: U.S. Supreme Court—2002

In 2002, the U.S. Supreme Court issued the first of its two opinions on the Child Online Protection Act,129 with its second opinion issued two years later.130 Because the Third Circuit had focused solely on the community standards aspect of the COPA, the U.S. Supreme Court in 2002 looked only at that one narrow issue: whether the Child Online Protection Act’s use of “community standards” in identifying materials harmful to minors violated the First Amendment.131 The Supreme Court held that the reliance on the “community standards” provision of the COPA to identify material that was “harmful to minors” did not, by itself, violate the overbreadth doctrine of the First Amendment.132 The Court did not issue an opinion as to whether the COPA was overbroad for other reasons or was unconstitutionally vague.133

In an 8-1 decision, with Justice Clarence Thomas writing for a plurality, the Court vacated the judgment of the U.S. Court of Appeals for the Third Circuit and remanded the case to the Third Circuit to examine three issues: 1) whether the COPA was overbroad; 2) whether the COPA was vague; and 3) whether the COPA was likely to survive strict scrutiny.134 The preliminary injunction prohibiting the government from enforcing the COPA remained in effect for two reasons. First, the government had not asked the Supreme Court to vacate the injunction. Second,

129 Ashcroft v. ACLU, 535 U.S. 564, 566 (2002).

130 For a discussion of the Court’s second opinion on the COPA, Ashcroft v. ACLU, 542 U.S. 656 (2004), see infra notes 163-195 and accompanying text.

131 Ashcroft v. ACLU, 535 U.S. at 566.

132 Id.

133 Id. at 585-86.

134 Id. Justice Clarence Thomas authored the plurality opinion and was joined by Chief Justice William Rehnquist and Justice Antonin Scalia in all seven parts and by Justice Sandra Day O’Connor in four of seven parts. Justices Sandra Day O’Connor and Stephen Breyer concurred in part and concurred in the judgment. Justice Anthony Kennedy, joined by Justices David Souter and Ruth Bader Ginsburg, concurred in the judgment. Justice John Paul Stevens dissented.

the Court said that “prudence dictates allowing the Court of Appeals to first examine these difficult issues” of overbreadth, vagueness, and the likelihood that the COPA would not survive strict scrutiny.135

Justice Clarence Thomas authored the Court’s plurality opinion and was joined in portions of the opinion by Chief Justice William Rehnquist and Justices Sandra Day O’Connor, Antonin Scalia and Stephen Breyer. Thomas stated that “COPA’s reliance on community standards does not by itself render the statute substantially overbroad for the purposes of the First Amendment.”136

Although eight of the nine justices agreed that the Supreme Court should remand the case to the Third Circuit, their reasoning varied. Justices Sandra Day O’Connor and Stephen Breyer wrote separate concurring opinions. Justice O’Connor wrote that while precedent did not prevent the establishment of a national standard, such a standard would make it too burdensome for online communicators to control who received their messages.137 She said the expectation that Internet speakers could control who accessed their material would be too much to ask and the COPA “would potentially suppress an inordinate amount of expression.”138

Justice Stephen Breyer wrote that he believed Congress, in writing the COPA, intended the term “community” to refer to a national adult community and not local geographic communities.139 He said a local standard would result in Web content providers adopting the

135 Id. To pass the strict scrutiny test, the Supreme Court has stated that a regulation must promote a compelling government interest and be the least restrictive means to further that articulated interest. Sable Commc’ns of Calif., Inc. v. FCC, 492 U.S. 115, 126 (1989).

136 Ashcroft v. ACLU, 535 U.S. at 585-86.

137 Id. at 587-89 (O’Connor, J., concurring).

138 Id.

139 Id. at 589-90 (Breyer, J., concurring) (citing a House of Representatives Report on the COPA that he said was “apparently a uniform view within Congress.” In part, the report reads, “The Committee recognizes that the applicability of community standards in the context of the Web is controversial, but understands it as an 'adult' standard, rather than a 'geographic' standard, and one that is reasonably constant among adults in America with respect to what is suitable for minors." H. R. REP. NO. 105-775, p. 28 (1998)).

standards of “the most puritan community.”140 He said that even with a national standard, some regional variations may remain, but those variations “are not, from the perspective of the First Amendment, problematic” because such variations are inherent in a system that uses jurors in local communities.141

Justice Anthony Kennedy, joined by Justices David Souter and Ruth Bader Ginsburg, concurred in the judgment to vacate and remand. Justice Kennedy wrote that the plurality was wrong in stating “that the Act is narrow enough to render the national variation in community standards unproblematic.”142 He wrote that the Child Online Protection Act was “overbroad” and unlikely to survive a strict scrutiny challenge because the application of a national standard would still vary in different communities.143 Although agreeing with the plurality decision to remand, Kennedy wrote that “the national variation in community standards” could “constitute a substantial burden on Internet communication.” He wrote that he had “grave doubts COPA is consistent with the First Amendment.”144

Justice John Paul Stevens, who wrote the only dissenting opinion, said he would have

affirmed the judgment of the Third Circuit.145 Stevens wrote that the COPA was overbroad

because “Web publishers cannot control who accesses their Web sites”146 and “even the


140 Ashcroft v. ACLU, 535 U.S. at 590 (Breyer, J., concurring).

141 Id. at 591.

142 Id. at 593 (Kennedy, J., concurring).

143 Id. at 591-94.

144 Id. at 602.

145 Id. at 612 (Stevens, J., dissenting).

146 Id. at 606.


narrowest version of the statute abridges a substantial amount of protected speech that many

communities would not find harmful to minors.”147

Round 5: U.S. Court of Appeals for the Third Circuit—2003

On remand from the U.S. Supreme Court, the U.S. Court of Appeals for the Third

Circuit in 2003 held that the district court did not abuse its discretion in granting a preliminary injunction against enforcement of the Child Online Protection Act148 and that the plaintiffs

would probably prove at trial that the COPA is substantially overbroad.149

The Third Circuit Court of Appeals, for the second time, affirmed the district court’s

ruling150 to issue a preliminary injunction against the COPA, but this time on grounds other than the statute’s reliance on community standards, the basis for its earlier decision.151 In its second review of the COPA, the Third Circuit held that the COPA could not pass the strict scrutiny test applied to content-based regulations.152

The court said that the COPA was not the least restrictive method of advancing the government interest,153 and several provisions of the COPA were not narrowly tailored,

including the definitions of “harmful to minors”, “as a whole”, “commercial purposes” and

“minors.”154 First, in evaluating the “harmful to minors” standard, the court stated the COPA

147 Id. at 610.

148 ACLU v. Ashcroft, 322 F.3d 240, 265 (3d Cir. 2003).

149 Id. at 271.

150 ACLU v. Reno, 31 F. Supp. 2d 473 (E.D. Pa. 1999).

151 ACLU v. Ashcroft, 322 F.3d 240, 243 (3d Cir. 2003).

152 Id. at 253, 261, 265-66. To pass the strict scrutiny test, the Supreme Court has stated that a regulation must promote a compelling government interest and be the least restrictive means to further that articulated interest. Sable Commc’ns of Calif., Inc. v. FCC, 492 U.S. 115, 126 (1989).

153 ACLU v. Ashcroft, 322 F.3d 240, 261, 265-66 (3d Cir. 2003).

154 Id. at 251-61.


“limits the range of permissible material under the statute to that which is deemed acceptable

only by the most puritanical communities.”155 Second, the court said that the concept of “taken

as a whole” does not satisfy the First Amendment because it is difficult and complicated to

determine context on the Internet.156 Third, in reviewing the “commercial purposes” provision,

the court held it would violate the First Amendment because it would impose content restrictions

on a substantial number of commercial speakers who were not engaged in obscene speech157 or who posted any “harmful to minors” material on their Web sites, even if they did not profit from it or post the material as a “principal part of their business.”158 Fourth, the court stated it was

troubled by the term “minor,” which referred to anyone from the ages of 3 to just under 17.159

The court held that the COPA does not use the least restrictive means because minors could access foreign Web sites or use their parents’ credit cards (or their own credit cards) in accessing content covered under the act.160 The court also stated that other alternatives existed

that were possibly less restrictive, such as installing filtering software on individual computers.

For instance, when installing filtering software on home computers, parents could determine

which type of content to block from entering the home computers.161 Finally, the court stated

that adult users could be deterred from accessing Web sites if they had to provide a credit card

number or other form of identification.162

155 Id. at 252.

156 Id. at 252-53.

157 Id. at 257.

158 Id. at 256.

159 Id. at 254.

160 Id. at 261.

161 Id. at 265.

162 Id. at 257-60.


Round 6: U.S. Supreme Court—2004

In its second ruling on the Child Online Protection Act, a divided Supreme Court in 2004

upheld the Third Circuit’s order that prohibited the government from enforcing the COPA,163 although on narrower grounds.164 In upholding the decision of the Third Circuit, the U.S.

Supreme Court, on a 5-4 vote,165 affirmed the district court’s decision to grant the preliminary

injunction for the same reason the district court itself had stated—that there are less restrictive alternatives than the COPA.166 The Supreme Court said it did not need to address the Third

Circuit’s other arguments on the terminology used in the statute,167 such as the definitions of

“harmful to minors”, “as a whole” and “commercial purposes.”168 The Supreme Court remanded

the case to the Federal District Court for the Eastern District of Pennsylvania for trial, asking the

lower court to determine if filters would work as well as, and be less restrictive than, the

COPA.169

The Supreme Court said that filtering software installed on home computers would limit

access to online pornography posted anywhere on the Internet, not just pornography posted on

commercial Web sites in the United States.170 The justices did not declare the COPA

163 Ashcroft v. ACLU, 542 U.S. 656, 673 (2004).

164 Id. at 665.

165 Justice Anthony Kennedy authored the majority opinion and was joined by justices John Paul Stevens, David Souter, Clarence Thomas and Ruth Bader Ginsburg. Chief Justice William Rehnquist and justices Antonin Scalia, Stephen Breyer, and Sandra Day O’Connor dissented, arguing that the COPA was constitutional.

166 The Third Circuit also stated that the COPA did not use the least restrictive means because minors could access online material considered harmful to minors by using their parents’ credit cards or viewing foreign Web sites. See ACLU v. Ashcroft, 322 F.3d 240, 261 (3d Cir. 2003).

167 Ashcroft v. ACLU, 542 U.S. at 665.

168 ACLU v. Ashcroft, 322 F.3d at 251-61.

169 Ashcroft v. ACLU, 542 U.S. at 656.

170 Id. at 667-69.


unconstitutional, but said that the Third Circuit was correct in concluding the district court had not abused its discretion in granting a preliminary injunction.171

Justice Anthony Kennedy, writing for the majority, said,

[T]he Government has not shown that the less restrictive alternatives proposed by respondents should be disregarded. Those alternatives, indeed, may be more effective than the provisions of COPA.172

The Court listed three alternatives that might be less restrictive than the COPA: two statutes Congress had passed after the district court’s 1999 decision on the COPA and the use of filtering software installed by individual users.173 The first statute, enacted in 2002, created a new child-friendly “dot kids” second-level domain that would contain only content deemed appropriate for minors.174 The second statute, enacted in 2003, prohibited the use of misleading domain names on the Internet.175 Justice Kennedy addressed those two statutes only briefly, stating that they “might qualify as less restrictive alternatives to COPA.”176

171 Id. at 673.

172 Id.

173 Id. at 666-73.

174 Id. at 672. In 2002, Congress enacted the “child-friendly, second-level Internet domain” in the United States. Congress mandated that the National Telecommunications and Information Administration (NTIA) operate and maintain the new domain, which would provide access “only to material that is suitable to minors and not harmful to minors.” See 47 USC § 941.

175 Ashcroft v. ACLU, 542 U.S. at 672. In 2003, Congress enacted 18 USC § 2252B, which criminalizes the knowing use of a domain name on the Internet that would deceive anyone into viewing obscene material and “deceive a minor into viewing material that is harmful to minors.” The phrase “harmful to minors” is defined as "any communication, consisting of nudity, sex, or excretion, that, taken as a whole and with reference to its context (1) predominantly appeals to a prurient interest of minors; (2) is patently offensive to prevailing standards in the adult community as a whole with respect to what is suitable material for minors; and (3) lacks serious literary, artistic, political, or scientific value for minors.” Those found guilty could be fined or imprisoned for up to ten years or both.

176 Ashcroft v. ACLU, 542 U.S. at 672.


Justice Kennedy spent more time addressing the third alternative suggested by the district

court—filtering software that individual computer users could install on their own computers.177

In discussing filtering, Justice Kennedy wrote that “the Government failed to introduce specific evidence proving that existing technologies are less effective than the restrictions in COPA.”178

The majority said that filters were less restrictive than the COPA179 and may be more effective

for several reasons. First, filters can block pornography posted on any Web site from anywhere

in the world, whereas the COPA applies only to sexually explicit content posted to commercial

Web sites from within the United States.180

Second, minors can circumvent the COPA’s verification systems (for example, by using

their own credit cards), whereas filtering software could block such access. Third, filters can block content from all areas of the Internet, such as e-mail, and not just the World Wide Web.181

The Supreme Court acknowledged that filtering software contained flaws, stating that

filters failed to block some Web sites containing material that is “harmful to minors,” and, at the

same time, prohibited access to other Web sites that are not harmful to minors.182 No evidence

was submitted to the district court in 1999 on the percentage of time that the filtering software underblocked inappropriate content or overblocked appropriate content, Justice Kennedy said.183

177 Id. at 666-67.

178 Id. at 668.

179 Id. at 667.

180 Id. at 667-68.

181 Id.

182 Id. at 668-69.

183 Id. (citing ACLU v. Reno, 31 F. Supp. 2d at 492). The district court wrote,

It appears undisputed that blocking and filtering technology is not perfect in that it is possible that some Web sites that may be deemed inappropriate for minors may not be blocked while some Web sites that are not inappropriate for minors may be blocked. In addition, a minor's access to the Web is not restricted if she accesses the Web from an unblocked computer or through another ISP. It is possible that a computer-savvy minor with some patience would be able to defeat the blocking device. No evidence was presented to the Court as to the percentage of time that blocking and filtering technology is over- or underinclusive. 31 F. Supp. 2d at 492 (1999).


In remanding the case to the Federal District Court for the Eastern District of

Pennsylvania, the Supreme Court said “substantial factual disputes” remain,184 and the factual

record does not reflect current Internet technology because of the time lapse since the district

court evaluated filtering technology in 1999.185 In affirming the preliminary injunction and

remanding the case to the district court for trial, the justices said both parties would be able to update the factual record, including advances in filtering technology.186 In addition, the district

court would be able to decide the case in light of the two other statutes passed by Congress.187

However, the Court concluded,

On a final point, it is important to note that this opinion does not hold that Congress is incapable of enacting any regulation of the Internet designed to prevent minors from gaining access to harmful materials . . . . This opinion does not foreclose the District Court from concluding, upon a proper showing by the Government that meets the Government's constitutional burden as defined in this opinion, that COPA is the least restrictive alternative available to accomplish Congress' goal.188

In a concurring opinion, Justice John Paul Stevens, joined by Justice Ruth Bader

Ginsburg, said that the “community standards” provision of the COPA was not the statute’s only

“constitutional defect.” He said that user-based controls, such as filtering software, would better


184 Ashcroft v. ACLU, 542 U.S. 656, 671 (2004).

185 Id.

186 Id. at 671-72.

187 Id. at 672. One statute created a minors-safe “dot-Kids” domain, and the other prohibited the misleading use of domain names. Id.

188 Id. at 672-73.


serve the congressional interest of protecting minors from “harmful” online content.189 Stevens said that criminal prosecutions are an “inappropriate means” to regulate obscenity because the line between offensive content and inoffensive content is “too blurred.”190

The four dissenting justices, in two different opinions, said the COPA was constitutional.

Justice Stephen Breyer, joined by Chief Justice William Rehnquist and Justice Sandra Day

O’Connor, wrote that the Court was wrong in concluding that Congress could have found a less

restrictive way to accomplish the government objective of protecting children from online

commercial pornography.191 He also said that the COPA places some burdens on protected speech, but that burden “is no more than modest.”192

Breyer argued that the use of filtering software is “part of the status quo” and the

“backdrop against which Congress enacted [the COPA].” Despite the availability of filtering

software at the time Congress passed the COPA in 1998, children still encountered “harmful material” on the Internet, Breyer said.193 Filtering software was inadequate for four reasons, according to Breyer. First, Internet filters allowed some pornographic material through. Second, filtering software costs money. Third, parents had to configure the filter to determine what access their children would—and would not—have to the Web. Fourth, filtering software “lacks

precision [and therefore] blocks a great deal of material that is valuable,” Breyer said.194

189 Id. at 674 (Stevens, J., concurring).

190 In discussing criminal prosecutions for obscenity, Stevens did not state that he was referring to online content only. Id.

191 Id. at 677 (Breyer, J., dissenting).

192 Id. at 677-78.

193 Id. at 684.

194 Id. at 684-85.


In a separate dissent, Justice Antonin Scalia said that he agreed with Justice Breyer’s

conclusion that the COPA was constitutional, but disagreed that the COPA should be subject to

strict scrutiny for two reasons. First, Justice Scalia wrote that the COPA applied to “sexually

provocative” speech, a type of speech that did not receive full First Amendment protection.

Second, Scalia wrote that commercial entities engaged in “the sordid business of pandering” could, consistent with the First Amendment, be banned entirely, and therefore

COPA’s lesser restrictions were constitutional.195

Round 7: U.S. District Court for the Eastern District of Pennsylvania—2007

In 2007, the federal district court in Philadelphia granted a permanent injunction

prohibiting the government from enforcing the Child Online Protection Act.196 The lower court

held that the COPA facially violated the First Amendment,197 which the court also had held

when granting the preliminary injunction in 1999.198 Judge Lowell Reed applied the strict scrutiny test, stating that the COPA was content-based.199 He wrote that although the protection

of minors from online pornography was a compelling government interest, the government failed

to show that the COPA was narrowly tailored to meet that interest.200 He stated the COPA was

overinclusive in that it prohibited more speech than necessary and underinclusive in that it would

not apply to sexually explicit content originating outside the United States that would be

195 Id. at 676 (Scalia, J., dissenting).

196 Am. Civil Liberties Union v. Gonzales, 478 F. Supp. 2d 775, 821 (E.D. Pa. 2007).

197 Id. at 809-10, 821.

198 Id. at 820-21 (citing ACLU v. Reno, 31 F. Supp. 2d 473, 498 (1999)).

199 Id. at 809.

200 Id. at 810-15.


available to minors in the U.S.201 Reed wrote that the COPA was overly vague because some of the terms were not clearly defined202 and overbroad because protected speech was prohibited.203

Reed also stated that the affirmative defenses to liability under the COPA—such as the use of a credit card, debit card, adult access code or adult identification number to verify the user was an adult—are not effective measures to verify age.204 He said minors could use parents’ credit or debit cards without the parents’ knowledge,205 and payment card associations that issue credit and debit cards usually prohibit Web site owners from using these cards to verify age.206

Reed said that Internet filters were a better alternative to the COPA because they were widely available, easy to obtain, and provided free through Internet Service Providers, such as

AOL, and through Vista, Microsoft’s operating system.207 Moreover, he wrote that filters

201 Id. at 810-11.

202 Id. at 816-19. Judge Reed said the terms “knowingly and with knowledge of the character of the material” and “intentionally” were not clearly defined. In part, the COPA reads:

(a) Requirement to restrict access. (1) Prohibited conduct. Whoever knowingly and with knowledge of the character of the material, in interstate or foreign commerce by means of the World Wide Web, makes any communication for commercial purposes that is available to any minor and that includes any material that is harmful to minors shall be fined not more than $ 50,000, imprisoned not more than 6 months, or both. (2) Intentional violations. In addition to the penalties under paragraph (1), whoever intentionally violates such paragraph shall be subject to a fine of not more than $ 50,000 for each violation. For purposes of this paragraph, each day of violation shall constitute a separate violation.

47 U.S.C. §§ 231(a)(1) & (2).

203 Am. Civil Liberties Union v. Gonzales, 478 F. Supp. 2d 775, 819-20 (E.D. Pa. 2007). Judge Reed wrote that although the COPA defined a minor as “any person under 17 years of age,” material that would be “patently offensive” for an eight-year-old would not be patently offensive for a sixteen-year-old. Similarly, he stated that material that would not have “serious literary, artistic, political, or scientific value” for a toddler could have such value for a teenager.

204 Id. at 800-05.

205 Id. at 801-02.

206 Id. at 801.

207 Id. at 793.


installed on the user’s computer could block more content than that which was covered under the

COPA. Under the COPA, only commercial Web site operators would have been required to block material originating in the United States.208 In contrast, filters could block both commercial and noncommercial material from the United States and abroad, with parents determining the type of material to be blocked.209

Reed acknowledged the two main concerns about the effectiveness of filters: 1) underblocking, which occurs when filters fail to block content they have been set to block; and 2) overblocking, which occurs when filters mistakenly prevent access to content

that the filter was not set to block.210 However, Reed stated that underblocking was the more

important concern because the goal of the COPA was to prevent children from accessing

sexually explicit online material deemed harmful to them.211 He stated that filtering technology

was continuing to improve and many filtering products blocked 90% of sexually explicit

content.212 Reed concluded that the government had “failed to show that filters are not at least as

effective as COPA at protecting minors from material on the Web.”213
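To make the two error types concrete, the following minimal sketch (written in Python purely for illustration; the function name and sample data are hypothetical and are not drawn from the trial record) shows how underblocking and overblocking rates could be estimated from a labeled sample of Web pages: underblocking is the share of pages a filter was set to block that nonetheless got through, and overblocking is the share of permissible pages that were wrongly blocked.

    # A minimal sketch, assuming a labeled test sample; all data here are hypothetical.
    def error_rates(samples):
        """samples: list of (should_block, was_blocked) pairs, one per page tested."""
        targeted = [s for s in samples if s[0]]         # pages the filter was set to block
        permissible = [s for s in samples if not s[0]]  # pages the filter should allow
        underblocked = sum(1 for _, blocked in targeted if not blocked)
        overblocked = sum(1 for _, blocked in permissible if blocked)
        under_rate = underblocked / len(targeted) if targeted else 0.0
        over_rate = overblocked / len(permissible) if permissible else 0.0
        return under_rate, over_rate

    # Hypothetical run over ten pages: (should_block, was_blocked).
    sample = [(True, True), (True, False), (True, True), (True, True),
              (False, False), (False, True), (False, False),
              (False, False), (False, False), (False, False)]
    under, over = error_rates(sample)
    print(f"underblocking: {under:.0%}, overblocking: {over:.0%}")  # underblocking: 25%, overblocking: 17%

Under this framing, Judge Reed’s emphasis on underblocking corresponds to the first rate, because it measures how often material the filter was meant to stop still reaches a minor.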

Conclusion

In 1997, the Supreme Court declared the Communications Decency Act, Congress’ first

attempt at regulating Internet content, to be unconstitutional.214 Then in 2004, after hearing

208 See 47 U.S.C. § 231(a)(1).

209 Am. Civil Liberties Union v. Gonzales, 478 F. Supp. 2d at 794-95.

210 Id.

211 Id. at 794.

212 Id. at 794-97.

213 Id. at 814.

214 Reno v. ACLU, 521 U.S. 844 (1997).


arguments on the Child Online Protection Act a second time,215 the Supreme Court remanded the case to a lower federal court for further analysis while upholding the preliminary injunction against its enforcement.216 In 2007, the U.S. District Court for the Eastern District of

Pennsylvania issued a permanent injunction against the enforcement of the Child Online

Protection Act, stating the COPA violated the First Amendment because it was not narrowly tailored, was not the least restrictive way of protecting minors from material deemed harmful, and was both overbroad and unconstitutionally vague.217 However, in 2003, a year before remanding the COPA to the lower court, the Supreme Court upheld the Children’s Internet

Protection Act, passed in 2000, a law that mandates the installation of filtering technology in public libraries and most schools that accept federal funding for technology.218

215 The COPA was Congress’ second attempt at regulating online content.

216 Ashcroft v. ACLU, 542 U.S. 656, 668, 673 (2004).

217 Am. Civil Liberties Union v. Gonzales, 478 F. Supp. 2d 775, 821 (E.D. Pa. 2007).

218 United States v. Am. Library Ass'n, 539 U.S. 194 (2003).


CHAPTER 6
THE LEGISLATIVE HISTORY OF THE CHILDREN’S INTERNET PROTECTION ACT

Introduction

Both the Senate and House were concerned with the availability of online pornography when they first began drafting Internet filtering bills in 1998. The Internet filtering bills and amendments introduced from 1998 through 2000 focused on protecting minors from accessing sexually explicit online content in public libraries and schools.

According to a 1998 Senate Commerce Committee report, children not looking for online pornography could inadvertently find pornography Web sites by typing “innocuous” words into a search engine, such as “teen,” “nurse” or “cheerleader.”1 In a 1998 House

hearing, Rep. Ernest Istook stated that Congress’ goal was to protect children from online

pornography.2 A 1999 Senate Commerce Committee report stated that “pornography” could easily be accessed on the Internet, both intentionally and unintentionally,3 and

legislation was needed “to protect America’s children from exposure to obscene material,

child pornography, or other material deemed inappropriate for minors.”4

The bills and amendments in the 1998-2000 period did not specify software

solutions, but rather used terms such as “systems,” which would incorporate both

1 Internet Filtering Systems, Report of the Senate Committee on Commerce, Science and Transportation on S.1619, S. REP. NO. 105-226 (2d Sess. 1998), at 2. See also Legislative Proposals to Protect Children from Inappropriate Materials on the Internet, Hearing before the Subcomm. on Telecomm., Trade, and Consumer Protect of the Comm. on Commerce, 105th Cong. 119 (2d Sess., Sept. 11, 1998) at 20 (statement of Rep. Ernest Istook).

2 Legislative Proposals to Protect Children from Inappropriate Materials on the Internet, Hearing before the Subcomm. on Telecomm., Trade, and Consumer Protect of the Comm. on Commerce, 105th Cong. 119 (2d Sess., Sept. 11, 1998) at 20 (statement of Rep. Ernest Istook).

3 Children’s Internet Protection Act, S. REP. NO. 106-141 (1st Sess., Aug. 5, 1999), at 2.

4 Id. at 1.


software-based systems and non-software-based approaches yet to be developed to block or filter sexually explicit or pornographic content deemed harmful to minors.5 “The selection (of a system) is not intended to be limited to software-based systems, but it is intended to encompass all the technologies available now and as technology develops,” a

1998 Senate Commerce Committee report stated.6 All of the bills stipulated that local officials, such as librarians, school boards or school officials, would have selected the filtering technology and would have determined what content was “inappropriate for minors” or “harmful to minors.”7 Four of the bills or amendments contained a disabling provision, with three applying to minors8 and one applying to adults.9 The Children’s

5 The Senate Commerce Committee stated that blocking software is distinguishable from filtering software. Blocking software prevents access to predetermined Web sites whose URLs, or online addresses, are programmed into the software, whereas filtering software screens sites based on keywords and rating systems. Children’s Internet Protection Act, S. REP. NO. 106-141 (Aug. 5, 1999), at 5. However, members of Congressional committees and witnesses testifying before the committees often used the terms “filtering” and “blocking” interchangeably. For a discussion of Internet filtering and blocking technology, see Chapter 3.

6 Internet Filtering Systems, Report of the Senate Committee on Commerce, Science and Transportation on S.1619, S. REP. NO. 105-226 (2d Sess. 1998), at 10.

7 For a discussion of the bills, see infra pp. 241-282.

8 Rep. Ernest Istook’s three filtering proposals—two bills and one amendment—only applied to minors, and therefore the disabling provision pertained to minors only. His proposals would have allowed the filtering technology to be temporarily disabled to allow a minor to access content that was “not unprotected by the Constitution,” as long as the minor was under the direct supervision of an adult. See Title VI, Child Protection Act of 1998, Depts. of Labor, Health & Human Services, and Educ., and Related Agencies Appropriations Act, [FY]1999, H.R.4274, 105th Cong. (2d Sess. 1998); H.R. 2560(1)(a)(2), 106th Cong. (1st Sess. 1999); and H.R. 4545, 106th Cong. (2d Sess. 2000).

9 Rep. Charles “Chip” Pickering’s filtering bill included a disabling provision for adults only. When adults in libraries and schools were using computers connected to the Internet, “an administrator, supervisor, or other authority” could disable the filtering technology “to enable unfiltered access for bona fide research or other lawful purpose.” See Children’s Internet Protection Act, H.R. 4600, 106th Cong. (2d Sess. 2000).


Internet Protection Act, the enacted legislation, included a disabling provision for adults.10

Congress had previously enacted the Communications Decency Act11 and the Child

Online Protection Act12 in an attempt to protect minors from online pornography. Both acts were struck down by the courts.13 In 2000, Congress passed the Children’s Internet Protection Act and

Neighborhood Children’s Internet Protection Act as amendments to a major appropriations bill.14

The enacted version of the CIPA and NCIPA, which was based on two bills the Senate did not pass in 1999,15 prohibited public libraries and schools from receiving funding through Universal

Service and the Library Services and Technology Act unless they implemented an Internet safety policy for minors16 and installed Internet filtering technology.17 Universal Service, commonly

10 See Pub. L. No. 106-554, 114 Stat. 2763, 2763A-335 (2000) (codified as amended at 47 U.S.C. § 254(h)(5)(D) and 47 U.S.C. § 254(h)(6)(D)). For a discussion of the enacted CIPA, see infra notes 268-290 and accompanying text.

11 See Commc’ns Decency Act of 1996, Pub. L. No. 104-104, 551; 110 Stat. 56, 133-39 (1996) (codified at 47 U.S.C. § 223).

12 Child Online Protection Act of 1998, Pub. L. No. 105-277, 112 Stat. 2681 (1998) (codified at 47 U.S.C. § 231).

13 See Reno v. ACLU, 521 U.S. 844, 885 (1997) (holding that the CDA was unconstitutional) and American Civil Liberties Union v. Gonzales, 478 F. Supp. 2d 775, 809-10, 821 (E.D. Pa. 2007) (holding that the COPA violated the First Amendment). See Chapter 5 for a discussion of the CDA and COPA and the court cases deciding the statutes.

14 See Pub. L. No. 106-554, 114 Stat. 2763, 2763A-335 (2000) (codified as amended at 20 U.S.C. § 9134(f) and 47 U.S.C. § 254(h)). See also Dept. of Labor, Health & Human Servs, and Educ. Appropriations Act, 2001, H.R. 4577, 106th Cong. (2d Sess. 2000).

15 See Children’s Internet Protection Act, S.97, 106th Cong. (1st Sess. 1999) and the Neighborhood Children’s Internet Protection Act, S.1545, 106th Cong. (1st Sess. 1999). For a discussion of S.97, see infra notes 131-176 and accompanying text. For a discussion S.1545, see infra notes 220-227 and accompanying text.

16 20 U.S.C. § 9134(f)(1)(A) and (B), 47 U.S.C. § 254(h)(5)(A) and 47 U.S.C. § 254(h)(6)(A). The Internet safety policy applies only to minors and not to adults. The safety policy “includes monitoring the online activities of minors and the operation of a technology protection measure with respect to any [library or school] computers with Internet access that protects against access through such computers to visual depictions that are obscene, child pornography, or harmful to minors.” The Internet safety policy must “address” minors’ access to “inappropriate matter” on the Internet and “measures designed to restrict minors' access to materials harmful to minors.” The Internet safety policy also must “address” minors’ safety and security when using e-mail, chat rooms, and other forms of direct communications. The Internet safety policy also must address minors’ unauthorized access to the Internet, including “hacking,” and minors’ unauthorized use of personal information regarding minors. Libraries and schools must “provide reasonable public notice and hold at least one public hearing or meeting to address the Internet safety policy.” 47 U.S.C. § 254(l)(1).


referred to as the E-Rate, applies to libraries and schools.18 In the Telecommunications Act of

1996, Congress mandated that the Federal Communications Commission use universal service funding to provide libraries and schools with discounted telecommunications equipment and

Internet access, as well as internal connections (such as network wiring within the library or school).19 Libraries and schools in rural or economically disadvantaged areas are eligible for more funding than libraries and schools in metropolitan or more affluent areas. The Library

Services and Technology Act (LSTA), which is available only to libraries, provides funding for computer equipment and Internet access.20

Once the two filtering amendments were enacted, Congress dropped the title

“Neighborhood Children’s Internet Protection Act” and referred to the filtering legislation as the

Children’s Internet Protection Act (CIPA).21 The Children’s Internet Protection Act requires public libraries and schools receiving E-rate and LSTA funding to do two things: 1) Adopt an


17 See Pub. L. No. 106-554, 114 Stat. 2763, 2763A-335 (2000) (codified as amended at 20 U.S.C. § 9134(f) and 47 U.S.C. § 254(h)).

18 Pub. L. No. 106-554, 114 Stat. 2763, 2763A-335 (2000) (codified as amended at 47 U.S.C. § 254(h)). See also Dept. of Labor, Health & Human Servs., and Educ. Appropriations Act, 2001, H.R. 4577, 106th Cong. (2d Sess. 2000).

19 See 47 U.S.C. § 254. See also http://www.fcc.gov/cgb/consumerfacts/usp_Schools.html (last visited July 20, 2009).

20 See 20 U.S.C. § 9134.

21 No reason was given for dropping Neighborhood Children’s Internet Protection Act from the title. However, when Senator Rick Santorum originally introduced the Neighborhood Children’s Internet Protection Act, S.1545, 106th Cong. (1st Sess. 1999), the bill did not require a blocking or filtering system. The bill would have allowed libraries and schools to implement either a user policy or a blocking/filtering system. The filtering amendments enacted in 2000 required both the use of a “technology protection measure” and the implementation of a user policy. Dept. of Labor, Health & Human Servs., & Educ. Appropriations Act, 2001, H.R. 4577, 106th Cong. (2d Sess. 2000). See Pub. L. No. 106-554, 114 Stat. 2763, 2763A-335 (2000) (codified as amended at 20 U.S.C. § 9134(f) and 47 U.S.C. § 254 (h)).


Internet safety policy for minors,22 and 2) “enforc[e]” a “technology protection measure,” such

as blocking or filtering software, on all computers with Internet access.23

The Internet filtering technology must be installed and used to block or filter access by all patrons to “visual depictions” that are obscene24 or involve child pornography.25 The law also requires that “a technology protection measure” be used to block or filter access by persons under the age of seventeen to “visual depictions” that are considered “harmful to minors,”26 as determined by local libraries and schools.27 The filtering technology must be used on all

computers connected to the Internet, even those computers not purchased with E-rate or LSTA

funds.28 Adults can request that the filtering software be disabled for “bona fide research or other

lawful purpose[s].”29

Blocking software prevents access to predetermined Web sites whose URLs, or online

addresses, are programmed into the software, whereas filtering software screens sites based on

keywords and rating systems.30 Congress’ use of the term “technology protection measure,” as

22 20 U.S.C. § 9134(f)(1)(A), 47 U.S.C. § 254(h)(5)(A) and 47 U.S.C. § 254(h)(6)(A).

23 Pub. L. No. 106-554, 114 Stat. 2763, 2763A-335 (2000) (codified as amended at 20 U.S.C. § 9134(f) and 47 U.S.C. § 254 (h)). From 2000, when Congress passed the legislation, through the present, the only technology measure available has been blocking or filtering software.

24 20 U.S.C. § 9134(f)(1) and 47 U.S.C. §§ 254 (h)(5)(B) & (h)(5)(C) and 47 U.S.C. §§ 254 (h)(6)(B) & (h)(6)(C).

25 Id.

26 20 U.S.C. § 9134(f)(1)(A), 47 U.S.C. § 254(h)(5)(B) and 47 U.S.C. § 254(h)(6)(B). See infra notes 268-290 and accompanying text for a discussion of the acts.

27 47 U.S.C. § 254 (l)(2).

28 20 U.S.C. § 9134 (f)(1)(a) and 47 U.S.C. §§ 254 (h)(5) & (6).

29 Under 47 U.S.C. § 254(h)(5)(D) and 47 U.S.C. § 254(h)(6)(D), the term “lawful purpose” is singular, whereas under 20 U.S.C. § 9134(f)(3), the term “lawful purposes” is plural.

30 See Children’s Internet Protection Act, Report of the Senate Committee on Commerce, Science and Transportation on S.97, 106th Cong. (1st Sess. 1999), at 6. See also Chapter 3 of this dissertation for a discussion of filtering and blocking technology.


opposed to blocking or filtering software, seemed to be intentional so as not to prevent the

implementation of new technological devices that might be developed later.31
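As a rough illustration of the distinction drawn above between blocking software (a predetermined list of Web addresses) and filtering software (keyword- or rating-based screening), the following minimal Python sketch shows the two mechanisms side by side; the list entries, keywords, and function names are hypothetical and are not taken from any product or bill discussed in this chapter.

    # A minimal sketch of the two approaches; every value below is hypothetical.
    BLOCKLIST = {"www.example-blocked-site.com"}    # blocking: predetermined URLs
    KEYWORDS = {"keyword1", "keyword2"}             # filtering: terms to screen for

    def blocked_by_url(url):
        """Blocking software: deny access when the host is on the predetermined list."""
        host = url.split("/")[2] if "//" in url else url.split("/")[0]
        return host.lower() in BLOCKLIST

    def blocked_by_keywords(page_text):
        """Filtering software: deny access when the page text contains a screened keyword."""
        text = page_text.lower()
        return any(word in text for word in KEYWORDS)

    # A page is withheld if either mechanism flags it.
    url = "http://www.example-blocked-site.com/page.html"
    page = "sample page text containing keyword1 somewhere"
    print(blocked_by_url(url) or blocked_by_keywords(page))  # True

A “technology protection measure” could, in principle, combine either or both mechanisms, which is consistent with Congress’ decision not to tie the statute to any particular software approach.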

President Bill Clinton signed the CIPA and NCIPA into law on December 21, 2000,32 thirty-four months after the first filtering and blocking bills had been introduced in the House and Senate.33 The federal mandatory filtering law was to have become effective on April 20,

2001, but the American Civil Liberties Union (ACLU) and the American Library Association

(ALA) independently filed suit to block the implementation of the Children’s Internet Protection

Act.34 A federal district court consolidated the cases and in 2002 held that the challenged

sections of the Children’s Internet Protection Act were unconstitutional,35 a ruling that the U.S.

Supreme Court reversed in 2003.36 While the cases were pending before the courts, the Federal

Communications Commission issued a fact sheet to clarify the regulations. The fact sheet stated

that public libraries and schools were required to certify that they had their technology policies

and online safety policies in place (or that they were in the process of taking the necessary

actions of implementing them) before receiving E-rate funding for the following school year.37

31 In 1998, in referring to S.1619, the Internet School Filtering Act, the Senate Commerce Committee said that proposed legislation was “intended to encompass all the technologies available now and as technology develops.” Internet Filtering Systems, Report of the S. Comm. on Commerce, Science & Transportation on S.1619, S. REP. NO. 105-226 (2d Sess. 1998) at 10.

32 Pub. L. No. 106-554.

33 Safe Schools Internet Act of 1998, H.R. 3177, 105th Cong. (2d Sess. 1998) and Internet School Filtering Act, S. 1619, 105th Cong. (2d Sess. 1998).

34 Am. Civ. Liberties Union, Library Internet Access is Still Free from Censorship as Law Goes into Effect, ACLU Tells Libraries, Patrons, available at http://www.aclu.org/Privacy/Privacy.cfm?ID=7224&c=252. See also Am. Library Ass’n, CIPA, available at http://www.ala.org/ala/aboutala/governance/annualreport/annualreport/annualreportarch/report2002/freedom.cfm

35 See Am. Library Ass’n v. United States, 201 F. Supp. 2d 401 (E.D. Pa. 2002).

36 See United States v. Am. Library Ass’n, 539 U.S. 194 (2003), rev’g Am. Library Assn. v. United States, 201 F. Supp. 2d 401 (E.D. Pa. 2002).

37 FCC Consumer Facts on Children’s Internet Protection Act, Sept. 17, 2003, http://www.fcc.gov/cgb/consumerfacts/cipa.html (last visited July 20, 2009).


The Internet safety policy had to include “technology protection measures” to prevent access to online images that were obscene, child pornography, or harmful to minors (for those computers used by minors).38 The Internet safety policy also required libraries and schools to educate minors about appropriate online behavior, including interacting with other people on social networking sites.39 However, libraries and schools did not have to comply with the CIPA while the act was being challenged.

The Emergence of the Children’s Internet Protection Act

The Children’s Internet Protection Act was not Congress’ first attempt at mandating Internet filtering or blocking systems in public libraries and schools as a condition of receiving federal funding for Internet access. Since 1998, legislators had introduced twelve bills and appropriations amendments that would have required the nation’s schools and public library systems40 to install a blocking or filtering system if they received federal technology funding.41 However, none of those filtering bills or amendments passed; most died in committee.

38 Id.

39 Id.

40 Public library administrative units or systems number approximately 9,100 and encompass a total of 16,241 buildings, including central and branch locations. See Adrienne Chute, Elaine Kroe, Patricia O’Shea, Maria Polcari & Cynthia Jo Ramsey, Public Libraries in the United States: Fiscal Year 2001 (U.S. Dept. of Education National Center for Education Statistics (2003), available at http://nces.ed.gov/pubs2003/2003399.pdf. See also Internet Filtering Systems, Report of the Senate Committee on Commerce, Science and Transportation on S.1619, S. REP. NO. 105-226 (2d Sess. 1998), at 8, stating that there are approximately 9,000 public libraries in the United States.

41 See Internet School Filtering Act, S. 1619, 105th Cong. (2d Sess. 1998); Safe Schools Internet Act of 1998, H.R. 3177, 105th Cong. (2d Sess. 1998); Child Protection Act of 1998 (the Istook Amendment) to FY99 Labor, Health & Human Servs., & Educ. Appropriations Bill, Title VI of H.R.4274, 105th Cong. (2d Sess. 1998); Internet Filtering, Amendment No. 3228 to Departments of Commerce, Justice, and State, the Judiciary, and Related Agencies Appropriations Act, 1999, S. 2260, 105th Cong. (2d Sess. 1998); Safe Schools Internet Act of 1999, H.R. 368, 106th Cong. (1st Sess. 1999); Childrens’ [sic] Internet Protection Act, S.97, 106th Cong. (1st Sess. 1999); Childrens’ [sic] Internet Protection Act, H.R. 543, 106th Cong. (1st Sess. 1999); Childrens’ [sic] Internet Protection Act, H.R. 896, 106th Cong. (1st Sess. 1999); Children’s Internet Protection Act of 1999, H. Amdt. 212/Title XIV/Sec. 1401 of Juvenile Justice Reform Act of 1999, H.R. 1501, 106th Cong. (1st Sess. 1999), which is similar to S.97, the Children’s Internet Protection Act; Child Protection Act of 1999, H.R. 2560, 106th Cong. (2d Sess. 1999); Neighborhood Children’s Internet Protection Act, S.1545, 106th Cong. (1st Sess. 1999); the Internet Minors Protection & Cyberspace Tech. Act, H.R. 4545, 106th Cong. (2d Sess. 2000); the Children’s Internet Protection Act, H.R. 4600, 106th Cong. (2d Sess. 2000). See also the E-Rate Policy & Child Protection Act of 1998, H.R. 3442, 105th Cong. (2d Sess. 1998), which would have mandated a library use policy rather than filtering and which was not enacted into law. See also the provision of Internet filtering or screening software, Title XVI, Sec. 1604, of the Violent and Repeat Juvenile Offender Accountability and Rehabilitation Act of 1999, S. 254, 106th Cong. (1st Sess. 1999), which would have required Internet Service Providers to offer filtering software to their subscribers, but did not require public libraries and schools to install filtering software. The filtering bills are listed above in chronological order by date of introduction in the House or Senate. H.R. 3177 and S. 1619 contain identical wording, as do S.97 and H.R. 543. For an analysis of the bills, see infra pp. 241-283.


During 1998 and 1999, as filtering bills were initially being introduced in both chambers of Congress, the U.S. Senate Commerce Committee held three hearings and issued two reports on Internet indecency and the proposed filtering bills.42 The U.S. House of Representatives

Subcommittee on Telecommunications, Trade, and Consumer Protection held one hearing on legislative proposals to protect children from “inappropriate” online content.43 The House and

Senate did not hold hearings or issue reports in 2000, the year Congress passed the Children’s

Internet Protection Act.

In the 1998 and 1999 Senate and House committee hearings, legislators heard testimony on the capabilities of filtering and blocking software at the time. They were told that, first, software could suggest appropriate material when users type in search terms on the World Wide

Web. Second, software could inform or warn users about online content before they saw it.44


42 See Internet Indecency: Hearing Before the S. Comm. on Commerce, Science, & Transportation, 105th Cong. 910 (2d Sess., Feb. 10, 1998); Internet Filtering Systems, Report of the S. Comm. on Commerce, Science & Transportation on S.1619, S. REP. NO. 105-226 (2d Sess. 1998); S. 97, The Children’s Internet Protection Act, Hearing Before the Comm. on Commerce, Science, & Transportation, S. Hrg. 106-603 (1st Sess. 1999); S. 97, The Children’s Internet Protection Act, Hearing Before the Comm. on Commerce, Science, & Transportation, S. Hrg. 106-828 (1st Sess., May 20, 1999); Children’s Internet Protection Act, Report of the Comm. on Commerce, Science, & Transportation, on S.97, S. REP. NO. 106-141 (1st Sess., Aug. 5, 1999).

43 See Legislative Proposals to Protect Children from Inappropriate Materials on the Internet: Hearing before the Subcomm. on Telecomm., Trade, & Consumer Protection of the Comm. on Commerce, 105th Cong. 119 (2d Sess., Sept. 11, 1998).

44 See Internet Indecency: Hearing Before the S. Comm. on Commerce, Science, & Transportation, 105th Cong. 910 (2d Sess., Feb. 10, 1998) at 35 (statement of Christine Varney, attorney and chair of the Internet Online Summit). According to the Summit’s Web site, more than 650 participants representing over 300 organizations came together in December 1997 for the Internet Online Summit: Focus On Children, which “addressed ways to assure that steps are taken to make the Internet online experience safe, educational and entertaining for children.” http://www.kidsonline.org/oldindex.shtml (last visited July 20, 2009).


Third, software could block access to online content,45 which would be done by preventing

access to predetermined Web sites once URLs, or online addresses, were programmed into the

software.46 Fourth, software could filter online content,47 which would be done by preventing

access to Web sites based on keywords and rating systems.48

The majority of the bills and amendments introduced from 1998 through 2000 would have amended the Communications Act of 1934 by requiring public libraries and schools receiving universal service, or E-rate, funds to install blocking or filtering technologies on computers connected to the Internet to prevent minors from accessing online pornography.49 The

bills and amendments contained common terms referring to the screening of online content, such

as “filter”, “block”, “tools” and “devices.” Some of the proposals mandated that only one library computer needed to have a filtering or blocking system in use, while others stipulated that only computers and peripherals purchased with E-rate funds needed to have a filtering or blocking system in use. All of the proposals stipulated that local officials, such as librarians, school boards or school officials, would select the filtering technology and would determine what content was

“inappropriate for minors” or “harmful to minors.” One bill, which did not pass, would have required only a user policy, obligating libraries and schools to “establish a policy with respect to access to material that is inappropriate for children.”50 Other proposals—

45 Id. (statement of Christine Varney, attorney and chair of the Internet Online Summit).

46 See Children’s Internet Protection Act, Report of the Senate Committee on Comm., Science & Transportation on S.97, SEN. REP. NO. 106-141 (1st Sess., Aug. 5, 1999) at 6. See Chapter 3 of this dissertation for a discussion of filtering and blocking technology.

47 See id.

48 See id.

49 For a discussion of the bills, including the content covered in each bill, see the next section of this chapter.

50 E-Rate Policy and Child Protection Act of 1998, H.R. 3442, 105th Cong. (2d sess. 1998). Rep. Edward Markey, Dem.-Massachusetts, introduced the act, which was co-sponsored by Rep. Thomas Manton, Dem.-New York. The bill did not clarify or elaborate further on the user policy.


including the enacted Children’s Internet Protection Act51—required that libraries implement an

Internet use policy and install and use a “technology protection measure” on all computers with

Internet access. The Children’s Internet Protection Act of 2000 was based on three proposals that

Senator John McCain, a Republican from Arizona, had introduced from 1998 to 2000.52

Proposed Legislation in 1998

In 1998, the first year that Congress looked into implementing mandatory filtering in public libraries and schools, three proposals were introduced in the Senate and House. One bill was introduced in the Senate, and one bill and one amendment were introduced in the House.

The bills and amendment all stipulated that public libraries and schools that failed to install a filtering or blocking system on computers having Internet access would be ineligible for universal service or E-rate funding.53 Although Congress did not pass the bills or amendment, two nearly identical bills received considerable attention in the Senate54 and House,55 with neither bill limiting the selection of a filtering or blocking method to only software.

51 See Pub. L. No. 106-554, 114 Stat. 2763, 2763A-335 (2000) (codified as amended at 20 U.S.C. § 9134(f) and 47 U.S.C. § 254(h)).

52 S.1619, 105th Cong. (2d Sess. 1998); S.97IS, 106th Cong. (1st Sess. 1999); and Children’s Internet Protection Act, S.Amdt. No. 3610 to H.R.4577, Consolidated Appropriations Act, 2001, 106th Cong. (2d Sess., June 22, 2000).

53 See Internet School Filtering Act, S. 1619, 105th Cong. (2d Sess. 1998); Safe Schools Internet Act of 1998, H.R. 3177, 105th Cong. (2d Sess. 1998); Child Protection Act of 1998 (the Istook Amendment) to FY99 Labor, Health & Human Servs., & Educ. Appropriations Bill, Title VI of H.R.4274, 105th Cong. (2d Sess. 1998); and Internet Filtering, Amendment No. 3228 to Depts. of Commerce, Justice, & State, the Judiciary, and Related Agencies Appropriations Act, 1999, S. 2260, 105th Cong. (2d Sess. 1998). Rep. Edward Markey (Dem.-Massachusetts) and Thomas Manton (Dem.-New York) introduced the E-Rate Policy and Child Protection Act of 1998, H.R. 3442, 105th Cong. (2d Sess. 1998), but their bill required the establishment of user policies for minors using computers connected to the Internet, rather than filtering or blocking technology.

54 See Internet School Filtering Act, S.1619, 105th Cong. (2d Sess. 1998), introduced by Sen. John McCain, R- Arizona, along with original cosponsors Ernest Hollings, Dem.-South Carolina, Dan Coats, Rep.-Indiana and Patty Murray, Dem.-Washington. Five other cosponsors (three Republicans and two Democrats) were added later.

55 See Safe Schools Internet Act of 1998, H.R. 3177, 105th Cong. (2d Sess. 1998) (introduced by Rep. Bob Franks, Rep.-New Jersey).


In February 1998, Sen. John McCain of Arizona, who would become the Republican Party’s presidential nominee in 2008,56 introduced S.1619, the Internet School Filtering

Act,57 his first of two major Internet filtering bills in the Senate.58 Three cosponsors—Ernest

Hollings, a Democrat from South Carolina; Dan Coats, a Republican from Indiana; and Patty

Murray, a Democrat from Washington—joined him on Feb. 9, 1998 when he introduced the

bill.59

S.1619, the Internet School Filtering Act, would have amended the Communications Act

of 1934, 47 U.S.C. § 254, by prohibiting universal service, or E-rate, funding to public libraries

and schools unless the library or school certified that it “employ[ed] a system” on “one or more

of its computers” with Internet access that would “filter or block matter deemed to be

inappropriate for minors.”60 The bill did not define “inappropriate for minors.” McCain’s

Internet filtering bill, S.1619, emphasized local control, meaning that 1) local libraries and

schools would determine which content is inappropriate for minors, and 2) the U.S. government

would not be allowed to establish the criteria, review the determination or consider such criteria

in awarding funds.61

56 Senator McCain lost the presidential race to Barack Obama.

57 See Internet School Filtering Act, S.1619, 105th Cong. (2d Sess. 1998).

58 Sen. McCain introduced his second Internet filtering bill, S.97, in 1999. For a discussion of S.97, see infra notes 131-176 and accompanying text.

59 Six other cosponsors, four Republicans and two Democrats, later added their names as cosponsors of the bill: Spencer Abraham, Rep.-Michigan; Kay Bailey Hutchison, Rep.-Texas; Ted Stevens, Rep.-Alaska; Christopher “Kit” Bond, Rep.-Missouri; Herb Kohl, Dem.-Wisconsin; and Daniel Inouye, Dem.-Hawaii.

60 S.1619 a(1) & (3), 105th Cong. (2d Sess. 1998). In introducing his bill, Sen. McCain explained that the E-rate provides federal subsidies to libraries and schools for discounted Internet access. See 144 CONG REC. S8161 at S8162 (July 15, 1998) (statement of Sen. McCain).

61 See S.1619 a(1)(4), 105th Cong. (2d Sess. 1998).


In introducing S.1619, the Internet School Filtering Act, McCain stated that while some

minors actively sought out pornographic Web sites, many minors “unintentionally” came across

such Web sites.62 “The prevention lies, not in censoring what goes onto the Internet, but rather

in filtering what comes out of it onto the computers our children use outside the home,” McCain

said.63 Although parental supervision is more effective than government assistance or industry self-regulation, according to McCain, “parental supervision . . . is not possible when children use the Internet while they are away from home, in schools and libraries.”64

The American Civil Liberties Union and the American Library Association consistently opposed mandatory filtering bills. In 1998 testimony during the Senate Commerce Committee hearing on S.1619, the Internet School Filtering Act, the ACLU urged Congress to rely on alternatives to mandatory filtering software in libraries and schools.65 The ACLU stated that “S.1619 purports to leave to

the school board or librarian the authority to determine what matter is ‘inappropriate for

minors,’ (but) they are virtually powerless to make this determination.”66 The ACLU said

that a software vendor’s list of “acceptable sites” is proprietary and therefore not

available to customers. In addition, vendors would decide which speech is acceptable and

unacceptable, with employees using subjective judgment in making content

determinations. The ACLU recommended that libraries and schools establish “content-

62 See 144 CONG. REC. S517 at S519 (Feb. 9, 1998) (statement of Sen. John McCain).

63 Id.

64 Id.

65 Internet Indecency: Hearing Before the S. Comm. on Commerce, Science, & Transportation, 105th Cong. 910 (Feb. 10, 1998) at 67 (statement of American Civil Liberties Union).

66 Id. at 65 (statement of American Civil Liberties Union, referring to S.1619, the Internet School Filtering Act).


neutral rules about when and how young people should use the Internet” and provide

educational seminars on the “responsible use of the Internet.”67

The American Library Association testified that filtering or blocking software

deprives patrons of access to valuable and constitutionally protected information. The

ALA said that software “fails to provide ‘protection’ from other materials that others may

find ‘objectionable,’ however defined.”68

However, the Senate Commerce Committee, in its 1998 report on Internet filtering

systems, opposed the implementation of a user policy unless blocking or filtering technology

also was mandated. The committee stated that a standard use policy consisting of a code of conduct for appropriate online behavior would not be effective because it relied on minors pledging not to seek “harmful materials” on the Internet. Such a policy statement also would not protect minors from inadvertent or accidental exposure to pornography.69 The

Commerce Committee stated that filtering or blocking technology was needed because parents

would not be able to supervise their children in libraries and schools.70

The June 1998 Senate Commerce Committee’s report to accompany S.1619, the

Internet School Filtering Act, stated that online pornography and graphic violence posed a “danger” that “is particularly acute for the nation’s children, who are unable to guard themselves with the sophistication of an adult.”71 The report acknowledged that

67 Id. at 67 (statement of American Civil Liberties Union).

68 Id. at 62 (statement of American Library Association).

69 Internet Filtering Systems, S. REP. NO. 105-226 (2d Sess., June 25, 1998) at 7.

70 Id. at 3.

71 Id. at 1.


although filtering or blocking technologies are not perfect, they “provide a reasonable means of protecting children from the majority of harmful material on the Internet.”72

The Senate referred the McCain bill to the Senate Committee on Commerce, Science, and

Transportation, which McCain chaired. The Committee, in its June 1998 report to the

Senate, recommended passage of the bill, stating that the purpose of the bill was to protect children from “exposure to harmful material while accessing the Internet from a school or library.”73 The report stated that mandating filtering or blocking mechanisms, in exchange for receipt of federal universal service funds for technology in public libraries and schools, is constitutional.74 According to the report, “Congress may impose reasonable conditions on the receipt of federal funds or subsidies as part of its spending power” as long as the conditions are stated “clearly and unambiguously.”75 The report stated that the use of filtering or blocking systems was constitutional because the installation of those systems “is reasonably related to the purpose of providing schools and libraries with Internet services to fulfill their education mission.”76 Moreover, libraries could continue to provide unfiltered access to the Internet if they

72 Id. at 3 (June 25, 1998). The committee reported that filtering software prohibits access to online content based on key words, whereas blocking software prohibits access to online content based on a list of sites previously found to be “inappropriate.” Id. For a discussion of filtering and blocking technology, see Chapter 3.

73 Id. at 1.

74 Id. at 4 (citing two Supreme Court decisions that examined the relationship between the First Amendment and the government’s right to subsidize speech). See Rust v. Sullivan, 500 U.S. 173 (1991), in which the Supreme Court upheld a government ban that prohibited federally funded clinics from advocating, promoting or advising on abortion. The report noted that the Supreme Court, in Rust, stated that the government is entitled to say what it wishes when the government spends public funds to promote a specific policy. See also Rosenberger v. Rector, 515 U.S. 819 (1995), in which the Supreme Court held that a public university’s policy prohibiting the reimbursement of a Christian student newspaper’s expenses was unconstitutional. The report stated that the Supreme Court affirmed that while the government may make content-based decisions when the government is the speaker, in this case the university was not the speaker, but rather distributed funds to encourage diverse viewpoints from private speakers, including the religious newspaper and other student publications.

75 Id.

76 Id.

245

did so with their own funds or through government programs other than the E-rate.77

The Senate Commerce Committee report considered the public library a public forum for the purpose of acquiring knowledge,78 stating that laws restricting access to such information must pass the strict scrutiny test.79 To pass the strict scrutiny test, a regulation must promote a compelling government interest and must be narrowly tailored to further that interest.80 The report said that the Internet School Filtering Act would pass such a test because the courts have found protection of minors to be a compelling government interest and the bill was designed to shield children from harmful materials on the Internet.81 According to the report, the bill was narrowly tailored and the least restrictive means of achieving the government’s compelling interest because libraries—not the government—would determine “matter deemed to be inappropriate for minors.”82 The report also stated that library standard use policies would not protect children from inadvertently or accidentally accessing “harmful materials” on the

77 Id. at 5.

78 Id. at 6. When stating that a public library is a public forum, the Senate Commerce Committee did not specify whether it meant a traditional public forum or a designated public forum. The Committee also did not discuss public forum doctrine in relation to Internet access. However, in the context of the report, the Senate Commerce Committee indicated that a public library was a designated public forum. In labeling the public library a public forum for the purpose of knowledge acquisition, the report cited Kreimer v. Bureau of Police for Morristown, 958 F.2d 1242 (1992), the most recent on-point case at that time. In the holding of Kreimer, the court found that the local government in Morristown and Morris Township, New Jersey, had made the library a limited public forum for the right to receive information, but not for personal expression, such as making speeches. Kreimer, 958 F.2d at 1256-63. The court stated, “The recognition of a constitutional right protecting public access to information and ideas is simply the threshold of our analysis.” Kreimer, 958 F.2d at 1255. For further discussion of the application of public forum doctrine to public libraries, see supra Chapter 2, pp. 56-101. For a discussion of court cases dealing with Internet access within public libraries, see Chapter 2, pp. 62-68. For a discussion of the court cases on the Children’s Internet Protection Act, see Chapter 7.

79 Internet Filtering Systems, S. REP. NO. 105-226 (2d Sess., June 25, 1998) at 6.

80 See Sable v. FCC, 492 U.S. 115, 126 (1989).

81 Internet Filtering Systems, S. REP. NO. 105-226 (2d Sess., June 25, 1998) at 6.

82 Id. at 7.

246

Internet.83 Finally, the report stated that adults could still access constitutionally protected speech

in the public library because the bill required that a filtering or blocking system be installed on only one computer with Internet access. If a small branch library had only one computer with Internet access, the library would need to install a filtering or blocking system on that computer in order for the library to receive E-rate funds. However, the bill would allow the librarian to turn the system off when “appropriate” while adults used the computer,84 although the report did not define or provide examples of “appropriate.”

Senator Joe Lieberman, a Democrat from Connecticut who had opposed the

Communications Decency Act,85 stated that he supported McCain’s filtering bill even though he was “extremely reluctant to resort to government restrictions on speech or any forms of expression and much prefer[s] self-regulation.”86 Lieberman said he supported the bill because

the Internet industry did not seem able to regulate itself.87

S. 1619, the Internet School Filtering Act, passed in the Senate. However, no action was

taken on the bill in the House before the conclusion of the legislative session. In July 1998, the

Senate Appropriations Committee approved an appropriations bill88 with an amendment by Sen.

John McCain that contained the text of the bill that McCain had previously introduced as S.

83 Id. at 7. The Committee wrote that children could “be traumatized by exposure to hard core pornography using innocuous search terms,” such as “cheerleader” or “nurse.” Id.

84 Id. at 6-7.

85 Congress passed the Communications Decency Act as part of the Telecommunications Act of 1996. The CDA was Congress’ first attempt to regulate speech on the Internet and criminalized the online transmission of indecent or patently offensive material to minors, defined as those under the age of eighteen. The Supreme Court held that the CDA was unconstitutional. Reno v. ACLU, 521 U.S. 844, 885 (1997). See Chapter 5 for a discussion of the Communications Decency Act.

86 144 CONG. REC. S 8161 at 8162 (July 15, 1998) (statement of Sen. Joe Lieberman).

87 Id. at 8163 (statement of Sen. Joe Lieberman).

88 Appropriations for Commerce, Justice, State, & Judiciary FY 1999, S.2260, 105th Cong. (2d Sess. 1998).

247

1619, the Internet School Filtering Act.89 After the committee voted to approve the filtering

amendment, Sen. Conrad Burns, a Republican from Montana, stated his opposition to the bill:

I want to make it very clear that I remain steadfastly opposed to big government mandates on the filtering issue and I will work closely with my colleagues as S. 2260 heads to conference to perfect the bill to reflect these concerns. I continue to believe that local communities acting through their school and library boards, rather than software programs that are at best questionable or the federal government, are in the best position to make decisions on this critical issue.90

Although the Senate passed S.2260, the Fiscal Year 1999 appropriations bill for

Commerce, Justice, State, and Judiciary, with the text of S.1619 attached as an amendment,91 the

House did not take action on the appropriations bill. The following year, Senator McCain reintroduced his filtering legislation as S.97, the Childrens’ [sic] Internet Protection Act, although the wording of S.97 changed over time, as discussed in the section below on filtering proposals introduced in 1999.92

In early 1998, during the same period that McCain had introduced S.1619, the Internet

School Filtering Act,93 a parallel filtering bill was working its way through the House. Rep. Bob

Franks, a Republican from New Jersey, introduced H.R. 3177, the Safe Schools Internet Act of

1998.94 Franks’ bill was identical to the McCain bill, mandating that before public libraries and

schools could receive universal service assistance, each library or school would have to certify

89 Internet School Filtering Act, S. 1619, 105th Cong. (2d Sess. 1998).

90 144 CONG. REC. S 8611 at S 8614 (July 21, 1998) (statement of Sen. Conrad Burns).

91 Appropriations for Commerce, Justice, State, & Judiciary FY 1999, S.2260, 105th Cong. (2d Sess. 1998).

92 Childrens’ [sic] Internet Protection Act, S. 97, 106th Cong. (1st Sess. 1999). See infra notes 131-176 and accompanying text for a discussion of S.97 and the changes in wording.

93 S.1619, 105th Cong. (2d Sess. 1998).

94 H.R. 3177 (a)(1), 105th Cong. (2d Sess. 1998) (introduced by Representative Bob Franks).

248

that “it employs a system to filter or block matter deemed to be inappropriate for minors” on at least one computer with Internet access.95 Franks’ bill also stipulated that local libraries and schools would determine which content was inappropriate for minors and that the U.S. government would not establish, review or consider the local criteria in making its funding decisions,96 just as McCain’s bill had.

In a September 1998 House hearing on H.R. 3177 and other legislative proposals to protect minors from “inappropriate” online content, Mary Anne Layden, a clinical psychologist, stated that pornography distorts children’s and adults’ views about intimacy and sex by

“spread(ing) the myth that male sexuality is viciously narcissistic, predatory and out of control.”97

Peter Nickerson, the president of a filtering software company, stated that blocking and filtering software could be placed on servers, as well as on individual computers.98 He also said filtering and blocking programs are “fully customizable,”99 with “systems to be usable based on

95 H.R. 3177, 105th Cong. (2d Sess. 1998).

96 H.R. 3177 (4), 105th Cong. (2d Sess. 1998).

97 Legislative Proposals to Protect Children from Inappropriate Materials on the Internet: Hearing before the Subcomm. on Telecomm., Trade, & Consumer Protection of the Comm. on Commerce, 105th Cong. 119 (2d Sess., Sept. 11, 1998) at 55 (statement of Mary Ann Layden). For example, she told the committee about a twelve-year-old boy who viewed his father’s pornographic magazines and then began having sex with his eight-year-old sister. The boy continued having sex with his sister for ten years. Layden said that pornography “mis-educated him (the boy) about sexuality and gave him a pathological view of intimacy.” Id.

98 Id. at 61 (statement of Peter Nickerson, CEO of N2H2). Nickerson said that Internet filters can be placed on a single server, which controls information to individual computers throughout a network. In schools, filters can be placed on servers in an individual school or in a school district or in a network of school districts. Id. at 61. Similarly, filters can be placed on servers in an individual library or in the library district’s headquarters if all library computers run through one or more central servers in a main building. See American Library Association, CIPA and Libraries: From the Field, available at http://www.ala.org/ala/aboutala/offices/wo/woissues/civilliberties/cipaweb/adviceresources/fromthefield.cfm. For a discussion of filtering technology, see Chapter 3.

99 Legislative Proposals to Protect Children from Inappropriate Materials on the Internet: Hearing before the Subcomm. on Telecomm., Trade, & Consumer Protection of the Comm. on Commerce, 105th Cong. 119 (2d Sess., Sept. 11, 1998) at 64 (statement of Peter Nickerson, CEO of N2H2).

249

a community’s own standards.”100 Public library and school administrators could choose which

categories of content to block, Nickerson stated.101 For example, a library or school could

prevent minors from accessing Web sites containing pornography, drug use, graphic violence

and/or bomb-making instructions.102

Librarians and school administrators could add or subtract Web sites from the list. In

addition to blocking and filtering Web sites, libraries and schools could also block e-mail and

chat rooms, Nickerson said. The software also could allow adults unblocked and unfiltered

access to the Internet through the use of a password.103 While both legal and illegal pornography

would be blocked or filtered, courts have held that “adult entertainment can be ‘zoned’ out of certain parts of the community in order to protect other legitimate (but competing) interests,”

such as protecting children, Nickerson said.104
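Nickerson’s description of category-based blocking, locally editable site lists, and a password override for adult patrons can be illustrated with a brief sketch. The Python code below is purely hypothetical; the category names, site lists, password, and function names are assumptions made for illustration and do not represent any vendor’s actual product.

# Invented example of the features Nickerson described: administrators choose
# which categories to block, librarians may add or remove individual sites,
# and adults may obtain unfiltered access with a password.

BLOCKED_CATEGORIES = {"pornography", "graphic_violence"}      # categories chosen by administrators
SITE_CATEGORIES = {"adult-example.com": "pornography",
                   "health-example.org": "health"}            # vendor-style category list (invented)
LOCAL_OVERRIDES = {"health-example.org": "allow"}             # librarian additions or subtractions
ADULT_PASSWORD = "hypothetical-password"                      # unlocks unfiltered access for adults

def allow_access(url, password=None):
    """Return True if the request is permitted under this illustrative policy."""
    if password == ADULT_PASSWORD:                 # adult override: unblocked, unfiltered access
        return True
    host = url.split("//")[-1].split("/")[0]
    override = LOCAL_OVERRIDES.get(host)
    if override == "allow":                        # site the librarian explicitly unblocked
        return True
    if override == "block":                        # site the librarian explicitly blocked
        return False
    return SITE_CATEGORIES.get(host) not in BLOCKED_CATEGORIES

print(allow_access("http://adult-example.com/page"))                  # False: blocked category
print(allow_access("http://adult-example.com/page", ADULT_PASSWORD))  # True: adult password override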

Agnes Griffen, an American Library Association representative, questioned whether any

blocking or filtering technology could keep up with monitoring the content of more than 300 million

Web pages since the number of Web pages was “exponentially increasing.”105 Jerry Berman, the

executive director of the Center for Democracy and Technology, stated that while many filtering

100 Id. at 62.

101 Id.

102 Id. at 64.

103 Id.

104 Id. at 65. Nickerson did not cite the zoning cases in his testimony. In contrast to Nickerson’s testimony, the American Civil Liberties Union, in an earlier hearing before the Senate Commerce Committee, had testified that libraries and schools would not be able to set their own local parameters to filter content because private vendors design filtering software. See Internet Indecency: Hearing Before the S. Comm. on Commerce, Science, & Transportation, 105th Cong. 910 (Feb. 10, 1998) at 65 (statement of American Civil Liberties Union, referring to S.1619, the Internet School Filtering Act).

105 Legislative Proposals to Protect Children from Inappropriate Materials on the Internet: Hearing before the Subcomm. on Telecomm., Trade, & Consumer Protection of the Comm. on Commerce, 105th Cong. 119 (2d Sess., Sept. 11, 1998) at 71 (statement of Agnes Griffen, member of the ALA Committee on Legislation and Director of the Tucson-Pima Public Library in Arizona).

250

and blocking products were available in 1998, no one had conducted an empirical study on the effectiveness of filtering technology.106 Berman also stated that the Safe Schools Internet Act’s

“requirements to adopt filtering technology will usurp local communities’ ability to set standards

that reflect their values.”107 Because companies do not disclose their standards, libraries and

schools would not be able to determine if commercial filters met their needs, Berman said.108

Griffen, the ALA representative, stated that local libraries are best able to decide whether filtering technology, a user policy or another method would be the most effective option to prevent minors from accessing online pornography.109 According to Griffen, “quick fixes,”

such as mandatory filtering, “fail to teach children how to best use the Internet.”110 Using

blocking or filtering technology would prevent minors from developing “critical viewing and information skills that will help them make good judgments about the information they encounter,” she said.111

Franks’ bill, H.R. 3177, was referred to, and died in, the House Commerce Committee.

Rep. Ernest Istook, a Republican from Oklahoma, introduced the third and final filtering

proposal in 1998, which was attached as an amendment to a major House appropriations bill.

The Child Protection Act of 1998 would have required schools and public libraries receiving

106 Id. at 82 (statement of Jerry Berman, Executive Director for the Center for Democracy and Technology). According to its Web site, the CDT is a “non-profit public policy organization dedicated to promoting the democratic potential of today's open, decentralized global Internet.” http://cdt.org/mission/ (last visited July 20, 2009). For a discussion of filtering studies, see Chapter 3.

107 Legislative Proposals to Protect Children from Inappropriate Materials on the Internet: Hearing before the Subcomm. on Telecomm., Trade, & Consumer Protection of the Comm. on Commerce, 105th Cong. 119 (2d Sess., Sept. 11, 1998) at 40 (statement of Jerry Berman, referring to S.1619, the Internet School Filtering Act).

108 Id.

109 Id. at 71 (statement of Agnes Griffen).

110 Id. at 74.

111 Id.

251

federal funds for acquiring or operating any computers connected to the Internet and accessible to minors to do two things: 1) to install on those computers software that was “adequately designed” to prevent minors from accessing “obscene information”; and 2) to ensure that the software was “operational” whenever minors used the computers. The software program could be “temporarily interrupted” to allow minors to access information that was not obscene or otherwise unprotected by the Constitution, as long as the minor was under the direct supervision of an adult.112

In contrast to McCain’s Internet School Filtering Act (S.1619)113 and Franks’ Safe Schools Internet Act (H.R. 3177),114 the Child Protection Act applied to all federal funds (not just E-rate or LSTA funding) and specifically stated that the filtering method used had to be “software” and the software had to be “operational.”115 Istook’s proposal would have blocked minors’ access only to obscenity, rather than the broader range of material deemed “inappropriate for minors,”116 as the McCain and Franks bills

112 See Title VI, Child Protection Act of 1998, Depts. of Labor, Health & Human Servs., & Educ., & Related Agencies Appropriations Act, [FY]1999, H.R.4274, 105th Cong. (2d Sess. 1998).

113 See S.1619, 105th Cong. (2d Sess. 1998).

114 See H.R. 3177, 105th Cong. (2d Sess. 1998).

115 Title VI, Child Protection Act of 1998, Depts. of Labor, Health & Human Servs., & Educ., & Related Agencies Appropriations Act, [FY]1999, H.R.4274, 105th Cong. (2d Sess. 1998).

116 In Miller v. California, 413 U.S. 15, 24 (1973), the Supreme Court defined “obscenity” as “works which, taken as a whole, appeal to the prurient interest in sex, which portray sexual conduct in a patently offensive way, and which, taken as a whole, do not have serious literary, artistic, political, or scientific value.” Id. See Chapter 4 for a discussion of obscenity and the concept of harmful to minors. The phrase “inappropriate for minors” was not defined in McCain’s bill, S.1619, 105th Cong. (2d Sess. 1998) or Franks’ bill, H.R. 3177, 105th Cong. (2nd Sess. 1998). In June 2000, Rep. Charles “Chip” Pickering introduced the Children’s Internet Protection Act, the first bill to define what content would be inappropriate for minors. See H.R. 4600, 106th Cong. (2d Sess. 2000). Pickering’s bill, which was co-sponsored by Bob Franks and Ernest Istook, used the term “harmful to minors.” The bill defined “harmful to minors” as,

any communication, picture, image, graphic image file, article, recording, writing or other matter of any kind that—(i) taken as a whole and with respect to minors, appeals to a prurient interest in nudity, sex, or excretion; (ii) depicts, describes, or represents, in a patently offensive way with respect to what is suitable for minors, an actual or simulated sexual act or sexual contact, actual or simulated normal or perverted sexual acts, or a lewd exhibition of the genitals; (iii) taken as a whole, lacks serious literary, artistic, political, or scientific value.

H.R. 4600, 106th Cong. (2d Sess. 2000).

252

did. The Istook amendment contained a provision stating that the software could be

“temporarily interrupted to permit a minor to have access to information that is not

obscene or otherwise unprotected by the Constitution under the direct supervision of an

adult designated by such school or library.”117 The McCain and Franks bills did not

require the installation of software as a solution and did not expressly require that the filtering or blocking system be activated, although the introduction to both

McCain’s and Franks’ bills stated that schools and libraries had to “implement” such a

system.118

Istook had requested an analysis of the constitutionality of the Child Protection

Act of 1998 from the Congressional Research Service. Henry Cohen, a legislative

attorney, drafted a memo stating that the bill would be constitutional because obscenity is

not protected by the First Amendment. Cohen also stated that the bill would not

necessarily be unconstitutional if it were to be applied to constitutionally protected

material because “Congress may, to some extent, discriminate on the basis of the content

of protected speech in choosing what speech to fund, even where it could not do so by

directly proscribing it.”119


117 Title VI, Child Protection Act of 1998, Depts. of Labor, Health & Human Servs., & Educ., & Related Agencies Appropriations Act, [FY]1999, H.R.4274, 105th Cong. (2d Sess. 1998).

118 S.1619, 105th Cong. (2d Sess. 1998) and H.R. 3177, 105th Cong. (2nd Sess. 1998). The introduction to both bills stated: “No universal service for schools or libraries that fail to implement a filtering or blocking system for computers with Internet access,” which might imply activation. See Internet School Filtering Act. S.1619, Sec. 1, 105th Cong. (2d Sess. 1998) and Safe Schools Internet Act, H.R. 3177, Sec. 2, 105th Cong. (2d Sess. 1998).

119 Memorandum from Henry Cohen, Legislative Attorney, Congressional Research Service, to Rep. Ernest J. Istook, Jr., U.S. House of Representatives, Re: Constitutionality of Blocking URLs Containing Obscenity and Child Pornography (June 7, 1999), reprinted in 145 CONG. REC. E 1602-03 (July 20, 1999) (on file with author of this dissertation and also available at http://www.techlawjournal.com/congress/blocking/80629crs.htm).

253

A House Appropriations Subcommittee unanimously approved the Istook-sponsored

Child Protection Act of 1998, and it then was added as an amendment to the Labor, Health and

Human Services, and Education, and Related Agencies Appropriations Act in the House.120

However, the Istook amendment was not included in the final Omnibus Appropriations Act that

Congress enacted.121

Another related bill, but one not requiring filtering technology, the E-Rate Policy and

Child Protection Act, was introduced in 1998.122 The bill instead mandated that libraries and schools obtaining E-rate funding establish a user policy for minors accessing computers connected to the Internet. The bill, which was introduced by Rep. Edward Markey, a Democrat from Massachusetts, would have amended the Communications Act of 1934 by requiring a user policy governing minors’ access to material deemed “inappropriate for children.”123 The purpose of the proposed legislation, according to Markey, was “to ensure that local school and library officials think through the many issues of online access, and implement a policy for addressing access by children.”124 The House Commerce Committee referred the bill to the Subcommittee on Telecommunications, Trade, and Consumer Protection, where it died. Senator McCain, who had introduced a mandatory filtering bill earlier in the year, opposed the implementation of acceptable use policies in place of filtering or blocking technology. McCain stated that although public libraries and schools had argued in favor of

120 Title VI, Child Protection Act of 1998, Depts. of Labor, Health & Human Servs., & Educ., & Related Agencies Appropriations Act, [FY]1999, H.R.4274, 105th Cong. (2d Sess. 1998).

121 The author of this dissertation could not find documentation as to why the amendment was not included in the Appropriations Act.

122 E-Rate Pol’y & Child Protection Act of 1998, H.R. 3442, 105th Cong. (2d sess. 1998). Rep. Edward Markey, Dem.-Massachusetts, introduced the act, which was co-sponsored by Rep. Thomas Manton, Dem.-New York.

123 H.R. 3442, 105th Cong. (2d sess. 1998).

124 Extensions of Remarks, 144 CONG. REC. E362 (March 12, 1998) (statement of Rep. Edward Markey).

254

“acceptable use” policies, “[I]mplementing a use policy alone would be completely ineffective” because Internet users could intentionally or unintentionally access sexually explicit Web sites.125

Proposed Legislation in 1999

Congress again took up the filtering issue in its next session in 1999, with a total of seven library and school filtering bills or amendments coming before the Senate and House. Four filtering bills were introduced in the House,126 two filtering bills were introduced in the

Senate,127 and one filtering amendment was attached to the House’s version of the juvenile justice bill.128 Congress did not enact any of the bills or amendments.

In January 1999, Rep. Bob Franks, a Republican from New Jersey, introduced the first of his three filtering bills of the year, H.R. 368, the Safe Schools Internet Act of 1999.129 Franks’ bill would have required public libraries and schools receiving universal service funding to

125 144 CONG. REC. S 8161 at 8162 (July 15, 1998) (statement of Sen. John McCain). McCain had introduced the Internet School Filtering Act, S.1619, 105th Cong. (2d Sess. 1998).

126 Safe Schools Internet Act of 1999, H.R. 368, 106th Cong. (1st Sess. 1999); Childrens’ [sic] Internet Protection Act, H.R. 543, 106th Cong. (1st Sess. 1999); Childrens’ [sic] Internet Protection Act, H.R. 896, 106th Cong. (1st Sess. 1999); Child Protection Act of 1999, H.R. 2560, 106th Cong. (1st Sess. 1999).

127 See Childrens’ [sic] Internet Protection Act, S. 97, 106th Cong. (1st Sess. 1999); Neighborhood Children’s Internet Protection Act, S. 1545, 106th Cong. (1st Sess. 1999).

128 See Children’s Internet Protection Act of 1999, H. Amdt. 212/Title XIV/Sec. 1401 of Juvenile Justice Reform Act of 1999, H.R. 1501, 106th Cong. (1st Sess. 1999). The Senate version of the juvenile offender bill contained a filtering amendment that required Internet Service Providers to offer a computer filtering or blocking system or software to residential customers, but that amendment did not include a provision that mandated filtering in public libraries and schools. See the Violent & Repeat Juvenile Offender Accountability & Rehabilitation Act of 1999, S. 254, 106th Cong. (1st Sess. 1999). The name of the Juvenile Justice Reform Act of 1999 was changed to the Violent and Repeat Juvenile Offender Accountability and Rehabilitation Act of 1999 in an engrossed amendment as agreed to by the Senate, H.R. 1501.EAS, 106th Cong. (1st Sess. 1999). Congress did not enact the filtering amendments.

129 See Safe Schools Internet Act of 1999, H.R. 368, 106th Cong. (1st Sess. 1999). Rep. Michael Oxley (Rep.-Ohio) and Rep. Ronnie Shows (Dem.-Mississippi) cosponsored the bill. Franks introduced the Childrens’ [sic] Internet Protection Act, H.R. 543, 106th Cong. (1st Sess. 1999) in February 1999 and the Childrens’ [sic] Internet Protection Act, H.R. 896, 106th Cong. (1st Sess. 1999) in March 1999. In April 1999, Franks also introduced a filtering amendment as part of the juvenile offenders [sic] bill. See Children’s Internet Protection Act of 1999, H. Amdt. 212/Title XIV/Sec. 1401 of Juvenile Justice Reform Act of 1999, H.R. 1501, 106th Cong. (1st Sess. 1999).

255

install a “system to filter or block matter deemed to be inappropriate for minors,” thus amending the Communications Act of 1934.130 In a public library, the system would need to be installed on

“one or more of its computers with Internet access.” In a school, the system would need to be installed on “computers with Internet access.” In the section of the bill pertaining to public libraries, Franks did not clarify the term “one or more computers.” If a library had more than one computer, the bill could mean that the library could choose whether to put a filtering or blocking system on just one computer, on several computers or on all computers. In contrast, the bill could mean a library with just one computer would be required to install a filtering or blocking system on that single computer. A local authority, such as the library, school or school board, would determine which content would be inappropriate for minors. Franks’ bill was referred to the

House Committee on Commerce, which then referred it to the Subcommittee on

Telecommunications, Trade, and Consumer Protection, where it died.

On the same day that Franks introduced the Safe Schools Internet Act in the House, John

McCain, a Republican from Arizona, and Ernest Hollings, a Democrat from South Carolina, introduced S. 97, the Childrens’ [sic] Internet Protection Act,131 in the Senate.132 The McCain-

Hollings filtering bill was similar to McCain’s 1998 bill.133

McCain, in his opening statement in the March 4, 1999 hearing on S.97, said that as more and more children used the Internet in libraries and schools, those children were likely to

130 See Safe Schools Internet Act of 1999, H.R. 368, 106th Cong. (1st Sess. 1999).

131 See S.97IS, 106th Cong. (1st Sess. 1999). Ernest Hollings cosponsored S.97 when it was first introduced. Six Republican cosponsors were added later.

132 In 1998, Sen. McCain had introduced the Internet School Filtering Act, S.1619, 105th Cong. (2d Sess. 1998).

133 See S.1619, 105th Cong. (2d Sess. 1998). For a discussion of S.1619, see supra notes 58-92 and accompanying text.

256

encounter online pornography and sexual predators unless filtering technology was required.134

McCain, who continued to serve as chairman of the Commerce Committee, said that S.97 was

“basically the same Internet filtering bill reported out of this committee in the last Congress.”135

If public libraries and schools wanted to receive or retain universal service, or E-rate, funding,

the McCain-Hollings bill would have required those libraries and schools to “install” and “use” a

“technology” on computers connected to the Internet in order to “filter or block material deemed

to be harmful to minors.”136 The McCain-Hollings bill did not define the term “harmful to minors.”

McCain’s 1998 bill, S.1619,137 did not contain a specific provision for libraries with only

one computer, whereas S.97, the McCain-Hollings bill, stipulated that libraries with only one

computer connected to the Internet were not required to use filtering or blocking technology.

However, if the library did not use filtering or blocking technology on its only computer, the bill

would have required the library to certify to the FCC that the library “employs a reasonably effective alternative means to keep minors from accessing material on the Internet that is deemed

to be harmful to minors.”138 The McCain-Hollings bill required that libraries and schools install

a “technology” to “filter or block material deemed to be harmful to minors,”139 whereas

McCain’s 1998 bill, S.1619, mandated that libraries and schools install a “system” to “filter or

block matter deemed to be inappropriate for minors.”140 No reason was given for deleting the

134 See S.97, The Children’s Internet Protection Act, S. Hrg. 106-603 (1st Sess., Mar. 4, 1999) at 2.

135 Id. at 1 (statement of Sen. John McCain, referring to S.1619, Internet School Filtering Act). See S.1619, 105th Cong. (2d Sess. 1998).

136 S.97IS, 106th Cong. (1st Sess. 1999).

137 S.1619, 105th Cong. (2d Sess. 1998).

138 S.97IS, 106th Cong. (1st Sess. 1999).

139 Id.

140 S.1619, 105th Cong. (2d Sess. 1998).

257

phrase “inappropriate for minors,” which was used in S.1619, and adding “harmful to minors” to

S.97. Moreover, neither term was defined in the bills. However, the Senate Report on S.97, the

Children’s Internet Protection Act, stated that the Senate Commerce Committee met in an open

executive session on June 23, 1999, and ordered the bill to be reported favorably with an

amendment in the nature of a substitute.141

Both McCain’s 1998 and 1999 bills would have required that a local authority, such as

the library, school board or school, determine which content would not be available to children.

However, the inappropriate content was labeled “inappropriate for minors” in McCain’s 1998

bill142 and “harmful to minors” in the McCain-Hollings 1999 bill.143 Under the McCain-Hollings

bill in 1999, a library with more than one computer connected to the Internet would need to

certify to the FCC that the library had “installed” and “uses” the filtering or blocking technology

“on one or more of its computers with Internet access.”144 According to McCain’s 1998 bill, a library would need to certify to the FCC that “on one or more of its computers with Internet access, it employs a system to filter or block matter deemed to be inappropriate for minors.”145

During a Senate Commerce Committee hearing on S.97, the McCain-Hollings bill, in

March 1999, Senator McCain and a librarian debated whether the filtering software needed to be

turned on in order for libraries to receive universal service funding. McCain stated that “the

software does not have to be used.” Candace Morgan, associate director of the Fort Vancouver

141 SEN. REP. NO. 106-141 at 9 (Aug. 5, 1999). The author of this dissertation could not find any documentation explaining the reasoning for the changes in wording from “inappropriate for minors” to “harmful to minors.”

142 See S.1619, 105th Cong. (2d Sess. 1998).

143 See S.97, 106th Cong. (1st Sess. 1999).

144 Id.

145 S.1619, 105th Cong. (2d Sess. 1998). Libraries needed to certify to the FCC that the library “employs” a filtering or blocking “system” on “one or more of its computers with Internet access.” Id.

258

Regional Library in Washington and an opponent of the filtering bill, replied, “That is not what

the plain language appears to say.” Sen. McCain responded, “The language clearly says, it is up

to your criteria, the criteria that you decide, that the board decides, without any interference from anybody.” Morgan said, “The criteria is how to configure the software.” McCain answered, “No, ma’am. That is absolutely false . . . . I am sorry that you would not read English the same way that everybody else does . . . . If you and the board decided that you do not want to use this software, do not use it.”146

Senator John “Jay” Rockefeller, a Democrat from West Virginia, agreed with Morgan’s

interpretation of the plain wording of the bill and questioned McCain’s interpretation of his own

bill. “When somebody says that you install a technology but you do not have to turn it on, which

is what Senator McCain says, that does not necessarily help you, does it, Ms. Morgan?”147

Rockefeller asked. Morgan replied, “Twice in this bill it says that the software will be installed and used, so I cannot quite grasp why it could be turned off. If it is off, it is not being used.”148

Elliot Mincberg, Vice President and General Counsel of People for the American Way, agreed with

Morgan’s and Rockefeller’s interpretation of the bill’s language. “S.97 . . . simply says ‘installs and uses’ this technology, so it is not at all an unreasonable inference, one shared not just by me and Ms. Morgan, but by Senator Rockefeller and organizations across the country, that S.97 as worded could be applied to mean that filtering must be used at all times.”149

146 S.97, The Children’s Internet Protection Act, S.Hrg. 106-603 (1st Sess., March 4, 1999) at 27-30.

147 Id. at 30-31 (statement of Sen. John “Jay” Rockefeller).

148 Id. at 30.

149 Id. at 59 (statement of Elliot Mincberg, Vice President and General Counsel for People for the American Way). Mincberg did not list names of organizations during his testimony.

259

During the hearing, several other witnesses testified before the Commerce Committee.

The president of Net Nanny, a filtering software company, stated that filtering and blocking

technology can block Web sites, such as those containing pornography, hate speech, bomb-

making instructions, and illegal drug information, and can create a list of “positive sites that are

safe and educational for children.”150 The chairman of RuleSpace, another filtering software

company, stated that its product, WebChaperone, could recognize and stop the download of

pornography without blocking health-related information.151 A criticism of older filtering

software had been that the software erroneously blocked health sites, such as breast cancer sites.

A filtering advocate testified that the Children’s Internet Protection Act would permit librarians and school administrators to rely on their “own subjective good faith judgment in blocking or screening material considered by them to be ‘harmful to minors.’”152 However, Jerry

Berman, Director of the Center for Democracy and Technology, who also testified in an earlier

House subcommittee hearing on mandatory filtering, disagreed. “Requirements to adopt filtering

technology will effectively usurp local communities’ abilities to set standards” because “filtering

technologies . . . do not mirror the diversity of local community norms found across the country,”

Berman stated again in 1999.153

Librarian Susan Fuller testified that she opposed mandatory filtering because a filtering law would prevent local communities from discussing and providing input on how best to address minors’ access to “inappropriate” Web sites in public libraries.154 Morgan, the librarian

150 Id. at 25 (statement of Gordon Ross, President and CEO of Net Nanny).

151 Id. at 40 (statement of Adrian Russell-Falla, Chairman of RuleSpace, which developed WebChaperone).

152 Id. at 53 (statement of Bruce Taylor, President and Chief Counsel, National Law Center for Children and Families).

153 Id. at 88 (prepared statement of Jerry Berman, Executive Director of the Center for Democracy and Technology).

154 Id. at 92 (statement of Susan Fuller, Director of the Santa Clara County Library).

260

who had debated the language of the bill with McCain, also disagreed with a federal mandate,

stating that libraries and communities should work together to find a solution. “I do believe

what this country is all about is both local control and also individual choice. An unfiltered

environment should not be mandated, and a filtered environment should also not be mandated,”

Morgan said.155 If filtering were to be mandated, Morgan recommended that the bill allow parents to decide whether their children would have filtered or unfiltered access to the Internet.

Parents would need to come to the library just once so that the librarian could program the child’s library card to allow either filtered or unfiltered access to the Internet.156
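Morgan’s proposal amounts to recording a parent’s one-time choice on the child’s library card and consulting that record before each Internet session. The short Python sketch below is purely illustrative; the card numbers, storage scheme, and the default treatment of unregistered cards are assumptions, not features of any bill or library system discussed here.

# Purely illustrative: a per-card record of the parent's one-time choice of
# filtered or unfiltered access, as Morgan proposed. Card numbers, the storage
# scheme, and the filtered-by-default rule are invented for this example.

card_preferences = {}   # library card number -> "filtered" or "unfiltered"

def register_preference(card_number, parental_choice):
    """Librarian records the parent's one-time choice at the library."""
    assert parental_choice in ("filtered", "unfiltered")
    card_preferences[card_number] = parental_choice

def session_is_filtered(card_number):
    """Each Internet session consults the card; unknown cards default to filtered here."""
    return card_preferences.get(card_number, "filtered") == "filtered"

register_preference("A12345", "unfiltered")    # hypothetical card whose parent chose unfiltered access
print(session_is_filtered("A12345"))           # False
print(session_is_filtered("B67890"))           # True (no recorded choice; this sketch defaults to filtered)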

In June 1999, the Senate Commerce Committee endorsed the McCain-Hollings bill with

an amendment in the nature of a substitute and recommended passage of the amended bill.157

The amended bill contained two major revisions—a change in content covered under the bill and the addition of an enforcement provision for libraries with more than one computer connected to the Internet. First, libraries with more than one computer connected to the Internet were required

to “use” a technology on computers with Internet access “in order to filter or block Internet

access through such computers to material that is obscene and child pornography.”158 The original version of S.97, the McCain-Hollings bill, did not include obscenity and child pornography. S.97, as first introduced, would have required libraries and schools to “install” and

“use” a “technology” on computers connected to the Internet in order to “filter or block material

155 Id. at 59 (statement of Candace Morgan, Associate Director of the Fort Vancouver Regional Library in Washington).

156 Id. at 37 (statement of Candace Morgan).

157 Children’s Internet Protection Act, Report of the Comm. on Commerce, Science, & Transportation on S.97, SEN. REP. NO. 106-141 (1st Sess. 1999) at 1, 9.

158 Id. See also Children’s Internet Protection Act, Report of the Committee on Commerce, Science, and Transportation on S.97, SEN. REP. NO. 106-141 (1st Sess. 1999) at 18 & 21.

261

deemed to be harmful to minors.”159 The Committee did not include a “harmful to minors”

provision in the amended bill. The Committee did not explain why it added obscenity and child

pornography to the amended bill or why it deleted the “harmful to minors” clause from the amended bill.

The second major change applied to libraries with more than one computer connected to the Internet. According to the amended version of the bill, libraries with more than one computer connected to the Internet would need to certify to the FCC that they were “enforcing a policy to ensure the operation of the technology during any use of such computers by minors.”160 The original version of the McCain-Hollings bill did not contain an enforcement policy provision for libraries having more than one computer connected to the Internet. However, neither the

McCain-Hollings original bill nor the amended bill would have required the library to install filtering or blocking software if only one computer was connected to the Internet. Instead, both versions of the bill would have required libraries with only one computer connected to the

Internet to enforce a user policy to ensure that minors did not use the computer to gain access to the Internet content specified under the original bill and the amended bill.161

The Commerce Committee did not specifically state why it changed the wording in the

McCain-Hollings bill. However, in explaining the need for filtering legislation, the August 1999

Senate Commerce Committee report used the same language found in McCain’s 1998 filtering

bill rather than in the McCain-Hollings 1999 bill. The report stated that the “purpose of the bill is

to protect America’s children from exposure to obscene material, child pornography, or other

159 S.97IS, 106th Cong. (1st Sess. 1999).

160 S.97RS, 106th Cong. (1st Sess. 1999). See also Children’s Internet Protection Act, Report of the Committee on Commerce, Science, and Transportation on S.97, SEN. REP. NO. 106-141 (1st Sess. 1999) at 18 & 21.

161 S.97IS & S.97RS, 106th Cong. (1st Sess. 1999), Committee on Commerce, ordered to be reported with an amendment in the nature of a substitute favorably.

262

material deemed inappropriate for minors while accessing the Internet from a school or library

receiving Federal Universal Service assistance.”162 Despite the inclusion of the “inappropriate for minors” phrase in its report, the Commerce Committee did not include “inappropriate for minors” or “harmful to minors” as text in the amended version of the McCain-Hollings bill. The

Committee report stated that the Supreme Court has repeatedly upheld the compelling government interest in protecting children from “exposure to sexually explicit material” and that schools and libraries accepting Universal Service funding become “a partner with the Federal government in pursuing this compelling interest.”163 The Committee also stated that “it does not

believe that the use of blocking and filtering technologies is in any way a substitute for

aggressive and responsible oversight by teachers and librarians,” but rather “[s]uch technologies

are intended to be a supplement to, not a replacement for, teacher and librarian efforts to protect

children while on-line.”164

Mary Anne Layden, a clinical psychologist who also testified on the negative effects of

online pornography before a House subcommittee in 1998,165 said that exposure to pornography

162 Children’s Internet Protection Act, Report of the Committee on Commerce, Science, and Transportation on S.97, SEN. REP. NO. 106-141 (1st Sess. 1999) at 1.

163 Id. at 7.

164 Id. at 6.

165 See Legislative Proposals to Protect Children from Inappropriate Materials on the Internet: Hearing before the Subcomm. on Telecomm., Trade, & Consumer Protection of the Comm. on Commerce, 105th Cong. 119 (2d Sess. Sept. 11, 1998) at 55, 84-85 (statement of Mary Anne Layden, Ph.D., Director of Education, Dept. of Psychiatry, Center for Cognitive Therapy, Univ. of Pennsylvania). Layden stated that online pornography could have a greater effect on minors than printed pornography. Children are especially affected by the “anonymity” of online pornography because it “loosens up the inhibitions to antisocial behavior” and “produces an increase in antisocial behavior,” Layden said. Id.

263

distorts children’s sexual beliefs.166 Parents who prefer that their children make their own online

choices are not aware of the damage that pornographic material causes, Layden said.167

Layden, who has treated children in her practice, said that exposure to pornography

impedes natural sexual development.168 She said that “[e]xposing children to pornography meets

the criteria for childhood sexual abuse” because children develop “distorted beliefs (and)

pathological behavior,” which can then lead them to become abusers.169 In addition, she said that

once children are exposed to pornography, the damage is done and cannot be undone,170 in part

because “visual imagery is very powerful.”171 She said that unlike verbal or written messages,

“images are stored permanently” and “are mentally processed as events, as facts . . . and are

stored unbuffered and unchallenged.”172

Layden stated that Internet sex sites contain the three factors that “produce the best

environment to stimulate antisocial behavior in children . . . the combination of anonymity, role

models of behavior, and arousal.”173 She said she was concerned about letting children and their

166 Children’s Internet Protection Act: Hearing before the Comm. on Commerce, Science, & Transportation on S.97, S. Hrg. No. 106-603 (1st Sess., March 4, 1999) at 12 (statement of Mary Ann Layden).

167 Id. at 26.

168 Id. at 12-14, 26, stating that exposure to soft core pornography negatively affects child and adolescent development (statement of Mary Anne Layden). See also Children’s Internet Protection Act, Report of the Committee on Commerce, Science, and Transportation on S.97, SEN. REP. NO. 106-141 (1st Sess. 1999) at 3 (statement of Mary Anne Layden) (reporting on the harms that children may experience from viewing pornography).

169 Children’s Internet Protection Act, Report of the Committee on Commerce, Science, and Transportation on S.97, SEN. REP. NO. 106-141 (1st Sess. 1999) at 12 (statement of Mary Ann Layden).

170 Id. at 36.

171 Id. at 13.

172 Id.

173 Id. Layden stated that research indicated that these three factors stimulated antisocial behavior, but she did not cite the research.

264

parents decide about minors’ access to pornography because those children will interact with other children whose parents do not want them exposed to sexually explicit material.174

Elliot Mincberg, an attorney and Vice President of People for the American Way, told the committee that a bill mandating Internet filtering would “discourage more effective approaches” that could be developed, such as a user policy.175

The Senate Commerce Committee approved S.97, which was placed on the Senate’s

Legislative Calendar under General Orders on Aug. 5, 1999. However, the Senate did not vote

on the bill.176

In February 1999, Franks introduced his second filtering bill of the year, the Childrens’

[sic] Internet Protection Act,177 which was identical to the original version of S.97, the McCain-

Hollings bill, before it was amended.178 Franks’ bill would have required public libraries and

schools that accepted E-rate funding to “select” and “install” a technology for computers with

Internet access “to filter or block material deemed to be harmful to minors.” Libraries with only

one computer connected to the Internet would not be required to install a filtering or blocking

technology, but rather could employ “a reasonably effective alternative means to keep minors

from accessing material on the Internet that is deemed to be harmful to minors.”179 Franks’ bill

was referred to the House Committee on Commerce, which then referred the bill to the

Subcommittee on Telecommunications, Trade, and Consumer Protection, where it died.

174 Id. at 35.

175 Children’s Internet Protection Act: Hearing before the Comm. on Commerce, Science, & Transportation on S.97, S. Hrg. No. 106-603 (1st Sess. March 4, 1999) at 59 (statement of Elliot Mincberg).

176 The author of this dissertation could find no documentation on why the Senate did not vote on S.97.

177 H.R. 543, 106th Cong. (1st Sess. 1999).

178 Childrens’ [sic] Internet Protection Act, S.97, 106th Cong. (1st Sess. 1999).

179 Childrens’ [sic] Internet Protection Act, H.R. 543, 106th Cong. (1st Sess. 1999).

265

In March 1999, Rep. Bob Franks introduced his third filtering bill of the year, H.R. 896,

the Childrens’ [sic] Internet Protection Act.180 Franks’ bill was identical to his second 1999 bill,

H.R. 543,181 and to the McCain-Hollings bill182 in all provisions but one. H.R. 896, Franks’ third

bill, would have required that libraries use filtering or blocking technology on any computer

when that computer was being used by a minor in order to block material deemed “harmful to

minors.”183 The term “harmful to minors” was not defined in the bill. In contrast, Franks’ second bill, H.R. 543, and the McCain-Hollings bill, S.97, required that such technology be used on at least one computer in libraries that had more than one computer connected to the Internet. In addition, for libraries that had only one computer connected to the Internet, Franks’ second 1999 bill, H.R. 543, and the 1999 McCain-Hollings bill, S.97, would have allowed libraries to use an

“effective alternative means” to prevent minors from accessing material deemed harmful.184

H.R.896, Franks’ third filtering bill of 1999, was referred to the House Committee on

Commerce, which then referred the bill to the Subcommittee on Telecommunications, Trade, and

Consumer Protection, where it died in committee.

In 1999, the House of Representatives approved an Internet filtering amendment as an attachment to a juvenile offender bill.185 The House amendment, sponsored by Bob Franks, a

180 Childrens’ [sic] Internet Protection Act, H.R. 896, 106th Cong. (1st Sess. 1999).

181 Childrens’ [sic] Internet Protection Act, H.R. 543, 106th Cong. (1st Sess. 1999).

182 Childrens’ [sic] Internet Protection Act, S.97, 106th Cong. (1st Sess. 1999).

183 Childrens’ [sic] Internet Protection Act, H.R. 896, 106th Cong. (1st Sess. 1999).

184 Childrens’ [sic] Internet Protection Act, H.R. 543, 106th Cong. (1st Sess. 1999); Childrens’ [sic] Internet Protection Act, S.97, 106th Cong. (1st Sess. 1999).

185 Children’s Internet Protection Act of 1999, H. Amdt. 212/Title XIV/Sec. 1401 of Juvenile Justice Reform Act of 1999, H.R. 1501, 106th Cong. (1st Sess. 1999). The name of the act was later changed to the Violent and Repeat Juvenile Offender Accountability and Rehabilitation Act of 1999 in an engrossed amendment, as agreed to by the Senate, H.R. 1501.EAS, 106th Cong., 1st Sess. (1999). An engrossed amendment is the final copy of a bill that includes amendments to the text from floor actions. See Congressional Bills Glossary, available at http://www.gpoaccess.gov/bills/glossary.html.

266

Republican from New Jersey, who had previously introduced three filtering bills in 1999, and

Charles “Chip” Pickering, a Republican from Mississippi, would have amended the

Communications Act of 1934 to require libraries and schools receiving universal service (E-rate) funds to implement “an Internet filtering or blocking technology” to filter or block three types of content: child pornography, obscene materials, and “materials deemed harmful to minors.”186

Franks’ most recent two bills had contained only a “harmful to minors” clause,187 while his first bill contained an “inappropriate for minors” clause.188 In contrast to Franks’ previous filtering bills, the filtering amendment contained two additional types of content: child pornography and obscenity.189 The local library board and school district would determine which type of filtering technology to use.190 The text of the amendment did not contain the definitions of child pornography, obscenity or harmful to minors, but the amendment did state that the terms would have the same meanings as specified in the United States Code. Although child pornography was defined in the section of the code cited in the amendment,191 obscenity was not defined in the cited section of the code.192 According to the U.S. Code, "Material that is harmful to minors"

186 Children’s Internet Protection Act of 1999, H. Amdt. 212/Title XIV/Sec. 1401 of Juvenile Justice Reform Act of 1999, H.R. 1501, 106th Cong. (1st Sess. 1999).

187 Childrens’ [sic] Internet Protection Act, H.R. 543, 106th Cong. (1st Sess. 1999) and Childrens’ [sic] Internet Protection Act, H.R. 896, 106th Cong. (1st Sess. 1999).

188 Safe Schools Internet Act of 1999, H.R. 368, 106th Cong. (1st Sess. 1999).

189 Children’s Internet Protection Act of 1999, H. Amdt. 212/Title XIV/Sec. 1401 of Juvenile Justice Reform Act of 1999, H.R. 1501, 106th Cong. (1st Sess. 1999).

190 145 CONG. REC. H4537 (June 17, 1999), 106th Cong. (1st Sess. 1999) (statement of Rep. Bob Franks).

191 Child pornography is defined as a “visual depiction” that “involves the use of a minor engaging in sexually explicit conduct.” 18 U.S.C. § 2252 (2000).

192 18 U.S.C. § 1460 provides that persons possessing obscene materials with intent to sell shall be fined or imprisoned or both. However, obscenity is not defined in 18 U.S.C. § 1460. Obscenity is defined in Miller v. California, 413 U.S. 15 (1973). The Miller test defines obscenity as 1) whether “the average person, applying contemporary community standards,” would find the work, taken as a whole, appeals to the prurient interest; 2) whether the work depicts or describes, in a patently offensive way, sexual conduct specifically defined by the applicable state law; and 3) whether the work, taken as a whole, lacks serious literary, artistic, political or scientific value. Miller, 413 U.S. at 24.

267

applied to persons under the age of seventeen and was defined as “any communication, picture,

image, graphic image file, article, recording, writing, or other matter of any kind that is obscene

or that—

(A) the average person, applying contemporary community standards, would find, taking the material as a whole and with respect to minors, is designed to appeal to, or is designed to pander to, the prurient interest; (B) depicts, describes, or represents, in a manner patently offensive with respect to minors, an actual or simulated sexual act or sexual contact, an actual or simulated normal or perverted sexual act, or a lewd exhibition of the genitals or post-pubescent female breast; and (C) taken as a whole, lacks serious literary, artistic, political, or scientific value for minors.193

Franks’ amendment did not specify who would determine whether the material met the

definitions of child pornography or obscenity. However, the amendment did specify that local

officials, such as a library board, library, school board or school, would determine which content

would be considered harmful to minors.194 In introducing the amendment, Franks said “children do not have to be actively looking for pornographic web sites to be exposed to adult-only

materials.”195 For example, he reported that pornographic web site operators “frequently” use


193 47 U.S.C. § 231. This section of the code, which contains the Communications Decency Act (CDA) and is part of the Telecommunications Act of 1996, also would have restricted minors’ “access to materials commercially distributed by means of [the] World Wide Web that are harmful to minors.” Id.

194 Children’s Internet Protection Act of 1999, H. Amdt. 212/Title XIV/Sec. 1401(a)(8) of Juvenile Justice Reform Act of 1999, H.R. 1501, 106th Cong. (1st Sess. 1999).

195 145 CONG. REC. H4537 (June 17, 1999) (statement of Bob Franks). As an example, Franks cited whitehouse.com, which at the time was a pornographic site, but later became a political web site and subsequently a home loan modification web site not affiliated with the U.S. government.

268

terms popular with kids on their web sites in order to draw minors to those sites.196 Franks stated that his filtering amendment would require that libraries and schools apply the same restrictions on Internet materials as they had on books. “For generations, schools and libraries have routinely

decided what books are appropriate for our children to read,” he said.197

Rep. W.J. “Billy” Tauzin, a Republican from Louisiana, supported the amendment,

stating that filters are easy to install and inexpensive, especially since the government would

cover most of the costs through the E-rate program.198 Rep. Bobby Rush, a Democrat from

Illinois, opposed the amendment, stating that mandatory filtering would “financially and administratively burden schools and libraries.”199 Rep. Bobby Scott, a Democrat from Virginia,

also opposed the amendment, stating that there had been examples of filters failing to block

pornographic sites while erroneously blocking nonpornographic sites.200

Although the House approved a mandatory filtering amendment as part of the Juvenile

Justice Reform Act of 1999, 201 the Senate version of the Act did not include a filtering

requirement.202 Congress did not enact a juvenile justice bill in 1999 and therefore no filtering

amendment passed as part of the juvenile justice bill.

196 Franks stated that the terms Disney, Nintendo and Barbie were the “most popular” names used by pornography web site operators. 145 CONG. REC. H4537 (June 17, 1999).

197 145 CONG. REC. H4537 (June 17, 1999) (statement of Bob Franks).

198 145 CONG. REC. H4538 (June 17, 1999) (statement of Billy Tauzin).

199 145 CONG. REC. H4537 (June 17, 1999) (statement of Bobby Rush).

200 145 CONG. REC. H4538 (June 17, 1999) (statement of Bobby Scott).

201 Children’s Internet Protection Act of 1999, H. Amdt. 212/Title XIV/Sec. 1401 of Juvenile Justice Reform Act of 1999, H.R. 1501, 106th Cong. (1st Sess. 1999). The filtering amendment was similar to S.97, the Children’s Internet Protection Act introduced by Senators John McCain and Ernest Hollings. The filtering amendment would have denied E-rate funding to schools and public libraries that did not install filters on computers connected to the Internet.

202 However, one section of the Senate version required Internet Service Providers (ISPs) with 50,000 or more subscribers either to provide residential customers with free filtering software or to offer it at cost. Violent and Repeat Juvenile Offender Accountability & Rehabilitation Act of 1999, S. 254, § 1604, 106th Cong. (1st Sess. 1999).


In July 1999, Rep. Ernest Istook, a Republican from Oklahoma, introduced the Child

Protection Act of 1999,203 his second attempt at filtering legislation.204 The bill was cosponsored by Bob Franks, who had introduced three filtering bills and a filtering amendment earlier in the year.205 Istook’s 1999 Child Protection Act would have required all public libraries and schools receiving any federal funding, not just E-rate funding, to install software on any computer accessible to minors “to prevent minors from obtaining access to any obscene information or child pornography using that computer.”206 In contrast to his 1998 bill, which contained only an obscenity provision,207 Istook added child pornography as a second category of content in his

1999 bill.208 He did not explain why he added child pornography to his 1999 bill. The bill required that the software be operational when minors were using the computer. However, the bill allowed minors to have unfiltered access to Internet content that was not obscene or was not child pornography if the minors were supervised by an adult designated by the library or school.


203 H.R. 2560, 106th Cong. (1st Sess. 1999).

204 In 1998, Istook proposed a filtering amendment as part of an appropriations bill. Title VI, Child Protection Act of 1998, Depts. of Labor, Health & Human Servs., & Educ., & Related Agencies Appropriations Act, 1999, H.R.4274, 105th Cong. (2d Sess. 1998). The Istook amendment was not included in the final Omnibus Appropriations Act that Congress passed in 1998.

205 Safe Schools Internet Act of 1999, H.R. 368, 106th Cong. (1st Sess. 1999); Childrens’ [sic] Internet Protection Act, H.R. 543, 106th Cong. (1st Sess. 1999); Childrens’ [sic] Internet Protection Act, H.R. 896, 106th Cong. (1st Sess. 1999); Children’s Internet Protection Act of 1999, H. Amdt. 212/Title XIV/Sec. 1401 of Juvenile Justice Reform Act of 1999, H.R. 1501, 106th Cong. (1st Sess. 1999). Istook’s bill was cosponsored by one Democrat and six Republicans, including Franks.

206 H.R. 2560, 106th Cong. (1st Sess. 1999).

207 Title VI, Child Protection Act of 1998, Depts. of Labor, Health & Human Servs., & Educ., & Related Agencies Appropriations Act, 1999, H.R.4274, 105th Cong. (2d Sess. 1998).

208 Child Protection Act of 1999, H.R. 2560, 106th Cong. (1st Sess. 1999).


Unlike McCain’s bills209 and Franks’ bills,210 Istook’s Child Protection Act did not contain an

“inappropriate for minors” or a “harmful to minors” clause. The bill Istook introduced in 1999 was similar to the bill he introduced in 1998.211 Both bills would have required libraries and schools to install filtering software to receive any federal funding, not just to receive the universal service (E-rate) funds specified in McCain’s bills212 and Franks’ bills.213 Istook’s bill did not provide for as much local control over filtering selection as McCain’s bills214 and Franks’ bills had.215 Instead, under Istook’s bill, the chief executive officer of each state would designate an agency or official to determine “adequate design” of filtering software for libraries and schools within a state’s jurisdiction. For libraries and schools not within the jurisdiction of any state, the U.S. Secretary of Education would determine what constituted “adequate design” of

209 Internet School Filtering Act, S.1619, 105th Cong. (2d Sess. 1998) contained an “inappropriate for minors” provision. Childrens’ [sic] Internet Protection Act, S.97, 106th Cong. (1st Sess. 1999) contained a “harmful to minors” provision.

210 Safe Schools Internet Act of 1998, H.R. 3177, 105th Cong. (2d Sess. 1998) and Safe Schools Internet Act of 1999, H.R. 368, 106th Cong. (1st Sess. 1999) contained an “inappropriate for minors” provision. The Childrens’ [sic] Internet Protection Act, H.R. 543, 106th Cong. (1st Sess. 1999) and Childrens’ [sic] Internet Protection Act, H.R. 896, 106th Cong. (1st Sess. 1999) contained a “harmful to minors” provision. Franks’ filtering amendment, attached to the juvenile offender bill in 1999, also contained a “harmful to minors” provision. See Children’s Internet Protection Act of 1999, H. Amdt. 212/Title XIV/Sec. 1401 of Juvenile Justice Reform Act of 1999, H.R. 1501, 106th Cong. (1st Sess. 1999).

211 Title VI, Child Protection Act of 1998, Depts. of Labor, Health & Human Servs., & Educ., & Related Agencies Appropriations Act, 1999, H.R.4274, 105th Cong. (2d Sess. 1998).

212 Internet School Filtering Act, S.1619, 105th Cong. (2d Sess. 1998) and Childrens’ [sic] Internet Protection Act, S.97, 106th Cong. (1st Sess. 1999).

213 Safe Schools Internet Act of 1998, H.R. 3177, 105th Cong. (2d Sess. 1998); Safe Schools Internet Act of 1999, H.R. 368, 106th Cong. (1st Sess. 1999); Childrens’ [sic] Internet Protection Act, H.R. 543, 106th Cong. (1st Sess. 1999); Childrens’ [sic] Internet Protection Act, H.R. 896, 106th Cong. (1st Sess. 1999); and Children’s Internet Protection Act of 1999, H. Amdt. 212/Title XIV/Sec. 1401 of Juvenile Justice Reform Act of 1999, H.R. 1501, 106th Cong. (1st Sess. 1999).

214 Internet School Filtering Act, S.1619, 105th Cong. (2d Sess. 1998) and Childrens’ [sic] Internet Protection Act, S.97, 106th Cong. (1st Sess. 1999).

215 Safe Schools Internet Act of 1998, H.R. 3177, 105th Cong. (2d Sess. 1998); Safe Schools Internet Act of 1999, H.R. 368, 106th Cong. (1st Sess. 1999); Childrens’ [sic] Internet Protection Act, H.R. 543, 106th Cong. (1st Sess. 1999); Childrens’ [sic] Internet Protection Act, H.R. 896, 106th Cong. (1st Sess. 1999); and Children’s Internet Protection Act of 1999, H. Amdt. 212/Title XIV/Sec. 1401 of Juvenile Justice Reform Act of 1999, H.R. 1501, 106th Cong. (1st Sess. 1999).


filtering software.216 The House did not hold a hearing or engage in floor debate on Istook’s bill.

In written comments submitted to the Congressional Record, Istook urged his colleagues to cosponsor the bill and noted that a Congressional Research Service opinion had concluded that his bill would be constitutional.217 However, Istook’s bill died in the House Committee on Education

and the Workforce. Istook next introduced his bill as an amendment to the Fiscal Year 2000

Labor, Health and Human Services, and Education appropriations budget. By a voice vote, the

House Appropriations Subcommittee added Istook’s filtering amendment to the appropriations

bill on Sept. 23, 1999.218 However, members of the House and Senate appropriations committees

dropped the Internet filtering amendment from that budget.219

The last filtering bill of 1999 was introduced in the U.S. Senate in August by Richard

(Rick) Santorum, a Republican from Pennsylvania. The Neighborhood Children’s Internet

Protection Act220 (NCIPA) would have required public libraries and schools to do one of two

things to receive universal (E-rate) funding: select and install a system to block or filter Internet

“matter” deemed “inappropriate for minors,”221 or adopt and implement an Internet use policy

that addressed “access by minors to inappropriate matter on the Internet and World Wide

216 H.R. 2560, 106th Cong. (1st Sess. 1999).

217 Memorandum from Henry Cohen, Legislative Attorney, Congressional Research Service, to Rep. Ernest J. Istook, Jr., U.S. House of Representatives, Re: Constitutionality of Blocking URLs Containing Obscenity and Child Pornography (June 7, 1999), reprinted in 145 CONG. REC. E 1602-03 (July 20, 1999) (on file with author of this dissertation and also available at http://www.techlawjournal.com/congress/blocking/80629crs.htm).

218 News Release, Am. Library Ass’n, Filtering Amendment Added to Appropriations Bill (Oct. 4, 1999) available at http://www.ala.org/ala/alonline/currentnews/newsarchive/1999/october1999/filteringamendment.cfm. The author could find no primary authority on the subcommittee’s vote.

219 Id. The author could find no documentation on the two committees’ decision or rationale to drop Istook’s amendment.

220 See Neighborhood Children’s Internet Protection Act, S.1545, 106th Cong. (1st Sess. 1999). Sen. Rod Grams (Rep.-Minnesota) cosponsored the bill.

221 Id.


Web.”222 Local school boards, school officials and library officials would be responsible for

selecting and installing the filter223 or adopting and implementing an Internet use policy for

minors.224 The term “inappropriate for minors” was not defined in the bill. Unlike the 1999

McCain-Hollings filtering bill225 and Rep. Franks’ filtering bills,226 Santorum’s proposal did not

specify that libraries and schools had to activate or use the filtering software if they chose

filtering as an option.227 In addition, local libraries and school boards would have been able to

develop and implement their own user policies rather than implement filtering technology.

Santorum’s bill also did not contain provisions pertaining to obscenity or child pornography.

Santorum’s bill would have amended the Communications Act of 1934 by requiring either a user policy or filtering technology. The bill was referred to the Senate Commerce Committee, where it died.

Proposed Legislation in 2000

By 2000, Congress was addressing mandatory Internet blocking and filtering proposals for the third year in a row. Representatives Ernest Istook, a Republican from Oklahoma, and

Charles “Chip” Pickering, a Republican from Mississippi, introduced filtering bills in the House,

but both bills died in committee.

222 Id.

223 Id.

224 Id.

225 Children’s Internet Protection Act, S.97, 106th Cong. (1st Sess. 1999).

226 Safe Schools Internet Act of 1999, H.R. 368, 106th Cong. (1st Sess. 1999); Childrens’ [sic] Internet Protection Act, H.R. 543, 106th Cong. (1st Sess. 1999); Childrens’ [sic] Internet Protection Act, H.R. 896, 106th Cong. (1st Sess. 1999).

227 Neighborhood Children’s Internet Protection Act, S.1545, 106th Cong. (1st Sess. 1999).


In May 2000, Rep. Ernest Istook introduced the Internet Minors Protection and

Cyberspace Technology Act,228 which was cosponsored by Rep. Bob Franks and five other

Republicans.229 The Internet Minors Protection and Cyberspace Technology Act230 was identical

to the Child Protection Act, 231 the bill Istook introduced the previous year, except for a one-

phrase revision pertaining to universal service (E-rate) funding. The Internet Minors Protection

and Cyberspace Technology Act of 2000 would have applied to all public libraries and schools

receiving any federal funds— “or universal service assistance . . . under the Communications Act

of 1934”—if those funds were used to acquire or operate any computer accessible to minors and

if the computer had Internet access. In contrast, Istook’s Child Protection Act of 1999 would have mandated that libraries and schools ensure that filtering is operational on all computers used

by minors if those libraries and schools had received funding from any federal agency.232

However, Istook’s Child Protection Act of 1999 did not specifically list universal service funding.

The Internet Minors Protection and Cyberspace Technology Act of 2000 would have applied only to minors and would have required libraries and schools to ensure that filtering software was installed and “operational” when minors were using computers connected to the Internet.233

The software had to be “adequately designed to prevent minors from obtaining access to any

228 Internet Minors Protection & Cyberspace Tech. Act, H.R. 4545, 106th Cong. (2d Sess. 2000).

229 The cosponsors were Jay Dickey, Rep.-Arkansas; Sue Wilkins Myrick, Rep.-North Carolina; Mark Souder, Rep.- Indiana; Thomas Tancredo, Rep.-Colorado; and Lee Terry, Rep.-Nebraska, all of whom also cosponsored Istook’s 1999 bill. Id. See also Child Protection Act of 1999, H.R. 2560, 106th Cong. (1st Sess. 1999).

230 Internet Minors Protection & Cyberspace Tech. Act, H.R. 4545, 106th Cong. (2d Sess. 2000). Rep. Bob Franks, who sponsored filtering bills in previous years, cosponsored H.R. 4545.

231 Child Protection Act of 1999, H.R. 2560, 106th Cong. (1st Sess. 1999).

232 Id.

233 The bill did not define “operational.”


obscene information or child pornography.”234 However, the bill also allowed the software to be

“temporarily interrupted” to allow minors access to “information that is not obscene, is not child

pornography, or is otherwise unprotected by the Constitution” when those minors were under the

direct supervision of an adult designate of the school or library.235 Unlike filtering proposals introduced by other representatives and senators, Istook’s bill did not include an “inappropriate for minors” or “harmful to minors” clause. The House Committee on Education and the

Workforce referred the bill to the Subcommittee on Early Childhood, Youth and Families in

July, but the bill never made it out of committee.

In June 2000, Rep. Charles “Chip” Pickering introduced the Children’s Internet

Protection Act.236 Ernest Istook and Bob Franks cosponsored the bill, along with sixteen other

Republicans and two Democrats. Pickering’s bill, the Children’s Internet Protection Act (H.R.

4600),237 would have applied only to libraries and schools receiving universal service, or E-rate, discounts. Pickering’s bill would have amended the Communications Act of 1934 by requiring

libraries and schools receiving E-rate funding to select and enforce “a technology” for

computers with Internet access to filter or block obscenity and child pornography. The bill did

not specify software as a requirement. The bill also would have required the technology to block

“material that is harmful to minors” when a minor was using any computer connected to the

Internet.238 A minor was defined as anyone under the age of seventeen. In contrast to previous

234 The Internet Minors Protection & Cyberspace Tech. Act, H.R. 4545, 106th Cong. (2d Sess. 2000). The bill did not define “operational.”

235 Id.

236 Children’s Internet Protection Act, H.R. 4600, 106th Cong. (2d Sess. 2000).

237 Id. The bill was cosponsored by three Democrats and eighteen Republicans, including two Republicans who had previously sponsored filtering legislation: Bob Franks of New Jersey and Ernest Istook of Oklahoma.

238 Id.


bills, H.R. 4600 defined the term “harmful to minors.” The bill defined “harmful to minors” as

“any communication, picture, image, graphic image file, article, recording, writing or other matter of any kind that—

(i) taken as a whole and with respect to minors, appeals to a prurient interest in nudity, sex, or excretion; (ii) depicts, describes, or represents, in a patently offensive way with respect to what is suitable for minors, an actual or simulated sexual act or sexual contact, actual or simulated normal or perverted sexual acts, or a lewd exhibition of the genitals; (iii) taken as a whole, lacks serious literary, artistic, political, or scientific value.239

Local library boards, school boards, or library or school officials would have selected the blocking technology. When adults in libraries and schools were using computers connected to the Internet, “an administrator, supervisor, or other authority” could disable the filtering technology “to enable unfiltered access for bona fide research or other lawful purpose.”240 The terms “bona fide research” and “other lawful purpose” were not defined in the bill. The House

Commerce Committee referred Pickering’s bill to the Subcommittee on Telecommunications,

Trade, and Consumer Protection, where it died.

The Enactment of Mandatory Filtering Legislation in 2000

In December 2000, Congress adopted the text of the Children’s Internet Protection Act and the Neighborhood Children’s Internet Protection Act as two separate amendments to the

Labor, Health and Human Services, and Education Appropriations Act.241 The text of Senator

239 Id.

240 Id.

241 Dept. of Labor, Health & Human Servs., & Educ. Appropriations Act, 2001, H.R. 4577, 106th Cong. (2d Sess. 2000), Pub. L. No. 106-554, 114 Stat. 2763, 2763A-335 (2000) (codified as amended at 20 U.S.C. § 9134(f) and 47 U.S.C. § 254(h)).


John McCain’s amendment to the Consolidated Appropriations Act242 was based on S.97, the

Children’s Internet Protection Act,243 a bill that he and Senator Ernest Hollings had cosponsored in 1999. The text of Senator Rick Santorum’s amendment244 was based on S.1545, the

Neighborhood Children’s Internet Protection Act, which he had introduced in 1999.245

Representatives Ernest Istook of Oklahoma and Charles Pickering of Mississippi, who had introduced their own filtering bills in the House in 2000, supported Senator McCain’s and

Senator Santorum’s filtering amendments.

The McCain Amendment

Although the Children’s Internet Protection Act was based on a bill that Senator McCain had introduced in 1999,246 as well as the filtering amendment he had proposed in the Labor,

Health and Human Services, and Education Appropriations bill in 2000,247 his filtering amendment was modified in committee. McCain’s amendment would have amended the Communications Act of

1934 by requiring libraries and schools accepting E-rate funding to certify to the federal government that they had “selected a technology . . . to filter or block Internet access” to three types of content:

1. material that is obscene;
2. child pornography; and
3. any other material (that the library or school) determines to be inappropriate for minors.248

242 Children’s Internet Protection Act, S.Amdt. No. 3610 to H.R.4577, Consol. Appropriations Act, 2001, 106th Cong. (2d Sess. June 22, 2000).

243 S.97, 106th Cong. (1st Sess. 1999).

244 Neighborhood Children’s Internet Protection Act, S.Amdt. No. 3635 to H.R.4577, Consol. Appropriations Act, 2001, 106th Cong. (2d Sess. June 22, 2000).

245 Neighborhood Children’s Internet Protection Act, S.1545, 106th Cong. (1st Sess. 1999).

246 Children’s Internet Protection Act, S.97, 106th Cong. (1st Sess. 1999).

247 Children’s Internet Protection Act, S.Amdt. No. 3610 to H.R.4577, Consol. Appropriations Act, 2001, 106th Cong. (2d Sess. June 22, 2000).



Sen. McCain’s amendment would have required libraries and schools to certify that they

were “enforcing a policy to ensure the operation of the technology.”249 However, his amendment

did not clearly state that adults would have been able to access material that was “inappropriate

for minors” but was constitutionally protected for adults. The amendment stated only that libraries and schools would be required to filter or block such “material” deemed “inappropriate for minors.”

Sen. McCain’s filtering amendment in 2000 differed from S.97, the McCain-Hollings bill introduced in 1999. Under the McCain-Hollings bill in 1999, public libraries and schools that wanted to receive or retain universal service funding would have been required to “install” and

“use” a “technology” on computers connected to the Internet in order to “filter or block material deemed to be harmful to minors.”250 The McCain-Hollings bill contained an additional provision

for libraries with only one computer connected to the Internet—the library was not required to

use filtering or blocking technology. However, if the library did not use filtering or blocking

technology, the bill would have required the library to certify to the FCC that the library

“employs a reasonably effective alternative means to keep minors from accessing material on the

Internet that is deemed to be harmful to minors.”251 The filtering amendment that McCain

introduced in 2000 did not contain a separate provision for libraries with only one computer

connected to the Internet. McCain’s filtering amendment would have required all libraries to install and use

248 Id.

249 Id.

250 S.97IS, 106th Cong. (1st Sess. 1999).

251 Id.


blocking or filtering technology on any computer connected to the Internet, including those

libraries with only one computer connected to the Internet.

The joint House-Senate Conference Committee amended the content categories in

McCain’s amendment. McCain’s 2000 amendment would have required the use of filtering

technology to prohibit access to obscenity, child pornography, and “any other material [that the

library or school] determines to be inappropriate for minors.” 252 In contrast, the CIPA prohibits

access to three types of sexually explicit visual online content: access by all patrons to “visual

depictions” that are obscene,253 access by all patrons to “visual depictions” that contain child pornography,254 and access by persons under the age of seventeen to “visual depictions” that are

considered “harmful to minors.”255 Unlike McCain’s amendment, the CIPA covers only “visual depictions.”256 The proceedings of the joint House-Senate conference committee, in which the

wording changes were made to include only “visual depictions,” were not made public.257 As stated above, the 1999 McCain-Hollings bill, S.97, would have required libraries and schools to

“install” and “use” a “technology” on computers connected to the Internet in order to “filter or block material deemed to be harmful to minors.”258

252 Children’s Internet Protection Act, S.Amdt. No. 3610 to H.R.4577, Consol. Appropriations Act, 2001, 106th Cong. (2d Sess. June 22, 2000).

253 20 U.S.C. § 9134(f)(1) and 47 U.S.C. § 254(h)(5)(B) & (h)(5)(C) and 47 U.S.C. § 254(h)(6)(B) & (h)(6)(C).

254 20 U.S.C. § 9134(f)(1) and 47 U.S.C. § 254(h)(5)(B) & (h)(5)(C) and 47 U.S.C. § 254(h)(6)(B) & (h)(6)(C).

255 20 U.S.C. § 9134(f)(1)(A) and 47 U.S.C. § 254(h)(5)(B) and 47 U.S.C. § 254(h)(6)(B).

256 20 U.S.C. § 9134 (f)(1) and 47 U.S.C. § 254(h)(5)(B) & (C) and (6)(B) &(C).

257 According to the American Library Association’s Washington office, the compromise “may never be reported out as an official document from the Conference Committee.” See Appropriations Bill Mandates Filter, AM. LIBRARIES, Vol. 13, Issue 8 at p. 14 (September 2000).

258 S.97IS, 106th Cong. (1st Sess. 1999).


The Santorum Amendment

Senator Rick Santorum also introduced an amendment in 2000 that required either filtering technology or an Internet user policy. He introduced the text of the Neighborhood

Children’s Internet Protection Act as an amendment to the House’s Miscellaneous

Appropriations Act.259 Santorum’s amendment would have required libraries and schools accepting E-rate funding to certify to the federal government that they had either installed a

“system” on computers to “filter or block Internet access to matter considered to be inappropriate for minors” or “adopted and implemented an Internet use policy that addresses access by minors to inappropriate matter on the Internet and World Wide Web.”260

Santorum’s 2000 amendment, like his 1999 bill,261 pertained only to “inappropriate matter” and not to obscenity or child pornography. Like his 1999 bill, Santorum’s 2000 amendment would have given libraries and schools a choice of installing a blocking or filtering system or implementing an Internet use policy.262 Unlike McCain’s 2000 amendment, Santorum’s amendment did not mandate the installation or use of a blocking or filtering technology and did not include a provision to prevent adults or minors from accessing obscenity or child pornography.263

Santorum, in supporting his appropriations bill amendment, spoke out against mandatory filtering and blocking technology. He said that local communities—not the federal government—should decide how “to deal in a comprehensive way” with minors’ access to

259 See Misc. Appropriation Act, H.R. 5666, 106th Cong. (2d Sess. 2000).

260 S. Amdt. No. 3635 to H.R. 4577, Consol. Appropriations Act, 2001, 106th Cong. (2d Sess. June 27, 2000).

261 Neighborhood Children’s Internet Protection Act, S.1545, 106th Cong. (1st Sess. 1999).

262 S. Amdt. No. 3635 to H.R. 4577, Consol. Appropriations Act, 2001, 106th Cong. (2d Sess. June 27, 2000).

263 Id.


sexually explicit online material.264 McCain opposed Santorum’s amendment, calling it “a status

quo loophole” that would allow schools and libraries to implement an acceptable use policy

instead of filtering.265

The House referred the Appropriations Act to the House Appropriations Committee,

where the act—and Santorum’s mandatory user policy amendment—died. Santorum later added

the text of the Neighborhood Children’s Internet Protection Act as an amendment to the Labor,

Health and Human Services, and Education Appropriations Act in December 2000,266 which

Congress enacted. However, Congress combined the language of Senator Santorum’s and

Senator McCain’s amendments, thus requiring both an Internet use policy and a “technology

protection measure” to prevent online access to “visual depictions” that are obscene, child

pornography or harmful to minors.267

The Final Legislation

The Children’s Internet Protection Act requires public libraries, public schools and some

nonpublic schools268 to use blocking or filtering technology on all computers with access to the

Internet to receive discounts on Internet access and technology under the universal service, or E-

264 146 CONG. REC. S 5843 (June 27, 2000) (statement of Rick Santorum, Rep.-Pennsylvania).

265 146 CONG. REC. S 5866 (June 27, 2000) (statement of John McCain, Rep.-Arizona).

266 Dept. of Labor, Health & Human Servs., & Educ. Appropriations Act, 2001, H.R. 4577, 106 Cong. (2d Sess. 2000), Pub. L. No. 106-554, 114 Stat. 2763, 2763A-335 (2000) (codified as amended at 20 U.S.C. § 9134 (f) and 47 U.S.C. § 254(h)).

267 Pub. L. No. 106-554, 114 Stat. 2763, 2763A-335 (2000) (codified as amended at 20 U.S.C. § 9134 (f) and 47 U.S.C. § 254(h)).

268 The Federal Communications Commission concluded that for a school to be eligible for universal service discounts, “a school must meet the statutory definition of an elementary or secondary school found in the Elementary and Secondary Education Act of 1965, must not operate as a for-profit business, and must not have an endowment exceeding $50 million. Both public and non-public elementary and secondary schools that meet these criteria will be eligible to receive discounts on eligible services.” See Frequently Asked Questions on Universal Service and the Snowe-Rockefeller Amendment (released July 2, 1997) http://www.fcc.gov/learnnet/ (last visited July 20, 2009).


rate, program.269 The E-rate provides libraries and schools with discounts ranging from 20% to

90% to acquire telecommunications services, Internet services, internal systems, and equipment.270 The E-rate was established under the Telecommunications Act of 1996.271

Public libraries receiving funding under the Library Services and Technology Act also were required to install Internet filters under the Children’s Internet Protection Act.272 The

Library Services and Technology Act (LSTA) applies only to libraries and not to schools. The

LSTA provides funding to libraries so that they can expand services for learning, access to information and educational resources in a variety of formats.273 The LSTA allows libraries to use funds to access information electronically and to acquire or share computer systems and telecommunication technology, including the purchase of Internet access.274 The CIPA amended the LSTA, Title III of the Elementary and Secondary Education Act,275 by mandating that public libraries use Internet filters to receive funding.276

269 47 U.S.C. § 254(h), mandating that blocking technology be installed on all computers connected to the Internet in schools and libraries receiving universal service or E-rate funding, even on those computers with Internet access not funded under the E-rate program. See also, Dept. of Labor, Health & Human Servs., & Educ. Appropriations Act, 2001, H.R. 4577, 106th Cong. (2d Sess. 2000). See also Telecommunications Act of 1996, Pub. L. No. 104-104, 47 U.S.C. § 254.

270 Telecommunications Act of 1996, Pub. L. No. 104-104, 47 U.S.C. § 254. See also United States v. Am. Library Ass’n, 539 U.S. 194, 199 (2003); Children’s Internet Protection Act, Report of the Comm. on Commerce, Science, & Transportation on S.97, S. REP. NO. 106-141 (1st Sess. 1999), at 2; Federal Communications Commission, E-rate, available at http://www.fcc.gov/learnnet/ (last visited July 20, 2009).

271 Telecommunications Act of 1996, 47 U.S.C. § 254

272 See Children’s Internet Protection Act, 20 U.S.C. § 9134(f); 47 U.S.C. § 254(h)(6).

273 Library Servs. & Tech. Act, 20 U.S.C. § 9141(a)(1)(C)(2003).

274 See United States v. Am. Library Ass’n, 539 U.S. 194, 199 (2003) (citing Library Servs. & Tech. Act, 20 U.S.C. § 9141(a)(1)(C)(2003)).

275 20 U.S.C. § 9121, 9134.

276 Children’s Internet Protection Act, 20 U.S.C. § 9134(f).


In addition to requiring Internet filtering technology, the Children’s Internet Protection

Act required libraries and schools to implement an Internet safety policy for minors.277 The

Internet safety policy “includes monitoring the online activities of minors and the operation of a

technology protection measure with respect to any [library or school] computers with Internet

access that protects against access through such computers to visual depictions that are obscene,

child pornography, or harmful to minors.” The safety policy must “address” minors’ access to

“inappropriate matter” on the Internet and “measures designed to restrict minors' access to

materials harmful to minors.” The policy also must “address” minors’ safety and security when

using e-mail, chat rooms, and other forms of direct communications, as well as minors’

unauthorized access to the Internet, including “hacking.” Before implementing the Internet safety

policy, libraries and schools were required to provide reasonable public notice and to hold at

least one public hearing or one public meeting on the safety policy.278

The CIPA prohibits access to three types of sexually explicit visual content online:

access by all patrons to “visual depictions” that are obscene,279 access by all patrons to “visual depictions” that contain child pornography,280 and access by persons under the age of seventeen to “visual depictions” that are considered “harmful to minors.”281 Both obscenity and child

pornography fall under criminal law, and the Supreme Court has held neither is protected by the

First Amendment.282

277 20 U.S.C. § 9134(f)(1)(A), 47 U.S.C. § 254(h)(5)(A) and 47 U.S.C. § 254(h)(6)(A).

278 47 U.S.C. § 254 (l) (1).

279 20 U.S.C. § 9134(f)(1) and 47 U.S.C. §§ 254(h)(5)(B) & (h)(5)(C) and 47 U.S.C. §§ 254(h)(6)(B) & (h)(6)(C).

280 20 U.S.C. § 9134(f)(1) and 47 U.S.C. §§ 254(h)(5)(B) & (h)(5)(C) and 47 U.S.C. §§ 254(h)(6)(B) & (h)(6)(C).

281 20 U.S.C. § 9134(f)(1)(A) and 47 U.S.C. § 254(h)(5)(B) and 47 U.S.C. § 254(h)(6)(B).

282 See Roth v. United States, 354 U.S. 476 (1957) (establishing that obscenity falls outside of First Amendment protection); Miller v. California, 413 U.S. 15 (1973) (observing that the First Amendment does not protect obscenity and establishing the three-part obscenity test); New York v. Ferber, 458 U.S. 747 (1982) (holding that the distribution of child pornography does not receive First Amendment protection).


The CIPA contains a disabling provision to allow unfiltered access to the Internet. Under the E-rate program, adults can request that the filters be disabled for adult use only for “bona fide research or other lawful purpose,”283 whereas under the LSTA program, anyone can request that the filters be disabled for “bona fide research or other lawful purposes.”284 However, libraries receiving LSTA funding are prohibited from turning off the filtering technology for minors if those libraries also receive E-rate funding, because the disabling provision applies only to adults under the E-rate.
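The interaction between the E-rate and LSTA disabling provisions described above can be restated as a short conditional sketch. The following Python fragment is offered only as an illustrative paraphrase of the rules as characterized in this section, not as the statutory text; the function and parameter names are hypothetical labels created for the illustration.

def may_disable_filter(requester_is_adult, receives_erate, receives_lsta, bona_fide_purpose):
    # Illustrative restatement of the CIPA disabling provisions as described
    # in this section (hypothetical names; not statutory language).
    if not bona_fide_purpose:       # request must be for "bona fide research or other lawful purpose(s)"
        return False
    if receives_erate:              # E-rate rule: filters may be disabled only for adult use
        return requester_is_adult
    if receives_lsta:               # LSTA-only rule: any patron may request that filters be disabled
        return True
    return True                     # a library receiving neither program's funds is not bound by the CIPA conditions

# A minor requesting unfiltered access in a library funded by both programs is refused,
# because the narrower E-rate rule still governs that library.
print(may_disable_filter(requester_is_adult=False, receives_erate=True, receives_lsta=True, bona_fide_purpose=True))  # False

Because the E-rate condition is evaluated first in this sketch, a library that accepts both streams of funding may never disable the filter for a minor, which is the result described in the preceding paragraph.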

The Children’s Internet Protection Act did not define “bona fide research or other lawful purpose(s).” In addition, the CIPA did not define “obscenity.” The text of the CIPA indicated that the term “obscene” has been given the meaning found in Section 1460 of Title 18 of the

United States Code,285 but no obscenity definition is provided in that section of U.S. Code.

However, the Supreme Court developed a three-part test defining obscenity in Miller v.

California,286 a landmark case decided in 1973. In the CIPA’s sections on child pornography, the legislation stated that it incorporated the definition of child pornography found in Section 2256


283 United States v. Am. Library Ass’n, 539 U.S. at 201. See also Pub. L. No. 106-554, 114 Stat. 2763, 2763A-335 (2000) (codified as amended at 47 U.S.C. § 254(h)(5)(D) and 47 U.S.C. § 254(h)(6)(D)). Under 47 U.S.C. §§ 254(h)(5)(D) and 254(h)(6)(D), the term “lawful purpose” is singular, whereas under 20 U.S.C. § 9134(f)(3), the term “lawful purposes” is plural.

284 United States v. Am. Library Ass’n, 539 U.S. at 201. See also Pub. L. No. 106-554, 114 Stat. 2763, 2763A-335 (2000) (codified as amended at 20 U.S.C. § 9134(f)(3)). Under 47 U.S.C. §§ 254(h)(5)(D) and 254(h)(6)(D), the term “lawful purpose” is singular, whereas under 20 U.S.C. § 9134(f)(3), the term “lawful purposes” is plural.

285 20 U.S.C. § 9134(f)(7)(D) and 47 U.S.C. § 254(h)(7)(E).

286 413 U.S. 15, 24 (1973). In 1957, the Supreme Court held that obscenity was not protected by the First Amendment. Roth v. United States, 354 U.S. 476 (1957). In Miller, the Court established the current three-part obscenity test: 1) “whether ‘the average person, applying contemporary community standards,’ would find the work, taken as a whole, appeals to the prurient interest; 2) whether the work depicts or describes, in a patently offensive way, sexual conduct specifically defined by the applicable state law; and 3) whether the work, taken as a whole, lacks serious literary, artistic, political or scientific value.” Miller, 413 U.S. at 24.


of Title 18 of the U.S. Code,287 and that section, unlike Section 1460, does provide a definition.288 In defining “harmful to minors,” Congress modified the three-part obscenity test that the Supreme Court established in Miller.289 The CIPA legislation defined “harmful to minors” as—

any picture, image, graphic image file, or other visual depiction that (i) taken as a whole and with respect to minors, appeals to a prurient interest in nudity, sex, or excretion; (ii) depicts, describes, or represents, in a patently offensive way with respect to what is suitable for minors, an actual or simulated sexual act or sexual contact, actual or simulated normal or perverted sexual acts, or a lewd exhibition of the genitals; and (iii) taken as a whole, lacks serious literary, artistic, political, or scientific value as to minors.290

Conclusion

From 1998 to 2000, the U.S. Senate and House tried on several occasions to pass bills that would have required public libraries and most schools to install Internet filters to prevent minors from accessing sexually explicit online material. However, most of the bills died in committee. In 2000, Congress passed Senator McCain’s mandatory filtering amendment and Senator Santorum’s mandatory user policy amendment as part of a major appropriations bill.291 After enacting McCain’s Children’s Internet Protection Act and

Santorum’s Neighborhood Children’s Internet Protection Act, Congress deleted “neighborhood”

287 Pub. L. No. 106-554, 114 Stat. 2763, 2763A-335 (2000) (codified as amended at 20 U.S.C. § 9134 (f)(7)(A) and 47 U.S.C. § 254(h)(7)(F)).

288 18 U.S.C. § 2256(8). “‘[C]hild pornography’ means any visual depiction, including any photograph, film, video, picture, or computer or computer-generated image or picture, whether made or produced by electronic, mechanical, or other means, of sexually explicit conduct, where—(A) the production of such visual depiction involves the use of a minor engaging in sexually explicit conduct; (B) such visual depiction is a digital image, computer image, or computer-generated image that is, or is indistinguishable from, that of a minor engaging in sexually explicit conduct; or (C) such visual depiction has been created, adapted, or modified to appear that an identifiable minor is engaging in sexually explicit conduct.” Id.

289 For the text of the Miller obscenity test, see supra n. 286.

290 Pub. L. No. 106-554, 114 Stat. 2763, 2763A-335 (2000) (codified as 20 U.S.C. § 9134(f)(7)(B)).

291 See H.R. 4577, Consol. Appropriations Act, 2001, 106th Cong. (2d Sess. June 27, 2000).


from the title of the acts and referred to both pieces of legislation as the Children’s Internet

Protection Act, as did the two courts hearing challenges to the CIPA in 2002 and 2003.292

292 Both the U.S. District Court for the Eastern District of Pennsylvania and the United States Supreme Court referred to the enacted legislation as the Children’s Internet Protection Act. See Am. Library Ass’n v. United States, 201 F. Supp. 2d 401 (E.D. Pa. 2002) and United States v. Am. Library Ass’n, 539 U.S. 194 (2003).


CHAPTER 7
COURT DECISIONS ON THE CHILDREN’S INTERNET PROTECTION ACT

Introduction

After Congress enacted the Children’s Internet Protection Act in 2000, its constitutionality as applied to public libraries was challenged in lawsuits by the American

Library Association (ALA), the American Civil Liberties Union, state library associations,

individual public libraries, library patrons and Web site publishers.1 The plaintiffs argued that

the Children’s Internet Protection Act (CIPA) was facially unconstitutional because the CIPA

induced libraries to violate the First Amendment rights of their patrons. The plaintiffs said the

CIPA was overbroad because the law censored a “substantial amount of protected speech” and

created a prior restraint.2 The plaintiffs also argued that the legislation was “unconstitutionally

vague.”3 Finally, the plaintiffs argued that the CIPA required libraries to give up their First

Amendment rights in order to receive federal funding, thereby making the Act impermissible

under the doctrine of unconstitutional conditions.4

Federal District Court Holds the CIPA Unconstitutional

In 2002, the U.S. District Court for the Eastern District of Pennsylvania consolidated the

cases and held that the Children’s Internet Protection Act violated the First Amendment.5 The court said the law was content-based because Internet filters, by their very nature, would restrict

1 The plaintiffs did not challenge the application of the Children’s Internet Protection Act to schools. The CIPA, as applied to schools, is beyond the scope of this dissertation.

2 Am. Library Ass’n v. U.S., 201 F. Supp. 2d 401, 452 (E.D. Pa. 2002).

3 Id. at 407 (citing City of Chicago v. Morales, 527 U.S. 41 (1999)). “It is established that a law fails to meet the requirements of the Due Process Clause if it is so vague and standardless that it leaves the public uncertain as to the conduct it prohibits . . . .” Morales, 527 U.S. at 57 (quoting Giaccio v. Pennsylvania, 382 U.S. 399, 402-03 (1966)).

4 Am. Library Ass’n v. U.S., 201 F. Supp. 2d at 407.

5 Id. at 411.


access to online material based on subject matter.6 The court said that filters block both protected and unprotected speech, thus resulting in unconstitutional content-based restrictions.7 The court found that because of limitations inherent in filtering technology, public libraries adhering to the

CIPA’s conditions would be in violation of the First Amendment as the statute would restrict patrons' access to a “substantial amount” of protected speech.8 As a result of this finding, the court said it did not need to decide whether the CIPA would effect a prior restraint on speech or whether it was unconstitutionally vague.9

In reaching its decision, the district court analyzed the role of public libraries and librarians, the capacity and status of filtering technology, and the First Amendment and public forum doctrine. However, filtering technology was the primary focus of the trial, the court stated.10

6 Id. at 454.

7 Id. at 410-48.

8 Id. at 411.

9 Id. The district court, in trying to explain why it did not have to decide the issues of prior restraint or vagueness, said it had invalidated the CIPA on the grounds that filtering software had “severe limitations” and “less restrictive alternatives” were available. For example, the court said librarians could use privacy screens or recessed monitors. Id. at 484. The court also said that librarians could monitor patrons and sanction patrons who violated the library’s Internet use policy. Id. at 490. The district court wrote:

Hence, even under the stricter standard of facial invalidity proposed by the government, which would require us to uphold CIPA if only a single library can comply with CIPA's conditions without violating the First Amendment, we conclude that CIPA is facially invalid, since it will induce public libraries, as state actors, to violate the First Amendment. Because we hold that CIPA is invalid on these grounds, we need not reach the plaintiffs' alternative theories that CIPA is invalid as a prior restraint on speech and is unconstitutionally vague. Id. at 490.

10 Id. at 408.


Role of Public Libraries

In addressing public libraries, the court said that libraries provide information to patrons

for educational, recreational, professional and other purposes.11 “The mission of public librarians is to provide their patrons with a wide array of information,” the district court wrote.12 Librarians

adhere to professional standards and local acquisition development policies when they purchase

materials for the library’s collections, according to the court. Librarians also use “selection aids,”

such as bibliographies and review journals, to acquire materials that meet the library’s collection

development criteria.13 Librarians sometimes delegate their selection decisions to third-party

vendors, who acquire print and video resources for the library based on the library’s collection

development criteria, the court said.14 A public library’s collection development criteria typically

reflect the library’s evaluation of the material’s quality and patrons’ demand for material.15

According to the court, “Many public libraries include sexually explicit materials in their print collection, such as The Joy of Sex and The Joy of Gay Sex.”16 The court said that few libraries

would carry graphic sexually explicit materials, such as Hustler Magazine or “XXX-rated

videos.”17

The Use of Filtering Technology

In discussing online pornography and Internet usage in public libraries, the district court

acknowledged that the volume of online pornography is “huge” and that public library patrons of

11 Id. at 420.

12 Id. at 421.

13 Id. For a complete discussion of the public library’s role in America, see Chapter 2.

14 Id.

15 Id. at 462.

16 Id. at 420.

17 Id. (emphasis added).


all ages have sought access to it.18 The court said it was “sympathetic” to the government’s

position in wanting to prevent library patrons from accessing visual depictions that are obscene,

child pornography or harmful to minors, which was the content specified under the Children’s

Internet Protection Act.19 However, the court found that filtering technology both underblocked

and overblocked content. Filtering software underblocked content because the filters failed to

block “a substantial amount of speech” that the filters were designed to block, including sexually

explicit Web pages and the categories of content defined by the CIPA, according to the court.20

The court also found that filtering programs overblocked content by preventing users from

accessing “a huge amount of speech that is covered by the First Amendment.”21 Evidence

introduced in the district court trial showed that between 6% and 15% of blocked Web pages

contained no material that met the filtering companies’ definitions of sexually explicit content.22

Filtering software programs miscategorized a variety of legitimate sites, including those covering religious groups, government entities, health issues, education, careers, travel and sports.23

Examples of overblocked Web sites included sites hosted by the Knights of Columbus, Vision

Art Online (which sells wooden religious wall hangings), Wisconsin Right to Life, a Jewish lesbian and gay group, an amputee support group, a California libertarian candidate, a bed and

18 Id. at 406.

19 Id. at 410.

20 Id. at 406, 410, 431, 448.

21 Id. at 448.

22 Id. at 475-76.

23 Id. at 446-50.


breakfast resort, and a home schooling group.24 Most filtering companies do not re-review Web

sites, which also results in both overblocking and underblocking of content, the court said.25

Filtering software designers do not seek input from legal scholars or attorneys before material is categorized as obscenity, child pornography, or harmful to minors, nor do they apply local community standards when categorizing content.26 The district court said the developers’ failure to apply legal standards and definitions in analyzing content was problematic because the filtering software’s blocking criteria or categories did not match the legal definitions of obscenity, child pornography, or

harmful to minors. “[T]he blocking software is (at least for the foreseeable future) incapable of

effectively blocking the majority of materials in the categories defined by CIPA without

overblocking a substantial amount of materials,” the court wrote.27
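The mechanism the court described, in which a library chooses broad vendor-defined categories while the vendor’s proprietary list determines which individual sites fall within them, can be sketched as follows. The Python fragment below is a hypothetical illustration; the category names, URLs, and data structures are invented for the example and do not reproduce any actual filtering product.

# Hypothetical sketch of vendor-defined category filtering as described by the court.
# The vendor supplies an opaque URL-to-category list; the library selects which vendor
# categories to block but cannot see or edit the underlying list.
VENDOR_CATEGORY_LIST = {
    "example-adult-site.test": "adult",
    "example-health-clinic.test": "adult",   # a legitimate site miscategorized by the vendor
    "example-sports-news.test": "sports",
}

LIBRARY_BLOCKED_CATEGORIES = {"adult", "hate_speech"}   # chosen by the library

def is_blocked(url):
    # The decision turns on the vendor's category label, not on any legal test
    # (Miller obscenity, child pornography, or material "harmful to minors"),
    # which is the mismatch the district court found problematic.
    category = VENDOR_CATEGORY_LIST.get(url, "uncategorized")
    return category in LIBRARY_BLOCKED_CATEGORIES

print(is_blocked("example-health-clinic.test"))   # True: an overblocked legitimate site
print(is_blocked("example-sports-news.test"))     # False

As the sketch suggests, the librarian’s only point of control is the set of category names; whether a particular page is blocked depends entirely on how the vendor has labeled it.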

The court also pointed out that because of the nature of proprietary software, librarians do

not participate in filtering decisions.28 Although librarians would be able to choose categories of

content to block, such as nudity, violence or hate speech, they would not know exactly which

Web sites had been blocked because software developers made those determinations. Therefore,

the district court found that the librarians’ use of filters is not a collection decision because librarians and their third party delegates are not making selection decisions based on typical library standards and collection policies.29 For non-Internet collections, such as printed materials

24 Id. at 446-47.

25 Id. at 435.

26 Id. at 429.

27 Id. at 410.

28 Id. at 462.

29 Id. at 462-64. For a discussion of library acquisition policies, see supra Chapter 2, pp. 50-52.


and audio-video resources, librarians do participate in acquisition decisions, a process which is subject to rational basis review.30 A public library’s collection development criteria typically reflect the library’s evaluation of the material’s quality and patrons’ demand for material.31

Because of limited storage space and funding, librarians routinely make content-based decisions when choosing some materials over others, a process subject to rational basis review, rather than strict scrutiny, the court said.32

The First Amendment and Public Forum Doctrine

The district court determined that public forum analysis applied to public libraries, but then had to determine which type of forum applied to which part or parts of the library. As the district court stated, the U.S. Supreme Court has defined three types of fora. A traditional public forum consists of “streets and parks [which] have immemorially been held in trust for the use of the public and, time out of mind, have been used for purposes of assembly, communicating thoughts between citizens, and discussing public questions.”33 A limited, or designated, public forum

“consists of public property which the State has opened for use by the public as a place for

30 Id. at 454, 462. Under the rational basis test, a court will hold a law valid if there is a reasonable relationship between a legitimate government objective and the means used to reach that objective. See Cornelius v. NAACP Legal Defense Fund, 473 U.S. 788, 808 (1985). The CIPA district court stated that librarians routinely make content-based decisions that adhere to professional standards and local acquisition development policies when acquiring materials. When librarians designate some of their acquisition decisions to third-party vendors, those vendors also adhere to the library’s acquisition policies. Id. at 421, 462.

31 Am. Library Ass’n v. U.S., 201 F. Supp. 2d 401, 462 (E.D. Pa. 2002).

Librarians make acquisition decisions based on the library’s mission statement and community preferences. Id. at 420, 462. According to the American Library Association, “No library can make everything available, and selection decisions must be made. Selection is an inclusive process, where the library affirmatively seeks out materials which will serve its mission of providing a broad diversity of points of view and subject matter.” See Am. Library Ass’n, Intellectual Freedom and Censorship Q & A, http://www.ala.org/ala/aboutala/offices/oif/basics/ifcensorshipqanda.cfm (last visited July 20, 2009).

32 Am. Library Ass’n v. U.S., 201 F. Supp. 2d at 408-09.

33 Id. at 454-55 (citing Hague v. CIO, 307 U.S. 496, 515 (1939)).


expressive activity.”34 Examples of a limited public forum are a library meeting room, a state university meeting room, a municipal auditorium and a school board meeting. The third category, a nonpublic forum, consists of all remaining public property that is not generally open to the public for expressive activities,35 such as jails, prisons, military bases and airports.36

The district court determined that a forum analysis applied to Internet access in public libraries and not to printed materials or other media, such as video recordings.37 The court cited three U.S. Supreme Court cases as support for applying public forum doctrine to a part of the library and not to the whole library. In Cornelius v. NAACP,38 the Supreme Court held that the relevant forum was a charity drive and not an entire federal workplace.39 In Perry Education

Association v. Perry Local Educators’ Association,40 the Court found that the relevant forum was the school district’s mail system and teachers’ mailboxes and not the public school as a whole.41 In Widmar v. Vincent,42 the Supreme Court held that the relevant forum was a state

34 Id. at 455 (citing Perry Educ. Ass’n v. Perry Local Educators’ Ass’n, 460 U.S. 37, 46 (1983)).

35 Id. (citing Int’l Soc'y for Krishna Consciousness v. Lee, 505 U.S. 672 (1992)).

36 Id. at 457.

37 Id. at 456.

38 Cornelius v. NAACP, 473 U.S. 788 (1985).

39 Am. Library Ass’n v. U.S., 201 F. Supp. 2d at 456 (citing Cornelius, 473 U.S. at 801). Legal defense and political advocacy organizations challenged the validity of a presidential order which excluded such groups from participating in an annual charity drive aimed at federal employees and conducted in a federal workplace during working hours. The organizations said that the presidential order violated their First Amendment rights to solicit charitable contributions. In a 4-3 vote, with two justices not participating, the Supreme Court said that the campaign was a nonpublic forum and that access could be restricted on the basis of subject matter without violating the First Amendment. The government restrictions were viewpoint neutral and reasonable so as to avoid the appearance of political favoritism and to avoid any potential controversy in the workplace. Cornelius, 473 U.S. at 809-12.

40 Perry Educ. Ass’n v. Perry Local Educators’ Ass’n, 460 U.S. 37 (1983).

41 Am. Library Ass’n v. U.S., 201 F. Supp. 2d at 456 (citing Perry, 460 U.S. at 74). The PEA union was voted in as the exclusive bargaining representative for the school district’s teachers. The collective bargaining agreement stated that no other unions, including PLEA, a rival union, would have access to the school’s mail system, although PLEA and other unions could use other school facilities to communicate with teachers. PLEA, the rival union, challenged the mail system provision, arguing that the provision violated the First Amendment. PLEA argued that the mail system was a limited public forum because private groups that were not connected to the school used the system and because PLEA had had prior unrestricted access to the system before the collective bargaining vote. In a 5-4 vote, the Supreme Court ruled in favor of the PEA, the incumbent union, stating that the mail system provision did not violate the First Amendment. The Court said that the inter-school mail system was a nonpublic forum. The Court said that even though the mail system had been opened for periodic use by church and civic organizations and even though PLEA was allowed to use the mail system before the collective bargaining vote, the mail system could not be construed as a limited public forum. In authoring the Court’s opinion, Justice Byron White wrote, “Implicit in the concept of the nonpublic forum is the right to make distinctions in access on the basis of subject matter and speaker identity. These distinctions may be impermissible in a public forum but are inherent and inescapable in the process of limiting a nonpublic forum to activities compatible with the intended purpose of the property. The touchstone for evaluating these distinctions is whether they are reasonable in light of the purpose which the forum at issue serves.” Perry, 460 U.S. at 49.


university’s meeting facilities and not the campus as a whole.43 “Although these cases dealt with a problem of identifying the relevant forum where speakers are claiming a right of access, we believe that the same approach applies to identifying the relevant forum where the parties seeking access are listeners or readers,” the CIPA district court wrote.44

The CIPA district court labeled Internet access in public libraries as both a designated, or limited, public forum and a traditional public forum. The court wrote, “When the government provides Internet access in a public library, it has created a designated public forum.”45 In labeling Internet access a designated public forum, the district court said that the government had opened Internet use in public libraries “‘for use by the public . . . for expressive activity.’”46 The


42 Widmar v. Vincent, 454 U.S. 263 (1981).

43 See id. at 277. A state university implemented a regulation that banned religious groups from meeting in university meeting rooms that were open to other student groups. In an 8-1 decision, the Supreme Court held that the university’s policy violated the First Amendment because the policy was not content-neutral. Id. at 277. The Court said that the university had created a limited public forum by opening its meeting rooms to student groups. Id. at 267-68. Justice Lewis Powell wrote that, “Having created a forum generally open to student groups, the University seeks to enforce a content-based exclusion of religious speech. Its exclusionary policy violates the fundamental principle that a state regulation of speech should be content-neutral, and the University is unable to justify this violation under applicable constitutional standards.” Id. at 277.

44 Am. Library Ass’n v. U.S., 201 F. Supp. 2d at 456 (emphasis in original).

45 Id. at 457 (citing Mainstream Loudoun v. Bd of Trustees of Loudoun County Library, 24 F. Supp. 2d 552, 563 (E.D. Va. 1998) & Kreimer v. Bureau of Police, 958 F. 2d 1242, 1259 (3d Cir. 1992)).

46 Id. (quoting Perry, 460 U.S. at 45).


court said that libraries “intentionally open their doors to vast amounts of speech” when

providing patrons with Internet access.47

In contrast to the rational basis review that the district court said governed librarians’ decisions on acquiring their print collections,48 the court held that strict scrutiny applied to mandatory Internet filtering in public libraries.49 Even in providing patrons with filtered Internet access, a public library invites patrons to access speech whose content has

never been reviewed or recommended as particularly valuable by either a librarian or a third

party to whom the library has delegated collection development decisions.50 “Where the

government creates a designated public forum to facilitate private speech representing a diverse

range of viewpoints, the government’s decision to selectively single out particular viewpoints for

exclusion is subject to strict scrutiny,” the court wrote.51

In labeling Internet access a traditional public forum, the district court stated that “public libraries’ provision of Internet access promotes First Amendment values in an analogous manner to traditional public fora, such as sidewalks and parks,”52 where “a principal purpose [is] . . . the free exchange of ideas.”53 Similarly, public libraries are “‘designed for freewheeling inquiry’”54

47 Am. Library Ass’n v. U.S., 201 F. Supp. 2d at 463.

48 For an explanation of acquisition policies, see supra notes 11-17 and accompanying text. See also supra Chapter 2, pp. 50-52.

49 Am. Library Ass’n v. U.S., 201 F. Supp. 2d at 461, 470-71.

50 Id. at 421, 463. The court said that the rational basis test does not apply to Internet access as it would when librarians use “editorial discretion” or delegate “editorial discretion” in selecting books because librarians do not have control over which online resources to acquire when commercial filters are used. Third-party vendors typically follow libraries’ acquisition policies when selecting materials for the libraries. Id. at 421.

51 Id. at 460.

52 Id. at 466.

53 Id. (citing Cornelius v. NAACP, 473 U.S. 788, 800 (1985)).

54 Id. at 466 (citing Bd. of Educ. v. Pico, 457 U.S. 853, 915 (1982) (Rehnquist, J., dissenting)).


and are a “‘quintessential locus of the receipt of information,’” according to the CIPA district court.55 By providing patrons with Internet access, the court said that libraries “open their doors to an unlimited number of potential speakers around the world.”56 The court acknowledged that public libraries “[do] not share the historical pedigree of streets, sidewalks and parks as a vehicle of free expression.”57 However, the court said that Internet access “shares many of the characteristics of these traditional public fora that uniquely promote First Amendment values,” including the right to receive speech.58 For example, public libraries are like sidewalks and parks because they are generally open to anyone who wishes to receive speech through the print medium or through the Internet, the court said.59 Like a traditional forum, Internet access provides patrons with access to messages from an unlimited number of speakers, according to the court.60

The district court also said that strict scrutiny applies when Internet access in a public library is considered a traditional public forum.61 The court wrote:

Application of strict scrutiny to public libraries’ content-based restrictions on their patrons’ access to the Internet finds further support in the analogy to traditional public fora, such as sidewalks, parks, and squares, in which content-based restrictions are always

55 Id. (quoting Kreimer v. Bureau of Police for Morristown, 958 F. 2d 1242, 1255 (3d Cir. 1992)). In Kreimer, the federal appellate court stated that a public library was a designated public forum. The U.S. Court of Appeals for the Third Circuit wrote, “[As a] limited public forum, the Library is obligated only to permit the public to exercise rights that are consistent with the nature of the Library and consistent with the government’s intent in designating the Library as a public forum. Other activities need not be tolerated.” Kreimer, 958 F. 2d at 1262.

56 Am. Library Ass’n v. U.S., 201 F. Supp. 2d 401, 468 (E.D. Pa. 2002).

57 Id. at 466.

58 Id. The court used the terms “traditional public forum” and “non-traditional public forum” interchangeably, though the court implied that a “non-traditional public forum” was a modern version of a traditional public forum.

59 Id. at 467.

60 Id. at 468.

61 Id. at 470.


subject to strict scrutiny. Like these traditional public fora, Internet access in public libraries uniquely promotes First Amendment values, by offering low barriers to entry to speakers and listeners.62

“Under strict scrutiny, a public library’s use of filtering software is permissible only if it is narrowly tailored to further a compelling interest and no less restrictive alternative would serve that interest,” the court wrote.63

In applying the strict scrutiny test to Internet filtering in public libraries, the district court first looked at the content prohibited by the Children’s Internet Protection Act.64 The CIPA prohibited all patrons from accessing “visual depictions” that were obscene or child pornography. The CIPA also prohibited minors from accessing “visual depictions” deemed harmful to minors.65 The court acknowledged that public libraries had “an interest” in preventing patrons from engaging in illegal conduct, such as viewing obscenity or pornography.66 The court also recognized that the government has a “compelling interest in protecting the well-being of its youth.”67 The court said that the use of Internet filters could be justified if the filters were narrowly tailored to further the government’s compelling interest and if no less restrictive means existed. However, the court would not accept content-based regulations as a means of fulfilling that interest. “[W]e are constrained to reject any compelling state interest

62 Id. at 489.

63 Id. at 410.

64 Id. at 474-75.

65 20 U.S.C. § 9134(f)(1) and 47 U.S.C. § 254(h)(6).

66 Am. Library Ass’n v. U.S., 201 F. Supp. 2d at 474.

67 Id. at 472.


in regulating patrons’ conduct as a justification for content-based restrictions on patrons’ Internet access,” the court wrote.68

The court said that mandatory Internet filtering was not narrowly tailored to achieve the government’s objective because filtering software overblocked “a substantial amount” of speech not covered under the Children’s Internet Protection Act.69 In finding that filtering programs overblocked content, the court said that the software blocked “thousands of pages” that did not contain “visual depictions” of obscenity, child pornography, or material deemed harmful to minors.70 “The government may not justify restrictions on constitutionally protected speech on the ground that such restrictions are necessary in order for the government effectively to suppress the dissemination of unprotected speech, such as obscenity and child pornography,” the court wrote.71

The court also said filtering software underblocked content because the filters failed to block “a substantial amount of speech” that the filters were designed to block, including sexually explicit Web pages and the categories of content defined by the CIPA.72 “No software exists that can automatically distinguish visual depictions that are obscene, child pornography, or harmful to minors, from those that are not,” the court wrote.73
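
The court’s findings about overblocking and underblocking follow directly from how the list- and keyword-based filters discussed in Chapter 3 operate. The short Python sketch below is purely illustrative; the host list, keyword list, and matching rule are invented for this example and are not drawn from the trial record or from any commercial filtering product. It shows how such a filter can simultaneously block an innocuous page (overblocking) and miss a page it was meant to catch (underblocking).

    # Hypothetical, minimal keyword- and list-based filter (illustration only;
    # commercial filters are proprietary and considerably more complex).
    BLOCKED_HOSTS = {"example-adult-site.test"}        # hand-compiled and always incomplete
    BLOCKED_KEYWORDS = {"xxx", "breast", "escort"}     # crude word list

    def is_blocked(url: str, page_text: str) -> bool:
        """Return True if this naive filter would block the page."""
        host = url.split("/")[2] if "//" in url else url
        if host in BLOCKED_HOSTS:
            return True
        words = set(page_text.lower().split())
        return bool(words & BLOCKED_KEYWORDS)

    # Overblocking: a health page is blocked because of one innocuous keyword.
    print(is_blocked("http://health.example.test/screening",
                     "early breast cancer screening saves lives"))             # True

    # Underblocking: an explicit page is missed because neither its host nor
    # its wording appears on the lists.
    print(is_blocked("http://unlisted-new-site.test/gallery",
                     "explicit adult imagery described with unlisted terms"))  # False

The same structural limitation, that text-matching rules cannot evaluate whether a “visual depiction” meets a legal standard, underlies the court’s statement quoted above.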

In addressing the least restrictive means component of the strict scrutiny test, the district court stated that the government failed to show that less restrictive alternatives were ineffective

68 Id. at 474.

69 Id. at 475-79.

70 Id. at 475.

71 Id. at 477.

72 Id. at 406, 410, 431, 448.

73 Id. at 476.


at furthering the government’s compelling interest in preventing patrons from accessing obscenity, child pornography, and material deemed harmful to minors.74 The district court listed a number of less restrictive alternatives, including user policies that could clearly state that patrons could not use Internet terminals to access “illegal content.”75 If a patron violated the user policy, the librarian could issue a warning or revoke Internet privileges. If the patron accessed obscenity or child pornography, the librarian could notify the police.76 The court said librarians could monitor patrons’ usage of the Internet and give them a “tap on the shoulder” if they were accessing obscenity or child pornography.77

The court said minors could be required to use unfiltered terminals only in children’s rooms where staff had a direct view of the computers.78 The court said librarians could program minors’ library cards to allow filtered or unfiltered access to the Internet, based on parental consent.79 To help patrons avoid accidentally accessing “sexually explicit content,” librarians could offer guidance or turn on optional filtering software.80 To help patrons avoid seeing “sexually explicit content” on other patrons’ computer terminals, librarians could install privacy

74 Id. at 484.

75 Id. at 480.

76 Id. at 475, 481. The court did not discuss how a librarian would determine if the content contained obscenity or child pornography.

77 Id. at 481-82. The court did not list material deemed harmful to minors when discussing the tap-on-the-shoulder technique. The court also stated that librarians might be uncomfortable monitoring patrons’ usage of computers.

78 Id. at 482.

79 Id.

80 Id. at 483.


screens or recessed monitors. Librarians also could place unfiltered computers away from high traffic areas and away from filtered computers.81
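
Several of these alternatives, such as coding minors’ library cards for filtered or unfiltered access based on parental consent and honoring an adult’s request to turn a filter off, reduce to a simple per-patron access policy. The sketch below is a hypothetical illustration of that idea in Python; the field names, age threshold, and rules are assumptions made for this example, not terms taken from the opinion or from any library system.

    from dataclasses import dataclass

    @dataclass
    class Patron:
        age: int
        parental_consent_unfiltered: bool = False   # recorded for a minor at a parent's request
        asked_to_disable_filter: bool = False       # an adult patron's disabling request

    def session_is_filtered(patron: Patron) -> bool:
        """Return True if this patron's Internet session should run through a filter.

        Hypothetical policy: minors receive unfiltered access only with parental
        consent; adults receive unfiltered access simply by asking.
        """
        if patron.age < 18:
            return not patron.parental_consent_unfiltered
        return not patron.asked_to_disable_filter

    # A minor without consent is filtered; an adult who asks is not.
    print(session_is_filtered(Patron(age=12)))                                  # True
    print(session_is_filtered(Patron(age=35, asked_to_disable_filter=True)))    # False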

Under the E-rate program, public libraries must have filtering software in operation even when adults are using computers for Internet access. However, library personnel may disable—but are not required to disable—the filtering software when an adult requests “access for bona fide research or other lawful purpose.”82 The E-rate program prohibits librarians from disabling filters for patrons who are minors.83 The district court said that the disabling provision did not cure the constitutional deficiencies of the CIPA because the provision was not clear and was not narrowly tailored.84 The district court said the meaning of the disabling provision was ambiguous because it was “unclear” whether librarians could disable the filters anytime an adult wished to access speech that is not restricted by the CIPA, or, in other words, “to enable access to all constitutionally protected material.”85 The court faulted Congress for choosing language in the disabling provision of the statute that was inconsistent with the language in the banned content provision of the statute:

If Congress intended CIPA's disabling provisions simply to permit libraries to disable the filters to allow access to speech falling outside of these categories, Congress could have drafted the disabling provisions with greater precision, expressly permitting libraries to disable the filters “to enable access for any material that is not obscene, child pornography, or in the case of minors, harmful to minors,” rather than “to enable access for bona fide

81 Id. However, the court correctly acknowledged that the Children’s Internet Protection Act mandated that filtering software be installed on all computers. “Even the less restrictive alternative of allowing unfiltered access on only a single terminal, well out of the line of sight of other patrons, however, is not permitted under CIPA, which requires the use of a technology protection measure on every computer in the library,” the district court wrote. Id.

82 Id. at 413 (citing Children’s Internet Protection Act § 1721(b) (codified at 47 U.S.C. § 254(h)(6)(D))).

83 Id. at 412-13.

84 Id. at 484-85, 489.

85 Id. at 485.


research or other lawful purposes,” which is the language that Congress actually chose.86

The district court also faulted the disabling provision because it required adults to identify themselves before gaining unfiltered access to the Internet. “The Supreme Court has made clear that content-based restrictions that require recipients to identify themselves before being granted access to disfavored speech are subject to no less strict scrutiny than outright bans on access to such speech,” the district court wrote.87

Because the district court found the CIPA violated the First Amendment by “blocking a very substantial amount of protected speech,” the court said it did not need to address the plaintiffs’ contention that the CIPA was a prior restraint and was unconstitutionally vague.88 The plaintiffs had argued that the CIPA effected an “impermissible prior restraint on speech by granting filtering companies and library staff unfettered discretion to suppress speech before it has been received by library patrons and before it has been subject to a judicial determination.”89

Congress’ Spending Clause and Unconstitutional Conditions Doctrine

The court also said it did not need to decide whether the CIPA violated the doctrine of unconstitutional conditions because the court had already reached its decision based on the plaintiffs’ First Amendment claims, the technological limitations of filtering software, and the

86 Id.

87 Id. at 486 (citing Lamont v. Postmaster Gen., 381 U.S. 301, 307 (1965) (holding that a federal statute requiring the Postmaster General to halt delivery of communist propaganda unless the addressee affirmatively requested the material violated the First Amendment) and Denver Area Educ. Telecomm. Consortium v. FCC, 518 U.S. 727, 732- 33 (1996) (holding unconstitutional a federal law that required cable operators to allow access to patently offensive, sexually explicit programming only to those subscribers who requested access to the programming in advance and in writing)).

88 Id. at 411, 490.

89 Id. at 407.


role of public libraries.90 However, the district court addressed Congress’ spending power in relation to the unconstitutional conditions argument, stating, “[W]e are mindful of the need to frame the disputed legal issues and to develop a full factual record for the certain appeal to the Supreme Court.”91

The American Library Association and the other plaintiffs had argued that the Children’s Internet Protection Act imposed an unconstitutional condition on libraries receiving E-rate and LSTA funds by requiring the libraries to surrender their First Amendment right to provide library patrons with access to constitutionally protected speech.92 The government had contended that the unconstitutional conditions framework did not apply to the CIPA. The government argued that although public libraries are the recipients of the funds, libraries are state actors and therefore are not protected by the First Amendment. The government further argued that although library patrons are protected by the First Amendment, library patrons are not the recipients of the funds.93

Under the Spending Clause of the U.S. Constitution, Congress has the power to tax and spend for the “general welfare.”94 Although the court chose not to decide the issue of Congress’ use of its spending power in relation to the Children’s Internet Protection Act, the district court said the question “is not an easy one.”95

90 Id. at 490-93.

91 Id. at 490.

92 Id. at 490-91.

93 Id. at 491.

94 U.S. CONST., art. I, § 8, cl.1. “The Congress shall have Power To lay and collect Taxes, Duties, Imposts and Excises, to pay the Debts and provide for the common Defence and general Welfare of the United States.”

95 Am. Library Ass’n v. U.S., 201 F. Supp. 2d 401, 493 (E.D. Pa. 2002).


In 1987, in South Dakota v. Dole,96 the Supreme Court had upheld the constitutionality of congressional conditions placed on the allocation of federal funds designated for government activities.97 Both the government and those challenging the CIPA agreed that the Dole test provided the most appropriate framework because the CIPA case involved a challenge to Congress’ spending power.98 However, the parties disagreed on the application of one prong of the four-part Dole test.99

The fourth prong states that congressional spending power “may not be used to induce the States to engage in activities that would themselves be unconstitutional.”100 The government argued that because CIPA’s opponents contended the statute was unconstitutional on its face, they had to show that it was not possible for public libraries to meet the requirements of the CIPA without violating the First Amendment.101 CIPA’s challengers, on the other hand, contended that the law would “induce” public libraries to violate the First Amendment rights of Internet content providers to disseminate information and the First Amendment rights of patrons

96 South Dakota v. Dole, 483 U.S. 203 (1987).

97 Am. Library Ass’n v. U.S., 201 F. Supp. 2d at 450. In South Dakota v. Dole, the Supreme Court upheld the constitutionality of a federal statute that allowed Congress to withhold a percentage of highway funds from states in which persons under the age of twenty-one could legally buy alcoholic beverages. 483 U.S. at 207-08. The Dole Court said that the highway funding statute was a valid exercise of Congress’ spending power and the statute did not violate the twenty-first amendment as the statute did not induce the state to violate anyone’s constitutional rights. Dole, 483 U.S. at 209-11.

98 Am. Library Ass’n v. U.S., 201 F. Supp. 2d at 450. The Dole Court explained that if South Dakota had raised its drinking age to twenty-one, the state would not be violating anyone’s constitutional rights. Dole, 483 U.S. at 209-11.

99 Am. Library Ass’n v. U.S., 201 F. Supp. 2d at 450. According to the four-part Dole test, 1) the exercise of congressional spending power must be in pursuit of “the general welfare”; 2) conditions attached to federal funds that are allocated to states must be “unambiguous” so that states understand the consequences of accepting funds; 3) conditions on federal grants are constitutional only if they relate “to the federal interest in particular national projects or programs” (the Dole Court did not clarify this point or provide examples to illustrate this point); 4) conditions on the allocation of federal funds may be invalid if other constitutional provisions provide an “independent bar” to the enforcement of the conditions. Dole, 483 U.S. at 207-08.

100 Am. Library Ass’n v. U.S., 201 F. Supp. 2d at 450 (citing South Dakota v. Dole, 483 U.S. 203, 210 (1987)).

101 Id. at 451.


to access constitutionally-protected speech on the Internet, and therefore violate the fourth prong of the Dole test.102

The district court said the government can make viewpoint-based funding decisions when the government is the speaker or when the government uses private speakers to discuss a government program.103 In Rust v. Sullivan,104 the district court noted, the Supreme Court upheld against a First Amendment challenge a federal law prohibiting health care clinics from counseling patients on abortion if the clinics accepted federal funding.105 However, the district court said that the CIPA case was not like Rust because the Children’s Internet Protection Act was not communicating a government message or authorizing private speakers to communicate government information. “Even with software filters in place, the sheer breadth of speech available on the Internet defeats any claim that CIPA is intended to facilitate the dissemination of governmental speech,” the district court wrote.106

The CIPA district court said that the government could create public institutions, such as art museums, for the “dissemination of private speech that the government believes to have particular merit.”107 For example, the district court cited NEA v. Finley,108 in which the Supreme Court upheld the use of content-based criteria in awarding grants to artists based on artistic excellence.109 However, the CIPA district court contrasted the Finley case with the CIPA case,

102 Id. at 450-51.

103 Id. at 458.

104 Rust v. Sullivan, 500 U.S. 173 (1991).

105 Am. Library Ass’n v. U.S., 201 F. Supp. 2d 401, 453 (E.D. Pa. 2002).

106 Id. at 493.

107 Id. at 459.

108 Nat’l Endowment of the Arts v. Finley, 524 U.S. 569 (1998).

109 Am. Library Ass’n v. U.S., 201 F. Supp. 2d at 459.


stating that where the government facilitates private speech as broadly as it does by creating a designated public forum for Internet access in public libraries, any speech restrictions must pass the strict scrutiny test.110

The district court said the CIPA case was not like Rust or Finley, but was instead analogous to FCC v. League of Women Voters,111 Arkansas Writers’ Project v. Ragland,112 and Legal Services Corp. v. Velazquez.113 In League of Women Voters, the Supreme Court held that the federal government violated the First Amendment by imposing a content-based regulation that prohibited public broadcasters from editorializing if they received federal funds.114 In Arkansas Writers’ Project, the Supreme Court rejected as unconstitutionally content-based a state law that selectively applied sales tax to some types of publications and not others.115 In Velazquez, the Supreme Court held unconstitutional a funding restriction on the Legal Services Corporation that prevented attorneys from representing clients trying to challenge existing welfare law and disrupted the ordinary functioning of the court system.116

Similarly, the district court said that the Children’s Internet Protection Act violated the First Amendment by interfering with the public libraries’ role in providing the public with

110 Id. at 460-61.

111 FCC v. League of Women Voters, 468 U.S. 364 (1984).

112 Ark. Writers’ Project v. Ragland, 481 U.S. 221 (1987).

113 Legal Servs. Corp. v. Velazquez, 531 U.S. 533 (2001).

114 Am. Library Ass’n v. U.S., 201 F. Supp. 2d at 492-93 (citing FCC v. League of Women Voters, 468 U.S. 364, 366, 402 (1984)).

115 Id. (citing Ark. Writers’ Project v. Ragland, 481 U.S. 221, 223 (1987)). The Supreme Court held that the Arkansas state government violated the First Amendment by subsidizing “newspaper [sic] and religious, professional, trade, and sports journals” and not subsidizing “general interest magazines.” Id. (quoting Ragland, 481 U.S. at 223).

116 Velazquez, 531 U.S. at 542. In authoring the Velazquez opinion, Justice Anthony Kennedy wrote that “the LSC program was designed to facilitate private speech, not to promote a governmental message.”


“controversial, yet constitutionally protected material.”117 The court also said the CIPA “distorts the usual functioning of public libraries” because the CIPA required libraries to

1) deny patrons access to constitutionally protected speech that libraries would otherwise provide to patrons; and 2) delegate decision making to private software developers who closely guard their selection criteria as trade secrets and who do not purport to make their decisions on the basis of whether the blocked Web sites are constitutionally protected or would add value to a public library’s collection.118

Although the district court did not decide the unconstitutional conditions issue, it said the American Library Association and the others challenging the CIPA had “good arguments” to support their claim, arguments they could assert on appeal by relying on the First Amendment rights of either the public libraries or their patrons. They also had “a good argument” that the CIPA’s requirement that public libraries use filtering software distorted the “usual functioning of public libraries” to the extent that the law constituted “an unconstitutional condition on the receipt of funds,” the district court said.119

Supreme Court Upholds the CIPA

In an expedited review in 2003, the U.S. Supreme Court reversed the judgment of the federal district court on a 6-3 vote and upheld the Children’s Internet Protection Act.120 Although the justices were not able to agree on one opinion, six agreed that the Children’s Internet Protection Act did not violate the Constitution.121

117 Am. Library Ass’n v. U.S., 201 F. Supp. 2d 401, 493 (E.D. Pa. 2002).

118 Id.

119 Id.

120 United States v. Am. Library Ass’n, 539 U.S. 194, 198 (2003). The Children’s Internet Protection Act provided for expedited review; thus, the parties losing in federal district court could directly appeal their case to the United States Supreme Court, which they did. See 47 U.S.C. § 54 (2000).

121 United States v. Am. Library Ass’n, 539 U.S. at 214.


Chief Justice William Rehnquist authored the plurality opinion122 and was joined by Justices Sandra Day O’Connor, Antonin Scalia and Clarence Thomas.123 Justices Anthony Kennedy and Stephen Breyer concurred in the judgment only, and each wrote a separate concurring opinion.124 Justices John Paul Stevens, David Souter and Ruth Bader Ginsburg dissented. Stevens wrote one dissenting opinion125 and Souter, joined by Ginsburg, wrote a separate dissenting opinion.126

In the plurality opinion, Chief Justice Rehnquist explained that Congress enacted two types of federal assistance to aid schools and public libraries with acquiring technology: the universal service, or E-rate, program and the Library Services and Technology Act. The E-rate program, established under the Telecommunications Act of 1996, allows qualifying libraries to purchase Internet access at a discounted rate.127 The Library Services and Technology Act (LSTA) provides funding to libraries so that they can access information electronically and fund the acquisition or sharing of computer systems and telecommunications technology.128 According to a Senate report quoted by Justice Rehnquist, the purpose of the LSTA is “to stimulate excellence and promote access to learning and information resources in all types of libraries for individuals of all ages.”129 In authoring the Supreme Court’s opinion on the CIPA,

122 Id. at 198. Only four justices signed onto the Court’s opinion. Two other justices agreed with the judgment only and wrote separate concurring opinions. Three justices dissented. Id. at 215, 220.

123 Id. at 198.

124 Id. at 214 (Kennedy, J., concurring) & 215 (Breyer, J., concurring).

125 Id. at 220 (Stevens, J., dissenting).

126 Id. at 231 (Souter, J., dissenting).

127 Id. at 199 (plurality opinion).

128 Id.

129 Id. at 212 (quoting S. REP. NO. 105-226 at 5 (1998)).


Rehnquist wrote, “Congress became concerned that the E-rate and LSTA programs were facilitating access to illegal and harmful pornography” in public libraries.130

The Use of Filtering Technology

Rehnquist disagreed with the district court’s assessment that congressional goals needed to be met with less restrictive alternatives to filtering or blocking software. The goals of the CIPA were to restrict all library patrons from accessing visual images that constituted obscenity and child pornography and to restrict minors from accessing visual images that would be harmful to minors.131 “We require the Government to employ the least restrictive means only when the forum is a public one and strict scrutiny applies . . . and such is not the case here,” Rehnquist wrote.132

In addressing the filtering software’s problem of “overblocking”—that is, blocking access to constitutionally protected speech—the Supreme Court said that even if erroneous blocking did “present constitutional difficulties,” adult patrons could ask a librarian to unblock a Web site. Adult patrons also could ask a librarian “to ‘disable’ a filter altogether ‘to enable access for bona fide research or other lawful purposes.’”133 The Court did not see a problem with patrons having to identify themselves when asking for unfiltered access to the Internet. “The Constitution does not guarantee the right to acquire information at a public library without any risk of embarrassment,” Rehnquist wrote.134

130 Id. at 200.

131 Am. Library Ass’n v. United States, 201 F. Supp. 2d 401, 410 (E.D. Pa. 2002).

132 United States v. Am. Library Ass’n, 539 U.S. 194, 207 (2003).

133 Id. at 209 (citing 20 U.S.C. 9134 (f)(3)).

134 Id. at 209.


The Congressional Spending Clause

The Supreme Court held that the Children’s Internet Protection Act did not violate library patrons’ First Amendment rights and did not place an unconstitutional burden on public libraries. The Court held that the CIPA was a valid exercise of Congress’ spending powers and did not impose an “unconstitutional condition” on public libraries.135 The Court stated the CIPA did “not directly regulate private conduct; rather, Congress has exercised its Spending Power by specifying conditions on the receipt of federal funds.”136 Historically, the Supreme Court has interpreted Congress’ spending power broadly.137 The Court stated that it is within Congress’ spending power granted in the Constitution to pass laws such as the CIPA.138 The Court said that the government can define the parameters of the programs that it funds139 and “insist that public funds be spent for the purposes for which they were authorized.”140 Rehnquist wrote that “Congress has wide latitude to attach conditions to the receipt of federal assistance in order to

135 Id. at 214.

136 Id. at 203.

137 See Janelle A. Weber, The Spending Clause: Funding a Filth-Free Internet or Filtering out the First Amendment? 56 FLA. L. REV. 471, 472 (2004).

138 United States v. Am. Library Ass’n, 539 U.S. at 203. The Court pointed to previous holdings as support, including: Rust v. Sullivan, 500 U.S. 173, 194 (1991) (prohibiting the use of federal funds in family planning services that provided abortion counseling); Nat’l Endowment for the Arts v. Finley, 524 U.S. 569 (1998) (upholding an art funding program requiring the NEA to make funding decisions based on content-based criteria); Legal Servs. Corp. v. Velazquez, 531 U.S. 533, 542 (2001) (holding that a funding restriction on Legal Services Corporation, which prevents attorneys from representing clients in an attempt to amend or challenge existing welfare law, violated the First Amendment.) Justice Kennedy wrote for the Velazquez Court that, “As we have pointed out, it does not follow . . . that viewpoint-based restrictions are proper when the [government] does not itself speak or subsidize transmittal of a message it favors but instead expends funds to encourage a diversity of views from private speakers . . . . [T]he LSC program was designed to facilitate private speech, not to promote a governmental message.”

139 United States v. Am. Library Ass’n, 539 U.S. at 211.

140 Id. (citing Rust v. Sullivan, 500 U.S. 173, 194 (1991) (holding that the government could prohibit federally- funded family planning providers from using such funds for abortion counseling)).


further its policy objectives141 . . . but Congress may not ‘induce’ the recipient ‘to engage in activities that would themselves be unconstitutional.’”142

Justice Rehnquist compared the CIPA case with previous decisions that upheld exercises of congressional spending power. In Rust v. Sullivan,143 the Court upheld a law that prohibited the use of federal funds in family planning services that provided abortion counseling.144 In Rust, Rehnquist wrote in the CIPA opinion, the Supreme Court held that “‘the government [was] not denying a benefit to anyone, but [was] instead simply insisting that public funds be spent for the purposes for which they were authorized.’”145 Rehnquist said the CIPA case was analogous to Rust: “The E-rate and LSTA programs were intended to help public libraries fulfill their traditional role of obtaining material of requisite and appropriate quality for educational and informational purposes.”146 Rehnquist wrote:

Congress may certainly insist that these public funds be spent for the purposes for which they were authorized . . . . [B]ecause public libraries have traditionally excluded pornographic material from their other collections, Congress could reasonably impose a parallel limitation on its Internet assistance programs.147

In contrast, in Legal Services Corporation v. Velazquez,148 the Supreme Court had held unconstitutional a funding restriction on the Legal Services Corporation preventing attorneys from representing clients trying to amend or challenge existing welfare law. In authoring the

141 Id. at 203 (citing South Dakota v. Dole, 483 U.S. 203, 206 (1987)).

142 Id. (citing Dole, 483 U.S. at 210).

143 Rust v. Sullivan, 500 U.S. 173 (1991).

144 Id. at 194.

145 United States v. Am. Library Ass’n, 539 U.S. at 211 (citing Rust v. Sullivan, 500 U.S. 173, 194 (1991)).

146 Id. at 211.

147 Id. at 211-12.

148 Legal Servs. Corp. v. Velazquez, 531 U.S. 533 (2001).


Velazquez opinion, Justice Anthony Kennedy wrote that “the LSC program was designed to facilitate private speech, not to promote a governmental message.”149 In the CIPA opinion, Rehnquist said that the Velazquez Court “concluded that the restriction on advocacy in such welfare disputes would distort the usual functioning of the legal profession and the federal and state courts before which the lawyers appeared.”150 Rehnquist contrasted the Velazquez case to the CIPA case, stating that public libraries “have no comparable role that pits them against the government, and there is no comparable assumption that they must be free of any conditions that their [government] benefactors might attach to the use of donated funds or other assistance.”151

However, even given Congress’ broad spending authority, it may not “induce” the recipient of government funds “‘to engage in activities that would themselves be unconstitutional,’” Rehnquist wrote on behalf of the Court’s plurality.152 Rehnquist said Congress did not force libraries to install the filtering software but rather refused to subsidize unfiltered Internet access.153 He said libraries must have “broad discretion” in deciding what material to acquire for patrons.154 A public library’s “traditional role” is “identifying suitable and worthwhile material; it is no less entitled to play that role when it collects material from the Internet than when it collects material from any other source,” Rehnquist wrote.155

Rehnquist referred to “two analogous contexts” where government “has broad discretion to make content-based judgments” in deciding what speech to make available to the public. In

149 Id. at 542.

150 United States v. Am. Library Ass’n, 539 U.S. at 213.

151 Id.

152 Id. at 203 (quoting South Dakota v. Dole, 483 U.S. 203, 210 (1987)).

153 Id. at 212.

154 Id. at 204.

155 Id. at 208.


Arkansas Educational Television Commission v. Forbes,156 the Supreme Court held that a political debate on public television was a nonpublic forum.157 The Arkansas Educational Television Commission (AETC), a state agency that owned and operated a network of noncommercial television stations, had excluded an independent candidate from the debate based on time constraints.158 The Supreme Court said that the AETC’s decision to exclude the candidate was a “reasonable, viewpoint-neutral exercise of its journalistic discretion.”159

In National Endowment for the Arts v. Finley,160 the Supreme Court upheld the NEA’s discretion in making content-based decisions for funding artists.161 In discussing the NEA case in the CIPA opinion, Rehnquist wrote that content-based decisions “are a consequence” of grant funding for the arts.162 The Finley Court did not apply public forum analysis because “it would conflict with ‘NEA’s mandate . . . to make esthetic judgments, and the inherently content-based excellence threshold for NEA support,’” Rehnquist wrote.163

The First Amendment and Public Forum Doctrine

In CIPA, Rehnquist said that public libraries, like televised debates and arts funding, are not public forums. Rehnquist wrote,

The principles underlying Forbes and Finley also apply to a public library’s exercise of judgment in selecting material it provides its patrons . . . . Public library staffs necessarily consider content in

156 Ark. Educ. Television Comm’n v. Forbes, 523 U.S. 666 (1998).

157 United States v. Am. Library Ass’n, 539 U.S. at 204 (citing Forbes, 523 U.S. at 672-73).

158 Forbes, 523 U.S. at 670.

159 Id. at 676.

160 Nat’l Endowment for the Arts v. Finley, 524 U.S. 569 (1998).

161 United States v. Am. Library Ass’n, 539 U.S. at 205 (citing Finley, 524 U.S. at 585).

162 Id.

163 Id. (quoting Finley, 524 U.S. at 586).


making collection decisions and enjoy broad discretion in making them.164

Rehnquist said that public forum analysis is “incompatible with the discretion that public libraries must have to fulfill their traditional missions”165 of choosing which material to make available to their audiences. The public library’s goal “has never been to provide universal coverage,” Rehnquist wrote.166

Rehnquist disagreed with the district court’s public forum analysis. Rehnquist said Internet availability in a public library is not a traditional public forum because the Internet has not “‘immemorially been held in trust for the use of the public’” and used “‘for the purposes of assembly, communication of thoughts between citizens, and discussing public questions.’”167 Rehnquist wrote, “We have ‘rejected the view that traditional public forum status extends beyond its historic confines.’”168 The Internet, as a recent resource, has not been held in trust as a place for the public to assemble and communicate, he said.169

Internet access in a public library is not a designated or limited public forum either, said the chief justice in the plurality opinion.170 “To create such a forum, the government must make an affirmative choice to open up its property for use as a public forum,” Rehnquist wrote.171 “A public library does not acquire Internet terminals in order to create a public forum for Web

164 Id.

165 Id.

166 Id. at 204.

167 Id. at 205 (citing Int’l Soc’y for Krishna Consciousness v. Lee, 505 U.S. 672, 679 (1992)).

168 Id. at 206 (citing Ark. Educ. Television Comm’n v. Forbes, 523 U.S. 666, 678 (1998) (holding that public forum doctrine does not apply to the editorial judgments made by public television stations)).

169 Id. at 205.

170 Id.

171 Id. at 206 (paraphrasing Cornelius v. NAACP, 473 U.S. 788, 802-03 (1985) (describing the types of forums)).


publishers to express themselves any more than it collects books in order to provide a public forum for the authors of books to speak.”172 Rehnquist said public libraries did not establish Internet access to allow private speakers to share diverse views. In contrast, a library “provides Internet access to facilitate research, learning, and recreational pursuits by furnishing materials of requisite and appropriate quality.”173

Concurrences

In a concurring opinion, Justice Anthony Kennedy agreed with the judgment of the Court, stating, “The statute is not unconstitutional on its face.”174 Kennedy wrote that “there is little to this case” because adults could obtain unfiltered Internet access by requesting that a librarian disable the filter.175 He said that the district court’s statement that unblocking could take “days” did not appear to be “a specific finding.”176 However, if some libraries could not disable the filter or unblock specific Web sites, or if an adult user would be “burdened” by not being able to view constitutionally protected material, a plaintiff could file an “as applied” challenge to the law, Kennedy wrote.177

172 Id.

173 Id.

174 Id. at 215 (Kennedy, J., concurring).

175 Id. at 214 (Kennedy, J., concurring). Kennedy wrote:

If, on the request of an adult user, a librarian will unblock filtered material or disable the Internet software filter without significant delay, there is little to this case. The Government represents this is indeed the fact . . . . The interest in protecting young library users from material inappropriate for minors is legitimate, and even compelling, as all Members of the Court appear to agree. Given this interest, and the failure to show that the ability of adult library users to have access to the material is burdened in any significant degree, the statute is not unconstitutional on its face. Id.

176 Id. at 214 (Kennedy, J., concurring).

177 Id. at 215.


In a separate concurring opinion, Justice Stephen Breyer agreed with the plurality’s judgment that the CIPA did not violate the First Amendment and therefore was constitutional.178 He also agreed with the plurality’s reasoning that Internet access in public libraries was not a public forum.179 Breyer acknowledged that filtering software could both overblock and underblock content. However, Breyer said, “No one has presented any clearly superior or better fitting alternatives.”180 In addition, Breyer, like Rehnquist and Kennedy, noted that the CIPA contained a disabling provision. Breyer said the disabling provision provided “an important exception that limits the speech-related harm” that overblocking might cause, allowing libraries to permit any adult patron access to an “overblocked” site.181 Justice Breyer stated that the disabling request process is no more “onerous” than other library practices, such as requiring patrons to file requests for books in closed stacks182 or to file requests for interlibrary loan materials.183

Although Breyer agreed with the plurality in its judgment, he said that he reached his decision “in a different way.”184 Breyer applied “a form of heightened scrutiny” to the CIPA to determine whether the act was constitutional. Breyer said that heightened scrutiny “supplements” strict scrutiny “with an approach that is more flexible but nonetheless provides the legislature

178 Id. at 215-16, 220 (Breyer, J., concurring).

179 See id. at 215-16.

180 Id. at 219.

181 Id.

182 Closed stacks are bookshelves and/or rooms not open to the public. Usually a patron receives requested material by giving a librarian sufficient information to get the book for the patron. Closed stacks ordinarily are a security measure, designed to protect expensive, rare, or vulnerable parts of a collection. Sometimes stacks are closed because of limited space for patron browsing.

183 Id. at 219.

184 Id. at 215-16.


with less than ordinary leeway in light of the fact that constitutionally protected speech is at issue.”185 Breyer said that heightened scrutiny was useful when “complex, competing constitutional interests are potentially at issue or speech-related harm is potentially justified by unusually strong government interests.”186

Breyer said the First Amendment did not require the application of the strict scrutiny test to the CIPA because the use of filtering was similar to traditional collection development.187 Breyer said the mandatory filtering requirement was “a kind of ‘selection’ restriction, a kind of editing. It affects the kinds and amount of materials that the library can present to its patrons.”188 Breyer said that libraries often make selection decisions, sometimes out of necessity because of scarce resources and sometimes because of a collection policy.189

Breyer said the two competing interests at stake in the CIPA were protecting minors from accessing material deemed harmful to minors and allowing adults access to material that would be deemed harmful to minors but not to adults. Breyer said the CIPA met his heightened scrutiny test because the government had a “legitimate, and indeed often compelling” interest in restricting minors from accessing material deemed harmful to minors. In addition, Breyer wrote, the government had a “legitimate, and indeed often compelling” interest in restricting all library patrons from accessing obscenity and child pornography.190

185 Id. at 218.

186 Id. at 217.

187 Id. at 216-17.

188 Id. at 216.

189 Id. at 217.

190 Id. at 218. Although the filtering laws apply only to “visual depictions” that are obscene, child pornography or harmful to minors, Breyer did not make that distinction in his opinion.


Dissents

In a dissenting opinion, Justice David Souter, joined by Ruth Bader Ginsburg, stated that the Children’s Internet Protection Act violated the First Amendment because adults would be denied access to “nonobscene material harmful to children but lawful for adult examination.”191 If the CIPA had applied only to minors, Souter said, he would have upheld the law. He said the Supreme Court’s decision in Reno v. ACLU192 established that “the government interest in ‘shielding’ children from exposure to indecent material is ‘compelling.’”193 However, Souter said, the CIPA would deny access to a “substantial amount” of constitutionally protected speech.194

Souter said the Children’s Internet Protection Act imposed “an unconstitutional condition on the Government’s subsidies to local libraries for providing access to the Internet.”195 He stated that the CIPA’s blocking mandate was an “invalid” exercise of Congress’ spending power because the CIPA “mandates action by recipient libraries that would violate the First Amendment’s guarantee of free speech if the libraries took that action entirely on their own.”196

If a public library blocked adults’ Internet access to “material harmful to children,” the act would

191 Id. at 233-34 (Souter, J., dissenting).

192 Reno v. ACLU, 521 U.S. 844 (1997).

193 United States v. Am. Library Ass’n, 539 U.S. 194, 232 (2003) (Souter, J., dissenting) (citing Reno v. ACLU, 521 U.S. 844, 869-70 (1997)). In Reno, the Supreme Court struck down two provisions of the Communications Decency Act. The first provision made the “knowing” transmission of “obscene or indecent” messages to anyone under eighteen a crime. See 47 U.S.C. § 223(a)(1). The second provision prohibited the “knowing” sending or displaying to anyone under eighteen of any message “that, in context, depicts or describes, in terms patently offensive as measured by contemporary community standards, sexual or excretory activities or organs.” See 47 U.S.C. § 223(d). The Court said the CDA was vague because it did not define the terms “indecent” and “patently offensive.” The Court also found the Act overbroad and stated that “[the CDA] . . . lacks the precision that the First Amendment requires when a statute regulates the content of speech.” See Reno, 521 U.S. at 874. For a complete discussion of Reno, see Chapter 5.

194 Id. at 233-34 (Souter, J., dissenting).

195 Id. at 231.

196 Id.


be considered “censorship,” Souter said. The library itself would be imposing an unconstitutional content-based restriction on adults’ access to library holdings, he said.197 Souter wrote:

[W]e can smell a rat when a library blocks material already in its control, just as we do when a library removes books from its shelves for reasons having nothing to do with wear and tear, obsolescence, or lack of demand. Content-based blocking and removal tell us something that mere absence from the shelves does not.198

Souter disagreed with the plurality’s treatment of Internet filtering as a “selection” decision. “The Internet blocking here defies comparison to the process of acquisition,” Souter said. Although libraries traditionally have made choices about what to acquire based on funding, space, and the “quality” and “demand” of material,199 he said, mandatory filtering is unrelated to limitations of money or space because blocking decisions are made after the resource already has been acquired.200 In looking at the “quality” and “demand” aspects of acquisition decisions, Souter said that mandatory filtering is an unconstitutional content-based restriction because Internet filtering would deny adults access to a part of a library’s holdings—online materials that have already been purchased and are available to patrons. “Libraries do not refuse materials to adult patrons on account of their content,” he said.201 Souter wrote:

The proper analogy therefore is not to passing up a book that might have been bought; it is either to buying a book and then keeping it from adults lacking an acceptable ‘purpose,’ or to buying an

197 Id. at 234-35.

198 Id. at 241.

199 Id. at 236.

200 Id. at 237.

201 Id. at 238. Libraries have limited patrons’ access to holdings by imposing constitutional content-neutral regulations, Souter said. For example, libraries “commonly limit access” to rare or valuable materials in order to preserve those materials. Id.


encyclopedia and then cutting out pages with anything thought to be unsuitable for adults.202

Souter said librarians would not even be making an acquisition or selection decision in choosing the blocking and filtering software. Because the software is proprietary, he said, only software developers know exactly which material has been blocked. Even though librarians could select categories—such as nudity or violence—to block, they would not know the actual content blocked in each category.203

Souter said the provision in the CIPA that permitted librarians to disable blocking and filtering for “research or other lawful purposes” still denied adults access to “a substantial amount” of constitutionally protected speech.204 He said that the disabling provision was “superfluous” and “onerous” because of the “uncertainty of its terms.” Souter said charging librarians with determining whether a request for information was for “lawful purposes” would violate the First Amendment because it granted librarians “unduly broad discretion.”205 Souter said librarians do not ask patrons the reason or purpose for requesting other materials in a library’s collection or requesting materials through interlibrary loan.206

In a separate dissent, Justice John Paul Stevens agreed with the plurality on one point, stating that it is “neither inappropriate nor unconstitutional for a local library to experiment with filtering software as a means of curtailing children’s access to Internet Web sites displaying sexually explicit images.”207 However, Stevens argued that the law mandating filtering for all

202 Id. at 237.

203 Id. at 234.

204 Id. at 233-34 (citing 20 U.S.C. § 9134 (f)(3) & 47 U.S.C. § 254 (h)(6)(d)).

205 Id. at 233.

206 Id. at 235, 241.

207 Id. at 220 (Stevens, J., dissenting) (emphasis added).


library patrons would “create a significant prior restraint on adult access to protected speech.”208 He described mandatory Internet filtering as “a law that prohibits reading without official consent.”209 Chief Justice Rehnquist, in the plurality opinion, disagreed with the contention that the CIPA imposed a prior restraint. Rather, Rehnquist said, a library’s decision to use filtering software was a collection decision. He said public libraries are not required to add material to their collections just because the material is constitutionally protected.210 Neither Rehnquist nor Stevens cited precedent in their comments.

Stevens also said the CIPA violated the First Amendment because it required filters to be installed on all computers with Internet access, not just those computers funded with E-rate discounts.211 He argued that “Neither the interest in suppressing unlawful speech nor the interest in protecting children from access to harmful materials” justified the overly broad, federally funded restriction on adult access to protected speech.212 He pointed to the Supreme Court’s 2002 opinion in Ashcroft v. Free Speech Coalition,213 in which the Court held that the ban on virtual child pornography214 was unconstitutional. Stevens reminded the Court that it had said

208 Id. at 225 (emphasis added).

209 Id. at 225.

210 Id. at 209 (plurality opinion).

211 Id. at 226, 230-31 (Stevens, J., dissenting) (emphasis by Stevens). See also 47 U.S.C. § 254(h)(6) and 20 U.S.C. § 9134 (f)(1)(b), stating that a public library must enforce the operation of a technology protection measure during patrons’ use of any computers connected to the Internet.

212 Id. at 222.

213 Ashcroft v. Free Speech Coalition, 535 U.S. 234 (2002).

214 Virtual child pornography uses computer-generated characters that look like real children.


in Ashcroft that “The Government may not suppress lawful speech as the means to suppress unlawful speech.”215

Justice Stevens argued that “overblocking will ‘reduce the adult population . . . to reading only what is fit for children.’”216 Stevens argued that “some library patrons would not make specific unblocking requests” and that therefore the interest of authors of blocked Internet material “in reaching the widest possible audience would be abridged.” In response, Rehnquist, in the plurality opinion, said that a public library’s purpose in acquiring Internet access is “to provide its patrons with materials of requisite and appropriate quality, not to create a public forum for Web publishers to express themselves.”217

Rehnquist challenged the premise that the CIPA violated the First Amendment because it required public libraries to install filtering software on all computers connected to the Internet. Rehnquist wrote that the “CIPA does not ‘penalize’ libraries that choose not to install such software, or deny them the right to provide their patrons with unfiltered Internet access. Rather, CIPA simply reflects Congress’ decision not to subsidize their doing so.”218

Stevens also disagreed with the Court’s interpretations of three significant cases. Stevens cited the 2001 decision Legal Services Corp. v. Velazquez,219 in which the Supreme Court

215 United States v. Am. Library Ass’n, 539 U.S. at 222 (Stevens, J., dissenting) (citing Ashcroft v. Free Speech Coal., 535 U.S. 234, 255 (2002)). In Ashcroft, the Court upheld the ban on actual child pornography and struck down the ban on virtual child pornography.

216 Id. (Stevens, J., dissenting) (quoting Butler v. Michigan, 352 U.S. 380, 383 (1957)). Stevens also cited Ashcroft v. Free Speech Coalition, 535 U.S. 234, 252 (2002) and U.S. v. Playboy Entertainment Group, 529 U.S. 803, 814 (2000), in which the Supreme Court stated that a blanket ban on constitutionally protected speech cannot be upheld if a less restrictive alternative is available. Stevens also cited Reno v. ACLU, 521 U.S. 844, 875 (1997), in which the Supreme Court stated, “The governmental interest in protecting children from harmful materials . . . does not justify an unnecessarily broad suppression of speech addressed to adults.” Id. at 222 (Stevens, J., dissenting).

217 United States v. Am. Library Ass’n, 539 U.S. at 209 (plurality opinion).

218 Id. at 212.

219 Legal Servs. Corp. v. Velazquez, 531 U.S. 533 (2001).


held unconstitutional a funding restriction preventing attorneys from representing clients who wanted to challenge existing welfare law.220 The Velazquez Court said that federal funding that limited the nature of the arguments a nonprofit legal corporation could make on behalf of indigent clients was viewpoint-based content discrimination that violated the First Amendment. The Court had said that the government could not control expression in a way that distorts the functioning of the law.221 In CIPA, Stevens said that the requirement that filtering software be installed on all computers connected to the Internet “distorts” the role of libraries because filters both overblock and underblock content.222 Stevens stated that the federal district court correctly pointed out that filtering technology was imprecise and unable to screen images, a problem that the lower court said was not likely to be remedied in the foreseeable future.223 Because patrons would not even know what had been blocked, they would not know whether to ask for the filtering software to be disabled, Stevens wrote.224 Stevens said the Supreme Court plurality in the CIPA case interpreted Velazquez225 too narrowly. “Velazquez is not limited to instances in which the recipient of Government funds might be ‘pitted’ against the government,” Stevens wrote.226

220 United States v. Am. Library Ass’n, 539 U.S. at 227-28 (Stevens, J., dissenting) (citing Velazquez, 531 U.S. at 543).

221 Id.

222 Id.

223 Id. at 221-22.

224 Id. at 224.

225 Legal Servs Corp. v. Velazquez, 531 U.S. 533 (2001).

226 United States v. Am. Library Ass’n, 539 U.S. at 227-28 (Stevens, J., dissenting). In Velazquez , the Court held unconstitutional a funding restriction on Legal Services Corporation (LSC) preventing attorneys from representing clients trying to amend or challenge existing welfare law. The Velazquez Court said that the LSC program was designed to facilitate private speech, and not to promote a government message. In the CIPA opinion, Chief Justice Rehnquist said that the Velazquez Court “concluded that the restriction on advocacy in such welfare disputes would distort the usual functioning of the legal profession and the federal and state courts before which the lawyers

322

Stevens also argued that the plurality in the CIPA case did not properly interpret Rust v.

Sullivan or National Endowment for the Arts v. Finley. The plurality had relied upon Rust v.

Sullivan227 as precedent that the Court had allowed Congress to limit the use of funds it provided, in that case funds for the discussion of means of birth control.228 However, Stevens said, Rust only

applies to “government speech—that is, situations in which the government seeks to

communicate a message.”229 In contrast, Stevens argued, the E-rate and LSTA programs do not

subsidize government messages, but rather “are designed to provide access . . . to a vast amount

and wide variety of private speech.”230 Stevens noted that in Rosenberger v. Rector,231 the

Supreme Court stated that Rust would not be applicable in instances in which “the government

‘does not itself speak or subsidize transmittal of a message it favors but instead expends funds to encourage a diversity of views from private speakers.’”232 The Rosenberger court held that a

public university’s guideline against funding a religious organization’s publications constituted

viewpoint discrimination and therefore violated the First Amendment.233

appeared.” United States v. Am. Library Ass’n, 539 U.S. at 213. Rehnquist said that, unlike LSC in Velazquez, public libraries “have no comparable role that pits them against the government, and there is no comparable assumption that they must be free of any conditions that their [government] benefactors might attach to the use of donated funds or other assistance.” United States v. Am. Library Ass’n, 539 U.S. at 213.

227 Rust v. Sullivan, 500 U.S. 173 (1991).

228 Id.

229 United States v. Am. Library Ass’n, 539 U.S. at 228 (Stevens, J., dissenting).

230 Id. at 228-29.

231 Rosenberger v. Rector, 515 U.S. 819, 834-35 (1995) (holding that a public university’s guideline against funding a religious organization’s publications constituted viewpoint discrimination and therefore violated the First Amendment).

232 United States v. Am. Library Ass’n, 539 U.S. at 229 (Stevens, J. dissenting) (citing Rosenberger v. Rector, 515 U.S. 819, 834 (1995)).

233 Rosenberger, 515 U.S. at 834-35, 845-46.

323

Justice Stevens said that Finley234 was not analogous to CIPA because library patrons

themselves were not challenging federally imposed restrictions on their access to a federal

program. In Finley, artists who were denied grants challenged as unconstitutional a statute

setting forth criteria on how experts administered a federal grant program.235 The Finley Court

held that the National Endowment for the Arts’ grant-making process did not violate the First

Amendment, stating, “Any content-based considerations that may be taken into account in the

grant-making process are a consequence of the nature of arts funding.”236 In Justice Stevens’

dissent in CIPA, he wrote that, “If this were a case in which library patrons had challenged a

library’s decision to install and use filtering, it would be in the same posture as Finley. Because it

is not, Finley does not control this case.”237 In contrast, under the CIPA, Justice Stevens said, the

government was imposing restrictions on the administration of a federal program.238

Stevens also disagreed with the plurality’s view that a federal mandate was the best way to address the problem of online material deemed harmful to children. Stevens said that libraries should be able to make decisions on the local level and develop their own approaches,239 thus allowing local officials “to tailor their responses to local problems.”240 To support his argument, Stevens cited a law review article that stated, “By allowing the nation’s libraries to develop their own approaches, they may be able to develop a better understanding of

234 Nat’l Endowment for the Arts v. Finley, 524 U.S. 569 (1998).

235 United States v. Am. Library Ass’n, 539 U.S. at 230 (Stevens, J., dissenting) (citing Finley, 524 U.S. at 577).

236 Finley, 524 U.S. at 585 (1998).

237 United States v. Am. Library Ass’n, 539 U.S. at 230 (Stevens, J., dissenting).

238 Id.

239 Id. at 224.

240 Id. at 220.

324

what methods work well and what methods add little or nothing.”241 Stevens noted that the

district court cited examples of “less restrictive” local alternatives public libraries could

implement in place of filtering. Among those ideas was the implementation of local use or

parental permission policies, requiring that a parent or librarian be present when a child was

using an unfiltered computer, or allowing a librarian to program a child’s library card for

unfiltered Internet access, based on a parent’s written consent. Other options included the

installation of privacy screens and recessed monitors. The lower court said local libraries could

impose penalties, such as issuing a warning to patrons who violated Internet use policies or

notifying law enforcement officials if patrons accessed obscenity or child pornography.242

Rehnquist, in the plurality opinion, countered the alternatives to filtering suggested by the lower court and Stevens, stating that they “have their own drawbacks.” He said that having librarians monitor computer users would be “far more intrusive . . . and would risk transforming the librarian from a professional to whom patrons turn for assistance into a compliance officer whom many patrons might wish to avoid.”243 Rehnquist said some of the other options discussed

by the lower court, such as installing privacy screens and moving computers to secluded

locations in the library, would encourage library patrons to view pornography.244

Justice Rehnquist disagreed with Justice Stevens’ analysis of Congress’ spending power

in the context of the CIPA. Stevens had asked if it would be constitutional for Congress to

require mandatory filtering in public libraries rather than allow local libraries to make their own

241 Id. at 224 (citing Gregory Laughlin, Sex, Lies, and Library Cards: The First Amendment Implications of the Use of Software Filters to Control Access to Internet Pornography in Public Libraries, 51 DRAKE L. REV. 213, 279 (2003)).

242 Id. at 223 (citing 201 F. Supp. 2d at 410).

243 Id. at 207 (plurality opinion).

244 Id.

325

decisions that were tailored to local problems. Stevens also had stated that the plurality’s reliance

on Finley245 was “misplaced” because in Finley, unlike in the CIPA, the government was not

imposing restrictions on the administration of a federal program.246 In countering Stevens’

dissent, Rehnquist said that Stevens did not ask the right question. “As the District Court

correctly recognized, we must ask whether the condition that Congress requires ‘would . . . be

unconstitutional’ if performed by the library itself,” Rehnquist wrote in the CIPA opinion.247

Therefore, he said, the Dole case provided the appropriate framework because the filtering legislation did not “directly regulate private conduct; rather, Congress has exercised its Spending

Power by specifying conditions on the receipt of federal funds.”248 Stevens wrote that in Finley,

artists who were denied grants challenged a statute setting forth criteria on how experts

administered a federal grant program.249 In his dissent in the CIPA case, Justice Stevens wrote,

“If this were a case in which library patrons had challenged a library’s decision to install and use filtering, it would be in the same posture as Finley. Because it is not, Finley does not control this case.”250 In contrast, under the Children’s Internet Protection Act, the government was imposing

restrictions on the administration of a federal program, Stevens said.251 Rehnquist said the Finley case was useful when examining the selection process that libraries use in acquiring materials,

245 Nat’l Endowment for the Arts v. Finley, 524 U.S. 569 (1998).

246 United States v. Am. Library Ass’n, 539 U.S. at 229 (Stevens, J., dissenting).

247 Id. at 203 (plurality opinion).

248 Id.

249 Id. at 230 (Stevens, J., dissenting) (citing Finley, 524 U.S. at 577). The Finley Court held that the National Endowment for the Arts’ grant-making process did not violate the First Amendment, stating, “Any content-based considerations that may be taken into account in the grant-making process are a consequence of the nature of arts funding.” Finley, 524 U.S. at 585.

250 United States v. Am. Library Ass’n, 539 U.S. at 230 (Stevens, J., dissenting).

251 Id.

326

however. Like the role of the NEA in awarding art grants, librarians “enjoy broad discretion” in making acquisition decisions, Rehnquist said.252 For questions involving the exercise of Congress’ spending power, though, Rehnquist said the Dole framework applied.253

Conclusion

In 2003, three years after Congress enacted the Children’s Internet Protection Act, the

U.S. Supreme Court, on a 6-3 vote, reversed a federal district court’s decision and upheld the act.254 The CIPA requires public libraries and schools receiving federal E-rate and LSTA funds to install filtering technology on all computers connected to the Internet.255 The goals of the

CIPA were to prevent all library patrons from accessing visual images that are obscene or that constitute child pornography and to prevent minors from accessing visual images deemed harmful to minors.256

The Supreme Court disagreed with the district court’s assessment that congressional goals needed to be met with less restrictive alternatives to filtering or blocking software. The

Supreme Court held that the Children’s Internet Protection Act did not violate library patrons’

First Amendment rights and did not place an unconstitutional burden on public libraries. The

Court said that the CIPA was a valid exercise of Congress’ spending powers.257

Six years after the Supreme Court upheld the Children’s Internet Protection Act, an important question remains: Is the regulatory and technological system established by the CIPA

252 Id. at 205 (plurality opinion).

253 Id. at 203.

254 Id. at 214.

255 Pub. L. No. 106-554, 114 Stat. 2763, 2763A-335 (2000) (codified as amended at 20 U.S.C. 9134(f), 47 U.S.C. 254(h)(6)), mandating that “a blocking technology measure” be installed on “any” computer connected to the Internet at libraries receiving E-rate funding and Library Services and Technology Act (LSTA) funding.

256 United States v. Am. Library Ass’n, 539 U.S. at 410.

257 Id. at 214.

327

capable of doing what Congress is asking of it? The next chapter will examine this issue and identify the gaps between law and technology.

328

CHAPTER 8 BRIDGING THE GAP BETWEEN LAW AND TECHNOLOGY

Introduction

In a third attempt to protect minors from online pornography, Congress enacted the

Children’s Internet Protection Act in 2000.1 Congress first attempted to regulate sexually explicit online content when it passed the Communications Decency Act of 1996, one portion of the

Telecommunications Act of 1996.2 The CDA criminalized the intentional online transmission of

child pornography and “obscene,” “indecent,”3 and “patently offensive” material4 to anyone under the age of 18.5 The Supreme Court struck down the statute’s indecency provisions as unconstitutional in 1997, finding them overbroad and vague.6

A year later, Congress passed the Child Online Protection Act of 1998,7 which would

have prevented commercial Web site operators from intentionally making sexually explicit Web

content available to minors,8 including material that the COPA defined as “harmful to minors.” 9

In contrast to the CDA, the COPA would have applied only to those materials found on the

World Wide Web and not to material located in other places on the Internet, such as in e-mail

1 Pub. L. No. 106-554, signed into law on Dec. 21, 2000, 114 Stat. 2763, 2763A-335 (2000) (codified as amended at 20 U.S.C. 9134(f) and 47 U.S.C. 254(h)).

2 Communications Decency Act of 1996, Pub. L. No. 104-104, 551; 110 Stat. 56, 133-39 (1996). For a discussion of the CDA, see Chapter 5.

3 47 U.S.C. § 223(a)(1)(b).

4 47 U.S.C. § 223(d)(1).

5 47 U.S.C. § 223 (d)(1)(b) (1996). See Commc’ns Decency Act of 1996, Pub. L. No. 104-104, 110 Stat. 56, 133-39 (1996); Telecomm. Act of 1996, Pub. L. No. 104-104 551, 110 Stat. 56, 139 (1996).

6 Reno v. ACLU, 521 U.S. 844, 864 (1997).

7 Pub. L. 105-277, 112 Stat. 2681 (1998) (codified at 47 U.S.C. § 231).

8 Id.

9 See 47 U.S.C. § 231(a)(1).

329

messages, listservs, news groups or live chat rooms.10 The COPA also would have only applied

to commercial Web sites,11 whereas the CDA would have applied to commercial, non-

commercial and nonprofit online communications, as well as individuals’ online

communications.12 After a series of court challenges, a U.S. District Court, in 2007, held that the

COPA facially violated the First Amendment because it was a content-based restriction on speech that failed the strict scrutiny test; the COPA was not narrowly tailored to serve the compelling government interest of protecting minors from material deemed harmful.13

The Children’s Internet Protection Act 14 requires public libraries and most schools

receiving federal technology funds to implement an Internet safety policy and install “a

technology protection measure,” such as blocking or filtering software, on all computers

connected to the Internet to prevent access to sexually explicit images.15 The safety policy requires libraries and schools to monitor minors’ online activities and to monitor the operation of

“a technology protection measure.”16 As of 2009, the only “technology protection measure”

available was blocking and filtering software.17 In 2002, a federal district court held that the

10 H.R. REP. NO. 775, 105th Cong. (2d Sess. 1998) at 12.

11 Id.

12 47 U.S.C. § 223(d).

13 ACLU v. Gonzales, 478 F. Supp. 2d 775, 809-10, 821 (E.D. Pa. 2007). For a discussion of the district court’s opinion, see Chapter 5.

14 Pub. L. No. 106-554, signed into law on Dec. 21, 2000, 114 Stat. 2763, 2763A-335 (2000) (codified as amended at 20 U.S.C. 9134(f) and 47 U.S.C. 254(h).

15 Pub. L. No. 106-554, 114 Stat. 2763, 2763A-335 (2000) (codified as amended at 20 U.S.C. § 9134(f), 47 U.S.C. § 254(h)(6)), mandating that “a blocking technology measure” be installed on “any” computer connected to the Internet at libraries receiving E-rate funding and Library Services and Technology Act (LSTA) funding.

16 20 U.S.C. § 9134(f)(1)(A) and (B), 47 U.S.C. §254(h)(5)(A) and 47 U.S.C. § 254(h)(6)(A). For a detailed discussion of the Children’s Internet Protection Act, see Chapter 7.

17 For a discussion of filtering technology, terminology, strengths and limitations, see Chapter 3.

330

Children’s Internet Protection Act was unconstitutional,18 a ruling that the Supreme Court

reversed in 2003.19 However, the Supreme Court ruled that the CIPA could be challenged on an

“as applied” basis,20 which would allow a library patron to file a lawsuit alleging that the law

was improperly administered under a specific set of circumstances.21

The overarching purposes of this dissertation are to further public understanding of the role of the public library, analyze the legal and practical aspects of implementing mandatory

Internet filtering in public libraries, and examine the technological and regulatory scheme of the

CIPA to determine if it is capable of doing what Congress is asking of it. To that end, this chapter will first examine the First Amendment and the role of the public library in providing access to information. Second, this chapter will offer a critique of Internet filtering as it is applied to public libraries. Third, this chapter will evaluate the Supreme Court’s efforts and use of tests in balancing three compelling and sometimes competing interests: protecting minors from material deemed harmful, upholding adults’ access to constitutionally protected speech, and upholding the parental right in childrearing. Finally, this chapter will offer suggestions for future research.

The First Amendment Right to Receive Information and the Role of the Public Library

Several theories of the First Amendment apply to accessing information in public libraries, as discussed in Chapter 2. The right to receive ideas and information and self-

18 Am. Library Ass’n v. United States, 201 F. Supp. 2d 401 (E.D. Pa. 2002). See Chapter 6 for a legislative history of the Children’s Internet Protection Act and Chapter 7 for a discussion of the two court cases on the CIPA.

19 United States v. Am. Library Ass’n, 539 U.S. 194 (2003), rev’g Am. Library Ass’n v. United States, 201 F. Supp. 2d 401 (E.D. Pa. 2002).

20 United States v. Am. Library Ass’n, 539 U.S. at 215 (Kennedy, J., concurring).

21 The CIPA contains a disabling provision that allows librarians to disable the filter for adults for “bona fide research or other lawful purposes.” See 47 U.S.C. § 254(h)(5)(D) and 47 U.S.C. § 254(h)(6)(D)). For a discussion of the Children’s Internet Protection Act and its disabling provision, see Chapter 7.

331

fulfillment theories are, perhaps, the two most applicable theories because they can be applied solely to the recipient of information.

William E. Lee stated that the right to receive information doctrine is important and useful because it “restricts the government’s power to interfere with the recipient of the communication.”22 Lee said that the Supreme Court has failed to tie the right to receive information and ideas to the recipient, independent of the speaker.23 Lee argued it would make more sense to recognize the right to receive information only in situations where a speaker has a right to speak.24 Lee most likely would not apply the right to receive information to public libraries because speakers do not have the right to speak in that venue.25

In contrast, First Amendment scholar Thomas Emerson said that there is a First

Amendment right to receive and obtain communication, independent of or supplemental to the right of the speaker to communicate information.26 He has argued that the “right to read, listen, or see is so elemental” that it deserves full First Amendment protection.27 This “right to know” is of “vital importance in a democratic society”28 and is an affirmative right, in contrast to the

22 William E. Lee, The Supreme Court and The Right to Receive Expression, 1987 SUP. CT. REV. 303, 343 (1987).

23 Id.

24 Id. at 344.

25 See Brown v. Louisiana, 383 U.S. 131, 142 (1966) (overturning the convictions of five African-American men who were convicted of breach of peace after staging a peaceful and silent protest against segregation).

26 Thomas I. Emerson, Symposium, The First Amendment and the Right to Know—Legal Foundations of the Right to Know, 1976 WASH. U. L. Q. 1, 2 (1976).

27 Id. at 6.

28 Id. at 1.

332

negative right of being free from government interference.29 In Emerson’s view, the right to

know is necessary for self-fulfillment and should receive “direct constitutional protection.”30

Emerson’s “right to know” argument31 is applicable to Internet access in public libraries,

even though his 1976 seminal article was published more than a decade before Internet access

was available to the public. He maintained that the listener’s interests may not always coincide with the speaker’s interests, and therefore the listener’s or recipient’s interests would have

“greater weight when they are based upon an independent legal foundation.”32 Emerson wrote,

“The right to read, listen or see is so elemental, so close to the source of all freedom, that one can

hardly conceive of a system of free expression that does not extend it full protection.”33

The Supreme Court has twice applied the right to receive ideas and information doctrine to public libraries in Court plurality decisions, but did not mention it when evaluating the CIPA.

In Brown v. Louisiana,34 in 1966, Justice Abe Fortas, authoring the plurality opinion, wrote that

“a public library [is] a place dedicated to quiet, to knowledge, and to beauty.”35 Although Justice

Fortas’ statement did not directly refer to the right to receive information, his reference to

“knowledge” implies the right to receive information and ideas in public libraries.

29 Id. at 2.

30 Id. at 2, 6.

31 Id. at 2.

32 Id.

33 Id. at 6.

34 Brown v. Louisiana, 383 U.S. 131 (1966) (overturning the convictions of five African-American men who were convicted of breach of peace after staging a peaceful and silent protest against segregation.)

35 Id. at 142.

333

Nearly two decades later, in 1982, a Supreme Court plurality used the right to receive

information doctrine in a case involving a public school library. In Board of Education v. Pico,36 the Court voted 5-4 in holding that a school board violated the First Amendment when it ordered the removal of books from a junior high school library and senior high school library. Justice

William Brennan, who authored the plurality opinion, referred to public libraries and the right to receive ideas and information when he wrote, “A school library, no less than any other public library, is ‘a place dedicated to quiet, to knowledge, and to beauty’ (where) ‘students must always remain free to inquire, to study and to evaluate, to gain new maturity and understanding.’”37 Justice William Rehnquist’s dissent in Pico supported the public library’s

mission in providing access to a wide variety of information and ideas, the very information and

ideas that filtering software limits. Rehnquist wrote,

Unlike university or public libraries, elementary and secondary school libraries are not designed for freewheeling inquiry; they are tailored, as the public school curriculum is tailored, to the teaching of basic skills and ideas.38

Lower federal courts also have applied the First Amendment right to receive information doctrine to public libraries.39 In Kreimer v. Bureau of Police,40 in 1992, the U.S. Court of Appeals for the Third Circuit found that the local government in Morristown and Morris

Township, New Jersey, had made the library a limited public forum for the right to receive

36 Bd. of Educ. v. Pico, 457 U.S. 853 (1982).

37 Pico, 457 U.S. at 868-69 (citing Brown v. Louisiana, 383 U.S. 131, 142 (1966) and Keyishian v. Bd. of Regents, 385 U.S. 589, 603 (1967)).

38 Id. at 915 (Rehnquist, J., dissenting).

39 Kreimer v. Bureau of Police, 958 F. 2d. 1242 (3d Cir. 1992); Mainstream Loudoun v. Bd. of Trustees of Loudoun County, 24 F. Supp. 2d. 552 (E.D. Va. 1998); Am. Library Ass’n v. United States, 201 F. Supp. 2d 401 (E.D. Pa. 2002). For a discussion of public forum doctrine, see Chapter 2.

40 Kreimer, 958 F. 2d. at 1242.

334

information, but not for personal expression, such as making speeches.41 In Mainstream

Loudoun v. Board of Trustees of Loudoun County,42 in 1998, a federal district court judge in

Virginia held that library buildings and Internet access inside the libraries were limited public

fora,43 and that patrons had a right to receive information through the Internet.44 In American

Library Association v. U.S.,45 in 2002, the federal district court that struck down the Children’s

Internet Protection Act stated that the right to receive information and ideas is fundamental to a

free society46 and to the role of the public library.47 The Supreme Court, which reversed the

district court’s ruling in 2003 and upheld the Children’s Internet Protection Act, did not address

the right to receive information in the CIPA case.48

For more than seventy years, public libraries have supported the patron’s right to receive

information. Libraries have encouraged public inquiry by providing open access to all patrons.49

One goal of public libraries is to fight censorship50 in all formats or media.51 For example, the

American Library Association’s Library Bill of Rights states that “libraries should challenge

41 Id. at 1256-63.

42 Mainstream Loudoun, 24 F. Supp. 2d. at 552.

43 Id. at 563.

44 Id.

45 Am. Library Ass’n v. United States, 201 F. Supp. 2d 401 (E.D. Pa. 2002).

46 Id. at 451 (citing Stanley v. Georgia, 394 U.S. 557, 564 (1969)).

47 Id. at 466.

48 United States v. Am. Library Ass’n, 539 U.S. 194 (2003).

49 See AM. LIBRARY ASS’N, LIBRARY BILL OF RIGHTS, available at http://www.ala.org/ala/aboutala/offices/oif/statementspols/statementsif/librarybillrights.cfm and http://www.ala.org/ala/aboutala/offices/oif/statementspols/statementsif/librarybillofrights.pdf.

50 See id.

51 See Am. Library Ass’n, Access to Electronic Information, Services, and Networks: An Interpretation of the Library Bill of Rights, available at http://www.ala.org/ala/aboutala/offices/oif/statementspols/statementsif/interpretations/default.cfm.

335

censorship in the fulfillment of their responsibility to provide information and enlightenment”52 and “libraries should cooperate with all persons and groups concerned with resisting abridgment of free expression and free access to ideas.”53 In addressing minors’ access to public library materials, the ALA’s Library Bill of Rights states that a person’s right to use a public library should not be denied or abridged because of age.54

In addressing minors’ right to receive information, historian and attorney Catherine Ross has stated that, despite the Supreme Court’s recognition of minors’ First Amendment rights, the

Court has not provided much guidance concerning the age and circumstances under which a minor may receive information.55 Professor Ross and other commentators have argued that older minors, whom they define as teenagers, have greater First Amendment rights than younger ones.56 Ross said that teenagers have the right to receive information,57 with or without parental approval,58 and that Internet filters in public libraries interfere with that right.59 According to

52 AM. LIBRARY ASS’N, LIBRARY BILL OF RIGHTS, art. III, available at http://www.ala.org/ala/aboutala/offices/oif/statementspols/statementsif/librarybillrights.cfm and http://www.ala.org/ala/aboutala/offices/oif/statementspols/statementsif/librarybillofrights.pdf.

53 AM. LIBRARY ASS’N, LIBRARY BILL OF RIGHTS, art. IV, available at http://www.ala.org/ala/aboutala/offices/oif/statementspols/statementsif/librarybillrights.cfm and http://www.ala.org/ala/aboutala/offices/oif/statementspols/statementsif/librarybillofrights.pdf.

54 See AM. LIBRARY ASS’N, LIBRARY BILL OF RIGHTS, available at http://www.ala.org/ala/aboutala/offices/oif/statementspols/statementsif/librarybillrights.cfm and http://www.ala.org/ala/aboutala/offices/oif/statementspols/statementsif/librarybillofrights.pdf.

55 Catherine J. Ross, An Emerging Right for Mature Minors to Receive Information, 2 U. PA. J. CONST. L. 223, 223- 26 (1999).

56 Id. at 224-25. See also Amitai Etzioni, Symposium, Do Children Have the Same First Amendment Rights as Adults?: On Protecting Children from Speech, 79 CHI.-KENT. L. REV. 3, 43 (2004); Sidne Koenigsberg, Print Symposium, Contract Options for Individual Artists: Library Records Open to Parental Scrutiny: A New Set of Internet Access Controls for Minors?, 29 COLUM. J.L. & ARTS 361, 376 (2006); Gregory Laughlin, Sex, Lies and Library Cards: The First Amendment Implications of the Use of Software Filters to Control Access to Internet Pornography in Public Libraries, 51 DRAKE L. REV. 253, 254 (2003); Dawn Nunziato, Symposium, Do Children Have The Same First Amendment Rights as Adults?: Toward a Constitutional Regulation of Minors' Access To Harmful Internet Speech, 79 CHI.-KENT. L. REV. 121, 121-22 (2004).

57 Ross, An Emerging Right for Mature Minors, supra note 55, at 224-25.

58 Ross, An Emerging Right for Mature Minors, supra note 55, at 275.

336

Ross, minors’ right to receive information is most applicable when they have autonomy rights

regardless of parental preferences. Ross explained that minors have autonomy rights in instances

in which they legally are able to make their own decisions without parental permission.

Examples of these autonomy rights include the right to exercise individual religious beliefs and

the right to contraception and sexuality choice.60 Ross and commentator Dawn Nunziato said

that although the right to receive does not seem to fully apply to minors, mature minors in

particular still need access to diverse information for individual self-exploration, to develop values and autonomy, and to acquire the tools they will need for self-governance when they reach adulthood.61

Columbia University law student Sidne Koenigsberg argued that Ross’ analysis, while

doctrinally sound, is too limited. Koenigsberg stated that minors should have freedom in “more

mundane circumstances,” such as when they want to read books or visit Internet sites that

advocate ideas or beliefs their parents do not share.62

In 1997, a year before mandatory Internet filtering bills were introduced in Congress, the

American Library Association passed an anti-filtering resolution, stating that the use of filtering

software to block constitutionally protected speech violates the Library Bill of Rights.63 The

American Library Association views Internet filtering as censorship rather than as an acquisition decision.64 As

59 Ross, An Emerging Right for Mature Minors, supra note 55, at 262.

60 Ross, An Emerging Right for Mature Minors, supra note 55, at 253-54.

61 Ross, An Emerging Right for Mature Minors, supra note 55, at 223-26 (1999). See also Nunziato, supra note 56, at 155, 161-62. See also Laughlin, supra note 56, at 254.

62 Koenigsberg, supra note 56, at 376.

63 See Am. Library Ass’n, Resolution on the Use of Filtering Software in Libraries, available at http://www.ala.org/ala/aboutala/offices/oif/statementspols/ifresolutions/resolutionuse.cfm.

64 See Am. Library Ass’n, Filters and Filtering, available at http://www.ala.org/ala/aboutala/offices/oif/ifissues/filtersfiltering.cfm.

337

discussed in Chapter 2, a major responsibility of librarians is to acquire materials to add to the

library’s collection, including online resources.65 However, because Internet filtering software is

proprietary, librarians do not have input into the selection decisions as they do with other

materials. Most software companies do not explain what they filter or how they filter,66 and therefore, unlike in the book selection process, librarians do not know what they are excluding from the online collection of resources.

The Supreme Court missed an opportunity to apply the right to receive ideas and information doctrine to Internet access in public libraries when the Court upheld the CIPA in 2003. Prior to the advent of Internet access in libraries, the Supreme Court applied the right to receive information doctrine to school libraries in Board of Education v. Pico.67 Although the

Court was evaluating a decision by a school board to remove books from the library, the Court’s

right to receive information reasoning applies to public libraries as well, as can be seen in the

plurality opinion and a dissenting opinion. The plurality emphasized the right to receive

information and ideas, stating that “the right to receive ideas follows ineluctably from the

sender's First Amendment right to send them . . . and is a necessary predicate to the recipient's

meaningful exercise of his own rights of speech, press, and political freedom.” In authoring the

plurality opinion, Justice William Brennan wrote, “[T]he special characteristics of the school

library make that environment especially appropriate for the recognition of the First Amendment

rights of students.”68

65 See AM. LIBRARY ASS’N, LIBRARY BILL OF RIGHTS, available at http://www.ala.org/ala/aboutala/offices/oif/statementspols/statementsif/librarybillrights.cfm and http://www.ala.org/ala/aboutala/offices/oif/statementspols/statementsif/librarybillofrights.pdf.

66 See WALT CRAWFORD, BEING ANALOG 227.

67 Bd. of Educ. v. Pico, 457 U.S. 853 (1982).

68 Id. at 867 (emphasis of Justice Brennan).

338

Justice William Rehnquist disagreed with the decision reached in Pico, but his statements supported the public library’s mission in providing access to a wide variety of information and ideas. Rehnquist wrote,

Unlike university or public libraries, elementary and secondary school libraries are not designed for freewheeling inquiry; they are tailored, as the public school curriculum is tailored, to the teaching of basic skills and ideas.69

Although the Supreme Court decided Pico prior to the availability of the Internet to libraries, the right to receive information applies to Internet access in public libraries and school libraries, just as it applies to books and other resources. Internet resources, like books, provide library patrons with access to a vast array of information, including some information that is only available online. Patrons using the Internet in public libraries have the same right to receive information through Internet resources as through books in order to find a variety of information, including intellectual, political, social and entertainment materials.

In the CIPA case, the Supreme Court did not address the right to receive information

doctrine. The Court also refused to apply public fora doctrine to Internet access in public

libraries, perhaps in part because the Court has been using a nineteenth century public fora

paradigm.70 The Internet should be considered a “metaphysical” public forum, according to law professor Stephen Gey. Although the Internet does not have a physical presence, unlike a

traditional public forum, Gey said that the Internet “operates as a centralized place where people

go to discuss issues, trade information, and peruse words, images, and music. The Internet is,

69 Id. at 915 (Rehnquist, J., dissenting).

70 For a discussion of public fora doctrine, see Chapter 2.

339

therefore, a forum,” Gey wrote.71 As Gey stated in his article, Supreme Court Justice Anthony

Kennedy had advocated expanding public forum doctrine in 1992.72

Justice Kennedy said the Court should adopt a more modern and objective standard to the

public forum doctrine, one that extends beyond the historical designation of streets, parks and

sidewalks because their role is diminishing.73 Justice Kennedy wrote that:

[w]ithout this recognition our forum doctrine retains no relevance in times of fast-changing technology and increasing insularity [and] . . . our failure to recognize the possibility that new types of government property may be appropriate forums for speech will lead to a serious curtailment of our expressive activity.74

In 2004, law student Derrick Stomberg said that the “inherently public forum” may need

to be added as a fourth type of forum.75 This designation would provide protection to new types

of properties whose principal purpose is to promote free speech, as Gey and Justice Kennedy stated. The Internet should then be classified as an “inherently public forum” because the Internet

is a centralized place where people can look for information and discuss ideas.76 Stomberg

argued that the Internet has become the modern public forum as it provides speakers and

recipients with a primary way of exchanging ideas and information in a centralized place.77

The role of the public library is to provide patrons with access to information, regardless

of format or technology, and to challenge censorship in the fulfillment of that responsibility. First

71 Stephen Gey, Reopening the Public Forum-From Sidewalks to Cyberspace, 58 OHIO ST. L.J. 1535, 1618 (1998).

72 Id. at 1535, 1555-66.

73 Int’l Soc’y for Krishna Consciousness v. Lee, 505 U.S. 672, 697-98 (1992) (Kennedy, J., concurring in the judgment).

74 Id.

75 Derrick Stomberg, Note, United States v. American Library Association, Inc.: The Internet as an Inherently Public Forum, 45 JURIMETRICS J. 59, 70 (2004).

76 Id. at 71-73.

77 Id. at 71; see also Gey, supra note 71, at 1618.

340

Amendment jurisprudence supports the role of the library as the Supreme Court and lower

federal courts have specifically applied the right to receive information doctrine to public

libraries. In addition, the expansion of public forum doctrine to apply to Internet access would

support the intent of the traditional public forum as a place where ideas and information can be

exchanged.

Internet Filtering and Public Libraries

To comply with the Children’s Internet Protection Act, librarians must use filtering

software on all computers connected to the Internet in an effort to block sexually explicit Internet

material. The CIPA focuses on receivers, not senders, and covers images, not words.78

The Communications Decency Act of 1996 and the Child Online Protection Act of 1998

were both criminal statutes that focused on the senders of information and applied to all types of

content, not just visual images. The CDA, a part of the Telecommunications Act of 1996, was

initiated as a result of the concerns of Sen. James Exon with what he viewed as the ready availability of pornography and indecency on the Internet79 and his desire to protect children

from such content.80 A House Conference Report on the Telecommunications Act of 1996

stated that “the federal government has a compelling interest in shielding minors from

indecency.”81 The CDA criminally prohibited the intentional transmission, by means of “any

interactive computer service,” of any communication containing child pornography and “obscene or indecent” material to anyone under the age of 18.82

78 Pub. L. No. 106-554, signed into law on Dec. 21, 2000, 114 Stat. 2763, 2763A-335 (2000) (codified as amended at 20 U.S.C. 9134(f) and 47 U.S.C. 254(h).

79 141 CONG. REC. 15503 (June 9, 1995) (statement of Sen. Exon).

80 Id.

81 H. Conf. Rep. on the Telecomm. Act of 1996, H. CONF. REP. 104-458 (2d Sess. 1996) at 188.

82 47 U.S.C. § 223.

341

The Supreme Court struck down the CDA’s provisions that would have restricted minors

from accessing online pornography. The Court found the provisions overbroad and vague, and

therefore in violation of the First Amendment.83 The Court acknowledged that the government

has a compelling interest in protecting children from harmful materials,84 but said that the government has a “heavy burden” in explaining why a less restrictive law would not be as effective.85 The Court said that the CDA was not narrowly tailored to meet the government

interest.86

In an attempt to remedy the deficiencies of the Communications Decency Act, Congress enacted the Child Online Protection Act87 in 1998. Congress passed the COPA in an effort to

restrict minors’ access to sexually explicit commercial materials on the World Wide Web,88 and to prohibit the sale of sexually explicit materials to minors.89 A 1998 House report stated that

minors had ready access to pornographic materials online,90 and parental control protections91

and self-regulation had not provided a national solution to protect minors from online

pornography.92 After seven challenges in federal courts, a federal district court, in 2007, granted

83 Reno v. ACLU, 521 U.S. 844, 864 (1997).

84 Id. at 875.

85 Id. at 879.

86 Id.

87 Child Online Protection Act, Pub. L. No. 105-277, 112 Stat. 2681 (1998) (codified at 47 U.S.C. § 231).

88 H. REP. NO. 105-775 (2d Sess. 1998) at 10.

89 Id. at 5, 12.

90 Id. at 8-10.

91 Id. at 9.

92 See H. REP. NO. 105-775 (2d Sess. 1998) at 17.

342

a permanent injunction prohibiting the government from enforcing the Child Online Protection

Act.93

Judge Lowell Reed, who authored the court’s opinion, applied the strict scrutiny test, stating that the COPA was content-based.94 He wrote that although the protection of minors from online pornography was a compelling government interest, the government failed to show that the COPA was narrowly tailored to meet that interest.95 He stated the COPA was overinclusive in that it prohibited more speech than necessary and underinclusive in that it would not apply to sexually explicit content originating outside the United States that would be available to minors in the U.S.96 Reed wrote that the COPA was overly vague because some of the terms were not clearly defined97 and overbroad because protected speech was prohibited.98

93 ACLU v. Gonzales, 478 F. Supp. 2d 775, 821 (E.D. Pa. 2007).

94 Id. at 809.

95 Id. at 810-15.

96 Id. at 810-11.

97 Id. at 816-19. Judge Reed said the terms “knowingly and with knowledge of the character of the material” and “intentionally” were not clearly defined. In part, the COPA reads:

(a) Requirement to restrict access. (1) Prohibited conduct. Whoever knowingly and with knowledge of the character of the material, in interstate or foreign commerce by means of the World Wide Web, makes any communication for commercial purposes that is available to any minor and that includes any material that is harmful to minors shall be fined not more than $ 50,000, imprisoned not more than 6 months, or both. (2) Intentional violations. In addition to the penalties under paragraph (1), whoever intentionally violates such paragraph shall be subject to a fine of not more than $ 50,000 for each violation. For purposes of this paragraph, each day of violation shall constitute a separate violation. See 47 U.S.C. § 231(a)(1) & (2).

98 Id. at 819-20. Judge Reed wrote that although the COPA defined a minor as “any person under 17 years of age,” material that would be “patently offensive” for an eight-year-old would not be patently offensive for a sixteen-year- old. Similarly, he stated that material that would not have “serious literary, artistic, political, or scientific value” for a toddler could have such value for a teenager.

343

Six years after the Supreme Court upheld the CIPA, filtering technology still is not equipped to adequately block access to images. In addition, filters are not able to block all sexually explicit text that may be used as descriptors for sexually explicit images.
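To make this limitation concrete, the following is a minimal, hypothetical sketch, written in Python, of how a purely text-based filter evaluates a Web page. The keyword list, function name and sample pages are invented for illustration and describe no vendor’s actual product; the point is only that such a filter examines the URL, the surrounding text and file names, never the content of the image itself.

    # Hypothetical sketch of a text-based filter; commercial filtering products
    # are proprietary, so this keyword list and logic are illustrative assumptions.

    BLOCKED_TERMS = {"porn", "xxx", "breast", "sex"}  # invented example terms

    def is_blocked(url, page_text, image_filenames):
        """Return True if the page should be blocked.

        Only text is examined; the pixels of an image are never analyzed. An
        explicit image with a neutral file name and caption therefore passes
        (underblocking), while a health page using a flagged word is caught
        (overblocking).
        """
        searchable = " ".join([url, page_text, *image_filenames]).lower()
        return any(term in searchable for term in BLOCKED_TERMS)

    # An explicit image posted with a neutral name and caption is not blocked:
    print(is_blocked("http://example.org/photos", "spring album", ["img0042.jpg"]))   # False

    # A breast cancer resource page that uses a flagged word is blocked:
    print(is_blocked("http://example.org/cancer", "breast cancer screening", ["chart.png"]))  # True

Under these assumptions, the filter’s accuracy depends entirely on the words that happen to surround an image, which is the weakness the district court and later studies identified.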

The focus on images only in the CIPA resulted from a joint House-Senate conference committee meeting in 2000. Earlier bills included language aiming to protect minors from all types of content, not just visual images. For example, most of the bills stated that filtering needed to be used to prevent minors from accessing “matter” or “material” that was “inappropriate for” or “harmful to minors.”99 In 2000, the year the CIPA was enacted, early versions of the CIPA

amendment did not specify images only, but rather would have required the filtering technology

to block “material that is harmful to minors.”100

Sen. John McCain’s filtering amendment, introduced in 2000 and upon which the CIPA is based,

was modified in committee. His original filtering amendment, introduced as part of a major

appropriations bill, was based on the McCain-Hollings 1999 bill and would have required the

filtering technology to “filter or block material deemed to be harmful to minors.” Committee

action modified the wording to prohibit online access to three types of content: obscenity, child

pornography, and “any other material . . . inappropriate for minors,”101 but the focus was on a

broader range of content than just visual images.

A joint House-Senate conference committee made additional changes to the filtering

proposal. The committee changed “material” to “visual depictions” and “inappropriate for

minors” to “harmful to minors.” The proceedings of the committee meeting were not made

99 See Chapter 6 for a discussion of the legislative history of the Children’s Internet Protection Act.

100 Children’s Internet Protection Act, H.R. 4600, 106th Cong. (2d Sess. 2000).

101 Children’s Internet Protection Act, S.Amdt. No. 3610 to H.R.4577, Consol. Appropriations Act, 2001, 106th Cong. (2d Sess. June 22, 2000).

344

public, and therefore the committee’s rationale is not on record. However, based on House and

Senate concerns with minors’ access to online pornography, it is quite possible that the

committee changed the wording to “visual depictions” to deal with congressional concerns about

graphic pornographic videos and images that were readily available online.102

Another major change in the 2000 filtering amendment, in contrast to previous bills, was

the addition of a definition for the “harmful to minors” phrase. Earlier bills did not define the

term. The CIPA legislation defined “harmful to minors” as—

any picture, image, graphic image file, or other visual depiction that

(i) taken as a whole and with respect to minors, appeals to a prurient interest in nudity, sex, or excretion; (ii) depicts, describes, or represents, in a patently offensive way with respect to what is suitable for minors, an actual or simulated sexual act or sexual contact, actual or simulated normal or perverted sexual acts, or a lewd exhibition of the genitals; and (iii) taken as a whole, lacks serious literary, artistic, political, or scientific value as to minors.103

Although filtering technology has improved since the enactment of the CIPA, filtering

software continues to block content that would not be considered sexually explicit or harmful to

minors and fails to block content that is sexually explicit or harmful to minors.104 A 2008 study

on Internet filters showed that filters still underblocked and overblocked content. The better that

the filtering software was at blocking access to “harmful” material, the worse the software was at

allowing access to “harmless” material.105 The researchers also addressed the impact of user-

102 For a discussion of committee hearings and reports on filtering bills, see Chapter 6.

103 Pub. L. No. 106-554, 114 Stat. 2763, 2763A-335 (2000) (codified as 20 U.S.C. § 9134(f)(7)(B)).

104 See DELOITTE ENTERPRISE RISK SERVICES, SAFER INTERNET: SYNTHESIS REPORT at 27 (2008). According to the Deloitte study, “overblocking” occurs when the filter blocks “good content,” and “underblocking” occurs when the filter does not block “unwanted content.” See also MARJORIE HEINS ET AL., BRENNAN CENTER FOR JUSTICE AT N.Y. UNIV. SCHOOL OF LAW, INTERNET FILTERS: A PUBLIC POLICY REPORT 45 (2d ed., 2006); Filtering Software: Better, But Still Fallible, CONSUMER REPORTS, at 36 (June 2005).

105 DELOITTE ENTERPRISE RISK SERVICES, SAFER INTERNET, supra note 104, at 5-10.

345

generated online content on the effectiveness of filtering software programs. Because Internet

users can post photos, videos and other content on interactive sites nearly instantaneously, such

as on YouTube and MySpace, filtering software vendors cannot keep up with the changes, the

researchers said.106
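A small, invented numerical illustration may help clarify the trade-off the researchers described. In the sketch below, which is assumed purely for purposes of illustration and reflects no real filter or data set, each page receives a hypothetical “risk score” and the operator chooses the score at which pages are blocked; moving that threshold trades underblocking against overblocking.

    # Invented scores and labels, used only to illustrate the over/underblocking
    # trade-off reported in filtering studies; no real filter or data set is implied.

    pages = [
        (0.95, True), (0.80, True), (0.55, True), (0.35, True),      # actually harmful
        (0.70, False), (0.40, False), (0.20, False), (0.05, False),  # actually harmless
    ]  # (filter's risk score, is the page actually harmful?)

    def block_counts(threshold):
        overblocked = sum(1 for score, harmful in pages if score >= threshold and not harmful)
        underblocked = sum(1 for score, harmful in pages if score < threshold and harmful)
        return overblocked, underblocked

    for threshold in (0.9, 0.5, 0.1):
        over, under = block_counts(threshold)
        print(f"threshold {threshold}: harmless pages blocked = {over}, harmful pages missed = {under}")

    # threshold 0.9: harmless pages blocked = 0, harmful pages missed = 3
    # threshold 0.5: harmless pages blocked = 1, harmful pages missed = 1
    # threshold 0.1: harmless pages blocked = 3, harmful pages missed = 0

In this toy example no threshold eliminates both kinds of error at once, which mirrors the study’s finding that the software that blocked “harmful” material most aggressively also blocked the most “harmless” material.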

A major problem with all filtering software is that software vendors, rather than

librarians, are making the filtering decisions because filtering software is proprietary.107 When librarians select Internet filtering software, they are not able to base their choices on the library’s collection development policies, as they do in acquiring books and other materials.108 In choosing commercial filtering software, librarians are not able to determine if the filters meet their needs because software companies do not disclose their standards.109 Therefore, librarians have no way of knowing which content has been blocked, as the district court noted in its opinion on the Children’s Internet Protection Act of 2000.110

In 2008, the Deloitte researchers suggested that Internet filters would be more effective if

software designers developed a way for the products’ users to classify content according to

family, religious or social group values, classifications that could be shared among users with

106 DELOITTE ENTERPRISE RISK SERVICES, SAFER INTERNET, supra note 104, at 14.

107 Am. Library Ass’n v. United States, 201 F. Supp. 2d 401, 462-64 (E.D. Pa. 2002). See also Legislative Proposals to Protect Children from Inappropriate Materials on the Internet: Hearing before the Subcomm. on Telecomm., Trade, & Consumer Protection of the H. Comm. on Commerce, 105th Cong. 119 (Sept. 11, 1998) at 40 (statement of Jerry Berman, executive director for the Center for Democracy and Technology, referring to S.1619, the Internet School Filtering Act).

108 See AM. LIBRARY ASS’N, BEST PRACTICES IN PUBLIC LIBRARIES, available at http://www.ala.org/ala/shadows/pla/resources/bestpractices.cfm. For a discussion of library collection decisions, see Chapter 2.

109 See Legislative Proposals to Protect Children from Inappropriate Materials on the Internet: Hearing before the Subcomm. on Telecomm., Trade, & Consumer Protection of the H. Comm. on Commerce, 105th Cong. 119 (Sept. 11, 1998) at 40 (statement of Jerry Berman, executive director for the Center for Democracy and Technology, referring to S.1619, the Internet School Filtering Act).

110 See Am. Library Ass’n v. United States, 201 F. Supp. 2d 401, 462-64 (E.D. Pa. 2002). For a discussion of the court cases deciding the Children’s Internet Protection Act, see Chapter 7.

346

similar beliefs and values.111 However, while a user-classification system might work well for

families and schools that potentially share a set of values, the system would not be consistent

with the values of public libraries, whose patrons may have a broad range of values and beliefs.
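A hypothetical sketch of such a scheme, with invented group names, sites and labels, suggests why it suits communities that share values better than it suits a public library. Each group publishes its own classifications, and a subscriber’s filter merges the lists of the groups it trusts.

    # Hypothetical shared-classification scheme; the groups, sites and labels are
    # invented to illustrate the idea described in the Deloitte report, not an
    # actual product design.

    group_ratings = {
        "smith_family": {"example-health.org": "allow", "example-gaming.com": "block"},
        "parish_group": {"example-health.org": "block", "example-news.com": "allow"},
    }

    def decision(url, subscribed_groups):
        """Block a site if any subscribed group blocks it; otherwise allow it."""
        votes = [group_ratings[g].get(url) for g in subscribed_groups if g in group_ratings]
        return "block" if "block" in votes else "allow"

    # A family subscribing only to its own list sees its own values applied:
    print(decision("example-health.org", ["smith_family"]))                   # allow

    # A library subscribing to several groups' lists ends up imposing the most
    # restrictive group's values on every patron:
    print(decision("example-health.org", ["smith_family", "parish_group"]))   # block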

Another problem with commercial filtering software is that legal scholars and attorneys

do not write the filtering software programs. Moreover, filter software designers do not get input

from legal scholars or attorneys before material is categorized as obscenity, child pornography,

or harmful to minors. In addition, filtering software designers develop products for a national

market and do not include local community standards when categorizing content.112 The absence of legal standards and legal definitions in the way software developers analyze and categorize content is especially problematic because the filtering software’s blocking criteria or categories do not

match the legal definitions of obscenity, child pornography, or harmful to minors contained in

the Children’s Internet Protection Act.113

Although several different companies have developed software packages,114 the filters

may give parents and the government a false sense of security because they do not block all

sexually explicit sites and/or they block perfectly innocent and useful sites.115 Filtering software blocks constitutionally protected speech as well.

111 DELOITTE ENTERPRISE RISK SERVICES, SAFER INTERNET, supra note 104, at 5, 15.

112 Am. Library Ass’n v. United States, 201 F. Supp. 2d at 429.

113 For a discussion of filtering technology, see Chapter 3.

114 See Am. Library Ass’n v. United States, 201 F. Supp. 2d at 427-29.

115 See HEINS ET AL., supra note 104; Filtering Software: Better, But Still Fallible, supra note 104 at 36; Leah Wardak, Note, Internet Filters and the First Amendment: Public Libraries After United States v. Am. Library Association, 35 LOY. U. CHI. L.J. 657, 735 (2004); Am. Library Ass’n, Libraries & the Internet Toolkit (last updated Dec. 9, 2003), at 10, http://www.ala.org/oif/iftoolkits/internet. (last visited July 20, 2009); Richard J. Peltz, Use “the Filter You Were Born With”: The Unconstitutionality of Mandatory Internet Filtering for the Adult Patrons of Public Libraries, 77 WASH. L. REV. 397 (2002); Am. Civil Liberties Union, Censorship in a Box: Why Blocking Software Is Wrong for Public Libraries (Sept. 16, 2002), http://www.aclu.org/privacy/speech/14915pub20020916.html (last visited July 20, 2009); Kathleen Conn, Commentary: Protecting Children from Internet Harm (Again): Will the Children’s Internet Protection Act Survive

347

Prior to the adoption of the CIPA, public libraries had the option of providing unfiltered

computer access or installing filtering software on computers. They could have chosen to install filters on only some computers, thus allowing full Internet access on other computers. Under

such a system, parents could have chosen whether they wanted their children to have filtered or

unfiltered access to the Internet at public libraries. However, the CIPA mandates that filtering

technology be installed on all computers connected to the Internet, thus impeding the role of

librarians and parents, as well as adults’ access to some constitutionally-protected speech.

The Supreme Court, the Protection of Minors, and Competing Interests

In upholding the Children’s Internet Protection Act in 2003, the Supreme Court could

have taken the opportunity to develop objective criteria for balancing the three competing

interests represented in that case: protecting minors from material deemed harmful, protecting adults’ rights to constitutionally-protected speech, and protecting parents’ rights to raise children

as they see fit. The Court did not try to explain how to balance the competing interests or whether any interest “trumped” the others under certain circumstances. An analysis of the competing interests would have

been useful to legislators and lower courts in enacting and deciding laws, respectively.

Instead, the Court relied on the congressional spending clause in upholding the CIPA,

stating that the CIPA was a valid exercise of Congress’ spending powers and did not impose an

“unconstitutional condition” on public libraries.116 In authoring the plurality opinion, Justice

William Rehnquist wrote that the CIPA did “not directly regulate private conduct; rather,

Judicial Scrutiny?, 153 ED. LAW. REP. 469 (July 5, 2001); Adam Horowitz, The Constitutionality of the Children’s Internet Protection Act, 13 ST. THOMAS L. REV. 425 (2000).

116 See United States v. Am. Library Ass'n, 539 U.S. at 214.

348

Congress has exercised its Spending Power by specifying conditions on the receipt of federal

funds.”117

In addressing the First Amendment in the CIPA case, Justice Rehnquist said that public

forum analysis and “heightened” judicial scrutiny were incompatible with public libraries’

discretion in deciding what materials to acquire for their collections.118 He likened the question of applying forum analysis to public libraries to the Court’s decisions involving public television and funding for the arts, contexts in which he said forum doctrine does not apply.119 Rehnquist

also said that concerns over filters blocking access to constitutionally protected speech could be remedied by a provision in the CIPA that allows adult patrons to ask librarians to disable the filtering software.

In contrast, the Supreme Court applied the strict scrutiny standard to the Communications

Decency Act in Reno.120 Justice John Paul Stevens, who authored the decision, said that while

some other media, including broadcasting, received a lower level of scrutiny for various

reasons—such as the history of extensive government regulation, the scarcity of access, and its

"invasive" nature—those factors were not present in cyberspace.121 Stevens wrote that the CDA

was overbroad and suppressed a large amount of constitutionally-protected speech in order to

prevent minors from accessing potentially harmful speech.122 Justice Stevens said that the CDA

117 Id. at 203.

118 Id. at 205-06.

119 See Chapter 7 for the Supreme Court’s application of the forum doctrine to the CIPA.

120 Reno v. ACLU, 521 U.S. 844, 868-69 (1997).

121 Reno v. ACLU, 521 U.S. at 868-69 (1997). According to the Court, in contrast to a time/place/manner regulation, “[T]he CDA is a content-based blanket restriction on speech” that requires “the application of the most stringent review of its provisions.” Id. at 868.

122 Id. at 874.

349

was vague because it did not define the terms “indecent” and “patently offensive.” Stevens said

that the lack of precision in the statute’s wording could lead to a “chilling effect” on speech,

which would violate the First Amendment. Precise wording is needed when a regulation restricts

speech, Stevens wrote.123

Chief Justice Rehnquist, joining Justice Sandra Day O’Connor’s separate opinion, agreed in part and dissented in part with the Court’s judgment in Reno. He agreed that the CDA was unconstitutional because adult speech could not yet be zoned in cyberspace. However, he said that he would have upheld the CDA as it applied to “indecent

speech” between adults and minors when the adults knew they were communicating with

minors.124

Despite the government interest in protecting children from materials deemed harmful,

parents—not the government—have the primary responsibility of raising their children,

according to the Supreme Court.125 For example, the Court has upheld parents’ rights in

educating their children as they see fit and in choosing books, magazines and television

programming for their children, including sexually explicit content if they so desire.

On the other hand, the Supreme Court has clearly stated that the physical, psychological and emotional well-being of minors is a compelling government interest,126 while a federal appellate court has held that the government has a compelling interest in helping parents supervise their children.127 Congress also has recognized that while

123 Id. at 871-72.

124 Id. at 897 (O’Connor, J., concurring in the judgment in part and dissenting in part, joined by Rehnquist, C.J.).

125 See Prince v. Massachusetts, 321 U.S. 158, 166 (1944); Pierce v. Soc’y of Sisters, 268 U.S. 510 (1925); Meyer v. Nebraska, 262 U.S. 390, 399, 403 (1923).

126 See Ginsberg v. New York, 390 U.S. 629, 639, 640 (1968); New York v. Ferber, 458 U.S. 747, 756-57 (1982); Prince v. Massachusetts, 321 U.S. 158, 167 (1944).

127 See Action for Children’s Television v. FCC, 11 F.3d 170, 177 (D.C. Cir. 1993).


parents are primarily responsible for childrearing, “parental control or guidance cannot always be

provided and society’s transcendent interest in protecting the welfare of children justify

reasonable regulation of the sale of material to them.”128

The government has a long history of acting to protect minors from sexually explicit content. One Supreme Court decision, directly on point for this dissertation, upheld a state’s variable obscenity standard. The case involved a Long Island luncheonette owner’s sale of “girlie” magazines to a sixteen-year-old boy on two separate occasions. In Ginsberg v. New York,129 the

Court held that material that is obscene for minors may not necessarily be obscene for adults.130

The Court stated that New York’s “harm to minors” statute, which barred the sale of sexually

explicit materials to persons under the age of seventeen, was constitutional.131 However, the

Court did not prohibit parents from purchasing sexually explicit material for their children.

Justice William Brennan, in authoring the Court’s opinion in Ginsberg, cited an article by

legal scholars William Lockhart and Robert McClure, the first authors to articulate the concept

of a “variable obscenity” standard.132 In a seminal 1960 law review article, Lockhart and McClure advocated a two-part variable obscenity standard under which some material is obscene for both adults and minors, while other material is obscene for minors but not for adults.133

128 H.R. REP. NO. 105-775, at 12 (1998) (citing People v. Kahan, 15 N.Y.2d 311, 312, 206 N.E.2d 333, 334 (1965), as cited in Ginsberg v. New York, 390 U.S. 629, 640 (1968)).

129 Ginsberg v. New York, 390 U.S. 629 (1968).

130 Id. at 645-66.

131 Id. at 645.

132 Id. at 636.

133 William Lockhart & Robert McClure, Censorship of Obscenity: The Developing Constitutional Standards, 45 MINN. L. REV. 5, 77 (1960).


The Supreme Court recognized the government’s compelling interest in protecting

minors from sexually explicit or indecent content in the electronic media, too. In Pacifica,134 the

Court held that the Federal Communications Commission had the authority to restrict the airing of indecent programming in the broadcast media to times when children most likely would not be in the audience. The Court cited Ginsberg135 in reaffirming the government’s compelling interest in the well-being of children. The Court also supported the parental role in childrearing by stating that parents have authority in their own household to determine which messages their children are exposed to.

In Sable,136 the Court held that the First Amendment protects indecent dial-a-porn

messages but not those that are obscene. The Court again reaffirmed the government’s

compelling interest in protecting children from nonobscene material deemed harmful to them,137 while at the same time stating that regulations must be narrowly drawn so as not to interfere with the First Amendment freedoms of adults.138

In United States v. Playboy,139 the Court held unconstitutional a statute140 that required

cable operators to either scramble or block channels “primarily dedicated to sexually-oriented

programming” or to broadcast those channels during the “safe-harbor” hours of 10 p.m. to 6 a.m.

134 FCC v. Pacifica Found., 438 U.S. 726 (1978).

135 Ginsberg v. New York, 390 U.S. 629 (1968).

136 Sable Commc’ns, Inc. v. FCC, 492 U.S. 115 (1989).

137 Id. at 126. The Court wrote, “There is a compelling interest in protecting the physical and psychological well- being of minors. This interest extends to shielding minors from the influence of literature that is not obscene by adult standards.”

138 Id. (citing Schaumburg v. Citizens for Better Env't, 444 U.S. 620, 637 (1980)).

139 United States v. Playboy Entm't Group, Inc., 529 U.S. 803 (2000).

140 Telecommunications Act of 1996, Pub. L. No. 104-104, Title V, Subtitle A, § 505(b), 110 Stat. 136 (1996) (codified at 47 U.S.C. § 561).


when young children were not likely to be watching.141 Despite its ruling, the Court again

reinforced the protection of minors when it said that there is a problem that the government must

address if television programming “can expose children to the real risk of harmful exposure to

indecent materials, even in their own home and without parental consent.”142 The Court said the

government must find a way to protect minors from indecency in a way that is consistent with

the First Amendment.143

Despite their support of the government’s compelling interest in protecting minors from obscenity and other material deemed harmful, neither Congress nor the Supreme Court has clearly defined what is obscene or what is “harmful to minors.” Although defining terminology is more of a congressional task than a judicial one, Congress has not dealt systematically with the definitional issues involved. The definition of obscenity and the definition of harmful to minors,144 which is based on the Miller obscenity test,145 are difficult to clarify and even more difficult to apply. In addition, Congress and the Supreme Court have not squarely faced the definitional issues in light of the Internet.

The Court also has not clearly determined how best to balance the competing interests, nor has it been consistent in doing so. For example, in Ginsberg,146 the Court

said that parents could purchase sexually explicit material for their children, thus supporting the

141 Playboy, 529 U.S. at 806, 826-27.

142 Id.

143 Id.

144 For the definition of “harmful to minors,” see supra note 103 and accompanying text.

145 The Supreme Court defined obscenity in 1973 in Miller v. California as “works which, taken as a whole, appeal to the prurient interest in sex, which portray sexual conduct in a patently offensive way, and which, taken as a whole, do not have serious literary, artistic, political, or scientific value.” Miller v. California, 413 U.S. 15, 24 (1973).

146 Ginsberg v. New York, 390 U.S. 629 (1968).


parental role in childrearing. In contrast, in U.S. v. American Library Association,147 the CIPA

case, the Court did not question the congressional mandate that required minors (and adults, for

that matter) to use filtered computers even if parents preferred that their children use unfiltered

computers in public libraries. The CIPA, in general, imposes government values on the family.

In addition, applying the same filtering software and settings to all library users imposes the software developer’s standards over parental judgments. Uniform filter settings in public libraries also treat young grade-school students and teenagers the same way.

The Supreme Court, in its opinion in the CIPA case, did not address the ambiguity or vagueness of the terms in the statute, such as “bona fide research,” “technology protection measure,” and “harmful to minors.” A student commentator argued that, at the time Congress enacted the CIPA, filtering technology was incapable of blocking “visual depictions” deemed “obscene,” “child pornography” or “harmful to minors,”148 the terms used in the CIPA.149 Moreover,

the category descriptions used by filtering software did not match these three legal definitions, according to an attorney who advised the House Committee on Science and Technology.150

The Court did not question Congress’ use of the term “bona fide research or other lawful

purpose.” Moreover, the Court did not question who would determine if an adult’s purpose was

bona fide or lawful.

147 United States v. Am. Library Ass’n, 539 U.S. 194 (2003).

148 See Jared Chrislip, Filtering the Internet Like a Smokestack: How the Children's Internet Protection Act Suggests a New Internet Regulation Analogy, 5 J. HIGH TECH. L. 261, 278-79 (2005).

149 For a discussion of more recent Internet filtering technology, see Chapter 3.

150 See Mitchell Goldstein, Congress And The Courts Battle Over The First Amendment: Can The Law Really Protect Children From Pornography On The Internet? 21 J. MARSHALL J. COMPUTER & INFO. L. 141, 187 (2003). Examples of filtering categories are “adults only”, “sexually explicit”, “sex education”, “nudity” and “violence”. For a discussion of how filtering software works, see Chapter 3.


The Supreme Court also ignored Congress’ inconsistent use of terminology within the

CIPA itself. The lower court had faulted Congress for choosing language in the disabling provision of the statute that was inconsistent with the language in the banned content provision

of the statute. The district court wrote:

If Congress intended CIPA's disabling provisions simply to permit libraries to disable the filters to allow access to speech falling outside of these categories, Congress could have drafted the disabling provisions with greater precision, expressly permitting libraries to disable the filters “to enable access for any material that is not obscene, child pornography, or in the case of minors, harmful to minors,” rather than “to enable access for bona fide research or other lawful purposes,” which is the language that Congress actually chose.151

In deciding the CIPA case, the Supreme Court also did not explain how to determine

whether content met the “harmful to minors” standard. Even with advances in filtering

technology, filters still cannot reliably differentiate between content that is obscene

and that which is harmful to minors, an important distinction since adults legally would be

allowed to access the nonobscene material that could be considered harmful to minors. To this

author’s knowledge, no court has yet distinguished those terms.

Two law professors proposed an age-based Internet filtering solution to address

concerns about minors’ access to inappropriate materials in public libraries. Libraries could

implement three tiers of filtering: 1) the most restrictive setting would apply to those aged 12

and under; 2) a “less restricted” setting would apply to minors aged thirteen through sixteen; and

3) a much less restricted or unrestricted setting would apply to adults aged seventeen and

older.152

151 Am. Library Ass’n v. United States, 201 F. Supp. 2d 401, 485 (E.D. Pa. 2002).

152 See Etzioni, supra note 56, at 43-44 (2004); Nunziato, supra note 56, at 163-64.


Although Congress and the courts have presumed that pornography is harmful to adults and minors, social science studies on the effects of pornography are inconclusive and contradictory, as discussed in Chapter 4.153 Kathleen Conn, a legal expert and educator, wrote that “empirical research has traditionally failed to establish a link between exposure of children to sexually explicit materials and delinquent or criminal behavior.”154 Historian and attorney

Catherine Ross stated that proponents of government regulation have not justified the compelling interest of protecting minors from harm because they have not articulated any specific harms that would come to children,155 while social scientist Ernest Giglio said that anecdotal evidence carries little, if any, empirical weight and is made up largely of impressionistic and subjective experiences.156

Because of the presumed psychological, moral, or developmental harm to minors that would result from their exposure to sexually explicit material, it has been considered unethical for researchers to intentionally expose children to pornography. Therefore, virtually no empirical studies have been done, according to researchers.157

153 See Azy Barak & William A. Fisher, Effects of Interactive Computer Erotica on Men’s Attitudes and Behavior Toward Women: An Experimental Study, 13 COMPUTERS IN HUMAN BEHAVIOR 353, 354 (1997). See also Aletha C. Huston, Ellen Wartella & Edward Donnerstein, Measuring the Effects of Sexual Content in the Media: A Report to the Kaiser Family Foundation at 4 (May 2003), http://www.kff.org/entmedia/1389-content.cfm and http://www.kff.org/entmedia/loader.cfm?url=/commonspot/security/getfile.cfm&PageID=14624 (last visited July 20, 2009); John S. Lyons, Rachel L. Anderson & David B. Larson, A Systematic Review of the Effects of Aggressive and Nonaggressive Pornography, in MEDIA, CHILDREN, AND THE FAMILY: SOCIAL SCIENTIFIC, PSYCHODYNAMIC, & CLINICAL PERSPECTIVES 301 (Dolf Zillmann & Aletha C. Huston, eds., 1994); Azy Barak, William A. Fisher, Sandra Belfry & Darryl R. Lashambe, Sex, Guys & Cyberspace: Effects of Internet Pornography and Individual Differences on Men’s Attitudes Toward Women, 11 J. PSYCHOLOGY & HUMAN SEXUALITY 63, 65 (1999).

154 Conn, supra note 115, at 479.

155 See Catherine J. Ross, Anything Goes: Examining the State’s Interest in Protecting Children from Controversial Speech, 53 VAND. L. REV. 427, 494 (2000).

156 See Ernest Giglio, Pornography in Denmark: A Public Policy Model for the United States, 8 COMP. SOC. RESEARCH 281, 296 (1985).

157 See Kimberly J. Mitchell, David Finkelhor & Janis Wolak, The Exposure of Youth to Unwanted Sexual Material on the Internet: A National Survey of Risk, Impact, and Prevention, 34 YOUTH & SOC’Y 330, 334 (2003). See also Huston, Wartella & Donnerstein, supra note 153.


Other research studies on the effects of sexually explicit material on young adults and mature adults have yielded mixed results. One study, conducted in the mid-1980s, suggested that sexually explicit films had undesirable behavioral effects only when violence was included.158 At least five studies conducted in the 1980s indicated that men developed negative attitudes toward women and engaged in aggressive behavior toward them after being exposed to sexually explicit material that portrayed women as receptive, nondiscriminating and available sexual objects.159 On the other hand, another six studies, conducted in the 1980s and 1990s, indicated that men were not easily influenced or affected by such sexually explicit materials and therefore did not develop negative attitudes and behaviors toward women as a result of that exposure.160

Most social science studies on the effects of exposure to pornography were conducted prior to the advent of the Internet. Research into the content, use and effects of online

158 See EDWARD DONNERSTEIN, DANIEL LINZ & STEVEN PENROD, THE QUESTION OF PORNOGRAPHY: RESEARCH FINDINGS AND POLICY IMPLICATIONS 176 (1987). See also Edward Donnerstein & Leonard Berkowitz, Victim Reactions in Aggressive Erotic Films as a Factor in Violence Against Women, 41 J. PERSONALITY & SOC. PSYCHOLOGY 710 (1981); Margaret E. Thompson, Steven H. Chaffee & Hayg H. Oshagan, Regulating Pornography: A Public Dilemma, 40 J. COMM. 73, 74 (1990).

159 See J.V.P. Check & T.H. Guloien, Reported Proclivity for Coercive Sex Following Repeated Exposure to Sexually Violent Pornography, Nonviolent Dehumanizing Pornography, and Erotica, in PORNOGRAPHY: RESEARCH ADVANCES AND POLICY CONSIDERATIONS 159-84 (Dolf Zillman & Jennings Bryant, eds., 1989); E. Donnerstein, Pornography: Its Effect on Violence Against Women, in PORNOGRAPHY AND SEXUAL AGGRESSION 53-81 (N.M. Malamuth & E. Donnerstein, eds., 1984); N.M. Malamuth & J.V.P. Check, The Effects of Mass Media Exposure On Acceptance of Violence Against Women, 15 J. RESEARCH IN PERSONALITY 436-46 (1981); R.S. Wyer, G.V. Bodenhausen & T.F. Gorman, Cognitive Mediators of Reactions to Rape, 48 J. PERSONALITY AND SOC. PSYCHOLOGY 324-38 (1984); and D. Zillmann, Effects of Prolonged Consumption of Pornography, in PORNOGRAPHY: RESEARCH ADVANCES AND POLICY CONSIDERATIONS 127-57 (Dolf Zillman & Jennings Bryant, eds., 1989).

160 See Barak & Fisher, supra note 153 at 353-69; J. Becker & R.M. Stein, Is Sexual Erotica Associated With Sexual Deviance in Adolescent Males? 14 INT’L J. L. & POL’Y 85-95 (1991); W.A. Fisher & G. Grenier, Violent Pornography, Antiwoman Thoughts, and Anti Woman Acts: In Search of Reliable Effects. 31 J. SEX RESEARCH 23- 38 (1994); R. Langevin et al., Pornography and Sexual Offenses, 1 ANNALS OF SEX RESEARCH 335-62 (1988); N.M. Malamuth & J. Ceniti, Repeated Exposure to Violent and Nonviolent Pornography: Likelihood of Raping Ratings and Laboratory Aggression Against Women, 12 AGGRESSIVE BEHAVIOR 129-37 (1986); V.R. Padgett, J.A. Brislin- Slutz & J.A. Neal, Pornography, Erotica, and Attitudes Toward Women: The Effects of Repeated Exposure, 26 J. SEX RESEARCH 479-91 (1989). See generally I.L. REISS, JOURNEY INTO SEXUALITY: AN EXPLORATORY VOYAGE (1986).


pornography is still in the early stages, and therefore not much is known about the behaviors and

attitudes of people who access pornography online.161 Even with the interactive environment of

computer pornography, researchers reported in 1997 that males’ use of computer pornography

did not affect their attitudes and behavior toward women.162 In 2000, another group of

researchers argued that further research needs to be conducted to study whether interactive

pornography has a more negative impact on consumers than non-interactive pornography or even

non-interactive obscenity.163 In an article released in 2000, a team of researchers who conducted a meta-analysis reported that “exposure to pornography produces a variety of substantial negative outcomes.”164 The researchers stated that themes of aggression, gratification and

objectification “may reinforce and/or justify similar attitudes and behaviours [sic] in everyday

human-life contacts.”165 The team reported, however, that the relationship between the

consumption of pornography and subsequent behavior does not exist in a vacuum. The

researchers wrote, “While likely not a solitary influence, it appears that exposure to pornography

is one important factor which contributes directly to the development of sexually dysfunctional

attitudes and behaviours.” [sic]166

161 See Patricia Goodson, Deborah McCormick & Alexandra Evans, Searching for Sexually Explicit Materials on the Internet: An Exploratory Study of College Students’ Behaviors and Attitudes, 30 ARCHIVES OF SEXUAL BEHAVIOR 101, 103, 115 (2001); see also Barak & Fisher, supra note 153, at 353.

162 See Barak & Fisher, supra note 153, at 357-66.

163 See Chad Mahood, Sriram Kalyanaraman & S. Shyam Sundar, The Effects of Erotica and Dehumanizing Pornography in an Online Interactive Environment, paper presented at annual convention of the Association for Education in Journalism and Mass Communication, Phoenix, Ariz. (Aug. 2000).

164 Elizabeth Oddone-Paolucci, Mark Genuis & Claudio Violato, A Meta-Analysis of the Published Research on the Effects of Pornography, in THE CHANGING FAMILY AND CHILD DEVELOPMENT 52 (Claudio Violato, Elizabeth Oddone-Paolucci & Mark Genuis, eds., 2000).

165 Id.

166 Id.


With the advent and expansion of the Internet, researchers have questioned whether

Internet pornography can be conceptualized as traditional pornography viewed through a new medium167 or whether online pornography is quite different from that found in other media.168 For instance, some researchers view Internet pornography as distinct from other media because of its interactivity.169 With the emergence of social networking sites and 3-D sites with avatars, such

as Second Life, further studies on the effects of interactive pornography would be useful.

Suggestions for Future Research

Further study is needed on several issues related to mandatory Internet filtering in public

libraries. First, legal scholars could investigate “as applied” challenges170 to the Children’s

Internet Protection Act. An “as applied” challenge would allow a library patron to file a lawsuit

alleging that the law was improperly administered under a specific set of circumstances.171 For

example, if a librarian refused to disable the filter or was unable to disable the filter in a timely

manner, or if an adult patron’s access to constitutionally protected online content was “burdened

in some other substantial way,” the patron would be able to challenge the CIPA on an “as

applied” basis.172

167 See Michael D. Mehta & D. Plaza, Content Analysis of Pornographic Images Available on the Internet, 13 THE INFO. SOC’Y 153, 154, 161 (1997).

168 See Mahood, Kalyanaraman & Sundar, supra note 163; S. S. Sundar, Technological Issues in Internet Pornography, paper presented at annual convention of the Association for Education in Journalism and Mass Communication, New Orleans, La. (Aug. 1999).

169 See Ven-hwei Lo & Ran Wei, Third-Person Effect, Gender, and Pornography on the Internet, 46 J. BROAD. & ELEC. MEDIA 13, 13-14 (2002); Marty Rimm, Marketing Pornography on the Information Superhighway, 83 GEORGETOWN L.J. 1849, 1852 (1995).

170 See United States v. Am. Library Ass’n, 539 U.S. 194, 215 (2003) (Kennedy, J., concurring).

171 The CIPA contains a disabling provision that allows librarians to disable the filter for adults for “bona fide research or other lawful purposes.” See 47 U.S.C. § 254(h)(5)(D) and 47 U.S.C. § 254(h)(6)(D).

172 See United States v. Am. Library Ass’n, 539 U.S. at 215 (Kennedy, J., concurring).


Second, legal scholars could study the use of age-specific regulations, for example, dividing minors into several age groups, such as 6 years old and younger, 7- to 12-year-olds, and 13- to 16-year-olds. The filter settings could be set at the most restrictive level for young children, a somewhat restrictive level for elementary school children, and the least restrictive level for teenagers. Such an approach would be a less restrictive alternative for filtering Internet content in public libraries and more in line with the mission and role of public libraries. Legal scholars could collaborate with child development experts on such a study.
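To make the proposal concrete, the following is a minimal sketch, written in Python, of how such tiered settings might be represented. The category labels, age cutoffs, and data structure are hypothetical and offered only as an illustration under those assumptions, not as a description of any existing filtering product or of the proposal itself.

    # Illustrative sketch only: a hypothetical, simplified model of tiered filter
    # settings keyed to patron age. Category labels and age cutoffs are assumptions
    # for illustration, not features of any actual filtering product.

    AGE_TIERS = [
        # (maximum age for the tier, categories blocked at that tier)
        (6, {"sexually explicit", "nudity", "violence", "hate speech", "chat"}),
        (12, {"sexually explicit", "nudity", "violence", "hate speech"}),
        (16, {"sexually explicit", "nudity"}),
    ]
    ADULT_BLOCKED = set()  # unfiltered (or minimally filtered) access at age 17 and older

    def blocked_categories(patron_age):
        """Return the set of content categories blocked for a patron of the given age."""
        for max_age, blocked in AGE_TIERS:
            if patron_age <= max_age:
                return blocked
        return ADULT_BLOCKED

    if __name__ == "__main__":
        for age in (5, 10, 15, 40):
            print(age, sorted(blocked_categories(age)))

Under this sketch, a library could select the tier from the age recorded on the patron’s library card at login, applying the least restrictive tier when an adult requests unfiltered access.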

Third, additional social science research is needed on the potential harm of sexually explicit images to minors. The CIPA presumes that sexually explicit images would be harmful to minors. Although researchers have stated that it would be unethical to conduct experiments that would expose minors to sexually explicit videos, researchers could identify a group of minors who had already been exposed to sexually explicit images and use a longitudinal study or case studies to try to determine the impact of those images. In addition, in an effort to measure the effects of sexually explicit videos on minors, researchers could seek parental permission to show minors sexual videos, just as researchers seek parental permission to expose minors to violent videos.

Fourth, research is lacking on the capability of current filtering software to correctly identify images, which is the only type of content covered under the Children’s Internet

Protection Act. Social science researchers could conduct quantitative and qualitative content analyses of the most common filtering programs used in public libraries to determine their effectiveness at preventing access to pornographic images and at allowing access to nonpornographic images.
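One simple way to summarize such a content analysis, sketched below in Python with hypothetical sample data, would be to report an overblocking rate (the share of nonpornographic images the filter wrongly blocked) and an underblocking rate (the share of pornographic images the filter wrongly allowed) for each program tested; the function and data are assumptions for illustration, not findings from any actual filter study.

    # Illustrative sketch only: summarizing a hypothetical sample of coded images,
    # where each item records whether the image was pornographic and whether the
    # filter blocked it. The sample data are invented for illustration.

    def filter_error_rates(results):
        """results: list of (is_pornographic, was_blocked) pairs for sampled images.

        Returns (overblocking_rate, underblocking_rate): the share of
        nonpornographic images blocked, and the share of pornographic
        images allowed through.
        """
        nonporn_blocked = [blocked for is_porn, blocked in results if not is_porn]
        porn_blocked = [blocked for is_porn, blocked in results if is_porn]
        overblocking = sum(nonporn_blocked) / len(nonporn_blocked) if nonporn_blocked else 0.0
        underblocking = (sum(1 for b in porn_blocked if not b) / len(porn_blocked)
                         if porn_blocked else 0.0)
        return overblocking, underblocking

    if __name__ == "__main__":
        sample = [(True, True), (True, False), (False, False), (False, True), (False, False)]
        over, under = filter_error_rates(sample)
        print("Overblocking rate: {:.0%}; underblocking rate: {:.0%}".format(over, under))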


Fifth, a study of the CIPA’s effect on public libraries and librarians would be useful.

Social science researchers could conduct a national random survey of librarians to determine if their libraries use filters, how they are using filters, and what they are telling patrons about the use of filters. Researchers also could ask about any self-perceived changes in their role as librarians after the enactment of the CIPA. In the same survey, researchers could collect data to help them evaluate any changes in public libraries’ participation in the E-rate and Library

Services and Technology Act programs (and reasons for the changes), librarians’ compliance with the CIPA, and librarians’ experiences with Internet filters and disabling requests.

Conclusion

The Children’s Internet Protection Act requires public libraries to install filtering technology on all computers connected to the Internet in order to receive federal funding. The mission and role of the public library is to provide patrons with access to a broad array of information, not to withhold information, which occurs with mandatory filtering. Because a great deal of information is available online, Internet access supports the public library’s role in making information accessible to patrons, albeit in a different format than printed resources, while the CIPA hinders the library’s role as information provider.

The First Amendment right to receive information doctrine is directly applicable to

Internet access in public libraries because the library is a place to pursue knowledge. As First

Amendment scholar Thomas Emerson said, there is a First Amendment right to receive and obtain communication, independent of or supplemental to the right of the speaker to communicate information.173 The public library should be a place for research and exploration, where patrons can access a wide variety of information and viewpoints. The right to read, listen

173 See Emerson, supra note 26, at 2.


and see deserves full First Amendment protection. The Children’s Internet Protection Act,

however, is a move away from the right to receive ideas and information that the Supreme Court

and lower federal courts recognized in earlier library decisions.

The right to receive ideas and information doctrine supports the marketplace of ideas,

self-fulfillment, self-government and checking value theories often mentioned as pillars of the

First Amendment. A public library’s collection inherently serves as a marketplace of ideas. The library is one place where visitors should be able to choose exposure to as many ideas as possible, including political and social ideas, in their search for knowledge and truth. In the search for knowledge, library patrons should not be faced with obstacles, such as filtered Internet access. The marketplace and the search for truth suffer when ideas are stifled. Moreover, without a right to receive information and ideas, the role of the speaker becomes meaningless, as do the self-fulfillment, marketplace, self-government and checking values of the First Amendment.

The First Amendment self-fulfillment theory also supports online access in public libraries. Patrons, in using the Internet, can find information that leads to self-fulfillment and autonomy. First Amendment scholar C. Edwin Baker has stated that the “solitary uses of speech contribute to self-fulfillment.”174 Such solitary use, in acquiring knowledge and information, is

the major function of a public library and Internet access in the public library. The CIPA

impedes the public’s pursuit of self-fulfillment.

The current First Amendment public forum doctrine does not seem to fit Internet access

in public libraries as it is inadequate for new and emerging technologies. A new category of

public fora, such as an “inherently” public forum or a “metaphysical” public forum, needs to be

174 C. Edwin Baker, Scope of the First Amendment Freedom of Speech, 25 UCLA L. REV. 964, 995-96 (1978).


developed for the Internet. The metaphysical public forum would be equivalent to the traditional public forum because the Internet has replaced streets, parks and sidewalks for the free exchange of ideas between speakers and listeners.

In a metaphysical public forum, any content-based regulation would be subject to strict scrutiny. The government would need to show that any content-based regulation, such as Internet filtering, would be narrowly tailored to meet the compelling interest of protecting minors from harm. The government would face two obstacles. First, social science studies on the effects of pornography have been inconclusive. Second, there are less restrictive ways to prevent minors from accessing sexually explicit material online. For example, libraries could offer both filtered and unfiltered computers. However, if minors were restricted to using filtered computers, the parental role in childrearing—by choosing whether their children should or should not have open access to the Internet—would be usurped.

In the current paradigm, where the Supreme Court has held that Internet access in a public library is not a traditional or designated public forum, the Children’s Internet Protection Act cannot do what Congress expected of it. First, filtering technology is not sophisticated enough to block sexually explicit images without blocking a great deal of constitutionally protected speech.

Second, because filtering technology is proprietary, there is no way to determine if filtering software uses the legal terminology of obscenity, child pornography, and harmful to minors, the language contained in the CIPA. Typically, filtering products block categories of content based on such factors as nudity, sex, violence, and hate speech.175 Third, even if filtering software companies tried to incorporate the three categories of content contained in the CIPA, the

175 For a discussion of filtering technology, see Chapter 3.


software developers are not trained in law and therefore would not be precise in applying the

definitions to the software programs.

Instead of mandating Internet filtering in public libraries, Congress could have enacted a

statute that gave public libraries more flexibility. Local libraries then would have been able to

develop local solutions to local problems, such as providing filtered computers for children and

unfiltered computers for adults, setting up an adult computer room with unfiltered computers, or

allowing parents to decide whether their children should use filtered or unfiltered computers.

The Supreme Court, in deciding to uphold the Children’s Internet Protection Act, seemed to ignore the role of the public library, the parental role in childrearing, and the flaws in filtering technology. The Court could have reaffirmed the right to receive information in public libraries, but it did not address the issue. The Court could have assessed how to balance the competing interests of protecting minors from harm, upholding the rights of adults to view constitutionally protected speech, and supporting the rights of parents to raise their children however they saw fit, but the Court did not do so. The Court also could have discussed how lower courts could determine what constitutes “harm to minors,” which it did not do.

Congress and the courts need to address the definitional issues in the context of the

Internet because they will not go away unless more sophisticated filtering technology is developed. The development of filtering technology that can apply the CIPA’s legal definitions (visual images that are obscene, child pornography, or harmful to minors) to online content is unlikely, however. First, obscenity is defined by local community standards, and filtering software designers cannot incorporate local standards into products that are distributed nationally. Second, actual child pornography, which is not protected, is difficult to distinguish from virtual child pornography,176

176 Virtual child pornography uses computer-generated actors that look like children.


which is protected. Filters would likely do a worse job than humans in trying to distinguish the two. Third, the definition of “harm to minors” is open to interpretation and difficult to apply to specific circumstances, especially in “borderline” instances.

By focusing on the congressional spending power in deciding the Children’s Internet

Protection Act, the Supreme Court missed the opportunity to clarify existing law. The Court also failed to allow public libraries come up with their own solutions that could be better geared to solving problems with Internet access on the local level. In upholding the Children’s Internet

Protection Act of 2000, the Supreme Court did not address the disconnect between law and technology.


LIST OF REFERENCES

Government Statutes & Reports

18 U.S.C. § 2256(8).

20 U.S.C. § 9134(f).

47 U.S.C. § 254(h).

S. 1482, 105th Cong. (2d Sess. 1997).

H.R. 3177, 105th Cong. (2d Sess. 1998).

S.1619, 105th Cong. (2d Sess. 1998).

H.R. 368, 106th Cong. (1st Sess. 1999).

H.R. 896, 106th Cong. (1st Sess. 1999).

H.R. 1501, 106th Cong. (1st Sess. 1999).

S.1545, 106th Cong. (1st Sess. 1999).

H.R. 4545, 106th Cong. (2d Sess. 2000).

H.R. 4600, 106th Cong. (2d Sess. 2000).

H.R. 4577, 106th Cong. (2d Sess. 2000).

H.R. 5666, 106th Cong. (2d Sess. 2000).

U.S. CONST. amend. I.

U.S. CONST. amend. XXI, § 2.

U.S. CONST. art. I, § 8, cl. 1.

141 CONG. REC. 15503 (June 9, 1995).

144 CONG. REC. H9906 (1998).

144 CONG. REC. S519 (Feb. 9, 1998) (statement of Sen. McCain).

144 CONG. REC. E362 (Mar. 12, 1998) (statement of Rep. Markey).

144 CONG. REC. S8162 (July 15, 1998).


144 CONG. REC. S8614 (July 21, 1998) (statement of Sen. Burns).

145 CONG. REC. H4537 (June 17, 1999) (statement of Rep. Franks).

146 CONG. REC. S5843 (June 27, 2000) (statement of Sen. Santorum).

146 CONG. REC. S5866 (June 27, 2000) (statement of Sen. McCain).

Appropriations for Commerce, Justice, State, & Judiciary FY 1999, S.2260, 105th Cong. (2d Sess. 1998).

Cable Television Consumer Protection & Competition Act of 1992, Pub. L. No. 102-385 §§ 9, 10(a), (b), 106 Stat. 1484, 1486, §§ 10(a), 10(b), and 10(c) (codified as amended at 47 U.S.C. § 532(h), 532(j)).

Child Labor Tax Act, 65 Pub. L. No. 254, 40 Stat. 1057, 1138 (1919).

Child Online Protection Act, Pub. L. No. 105-277, Div C, Title XIV, § 1405, 112 Stat. 2681, (enacting H.R. 4328 and codified at 47 U.S.C. § 231 (1998)).

Child Protection Act of 1998 (the Istook Amendment) to FY99 Labor, Health & Human Servs., & Educ. Appropriations Bill, Title VI of H.R. 4274, 105th Cong. (2d Sess. 1998).

Child Protection Act of 1999, H.R. 2560, 106th Cong. (1st Sess. 1999).

Childrens’ [sic] Internet Protection Act, H.R. 543, 106th Cong. (1st Sess. 1999)

Childrens’ [sic] Internet Protection Act, H.R. 896, 106th Cong., (1st Sess. 1999).

Childrens’ [sic] Internet Protection Act, S.97, 106th Cong. (1st Sess. 1999).

Children’s Internet Protection Act of 1999, H. Amdt. 212/Title XIV/Sec. 1401 of Juvenile Justice Reform Act of 1999, H.R. 1501, 106th Cong. (1st Sess. 1999).

Children’s Internet Protection Act, Rep. of the S. Comm. on Commerce, Science & Transportation on S.97, S. REP. NO. 106-141, 106th Cong. (1st Sess. 1999).

Children’s Internet Protection Act, H.R. 4600, 106th Cong. (2d Sess. 2000).

Children’s Internet Protection Act, S.Amdt. No. 3610 to H.R.4577, Consol. Appropriations Act, 2001, 106th Cong. (2d Sess. 2000).

Children’s Internet Protection Act, Pub. L. No. 106-554 (codified at 20 U.S.C. § 9134(f) and 47 U.S.C. § 254(h)).

Commc’ns Decency Act, 47 U.S.C. §§ 223, 230, 231 & 561 (1996).


Dep’t of Labor, Health & Human Servs. & Educ. Appropriations Act, 2001, H.R. 4577, 106 Cong. (2d Sess. 2000), Pub. L. No. 106-554, 114 Stat. 2763, 2763A-335 (2000) (codified as amended at 20 U.S.C. 9134(f) and 47 U.S.C. 254(h)).

E-Rate Policy & Child Protection Act of 1998, H.R. 3442, 105th Cong. (2d Sess. 1998).

Fair Labor Standards Act, 29 U.S.C. § 212 (1938).

Fed. Child Labor Law of 1916, Pub. L. No. 249, 39 Stat. 675 (ch. 432) (1916).

H.R. CONF. REP. NO. 104-458 (1996).

H.R. REP. NO. 105-775 (1998).

H.R. REP. NO. 106-4577 (2000).

Juvenile Offenders Act, H.R. 1501, 106th Cong. (1999).

Internet Indecency: Hearing Before the S. Comm. on Commerce, Science, & Transportation, 105th Cong. 910 (Feb. 10, 1998).

Internet Filtering, Amendment No. 3228 to Departments of Commerce, Justice, & State, the Judiciary, & Related Agencies Appropriations Act, 1999, S. 2260, 105th Cong. (2d Sess. 1998).

Internet Minors Protection & Cyberspace Tech. Act, H.R. 4545, 106th Cong. (2d Sess. 2000).

Internet School Filtering Act, S. 1619, 105th Cong. (2d Sess. 1998).

Legislative Proposals to Protect Children from Inappropriate Materials on the Internet: Hearing before the Subcomm. on Telecomm., Trade, & Consumer Protection of the H. Comm. on Commerce, 105th Cong. 119 (Sept. 11, 1998).

Library Servs. & Tech. Act, 20 U.S.C. § 9121, 9134 (1996).

Misc. Appropriation Act, H.R. 5666, 106th Cong. (2d Sess. 2000).

Neighborhood Children’s Internet Protection Act, S.1545, 106th Cong. (1st Sess. 1999).

Neighborhood Children’s Internet Protection Act, S.Amdt. No. 3610 to H.R. 4577, Consol. Appropriations Act, 2001, 106th Cong. (2d Sess. June 22, 2000).

Neighborhood Children’s Internet Protection Act, S. Amdt. No. 3635 to H.R. 4577, Consol. Appropriations Act, 2001, 106th Cong. (2d Sess. June 27, 2000).

Omnibus Appropriations Act, 105 Pub. L. No. 277, 112 Stat. 2681 (enacting H.R. 4328, 105th Cong. (1998)).


Safe Schools Internet Act of 1998, H.R. 3177, 105th Cong. (2d Sess. 1998).

Safe Schools Internet Act of 1999, H.R. 368, 106th Cong. (1st Sess. 1999).

S. CONF. REP. NO. 104-230 (1996).

S. REP. NO. 105-226 (1998).

S. REP. NO. 106-603 (1999).

S. REP. NO. 106-141 (1999).

Statement on Signing the Consol. Appropriations Act, FY 2001 52 WEEKLY COMP. PRES. DOC. 3171-72 (Dec. 29, 2000).

Telecomm. Act of 1996, 47 U.S.C. § 254 (1996).

Title XVI, Sec. 1604, of the Violent & Repeat Juvenile Offender Accountability & Rehabilitation Act of 1999, S. 254, 106th Cong. (1st Sess. 1999).

Violent & Repeat Juvenile Offender Accountability & Rehabilitation Act of 1999, S. 254, 106th Cong. (1st Sess. 1999).

Title XVII, Children’s Internet Protection, H.R. 4577, 106th Cong. (2000).

Regulatory Agency Materials

13 F.C.C. 1246 (1949).

Personal Attacks and Political Editorials, 8 FCC 2d at 722, 725 (1967).

Report Concerning General Fairness Doctrine Obligations of Broadcast Licensees, 102 F.C.C.2d 143, 146 (1985).

2 FCC Rcd 5043, 5057 (1987).

Fed. Commc’ns Comm’n, Frequently Asked Questions on Universal Service and the Snowe-Rockefeller Amendment (released July 2, 1997), http://www.fcc.gov/learnnet/ (last visited July 20, 2009).

Fed. Commc’ns Comm’n, E-rate, http://www.fcc.gov/learnnet/ (last visited July 20, 2009).

Fed. Commc’ns Comm’n, FCC Consumer Facts on Children’s Internet Protection Act, Sept. 17, 2003, http://www.fcc.gov/cgb/consumerfacts/cipa.html (last visited July 20, 2009).


Letters

Letter from L. Anthony Sutin, Acting Assistant Attorney General, to Thomas Bliley, Chairman of the Comm. on Commerce, U.S. H.R. (Oct. 5, 1998), available at http://www.eff.org/legal/cases/ACLU_v_Reno_II/HTML/19981005_doj_congress.letter. html.

Opinion Letter from Henry Cohen, Legislative Attorney, Congressional Research Service, to Rep. Ernest J. Istook, Jr., U.S. House of Representatives (June 29, 1998), available at http://www.techlawjournal.com/congress/blocking/80629crs.htm.

Memorandum from Henry Cohen, Legislative Attorney, Congressional Research Service, to Rep. Ernest J. Istook, Jr., U.S. House of Representatives, Re: Constitutionality of Blocking URLs Containing Obscenity and Child Pornography (June 7, 1999), reprinted in 145 CONG. REC. E1602-03 (July 20, 1999), 106th Cong. (1st Sess. 1999).

Principal Cases

Abrams v. United States, 250 U.S. 616 (1919).

Action for Children’s Television v. FCC, 852 F. 2d 1332 (D.C. Cir. 1988).

Action for Children’s Television v. FCC, 932 F. 2d 1504 (D.C. Cir. 1991).

Action for Children’s Television v. FCC, 11 F.3d 170 (D.C. Cir. 1993).

Action for Children’s Television v. FCC, 58 F.3d 654 (D.C. Cir. 1995).

Adderley v. Florida, 385 U.S. 39 (1966).

Am. Civil Liberties Union v. Ashcroft, 322 F.3d 240 (3d Cir. 2003).

Am. Civil Liberties Union v. Gonzales, No. 98-05591 at 114, 155 (E.D. Pa. Nov. 8, 2006) (transcript from non-jury trial), available at http://www.aclu.org/pdfs/freespeech/copatranscript_20061108.pdf.

Am. Civil Liberties Union v. Gonzales, 478 F. Supp. 2d 775 (E.D. Pa. 2007).

Am. Civil Liberties Union v. Reno, 1996 U.S. Dist. LEXIS 1617 at *1; 24 Media L. Rep. 1379 (E.D. Pa. 1996).

Am. Civil Liberties Union v. Reno, 929 F. Supp. 824 (E.D. Pa. 1996).

Am. Civil Liberties Union v. Reno, 1998 U.S. Dist. LEXIS 18546 at *1; 27 Media L. Rep. 1026 (E.D. Pa. 1998).

Am. Civil Liberties Union v. Reno, 31 F. Supp. 2d 473 (E.D. Pa. 1999).


Am. Civil Liberties Union v. Reno, 217 F.3d 162, 166 (3d Cir. 2000).

Am. Library Ass’n v. United States, 201 F. Supp. 2d 401 (E.D. Pa. 2002).

Ark. Educ. Television Comm’n v. Forbes, 523 U.S. 666 (1998).

Ark. Writers’ Project v. Ragland, 481 U.S. 221 (1987).

Ashcroft v. Am. Civil Liberties Union, 535 U.S. 564 (2002).

Ashcroft v. Am. Civil Liberties Union, 542 U.S. 656 (2004).

Ashcroft v. Free Speech Coal., 535 U.S. 234 (2002).

Bailey v. Drexel Furniture Co., 259 U.S. 20 (1922).

Bellotti v. Baird, 443 U.S. 622 (1979).

Bd. of Educ. v. Pico, 457 U.S. 853 (1982).

Brown v. Louisiana, 383 U.S. 131 (1966).

Butler v. Michigan, 352 U.S. 380 (1957).

Cantwell v. Connecticut, 310 U.S. 296 (1940).

Carey v. Brown, 447 U.S. 455 (1980).

City of Chicago v. Morales, 527 U.S. 41 (1999).

City of Madison Joint Sch. Dist. v. Wisc. Employment Relations Comm’n, 429 U.S. 167 (1976).

Consol. Edison Co. v. Public Serv. Comm'n, 447 U.S. 530 (1980).

Cornelius v. NAACP Legal Defense & Educ. Fund, Inc., 473 U.S. 788 (1985).

Denver Area Educ’l Telecomms. Consortium, Inc. v. FCC, 518 U.S. 727 (1996).

Erznoznik v. City of Jacksonville, 422 U.S. 205 (1975).

FCC v. League of Women Voters, 468 U.S. 364 (1984).

FCC v. Pacifica Found., 438 U.S. 726 (1978).

First Nat’l Bank of Boston v. Bellotti, 435 U.S. 765 (1978).

Giaccio v. Pennsylvania, 382 U.S. 399 (1966).


Ginsberg v. New York, 390 U.S. 629 (1968).

Grayned v. City of Rockford, 408 U.S. 104 (1972).

Greer v. Spock, 424 U.S. 828 (1976).

Griswold v. Connecticut, 381 U.S. 479 (1965).

Hague v. CIO, 307 U.S. 496 (1939).

Hamling v. United States, 418 U.S. 87 (1974).

Hammer v. Dagenhart, 247 U.S. 251 (1918).

Int’l Soc’y for Krishna Consciousness v. Lee, 505 U.S. 672 (1992).

Jacobellis v. Ohio, 378 U.S. 184 (1964).

Jones v. N.C. Prisoners’ Labor Union, 433 U.S. 119 (1977).

Keyishian v. Bd. of Regents, 385 U.S. 589 (1967).

Kleindienst v. Mandel, 408 U.S. 753 (1972).

Kreimer v. Bureau of Police, 958 F. 2d 1242 (3d Cir. 1992).

Lamont v. Postmaster Gen., 381 U.S. 301 (1965).

Legal Servs. Corp. v. Velazquez, 531 U.S. 533 (2001).

Mainstream Loudoun v. Bd. of Trustees, 2 F. Supp. 2d 783 (E.D. Va. 1998).

Mainstream Loudoun v. Bd. of Trustees of Loudoun County, 24 F. Supp. 2d. 552 (E.D. Va. 1998).

Martin v. Struthers, 391 U.S. 141 (1943).

Memoirs v. Massachusetts, 383 U.S. 413 (1966).

Meyer v. Nebraska, 262 U.S. 390 (1923).

Miami Herald v. Tornillo, 418 U.S. 241 (1974).

Miller v. California, 413 U.S. 15 (1973).

Nat’l Endowment for the Arts v. Finley, 524 U.S. 569 (1998).

New York v. Ferber, 458 U.S. 747 (1982).


Noble State Bank v. Haskell, 219 U.S. 104 (1911).

Pacific Gas & Elec. Co. v. Pub. Utils. Comm’n of Calif., 475 U.S. 1 (1986).

Perry Educ. Ass’n v. Perry Local Educators’ Ass’n, 460 U.S. 37 (1983).

Pierce v. Soc’y of Sisters, 268 U.S. 510 (1925).

Prince v. Massachusetts, 321 U.S. 158 (1944).

Radio-Television News Directors Ass’n v. FCC, 229 F.3d 269 (D.C. Cir. 2000).

Red Lion Broad. v. FCC, 395 U.S. 367 (1969).

Regina v. Hicklin, L. R. 3 Q. B. 360 (1868).

Reno v. Am. Civil Liberties Union, 521 U.S. 844 (1997).

Rosenberger v. Rector, 515 U.S. 819 (1995).

Roth v. United States, 354 U.S. 476 (1957).

Rust v. Sullivan, 500 U.S. 173 (1991).

Sable Commc’ns of Calif., Inc. v. FCC, 492 U.S. 115 (1989).

Schneider v. State, 308 U.S. 147 (1939).

South Dakota v. Dole, 483 U.S. 203 (1987).

Se. Promotions, Ltd. v. Conrad, 420 U.S. 546 (1975).

Stanley v. Georgia, 394 U.S. 557 (1969).

Stratton Oakmont v. Prodigy, 1995 N.Y. Misc. LEXIS 229 *1; 23 Media L. Rep. 1794 (N.Y. App. Div. May 24, 1995).

Syracuse Peace Council v. FCC, 867 F. 2d 654 (D.C. Cir. 1989).

Thomas v. Collins, 323 U.S. 516 (1945).

Thornhill v. Alabama, 310 U.S. 88 (1940).

Tinker v. Des Moines Indep. Cmty. Sch. Dist., 393 U.S. 503 (1969).

Turner Broad. Sys. v. FCC, 512 U.S. 622 (1994).

Turner Broad. Sys. v. FCC, 520 U.S. 180 (1997).


United States v. Am. Library Ass’n, 539 U.S. 194 (2003).

United States v. Darby, 312 U.S. 100 (1941).

United States v. Kennerley, 209 F. 119 (S.D.N.Y. 1913).

United States v. Playboy Entm’t Group, Inc., 529 U.S. 803 (2000).

Va. State Bd. of Pharmacy v. Va. Citizens Consumer Council, 425 U.S. 748 (1976).

Whitney v. California, 274 U.S. 357 (1927).

Widmar v. Vincent, 454 U.S. 263 (1981).

Wieman v. Updegraff, 344 U.S. 183 (1952).

Secondary References

Allan, David W., Socrates and Democracy (July 29, 2001), http://www.allanstime.com/Government/socrates_democracy.htm (last visited July 20, 2009).

Allen, Mike, et al., A Meta-Analysis Summarizing the Effects of Pornography II: Aggression After Exposure, 22 HUMAN COMMC’N RES. 258 (1995).

Altschul, Kimberly A., Note, The Viewing Gallery of the House of Representatives: A First Amendment Public Forum?, 76 B.U. L. REV. 705 (1996).

Press Release, Am. Civil Liberties Union, ACLU Files Challenge to Library Internet Censorship in Case Fast-Tracked for Supreme Court Review (Mar. 20, 2001), available at http://www.aclu.org/privacy/speech/15597prs20010320.html?ht=internet%20protection% 20internet%20protection.

American Civil Liberties Union, Censorship in a Box: Why Blocking Software Is Wrong for Public Libraries (Sept. 16, 2002), http://www.aclu.org/privacy/speech/14915pub20020916.html (last visited July 20, 2009).

Press Release, Am. Civil Liberties Union, Library Internet Access is Still Free From Censorship as Law Goes into Effect, ACLU Tells Libraries, Patrons, (April 19, 2001), available at http://www.aclu.org//privacy/speech/15417prs20010419.html and http://www.aclu.org/Privacy/Privacy.cfm?ID=7224&c=252.

AM. LIBRARY ASS’N, ACCESS FOR CHILDREN AND YOUNG ADULTS TO NONPRINT MATERIALS: AN INTERPRETATION OF THE LIBRARY BILL OF RIGHTS, available at http://www.ala.org/Template.cfm?Section=interpretations&Template=/ContentManagement/ContentDisplay.cfm&ContentID=31872.


AM. LIBRARY ASS’N, ACCESS TO ELECTRONIC INFORMATION, SERVICES, AND NETWORKS: AN INTERPRETATION OF THE LIBRARY BILL OF RIGHTS, available at http://www.ala.org/ala/aboutala/offices/oif/statementspols/statementsif/interpretations/accesselectronic.cfm.

AM. LIBRARY ASS’N, FREE ACCESS TO LIBRARIES FOR MINORS: AN INTERPRETATION OF THE LIBRARY BILL OF RIGHTS, available at http://www.ala.org/Template.cfm?Section=interpretations&Template=/ContentManagement/ContentDisplay.cfm&ContentID=8639.

AM. LIBRARY ASS’N, INTERPRETATIONS OF THE LIBRARY BILL OF RIGHTS, available at http://www.ala.org/Template.cfm?Section=interpretations&Template=/ContentManagement/ContentDisplay.cfm&ContentID=31872.

Am. Library Ass’n, Best Practices in Public Libraries, http://www.ala.org/ala/shadows/pla/resources/bestpractices.cfm (last visited July 20, 2009).

Am. Library Ass’n, CIPA, http://www.ala.org/ala/aboutala/governance/annualreport/annualreport/annualreportarch/report2002/freedom.cfm (last visited July 20, 2009).

Am. Library Ass’n, The Children’s Internet Protection Act (CIPA), http://www.ala.org/ala/aboutala/offices/wo/woissues/civilliberties/cipaweb/legalhistory/legalhistory.cfm (last visited July 20, 2009).

Am. Library Ass’n, Filters and Filtering, http://www.ala.org/ala/aboutala/offices/oif/ifissues/filtersfiltering.cfm (last visited July 20, 2009).

AM. LIBRARY ASS’N, FREEDOM TO READ STATEMENT, available at http://www.ala.org/ala/aboutala/offices/oif/statementspols/ftrstatement/freedomreadstatement.cfm.

Am. Library Ass’n, Intellectual Freedom and Censorship Q & A. http://www.ala.org/ala/aboutala/offices/oif/basics/ifcensorshipqanda.cfm (last visited July 20, 2009).

AM. LIBRARY ASS’N, OFFICE FOR INTELLECTUAL FREEDOM, INTELLECTUAL FREEDOM MANUAL (Chicago: American Library Association, 5th ed. 1996).

AMERICAN LIBRARY ASSOCIATION, THE LIBRARIAN'S GUIDE TO CYBERSPACE FOR PARENTS AND KIDS—DEFINITIONS, http://www.ala.org/ala/alsc/greatwebsites/greatsitesbrochure.pdf.

American Library Ass’n, Libraries & the Internet Toolkit (last updated Dec. 9, 2003), at 10, http://www.ala.org/oif/iftoolkits/internet (last visited July 20, 2009).


AM. LIBRARY ASS’N, LIBRARY BILL OF RIGHTS, available at http://www.ala.org/ala/aboutala/offices/oif/statementspols/statementsif/librarybillrights.cfm and http://www.ala.org/ala/aboutala/offices/oif/statementspols/statementsif/librarybillofrights.pdf.

Press Release, Am. Library Ass’n, New Report Shows More Libraries Connect to the Internet; Access Still Limited (Nov. 17, 1998), available at http://bubl.ac.uk/archive/journals/alawon/v07n149.htm.

AM. LIBRARY ASS’N, RESOLUTION ON THE USE OF FILTERING SOFTWARE IN LIBRARIES, available at http://www.ala.org/Template.cfm?Section=ifresolutions&Template=/ContentManagement/ContentDisplay.cfm&ContentID=13076.


AM. LIBRARY ASS’N, THE STATE OF AMERICAN LIBRARIES (April 2007), available at http://www.ala.org/ala/newspresscenter/mediapresscenter/presskits/stateofamericaslibrariespresskit2007/sal07pk.cfm.

Am. Library Trustee Ass’n, Five Ways ALTA Can Help Your Library, http://www.ala.org/ala/alta/links/PDF5waysaltacanhelpyou.pdf (last visited July 20, 2009).

America’s Library & Trustees Advocates, Ethics Statement for Public Library Trustees, available at http://www.ala.org/ala/alta/links/ethicsstatement.pdf.

Appropriations Bill Mandates Filter, Vol. 13 AM. LIBRARIES, Issue 8, at 14 (Sept. 2000).

ARIÈS, PHILIPPE, CENTURIES OF CHILDHOOD: A SOCIAL HISTORY OF FAMILY LIFE (translated by Robert Baldick) (New York: Knopf, 1962).

Baker, C. Edwin, Scope of the First Amendment Freedom of Speech, 25 UCLA L. REV. 964 (1978).

Balkin, Jack, Comment, Media Filters, the V-Chip, and the Foundations of Broadcast Regulation, 45 DUKE L. J. 1131 (1996).

Barack, Lauren, Filters Impede Learning, 51 SCH. LIBR. J. 24, 24 (2005).

Barak, Azy & William A. Fisher, Effects of Interactive Computer Erotica on Men’s Attitudes and Behavior Toward Women: An Experimental Study, 13 COMPUTERS IN HUMAN BEHAVIOR 353 (1997).


Barak, Azy, William A. Fisher, Sandra Belfry, & Darryl Lashambe, Sex, Guys and Cyberspace: Effects of Internet Pornography and Individual Differences on Men’s Attitudes Toward Women, 11 J. PSYCHOLOGY & HUM. SEXUALITY 63 (1999).

BARAN, STANLEY & DENNIS DAVIS, MASS COMMUNICATION THEORY: FOUNDATIONS, FERMENT, AND FUTURE (Belmont, Calif.: Wadsworth/Thomson, 2000).

Becker, J. & R.M. Stein, Is Sexual Erotica Associated With Sexual Deviance in Adolescent Males? 14 INT’L J. L. & POL’Y 85 (1991).

Bell, Bernard W., Filth, Filtering, and the First Amendment: Ruminations on Public Libraries’ Use of Internet Filtering Software, 53 FED. COMMC’NS L. J. 191 (2001).

Benedek, Elissa P., & Catherine F. Brown, No Excuses: Televised Pornography Harms Children, 7 HARV. REV. OF PSYCHIATRY 236 (1999).

BERGER, ARTHUR, ESSENTIALS OF MASS COMMUNICATION THEORY (Thousand Oaks, Calif.: Sage Publications, 1995).

Berninghausen, David K., The History of the ALA Intellectual Freedom Committee, 27 WILSON LIBRARY BULLETIN 813 (1953).

Berners-Lee, Tim, Information Management: A Proposal, http://www.w3.org/History/1989/proposal.html (last visited July 20, 2009).

Bertot, John Carlo & Charles R. McClure, Public Libraries and the Internet 2000: Summary Findings and Data Tables (2000), http://www.eric.ed.gov/ERICDocs/data/ericdocs2sql/content_storage_01/0000019b/80/16/9a/6f.pdf (last visited July 20, 2009).

Bhagwat, Ashutosh, What If I Want My Kids to Watch Pornography?: Protecting Children from “Indecent” Speech, 11 WM & MARY BILL OF RIGHTS J. 671 (2003).

Birnhack, Michael & Jacob Rowbottom, Symposium: Do Children Have the Same First Amendment Rights as Adults?: Shielding Children: The European Way, 79 CHI.-KENT. L. REV. 175 (2004).

BLACK’S LAW DICTIONARY (St. Paul: West, 8th ed. 2004).

Blitz, Marc, Constitutional Safeguards for Silent Experiments in Living: Libraries, the Right to Read, and a First Amendment Theory for an Unaccompanied Right to Receive Information, 74 UMKC L. REV. 799 (2006).

BLUMIN, STUART M., THE EMERGENCE OF THE MIDDLE CLASS: SOCIAL EXPERIENCE IN THE AMERICAN CITY, 1760-1900 (New York: Cambridge University Press, 1989).


Boorstin, Daniel J., The Indivisible Community, in American Library Association, ed., Libraries and the Life of the Mind in America: Addresses Delivered at the Centennial Celebration of the American Library Association. (Chicago: American Library Association, 1977).

THE BOWKER ANNUAL LIBRARY AND BOOK TRADE ALMANAC, Dave Bogart ed., 44th ed. (New Providence, NJ: R.R. Bowker, 1999).

BOYER, PAUL, PURITY IN PRINT: THE VICE-SOCIETY MOVEMENT AND BOOK CENSORSHIP IN AMERICA (New York: Scribner, 1968).

Brown, Michael J., Note & Comment, The Children’s Internet Protection Act: A Denial of a Student’s Opportunity to Learn in a Technology-Rich Environment, 19 GA. ST. U. L. R. 789 (2003).

Bryant, Jennings & Steven Rockwell, Effects of Massive Exposure to Sexually Oriented Prime- Time Television Programming on Adolescents’ Moral Judgment, in MEDIA, CHILDREN, AND THE FAMILY: SOCIAL SCIENTIFIC, PSYCHODYNAMIC, & CLINICAL PERSPECTIVES (Dolf Zillmann & Aletha C. Huston, eds., 1994, Hillsdale, NJ: Lawrence Erlbaum Associates).

BUCKLEY, PETER & DUNCAN CLARK, THE ROUGH GUIDE TO THE INTERNET (New York: Rough Guides, 2007).

Butsch, Richard, A History of Research on Movies, Radio, and Television, 29 J. POPULAR FILM & TELEVISION 112 (2001).

Cabe, Tanessa, Note, Regulation of Speech on the Internet: Fourth Time’s the Charm? 11 MEDIA L. & POL’Y 50 (2002).

Cailliau, Robert, A Little History of the World Wide Web from 1945 to 1995, http://www.w3.org/History.html (last visited July 20, 2009).

Calvert, Clay & Robert D. Richards, Essay: New Millennium, Same Old Speech; Technology Changes, But the First Amendment Issues Don’t, 79 B.Y.U. L. REV. 959 (1999).

Candia, Eileen, Comment, The Information Super Highway – Caution – Road Blocks Ahead: is the Use of Filtering Technology to Prevent Access to “Harmful” Internet Sites Constitutional? 9 TEMPLE POLITICAL & CIVIL RIGHTS L. REV. 85 (1999).

Cannon, Robert, The Legislative History of Senator Exon’s Communication Decency Act: Regulating Barbarians on the Information Superhighway, 40 FED. COMM. L.J. 51 (1996).

Cassidy, Michael, Note, To Surf and Protect: The Children’s Internet Protection Act Polices Material Harmful to Minors and a Whole Lot More, 11 MICH. TELECOMM. TECH. L. REV. 437, 440 (2005).

THE CHANGING FAMILY AND CHILD DEVELOPMENT (Claudio Violato, Elizabeth Oddone-Paolucci & Mark Genuis, eds., 2000, London: Ashgate Publishing).


Check, J.V.P. & T.H. Guloien, Reported Proclivity for Coercive Sex Following Repeated Exposure to Sexually Violent Pornography, Nonviolent Dehumanizing Pornography, and Erotica, in PORNOGRAPHY: RESEARCH ADVANCES AND POLICY CONSIDERATIONS 159-84 (Dolf Zillman & Jennings Bryant, eds., 1989)

Chen, Jim, Mastering Eliot's Paradox: Fostering Cultural Memory in an Age of Illusion and Allusion, 89 MINN. L. REV. 1361 (2005).

Chen, Jim, The Faegre & Benson Symposium: Law, Information and Freedom of Expression: Article: Mastering Eliot's Paradox: Fostering Cultural Memory in an Age of Illusion and Allusion, 89 MINN. L. REV. 1361 (2005).

Chrislip, Jared, Filtering the Internet Like a Smokestack: How the Children's Internet Protection Act Suggests a New Internet Regulation Analogy, 5 J. HIGH TECH. L. 261 (2005).

CHUTE, ADRIENNE, ET AL., U.S. DEP’T OF EDUC., NATIONAL CENTER FOR EDUCATION STATISTICS, PUBLIC LIBRARIES IN THE UNITED STATES: FISCAL YEAR 2001, NCES 2003–399 (2003), available at http://nces.ed.gov/pubs2003/2003399.pdf.

Clement, Priscilla Ferguson, The City and The Child, 1860-1885, in AMERICAN CHILDHOOD: A RESEARCH GUIDE AND HISTORICAL HANDBOOK (Joseph M. Hawes & N. Ray Hiner, eds. 1985 Westport, Conn.: Greenwood Press).

Cline, Victor B., Pornography Effects: Empirical and Clinical Evidence, in MEDIA, CHILDREN, AND THE FAMILY: SOCIAL SCIENTIFIC, PSYCHODYNAMIC, & CLINICAL PERSPECTIVES (Dolf Zillmann & Aletha C. Huston, eds., 1994, Hillsdale, NJ: Lawrence Erlbaum Associates).

Comm’n on Obscenity & Pornography, REPORT OF THE COMMISSION ON OBSCENITY AND PORNOGRAPHY (Washington, DC: U.S. Government Printing Office, 1970).

Commission on Child Online Protection Act, Report to Congress (October 2000), available at http://www.copacommission.org/report/COPAreport.pdf.

Conn, Kathleen, Commentary: Protecting Children from Internet Harm (Again): Will the Children’s Internet Protection Act Survive Judicial Scrutiny?, 153 EDUC. L. REPORTER 469 (2001).

Conspectus – The Association of American Law Schools, Section on Mass Communications Law 1997 Annual Conference Panel: Sex, Violence, Children and the Media: Legal, Historical and Empirical Perspectives, 5 COMMLAW CONSPECTUS 341 (1997).

Corn-Revere, Robert, United States v. American Library Association: A Missed Opportunity for the Supreme Court to Clarify Application of First Amendment Law to Publicly Funded Expressive Institutions, 2003 CATO SUP. CT. REV. 105 (2003).

Covell, Rebecca L., Problems With Government Regulation of the Internet: Adjusting the Court’s Level of First Amendment Scrutiny, 42 ARIZ. L. REV. 777 (2000).

Cravens, Hamilton, Child-Saving in the Age of Professionalism, 1915-1930, in AMERICAN CHILDHOOD: A RESEARCH GUIDE AND HISTORICAL HANDBOOK (Joseph M. Hawes & N. Ray Hiner, eds. 1985, Westport, Conn.: Greenwood Press).

CRAWFORD, WALT, BEING ANALOG (Chicago: Am. Library Ass’n, 1999).

CRAWFORD, WALT & MICHAEL GORMAN, FUTURE LIBRARIES: DREAMS, MADNESS, AND REALITY (Chicago: Am. Library Ass’n, 1995).

Curry, Ann & Ken Haycock, Filtered or Unfiltered?, 47 SCH. LIBRARY J. 42 (2001).

Dale, Maryclaire, 1 Percent of Web Sites Deemed Pornographic, www.msnbc.com/id/15721799 (last visited July 20, 2009).

DELOITTE ENTERPRISE RISK SERVICES, SAFER INTERNET: SYNTHESIS REPORT (2008), available at http://www.sip-bench.org/Reports2008/sip_bench_2008_synthesis_report_en.pdf.

DELOITTE ENTERPRISE RISK SERVICES, SAFER INTERNET: TEST AND SCORING METHODOLOGY (2008), available at http://www.sip-bench.eu/Reports2008/sip_bench_2008_methodology_report_en.pdf.

Digital Chaperones for Kids, CONSUMER REPORTS, March 2001, at 20.

Donnerstein, Edward & Leonard Berkowitz, Victim Reactions in Aggressive Erotic Films as a Factor in Violence Against Women, 41 J. PERSONALITY & SOCIAL PSYCHOLOGY 710 (1981).

Donnerstein, Edward, Daniel Linz & Steven Penrod, THE QUESTION OF PORNOGRAPHY: RESEARCH FINDINGS AND POLICY IMPLICATIONS (New York: Free Press; London: Collier Macmillan, 1987).

Donnerstein, Edward, Pornography: Its Effect on Violence Against Women, in PORNOGRAPHY AND SEXUAL AGGRESSION (N.M. Malamuth and E. Donnerstein, eds. 1984).

EASTON, SUSAN M., THE PROBLEM OF PORNOGRAPHY (New York: Routledge, 1994).

Echelman, Shirley, The Right to Know: The Librarian’s Responsibilities, in THE RIGHT TO INFORMATION: LEGAL QUESTIONS AND POLICY ISSUES (Jana Varlejs, ed., 1984 Jefferson, N.C. & London: McFarland & Co.).

Emerson, Thomas I., Symposium: The First Amendment and the Right to Know – Legal Foundations of the Right to Know, 1976 WASH. U. L. Q. 1 (1976).

EMERSON, THOMAS I., TOWARD A GENERAL THEORY OF THE FIRST AMENDMENT (New York: Random House, 1966).

ENCYCLOPEDIA BRITANNICA, http://www.britannica.com/eb/article-9057375/original-sin (last visited July 20, 2009).

Entertainment Software Rating Board (ESRB), http://www.esrb.org/index-js.jsp (last visited July 20, 2009).

Etzioni, Amitai, Symposium: Do Children Have the Same First Amendment Rights as Adults?: On Protecting Children from Speech, 79 CHI.-KENT. L. REV. 3 (2004).

FCC, FCC Consumer Facts on Children’s Internet Protection Act, Sept. 17, 2003, http://www.fcc.gov/cgb/consumerfacts/cipa.html (last visited July 20, 2009).

FCC, Frequently Asked Questions on Universal Service and the Snowe-Rockefeller Amendment (released July 2, 1997), http://www.fcc.gov/learnnet/ (last visited July 20, 2009).

Filtering Software: Better, But Still Fallible, CONSUMER REPORTS, June 2005, at 36.

FINAL REPORT OF THE ATTORNEY GENERAL'S COMMISSION ON PORNOGRAPHY. (Nashville, Tenn: Rutledge Hill Press, 1986).

Finke, Robert W., Note, The Connecticut Constitution and the Public Forum Analysis, 14 QUINNIPIAC L. REV. 105 (1994).

Finsness, Lisa Schneider, The Implication of Internet Content Filters in Secondary Schools (May 2008) (unpublished Ph.D. dissertation, University of Minnesota) (on file with author).

Fisher, W.A. & G. Grenier, Violent Pornography, Antiwoman Thoughts, and Anti Woman Acts: In Search of Reliable Effects. 31 J. SEX RESEARCH (1994).

Forslund, Anna, Protecting America’s Youth Online: A Legal and Ethical Analysis (December 2007) (unpublished master’s thesis, Southern Illinois University at Carbondale) (on file with author).

Freedom Forum, What is the Public Forum Doctrine?, http://www.freedomforum.org/templates/document.asp?documentID=13878 (last visited July 20, 2009).

GAGNON, JOHN G., HUMAN SEXUALITIES (Glenview, Ill.: Scott, Foresman, 1977).

Garry, Patrick, The Flip Side of the First Amendment: A Right to Filter, 2004 MICH. ST. L. REV. 57, 68 (2004).

GATES, JEAN K., INTRODUCTION TO LIBRARIANSHIP (New York: Neal-Schuman, 1990).

GELLER, EVELYN, FORBIDDEN BOOKS IN AMERICAN PUBLIC LIBRARIES, 1876-1939 (Westport, Conn: Greenwood Press, 1984).

GetNetWise, Tools for Families (2008), http://kids.getnetwise.org/tools/blocksex (last visited July 20, 2009).

Gey, Steven, Reopening the Public Forum—From Sidewalks to Cyberspace, 58 OHIO ST. L.J. 1535 (1998).

Giglio, Ernest, Pornography in Denmark: A Public Policy Model for the United States, 8 COMP. SOC. RESEARCH 281 (1985).

Glasser, Ira, The Internet and the Law: The Struggle for a New Paradigm: Protecting Free Speech and Privacy in the Virtual World of Cyberspace, 23 NOVA L. REV. 625, 647 (1999).

Goldstein, Mitchell, Congress And The Courts Battle Over The First Amendment: Can The Law Really Protect Children From Pornography On The Internet? 21 J. MARSHALL J. COMPUTER & INFO. L. 141 (2003).

Gonzalez, Otilio, Regulating Objectionable Content in Multimedia Platforms: Will Convergence Require a Balance of Responsibilities Between Senders and Receivers?, 20 SANTA CLARA COMPUTER & HIGH TECH. L.J. 609 (2004).

Goodson, Patricia, Deborah McCormick & Alexandra Evans, Searching for Sexually Explicit Materials on the Internet: An Exploratory Study of College Students’ Behaviors and Attitudes, 30 ARCHIVES OF SEXUAL BEHAVIOR 101 (2001).

GORMAN, MICHAEL, OUR ENDURING VALUES (Chicago: American Library Association, 2000).

GREENLEAF, BARBARA, CHILDREN THROUGH THE AGES: A HISTORY OF CHILDHOOD (New York: McGraw-Hill, 1978).

Gunther, Albert C., Overrating the X-Rating: The Third Person Perception and Support for Censorship of Pornography, 45 J. COMMC’N 27 (1995).

Hammami, Mohammed, Youssef Chahir & Liming Chen, WebGuard: A Web Filtering Engine Combining Textual, Structural, and Visual Content-Based Analysis, 18 IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING 272 (2006).

Harper III, A. John, Traditional Free-Speech Law: Does It Apply on the Internet? 6 COMP. L. REV. & TECH. J. 265 (2002).

Harris, Richard Jackson, The Impact of Sexually Explicit Media, in MEDIA EFFECTS 247 (Jennings Bryant & Dolf Zillmann, eds., 1994, Hillsdale, NJ: Lawrence Erlbaum).

Haselton, Bennett, BAIR "Image filtering" Has 0% Accuracy Rate, available at http://www.copacommission.org/papers.

Hefner, Marion D., Note, “Roast Pigs” and Miller-Light: Variable Obscenity in the Nineties, 1996 U. ILL. L. REV. 843 (1996).

HEINS, MARJORIE, NOT IN FRONT OF THE CHILDREN: INDECENCY, CENSORSHIP, AND THE INNOCENCE OF YOUTH (New York: Hill & Wang, 2001).

HEINS, MARJORIE, ET AL., BRENNAN CENTER FOR JUSTICE AT N.Y. UNIV. SCHOOL OF LAW, INTERNET FILTERS: A PUBLIC POLICY REPORT (2d ed., 2006).

Helper, Kim R., Note, Kreimer v. Bureau of Police for Morristown: The Sterilization Of The Local Library: Kreimer v. Bureau of Police for Morristown, 958 F.2d 1242 (3d Cir. 1992), 23 STETSON L. REV. 521 (1994).

HEMMER, JOSEPH J., JR. THE FIRST AMENDMENT (Cresskill, New Jersey: 2006).

History and Development of Filters, 40 LIBRARY TECH. REPORTS 8 (March/April 2004).

Holland, Abigail K., Case Comment, Constitutional Law - Constitutionality of Mandatory Filters on Federally Funded Internet Access in Public Libraries - United States v. American Library Association, Inc., 539 U.S. 194 (2003), 38 SUFFOLK U. L. REV. 217 (2004).

Horowitz, Adam, The Constitutionality of the Children’s Internet Protection Act, 13 ST. THOMAS L. REV. 425 (2000).

How Many Web Sites Exist? (Feb. 15, 2007), http://www.boutell.com/newfaq/misc/sizeofweb.html (last visited July 20, 2009).

Hunt, Lynn, Introduction: Obscenity and the Origins of Modernity: 1500-1800, in THE INVENTION OF PORNOGRAPHY (Lynn Hunt, ed., 1996, New York: Zone Books).

Huston, Aletha C., Ellen Wartella, & Edward Donnerstein, Measuring the Effects of Sexual Content in the Media: A Report to the Kaiser Family Foundation (May 2003), http://www.kff.org/entmedia/1389-content.cfm and http://www.kff.org/entmedia/loader.cfm?url=/commonspot/security/getfile.cfm&PageID =14624 (last visited July 20, 2009).

Internet Content Rating Association, About ICRA, available at http://www.fosi.org/icra/#glance (last visited July 20, 2009).

THE INVENTION OF PORNOGRAPHY (Lynn Hunt, ed., 1996, New York: Zone Books).

Jacobson, Aaron, Note, United States v. American Library Association: Software Filters, Free Speech, and the Shrinking Public Forum, 38 U.C. DAVIS L. REV. 1345 (2005).

Jacobson, Peter, Legislative Update: The Child Online Protection Act: Taming the World “Wild” Web, 9 J. ART & ENT’MT L. 421 (1999).

Jaeger, Paul & Charles McClure, Potential Legal Challenges to the Application of the Children’s Internet Protection Act (CIPA) in Public Libraries: Strategies and Issues, FIRST MONDAY (non-paginated online publication) (2004), http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/issue/view/167 and http://firstmonday.org/issues/issue9_2/jaeger/index.html (last visited July 20, 2009).

Jaeger, Paul, Charles McClure, John Bertot & Lesley Langa, CIPA: Decisions, Implementation, and Impacts, 44 PUB. LIBRARIES 105 (2005).

Jaeger, Paul & Zheng Yang, One Law with Two Outcomes: Comparing the Implementation of CIPA in Public Libraries and Schools, 28 INFO. TECH. AND LIBRARIES 6 (March 2009).

Kaiser, Whitney A., The Use of Internet Filters in Public Schools: Double Click on the Constitution, 31 COLUMBIA J. L. & SOC. PROBLEMS 49 (2000).

Keller, Kimberly S., Comment, From Little Acorns Great Oaks Grow; The Constitutionality of Protecting Minors From Harmful Internet Material in Public Libraries, 30 ST. MARY’S L. J. 549 (1999).

KENDRICK, WALTER, THE SECRET MUSEUM (Berkeley: University of California Press, 1996).

KETT, JOSEPH, RITES OF PASSAGE: ADOLESCENCE IN AMERICA, 1790 TO THE PRESENT (New York: Basic Books, 1977).

King, Margaret L., Concepts of Childhood: What We Know and Where We Might Go, 60 RENAISSANCE Q. 371 (2007).

Kline III, Matthew Thomas, First Amendment: 1. Limiting Internet Access; a) Public Libraries: Mainstream Loudoun v. Board of Trustees of Loudoun County Library, 14 BERKELEY TECH. L.J. 347 (1999).

Koenigsberg, Sidne, Print Symposium: Contract Options for Individual Artists: Library Records Open to Parental Scrutiny: A New Set of Internet Access Controls for Minors?, 29 COLUM. J.L. & ARTS 361 (2006).

Krotoszynski, Jr., Ronald J., Childproofing the Internet, 41 BRANDEIS L.J. 447 (2003).

Kutchinsky, Berl, Pornography and Its Effects in Denmark and the United States: A Rejoinder and Beyond, 8 COMP. SOC. RESEARCH 301 (1985).

KYVIG, DAVID E., EXPLICIT AND AUTHENTIC ACTS: AMENDING THE U.S. CONSTITUTION 1776-1995 (Lawrence, Kan.: University Press of Kansas, 1996).

Langevin, R., R.A. Lang, P. Wright et al., Pornography and Sexual Offenses, 1 ANNALS OF SEX RESEARCH 335-62 (1988).

Laughlin, Gregory K., Sex, Lies, and Library Cards: The First Amendment Implications of the Use of Software Filters to Control Access to Internet Pornography in Public Libraries, 51 DRAKE L. REV. 213 (2003).

Lee, William E., The First Amendment Doctrine of Overbreadth, 71 WASH. U. L. Q. 637 (1993).

Lee, William E., The Supreme Court and the Right to Receive Expression, 1987 SUP. CT. REV. 303 (1987).

LESSIG, LAWRENCE, CODE AND OTHER LAWS OF CYBERSPACE (New York: Basic Books, 1999).

Lessig, Lawrence, Tyranny in the Infrastructure (July 1997), http://www.wired.com/wired/archive/5.07/cyber_rights.html (last visited July 20, 2009).

Lessig, Lawrence & Paul Resnick, Zoning Speech on the Internet: A Legal and Technical Model, 98 MICH. L. REV. 395 (1999).

Libraries on the Information Superhighway: Ethics Center Facilitates Discussion on Internet Access, 9 ISSUES IN ETHICS (No. 1) (Winter 1998), Markkula Center for Applied Ethics, http://www.scu.edu/ethics/publications/iie/v9n1/libraries.html (last visited July 20, 2009).

Library Research Center, University of Illinois, Survey of Internet Access Management in Public Libraries: Summary of Findings (June 2000), http://lrc.lis.uiuc.edu/web/internet.pdf (last visited July 20, 2009).

Liebler, Raizel, Institutions of Learning or Havens for Illegal Activities: How the Supreme Court Views Libraries, 25 N. ILL. U. L. REV. 1 (2004).

Linz, Daniel & Edward Donnerstein, The Methods and Merits of Pornography Research, 38 J. COMM. 180 (1988).

Linz, Daniel G., Edward Donnerstein & Steven Penrod, Effects of Long-Term Exposure to Violent and Sexually Degrading Depictions of Women, 55 J. PERSONALITY & SOC. PSYCHOLOGY 758 (1988).

Lo, Ven-hwei & Anna R. Paddon, How Sexual Strategies Theory, Gender, and the Third-Person Effect Explain Attitudes About Pornography, paper presented at annual convention of Association for Journalism & Mass Communication, New Orleans, La. (Aug. 1999).

Lo, Ven-hwei & Ran Wei, Third-Person Effect, Gender, and Pornography on the Internet, 46 J. BROAD. & ELEC. MEDIA 13 (2002).

Lockhart, William & Robert McClure, Censorship of Obscenity: The Developing Constitutional Standards, 45 MINN. L. REV. 5 (1960).

LOWERY, SHEARON A. & MELVIN L. DE FLEUR, MILESTONES IN MASS COMMUNICATION RESEARCH: MEDIA EFFECTS 3d ed. (White Plains, NY: Longman, 1995).

Lyons, John S., Rachel L. Anderson & David B. Larson, A Systematic Review of the Effects of Aggressive and Nonaggressive Pornography, in MEDIA, CHILDREN, AND THE FAMILY: SOCIAL SCIENTIFIC, PSYCHODYNAMIC, & CLINICAL PERSPECTIVES (Dolf Zillmann & Aletha C. Huston eds., 1994, Hillsdale, NJ: Lawrence Erlbaum).

MACFARLANE, ALAN, MARRIAGE AND LOVE IN ENGLAND: MODES OF REPRODUCTION, 1300-1840 (Oxford & New York: Blackwell, 1986).

MACFARLANE, ALAN, ORIGINS OF ENGLISH INDIVIDUALISM: THE FAMILY, PROPERTY AND SOCIAL TRANSITION (New York: Cambridge University Press, 1978).

Mahood, Chad, Sriram Kalyanaraman & S. Shyam Sundar, The Effects of Erotica and Dehumanizing Pornography in an Online Interactive Environment, paper presented at 83rd annual convention of the Ass’n for Education in Journalism and Mass Communication, Phoenix, Ariz. (Aug. 2000).

Malamuth, N.M. & J. Ceniti, Repeated Exposure to Violent and Nonviolent Pornography: Likelihood of Raping Ratings and Laboratory Aggression Against Women, 12 AGGRESSIVE BEHAVIOR 129-37 (1986).

Malamuth, N.M. & J.V.P. Check, The Effects of Mass Media Exposure On Acceptance of Violence Against Women, 15 J. RESEARCH IN PERSONALITY 436-46 (1981).

Malen, Susannah, Protecting Children in the Digital Age: A Comparison of Constitutional Challenges to CIPA and COPA, 26 COLUM. J.L. & ARTS 217 (2003).

Martin, C. Dianne & Joseph M. Reagle, Jr., An Alternative to Government Regulation and Censorship: Content Advisory Systems for the Internet, http://penta2.ufrgs.br/gereseg/censura/rsac/dianne1.htm (last visited July 20, 2009).

Meeder, Rebecca, Access Denied: Internet Filtering Software in K-12 Classrooms, 49 TECHTRENDS 56 (2005).

MEDIA, CHILDREN, AND THE FAMILY: SOCIAL SCIENTIFIC, PSYCHODYNAMIC, & CLINICAL PERSPECTIVES (Dolf Zillmann & Aletha C. Huston, eds., 1994, Hillsdale, NJ: Lawrence Erlbaum).

Meehan, Kiera, Note, Installation of Internet Filters in Public Libraries: Protection of Children and Staff v. the First Amendment, 12 BOSTON PUB. INTEREST L.J. 483 (2003).

Meiklejohn, Alexander, The First Amendment Is an Absolute, 1961 SUP. CT. REV. 245, 257 (1961).

MEIKLEJOHN, ALEXANDER, FREE SPEECH AND ITS RELATION TO SELF-GOVERNMENT (New York: Harper & Brothers, 1948).

Merlis, Steven E., Preserving Internet Expression While Protecting Our Children: Solutions Following Ashcroft v. ACLU, 4 NW. J. TECH. & INTELL. PROP. 117 (2005).

Mehta, Michael D. & D. Plaza, Content Analysis of Pornographic Images Available on the Internet, 13 THE INFO. SOC’Y 153 (1997).

MIDDLETON, KENT R. & WILLIAM E. LEE, THE LAW OF PUBLIC COMMUNICATION (Boston: Pearson, 2008).

MILL, JOHN STUART, ON LIBERTY (Stefan Collini, ed.) (Cambridge: Cambridge University Press, 1989) (1859).

Miller, Perry, The Puritan Way of Life, in PURITANISM IN EARLY AMERICA: PROBLEMS IN AMERICAN CIVILIZATION (George M. Waller, ed., 1950, Boston: Heath).

Miltner, Katherine, Note, Discriminatory Filtering: CIPA’s Effect on Our Nation’s Youth and Why the Supreme Court Erred in Upholding the Constitutionality of the Children’s Internet Protection Act, 57 FED. COMM. L.J. 555, 578 (2005).

MILTON, JOHN, AREOPAGITICA (K.M. Lea, ed.) (Oxford: Clarendon Press, 1973) (1644).

Mitchell, Kimberly J., David Finkelhor & Janis Wolak, The Exposure of Youth to Unwanted Sexual Material on the Internet: A National Survey of Risk, Impact, and Prevention, 34 YOUTH & SOC’Y 330 (2003).

MSN ENCARTA ENCYCLOPEDIA, http://encarta.msn.com/encnet/refpages/search.aspx?q=original+sin (last visited July 20, 2009).

MORVILLE, PETER & LOUIS ROSENFELD, INFORMATION ARCHITECTURE FOR THE WORLD WIDE WEB (3d ed., 2006, Sebastopol, CA: O’Reilly Media).

Munro, Jay, Filtering Software, PC MAGAZINE 103 (Aug. 3, 2004).

MURISON, WILLIAM JOHN, THE PUBLIC LIBRARY (London: Clive Bingley Ltd., 1988).

Nachbar, Thomas B., Paradox and Structure: Relying on Government Regulation to Preserve the Internet’s Unregulated Character, 85 MINN. L. REV. 215 (2000).

Nadel, Mark S., The First Amendment Limitations on the Use of Internet Filtering in Public and School Libraries: What Content Can Libraries Exclude? 78 TEX. L. REV. 1117 (2000).

Nat’l Telecomms & Info. Admin., A Nation Online: Entering the Broadband Age at A-1 (2004), http://www.ntia.doc.gov/reports/anol/NationOnlineBroadband04.pdf (last visited July 20, 2009).

Netcraft, July 2007 Web Server Survey (July 2007), http://news.netcraft.com/archives/web_server_survey.html (last visited July 20, 2009).

Newell, Christopher G., Chalk Talk: The Internet School Filtering Act: The Next Possible Challenge in the Development of Free Speech and the Internet, 28 J. L. & EDUC. 129 (1999).

Nist, Todd, Note, Finding the Right Approach: A Constitutional Alternative for Shielding Kids from Harmful Materials Online, 65 OHIO ST. L. J. 451 (2004).

Nunberg, Geoffrey, The Internet Farce: Why Blocking Software Doesn’t—and Can’t—Work as Promised, AM. PROSPECT (Jan. 1-15, 2001).

Nunziato, Dawn C., The Death of the Public Forum in Cyberspace, 20 BERKELEY TECH. L.J. 1115 (2005).

Nunziato, Dawn C., Symposium: Do Children Have the Same First Amendment Rights as Adults? Toward a Constitutional Regulation of Minors’ Access to Harmful Internet Speech, 79 CHI.-KENT. L. REV. 121 (2004).

Oddone-Paolucci, Elizabeth, Mark Genuis & Claudio Violato, A Meta-Analysis of the Published Research on the Effects of Pornography, in THE CHANGING FAMILY AND CHILD DEVELOPMENT (Claudio Violato, Elizabeth Oddone-Paolucci & Mark Genuis, eds., 2000, London: Ashgate Publishing).

O'Neill, Kevin Francis, Privatizing Public Forums To Eliminate Dissent, 5 FIRST AMENDMENT L. REV. 201 (2007).

Padgett, V.R., J.A. Brislin-Slutz & J.A. Neal, Pornography, Erotica, and Attitudes Toward Women: The Effects of Repeated Exposure, 26 J. SEX RESEARCH 479-91 (1989).

PARKER, ALISON M., PURIFYING AMERICA: WOMEN, CULTURAL REFORM, AND PRO-CENSORSHIP ACTIVISM, 1873-1933 (Urbana, Ill.: University of Illinois Press, 1997).

Parker, Alison M., "Hearts Uplifted and Minds Refreshed": The Woman's Christian Temperance Union and the Production of Pure Culture in the United States, 1880-1930, 11 J. WOMEN'S HISTORY 135 (1999).

PARKER, DAVID L., STOLEN DREAMS: PORTRAITS OF WORKING CHILDREN (Minneapolis: Lerner Publications, 1998).

Peacefire, http://www.peacefire.org/info/about-peacefire.shtml (last visited July 20, 2009).

Peltz, Richard, Pieces of Pico: Saving Intellectual Freedom in the Public School Library, 2005 BYU EDUC. & L. J. 103 (2005).

Peltz, Richard J., Use “the Filter You Were Born With”: The Unconstitutionality of Mandatory Internet Filtering for the Adult Patrons of Public Libraries, 77 WASH. L. REV. 397 (2002).

PEMBER, DON R. & CLAY CALVERT, MASS MEDIA LAW (New York: McGraw-Hill, 2005).

Pew Internet and American Life Project, Generations Online in 2009 (2009), http://www.pewinternet.org/Reports/2009/Generations-Online-in-2009.aspx (last visited July 20, 2009).

Pew Internet and American Life Project, Protecting Teens Online (March 17, 2005), http://www.pewinternet.org/Reports/2005/Protecting-Teens-Online.aspx (last visited July 20, 2009).

Piccardo, Larissa, Note, Filtering the First Amendment: The Constitutionality of Internet Filters in Public Libraries Under the Children’s Internet Protection Act, 41 HOUS. L. REV. 1437, 1467 (2004).

Platform for Internet Content Selection (PICS), http://www.w3.org/PICS/ (last visited July 20, 2009).

Preston, Cheryl, Zoning the Internet: A New Approach to Protecting Children Online, 2007 B.Y.U. L. REV. 1417 (2007).

THE PURITAN TRADITION IN AMERICA: 1620-1730 (Alden T. Vaughan, ed.) (Columbia, S.C.: University of South Carolina Press, 1972).

Ratzan, Jill, CIPA and the Roles of Public Librarians, 43 PUB. LIBRARIES 285 (2004).

Recreational Software Advisory Council, http://www.rsac.org/ (last visited July 20, 2009).

REISS, I.L., JOURNEY INTO SEXUALITY: AN EXPLORATORY VOYAGE (Englewood Cliffs, NJ: Prentice-Hall, 1986).

Reiss, Spencer, St. Tim of the Web, FORBES, Nov. 15, 1999, at 314.

Resnick, Paul & James Miller, PICS: Internet Access Controls Without Censorship, 39 COMMUNICATIONS OF THE ACM 87 (1996).

Richards, Robert D. & Clay Calvert, Essay: The “True Threat” to Cyberspace: Shredding the First Amendment for Faceless Fears, 7 COMMLAW CONSPECTUS 291 (1999).

Richardson, Caroline R., Paul J. Resnick, Derek L. Hansen, Holly A. Derry, & Victoria J. Rideout, Does Pornography-Blocking Software Block Access to Health Information on the Internet?, 288 J. AM. MEDICAL ASS’N 2887-2894 (Dec. 11, 2002).

Rideout, Victoria, Caroline Richardson, & Paul Resnick, See No Evil: How Internet Filters Affect the Search for Online Health Information (Kaiser Family Foundation, Dec. 2002), available at http://www.kff.org/entmedia/20021210a-index.cfm

Rimm, Marty, Marketing Pornography on the Information Superhighway, 83 GEORGETOWN L.J. 1849 (1995).

ROBBINS, LOUISE, CENSORSHIP AND THE AMERICAN LIBRARY: THE AMERICAN LIBRARY ASSOCIATION’S RESPONSE TO THREATS TO INTELLECTUAL FREEDOM: 1939-1969 (Westport, Conn.: Greenwood Press, 1996).

Roberts, Diane, The Jurisprudence of Ratings Symposium Part I: On the Plurality of Ratings, 15 CARDOZO ARTS & ENT. LJ 105, 113-14 (1997).

ROBOTHAM, JOHN & GERALD SHIELDS, FREEDOM OF ACCESS TO LIBRARY MATERIALS (New York: Neal-Schuman, 1982).

RODGERS, DANIEL T., THE WORK ETHIC IN INDUSTRIAL AMERICA: 1850-1920 (Chicago: University of Chicago Press, 1978).

Rojas, Hernando, Dhavan V. Shah & Ronald J. Faber, For the Good of Others: Censorship and the Third-Person Effect, 8 INT’L J. PUB. OPINION RESEARCH 163 (1996).

Ross, Catherine J., An Emerging Right for Mature Minors to Receive Information, 2 U. PA. J. CON. L. 223 (1999).

Ross, Catherine J., Anything Goes: Examining the State’s Interest in Protecting Children from Controversial Speech, 53 VAND. L. REV. 427 (2000).

RUBIN, RICHARD E., FOUNDATIONS OF LIBRARY AND INFORMATION SCIENCE (2d ed., New York: Neal-Schuman, 2000).

Russomanno, Joseph A., “The Firebrand of My Youth”: Holmes, Emerson and Freedom of Expression, 5 COMM. LAW & POL’Y, 33 (2000).

RUTMAN, DARRETT B., AMERICAN PURITANISM (Philadelphia: J. B. Lippincott, 1970).

Sage, Laurann, Note, Mainstream Loudoun v. Board of Trustees; Restricting Internet Access in Public Libraries, 67 UMKC L. REV. 731 (1999).

SALLER, CAROL, WORKING CHILDREN (Minneapolis: Carolrhoda Books, 1998).

Sanchez, Barbara A., Note, United States v. American Library Association: The Choice Between Cash and Constitutional Rights, 38 AKRON L. REV. 463 (2005).

Sanders, Joel, The Regulation of Indecent Material Accessible to Children on the Internet: Is it Really Alright to Yell Fire in a Crowded Chat Room? 39 CATHOLIC L. 125 (1999).

Saunders, Kevin, Electronic Indecency: Protecting Children in the Wake of Cable and Internet Cases, 46 DRAKE L. REV. 1 (1997).

SCHAUER, FREDERICK, THE LAW OF OBSCENITY (Washington, DC: The Bureau of National Affairs, 1976).

Semitsu, Junichi, Note, Burning Cyberbooks in Public Libraries: Internet Filtering Software vs. the First Amendment, 52 STAN. L. REV. 509 (2000).

Shanks, Thomas E. & Barry J. Stenger, Access, Internet, and Public Libraries (originally published in 1997 and updated in 2002 by Tamar Weber), http://www.scu.edu/ethics/practicing/focusareas/technology/libraryaccess/ (last visited July 20, 2009).

Shea, Elizabeth, Note, The Children’s Internet Protection Act of 1999: Is Internet Filtering Software the Answer?, 24 SETON HALL LEGIS. J. 167 (1999).

SHUMAN, BRUCE A., FOUNDATIONS AND ISSUES IN LIBRARY AND INFORMATION SCIENCES (Englewood, Colorado: Libraries Unlimited, 1992)

Siefkes, Darin, Note and Comment, Explaining United States v. American Library Association: Strictly Speaking, a Flawed Decision, 57 BAYLOR L. REV. 327 (2005).

Skaggs, J. Adam, Note, Burning the Library to Roast the Pig? Online Pornography and Internet Filtering in the Free Public Library, 68 BROOKLYN L. REV. 809 (2003)

Small Library Committee of the Wisconsin Association of Public Librarians, Sample Library Policies for the Small Public Library, 2d ed. (revised by David L. Polodna) available at http://www.owlsweb.info/L4L/policies/VIII.asp.

Smith, Barbara H., To Filter or Not to Filter: The Role of Public Librarians in Determining Internet Access, 5 COMM. LAW & POLICY 385 (2000).

Smolla, Rodney A., Freedom of Speech for Libraries and Librarians, 85 LAW. LIBR. J. 71 (1993).

SPARKS, GLENN G., MEDIA EFFECTS RESEARCH (Belmont, Calif.: Wadsworth, 2002).

Stomberg, Derrick, Note, United States v. American Library Association, Inc.: The Internet as an Inherently Public Forum, 45 JURIMETRICS J. 59 (2004).

Stone, Geoffrey R., Free Speech in the Twenty-First Century: Ten Lessons from the Twentieth Century, 36 PEPPERDINE L. REV. 273 (2009).

STONE, GERALD, MICHAEL SINGLETARY, & VIRGINIA P. RICHMOND, CLARIFYING COMMUNICATION THEORIES (Ames, Iowa: Iowa State University Press, 1999).

STONE, LAWRENCE, THE FAMILY, SEX AND MARRIAGE IN ENGLAND: 1500-1800 (New York & London: Harper & Row, 1977).

Sundar, S. S., Technological Issues in Internet Pornography, paper presented at annual convention of the Association for Education in Journalism and Mass Communication, New Orleans, La. (Aug. 1999).

Sweeney, Elise Gabrielle, Constitutional Law Chapter: Freedom of Speech: Protections and Limitations, 5 GEORGETOWN J. GENDER & L. 77 (2004).

Tedjeske, Julie M., Note, Mainstream Loudoun and Access to Internet Resources in Public Libraries, 60 U. PITT. L. REV. 1265 (1999).

THOMPSON, KATHLEEN & HILARY MAC AUSTIN, AMERICA’S CHILDREN (New York: W.W. Norton, 2003).

Thompson, Margaret E., Steven H. Chaffee & Hayg H. Oshagan, Regulating Pornography: A Public Dilemma, 40 J. COMM. 73 (1990).

THOMPSON, C. SEYMOUR, EVOLUTION OF THE AMERICAN PUBLIC LIBRARY: 1653-1876 (Washington: Scarecrow Press, 1952).

TRAGER, ROBERT, JOSEPH RUSSOMANNO & SUSAN DENTE ROSS, THE LAW OF JOURNALISM AND MASS COMMUNICATION (New York: McGraw-Hill, 2007).

University of Iowa Labor Center, Child Labor Public Education Project, http://www.continuetolearn.uiowa.edu/laborctr/child_labor/ (last visited July 20, 2009).

University of Wisconsin-Eau Claire LTS Online Help Documentation, http://www.uwec.edu/Help/Internet/bandwidth.htm (last visited July 20, 2009).

VanNorman, Brent, Comment & Note, The Library Internet Filter: On the Computer or in the Child?, 11 REGENT U. L. REV. 425 (1998/1999).

Venhuizen, Tonnis, United States v. American Library Association: The Supreme Court Fails to Make the South Dakota v. Dole Standard a Meaningfull [sic] Limitation on the Congressional Spending Powers, 52 S.D. L. REV. 565, 597, 604 (2007).

Volokh, Eugene, Freedom of Speech, Shielding Children, and Transcending Balance, 1997 SUP. CT. REV. 141 (1997).

Wardak, Leah, Note, Internet Filters and the First Amendment: Public Libraries After United States v. Am. Library Association, 35 LOY. U. CHI. L.J. 657 (2004).

Wagner, R. Polk, Essay: Filters and the First Amendment, 83 MINN. L. REV. 755 (1999).

W3C, PICS Frequently-Asked Questions, http://www.w3.org/2000/03/PICS-FAQ/ (last visited July 20, 2009).

Weber, Janelle A., The Spending Clause: Funding a Filth-Free Internet or Filtering out the First Amendment? 56 FLA. L. REV. 471 (2004).

Weekes, Russell B., Cyber-Zoning a Mature Domain: The Solution to Preventing Inadvertent Access to Sexually Explicit Content on the Internet? 8 VA. J. L. & TECH. (4) 1 (2003).

Werst, Brian M., Comment, Legal Doctrine and Its Inapplicability to Internet Regulation: A Guide for Protecting Children from Internet Indecency after Reno v. ACLU, 33 GONZ. L. REV. 207 (1997/1998).

Williams, Susan H., Content Discrimination and the First Amendment, 139 U. PA. L. REV. 615 (1991).

WIMMER, ROGER & JOSEPH DOMINICK, MASS MEDIA RESEARCH (5th ed., Belmont, Calif.: Wadsworth, 1997).

WINN, MARIE, CHILDREN WITHOUT CHILDHOOD (New York: Pantheon Books, 1983).

Winstead, Scott, The Application of the “Contemporary Community Standard” to Internet Pornography: Some Thoughts and Suggestions, 3 LOY. INTELL. PROP. & HIGH TECH. J. 28 (2000).

Witte, Gretchen, Comment, Internet Indecency and Impressionable Minds, 44 VILLANOVA L. REV. 745 (1999).

Wolak, Janis, Kimberly Mitchell & David Finkelhor, Unwanted and Wanted Exposure to Online Pornography in a National Sample of Youth Internet Users, 119 PEDIATRICS 247 (2007).

WOOD, JULIA T., COMMUNICATION MOSAICS: AN INTRODUCTION TO THE FIELD OF COMMUNICATION (Belmont, Calif: Thomson Wadsworth, 2006).

Wu, Felix, Note, United States v. American Library Association: The Children’s Internet Protection Act, Library Filtering, and Institutional Roles, 19 BERKELEY TECH. L.J. 555 (2004).

Wyer, R.S., G.V. Bodenhausen & T.F. Gorman, Cognitive Mediators of Reactions to Rape, 48 J. PERSONALITY & SOC. PSYCHOLOGY 324 (1984).

The Youngest Surfers, PC MAGAZINE, at 28 (January 2005).

YOUTH, PORNOGRAPHY AND THE INTERNET (Dick Thornburgh & Herbert S. Lin, eds.), National Research Council (Washington, D.C.: National Academies Press, 2002) http://bob.nap.edu/html/youth_internet/ and http://www.nap.edu/openbook.php?record_id=10261&page=R1 (last visited July 20, 2009).

ZELIZER, VIVIANA A. ROTMAN, PRICING THE PRICELESS CHILD: THE CHANGING SOCIAL VALUE OF CHILDREN (New York: Basic Books, 1985).

Zick, Timothy, Congress, the Internet, and the Intractable Pornography Problem: The Child Online Protection Act of 1998, 32 CREIGHTON L. REV. 1147 (1999).

Zillmann, Dolf & Jennings Bryant, Effects of Prolonged Consumption of Pornography on Family Values, 9 J. FAMILY ISSUES 518 (1988).

Zillmann, Dolf & Jennings Bryant, Pornography, Sexual Callousness, and the Trivialization of Rape, 32 J. COMM. 10 (1982).

Zillmann, D., Effects of Prolonged Consumption of Pornography, in PORNOGRAPHY: RESEARCH ADVANCES AND POLICY CONSIDERATIONS (Dolf Zillman & Jennings Bryant, eds., 1989).

Zwick, Jennifer, Comment, Casting a Net Over the Net: Attempts to Protect Children in Cyberspace, 10 SETON HALL CONST. L.J. 1133 (2000).

BIOGRAPHICAL SKETCH

Barbara Helena Smith was born on Long Island, New York, and later moved to the Midwest, where she graduated from Lumen Christi High School in Jackson, Michigan. She earned a BA degree in telecommunications and an MA degree in journalism from Michigan State University. She has professional experience as a broadcast, print and online journalist, and as an announcer and disk jockey in both commercial and public radio. She also has more than a decade of experience in higher education, serving as a college administrator and faculty member. Among the positions she held in higher education were Director of Admissions at Adrian College in Michigan, Assistant Professor of Journalism at Olivet College in Michigan, instructor in journalism and mass communications at Kansas State University, and Assistant Professor of Communication at Rochester Institute of Technology in Rochester, New York. She received her Ph.D. from the University of Florida in the fall of 2009.