English - 10th Grade Summer Assignment - Abraham
Incoming 10th Grade Summer Assignment: After reading the provided articles on censorship, write an essay in which you argue your position on the use of filters by schools. Support your position with textual evidence from the articles. Be sure to acknowledge competing views from the sources. You may also use other sources that you have researched on your own to illustrate or clarify your position.

In the shifting world of the connected classroom, some suggest a seismic showdown is brewing
By Ian Quillen

In one corner are the Web 2.0 tools—the relatively new blogs, wikis, discussion forums, and social-networking sites that are gaining popularity among teachers looking to connect with their students and one another. By their very nature, such tools can be edited by a wide range of contributors, and they can host a wide range of content—some of it educational, and some not so much.

In the opposite corner are the Web filters—software designed to block students from distracting or potentially harmful material, with roots in the more static online environment of the 1990s. In most cases, filters block whole websites rather than individual pages, based on a filtering company's database of sites that contain questionable material.

While only the most laissez-faire technology advocates would favor scrapping filtering altogether—as well as the federal Children's Internet Protection Act, or CIPA, which mandates it—most realize something has to change if schools are to continue exploring the use of Web 2.0 tools.

"Filtering cannot any longer be a block-or-allow hard decision as it used to be," says Rob Chambers, the chief technology officer for Bakersfield, Calif.-based Lightspeed Systems Inc., an education-only filtering company. "You have these sites that have good content and bad content, and it's all together. [Filtering] has to take these things to mind because that's what the world of education is heading toward."

Some education technology officers are pushing for what is called dynamic filtering, which blocks content based on words, phrases, and ratings of images that appear on each Web page, meaning some pages on a site may be blocked while others are allowed. Others say the flexibility within standard filtering programs—which do typically allow chief technology officers to select from a list of categories of sites to block or allow—should suffice when combined with a functional school or district technology office. And still others suggest that CIPA's standards are narrow enough that, if strictly adhered to, incidents involving the blocking of Web 2.0 tools would be slim to nil.

CIPA mandates that schools block material that is obscene, shows child pornography, or is potentially harmful to minors, with the last of those three eliciting a range of interpretations.

"There are times [educators] believe [the law is] much more prescriptive than it really is," says James Bosco, a project director for the Washington-based Consortium for School Networking's Web 2.0 initiative. "But you do not want a situation when someone comes to a board meeting and says my kid came home and said the kid next to him was looking at pictures of naked girls."

"We live with the specter of that occurring," he says. "That's why some people say, 'Are we really protecting kids [from harmful content], or ourselves from potential problems at school board meetings?'"
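To make the traditional approach described above concrete, here is a minimal sketch of whole-site, category-based blocking. The site names, categories, and district block list are invented for illustration; they are not any vendor's actual data.

```python
# Hypothetical sketch of whole-site, category-based filtering as described in
# the article. Site names, categories, and the block list are invented.
SITE_CATEGORIES = {
    "example-socialnet.com": "social-networking",
    "example-videogames.com": "games",
    "example-encyclopedia.org": "reference",
}

# Categories a district's technology office has chosen to block.
BLOCKED_CATEGORIES = {"social-networking", "games"}

def is_blocked(hostname: str) -> bool:
    """Block the entire site if its category is on the district's block list."""
    category = SITE_CATEGORIES.get(hostname)
    return category in BLOCKED_CATEGORIES

if __name__ == "__main__":
    print(is_blocked("example-socialnet.com"))     # True: whole site blocked
    print(is_blocked("example-encyclopedia.org"))  # False: allowed
```

Because the decision is made per site rather than per page, a site that mixes educational and non-educational content is either entirely blocked or entirely allowed, which is exactly the tension the article goes on to describe.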
At the 11,000-student Saugus Union School District, which has students in grades K-6 in the northern region of Los Angeles County, Calif., teachers and students have been using Web 2.0 tools since their inception, says Jim Klein, the district's information services and technology director. The district has its own social-networking site, where students, teachers, and other staff members have their own spaces. And both students and staff regularly use other such sites for learning.

Guardians of the Web

Part of the district's ability to do that, Klein says, is its use of an open-source filtering program called Dan's Guardian. Instead of checking websites against a list of forbidden sites, as most programs do, Dan's Guardian, Klein says, searches individual pages for "hot words" that could signal improper content. If a page passes a certain threshold for hot words—which can be adjusted depending on the level of filtering desired—it will be blocked.

Critics of Dan's Guardian and other dynamic-filtering programs say the technology is imperfect in determining what is allowed and what is blocked. But Klein says such claims are meant to obscure the fact that dynamic filtering theoretically eliminates the need for updates found in traditional programs like IGear and CyberPatrol, and therefore eliminates a recurring cost for districts.

Doug Anderson, the marketing director at Salt Lake City-based Content Watch LLC, says his company is one of the few commercial vendors to use primarily dynamic-filtering techniques built upon algorithms, both in personal and business products, including those serving school districts. Content Watch's ability to provide more-nuanced filtering around, for example, the difference between breast augmentation and breast cancer, he says, can make the product more useful.

Klein acknowledges there is concern about how dynamic-filtering programs block images, but adds that most Web images are tagged with content ratings that the filter can read. Images that aren't, he says, usually appear alongside words and phrases that would cause a filter to block the Web page.

Barb Rose, the vice president of marketing for the Web-filtering company CyberPatrol LLC, based in Carlisle, Pa., says the risk is far greater than Klein indicates, in part because most search engines allow users to search for images independent of the Web page they are found on.

CyberPatrol's software is capable of some dynamic filtering, she says, but in part because of concerns about speed, it is a rarely used feature. Instead, like most filtering software, CyberPatrol compares the site a user is trying to visit against websites listed in a database under one of several dozen categories. If the site matches a site listed in a category, and the chief technology officer has chosen that category as one that users in the district should not be viewing, the site is blocked.

Rose says CyberPatrol and its competitors are working to find more-refined approaches that take Web 2.0 tools into account. "I think probably all the vendors are back under the hood looking at what they've got to meet those needs," she says. But she adds that blocking social networks is often good practice to protect network safety. "We see more and more threats of viruses and malicious links" on social-networking sites, Rose says, pointing to examples of Facebook viruses that masked themselves as legitimate applications.
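Klein's description of Dan's Guardian above, scoring each page for "hot words" and blocking it once an adjustable threshold is crossed, can be sketched roughly as follows. The word list, weights, and threshold here are invented for illustration and are not the actual configuration of Dan's Guardian or any other product.

```python
# Rough, hypothetical sketch of per-page "hot word" filtering as described
# above; the word weights and threshold are illustrative only.
HOT_WORDS = {
    "gambling": 40,
    "violence": 30,
    "explicit": 60,
}

THRESHOLD = 100  # adjustable: lowering it makes the filter stricter

def page_score(page_text: str) -> int:
    """Sum the weights of any hot words that appear in the page text."""
    return sum(HOT_WORDS.get(word, 0) for word in page_text.lower().split())

def should_block_page(page_text: str) -> bool:
    """Block only this page, not the whole site, when its score passes the threshold."""
    return page_score(page_text) >= THRESHOLD

if __name__ == "__main__":
    print(should_block_page("a history of gambling regulation"))  # False (score 40)
    print(should_block_page("explicit violence and gambling"))    # True (score 130)
```

Under a scheme like this, each page gets its own decision and there is no per-site database to keep paying to update, which is Klein's argument; the tradeoff, as the critics note, is that a crude word score can misjudge a page in either direction.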
Districts Set Priorities

In many cases, say educational technology experts, what exactly gets filtered has more to do with the district than with the product. For example, David Jakes, the director of instructional technology for Glenbrook South High School in Glenview, Ill.—a northwestern suburb of Chicago—says teachers are sometimes unable to access Web resources because they aren't making the decisions about which categories to block and which to allow.

"In most districts, the technology people do the blocking," says Jakes, whose 5,000-student Glenbrook school district, made up of two high schools, blocks Facebook but allows access to most other sites. "I think that's a little misguided. There's no discussion between the technology people and the curriculum people."

The biggest teacher complaints are not always about whether they can access content, but how quickly they can access it. Chief technology officers generally have the power to override a blocked site that an instructor says has educational value. But depending on staffing, district procedure, and a teacher's relationship with the technology department, the response time could range from minutes to close to a month.

Yet, while many filtering issues can be addressed internally, some experts feel the original point of filtering as mandated by federal law is consistently misrepresented. Even Karen Cator, the chief of the U.S. Education Department's office of educational technology, says that "people have very different ways of interpreting" CIPA, and Federal Communications Commission officials say the number of schools actually found in violation of the act is minute. Most violations that occur, they say, relate to the proper procedures for establishing filters, not the exposure of students to improper content.

Knowing that, some ed-tech experts say schools should not be afraid to take a more hands-off approach. Klein of the Saugus Union district, for example, says that filters are ineffective at stopping a student from purposefully accessing improper content. Instead, he argues, technology directors should accept that a few students are generally going to succeed in circumventing the system, and should establish filter settings that help other students learn how to sift between dangerous and useful content.

"I'm waiting for the day when a school district gets sued by a parent for not teaching kids to be responsible online," Klein says. "It's going to happen sooner or later."

Vol. 04, Issue 01, Pages 20-21

http://www.nytimes.com/2002/12/11/technology/11FILT.html?pagewanted=all

Internet Filters Block Many Useful Sites, Study Finds
By JOHN SCHWARTZ

Teenagers who look to the Internet for health information as part of their "wired generation" birthright are blocked from many useful sites by anti-pornography filters that federal law requires in school and library computers, a new study has found.

The filtering programs tend to block references to sex and sex-related terms, like "safe sex," "condoms," "abortion," "jock itch," "gay" and "lesbian." Although the software can be adjusted to allow access to most health-related Web sites, many schools and libraries ratchet up the software's barriers to the highest settings, the report said.
"A little bit of filtering is O.K., but more isn't necessarily better," said Vicky Rideout, vice president of the Henry J. Kaiser Family Foundation, which produced the report, to be published today in The Journal of the American Medical Association. "If they are set too high, they can be a serious obstacle to health information." The researchers found that filters set at the least restrictive level blocked an average of 1.4 percent of health sites; at the most restrictive level, filters blocked nearly 25 percent of health sites. The amount of pornography blocked, however, was fairly consistent: 87 percent at the least restrictive level, 91 percent at the most restrictive. The programs blocked a much higher percentage of health sites devoted to safe-sex topics: 9 percent at the least restrictive level and 50 percent at the most restrictive. The blocked pages at high levels included The Journal of the American Medical Association's site for women's health and a page with online information from the Food and Drug Administration about clinical trials. To the researchers, the results mean that a school or library that uses a less restrictive setting for Internet filters can lose very little of the protective effect of the filters, while minimizing the tendency of filters to block harmless and even valuable sites. The report is the first major study of the effectiveness of filters to appear in a peer- reviewed scientific journal, and the first to look at the effectiveness of filters at various settings. Most previous studies have been produced by organizations with a strong point of view either favoring or opposing filters. The Kaiser Foundation is a nonprofit health research group. David Burt, an antipornography advocate who is a spokesman for the filtering company N2H2 , said he was pleased with the report, which he called "very thoughtful and well designed they recognized it matters a lot how you configure a filter and set it up." But opponents of filtering requirements said the study showed the technology's clumsiness. "Filters are just fine for parents to use at home," said Judith F. Krug, director of the Office for Intellectual Freedom at the American Library Association. "They are not appropriate for institutions that might be the only place where kids can get this information." "The importance of the First Amendment," Ms. Krug said, "is that it provides us with the ability to govern ourselves, because it guarantees that you have the right to access information. The filters undercut that ability." Nancy Willard, an Oregon educator who has written student guides that emphasize personal responsibility in Internet surfing, called filtering a kind of censorship that, if performed by the schools directly, would be unconstitutional. "These filtering companies are protecting all information about what they are blocking as confidential trade secrets," Ms. Willard said. "This is nothing more than stealth censorship." The study was conducted for the foundation by University of Michigan researchers, who tested six leading Internet filtering programs. The researchers searched for information on 24 health topics, including breast cancer and birth control, and also for pornographic terms. They performed the tests at each of three settings. At the least restrictive setting, only pornography is supposed to be blocked; an intermediate setting also bars sites with nudity and other controversial material like illicit drugs. 
The most restrictive setting possible for each product may block sites in dozens of other categories.

The researchers then called 20 school districts and library systems around the United States to ask how they set their filters. Of the school systems, which teach a half million students over all, only one set its filters at the least restrictive level.

The issue of library filtering is making its way through the federal courts. Last month the Supreme Court agreed to hear a Bush administration defense of the Children's Internet Protection Act, the federal law requiring schools and libraries to use filters on computers used by children or to lose technology money. A special panel of the United States Court of Appeals for the Third Circuit, in Philadelphia, struck down part of the law that applied to libraries as unconstitutional. Chief Judge Edward R. Becker wrote that filters were a "blunt instrument" for protecting children.
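The settings the Kaiser study describes above work cumulatively: each step up in restrictiveness keeps everything the lower level blocks and adds more categories. A small sketch of that idea, with invented category names standing in for whatever a given product actually uses:

```python
# Illustrative only: category names are invented, not any product's real list.
# Each restriction level blocks everything the level below blocks, plus more.
LEAST = {"pornography"}
INTERMEDIATE = LEAST | {"nudity", "illicit-drugs"}
MOST = INTERMEDIATE | {"chat", "games", "personal-pages", "weapons", "lingerie"}

LEVELS = {"least": LEAST, "intermediate": INTERMEDIATE, "most": MOST}

def blocked_categories(level: str) -> set:
    """Return the set of categories blocked at a given restriction level."""
    return LEVELS[level]

if __name__ == "__main__":
    for level in ("least", "intermediate", "most"):
        print(level, sorted(blocked_categories(level)))
```

The study's finding is that moving from the least to the most restrictive level added very little pornography blocking (87 percent to 91 percent) but blocked far more health information (1.4 percent to nearly 25 percent of health sites), which is why the researchers favor the less restrictive settings.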
http://partners.nytimes.com/library/tech/99/03/cyber/education/10education.html

Schools Split on Using Internet Filters
By PAMELA MENDELS

For Elizabeth M. Whitaker, who oversees technology in the Tucson, Ariz., public schools, Internet filters have for two years proved a common-sense way to keep students from accessing pornography online. But for Barbara Ridgway, who is in charge of technology for schools in Helena, Montana, the devices have been troublesome and troubling. Two weeks after they were installed earlier this year, school officials removed them and have now formed a committee to decide how to proceed.

Their different responses to Internet filters highlight a dilemma facing educators in the digital age. Around the country, school officials are grappling with whether filters are the best way to keep students from the Internet's seamier side. If some members of Congress have their way, the discussion could end with schools having little choice but to install the devices.
For the second year in a row, the Senate is considering bipartisan legislation that would require schools and libraries that receive a new federal Internet subsidy known as the E-rate, for education rate, to install and use filters to block material harmful to minors. A similar bill was introduced in the House of Representatives last week by Bob Franks, a New Jersey Republican. Pia Pialorsi, a spokeswoman for the Senate Commerce Committee, says it would be left to individual communities to determine what is harmful to minors and what, if anything, to filter.

In prepared comments before a Commerce Committee hearing last week, one of the bill's sponsors, Senator John McCain, an Arizona Republican, said the legislation seeks to keep children from inappropriate material online when they are in institutions that are traditionally considered safe havens. "People have the right to feel safe that, when they send their child to school, when they drop their child off at the public library, someone is going to be looking out for their children, protecting them," he said.

Few disagree with that goal. Nonetheless, an array of library and education groups, including the National PTA, the National Education Association teachers' union and the National School Boards Association, oppose the measure. They say decisions about filtering, like most decisions involving education, should be local prerogatives. "The main issue for us is not that we are supporting kids going to bad sites on the Internet. It is the federal government even thinking about dictating to the schools how to approach this issue," said Jefferson G. Burnett, director of government relations for the National Association of Independent Schools.

Many schools have, on their own, chosen to use filters. A study last year by Quality Education Data, a Denver, Colo., education market research company, found that about 39 percent of public schools that offered Internet access to students used filters. Others have shied away. They believe that filters, which have been criticized for blocking either too much or too little information, should take a back seat to strong acceptable-use policies that sanction students for using computers inappropriately.

Among those that have said yes to filtering are the public schools in the Tucson area, which serve about 63,000 students. Whitaker, coordinator for instructional technology in the Tucson Unified School District, where most classrooms now have Internet access, said filters were first introduced in 1997, after school officials found that too many students were getting into trouble for looking at sexually explicit material online.

Whitaker says that although filters have helped greatly, on occasion they have brought problems of their own, notably by blocking legitimate sites that then need to be unblocked manually. She said most teachers and administrators are willing to overlook such shortcomings, because they believe filtering makes the job of monitoring student computer use easier. However, school librarians, who generally oppose anything that smacks of censorship, are another matter. Whitaker said she understands their point, but ultimately disagrees, arguing that schools should be able to exercise the same sort of control over minors' access to pornography that many states require of adult bookstores. "If we look at the K-12 setting, we are dealing with children under the age of 18," she said. "Kids are not allowed into adult bookstores."
In Helena, Montana, school officials have not been so satisfied with filters. "Our experience has been problematic," said Ridgway, technology coordinator for the district, which has about 8,000 students.

Ridgway said filters were installed early this year in the area's two public high schools as an experiment, noting that the district has not had any major problems with students viewing online pornography. Indeed, a librarian at Helena High School, which has a strict acceptable-use policy, said the library has recorded only six incidents of computer abuse so far this school year on the library's 27 Internet-connected computers. Only one of the incidents involved sexually explicit material; most centered on students looking at game sites that are off-limits. But because filtering was offered as part of the computers' security system, the district decided to try it.

Then Ridgway started receiving complaints that students and teachers could not access Web sites they needed to visit, such as those expressing strong political views about environmental issues. "It was very frustrating to have things inaccessible for what appeared to be not logical reasons," Ridgway said.

Now, officials are reviewing the district's filtering policy. In the meantime, Ridgway says students are well supervised in their computer use and that to get Internet access they have to have their parents' permission in writing. Whatever the district finally decides, Ridgway believes it is important that students be educated in how to evaluate material online. She noted that filters do not address a wide variety of Internet material she considers ill-suited for schools, like advertising and pop culture, and she believes it is important to teach students to evaluate this type of information as well. "I am a firm believer in teaching people to use things wisely," she said.
If you have any questions, please email Mrs. Abraham at: [email protected]