Chapter 1 Introduction

1.1 Background to research problem

Pornography has a visible presence at universities, especially in North America, where student newspapers and magazines dealing with sexual expression and sexuality can be found. Two examples are Boink magazine at Boston University and the H-Bomb at Harvard University (Chheuy 2004). Some US universities, such as New York University, UMass Amherst, Penn State, Vanderbilt, Northwestern, San Francisco State, UCLA, UC Berkeley, UC Santa Barbara and Arizona State University, offer courses that present pornography as a possible career opportunity (Cullen 2006).

The University of Johannesburg (UJ) is a newly formed university comprising three previous institutions: Rand Afrikaans University, Technikon Witwatersrand and Vista University’s Soweto and East Rand campuses. The challenge for successful integration is the development of new policies that meet the needs of the new organisation’s stakeholders. At present there is no formal UJ Acceptable Use Policy (AUP), the document on which staff and students are required to base their conduct in the online environment.

Currently UJ employs different content filtering software at the different campuses to regulate and manage access to unwanted content in the online environment. The Bunting Rd and Doornfontein campuses use a product called DansGuardian, while the East Rand, Kingsway and Soweto campuses use SquidGuard. The problem with using two different content filters is that there is no consistency in which content is blocked and which is allowed, and UJ is therefore unable to block access to pornography consistently at all the campuses.
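By way of illustration, SquidGuard blocks categories of content through destination groups and access control lists in its configuration file. The snippet below is a minimal sketch, assuming a locally maintained "porn" blacklist and a local block page; it is not UJ's actual configuration, and DansGuardian uses its own, different list files.

    # Minimal SquidGuard configuration sketch (assumed paths and category name)
    dbhome /var/lib/squidguard/db
    logdir /var/log/squidguard

    dest porn {
        domainlist porn/domains   # one blacklisted domain per line
        urllist    porn/urls      # individual blacklisted URLs
    }

    acl {
        default {
            pass !porn all                                     # allow everything except the porn category
            redirect http://intranet.uj.example/blocked.html   # page shown for blocked requests (assumed)
        }
    }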

The AUP, together with content filtering, creates a comprehensive approach to preventing access to online pornography. Through the Auckland Park Campus Beeld (newspaper), students at UJ have actively voiced their concern about what is being done to prevent access to pornography, and unsupported claims are often made about the extent of the problem. UJ needs to address the concerns of the students through the development of a well-structured, comprehensive AUP and accurate content filtering for all the campuses.

1.2 Rationale and benefits to be derived from the study

The underlying purpose of the study is to test the effectiveness of the current online content filters and AUP at UJ. From the research data gathered, recommendations will be made for the creation of a comprehensive approach to managing access to pornography in an online environment. This approach should ideally consist of two components: an active and a passive approach. The active approach typically consists of a mechanism (a content filtering solution) that physically restricts users from viewing content defined as unacceptable. The passive approach cannot physically restrict users from viewing unacceptable content, but, in the form of a policy, it condemns such behaviour. This solution should be used at all five campuses, ensuring a single institutional approach to managing access to online pornography that protects staff and students from any implications that may arise from the distribution or accessing of pornographic content via the Internet.

1.3 Research problem and sub-problems

The following research problem was formulated for this study: “To what extent can access to online pornography be managed at the University of Johannesburg?” In order to address this problem successfully, the following sub-problems will be investigated:
– What is pornography?
– What is the difference between legal and illegal pornography?
– What are online content filters?
– What are the advantages and disadvantages of different content filtering solutions?
– What is an Acceptable Use Policy (AUP)?
– What content should an AUP typically include?

1.4 Research methodology

The underlying aim of the research is to determine to what extent access to online pornography can be managed at the University of Johannesburg. For the purpose of addressing the research problem a comprehensive literature review was completed, followed by an empirical component, which consisted of content filter testing and a quantitative survey.

1.4.1 Literature review

Chapters 2, 3 and 4 create a theoretical framework from which to conduct the empirical research. These chapters deal with the underlying themes and components of the research problem. The literature review is essential in the solving of the identified sub-problems. Below is a short summary of the literature review.

Chapter 2 This chapter defines the term “pornography” and creates a taxonomy that identifies the different types of pornography. The taxonomy is used to show the distinction between illegal and legal pornography in an easy-to-understand layout. Different points of view on the use of pornography in society are discussed.

Owing to the evolution of the media and technology, pornography is becoming a more prominent feature in our lives. The Internet is a breeding ground for pornographers who are looking to distribute their content to an easily reachable international mass market. From its humble beginnings as a form of art, pornography has become one of the most successful e-commerce ventures, turning over billions of dollars a year. The difficulty of defining pornography makes it harder to manage.

Pornography comes in many different formats, some more tolerated than others. Some content may be legal while other content remains illegal under the SA Films and Publications Act. Universities have to decide what content will be tolerated and what measures will be taken against those found in possession of, viewing or distributing content that is not tolerated.

Chapter 3 This chapter is dedicated to the examination of various content filters and identifies the unique characteristics of popular content filtering methods. Different content filtering methods are described, along with their strengths and weaknesses. The filtering methods discussed include keyword filters, Uniform Resource Locator (URL) blocking, protocol blocking, and content rating systems and services. This chapter also investigates future developments in content filtering that aim to provide a more comprehensive filtering service.

In addition, this chapter deals with the topic of censorship of content. Content filtering is a form of censorship predominantly used on Internet applications such as the World Wide Web (WWW). Censorship of content is nothing new. For thousands of years censorship has been employed by those in a position of power, to varying effect. The term often has negative connotations, as it restricts the right to freedom of expression. Filtering content on the Web is a fairly new phenomenon and is seen as necessary by some, in view of the evolving nature of content on the Web.

There are different views on the function of censorship in society. Certain groups of people believe that there is a need for censorship, while others believe censorship should be condemned. The advantages and disadvantages of censorship are discussed in this chapter, with reference to content filtering.

Chapter 4 This chapter is dedicated to exploring Acceptable Use Policies (AUPs) as a passive tool for preventing access to pornography. AUPs are often not seen as proactive tools in preventing access to pornography, but behind any successful content filtering solution lies a thorough AUP.

An AUP is typically divided into different sections, making it easier to understand and follow and allowing for continuity. Each section of the AUP deals with a different focus area. When dealing with access to pornography, universities need to define clearly what content is acceptable and what measures are taken to prevent access to unacceptable content such as pornography. In addition, an AUP should include a section on what disciplinary action will be taken against those who are caught accessing or distributing pornography in the university environment.

1.4.2 Empirical research

Chapter 5 is dedicated to empirical research, along with the findings and interpretations. Chapter 6 includes recommendations for implementing a comprehensive approach to managing access to online pornography at the University of Johannesburg. Following is a brief summary of the empirical research.

Chapter 5 In this chapter the empirical research is discussed along with the research methodology, statistical analysis and findings. The research methodology explored the research problem and sub-problems.

The research approach followed for this project can be classified under Pasteur’s quadrant of Stokes’s (1997) research classification quadrants as “use-inspired basic research”. This type of research is deemed the ultimate type of research, as it is devoted to solving problems in order to improve people’s lives. In keeping with Pasteur’s quadrant, the main objective of the research is to discover new facts and knowledge about how the University of Johannesburg can improve in its quest to manage access to online pornography.

In addition, action research can be classified under Pasteur’s quadrant, as it presumes an alliance between research and action, leaving the research outcomes highly organisation specific. A quantitative research methodology was chosen for this research project, as specific variables to be measured and sample sizes had been identified.

A simple random sampling method was used for the survey (Scheaffer et al 1979: 31). The size of the target group is also discussed, as well as the process of data collection and data processing. For the survey it was important to ensure fair representation of the five campuses (Bunting Rd, Doornfontein, East Rand, Kingsway and Soweto), and a target group of 1 000 students was identified. This target group was divided among the campuses using the number of registered students at each campus as an indicator, to obtain proportional representation of the campuses. Each campus’s target was calculated as a percentage of the total number of students. Table 1.1 presents the breakdown of students per campus.

Table 1.1: Breakdown of students per campus for the University of Johannesburg for 2006

UJ Campus       Number of students   % of the total   Size of sample
Bunting Rd      7 943                15.86            160
Doornfontein    9 549                19.06            190
East Rand       455                  0.9              10
Kingsway        30 703               61.3             610
Soweto          1 443                2.88             30
Total           50 093               100              1 000
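The proportional allocation in Table 1.1 can be reproduced with a short calculation. The sketch below (in Python, purely illustrative) takes the enrolment figures from the table and rounds each campus’s share of the 1 000 questionnaires to the nearest ten, which yields the sample sizes shown above.

    # Proportional allocation of a 1 000-student sample across the UJ campuses,
    # reproducing Table 1.1 (enrolment figures taken from the table).
    enrolment = {
        "Bunting Rd": 7943,
        "Doornfontein": 9549,
        "East Rand": 455,
        "Kingsway": 30703,
        "Soweto": 1443,
    }

    target = 1000
    total = sum(enrolment.values())

    for campus, students in enrolment.items():
        share = students / total * 100                        # percentage of total enrolment
        sample = round(students / total * target / 10) * 10   # rounded to the nearest ten
        print(f"{campus:<12} {students:>6}  {share:5.2f}%  sample: {sample}")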

For the survey a questionnaire was compiled with special emphasis on the students’ perception of what content (covering all the categories of pornography as per the taxonomy defined in Chapter 2) they should be allowed to access, and on how effectively current content filtering methods worked at the various campuses. The questionnaire consisted predominantly of closed-ended questions, with one open-ended question in which students were asked to fill in their age in full years. The questionnaire consisted of four sections, namely:
Section A: Biographic information
Section B: University computer facility usage
Section C: University AUP
Section D: Personal experience with university computer facility.

These questionnaires were handed out to students at the libraries and computer laboratories at the five campuses. The completed questionnaires were collected and submitted to Statcon (the Statistical Consultation Services at the Kingsway campus) for processing. A second phase of the research involved testing how effectively the online content filters prevented access to pornography at the five campuses. This testing was completed within a 12-hour period to ensure consistency, as these content filtering solutions are updated on a regular basis.
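The mechanism of the filter test is not spelled out here; one simple way to run such a test is to request the same fixed list of test URLs from each campus network within the test window and record whether the filter blocks each request. The sketch below is a minimal illustration of that idea, assuming a hypothetical block-page marker and placeholder URLs; it is not the instrument actually used in the study.

    # Minimal sketch of a content-filter effectiveness test.
    # The test URLs and block-page marker below are hypothetical placeholders.
    import urllib.error
    import urllib.request

    TEST_URLS = [
        "http://example.com/test-category-1",   # one test URL per taxonomy category
        "http://example.com/test-category-2",
    ]
    BLOCK_MARKER = "blocked.html"   # assumed: the filter redirects blocked requests to a block page

    def is_blocked(url: str) -> bool:
        """Return True if the request appears to have been blocked by the campus filter."""
        try:
            with urllib.request.urlopen(url, timeout=10) as response:
                return BLOCK_MARKER in response.geturl()   # redirected to the block page
        except urllib.error.URLError:
            return True   # refused or reset connections are treated as blocked

    for url in TEST_URLS:
        print(url, "BLOCKED" if is_blocked(url) else "ALLOWED")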

A section of this chapter is devoted to statistical processing and analysis, with the purpose of creating a framework within which the initial frequencies could be interpreted. The interpretation of the results, from both the initial frequencies and the cross-table comparisons, was completed in accordance with the sections set out in the questionnaire, with the aid of colour-coded charts.

Chapter 6 This chapter presents the final conclusions and recommendations with regard to the research outcomes. Suggestions are made that can be used by the university IT staff and policy makers to create a uniform approach to managing access to online pornography at UJ. In addition, the findings provide valuable guidelines to those stakeholders involved in the development of AUPs and content filtering at an organisational level. The final part of this chapter is dedicated to areas of possible future research to provide continued improvement in content filtering techniques, providing more accurate methods for identifying and preventing access to pornography.

1.5 Research findings

The findings in Chapter 5 will help UJ improve service delivery with regard to managing access to pornography. These findings are centred on the current content filtering methods employed at the different campuses and the current AUP. Some of the significant findings include:

1. With regard to the use of the computer facilities provided by UJ, 42,7% of the sample (435 students) indicated that they used the computer facilities daily. The next significant group of respondents used the computer facilities twice a week. Only 4,6% (47 students) indicated that they used the computer facilities once a month or less.

2. WebCT and Edulink were recorded as the most useful online facilities provided by UJ, with 32,5% (295 students) ranking them as number 1 (on a scale of 1-5, with 1 as most important). This was followed by surfing the Internet at 29% and the email facility at 27,1%. The least used online facility provided by UJ, according to the student respondents, was the library services, which include online catalogues and electronic journals. This facility was rated 5 by 34% (312 students) of the respondents.

3. When students were asked whether the content the University was actively blocking contained academic information, 52,1% stated “Yes”. The next group of 30,2% claimed “No” and 17,7% of the students were unsure of the nature of the content they were trying to access.

4. When asked how often the respondents were exposed to unsolicited pornography, 80% stated never, which indicates that the combined content filter at UJ is fairly effective at suppressing this type of content. This was followed by 12,8% claiming only once or twice and 7,2% indicating they had been exposed more than three times while using the computer facilities provided by UJ.

5. The largest group, 36,7%, stated that exposure to unsolicited pornography did not bother them, while the next largest group responding to this question, 31,3%, deemed it unacceptable. Students who were bothered by exposure to unsolicited pornography constituted 26,3% of the respondents. In addition, 5,7% believed they should have the freedom to access pornography.

6. With regard to gender, females were less tolerant than males of exposure to unsolicited pornography, as 35,6% of females deemed exposure to unsolicited pornography unacceptable, compared to 25,1% of males. Only 2,5% of females believed that students should have the freedom to access pornography, compared to 8,9% of male respondents.

7. When asked to what extent students should have access to pornography, 60,5% stated “None” and 32,6% believed that access should be granted to those who have permission for research. Surprisingly, 6,9% believed that students should be granted total access to pornography.

8. Content labelled obscene, which is illegal to possess, distribute and access in South Africa, received the following response: 11% of students wanted access to bestiality and 10,8% believed that they should be allowed to access content containing acts of simulated rape or violence.

9. With regard to content labelled as child pornography, 23,4% of students believed they should be allowed to access content depicting exposed children or pseudo images. The reason for such a high response may be attributed to a lack of knowledge about the severity of underage sexual activity. The other category of child pornography covers sex acts involving children; 13,8% indicated they should be allowed to access such content.

10. Students were asked how often they used the computer facilities at UJ to access pornography. The majority claimed “Never”, followed by 5,7% claiming once or twice. A small but notable proportion of respondents (2,2%) stated that they used the UJ computer facilities daily to access pornography.

1.6 Conclusions and recommendations

The findings from the research are significant in that they give the University of Johannesburg informative input into the effectiveness of the current content filtering solutions at the various campuses and insight into the development of a comprehensive AUP. Since UJ is a fairly new institution, there is a need to integrate policies and content filtering solutions to accommodate all campuses and ensure standardisation. This approach will require constant updating, as content filtering solutions are continually being improved and new laws, especially with regard to the Internet, are being passed.

As indicated in the findings, the need remains to adopt a single content filtering solution. Lack of standardisation has created a polarising effect in terms of content filtering, with the Bunting Rd and Doornfontein campuses on the one side with DansGuardian and the East Rand, Kingsway and Soweto campuses on the other side with SquidGuard. Of these two products, DansGuardian proved to be more effective at blocking access to online pornography in the testing conducted.

There is no formal UJ AUP in place. The current form of AUP is outlined in a Rand Afrikaans University policy, labelled a “Campus Network User Code”, which was last updated in 2001. The document bears no official UJ logo (see Addendum C). There is a serious need for an updated UJ AUP that addresses the needs of the new organisation.

In conclusion, with regard to the research problem: the University of Johannesburg can, to a large extent, manage access to online pornography through the use of an effective content filter and a comprehensive Acceptable Use Policy.

Chapter 2 Pornography, society and the Internet

2.1 Introduction

Pornography is a commonly used term today, but it is often used loosely, without a formal definition or understanding of the word. It is a taboo subject in most societies, and many people try to ignore the issues at hand and pretend they do not exist, rather than trying to understand and identify the different viewpoints associated with pornography. Many people believe that pornography in general is illegal.

This chapter is devoted to the simplification and understanding of the term pornography. Relevant words associated with pornography will be identified and discussed. There are many different viewpoints on what is considered pornography and, furthermore, on what constitutes illegal pornography. The former US Supreme Court Justice Potter Stewart made a famous and telling statement when he said “I cannot define pornography but I know it when I see it” (Dallas 2005).

Universities and other tertiary education institutions are breeding grounds for young, inquisitive minds. Pornography deals with sexuality, which is very important in the development of young adults. It is important to attain an understanding of pornography before trying to manage access to it, as viewpoints on pornography are complex.

In this chapter the following sub-problems will be discussed:
• What is pornography?
• What is the difference between legal and illegal pornography?
• What is the impact of Internet technologies on pornography?

Once an understanding of the term pornography has been attained, an effective strategy can be implemented that will help manage online access to pornography. The Internet is fast becoming a popular tool for the distribution of pornography and as a form of e-commerce pornography grosses millions of US dollars per year (Ong 2004).

2.2 Defining pornography

Today the word pornography, or “porn”, is commonly used; what was once a taboo subject has become a common occurrence in society through various media. There is, however, a lack of clarity about the definitions and terms associated with pornography. The definition of pornography is determined by a society’s norms and values, as well as its perspective: whether conservative, liberal, post-modern or feminist. These factors come into play when deciding what content is of a pornographic nature.

Most western cultures are more open-minded and take a liberal attitude towards content that may be classified as an art form or as freedom of expression. On the other hand, the same material may be considered intolerable and taboo by many religious societies of Eastern and Middle Eastern heritage, such as Muslim or Hindu nations. It is hard to find a definition that suits all cultures and people, and so one needs to break down the word and consider the definitions of pornography in the context in which they are used.

There is a definite difficulty in describing and defining the word pornography. At the same time, there are a host of other words associated with pornography, such as obscenity and indecency, that fall in the same grey area (Halavais 2005). The word pornography originates from the Greek term for writings about encounters with prostitutes (Rothery 2003). The word has been adapted and is used to describe more than just writings about prostitutes. Both the definition and the medium of delivery have evolved for use in the modern context.

The term “pornography” was coined in England around 1850 and initially had a sense of scholarship about it. This quickly changed and the word began to carry negative connotations (UKTV 2005). For centuries the Greeks and Romans produced many sexually explicit artefacts, including paintings on plaster, vases and statues, and also had a tradition of making sex toys.

The Oxford English Concise Dictionary (2004) defines pornography as printed or visual material intended to stimulate sexual excitement. The Merriam Webster Dictionary of Law (1996) defines pornography as material that depicts erotic behaviour and is intended to cause sexual excitement. Both these definitions refer to the term “erotic”, which is seen as an artistic and more acceptable form of pornography. Defining pornography is further complicated because the legal definition of pornography differentiates between obscene content and pornographic content.

Hogg (1999) and Van De Beer (1992) draw on the encyclopaedia of ethics when defining pornography as sexually explicit depictions of persons, in words or images, created with the primary, proximate aim and hope of eliciting significant sexual arousal as part of the consumption of such material. This definition treats pornography as multimedia, identifying images as well as literature as forms of pornography.

Erotica, a word closely associated with pornography, explores the artistic nature of pornography. The word originates from the Greek word “eros”, meaning love. Erotica includes works of art such as literature, photographs, films, sculpture and paintings that deal with arousing content. In modern times erotica is used to describe the portrayal of human sensuality and sexuality with artistic aspirations. While pornography tends to focus on unemotional lust, erotica denotes material with higher emotional involvement. “Soft pornography” is a similar type of commercial art that lies between erotica and hardcore pornography (Wikipedia 2005).

2.2.1 Defining illegal pornography

When referring to the legal definitions of pornography, one identifies content of an obscene nature, which is deemed illegal by society and by western law. These definitions vary slightly across different legal systems. Two defining court cases have been used internationally as a benchmark to form a definition of illegal pornography (obscenity), namely Roth vs. United States in 1957 and Miller vs. California in 1973.

The first proper challenge to the American First Amendment was the case of Roth vs. United States in 1957. Samuel Roth ran a literary business in New York. He was convicted under federal law of sending obscene, lewd and lascivious material through the mail for advertising and selling a publication called “American Aphrodite”, which contained literary erotica and nude photography. This case formed the basis of the modern definition of obscenity (Wikipedia 2005).

Later, in 1973, an important case was heard on appeal, Miller vs. California. From this case a clear separation of obscenity from the definition of freedom of speech was made. The Miller test was developed, which became the basis of the American definition of obscenity. This case is also a landmark for the South African legal definition of obscenity.

South African law addresses the issue of illegal pornography through the Films and Publications Act. This act identifies three categories of pornography, separating acceptable pornography from illegal and obscene content (Watney 2005):

Category 1: The XX classification applies to a publication, film or other form of communication consisting of: a) bestiality, incest and rape; b) explicit sexual conduct that violates or shows disrespect for the right to human dignity or degrades a person, and which constitutes incitement to cause harm; and c) the explicit infliction of, or explicit effect of, extreme violence, which constitutes incitement to cause harm. Content in this category is illegal to distribute and possess in South Africa (Watney 2005).

Category 2: X18 classification of content. Explicit sexual conduct, simulated or real, which in the case of sexual intercourse includes explicit visual presentation of genitals. This form of pornography is not illegal to possess or distribute to those over the age of 18.

Category 3: This category classifies child pornography. It includes any image, however created, or any description of a person, real or simulated, who is depicted or described as being under the age of 18 and who is: a) engaged in sexual conduct; b) participating in or assisting another to participate in sexual conduct; or c) showing or describing the body or any part of the body in a manner which amounts to sexual exploitation. This category of content is illegal in South Africa, and people convicted of offences involving material in this category face harsh prison sentences.

The Supreme Court of Canada defines obscenity as content containing explicit sex with violence, explicit sex without violence but treating people in a degrading or dehumanising way, or explicit sex that is neither degrading nor violent (Hogg 1999).

In United States law, obscenity falls into three major categories (Larsen 1994:82-83):
1. The dominant theme of the material, taken as a whole, appeals to prurient interests in sex.
2. The material is blatantly offensive because it confronts contemporary community standards regarding the depiction of sexual matters.
3. The material lacks social value.

Contrary to popular belief not all forms of pornography are illegal. Only content that is deemed obscene is illegal, as is any other pornographic content in the hands of people under the age of 18 years. One very disturbing form of pornography is that of child pornography. This content has become more popular with the advancement in technology, allowing people to distribute and search for content of such nature anonymously with little chance of being traced.

2.2.2 Child pornography

Child pornography, a category of obscene content, is one of the most controversial dimensions of pornography; it is virtually universally illegal yet not easy to define for legislative purposes. In the light of child pornography, one is faced with important questions about whether the use of technology for the consumption of child pornography amplifies or triggers abusive behaviour that might otherwise have remained dormant (Adam 2001: 36).

There are different types of child pornography (involving minors under the age of 18); some have a higher legal priority than others (Rothery 2003):
• Erotic child pornography may include content of minors fully clothed or wearing mildly revealing attire. This may only be appealing to a few. Possession and production of such material is deemed unethical and in some extreme cases illegal. Nudism of minors can in most cases be considered illegal, provided it is shown to be indicative of adult sexual interest in children. These images are often referred to as covert photographs.
• Posed pictures of minors involving nudity. This is seen as the illegal equivalent of adult softcore pornography and usually involves professional photographers.
• Explicit sex, which usually depicts children posing and exposing their genitals in a sexually explicit way. This is the most severe form of child pornography. In the same content category is content of adults or other children simulating sexual assault on minors (Rothery 2003).

The most comprehensive international treaty regarding cybercrime is the Convention on Cybercrime, adopted by the Council of Europe in 2001. As part of the Convention on Cybercrime, guidelines were established to seek stronger protection measures for children with regard to pornography. The following aspects of the electronic production, possession and distribution of child pornography are criminalised (Watney 2006: 230):
• production of child pornography for the purpose of distribution through a computer system;
• making child pornography available through computer systems;
• distribution or transmission of child pornography through a computer system;
• procuring child pornography through a computer system for oneself or another; and
• possession of child pornography in a computer system or on a computer data storage medium.

For the purpose of the Cybercrime Convention, child pornography is defined as pornographic material that visually depicts minors engaged in sexually explicit conduct, a person appearing to be a minor involved in sexually explicit conduct and realistic images representing minors engaged in sexually explicit conduct. The convention identifies a minor as a person under the age of 18 years.

Not all child pornography is what it seems. Images that may have been constructed or altered are referred to as “pseudo images”. According to Taylor (2002) these images can be classified into three types:
• digitally altered images of bodies, such as children in swimming costumes, where the costume may have been removed digitally or the face of a child may have been pasted onto the body of a young adult;
• separate images that may have been combined into one photo, superimposing images; and
• a montage of various pictures, some of which are sexual in nature.

One of the most debated issues concerning child pornography is that of pseudo images. These images are digitally constructed to look like the real thing. Pseudo images are common in everyday media, where they are used, for example, to ridicule politicians. In that form they are quite harmless in comparison with the reality of children being used to illustrate sexual activities, which is widespread on the Internet (Adam 2001:37).

2.3 Taxonomies as research tools

Taxonomies are vital research tools: they involve ordering and classification, the most important and basic step in conducting scientific enquiry. A taxonomic classification focuses on general laws and principles that describe and characterise the phenomena or system of interest (Scherpereel 2006:123).

The goal of a taxonomy for this research is to aid in defining pornographic content in order to measure online pornography access at UJ. In addition, the taxonomy is used to identify the students’ view on acceptable access and measure the effectiveness of current content filtering practices at UJ with regard to various types of pornography mentioned in the taxonomy.

2.4 Pornography taxonomy

In order to better define pornography as a broad subject, one needs to identify the different categories of pornography. Not all of these categories are forbidden by the norms of society or illegal by law. People are unaware of the variety of content available through various media, and are often guilty of assuming that all pornographic content is forbidden by society and therefore labelling all such content illegal.

The main categories identified in the taxonomy (see figure 2.1) are softcore and hardcore. From these main categories, pornography is dissected into sub-sections. A detailed description of these categories is given below. Of the 10 categories identified, only four are illegal (indicated in yellow).

[Figure 2.1: Pornography taxonomy]

2.4.1 Softcore This content consists of the following categories: artistic nudity, erotic literature, child erotica and visual erotica. This content is seen as more acceptable by society in comparison to the other pornography categories outlined in the taxonomy.

• Erotic literature: This content is not in a visual format but rather in the format of text. These writings depict sexual encounters and can be real or fantasy. The purpose of these writings is to create sexual arousal and explore sexuality.

• Visual erotica: These images contain some nudity and are not restricted to people, or even to living subjects. Images of non-living objects such as statues, or even objects found in nature, can be labelled as visually erotic. The defining factor in this category is the association or connection that the audience makes with the image, which may trigger sexual interest. It is difficult to differentiate this category from artistic nudity, but the most significant difference is that visual erotica is not restricted to images of humans or living entities, which is the case with artistic nudity.

• Child erotica: These images contain subjects under the age of 18 in suggestive situations. This classification is incredibly difficult to define, as inciting of personal arousal is determined by a number of factors. This content is not illegal like the other forms of child pornography in the taxonomy and to some these images may seem harmless representations of innocence and youth.

• Artistic nudity: These images contain “tasteful” nudity involving various models or volunteers. The word tasteful denotes some form of artistic impression and focuses on beauty or unfamiliarity. These images are generally not too revealing and incite curiosity in the mind of the audience.

2.4.2 Hardcore This content category consists of frontal close-ups, penetration and obscenity. The first two categories (frontal close-ups and penetration) are legal under the South African Films and Publications Act.

• Frontal close-ups: These images contain nudity in a sexually suggestive manner. This category is restricted to humans. These images are considered to be graphic and focus on specific regions of the human body, mainly the torso, genitals and buttocks. Nothing is left to the imagination of the audience. Content of this sort has been popularised by the more explicit adult magazines.

• Penetration: These images depict intercourse between two or more human beings. This content was popularised with the growth of the adult film industry, largely owing to the invention of the video cassette recorder (VCR) and the VHS tape. These media made the recording of such content accessible to the man in the street. Today the Internet is a popular medium for the distribution of such content.

2.4.2.1 Obscenity (obscene content) The third category, obscenity or obscene content, is illegal under the South African Films and Publications Act. This content category consists of child pornography (discussed in the sub-section below), simulated rape or violence, and bestiality. These types of pornography are illegal in South Africa and are highlighted in yellow in the taxonomy to illustrate their illegal nature.

• Simulated rape or violence: These images portray people being forced into sexual relations. The images focus on the domination of individuals who are forced into submissive situations. Whether real or acted, the outcome is the same.

• Bestiality: These images portray animals, mostly domestic pets or farm animals, receiving stimulation in a sexual manner from humans. This interaction of humans and animals can be highly disturbing and a blatant violation of animal rights.

2.4.2.1.1 Child pornography This content falls under the obscenity category and consists of sex acts involving minors and exposed or pseudo images. These types of child pornography fall under obscenity and remain illegal; for this reason they are highlighted in yellow.

• Exposed or pseudo images: These images contain real, modified or animated images of minors (under the age of 18) revealing their naked bodies, including images of partial nudity. This category of images has been popularised largely by an art form, “hentai”. Hentai is a Japanese word for abnormality or metamorphosis but is often used as slang to refer to perversion. Elements of sexual fantasy are presented in a way that would be impossible to film. Not all forms of hentai portray images of minors, but there is a sub-category, “lolicon”, that illustrates minors in sexual relations (Wikipedia 2007). Lolicon is illegal in South Africa but is legal in some other countries.

• Sex acts involving minors: Images contain minors, who are perceived to be under the age of 18, either involved with adults or other minors, in sexual acts. The age for this is not universal and differs, depending on the law. This content is not restricted to adults and minors and also includes sexual interaction between two minors.
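For reference, the taxonomy described in sections 2.4.1 and 2.4.2 can be summarised as a simple data structure, for instance for a content filter that needs to map a detected category to its legal status under the Films and Publications Act. The sketch below is an illustrative encoding of the ten categories and is not part of the original taxonomy figure.

    # Illustrative encoding of the pornography taxonomy (figure 2.1).
    # The legal/illegal labels reflect the taxonomy's classification under the
    # SA Films and Publications Act; the four obscene categories are marked illegal.
    TAXONOMY = {
        "softcore": {
            "artistic nudity": "legal",
            "erotic literature": "legal",
            "child erotica": "legal",
            "visual erotica": "legal",
        },
        "hardcore": {
            "frontal close-ups": "legal",
            "penetration": "legal",
            "obscenity": {
                "simulated rape or violence": "illegal",
                "bestiality": "illegal",
                "child pornography": {
                    "exposed or pseudo images": "illegal",
                    "sex acts involving minors": "illegal",
                },
            },
        },
    }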

2.5 Perspectives on pornography

The modern era of democratic and human rights has introduced a range of movements and perspectives on the challenges that face moral society. People are encouraged to voice their opinions and to avoid conformity to societal norms. Each of the following perspectives tries to consider the impact pornography has had on society and strives to identify the underlying cause of the increase in the consumption and distribution of pornography. Some of these perspectives take a more radical approach to society, while others may seem easier to understand and associate with.

These perspectives are what make the topic of managing pornography usage so controversial. Different groups or perspectives have varying theories on the role of managing such content, each of them having different motives for their views. On the whole, people’s perspective on pornography will influence their definition of pornography, making it hard to be objective. The following is a discussion of some of these perspectives:

2.5.1 Conservative perspective

This perspective is based on the traditional law of obscenity and concentrates on how pornography corrupts virtue and social order. This view believes that sexual desires should be restrained by rationality, interpersonal commitment and responsibilities preserved by marriage and commitment to the family. Pornography is seen as a disease, as it encourages recreational sex. By encouraging recreational sex it undermines the sanctity of marriage and threatens Judaeo-Christian family values (McNair 1996:49). The conservative perspective believes that pornography contributes to sexual violence and is based on religion or secular moral theory (Downs 2005).

2.5.2 Feminist perspective

This perspective rejects the moral and religious views of the conservative perspective and focuses on how pornography contributes to the inequality and domination of women by men in society. Pornography is not seen as a display of sex but rather as a display of power, depicting women as objects who exist to fulfil the pleasure of men. Some feminists distinguish between pornography and erotica, identifying erotic material as portraying men and women as equals (Downs 2005). Feminist activist Diana Russell (2004) defines pornography as “material that combines sex and/or exposure of genitals with abuse or degradation in a manner that appears to endorse, condone or encourage such behaviour”.

Other feminists see pornography as the portrayal of humans as sexual animals, their beings stripped down to a sexual essence for the voyeuristic pleasure (sexual gratification from looking at others’ sexual actions) of the spectator. Pornography dehumanises women and induces misogyny (McNair 1996:47).

2.5.3 Post-modern perspective

Post-modern critics accuse conservatives and feminists of adopting a narrow view of sex and equality. The post-modern view is unsupportive of complex social theories and views pornography as part of society that has many different meanings and effects.

2.5.4 Liberal perspective

Liberals tolerate any consensual or voluntary form of sexual activity as long as it does not impose on others. Pornography is seen as a legitimate expression of individual preference. Liberals believe that such judgements lie in the eye of the beholder and reject state restrictions on individual choice (Downs 2005). According to McNair (1996:47) the “liberal” or “liberation” perspective on pornography views pornography as harmless and believes it should be freely available to the public.

2.6 Origins of modern-day pornography

Only in the late 20th century did the media begin to promote pornography openly. Up until the middle of the 19th century there were widespread social, religious and cultural prohibitions against pornography that kept content of this nature hidden. Very early forms of pornography were artistic, and such material was admired by the literate and well-to-do, who nevertheless tended to worry about its possible effects. In the late 19th century social and economic factors led to the exposure of the growing working and lower classes to pornography. Unlike the educated higher social classes, those newly exposed to such material were regarded as susceptible to its corrupting influence (St. James Encyclopaedia of Pop Culture 2002).

The migration of the working class to urban areas, along with the expansion of printing operations, led to a rise in the sex trade in urban areas. By the turn of the century a number of different forms of pornography became available in the form of French postcards with pictures of naked women and booklets containing animated sex acts. In 1904 the first calendar featuring partially naked women appeared.

In the 1960s the sexual revolution increased the public’s tolerance of material of a sexual nature. The main forms of pornography in the 20th century were magazines and films. A landmark in the publishing and printing revolution came in 1953 with the launch of Playboy magazine, showing nude pictures of the American “girl next door” and leading the way for other publications to follow, such as Hustler and Penthouse.

Nudity in films also has a long and interesting history. The silent films of D.W. Griffith in 1915 showed quick frames of nudity. Only in the 1960s did movies containing explicit sex acts catch on in the commercial market and reach the movie houses. The VCR led to a boom in the pornographic video industry. In 1991 alone, 410 million adult movies were rented in the United States (St. James Encyclopaedia of Pop Culture 2002).

Throughout the history of modern media, from vernacular speech to movable print, photography, the VCR, cable and pay TV, the Internet and writeable CDs, pornography has made use of technology to increase production and dissemination. Johnson (1998) quotes Paglia (1990:24) as saying that “great art is always flanked by its darker sisters, blasphemy and pornography”. The same can be applied to the media.

The rise of the Internet and the WWW led to a new debate over pornography. The Internet, along with the WWW, provided new anonymous and easy access to pornography. Studies conducted in 1999 by a Washington based software filtering company, N2H2, revealed that there were over 1.3 million Web sites and over 260 million Web pages containing pornography (Bailey 1999). With growing technology and Information Communication Technologies (ICTs), more and more bandwidth has been made available, making it easier to disseminate and access multimedia pornography.

In 2003 pornography grossed in excess of $8 billion in the United States alone, which is far greater than the combined revenue of the ABC, CBS and NBC networks ($6 billion), making it a highly profitable industry (Family Safe Media 2006). Pornography has proven to be a very profitable e-commerce venture and has attracted the attention of investors looking for a successful business opportunity with low cost and high earning potential. See chart 2.1 for a summary of the composition of the pornography industry in 2006.

[Pie chart: the 2006 pornography industry broken down into adult videos, escort services, magazines, sex clubs, phone sex, cable and pay-per-view, Internet, CD-ROM, novelties and other.]

Chart 2.1: Composition of the pornography industry in 2006 (Family Safe Media 2007)

2.7 Pornography and Internet technologies

The Internet started in the 1960s as a project of the United States defence establishment. Universities saw the need for and importance of shared communication and in 1962 teamed up with the Defence Advanced Research Projects Agency (DARPA) to develop the DARPA network (DARPANET). In 1969 the network was brought online as the Advanced Research Projects Agency Network (ARPANET). Developments continued at a rapid pace and in the 1970s the Transmission Control Protocol/Internet Protocol (TCP/IP) was proposed as a communication protocol. This was a breakthrough for the Internet, providing the first credited communication standard for the Internet (Howe 2004).

Another significant milestone for the Internet came in 1991 when Tim Berners-Lee proposed a new protocol for communication, the Hyper Text Transfer Protocol (HTTP). This led to the inception of the WWW. The WWW has become one of the most popular applications on the Internet (Howe 2004). Along with the WWW came e-commerce, the buying and selling of commodities over the Internet, which changed economics and marketing to suit the online environment.

Along with e-commerce came one of the easiest commodities to market: pornography. The WWW created an environment in which online pornography became more and more accessible as technology improved. With the consistent growth of the online population, more and more users are being exposed to pornography, whether planned or accidental. Web sites containing pornographic content are among the most frequently visited on the Internet (Thomas 1997:202).

Peer-to-peer networking is an Internet application that is vulnerable to exposure to pornography owing to its decentralised architecture and lack of control. A peer-to-peer network consists of computers that are connected to one another over the Internet and share resources and files with one another in a decentralised architecture. According to Ropelato (2003) an astonishing 35% of all monthly downloads on peer-to-peer networks contain pornography. Internet Relay Chat (IRC) is another Internet application that is exploited by pornographers. IRC is a multi-user chat system that allows people to chat either in a group or individually (Satkofsky 2004).

File Transfer Protocol (FTP) is another application that can be used for obtaining pornography. FTP allows computers to communicate with one another via the Internet, allowing for quick and easy transfer of files in the format of images and videos. E-mail is an extensively used Internet application, which is commonly misused through the sending of “harmless entertaining” images containing pornography. According to Bissette (2004) over 2.5 billion pornographic emails are sent daily at an average of 4.5 images per user.

Another popular Internet application is newsgroups. Newsgroups are repositories within the Usenet system, similar to discussion groups but technically different (Wikipedia 2005). A newsgroup consists of notes posted to a central site and distributed through Usenet via the Network News Transfer Protocol (NNTP). Newsgroups are organised into subject hierarchies, with the first letters of the group name indicating the subject category and sub-categories. Some newsgroups are overseen by a moderator, who can choose what content to publish and what not. Most of the newsgroups on the Internet are unmoderated (Kressin 1997:49).

These unmoderated newsgroups, and even those that are moderated, have been a breeding ground for the distribution of pornography. Some of the most disturbing content is located in these newsgroups. It is almost impossible to regulate and manage what content is distributed through these networks. Many content filtering software applications (discussed in detail in Chapter 3) do not allow access to newsgroups at all, owing to the inability to screen their content.

It is clear that pornography has developed alongside technology from live peep shows to the development of photography. It has recently become possible to access pornography in multimedia formats. Online pornography is a rapidly growing e-commerce venture promising high returns with minimal marketing needed (Ong 2004).

Blythe and Jones (2004:75) believe that pornography is the reason why some people have PCs and access to the Internet. Sex and pornography have become the most frequently searched terms since the Web became commercially available. Pornography on the Internet is considered to be a money making industry, with an estimated annual income of over US $10 billion. After the dot.com bust between 1999 and 2001, technical online work in the pornography industry became an important prospect for newly unemployed computer engineers and designers.

The success of the online pornography industry has served as an example for other commercial applications and e-commerce ventures on the Internet. Sexual imagery is increasingly becoming more evident in mainstream media. This view is depicted in the fading boundary between softcore pornography and modern music videos (Blythe & Jones 2004:76).

Understanding this new phenomenon requires careful examination of the behaviour connected with the creation and use of pornography on the Internet. One should try to understand how pornography encourages new and more liberal modes of sexual expression, while at the same time recognising that the dark side of pornography has an effect on the morals of society (Adam 2001:37).

The Internet has presented a severe setback to the forces established to regulate pornography. In physical form, such as books, magazines and videos, pornography can be seized and destroyed, but in digital format it is almost impossible to trace and destroy (Bailey 1999). These new challenges have also exposed legal loopholes, as Web sites can be hosted in a country where the law allows such content while being accessed in countries where such content is illegal. There is very little that can be done about this.

Cyberspace (the Internet) is an environment conducive to criminal activity, as it is difficult to trace and catch offenders. At the moment child pornography is a major concern. Criminals can outsmart the authorities by using anonymous interfaces, nicknames and fake addresses. Images currently available via the Internet range from innocent photographs of children to children posing in a sexually explicit manner (Guttman 1999).

A European research project, COPINE, found that child pornography amounts to 0.07% of 40 000 newsgroups worldwide (Guttman 1999). Studies undertaken show that the most offensive material is located in a small number of newsgroups, although people run the danger of accidentally coming across such material. The Internet has made child pornography more visible and accessible and gives paedophiles a sense of being connected to a community of like-minded people. Owing to the ease of distribution and collection, anonymity and convenience, people have an easier time pursuing their taboo sexual interests.

The Internet is undoubtedly the main contemporary medium for the distribution of child pornography. It allows a degree of anonymity, affording people with a sexual interest in children the chance to download images from newsgroups in relative safety, while others may be actively involved in the exchange of pictures via Internet applications such as IRC and video conferencing (Rothery 2003).

2.8 The influence of pornography on universities

Pornography has caused a stir at many western universities, where students exercise their right to freedom of expression coupled with the curiosity of the young mind. Universities in the United States have started magazines focusing on sex and sexuality, and some universities offer academic courses on pornography as a film genre. There is clear evidence that the liberalisation of pornography in the media is having an effect on society as well as on education.

More and more universities are offering pornography as an academic course (Cullen 2006). Some of the activities in these courses involve men and women watching X-rated movies, viewing pornography on the Internet and visiting sex shops. Some United States universities, such as New York University, UMass Amherst, Penn State, Vanderbilt, Northwestern, San Francisco State, UCLA, UC Berkeley, UC Santa Barbara and Arizona State University, offer pornography as a possible career choice (Cullen 2006).

Linda Williams, a well-known film specialist, started presenting a course on the “history of moving image pornography” back in 1994. In this course she reviews material ranging from early underground footage for male audiences to the couples films of the 1970s and more recent proliferating varieties of gay, lesbian and bisexual material. There are student magazines, such as the H-Bomb at Harvard and Boink at Boston University, which are based on sexual experimentation and liberation. At Berkeley, students can earn as many credits as for an engineering course by taking a course on female sexuality (Chheuy 2004).

2.9 Summary

In response to the evolution of the media and technology, pornography is becoming a more prominent feature in our lives. There are mixed perspectives on the perceived use and danger of pornography for society. Pornography today even features as part of university curricula for some universities in the United States.

Internet applications have created a breeding ground for pornographers who are looking to distribute their content to an easily reachable international mass market. Technologies make production and dissemination easier. From its humble beginnings as an art form, pornography has become one of the most successful e-commerce ventures, turning over billions of US dollars a year. The difficulty of defining pornography makes it hard to manage.

Pornography comes in many different categories (see taxonomy figure 2.1), some more tolerated than others. Some content may be legal, while certain content remains illegal under the SA Films and Publications Act. Universities have to decide what content will be tolerated and what measures will be taken against those possessing and distributing content that is not tolerated. Physical control methods in the form of content filtering (discussed in Chapter 3) can be implemented to enforce a university’s stand on pornography.

Chapter 3 Online content filters

3.1 Introduction

Content filtering is a form of censorship used on the WWW as well as on other Internet applications. Censorship of content is nothing new. For thousands of years censorship has been employed by those in a position of power, to varying effect. The term often has negative connotations, as it restricts one’s right to freedom of choice. Filtering content on the Internet is fairly new and is seen as necessary by some, owing to the evolving nature of content on the Internet.

In this chapter the following sub-problems will be investigated:
• What is online content filtering?
• What are the advantages and disadvantages of different content filtering solutions?

The focal point of this chapter is content filtering technologies. Different content filtering methods are described, along with their strengths and weaknesses. This chapter also investigates possible future developments in content filtering that aim to provide a more comprehensive filtering service.
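To make the basic methods concrete before they are discussed, the sketch below illustrates two of the simplest techniques in a few lines of Python: URL blocklisting and keyword filtering. The domain list, keyword list and example URLs are hypothetical placeholders; real products such as DansGuardian and SquidGuard combine these techniques with category databases, phrase weighting and regular expressions.

    # Minimal sketch of two basic content filtering methods: URL blocking and keyword filtering.
    # The blocklist, keyword list and example requests are hypothetical placeholders.
    from urllib.parse import urlparse

    BLOCKED_DOMAINS = {"badsite.example", "adult.example"}   # URL blocklist (by domain)
    BLOCKED_KEYWORDS = {"porn", "xxx"}                       # naive keyword list

    def is_blocked(url: str, page_text: str = "") -> bool:
        """Return True if the domain is blocklisted or the page text contains a blocked keyword."""
        domain = urlparse(url).hostname or ""
        if domain in BLOCKED_DOMAINS:
            return True                                        # URL blocking: match known domains
        text = page_text.lower()
        return any(word in text for word in BLOCKED_KEYWORDS)  # keyword filtering: scan page content

    print(is_blocked("http://badsite.example/index.html"))           # True (domain match)
    print(is_blocked("http://news.example/article", "sports news"))  # False (no keyword hit)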

3.2 Censorship

The term censorship originates from the Latin “censere”, meaning to assess or to give one’s opinion. The Roman censors were magistrates who took the census count and acted as inspectors of moral conduct. Today censorship involves the suppression of ideas and information that people, groups or government officials find objectionable or dangerous. Censors try to use the power of the state to impose their views on what is truthful, appropriate or offensive. Censorship may take place at any point in time, before or after an incident occurs, in order to deter others from similar expression (Sein 2001).

Some people believe that censorship never dies and that it simply changes form. There have been many examples throughout history where institutionalised powers have set out to control the creation and dissemination of ideas. Nazi Germany, Stalin’s Russia, Apartheid South Africa and many other states today have set up the control of ideas as part of wider ideological and social control (Allard & Hannabuss 2001).

Some believe that dangers face society from extensive access to all forms of information and ideas, especially regarding the Internet and associated online applications. Too much control can lead to tyranny while too much tolerance can lead to passive and unthinking relativism. One way of looking at censorship is to see it as part of the dynamic process of intellectual production and dissemination of ideas.

One can look at extreme cases of government control over the Internet, such as Thailand and China. For 20 years the Internet has grown exponentially, amounting to a self-regulation experiment in anarchic development that has become too large to be maintained in its originally intended form (Ebbs & Rheingold 1997:59). One has to ask the question: where should one draw the line?

South Africa is no stranger to the idea of censorship. The Apartheid era lasted from the 1950s until 1994. This period was characterised by severe censorship in an attempt to strangle African extra-parliamentary liberation activity, strongly resembling censorship practices in the USSR. The censorship affected aspects of cultural, intellectual and educational life (Newth 2001).

3.2.1 The origins of pornography censorship

Censorship has been around for centuries, but only after the invention of Gutenberg’s printing press did the control of content become a major concern. The duplication of content spawned the need to censor certain content. Technology has made duplication easier and more affordable; information in a digital format, such as hypertext, can be duplicated with little effort or cost.

During the American Civil War, photography found two new uses around 1864. One was photography on the battlefield, while the other was the creation of pornographic photographs. Soldiers demanded more than just letters from home; they were in search of something a little more exciting, such as erotica. This soon became very popular and pornographic literature and photographs had to be restricted by the government’s postal service. The American Congress passed the first law prohibiting obscenity being distributed via mail, although by the time the law was passed the war was over (Johnson 1996).

3.2.2 Perspectives on pornography censorship

Different approaches and perspectives arise from differing views on the possible role of censorship in society. It is impossible to find a uniform outlook on censorship, as many different underlying factors are involved.

The libertarian perspective believes that there is no harm in pornography and insists that only if the state can scientifically prove that there is harm in exposure to pornography can the state justify censorship. Any censorship of pornography without empirical evidence of social harm is considered an illegitimate exercise of state power in a democratic society. From this perspective, since research has yielded no conclusive findings on the link between pornography and harm, censorship is unjustified (Sandy 2001).

The conservative perspective views pornography as a sexual practice outside the majority norm and believes it threatens society itself. Sexuality is seen as a force of nature that is dangerous and out of control. Society is threatened by the perversion, immorality and family disintegration that can be related to the increase in the distribution of pornography. It degrades women and corrupts men. It is therefore the state’s responsibility to take active measures to curb the production and distribution of pornography.

The radical feminist perspective views sexuality as shaped by patriarchal society; in such a society women are oppressed by men. Women are restricted to expressing their sexuality on men’s terms. Pornography is seen as violence against women and their rights and is seen as degrading. Sexuality is the main source of the oppression of women. Radical feminists believe that all pornography harms women and that censorship is crucial (Sandy 2001).

3.3 Censorship in universities

As rapidly as technology is changing, universities are being faced with issues regarding the use of information technology on their campuses. University staff and students are

increasingly able to access material from a variety of sources that would be considered offensive to some.

Universities are faced with some tough decisions on the issue of the behaviour of their community members. It is crucial that a balance be reached between the existing laws and the sensitivity of members on academic campuses, as any screening or restriction is a dangerous step towards the loss of liberty. Universities should make consistent rules regarding the values of university conduct. Strong and frequent statements on the intended use of IT resources on campuses are essential (Rezmierski 1994).

Tertiary education institutions are often hailed as bastions of free speech, places where censorship is resisted and students are given the opportunity to explore their different opinions. This notion was developed in a pre-Internet environment where information flowed less freely. Today, along with a wealth of information and Web sites that are dedicated to harmless academic knowledge, content of a different nature exists which promotes pornography, racism and criminal activity. Controversial information is often accessed in computer laboratories, which can potentially expose users and bystanders to offensive content directly or indirectly (Peace 2003:105).

The rise of the Internet has expanded the censorship debate to include pornography accessed via the Internet. Universities are faced with a moral debate on whether to censor or not and where to draw the line. Basic censoring of sites or newsgroups based on keywords or other filtering methods, discussed later in the chapter, can inadvertently restrict access to legitimate or academically focused Web sites.

One approach specifically used by universities is the filtering of information based on whether the content is legal or illegal and whether it is ethical or unethical. In this approach legal and ethical information must be uncensored and made available to all seeking such information. Problems do arise (see figure 3.1) when information is either illegal but ethical (e.g. gambling) or legal but unethical (e.g. pornography).

[Figure 3.1 classifies content on two axes, illegal–legal and ethical–unethical: gambling is placed as illegal but ethical, e-auctions as legal and ethical, the selling of banned substances as illegal and unethical, and pornography (certain categories) as legal but unethical.]

Figure 3.1: Classification of content (Peace 2003:106)

In most scenarios where decisions to censor are dealt with, institutions should involve all members of the university’s community, with the help of legal advisors, to decide on a policy on online pornography (see Chapter 4). In practice, the views of the public, students and staff are often ignored. Input from all affected parties is essential when making an informed decision (Peace 2003:107).

3.4 Content filtering

A content filter is one or more software applications that work together to prevent people from viewing or accessing undesirable content via the Internet. Hochheiser (2001) identifies two components to content filtering: content rating systems and protocol filtering.

Content rating involves the categorisation of content that requires software to analyse it before the user is allowed to access the content. Some form of artificial intelligence is necessary for the software to determine whether or not the user may be granted access to the content.

The other type mentioned by Hochheiser (2001) is protocol-based filtering. Here the filtering software determines whether or not access will be granted based on the communication protocol used to receive the information. Depending on the application being accessed, a unique protocol will be used to send and receive information. Newsgroups, as discussed in Chapter 2, use the Usenet (NNTP) protocol, which is different from the WWW’s hypertext protocol. This form of content filtering can be performed with relative ease.

Hochheiser’s definition does not mention various other types of content filtering (discussed later in the chapter) that are available to the end user. Schneider (1997:xiv) views Internet filters, also known as content filters, as mechanical tools wrapped around subjective judgement, as they are designed to block Internet content. This content is usually identified and categorised. Some filters try to block keywords, some try to block individual sites and some try to block both. Most filters are constructed to block many different kinds of content (pornography, violence, racism etc.) and some try to limit access to content defined as obscene.

Gartner Consulting (2001) defines content filtering as the ability to limit what content users may access. Web site blocking or filtering is implemented through filters, which impose limitations on Internet use to manage access to prohibited content.

For the purpose of this research, content filtering is defined as software applications designed to identify predefined content using various methods (i.e. keyword identification, URL identification, protocol identification and image classification) and prevent access to content falling under the predefined classifications. There are many different methods that can be employed to do so and in some cases more than one method can be used simultaneously to increase accuracy. The performance of a content filtering mechanism is measured by the positive as well as the negative identification of content. It is very difficult to strike a balance between over-blocking (denying access to relevant content) and under-blocking (failing to filter out unwanted content).

3.4.1 Different content filtering solutions

There are many different content filtering solutions available, each of them having a different approach to filtering unwanted content. Some of these approaches are very simple and may have a low accuracy, whereas some of the more complex filtering systems ensure greater accuracy. It is very simple to create a system that can filter out pornographic content. The challenge lies in developing systems that allow access to content of a sexual nature that is not considered pornography, such as sex education and medical journals dealing with human reproduction. Owing to the simplicity of the filtering application, these content filtering solutions are often too restrictive and deny access to content containing no pornography, as discussed later in this chapter.

It is very important that in an academic environment access to content is restricted accurately. Universities need content filtering solutions that are accurate in filtering out unwanted content while allowing access to desirable content. It can be difficult to achieve a state where content filtering is objective in denying access to resources. Each of the following approaches has its own method of filtering content, giving it particular strengths, while at the same time it faces setbacks that restrict its performance. A discussion of some of these methods follows.

3.4.2 Keyword filtering

Keyword filtering, also known as keyword blocking, uses text searches to categorise and block Web sites. Web sites are blocked if they contain objectionable terms (Gartner Consulting 2001). There are other terms used to describe this type of content filtering, viz “content identification,” “content analysis,” “dynamic document review” and “phrase blocking”. All of these methods require software able to scan content and look for possibly objectionable terms as set up by the users of the filtering software (Schneider 1998).

Keyword filtering is a fairly basic form of content filtering and is simple to programme and set up. Keyword filters use a list of terms to analyse content and determine whether or not access should be granted. This is a text-based approach that scans the metatags attached to Web sites, which give insight into the content on display. Search engines use a similar method, using spiders or crawlers to return hits to searchers’ requests. This approach relies on the assumption that the metatags are correctly entered and mirror the content being published on a Web site. Keyword filters are unable to classify images accurately.

Once a keyword filter has identified an offensive term it can be programmed to act in different ways. The first approach is very common: as soon as the filter identifies an objectionable term, it stops receiving data from the Web server and does not display the Web site requested. The second approach displays the Web site but obscures the term or terms in question. The third approach allows the user to access the Web site but only displays those parts of it that contain no objectionable terms; Web pages without objectionable terms remain accessible. The fourth approach will actually close down the browser once it comes across a term in question. This approach is suitable for minors and may not work well with adults (Schneider 1998).

Some software does not display the objectionable term but replaces it with “####”. Keyword filters seldom have the intelligence to understand the context in which a word is used, or to block only the word rather than the entire Web site or Web page. This creates problems for the end user of such approaches to content filtering (Hunter 1999).

Most pornography is in the format of images and videos. Keyword filters are unable to analyse these formats, making them less effective at identifying pornography; they may be better suited to identifying other forms of unwanted content, such as hate speech and racism (Gartner Consulting 2001). When blocking content on the basis of words, one needs to be very careful and consider what content should and should not be blocked. Problems arise with certain words like “roger”; to some this may seem a harmless term, but in Australian slang it refers to a penis, also referred to as a “cock” in some dialects of English (Schneider 1998).

One of the biggest problems with keyword filters is that they are unable to identify the context in which a word has been used. Terms used in sex education will be blocked, as well as those referring to parts of the human body, which are often used in education. This method is crude and inflexible, needing many adjustments to make it a success.
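To illustrate this context-blindness, the following minimal Python sketch shows how a naive substring-based keyword filter behaves. The blocklist, page texts and function name are hypothetical examples, not drawn from any product discussed in this study.

```python
# A minimal sketch of a keyword (phrase-blocking) filter; real products use
# much larger term lists and additional rules.
BLOCKED_TERMS = ["sex", "xxx"]   # hypothetical terms deemed objectionable

def is_blocked(page_text: str) -> bool:
    """Return True if any blocked term occurs anywhere in the page text."""
    text = page_text.lower()
    return any(term in text for term in BLOCKED_TERMS)

# Context-blindness in action: a harmless page is blocked because
# "Sussex" contains the substring "sex" (over-blocking).
print(is_blocked("Tourist information for Sussex"))    # True  - wrongly blocked
print(is_blocked("Engineering faculty timetable"))     # False - allowed
```

The same weakness underlies the “Sussex” example discussed later in this chapter (see figure 3.3).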

The positive side of keyword blocking is that it is very easy to set up. Users have to determine what terms are deemed objectionable and enter these terms into the system. Once this is done, one can always add more terms or delete terms if problems have occurred. This content filtering solution works very well in environments that can afford to be very restrictive, such as primary schools or certain work environments, but it is too simplistic, leading to over-blocking, and is thus unsuitable for a university environment where a very wide range of subjects and issues are researched.

3.4.3 URL and Web site blocking

This approach requires humans to compile a list of URLs that contain pornography and other unwanted content. Another name given to this technique is “list blocking”. These URLs are the addresses where the Web sites can be located on the Internet. If one blocks access to a URL, users will not be allowed access to that particular Web site through that particular URL. This is very closely related to Web site blocking. Web sites may remain the same but change their URL or even have multiple URLs.

On a similar level people are able to block access to Internet Protocol (IP) addresses. Each URL is attached to an IP address. This IP address is used to identify the source Web site and destination hosts. An IP address is a 32-bit address (Wave Technologies 1998:230). IP addresses may be linked to multiple URLs, so an IP address can be seen as the root of a Web site. If access to an IP address is blocked, the Web site will remain inaccessible regardless of how many different URLs it may be linked to. IP blocking is the most effective method of preventing access to Web sites containing questionable content. It is very similar to URL blocking but works at a different level.
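As a rough illustration of the relationship between URL and IP blocking, the Python sketch below checks a requested URL against a host blocklist and then resolves the host to check an IP blocklist. The blocklists and host names are hypothetical examples.

```python
# A minimal sketch of combined URL/host and IP blocking, using hypothetical
# blocklists; real filters rely on large, regularly updated vendor lists.
import socket
from urllib.parse import urlparse

BLOCKED_HOSTS = {"www.blocked-example.com"}   # hypothetical blocked host names
BLOCKED_IPS = {"203.0.113.7"}                 # hypothetical blocked IP addresses

def is_allowed(url: str) -> bool:
    host = urlparse(url).hostname or ""
    if host in BLOCKED_HOSTS:
        return False                          # blocked at URL/host level
    try:
        ip = socket.gethostbyname(host)       # resolve the host to its IP address
    except socket.gaierror:
        return False                          # unresolvable hosts are denied
    return ip not in BLOCKED_IPS              # blocked at IP level

print(is_allowed("http://www.blocked-example.com/page.html"))   # False
```

Note that blocking at IP level denies every domain sharing that address, which is the over-blocking risk discussed later in this section.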

Web site blocking, also referred to as site blocking, like URL blocking, requires the compilation of a list of Web sites and is used to determine whether one will be granted access to the Web site in question or not. The Web sites identified are linked to both the URLs and their IP address. These lists will differ, depending on what content universities are looking to restrict access to. Common categories for restriction are pornography, racism, violence and hate speech. This requires configuration of a set of parameters by the user (Schneider 1997:9).

URL and Web site blocking is often offered by third-party software vendors who actively search for new Web sites containing content of a questionable nature. These restricted Web sites are categorised according to the content published on them. The lists are updated periodically and people are able to download the updated lists when they are made available. This is a challenging task, as thousands of new Web sites are opened every day, making it a never-ending task.

Some filters are able to block at domain or host level, denying access to the entire Web site, while more advanced filters are able to block at directory or even file level, allowing users access to the Web site while restricting only parts of it. A more advanced Web site blocker therefore allows users more access to content and can be set up to restrict access only partially (Schneider 1998).

Humans involved in reviewing Web sites to determine the nature of the content are faced with an almost impossible task, as accuracy and consistency are crucial. People reviewing these Web sites need to go back constantly to check that the content of previously reviewed Web sites has not changed, since content on Web sites is updated regularly. To keep up with new Web sites and review previously checked Web sites is a daunting task for any list provider (Hochheiser 2001).

People need the help of automated tools to find new Web sites and identify Web sites that should be considered for categorisation. If these tools are not very accurate they will be unsuccessful in identifying content (Schneider 1998). This form of content filtering is time consuming and requires constant reviewing of the millions of Web sites on the ever-growing WWW.

It is very hard to identify content residing on Web sites, as well as categorise the content of Web sites accordingly. What must be done when there are Web sites that fall under a number of categories, such as violence and sex? If the administrator only chooses to block out content falling in the category of violence, will the user be able to access content that contains both sex and violence because it was categorised only under one of the categories? This depends on how the filtering is configured. It is very complicated to configure and set up.

One of the biggest drawbacks of URL and Web site blocking is that thousands of URLs and Web sites can share the same IP address. A study conducted at the Berkman Centre, Harvard University, showed that two thirds of Web sites tested shared an IP address with 50 or more allied domain names (McCullagh 2003). Current technologies make it easy for Internet Service Providers (ISPs) to block IP addresses but not domain names or URLs. The sharing of IP addresses between Web sites is common practice; Yahoo hosts 74 000 Web sites on a single IP address, Tucows.com uses one IP address for 68 000 Web sites and Namezero.com has 56 000 URLs on one IP address (McCullagh 2003).

URL blocking is more subjective than other forms of content filtering because it relies on humans reviewing the content of a Web site and making an accurate judgement of it. Many automated filtering services lack human intervention, which handicaps them owing to the simplicity of their logic. URL blocking owes much of its effectiveness to this human involvement and classification.

3.4.4 Protocol blocking

This form of content filtering requires the blocking of access to all resources of a particular protocol that governs the exchange of information between computers (Schneider 1997:6). Different applications run on the Internet, each using a unique protocol to perform different tasks and using the Internet as a communication platform. Examples of these unique protocols are HTTP (WWW), IRC and NNTP (Usenet). Many organisations use protocol blocking to reduce unnecessary bandwidth usage and to prevent employees or students from accessing newsgroups that contain unregulated content. Some of these newsgroups have been found to contain content of an illegal and obscene nature, such as child pornography (as discussed in Chapter 2).

Protocol blocking is very easy to set up and effective at blocking access to unwanted Internet applications. The problem with this approach is that it is too restrictive: only a very small percentage of the content carried by these Internet applications, such as newsgroups, is questionable, yet all of it becomes unreachable. Over-blocking is therefore common when applying protocol blocking. This type of blocking can be used to deny computers access to applications and, in doing so, releases Internet bandwidth for better utilisation elsewhere.
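A minimal sketch of the idea follows, assuming (as many implementations do) that application protocols can be identified by their well-known destination ports; the port list is an illustrative example only.

```python
# A minimal sketch of protocol blocking by well-known destination port.
# The blocked ports below are illustrative assumptions.
BLOCKED_PORTS = {
    119: "NNTP (Usenet newsgroups)",
    6667: "IRC (Internet Relay Chat)",
}

def allow_connection(dest_port: int) -> bool:
    """Deny any connection that uses a blocked application protocol."""
    return dest_port not in BLOCKED_PORTS

print(allow_connection(80))    # True  - WWW (HTTP) traffic passes
print(allow_connection(119))   # False - all newsgroup traffic is denied,
                               #         regardless of its actual content
```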

3.5 Content rating systems and services

In response to the concern of various Internet organisations and societies, content rating systems are being developed to categorise content effectively so that users can exercise control over content, giving them the power to deny access to unwanted content. These systems are still in their early days but are gradually gaining acceptance. At present there are few internationally recognised standards.

Hochheiser (2001) defines a rating system as a series of categories and gradations that are used to rate Web sites. There are two approaches: self-rating and third-party rating. Self-rating systems allow publishers to rate their own content using a predetermined system, e.g. the Platform for Internet Content Selection (PICS). Third-party systems use a particular rating system, often based on PICS, but this can differ between vendors.

Rating systems use a series of categories to classify Internet content. Depending on the source of the rating, there may be a number of sub-levels for each of the categories. Rating systems contain information that describes the Web site, embedding this information in the Web site (Gartner Consulting 2001). This is a form of metadata, data that describes data, giving a browser more information on the content one may be trying to access. This metadata is embedded into the Hyper Text Mark-up Language (HTML) coding of the Web site, which is interpreted by the user’s browser (Schneider 1997:4).

For Web rating to work, Web sites need ratings integrated into their HTML coding, and a Web browser is needed that can interpret the rating coding. This feature is available on the most popular browsers, including Microsoft Internet Explorer, which has the capability to utilise any rating system and comes with the Recreational Software Advisory Council’s Internet rating system (RSACi) preconfigured. This makes content rating accessible to many, as Microsoft Internet Explorer is a very popular browser (Schneider 1998).

There is a difference between rating systems and rating services. Rating services provide content labels for information on the Internet, and a rating service uses rating systems to describe content (Evans et al 1996:10). Rating systems provide structured methods for organisations to rate their Web sites, whereas rating services are organisations or companies that actually assign these ratings (Schneider 1997:64).

Rating systems offer a simplistic approach to rating content and managing access to it on the Internet. These can be very basic and use age ratings to determine the nature of the content. Some rating systems are complex and rate content in multiple categories comprising different security levels. The Platform for Internet Content Selection (PICS) is

seen as a universal language that is interpreted by the user’s browsers, giving them permission to view the content in question (Hunter 1999).

Filtering companies have their own rating services; however, these are non-standard and specific to each organisation. Rating systems that are used by rating services are not integral to specific filtering products. Some of the filtering vendors use RSACi’s four categories (nudity, sex, violence and language) to rate content (Schneider 1997:64). There is no international standard when it comes to the rating of content, but there are benchmarks, like that used by RSACi, which create an effective way of classifying content. This method is used by a number of content filtering vendors.

A very popular method at the moment is the self-regulation of Web sites. Owing to the size of the Internet it is almost impossible to police, and self-regulation becomes a more realistic alternative. Self-regulation is a process in which various players agree on rules of behaviour and adhere to a code of conduct. Participation from other players is not necessarily absent; the industry can assume responsibility for more than one component of regulation. With regard to the Internet in particular, a sufficiently vigorous self-regulatory system is able to assert itself.

According to Hart et al (2002:40) an ideal state for self-regulation of the Internet is a filtering system that is based on self-classification by content providers and gives users the opportunity to activate lists of both desirable and undesirable content. Through this input, users are notified of Web sites they may be interested in as well as those to avoid.

In figure 3.2 Hart et al (2002:43) identify the different role players in online content regulation. No one force can work alone to regulate content effectively; there needs to be collaboration among Internet industry representatives, law enforcement, hotlines, self-rating bodies and the media. Objectionable content is divided into two categories: legal but questionable content and illegal content. Each of these needs a different approach in order to prevent access to and distribution of such content.


Figure 3.2: The five dimensions of the regulation of the Internet (Hart et al 2002:43)

3.5.1 Recreational Software Advisory Council (RSAC)

The Recreational Software Advisory Council (RSAC) was established in 1994 by a coalition of over 25 organisations. The RSAC system was a content advisory system founded on self-disclosure using a rating package. Content producers were required to fill out questionnaires about the content with respect to four categories: violence, sex, nudity and language. The RSAC would collect these questionnaires and share the results with the relevant stakeholders. One drawback of the RSAC rating system was that it never gave recommendations on the age for which the content was suitable (Evans et al 1996).

Only later, in 1996, was the RSAC system adapted for Internet usage as RSACi. RSACi is a Web-based questionnaire that identifies the content of Web pages or directory trees based on content categories and provides a content rating system for Web sites to rate their own content. The four categories are violence, nudity, sex and language. Table 3.1 illustrates the levels of intensity (0 to 4) attached to each of the categories.

Table 3.1: RSACi content rating categories (Evans et al 1996)

Level 4. Violence: rape or wanton, gratuitous violence. Nudity: provocative frontal nudity. Sex: explicit sexual acts or sex crimes. Language: crude, vulgar language or extreme hate speech.
Level 3. Violence: aggressive violence or death to humans. Nudity: frontal nudity. Sex: non-explicit sexual acts. Language: strong language or hate speech.
Level 2. Violence: destruction of realistic objects. Nudity: partial nudity. Sex: clothed sexual touching. Language: moderate expletives or profanity.
Level 1. Violence: injury to human beings. Nudity: revealing attire. Sex: passionate kissing. Language: mild expletives.
Level 0. Violence: none of the above or sports related. Nudity: none of the above. Sex: none of the above, or innocent kissing and romance. Language: none of the above.

The RSACi was established in response to a need for a standard way of reviewing content being published in computer software and on the Internet. For self-regulating content rating systems to work, standardised categorisation is needed. RSACi provides a foundation for the rating of content, and this classification method is used in a wide range of content filtering software packages as a guideline for rating content.
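As a rough illustration of how such category levels could drive a filtering decision, the Python sketch below compares a page's RSACi-style ratings against maximum levels configured by an administrator. The ratings and thresholds are hypothetical examples, not the settings of any actual product.

```python
# A rough sketch of filtering on RSACi-style category levels (0-4).
# The configured maximums and page ratings below are hypothetical.
MAX_ALLOWED = {"violence": 1, "nudity": 0, "sex": 0, "language": 2}

def is_acceptable(page_rating: dict) -> bool:
    """Accept a page only if every category is at or below the configured maximum."""
    return all(page_rating.get(cat, 0) <= limit for cat, limit in MAX_ALLOWED.items())

print(is_acceptable({"violence": 0, "nudity": 0, "sex": 0, "language": 1}))  # True
print(is_acceptable({"violence": 0, "nudity": 3, "sex": 4, "language": 2}))  # False
```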

3.5.2 Internet Content Rating Association (ICRA)

The Internet Content Rating Association (ICRA), a non-profit organisation, was founded in 1999 by leading international parties. The ICRA is funded by contributors from the European Union and is considered the European equivalent of the RSACi. ICRA plays a vital role in administering accepted self-regulation systems for Internet content (Hart et al 2002).

The ICRA software has been available as a shareware download since 2002. This system relies on voluntary input from providers as well as users. Once the software is installed on the user’s computer it can be configured to block unwanted content defined by the user. ICRA has a valuable feature, as it creates positive lists (content a user may be interested in) of Web sites as well as negative lists (content a user may be interested in filtering).

3.5.3 Platform for Internet Content Selection (PICS)

The Platform for Internet Content Selection (PICS) was developed by the World Wide Web Consortium (W3C), the guiding force behind the WWW. PICS was developed in 1995 and released in 1996 (Hunter 1999). The founder and father of PICS is Paul Resnick from the University of Michigan. PICS was originally started so that people could distribute descriptions of their digital work electronically in a simple, computer-readable format (Hochheiser 2001).

Computers can process this information and shield users from undesirable content or direct users to content aligned with their interests. This metadata is very useful and can be used for more than just preventing people from accessing content that may contain pornography. It can be used in a positive capacity to attract users to content that may suit their interests (Hochheiser 2001).

The W3C started PICS as a practical alternative to global government censorship of the Internet. PICS is a set of protocols that work together, allowing users to develop rating systems, distribute labels, write filtering rules and digitally sign labels for verification and security (Hunter 1999). This allows users to block content selectively. PICS is not a rating system but rather a way of carrying encoded ratings that are identified by a user’s browser. These metadata labels can be added to Web sites or even to individual files residing on Web pages (Evans et al 1996).
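The Python sketch below shows, under stated assumptions, what reading such a label might look like: an RSACi-style PICS label embedded in a page's HTML is extracted and turned into category scores. The label values and the regular expression are illustrative; real PICS labels carry additional fields.

```python
# A minimal sketch of extracting an RSACi-style PICS label from a Web page.
# The HTML snippet and rating values are illustrative examples.
import re

html = """<meta http-equiv="PICS-Label" content='(PICS-1.1
 "http://www.rsac.org/ratingsv01.html" l r (n 0 s 0 v 0 l 2))'>"""

match = re.search(r"r \((.*?)\)", html, re.DOTALL)   # grab the rating block, e.g. "n 0 s 0 v 0 l 2"
if match:
    tokens = match.group(1).split()
    ratings = dict(zip(tokens[0::2], map(int, tokens[1::2])))
    print(ratings)   # {'n': 0, 's': 0, 'v': 0, 'l': 2}
    # A browser or filter would now compare these values with its configured limits.
else:
    print("No PICS label found; an unrated page may be blocked outright.")
```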

PICS is able to ensure the integrity and the authenticity of the content through the use of digital signatures. This ensures that the Web page has not been altered or manipulated, ensuring the document and labelling are genuine. This makes it safe to use and prevents the risk of being exposed to altered content on a Web site (Evans et al 1996).

It is unclear how many Web sites rate themselves with PICS, but research has shown that fewer than nine in a thousand Web sites are believed to be PICS rated. That equates to an estimated average of less than 0,87% (Westfall 2005). One reason for this is that PICS coding in Web sites is not mandated by law, allowing Web site owners and developers to decide whether or not to incorporate PICS into their programming. Even if Web site owners do incorporate PICS coding into their Web sites, there is no guarantee that the

classification of the Web site will be correct. Misclassification may become a problem if Web sites are forced to adhere to the PICS classification (Gartner Consulting 2001).

PICS is difficult to configure, making it unattractive to new users, and there is no economic incentive to include PICS in Web sites. Voluntary participation does not enforce the use of PICS. The only way to ensure successful content filtering with PICS on a user’s browser is to disallow any connection to Web sites that are not PICS rated, and in doing so one seriously restricts user access (Hunter 1999).

PICS does offer some attractive attributes to a user, as almost every browser type is supported, making compatibility easy. No additional software is needed, ensuring affordability to organisations with budget constraints (Gartner Consulting 2001).

3.6 Drawbacks of content filters

Content filtering is an effective mechanism for managing access to pornography and offers the potential to create a safer surfing environment. However, there are a few setbacks. Any form of control has two sides: while creating a safer environment for unsuspecting Internet users, it also restricts the content that may be accessed. The two most common problems encountered with content filters are under-blocking and over-blocking. Under-blocking occurs when content filters are set to be less restrictive and allow access to more unwanted content than desired. Over-blocking occurs where content filters are set very strictly to allow the least possible access to undesired content, but access to legitimate content is then lost as well.
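These two failure modes can be quantified in a simple way. The sketch below computes under-blocking and over-blocking rates from hypothetical test results; the data and variable names are illustrative only.

```python
# A simple sketch for quantifying under-blocking and over-blocking.
# Each test result is (should_be_blocked, was_blocked); the data is hypothetical.
results = [
    (True, True), (True, False),    # unwanted pages: one caught, one missed
    (False, False), (False, True),  # acceptable pages: one allowed, one wrongly blocked
]

unwanted = [r for r in results if r[0]]
acceptable = [r for r in results if not r[0]]

under_blocking = sum(1 for _, blocked in unwanted if not blocked) / len(unwanted)
over_blocking = sum(1 for _, blocked in acceptable if blocked) / len(acceptable)

print(f"Under-blocking rate: {under_blocking:.0%}")   # unwanted content that slipped through
print(f"Over-blocking rate:  {over_blocking:.0%}")    # legitimate content wrongly denied
```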

Filters are notorious for sweeping too broadly and have a tendency to block sites with a small percentage of questionable content, thereby restricting access to a large amount of insightful and useful academic content. At the same time content filters can be costly to run and maintain, requiring constant updating to ensure optimal efficiency. Under-blocking can create a false sense of security, leading people to believe that the system in place can block all unwanted content. The Kaiser Family Foundation found in 2004 that filters fail to block one in ten Web sites under intentional access and one in three under simulated accidental access. In addition, users with a good knowledge of computer operating software are able to bypass content filtering software (Kranich 2004:14).

Over-blocking is common, which can lead to filtering out significant amounts of perfectly legal and useful information over a variety of fields. Some examples of Web sites that have been blocked by various filtering applications tested include Web pages for Quakers, safe sex Web sites, AIDS prevention Web sites and home cooking Web sites. One of the biggest challenges is the blocking of IP addresses, as more than one Web site may reside on the same IP address (Kranich 2004:14). Over-blocking also occurs when organisations block entire newsgroups although only a small percentage of newsgroup content may contain pornography (Hunter 1999).

When trying to list or categorise content, one of the biggest problems is that of remaining objective while evaluating it. Surfwatch, a software filtering solution, for example, includes erotic stories and textual descriptions of sex acts as sexually explicit material. Making matters worse, it is impossible to be objective when the very content being reviewed remains problematic to define and when Web sites with a variety of content are classified under a single category (Hunter 1999).

Control lists used by software vendors work either by keyword blocking or Web site blocking. Problems with language are also encountered as English is the most searched language on the Web. Not all of the content communicated via the Internet is English, making it almost impossible for keyword filters to identify questionable content in all the different languages (Kranich 2004:15).

Figure 3.3 illustrates a problem encountered when using a basic keyword filter. In this example the word “Sussex” is blocked because it ends in “sex”; a keyword filter identifies this and prevents users from accessing the Web site. In a content-based system, by contrast, the classification of the term “Sussex” is taken into consideration, the term is classified as harmless and access to the Web site is granted.

[Figure 3.3 contrasts two systems for the term “Sussex”. In a filtering system based on keywords, the core element is filtering software with an integrated list of words to be blocked; the software recognises “sex” within “Sussex” and the word/URL is blocked. In a filtering system based on content, the core elements are self-classification of content, filtering software and lists of the URLs to be blocked; the content is classified as harmless, the URL is reconciled with the lists, it does not appear on the negative list and it is not blocked.]

Figure 3.3: Problems with keyword filters (Hart et al 2002:44)

Filtering solution providers are faced with the daunting task of reviewing vast amounts of content via the Internet. This is impossible to do manually and artificial intelligence is necessary in the form of “spiders” or “crawlers”, which are used to search the Web and create content reports. These reports are then interpreted and reviewed to decide what content needs to be blocked (Hunter 1999).

When categorising Web sites one needs to be careful, as Web sites can consist of hundreds or even thousands of Web pages. Not all of these Web pages may contain content of a pornographic nature. An ideal content filter should block access only to those individual Web pages that contain undesirable content and not the entire domain (over-blocking), which is often the case.

To complicate the issue further, research has shown that the average lifespan of a Web page is about 75 days. With the number of Web pages today, this is an average of about 250 000 Web page changes per hour. This is constantly increasing, creating a bigger problem for those trying to review the changes in content. The workload is drastically reduced by making blanket judgements with very little information on the Web pages being reviewed (Hunter 1999).

Content filtering companies treat their category lists of Web sites with great secrecy and do not share their lists of blocked sites, as strong competition exists between different filtering solutions. These lists of URLs are considered trade secrets and are encrypted to ensure that competitors cannot gain access to them. This creates unnecessary competition between the various contending content filtering companies, reducing the chance of collaboration towards more comprehensive filtering solutions.

3.6.1 Techniques employed by users to bypass content filters

Content filters can be avoided by both the user and the site owner. Users of Internet facilities can get around content filtering applications by using Peer-to-Peer networks to access pornography. It is not possible at present to filter content being shared and distributed on Peer-to-Peer networks. It is only possible to block all Peer-to-Peer networking through a protocol block. Some users can even go to lengths such as using encryption to disguise URLs, as software is available via free download to perform such a task.

Proxy servers can be used to mask the identity of the user and to make use of ports not monitored by filtering mechanisms, allowing users to bypass the system. There are Web sites, such as Multiproxy.org, that give advice on exploiting filtering software with the help of proxy servers. Proxy servers route HTTP traffic, keeping requests anonymous. If a user is restricted by content filtering at a certain location or university, proxy servers can be used to make it seem as if the content is being accessed from another location (Gartner Consulting 2001).

3.6.2 Techniques employed by Web site owners to bypass content filters

Content filtering can quite easily be bypassed by constantly changing a Web site’s URL. This is effective because most filtering solutions use URL lists to block access to content. By constantly moving the published content, Web site owners prevent content filtering companies from keeping up with the changes, and access may be granted on the user’s side. Data encryption is another basic method, using the Point-to-Point Tunnelling Protocol (PPTP). This protocol allows for the transmission of sensitive data across public networks without fear of filtering (Gartner Consulting 2001).

Web site owners can use streaming media to send live content via the Internet. This cannot be blocked by content filters. Messenger, a popular application, can be used to send and receive this streaming media via the Internet without detection from any content filtering. The software is readily available and easy to set up and use (Gartner Consulting 2001).

3.7 The future of content filtering

In view of all the deficiencies of content filters mentioned, there is room for improvement; as time moves on, technology improves, making content filtering more effective. Content filtering is still in the early stages of development, and many small problems need to be resolved and refined to create a truly effective filtering solution. Many different approaches have been attempted to create a more accurate filtration method. Future developments may provide answers to some of the problems experienced at present.

3.7.1 Separate domain for pornography

A simplistic approach to solving the problem of managing access to pornography would be to create a separate domain for pornography. Once a separate domain has been created, Web sites containing pornography can be registered to reside within the new domain. It is very easy to manage access to a particular domain, and only simple software is required to prevent access to it. The accuracy would be remarkably high and the approach affordable, making this a good option for educational organisations to adopt.

A Toronto entrepreneur has had a plan approved to create a “red light district” on the Internet by selling .XXX domain names to Web sites that are funded by and based on pornography (Toronto Star 06/09/2005). The idea behind such a scheme is to aid the management and regulation of pornography in an attempt to combat illegal pornography. Creating a separate domain makes it easy to manage access to pornography and requires very basic software to manage access to this separate domain.
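If such a domain existed and were consistently used, the blocking logic would indeed be trivial, as the hedged Python sketch below suggests; the URLs shown are hypothetical.

```python
# A minimal sketch of blocking a dedicated pornography domain such as .xxx,
# assuming that the sites in question actually register under that domain.
from urllib.parse import urlparse

def is_allowed(url: str) -> bool:
    host = urlparse(url).hostname or ""
    return not host.endswith(".xxx")   # deny anything under the .xxx domain

print(is_allowed("http://www.example.xxx/"))     # False - blocked by domain alone
print(is_allowed("http://www.example.ac.za/"))   # True
```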

3.7.2 Image filtering

Recent developments in object detection for a variety of uses have led to the development of image filtering, a process that can be used to identify images. The technology is still new and is improving at a rapid pace owing to continued interest and research. In order for this to work with real-time Internet content, image classifiers need to be fast and accurate (Firschein et al 1998). Intelligent software agents are used to identify shapes and skin surface areas to classify images. This software can also be used to screen moving images in formats such as mpeg, mpg and wmv.

This process of screening content before it reaches a user’s browser can be performed at satisfying speed without crippling a browser’s page-load time. Various methods are being developed to create a fast and accurate way of filtering images via the Internet. Some methods incorporate filtering of image descriptions as well as filtering of the image itself to improve accuracy. Two commonly used approaches are histogram skin-based pixel classification and the Wavelet Image Pornography Eliminator (WIPE).

3.7.2.1 Histogram skin-based pixel classification

This is an effective approach to identifying images with questionable content. It looks closely at the colours of the image and concentrates on the identification and classification of skin. By determining how much skin is exposed and from what part of the body the skin comes, the histogram software determines whether the picture contains content of a pornographic nature. This is a simplistic approach that concentrates on one aspect and can be combined with text classification to improve the speed and effectiveness of the classification process.

In each image various factors are used to determine the probability of the image being an adult image, namely (Jones & Rehg 2002):
• the percentage of pixels detected as skin;
• the average probability of skin pixels;
• the area of skin pixels for the largest connected component;
• the number of connected components of skin; and
• the percentage of colours with no entries in the skin and non-skin histograms.

These factors are combined with the width and the height of the image to give a strong basis for adult image classification. The width and height features are based on the observation that adult images tend to be sized to a standing or reclining figure. The overall result can be computed very quickly.

Testing of the histogram skin-based pixel method was done in 2002 using the methods indicated and achieved an 85,8% correct detection rate with only 7,5% false positives (Jones & Rehg 2002). A text analysis was run in conjunction with the image detector, which improved the result to 93,9% correct labelling of adult images with an 8% false positive rate. Pages that contain little text but many offensive pictures proved more challenging to the text analysis, but the skin detector could identify these images.

There is a general set of features for skin-histogram classification, similar to those used by Jones and Rehg (2002):
• the ratio of skin to the image area;
• the ratio of the largest skin segment in comparison with the outline of the image; and
• the number of skin segments in the image.
A number of additional features (e.g. text analysis) can be used to improve the accuracy of the identification of pornographic images (Chan et al 1999).
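The Python sketch below illustrates the general idea of a skin-pixel ratio test. The RGB skin rule, the 40% threshold and the file name are crude illustrative assumptions, not the published histogram models of Jones and Rehg (2002) or Chan et al (1999).

```python
# A crude sketch of skin-based image classification using the Pillow library.
# The skin-tone rule and threshold are illustrative assumptions only.
from PIL import Image

def looks_like_skin(r: int, g: int, b: int) -> bool:
    """A very rough skin-tone heuristic on RGB values."""
    return r > 95 and g > 40 and b > 20 and r > g and r > b and (r - min(g, b)) > 15

def skin_ratio(path: str) -> float:
    """Return the proportion of pixels in the image classified as skin."""
    img = Image.open(path).convert("RGB")
    pixels = list(img.getdata())
    skin = sum(1 for r, g, b in pixels if looks_like_skin(r, g, b))
    return skin / len(pixels)

def classify(path: str, threshold: float = 0.40) -> str:
    """Flag an image for review when the proportion of skin pixels is high."""
    return "flag for review" if skin_ratio(path) > threshold else "allow"

# Example usage, assuming a local image file exists:
# print(classify("sample.jpg"))
```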

3.7.2.2 Wavelet Image Pornography Eliminator (WIPE)

WIPE is an image classifier that identifies images as either objectionable or benign. It is a multilayer approach developed using an icon filter, a texture filter and wavelet-based shape-matching algorithms to give an effective screening package for objectionable images. This is a practical screening solution, with queries processed in less than two seconds per image, depending on the size of the image. Over a testing period the system demonstrated 96% sensitivity and a false positive classification of 9% of images (Firschein et al 1998).

3.7.2.3 Setbacks for image filters

There is still considerable room for the improvement of image filtering techniques. This technology is still in the early phases of development but shows promising results. The use of image filtering has not been included in packages offered by some of the big content filtering vendors. Once refinements have been made to improve the speed and accuracy of the image filtering process, more users will pay attention to the enormous potential it offers.

Some images being filtered are subject to the following conditions, making effective identification difficult (Firschein et al 1998):
• Low pixel quality makes it almost impossible to filter effectively.
• Images can be taken from a number of different angles and positions, making it difficult to identify the human figure.
• If there is more than one person in the image and they are in contact with each other, object classifiers battle to identify “foreign” objects.
• There are many different skin colours and textures, making it very difficult for histograms to identify skin pixels.

Images may also contain isolated skin pixels of the same colour as the background, making it very hard to differentiate skin from background; histogram skin-based classification may be thrown off in such cases (Chan et al 1999).

3.8 Summary

Censorship has been around for thousands of years and is considered part of everyday life, where it is used to control ideas on politics and religion. The censorship of pornography is a relatively new concept and can be traced back to the end of the 18th century.

Content filtering is a widely used approach in an effort to prevent unwanted access to pornography via the Internet. Universities use content filtering as an active control mechanism to prevent online users from accessing pornography. Chapter 3 analyses the different types of content filtering that are available. The choice of a content filtering product will influence the accuracy of a university’s filtering capability.

Great care should be taken when deciding upon a content filtering product, to ensure that it meets the needs of an organisation. Universities have to be especially careful with their decision, as over-blocking should be avoided to ensure students and staff have access to information. Careful consideration is necessary when choosing and setting up a content filtering solution. Chapter 4 will discuss the importance of AUPs as a passive control measure against the accessing of online pornography.

Chapter 4 Acceptable Use Policies (AUPs)

4.1 Introduction

AUPs are often not seen as proactive tools in preventing access to pornography. An organisation should have a sound AUP to co-exist with the content filtering measures employed. This policy can be seen as a passive form of control, as it does not physically restrict a user from accessing pornography. Behind any successful content filtering solution should lie a thorough AUP.

Acceptable Use Policies should be seen as an integral part of managing access to pornography. In this chapter the following sub-problems will be addressed: • What is an AUP? • What should typically be included in an AUP?

AUPs need to be broken up into different sections, making them easier to understand and follow and allowing for continuity. Each section of an AUP deals with a different focus area. When looking at access to pornography, organisations need to define clearly what content is acceptable and what measures are taken to prevent access to online pornography. In addition, an AUP should include a section on what disciplinary action will be taken against those who are caught accessing or distributing pornography in the university environment.

4.2 Defining Acceptable Use Policies

There are many different policies pertaining to the acceptable use of organisational facilities. For the purpose of this research, reference will be made to the computer facilities provided by organisations, especially the physical hardware resources, as well as the use of the Internet and intranet accessed via an organisation’s infrastructure. This theory is applicable to any organisation that allows its personnel access to the Internet or other computer facilities.

According to Scott and Vass (1994:61) an Acceptable Use Policy is used to define who may use computer facilities and for what purpose. The AUP acts as an organisation’s official voice on the ethical use of computer and Internet facilities. The Net Dictionary (2004) identifies an AUP as a set of rules that govern the network and how it may be used. It acts as a set of rules applied to networks (two or more computers linked through the use of a communication protocol) to restrict use (Wikipedia 2005). Common practice includes requiring new members joining an organisation to sign an agreement on the AUP before access is granted to the computer facilities. An AUP needs to be well organised, concise and easy to read.

The AUP can be a formal or informal document, used to organise computer and information resources, defining unacceptable use and the consequences for non-compliance with the policy (Simbulan 2004:194). AUPs are created with three main goals:
1. Educating users about activities that may be harmful to the organisation.
2. Providing legal notice of unacceptable behaviour and the penalties for such behaviour.
3. Protecting an organisation from liabilities it may incur from misuse of the Internet and other computer facilities.

Organisations customise their AUP to fit their specific needs based on their unique requirements. AUPs can vary in length from one page to more than ten pages and can be divided into two categories: broad policies and detailed policies. Broad policies are easier to digest but leave grey areas that cause debate, while detailed policies ensure that there are no questionable clauses. In general, policies should find a balance between being too vague and too technical. It is very easy to fall into the trap of over-policing the use of computer facilities and the Internet (Kliener & Welebir 2005).

Universities, like most other organisations, have broader policies that deal with the behaviour and conduct of employees and students. AUPs need to fit into and fall under the broad policy as a sub-section and deal specifically with the computer facilities. The AUP needs to echo the sentiments established in the university’s broad policy in order to remain consistent with existing rules and regulations (Mckenzie 1995).

4.3 The importance of an AUP

Many organisations have not carefully considered the importance of an AUP. It is important that facilities provided are used with good intent. The use of computer facilities

is often overlooked, leading to inappropriate and outdated AUPs. The following is a discussion on the importance of an AUP as a legal policy.

4.3.1 AUPs as a legal policy

Granting employees and students use of the Internet can lead to the inadvertent discovery of inappropriate content, including sexually explicit and violent content. This activity can put legal pressure on a university, leading to possible criminal prosecution. Organisations as well as their members need to be protected by promoting responsible Internet usage (Surfcontrol 2005).

Legal issues concerning the use of the Internet have become a major focus for education. In Germany the former head of Compuserve was charged with failing to block access to pornography. Although Compuserve argued that it is almost impossible to filter all content and monitor thousands of files, a charge was still laid. This ruling makes it potentially dangerous for those in charge of managing Internet usage. The convicted person was sentenced to a two-year probation and fined a large sum, payable to charity (Held 1999). Incidents like this can be avoided if the correct procedures are in place.

It is vital for an organisation to be fully aware of the relevant laws and standards before laying the groundwork for an AUP. Essential research on issues dealing with laws on Internet usage (cyber laws) and netiquette needs to be carried out before final drafts of an AUP are compiled. In addition, all computer users must be notified of any unlawful Internet activities. Internet law is a dynamic field, as numerous new cases brought to trial create new precedents (Lichtenstein & Swatman 1997).

Sexual harassment charges may be laid against people who bring sexually explicit or objectionable material into a university’s online environment. Students or staff members of the university are at risk of exposure to pornography, possibly leading to a sexual harassment case being lodged against the university or parties involved in view of the unwelcome nature of the content. If a user is caught downloading illegal material (as discussed in Chapter 2) the university may be held liable, if the right precautionary measures have not been put into place.

Copyright infringements also need to be dealt with in an AUP, especially in the academic environment of a university. Users download software, documents and other media in all innocence, while infringing on copyright and other distribution laws. Universities should foster an approach that creates a safe haven for intellectual property.

4.4 Key issues AUPs need to address

According to Surfcontrol (2005) an Acceptable Use Policy should aim to clarify the organisation’s policy regarding the usage of the Internet and other computer facilities. An AUP is vital to an organisation for protection from potential liability, for avoiding security threats by promoting awareness and good practice, and for encouraging positive use of the Internet as well as other computer facilities.

According to Kelehear (2005:33) the following key points need to be addressed in the AUP of a university:
• a statement on the intended educational use and an outline of the advantages of the Internet in a university environment;
• a list of responsibilities of users, including staff and administration;
• a code of conduct administering the use of the Internet;
• a description of what constitutes acceptable and unacceptable use of the Internet; and
• a disclaimer absolving the university from possible responsibility for any misuse of the Internet.

An AUP should strive to be a well-rounded policy taking into consideration the rights of the users in order to be fair. In addition, an AUP needs to be concise when addressing Kelehear’s (2005) key points. These key points highlight the core of an effective AUP in a university environment.

4.5 Recommendations for developing and planning an AUP

There are many points that need to be taken into consideration when planning and developing an AUP. It is crucial that careful consideration be given to all influential factors pertaining to the university. A well-rounded policy is carefully planned and includes input from and consideration of all parties involved, including staff, students, legal representatives and external experts.

It is common practice when designing an AUP to start with a brief policy overview. This serves as an introduction for students, staff and other employees on how the Internet can be used productively. This overview should also explain why an AUP is necessary.

In most universities the decision to censor content on the Internet involves different groups within the university, with external legal advisors playing a very important role. Peace (2003) discovered that the opinion of the public is not considered important by university staff and policy developers. Universities often do not explore the viewpoint of the public before making and implementing policy decisions. Further findings from Peace’s survey revealed that the decision to censor pornography ranked third in importance behind criminal intent and racist content. Universities with policies in place appeared to be more aware of the practical issues such as technology, legal and monetary concerns that come into play with regard to decisions on censorship.

AUPs need to address security, and a disclaimer in the security section should outline the consequences for persons attempting to circumvent any of the security measures in place at the university. Security in the university environment is not yet as big a concern as it is in competitive industries, where information can be used to gain insight into a competitor's operations or even enable industrial espionage. Nevertheless, information systems need to be secured against unwarranted external penetration, which could lead to the leakage of information.

Many AUPs are confusing and written as if they were specifically targeted for lawyers and legal professionals. If an AUP is considered confusing or murky, it will be less effective and create a sense of ambiguity. People will not be able to digest an AUP if it is not simply constructed, logical and consistent. The correct use of grammar is vital to prevent any ambiguity (Kinnaman 1995).

The wording of an AUP needs to be considered carefully. An example of a popular clause, written without due consideration, is: "…anybody found trying to enter an objectionable Web site will have their Internet access lifted…" However, experienced Internet users know that one cannot be completely certain of the nature of a Web site until it is opened in the user's browser. Anybody who has aimlessly explored (surfed) the Internet knows that one simply does not know where one is going to land until it is too late. This problem is caused by misleading URLs and the linking between Web sites. Issues such as these need to be taken into consideration.

4.5.1 The seven P’s

Scott and Vass (1994) developed the 7 P’s model, which looks at the different points and issues that need to be addressed in the drawing up and implementing of an effective AUP. The seven P’s are participation, partitioning, philosophy, privacy, pernickety, phog phactor and publication.

1. Participation: An AUP should be compiled by a broad committee spanning all groups of users. These groups should include administration, students, facilities, clerical staff, Information Technology (IT) and computer personnel.

2. Partitioning: An AUP should be divided into several logical sections, each of which deals with a specific problem area. These partitioned areas should cover a generalised central policy and should be separately linked to sections dealing with common problems involving security, privacy, copyrights and the Internet.

3. Philosophy: An AUP needs to be in line with and emphasise the mission statement (broad policy) of the organisation. A common theme should be carried throughout the document and an AUP should identify how permissive the organisation is with regard to religious, business, political and civil activities.

4. Privacy: All users need to understand what degree of privacy is acceptable. An AUP should clearly state when it is acceptable to breach the privacy agreement in order to protect the organisation and other stakeholders from harmful use.

5. Pernickety: This consists of a list of do's and don'ts making up the core of the AUP. This section covers issues such as privacy, hacking, illegal content and punishment for violations. It is vital that this section should not be overly long, with detailed descriptions of various disciplinary actions. On the other hand, a short AUP lacks specific guidance and may encounter problems in enforcement.

6. Phog Phactor: Because of complicated legal jargon, AUPs tend to be hard to read and understand. Compilers of an AUP need to avoid unnecessary jargon to improve readability.

7. Publication: This deals with the means an organisation uses to communicate its AUP to the relevant employees and stakeholders. From a legal perspective an AUP needs to be disseminated in a way that makes it legally binding on users of the university's computer system. Some methods of publication include printed copies of the AUP made available at the various facilities, or visible copies of the AUP in computer laboratories. Another way of ensuring agreement is to make users agree to the terms of the AUP before they are allowed to log on and access online resources.

Scott and Vass's 7 P's model is useful in identifying the focal areas of an AUP. This basic model will help ensure that organisations are protected from common threats, but more detail is necessary to compile a thorough AUP. This model emphasises the need for an AUP to be in line with the mission statement and philosophy of the organisation.

4.5.2 Tips for effective AUPs

Hughes (2004) identifies steps that need to be taken to create an effective AUP:
• Conduct a current policy review.
• Distinguish between the different network access permissions. Different policy controls may apply to different individuals or user groups in the organisation, as not everyone needs access to the Internet and other resources residing on the network.
• Gain visibility of the network traffic by using Web traffic assessment tools to identify and monitor specific areas or groups that are engaging in inappropriate or unnecessary Internet usage.
• Consult with all parties. Doing so should ensure that there is no mismatch between established policies and the ability of the Internet infrastructure to support all parties involved.
• Conduct a policy test exercise with key members when the policy is at draft stage. This will ensure that the policy is practical in terms of achieving objectives and at the same time flexible enough to accommodate change resulting from possible emergency situations.
• Consider all possible loopholes and grey areas that may be exploited.
• Involve a legal team to review every element of the policy. This is an ongoing task, as the laws and the policies are constantly changing.
• Announce the policy or changes to it. Create a plan to communicate the policy and any changes effectively amongst staff and students.
• Maintain flexibility by ensuring that the network or software configuration can implement possible changes made to the AUP.

Hughes's tips for an effective AUP give a well-rounded view of the steps involved in developing an AUP. Hughes interestingly identifies the need for a policy test exercise involving key members. This will help to ensure acceptance from key members in the organisation.

4.5.3 AUPs collaborating with content filtering

AUPs cannot work in a vacuum. It is vital that AUPs and content filtering work together to prevent people from misusing the Internet. An AUP works as a passive control mechanism that exists in the background, while content filtering works as an active, enforced control mechanism that physically restricts users from accessing undesirable content as determined by the AUP (see Chapter 3). These control mechanisms need to work in cohesion with each other to ensure a balanced approach to managing access to pornography.

Content filters are by no means a comprehensive solution to managing access to pornography. These filters are programmed to block certain content but regularly fail (see findings in Chapter 5). Because of this ineffectiveness it is vital to have a thorough AUP to protect the organisation and its users from exposure to pornography.

Notification of Internet monitoring is necessary. The users of online resources should be informed of what activities will be monitored and how suspect activities will be reported. If Internet usage is recorded there should also be disclosure in the AUP, alerting users.

Filtering Internet content is part of Internet usage monitoring. An AUP should disclose the use of content filtering and should specify what content will be screened (Kliener & Welebir 2005).

People should not be given the impression that their activities on the Internet are private. At the same time an impression that every keystroke will be monitored should not be given. It is essential that clear intentions for the monitoring and filtering of content are stated in the AUP. The university should clearly communicate reasons for managing content usage.

AUPs are only as good as the systems and people who implement them. A well-written AUP gives one a better chance of making things work, including content filtering. An AUP can make it more difficult for objectionable content to be accessed or reviewed (Splitt 2001). One should always give notice in an AUP, informing users that any filtering software is imperfect and that there is no guarantee that objectionable content will not slip under the radar. In addition, an AUP should inform users that any person(s) trying to disable or circumvent any filtering will have their Internet privileges restricted and may be subject to disciplinary action.

Managing access to the Internet is not unlike managing other tangible resources such as phones and faxes. A four-stage approach can be taken to ensure the correct use of the Internet. This approach includes (Surfcontrol 2005):
1. compiling an AUP;
2. informing and educating users;
3. installing appropriate technology to filter content and monitor usage; and
4. maintaining the policy and updating filtering technology.

The third stage of this process addresses the need for using the appropriate technology as part of a complete approach to managing access to pornography. AUPs need to work in synergy with content filtering as an extensive approach to managing access to pornography. Content filtering on its own is an incomplete approach to preventing access to pornography.

4.6 Evaluation criteria for AUPs

Different approaches can be used to evaluate AUPs. Evaluation of an AUP is crucial for the identification of strengths and weaknesses in current AUPs as well as for designing a new AUP. Strong emphasis is placed on whether or not the AUP is technically and legally correct and contains no loopholes. Different AUPs have unique focus areas. Listed below are two sets of evaluation criteria: the Flowers and Rakes approach and a five key areas analysis. The five key areas analysis is aimed more specifically at the university environment.

4.6.1 Flowers and Rakes’s four area approach

Flowers and Rakes (2000) developed an approach to analyse AUPs that concentrates on four areas: liability issues and concerns, online behaviour, system integrity issues and concerns and lastly the quality of content on the Internet.

The issue of liability is broken up further into three categories:
1. Service liability looks at services such as email, information and news services, public domain and shareware software, discussion groups and any connection to library services. Disclaimers for these services indicate that access to them may not be uninterrupted or error free.
2. Damage and cost incurred involve the actual cost of damage that a user might suffer while using the Internet. A disclaimer would emphasise that the organisation will not be held responsible or liable for any direct, indirect, incidental or consequential damage sustained in connection with or during use of the Internet.
3. Quality and accuracy of content on the Internet is another area an AUP should address in the process of shielding the organisation from any responsibility for content published on the Internet.

The second area deals with online behaviour and netiquette. This outlines issues and concerns addressing the behaviour of system users. Typical content in this section would state the appropriate manner in which to conduct online activities such as emailing and surfing the Internet. Inappropriate behaviour is defined as actions such as violation of copyright laws, use of the system for community, political or religious purposes, violation of privacy, use of the computer facilities for non-academic reasons and activities involving pornographic, profane, offensive, illegal or obscene content. In a survey carried out by Flowers and Rakes (2000:357) a few of the AUPs reviewed included a section on netiquette (behaviour guidelines for users). These guidelines express the need for users to adhere to generally accepted rules for polite and responsible behaviour on the Internet.

The third area identifies the concern for addressing the integrity and security of the system. Some issues outlined in AUPs reviewed by Flowers and Rakes (2000) included notification to system administrators of any security problems, avoiding the demonstration of suspect activities to other users and refraining from using others' user accounts. These are common practices that need to be emphasised to create security-conscious users. Related to this area is the issue of privacy for users. Each policy should state that the organisation reserves the right to examine and monitor individuals' access to the Internet for the purpose of maintaining the integrity of the system.

The fourth area deals with the content found on the Internet. Some policies reviewed stated that the transmission of material deemed illegal under current law was prohibited. This issue was confused further, as the term "illegally" was not substantiated or explained in these policies. Various terms were used to describe and define inappropriate content in material generated by users.

4.6.2 Five key areas to AUP analysis

AUPs are constructed with several key objectives in mind. The biggest concern and driving force behind an AUP are the legal issues that may present themselves. Universities need to have a policy in place that protects them from any unfair treatment, as well as protecting the rights of staff and students. Universities have to balance the individual rights of students and staff carefully while monitoring the usage of computer facilities to ensure an academically focused online environment. AUPs can be classified under five key areas: legal, security, netiquette, privacy and university property. A discussion of each of the key areas follows.

4.6.2.1 Legal issues

The most important point of analysis of any AUP is legal concern. One has to be very careful when developing an AUP to consider all the legal acts that could apply to the computer facilities supplied by the university. It is important that any policies implemented are checked by people who specialise in law in a digital environment. References should be made to specific laws and acts governing the acceptable use of the university's computer facilities.

Universities need to adhere to copyright laws and therefore students should be informed of the seriousness of copying copyrighted material available from the WWW, electronic journals and digital library resources. Intellectual property is an important output of any university and special attention needs to be paid to the protection of intellectual property of students and staff. Areas concerning software licensing and unauthorised downloading of software need to be addressed as well.

The legal protection of the university is the biggest single influence on the construction of a university's AUP. In this regard a university would state that it does not tolerate people viewing or distributing any illegal content, such as obscene pornography (see Chapter 2). One should remember that this section deals with content that is illegal; not all forms of pornography are illegal. Legal pornographic content that is not tolerated should be outlined in the section dealing with netiquette.

4.6.2.2 Netiquette

It is very important for an AUP to have a section describing the behaviour expected from students and staff while using the computer facilities provided, whether on the intranet or the Internet. The netiquette section usually lists the do's and don'ts, may differ from university to university and needs to reflect the views outlined in the broad university policy or code of conduct. This section may address the need to respect the rights of other users on the network. Policy on changing settings and software on the computer facilities provided, as well as the non-acceptability of using computer facilities for moonlighting activities, should be mentioned too.

4.6.2.3 Security

It is vital that AUPs state their policy stance towards security. The security of information is vital for any organisation, and information needs to be safeguarded from any negative use or poor publicity. General security practices need to be emphasised, including the use of user accounts, password protection and policy concerning hacking and viruses.

Security is crucial for ensuring the protection of information from internal and external threats. At the same time privacy needs to be maintained by adhering to strict security regulations. Privacy and security are two conflicting aspects that need to be outlined clearly to avoid invasion of privacy.

4.6.2.4 Privacy

Universities need to respect the rights of their users. There is a fine line between monitoring network usage and the invasion of privacy. AUPs need to outline monitoring procedures employed as well as explain the reasons for monitoring network usage. Users’ privacy rights are often undermined and little is done to enforce these rights.

4.6.2.5 University property

AUPs need to address the conduct expected with regard to the university's property. Facilities such as computers, printers, other hardware and software need to be accounted for. These facilities remain the property of the university, and any theft or defacement of them would justify punishment, expulsion or even legal prosecution. The compilers of many AUPs forget to include such a clause in their policy. The university needs to emphasise that university property is reserved for academic research and that unlawful activities will not be tolerated.

4.7 AUP maintenance

Once an appropriate AUP has been implemented, regular updates are essential. One needs to take into account changes in staff, business practices, management expectations and developments in IT (Stott 2001).

An AUP needs to keep up with the needs of the organisation as well as with external and internal risks. Surfcontrol (2005) outlines various questions that require constant review:
• Are the students and staff receiving adequate training and sufficient information to understand and interpret the policy?
• Is the university receiving feedback from its users on the effectiveness of the policy?
• Are policy restrictions preventing any users from accessing legitimate material?
• Have there been any incidents that require a policy change to prevent them from happening again?

AUPs, like the technologies involved in managing access to unwanted content, need to be maintained and updated to ensure effectiveness. This activity is often overlooked and little effort is made to ensure the AUP is up-to-date with legislative and technological changes.

4.8 Summary

AUPs play a vital role in managing access to pornography. These policies, together with content filtering, create a comprehensive approach to preventing access to unwanted content on the Internet. Organisations are often misled into believing that an effective content filter is all that is needed, and passive control mechanisms such as AUPs are undermined. These policies need to be developed carefully to conform to laws and ethical practices, ensuring all stakeholders are accounted for. The AUP needs to be visible to ensure that it is adhered to. There should be no ambiguity as to what is stipulated in the AUP.

Chapter 5 focuses on the empirical research. The methodology and findings of the content filtering testing and the questionnaires used at UJ will be discussed. This research tests the effectiveness of the current content filtering solutions employed at UJ. Chapter 5 also identifies students' views on content filtering at UJ and highlights the need for a comprehensive AUP for the University.

Chapter 5 Empirical research

5.1 Introduction

The University of Johannesburg is a fairly new educational institution that has been in existence since 2005. Owing to the merger of previously established institutions there is a need for a reliable and standardised IT infrastructure and service delivery. At the time of this research, content filtering was carried out by two different systems, resulting in inconsistencies in the quest to block unwanted content on the Internet at UJ. One of the most difficult categories of unwanted content to manage at UJ is Internet pornography.

Technological development always has a dark side, as is the case with the prominence of Internet pornography. Through the use of IT initiatives such as content filtering, one can improve content delivery to all the stakeholders at UJ. The quest consists of two approaches: an active approach (content filtering) and a passive approach (an AUP). In the previous chapters the theoretical framework was established, providing a concrete foundation from which to undertake this empirical research. In this chapter the rationale of the study, the research methodology and the findings of the empirical research will be discussed.

5.2 Rationale of the study

Pornography today is a bigger and more profitable industry than ever. Music videos, sitcoms, reality TV and movies often focus on sex. The media, especially with the explosion of the Internet, home in on man's primordial instinct for survival, namely reproduction. Unfortunately our society is influenced by the increase in pornography consumption and distribution, as well as other forms of the sex trade. This study aims to establish new guidelines that will curb the influx of Internet pornography at UJ.

There has been increasing interest from journalism students at UJ in pornography. Several articles appeared in the Auckland Park Campus Beeld dealing with students accessing pornography on campus. Some of these articles made unsubstantiated claims without providing evidence. The research conducted for this study will bring some clarity as to the severity of the problem and identify some trends within UJ.

During the research a number of stakeholders were consulted to draw on their expertise regarding pornography as an issue in the online environment at UJ. From the information that was collected various issues were identified, leading to a greater understanding of the problem.

5.3 Research methodology

The fundamental aim of the study was to determine the means for managing access to Internet pornography at the University of Johannesburg, in a bid to manage and curb unwanted access to pornography via the Internet more effectively. The next section is devoted to a closer look at the research problem, the research approach, the sampling of the target group, and the data collection, processing and analysis procedures.

5.3.1 Research problem

The following research problem was formulated: “To what extent can access to online pornography be managed at the University of Johannesburg?”

In order to address this problem successfully the following sub-problems were formulated:
• What is pornography?
• What is the difference between legal and illegal pornography?
• What is online content filtering?
• What different content filtering solutions are available?
• What is an AUP?
• What should typically be included in an AUP?

5.3.2 Research approach

Research can be defined as the "systematic investigation into and study of material and sources in order to establish facts and reach new conclusions" (Concise Oxford Dictionary 2004). Mouton (1996:35) defines research as the process involving the application of a variety of standardised methods and techniques in the pursuit of knowledge.

According to Taylor (2000) the basics of research are to solve problems and expand knowledge of our universe, which necessitates that it is carefully and systematically conducted. Taylor identifies seven characteristics of research:
1. Research begins with a question in the mind of the researcher.
2. Research demands the identification of a problem stated in clear and unambiguous terms.
3. Research requires a plan addressing the various components involved.
4. Research deals with the main problem through appropriately associated sub-problems.
5. Research seeks direction through appropriate hypotheses and is based upon assumptions and beliefs.
6. Research deals with facts and their meanings.
7. Research is cyclical; all major parts of the research are fused into a model.

Research can be classified into two categories: applied research and pure research. Joppe (2001) makes a distinction between the two by identifying pure research as an attempt to expand the limits of knowledge, whereas applied research attempts to find a solution to a specific problem.

For the purpose of this research project, a focus on applied research is required. Stokes (1997) defines applied research as "research that is directed towards some individual or group or societal need. Its aim is to convert the possible or the actual". Applied research is completed to solve practical questions; its aim is not to gain knowledge for its own sake. It may be explanatory but usually remains descriptive (Wikipedia 2006).

Stokes (1997) developed a quadrant model able to identify and classify different types of research. The model uses two dimensions to classify research:
• as having been inspired by consideration of use; and
• as a quest for fundamental understanding.


Figure 5.1: Research classification quadrants (Stokes 1997:74)

Bohr's quadrant (upper left) represents pure basic research guided by the quest for understanding. This quadrant is named after Niels Bohr, whose quest for atomic structure was a voyage of discovery. The lower right quadrant, labelled Edison's quadrant, includes research that is guided by applied goals without seeking a more general understanding of the subject. The upper right quadrant, Pasteur's quadrant, includes basic research undertaken to extend the frontiers of understanding but also inspired by considerations of use. The lower left quadrant, the Sterile quadrant, includes research that is inspired by neither understanding nor use; it covers research that explores a phenomenon without having any explanatory objectives or applied use (Stokes 1997:74).

The quadrant that is applicable to this research project is Pasteur's quadrant, as there is both a quest for understanding and a consideration of use. The purpose of the research is to gain insight into and understanding of the mechanisms that can be used to manage pornographic content on the Internet, with special focus on UJ. The outcome of the research will provide recommendations for UJ to improve on current content filtering methods, as well as recommendations to policy makers to address online pornography more effectively at policy level.

The research process used for this research project was adapted from Emory and Cooper (1995), who suggest an approach to research commonly referred to as the question hierarchy (see figure 5.2). This approach consists of three elements: the problem statement, the research question and the investigative questions. The purpose of the question hierarchy is to bring the research problem into focus through increasingly specific questions.

[Figure 5.2 depicts the question hierarchy as three linked elements: the problem statement (the problem which has prompted the research), the research question (the single objective that states the objective of the research study) and the investigative questions (those questions which must be answered satisfactorily to support the research question).]

Figure 5.2: The question hierarchy (Emory & Cooper 1995)

Research data collected can be either qualitative or quantitative, or a hybrid of both. Quantitative research consists of collecting figures to identify relationships and trends in the subjects identified, while qualitative research consists of the collection of words and experiences from subjects to identify relationships and trends. For this research project quantitative data were collected to address the research question. Neuman (2003) outlines the differences between the quantitative and qualitative styles of research, summarised in table 5.1.

Table 5.1: Comparison between quantitative and qualitative research (Steinback & Steinback 1988:8)

Dimension | Quantitative | Qualitative
Purpose | Prediction and control. | Understanding: seeks why.
Reality | Stable: reality is made up of facts that do not change. | Dynamic: reality changes with people's perceptions.
Viewpoint | Outsider: reality is what quantifiable data indicate. | Insider: reality is what people perceive it to be.
Values | Value free: values can be controlled with appropriate methodological procedures. | Value bound: values are important and need to be understood during the research process.
Focus | Particularistic: selected, predefined variables are studied. | Holistic: a total or complete picture is sought.
Orientation | Verification: predetermined hypotheses are investigated. | Discovery: theories and hypotheses are evolved from the data as they are collected.
Data | Objective: data are independent of people's perceptions. | Subjective: data are perceptions of the subjects in the environment (context).
Instrumentation | Non-human: constructed instruments such as surveys, questionnaires, rating scales, tests, etc. | Human: the human person is the primary data collection instrument, observing and reporting on behaviour and expressed feelings.
Conditions | Controlled: investigations are conducted under controlled conditions. | Naturalistic: investigations are conducted under natural conditions.
Results | Reliable: the focus is on design and procedures to gain replicable data. | Valid: the focus is on design and procedures to gain rich, real and deep data.

Because of the nature of the research conducted the findings and analysis have been broken up into two parts: questionnaires and content filtering. Table 5.2 below is a breakdown of the empirical research conducted.

Table 5.2: Breakdown of empirical research completed

Questionnaire survey
1. Sampling and target group for questionnaires (section 5.4)
2. Data collection and processing procedures for questionnaires (section 5.5.2)
3. Statistical analysis of questionnaires (section 5.7)
4. Interpretation of results from questionnaires (section 5.8)

Content filtering measurement
1. Data collection and processing procedures for content filtering (section 5.5.1)
2. Interpretation of results for content filtering (section 5.6)
3. Summary of content filtering testing (section 5.6.6)

5.4 Sampling and target group for questionnaire

To ensure equal representation from the five UJ campuses (Bunting Rd, Doornfontein, East Rand, Kingsway and Soweto) a target group of 1 000 students was identified. The target group was broken up into the different campuses using the number of registered students at each campus as an indicator to obtain proportional representation from all of the campuses. Each campus’s target was worked out as a percentage of the total number of registered students. Table 5.3 represents the breakdown of registered students per campus.

Table 5.3: Student distribution among the different campuses

UJ Campus | Number of registered students | % of the total | Minimum sample size
Bunting Rd | 7 943 | 15.86 | 160
Doornfontein | 9 549 | 19.06 | 190
East Rand | 455 | 0.9 | 10
Kingsway | 30 703 | 61.3 | 610
Soweto | 1 443 | 2.88 | 30
Total | 50 093 | 100 | 1 000
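The proportional allocation behind Table 5.3 can be reproduced in a few lines of code. The sketch below is illustrative only: the registration figures come from the table itself, while rounding each allocation to the nearest ten is an assumption made here to match the minimum sample sizes shown.

```python
# Illustrative sketch of the proportional allocation in Table 5.3.
# Registration figures are taken from the table; rounding each allocation
# to the nearest ten is an assumption made to reproduce the published sizes.
registered = {
    "Bunting Rd": 7943,
    "Doornfontein": 9549,
    "East Rand": 455,
    "Kingsway": 30703,
    "Soweto": 1443,
}

target = 1000                       # overall target group of 1 000 students
total = sum(registered.values())    # 50 093 registered students in total

for campus, students in registered.items():
    share = students / total * 100                        # % of all registered students
    sample = round(students / total * target / 10) * 10   # allocation rounded to nearest 10
    print(f"{campus}: {share:.2f}% of students -> minimum sample of {sample}")
```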

On the campuses with large numbers of students the response rate was unpredictable and the large venues made it difficult to manage the completion of the questionnaires, so more questionnaires were handed out than required. This led to some of the campuses having a few more participants than originally planned. In some cases students felt left out and asked if they too could complete a questionnaire. Table 5.4 shows the actual number of participants at each of the campuses.

Table 5.4: Distribution of questionnaires among the five campuses

UJ Campus | Number of questionnaires completed
Bunting Rd | 163
Doornfontein | 193
East Rand | 10
Kingsway | 640
Soweto | 31
Total | 1 037

Questionnaires were randomly handed out to students at the computer laboratories on the different campuses. This process ran over 14 days. Permission to hand out the questionnaires was obtained prior to this period.

5.5 Data collection and data processing procedures

A thorough literature review was completed to establish the theoretical background for addressing the research problem. Different methods for managing access to pornography were discussed, focusing on their strengths and weaknesses. The empirical study can be broken up into two parts: content filtering testing and a questionnaire survey. The content filtering testing (URL testing) was completed to gauge the effectiveness of current content filtering employed at UJ with regard to online pornographic content. The questionnaires were used to gather insight into the effectiveness of current content filtering practices and to gain knowledge as to what content students feel they should be granted access to.

5.5.1 Content filtering testing

Content filtering testing was completed in four different main categories, which were derived from the taxonomy on pornography discussed in Chapter 2. These categories included softcore, hardcore, obscenity and child pornography. To maintain consistency, all the URLs used for the content filtering testing were tested using the same version of Internet Explorer.

In order to test the effectiveness of the content filtering used at the different campuses of UJ, different sub-categories of content had to be identified. This process involved the collection of Web pages from various Web sites under the different pornography categories identified in the taxonomy in Chapter 2. In order to test for over-blocking of content that was not pornographic in nature, two further sub-categories, over-blocking and medical, were included. The final categories for the purpose of the testing were:
• artistic nudity
• bestiality
• child erotica
• exposed or pseudo images
• frontal close-up
• medical
• over-blocking
• penetration
• rape
• sex acts involving minors
• visual erotica.

Content labelled as softcore is legal in a South African context and can be viewed as tasteful to liberal minds, so access to such content can be justified. The same goes for content labelled hardcore for those over the age of 18, although at the time of the research this content was condemned at UJ. Lastly, obscene content is illegal, and the laws of South Africa prohibit access to, production of or distribution of any content of this nature. Table 5.5 below illustrates the different categories of pornography tested.

Table 5.5: Breakdown of the different content categories tested

Softcore | Hardcore | Obscene | Child pornography | Over-blocking
Artistic nudity | Frontal close-ups | Bestiality | Exposed or pseudo images | Ambiguous content
Visual erotica | Penetration | Rape or violence | Sex acts involving minors | Medical
Child erotica | | | |

These main categories were identified from the taxonomy in Chapter 2. With the different categories one is able to test the accuracy of a particular content filter against an array of content and identify possible weaknesses or areas that need improvement. It is more important to prevent access to some forms of pornography than to others, as they may be illegal.

Erotic literature was left out of the content filtering testing, as the main aim of the research conducted focused on visual media (images) containing pornographic content. With a strong focus on image detection, text was less important, resulting in the exclusion of this category from testing. For the purpose of the research conducted, images that may have been present in erotic literature were recorded as visual erotica.

An open computer, one with no content filtering, was allocated to the researcher. This open computer was on the Kingsway campus and special permission was granted by the IT department to carry out searches for images in the above-mentioned categories. Google's image search was used to locate all images. Some images that were more difficult to find required a Web site search, for which a normal Google Web search was used. Once an image was identified, the URL for that particular Web page was followed to ensure access was granted to the Web site in question. Once access was granted, the URL was recorded to be visited at a later stage, when the testing would be conducted.

Four Web pages for each category were recorded for testing at the different campuses. Some of the images were related to those in Question 15 of the questionnaire that was handed out to the students. In this question students were asked whether or not they should be granted access to the content in question, which was divided into the different categories from the taxonomy (see Addendum A: Survey Questionnaire).

The testing at the five different campuses was completed within 24 hours to ensure consistency among the different content filtering systems utilised. It is common practice to update these content filtering systems at regular intervals, so it was vital to complete the testing within a short period of time. Each of the categories was tested on all of the campuses. The results were saved to a read-only digital medium for later analysis.

Once the testing was complete, the results were analysed. The results were recorded in five different categories: complete access, partial access, Squid Guard block, Dans Guardian block and technical error. Complete access was recorded if no pictures were blocked and all sections of the Web page were present. Partial access was recorded if any part of the Web page was not accessible; this is important, as images may be blocked separately. A Squid Guard block (see figure 5.3) was recorded where a UJ "access denied" Web page appeared instead of the required Web page. A Dans Guardian block (see figure 5.4) was recorded if one was redirected to the Dans Guardian Web page while trying to access the required Web page. A technical error was recorded if the required Web page was non-operational or could not be accessed. Such an error can occur when a Web page is moved, there is a change in the IP address, or the server where the content is located is offline.

Figure 5.3: Sample of screen layout for Squid Guard block


Figure 5.4: Sample of screen layout for Dans Guardian block
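Although the thesis does not describe any automation of this URL testing, the recording scheme above lends itself to a simple script. The sketch below is purely illustrative: the block-page marker strings, the example URL and the classify function are assumptions rather than descriptions of the actual UJ set-up, and separating complete from partial access would additionally require checking that every embedded image loads.

```python
import requests

# Hypothetical marker strings; the real UJ block pages would have to be inspected
# to find reliable text to match on.
SQUID_GUARD_MARKER = "Access denied"      # assumed text on the UJ Squid Guard block page
DANS_GUARDIAN_MARKER = "DansGuardian"     # assumed text on the Dans Guardian block page


def classify(url: str) -> str:
    """Classify a recorded test URL into one of the result categories of section 5.5.1."""
    try:
        response = requests.get(url, timeout=10)
    except requests.RequestException:
        # Page moved, DNS/IP change or server offline.
        return "technical error"
    if SQUID_GUARD_MARKER in response.text:
        return "Squid Guard block"
    if DANS_GUARDIAN_MARKER in response.text:
        return "Dans Guardian block"
    if response.status_code != 200:
        return "technical error"
    # Only the main page is checked here; a full test would also fetch every
    # embedded image in order to separate complete access from partial access.
    return "complete access"


if __name__ == "__main__":
    for url in ["http://example.com/recorded-test-page"]:   # placeholder URL
        print(url, "->", classify(url))
```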

5.5.2 Questionnaires

The second part of the research, the questionnaire survey, was designed in conjunction with Statcon (Statistical Consulting Service) on the Kingsway campus to ensure the continuity of questions and possible outcomes. Special emphasis was placed on the AUP at UJ (see Chapter 4). The questionnaire consisted of four sections:
Section A: Biographic information
Section B: University computer facility usage
Section C: University AUP
Section D: Personal experience with university computer facility.

The majority of questions were closed-ended questions with the exception of one question, which asked respondents to write their age in full years. With regard to closed-ended questions, respondents were required to select the most appropriate alternative.

Questionnaires were handed out to students at the computer laboratories on the different campuses. Some of the computer laboratories were located within the libraries on the different campuses, namely Bunting Rd, East Rand and Soweto. At Doornfontein a separate venue is dedicated to computer facilities, while at Kingsway various venues are allocated to computer laboratories, as well as the library. Student assistants from the Department of Information and Knowledge Management were used to hand out the questionnaires, overseen by the researcher. The average completion time for the questionnaire was approximately 10 minutes. The total time for the completion of all the questionnaires was 14 days.

Once all the completed questionnaires had been collected, they were sent to Statcon for processing and analysis. The initial frequencies were made available within three weeks. Further consultations with Statcon were conducted to finalise cross tabulations for a more in-depth statistical analysis. The interpretation of the results is discussed in section 5.8 of this chapter.

5.6 Interpretation of results from content filtering testing

Results have been divided into the different categories mentioned in Table 5.5. This section discusses the interpretation of the results with regard to the different categories, as well as a summary of content filtered at the different campuses.

5.6.1 Softcore

This category includes artistic nudity, child erotica and visual erotica (see chart 5.1).

• Artistic nudity: These Web pages contained nudity from various models, mostly female. The Web sites advertised photographic work from various photographers and for the purpose of the research this was labelled as artistic nudity. Results revealed that none of this category’s Web pages were blocked at any of the campuses. This indicated that the level of filtering was not too restrictive.

• Child erotica: These Web pages contained subjects under the age of 18 in suggestive contexts. This classification is incredibly difficult to define, as it concerns personal arousal that is determined by a number of factors. The only campus where all the Web pages were open to complete access was Doornfontein. All the other campuses achieved a 50% success rate at blocking the Web pages tested in this category. The Squid Guard block was recorded at Bunting Rd, East Rand, Kingsway and Soweto.

• Visual erotica: The images on these Web pages contained some nudity and were found on Web sites that were advertising or displaying art works. It is difficult to differentiate this category from artistic nudity, but the most significant difference is that this category contained less exposed skin than that of artistic nudity. A technical error was encountered with one of the Web pages. Bunting Rd and Doornfontein allowed complete access to 50% of the Web pages; 25% were blocked by Dans Guardian. East Rand, Kingsway and Soweto allowed complete access to 75% of these Web pages.

[Chart 5.1 plots the percentage of Web pages in the softcore sub-categories (artistic nudity, child erotica, visual erotica) blocked at each UJ campus; the vertical axis runs from 0% to 50%.]

Chart 5.1: Softcore content block on the various UJ campuses

5.6.2 Hardcore

This content category includes frontal close-ups and penetration (see chart 5.2).

• Frontal close-ups: These images contained nudity in a sexually suggestive manner. Bunting Rd and Doornfontein were able to block 100% of these images, while the other campuses did not fare well: East Rand, Kingsway and Soweto blocked 0%. Dans Guardian proved far more effective at restricting access to this category of content.

• Penetration: Images on these Web pages contained intercourse between two or more human beings. Bunting Rd and Doornfontein were able to block 75% of the content with Dans Guardian, while the remaining 25% was granted complete access. East Rand and Kingsway fared poorly with complete access granted to 75% of the Web pages in this category and only 25% being blocked with Squid Guard. Interestingly Soweto campus blocked 75% of the Web pages with Squid Guard and complete access was granted to only 25%, the same as Bunting Rd and Doornfontein.

[Chart 5.2 plots the percentage of Web pages in the hardcore sub-categories (frontal close-ups, penetration) blocked at each UJ campus; the vertical axis runs from 0% to 100%.]

Chart 5.2: Hardcore content block on the various UJ campuses

5.6.3 Obscene content

This content includes simulated rape or violence and bestiality (see chart 5.3).

• Simulated rape or violence: These images portrayed people being forced into sexual relations. The images contained content focusing on the domination of individuals who were forced into a submissive situation. Whether or not this was real or an act remains unknown, but the impression was left that these individuals were involved against their will. The Web pages from these Web sites contained text referring to either violent sex or rape. In this category there was a technical error with one of the Web pages. For Bunting Rd and Doornfontein 50% of the content was blocked by Dans Guardian, while complete access was granted to 25% of the content. With regard to East Rand, Kingsway and Soweto, complete access was granted to 75% of these Web pages.

• Bestiality: These Web pages incorporated images of animals, mostly pets, receiving stimulation in a sexual manner. The Dans Guardian filtering software employed at Bunting Rd and Doornfontein blocked all of these Web pages, while complete access was granted to all of the Web pages at East Rand, Kingsway and Soweto, which were using Squid Guard. This helped to identify a trend that was consistent throughout the test: Dans Guardian filtering software was more effective at blocking access to pornography and other unwanted content. In comparison, the Squid Guard software used at the other campuses was seldom effective at restricting access to unwanted content with regard to the material used for the testing.

[Chart 5.3 plots the percentage of Web pages in the obscene sub-categories (rape or violence, bestiality) blocked at each UJ campus; the vertical axis runs from 0% to 100%.]

Chart 5.3: Obscene content block on the various UJ campuses

5.6.4 Child pornography

This content category includes sex acts involving minors and exposed or pseudo images (see chart 5.4).

• Sex acts involving minors: Images on these Web pages contained minors, who were perceived to be under the age of 18, involved with either adults or other minors, in sexual relations. In this category a technical error was observed on one of the Web pages. Bunting Rd was able to block 100% of these Web pages and Doornfontein was able to block 75% of them, allowing complete access to 25%. East Rand, Kingsway and Soweto were able to block only 50% of these Web pages, with Squid Guard allowing complete access to 50% of the Web pages.

• Exposed or pseudo images: These images contained either modified real or animated images of minors (under the age of 18) revealing their naked bodies, including images of partial nudity. At Bunting Rd and Doornfontein 75% of the Web pages tested were blocked by Dans Guardian, while complete access was granted to 25% of the Web pages in this category. At East Rand, Kingsway and Soweto 50% of the Web pages were blocked by Squid Guard and complete access was granted to the remaining 50%.

[Chart 5.4 plots the percentage of Web pages in the child pornography sub-categories (sex acts involving minors, exposed or pseudo images) blocked at each UJ campus; the vertical axis runs from 0% to 100%.]

Chart 5.4: Child pornography content block on the various UJ campuses

5.6.5 Over-blocking

This content category includes ambiguous content and medical.

• Ambiguous content: The Web pages in this category contained text and images of topics that are common in the English language but have more than one connotation, depending on the context. One example that was tested was a Web page that was dedicated to poultry “cocks”. These Web pages tested the ability to avoid over-blocking and identify the context of the content. The campuses all had the same results and complete access was granted to all the Web pages in this category.

• Medical: These Web pages contained content centred on issues involving sex and sexuality, with some of the Web pages containing nudity. The campuses granted complete access to all of these Web pages. This illustrates the ability to grant access to challenging content where nudity and sex may be portrayed in an acceptable fashion. This is an important test to establish whether the filtering mechanisms employed are guilty of over-blocking.

5.6.6 Summary of content filtering

From the testing one can clearly distinguish the two different products currently used to filter content. Because campuses using the same product were configured with similar parameter settings, two distinct sets of results emerged, one for each product. Dans Guardian proved more effective at blocking access to unwanted content and upholding the current interim AUP of the university. Squid Guard was not as effective during testing and was less consistent in the content it blocked than Dans Guardian.

The blocking of content for the Bunting Rd and Doornfontein campuses was configured by the same IT staff and showed consistent results, with only marginal differences recorded. The East Rand, Kingsway and Soweto campuses were configured by a different group of IT staff and the results were not as consistent.

On the whole the Dans Guardian software seems to block more unwanted content and at the same time does not have a tendency to over-block, making it a well-balanced product. The East Rand, Kingsway and Soweto campuses should re-evaluate their content filtering solution, or a single product that performs at least as well as Dans Guardian should be installed at all campuses.

Charts 5.5 to 5.9 illustrate the amount of content blocked at each of the campuses. The larger the pie slice, the more content pertaining to that category was blocked during the testing conducted. For the purpose of display the pie slices in each of the charts are not in the same order but can be identified using colour.

[Chart 5.5 (pie chart) shows the share of blocked content on the Bunting Rd campus by category: artistic nudes 0%; child erotica 9%; visual erotica or pseudo images 9%; frontal close-up 17%; penetration 13%; rape 4%; bestiality 18%; sex acts involving minors 17%; exposed or pseudo images 13%; ambiguous content 0%; medical 0%.]

Chart 5.5: Summary of content blocking on Bunting Rd campus

[Chart 5.6 (pie chart) shows the share of blocked content on the Doornfontein campus by category: artistic nudes 0%; child erotica 0%; visual erotica or pseudo images 9%; frontal close-up 18%; penetration 14%; rape 14%; bestiality 17%; sex acts involving minors 14%; exposed or pseudo images 14%; ambiguous content 0%; medical 0%.]

Chart 5.6: Summary of content blocking on Doornfontein campus

[Chart 5.7 (pie chart) shows the share of blocked content on the East Rand campus by category: artistic nudes 0%; child erotica 18%; visual erotica or pseudo images 9%; frontal close-up 0%; penetration 9%; rape 9%; bestiality 0%; sex acts involving minors 18%; exposed or pseudo images 37%; ambiguous content 0%; medical 0%.]

Chart 5.7: Summary of content blocking on East Rand campus

[Chart 5.8 (pie chart) shows the share of blocked content on the Kingsway campus by category: artistic nudes 0%; child erotica 18%; visual erotica or pseudo images 9%; frontal close-up 0%; penetration 9%; rape 9%; bestiality 0%; sex acts involving minors 18%; exposed or pseudo images 37%; ambiguous content 0%; medical 0%.]

Chart 5.8: Summary of content blocking on Kingsway campus

[Chart 5.9 (pie chart) shows the share of blocked content on the Soweto campus by category: artistic nudes 0%; child erotica 15%; visual erotica or pseudo images 8%; frontal close-up 0%; penetration 23%; rape 8%; bestiality 0%; sex acts involving minors 15%; exposed or pseudo images 31%; ambiguous content 0%; medical 0%.]

Chart 5.9: Summary of content blocking on Soweto campus

5.7 Statistical analysis of questionnaires

Frequency analysis and interpretation of the responses to the questionnaires were used to identify common trends, while cross tabulations were used to give further insight into possible relationships between questionnaire variables (see Addendum B). The interdependencies of these variables were statistically analysed using a form of hypothesis testing.

The Concise Oxford English Dictionary (2004) defines a hypothesis as “…explanation based on limited evidence used as a starting point for further investigation”.

The use of hypothesis testing is an integral part of the empirical quantitative research process, providing a platform to conduct or initiate further research. When testing a hypothesis, two outcomes can arise: the null hypothesis (H0) is retained or the alternative hypothesis (H1) is accepted. A null hypothesis represents a theory that has been put forward, either because it is believed to be true or because it is to be used as a basis for argument. An alternative hypothesis is the statement that a statistical test is set up to establish (Easton & McColl 2004).

Effect size is used to determine the nature of the relationship between two variables. When determining the effect size (using the Cramer's V and Phi coefficient methods) the sample size is not an inhibiting factor, as the bigger the sample is, the stronger the power of the test. Rosenthal et al (2000:15) identify four groups of effects: no effect, small effect, medium effect and large effect. In table 5.6 the size of the effect is matched to the effect group.

Table 5.6: Rosenthal’s description of effect sizes (Rosenthal et al 2000:15)

No effect | 0.0 – 0.1
Small effect | 0.1 – 0.3
Medium effect | 0.3 – 0.5
Large effect | 0.5 – 1

A small effect size means there is no or very little dependency between the two variables. A large effect size shows a significant or great dependency between the two identified variables.
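For reference, although the text above does not state the formula explicitly, Cramer's V is conventionally derived from the chi-square statistic of an r by c contingency table as

\[ V = \sqrt{\frac{\chi^{2}}{N\,\bigl(\min(r,c)-1\bigr)}} \]

where N is the total number of observations; for a 2 by k table, such as the gender cross tabulations that follow, V coincides with the Phi coefficient.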

In Question 8 students were asked how often they surf the Web at the University of Johannesburg. This question was useful to establish whether males or females were more susceptible to exposure to online pornography relative to the time spent on the Internet. Table 5.7 indicates that a larger percentage of males were using the Internet more frequently. Statistical analysis was done to identify the effect size.

Table 5.7: Cross tabulation of gender difference for Question 8

 | At least every day | Twice a week | Once a week | Once a month | I only surf the Internet at home | Total
Male: Count | 181 | 164 | 90 | 47 | 18 | 500
Male: % within gender | 36.2% | 32.8% | 18% | 9.4% | 3.6% | 100%
Female: Count | 137 | 153 | 117 | 74 | 33 | 514
Female: % within gender | 26.7% | 29.8% | 22.8% | 14.4% | 6.4% | 100%
Total: Count | 318 | 317 | 207 | 121 | 51 | 1 014
Total: % within gender | 31.4% | 31.3% | 20.4% | 11.9% | 5% | 100%

Chi-Square Tests

 | Value | df | Asymp. Sig. (2-sided)
Pearson Chi-Square | 20.239a | 4 | .000
Likelihood Ratio | 20.381 | 4 | .000
Linear-by-Linear Association | 19.850 | 1 | .000
N of Valid Cases | 1 014

a. 0 cells (.0%) have expected count less than 5. The minimum expected count is 25.15.

Symmetric Measures

 | Value | Approx. Sig.
Nominal by Nominal: Phi | .141 | .000
Nominal by Nominal: Cramer's V | .141 | .000
N of Valid Cases | 1 014

• Not assuming the null hypothesis.
• Using the asymptotic standard error assuming the null hypothesis.

A small effect size was achieved, indicating a modest difference between male and female respondents. The proportion of males that surf the Internet at least every day is significantly different from the proportion of females that do so; males are therefore at a higher risk of being exposed to pornography.
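As an illustration of how the figures reported for Question 8 could be checked, the following minimal Python sketch (not part of the original analysis, which was carried out by Statcon) recomputes the chi-square statistic and Cramer's V from the counts in Table 5.7 using scipy; the values should come out close to the reported 20.239 and 0.141.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Observed counts for Question 8 (Table 5.7): rows are gender, columns are surfing frequency.
observed = np.array([
    [181, 164, 90, 47, 18],    # male
    [137, 153, 117, 74, 33],   # female
])

chi2, p, dof, expected = chi2_contingency(observed)

n = observed.sum()
# Cramer's V; for this 2 x 5 table it equals the Phi coefficient.
cramers_v = np.sqrt(chi2 / (n * (min(observed.shape) - 1)))

print(f"chi2 = {chi2:.3f}, df = {dof}, p = {p:.4f}, Cramer's V = {cramers_v:.3f}")
# Expected to be close to the reported values: chi2 = 20.239, df = 4, V = .141
```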

In Question 12 students were asked how tolerant they regard themselves towards exposure to unsolicited pornography at the University of Johannesburg. In table 5.8 it is indicated that a larger percentage of males were not bothered by this exposure, as opposed to the female respondents. Statistical analysis was done to determine the effect size. 90 Table 5.8: Cross tabulation of gender differences for Question 12

                                  Does not    Bothers   Deemed as      Students should have     Total
                                  bother me   me        unacceptable   the freedom to access
                                                                       pornography
Gender   Male     Count               214       111        127              45                    497
                  % within gender    43.1%     22.3%      25.6%            9.1%                   100%
         Female   Count               155       154        184              13                    506
                  % within gender    30.6%     30.4%      36.4%            2.6%                   100%
Total             Count               369       265        311              58                  1 003
                  % within gender    36.8%     26.4%        31%            5.8%                   100%

Chi-Square Tests
                               Value     df    Asymp. Sig. (2-sided)
Pearson Chi-Square             44.436a    3    .000
Likelihood Ratio               45.590     3    .000
Linear-by-Linear Association    2.914     1    .088
N of Valid Cases                1 003

a. 0 cells (.0%) have expected count less than 5. The minimum expected count is 28.74.

Symmetric Measures
                                   Value    Approx. Sig.
Nominal by Nominal   Phi            .210    .000
                     Cramer’s V     .210    .000
N of Valid Cases                   1 003

• Not assuming the null hypothesis
• Using the asymptotic standard error assuming the null hypothesis

The effect size measured indicated a small effect, confirming a modest difference between males and females. The proportion of males that are not bothered by exposure to unsolicited pornography is much greater than the proportion of females. This shows statistically that males are more tolerant towards pornography, whereas females are more likely to be offended by such content.

In Question 14 students were asked to what extent they thought students should have access to pornography on UJ’s computer facilities. Table 5.9 indicates that males reported restricted and total access more often than female respondents. Statistical analysis was done to determine the size of the effect.

Table 5.9: Cross tabulation of gender differences for Question 14

                                  No access   Restricted access   Total access   Total
Gender   Male     Count               259           186                49          494
                  % within gender    52.4%         37.7%             9.9%         100%
         Female   Count               347           138                20          505
                  % within gender    68.7%         27.3%               4%         100%
Total             Count               606           324                69          999
                  % within gender    60.7%         32.4%             6.9%         100%

Chi-Square Tests
                               Value     df    Asymp. Sig. (2-sided)
Pearson Chi-Square             31.961a    2    .000
Likelihood Ratio               32.416     2    .000
Linear-by-Linear Association   31.913     1    .000
N of Valid Cases                  999

a. 0 cells (.0%) have expected count less than 5. The minimum expected count is 34.12.

Symmetric Measures
                                   Value    Approx. Sig.
Nominal by Nominal   Phi            .179    .000
                     Cramer’s V     .179    .000
N of Valid Cases                     999

• Not assuming the null hypothesis
• Using the asymptotic standard error assuming the null hypothesis

A small effect size was obtained, indicating the relative magnitude of the difference between male and female respondents. The proportion of females who believe that no access to pornography should be granted is considerably greater than the corresponding proportion of males. This analysis again shows that females are less inclined than their male counterparts to access pornography or to endorse access to it.
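
For completeness, the two effect sizes above can be verified by hand from the chi-square values reported in the tables, using the Cramér’s V formula given earlier (a worked check):

\[
V_{Q12} = \sqrt{\frac{44.436}{1003}} \approx 0.210,
\qquad
V_{Q14} = \sqrt{\frac{31.961}{999}} \approx 0.179
\]

Both values fall in the 0.1 – 0.3 band of Table 5.6 and are therefore small effects.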

5.8 Interpretation of results from questionnaires

The frequencies from the evaluation of the questionnaires have been converted into charts and graphs for easier understanding and graphical representation. Biographic information was obtained from Questions 1 to 4. University computer facility usage was dealt with in Questions 5 and 6. Question 7 dealt with the university’s Acceptable Use Policy. Personal experience with the university’s computer facilities was addressed in Questions 8 to 17. The following is a discussion of each of these sections.

5.8.1 Biographic information

Gender distribution is almost even, with 49,4% male and 50,6% female participants. This was not enforced by those conducting the research; a desirable 50/50 split was nevertheless almost achieved.

Most of the respondents enrolled during or after 2004 (chart 5.10). This accounted for 71,3% of the respondents. The largest group of students first registered in 2006, which made up 26,7% of the sample. The questionnaires were completed by respondents in the last quarter of the year, giving students who enrolled in 2006 some exposure to the computer and online facilities provided by UJ.


Chart 5.10: Results for Question 2: Year of first enrolment at tertiary education institution

There were some problems with the question relating to the age of the respondents, as seven respondents indicated that they were 14 years of age or younger: two respondents indicated they were two years of age, another two claimed to be four years of age, two others said they were 11 years of age and one respondent claimed to be 14 years of age. This did not match the observations of those assisting in handing out the questionnaires, and a decision was made not to include these responses with regard to the age of respondents. These incorrect answers amounted to only 0,7% of the sample group, having no significant effect on the rest of the population.

The rest of the respondents indicated that they were 17 years of age or older. The largest grouping, respondents aged between 19 and 21 years, accounted for 61,6% of the respondents. Those aged 17 or 18 years made up 9,6% and those from 22 to 26 years made up 25,1% of the students in the sample. Only 3,7% were 27 years or older (see chart 5.11).


Chart 5.11: Results for Question 3: Age of student (in full years)

5.8.2 University computer facility usage

With regard to the use of the computer facilities provided by UJ, 42,7% of the sample indicated that they used the facilities daily. This illustrates the impact that computers have on the courses they study, as many departments use computer facilities such as WebCT for the distribution of content as well as for assessment. The next significant group of respondents claimed that they used the computer facilities twice a week. Only 4,6% claimed they used the computer facilities once a month or less frequently (chart 5.12).


Chart 5.12: Results for Question 5: Which option best describes how often you use the online computer facilities provided by the University of Johannesburg?

For Question 6, respondents were asked to rank their computer facility usage, with 1 being the most frequently used and 5 being the least frequently used. WebCT and Edulink were recorded as the most used, with 32,5% ranking this facility as number 1. This was followed by surfing the Web at 29% and the email facility at 27,1% (see chart 5.13). The least used online facility that UJ provided, according to the students who responded, was the library services, which include online catalogues and electronic journals. This facility was ranked 5 by 34% of the students (see chart 5.14).


Chart 5.13: Results for Question 6: Percentage of students ranking each online facility as 1 (most used)


Chart 5.14: Results for Question 6: Percentage of students ranking each online facility as 5 (least used)

5.8.3 University AUP

For Question 7 students were asked if they thought UJ had a policy that governs the use of the computer facilities (chart 5.15). Altogether 47,3% of students indicated that there was such a policy in place, 11,4% claimed that such a policy did not exist, 32,0% were unsure and only 9,4% claimed that they had seen such a policy.


Chart 5.15: Results for Question 7: Do you think the university has a policy that governs the use of the computer facilities?

Question 7 can be broken down further into the separate campuses (chart 5.16). The campus with the highest recorded awareness was the Kingsway campus, where 53,5% of students reported that there was such a policy in place and 12% of students claimed that they had seen this policy. The campus with the lowest awareness was East Rand, where 31,3% of students indicated that there was a policy governing the use of computer facilities at UJ and only 6,3% claimed that they had seen such a policy.


Chart 5.16: Results for Question 7 (per campus): Do you think the university has a policy that governs the use of the computer facilities?

5.8.4 Personal experience with the university’s computer facilities

With regard to Question 8, the largest group of respondents (31,6 %) stated that they surfed the Internet (WWW) at least once a day, followed by a close 31,3% stating that they surfed the Internet (WWW) twice a week (see chart 5.17). This confirms that students are incorporating Internet (WWW) usage into their day-to-day student activities where access to unwanted content is inevitable. Only 16,8% claimed they surfed the Internet (WWW) less than once a week, while 5% of students claimed they reserved Internet (WWW) surfing for home.


Chart 5.17: Results for Question 8: What would best describe how often you search the Web at the University of Johannesburg?

Students on the Doornfontein campus surfed the Internet (WWW) more frequently than those on the other campuses, with 49,5% claiming to surf at least once a day, followed by Bunting Rd with 39%, Soweto with 38,2% and East Rand with 25%. The campus where students were least likely to surf the Internet (WWW) at least once a day was Kingsway, with 22,3% (chart 5.18).


Chart 5.18: Breakdown per UJ campus of daily Internet (WWW) surfing

The majority of respondents had experienced content being blocked at some stage of their university career; only 18,7% of the students claimed that they had never had access blocked before. The largest group, 33,9%, had experienced content being blocked only once or twice (chart 5.19). This can be compared with the content filtering testing, which indicated that the Dans Guardian product used at Bunting Rd and Doornfontein was more effective at blocking content, while the less restrictive Squid Guard is used at East Rand, Kingsway and Soweto. This is reflected in the responses: at Bunting Rd 29,7% and at Doornfontein 41,3% of students stated that they were often confronted with blocked access, while at East Rand 31,3%, Kingsway 19,2% and Soweto 14,7% of students stated that they often came across a Web site that had been blocked (chart 5.20).


Chart 5.19: Results for Question 9: While surfing the Internet have you ever come across either of these Web pages while trying to access a particular Web site?


Chart 5.20: Results for Question 9 (per campus): While surfing the Internet have you ever come across either of these Web pages while trying to access a particular Web site?

When students were asked whether the content the University was actively blocking contained academic information (Question 10), 52,1% stated “yes”. The next group of 30,2% claimed “no” and 17,7% of the students were unsure of the nature of the content they were trying to access.

Chart 5.21 shows the breakdown per campus. The campus that experienced the greatest amount of supposed academic content blockage was East Rand, with 68,8% stating that this happened. This was not consistent with the researcher’s prior testing; the inaccuracy could be owing to the small sample at East Rand, which consisted of only 10 respondents. Altogether 60,9% of students at Doornfontein believed that they had been denied access to academic content, followed by Bunting Rd with 52,5%. Students at the Kingsway and Soweto campuses did not experience as much blocking of academic content as those on other campuses; only 40,4% at Kingsway and 38,2% at Soweto identified blocking of access to educational content.


Chart 5.21: Results for Question 10: If you have come across either of the Web pages, do you think that the Web sites you were trying to access contained any relevant academic information that would help you in your studies?

When asked how often the respondents were exposed to unsolicited pornography (chart 5.22), 80% stated never, which indicates that the combined content filtering at UJ is fairly effective at suppressing unwanted content. This was followed by 12,8% claiming only once or twice and 7,2% claiming they had been exposed more than three times.


Chart 5.22: Results for Question 11: How often have you been exposed to unsolicited pornography while using the online computer facilities provided by the University of Johannesburg?

When further focus was placed on the different campuses, those with the less restrictive Squid Guard (East Rand, Kingsway and Soweto) encountered greater exposure to unsolicited pornography. At Kingsway 78,1% of students had never been exposed to unsolicited pornography, at Soweto 73,5% and at East Rand 62,5%. These percentages are considerably lower than those of the campuses that use Dans Guardian to block access to unwanted content: 80,1% of Bunting Rd respondents and 84,2% of Doornfontein respondents indicated that they had never been exposed to unsolicited pornography (chart 5.23).


Chart 5.23: Results from Question 11: Never exposed to unsolicited pornography at the University of Johannesburg per campus

The largest group, 36,7%, stated that exposure to unsolicited pornography did not bother them, while the next significant group, 31,3%, deemed such exposure unacceptable. The percentage of students who were bothered by exposure to unsolicited pornography was 26,3%, and 5,7% believed they should have the freedom to access pornography (chart 5.24).


Chart 5.24: Results for Question 12: How tolerant are you towards exposure to unsolicited pornography while using the computer facilities provided by the University of Johannesburg?

With regard to gender (chart 5.25), females were less tolerant than males of exposure to unsolicited pornography: 35,6% of females deemed exposure to unsolicited pornography unacceptable compared with 25,1% of males. Only 2,5% of females believed that students should have the freedom to access pornography, compared with 8,9% of male respondents. This clearly illustrates that females are significantly less tolerant of pornography than males. Students at Doornfontein, Bunting Rd and Kingsway were less bothered about exposure to unsolicited pornography than those at East Rand and Soweto. At Doornfontein 39,7% were not bothered by exposure to unsolicited pornography, followed by Kingsway with 37,9% and Bunting Rd with 33,5%. The last two campuses were less tolerant of exposure, with only 14,7% of the students at Soweto and 12,5% at East Rand stating that they were not bothered.


Chart 5.25: Results for Question 12 (per gender): How tolerant are you towards exposure to unsolicited pornography while using the computer facilities provided by the University of Johannesburg?

When students were asked if they had ever reported others accessing pornography, 84,5% stated “no”, which is not particularly surprising, as 80% had stated that they had never been exposed to unsolicited pornography at UJ (see chart 5.22). Only 8,7% claimed they had reported such activity, while 6,8% stated they had thought of it but did not want to make a scene.

Male students were more likely to report others accessing pornography, with 9,1% claiming that they had done so, compared with 7,4% of females. Students who registered in 2001 were the most likely to report access to pornography, with 20% of these students doing so, followed by those who registered before 2000 with 13% and those who registered in 2002 with 11,7%. This trend indicates that older students were more inclined to report others accessing pornography than students who had registered in the last four years.

When asked to what extent students should have access to pornography (chart 5.26), 60,5% stated "none", while 32,6% believed that access should be granted to those who have permission for research; surprisingly, 6,9% believed that students should be granted total access to pornography.


Chart 5.26: Results for Question 14: To what extent do you think students should have access to pornography on the university’s computer facilities?

When the results are further analysed by gender, 67,1% of females believed no access should be granted to pornography, compared with 51,3% of males, which is considerably lower than the 60,7% average (chart 5.27). More males, 36,8%, believed that access should be granted to those who get permission for research, and only 26,7% of females believed the same. Only 3,9% of females believed total access should be granted to pornography, as opposed to 9,7% of males.


Chart 5.27: Results for Question 14 (per gender): To what extent do you think students should have access to pornography on the university’s computer facilities?

For Question 15 students were asked whether or not they should be allowed to access certain types of content. The types of content in question were derived from the taxonomy (see Chapter 2), with the exception of one category, "medical". This was included to gauge the perception of students with regard to nudity in acceptable surroundings. The input from the students is vital for identifying problem areas that should be addressed by the university. The content in question can be divided into three main categories: softcore, hardcore and obscene (which for the purpose of the research was subdivided into child pornography, bestiality and simulated rape or violence).

Concerning content labelled softcore (chart 5.28) the response to the question on allowing access to such content was fairly high, which was expected, with 63.8% of students indicating they wanted access to visual erotica, followed by erotic literature with 33.7%, artistic nudity with 26.5% and child erotica with 19.2%.


Chart 5.28: Students’ views on access to softcore pornography

Concerning content labelled hardcore, 17.4% of students wanted access to full frontal nudity and 14.7% believed students should have access to acts of penetration. This seems relatively high, and access to such content should be blocked by the filtering mechanisms in place (chart 5.29).


Chart 5.29: Students’ views on access to hardcore pornography

Content labelled obscene, which is illegal to distribute and possess in South Africa, received the following response from the students: 11.0% of students wanted access to bestiality and 10.8% believed that they should be allowed to access content containing acts of simulated rape or violence (chart 5.30).


Chart 5.30: Students’ views on access to obscene pornography

The last category was child pornography, which is also obscene and therefore illegal to access and distribute in South Africa. In this category 23.4% of students responded that they should be allowed to access content of exposed children or pseudo images (chart 5.31). The reason for such a high response might be a lack of knowledge of the severe implications of pseudo images, which can include content such as cartoons and comics depicting underage sexual activity. The other category of child pornography involves sex acts involving children, to which 13.8% believed they should be allowed access.


Chart 5.31: Students’ views on access to child pornography

In reply to Question 15, it is interesting to note that, with regard to gender, females consistently responded negatively to access to these various types of pornography. This was expected, as females tend to be more conservative with regard to pornography in general. Males, on the other hand, seem to have a more inquisitive nature concerning the issue of pornography. Multiple studies have indicated that men, especially adolescents, are greater consumers of pornography (Haggerstorm-Nordin et al 2005; Hald 2006). Chart 5.32 illustrates the difference between male and female students’ opinions on what one should have access to.


Chart 5.32: Results for Question 15 (per gender): Please indicate whether or not you think students should have access to the following content.

Question 17 was an optional question at the end of the survey, which only 3,2% did not answer; the response rate was very good for an optional question. Students were asked how often they used the computer facilities at UJ to access pornography. The majority (88,4%) claimed "never", followed by 5,7% claiming once or twice. A notably small 2,2% stated that they used UJ computer facilities daily to access pornography (chart 5.33).


Chart 5.33: Results for Question 17: How often have you used the computer facilities provided by the university to access pornography?

If one looks at the campus distribution for Question 17, the largest percentage of students who used the computer facilities daily to view pornography was at East Rand, with 18,8%. Percentages at the other campuses were considerably lower, with only 3% at Bunting Rd, 2,2% at Doornfontein, 1,3% at Kingsway and 0% at Soweto indicating that they had used the UJ computer facilities to access pornography daily (chart 5.34). It must again be noted that the East Rand campus has a very small sample, so the results may not be a true reflection of the rest of the students at that campus.


Chart 5.34: Results for Question 17 (per campus): How often have you used the computer facilities provided by the university to access pornography?

5.9 Summary

In this chapter, findings on the effectiveness of current content filtering methods and the views of students on access to online pornography were discussed. Through the use of testing and questionnaires, findings on topics such as the current effectiveness of content filtering at UJ, the use of computer facilities, student Internet usage, students’ views on acceptable content and the presence of an AUP at UJ contributed to answering the problem stated in 5.3.1 of this chapter. Chapter 6 will deal with the conclusion and recommendations based on the findings of this chapter.

Chapter 6 Conclusion and Recommendations

6.1 Conclusion

In the literature review of the research project the following was established:

Chapter 2 discusses how pornography affects a large part of society at present; laws in place allow the sale of pornography to those over the legal age (18 years in South Africa). The Internet is no different, with almost 12% of content on the Internet containing pornography (Family Safe Media 2006). The subject of pornography is taboo in many cultures and issues concerning pornography are often unresolved or not discussed. Pornography is a loosely used term, creating confusion about defining and identifying it. A taxonomy is therefore developed in Chapter 2 that divides the term pornography into different categories to achieve greater understanding and insight.

Not all forms of pornography are illegal and some forms of pornography mentioned in the taxonomy are accepted by Western society as a form of artistic expression. Only obscene content is illegal. This category of pornography is immoral, as it shows no regard for the rights of those involved. This obscene content is defined in Chapter 2.

Chapter 3 deals with content filters, the software programs used to actively restrict access to unwanted content. Content filters are developed in various formats to identify unwanted content and restrict users from accessing it. The different content filters dealt with in the chapter included keyword filters, URL blocking, protocol blocking and content filtering rating systems and services. Each of these content filtering solutions has its own unique attributes, and for this reason some content filters are better suited to filtering pornography than others. Some content filters are particularly good at filtering content but fall into the trap of being too restrictive, which is known as over-blocking. It is important, especially in a university environment, that content filters are not so restrictive that they are guilty of over-blocking.
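
To make the distinction between these approaches concrete, the following minimal Python sketch illustrates keyword filtering and URL (blocklist) filtering in their simplest form. It is purely illustrative: the blocklist, keyword list and function names are invented for this example and are not taken from Dans Guardian, Squid Guard or the UJ configuration.

    from urllib.parse import urlparse

    # Illustrative rule sets only; real filters ship large, curated lists.
    BLOCKED_DOMAINS = {"blocked-example.test", "another-blocked.test"}
    FLAGGED_KEYWORDS = {"term1", "term2", "term3"}   # placeholder terms

    def url_blocked(url: str) -> bool:
        """URL blocking: refuse requests whose host is on the blocklist."""
        host = urlparse(url).hostname or ""
        return host in BLOCKED_DOMAINS

    def keyword_blocked(page_text: str, threshold: int = 2) -> bool:
        """Keyword filtering: block a page once enough flagged terms occur.
        The threshold is one crude way to limit over-blocking of, for
        example, medical or academic pages that mention a flagged term."""
        text = page_text.lower()
        hits = sum(1 for term in FLAGGED_KEYWORDS if term in text)
        return hits >= threshold

    def allow_request(url: str, page_text: str) -> bool:
        """A hybrid check combining both approaches."""
        return not (url_blocked(url) or keyword_blocked(page_text))

The threshold parameter hints at the over-blocking trade-off noted above: the stricter the rule, the more pornography is blocked, but the greater the risk of also blocking legitimate academic content.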

AUPs were discussed in Chapter 4. An AUP is a policy that deals with the tolerated behaviour and the manner in which computer facilities should be used. The policy includes lists of do’s and don’ts and the expected netiquette, and it forms the foundation of a passive control mechanism preventing access to pornography at the university. An AUP needs to be comprehensive, covering a wide range of issues, so that users of the computer facilities provided by UJ are informed of the behaviour expected of them.

The empirical research, discussed in Chapter 5, has uncovered valuable findings that could assist UJ to employ a standardised content filter for all five of the campuses and in the development of a comprehensive AUP. This would ensure a complete approach to actively and passively managing access to online pornography.

The content filtering testing that was conducted revealed some interesting and unexpected results. Two different content filtering solutions are currently employed at UJ: the Bunting Rd and Doornfontein campuses use a product called Dans Guardian, and East Rand, Kingsway and Soweto use another product, Squid Guard. This was reflected in the test results, which fell into two corresponding sets. Overall, Dans Guardian displayed greater potential at preventing access to pornography. Both content filtering solutions passed the over-block test, indicating that current filtering measures are not too restrictive.

The questionnaires completed by the students yielded some useful input. Only a small minority, 20% of students, complained of being exposed to unsolicited pornography while using the on-campus computer facilities at UJ. Students indicated that there was little awareness of the existence of an AUP at UJ; only 9% of respondents claimed they had seen the current AUP. The information gathered from the testing and questionnaires can act as a valuable resource for the improvement of content filtering at UJ and the creation of a comprehensive AUP.

The final conclusion to the research problem is that the University of Johannesburg can, to a large extent, manage access to online pornography through the use of an effective content filter and a comprehensive Acceptable Use Policy.

6.2 Recommendations

From the information compiled for the research, the following recommendations are made:

The University of Johannesburg needs to choose a single content filtering solution, as there is no consistency between the two different content filtering solutions used at present. Of the two used by UJ (Dans Guardian and Squid Guard), Dans Guardian proved more effective at preventing access to pornography in the tests conducted.

One of the areas where the current AUP falls short is in specifying and explaining the pornographic content that will or will not be tolerated. UJ therefore needs to develop an official, organisation-wide, comprehensive AUP to inform computer facility users of the required behaviour and to ensure a productive environment for students and staff. In addition, serious efforts need to be made to ensure the visibility of an official UJ AUP to students and staff.

6.3 Areas for future research

There is a need for the improvement of content filtering software. As discussed in Chapter 3, there are different methods for filtering content. Some of these methods can be called hybrid content filtering methods, as they use more than one approach to content identification to improve accuracy. The most accurate solution to filtering online pornography is an image filter, as most of the pornography is in the format of images or motion pictures.

In view of the relative newness of these applications, there are still a few problems that need to be ironed out. Further research is needed to fine-tune these applications and to improve the accuracy of their content identification for the detection of pornography.

The research conducted was restricted to only the University of Johannesburg. There is a need to conduct this research on a national level to test the effectiveness of content filtering solutions at all the universities in South Africa. This information could be used to identify best practice as well as trends to improve the effectiveness of content filtering within universities in South Africa. In addition, the research conducted can be extended to primary and secondary education institutions.

References

Adam, A. 2001. Book Review: Pornography and the Internet: “The Biggest Dirty Bookshop in History?” Computer and Society 36-40.

Allard, M. & Hannabuss, S. 2001. Issues of Religion and Censorship. Library Review 50(2):81-89.

Bailey, F. 1999. Sex Sells: Pornography: The Secret History of Civilisation. [Online]. Available WWW: http://www.melonfarmers.co.uk/arsssecr.htm. Accessed 29 May 2007.

Bissette, D.C. 2004. Internet Pornography Statistics: 2003. [Online]. Available WWW: http://healthyminds.com/s-port-stats.html. Accessed 26 September 2005.

Blythe, M. & Jones, M. 2004. Human Computer (Sexual) Interactions. Interactions September + October 75-76.

Chan, Y. Harvey, R. Smith, D. 1999. Building Systems to Block Pornography. [Online]. Available WWW: http://www.2.cmp.uea.ac.uk/~rwh/research/reprints/cir99.pdf. Accessed 14 June 2005.

Chheuy, R. 2004. You’re Studying What?!: A Look at the Colourful Courses Offerings of Modern Academia. [Online]. Available WWW: http://www.calpatriot.org/article.php?articleID=148. Accessed 29 May 2007.

Concise Oxford English Dictionary. 2004. Oxford. Oxford University Press.

Cullen, L. 2006. Time Magazine: Sex in Syllabus. [Online]. Available WWW: http://www.time.com/time/magazine/article/0,9171,117697-3,00.html. Accessed 24 April 2007.

Dallas, G. Striking the Right Balance: Prescriptive v. Enabling Strategies. [Online]. Available WWW: http://www.finance.ox.ac.uk/file_links/corporategovconference/George%20Dallas%20 11-30%20session.ppt. Accessed 29 May 2007.

Downs. R. 2005. Microsoft Encarta Encyclopaedia Online: Pornography: Attitudes Towards Pornography. [Online]. Available WWW: http://encarata.msn.com/text_761568395_11/Pornography.html. Accessed 19 July 2005.

Easton, V. & McColl, J. 2004. Statistics Glossary: Hypothesis Testing. [Online]. Available WWW: http://www.cas.lancs.ac.uk/glossary_v1.1/hyptest.html#hypothtest. Accessed 29 May 2007.

Ebbs, G. & Rheingold, H. 1997. Censorship on the Information Super Highway. Internet Research: Electronic Networking Applications and Policy 7(1): 59-60.

Emory, C.W. & Cooper, D.R. 1995. Business Research Methods: 5th Edition. Homewood. Irwin.

Evans, M. Reagle, J. Shareck, P. 1996. An Alternative to Government Regulation and Censorship: Content Advisory Systems for the Internet. Computer and Society 26(4): 9-14.

Family Safe Media. 2006. Pornography Statistics. [Online]. Available WWW: http://www.familysafemedia.com/pornography_statistics.html. Accessed 30 January 2007.

Firschein, O. Li, J. Wang, J. Wiederhold, G. 1998. Systems for Screening Objectionable Images. Computer Communications 25(15): 1355-1360.

Flowers, B. & Rakes, G. 2000. Analyses of Acceptable Use Policies Regarding the Internet in Selected K-12 Schools. Journal of Research on Computing in Education 32(3): 351-365.

Gartner Consulting. 2001. Blocking Online Gambling: Technical Study. [Online]. Available WWW: http://www.iia.net.au.pdf. Accessed 30 January 2006.

Grossman, W. 2000. How to Draft an AUP. PC Support Advisor. June 21-22.

Guttman, C. 1999. The Darker Side of the Net: Dissemination of Child Pornography. [Online]. Available WWW: http://www.findarticles.com/p/articles/mi- m1310/is_1999_Sept/ai_56027293. Accessed 15 May 2005.

Haggerstorm-Nordin, E. Hanson, U. Tyden, T. 2005. Association Between Pornography Consumption and Sexual Practices Among Adolescents in Sweden. International Journal of STD & AIDS 16(2): 102-107.

Halavais, A. C. 2005. Small Pornographies. SIGGROUP Bulletin 25 (2). 19-22.

Hald, G.M. 2006. Gender Differences in Pornography Consumption Among Young Heterosexual Danish Adults. Archives of Sexual Behaviour 35(5): 557-585.

Hart, T. Klatenhauser, B. Machill, M. 2002. Structural Development of Internet Self- Regulation: A Case Study of the Internet Content Rating Association (ICRA). Info 4(5): 39-55.

Held, G. 1999. Beware of a New Potential Liability. International Journal of Network Management 9 (1).

Hochheiser, H. 2001. Computer Professionals for Social Responsibility: Filtering FAQ. [Online]. Available WWW: http://www.cpsr.org/prevsite/filters/faq.html. Accessed 6 June 2005.

Hogg, C. 1999. What is Pornography? [Online]. Available WWW: http://www.slais.ubc.ca/courses/libr500/fall1999/www_presentations/c_hogg/argue.ht m. Accessed 29 May 2007.

Howe, W. 2004. An Anecdotal History of the People and Communities that Brought About the Internet and the Web. [Online]. Available WWW: http://www.walthowe/navnet/history.html. Accessed 5 August 2005.

Hughes, J. 2004. Ten Tips for Implementing an Acceptable Internet Use Policy. [Online]. Available WWW: http://www.computerworld.com/printthis/2004/0,4814,94231,00.html. Accessed 29 May 2007.

Hunter, C. 1999. Filtering the Future?: Software Filtering, Porn, PICS, and Internet Content Conundrum. [Online]. Available WWW: http://www.ala.org/ala/oif/ifissues/hunterthesis.pdf. Accessed 29 May 2007.

Johnson, P. 1996. Pornography Drives Technology: Why Not to Censor the Internet. [Online]. Available WWW: http://www.law.indiana.edu/fclj/pubs/v49/no1/johnson.html. Accessed 29 May 2007.

Johnson, P. 1998. Can You Quote Donald Duck?: Intellectual Property in Cyber Culture. [Online]. Available WWW: http://www.usergioarboleda.edu.co/civilizar/revista5/CAN_YOU_QUOTE.doc. Accessed 24 April 2007.

Jones, M. & Rehg, J. 2002. Detecting Adult Images. [Online]. Available WWW: http://crl.research.compaq.com/projects/vision/adult-detection.htm. Accessed 6 June 2005.

Joppe, M. 2001 The Research Process. [Online]. Available WWW: http://www.ryerson.ca/~mjoppe/rp.htm. Accessed 24 January 2007.

Kelehear, Z. 2005. When Email Goes Bad: Be Sure that Your AUP cover Staff as Well as Students. American School Board Journal January: 32-34.

Kinnaman, D. 1995. Critiquing Acceptable Use Policy. [Online]. Available WWW: http://www.lo.Com/~kinnaman/pchealth/draft.html. Accessed 26 September 2005.

Kleiner, B. & Welebir, B. 2005. How to Write a Proper Internet Usage Policy. Management Research News 28 (2/3).

Kranich, N. 2004. Why Filters Won’t Protect Children or Adults. Library Administration & Management 18(1): 14-18.

Kressin, M. 1997. The Internet and the World Wide Web: A Time Saving Guide for New Users. New York. Prentice-Hall: 49-52.

Larsen, O.N. 1994. Voicing Social Concern. Lanham: University Press of America:82- 83.

Lichtenstein, S. & Swartman, P. 1997. Internet Acceptable Use Policy for Organisations. Information Management and Computer Security 5(5): 182-190.

McCullagh, D. 2003. Net Blocking Threatens Legitimate Sites. [Online]. Available WWW: http://news.com.com/Net+blocking+threatens+legitimate+sites/2100-1023_3- 985126.html. Accessed 13 June 2005.

McKenzie, J. 1995. Creating Broad Policies for Student Use of the Internet. The Education Technology Journal 5(7).

McNair, B. 1996. Mediated Sex: Pornography and Postmodern Culture. London. Arnold: 47-49.

Merriam Webster Dictionary of Law: Definition of Pornography. 1996. [Online]. Available WWW: http://dictionary.reference.com/search?q=pornography. Accessed 29 May 2007.

Mouton, J. 1996. Understanding Social Research. Pretoria. J.L. van Schaik: 35.

Net Dictionary. 2004. Acceptable Use Policy. [Online]. Available WWW: http://www.netdictionary.com/a.html. Accessed 29 May 2007.

Newth, M. 2001. The Long History of Censorship. [Online]. Available WWW: http://www.beaconforfreedom.org/about_project.history.html. Accessed 24 April 2007.

Ong, J. 2004. Pornography and Internet Technologies. [Online]. Available WWW: http://www.wiki.media- culture.org.au/index.php?title=Pornography_and_Internet_Technologies. Accessed 29 May 2007.

Paglia, C. 1990. Sexual Personae: Art and Decadence from Nefertiti to Emily Dickinson. New Haven. Yale University Press: 24-25.

Peace, G. 2003. Balancing Free Speech and Censorship: Academia’s Response to the Internet. Communications of ACM 45(11): 105-109.

Rezmierski, V.E. 1994. Seeing Through the Smoke And Haze: Clarifying Issues Regarding Electronic Access to Potentially Offensive Material And Pornography. ACM SIGUCCS XXII.

Ropelato, J. 2003. Peer-to-Peer Pornography: Kids Know, Do Mom and Dad. [Online]. Available WWW: http://www.familysafemedia.com/peer-to-peer_pornography_-- _ki.html. Accessed 19 April 2007.

Rosenthal, R. Rosnow, R. Rubin, D. 2000. Contrasts and Effect Sizes in Behavioural Research: A Correlational Approach. Cambridge: Cambridge University Press.

Rothery, B. 2003. Guide to Child Pornography on the Internet. [Online]. Available WWW: http://www.inquisition21.com/articles~view~7~page_num~6.html. Accessed 29 May 2007.

Russell, D. 2004. What is Pornography? [Online]. Available WWW: http://www.dianarussell.com/pornintro.html. Accessed 29 May 2007.

Sandy, G.A. 2001. The Online Service Bill: Theories and Evidence of Pornographic Harm. Research and Practice in Information Technology 1(46) 55.

Satkofsky, A. 2004. The Express Times: Web a Playground for Pornography. [Online]. Available WWW: http://www.nj.com/specialprojects/expresstimes/index.ssf?/news/expresstimes/stories/ molesters4_onlineporn.html. Accessed 29 May 2007.

Sein, R. 2001. Fascinating Censorship: Mundane Behavior in the Treatment of Banned Material. Journal of Mundane Behavior 2(1).

Scheaffer, R. Mendenhall, W. Ott, L. 1979. Elementary Survey Sampling: Second Edition. Massachusetts. Duxbury Press.

Scherpereel, C. 2006. Decision Orders: A Decision Taxonomy. Management Decisions 44(1):123-136.

Schneider, K,G. 1997. A Practical Guide to Internet Filters. Neal-Schuman Publishers. New York.

Schneider, K,G. 1998 Figuring Out Filters: A Quick Guide to Demystify Them. School Library Journal February 1998: 36-38.

Scott, V. & Vass, R. 1994. Ethics and the 7 “P’s” of Computing Use Policies. Ethics in Computing Age: 61-67.

Simbulan, M. 2004. Internet Access Practice and Employee Attitudes Towards Internet Usage Implementation in Selected Philippines Financial Institutions. Gadjah Mada International Journal of Business 6(2): 193-224.

Splitt, D. 2001. Backup Your Filtering with an Airtight AUP. [Online]. Available WWW: http://www.eschoolnews.com/news/showstory.cfm?ArticleID=2755. Accessed 29 May 2007.

St. James Encyclopaedia of Pop Culture: Pornography. 2002. [Online]. Available WWW: http://findarticles.com/p/articles/mi_g1epc/is_tov/ai_2419100979. Accessed 24 April 2007.

Standler, R. 2002. Issues in Computer Acceptable Use Policy. [Online]. Available WWW: http://www.rbs.com/policy.htm. Accessed 19 October 2005.

Steinback, S. Steinback, W. 1988. Understanding and Conducting Qualitative Research. Reston, VA: Council for Exceptional Children, 8-9.

Stokes, D.E. 1997. Pasteur’s Quadrant: Basic Science and Technological Innovation. Washington D.C. Brookings Press Institution.

Stott, D. 2001. Your Internet Acceptable Use Policy. PC Support Advisor. August 7- 10.

Surfcontrol. 2005. How to Write an Acceptable Use Policy (AUP). [Online]. Available WWW: http://www.trustmarquesolutions.com/security/documents/aup.pdf. Accessed 29 May 2007.

Taylor, G.R. 2000. Integrating Quantitative and Qualitative Methods in Research. Lanham. University Press of America.

Taylor, M. 2002. The Nature and Dimension of Child Pornography on the Internet. [Online]. Available WWW: http://www.ipce.info/library_3/files/nat_dims_kp.htm. Accessed 29 May 2007.

Thomas, D.S. 1997. Cyberspace Pornography: Problems with Enforcement. Internet Research: Electronic Networking Applications and Policy 7(3): 201-207.

Toronto Star. 06/09/2005. .XXX Marks the Spot for Porn Surfers.

UKTV. 2005. History of Pornography. [Online]. Available WWW: http://www.uktv.co.uk/index.cfm?uktv=standardItem.Index&aID=529865. Accessed 29 May 2007.

Van de Beer, D. 1992. Pornography. Encyclopaedia of Ethics. New York: Garland Publishing.

Watney, M. 2005. Regulation of the Internet Pornography in South Africa, Conference Paper, Proceedings of the 7th WWW Conference on WWW Applications, Cape Town, 2005.

Watney, M. 2006. Tydskryf vir Hedendaagse Romeins-Hollandse Reg: Regulation of Internet Pornography in South Africa (69)2: 227-237.

Wave Technologies. 1998. Core: LAN, WANs and the Internet. St Louis. Master Skill: 230.

Westfall, J. 2005. The Problem: “Cybersmut”. [Online]. Available WWW: http://www.scu.edu/ethics/publications/submitted/westfall/blocking.html. Accessed 29 May 2007.

Wikipedia. 2005. Acceptable Use Policy. [Online]. Available WWW: http://en.wikipedia.org/wiki/Acceptable_use_policy. Accessed 3 October 2005.

Wikipedia. 2005. Erotica. [Online]. Available WWW: http://www.wikipedia.org/wiki/Erotica. Accessed 29 May 2007.

Wikipedia. 2005. Roth vs. United States. [Online]. Available WWW: http://en.wikipedia.prg/wiki/Roth_v._United_States. Accessed 20 September 2005.

Wikipedia. 2006. Research. [Online]. Available WWW: http://en.wikipedia.org/wiki/Research. Accessed 24 January 2007.

Wikipedia. 2007. Hentai. [Online]. Available WWW: http://en.wikipedia.org/wiki/Hentai. Accessed 29 May 2007


Addendum A: Survey Questionnaire

UNIVERSITY OF JOHANNESBURG

DEPARTMENT OF INFORMATION AND KNOWLEDGE MANAGEMENT

OFFICIAL RESEARCH PROGRAMME

QUESTIONNAIRE

The aim of this questionnaire is to identify any possible issues that students face with regard to exposure to pornography on the Internet facilities provided by the University of Johannesburg. These issues may include accidental exposure, intentional access or restriction of access to content that is believed to contain pornographic content.

I am currently a Master’s Degree student at the University of Johannesburg and am completing a study on managing access to pornography in an online University environment. The nature of this study requires assistance and input from you as a user of the online computer facilities provided by the university.

The outcome of the research is to provide the University with guidelines on how to improve their policies and pornography filtering techniques to create a better learning environment for all.

Please note that students are not required to share any personal information that will link them to this particular questionnaire; STUDENTS WILL REMAIN ANONYMOUS.


Biographic Information

1. Please complete the following information. (Mark with an X)
Gender: Male / Female

2. Year of first enrolment at a tertiary education institution Before 2000 2001 2002 2003 2004 2005 2006

3. Age of Student (in full years) ______

4. At which campus are you predominantly based?
Bunting Road / Doornfontein / East Rand / Kingsway / Soweto

University Computer Facility Usage

5. Which option best describes how often you use the online computer facilities provided by the University of Johannesburg?
Once a day or more / Once every second day / Twice a week / Once a week / Once a month / Never

6. Rank your online computer facilities usage, from 1 (being the most used) to 5 (being the least used). (Please rank all the categories listed in the table below, see example)

Email

Library Online Services

Student Portal

Surfing the WWW

WebCT or Edulink

University AUP

7. Do you think the University of Johannesburg has a policy that governs the use of the computer facilities? Yes No I do not know Yes, I have seen it

Personal Experience with University’s Computer Facilities


8. What would best describe how often you surf the Internet at the University of Johannesburg?
At least every day / Twice a week / Once a week / Once a month / I only surf the Internet at home

9. While surfing the Internet, have you ever come across either of these Web pages while trying to access a particular Web site?

Never Most of the time Often Once or twice, not often

10. If you have come across either of these Web pages, do you think that the Web sites that you were trying to access contained relevant academic information that would help you with your studies?

Yes No Unsure of the nature of the content I was trying to access

11. How often have you been exposed to unsolicited pornography while using the online computer facilities provided by the University of Johannesburg?
Never / 1-2 times / 3-5 times / 6-10 times / 11-20 times / More than 20 times


12. How tolerant are you towards exposure to unsolicited pornography while using the online computer facilities provided by the University of Johannesburg?
Does not bother me / Bothers me / Deemed as unacceptable / Students should have freedom to access pornography

13. Have you ever reported other students accessing pornography, while using the online computer facilities provided by the University of Johannesburg, to a staff member or student responsible for the computer facilities? Yes No Thought of it but did not want to create a scene

14. To what extent do you think that students should have access to pornography on the University’s computer facilities?
None / Restricted access to those who have permission and are doing research / Total

15. Please state whether or not you think students should have access to the following content.

Pictures of people (models) in swimming costumes Yes No

Picture of infants being bathed  Yes  No
Pictures showing uncovered genitals  Yes  No
Erotic literature love stories containing descriptions of sexual acts  Yes  No
Pictures of minors (under the age of 18) being sexually stimulated by either other minors or adults  Yes  No
Literature containing descriptions of sexual encounters  Yes  No
Pictures of adults engaging in intercourse  Yes  No
Pictures of minors (under the age of 18) in swimming costumes or mildly revealing attire  Yes  No
Pictures of pets being used or receiving sexual stimulation from humans  Yes  No
Pictures depicting genital dysfunctions or infection of the genital region i.e. STD’s  Yes  No
Pictures of cartoon style images under the age of 18 revealing their naked bodies or involved in sexual activity  Yes  No
Pictures of people in lingerie or underwear  Yes  No
Pictures of people being forced into sexual relations against their will  Yes  No
Photos illustrating nudity with unfamiliar surroundings  Yes  No
Pictures of minors (under the age of 18) revealing their bodies (genitals or breasts)  Yes  No
Pictures revealing full body nudity, adults  Yes  No
Pictures of cysts or growths in female and male breasts  Yes  No
Pictures of minors (under the age of 18) unclothed at a nudist beach  Yes  No
Pictures of people engaged in sexual acts  Yes  No
Violent sexual intercourse, resulting in physical or emotional harm  Yes  No
Artistic photos that contain nudity  Yes  No


16. Please indicate which of the following content you think is legal or illegal according to the laws governing obscenity in South Africa.

Images of naked people posing in an artistic nature  Legal  Illegal
Images of frontal nudity including the groin region  Legal  Illegal
Images of two or more people involved in sexual activities  Legal  Illegal
Images of minors (under 18) exposing their naked bodies  Legal  Illegal
Images of people being forced into sexual relations or simulated rape  Legal  Illegal
Animals being used by humans for sexual stimulation  Legal  Illegal

17. How often have you used the computer facilities provided by the university to access pornography? (Optional question) Never Once or twice Once a week Once a month Daily

Thank you for your participation in this questionnaire; your input will help with the improvement of service delivery at the University of Johannesburg to all its stakeholders.


Addendum B: Statistical Frequency Analyses

Statistics

Group Campus Gender Firstenrol

Valid 1035 1027 1022 1021 N Missing 0 8 13 14 Mode 5 4 2 7

Group

Frequency Percent Valid Percent Cumulative Percent

1 Bunting Road 163 15.7 15.7 15.7 2 Doornfontein 193 18.6 18.6 34.4 3 East Rand 10 1.0 1.0 35.4 Valid 4 Soweto 31 3.0 3.0 38.4 5 Kingsway 638 61.6 61.6 100.0 Total 1035 100.0 100.0

Campus

Frequency Percent Valid Percent Cumulative Percent

1 Bunting Road 236 22.8 23.0 23.0 2 Doornfontein 184 17.8 17.9 40.9 3 East Rand 16 1.5 1.6 42.5 Valid 4 Kingsway 557 53.8 54.2 96.7 5 Soweto 34 3.3 3.3 100.0 Total 1027 99.2 100.0 Missing System 8 .8 Total 1035 100.0

Group * Campus Crosstabulation (Count)

                          Campus
                          1 Bunting   2 Doornfontein   3 East Rand   4 Kingsway   5 Soweto   Total
                          Road
Group   1 Bunting Road        162              0             0             0           0       162
        2 Doornfontein          9            178             1             1           0       189
        3 East Rand             0              0            10             0           0        10
        4 Soweto                0              0             0             0          31        31
        5 Kingsway             65              6             5           556           3       635
Total                         236            184            16           557          34      1027


Gender

Frequency Percent Valid Percent Cumulative Percent

1 Male 505 48.8 49.4 49.4 Valid 2 Female 517 50.0 50.6 100.0 Total 1022 98.7 100.0 Missing System 13 1.3 Total 1035 100.0

Gender 1 Male 2 Female

Count % Count % 1 Bunting Road 74 14.7% 89 17.2% 2 Doornfontein 115 22.8% 70 13.5% 3 East Rand 7 1.4% 3 .6% 4 Soweto 17 3.4% 14 2.7% 5 Kingsway 292 57.8% 341 66.0% Total 505 100.0% 517 100.0%

Firstenrol

Frequency Percent Valid Percent Cumulative Percent

1 Before 2000 23 2.2 2.3 2.3 2 2001 25 2.4 2.4 4.7 3 2002 60 5.8 5.9 10.6 4 2003 166 16.0 16.3 26.8 Valid 5 2004 229 22.1 22.4 49.3 6 2005 245 23.7 24.0 73.3 7 2006 273 26.4 26.7 100.0 Total 1021 98.6 100.0 Missing System 14 1.4 Total 1035 100.0


Year of first enrolment at tertiary education institution 1 Before 2000 2 2001 3 2002 4 2003 5 2004 6 2005 7 2006

Group Group Group Group Group Group Group Count % Count % Count % Count % Count % Count % Count % 1 Bunting 1 4.3% 4 16.0% 9 15.0% 24 14.5% 44 19.3% 48 19.5% 32 11.7% Road 2 3 13.0% 2 8.0% 15 25.0% 34 20.5% 63 27.6% 29 11.8% 40 14.7% Doornfontein 3 East Rand 1 4.3% 2 1.2% 2 .9% 1 .4% 4 1.5% 4 Soweto 2 8.7% 3 5.0% 17 10.2% 3 1.3% 2 .8% 3 1.1% 5 Kingsway 16 69.6% 19 76.0% 33 55.0% 89 53.6% 116 50.9% 166 67.5% 194 71.1% Total 23 100.0% 25 100.0% 60 100.0% 166 100.0% 228 100.0% 246 100.0% 273 100.0%

Statistics Age Valid 966 N Missing 69 Mean 20.881 Median 21.000 Mode 21.0 Std. Deviation 3.2290 Skewness 3.383 Std. Error of Skewness .079 Kurtosis 43.161 Std. Error of Kurtosis .157 Minimum 2.0 Maximum 63.0


Age

Frequency Percent Valid Percent Cumulative Percent

2.0 2 .2 .2 .2 4.0 2 .2 .2 .4 11.0 2 .2 .2 .6 14.0 1 .1 .1 .7 17.0 8 .8 .8 1.6 18.0 85 8.2 8.8 10.4 19.0 183 17.7 18.9 29.3 20.0 194 18.7 20.1 49.4 21.0 218 21.1 22.6 71.9 22.0 97 9.4 10.0 82.0 23.0 72 7.0 7.5 89.4 24.0 37 3.6 3.8 93.3 25.0 25 2.4 2.6 95.9 Valid 26.0 12 1.2 1.2 97.1 27.0 4 .4 .4 97.5 28.0 5 .5 .5 98.0 29.0 4 .4 .4 98.4 30.0 4 .4 .4 98.9 31.0 2 .2 .2 99.1 32.0 1 .1 .1 99.2 33.0 2 .2 .2 99.4 36.0 2 .2 .2 99.6 37.0 1 .1 .1 99.7 41.0 1 .1 .1 99.8 50.0 1 .1 .1 99.9 63.0 1 .1 .1 100.0 Total 966 93.3 100.0 Missing System 69 6.7 Total 1035 100.0

Statistics Q5 Valid 1018 N Missing 17 Mode 1


Q5

Frequency Percent Valid Percent Cumulative Percent

1 Once a day or more 435 42.0 42.7 42.7 2 Once every second day 194 18.7 19.1 61.8 3 Twice a week 231 22.3 22.7 84.5 Valid 4 Once a week 111 10.7 10.9 95.4 5 Once a month 40 3.9 3.9 99.3 6 Never 7 .7 .7 100.0 Total 1018 98.4 100.0 Missing System 17 1.6 Total 1035 100.0

Rank order of online computer facilities usage 1 Most used 2 3 4 5 Least used 6 Total

Count % Count % Count % Count % Count % Count % Count % Online computer facilities 252 27.1% 155 16.7% 169 18.2% 160 17.2% 194 20.9% 930 100.0% usage: email Online computer facilities usage: 181 19.7% 130 14.2% 123 13.4% 171 18.6% 312 34.0% 1 .1% 918 100.0% library online service Online computer facilities 224 24.0% 192 20.6% 182 19.5% 161 17.2% 175 18.7% 934 100.0% usage: student portal Online computer facilities 269 29.0% 150 16.2% 178 19.2% 150 16.2% 179 19.3% 926 100.0% usage: surfing the www Online computer facilities usage: 295 32.5% 136 15.0% 133 14.6% 117 12.9% 228 25.1% 909 100.0% WebCT or Edulink


Statistics Q7 Valid 994 N Missing 41 Mode 1

Q7

Frequency Percent Valid Percent Cumulative Percent

1 Yes 470 45.4 47.3 47.3 2 No 113 10.9 11.4 58.7 Valid 3 I do not know 318 30.7 32.0 90.6 4 Yes, I have seen it 93 9.0 9.4 100.0 Total 994 96.0 100.0 Missing System 41 4.0 Total 1035 100.0

Statistics

Q8 Q9 Q10 Q11 Q12 Q13 Q14

Valid 1024 1003 937 1021 1013 1016 1009 N Missing 11 32 98 14 22 19 26 Mode 1 4 1 1 1 2 1

Q8

Frequency Percent Valid Percent Cumulative Percent

1 At least every day 324 31.3 31.6 31.6 2 Twice a week 320 30.9 31.3 62.9 3 Once a week 208 20.1 20.3 83.2 Valid 4 Once a month 121 11.7 11.8 95.0 5 I only surf the Internet at home 51 4.9 5.0 100.0 Total 1024 98.9 100.0 Missing System 11 1.1 Total 1035 100.0


Q9

Frequency Percent Valid Percent Cumulative Percent

1 Never 188 18.2 18.7 18.7 2 Most of the time 210 20.3 20.9 39.7 Valid 3 Often 265 25.6 26.4 66.1 4 Once or twice, not often 340 32.9 33.9 100.0 Total 1003 96.9 100.0 Missing System 32 3.1 Total 1035 100.0

Q10 Valid Cumulative Frequency Percent Percent Percent 1 Yes 488 47.1 52.1 52.1 2 No 283 27.3 30.2 82.3 Valid 3 Unsure of the nature of the content I was 166 16.0 17.7 100.0 trying to access Total 937 90.5 100.0 Missing System 98 9.5 Total 1035 100.0

Q11

                          Frequency   Percent   Valid Percent   Cumulative Percent
Valid
  1 Never                      817      78.9        80.0              80.0
  2 1-2 times                  131      12.7        12.8              92.9
  3 3-5 times                   45       4.3         4.4              97.3
  4 6-10 times                  11       1.1         1.1              98.3
  5 11-20 times                  2        .2          .2              98.5
  6 More than 20 times          15       1.4         1.5             100.0
  Total                       1021      98.6       100.0
Missing System                  14       1.4
Total                         1035     100.0


Q12

                                                           Frequency   Percent   Valid Percent   Cumulative Percent
Valid
  1 Does not bother me                                          372      35.9        36.7              36.7
  2 Bothers me                                                  266      25.7        26.3              63.0
  3 Deemed as unacceptable                                      317      30.6        31.3              94.3
  4 Students should have freedom of access to pornography        58       5.6         5.7             100.0
  Total                                                        1013      97.9       100.0
Missing System                                                   22       2.1
Total                                                          1035     100.0

Q13

                                                        Frequency   Percent   Valid Percent   Cumulative Percent
Valid
  1 Yes                                                       88       8.5         8.7               8.7
  2 No                                                       859      83.0        84.5              93.2
  3 Thought of it but did not want to create a scene          69       6.7         6.8             100.0
  Total                                                     1016      98.2       100.0
Missing System                                                19       1.8
Total                                                       1035     100.0

Q14

                                                                           Frequency   Percent   Valid Percent   Cumulative Percent
Valid
  1 None                                                                        610      58.9        60.5              60.5
  2 Restricted access to those who have permission and are doing research      329      31.8        32.6              93.1
  3 Total                                                                        70       6.8         6.9             100.0
  Total                                                                        1009      97.5       100.0
Missing System                                                                   26       2.5
Total                                                                          1035     100.0


Should have access (count and % per item)

Pictures of people (models) in swimming costumes: Yes 753 (73.3%), No 274 (26.7%), Total 1027 (100.0%)
Pictures of infants being bathed: Yes 493 (48.0%), No 534 (52.0%), Total 1027 (100.0%)
Pictures showing uncovered genitals: Yes 170 (16.7%), No 850 (83.3%), Total 1020 (100.0%)
Erotic literature: love stories containing descriptions of sexual acts: Yes 337 (32.9%), No 687 (67.1%), Total 1024 (100.0%)
Pictures of minors (under the age of 18) being sexually stimulated by either other minors or adults: Yes 85 (8.3%), No 935 (91.7%), Total 1020 (100.0%)
Literature containing descriptions of sexual encounters: Yes 354 (34.6%), No 668 (65.4%), Total 1022 (100.0%)
Pictures of adults engaging in intercourse: Yes 158 (15.4%), No 866 (84.6%), Total 1024 (100.0%)
Pictures of minors (under the age of 18) in swimming costumes or mildly revealing attire: Yes 270 (26.4%), No 753 (73.6%), Total 1023 (100.0%)
Pictures of pets being used for or receiving sexual stimulation from humans: Yes 116 (11.4%), No 905 (88.6%), Total 1021 (100.0%)
Pictures depicting genital dysfunctions or infection of the genital region, i.e. STDs: Yes 502 (49.0%), No 522 (51.0%), Total 1024 (100.0%)
Cartoon-style images of characters under the age of 18 revealing their naked bodies or involved in sexual activity: Yes 202 (19.7%), No 822 (80.3%), Total 1024 (100.0%)
Pictures of people in lingerie or underwear: Yes 556 (54.5%), No 465 (45.5%), Total 1021 (100.0%)
Pictures of people being forced into sexual confrontation against their will: Yes 137 (13.4%), No 885 (86.6%), Total 1022 (100.0%)
Black and white photos illustrating nudity in unfamiliar surroundings: Yes 168 (16.4%), No 855 (83.6%), Total 1023 (100.0%)
Pictures of minors (under the age of 18) revealing their bodies (genitals or breasts): Yes 92 (9.0%), No 931 (91.0%), Total 1023 (100.0%)
Pictures revealing full body nudity, adults: Yes 191 (18.7%), No 831 (81.3%), Total 1022 (100.0%)
Pictures of cysts or growths in female and male breasts: Yes 358 (35.0%), No 665 (65.0%), Total 1023 (100.0%)
Pictures of minors (under the age of 18) unclothed at a nudist beach: Yes 128 (12.5%), No 896 (87.5%), Total 1024 (100.0%)
Pictures of people engaged in sexual acts: Yes 148 (14.4%), No 878 (85.6%), Total 1026 (100.0%)
Violent sexual intercourse, resulting in physical or emotional harm: Yes 126 (12.3%), No 898 (87.7%), Total 1024 (100.0%)
Artistic photos that contain nudity: Yes 380 (37.0%), No 647 (63.0%), Total 1027 (100.0%)

Statistics: Q15sum

N Valid     949
N Missing    86


Q15sum

          Frequency   Percent   Valid Percent   Cumulative Percent
Valid
  0            130      12.6        13.7              13.7
  1             86       8.3         9.1              22.8
  2            103      10.0        10.9              33.6
  3             80       7.7         8.4              42.0
  4             96       9.3        10.1              52.2
  5             74       7.1         7.8              60.0
  6             70       6.8         7.4              67.3
  7             56       5.4         5.9              73.2
  8             44       4.3         4.6              77.9
  9             34       3.3         3.6              81.5
  10            32       3.1         3.4              84.8
  11            34       3.3         3.6              88.4
  12            23       2.2         2.4              90.8
  13            19       1.8         2.0              92.8
  14            21       2.0         2.2              95.0
  15             6        .6          .6              95.7
  16             2        .2          .2              95.9
  17            10       1.0         1.1              96.9
  18             6        .6          .6              97.6
  19             4        .4          .4              98.0
  20             4        .4          .4              98.4
  21            15       1.4         1.6             100.0
  Total        949      91.7       100.0
Missing System  86       8.3
Total         1035     100.0

Content deemed legal or illegal (count and % per item)

Images of naked people posing in an artistic nature: Legal 617 (60.6%), Illegal 401 (39.4%), Total 1018 (100.0%)
Images of frontal nudity including the groin region: Legal 313 (31.1%), Illegal 694 (68.9%), Total 1007 (100.0%)
Images of two or more people involved in sexual activities: Legal 272 (26.9%), Illegal 739 (73.1%), Total 1011 (100.0%)
Images of minors (under 18) exposing their naked bodies: Legal 79 (7.8%), Illegal 939 (92.2%), Total 1018 (100.0%)
Images of people being forced into sexual confrontation or simulated rape: Legal 72 (7.1%), Illegal 946 (92.9%), Total 1018 (100.0%)
Animals being used by humans for sexual stimulation: Legal 54 (5.3%), Illegal 965 (94.7%), Total 1019 (100.0%)

Statistics: Q17

N Valid    1002
N Missing    33

Q17

                     Frequency   Percent   Valid Percent   Cumulative Percent
Valid
  1 Never                 886      85.6        88.4              88.4
  2 Once or twice          57       5.5         5.7              94.1
  3 Once a week            24       2.3         2.4              96.5
  4 Once a month           13       1.3         1.3              97.8
  5 Daily                  22       2.1         2.2             100.0
  Total                  1002      96.8       100.0
Missing System             33       3.2
Total                    1035     100.0

Addendum C: Computer Network User Code

Annexure B

RAND AFRIKAANS UNIVERSITY INFORMATION TECHNOLOGY (IT)

CAMPUS NETWORK: USER CODE

1. Service delivery

1.1 The Information Technology Division of the Rand Afrikaans University (hereinafter referred to as ‘the Division’) is committed to providing a service that is effective, economical and reliable.

1.2 In order to render the envisaged service, the Division undertakes, as far as possible, to:
(a) ensure access to computer services by providing the necessary login codes and passwords;
(b) grant authorized users access to the central administrative systems;
(c) handle the maintenance of the network and related equipment so as to minimize the interruption of services;
(d) restore interruptions in network connections quickly and effectively;
(e) respect the privacy of network users, in terms of paragraph 1.3;
(f) act professionally and provide a professional service at all times; and
(g) maintain the highest possible standards within the given budgets.

1.3 In the event of proven suspicion, RAU reserves the right to interrupt services if necessary and with the authorization of a specific University Management member. No email, Internet site or any other information will, however, be checked without the user’s permission (unless a legal transgression has occurred).

2. User undertaking

2.1 Users are any RAU employees who have been authorized by their management members, dean, departmental chairperson or divisional head to use the RAU campus network, or lawfully registered RAU students who meet all the set requirements. 2.2 Each user undertakes: (a) to act professionally at all times; (b) to accept responsibility for the prescribed and correct use of the relevant systems; (c) to comply with copyright rules, as prescribed by local and international acts; (d) not to use inappropriate language and material or send inappropriate messages; (e) to ensure that other users and systems are not harmed through their actions; (f) to keep login codes and passwords to themselves; (g) not to give unauthorized people access to any services or systems; (h) to ensure that they do not infringe on their rights as students or staff members with a specific task; (i) to report any abuse of the systems, equipment or services by colleagues or students as soon as possible; (j) to ensure that all software is lawfully in their use and without infringing on licensing conditions; and (k) to ensure that there is no unauthorized use or abuse of the University’s resources and services.

3. Unauthorized use or abuse

1) Unauthorized use or abuse includes but is not limited to:

a) the provision of any official RAU or personal login codes and passwords to any other person;
b) unauthorized use of software;
c) the interception of network traffic;
d) transgressions of copyright rules, as set in national and international copyright acts, treaties and agreements;
e) unauthorized use of facilities for personal financial or any other profits;
f) the playing of unauthorized computer games;
g) ownership or disclosure of pornographic material unless it is related to approved and/or bona fide research;
h) intentional or negligent distribution or development of computer viruses;
i) establishment of services such as file, WWW and email servers without the necessary authorization or approval (including abuse of registered RAU domain names and IP addresses);
j) linking any apparatus to the network without the written approval of the IT Division;
k) removal or exchange of computer equipment without the approval of the IT Division;
l) changing or extending network equipment without the approval of the IT Division;
m) attempts at unlawfully obtaining the login codes and passwords of other users; and
n) intentional or negligent disclosure of confidential information, including salary and other financial information.

4. Important guidelines

4.1 Passwords and login codes must be protected at all times. They should not be disclosed or written in any visible place.
4.2 No one should attempt to obtain passwords in an unlawful manner.
4.3 The login codes of other staff may not be used under any circumstances.
4.4 Computer network stations must be protected by a screen password or the user should log off from the network.
4.5 No strange or unknown software may be downloaded from the Internet and/or used.
4.6 Users must ensure that viruses are not spread and be wary of email attachments on which viruses are normally placed.
4.7 If sensitive information is sent via email, it must be encrypted.
4.8 Any security shortcoming must be reported to the IT Division immediately.
4.9 Any suspicious or unauthorized action that is detected must be reported immediately.
4.10 Any electronic services, such as approved WWW servers, must be protected.
4.11 Official services may be stored only on approved servers.

5. Confidentiality

Although the University will do everything in its power to protect the privacy of individuals, it reserves the right to summons anyone to have any computer or information investigated and/or to seize it if there is reasonable suspicion that a transgression has been committed.

FV/Security Policy without complete disaster plan/28/2/200
