ICSA Labs 7th Annual Computer Virus Prevalence Survey 2001

Lawrence M. Bridwell, Content Security Programs Manager, ICSA Labs

Peter Tippett, MD, PhD, Chief Technology Officer, TruSecure® Corporation

Sponsors:

Gantz-Wiley Research
Network Associates, Inc.
Panda Software
Symantec Corporation


Table of Contents

LIST OF FIGURES ...... III
LIST OF TABLES ...... IV
EXECUTIVE OVERVIEW ...... 1
OBJECTIVES ...... 2
RESEARCH METHODOLOGY ...... 2
    CONFIDENCE ...... 2
    SELECTION ...... 2
    INTERVIEW TARGET ...... 2
    TRAINED INTERVIEWERS ...... 3
    ROUNDING ...... 3
    PREVIOUS WORK ...... 3
    BIASES ...... 3
        Retrospective Study ...... 3
        Correctness ...... 4
        Site Selection ...... 4
        Encounter Definition ...... 4
        Familiarity ...... 4
PRINCIPAL FINDINGS ...... 5
    2001 DEMOGRAPHICS ...... 5
    HOW COMMON ARE VIRUS INFECTIONS? ...... 5
    CHANCE OF A DISASTER ...... 6
    RESPONDENT PERCEPTION OF THE VIRUS PROBLEM ...... 6
DETAILED FINDINGS ...... 8
    VIRUS ENCOUNTERS ...... 8
    TOP REPORTED VIRUSES ...... 9
    HOW IS PREVALENCE CHANGING? ...... 10
    VIRUS DISASTERS ...... 12
        Date of last virus disaster ...... 12
        Which virus caused the most recent disaster? ...... 13
        How many PCs were infected in disasters? ...... 14
        Accuracy of initial estimates ...... 15
    WHAT ARE THE EFFECTS ON VICTIMS OF VIRUS DISASTERS? ...... 18
        How long were servers down? ...... 18
        What was the cost of the disaster in person-days? ...... 20
        What was the cost in dollars to your company? ...... 21
    WHAT IMPACT DO VIRUSES HAVE? ...... 22
        What are the organizational effects of viruses? ...... 22
    WHERE DO THEY COME FROM? ...... 24
        Changes in Virus Distribution Mechanisms ...... 26
    USAGE OF ANTI-VIRUS PRODUCTS ...... 27
        Overall Level of Usage ...... 27
        Anti-virus products disabled on PCs ...... 28
        Anti-virus products employed on PCs ...... 29
        Anti-virus mechanisms in use on PCs ...... 31
        Server anti-virus methods ...... 33
        Anti-virus usage on perimeter services ...... 33
        Perimeter anti-virus methods ...... 35


    PC OPERATING SYSTEMS ...... 39
        Operating systems on PCs ...... 40
    LAN OPERATING SYSTEMS ...... 41
        Operating systems on LAN servers ...... 42
    PRIMARY LINE OF BUSINESS ...... 43
    DEPARTMENT ...... 43
    JOB TITLE ...... 43
DISCUSSION SECTION ...... 44
    WHAT CONCLUSIONS CAN WE DRAW FROM THIS STUDY? ...... 44
        The virus problem in companies continues to get worse ...... 44
        Virus Types ...... 45
        Perceptions of the Virus Problem ...... 46
        Virus Disasters and Costs ...... 46
        Virus disaster impact ...... 46
        Protection Strategies ...... 46
APPENDICES ...... 48
    APPENDIX A: SURVEY QUESTIONNAIRE ...... 48
    APPENDIX B: REPORTED VIRUSES FOR THE SURVEY PERIOD ...... 52
    APPENDIX C: GLOSSARY OF COMMON TERMS IN ANTI-VIRUS DISCUSSION ...... 53
    APPENDIX D: TRUSECURE ANTI-VIRUS POLICY GUIDE ...... 56
        TRUSECURE CORPORATION RECOMMENDED VIRUS CONTROLS ...... 57
            Desktop Systems ...... 57
            TruSecure Corporation Recommended Synergistic Controls at the Desktop Level ...... 58
            E-Mail Client (Desktop) Applications ...... 59
            TruSecure Corporation Recommended Primary Controls at the E-Mail Client Level ...... 59
            Synergistic Controls at the E-Mail Client Level ...... 60
            Network File and Print Servers ...... 60
            TruSecure Corporation Recommended Primary Control at the Inside Server Level ...... 60
            Synergistic Controls at the Inside Server Level ...... 60
            Utilize centralized AV management ...... 60
            E-Mail Gateways, Firewalls, Other Gateways and Anti-Spam Tools ...... 60
            TruSecure Corporation Recommended Primary Control at the Gateway Level ...... 60
            Gateway Level, Potential Synergistic Controls ...... 61
            Human Factors ...... 61
        ANNEX 1: Recommended settings for Microsoft Office ...... 63
        ANNEX 2: Recommended settings for Microsoft Outlook Security patches ...... 65
        ANNEX 3: File types for potential filtering ...... 66


 List of Figures

Figure 1: Encounters per 1,000 PCs per month – two months prior to survey ______ 6
Figure 2: Opinions of the virus problem, 2001 vs. 2000 ______ 7
Figure 3: Encounters per months, 2001 ______ 8
Figure 4: Encounters per 1,000 computers per month, 1994 - 2001 ______ 9
Figure 5: Summary of encounters per 1,000 PCs per month by types ______ 11
Figure 6: Frequency distribution of dates of last virus disasters ______ 13
Figure 7: Viruses causing most recent disaster ______ 14
Figure 8: Frequency distribution of infected PCs in virus "disasters" ______ 15
Figure 9: Comparison of observed PC infections to initial estimates ______ 16
Figure 10: Initial estimates vs. actual numbers of infected servers ______ 18
Figure 11: Frequency distribution of server downtime ______ 19
Figure 12: Loss in person-days due to disaster ______ 21
Figure 13: Effects of viruses, 2001 ______ 23
Figure 14: Comparison of virus encounter vectors ______ 25
Figure 15: Changes in virus encounter vectors ______ 26
Figure 16: Percentage of desktop PCs protected by anti-virus products ______ 28
Figure 17: Frequency distribution of PCs with disabled anti-virus products ______ 29
Figure 18: Desktop coverage by frequency of response ______ 30
Figure 19: Desktop coverage by total number of PCs ______ 31
Figure 20: Anti-virus methods in use by companies ______ 32
Figure 21: Anti-virus methods used on file servers ______ 33
Figure 22: Frequency distribution of reported perimeter anti-virus coverage ______ 34
Figure 23: Comparison of perimeter anti-virus coverage, 1997 - 2001 ______ 35
Figure 24: Anti-virus methods used on e-mail gateways by percentage ______ 36
Figure 25: Anti-virus methods used on proxy servers by percentage ______ 37
Figure 26: Anti-virus methods used at the firewall by percentage ______ 38
Figure 27: PC operating systems by frequency of response only ______ 40
Figure 28: Distribution of operating systems installed on PCs ______ 41
Figure 29: Server operating systems by total number of servers ______ 42


 List of Tables

Table 1: Monthly rate of infection per 1,000 PCs over the two months prior to each survey, 2001 ______ 5
Table 2: Top viruses for 2001, encounters per month per 1,000 PCs ______ 10
Table 3: Total encounters reported by types ______ 10
Table 4: Encounters per 1,000 PCs per month ______ 11
Table 5: Percentage of respondents experiencing virus disaster ______ 12
Table 6: Date of most recent disaster ______ 12
Table 7: Virus causing most recent disaster ______ 13
Table 8: Frequency distribution of infected PCs in virus “disasters” ______ 14
Table 9: Comparison of observed PC infections to initial estimates ______ 16
Table 10: Initial estimates vs. actual numbers of infected servers ______ 17
Table 11: Frequency distribution of server downtime ______ 19
Table 12: Frequency distribution of person-days lost ______ 20
Table 13: Frequency distribution of dollar costs ______ 22
Table 14: Effects of viruses, 2001 ______ 23
Table 15: Sources of infection, 1996 - 2001 ______ 24
Table 16: Comparison of virus activity, 1996 - 2001 ______ 26
Table 17: Percentage of PCs NOT covered by anti-virus products ______ 27
Table 18: Frequency distribution of percentage of PCs with disabled anti-virus products ______ 28
Table 19: Use of specific anti-virus products by frequency of response ______ 29
Table 20: Use of specific anti-virus products by number of PCs ______ 30
Table 21: Respondents using specific anti-virus methods ______ 31
Table 22: PCs using specific anti-virus methods ______ 33
Table 23: Frequency distributions of coverage of perimeter services by frequency ______ 34
Table 24: Anti-virus methods in use on e-mail gateways ______ 35
Table 25: Files blocked, quarantined, or filtered at the e-mail gateway ______ 36
Table 26: Anti-virus methods used on proxy servers ______ 37
Table 27: File types or objects blocked, quarantined, or filtered at the proxy server ______ 38
Table 28: Anti-virus methods used at the firewall ______ 38
Table 29: File types or objects blocked, quarantined, or filtered at the firewall ______ 39
Table 30: Distribution of PC operating systems in respondent organizations ______ 39
Table 31: Distribution of operating systems installed on PCs ______ 40
Table 32: Distribution of LAN operating systems in respondent organizations ______ 41
Table 33: Distribution of LAN operating systems installed on servers ______ 42
Table 34: Primary line of business ______ 43
Table 35: Departmental affiliations of respondents ______ 43
Table 36: Job titles ______ 44


Executive Overview

The objective of ICSA Labs’ 7th Annual Virus Prevalence Survey is to describe the existing computer virus problem in desktop computers and computer networks. The survey was organized by ICSA Labs’ content security group together with the survey’s sponsors. Trained interviewers located a random sample and gathered responses from 300 qualified respondents who worked for companies and government agencies with more than 500 PCs, two or more LANs, and at least two remote connections to the site.

What does the survey tell us? The virus problem facing corporations continues to worsen. Whatever else may be gleaned from this survey, the clearest message is simply this: companies continue to experience an increasing number of virus incidents, at higher cost, each year. The likelihood of a company experiencing a computer virus or worm approximately doubled in each survey year through 1999 and has continued to grow by roughly 15 percent per year in the two years since. This holds for both infection rates and costs, and whether one considers only the data within this survey period or compares this year’s data with previous surveys. In short, the virus (malicious code) risk is growing significantly despite persistent corporate efforts and increased protective expenditures each year.

How common are virus infections? The group of 300 organizations had 1,182,634 encounters on 666,327 machines during the 20 months of the survey period from January 2000 through August 2001. This translates to 113 encounters per 1,000 machines per month over the entire survey period.

Is the virus problem getting worse? The data showed worsening of the computer virus problem in the period of January 2000 through August 2001. In addition, global infection rates calculated from the surveys of 1996 through 2001 continued a significant annual growth rate of approximately 20 encounters per month per 1,000 PCs for each year in that period.

What are the characteristics of virus disasters? This year’s survey shows a decrease in the number of reported disasters. Only 28 percent of respondents had experienced a virus disaster (defined as 25 or more PCs or servers infected with the same virus at about the same time), compared with 51 percent in last year’s survey and 43 percent in the 1999 survey. It should be noted, however, that a major virus incident began after this year’s survey commenced. The Nimda event of September is not fully reflected because a significant number of respondents had already been surveyed. It is quite possible, if not probable, that the disaster numbers would have risen significantly had all data been gathered after the event.

What are the effects of virus disasters? In 2000, 36 percent of those reporting disasters estimated that servers were down one hour or less. By contrast, 65 percent of this year’s respondents reported downtime of one hour or less, with 53 percent claiming no server downtime at all. The average server downtime was 14 hours, while the median downtime was reported as zero hours. It is obvious from those skewed numbers that several respondents had disaster experiences requiring much longer recovery time.

More than 80 percent of those reporting a disaster required 20 person-days or less to recover; the median was four person-days. Estimated direct costs ranged from a median of $5,500 to an average of $69,000 per disaster. In-depth analysis of previous years’ studies shows a tendency for respondents to underestimate these costs: comparing these figures with studies that include cost modeling and productivity analysis reveals roughly a seven- to eight-fold underestimation. With that proportional underestimation in mind, one could extrapolate that the average company might incur between $50,000 and $500,000 in total ramifications (both soft and hard costs) per year for virus disasters.
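As a rough check on that extrapolation, the arithmetic below simply scales the reported median and average direct costs by the assumed seven- to eight-fold underestimation factor. The multipliers are the report's own estimate, not survey measurements, so this is an illustrative sketch rather than a cost model.

    # Hedged back-of-the-envelope sketch of the extrapolation described above.
    # The 7x-8x multipliers come from the report's comparison with in-depth
    # cost-modeling studies; they are estimates, not measured constants.
    median_direct_cost = 5_500      # reported median direct cost per disaster (USD)
    average_direct_cost = 69_000    # reported average direct cost per disaster (USD)

    low_estimate = median_direct_cost * 8     # ~$44,000, in the ballpark of the $50,000 lower bound
    high_estimate = average_direct_cost * 7   # ~$483,000, in the ballpark of the $500,000 upper bound

    print(f"Extrapolated total annual cost range: ${low_estimate:,} - ${high_estimate:,}")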

How are anti-virus products used? More than 95 percent of the respondents reported protection for 90 percent or more of their PCs with anti-virus products. About 90 percent stated that 100 percent of their PCs were protected. Most PCs (71 percent) were reported to be protected by full-time automatic anti-virus protection. Between them,


Network Associates and Symantec Corporation products were reported to be in use at 94 percent of the responding companies, covering about 80 percent of the PCs in this sample.

Last year, almost all respondents used either no protection or incomplete protection on network services such as firewalls, proxy servers, and e-mail servers. This year’s study shows a major change: 84 percent of respondents say that all of their e-mail servers are protected by anti-virus software, 51 percent cover all of their firewalls, and 45 percent say all of their proxy servers have anti-virus software installed. This year’s survey also asked whether companies using anti-virus software at the perimeter were additionally blocking, quarantining, or filtering files or objects. The study shows that 69 percent block, quarantine, or filter at the e-mail gateway, while only 40 percent do so at the proxy server and only 41 percent at the firewall.

How do respondents perceive the evolution of the virus problem? Nearly three-quarters of the respondents surveyed feel that the overall virus problem is either somewhat worse or much worse than last year. Without doubt, this is due to the continued increase in Internet-enabled viruses, especially those that employ a mass-mail payload.

 Objectives

The objectives of this project are to describe the computer virus problem in computer networks, including desktop computers; application and file servers; and perimeter devices such as firewalls, gateways, and proxy servers. The scope of the survey includes:
• Intel-based or Intel-compatible PCs¹
• Only sites with more than 500 PCs, two or more LANs, and two or more remote connections
• Commercial, Government, and Industrial business sectors only

 Research Methodology

Confidence

To meet the objectives of the survey, trained interviewers conducted telephone interviews and gathered 300 completed surveys of corporate end-users. This sample size provides an accuracy of ±5.5 percent at a 95 percent confidence level for questions that relate to the entire data sample. Internal consistency checking suggests that the reliability of the data may be considerably lower in some areas (with discrepancies of perhaps as much as 45 percent) where similar data were arrived at by different means and/or different questions.
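For readers who want to see where a figure like ±5.5 percent comes from, the sketch below applies the standard worst-case margin-of-error formula for a simple random sample of 300. This is an assumption about the method behind the quoted value: the formula as written gives roughly ±5.7 percent, so the published ±5.5 percent may reflect a finite-population correction or different rounding.

    import math

    # Standard worst-case margin-of-error calculation for n = 300 at 95% confidence.
    n = 300      # completed surveys
    p = 0.5      # worst-case proportion (maximizes the margin)
    z = 1.96     # z-score for a 95 percent confidence level

    margin = z * math.sqrt(p * (1 - p) / n)
    print(f"Margin of error: +/-{margin * 100:.1f} percent")   # about +/-5.7 percent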

Selection

Respondents for the survey were randomly selected from a qualified list of sites with 500 or more PCs, two or more LANs, and two or more remote connections at that site. The qualified list was procured from Harte-Hanks, Inc. The sample population included all service and Standard Industry Code (SIC) codes, as well as federal, state, and local governments. The survey explicitly excluded home, SOHO, and educational sectors.

Interview target

The interviews were conducted with the individual who claimed to be most responsible for managing virus problems on PCs or networks for the organization or site. The individual was typically found through referral, and an average of 18 calls was required to complete an interview. Only individuals with personal responsibility (in terms of virus knowledge, prevention, and software) for 200 or more PCs at the site were qualified as respondents. Respondents were assured confidentiality to maximize their responsiveness and to enhance the reliability of the survey.

¹ Data gathered for Macintosh and other non-Intel servers and workstations may be referred to, but only in an anecdotal manner in this report.

Trained interviewers

All interviewing was supervised by Gantz Wiley Research of Minneapolis, MN. Trained interviewers dialing from a centrally supervised and monitored facility conducted interviews between 7 a.m. and 5 p.m. Central Standard Time. The interviewers assigned to this project specialize in surveys of vendors, dealers, and end-users in the computer and networking industries. All responses were captured using an on-line survey system for direct tabulation and analysis.

Rounding

Occasionally percentages will total more than 100 percent because some questions allowed for multiple responses. In some cases, charts or graphs will show less than 100 percent due to exclusion of “Don’t Know”, “Refused”, or “Other” responses. In addition, rows or columns may total either 99 percent or 101 percent due to rounding errors.

Previous Work

Some of the results of this survey can be directly compared with the results of five previous surveys:
• A previous survey conducted for ICSA by Gantz Wiley Research during May through June 2000
• A previous survey conducted for ICSA by Gantz Wiley Research during March through May 1999
• A previous survey conducted for ICSA by Gantz Wiley Research during March through May 1998
• A previous survey conducted for ICSA by the Pariah Group during March 1997
• A previous survey conducted for ICSA by Network Associates Public Relations during March 1996

Some of the results can also be compared with the results of another previous survey conducted by Dataquest during October 1991. The 1991 ICSA/Dataquest survey had a similar design with similar demographics. There were, however, two primary differences:
• The 1991 survey focused on organizations with a smaller number of PCs than later surveys (the minimum was 300 PCs in 1991 compared with at least 500 for later surveys); and
• The 1991 survey had a smaller average number of PCs per respondent (about 1,000 per respondent compared with almost three times that number in 2000).

The larger cut-off for inclusion used for the 2000 survey reflects in part the growth in the installed base of PCs over the intervening period. Where appropriate, data from the 2001 survey will be compared with similar data from previous surveys.

Biases

There are several potential biases in this report.

Retrospective Study

The most important bias is that this study is retrospective. That is, those people interviewed were asked to answer questions about past events. Though most sites claimed to have formal tracking mechanisms in place, we believe that the older events are less reliably described than information that is more recent.


Moreover, older events are often under-represented (forgotten) compared to current information. Finally, it may also be true that unpleasant events are remembered less well than pleasant events, so that the past seems more positive than it actually was. This bias might enhance the perception that things are getting worse.

Correctness

Some questions referred to issues with well-known right answers; e.g., questions about policy and virus protection. Other questions asked respondents about the correctness of their estimates (e.g., comparisons of actual virus infections to initial estimates). In such cases, it is possible that respondents consciously or unconsciously adjusted their responses to look good to the interviewer or to reduce cognitive dissonance between their actual behavior and the normative behavior they felt they ought to have displayed. This bias could overestimate correctness in such questions.

Site Selection

The survey can be biased in favor of companies that have “computer virus experts” because of the initial site screening. Consequently, sites that do not have such a person may be under-represented in the survey. It may also be true that these sites did not have such a person because the virus problem was minimal there; this bias would make the problem appear worse than it really is. Unfortunately, it could also be true that under-represented sites were worse off than the sites in the sample.

Encounter Definition

The survey attempted to define “virus encounter,” yet despite interviewer training to reiterate and explain the intended usage, the definition is necessarily imprecise. We believe that some of those surveyed counted the number of files, PCs, or diskettes infected as “encounters,” rather than our intended meaning of an event in which any number of files, PCs, or diskettes were infected at the same time. Another problem with the use of “encounters” (even in the sense we intended) is that the larger the organization (and its number of computers), the more likely it is to experience “encounters.” We tried to compensate for this bias by calculating the derived quantity “Encounters per month per 1,000 PCs,” but the bias probably makes the incidence and prevalence of computer viruses appear higher than they would be with more precise reporting. The problem is especially evident for the respondents who indicated thousands of “encounters” per month per thousand PCs – almost certainly an indication of misunderstanding the question.
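To make the derived quantity concrete, the minimal sketch below computes it for a single hypothetical site. The site figures in the example are illustrative only, not survey data.

    # Minimal sketch of the normalization described above.
    def encounters_per_1000_pcs_per_month(encounters: int, pcs: int, months: int) -> float:
        """Normalize a raw encounter count by installed base and reporting period."""
        return encounters / pcs / months * 1000

    # e.g. a hypothetical site with 2,000 PCs reporting 450 encounters over 20 months
    print(encounters_per_1000_pcs_per_month(450, 2_000, 20))   # 11.25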

Familiarity

The survey tried to estimate the chance that the respondent would actually know about every virus encounter by the site. Respondents were asked the question, “What percent of virus incidents in your group are you informed of or likely to know about?” However, based solely on anecdotal experience of what happens to anti-virus reporting in organizations, we surmise that a remote employee who encountered a virus for which the appropriate actions were already well known (because of past experience) would be less likely to report the incident to the respondent.


Therefore, we think that common viruses may be under-reported compared with newer or less familiar viruses or those that have recently hit the headlines at the time of questioning.

Principal Findings

2001 Demographics

The 2001 survey represents a total of 666,327 PCs and 26,492 file and application servers. The average site in the survey had 2,221 PCs (the median was 1,000) and 90 file and application servers (the median was 30).

How Common Are Virus Infections?

All of the companies responding to the survey experienced at least one virus encounter during the survey period.

The group of 300 organizations had 1,182,634 encounters on 666,327 machines during the 20-month period covering calendar year 2000 and January through August 2001. This translates to 113 encounters per 1,000 machines per month over the survey period, with a rate of 103 encounters per 1,000 PCs per month by the end of the survey period. This rate represents the sixth consecutive year of increase.

Table 1 compares the 1996 – 2001 survey data for the two months immediately preceding each survey. As Table 1 shows, virus encounters have, in general, been increasing steadily. These data were arrived at by averaging the infection rates reported for the two months prior to each survey. The two months prior to the 2001 survey (July and August) were selected for comparison because proximity in time historically produces the greatest accuracy in respondent estimates.

Table 1: Monthly rate of infection per 1,000 PCs over the two months prior to each survey, 2001

Survey Year   Jul – Aug
1996          10
1997          21
1998          32
1999          80
2000          91
2001          103

These figures show an increase of approximately 12 infections per 1,000 machines per month each year through 1998 and again from 1999 to 2001. In 1999, there was a surge in the encounter rate, no doubt the result of the “mass mail” payload of macro viruses, Internet worms, and the scripting viruses that followed. Figure 1 below is a graphical representation of these data. A linear regression analysis of these global figures shows an annual growth of roughly 20 encounters per month per 1,000 PCs for each year over the study period, with an R² of approximately 0.94; a trend line added to the chart represents this analysis. Another way to look at these data is that the number of incidents per site per month roughly doubled each year through 1999. In the years since 1999, the rate has grown at a compound rate of about 15 percent per year.
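Readers who want to reproduce the trend analysis can fit a line to the yearly rates in Table 1. The sketch below uses the rounded table values as an approximation of the data actually plotted (the published fit was presumably computed from the underlying survey figures); it yields a slope of roughly 20 encounters per 1,000 PCs per month per year and an R² of about 0.94, in line with the y = 20.514x - 40940, R² = 0.9416 trend line shown in Figure 1.

    import numpy as np

    # Reproduce the linear trend in Figure 1 from the (rounded) Table 1 rates.
    years = np.array([1996, 1997, 1998, 1999, 2000, 2001])
    rates = np.array([10, 21, 32, 80, 91, 103])   # encounters per 1,000 PCs per month

    slope, intercept = np.polyfit(years, rates, 1)
    predicted = slope * years + intercept
    r_squared = 1 - np.sum((rates - predicted) ** 2) / np.sum((rates - rates.mean()) ** 2)

    print(f"slope = {slope:.1f} encounters/1,000 PCs/month per year, R^2 = {r_squared:.2f}")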


Figure 1: Encounters per 1,000 PCs per month – two months prior to survey

[Line chart titled “Two Months Prior to Survey”: encounters per 1,000 PCs per month, 1996 – 2001, with linear trend line y = 20.514x - 40940, R² = 0.9416.]

Chance of a Disaster

For the purposes of this survey, a virus disaster was defined as an incident in which 25 or more machines, media, or files experienced a single virus on or about the same time. Respondents were asked if their company had experienced a “disaster” during the survey period. Of the respondents who answered this question, 84 (28 percent) responded in the affirmative.

Respondent Perception of the Virus Problem

Interviewers asked respondents to give their opinion of the state of the virus problem in 2001 compared with 2000. They were given a scale of answers ranging from Much Worse to Much Better. Figure 2 below graphically represents those answers.


Figure 2: Opinions of the virus problem, 2001 vs. 2000

[Pie chart titled “Opinions of the Virus Problem in 2001”: share of respondents answering Much worse, Somewhat worse, About the same, Somewhat better, Much better, and Don’t know; 72 percent answered much worse or somewhat worse.]

The respondents were clearly inclined to believe that the problem of computer viruses in general was worse than in 2000. Only 28 percent felt the problem was about the same or better; conversely, almost three-quarters (72 percent) felt the problem was somewhat worse or much worse.


Detailed Findings

Virus Encounters

As reported above, virus encounters continue to rise. Figure 3 below gives us a picture of the survey period January 2000 through August 2001. In Figure 3, the survey time periods are indicated by four Series as listed below:

• Series 1: Jul - Aug 2001 - previous two months of survey
• Series 2: Jan - Jun 2001 - six months prior to Series 1
• Series 3: Jul - Dec 2000 - six months prior to Series 2
• Series 4: Jan - Jun 2000 - six months prior to Series 3

Figure 3: Encounters per months, 2001

[Bar chart titled “Encounters per Month per 1000 PCs”: the four 2001 survey periods (Series 4 through Series 1); plotted values are 97, 103, 109, and 133, with Series 1 (Jul - Aug 2001) at 103.]

Figure 4 gives a more detailed view of this rise by comparing it with the comparable data from the previous five surveys, 1996 – 2000. The chart presents these data as monthly rates centered on the mid-month of each surveyed period, charted as virus encounters per 1,000 PCs per month. It shows that the likelihood of a medium-to-large company experiencing a virus encounter grew from less than one encounter per 1,000 PCs per month in mid-1994 to about 103 encounters per 1,000 PCs per month in July/August 2001.

Figure 4 uses the same Series construct for comparable periods in the previous surveys as referenced above.


Figure 4: Encounters per 1,000 computers per month, 1994 - 2001

[Grouped bar chart: encounters per 1,000 computers per month for Series 1 through Series 4 of each survey, 1996 – 2001.]

Top Reported Viruses

Some viruses are clearly more likely to occur and spread than others. Additionally, viruses of certain types, viruses that use particular infection vectors, and viruses with particular payloads are growing in prevalence while others are in decline. Respondents were asked which viruses affected their group during the period of January 2001 through August 2001. This period was divided into three segments: August 2001, July 2001, and January – June 2001. Respondents were asked, “Which viruses have affected your group's PCs during [a specified period]?” and then, “How many times [were you affected]?”

Because of the large number of viruses and their many variants (approximately 60,000 or more known viruses and variants), the often cryptic virus names, the lack of a standardized naming convention in the anti-virus industry, and possibly poor record keeping, respondents were not always able to identify viruses accurately. In all instances, every effort was made to identify individual responses at least to the virus family name. Where only partial names or virus types were given, the data were pooled as “[Type], unspecified.”

The prevalence data for the most common viruses encountered in the survey are shown as encounters per month per 1,000 PCs for each of the survey periods in Table 2 below. These data were sorted and ranked by summing the encounter rate per 1,000 PCs for the three survey periods in 2001. Only those viruses that had a composite encounter rate of at least one per 1,000 PCs were considered for this listing. A complete listing of reported viruses can be found in Appendix B along with a chart showing encounter rates.
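To make the ranking rule concrete, the sketch below orders a few of the Table 2 rows by their composite (summed) encounter rate and drops anything below one encounter per 1,000 PCs, mirroring the procedure described above. It is an illustration of the method, not the code actually used to prepare the table.

    # Hedged sketch of the ranking rule: sort by the sum of the three period rates,
    # keep only composites of at least 1 encounter per 1,000 PCs.
    rates = {
        # virus: (Aug 2001, Jul 2001, Jan-Jun 2001) encounters per 1,000 PCs per month,
        # taken from a few rows of Table 2
        "Sircam":     (39.60, 37.163, 3.513),
        "LoveLetter": (29.167, 23.558, 8.242),
        "Kak":        (0.932, 0.855, 0.302),
    }

    ranked = sorted(
        ((name, sum(periods)) for name, periods in rates.items() if sum(periods) >= 1.0),
        key=lambda item: item[1],
        reverse=True,
    )
    for rank, (name, composite) in enumerate(ranked, start=1):
        print(rank, name, round(composite, 3))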


Table 2: Top viruses for 2001, encounters per month per 1,000 PCs

2001 Rank   Virus Name          August 2001   July 2001   Jan – June 2001
1           Sircam              39.60         37.163      3.513
2           LoveLetter          29.167        23.558      8.242
3           Homepage            21.378        1.785       0.017
4           Funlove             5.589         5.577       0.914
5           Anna Kournikova     7.677         0.060       0.901
6           Macro-Unspecified   6.485         1.531       0.223
7           Magistr             2.277         3.716       1.743
8           Hybris              0.408         1.325       2.766
9           (not specified)     0.068         1.266       2.706
10          CodeRed             3.351         0.254       0.389
11          Class               1.384         2.191       0.006
12          Story               1.425         1.650       0.053
13          Worm-Unspecified    2.652         0.011       0.014
14          VBS.SST             0.000         2.364       0.001
15          Kak                 0.932         0.855       0.302
16          Divi                0.008         1.628       0.003

How is Prevalence Changing?

While macro viruses continue to be prevalent, they are fast being outstripped by viruses with a “mass mailing” payload: Win32 viruses, Visual Basic Script viruses, and JavaScript viruses. Additionally, while we did have responses that identified three of our “old friends” (AntiExe, Monkey.B, and Stealth), we again do not find boot-sector infectors among the most common viruses.

In order to achieve a more detailed picture of the changes in prevalence, we classified the reported viruses by types. Table 3 shows the total number of encounters.

Table 3: Total encounters reported by types

Virus Type      August 2001   July 2001   Jan-Jun 2001
File            15,347        8,655       10,251
Macro           14,125        5,961       12,163
VBScript Worm   10,656        911         953
Internet Worm   2,374         318         3,426
Jscript Worm    383           168         286
Boot            5             45          70
Trojan          5             25          51
Joke            0             0           13


Table 4 presents these data as encounters per 1,000 PCs per month.

Table 4: Encounters per 1,000 PCs per month

Class           August 2001   July 2001   Jan-Jun 2001
File            42.09         46.82       6.28
Macro           38.70         32.25       11.81
VBScript Worm   29.19         4.93        0.93
Internet Worm   6.50          1.72        3.33
Jscript Worm    1.05          0.91        0.30
Boot            0.01          0.24        0.11
Trojan          0.01          0.14        0.20
Joke            0.00          0.00        0.00

Figure 5 below gives a graphical look at these data with one addition: a category has been added to represent viruses that have a “self mailing” payload. Viruses with mail payloads began to appear with some regularity in late 1998 with Happy99 (Win32/Ska) and gained far more publicity in March 1999 with the W97M/Melissa incident. That trend has continued, as can be seen in Figure 5. Mass-mail payloads have been seen in macro viruses, Win32 file viruses, Visual Basic Script viruses, JavaScript viruses, and Trojan horse programs.

Figure 5: Summary of encounters per 1,000 PCs per month by types

[Bar chart: encounters per 1,000 PCs per month for August 2001, July 2001, and Jan-Jun 2001, broken out by type (Auto Mailer, File, Macro, VBScript Worm, Internet Worm, Jscript Worm, Boot, Trojan, Joke).]


Virus Disasters

Respondents were asked, “Has your group had a virus disaster (25 or more PCs/Servers infected at once) anytime since January 2001?” Table 5 shows that 28 percent did experience such an event.

Table 5: Percentage of respondents experiencing virus disaster

Answer       Frequency   %
Yes          84          28%
No           211         70%
Don't know   0           0%
Refused      5           2%
Total        300         100%

Date of last virus disaster

Respondents were asked the month and year of their latest virus disaster. Table 6 presents these as a frequency distribution.

Table 6: Date of most recent disaster

Date             Frequency   %
September 2001   31          37%
August 2001      16          19%
July 2001        8           10%
June 2001        1           1%
May 2001         3           4%
April 2001       1           1%
March 2001       7           8%
February 2001    6           7%
January 2001     1           1%
Don't Know       10          12%
Total            84          100%

Of the respondents reporting disaster incidents, 88 percent placed those disasters in calendar year 2001, and 37 percent reported them in September, the month of the interviews. Of the 31 respondents reporting disasters in September, 26 were victims of the Nimda virus. Nimda was discovered during the second week of interviews and is probably underrepresented in this survey.


Figure 6: Frequency distribution of dates of last virus disasters


The frequency distribution of responses shows a strong increase beginning with July. This rapid increase is probably due to three viruses: Sircam, first discovered in July; CodeRed II, discovered in August; and Nimda, discovered in September.

Which virus caused the most recent disaster?

Participants were asked to identify the virus responsible for their most recent disaster. Table 7 lists these viruses, their frequency, and the number of PCs involved.

Table 7: Virus causing most recent disaster

Virus Name        Frequency   PCs Involved
Nimda             28          138,650
LoveLetter        12          15,050
Sircam            11          15,232
Don't Know        9           40,200
Anna Kournikova   6           21,550
CodeRed           4           15,900
Funlove           4           40,800
Apost             3           5,500
Homepage          3           4,200
Melissa           3           5,280
MTX               1           450
Total             84          302,812


Figure 7 below portrays these data and the percentage of disasters caused by each virus. Two things should be noted here: first, the survey was underway when the Nimda outbreak occurred, which may have caused under-reporting of other computer viruses that were prevalent at the time (i.e., Sircam and CodeRed); second, two of the named viruses, Anna Kournikova and Homepage, belong to the VBSWG family and could be consolidated. We have chosen to represent them separately due to their prevalence throughout the survey.

Figure 7: Viruses causing most recent disaster


How many PCs were infected in disasters?

Table 8 gives the frequency distribution of the number of PCs infected in individual disasters.

Table 8: Frequency distribution of infected PCs in virus “disasters”

PCs per Disaster   Frequency   %
25                 21          25%
50                 28          33%
100                9           11%
200                12          14%
300                5           6%
400                2           2%
500                0           0%
1,000              2           2%
2,000              2           2%
3,000              0           0%
4,000              0           0%
5,000              0           0%
>5,000             3           4%
Total              84          100%


Figure 8 portrays the data from Table 8 above.

Figure 8: Frequency distribution of infected PCs in virus "disasters"


70 percent of the respondents reported that they observed 100 or fewer PCs infected at or about the same time. The arithmetic mean (average) number of affected PCs was 2,301, while the median was 50. Three large reports – 7,200, 18,000, and 142,000 machines – produce a significantly skewed distribution, making the average number of affected PCs a poor measure of central tendency.
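To illustrate how heavily those outliers skew the mean, the sketch below approximates the 84 disaster reports using the Table 8 bucket values plus the three large reports called out above. Because bucket values only approximate the individual responses, the computed mean lands near, rather than exactly at, the reported average of 2,301; the median of 50 matches exactly.

    import statistics

    # Approximate reconstruction of the skew: Table 8 buckets stand in for the
    # individual reports, plus the three large outliers named in the text.
    reports = [25] * 21 + [50] * 28 + [100] * 9 + [200] * 12 + [300] * 5 + [400] * 2 \
            + [1_000] * 2 + [2_000] * 2 + [7_200, 18_000, 142_000]

    print("n      =", len(reports))                       # 84 disaster reports
    print("mean   =", round(statistics.mean(reports)))    # roughly 2,150 with bucketed values
    print("median =", statistics.median(reports))         # 50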

Accuracy of initial estimates

An interesting statistic is the comparison of the actual number of PCs involved with respondents’ initial estimates. Table 9 and Figure 9 show that approximately 80 percent of respondents initially underestimated the extent of the problem, while only about 20 percent initially estimated it correctly or overestimated it.


Table 9: Comparison of observed PC infections to initial estimates

PCs      Initial Estimate   Actual
25       36                 21
50       21                 28
100      8                  9
200      7                  12
300      3                  5
400      0                  2
500      2                  0
1,000    1                  2
2,000    2                  2
3,000    1                  0
4,000    0                  0
5,000    0                  0
>5,000   1                  3

It should be noted that it is somewhat difficult to determine the reliability of such responses. The reasons for such difficulty are outlined under Biases in Research Methodology in this report.

Figure 9: Comparison of observed PC infections to initial estimates



Server encounters seem to be almost as difficult to estimate. For the respondents who experienced a disaster in which servers were infected, the median number of servers actually infected was two, and this was the same as the number of servers initially thought to be infected. The averages were once again skewed by one or two large estimates; the actual average was 23 servers while the estimated average was 75. These data are presented in Table 10 and Figure 10.

Table 10: Initial estimates vs. actual numbers of infected servers

Servers   Initial Estimate   Actual
1         9                  8
2         11                 11
3         6                  6
4         3                  3
5         2                  2
6         3                  3
7         0                  0
8         1                  2
9         0                  0
10        0                  1
15        5                  3
20        2                  4
25        1                  1
30        1                  1
35        0                  0
40        1                  0
50        1                  2
60        0                  0
70        0                  0
80        0                  0
90        0                  0
100       0                  0
>100      3                  2


Figure 10: Initial estimates vs. actual numbers of infected servers

What are the effects on victims of virus disasters?

How long were servers down?

Table 11 and Figure 11 show the pattern of responses on the question of how long servers were down after a virus disaster. Of those responding with servers involved, 53 percent reported NO server downtime.


Table 11: Frequency distribution of server downtime

Hours   Frequency   %
0       41          53%
1       9           12%
2       4           5%
3       3           4%
4       3           4%
5       1           1%
10      4           5%
20      1           1%
30      2           3%
40      3           4%
50      2           3%
100     2           3%
200     2           3%
300     0           0%
400     1           1%
500     0           0%
Total   78          100%

Of the companies reporting server downtime, 12 percent reported downtime between zero and one hour. The arithmetic mean was somewhat low at 14 hours, and was not as affected by a few larger reports as it has been in the past. The median was extremely low (zero hours). Fully 83 percent of responding companies reported ten hours of downtime or less.

Figure 11: Frequency distribution of server downtime

[Bar chart titled “Distribution of Downtime Hours”: percentage of responses for each downtime value in Table 11.]


What was the cost of the disaster in person-days?

Respondents were asked how many cumulative person-days were lost during the disaster that affected their company.

Table 12 and Figure 12 show these results.

Table 12: Frequency distribution of person-days lost

Days    Frequency   %
0       6           9%
1       9           14%
2       11          17%
3       6           9%
4       3           5%
5       5           8%
6       0           0%
7       1           2%
8       2           3%
9       0           0%
10      2           3%
20      7           11%
30      5           8%
40      2           3%
50      2           3%
60      1           2%
70      0           0%
80      1           2%
90      0           0%
100     1           2%
200     1           2%
300     1           2%
Total   66          100%

Only 66 of the participants felt they could respond to this question. Follow-up questioning found that those who chose not to answer did so either because of corporate policy or because the disaster was too recent for data to be available. Even so, of the 84 companies reporting disasters, almost 79 percent were able to respond. Of those responding, 68 percent reported ten person-days or less. Again, outliers distorted the averages: the arithmetic mean was 19 person-days, while the median was four person-days.


Figure 12: Loss in person-days due to disaster


What was the cost in dollars to your company?

Respondents were asked to estimate the cost in dollars for their latest disaster. Each respondent was asked to consider ALL costs, including employee downtime, overtime to recover, lost opportunity, etc.


Only 24 participants responded with data. Table 13 shows these responses.

Table 13: Frequency distribution of dollar costs

Cost          Frequency   %
$100          9           28%
$200          1           3%
$300          0           0%
$400          1           3%
$500          2           6%
$1,000        1           3%
$2,000        0           0%
$2,500        0           0%
$3,000        0           0%
$5,000        2           6%
$10,000       5           16%
$20,000       2           6%
$30,000       2           6%
$40,000       0           0%
$50,000       1           3%
$100,000      2           6%
$200,000      2           6%
$300,000      0           0%
$400,000      0           0%
$500,000      1           3%
$1,000,000    1           3%
Total         32          100%

The average dollar cost was more than $69K, while the most frequent answer was $10K and the median was only $5.5K. Again, this skewing is due to a few extremely large reports, one as high as $1,000,000. These data should not be used to set cost trends, for several reasons, not the least of which is the small sample of respondents.

What impact do viruses have?

What are the organizational effects of viruses?

Virus disasters and encounters have organizational effects beyond the money, resources, and effort required to recover from such incidents. We asked the participants what organizational effects the latest disaster or encounter had on the company or working group. Table 14 lists these data in decreasing response frequency. Percentages add up to more than 100 percent due to multiple answers per company.


Table 14: Effects of viruses, 2001

Answer                                        Frequency   %
PC was unavailable to the user                210         70%
Loss of productivity (personnel & machine)    208         69%
Corrupted files                               165         55%
Loss of access to data                        123         41%
Lost data                                     111         37%
Loss of user confidence in the system         100         33%
Screen message, interference, lockup          51          17%
Trouble reading files                         36          12%
Unreliable applications                       31          10%
System crash                                  29          10%
Trouble printing                              24          8%
Trouble saving files                          23          8%
Other                                         10          3%
Threat of someone losing their job            7           2%
Don't know                                    3           1%
None                                          2           1%
Total respondents                             300

Figure 13 represents these data in the form of a bar chart.

Figure 13: Effects of viruses, 2001



Where Do They Come From?

Respondents were asked to identify the means of infection for their most recent virus incident, disaster, or encounter. In this survey, respondents could indicate more than one avenue of infection, so totals may exceed 100 percent. A comparison with the 1996 through 2001 surveys is provided below in Table 15 and graphically in Figure 14.

Table 15: Sources of infection, 1996 - 2001

Source of Virus              1996   1997   1998   1999   2000   2001
E-mail Attachment            9%     26%    32%    56%    87%    83%
Internet Downloads           10%    16%    9%     11%    1%     13%
Web Browsing                        5%     2%     3%     0%     7%
Don't Know                   15%    7%     5%     9%     2%     1%
Auto Software Distribution   0%     2%     1%     0%     0%     2%
Other Vector                 0%     5%     1%     1%     1%     2%
Diskette: other              21%    27%    21%    9%     2%     1%
Diskette: From Home          36%    42%    36%    25%    4%     0%
Diskette: Sales Demo         11%    8%     4%     2%     0%     0%
Diskette: Wrapped SW         2%     4%     2%     0%     0%     0%
Diskette: LAN mgr/spvr       1%     3%     1%     0%     0%     0%
Diskette: repair/service     3%     3%     3%     2%     0%     0%
Diskette: malicious person   0%     1%     1%     0%     0%     0%
CD: software distribution    0%     1%     2%     0%     1%     0%
Total Respondents (2001): 300

These data highlight some important trends. At first glance, it is somewhat surprising that the e-mail vector decreased slightly after dramatic growth through 2000. However, it must be remembered that August gave rise to CodeRed II and September (the month of data collection) gave us Nimda, both of which could spread through Internet mechanisms other than e-mail. We will discuss this in a later section. One will also note a sharp increase in the web browsing and Internet download vectors. In addition, the sharp decline in diskette-borne encounters seen over the last two years has continued.


Figure 14: Comparison of virus encounter vectors

[Horizontal bar chart comparing the percentage of encounters attributed to each vector in Table 15 across the 1996 – 2001 surveys.]


Changes in Virus Distribution Mechanisms

To focus on the major changes in virus distribution vectors, we have grouped the various vectors into four categories: e-mail, diskettes, Internet/non-e-mail, and all other vectors. We compare this year with the previous surveys, 1996 - 2000. Table 16 and Figure 15 show these comparisons.

Table 16: Comparison of virus activity, 1996 - 2001

Source               1996   1997   1998   1999   2000   2001
Diskette             74%    88%    67%    39%    7%     1%
E-mail               9%     26%    32%    56%    87%    83%
Internet/Non-email   12%    24%    14%    16%    2%     20%
Other                15%    7%     5%     9%     2%     1%

Figure 15: Changes in virus encounter vectors

[Chart showing the shift among the four grouped vectors (Diskette, E-mail, Internet/Non-email, Other) from 1996 to 2001, per Table 16.]

While the decline in the frequency of boot-sector infectors is quite apparent, it must be stated that there are potential problems with these data; some are covered in the Biases section of the Research Methodology portion of this report. Another is the fact that an older, well-known virus might be detected and cleaned by an anti-virus product without user intervention. In such a case, the effectiveness of the anti-virus product in use would decrease the probability of the incident being reported, especially if there was no significant data, revenue, or production loss. Our experience also shows that viruses causing unpleasant experiences, such as loss of data, massive infection, and/or difficult removal, are the most likely to be recorded, leading to potential under-reporting of some other viruses.

Usage of Anti-Virus Products

Overall Level of Usage

The survey attempted to determine what percentage of PCs were covered by anti-virus products. Interviewers did this by asking what percentage was NOT covered. Table 17 shows the responses in a frequency distribution table.

Table 17: Percentage of PCs NOT covered by anti-virus products

No Coverage   Frequency   Percentage
0%            210         72%
10%           59          20%
20%           7           2%
30%           2           1%
40%           3           1%
50%           3           1%
60%           0           0%
70%           0           0%
80%           1           0%
90%           2           1%
100%          6           2%
Total         293         100%

Of the 293 respondents, six reported that NO anti-virus products were installed. Figure 16 shows the same data expressed as the percentage of PCs that were covered by anti-virus products and includes the cumulative percentage covered. These data show that 92 percent of the respondents covered 90 percent or more of their PCs with anti-virus products, while 72 percent stated that all of their PCs were protected. It appears that once organizations have recognized the value of anti-virus products, they try to protect all of their PCs.


Figure 16: Percentage of desktop PCs protected by anti-virus products


Anti-virus products disabled on PCs

Respondents also were asked what percentage of PCs with installed anti-virus software had the anti-virus software disabled. Table 18 shows the frequency distribution of responses.

Table 18: Frequency distribution of percentage of PCs with disabled anti-virus products

% Disabled   Frequency   %
0%           237         84%
10%          32          11%
20%          2           1%
30%          0           0%
40%          0           0%
50%          0           0%
60%          1           0%
70%          0           0%
80%          0           0%
90%          0           0%
100%         10          4%
Total        282         100%


Figure 17 shows these results in a graphical form.

Figure 17: Frequency distribution of PCs with disabled anti-virus products

[Bar chart titled “Anti-Virus Software Installed But Not Running (Frequency of Response)”: response percentages from Table 18 with a cumulative-percentage line.]

These data show that very few of the respondents felt that significant numbers of the PCs for which they were responsible had anti-virus products that were disabled. Eighty-four percent answered that none of their PCs had disabled anti-virus products, and over 95 percent claimed that no more than 10 percent were disabled. This fits well within the range seen in previous surveys, although it is somewhat higher than last year.

Anti-virus products employed on PCs

We asked the respondents what anti-virus products they used. Table 19 and Figure 18 show the results of the survey taking into consideration only whether a respondent reported using one or more specific anti-virus products on the organization's PCs. Totals equal more than 100 percent as some companies have more than one product in use.

Table 19: Use of specific anti-virus products by frequency of response

Product                 Frequency   %
Network Associates      150         50%
Symantec Corporation    133         44%
Trend Micro              23          8%
Computer Associates      17          6%
Command Software         10          3%
Don't know                4          1%
None                      3          1%
Sophos, Inc.              2          1%
Total respondents       300        >100%


Figure 18: Desktop coverage by frequency of response


Table 20 shows the results of the same question, this time expressed as the total number of PCs that respondents reported were running each anti-virus product.

Table 20: Use of specific anti-virus products by number of PCs

Product                 Total PCs   %
Symantec Corporation    316,015     42%
Network Associates      285,501     38%
Trend Micro              62,020      8%
None                     49,000      6%
Computer Associates      26,772      4%
Command Software          9,554      1%
Don't know                5,150      1%
Sophos, Inc.              1,600     <1%
Total PCs               755,612²   100%

² This totals to more than the 666,327 desktops represented by all companies. Further querying revealed that some companies used multiple products on their desktops; hence the larger number.


Figure 19 shows these same data as a bar chart.

Figure 19: Desktop coverage by total number of PCs


Between them, Network Associates' McAfee VirusScan and Symantec Corporation's Norton AntiVirus accounted for 94 percent of responding companies and about 80 percent of the total number of PCs in this sample. It would be prudent, however, before drawing too many conclusions from these data, to review the biases described in the Research Methodology section of this report.

Anti-virus mechanisms in use on PCs

Respondents were asked what mechanisms were in use on their anti-virus-protected PCs. Table 21 shows their responses.

Table 21: Respondents using specific anti-virus methods

Method                                                  Frequency
Anti-virus software scans full time in background       213
Anti-virus software scans every boot-up                 181
Users check diskettes and downloads for viruses         168
Other periodic anti-virus detection on the desktop      138
Anti-virus software scans every login                    79
Other full-time anti-virus detection on the desktop      57
Total respondents                                        300


Figure 20 shows the data from Table 21. Most companies (71 percent) used full-time background protection at the desktop and 60 percent were protected by auto scan at boot-up.

Figure 20: Anti-virus methods in use by companies


Perhaps even more interesting than the number of organizations using various methods was the number of PCs on which these methods were applied. Using the data detailing how many PCs each respondent was responsible for, we computed the total number of PCs in the sample of respondents who gave numerical estimates of the usage of various methods. Using this total, we were able to compute the proportion of PCs where those methods were used. Table 22 lists the total PCs using various methods.


Table 22: PCs using specific anti-virus methods

Method                                                                         PCs       %
Anti-virus software scans hard drive for viruses full time in the background   381,511   57%
Anti-virus software scans hard drive for viruses every boot-up                 330,008   50%
Other periodic anti-virus detection on the desktop                             275,848   41%
Users check diskettes and downloads for viruses                                265,124   40%
Anti-virus software scans hard drive for viruses every login                   191,917   29%
Other full-time anti-virus detection on the desktop                            111,584   17%
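The percentages in Table 22 are PC-weighted rather than company-weighted, which is why they differ from the company percentages behind Figure 20. The sketch below shows that weighting using invented respondent data (the numbers are hypothetical, for illustration only):

    # PC-weighted usage: share of PCs covered by a method, not share of companies.
    # The respondent data here is hypothetical.
    respondents = [
        # (PCs the respondent is responsible for, PCs using full-time background scanning)
        (5000, 5000),
        (1200,  600),
        ( 300,    0),
    ]
    total_pcs = sum(pcs for pcs, _ in respondents)
    pcs_using_method = sum(using for _, using in respondents)
    print(f"{pcs_using_method / total_pcs:.0%} of PCs use full-time background scanning")
    # Companies using the method: 2 of 3 (67%); PCs using it: 5,600 of 6,500 (86%).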

Server anti-virus methods

We asked what methods were used on file and application servers. Figure 21 shows the response for both the percentage of responding companies and the total number of servers using particular methods.

Figure 21: Anti-virus methods used on file servers

[Chart: "Server Methods by Companies and Total Servers", covering periodic scanning of the hard drive, full-time background scanning, and other methods, shown both by frequency of response and by percent of servers.]

Anti-virus usage on perimeter services

Respondents were asked what percentage of their e-mail servers, proxy servers, and firewalls were covered by anti-virus methods. Table 23 shows the distribution of their responses.


Table 23: Frequency distribution of anti-virus coverage of perimeter services

% Coverage   E-mail   Proxy   Firewalls
0%            40      130     115
10%            0        0       0
20%            0        0       1
30%            0        0       0
40%            0        0       0
50%            1        1       1
60%            0        0       0
70%            0        1       0
80%            1        0       0
90%            2        0       0
100%         237      108     124
Total        281      240     241

Figure 22 shows these results in a three-dimensional chart.

Figure 22: Frequency distribution of reported perimeter anti-virus coverage


As has been the case for the previous three years, these data show a strongly bi-modal distribution. Almost all respondents use either no protection on perimeter gateway services or cover almost all of their perimeter services. Only a very small percentage claim percentages between 10 percent and 90 percent.

Figure 23 below charts the growth of reported perimeter protection since the 1997 survey.


Figure 23: Comparison of perimeter anti-virus coverage, 1997 - 2001

[Line chart: percentage of perimeter anti-virus coverage for e-mail gateways, firewalls, and proxy servers, 1997 through 2001.]

Even though increases in perimeter coverage have been reported since 1997, it is still alarming that companies have not embraced perimeter coverage to the extent they have desktop and network server virus protection. While e-mail gateways are closest to the necessary coverage, companies still lag in adopting the necessary levels of protection at the Internet perimeter. While perimeter protection is not a replacement for desktop and server protection, it is another layer of protection and a key component of a complete corporate virus protection strategy. With the already demonstrated increased prevalence of viruses using a mass-mail payload and this year's arrival of viruses using other Internet ports and protocols as infection vectors, perimeter protection is quite possibly the best protection for corporate assets. We discuss this in more detail in a later section.

Perimeter anti-virus methods

Respondents were also queried about the anti-virus methods used at their Internet perimeter. The following tables and charts represent their answers. Table 24 lists the responses, while Figure 24 graphs the method usage by percentage.

Table 24: Anti-virus methods in use on e-mail gateways

Method                                                          Frequency   %
Anti-virus software scans all messages in real time             230         77%
Anti-virus software scans message folders and databases         177         59%
Block, filter, or quarantine e-mail attachments by file type    207         69%
Other anti-virus software protection methods                     17          6%
Total respondents                                                300


Figure 24: Anti-virus methods used on e-mail gateways by percentage


We also asked the respondents what types of files they filtered or quarantined at the gateway. Table 25 lists a summary of those comments.

Table 25: Files blocked, quarantined, or filtered at the e-mail gateway

File types: avi, bat, bin, com, eml, exe, gif, hlp, hta, http, jpeg, jpg, js, lnk, mov, mp3, mpeg, pe, pif, pis, scr, shs, vb?, vbe, vbs, wav, zip. Broader categories: all MS Office files, all scripts, all video, all video files, all audio, all attachments, worms, e-mail body by keyword, e-mail by subject.
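This kind of gateway-level filtering is essentially a blocklist on attachment extensions. The fragment below is a minimal, hypothetical illustration (not taken from any product mentioned in the survey) of how such a filter behaves:

    # Hypothetical extension-based attachment filter of the kind respondents
    # describe for e-mail gateways; not the implementation of any specific product.
    import os

    BLOCKED_EXTENSIONS = {".exe", ".com", ".scr", ".pif", ".vbs", ".vbe",
                          ".js", ".shs", ".bat", ".lnk", ".hta", ".eml"}  # subset of Table 25

    def should_quarantine(attachment_filename: str) -> bool:
        """Return True if the attachment's extension is on the block list."""
        _, ext = os.path.splitext(attachment_filename.lower())
        return ext in BLOCKED_EXTENSIONS

    print(should_quarantine("invoice.vbs"))   # True  -> quarantine
    print(should_quarantine("report.txt"))    # False -> deliver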


Table 26 lists the responses for anti-virus methods used on proxy servers, and Figure 25 graphically depicts this in percentages of respondents.

Table 26: Anti-virus methods used on proxy servers

Method                                            Frequency   %
AV scans all traffic in real time                  76         55%
Block, filter, or quarantine files by file type    55         40%
Other methods                                       8          6%
Total respondents                                 139

Figure 25: Anti-virus methods used on proxy servers by percentage



Table 27 lists those file types and objects blocked, quarantined, or filtered at the proxy server.

Table 27: File types or objects blocked, quarantined, or filtered at the proxy server

File types: com, dat, dbs, dll, eml, exe, gif, hta, jpeg, jpg, mpegs, nws, pif, readme.eml, readme.exe, scr, shs, tif, vb?, vbe, vbs. Broader categories: all script files, no downloads allowed through the proxy.

Table 28 lists the anti-virus methods used on the firewalls, and Figure 26 gives us a graphical representation of the method percentages of total respondents.

Table 28: Anti-virus methods used at the firewall

Method                                                 Frequency   %
Anti-virus software scans all traffic in real time      94         53%
Block, filter, or quarantine files by file type          73         41%
Other anti-virus protection methods                       9          5%
Total respondents                                       176

Figure 26: Anti-virus methods used at the firewall by percentage



Table 29 lists those file types or objects blocked or filtered at the firewall.

Table 29: File types or objects blocked, quarantined, or filtered at the firewall

File types: bmp, com, dat, dbs, dll, eml, exe, gif, hta, jpeg, jpg, js, mpegs, nws, pif, scr, shs, tif, vb?, vbe, vbs. Broader categories: all scripts, all sound, all video, monitoring of specific protocols, monitoring of specific ports.

PC operating systems

Respondents reported on all of the operating systems they used in their corporations for PCs. Table 30 and Figure 27 show the results.

Table 30: Distribution of PC operating systems in respondent organizations

OS                  Frequency   Percent
Windows NT          263         88%
Windows 2000        244         81%
Windows 95          203         68%
Unix                 97         32%
Macintosh            87         29%
Windows 98           80         27%
Linux                75         25%
DOS                  24          8%
Windows 3.1          19          6%
OS/2                 17          6%
Others                6          2%
Windows ME            2          1%
Windows XP            2          1%
Solaris               1         <1%
Total Respondents   300

The percentages add up to more than 100 percent because many respondents used multiple operating systems within their organizations. Almost all organizations had some PCs running Microsoft operating systems, including DOS, Windows 3.x, Windows 95/98, Windows ME, Windows NT, Windows 2000, and even Windows XP; 32 percent reported some Unix-based systems; and 29 percent of companies reported usage of Macintosh systems.


Figure 27: PC operating systems by frequency of response only


These figures do not represent the total number of PCs running each of the various operating systems. For those data, see Table 31 and Figure 28.

Operating systems on PCs

Respondents gave interviewers the number of PCs running each of the operating systems. These data are summarized in Table 31 and Figure 28. The total number of PCs in the sample of respondents who answered this question was 631,082. Table 31 lists the operating systems as a percentage of the total PCs in the sampling.

Table 31: Distribution of operating systems installed on PCs

OS             Percent of total PCs
Windows NT     39%
Windows 95     32%
Windows 2000   13%
Windows 98     11%
Macintosh       1%
Unix            1%
Windows 3.1     1%
DOS             1%
OS/2            1%
Total         100%

In our sample, approximately 97 percent of the PCs were running some form of Microsoft OS.


Figure 28: Distribution of operating systems installed on PCs

[Bar chart, Operating Systems Usage (by total number of PCs): Windows NT 245,416; Windows 95 199,042; Windows 2000 82,113; Windows 98 71,070; Macintosh 8,831; Unix 7,024; Windows 3.x 4,922; DOS 4,668; OS/2 3,788.]

LAN Operating Systems

Respondents reported on all the network operating systems (NOS) they used in their corporations for local area networks (LANs). Table 32 below lists those data in order of frequency of response.

Table 32: Distribution of LAN operating systems in respondent organizations

OS                     %       Frequency
Windows NT             85%     255
Novell NetWare         48%     145
UNIX                   12%      37
Solaris                 3%       8
Banyan Vines            3%       5
IBM OS/2 LAN Server     2%       5
Don't Know              1%       4
Other                   1%       4
None Above              1%       3
AIX                    <1%       2
AS/400                 <1%       2
VMS                    <1%       2
Windows 2000           <1%       2
HPUX                   <1%       1
Macintosh              <1%       1
Total Respondents     >100%    300

Percentages add up to more than 100 percent due to respondents' use of multiple operating systems in their organizations. A large majority of organizations had some Microsoft Windows network operating systems and about half (48 percent) had some Novell NetWare NOS. Surprisingly, only 13 percent reported using UNIX systems.


These figures do not, however, represent the number of LAN servers running the various operating systems. For those data, see Table 33 and Figure 29.

Operating systems on LAN servers

Respondents were asked how many LAN servers were running each of the operating systems. These data are summarized in Table 33 and Figure 29. The total number of LAN servers in the sample of respondents who answered this question was 22,454.

Table 33: Distribution of LAN operating systems installed on servers

OS                    % of total LAN servers
Windows NT            71%
Novell NetWare        25%
Unix                   3%
Banyan Vines           1%
AIX                   <1%
AS/400                <1%
HPUX                  <1%
IBM OS/2 LAN Server   <1%
Macintosh             <1%
Solaris               <1%
VMS                   <1%
Windows 2000          <1%
Other                 <1%

In our sample, 71 percent of the LAN servers were running Microsoft Windows NT, while only 25 percent of servers were running Novell Netware.

Figure 29: Server operating systems by total number of servers

[Pie chart, NOS installed by number of servers: Windows NT, Windows 2000, Novell NetWare, Unix, Linux, IBM OS/2 LAN Server, and OS/400.]


Primary line of business

Table 34 shows the distribution of responses from respondents on their primary line of business. Government, healthcare and manufacturing were the top three categories making up approximately 59 percent of the respondents. The complete list is shown below and is sorted alphabetically.

Table 34: Primary line of business

Line of Business     Frequency   %
Agriculture            1         <1%
Business Services      4          1%
Communications        10          3%
Construction           2          1%
Education             12          4%
Engineering            4          1%
Finance               17          6%
Government           100         33%
Healthcare            46         15%
Insurance             15          5%
Legal                  5          2%
Manufacturing         32         11%
Mining                 1         <1%
Non-Profit             2          1%
Professional Srvs      7          2%
Real Estate            1         <1%
Retail                 5          2%
Transportation         1         <1%
Utilities              9          3%
Wholesale              1         <1%
Other                 23          8%

Department

Table 35 shows the distribution of department affiliations of the respondents. Only those departments with at least one response are shown.

Table 35: Departmental affiliations of respondents

Department                                  Frequency   %
MIS/IS (Management Information Systems)     276         92%
Other                                         12          4%
Data Processing                                3          1%
General Administration/Management              3          1%
Education/Training                             2          1%
Research & Development                         2          1%
Engineering                                    1          0%
Personnel/Human Resources                      1          0%

Job title

Table 36 shows the distribution of job titles according to the pre-existing categories used by the survey personnel. 46 percent of the respondents' titles were not on the list and are classified as "Other."


Table 36: Job titles

Job Title                                          Frequency   %
Systems Manager/PC Network/LAN Manager              91         31%
Manager of MIS/IS                                    60         20%
Support Specialist                                   47         16%
Systems Analyst/Programmer                           30         10%
Director of Data Processing/Information Systems      29         10%
Manager of Computer Security                         24          8%
Other                                                 8          3%
Director of Computer Security                         4          1%
Manager of Data Processing                            3          1%
Don't Know/Refused                                    2          1%
Total Respondents                                   298

 Discussion Section by Dr. Peter Tippett, CTO, TruSecure Corporation

What conclusions can we draw from this study?

The purpose of this section is to summarize the findings of this year's survey and draw conclusions regarding the data. In short, this will serve as both a high-level summary of the results of this year's survey and comments concerning malicious code prevalence, trends, and mitigation factors.

The virus problem in companies continues to get worse

The malicious code problem (viruses, Internet worms, and Trojan programs) continues its seven-plus year trend of worsening every year. This is true whether one considers incidents or costs. It is true notwithstanding the fact that the total expenditures directed at protecting enterprises also have risen each of the seven years. By definition, when a problem gets worse in spite of increases in spending and increases in other mitigating activity, it is out of control.

One possible good sign is that the rate of growth of virus incidents appears to be slowing. The virus encounter rate per 1,000 users per month approximately doubled each year for at least the four years prior to 1999. 1999 introduced us to Melissa and the onslaught of mass-mailing and Internet-aware viruses. However, after being exposed thoroughly to these Internet-enabled viruses, the rate of growth seems to have slowed down. The rate of growth for the years 1999 through 2001 is only 15 percent annually. However, it should be noted that this apparent slowing could also be attributed to the timing of the surveys: the 2000 survey was conducted during the LoveLetter outbreak, and about one third of this year's survey was performed in a timeframe that would capture the Nimda event of September. On the other hand, it is possible the virus problem has become so prevalent and common that it is under-reported.

We believe that the rate of desktop-oriented growth has slowed, but are inclined to believe that both encounters and disasters will increase significantly above the current 15 percent per year. With the anticipated increase of encounters will come an increased protection and recovery cost. New virus types, expanded forms of connectivity, expanded VPN and partnership connections, expanded use of multiple e-mail methods by normal users, and new, pervasive replication vectors are all likely to contribute to our predicted increase in the rate of malicious code problems.


If the 15 percent trend continues, organizations in 2002 might expect 135 virus encounters per month per 1,000 PCs. The median company with 1,000 PCs would experience fewer encounters than the average, but because each month a relatively small number of companies experience very high rates of virus infections, the average for all companies is elevated to 135 per month. That means that those relatively few companies have very big virus problems indeed. We have observed this phenomenon in each of the ICSA Labs surveys since 1994.

Companies tend to have continuous small problems with viruses (three to ten incidents per week for a 1,000 PC company) with relatively less common large malicious code issues, which we call disasters. Our measured disaster rate of 28 percent did not count potential Nimda disasters from the approximately 2/3 of respondents who completed our survey before the Nimda date. Given these two modifiers, we believe it is probable that the current rate of desktop-oriented disasters is roughly unchanged from the previous baseline of about one per year per 1,000 PC organization. We believe that the likelihood of desktop-oriented malcode disasters in 2002 will be at least as high and probably higher for reasons cited below.

Other research and data unrelated to this survey indicate that disasters oriented toward Internet servers grew dramatically during 2001. Between the infamous Morris worm of 1988 and 2000, there were effectively no worms or viruses principally involving the hosts of the Internet. Toward the end of 2000, the L10n family of worms began and infected in the low thousands of Internet hosts worldwide. During 2001, we experienced about 5,500 host-oriented worm infections during Q1 (mostly Poisonbox), about 330,000 infections mostly from Code Red during Q2, and perhaps 525,000 infections during Q3, mostly related to Nimda. Together this represents the birth and aggressive growth of host-oriented malicious code. We expect to see continued growth of this type of malicious code.

Virus Types:

As predicted, boot sector viruses are essentially gone in this year's reporting. The reasons for this decline are addressed in the discussion section of our 6th Annual Computer Virus Survey 2000. Therefore, boot-up scanning and scanning of floppy drives on shutdown are no longer valuable if full-time anti-virus is running on desktop and laptop PCs.

We saw some important trends in the birth of new virus types, which may be the harbinger of the larger scale malicious code events to come in 2002 and 2003. There are three significant trends:

1. 2001 (and late 2000) demonstrated the first occurrence of real, pervasive Internet worms since the Morris worm in 1988. Code Red (two successful variants) and Nimda along with earlier multi-platform and Unix platform worms like Poizon-box and L10n (and its variants) show a trend toward infection and propagation via Internet host computers. We expect more multipartite worms by both vector and platform. There is every reason to expect one or more of these to be at least as successful as Nimda.

2. Probably the most successful worm was Nimda which, based on research by TruSecure Corporation, showed an impact on some 68 percent of North American companies. A significant reason for the success of Nimda was that it embodied a total of five major infection vectors: web server to web server, browser, mail, shares, and viral vectors. We expect to see the most successful worms in the next two years exploiting this strategy of multiple infection vectors. Interestingly, the least successful of the five Nimda vectors was its e-mail vector.

3. We saw the first examples of malicious code embedded in HTML scripts where no file attachment was present. None of these was particularly successful during 2001, but we fully expect to see worms that bypass most controls by avoiding any file attachments, and bypass traditional security products by their rapid rate of spread. There are numerous vulnerabilities that are pervasive and yet to be exploited by malicious code. The infected Internet server vector (as in the case of Nimda) will recur.


Perceptions of the Virus Problem

How network managers feel about viruses is consistent with reality. A large majority (72 percent) of those surveyed feel that the overall virus problem is either somewhat worse or much worse than last year. This undoubtedly reflects the reality of their experience with Internet-enabled viruses and worms as well as completely new "host-based" malicious code.

Virus Disasters and Costs

Fully 15 percent of respondents in the 1999 survey experienced a Melissa-related disaster, versus about 41 percent in the 2001 survey who experienced the LoveLetter virus as a disaster. This year's measured costs and impact were down from last year but, accounting for the partial data from the Nimda event, the costs related to disasters were probably about the same as last year. Last year this amounted to, on average, between seven (median) and 344 (average) person-days and between $10,000 (median) and $120,000 (average) in estimated direct costs. Further analyses of previous years' surveys tend to show these numbers to be about seven-fold too low when compared to in-depth studies including productivity analyses and cost modeling of the effects of virus disasters. If this proportion holds for this survey and our other assumptions are good, then the average company spends between $100,000 and $1,000,000 in total ramifications (both soft and hard costs) per year for desktop-oriented virus disasters. The average company undoubtedly spends additional dollars on disasters involving their Internet host computers.
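As a rough arithmetic check (a sketch using only the figures quoted above, not additional survey data), applying the roughly seven-fold correction to last year's reported direct costs lands in the order of magnitude cited:

    # Rough check of the seven-fold correction described above, using only
    # the per-disaster figures quoted in the text.
    median_direct_cost = 10_000     # dollars, last year's median per disaster
    average_direct_cost = 120_000   # dollars, last year's average per disaster
    correction = 7                  # survey figures appear roughly 7-fold low vs. in-depth studies

    print(f"${median_direct_cost * correction:,} to ${average_direct_cost * correction:,}")
    # -> $70,000 to $840,000 per disaster, i.e. roughly the $100,000 to $1,000,000
    #    per-year range cited once softer productivity costs are included.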

Virus disaster impact:

Loss of productivity has always been, by far, the most important ramification of both virus encounters and disasters. This year, however, corruption of files, trouble reading files, and lost data were up significantly as compared to the previous year's survey results. Concern about data destruction and file corruption has always been out of proportion to the real effects of viruses, but this year reality is beginning to grow to match perception.

In addition to being more prevalent in surveyed corporations, computer viruses were more costly, more destructive, and caused more real damage to data and systems than in the past. This is true despite the increased use of anti-virus products, more aggressive product updates, improved management of anti-virus products, and greater distribution of anti-virus protection. We are spending more time and energy trying to prevent viruses, and our virus experience is worse, across the board.

A primary reason for this trend is the propagation of Internet-enabled viruses. Today it is common for a virus to infect a large proportion of corporations within just a few hours of release, or at least within days. Previously, viruses might take years (boot and file type viruses) or months (macro viruses) to gain enough traction to be a likely risk to a given organization. The proliferation of these Internet-enabled viruses increases the importance of protection against new and previously unknown viruses. The strategy of relying on reactive anti-virus technologies that protect against known viruses was once sound. Today, however, known-virus scanning is only a baseline. Corporations also need to look toward other virus protection such as better and finer-grained heuristics, access controls, behavior blocking, change detection, sandboxes, filtering, and other generic technologies discussed below.

Protection Strategies:

Protection across the board is up from each of the previous years. Last year, about 70 percent of desktops, 91 percent of servers, 45 percent of proxy devices and firewalls, and 80 percent of e-mail gateways appeared to be protected by full-time anti-virus products. This year, 92 percent of companies claim to have more than 90 percent desktop coverage, and 84 percent of e-mail servers and about 50 percent of firewalls use anti-virus products. Updates are required for these to be effective, and updating activity increased significantly over previous years. Of these locations, the value of desktop protection cannot be over-emphasized. The desktop is still the most important virus vector and therefore the best single place to put protections, especially when configured with real-time scanning of all file types. E-mail gateway or SMTP server protection is next in importance, and most users seem to recognize this. File and print server protection is probably the least useful, as is manual and login scanning. This year brings the need to add protection to the web-browsing pathway. Depending upon network configuration in the survey base, as many as half of respondents may have such protection.

If protection is up, why are the current strategies failing? People are putting the emphasis in the right places (desktops and gateways), but with an incomplete protection philosophy. With rapidly proliferating viruses, we need generic virus protection at the desktop, e-mail gateway, and browser proxy. Though the questions used to assess utilization of generic protection last year were not ideal, our 2000 survey suggested that only a very small proportion (8-12 percent) of users utilized generic anti-virus strategies such as filtering of certain file types at either the e-mail gateway or desktop. This year we explicitly asked the question, and fully 69 percent claimed to block, filter, or quarantine e-mail attachments by file type. This is exactly the right behavior. We predicted that there had been a large uptick in file-filtering protections at the corporate level as we tracked the propagation of VBS and similar viruses in the past 18 months. Each month the corporate penetration of such viruses decreases and home users carry a larger proportion of the load.

There are other generic controls that can be employed with minimal maintenance and infringement. Though average computer users want a particular control to be 100 percent effective wherever it is employed, it is far easier, less intrusive, and more cost-effective to implement a program of "synergistic" protection strategies at both the desktop and gateway levels. Synergistic controls are those that are not universally effective against a particular virus, but are likely to work in certain specific cases. For example, combining three different controls, each of which is independently effective at the 80 percent level, leads to over 99 percent effectiveness for the collection of synergistic controls. For several years, TruSecure Corporation has published a list of synergistic controls in its anti-virus policy guide. A copy of this guide is available in the appendices.
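The arithmetic behind that example, assuming the three controls fail independently (a sketch of the reasoning, not text from the policy guide):

    # Three independent controls, each 80% effective, layered as serial screens.
    per_control_effectiveness = 0.80
    failure_per_control = 1 - per_control_effectiveness   # 0.20 chance a virus slips past one control
    combined_failure = failure_per_control ** 3            # 0.008 chance it slips past all three
    combined_effectiveness = 1 - combined_failure          # 0.992
    print(f"{combined_effectiveness:.1%}")                 # 99.2%, i.e. "over 99 percent"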

These controls include file attachment filtering; configuration for e-mail clients, web browsers, and word processor applications; and various other controls that are generally easy to implement, require infrequent updates, and usually go unnoticed by the average user. Yet, their effectiveness is quite good, especially during the window of time after a rapid, Internet-enabled virus is released, and before the virus scanning signatures are updated. Our analysis shows that a six-month-old TruSecure Anti-Virus Policy Guide, if implemented at an average company, was successful at protecting against essentially all the Internet-enabled viruses that have been experienced to date. Organizations need to add more generic virus protection -- especially at the desktop and browser security settings.

Another area that will help fight the new generation of Internet-enabled viruses and Trojans is heuristic anti-virus technology. This technology is already implemented in most anti-virus products. Heuristics are particularly effective in current anti-virus products for macro viruses. Because of the very effective heuristics with this class of virus, there has been no mass-effect macro virus since Melissa. We expect that heuristics will also help significantly with embedded HTML script viruses.

Users should also subscribe to an alert service that warns system administrators as quickly as possible of the outbreak of new malicious code. In conjunction with such a service, it is imperative that organizations be able to update their perimeter virus defenses within hours of receiving an alert. It would be good to see desktop systems updated within a day or so of the same alert. Such rapid responses would seriously interfere with the ability of third-generation viruses to propagate, but only if a sizable portion of the entire Internet was protected.


 Appendices

Appendix A: Survey Questionnaire

S2 How many personal computers in North America are you responsible for in terms of virus knowledge, prevention, and software?
S2A1 Which desktop PC operating systems does your organization have running? READ LIST
   DOS Only (No Windows)
   Windows 3.x
   Windows 95
   Windows 98
   Windows NT
   Windows 2000
   Linux
   Macintosh
   OS/2
   UNIX
   Other (Specify OS)
   None of the Above
   Don't Know
   Refused
S3 How many file and application servers are you responsible for in terms of virus knowledge, prevention, and software?
S3A1 Which LAN server operating systems does your organization have running? READ LIST
   Banyan Vines
   IBM Lan Server
   Novell Netware
   Windows NT
   Windows 2000
   Linux
   Macintosh
   OS/2
   UNIX
   Other (Specify OS)
   None of the Above
   Don't Know
   Refused
Q1 Does your organization have a formal virus tracking procedure? Yes / No
Q2 When was it last reviewed and updated?
Q3 What percent of virus incidents in your group are you likely to know about?


Q4A Which anti-virus products are you running at the desktop PC level?
Q4B Which anti-virus products are you running at the file and print server level?

Q5A On the desktop PC level, which of the following anti-virus software protection methods are used? How many PCs use each method?
   1. Users check diskettes and downloads for viruses?
   2. Anti-virus software scans hard drive for viruses every boot-up?
   3. Anti-virus software scans hard drive for viruses every login?
   4. Anti-virus software scans hard drive for viruses full time in the background?
   5. Other periodic anti-virus detection on the desktop?
   6. Other full-time anti-virus detection on the desktop?
Q5B What PERCENTAGE of desktop PC's have NO anti-virus software installed?
Q5C What percentage of desktop PC's have anti-virus software installed, but not running? [RECORD PERCENT]
Q5D On the file server level, which of the following anti-virus software protection methods are used?
   1. Anti-virus software scans hard drive for viruses periodically?
   2. Anti-virus software scans hard drive for viruses full time in the background?
   3. Other anti-virus detection on the server?
Q5E What percentage of e-mail gateways have full-time anti-virus software installed now?
Q5F What percentage of proxy servers have full-time anti-virus software installed now?
Q5G What percentage of firewalls have anti-virus software installed now?
Q5H On the e-mail gateway, which of the following anti-virus software protection methods are used?
   1. Anti-virus software scans all messages in real time? How many?
   2. Anti-virus software scans message folders and databases? How many?
   3. Block, filter, or quarantine e-mail attachments by file type? How many?
      a. What file types do you block, filter, or quarantine?
   4. Other anti-virus software protection methods used that have not been mentioned?
Q5I On the proxy server level, which of the following anti-virus software protection methods are used? Please answer 'yes' or 'no' for each.
   1. Anti-virus software scans all traffic in real time?
   2. Block, filter, or quarantine files by file type?
      a. What file types do you block, filter, or quarantine?
   3. Other anti-virus software protection methods used that have not been mentioned?
Q5J On the firewall level, which of the following anti-virus software protection methods are used?
   1. Anti-virus software scans all traffic in real time?
   2. Block, filter, or quarantine files by file type?
      a. What file types do you block, filter, or quarantine?
   3. Other anti-virus software protection methods used that have not been mentioned?

Q6 To the best of your knowledge, has a computer virus ever been discovered in any PC, diskette or file in your organization?

If they have had a virus.

For the remainder of the survey, a "virus encounter" will be defined as an event or incident where viruses were discovered on any PC diskettes or files.


Q7A How many virus encounters did you have during August 2001?
Q7B How many virus encounters did you have during July 2001?
Q7C How many virus encounters did you have during the first half of 2001 (January-June)?
Q7D How many virus encounters did you have during the last half of 2000 (July-December)?
Q7E How many virus encounters did you have during the first half of 2000 (January-June)?

Q8A Which viruses have affected your group's PCs during August 2001? How many times?
Q8B Which viruses have affected your group's PCs during July 2001? How many times?
Q8C Which viruses have affected your group's PCs during the first half of 2001 (January-June)? How many times?
Q8D Which viruses have affected your group's PCs during the last half of 2000 (July-December)? How many times?
Q8E Which viruses have affected your group's PCs during the first half of 2000 (January-June)? How many times?

Q9 Compared to this time last year, do you feel virus problems in the computing industry are: READ LIST
   Much Worse
   Somewhat Worse
   About the Same
   Somewhat Better
   Much Better

For the remainder of the survey, a "virus disaster" will be defined as an event or incident where viruses were discovered on 25 or more PCs, diskettes, or files at approximately the same time.

Q10A What was the month and year of your most recent disaster?
Q10B What was the name of the virus in your most recent disaster?
Q10C How many PCs were initially suspected of having the [disaster] virus?
Q10D How many PCs actually were found to be infected?
Q10E How many servers were initially suspected of having the virus?
Q10F How many servers actually were found to be infected?
Q10G How long were any servers "down" while dealing with the disaster?
Q10H How long did it take for your group to completely recover?
Q10I How many cumulative person-days did the disaster cost your company?
Q10J How many dollars did the disaster cost your group including employee downtime?

Q11 Which of the following effects occurred in your group as a result of the most recent virus disaster or encounter? READ LIST
   1. Loss of user confidence in the system
   2. Threat of someone losing his or her job
   3. Loss of productivity
   4. Screen message, interface, or lockup
   5. Lost data
   6. Corrupted files
   7. Loss of access to data (i.e. on Server, etc)
   8. Unreliable applications
   9. PC was unavailable to the user
   10. System crash


   11. Trouble saving files
   12. Trouble reading files
   13. Trouble printing
   14. Other (Specify Effect)

Q12 How did your most recent virus disaster or encounter come to your site?
   1. A diskette, brought from someone's home
   2. A diskette, sales demo, or similar
   3. A diskette, shrink-wrapped software
   4. A diskette, LAN manager/supervisor
   5. A diskette, repair/service person
   6. A diskette, malicious person
   7. A diskette, other
   8. From a distribution CD
   9. A download from BBS, AOL, CompuServe, etc.
   10. Other download
   11. Via email as an attachment
   12. Via an automated software distribution
   13. While browsing the World Wide Web

Q13 What department are you in?
Q14 What is your job title?
Q15 What is your organization's primary line of business?


Appendix B: Reported Viruses for the Survey Period

The viruses reported in this year's survey are listed below alphabetically.

#   Virus Name                 #   Virus Name                #   Virus Name
1   AntiEXE.[x]                26  W32/BadTrans.A-mm         50  W97M/Melissa.[x]-mm
2   Empire.Monkey.B            27  W32/BleBla.[x]-mm         51  W97M/Myna.[x]
3   HLLP.Toadie.7800.[x]-mm    28  W32/CodeRed.[x]           52  W97M/Onex.[x]
4   JS.Exception.Exploit       29  W32/Funlove.[x]           53  W97M/Pri.[x]
5   JS/Kak.[x]-m               30  W32/Hybris.[x]-mm         54  W97M/Resume.[x]-mm
6   JS/Seeker.gen              31  W32/Magistr.[x]-mm        55  W97M/Story[x]
7   JS/Wobble-mm               32  W32/MTX-m                 56  W97M/Thus.[x]
8   O97M/Tristate.[x]          33  W32/Naked.[x]-mm          57  WM/Cap.[x]
9   VBS/Plan.[x]               34  W32/Navidad.[x]-m         58  WM/Concept.[x]
10  Stealth.B                  35  W32/Nimda.[x]-mm          59  WM/Npad.[x]
11  Sadmind                    36  W32/PrettyPark.[x]-mm     60  WM/Wazzu.[x]
12  VBS/Bubbleboy.[x]          37  W32/Prolin.[x]            61  WM97/Locale.[x]
13  VBS/Elva                   38  W32/Prolin.[x]-mm         62  X97M/Divi.[x]
14  VBS/FriendMess-MM          39  W32/Prolin.A-mm           63  X97M/Vcx.[x]
15  VBS/Iestart.gen            40  W32/Qaz                   64  XF97/Poppy
16  VBS/LoveLetter.[x]-mm      41  W32/SirCam.[x]-mm         65  XM/Laroux.[x]
17  VBS/Potok-mm               42  W32/Ska.[x]-m             66  XM97/Manalo.[x]
18  VBS/Stages.[x]-mm          43  W95/CIH.[x]               67  Macro-Unspecified
19  VBS/Tam-M                  44  W97M/Assilem.[x]          68  VBS-Unspecified
20  VBS/Tqll.[x]-MM            45  W97M/Class.[x]            69  Worm-Unspecified
21  VBS/VBSWG.J-mm             46  W97M/Coke.22231.[x]       70  Backdoor
22  VBS/VBSWG.J-mm             47  W97M/Ethan.[x]            71  Infectus
23  VBS/VBSWG.J-mm             48  W97M/Groov.[x]            72  Joke
24  VBS/VBSWG.X-mm             49  W97M/Marker.[x]           73  PornDialer
25  W32/Apost.A-mm


Appendix C: Glossary of Common Terms in Anti-virus Discussion

The following are common terms used in discussions of anti-virus software:

Background Scanning: Automatic scanning of files as they are created, opened, closed, or executed. Performed by memory resident anti-virus software. Synonyms: online, automatic, background, resident, active.

Behavior Blocking: A set of procedures that are tuned to detect virus-like behavior, and prevent that behavior (and/or warn the user about it) when it occurs. Some behaviors that should normally be blocked in a machine include formatting tracks, writing to the master boot record or boot record, and writing directly to sectors. Synonyms: “dynamic code analysis”, “behavioral analysis.”

Boot Record: The program recorded in the Boot Sector. All floppies have a boot record, whether or not the disk is actually bootable. Whenever you start or reset your computer with a disk in the A: drive, DOS reads the boot record from that diskette. If a boot virus has infected the floppy, the computer first reads the virus code in (because the boot virus placed its code in the boot sector), then jumps to whatever sector the virus tells the drive to read, where the virus has stored the original boot record.

Boot Sector: The first logical sector of a drive. On a floppy disk, this is located on side 0 (the top), cylinder 0 (the outside), sector 1 (the first sector.) On a hard disk, it is the first sector of a logical drive, such as C: or D:. This sector contains the Boot Record, which is created by FORMAT (with or without the /S switch.) The sector can also be created by the DOS SYS command. Any drive that has been formatted contains a boot sector.

Boot Sector Infector: Every logical drive, both hard disk and floppy, contains a boot sector. This is true even of disks that are not bootable. This boot sector contains specific information relating to the formatting of the disk and the data stored there, and contains a small program called the boot program (which loads the DOS system files). The boot program displays the familiar "Non-system Disk or Disk Error" message if the DOS system files are not present. It is this program that boot sector viruses infect. You get a boot sector virus by leaving an infected diskette in a drive and rebooting the machine. When the program in the boot sector is read and executed, the virus goes into memory and infects your hard drive. Remember, because every disk has a boot sector, it is possible (and common) to infect a machine from a data disk.

Boot Virus: A virus whose code is called during the phase of booting the computer in which the master boot sector and boot sector code is read and executed. Such viruses either place their starting code or a jump to their code in the boot sector of floppies, and either the boot sector or master boot sector of hard disks. Most boot viruses infect by moving the original code of the master boot sector or boot sector to another location, such as slack space, and then placing their own code in the master boot sector or boot sector. Boot viruses, which also infect files, are sometimes known as multipartite viruses. All boot viruses infect the boot sector of floppy disks; some of them, such as Form, also infect the boot sector of hard disks. Other boot viruses infect the master boot sector of hard disks.


Companion Virus: A program that attaches to the operating system, rather than files or sectors. In DOS, when you run a file named “ABC”, the rule is that ABC.COM would execute before ABC.EXE. A companion virus places its code in a COM file whose first name matches the name of an existing EXE. You run “ABC”, and the actual sequence is “ABC.COM”, “ABC.EXE”.

Encrypted Virus: A virus whose code begins with a decryption algorithm, and continues with the scrambled or encrypted code of the remainder of the virus. When several identical files are infected with the same virus, each will share a brief identical decryption algorithm, but beyond that, each copy may appear different. A scan string could be used to search for the decryption algorithm. Cf. Polymorphic.

File Virus: Viruses that attach themselves to (or replace) .COM and .EXE files, although in some cases they can infect files with extensions .SYS, .DRV, .BIN, .OVL, OVR, etc. The most common file viruses are resident viruses, going into memory at the time the first copy is run, and taking clandestine control of the computer. Such viruses commonly infect additional programs as you run them. But there are many non-resident viruses too, which simply infect one or more files whenever an infected file is run.

In the Wild Virus: A term that indicates that a virus has been found in several organizations somewhere in the world. It contrasts the virus with one that has only been reported by researchers. Despite popular hype, most viruses are "in the wild" and differ only in prevalence. Some are new and therefore extremely rare. Others are old, but do not spread well, and are therefore extremely rare. Joe Wells maintains a list of those he knows to be "in the wild."

Macro Virus: A virus which consists of instructions in Word Basic, Visual Basic for Applications (VBA), or some other macro language, and resides in documents. While we do not think of documents as capable of being infected, any application that supports automatically executing macros is a potential platform for macro viruses. Because documents are now even more widely shared than diskettes (through networks and the Internet), document-based viruses are likely to dominate our future.

Master Boot Record: The 340-byte program located in the Master Boot Sector. This program begins the boot process. It reads the partition table, determines what partition will be booted from (normally C:), and transfers control to the program stored in the first sector of that partition, which is the Boot Sector. The Master Boot Record is often called the MBR, and often called the “master boot sector” or “partition table.” The master boot record is created when FDISK or FDISK /MBR is run.

Master Boot Sector: The first sector of the hard disk to be read. This sector is located on the top side (“side 0”), outside cylinder (“cylinder 0”), first sector (“sector 1.”) The sector contains the Master Boot Record.

Master Boot Sector Virus: A virus that infects the master boot sector, such as NYB, spreads through the boot sector of floppy disks. If you boot or attempt to boot your system with an infected floppy disk, NYB loads into memory and then writes itself to the master boot sector on the hard drive. If the disk is not bootable, you see the DOS error message, "Non-system disk or disk error...". If the disk is bootable, the system boots to the A: prompt. Either way the system is infected, and there is no indication on the screen that this has happened. Once the hard drive is infected, NYB loads into memory each time the system is booted. The virus stays in memory, waiting for DOS to access a floppy disk. It then infects the boot record on each floppy DOS accesses.

On-Demand Scanning: Scanning that is initiated explicitly by the user or a scheduler, rather than automatically as files are accessed. Synonyms: offline, manual scanning, foreground, non-resident scanning, scanning.

Polymorphic Virus: A polymorphic virus is one that produces varied (yet fully operational) copies of itself, in the hope that virus scanners will not be able to detect all instances of the virus.

Remove: To remove or clean a virus means to eliminate all traces of it, returning the infected item to its original, uninfected state. Nearly all viruses are theoretically removable by reversing the process by which they infected. However, any virus that damages the item it has infected by destroying one or more bytes is not removable, and the item needs to be deleted and restored from backups in order for the system to be restored to its original, uninfected state. There is a gap between theory and practice. In practice, a removable virus is one that the anti-virus product knows how to remove. The term "clean" is sometimes used for remove, and sometimes used to refer to the destruction of viruses by any method. Thus deleting a file that is infected might be considered cleaning the system. We do not regard this as an appropriate use of the term "clean."

Resident: A property of most common computer viruses and all background scanners and behavior blockers. A resident virus is one which loads into memory, hooks one or more interrupts, and remains inactive in memory until some trigger event. When the trigger event occurs, the virus becomes active, either infecting something or causing some other consequence (such as displaying something on the screen.) All boot viruses are resident viruses, as are the most common file viruses. Macro viruses are non-resident viruses.

Stealth Virus: A virus that uses any of a variety of techniques to make itself more difficult to detect. A stealth boot virus will typically intercept attempts to view the sector in which it resides, and instead show the viewing program a copy of the sector as it looked prior to infection. An active stealth file virus will typically not reveal any size increase in infected files when you issue the “DIR” command. Stealth viruses must be “active” or running in order to exhibit their stealth qualities.

Trojan Horse: A program that does something unwanted and unexpected by a user, but intended by the programmer. Trojans do not make copies of themselves, as do viruses, and seem to be more likely to cause damage than viruses.

Worm: Similar to a virus in that it makes copies of itself, but different in that it need not attach to particular files or sectors at all. Once a worm is executed, it seeks other systems to infect, then copies its code to them.

Zoo Virus: A virus which is rarely reported anywhere in the world, but which exists in the collections of researchers. A zoo virus has some chance of “escaping” virus collections and infecting user machines; its prevalence could then increase to the point that it is considered “in the wild.”


Appendix D: TruSecure Anti-Virus Policy Guide

Specific and Detailed TruSecure Corporation Anti-Virus Policy Suggestions

This policy guide is designed to deal with all current types of viruses known to TruSecure Corporation as of the revision date of this document.

The policies are specifically designed to deal with:
1. Active communication-enabled viruses, Trojans, and worms (such as Happy99, Melissa, BubbleBoy, and LoveLetter), as well as those that may utilize vectors such as ActiveX and JavaScript (e.g., Kak.Worm)
2. Conventional macro viruses
3. Multipartite, parasitic, stealth, and polymorphic viruses
4. Executable file viruses
5. Boot sector and partition table viruses
6. Malicious code that has been compressed by a 32-bit compression program
7. Self-updating malicious code

TruSecure Corporation policy suggestions are characterized as either primary controls or synergistic controls. Primary controls are the most important and effective stand-alone preventative techniques and constitute TruSecure Corporation’s principal policy recommendations for organizations. Synergistic controls function in a way that is analogous to the military strategy of defense-in-depth, which provides redundancy in case particular controls fail.

When operating alone, individual policies, controls, or screens may have limited value, but used together they can be quite effective. When used in conjunction with other synergistic controls, serial screens behave according to Bayes’ theorem: the cumulative effect improves with each additional control, and their use enhances the effectiveness of other, primary, controls.
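As a rough numerical illustration of this cumulative effect, assume each screen independently catches some fraction of malicious items. The figures below are hypothetical and are not drawn from the survey data; the sketch only shows how independent serial screens combine.

    # Hypothetical, independent catch rates for three serial screens.
    screens = {
        "gateway attachment filter": 0.60,
        "desktop on-access scanner": 0.90,
        "user education": 0.30,
    }

    miss_rate = 1.0
    for name, catch_rate in screens.items():
        miss_rate *= (1.0 - catch_rate)   # probability an item slips past this screen too

    print("Combined catch rate: %.1f%%" % (100 * (1 - miss_rate)))   # 97.2% for these figures
    print("Combined miss rate:  %.1f%%" % (100 * miss_rate))         # 2.8%

Even the weak screen (30 percent in this example) noticeably lowers the combined miss rate, which is why controls of limited stand-alone value are still worth applying when their cost to the organization is low.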

TruSecure recommends the use of synergistic controls that a site can easily implement without infringing on business productivity. This includes controls that are easily tolerated by particular groups or sub-groups in an organization, even though other groups may have less tolerance for the same controls. For any given site, it is common to find synergistic controls that have very low infringement and maintenance costs. Furthermore, such controls tend to require less frequent updates to maintain their effectiveness.

We expect that many synergistic controls (below) will not be applicable to a particular site or to a particular group at a particular site. TruSecure Corporation does not encourage changing business practices in order to achieve these controls. Rather, we encourage the use of all controls that can be quickly applied across a given organization, and of all those that can be applied to particular sub-groups within it.

Virus protection controls can be applied at various levels in an organization. Before macro viruses, there was no practical utility in any controls applied anywhere other than PC desktops3. With the advent of macro viruses, the utility of applying some controls at the gateway level became evident. With the communication-enabled viruses, gateway and perimeter controls gain value.

3 For more discussion on the derivation and logic behind these statements, please see other support documents such as ICSA Labs anti-virus surveys, virus study, and virus cost models, available at www.icsalabs.com or www.trusecure.com.


TruSecure Corporation Recommended Virus Controls

Desktop Systems

TruSecure Corporation Recommended Primary Anti-Virus Controls at Desktop Level:

1) ICSA Certified anti-virus software should be installed and running on at least 90 percent of the organization’s desktop PCs4.
2) Subscribe to an alert service, such as the TruSecure Monitor™.
3) If an alert service is available, update desktop anti-virus software (virus signatures) at least monthly and be prepared to perform emergency updates within two business days after an alert. NOTE: Updating virus signatures is not good enough if you are using outdated software; be certain you are also using the most up-to-date version of your anti-virus software.
4) If an alert service is not available, update desktop anti-virus software at least weekly5.
5) Educate users on how to update virus signatures where the process is not centralized.
6) Recommended desktop anti-virus software configuration:
   a) Full-time, background, real-time, auto-protect or similar mode: ENABLED
   b) Start-up scanning of memory, master/boot records, system files: ENABLED
   c) Configure your AV to scan “All Files.”
   d) Logging ENABLED to log all desktop virus-related activity.
7) Turn off the operating system’s Windows Scripting Host:
   i) Go to Control Panel.
   ii) Select Add/Remove Programs.
   iii) Under the Accessories details, you will find a Windows Scripting Host checkbox.
   iv) Clear the checkbox to remove Windows Scripting Host from your computer.
8) ENABLE Macro Virus Protection in Microsoft Office programs.

Additional notes on desktop level policies:

1) Alerts to users are neither recommended nor discouraged. TruSecure Corporation marginally recommends turning user alerts off, but leaving system administrator alerts, logs, or other advisories on.
2) Daily, startup, login, or other periodic scanning of hard drives has little incremental utility. TruSecure Corporation recommends these options be turned off, as their user infringement normally outweighs their protective value.
3) User-driven scanning policies, such as requesting users to scan floppies, downloads, or hard drives, are not recommended, as they are generally more expensive and infringing than useful.
4) Write protection of floppies is probably still worthwhile. There is little downside to a policy of write protection, and although the incidence of boot sector viruses seems to be declining, it is quite likely that future Win32 viruses will seek out the boot sectors of floppies. It will be easier to simply keep doing the write protection than to try suddenly to retrain everyone.
5) Educating users to look for virus-like activity is not a viable policy.

4 ICSA Labs virus surveys and cost models show a cost minimum, given communication-enabled viruses, when 90 percent of desktops have active protection. The same survey indicates that upwards of 10 percent of machines with anti-virus products installed have the AV software turned off or otherwise inoperable; it is therefore important to monitor AV protection to maintain this 90 percent coverage. This is up from 75 percent in v3.01 and earlier policy guides.
5 Previous policy recommendations suggested quarterly updates when using an alert capability, or monthly updates without one, for macro-level viruses, and six-month updates with an alert capability (quarterly without) for pre-macro viruses.

TruSecure Corporation Recommended Synergistic Controls at the Desktop-Level:

Implement as many of these as may be feasible within your organization. (For items 1 and 2, specific recommended configuration settings are listed in Annex 1.)

1) Ensure that all relevant Microsoft security patches and Service Releases are installed. See Annex 2 for a list of such security patches and Service Releases.
2) Configure all instances of MS Word to save files as Rich Text Format (*.rtf). Note 1: Renaming a *.doc file to *.rtf is not useful; files must actually be saved in *.rtf format, but may be saved under any file name or extension.

Note 2: The policy should allow users to manually save as *.doc type where needed to reduce the file size of complex documents, or for files that require macros or other very advanced, embedded features. Otherwise, all files should be saved, mailed, stored, and maintained as *.rtf files by default.

3) Seriously consider the use of an add-in tool designed to prevent double-click execution of e-mail attachments without a challenge.
4) Configure WordView or WordPad as the default association with *.doc files. Note: This is probably not effective on all newer versions and service releases of Office, particularly as the document-centric6 model becomes more common, but there is little downside.
5) Create and institute a policy NEVER to double-click on e-mail attachments. If you receive a document or spreadsheet that you want to look at, manually open it with WordView, WordPad, or XLView. If the attachment appears to be an executable program, never run it, despite what the accompanying e-mail says, and no matter from whom it is received. It is now a common ploy for spreading viruses to make a message appear to be from someone you know and trust. A viable policy is to ask users never to double-click on an attachment unless they are expecting it.
6) Use AV software heuristic controls (in full-time background mode where available).
7) Set file attributes to READ ONLY for *.exe and *.dll in the %systemroot% directory. On Windows 9x machines, the usual directory is C:\Windows; on Windows NT and Windows 2000 machines, this is usually C:\WINNT. On Windows 9x machines, this is done using the DOS command ATTRIB.EXE; on Windows NT and Windows 2000 machines, this is a permission setting. Also set the attributes for WSOCK32.DLL to “read only.”
8) Set the AV product to alert on or prevent changing of DOS read-only file attributes.
9) Store NORMAL.DOT (the default document template) in a protected folder on the file server, or write-protect it on the C: drive (i.e., set its “read only” attribute).
10) Create a copy of NORMAL.DOT, rename it and store it somewhere else on your computer, and create a batch file, run from AUTOEXEC.BAT, which compares the two files. You can use a simple DOS command like FC (File Compare). NORMAL.DOT shouldn’t change unless you are making some sort of customization. While viruses may still defeat the read-only attribute on NORMAL.DOT, it is unlikely they will find the renamed copy. This provides a useful early warning that something has happened.

6 Document-centric means that, instead of the user having to decide what program they want to use to examine a file, the operating system examines the file internally, as opposed to just looking at the extension, and decides what program should be used automatically. This already happens for some versions of some applications.


11) Make a copy of WSOCK32.DLL and KERNEL32.DLL and store them somewhere else, and compare them the same way as in the previous item, for the same reason (a minimal sketch of such a comparison follows this list). This helps protect against the likes of Happy99, which modifies WSOCK32.DLL; KERNEL32.DLL is another likely target for malicious code. Periodically, we will probably suggest other candidate system files which are both static (and thus defendable) and important (and thus likely targets). Windows 2000 implements a feature called Windows File Protection, which ensures that any protected file that is replaced, for any reason and by any means, is automatically recovered from a protected cache.
12) Consider use of an alternative word processor or office suite, or limit its use to certain users and provide other users with alternate applications. Remember that no product is bulletproof, and anything that becomes very popular is likely to be targeted.
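The comparison in items 10 and 11 can be scripted. The original guidance suggests a DOS batch file using FC run from AUTOEXEC.BAT; the sketch below expresses the same baseline-comparison idea in Python, purely as an illustration. All file paths are examples only and will differ from machine to machine.

    # A minimal sketch of the "compare the live file against a stashed baseline" idea
    # from items 10 and 11 above. The paths below are placeholders, not real locations.
    import filecmp

    WATCHED = {
        r"C:\Example\NORMAL.DOT": r"D:\baseline\normal_copy.dot",
        r"C:\Example\WSOCK32.DLL": r"D:\baseline\wsock32_copy.dll",
        r"C:\Example\KERNEL32.DLL": r"D:\baseline\kernel32_copy.dll",
    }

    for live, baseline in WATCHED.items():
        # shallow=False forces a byte-for-byte comparison instead of a size/date check
        if not filecmp.cmp(live, baseline, shallow=False):
            print("WARNING: %s differs from its stored baseline -- investigate." % live)

A batch file using FC, as the guide recommends, accomplishes the same thing with the tools already present on a Windows 9x machine.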

E-Mail Client (Desktop) Applications

TruSecure Corporation Recommended Primary Controls at the E-Mail Client Level:

1) Outlook:
   a) Set Internet Explorer (IE) 4.x/5.x security settings in the Internet zone to "high".
   b) Customize IE settings (after setting to High) and disable ActiveX and Active Scripting: set to disable (strongest) or prompt.
      Note 1: All versions of Outlook (except Outlook 97) rely on the Security Zone security settings from Internet Explorer (Tools, Internet Options, Security). Outlook lets you specify one of two zones to use as the security settings for dealing with e-mail messages (Tools, Options, Security). You can use the Internet Zone (default) or the Restricted Sites Zone (which is more secure). You should be using the Restricted Sites Zone. But, as configured by default, Restricted is not strict enough. To the Restricted Sites Zone you should add these restrictions:
         1. Change "Script ActiveX controls marked safe for scripting" to disable or prompt
         2. Change "Active Scripting" to disable or prompt
         3. JavaScript settings should already be set to disable or prompt
      Note 2: RAMIFICATIONS of disabling Active Scripting probably include the inability of the machine to utilize Microsoft’s automated patch update systems. Ramifications of disabling ActiveX and JavaScript mostly relate to web-browsing infringements. If a company wants its users to have the ability to use Microsoft’s Windows Update feature, this can be overcome by including those pages in Internet Explorer’s Trusted Sites Zone.
   c) For Outlook ’98 users, disable the Preview Pane on all folder views. Since the Preview Pane is enabled by default, this must be done for every folder available to the user, and again each time a new folder is added. By default, Outlook 2000 prevents the use of Active Scripting in the Preview Pane, regardless of the Security Zone setting of the Outlook 2000 client. However, it may be recommended that all users use the Auto Preview feature rather than the Preview Pane for consistency within the company.
2) Outlook Express -- Disable the Open and/or Preview panes. (Auto Preview, which shows the first three lines of the message, is OK and is certainly the safest of the three choices.)
   a) Perform the IE Security Zone settings as above.
   b) To disable the Preview Pane in Outlook Express:
      1. Select Local Folders, then on the Menu Bar select View | Layout
      2. At the bottom of the dialogue, do not check Preview Pane
3) Netscape -- Disable JavaScript. To disable JavaScript: on the Menu Bar select Edit | Preferences.
   a) On the left menu of the dialogue box, click Advanced.
   b) On the right side of the dialogue box, do not enable JavaScript for e-mail and news.
   c) For maximum safety, also disable JavaScript for browsing.


Note: Be advised, however, that Netscape uses JavaScript as part of its Smart Update program, and JavaScript will need to be enabled to make use of this feature.

4) Other e-mail clients -- Disable Visual Basic Scripting or JavaScript if utilized. Also, disable the use of IE for HTML viewing of e-mail if it is used by the client (e.g., Eudora 4.x). If IE is on the desktop, set it as above.

Synergistic Controls at the E-Mail Client Level:

Organizations should implement as many as feasible.
1) Turn off auto-open of attachments.
2) Configure for plain text only.
3) Configure to challenge execution of all executable attachments (see Annex 3).
4) Configure to challenge opening of all *.doc, *.xls, and *.ppt files, as well as other executable file types. See Annex 3 for a listing of file types.
5) Configure to challenge double-clicks on all attachments.
6) Consider using a non-MS mail application.
7) Do not store an ALL_Company alias in local e-mail lists.

Network File and Print Servers

TruSecure Corporation Recommended Primary Control at Inside Server level:

1) Run an AV scanner in full-time, background, automatic, auto-protect or similar mode on any file server that potentially stores infectable files, such as *.doc files and executables that run on desktops. Note: Daily or other periodic scanning has little value when auto-protect mode or a similar full-time mode is enabled.
2) Update server signature files monthly if an alert service is available, and weekly (or at the maximum vendor rate) if no alert service is available.

Synergistic Controls at the Inside Server Level:

Organizations should implement as many as feasible.

1) Utilize centralized AV management.
2) Use centralized desktop management.
3) Manage Internet Explorer and Visual Basic Scripting centrally (e.g., login lockdown, use of registry editing scripts, etc.).

E-Mail Gateways, Firewalls, Other Gateways and Anti-Spam Tools

TruSecure Corporation Recommended Primary Control at the Gateway Level:

1) Install e-mail gateway anti-virus software configured for full-time active mode.
2) Configure your anti-virus “Files List” to include scanning of all files (most effective).
3) If organizations determine not to configure anti-virus software to scan all files, check Annex 3 for a list of file types that may be automatically invoked and should be scanned for viruses.


4) Filter all *.exe attachments and similar. See Annex 3 for a list of file types that may be filtered.
5) Filter all arriving (and departing, if possible) e-mail traffic by subject line/header (a minimal sketch of such filtering follows this list). Note: Be prepared to rapidly adjust filtering rules within one hour of notice for emergency alerts. See below for examples:
   a) Kill all mail with the following subject lines:
      − Important Message From* (for Melissa)
      − BubbleBoy is back! (for BubbleBoy)
   b) Kill all mail with the following message header:
      − X-Spanska: Yes (for Happy99)
   c) Kill all mail with the following message body text:
      − This document is very Important and you've GOT to read this!!! (W97M/Prilissa)
      − Here's some pictures for you! (W32/MyPics)
6) For additional help with sendmail see: http://www.sendmail.com/blockmelissa.html
7) For help with John Hardin's Procmail see: ftp://ftp.rubyriver.com/pub/jhardin/antispam/procmail-security.html
8) For help with Innosoft's PMDF see: http://www.innosoft.com/iii/pmdf/virus-word-emergency.html
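The following is a minimal sketch of the subject, header, and body matching described in items 5a-5c. It simply reuses the example kill rules from the list above; any real gateway product has its own rule syntax, so this code is illustrative only.

    # Illustrative gateway filter using the example kill rules above.
    import email

    SUBJECT_PREFIXES = ("important message from", "bubbleboy is back!")
    BLOCKED_HEADERS = {"X-Spanska": "Yes"}   # Happy99
    BODY_PHRASES = (
        "this document is very important and you've got to read this!!!",  # W97M/Prilissa
        "here's some pictures for you!",                                    # W32/MyPics
    )

    def should_block(raw_message):
        msg = email.message_from_string(raw_message)
        subject = (msg.get("Subject") or "").lower()
        if subject.startswith(SUBJECT_PREFIXES):
            return True
        if any(msg.get(name) == value for name, value in BLOCKED_HEADERS.items()):
            return True
        body = msg.get_payload()
        if not isinstance(body, str):      # multipart messages are skipped in this sketch
            return False
        return any(phrase in body.lower() for phrase in BODY_PHRASES)

The point of the sketch is the speed with which such rules can be edited: a new subject line or header can be added within the one-hour emergency window noted above, well before new signatures are available.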

Gateway Level, Potential Synergistic Controls

Implement as many as are feasible within your organization.
1) Consider filtering all arriving and departing e-mail by spam threshold (greater than 40 identical messages blocked, and the source traced if inside).
2) Filter all *.doc and all other feasible attachment file types (see Annex 3).
3) Filter ActiveX and JavaScript.
4) Strip HTML tags (angle brackets) from HTML e-mail (a minimal sketch follows this list).
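A minimal sketch of item 4: removing everything between angle brackets from an HTML message body. A production gateway would apply this to the decoded MIME part rather than to raw text, and a simple pattern like this can be evaded, so it is only a synergistic screen.

    # Strip HTML tags (anything between angle brackets) from an HTML e-mail body.
    import re

    TAG = re.compile(r"<[^>]*>")

    def strip_html_tags(html_body):
        return TAG.sub("", html_body)

    print(strip_html_tags('Click <a href="http://example.com/">here</a> now'))
    # prints: Click here now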

Human Factors

TruSecure Corporation Recommended Primary Control at the Human Factor Level:

None

Human Factors Potential Synergistic Controls:

Implement as many as are feasible within your organization.
1) Educate users to consider e-mail attachments and links potentially dangerous and to treat them very cautiously. Specifically recommended education: open only expected attachments and links from known and trusted sources; delete or question all others before opening.
2) Keep system managers updated and informed. Links to hundreds of sites and related documents are available at http://www.icsa.net/virus.
3) Reinforce the message to users never to double-click an e-mail attachment that is not expected. This policy is difficult, since the affected (malicious) e-mail will normally come "From" a trusted person. (Well-informed users can be taught that *.doc, *.exe, *.scr, *.vbs, *.shs, *.js, and *.hta extensions are the most likely to be dangerous.) Desktop anti-virus software will normally work if it is kept updated and properly configured to operate full-time in the background.
4) E-mail containing Christmas card attachments, video, audio, or other "fun" attachments, as well as sexually-oriented attachments, is likely to be the most exploited vector for such viruses.

Other Notes:

Diversity of your site compared with average sites running popular software is, in general, protective. Look for things about your site that change enough of your implementation of word processing, document file usage, e-mail, and other similar systems so as to not be identical to the average site.

ActiveX and HTML viruses have not yet materialized as real risks experienced by the computing community. TruSecure Corporation believes that these vectors are imminent; building defenses for them now will save the day when they become real. Viruses and worms dependent upon them may travel exceedingly fast -- much faster than an organization can be expected to update its anti-virus signatures, and potentially faster than anti-virus companies can create and distribute signatures.

Use a security service like TruSecure that will provide updated policies (updated at least quarterly for proven effectiveness), continuous alerts and information, emergency alerts and updates, and which will repeatedly measure and test the effectiveness of your site's implementation of these anti-virus policies and numerous other security policies and practices. The service considers malicious code risk as one of six risk categories (Electronic Risk (mostly hacking), Malicious Code Risk (mostly viruses), Privacy Risk, Downtime Risk, Physical Risk, and Human Factors Risk). The process is continuous and constantly improving, and it rapidly leads to effective security using the people and products already at your organization.

Submit Suggestions:

Please contribute your AV policy suggestions, whether primary controls or supplemental/synergistic controls, by sending e-mail to ‘[email protected]’.


ANNEX 1: Recommended settings for Microsoft Office

Office 97 Settings:

Word 97

Tools / Options / General:
  Do check: [Macro virus protection]
  Do not check: [Mail as attachment]
Tools / Options / Save:
  Do check: [Prompt to save Normal template]
  Do not check: [Allow fast saves]
  Do configure: Save Word files as: Rich Text Format (*.rtf)

Excel 97

Tools / Options / General:
  Do check: [Macro virus protection]

PowerPoint 97

Tools / Options / General:
  Do check: [Macro virus protection]

Office 2000 Settings:

Word 2000

Tools | Macro | Security:
  Security level = High (default is High)
  Trusted Sources = NO ONE
  Do not check “Trust all installed add-ins and templates.” (Default is checked.)
Tools | Options | General:
  Do not check: Mail as attachment
Tools | Options | Save:
  Do check: Prompt to save Normal template
  Do not check: Allow fast saves
  Do configure: Save Word files as: Rich Text Format (*.rtf)

Excel 2000

Tools | Macro | Security:
  Security level = High (default is High)
  Trusted Sources = NO ONE
  Do not check “Trust all installed add-ins and templates.” (Default is checked.)

PowerPoint 2000

Tools | Macro | Security:
  Security level = High (default may be Medium)
  Trusted Sources = NO ONE
  Do not check “Trust all installed add-ins and templates.” (Default is checked.)


ANNEX 2: Recommended settings for Microsoft Outlook Security patches

Email Attachment Security Update for all versions of Outlook

This Update prevents users from automatically executing certain file types, instead forcing them to save these files to disk. Download and install the following, depending on the version of Outlook you use:

Outlook 97: http://officeupdate.microsoft.com/downloadDetails/O97attch.htm
Outlook 98: http://officeupdate.microsoft.com/downloadDetails/O98attch.htm
Outlook 2000: http://officeupdate.microsoft.com/2000/downloadDetails/O2Kattch.htm

The following URL provides a fuller description of the Update, which prevents the automatic execution of .exe, .bat, .com, and .cmd file types:
http://support.microsoft.com/support/kb/articles/q235/3/09.asp

Microsoft Office SR-1

MS Office SR-1 provides more flexibility for Outlook 2000 than previous versions did. Included in SR-1 is an enhancement to the Email Attachment Security Update, as described at:

http://support.microsoft.com/support/kb/articles/Q259/2/28.ASP

See the following for details:

http://officeupdate.microsoft.com/2000/downloadDetails/O2kSR1DDL.htm


ANNEX 3: File types for potential filtering

The list below contains the file types that are recognized by default on machines with a complete installation of Office 2000. These may be automatically invoked when presented to the user as an attachment in e-mail. (A minimal sketch of using such a list to filter attachments follows the table.)

??_ IM? OQY UDL AD? INF OSS ULS ADE INI OV? URL ADP INS P10 VB? ASP IQY P12 VBE ASX ISP P7B VBS BAS ITS P7R VS? BAT JOT P7S WAB BIN JS? PBK WBK CDR JSE PFX WEBPNP CER LNK PKO WHT CHM MAD PCD WIZ CMD MAF PIF WIZHTML COM MAM PL WPD CPL MAPIMAIL PMA WS? CRL MAQ PMC WSC CRT MAR POTHTML WSF CSC MAS POT WSH CSV MAT PP? XLA DER MAV PPA XLB DESKLINK MAW PPS XLC DEV MD? PPT XLD DIF MDA PPTHTML XLK DLL MDB PRF XLL DO? MDBHTML PWZ XLM DOC MDE QDS XLS DOCHTML MDT RNK XLSHTML DOT MDW RQY XLT DOTHTML MDZ RTF XLTHTML DQY MSC SC2 XLV DSN MHT SCD XLW DUN MHTML SCH XML EML MPP SCR XNK EXE MPT SCT XSL FAV MS? SHB XTP GMS MSI SHS XL? GZ? MSP SLK ZAP HLP MST SMM HT? NFO SNP HT NMW SPC HTA NWS SST HTM OBD STL HTML OBT STM HTT OCX SYSVB? HTW OLE
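As a rough illustration, a gateway or desktop filter could treat the table above as a blocklist of attachment extensions. The sketch below reproduces only a handful of the entries and assumes the “?” characters are single-character wildcards; a real product would load the full list and apply its own matching rules.

    # Illustrative attachment check driven by a small subset of the Annex 3 list.
    import fnmatch
    import os

    BLOCKED_PATTERNS = ["EXE", "SCR", "PIF", "SHS", "VBS", "JS?", "VB?", "DO?", "XL?"]

    def is_blocked(filename):
        ext = os.path.splitext(filename)[1].lstrip(".").upper()
        return any(fnmatch.fnmatch(ext, pattern) for pattern in BLOCKED_PATTERNS)

    print(is_blocked("party-invite.jpg.vbs"))   # True  (double extension still ends in .vbs)
    print(is_blocked("readme.txt"))             # False (TXT is not on the list)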
