2012 Consumer Security Products Performance Benchmarks (Edition 3)
Antivirus, Internet Security & Total Security
Windows 7

February 2012

Document: 2012 Consumer Security Products Performance Benchmarks (Edition 3)
Authors: M. Baquiran, D. Wren
Company: PassMark Software
Date: 16 February 2012
Edition: 3
File: totalprotectionsuites-feb2012.docx

Table of Contents

Revision History
References
Executive Summary
Overall Score
Products and Versions
    Internet Security Software
    Antivirus Software
    Total Security Software
Performance Metrics Summary
Internet Security Software – Test Results
    Benchmark 1 – Boot Time
    Benchmark 2 – Scan Time
    Benchmark 3 – User Interface Launch Time
    Benchmark 4 – Memory Usage during System Idle
    Benchmark 5 – Browse Time
    Benchmark 6 – Internet Explorer Launch Time
    Benchmark 7 – Installation Time
    Benchmark 8 – Installation Size
    Benchmark 9 – Registry Keys Added
    Benchmark 10 – File Copy, Move and Delete
    Benchmark 11 – Installation of Third Party Applications
    Benchmark 12 – Network Throughput
    Benchmark 13 – File Format Conversion
    Benchmark 14 – File Compression and Decompression
    Benchmark 15 – File Write, Open and Close
    Benchmark 16 – PE Scan Time
    Benchmark 17 – File Copy Disk to Disk
Antivirus Software – Test Results (Benchmarks 1–17, as above)
Total Security Software – Test Results (Benchmarks 1–17, as above)
Disclaimer and Disclosure
Contact Details
Appendix 1 – Test Environment
Appendix 2 – Methodology Description


Revision History

Edition 1 – 11 August 2011: Initial version of this report. Includes new results for 2012: Norton Internet Security 2012, G Data Internet Security 2012, Kaspersky Internet Security 2012, Panda Internet Security 2012, Avast! Internet Security 6, Norton AntiVirus 2012, G Data AntiVirus 2012, Kaspersky Anti-Virus 2012, Panda Antivirus 2012 and Avast! Pro Antivirus 6.

Edition 2 – 14 November 2011: Added results for 7 additional Internet Security products and 7 additional Antivirus products. For more details on these products, see Products and Versions.

Edition 3 – 24 January 2012: Added results for 4 additional Free Antivirus products to the Antivirus category. Added results for 5 products in the Total Security category. Replaced all Registry Key Count results taken using RegistryCount.exe with results taken using OSForensics. For more details see Benchmark 9 – Registry Keys Added.

References

1. What Really Slows Windows Down (URL) – O. Warner, The PC Spy, 2001–2009.


Executive Summary

PassMark Software® conducted objective performance testing on thirteen (13) Internet Security software products, seventeen (17) Antivirus software products, and five (5) Total Security software products on Windows 7 Ultimate Edition (64-bit) SP1 between April 2011 and January 2012. This report presents the results and findings of the performance benchmark testing conducted on these consumer security products.

Subsequent editions of this report will include new products released for 2012 as they are made available. For more details on which versions were tested, please see the section “Products and Versions”.

Testing was performed on all products using seventeen (17) performance metrics. These performance metrics are as follows:

• Boot Time;
• Scan Time;
• User Interface Launch Time;
• Memory Usage during System Idle;
• Browse Time;
• Internet Explorer Launch Time;
• Installation Time;
• Installation Size;
• Registry Keys Added;
• File Copy, Move and Delete;
• Installation of Third Party Applications;
• Network Throughput (previously named “Binary Download Test”);
• File Format Conversion;
• File Compression and Decompression;
• File Write, Open and Close;
• PE Scan Time; and
• File Copy Disk to Disk.


Overall Score

PassMark Software assigned every product a score depending on its ranking in each metric compared to other products in the same category.
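The points formula is not spelled out in this section, but the stated maxima (221 for 13 Internet Security products, 289 for 17 Antivirus products and 85 for 5 Total Security products, each across 17 metrics) are consistent with a simple scheme in which first place in a metric is worth one point per competing product. The Python sketch below illustrates that inferred scheme; it is an assumption based on those maxima, not PassMark's published formula.

```python
# Illustrative rank-based scoring, inferred from the stated maxima
# (e.g. 13 products x 17 metrics = 221). Ties are ignored for simplicity.

def overall_scores(results):
    """results: {metric: [(product, value), ...]} where lower values are better."""
    n_products = len(next(iter(results.values())))
    scores = {}
    for metric, outcomes in results.items():
        ranked = sorted(outcomes, key=lambda pv: pv[1])  # best (lowest) first
        for rank, (product, _value) in enumerate(ranked, start=1):
            # First place earns n_products points, last place earns 1 point.
            scores[product] = scores.get(product, 0) + (n_products - rank + 1)
    return scores

if __name__ == "__main__":
    example = {
        "Boot Time": [("A", 26.4), ("B", 31.7), ("C", 38.2)],
        "Scan Time": [("A", 22.6), ("B", 23.3), ("C", 44.3)],
    }
    print(overall_scores(example))  # {'A': 6, 'B': 4, 'C': 2}
```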

Internet Security Software

In the following table, the highest attainable score is 221, which corresponds to a hypothetical product that attained first place in all 17 metrics. Internet Security products have been ranked by their overall scores:

Product Name Overall Score

Webroot SecureAnywhere Essentials 195

Norton Internet Security 2012 158

Trend Micro Virus Buster 2012 134

Avira Internet Security 2012 133

Avast! Internet Security 6 128

Kaspersky Internet Security 2012 119

AVG Internet Security 2012 116

Trend Micro Titanium Internet Security 2012 114

G Data Internet Security 2012 103

Panda Internet Security 2012 101

SourceNEXTstyle ZERO 93

McAfee Internet Security 2012 86

BitDefender Internet Security 2012 68


Antivirus Software

In the following table, the highest attainable score is 289, which corresponds to a hypothetical product that attained first place in all 17 metrics. Antivirus products have been ranked by their overall scores:

Product Name Overall Score

Webroot SecureAnywhere AntiVirus 2012 257

Norton Antivirus 2012 203

Avast! Pro Antivirus 6 183

Trend Micro Titanium 2012 167

Kaspersky Anti-Virus 2012 165

PC Tools Spyware Doctor with AntiVirus 9 164

Avira Free Antivirus 12 161

Panda Cloud Antivirus Free 152

Avast! Free Antivirus 6 148

Microsoft Security Essentials 148

AVG Anti-Virus Free 2012 141

Quick Heal AntiVirus 2012 138

Avira Antivirus Premium 2012 135

AVG Anti-Virus 2012 123

G Data Antivirus 2012 116

Panda Antivirus Pro 2012 108

McAfee Antivirus 2012 90

Total Security Software

In the following table, the highest attainable score is 85, which corresponds to a hypothetical product that attained first place in all 17 metrics. Total Security products have been ranked by their overall scores:

Product Name Overall Score

Norton 360 v6 69

Trend Micro Titanium Maximum Security 2012 55

McAfee Total Protection 2012 49

Kaspersky PURE 9 42

BitDefender Total Security 2012 40


Products and Versions

In all cases, we have tested the full, retail release of the newest generation (2012 versions) of security products. The names and versions of the products tested are listed in the category tables below:

Internet Security Software

Manufacturer Product Name Release Year Product Version Date Tested

Symantec Corporation Norton Internet Security 2012 2011 19.1.0.21 Aug 2011

G Data Software AG G Data Internet Security 2012 2011 22.0.2.25 Jun 2011

Kaspersky Lab Kaspersky Internet Security 2012 2011 12.0.0.374 Jun 2011

Panda Security SL Panda Internet Security 2012 2011 17.00.00 Jul 2011

AVAST Software a.s. Avast! Internet Security 6 2011 6.0.1203 Aug 2011

AVG Technologies AVG Internet Security 2012 2011 2012.0.1809 Sep 2011

Trend Micro Inc. Trend Micro Titanium Internet Security 2012 2011 5.0.1280 Oct 2011

Sourcenext Corporation SourceNEXTstyle ZERO 2011 11.0.0047 Oct 2011

Trend Micro Inc. Trend Micro Virus Buster 2012 2011 5.0.1280 Oct 2011

Avira Operations GmbH & Co. KG Avira Internet Security 2012 2011 12.0.0.871 Oct 2011

BitDefender BitDefender Internet Security 2012 2011 15.0.31.1282 Oct 2011

McAfee, Inc. McAfee Internet Security 2012 2011 11.0.393 Nov 2011

Webroot Webroot SecureAnywhere Essentials 2011 8.0.0.66 Feb 2012


Antivirus Software

Manufacturer Product Name Release Year Product Version Date Tested

Symantec Corporation Norton AntiVirus 2012 2011 19.1.0.21 Aug 2011

G Data Software AG G Data Antivirus 2012 2011 22.0.2.25 Jun 2011

Kaspersky Lab Kaspersky Antivirus 2012 2011 12.0.0.374 Jun 2011

Panda Security Panda Antivirus Pro 2012 2011 11.00.00 Jul 2011

AVAST Software a.s. Avast! Pro Antivirus 6 2011 6.0.1203 Aug 2011

AVG Technologies AVG Anti-Virus 2012 2011 2012.0.1809 Sep 2011

Trend Micro Inc. Trend Micro Titanium 2012 2011 5.0.1280 Oct 2011

Avira Operations GmbH & Co. KG Avira Antivirus Premium 2012 2011 12.0.0.871 Oct 2011

Quick Heal Technologies (P) Ltd. Quick Heal AntiVirus 2012 2011 13.00(6.0.0.1) Oct 2011

McAfee, Inc. McAfee AntiVirus 2012 2011 11.0.623 Nov 2011

PC Tools PC Tools Spyware Doctor with AntiVirus 9 2011 9.0.0.888 Nov 2011

Microsoft Corporation Microsoft Security Essentials 2011 2.1.1116.0 Nov 2011

AVAST Software a.s. Avast! Free Antivirus 2011 6.0.1289 Dec 2011

AVG Technologies AVG Anti-Virus Free Edition 2012 2011 2012.0.1873 Dec 2011

Panda Security Panda Cloud Antivirus Free 2011 1.5.1 Dec 2011

Webroot Webroot SecureAnywhere Essentials 2011 8.0.0.60 Dec 2011

Avira Operations GmbH & Co. KG Avira Free Antivirus 12 2011 12.0.0.870 Dec 2011

Total Security Software

Manufacturer Product Name Release Year Product Version Date Tested

Symantec Corporation Norton 360 v6 2011 6.0.0.141 Dec 2011

Trend Micro Inc. Trend Micro Titanium Maximum Security 2012 2011 5.0.1280 Jan 2012

McAfee, Inc. McAfee Total Protection 2012 2011 SecurityCenter 11.0.649, Virus Scan 15.0294 Jan 2012

Kaspersky Lab Kaspersky PURE 9 2011 9.1.0.124 Jan 2012

BitDefender BitDefender Total Security 2012 2011 15.0.35.1478 Jan 2012


Performance Metrics Summary

We have selected a set of objective metrics which provide a comprehensive and realistic indication of the areas in which an antivirus may impact system performance for end users. Our metrics test the impact of the antivirus software on common tasks that end-users would perform on a daily basis.

All of PassMark Software’s test methods can be replicated by third parties using the same environment to obtain similar benchmark results. Detailed descriptions of the methodologies used in our tests are available in “Appendix 2 – Methodology Description” of this report.

Benchmark 1 – Boot Time

This metric measures the amount of time taken for the machine to boot into the operating system. Security software is generally launched at Windows startup, adding to the time taken for the operating system to start. Shorter boot times indicate that the application has had less impact on the normal operation of the machine.

Benchmark 2 – Scan Time

All antivirus solutions have functionality designed to detect viruses and various other forms of malware by scanning files on the system. This metric measured the amount of time required to scan a set of clean files. Our sample file set comprised a total file size of 982 MB and was made up of files that would typically be found on end-user machines, such as media files, system files and Microsoft Office documents.

Benchmark 3 – User Interface Launch Time

This metric provides an objective indication as to how responsive a security product appears to the user, by measuring the amount of time it takes for the user interface of the antivirus software to launch from Windows. To allow for caching effects by the operating system, both the initial launch time and the subsequent launch times were measured. Our final result is an average of these two measurements.
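As an illustration of how the two measurements combine, the minimal Python sketch below averages the cold initial launch with the mean of the cached subsequent launches; the timings are placeholders, and the launch-detection tooling itself is described in Appendix 2 – Methodology Description.

```python
# Combine one cold (initial) launch with the mean of the warm (cached) launches.
def ui_launch_result(initial_ms, subsequent_ms):
    warm_average = sum(subsequent_ms) / len(subsequent_ms)
    return (initial_ms + warm_average) / 2

# Placeholder timings: one cold launch and four cached launches (milliseconds).
print(ui_launch_result(512.0, [280.0, 295.0, 288.0, 301.0]))  # -> 401.5
```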

Benchmark 4 – Memory Usage during System Idle

This metric measures the amount of memory (RAM) used by the product while the machine and antivirus software are in an idle state. The total memory usage was calculated by identifying all antivirus software processes and the amount of memory used by each process.

The amount of memory used while the machine is idle provides a good indication of the amount of system resources being consumed by the antivirus software on a permanent basis. Better performing products occupy less memory while the machine is idle.
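A minimal sketch of this kind of sampling is shown below. It assumes the third-party psutil package and a known list of process names for the product under test; both are assumptions made for illustration, and the report's own measurement tooling is described in Appendix 2.

```python
import time
import psutil  # third-party; pip install psutil

def idle_memory_mb(process_names, samples=10, interval_s=60):
    """Average memory (MB) used by the named processes over several idle snapshots."""
    totals = []
    for _ in range(samples):
        total_bytes = 0
        for proc in psutil.process_iter(["name", "memory_info"]):
            mem = proc.info["memory_info"]
            if proc.info["name"] in process_names and mem is not None:
                total_bytes += mem.rss
        totals.append(total_bytes / (1024 * 1024))
        time.sleep(interval_s)
    return sum(totals) / len(totals)

# Hypothetical process names for the product being measured:
# print(idle_memory_mb({"productservice.exe", "producttray.exe"}))
```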

Benchmark 5 – Browse Time

It is common behavior for security products to scan data for malware as it is downloaded from the internet or intranet. This behavior may negatively impact browsing speed as products scan web content for malware. This metric measures the time taken for a set of popular internet sites to load consecutively from a local server in the user’s browser window.


Benchmark 6 – Internet Explorer Launch Time

This metric is one of many methods to objectively measure how much a security product impacts on the responsiveness of the system. This metric measures the amount of time it takes to launch the user interface of Internet Explorer 8. To allow for caching effects by the operating system, both the initial launch time and the subsequent launch times were measured. Our final result is an average of these two measurements.

Benchmark 7 – Installation Time

The speed and ease of the installation process will strongly influence the user’s first impression of the antivirus software. This test measures the minimum installation time required by the antivirus software to be fully functional and ready for use by the end user. Lower installation times represent antivirus products which are quicker for a user to install.

Benchmark 8 – Installation Size

In offering new features and functionality to users, antivirus software products tend to increase in size with each new release. Although new technologies push the size limits of hard drives each year, the growing disk space requirements of common applications and the increasing popularity of large media files (such as movies, photos and music) ensure that a product's installation size will remain of interest to home users.

This metric aims to measure a product’s total installation size. This metric is defined as the total disk space consumed by all new files added during a product's installation.
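One way to approximate this measurement is a before/after snapshot of the file system: record every file and its size prior to installation, then sum the sizes of files that are new afterwards. The sketch below illustrates that idea; the scanned root and any exclusions are assumptions, and the report's actual method is described in Appendix 2.

```python
import os

def snapshot(root):
    """Map every file path under root to its size in bytes."""
    sizes = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                sizes[path] = os.path.getsize(path)
            except OSError:
                pass  # locked or vanished file; skip it
    return sizes

def installation_size_mb(before, after):
    """Total size of files that exist after installation but did not exist before."""
    new_bytes = sum(size for path, size in after.items() if path not in before)
    return new_bytes / (1024 * 1024)

# Usage: before = snapshot("C:\\"); <install the product>; after = snapshot("C:\\")
# print(installation_size_mb(before, after))
```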

Benchmark 9 – Registry Keys Added

A large registry increases a machine’s use of resources. This may negatively impact system performance, especially on much older machines. This test measures the number of keys and values added to the registry after a successful product installation and a subsequent reboot of the test machine. Lower numbers mean that a product has added fewer keys during installation and had less impact on the registry.
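For illustration only, the sketch below counts keys and values beneath a registry hive with Python's standard winreg module; differencing a pre-installation count against a post-installation (post-reboot) count gives a comparable figure. As noted in the Revision History, the published results in this edition were produced with OSForensics rather than a script like this.

```python
import winreg

def count_keys_and_values(hive, subkey=""):
    """Recursively count subkeys and values beneath hive\\subkey."""
    total_keys, total_values = 0, 0
    try:
        with winreg.OpenKey(hive, subkey) as key:
            n_subkeys, n_values, _modified = winreg.QueryInfoKey(key)
            total_values += n_values
            for i in range(n_subkeys):
                child = winreg.EnumKey(key, i)
                child_path = f"{subkey}\\{child}" if subkey else child
                k, v = count_keys_and_values(hive, child_path)
                total_keys += 1 + k
                total_values += v
    except OSError:
        pass  # permission denied, or key changed while walking
    return total_keys, total_values

# Example: keys, values = count_keys_and_values(winreg.HKEY_LOCAL_MACHINE, "SOFTWARE")
```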

Benchmark 10 – File Copy, Move and Delete

This metric measures the amount of time taken to move, copy and delete a sample set of files. The sample file set contains several types of file formats that a Windows user would encounter in daily use. These formats include documents (e.g. Microsoft Office documents, Adobe PDF, Zip files, etc.), media formats (e.g. images, movies and music) and system files (e.g. executables, libraries, etc.).
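A minimal timing sketch for this kind of pass, using only the Python standard library, is shown below; the directories are placeholders, and the actual file set and repetition counts are described in Appendix 2.

```python
import shutil
import time
from pathlib import Path

def copy_move_delete_seconds(source_dir, work_dir):
    """Time one copy / move / delete pass over a sample directory tree."""
    work = Path(work_dir)
    start = time.perf_counter()
    shutil.copytree(source_dir, work / "copied")            # copy
    shutil.move(str(work / "copied"), str(work / "moved"))  # move
    shutil.rmtree(work / "moved")                           # delete
    return time.perf_counter() - start

# Example (placeholder paths):
# print(copy_move_delete_seconds(r"D:\SampleFiles", r"C:\Bench"))
```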

Benchmark 11 – Installation of Third Party Applications

This metric measures the amount of time taken to install and uninstall third party programs. The installation speed of third party applications may be impacted by antivirus behavior such as heuristics or real time malware scanning.

Benchmark 12 – Network Throughput

This metric measures the amount of time taken to download a variety of files from a local server using the HyperText Transfer Protocol (HTTP), which is the main protocol used on the web for browsing, linking and data transfer. Files used in this test include file formats that users would typically download from the web, such as images, archives, music files and movie files.
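The sketch below times a sequence of HTTP downloads with the standard library; the server address and file names are placeholders for the local test server and sample set described in Appendix 2.

```python
import time
import urllib.request

def download_seconds(urls):
    """Time how long it takes to download every URL in the list."""
    start = time.perf_counter()
    for url in urls:
        with urllib.request.urlopen(url) as response:
            while response.read(1024 * 1024):  # drain the body in 1 MB chunks
                pass
    return time.perf_counter() - start

sample_urls = [
    "http://local-server/files/archive.zip",  # placeholder file list
    "http://local-server/files/video.avi",
    "http://local-server/files/song.mp3",
]
# print(download_seconds(sample_urls))
```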


Benchmark 13 – File Format Conversion

This test measures the amount of time taken to convert an MP3 file to WAV format and then convert the same MP3 file to WMA format.
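The report does not name the converter it used, so the sketch below simply times two conversions through a command-line encoder launched via subprocess; ffmpeg is used purely as an assumption to make the example runnable.

```python
import subprocess
import time

def conversion_seconds(mp3_path):
    """Time an MP3 -> WAV conversion followed by an MP3 -> WMA conversion."""
    start = time.perf_counter()
    subprocess.run(["ffmpeg", "-y", "-i", mp3_path, "out.wav"], check=True)
    subprocess.run(["ffmpeg", "-y", "-i", mp3_path, "out.wma"], check=True)
    return time.perf_counter() - start

# print(conversion_seconds("sample.mp3"))
```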

Benchmark 14 – File Compression and Decompression

This metric measures the amount of time taken to compress and decompress different types of files. File formats used in this test included documents, movies and images.
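As an illustration, the sketch below times a compress-and-extract cycle with Python's zipfile module; the ZIP format and the file list are assumptions, since the report does not state which archiver was used.

```python
import time
import zipfile
from pathlib import Path

def compress_decompress_seconds(files, archive="bench.zip", out_dir="extracted"):
    """Time compressing a list of files into a ZIP archive and extracting it again."""
    start = time.perf_counter()
    with zipfile.ZipFile(archive, "w", compression=zipfile.ZIP_DEFLATED) as zf:
        for path in files:
            zf.write(path, arcname=Path(path).name)  # compression
    with zipfile.ZipFile(archive) as zf:
        zf.extractall(out_dir)                       # decompression
    return time.perf_counter() - start

# print(compress_decompress_seconds(["report.docx", "movie.avi", "photo.jpg"]))
```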

Benchmark 15 – File Write, Open and Close

This benchmark was derived from Oli Warner’s File I/O test at http://www.thepcspy.com (please see Reference #1: What Really Slows Windows Down). This metric measures the amount of time taken to write a file, then open and close that file.
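In the spirit of that test, the sketch below writes a file once and then opens and closes it repeatedly (the result charts later in this report use 180,000 open/close operations). It is an illustration of the idea, not the harness PassMark used.

```python
import time

def write_open_close_seconds(path="bench.tmp", iterations=180_000):
    """Write a small file once, then open and close it repeatedly."""
    start = time.perf_counter()
    with open(path, "w") as f:
        f.write("x" * 4096)           # write the file
    for _ in range(iterations):       # then open and close it repeatedly
        with open(path, "r"):
            pass
    return time.perf_counter() - start

# print(write_open_close_seconds())
```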

Benchmark 16 – PE Scan Time

All antivirus solutions have functionality designed to detect viruses and various other forms of malware by scanning files on the system. This metric measured the amount of time required to scan a set of PE (Portable Executable) files. Our sample file set comprised a total file size of 2.03GB and consisted of .exe (329MB), .dll (920MB) and .sys files (827MB).

Benchmark 17 – File Copy Disk To Disk

This test measures the amount of time taken to copy files between two local drives. The data set comprised a total file size of 5.44GB, and the formats used included documents, movies, images and executables.


Internet Security Software – Test Results

In the following charts, we have highlighted the results we obtained for Norton Internet Security 2012 in yellow. The average has also been highlighted in blue for ease of comparison.

Benchmark 1 – Boot Time

The following chart compares the average time taken for the system to boot (from a sample of five boots) for each Internet Security product tested. Products with lower boot times are considered better performing products in this category.

Webroot SecureAnywhere Essentials 2012 26.37

Trend Micro Virus Buster 2012 27.32

Avast! Internet Security 6 28.95

AVG Internet Security 2012 30.78

Trend Micro Titanium Internet Security 2012 30.80

Norton Internet Security 2012 31.73

Avira Internet Security 2012 32.61

SourceNEXTstyle ZERO 32.91

Kaspersky Internet Security 2012 34.10

Average 36.26

Panda Internet Security 2012 36.99

G Data Internet Security 2012 38.21

BitDefender Internet Security 2012 38.51

McAfee Internet Security 2012 82.12

(Chart values are in seconds.)

Benchmark 2 – Scan Time

The following chart compares the average time taken to scan a set of 6159 files (totaling 982 MB) for each Internet Security product tested. This time is calculated by averaging the initial (Run 1) and subsequent (Runs 2-5) scan times. Products with lower scan times are considered better performing products in this category.

G Data Internet Security 2012 22.50

Webroot SecureAnywhere Essentials 2012 22.63

Norton Internet Security 2012 23.25

AVG Internet Security 2012 25.38

McAfee Internet Security 2012 27.50

Trend Micro Titanium Internet Security 2012 36.21

Avast! Internet Security 6 36.75

Average 41.08

Avira Internet Security 2012 42.75

BitDefender Internet Security 2012 43.50

Trend Micro Virus Buster 2012 44.25

Kaspersky Internet Security 2012 45.58

Panda Internet Security 2012 75.79

SourceNEXTstyle ZERO 88.00

(Chart values are in seconds.)

Benchmark 3 – User Interface Launch Time

The following chart compares the average time taken to launch a product’s user interface. Products with lower launch times are considered better performing products in this category.

Trend Micro Virus Buster 2012 31.04

Trend Micro Titanium Internet Security 2012 32.63

SourceNEXTstyle ZERO 90.35

McAfee Internet Security 2012 127.77

G Data Internet Security 2012 241.30

Webroot SecureAnywhere Essentials 2012 289.99

Avast! Internet Security 6 305.06

Kaspersky Internet Security 2012 395.00

Average 523.33

Avira Internet Security 2012 789.20

BitDefender Internet Security 2012 790.14

Norton Internet Security 2012 991.78

AVG Internet Security 2012 1,082.47

Panda Internet Security 2012 1,636.60

(Chart values are in milliseconds.)

Benchmark 4 – Memory Usage during System Idle

The following chart compares the average amount of RAM in use by an Internet Security product during a period of system idle. This average is taken from a sample of ten memory snapshots taken roughly 60 seconds apart after reboot. Products with lower idle RAM usage are considered better performing products in this category.

Webroot SecureAnywhere Essentials 2012 2.70

Avast! Internet Security 6 16.93

Norton Internet Security 2012 24.04

Panda Internet Security 2012 40.30

BitDefender Internet Security 2012 44.37

Average 52.16

Avira Internet Security 2012 52.54

Kaspersky Internet Security 2012 55.03

AVG Internet Security 2012 56.50

Trend Micro Virus Buster 2012 62.49

McAfee Internet Security 2012 71.17

Trend Micro Titanium Internet Security 2012 80.26

SourceNEXTstyle ZERO 85.10

G Data Internet Security 2012 86.60

(Chart values are in megabytes.)

Benchmark 5 – Browse Time

The following chart compares the average time taken for Internet Explorer to successively load a set of popular websites through the local area network from a local server machine. Products with lower browse times are considered better performing products in this category.

Webroot SecureAnywhere Essentials 2012 35.66

Avast! Internet Security 6 43.09

Norton Internet Security 2012 44.47

Trend Micro Virus Buster 2012 47.37

Panda Internet Security 2012 47.65

Avira Internet Security 2012 47.76

Trend Micro Titanium Internet Security 2012 49.57

AVG Internet Security 2012 53.05

Kaspersky Internet Security 2012 60.04

Average 64.80

McAfee Internet Security 2012 84.65

SourceNEXTstyle ZERO 89.54

BitDefender Internet Security 2012 92.51

G Data Internet Security 2012 147.03

(Chart values are in seconds.)

Benchmark 6 – Internet Explorer Launch Time

The following chart compares the average launch times of Internet Explorer after rebooting the machine for each Internet Security product we tested. Products with lower launch times are considered better performing products in this category.

Avira Internet Security 2012 826.63

Webroot SecureAnywhere Essentials 2012 843.83

Norton Internet Security 2012 853.93

AVG Internet Security 2012 860.34

Kaspersky Internet Security 2012 889.90

McAfee Internet Security 2012 1,037.22

G Data Internet Security 2012 1,051.27

BitDefender Internet Security 2012 1,061.29

Avast! Internet Security 6 1,082.24

Average 1,104.90

Panda Internet Security 2012 1,414.76

Trend Micro Virus Buster 2012 1,431.90

SourceNEXTstyle ZERO 1,467.79

Trend Micro Titanium Internet Security 2012 1,542.66

(Chart values are in milliseconds.)

Benchmark 7 – Installation Time

The following chart compares the minimum installation time it takes for Internet Security products to be fully functional and ready for use by the end user. Products with lower installation times are considered better performing products in this category.

Webroot SecureAnywhere Essentials 2012 10.70

Norton Internet Security 2012 52.50

SourceNEXTstyle ZERO 64.90

Kaspersky Internet Security 2012 82.70

Panda Internet Security 2012 193.70

Trend Micro Titanium Internet Security 2012 208.14

Average 221.79

Avira Internet Security 2012 230.50

BitDefender Internet Security 2012 242.30

G Data Internet Security 2012 248.80

Avast! Internet Security 6 264.80

Trend Micro Virus Buster 2012 309.00

AVG Internet Security 2012 343.36

McAfee Internet Security 2012 631.90

(Chart values are in seconds.)

Benchmark 8 – Installation Size

The following chart compares the total size of files added during the installation of Internet Security products. Products with lower installation sizes are considered better performing products in this category.

Webroot SecureAnywhere Essentials 2012 7.13

Avira Internet Security 2012 172.18

SourceNEXTstyle ZERO 233.24

Norton Internet Security 2012 249.54

Panda Internet Security 2012 302.28

Avast! Internet Security 6 339.90

Trend Micro Virus Buster 2012 450.58

Average 456.44

Trend Micro Titanium Internet Security 2012 485.70

Kaspersky Internet Security 2012 520.30

McAfee Internet Security 2012 529.98

AVG Internet Security 2012 632.78

G Data Internet Security 2012 800.32

BitDefender Internet Security 2012 1,209.82

(Chart values are in megabytes.)

Benchmark 9 – Registry Keys Added

The following chart compares the amount of Registry Keys created during product installation for each Internet Security product tested. Products with lower key counts are considered better performing products in this category.

Webroot SecureAnywhere Essentials 2012 110

Avira Internet Security 2012 3,102

Trend Micro Titanium Internet Security 2012 4,353

Norton Internet Security 2012 4,698

Trend Micro Virus Buster 2012 4,760

SourceNEXTstyle ZERO 5,737

AVG Internet Security 2012 5,865

Kaspersky Internet Security 2012 6,478

BitDefender Internet Security 2012 6,613

Average 7,064

Panda Internet Security 2012 7,690

Avast! Internet Security 6 8,121

G Data Internet Security 2012 9,120

McAfee Internet Security 2012 20,079

(Chart values are registry key counts.)

Benchmark 10 – File Copy, Move and Delete

The following chart compares the average time taken to copy, move and delete several sets of sample files for each Internet Security product tested. Products with lower times are considered better performing products in this category.

G Data Internet Security 2012 24.27

Webroot SecureAnywhere Essentials 2012 24.37

Norton Internet Security 2012 24.57

Avast! Internet Security 6 24.66

Trend Micro Virus Buster 2012 25.93

Kaspersky Internet Security 2012 26.22

Trend Micro Titanium Internet Security 2012 26.74

Average 29.11

Panda Internet Security 2012 30.01

McAfee Internet Security 2012 30.65

BitDefender Internet Security 2012 32.66

AVG Internet Security 2012 33.33

SourceNEXTstyle ZERO 34.71

Avira Internet Security 2012 40.33

(Chart values are in seconds.)

Benchmark 11 – Installation of Third Party Applications

The following chart compares the average time taken to install three different third party applications for each Internet Security product tested. Products with lower times are considered better performing products in this category.

Trend Micro Virus Buster 2012 94.35

Avira Internet Security 2012 97.21

Panda Internet Security 2012 97.98

SourceNEXTstyle ZERO 98.04

G Data Internet Security 2012 100.05

Webroot SecureAnywhere Essentials 2012 100.08

Average 103.20

Trend Micro Titanium Internet Security 2012 104.55

Avast! Internet Security 6 106.03

Norton Internet Security 2012 107.03

McAfee Internet Security 2012 107.09

Kaspersky Internet Security 2012 107.53

AVG Internet Security 2012 110.02

BitDefender Internet Security 2012 111.69

(Chart values are in seconds.)

Benchmark 12 – Network Throughput

The following chart compares the average time to download a sample set of common file types for each Internet Security product tested. Products with lower times are considered better performing products in this category.

SourceNEXTstyle ZERO 10.47

Webroot SecureAnywhere Essentials 2012 10.56

Avast! Internet Security 6 10.70

Trend Micro Virus Buster 2012 10.89

Trend Micro Titanium Internet Security 2012 12.20

McAfee Internet Security 2012 13.23

Norton Internet Security 2012 13.34

BitDefender Internet Security 2012 14.75

Average 17.90

AVG Internet Security 2012 26.26

G Data Internet Security 2012 26.56

Panda Internet Security 2012 32.17

Avira Internet Security 2012 33.73

(Chart values are in seconds.)

*Results for Kaspersky Internet Security 2012 could not be obtained as the test was blocked by the software.


Benchmark 13 – File Format Conversion

The following chart compares the average time it takes for five sample files to be converted from one file format to another (MP3  WMA, MP3  WAV) for each Internet Security product tested. Products with lower times are considered better performing products in this category.

G Data Internet Security 2012 75.69

Avira Internet Security 2012 75.73

Kaspersky Internet Security 2012 75.74

AVG Internet Security 2012 75.84

Panda Internet Security 2012 75.99

Norton Internet Security 2012 76.07

McAfee Internet Security 2012 76.20

Webroot SecureAnywhere Essentials 2012 76.32

Average 76.40

SourceNEXTstyle ZERO 76.74

Trend Micro Virus Buster 2012 76.91

Trend Micro Titanium Internet Security 2012 77.10

Avast! Internet Security 6 77.13

BitDefender Internet Security 2012 77.74

(Chart values are in seconds.)

Benchmark 14 – File Compression and Decompression

The following chart compares the average time it takes for sample files to be compressed and decompressed for each Internet Security product tested. Products with lower times are considered better performing products in this category.

AVG Internet Security 2012 78.59

Webroot SecureAnywhere Essentials 2012 79.45

Norton Internet Security 2012 81.66

Avira Internet Security 2012 81.97

G Data Internet Security 2012 82.37

Kaspersky Internet Security 2012 82.40

Avast! Internet Security 6 84.44

McAfee Internet Security 2012 85.32

Average 86.60

BitDefender Internet Security 2012 89.29

SourceNEXTstyle ZERO 92.57

Trend Micro Virus Buster 2012 92.68

Trend Micro Titanium Internet Security 2012 95.49

Panda Internet Security 2012 99.59

(Chart values are in seconds.)

Benchmark 15 – File Write, Open and Close

The following chart compares the average time it takes for a file to be written to the hard drive then opened and closed 180,000 times, for each Internet Security product tested. Products with lower times are considered better performing products in this category.

Webroot SecureAnywhere Essentials 2012 15.95

AVG Internet Security 2012 17.13

Kaspersky Internet Security 2012 26.46

Panda Internet Security 2012 26.64

Norton Internet Security 2012 26.83

Avira Internet Security 2012 27.33

Avast! Internet Security 6 29.97

Trend Micro Virus Buster 2012 59.29

McAfee Internet Security 2012 64.80

Trend Micro Titanium Internet Security 2012 66.36

Average 274.13

BitDefender Internet Security 2012 697.21

SourceNEXTstyle ZERO 1,054.84

G Data Internet Security 2012 1,450.85

(Chart values are in seconds.)

Benchmark 16 – PE Scan Time

The following chart compares the average time taken to scan a set of 6351 portable executable files (totaling 2076 MB) for each Internet Security product tested. This time is calculated by averaging the initial (Run 1) and subsequent (Runs 2-5) scan times. Products with lower scan times are considered better performing products in this category.

Kaspersky Internet Security 2012 30.95

Trend Micro Virus Buster 2012 44.25

Norton Internet Security 2012 58.00

Webroot SecureAnywhere Essentials 2012 58.00

Trend Micro Titanium Internet Security 2012 65.24

G Data Internet Security 2012 69.88

McAfee Internet Security 2012 85.25

Average 92.38

AVG Internet Security 2012 95.13

Avira Internet Security 2012 95.38

BitDefender Internet Security 2012 106.63

Avast! Internet Security 6 153.88

Panda Internet Security 2012 168.74

SourceNEXTstyle ZERO 169.61

(Chart values are in seconds.)

Benchmark 17 – File Copy Disk to Disk

The following chart compares the average time taken to copy a total of 8,501 files, with a total file size of 5.44 GB, from one local drive to another local drive for each Internet Security product tested. The test was performed 5 times, and the average of all 5 runs was taken as the result. Products with lower times are considered better performing products in this category.

Avast! Internet Security 6 106.02

Webroot SecureAnywhere Essentials 2012 114.30

Trend Micro Virus Buster 2012 116.24

Norton Internet Security 2012 116.27

AVG Internet Security 2012 120.10

Kaspersky Internet Security 2012 121.42

Panda Internet Security 2012 123.42

Trend Micro Titanium Internet Security 2012 125.13

Average 128.67

Avira Internet Security 2012 128.78

BitDefender Internet Security 2012 134.47

G Data Internet Security 2012 138.07

McAfee Internet Security 2012 153.42

SourceNEXTstyle ZERO 175.11

(Chart values are in seconds.)

Antivirus Software – Test Results

In the following charts, we have highlighted the results we obtained for Norton AntiVirus 2012 in yellow. The average has also been highlighted in blue for ease of comparison.

Benchmark 1 – Boot Time

The following chart compares the average time taken for the system to boot (from a sample of five boots) for each Antivirus product tested. Products with lower boot times are considered better performing products in this category.

Microsoft Security Essentials 28.07

Webroot SecureAnywhere AntiVirus 2012 29.14

Norton AntiVirus 2012 29.21

Avast! Pro Antivirus 6 29.84

Trend Micro Titanium 2012 30.56

Panda Cloud Antivirus Free 32.15

McAfee Antivirus 2012 32.15

Avira Free Antivirus 32.34

Avast! Free Antivirus 6 32.60

G Data Antivirus 2012 33.18

Average 33.32

Kaspersky Anti-Virus 2012 34.73

Avira Antivirus Premium 2012 34.95

Quick Heal AntiVirus 2012 35.00

Panda Antivirus Pro 2012 35.31

AVG Anti-Virus Free 2012 36.64

PCTools Spyware Doctor with Antivirus 9 39.91

AVG Anti-Virus 2012 40.67

(Chart values are in seconds.)

Benchmark 2 – Scan Time

The following chart compares the average time taken to scan a set of 6159 files (totaling 982 MB) for each Antivirus product tested. This time is calculated by averaging the initial (Run 1) and subsequent (Runs 2-5) scan times. Products with lower scan times are considered better performing products in this category.

Webroot SecureAnywhere AntiVirus 2012 20.38

G Data Antivirus 2012 21.88

Norton AntiVirus 2012 24.88

AVG Anti-Virus Free 2012 26.75

McAfee Antivirus 2012 30.33

AVG Anti-Virus 2012 31.00

PCTools Spyware Doctor with Antivirus 9 36.38

Avast! Pro Antivirus 6 37.00

Avira Free Antivirus 43.88

Avira Antivirus Premium 2012 43.88

Kaspersky Anti-Virus 2012 45.38

Average 45.44

Quick Heal AntiVirus 2012 47.50

Trend Micro Titanium 2012 49.75

Avast! Free Antivirus 6 51.88

Panda Antivirus Pro 2012 71.50

Panda Cloud Antivirus Free 73.14

Microsoft Security Essentials 116.96

(Chart values are in seconds.)

Benchmark 3 – User Interface Launch Time

The following chart compares the average time taken to launch a product’s user interface. Products with lower launch times are considered better performing products in this category.

Trend Micro Titanium 2012 37.27

Microsoft Security Essentials 95.04

G Data Antivirus 2012 236.95

Avast! Pro Antivirus 6 240.39

Webroot SecureAnywhere AntiVirus 2012 289.24

Avast! Free Antivirus 6 334.22

Kaspersky Anti-Virus 2012 345.00

Quick Heal AntiVirus 2012 440.57

PCTools Spyware Doctor with Antivirus 9 521.84

McAfee Antivirus 2012 557.51

Avira Free Antivirus 734.62

Avira Antivirus Premium 2012 801.67

Average 879.91

Norton AntiVirus 2012 907.21

AVG Anti-Virus 2012 1,346.08

AVG Anti-Virus Free 2012 1,902.89

Panda Antivirus Pro 2012 2,334.58

Panda Cloud Antivirus Free 3,833.45

(Chart values are in milliseconds.)

Benchmark 4 – Memory Usage during System Idle

The following chart compares the average amount of RAM in use by an Antivirus product during a period of system idle. This average is taken from a sample of ten memory snapshots taken roughly 60 seconds apart after reboot. Products with lower idle RAM usage are considered better performing products in this category.

Webroot SecureAnywhere AntiVirus 2012 2.08

Avast! Free Antivirus 6 7.00

Avast! Pro Antivirus 6 8.75

Panda Cloud Antivirus Free 22.50

Norton AntiVirus 2012 24.45

PCTools Spyware Doctor with Antivirus 9 27.62

Avira Free Antivirus 29.19

AVG Anti-Virus Free 2012 38.30

Avira Antivirus Premium 2012 41.61

Average 42.11

AVG Anti-Virus 2012 42.75

Kaspersky Anti-Virus 2012 55.03

Trend Micro Titanium 2012 55.03

Panda Antivirus Pro 2012 55.77

G Data Antivirus 2012 56.80

McAfee Antivirus 2012 68.57

Microsoft Security Essentials 83.88

Quick Heal AntiVirus 2012 96.50

(Chart values are in megabytes.)

Benchmark 5 – Browse Time

The following chart compares the average time taken for Internet Explorer to successively load a set of popular websites through the local area network from a local server machine. Products with lower browse times are considered better performing products in this category.

Webroot SecureAnywhere AntiVirus 2012 39.34

Microsoft Security Essentials 39.57

Norton AntiVirus 2012 40.27

Avira Free Antivirus 41.35

Avast! Pro Antivirus 6 44.07

Panda Antivirus Pro 2012 44.80

Avast! Free Antivirus 6 47.50

Trend Micro Titanium 2012 48.64

AVG Anti-Virus Free 2012 50.28

AVG Anti-Virus 2012 54.53

Kaspersky Anti-Virus 2012 57.87

PCTools Spyware Doctor with Antivirus 9 60.13

Average 70.67

Panda Cloud Antivirus Free 78.28

McAfee Antivirus 2012 85.52

Quick Heal AntiVirus 2012 87.70

G Data Antivirus 2012 144.07

Avira Antivirus Premium 2012 237.38

(Chart values are in seconds.)

Benchmark 6 – Internet Explorer Launch Time

The following chart compares the average launch times of Internet Explorer after rebooting the machine for each Antivirus product we tested. Products with lower launch times are considered better performing products in this category.

Quick Heal AntiVirus 2012 812.60

AVG Anti-Virus Free 2012 872.89

Kaspersky Anti-Virus 2012 889.12

Webroot SecureAnywhere AntiVirus 2012 896.06

Avira Free Antivirus 913.33

PCTools Spyware Doctor with Antivirus 9 934.80

Panda Cloud Antivirus Free 966.04

Avast! Pro Antivirus 6 1,004.46

Trend Micro Titanium 2012 1,020.85

Microsoft Security Essentials 1,025.66

AVG Anti-Virus 2012 1,074.64

Average 1,076.46

G Data Antivirus 2012 1,092.61

Norton AntiVirus 2012 1,233.80

Panda Antivirus Pro 2012 1,300.87

Avira Antivirus Premium 2012 1,337.53

Avast! Free Antivirus 6 1,350.79

McAfee Antivirus 2012 1,573.84

(Chart values are in milliseconds.)

Benchmark 7 – Installation Time

The following chart compares the minimum installation time it takes for Antivirus products to be fully functional and ready for use by the end user. Products with lower installation times are considered better performing products in this category.

Webroot SecureAnywhere AntiVirus 2012 7.70

Panda Cloud Antivirus Free 47.60

Norton AntiVirus 2012 52.50

Avast! Free Antivirus 6 59.60

Panda Antivirus Pro 2012 74.40

Quick Heal AntiVirus 2012 74.90

Kaspersky Anti-Virus 2012 76.00

Avast! Pro Antivirus 6 113.00

Trend Micro Titanium 2012 115.20

Avira Antivirus Premium 2012 120.60

Average 156.79

AVG Anti-Virus 2012 189.50

G Data Antivirus 2012 201.70

Avira Free Antivirus 211.00

Microsoft Security Essentials 232.20

AVG Anti-Virus Free 2012 268.20

PCTools Spyware Doctor with Antivirus 9 313.80

McAfee Antivirus 2012 507.60

(Chart values are in seconds.)

Benchmark 8 – Installation Size

The following chart compares the total size of files added during the installation of Antivirus products. Products with lower installation sizes are considered better performing products in this category.

Webroot SecureAnywhere AntiVirus 2012 5.88

Microsoft Security Essentials 104.53

Avira Antivirus Premium 2012 168.67

Panda Cloud Antivirus Free 179.45

Avast! Free Antivirus 6 204.72

Avira Free Antivirus 219.79

Norton AntiVirus 2012 231.27

Trend Micro Titanium 2012 280.40

Avast! Pro Antivirus 6 308.67

AVG Anti-Virus Free 2012 349.65

Panda Antivirus Pro 2012 384.90

Average 392.35

McAfee Antivirus 2012 435.28

Kaspersky Anti-Virus 2012 474.75

AVG Anti-Virus 2012 572.35

PCTools Spyware Doctor with Antivirus 9 795.82

Quick Heal AntiVirus 2012 924.41

G Data Antivirus 2012 1,029.40

(Chart values are in megabytes.)

Benchmark 9 – Registry Keys Added

The following chart compares the amount of Registry Keys created during product installation for each Antivirus product tested. Products with lower key counts are considered better performing products in this category.

Webroot SecureAnywhere AntiVirus 2012 434

Avast! Pro Antivirus 6 2,137

Avast! Free Antivirus 6 2,311

Avira Antivirus Premium 2012 2,427

Panda Cloud Antivirus Free 2,532

Avira Free Antivirus 3,035

Microsoft Security Essentials 3,273

Quick Heal AntiVirus 2012 3,656

Trend Micro Titanium 2012 3,896

Norton AntiVirus 2012 4,395

Panda Antivirus Pro 2012 5,520

AVG Anti-Virus Free 2012 5,546

AVG Anti-Virus 2012 5,785

Kaspersky Anti-Virus 2012 5,909

Average 6,416

G Data Antivirus 2012 7,289

McAfee Antivirus 2012 16,414

PCTools Spyware Doctor with Antivirus 9 34,516

(Chart values are registry key counts.)

Benchmark 10 – File Copy, Move and Delete

The following chart compares the average time taken to copy, move and delete several sets of sample files for each Antivirus product tested. Products with lower times are considered better performing products in this category.

Webroot SecureAnywhere AntiVirus 2012 24.07

Norton AntiVirus 2012 24.13

Avast! Pro Antivirus 6 24.35

Kaspersky Anti-Virus 2012 25.88

Trend Micro Titanium 2012 26.57

G Data Antivirus 2012 26.86

PCTools Spyware Doctor with Antivirus 9 27.53

Panda Cloud Antivirus Free 28.72

Quick Heal AntiVirus 2012 29.41

Average 29.71

Avast! Free Antivirus 6 29.81

AVG Anti-Virus 2012 30.52

McAfee Antivirus 2012 30.54

Panda Antivirus Pro 2012 31.52

AVG Anti-Virus Free 2012 32.85

Microsoft Security Essentials 33.94

Avira Free Antivirus 38.73

Avira Antivirus Premium 2012 39.72

(Chart values are in seconds.)

Benchmark 11 – Installation of Third Party Applications

The following chart compares the average time taken to install a third party application for each Antivirus product tested. Products with lower times are considered better performing products in this category.

Avira Free Antivirus 96.18

Avira Antivirus Premium 2012 97.21

Panda Antivirus Pro 2012 98.24

Trend Micro Titanium 2012 98.40

G Data Antivirus 2012 98.43

PCTools Spyware Doctor with Antivirus 9 100.01

Webroot SecureAnywhere AntiVirus 2012 100.32

Avast! Pro Antivirus 6 103.83

Norton AntiVirus 2012 104.61

McAfee Antivirus 2012 106.91

Panda Cloud Antivirus Free 107.00

Average 107.07

Avast! Free Antivirus 6 108.37

Quick Heal AntiVirus 2012 113.23

Kaspersky Anti-Virus 2012 113.59

AVG Anti-Virus Free 2012 118.27

AVG Anti-Virus 2012 127.75

Microsoft Security Essentials 127.84

(Chart values are in seconds.)

Benchmark 12 – Network Throughput

The following chart compares the average time to download a sample set of common file types for each Antivirus product tested. Products with lower times are considered better performing products in this category.

*Results for Kaspersky Anti-Virus 2012 could not be obtained as the test was blocked by the software.

PCTools Spyware Doctor with Antivirus 9 9.48

Panda Cloud Antivirus Free 10.18

Webroot SecureAnywhere AntiVirus 2012 10.27

Microsoft Security Essentials 10.59

Avast! Pro Antivirus 6 10.76

Quick Heal AntiVirus 2012 11.18

Trend Micro Titanium 2012 11.48

Avast! Free Antivirus 6 12.13

Avira Free Antivirus 12.37

Norton AntiVirus 2012 12.94

AVG Anti-Virus Free 2012 13.68

Average 14.16

AVG Anti-Virus 2012 15.30

Avira Antivirus Premium 2012 16.37

McAfee Antivirus 2012 16.42

G Data Antivirus 2012 26.06

Panda Antivirus Pro 2012 27.35

(Chart values are in seconds.)

Benchmark 13 – File Format Conversion

The following chart compares the average time it takes for five sample files to be converted from one file format to another (MP3  WMA, MP3  WAV) for each Antivirus product tested. Products with lower times are considered better performing products in this category.

Kaspersky Anti-Virus 2012 74.92

Avira Antivirus Premium 2012 75.65

Panda Cloud Antivirus Free 75.86

Avira Free Antivirus 75.91

Norton AntiVirus 2012 75.93

Webroot SecureAnywhere AntiVirus 2012 75.95

Microsoft Security Essentials 76.07

PCTools Spyware Doctor with Antivirus 9 76.16

AVG Anti-Virus 2012 76.19

Panda Antivirus Pro 2012 76.24

Average 76.37

AVG Anti-Virus Free 2012 76.73

Trend Micro Titanium 2012 76.89

Quick Heal AntiVirus 2012 76.92

G Data Antivirus 2012 76.96

Avast! Pro Antivirus 6 77.02

Avast! Free Antivirus 6 77.32

McAfee Antivirus 2012 77.63

(Chart values are in seconds.)

Benchmark 14 – File Compression and Decompression

The following chart compares the average time it takes for sample files to be compressed and decompressed for each Antivirus product tested. Products with lower times are considered better performing products in this category.

PCTools Spyware Doctor with Antivirus 9 79.14

Webroot SecureAnywhere AntiVirus 2012 79.32

Norton AntiVirus 2012 81.12

Kaspersky Anti-Virus 2012 81.25

AVG Anti-Virus 2012 82.44

AVG Anti-Virus Free 2012 82.84

Quick Heal AntiVirus 2012 84.12

Avast! Pro Antivirus 6 84.20

Microsoft Security Essentials 84.79

G Data Antivirus 2012 84.88

Avast! Free Antivirus 6 85.30

Average 85.72

Panda Cloud Antivirus Free 86.62

McAfee Antivirus 2012 88.10

Avira Free Antivirus 89.50

Avira Antivirus Premium 2012 89.56

Trend Micro Titanium 2012 95.17

Panda Antivirus Pro 2012 98.94

(Chart values are in seconds.)

Benchmark 15 – File Write, Open and Close

The following chart compares the average time it takes for a file to be written to the hard drive then opened and closed 180,000 times, for each Antivirus product tested. Products with lower times are considered better performing products in this category.

AVG Anti-Virus Free 2012 17.26

AVG Anti-Virus 2012 18.07

PCTools Spyware Doctor with Antivirus 9 22.65

Webroot SecureAnywhere AntiVirus 2012 22.67

Kaspersky Anti-Virus 2012 23.60

Quick Heal AntiVirus 2012 24.05

Avira Antivirus Premium 2012 26.31

Norton AntiVirus 2012 26.78

Panda Antivirus Pro 2012 27.13

Avira Free Antivirus 27.66

McAfee Antivirus 2012 29.19

Avast! Free Antivirus 6 30.94

Avast! Pro Antivirus 6 31.09

Panda Cloud Antivirus Free 31.68

Trend Micro Titanium 2012 33.67

Average 121.32

Microsoft Security Essentials 238.23

G Data Antivirus 2012 1,431.52

(Chart values are in seconds.)

Benchmark 16 – PE Scan Time

The following chart compares the average time taken to scan a set of 6351 portable executable files (totaling 2076 MB) for each Antivirus product tested. This time is calculated by averaging the initial (Run 1) and subsequent (Runs 2-5) scan times. Products with lower scan times are considered better performing products in this category.

PCTools Spyware Doctor with Antivirus 9 15.88

Norton AntiVirus 2012 22.88

Kaspersky Anti-Virus 2012 33.63

Quick Heal AntiVirus 2012 54.63

Trend Micro Titanium 2012 65.89

Webroot SecureAnywhere AntiVirus 2012 67.88

G Data Antivirus 2012 70.13

AVG Anti-Virus Free 2012 78.38

AVG Anti-Virus 2012 81.63

McAfee Antivirus 2012 84.05

Average 90.11

Avira Antivirus Premium 2012 95.13

Avira Free Antivirus 97.00

Panda Cloud Antivirus Free 138.39

Avast! Pro Antivirus 6 152.50

Panda Antivirus Pro 2012 155.13

Avast! Free Antivirus 6 156.13

Microsoft Security Essentials 162.73

(Chart values are in seconds.)

Benchmark 17 – File Copy Disk to Disk

The following chart compares the average time taken to copy a set of files from one local drive to another local drive for each Antivirus product tested. The test was performed 5 times, and the average of all 5 runs was taken as the result. Products with lower times are considered better performing products in this category.

Trend Micro Titanium 2012 106.79

Microsoft Security Essentials 111.97

Webroot SecureAnywhere AntiVirus 2012 115.08

Norton AntiVirus 2012 116.90

Kaspersky Anti-Virus 2012 120.42

Avast! Pro Antivirus 6 121.35

Avast! Free Antivirus 6 122.79

Avira Free Antivirus 126.74

AVG Anti-Virus Free 2012 126.74

Panda Antivirus Pro 2012 126.80

Average 127.55

PCTools Spyware Doctor with Antivirus 9 129.01

Avira Antivirus Premium 2012 130.42

AVG Anti-Virus 2012 131.61

Quick Heal AntiVirus 2012 134.52

G Data Antivirus 2012 136.22

McAfee Antivirus 2012 155.25

Panda Cloud Antivirus Free 155.82

(Chart values are in seconds.)

Total Security Software – Test Results

In the following charts, we have highlighted the results we obtained for Norton 360 v6 in yellow. The average has also been highlighted in blue for ease of comparison.

Benchmark 1 – Boot Time

The following chart compares the average time taken for the system to boot (from a sample of five boots) for each Total Security product tested. Products with lower boot times are considered better performing products in this category.

Trend Micro Titanium Maximum Security 2012 30.75

McAfee Total Protection 2012 35.23

Average 36.14

Norton 360 v6 36.28

Kaspersky PURE 9 38.22

BitDefender Total Security 2012 40.24

(Chart values are in seconds.)

Benchmark 2 – Scan Time

The following chart compares the average time taken to scan a set of 6159 files (totaling 982 MB) for each Total Security product tested. This time is calculated by averaging the initial (Run 1) and subsequent (Runs 2-5) scan times. Products with lower scan times are considered better performing products in this category.

McAfee Total Protection 2012 19.18

Norton 360 v6 24.50

Average 36.37

BitDefender Total Security 2012 39.13

Trend Micro Titanium Maximum Security 2012 42.57

Kaspersky PURE 9 56.50

(Chart values are in seconds.)

Benchmark 3 – User Interface Launch Time

The following chart compares the average time taken to launch a product’s user interface. Products with lower launch times are considered better performing products in this category.

McAfee Total Protection 2012 28.70

Trend Micro Titanium Maximum Security 2012 247.11

Average 567.98

Norton 360 v6 630.84

BitDefender Total Security 2012 748.64

Kaspersky PURE 9 1,184.62

(Chart values are in milliseconds.)

Benchmark 4 – Memory Usage during System Idle

The following chart compares the average amount of RAM in use by a Total Security product during a period of system idle. This average is taken from a sample of ten memory snapshots taken roughly 60 seconds apart after reboot. Products with lower idle RAM usage are considered better performing products in this category.

Norton 360 v6 24.64

Kaspersky PURE 9 34.71

Average 60.02

BitDefender Total Security 2012 79.23

McAfee Total Protection 2012 80.68

Trend Micro Titanium Maximum Security 2012 80.84

(Chart values are in megabytes.)

Benchmark 5 – Browse Time

The following chart compares the average time taken for Internet Explorer to successively load a set of popular websites through the local area network from a local server machine. Products with lower browse times are considered better performing products in this category.

Trend Micro Titanium Maximum Security 2012 42.60 s
Norton 360 v6 45.80 s
Kaspersky PURE 9 59.51 s
Average 62.79 s
BitDefender Total Security 2012 81.39 s
McAfee Total Protection 2012 84.64 s

Benchmark 6 – Internet Explorer Launch Time

The following chart compares the average launch times of Internet Explorer after rebooting the machine for each Total Security product we tested. Products with lower launch times are considered better performing products in this category.

BitDefender Total Security 2012 845.41 ms
Norton 360 v6 1,084.04 ms
Average 1,139.00 ms
McAfee Total Protection 2012 1,209.08 ms
Kaspersky PURE 9 1,217.78 ms
Trend Micro Titanium Maximum Security 2012 1,338.72 ms


Benchmark 7 – Installation Time

The following chart compares the minimum installation time it takes for Total Security products to be fully functional and ready for use by the end user. Products with lower installation times are considered better performing products in this category.

Norton 360 v6 63.20 s
BitDefender Total Security 2012 78.77 s
Trend Micro Titanium Maximum Security 2012 181.78 s
Kaspersky PURE 9 181.78 s
Average 217.71 s
McAfee Total Protection 2012 583.00 s

Benchmark 8 – Installation Size

The following chart compares the total size of files added during the installation of Total Security products. Products with lower installation sizes are considered better performing products in this category.

Norton 360 v6 303.38 MB
Trend Micro Titanium Maximum Security 2012 474.15 MB
McAfee Total Protection 2012 535.70 MB
Kaspersky PURE 9 614.95 MB
Average 619.19 MB
BitDefender Total Security 2012 1,167.76 MB


Benchmark 9 – Registry Keys Added

The following chart compares the number of registry keys created during product installation for each Total Security product tested. Products with lower key counts are considered better performing products in this category.

Trend Micro Titanium Maximum Security 2012 4,741
Norton 360 v6 5,752
BitDefender Total Security 2012 7,656
Average 9,958
Kaspersky PURE 9 11,061
McAfee Total Protection 2012 20,578

Benchmark 10 – File Copy, Move and Delete

The following chart compares the average time taken to copy, move and delete several sets of sample files for each Total Security product tested. Products with lower times are considered better performing products in this category.

Norton 360 v6 25.34 s
Kaspersky PURE 9 25.80 s
Trend Micro Titanium Maximum Security 2012 27.78 s
Average 27.85 s
McAfee Total Protection 2012 29.61 s
BitDefender Total Security 2012 30.74 s


Benchmark 11 – Installation of Third Party Applications

The following chart compares the average time taken to install a third party application for each Total Security product tested. Products with lower times are considered better performing products in this category.

McAfee Total Protection 2012 99.41 s
BitDefender Total Security 2012 100.11 s
Trend Micro Titanium Maximum Security 2012 101.16 s
Average 103.60 s
Kaspersky PURE 9 106.03 s
Norton 360 v6 111.29 s

Benchmark 12 – Network Throughput

The following chart compares the average time to download a sample set of common file types for each Total Security product tested. Products with lower times are considered better performing products in this category.

Trend Micro Titanium Maximum Security 2012 11.48 s
Norton 360 v6 13.17 s
BitDefender Total Security 2012 14.56 s
McAfee Total Protection 2012 17.57 s
Average 23.19 s
Kaspersky PURE 9 59.17 s


Benchmark 13 – File Format Conversion

The following chart compares the average time it takes for five sample files to be converted from one file format to another (MP3 to WMA, MP3 to WAV) for each Total Security product tested. Products with lower times are considered better performing products in this category.

Norton 360 v6 76.03 s
McAfee Total Protection 2012 76.13 s
Trend Micro Titanium Maximum Security 2012 77.26 s
Average 77.28 s
BitDefender Total Security 2012 78.01 s
Kaspersky PURE 9 79.00 s

Benchmark 14 – File Compression and Decompression

The following chart compares the average time it takes for sample files to be compressed and decompressed for each Total Security product tested. Products with lower times are considered better performing products in this category.

Norton 360 v6 81.72 s
Kaspersky PURE 9 82.81 s
McAfee Total Protection 2012 84.21 s
Average 86.48 s
BitDefender Total Security 2012 87.30 s
Trend Micro Titanium Maximum Security 2012 96.34 s


Benchmark 15 – File Write, Open and Close

The following chart compares the average time it takes for a file to be written to the hard drive then opened and closed 180,000 times, for each Total Security product tested. Products with lower times are considered better performing products in this category.

Norton 360 v6 27.64 s
Kaspersky PURE 9 29.23 s
Trend Micro Titanium Maximum Security 2012 66.71 s
McAfee Total Protection 2012 66.91 s
Average 271.71 s
BitDefender Total Security 2012 1,168.04 s

Benchmark 16 – PE Scan Time

The following chart compares the average time taken to scan a set of 6351 portable executable files (totaling 2076 MB) for each Total Security product tested. This time is calculated by averaging the initial (Run 1) and subsequent (Runs 2-5) scan times. Products with lower scan times are considered better performing products in this category.

McAfee Total Protection 2012 18.94 s
Kaspersky PURE 9 38.25 s
Norton 360 v6 45.13 s
Average 49.26 s
Trend Micro Titanium Maximum Security 2012 67.47 s
BitDefender Total Security 2012 76.50 s


Benchmark 17 – File Copy Disk to Disk

The following chart compares the average time taken to copy a set of files from one local drive to another local drive for each Total Security product tested. The test was performed 5 times, and the average of all 5 runs was taken as the result. Products with lower times are considered better performing products in this category.

Trend Micro Titanium Maximum Security 2012 118.60 s
Norton 360 v6 118.71 s
Kaspersky PURE 9 125.67 s
Average 129.47 s
BitDefender Total Security 2012 135.55 s
McAfee Total Protection 2012 148.80 s


Disclaimer and Disclosure

This report only covers versions of products that were available at the time of testing. The tested versions are as noted in the “Products and Versions” section of this report. The products we have tested are not an exhaustive list of all products available in these very competitive product categories.

Disclaimer of Liability

While every effort has been made to ensure that the information presented in this report is accurate, PassMark Software Pty Ltd assumes no responsibility for errors, omissions, or out-of-date information and shall not be liable in any manner whatsoever for direct, indirect, incidental, consequential, or punitive damages resulting from the availability of, use of, access of, or inability to use this information.

Disclosure

Symantec Corporation funded the production of this report, selected the test metrics and list of products to include in this report, and supplied some of the test scripts used for the tests.

Trademarks

All trademarks are the property of their respective owners.

Contact Details

PassMark Software Pty Ltd
Suite 202, Level 2
35 Buckingham St.
Surry Hills, 2010
Sydney, Australia
Phone: +61 (2) 9690 0444
Fax: +61 (2) 9690 0445
Web: www.passmark.com

Download Location

An electronic copy of this report can be found at the following location: http://www.passmark.com/tpsreport12


Appendix 1 – Test Environment

For our testing, PassMark Software used a test environment running Windows 7 Ultimate (64-bit) SP1 with the following hardware specifications:

Windows 7 (64-bit) System

CPU: Intel Core i7 920 Quad Core @ 2.67GHz
Video Card: nVidia GeForce 8800 GT
Motherboard: Intel X58 Motherboard
RAM: 6GB DDR3 RAM
HDD: Western Digital 500GB 7200RPM
Network: Gigabit (1 Gb/s) switch


Appendix 2 – Methodology Description

Windows 7 Image Creation

As with testing on Windows Vista, Norton Ghost was used to create a “clean” baseline image prior to testing. Our aim is to create a baseline image with the smallest possible footprint and reduce the possibility of variation caused by external operating system factors.

The baseline image was restored prior to testing of each different product. This process ensures that we install and test all products on the same, “clean” machine.

The steps taken to create the base Windows 7 image are as follows:

1. Installed and activated Windows 7 Ultimate Edition.
2. Disabled Automatic Updates.
3. Changed User Account Control settings to “Never Notify”.
4. Disabled Windows Defender automatic scans to avoid unexpected background activity.
5. Disabled the Windows firewall to avoid interference with security software.
6. Installed Norton Ghost for imaging purposes.
7. Disabled Superfetch to ensure consistent results.
8. Installed HTTPWatch for Browse Time testing.
9. Installed Windows Performance Toolkit x64 for Boot Time testing.
10. Installed ActivePerl for interpretation of some test scripts.
11. Installed OSForensics for the Installation Size and Registry Key Count tests.
12. Disabled updates, accelerators and compatibility view updates in Internet Explorer 8.
13. Updated to Windows 7 Service Pack 1.
14. Created a baseline image using Norton Ghost.

Benchmark 1 – Boot Time

PassMark Software used tools from the Windows Performance Toolkit version 4.6 (part of the Microsoft Windows 7 SDK, obtainable from the Microsoft website) to obtain more precise and consistent boot time results on the Windows 7 platform.

The boot process is first optimized with xbootmgr.exe using the command “xbootmgr.exe -trace boot -prepSystem”, which prepares the system for the test over six optimization boots. The boot traces obtained from the optimization process are discarded.

After boot optimization, the benchmark is conducted using the command “xbootmgr.exe -trace boot -numruns 5”. This command boots the system five times in succession, taking detailed boot traces for each boot cycle.

Finally, a post-processing tool was used to parse the boot traces and obtain the BootTimeViaPostBoot value. This value reflects the amount of time it takes the system to complete all (and only) boot time processes. Our final result is an average of five boot traces.
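To make the final calculation concrete, the following minimal Python sketch averages the BootTimeViaPostBoot values extracted from the five traces. The millisecond figures shown are hypothetical placeholders, not measured results, and the extraction itself is performed by the post-processing tool described above.

from statistics import mean

# Hypothetical BootTimeViaPostBoot values (in milliseconds), one per boot trace,
# as produced by the post-processing step described above.
boot_time_via_post_boot_ms = [36180, 36410, 35950, 36320, 36270]

final_boot_time_s = mean(boot_time_via_post_boot_ms) / 1000
print(f"Benchmark 1 result: {final_boot_time_s:.2f} s")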


Benchmark 2 – Scan Time

Scan Time is the time it took for each product to scan a set of sample files. The sample used was identical in all cases and contained a mixture of system files and Office files. In total there were 6159 files whose combined size was 982 MB. Most of these files come from the Windows system folders. As the file types can influence scanning speed, the breakdown of the main file types, file numbers and total sizes of the files in the sample set is given here:

File Extension Number of Files File Size

.dll 2589 490MB
.exe 695 102MB
.sys 332 23MB
.gif 302 1MB
.doc 281 64MB
.wmf 185 2MB
.png 149 2MB
.html 126 1MB
.nls 80 6MB
.jpg 70 1MB
.ini 59 2MB
.ico 58 <1MB
.mof 43 6MB
.ax 39 4MB
.xls 38 3MB
.ime 35 5MB
.drv 31 1MB
.txt 31 1MB
.chm 30 6MB
.cpl 29 4MB
.mfl 29 3MB
.inf 26 2MB
.hlp 22 3MB
.imd 20 18MB
.py 20 <1MB
.msc 18 1MB
.vbs 18 1MB
.xml 18 1MB
.rtf 16 62MB
.ocx 16 4MB
.tsp 14 1MB
.com 14 <1MB
.xsl 14 <1MB
.h 13 <1MB
.vsd 12 2MB
.scr 12 2MB
.aw 12 2MB
.js 12 1MB
.zip 11 25MB
.lex 9 10MB
.ppt 9 4MB
.acm 9 1MB
.wav 7 5MB
Total 6159 982MB


This scan was run without launching the product’s user interface, by right-clicking the test folder and choosing the “Scan Now” option. To record the scan time, we used the product’s built-in scan timer or reporting system. Where this was not possible, scan times were taken manually with a stopwatch.

For each product, five samples were taken, with the machine rebooted before each sample to clear any caching effects by the operating system.

In 2009, we noticed many more products showing a substantial difference between the initial scan time (first scan) and subsequent scan times (scans 2 to 5). We believe this behavior is due to products themselves caching recently scanned files.

As a result of this mechanism, we have averaged the four subsequent scan times to obtain an average subsequent scan time. Our final result for this test is an average of the subsequent scan average and the initial scan time.
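Expressed as a short Python sketch, the final figure is computed as follows; the sample values in the example are hypothetical.

from statistics import mean

def scan_time_score(initial_run, subsequent_runs):
    # Average of the Run 1 time and the mean of the Run 2-5 times, in seconds.
    return (initial_run + mean(subsequent_runs)) / 2

# Hypothetical example: a slow initial scan followed by faster, cached scans.
print(scan_time_score(62.0, [24.5, 23.9, 24.1, 24.3]))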

Benchmark 3 – User Interface Launch Time

The launch time of a product’s user interface was taken using AppTimer (v1.0.1006). For each product tested, we obtained a total of fifteen samples from five sets of three UI launches, with a reboot before each set to clear caching effects by the operating system. When compiling the results, the first launch of each set was separated out so that there was one set of values for the initial launch after reboot and another for subsequent launches.

We have averaged the subsequent launch times to obtain an average subsequent launch time. Our final result for this test is an average of the subsequent launch average and the initial launch time.

In some cases, AppTimer did not correctly record the time taken for UI launch. For instance, some applications would open their window and appear ready, but then remain unresponsive for a time. Where the measurement from AppTimer appeared inaccurate, we took the time manually with a stopwatch.

AppTimer is publicly available from the PassMark website.
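The same averaging can be sketched in Python; the function below assumes five sets of three AppTimer samples, with the first sample of each set treated as the initial (post-reboot) launch.

from statistics import mean

def ui_launch_score(sample_sets):
    # sample_sets: five lists of three launch times; the first entry of each
    # list is the initial launch after reboot, the rest are subsequent launches.
    initial = [s[0] for s in sample_sets]
    subsequent = [t for s in sample_sets for t in s[1:]]
    return (mean(initial) + mean(subsequent)) / 2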

Benchmark 4 – Memory Usage during System Idle

The PerfLog++ utility was used to record process memory usage on the system at boot, and then once a minute for the following fifteen minutes. This was done once per product and resulted in a total of 15 samples. The first sample, taken at boot, was discarded.

The PerfLog++ utility records memory usage of all processes, not just those of the anti-malware product, so the product’s processes needed to be isolated from all other running system processes. To isolate the relevant processes, we used Process Explorer, run immediately after memory usage logging by PerfLog++ had completed. Process Explorer is a Microsoft Windows Sysinternals tool that lists the processes currently running on the system, along with the DLLs each process has loaded.
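PerfLog++ is an internal PassMark utility, so the sketch below uses the publicly available psutil package purely as an illustrative stand-in: it sums the resident memory of a hypothetical set of product process names across periodic snapshots and discards the first (boot-time) sample. The process names are placeholders, not the names used in testing.

import time
import psutil  # illustrative stand-in for PerfLog++ / Process Explorer

PRODUCT_PROCESSES = {"av_service.exe", "av_ui.exe"}  # hypothetical process names

def product_memory_mb():
    total = 0
    for proc in psutil.process_iter():
        try:
            if proc.name().lower() in PRODUCT_PROCESSES:
                total += proc.memory_info().rss
        except (psutil.NoSuchProcess, psutil.AccessDenied):
            continue
    return total / (1024 * 1024)

snapshots = [product_memory_mb()]
for _ in range(14):
    time.sleep(60)                       # one snapshot per minute
    snapshots.append(product_memory_mb())
idle_memory_mb = sum(snapshots[1:]) / len(snapshots[1:])  # boot sample discarded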

Benchmark 5 – Browse Time

We used a script in conjunction with HTTPWatch (Basic Edition, version 6.1) to record the amount of time it takes for a set of 106 ‘popular’ websites to load consecutively from a local server. This script feeds a list of URLs into HTTPWatch, which instructs the browser to load pages in sequence and monitors the amount of time it takes for the browser to load all items on one page.


For this test, we have used Internet Explorer 8 (Version 8.0.6001.18783) as our browser.

The set of websites used in this test consists of the front pages of high-traffic sites, including shopping, social, news, finance and reference websites.

The Browse Time test is executed five times and our final result is an average of these five samples. The local server is restarted between different products and one initial ‘test’ run is conducted prior to testing to install Adobe Flash Player, an add-on which is used by many popular websites.
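The real test drives Internet Explorer 8 through HTTPWatch, which waits for every element on a page to load. As a simplified stand-in, the sketch below merely fetches each URL in sequence with the requests library and times the whole run; the URL file name is an assumption.

import time
import requests  # simplified stand-in for HTTPWatch driving Internet Explorer 8

def browse_time(url_list_path="urls.txt"):
    with open(url_list_path) as f:
        urls = [line.strip() for line in f if line.strip()]
    start = time.perf_counter()
    for url in urls:
        requests.get(url, timeout=30)  # HTTPWatch additionally waits for all page items
    return time.perf_counter() - start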

Benchmark 6 – Internet Explorer Launch Time

The average launch time of the Internet Explorer interface was taken using AppTimer. This test was practically identical to the User Interface Launch Time test. For each product tested, we obtained a total of fifteen samples from five sets of three Internet Explorer launches, with a reboot before each set to clear caching effects by the operating system. When compiling the results, the first launch of each set was separated out so that there was one set of values for the initial launch after reboot and another for subsequent launches.

For this test, we have used Internet Explorer 8 (Version 8.0.6001.18783) as our test browser.

We have averaged the subsequent launch times to obtain an average subsequent launch time. Our final result for this test is an average of the subsequent launch average and the initial launch time.

Benchmark 7 – Installation Time

This test measures the minimum Installation Time a product requires to be fully functional and ready for use by the end user. Installation time can usually be divided into three major phases:

• The Extraction and Setup phase consists of file extraction, the EULA prompt, product activation and user-configurable options for installation.

• The File Copy phase occurs when the product is being installed; this phase is usually indicated by a progress bar.

• The Post-Installation phase is any part of the installation that occurs after the File Copy phase. This phase varies widely between products; the time recorded in this phase may include a required reboot to finalize the installation or the time the program takes to become idle in the system tray.

To reduce the impact of disk drive variables, each product was copied to the Desktop before initializing installation. Each step of the installation process was manually timed with a stopwatch and recorded in as much detail as possible. Where input was required by the end user, the stopwatch was paused and the input noted in the raw results in parentheses after the phase description.

Where possible, all requests by products to pre-scan or post-install scan were declined or skipped. Where it was not possible to skip a scan, the time to scan was included as part of the installation time. Where an optional component of the installation formed a reasonable part of the functionality of the software, it was also installed (e.g. website link checking software as part of an Internet Security Product).


Installation time includes the time taken by the product installer to download components required in the installation. This may include mandatory updates or the delivery of the application itself from a download manager. We have noted in our results where a product has downloaded components for product installation.

We have excluded product activation times due to network variability in contacting vendor servers or time taken in account creation.

Benchmark 8 – Installation Size

A product's Installation Size was previously defined as the difference between the initial snapshot of the Disk Space (C: drive) before installation and the subsequent snapshot taken after the product is installed on the system. Although this is a widely used methodology, we noticed that the results it yielded were not always reproducible in Vista due to random OS operations that may take place between the two snapshots. We improved the Installation Size methodology by removing as many Operating System and disk space variables as possible.

Using PassMark’s OSForensics, we created initial and post-installation disk signatures for each product. These disk signatures recorded the number of files and directories, and complete details of all files on the drive (including file name, file size and checksum), at the time the signature was taken.

The initial disk signature was taken immediately prior to installation of the product. A subsequent disk signature was taken immediately following a system reboot after product installation. Using OSForensics, we compared the two signatures and calculated the total disk space consumed by files that were new, modified, and deleted during product installation. Our result for this metric reflects the total size of all newly added files during installation.

The scope of this metric includes only an ‘out of the box’ installation size for each product. Our result does not cover the size of files downloaded by the product after its installation (such as engine or signature updates), or any files created by system restore points, pre-fetch files and other temporary files.
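A minimal Python sketch of the signature-and-compare approach is shown below. It loosely mirrors the OSForensics workflow; the paths, choice of hash and error handling are assumptions rather than the tool’s actual implementation.

import hashlib
import os

def disk_signature(root="C:\\"):
    # Map each file path to (size, checksum), loosely mirroring a disk signature.
    signature = {}
    for dirpath, _dirs, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                size = os.path.getsize(path)
                with open(path, "rb") as f:
                    digest = hashlib.md5(f.read()).hexdigest()
                signature[path] = (size, digest)
            except OSError:
                continue  # locked or inaccessible files are skipped
    return signature

def installation_size_mb(before, after):
    # Total size of files present after installation but not before.
    added = sum(size for path, (size, _d) in after.items() if path not in before)
    return added / (1024 * 1024)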

Benchmark 9 – Registry Key Count

This test measures the number of keys and values added to the registry after rebooting the test machine following a successful product installation. The test was conducted using PassMark’s OSForensics to count the keys, errors and values added under HKEY_LOCAL_MACHINE and HKEY_USERS. The Create Signature feature is used to take a before and after signature of these hives, and the signatures are then compared so that the new keys can be identified.
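For illustration only, the recursive count can be approximated with Python’s winreg module, run once before installation and again after the post-install reboot; OSForensics performs the equivalent comparison with its Create Signature feature.

import winreg

def count_keys_and_values(root, path=""):
    # Recursively count subkeys and values beneath root\path.
    keys = values = 0
    try:
        with winreg.OpenKey(root, path) as handle:
            n_subkeys, n_values, _modified = winreg.QueryInfoKey(handle)
            keys += n_subkeys
            values += n_values
            for i in range(n_subkeys):
                child = winreg.EnumKey(handle, i)
                child_path = f"{path}\\{child}" if path else child
                k, v = count_keys_and_values(root, child_path)
                keys += k
                values += v
    except OSError:
        pass  # keys that cannot be opened (permissions) are skipped
    return keys, values

# Run before installation and again after reboot; the difference between the two
# totals approximates the number of keys and values the product added.
hklm_keys, hklm_values = count_keys_and_values(winreg.HKEY_LOCAL_MACHINE)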

Benchmarks 10-15 – Real-Time Performance

A single script was used for Benchmarks 10-15. It executes the tests consecutively, timing each phase with CommandTimer.exe and appending the results to a log file.
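CommandTimer.exe itself is not described further in this report; a minimal Python stand-in that times one phase and appends the result to a log file might look like the sketch below. The command and paths in the example are hypothetical.

import subprocess
import time

def timed_phase(label, command, log_path="realtime_results.log"):
    # Run one benchmark phase, time it, and append the result to the log file.
    start = time.perf_counter()
    subprocess.run(command, check=True)
    elapsed = time.perf_counter() - start
    with open(log_path, "a") as log:
        log.write(f"{label}\t{elapsed:.3f} s\n")
    return elapsed

# Hypothetical phase: copying the sample file set to a working folder.
timed_phase("file_copy", ["xcopy", "C:\\samples", "C:\\work", "/E", "/I", "/Q"])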

Benchmark 10 – File Copy, Move and Delete

This test measures the amount of time required for the system to copy, move and delete samples of files in various file formats. This sample was made up of 812 files over 760,867,636 bytes and can be categorized as documents [26% of total], media files [54% of total] and PE files (i.e. System Files) [20% of total].


The breakdown of the main file types, file numbers and total sizes of the files in the sample set is shown in the following table:

File format Number Size (bytes)

DOC 8 30,450,176

DOCX 4 13,522,409

PPT 3 5,769,216

PPTX 3 4,146,421

XLS 4 2,660,352

XLSX 4 1,426,054

PDF 73 136,298,049

ZIP 4 6,295,987

7Z 1 92,238

JPG 351 31,375,259

GIF 6 148,182

MOV 7 57,360,371

RM 1 5,658,646

AVI 8 78,703,408

WMV 5 46,126,167

MP3 28 191,580,387

EXE 19 2,952,914

DLL 104 29,261,568

AX 1 18,432

CPL 2 2,109,440

CPX 2 4,384

DRV 10 154,864

ICO 1 107,620

MSC 1 41,587

NT 1 1,688

ROM 2 36,611

SCR 2 2,250,240

SYS 1 37,528,093

TLB 3 135,580

TSK 1 1,152

UCE 1 22,984

EXE 19 2,952,914

DLL 104 29,261,568

AX 1 18,432


CPL 2 2,109,440

CPX 2 4,384

DRV 10 154,864

ICO 1 107,620

MSC 1 41,587

NT 1 1,688

ROM 2 36,611

SCR 2 2,250,240

SYS 1 37,528,093

TLB 3 135,580

TSK 1 1,152

UCE 1 22,984

Total 812 760,867,636

This test was conducted five times to obtain the average time to copy, move and delete the sample files, with the test machine rebooted between each sample to remove potential caching effects.
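A sketch of the three phases, using Python’s shutil module with hypothetical paths, is shown below; the report’s own script drives these operations through CommandTimer.exe rather than in-process timing.

import shutil
import time

def copy_move_delete(src="C:\\samples", work="C:\\work", moved="C:\\moved"):
    # Time the copy, move and delete phases over the sample file set.
    timings = {}
    start = time.perf_counter()
    shutil.copytree(src, work)          # copy the sample set
    timings["copy"] = time.perf_counter() - start

    start = time.perf_counter()
    shutil.move(work, moved)            # move the copied set
    timings["move"] = time.perf_counter() - start

    start = time.perf_counter()
    shutil.rmtree(moved)                # delete it
    timings["delete"] = time.perf_counter() - start
    return timings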

Benchmark 11 – Third Party Program Installation

This test measured how much time was required to install and uninstall third party applications. For this test, CommandTimer.exe timed how long it took to install and uninstall the following applications on the test machine:

• Firefox 3.6.3 (11,909 KB) (MSI File)

• Microsoft .NET 3.5 (34,121 KB) (MSI File)

• Steam (1,551 KB) (MSI File)

This test was conducted five times to obtain the average time to install/uninstall the above third party programs, with the test machine rebooted between each sample to remove potential caching effects.

Benchmark 12 – Network Throughput

This benchmark measured how much time was required to download a sample set of binary files of various sizes and types over a 100MB/s network connection. The files were hosted on a server machine running Windows Server 2008 and IIS 7. CommandTimer.exe was used in conjunction with GNU Wget (version 1.10.1) to time and conduct the download test.

The complete sample set of files was made up of 553,638,694 bytes over 484 files and two file type categories: media files [74% of total] and documents [26% of total]. The breakdown of the file types, file numbers and total sizes of the files in the sample set is shown in the following table:

File format Number Size (bytes)

JPEG 343 30,668,312

GIF 9 360,349

PNG 5 494,780


MOV 7 57,360,371

RM 1 5,658,646

AVI 8 78,703,408

WMV 5 46,126,167

MP3 28 191,580,387

PDF 73 136,298,049

ZIP 4 6,295,987

7Z 1 92,238

Total 484 553,638,694

This test was conducted five times to obtain the average time to download this sample of files, with the test machine rebooted between each sample to remove potential caching effects.

Benchmark 13 – File Format Conversion (MP3 to WAV, MP3 to WMA)

This test measured how much time was required to convert five (5) different MP3 files into WAV files and, subsequently, to convert the same MP3 samples into WMA files. The total size of the five (5) MP3s used was 25,870,899 bytes.

To encode the MP3s into the other formats, we used an application called ffmpeg.exe. The format conversion process was timed using CommandTimer.exe.

This test was conducted five times to obtain the average conversion speed between these formats, with the test machine rebooted between each sample to remove potential caching effects.
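As a rough illustration of the conversion step (the sample location is hypothetical, and the report’s timing was done with CommandTimer.exe rather than in-process), ffmpeg can be driven as follows, with the output format inferred from the file extension:

import pathlib
import subprocess
import time

SAMPLE_DIR = pathlib.Path("C:/samples/mp3")  # hypothetical location of the five MP3s

def convert_all(target_suffix):
    # Convert every sample MP3 to the target format and return the elapsed time.
    start = time.perf_counter()
    for mp3 in SAMPLE_DIR.glob("*.mp3"):
        out = mp3.with_suffix(target_suffix)
        subprocess.run(["ffmpeg", "-y", "-i", str(mp3), str(out)], check=True)
    return time.perf_counter() - start

wav_seconds = convert_all(".wav")
wma_seconds = convert_all(".wma")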

Benchmark 14 – File Compression and Decompression

This test measured the amount of time required to compress and decompress a sample set of files. For this test, we used a subset of the media and documents files used in the File Copy, Move and Delete benchmark. CommandTimer.exe recorded the amount of time required for 7zip.exe to compress the files into a *.zip and subsequently decompress the created *.zip file.

This subset comprised 1,218 files over 783 MB. The breakdown of the file types, file numbers and total sizes of the files in the sample set is shown in the following table:

File Type File Number Total Size

.xls 13 9.23 MB

.xlsx 9 3.51 MB

.ppt 9 7.37 MB

.pptx 11 17.4 MB

.doc 17 35.9 MB

.docx 19 24.5 MB

.gif 177 1.10 MB


.jpg 737 66.2 MB

.png 159 48.9 MB

.mov 7 54.7 MB

.rm 1 5.39 MB

.avi 46 459 MB

.wma 11 48.6 MB


Total 1218 783 MB

This test was conducted five times to obtain the average file compression and decompression speed, with the test machine rebooted between each sample to remove potential caching effects.
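A minimal sketch of the compress-then-decompress sequence is shown below, assuming the 7-Zip command-line executable is on the path (the report refers to it as 7zip.exe); the folder and archive paths are hypothetical.

import subprocess
import time

def compress_decompress(folder="C:\\sample_subset", archive="C:\\sample.zip"):
    # Time compression of the subset into a .zip, then extraction of that archive.
    start = time.perf_counter()
    subprocess.run(["7z", "a", "-tzip", archive, folder], check=True)
    compress_s = time.perf_counter() - start

    start = time.perf_counter()
    subprocess.run(["7z", "x", "-y", "-oC:\\extracted", archive], check=True)
    decompress_s = time.perf_counter() - start
    return compress_s, decompress_s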

Benchmark 15 – File Write, Open and Close

This benchmark was derived from Oli Warner’s File I/O test at http://www.thepcspy.com (please see Reference #1: What Really Slows Windows Down).

For this test, we developed OpenClose.exe, an application that looped writing a small file to disk, then opening and closing that file. CommandTimer.exe was used to time how long the process took to complete 180,000 cycles.
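OpenClose.exe is a PassMark-written tool, so the sketch below is only one reading of its description: each cycle writes a small file and then opens and closes it, with the whole loop timed externally in the real test. The payload size is an assumption.

import time

def write_open_close(path="openclose.tmp", cycles=180_000):
    # Each cycle writes a small file, then opens and closes it again.
    start = time.perf_counter()
    for _ in range(cycles):
        with open(path, "w") as f:
            f.write("x" * 32)  # small payload; the size actually used is not published
        handle = open(path, "r")
        handle.close()
    return time.perf_counter() - start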

This test was conducted five times to obtain the average file writing, opening and closing speed, with the test machine rebooted between each sample to remove potential caching effects.

Benchmark 16 – PE Scan Time

This test measures the on-demand scan times of a file set comprised only of executable files (.exe, .dll and .sys files). We performed five scans of the sample file set, with a machine restart between each scan to remove possible caching effects. The time taken to scan the files was taken from the antivirus product’s scan logs or, where logs were not available, measured manually with a stopwatch. Scans were launched by right-clicking on the folder to be scanned.

A breakdown of the sample file set is as follows:

File Type Number of Files File Size

Sys Files 2174 329MB

Dll Files 2037 920MB

Exe Files 2140 827MB

Total 6351 2076MB

The final result is calculated as an average of the five samples.


Benchmark 17 – File Copy Disk to Disk

This test measures the amount of time taken to copy files between two local drives. The data set comprised 8,501 files with a total file size of 5.44GB, and the formats used included documents, movies, images and executables. A breakdown of the sample file set is given below:

File Extension Number of Files File Size

.jpg 2903 588MB

.dll 773 25MB

.exe 730 197MB

.gif 681 63MB

.wav 430 260MB

.sys 501 79MB

.png 451 27MB

.mp3 333 2157MB

.wma 585 925MB

.docx 267 81MB

.avi 247 1079MB

.doc 160 57MB

.xls 329 132MB

.ppt 97 148MB

.zip 14 177MB

Total 8501 5995MB

A total of five runs of this test were performed, with a machine restart between each run. The time taken to copy the files was measured and recorded by CommandTimer.exe. All the files were copied between a folder on the local drive and a second folder on a different drive. Files were deleted from the second drive once the copy was complete. The final result is calculated as an average of the five samples.
