
Vendor Comparison Matrix Example – Comparison of Antivirus https://protectourpower.org/best-practices/pop-bp-vendor-comparison-matrix-example-antivirus.pdf

In summary, this paper discusses a cut-and-paste of a Wikipedia antivirus comparison into the PoP analysis matrix and presents the resultant Vendor Comparison Matrix.

Introduction

The Protect Our Power (PoP) Best Practices in Cybersecurity for Utilities Project has 2 Goals:

1. Provide organized and comprehensive information to the electric Utilities related to specific actionable subsets (Topics) of cybersecurity.
2. Provide an analysis of suppliers for each Topic that allows a Utility to better understand which suppliers are more closely aligned with each Utility's interest in using best practices.

The first Goal is addressed via a literature search specific to the Topic under consideration. The second Goal is addressed via a Vendor Comparison Matrix. These represent two of the Work Products expected for each Topic. The Topics and associated Vendors are contained in the Taxonomy. Educational Institutions are used to complete all Work Products and have final authority to ensure an independent analysis.

Companies like Forrester and Gartner produce analyses of Vendors in specific software areas – but these do not target specific Utility needs. The Vendor Comparison Matrix in Protect Our Power's Project can be thought of as analogous to a Magic Quadrant or the other vendor analyses sometimes produced by such companies.

Wikipedia, interestingly, provides a "Comparison of antivirus software." Such an open-source comparison is rare to locate, and it provides an opportunity to simply cut-and-paste the data into the PoP Vendor Comparison Matrix for testing and demonstration purposes. We have done that, and the results are included in this paper. This also allows for the demonstration of additional features of the analysis matrix.

Wikipedia’s Comparison of antivirus software

The Wikipedia site shows a different matrix for Windows, macOS, Linux, Solaris, FreeBSD, Android, iOS, Windows Phone, Symbian, and BlackBerry – i.e., an antivirus vendor matrix for each of these operating systems. In this demonstration, we cut-and-paste only the Windows data into the PoP software. There are 24 Vendors listed in this matrix, beginning with AhnLab. However, there are 51 rows in the Wikipedia/Windows matrix because some companies have more than one product that is distinct enough to warrant its own row, and an "Info" Column (see the "Software Description" column) is used for the information that specifies the actual product a given row addresses. An "Info" column is one of the three Types of columns in the matrix – the others are "Binary" and "Score." The columns in the matrix are the various Criteria of interest, and PoP has a separate document that discusses Criteria.

The only changes to the Wikipedia matrix are the addition of one Column (Criterion) of the "Score" Type and a Weights row to weight all the Criteria – all of which are "Binary" Criteria. The "Score" column is added only for demonstration purposes (there could be many "Score" Criteria in a different situation). The "Info" Columns are not used in any calculations but are provided for the potentially valuable information they contain.

The Resultant Vendor Comparison Matrix

The Wikipedia data (rows and columns) were copied into the PoP software (an Excel Workbook with Macros); a Weights row (green) and a Types row (blue) were added under the Criteria names. Otherwise, entry into the PoP Workbook was a straightforward cut-and-paste (see the three Sections of the Matrix below).

The Weights (which must add to 100%) provide the user (a Utility in PoP's case) with the ability to customize the matrix to address their particular situation. In this example, weights were simply randomly assigned. The "Info" columns should always have zero weights. The "Score" column in this case has been assigned a zero weight as it has no non-zero data. The Vendor Scores (in red in column 3) are calculated using the values in the matrix interior weighted by the green weights (that add to 100%). Then the matrix was sorted by Vendor Score, with the products rising to the top that do best given the user's (Utility's) needs (as expressed in the Weights for each Criterion Column). This promotes movement toward best practices while still allowing for individual Utility circumstances.
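The scoring and sorting described above can be sketched in a few lines. This is a minimal illustration in Python rather than the Excel Workbook with Macros the Project actually uses, and the criteria names and weights below are hypothetical examples, not the PoP weights:

```python
# Sketch of the Vendor Score rule described above (hypothetical criteria and
# weights; the real PoP tool is an Excel Workbook with Macros).
# "Binary" criteria hold Y/N; blank cells contribute nothing; weights sum to 100%.

WEIGHTS = {                 # assumed example weights (must total 1.0)
    "On-access scan": 0.45,
    "Firewall": 0.35,
    "Sandbox": 0.20,
}

def vendor_score(row: dict, weights: dict, scale: float = 6.0) -> float:
    """Weighted sum of the 'Y' entries, scaled to the 0-6 range shown above."""
    earned = sum(w for crit, w in weights.items() if row.get(crit) == "Y")
    return round(scale * earned, 1)

rows = [
    {"Vendor": "A", "On-access scan": "Y", "Firewall": "Y", "Sandbox": "N"},
    {"Vendor": "B", "On-access scan": "Y", "Firewall": "N"},  # Sandbox blank
]
# Sort so the best-fitting products rise to the top, as in the Workbook.
rows.sort(key=lambda r: vendor_score(r, WEIGHTS), reverse=True)
```

Note that an "N" and a blank cell are treated the same way by the score itself: neither earns the Criterion's weight.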

While the actual Workbook can be downloaded and examined, it is manually divided into three sections and included here as Section 1, Section 2, and Section 3 below. Each Section includes the Company List and Vendor Score columns for better reader continuity and movement from section to section.

Please note that Protect Our Power in no way endorses the Wikipedia data or any ranking of Vendor products here based on random weights. The production of this matrix is provided solely for demonstration purposes.

Section 1 demonstrates that multiple products from the same Vendor can be added to the matrix simply by adding an "Info" column titled "Software Description." The other columns in Section 1 are all "Binary," as that is the kind of data furnished in the Wikipedia example. Blanks (in the body of the matrix) are not used in any calculations.

Section 2 repeats the first 3 columns for continuity and includes an additional 9 Criteria. One of those Criteria is "Sandbox," but that column has many blank entries, so it is a candidate for a zero weight (not enough data), which the matrix reflects.
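The judgment call applied to "Sandbox" can be sketched as a simple sparsity check: flag any Criterion whose share of blank cells exceeds a cutoff so the user can consider assigning it a zero weight. This is a hypothetical helper for illustration, not part of the PoP Workbook:

```python
# Flag criteria with too many blank cells as candidates for a zero weight
# (hypothetical helper illustrating the "Sandbox" judgment call above).

def zero_weight_candidates(rows: list[dict], criteria: list[str],
                           max_blank_share: float = 0.5) -> list[str]:
    """Return the criteria whose blank share across all rows exceeds the cutoff."""
    out = []
    for crit in criteria:
        blanks = sum(1 for r in rows if r.get(crit, "") == "")
        if blanks / len(rows) > max_blank_share:
            out.append(crit)
    return out

rows = [
    {"IDS": "Y", "Sandbox": ""},
    {"IDS": "N", "Sandbox": ""},
    {"IDS": "Y", "Sandbox": "Y"},
]
# "Sandbox" is blank in 2 of 3 rows, so it is flagged; "IDS" is not.
```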

Finally, Section 3 includes the remainder of the columns – many of which are “Info” in the Types row (row 3). “Info” columns are not used in any calculations.

Section 1

| # | Companies | Vendor Score | Software Description | On-access scan | On-demand scan | Boot-time scans | Heuristics | CloudAV | Firewall |
|---|---|---|---|---|---|---|---|---|---|
| | Types > | | Info | Binary | Binary | Binary | Binary | Binary | Binary |
| | Weights > | 100% | 0% | 14% | 6% | 5% | 11% | 5% | 12% |
| 1 | Dr.Web | 6.0 | Dr.Web Security Space | Y | Y | Y | Y | Y | Y |
| 2 | Kaspersky Lab | 5.8 | Kaspersky Total Security | Y | Y | Y | Y | Y | Y |
| 3 | ESET | 5.8 | ESET Smart Security | Y | Y | Y | Y | Y | Y |
| 4 | Comodo Group | 5.8 | Comodo | Y | Y | Y | Y | Y | Y |
| 5 | Symantec | 5.5 | Norton Internet Security | Y | Y | Y | Y | Y | Y |
| 6 | Sophos | 5.5 | Sophos EndUser Protection | Y | Y | Y | Y | Y | Y |
| 7 | McAfee | 5.5 | McAfee Internet Security | Y | Y | Y | Y | Y | Y |
| 8 | Kaspersky Lab | 5.5 | Kaspersky Internet Security | Y | Y | Y | Y | Y | Y |
| 9 | F-Secure | 5.5 | F-Secure SAFE | Y | Y | Y | Y | Y | Y |
| 10 | Dr.Web | 5.5 | Dr.Web Anti-virus | Y | Y | Y | Y | Y | Y |
| 11 | Bitdefender | 5.5 | Bitdefender Antivirus Plus | Y | Y | Y | Y | Y | Y |
| 12 | Bitdefender | 5.5 | Bitdefender Internet Security | Y | Y | Y | Y | Y | Y |
| 13 | AVG Technologies | 5.5 | AVG Internet Security | Y | Y | Y | Y | Y | Y |
| 14 | Quick Heal | 5.5 | Quick Heal Total Security | Y | Y | Y | Y | N | Y |
| 15 | AhnLab | 5.3 | AhnLab V3 Internet Security | Y | Y | Y | Y | Y | Y |
| 16 | Comodo Group | 5.0 | Comodo Antivirus | Y | Y | Y | Y | Y | N |
| 17 | G Data Software | 4.9 | G DATA InternetSecurity | Y | Y | Y | Y | Y | Y |
| 18 | Avast | 4.9 | Avast Premium Security | Y | Y | Y | Y | Y | Y |
| 19 | Panda Security | 4.6 | Panda Internet Security | Y | Y | Y | Y | Y | Y |
| 20 | Check Point | 4.6 | ZoneAlarm Extreme Security | Y | Y | N | Y | Y | Y |
| 21 | Sophos | 4.4 | Sophos Anti-Virus | Y | Y | Y | Y | Y | Y |
| 22 | Webroot | 4.2 | SecureAnywhere AntiVirus | Y | Y | N | Y | Y | Y |
| 23 | Webroot | 4.2 | SecureAnywhere Internet Security | Y | Y | N | Y | Y | Y |
| 24 | Panda Security | 4.1 | Panda Antivirus Pro | Y | Y | Y | Y | Y | Y |
| 25 | Panda Security | 4.1 | Panda Antivirus Free | Y | Y | Y | Y | Y | Y |
| 26 | Avast | 4.0 | Avast Free Antivirus | Y | Y | Y | Y | Y | N |
| 27 | ESET | 3.9 | ESET NOD32 Antivirus | Y | Y | Y | Y | Y | N |
| 28 | Avira | 3.9 | Avira Internet Security | Y | Y | Y | Y | Y | N |
| 29 | TrustPort | 3.8 | TrustPort Antivirus | Y | Y | N | Y | N | Y |
| 30 | TrustPort | 3.8 | TrustPort Internet Security | Y | Y | N | Y | N | Y |
| 31 | TrustPort | 3.8 | TrustPort Total Protection | Y | Y | N | Y | N | Y |
| 32 | AVG Technologies | 3.7 | AVG Antivirus | Y | Y | Y | Y | Y | N |
| 33 | Check Point | 3.5 | ZoneAlarm PRO Antivirus + Firewall | Y | Y | N | Y | N | Y |
| 34 | Qihoo 360 | 3.4 | 360 Total Security | Y | Y | Y | Y | Y | N |
| 35 | Avira | 3.4 | Avira Antivirus FREE (formerly An…) | Y | Y | Y | Y | Y | N |
| 36 | Trend Micro | 3.2 | Titanium Internet Security | Y | Y | Y | Y | Y | N |
| 37 | Bitdefender | 3.2 | Bitdefender Antivirus Free | Y | Y | Y | Y | Y | N |
| 38 | McAfee | 3.1 | McAfee Antivirus | Y | Y | Y | Y | N | N |
| 39 | Kaspersky Lab | 3.1 | Kaspersky Anti-Virus | Y | Y | Y | Y | N | N |
| 40 | G Data Software | 3.1 | G DATA AntiVirus | Y | Y | Y | Y | N | N |
| 41 | F-Secure | 3.1 | F-Secure Antivirus | Y | Y | Y | Y | N | N |
| 42 | FRISK Software | 3.1 | F-PROT Antivirus | Y | Y | Y | Y | N | N |
| 43 | AVG Technologies | 3.1 | AVG Antivirus FREE | Y | Y | Y | Y | N | N |
| 44 | Fortinet | 2.9 | FortiClient | Y | Y | Y | N | N | Y |
| 45 | Trend Micro | 2.9 | Titanium Antivirus Plus | Y | Y | Y | Y | N | N |
| 46 | NANO Security Ltd | 2.8 | NANO Antivirus | Y | Y | N | Y | N | N |
| 47 | ClamWin | 2.6 | ClamWin | Y | Y | Y | Y | N | N |
| 48 | Cisco (originally Im…) | 2.2 | | Y | Y | N | N | Y | N |
| 49 | Symantec (origina…) | 1.7 | Spyware Doctor with AntiVirus | Y | Y | N | N | N | N |
| 50 | Microsoft | 1.7 | Windows Defender | Y | Y | N | N | N | N |
| 51 | VirusBlokAda | 1.2 | Vba32 AntiVirus | Y | Y | N | N | N | N |
| | Average > | | | 6.0 | 6.0 | 4.6 | 5.4 | 3.9 | 3.4 |

Section 2

| # | Companies | Vendor Score | IDS | IPS | Sandbox | Email Security | AntiSpam | Web protection | Macro protection | Live Update | Support |
|---|---|---|---|---|---|---|---|---|---|---|---|
| | Types > | | Binary | Binary | Binary | Binary | Binary | Binary | Binary | Binary | Binary |
| | Weights > | 100% | 5% | 10% | 0% | 4% | 4% | 4% | 4% | 4% | 4% |
| 1 | Dr.Web | 6.0 | Y | Y | N | Y | Y | Y | Y | Y | Y |
| 2 | Kaspersky Lab | 5.8 | Y | Y | | Y | Y | Y | Y | Y | Y |
| 3 | ESET | 5.8 | Y | Y | | Y | Y | Y | Y | Y | Y |
| 4 | Comodo Group | 5.8 | Y | Y | Y | Y | Y | Y | Y | Y | Y |
| 5 | Symantec | 5.5 | Y | Y | | Y | Y | Y | Y | Y | Y |
| 6 | Sophos | 5.5 | Y | Y | | Y | Y | Y | Y | Y | Y |
| 7 | McAfee | 5.5 | Y | Y | | Y | Y | Y | Y | Y | Y |
| 8 | Kaspersky Lab | 5.5 | Y | Y | | Y | Y | Y | Y | Y | Y |
| 9 | F-Secure | 5.5 | Y | Y | | Y | Y | Y | Y | Y | Y |
| 10 | Dr.Web | 5.5 | Y | Y | N | N | N | Y | Y | Y | Y |
| 11 | Bitdefender | 5.5 | Y | Y | N | Y | Y | Y | Y | Y | Y |
| 12 | Bitdefender | 5.5 | Y | Y | N | Y | Y | Y | Y | Y | Y |
| 13 | AVG Technologies | 5.5 | Y | Y | | Y | Y | Y | Y | Y | Y |
| 14 | Quick Heal | 5.5 | Y | Y | Y | Y | Y | Y | Y | Y | Y |
| 15 | AhnLab | 5.3 | Y | Y | | Y | N | Y | Y | Y | Y |
| 16 | Comodo Group | 5.0 | Y | Y | Y | Y | Y | Y | Y | Y | Y |
| 17 | G Data Software | 4.9 | Y | N | | Y | Y | Y | Y | Y | Y |
| 18 | Avast | 4.9 | Y | N | Y | Y | Y | Y | Y | Y | Y |
| 19 | Panda Security | 4.6 | N | N | | Y | Y | Y | Y | Y | Y |
| 20 | Check Point | 4.6 | N | N | | Y | Y | Y | Y | Y | Y |
| 21 | Sophos | 4.4 | Y | N | | Y | N | N | Y | Y | Y |
| 22 | Webroot | 4.2 | N | Y | | N | N | Y | N | Y | Y |
| 23 | Webroot | 4.2 | N | Y | | N | N | Y | N | Y | Y |
| 24 | Panda Security | 4.1 | N | N | | N | Y | N | Y | Y | Y |
| 25 | Panda Security | 4.1 | N | N | | N | N | Y | Y | Y | Y |
| 26 | Avast | 4.0 | Y | N | N | Y | N | Y | Y | Y | Y |
| 27 | ESET | 3.9 | N | N | | Y | N | Y | Y | Y | Y |
| 28 | Avira | 3.9 | N | N | | Y | Y | Y | Y | Y | Y |
| 29 | TrustPort | 3.8 | N | N | | Y | Y | Y | N | Y | Y |
| 30 | TrustPort | 3.8 | N | N | | Y | Y | Y | N | Y | Y |
| 31 | TrustPort | 3.8 | N | N | | Y | Y | Y | N | Y | Y |
| 32 | AVG Technologies | 3.7 | N | N | | Y | N | Y | Y | Y | Y |
| 33 | Check Point | 3.5 | N | N | | N | N | N | Y | Y | Y |
| 34 | Qihoo 360 | 3.4 | N | N | | N | N | Y | Y | Y | Y |
| 35 | Avira | 3.4 | N | N | | N | Y | N | Y | Y | Y |
| 36 | Trend Micro | 3.2 | N | N | | N | N | Y | N | Y | Y |
| 37 | Bitdefender | 3.2 | N | N | N | N | N | Y | N | Y | Y |
| 38 | McAfee | 3.1 | N | N | | Y | N | N | Y | Y | Y |
| 39 | Kaspersky Lab | 3.1 | N | N | | Y | N | Y | N | Y | Y |
| 40 | G Data Software | 3.1 | N | N | | N | N | Y | Y | Y | Y |
| 41 | F-Secure | 3.1 | N | N | | Y | N | N | Y | Y | Y |
| 42 | FRISK Software | 3.1 | N | N | | Y | N | N | Y | Y | Y |
| 43 | AVG Technologies | 3.1 | N | N | | N | N | Y | Y | Y | Y |
| 44 | Fortinet | 2.9 | N | N | | N | N | Y | N | Y | Y |
| 45 | Trend Micro | 2.9 | N | N | | N | Y | N | N | Y | Y |
| 46 | NANO Security Ltd | 2.8 | N | N | | Y | N | Y | N | Y | Y |
| 47 | ClamWin | 2.6 | N | N | | Y | N | N | N | Y | N |
| 48 | Cisco (originally Im…) | 2.2 | N | N | | Y | N | N | N | Y | Y |
| 49 | Symantec (origina…) | 1.7 | N | N | | N | N | Y | N | Y | N |
| 50 | Microsoft | 1.7 | N | N | | N | N | N | N | Y | Y |
| 51 | VirusBlokAda | 1.2 | N | N | | N | N | N | N | N | N |
| | Average > | | 2.4 | 2.1 | 2.4 | 4.0 | 2.9 | 4.6 | 4.1 | 5.9 | 5.6 |

Section 3

| # | Companies | Vendor Score | Settings Import/Export | Installer/updater MIMA safe | Score Criteria 01 | License | Price | First release | Country of origin | Original author/s | Notes |
|---|---|---|---|---|---|---|---|---|---|---|---|
| | Types > | | Binary | Binary | Score | Info | Info | Info | Info | Info | Info |
| | Weights > | 100% | 4% | 4% | 0% | 0% | 0% | 0% | 0% | 0% | 0% |
| 1 | Dr.Web | 6.0 | Y | Y | 0 | Proprietary | Trialware | 2006 | Russia | Igor Danilov | |
| 2 | Kaspersky Lab | 5.8 | Y | | 0 | Proprietary | Non-free | | Russia | Eugene K… | Banned b… |
| 3 | ESET | 5.8 | Y | | 0 | Proprietary | Trialware | 2007 | Slovakia | Peter Paško and Mi… | |
| 4 | Comodo Group | 5.8 | Y | | 0 | Proprietary | Non-free | 2010 | United States | | *Protectio… |
| 5 | Symantec | 5.5 | | | 0 | Proprietary | Non-free | 2000 | United States | | Whitelists… |
| 6 | Sophos | 5.5 | | | 0 | Proprietary | Non-free | 2000 | United Kingdom | Jan Hruska and Pet… | |
| 7 | McAfee | 5.5 | | | 0 | Proprietary | Non-free | 2006 | United States | John McA… | John McA… |
| 8 | Kaspersky Lab | 5.5 | | | 0 | Proprietary | Non-free | 2007 | Russia | Eugene K… | Banned b… |
| 9 | F-Secure | 5.5 | | | 0 | Proprietary | Non-free | 2004 | Finland | Petri Alla… | Third-par… |
| 10 | Dr.Web | 5.5 | Y | Y | 0 | Proprietary | Trialware | 1992 | Russia | Igor Danilov | *macOS/… |
| 11 | Bitdefender | 5.5 | | | 0 | Proprietary | Non-free | 1996 | Romania | | |
| 12 | Bitdefender | 5.5 | | | 0 | Proprietary | Non-free | 2008 | Romania | | |
| 13 | AVG Technologies | 5.5 | N | | 0 | Proprietary | Non-free | 2008 | Czech Republic | Jan Gritzbach and … | |
| 14 | Quick Heal | 5.5 | Y | | 0 | Proprietary | Non-free | 1995 | India | | *Browser… |
| 15 | AhnLab | 5.3 | | | 0 | Proprietary | Non-free | 1988 | South Korea | Dr. Ahn Cheol-Soo | |
| 16 | Comodo Group | 5.0 | Y | | 0 | Proprietary | Free | 2008 | United States | | *Protectio… |
| 17 | G Data Software | 4.9 | | | 0 | Proprietary | Non-free | 2004 | Germany | Andreas L… | Third-par… |
| 18 | Avast | 4.9 | N | | 0 | Proprietary | Trialware | 1997 | Czech Republic | Pavel Baudiš and E… | |
| 19 | Panda Security | 4.6 | | | 0 | Proprietary | Non-free | 2002 | Spain | Mikel Urizarbarren… | |
| 20 | Check Point | 4.6 | Y | | 0 | Proprietary | Non-free | 2002 | United States | | |
| 21 | Sophos | 4.4 | | | 0 | Proprietary | Non-free | 1988 | United Kingdom | Jan Hruska and Pet… | |
| 22 | Webroot | 4.2 | | | 0 | Proprietary | Trialware | 2011 | United States | | |
| 23 | Webroot | 4.2 | | | 0 | Proprietary | Non-free | 2011 | United States | | |
| 24 | Panda Security | 4.1 | | | 0 | Proprietary | Non-free | 1990 | Spain | Mikel Urizarbarren… | |
| 25 | Panda Security | 4.1 | | | 0 | Proprietary | Trialware | 2009 | Spain | | |
| 26 | Avast | 4.0 | N | | 0 | Proprietary | Free | 1988 | Czech Republic | Pavel Baudiš and E… | |
| 27 | ESET | 3.9 | Y | | 0 | Proprietary | Trialware | 1987 | Slovakia | Peter Paško and Mi… | |
| 28 | Avira | 3.9 | | | 0 | Proprietary | Non-free | 2002 | Germany | Tjark Auerbach | |
| 29 | TrustPort | 3.8 | | | 0 | Proprietary | Trialware | 2008 | | | Third-par… |
| 30 | TrustPort | 3.8 | | | 0 | Proprietary | Trialware | 2008 | Czech Republic | | Third-par… |
| 31 | TrustPort | 3.8 | | | 0 | Proprietary | Trialware | 2008 | Czech Republic | | Third-par… |
| 32 | AVG Technologies | 3.7 | N | | 0 | Proprietary | Non-free | 2006 | Czech Republic | Jan Gritzbach and … | |
| 33 | Check Point | 3.5 | Y | | 0 | Proprietary | Non-free | 2002 | Israel | | Third-par… |
| 34 | Qihoo 360 | 3.4 | | | 0 | Proprietary | Free | 2006 | China | | |
| 35 | Avira | 3.4 | | | 0 | Proprietary | Free | 1988 | Germany | Tjark Auerbach | |
| 36 | Trend Micro | 3.2 | | | 0 | Proprietary | Non-free | 2008 | Japan | | |
| 37 | Bitdefender | 3.2 | | | 0 | Proprietary | Free | 2013 | Romania | | |
| 38 | McAfee | 3.1 | | | 0 | Proprietary | Non-free | 1987 | United States | John McA… | John McA… |
| 39 | Kaspersky Lab | 3.1 | | | 0 | Proprietary | Non-free | 1997 | Russia | Eugene K… | Banned b… |
| 40 | G Data Software | 3.1 | | | 0 | Proprietary | Non-free | 1987 | Germany | Andreas L… | Third-par… |
| 41 | F-Secure | 3.1 | | | 0 | Proprietary | Non-free | 1991 | Finland | Petri Alla… | Third-par… |
| 42 | FRISK Software | 3.1 | | | 0 | Proprietary | Trialware | 1989 | Iceland | Friðrik Skúlason (s… | |
| 43 | AVG Technologies | 3.1 | N | | 0 | Proprietary | Free | 1992 | Czech Republic | Jan Gritzbach and … | |
| 44 | Fortinet | 2.9 | | | 0 | Proprietary | Free | 2004 | United States | | |
| 45 | Trend Micro | 2.9 | | | 0 | Proprietary | Non-free | 1990 | Japan | | |
| 46 | NANO Security Ltd | 2.8 | N | | 0 | Proprietary | Free | 2009 | Russia | | |
| 47 | ClamWin | 2.6 | | | 0 | GNU GPL | Free | 2002 | Australia | | Third-par… |
| 48 | Cisco (originally Im…) | 2.2 | | | 0 | Proprietary | Free | 2010 | United States | | Third-par… |
| 49 | Symantec (origina…) | 1.7 | | | 0 | Proprietary | Non-free | 2003 | United States | | |
| 50 | Microsoft | 1.7 | N | | 0 | Proprietary | Free | 2012 | United States | | |
| 51 | VirusBlokAda | 1.2 | | | 0 | Proprietary | Non-free | 2010 | Belarus | | |
| | Average > | | 3.5 | 6.0 | 0.0 | | | | | | |

Conclusion

The Windows Antivirus Vendor Comparison Matrix presented above contains actual data (per Wikipedia) and randomly assigned weights for each Criterion Column; therefore, the results are not meaningful. If a Utility were to download the Matrix and provide weights that are meaningful to its situation, the results would be helpful in determining which Vendors are more aligned with Best Practices, and which to invite for a deeper conversation.

Contact Erick Ford | Project Manager [email protected]