A Performance Analysis of Intrusion Detection with Snort and Security Information Management
Linköping University | Department of Computer and Information Science
Master's thesis, 30 ECTS | Datateknik
2021 | LIU-IDA/LITH-EX-A--21/068--SE

A Performance Analysis of Intrusion Detection with Snort and Security Information Management

En Prestandaanalys av Intrångsdetektering med Snort och Hantering av Säkerhetsinformation

Christian Thorarensen

Supervisor: Mohammad Borhani
Examiner: Andrei Gurtov
External supervisor: Villiam Rydfalk

Linköpings universitet
SE–581 83 Linköping
+46 13 28 10 00, www.liu.se

Upphovsrätt

Detta dokument hålls tillgängligt på Internet - eller dess framtida ersättare - under 25 år från publiceringsdatum under förutsättning att inga extraordinära omständigheter uppstår.

Tillgång till dokumentet innebär tillstånd för var och en att läsa, ladda ner, skriva ut enstaka kopior för enskilt bruk och att använda det oförändrat för ickekommersiell forskning och för undervisning. Överföring av upphovsrätten vid en senare tidpunkt kan inte upphäva detta tillstånd. All annan användning av dokumentet kräver upphovsmannens medgivande. För att garantera äktheten, säkerheten och tillgängligheten finns lösningar av teknisk och administrativ art.

Upphovsmannens ideella rätt innefattar rätt att bli nämnd som upphovsman i den omfattning som god sed kräver vid användning av dokumentet på ovan beskrivna sätt samt skydd mot att dokumentet ändras eller presenteras i sådan form eller i sådant sammanhang som är kränkande för upphovsmannens litterära eller konstnärliga anseende eller egenart.

För ytterligare information om Linköping University Electronic Press se förlagets hemsida http://www.ep.liu.se/.

Copyright

The publishers will keep this document online on the Internet - or its possible replacement - for a period of 25 years starting from the date of publication barring exceptional circumstances.
The online availability of the document implies permanent permission for anyone to read, to download, or to print out single copies for his or her own use and to use it unchanged for non-commercial research and educational purposes. Subsequent transfers of copyright cannot revoke this permission. All other uses of the document are conditional upon the consent of the copyright owner. The publisher has taken technical and administrative measures to assure authenticity, security and accessibility.

According to intellectual property law the author has the right to be mentioned when his/her work is accessed as described above and to be protected against infringement. For additional information about the Linköping University Electronic Press and its procedures for publication and for assurance of document integrity, please refer to its www home page: http://www.ep.liu.se/.

© Christian Thorarensen

Abstract

Network intrusion detection systems (NIDSs) are a major component in cybersecurity and can be implemented with open-source software. Active communities and researchers continue to improve projects and rulesets used for detecting threats to keep up with the rapid development of the internet. With the combination of security information management, automated threat detection updates and widely used software, NIDS security can be maximized. However, it is not clear how different combinations of software and basic settings affect network performance.

The main purpose of this thesis was to find out how multithreading, standard ruleset configurations and near real-time data shipping affect the Snort IDS's online and offline performance. Investigations and results were designed to guide researchers or companies in enabling maximum security with minimum impact on connectivity. Software used in performance testing was limited to Snort 2.9.17.1-WIN64 (IDS), Snort 3.1.0.0 (IDS), PulledPork (rule management) and Open Distro for Elasticsearch (information management).
To increase the replicability of this study, the experimentation method was used, and network traffic generation was limited to 1.0 Gbit/s hardware. Offline performance was tested with traffic recorded from a webserver during February 2021 to increase the validity of test results, but detection of attacks was not the focus.

Through experimentation it was found that multithreading reduced the runtime of offline analysis by 68–74% on an octa-thread system. On the same system, Snort's drop rate was reduced from 9.0% to 1.1% by configuring multiple packet threads for 1.0 Gbit/s traffic. Secondly, the Snort Community and Proofpoint ET Open rulesets showed approximately 1% and 31% dropped packets, respectively. Finally, enabling data shipping services to integrate Snort with Open Distro for Elasticsearch (ODFE) did not have any negative impact on throughput, network delay or Snort's drop rate. However, the usability of ODFE needs further investigation.

In conclusion, Snort 3 multithreading enabled major performance benefits, but not all open-source rules were available. In future work, the shared security information management solution could be expanded to include multiple Snort sensors, triggers, alerting (email) and suggested actions for detected threats.

Acknowledgments

First, I am extremely grateful to Andrei Gurtov and Mohammad Borhani for the great feedback during the whole thesis project. Secondly, a big thank you to MindRoad AB for providing me with this assignment and a great work environment at their COVID-safe office in Linköping Science Park. Finally, I would like to thank Cornelia Folke for her love, support and never-ending belief in me.

Contents

Abstract iii
Acknowledgments iv
Contents v
List of Figures viii
List of Tables ix
Listings x

1 Introduction 1
1.1 Motivation 1
1.2 Aim 2
1.3 Research questions 3
1.4 Delimitations 3
1.5 MindRoad AB 4

2 Theory 5
2.1 Intrusion detection 5
2.2 Intrusion detection system categories 5
2.3 Intrusion prevention 7
2.4 IDS and IPS differences 7
2.5 Snort 9
2.5.1 Snort architecture 9
2.5.2 Snort sniffing, logging and network interfaces 9
2.5.3 Snort pcap readback 10
2.5.4 Snort rules 12
2.5.5 Snort NIDS 13
2.5.6 Snort packet I/O and LibDAQ 13
2.5.7 Preprocessors 13
2.5.8 PulledPork 14
2.6 Rulesets 14
2.6.1 Snort Community Ruleset 14
2.6.2 Snort Subscriber Ruleset 14
2.6.3 Snort Registered Ruleset 14
2.6.4 Proofpoint Emerging Threats Open Rules 15
2.6.5 Proofpoint Emerging Threats Pro Rules 15
2.7 Elastic Stack 15
2.7.1 Elasticsearch 16
2.7.2 Kibana 16
2.7.3 Logstash 16
2.7.4 Beats 18
2.7.5 Open source and software licenses 18
2.8 Open Distro for Elasticsearch 18
2.9 D-ITG 18

3 Related Work 19
3.1 IDS: Detection of attacks 19
3.2 IDS: Performance, load and stress testing 20
3.3 IDS: Packet I/O performance 22
3.4 IDS: Network traffic generation for testing 22
3.5 IDS: Encrypted traffic 23
3.6 Security information and event management 24
3.7 Related work compared to this thesis 25

4 Method 26
4.1 Pre-study 26
4.1.1 Experimentation in software engineering 26
4.1.2 Definitions in experimentation 27
4.1.3 Methods in related work 28
4.2 Experiment planning 29
4.2.1 Snort, PulledPork and SIEM 29
4.2.2 Variable definitions 29
4.2.3 Datasets 30
4.2.4 Snort rulesets 33
4.2.5 Snort performance test rule and payload 34
4.2.6 D-ITG network traffic generation configurations 34
4.2.7 SIEM integration and test environment 35
4.2.8 Information management 35
4.2.9 Snort 3 multiple packet threads and DAQ 37
4.3 Experiment design 37
4.3.1 Experiment: Snort pcap readback and packet threads 37
4.3.2 Experiment: Snort pcap readback, standard rulesets and policies 39
4.3.3 Experiment: Snort IDS real-time performance 41
4.4 Experiment presentation 44

5 Results 45
5.1 Snort pcap readback and packet threads 45
5.2 Snort pcap readback, standard rulesets and policies 47
5.2.1 MR1: Snort 2 Community Max-detect alerts 47
5.2.2 MR1: ET Open alerts 49
5.2.3 MR1: Snort 2 built-in rules alerts 49
5.2.4 MR1: All rulesets 49
5.3 Snort IDS real-time performance 50
5.3.1 Generated network traffic with and without Snort 50
5.3.2 Snort 3 packet threads 50
5.3.3 Snort 2 open-source rulesets 51
5.3.4 SIEM integration 51
5.3.5 Snort 3 DAQs: pcap vs afpacket 52

6 Discussion 53
6.1 Results 53
6.1.1 Snort pcap readback and packet threads 53
6.1.2 Snort pcap readback and rulesets 54
6.1.3 Snort rulesets and detection of attacks 54
6.1.4 Snort IDS real-time performance 55
6.1.5 SIEM integration 55
6.1.6 Snort 3 DAQs: pcap vs afpacket 55
6.1.7 Snort IDS real-time performance hypotheses 56
6.2 Method 56
6.2.1 Experimentation 56
6.2.2 Datasets 56
6.2.3 Snort IDS and environment 57
6.2.4 Snort, NIC and truncation ...