
Inner Conflict: How Smart Device Components Can Cause Harm

Omer Shwartz, Amir Cohen, Asaf Shabtai, Yossi Oren
Ben-Gurion University of the Negev, P.O.B. 653, Beer-Sheva, 8410501, Israel

Abstract

Mobile smart devices are built of smaller components that are often fabricated by third parties, and not by the device manufacturers themselves. Components such as sensors, radio transceivers, and touchscreen controllers are generally supplied to the manufacturers along with a device driver that facilitates communication between the component and the host device. Driver software is normally embedded within the kernel, where it is trusted to behave within defined parameters. Since the hardware of the smart device is expected not to change frequently, the device driver source code implicitly assumes that the hardware of the component is authentic and trustworthy. Such trust permits driver designs with a lax approach towards the integrity and security of data exchanged between the main processor and the hardware component. In this paper, we question this trust in hardware components. Smart devices such as phones are often repaired with replacement components, and identifying and authenticating the source of these components is usually very difficult. We assume a threat model in which a malicious replacement touchscreen is procured from an untrusted vendor. We construct two standalone attacks based on malicious touchscreen hardware. Our experiments demonstrate that these attacks allow an adversary to impersonate and eavesdrop on a user, exfiltrate data, and exploit the operating system, enabling the execution of privileged commands. To mitigate these and other similar attacks, we build and evaluate a machine-learning, hardware-based countermeasure capable of detecting abnormal communications with hardware components.

Keywords: Privacy, Reverse engineering, Smart devices

1. Introduction

Smart devices and gadgets can break or malfunction and often require repair. A common example is the smartphone, which is easily dropped, resulting in a shattered touchscreen. According to a 2015 study, over 50% of global smartphone owners have damaged their phone screen at least once, and 21% of global smartphone owners are currently using a phone with a cracked or shattered screen [1]. While phones with broken touchscreens may be repaired at phone vendor-operated facilities such as Apple Stores, it is often more convenient and cost-effective for phone users to use third-party repair shops. Some technically savvy users may even purchase touchscreen replacement kits from online marketplaces such as eBay and perform the repair themselves. These types of unofficial repairs tend to include the cheapest possible components, and thus may introduce, knowingly or unknowingly, counterfeit components into the phone.

Replaceable components, including phone touchscreens, batteries, and various sensors, are seldom produced by the phone manufacturers themselves. Instead, original equipment manufacturers (OEMs) such as Synaptics, MediaTek, and Maxim Integrated provide these components, as well as the device driver source code, to phone manufacturers who integrate the components into their phones. The manufacturers then integrate this device driver source code into their own source code, making slight adjustments to account for minor differences between device models, such as memory locations, I/O bus identifiers, etc. These minor tweaks and modifications make the process of creating and deploying patches for such device drivers a very difficult endeavor, as we discuss further in Section 2. The example in Figure 1 illustrates this setting. The smartphone's main logic board runs a specific OEM code (the device driver) that communicates with the touchscreen over the internal bus using a common, simple interface. Even hardened, secure, or encrypted phones, such as those used by governmental and law enforcement agencies, often use commercial operating systems and a driver architecture that follows the same paradigm [2].

An important observation we make is that the device drivers (written by the OEMs and slightly modified by the phone manufacturers) exist inside the phone's trust boundary. In contrast to drivers for "pluggable" peripherals such as USB accessories, these OEM drivers assume that the internal components they communicate with are also inside the phone's trust boundary. We observe, however, that these internal components actually belong outside

Email addresses: [email protected], [email protected], [email protected], [email protected] (Omer Shwartz, Amir Cohen, Asaf Shabtai, Yossi Oren)

Preprint submitted to Elsevier, October 26, 2019

of the smartphone's trust boundary. Indeed, there are some hundreds of millions of smartphones with untrusted replacement screens. This inherent dissonance underlies our research questions: How might a malicious replacement peripheral abuse the trust given to it by its device driver? How can we defend against this form of abuse?

Hardware replacement is traditionally considered a strong attack model, under which almost any attack is possible. However, in our case, we add an important restriction to this model: we assume that only a specific component, with an extremely limited hardware interface, is malicious. Furthermore, we assume that the repair technician installing the component is uninvolved. One can assume that these limitations make this attack vector weaker than complete hardware replacement; we show that it is not. Given the hundreds of millions of devices satisfying these assumptions in the wild [1], this form of abuse poses a significant danger.

While composing this paper, we searched public news sources for allegations or reports of hardware implants in smartphones. Despite our efforts, we could find none. This is unsurprising, and can be attributed to the highly directed and covert nature of these attacks, which are most likely to be mounted by nation-state attackers against targets who are either unaware of the attacks, or highly interested in suppressing reports about them.

In this work we highlight the risk of counterfeit or malicious components in the consumer setting, where the target is the user's privacy, personal assets, and trust. We show how a malicious touchscreen can record user activity, take control of the phone and install apps, direct the user to phishing websites, and exploit vulnerabilities in the operating system kernel in order to gain privileged control of the device. Since the attack is carried out by malicious code running from outside the CPU's main code space, the result is a fileless attack, which cannot be detected by anti-virus software, leaves no lasting footprint, and survives firmware updates and factory resets. We also propose a method for designing a flexible and generic countermeasure that may help protect against such attacks.

Our paper makes the following contributions:

1. We explore the risk of malicious peripheral attacks on consumer devices and argue that this avenue of attack is both practical and effective.

2. We detail recent events and analyses in regard to allegations of malicious hardware implants discovered in deployed servers and investigate the similarities to the attack model described in this paper.

3. We describe two attacks that form building blocks that can be used in a larger attack: a touch injection attack that allows the touchscreen to impersonate the user, and a buffer overflow attack that lets the attacker execute privileged operations. We implement both of these attacks on two Android devices assembled by different manufacturers.

4. Combining the two building blocks, we also present a series of end-to-end attacks that can severely compromise a stock Android phone with standard firmware. We implement and evaluate three different attacks, using an experimental setup based on a low-cost microcontroller embedded in-line with the touch controller communication bus. These attacks can: Impersonate the user - by injecting touch events into the communication bus, an attacker can perform any activity representing the user. This includes installing software, granting permissions, and modifying the device configuration; Compromise the user - an attacker can log touch events related to sensitive operations such as lock screen patterns, credentials, or passwords. An attacker can also cause the phone to enter a different state than the one expected by the user by injecting touch events. For example, we show an attack that waits until a user enters a URL for a website and then stealthily modifies the touch events to enter the URL of a phishing website, thereby causing the user to surrender his or her private information; Compromise the phone - by sending crafted data to the phone over the touch controller interface, an attacker can exploit vulnerabilities within the device driver and gain kernel execution capabilities.

5. We make a case for a hardware-based countermeasure that listens to or intercepts communications over a hardware interface and monitors for anomalies. We implement such a countermeasure using machine learning techniques, and evaluate it against the attacks demonstrated. The evaluation shows a true-positive rate of 100% with a false-positive rate of 0% over a detection period of 0.2 seconds.

The rest of this paper is structured as follows. Section 2 reviews related work on smart device malware and hardware attacks. We summarize some of the current state of malicious hardware implants in Section 3. We explore the attack model in detail in Section 4, including a survey of the attack surface along with the reasoning behind the proposed countermeasure. In Sections 5, 6, and 7, we demonstrate the attacks described above. Section 8 describes the implemented countermeasure and its effectiveness against the attacks. Finally, our work is concluded in Section 9.

Figure 1: The smartphone, its touch screen, and its associated device driver software. (The figure depicts the main logic board, containing the main CPU, main memory, and OEM code, connected over the I2C bus to the touchscreen's auxiliary CPU and auxiliary memory.)

2. Related Work

Throughout the relatively short history of smartphones, both smartphone malware and smartphone protection mechanisms have evolved drastically as the smartphone platform grew popular. Android malware in particular has been shown to utilize privilege escalation, siphon private user data, and enlist phones into botnets [3]. Bickford et al. [4] address the topic of smartphone rootkits, defining a rootkit as "a toolkit of techniques developed by attackers to conceal the presence of malicious software on a compromised system". A rootkit's malicious activities include wiretapping into phone calls, exfiltration of positioning information, and battery exhaustion denial of service. They can be detected with dynamic or static techniques [5].

Hardware interfaces have recently been the subject of concern for security researchers in the personal computer setting, due to their involvement in highly privileged processes [6]. Hardware components enjoying Direct Memory Access (DMA), such as the Graphics Processing Unit (GPU), can implant malware within the kernel memory [7]. Ladakis et al. [8] demonstrate a GPU-based keylogger in which the GPU abuses its DMA capabilities to monitor the keyboard buffer inside the kernel memory and save keystroke information to the GPU memory. Brocker et al. [9] use the firmware update functionality of a MacBook iSight camera to install malicious firmware on the camera. The authors showed how their own firmware could be used to discreetly take photos or videos without turning on the indicator light that informs the user that the camera is in use. Additionally, the authors use their firmware to enumerate the camera as a USB keyboard and present the ability of the enumerated device to escape virtual machine confinement.

When considering vulnerabilities emerging from hardware interfaces, some prior research has focused on external hardware interfaces such as USB. Wang et al. [10] performed an extensive study on the exploitation of smartphones via trusted profiles such as the Human Interface Device (HID), as well as through undocumented capabilities. Yang et al. [11] used power analysis techniques on a smartphone's USB connection to detect and classify touch screen activity, allowing them to infer browsing activity through this channel.

Most of the existing work dealing with internal hardware interfaces focused on hardware components that can either be updated by the user or easily replaced. Compared to PCs, smartphones are more monolithic by design, with a static hardware inventory and components that can only be replaced with matching substitutes. The smartphone operating system contains device firmwares that can only be updated alongside the operating system. Thus, there has been far less research emphasis on smartphone hardware, based on the assumption that it cannot be easily replaced or updated and is therefore less exposed to the threats discussed above. We challenge this assumption, noting that smartphone components are actually replaced quite frequently and often with non-genuine parts, as we show in Section 4.

The trouble that accompanies counterfeit components has not been completely ignored by the mobile industry (e.g., the "error 53" issue experienced by some iPhone users after replacing their fingerprint sensors with off-brand ones and failing validity checks [12]); however, it seems that such validity checks are not widely accepted, as counterfeit replacement components usually pass unnoticed. The risk of counterfeit components was also raised in the national security setting in a National Institute of Standards and Technology (NIST) draft, with an emphasis on supply chains [13]. Indeed, counterfeit assemblies or components are a key point of concern for the hardware industry [14].

A factor that hinders development of standard methods for counterfeit components is device customization. Android device customization security hazards have been systematically studied by Zhou et al. [15]. The authors designed a tool, ADDICTED, that detects customization hazards, and raised the concern that the customizations performed by manufacturers can potentially undermine Android security. In a previous work [16] we focused on driver customizations, reviewing the source code of 26 Android phones and mapping the device drivers that are embedded in the kernel of their operating system. Our survey found a great deal of diversity in OEMs and device drivers. Each phone contained different driver software, and there were few common device drivers among the phones tested. This landscape makes it difficult to patch, test, and distribute fixes for vulnerabilities found in driver code.

In the wider context of smartphone security, several works [17, 18, 19] have investigated the potential harm to user privacy by OEM and third-party code, such as advertisement and analytics libraries. In contrast to these works, which address third-party code running as part of applications with user-level permission, we discuss here OEM code in drivers, which has kernel-level permissions and therefore carries a higher security risk.
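The anomaly-monitoring countermeasure introduced in the contributions above (and detailed in Section 8) profiles benign bus traffic and flags deviations within a 0.2-second window. The sketch below is a deliberately simplified stand-in for that idea, not the deployed design: it summarizes each window by message rate and mean payload size, learns a profile from benign windows, and flags outliers. The feature set, the scaling scheme, and the distance threshold are all illustrative assumptions.

```python
def window_features(events, window=0.2):
    """Summarize one window of bus events.

    events: list of (timestamp, payload_len) tuples seen in the window.
    Returns (message_rate, mean_payload_len).
    """
    if not events:
        return (0.0, 0.0)
    rate = len(events) / window
    mean_len = sum(n for _, n in events) / len(events)
    return (rate, mean_len)

class AnomalyDetector:
    """Profile benign windows, then flag windows far from the profile."""

    def __init__(self, threshold=3.0):
        self.threshold = threshold
        self.center = None
        self.scale = None

    def fit(self, benign_windows):
        feats = [window_features(w) for w in benign_windows]
        dims = list(zip(*feats))
        self.center = [sum(d) / len(d) for d in dims]
        # Per-dimension spread; floor avoids division by zero for
        # features that are constant in the benign profile.
        self.scale = [max(max(d) - min(d), 1e-9) for d in dims]

    def is_anomalous(self, events):
        f = window_features(events)
        dist = max(abs(x - c) / s
                   for x, c, s in zip(f, self.center, self.scale))
        return dist > self.threshold

# Benign profile: roughly 10-14 touch reports per 0.2 s window.
benign = [[(0.0, 8)] * n for n in (10, 11, 12, 13, 14)]
detector = AnomalyDetector()
detector.fit(benign)
# A burst of 60 injected events in one window stands far outside it.
assert detector.is_anomalous([(0.0, 8)] * 60)
```

A real detector would use richer features (timing jitter, register access patterns) and a trained classifier, but the window-then-score structure is the same.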

3. Alleged Hardware Implants Reported

On October 4th, 2018, an article titled "The Big Hack: How China Used a Tiny Chip to Infiltrate U.S. Companies" was published on the Bloomberg Businessweek website [20]. The article described a malicious hardware implant of Chinese origin, as small as a grain of rice, that was found in electronic boards used in servers deployed in datacenters. According to Bloomberg, "The attack by Chinese spies reached almost 30 U.S. companies, including Amazon and Apple, by compromising America's technology supply chain, according to extensive interviews with government and corporate sources". Bloomberg also claimed that the affected companies discovered and reported these implants to U.S. authorities.

3.1. Responses

The report from Bloomberg was quickly met with skepticism and denial. A response article published by The Washington Post under the title "Your move, Bloomberg" [21] called for Bloomberg to provide more information, while reaching out to the allegedly attacked companies and publishing their responses. The responses quoted from Amazon and Apple included a complete denial of any knowledge of hardware modification or implants in their datacenters. Later on, some of the companies even called for Bloomberg to retract the article [22]. While the equipment manufacturer Supermicro's stock lost over 40 percent of its value [23], the security community remained conflicted and confused about whether these attacks happened and whether such an implant can actually be manufactured, installed, and maintain effectiveness without being detected.

3.2. Properties of the Implant

The Bloomberg article described the hardware implant as smaller than a grain of rice and not part of the original board design. Additionally, Bloomberg provided a drawing showing the implant's location on the motherboard. Security researchers [24] proposed that, based on the drawing, the implant was disguised as a passive component near the Flash memory chip containing the firmware for the BMC (Baseboard Management Controller).

The BMC is an integrated circuit (IC) that is responsible for the management of various resources of the electronic board and usually has direct access to other processors and their memory, as well as a network connection. Compromising the BMC can mean compromising the whole system. A typical BMC, such as the one installed in the motherboards examined, makes use of an SPI (Serial Peripheral Interface) Flash chip to hold its firmware. Subverting the SPI Flash or its communication lines is equivalent to modifying the BMC firmware.

3.3. Feasibility of an Implant

Security researcher Trammell Hudson investigated whether such an implant could be manufactured and whether it would have any ability to disrupt and spy on the motherboard [25]. In his work, Hudson addresses several key issues contesting such an implant:

1. The ability to change the board behavior - by analyzing the BMC firmware and finding unencrypted portions in it, Hudson showed how a simple modification to the firmware resulted in execution of arbitrary code in the BMC.

2. Initiating a change from a small physical footprint - by inserting his implant in place of a resistor on the motherboard, Hudson gained the ability to intercept data sent from the SPI Flash to the BMC and nullify bits on demand. Hudson also showed how these capabilities are sufficient for making useful changes to the firmware loaded to the BMC.

3. Manufacturing a component small enough to avoid detection - Hudson showed how a modern microcontroller core is smaller than a standard surface mount resistor and therefore can be embedded within it.

3.4. Review and Limitations

A malicious hardware implant, similar to the one reported by Bloomberg, has the capacity to do great damage. It benefits from the trust components enjoy in embedded systems, and also from simple interfaces such as SPI and Inter-Integrated Circuit (I2C) buses that facilitate communication between components. The replacement of components is also demonstrated in projects such as TPMGenie [26], where a malicious implant hijacks the I2C bus between a CPU and a Trusted Platform Module (TPM), thus disabling a crucial level of security used in encryption and verification.

These attacks are limited by their requirement for access to the victim device during manufacturing or transport. In the rest of this paper we investigate how such attacks can be performed on an already deployed device by targeting replaceable units of hardware.

4. Our Attack Model

Counterfeit components have been in existence ever since the dawn of the industrial age, and their effectiveness as attack vectors when tampered with is also well known. What, then, is unique about the particular setting of a smartphone? We argue that our specific attack model is a unique restriction of the hardware replacement attack model: we assume that only a specific component, with an extremely limited hardware interface, is malicious, while the rest of the phone (both hardware and software) can still be trusted. Furthermore, we assume that the repair technician installing the component does not have malicious intent, and won't perform any operations beyond replacing the

original component with a malicious one. One can assume that these limitations make this attack vector weaker than complete hardware replacement; however, we show that this is not the case, demonstrating that the nature of the smartphone ecosystem makes this attack model both practical and effective. Our attack model is particularly dangerous, given that there are hundreds of millions of devices in the wild that satisfy these assumptions.

The pervasiveness of untrusted components in the smartphone supply chain was investigated in September 2016 by Underwriters Laboratories (UL) [28]. UL researchers obtained 400 iPhone charger adapters from multiple sources in eight different countries around the world, including the U.S., Canada, Colombia, China, Thailand, and Australia, and discovered that nearly all of them were counterfeit and contained sub-standard hardware. Similarly, in October 2016, Apple filed a lawsuit against Amazon supplier Mobile Star LLC, claiming that "Apple, as part of its ongoing brand protection efforts, has purchased well over 100 iPhone devices, Apple power products, and Lightning cables sold as genuine by sellers on Amazon.com [...and] revealed that almost 90% of these products are counterfeit." [29]. Considering the condition of the third-party marketplace, one can assume with high confidence that unless a phone has been repaired at a vendor-operated facility such as an Apple Store, it is likely to contain counterfeit components.

4.1. Malicious Peripherals

Let us next assume that a malicious peripheral, such as a touchscreen, has made it into a victim's smartphone. What sort of damage can it cause?

As stated in [16, 30], attacks based on malicious hardware can be divided into two different classes. First-order attacks use the standard interaction modes of the component, but do so without the user's knowledge or consent. In the specific case of a malicious touchscreen, the malicious peripheral may log the user's touch activity or inject touch events in order to impersonate the user for malicious purposes. We demonstrate some of these attacks in Section 5. Second-order attacks go beyond exchanging properly-formed data, and attempt to cause a malfunction in the device driver and compromise the operating system kernel. Such an attack requires that the peripheral send malformed data to the CPU, causing the device driver to malfunction and thereby compromise the operating system kernel. Once the kernel is compromised, it is possible to disable detection and prevention of suspicious system activity, eavesdrop on sensors and on other applications, and most significantly, operate on systems where only a partial software stack has been loaded, such as a device in a charging, standby, or off state [31, 32, 15, 33, 34].

While first-order attacks don't rely on the presence of a software vulnerability and can be performed by any malicious peripheral contained in the phone, the existence of a vulnerability that can be exploited is a prerequisite of second-order attacks. Similar attacks performed by malicious pluggable peripherals (e.g., USB peripherals) to compromise a smartphone have been demonstrated [35]. A review of 1077 Android CVEs (Common Vulnerabilities and Exposures) patched between August 2015 and April 2017 shows that at least 29.5% (318 items) take place in the device driver context [27]. Figure 2 shows the growth in driver-related CVEs.

Figure 2: The percentage of patched Android CVEs that occur in driver context out of all patched Android CVEs. The figure was compiled using information from the Android Security Bulletins [27]. In May 2017, changes to the Android Security Bulletins website format decreased the amount of information given, rendering the context of many CVEs uncertain.

Driver vulnerabilities are often detected in the pluggable setting, where the driver controls a peripheral that might be detached or replaced by the user. This leads to a general lack of attention to the internal component setting. We argue that it is very likely that internal components might be used to exploit vulnerabilities just like pluggable components are known to do. In this paper we describe two such vulnerabilities that we found in common touchscreen drivers (Synaptics S3718 and Atmel T641).

4.2. Case Study - Touch Controller Communications

In most smartphones, the touch controller communicates with the device driver residing on the host processor via a dedicated I2C bus [36], a general purpose, low speed bus designed for cost-effective communication between ICs. The I2C bus behaves as a physical layer between master and slave devices, where master devices are allowed to read and write from and to registers in the slave device's memory. By manipulating these registers, the device driver, acting as master, can control the behavior of the touch controller, acting as slave. In the other direction, the touch controller can send events to the device driver by populating the appropriate registers and triggering an interrupt. On top of this low-level communication interface, the device driver typically defines a proprietary layer required for the instrumentation and operation of the touch controller.

In the Nexus 6P phone, the Synaptics S3718 touch controller daughter board has I2C connections to the host processor. It has an additional contact for generating an interrupt signal notifying the host processor of touch-related events. The I2C bus operates at the rate of 400 Kbps. A basic mapping of the touch controller registers and functions was extracted from the open source device driver made available by Google. Additional reverse engineering and observation provided a fuller picture of the communication protocol.

4.2.1. Boot up process

During the boot up process, the device driver probes the touch controller memory and learns which functions the controller possesses. A controller function or capability is reported through a six-byte function descriptor. The function descriptor contains four register addresses used for manipulating the function, along with an interrupt count that signifies the number of interrupt types the function generates. A mapping of several controller functions can be seen in Table 1. After probing and querying for the existence of functions, the device driver checks the firmware installed

Table 1: Partial mapping of the main Synaptics S3718 functions and their purpose

Function ID | Query Address | Command Address | Control Address | Data Base Address | Function Purpose
0x01        | 0x3F          | 0x36            | 0x14            | 0x06              | General control and status of the touch controller
0x12        | 0x5C          | 0x00            | 0x1B            | 0x08              | Reporting of simple touch events, including multi-finger touches
0x51        | 0x04          | 0x00            | 0x00            | 0x00              | Firmware update interface
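Table 1 can be read as a register map keyed by function ID. The sketch below transcribes it and adds a parser for the six-byte function descriptors described in Section 4.2.1. The descriptor field order shown here (the four base addresses, then the interrupt count, then the function ID) is our assumption for illustration, not a quote from vendor documentation, and the interrupt count byte in the example is an invented value.

```python
from collections import namedtuple

# Register map transcribed from Table 1 (function ID -> base addresses).
S3718_FUNCTIONS = {
    0x01: {"query": 0x3F, "command": 0x36, "control": 0x14, "data": 0x06},
    0x12: {"query": 0x5C, "command": 0x00, "control": 0x1B, "data": 0x08},
    0x51: {"query": 0x04, "command": 0x00, "control": 0x00, "data": 0x00},
}

FunctionDescriptor = namedtuple(
    "FunctionDescriptor",
    "query command control data interrupt_count function_id")

def parse_descriptor(raw):
    """Parse one six-byte function descriptor read during boot probing."""
    if len(raw) != 6:
        raise ValueError("a function descriptor is exactly 6 bytes")
    return FunctionDescriptor(*raw)

# Example: a descriptor announcing function 0x12 (simple touch events).
# The interrupt count byte (0x01 here) is illustrative.
desc = parse_descriptor(bytes([0x5C, 0x00, 0x1B, 0x08, 0x01, 0x12]))
assert S3718_FUNCTIONS[desc.function_id]["data"] == desc.data == 0x08
```

A malicious controller answers these same probes, which is exactly why a crafted descriptor stream can steer the driver, as Section 6 shows.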

against the firmware file embedded in the kernel memory and triggers a firmware update if necessary. Eventually, the device driver enables the appropriate handlers for all function-specific interrupts and writes the configuration data to the relevant functions.

4.2.2. Touch reporting

In order to generate a touch event, the touch controller electrically pulls the interrupt line to the ground and thus notifies the device driver of an incoming event. The device driver in turn reads the interrupt register 0x06 and deduces which of the touch controller functions generated the interrupt. In the case of a normal touch event this will be function 0x12. The device driver continues to read a bitmap of the fingers involved in this event from register 0x0C and eventually reads register 0x08 for a full inventory of the touch event.

4.3. Attack Setup

The attacks were demonstrated on a Huawei Nexus 6P smartphone running the Android 6.0.1 operating system, build MTC19X, and operating with stock manufacturer firmware; in addition, the phone has been restored to factory state with a memory wipe using the "factory data reset" feature in the settings menu.

The touchscreen assembly was separated from the rest of the phone and the touch controller daughter board was located. Using a hot air blower on the connection between the touch controller daughter board and the main assembly daughter board, we were able to separate the boards and access the copper pads. The copper pads were soldered to thin enameled copper wire that was attached to a prototyping board. Using this setup, we were able to simulate a chip-in-the-middle scenario in which a benign touchscreen has been embedded with a malicious integrated chip that manipulates the communication bus. Figure 3 shows the entire attack setup.

Figure 3: The complete attack setup. The figure shows an exposed touch controller interface wired to a prototyping board embedded with auxiliary electronics and connected to an Arduino microcontroller module. The prototyping board is also connected to an STM32L432 microcontroller module [37] which is used as a countermeasure. Inset: wires soldered onto the touch controller communication connection.

Our attack used a commonly available Arduino platform [38] based on the ATmega328 microcontroller. A setup such as the one described above can easily be scaled down in size by a factory or skilled shop in order to fit on the touchscreen assembly daughter board. The ATmega328, the programmable microcontroller used in our attacks, is available in packages as small as 4x4x1 mm [39]. Other, more powerful microcontrollers are available with smaller footprints of 1.47x1.58x0.4 mm or less [40]. Since the data sent by our attack fully conforms to layers 1 and 2 described in the I2C specification, it can also be implemented in the firmware of the malicious peripheral's internal microcontroller, removing the need for additional hardware altogether.

4.4. Applicability to Other Phone Models and Operating Systems

An analysis of other recent Android phones, the Google Pixel 3 and the Essential PH-1, suggests that similar architectures and practices are still shared among the majority of phone models. The two phones analyzed use either I2C or SPI buses to communicate with a multitude of peripherals such as the wireless charger, power management, NFC, speaker amplifier, battery management, and touchscreen. Additional analysis of the driver source code revealed usage of unsafe programming practices such as the use of unrestricted memory copy functions. A few potential bugs and one minor security flaw were also found and are being communicated to the vendors.

5. First-Order Attacks

First-order attacks take advantage of peripherals' capabilities and abuse the trust they are afforded by the user or other components of the system. In this section, we demonstrate and evaluate several such attacks.

5.1. Touch Logging and Injection Attacks

In this attack, the malicious microcontroller eavesdrops on user touch events (touch logging) and injects generated ones into the communication bus (touch injection). The microcontroller software behind the phishing attack shown in Table 2 is built of three components: two state machines, one maintaining a keyboard mode and the other maintaining a typing state, and a database that maps screen

Table 2: A summary of the first-order attacks demonstrated.

Attack                                   | Time to Execute | Screen Blanked? | Video Demo
Malicious software installation          | 21 seconds      | Yes             | https://youtu.be/83VMVrcEOCM
Take picture and send via email          | 14 seconds      | Yes             | https://youtu.be/WS4NChPjaaY
Replace URL with phishing URL            | <1 second       | No              | https://youtu.be/XZujd42eYek
Log and exfiltrate screen unlock pattern | 16 seconds      | Yes             | https://youtu.be/fY58zoadqMA

regions to virtual keyboard buttons. The state machine that maintains the keyboard modes changes state when a keyboard mode switch key has been pressed. The basic Nexus 6P keyboard has four modes: English characters, symbols, numbers, and emoji. The typing state machine is used for tracking the typed characters and matching them to specified trigger events (such as typing in a URL). Complex context information, such as keyboard orientation, language, activity, and even user identity, has been shown to be detectable from low-level touch events by other authors [41, 42, 43, 44]. When the required trigger event is reached, touch injection begins and a set of generated touch events is sent on the communication line. Our current hardware is capable of generating touch events at a rate of approximately 60 taps per second.

5.2. User Impersonation and User Compromise

The touch logging and injection capabilities shown can be extended and used for a variety of malicious activities. Since our attack model assumes a malicious touchscreen assembly, the attacker can turn off power to the display panel while a malicious action is performed, allowing most attacks to be carried out stealthily.

The first attack we demonstrate is the malicious software installation attack. As illustrated in the video, this attack installs and starts an app from the Google Play Store. By using Android's internal search functionality, the attacker can type in the name of the Play Store app instead of searching for it onscreen, making our attack more resilient to users who customize their graphical home screens. It is important to note that the attack can install an app with arbitrary rights and permissions, since the malicious touchscreen can confirm any security prompt displayed by the operating system. This attack takes less than 30 seconds and can be performed when the phone is unattended and the screen is powered off.

Next, we show how the malicious touchscreen can take a picture of the phone's owner and send it to the attacker via email. As seen in the video, this attack activates the camera and sends a 'selfie' to the attacker. This attack also takes less than 30 seconds and can be performed while the display is turned off, allowing the attack to be carried out without the user's knowledge.

Our third attack shows how the malicious screen can stealthily replace a hand-typed URL with a phishing URL. As the video shows, this attack waits for the user to type a URL, then quickly replaces it with a matching phishing URL. The confused user can then be enticed to type in his or her credentials, assuming that a hand-typed URL is always secure. This attack takes less than one second but requires that the screen is turned on and the user is present, thus risking discovery. We note that the demonstrated attack setup has a typing speed of over 60 characters per second.

Our fourth attack shows how the malicious screen can log and exfiltrate the user's screen unlock pattern to an online whiteboard website. The video demonstrates how the attack records the user's unlock pattern and draws it on an online whiteboard, which is shared via the Internet with the attacker's PC. This attack demonstrates both the collection and the exfiltration abilities of the attack vector. This attack also takes less than 30 seconds, and its exfiltration step can also be performed while the screen is turned off.

Table 2 summarizes the attacks discussed above.

6. Second-Order Attacks

Second-order attacks are attacks that exploit weaknesses in the system that are unintentionally exposed to the peripherals. The examples presented in this section show how software vulnerabilities, such as buffer overflows, can be leveraged in this fashion.

6.1. Arbitrary Code Execution Attacks

This attack exploits vulnerabilities in the touch controller device driver embedded within the operating system kernel in order to gain arbitrary code execution within the privileged kernel context. A chain of data manipulations performed by the malicious microcontroller causes a heap overflow in the device driver that is further exploited to perform a buffer overflow.

6.1.1. Design

As a part of the boot procedure, the device driver queries the functionality of the touch controller. We discovered that by crafting additional functionality information, we can cause the device driver to discover more interrupts than its internal data structure can contain, causing a heap overflow. Using the heap overflow, we were able to further increase the amount of available interrupts by overrunning the integer holding that value. Next, an interrupt was triggered, causing the device driver to request an abnormally large amount of data and cause a buffer overflow. The buffer overflow was exploited using a Return Oriented Programming (ROP) [45] chain designed for the ARM64 architecture.

6.1.2. Implementation

• ... system calls, resulting in many different vulnerabilities exploitable through many techniques.

• Create a hidden vulnerability within the kernel. A specific kernel vulnerability is generated, functioning as a backdoor for a knowledgeable attacker while remaining hidden.

6.1.4. Attacks on additional devices

While the main attack demonstrated here is crafted for the Nexus 6P phone, many other phones use similar device drivers [16].
A small scale review performed by the authors When triggered, the malicious microcontroller shuts on three additional phones that contain a Synaptics touch down power to the touch controller and begins imitat- controller (Samsung Galaxy S5, LG Nexus 5X, LG Nexus ing normal touch controller behavior. During the boot 5) show that these phones have similar vulnerabilities to sequence, the malicious microcontroller emulates the mem- the ones exploited in the attack described here. ory register image of the touch controller and responds To further demonstrate the generality of our attack in the same way to register writes using a state machine. method, we extended it to another target device with a When probed for function descriptors in addresses higher different hardware architecture. The device we investigated than 0x500 that normally do not exist within the touch con- was an LG G Pad 7.0 (model v400) tablet. This device troller, the microcontroller responds with a set of crafted runs the Android 5.0.2 operating system and contains a function descriptors designed to cause the interrupt reg- different touchscreen controller than the Nexus 6P phone. ister map to exceed its boundaries. Within the device An STM32L432 microcontroller module was connected driver, a loop iterates over the interrupt register map and to the communication lines belonging to the touch con- writes values outside the bounds of an interrupt enable troller, and the original touch controller daughter board map, causing the integer holding the number of interrupt was disconnected. The microcontroller was programmed sources to be overwritten. After waiting 20 seconds for to replay previously recorded responses of a genuine touch the boot procedure to complete, the microcontroller ini- controller. Inspection of the device driver revealed unsafe tiates an interrupt by pulling the interrupt line to the buffer handling in numerous locations. By falsely reporting ground. 
The device driver, which should then read up to an abnormally large entity, the malicious microcontroller four interrupt registers, instead reads 210 bytes, causing was able to cause the device driver to read 2048 bytes from a buffer overflow, a ROP chain that calls the ker- the bus into an 80-byte global array. The buffer overflow af- nel function mem_text_write_kernel_word() that writes fected kernel memory and resulted in the overrun of various over protected kernel memory with a chosen payload resides internal pointers and eventually a crash. within the 210 bytes requested from the touch controller. While the attack shown in this section is not complete, Table 3 contains additional information about the ROP these preliminary results show how the complete attacks chain. shown in Sections 5 and 7 can be implemented on additional devices with different peripherals. 6.1.3. Evaluation In addition, the similarity in different peripheral im- Four different payloads were demonstrated on top of plementations makes adapting existing attacks to new pe- the ROP chain described above and tested in attack sce- ripherals easier. For example, after reverse engineering the narios on a phone with stock firmware and factory-restored touch reporting mechanism of the Atmel touch controller, settings and data. the Synaptics touch injection attack can be copied onto Each of the four payloads succeeded in compromising to devices with an Atmel touch controller, even without the phone’s security or data integrity. A list of the tested discovering any vulnerability in the Atmel device driver. payloads is as follows: 7. End-to-End Attacks • Disable all user capability checks in setuid() and setgid() system calls. This allows any user or app While each of the attacks described in Section 5 poses to achieve root privileges with a simple system call. 
a threat on their own, a combination of these attacks along • Silently incapacitate the Security Enhanced Linux with second-order attacks can lead to an even more powerful (SELinux) [46] module. While SELinux will still outcome. report blocking suspicious activity, such activity will The final attack presented completely compromises the not actually be blocked. phone, disables SELinux, and opens a reverse shell con- nected to a remote attacker. This attack is unique in that • Create a user exploitable system-wide vulnerability. it requires an exploitable bug in the third-party device The buffer check is disabled for all user buffers on sys- driver code. We describe this attack in more detail in the 9 Table 3: ROP chain designed for the ARM64 architecture. This chain results in a call to a predefined function with two arguments.

Gadget Order | Gadget Code                                                                                                   | Relevant Pseudocode
1            | ldp x19, x20, [sp, #0x10]; ldp x29, x30, [sp], #0x20; ret;                                                    | Load arguments from stack to registers X19 and X20
2            | mov x2, x19; mov x0, x2; ldp x19, x20, [sp, #0x10]; ldp x29, x30, [sp], #0x30; ret;                           | Assign X2 := X19; load arguments from stack to registers X19 and X20
3            | mov x0, x19; mov x1, x20; blr x2; ldp x19, x20, [sp, #0x10]; ldr x21, [sp, #0x20]; ldp x29, x30, [sp], #0x30; ret; | Assign X0 := X19; assign X1 := X20; call X2(X0, X1)
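To make the chain in Table 3 concrete, the fake stack it consumes can be sketched with a short builder script. This is an illustration only: the gadget and target addresses below are hypothetical placeholders (real values depend on the victim kernel image), and `build_chain` is our own illustrative helper, not tooling from the attack.

```python
import struct

def p64(value):
    # Pack one 64-bit little-endian stack slot.
    return struct.pack("<Q", value)

# Hypothetical placeholder addresses -- the real ones depend on the
# victim kernel image and are not taken from the paper.
GADGET2 = 0xFFFFFFC0004A1000  # mov x2, x19; ...; ldp x29, x30, [sp], #0x30; ret
GADGET3 = 0xFFFFFFC0004A2000  # mov x0, x19; mov x1, x20; blr x2; ...
TARGET  = 0xFFFFFFC000123000  # e.g. mem_text_write_kernel_word()

def build_chain(dest, value):
    """Lay out the fake stack consumed by gadgets 1-3 of Table 3."""
    chain = b""
    # Frame popped by gadget 1 (0x20 bytes):
    # [sp]      -> x29 (unused)       [sp+0x08] -> x30 = gadget 2
    # [sp+0x10] -> x19 = call target  [sp+0x18] -> x20 (unused)
    chain += p64(0) + p64(GADGET2) + p64(TARGET) + p64(0)
    # Frame popped by gadget 2 (0x30 bytes); gadget 2 has already copied
    # x19 (the target) into x2 before popping this frame:
    # [sp+0x08] -> x30 = gadget 3; [sp+0x10] -> x19 = first argument;
    # [sp+0x18] -> x20 = second argument; the rest is padding.
    chain += p64(0) + p64(GADGET3) + p64(dest) + p64(value) + p64(0) * 2
    # Gadget 3 then performs X0 := X19, X1 := X20 and calls X2(X0, X1).
    return chain
```

With both frames in place, the 0x50-byte blob would sit inside the oversized response returned by the malicious controller, so that the overwritten return address lands on the first gadget.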

Table 4: A demonstration of a complete end-to-end attack.

Attack                    | Time to execute | Screen Blanked? | Video Demo
Complete phone compromise | 65 seconds      | Yes             | https://youtu.be/sDfD5fJfiNc

7.1. Phone Compromise

To completely compromise the phone, we use a combination of touch events and driver exploits, as illustrated in Figure 4: First, the attacker uses touch injection to install an innocent-looking app from the Google Play app market. The next time the phone restarts, the malicious microcontroller initiates kernel exploitation during the boot sequence and creates a vulnerability in the kernel that is exploitable by the app. Once the phone completes booting, the previously installed app uses the vulnerability created by the microcontroller to take control of the system and perform malicious activity. The malicious app then reboots the phone, and the now-compromised phone resumes normal activity.

Figure 4: Fully compromising the phone using a malicious touchscreen.

7.2. Attack Implementation

For this demonstration, a user app was created and uploaded to the Google Play app market. The app starts when the phone boots up and performs a series of system calls by writing to the pseudo-file "/proc/self/clear_refs". While the phone is in a normal state, these system calls cause no issues and should not raise suspicion. During the exploitation of the kernel by the malicious microcontroller, the actions of the pseudo-file "/proc/self/clear_refs" are modified, and a vulnerability is introduced into it. This changes the behavior of the app, which is now able to exploit that vulnerability and execute code in the kernel context. We note that since the app is designed to exploit a vulnerability that is non-existent under normal conditions, it appears completely benign when a malicious screen is not present. This enabled our app to overcome malware filters and detectors, including Google Play's gatekeeper, Google Bouncer.

Once the app has gained the ability to execute commands with kernel permissions, it elevates privileges to root, deactivates the SELinux protection module, exfiltrates private application data and authentication tokens, submits the data to an online service, and finally creates a root shell enabling an attacker to gain remote access. A video demonstration of the attack is available via the link in Table 4.

8. Countermeasure

A countermeasure that efficiently protects against the demonstrated attacks while not hindering development and production needs to have certain attributes. It needs to be able to detect threats quickly while remaining cheap, generic, and easy to implement. Additionally, it must not be greatly affected by mechanisms such as software updates, which may damage its detection capabilities.

In order to protect the phone from a malicious replacement component, we propose implementing a low-cost, hardware-based solution in the form of an I2C interface proxy firewall. Such a firewall can monitor the communication of the I2C interfaces and protect the device from attacks originating from the malicious component.

The new hardware component can be physically located in-line, on the path between the component and the OEM code running on the CPU (as shown in Figure 5), and thus can modify or block malicious communication. Another alternative is that the component does not interrupt the path between the untrusted component and the OEM code running on the CPU, but rather passively monitors the traffic without modifying it. This may allow the use of lower-cost components with only a single monitoring port and lower clock rates, since the component does not have to operate at the line speed. In another possible implementation of the solution, the firewall is not implemented as an individual hardware component, but rather as independent software modules running on the primary CPU.

Figure 5: I2C on-path (left) and off-path (right) firewall.

The use of a hardware countermeasure allows for protection against both inserted malicious components and modified firmware attacks. It may also detect malicious behavior of firmware code that was modified by an insider and is therefore still signed or encrypted.

This proposed solution implements common firewall functionality such as signature matching, anomaly detection, filtering, and rate limiting. Each data packet traveling between the component and the CPU, and vice versa, will be analyzed by the I2C interface firewall. When the traffic traveling between the component and the CPU is detected as malicious, the firewall can block this traffic and thus protect the integrity of the CPU (if implemented in-line as presented in Figure 5). In addition, the firewall can send a signal to the CPU or reset the CPU, thus returning it to a safe state.

8.1. Motivation

The unique attack model we discuss in our paper allows us to "fight hardware with hardware". While software countermeasures are capable of performing analysis of the contents of data packets, they are blind to variations in the implementation of the physical protocols. Hardware components have the ability to sense changes in the physical layer of the protocol, such as cadence, delays, voltages, or even electronic interference. Many of these characteristics are unique to each family of microcontrollers, and a great deal of effort is required to mimic or copy such characteristics in a different device.

Performing intrusion detection using hardware fingerprinting has previously been shown to be effective on more complex communication buses such as the Controller Area Network (CAN bus) [47].

A hardware implementation allows for a transparent solution where the firewall component does not modify or influence the behavior of OEM-supplied code. Robustness is also achieved: when the firmware of the component or the OEM-supplied code is updated, the firewall does not need to be modified or replaced. In addition, the hardware implementation can be developed to be generic at the interface level. An implementation of the firewall for the I2C interface can be applied with little or no modification to different devices and components communicating over that interface.

Such a hardware implant can even be designed in a way that is agnostic and unaware of the type and function of the untrusted hardware peripheral. The implementation described and evaluated in the following subsections relies on data transmission statistics to create a model of normal communication and discover anomalies while being unaware of the type of peripheral it is guarding. An agnostic design also facilitates large-scale deployment, where a single design can be used in multiple hardware configurations.

8.2. Analysis

In our research, we focus on protecting against chip-in-the-middle or chip replacement attack scenarios where the main CPU interacts with a malicious microcontroller.

The design of our countermeasure was motivated by the anticipated slight variations in communication properties between different microcontrollers and implementations. While any microcontroller capable of I2C should comply with the standard, the standard does allow some implementational freedom so as to maintain robustness.
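These implementation-specific variations can be quantified directly from a logic-analyzer capture. The sketch below is a minimal illustration, not the paper's tooling; it assumes hypothetical input lists of timestamps (in microseconds) and computes two of the timing statistics used in this section: the slave's SDA rise time after a clock edge and the delay between consecutive frames.

```python
from statistics import median, stdev

def timing_features(clock_edges, sda_toggles, frame_bounds):
    """Compute simple timing statistics from captured bus timestamps.

    clock_edges / sda_toggles: paired timestamps (microseconds) of a master
    clock edge and the slave's following SDA toggle.
    frame_bounds: (start, end) timestamps of each frame on the bus.
    The input format is hypothetical, for illustration only.
    """
    rise_times = [sda - clk for clk, sda in zip(clock_edges, sda_toggles)]
    inter_frame = [nxt[0] - cur[1]
                   for cur, nxt in zip(frame_bounds, frame_bounds[1:])]
    return {
        "sda_rise_median": median(rise_times),
        "sda_rise_std": stdev(rise_times),
        "inter_frame_median": median(inter_frame),
        "inter_frame_std": stdev(inter_frame),
    }
```

A fingerprint built from such statistics can then be compared between a trusted reference device and a suspect one.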
In chip-in-the-middle scenarios, the CPU communicates directly with the malicious chip, and the activity on the communication bus is expected to differ from when the malicious chip is absent.

In order to explore the differences in communication between the original controller and our malicious microcontroller, we connected a Saleae Logic Pro 8 logic analyzer and recorded the communication bus in both scenarios at a sampling rate of 5,000,000 samples per second.

(a) Data collected during phone boot (unmodified phone)

Feature                  | # of samples | Max  | Median | Mean   | Std
SDA rise time (µs)       | 252296       | 2.8  | 1.6    | 1.4    | 0.5
Inter-frame delay (µs)   | 12042        | 26.4 | 24.4   | 24.7   | 0.4
Message length (bytes)   | 12019        | 74   | 3      | 5.7    | 10.2
Inter-message delay (µs) | 12017        | -    | 288.4  | 466441 | 4521765.2

(b) Data collected while using the touchscreen (unmodified phone)

Feature                  | # of samples | Max  | Median | Mean    | Std
SDA rise time (µs)       | 141370       | 2.4  | 1.6    | 1.4     | 0.5
Inter-frame delay (µs)   | 6935         | 25.4 | 25     | 24.9    | 0.3
Message length (bytes)   | 6926         | 8    | 2      | 4       | 2.8
Inter-message delay (µs) | 6924         | -    | 457.2  | 19145.5 | 140717.6

(c) Data collected during normal boot (chip-in-the-middle scenario)

Feature                  | # of samples | Max    | Median | Mean    | Std
SDA rise time (µs)       | 2038         | 1.8    | 1.6    | 1.6     | 0.2
Inter-frame delay (µs)   | 114          | 3494.6 | 24.2   | 955.6   | 1327.4
Message length (bytes)   | 80           | 74     | 3      | 6.7     | 12.5
Inter-message delay (µs) | 79           | -      | 3548.4 | 97373.3 | 595202.5

(d) Data collected during touch injection (chip-in-the-middle scenario)

Feature                  | # of samples | Max    | Median | Mean    | Std
SDA rise time (µs)       | 7815         | 1.8    | 1.6    | 1.6     | 0.2
Inter-frame delay (µs)   | 527          | 2833.8 | 24.2   | 240.4   | 744.3
Message length (bytes)   | 505          | 8      | 2      | 3.9     | 2.8
Inter-message delay (µs) | 504          | -      | 3397.2 | 53320.6 | 322301

Table 5: Statistics of the collected features from an unmodified phone operating under normal conditions (a, b) and from a phone with a malicious chip replacing the touchscreen controller (c, d).

Figure 6: Differences in SDA rise time patterns for benign touch interactions (12294 samples) and chip-in-the-middle touch injection attack (7815 samples) scenarios. The differences originate from architectural differences between the malicious microcontroller and the original touch controller.

Differences were observed and recorded between the communications in both scenarios. Notably, some of the differences were the result of the micro-architectural differences between the original controller and our microcontroller. The features observed are described below:

• SDA rise time after clock signal - this measures the elapsed time between a clock signal generated by the I2C master device and the toggle of the SDA signal by the I2C slave device. Since our bus operates at 400,000 bits per second, the SDA rise time can be as high as 2.5 µs. Due to the sampling frequency, values slightly above 2.5 µs were also recorded. This feature is determined by the micro-architecture of the slave. An example of the difference in this parameter between different microcontrollers can be seen in Figure 6.

• Inter-frame delay - this measurement shows the elapsed time between consecutive frames in I2C messages. While this parameter is usually determined by the I2C master, as the master controls the clock and rate of data transfer, the slave may employ clock stretching to delay its response, thus prolonging the inter-frame time. Usage of clock stretching is determined by the micro-architecture of the slave along with the software operating on it.

• Message length - the length of each transmitted message, in bytes.
• Inter-message delay - this measurement shows the elapsed time between consecutive I2C messages. While the time difference between messages is governed by the I2C master, the master may be driven by the slave into sampling it at a certain rate, depending on the protocol.

Table 5 shows the gathered statistics of these features. The samples were collected from 150 boot cycles of an unmodified phone along with several hundreds of touch events. The table shows how architectural features, such as the SDA rise time and inter-frame time, do not differ between operation modes, while data-related features, such as message length and inter-message time, show small but significant differences.

8.3. Design

The parameters required for the calibration and operation of the detection algorithm can be determined and programmed during the production of the whole device, or learned by the countermeasure during a brief learning period. During the collection of data from 150 device boots and several hours of touch interactions, the recorded statistics did not deviate from a baseline established at the beginning of the experiment, suggesting that the statistics remain constant over time and that the learning period can be as short as several minutes.

Using the collected statistics of the features, a simple statistics-based classifier can be employed to detect anomalous behavior that points to a change in the I2C slave's micro-architecture or behavior. The algorithm for such classification can be seen in Algorithm 1.

input     : A sliding window W containing N I2C bytes
output    : A classification ∈ {benign, malicious} of the data
parameter : N - size of the sliding window
parameter : t - tolerance for benign classifications
parameter : OMSRT - observed median of SDA rise time
parameter : OSSRT - observed standard deviation of SDA rise time
parameter : OMIFD - observed median of inter-frame delay
parameter : OSIFD - observed standard deviation of inter-frame delay
parameter : OMML - observed maximal message length

1:  MSRT ← median of SDA rise time in W
2:  SSRT ← standard deviation of SDA rise time in W
3:  MIFD ← median of inter-frame delay in W
4:  SIFD ← standard deviation of inter-frame delay in W
5:  MML ← maximal observed message length in W
6:  if MML > OMML then return malicious
7:  values ← {(MSRT, OMSRT), (SSRT, OSSRT), (MIFD, OMIFD), (SIFD, OSIFD)}
8:  foreach (val, observed) ∈ values do
9:      if val < observed · (1 − t) or val > observed · (1 + t) then return malicious
10: end
11: return benign

Algorithm 1: Detection of communications by a malicious implant using micro-architectural and behavioral differences.

8.4. Evaluation

An effective countermeasure is expected to discriminate between benign and malicious scenarios with a low rate of error. In the case of a hardware countermeasure that is integrated in critical hardware paths, error rates should be negligible. We consider an evaluation of such a countermeasure successful if it provides a zero false-positive malicious identification rate along with safe and explainable margins for the classification of new and unknown samples.

8.4.1. Implementation

A separate microcontroller module was attached in parallel to the I2C bus and interrupt line of the touch controller. The microcontroller monitors the communication between the touch controller and the host processor and issues alerts if it detects anomalous behavior. An STM32L432 microcontroller module was connected to the SDA, SCL, and INTR lines of the touch controller. Patterns and frequencies of I2C communications were recorded under various usage scenarios by decoding the signals according to the I2C protocol. Due to the simple nature of the communication protocol between the host processor and the touch controller, the patterns had very little variance. Additionally, the maximum I2C message length represents the largest normal register read and never exceeds a certain value. Afterwards, the same setup was used for recording the attacks described in this paper.

The recorded data was then divided into two data sets, as seen in Table 6. The training set was used to determine the baseline and threshold for detection, and the test set was input to a script executing the algorithm described in Algorithm 1 for evaluation using a sliding window methodology.

8.4.2. Selection of detection parameters

The detection algorithm relies on the window size to maximize TPR and minimize FPR. A window size that is too small increases noise and results in misdetections, while a window size that is too large can cause the algorithm to ignore small anomalies. The window size selection was done by analyzing the behavior of the device under normal settings (as described in the training set in Table 6) while inferring the minimal transaction size for an activity.

Data set     | Activities recorded                                                                                                                                                              | Size in bytes (benign) | Size in bytes (malicious) | Overlapping windows (N=400) | Distinct windows (N=400)
Training set | Phone boot sequence (benign), touch interactions (benign)                                                                                                                        | 134714                 | 0                         | 134315                      | 336
Test set     | Phone boot sequence (benign), touch interactions (benign), phone boot with a chip-in-the-middle (malicious), exploitation through phone boot (malicious), touch injection (malicious) | 128676                 | 13127                     | 141404                      | 354

Table 6: Detection model evaluation data sets.

In our case, we notice that a full boot sequence involves the transfer of 1089 bytes and that a 0.2-second simple touch event is indicated by the transfer of 420 bytes. The window size chosen for this experiment was N = 400.

In order to select parameters for successful detection of threats, a baseline was established by observing the statistics generated on the training set using the window size previously determined. Setting a window size of N = 400 and threshold values of SSRT = 0.22, t = 0.55 yields a false-positive rate of FPR = 0% when evaluating the test set.

Other parameters can be determined similarly by observing statistics gathered from benign data. This method could also possibly be developed to be performed automatically during a learning period of the countermeasure device's life.

8.4.3. Detection of first-order attacks

When evaluated on the test set, the algorithm correctly detected all of the windows involving first-order attacks using the parameters N = 400 and t = 0.55. In addition, no benign windows were misclassified in the test set. Considering the minimal data transaction duration and size determined previously, we conclude that a successful detection of a chip-in-the-middle scenario can be achieved within 0.2 seconds while maintaining TPR = 100% and FPR = 0%.

8.4.4. Detection of second-order attacks

This attack was performed using a malicious chip implant that replaced the original touch controller in the same way as was done in the demonstration of the first-order attacks. The fact that the micro-architectural differences between the benign and test setups remain led to the successful classification of the attack using the same parameters as in the previous subsubsection.

In addition to micro-architecture-based detection, it is worth mentioning that the second-order attack demonstrated in Section 6 injects an abnormally large amount of data over the I2C bus; this allows for the simple classification done in line 6 of Algorithm 1. Using frame size for detection allows the countermeasure to maintain FPR = 0% over any window size, but it relies on the exploitation requiring an amount of data that exceeds normal transactions.

8.4.5. Detection of the complete end-to-end attacks

As shown in the previous sections, the countermeasure was capable of detecting both the first-order and second-order stages of this attack and generating an indication or interruption that would have completely stopped the attack.

8.5. Impact on Device Performance

The hardware countermeasure module performs active collection and evaluation of data during its operation. As such, it requires power to operate. The manufacturer datasheet of the microcontroller used for evaluating the countermeasure methods in this paper, the STM32L432 [37], reports a power usage of 1.05 mA when running at 8 MHz and 1.8 V, which translates to 0.0019 W. While inactive and in shut-down state, the microcontroller consumes 63 nA (1.1·10^-7 W). Considering a typical smartphone battery capacity in the range of 8-13 Watt-hours [48] and an active screen time of 10% on a 48-hour power budget, the microcontroller will be responsible for less than 0.1 percent of the overall power consumption.

As previously mentioned, the countermeasure may be placed in-line within the communication bus and relay messages in a manner that introduces a delay. We observe that the duration of a single byte transmission by a full-speed, 400 kbit/s I2C device is equivalent to 1600 clock cycles of the 80 MHz STM32L432 microcontroller. Therefore, we conclude that an efficient countermeasure program executed on a modern microcontroller should be sufficient for avoiding delays larger than the transmission duration of a single byte.

8.6. Limitations

While the protection of the proposed countermeasure against chip-in-the-middle scenarios and chip replacement

scenarios is hard to circumvent, the protection against malicious firmware is not as powerful. A malicious firmware operates on the same microcontroller as a benign one and may possess any of the properties of the latter. We expect that in the case of malicious firmware, data-based anomaly detection can be used to further enhance the countermeasure in a way that will improve detection.

9. Conclusions

The threat of a malicious peripheral existing inside consumer electronics should not be taken lightly. The conversation around the alleged hardware implants raises a serious concern about if and where these have been embedded, and what damage they are doing. As this paper shows, attacks by malicious implants and peripherals are feasible, scalable, and invisible to most detection techniques. A well-motivated adversary may be fully capable of mounting such attacks on a large scale or against specific targets. System designers should consider replacement components to be outside the phone's trust boundary and design their defenses accordingly.

Conservative estimates assume that there are about two billion smartphones in circulation today. Assuming that 20% of these smartphones have undergone screen replacement [1], there are on the order of 400 million smartphones with replacement screens in the world. An attack which compromises even a small fraction of these smartphones through malicious components will be comparable in scale to the largest PC-based botnets.

The countermeasure described in our work was shown to be effective against the proposed attacks. As a low-cost and minimally disruptive solution, the hardware countermeasure can be further developed to protect against threats similar to those presented in this paper, as well as other threats that may be introduced to the system, such as active fault attacks.

Acknowledgments

This research was supported by Israel Science Foundation grants 702/16 and 703/16.

References

[1] Motorola Mobility. Cracked screens and broken hearts - the 2015 Motorola global shattered screen survey. https://community.motorola.com/blog/cracked-screens-and-broken-hearts.
[2] Defense Information Systems Agency. The Department of Defense Approved Products List. https://aplits.disa.mil/processAPList.
[3] Yajin Zhou and Xuxian Jiang. Dissecting Android malware: Characterization and evolution. In 2012 IEEE Symposium on Security and Privacy, pages 95-109. IEEE, 2012.
[4] Jeffrey Bickford, Ryan O'Hare, Arati Baliga, Vinod Ganapathy, and Liviu Iftode. Rootkits on smart phones: attacks, implications and opportunities. In Proceedings of the Eleventh Workshop on Mobile Computing Systems & Applications, pages 49-54. ACM, 2010.
[5] Seyyedeh Atefeh Musavi and Mehdi Kharrazi. Back to static analysis for kernel-level rootkit detection. IEEE Transactions on Information Forensics and Security, 9(9):1465-1476, 2014.
[6] Jonathan Corbet, Alessandro Rubini, and Greg Kroah-Hartman. Linux Device Drivers: Where the Kernel Meets the Hardware. O'Reilly Media, Inc., 2005.
[7] Janis Danisevskis, Marta Piekarska, and Jean-Pierre Seifert. Dark side of the shader: Mobile GPU-aided malware delivery. In International Conference on Information Security and Cryptology, pages 483-495. Springer, 2013.
[8] Evangelos Ladakis, Lazaros Koromilas, Giorgos Vasiliadis, Michalis Polychronakis, and Sotiris Ioannidis. You can type, but you can't hide: A stealthy GPU-based keylogger. In Proceedings of the 6th European Workshop on System Security (EuroSec), 2013.
[9] Matthew Brocker and Stephen Checkoway. iSeeYou: Disabling the MacBook webcam indicator LED. In USENIX Security, pages 337-352, 2014.
[10] Zhaohui Wang and Angelos Stavrou. Exploiting smart-phone USB connectivity for fun and profit. In Proceedings of the 26th Annual Computer Security Applications Conference, pages 357-366. ACM, 2010.
[11] Qing Yang, Paolo Gasti, Gang Zhou, Aydin Farajidavar, and Kiran S Balagani. On inferring browsing activity on smartphones via USB power analysis side-channel. IEEE Transactions on Information Forensics and Security, 12(5):1056-1066, 2017.
[12] Apple Inc. Error 53 support page. https://support.apple.com/en-il/HT205628.
[13] National Institute of Standards and Technology. Cybersecurity Framework v1.1 - Draft. https://www.nist.gov/cyberframework/draft-version-11.
[14] Inez Miyamoto, Thomas H Holzer, and Shahryar Sarkani. Why a counterfeit risk avoidance strategy fails. Computers & Security, 66:81-96, 2017.
[15] Xiaoyong Zhou, Yeonjoon Lee, Nan Zhang, Muhammad Naveed, and XiaoFeng Wang. The peril of fragmentation: Security hazards in Android device driver customizations. In 2014 IEEE Symposium on Security and Privacy, pages 409-423. IEEE, 2014.
[16] Omer Shwartz, Guy Shitrit, Asaf Shabtai, and Yossi Oren. From smashed screens to smashed stacks: Attacking mobile phones using malicious aftermarket parts. In Workshop on Security for Embedded and Mobile Systems (SEMS 2017), 2017.
[17] Xing Liu, Jiqiang Liu, Sencun Zhu, Wei Wang, and Xiangliang Zhang. Privacy risk analysis and mitigation of analytics libraries in the Android ecosystem. IEEE Transactions on Mobile Computing, 2019.
[18] Yongzhong He, Xuejun Yang, Binghui Hu, and Wei Wang. Dynamic privacy leakage analysis of Android third-party libraries. Journal of Information Security and Applications, 46:259-270, 2019.
[19] Jacob Leon Kröger and Philip Raschke. Is my phone listening in? On the feasibility and detectability of mobile eavesdropping. In IFIP Annual Conference on Data and Applications Security and Privacy, pages 102-120. Springer, 2019.
[20] Bloomberg Businessweek. The big hack: How China used a tiny chip to infiltrate U.S. companies. bloomberg.com, 2018.
[21] Eric Wemple. Your move, Bloomberg. washingtonpost.com, 2018.
[22] Duncan Riley. Apple, Amazon and Super Micro call on Bloomberg to retract China spy chip story. siliconangle.com, 2018.
[23] Supermicro. Supermicro historic stock price. supermicro.com, 2019.
[24] Theo Markettos. Making sense of the Supermicro motherboard attack. lightbluetouchpaper.org, 2018.
[25] Trammell Hudson. Modchips of the state. 2018.
[26] NCC Group Plc. TPM Genie. https://github.com/nccgroup/TPMGenie.
[27] Google. Android Security Bulletin.
[28] UL. Counterfeit adapters.
[29] Amit Chowdhry. Apple: Nearly 90% of 'genuine' iPhone chargers on Amazon are counterfeit. Forbes.com, 2016.
[30] Omer Shwartz, Amir Cohen, Asaf Shabtai, and Yossi Oren.

15 Shattered trust: when replacement smartphone components attack. In 11th {USENIX} Workshop on Offensive Technologies ({WOOT} 17), 2017. [31] Shakeel Butt, Vinod Ganapathy, Michael M Swift, and Chih- Cheng Chang. Protecting commodity operating system kernels from vulnerable device drivers. In Computer Security Appli- cations Conference, 2009. ACSAC’09. Annual, pages 301–310. IEEE, 2009. [32] CC Okolie, FA Oladeji, BC Benjamin, HA Alakiri, and O Olisa. Penetration testing for android smartphones. 2013. [33] Mordechai Guri, Yuri Poliak, Bracha Shapira, and Yuval Elovici. Joker: Trusted detection of kernel rootkits in android devices via jtag interface. In Trustcom/BigDataSE/ISPA, 2015 IEEE, volume 1, pages 65–73. IEEE, 2015. [34] Guillermo Suarez-Tangil, Juan E Tapiador, Pedro Peris-Lopez, and Arturo Ribagorda. Evolution, detection and analysis of malware for smart devices. IEEE Communications Surveys & Tutorials, 16(2):961–987, 2014. [35] Nir Nissim, Ran Yahalom, and Yuval Elovici. Usb-based attacks. Computers & Security, 70:675–688, 2017. [36] NXP. I2C-bus specification and user manual, April 2014. http: //www.nxp.com/documents/user_manual/UM10204.pdf. [37] STMICROELECTRONICS. STM32L432 Datasheet, May 2018. https://www.st.com/resource/en/datasheet/stm32l432kc. pdf. [38] Arduino. Arduino Home Page. https://www.arduino.cc. [39] Atmel Corporation. ATmega Datasheet, 2018. https://ww1.microchip.com/downloads/en/DeviceDoc/ ATmega48A-PA-88A-PA-168A-PA-328-P-DS-DS40002061A.pdf. [40] Cypress. PSoC 4000 Family Datasheet, November 2017. http: //www.cypress.com/file/138646/download. [41] Mario Frank, Ralf Biedert, Eugene Ma, Ivan Martinovic, and Dawn Song. Touchalytics: On the applicability of touchscreen input as a behavioral biometric for continuous authentication. IEEE Trans. Information Forensics and Security, 8(1):136–148, 2013. [42] Julian Fierrez, Ada Pozo, Marcos Martinez-Diaz, Javier Galbally, and Aythami Morales. 
Benchmarking touchscreen biometrics for mobile authentication. IEEE Transactions on Information Forensics and Security, 13(11):2720–2733, 2018. [43] Chao Shen, Yong Zhang, Xiaohong Guan, and Roy A Maxion. Performance analysis of touch-interaction behavior for active smartphone authentication. IEEE Transactions on Information Forensics and Security, 11(3):498–513, 2016. [44] Moran Azran, Niv Ben Shabat, Tal Shkolnik, and Yossi Oren. Brief announcement: Deriving context for touch events. In International Symposium on Cyber Security Cryptography and Machine Learning, pages 283–286. Springer, 2018. [45] Ralf Hund, Thorsten Holz, and Felix C Freiling. Return-oriented rootkits: Bypassing kernel code integrity protection mechanisms. In USENIX Security Symposium, pages 383–398, 2009. [46] Stephen Smalley, Chris Vance, and Wayne Salamon. Imple- menting selinux as a linux security module. NAI Labs Report, 1(43):139, 2001. [47] Kyong-Tak Cho and Kang G Shin. Fingerprinting electronic control units for vehicle intrusion detection. In 25th {USENIX} Security Symposium ({USENIX} Security 16), pages 911–927, 2016. [48] Kyung Mo Kim, Yeong Shin Jeong, and In Cheol Bang. Thermal analysis of lithium ion battery-equipped smartphone explosions. Engineering Science and Technology, an International Journal, 22(2):610–617, 2019.

16