SECURITY & PRIVACY

Rapid Trust Establishment for Pervasive Personal Computing

Trust-Sniffer’s staged approach to establishing confidence in untrusted machines balances security and ease-of-use, facilitating rapid use of transient hardware.

Ajay Surie, Adrian Perrig, Mahadev Satyanarayanan, and David J. Farber
Carnegie Mellon University

In the emerging Internet Suspend/Resume (ISR) mobile computing model,1–3 users exploit pervasive deployments of inexpensive, mass-market PC hardware rather than carrying portable hardware. ISR's driving vision is that plummeting hardware costs will someday eliminate the need to carry computing environments in portable computers. Instead, ISR will deliver an exact replica of the last checkpointed state of a user's entire computing environment (including the operating system, applications, files, and customizations), on demand over the Internet, to hardware located nearby. So, ISR cuts the tight binding between PC state and PC hardware. ISR is implemented by layering a virtual machine on distributed storage. The virtual machine encapsulates execution and user customization; the distributed storage transports that state across space and time.

In this new computing model, establishing trust in unmanaged hardware for transient use becomes a major issue. Today, when a user sits down at a computer in the office or home, he or she implicitly assumes that the machine hasn't been tampered with and that it doesn't contain malware, such as a keystroke logger. This assumption is reasonable because unauthorized physical access to the machine is restricted. The same assumption applies to a portable computer that the user physically safeguards at all times. For ISR's vision of transient hardware use to become commonplace, users must be able to quickly establish a similar confidence level in hardware that they don't own or manage. To address this problem, we've created Trust-Sniffer, a tool that helps users incrementally gain confidence in an initially untrusted machine.

Trust-Sniffer focuses on software attacks but doesn't guard against hardware attacks, such as modifying the basic I/O system. Potential defenses against hardware attacks include physical surveillance or the use of tamper-proof or tamper-evident hardware. Although Trust-Sniffer is designed to safeguard mobile users, you can couple it with system operators' security technologies, such as full disk encryption, direct memory-access monitoring, and remote attestation.

Trust-Sniffer's staged approach to establishing confidence in an untrusted machine balances the needs for security, usability, and speed (for a comparison of Trust-Sniffer and other approaches, see the sidebar).

Design overview
Trust-Sniffer aims to enhance security with modest user effort. A key design principle, motivated by ISR's unique characteristics, is to validate only the software a user needs for a task. Specifically, most of a user's execution environment is fetched from a trusted server over an authenticated, encrypted channel. This includes the guest operating system and applications that execute inside the user's virtual machine.

Figure 1. Trust-Sniffer's staged approach to trust establishment. Initially, all software is untrusted. (a) The trust initiator's boot uses a minimal trusted operating system to validate the on-disk OS. (b) Next, the trusted host OS is booted from disk, which validates applications as required. (c) The host OS only permits trusted applications to execute.

Cached virtual-machine state on disk is always encrypted, and ISR verifies its integrity before a virtual machine uses it. So, it's only necessary to verify the integrity of a small core of local ISR and system software. Other compromised state isn't a threat if it's never used when a machine is functioning as an ISR client. This minimalist approach speeds trust establishment and makes ISR more ubiquitous. A second important design principle is to prevent the execution of untrusted software. This is in contrast to attestation techniques, which facilitate the detection of untrusted software but don't prevent its execution.

Staged approach
Figure 1 shows Trust-Sniffer's staged approach to trust establishment.

Establishing a root of trust. First, the user performs a minimal boot of the untrusted machine from a small, lightweight device, such as a USB memory stick, that the user owns and carries at all times. This trust initiator device serves as the root of trust in Trust-Sniffer. Relying on a trusted physical possession is convenient and easy for the user to understand. This stage involves a minimal boot because its sole purpose is for Trust-Sniffer to examine the local disk and verify that it's safe to perform a normal boot using the on-disk operating system and its associated boot software. The minimal boot ignores the network and any devices other than the disk.

Booting the on-disk operating system. Next, Trust-Sniffer performs a normal reboot from disk, which is now known to be safe. This ensures that a full suite of drivers for the local hardware (such as graphics accelerators) as well as correct local environment settings (such as those for printers, networks, and time) is obtained. During this boot process, Trust-Sniffer dynamically loads the trust extender module into the kernel, which extends the zone of trust as execution proceeds. On the first attempt to execute any code that lies outside the current zone of trust (including dynamically linked libraries), the kernel triggers a trust fault. The trust extender handles a trust fault by verifying the suspect module's integrity and stops execution if the module's integrity cannot be established.

Validating other local software on trust faults. As each component of ISR client software is accessed for the first time, the kernel generates trust faults and Trust-Sniffer validates the component's integrity. Once the user gains confidence in the machine and its ISR software, he or she can perform the resume step of ISR. On the other hand, if Trust-Sniffer can't validate any component of ISR client software, it alerts the user and terminates the ISR resume sequence.

On-demand validation is much more robust than en-masse validation with respect to ISR software evolution. If a new release of ISR client software uses a local software component that previous releases didn't use, Trust-Sniffer will discover its use, even if a software developer fails to list it.

Example use
The following example illustrates our system's use. Bob boots up a PC at his hotel's business center using his USB key. Trust-Sniffer's initial scan quickly validates the on-disk operating system, which is subsequently booted. Bob then initiates the resume step of ISR. Once Trust-Sniffer verifies the ISR client software's integrity, ISR fetches Bob's personal execution environment over a trusted communication channel. The locally installed Web browser is riddled with malware, but this doesn't affect Bob's task because he works only within his trusted personal execution environment.

Threat model and assumptions
Machines used as ISR clients are vulnerable to attacks such as modifications to client or system software and installation of malware such as key-logging or screen-capture software. Trust-Sniffer's goal is to avoid potential loss or disclosure of user data by validating software on an ISR client machine before a user accesses his or her personal execution environment.


Figure 2. The Trust-Sniffer architecture. The trust initiator plugs into an untrusted machine, validates its operating system, and equips it with a list of trusted applications’ SHA-1 hashes (reference measurements). Applications that haven’t been validated cause a trust fault, which the trust extender handles. If an application can’t be validated, the user-space trust alerter notifies the user.

We assume that, to use our system, the user may reboot the untrusted machine and that the BIOS allows booting from a USB memory stick. We also assume that modifying the BIOS is difficult. So, we don't guard against virtual-machine-based attacks, such as SubVirt,4 where a compromised BIOS could boot directly into a malicious virtual-machine monitor. In addition, Trust-Sniffer is based on load-time binary validation, so it doesn't protect against runtime attacks. Specifically, only applications with valid signatures may load and execute, but Trust-Sniffer wouldn't detect, for example, buffer overflow attacks initiated after execution has begun.

Detailed design and implementation
Figure 2 shows Trust-Sniffer's three major components:

• the trust initiator and its associated minimal boot software;
• the trust extender, implemented as a kernel module; and
• the trust alerter, a user-space notifier application.

Integrity measurement architecture
Trust-Sniffer builds on the implementation of Integrity Measurement Architecture for Linux.5 IMA checks the system software stack's integrity by computing a SHA-1 hash over an executable's contents when it's loaded. This SHA-1 hash is called a measurement. IMA is built into the kernel and is invoked whenever executable code is loaded, accounting for user-level binaries, dynamically loadable libraries, kernel modules, and scripts. IMA measures each loaded executable and stores its measurement in a list in the kernel. An aggregate measurement of the list is stored in the Trusted Platform Module, a secure hardware coprocessor that protects the list's integrity. The TPM aggregate and the kernel measurement list facilitate remote-party attestation of the system software stack.

IMA is part of the kernel and assumes trust in all software that's executed before it's invoked. Our system builds on IMA and addresses these concerns. Because remote-party attestation isn't our system's intent, we don't use a TPM chip. Instead, we use a small, user-carried passive device to initiate the establishment of confidence in the boot loader and kernel on the untrusted machine. The machine's OS kernel is then responsible for measuring software applications and preventing untrusted software from executing.

Validating applications
Trust-Sniffer validates an application by comparing its sample measurement, obtained when the application is loaded, to a known list of reference measurements generated from trusted applications. A mismatch is an inequality between a sample measurement and every reference measurement in the list. In contrast to attestation, where measurements detect untrusted software after it has been loaded, validation refers to detection of untrusted software before it's used.

The reference measurement list must be generated initially and updated only when patches and new software are released. Failing to update the measurement list implies that new trusted software will be prevented from executing. Execution of untrusted software is never allowed. For simplicity and comparability, our measurement list format is the same as that output by sha1sum, a utility commonly available on most Linux distributions.

In our initial implementation, we recorded measurements for the latest version of an application. However, the list could contain multiple trusted measurements for an application such as the kernel. In this case, if Trust-Sniffer encountered a system with multiple kernels, it could permit the execution of a trusted kernel even if the remaining ones were compromised.
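To make the sha1sum-compatible list format concrete, the sketch below computes a measurement for one file and prints it as a reference-list entry. It is an illustration rather than Trust-Sniffer's own tooling: OpenSSL's SHA1 routines are our choice here, and any SHA-1 implementation would serve.

```c
/*
 * Illustrative only: compute a sha1sum-compatible "measurement" line for one
 * executable, in the same "<hex digest>  <path>" format the reference
 * measurement list uses. OpenSSL's SHA1 routines are an assumption of this
 * sketch, not a requirement of Trust-Sniffer.
 */
#include <openssl/sha.h>
#include <stdio.h>

static int print_measurement(const char *path)
{
    FILE *f = fopen(path, "rb");
    if (!f)
        return -1;

    SHA_CTX ctx;
    SHA1_Init(&ctx);

    unsigned char buf[8192];
    size_t n;
    while ((n = fread(buf, 1, sizeof buf, f)) > 0)
        SHA1_Update(&ctx, buf, n);
    fclose(f);

    unsigned char digest[SHA_DIGEST_LENGTH];
    SHA1_Final(digest, &ctx);

    for (int i = 0; i < SHA_DIGEST_LENGTH; i++)
        printf("%02x", digest[i]);
    printf("  %s\n", path);           /* two spaces, as sha1sum prints */
    return 0;
}

int main(int argc, char **argv)
{
    for (int i = 1; i < argc; i++)
        if (print_measurement(argv[i]) != 0)
            perror(argv[i]);
    return 0;
}
```

Compiled against libcrypto (with -lcrypto) and run over each trusted binary, this emits lines in the same form that sha1sum produces, so the resulting file can serve directly as a reference measurement list.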

Sidebar: Related Work in Trust Establishment

When comparing Trust-Sniffer to other work, we considered the following:

• Most systems are designed with an administrator in mind and don't focus on giving users security guarantees.
• Although some solutions offer more security than our system, their complex hardware and software requirements limit their adoption in practice.

The trusted boot and secure boot1 mechanisms verify software components required during a PC's bootstrap process. However, both mechanisms require platform modifications. Neither mechanism verifies software, such as operating system services and user applications, that is executed after the bootstrap process. Additionally, although trusted boot helps detect untrusted software through recorded application signatures, it doesn't prevent untrusted execution because it doesn't validate the signatures it records.

Trust-Sniffer differs from solutions such as SoulPad, which lets users configure a machine with their own personal computing environment using software carried on a portable device.2 This approach isn't always practical because personalized environments often don't contain correct local-environment settings, such as printer and network settings, or appropriate drivers for the target machine. Trust-Sniffer requires minimal boot software and subsequently loads software on the target system after establishing its integrity.

A well-studied problem is the untrusted terminal, where a user interacts with an untrusted device. The most interesting formulation of the problem is using an untrusted device to establish secure communication with a trusted remote device. Researchers have proposed several solutions, including camera-based authentication,3 visual cryptography,4 and trusted smart cards.5 Other techniques to verify the authenticity of a system's configuration include a challenge-response mechanism between a trusted authority and a remote system.6 The Pioneer system uses a similar protocol to provide verifiable code execution on an untrusted platform.7 Because Pioneer establishes a dynamic root of trust, it can function as a building block for future iterations of Trust-Sniffer.

Scott Garris and his colleagues describe a mechanism for a user to establish trust in an untrusted kiosk, a problem similar to the one Trust-Sniffer addresses.8 However, their implementation relies on secure hardware, including a Trusted Platform Module chip, and a new instruction recently added to the x86 architecture. In addition, their implementation requires validating all the software on a kiosk before it can be used, whereas Trust-Sniffer allows incremental program validation to facilitate transient use.

REFERENCES

1. W.A. Arbaugh, D.J. Farber, and J.M. Smith, "A Secure and Reliable Bootstrap Architecture," Proc. 1997 IEEE Symp. Security and Privacy, IEEE CS Press, 1997, pp. 65–71.
2. R. Cáceres et al., "Reincarnating PCs with Portable SoulPads," Proc. 3rd Int'l Conf. Mobile Systems, Applications, and Services (MobiSys 05), ACM Press, 2005, pp. 65–78.
3. D.E. Clarke et al., "The Untrusted Computer Problem and Camera-Based Authentication," Proc. 1st Int'l Conf. Pervasive Computing, LNCS 2414, Springer, 2002, pp. 114–124.
4. M. Naor and B. Pinkas, "Visual Authentication and Identification," Proc. 17th Ann. Int'l Cryptology Conf. Advances in Cryptology, LNCS 1294, Springer, 1997, pp. 322–336.
5. M. Abadi et al., "Authentication and Delegation with Smart-Cards," Science of Computer Programming, vol. 21, no. 2, 1993, pp. 91–113.
6. R. Kennell and L.H. Jamieson, "Establishing the Genuinity of Remote Computer Systems," Proc. 12th Conf. USENIX Security Symp., USENIX Assoc., 2003, pp. 295–310.
7. A. Seshadri et al., "Pioneer: Verifying Code Integrity and Enforcing Untampered Code Execution on Legacy Systems," Proc. 20th ACM Symp. Operating Systems Principles, ACM Press, 2005, pp. 1–16.
8. S. Garris et al., "Towards Trustworthy Kiosk Computing," Proc. 8th IEEE Workshop Mobile Computing Systems and Applications, IEEE CS Press, 2007.

Because ISR client software fetches most of a user's execution environment, Trust-Sniffer must maintain measurements for only a small set of OS and client software. To address frequent software updates and patches, users could obtain new measurements from a server that periodically generated updated lists and digitally signed them for distribution.

Rapidly establishing a root of trust
As we mentioned earlier, none of the software on the target machine is initially trusted. The trust initiator aims to

• bootstrap the trust-establishment process by establishing a root of trust and
• equip the on-disk OS with the necessary tools to validate the rest of the software on the machine.

It does this as rapidly as possible to avoid a long delay before the user can do useful work.

The trust initiator is partitioned, formatted, and loaded with the Finnix bootable OS, a Debian-derived live Linux distribution. Finnix provides both excellent support for devices and automatic hardware detection. It's suitable for our purposes because it boots quickly and has a small footprint.
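Earlier we noted that updated reference measurement lists could be digitally signed for distribution. As a hedged illustration of what a client-side check might look like before such a list is copied onto the trust initiator, the sketch below verifies a detached signature with OpenSSL's EVP API; the file names, the choice of SHA-256 for the signature digest, and the read_file helper are our assumptions, not part of Trust-Sniffer.

```c
/*
 * Hypothetical sketch: verify a detached signature over an updated reference
 * measurement list before installing it on the trust initiator. The signing
 * scheme, paths, and helper names are assumptions of this illustration.
 */
#include <openssl/evp.h>
#include <openssl/pem.h>
#include <stdio.h>
#include <stdlib.h>

/* Read an entire file into a malloc'd buffer; returns NULL on error. */
static unsigned char *read_file(const char *path, size_t *len)
{
    FILE *f = fopen(path, "rb");
    if (!f)
        return NULL;
    fseek(f, 0, SEEK_END);
    long sz = ftell(f);
    rewind(f);
    if (sz < 0) {
        fclose(f);
        return NULL;
    }
    unsigned char *buf = malloc(sz > 0 ? (size_t)sz : 1);
    if (buf && fread(buf, 1, (size_t)sz, f) != (size_t)sz) {
        free(buf);
        buf = NULL;
    }
    fclose(f);
    if (buf)
        *len = (size_t)sz;
    return buf;
}

/* Returns 1 if sig is a valid signature over the list file, 0 otherwise. */
static int list_signature_ok(const char *list_path, const char *sig_path,
                             const char *pubkey_pem_path)
{
    size_t list_len = 0, sig_len = 0;
    unsigned char *list = read_file(list_path, &list_len);
    unsigned char *sig = read_file(sig_path, &sig_len);
    FILE *kf = fopen(pubkey_pem_path, "r");
    EVP_PKEY *key = kf ? PEM_read_PUBKEY(kf, NULL, NULL, NULL) : NULL;
    if (kf)
        fclose(kf);

    int ok = 0;
    EVP_MD_CTX *ctx = EVP_MD_CTX_new();
    if (list && sig && key && ctx &&
        EVP_DigestVerifyInit(ctx, NULL, EVP_sha256(), NULL, key) == 1 &&
        EVP_DigestVerifyUpdate(ctx, list, list_len) == 1)
        ok = (EVP_DigestVerifyFinal(ctx, sig, sig_len) == 1);

    EVP_MD_CTX_free(ctx);
    EVP_PKEY_free(key);
    free(list);
    free(sig);
    return ok;
}

int main(void)
{
    /* Placeholder file names, for illustration only. */
    if (list_signature_ok("measurements.txt", "measurements.sig", "vendor_pub.pem"))
        puts("reference list signature OK");
    else
        puts("reference list REJECTED");
    return 0;
}
```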


The trust initiator is loaded with the list of reference measurements for the initial boot-validation phase as well as for subsequent application validation by the on-disk kernel.

After the machine is booted using the trust initiator, all on-disk software components associated with the boot process are validated. A custom start-up script mounts local hard disks and discovers the boot partition. It then uses sha1sum to measure the kernel image, the initial RAM disk, and the GRUB boot loader. Trust-Sniffer compares the measurements of this set of software to the trust initiator's list of reference measurements. Any mismatch marks the boot process as untrusted and halts trust establishment, because a trusted OS is required to validate the rest of the software stack. If all the boot software is valid, Trust-Sniffer copies the trust initiator's reference measurement list to a predetermined location on disk, which the kernel subsequently accesses when it boots.
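As a concrete illustration of that comparison step, the following user-space sketch loads a sha1sum-format reference list and reports whether a given measurement appears in it. The helper names and the linear search are ours; the in-kernel trust extender described next keeps the list in a hash table instead.

```c
/*
 * Minimal illustration of boot-time validation: load a sha1sum-format
 * reference list ("<40-hex-digit digest>  <path>" per line) and check whether
 * a freshly computed measurement appears in it. Helper names are ours.
 */
#include <stdio.h>
#include <string.h>

#define MAX_REFS 4096
#define HEX_LEN  40               /* SHA-1 digest rendered as hex text */

static char refs[MAX_REFS][HEX_LEN + 1];
static int nrefs;

/* Read reference measurements; the pathname column is ignored here. */
static int load_reference_list(const char *path)
{
    FILE *f = fopen(path, "r");
    char line[1024];
    if (!f)
        return -1;
    while (nrefs < MAX_REFS && fgets(line, sizeof line, f)) {
        if (strlen(line) > HEX_LEN) {
            memcpy(refs[nrefs], line, HEX_LEN);
            refs[nrefs][HEX_LEN] = '\0';
            nrefs++;
        }
    }
    fclose(f);
    return 0;
}

/* A measurement is trusted only if it matches some reference entry. */
static int measurement_is_trusted(const char *hex_digest)
{
    for (int i = 0; i < nrefs; i++)
        if (strncmp(refs[i], hex_digest, HEX_LEN) == 0)
            return 1;
    return 0;                     /* mismatch: caller must refuse to proceed */
}

int main(int argc, char **argv)
{
    if (argc != 3 || load_reference_list(argv[1]) != 0) {
        fprintf(stderr, "usage: %s reference-list hex-digest\n", argv[0]);
        return 2;
    }
    puts(measurement_is_trusted(argv[2]) ? "trusted" : "UNTRUSTED");
    return 0;
}
```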
Dynamically extending the root of trust
Next, the trust extender validates application software. We avoid making the user manually reboot (that is, powering off the machine, removing the trust initiator, and then powering the machine on) by using kexec, which lets you boot into a new kernel directly from the currently running kernel without using a boot loader.

The trust extender kernel module uses the securityfs pseudo file system to communicate with user-space programs. This provides an interface for a user-space program to load the trust extender with the trust initiator's list of reference measurements.

The trust extender can begin enforcing measurement mismatches only after the reference measurement list has been loaded. So, it's imperative to load this list as early in the boot process as possible. Until the list is loaded, the trust extender is nonintrusive and allows all execution. We use a custom initrd (initial RAM disk) image with modified boot scripts to load the reference measurements into the trust extender. All initial execution before the list is loaded is from the initrd, which is part of the root of trust. Once the reference measurements are loaded, the trust extender protects the in-kernel list's integrity by disabling the measurement-loading interface in securityfs.

The reference measurement list resides in a hash table indexed by each executable's SHA-1 hash value. The trust extender uses the file_mmap Linux Security Module hook to measure files mapped with the PROT_EXEC bit set (which includes dynamically loadable libraries and executables) and a custom hook in the load_module function to measure kernel modules. The LSM interface is part of the main kernel. Each measurement consists of the executable code's SHA-1 hash value. The trust extender also stores additional file metadata, such as pathname, user ID, and group ID. The metadata doesn't affect the actual measurement validation, but we use it to communicate information to the user (see the next section for more details).

When a trust fault triggers the measurement of an executable, its value is compared against the reference list. If no matching reference measurement exists, the measurement is untrusted, and the kernel disallows the execution.

The system's caching mechanism reduces the overhead of measuring unchanged applications when they're executed more than once. When a previously measured application is executed again, the kernel simply uses the application's cached measurement value, unless it's stale (that is, unless the file has been opened for writing or is on an unmounted file system). Stale measurements must be recomputed.
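The record below sketches the per-file state implied by this description; the field and function names are ours, not Trust-Sniffer's, and the real bookkeeping lives inside the kernel module rather than in a stand-alone program.

```c
/*
 * Illustrative per-file record: the SHA-1 digest is what validation compares,
 * the metadata is only reported to the user, and a stale entry forces
 * remeasurement. Names are ours, not Trust-Sniffer's.
 */
#include <stdbool.h>
#include <sys/types.h>

#define TS_SHA1_LEN 20

struct ts_measurement {
    unsigned char digest[TS_SHA1_LEN]; /* SHA-1 over the executable's contents */
    char pathname[256];                /* metadata: shown in user alerts only */
    uid_t uid;                         /* metadata: not used for validation */
    gid_t gid;
    bool stale;                        /* set when the file is opened for writing
                                          or its file system is unmounted */
};

/* Reuse the cached digest only while the entry is still fresh;
   otherwise the caller must remeasure the file. */
bool ts_can_reuse(const struct ts_measurement *m)
{
    return m != NULL && !m->stale;
}
```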

Alerting the user
When an untrusted application is encountered, the kernel blocks its execution. The user isn't aware of communication between applications and the kernel and must be alerted to relevant information in two instances: when the trust extender encounters an untrusted application, and when a failure occurs in the kernel that would compromise the measurement process. The latter case encompasses unexpected failures in the measurement process, such as out-of-memory errors.

To establish a communication channel between the trust alerter and the trust extender, we used the netlink socket interface. The netlink interface is a bidirectional, versatile method for passing data between the kernel and user space, and the interface is well documented. We installed a new netlink protocol type, TS_NETLINK, and added the TS_NLMSG_MISMATCH and TS_NLMSG_FAILURE message types. When the trust extender is initialized, it opens a netlink socket using a call to netlink_kernel_create. A user process that binds to this socket waits for messages from the kernel using the recvmsg system call.

When the trust extender encounters an untrusted application, it sends the appropriate message with a call to netlink_broadcast. In the case of a mismatch, this message includes the process ID of the process that failed and the file name of the application on disk, if available. Our current implementation includes a simple console application that communicates with the kernel; we plan to convert this into a friendlier GUI application in the future.
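For illustration, a console alerter along these lines could be structured as below. Only the generic AF_NETLINK socket calls are standard; the TS_NETLINK protocol number, the message-type values, the multicast group, and the payload layout are assumptions that would have to match the kernel module.

```c
/*
 * Sketch of a console trust alerter: listen on a custom netlink protocol and
 * print mismatch notifications sent by the trust extender. Numeric constants
 * and the payload struct are assumed, not taken from Trust-Sniffer.
 */
#include <sys/socket.h>
#include <sys/uio.h>
#include <linux/netlink.h>
#include <stdio.h>
#include <unistd.h>

#define TS_NETLINK        25      /* assumed protocol number */
#define TS_NLMSG_MISMATCH 0x11    /* assumed message types */
#define TS_NLMSG_FAILURE  0x12

struct ts_mismatch_msg {          /* assumed payload layout */
    int  pid;                     /* process that triggered the trust fault */
    char path[256];               /* file name on disk, if available */
};

int main(void)
{
    int sock = socket(AF_NETLINK, SOCK_RAW, TS_NETLINK);
    if (sock < 0) {
        perror("socket");
        return 1;
    }

    /* Assumes the trust extender broadcasts to netlink multicast group 1. */
    struct sockaddr_nl addr = { .nl_family = AF_NETLINK, .nl_groups = 1 };
    if (bind(sock, (struct sockaddr *)&addr, sizeof addr) < 0) {
        perror("bind");
        return 1;
    }

    char buf[4096];
    for (;;) {
        struct iovec iov = { buf, sizeof buf };
        struct msghdr msg = { 0 };
        msg.msg_iov = &iov;
        msg.msg_iovlen = 1;

        ssize_t n = recvmsg(sock, &msg, 0);   /* blocks until the kernel sends */
        if (n <= 0)
            break;

        struct nlmsghdr *nlh = (struct nlmsghdr *)buf;
        if (nlh->nlmsg_type == TS_NLMSG_MISMATCH) {
            struct ts_mismatch_msg *m = NLMSG_DATA(nlh);
            printf("blocked untrusted executable: pid=%d file=%s\n",
                   m->pid, m->path);
        } else if (nlh->nlmsg_type == TS_NLMSG_FAILURE) {
            printf("measurement failure reported by the kernel\n");
        }
    }
    close(sock);
    return 0;
}
```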
Evaluation
We evaluated Trust-Sniffer in light of user expectations about the system's security guarantees and performance.

Security
We needed an appropriate balance between ease of use and security guarantees. Currently, users have no choice but to trust any unmanaged hardware that they encounter. Trust-Sniffer provides a simple solution designed for use by nonexperts and security guarantees that are a substantial improvement over current practice.

Trust-Sniffer isn't foolproof; it's vulnerable to certain attacks. As we mentioned earlier, we don't guard against a malicious BIOS or a virtual-machine attack. We also assume that the hardware is trustworthy, so we don't protect against direct memory access (DMA) based attacks by malicious devices.

Although Trust-Sniffer disallows the execution of unknown software, this shouldn't inconvenience users. Establishing trust in a machine for use as an ISR client requires validation of only a small subset of software because most applications are contained in the user's personalized execution environment, which is downloaded from a trusted server. It's easy to equip the trust initiator with the reference measurements required to validate an ISR client's software.

Because the trust initiator is write protected, unauthorized external sources can't modify its software. Of course, users can disable write protection on their own machines to update the reference measurement list. In addition, the trust-establishment procedure doesn't depend on network communication with the untrusted machine, so we disable networking capabilities on the minimal OS that we use to validate the on-disk kernel.

Performance
Our prototype implementation used the Fedora Core 5 distribution, configured with version 2.6.15 of the Linux kernel. We implemented Trust-Sniffer by extending a core part of the Linux kernel, so configuration changes (such as using a different Linux distribution) shouldn't significantly affect the results. Our test hardware consisted of an IBM T43 laptop with a 2.0 GHz Pentium M processor and 1 GB of RAM. We used a standard USB memory stick with 1 GB of storage capacity as the trust initiator.

From the transient usage model's perspective, we evaluate how much overhead the Trust-Sniffer system requires compared to a system without security checks. The user cares most about how long the system takes to boot and the overhead it needs to check applications. With Trust-Sniffer, the downtime caused by having to reboot is small, and the additional security is worth the performance penalty. Note that the performance overhead for application measurements is incurred only on the first execution.

Because the trust extender primarily uses the file_mmap LSM hook to measure applications at load time, we measure an mmap operation's latency in different contexts using the HBench-OS framework.6 We averaged latency measurements over 10 runs of the benchmark. When an application is executed for the first time, the system computes the application's SHA-1 measurement, which we denote as a cache miss. If on subsequent executions an application's cached measurement is valid, its execution results in a cache hit. In our experiments using a system without Trust-Sniffer, an mmap operation's latency was 0.99 ms. A cache hit had a latency of 1.19 ms, an overhead of 0.20 ms over the baseline. A cache miss had a latency of 4.29 ms, an overhead of 3.3 ms. Clearly, with a cache hit, the overhead of validating a previously measured application isn't significant. When an application is first executed, there's some measurement overhead compared to the baseline. However, this overhead is low in absolute terms because applications are measured only once.

To evaluate the boot process, we measured the average time (over five trials) it took to boot a machine from when we pushed the power button until a logon prompt appeared. This metric is relevant to the user because it defines the "warm-up" time the system requires before the user can start working. These results depend somewhat on system configuration, such as the number of daemons started when the OS is booted. However, they sufficiently illustrate that the overhead of using Trust-Sniffer is small. A standard machine without Trust-Sniffer takes 97.1 seconds to boot; a machine with Trust-Sniffer takes 111.4 seconds. The overhead of using Trust-Sniffer is thus only 14.2 percent over a standard system. One might argue that users don't usually have to reboot a system; however, we believe that the additional time spent to establish trust is well worth the security improvement.
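To make the mmap-latency metric reported above concrete, the sketch below times a single PROT_EXEC mapping with clock_gettime. HBench-OS does this far more rigorously (warm-up, repetition, statistics); this program is only an illustration and is not part of the paper's evaluation harness.

```c
/*
 * Rough illustration of the reported metric: time one PROT_EXEC mmap of a
 * file. With the trust extender loaded, this mmap is where the file_mmap
 * LSM hook fires and a trust fault may be raised.
 */
#include <fcntl.h>
#include <stdio.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <time.h>
#include <unistd.h>

int main(int argc, char **argv)
{
    if (argc != 2) {
        fprintf(stderr, "usage: %s <executable>\n", argv[0]);
        return 2;
    }
    int fd = open(argv[1], O_RDONLY);
    struct stat st;
    if (fd < 0 || fstat(fd, &st) < 0) {
        perror(argv[1]);
        return 1;
    }

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    void *p = mmap(NULL, st.st_size, PROT_READ | PROT_EXEC, MAP_PRIVATE, fd, 0);
    clock_gettime(CLOCK_MONOTONIC, &t1);

    if (p == MAP_FAILED) {
        perror("mmap");
        return 1;
    }
    double ms = (t1.tv_sec - t0.tv_sec) * 1e3 + (t1.tv_nsec - t0.tv_nsec) / 1e6;
    printf("mmap latency: %.3f ms\n", ms);
    munmap(p, st.st_size);
    close(fd);
    return 0;
}
```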


Usability and extensibility
In computing systems, a solution's complexity very often hinders its acceptance in practice. We focused our design of Trust-Sniffer on simplicity and ease of use to facilitate adoption by nonexpert users. With Trust-Sniffer, the zone of trust expands from a small, convenient, trusted device that the user carries. Trust-Sniffer's operation model can help increase awareness and security for simple day-to-day computing tasks.

Our design is flexible and extensible. It should be easy to set up a mechanism for users to obtain updates to reference measurement lists. Our prototype used Fedora Core; however, because we implemented the trust extender as a kernel module, the trust initiator could be loaded with appropriately configured kernel modules and initial RAM disks for stock kernels of various distributions. This would make it possible to equip arbitrary machines that don't have preconfigured kernels with Trust-Sniffer software.

Trust-Sniffer provides greater overall security for users and increases their awareness about security risks. In future versions, we plan to explore ways to provide increased security while keeping the design focused on nonexpert users.

ACKNOWLEDGMENTS
We thank Reiner Sailer of IBM Research for his assistance with the Integrity Measurement Architecture and Jan Harkes for his suggestions on the project. This research was supported by the US National Science Foundation under grant CNS-0509004, the US Army Research Office under grant DAAD19-02-1-0389 to Carnegie Mellon's CyLab, and the Intel Corporation.

REFERENCES

1. M. Kozuch and M. Satyanarayanan, "Internet Suspend/Resume," Proc. 4th IEEE Workshop Mobile Computing Systems and Applications, IEEE CS Press, 2002, p. 40.
2. M. Satyanarayanan et al., "Towards Seamless Mobility on Pervasive Hardware," Pervasive and Mobile Computing, vol. 1, no. 2, 2005, pp. 157–189.
3. M. Satyanarayanan et al., "Pervasive Personal Computing in an Internet Suspend/Resume System," IEEE Internet Computing, vol. 11, no. 2, 2007, pp. 16–25.
4. S.T. King et al., "SubVirt: Implementing Malware with Virtual Machines," Proc. 2006 IEEE Symp. Security and Privacy, IEEE CS Press, 2006, pp. 314–327.
5. R. Sailer et al., "Design and Implementation of a TCG-Based Integrity Measurement Architecture," Proc. 13th Conf. USENIX Security Symp., USENIX Assoc., 2004, pp. 223–238.
6. A.B. Brown and M.I. Seltzer, "Operating System Benchmarking in the Wake of LMbench: A Case Study of the Performance of NetBSD on the Intel x86 Architecture," Proc. 1997 ACM SIGMETRICS Int'l Conf. Measurement and Modeling of Computer Systems, ACM Press, 1997, pp. 214–224.

the AUTHORS

Ajay Surie is a graduate student in computer science at Carnegie Mellon University. His research interests include distributed systems and security. He received his BS in computer science from Carnegie Mellon University. Contact him at the Computer Science Dept., Carnegie Mellon Univ., Wean Hall 4212, 5000 Forbes Ave., Pittsburgh, PA 15213; [email protected].

Adrian Perrig is an assistant professor in electrical and computer engineering at Carnegie Mellon University. His research interests include networking and systems security and security for mobile computing and sensor networks. He received his PhD in computer science from Carnegie Mellon University. He's a member of the ACM and IEEE and a recipient of the NSF CAREER award, an IBM faculty fellowship, and a Sloan research fellowship. Contact him at 2110 Collaborative Innovation Center, 4720 Forbes Ave., Pittsburgh, PA 15213; [email protected].

Mahadev Satyanarayanan is the Carnegie Group Professor of Computer Science at Carnegie Mellon University. His research interests include mobile computing, pervasive computing, and distributed systems. He received his PhD in computer science from Carnegie Mellon University. He's a fellow of the ACM and IEEE and the founding editor in chief of IEEE Pervasive Computing. Contact him at the Computer Science Dept., Carnegie Mellon Univ., Wean Hall 4212, 5000 Forbes Ave., Pittsburgh, PA 15213; [email protected].

David J. Farber is a Distinguished Career Professor of Computer Science and Public Policy at Carnegie Mellon University. His research interests include networking, distributed systems, and public policy. He received an honorary doctor of engineering degree from the Stevens Institute of Technology. He's a fellow of the ACM and IEEE. Contact him at the Inst. for Software Research Int'l, Carnegie Mellon Univ., 5000 Forbes Ave., Pittsburgh, PA 15213; [email protected].
