EchoLock: Towards Low Effort Mobile User Identification

Yilin Yang (Rutgers University, yy450@scarletmail.rutgers.edu)
Chen Wang (Louisiana State University, [email protected])
Yingying Chen (Rutgers University, yingche@scarletmail.rutgers.edu)
Yan Wang (Binghamton University, [email protected])

ABSTRACT
User identification plays a pivotal role in how we interact with our mobile devices. Many existing authentication approaches require active input from the user or specialized sensing hardware, and studies on mobile device usage show significant interest in less inconvenient procedures. In this paper, we propose EchoLock, a low effort identification scheme that validates the user by sensing hand geometry via commodity microphones and speakers. These acoustic signals produce distinct structure-borne sound reflections when contacting the user's hand, which can be used to differentiate between people based on how they hold their mobile devices. We process these reflections to derive unique acoustic features in both the time and frequency domains, which can effectively represent physiological and behavioral traits such as hand contours, finger sizes, holding strength, and gesture. Furthermore, learning-based algorithms are developed to robustly identify the user under various environments and conditions. We conduct extensive experiments with 20 participants using different hardware setups in key use case scenarios and study various attack models to demonstrate

Figure 1: Capture of hand biometric information embedded in structure-borne sound using commodity microphones and speakers.

convenient practices [37]. Techniques such as facial recognition or fingerprinting provide fast validation without requiring considerable effort from the user, but demand dedicated hardware components that may not be available on all devices. This is of particular importance in markets of developing countries, where devices such as the Huawei IDEOS must forgo multiple utilities in order to maintain affordable price points (e.g.
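The abstract describes deriving time- and frequency-domain features from structure-borne sound reflections and feeding them to learning-based algorithms for identification. The sketch below is a minimal illustration of that kind of pipeline, not the paper's implementation: the sampling rate, the feature set (RMS energy, zero-crossing rate, log band energies), and the SVM classifier are assumptions introduced here for concreteness.

# Illustrative sketch (not EchoLock's actual pipeline): extract simple time-
# and frequency-domain features from a structure-borne sound recording and
# identify the holder with a learning-based classifier.
# Assumptions: recordings are 1-D numpy arrays sampled at 48 kHz; the feature
# set and the SVM classifier are placeholders for the paper's actual design.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

FS = 48_000          # assumed sampling rate (Hz)
N_BANDS = 32         # assumed number of frequency bands

def extract_features(signal: np.ndarray) -> np.ndarray:
    """Concatenate simple time- and frequency-domain descriptors."""
    # Time domain: overall energy and how often the waveform crosses zero.
    rms = np.sqrt(np.mean(signal ** 2))
    zcr = np.mean(np.abs(np.diff(np.sign(signal)))) / 2.0
    # Frequency domain: log energy in N_BANDS equally spaced spectral bands.
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    bands = np.array_split(spectrum, N_BANDS)
    band_energy = np.log1p([b.sum() for b in bands])
    return np.concatenate(([rms, zcr], band_energy))

def train_identifier(recordings, labels):
    """Fit a classifier mapping hand-dependent reflections to user IDs."""
    X = np.stack([extract_features(r) for r in recordings])
    model = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
    model.fit(X, labels)
    return model

if __name__ == "__main__":
    # Toy data standing in for reflections recorded under two different grips.
    rng = np.random.default_rng(0)
    recs = [rng.normal(scale=0.1 + 0.05 * (i % 2), size=FS // 10) for i in range(40)]
    ids = [f"user_{i % 2}" for i in range(40)]
    clf = train_identifier(recs, ids)
    print(clf.predict([extract_features(recs[0])]))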