FORUM: INTERACTION TECHNOLOGIES
Envisioning, designing, and implementing the user interface require a comprehensive understanding of interaction technologies. In this forum we scout trends and discuss new technologies with the potential to influence interaction design.
Albrecht Schmidt, Editor

Radar Sensing in Human-Computer Interaction

Hui-Shyong Yeo and Aaron Quigley, University of St Andrews

Insights
→ Radar has emerged as a new sensing technique in HCI with great potential.
→ Portable radar systems such as the Google Soli developer kit allow developers and researchers to create new and interesting research prototypes.

The exploration of novel sensing technologies to facilitate new interaction modalities remains an active research topic in human-computer interaction. Across the breadth of HCI conferences, we can see the development of new forms of interaction underpinned by the appropriation or adaptation of sensing techniques based on the measurement of sound, light, electric fields, radio waves, biosignals, and so on.

Commercially, we see extensive industrial developments of radar sensing in automotive and military settings. Radar is often considered a long-range sensing technology that is all weather, offers 3D position information, operates at all times as it doesn't require lighting, and can penetrate surfaces and objects. At very long range, radar technology has been used for many decades in weather and aircraft tracking. At long, mid, and short range, radar has been used for autonomous cruise control (ACC), emergency brake assist (EBA), security scanners, pedestrian detection, and blind-spot detection. At very short range, radar has been employed in disbond detection, corrosion detection, and foam-insulation flaw identification.

In addition, the research community has explored radar technology for various purposes, including presence sensing and indoor user tracking [1], vital-signs monitoring [2], and emotion recognition. At this range, radar addresses problems with privacy, occlusion, lighting, and limited field of view that affect vision-based approaches. It is also useful in medical applications where traditional approaches such as capacitive and galvanic skin response sensing do not work well.

Within the HCI context, Doppler radar was used as early as 1997 in the Magic Carpet [3] for sensing coarse body motion. However, the system requires signal-processing knowledge and custom hardware, which present a high barrier to entry. The more recent development of low-cost, miniaturized, radar-on-chip devices with developer-friendly SDKs truly opens up the potential of radar sensing in HCI. Such sensing is exemplified in the Soli [4] for tracking micro gestures (such as finger wiggle, hand tilt, check mark, or thumb slide) and has led to increased interest in radar for gesture detection.

PROJECT SOLI
The Google Soli significantly lowers the barrier to entry for the HCI community by providing a plug-and-play SDK with gesture support and software examples. Radar, however, presents a number of challenges and opportunities, which this article seeks to introduce as a primer for those seeking to explore radar for interaction, tangible computing, gestures, or sensor fusion.

Soli is a new gesture-sensing technology for HCI with many potential use cases. Compared with either capacitive sensing or vision-based sensing, it aims to overcome problems with occlusion, lighting, and embedded sensing. It also aims to support 3D, distance, and micro motions for novel forms of interaction. Soli combines a view of the hardware architecture, signal processing, software abstractions, a UX paradigm, and gesture recognition for embedded hardware and the final product.

Soli technology is hardware agnostic, which means the sensing technology can work with different radar chips. In fact, the team has developed two fully integrated radar chips (Figure 1): a frequency-modulated continuous wave (FMCW) SiGe chip and a direct-sequence spread spectrum (DSSS) CMOS chip. There are four receive (Rx) and two transmit (Tx) antennas. The Rx antenna spacing is designed for optimal beam forming, while the Rx/Tx spacing is designed to gain isolation. The radar prototype was a custom 57–64 GHz radar with multiple narrow-beam horn antennas. In the 60 GHz band, the FCC limits the bandwidth to 7 GHz (40 to 82 dBm EIRP), which results in a resolution of ~2cm, less than sensor resolution. Today, with a 60 GHz center frequency and 5mm wavelength, the Soli radar has a 0.05–15m range with a 180-degree field of view. The alpha developer kit (Figure 2) uses the FMCW version with an integrated development board that allows USB connection with the host computer.
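The ~2cm figure follows from the standard relationship between swept bandwidth and range resolution, ΔR = c / (2B). A quick back-of-the-envelope sketch in plain Python, using the 7 GHz FCC-limited bandwidth quoted above (everything else is standard physics, not Soli-specific detail):

```python
# Range resolution of a wideband radar from its swept bandwidth: dR = c / (2 * B).
C = 299_792_458.0  # speed of light in m/s

def range_resolution_m(bandwidth_hz: float) -> float:
    """Smallest separation in range at which two reflectors can be resolved."""
    return C / (2.0 * bandwidth_hz)

if __name__ == "__main__":
    fcc_limited_bandwidth = 7e9  # 7 GHz, as quoted for the 60 GHz band
    print(f"{range_resolution_m(fcc_limited_bandwidth) * 100:.1f} cm")  # ~2.1 cm
```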

Figure 1. Soli radar chips with antennas-in-package. Top: 12 x 12mm FMCW SiGe chip. Bottom: 9 x 9mm DSSS CMOS chip. (Figure extracted from the Soli paper [4], used with permission.)

RADAR SIGNAL PROCESSING
The Google Soli team's novel paradigm for radar sensing is based on signal processing from a broad antenna beam, which delivers an extremely high temporal resolution instead of focusing on high spatial resolution. As a result, Soli can track sub-millimeter motion at high speeds with great accuracy. By illuminating the entire hand in a single wide beam (Figure 3), the Soli can measure the superposition of reflections from multiple dynamic scattering centers (e.g., arches, fingertips, and finger bends) across the human hand. One radar signal provides information about the instantaneous scattering range and reflectivity of the various centers, while taking this over time with multiple repetition intervals affords information about the dynamics of the centers' movements. An analysis of the instantaneous scattering results in characteristics that can be used to describe the pose and orientation of the hand, while dynamic movements and characteristics can be used to estimate hand gestures. Published studies have explored random forest machine-learning classifiers for gesture recognition [4] along with deep convolutional and recurrent neural networks [5]. This research and development allow the Google Soli team to propose a ubiquitous gesture-interaction language that can allow people to control any device with a simple yet universal set of in-air hand gestures.

Basic radar sensing systems for long- and short-range use typically provide access to uncompressed raw data from the various channels. However, radar systems with hardware and software offer pre-processing, including noise subtraction, error correction, and filtering, while signal processing affords developers more enriched views of the radar signal, including image formation, clutter removal, range-Doppler map, elevation estimation, object tracking, and target detection.
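To make this paradigm a little more concrete, the sketch below shows one conventional way a range-Doppler view can be derived from raw FMCW chirps: an FFT along each chirp (fast time) resolves the range of the scattering centers, and a second FFT across successive repetition intervals (slow time) resolves their velocities. This is a generic NumPy illustration with made-up array shapes, not the Soli team's pipeline.

```python
import numpy as np

def range_doppler_map(chirps: np.ndarray) -> np.ndarray:
    """Compute a range-Doppler magnitude map from one frame of raw FMCW data.

    chirps: complex array of shape (num_chirps, samples_per_chirp), i.e.
            slow time x fast time for a single receive channel.
    """
    fast_window = np.hanning(chirps.shape[1])           # suppress range sidelobes
    slow_window = np.hanning(chirps.shape[0])[:, None]  # suppress Doppler sidelobes

    # FFT over fast time: each chirp becomes a range profile.
    range_profiles = np.fft.fft(chirps * fast_window, axis=1)

    # FFT over slow time: motion between repetition intervals becomes Doppler.
    rd = np.fft.fftshift(np.fft.fft(range_profiles * slow_window, axis=0), axes=0)
    return np.abs(rd)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frame = rng.standard_normal((64, 256)) + 1j * rng.standard_normal((64, 256))
    print(range_doppler_map(frame).shape)  # (64, 256): Doppler bins x range bins
```

Tracking how the peaks in such a map shift from frame to frame is what gives the temporal view of the scattering centers' dynamics described above.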

Figure 2. Exploded view of the Soli alpha hardware (not to scale; image adapted from the Soli alpha SDK), showing the shell, USB port, PCB, heat sink, 2x Tx antennas, and 4x Rx antennas. In RadarCat [7], an object is placed on top of the sensor, where raw radar signals are extracted and classified using machine-learning techniques.

Radar, as we have noted, is not a new technology; its use at varying ranges means there is an abundance of techniques, methods, and approaches that researchers in HCI might learn from and adapt. However, when considering the Soli, it is important to understand the Soli Processing Pipeline (SPP), from hardware to software to application. There is a further distinction between the hardware-specific transmitter, receiver, and analog signal pre-processing, and the digital-signal pre-processing in software, which is also hardware specific. Hardware-agnostic signal transformations include range-Doppler, range profile, Doppler profile (micro-Doppler), and spectrogram, followed by feature extraction and gesture recognition, which can be provided to an application. The Soli SDK enables developers to easily access and build upon the gesture-recognition pipeline.
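As a rough illustration of two more of these hardware-agnostic transformations, the sketch below derives a range profile and a micro-Doppler spectrogram from the same kind of raw chirp arrays as in the earlier example. Again, this is a generic NumPy sketch with assumed shapes, not the Soli Processing Pipeline itself.

```python
import numpy as np

def range_profile(chirps: np.ndarray) -> np.ndarray:
    """Average magnitude of the fast-time FFT: reflectivity versus range."""
    spectra = np.fft.fft(chirps * np.hanning(chirps.shape[1]), axis=1)
    return np.abs(spectra).mean(axis=0)

def micro_doppler_spectrogram(frames: np.ndarray, range_bin: int) -> np.ndarray:
    """Doppler content of one range bin over successive frames (time x Doppler).

    frames: complex array of shape (num_frames, num_chirps, samples_per_chirp).
    """
    rows = []
    for frame in frames:
        # Fast-time FFT, then keep the slow-time samples of the chosen range bin.
        slow_time = np.fft.fft(frame, axis=1)[:, range_bin]
        # Slow-time FFT of that bin gives its instantaneous Doppler spectrum.
        spectrum = np.fft.fft(slow_time * np.hanning(slow_time.size))
        rows.append(np.abs(np.fft.fftshift(spectrum)))
    return np.stack(rows)
```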

The Soli libraries extract real-time signals from radar hardware, outputting signal transformations, high-precision position and motion data, and gesture labels and parameters at frame rates from 100 to 10,000 frames per second.

Figure 3. Soli is designed end-to-end for ubiquitous and intuitive fine-gesture interaction, running from the Soli chip and raw signal through signal transformation and gesture classification to a detected virtual-tool gesture. (Figure extracted from the Soli paper [4], used with permission.)
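The article does not spell out the classification stage, but the Soli work cited above reports random forests and deep networks trained on features derived from these transformations [4,5]. The following is a minimal, hypothetical sketch of such a final stage using scikit-learn; the windowing and feature layout are assumptions for illustration, and this is not the Soli SDK API.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

WINDOW = 32  # number of consecutive radar frames per gesture sample (assumed)

def window_features(rd_maps: np.ndarray) -> np.ndarray:
    """Summarize a window of range-Doppler maps as one feature vector.

    rd_maps: array of shape (WINDOW, doppler_bins, range_bins). Real systems
    use richer hand-crafted or learned features; per-frame energy statistics
    are used here purely as an illustration.
    """
    per_frame = rd_maps.reshape(WINDOW, -1)
    return np.concatenate([per_frame.mean(axis=1),
                           per_frame.std(axis=1),
                           per_frame.max(axis=1)])

def train(windows: list, labels: list) -> RandomForestClassifier:
    """Fit a random forest on labeled gesture windows."""
    X = np.stack([window_features(w) for w in windows])
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X, labels)
    return clf

def predict_gesture(clf: RandomForestClassifier, rd_maps: np.ndarray) -> str:
    """Return the predicted gesture label for one window of frames."""
    return clf.predict(window_features(rd_maps)[None, :])[0]
```

RadarCat [7], discussed below, applies a similar classify-from-radar-signature pattern to materials and objects rather than to gesture dynamics.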

RADAR APPLICATIONS AREA
Currently, the main application of Soli is centered around close-range sensing of fine and fluid gestures [4,5]. In addition, the team also suggested potential use cases in wearables, mobile, VR/AR systems, smart appliances, and IoT, as well as in scanning and imaging, wayfinding, accessibility, security, and spectroscopy. In research work, concrete examples developed by the alpha developers and shown at Google I/O 2016 [6] include material and object recognition [7], 3D imaging, predictive drawing, in-car gestures, gesture unlock, visualization, and musical applications.

Figure 4. Four example applications to demonstrate the interaction possibilities of RadarCat, clockwise from top left: physical object dictionary; tangible painting app; context-aware interaction and body shortcuts; and automatic refill.

Figure 5. Three potential future applications of RadarCat, from left to right: automatic waste sorting in a recycling center; assisting the visually impaired; and automatic check-out machine.

More recent work also demonstrated interesting use cases, such as the world's smallest violin [8], a freehand keyboard [9], biometric user identification, fluid/powder identification, and glucose monitoring [10].

Thanks to its small, low-power, single-package size, the Google Soli radar chip can be embedded in myriad consumer devices, from smart watches to home-control stations. As a technology that can be embedded beneath surfaces and that does not require light, it can be incorporated into both wearables and a range of objects such as cars or IoT devices, where an exposed sensor would not be permissible. Such a sensing technology can afford a paradigm shift in how we interact, in a touchless manner, with any computation, embedded into any device. Examples of virtual tools, smart watch interaction, loudspeaker interaction, and musical applications have also been demonstrated.

RadarCat [7] is a small, versatile system for material and object classification that enables new forms of everyday proximate interaction with digital devices. RadarCat exploits the unique raw radar signals that occur when different materials and objects are placed on the sensor. By using machine-learning techniques, these objects can be accurately recognized. The system can also recognize an object's thickness or state (filled or empty mug), as well as different body parts. This gives rise to research and applications in context-aware computing (Figure 4), tangible interaction (with tokens and objects), industrial automation (e.g., recycling, Figure 5), and laboratory process control (e.g., traceability). Radar-on-chip systems have also been demonstrated in presence sensing and breath monitoring.

WHAT'S DOWN THE ROAD?
We were selected to participate in the Google Soli Alpha Developers program, for which we thank the Google ATAP Soli team. This provided us access at the time to the raw radar signal, which gave us the opportunity to explore radar sensing and machine learning for object and material classification in novel ways. This developer program demonstrates what is possible for the HCI community when it considers radar-on-chip or higher levels of processed data, such as with the Google Soli gestures. We suggest that avenues of research in healthcare, on- and around-body interaction, and object interaction are now opening up. Challenges with radar remain, including energy consumption, dealing with noisy signal processing, and achieving high recognition rates. The implication is that HCI researchers can use this new sensing technology to achieve more. Although radar is not a Swiss army knife solution to every problem, with sensor fusion it can form part of a rich sensing space for new forms of interaction.

The current availability of the Soli developer kit is limited, but we hope and believe that soon it will be available to more developers and researchers. Indeed, at Google I/O last year, the Soli team announced a newer beta developer kit with a built-in processing unit and swappable modules. In the meantime, for $2 or less, readers can buy an alternative radar sensor module, such as the HB-100 (10 GHz) or the RCWL-0516 (a great comparison video can be found on YouTube [11]). Alternative plug-and-play portable radars include those from Walabot or XeThru. High-end custom radar modules can also be found but are not within the scope of this article. Ultimately, it is the advent of small, low-cost radar units that opens up new avenues of research and exploration in HCI, bringing radar sensing into the fabric of our lives.

ACKNOWLEDGMENTS
Thank you to the Soli team for the feedback and corrections during the drafting of this article.

Endnotes
1. Bahl, P. and Padmanabhan, V.N. RADAR: An in-building RF-based user location and tracking system. Proc. of IEEE INFOCOM 2000: Conference on Computer Communications, 19th Annual Joint Conference of the IEEE Computer and Communications Societies. IEEE, 2000, 775–784, vol. 2.
2. Adib, F., Mao, H., Kabelac, Z., Katabi, D., and Miller, R.C. Smart homes that monitor breathing and heart rate. Proc. of the 33rd Annual ACM Conference on Human Factors in Computing Systems. ACM, New York, 2015, 837–846. DOI: https://doi.org/10.1145/2702123.2702200
3. Paradiso, J., Abler, C., Hsiao, K.-Y., and Reynolds, M. The Magic Carpet: Physical sensing for immersive environments. CHI '97 Extended Abstracts on Human Factors in Computing Systems. ACM, New York, 1997, 277–278. DOI: https://doi.org/10.1145/1120212.1120391
4. Lien, J., Gillian, N., Karagozler, M.E., Amihood, P., Schwesig, C., Olson, E., Raja, H., and Poupyrev, I. Soli: Ubiquitous gesture sensing with millimeter wave radar. ACM Trans. Graph. 35, 4 (July 2016), Article 142. DOI: https://doi.org/10.1145/2897824.2925953
5. Wang, S., Song, J., Lien, J., Poupyrev, I., and Hilliges, O. Interacting with Soli: Exploring fine-grained dynamic gesture recognition in the radio-frequency spectrum. Proc. of the 29th Annual Symposium on User Interface Software and Technology. ACM, New York, 2016, 851–860. DOI: https://doi.org/10.1145/2984511.2984565
6. Bridging the physical and digital. Imagine the possibilities. ATAP, Google I/O 2016; https://www.youtube.com/watch?v=8LO59eN9om4
7. Yeo, H.-S., Flamich, G., Schrempf, P., Harris-Birtill, D., and Quigley, A. RadarCat: Radar categorization for input and interaction. Proc. of the 29th Annual Symposium on User Interface Software and Technology. ACM, New York, 2016, 833–841. DOI: https://doi.org/10.1145/2984511.2984515
8. Project Soli: World's Tiniest Violin; https://vimeo.com/155570863
9. SoliType; https://www.youtube.com/watch?v=EoCyfl_TlMI
10. MobileHCI 2017: Workshop on Object Recognition for Input and Mobile Interaction; https://sachi.cs.st-andrews.ac.uk/activities/workshops/mobilehci-17/
11. Radar Sensors/Switches: Comparison and Tests; https://www.youtube.com/watch?v=9WiJJgIi3W0

Hui-Shyong Yeo is a third-year Ph.D. student at the University of St Andrews. His research focuses on exploring and developing novel interaction techniques, including for mobile and wearable systems, AR/VR, text entry, and pen interaction, particularly single-handed interaction.
→ [email protected]

Aaron Quigley is the director of SACHI, the St Andrews computer-human interaction research group. His research interests include surface and multi-display computing, human-computer interaction, pervasive and ubiquitous computing, and information visualization.
→ [email protected]

DOI: 10.1145/3159651 © 2018 ACM 1072-5520/18/01 $15.00
