
[POSTER] Holographic iRay: Exploring Augmentation for Medical Applications

Tian Xie¹, Mohammad M. Islam¹, Alan B. Lumsden², Ioannis A. Kakadiaris¹,*
¹Computational Biomedicine Lab, Dept. of Computer Science, Univ. of Houston
²Methodist Research Institute, Houston, USA
*Corresponding author: [email protected]

Figure 1: Overview of the Holographic iRay prototype: (a) selection view; (b) AR view; and (c) observation view.

ABSTRACT

A Holographic iRay prototype focusing on medical augmented reality is presented. The prototype is built on the HoloLens with the Unity engine, based on a previous iRay system developed for the iPad. A human subject is scanned using magnetic resonance imaging, and the torso surface is pre-operatively segmented for 3D registration. The registration is performed using the iterative closest point algorithm between the pre-operative torso surface and the active torso surface mesh obtained through HoloLens spatial mapping and gaze interaction. A scanning box is visualized at the gaze point to help the user select the target area. The pre-operative torso surface and the sampled active vertices within the scanning box are additionally overlaid in the box as auxiliary guidance information. Several simple interactions are designed to control the rendering of the inner organs after registration. The experimental results demonstrate the potential of the Holographic iRay for medical applications.

Keywords: Medical AR, Markerless Registration, ICP, HMD, HoloLens, Spatial Mapping.

Index Terms: H.5.1 [Information Interfaces and Presentation]: Multimedia Information Systems – Artificial, augmented, and virtual realities; H.5.2 [Information Interfaces and Presentation]: User Interfaces – Interaction styles; I.3.6 [Computer Graphics]: Methodology and Techniques – Interaction techniques; J.3 [Life and Medical Sciences]

1 INTRODUCTION

This work is a holographic exploration of the iRay system [1], moving it from mobile devices to a head-mounted display (HMD), the Microsoft HoloLens [2]. The iRay system is a mobile Augmented Reality (AR) application implemented on the iPad [3] with a high-precision depth sensor (Structure Sensor [4]). It registers the pre-operative torso surface to the active torso surface scanned by the depth sensor, and tracks the camera using a simultaneous localization and mapping (SLAM) method provided by the Structure SDK [4].

The previous iRay system offers a limited immersive experience due to its handheld 2D display and non-spatial screen-touch interactions. These limitations motivated us to explore the capability of iRay on an HMD. In this paper, we chose the HoloLens to implement the Holographic iRay because of its integrated surface reconstruction, spatial mapping [5]; the reconstructed mesh can therefore be used with the same registration strategy.

The HoloLens has been available for only one year, but it has already impressed the public with its inside-out tracking system and standalone computational capability. Several holographic applications [6-9] have been developed on the HoloLens for medical purposes. However, most do not register the holograms to the human subject; they merely place the holograms on the ground, on a table, or in the air for educational purposes.

One reason for this is that the HoloLens does not provide an integrated registration method. It automatically tracks head pose and position, but it does not track individual objects by default. Thus, registration remains as challenging as in traditional AR.

Scopis recently launched a Holographic Navigation Platform [10] for surgical use. It relies on an external 3D positional tracking system based on a stereoscopic infrared camera with markers.

In contrast, the Holographic iRay prototype presented in this paper requires no sensors beyond the HoloLens itself. Markerless registration is achieved with the help of spatial mapping, which reconstructs the environment surfaces. The experiments demonstrate that this prototype is effective and easy to use for the tasks explored.

2 IMPLEMENTATION

The Holographic iRay follows the system structure of the previous iRay system [1]. The offline pre-operative phase consists of the medical scan (MRI/CT), segmentation of the torso surface and inner organs, and model creation. The online system scans the environment and the human body, obtains the 3D points located in the target area selected by the user's gaze, registers the pre-operative surface to this point cloud, renders the inner organs based on the registration result, and responds to the user's interactions.

Figure 1 depicts the three major views of the Holographic iRay prototype. Figure 1(a) depicts the selection view before registration. In this view, the surfaces reconstructed by HoloLens spatial mapping are rendered as a white wireframe. A red circular cursor is drawn at the 3D point where the gaze direction hits the reconstructed surfaces. Around the cursor, a transparent white scanning box is rendered to emphasize the region of interest (ROI) in 3D. The registration will be applied to the vertices within the scanning box and the vertices sampled from the pre-operative torso mesh. Therefore, the size of the box is defined by the 3D size of the pre-operative torso surface, and the box is kept parallel to the ground.

The vertices located inside the scanning box are previewed as green spheres. The maximum number of these preview spheres is set to 50 to guarantee real-time performance. To guide the user in selecting the ROI, the pre-operative torso surface is overlaid in the middle of the scanning box. This guidance strategy was demonstrated to be effective by a user study in our previous work [11].
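To make the selection step concrete, the sketch below gathers the spatial-mapping vertices that fall inside a ground-aligned scanning box centered at the gaze hit point and subsamples them for the preview spheres. This is a minimal Python/NumPy illustration, not the prototype's Unity code; the function name `select_roi_vertices`, the axis-aligned containment test, and the random subsampling are our assumptions.

```python
import numpy as np

def select_roi_vertices(mesh_vertices, gaze_hit, torso_vertices, max_preview=50):
    """Select spatial-mapping vertices inside the scanning box.

    mesh_vertices  : (N, 3) world-space vertices from spatial mapping.
    gaze_hit       : (3,) point where the gaze ray hits the reconstruction.
    torso_vertices : (M, 3) pre-operative torso mesh (defines the box size).
    """
    # The box extents equal the pre-operative torso's bounding box, and the
    # box stays axis-aligned in the world frame, i.e., parallel to the ground.
    half_size = (torso_vertices.max(axis=0) - torso_vertices.min(axis=0)) / 2.0

    # Axis-aligned containment test centered at the gaze point.
    inside = np.all(np.abs(mesh_vertices - gaze_hit) <= half_size, axis=1)
    roi = mesh_vertices[inside]

    # Cap the preview at 50 green spheres to preserve real-time rendering.
    if len(roi) > max_preview:
        roi_preview = roi[np.random.choice(len(roi), max_preview, replace=False)]
    else:
        roi_preview = roi
    return roi, roi_preview
```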
Given the above settings, the selection of active vertices is controlled solely by the gaze interaction, which is provided by HoloLens and Unity [12] components. When the user is satisfied with the selection and ready to register the surface, the voice command "Match" launches the registration. The registration is performed in real time by the iterative closest point (ICP) algorithm [13], which is built as a dynamic link library inside the application.
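As a reference for the registration step, the following is a self-contained point-to-point ICP sketch, including the downsampling to under 100 points per cloud described in Section 3. The prototype uses a compiled ICP library; this NumPy/SciPy version, with nearest-neighbor correspondences from a k-d tree and SVD-based rigid alignment, is an illustrative stand-in rather than that library's code.

```python
import numpy as np
from scipy.spatial import cKDTree

def downsample(points, n=100):
    """Randomly subsample a point cloud to at most n points."""
    if len(points) <= n:
        return points
    return points[np.random.choice(len(points), n, replace=False)]

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t with dst ~= src @ R.T + t."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)          # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:               # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

def icp(source, target, iters=30, tol=1e-6):
    """Align `source` to `target`; returns a 4x4 rigid transform (6 DoF)."""
    src, tgt = downsample(source), downsample(target)
    tree = cKDTree(tgt)
    T = np.eye(4)
    prev_err = np.inf
    for _ in range(iters):
        dist, idx = tree.query(src)        # nearest-neighbor correspondences
        R, t = best_rigid_transform(src, tgt[idx])
        src = src @ R.T + t                # apply the incremental update
        step = np.eye(4)
        step[:3, :3], step[:3, 3] = R, t
        T = step @ T                       # accumulate the total transform
        if abs(prev_err - dist.mean()) < tol:
            break
        prev_err = dist.mean()
    return T
```

Because the user has already placed the scanning box over the torso, ICP starts from a coarse alignment, which is what allows such small point budgets to converge quickly.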

Figure 1(b) depicts the AR view after registration. The holograms of the torso surface and inner organs are rendered based on the transformation (6 degrees of freedom) computed by the ICP. The rendering of the spatial mapping meshes is stopped, and those meshes no longer interact with the 3D cursor.

Figure 1(c) depicts the observation view, where the user can examine the details of the organs of interest as floating holograms beside the patient. The user can perform an air-tap gesture [14] or say "Detail" to activate this view.

In both the AR view and the observation view, the user can also use the air-tap or voice commands such as "Hide" and "Show" to control the visibility of the selected organs. Figure 2 depicts two examples of visibility control.

Figure 2: Examples of visibility control.

In the current prototype, the detailed holograms are rendered at double scale and are not movable. However, it would be straightforward to add functions for manipulating the detailed holograms and for attaching additional auxiliary information (e.g., text, images, or animations).

3 RESULTS AND DISCUSSION

Figure 3 depicts the augmentation results from various perspectives. The tracking by the HoloLens is robust to large slants and motion blur as long as the human subject keeps still, but small trembling occurs when holograms are too close to the eyes. Additionally, the holograms exhibit a dynamic bias relative to the camera position, which may be caused by camera distortion or tracking limitations.

The Holographic iRay works in real time. The precision of HoloLens spatial mapping is set to 5,000 triangles per cubic meter, although the generated mesh may not reach that exact density. The time between mesh updates is set to 2.5 s to allow re-computation of deforming surfaces. This combination of parameters is suitable for the current prototype, but it has not been tested or verified to yield the best performance.

The ICP registration works in real time. Both the pre-operative and the active point clouds are sampled down to fewer than 100 points before registration. The design of the scanning box further improves the registration speed: during the selection step, the pre-operative torso surface has already been placed very near the active surface, which can be considered a coarse match performed, perhaps unconsciously, by the user.

The accuracy of the active augmentation cannot be physically measured because the ground truth of the camera pose is not currently available; fiducial markers or other additional sensors would be necessary for a quantitative measurement. A 3 cm translation in the direction of gravity is currently added to coarsely compensate for the 3D reconstruction error of the HoloLens. A calibration to estimate the exact displacement of spatial mapping is left for future work; however, the raw depth information would be preferred if the HoloLens provides it in the future.
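To show how the registration result and this empirical offset could combine when placing the holograms, the short sketch below applies the 6-DoF ICP transform followed by the 3 cm gravity-direction shift. It assumes a y-up world frame (Unity's convention), so gravity points along -y; the axis, the sign, and the helper `place_hologram` are illustrative assumptions, and in practice the offset would come from the calibration discussed above.

```python
import numpy as np

# Empirical compensation from the text: a 3 cm translation along gravity.
# Assumes a y-up world frame, so gravity points along -y; the exact axis and
# sign would be fixed by the calibration discussed above.
GRAVITY_OFFSET = np.array([0.0, -0.03, 0.0])  # meters

def place_hologram(model_vertices, T_icp, offset=GRAVITY_OFFSET):
    """Map pre-operative model vertices (N, 3) into the world frame with the
    6-DoF ICP transform (4x4 homogeneous), then apply the gravity shift."""
    R, t = T_icp[:3, :3], T_icp[:3, 3]
    return model_vertices @ R.T + t + offset

# Example usage with the ICP sketch above:
#   T = icp(preop_vertices, roi_vertices)
#   placed = place_hologram(preop_vertices, T)
```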
The vivid immersive experience and convenient interactions were validated by several different users, which demonstrates that the Holographic iRay is effective and efficient for the tasks explored.

Future work will focus on accuracy improvements and additional interactions. More experiments will be performed to compare different registration methods for the tasks explored. Tracking of a moving human body will be studied, since the subject currently must remain still, a limitation imposed by the SLAM tracking. Furthermore, user studies will be performed to improve the immersive experience depending on the specific scenario.

4 CONCLUSION

In this paper, we present a Holographic iRay prototype implemented on the HoloLens. The prototype explores the capability of our previous iRay system on the latest high-end HMD for medical applications. Spatial mapping surfaces are used to register the torso surface of a human subject. The good immersive experience and simple interactions were validated by several users. This work expands the usability of the HoloLens in the medical domain.

ACKNOWLEDGMENTS

This research was funded in part by the Methodist Research Institute and the UH Hugh Roy and Lillie Cranz Cullen Endowment Fund. All statements of fact, opinion, or conclusions contained herein are those of the authors and should not be construed as representing the official views or policies of the sponsors.

REFERENCES

[1] I. A. Kakadiaris, M. M. Islam, T. Xie, C. Nikou, and A. B. Lumsden, "iRay: Mobile AR using structure sensor," In Proc. 15th IEEE International Symposium on Mixed and Augmented Reality, Merida, Mexico, 2016, pp. 127-128.
[2] Microsoft Corp. HoloLens. Available: https://www.microsoft.com/en-us/hololens
[3] Apple Inc. iPad. Available: https://www.apple.com/ipad/
[4] Occipital Inc. Structure Sensor. Available: https://structure.io/
[5] Microsoft HoloLens. Spatial mapping. Available: https://developer.microsoft.com/en-us/windows/mixed-reality/spatial_mapping
[6] Microsoft HoloLens. Partner Spotlight with Case Western Reserve University. Available: https://www.youtube.com/watch?v=SKpKlh1-en0
[7] MediSIM. Medical Simulated Interactive Manikin. Available: http://www.etc.cmu.edu/projects/medisim/
[8] HoloEyes Inc. HoloLens Surgery: Holographic augmented mixed reality navigation. Available: https://www.youtube.com/watch?v=qLGD570I1OE
[9] ANIMA RES GmbH. INSIGHT HEART on HoloLens. Available: https://www.youtube.com/watch?v=c9O7L1Wtqqc&list=PLKFqg-NQRGaIR_mcetDGh7qZn2k4XklCB
[10] Scopis GmbH. Holographic Navigation Platform. Available: http://holo.scopis.com/
[11] T. Xie, M. M. Islam, A. B. Lumsden, and I. A. Kakadiaris, "Semi-automatic initial registration for the iRay system: A user study," In Proc. 4th International Conference on Augmented Reality, Virtual Reality, and Computer Graphics, Ugento, Lecce, Italy, 2017, pp. 33-42.
[12] Unity. Available: https://unity3d.com/
[13] A. Geiger, P. Lenz, and R. Urtasun, "Are we ready for autonomous driving? The KITTI vision benchmark suite," In Proc. 25th IEEE Conference on Computer Vision and Pattern Recognition, Providence, RI, USA, 2012, pp. 3354-3361.
[14] Microsoft HoloLens. Gestures. Available: https://developer.microsoft.com/en-us/windows/mixed-reality/gestures

Figure 3: Augmentation results from various perspectives and under motion blur. The first row depicts the augmentation results under motion blur when the camera is close to the subject. The second row depicts the results at large slants, as the camera moves farther away or even looks away from the registered torso. The third row depicts the results under large motion blur.