Creating Novel Mixed Reality Experiences Using Existing Sensors
Creating Novel Mixed Reality Experiences Using Existing Sensors and Peripherals

by Fernando Trujano

S.B., Massachusetts Institute of Technology (2017)

Submitted to the Department of Electrical Engineering and Computer Science in partial fulfillment of the requirements for the degree of Master of Engineering in Electrical Engineering and Computer Science at the Massachusetts Institute of Technology, February 2018.

© Fernando Trujano, 2017. All rights reserved. The author hereby grants to MIT permission to reproduce and to distribute publicly paper and electronic copies of this thesis document in whole or in part in any medium now known or hereafter created.

Author: Department of Electrical Engineering and Computer Science, December 18, 2017
Certified by: Pattie Maes, Professor, Thesis Supervisor
Accepted by: Christopher J. Terman, Chairman, Master of Engineering Thesis Committee

Abstract

Mixed Reality (MR) technology allows us to seamlessly merge real and virtual worlds to create new experiences and environments. MR includes technologies such as Virtual Reality and Augmented Reality. Even though this technology has been around for several years, it is just starting to become available to the average consumer with the introduction of devices such as the HTC Vive and the Microsoft Hololens. This thesis presents the design, implementation and evaluation of three projects, each highlighting a novel mixed reality experience built using existing sensors and peripherals. VRoom describes a simple and novel approach to 3D room scanning using only an HTC Vive and a 360° camera.
Mathland is a mixed reality application that enables users to interact with math and physics using a stretch sensor, an IMU and a camera. ARPiano is an augmented MIDI keyboard that offers a different approach to music learning by adding intuitive visuals and annotations.

Thesis Supervisor: Pattie Maes, Professor

Acknowledgments

I would like to thank my advisors, Prof. Pattie Maes and Mina Khan, for their guidance and support throughout this thesis. I started my Master's without a particular project or idea in mind, and they were instrumental in introducing me to the potential of MR. I would also like to thank everyone in the Fluid Interfaces group at the Media Lab for sharing their passions and inspiring me to pursue mine. Mathland, as described in Chapter 3, was a collaborative effort with Mina Khan, Ashris Choudhury and Judith Sirera. They helped build and shape Mathland into the tool it is today, and they will continue its development after my graduation. Finally, I would like to thank my mom, Marisa Pena-Alfaro, my sister, Linda Trujano, and the rest of my family for always cheering me up and giving me the confidence needed to pursue this project. I would like to thank my girlfriend, Juju Wang, for providing valuable input and constantly putting up with my rants on Mixed Reality technology. I would also like to thank my friends for providing a de-stressing environment during my time at MIT.

Contents

1 Introduction 15
2 VRoom 17
  2.1 Introduction 17
  2.2 Current Research 18
  2.3 System Overview 19
  2.4 Implementation 20
    2.4.1 Capture Scene 21
    2.4.2 Viewer Scene 21
    2.4.3 PointViewer Scene 23
  2.5 Evaluation 23
  2.6 Challenges 25
    2.6.1 Synchronization 25
    2.6.2 VR Comfort 26
  2.7 Future Work 27
  2.8 Conclusion 29
3 Mathland 31
  3.1 Introduction 31
    3.1.1 Goals 31
  3.2 Integrated Sensors 33
    3.2.1 Hololens Camera 33
    3.2.2 F8 Sensor 34
  3.3 Design 38
  3.4 Sharing 39
  3.5 Evaluation 39
    3.5.1 Study Results 41
  3.6 Challenges 42
    3.6.1 Hololens Development 43
    3.6.2 Hololens Anchor Sharing 44
    3.6.3 Showcasing AR Work 45
    3.6.4 Hardware Limitations 46
  3.7 Future Work 47
  3.8 Conclusion 48
4 ARPiano 49
  4.1 Introduction 49
    4.1.1 Goals 49
  4.2 Previous Work 50
    4.2.1 SmartMusic 51
    4.2.2 Synthesia 51
    4.2.3 Andante/Perpetual Canon 51
  4.3 Design 52
    4.3.1 Integrated Sensors 52
    4.3.2 Physical and Augmented Keyboard 53
    4.3.3 Keyboard UI/Helper Classes 54
    4.3.4 Component Menu 55
    4.3.5 Sharing 56
    4.3.6 Unity Debugging 57
  4.4 Keyboard Components 58
    4.4.1 Chord Suggestion 59
    4.4.2 Keyboard Labels 60
    4.4.3 Song Tutorial 60
    4.4.4 Note Spawner and Flashy Visualizer 64
    4.4.5 Chord Detector 65
  4.5 Evaluation 66
    4.5.1 Study Results 67
  4.6 Challenges 68
  4.7 Future Work 69
  4.8 Conclusion 71
5 Conclusion 73

List of Figures

2-1 VRoom system diagram. 360° video and 3D positional data from the 360° camera and Vive Tracker are fed into Unity in order to produce a VR application. 19
2-2 The viewer scene finds the best frame to render given the user's position and renders it on the head-mounted display. 22
2-3 The point viewer scene renders each frame as a sphere at the 3D position in which it was captured. 23
3-1 (a) The Arm Controller consists of a compression sleeve that can be worn by the user, with a figur8 sensor that combines an IMU and a stretch sensor. (b) and (c) Tracking the user's arm movements using the Arm Controller. 35
3-2 The Object Controller, which enables tangible interactions with virtual objects in Mixed Reality. 36
3-3 The Microsoft Hololens headset (for the MR experience) with our Object Controller and Arm Controller. 37
3-4 Five virtual items in Mathland's menu allow the user to experience different physical forces and interactions. 38
3-5 Three puzzles created by arranging the virtual checkpoints in MR. The left-side images show the puzzle, whereas the right-side ones show the solution.
Each puzzle corresponds to a specific physics concept: a) circular motion (using a ball with a rope and a velocity vector), b) linear motion (using a ball with a downward force field, two ramps, and a velocity vector), c) projectile motion (using a downward force field and a velocity vector). 40
3-6 Results from our survey with 30 participants. The responses were based on a 5-point Likert scale: Strongly Agree, Agree, Neutral, Disagree, Strongly Disagree. Charts a) to f) each correspond to one question on our post-study survey. 42
4-1 Multifunction knob and MIDI keyboard with AR markers on the first and last keys. 52
4-2 Calibration ensures that the Hololens knows the exact location of the physical keyboard. The markers show the perceived location of the first and last keys. 54
4-3 3D icons representing Keyboard Components are overlaid on top of the keyboard. From left to right, the icons represent: KeyboardLabels, NoteSpawner, ChordSuggestion, ChordDetector and SongTutorial. In this figure, the NoteSpawner component is enabled. 56
4-4 Non-calibration features, such as the Piano Roll, can be debugged directly in Unity. 58
4-5 The Chord Suggestion component highlights keys that can form chords. In this figure, both major (green highlight) and minor (pink highlight) chords are suggested, with shared keys highlighted blue. 59
4-6 The Keyboard Labels component augments each key by labeling its corresponding note. 60
4-7 SpectatorView image of the AR PianoRoll representation of Beethoven's Für Elise. 61
4-8 The knob can be used to see and control the position of the song being played (left) or the speed at which it is being played (right). 62
4-9 At the end of a Song Tutorial, the keyboard is transformed into a visual representation of the user's performance. In this graph, it is clear that the user made more mistakes in a specific section near the beginning of the song. 63
4-10 The NoteSpawner component emits cuboids as the user plays the keyboard in order to visualize rhythmic and tonal patterns in the song. 64
4-11 The Chord Detector component analyzes the keys as they are pressed in order to label chords when they are played. This figure also shows how multiple components (ChordDetector and KeyboardLabels) can be enabled at once. 65

Chapter 1

Introduction

Mixed Reality (MR) technology allows us to seamlessly merge real and virtual worlds to create new experiences and environments. The term "Mixed Reality" was defined in 1994 and includes technologies such as Virtual Reality (VR) and Augmented Reality (AR). Even though this technology has been around for several years, it is just starting to become available to the average consumer with the introduction of devices such as the HTC Vive and the Microsoft Hololens. To make these products a reality, researchers had to devise new sensors and technologies, such as Microsoft's spatial understanding sensors and HTC's Lighthouses. Until now, most research has focused on creating new sensors along with their corresponding products, but little research has been published on the use of existing sensors and peripherals in mixed reality applications. As such, there may be lost or unknown potential in existing sensors when they are used in an MR application. The Leap Motion, for example, initially started out as a desktop hand and finger tracker, but its true potential was later discovered to be in VR.