Natural Navigation in Space and Time
Dmitriy Bespalov

Rochester Institute of Technology, RIT Scholar Works, Theses, 6-6-2013
Available at: http://scholarworks.rit.edu/theses

Recommended Citation: Bespalov, Dmitriy, "Natural navigation in space and time" (2013). Thesis. Rochester Institute of Technology.

Natural Navigation in Space and Time

by Dmitriy Bespalov

A Thesis Submitted in Partial Fulfillment of the Requirements for the Degree of Master of Science in Computer Science

Department of Computer Science
B. Thomas Golisano College of Computing and Information Sciences
Rochester Institute of Technology
Rochester, NY
June 6, 2013

Committee Approval:
Dr. Hans-Peter Bischof, Thesis Advisor
Dr. Brian O'Keefe, Reader
Dr. Leon Reznik, Observer

To my parents.

Abstract

Faculty at the Department of Computer Science at RIT developed Spiegel, a scientific data visualization framework. The system needed a natural interface for controlling 3D data visualizations in real time. This thesis describes an extensible system for testing remote-control interfaces for three-dimensional virtual spaces. We developed and compared four remote controls: the multi-touch TouchPad, the gyroscope-based GyroPad, a wearable Data Glove, and a Kinect-based Hands controller. Our user study revealed the TouchPad to be the most natural remote control.

Acknowledgements

First and foremost, I sincerely thank my advisor, Dr. Hans-Peter Bischof, for his continuous guidance, support, and wisdom during the research and writing of this thesis. I am also thankful to Dr. Brian O'Keefe for his insightful comments, valuable feedback, and support throughout the thesis work. I would also like to thank Dr. Leon Reznik for his encouragement, time, and support, which helped me a lot.

My sincere gratitude goes to the faculty of the Computer Science department at RIT, especially Dr. Minseok Kwon, Dr. Edith Hemaspaandra, Dr. Joe Geigel, Dr. Ivona Bezakova, Warren R. Carithers, Alan Kaminsky, and Dr. Richard Zanibbi, for the extraordinary passion, knowledge, and expertise they share in their work. I also thank Christina Rohr for her advising during the Master's program, and Jason Harrison for his cheerful support and help in resolving administrative questions. My sincerest appreciation goes to Dr. Oli Mival, Senior Research Fellow at Edinburgh Napier University, for his advice on multi-touch gestures for three-dimensional navigation.

I thank my friend Ruslan for his great support, inspiration, and stimulating discussions during the sleepless nights we worked together, and for all the fun we have had since I met him. I also thank my friends Georgii and Vladimir for their support. Many more friends deserve my thanks for their support; since space is limited, to anyone whose name is not listed here: please accept my deep gratitude.

Last, but not least, I thank my parents, Olga and Nikolai, who gave me all the best. Their love gave me confidence and inspiration; their emotional support and wisdom strengthened me in every way. I thank the families of my brother Eugenii, my sister Julia, my brother Michael, my grandma, and my uncles and aunts. Finally, I express my appreciation to my girlfriend, Albina, whose love and support helped me finish this journey.
The research for this thesis was supported in part by the Republic of Tatarstan Higher Education Fellowship Program, a program sponsored by the Ministry of Science and Education of the Republic of Tatarstan and administered by the American Councils for International Education. The opinions expressed herein are the author's own and do not necessarily express the views of the Government of Tatarstan or American Councils.

Contents

1 Introduction
2 Related Work
  2.1 Problem description
  2.2 Similar studies
  2.3 Comparison of existing methods
  2.4 Our Approach
  2.5 Hypothesis
3 Software Architecture
  3.1 System Overview
    3.1.1 Communication protocol
  3.2 Main Program
    3.2.1 Application logic overview
    3.2.2 Architecture
    3.2.3 Sequence Diagram
  3.3 TouchPad App
    3.3.1 Functionality overview
    3.3.2 Architecture
  3.4 GyroPad App
    3.4.1 Functionality overview
  3.5 Data Glove Application
  3.6 Hands Controller Application
    3.6.1 Architecture
    3.6.2 Gesture recognition
4 Evaluation
  4.1 Experiment
    4.1.1 Overview
    4.1.2 Discussion guide
  4.2 Experiment results
    4.2.1 Hands controller
    4.2.2 Data processing
    4.2.3 GyroPad vs. TouchPad
  4.3 Summary
5 Future work
Bibliography

List of Figures

2.1 Process of interaction between human and interface for 3D navigation
2.2 High-level view of the system
3.1 Overview of testing system workflow
3.2 Architecture of the main program
3.3 UDP message structure
3.4 Example of a message from a remote control program to the server program
3.5 Sequence diagram of the main program when it runs an experiment task
3.6 Sequence diagram of the TouchPad app recognizing a remote control command from the user
3.7 User interface of the main program (window on top) and the TouchPad app's screen (bottom window)
3.8 Architecture of the TouchPad app
3.9 Sequence diagram of the GyroPad app recognizing a remote control command
3.10 User interface of the GyroPad app
3.11 Architecture of the GyroPad app
3.12 Circuit that was sewn on a glove. Flex sensors on the left are connected to resistors in parallel and to the pins of the main Arduino LilyPad board. The radio-transmitting bottom board is also connected to the main board.
3.13 Architecture of the Data Glove application
3.14 States of the gesture recognizer in the Data Glove application
3.15 Architecture of the Hands Controller application
3.16 Diagram for a set of blocks that takes the coordinates of two hands as input and outputs smoothed, filtered coordinates of the two hands and the middle point between them
3.17 Invariant condition of a movement gesture
3.18 Invariant condition of a rotation gesture
3.19 Invariant condition of a scaling gesture
4.1 Top: main screen of the testing program; bottom: the main program running a task
4.2 User interface of the TouchPad app
4.3 User interface of the GyroPad app
4.4 Custom-built data glove based on the Arduino LilyPad board
4.5 Kinect for Xbox 360

List of Tables

2.1 Comparison of 3D interfaces for navigation
4.1 Number of samples per task-interface pair
4.2 Duration (in seconds) and number of select actions per experiment task per remote control interface. Outliers are shown in boldface.

Chapter 1

Introduction

Recent advances in technology have boosted researchers' interest in new methods of human-computer interaction. Traditional mouse-and-keyboard interfaces are often awkward for controlling a 3D virtual environment, whether it is a game, modeling software, or a data visualization system. Numerous approaches have been suggested to make user interaction with a system more natural (see Takala et al. (2012)). Originally, research concentrated on vision-based hand-tracking methods for 3D object manipulation (Guan & Zheng (2008), Lin et al. (2002), Pavlovic et al. (1997), Rehg & Kanade (1994)). The ubiquitous use of multi-touch devices with gyroscope and accelerometer sensors, starting with the iPhone, also led to new methods of 3D control (Kratz & Rohs (2010), Edelmann et al. (2009)). With advances in motion sensing, affordable sensors have allowed researchers to build data-glove-based interfaces (see Teleb & Chang (2012), Kumar et al. (2012), Zhu et al. (2010)). Further, the Microsoft Kinect motion sensor and open-source libraries stimulated the development of completely new, full-body interaction systems (Woodall & Bevly (2012), Vanco et al. (2012), Acuna et al. (2012)). Moreover, the Leap Motion sensor device is capable of fast, finger-precise hand tracking (http://leapmotion.com). Doubtless, the need to compare how natural these approaches are for 3D navigation is growing.

We compare the major types of 3D navigation interfaces. In Chapter 2 we review existing methods of navigation in virtual environments. Chapter 3 describes the different devices and methods that were tested. Chapter 4 focuses on the experiment setup and a discussion of the results, and Chapter 5 discusses future work.

Chapter 2

Related Work

2.1 Problem description

Affordable motion sensors, ubiquitous smart devices, and a growing need for natural human-computer interfaces have attracted the attention of researchers since 2009, although a tremendous amount of work was done previously. Again and again, similar concepts are implemented using different devices; for example, a virtual 3D mouse interface has been built on multi-touch, data glove, and camera-based hand-tracking systems.

Faculty at the Department of Computer Science at RIT developed Spiegel, a scientific data visualization framework. The system is a powerful tool for creating visualizations but lacks interactivity. The problem is to find a natural interface to control a visualization in real time. Both the 3D position of the camera and the time parameter of a data visualization need to be controlled.
By 'natural' and 'intuitive' control we mean that, ideally, a user needs no instructions on how to use an interface in order to move easily through virtual space and change the time parameter of a visualization.
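The control problem can be made concrete with a small sketch: a remote control periodically sends the camera's 3D position and the visualization's time parameter to the server over UDP. The field names, JSON encoding, and port used below are hypothetical illustrations only; the actual UDP message structure used by the testing system is described later in Chapter 3.

```python
import json
import socket

def make_control_message(x, y, z, t):
    """Encode a hypothetical control update: camera position plus the
    visualization's time parameter. The real system's wire format differs."""
    return json.dumps({"camera": [x, y, z], "time": t}).encode("utf-8")

def send_control(sock, addr, x, y, z, t):
    # UDP is connectionless: each update is a single self-contained datagram,
    # which suits frequent, loss-tolerant control messages.
    sock.sendto(make_control_message(x, y, z, t), addr)

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # Assumed server address and port for illustration.
    send_control(sock, ("127.0.0.1", 9000), 1.0, 2.0, 3.0, 0.5)
    sock.close()
```

A datagram per update keeps the remote control decoupled from the server: a dropped message is simply superseded by the next one, so no retransmission logic is needed.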
