Virtual Reality Musical Instruments: State of the Art, Design Principles, and Future Directions


Stefania Serafin,* Cumhur Erkut,* Juraj Kojs,† Niels C. Nilsson,* and Rolf Nordahl*

*Multisensory Experience Lab, Aalborg University Copenhagen, A.C. Meyers Vænge 15, 2450 Copenhagen, Denmark
{sts, cer, ncn, rn}@create.aau.dk

†Frost School of Music, University of Miami, P.O. Box 248165, Coral Gables, FL 33146, USA
[email protected]

Computer Music Journal, 40:3, pp. 22-40, Fall 2016. doi:10.1162/COMJ_a_00372. © 2016 Massachusetts Institute of Technology.

Abstract: The rapid development and availability of low-cost technologies have created a wide interest in virtual reality. In the field of computer music, the term "virtual musical instruments" has been used for a long time to describe software simulations, extensions of existing musical instruments, and ways to control them with new interfaces for musical expression. Virtual reality musical instruments (VRMIs) that include a simulated visual component delivered via a head-mounted display or other forms of immersive visualization have not yet received much attention. In this article, we present a field overview of VRMIs from the viewpoint of the performer. We propose nine design guidelines, describe evaluation methods, analyze case studies, and consider future challenges.

In 1965, Ivan Sutherland described his idea of an ultimate display, conceived as a multisensorial virtual experience able to recreate the Wonderland where Alice once walked (Sutherland 1965). More than 50 years later, such an ultimate display is nearly at hand. In the past couple of years, the rapid development of low-cost virtual reality displays, such as the Oculus Rift (www.oculus.com/rift), HTC's Vive (www.htcvive.com), and open-source virtual reality (www.razerzone.com/osvr), has boosted interest in immersive virtual reality technologies. Hardware devices once exclusive to high-end laboratories are now available to consumers and media technology developers. We define virtual reality (VR) as an immersive artificial environment experienced through technologically simulated sensory stimuli. Furthermore, one's actions have the potential to shape such an environment to various degrees.

In the fields of new interfaces for musical expression and sound and music computing, virtual musical instruments (VMIs) have been developed and refined in recent decades (see, e.g., Cook 2002; Smith 2010). Physical interfaces for musical expression, gestural control, and mapping systems between novel interfaces and sound synthesis algorithms have been at the forefront of development. Progress in interactive auditory feedback and novel control mechanisms has made great strides. On the other hand, the new interfaces for musical expression (NIME) community has not shown a wide interest in the visual aspects of VMIs. One of the reasons why virtual reality musical instruments (VRMIs) have not drawn more attention in this community might be the fact that musicians mostly rely on their auditory and tactile feedback and on the interaction with the audience. Another reason could be that low-cost and portable visualization devices have only become available in the last five years.

Researchers working with interactive audio and haptic feedback have called their instruments VRMIs (see, e.g., Leonard et al. 2013); however, visual feedback was absent. In this article we distinguish between, on the one hand, virtual musical instruments (VMIs), defined as software simulations or extensions of existing musical instruments with a focus on sonic emulation (for example, by physical modeling synthesis; Välimäki and Takala 1996), and, on the other hand, virtual reality musical instruments (VRMIs), where the instruments also include a simulated visual component delivered using either a head-mounted display (HMD) or other forms of immersive visualization systems such as a CAVE (Cruz-Neira, Sandin, and DeFanti 1993).
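Physical modeling synthesis, named above as the sonic basis of many VMIs, can be made concrete with a small sketch. The Karplus-Strong plucked-string algorithm is one of the simplest members of this family: a noise-filled delay line whose length sets the pitch is recirculated through an averaging filter that damps the tone the way a real string loses energy. This is a minimal illustration of the general technique, not code from any of the systems cited here; the sample rate and decay factor are arbitrary choices.

```python
import random

def karplus_strong(frequency, duration, sample_rate=44100, decay=0.996):
    """Synthesize a plucked-string tone with the Karplus-Strong algorithm."""
    n = int(sample_rate / frequency)                      # delay-line length sets the pitch
    line = [random.uniform(-1.0, 1.0) for _ in range(n)]  # burst of noise models the pluck
    out = []
    for _ in range(int(duration * sample_rate)):
        sample = line.pop(0)
        out.append(sample)
        # Two-point average in the feedback loop: high frequencies
        # decay first, as they do on a vibrating string.
        line.append(decay * 0.5 * (sample + line[0]))
    return out

samples = karplus_strong(440.0, 0.5)  # half a second of an A4 pluck
```

Writing `samples` to a sound device or WAV file reproduces the characteristic pluck; manipulating the model's parameters while the loop runs is the kind of intervention (e.g., resizing an instrument during play) that distinguishes simulated instruments from their physical counterparts.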
We do not consider those instruments in which 3-D visual objects are projected using 2-D projection systems, such as the environment for designing virtual musical instruments with 3-D geometry proposed by Axel Mulder (1998) and the tabletop-based systems proposed by Sergi Jordà (2003). We focus instead on the examination of 3-D immersive visualizations from the viewpoint of the performer.

We believe that virtual reality shows the greatest potential when facilitating experiences that cannot be encountered in the real world. As Jaron Lanier envisioned in Zhai (1998):

    I have considered being a piano. I'm interested in being musical instruments quite a lot. Also, you have musical instruments that play reality in all kinds of ways aside from making sound in Virtual Reality. That's another way of describing arbitrary physics. With a saxophone you'll be able to play cities and dancing lights, and you'll be able to play the herding of buffaloes made of crystal, and you'll be able to play your own body and change yourself as you play the saxophone (Heilbrun 1989, p. 110).

In line with Lanier's thoughts, immersive technologies enable new musical experiences that extend beyond those offered by traditional musical instruments. Immersive visualizations thus stimulate additional levels of abstraction, immersion, and imagination.

On the other hand, virtual reality research has focused mainly on the exploration of visual feedback, with other senses playing a secondary role. In some cases, the sole focus on the visual domain can provide a powerful visual experience removed from daily life. Ben Shneiderman has stated that virtual reality offers a lively new alternative for those who seek immersive experiences by blocking out the real world with goggles on their eyes (Preece et al. 1994). It is important to note that virtual reality can pose some challenges for the audience that witnesses a performer wearing a head-mounted display for immersive visualization, as stated by Berthaut, Zappi, and Mazzanti (2014). This is mainly because the audience is unable to share the performer's experience. Berthaut et al. (2013) presented an attempt to expose the audience to the mechanisms of digital musical instruments using 3-D visualization. The topic of audience participation in virtual reality experiences lies, however, beyond the scope of this article.

Virtual reality musical instruments have the potential to bridge the fields of NIME and VMIs. Adding meaningful immersive visualization to a digital musical instrument can result in a new way of experiencing music. In the following sections, we present an overview and the current status of research and practice of VRMIs. We propose nine design guidelines, provide case studies, and discuss potential future challenges.

Background Work

Although research and development in VMIs has a long history and many examples can be found in the literature, the same cannot be said for VRMIs. In the following we propose an overview of the most significant VRMI research efforts.

Cadoz and colleagues have been developing their CORDIS-ANIMA system for many years. It is a modeling and simulation programming language for audio and image synthesis based on physical models (Cadoz, Luciani, and Florens 1993). Although, to our knowledge, the system has never been used in immersive virtual environments, it includes all the elements of immersive multimodal musical instruments, covering auditory, haptic, and visual feedback.

Mäki-Patola and coworkers reported on a software system designed to create virtual reality musical instruments (Karjalainen and Mäki-Patola 2004; Mäki-Patola et al. 2005). The team presented four case studies: a virtual xylophone, a membrane, a theremin, and an air guitar. The authors discuss quality, efficiency, and learning curve as outlined by Jordà (2004). For example, low spatial and temporal resolution and latency decrease efficiency (Mäki-Patola et al. 2005). They suggest that existing technologies can improve these deficiencies and allow for more responsive instruments, expanding both design and performance possibilities.

One of the most important features of VR interfaces is their potential for visualization. Virtual reality musical instruments can stimulate intelligent visual feedback that may aid the player in performing. This expands the interaction with physical instruments, where visual feedback is directly connected to the mechanism of sound production, for example, vibrating strings. Gelineck et al. (2005) proposed several physics-based VRMIs, such as a flute and a drum. The instruments used a combination of physical models, 3-D visualizations, and alternative sensing techniques. Gelineck and colleagues further detailed possibilities that could not be achieved in the physical counterpart, such as the ability to change the dimensions of the simulated instruments while playing them.

Similarly, the FM synthesizer and the virtual air guitar that were developed within the Algorithms for the Modeling of Acoustic Interactions (ALMA) project and presented by Mäki-Patola and coworkers sonically expand the theremin and guitar, respectively. Blowing into a plastic

Figure 1. Virtual FM Synthesizer (a) and Virtual Air Guitar (b). Images are from http://mcpatola.com/almaproject/HUTTMLIndex.html, used with kind permission of Teemu Mäki-Patola.
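The virtual FM synthesizer rests on frequency modulation synthesis, in which a carrier oscillator's frequency is modulated at audio rate by a second oscillator, producing sidebands whose strength grows with the modulation index. The sketch below is our own minimal two-oscillator illustration of the technique, not the ALMA project's implementation; all parameter values are arbitrary.

```python
import math

def fm_tone(carrier_hz, modulator_hz, mod_index, duration, sample_rate=44100):
    """Two-oscillator FM: y(t) = sin(2*pi*fc*t + I*sin(2*pi*fm*t))."""
    out = []
    for i in range(int(duration * sample_rate)):
        t = i / sample_rate
        # The modulator deviates the carrier's instantaneous phase;
        # the modulation index I controls spectral brightness.
        phase = (2.0 * math.pi * carrier_hz * t
                 + mod_index * math.sin(2.0 * math.pi * modulator_hz * t))
        out.append(math.sin(phase))
    return out

tone = fm_tone(220.0, 110.0, mod_index=3.0, duration=0.25)
```

With a simple integer carrier-to-modulator ratio such as the 2:1 used here, the sidebands fall on a harmonic series; detuning the ratio yields the inharmonic, bell-like spectra for which FM synthesis is known.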