Global Illumination for Fun and Profit


A Wearable Hardware Platform for Capturing Expert's Experience

Puneet Sharma (University of Tromsø, Norway), Tre Azam (Myndplay, UK), Soyeb Aswat (Myndplay, UK), Roland Klemke (Open University of the Netherlands, Netherlands), Fridolin Wild (Oxford Brookes University, UK)

ABSTRACT

In this article, we propose a mapping of methods that facilitate the transfer of experience from expert to trainee to low-level functions such as gaze, voice, video, body posture, hand gestures, bio-signals, fatigue levels, and the location of the user in the environment. This mapping is used to decompose the low-level functions into their associated sensors. After reviewing the requirements, a set of sensors is proposed for the experience-capturing platform. Based on this, a first version of the hardware platform is designed and implemented. The proposed platform is modular and can be adapted to different sensors from different vendors as technology progresses. Furthermore, we discuss key challenges and future directions associated with the hardware platform. This work was done as part of an EU project called WEKIT.

Keywords: sensors, experience capturing, augmented reality, AR, WEKIT, WT

Index Terms: WT, AR, WEKIT

1 INTRODUCTION

For the smooth daily operations of an organization, experienced workers are vital at every level. By sharing their knowledge, experience, expertise in procedures, and best practices with colleagues, trainees, managers, and bosses, they build, maintain, and support the different functions of an organization. Industries are fully aware of this and are trying new ways to capture, support, and preserve the experience of an expert [1].

One of the key challenges faced by industry is keeping workers up to date in terms of their skill set. With this in mind, WEKIT is a European research and innovation project, supported under Horizon 2020, to develop and test within three years a novel way of industrial training enabled by smart Wearable Technology (WT).

The WEKIT industrial learning methodology comprises capturing the experience of an expert and re-enacting that experience for training novices [2], with the former being the focus of this article. In a recent study [3], an extensive review of hands-on industrial training was performed using the principles of the 4C/ID model [4], and the authors proposed instructional methods in the form of components that facilitate the transfer of experience from an expert to a novice.

2 METHODOLOGY

In this section, we first discuss the mapping of transfer mechanisms; second, we outline sensor recommendations; and finally, we discuss the proposed hardware platform prototype.

2.1 Mapping Transfer Mechanisms

The authors of [3] define instructional methods as transfer mechanisms, i.e., methods that facilitate the transfer of knowledge. The transfer mechanisms include: remote symmetrical tele-assistance, virtual/tangible manipulation, haptic hints, virtual post-its, mobile control, in situ real-time feedback, case identification, directed focus, self-awareness of physical state, contextualisation, object enrichment, think-aloud protocol, zoom, and slow motion. Below, we decompose the different transfer mechanisms into low-level functions and their associated state-of-the-art sensors [5].

1. Remote symmetrical tele-assistance
   Description: View and capture the activity of another person from their perspective; transmit video and audio.
   Sensors: Smart/AR glasses.
   Key products: Moverio BT-200/2000, Microsoft HoloLens, Sony SmartEyeglass, Google Glass, Meta 2, Vuzix M-100, Optinvent Ora-1, ODG R-7.

2. Virtual/tangible manipulation
   Description: Hand movement tracking, accelerometer, gyroscope.
   Sensors: Depth camera, smart armband.
   Key products: Myo gesture control armband, Leap Motion controller.

3. Haptic hints
   Description: Vibrations on the arm or fingers.
   Sensors: Vibrotactile bracelets and rings.
   Key products: Myo, magic ring.

4. Virtual post-its, contextualisation, in situ real-time feedback
   Description: Object tracking in the environment.
   Sensors: Smart glasses, tablet computer, or mobile phone.
   Key products: Several.

5. Mobile control
   Description: Control dials and other user interface elements.
   Sensors: Hand gesture tracker, controller.
   Key products: Myo, Leap Motion.

6. Case identification
   Description: Link with existing cases, link with error knowledge.
   Sensors: Case-based reasoning component.
   Key products: No specific sensor.

7. Directed focus
   Description: Direct the focus of the user.
   Sensors: Gaze direction / object recognition, EEG (attention/focus/mental effort).
   Key products: Smart glasses (or gyroscope only), MyndPlay MyndBand, Interaxon Muse EEG, Neurosky MindWave, Emotiv EEG.

8. Self-awareness of physical state
   Description: Fatigue level, vigilance level, biodata (e.g., steps, sleep, heart rate), body posture and ergonomics (e.g., lean back, lean forward).
   Sensors: EEG, smart watch, posture tracker.
   Key products: MyndPlay MyndBand, Neurosky MindWave, Emotiv EEG, Fitbit, Apple Watch, Leap Motion, Myo, Lumo Lift, Alex posture tracker.

9. Think-aloud protocol
   Description: Capture the voice of the user.
   Sensors: Microphone.
   Key products: Cochlea wireless mini microphone, built-in microphone of camera/smart glasses, wireless microphones (e.g., from AKG).

10. Zoom
    Description: Zoom in and get details.
    Sensors: Smart glasses / tablet with high-resolution camera.
    Key products: Several.

11. Slow motion
    Description: Allow replay at slower speed.
    Sensors: High-frame-rate camera (with smart glasses, a high frame rate often comes at the price of resolution, and vice versa).
    Key products: Several.

The mapping from transfer mechanisms to sensors is not injective. For instance, a transfer mechanism such as remote symmetrical tele-assistance requires both audio and video information, for which we need more than one sensor. On the other hand, some smart glasses (such as the Microsoft HoloLens) are equipped with a number of integrated sensors, which enables them to capture various transfer mechanisms. Some transfer mechanisms (e.g., virtual post-its, contextualisation, in situ real-time feedback) need highly processed information provided by subroutines or software libraries of an API. Others (such as zoom and slow motion) are computationally expensive and can be impractical given the current state of the art in wearable devices.

2.2 Sensor Recommendations

In this section, we discuss the sensor choices made in the development of the first hardware platform and the associated issues. In order to design and develop the first version of the hardware prototype, we focus on the most important transfer mechanisms defined in the studies [2, 3].

Taking into consideration features such as a built-in microphone array, environment capture, gesture tracking, mixed-reality capture, Wi-Fi 802.11ac, and fully untethered holographic computing, the Microsoft HoloLens [6] was selected for the role of AR/smart glasses. Furthermore, the built-in components of the HoloLens enable us to capture several different attributes of the user and her environment. For EEG, the MyndBand [7] with the Neurosky chipset was favoured due to the availability of processed data and real-time feedback. For detecting hand and arm movements and gestures, the Leap Motion [8] and the Myo armband [9] were chosen. To track the position and angle of the neck, the Alex posture tracker [10] was suggested.

2.2.1 Data from Sensors

In this section, we analyze the different types of data and interpreted signals available from the selected sensors. As shown in Table 1, by using the API associated with the Microsoft HoloLens, we get processed data in the form of spatial data, orientation, and gaze, and interpreted data in the form of gestures, but raw data is not accessible. We can get raw data from the Leap Motion, the Myo, and the MyndBand; this was important in allowing us to manage the visualisations, record the data for post-processing, and discover potentially new interpreted markers or algorithms relating to experience capture.

Also, different sensors require different bandwidths: particularly high for video signals (associated with AR glasses), moderate for the Leap Motion and audio signals (microphone of the AR glasses), and low for the Myo, the MyndBand, and the Alex posture tracker.

Table 1: Sensors and their raw/processed/interpreted data.

Sensor               | Raw                              | Processed                        | Interpreted
Microsoft HoloLens   | Not accessible via official API  | Spatial data, orientation, gaze  | Gestures
Leap Motion          | Raw sensor images                | Hand model data                  | -
Myo                  | Raw EMG data available           | Orientation and acceleration     | Detected pose
MyndPlay MyndBand    | 512 Hz raw spectrum, 0.5-100 Hz, delta to mid gamma | Bandwidth     | Attention, meditation, zone, mental effort, familiarity
Alex posture tracker | Raw accelerometer and gyroscope  | Unconfirmed by vendor            | Head, neck, and shoulder posture

2.3 Proposed Hardware Platform

The proposed hardware platform (shown in Figure 1) consists of three components: smart glasses, external sensors, and a sensor-processing unit, with the last two being housed in a WEKIT. Based on computational power, the sensor-processing unit acts as a processor for collecting, syncing, and streaming the data from all the different sensors. The number of sensors can be decreased or increased based on the requirements of the industrial training scenario. Hardware components can use different communication standards, such as Wi-Fi and Bluetooth. The proposed architecture is quite flexible and allows for communication protocols such as Apache Thrift [11], TCP/IP, and UDP.

Figure 1: Proposed hardware architecture.

Figure 2: Hardware platform, first iteration.
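The non-injective mechanism-to-sensor mapping of Section 2.1 can be sketched as a plain lookup table. This is an illustration only, not part of the WEKIT implementation: the mechanism and sensor names follow the text, but the subset chosen and the data structure are ours. Sensors that appear under more than one mechanism are exactly the witnesses that the mapping is not injective.

```python
# Hypothetical sketch: a subset of the transfer-mechanism -> sensor mapping
# from Section 2.1, expressed as a dictionary (illustrative, not normative).
TRANSFER_MECHANISMS = {
    "remote symmetrical tele-assistance": ["smart/AR glasses", "microphone"],
    "virtual/tangible manipulation": ["depth camera", "smart armband"],
    "haptic hints": ["vibrotactile bracelet", "vibrotactile ring"],
    "directed focus": ["gaze tracker", "EEG"],
    "self-awareness of physical state": ["EEG", "smart watch", "posture tracker"],
    "think aloud protocol": ["microphone"],
}

def sensors_covering_multiple(mapping):
    """Return sensors serving more than one transfer mechanism,
    i.e. witnesses that the mechanism->sensor mapping is not injective."""
    seen = {}
    for mechanism, sensors in mapping.items():
        for sensor in sensors:
            seen.setdefault(sensor, []).append(mechanism)
    return {s: ms for s, ms in seen.items() if len(ms) > 1}

shared = sensors_covering_multiple(TRANSFER_MECHANISMS)
# e.g. the microphone serves both tele-assistance and the think-aloud
# protocol, and EEG serves both directed focus and self-awareness
```

A platform prototype could use such a table the other way around: given the sensors actually mounted, compute which transfer mechanisms the current configuration supports.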
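The collect-sync-stream role of the sensor-processing unit in Section 2.3 can be sketched as a timestamp-ordered merge of per-sensor streams. Everything here is a hypothetical illustration: the sensor names, rates, and sample values are invented, and a real unit would forward the merged stream over a transport such as UDP or Apache Thrift rather than build a list in memory.

```python
import heapq
import json

def merge_streams(*streams):
    """Merge per-sensor (timestamp, sensor, value) readings into one
    time-ordered stream - a stand-in for the unit's syncing step.
    Each input stream must already be sorted by timestamp."""
    return list(heapq.merge(*streams, key=lambda reading: reading[0]))

# Illustrative readings, one stream per sensor, each at its own rate.
eeg     = [(0.00, "myndband", 0.61), (0.25, "myndband", 0.57)]
posture = [(0.10, "alex", "lean-forward")]
gesture = [(0.05, "myo", "fist"), (0.30, "myo", "rest")]

stream = merge_streams(eeg, posture, gesture)

# Serialize each synced reading; these packets are what the unit would
# push to the training application over the chosen transport.
packets = [json.dumps({"t": t, "sensor": s, "v": v}) for t, s, v in stream]
```

Because the merge only assumes per-stream timestamp order, sensors can be added or removed freely, matching the paper's point that the sensor count can grow or shrink with the training scenario.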