Prediction Analysis of Optical Tracker Parameters Using Machine Learning Approaches for Efficient Head Tracking


AMAN KATARIA1, SMARAJIT GHOSH1, VINOD KARAR2
1 Department of Electrical and Instrumentation Engineering, Thapar University, Patiala-147004, Punjab, India
2 Department of Optical Devices and Systems, CSIR-Central Scientific Instruments Organization, Chandigarh, India

A head tracker is a crucial part of head mounted display systems, as it tracks the head of the pilot in the plane/cockpit simulator. The operational flaws of head trackers also depend on different environmental conditions, such as different lighting conditions and stray light interference. In this letter, an optical tracker has been employed to gather the 6-DoF data of head movements under different environmental conditions. The effect of different environmental conditions, and of the variation in distance between the receiver and the optical transmitter, on the 6-DoF data was also analyzed.

Keywords: Machine Learning, Head Tracking, Random Forest, Optical Head Tracking, Aviation

Introduction: Head tracking has always been a significant part of the design of head mounted display systems (HMDs). HMDs are considered an integral part of fighter aircraft avionics. A head tracker is a crucial part of the HMD, as it tracks the head of the pilot in the plane/cockpit simulator to synchronize the view of the outer world with the view in the HMD. There are various types of head trackers, which work with different tracking techniques. Their operational flaws also depend on different environmental conditions, namely different lighting conditions and stray light interference.

Tracking can be positional or orientational in terms of coordinates; when both are combined, the system is recognized as a six degrees of freedom (6-DoF) system [1]. The directions of head tracking in a 6-DoF system are X, Y and Z, known as the positional coordinates, and Roll, Yaw and Pitch, known as the orientation coordinates. A TrackIR™ 5 optical head tracker is used to track the six DoF directions of head movement. The TrackIR™ 5 optical tracker acts as a source to generate infrared (IR) light. The incoming IR light is reflected back by the three retro-reflective markers assembled in the TrackClip, which is placed on the helmet/head of the pilot. The reflected IR light is detected by the infrared camera present in the TrackIR™ 5, which then tracks the positional and orientation coordinates of the head of the pilot.

Experiments were conducted under different light intensities (LI) and different distance values between the transmitter and the receiver (tracker), on which the performance of an optical tracker strongly depends. Generally, the expected performance of an optical tracker diverges largely if the stray light interference is high. Each optical tracker has a limited working capacity under particular LI [2]. In the experiment, the 6-DoF data is predicted through different machine learning methods under different light intensities and distance variations between sensor and transmitter, to virtually judge the performance of an optical tracker [3]. A desirable predictive method is required which can predict the 6-DoF data under different light intensities of the environment and distances between transmitter and receiver [4]. According to the variations in environmental conditions, an appropriate number of trackers could be incorporated in an HMD. In the aircraft simulator, it can also help to choose the appropriate optical tracker according to the environmental condition of the simulator bed.

Machine learning has been used to a great extent in mathematical and statistical prediction and optimization [5]. In the proposed work, four machine learning methods are applied to predict the 6-DoF data under different light intensities and distances between the transmitter (infrared light source) and the receiver (optical receiver). The six DoF directions of head tracking are analyzed for different environmental conditions. For predicting the output, Random Forest, Neural Networks, the Generalized Linear Model (GLM) and the Support Vector Machine (SVM) are used. To determine the stability and robustness of the best prediction method, K-fold cross-validation is performed [6].
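As a rough illustration of this prediction-and-validation pipeline, the sketch below compares the four regressor families named above on a table of (light intensity, distance) inputs and one 6-DoF target using K-fold cross-validation. It is a minimal Python/scikit-learn example, not the authors' implementation; the file name, column names, hyperparameters and K = 5 are assumptions.

```python
# Minimal sketch (assumed data layout): predict one 6-DoF target, e.g. Yaw,
# from light intensity (lux) and transmitter-receiver distance (cm),
# comparing the four model families with K-fold cross-validation.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.linear_model import LinearRegression   # GLM with identity link
from sklearn.svm import SVR
from sklearn.model_selection import KFold, cross_val_score

data = pd.read_csv("tracker_6dof.csv")               # hypothetical file name
X = data[["light_intensity_lux", "distance_cm"]]
y = data["yaw_deg"]

models = {
    "Random Forest": RandomForestRegressor(n_estimators=100, random_state=0),
    "Neural Network": MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0),
    "GLM": LinearRegression(),
    "SVM": SVR(kernel="rbf", C=10.0),
}

cv = KFold(n_splits=5, shuffle=True, random_state=0)  # K = 5 is an assumption
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=cv, scoring="neg_mean_squared_error")
    print(f"{name}: MSE = {-scores.mean():.3f} (+/- {scores.std():.3f})")
```

The same comparison can be repeated for each of the six coordinates (X, Y, Z, Yaw, Pitch and Roll) in turn.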
The remaining paper is structured as follows: Section 1 presents the experimental setup and Section 2 the methodology used. Model evaluation is described in Section 3 and results are discussed in Section 4. Finally, the conclusion and future scope are presented in Sections 5 and 6, respectively.

1. Experimental Setup: For carrying out the desired experimentation conditions, the Cockpit Simulator was used. It comprised a pilot seat at a designated distance from the transmitter. The aircraft canopy covered the head-up display along with the optical transmitter. The receiver (infrared retro-reflector) was mounted on the user's cap. A diffused light source controlled the ambient light variations, while the pilot's movement simulated the distances between the transmitter and the receiver. TrackIR 5™ was used to track the head movements in the Display Simulator. The light intensity was measured with a lux meter (Model TES 1332 Digital Lux Meter).

Fig. 1. Display Simulator with optical tracker TrackIR 5™ (shown in red circle) at CSIR-CSIO.
Fig. 2. Experiment conducted under 3 lux light intensity with different distances 25 cm, 50 cm, 75 cm and 100 cm marked as 1, 2, 3 and 4 respectively.
Fig. 3. Experiment conducted under 75 lux light intensity with different distances 25 cm, 50 cm, 75 cm and 100 cm marked as 1, 2, 3 and 4 respectively.
Fig. 4. Experiment conducted under 111 lux light intensity with different distances 25 cm, 50 cm, 75 cm and 100 cm marked as 1, 2, 3 and 4 respectively.
Fig. 5. Experiment conducted under 165 lux light intensity with different distances 25 cm, 50 cm, 75 cm and 100 cm marked as 1, 2, 3 and 4 respectively.
Fig. 6. Lux meter readings under different light intensities.

2. Methodology Used

2.1 Description of Data Set and its Features

The data set has been modeled after performing experiments under different conditions, viz. variation in light intensity and variation in distance between the transmitter and the receiver. A description of the head movements recorded by the optical head tracker in the experiment is given in Table I. Values of light intensity, distances between transmitter and receiver, and sample values of the 6-DoF head-tracking coordinates are provided in Table II.

2.2 Quantitative Assessment

Experiments conducted with light intensity and distance variation between the transmitter and the receiver help in predicting the desirable conditions in which the optical tracker gives satisfactory output. Light intensity is measured as the rate at which light energy is delivered per unit surface area, and is expressed in lux (lx). The experiments are conducted for light intensities varying from 4 lux to 166 lux. In the experimentation, the distance between the pilot's head (the receiver is mounted on the cap worn on the head) and the transmitter is kept at four distances: 25 cm, 50 cm, 75 cm and 100 cm.
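To make the resulting data layout concrete, the sketch below builds a small table in the shape described above: one row per light-intensity/distance condition, with the six recorded DoF values as columns. It is an illustrative Python snippet, not code from the paper; the column names are assumptions and the three rows are taken from the sample data in Table II.

```python
# Illustrative data-set layout (values copied from sample rows of Table II):
# inputs are light intensity (lux) and transmitter-receiver distance (cm),
# outputs are the six recorded DoF coordinates.
import pandas as pd

columns = ["light_intensity_lux", "distance_cm",
           "yaw_deg", "pitch_deg", "roll_deg", "x", "y", "z"]
rows = [
    (3,    25, -18.20,  3.80, -17.8, -4.5, 4.1,  13.47),
    (75,   50, -49.00, -1.40,  -1.8, -1.2, 1.4,   2.95),
    (165, 100, -10.22, -7.99,  -1.8,  5.0, 2.0,  -7.76),
]
df = pd.DataFrame(rows, columns=columns)

# Features (environmental conditions) and targets (6-DoF coordinates)
X = df[["light_intensity_lux", "distance_cm"]]
Y = df[["yaw_deg", "pitch_deg", "roll_deg", "x", "y", "z"]]
print(df)
```

With such a layout, a regressor can be fitted per target column (or via a multi-output wrapper) to predict all six coordinates from the two environmental inputs.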
Table I: Description of the movements of the head recorded by an optical tracker

DoF     Information
X       Forward and backward translational coordinate of the pilot's head
Y       Left and right translational coordinate of the pilot's head
Z       Up and down translational coordinate of the pilot's head
Yaw     Rotational coordinate of the pilot's head about the Z-axis
Pitch   Rotational coordinate of the pilot's head about the Y-axis
Roll    Rotational coordinate of the pilot's head about the X-axis

Table II: Sample set of data

Light intensity (lux)   Distance (cm)   Yaw     Pitch    Roll    X      Y      Z
3                       25              -18.2   3.8      -17.8   -4.5   4.1    13.47
3                       50              -20.5   -1.7     -7.8    0.2    0.7    -10.3
3                       75              -23.5   -12.78   -2.67   -13.6  -0.2   3.97
3                       100             5.1     -14.44   -2.9    6      0.7    -5.13
75                      25              -54.3   1.1      -0.6    -2.8   0.51   6.05
75                      50              -49     -1.4     -1.8    -1.2   1.4    2.95
75                      75              -8.8    2        -4.5    -1.9   1.3    -0.16
75                      100             -11.6   1        -3.8    0.3    -1.9   -12.36
111                     25              -13.1   -2.8     0.1     -7.6   3.2    5.66
111                     50              -7.4    -3       0       -6.2   3.3    5.38
111                     75              -2      -2.8     -0.4    -1.1   3      6.46
111                     25              -16     -2.8     -0.4    2      2.9    -2.15
165                     50              9.3     -6.6     2.4     0.6    2.5    -9.43
165                     25              9.7     -6.1     2       1      2      -11.21
165                     75              9.43    -6.67    2.4     0.6    2.5    -9.43
165                     100             -10.22  -7.99    -1.8    5      2      -7.76

Fig. 7. Prototype view of the optical head tracker [7].

2.3 Feature Description

Six physical DoF directions are extensively used in the experimentation and subsequent analysis. The X coordinate in this case is the translation of the body along the X-axis, also known as forward and backward translation; forward translation leads to positive values and backward translation to negative values. The Y coordinate is the translation of the body along the Y-axis, also known as left and right translation; right translation leads to positive values and left translation to negative values. The Z coordinate is the translation of the body along the Z-axis, also known as up and down translation; up translation leads to positive values and down translation to negative values. The rotational coordinates are Yaw, Pitch and Roll. Yaw is the rotation about the Z-axis, Pitch is the rotation about the Y-axis and Roll is the rotation about the X-axis; all three are measured in degrees. The rotation matrices of the rotational coordinates are given below:

R_{yaw}(\psi) = \begin{bmatrix} \cos\psi & -\sin\psi & 0 \\ \sin\psi & \cos\psi & 0 \\ 0 & 0 & 1 \end{bmatrix} \quad \text{(rotation about the Z-axis)}

R_{pitch}(\theta) = \begin{bmatrix} \cos\theta & 0 & \sin\theta \\ 0 & 1 & 0 \\ -\sin\theta & 0 & \cos\theta \end{bmatrix} \quad \text{(rotation about the Y-axis)}

R_{roll}(\phi) = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\phi & -\sin\phi \\ 0 & \sin\phi & \cos\phi \end{bmatrix} \quad \text{(rotation about the X-axis)}

Fig. 8. Methodology used.

2.4 Approach
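Referring back to the rotation conventions of Section 2.3, the short Python sketch below builds the three elementary rotation matrices and composes them into a single head-orientation matrix. It is an illustrative example under the stated axis conventions, not code from the paper, and the yaw-pitch-roll composition order is an assumption.

```python
# Illustrative sketch: build the head-orientation matrix from Yaw (about Z),
# Pitch (about Y) and Roll (about X) angles given in degrees, as in Section 2.3.
# The composition order R = Rz(yaw) @ Ry(pitch) @ Rx(roll) is an assumption.
import numpy as np

def rotation_from_ypr(yaw_deg: float, pitch_deg: float, roll_deg: float) -> np.ndarray:
    yaw, pitch, roll = np.radians([yaw_deg, pitch_deg, roll_deg])
    Rz = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                   [np.sin(yaw),  np.cos(yaw), 0.0],
                   [0.0,          0.0,         1.0]])
    Ry = np.array([[ np.cos(pitch), 0.0, np.sin(pitch)],
                   [ 0.0,           1.0, 0.0          ],
                   [-np.sin(pitch), 0.0, np.cos(pitch)]])
    Rx = np.array([[1.0, 0.0,          0.0          ],
                   [0.0, np.cos(roll), -np.sin(roll)],
                   [0.0, np.sin(roll),  np.cos(roll)]])
    return Rz @ Ry @ Rx

# Example: orientation for the first sample row of Table II (3 lux, 25 cm)
R = rotation_from_ypr(-18.2, 3.8, -17.8)
print(np.round(R, 3))
```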