Fusion de données multi-Kinect visant à améliorer l'interaction gestuelle


Université Paris 8 - Vincennes - Saint-Denis
Laboratoire Paragraphe (EA 349)
École doctorale Cognition, Langage, Interaction
Mention : Informatique

Thèse présentée et soutenue publiquement par Saleh SALOUS

Fusion de données multi-Kinect visant à améliorer l'interaction gestuelle au sein d'une installation de réalité virtuelle

Thèse dirigée par Khaldoun ZREIK
Encadrée par Safwan CHENDEB et Laure LEROY
Soutenue le 23 novembre 2015

Jury :
Pr. Ioannis Kanellos, Telecom Bretagne, Brest, Rapporteur
Pr. Fouad Badran, Cnam-Paris, Rapporteur
Pr. Benoit Geller, ENSTA, Examinateur
Dr. Safwan Chendeb, Université Paris 8, Examinateur
Dr. Taha Riden, ENSTA, Examinateur
Dr. Laure Leroy, Université Paris 8, Examinateur

Abstract

Virtual Reality is a state-of-the-art technology that allows a user to interact with an artificial environment created by dedicated hardware and software, with visual and aural feedback powerful enough to create the impression of a realistic environment. As a consequence, this form of computer interaction can be used in various contexts such as entertainment, medicine or driving training. Furthermore, numerous types of VR installations exist, depending on physical and financial constraints as well as on the final user experience the system is intended to provide.

The subject of this thesis is user interaction in a specific type of VR installation called a CAVE. Our CAVE, named "Le SAS", currently relies on AR technology to detect users, while a joystick provides directional inputs. Our objective is to present, describe and analyze an alternative user-tracking method relying on a 4-Kinect set-up tasked with tracking the user's movements inside this CAVE. Proper user tracking is one of the main challenges posed by Virtual Reality, as well as one of the core elements that define a functional VR system; it is therefore important to implement an effective tracking system.

In order to create true interaction with the virtual world provided by the CAVE, the sensors can detect various types of input. In the case of a multi-Kinect system, interaction with the CAVE will be based on user gestures, whose recognition is performed on a skeleton created by fusing the joint data from the various sensors.

This thesis will focus on four main points, as described below.

The first part will provide a context analysis of our immersive CAVE "Le SAS" and define the features as well as the constraints of this specific environment in which the multi-Kinect system is installed.

In the second part, the topic of tracking algorithms will be discussed. Indeed, the immersive CAVE's large scale calls for a tracking system composed of several sensors. Using a network of cameras to track a user inside the CAVE requires an algorithm that determines in real time which sensors provide the most accurate tracking data and will therefore properly recognize the user's inputs and movements.

Subsequently, we will propose a gesture detection algorithm. Once the user's gestures are properly tracked, such an algorithm is necessary in order to provide interaction. While the Kinects can capture the user's movements, the detection of specific gestures comes into play, as the CAVE needs to be configured so as to recognize specific gestures as potential inputs. The presented algorithm will focus on three specific gestures: raising the right hand, raising the left hand and performing a short hop.

Lastly, we will provide experimental results comparing the effectiveness of a multi-Kinect set-up with that of a single sensor, and present data showing a noticeable increase in accuracy with the 4-Kinect system.
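As a rough illustration of the joint-data fusion and real-time sensor selection described above, here is a minimal sketch, assuming each Kinect delivers per-joint positions in a common CAVE coordinate frame together with a tracking-confidence value. All names, types and the confidence-weighted scheme are illustrative assumptions, not the implementation evaluated in the thesis.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Joint:
    x: float
    y: float
    z: float
    confidence: float  # 0.0 = inferred/occluded, 1.0 = well tracked

# A skeleton maps a joint name (e.g. "head", "hand_right") to its observed
# position; extrinsic calibration of every Kinect into one common CAVE frame
# is assumed to have been done beforehand.
Skeleton = Dict[str, Joint]

def select_sensors(skeletons: List[Skeleton], keep: int = 2) -> List[Skeleton]:
    """Keep the `keep` Kinects whose skeletons have the highest mean confidence."""
    ranked = sorted(
        skeletons,
        key=lambda s: sum(j.confidence for j in s.values()) / max(len(s), 1),
        reverse=True,
    )
    return ranked[:keep]

def fuse_skeletons(skeletons: List[Skeleton]) -> Skeleton:
    """Confidence-weighted average of every joint seen by the selected sensors."""
    fused: Skeleton = {}
    names = set().union(*(s.keys() for s in skeletons)) if skeletons else set()
    for name in names:
        seen = [s[name] for s in skeletons if name in s]
        total = sum(j.confidence for j in seen)
        if total == 0.0:
            continue  # no sensor tracks this joint reliably in this frame
        fused[name] = Joint(
            x=sum(j.x * j.confidence for j in seen) / total,
            y=sum(j.y * j.confidence for j in seen) / total,
            z=sum(j.z * j.confidence for j in seen) / total,
            confidence=total / len(seen),
        )
    return fused
```

In such a scheme, each frame the four per-Kinect skeletons would first pass through `select_sensors` and the survivors through `fuse_skeletons`, yielding the single skeleton on which gestures are then recognized.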
Résumé

Les technologies liées à la réalité virtuelle sont les outils les plus avancés dans le domaine de l'interaction numérique, permettant à un utilisateur de communiquer avec une simulation créée à partir d'un matériel et d'une solution logicielle dédiés. Le degré d'immersion proposé par ces technologies et leur feedback audio et vidéo peut donner l'impression que ces environnements virtuels sont réels. Par conséquent, de multiples secteurs tels que le divertissement vidéo-ludique ou la médecine peuvent incorporer ces technologies. De plus, les installations de réalité virtuelle existantes sont nombreuses et leurs caractéristiques peuvent varier en fonction des contraintes physiques et financières des projets, ainsi qu'en fonction de l'expérience utilisateur souhaitée.

Un de ces types d'installations de réalité virtuelle, le CAVE, est au cœur de cette thèse. Notre CAVE, nommé « Le SAS », utilise à l'heure actuelle une combinaison de la technologie AR pour détecter les utilisateurs et d'un joystick pour récupérer des inputs directionnels. Notre objectif à travers cette thèse est de présenter, décrire et analyser une méthode alternative de détection de mouvements au sein du SAS, reposant sur l'utilisation d'un système de 4 Kinects connectées ensemble. Cette analyse est pertinente et justifiée étant donné l'importance d'un système de détection d'utilisateur dans une installation de réalité virtuelle.

Afin de proposer un niveau satisfaisant d'interaction avec l'environnement virtuel, les capteurs installés sur le CAVE peuvent détecter différents types d'inputs. Dans le cadre d'un système multi-Kinect, l'interaction repose sur la détection de gestes effectués par l'utilisateur. Ces gestes sont extraits d'un squelette virtuel formé à partir des données recueillies par les Kinects.

Cette thèse va aborder quatre points-clés décrits ci-dessous :

Premièrement, nous étudierons le contexte lié à notre CAVE et définirons ses caractéristiques ainsi que les contraintes que cet environnement particulier de réalité virtuelle impose à notre dispositif multi-Kinect.

En second lieu, nous aborderons le sujet des algorithmes de suivi d'utilisateur au sein d'un CAVE. En effet, les dimensions du SAS amènent à utiliser plusieurs capteurs pour suivre l'utilisateur. Par conséquent, il devient nécessaire d'utiliser un algorithme capable de déterminer en temps réel quelles Kinects produisent les données les plus précises et les plus fiables afin de correctement détecter les mouvements de l'utilisateur.

Par la suite, nous proposerons un algorithme de détection de gestes. Cette étape est la suite logique de la détection d'utilisateur et consiste à interpréter les mouvements enregistrés. Bien que les Kinects soient capables d'enregistrer les mouvements et gestes de l'utilisateur, le CAVE doit être configuré afin de reconnaître certains gestes spécifiques, créant ainsi la possibilité d'interagir avec un environnement virtuel. Notre analyse se concentrera sur trois gestes spécifiques : lever la main droite, lever la main gauche, et effectuer un petit saut.

Finalement, nous fournirons des résultats d'expérience ayant pour objectif de comparer l'efficacité d'un système multi-Kinect par rapport à l'utilisation d'un seul capteur. Nous présenterons des données indiquant une amélioration de la précision de la détection de gestes avec plusieurs Kinects.
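To make the gesture-detection step concrete, the following is a hedged sketch of a simple rule-based classifier for the three gestures named above (raise the right hand, raise the left hand, short hop), operating on the fused skeleton from the previous sketch. The joint names, thresholds and per-frame logic are assumptions for illustration, not the detection algorithm actually evaluated in the thesis.

```python
def detect_gesture(skeleton: Skeleton, floor_y: float,
                   hand_margin: float = 0.10,    # metres above the head
                   hop_clearance: float = 0.08   # metres above the floor
                   ) -> str:
    """Classify one frame as 'raise_right', 'raise_left', 'hop' or 'none'.

    Assumes the y axis points upward and floor_y is the calibrated floor height.
    """
    required = ("head", "hand_right", "hand_left", "foot_right", "foot_left")
    if any(name not in skeleton for name in required):
        return "none"  # incomplete skeleton for this frame

    head = skeleton["head"]
    if skeleton["hand_right"].y > head.y + hand_margin:
        return "raise_right"
    if skeleton["hand_left"].y > head.y + hand_margin:
        return "raise_left"
    # A short hop: both feet clearly above the calibrated floor level.
    feet_y = min(skeleton["foot_right"].y, skeleton["foot_left"].y)
    if feet_y > floor_y + hop_clearance:
        return "hop"
    return "none"
```

In practice such per-frame decisions would likely be smoothed over several consecutive frames before being forwarded to the CAVE as an input event.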
List of figures

Figure 1.1: Graphic representation of the "SAS" (Ridene et al., 2013)
Figure 1.2: The Kinect consists of an infra-red (IR) projector, an IR camera and an RGB camera (Smisek, 2011)
Figure 1.3: Diagram showing the Kinect FOV, by Riley Porter (8)
Figure 1.4: Kinect detects movements of disabled persons (Chang et al., 2011)
Figure 1.5: Joints of the human body (Alexiadis et al., 2011)
Figure 1.6: User input and control of the system (Du et al., 2011)
Figure 1.7: A) original image, B) model scan from the Kinect, C) model obtained with the Kinect Fusion technique (Izadi et al., 2011)
Figure 1.8: A) and C) show facial animation expressions captured by the Kinect, B) and D) show combined facial animations from the database (Weise et al., 2011)
Figure 1.9: A) Gloves with accelerometers, B) detailed accelerometers (Zafrulla et al., 2011)
Figure 1.10: A) seated, B) standing (Zafrulla et al., 2011)
Figure 1.11: Hand gesture recognition process (Z. Ren et al., 2011)
Figure 1.12: 14 gesture commands and four arithmetic operations (Z. Ren et al., 2011)
Figure 1.13: A) addition operation 3+9=12, B) multiplication operation 5*8=40 (Z. Ren et al., 2011)
Figure 1.14: Three gestures for the rock-paper-scissors game (Z. Ren et al., 2011)
Figure 1.15: Two examples of the rock-paper-scissors game (Z. Ren et al., 2011)
Figure 1.16: a) Offline step: from multiple 3D face instances the 3DMM is fit to obtain a person-specific 3D model; b)-d) online steps: b) the person model is registered at each instant to multimodal data to retrieve the head pose; c) head stabilization computed from the inverse head pose