Estudo de Ferramenta Alternativa de Captura de Movimentos de Cabeça para Aplicações de Realidade Virtual (Study of an Alternative Head Motion Capture Tool for Virtual Reality Applications)

Total Pages: 16

File Type: PDF, Size: 1020 KB

ESTUDO DE FERRAMENTA ALTERNATIVA DE CAPTURA DE MOVIMENTOS DE CABEÇA PARA APLICAÇÕES DE REALIDADE VIRTUAL
STUDY OF AN ALTERNATIVE HEAD MOTION CAPTURE TOOL FOR VIRTUAL REALITY APPLICATIONS

Diego de Jesus Penaforte Parreiras (Diego Penaforte), M.Sc. (1)
(1) Escola Superior de Propaganda e Marketing - ESPM
e-mail: [email protected]

Keywords: virtual reality, interactivity, computer graphics

This work uses an alternative technique to capture a user's head movements for navigation in virtual scenarios, aimed at virtual reality experiments. Its result is the improvement of an alternative system for this kind of motion capture, one that is cheaper and easier for ordinary users to operate.

1. Introduction

Virtual reality (VR) technology is becoming more popular every day. It is already used successfully in training, in learning tasks and in the acquisition of specific skills. The Military Police of Santa Catarina has already incorporated this technology, successfully, into the training process of its officers (Rosa, 2015). The technology has also been applied to training for highly complex activities, such as the maintenance of power grids (Buriol, 2011) and the simulation of escape routes (Soares, 2012).

According to Rosa (2015, apud Demo), virtual reality presents a new paradigm: a tool for learning tasks and acquiring knowledge that is better aligned with the new information technologies and that goes beyond the so-called scholastic method, which, for Azevedo and Souza (2015), relies on methods considered traditional for teaching content, such as an exclusive attachment to the blackboard and the textbook. For these authors, such techniques do not bring the learner's reality closer to the reality of the educational institution, be it a course, a school or a university. Teaching is thus often reduced to expository lectures, which can demotivate the learner.

For this reason, VR technology has been studied and tested as an advanced tool for training and learning. According to Rosa (2015), "the immersive reader is marked by interactivity, learning and teaching rapidly through constant interaction that, in turn, implies constant innovation." Rosa (2015), referring to Santaella (2004), also states that "interactive navigation in cyberspace promotes changes in the realm of sensations, perceptions and cognitions, setting it apart from the experiences previously offered by literary culture." According to Monteiro (2015), the experiences lived through virtual reality and its HMDs, such as the Oculus Rift, can take on dimensions so far unexplored by other media. In this context it is worth highlighting the proximity between virtual reality and electronic games, the latter having been studied and analysed as powerful aids to learning.

For Azevedo and Souza (2015), electronic games have, for students, greater expressiveness and are more meaningful. For the same authors, using technology to enrich the teaching-learning dynamic is essential, since we live in an era in which technological devices such as mobile phones, tablets and computers are part of our daily lives and are the main instruments for acquiring information and knowledge (Azevedo and Souza, 2015). For Soares (2012), citing Weiser (1994), this phenomenon is called "ubiquitous computing": computing-based devices are so present in our lives, and so integrated with human beings, that it is as if they were invisible.

We have therefore observed a growing interest, especially from the games industry, in appropriating and using VR concepts for a better integration between user and computer and a greater sense of immersion. One example is Microsoft's "Xbox One Scorpio" video game console, the evolution of the "Xbox One", which brings improvements aimed at VR applications (http://mundovr.com.br/2016/06/15/e3-2016-xbox-one-scorpio-e-o-xbox-para-realidade-virtual/). Ubisoft, a major game producer, is planning to release four games formatted to be played with VR devices by the end of this year (2016) (http://mundovr.com.br/2016/06/15/ubisoft-planeja-4-jogos-em-realidade-virtual-para-esse-ano/).

There is also the development project of the so-called Oculus Rift (https://www3.oculus.com/en-us/rift/), initially developed by Palmer Luckey and first presented at E3 - Electronic Entertainment Expo (Monteiro, 2016), now maintained and improved by Facebook precisely with the aim of popularizing virtual reality technology and lowering its cost. The project consists of the creation of a head-mounted display (HMD), the iconic piece of equipment of virtual reality technology, formed by a kind of helmet that fits two small projection screens to the user's head, one for each eye. Onto these screens are projected images generally generated by three-dimensional computer graphics applications, simulating anything from a single environment or place to a city or even entire worlds.

These helmets are equipped with a system that captures the movements of the user's head (Kirner, Siscouto, 2007), so that whenever the user turns their head the virtual scene reacts by changing the presented point of view, giving that user the sensation of being part of the scene, as if they were physically present in it.
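To make this head-rotation-to-viewpoint coupling concrete, the short sketch below is added here as an illustration (it is not part of the original paper; the yaw/pitch/roll axis convention and the function names are assumptions). It shows how tracked head angles can be turned into the forward and up vectors of a virtual camera, which is essentially what an HMD or a FreeTrack-style tracker feeds to the rendering application.

```python
import numpy as np

def head_rotation_matrix(yaw, pitch, roll):
    """Build a rotation matrix from head angles in radians.

    Convention assumed here: yaw about the vertical Y axis, pitch about
    the X axis, roll about the Z (viewing) axis, applied as R = Ry @ Rx @ Rz.
    """
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    rz = np.array([[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]])
    return ry @ rx @ rz

def camera_vectors(yaw, pitch, roll):
    """Return the camera's forward and up vectors for the tracked head pose."""
    r = head_rotation_matrix(yaw, pitch, roll)
    forward = r @ np.array([0.0, 0.0, -1.0])  # camera looks down -Z by default
    up = r @ np.array([0.0, 1.0, 0.0])
    return forward, up

# Example: the user turns their head 30 degrees to the left and 10 degrees up.
fwd, up = camera_vectors(np.radians(30), np.radians(10), 0.0)
print("forward:", np.round(fwd, 3), "up:", np.round(up, 3))
```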
The big problem is that such an HMD is usually extremely expensive, and this is the central difficulty in popularizing virtual reality technology, since the equipment is strictly necessary for the sensation of immersion, which is the key point of VR. These are also generally complex devices, complicated to operate and configure, and not very friendly to the lay public.

The main objective of this work is to test and improve an alternative head motion capture technique for virtual reality applications with educational purposes. It is justified by the search for ways to popularize, cheapen and spread VR for its wide use as a tool that enhances learning, raising teaching and learning in our country to a new level, improving its effectiveness and producing better-qualified technicians, professionals and teachers.

2. Important Concepts

2.1. Interaction

Another word present in definitions of VR is interaction. The idea of interaction is linked to the computer's ability to detect the user's inputs and instantly modify the virtual world and the actions upon it according to that user's will (ESPINHEIRA, 2004). In this context, certain devices connect the user's actions to scene elements of the virtual environment (MACHADO et al., 2006).

2.2. Virtual Reality

The term Virtual Reality (VR), in Portuguese Realidade Virtual (RV), is credited to Jaron Lanier (ESPINHEIRA, 2004; BARILLI, 2007). It concentrates knowledge from several areas and, being multidisciplinary, is difficult to define. KIRNER (2004) and BARILLI (2007), citing Cunha (2001), consider VR to be the "generic name for a technology under which are grouped means through which the user can freely visualize, explore, manipulate and interact with data". Another definition that fits this work is the one proposed by Kirner (2007), who considers VR an "advanced user interface" for interacting with the computer.

2.3. Immersion

"The most basic intention of VR is immersion, in the sense that the individual feels transported to the virtual world, as if they were actually inside it, involved, completely forgetting the real world" (BRAGA, 2007). Virtual reality can be classified, according to the user's sense of presence, as immersive or non-immersive (TORI et al., 2006; PINHO et al., 2006; BARILLI, 2007).

2.4. FreeTrack

FreeTrack is an optical motion capture system for the Windows platform, free to use and released under the GNU General Public License, that can be used with common, inexpensive cameras. Its main focus is tracking head movements for use in simulation, video games, 3D modelling, computer-aided design and hands-free computing in general. The camera captures a set of light points worn by the user, and these points need to be isolated from the background light by physical or software filtering. It is a device that allows movement with six degrees of freedom (6DOF).

3. Methodology

3.1. Head tracking device

The system proposed in this work consists of a head motion capture device worn on the user's own head, a 3D virtual scenario executed in real time that receives the user's commands, and a projector that projects this scenario onto an apparatus, so that the navigator can interact with the scenario and have an immersive experience.
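Section 2.4 notes that the worn light points must be isolated from background light by physical or software filtering. As a concrete illustration of the software side of that step, here is a minimal sketch (added for this edition, not the paper's implementation) that assumes OpenCV 4.x and a webcam at index 0; the brightness threshold is a placeholder to be tuned for the actual camera and LEDs.

```python
import cv2

cap = cv2.VideoCapture(0)  # index 0 assumed to be the webcam watching the user
THRESHOLD = 220            # assumed brightness cutoff; IR LEDs appear near-saturated

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Keep only the brightest pixels, discarding background light.
    _, mask = cv2.threshold(gray, THRESHOLD, 255, cv2.THRESH_BINARY)
    # OpenCV 4.x return signature: (contours, hierarchy).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    points = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] > 0:  # skip degenerate blobs
            points.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))  # centroid
    # 'points' now holds the 2D image positions of the detected markers.
    for (x, y) in points:
        cv2.circle(frame, (int(x), int(y)), 6, (0, 255, 0), 2)
    cv2.imshow("markers", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
        break

cap.release()
cv2.destroyAllWindows()
```

The resulting 2D centroids are the kind of input a pose estimator, such as the one sketched under the FreeTrack Manual entry below, would consume.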
Recommended publications
  • Flightsim Community Survey 2019
    FlightSim Community Survey 2019 - Final Report
    Copyright Notice: © 2020 Navigraph. By licensing our work with the CC BY-SA 4.0 license it means that you are more than welcome to copy, remix, transform and build upon the results of this survey and then redistribute it to whomever you want in any way. You only have to give credit back to Navigraph and keep the same license. https://creativecommons.org/licenses/by-sa/4.0/
    Preamble: This is the annual flightsim community survey, a collaborative effort between partners – developers, companies and organizations in the flightsim domain – coordinated and compiled by Navigraph. This survey is freely distributed for the common good of the flightsim community to guide future projects and attract new pilots. This flightsim community survey is the largest and most comprehensive of its kind. This year 17,800 respondents participated in the survey. This is an 18.6% increase from last year when 15,000 participated. This year's survey consisted of 93 questions, compared to last year's 77 questions. However, this year many more of the questions were conditional, allowing us to add new sections and ask in-depth questions only to those respondents for whom it was relevant. New sections this year contained questions specifically aimed at pilots using flight simulator software on mobile devices and helicopter flight simulators. We also added questions on combat simulators, air traffic control and flight planning. Finally, because of the upcoming release of the new Microsoft Flight Simulator 2020, we added questions on this topic as well. Our main objective this year was to recruit more and diverse partners to broaden the reach of the survey to get an even more representative sample of the community.
  • Freetrack Manual
    FREETRACK MANUAL – June 2007, by Babasior, with the participation of The_target. English translation by Gweeds. Manual version 1.2.
    Contents: 1. Introduction; 2. Materials and Equipment necessary to the operation of FreeTrack; 3. Head gear Construction; 4. Installation of FreeTrack; 5. Setting up and tweaking FreeTrack; 6. FreeTrack.ini; 7. Mouse Emulation; 8. Joystick Emulation; 9. Keyboard Emulation; 10. Profile Creation; 11. List of Games and their possible movements.
    Note: TrackIR is a registered trademark of Natural Point Inc. Windows XP and Windows Vista are trademarks of Microsoft Corporation.
    1. Introduction: FreeTrack is software which simulates movements of your head within a game, hence increasing realism and the pleasure of game play. To achieve this, FreeTrack uses a standard webcam and head-mounted markers which consist of four infrared LEDs (light emitting diodes) mounted on a standard baseball cap. FreeTrack uses the webcam to "see" the LEDs so it can calculate the position and movement of your head and feed this information to your game. FreeTrack functions in the same way TrackIR does and is compatible with the majority of games using head tracking technology. The latest version of FreeTrack is available at this address: http://n.camil.chez-alice.fr/index.php or on the site of Didja at this address: http://freetrack.online.fr/
    2. Materials and Equipment necessary to the operation of FreeTrack:
    – a PC running Microsoft Windows XP SP2 or Windows Vista
    – DirectX 9.0c installed
    – a processor with the SSE instruction set
    – a webcam
    – 4 IR LEDs (ref. OSRAM SFH485P or equivalent); failing this, 4 bright red or orange diffused LEDs can be used.
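To illustrate how the four cap-mounted LEDs seen by the webcam can be turned into a head position and orientation, here is a generic sketch (not FreeTrack's actual code); the LED model coordinates and camera intrinsics are made-up placeholders, and OpenCV's solvePnP stands in for whatever pose solver the tool really uses.

```python
import numpy as np
import cv2

# Approximate 3D positions of the four cap LEDs, in millimetres, in a
# head-fixed coordinate system (illustrative placeholder values only).
MODEL_POINTS = np.array([
    [   0.0,   0.0,    0.0],   # front centre of the cap peak
    [ -70.0,  20.0,  -80.0],   # left side
    [  70.0,  20.0,  -80.0],   # right side
    [   0.0,  60.0, -120.0],   # top rear
], dtype=np.float64)

# Camera intrinsics would normally come from a calibration step; these are
# rough placeholder values for a 640x480 webcam.
CAMERA_MATRIX = np.array([[700.0, 0.0, 320.0],
                          [0.0, 700.0, 240.0],
                          [0.0, 0.0, 1.0]])
DIST_COEFFS = np.zeros(5)  # assume negligible lens distortion

def head_pose(image_points):
    """Estimate head rotation and translation from the 4 detected LED centroids.

    image_points: 4x2 pixel coordinates, in the same order as MODEL_POINTS.
    Returns (R, t): 3x3 rotation matrix and translation of the head
    relative to the camera.
    """
    image_points = np.asarray(image_points, dtype=np.float64)
    # P3P-family solvers accept exactly four points.
    ok, rvec, tvec = cv2.solvePnP(MODEL_POINTS, image_points,
                                  CAMERA_MATRIX, DIST_COEFFS,
                                  flags=cv2.SOLVEPNP_AP3P)
    if not ok:
        raise RuntimeError("pose estimation failed")
    R, _ = cv2.Rodrigues(rvec)  # rotation vector -> rotation matrix
    return R, tvec.ravel()

# Example: synthesize a view of the cap at a known pose, then recover it.
true_rvec = np.array([[0.1], [0.4], [0.0]])    # a small head rotation
true_tvec = np.array([[0.0], [0.0], [600.0]])  # 60 cm in front of the camera
projected, _ = cv2.projectPoints(MODEL_POINTS, true_rvec, true_tvec,
                                 CAMERA_MATRIX, DIST_COEFFS)
R, t = head_pose(projected.reshape(-1, 2))
print(np.round(t, 1))  # should recover approximately (0, 0, 600)
```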
  • Analyse und Entwicklung optischer Trackingverfahren für das dreidimensionale Skizzieren in der virtuellen Produktentwicklung (Analysis and Development of Optical Tracking Methods for Three-Dimensional Sketching in Virtual Product Development)
    Analysis and development of optical tracking methods for three-dimensional sketching in virtual product development. Master's thesis submitted to the Fachhochschule Köln (Cologne University of Applied Sciences), Campus Gummersbach, in the Media Informatics programme. Prepared by Dipl.-Ing. (FH) Patrick Tobias Fischer. First examiner: Prof. Dr. rer. nat. Wolfgang Konen. Second examiner: Dipl.-Inform. Johann Habakuk Israel. Berlin, 17 April 2008.
    Table of contents (excerpt): Table of contents; Summary of the thesis; List of figures; List of tables; Acronyms and glossary; 1 Introduction and structure of the thesis; 2 Introduction to the topic; 3 Product development processes...
  • Virtual Reality 3D Space & Tracking
    Virtual Reality: 3D Space & Tracking. Ján Cíger, Reviatech SAS, [email protected], http://janoc.rd-h.com/, 16.10.2014.
    Outline: Who am I?; Coordinates in 3D Space (Position in 3D; Orientation in 3D: Euler angles, axis/angle representation, rotation matrices (3 × 3), quaternions); Tracking (Introduction; Technologies); VRPN (What is VRPN; Simple Example); Resources.
    Who am I? PhD from VRlab, EPFL, Switzerland – AI for virtual humans, how to make autonomous characters ("NPCs") smarter; a lab focused on human character animation, motion capture and interaction in VR; ≈ 20 PhD students, several post-docs, many master students. VRlab: http://vrlab.epfl.ch IIG: http://iig.epfl.ch
    SensoramaLab, Aalborg University, Denmark – building a new lab focusing on new media, interaction and use in rehabilitation; a large projection system, tracking, many interactive demos; teaching bachelor and master students, project supervision.
    Reviatech SAS – applied research, looking for new technologies useful for our products/customers; tracking, augmented reality, AI, building software and hardware prototypes. http://www.reviatech.com/
    3D Position: Cartesian – 3 perpendicular axes; left- or right-handed; Z-up vs Y-up; usually the same scale on all axes; common units – meters, millimetres, inches, feet. (http://viz.aset.psu.edu/gho/sem_notes/3d_fundamentals/html/3d_coordinates.html)
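Since the outline above lists several orientation representations (Euler angles, rotation matrices, quaternions), a small worked example is added here for convenience (it is not taken from the slides): converting intrinsic Z-Y-X Euler angles to a quaternion and then to a rotation matrix.

```python
import numpy as np

def euler_zyx_to_quat(yaw, pitch, roll):
    """Quaternion (w, x, y, z) for intrinsic Z-Y-X Euler angles in radians."""
    cy, sy = np.cos(yaw / 2), np.sin(yaw / 2)
    cp, sp = np.cos(pitch / 2), np.sin(pitch / 2)
    cr, sr = np.cos(roll / 2), np.sin(roll / 2)
    w = cy * cp * cr + sy * sp * sr
    x = cy * cp * sr - sy * sp * cr
    y = cy * sp * cr + sy * cp * sr
    z = sy * cp * cr - cy * sp * sr
    return np.array([w, x, y, z])

def quat_to_matrix(q):
    """3x3 rotation matrix from a unit quaternion (w, x, y, z)."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

q = euler_zyx_to_quat(np.radians(30), np.radians(10), np.radians(5))
R = quat_to_matrix(q)
print(np.round(q, 4))
print(np.round(R @ R.T, 6))  # should print the identity: R is orthonormal
```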
  • Developing Effective Interfaces for Cultural Heritage 3D Immersive Environments
    The 8th International Symposium on Virtual Reality, Archaeology and Cultural Heritage VAST (2007), D. Arnold, F. Niccolucci, A. Chalmers (Editors). Developing Effective Interfaces for Cultural Heritage 3D Immersive Environments. Karina Rodriguez-Echavarria1, David Morris1, Craig Moore1, David Arnold1, John Glauert2 and Vince Jennings2. 1University of Brighton, Brighton, U.K. 2University of East Anglia, Norwich, U.K.
    Abstract: Although the advantages of the use of 3D Immersive Virtual Environments for the presentation and communication of Cultural Heritage have been demonstrated, the user interfaces and interaction techniques (in software and hardware) that are best suited have not yet been completely identified and deployed. This paper describes research conducted for developing and studying the usability of a historical town 3D Virtual Tour. For this, usability methods combined with head-eyetracking technology were used to explore the suitability of these interfaces. The paper also reflects on issues raised during the design of the testing system for this experience. It is expected the results of this research will contribute towards developing effective interfaces for 3D immersive technologies as well as building a repository of lessons learned from evaluation.
    Categories and Subject Descriptors (according to ACM CCS): I.3.7 [Three-Dimensional Graphics and Realism]: Virtual Reality; I.3.6 [Methodology and Techniques]: Interaction Techniques; H.5.2 [User Interfaces]: Evaluation/Methodology; H.5.2 [User Interfaces]: Interaction Styles
    1. Introduction: Although the advantages of the use of 3D Immersive Virtual Environments for the presentation and communication of Cultural Heritage (CH) have been demonstrated, the user (…) researched within the Human Computer Interaction (HCI) area, it has not been readily applied to the CH field. While there are many lessons which could be easily transferred from this field, the use of 3D immersive environments in museums or heritage sites presents specific challenges.
  • Manuel de Freetrack (Freetrack Handbook)
    Freetrack Handbook V2 – August 2007, by Babasior, helped by The_target and Kestrel. Version 2.0 – Beta (English), translated by Tristan68. Version 2 – 08-22-2007.
    Summary: Introduction; Required Hardware and Software; Tools; Building a 4 Dots Cap, 3 Dots Cap, Single Dot system; LED wiring; FreeTrack software installation; Single point mode; 3 points Clip mode; 4 Points Cap mode; System setup; Webcam settings...
  • Low-Cost and Home-Made Immersive Systems
    The International Journal of Virtual Reality, 2012, 11(3):9-17. Low-cost and home-made immersive systems. Sébastien Kuntz, Ján Cíger. VRGeeks, Paris, France.
    Abstract — A lot of professionals or hobbyists at home would like to create their own immersive virtual reality systems for cheap and taking little space. We offer two examples of such "home-made" systems using the cheapest hardware possible while maintaining a good level of immersion: the first system is based on a projector (VRKit-Wall) and costs ≈ €…, while the second system is based on a head-mounted display (VRKit-HMD) and costs between €… and €…. We also propose a standardization of those systems in order to enable simple application sharing. Finally, we describe a method to calibrate the stereoscopy of a NVIDIA 3D Vision system.
    Index Terms — Input/output, low cost VR, immersion, interaction.
    Based on our experience we have settled on the following:
    – For an immersive wall we need to track the head's position. The head's orientation can be omitted if we assume that the user keeps their head facing the screen.
    – The orientation is essential for an HMD, but we also believe that we should track the head's position to obtain a more natural viewpoint control. Furthermore, the movement parallax provides an important cue for the perception of depth.
    – Tracking of at least one hand (both position and orientation) seems important in order to be able to interact with the virtual environment in both systems.
    – Having a few buttons and a joystick to simplify the interaction is interesting too (action selection, navigation in the environment, etc).
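To show how a tracked head position produces motion parallax on a projection wall, here is a sketch of the widely used generalized (off-axis) perspective projection. It is an illustration added to this page, not code from the paper; the screen-corner coordinates, near/far planes and eye position in the example are placeholder values.

```python
import numpy as np

def off_axis_projection(pa, pb, pc, pe, near, far):
    """Projection + view matrices for a screen given the tracked eye position.

    pa, pb, pc: lower-left, lower-right and upper-left screen corners (world, metres)
    pe: tracked eye/head position (world, metres)
    Returns (P, V) as 4x4 matrices (OpenGL-style, column-vector convention).
    """
    pa, pb, pc, pe = map(np.asarray, (pa, pb, pc, pe))
    vr = (pb - pa) / np.linalg.norm(pb - pa)          # screen right axis
    vu = (pc - pa) / np.linalg.norm(pc - pa)          # screen up axis
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)   # screen normal (towards eye)

    va, vb, vc = pa - pe, pb - pe, pc - pe            # eye -> corner vectors
    d = -np.dot(va, vn)                               # eye-to-screen-plane distance
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    # Asymmetric frustum, equivalent to glFrustum(l, r, b, t, near, far).
    P = np.array([
        [2*near/(r-l), 0,            (r+l)/(r-l),            0],
        [0,            2*near/(t-b), (t+b)/(t-b),            0],
        [0,            0,           -(far+near)/(far-near), -2*far*near/(far-near)],
        [0,            0,           -1,                      0],
    ])
    # Rotate the world into screen space, then translate the eye to the origin.
    M = np.eye(4); M[:3, 0], M[:3, 1], M[:3, 2] = vr, vu, vn
    T = np.eye(4); T[:3, 3] = -pe
    V = M.T @ T
    return P, V

# Example: a 2 m x 1.5 m wall centred at the origin, head 1.5 m away and
# slightly to the right: the frustum becomes asymmetric, producing parallax.
P, V = off_axis_projection(pa=[-1.0, -0.75, 0], pb=[1.0, -0.75, 0],
                           pc=[-1.0, 0.75, 0], pe=[0.3, 0.1, 1.5],
                           near=0.1, far=100.0)
print(np.round(P, 3))
```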
  • Controlling Switchable Software with Head Gesture
    1. HeadBanger: Controlling Switchable Software with Head Gesture
    1.1 About the author: Matthew Oppenheim is an Honorary Researcher at InfoLab21, School of Computing and Communications, Lancaster University, UK. He can be contacted at: [email protected]
    1.2 Acknowledgements: The author is grateful for the continued support of the Assistive Technologists and Occupational Therapists at Beaumont College, Lancaster and for the support of the Embedded Interactive Systems group at InfoLab21, Lancaster University. The Services for Enabling Technology Testing (SETT) project being led by National Star College for the Joint Information Systems Committee (JISC) supported this work at Beaumont College, Lancaster and at InfoLab21, Lancaster University.
    1.3 Abstract
    Purpose: This paper presents a novel non-contact method of using head movement to control software without the need for wearable devices.
    Design/methodology/approach: A webcam and software are used to track head position. When the head is moved through a virtual target, a keystroke is simulated. The system was assessed by participants with impaired mobility using Sensory Software's Grid 2 software as a test platform.
    Findings: The target user group could effectively use this system to interact with switchable software.
    Practical implications: Physical head switches could be replaced with virtual devices, reducing fatigue and dissatisfaction.
    Originality/value: Using a webcam to control software using head gestures where the participant does not have to wear any specialised technology or a marker. This system is shown to be of benefit to motor impaired participants for operating switchable software.
    Keywords: Head gesture, depth camera, virtual control, assistive technology, software.
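The excerpt describes firing a keystroke when the tracked head passes through a virtual target. A minimal sketch of that idea follows (a generic illustration, not the HeadBanger implementation); the tracker input is abstracted to a stream of normalised head x-positions, and the target region bounds are placeholders.

```python
from dataclasses import dataclass

@dataclass
class VirtualSwitch:
    x_min: float          # target region in normalised image coordinates (0..1)
    x_max: float
    armed: bool = True    # prevents repeated triggering while inside the target

    def update(self, head_x: float) -> bool:
        """Feed the current head x-position; return True when the switch fires."""
        inside = self.x_min <= head_x <= self.x_max
        if inside and self.armed:
            self.armed = False
            return True          # head just entered the target: fire once
        if not inside:
            self.armed = True    # head left the target: re-arm
        return False

switch = VirtualSwitch(x_min=0.7, x_max=0.9)
# Simulated stream of head positions from a webcam tracker (placeholder values).
for head_x in [0.5, 0.6, 0.75, 0.8, 0.85, 0.6, 0.5, 0.8]:
    if switch.update(head_x):
        print("switch activated at x =", head_x)
        # In a real system this is where a keystroke would be injected
        # (e.g. with an OS-level input library) to drive the switchable software.
```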
  • Flightsim Community Survey 2020 Final Report
    FlightSim Community Survey 2020 - Final Report
    Copyright Notice: © 2020 Navigraph. By licensing our work with the CC BY-SA 4.0 license it means that you are more than welcome to copy, remix, transform and build upon the results of this survey and then redistribute it to whomever you want in any way. You only have to give credit back to Navigraph and keep the same license. https://creativecommons.org/licenses/by-sa/4.0/
    Preamble: For the third consecutive year I am happy to present the annual flightsim community survey. This report is the result of a collaborative effort between survey partners - developers, companies, organizations and media outlets in the flightsim community. Navigraph's role is to organize, design and compile the survey and make the results freely available for the common good of the flightsim community to guide future projects and ultimately to attract new pilots. This flightsim community survey is the largest and most comprehensive of its kind. In 2020 an amazing 23,500 respondents participated in the survey! This is a 32% increase from last year when 17,800 respondents participated! Moreover, the reach has also increased this year. Last year 40% of the survey respondents had not taken the survey before. This year 60% of the respondents were newcomers! The survey is growing! In last year's meta analysis of the survey we could see that respondents were comfortable with the amount of questions, and that many pilots quite enjoyed the tour of the community, so we decided to extend the survey from 93 to 119 questions this year.
  • Development of a Flight Simulation Training Device and Remote Pilot Station: the URBLOG Unmanned Hybrid Airship Vehicle Case
    Development of a Flight Simulation Training Device and Remote Pilot Station: The URBLOG Unmanned Hybrid Airship Vehicle Case. Laura Sofia Nobre Martins. Dissertation for obtaining the degree of Master in Aeronautical Engineering (integrated master's). Supervisor: Professor Doutor Jorge Miguel dos Reis Silva. June 2020.
    Dedication: Alzira Borges Coelho Salvador Vieira Nobre Maria Castela Pereira Martins Ulisses Lopes Ferreira Martins Diogo Valério Brito Pintas Tobias Boby. You are dearly missed. "Those who pass by us, do not go alone, and do not leave us alone; they leave a bit of themselves, and take a little of us." - Aviator and writer Antoine de Saint-Exupéry.
    Acknowledgements: The development and conclusion of this dissertation mark the end of a chapter that has been filled with many adventures and special moments. This work represents the fulfilment of many dreams and projects that could not have been possible without the extraordinary people by my side. Their support had a major impact on me and how, why and when this dissertation was performed. Therefore, I would like to express my greatest gratitude to: Miguel and Tiago, for their friendship from the very first moment, standing with me side by side during all my successes and downfalls. Thank you for bearing with me closely for the last nine years! All the academic and teaching personnel, at all the stages of my educational path: schools and university. Their teachings made me who I am today. NIT - Núcleo de Investigação em Transportes, where I met wonderful people who allowed me to explore my areas of interest and embrace new opportunities, in a family-like environment.
  • Quick Start Guide
    Quick Start Guide
    Table of Contents: Short Introduction; Chapter 1. Preconditions (Supported Games and Protocols; System Requirements; Steps before you begin playing: Step 1. Installation; Step 2. Making a Tracking Board; Step 3. Making a Calibration Board; Step 4. Calibrate your camera); Chapter 2. Playing Games (Selecting Game Protocol; Prepare, start and stop Head Tracking; Set Zero Pose); Chapter 3. Tracking Settings (Introduction to MotionCap tracking concepts; Use 3D Viewport to control your tracking settings; Tracking Settings panel: 1. Movement Smoothness; 2. Stabilization Factor; 3. Position and Rotation Axis Sensitivity; 4. Sensitivity Response Curves; 5. Asymmetric Curve Editing; 6. "Position respects rotation" mode; 7. "Use Zoom in Prepar3D" mode; FlightGear and OpenTrack UDP connection setup); Chapter 4. Game Presets; Chapter 5. Troubleshooting (Camera calibration issues; Head tracking issues).
    Short Introduction: MotionCap is a high-precision head tracking software with six degrees of freedom (6DOF) which gives its users stable, sensitive, accurate, fast and noiseless head tracking. MotionCap was initially designed to overcome limitations of many existing head tracking solutions, both commercial (such as TrackIR, TrackSmooth) and freeware (FaceTrackNoIR, FreeTrack, OpenTrack). It incorporates innovative tracking technology to achieve robust head tracking and sub-pixel accurate head pose estimation with sub-millimetre and sub-degree precision in all axes. One of the goals of MotionCap is head tracking with maximum possible head rotation angles of up to 145 degrees in each axis, while keeping extremely accurate and fast response, stability and smoothness for any head pose in that range.
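The table of contents above mentions movement smoothness and sensitivity response curves. The sketch below is a generic illustration of how such settings are commonly realised (it is not MotionCap's actual algorithm): an exponential moving average to suppress jitter, followed by a symmetric power-law response curve applied to one axis.

```python
class AxisFilter:
    """Generic smoothing + sensitivity shaping for one tracked axis (e.g. yaw)."""

    def __init__(self, smoothness=0.8, sensitivity=2.0, exponent=1.5):
        self.alpha = 1.0 - smoothness   # higher smoothness -> slower response
        self.sensitivity = sensitivity  # output scale (in-game degrees per head degree)
        self.exponent = exponent        # >1 damps small motions, expands large ones
        self.state = 0.0

    def update(self, raw_angle):
        # Exponential moving average suppresses measurement jitter.
        self.state += self.alpha * (raw_angle - self.state)
        # Symmetric power-law response curve, preserving the sign.
        sign = 1.0 if self.state >= 0 else -1.0
        return sign * self.sensitivity * (abs(self.state) ** self.exponent)

yaw_filter = AxisFilter(smoothness=0.8, sensitivity=2.0, exponent=1.5)
# Noisy yaw samples (degrees) as they might come from a tracker.
for raw in [0.0, 1.2, 0.8, 5.0, 4.6, 4.9, 10.2, 9.8]:
    print(round(yaw_filter.update(raw), 2))
```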
  • Facial Feature Tracking and Head Pose Tracking As Input for Platform
    Master's Degree Thesis, Game and Software Engineering. Facial Feature Tracking and Head Pose Tracking as Input for Platform Games. Tobias Andersson. Blekinge Institute of Technology, Karlskrona, Sweden, 2016. Supervisor: Dr. Veronica Sundstedt, BTH.
    ABSTRACT: Modern facial feature tracking techniques can automatically extract and accurately track multiple facial landmark points from faces in video streams in real time. Facial landmark points are defined as points distributed on a face in regards to certain facial features, such as eye corners and face contour. This opens up for using facial feature movements as a hands free human-computer interaction technique. These alternatives to traditional input devices can give a more interesting gaming experience. They also open up for more intuitive controls and can possibly give greater access to computers and video game consoles for certain disabled users with difficulties using their arms and/or fingers. This research explores using facial feature tracking to control a character's movements in a platform game. The aim is to interpret facial feature tracker data and convert facial feature movements to game input controls. The facial feature input is compared with other hands free input methods, as well as traditional keyboard input. The other hands free input methods that are explored are head pose estimation and a hybrid between the facial feature and head pose estimation input. Head pose estimation is a method where the application is extracting the angles in which the user's head is tilted. The hybrid input method utilises both head pose estimation and facial feature tracking. The input methods are evaluated by user performance and subjective ratings from voluntary participants playing a platform game using the input methods.
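As an illustration of converting head pose estimates into platform-game controls (a generic sketch, not the thesis implementation; the angle thresholds and control names are assumptions), a dead zone plus simple thresholds might look like this:

```python
def head_pose_to_controls(yaw, pitch, dead_zone=10.0, jump_pitch=15.0):
    """Return the set of active game inputs for the given head pose (degrees).

    A dead zone ignores small involuntary head movements; tilting the head
    up beyond jump_pitch triggers a jump.
    """
    controls = set()
    if yaw < -dead_zone:
        controls.add("move_left")
    elif yaw > dead_zone:
        controls.add("move_right")
    if pitch > jump_pitch:
        controls.add("jump")
    return controls

# Simulated pose estimates (yaw, pitch) over a few frames:
for yaw, pitch in [(2, 1), (-14, 3), (-18, 20), (12, 4), (0, 25)]:
    print((yaw, pitch), "->", sorted(head_pose_to_controls(yaw, pitch)))
```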