LOW-COST POSITIONAL TRACKING FOR VIRTUAL REALITY APPLICATIONS

A Degree Thesis Submitted to the Faculty of the Escola Tècnica d'Enginyeria de Telecomunicació de Barcelona Universitat Politècnica de Catalunya by Enric Moreu

In partial fulfillment of the requirements for the degree in AUDIOVISUAL SYSTEMS ENGINEERING

Advisor: Joan Llobera, Josep R. Casas Pla

Barcelona, February 2018

Abstract

The aim of the project is the implementation of a tracking system able to follow a ping-pong paddle and represent it precisely, and in real time, in a virtual reality environment. The data is obtained from several optical sensors and an inertial measurement unit (IMU) attached to the racket. Finally, to demonstrate the accuracy and immersion of the system, a virtual ping-pong game is implemented.


Resum

L'objectiu del projecte és la implementació d'un sistema de seguiment capaç de seguir una pala de ping-pong i representar-la amb precisió, i en temps real, en un entorn de realitat virtual. Les dades són obtingudes per diversos sensors òptics i una unitat de moviment inercial units a la raqueta.

Finalment, per demostrar la precisió i la immersió del sistema, s’implementa un joc de ping-pong virtual.


Resumen

El objetivo del proyecto es la implementación de un sistema de seguimiento capaz de seguir una pala de ping-pong y representarlo con precisión, y en tiempo real en un entorno de realidad virtual. Los datos se obtienen mediante varios sensores ópticos y una unidad de movimiento inercial conectados a la raqueta.

Finalmente, para demostrar la precisión e inmersión del sistema, se implementa un juego virtual de ping-pong.


Revision history and approval record

Revision Date Purpose

0 21/11/2017 Document creation

1 21/01/2018 Document revision

DOCUMENT DISTRIBUTION LIST

Name e-mail

Enric Moreu [email protected]

Josep R. Casas Pla [email protected]
Joan Llobera Mahy [email protected]

Written by: Enric Moreu, Project Author (29/12/2017)

Reviewed and approved by: Joan Llobera, Project Supervisor (24/01/2018)


Table of contents

Abstract 1

Resum 2

Resumen 3

Revision history and approval record 4

Table of contents 5

List of Figures 6

List of Tables 8

Introduction 9
  Statement of purpose 9
  Requirements and specifications 9
  Organization of this thesis 10
State of the art 12
  Technologies for object tracking in 3D 12
  Immersive Virtual Reality 12
Tools used 13
  HTC Vive 13
  SteamVR 13
  Steam Hardware Developer Kit 14
Hardware assembly 16
  System overview 16
  Sensors position optimization 17
  Calibration of the controller 19
  Haptics 20
VR Demonstration environment 22
  Demonstration 22
  Network implementation 24
Budget 26
Conclusions 27
Bibliography 28
Appendices 29
Annex A. Detailed guide to create a custom controller 29
Annex B. JSON file 42


List of Figures

Figure 1. Gantt diagram...... 11

Figure 2. HTC Vive hardware...... 13

Figure 3. Watchman Core Module...... 14

Figure 4. EVM Application Board...... 14

Figure 5. Sensor Breakout Board...... 14

Figure 6. Optical Sensors...... 14

Figure 7. Assembled HDK...... 15

Figure 8. System overview...... 16

Figure 9. Optimal sensors positions 3D representation...... 18

Figure 10. Optimal sensors positions 2D representation ...... 18

Figure 11. Assembled paddle ...... 19

Figure 12. IMU calibration procedure...... 20

Figure 13. EVM application board electronic scheme...... 21

Figure 14. Haptics output...... 21

Figure 15. 3G vibrator...... 22

Figure 16. Demonstration scheme...... 23

Figure 17. Demonstration environment ...... 23

Figure 18. Network overview...... 24


List of Tables

Table 1. Budget...... 26


1. Introduction

The current revolution in hardware technology for virtual reality (VR) applications still presents one major bottleneck: full body tracking is difficult to achieve, imprecise in its quality, or beyond the reach of consumer budgets. In this context, the recent release and open-sourcing of the consumer-targeted HTC Vive VR equipment [1] has prompted the development of custom hardware [2] peripherals using the SteamVR tools suite [3].

1.1. Statement of purpose
The purpose of this project is to design, implement and validate a custom VR controller with high-precision tracking, and to demonstrate how it works in a virtual reality demonstrator involving a physical simulation of a ping-pong ball rendered in immersive virtual reality. The controller consists of a ping-pong racket with several sensors attached that can be used with the HTC Vive Head Mounted Display (HMD) and a 3D scenario in order to interact with the virtual ping-pong ball. The game engine used for the simulation is Unity 3D [4] because of its GPU integration and the range of platforms it supports. The project is carried out in the i2CAT Foundation, in the Internet & Media department, and supervised by professor Joan Llobera.

1.2. Requirements and specifications
The requirements of the project remained the same during the thesis. However, the specifications changed due to internal and external factors.

Project requirements
● Demonstrate the use of a tracking-enabled object in VR.
● Create a novel use-case scenario involving the physical object in a consistent virtual reality experience.
● Explore a use-case scenario and perform user tests.
● Study to what extent such a scenario can contribute to the feeling of being there, and of being there together, within the VR environment.

Project specifications
The language used to develop the game is C# under Unity 3D.
● Create a 3D model of a real paddle using Blender [5].
● Track a ping-pong racket using sensors: given the locations of the sensors, the system should estimate the position and orientation of the racket.
● Calibrate the racket sensors to obtain accurate tracking.
● Integration in Unity 3D: simulation of the physics of the racket, the ball and the ping-pong table.


● Connection with other players: the system should be able to sustain a ping-pong match between two people. This does not consist in developing the network infrastructure, but in using an existing one for this purpose.
● Use a real ball that is simulated with precision by the virtual ball that the user sees. Perform tests to confirm the precision of the simulation, combining different factors (physical and virtual ball combined or virtual ball only; custom controller or standard controller).
● Validate the usability of the proposed solution with real users.

1.3. Motivation
We live in a society where technology advances ever faster, is accessible to everybody and is getting cheaper. This rapid evolution also brings problems: technology isolates people, reducing their social lives and the time they spend with family or friends. In this context, the Internet and Media department of i2CAT decided to build a solution using the latest low-budget virtual reality releases to encourage people to spend more time socializing.

The scope of this project is a virtual reality ping-pong game where people can play together over the internet, as a contribution to the VR Together European project, a non-profit initiative that invests in improving VR interactions between people.

1.4. Organization of this thesis
This project is divided into four packages:
1. Assembly of the controller: in the first part of my thesis, I attached the optical sensors at the optimal positions on the paddle and verified the connectivity with the computer. I also created a 3D model of the paddle.
2. Calibration of the sensors: refine the position and orientation of the optical sensors and the IMU to improve the tracking.
3. Integration with the game engine: display the paddle in Unity 3D with realistic physics.
4. Gameplay experience: create an environment to play ping-pong with a second player over the network.


Figure 1. Gantt diagram


2. State of the art

The virtual reality revolution is still experiencing huge growth and is expected to continue growing in the next few years. Several articles about VR have been written in recent years [6]. To analyze the state of the art, we will explore different technologies that allow tracking, as well as other methods used to create the sensation of immersion for the users.

Technologies for object tracking in 3D
Most systems perform the tracking with cameras, using methods such as particle filters, particle swarm optimization, gradient descent or iterative closest point, among others. However, to obtain good tracking, several cameras and considerable computing power are required. In 2010 the Kinect depth camera appeared, providing a decent-quality capture of the surrounding environment at a very competitive price, but with problems such as high noise and the large amount of data to analyze. Other systems, like the Steam HDK, use optical sensors to apply triangulation techniques that can achieve sub-millimetric precision when the emitters are completely fixed and at the optimal positions.
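To make the triangulation idea concrete, the sketch below is a simplified 2D illustration (not the SteamVR algorithm): each emitter reports only the bearing angle at which it "sees" a sensor, and intersecting the two rays recovers the sensor position. All names are hypothetical. In the real system each base station sweeps two laser planes, so the measured quantities are sweep angles in two axes, but the geometric principle is the same.

using System;

// Simplified 2D illustration of triangulation from two fixed emitters.
// Each emitter reports the bearing angle (radians) at which it sees the
// sensor; intersecting the two rays recovers the sensor position.
public static class Triangulation2D
{
    public static (double x, double y) Intersect(
        double e1x, double e1y, double angle1,
        double e2x, double e2y, double angle2)
    {
        // Ray directions from each emitter.
        double d1x = Math.Cos(angle1), d1y = Math.Sin(angle1);
        double d2x = Math.Cos(angle2), d2y = Math.Sin(angle2);

        // Solve e1 + t1*d1 = e2 + t2*d2 for t1 using Cramer's rule.
        double det = d1x * (-d2y) - d1y * (-d2x);
        if (Math.Abs(det) < 1e-9)
            throw new InvalidOperationException("Rays are parallel; no unique intersection.");

        double rx = e2x - e1x, ry = e2y - e1y;
        double t1 = (rx * (-d2y) - ry * (-d2x)) / det;

        return (e1x + t1 * d1x, e1y + t1 * d1y);
    }
}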

Immersive Virtual Reality
Researchers are trying to make virtual reality more immersive in different ways. Most of them are trying to make telepresence possible, where different people can interact within the same space in real time; using virtual bodies with natural tracking of the limbs can be very effective. Another technique to create the sensation of immersion is allowing the users to interact with real objects, and not only pre-made devices or rendered virtual graphics that do not exist in the physical world. Normally, these objects include haptic feedback to improve the experience.


3. Tools used

HTC Vive
The HTC Vive is a virtual reality headset developed by HTC and Valve Corporation, able to create the sensation of moving in a 3D space and to sync with other controllers to interact with the environment (Figure 2). To track the movement of the headset and the controllers, two base stations, which are infrared emitters, generate pulses at 60 Hz that are picked up by the headset, allowing sub-millimeter precision.

Figure 2. HTC Vive headset and its accessories: the controllers and the base stations.

SteamVR
SteamVR is a virtual reality platform developed by Valve that offers a 360-degree, full-room VR experience. The main component offered by SteamVR is the headset, which is built by Valve in cooperation with HTC and is called HTC Vive. The headset position and orientation are tracked with a pair of laser-emitting base stations called Lighthouse. Two base stations need to be placed in the corners of the user's virtual reality area so that the infrared pulses they emit can reach the sensors arranged around the surface of the Vive headset. Before playing any game or using the virtual reality space, an initial setup is required; SteamVR provides a setup assistant that handles the calibration.


Steam Hardware Developer Kit

The SteamVR HDK is a kit of components that can be used to quickly prototype and evaluate custom SteamVR Tracked objects.

The kit is composed of the following components:

● Watchman Core Module: the processing unit; it also contains a 1000 Hz IMU.
● EVM Application Board: the support for the Watchman core; it provides communication via USB and the power supply plug.
● TS4231 Chiclet Sensor Modules: the system can process data from up to 32 optical sensors that receive the infrared pulses emitted by the base stations.
● Sensor Breakout Board: the PCB where the sensors are plugged in.
● Steam Wireless Dongle: the USB dongle that allows wireless communication with the PC.
● 2.4 GHz Antenna: the antenna attached to the EVM Application Board that sends the processed data to the dongle.

Figure 3. Watchman Core Module

Figure 4. EVM Application Board

Figure 5. Sensor Breakout Board

Figure 6. TS4231 optical sensors x32


Figure 7. SteamVR HDK assembled using five optical sensors.


4. Hardware assembly

This section summarizes the assembly of the controller; Annex A contains detailed information on each step.

System overview
The system is composed of a ping-pong paddle, the Steam HDK, an HMD, two base stations and a computer with SteamVR and Unity3D installed.

As we can see in Figure 8, the custom controller receives data from the optical sensors and the IMU and sends its position and rotation to the computer using the antenna. The computer then loads the Unity3D scene, using the SteamVR plugins that allow the use of the HMD and the controllers.

Figure 8. Overview of a low-cost position tracking system using SteamVR.
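On the Unity side, the paddle's GameObject simply has to follow the pose reported for the custom controller every frame. The component below is a minimal, hypothetical sketch of that idea (the pose source, class name and smoothing factor are assumptions, not the project's actual code):

using UnityEngine;

// Hypothetical helper: applies the latest tracked pose to the paddle model,
// with a small amount of smoothing to hide jitter between tracking updates.
public class PaddlePoseFollower : MonoBehaviour
{
    // Filled in each frame by whichever tracking source is used
    // (e.g. the SteamVR plugin for the custom controller).
    public Vector3 targetPosition;
    public Quaternion targetRotation = Quaternion.identity;

    [Range(0f, 1f)] public float smoothing = 0.5f;

    void Update()
    {
        // Exponential smoothing towards the reported pose.
        float t = 1f - Mathf.Pow(1f - smoothing, Time.deltaTime * 90f);
        transform.position = Vector3.Lerp(transform.position, targetPosition, t);
        transform.rotation = Quaternion.Slerp(transform.rotation, targetRotation, t);
    }
}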


The creation of a custom controller is divided into three parts: the simulation of the optimal sensor positions, the assembly of the controller and the calibration of the sensors.

Sensors position optimization
To find the optimal positions of the sensors, a software tool called HMD Designer is used. Given a 3D model and a number of sensors, the program simulates several combinations of sensor positions to find the best configuration. The simulation output is a 3D model (Figure 9) and a 2D graph (Figure 10) that illustrate the following concepts:
● Number of visible sensors: the number of visible sensors for a pose is a good starting point for determining the trackability of that pose. Four visible sensors are required to begin tracking, although fewer than four may continue tracking with assistance from the IMU.
● Pose rotation error: sufficient depth in three axes is necessary for an initial pose, but it is also a key determinant of rotation error. A baseline in all three axes amplifies the change in relationships between sensors for a given degree of rotation. When sensors are mostly coplanar, rotation around any axis within the plane does not result in significant X or Y displacement between sensors.
● Initial pose position: represents the object's ability to begin tracking from any orientation. Although the number of visible sensors is a good place to start, tracking performance also depends on the distance between sensors and their coplanarity. Four visible sensors are required, and they must also be non-coplanar; adding a coplanarity criterion gives a more detailed simulation of whether a pose may be used to start tracking.
● Pose translation error: translation error is dominated by translation away from the base station. As the object moves away from the base station, the perceived distance between sensors decreases. If there is not a sufficient baseline between the sensors at the start, the object cannot move away from the base station without running out of baseline.
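Since starting to track requires four visible sensors that are also non-coplanar, a simple way to test non-coplanarity is the scalar triple product of the edge vectors spanned by four sensor positions. The helper below is an illustrative sketch (hypothetical names, not part of HMD Designer):

using UnityEngine;

// Illustrative check: four sensor positions are (nearly) coplanar when the
// scalar triple product of the three edge vectors from the first sensor is
// close to zero, i.e. the tetrahedron they span has (almost) no volume.
public static class SensorGeometry
{
    public static bool AreNonCoplanar(Vector3 a, Vector3 b, Vector3 c, Vector3 d,
                                      float minVolume = 1e-6f)
    {
        Vector3 ab = b - a, ac = c - a, ad = d - a;
        float tripleProduct = Vector3.Dot(ab, Vector3.Cross(ac, ad));
        // |triple product| / 6 is the volume of the tetrahedron (in m^3 here).
        return Mathf.Abs(tripleProduct) / 6f > minVolume;
    }
}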


Figure 9. 3D representation of the optimal positions of the sensors, with the number of visible sensors indicated depending on the base station position.

Figure 10. 2D pitch-yaw graph evaluating the sensor placement.

Once the positions and orientations of the sensors are defined and the configuration files are loaded in the controller (see Annex A for further information), the assembly of the sensors can be completed.


Figure 11. Assembled HDK using 11 sensors.

Calibration of the controller
Before the position and orientation of the controller can be reliably determined, a calibration routine is mandatory, which is divided into two steps:

1. Calibration of the IMU
In this step, an orthogonal object is required to use as a reference. For proper tracking, it is critical that the IMU measurements be as accurate as possible. Manufacturing tolerances, the assembly process and temperature variations cause variations in IMU measurement accuracy. A calibration routine performed on each unit after final assembly reduces the impact of these inaccuracies on tracking performance. The procedure consists in placing the IMU in the six positions of Figure 12, so that each accelerometer axis is aligned with gravity in turn.


Figure 12. Procedure to calibrate the IMU, rotating the device on a flat surface through six different orientations.
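As an illustration of what such a six-position routine computes (one common formulation, not necessarily what the SteamVR calibrator does internally), the bias and scale of each accelerometer axis can be estimated from the readings taken with that axis pointing up and then down:

using System;

// Illustrative six-position accelerometer calibration: for each axis we record
// the reading with the axis aligned with +g and with -g, then solve
//   measured = scale * true + bias   with   true = +g / -g.
public static class ImuCalibration
{
    const double G = 9.8066; // gravity magnitude used as reference (m/s^2)

    // upReading: measurement with the axis pointing up (+g),
    // downReading: measurement with the axis pointing down (-g).
    public static (double bias, double scale) CalibrateAxis(double upReading, double downReading)
    {
        double bias = (upReading + downReading) / 2.0;
        double scale = (upReading - downReading) / (2.0 * G);
        return (bias, scale);
    }

    // A corrected reading then follows from inverting the measurement model.
    public static double Correct(double raw, double bias, double scale)
        => (raw - bias) / scale;
}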

2. Calibration of the optical sensors
When developing a SteamVR tracked object, it is critical that the system knows exactly where the optical sensors are located on the object. Even though the JSON file may be configured with the locations of the sensors as dictated by the mechanical model, sub-millimeter anomalies introduced during the assembly process can have a negative impact on the tracking performance of the object.

To counteract these anomalies, a brief calibration procedure should be performed on each object after assembly. The procedure produces a new JSON configuration that places the sensors closer to their real-world locations.

The calibration routine for tracked objects requires the same hardware used during normal operation. An ideal JSON configuration file is provided to the calibration tool, and the object is moved and rotated through the VR space. The calibration tool records sensor hits and attempts to fit a new set of sensor locations and orientations that best matches the data recorded. These best-fit locations and orientations are provided in a new JSON file that can be programmed into the object for improved performance.

Haptics
The SteamVR HDK includes the functionality of a vibration mode, using a suitable device attached to pin MCU_GPIO_7 of the EVM application board.


Figure 13. Scheme of the J2 ports of the EVM application board.

To characterize the output of pin 34, an oscilloscope with sufficient time resolution is needed; in this case, an LHT00SU1 virtual oscilloscope was used, which measured a 3 ms pulse with an amplitude of 2 V.

Figure 14. 3 ms pulse captured at the haptics output using an LHT00SU1 virtual oscilloscope.


Unfortunately, the haptic vibrator that fulfils the electronic requirements is only available in large orders (at least 1000 units); consequently, this module could not be implemented.

Figure 15. 3G vibrator with two resonance points.

5. VR Demonstration environment

To demonstrate the precision of the tracking and the feeling of immersion, a VR game environment was created. The single-player game allows the player to play with a paddle and a ball that obeys slowed-down physics to improve the game experience. Additionally, an online mode was developed, with some limitations explained below.
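As a hypothetical illustration of how such a slow-down can be obtained in Unity (the project's exact values and approach may differ), the ball's rigidbody can be simulated under reduced gravity with some extra drag:

using UnityEngine;

// Hypothetical sketch: slows the ball down by reducing the gravity acting on
// its rigidbody and adding some drag, making rallies easier to follow in VR.
[RequireComponent(typeof(Rigidbody))]
public class SlowBall : MonoBehaviour
{
    public float gravityScale = 0.5f; // fraction of normal gravity
    public float extraDrag = 0.3f;

    Rigidbody body;

    void Start()
    {
        body = GetComponent<Rigidbody>();
        body.useGravity = false;      // we apply our own, scaled gravity
        body.drag = extraDrag;
    }

    void FixedUpdate()
    {
        body.AddForce(Physics.gravity * gravityScale, ForceMode.Acceleration);
    }
}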

Demonstration
The simulation was performed with Unity 3D and the SteamVR tools:
● HMD
● Paddle enhanced with the HDK
● Base stations
● SteamVR module for Unity 3D


Figure 16. Schema of the demonstration.

The creation of a friendly environment (Figure 17) was performed using different 3D models from the Unity3D community, for example, the tennis table [7], the humans [8] and the room [9].

Figure 17. Demonstration accessible at https://youtu.be/QxPiP0HnYJk


Network implementation
To improve the user immersion, different approaches to a multiplayer mode were attempted:

The first solution we tried was SpatialOS, a highly scalable platform that provides hosting servers and an API to interact with the clients. It has full integration with Unity 3D; unfortunately, the stable version of the software (5.6) was not compatible with the version of SteamVR we were using.

The second network solution we implemented used the networking package of Unity3D, which provides a very easy way to create prototypes. The only problem was the interaction of the server with a constantly moving object (the paddle): the package cannot handle constant input from the clients.

The final and most complete implementation of the multiplayer mode used TCP sockets with the native tools provided by C#. The system is composed of a server, which manages all the physics interactions, and multiple clients, which collect data from the controllers and send it to the server (Figure 18).

Figure 18. Master-slave multiplayer architecture


The packets sent over the network were composed of eight fields (32 bytes):
1. Identifier (with the paddle number, or empty for the ball)
2. X position
3. Y position
4. Z position
5. X quaternion
6. Y quaternion
7. Z quaternion
8. W quaternion
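A minimal sketch of how such a 32-byte packet could be built and sent with the native C# socket tools is shown below; the layout (a 4-byte identifier followed by seven 4-byte floats) and the class and method names are assumptions for illustration, not the project's actual code:

using System;
using System.Net.Sockets;

// Packs one pose update (identifier, position, rotation quaternion) into
// 32 bytes and writes it to a TCP connection.
public static class PosePacket
{
    public static byte[] Pack(int id, float px, float py, float pz,
                              float qx, float qy, float qz, float qw)
    {
        var buffer = new byte[32];
        BitConverter.GetBytes(id).CopyTo(buffer, 0);
        float[] fields = { px, py, pz, qx, qy, qz, qw };
        for (int i = 0; i < fields.Length; i++)
            BitConverter.GetBytes(fields[i]).CopyTo(buffer, 4 + 4 * i);
        return buffer;
    }
}

public class PoseClient
{
    private readonly TcpClient client = new TcpClient();

    public void Connect(string host, int port) => client.Connect(host, port);

    // Sends one pose update; the server applies it to the corresponding paddle.
    public void SendPose(int id, float px, float py, float pz,
                         float qx, float qy, float qz, float qw)
    {
        byte[] packet = PosePacket.Pack(id, px, py, pz, qx, qy, qz, qw);
        client.GetStream().Write(packet, 0, packet.Length);
    }
}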

The main limitation of the server was the transmission rate, which could not exceed 100 packets per second without producing corrupt packets (due to collisions between packets). Another problem we could not resolve was the overstep of the ball when the paddle was moved very fast.
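One common way to keep back-to-back packets on the TCP stream from being parsed as corrupt is to read exactly 32 bytes per pose before processing it. The loop below is a hedged sketch of that idea, not the server that was actually implemented:

using System.Net.Sockets;

// Hedged sketch: reads fixed-size 32-byte pose packets from a client stream,
// accumulating bytes until a full packet is available, so that packets that
// arrive concatenated or fragmented on the TCP stream are still parsed whole.
public static class PoseReader
{
    public static byte[] ReadPacket(NetworkStream stream)
    {
        var packet = new byte[32];
        int received = 0;
        while (received < packet.Length)
        {
            int n = stream.Read(packet, received, packet.Length - received);
            if (n == 0)
                return null; // connection closed by the client
            received += n;
        }
        return packet;
    }
}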


6. Budget

This section describes the cost of the project, including software, hardware and human resources.

Product Price

PC 1300.00 €

HTC Vive 800.00 €

SteamVR HDK 550.00 €

Windows 10 Home 135.00 €

Junior developer (750 hours) 6000.00 €

Total 8785.00 €

Table 1. Budget

The total cost of the project is relatively low, considering the use of free software such as Blender and the free versions of Unity3D and Visual Studio. To conclude this section, it is worth comparing the cost of the SteamVR HDK hardware with the other solutions discussed in the state of the art, which double the price for similar precision.


7. Conclusions
The main goal of the project was to design and implement a virtual reality ping-pong game using the hardware provided by SteamVR. In order to meet the objectives, the Unity3D game engine has been used to simulate a ping-pong environment with realistic physics.

This work confirms that using a custom controller can create the sensation of being immersed in the virtual world while interacting with real objects, satisfying the principal requirement of the project. Other requirements, like the network implementation, the haptics module or the user tests, did not succeed as expected, due to lack of time or external factors. Furthermore, it is worth mentioning that the current budget is very low compared with other tracking solutions, which are too expensive for most users' budgets.

The project gave me several skills, such as 3D modeling with Blender, C# programming with Unity3D, understanding complex electronic schematics and, especially, searching for, understanding and applying documentation from the internet. As future work, it might be interesting to extend the game to other platforms, for example by running the game in the browser, and to correct the network issues.


Bibliography

[1] HTC Vive VR. [Online]. Available: https://www.vive.com/us/product/vive-virtual-reality-system
[2] P. S. Hollander, “Initial SteamVR tracking experimentation”, Talaria Blog. [Online]. Available: http://www.talariavr.com/blog/steamvr-mug [Accessed: 5 October 2017]
[3] SteamVR HDK. [Online]. Available: https://partner.steamgames.com/vrlicensing
[4] Unity3D Documentation. [Online]. Available: https://docs.unity3d.com/Manual
[5] Blender Documentation. [Online]. Available: https://docs.blender.org/
[6] M. K. X. J. Pan and G. Niemeyer, “Catching a Real Ball in Virtual Reality”, IEEE Virtual Reality 2017.
[7] Table tennis 3D model, Marco Mercurio. Available: https://www.turbosquid.com/3d-models/tennis-table-max-free/606336
[8] Human 3D model, “Renderpeople Free Rigged Models”, Renderpeople. Available: https://www.assetstore.unity3d.com/en/#!/content/95860
[9] Room 3D model, “Pack Gesta Furniture #1”, Gesta2. Available: https://www.assetstore.unity3d.com/en/#!/content/28237


Appendices

Annex A. Detailed guide to create a custom controller
The following guide gives a detailed explanation of how to develop a custom controller using the SteamVR HDK. To illustrate the explanation, a table tennis paddle is taken as an example, covering all the steps of the creation, from the 3D modeling to the calibration.
Tools:
● Scissors
● Adhesive tape
● Computer with Windows
Materials:
● The physical object that is going to be tracked
● A portable battery to power the electronic system
● Two SteamVR base stations
● The SteamVR HDK, which is composed of:

Figure 19. Watchman Core Module

Figure 20. EVM Application Board

Figure 21. Sensor Breakout Board

Figure 22. TS4231 optical sensors x32

Step 1: Create the 3D model of the object
First of all, we need to port the physical object to the virtual world. To do that, we are going to use an open-source software package called Blender to create a 3D model.


As the focus of this tutorial is the development of a custom controller, the modeling procedure will not be explained. Some useful tips to create the object are:

● In the Scene tab of the Properties, set the Units to Meters.

Figure 23. Blender units setup

Now, the output model will have the correct scale.

● Use as few faces as possible to improve the simulation of the sensor positions in the next steps. Avoid using the Subdivision Surface modifier.
● Center the model at the origin using the shortcut Ctrl + Alt + Shift + C and the Transform settings to modify the location:

Figure 24. Blender setup of the origin.

Figure 25. Blender position offset reduced to 0.


● Don't worry about the orientation; it will be changed in the next steps.

Figure 26. Blender 3D model of a low poly paddle.

● Once you have finished the 3D model, export it:

Figure 27. Blender exporting options.

Step 2: Create a render model
SteamVR requires a render model to display the virtual object. A render model is composed of:
● A .OBJ file that defines the 3D mesh.


● A .PNG file that contains all the textures.
● A .MTL file that assigns the textures to each surface.
The best option to generate these files is 3D Builder, a program installed by default on Windows 10. It provides fast customization of the mesh, allowing multiple options for colors and textures.

Open the .STL with 3D Builder and select meters as the import units:

Figure 28. Units selector from Windows 3D Builder.

Then use the Paint menu to select the desired colors or textures to personalize the object:

Figure 29. Windows 3D Builder

Finally, save the render model using the top-right button or simply press Ctrl+Shift+S, saving all the files with the same name in a directory of the same name. The folder should be placed in the SteamVR renders folder, as in Figure 30.


Figure 30. Files that compose a SteamVR render model.

Step 3: Find the optimal sensor positions using HMD Designer
In the third step, we are going to determine the exact positions of the optical sensors; we will need the .STL model created in the first step. The software used is the HMD Designer tool, which can be downloaded using Steam. Note that you must be a SteamVR licensee; you can request the license for free at https://partner.steamgames.com/

After installing SteamVR Tracking HDK (Figure 31), launch the app and select “Launch HMD Designer GUI”.

Figure 31. SteamVR software for VR licensees

This tool will analyze the 3D model to find the optimal positions of the sensors to obtain good tracking. Create a project and select “Add Input File” to browse for your .STL 3D model, then select it in the “Sensor Object” dropdown menu. The “Obstacles” option allows us to indicate that some regions of the object may be covered by the human body, like the grip. In this case the simulator does not place sensors in the paddle grip, so we do not need to enable this option. Finally, indicate the number of “Total Sensors”. The minimum number of sensors to achieve tracking is 5, although the precision will be very poor, with many translation and rotation errors. We recommend using at least 12, to correctly cover all the surfaces of the object. Then click “Simulate”.

Figure 32. HMD Designer tool

The output of the simulation is three windows, but we will use only one: the OpenSCAD report (Figure 33).


Figure 33. Simulation output with the sensors at the optimal positions.

The most valuable information in this report is the list of coordinates on the left, which indicates the optimal position and orientation of each sensor. We will use this data in the next step.
Step 4: Check the USB connectivity with the device
To complete the next steps we must check that the device is working correctly. To perform that test:
1. Connect the controller to the PC using a USB cable.
2. Go to: C:\Program Files (x86)\Steam\steamapps\common\SteamVR\tools\lighthouse\bin\win32
3. Open lighthouse_console.exe
4. Type “serial”
5. The console should return something like LHR-XXXXXXXX

Step 5: Create the JSON file
The JSON file is a configuration file that allows the controller to work properly, identifying which sensors are connected and the positions and normals they have. This step is the most important one, and the source of the majority of errors. Let's analyze all the fields of the JSON file:


{ "device_class" : "controller", "device_pid" : 8960, "device_serial_number" : "LHR-839FD115", "device_vid" : 10462,

The first four fields contain general information about the device; the only field we must change is device_serial_number, using the LHR-XXXXXXXX we obtained in the last step.

"firmware_config" : { "mode" : "controller", "radio" : true, "sensor_env_on_pin_a" : 0, "spi_flash" : false, "trackpad" : true, "trigger" : true, "vrc" : true, "haptics" : true },

The firmware configuration section configures some hardware features, such as the trackpad, the radio or the haptics.

"head" : { "plus_x" : [ 1, 0, 0 ], "plus_z" : [ 0, -1, 0 ], "position" : [ 0, 0, 0 ] }, "Imu" : { “acc_bias" : [ 0, 0, 0 ], "acc_scale" : [ 1, 1, 1 ], "gyro_bias" : [ 0, 0, 0 ], "gyro_scale" : [ 1, 1, 1 ], "plus_x" : [ 0, -1, 0 ], "plus_z" : [ 0, 0, 1 ], "position" : [ 0, 0, 0 ] },

The head field refers to the relative position and orientation of the controller; for the moment, use these default values.

"lighthouse_config" : { "channelMap" : [ 31, 7, 21, 5, 25, 27, 13, 15, 17, 29, 9 ], "modelNormals" : [ [ 0, 0, 1 ], [ 0, 0, 1 ], [ 0, 0, -1 ], [ 0, 0, -1 ], [ 0, 0, 1 ], [ 0, 0, -1 ], [ 0, 1, 0 ], [ -0.7071068286895752, 0.7071068286895752, 0 ], [ 0.7071068286895752, 0.7071068286895752, 0 ], [ 1, 0, 0 ], [ -1, 0, 0 ] ], "modelPoints" : [ [ 0.067619331181049347, 0.018803209066390991, 0.0078593669459223747 ], [ -0.061476584523916245, 0.035129230469465256, 0.0079422267153859138 ], [ 0.03331269696354866, 0.069392964243888855, -0.0076686479151248932 ],


[ -0.023291664198040962, 0.077367834746837616, -0.0073862564750015736 ], [ -0.0083339251577854156, 0.07888437807559967, 0.0083589376881718636 ], [ 0.067561790347099304, 0.032618466764688492, -0.0077610346488654613 ], [ -0.00058411975624039769, 0.087797172367572784, -3.6973538954043761e-005 ], [ -0.055917959660291672, 0.062688156962394714, -0.00049256527563557029 ], [ 0.054849445819854736, 0.064705133438110352, 0.00011297903256490827 ], [ 0.077383697032928467, 0.012862982228398323, 0.00015525372873526067 ], [ -0.076466664671897888, 0.012704703025519848, 0.00053010595729574561 ] ] },

Each channelMap entry indicates the port where the corresponding sensor is connected; try to use the nearest port for each sensor. Replace the modelNormals and modelPoints with the ones you obtained in the simulation.

"manufacturer" : "Valve", "model_number" : "REF-HMD", "render_model" : "pala", "resource_root" : "", "revision" : 4, "tracked_controller_role" : "", "type" : "Lighthouse_HMD" }

Finally, in “render_model”, add the render model name you used in Step 2, so that SteamVR can represent your object.
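Before uploading, it can also help to verify that the three sensor arrays are mutually consistent. The small check below is a hypothetical helper operating on the already parsed arrays; it is not part of the SteamVR tools:

using System;

// Hypothetical sanity check for the lighthouse_config section of the JSON:
// every entry of channelMap must have a matching point and a unit-length normal.
public static class ConfigCheck
{
    public static void Validate(int[] channelMap, double[][] modelPoints, double[][] modelNormals)
    {
        if (modelPoints.Length != channelMap.Length || modelNormals.Length != channelMap.Length)
            throw new Exception("channelMap, modelPoints and modelNormals must have the same length.");

        for (int i = 0; i < modelNormals.Length; i++)
        {
            double[] n = modelNormals[i];
            double length = Math.Sqrt(n[0] * n[0] + n[1] * n[1] + n[2] * n[2]);
            if (Math.Abs(length - 1.0) > 1e-3)
                throw new Exception($"Normal {i} is not unit length ({length:F4}).");
        }
    }
}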

Step 6: Attach the sensors to the custom controller
Use the adhesive tape and the scissors to attach the sensors to the Breakout Board, in the exact positions the JSON specifies, with millimetric accuracy, placing the black face of the flex cable towards the PCB. Remember to connect each sensor to the correct port.


Figure 34. Assembled HDK using 11 sensors.

Step 7: Pair the controller
In order to use the controller without cables, we need to pair the controller with the USB dongle.
1. Connect the dongle to the PC and install the necessary drivers; depending on the operating system this will be done automatically.
2. Open the lighthouse_console.exe we used previously.
3. Type “pair”
4. Connect the controller to an external battery.
5. Hold the two buttons on the right.
6. A confirmation message should appear in the console.
Step 8: Load the configuration into the controller
With the controller connected via USB, type “serial”:


Find your custom controller; note that if you have other controllers or the HMD connected, they will be displayed too. To identify your device, unplug the other SteamVR peripherals. Now, connect to the device with the command “serial LHR-XXXXXXXX”, replacing it with the serial of your controller. To upload the JSON file, type “uploadconfig C:/Path/to/myConfig.json”. Be careful: you can overwrite the configuration of the other SteamVR peripherals, so it is highly recommended to perform a backup before uploading files: “downloadconfig myBackup.json”.
Step 9: Calibrate the IMU
After uploading the configuration file to the device, we should correct the factory offset of the IMU. The device must be connected via USB.
1. Open the system console cmd.exe
2. Go to the folder: C:\Program Files (x86)\Steam\steamapps\common\SteamVR Tracking HDK\tools\bin\win32
3. Run imu_calibrator.exe
4. Now, place the controller on a flat surface and rotate it through all the axes as in Figure 35. Press Enter for each position.

Figure 35. Procedure to calibrate the IMU, rotating the device on a flat surface through six different orientations.

5. Finally, the console should output something like this:

Calibrating to gravity sphere, radius 9.8066
0.08774 accelerometer fit error (6 sample vectors x 8 subsamples per vector)
"acc_scale" : [ 0.9992, 0.9984, 0.9951 ],
"acc_bias" : [ 0.1326, 0.0691, 0.1427 ],
"gyro_scale" : [ 1.0, 1.0, 1.0 ],
"gyro_bias" : [ -0.03372, -0.0007292, 0.009967 ],


If the fit error is higher than 0.1, repeat the calibration. Otherwise, copy the four fields of the output into the JSON and upload it to the device.
Step 10: Calibrate the optical sensors
Before connecting to SteamVR, the positions and orientations of the optical sensors must be corrected; otherwise, the coordinate configuration loaded into the controller using the JSON will not match the real positions of the sensors, which have small variations that cannot be detected visually. The tool used to refine the positions is vrtrackingcalib.exe, located in the same path as imu_calibrator.exe: C:\Program Files (x86)\Steam\steamapps\common\SteamVR Tracking HDK\tools\bin\win32
1. Disable one of the base stations and set the remaining one to mode A.
2. Connect the device via USB.
3. Go to the directory shown above.
4. Run vrtrackingcalib.exe /usedisambiguation framer /bodycal 800 200. The two arguments at the end of the command represent the total number of required sensor hits and the number of hits required per sensor.
5. Move the device slowly, exposing all the sensors to the base station.
6. When the procedure finishes, a new JSON file is generated in the folder where vrtrackingcalib.exe is located. Upload it to the device.
Step 11: Connect to SteamVR
Finally, we have designed, optimized, assembled and calibrated our low-cost custom controller. To test it, reconnect the base station you disabled in the last step, setting one base station to mode B (master) and the other to mode C (slave). Now, you can connect the HTC Vive headset and your calibrated custom device using a portable battery or the USB cable. SteamVR should detect both.

Summing up, the procedure for developing a custom controller consists in designing it using a 3D model, analyzing where the sensors will be placed using HMD Designer, attaching the electronics to the object and calibrating it using the tools provided by SteamVR.


Figure 36. Summary of the assembly of a custom controller using the SteamVR HDK.


Annex B. JSON file
The following file allows a ping-pong paddle to be tracked using 11 optical sensors and an IMU:

{ "device_class" : "controller", "device_pid" : 8960, "device_serial_number" : "LHR-839FD115", "device_vid" : 10462, "firmware_config" : { "mode" : "controller", "radio" : true, "sensor_env_on_pin_a" : 0, "spi_flash" : false, "trackpad" : true, "trigger" : true, "vrc" : true, "haptics" : true }, "head" : { "plus_x" : [ 1, 0, 0 ], "plus_z" : [ 0, -1, 0 ], "position" : [ 0, 0, 0.005 ] }, "imu" : { "acc_bias" : [ 0.13259999454021454, 0.069099999964237213, 0.14270000159740448 ], "acc_scale" : [ 0.99919998645782471, 0.99839997291564941, 0.99510002136230469 ], "gyro_bias" : [ -0.033720001578330994, -0.00072920002276077867, 0.0099670002236962318 ], "gyro_scale" : [ 1, 1, 1 ], "plus_x" : [ -1, -0, 0 ], "plus_z" : [ 0, -0, 1 ], "position" : [ 0.0070000002160668373, 0.0010000000474974513, 0 ] }, "lighthouse_config" : { "channelMap" : [ 31, 7, 21, 5, 25, 27, 13, 15, 17, 29, 9 ], "modelNormals" : [ [ 0, 0, 1 ], [ 0, 0, 1 ], [ 0, 0, -1 ], [ 0, 0, -1 ], [ 0, 0, 1 ], [ 0, 0, -1 ], [ 0, 1, 0 ], [ -0.7071068286895752, 0.7071068286895752, 0 ], [ 0.7071068286895752, 0.7071068286895752, 0 ],


[ 1, 0, 0 ], [ -1, 0, 0 ] ], "modelPoints" : [ [ 0.067619331181049347, 0.018803209066390991, 0.0078593669459223747 ], [ -0.061476584523916245, 0.035129230469465256, 0.0079422267153859138 ], [ 0.03331269696354866, 0.069392964243888855, -0.0076686479151248932 ], [ -0.023291664198040962, 0.077367834746837616, -0.0073862564750015736 ], [ -0.0083339251577854156, 0.07888437807559967, 0.0083589376881718636 ], [ 0.067561790347099304, 0.032618466764688492, -0.0077610346488654613 ], [ -0.00058411975624039769, 0.087797172367572784, -3.6973538954043761e-005 ], [ -0.055917959660291672, 0.062688156962394714, -0.00049256527563557029 ], [ 0.054849445819854736, 0.064705133438110352, 0.00011297903256490827 ], [ 0.077383697032928467, 0.012862982228398323, 0.00015525372873526067 ], [ -0.076466664671897888, 0.012704703025519848, 0.00053010595729574561 ] ] }, "manufacturer" : "Valve", "model_number" : "REF-HMD", "render_model" : "pala", "resource_root" : "", "revision" : 4, "tracked_controller_role" : "", "type" : "Lighthouse_HMD" }
