<<

DEGREE PROJECT IN ELECTRICAL ENGINEERING, SECOND CYCLE, 30 CREDITS STOCKHOLM, SWEDEN 2020

Electronic System in a Robotic Head with Human Eye Movements

N. TOBIAS FORSÉN

KTH ROYAL INSTITUTE OF TECHNOLOGY SCHOOL OF ELECTRICAL ENGINEERING AND COMPUTER SCIENCE

Abstract

Eye-tracking applications are becoming more and more common. Monitoring a driver's focus on the road and studying the behavior of customers in a store are just a few examples. For a long time, eye tracking has been too expensive and too complex for many kinds of applications. Thanks to the recent explosion in camera sensor technology it is possible to develop cheaper and better eye trackers. However, developing and testing these sensors requires advanced algorithms. These algorithms are then tested by a variety of people to confirm that they work.

This thesis investigates whether it is possible to make robotic eyes that move like a human's eyes. The project includes a detailed process for developing a humanlike robotic head. The movements implemented on the robot are saccades and smooth pursuit, with potential for vergence movements, vestibulo-ocular movements, and microsaccades. The robot head could be used instead of a person when testing eye trackers. The report describes how the author developed the electronics to power the robotic head. The electronics schematic is discussed and developed. The model of how the robot head and its eyes move is explained and then implemented. The thesis also contains an explanation of the software used. This project will make it easier to invent and develop good eye trackers.

Keywords
Robotic head; Humanoid eyes; Robot saccades; Saccade movement

Sammanfattning

Applikationer med eyetracking börjar bli mer och mer vanliga. Applikationer som att övervaka en chaufförs fokus på vägen samt beteendet hos kunder i en butik är bara några få exempel. Under lång tid har ögonspårning varit dyr och för komplex för att göra alla möjliga applikationer. Tack vare den senaste tidens explosion i kamerasensorteknik går det att utveckla ögonspårare billigare och bättre. Men att utveckla och testa dessa sensorer kräver avancerade algoritmer. Dessa algoritmer testas sedan av en mängd personer för att bekräfta att de fungerar.

Denna avhandling kommer att försöka se om det är möjligt att göra robotögon som rör sig som en människas ögon. Detta projekt innehåller en detaljerad process för att utveckla ett mänskligt robothuvud. Implementerade rörelser på roboten är saccader och smooth pursuit med potential för vergence-rörelser, vestibulo-okulära rörelser och microsaccades. Robothuvudet skulle kunna användas istället för en person för att testa eyetrackers. Rapporten beskriver hur författaren utvecklat elektroniken för att driva det robotiska mänskliga huvudet. Elektronikschemat diskuteras och utvecklas. Modellen av hur robothuvudet och dess ögon rör sig förklaras och implementeras sedan. I avhandlingen finns även en förklaring av den programvara som används. Med detta projekt kommer det bli lättare att ta fram och utveckla bra eyetrackers.

Nyckelord

RobotHuvud; Humanoida ögon; Robot saccader; Saccade rörelse;

Acknowledgment
I had a great time working at Tobii on this robotic head. I would like to thank my coworker Jennifer Tannfelt Wu for great teamwork. I would also like to thank my supervisor at Tobii, Magnus Smedberg, and all the other super nice and helpful people at Tobii.

I also want to give a big thank you to my supervisor Zhonghai Lu for helping me with this report. It has been really helpful.

Table of Contents

1 Introduction
   Background
   Problem
   Purpose
   Goal
   Benefits
   Ethics and Sustainability
   Methodology / Methods
2 Background theory
   Anatomy of the eye
   Field of view
   Distance between the pupils of the eyes
   Eye movement
      2.4.1 Saccades
      2.4.2 Smooth pursuit
      2.4.3 Vergence movement
      2.4.4 Vestibulo-ocular movement
   Eye trackers
3 Previous work in the field
4 Implementation
   Motor selection
   Mechanical design
   Electronic design
   Controller
   Implementing schematics
   Programming of the system
   Model of system
   Movement planning in robot
   User interface
   End implementation
5 Results of the Project
   Final testing
   Test of saccades
   Test of smooth pursuit
   Test running over a long period of time
   Test of head movements
   Test of two eyes
   Testing with Eye-Tracking
   Sources of errors
6 Conclusions
   The electronics reflection
   Reproducible
   Control reflection
   Goals
   Future improvements
7 References

1 Introduction

Humans have many ways of interacting with the world, and we keep inventing new types of interactions with electronics. For a long time, we have had sensors like microphones to listen to people and buttons to register their intentions. Recently, new technology has made it possible to track the movement of the eyes, but this technology requires high processing power and good algorithms. When new eye-tracking software is developed it must be tested on human eyes. The problem is gathering enough people to get trustworthy results. Even when enough test subjects are found, they are likely to come from only one part of the world, so the test group does not have the diversity you want. This master thesis will try to solve this problem.

Background

Eye-tracking technology has been around for a long time. As early as 1879, Louis Émile Javal [1] observed that the eyes make small jumps when reading. He called these movements saccades. In the beginning, the technology was mainly used in research and for medical purposes. In the 1980s and 1990s eye tracking started being used in market research. The fields in which eye tracking is used are becoming more and more widespread. Knowing where humans look can reveal more than just the gaze point, for example the identity and emotions of a person [2].

Recently, eye-tracking technology has made big leaps forward. Tobii, an eye-tracking company, was founded in 2001 and is located in Danderyd, Sweden. Tobii has three different focus areas. The first is Tobii Tech, which develops technology for gaming and consumer products. The second is Tobii Dynavox, mainly focusing on technology for assistive and healthcare equipment. Lastly, Tobii Pro focuses on technology for research.

Tobii wants an easier way to test its eye trackers. Normally the testing revolves around inviting a lot of different people to try the eye tracker. The uncertainty about whether humans do the right thing during testing makes it necessary to include more people to reduce human errors. Some years ago, Tobii developed a robotic head that was a good step toward reducing the human dependency in testing. This thesis will discuss a robot solution to make a system that is reliable and easy to reproduce for testing eye trackers.


Problem
Is it possible to build a reliable robotic head that can move like a human and be used to test the Tobii eye-tracking cameras?

Purpose
The purpose of the project is to construct a robotic model of a pair of human eyes. These eyes should be able to move like human eyes with high repeatability. The robot will be used to help the company Tobii test its eye-tracking cameras.

Goal
• Construct an electronic system whose repeatability is within 5% of the same movement every time.
• Create an electric system able to provide enough speed and behave like human eyes with the four basic movements [3]:
  o Saccade movements
  o Smooth pursuit movements
  o Vergence movements
  o Vestibulo-ocular movements

The system should be stand-alone and not require any external computers to be functional.

The head should also look cool and flashy.

Benefits
The benefits of developing this robotic head:
• The company can test their algorithms whenever they want, without needing humans to participate in the tests.
• Thanks to the lasers embedded in the eyes, we can measure how far off the calculated move we are. With a human we do not know exactly whether the test instructions were followed. This makes the robot able to run tests that would otherwise require many humans to get the same results.
• The robot will be able to move at higher velocities than human eyes can, which makes the algorithms more robust.

Ethics and Sustainability
A big discussion that needs to be considered is that, in the future, robots will be able to take human jobs. If robots start to take jobs from people, we either need to find new jobs or redistribute wealth. We have a big responsibility for the future, but for this project I believe our robot will be a complement to normal human testing.


If this project becomes a success, more and more testing will be done with robots. That would mean the algorithms are tested against the same-looking eyes. This could make the eye trackers better on eyes similar to the robot's and worse on people who look different. To solve this problem I think this robot needs to be an addition, not a replacement, for humans in testing. In other words, the test rig still needs to use humans to get full diversity. Also, when building these robots there might be an advantage in deliberately giving the robots different faces and eyes.

Methodology / Methods
The project is carried out together with Jennifer Tannfelt Wu, a thesis worker responsible for the mechanical design. First, there is a research stage covering related work in these fields. After the research stage, we discuss and agree on the design we will develop. The electronic design will be made by me, with a focus on making the PCB easy to reproduce. The soldering of the PCB components is also done by me. The user interface will be programmed together with Jennifer. The system will be tested to verify that our goals and requirements are fulfilled.


2 Background theory

Anatomy of the eye
Before explaining what has been done in this project, we first need to explain how a human eye works. The anatomy of the eye is shown in the figure below [4]. Humans interpret the surrounding world through light that enters the eye. The cornea acts like a camera lens that bends the light to direct it into the eye. The iris decides how much light is let through by making the hole into the eye, the pupil, smaller or bigger. The iris has the same functionality as the diaphragm of a camera [5]. Directly after the iris there is a lens that distributes the light onto the back of the eye, called the retina. This lens is the eye's version of the camera's autofocus: it is compressed to change its form, which lets the eye automatically focus on objects both near and far.

Figure 1: Illustration of a side cut eye and name on all parts.


In the end, the light falls on the retina. The retina contains specialized cells called photoreceptors. In the human eye there are two different kinds of photoreceptor cells, called rods and cones. Cones react to light of different frequencies, in other words they detect color. They are mainly located at the focus point of the lens. Cones give us humans a sharp image where we want to look, and they perform best in medium to bright light. Rods provide black and white vision, work well in lower light conditions, and are more sensitive to motion. The rods are distributed all over the retina. More information regarding rods and cones can be found on the Rochester Institute of Technology's [6] web page.

When light reaches the retina, the rods and cones convert the light to electric signals and send them to the part of the brain called the visual cortex. In the visual cortex, the signals are converted to an image for humans to interpret [7].

Field of view
The field of view of a human is highly individual. Human eyes have an approximate field of view in the horizontal plane of 210 degrees. Due to how the face is structured, the vertical range is limited to around 150 degrees [8].

Distance between the pupils of the eyes
The distance between the pupils of the eyes can vary from person to person. The distance is called IPD, which stands for interpupillary distance. Age, ethnicity, and gender are some big reasons for having different IPDs. According to a 2012 survey made in the US Army [9], the differences between males and females are shown in the table below, in units of mm.

Gender | Selection | Mean | Standard deviation | Minimum | Maximum
Female | 1986 | 61.7 | 3.6 | 51 | 74.5
Male | 4082 | 64 | 3.4 | 53 | 77

Table 1: A summary of the US Army survey showing the IPD for females and males, in mm.

Eye movement
There are four basic types of movements for human eyes: saccades, smooth pursuit, vergence, and vestibulo-ocular movements. All these movements are described in the book Neuroscience, 2nd edition, edited by Purves D, Augustine GJ, and Fitzpatrick D [10]. Other movements will not be discussed.


2.4.1 Saccades
Humans do not view a scene by fixating on one steady point. The eyes make quick jumps to focus on interesting points. This jumping of the eye allows our brain to build up a mental 3-dimensional picture of the surroundings. These quick jumps are called saccades. A saccade is a ballistic movement where the eye looks at one object and quickly moves to another object with high speed. According to B. Fischer and E. Ramsperger, it normally takes approximately 200 ms for a human to initiate a saccade [11]. The speed of a saccade can be 20 deg/s for small microsaccades and over 700 deg/s for large saccadic movements [12]. When making big jumps it is common to undershoot the target; in those cases a small corrective saccade must be made to reach the target. Overshoot in a saccadic movement is uncommon.

Figure 2: Illustration of the movement of a typical saccade.

How we model this movement with a compressed exponential will be discussed in the implementation.

2.4.2 Smooth pursuit
To follow an object that moves relatively slowly, the eyes can track the movement of the object. This movement is called smooth pursuit. With smooth pursuit, an eye can follow objects moving up to 30 deg/s [13]. Above this speed the eyes start to make small saccadic moves. Very few people can perform smooth pursuit without a moving object to focus on; usually, when you try, you end up making small saccadic movements.


Figure 3: Three smooth pursuit movements and their corresponding catch-up saccade.

2.4.3 Vergence movement
When objects move back and forth, or are located at different distances from the eyes, the eyes need to follow the object by moving in opposite directions to each other. This is called a vergence movement. In contrast to the other movements, the eyes do not move in the same direction.


Figure 4: Visualization of Vergence movement.

2.4.4 Vestibulo-ocular movement
Vestibulo-ocular movements are the last of the basic movements. This movement stabilizes the eyes by compensating for head movements. When the head moves, the eyes automatically move at the same speed in the opposite direction.

Eye trackers
Eye trackers are used to follow and capture where someone is looking, also called the point of gaze [14]. The gaze vector is the direction from the eyes to the object the eyes are looking at.


Today there are different kinds of eye trackers:
• A physical sensor connected to the eye. One example of an eye tracker connected to the eye is described in "Eye Contact: Scleral Coil Eye Tracking for Virtual Reality" by Eric Whitmire, Laura Trutoiu, Robert Cavin, David Perek, Brian Scally, James O. Phillips, and Shwetak Patel. They place a coil on a lens and combine it with more coils mounted in a VR headset; the direction of the eye can then be calculated [15]. I highly recommend reading about this interesting concept of eye tracking.

• An optically mounted camera pointed at the eyes. Many optically mounted systems today use near-infrared projectors together with a separate high-resolution camera to calculate the point of gaze.

• Electrooculography (EOG) measurements, reading the small electric signals from the muscles around the eyes [16]. In "Electrooculographic guidance of a wheelchair using eye movements codification", Rafael Barea, Luciano Boquete, Manuel Mazo, Elena López, and L.M. Bergasa write about using this method to control a wheelchair.

Tobii is a company that focuses on eye tracking. Their way of getting the point of gaze is to illuminate the eye with invisible near-infrared light and record the reflection of the light with a separate high-resolution camera [17]. A projector directs the near-infrared light towards the middle of the eyes. There are two different ways to illuminate the eye: one is called the dark pupil effect and the other the bright pupil effect [18].

When the camera is placed away from the projector, the reflection from the inside of the lens misses the camera, and the pupil therefore appears dark. Some light will, however, reflect on the cornea and make its way to the camera. The camera also locates the dark pupil. With that information it is possible to calculate the point of gaze. This method is called the dark pupil effect.

The bright pupil effect works by having the illumination as close to the camera as possible. The light enters the eye and is reflected back towards the camera, making the pupil appear bright; when the eye moves there is also a small reflection on the spherical part of the eye, and both reflections are picked up by the camera.

Both dark and bright pupil methods are shown in figure 5.


Figure 5: The dark pupil effect on top and bright pupil on the bottom.

Tobii Pro sells eye trackers for research purposes and also develops wearable glasses to make eye tracking interactive with the world. The Tobii glasses are shown in figure 6. They contain two cameras for each eye and some near-infrared light sources. A high-definition camera is mounted in the middle of the frame to overlay the person's point of gaze on the real world. This master thesis will focus mainly on building a robotic test system compatible with these glasses.

Figure 6: Tobii Pro Glasses.


3 Previous work in the field

Work has been done before on humanoid robots. Robots like Honda's Asimo [19] and the robot Nao [20] have been developed to let humans interact with human-like robots. These robots are made for living among us and communicating with us.

Facial expression robots have also been developed to let robots express feelings. An example is found in the report written by Hiroyasu Miwa, Kazuko Itoh, Munemichi Matsumoto, Massimiliano Zecca, Hideaki Takanobu, Stefano Roccella, Maria Chiara Carrozza, Paolo Dario, and Atsuo Takanishi [21]. They developed a robot able to show some human-like emotions.

Robotic models with human-like eyes mainly exist to carry cameras in the eyes, so the robot can follow an object and interact with the environment. Robotic eyes have also been implemented to make robots look more human, for example the high-priced AI robot Sophia from Hanson Robotics [22]. This robot has an advanced neural network that makes it behave more like us humans; it can analyze what humans say and respond accordingly. A fun fact is that Sophia was granted citizenship in Saudi Arabia at a tech convention [23].

A project more focused on eye movements, and thus more relevant to my thesis, is a sub-project of the RobotCub (iCub) project, where the head of the robot was designed [24]. In the report they also explain the mechanical solution they made for the head and describe the process of building it. Results from their measurements of the head are shown in the table below.

Table 2: The result from the iCub robots eyes.

One big difference between the iCub and our project is that the maximum velocity of the iCub's eye is 180 deg/s, compared to human eye movements with a top velocity of around 700 deg/s. The eyes of the iCub are equipped with one VGA camera in the center of each eye for object tracking and detection. The eyes are not meant to mimic the movement of a human eye.

The iCub is a project that builds a shell robot for others to continue to improve on.


In a report written by Dario Biamino, Giorgio Cannata, Marco Maggiali, and Alessandro Piazza, they describe tendon-driven robotic eyes that can perform human-like movements, called MAC-EYE [25]. The model is a homogeneous sphere with three rotational degrees of freedom, and a camera is embedded in the eye to provide visual feedback. Thanks to the DC motors' internal encoders, the nominal accuracy of the eye is about 0.005 deg. Optical sensors are used to avoid slackness in the tendon lines. The electrical system uses one master microcontroller and one slave microcontroller for every DC motor. The slave microcontrollers interface directly with a motor driver. (The control of the motor speed is implemented by a PI control loop in parallel with a P controller for the tension control loop.) Communication between the microcontrollers runs over the CAN bus (Controller Area Network) [26] protocol at 125 Hz.

Giorgio Cannata and Marco Maggiali wrote a report in 2007, "Models for the design of a tendon driven robot eye", which is highly relevant to my thesis. There they used Listing's Law to determine the orientation of the eye during saccades. Listing's law says: "There exists a specific eye orientation with respect to the head, called primary position. During saccades any physiological eye orientation, with respect to the primary position, can be described by a unit quaternion q whose (unit) rotation axis, v, always belongs to a head-fixed plane, L (Listing plane). The normal to plane L is the eye's direction of gaze at the primary position." According to the authors, Listing's Law can also be valid for smooth pursuit.

There are also two unique solutions I have found for controlling the eyes. The first one uses three different linear motors, in this case piezo actuators. These piezo actuators control a camera and can produce eye-like movement. It is a quite complex solution, but they got it to work very well [27]. The solution is shown in figure 7.

Figure 7: Three piezo actuators controlling a camera [27].


The second solution is a fluid-suspended, electromagnetically driven eye with video capability for animatronic applications [28]. It works by having magnets connected to the eye; thanks to the low friction of the suspension and coils connected to the frame, the movement can be controlled. I think this is one of the cooler solutions I could find, but it would be more difficult to get it to work exactly to our requirements. The eye is shown in figure 8 below.

Figure 8: A fluid-suspension, electromagnetically driven eye with video capability for animatronic applications [28].


4 Implementation

First, we had to figure out how many motors are needed to make all the movements, and what kind of motors. We first considered a design with three motors for the eye movements, in which the vertical movements of the two eyes are coupled. In this design the eyes cannot move separately in the vertical direction. This solution is helpful from the electrical point of view because we do not need as much current when all motors are on at the same time. The downside is that the design cannot make completely independent movements with each eye.

We decided to go for a design that makes it possible to control each eye separately. This design was preferred because it makes it possible to replicate movements like a lazy eye.

Motor selection
To make the eyes move we need motors. Here I will discuss the pros and cons of some common motor types we considered for this project. This is just a brief compilation of that discussion.

• Brushed DC electric motor, one of the most common DC motors. The motor has a static shell consisting of permanent magnets with alternating north and south polarization. The rotating axis has metal armatures connected to it, with a coil wound around each armature, which makes it possible to create an electromagnet on the rotating axis. To synchronize the electromagnet with the rotation relative to the permanent magnets, the current is transferred through brushes to the axis [29], as shown in figure 9.

To control a brushed DC motor we would need to implement an H-bridge or a half H-bridge.

o Pros:
  ▪ This type of motor is widely used and relatively cheap.
  ▪ Easy to use, since only two cables are needed to power the DC motor.
o Cons:
  ▪ Works best at higher rotations per minute; therefore we would need a gearbox.
  ▪ To keep track of where the eye is pointing we need an encoder. Encoders are devices that keep track of how many rotations a motor has made, in other words they let us keep track of the position of the eye. Good encoders are expensive and relatively big.


Figure 9: Illustration of how a brushed DC motor works [29].

• Brushless DC electric motor, a motor where the coils are static and the permanent magnets rotate. A brushless motor is shown in figure 10.

o Pros:
  ▪ Can be small enough to fit inside the head.
  ▪ Less maintenance is needed.
o Cons:
  ▪ Needs an additional electronic control unit, which increases the complexity and price of the project.
  ▪ Works best at higher rotations per minute; therefore we would need a gearbox.
  ▪ To keep track of where the eye is pointing we need an encoder. The same discussion as above regarding brushed DC motors applies to the encoders.


Figure 10: Illustration of how a brushless DC motor works [29].

• Stepper motor, which is like a brushless DC motor except that the coils pull the toothed rotor around in small steps. The rotor turns with the axis. There are two basic winding types of stepper motors: unipolar and bipolar. Unipolar stepper motors power the coils in one direction only; one lead will be magnetically positive and the other magnetically negative, which means only half of the coils can be powered at the same time. Bipolar stepper motors, on the other hand, can drive the coils in both directions, so all coils can be powered at the same time [30].
o Pros:
  ▪ Very precise step size.
  ▪ The position of the axis is exact and easy to know.
  ▪ High torque.
o Cons:
  ▪ High speeds make the torque low and produce high vibrations.
  ▪ Does not have a smooth rotational movement.

• Servo motors, widely used in fields like radio-controlled vehicles. Inside a servo motor there is a small DC motor (usually brushed), a control circuit, a potentiometer, and a gearbox to gear down the rpm of the axis [31]. The potentiometer value is used as feedback to the system on where the shaft is pointing.

o Pros:
  ▪ Servo motors provide high levels of torque at high speeds.
  ▪ Have an embedded encoder sensor, in many cases a potentiometer.
  ▪ The efficiency is around 80%-90%.
o Cons:


  ▪ The embedded encoder and gearbox mean that the servo motor needs more frequent maintenance, and therefore has higher costs.
  ▪ More expensive than stepper motors [32].

• Linear motor, a more expensive motor used in fields such as medical equipment and component placement robots. One type of linear motor works by having a rod with many permanent magnets stacked on top of each other, with polarization alternating between north and south as shown in figure 11. Along the rod there are three electromagnets in a row, connected to a 3-phase current. In this way the direction and speed of the rod can be controlled back and forth by the 3-phase current.


Figure 11: The functionality of a linear motor.

o Pros:
  ▪ Very precise and small step size.
  ▪ High-speed movements.
  ▪ High force on the rod.
o Cons:
  ▪ Expensive.
  ▪ Needs a control circuit to generate the 3-phase current.
  ▪ Bigger than the other motors in the list.

The smallest linear motor from Faulhaber was very promising but a bit too expensive and takes too much space in the head [33]. We decided on servo motors. A servo can be driven in many ways. The most common way to control a servo motor is to send pulse width modulation (PWM [34]) signals to the servo; a less common way is to send commands via a serial bus [35].

The method we decided to use is PWM, because PWM signals are easy to generate from microcontrollers.

Before choosing servos, we made a test. The test was meant to verify that the speed, power consumption, and step size are sufficient for the project. A laser was glued onto each servo, and the servo was placed at different distances from a whiteboard. In the first test, the motor was placed 177 cm from the center of the whiteboard, orthogonally to it. The motor then moved the laser dot 25 cm across the board, and in the second test the laser dot moved 100 cm. The whiteboard was filmed at both 400 fps and 1200 fps. For the HSG8315BH we used 500 Hz as refresh frequency, and 50 Hz for the D930SW.


Figure 12: Illustration of how the test of servos was set up.

The result of the test is summarized in the table below.

Servo motor | Distance [cm] | Vmax [°/s] | Video frame rate [fps]
D930SW | 25 | 227 | 400
D930SW | 25 | 258 | 1200
D930SW | 100 | 753 | 400
D930SW | 100 | 776 | 1200
HSG8315BH | 25 | 474 | 400
HSG8315BH | 25 | 582 | 1200
HSG8315BH | 100 | 1289 | 400
HSG8315BH | 100 | 1293 | 1200

Table 5: The result of the ”test of servos”.

The servos we chose are the HSG8315BH for the eye movements and the D930SW for the head movements. The specifications of both servos are shown in the tables below.

HSG8315BH (servo for eye movements)
Refresh rate: 200~560 Hz
PWM range: 760 ~ 1020 usec
Dimensions: 1.57" x 0.79" x 1.50" (40 x 20 x 38 mm)
Product weight: 2.12 oz (60.0 g)
Voltage range: 6.0 V - 7.4 V
No-load speed (6.0 V): 0.05 sec/60°
No-load speed (7.4 V): 0.04 sec/60°
Stall torque (6.0 V): 60 oz/in (4.3 kg·cm)
Stall torque (7.4 V): 74 oz/in (5.3 kg·cm)
Operating temperature: -20°C to +60°C
Continuous rotation modifiable: No
Motor type: Coreless
Output shaft support: Dual ball bearings
Gear material: Karbonite and aluminum
Wire length: 11.81" (300 mm)

Table 3: The specification for the servo motor HSG8315BH which was used in the system for eye movements.

D930SW (servo for head movements)
Resolution: 4096
Operating voltage range: 4.8 V ~ 7.4 V
Speed (sec @ 60°): 0.11 ~ 0.07
Maximum torque range: 112 ~ 182 oz/in (8.0 ~ 13.0 kg/cm)
Current draw at idle: 30 mA
No-load operating current draw: 500 mA
Stall current draw: 5,200 mA
Dead band width: 1 µs
Dimensions: 40.0 x 20.0 x 37.0 mm
Weight: 2.33 oz (66 g)
Circuit type: 32-bit programmable digital
Motor type: Coreless, metal brush
Gear material: MK 1st gear & 3 steel alloy gears
Bearing type: Dual ball bearing
Case material: Plastic / aluminum

Table 4: The specification for the servo motor D930SW, which was used in the system for head movements.

The test results show that the servos we have chosen fulfill the requirements of the project.

Mechanical design
The mechanics of the head are discussed in more depth in the thesis of my co-master-thesis student Jennifer Tannfelt Wu; only a brief explanation is given here. Together we made sure that all the components fit inside a styrofoam head. The PCB inside the head had to be at most 145 mm long and 100 mm wide.

To make the head stable we made a bottom plate in plastic. To let the head rotate with as little friction as possible, a big bearing was made and connected to the head. In the middle of the rotation center of the head, a servo motor is screwed down upside-down with a metal rod pointing straight up through the head. The whole aluminum chassis, with both eye modules and the PCB, is connected to that rod. Each plastic eye is connected to two servo motors: one for rotating the eye in the horizontal plane and one for the vertical movements. The eyes are built as modules, so the distance between the eyes can be changed, which makes it possible to mimic many more different humans. The interpupillary distance of a grown-up human varies between roughly 51 mm and 77 mm [36], and our interpupillary eye distance can cover that range. A 3D image of the head is shown in figure 12.

Figure 12: The mechanics of our system.

In the back of the head the PCB is slotted in vertically, which makes it easy to slide it up and down.


Electronic design
One demand from Tobii was to make the robot head easy to replicate within a reasonable budget. Therefore I decided to create a PCB with as many surface-mounted components as possible.

One of the goals has been that the robot should be flashy and impressive, so it can potentially be brought to fairs in the future. It was also important to provide some inputs to the head. The robot should be usable with and without wifi and be quick to demonstrate some of its functionality. I decided to develop two circuit boards: one for input from users around the head, and one for controlling the servos inside the head.

To make the head flashy it was important to add LEDs. The LEDs were placed to shine down onto the mechanics in the head. We decided to go for two bright RGB LEDs. The LEDs that were chosen have 4 pins on the package: one pin must be connected to 5 V and the remaining pins correspond to the red, green, and blue channels. These three color pins are connected to three PWM pins on the microcontroller. By connecting the colors of the LED to PWM signals we can choose the intensity at which each of the three colors shines, which allows us to light up the head in all the colors that red, green, and blue can be mixed into. To control the LEDs in the head we must drive them with a signal between the high voltage and 100/255 of the high voltage, where the high voltage corresponds to LED off and 100/255 of the high voltage corresponds to LED on.

Controller
To perform a saccade and the other movements we need to do many calculations ahead of time, before sending signals to the servos, to make the system as fast as possible. The head also needs to work independently, which requires a microprocessor or microcontroller inside the head for all the calculations. Either the calculations are processed in the controller that is directly connected to the servos, or another device pre-calculates and sends all signals to the servo controller.

One of our big goals was to make this project easy to take over after we are done with the thesis. Therefore we mainly looked at controllers that are widely used, such as the Raspberry Pi, Arduino, Teensy, and Intel Edison.

We need a processor to make the calculations beforehand, with the possibility of wifi and Bluetooth. A Raspberry Pi was chosen because many engineers at Tobii have experience with it. A Raspberry Pi is a credit-card-sized computer developed by the Raspberry Pi Foundation [37]. For the calculations and the user interface I decided to use Python 3, because it is easy to understand and has many useful open-source libraries. The Raspberry Pi has only 2 PWM ports, which is not enough to control 5 servos. In addition to the Raspberry Pi, we therefore need a controller that can output 5 PWM signals for all motors and communicate with the Raspberry Pi.

We looked at an I2C PWM driver like the PCA9685 [38]. With this chip we can get up to 16 channels to drive servos. The pros of this solution are that it is easy to implement and that I2C does not require many pins from the controller. The con is the resolution: the PWM signal has 12-bit resolution over 0-100% duty cycle, while the servo only reacts within the 760 ~ 1020 usec PWM range. At 500 Hz the period is 2 ms, which means we only get (2^12/2000) · (1020 − 760) ≈ 532 steps for the 180-degree rotation of the servo. The I2C PWM driver therefore cannot make the small steps that we need.

Instead, the microcontroller ATMEGA2560 [39] was chosen, a controller that has enough PWM pins to control all the servo motors and some additional pins for other uses. The microcontroller has four 8-bit PWM channels and twelve PWM channels with programmable resolution from 2 to 16 bits. A 16-bit PWM channel will work fine for our case. Five PWM pins with a 16-bit timer are routed out to control the signal pin of every servo motor. Following the calculation above, a 16-bit timer gives us (2^16/2000) · (1020 − 760) ≈ 8519 steps in the servo movement over 180 degrees, which is enough. A small sanity check of this arithmetic is sketched below.
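The following minimal sketch repeats the step-count arithmetic above in Python; the 2 ms period and the 760-1020 usec usable pulse range are the values stated in the servo specification, and the function name is illustrative only.

```python
# Sanity check of the servo step resolution. Assumes a 2 ms (500 Hz) period
# and the 760-1020 usec pulse range from the servo specification.
def servo_steps(counter_bits, period_us=2000, pulse_min_us=760, pulse_max_us=1020):
    counts_per_us = (2 ** counter_bits) / period_us   # timer counts per microsecond
    return int(counts_per_us * (pulse_max_us - pulse_min_us))

print(servo_steps(12))  # ~532 steps  -> too coarse for our needs
print(servo_steps(16))  # ~8519 steps -> fine enough resolution
```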

The ATMEGA2560 is the same microcontroller that is used in the Arduino Mega. Because of this, I decided to use the Arduino bootloader and the Arduino libraries for easy implementation.

To take inputs such as buttons and drive external LED indicators, a PCB was made to be placed outside of the head. The microcontroller chosen to take care of all the inputs is the ATMEGA328P, the same microcontroller as in the Arduino Uno, so an Arduino bootloader could be used for that chip as well. The Arduino IDE is widely used in both hobby and school projects and has a huge community, which will help Tobii add functionality to the head quickly in the future.

To send information from the Raspberry Pi to the ATMEGA2560 in the head we need a communication protocol. The protocols that can be used by both the ATMEGA2560 and the Raspberry Pi are SPI, I2C, and serial communication, in addition to plain interrupt lines. Pins for all of these have been prepared in the layout, in case someone decides they are needed.

Serial Peripheral Interface, SPI [40]: a protocol used to send data between microcontrollers, sensors, SD cards, and more. SPI works by one device taking the role of master and the other device the role of slave. The master provides a clock to the slave on the SCK wire for the transmission. The two wires that carry the bit transactions between the devices are MISO (master in, slave out) and MOSI (master out, slave in). On every rising edge of the clock, the MISO and MOSI wires are sampled to see whether they are high or low.

Inter-Integrated Circuit, I2C [41], is widely used for short-distance microcontroller communication. I2C is a communication bus with two wires, clock (SCL) and data (SDA), and 7-bit addressing. The bus has two different roles for the nodes: master and slave. A master node generates the common clock frequency and initiates communication with the slaves. Slaves receive the clock and respond when the master calls their address. When a connection has been made, information can go back and forth. There are many different operating modes with different speeds; all modes support 100 kbit/s and some special modes go up to 5 Mbit/s.

The 1-Wire communication bus provides low power consumption and low speed over a single wire. 1-Wire is similar to I2C except that it only uses one wire, which results in a slower transaction rate.

Serial communication [42] is a method of sending data one bit at a time. In our case, 2 cables are needed for communication, TX and RX. The speed of transferring data is called the baud rate; standard rates in bits per second are 1200, 2400, 4800, 19200, 38400, 57600, and 115200. Higher speeds are risky and may cause errors when receiving.

Figure 13: A good explanation of how the serial bus works [42].

I connected serial links on the PCB between both microcontrollers and the Raspberry Pi, as illustrated in figure 15. I think a baud rate of 115200 or less will be enough.

Input µC <-> Serial <-> Head µC <-> Serial <-> Raspberry Pi

Figure 15: The communication between the PCBs and the Raspberry Pi in the head.


Implementing schematics
Decoupling (bypass) capacitors are placed on every power pin of all the ICs on both PCBs. The decoupling capacitors I placed are around 100 nF and help suppress high-frequency noise in the power supply signals. This noise can be harmful to the ICs if decoupling capacitors are not used.

Figure 16: The system with necessary decoupling capacitors.

All the pins from the Raspberry Pi and the ATmegas that are not currently in use have been routed out to pin headers, making them available for future improvements.

I have also placed the footprint of an SN754410 [43] (quadruple half-H driver) that can be soldered on to power a stepper motor or a brushed DC motor. We planned to make it easy to change the distance between the eyes by using a motor.

Figure 17: The chips pinout of SN754410 [43].

Stepping a voltage down with buck converters is more efficient than boosting a lower voltage up, so I decided on 12 V as the input voltage of the boards; from 12 V I can regulate down to the levels I need. 12 V is a voltage level widely used in industry worldwide. I placed a 12-volt power barrel jack on the outside PCB, from which the power is routed directly to the head PCB where both voltage regulators are located. A five-volt cable is routed back to power the PCB outside of the head.

On the circuit board we need a voltage level of 5 V for the microcontrollers and the Raspberry Pi, and 7.4 V for powering the servos. During testing, the servo we use consumed a peak current of 1.4 A at full speed, so five servos moving at full speed require 7 A. Voltage regulators that can provide 7 A easily become hot and overheat. We therefore decided that the servo rotating the head does not have to move as fast and will never be used at full speed at the same time as the eye servos. With that in mind, a voltage step-down regulator like the TPS5450 [44] was used, which has an input voltage range between 5.5 V and 36 V. It has an adjustable output voltage down to 1.22 V with 1.5% initial accuracy, and it is capable of up to 5 A continuous (6 A peak) output current. To design the voltage regulator to produce 7.4 V we follow the formula from the datasheet:

R2 = (R1 · 1.221) / (Vout − 1.221)

Using R1 = 10 kΩ gives R2 ≈ 2 kΩ. This calculation is repeated in the small sketch below.
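A minimal sketch of the datasheet formula above, with the 7.4 V target and R1 = 10 kΩ chosen in the text; the function name is illustrative only.

```python
# Feedback resistor for the TPS5450 buck regulator, from the datasheet formula
# R2 = R1 * 1.221 / (Vout - 1.221). Values are the ones chosen in the text.
def feedback_r2(r1_ohm, v_out, v_ref=1.221):
    return r1_ohm * v_ref / (v_out - v_ref)

print(feedback_r2(10_000, 7.4))  # ~1976 ohm, i.e. roughly 2 kOhm
```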

Following the datasheet, I connected a 0.01 µF low-ESR ceramic capacitor between the BOOT pin and the PH pin of the voltage regulator. This capacitor provides the gate-drive voltage for the high-side MOSFET.

The datasheet recommends a minimum decoupling capacitance of 4.7 µF for the input capacitor, and the capacitor needs to be a high-quality ceramic type. Figure 18 shows the X5R or X7R type capacitors I used.

Figure 18: The voltage regulator.

For the inductor, a beefy MSS1278-153MLD [45] was selected, which can handle the current according to the datasheet.

For the catch diode I chose the B540C, connected according to the datasheet's suggestion.

The five-volt voltage regulator needs to be able to power both the Raspberry Pi and the microcontrollers. The current consumption of the Raspberry Pi 3B+ [46] is almost 1 A at 100% CPU load, and the Raspberry Pi Foundation recommends a 2.5 A, 5 V power supply [47]. The microcontrollers consume around 100 mA each [48]. In total, the supply for the microcontrollers and the microprocessor needs to provide more than 2.7 A. I decided to use a fixed 5 V voltage regulator, the LM2596-5.0 [49], which can output up to 3 A. I followed the implementation in the datasheet and its selection guides for the necessary components, which gave me the design shown in the figure below.

Figure 19: The schematics of the 5V power regulator.

To make the PCB outside of the head interactive with the outside world, we added 4 buttons, a joystick, and two LED indicators.

Every time a button is pressed, a hardware interrupt in the microcontroller is activated. The ATMEGA328P has only 2 hardware interrupt pins, so in order to figure out which button has been pressed, additional analog inputs are used. When an interrupt has occurred, the analog input pins are read to check which of the buttons was pressed, indicated by a high voltage.

The joystick works by having two 10 kΩ potentiometers connected to the stick at an angle of 90 degrees to each other. Every time you move the stick you get an analog value corresponding to how much the stick is tilted in each direction. There is also a button underneath the joystick, and every time you press the joystick down a hardware interrupt is activated in the microcontroller. Two LED indicators have been placed to blink in different patterns so the user can tell which button has been pressed and which state the ATMEGA328P is currently in.


The complete system is shown in the system chart below.

Figure 21: The two different PCBs and how the controllers and processors communicate with each other. The figure also shows which extra electronics are connected to what.

The schematics and the board layout developed in this project are placed in Appendix A.


Programming of the system
Both the ATMEGA328P and the ATMEGA2560 have been programmed with the Arduino bootloader. The fuses for the ATmegas are shown and explained in the two tables below. The meaning of each fuse can be read in the datasheet for the corresponding microcontroller.

ATMEGA2560 (Low = 0xFF, High = 0xD8, Extended = 0xFD)
Bit | Low fuse | High fuse | Extended fuse
7 | CKDIV8 (divide clock by 8) | OCDEN (enable OCD) | -
6 | CKOUT (clock output) | JTAGEN (enable JTAG) | -
5 | SUT1 (select start-up time) | SPIEN (X) (enable serial programming and data downloading) | -
4 | SUT0 (select start-up time) | WDTON (watchdog timer always on) | -
3 | CKSEL3 (select clock source) | EESAVE (EEPROM preserved through chip erase) | -
2 | CKSEL2 (select clock source) | BOOTSZ1 (X) (select boot size) | BODLEVEL2 (brown-out detector trigger level)
1 | CKSEL1 (select clock source) | BOOTSZ0 (X) (select boot size) | BODLEVEL1 (X) (brown-out detector trigger level)
0 | CKSEL0 (select clock source) | BOOTRST (X) (select reset vector) | BODLEVEL0 (brown-out detector trigger level)

Table 6: The fuses for the ATMEGA2560. The (x) shows the ones I used.

ATMEGA328P (Low = 0xFF, High = 0xDA, Extended = 0xFD)
Bit | Low fuse | High fuse | Extended fuse
7 | CKDIV8 (divide clock by 8) | RSTDISBL (external reset disable) | -
6 | CKOUT (clock output) | DWEN (debugWIRE enable) | -
5 | SUT1 (select start-up time) | SPIEN (X) (enable serial programming and data downloading) | -
4 | SUT0 (select start-up time) | WDTON (watchdog timer always on) | -
3 | CKSEL3 (select clock source) | EESAVE (EEPROM preserved through chip erase) | -
2 | CKSEL2 (select clock source) | BOOTSZ1 (X) (select boot size) | BODLEVEL2 (brown-out detector trigger level)
1 | CKSEL1 (select clock source) | BOOTSZ0 (select boot size) | BODLEVEL1 (X) (brown-out detector trigger level)
0 | CKSEL0 (select clock source) | BOOTRST (X) (select reset vector) | BODLEVEL0 (brown-out detector trigger level)

Table 7: The fuses for the ATMEGA328P. The (x) shows the ones I used.


A high-resolution PWM signal is generated from the ATMEGA2560 to control the servo motors. The PWM signals in the ATMEGA2560 use 16-bit timers. The output pins that are used and their timers are shown in table 8 below.

Port pin | Pin nr | Timer
PB5 | 24 | OCR1A
PH5 | 17 | OCR4C
PL3 | 38 | OCR5A
PL4 | 39 | OCR5B
PL5 | 40 | OCR5C

Table 8: The output pins that are used and their timers for the ATMEGA2560.

The PWM mode used is based on dual-slope (phase-correct) operation, which means the timer counts up to the TOP value (ICRn) and then back down. The timer control registers A and B are called TCCRnA and TCCRnB. The parameters used in the microcontroller software are shown in the table below, and a small check of the resulting frequencies is sketched after the table.

Timer | Hz | TCCRnA | TCCRnB | ICRn
OCR1 | 50 | 136 | 18 | 20000
OCR4 | 500 | 168 | 17 | 16000
OCR5 | 500 | 168 | 17 | 16000

Table 9: Timer parameters for the register needed.
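The frequencies in Table 9 can be checked against the standard relation for dual-slope (phase-correct) PWM on the AVR, f = f_clk / (2 · prescaler · TOP), assuming the 16 MHz clock of the ATMEGA2560 and the prescalers selected by TCCRnB (8 for timer 1, 1 for timers 4 and 5); this is only a check of the arithmetic, not the actual firmware.

```python
# Check of the PWM frequencies in Table 9, using the AVR phase-correct PWM
# relation f = f_clk / (2 * prescaler * TOP). A 16 MHz clock is assumed.
F_CLK = 16_000_000

def pwm_freq(prescaler, top):
    return F_CLK / (2 * prescaler * top)

print(pwm_freq(8, 20000))   # timer 1:        50 Hz (head servo)
print(pwm_freq(1, 16000))   # timers 4 and 5: 500 Hz (eye servos)
```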

All movement-related calculations are handled by the Raspberry Pi. One byte can only carry 0-255 different values, which is not enough for the precision we want, so we send two bytes per motor, giving 0-65535 different position values per motor. The packet sent from the Raspberry Pi or the input PCB therefore needs to be 10 bytes: the first 2 bytes represent the value for motor 1, the next two bytes the value for motor 2, and so on. An explanation is shown in figure 22.

M1 M1 | M2 M2 | M3 M3 | M4 M4 | M5 M5   (Raspberry Pi -> ATMEGA2560)

Figure 22: The packet of bytes being sent between the Raspberry Pi and the ATMEGA2560. It is important that sending and receiving take less than 2 ms.

We need a high baud rate, because otherwise the data can queue up, resulting in missed values. The baud rate used is 115200, to make sure the data fits on the bus. A sketch of how such a packet can be assembled is shown below.
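A minimal sketch of the 10-byte packet on the Raspberry Pi side using the pyserial library; the serial port name and the big-endian byte order are assumptions of mine, not taken from the actual thesis code.

```python
# Sketch of the 10-byte motor packet (2 bytes per motor, 5 motors).
# Port name and big-endian byte order are assumptions.
import struct
import serial  # pyserial

def send_positions(port, positions):
    """positions: five 16-bit servo values (0-65535), one per motor."""
    assert len(positions) == 5
    packet = struct.pack(">5H", *positions)   # 10 bytes total
    port.write(packet)

ser = serial.Serial("/dev/ttyAMA0", baudrate=115200, timeout=0.1)
send_positions(ser, [12000, 13000, 12500, 12800, 30000])
```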


The ATMEGA2560 is programmed to listen to the corresponding serial bus whenever an interrupt is triggered. After each packet is received, the values are sent directly to the motors. The Raspberry Pi has higher priority than the outside microcontroller, the ATMEGA328P, which means that the Raspberry Pi can take over control at any point.

Model of system
We model the system to make it easier to understand. The robot is placed at the origin (0, 0, 0). When the robot looks straight ahead, the space vector that defines the eyesight is (1, 1, 1), and when the eye looks 90 degrees to the side the vector is defined as (1, 0, 0).

If we want the eyes to move in a straight line on a projected wall in front of the head, we cannot just change one servo motor at a time, because that produces a bent (bowed) line on the wall. In that case the system follows the green line in figure 23.

Figure 23: Illustration of a problem with getting straight lines projected on a wall when just moving one motor in the eye.

Instead, we need to follow the great circle, shown as the red line.

The great-circle distance is the distance measured along the surface of the sphere between two points. The picture below shows the great-circle distance in red between point P and point Q [50].

Figure 24: Illustration of the great circle.

The great circle projected on a flat wall is a straight line. To take steps along the great circle, Rodrigues' rotation formula was used. That gives us vectors in the directions we need to follow, from the starting direction to the end [51]. The vectors are used as steps for our robot. According to Wolfram, "Rodrigues' rotation formula gives an efficient method for computing the rotation matrix corresponding to a rotation by an angle θ about a fixed axis specified by the unit vector ω̂ = (ωx, ωy, ωz) ∈ ℝ³."
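A sketch of how intermediate gaze directions can be generated along the great circle with Rodrigues' rotation formula, using NumPy; the function names are illustrative, not those of the thesis code, and the sketch does not handle the degenerate case of parallel start and end vectors.

```python
# Sketch: step along the great circle between two gaze directions using
# Rodrigues' rotation formula. Names are illustrative only.
import numpy as np

def rodrigues(v, axis, theta):
    """Rotate vector v by angle theta (radians) around the unit vector axis."""
    axis = axis / np.linalg.norm(axis)
    return (v * np.cos(theta)
            + np.cross(axis, v) * np.sin(theta)
            + axis * np.dot(axis, v) * (1.0 - np.cos(theta)))

def great_circle_steps(start, end, n_steps):
    """Unit gaze vectors evenly spaced on the great circle from start to end."""
    start = start / np.linalg.norm(start)
    end = end / np.linalg.norm(end)
    axis = np.cross(start, end)                          # rotation axis normal to both
    angle = np.arccos(np.clip(np.dot(start, end), -1.0, 1.0))
    return [rodrigues(start, axis, angle * k / n_steps) for k in range(n_steps + 1)]

steps = great_circle_steps(np.array([2, -1, 6.0]), np.array([-1, 1, 6.0]), 50)
```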

Movement planning in robot
The modulation of a saccade is explained in the report written by Peng Han, Daniel R. Saunders, Russell L. Woods, and Gang Luo, where they write about saccades [52]. I highly recommend reading that report before continuing. In it, they model a saccadic eye movement using the equation below:

f(t) = p1 · [1 − exp(−(t/p2)^p3)]

p1, p2, and p3 were determined by data fitting. p1 corresponds to the amplitude of the saccade, changing p2 changes the duration of the saccade, and the parameter p3 is, according to the report, used to control the decay trend.
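The compressed-exponential model is straightforward to express in code; a minimal sketch, with p1, p2, p3 as the fitted parameters described above and the function name my own.

```python
# The saccade position model f(t) = p1 * (1 - exp(-(t/p2)^p3)) from [52].
import numpy as np

def saccade_position(t_ms, p1, p2, p3):
    """Eye rotation (degrees) t_ms milliseconds after saccade onset."""
    return p1 * (1.0 - np.exp(-(t_ms / p2) ** p3))
```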

By using a Tobii eye-tracking camera (Tobii Pro Spectrum 600) we measured two people's eye movements while they were watching a nature movie on a computer. From the eye-tracking data we could save all the saccades the eyes made during the movie; in total, more than 100 saccades for each person. We then ran a curve fit in MATLAB on each saccade. Some of the saccades, corresponding to my own eyes, are shown in figure 25 below.


Figure 25: 8 graphs of measured saccades and their corresponding curve fit.

All curve fits were examined, and the saccades with bad fits were removed, for example saccade nr 17 in figure 25. Saccades that moved less than 2.5° or more than 100° were also removed. Our goal with the curve fitting is to find a connection between long and short saccades and to define our parameters.


The p1 parameter is plotted against the rotation of the eye in degrees, also called the saccade amplitude ∆θ. We find an almost linear relation to the length of the movement in degrees.

Figure 25: A linear fitting of P1-values.

p2 was then plotted against p1, with an acceptable linear fit.

Figure 26: P2-values relative the P1-values in a graph.


We also looked for a correlation for the parameter p3. We know that p3 controls the decay of the saccade, but no clear correlation to a single parameter was found. With the help of different graphs we settled on an approximate value of p3, see the figure below.

Figure 27: Different ways of getting the P3 value.

Lastly, the duration of the saccade movement (∆t) was plotted against p2. From this plot we could find a correlation that explains ∆t from p2.

Figure 28: Saccade duration relative to P2-values.


A summary of our saccade parameters for both test persons is shown in the table below.

Parameter | Person 1 | Person 2
Equation for p1 | 1.0317·∆θ − 0.06982 | 0.99744·∆θ − 0.05799
Equation for p2 | 1.2864·p1 + 17.102 | 1.3571·p1 + 16.9803
Mean value of p3 | 2.9425 | 2.7872
Saccade duration ∆t | 1.5942·p2 + 8.8498 | 1.0349·p2 + 29.5163
Removed saccades | 35.30% | 55.40%
Number of saccades | 306 | 327
Eye tracker precision | 0.15° | 0.08°
Eye tracker accuracy | 0.08° | 0.24°

Table 10: The parameters we used for achieving a saccadic model.

The accuracy of test person two is higher, which is why we will use that person's parameters to control the eyes of the robot. To make sure we get the complete amplitude of the saccade we set p1 equal to ∆θ, so the movement does not stop too early or too late.
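A sketch of how the person-2 equations in Table 10 can be turned into a trajectory generator, with p1 forced equal to ∆θ as described above; the 2 ms sample time matches the 500 Hz update rate used elsewhere in the thesis but is my assumption here.

```python
# Generate a saccade trajectory from the person-2 parameters in Table 10.
# p1 is forced to the amplitude delta_theta, as described in the text.
import numpy as np

def saccade_trajectory(delta_theta, dt_ms=2.0):
    p1 = delta_theta                               # full amplitude, no under/overshoot
    p2 = 1.3571 * p1 + 16.9803
    p3 = 2.7872
    duration = 1.0349 * p2 + 29.5163               # saccade duration in ms
    t = np.arange(0.0, duration, dt_ms)
    return p1 * (1.0 - np.exp(-(t / p2) ** p3))

angles = saccade_trajectory(15.0)                  # 15-degree saccade, one sample per 2 ms
```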

Modulation of smooth pursuit: the method I use is to define a start point and an end point and let the robot move in small steps between the points. The duration of each step corresponds to the speed. A small sketch of this idea is shown below.
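A sketch of the smooth-pursuit stepping, reusing the great_circle_steps helper from the model-of-system sketch; deriving the step count from the requested speed and a 500 Hz update rate is my assumption, not the exact thesis implementation.

```python
# Smooth pursuit as many small steps along the great circle between two gaze
# directions. Assumes the 500 Hz (2 ms) servo update rate used elsewhere.
import numpy as np

def smooth_pursuit_steps(start, end, speed_deg_s, update_hz=500):
    start = start / np.linalg.norm(start)
    end = end / np.linalg.norm(end)
    total_angle = np.degrees(np.arccos(np.clip(np.dot(start, end), -1.0, 1.0)))
    n_steps = max(1, int(round(total_angle / (speed_deg_s / update_hz))))
    return great_circle_steps(start, end, n_steps)   # helper from the earlier sketch
```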

User interface

To be able to run the head we need a user interface. I have asked around at Tobii how people would like to use the head. Below I describe the functionality of all the programs we produced.

o User interface move planner
On the start page of the program you have the option of going to 3 different pages. On the first of those pages there is an option to write a start unit vector and an end unit vector. (0,0,0 means the eyes will look straight ahead.) There is an option to choose which movement the eyes will do (saccade or smooth pursuit). The "rest" textbox is in milliseconds and indicates how long after the move the robot stays at the end position before it moves on to the next move. The input speed of the eyes is only used when the movement is smooth pursuit. We can also define how much the head shall move during the eye movement: the head will rotate from "H1" degrees to "H2" degrees with a speed of "HSpeed" degrees/sec. Pressing the save button saves all the information above into a database stored on the Raspberry Pi.


When pressing the button "read move", all the saved moves from the database are shown in the big rectangle at the bottom of the form and on the side. A figure appears that shows the movement on a virtual plane 60 cm from the head, as shown in the picture below.

Figure 29: The ”save move” user interface in my program.

The second page contains the buttons for turning the laser inside each eye on and off. The lasers can be toggled on and off individually.

The third page is where you can inspect the moves in a three-dimensional view; after all the movements have been inspected you can run them as a sequence on the head. All the movements from this form are saved in a .txt file with all servo motor values, so the move can be re-run.
o Run from txt file
Txt files that have 5 values on every row can be handled by the program. The first four values correspond to the eye servo motor positions, and the fifth value is the position of the head servo motor. The program runs the rows from the file at 500 Hz, which means that every 2 ms all five values are sent to the microcontroller controlling the motors. If there is only a zero on a row, the robot stops at the last position and waits until someone presses the enter button connected to the Raspberry Pi. A small sketch of this behavior is shown below.
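A minimal sketch of the txt-file playback under the assumptions above; the send_positions helper is hypothetical (see the packet sketch earlier), and the pause is simplified to a console prompt instead of the physical enter button.

```python
# Sketch of "run from txt file": stream one row of five servo values every 2 ms
# (500 Hz). A row containing only "0" pauses until the user presses enter.
import time

def run_from_file(path, send_positions):
    with open(path) as f:
        for line in f:
            values = [int(v) for v in line.split()]
            if values == [0]:
                input("Paused - press enter to continue")   # simplified pause
                continue
            send_positions(values)                          # five servo values
            time.sleep(0.002)                               # 500 Hz update rate
```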


o Go to position
This program asks for all the motor values one by one, and after all the values are set, the servos in the robot head move to that position.

o Representational state transfer API
There was a wish for a program where HTTP methods like GET and POST can be sent to the unit to control it. Therefore I installed Flask on the Raspberry Pi and programmed a demo program that can easily be extended into a full REST API. A minimal sketch of what such an endpoint can look like is shown below.
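This sketch only illustrates the idea; the route name, the JSON fields, and the move_to helper are hypothetical and not the actual demo program on the Raspberry Pi.

```python
# Minimal Flask sketch of a REST endpoint for commanding a gaze move.
# Route name, JSON fields, and the move_to() helper are hypothetical.
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/move", methods=["POST"])
def move():
    data = request.get_json()
    target = data["target"]                    # e.g. [2, -1, 6] gaze direction vector
    movement = data.get("movement", "saccade")
    # move_to(target, movement)                # hand over to the motion planner
    return jsonify(status="ok", target=target, movement=movement)

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```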

End implementation
The eyes used in the project are Tobii's own artificial eyes, which are documented and compared with other artificial eyes in "A study of artificial eyes for the measurement of precision in eye-trackers" [53]. The eye is designed to accurately simulate a human eye, and the eyes we used are hand-painted. Figure 30 shows the lens, anterior chamber, and reflective retina of an eye similar to the ones we have [54].

Figure 30: How the doll eye is built up.


The resulting head is shown in the pictures below.

Figure 31: One eye module

Figure 32: The end result of the head

Figure 32: The complete head


5 Results of the Project

The results of my electrical design of the four-layer PCB are placed in Appendix B. To verify that the head fulfills the above-mentioned goals, we decided to run the main tests without the help of a Tobii eye tracker, since the head is meant to be used for testing eye trackers in the future.

Final testing
In a separate room, we placed the robot in front of a camera. All the lights in the room were turned off and only the lasers inside the eyes were lit, so only the laser dot is visible to the camera. The robot was placed straight in front of a wall at a distance of 162 cm. To make sure the head was facing the wall orthogonally, we measured equal distances on both sides of the head and adjusted until it was in the right spot. The test rig is shown in the figure below.

Figure 33: A picture of the final testing setup.

Figure 34: Picture taken from the camera that filmed the tests.

The robot was tested by turning the eyes to look in three different gaze direction vectors: [2 -1 6], [-1 1 6] and [2 1 6]. Different eye movements were tested between these vectors. The length of the diagonal path was measured to be 99.7 cm, the horizontal path 89 cm and the vertical path 39.7 cm. The screen projection of the triangle, taken from the robot's user interface, is shown in the figure below.

Figure 35: How the movement looks according to the prior calculations in our user interface program.

After the test, the film was analyzed on an external computer. OpenCV was used to detect the red dot and save its x, y coordinates so we could analyze the results.
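A minimal sketch of the dot detection, assuming a dark frame in which the red laser dot is the only strongly red region; the threshold value and the centroid-based method are assumptions, since the exact analysis code is not listed in the report:

```python
import cv2

def track_laser_dot(video_path):
    """Return a list of (x, y) pixel coordinates of the laser dot per frame."""
    cap = cv2.VideoCapture(video_path)
    coords = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # The room is dark, so the red laser dot dominates the red channel
        _, _, red = cv2.split(frame)
        mask = cv2.inRange(red, 200, 255)           # threshold value is a guess
        m = cv2.moments(mask)
        if m["m00"] > 0:                            # centroid of the bright area
            coords.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    cap.release()
    return coords
```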

Test of saccades
One eye laser is lit and the eye moves in a triangle with corners at the vectors [2 -1 6], [-1 1 6] and [2 1 6]. Every step is a saccade movement. This test is made to measure whether the system can move like a saccade. The test is filmed at 1200 fps.



Figure 36: One triangle from the Saccade test.

From this graph we can calculate the maximum velocity and acceleration of each move. This is done by taking four points in the middle of the graph and, knowing the time between the samples, calculating the top velocity and acceleration (a sketch of this calculation follows after the table). The results for the diagonal, horizontal and vertical saccades are shown in the table below.

              Diagonal        Vertical        Horizontal
Velocity      ~1500 °/s       ~850 °/s        ~1300 °/s
Acceleration  ~118000 °/s²    ~110000 °/s²    ~115000 °/s²

Table 11: Result of the three saccadic movements.
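As an illustration of the finite-difference calculation described above (the conversion from pixels to degrees depends on the measured test geometry and is not included here, so the sample values are placeholders):

```python
def peak_velocity_acceleration(positions_deg, dt):
    """Given gaze positions in degrees sampled every dt seconds (dt = 1/1200 s
    for the 1200 fps film), estimate the peak velocity and acceleration with
    simple finite differences."""
    vel = [(b - a) / dt for a, b in zip(positions_deg, positions_deg[1:])]
    acc = [(b - a) / dt for a, b in zip(vel, vel[1:])]
    return max(abs(v) for v in vel), max(abs(a) for a in acc)

# Example with four hypothetical mid-saccade samples (degrees)
v_max, a_max = peak_velocity_acceleration([0.0, 1.0, 2.2, 3.3], dt=1/1200)
```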

Test of smooth pursuit
The same right-eye laser is lit as before. The test runs smooth pursuit around the triangle of vectors [2 -1 6], [-1 1 6] and [2 1 6] 20 times, and is repeated three times with speeds of 30 deg/s, 20 deg/s and 10 deg/s. This test lets us analyze the potential of using smooth pursuit. The test is filmed at 60 fps, with short parts at 1200 fps.



Figure 37: One triangle at 30 deg/s.


Figure 38: One triangle at 20 deg/s.

Figures 37 and 38 show two samples of smooth pursuit, and we can see some inconsistency.

To measure the minimum step size, we ran a smooth pursuit at 1 deg/s. The result is shown in the figure below.


Figure 40: A close-up sample of the step size, filmed at 1200 fps.

From this graph, five different step sizes were averaged: 0.0096°, 0.0074°, 0.0098°, 0.0084° and 0.0065°, giving an average step size of 0.00834°.

Test running over a long period of time
The setup for this test is that the right-eye laser is on and the robot traces the same triangle of vectors [2 -1 6], [-1 1 6] and [2 1 6] 2400 times, with a saccade on the horizontal edge, a 30 deg/s smooth pursuit on the diagonal edge and a 20 deg/s smooth pursuit on the vertical edge. This test measures the accuracy and precision of each triangle. By comparing the triangles to each other we can analyze the repeatability of the robot's eyes.

At the start of the long test, the movement filmed at 60 fps looks like this:


Figure 41: One start triangle of the long test, filmed at 60 fps.

At the end of the test, the movement filmed at 60 fps is drawn in the graph below.


Figure 42: One end triangle of the long test, filmed at 60 fps.

By comparing the graphs with each other, I calculated the biggest repeatability difference to be 4.757%.

Test of head movements
We concluded that head movements and their testing would not be part of this thesis, because the mechanics did not allow the head to turn in a straight horizontal plane.

Test of two eyes
The purpose of this test is to check whether the two eyes move in the same way relative to each other.


Figure 43: The two eyes filmed together, performing the same movement at the same time.


Figure 44: Difference between the eyes zoomed in.

Testing with Eye-Tracking
The eye tracker we tested with is called Tobii Spectrum 1200 and records with a frequency of 1200 Hz. It took some tweaking to get the eye tracker to start tracking the doll eyes. To get a good eye-tracking result, a calibration is required. To calibrate the eye tracker, we used the joystick to point the laser dot from the eyes at different points on the screen. The figure below shows the setup of the eye-tracking test.


Figure 45: The head in front of a Tobii Spectrum and a screen.

This picture shows the head pointed at a Spectrum and a computer running Tobii's eye-tracker software.

Figure 46: The Tobii Pro software found and could track the eyes of the robot.

The picture shows that the Spectrum found both eyes and established eye tracking. While recording, the eyes ran the test triangle program as in the previous tests. The results are shown in the picture below.


Figure 47: Test triangle shown in the Tobii software, captured by the Spectrum camera.

Sources of errors
We have identified a number of sources of error that could affect the results of the tests.
o During the testing day, the tripod that the camera was mounted on was not fixed to the floor. Unfortunately, we pushed it slightly during the day and tried to put it back, which could result in a measurement error.
o By the time of testing, the mechanics around the eyes had started to become a bit loose after many days of letting the robot run. After the "long test" we found that the right eye was a bit loose, which could result in a measurement error.
o The rod that pulled the left eye was not exactly the correct length. This makes the eyes behave not exactly symmetrically.
o Every time the microcontroller inside the head was reprogrammed there was no control over the movements of the servos, which resulted in movements beyond the limits. This put the system at greater risk of glitching and breaking down.
o The camera used for filming the laser dot was not made for slow-motion filming. When the laser dot moved at the highest speeds it was stretched out in the captured frames. When analyzing the video, the middle of the stretched dot was taken and converted to coordinates, which could result in inconsistent measurements.
o When filming at the highest frame rate the camera went down in resolution. That makes the number of pixels to choose from in the OpenCV program less than optimal, which could result in a larger difference in distance between the coordinates on the screen.


6 Conclusions

The electronics reflection
The board provided a stable voltage level and could provide the current to the motors in "normal" use. But if all five motors are run at maximum speed we can only provide about 1 A to each, and the voltage regulator will not be able to deliver the current required for maximum speed; the eye motors need 1.4 A each at maximum speed. Running all the motors very fast at the same time would make the voltage regulator very hot, so thermal dissipation might be needed on the voltage regulators if they run at high load for longer periods of time. We decided that all the motors will rarely require maximum speed at the same time, so we accepted not running the motors at full speed for long periods. When used according to our test programs, heat was not an issue.

In the beginning I had plans to fit a stepper motor in the head to drive the distance between the eyes automatically. On the PCB I placed a footprint for a driver for a potential stepper. When we assembled the head we did not find any room for an additional motor.

The size of the board was defined in the early stages of the project and can be reduced in the future. We used servo motors controlled by a PWM signal, and I think the motors' internal interpretation of the PWM signal is a big factor in the step size seen in the results. Making a robot optimal in both speed and precision with different motors will be difficult. I can see future heads being equipped with special motors that are mainly focused on one particular movement.

When the microcontroller in the head was reprogrammed, the pins started behaving randomly, which made the mechanics move and reach the limits of their movement. This made the mechanics a bit loose and eventually, after enough reprogrammings, they broke and we had to repair them. This would need to be fixed in the future.

Reproducible
One request we received for this project was to make the head reproducible on a bigger scale. The board was made from standard and mostly surface-mounted components, which makes it easy to reproduce the PCB on a bigger scale. For the first prototype I soldered the components myself, but to make it more efficient Tobii can outsource that part. The mechanics were made from an aluminum frame and many 3D-printed parts, and the head itself was made of styrofoam with the inside carved out by hand. Both processes could be improved.


Control reflection
From the user interface we could see that the calculations produced straight lines. During the testing of the saccades we realized that the speed was a bit too fast, which could be the result of an error in the saccade calculations.

Goals
One goal was to make the head flashy and cool to look at. The robot can control the RGB LEDs inside the head and toggle the lasers on and off, and the eye movements look human-like, so according to us all these functionalities qualify as "flashy and cool". The head has all the electronics needed for the movements inside and works as a standalone system, which was a requirement.

The system was made to be able to perform all the required movements, but due to time constraints these movements can be implemented in the future:
o Vergence movement
o Vestibulo-ocular movement

Future improvements
For the future, these movements could be implemented:
o Vergence movement: could be implemented since the two eyes in the head are controlled individually.
o Vestibulo-ocular movement: both head movements and eye movements are already implemented, so the vestibulo-ocular movement can be added in the future.
o Micro saccades: can be implemented as very tiny saccades.
o Catch-up saccade before a smooth pursuit: possible by first performing a saccade before the smooth pursuit starts.

It would be easy to equip the head with a camera and run, on the Raspberry Pi, the OpenCV program I included that finds an object using background subtraction. With that, we could make the eyes stare at people walking by. I think that would be a real eye-catcher at future fairs for Tobii.
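The included program is not listed in the report, so the following is only a minimal sketch of such a background-subtraction tracker with OpenCV; the camera index and the mapping from image coordinates to a gaze vector are left as assumptions:

```python
import cv2

def follow_largest_object(camera_index=0):
    """Yield the image coordinates of the largest moving object found with
    background subtraction; these could then be mapped to gaze vectors."""
    cap = cv2.VideoCapture(camera_index)
    subtractor = cv2.createBackgroundSubtractorMOG2()
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            mask = subtractor.apply(frame)
            contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                           cv2.CHAIN_APPROX_SIMPLE)
            if contours:
                largest = max(contours, key=cv2.contourArea)
                x, y, w, h = cv2.boundingRect(largest)
                yield x + w // 2, y + h // 2   # center of the detected object
    finally:
        cap.release()
```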



7 References

[1] Legget, D. A brief history of eyetracking (2010). [Online]. Available: https://www.uxbooth.com/articles/a-brief-history-of-eye-tracking/ (visited on 27/01/2019)
[2] De Lemos J., Reza Sadeghnia G., Ólafsdóttir Í., Jensen O. Measuring emotions using eye tracking (2008). Retrieved from Emotion Technology A/S, Copenhagen, Denmark: https://www.noldus.com/mb2008/individual_papers/FPS_eye_tracking/FPS_eye_tracking_deLemos.pdf (visited on 27/01/2019)
[3] Richard J. Krauzlis, in Fundamental Neuroscience (Fourth Edition) (2013). Available: https://www.sciencedirect.com/topics/neuroscience/eye-movements (visited on 27/01/2019)
[4] Gaurab Karki, Online Biology Notes. Human Eye: Anatomy, parts and structure (2018). [Online]. Available: http://www.onlinebiologynotes.com/human-eye-anatomy-parts-structure/ (visited on 27/01/2019)
[5] Diaphragm (2018, June 28). In Wikipedia. Available: https://en.wikipedia.org/wiki/Diaphragm_(optics) (visited on 2/12/2018)
[6] Rochester Institute of Technology. Rods and Cones. [Online]. Available: https://www.cis.rit.edu/people/faculty/montag/vandplite/pages/chap_9/ch9p1.html (visited on 27/01/2019)
[7] Visual cortex (2018, Nov 10). In Wikipedia. Available: https://en.wikipedia.org/wiki/Visual_cortex (visited on 2/12/2018)
[8] Field of view (2018, Nov 16). In Wikipedia. Available: https://en.wikipedia.org/wiki/Field_of_view (visited on 2/12/2018)
[9] Gordon, C. C., Blackwell, C. L., Bradtmiller, B., Parham, J. L., Barrientos, P., Paquette, S. P., Corner, B. D., Carosn, J. M., Venezia, J. C., Rockwell, B. M., Murcher, M., & Kristensen, S. (2014). 2012 Anthropometric Survey of U.S. Army Personnel: Methods and Summary Statistics. Available: https://apps.dtic.mil/dtic/tr/fulltext/u2/a611869.pdf (visited on 2/12/2018)
[10] Purves D., J Augustine G., Fitzpatrick D., C Katz L., LaMantia A-S., O McNamara J., Williams S. M. (2001). Neuroscience, 2nd edition, Ch 20. Available: https://www.ncbi.nlm.nih.gov/books/NBK10799/ (visited on 2/12/2018)
[11] Fisher B., Ramsperger E. Human express saccades: extremely short reaction times of goal directed eye movements. Available: https://www.ncbi.nlm.nih.gov/pubmed/6519226 (visited on 2/12/2018)
[12] Bioelectromagnetism Portal, The Electric Signals Originating in the Eye. Available: http://www.bem.fi/book/28/28.htm (visited on 2/12/2018)
[13] Smooth pursuit (2018, May 3). In Wikipedia. Available: https://en.wikipedia.org/wiki/Smooth_pursuit (visited on 2/12/2018)
[14] WhatIs, Eye tracking (gaze tracking). [Online]. Available: http://whatis.techtarget.com/definition/eye-tracking-gaze-tracking (visited on 2/12/2018)
[15] Whitmire E., Trutoiu L., Cavin R., Perek D., Scally B., Phillips J. O., Patel S. EyeContact: Scleral Coil Eye Tracking for Virtual Reality. Available: http://www.cs.cmu.edu/~ltrutoiu/pdfs/ISWC_2016_trutoiu.pdf (visited on 2/12/2018)
[16] Plöchl M., Ossandón J. P., König P. Combining EEG and eye tracking: identification, characterization, and correction of eye movement artifacts in electroencephalographic data. Available: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3466435/ (visited on 2/12/2018)
[17] Imotions, What is Eye Tracking and How Does it Work? (2018). Available: https://imotions.com/blog/eye-tracking-work/ (visited on 2/12/2018)
[18] Tobii. Tobiipro.com. [Online]. Available: https://www.tobiipro.com/learn-and-support/learn/eye-tracking-essentials/what-is-dark-and-bright-pupil-tracking/ (visited on 2/12/2018)
[19] Honda. asimo.honda.com. [Online]. Available: http://asimo.honda.com/ (visited on 2/12/2018)
[20] SoftBank Robotics, NAO. [Online]. Available: https://www.softbankrobotics.com/emea/en/nao (visited on 2/12/2018)
[21] Miwa H., Itoh K., Matsumoto M., Zecca M., Takanobu H., Roccella S., Chiara Carrozza M., Dario P., Takanishi A. Effective Emotional Expressions with Emotion Expression Humanoid Robot WE-4RII. Available:
[22] Hanson Robotics. [Online]. Available: http://www.hansonrobotics.com/news/ (visited on 2/12/2018)
[23] Stewart M. "Introducing: The First Robot Given Citizenship. Her Name? Sophia." Available: http://sites.utexas.edu/longhornglobalbiznet/introducing-the-first-robot-given-citizenship-her-name-sophia/ (visited on 2/12/2018)
[24] Beira R., Lopes M., Praça M., Santos-Victor J., Bernardino A., Metta G., Becchi F., Saltaren R. Design of the Robot-Cub (iCub) Head (2006). Available: https://flowers.inria.fr/mlopes/myrefs/06-icra.pdf (visited on 2/12/2018)
[25] Biamino D., Cannata G., Maggiali M., Piazza A. Department of Communications, Computer and Systems Science, University of Genova. MAC-EYE: a Tendon Driven Fully Embedded Robot Eye (December 2005). Available: http://users.dibris.unige.it/cannata/Documents/Humanoid2005%20-%20MACEYE%20-%20Poster.pdf (visited on 2/12/2018)
[26] CAN Bus (2018, Nov 28). In Wikipedia. [Online]. Available: https://en.wikipedia.org/wiki/CAN_bus (visited on 2/12/2018)
[27] Guizzo E. Superfast Robotic Camera Mimics Human Eye (2010). Available: https://spectrum.ieee.org/automaton/robotics/industrial-robots/superfast-robotic-camera-mimics-human-eye (visited on 2/12/2018)
[28] Bassett K., Hammond M., Smoot L. A fluid-suspension, electromagnetically driven eye with video capability for animatronic application. Published in 2009 9th IEEE-RAS International Conference on Humanoid Robots. Available: https://www.semanticscholar.org/paper/A-fluid-suspension%2C-electromagnetically-driven-eye-Bassett-Hammond/6d50395b8c0b92b2318d2ad2a77462a499f06f3a (visited on 2/12/2018)
[29] What Are the Differences Between Brushed And Brushless Motor for new RC hobbyist? (2016). [Online]. Available: http://blogg.improveme.se/michaelcotter/2016/04/12/what-are-the-differences-between-brushed-and-brushless-motor-for-new-rc-hobbyist/ (visited on 2/12/2018)
[30] Adafruit. All About Stepper Motors. [Online]. Available: https://learn.adafruit.com/all-about-stepper-motors/types-of-steppers (visited on 2/12/2018)
[31] Jameco Electronics, How do servo motors work? [Online]. Available: https://www.jameco.com/jameco/workshop/howitworks/how-servo-motors-work.html (visited on 2/12/2018)
[32] Motion Control Online, Servo Motors vs. Stepper Motors in Motion Control: How to Choose the Right One for Your Application. [Online]. Available: https://www.motioncontrolonline.org/blog-article.cfm/Servo-Motors-vs-Stepper-Motors-in-Motion-Control-How-to-Choose-the-Right-One-for-Your-Application/34 (visited on 2/12/2018)
[33] Faulhaber, Linear DC-Servomotors series. [Online]. Available: https://www.faulhaber.com/en/products/series/lm-083001/ (visited on 2/12/2018)
[34] Sparkfun, Pulse Width Modulation. [Online]. Available: https://learn.sparkfun.com/tutorials/pulse-width-modulation/all (visited on 2/12/2018)
[35] Sparkfun, Serial communication. [Online]. Available: https://learn.sparkfun.com/tutorials/serial-communication/all (visited on 2/12/2018)
[36] Pupillary distance (2018, Nov 28). In Wikipedia. [Online]. Available: https://en.wikipedia.org/wiki/Pupillary_distance (visited on 2/12/2018)
[37] Raspberry Pi Organization. [Online]. Available: https://www.raspberrypi.org/ (visited on 2/12/2018)
[38] Adafruit 16-Channel 12-bit PWM/Servo Driver - I2C interface - PCA9685. [Online]. Available: https://www.adafruit.com/product/815 (visited on 2/12/2018)
[39] Atmel, Datasheet of ATMEGA2560. Available: http://ww1.microchip.com/downloads/en/DeviceDoc/Atmel-2549-8-bit-AVR-Microcontroller-ATmega640-1280-1281-2560-2561_datasheet.pdf (visited on 2/12/2018)
[40] Sparkfun, Serial Peripheral Interface (SPI). [Online]. Available: https://learn.sparkfun.com/tutorials/serial-peripheral-interface-spi/all (visited on 2/12/2018)
[41] Sparkfun, I2C. [Online]. Available: https://learn.sparkfun.com/tutorials/i2c/all (visited on 2/12/2018)
[42] Sparkfun, Serial communication. [Online]. Available: https://learn.sparkfun.com/tutorials/serial-communication/all (visited on 2/12/2018)
[43] Texas Instruments, Datasheet of SN754410. Available: http://www.ti.com/lit/ds/symlink/sn754410.pdf (visited on 2/12/2018)
[44] Texas Instruments, Datasheet of TPS5450. Available: http://www.ti.com/lit/ds/symlink/tps5450.pdf (visited on 2/12/2018)
[45] Mouser, Datasheet of SMT Power Inductors. [Online]. Available: https://www.mouser.se/datasheet/2/597/mss1278-270709.pdf (visited on 2/12/2018)
[46] Power Consumption Benchmarks. [Online]. Available: https://www.pidramble.com/wiki/benchmarks/power-consumption
[47] Raspberry Pi Organization, Frequently asked questions. [Online]. Available: https://www.raspberrypi.org/help/faqs/ (visited on 2/12/2018)
[48] Lextrait T. (2016). Arduino: Power Consumption Compared. [Online]. Available: https://tlextrait.svbtle.com/arduino-power-consumption-compared (visited on 2/12/2018)
[49] Texas Instruments, Datasheet of LM2596. Available: http://www.ti.com/lit/ds/symlink/lm2596.pdf (visited on 2/12/2018)
[50] Great circle (2018, Nov 28). In Wikipedia. [Online]. Available: https://en.wikipedia.org/wiki/Great_circle (visited on 2/12/2018)
[51] Rodrigues' rotation formula (2018, Nov 22). In Wikipedia. [Online]. Available: https://en.wikipedia.org/wiki/Rodrigues%27_rotation_formula (visited on 2/12/2018)
[52] Han P., Saunders T. R., Woods R. L., Luo G. Trajectory prediction of saccadic eye movements using a compressed exponential model. Journal of Vision, July 2013. Available: https://jov.arvojournals.org/article.aspx?articleid=2193987 (visited on 2/12/2018)
[53] Wang D., Mulvey F. B., Pelz J. B., Holmqvist K. A study of artificial eyes for the measurement of precision in eye-trackers (2016). Available: https://link.springer.com/article/10.3758/s13428-016-0755-8 (visited on 2/12/2018)
[54] Ocular Imaging Eye Model and Bracket (2015), Datasheet of the eye model. [Online]. Available: http://www.ocularinc.com/_data/product/OEMI-7.pdf (visited on 2/12/2018)

Appendix A

Schematics of the head components

Schematic showing the "input" PCB that is located outside of the head.

Schematic showing the voltage regulators and the servo connections located on the PCB inside the head.

Appendix B

A picture of the PCBs from this project. The upper part is the PCB outside of the head and the lower part is the PCB inside the head.

TRITA-EECS-EX-2020:95
