Kamikaze Aibo Interface

Stephen Chow, Lena Mac, Ian Hern, Phillipa Sessini
Dept. of Computer Science, University of Calgary
2500 University Drive NW
Calgary, AB, Canada, T2N 1N4
[email protected], [email protected], [email protected], [email protected]

Abstract
The Kamikaze Aibo Interface aims to provide a simple and intuitive way for users to control an Aibo dog in a simulated disaster zone. Specific steps were taken in the design of this interface to make it easy to use. These include using a large vision screen and map located at the top of the interface. Important interface elements are grouped around the vision screen to reduce the amount of effort required by the user to use these items.

Evaluation of this interface shows that in its current form users are able to learn to use basic features of the interface quite quickly. Additional features could be added to improve the user’s ability to navigate particularly difficult sections of the disaster zone.

Introduction
As humans encounter increasingly dangerous disaster situations, the need for robots designed for these situations is growing. For example, events such as the World Trade Center attack [2] and other disasters require robots designed to maneuver challenging terrain. These robots also need to be easy to control using intuitive interfaces, in addition to being able to operate in disaster zones.

Copyright is held by the author/owner(s).

One of the primary causes of critical events in robots deployed in urban search and rescue environments is a lack of awareness [3, 4]. The Kamikaze Aibo Interface project aims to develop an interface that improves the user’s situational awareness of the robot to reduce critical incidents.

The robotic platform used in this project is the Sony Aibo robot with the Tekkotsu programming framework. The primary objective of our robot is to navigate a simulated field of rubble and defuse a simulated bomb. Although the purpose of this interface is to defuse a bomb, methods used in the development of the interface may also be applied in urban search and rescue (USAR) environments.

The remainder of this paper is structured as follows. Studies relevant to the development of the interface are discussed in Related Work. The development of the interface over the course of the project is discussed in Evolution of the Design. The implemented interface is described in the Interface, Vision Screen, Map, Command Center and Indicators sections. Controlling the Robot describes methods of interacting with the interface. The interface is evaluated and limitations are discussed in the Evaluation and Future Work sections.

Related Work
Many studies of robotic interfaces for urban search and rescue tasks have taken place. These studies highlight the need for awareness in interfaces [3, 4] as well as provide insights into interface design for urban search and rescue operations [1].

Scholtz et al. [4] evaluated human-robot interfaces at the RoboCup 2003 USAR competition. They found that the most common critical incidents occurred when robots encountered obstacles. They note that the team with the fewest critical incidents relating to obstacles used a camera that could be manipulated. They also note that the same team had fewer local navigation critical incidents, again because their easily movable camera let them view the robot’s position in relation to objects.

Baker et al. [1] examine an existing robotic interface and redesign it to be more efficient by enhancing awareness, lowering cognitive load and increasing efficiency. They found that since users focused mainly on the video screen for data, interface elements that required attention should be placed around the video area. They also reduced the users’ cognitive load by replacing most of the numerical data displayed in the original interface with icons and colors that were intuitive to the user.

Although the objective of this study is not search and rescue, the robot still operates in a simulated USAR terrain. As a result, many lessons learned studying interfaces for USAR are applicable to our application.

Evolution of the Design
The original design for the Kamikaze Aibo Interface consisted of a 3-column layout, as illustrated in Figure 1. The map and vision screen were both large and at the same level. However, indicators were placed to the left of the vision screen in their own column at eye level. This interface was expected to be quite large (approximately 1024x768) and had a lot of white space in its design. The majority of elements contained in the final interface are present in this preliminary sketch, including the distance sensor bars above the vision screen, crosshairs to indicate head movement and a box grouping items around the vision panel.

Figure 1 hints at the future directions of the Kamikaze Aibo Interface. This is evident in the arrow suggesting moving the indicators to a position below the map. The design presented in the proposal (Figure 2) illustrated this conversion from a 3-column to a 2-column layout and the development of a more compact design. The placement of indicators and less common actions was still quite preliminary at this point. Also present is a point in the center of the vision screen with a numerical value for the distance between the Aibo and objects in front of it. At this point the interface allowed for no autonomy on the part of the robot and anticipated complete user involvement in controlling the Aibo’s actions.

The proposal version of the interface also marked a milestone in the design of the Kamikaze Aibo Interface: the inclusion of a border surrounding the vision screen to indicate battery levels to the user.

The low fidelity prototype presented an interface design that was more consistent with our final implementation (Figure 3). The number of items below the vision panel was reduced and a drop down menu of less commonly used actions was added. This drop down menu decreased the number of buttons present on the interface and reduced the amount of clutter in the design. The drop down menu of less commonly used actions also included a “look at sound” behavior. This marked the beginning of autonomy for the Aibo in the interface.

Figure 1: Original layout of the Kamikaze Interface.

Awareness of the Aibo’s body position is included in the low fidelity prototype in the form of paw sensor indicators. These are included to inform the user when the Aibo has fallen over. Along with the paw indicators, an array of recovery methods was also made available to the user. The current interface is illustrated in Figure 4 and is described in the following sections.

Interface The overall design of the interface aims to make using the interface as simple and intuitive as possible (Figure 4). Commonly used menus, buttons and indicators are grouped around the main vision screen for easy access. Less commonly used functions are located further away from the main focus area. The map is located at the same level as the vision screen to facilitate use of the map.

Figure 2: Proposed layout of the Kamikaze Interface.

The interface also aims to reduce clutter and unnecessary information. This is accomplished by the use of drop-down menus rather than many buttons. Also, items on the interface are grouped according to function to help the user quickly find desired options.

Figure 3: Low fidelity prototype of the interface.

Vision Screen
The vision screen is the largest component of the Kamikaze Aibo Interface (Figure 5). It presents a relatively large video feed to increase the user’s situational awareness of the Aibo. The vision screen is also used to convey relevant information about the status of the Aibo. Bars above the vision panel are meant to indicate the robot’s distance from objects in front of it. The border of the vision screen also provides information about the amount of battery power that the Aibo has remaining (Figure 6). Information about the Aibo’s head position is conveyed using crosshairs overlaid on the video screen, as illustrated in Figure 5. A slider to the right of the vision screen conveys information about the Aibo’s neck position.
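The distance bars could be driven by a simple mapping from a range reading to a number of lit bar segments. The sketch below is hypothetical (the range limit, five-segment bar, and function name are our assumptions, not part of the Tekkotsu framework):

```python
def distance_to_bar(distance_mm, max_range_mm=900, segments=5):
    """Map a range reading (mm) to a number of lit bar segments.

    Closer obstacles light more segments, so a full bar means an
    object is directly ahead. Readings at or beyond max_range_mm
    light none. All constants here are illustrative assumptions.
    """
    if distance_mm >= max_range_mm:
        return 0
    # Fraction of the sensor range "used up" by the obstacle.
    closeness = 1.0 - (distance_mm / max_range_mm)
    # Add one so any detected obstacle lights at least one segment.
    lit = int(closeness * segments) + 1
    return min(lit, segments)
```

A display loop would redraw the bars above the vision panel whenever this value changes.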

Buttons, menus and indicators that will be used frequently are located below the vision screen for easy access. These items include the walk selector, walk speed selector as well as paw sensor indicators. The walk selector is a drop down menu that makes use of icons to facilitate selection of an appropriate walk. This is illustrated in Figure 7.

Figure 4: Current layout of the Kamikaze Interface.

Map
A map is provided to the user at the right of the interface, illustrated in Figure 8 (a). The map is quite large and located at the same level as the vision screen to make use of the map convenient. The map is based on a grid layout to increase the user’s ability to create precise maps. The user can define colors to mark various objects on the map. The user-defined colors give the user an opportunity to pick colors that are meaningful to them and to create an easy to understand map. A button to clear the entire map space is also provided.

Command Center
The command center features less commonly used actions and behaviors. These include recovery actions that can be used to bring the Aibo back to a standing position if it falls over. Also, actions that need to be executed if a bomb is found, such as defuse bomb and take picture, are located in this menu. To prevent accidental execution of actions, an execute button must be pressed before any selected actions are performed.
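The grid map with user-defined colors described above can be backed by a sparse mapping from grid cells to color labels. The class below is an illustrative sketch only; the names and the dictionary-based design are our assumptions, not the actual implementation:

```python
class GridMap:
    """Sparse grid map: only cells the user has marked are stored."""

    def __init__(self):
        self.cells = {}  # (row, col) -> user-chosen color string

    def mark(self, row, col, color):
        """Paint one grid cell with a user-defined color."""
        self.cells[(row, col)] = color

    def color_at(self, row, col):
        # Unmarked cells read as unexplored terrain.
        return self.cells.get((row, col), "unexplored")

    def clear(self):
        # Backs the "clear map" button: wipe every marked cell.
        self.cells.clear()
```

A sparse store keeps the clear-map operation cheap and avoids allocating the full grid for mostly empty disaster-zone maps.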

Figure 5: The vision screen of the Kamikaze Interface.

Indicators
Indicators of battery and network status are located in the lower right hand corner of the interface. These icons are small so as to be unobtrusive, but through the use of colors their meaning is quickly conveyed to the user. Also, in the event that the battery level drops significantly, the color of the border of the vision screen will change to increase the user’s awareness of the issue. The changing of the border color in the event of low battery is presented in Figure 6.
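The low-battery border change amounts to thresholding the battery level into a border color. A minimal sketch, assuming illustrative thresholds and color names (the actual values are not specified in this report):

```python
def border_color(battery_pct):
    """Pick the vision-screen border color from remaining battery.

    Thresholds and colors are assumed for illustration; the idea is
    that the warning shifts toward an alarming color as power drains,
    placed in the user's focus area rather than a corner widget.
    """
    if battery_pct > 50:
        return "green"
    if battery_pct > 20:
        return "yellow"
    return "red"
```

The interface would repaint the border whenever the returned color differs from the one currently drawn.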

Controlling the Robot
The most common commands sent to the Aibo in a bomb defusing mission will likely be commands to control the Aibo’s walking and head position. These actions are mapped to keyboard and mouse commands for convenience. Less commonly executed behaviors can be executed using the actions/behaviors pop up menu or by using shortcut keys on the keyboard. One action that will hopefully not be required often is the emergency stop command. For easy access to this important command, it is mapped to the space bar, which is easy to locate on the keyboard.
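Such a mapping can be implemented as a dispatch table from key presses to robot commands. In the sketch below, only the space-bar emergency stop comes from the interface description; the other bindings and the command names are assumptions:

```python
# Hypothetical key -> command table. Only the space-bar emergency
# stop is taken from the interface description; the rest is assumed.
KEY_BINDINGS = {
    "w": "walk_forward",
    "s": "walk_backward",
    "a": "rotate_left",
    "d": "rotate_right",
    " ": "emergency_stop",  # space bar: easy to find under pressure
}

def handle_key(key, send_command):
    """Look up a pressed key and forward the bound command."""
    command = KEY_BINDINGS.get(key)
    if command is not None:
        send_command(command)
    return command
```

A table like this also makes the first-person-shooter-style layout noted by our evaluators easy to reconfigure.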

Figure 6: Low battery indicated by changing the vision screen border color.

Figure 7: The walk selector includes icons to help the user select a walk quickly.

This interface allows for highly configurable walking. Both the speed and type of walk executed can be controlled using a slider and drop-down menu, respectively. The direction the Aibo will walk is currently controlled by keyboard controls.

Moving the Aibo’s head position is designed to be convenient for a variety of users. Two main methods of moving the head are currently available: right clicking on the interface and dragging the mouse to move the head, and using the number pad to adjust the head position. In both cases this only controls the pan and roll of the head. The neck position can be controlled using a slider on the right of the vision screen or by using the scroll wheel of the mouse.
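The click-and-drag head control reduces to scaling a mouse delta into joint angles and clamping them to the head's range of motion. A sketch under assumed gains and joint limits (the real Aibo limits differ and would come from the Tekkotsu framework):

```python
def drag_to_head_angles(dx_px, dy_px, gain=0.25,
                        pan_limit=88.0, roll_limit=43.0):
    """Convert a mouse drag (pixels) into clamped pan/roll angles.

    The gain and the +/- limits are illustrative assumptions.
    Dragging right pans right, dragging up rolls up, and large
    drags saturate at the joint limits instead of commanding an
    impossible pose.
    """
    pan = max(-pan_limit, min(pan_limit, dx_px * gain))
    roll = max(-roll_limit, min(roll_limit, -dy_px * gain))
    return pan, roll
```

Clamping at the interface layer keeps an over-enthusiastic drag from ever reaching the motors as an out-of-range command.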

Evaluation of the Interface To help identify limitations of the Kamikaze Interface design some preliminary evaluation has been done. This section summarizes observations made in the evaluation.

The evaluation was done by allowing three users to interact with the interface. By observing their interactions with the Kamikaze Interface, strengths and weaknesses of the interface were made apparent.

One key insight gained through this evaluation was that individuals with video game experience had a much easier time using the interface than other users. One of the three individuals had no prior experience with video games. This individual successfully navigated the Aibo through the first half of the debris; however, the second, narrower tunnel proved too frustrating to navigate the Aibo through. The two individuals with video game experience had some difficulties but were both able to clear the course (with some assistance) in a relatively short amount of time. One individual noted that our keyboard layout was quite similar to that of a first person shooter and quickly adapted to the controls. The other individual used several unique tactics, including strafing through large portions of the tunnel, to complete the task in a very short amount of time.

Figure 8: a) map provided to the user, b) command center provides access to a variety of behaviors.

By observing these users interacting with the interface, some shortcomings of our interface became apparent. One of these is that users do not seem to use the head movements as much as anticipated. Instead, almost all of them opted to rotate the Aibo’s body to look around. This often led to them over-rotating the Aibo and losing their sense of direction in the tunnels. Some sort of body position compass might help users when they get turned around in the tunnels. Also, the sensitivity of the rotation should be adjusted in future implementations to prevent the uncontrollable over-rotation experienced by many users.

A deficiency in the “dive” behavior was also discovered by letting users use the Kamikaze Interface. This behavior, which is designed to help the Aibo clear rubble underneath it, can sometimes lead to the Aibo performing a handstand-like pose. This occurs when balls under the Aibo remain under the rear part of the Aibo but not the front. Modifying this behavior would help the robot perform better in situations where it encounters a lot of rubble.

Overall, the interface was quite usable for non-expert users. With more time to gain familiarity with the interface, these users could become quite proficient at using it. In all cases some physical intervention was required to get the robot through some difficult portions of the tunnel, but in general this intervention was minimal.

Future Work
At the time of this report some features of the Kamikaze Aibo Interface are not yet implemented, as a result of strict time constraints on the completion of this project. These items could be the subject of future work. Non-implemented items include the loading of various walks using the walk selector; currently, only 2 of the 5 proposed walks are available. The paw sensor reporter also has yet to be implemented. The distance indicators and picture taking functionalities could also be implemented in future work.

Items that our design did not consider but that would be important for a successful deployment of the robot would also be appropriate for future work. In particular, these future directions include recovery from network failure. Also, providing a game controller or joystick for controlling the Aibo’s movements may make the interface easier to use.

Since a bomb defusing robot will likely be deployed in a situation where only ad hoc wireless networking is available, network failure would likely be encountered. As a result, packet loss recovery would be a key component of interaction with the robot.

From the evaluation performed in this study, it is apparent that keyboard control of the robot appeals to some video game players. However, for users who do not play video games, a tactile device such as a simple joystick might be a more appropriate tool for controlling the Aibo’s movement. For more advanced video game users, a more complex controller similar to those used by the Xbox or PlayStation consoles may be an appropriate way to merge many functions into a single interaction device.

Conclusion
By using lessons learned in previous studies, the Kamikaze project aimed to design an easy to use, intuitive interface for controlling a Sony Aibo robot in a simulated disaster zone.

The interface included several principles of good design for robots in disaster zones. These included a large video screen and map, both at eye level. Important functionalities such as walk and speed selection and stability sensor information are displayed around the vision panel. Locating these items around the vision screen enables users to use them without taking their attention away from the video feed.

By evaluating the design of the Kamikaze Interface, strengths and weaknesses of the interface became apparent. In particular, the interface is quite easy to use for individuals with experience using video games. In general, awareness of body direction limits the user’s ability to navigate the simulated rubble efficiently. Overall, users were able to learn to use the interface quickly and navigate the Aibo through the simulated disaster zone.

The Kamikaze Aibo Interface succeeds in its goal of creating an intuitive interface to control a Sony Aibo in a simulated disaster zone. The addition of more features, such as loading various walks, and the refinement of rubble dispersing behaviors would alleviate many of the challenges currently faced by users of the program.

References
[1] Baker, M., Casey, R., Keyes, B. and Yanco, H. Improved Interfaces for Human-Robot Interaction in Urban Search and Rescue. Proceedings of the IEEE Conference on Systems, Man and Cybernetics, October 2004.
[2] Casper, J. and Murphy, R. Human-Robot Interactions during the Robot-Assisted Urban Search and Rescue Response at the World Trade Center. IEEE Transactions on Systems, Man, and Cybernetics, Part B 33(3): 367-385, 2003.
[3] Drury, J., Scholtz, J. and Yanco, H. Awareness in Human-Robot Interactions. Proceedings of the IEEE Conference on Systems, Man and Cybernetics, Washington, DC, October 2003.
[4] Scholtz, J., Young, J. and Drury, J. Evaluation of Human-Robot Interaction Awareness in Search and Rescue. Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), New Orleans, April 2004.
[5] Tejada, S., Cristina, A., Goodwyne, P., Normand, E., O’Hara, R. and Tarapore, S. Virtual Synergy: A Human-Robot Interface for Urban Search and Rescue. Proceedings of the AAAI Competition, 2003.
[6] Yanco, H. and Drury, J. Classifying Human-Robot Interaction: An Updated Taxonomy. Proceedings of the IEEE Conference on Systems, Man and Cybernetics, October 2004.