Electric Elves: What Went Wrong and Why
AI Magazine Volume 29 Number 2 (2008) (© AAAI)

Milind Tambe, Emma Bowring, Jonathan P. Pearce, Pradeep Varakantham, Paul Scerri, and David V. Pynadath

Software personal assistants continue to be a topic of significant research interest. This article outlines some of the important lessons learned from a successfully deployed team of personal assistant agents (Electric Elves) in an office environment. In the Electric Elves project, a team of almost a dozen personal assistant agents was continually active for seven months. Each elf (agent) represented one person and assisted in daily activities in an actual office environment. This project led to several important observations about privacy, adjustable autonomy, and social norms in office environments. In addition to outlining some of the key lessons learned, we outline our continued research to address some of the concerns raised.

The topic of software personal assistants, particularly for office environments, is of continued and growing research interest (Scerri, Pynadath, and Tambe 2002; Maheswaran et al. 2004; Modi and Veloso 2005; Pynadath and Tambe 2003). The goal is to provide software agent assistants for individuals in an office as well as software agents that represent shared office resources. The resulting set of agents coordinates as a team to facilitate routine office activities. This article outlines some key lessons learned during the successful deployment of a team of a dozen agents, called Electric Elves (E-Elves), which ran continually from June 2000 to December 2000 at the Information Sciences Institute (ISI) at the University of Southern California (USC) (Scerri, Pynadath, and Tambe 2002; Chalupsky et al. 2002; Pynadath and Tambe 2003, 2001; Pynadath et al. 2000). Each elf (agent) acted as an assistant to one person and aided in the daily activities of an actual office environment. Originally, the E-Elves project was designed to focus on team coordination among software agents. However, while team coordination remained an interesting challenge, several other unanticipated research issues came to the fore. Among these new issues were adjustable autonomy (agents dynamically adjusting their own level of autonomy) as well as privacy and social norms in office environments.

Several earlier publications outline the primary technical contributions of E-Elves and research inspired by E-Elves in detail. However, the goal of this article is to highlight some of what went wrong in the E-Elves project and provide a broad overview of technical advances in the areas of concern without providing specific technical details.

Copyright © 2008, Association for the Advancement of Artificial Intelligence. All rights reserved. ISSN 0738-4602

Description of Electric Elves

The Electric Elves project deployed an agent organization at USC/ISI to support daily activities in a human organization (Pynadath and Tambe 2003; Chalupsky et al. 2002). Dozens of routine tasks are required to ensure coherence in a human organization's activities, for example, monitoring the status of activities, gathering information relevant to the organization, and keeping everyone in the organization informed. Teams of software agents can aid humans in accomplishing these tasks, facilitating the organization's coherent functioning while reducing the burden on humans.

The overall design of the E-Elves is shown in figure 1. Each proxy is called Friday (after Robinson Crusoe's manservant Friday) and acts on behalf of its user in the agent team. The basic design of the Friday proxies is discussed in detail in Pynadath and Tambe (2003) and Tambe, Pynadath, and Chauvat (2000) (where they are referred to as TEAMCORE proxies). Friday can perform a variety of tasks for its user. If a user is delayed to a meeting, Friday can reschedule the meeting, informing other Fridays, who in turn inform their users. If there is a research presentation slot open, Friday may respond to the invitation to present on behalf of its user. Friday can also order its user's meals (see figure 2) and facilitate informal meetings by posting the user's location on a web page. Friday communicates with users through user workstations and wireless devices such as personal digital assistants (Palm VIIs) and WAP-enabled mobile phones, which, when connected to a global positioning system (GPS) device, track users' locations and enable wireless communication between Friday and a user.

Figure 1. Overall E-Elves Architecture, Showing Friday Agents Interacting with Users.

Figure 2. Friday Asking the User for Input.

Each Friday's team behavior is based on a teamwork model called STEAM (Tambe 1997). STEAM encodes and enforces the constraints among roles that are required for the success of the joint activity. For example, meeting attendees should arrive at a meeting simultaneously. When an important role within the team (such as the role of a presenter for a research meeting) opens up, the team needs to find the best person to fill that role. To achieve this, the team auctions off the role, taking into consideration complex combinations of factors and assigning the best-suited agent or user. Friday can bid on behalf of its user, indicating whether its user is capable and/or willing to fill a particular role.

Lessons from Electric Elves

Adjustable Autonomy

Adjustable autonomy (AA) is clearly important to the E-Elves because, despite the range of sensing devices, Friday has considerable uncertainty about the user's intentions and even location; hence, Friday will not always be capable of making good decisions. On the other hand, while the user can make good decisions, Friday cannot continually ask the user for input, because doing so wastes the user's valuable time. We illustrate the AA problem by focusing on the key example of meeting rescheduling in E-Elves. A central task for the E-Elves is ensuring the simultaneous arrival of attendees at a meeting. If any attendee arrives late, or not at all, the time of all the attendees is wasted. On the other hand, delaying a meeting is disruptive to users' schedules. Friday acts as proxy for its user, so its responsibility is to ensure that its user arrives at the meeting at the same time as other users. Clearly, the user will often be better able to determine whether he or she needs the meeting to be delayed. However, if the agent transfers control to the user for the decision, it must guard against miscoordination while waiting for the user's response, especially if the response is not forthcoming, such as if the user is in another meeting. Some decisions are potentially costly (for example, rescheduling a meeting to the following day), so an agent should avoid taking them autonomously. To buy more time for the user to make a decision, an agent has the option of delaying the meeting (that is, changing coordination constraints). Overall, the agent has three options: make an autonomous decision, transfer control, or change coordination constraints. The autonomy reasoning must select from these actions while balancing the various competing influences.

Our first attempt to address AA in E-Elves was to learn from user input, in particular by using decision tree learning based on C4.5. In training mode, Friday recorded the values of a dozen carefully selected attributes and the user's preferred action (identified by asking the user) whenever it had to make a decision. Friday used the data to learn a decision tree that encoded various rules. For example, it learned the rule:

IF two-person meeting with important person AND user not at department at meeting time THEN delay the meeting 15 minutes.

During training, Friday also asked if the user wanted such decisions taken autonomously in the future. From these responses, Friday used C4.5 to learn a second decision tree, which encoded its AA reasoning. Initial tests with the C4.5 approach were promising (Pynadath and Tambe 2003), but a key problem soon became apparent. When Friday encountered a decision for which it had learned to transfer control to the user, it would wait indefinitely for the user to make the decision, even though this inaction could lead to miscoordination with teammates if the user did not respond or attend the meeting. To address this problem, a fixed time limit (five minutes) was added: if the user did not respond within the time limit, Friday took an autonomous action (the one it had learned to be the user's preferred action). This led to improved performance, and the problem uncovered in initial tests appeared to have been addressed. Unfortunately, when the E-Elves were first deployed 24/7, there were some dramatic failures. In example 1, Friday autonomously cancelled a meeting with the division director because Friday overgeneralized from training examples. In example 2, Friday incorrectly cancelled the group's weekly research meeting when a timeout forced the choice of an autonomous action when Pynadath […]

What was needed was a framework that explicitly reasoned about costs and uncertainty, so that the agent does not have to learn everything from scratch (as was required with C4.5). Markov decision processes (MDPs) fit these requirements, and so, in a second incarnation of E-Elves, MDPs were invoked for each decision that Friday made: rescheduling meetings, delaying meetings, volunteering a user for a presentation, or ordering meals (Scerri, Pynadath, and Tambe 2002). Although MDPs were able to support sequential decision making in the presence of transitional uncertainty (uncertainty in the outcomes of actions), they were hampered by not being able to handle observational uncertainty (uncertainty in sensing).
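The role auction described in the Description of Electric Elves can be sketched as follows. This is a minimal illustration, not the actual STEAM mechanism: the class, the bid fields, and the single `score` standing in for the "complex combinations of factors" are all assumptions of this sketch.

```python
from dataclasses import dataclass

@dataclass
class Bid:
    """A Friday's bid for an open team role (fields are illustrative)."""
    user: str
    capable: bool   # is the user able to fill the role?
    willing: bool   # is the user willing to fill it?
    score: float    # combined suitability from other factors

def allocate_role(bids):
    """Assign the role to the best-suited capable and willing bidder."""
    eligible = [b for b in bids if b.capable and b.willing]
    if not eligible:
        return None  # no one qualifies; the team must re-auction or escalate
    return max(eligible, key=lambda b: b.score).user

bids = [
    Bid("tambe", capable=True, willing=False, score=0.9),
    Bid("scerri", capable=True, willing=True, score=0.7),
    Bid("pynadath", capable=False, willing=True, score=0.8),
]
print(allocate_role(bids))  # -> scerri
```

The key point the sketch preserves is that a Friday bids capability and willingness on behalf of its user, and the team picks the best eligible bidder rather than the highest raw score.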
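The first AA design — a learned rule plus the fixed five-minute timeout — can be sketched as below. The attribute names and function signatures are invented for illustration; the real system induced its trees with C4.5 from a dozen recorded attributes rather than hand-coding them.

```python
from typing import Optional

FIVE_MINUTES = 5 * 60  # seconds Friday waits before acting autonomously

def learned_action(meeting: dict) -> str:
    """One rule of the kind C4.5 induced from training data (illustrative)."""
    if (meeting["two_person"] and meeting["important_person"]
            and not meeting["user_at_department"]):
        return "delay 15 minutes"
    return "attend as scheduled"

def decide(meeting: dict, user_response: Optional[str],
           waited_seconds: int) -> str:
    """Transfer control to the user, but fall back to the learned
    action once the fixed time limit expires."""
    if user_response is not None:
        return user_response            # the user decided in time
    if waited_seconds >= FIVE_MINUTES:
        return learned_action(meeting)  # timeout: act autonomously
    return "wait for user"              # keep control with the user

meeting = {"two_person": True, "important_person": True,
           "user_at_department": False}
print(decide(meeting, None, 120))  # -> wait for user
print(decide(meeting, None, 400))  # -> delay 15 minutes
```

The sketch also makes the failure mode visible: once the timeout fires, whatever the tree overgeneralized from training data is executed without further checks, which is exactly what produced the dramatic failures described above.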
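The MDP-based second incarnation can be illustrated with a toy value-iteration example. The states, probabilities, and rewards below are invented for the sketch, not taken from the E-Elves implementation; they show only how an MDP weighs the cost of delaying a meeting against the risk of miscoordination as time runs out.

```python
# States: minutes remaining before the meeting with the user still absent
# (3, 2, 1), plus terminal outcomes. All numbers are illustrative.
GAMMA = 0.95
P_ARRIVE = 0.3  # chance the user shows up in any given minute

# transitions[state][action] = [(probability, next_state, reward), ...]
transitions = {
    t: {
        "wait": [(P_ARRIVE, "on_time", 10.0),
                 (1 - P_ARRIVE, t - 1, 0.0) if t > 1
                 else (1 - P_ARRIVE, "missed", -10.0)],
        "delay": [(1.0, "delayed", -2.0)],    # disruptive, but buys time
        "cancel": [(1.0, "cancelled", -5.0)],
    }
    for t in (3, 2, 1)
}
TERMINAL = ("on_time", "missed", "delayed", "cancelled")

def value_iteration(eps=1e-6):
    """Standard value iteration over the toy model."""
    V = {s: 0.0 for s in list(transitions) + list(TERMINAL)}
    while True:
        delta = 0.0
        for s in transitions:
            best = max(sum(p * (r + GAMMA * V[s2]) for p, s2, r in outs)
                       for outs in transitions[s].values())
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < eps:
            return V

def best_action(s, V):
    """Greedy action with respect to the converged values."""
    def q(a):
        return sum(p * (r + GAMMA * V[s2]) for p, s2, r in transitions[s][a])
    return max(transitions[s], key=q)

V = value_iteration()
print(best_action(3, V), best_action(1, V))  # waits early, delays late
```

Even this toy model reproduces the qualitative behavior the article describes: with plenty of time left, the expected value of waiting for the user dominates, but as the meeting approaches, the risk of miscoordination makes changing the coordination constraints (delaying) the better choice — reasoning the C4.5 trees could not perform.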