BUILD-IT: Intuitive Plant Layout Mediated by Natural Interaction by Morten Fjeld, Martin Bichsel and Matthias Rauterberg
Arbete Människa Miljö & Nordisk Ergonomi 1/99

BUILD-IT: Intuitive plant layout mediated by natural interaction

By Morten Fjeld, Martin Bichsel and Matthias Rauterberg

Morten Fjeld holds an MSc in Applied Mathematics from the Norwegian University of Science and Technology, Trondheim. Since 1997 he has been a PhD student and research assistant in Human-Computer Interaction and Cognitive Science at the Institute for Hygiene and Applied Physiology (IHA), Swiss Federal Institute of Technology (ETH Zurich). Between 1990 and 1997 he worked on the design and realisation of real-time industrial simulators, measuring systems and training equipment at Contraves AG, Zurich.

Martin Bichsel, PhD in Physics, is senior lecturer in Computer Vision and Graphics at the Institute for Design and Construction Methods (IKB), Swiss Federal Institute of Technology (ETH Zurich).

Matthias Rauterberg, PhD in Computer Science, is professor in Human Communication Technology and director of the Center for Research on User-System Interaction (IPO), Technical University Eindhoven (TUE), The Netherlands.

Supporting natural behaviour in Human-Computer Interaction (HCI) is becoming increasingly important. The authors suggest a new concept to enhance human expression and to support cognitive processes by making them visible.

Keywords: Direct interaction, graspable interface, computer vision, augmented reality

Abstract: BUILD-IT is a planning tool based on intuitive computer vision technology, supporting complex planning and configuration tasks. Based on real, tangible bricks as an interaction medium, it represents a new approach to Human-Computer Interaction (HCI). It allows a group of people, seated around a table, to move virtual objects using a real brick as an interaction handler. Object manipulation and image display take place within the very same interaction space. Together with the image displayed on the table, a perspective view of the situation is projected on a vertical screen. The system offers all kinds of users access to state-of-the-art computing and visualisation, requiring little computer literacy. It offers a new way of interaction, facilitating team-based evaluation of alternative layouts.

Figure 1. A complete activity cycle in action regulation theory.

Introduction

Supporting natural behaviour in Human-Computer Interaction (HCI) is becoming increasingly important. We suggest a new concept to enhance human expression and to support cognitive processes by making them visible.

To allow for a natural or direct way of task-solving behaviour, we define a set of six design principles. These principles are then used as support to design an interaction tool called BUILD-IT. Based on tangible bricks as interaction handlers, this system enables its users to interact with complex data in a direct way. We call our design concept the Natural User Interface (NUI).

Outline of design principles

As pointed out in the introduction, there is a need for a concept bringing together cognitive (here: goal-related) and motor activity. Based on task analysis, action regulation theory (Hacker, 1994) is one possible concept to answer this need. We choose action regulation theory as the psychological basis for this work. Within this tradition, high importance is given to the concept of the complete task. A complete task starts with a goal-setting part, followed by three subsequent steps (Figure 1). In more detail, these four steps are:

• individual setting of goals, given by the task description, and later on, given by controlled feedback,
• taking on planning functions, selecting tools and preparing actions necessary for goal attainment,
• physical (or even mental) performance functions with feedback on performance, pertaining to possible corrections of actions, and
• controlled feedback on results and the possibility of checking the action results against goals.

When computer users pursue an activity, their goal may be more or less clear. Their actions may be classified according to goal-relatedness. Kirsh and Maglio (1994) considered motor activity as being either epistemic [1] or pragmatic [2]. Pragmatic actions have the primary function of bringing the user physically closer to a goal. In contrast, epistemic actions are chosen to unveil hidden information or to gain insight that otherwise would require much mental computation. Hence, physical actions facilitate mental activity, making it faster and more reliable. Cognitive complexity may also be reduced by epistemic actions.

[1] Knowledge-based. [2] Practice-based.

Figure 2. A complete activity cycle in the case of epistemic action.

Epistemic and pragmatic actions are, generally speaking, both present in task-solving behaviour. This applies to all levels of expertise. Independent of the level of expertise, pragmatic and epistemic actions are both necessary for successful task-solving performance and should therefore be encouraged in the design of HCI tools.

Pragmatic actions seem to come close to Hacker's (1994) goal-driven actions. However, if no goal can be derived directly from a task description, the first part of solving the task is epistemic. In that case, a complete activity cycle starts with observable action, followed by goal setting and planning (Figure 2). In the rest of this paper, we make the abstraction that pragmatic as well as epistemic action can both be represented by Figure 1. This means that the top and bottom of the cycle in Figure 1 should no longer be taken literally. That figure is meant to transport both the idea of complete pragmatic and the idea of complete epistemic actions.

Now, the first three design principles for graspable interfaces can be outlined:

• Assure that mistakes imply only low risk, so that epistemic behaviour is stimulated,
• allow users to choose between epistemic (exploratory) and pragmatic (goal-oriented) actions, and
• support a complete regulation of pragmatic as well as epistemic behaviour.

Coinciding action and perception spaces

When manipulating objects in the real world, action space (hands and fingers) and perception space (the position of the object in the real world) coincide in time and space (Rauterberg, 1995). Hacker and Clauss (1976) showed that offering task-relevant information in the same space as where action takes place leads to increased performance.

Figure 3. User interface where perception and action space coincide.

With a screen-keyboard-mouse user interface, there is a separation between these two spaces, given by the separation of input and output devices. An alternative approach to interface design (Rauterberg, 1995) is to let perception and action space coincide (Figure 3).

Figure 4a & b. BUILD-IT, a brick-based Natural User Interface (NUI) instantiation supporting multi-expert, task-solving activity.

Figure 5. In the centre, a plan view with objects (robots, tables etc.). On the sides, menu areas with objects and functions (virtual camera, print etc.).

We find it important that real and virtual objects clearly indicate the kind of interaction they are meant to support. This idea stems from the concept of affordances, first suggested by Gibson (1986) and later applied to design by Norman (1988). Applied to our system, this means that real interaction handlers and virtual, projected objects must be designed so that they clearly inform about the function they support, the structure they represent and the results they produce.

Tactile feedback

Furthermore, to improve the feedback from interface to user, it is feasible to offer haptic (or tactile) feedback. Akamatsu and MacKenzie (1996) showed how tactile feedback may improve task-solving performance.

Now, the final three design principles can be established:

• Support users in taking on planning functions in a direct and intuitive way,
• clearly indicate which objects and tools are useful for task accomplishment, and
• clearly show the results of user actions.

The real world cannot be authentically reproduced by a computer

At this point, we merge the two preceding concepts of interface design. Interfaces offering i) a coincident perception and action space, and ii) haptic feedback, can be subsumed under Augmented Reality (AR). AR is based on real objects, augmented by computer-based, intelligent characteristics. AR recognises that people are used to the real world, which cannot be authentically reproduced by a computer. A first AR interface, Digital Desk, was suggested by Newman and Wellner (1992). Similar ideas were described by Tognazzini (1996). We choose AR as the technological basis for the design of NUIs.

Design and implementation of BUILD-IT

Guided by the outlined principles, we designed a brick-based NUI instantiation (Figure 4a & b). Brick-based means that graspable bricks are used as interaction handlers, or mediators, between users and virtual objects. As task context, we chose that of planning activities for factory design. A prototype system, called BUILD-IT, was realised (Fjeld, Bichsel and Rauterberg, 1998). This is an application that supports engineers in designing assembly lines and building factories. The system enables users, grouped around a table, to interact in a space of virtual and real-world objects. A vertical screen gives a side view of the plant. In the table working area there are menu areas, used to select new objects, and a plan view where such objects can be manipulated (Figure 5).

The working principle of BUILD-IT is shown in Figure 6a. Users select an object by putting the brick at the object's position. Objects can be translated, rotated and de-selected by simple brick manipulation. Using a material brick, everyday motor patterns like grasping, moving and rotating are activated. When the brick is covered, the virtual object stays put.

Figure 6a & b. The basic steps for brick-based user manipulations (left), and two-handed interaction (right).
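The select, move, rotate and de-select cycle described above can be sketched as a small state machine. The following is an illustrative sketch only, not the original BUILD-IT implementation: the class names, the pick-up radius and the nearest-object selection rule are our own assumptions, made to show how a tracked brick pose could drive a plan-view object.

```python
import math
from dataclasses import dataclass

SELECT_RADIUS = 0.05  # hypothetical pick-up distance, in table units

@dataclass
class VirtualObject:
    name: str
    x: float
    y: float
    angle: float = 0.0  # orientation in radians

@dataclass
class Brick:
    x: float
    y: float
    angle: float
    visible: bool = True  # False when a hand covers the brick

class PlanView:
    """Tracks which virtual object, if any, the brick currently carries."""

    def __init__(self, objects):
        self.objects = objects
        self.selected = None

    def update(self, brick):
        # Covering the brick de-selects: the virtual object stays put.
        if not brick.visible:
            self.selected = None
            return
        # An uncommitted brick picks up the nearest object within reach.
        if self.selected is None:
            self.selected = self._nearest(brick)
        # A selected object follows the brick: translation and rotation.
        if self.selected is not None:
            self.selected.x = brick.x
            self.selected.y = brick.y
            self.selected.angle = brick.angle

    def _nearest(self, brick):
        best, best_d = None, SELECT_RADIUS
        for obj in self.objects:
            d = math.hypot(obj.x - brick.x, obj.y - brick.y)
            if d <= best_d:
                best, best_d = obj, d
        return best
```

In this sketch, placing the brick on a robot selects it, moving the brick translates and rotates it, and covering the brick leaves the robot wherever it was last placed, matching the basic steps of Figure 6a.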