Large Scale Integrating Project
Grant Agreement no.: 257899

D7.1 – Multimodal Interaction Report and Specification
SMART VORTEX – WP7-D7.1

Project Number:                             FP7-ICT-257899
Due Date:                                   2012-09-30
Actual Date:                                2012-09-12
Document Author/s:                          Stefan Radomski, Dirk Schnelle-Walka
Version:                                    1.1
Dissemination level:                        Confidential
Status:                                     Restricted draft M24
Contributing Sub-project and Work Package:  All partners in the project
Document approved by:                       RTDC
Co-funded by the European Union

Document Version Control

Version  Date        Change Made (and if appropriate reason for change)  Initials of Commentator(s) or Author(s)
0.2      07.08.2012  Initial structural draft                            TUD
1.1      27.09.2012  Included RTDC comments                              TUD

Document Change Commentator or Author

Author Initials  Name of Author        Institution
SR               Stefan Radomski       TUD
DS-W             Dirk Schnelle-Walka   TUD

Document Quality Control

Version  QA Date     Comments (and if appropriate reason for change)  Initials of QA Person
1.1      30/09/2012  Minor changes                                    IK

This document contains information which is given under the Non-Disclosure Agreement of the SmartVortex consortium. Data in the tables shown in this document are confidential and must not be used for any publication or scientific paper without the explicit permission of the contributing company. Distribution of these data outside the SmartVortex consortium members is forbidden and regarded as a violation of the “Non-Disclosure Agreement”.

Catalogue Entry

Title:        Multimodal Interaction Report and Specification, 1st Restricted Draft
Creators:     Stefan Radomski
Subject:      Deliverable SmartVortex D7.1
Description:
Publisher:
Contributor:
Date:         September 30th, 2012
ISBN:
Type:         Delivery SmartVortex
Format:
Language:     English
Rights:       SmartVortex Consortium
Citation Guidelines:

EXECUTIVE SUMMARY

WP7 started at month 7 of the project, and this document describes the ongoing work regarding multi-modal interfaces in the scope of WP7 within Smart Vortex. A framework for implementing multi-modal interfaces following the recommendations of the W3C Multimodal Interaction Working Group is outlined, and its application within Smart Vortex to achieve the objectives of WP7 is described. The work in the upcoming year 3 will focus on adding modalities to the framework, implementing ISP specific requirements, and a closer coordination and integration with related work packages.

TABLE OF CONTENTS

EXECUTIVE SUMMARY
TABLE OF CONTENTS
1 Introduction
  1.1 Objectives of WP 7
  1.2 Approach
2 The Multimodal Interaction Framework
  2.1 Multimodal Dialog Control
  2.2 Modality Specific Components
    2.2.1 Graphical / Text
    2.2.2 Spatial Audio
    2.2.3 3D Data
    2.2.4 Speech
    2.2.5 Notifications
    2.2.6 Multi-Touch Input
    2.2.7 Location
    2.2.8 The Confero Suite from Alkit
    2.2.9 Visual Query Editor
    2.2.10 Additional Components
  2.3 The Event Bus
3 Current State and Roadmap
4 Key Performance Indicators
5 References

1 INTRODUCTION

1.1 Objectives of WP 7

The work in this package is concerned with enabling natural interaction for querying data streams, in order to support a collaborative decision process. Here, natural refers to offering multi-modal interfaces that allow users to choose the input and output modalities most suited for the current context, or to have the system select suitable modalities with regard to meta-data, available devices and other sensors. The red outline in figure 1 shows WP6 along with WP7 in the overall layered architecture of Smart Vortex from deliverable 2.3. While WP6 focuses on graphical query editing, WP7 will provide multi-modal interfaces for those queries.

Figure 1: Contributions of WP6 and WP7 within the layered Smart Vortex architecture.

1.2 Approach

In a first step, we implemented several isolated applications, employing various modalities to work with data streams similar to those expected in Smart Vortex. Some of these applications were delivered to project partners, others were demonstrated at the consortium meetings and workshops in order to get early user feedback and suggestions for subsequent iterations. These applications also helped to clarify requirements, demonstrated technical feasibility and form part of the basis for our future work within the project. They are described below as part of the modality specific components we will provide for Smart Vortex. In order to integrate the various modalities and to offer a unified approach for employing multi-modal interfaces, we chose to integrate the isolated components into a coherent framework that will be part of the Smart Vortex suite.
The required capabilities and architecture of such a framework have been subject to extensive research since the original “Put That There” multi-modal application by Bolt in 1980 [1]. A multitude of approaches was proposed in subsequent work [2], and in 2002 the W3C formed the “Multimodal Interaction Working Group” to standardize multi-modal application development in the scope of the W3C MMI framework [3]. In August 2012, the group proposed the W3C MMI architecture as a recommendation [4].

The framework for multi-modal interfaces developed as part of Smart Vortex closely follows the recommendations of this working group, with custom adaptations to work with data streams, ISP specific requirements for interfaces, and support for collaboration. By following this approach, we expect to provide a coherent way to model adaptive multi-modal interfaces, with reusability of components between partners and applications and standardized extension points for future work. Furthermore, our experiences with employing the W3C MMI framework as part of Smart Vortex can help to identify additional requirements for future iterations of the W3C recommendations. We will implement the multi-modal specific functionality of the ISP demonstrators as applications within this framework.

2 THE MULTIMODAL INTERACTION FRAMEWORK

The MMI framework developed for Smart Vortex closely follows the recommendations of the W3C working group for a Multimodal Interaction Framework (W3C MMI framework) and its related recommendations. Within the W3C MMI framework, a multi-modal application is described as a set of loosely coupled modality components (MCs), coordinated by interaction managers (IMs), nested in a tree-like data structure (see figure 2). Communication among the components is achieved via a technology agnostic event bus with defined life-cycle events and application specific data.

Figure 2: Coarse collaboration diagram in the W3C MMI Framework.

MCs provide access to user input and system output on different levels of abstraction. At the lowest level, MCs provide access to modality specific in- and output components. That might be access to actual hardware or to available modality specific interpreters (e.g. an HTML browser). Input from these MCs is processed in a chain of nested MC/IM components until it reaches an uppermost IM, where the input is transformed into an abstract system output. This output representation is then, again, processed and concretized by a chain of nested MC/IM components until it reaches a set of MCs at the lowest level.
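To make the structure described above more concrete, the following C++ sketch models the MC/IM tree and the life-cycle events exchanged over the event bus. It is purely illustrative and not part of the Smart Vortex implementation: only the event fields (context, source, target, request id, data) and event names such as StartRequest follow the vocabulary of the W3C MMI architecture; the class layout, the forwarding logic and the example URIs (mc://html, mc://speech) are assumptions made for this sketch.

#include <iostream>
#include <memory>
#include <string>
#include <utility>
#include <vector>

// Life-cycle events in the W3C MMI architecture carry at least a context
// identifier, source, target and request id, plus application specific data.
struct LifeCycleEvent {
    std::string name;      // e.g. "StartRequest", "DoneNotification"
    std::string context;   // identifies one running interaction
    std::string source;    // URI of the sending constituent
    std::string target;    // URI of the receiving constituent
    std::string requestId; // correlates requests and responses
    std::string data;      // application specific payload
};

// Both IMs and MCs are addressable constituents on the event bus.
class Constituent {
public:
    explicit Constituent(std::string uri) : uri_(std::move(uri)) {}
    virtual ~Constituent() = default;
    const std::string& uri() const { return uri_; }
    virtual void receive(const LifeCycleEvent& ev) = 0;
protected:
    std::string uri_;
};

// A modality component wraps one concrete in-/output channel
// (e.g. an HTML browser, speech recognition, spatial audio).
class ModalityComponent : public Constituent {
public:
    using Constituent::Constituent;
    void receive(const LifeCycleEvent& ev) override {
        // A real MC would start rendering or recognition here and later
        // send a StartResponse or DoneNotification back to its parent IM.
        std::cout << uri_ << " received " << ev.name
                  << " in context " << ev.context << "\n";
    }
};

// An interaction manager coordinates nested MCs (and possibly child IMs),
// forming the tree-like structure described in the text.
class InteractionManager : public Constituent {
public:
    using Constituent::Constituent;
    void addChild(std::shared_ptr<Constituent> child) {
        children_.push_back(std::move(child));
    }
    void receive(const LifeCycleEvent& ev) override {
        // A real IM would run its dialog logic (e.g. a state chart) and
        // decide which children to address; this sketch simply forwards.
        for (const auto& child : children_) {
            LifeCycleEvent forwarded = ev;
            forwarded.source = uri_;
            forwarded.target = child->uri();
            child->receive(forwarded);
        }
    }
private:
    std::vector<std::shared_ptr<Constituent>> children_;
};

int main() {
    // Root IM with two leaf MCs: a graphical and a speech channel.
    InteractionManager root("im://root");
    root.addChild(std::make_shared<ModalityComponent>("mc://html"));
    root.addChild(std::make_shared<ModalityComponent>("mc://speech"));

    // The application (or an outer IM) opens a context and starts it.
    root.receive({"StartRequest", "ctx-1", "app://smartvortex",
                  "im://root", "req-1", ""});
    return 0;
}

In the actual framework, the dialog logic inside an interaction manager is not a simple forwarding loop but a declarative dialog description; the dialog control used in Smart Vortex is discussed in section 2.1, and the concrete event bus in section 2.3.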