Fachbereich 4: Informatik
EADS Innovation Works

A Service Oriented Architecture To Couple Virtual Prototypes With Functional Simulations

Diplomarbeit zur Erlangung des Grades eines Diplom-Informatikers im Studiengang Computervisualistik

vorgelegt von Marcus Berlage

Erstgutachter: Prof. Dr.-Ing. Stefan Müller (Institut für Computervisualistik, AG Computergraphik)
Zweitgutachter: Dipl.-Inf. René Schubotz (European Aeronautic Defence and Space Company)

Koblenz, im August 2010

Erklärung

Ich versichere, dass ich die vorliegende Arbeit selbständig verfasst und keine anderen als die angegebenen Quellen und Hilfsmittel benutzt habe.

Ja Nein

Mit der Einstellung der Arbeit in die Bibliothek bin ich einverstanden.  

Der Ver¨offentlichung dieser Arbeit im Internet stimme ich zu.  

...... (Ort, Datum) (Unterschrift)


Abstracts

Zusammenfassung: Mit Hilfe von virtuellen Prototypen werden Untersuchungen an Produkten, die sich noch in der Entwurfsphase befinden, vorgenommen. Diese Arbeit stellt einen Ansatz vor, mit dessen Hilfe funktionales Verhalten und die daraus resultierenden Beziehungen zwischen funktionalen Objekten in virtuellen Prototypen dargestellt werden. Zu diesem Zweck wird eine Service Orientierte Architektur entwickelt, um eine VR Software mit funktionalen Simulationen bidirektional zu koppeln. Die Kommunikation zwischen den voneinander unabhängigen Systemen wird von BPEL Prozessen orchestriert. Diese rufen die bereitgestellten Web Service Operationen auf und können zusätzlich genutzt werden, um selber funktionales Verhalten zu simulieren. Zusätzlich wird eine graphische Benutzeroberfläche bereitgestellt, um Szeneobjekte mit der Simulation ihres funktionalen Verhaltens zu verbinden und die erstellten Verbindungen zu verwalten.

Abstract: Virtual Prototypes are often used to test products which are still being developed and do not exist as real objects. This work aims to improve and enhance Virtual Prototypes by adding functional behaviour to the prototype and by establishing the functional relations between objects inside the prototype. A Service Oriented Architecture is developed to bidirectionally couple the VR software with functional simulations. BPEL processes are used to orchestrate the communication between the independent systems. These processes invoke the Web Service operations provided by the systems and can also be used to model functional behaviour themselves. Thus, they can act as functional simulations in their own right. Additionally, a graphical user interface is implemented to allow the user of the VR system to connect scene objects with their simulated functional behaviour and to manage the connections.



Contents

1 Introduction 1
   1.1 Scope of Work 1
   1.2 Motivation 2
   1.3 Basic approach of this work 3
   1.4 Structure of the Work 3

2 Related Work 5

3 Basic Knowledge for this Work 14
   3.1 Virtual Reality 14
   3.2 Business Process Execution Language 16
   3.3 Web Services 18
   3.4 SOAP 19

4 Requirements 21
   4.1 Scenario Description and Stakeholder Analysis 21
   4.2 Functional Requirements 26
   4.3 Technical Requirements 28
   4.4 User Interface Requirements 29
   4.5 Quality Requirements 30

5 Concept 31
   5.1 VR System Concept 31
   5.2 Functional Simulation Concept 33
   5.3 Communication Concept 35
   5.4 General Architecture 35

6 Implementation 40
   6.1 The VDP Module 41
   6.2 The Connector Node 42
      6.2.1 Noticing User Interaction 44
      6.2.2 Sending Information to the Functional Simulation 47
      6.2.3 Receiving Information from the Functional Simulation 51
   6.3 The Connector Node Manager 56
   6.4 The Graphical User Interface 61
      6.4.1 Designing the Modules 61
      6.4.2 Implementing the Actions Performed by the Buttons 63
      6.4.3 Handling Listener Events 65
      6.4.4 Constructing the Module and its User Interface 66
   6.5 The Functional Object 67
   6.6 The BPEL Processes 70

7 Deployment of the Module 74
   7.1 User Guide for the Module 74
   7.2 Extending the Module to Incorporate New Processes 78

8 Results and Validation 81

9 Conclusion 85

Bibliography 87

Table of Figures 90

List of Tables 92

Listings 93


List of Abbreviations

ALiSS - Assembly Line Solution Set
API - Application Programming Interface
BPEL - Business Process Execution Language
BPM - Business Process Management
CAD - Computer Aided Design
DLL - Dynamic Link Library
DMU - Digital Mock-Up
EADS - European Aeronautic Defence and Space Company
FBB - Functional Building Block
FDMU - Functional Digital Mock-Up
FTP - File Transfer Protocol
FURPS - Functionality, Usability, Reliability, Performance, Supportability
GMTL - Generic Math Template Library
GUI - Graphical User Interface
HOTAS - Hands On Throttle And Stick
HTTP - Hypertext Transfer Protocol
IEEE - Institute of Electrical and Electronics Engineers
JBI - JAVA Business Integration
LAN - Local Area Network
OASIS - Organisation for the Advancement of Structured Information Standards
ODE - Orchestration Director Engine
RPC - Remote Procedure Call
SLX - Simulation Language with Extensibility
SMTP - Simple Mail Transfer Protocol
SOA - Service Oriented Architecture
SysML - Systems Modeling Language
TCP/IP - Transmission Control Protocol / Internet Protocol
UML - Unified Modeling Language
URL - Uniform Resource Locator
VDP - Visual Decision Platform
VDT - Virtual Development and Training Platform
VP - Virtual Prototype resp. Virtual Prototyping
VR - Virtual Reality
W3C - World Wide Web Consortium
WS-I - Web Service Interoperability Organisation
WSDL - Web Service Description Language
X3D - Extensible 3D
XML - Extensible Markup Language



1 Introduction

1.1 Scope of Work

The work presented herein is set in the context of the research project AVILUS (http://www.avilus.de). AVILUS is a project under the management of the Innovation Alliance for Virtual Technologies (IA VT) and is sponsored by the German Federal Ministry of Education and Research (BMBF), with a total of 28 partners involved in the project. Several leading major companies, accompanied by medium-sized enterprises and research facilities, are joined together in this project and have set themselves the goal to develop and test new technologies in the field of virtual and augmented reality. The use cases for industrial application of the technologies are provided by the key industries from the sectors of automotive and aeronautical engineering as well as from the field of plant engineering. One of the focuses of the AVILUS project is to enable a manufacturer to test his product designs in the context of human interaction and perception. By using the potentials that reside in the use of virtual technology, it will be possible to evaluate a product at all stages of the development process and thus integrate the human factor and the role of the future user right from the beginning. The high level of immersion provided by the use of virtual and augmented reality results in a reliable source for studies on user friendliness and functional correctness without having to actually build the product. The aforementioned method to test and study a product, which until this point in time does not exist as a real object, is called Virtual Prototyping (VP). In the field of so-called Virtual Prototyping, Digital Mock-Up (DMU) is the keyword for innovative product development processes (Bullinger et al., 1999).

This thesis is aiming to improve and enhance Virtual Prototypes, hereinafter referred to as VPs, by adding additional functionality to the Virtual Prototype. Virtual Prototypes are artificial, respectively virtual, environments created for testing purposes. In the automotive and aeronautical engineering sector, VPs are utilized to perform ergonomical tests on products that are currently being developed. Since building a real-life model for every design step or modification of the product is expensive both in cost and time, a less complicated tool for experiments is needed. When building a Virtual Prototype, it is often possible to use the data and specs that are already available from the engineering and design process, the technical drawings, or design studies. Modern design tools like 3D Studio Max, Maya, CATIA and other 3D modeling or CAD software are currently widely used in the field of engineering, and the files created by them can play an important role in the efforts to build a Virtual Prototype. The use case scenario for this thesis will be the industrial application of the Virtual Prototype to evaluate and verify concepts for aircraft designs. In this scenario, the VP must be able to offer realistic behavior and technical features and display certain system configurations in a cockpit. All these attributes can then be examined with respect to Human-Computer-Interaction.

1.2 Motivation

The use of functional and visual simulation for the purpose of early evaluation and system tests is presently a common practice. Connecting the two tools, which up until now have been used independently, on the other hand is not business as usual and promises to lead to some interesting fields of application.

• The visual simulation would gain additional sources of input, as the functional simulation would send state changes that can be handled by the simulation. No longer would it be limited in its depicted strings of actions by previously specified storylines that result in a plotline of animations. Developers responsible for the visual simulation could use events coming from the functional simulation as input and connect this input to events in the 3D environment.

• To be able to work with both simulations - the functional and the visual simulation - and deploy them, quite some training is required. The skills that are necessary to build a functional simulation cannot be applied to create a virtual environment and vice versa. They are used for different test scenarios and are constructed by people who come from different fields of expertise. To combine the two separate simulations into one tool would only complicate the work involved in building the simulation. If it were possible to connect the two simulations with a tool that only requires minimum training, and thus maintain and guarantee the separation of concerns, this complication could be avoided.

• Another factor that is not considered in this work but might be interesting to look at is related to the functional simulation. On the side of the functional simulation, the benefit could be that input data can be acquired from a natural environment with realistic handling. Test cases which haven't been thought of before could arise from real situations. Additionally, the control mechanisms for the tests performed with the functional simulation would be quite intuitive, and even persons who are new to the field of functional simulation or who have no technical background could easily perform the tests.

As a consequence, both simulations can make a step closer to the reality they are supposed to represent and gain a level of depth by being connected. Also, the efforts necessary to build a visual representation of a complex system would be lessened because the functional simulation can manage all the state changes. In this work, a bidirectional coupling of a functional simulation with a visual simulation is developed to show the benefits and synergy. The visual simulation is provided by IC:IDO's Visual Decision Platform (VDP) and a simple functional simulation is implemented with the use of the Business Process Execution Language (BPEL, http://www.oasis-open.org/committees/wsbpel/). All communication will be handled by Web Services with the use of SOAP (http://www.w3.org/TR/soap/).

1.3 Basic approach of this work

By building a module for the software Visual Decision Platform (VDP) by IC:IDO (http://www.icido.de), the aim is to utilize functional simulations to equip the VR system with new features and possible applications. This module must enable the user of the VDP software to connect objects from his virtual scene directly to any kind of simulation that offers two-way communication via Web Services and, with a little workaround, even to some simulations that just offer a web interface for communication. As an example for a possible application of this concept, different types of switches and control systems inside a jet cockpit will be connected to a functional simulation. If those controls are used in the virtual environment, say a push-button was pressed, this event will be sent to the functional simulation. The functional simulation will then send a message back to the virtual environment and order a previously specified scene object to react accordingly. This reaction can be a texture change, a rotation or a translation of a scene object.

1.4 Structure of the Work

In the following section the solutions suggested in papers related to this work are presented and analysed. After this, the fundamentals and principles of the used tools and methods are explained briefly. A short introduction to virtual reality and the corresponding technologies is offered, followed by BPEL, which is used for the part of the functional simulation. Web Services and SOAP, which are used for the communication between the functional simulation and the virtual environment, are explained next. The next section describes the scenarios which need to be considered in this work and the requirements this solution has to meet. By using the results from the requirements, the concept developed during the course of this work is outlined, and the software architecture that emerged from this concept is described. In the section considering the implementation of the framework developed in this work, these concepts and the architecture are used to develop the components of the VDP module. The deployment of this module and the steps necessary to extend the module to incorporate new functional behaviour are described in the next section. This is followed by a retrospective on the requirements, the concept and the implementation, to validate the quality and usefulness of the approach presented in this work. In the last section, the conclusion of this work is presented and possible extensions for the solution are suggested.

2 Related Work

The focus of this work lies in a specific type of computer simulation in the area of development and testing. This type of simulation is called Virtual Prototyping. The term is often used with different meanings, as Wang and Professor (2008) stated in the article "Definition and Review of Virtual Prototyping". In this article, a Virtual Prototype or Digital Mock-Up is defined as follows: Virtual prototype, or digital mock-up, is a computer simulation of a physical product that can be presented, analysed, and tested from concerned product life-cycle aspects such as design/engineering, manufacturing, service, and recycling as if on a real physical model. The construction and testing of a Virtual Prototype is called virtual prototyping (VP) (Wang and Professor, 2008). It is important to mention that the proposed definition implies the importance of human-product interaction (Wang and Professor, 2008).

At EADS Innovation Works the terms Digital Mock-Up and Virtual Prototype are defined differently. A Digital Mock-Up is considered to be a digital model of a product that is developed. The model contains all relevant information related to that product, like the CAD files that describe the product's geometry, but also cost analysis models, lifecycle data, clash analysis, and much more information. A Virtual Prototype is defined as a way to analyse certain aspects of a product, like visibility, ergonomical aspects, aesthetic aspects and others, by utilizing methods from the field of computer graphics. When using the terms Digital Mock-Up or Virtual Prototype in this work, the definitions stated by EADS Innovation Works are applied.

Another important aspect of this work is the bidirectional coupling of a functional simulation and a virtual reality system (or graphical simulation). This aspect has been addressed by a number of scientific publications.

In his paper "Integrating Operations Simulation Results with an Immersive Virtual Reality Environment", Rehn et al. (2004) presents a concept to combine an operations simulation with a virtual reality system. The operations simulation system ALiSS, Assembly Line Solution Set, developed by Deere & Company in the early 2000s, is used for the industrial application of simulating manufacturing operations in a factory. The simulation system was built using two commercial simulation software packages, SLX (http://www.wolverinesoftware.com/SLXOverview.htm) and Proof Animation (http://www.wolverinesoftware.com/ProofProducts.htm), both by the Wolverine Software Corporation. Proof Animation - responsible for the visual simulation - only provided a visual representation of the simulated process for a standard 2D output on a monitor. To allow the use of an immersive virtual reality environment for testing and training purposes, all relevant information from the SLX package had to be extracted and then made available to the virtual reality simulation program. For this purpose an SLX module was developed and integrated into ALiSS. To exchange the information between the two simulations, a standalone ASCII file, also referred to as VRF file, was used. A virtual reality simulation program was then developed using the SGI OpenGL Performer (http://oss.sgi.com/projects/performer/) to create the scene objects and the VRJuggler (http://www.vrjuggler.org/) software library to manage the environment. The program consists of four components:

• A graphics module to create and depict the 3D scene objects in the virtual environment.

• A data processing module to interpret the VRF file containing the results from the operations simulation ALiSS and the standard Proof Animation layout files.

• A logical module responsible for the behavior of the scene objects, such as the motion of an object, animation of an assembly operation and so on.

• An interaction module providing a user interface to control the virtual reality simulation, to get information about scene objects and to change the configuration parameters of the simulation.

All events simulated in the operations simulation ALiSS are written to the VRF file including a timestamp. For every frame rendered by the VR system, the VRF file needs to be examined for new events. The close coupling of the two simulations causes significant performance issues. During initial development stages of the application we have experienced frame rates as low as one frame per second (Rehn et al., 2004). To resolve this issue the total number of polygons simultaneously rendered by the system had to be decreased (Rehn et al., 2004). To synchronise the two independent but closely linked simulations, the time elapsed during the rendering of a frame is used to compute the current simulation time, which then will be used as a reference value for accessing the events stored in the VRF file.

The concept introduced by Rehn et al. (2004) shows the problems that reside in the close coupling of a functional simulation with a VR system. Combining both of them into one system significantly affects the performance of both, but especially the performance of the VR system. The main arguments for the use of virtual reality are the effects on the user caused by the sensation of immersion into a virtual environment. If the immersion into the virtual environment is prevented, because the system cannot display events in real time, the frame rate is too low or the scene objects are only rudimentarily depicted, the use of a VR system seems to be futile.

Another paper on the subject of coupling discrete simulation and virtual reality systems also aims at the use of this approach for testing designs and providing new training methods in the context of plant engineering. With a focus on the necessary synchronisation of the two independent systems, Strassburger et al. (2005) presents another solution with an application in the field of production, manufacturing and logistics. In his paper different methods for synchronisation are evaluated, and a suitable solution for the approach of coupling simulations is presented with the method of self-adapting buffer sizes. The adaptive buffer strategy controls the buffer size as a function of the visualization speed. If the visualization speed increases, the buffer size also increases autonomously (Strassburger et al., 2005). To test the solution, the simulation system SLX is used for the part of the functional simulation, while with the Virtual Development and Training Platform (VDT, http://www.iff.fraunhofer.de/de/iffdbde/Produkt_detail.php?ProduktId=41) and VRJuggler two different VR systems are used. All three systems are extended with modules to implement the coupling, and the suggested method of the self-adapting buffer size is applied in the component of the VR systems.

• A communication unit is responsible for the sending and receiving of messages and commands exchanged between systems.

• A management unit is responsible for integrating the received messages into the respective system and for the controlled release of outgoing messages to the other system (Strassburger et al., 2005).

The messages are sent in a specific format, which is not further explained in the paper, and the transport protocol used to transmit the messages is TCP/IP. In the client-server architecture implemented for the pilot project, the functional simulation takes the part of the server, which sends the visualization data to the VR system if requested by it. Only the synchronisation method introduced in that paper was considered for the concept developed in this work. The architecture and the communication method were not described in detail and thus could not be evaluated. Just like Rehn et al. (2004), Strassburger et al. (2005) uses the VR system as a sort of graphical user interface for the functional simulation. This approach does not match the motivation for this work, where the aim is to extend the possibilities of a VR system by bidirectionally coupling it with one or more functional simulations.

Similar to the work of Strassburger et al. (2005), Mueck et al. (2002) described a solution to the problem of bidirectionally coupling a discrete simulation with a walkthrough system. His work also has a strong focus on finding a synchronisation method which best suits this scenario. The architecture of the system consists of the commercial manufacturing simulation tool EM-Plant (http://www.plant-simulation.com), a coupling module and a walkthrough system; the latter two were both developed for the prototype. For the synchronisation of the two simulations, the method of Fixed Follow-up Time is implemented. By using the transmission time of the first message sent from the simulation tool, an estimate for the follow-up time is found. To make sure that walkthrough system time will not outrun simulation time, we have added a tiny period of time t0 as a reserve (Mueck et al., 2002). This paper again focuses primarily on the synchronisation. But again the technology of virtual reality is only used to create an elaborate graphical user interface for a functional simulation, which does not match the aim of this work. The walkthrough system implemented for the system seems very specific and not suited for tests concerning ergonomical or aesthetic aspects of a product.

A current study commissioned by the Fraunhofer Society (http://www.igd.fhg.de/igd-a2/fdmu/) led to the development of a concept which also uses functional simulations coupled with a VR system. For this study Stork et al. (2009) introduced a new term, namely Functional Digital Mock-Up (FDMU). By combining different already-existing and widely-used simulation tools, the functional aspects can be added and considerably more insight into the product's properties can be achieved (Stork et al., 2009). The concept is more complex and elaborate and is aimed at a different scenario than this work, but has to be mentioned nonetheless. The FDMU framework will be used for the cooperative development and validation of functional prototypes of complex mechatronic products (Stork et al., 2009). To describe these products, the Functional Building Block (FBB) has been developed, which includes a description of the electro-mechanical behavior in the form of a functional model and information about the geometry of the object in the form of 3D CAD data. A MasterSimulator will be responsible for all the communication between the FBBs and the different types of simulations. The different simulations are connected to the MasterSimulator via FDMU Wrappers, which provide a common interface for communication.

Although the FDMU concept has some commonalities with the concept that is developed for this work, the extent of the FDMU study is much bigger and more complex. The focus of this work will be to find a simple and flexible way to incorporate input from a functional simulation into the virtual environment and use the input to trigger transformations or texture changes. The coupling should not result in an increase in the necessary work effort, and the separation of the two independent simulation tools must remain.

An important starting point for the development of the concept presented in this work is a thesis that was finished shortly before the beginning of this work. The thesis by Seidel (2009) was also a part of the AVILUS research project, had an almost identical task formulation and presented a solution to the problem of coupling a VR system with a functional simulation. The solution suggested in his thesis is analysed in the following pages, and it is explained why this concept is not used any further, as well as why a different approach is chosen.

To address the scenario his system was developed for, Seidel (2009) introduced an XML schema (see figure 1) describing the different types of switches used in aircraft cockpits. This XML description is used for data exchange and can be seen as one of the key elements in his solution. The description is very extensive and precise to cover every possible aspect and feature of a switch. One of its purposes is to create a common basis for the communication between the different participants involved in the creation of a Virtual Prototype. These participants are described in more detail in section 4.1. The Designers, Engineers and the VP Developer all contribute to the construction of the Virtual Prototype by providing the data used for its construction. All of them use different tools in their development process, which results in a variety of data formats. The XML description is supposed to be used by all of the persons involved in the development process and by the system itself.

Figure 1: UML diagram with the design for the switch description introduced by Seidel (2009).

The software architecture is chosen to be a distributed system consisting of three components with the XML description as their common data basis and data exchange format. The two main parts, the VR system and the functional simulation, are connected over a local area network through the third component, the so-called Mediator. For the part of the functional simulation, the software Rhapsody (http://www.telelogic.com/products/rhapsody/) by Telelogic is used, and the VR system consists of the software Visual Decision Platform (VDP) by IC:IDO. For the communication between the Mediator and Rhapsody, an existing web interface provided by Rhapsody is used. The VDP is extended with a plugin which uses the XML description as its data exchange format. All the information resulting from interaction with switches inside the VR system is written to the XML data and the Mediator is then contacted via a Web Service interface. The Mediator maps the state information from the XML description to a Rhapsody event and sends a message to the Rhapsody web interface to trigger the event in the functional simulation. To get information about state changes from the functional simulation, the VDP plugin has to ask the Mediator for an up-to-date XML switch description. This method is called polling. The VDP plugin sends a request to the Mediator, which then checks Rhapsody for the present states in the simulation; this state is then written to an updated XML description. The VDP plugin receives the result of the request in the form of the updated XML file. In figure 2 the complete system as developed by Seidel (2009) can be seen.

Figure 2: Software architecture for coupling Rhapsody with the VDP, introduced by Seidel (2009).

The functional simulation Rhapsody is utilized by Seidel (2009) to create state charts with the use of SysML (see figure 3). To interact with the system models run by Rhapsody, the events that need to be triggered from outside can be configured to be Web Managed. As a result, Rhapsody provides a web interface for communication (see figure 4).

Figure 3: Simulating a toggle switch in Rhapsody with the use of SysML.

Figure 4: Web interface provided by Rhapsody.

To initiate a state change or an event from outside of Rhapsody, HTTP-GET requests need to be sent to Rhapsody's web server. The three main HTTP-GET requests as found by Seidel (2009) are listed below. Using the example of the state chart for a toggle switch as shown in figure 3, the requests related to the issuing of an order for a state change, connected to the event that a toggle switch was unlocked, are the following.

• To address the event ToggleSwitch[0]::evUnlock the URL

/cgibin?Abs_App=GetMibIds&ToggleSwitch[0]::evUnlock=R

is used. As a result of this request the ID for the event evUnlock is returned. This ID can be used to address the event directly.

• To change a variable for a specific object, in this case the event evUnlock, the URL

/cgibin?Abs_App=MibSaveVal&id=5&val=undefined

is called by using the ID returned by the GetMibIds function. This URL is used when the button Activate is pressed in the web interface shown above (figure 4).

• The web interface uses the request

/cgibin?Abs_App=Refresh&1=R&tmID=3993393

to get up-to-date values for the objects it is displaying. In this case 1 denotes the object ID for the variable current_state and tmID is a timestamp.

To map events from the VR system to Rhapsody events and thus allow a bidirectional communication, Seidel (2009) designed the Mediator component. In this Mediator a mapping table is used to retrieve the Rhapsody event related to a state change caused by the VR system and then trigger the HTTP command to update the simulation. Three basic steps have to be performed to build a Virtual Prototype for this system. First, the functional simulation has to be designed using Rhapsody. Second, the cockpit scene has to be loaded into the VDP. Third, the XML description needs to be created. The XML description can neither be generated by Rhapsody nor by the VDP or its plugin, but has to be written manually. Rhapsody can be controlled over its built-in web interface, and the interaction with objects inside the virtual environment serves as the interface for the VDP plugin. The events inside the VR system that result from state changes in Rhapsody are hardcoded in the plugin, and only texture changes are implemented in its current state. If a new functional object which is not already described by the XML schema and the Rhapsody state charts needs to be integrated into the cockpit scenario, the Mediator and the XML schema would have to be extended to regard the features of this object.

The concept developed by Seidel (2009) is not further used because it is too static and an expansion of the system would only magnify the drawbacks of the system. One of the disadvantages is the polling mechanism, which results in unnecessary network traffic. Mapping the switch description to event calls for Rhapsody is fiddly and awkward to handle and must be hardcoded in the Mediator for every new scenario. Thus Rhapsody is not used as the functional simulation in this work. Another method needs to be found to simulate the relationship of functional objects and the events caused by interactions with them. Finally, the lack of a graphical user interface which can be used to construct a Virtual Prototype inside the VDP prevents a user-friendly interaction with the system. On account of the very static architecture, which only seems to be able to address the very specific scenario of the switches modeled in the XML description and simulated in the Rhapsody state charts, it is necessary to start from scratch and find a different approach.

3 Basic Knowledge for this Work

3.1 Virtual Reality

One of the most influential pioneers in the field of virtual reality and virtual environments certainly is Ivan Sutherland, who laid the groundwork for the development towards virtual reality. His most cited and best-known work, "The Ultimate Display" (Sutherland, 1965), can be seen as the conceptual starting point for the first head-mounted display, which is more commonly known as "The Sword of Damocles" due to its immense weight and the fact that it had to be suspended from the ceiling. The actual term virtual reality or virtual environment was coined, with great influence by Jaron Lanier, in the late 1980s (Conn et al., 1989). Lanier, founder of the company VPL Research established in 1985, continued Sutherland's work with his company by developing and building the most advanced and influential input and output devices for virtual environments of his time.

The term virtual reality or virtual environment describes an artificial environment generated by computers. Key features of virtual environments are:

• Real Time: Generally, any output on a display with significantly less than 30 frames per second is not perceived as real in the sense of visual perception by the human eye. All reactions to interaction inside the environment have to be immediate, and movement through the simulated space has to seem continuous.

• Interactivity: Interaction with objects in the virtual environment has to be possible, and the more intuitively the interaction can be performed, for example with the help of special input devices like fingertracking, the bigger the sense of immersion into the environment gets.

• Immersion: As mentioned above, immersion is the ultimate goal of virtual environments. The user has to have the impression of being a part of the virtual environment, to be involved and immersed. A famous statement made by Ivan Sutherland in his article "The Ultimate Display" is: The ultimate display would, of course, be a room within which the computer can control the existence of matter. A chair displayed in such a room would be good enough to sit in. Handcuffs displayed in such a room would be confining, and a bullet displayed in such a room would be fatal. (Sutherland, 1965)

• Multimodal Interaction: The term multimodal interaction comes from the field of human-computer interaction. Different devices for input and output can be used to provide the user of the virtual environment with an interface. Aside from the traditional monitor, keyboard and mouse setup, other modalities such as speech, tracking of body parts, haptic input and output, head-mounted displays or caves, just to name a few, can contribute to the sense of immersion.

To help classify an artificial, computer-generated environment, Prof. Dr. Mueller (2009) of the University of Koblenz introduced a reference model in his lecture on the subject of "Virtual Reality and Augmented Reality" (see figure 5).

Figure 5: Reference Model for Virtual Reality, classifying systems along the axes of object behavior (none, static behavior, dynamic behavior), interaction (none, interactive, immersive) and presentation (single event, sequence of events, real time).

As it has already been outlined, in the context of virtual reality a variety of input and output devices can be used. Since these devices do not play a role in the context of this work, they will not be explained in any more detail. For further, more extensive information on virtual reality and all the related technologies, the book "Understanding Virtual Reality: Interface, Application, and Design" by Sherman and Craig (2002) was found to be very useful and complete. In the following, the system configuration used to create the virtual environment for the scenario of this work will be described.

The software used to depict the cockpit model in a virtual environment is the Visual Decision Platform (VDP) by IC:IDO. This software offers a variety of modules, which can be used when building and reviewing the Virtual Prototype. The system used to run the VDP consists of the following hardware:

• Master Computer for the Cluster: AMD Opteron, Dual NVIDIA Quadro 4500, Windows XP 64 Bit.

• 2 Slave Computers for the Cluster: Intel Xeon, NVIDIA Quadro 4800, Windows XP 64 Bit.

• Visual Output Devices: NVIS Head Mounted Display nVisor SX60, or two projectors for passive stereo.

• Tracking: ART Tracking consisting of six cameras, allowing fingertracking for three or five fingers, a Flystick, hand targets for the left or right hand and multiple free targets.

Furthermore, the virtual environment is accompanied by real objects inside the lab. These objects are used for haptic feedback, so that the user doesn't just touch the air, but feels a resistance where the virtual objects are supposed to be. The positions of all these objects have to be aligned with the objects in the virtual environment, so that they are at the same places in both worlds. These real objects used for haptic feedback are:

• Seat

• HOTAS System: abbreviation for Hands On Throttle And Stick, this represents the flight control stick and the throttle in an aircraft's cockpit.

• Various Plexiglas Panels: these panels simulate the control panels inside the cockpit.

3.2 Business Process Execution Language

The open standard for the Web Services Business Process Execution Language (WS-BPEL 2.0) is managed by the Organization for the Advancement of Structured Information Standards (OASIS, http://www.oasis-open.org) and is a revision of the original BPEL4WS 1.0, which was conceived by IBM, Microsoft and BEA in 2002. The BPEL4WS 1.1 specification was released about a year later, with contributions from SAP and Siebel Systems. At about the same time the BPEL4WS specification was submitted to the OASIS committee, to be developed into an open standard.


This led to the specification of the WS-BPEL 2.0 standard. BPEL uses a number of specifications from the Web Service space, like the Simple Object Access Protocol (SOAP), the Web Service Description Language (WSDL), the Extensible Markup Language (XML), the XML Path Language (XPath) and Universal Description Discovery and Integration (UDDI), to specify interactions between Web Services. Processes designed using BPEL do not support user-based interaction; they solely interact by exporting and importing information via Web Service interfaces. The BPEL4WS 1.1 specification states that BPEL4WS is meant to be used to model the behavior of both executable and abstract processes (Andrews et al., 2003). An executable business process is used to model the behaviour of a participant in a business-related interaction. An abstract business process, on the other hand, describes a protocol to define the behaviour of the message exchange between the interacting partners and hides the internal behaviour of the participants. BPEL4WS provides a language for the formal specification of business processes and business interaction protocols. By doing so, it extends the Web Services interaction model and enables it to support business transactions (Andrews et al., 2003). WS-BPEL is a language which can be used to orchestrate Web Service interactions; this means it can be used to control the message exchange for the communication of distributed systems. For this purpose, BPEL offers the following concepts:

• Partner Link Types, Partner Links and Endpoint References to define the communication partners, their roles and the specific addresses used to contact them.

• Variable Properties, so the WS-BPEL process definition manipulating a variable can remain unchanged if a variable's definition is changed.

• Data Handling for XML data types and WSDL messages. This includes the use of variables for maintaining the states of processes, query and expression languages like XPath 1.0 to control the behaviour of a process, and data assignment to copy, construct and insert data.

• Message Correlation to identify and route messages to the correct instance of a process. This is needed in case conversations involve more than two parties or use lightweight transport infrastructure with correlation tokens embedded directly in the application data being exchanged (Alves et al., 2006).

• Basic and Structured Activities to perform the process logic. Basic activities are those which describe elemental steps of the process behavior. Structured activities encode control-flow logic, and therefore can contain other basic and/or structured activities recursively (Alves et al., 2006). For example, the flow activity can be used for parallel processing.

• Scopes can be used to build a process within a process. These scopes can, for example, be used to handle events, errors, message exchange, or the termination of a process.

In this work Apache's Orchestration Director Engine (ODE) is used as the Business Process Management (BPM) engine for the BPEL processes. It supports a communication layer based on Axis2 for the Web Services HTTP transport and another based on the JAVA Business Integration (JBI) standard.
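As a rough illustration of how partner links, variables and basic activities fit together, the sketch below outlines a minimal executable WS-BPEL process that receives a notification from one partner and invokes an operation in response. All names (partner link type, port types, operations and message types) are purely illustrative and do not correspond to the processes developed later in this work.

<process name="SwitchStateProcess"
         targetNamespace="http://example.org/fdmu/switch"
         xmlns="http://docs.oasis-open.org/wsbpel/2.0/process/executable"
         xmlns:tns="http://example.org/fdmu/switch">

  <!-- Communication partners: the VR system calls this process and is called back. -->
  <partnerLinks>
    <partnerLink name="vrSystem" partnerLinkType="tns:vrSystemPLT"
                 myRole="simulation" partnerRole="vrClient"/>
  </partnerLinks>

  <!-- Process state is kept in WSDL-message variables. -->
  <variables>
    <variable name="interactionMsg" messageType="tns:interactionMessage"/>
    <variable name="actionMsg" messageType="tns:actionMessage"/>
  </variables>

  <sequence>
    <!-- Wait for a notification about a transformed scene object;
         this receive also creates a new process instance. -->
    <receive partnerLink="vrSystem" portType="tns:simulationPortType"
             operation="notifyInteraction" variable="interactionMsg"
             createInstance="yes"/>

    <!-- Data handling: copy the identifier of the affected object into the outgoing message. -->
    <assign>
      <copy>
        <from variable="interactionMsg" part="objectId"/>
        <to variable="actionMsg" part="objectId"/>
      </copy>
    </assign>

    <!-- Order the VR system to perform the resulting action. -->
    <invoke partnerLink="vrSystem" portType="tns:vrClientPortType"
            operation="performAction" inputVariable="actionMsg"/>
  </sequence>
</process>

Deployed on an engine such as Apache ODE, a process of this kind appears to its partners as an ordinary Web Service described by a WSDL interface.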

3.3 Web Services

The World Wide Web Consortium (W3C, http://www.w3.org/) defines a Web Service as a software system designed to support interoperable machine-to-machine interaction over a network. It has an interface described in a machine-processable format (specifically WSDL). Other systems interact with the Web Service in a manner prescribed by its description using SOAP messages, typically conveyed using HTTP with an XML serialization in conjunction with other Web-related standards (W3C, 2004b). Web Services provide application programming interfaces (API), which are defined using a unique Uniform Resource Identifier (URI) and can be accessed using HTTP, and are therefore also called web APIs. The technologies, protocols and architectures used by Web Services are submitted to the W3C, so the key objective behind their development, which is to achieve a high level of interoperability and software portability, can be reached by defining standards. The Web Services Interoperability organization (WS-I, http://www.ws-i.org/) establishes best practices for Web Service interoperability which have to be considered when implementing a Web Service to guarantee interoperability across platforms, operating systems and programming languages. These rules are published in the so-called "WS-I Basic Profiles". Web Services use the SOAP protocol to exchange information between the provider and the consumer of a service. The interface of a Web Service is described by the XML-based Web Service Description Language (WSDL), which is also standardized by the W3C (see Moreau et al. (2006) and Chinnici et al. (2007)). A WSDL document contains definitions for the elements (<element>) which are contained in its messages and for their data types (<types>), and for the messages transmitted by the service (<message>). The operations (<operation>) provided by the service to send and/or receive the messages are encapsulated by the definition of the port type, respectively the interface (<portType> or <interface>). The protocol and the data format used for the operations and their messages are defined in the binding of the service (<binding>). Finally, the service (<service>) is connected with its port (<port>). See listing 7 for an example of a WSDL file. Web Services can be utilized to connect distributed systems; in this context the design pattern called Service Oriented Architecture (SOA) is often applied.
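For orientation, the following schematic fragment sketches how these WSDL elements fit together for a hypothetical one-operation service; all names and the endpoint address are invented, and the WSDL actually used in this work is the one shown in listing 7.

<definitions name="ConnectorService"
             targetNamespace="http://example.org/connector"
             xmlns="http://schemas.xmlsoap.org/wsdl/"
             xmlns:soap="http://schemas.xmlsoap.org/wsdl/soap/"
             xmlns:tns="http://example.org/connector"
             xmlns:xsd="http://www.w3.org/2001/XMLSchema">

  <!-- Data types of the exchanged elements -->
  <types>
    <xsd:schema targetNamespace="http://example.org/connector">
      <xsd:element name="objectId" type="xsd:string"/>
    </xsd:schema>
  </types>

  <!-- Messages transmitted by the service -->
  <message name="performActionRequest">
    <part name="objectId" element="tns:objectId"/>
  </message>

  <!-- Operations grouped into a port type -->
  <portType name="ConnectorPortType">
    <operation name="performAction">
      <input message="tns:performActionRequest"/>
    </operation>
  </portType>

  <!-- Protocol and data format: SOAP over HTTP, literal encoding -->
  <binding name="ConnectorBinding" type="tns:ConnectorPortType">
    <soap:binding transport="http://schemas.xmlsoap.org/soap/http"/>
    <operation name="performAction">
      <soap:operation soapAction=""/>
      <input><soap:body use="literal"/></input>
    </operation>
  </binding>

  <!-- The service and its port with a concrete address -->
  <service name="ConnectorService">
    <port name="ConnectorPort" binding="tns:ConnectorBinding">
      <soap:address location="http://localhost:8080/connector"/>
    </port>
  </service>
</definitions>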

3.4 SOAP

The SOAP Version 1.2 Part 0: Primer describes SOAP in the following way: SOAP Version 1.2 provides the definition of the XML-based information which can be used for exchanging structured and typed information between peers in a decentralized, distributed environment. SOAP is fundamentally a stateless, one-way message exchange paradigm, but applications can create more complex interaction patterns (e.g., request/response, request/multiple responses, etc.) by combining such one-way exchanges with features provided by an underlying protocol and/or application-specific information. SOAP is silent on the semantics of any application-specific data it conveys, as it is on issues such as the routing of SOAP messages, reliable data transfer, firewall traversal, etc. However, SOAP provides the framework by which application-specific information may be conveyed in an extensible manner. Also, SOAP provides a full description of the required actions taken by a SOAP node on receiving a SOAP message. (W3C, 2007)

Originally SOAP was short for Simple Object Access Protocol, but this long form was dropped when SOAP became a W3C standard with version 1.2 in 2003. SOAP is a specification for a network protocol located in the application layer. It uses transport protocols like HTTP, SMTP, FTP, RPC, or others for message transport and negotiation. The messages constructed by SOAP are defined in an XML-based structure. They consist of a SOAP envelope with an optional header and a body. The header can be used to provide a mechanism for extending a SOAP message in a decentralized and modular way (Gudgin et al., 2007). It can, for example, contain information that is provided by other applications, like routing tables or keys for decryption. The body of a SOAP message carries the actual data. An exemplary SOAP message is provided by listing 1.

<?xml version="1.0" encoding="UTF-8"?>
<soap:Envelope xmlns:soap="http://www.w3.org/2003/05/soap-envelope">
  <soap:Header>
    <!-- optional header blocks, e.g. routing information or security tokens -->
  </soap:Header>
  <soap:Body>
    <!-- application-specific payload -->
  </soap:Body>
</soap:Envelope>

Listing 1: Basic structure of a SOAP message.

The envelope is sent inside the body of an HTTP POST request if HTTP is used as the transport protocol. In the case of Web Services, SOAP is utilized to build the Web Services protocol stack and provides the framework for the Web Service messages. In this work the SOAP implementation for C++ called gSOAP (http://www.cs.fsu.edu/~engelen/soap.html) is used to automatically construct SOAP and XML data bindings. The tools provided by gSOAP use advanced mapping methods and autocode generation to develop Web Services applications in C and C++. They support the integration of (legacy) C/C++ codes (and other programming languages when a C interface is available), embedded systems, and real-time software in SOAP/XML applications that share computational resources and information with other SOAP applications, possibly across different platforms, language environments, and disparate organizations located behind firewalls (Engelen, 2010).
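To illustrate how such an envelope travels over the wire, the following sketch shows a complete HTTP POST request carrying an invented performAction operation in its body; the operation name, namespace and endpoint are placeholders and do not correspond to the interfaces actually implemented in this work.

POST /connector HTTP/1.1
Host: vr-host.example.org:8080
Content-Type: application/soap+xml; charset=utf-8

<?xml version="1.0" encoding="UTF-8"?>
<soap:Envelope xmlns:soap="http://www.w3.org/2003/05/soap-envelope"
               xmlns:con="http://example.org/connector">
  <soap:Body>
    <!-- hypothetical operation: ask the receiver to act on a scene object -->
    <con:performAction>
      <con:objectId>cockpit/toggleSwitch01</con:objectId>
      <con:action>rotate</con:action>
    </con:performAction>
  </soap:Body>
</soap:Envelope>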


4 Requirements

4.1 Scenario Description and Stakeholder Analysis

As outlined in the introduction (section 1), the context of this work is Virtual Prototyping. Cockpit designs for helicopters and airplanes need to be reviewed in a virtual environment with regard to ergonomical and functional properties. To enable the user of the Virtual Prototype to, for example, push a button and experience a reaction inside the virtual environment, say a control light that turns green, the VDP needs to be connected with a functional simulation. Since the VDP doesn't offer a module to connect to any kind of functional simulation, it is necessary to implement such a module.

The complete system, consisting of the virtual environment, the functional simulation and of course the module itself, concerns a number of people from different areas of work. To get a sense of what the module needs to be able to do, first all the participants have to be identified.

• Test Pilot: The most obvious participant is of course the person interacting with the simulated objects inside the virtual environment. This person will hereinafter be referred to as the Test Pilot. The Test Pilot will experience the effects of the coupling of the virtual environment with the functional simulation, because interactions with objects inside the virtual environment will result in the appropriate predefined event.

• VP Developer: Before the Test Pilot can use the system, the Virtual Prototype has to be built. A 3D model of the product that will be tested has to be loaded into the virtual reality software, the functional simulation needs to be created, and the module will be used to connect the virtual environment with the functional simulation. All these tasks require a person that has the knowledge and skills that are necessary to perform them. This participant will be called the Virtual Prototype Developer or VP Developer. He will be responsible for creating, deploying, attending and maintaining the Virtual Prototype.

• Designer: To depict the product inside the virtual environment, some sort of description of the product is needed. Designers often use 3D modeling tools like CAD software to build a representation of the product. The data from these modeling tools can be used by the VR software. Either the files can be imported directly into the virtual environment or they need to be reformatted to a data format that can be read by the VR software.

• Engineer: Just like the Designers, the Engineers are also an important source for input data. They too use 3D modeling software in their design process that can be used to provide data for the virtual environment. Additionally, the processes that need to be simulated by the functional simulation will be specified by them.

• VR System: The module will be developed to connect two simulations. One of them is the virtual reality system that is used to provide a realistic environment in which the product can be tested. The module will be a part of the VR System.

• Functional Simulation: The Functional Simulation will be used to describe certain sequences of events that have their origin in interactions performed inside the virtual environment. State changes inside the Functional Simulation, on the other hand, can result in a visual output inside the VR System.

Now that all of the participants are identified, their expected interactions with the system need to be assessed. The possible use cases coming from the participants will be described in the following.

The Test Pilot needs to be able to interact with objects inside the Virtual Prototype. If the object is related to a process in the functional simulation, the module will have to notify the simulation. The functional simulation will handle the notification and check if the interaction is sufficient to cause a state change. This state change could have effects on other objects both inside the functional simulation and inside the virtual environment. Thus it is imperative that the functional simulation is able to notify the module about state changes on any object and order the module to perform the action that is related to this state change. Possible actions will be a rotation of a scene object, a translation of a scene object or a texture change. This specific use case is also described in figure 6.

Figure 6: Use case diagram for the scenario where the module will be used by the Test Pilot.

The VP Developer has to build the Virtual Prototype. First, he will need the 3D data describing the scene the Test Pilot will interact with inside the virtual environment. The geometry for the scene will be obtained either from the Designers or the Engineers. The files provided by them may not be in the file format the virtual reality software requires. The VP Developer will have to reformat the data so that it can be read by the virtual reality software, in this case by the VDP. Next, he will have to build the Functional Simulation that will later be connected with the virtual environment. The Engineers may have already built some sort of functional simulation to test their design. If possible, this simulation will be connected to the virtual environment by using the module. In case the functional simulation the Engineers provided cannot be connected by using the module, the VP Developer will have to build a new functional simulation which can be connected to the VR system by using the module. Now that all the groundwork is done, the VP Developer will use the module for a bidirectional coupling of the two independent systems. He will have to connect scene objects from the virtual environment with their corresponding processes in the functional simulation. After he has made all the necessary configurations to deploy the Virtual Prototype, it will be helpful to be able to save these settings. This way he can simply load the settings he already made and start again right where he left off, without having to do the same work every time the prototype is restarted. In case one of the scene objects that is already connected is no longer needed, it will be necessary to delete the settings made for this scene object. And to delete all configurations made so far, a reset command will be useful. See figure 7 for the resulting use case diagram for the VP Developer.

Figure 7: Use case diagram for the scenario where the module will be used by the VP Developer.

The Designers and Engineers create the demand for a realistic test environment. Their concepts and designs can be seen as the input data for this system. Both of them provide the system with the geometric objects that will be displayed in the virtual environment. The Engineers additionally deliver the kinetic sequences and functional dependencies associated with these objects. The Functional Simulation will be used to simulate these sequences and dependencies by describing them through processes. Both the Designers and the Engineers can also act as Test Pilots if they perform the tests themselves. Therefore the use case diagram in figure 6 might also apply to the Designers and Engineers.

The VR System will be used to create the virtual environment; the more realistically and immersively the environment is perceived by the Test Pilot, the better. The VR System will consist of the Visual Decision Platform (VDP) by IC:IDO and the module that has to be developed to allow the system to communicate with the Functional Simulation. Aside from the need for realistic graphics and the possibility for an intuitive interaction, which the VDP delivers, it is imperative for this scenario that objects can be connected to a process in a Functional Simulation. Once connected to their processes, the VR System will have to listen for any changes on these objects. If changes occur that are sufficient to cause the need to inform the Functional Simulation, the VR System will have to send a message with a description of the change. On the other hand, if a message from the Functional Simulation is received, the VR System will have to be able to react accordingly and perform any action these messages are supposed to initiate.

The Functional Simulation is used to run processes which will represent the behaviour and relationships of certain objects in the virtual environment. Some objects are related to other objects, and an interaction, respectively a state change, for one object may also affect another object. The simulation of these causal relationships and the possibility to inform the VR System about the necessary actions that might result from them are the key factors of the Functional Simulation. Just like the VR System, the Functional Simulation has to be able to receive messages from and send messages to its counterpart. If a message is received, the information contained in it will be used to check if the object that sent the message has changed its state and if this state change has any implications on other objects that might be related to it. In case another object is affected by a message and consequently has changed its state, the VR System needs to be informed of this change, and the action this state change requires needs to be performed by the VR System.

In figure 8 the use cases for both the VR System and the Functional Simulation are described.

Figure 8: Use case diagram for the VR System and the Functional Simulation.

In the following sections the requirements elicited for the system as outlined above will be specified. A number of different classification methods for requirements exist, like the FURPS model developed by Hewlett-Packard and published by Grady and Caswell (1987). For this work the requirements will be divided into functional and non-functional requirements. A functional requirement is a requirement that specifies a function that a system or system component must be able to perform (IEEE, 1990). The non-functional requirements will be subdivided into:

• the technical requirements, which are used to describe the requirements related to the hardware and software components that are used to develop and deploy the system;

• the requirements concerning the user interface, which describe the requirements related to the graphical user interface (GUI) and the input and output devices;

• and the requirements concerning the quality of the system, which describe the quality features of the system. There are qualities related to features that are observable at runtime, like usability or security, and qualities related to the development of the system, like maintainability or extensibility.

4.2 Functional Requirements

Requirement ID | Priority | User | Requirement
FR 1 | Must | Test Pilot | The module must allow a bidirectional communication between the VR System and the Functional Simulation.
FR 1.2 | Must | Test Pilot | If a scene object is related to a process in the Functional Simulation and it is transformed in the VR System, it must inform the Functional Simulation about the transformation.
FR 1.3 | Must | Test Pilot | If a state change occurs in a process run by the Functional Simulation, the Functional Simulation must be able to order the VR System to perform the action related to this state change.
FR 1.3.1 | Must | Test Pilot | The Functional Simulation must be able to order the VR System to rotate a scene object.
FR 1.3.2 | Must | Test Pilot | The Functional Simulation must be able to order the VR System to translate a scene object.

Table 1: Functional Requirements Part 1

Requirement ID | Priority | User | Requirement
FR 1.3.3 | Must | Test Pilot | The Functional Simulation must be able to order the VR System to change the texture of a scene object.
FR 2 | Must | VP Developer | The VDP module must enable the VP Developer to construct a Virtual Prototype by using both the VR System and the Functional Simulation.
FR 2.1 | Must | VP Developer | The VP Developer must be able to import different file formats into the VR System.
FR 2.2 | Must | VP Developer | If the files cannot be imported directly because of their format, the VP Developer must be able to reformat the files describing the scene so that they can be imported.
FR 2.3 | Must | VP Developer | The VP Developer must be able to build calls to the VR System that will be initiated by the Functional Simulation.
FR 2.4 | Must | VP Developer | The VP Developer must be able to build processes in the Functional Simulation which can get input from outside the simulation via LAN or Internet.
FR 2.5 | Must | VP Developer | The VDP module must provide a GUI with all the input fields and methods that the VP Developer needs to connect a scene object with the Functional Simulation.
FR 2.5.1 | Must | VP Developer | The VDP module must provide a way to create a connection between a scene object and a process in the Functional Simulation.

Table 2: Functional Requirements Part 2

Requirement ID | Priority | User | Requirement
FR 2.5.2 | Must | VP Developer | The VDP module must provide a way to delete a connection between a scene object and a process in the Functional Simulation.
FR 2.5.3 | Must | VP Developer | The VDP module must provide a way to delete all connections between scene objects and their processes in the Functional Simulation.
FR 2.5.4 | Must | VP Developer | The VDP module must provide a way to save all the connections between scene objects and their processes in the Functional Simulation.
FR 2.5.5 | Must | VP Developer | The VDP module must provide a way to load the saved connections.

Table 3: Functional Requirements Part 3

4.3 Technical Requirements

Requirement ID | Priority | System Component | Requirement
TR 1 | Must | VR System | The software for the VR System must be the Visual Decision Platform by IC:IDO.
TR 2 | Must | VR System | The module for the VDP must work for the desktop and the immersive module.

Table 4: Technical Requirements Part 1

Requirement ID | Priority | System Component | Requirement
TR 3 | Must | VR System | The module must work if the VDP is run on a cluster.
TR 4 | Must | VR System | The module must be able to communicate in a local area network.
TR 5 | Must | VR System | The VR System and the Functional Simulation must not be running on the same computer.
TR 6 | Must | VR System | The module must be built with IDO-Develop.
TR 6.1 | Must | VR System | The module must be written in C++ code.
TR 7 | Must | VR System | The operating system for the VR System must be Windows XP.
TR 7.1 | Must | VR System | The VR System must be built for an x64 architecture.

Table 5: Technical Requirements Part 2

4.4 User Interface Requirements

Requirement ID | Priority | Requirement
UIR 1 | Must | The GUI of the VR System must allow the VP Developer to perform the actions defined in the functional requirements (FR 2.5.1 - FR 2.5.5).
UIR 2 | Must | The GUI of the VR System must indicate that a scene object is connected to a simulation.
UIR 3 | Must | If a scene object is selected, the VDP module's GUI must show the configurations of the connection.

Table 6: User Interface Requirements

4.5 Quality Requirements

Requirement ID | Priority | Requirement
QR 1 | Must | The system must guarantee that no inconsistencies occur between an object's state in the Functional Simulation and in the VR System.
QR 2 | Must | The system must be able to display any changes resulting from the communication of the VR System with the Functional Simulation in real time.
QR 3 | Must | The VR System must not be stopped because of a message that is sent to or received from the Functional Simulation.
QR 4 | Must | The communication between the VR System and the Functional Simulation must not result in a deadlock.
QR 5 | Must | The module for the VR System must be developed in a way that it can easily be extended or remodeled to work for different test scenarios.
QR 6 | Must | The module for the VR System must be well documented to allow easy maintenance.
QR 7 | Must | All possible errors resulting from invalid commands or false use must not cause the system to crash.
QR 8 | Should | It should be possible to connect the VR System with different Functional Simulations.

Table 7: Quality Requirements

5 Concept

5.1 VR System Concept

The virtual prototype is mainly perceived through the virtual environment created and run by the VDP. The functional simulation is mainly used to enhance the possibilities of the VR system. Thus it was decided to start with the part of the system that plays the main role with regard to the virtual prototype and to develop a concept to expand its features by providing an interface for bidirectional communication with functional simulations.

Geometric objects as depicted in virtual environments can undergo three different transformations: they can be rotated, they can be translated, and their texture can be changed. Every possible change related to an object and experienced inside the virtual environment can be caused by one of these actions or a combination of them. To enable another program to order the VDP to perform any of these actions, it seems imperative to provide an interface for these three commands. The commands need the following information to allow the VR system to respond accordingly.

• Rotation: To perform a rotation on a scene object the VDP first needs to know which scene object should be rotated. If the command is not issued directly to the scene object, the scene object needs to be unequivocally specified. This can either be done by giving a name or a unique ID to which only the addressed scene object is connected. The rotation can be described in different ways. Since the representation of a rotation as Euler angles is ambiguous and can cause a gimbal lock, it is best to describe the rotation in the form of a quaternion or an axis-angle pair.

• Translation: Just like a rotation, a translation needs information to identify the scene object if the command is not issued directly to the scene object. The translation itself can be described by three float values or a vector composed of three float values.

• Texture Change: The texture change also requires an unmistakable identification of the scene object that is affected by the command. In addition, it needs some sort of information to determine which image should be loaded as the new texture.

This of course is only one of the necessary directions in which communication will take place. Aside from receiving commands, the VR system needs to send information about scene objects too. Mapping the geometric properties of a scene object to a state inside the functional simulation depends on the object that is simulated and must be done by the VP Developer. The mapping should not be performed by the VR system; this would result in a system that is very specific and hard to expand or to fit to different scenarios. The mapping must be performed by the functional simulation itself or by a component that is set in between the two simulations. It is important that the mapping can be created easily and quickly without requiring too much training. The messages the VR system sends to the functional simulation need to contain a subset or all of the following information to allow the functional simulation to map them to a state.

• Name or unique ID of the scene object: The functional simulation must be able to identify the scene object that caused the sending of the message. This scene object then needs to be linked with an object, a state, or a process inside the functional simulation.
• Transformation the scene object just underwent: To determine if a state transition needs to be performed by the functional simulation, it is necessary to examine the transformation data sent by the VR system. The identification of the scene object and the transformation data can be used to assign a state change that corresponds to the received data. The transformation data can consist of a rotation and/or a translation. This information can be sent in different formats or only in part; for example, just the rotation data could be transmitted.

• Timestamp for the event: A timestamp is necessary to ensure that only up-to-date information is used. This way, messages that for some reason arrive out of order or are already made irrelevant by other messages can be spotted and discarded. Also, the timestamp can be sent back in a response message. The VR system can use the returned timestamp to determine if the received answer is relevant or out of date.

Some switches used in aircraft cockpits can only be set to a certain position after a lock mechanism has released the switch. One example for this is a type of toggle switch that needs to be lifted before it can be tilted to another angle (see figure 9). This lock status is not considered in this concept. The problem with such a mechanism is that if the Test Pilot is able to tilt the switch inside the virtual environment regardless of the lock mechanism, it will not help to model this state. Discrepancies between the saved lock status and the actual position of the switch could result in inconsistencies between the two simulations. The mechanism needs to be modeled by defining geometric constraints on the switch. This is possible with the help of an existing VDP module called IDO:Package. By defining the geometric constraints for the switch, the Test Pilot will only be able to move the switch in the directions permitted by the constraints. Thus modeling the lock status is redundant and only the position of the switch needs to be considered when mapping from the scene object to a state in the functional simulation.

Figure 9: Toggle switch with a lock mechanism.
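To make the message content listed above more concrete, the following is a minimal C++ sketch of the data such an outgoing notification could carry. The struct and field names are illustrative assumptions and not the message types actually generated from the WSDL files used in this work.

// Hypothetical sketch of the data a "scene object was transformed" message
// could carry; names and layout are assumptions made only for illustration.
#include <string>

struct RotationData {
    float angle;                  // axis-angle representation avoids the
    float axisX, axisY, axisZ;    // ambiguity of Euler angles
};

struct TranslationData {
    float x, y, z;                // offset along the three coordinate axes
};

struct SceneObjectNotification {
    std::string objectId;         // name or unique ID of the scene object
    RotationData rotation;        // rotation part of the transformation
    TranslationData translation;  // translation part of the transformation
    long timestamp;               // used to discard outdated messages
};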

To provide a way of constructing the virtual prototype and connecting it with the functional simulation from inside the VDP, a module with a graphical user interface is implemented instead of a plugin. With the help of the module, the VP Developer has a user-friendly tool to connect the virtual prototype with the functional simulation. The functional requirements and the requirements for the user interface (see sections 4.2 and 4.4) can be used to determine the functions the GUI has to offer.

5.2 Functional Simulation Concept

Now that the concept for the communication interface which the VR system will provide is outlined, it is time to regard the functional simulation. The input received by the functional simulation and the requests it can send to the VR system result from the messages sent to and from the VR system. This creates some requirements concerning the possibilities the functional simulation needs to offer in order to be connected to the VR system.

• First of all, it must be able to send and receive messages.
• Also, the VP Developer must be able to specify the information contained in those messages.
• The simulation of a functional relationship associated with scene objects from inside the virtual environment must be easy to create.
• The processes running in the functional simulation must provide a way to trigger the forwarding of a message to the VR system.
• The communication must be asynchronous so that a process does not have to wait for a reply and can still be addressed by other communication partners.
• The simulation must be able to receive and interpret the messages sent from the VR system and to send calls for the three basic methods the VR system offers (see section 5.1).

These requirements led to the decision to use the Business Process Execution Language (see section 3.2) to build the functional simulation. The Business Process Execution Language provides features like the usage of query and expression languages; it can be used to assign data, and it offers structured activities like conditional behaviour, repetitive execution, selective event processing, parallel and control dependency processing, or the ability to process multiple branches. Thus the processes built using BPEL can be considered Turing complete and are able to model any functional behaviour needed for the scenario of this work. BPEL also offers the ability to invoke Web Services from inside the processes, which is exactly what is needed to trigger events inside the virtual environment. The processes created with the use of BPEL can be deployed as Web Services, and other Web Services can be dynamically invoked from within the processes. The incoming and outgoing messages can be defined by the VP Developer to comply with the scenario of the virtual prototype. A graphical user interface that allows the construction of processes exists in the form of the open source BPEL editor provided by Eclipse [17], a GEF-based editor that provides a graphical means to author BPEL processes (The Eclipse Foundation, 2010). Another benefit gained from using BPEL is that the BPEL processes can also be used to include other sources. This way, a process can orchestrate the incoming messages from the VR system and distribute them to other functional simulations. For example, a notification about a rotated scene object could be mapped to a state change event in Rhapsody, and the BPEL process could send the resulting call to Rhapsody's web interface. Many different simulation tools that already provide an HTML or Web Service interface could be incorporated this way.

[17] http://www.eclipse.org/bpel/

5.3 Communication Concept

As a result of the choice to use BPEL to simulate the functional behavior of objects, the use of Web Services (see section 3.3) as the communication platform arose. The module developed for the VDP has to include a Web Service through which BPEL processes can trigger the three basic events Rotate, Translate and Change Texture. The VDP module also has to provide a client to allow the VR system to send information to the BPEL processes. The requests sent to the VR system will always be the three basic commands Rotate, Translate and Change Texture, but the requests sent to the functional simulation might be different for every simulated scenario. Thus it is necessary that the client side of the VR system that sends the requests to the functional simulation can easily be modified to allow these changes. This must be considered when developing the system architecture for the VDP module. To work around the problem of having to identify the object which sent a message or was addressed by a message, it was decided to create a service and a client for every object that has to be connected. By doing so, every object can unambiguously be identified by the address of its Web Service.

5.4 General Architecture

The architecture resulting from the concept described above is now specified. As defined in the technical requirement TR 5 (see section 4.3), the functional simulation and the VR system must not be running on the same computer. This results from the fact that both simulations are attended by different teams, which leads to two independent work environments. To comply with the principle of Separation of Concerns, a loose coupling of the two simulations is required. This way both teams can develop their simulations without having to regard requirements from the other team. As a result of this requirement the two simulations need to run on different computers, thus a distributed system is built. To connect the two separate components, a Service Oriented Architecture is implemented using SOAP and WSDL. In the W3C document defining a Web Service architecture for distributed systems it is stated that, in general, SOA and Web Services are most appropriate for applications:

• where components of the distributed system run on different platforms and vendor products;
• where an existing application needs to be exposed for use over a network, and can be wrapped as a Web Service. (W3C, 2004a)

Both of these reasons apply to this scenario. The VR system needs to enable the functional simulation to initiate rotations, translations, or texture changes, and the functional simulation must provide processes which receive information from the VR system and subsequently decide about state changes.

In a paper on the subject of "Bridging 3D Graphics and Web Services", which was written to suggest possible solutions for the scenario of this work, Schubotz (2009) addresses a scenario similar to this one. By using Web Services to build a Service Oriented Architecture, he aims to extend the network capabilities of X3D. In this paper three different approaches to extend the network capability of X3D are presented, called External Networking, Internal Networking and Nodal Networking (Schubotz, 2009). The solution presented in this work is related to the Nodal Networking approach. Just like in X3D, the scene objects in the VDP are all represented as nodes in a scene graph. This is a basic approach in all computer graphics related solutions. To allow scene objects to communicate with other programs, respectively to receive and send information, a Connector Node similar to the ServiceConsumer node and the Connection node (Schubotz, 2009) is used to provide this ability. The Service Oriented Architecture used for the solution presented in this work can be seen in figure 10.

By using the graphical user interface of the VDP module, the VP Developer will create Connector Nodes for each scene object he wants to connect with the functional simulation. The module will equip every Connector Node with a Web Service so that the scene object can be manipulated from outside the VR system. To allow the Connector Node to send information to the functional simulation, a client for the Web Service of the BPEL process that simulates the functional behavior of the scene object will also be included in the Connector Node.

Figure 10: Architecture of the system developed for this work.

If the Test Pilot interacts with the scene object, the client will send all the necessary information to the corresponding BPEL process. To notice that an interaction with its scene object occurred, the Connector Node will have to listen for transformations of its scene object. The BPEL processes used to model the functional behavior of the scene objects can be designed with the help of Eclipse's BPEL editor (see figure 11).

Figure 11: GUI of Eclipse's open source BPEL editor.

The processes will be deployed as Web Services and will be contacted by the scene objects which hold the functionalities modeled by them. The transformation data sent by the clients of the Connector Nodes will be used to determine the state of the scene object. If the transformation was not sufficient to cause a state change, nothing has to be done by the process. Supposing the transformation data caused a state change, the process must inform the VR system. Depending on the state, the process will dynamically invoke the Web Service of the scene object that has to be transformed or have its texture changed as a result of the state change. To give a better understanding of the whole procedure, an exemplary sequence of events will be described: a toggle switch is tilted by the Test Pilot and, as a result of this, a control light must turn green.

1. The scene object depicting the toggle switch is tilted by the Test Pilot inside the virtual environment.
2. The Connector Node created for this toggle switch notices the transformation of the scene object.
3. To inform the functional simulation, the client of the Connector Node sends the transformation data to the Web Service of the BPEL process which models the functional behavior of this switch. In the case of the toggle switch only the rotation data is needed.
4. The Web Service of the BPEL process receives the information about the interaction.
5. To determine the state of the switch, the BPEL process examines the rotation data and maps it to the appropriate state.
6. Since the toggle switch is related to a control light, the process will now invoke the Change Texture method provided by the Web Service of the Connector Node that was created for the scene object depicting the control light. The information contained in the message must be sufficient to specify the image, which will then be set as the new texture.
7. The Web Service of the Connector Node for the control light receives the information contained in the Change Texture call.
8. The Change Texture call is noticed by a listener in the Connector Node.
9. The Connector Node hands the information received in that message over to a method which performs the texture change.
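To make step 5 of this sequence more concrete: the mapping from rotation data to a discrete switch state is performed inside the BPEL process, but its logic can be illustrated with a small C++ sketch. The threshold and the state names below are assumptions chosen only for this example.

// Illustration of the kind of mapping a toggle switch process performs in
// step 5; in this work the logic lives in a BPEL process, and the threshold
// and state values are assumptions.
enum SwitchState { SWITCH_OFF, SWITCH_ON };

SwitchState mapRotationToState(float tiltAngleDegrees)
{
    // A tilt beyond roughly half of the mechanical travel is read as "on".
    const float threshold = 15.0f;
    return (tiltAngleDegrees > threshold) ? SWITCH_ON : SWITCH_OFF;
}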

Now that the general approach for the bidirectional coupling of a VR system with a functional simulation and the architecture that is used to perform this coupling are set, it is time to go into the details related to this approach. The VDP module is broken down into its components, with the Connector Node being one of them. The implementation of the module and all of its classes is described in the following section 6.

6 Implementation

The functional simulation created with the use of the Business Process Execution Language (BPEL) does not need any extensions to be connected. The open source BPEL editor provided by Eclipse is used to create the processes that are utilized to simulate the functional behavior of certain scene objects from inside of the virtual environment. To publish the Web Services for the processes, the Apache ODE [18] is used to deploy them on an Apache Axis2 [19] server. The steps necessary to publish the BPEL processes as Web Services will be described in section 6.6.

In this section, the implementation of the VDP module is described. The module is implemented with the use of the application programming interface (API) IDO:Develop provided by IC:IDO. IDO:Develop is used in the 2010 version with service pack 2 for a 64 bit system. Due to the application programming interface IDO:Develop, the programming language used to implement the module for the VDP is C++. To work with IDO:Develop, Microsoft Visual Studio 2005 [20] is used as the integrated development environment (IDE). With the help of CMake [21] it is possible to create all the files for a Microsoft Visual Studio 2005 solution which also contains some basic tutorials for IDO:Develop. The code for this work is included into this solution, and a CMakeList.txt file is written to build the new Visual Studio project and integrate it into the solution provided by IDO:Develop. To provide a way to deal with the mathematical problems linked with the transformations caused by user interaction, especially those of linear algebra, the Generic Math Template Library [22] (GMTL) is supplied within the IDO:Develop package and is linked by CMake. In case the implementation requires means to create objects for a graphical user interface, the QT [23] library is also linked to the project configuration created by CMake. Since the module developed for the VDP is supposed to provide a graphical user interface, a QT distribution needs to be specified in the CMakeList.txt file. The newest QT version (4.6.3) could not be compiled with IDO:Develop, so QT 4.4.3 [24] for Windows is used for this work.

[18] http://ode.apache.org/
[19] http://tomcat.apache.org/
[20] http://msdn.microsoft.com/en-us/vstudio/dd430910.aspx
[21] http://www.cmake.org/
[22] http://ggt.sourceforge.net/html/gmtlfaq.html
[23] http://qt.nokia.com/
[24] ftp://ftp.qt.nokia.com/qt/source/qt-win-opensource-src-4.4.3.zip

The cluster used to run the simulated virtual environment uses a 64 bit architecture; thus the C++ code written for the module has to be compiled for 64 bit. A UML class diagram of all the classes implemented for the module can be seen in figure 30 in the appendix, section 9.

6.1 The VDP Module

To add all the features and functions needed for the bidirectional coupling of the VDP with a functional simulation, two possible ways are provided by IDO:Develop. The first way is to develop a plugin. A plugin can be dynamically loaded, initialized and destroyed during runtime; it is provided as a way to integrate new features. Another way to integrate new features into the VDP is a module. A module is an extension of a plugin. IC:IDO's VDP already comes with a variety of modules which serve different purposes, for example the IDO:Package module, which is used in this work to create constraints for scene objects. One characteristic of a module is that only one module can be active at a time. A module can be stopped and restarted at any time by using the "Module-Switch-Bar" in the graphical user interface of the VDP. The characteristic that is most important for this work is that modules are equipped with an API which can be used to integrate them into the QT based graphical user interface of IDO:Explore. Menus, dialogs and all the other QT features can be used by the module to interact with the user. Because of this possibility to create a user interface, the module approach is chosen for the solution developed in this work. All modules are managed by the ModuleManager, which provides an API to add new modules to the 2D user interface. The ModuleManager is responsible for starting and stopping the modules. By using the ModuleManager, a module can be added to the so-called "Module-Switch-Bar", where all the modules that are integrated in the VDP have a button to activate them. It is also possible to create a "Module-Menu" or a "Module-Docking-Dialog" for a module. This plugin and module architecture is described in the Programmer's Guide for IDO:Develop (IC:IDO, 2007) and can be seen in figure 12.

To create one's own module and integrate it by using the ModuleManager, the new module must be derived from the IDOModule parent class. The two virtual functions suspend(IDOModule *module) and resume(IDOModule *module) have to be implemented in the newly created module. The function suspend(...) is called when the ModuleManager deactivates the module because another module is selected, and the function resume(...) is called when the ModuleManager activates the module again.

Figure 12: Plugin and module architecture of the VDP.

During runtime the module is first created by calling the initialize() function, where an instance of the class that is responsible for the module's user interface is created. The module class implemented in this work is called FunctionalityConnectorModule; its main purpose is to encapsulate all the features implemented for the VDP as a result of this work in a module and to make them available by exporting them to a dynamic link library (DLL). Additionally, the FunctionalityConnectorModule will create an instance of the class FunctionalityConnectorModuleUI when the initialize() function is called. This class is used to create the user interface for the module. All the implementation details regarding the user interface of the module will be addressed in section 6.4. See figure 13 for the UML class diagram of the FunctionalityConnectorModule and the FunctionalityConnectorModuleUI class.
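To illustrate the structure just described, the following is a minimal sketch of how such a module class might look. The exact signatures of the IDO:Develop interfaces (IDOModule, suspend, resume, initialize) and the way the UI object is stored are assumptions and not the verbatim code of the module.

// Minimal sketch of a VDP module derived from IDOModule; signatures and the
// UI helper handling are assumptions made for illustration.
class FunctionalityConnectorModule : public IDOModule
{
public:
    FunctionalityConnectorModule() : m_ui(0) {}

    // Called once during runtime; creates the object that builds the GUI.
    virtual void initialize()
    {
        m_ui = new FunctionalityConnectorModuleUI();
    }

    // Called when the ModuleManager deactivates this module because another
    // module was selected in the Module-Switch-Bar.
    virtual void suspend(IDOModule *module)
    {
        // e.g. hide the docking dialog and detach listeners
    }

    // Called when the ModuleManager activates this module again.
    virtual void resume(IDOModule *module)
    {
        // e.g. show the docking dialog again
    }

private:
    FunctionalityConnectorModuleUI *m_ui;
};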

6.2 The Connector Node

The concept to use a Connector Node to bidirectionally connect the VDP with a functional simulation, as outlined in section 5.4, led to the development of a series of classes, which are described in the following pages. The IDOServiceConnector class corresponds to the concept of the Connector Node. The IDOServiceConnector class is used as the parent class for four different Connectors which were designed to address the scenario of this work. Its functions implementing the Web Service operations are virtual and need to be implemented by these classes.

Figure 13: UML class diagram of the FunctionalityConnectorModule and the FunctionalityConnectorModuleUI class.

The focus of this work is to bidirectionally connect control devices located in an aircraft cockpit to a functional simulation; therefore, the Connectors are designed for different types of switches. The IDOToggleSwitchConnector, the IDORotarySwitchConnector, the IDOPushButtonConnector and the IDOTextureConnector are all derived from the parent class IDOServiceConnector and only implement the Web Service operations that are relevant for them. See figure 14 for a UML class diagram of this specialisation. When adding a new Connector during runtime, a variable of the parent class type is created, and an instance constructed by calling the constructor of one of the specialized classes is assigned to that variable. In this way, all the different Connectors can be created, stored and deleted as if they were of the same type, and it is not necessary to implement functions for every different Connector type. This design pattern is called Factory Method and was introduced by the so-called "Gang of Four" (Gamma et al., 1995). To simplify the following explanations concerning all the classes created for the framework of this work, the different Connectors will all be referred to by their parent class IDOServiceConnector.
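As a rough illustration of this creation scheme (not the module's actual code), a Connector of a specialized type might be created and handled through a pointer to the common base class as follows. The type constants and the missing constructor arguments are assumptions.

// Sketch of creating specialized Connectors through the common base class;
// the type constants and constructor arguments are assumptions.
IDOServiceConnector* createConnector(int connectorType)
{
    IDOServiceConnector *connector = 0;

    switch (connectorType) {
    case TOGGLE_SWITCH: connector = new IDOToggleSwitchConnector(); break;
    case ROTARY_SWITCH: connector = new IDORotarySwitchConnector(); break;
    case PUSH_BUTTON:   connector = new IDOPushButtonConnector();   break;
    default:            connector = new IDOTextureConnector();      break;
    }

    // From here on the Connector is handled only through the base class, so
    // it can be stored, started and deleted uniformly.
    return connector;
}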

If a scene object is connected to a functional simulation, an instance of the IDOServiceConnector class is created, and all other important objects needed for the connection will be created for, through, or with this instance.

Figure 14: UML class diagram showing the different specialised Connector classes and their parent class.

The goal is to enable the user to connect a scene object with the functional simulation that models the functional behavior and the functional relationships of the scene object. This involves four important tasks the Connector Node has to be able to fulfill. It needs to send information to the functional simulation if a scene object that is connected via a Connector Node is transformed by user interaction, and it needs to receive information from the functional simulation and perform the actions required by that information. Before the Connector Node can send and receive messages to and from the functional simulation, it must be able to recognize a user interaction with its scene object. Also, it must recognize if a message was received by its Web Service and then perform the action required by this message.

6.2.1 Noticing User Interaction

The VDP uses an event driven approach based on a Listener object which connects the Sender of an event with its designated Recipient. It is possible to define new types of events by using so-called IDOGUIDs, but in this case the VDP already provides an event type which can be used to identify an interaction with a specified scene object. The IDOSceneObject class provides an event type called
SCENEOBJECT_PROPERTY_CHANGED, and one of the subevents of this event type, called SCENEOBJECT_ON_SET_TRANSFORM_BS, describes the event of a scene object being transformed. This is exactly the event which is needed to describe a user based interaction with a scene object. Now that the Sender of the transformation event and the event to listen for are known, the last thing that is missing is the Recipient. The Recipient is called from inside the Listener and handles the events. The Connector Node itself could be made the Recipient, but to comply with the principle of "Separation of Concerns", it was found to be better to create an independent class. Each IDOServiceConnector object holds a member variable, called m_recipient, which is an instance of this class. To enable the Recipient in the Connector Node to handle the events, a Listener must be created which calls the Recipient. The scene object the Connector Node is responsible for is the event Sender for this Listener, and the Recipient is provided by the Connector Node in form of a member variable. It is not necessary for the scenario of this work to react to every transformation. In the case of a toggle switch only rotations, and only sufficiently large ones, need to be considered. In the case of a push button, only sufficiently large translations need to be considered. Thus three different Listeners were implemented to meet the different requirements: one Listener for rotations, one for translations and one for transformations in general. These Listeners examine the transformation matrices for every transformation event, but only call their Recipient to handle the event if the transformation event matches their criteria. This way, transformations that are too small, or that do not match the kind of transformation connected with the functionality of the scene object, cannot result in messages being sent to the functional simulation. Unnecessary network traffic is thereby prevented and communication is limited to the exchange of useful and important information. The classes developed for the solution presented in this work can be seen in the UML diagram in figure 15. The main class for implementing the Connector Node concept is called IDOServiceConnector, the Recipient for the Connector is called IDOServiceConnectorRecipient, and the three different Listeners are called IDOSceneObjectListener, IDOSceneObjectTranslationListener and IDOSceneObjectRotListener.

The three Listener classes are all derived from the template class IDOListener.

Figure 15: UML class diagram with all the classes used to process user interaction.

The IDOSceneObject the Listener is watching is the template argument for the Sender, and the IDOServiceConnectorRecipient held by the IDOServiceConnector object is the Recipient. All three Listeners implement the virtual functions requestSubEventType(size_t subEventType) and update(IDOEventType::Type type, size_t subEventType, void* data=0) of their parent class. The requestSubEventType(...) function checks all events for their subevent type and returns a boolean value. If the subevent type or types specified in the requestSubEventType(...) function occurred, the update(...) function is called. This function is then used to call the Recipient to handle the event. The update(...) function of all three Listener classes implemented for this work is called if the subevent type is SCENEOBJECT_ON_SET_TRANSFORM_BS. All three Listener classes request the current transformation matrix from their scene object, which is also their Sender, and examine this matrix. The IDOSceneObjectRotListener checks for a sufficiently large rotation and, if one is found, calls its Recipient to handle the rotation. The IDOSceneObjectTranslationListener checks the transformation matrix of its Sender for a sufficiently large translation and then calls its Recipient to handle the translation. And the IDOSceneObjectListener checks for both a sufficiently large translation and/or rotation and then calls its Recipient to handle the transformation. All three Listeners add a matrix containing the transformation data as a parameter to their handle call, so that their Recipient has the information it needs to handle the transformation event. The IDOServiceConnectorRecipient class is derived from the abstract class IDOListenerContainerRecipient. To handle the transformation events noticed by the Listeners, it provides the handleSOEvent(gmtl::Matrix44f matrix) function. This function extracts the translation and the rotation from the matrix and, depending on the type of the Connector, orders the Web Service client, which is provided with the IDOServiceConnectorRecipient, to send them to the functional simulation. The actions performed when listening for user interaction are also described in the activity diagram in figure 16, using the example of a toggle switch which is tilted inside the virtual environment by the user.
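As an illustration of this Listener pattern, a rotation Listener derived from the template base class might look roughly like the sketch below. The template parameters of IDOListener, the accessors for Sender and Recipient, and the threshold handling are assumptions rather than the actual code of the module.

// Sketch of a rotation Listener; template parameters, accessors and the
// threshold helper are assumptions made for illustration.
class IDOSceneObjectRotListener
    : public IDOListener<IDOSceneObject, IDOServiceConnectorRecipient>
{
public:
    // Only the transformation subevent is of interest for this Listener.
    virtual bool requestSubEventType(size_t subEventType)
    {
        return subEventType == IDOEventType::SCENEOBJECT_ON_SET_TRANSFORM_BS;
    }

    // Called for every matching subevent; forwards sufficiently large
    // rotations to the Recipient and ignores everything else.
    virtual void update(IDOEventType::Type type, size_t subEventType, void *data = 0)
    {
        gmtl::Matrix44f matrix = getSender()->getTransformBS();   // assumed accessor
        if (extractRotationAngle(matrix) > m_rotationThreshold)   // assumed helper
            getRecipient()->handleSOEvent(matrix);
    }

private:
    float m_rotationThreshold;
};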

6.2.2 Sending Information to the Functional Simulation

Now that the first important task of the Connector Node, listening for user interaction, is dealt with, it is time to consider the second task, which is sending the acquired information about user interaction to the functional simulation.

Figure 16: UML activity diagram describing the actions caused by a user interaction with a toggle switch.

For this task the Connector Node must be able to invoke the Web Services of the BPEL processes that are used to model the functional behavior of the scene objects; thus a Web Service client is required. As presented in the section above, the Recipient will be called by the Listener to handle the transformation event. Handling the event will also include sending the transformation data, hence the Web Service client must be included in the Recipient. To make it easier to add new Web Service clients and to comply with the principle of "Separation of Concerns", a class is designed especially for that purpose. Each Recipient will hold a member variable, called m_client, which is an instance of this Web Service client class. The order to invoke the Web Service of a BPEL process is then given in the handleSOEvent(...) function of the IDOServiceConnectorRecipient. For the cockpit scenario addressed in this work, clients for five different BPEL processes were created. The BPEL processes already provide a description of their Web Service in the form of a WSDL file. This file is used to create a client for the Web Service specified in it. By using the gSOAP tool wsdl2h, a C++ header file defining the operations of the Web Service and its data types is generated from the WSDL file. This header file is then used by the gSOAP compiler soapcpp2 to generate an XML serializer for the data types and a client stub. For further information on gSOAP see section 3.4. All the C++ files created with gSOAP were included in the class IDOWSClient, which contains all the Web Service clients necessary to invoke the BPEL processes. This class provides four functions called sendToggleSwitch(...), sendRotarySwitch(...), sendPushButton(...) and sendGeneralObject(...). The sendToggleSwitch(float angle, float rotXAxis, float rotYAxis, float rotZAxis, IC::IDOFunctionalObject* fObj) function can contact three different BPEL processes, which each model a different functional behavior connected to a toggle switch. The host address of the BPEL process's Web Service is used to determine which Web Service needs to be contacted. This address is specified when a Connector Node for a scene object is created. Depending on that address the suitable client object is created, and all the information required by the BPEL process is sent to it. The following information is sent to the BPEL processes modeling the behavior of toggle switches:

• The information about the rotation of the scene object. Every toggle switch process gets the rotation data in the axis-angle representation (angle, X axis, Y axis and Z axis).

• The address of the Web Service for the scene object that is associated with this scene object. To identify the scene object that might be affected by the scene object which sent the information to the process, the host address of the Web Service of this associated scene object is included in every message to a toggle switch process.

• A timestamp for the time the message was sent to the process. This way, messages that arrive late and are already irrelevant due to a newer message will not result in incorrect state changes.

• The unique ID of the scene object that is sending the message. The process which models a functional behavior where two toggle switches need to be set in the same position to cause a control light to change its colour additionally gets the ID of the scene object which is sending the message.

The sendRotarySwitch(float angle, float rotXAxis, float rotYAxis, float rotZAxis) function contacts the process that was created to simulate the functional behavior of a rotary switch. In this case, it is not necessary to examine the host address of the process's Web Service, since only one process for rotary switches was created for the scenario of this work. The information that is sent to the rotary switch process is similar to that sent to the toggle switch processes: the information about the rotation is included in the message, as well as the address of the associate of the scene object that is sending the information, and finally the timestamp for the time the message was sent.

The sendPushButton(float xTransValue, float yTransValue, float zTransValue) function also only has one BPEL process it can contact. The information sent to this process consists of

• the translation data, which is represented by three float values for the X, Y, and Z axis,
• the host address of the associate of the scene object which sent the message,
• and the timestamp for the moment the message was sent.

All three functions write an error message to the console if the invocation performed by their client did not end with the SOAP_OK result. The sendGeneralObject(float xTransValue, float yTransValue, float zTransValue, float angle, float rotXAxis, float rotYAxis, float rotZAxis) function is not used in the cockpit scenario and only prints its parameters to the console; therefore it is not necessary to check for errors in this function. The UML class diagram for the IDOWSClient class and its relation to the IDOServiceConnectorRecipient class is shown in figure 17.

Figure 17: UML class diagram showing the IDOWSClient class and its relation to the IDOServiceConnectorRecipient class.
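The following sketch shows the generic gSOAP client pattern that such a stub invocation follows; the operation name, namespace prefix, response struct and generated file names are assumptions and not the exact interface generated from the WSDL files of this work.

// Sketch of invoking a client stub generated by wsdl2h/soapcpp2; operation,
// parameter and file names are assumptions made for illustration.
#include "soapH.h"      // generated (de)serializers (assumed file name)
#include "ns.nsmap"     // generated namespace table (assumed file name)

int sendToggleSwitchExample(const char *endpoint, float angle,
                            float x, float y, float z)
{
    struct soap soap;
    soap_init(&soap);                               // set up the gSOAP context

    ns__toggleSwitchResponse response;              // assumed response struct
    int status = soap_call_ns__toggleSwitch(&soap, endpoint, NULL,
                                            angle, x, y, z, response);
    if (status != SOAP_OK)
        soap_print_fault(&soap, stderr);            // report the fault, as the
                                                    // IDOWSClient functions do

    soap_destroy(&soap);                            // clean up class instances
    soap_end(&soap);                                // clean up temporary data
    soap_done(&soap);                               // detach the context
    return status;
}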

6.2.3 Receiving Information from the Functional Simulation

Now the Connector Node can inform the functional simulation about user interaction on scene objects, but one part of the communication is still missing. The BPEL processes which simulate the functional behavior of scene objects must also be able to invoke a Web Service provided by the Connector Node to inform it about events. This Web Service is implemented in the IDOServiceConnector class. This class was already mentioned in the beginning of the Connector Node section 6.2; it holds a member variable with an instance of the IDOServiceConnectorRecipient class, which again holds a member variable for the Web Service client. However, the main function of the IDOServiceConnector class is the deployment of a Web Service which can be used to perform changes on its scene object. With the use of the gSOAP tool soapcpp2, the Web Service for the Connector Node is generated from a C++ header file (see listing 2). The IDOServiceConnector class is derived from the Web Service class IDOWS::Service, which was created by the gSOAP tool soapcpp2. Since more than one Web Service must be ready to accept connections at all times, the IDOServiceConnector class is also derived from the IDOThreadBase class provided by the VDP framework.

namespace IDOWS {
    int ns__changeTexture(int state_, int timestamp, struct ns__signalResponse {} *result_);
    int ns__translateSO(float xAxis_, float yAxis_, float zAxis_, int timestamp, struct ns__signalResponse *result_);
    int ns__rotateSO(float xAxis_, float yAxis_, float zAxis_, float degree, int timestamp, struct ns__signalResponse *result_);
    int ns__setNewState(int state_, int timestamp, struct ns__signalResponse *result_);
}

Listing 2: C++ header file used by the gSOAP tool soapcpp2 to create a Web Service

This way, every instance of the IDOServiceConnector class is a thread, which can be started, interrupted and destroyed. The Web Service provided by the IDOServiceConnector is running as long as the IDOServiceConnector thread is running. Multiple Connector Nodes can be run in parallel without having to wait for one another. The standalone Web Service is implemented in the run() function, which is a virtual function inherited from the parent class IDOThreadBase. This way the Web Service is started as soon as the thread for an IDOServiceConnector object is started (a generic sketch of such a standalone service loop is given after the following list). The Web Service provides the four functions which are specified in the header file shown in listing 2 that was used to create the Web Service. These functions are:

• changeTexture(int state_, int timestamp, IDOWS::ns__signalResponse *result_), which is provided to allow the BPEL processes or other Web Service based applications to invoke a function which will result in a texture change for the scene object that deploys this Web Service. The state_ parameter defines the image which is to be set as the new texture. The timestamp is used to check if the invocation is up to date. If the timestamp is older than the last timestamp received by this Web Service, the texture change is not performed. If the timestamp is newer than the last timestamp, the state_ is handed over to the Recipient of the IDOServiceConnector by calling its setTextureState(...) function. The result_ parameter is a pointer to an empty response struct which is included just for compiler reasons. The changeTexture(...) function has no return value other than SOAP_OK.

• Invoking the function translateSO(float xAxis_, float yAxis_, float zAxis_, int timestamp, IDOWS::ns__signalResponse *result_) results in a translation of the scene object for which the Web Service is provided. The parameters xAxis_, yAxis_ and zAxis_ specify the values for the translation in a three dimensional space. The timestamp is again used to prevent actions from being performed in the wrong order; only messages that have a timestamp newer than the last timestamp are regarded. If this condition is met, the transformation data is passed on to the Recipient in form of a 4x4 matrix, by calling its setMatrixFromWS(...) function. The translateSO(...) function also has no return value besides SOAP_OK.

• rotateSO(float xAxis_, float yAxis_, float zAxis_, float degree, int timestamp, IDOWS::ns__signalResponse *result_) is invoked if the scene object of the Web Service needs to be rotated. The parameters degree, xAxis_, yAxis_ and zAxis_ describe a rotation in a three dimensional space by providing an axis-angle representation of the rotation. The timestamp parameter is used to check if the message is still relevant. If that is the case, the information about the rotation is passed on to the Recipient of this IDOServiceConnector by calling the setMatrixFromWS(...) function of the Recipient. Just like the other Web Service operations, this one has no return value besides SOAP_OK. The result_ parameter is a pointer to an empty response struct.

• setNewState(int state_, int timestamp, IDOWS::ns__signalResponse *result_) serves no function in the scenario of this work. It is included as a basis for future purposes.

Now the Connector Node implemented in the IDOServiceConnector class can receive messages via a Web Service interface, but one necessary part is still missing: it has to perform the actions requested by the messages sent to its Web Service. To do so it must be able to notice if a message was received. This again calls for the use of a Listener. The Recipient for the Listener already exists in form of the IDOServiceConnectorRecipient; the functions that are needed to handle the event of receiving a message are added to this class.

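The following is a minimal sketch of the generic gSOAP standalone-server pattern that such a run() method could follow; the port selection, the loop termination and the error handling are assumptions and not the module's actual implementation.

// Generic gSOAP standalone service loop as it might appear inside run();
// port choice, termination condition and error handling are assumptions.
#include "soapH.h"      // generated (de)serializers (assumed file name)

void serveConnectorNode(int port)
{
    struct soap soap;
    soap_init(&soap);

    // Bind the service to the given port and handle one request per iteration.
    if (!soap_valid_socket(soap_bind(&soap, NULL, port, 100))) {
        soap_print_fault(&soap, stderr);
    } else {
        for (;;) {
            if (!soap_valid_socket(soap_accept(&soap)))
                break;                  // stop serving on accept errors
            soap_serve(&soap);          // dispatches to ns__changeTexture(), ns__rotateSO(), ...
            soap_destroy(&soap);        // clean up deserialized class instances
            soap_end(&soap);            // clean up temporary data
        }
    }
    soap_done(&soap);                   // detach the gSOAP context
}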
All that is left is to implement the Sender and the event for which the Listener will listen. The event is of course the reception of a message; this event is defined by using an IDOGUID and is named SERVICE_MESSAGE_RECEIVED (see listing 3). The Sender of this event is the IDOServiceConnector, but to maintain the "Separation of Concerns" an extra class called IDOServiceConnectorSender is implemented. The IDOServiceConnector will hold a member variable, called m_sender, which is an instance of this class. Hence the IDOServiceConnector is both the Recipient and the Sender of the event SERVICE_MESSAGE_RECEIVED.

#include "IDODefs.h"
#include "IDOEventType.h"

BEGIN_IC_NAMESPACE
namespace IDOEventType
{
    // {05FD1C2F-E116-407e-A371-3D86D6CDFBC2}
    DEFINE_IDOGUID( SERVICE_MESSAGE_RECEIVED,
        0x5fd1c2f, 0xe116, 0x407e,
        0xa3, 0x71, 0x3d, 0x86, 0xd6, 0xcd, 0xfb, 0xc2 )
};
END_IC_NAMESPACE

Listing 3: C++ header file defining the event type for receiving a message via the Web Service of an IDOServiceConnector object

To distinguish between the events of receiving a message to rotate, to translate, or to change the texture, subevents are defined in the IDOServiceConnectorSender class. These subevents are called SERVICE_MESSAGE_RECEIVED_TRANSLATE, SERVICE_MESSAGE_RECEIVED_ROTATE and SERVICE_MESSAGE_RECEIVED_TEXTURE. Whenever the Web Service of an IDOServiceConnector object receives a message, the Sender object created for this Connector Node is used to inform the Listener. For this purpose the IDOServiceConnectorSender class provides the functions onTranslate(), onRotate() and onTexture(), which call the Listener and hand over the corresponding subevent. Three functions are implemented in the IDOServiceConnectorRecipient class to handle these different subevents.

• handleServiceEventTranslate(IDOResourceID soID) translates the scene object identified by the parameter soID. The matrix containing the information about the translation was already handed over by the Web Service when the message was received. Now the matrix specifying the current position of the scene object in world coordinates is acquired and multiplied with the matrix containing the transformation data from the Web Service. The resulting matrix is set to be the new position of the scene object by using the setTransformBSbyWS(...) function provided by the VDP class IDOAppSceneObjectManipulator.

• handleServiceEventRotate(IDOResourceID soID) rotates the scene object which is specified by the soID parameter. It also combines the matrix giving the current position of the scene object with the transformation matrix from the Web Service. By using the setTransformBSbyWS(...) function, the new position of the scene object is set to be the result of that combination.

• handleServiceEventTexture(IDOResourceID soID) sets the new texture for the scene object specified by the parameter soID. First the image that is to be set as the new texture must be found. The Web Service only received an integer value named state_ which is supposed to represent the image. To acquire the image file, the name of the scene object, an underscore character and the integer value defined in state_ are concatenated. The resulting string is the name of the image file. This requires that all images that are used as textures are named accordingly, otherwise the texture changes cannot be performed. Also, all images have to be in the PNG image format. The function creates a reference for the scene object and uses that reference to obtain the material (IDOAppMaterial) of the scene object. After this the shader object (IDOAppShader) for the material is requested, which is used to get the shading pass (IDOAppShadingPass). If there is a first texture in the shading pass, it is removed and a new texture (IDOAppTexture) is created using the image file specified by the state_ parameter from the Web Service.

To connect the IDOServiceConnectorSender with its IDOServiceConnectorRecipient, a new Listener class is designed, called IDOServiceListener.
Just like the other Listeners, this class is derived from the template class IDOListener, and the virtual functions requestSubEventType(...) and update(...) of the parent class are implemented. requestSubEventType(...) listens for the three subevent types SERVICE_MESSAGE_RECEIVED_TRANSLATE, SERVICE_MESSAGE_RECEIVED_ROTATE and SERVICE_MESSAGE_RECEIVED_TEXTURE, and the update(...) function calls the functions provided by the Recipient to handle them. The classes used to enable the Connector to receive messages, and to perform the actions that are related to them, are shown in figure 19. The sequence of actions that results from receiving a message to change a texture is described in figure 18.
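The following is a minimal sketch of what such a Listener subclass could look like. The class names, event GUIDs and handler functions are taken from the description above, but the template parameters and function signatures are assumptions rather than the VDP framework's actual API.

    // Sketch of the IDOServiceListener described above. The base class,
    // event GUIDs and handler names follow the text; the exact template
    // parameters and signatures are assumed, not the real VDP interfaces.
    class IDOServiceListener
        : public IDOListener<IDOServiceConnectorSender, IDOServiceConnectorRecipient>
    {
    public:
        // Tell the framework which subevents this Listener is interested in.
        virtual bool requestSubEventType(const IDOGUID& subEvent)
        {
            return subEvent == IDOEventType::SERVICE_MESSAGE_RECEIVED_TRANSLATE
                || subEvent == IDOEventType::SERVICE_MESSAGE_RECEIVED_ROTATE
                || subEvent == IDOEventType::SERVICE_MESSAGE_RECEIVED_TEXTURE;
        }

        // Dispatch the received subevent to the matching handler of the Recipient.
        virtual void update(const IDOGUID& subEvent, IDOResourceID soID)
        {
            if (subEvent == IDOEventType::SERVICE_MESSAGE_RECEIVED_TRANSLATE)
                getRecipient()->handleServiceEventTranslate(soID);
            else if (subEvent == IDOEventType::SERVICE_MESSAGE_RECEIVED_ROTATE)
                getRecipient()->handleServiceEventRotate(soID);
            else if (subEvent == IDOEventType::SERVICE_MESSAGE_RECEIVED_TEXTURE)
                getRecipient()->handleServiceEventTexture(soID);
        }
    };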

Figure 18: UML activity diagram showing the sequence of actions that results from invoking a texture change (a BPEL process invokes an operation provided by an IDOServiceConnector, whose IDOServiceConnectorSender notifies the IDOServiceListener, which in turn calls the IDOServiceConnectorRecipient to perform the requested change).

6.3 The Connector Node Manager

All components needed to connect a scene object with a functional simulation are now implemented. The next step is to design a class to manage all the Connector Nodes. For this purpose, the IDOServiceConnectorManager class is implemented. This class is used to create, store and delete all the Connector Nodes and their Listeners. A new Connector Node is created by using the addConnector(IDOResourceID soID, int connectorType) function implemented in this class. The parameter soID specifies the scene object the Connector Node is created for, while the parameter connectorType determines which type of Connector Node must be created. First, the function checks if the scene object already has a Connector Node.

Figure 19: UML class diagram showing the classes involved in the deployment of a Web Service interface and the handling of the actions requested by messages sent to that Web Service.

To visually represent the Connector Node, a scene object is created. This scene object has no geometry or material and can only be seen in the scene graph. It is appended as a direct child of the scene object for which it was created. The name of this scene object is composed of the String "ConnectorNode" and the name of its parent scene object. Next, the Connector is created by using the connectorType parameter. Depending on the parameter, either an instance of the IDOToggleSwitchConnector class, the IDORotarySwitchConnector class, the IDOPushButtonConnector class, or the IDOTextureConnector class is created. So the IDOServiceConnectorManager is the factory for these classes. The thread that is created to run the Connector is given a name constructed out of the String "connectorthread" and the ID of the scene object for which it is created. The ID of the scene object the Connector is created for and the ID of the scene object representing the Connector Node are inserted into a std::map container called m_connectorIDs. This way, their relation is remembered and can be used later. The Connector is also stored in a std::map container with the ID of the scene object it was created for as the key element. This container is called m_connectorList. After this, the member variables for the Sender and the Recipient of the Connector are set. Finally, the two Listeners for the Connector are created. Depending on the type of the Connector, either an IDOSceneObjectRotListener or an IDOSceneObjectTranslationListener is created; in the case of an IDOTextureConnector, no such Listener is created. The IDOServiceListener is created in any of these cases. To create a Listener, the VDP framework provides the template class IDOListenerCreator. Instead of creating Listeners by calling their constructor function, the create() function of this class must be used. The Listener class which is meant to be created is the template argument, and the Sender, the Recipient and the event type are the parameters of the function. See listing 4 for an example in which an IDOServiceListener is created. It is important to mention that the function addConnector(...) only constructs the Connector. To start the Connector, the thread in which the Web Service deployed by the Connector is running must be started. This is done by calling the startConnector(IDOResourceID soID) function provided by the IDOServiceConnector class.

    IDOListenerCreator<IDOServiceListener>::create(
        m_connectorList[soID]->getSender(),
        m_connectorList[soID]->getRecipient(),
        IDOEventType::SERVICE_MESSAGE_RECEIVED );

Listing 4: Creating a Listener for the Web Service by using the create function provided by the IDOListenerCreator template class.
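The factory role of the addConnector(...) function described above can be sketched as follows. Only the class names are taken from the text; the integer values assigned to the Connector types and the simplified base class are assumptions made for illustration.

    // Sketch of the factory dispatch performed when a Connector Node is added.
    // The connectorType values and the minimal class hierarchy are assumptions.
    struct IDOServiceConnector        { virtual ~IDOServiceConnector() {} };
    struct IDOToggleSwitchConnector   : IDOServiceConnector {};
    struct IDORotarySwitchConnector   : IDOServiceConnector {};
    struct IDOPushButtonConnector     : IDOServiceConnector {};
    struct IDOTextureConnector        : IDOServiceConnector {};

    IDOServiceConnector* createConnectorFor(int connectorType)
    {
        switch (connectorType)
        {
        case 0:  return new IDOToggleSwitchConnector();
        case 1:  return new IDORotarySwitchConnector();
        case 2:  return new IDOPushButtonConnector();
        case 3:  return new IDOTextureConnector();
        default: return 0; // unknown Connector type: nothing is created
        }
    }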

The IDOServiceConnectorManager also provides a function to delete one Connector and a function to delete all Connectors. The function removeConnector(IDOResourceID soID) first checks if the scene object specified by the parameter soID has a Connector and, if this is true, it performs all actions needed to remove this Connector and all the objects created for it. First the Listeners created for the Connector are deleted. Then the thread in which the Web Service is running is shut down. The entries for the Connector made in the containers m_connectorList and m_connectorIDs are erased, and finally the scene object created to represent the Connector Node is removed from the scene graph. The removeAllConnectors() function does basically the same, just for all the Connectors currently managed by the IDOServiceConnectorManager. Some helper functions are also implemented for this class. The function sceneObjectHasConnector(IDOResourceID soID) checks if the scene object specified by the parameter soID has a Connector and returns a boolean value. The ID of the scene object for which a Connector was created can be obtained by calling the getSceneObjectID(IDOResourceID connectorID) function. To check if a port is already used by the Web Service of an existing Connector, the function portAlreadyInUse(int port) can be used, which returns a boolean value. The function rememberPort(int port) stores the port specified in the parameter port in a std::set container called m_portList. This is used to remember all the ports that are already used by the Web Services of the Connectors. To get an available port, the function getFreePort() is called. This function checks the container m_portList and returns a port that is not used by any Connector. The ports suggested by this function range from 8082 to 65535. Finally, the functions removePortFromList(int port) and resetPortList() can be used to erase one port from the port list or to delete all ports stored in that list. In figure 20 a UML class diagram for the IDOServiceConnectorManager class and all the classes whose objects are managed by this class is shown.

Figure 20: UML class diagram showing the IDOServiceConnectorManager class and all the classes managed by it.
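A minimal sketch of how the port bookkeeping described above could be realized with a std::set is given below. The function names, the member name m_portList and the port range 8082-65535 follow the text; the surrounding class and the exact signatures are assumptions.

    #include <set>

    // Sketch of the port bookkeeping of the IDOServiceConnectorManager.
    class PortRegistry
    {
    public:
        bool portAlreadyInUse(int port) const
        {
            return m_portList.count(port) > 0;
        }

        void rememberPort(int port) { m_portList.insert(port); }

        // Returns the first unused port in the range suggested by the text.
        int getFreePort() const
        {
            for (int port = 8082; port <= 65535; ++port)
                if (m_portList.count(port) == 0)
                    return port;
            return -1; // no free port available
        }

        void removePortFromList(int port) { m_portList.erase(port); }
        void resetPortList()              { m_portList.clear(); }

    private:
        std::set<int> m_portList; // ports already used by Connector Web Services
    };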

6.4 The Graphical User Interface

The characteristics of a VDP module have already been described in section 6.1. There it was mentioned that the FunctionalityConnectorModule creates an instance of the class FunctionalityConnectorModuleUI when it is initialized. This class is responsible for the graphical user interface of the module. Before the implementation of this class and of the class that is used to build the widget for the user interface is explained, it is necessary to look back at the Connector Node and consider all the information needed from the user to build one, as well as everything the user will need when building a Virtual Prototype. Based on this, the widget that is used for the menu of the module must be designed.

6.4.1 Designing the Module's Menu

The functional requirements and the user interface requirements defined in section 4.2 and section 4.4 also need to be regarded. The user interface requirement UIR 1 demands that the GUI must allow the VP Developer to perform the actions defined in the functional requirements FR 2.5.1 - 2.5.5. These actions correlate with the creation of a Connector Node, the removal of a Connector Node, the removal of all Connector Nodes, the storing of all configurations made during the construction of a Virtual Prototype, and the regeneration of that Virtual Prototype by loading the saved configurations. To perform these actions, the user is provided with five buttons: a button to create a Connector Node, a button to remove a Connector Node, a button to save all Connector Nodes, a button to load previously saved Connector Nodes, and a button to reset all current configurations (and thereby delete all the Connector Nodes created in the current session). Since different types of Connectors are implemented for the different types of switches, the GUI must provide a way to select the Connector which is best suited for a specific scene object. This is provided by using radio buttons. Before a Connector Node can be created, the scene object with which it is supposed to connect a functional simulation is needed. Selecting the scene object is done by clicking on the corresponding scene object node in the scene graph, or by clicking on the geometric object for the scene object. It is important to choose the correct node, otherwise the functional behaviour might not be simulated as planned. Finally the properties of the Connector Node must be specified. On account of the different types of Connectors, the settings needed for their creation may vary. An IDOToggleSwitchConnector, for example, needs the following configuration parameters:

• Hostname: The host address for the Web Service deployed by the Connector.

• Port: The port through which the Web Service of this Connector can be contacted.

• Endpoint: The URL for the Web Service deployed by the BPEL process which models the functional behaviour of the scene object this Connector is created for. This is needed so the Web Service client of this Connector knows which Web Service it must send its messages to.

• Associate: The associate is the object that may be affected by the events caused by the scene object of this Connector. For example, if tilting a toggle switch must result in a colour change of a control light, the Web Service deployed by the Connector created for this control light is the associate. The complete URL of the associate must be specified (e.g. http://localhost:18084/Service/services). The Web Service client of this Connector will send the URL to the Web Service of its BPEL process. The process can dynamically invoke the Web Service of the associate if the message it received resulted in a state change that affects the associate.

• Rotation Threshold: The rotation threshold is used by the scene object's Listener to determine if the rotation was sufficiently large to require the sending of a message. The threshold does not have to be related to the functional behaviour of the scene object; it is just used to determine when to send the transformation data. This way not every tiny transformation results in the sending of a message.

In contrast to the configuration parameters above, the parameters needed to create an IDOTextureConnector are not all the same.
An IDOTextureConnector does not need a rotation or a translation threshold; in fact it does not even need a Web Service client, because there is no transformation data to send. It will need a hostname and a port for its Web Service, so its texture can be changed by invoking the Web Service. Also, it is necessary to specify the directory of a folder where all the image files that can be used as textures are located. To avoid confusing the user of the module, the input fields for the configuration parameters need to be changed depending on the Connector type chosen by the user. Only the parameters necessary for the selected Connector must be displayed. The widget which is designed to provide all the features mentioned before can be seen in figure 21. The image shows the widget in a state where the Connector type for a toggle switch is selected.

Figure 21: Screenshot of the widget designed for the VDP module FunctionalityConnector.

6.4.2 Implementing the Actions Performed by the Buttons

The FunctionalityConnectorModuleWidget class is derived from the QWidget class to provide all the QT features, and it is also derived from the IDOListenerContainerRecipient class, so a Listener can call it to handle the events the Listener notices. All the buttons and input fields, which can be seen in figure 21, are defined in this class. By using the Signal-Slot concept of QT, the buttons are connected to the slots that perform the actions related to them. The FunctionalityConnectorModuleWidget class creates an instance of the IDOServiceConnectorManager, called m_ServiceConnectorManager, and uses this instance in its functions. To hold all the Connector Nodes created in a session, an instance of the QT class QDomDocument, called m_domDocument, is used. This object represents an XML document and can read or write the document to a file. The XML document is used to save or load Connector Node configurations. Another important member variable is a reference of the IDOPicker called m_UIPicker. This class provided by the VDP framework is responsible for all mouse-related selection and clicking operations. It is used to get the scene object that is currently selected. The constructor of the FunctionalityConnectorModuleWidget class first creates an empty QDomDocument with a root node named "FunctionalObjects".

Next, the instance of the IDOServiceConnectorManager class is created by calling its constructor. The reference for the IDOPicker is obtained by requesting it from one of the main classes of the VDP, called IDOExploreApp. Finally all the QT objects like buttons, input fields and slots are created, and the layout of the widget is defined to group those elements. When the button "Create Connector" is pressed, the signal of this event is handled by the slot createConnector(). This slot first checks if a scene object is selected by using the reference of the IDOPicker. If a scene object is selected, the slot checks if the scene object already has a Connector Node and, if not, whether the port specified by the user is already in use. This is done by calling the functions sceneObjectHasConnector(...) and portAlreadyInUse(...) provided by the IDOServiceConnectorManager. Depending on the Connector type selected by the user, the slot creates the requested Connector Node by using the addConnector(...) function of the Manager and sets all the configuration parameters for that Connector Node. The parameters are then written to the XML file represented by the QDomDocument, and at last the thread in which the Connector is running is started. To save the Connector Node the function saveFunctionalObjectToXML(IC::IDOFunctionalObject* functionalObj) is called. This function writes all the information contained in the parameter functionalObj to the XML document represented by the QDomDocument. The parameter is an instance of the class IDOFunctionalObject which is used to describe a Connector Node. For more information on this class see section 6.5. The structure of the XML document is shown in figure 8.
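As an illustration of how one FunctionalObject entry could be appended to the QDomDocument, consider the following sketch; the element and attribute names are assumptions and do not necessarily match the XML structure actually used by the module.

    #include <QDomDocument>
    #include <QDomElement>
    #include <QString>

    // Sketch: append one FunctionalObject entry to the XML document.
    // Element and attribute names are illustrative assumptions.
    void appendFunctionalObject(QDomDocument& doc,
                                const QString& soName,
                                int connectorType,
                                const QString& host,
                                int port,
                                const QString& endpoint,
                                const QString& associate)
    {
        QDomElement root = doc.documentElement();   // root node "FunctionalObjects"
        QDomElement obj  = doc.createElement("FunctionalObject");
        obj.setAttribute("sceneObject",   soName);
        obj.setAttribute("connectorType", connectorType);
        obj.setAttribute("host",          host);
        obj.setAttribute("port",          port);
        obj.setAttribute("endpoint",      endpoint);
        obj.setAttribute("associate",     associate);
        root.appendChild(obj);
    }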

To remove a Connector Node, the user must select the scene object representing the Connector Node and then press the button "Remove Connector". The slot removeConnector() executes the required action. This slot first checks if a scene object is selected by using the IDOPicker. If this is true, the ID of the scene object obtained from the IDOPicker is used to get the name of the scene object. The QDomDocument is then searched for a FunctionalObject with that name. If found, the complete entry for this FunctionalObject is removed. The port used by the Web Service of the Connector which is removed is deleted from the port list by calling the function removePortFromList(...) provided by the Manager. Finally the Connector Node is removed by calling the removeConnector(...) function of the Manager. If the button "Save To XML File" is pressed, the slot saveToXMLFile() will perform the requested action. A QFileDialog is used to enable the user to select a directory where the file should be stored and to either choose an existing XML file or give the name for the new file that must be created. The QDomDocument which is holding all the Connector Node configurations is then written to that file. Loading such an XML file is done by pressing the "Load XML File" button, whose signal is handled by the loadFromXMLFile() slot. This slot opens a QFileDialog, so the user can specify the file he wants to load. The file name returned by the dialog is handed to the function loadFunctionalObjectsFromXML(...) as the parameter. This function then checks if the file can be opened for reading and, if that is true, the content of the XML file is written to a temporary QDomDocument. All the FunctionalObjects stored in the XML file are read, and the Connector Nodes and Listeners are created. The newly loaded and created Connector Nodes are written to the QDomDocument holding all the running Connector Nodes. This way, if the "Save To XML File" button is pressed, all Connector Nodes will be stored, both those that were created manually and those that were created by loading an XML file. If a FunctionalObject from the XML file is connected to a scene object which already has a Connector Node, the Connector Node for this object is not created. By pressing the "Reset" button all Connector Nodes are deleted. The slot reset() calls the removeAllConnectors() and the resetPortList() function of the Manager and clears the QDomDocument holding all the FunctionalObjects. To display only those input fields needed for the selected Connector type, the five slots for the radio buttons, which are used to select that type, set the required input fields visible and hide those that are not required. The IDOTextureConnector needs a texture folder specified, so it knows where to get the images for the requested texture changes. This texture folder can be specified by pressing the "Browse" button under the input field called "Texture location". Pressing this button will call its slot pressedBrowseButton(), which will open a QFileDialog to allow the user to find the directory where the images for this IDOTextureConnector are stored.
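A sketch of the save-to-file step described above, written with standard Qt classes, is shown below; it is a simplified free-function variant of the slot from the text, not its actual implementation.

    #include <QDomDocument>
    #include <QFile>
    #include <QFileDialog>
    #include <QIODevice>
    #include <QString>
    #include <QTextStream>
    #include <QWidget>

    // Sketch: let the user pick an XML file and write the QDomDocument
    // holding all Connector Node configurations to it.
    void saveConnectorNodesToXMLFile(QWidget* parent, const QDomDocument& doc)
    {
        QString fileName = QFileDialog::getSaveFileName(
            parent, "Save Connector Nodes", QString(), "XML files (*.xml)");
        if (fileName.isEmpty())
            return; // user cancelled the dialog

        QFile file(fileName);
        if (!file.open(QIODevice::WriteOnly | QIODevice::Text))
            return; // file could not be opened for writing

        QTextStream out(&file);
        out << doc.toString(); // serialize the whole XML document
        file.close();
    }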

6.4.3 Handling Listener Events

The user interface requirement UIR 2 demands that the GUI of the VDP indicates that a scene object is connected to a functional simulation. This is already handled by the IDOServiceConnectorManager, which creates a scene object to represent the Connector Node and appends it as a direct child of the scene object for which it was created.

To regard the user interface requirement UIR 3, which demands that the GUI must display the configuration of the Connector Node if its scene object is selected, it is necessary to listen for the scene objects which are selected. If a scene object is selected by clicking on it, it must be checked whether this scene object is a Connector Node and, if so, the configuration of this Connector Node must be displayed in the widget. This again calls for the use of a Listener. Another Listener is needed to check if a new scene is loaded, because this requires that all the Connector Nodes which are currently running must be destroyed. The Listener implemented to send an event if a scene object is selected is called IDOSceneObjectSelectedListener. The Sender for this Listener is the IDOPicker and the event sent by it is the SCENEOBJECT_ON_SELECT event. The IDOSceneObjectSelectedListener calls the IDOFunctionalityConnectorModuleWidget as its Recipient to handle the event. The function provided for this call is the handleSceneObjectSelected() function, which will set the appropriate radio button as checked and display all the configurations of the Connector Node, if the selected scene object is a Connector Node. If the selected scene object is not a Connector Node, default values are displayed and the Connector type for a toggle switch is checked. To detect if a new scene is loaded, which would require that all the currently active Connector Nodes must be deleted, the IDOGUIListener is implemented. The Sender for this Listener is the IDOExploreApp, which is the class representing the IDO:Explore application. This class provides the event type LOAD_FILES_STARTED. This event is emitted if the user is loading a new scene. If this event is received, the IDOGUIListener calls its Recipient, the IDOFunctionalityConnectorModuleWidget, to handle the event. For this purpose, the function handleNewFileLoaded() is offered by the widget class. Calling this function results in the removal of all Connector Nodes and the clearing of the port list via the IDOServiceConnectorManager. Also, the QDomDocument holding all the FunctionalObjects is reset.
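A minimal sketch of the clean-up performed by handleNewFileLoaded() is given below. The manager calls are the ones named above; the free-function form, the stand-in manager type and the signatures are assumptions.

    #include <QDomDocument>
    #include <QDomElement>

    // Stand-in for the real IDOServiceConnectorManager described in the text.
    struct IDOServiceConnectorManager
    {
        void removeAllConnectors() { /* delete all Connector Nodes */ }
        void resetPortList()       { /* free all reserved ports */ }
    };

    // Sketch: reset the module state when a new scene is loaded.
    void handleNewFileLoaded(IDOServiceConnectorManager& manager, QDomDocument& doc)
    {
        manager.removeAllConnectors();
        manager.resetPortList();

        // Reset the XML document holding all the FunctionalObjects.
        doc = QDomDocument();
        QDomElement root = doc.createElement("FunctionalObjects");
        doc.appendChild(root);
    }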

6.4.4 Constructing the Module and its User Interface

The class designed to integrate the widget into the module and to implement all the operations that need to be performed by the widget is called IDOFunctionalityConnectorModuleWidget. The widget provided by this class is used by the FunctionalityConnectorModuleUI class. This class only has one function, which is the constructor function. This constructor, among other things, creates the widget for the module.

If an instance of the class FunctionalityConnectorModuleUI is created by calling its constructor, a reference to the IDOQT2DGUI object of the VDP is obtained and this reference is used to get a reference of the IDOQTModuleUIManager. This again is used to get the dockwindow widget of the VDP and use it as the parent widget for the new widget that is now created. The module itself, which was passed on as a parameter in the constructor call, and the new widget are then added to the IDOQTModuleUIManager. Now the module and its GUI object are created and linked to the VDP. The two Listeners needed by the module's widget are also created in the constructor of the FunctionalityConnectorModuleUI class. By using the template class IDOListenerCreator, the IDOGUIListener and the IDOSceneObjectSelectedListener are created and supplied with their Sender and Recipient (see listing 5).

    IDOListenerCreator<IDOGUIListener>::create(
        IDOExploreApp::getSingletonPtr(),
        moduleWidget,
        IDOEventType::EXPLORE_APPLICATION_STATE_CHANGED );
    IDOListenerCreator<IDOSceneObjectSelectedListener>::create(
        picker,
        moduleWidget,
        IDOEventType::PICKED_SCENEOBJECTS_CHANGED );

Listing 5: Creating the two Listeners for the widget by using the create function provided by the IDOListenerCreator template class.

A UML class diagram for all the classes described in this section is shown in figure 22.

6.5 The Functional Object

As described in section 6.2, the Connector Node is not a single object but rather a concept which consists of a number of classes that realize it through their interactions. The main class in this framework is of course the IDOServiceConnector, but without all the other classes this class is useless. Many parameters have to be specified so that these classes can perform their duties. Some of the parameters are required by more than one class, and thus a container class for all the parameters is designed. This class is called IDOFunctionalObject. For every scene object that is connected to a functional simulation, an instance of this class is created to hold all the configurations necessary for the connection.

Figure 22: UML class diagram for all the classes related to the module and its menu.

Since the information contained in the IDOFunctionalObject is used by the IDOServiceConnector and the IDOServiceConnectorRecipient, and the IDOServiceConnector holds an instance of the IDOServiceConnectorRecipient in the form of a member variable, the IDOFunctionalObject is stored in the IDOServiceConnectorRecipient. This way the parameters saved in the IDOFunctionalObject can be accessed by the IDOServiceConnectorRecipient, which is held by the IDOServiceConnector; the Connectors in turn are created and stored by the IDOServiceConnectorManager, whose instance is held by the IDOFunctionalityConnectorModuleWidget. The latter uses the IDOFunctionalObject when saving the configuration settings of Connector Nodes to an XML file. The member variable for an instance of the IDOFunctionalObject, held by the Recipient, is called m_functionalObj. The following parameters are held by this variable and can be accessed via get and set functions:

• m_connectorType is an integer value which represents the type of the Connector. For the scenario of this work, four different types of Connectors are implemented (see section 6.2).

• m_clientAssociate is the URL for the Web Service of the Connector Node which might be affected by this object's functional behaviour.

• m_clientEndpoint is the URL of the BPEL process which is responsible for modeling this object's functional behaviour.

• m_soID is the unique ID of the scene object for which this object is created.

• m_soName is the name of the scene object for which this object is created.

• m_port is the port which must be used to contact the Web Service of this object.

• m_host is the name of the host for the Web Service of this object.

• m_rotThreshold is the rotation threshold set for this object.

• m_transThreshold is the translation threshold set for this object.

• m_textureFolder is the directory where all the images which can be used as textures for this object are stored.

All these parameters are either set by the module's widget or by the manager class that creates the Connector Nodes.
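A minimal sketch of such a parameter container is given below; the member names follow the list above, while the chosen types and the accessor functions are assumptions made for illustration.

    #include <string>

    // Sketch of the IDOFunctionalObject parameter container.
    // Member names follow the text; types and accessors are assumed.
    class IDOFunctionalObject
    {
    public:
        void setPort(int port)                { m_port = port; }
        int  getPort() const                  { return m_port; }

        void setHost(const std::string& host) { m_host = host; }
        const std::string& getHost() const    { return m_host; }

        // ... analogous get/set pairs for the remaining members ...

    private:
        int         m_connectorType;   // type of the Connector
        std::string m_clientAssociate; // URL of the associate's Web Service
        std::string m_clientEndpoint;  // URL of the BPEL process's Web Service
        unsigned    m_soID;            // ID of the scene object (IDOResourceID in the VDP)
        std::string m_soName;          // name of the scene object
        int         m_port;            // port of this object's Web Service
        std::string m_host;            // host of this object's Web Service
        double      m_rotThreshold;    // rotation threshold
        double      m_transThreshold;  // translation threshold
        std::string m_textureFolder;   // folder containing the texture images
    };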

6.6 The BPEL Processes

To provide the functional behaviour for the Virtual Prototype that is built to simulate the cockpit scenario of this work, five BPEL processes are designed. These processes all model the functional relations and behaviour of control devices that are located in the cockpit of an aircraft. The first process is called BPELToggleSwitchProcess. This process can be used to simulate a toggle switch that changes the colour of a control light when tilted. The toggle switch has three states, which are mapped to the colours green, yellow and red for the control light. The process receives the angle of the toggle switch in an axis-angle representation, the URL of the Web Service that is deployed for the control light, and the timestamp for the moment the message was sent by the toggle switch. The angle is used to map the position of the toggle switch to the colour of the control light. If the angle is smaller than -29.9 degrees the state -1 is set, if the angle is bigger than 29.9 degrees the state 1 is set, and otherwise the state 0 is set. The process sends the state and the timestamp that was received to the control light by invoking the operation ChangeTexture(...) deployed by the Web Service that was created for the scene object of the control light. The process, which was built with the BPEL editor provided by Eclipse, is shown in figure 23. The two processes used to model the relationship between a rotary switch and a monitor and the relationship between a push button and a monitor are similar. The process for the rotary switch is called BPELRotarySwitchProcess; this process maps the rotation data received from the switch to four different states (0, 1, 2, 3) and then invokes the operation ChangeTexture(...) deployed by the Web Service specified in the associate parameter. The BPELPushButtonProcess does the same, but due to the characteristics of a push button, which only has two states, it can only map to two different states, depending on the translation data it receives. The two processes are shown in figure 24. A more complex BPEL process is built to simulate a scenario where two toggle switches must be set in the same position to change a control light. Aside from mapping the rotation data to a state, this process must also be able to discriminate between the two toggle switches which will send messages to it. In contrast to the processes described above, this process is not finished after processing one message. It must serve a number of requests before it can be terminated. To remember the states it assigned to the two toggle switches, it holds two variables, and to identify the different switches, it also holds the IDs received in the messages from those switches. If a message is sent to this process, it first checks the ID of the scene object that sent the message and, depending on the ID, assigns the state to the appropriate state variable.
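The mapping from the received angle to one of the three states performed by the BPELToggleSwitchProcess is, in essence, the following decision. The sketch restates it in C++ only for clarity; in the module's setup this logic lives inside the BPEL process, not in C++ code.

    // Restating the angle-to-state mapping of the BPELToggleSwitchProcess.
    // The thresholds are taken from the text; the real mapping is performed
    // inside the BPEL process, this function is only an illustration.
    int mapToggleSwitchAngleToState(double angleInDegrees)
    {
        if (angleInDegrees < -29.9)
            return -1; // lower switch position
        if (angleInDegrees > 29.9)
            return 1;  // upper switch position
        return 0;      // default (middle) position
    }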

Figure 23: BPEL process for the toggle switch to control light scenario.

Figure 24: BPEL process modeling the functional relation of a rotary switch and a monitor.

If both states are the same, the process will call the ChangeTexture(...) operation of the Web Service that was specified by the associate parameter. The process can be seen in figure 25. The last process is very simple.

Figure 25: BPEL process for the scenario of two toggle switches.

It was built to show the use of another Web Service operation besides the ChangeTexture(...) operation used in all the other processes. This process is called BPELRotationEchoProcess and it just takes the rotation data it receives from a Connector Node and then invokes the Web Service specified in the associate parameter with a call to the operation rotateSO(...) using this rotation data. This way the scene object of the Connector Node that receives the invocation is rotated by the same degree as the object that caused it. This process is used to connect the flight control stick with the root node of the whole cockpit. When the flight control stick is tilted in any direction, the complete cockpit is tilted in the same direction. This way, the aircraft can be steered.

7 Deployment of the Module

7.1 User Guide for the Module

To use the module, the VDP first must be aware of its existence. To integrate the module into the VDP, it is necessary to edit the configuration file idoDefault.ini. The directory where the module is stored in the form of a DLL file and the name of the module must be added to this file. The directory with the location of the DLL file is inserted under the category [Plugin.Paths]. The name of the module must be entered under the category [Plugin.Load] (a minimal sketch of these two entries is shown below). When these modifications are done, the module is integrated into the VDP and can be selected by clicking on the red button in the module menu of the VDP. In case the user switches to another module, all the threads for the Connector Nodes are kept alive; therefore, if the user switches back to the FunctionalityConnectorModule he can continue working right where he left off. Loading a new scene automatically causes the module to destroy all active threads, therefore the user must consider this and save the configurations he made to an XML file if he intends to use them again later. Also, by saving the configuration the user does not have to create the Connector Nodes manually every time he wants to build a Virtual Prototype. By loading an XML file created for an earlier configuration, the user can recreate a previously constructed Virtual Prototype. Additionally it is possible to load several XML files; redundant Connector Nodes which are contained in more than one file will only be created once. If a scene object already has a Connector Node, it is not possible to create a second one. When saving the Connector Nodes it does not matter if they were created manually or by loading an XML file; all active Connector Nodes are written to the XML file. To remove a Connector Node, the user must select the scene object which was created to represent the Connector Node he wants to delete and then press the "Remove Connector" button. In order to delete all the Connector Nodes which are currently active, he can press the "Reset" button. The configuration of a Connector Node is displayed in the module's menu when the scene object which represents it is selected. In the following, an example is given for the construction of two Connector Nodes, one for a toggle switch and one for a control light.
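The two idoDefault.ini entries mentioned above might look roughly as follows; the path, the key names and the exact layout are hypothetical placeholders, not the actual format of the file.

    ; idoDefault.ini -- sketch of the entries needed to register the module.
    ; The path, key names and layout below are hypothetical placeholders.
    [Plugin.Paths]
    Path = C:/VDP/modules/

    [Plugin.Load]
    Module = FunctionalityConnectorModule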

Before creating the Connector Node it is important to identify the scene object which is associated with the functional behaviour modeled by a BPEL process. The toggle switch used in this example is composed of several scene objects which, taken together, form the complete object. Not all of these scene objects need to be moved to trigger the sequence of actions related to the function of this switch. For example, the base in which the switch is mounted neither has to be moved nor can be moved. In addition, just moving the head of the toggle switch would not be the desired event. Once the scene object node is found that, together with all its child nodes, forms the object that must be connected to a BPEL process, this object must be selected in the scene graph by clicking on it (see figure 26).

Figure 26: Selecting the appropriate scene object node in the scene graph.

After selecting the scene object, the type of the Connector Node must be selected. For this example, the ToggleSwitch Connector is chosen by clicking on the corresponding radio button. The following settings are entered to enable the Connector Node to connect the switch with its BPEL process:

• Hostname: c10006x64

• Port: 18083

• Endpoint: http://c08052x64:8080/ode-axis2-war-1.3.4-SNAPSHOT/processes/BPELToggleSwitchService

• Associate: http://c10006x64:18084/Service/services

• Rotation Threshold: 30

Finally the Connector Node is created by pressing the "Create Connector" button. The scene object to represent the Connector Node is appended as a direct child of the Connector Node's scene object. The name of this scene object is composed of the word "ConnectorNode" and the name of its parent scene object. If this scene object is selected, the settings of the Connector Node are shown in the menu (see figure 27).

Figure 27: Displaying the settings of a Connector Node.

The Connector Node for the control light is created by selecting the scene object which holds the texture depicting the control light and choosing the Connector Node type "Texture Connector" by checking the radio button for that type. The settings for this Connector Node are:

• Hostname: c10006x64

• Port: 18084

• Texture location: c:/Cockpit/ControlLightTextures/

The location of the folder containing the images that are used as textures is specified by pressing the "Browse" button and selecting that folder. The image files in that folder have to meet the following requirements: the file name has to be composed of the name of the scene object, an underscore character, and a number corresponding to the states received from the BPEL process, and all image files must use the PNG file format. In this case, the image files must be called "Body_22_1.png", "Body_22_0.png" and "Body_22_-1.png". Now that both scene objects have a Connector, their functional relationship can be simulated. If the user tilts the toggle switch by an angle bigger than the rotation threshold, the Web Service client of the toggle switch sends the rotation data to the BPEL process BPELToggleSwitchProcess, which will map the rotation to a state and then invoke the Web Service operation ChangeTexture(...) provided by the control light. Figure 28 shows all three possible results of the simulated functional relationship.
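As a small illustration of this naming convention (the helper function below is not part of the module; only the convention itself and the scene object name from the example are taken from the text):

    #include <sstream>
    #include <string>

    // Builds the texture file name from the scene object name and the state
    // received from the BPEL process: "<scene object name>_<state>.png".
    std::string buildTextureFileName(const std::string& sceneObjectName, int state)
    {
        std::ostringstream name;
        name << sceneObjectName << "_" << state << ".png";
        return name.str();
    }

    // Example: buildTextureFileName("Body_22", -1) yields "Body_22_-1.png".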

Figure 28: On the left side the toggle switch is tilted by 30 degrees, in the middle the toggle switch is in its default position and on the right the toggle switch is tilted by -30 degrees.

For the final demonstrator that was built for the scenario of this work, the BPEL processes BPEL2ToggleSwitchesProcess, BPELRotarySwitchProcess, BPELPushButtonProcess and BPELRotationEchoProcess are used. The control light is set into relation to two toggle switches, a rotary switch is connected with the monitor showing a map, a push button affects the weapons monitor, and the rotation data from the flight control stick is used to steer the whole cockpit. All the control devices and the objects which are related to them are shown in figure 29 and are marked red.

Figure 29: For the final demonstrator Connector Nodes for all the objects that are marked red were created to connect them to their corresponding BPEL processes.

7.2 Extending the Module to Incorporate New Processes

New BPEL processes can be integrated into the existing framework by using two different approaches. The simplest approach is to remodel one of the existing BPEL processes. This approach can only be used if one of the existing processes receives all the information the new process would need. To connect a scene object to this remodeled process, no further changes to the module or one of its components are needed. This approach is not very helpful, since a new process would be integrated at the cost of losing another. The second approach is to build a completely new BPEL process and then extend the Web Service client used by the VDP's FunctionalityConnectorModule so it can contact the Web Service of the new process. Building a new BPEL process is easy. It can be done by using the open source BPEL editor provided by Eclipse, or any other way one might prefer. In this work, the editor provided by Eclipse was used. For instructions on how to get started with this editor, the tutorial published by the Institute for Applied Information Technology at the University of Hannover (http://www.se.uni-hannover.de/lehre/tutorials/BPEL-ODE-Eclipse.php) was found to be quite helpful.

Once the new BPEL process is built, the WSDL file describing the Web Service of that process can be used to create all the files needed for the module's Web Service client. To exemplify the modifications that have to be done to integrate a new process into the module, the integration of the BPEL2ToggleSwitchesProcess into the module is described in detail. An existing Connector Node is extended to allow communication with this process. The IDOToggleSwitchConnector is already able to communicate with the BPELToggleSwitchProcess at this point. The function handleSOEvent(...) of its Recipient calls the sendToggleSwitch(...) function of its Web Service client in the event of a rotation. This function of the client must be extended, so it can also send the information to the new process. To create client stubs for the module's Web Service client, the gSOAP tools wsdl2h and soapcpp2 are used. The steps performed to extend the function are the following:

1. Copy the WSDL file provided by the BPEL process into an empty folder.

2. Open the WSDL file and remove the line specifying the import for the Web Service description of the VDP module. In the file created for the BPEL2ToggleSwitchesProcess, this line looks like this:

Listing 6: Line which needs to be removed from the Web Service description of the BPEL2ToggleSwitchesProcess.

3. Open a console and type the command

wsdl2h -o BPEL2ToggleSwitches.h BPEL2ToggleSwitchesProcessArtifacts.wsdl

to create the C++ header file with service operation definitions and types for the operation's data.

4. Open the created header file and encapsulate the contained class within a namespace. For this example the namespace IDO2ToggleSwitchesClient is used. The encapsulation in a namespace is necessary so the Web Service client of the module can distinguish all the different clients created for the BPEL processes.

5. To create the client stub and XML serialization routines with the use of gSOAP, type the following command:

soapcpp2 -i -C -n -pIDO2ToggleSwitchesClient -I"C:\Program Files\gsoap-2.7\gsoap\import" BPEL2ToggleSwitches.h

The -i option indicates that we want C++ proxy and server objects that include the client code, the -C option is used to only generate the files needed for a client, and the -I option specifies the directory from which the stlvector.h file can be imported. For more information on how to build a client with the use of gSOAP, see the User Guide for gSOAP 2.7.17 by Engelen (2010), especially section 19.33 "How to build a Client or Server in a C++ Namespace" and section 19.34 "How to Create Client/Server Libraries".

6. Go to the WSClient folder located in the source code folder of the module, create a new folder for the files created with the gSOAP tools and copy them into this folder. It is important to also insert the stdsoap.h file into this folder.

7. Open the IDOWSClient.cpp file and include the IDO...BindingProxy.h and the IDO2ToggleSwitchesClient.nsmap files. Now the sendToggleSwitch(...) function can be extended to invoke the Web Service of the new process. The already existing code can be used as a reference.

8. Edit the CMakeLists.txt file of the module. The files IDO...BindingProxy.cpp and IDO2ToggleSwitchesClientC.cpp must be listed.

After having done all these steps, the solution for the module must be regenerated using CMake. Before the module can be rebuilt using Visual Studio, it is necessary to open the properties for the stdsoap.cpp file by right-clicking on the file in the solution view, select the option C/C++, Command Line, then the option Additional Options, and enter -DWITH_NONAMESPACES. The resulting source code for the sendToggleSwitch(...) function of the IDOWSClient can be seen in listing 9.

8 Results and Validation

The main goal of this work is to add the possibility of simulating functional behaviour and relationships within Virtual Prototypes. To achieve this goal, the VR system which is used to display the Virtual Prototype is bidirectionally coupled with functional simulations. The functional simulations are only utilized to model the functional behaviour of the Virtual Prototype and are not themselves used to perform any tests. Because of this, it is not necessary to regard the reaction time or the transmission time. The functional behaviour does not have to be performed or displayed in real time, although an interaction with a scene object must directly result in the appropriate event. It is not important whether the reaction to an interaction is executed within a hundredth or a tenth of a second, but the user of the prototype must have the feeling that it happens immediately. This is achieved by the solution presented in this work. Since another solution for the exact same task formulation and scenario was developed before this work by Seidel (2009), the quality of the class framework presented in this work can be assessed by comparing the two. To have some objective criteria, Source Lines of Code measurements are applied. The two different approaches and the features provided by them are compared, and the complexity of the code which implements them, represented by the Lines of Code, is set into relation with them. The Source Lines of Code measurements are of course not the main criterion used to determine the quality. Other criteria like extensibility, documentation, interoperability or usability play a much more important role in the evaluation process. First, the Source Lines of Code measurements for both solutions are presented to give a complete picture. The source code for the Web Service clients is not included, because it is not connected to the solution. Since the solution suggested by Seidel (2009) has no graphical user interface, two measurements for this work's solution are given, one including the classes implementing the GUI (table 10) and one without them (table 9). The source code for the solution implemented by Seidel (2009) includes the code written for the IDOSwitchHandler plugin and the Mediator. The Source Lines of Code measurements for his code are shown in table 8. For more details on these two components see section 2.

Both solutions are able to address the main scenario, which is to connect switches with displays. In other words, a transformation of a scene object can result in a texture change for another scene object.

Table 8: Source Lines of Code measurements for the plugin developed by Seidel (2009).

Measurement                          Count
Source Files                            24
Lines of Code                         2449
Blank Lines of Code                    560
Physical Executable Lines of Code     1385
Logical Executable Lines of Code       836
Code and Comment Lines of Code          13
Comment Only Lines of Code             504
Commentary Words                      2107

Table 9: Source Lines of Code measurements for the FunctionalityConnectorModule without the classes implemented for the module's GUI.

Measurement                          Count
Source Files                            34
Lines of Code                         3169
Blank Lines of Code                    584
Physical Executable Lines of Code     1734
Logical Executable Lines of Code      1048
Code and Comment Lines of Code          99
Comment Only Lines of Code             851
Commentary Words                      6437

Table 10: Source Lines of Code measurements for the FunctionalityConnectorModule including the classes implemented for the module's GUI.

Measurement                          Count
Source Files                            40
Lines of Code                         4458
Blank Lines of Code                    767
Physical Executable Lines of Code     2703
Logical Executable Lines of Code      1794
Code and Comment Lines of Code         117
Comment Only Lines of Code             988
Commentary Words                      7362

The IDOSwitchHandler plugin developed by Seidel (2009) requires a user interaction with a scene object to trigger the texture change, whereas the FunctionalityConnector module, implemented for this work, allows the invocation of a texture change at all times and without any preconditions. Additionally, this work also provides a way to trigger rotations or translations from outside of the VR system. The plugin connects the VDP with the simulation tool Rhapsody, and the coupling of the two independent simulations is very strong. To allow communication, the Mediator component must map the scene objects and their states to events inside Rhapsody. To perform this mapping, the user has to provide an XML description of the switches, a configuration file in which the texture scene object for each switch and the image files for this texture object must be specified, and a mapping table for the Mediator in which every switch scene object and all its possible states are set into relation with their corresponding Rhapsody events. The module developed during this work only needs two kinds of information to connect a scene object with a functional simulation: the URL of a BPEL process's Web Service and the URL of the Web Service which is deployed for the scene object that might be affected by this scene object. The BPEL processes orchestrate all the invocations received from and sent to the VDP and can also be used to simulate functional behaviour themselves. The coupling implemented for the module is therefore very loose. Because the module can be used to connect scene objects to different simulation tools, it has a high interoperability, whereas the plugin, which can only be used to connect a scene object to a Rhapsody simulation, has a low interoperability. Extending the module to incorporate new functional behaviour is simple; only one function in the Web Service client class must be edited. The plugin can also easily be extended by creating the required XML switch description, the configuration file for the plugin and the mapping table for the Mediator. But these extensions can only incorporate functional behaviour linked to the switch-texture scenario. Adding functional behaviour for other scenarios would require much work. As a result, the extensibility of the module is better than the extensibility of the plugin. By providing a graphical user interface, which can be used to create all the connections for the scene objects, the module also has a better usability and is more user-friendly. The Doxygen documentation standard was used for the documentation of the module's source code. This allows future developers to generate an up-to-date reference for the software. The source code of the plugin is documented, but without using any standards which allow the generation of references. This, and the fact that the number of commentary words included in the module's source code is 6437 (respectively 7362 if the code for the GUI is included) while the plugin only has 2107 commentary words, results in a better quality of the module's documentation. The IDOSwitchHandler plugin and the Mediator together have a total of 2449 lines of code with 1385 physical executable and 836 logical executable lines. This is less than the number of lines for the module's source code (see tables 9 and 10).
But since the module offers more features and can be applied to a bigger range of scenarios, the higher amount of source code lines is justified. It does not mean that the code written for the module is less effective. As a result of all the characteristics of the two solutions, the Service Oriented Architecture used in the FunctionalityConnector module is better suited to add functional behaviour to a Virtual Prototype.

Because of the cockpit control device scenario, the Connector classes for a toggle switch, a rotary switch and a push button were implemented. These classes can be replaced by more generalized classes. The Connector class for the toggle switch and the Connector class for the rotary switch can be merged into a Connector class for rotating objects. Both classes are basically the same anyway and are just implemented separately to address the switch-specific scenario of this work. The push button class can be renamed into a Connector class for translated objects. A Connector class for an object which can be rotated as well as translated, and the Listener class for this Connector, are already provided as a framework. Unfortunately both of these classes could not be finished in time, but they do not require much more work to be fully implemented and incorporated into the module.

9 Conclusion

The main goal of this work is to enhance Virtual Prototypes by adding functional behaviour to the scene objects of which the prototype consists. This goal is achieved by designing a Service Oriented Architecture which allows users to bidirectionally couple the independent simulation tools. The scene objects inside the VDP can be transformed and their textures can be changed by invoking Web Service operations provided by Connector Nodes. Therefore, all actions or events which might need to be triggered externally can be triggered from outside of the VDP by invoking these operations. It is also possible to create Web Service clients to inform a functional simulation about interactions with scene objects. This way, interactions with scene objects performed by the user of the VDP can be set into relation with their functional behaviour and relationships. The loose coupling of the simulations is accomplished by using Web Services to provide an interface for the VDP and by using BPEL processes to orchestrate any operation calls sent by or to the VDP. The Apache ODE, which is used to run the BPEL processes, can map all incoming operation calls to their associated Web Services. As a consequence, the independence of the simulation tools is maintained and the principle of Separation of Concerns is met. Another important requirement for the construction of a Virtual Prototype is that the construction of the prototype must be simple and may not consume a large amount of time. Due to the graphical user interface provided by the VDP module, this requirement is fulfilled on the side of the VR system. For the part of the functional simulation this requirement is also met, because the effort which is necessary to build a BPEL process and create a Web Service client to contact this process is small. The integration of the BPEL2ToggleSwitchesProcess, for example, only took about one hour. Once the BPEL processes are constructed and the Web Service clients for these processes are integrated into the VDP module, the Virtual Prototype can be built using the graphical user interface provided by the module. No expert or even advanced knowledge of any kind is required to operate the graphical user interface. Creating the Connector Nodes to connect the scene objects with functional simulations is easy and consumes almost no time. Switching between different cockpit configurations can be done within moments by loading previously created and saved Connector Node settings which were made for the different cockpits. Thus, different prototypes can directly be compared. Possible extensions for the solution presented in this work can, for example, address an automated generation of Web Service clients for the BPEL processes. If this can be done, the user of the VDP module would just have to specify the URL of the Web Service of the BPEL process his Connector Node is connected to, and the client for this Web Service would be created automatically. Also, the information required by the BPEL process would then have to be identified, provided and asserted autonomously by the module. Another interesting field to look at is the automatic discovery of services, or the composition of Web Services to automatically create new BPEL processes.

References

Web services architecture. Technical report, World Wide Web Consortium, February 2004a.

A. Alves, A. Arkin, S. Askary, B. Bloch, F. Curbera, Y. Goland, N. Kartha, Sterling, D. König, V. Mehta, S. Thatte, D. van der Rijn, P. Yendluri, and A. Yiu. Web services business process execution language version 2.0. OASIS Committee Draft, May 2006.

T. Andrews, F. Curbera, H. Dholakia, Y. Goland, J. Klein, F. Leymann, K. Liu, D. Roller, D. Smith, S. Thatte, I. Trickovic, and S. Weerawarana. BPEL4WS, Business Process Execution Language for Web Services Version 1.1. IBM, 2003. URL http://download.boulder.ibm.com/ibmdl/pub/software/dw/specs/ws-bpel/ws-bpel.pdf.

H.-J. Bullinger, R. Breining, and W. Bauer. Virtual prototyping - state of the art in product design, 1999.

R. Chinnici, H. Haas, A. A. Lewis, J.-J. Moreau, D. Orchard, and S. Weerawarana. Web services description language (WSDL) version 2.0 part 2: Adjuncts. World Wide Web Consortium, Recommendation REC-wsdl20-adjuncts-20070626, June 2007.

C. Conn, J. Lanier, M. Minsky, S. Fisher, and A. Druin. Virtual environments and interactivity: windows to the future. In SIGGRAPH '89: ACM SIGGRAPH 89 Panel Proceedings, pages 7-18, New York, NY, USA, 1989. ACM. ISBN 0-89791-353-1. doi: http://doi.acm.org/10.1145/77276.77278.

R. Engelen. gSOAP 2.7.17 user guide, May 2010. URL http://www.cs.fsu.edu/~engelen/soapdoc2.html.

E. Gamma, R. Helm, R. Johnson, and J. Vlissides. Design patterns: elements of reusable object-oriented software. Addison-Wesley Longman Publishing Co., Inc., Boston, MA, USA, 1995.

R. B. Grady and D. L. Caswell. Software metrics: establishing a company-wide program. Prentice-Hall, Inc., Upper Saddle River, NJ, USA, 1987. ISBN 0-13-821844-7.

M. Gudgin, M. Hadley, N. Mendelsohn, J.-J. Moreau, H. F. Nielsen, A. Karmarkar, and Y. Lafon. SOAP version 1.2 part 1: Messaging framework (second edition), W3C recommendation. Technical report, W3C, 2007. URL http://www.w3.org/TR/2007/REC-soap12-part1-20070427/.

IDO:Develop Programmers Guide - VDP 2007. IC:IDO, 2007.

IEEE. IEEE Std 610.12-1990(R2002). IEEE Standard Glossary of Software Engineering Terminology, 1990.

J. J. Moreau, R. Chinnici, A. Ryman, and S. Weerawarana. Web services description language (WSDL) version 2.0 part 1: Core language. Candidate recommendation, W3C, March 2006.

B. Mueck, M. Fischer, W. Dangelmaier, W. Klemisch, and H. Nixdorf. Bi-directional coupling of simulation tools with a walk-through system, 2002.

S. Mueller. Introduction for virtual reality and augmented reality. http://userpages.uni-koblenz.de/~cg/ws0910/VRAR/01_einfuehrung.pdf, 2009.

G. D. Rehn, M. Lemessi, J. M. Vance, and D. V. Dorozhkin. Integrating operations simulation results with an immersive virtual reality environment. In WSC '04: Proceedings of the 36th conference on Winter simulation, pages 1713-1719. Winter Simulation Conference, 2004. ISBN 0-7803-8786-4.

R. Schubotz. Bridging 3d graphics and web services. 2009.

F. Seidel. Interaktive Kopplung virtueller Produktmodelle mit funktionalen Systemmodellen, 2009.

W. R. Sherman and A. B. Craig. Understanding Virtual Reality: Interface, Application, and Design. Morgan Kaufmann Publishers Inc., San Francisco, CA, USA, 2002. ISBN 1558603530.

A. Stork, P. Schneider, C. Clauss, A. Schneider, T. Bruder, and T. Farkas. Towards more insight with functional digital mockup. 2009. URL http://publica.fraunhofer.de/documents/N-101622.html.

S. Strassburger, T. Schulze, M. Lemessi, and G. D. Rehn. Temporally parallel coupling of discrete simulation systems with virtual reality systems. In WSC '05: Proceedings of the 37th conference on Winter simulation, pages 1949-1957. Winter Simulation Conference, 2005. ISBN 0-7803-9519-0.

I. E. Sutherland. The ultimate display. In Proceedings of the IFIP Congress, pages 506-508, 1965.

The Eclipse Foundation. BPEL project. http://www.eclipse.org/bpel/, July 2010.

W3C. SOAP version 1.2 part 0: Primer (second edition). W3C Recommendation, April 2007. URL http://www.w3.org/TR/soap12-part0/.

W3C. Web services glossary, February 2004b. URL http://www.w3.org/TR/ws-gloss/.

G. G. Wang. Definition and review of virtual prototyping. 2008. URL http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.94.567.

List of Figures

1  UML-diagram with the design for the switch description introduced by Seidel (2009) ...... 10
2  Software architecture for coupling Rhapsody with the VDP introduced by Seidel (2009) ...... 11
3  Simulating a toggle switch in Rhapsody with the use of SysML ...... 11
4  Web interface provided by Rhapsody ...... 11
5  Reference Model for Virtual Reality ...... 15
6  Use case diagram for the scenario where the module will be used by the Test Pilot ...... 23
7  Use case diagram for the scenario where the module will be used by the VP Developer ...... 24
8  Use case diagram for the VR System and the Functional Simulation ...... 25
9  Toggle switch with a lock mechanism ...... 33
10 Architecture of the system developed for this work ...... 37
11 GUI of Eclipse's open source BPEL editor ...... 37
12 Plugin and module architecture of the VDP ...... 42
13 UML class diagram of the FunctionalityConnectorModule and the FunctionalityConnectorModuleUI class ...... 43
14 UML class diagram showing the different specialised Connector classes and their parent class ...... 44
15 UML class diagram with all the classes used to process user interaction ...... 46
16 UML activity diagram describing the actions caused by a user interaction with a toggle switch ...... 48
17 UML class diagram showing the IDOWSClient class and its relation to the IDOServiceConnectorRecipient class ...... 51
18 UML activity diagram showing the sequence of actions that results from invoking a texture change ...... 56
19 UML class diagram showing the classes involved in the deployment of a Web Service interface and the handling of the actions requested by messages sent to that Web Service ...... 57
20 UML class diagram showing the IDOServiceConnectorManager class and all the classes managed by it ...... 60
21 Screenshot of the widget designed for the VDP module FunctionalityConnector ...... 63
22 UML class diagram for all the classes related to the module and its menu ...... 68
23 BPEL process for the toggle switch to control light scenario ...... 71
24 BPEL process modeling the functional relation of a rotary switch and a monitor ...... 71
25 BPEL process for the scenario of two toggle switches ...... 72
26 Selecting the appropriate scene object node in the scene graph ...... 75
27 Displaying the settings of a Connector Node ...... 76
28 On the left side the toggle switch is tilted by 30 degrees, in the middle the toggle switch is in its default position, and on the right the toggle switch is tilted by -30 degrees ...... 77
29 For the final demonstrator, Connector Nodes were created for all the objects marked red to connect them to their corresponding BPEL processes ...... 78
30 UML class diagram of the FunctionalityConnectorModule ...... 101

List of Tables

1  Functional Requirements Part 1 ...... 26
2  Functional Requirements Part 2 ...... 27
3  Functional Requirements Part 3 ...... 28
4  Technical Requirements Part 1 ...... 28
5  Technical Requirements Part 2 ...... 29
6  User Interface Requirements ...... 29
7  Quality Requirements ...... 30
8  Source Lines of Code measurements for the plugin developed by Seidel (2009) ...... 82
9  Source Lines of Code measurements for the FunctionalityConnectorModule without the classes implemented for the module's GUI ...... 82
10 Source Lines of Code measurements for the FunctionalityConnectorModule including the classes implemented for the module's GUI ...... 82

Listings

1  Basic structure of a SOAP message ...... 20
2  C++ header file used by the gSOAP tool soapcpp2 to create a Web Service ...... 52
3  C++ header file defining the event type for receiving a message via the Web Service of an IDOServiceConnector object ...... 54
4  Creating a Listener for the Web Service by using the create function provided by the IDOListenerCreator template class ...... 59
5  Creating the two Listeners for the widget by using the create function provided by the IDOListenerCreator template class ...... 67
6  Line which needs to be removed from the Web Service description of the BPEL2ToggleSwitchesProcess ...... 79
7  WSDL file describing a service with an operation for a texture change ...... 94
8  Exemplary XML file for Connector Node configurations. A Connector Node for a toggle switch and a control light, one for a rotary switch and a monitor, and one for a push button and another monitor are contained in this file ...... 96
9  Source code for the sendToggleSwitch(...) function invoking either the BPELToggleSwitchProcess or the BPEL2ToggleSwitchesProcess ...... 98

7: WSDL file describing a service with an operation for a texture change.

[The XML source of this listing did not survive text extraction; only its line numbers (1-83) and two comments remain: "Service definition of function ns__changeTexture" and "gSOAP 2.7.15 generated service definition".]
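Because the original XML could not be recovered, the fragment below is a minimal, illustrative sketch of a WSDL description for such a service. Only the operation name changeTexture and the fact that the file was generated by gSOAP 2.7.15 are taken from the surviving fragments; the message, port type, binding and service names, the parameters, and the endpoint address are assumptions and do not reproduce the original file.

  <?xml version="1.0" encoding="UTF-8"?>
  <!-- illustrative sketch; all names except the operation changeTexture are assumptions -->
  <definitions name="IDOServiceConnector"
      targetNamespace="urn:IDOServiceConnector"
      xmlns="http://schemas.xmlsoap.org/wsdl/"
      xmlns:tns="urn:IDOServiceConnector"
      xmlns:SOAP="http://schemas.xmlsoap.org/wsdl/soap/"
      xmlns:xsd="http://www.w3.org/2001/XMLSchema">
    <message name="changeTextureRequest">
      <part name="nodeName" type="xsd:string"/>
      <part name="texturePath" type="xsd:string"/>
    </message>
    <message name="changeTextureResponse">
      <part name="result" type="xsd:int"/>
    </message>
    <portType name="IDOServiceConnectorPortType">
      <!-- service definition of function ns__changeTexture -->
      <operation name="changeTexture">
        <input message="tns:changeTextureRequest"/>
        <output message="tns:changeTextureResponse"/>
      </operation>
    </portType>
    <binding name="IDOServiceConnectorBinding" type="tns:IDOServiceConnectorPortType">
      <SOAP:binding style="rpc" transport="http://schemas.xmlsoap.org/soap/http"/>
      <operation name="changeTexture">
        <SOAP:operation soapAction=""/>
        <input><SOAP:body use="literal" namespace="urn:IDOServiceConnector"/></input>
        <output><SOAP:body use="literal" namespace="urn:IDOServiceConnector"/></output>
      </operation>
    </binding>
    <service name="IDOServiceConnectorService">
      <port name="IDOServiceConnectorPort" binding="tns:IDOServiceConnectorBinding">
        <SOAP:address location="http://localhost:18085"/>
      </port>
    </service>
  </definitions>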

8: Exemplary XML file for Connector Node configurations. A Connector Node for a toggle switch and a control light, one for a rotary switch and a monitor, and one for a push button and another monitor are contained in this file.

[The XML content of this listing did not survive text extraction; only its line numbers (1-75) remain.]
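Since the original XML could not be recovered either, the sketch below merely illustrates the kind of information such a configuration file needs to hold for the three Connector Nodes named in the caption. All element names and object names are assumptions; only the toggle switch endpoint follows the URL visible in Listing 9, and the other two endpoints are assumed analogously.

  <?xml version="1.0" encoding="UTF-8"?>
  <!-- illustrative sketch; element names, object names, and the rotary switch and
       push button endpoints are assumptions -->
  <ConnectorNodeSettings>
    <ConnectorNode type="ToggleSwitch">
      <sceneObject>ToggleSwitch_01</sceneObject>
      <associatedObject>ControlLight_01</associatedObject>
      <processEndpoint>http://localhost:8080/ode/processes/BPELToggleSwitchService</processEndpoint>
    </ConnectorNode>
    <ConnectorNode type="RotarySwitch">
      <sceneObject>RotarySwitch_01</sceneObject>
      <associatedObject>Monitor_01</associatedObject>
      <processEndpoint>http://localhost:8080/ode/processes/BPELRotarySwitchService</processEndpoint>
    </ConnectorNode>
    <ConnectorNode type="PushButton">
      <sceneObject>PushButton_01</sceneObject>
      <associatedObject>Monitor_02</associatedObject>
      <processEndpoint>http://localhost:8080/ode/processes/BPELPushButtonService</processEndpoint>
    </ConnectorNode>
  </ConnectorNodeSettings>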

9: Source code for the sendToggleSwitch(...) function invoking either the BPELToggleSwitchProcess or the BPEL2ToggleSwitchesProcess.

#include "ToggleSwitch/IDOToggleSwitchClientBindingProxy.h"
#include "ToggleSwitch/IDOToggleSwitchClient.nsmap"
#include "2ToggleSwitches/IDO2ToggleSwitchesClientBindingProxy.h"
#include "2ToggleSwitches/IDO2ToggleSwitchesClient.nsmap"
#include "IDOWSClient.h"
#include <sys/timeb.h>   /* the include target was lost in extraction; ftime() below requires <sys/timeb.h> */

/* [The function signature and the first branch of the listing (original lines
   7-35) were lost in extraction. That branch set up the proxy for the
   BPEL2ToggleSwitchesProcess and filled its request analogously to the else
   branch below, ending with the timestamp computation whose result is used
   here.] */
        request.timestamp = timestamp;

        if (BPELService.initiate(&request) == SOAP_OK)
        {
            std::cout << "Angle " << angle << " sent to BPEL-WS" << std::endl;
        }
        else
        {
            BPELService.soap_stream_fault(std::cerr);
        }
    }
    else
    {
        IDOToggleSwitchClient::BPELToggleSwitchBindingProxy BPELService;
        BPELService.namespaces = IDOToggleSwitchClient_namespaces;
        BPELService.soap_endpoint = m_clientEndpoint.c_str(); /* e.g. "http://localhost:8080/ode/processes/BPELToggleSwitchService" */
        IDOToggleSwitchClient::ns1__BPELToggleSwitchProcessRequest request;
        request.degree = angle;
        request.rotXAxis = rotXAxis;
        request.rotYAxis = rotYAxis;
        request.rotZAxis = rotZAxis;
        request.endpoint = m_clientAssociate;

        /* build a timestamp of the form HHMMSSmmm from the current local time */
        struct timeb timemilli;
        ftime(&timemilli);
        struct tm* timeinfo;
        time_t t;
        t = time(NULL);
        timeinfo = localtime(&t);
        int timestamp = (((timeinfo->tm_hour * 10000) + (timeinfo->tm_min * 100) + timeinfo->tm_sec) * 1000) + timemilli.millitm;
        request.timestamp = timestamp;
        std::cout << "timestamp: " << timestamp << std::endl;
        std::cout << "Associate: " << m_clientAssociate << std::endl;

        if (BPELService.initiate(&request) == SOAP_OK)
        {
            std::cout << "Angle " << angle << " sent to BPEL-WS" << std::endl;
        }
        else
        {
            BPELService.soap_stream_fault(std::cerr);
        }
    }
}

Figure 30: UML class diagram of the FunctionalityConnectorModule.