CEN/WS GITB3

Date: 2015-xx-xx

CWA XXXXX:2015

Secretariat: NEN

Draft CEN Workshop Agreement:
Global eBusiness Interoperability Test Bed (GITB)
Phase 3: Implementation Specifications and Proof-of-Concept

Status: Draft CWA for Public Comment


Contents

Foreword ...... 9
1 Executive Overview ...... 10
2 Definitions and Abbreviations ...... 16
2.1 Definitions ...... 16
2.1.1 eBusiness Specifications (see Section 3) ...... 16
2.1.2 Testing Purposes and Requirements (see Section 3.4) ...... 16
2.1.3 Testing Roles (see Section 4) ...... 17
2.1.4 Testing Framework and Architecture (see Section 3.4) ...... 17
2.2 Abbreviations ...... 20
Part I: Motivation for eBusiness Testing and Overview of GITB Testing Framework ...... 22
3 Motivation ...... 22
3.1 Testing as a Key Prerequisite to eBusiness Interoperability ...... 22
3.2 Stakeholders and their Interests in eBusiness Testing ...... 22
3.3 Categories of eBusiness Specifications ...... 24
3.4 eBusiness Testing ...... 25
3.4.1 Conformance and Interoperability Testing ...... 25
3.4.2 Testing Context and Stakeholders ...... 26
3.5 Benefits of a Global eBusiness Interoperability Test Bed ...... 27
4 GITB Principles and Testing Framework ...... 28
4.1 Objectives and Principles ...... 28
4.2 Synthesis of GITB Testing Framework ...... 29
4.3 Roles within the Testing Framework ...... 29
4.4 GITB Methodology ...... 30
4.4.1 Using Test Assertions ...... 30
4.4.2 Standalone Document Validation ...... 30
4.4.3 SUT-Interactive Conformance Testing ...... 31
4.4.4 Interoperability Testing ...... 31
4.4.5 Proposed Testing Practices for SUTs ...... 32
4.5 GITB Architecture ...... 33
Part II: Core Test Bed Implementation Specifications and Proof-of-Concept ...... 36
5 Overview of Core Test Bed Implementation Specifications ...... 36
5.1 Relevant Core Test Bed Service Specifications and Artifacts ...... 36
5.2 GITB Namespaces and Common Element Definitions ...... 37
5.2.1 XML Schema for Common Elements ...... 39
6 Test Presentation Language (TPL) ...... 43
6.1 Abstract Model ...... 43
6.2 Test Step Identification ...... 45
6.3 XML Schema for TPL ...... 45
7 Test Reporting Format ...... 48
7.1 Abstract Model ...... 48
7.1.1 XML Schema for Test Reporting Format ...... 50

8 GITB Test Service Specifications ...... 52
8.1 Content Validation Service ...... 52
8.1.1 Service Overview ...... 52
8.1.2 Abstract Service Description ...... 52
8.1.2.1 ValidationClient Requests Module Definition ...... 52
8.1.2.2 Validation ...... 53
8.1.3 Web Service Description (WSDL) ...... 53
8.1.4 XML Schema for Request/Response Messages ...... 54
8.2 Messaging (Simulation) Service ...... 54
8.2.1 Service Overview ...... 54
8.2.2 Abstract Service Description ...... 56
8.2.2.1 Requesting Module Definition (GetModuleDefinition) ...... 56
8.2.2.2 Initiating the Session (Initiate) ...... 56
8.2.2.3 Initiating a Transaction (BeginTransaction) ...... 56
8.2.2.4 Commanding Messaging Service to Send a Message (Send) ...... 56
8.2.2.5 Notification of the Client for Received or Proxied Messages (NotifyForMessage callback) ...... 56
8.2.2.6 Closing the Transaction (EndTransaction) ...... 57
8.2.2.7 Closing the Session (Finalize) ...... 57
8.2.3 Web Service Description (WSDL) ...... 57
8.2.4 XML Schema for Request/Response Messages ...... 59
8.3 Test Bed Service ...... 60
8.3.1 Service Overview ...... 60
8.3.2 Abstract Service Description ...... 61
8.3.2.1 Requesting Test Case Definition (GetTestcaseDefinition) ...... 61
8.3.2.2 Initiating Test Process (Initiate) ...... 62
8.3.2.3 Requesting Actor Definition (GetActorDefinition) ...... 62
8.3.2.4 Configure Test Execution (Configure) ...... 62
8.3.2.5 Initiate Preliminary Phase (InitiatePreliminary) ...... 62
8.3.2.6 Providing User Input for Execution (ProvideInput) ...... 62
8.3.2.7 Starting the Execution Phase (Start) ...... 63
8.3.2.8 Status Updates for Testcase Execution (UpdateStatus callback) ...... 63
8.3.2.9 User Interaction During Execution (InteractWithUsers callback) ...... 63
8.3.2.10 Stopping the Execution (Stop) ...... 63
8.3.2.11 Restarting the Execution Phase (Restart) ...... 63
8.3.3 Web Service Description (WSDL) ...... 64
8.3.4 XML Schema for Request/Response Messages ...... 66
9 GITB Test Description Language (TDL) ...... 69
9.1 GITB Test Bed Concepts and Interfaces ...... 69
9.1.1 Basic Concepts ...... 69
9.1.2 Type System and Expressions ...... 69
9.1.3 Modularity for Specific Functionalities ...... 70
9.2 Test Suite Definition ...... 71
9.3 Test Case Definition ...... 72
9.3.1 Namespace Declarations ...... 72
9.3.2 Importing External Test Modules and Artifacts ...... 73
9.3.3 Defining the Actors and Roles in the Test Case ...... 73
9.3.4 Defining the Variables ...... 73
9.3.5 Preliminary Phase for the Execution ...... 74
9.3.6 Test Steps and Commands ...... 75
9.3.7 Messaging Steps ...... 75
9.3.8 Validation Step ...... 77
9.3.9 User Interaction During Execution ...... 77
9.3.10 Interim Computations ...... 77


9.3.11 Test Flow Steps ...... 77
9.3.12 Modular Test Scripting ...... 78
9.3.13 Expressions and Bindings ...... 79
9.4 XML Schema for TDL ...... 79
10 GITB Proof of Concept (PoC) Test Bed Implementation ...... 84
10.1 Architecture ...... 84
10.1.1 GITB Testbed ...... 85
10.1.2 GITB Testbed Modules ...... 86
10.1.2.1 The Central Part of the GITB Testbed: gitb-core ...... 86
10.1.3 GITB Execution Interface ...... 102
10.1.3.1 How to Use the GITB POC Interface ...... 102
10.1.3.2 REST API ...... 108
10.2 Setting Up GITB PoC Testbed ...... 109
10.2.1 GITB Testbed ...... 109
10.2.1.1 Building ...... 109
10.2.1.2 Running ...... 109
10.2.2 GITB Execution Interface ...... 109
10.2.2.1 Dependencies ...... 109
10.2.2.2 Configurations ...... 109
10.2.2.3 User Management ...... 110
10.2.3 Building & Running ...... 110
10.2.4 Test Suite Deployment ...... 110
10.3 Case Studies with POC Test Bed ...... 111
10.3.1 UBL - Conformance Tests for PEPPOL BIS4A Invoice Only Specification ...... 111
10.3.1.1 Test Suite Definition ...... 111
10.3.1.2 Development of the Necessary Messaging Handlers ...... 112
10.3.1.3 Definition of Test Artifacts ...... 112

12.4.4.3 Test Bed ...... 133
12.4.4.4 Test Capability Component ...... 133
12.4.4.5 Test Logic Artifact ...... 133
12.4.4.6 Test Suite ...... 133
12.4.4.7 Test Case ...... 134
12.4.4.8 Payload File ...... 134
12.4.4.9 Messaging Adapter ...... 134
12.4.4.10 Document Validator ...... 134
12.4.4.11 Specification Type ...... 134
12.4.4.12 Identifier ...... 134
12.4.4.13 Publisher ...... 134
12.4.4.14 Standardization Level ...... 135
12.4.4.15 Representation Technique ...... 135
12.4.5 Controlled Vocabularies to be Used ...... 135
12.4.5.1 Specification Type of Asset ...... 136
12.4.5.2 Representation Type of Asset Distribution ...... 136
12.4.5.3 Standardization Level of Test Logic Artifact ...... 138
12.5 Features ...... 138
12.5.1 Overview ...... 138
12.5.2 Concepts ...... 139
12.5.3 Search Testing Resources ...... 140
12.5.3.1 Typical searches ...... 140
12.5.3.2 Examples of search queries and their answer ...... 141
12.5.4 Testing Resources management ...... 143
12.5.5 Secondary Features ...... 143
12.5.5.1 Workspace and Folders Management ...... 143
12.5.5.2 Bulletin board ...... 144
12.5.5.3 General administration ...... 144
12.6 Process View ...... 145
12.7 External Interfaces ...... 147
12.7.1 User Interfaces ...... 147
12.7.2 Software Interfaces ...... 147
12.7.3 Communications Interfaces ...... 147
13 Test Registry and Repository (TRR) Prototype Implementation ...... 149
13.1 Joinup ...... 149
13.2 TRR Joinup ...... 149
13.2.1 Use Case Diagram ...... 149
13.2.2 Actors ...... 150
13.2.2.1 Anonymous User ...... 150
13.2.2.2 Joinup Member ...... 150
13.2.3 Use Cases ...... 151
13.2.3.1 Search Test Resources within the Joinup Platform ...... 151
13.2.3.2 View Test Resources ...... 153
13.2.3.3 Create & Update Test Resources ...... 155
13.2.3.4 Delete Test Resources ...... 159
13.2.4 Fields of Test Resources ...... 159
13.2.4.1 Reused Fields ...... 160
13.2.4.2 Updated Fields ...... 160
Part IV: GITB Application and Validation based on Use Cases from the Automotive Industry, Healthcare and Public Procurement ...... 163
14 Applying GITB in Use Cases ...... 163
14.1 Approach ...... 163
14.2 Deriving Testing Requirements ...... 164


14.2.1 Verification Scope (“What to Test?”) ...... 164
14.2.2 Operational Requirements (« In Which Environment? ») ...... 166
14.3 Deriving Test Scenarios and Solutions ...... 167
Part IV.1: Public Procurement ...... 169
15 OpenPEPPOL ...... 169
15.1 Background and Testing Requirements ...... 169
15.2 Verification Scope – What Should be Tested? ...... 170
15.2.1 Parties/Actors ...... 170
15.2.2 Business Process ...... 170
15.2.3 Underlying eBusiness Specifications / Standards ...... 171
15.3 Testing Environment – How should be tested? ...... 171
15.3.1 Testing Integration in Business Environment ...... 171
15.3.2 Testing Location ...... 171
15.4 Test Scenario ...... 172
15.4.1 Objectives and Success Criteria ...... 172
15.4.2 Interaction Diagram/Choreography ...... 172
15.4.2.1 Endpoint lookup ...... 172
15.4.2.2 Document exchange ...... 173
15.4.3 System Under Test(s) ...... 173
15.4.4 Abstract Test Steps ...... 173
15.5 Related Existing Test Artifacts/Tools/Services to Reuse in the Domain ...... 174
15.5.1 Test Artifacts ...... 174
15.5.2 Test Tools and Services ...... 175
15.6 Related Stakeholders ...... 175
16 eSENS ...... 176
16.1 Background and Testing Requirements ...... 176
16.2 Verification Scope – What Should Be Tested? ...... 177
16.2.1 Actors and Roles ...... 177
16.2.2 Business Process ...... 177
16.2.3 Underlying eBusiness Specifications / Standards ...... 178
16.3 Test Scenario ...... 179
16.3.1 Objectives and Success Criteria ...... 179
16.3.2 Interaction Diagram/Choreography ...... 179
16.3.3 System Under Test(s) ...... 180
16.3.4 Abstract Test Steps ...... 180
16.4 Related Existing Test Artifacts/Tools/Services to Reuse in the Domain ...... 181
16.4.1 Test Artifacts ...... 181
16.4.2 Test Tools and Services ...... 182
16.5 Related Stakeholders ...... 182
17 Connecting Europe Facility (CEF) ...... 183
17.1 Background and Testing Requirements ...... 183
17.2 Verification Scope – What Should Be Tested? ...... 184
17.2.1 Actors ...... 184
17.2.2 Business Process ...... 184
17.2.3 Underlying Standards/Specifications ...... 184
17.3 Test Scenario ...... 185
17.3.1 Objectives and Success Criteria ...... 185
17.3.2 System Under Test(s) ...... 186
17.3.3 Abstract Test Steps ...... 186
17.4 Related Existing Test Artifacts/Tools/Services to Reuse in the Domain ...... 187


17.4.1 Test Artifacts ...... 187
17.4.2 Test Tools and Services ...... 187
17.5 Related Stakeholders ...... 187
Part IV.2: e-Health ...... 189
18 Clinical Document Architecture (CDA) ...... 189
18.1 Background and Testing Requirements ...... 189
18.2 Verification Scope – What Should be Tested? ...... 189
18.2.1 Parties/Actors ...... 190
18.3 Underlying eBusiness Specifications / Standards ...... 190
18.4 Testing Scenarios ...... 191
18.4.1 Objectives and Success Criteria ...... 191
18.4.2 System Under Test(s) ...... 191
18.4.3 Abstract Test Steps ...... 191
18.4.3.1 Testing the content creator ...... 191
18.4.3.2 Testing the content consumer ...... 191
18.5 Related Existing Test Artifacts/Tools/Services to Reuse in the Domain ...... 192
18.6 Related Stakeholders ...... 192
18.7 Re-usability of Test Artifacts/Tools/Services for GITB3 ...... 193
19 IHE - Cross-Enterprise Document Sharing (XDS) ...... 194
19.1 Background and Testing Requirements ...... 194
19.2 Verification Scope – What to Test? ...... 194
19.3 Actors ...... 194
19.3.1 Interaction Diagram/Choreography ...... 195
19.3.2 Underlying eBusiness Specifications / Standards ...... 195
19.4 Details/Requirements of Test Scenario ...... 196
19.4.1 Objectives and Success Criteria ...... 196
19.4.2 System(s) Under Test ...... 196
19.4.3 Abstract Test Steps ...... 196
19.4.3.1 Testing the Document Source ...... 196
19.4.3.2 Testing the Document Consumer ...... 197
19.4.3.3 Testing the Document Repository ...... 197
19.4.3.4 Testing the Document Registry ...... 197
19.5 Related Existing Test Artifacts/Tools/Services to Reuse in the Domain ...... 198
19.6 Related Stakeholders ...... 199
19.7 Re-usability of Test Artifacts/Tools/Services for GITB3 ...... 199


324 Foreword

325

326  (to be provided by Secretariat at the end of the project)

327


328 1 Executive Overview

329

330 Motivation

331 The work on GITB is motivated by the increasing need to support testing of eBusiness scenarios as a means 332 to foster standards adoption, achieve better compliance to standards and greater interoperability within and 333 across the various industry, governmental and public sectors. Without testing, it is cumbersome to reach 334 interoperability of eBusiness implementations and to achieve conformance with standards specifications. 335 More advanced testing methodologies and practices are needed to cope with the relevant set of standards 336 for realizing comprehensive eBusiness scenarios (i.e. business processes and choreography, business 337 documents, transport and communication protocols), as well as Test Beds addressing the specific 338 requirements of multi-partner interactions.

339 GITB intends to increase the coordination between the manifold industry consortia and standards 340 development organizations with the goal to increase awareness of testing in eBusiness standardization and 341 to reduce the risk of fragmentation, duplication and conflicting eBusiness testing efforts. It thereby supports 342 the goals of the European ICT standardization policy12 to increase the quality, coherence and consistency of 343 ICT standards and provide active support to the implementation of ICT standards.

344 Vision

345 The long-term objective is to establish a shared and Global eBusiness Interoperability Test Bed (GITB) 346 infrastructure to support conformance and interoperability testing of eBusiness Specifications and their 347 implementation by software vendors and end-users.

348 Objectives

349 The GITB project aims at

350  developing the required global Testing Framework, architecture and methodologies for state- 351 of-the-art eBusiness Specifications and profiles covering all layers of the interoperability stack 352 (business processes, business documents, transport and communication);

353  supporting the realization of GITB as a network of multiple Test Beds, thereby leveraging existing 354 and future testing capabilities from different stakeholders (for example standards development 355 organizations and industry consortia, Test Bed Providers, and accreditation / certification 356 authorities);

357  establishing under EU support and guidance, a setup of a comprehensive and global eBusiness 358 interoperability Test Bed infrastructure in a global collaboration of European, North American and 359 Asian partners.

360 GITB focuses on the architecture, methodology and guidelines for assisting in the creation, use and 361 coordination of Test Beds. It is not intended to become an accreditation/certification authority or to impose a 362 particular Test Bed implementation.

363

1 COMMUNICATION FROM THE COMMISSION TO THE EUROPEAN PARLIAMENT, THE COUNCIL AND THE EUROPEAN ECONOMIC AND SOCIAL COMMITTEE A strategic vision for European standards: Moving forward to enhance and accelerate the sustainable growth of the European economy by 2020 COM(2011)311 final

2 Regulation (EU) 1025/2012 on European Standardisation.

364 Benefits

365 Overall GITB benefits are two-fold: First, GITB raises the awareness of testing as a prerequisite to 366 standards adoption and interoperable eBusiness implementations. It ensures that advanced testing 367 methodologies and services will be available for state-of-the-art eBusiness Specifications. Second, GITB 368 promotes a shared, international testing infrastructure realized as a network of Test Beds that 369 leverages synergies between existing and future testing activities. Compared to stand-alone Test Beds 370 covering only one or a few eBusiness Specifications, the GITB network saves costs and increases speed in 371 developing and providing high-quality testing services for eBusiness Specifications.

372 End-users will benefit from the advanced testing methodologies, architectures and services for realizing 373 comprehensive eBusiness scenarios more quickly and with fewer project risks. They also avoid costs implied 374 by investments in low quality, non-interoperable standards.

375 By putting more emphasis on testing, standards development organizations (SDOs) and industry 376 consortia ensure developing high quality, timely eBusiness Specifications in support of the industry needs; 377 and enable straightforward and effective approaches for standards’ implementation assessment, piloting, 378 and deployment.

379 Based on advanced testing methodologies and services, software vendors, eBusiness consultants and 380 integrators are able to develop and integrate enterprise applications in a demonstrably conformant and 381 interoperable manner. Missing implementation guidelines and missing testing facilities increase their 382 implementation efforts and the risks that their software applications do not conform to eBusiness 383 Specifications and / or are not interoperable with other implementations.

384 With GITB, standards development organizations, test service providers, and software vendors 385 benefit from a joint approach for developing Test Beds across different world regions and sectors, which 386 positively affects development cost, capability, and compatibility of future testing facilities by leveraging best 387 of class expertise and shared resources. They could benefit from sharing the work load, agreeing on the 388 interpretations of the standards, and working in a synchronized manner.

389 National governments and the European Union benefit by providing industry, including the SMEs, with 390 high-quality ICT standards in a timely manner to ensure competitiveness in the global market while 391 responding to societal expectations. By providing active support to the implementation of ICT standards 392 using a standard testing approach, they increase the quality, coherence, and consistency of ICT standards.

393

394 GITB Phases and Approach

395 GITB objectives are planned to be achieved in three phases (Table 1-1). This CWA summarizes the GITB 396 third phase which develops implementation specifications, an open source Test Bed and a prototype for Test 397 Registry and Repository as proof-of-concept for the GITB architecture. It builds on the results of the first 398 phase (feasibility study) and second phase (testing framework and architecture).

399

400

401


402 Table 1-1: The Three Phases of the GITB Project

Phase 1: Feasibility study
  Main activities: An analysis of the benefits, risks, tasks, requirements and required resources for a GITB based on business use cases; current state of eBusiness testing facilities.
  Main results: Assessment of testing requirements from three use cases. Comparison to existing testing facilities and gap analysis.
  Published as: CWA 16093:2010

Phase 2: Conceptualization of the GITB framework and architecture
  Main activities: Analysis of alternative approaches to architecting and implementing a GITB. A recommended architecture and process to implement the Test Bed that follows from the requirements and architectural analysis with clear rationale. Assessment of requirements from international stakeholders.
  Main results: GITB Testing Framework (architecture, methodology and guidelines) for assisting in the creation, use and coordination of Test Beds. Validation based on three use cases.
  Published as: CWA 16408:2012

Phase 3: Realization
  Main activities: Refinement and detailed specification of the GITB Testing Framework. Proof-of-concept implementation comprising a Test Bed, a Test Registry and Repository, and Test artifacts for business use cases.
  Main results: GITB specifications (Core Test Bed implementation specifications, Test Registry and Repository specifications). Open source Test Bed. Test Registry and Repository prototype. Sample Test Artifacts.
  Published as: CWA to be announced

403

404 405 Figure 1-1 provides an overview of the GITB approach, which starts from eBusiness use cases to capture 406 real-world eBusiness testing requirements and to develop and validate the GITB Testing Framework and the 407 specifications. GITB relies on the following industry communities which contribute their requirements and 408 experience in eBusiness testing:

409  Public Procurement: CEN Business Interoperability Interfaces (BII), Pan European Public 410 Procurement Online (PEPPOL / openPEPPOL), Electronic Simple European Networked Services (e- 411 SENS).

412  Healthcare: HL7 Clinical Document Architecture (CDA), IHE Cross-Enterprise Document Sharing 413 (XDS).

414  Automotive Industry: MOSS (Materials Off-Shore Sourcing).

415 During the initial phase, the feasibility analysis was performed by gathering the requirements from three use 416 cases with regard to Verification Scope (“what to test”) and operational requirements (“how to test”). The

417 comparison between these requirements and the existing eBusiness Testing Capabilities revealed a set of 418 functional and non-functional gaps. The assessment of these gaps demonstrated that a shared, operational 419 Test Bed infrastructure is desirable and feasible to complement eBusiness standards development efforts.

420 421 Figure 1-1: Use Case-Driven Approach to Validate the GITB Testing Framework and Develop the 422 Global GITB Testing Infrastructure 423

424 The second phase further conceptualized and elaborated the suggested approaches to architecting and 425 implementing GITB. Its main result is the GITB Testing Framework which comprises architecture, 426 methodology and guidelines for assisting in the creation, use and coordination of Test Beds. The GITB 427 Testing Framework has been instantiated and validated for the use cases, and a pilot implementation has 428 been done in one case.

429 The GITB Testing Framework forms the basis for the realization of the GITB Platform in the third phase. The 430 aim of this third GITB phase is to elaborate implementation specifications, to develop a proof-of-concept 431 (POC) Test Bed that implements these specifications, and to apply this test bed to one or more real use case 432 test scenarios.

433 GITB Phase 3 comprises three lines of work:


434 (1) Implementation specifications: this line of work is about the development of detailed, machine- 435 processable specifications for the Test Artifacts and interfaces functionally described in Phase 2, so 436 that interoperable test bed implementations can be developed from these. These specifications 437 include formal document structures (e.g. XML schemas and rules), formally defined remote service 438 interfaces (e.g. WSDL) and internal APIs (e.g. as written in Java). These specifications also include 439 the profiling of existing standards or technologies that are considered supportive of Phase 3, called 440 here “supportive standards / technologies”. 441 442 (2) Test Bed proof-of-concept development: this includes all development and integration work for 443 the POC test bed. In particular: (a) development of a core test bed platform and some plug-in 444 components necessary for targeted test scenarios, i.e. at least one of each plug-in category: 445 message adapter, test suite engine, document validator. (b) development of a prototype TRR based 446 on a supportive Registry/Repository standard. This may also include (c) integration of an existing 447 “legacy” test bed following the “GITB-service compliance” approach described in Phase 2. 448 449 (3) Test Scenario development: this line of work is about demonstrating actual use of the test bed 450 POC for one or more industry domain(s). It includes (a) development or migration of a test suite for 451 the domain, including a document validator, (b) deployment of these test artifacts in the TRR and in 452 the test bed POCs, and (c) an end-to-end test demonstration over test material and/or SUTs used in 453 real use case test scenarios, followed by feedback from domain experts and users. 454

455

456 Figure 1-2: Focus of GITB 3

457 How to Read This CWA Document

458 This CWA presents the GITB implementation specifications for the Test Bed as well as the Test Registry and 459 Repository. It describes prototype implementations for both, which serve as proof-of-concept. It also applies 460 the GITB results in use cases. In order to improve readability, the report is structured in four main sections 461 addressing different target groups and their view on GITB project results (Table 1-2).

462 Table 1-2: Guidelines on How to Read the CWA Document

Part I: Motivation for eBusiness Testing and GITB Testing Framework (Chapters 3 to 4)
  Content: Why eBusiness testing matters (motivation for eBusiness testing); how GITB envisions eBusiness testing (GITB Architecture Vision, GITB Testing Framework).
  Relevant for: eBusiness users, standard development organizations, industry consortia, testing experts and all other stakeholders interested in the general motivation for GITB and an overview of the proposed solution.

Part II: GITB Test Bed Core Implementation Specifications (Chapters 5 to 11)
  Content: Test Presentation Language (TPL), Test Reporting Format, GITB Test Service Specifications, GITB Test Description Language (TDL), GITB Proof-of-Concept Test Bed Implementation, GITB Compliance.
  Relevant for: Testing experts and architects interested in the detailed Test Bed Architecture and Specifications.

Part III: GITB Test Registry and Repository (TRR) Specifications (Chapters 12 to 13)
  Content: Application profile for the TRR based on the Asset Description Metadata Schema (ADMS); prototype implementation based on Joinup.
  Relevant for: Testing experts and architects interested in registries and repositories for sharing Test Resources.

Part IV: Testing Scenarios from the Public Procurement, Healthcare and Automotive / Manufacturing Industries (Chapters 14 to 19)
  Content: How to use GITB for eBusiness testing (Test Scenario definitions and workflow; Test Artifacts related to Test Scenarios, i.e. Test Suites and Test Assertions) for the selected industries: Public Procurement, Healthcare, Automotive and Manufacturing Industry.
  Relevant for: eBusiness users, standard development organizations and industry consortia interested in applying the Test Bed Architecture to their eBusiness scenarios.


464 2 Definitions and Abbreviations

465

466 2.1 Definitions

467 The following definitions are intended to address the most commonly recurring terms about testing in this 468 report. They are general definitions that may be refined in later sections of the document. Other terms 469 relating to specific areas (e.g. architecture, artifacts) will be listed in the related sections.

470 For the purpose of the present document, the terms and definitions given in ISO/IEC 9646-1:1994 471 “Information technology - Open Systems Interconnection - Conformance Testing methodology and 472 framework - Part 1: General concepts” apply.

473 Most of the definitions below are capitalized, even when involving common terms – e.g. “Test Bed”. When 474 the capitalized version is used in this document, it should be understood as having the particular meaning 475 defined in this section (or as defined in a further section), usually more precise or specific to the GITB 476 context than the common meaning for the term.

477 2.1.1 eBusiness Specifications ( see Section 3)

478 eBusiness Specification: An eBusiness Specification is any agreement or mode of operation that needs to 479 be in place between two or more partners in order to conduct eBusiness transactions. An eBusiness 480 Specification is associated with one or more of three different layers in the eBusiness interoperability stack: 481 transport and communication (Messaging) layer, Business Document layer, and Business Process layer. In 482 many situations, an eBusiness Specification comprises a set of standards or a profile of these.

483 Profile (of eBusiness Specifications): A Profile represents an agreed upon subset or interpretation of one 484 or more eBusiness Specifications, intended to achieve interoperability while adapting to specific needs of a 485 user community.

486 Business Process: A Business Process is a flow of related, structured activities or tasks that fulfill a specific 487 service or business function (serve a particular goal). It can be visualized with a flowchart as a sequence of 488 activities. The term also includes the resulting exchanges between business partners, which is also named 489 “public process”. The public process makes abstraction of the back-end processes driving these exchanges.

490 Business Document: A Business Document is a set of structured information that is relevant to conducting 491 business, e.g., an order or an invoice. Business Documents may be exchanged as a paper format or 492 electronically, e.g. in the form of XML or EDI messages.
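As a purely illustrative sketch (the element names and namespace below are invented for this example and are not taken from any particular document standard), a small XML Business Document for an order might look as follows:

    <Order xmlns="urn:example:order">
      <ID>PO-2015-0042</ID>                      <!-- document identifier -->
      <IssueDate>2015-03-31</IssueDate>
      <BuyerParty>Example Buyer Ltd.</BuyerParty>
      <OrderLine>
        <ItemID>ITM-001</ItemID>
        <Quantity unitCode="EA">10</Quantity>    <!-- business rules may constrain such values -->
      </OrderLine>
    </Order>

The same business content could equally be carried in an EDI message; what matters for testing is that the structure and semantics are specified precisely enough to be verified.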

493 2.1.2 Testing Purposes and Requirements ( see Section 3.4)

494 System Under Test (SUT): An implementation of one or more eBusiness Specifications, forming part of 495 an eBusiness system that is to be evaluated by testing.

496 Conformance Testing: Process of verifying that an implementation of a specification (SUT) fulfills the 497 requirements of this specification, or of a subset of these in case of a particular conformance profile or level. 498 Conformance Testing is usually realized by a Test Bed connected to the SUT. The Test Bed simulates 499 eBusiness protocol processes and artifacts against the SUT, and is generally driven by the means of test 500 scripts.

501 Interoperability Testing: A process for verifying that several SUTs can interoperate at one or more layers of 502 the eBusiness interoperability stack (see “eBusiness Specification”), while conforming to one or more 503 eBusiness Specifications. This type of testing is executed by operating SUTs and capturing their exchanges. 504 The logistics of Interoperability Testing is usually more costly (time, coordination, set-up, human efforts) 505 than Conformance Testing. Conformance does not guarantee interoperability, and Interoperability Testing is 506 no substitute for a conformance Test Suite. Experience shows that Interoperability Testing is more 507 successful and less costly when Conformance of implementations has been tested first. The interoperability 508 test process can also be piloted by a Test Bed, using test scripts as in Conformance Testing.

509 Operational Testing Requirements: An operating environment requirement specifies the concerns of 510 defining, obtaining, and validating Test Items within a specific testing environment. It answers the question: 511 What is the specific testing environment?

512 Verification Scope: The Verification Scope specifies the subject of testing. It answers the question: What 513 type of concern to test for? A type of concern is defined by (1) a specific aspect or quality of SUT to be 514 assessed and (2) an eBusiness Specification or Profile.

515 2.1.3 Testing Roles ( see Section 4 )

516 Test Designer: A Test Engineer who develops Test Suites, Test Cases and Document Assertions. This 517 includes interpreting the B2B specifications, understanding – or writing – Test Assertions, if any, in order to 518 derive Test Cases from these.

519 Test Manager: A role responsible for executing Test Suites or for facilitating their execution, including 520 related organizational tasks such as coordination with Test Participants.

521 Test Participant: The owner or operator of an SUT, typically the end-user, an integrator or a software 522 vendor. This role defines the Verification Scope and Testing Requirements.

523 Test Bed Provider: A general role that applies to anyone offering a Test Bed to Test Participants / 524 Managers / Designers.

525 Testing Capability Provider: A general role that applies to anyone offering a Testing Capability for use in a 526 Test Bed (e.g. offering an HL7 conformance Testing Capability that can be plugged-in a Test Bed platform).

527 2.1.4 Testing Framework and Architecture ( see Section 3.4)

528 Document Assertions Set: A Document Assertions Set (DAS) is a package of artifacts used to validate a 529 Business Document, typically including one or more of the following: a schema (XML), consistency rules, 530 codelists, etc. These artifacts are generally machine-processable.

531 Document Validator: A processor (a software application) that can verify some aspects of document 532 requirements, i.e. some validation assertions about a document such as an XML schema or some 533 consistency rules. A Document Validator may be specialized for some type of validation assertion (e.g. XML 534 schema validation, or semantic rules).
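For illustration, a Document Assertions Set could combine an XML schema with consistency rules expressed in, for example, ISO Schematron, which a Document Validator specialized for that rule language can process. The hypothetical rule below (the element names it tests are illustrative and not taken from a specific eBusiness Specification) checks one simple constraint:

    <schema xmlns="http://purl.oclc.org/dsdl/schematron">
      <pattern id="order-consistency">
        <!-- illustrative consistency rule: every order line must state a positive quantity -->
        <rule context="OrderLine">
          <assert test="number(Quantity) &gt; 0">
            An order line must state a quantity greater than zero.
          </assert>
        </rule>
      </pattern>
    </schema>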

535 GITB Architecture: An architecture for a testing infrastructure comprising Test Beds and a Test Registry 536 and Repository. It comprises the following elements:

537 (a) Test Artifacts that are processed by test beds

538 (b) Test Services for supporting testing activities,

539 (c) Test Bed Components and their integration,

540 (d) Test Registry and Repository for managing, archiving and sharing various Testing Resources.

541 GITB Compliant Test Bed: GITB-compliance means either GITB-framework compliance or GITB-service 542 compliance. A GITB-framework compliant Test Bed follows the GITB recommendations with regard to its 543 functional scope. A GITB-service compliant Test Bed is only required to follow the GITB recommendations 544 for its Service interfaces, and the Test Artifacts it produces.

545 GITB Methodology: provides guidelines for e-business testing. It assists in specifying the subject of testing 546 and the type of concern to test for (“what to test”). It also defines the means by which the testing goal is 547 achieved (“how to test”) and outlines typical testing scenarios (Standalone Document Validation, SUT- 548 Interactive Conformance Testing, Interoperability Testing).

549 GITB Testing Framework: The architecture, methodology and guidelines for assisting in the creation, use 550 and coordination of Test Beds. The GITB Testing Framework comprises the GITB Methodology and the


551 GITB Architecture for a modular testing infrastructure comprising Test Beds and a Test Registry and 552 Repository.

553 Legacy Test Bed: An existing Test Bed that has been developed prior to GITB recommendations. A Legacy 554 Test Bed can be made “GITB-service compliant” by extending it with a subset of the service interfaces 555 described in this report.

556 Test Agent: A processor – either a simple application or a complete Test Bed – that plays a 557 secondary role in the execution of a Test Suite, i.e. is interacting either with a Test Bed or with a Web 558 browser for the purpose of assisting Test Suite execution. A Test Agent may simulate one party in the 559 execution of a Test Suite (e.g. send messages to an SUT or wait for messages from the SUT), or may be 560 specialized for the execution of some Test Case, or for executing a Document Validator. A Test Agent may 561 be either one or both of: (a) An interacting [Test] Agent if it is able to directly interact with an SUT e.g. to 562 execute parts of the Test Suite (e.g. simulates a business party in some Test Case), (b) A validating [Test] 563 Agent if it is able to verify conformance of some Test Items to an eBusiness Specification or to a profile.

564 Test Artifact: A Test Artifact is a document used as input or output of Test Beds. These documents may 565 represent various data objects, e.g. Test Cases, Test Assertions, Test Suite scripts, Test Reports, test logs. 566 A Test Artifact should be machine-readable (e.g. formatted in XML).

567 Test Assertion (Cf. OASIS Test Assertion Guidelines [TAG]): A Test Assertion is a testable or 568 measurable expression - usually in plain text or with a semi-formal representation - for evaluating the 569 adherence of an implementation (or part of it) to a normative statement in a specification. Test Assertions 570 generally provide a starting point for writing a conformance Test Suite or an interoperability Test Suite for a 571 specification.
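A minimal, semi-formal sketch of a Test Assertion is given below; the identifier, normative source and wording are invented for illustration, and the field names loosely follow the concepts of the OASIS Test Assertion Guidelines rather than any normative GITB format:

    Test Assertion:    TA-ORD-001 (illustrative identifier)
    Normative source:  "An order line SHALL state a positive quantity" (hypothetical requirement)
    Target:            each OrderLine element of the submitted Business Document
    Predicate:         the Quantity value is a number greater than zero
    Prescription:      mandatory

Such an assertion can then be turned into one or more executable Test Cases in a conformance Test Suite.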

572 Test Bed: An actual test execution environment for Test Suites or Test Services. In the context of this 573 document, this generic term applies by default to various operational combinations of components provided 574 by or developed according to the (GITB) Testing Framework.

575 Test Bed Architecture: A particular combination of components and relationships among the components, 576 in a software system design based on Testing Framework resources and definitions and intended to perform 577 testing operations in accordance with use case requirements.

578 Test Bed Component: A component of a Test Bed that executes a function required for conformance and 579 interoperability testing. Either a core Test Bed platform component (performing an internal test Bed function, 580 e.g. Test Suite deployment) or a user-facing component (e.g. a Test Suite editor), or a component providing 581 a specific Testing Capability (e.g. a Document Validator).

582 Test Case: A Test Case is an executable unit of verification and/or of interaction with an SUT, corresponding 583 to a particular testing requirement, as identified in an eBusiness Specification. Each test case includes: (1) a 584 description of the test purpose (what is being tested - the conditions / requirements / capabilities which are to 585 be addressed by a particular test), (2) the pass/fail criteria, (3) traceability information to the verified 586 normative statements, either as a reference to a test assertion, or as a direct reference to the normative 587 statement (Cf. OASIS Test Assertion Guidelines definition [TAG]).
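Purely as an illustration (identifiers and wording are invented; the executable form of Test Cases in GITB is defined by the TDL in Part II), the descriptive part of a Test Case might be sketched as:

    Test Case:      TC-ORD-001 (illustrative identifier)
    Test purpose:   verify that the SUT issues an order whose order lines state positive quantities
    Pass/fail:      pass if every OrderLine/Quantity in the captured order is greater than zero, fail otherwise
    Traceability:   Test Assertion TA-ORD-001 (see the sketch under "Test Assertion" above)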

588 Test Description Language: In the eBusiness domain, Test Description Language (TDL) is a high-level 589 computational language capable of expressing Test Case and Test Suite execution logic and semantics.

590 Test Execution Log: (A specific kind of Test Artifact). Message capture or other trace of observable 591 behavior that results from SUT activity. It is a collection of Test Items, subject to further verification or 592 analysis.

593 Testing Capability: A general term to designate the set of resources (Test Bed Components, test logic or 594 test configuration artifacts) supportive of a particular test function or of a Test Suite execution, typically 595 related to an eBusiness standard. All Testing Capabilities (plug-in components and/or artifacts) are typically 596 add-ons to a Test Bed platform. They may be added to or removed from a Test Bed depending on the testing 597 needs without modifying the code of the Test Bed but instead via a configuration change – they do not 598 represent core functions of such a platform. Examples are:


599  Testing Capabilities that relate to a particular eBusiness Specification, e.g. an “HL7 document Testing 600 Capability” involves a set of resources necessary to validate HL7 documents: an HL7 document 601 assertion set (test logic definition) combined with a Document Validator component (the processor of this 602 test logic). An “ebMS2.0 messaging adapter” Testing Capability is an ebMS2.0 Adapter Test Bed 603 Component that will enable Test Suites to use ebMS2.0 messaging during execution. A Test Bed with 604 HL7 validation capability will be said to be “HL7 validation-capable”, or with ebMS2.0 messaging 605 capability to be “ebMS2.0 messaging capable”.

606  Some Testing Capability components are not associated with a specific eBusiness Specification or 607 standard, but rather with a specific test logic standard such as XML schema or a particular TDL. 608 Processors for such standards (e.g. XML schema validator, TDL script interpreter) are also considered 609 as Testing Capability components.

610 Testing Resource: A generic term to designate any part of a Test Bed (Test Artifact, Test Service interface, 611 core or plug-in Test Bed Component), or a combination of these.

612 Testing Framework (see GITB Testing Framework).

613 Test Item: A unit of data to be verified, e.g. a document, a message envelope, an XML fragment. In the B2B 614 or eBusiness environment, a Test Item can be a message instance, event, or status report that is obtained from 615 an SUT for the purposes of assessing conformance or interoperability of the SUT (see Conformance 616 Testing, Interoperability Testing).

617 Test Registry and Repository ( see Part III): A component for managing, archiving and sharing 618 distributed testing resources.

619 Test Report: documents the result of verifying the behavior or output of one or more SUT(s), or verifying 620 Test Items such as Business Documents. It makes a conformance or interoperability assessment (see 621 Conformance Testing and Interoperability Testing). It is generally intended for human readers (although 622 possibly after some rendering, e.g. HTML rendering in a browser or after a translation from XML to HTML).

623 Test Services: These services allow for managing Test Artifacts (design, deploy, archive, search) as well as 624 controlling the major Test Bed functions (test execution and coordination).

625 Test Step: A unit of test operation(s) that translates into a controllable, separate unit of test execution.

626 Test Suite: (A kind of Test Artifact). A Test Suite defines a workflow of Test Case executions and/or 627 Document Validator executions, with the intent of verifying one or more SUTs against one or more eBusiness 628 Specifications, either for conformance or interoperability.

629 Test Suite Engine: A Test Suite Engine (or "Test Suite Driver") is a processor that can execute a Test Suite, 630 or has control of the Test Suite main process execution in case it delegates part of the execution - e.g. some 631 Test Cases or some validation tasks - to specialized Test Agents or to a Document Validator.

632

633


634 2.2 Abbreviations

635 AIAG Automotive Industry Action Group

636 B2B Business-to-Business

637 B2C Business-to-Consumer

638 B2G Business-to-Government

639 CDA Clinical Document Architecture

640 DAS Document Assertion Set

641 eAC ebXML Asia Committee

642 ebBP ebXML Business Process

643 eBIF eBusiness Interoperability Forum

644 EDI Electronic Data Interchange

645 EIRA European Interoperability Reference Architecture

646 GITB Global eBusiness Interoperability Test Bed

647 GUI Graphical User Interface

648 HL7 Health Level Seven

649 HTML Hypertext Markup Language

650 IDE Integrated Development Environment

651 IHE Integrating the Healthcare Enterprise

652 MOSS Material Off-Shore Sourcing

653 NHIS National Health Information System

654 PEPPOL Pan-European Public Procurement Online

655 PoC Proof-of-Concept

656 SDO Standards Development Organization

657 SOAP Simple Object Access Protocol

658 SUT System Under Test

659 TAG Test Assertion Guidelines

660 TAPM Test Artifacts Persistence Manager

661 TDL Test Description Language

662 TPL Test Presentation Language

663 TRR Test Registry and Repository


664 XML Extensible Markup Language

665

666

667

668


669 Part I: Motivation for eBusiness Testing and Overview of GITB Testing Framework

670 Part I summarizes the motivation for eBusiness testing and provides an overview of the GITB Testing 671 Framework. It is relevant for the following target groups: eBusiness users, standard development 672 organizations, industry consortia, testing experts and all other stakeholders.

673 3 Motivation

674 3.1 Testing as a Key Prerequisite to eBusiness Interoperability

675 In the move towards globally networked enterprises, eBusiness scenarios are to support increasingly 676 complex interactions among a larger number of organizations from industry, governmental and public 677 sectors. While eBusiness scenarios are implemented and adopted at a global level, interoperability has 678 become a major concern. Consequently, organizations from private and public sectors as well as technology 679 and software providers are engaged in cooperation for the development of vertical industry standards. 680 However, it can be noticed that it is still cumbersome for software vendors and end-users to demonstrate full 681 compliance with the specified standards and to achieve interoperability of the implementations3. This is due 682 to a number of facts: 683 684 (1) Many standards development organizations (SDOs) and industry consortia are only in the process of 685 conceptualizing how they will ensure interoperability of standards’ implementations. They are unsure 686 how to provide adequate testing and certification services.

687 (2) eBusiness interoperability typically requires that a full set of standards – from open and Web 688 Services standards to industry-level specifications and eBusiness frameworks – are implemented. 689 We denote this set of standards as eBusiness Specifications that underlie the electronic business 690 relationship.

691 (3) As of today, there are only limited and scattered Test Beds. If Test Beds are provided by one of the 692 standards development organizations, they have a rather narrow focus on a particular standard. In 693 particular, they might not encompass testing the entire set of relevant eBusiness Specifications from 694 a company perspective, i.e. a “Profile”, and interactions in more complex Business Processes with 695 several partners.

696 The following section outlines the demand for eBusiness testing from the perspective of the relevant 697 stakeholders. 698 3.2 Stakeholders and their Interests in eBusiness Testing

699 The relevant stakeholders in eBusiness testing comprise end-users from private and public sectors, industry 700 consortia and SDOs, technology and software vendors, testing laboratories as well as public authorities and 701 governments.

702 End-users comprise all organizations – from private and public sectors – which implement eBusiness 703 scenarios. Their ultimate goal is to increase the efficiency and effectiveness of their organizations and to 704 keep up-to-date in solutions for enhanced customer experiences. eBusiness testing is of interest for them as 705 they:

706 (1) realize the benefits of eBusiness solutions more quickly, with fewer project risks, and

707 (2) avoid costs implied by investments in low quality, non-interoperable standards.

708 For end-users, the lack of eBusiness testing has negative impacts on project duration for on-boarding 709 business partners and is one of the root causes of significant B2B integration costs. While the ability of an 710 enterprise to quickly add new business partners is a key factor in determining the level of its business agility,

3 eBusiness W@tch Report on e-Business Interoperability and Standards: A Cross-Sector Perspective and Outlook, 2005

711 most companies need 3 to 10 days or more to on-board new business partners4. The most negative effects 712 of a lack of testing, however, are errors that occur in productive eBusiness scenarios, i.e. if supply chain 713 operations are slowed down or customer requirements cannot be fulfilled as planned.

714 Industry consortia and formal SDOs are communities of end-users, public authorities and other interested 715 parties that act to achieve the following objectives:

716 (1) Maintain cohesive community acting on key set of industry issues leading to industry-driven, 717 voluntary standards development;

718 (2) Develop high quality, timely industry standards specifications in support of industry needs;

719 (3) Effect efficient implementation of the developed standards by the vendors to provide a rational 720 basis for the standards assessment;

721 (4) Enable straightforward and effective approaches for standards’ implementation assessment, 722 piloting, and eventual deployment.

723 For industry consortia, the lack of testing increases the risks that implementations of the specified standards 724 are not interoperable.

725 Software vendors that act to achieve the following objectives:

726 (1) Develop enterprise applications that are standards-compliant, and

727 (2) Effectively support their client base by achieving functional and interoperable eBusiness 728 solutions.

729 Software application vendors are struggling with the sheer number and complexity of standards as well as the 730 low quality of eBusiness Specifications with regard to their consistency. Missing implementation guidelines 731 and missing Testing Capabilities increase their implementation efforts and the risks that their software 732 applications do not conform to eBusiness Specifications and / or are not interoperable with other 733 implementations.

734 Testing laboratories act to achieve the following objectives:

735 (1) Increase efficiency and reliability of interoperable implementation of standards;

736 (2) Assure unbiased and objective nature of the standards implementation assessment process.

737 From the perspective of national governments and the European Union, a lack of interoperability and poor- 738 quality standards harm innovation and competition, burn investments, and drain the growth potential of 739 markets. In their current efforts to modernize the EU ICT standardization policy, the European Commission 740 states the following policy goals:

741  To provide industry including SMEs, with high-quality ICT standards in a timely manner to ensure 742 competitiveness in the global market while responding to societal expectations;

743  To increase the quality, coherence and consistency of ICT standards, and

744  To provide active support to the implementation of ICT standards.

745 eBusiness testing provides the necessary means to achieve these goals, as it contributes to solve quality 746 issues in standards development and addresses implementation issues which currently hamper the adoption 747 of eBusiness standards. Consequently, eBusiness testing needs to be a cornerstone of EU ICT 748 standardization policy.

4 Forrester Research Inc. (2009): The Value of a Comprehensive Integration Solution, Forrester Research Inc., Cambridge, 2009

749 3.3 Categories of eBusiness Specifications

750 Doing business electronically requires that certain agreements are in place between two or more partners in 751 order to conduct eBusiness transactions. We denote these agreements as the eBusiness Specifications 752 governing an electronic business relationship. An eBusiness Specification is associated with one or more of 753 three different layers in the eBusiness interoperability stack56 and often relies on standards that have been 754 developed or are still under development (Table 3-1).

755 1. Transport and Communication (Messaging) Layer: How do organizations communicate 756 electronically? 757 This layer addresses technical interoperability. Relevant specifications cover the range from 758 transport and communication layer protocols like HTTP to higher level messaging protocols such as 759 Simple Object Access Protocol (SOAP) or ebXML Messaging. Furthermore, security, reliability and 760 other quality of service protocols and extensions over the transport and communication protocols are 761 also considered in this layer.

762 2. Business Document Layer: What type of information do organizations exchange? 763 This layer addresses the semantic interoperability and specifies the form and content of Business 764 Documents which are exchanged electronically. Specifications may relate to:

765  Document structure, i.e. definition of the document syntax (e.g. XML), the naming and 766 design rules (e.g. rules for generic Business Document structure, as specified by OAGIS 767 BOD architecture) and the assembly of the document (e.g. rules for the assembly of 768 Business Documents, as defined by OAGIS BOD architecture);

769  Document semantics, i.e. the definition of document and fields (e.g. an XML document 770 definition) and their meaning including reference to external code lists, taxonomies and 771 vocabularies (UN/CEFACT Core Component Library, UBL Component Library), and

772  Business rules that define restrictions or constraints among data element values.

773 3. Business Process Layer: How do the organizations interact? 774 Business Processes address organizational interoperability. Specifications at this level describe how 775 Business Processes are organized across organizational boundaries. The Business Process layer, 776 either presented in a formal Business Process specification standard such as ebXML Business 777 Process Specification Schema (BPSS) or with an informal workflow definition like flowcharts or 778 interaction diagrams, provides a message choreography, exception flows (error handling) and other 779 business rules for the eBusiness application roles participating in the process.

780 In addition to these layers, an eBusiness Specification may rely on profiles which define cross-layer 781 dependencies and further restrictions on the single layers.
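As a purely hypothetical illustration of such cross-layer profiling (the choices listed here are examples only, not a description of any actual profile), a Profile might fix one option per layer and add further restrictions:

    Profile "Example Procurement Profile" (hypothetical)
      Business Process layer:   ordering choreography restricted to Order and Order Response
      Business Document layer:  XML order and invoice documents, with additional business rules
                                (e.g. a mandatory buyer identifier and restricted code lists)
      Messaging layer:          SOAP-based messaging over HTTP(S), with message signing required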

782

5 CEN ISSS: eBUSINESS ROADMAP addressing key eBusiness standards issues 2006-2008.

6 Legner, C.; Vogel, T. (2008): Leveraging Web Services for Implementing Vertical Industry Standards: A Model for Service-Based Interoperability, in: Electronic Markets, 18, 1, 2008, pp. 39-52.

783 Table 3-1: eBusiness Specifications

784

785 786 3.4 eBusiness Testing

787 3.4.1 Conformance and Interoperability Testing

788 From a general perspective, two types of testing are relevant in the context of eBusiness:

789  Conformance testing involves verifying whether an eBusiness implementation conforms to the 790 underlying eBusiness Specifications. This is the first step toward interoperability with other 791 conformant systems as prescribed by the specification.

792  Interoperability testing is verifying that two or more eBusiness implementations actually are able to 793 intercommunicate based on some exchange scenarios. This form of testing is generally more 794 difficult to automate than Conformance Testing, and is more effort intensive in terms of human 795 involvement and coordination.

796 Experience shows that only through conformance and Interoperability Testing can correct information exchange 797 among eBusiness implementations be guaranteed and software implementations be certified. 798 Conformance Testing is no substitute for Interoperability Testing, and vice-versa.

799 Experience also shows that the type and quality of the eBusiness Specifications impact whether 800 conformance and Interoperability Testing can easily be performed. If eBusiness Specifications comprise 801 substantial text descriptions, with some flow-charts or diagrams, these narrative or semi-formal 802 representations often leave many degrees of freedom for interpretation to the users. The efforts to prepare 803 test scripts and Test Cases are much higher than in the case of an eBusiness Specification which comprises 804 machine-readable representations, such as XML schemas, code lists, data models or formal representations 805 (e.g. in the Web Services Description Language, or in ebXML Business Process (ebBP) – some examples for 806 machine-readable specifications are depicted in the right column of Figure 3-1).
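For instance, a requirement stated only in prose ("the total amount must carry a currency code") leaves room for interpretation, whereas a machine-readable form of the same requirement can be handed directly to a Document Validator. The XML Schema fragment below is an illustrative sketch, not an excerpt from an actual eBusiness Specification:

    <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
      <!-- illustrative: a monetary amount that must carry a currency code -->
      <xs:element name="TotalAmount">
        <xs:complexType>
          <xs:simpleContent>
            <xs:extension base="xs:decimal">
              <xs:attribute name="currencyID" type="xs:string" use="required"/>
            </xs:extension>
          </xs:simpleContent>
        </xs:complexType>
      </xs:element>
    </xs:schema>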

807 As of today, the existing testing tools, Test Suites and testing committees individually address a specific 808 standard or one of the above layers. However, integrated Testing Frameworks which do not hard-code a 809 specific standard at any layer (because different communities may use different standards) and are capable


810 of handling testing activities at all layers of the interoperability stack are necessary for conformance and 811 Interoperability Testing.

812 3.4.2 Testing Context and Stakeholders

813 eBusiness testing is performed in different contexts with different business rationale and stakeholders:

814 (1) Standardization initiated by a standard development organization (SDO) or an industry 815 consortium (Figure 3-1): 816 An SDO or industry consortium develops an eBusiness Specification (or Profile) and deploys it to the 817 community of users, software vendors etc. In this case, testing occurs during standard development 818 (in order to test conformance with other specifications, such as Naming and Design Rules) for quality 819 assurance of the developed eBusiness Specifications. Testing also occurs during standard 820 deployment to ensure the quality and the interoperability of the implementations. Testing may lead to 821 certification of software or productive implementations.

822

823 Figure 3-1: Testing Context “Standardization” 824

825 (2) “Onboarding” of new business partners initiated by user company (Figure 3-2): In this case, a 826 company defines eBusiness Specifications and imposes their implementation on all business 827 partners. Testing is performed as part of the so-called ”onboarding process“ of partners.


828

829 Figure 3-2: Testing Context “Onboarding” 830 831 3.5 Benefits of a Global eBusiness Interoperability Test Bed

To summarize, without eBusiness testing the potential of standard setting is not fully exploited and the widespread adoption of eBusiness standards will not be possible. Hence, the rationale for GITB can be summarized as follows:

835 (1) Efficient allocation of resources and efforts in eBusiness implementation projects (less resources 836 will be spent to overcome low quality, conflicting or fragmented standards),

837 (2) Higher quality of eBusiness standards and mitigation of systemic risks in the eBusiness 838 community, and

839 (3) Improvement of the eBusiness standards development and diffusion process.

840 More attention to testing, visibility of outcomes and feedback from testing to industry consortia and SDO's 841 will imply increased attention to quality of standards and their implementation, and to a crisp boundary 842 between commons (standards as public resources) and proprietary assets.

843

844


845 4 GITB Principles and Testing Framework

846 GITB emphasizes the modularity and reusability of a Test Bed design and the easy plug-in of existing and 847 future Testing Capabilities for state-of-the-art eBusiness Specifications. In proposing a Testing Architecture, 848 GITB enables the coordination of multiple collaborating Test Beds in a network of Testing Resources, 849 offering Testing Capabilities for eBusiness Specifications that can be used either directly by Test 850 Participants, or by other Test Beds. 851 852 4.1 Objectives and Principles

The following objectives and principles guided the GITB work and should be met by the GITB Architecture, the underlying Testing Framework and the Test Beds:

855  Coverage of all eBusiness Interoperability Layers: In view of the increasing number of eBusiness 856 Specifications that are implemented and adopted at a global level, testing has to address all 857 interoperability layers (i.e. business processes and choreography, business documents, transport 858 and communication protocols) as well as profiles of them.

859  Testing Anywhere, Anytime: Interoperability and Conformance Testing should not be restricted in 860 time and place. Software vendors and end-users should be able to test their implementations over 861 the Web anytime, anywhere and with any parties willing to do so. Interoperability Testing is expected 862 to be repeated on a regular basis, as B2B networks and systems evolve continuously due to new 863 versions of eBusiness Specifications, upgrades of eBusiness systems, changing business 864 communities, and changing business requirements.

865  Reduction of Time Spent in Testing: Considering the amount of Test Cases necessary to cover 866 the conformance or Interoperability Testing requirements of eBusiness Specifications, the time spent 867 by participants during the testing process should be significantly reduced by a testing methodology 868 that favors reuse, automation and test integration. Partial coverage of the eBusiness stack by using 869 disparate, unrelated tools for each layer is error prone and costly in terms of integration efforts and 870 skills. The GITB Testing Framework aims to provide a comprehensive approach to eBusiness testing 871 by integrating configuration management and other preliminary Test Steps into the testing process.

872  Ease of Design and Use: The Test Bed will aim at the “low cost of entry” for its users and hence 873 provide a graphical environment where a Test Designer can assemble the reusable Test Cases for 874 conformance and Interoperability Testing.

875  Independence of Test Bed design from the eBusiness Specifications: “Hard-coded” test logic in 876 one-off Test Bed implementations is not desirable due to opacity, maintenance difficulties, non- 877 reusable skills and platforms. The Test Bed design(s) have to be independent from eBusiness 878 Specifications to be tested for.

879  Modularity: Current eBusiness Specifications specify a variety of messaging protocols, business 880 document formats or choreographies. In order to support all of these and test them, the Test Bed 881 should be adaptable and modular. Therefore, it is necessary to define interfaces for several layers 882 and facilitate plug-in modules supporting different protocols or formats implementing the specified 883 interfaces.

884  Reuse of existing Test Beds and Test Suites: A “Service” approach allows for reuse and leverage of existing Test Beds (legacy or not) and Test Suites. This reuse can be accomplished at design time – by creating Test Suites from existing components, assembling and deriving Test Cases from existing ones, reusing similar design patterns – or at run-time by enabling a distributed execution of a Test Suite over cooperating Test Beds.

889  Flexibility in Architecture: The Testing Framework should allow for flexibility in Test Bed architecture designs. It may be instantiated, e.g., as a centralized Test Bed or as a distributed Test Bed using a service-oriented approach.

892  Standardized and Innovative Testing Methodologies will ensure the successful development of testing for comprehensive eBusiness Specifications and Profiles.

894 4.2 Synthesis of GITB Testing Framework

895 The GITB Testing Framework comprises architecture, methodology and guidelines for assisting in the 896 creation, use and coordination of Test Beds.

The GITB Testing Framework’s constituents are twofold:

1. The GITB Methodology provides guidelines for eBusiness testing. It assists in specifying the subject of testing and the type of concern to test for (“what to test”). It also defines the means by which the testing goal is achieved (“how to test”) and outlines typical testing scenarios (Standalone Document Validation, SUT-Interactive Conformance Testing, Interoperability Testing).

2. The GITB Architecture allows for a network of Test Beds to share Testing Resources and Testing Capabilities by means of services, yet also recommends an internal Test Bed design that promotes modularity and reuse.

908 The objectives in focusing on the definition of a Testing Framework and Architecture – as opposed to 909 defining a specific Test Bed design – are:

910  To define a general methodology and best practices related to all of the above, so that a common 911 set of skills in designing tests and operating them, may be shared and applied across eBusiness 912 disciplines.

913  To promote reuse of functional components across eBusiness Test Beds while allowing variability in 914 Test Bed architectural options,

915  To allow for the portability and reuse of Test Artifacts across Test Beds by defining some level of 916 standardization of these, and by facilitating their archival and discovery,

917  To ensure the use of common design concepts across Test Beds, thus promoting a common 918 understanding across eBusiness communities, and the same governance options,

919 4.3 Roles within the Testing Framework

920 The following roles, which generally correspond to different categories of Test Bed users and providers, are 921 identified and supported by the Testing Framework:

922  Test Designer: this role involves all tasks related to the creation of a Test Suite or of its parts (Test 923 Cases, document assertion sets, configuration artifacts). The Test Designer may also be responsible 924 for the creation of the set of Test Assertions from which Test Suite/Cases or Document Assertion 925 Sets will be derived. S/he must have a good understanding of the eBusiness domain and 926 specification(s) addressed by the Test Suite. S/he must also understand the testing conditions and 927 constraints under which the Test Suite and Test Bed will be used, and the variability that the Test 928 Suite must offer with respect to its reuse. The Test Designer is expected to be familiar with the 929 Testing Framework methodology and best practices.

930  Test Participant: The owner or operator of an SUT, typically the end-user, an integrator or a 931 software vendor. This role defines the Verification Scope and Testing Requirements. This role is 932 generally held by someone responsible for an eBusiness implementation, and having business 933 domain expertise.

934  Test Manager: A role responsible for executing Test Suites or for facilitating their execution, including related organizational tasks such as coordination with Test Participants. The Test Manager is an expert in Test Suites, and in the logistics involved in running tests. S/he generally uses the Test Bed on behalf of the Test Participants, or assists the Test Participant in using the Test Bed, e.g. for configuring and deploying a Test Suite before execution, and for searching/discovering the appropriate Test Suite in the Test Repository. S/he is also familiar with the Test Suite logic and related eBusiness domain. Test Participants may act as Test Manager, if they are knowledgeable in testing.

942  Test Bed Provider: This role is about operating the Test Bed itself as a server or an application 943 service. It also may extend to the actual development and evolution of the Test Bed from Testing 944 Framework resources and components (as obtained from the Test Repository). The Test Bed 945 Operator is responsible for keeping the Test Bed functionally operational, and represents the Test 946 Bed owning party for any contractual relationship with users, i.e. all other roles. 947 948 4.4 GITB Methodology

949 4.4.1 Using Test Assertions

Ideally, a set of Test Assertions has been defined for an eBusiness Specification before a Test Suite and Test Cases are developed. Test Assertions provide a way to bridge the narrative of an eBusiness Specification and the Test Cases for verifying conformance (or interoperability). Test Assertions help to interpret the specification statements from a testing viewpoint. Test Cases should then be derived from such Test Assertions, as illustrated in Figure 4-1.

955

Figure 4-1: The Role of Test Assertions
Test Assertions provide a starting point for writing conformance and interoperability Test Suites. They simplify the distribution of the test development effort between different groups: often, Test Designers are not experts in the specification to be tested, and need guidance. By interpreting specification statements in terms of testable conditions, Test Assertions improve confidence in the resulting Test Suite and provide the basis for coverage analysis (estimating the extent to which the specification is tested). OASIS has developed Test Assertions Guidelines (TAG) that can be used to help develop Test Assertions.

963 4.4.2 Standalone Document Validation

964 Document validation – also sometimes called “Instance” or “conformance/unit” testing – is a particular form of 965 Conformance Testing which verifies a Test Item (e.g., an HL7 V2 message) against the rules defined in the 966 specification. This form of testing does not directly involve a System Under Test (SUT), but rather a testing 967 artifact (Test Item) that was produced by the SUT. Examples of such testing include validating a Clinical 968 Document Architecture (CDA) document instance against the CDA general rules and document type rules, 969 and validating an HL7 V2 message instance against an HL7 V2 conformance profile.

970

971 Figure 4-2: Workflow of a Standalone Document Validation


972 In “standalone” document validation, the document under test is obtained by a Test Participant, who directly 973 submits the document to and gets the Test Report from the Test Bed. This document validation is then 974 disconnected from any SUT communication, or larger Test Suite execution, as the Test Participant directly 975 controls all inputs to the Test Bed.

976 4.4.3 SUT-Interactive Conformance Testing

977 Conformance Testing is defined as verifying an artefact (e.g., an HL7 V2 message) against the rules defined 978 in the specification. Interactive Conformance Testing involves direct interaction between Test Bed and SUT, 979 combined with dynamic validation of SUT outputs (document validation). The document validation is usually 980 delegated by the Test Suite engine to a Document Validator. 981


983 Figure 4-3: Sample Workflow in Interactive Conformance Testing 984 In such interactive Conformance Testing, the Test Participant (or Test Manager) only needs to interact with 985 the Test Bed to control the overall execution and get the final report. 986 987 4.4.4 Interoperability Testing

Interoperability is defined as the ability of two SUTs to interact with each other in compliance with the specification. This interaction usually involves data artefacts (e.g. messages) produced by one SUT and consumed by the other. Interoperability Testing (see definition in section 2.1.2) can be conducted in different modes:
(1) Passive Interoperability Testing: in this mode, the SUTs are not controlled by the Test Suite, i.e. by a Test Bed. The SUTs interact on their own or under regular business activity. The interoperability Test Suite only verifies captured traffic: it is a validating Test Suite.
(2) Directly driven Interoperability Testing: in this mode, the interoperability Test Suite actively drives one or more SUTs in order to cause them to interact: it is an interacting Test Suite. In addition, the Test Suite (or another one, in case of “two-phase testing” – see next section) does the verification of captured traffic.
(3) Indirectly driven Interoperability Testing: in this mode, the SUTs are controlled indirectly by the Test Bed. The interoperability Test Suite interacts using a different channel with an entity controlling the SUT – e.g. sends an email to a Test Participant asking for initiation of a message from or to the SUT. In addition, the Test Suite (or another one, in case of “two-phase testing” – see next section) does the verification of captured traffic.

Ideally, the message capture should not interfere with the way the SUTs interoperate as they would under real business conditions. The three most common ways to capture message traffic between SUTs are:

a) Using a “man-in-the-middle” system operating and re-routing messages at transport level (e.g. an HTTP proxy or a TCP intermediary). This is typically the least intrusive approach, although it imposes restrictive conditions (the messages and sessions should not be encrypted).


b) Instrumenting one of the SUTs so that message capture is performed at the endpoint, e.g. on the message handler of the SUT. Later on, this message capture can be consolidated in a Test Execution Log.
c) Configuring the sending SUT(s) so that they duplicate the messages sent and forward a copy to a Monitoring component or directly to the Test Bed.


1020 Figure 4-4: Basic Interoperability Testing 1021 1022 4.4.5 Proposed Testing Practices for SUTs

If possible, first perform Document/Message Instance Testing: Document/Message Instance Testing eliminates the problems within a single document. The structure and the business rules are checked. After passing the Document/Message Instance Testing, the SUT can guarantee that it can generate valid documents/messages.

Always perform Conformance Testing: Document/Message Instance Testing can ensure that an SUT can generate valid documents/messages. However, it cannot guarantee that the SUT can send/receive these messages/documents as defined in the standard. Therefore, through the Conformance Tests, an SUT is tested to check whether it can send/receive messages in the order defined by the standard. In the Conformance Tests, all the other roles with which the SUT communicates according to the specific standard are simulated by the testing applications or Test Beds. Therefore, the SUT is expected to behave as if it were in a real-life setting. The business rules that should be applied across documents are also checked in Conformance Tests.

Perform Interoperability Testing after Conformance Testing; if possible, design interoperability Test Suites so that they are not redundant with tests already done during Conformance Testing: Sometimes fatal errors can be found during Interoperability Testing. If so, the Test Suites should be designed in such a way that those fatal errors are detected during Conformance Testing. Through the interoperability tests more than one SUT is tested, and their ability to operate in real-life settings is verified. In the certification process, passing Conformance Testing is sufficient most of the time. However, through Interoperability Testing, the interoperability with other real-life SUTs is tested.


1042 4.5 GITB Architecture

For further portability and reuse, the Testing Framework defines a modular architecture based on standard Test Bed Component interfaces that allow for reusability of certain Testing Capabilities, and an extensible plug-in design. A key tenet of interoperability and reuse across Test Beds is the standardization, at appropriate levels, of the Test Artifacts to be processed (Test Cases, Test Suites, Test Reports, test configurations, etc.).

1048 Figure 4-5 provides an overview of the GITB Testing Architecture and its key elements:

1049  Test Artifacts that are processed by a Test Bed:

1050 o (a) test logic documents (Test Suite definitions, document Test Assertions),

1051 o (b) test configurations documents (parameters and message bindings for Test Suites, 1052 configuration of messaging adapters), and

1053 o (c) test output documents (test logs and Test Reports).

1054  Test Services definitions and interfaces. These services are about managing the above Test 1055 Artifacts (design, deploy, archive, search) as well as controlling the major Test Bed functions (test 1056 execution and coordination).

1057  Test Bed Components and their integration. These components are functionally defined. They are 1058 of three kinds:

1059 o (a) Core Test Bed platform components providing basic features, integration and 1060 coordination support to be found in any GITB-compliant Test Bed,

1061 o (b) Testing Capability components, that directly enable the processing of Test Suites (e.g. a 1062 Test Suite engine, a Document Validator) and related tasks (e.g. send/receive messages),

1063 o (c) User-facing components, through which the users interact with the Test Bed for various 1064 functions (e.g. Test Suite design, test execution).

1065  Test Registry and Repository for managing, archiving and sharing various Testing Resources. 1066 This component is not considered as part of a Test Bed, as it is a Testing Resource that can be 1067 independently deployed, managed and accessed. It supports the archiving, publishing and sharing of 1068 various Test Artifacts (e.g. Test Suites to be reused, Test Reports to be published). It also provides 1069 for storing and sharing Testing Capability components to be downloaded when assembling or 1070 upgrading a test Bed (e.g. the latest version of a Test Suite engine, of a Document Validator).


1071

1072 Figure 4-5: Overview of the GITB Architecture 1073 In the proposed architecture, the GITB Test Bed is perceived by its users (either persons with specific roles 1074 or other Test Beds) as a set of Test Services. The Test Bed in itself allows for plug-in Testing Capabilities 1075 (for example, Test Suite engines, specialized validation components, message adapters, etc.). These 1076 Testing Capabilities can be supported either by existing (legacy) Test Beds, by remote services or by future 1077 test components to be developed. The Test Bed is also a platform where various Test Suites or Document 1078 Assertions can be deployed, i.e. it is not tied to a particular eBusiness Specification and its Test Suites.

1079 The proposed architecture enables the coordination of several Test Beds specialized for the testing of 1080 different eBusiness Specifications, or for different testing procedures. Some of these Test Beds will be 1081 developed according to GITB recommendations, while others are Legacy Test Beds that have been 1082 augmented with GITB-compliant Service interfaces. Both types of Test Beds can then be integrated in the 1083 same network by providing access via similar Service interfaces. These Service interfaces can either be 1084 directly accessed by users, e.g. a Service Manager accessing the test services from a public interface (Web 1085 for instance), or they can be accessed by some other Test Beds, e.g. when a Test Suite executing on a Test 1086 Bed needs to delegate some document validation to another specialized Test Bed.

The Testing Capabilities (either provided by local components or by remote services) support the conformance and Interoperability Testing of any eBusiness Specification. For example, the purpose of a Document Validation capability is to validate a given document according to a set of syntactical or semantic restrictions specified in an eBusiness Specification. Such a capability can be implemented as a local, pluggable (and reusable) component, or as a remote service from another Test Bed. Similarly, a Messaging Adapter capability aims to communicate with the SUTs based on specified transport and communication protocols and to provide some level of messaging validation. Such a capability can be provided as a component that has been downloaded from a common repository for reuse and local integration, or could also be provided as a remote service, e.g. from a Test Bed or Test Agent specialized in providing various messaging protocols. Additional Testing Capabilities or services may be added to validate the conformance to a specified message choreography and business rules. For each such capability, a common interface will be defined so that any test service provider can implement a test component or service specific to a certain standard and can plug the Testing Capability into the GITB Test Bed. In GITB phase 3, further capability types other than messaging and document validation may be identified and the architecture may be extended accordingly by the same approach.

This architecture promotes the reusability of Testing Resources and Capabilities among different domains and different standards. As shown in Figure 4-5, a Test Designer developing a Test Case for a certain eBusiness profile or standard may need a test service (e.g. the profile may state that in a certain transaction the communication should be performed via ebXML messaging, so ebXML communication is needed with the

1106 SUT) which may already be developed and published by other Test Designers and test service providers 1107 working for another domain or standard. In this way, the GITB Testing Framework leverages the existing, 1108 distributed test services related to eBusiness testing and allows users to discover them and access them via 1109 the Test Registry and Repository.

1110

1111


1112 Part II: Core Test Bed Implementation Specifications and Proof-of-Concept

1113 GITB Phase 3 complements and refines the GITB Testing Framework and the Test Bed Architecture defined 1114 in the previous phases by

1115  Defining the implementation specifications for GITB service interfaces and selected artifacts to 1116 achieve interoperability between testing facilities developed for different domains, specifications, or 1117 regions.

1118  Designing a reference Test Bed Architecture and underlying Test Description Language based 1119 on GITB principles for stakeholders in different domains for their future testing facilities (domains that 1120 do not have structured conformance and interoperability test frameworks)

1121  Developing the open source Proof-of-Concept Implementation of a Test Bed based on GITB specifications and architecture.

1123 Part II of this report summarizes GITB Phase 3 outcomes related to the Core Test Bed. It is relevant for 1124 testing experts and architects that are interested in the detailed Test Bed Architecture and Specifications.

1125 5 Overview of Core Test Bed Implementation Specifications

1126 5.1 Relevant Core Test Bed Service Specifications and Artifacts

The GITB Service Specifications are a group of specifications for testing facilities, Test Beds, content validation tools, simulators and messaging handlers, aimed at achieving reusability of testing functionalities among them. The following are brief descriptions of each:

 The GITB Content Validation Service Specification defines a service interface that any content validation tool can implement to wrap its functionality and offer it as a content validation service to other stakeholders. In some domains, such services are already used by testing systems to delegate the content validation job to remote services. The Gazelle External Validation Service (EVS) in the eHealth domain and the PEPPOL Document Validation Service in the eProcurement domain are some examples in this respect.

 In conformance and interoperability testing, testbeds need mechanisms to communicate with SUTs based on the protocol specified in the target specification or, in other words, to simulate a specific actor to handle these communications. Communication protocols are shared across domains, and reuse of these simulation facilities among the testing frameworks of different domains will be very useful. The GITB Messaging (Simulator) Service Specification defines a service to achieve this interoperability between testbeds and simulators. For example, an AS4 protocol simulator can be used by different domains in their testing frameworks to establish AS4 communication with SUTs. These domains can then concentrate only on their specific testing requirements based on their extensions or profiling approach over the AS4 protocol.

 In addition to the reusability of more granular testing facilities, accessing testbeds' facilities in a common way will also be very useful for conformance and interoperability testing. The GITB Test Bed Service Specification defines this common service definition to drive a testbed remotely for the execution of a complete conformance or interoperability testing scenario. As specifications referring to other specifications for conformance (profiles, customizations) are becoming more and more common in many domains, a testbed using another testbed's facilities is also becoming a basic requirement. With this specification, it is also possible to implement individual test monitoring interfaces driving multiple testbeds for test scenario executions.

1153 All these services require a common model for a number of test artifacts:

1154  All these services require a common model to report the results of the performed tests so that the 1155 client side can understand the results and render them to its users. The GITB Report Format 1156 Specification defines a model for representing test reports. It is a wrapper format to describe the 1157 brief summary of the results. Based on the validation methodology any report format (ex:


1158 Schematron Validation Report Language for schematron validations, a proprietary format for XML 1159 schema validations) can be used within this model.

1160  In order to realize GITB Test Bed Service, a common model is needed to describe a test scenario 1161 between the testbeds. As different testbeds use different models or languages to represent 1162 executable test scenario descriptions, it is not possible to find a common executable model. In fact, it 1163 is not necessary. The model only needs to define the basics of the execution flow by describing it in 1164 terms of granular test steps with a simple categorization. The GITB Test Presentation Language 1165 Specification provides this model to represent a conformance or interoperability test scenario.

In addition to the service specifications, the GITB reference Test Bed Architecture is designed based on GITB principles in this phase. An important part of this architecture is the GITB Test Description Language (TDL), which defines the high-level executable scripting language for the Test Bed. Stakeholders that need but do not have such conformance and interoperability test frameworks can use this architecture and the TDL as a reference to build one for their specific needs. As the architecture is designed along GITB principles, the resulting testing frameworks will facilitate the reuse of testing capabilities among different stakeholders and domains.

1173

Figure 5-1: GITB Implementation Specifications

5.2 GITB Namespaces and Common Element Definitions

1178 In this section, we describe the common element definitions used by all Test Bed Implementation 1179 Specifications. We recommend using this section as a reference while reading other parts.

1180 Table 5-1: GITB Namespaces

Prefix   XML Namespace                   Comments
gitb:    http://www.gitb.com/core/v1/    The core schema defining the common elements for the other models
vs:      http://www.gitb.com/vs/v1/      GITB Validation Service namespace
ms:      http://www.gitb.com/ms/v1/      GITB Messaging Service namespace
tbs:     http://www.gitb.com/tbs/v1/     GITB Testbed Service namespace
tpl:     http://www.gitb.com/tpl/v1/     GITB Test Presentation Language namespace
tr:      http://www.gitb.com/tr/v1/      GITB Test Reporting Model namespace
tdl:     http://www.gitb.com/tdl/v1/     GITB Test Description Language namespace

The metadata element is a common element describing the metadata of the container element (ex: Testcase, TestModule, TestSuite).
- title – Name of the container. Should be descriptive for users.
- type (0..1) – Only used for test cases and indicates the type of the test case (CONFORMANCE or INTEROPERABILITY).
- description (0..1) – Long description of the container.
- version – Version of the container description.
- authors (0..1) – List of authors who compose the container artifact.
- issued (0..1) – Publication date for the container artifact.
- modified (0..1) – Last modification date for the container artifact.

The actor declaration declares an actor in a Test Case definition that will take part in the test scenario:
- id – The unique identifier for the actor. It is recommended to use the URN format to uniquely identify the actor for the related test bed.
- name – Short name given to the actor (for referencing the actor within the test case definition).
- role – The role of the actor within the test scenario. The value should be taken from the enumeration (SUT, SIMULATED, MONITOR). If the test case aims to test an actor (either conformance or interoperability testing), the SUT role should be given. If the role of a given actor is played by the test engine or some simulator within the test scenario, SIMULATED should be used. The MONITOR value is used for further scenarios representing users that are not involved in the target business process but take part in the testing process to monitor the test execution or to perform manual validations.

The AnyContent class is used to embed content (ex: the message or document content) in a container element related to a messaging, validation or user interaction operation in a generic way, and while transferring data between GITB modules/services. It describes the way to reach the content, the abstract type of the content (in the type system of the Test Bed) and how it is serialized to the receiver module so that it can parse the content accordingly.
- item (1..*) – AnyContent – If the content carries a list of contents, then this element recursively represents the carried content. For simple contents only the value element should be used. For container types (list or map) each item represents the content of the container items.
- value (0..1) – The actual content itself (either as a string or a base64 encoded representation) or the URL to access the actual content.
- name – Name for the content item.
- embeddingMethod (0..1) – This attribute states the method that describes how the content is embedded in the value part. The value should be from the enumeration (BASE64, STRING, URI). BASE64 indicates that the content is embedded as a base64 encoded string within the value. STRING indicates that the xs:string representation of the content is embedded into the value. Finally, URI indicates that a URL is given in the value from which the actual content is accessible over the Internet. The default is BASE64.
- type (0..1) – GITB enables implementers to extend the abstract type system of GITB when implementing GITB compliant services and testbeds. This attribute indicates the type of the content according to the type system of the target GITB compliant testbed or service (ex: DICOM object, EDI content, etc). See the GITB type system.
- encoding (0..1) – If the type is given, this attribute provides the serialization format of the content for the given abstract type (ex: XML serialization, JSON serialization, etc).
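As an illustration, the following sketch shows how a document and a simple string value could be carried together in an AnyContent container when transferring data between GITB modules. It is purely illustrative: the element names follow the abstract descriptions above, the concrete serialization is defined by the gitb: core schema (Section 5.2.1), and the identifiers and URLs are hypothetical.

<!-- Illustrative sketch only; the normative serialization is given by the gitb: core schema -->
<content name="purchaseOrder">
  <item name="document" embeddingMethod="URI">
    <value>http://example.org/artifacts/sample-order.xml</value>
  </item>
  <item name="senderId" embeddingMethod="STRING">
    <value>urn:example:party:supplier-001</value>
  </item>
</content>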


The configuration parameter definition defines a configuration parameter for any GITB module or service.
- name – Name of the parameter.
- use (0..1) – Specifies whether the parameter is required or optional for the operation (R: required, O: optional). Default is “R”.
- kind (0..1) – A configuration value can be a simple string or binary content read from a file, and this attribute indicates the kind of configuration (SIMPLE, BINARY). Default is SIMPLE.
- desc (0..1) – Describes the function of the configuration parameter within the related process.
- value – Default value of the configuration parameter if not provided within the operation.

The typed parameter definition extends the configuration parameter definition to represent a typed value for input and output definitions of modules, constructs, and services defined in GITB.
- type – Identifier for the abstract parameter type (based on the type system of the target GITB compliant Service or Test Bed).
- encoding (0..1) – Identifier for the serialization format for the type (based on the type system of the testbed). The default encoding of the type should be assumed when this attribute is not supplied.

The configuration element is used to provide the value of a configuration parameter for the container construct.
- name – Name of the parameter.
- value – Value of the parameter.

The actor configuration is used to provide the configurations for each SUT in the process between two testing facilities.
◦ actor – Identifier of the actor that the configurations are supplied for.
◦ endpoint (0..1) – Identifier of the endpoint that the configurations are supplied for. If the actor has only one endpoint, there is no need to supply it.
◦ config (1..*) – List of configurations for the given system (playing the given actor).

The actor definition element defines an actor in a testbed and declares the endpoints of the actor and the required configuration parameters for those endpoints. A sketch combining these elements follows.
- name – Unique identifier of the actor (URN) within the testbed.
- desc (0..1) – The textual description of the actor.
- endpoint (1..*) – The list of endpoint definitions.
◦ name – Name of the endpoint (should be unique within the actor definition).
◦ config (0..*) – Configuration parameters for the actor. When an SUT claims conformance to this actor, before the execution of a related test scenario, the configurations stated here should be collected from the SUT administrator.
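As a sketch of how these common elements fit together, the following hypothetical fragment shows an actor definition with one endpoint and the corresponding configuration values supplied for an SUT before test execution. Element and attribute names mirror the abstract descriptions above and are illustrative only; the normative names and structure are given by the gitb: core schema (Section 5.2.1).

<!-- Hypothetical actor definition (illustrative names, not the normative schema) -->
<actor name="urn:example:actor:document-receiver" desc="Receives business documents from the SUT">
  <endpoint name="as4Endpoint">
    <config name="endpointAddress" use="R" kind="SIMPLE" desc="URL where the SUT listens for AS4 messages"/>
    <config name="partyId" use="O" kind="SIMPLE" desc="AS4 party identifier of the SUT" value="urn:example:party:default"/>
  </endpoint>
</actor>

<!-- Hypothetical configuration values collected from the SUT administrator for that actor/endpoint -->
<actorConfiguration actor="urn:example:actor:document-receiver" endpoint="as4Endpoint">
  <config name="endpointAddress" value="https://sut.example.org/as4"/>
  <config name="partyId" value="urn:example:party:sut-001"/>
</actorConfiguration>

5.2.1 XML Schema for Common Elements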


1271 6 Test Presentation Language (TPL)

Every Test Bed or testing tool uses some model to define the test execution flows for the automated processing of test scenarios. Most of them use some type of scripting language (ex: TTCN-3, OASIS TAML) to represent the execution model. Some of them do not have such a concrete representation, but still have some abstract model behind them which is implemented within the software or in some other form. Generally, the details of these models are strongly dependent on the underlying testbed architecture and technology. However, it is observed that these models and approaches show strong similarities when we focus on the actions and their main functionalities. For example, all test execution models have constructs to define a messaging step between an SUT and the simulator, or to define a verification step to validate message content. From the user's (SUT administrator's) point of view, the important point is to be able to understand the basics of the testing process (the test flow, what is realized in each step in general, and what is expected from the user and the SUT). The situation is similar for the scenarios where a Test Bed drives another Test Bed or testing tool to perform a specific set of actions and tests. A common abstract test scenario definition model will help to establish this agreement among the components (testbeds, tools, test monitoring environments) and users (software vendors, SDOs, test developers) of the envisioned global interoperability testing network.

The GITB Test Presentation Language (TPL) will provide the specification for this common model or language to represent a conformance or interoperability test scenario. The resulting language is neither a scripting language, nor is it used for automated test execution. Rather, its purpose is to present the flow and the test steps in a granular way to users and to other testing software that wants to interoperate with the testbed providing the test execution service for the scenario. Any testbed can easily map its internal test execution model or test scripting language to this abstract common model in order to describe its test scenarios to the outside world.

6.1 Abstract Model

1298

Figure 6-1: Test Presentation Language Model

Figure 6-1 illustrates the abstract model of the TPL. The root element, representing the test scenario, is the test case element. Its attributes and elements are as follows:


- id – Defines the unique identifier for the test case. It is recommended to use a URN for the value of this attribute (ex: urn:gitb:ihe:xds-document-source-conformace-test, urn:gitb:peppol:lime-protocol-conformance-test).
- metadata – Describes the metadata attributes (name, description, author, version, etc.) of the test case.
- actors (1..*) – Describes the actors in the business process defined by the test scenario's target specification (ex: Supplier in PEPPOL profiles, Document Consumer in IHE profiles) and the role assignments regarding the testing process.
- preliminary (0..1) – Describes the preliminary requirements that should be shown to the SUT administrators before starting the test execution.
- steps – List of test step descriptions that describes the flow and each test step.

The preliminary element is a container for the preliminary steps in the test case:
- instruct (0..*) – Preliminary instructions for the SUT administrators that describe some requirement regarding the test scenario.
- request (0..*) – Preliminary requests to the SUT administrators related to the test scenario. The SUT administrators are expected to provide the requested information as input to the test case definition; these inputs will be used later in the test execution process. Inputs will be related to the test scenario requirements.

The Sequence element is a container for test steps that will be processed in the given order.
- steps (1..*) – TestStep is an abstract class that describes the granular unit step of a test case. The Sequence class is a list of these test steps which will be executed in linear order. Test steps are categorized in seven categories: VerificationStep, MessagingStep, DecisionStep, FlowStep, LoopStep, ExitStep and UserInteractionStep, and each extends the TestStep definition.

The TestStep is the abstract class that represents a test step in the definition.
- id – The unique identifier for the step within the test case definition. This identifier will be used to bind test step reports to test steps. Since a test execution can include decision steps, concurrent executions and loops, a special identification scheme is recommended for test step identification (see Section 6.2).
- desc (0..1) – Textual description of the test step which can be shown to the user to describe what this test step is doing and what is expected from the user.

The VerificationStep describes the actual validation or verification step (ex: XML schema validation of message/document content, Schematron validation of message/document content, XPath expression validation of a value within the message/document content, or custom validation of non-XML content).

The MessagingStep describes a messaging step between an SUT and a simulator or between two SUTs (in interoperability tests). This step indicates that a message communication is expected between the given actors at that time. The related SUT administrators should behave accordingly (drive the SUT) to initiate the messaging if necessary (some messages are initiated by other messages without any manual intervention by the user). Each MessagingStep represents only one direction of communication; in other words, for request-response type communication two MessagingSteps should exist.
- from – Refers to the actor (Actor.name) which is expected to send the message.
- to – Refers to the actor which is expected to receive the message.

Test execution flows generally include decision points where the execution continues on a certain branch based on a condition. The DecisionStep element defines such supplementary test steps. The desc attribute should describe the condition in textual form in order to enable users to understand the behaviour and test flow.
- then (0..1) – Gives the sequence of test steps that test execution will follow when the condition holds.

- else (0..1) – Gives the sequence of test steps that test execution will follow when the condition does not hold.

The FlowStep element indicates the concurrent execution of the child sequences. The step is completed when all branches are completed.
- thread (1..*) – The child sequences that are executed concurrently.

The LoopStep, extending the Sequence, indicates that child steps will be executed a number of times in a loop based on some condition. As the main aim of the TPL is to show the flow of the scenario and describe it textually to the users, the looping condition should be described textually in the “desc” attribute.

The ExitStep indicates that the test execution will be stopped with this step.

The UserInteractionStep indicates that the testbed will interact with the specified users in this step in order to instruct them or get some input regarding test execution.
- with (0..1) – Refers to the actor that this user interaction is performed with.
- instruct (0..*) – Indicates that an instruction will be shown to the specified user.
- request (0..*) – Indicates that some input is requested from the specified user.

The instruction and request types extend the user interaction definition, and they represent an interaction step that is either an instruction for an SUT administrator (the former) or an input request from an SUT administrator (the latter).
- with – Refers to the actor (Actor.name) to which this instruction will be shown.

6.2 Test Step Identification

The basic requirement for the identification of test steps is that each step should have a unique id within the same test case definition. However, we recommend a methodology for assigning identifiers to test steps in order to make them more readable and understandable. The test case definition in the TPL is in fact an execution flow tree where each node is a test step. Traversal of this tree represents the execution order of the test steps. The identification methodology is in fact an id scheme representing this traversal. The following rules are followed while assigning ids to test steps (an illustrative sketch is given after the list):
- Test steps within the main sequence are assigned successive numbers starting from “1”. Therefore, the first test step in the sequence will be identified as “1”, the second as “2”, and so on.
- If a step is a DecisionStep, the child sequence “then” will be identified as the concatenation of the id of the DecisionStep with “[T]”. For example, if the DecisionStep is the third step in the main sequence, its “then” sequence will be “3[T]”. Similarly, the “else” sequence will be identified with “[F]”.
- A step within a sequence is identified as the concatenation of the id given to the sequence, the dot (“.”) and the successive number given to the step. For example, if we have a sequence identified as 3[T], the first step will be “3[T].1”, and the second will be “3[T].2”.
- If a step is a FlowStep, the child sequences will be identified with the concatenation of the id of the FlowStep with the successive numbers for the child sequences within brackets. For example, if the FlowStep has id “3[T].2”, its first child sequence will be “3[T].2[1]”, and its second child sequence will be “3[T].2[2]”.
- For a LoopStep, as the LoopStep is also a sequence, the rule for child numbering of a sequence is applicable. For example, if the LoopStep is numbered “5”, its first child will be “5.1”.
- The remaining container constructs should be viewed in the same way, and the same numbering scheme should be applied.
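The following fragment sketches how these rules play out on a small test case. It is illustrative only: it uses the abstract class names from Section 6.1 (MessagingStep, DecisionStep, FlowStep, etc.) rather than the concrete element names of the TPL schema, and all identifiers, actor names and descriptions are hypothetical.

<!-- Illustrative sketch using the abstract TPL class names; not the normative TPL schema -->
<TestCase id="urn:example:sample-conformance-test">
  <steps>
    <MessagingStep id="1" from="SUT" to="Simulator" desc="SUT sends the request message"/>
    <VerificationStep id="2" desc="Validate the request against the XML schema"/>
    <DecisionStep id="3" desc="Is a positive response expected?">
      <then> <!-- sequence "3[T]" -->
        <MessagingStep id="3[T].1" from="Simulator" to="SUT" desc="Simulator returns an acceptance"/>
        <FlowStep id="3[T].2">
          <thread> <!-- sequence "3[T].2[1]" -->
            <VerificationStep id="3[T].2[1].1" desc="Schematron validation of the acceptance"/>
          </thread>
          <thread> <!-- sequence "3[T].2[2]" -->
            <VerificationStep id="3[T].2[2].1" desc="Business rule validation of the acceptance"/>
          </thread>
        </FlowStep>
      </then>
      <else> <!-- sequence "3[F]" -->
        <ExitStep id="3[F].1" desc="Stop the test execution"/>
      </else>
    </DecisionStep>
  </steps>
</TestCase>

6.3 XML Schema for TPL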


1409 7 Test Reporting Format

The Test Reporting Format defines a common model for representing the results of a test execution and of its individual test steps, so that the client side can understand the results and render them to its users (see also Section 5.1).

1411 7.1 Abstract Model

1412

Figure 7-1: GITB Test Reporting Model
Figure 7-1 illustrates the abstract model for test reporting. The test case report represents the report for a test case execution and is composed of test step report elements, each representing the report for the corresponding test step defined in the TPL.

The test case report contains the following elements:
- id – The identifier of the test execution instance (test instance id) that this report is related to.
- date – Date and time at which the test case was executed.
- result – The result of the test execution; can take values from the enumeration (SUCCESS, FAILURE, UNDEFINED). SUCCESS indicates that the test case was successfully executed and FAILURE indicates that there are some errors (non-conformant parts). UNDEFINED represents other situations where the test case was not executed completely, or where some errors exist (internal system errors) that are not related to conformity or the testing process.
- counters (0..1) – Provides the number of assertions, errors and warnings based on the tests done within the test case.
- reports (0..*) – The list of test step reports for each step executed within the test case execution.

The test step report is the base class for representing any test step report.
- id – Identifier of the test step that this report is related to.
- date – Date and time at which this test step was executed.
- result – The result of the test step execution; can take values from the enumeration (SUCCESS, FAILURE, UNDEFINED). SUCCESS indicates that the step was successfully executed and FAILURE indicates that there are some errors (non-conformant parts) related to the step. UNDEFINED represents other situations where the step was not executed completely, or where some errors exist (internal system errors) that are not related to conformity or the testing process.


The DecisionOrLoopReport element represents reports for decision steps and loop steps that change the execution flow based on some condition.
- decision – Provides the resulting Boolean value for the condition. This value indicates how the test flow will continue in the next steps.

The SimpleTestStepReport represents the reports for all other step types. It is basically a plain implementation of the abstract test step report class.

The TestAssertionReport is used to represent reports for messaging and verification steps.
- name – Descriptive name for the test assertion group (ex: XML Schema Validation, Business Rule Validations, etc).
- overview (0..1) – Provides information about the validation tool/service used for the validation process and the target specification for conformance checks.
- counters (0..1) – Provides the number of assertions, errors and warnings based on the tests done within this assertion group.
- context (0..1) – For verification steps, this element provides the content that the validation is performed on, for example the XML message on which Schematron validation is performed. For messaging steps, it similarly provides the related part of the message.
- reports (0..*) – Different testbed infrastructures and validation procedures may have different methodologies for grouping validations/assertions. A validation step may correspond to a simple assertion (ex: XPath expression validation) or to a set of validation procedures performing a complete conformance test of a message content (ex: XML schema validation + Schematron validation). In order to provide testbeds with flexibility in the grouping of assertions, the TestAssertionGroup element is designed to include further groups recursively.

The TestAssertionGroup is used to represent the reports for a set of test assertions or test assertion groups recursively. The element either includes assertion group reports recursively (reports element) or reports for each assertion (info, warning or error elements) in the group.
- reports (1..*) – If the reports are organized into a further grouping, each of these elements provides the reports for a child group.
- [info] OR [warning] OR [error] – Represents the leaf reports for the smallest unit of the validation process (ex: the report from a Schematron assertion). The assertion report is an abstract class, and one of its realizations – the Info, Error or Warning elements – will be used for reports. Info represents assertions that are successful. Error represents assertions with an erroneous result, and Warning represents successful results but with some warnings. The assertion report abstract class provides a wrapper for different report formats. Testbeds can extend it to define their own report formats for different validation procedures. One implementation of it, the BasicAssertionReport, is given in this section as an example assertion report format for basic validation procedures.

The overview element provides some further information regarding the validation procedure and the target specification.
- profileId (0..1) – An identifier for the target specification that the validation step is related to. SDOs generally assign identifiers to their specifications or to parts of their specifications (scenarios, use cases), and these can be used for this attribute.
- customizationId (0..1) – If the target specification is customized for a specific region/country or some specific purpose, an identifier for this customization can be given in this attribute.
- transactionId (0..1) – An identifier for the transaction/message or document type that the validation is performed on.
- validationServiceName (0..1) – Name of the validation service or tool that performs the validation.
- validationServiceVersion (0..1) – Version of the validation service or tool that performs the validation.
- note (0..1) – Any textual note regarding the validation.


The counters element is the container for the validation statistics.
- nrOfAssertions (0..1) – Total number of assertions evaluated in this assertion group.
- nrOfErrors (0..1) – Total number of errors from those assertions within the assertion group.
- nrOfWarnings (0..1) – Total number of warnings from those assertions within the assertion group.

The BasicAssertionReport provides a default realization that can be used for many of the existing validation methodologies.
- assertionId (0..1) – Optional attribute giving an identifier to the assertion. It can be used to bind the assertion to a constraint, rule, or similar concept defined within the specification.
- description – Textual description of the assertion result.
- location (0..1) – An expression that indicates the location of the error or warning within the content. For example, an XPath expression can be used for Schematron reports to indicate the error location.
- test (0..1) – This attribute gives the expression itself that performs the validation, if it is possible to give such an expression.
- type (0..1) – This attribute describes the type of assertion (ex: cardinality check, usage control) if the testbed provides such a categorization.
- value (0..1) – Some assertions check the value of an element or attribute within a message/document content (ex: check whether it is equal to some value, or in some specific format). This attribute gives the actual value within the content to enable the user to better understand the assertion semantics and the error.
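To make the model concrete, the following fragment sketches a report for a single verification step. It is illustrative only: the element names follow the abstract classes described above (test step report, TestAssertionReport, TestAssertionGroup, BasicAssertionReport) rather than the concrete names of the tr: schema, and all identifiers, dates and messages are hypothetical.

<!-- Illustrative sketch of a verification step report; not the normative tr: schema -->
<TestAssertionReport id="2" date="2015-06-01T10:15:30Z" result="FAILURE" name="Business Rule Validations">
  <overview validationServiceName="ExampleSchematronValidator" validationServiceVersion="1.0"
            profileId="urn:example:profile:invoice" note="Schematron validation of the invoice document"/>
  <counters nrOfAssertions="42" nrOfErrors="1" nrOfWarnings="1"/>
  <reports> <!-- a TestAssertionGroup with one error and one warning -->
    <error> <!-- BasicAssertionReport carried by an error element -->
      <assertionId>BR-05</assertionId>
      <description>The invoice total does not equal the sum of the line amounts.</description>
      <location>/Invoice/LegalMonetaryTotal/PayableAmount</location>
    </error>
    <warning>
      <assertionId>BR-17</assertionId>
      <description>A payment due date is recommended but not present.</description>
    </warning>
  </reports>
</TestAssertionReport>

7.1.1 XML Schema for Test Reporting Format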


1520 8 GITB Test Service Specifications

1521 The aim of the GITB Test Service specifications is to provide common service specifications for the existing 1522 conformance and interoperability testing facilities so that they can be used remotely by others, either from 1523 within the same domain or from a different domain, for their own testing requirements. Based on GITB Phase 1524 I and Phase II, three main services are identified as modular services that can be used between different 1525 testing setups regarding the execution of conformance and interoperability tests. In this section, we will 1526 describe these services by providing abstract service specification and Web Service binding (WSDL 1527 description) for each of the services:

1528  Content Validation Service 1529  Messaging (Simulation) Service 1530  TestBed Service

1531 1532 8.1 Content Validation Service

1533 8.1.1 Service Overview

Figure 8-1 illustrates the remote content validation scenario between the ValidationService and ValidationClient actors. The content validation tools that want to implement the Content Validation Service interface should play the ValidationService role in the scenario. The ValidationClient role can be played by i) testbeds that want to use the remote testing capability for specific content validations, and ii) any other systems or organizations (vendors, etc) that want to use the system for their internal testing procedures.
1. vs:getModuleDefinition: The first step in the interaction is to retrieve the definition of the validation module. The module definition provides the details regarding the validation operations that the module supports and the inputs that the module takes for each supported validation operation.
2. vs:validate: This is the actual validation operation. The ValidationClient should prepare the inputs based on the module definition and supply them properly. Any content validation tool can be wrapped as a Content Validation Service with this operation. All the validation operations return a test report providing the overall result and a description of the performed assertions, found errors and warnings.

1547 1548 Figure 8-1: Sequence Diagram for Content Validation Service 1549 1550 8.1.2 Abstract Service Description

1551 8.1.2.1 ValidationClient Requests Module Definition

The module definition request does not take any parameters and is used to request the service's description object. In response, the service should return the validation module definition, whose model is described below:
 id – A unique identifier for the validation service itself (can be used by clients to distinguish different validation services).
 uri – The URL of the service endpoint.


 operation (0..1) – Operation supported by the service. Should take a value from the enumeration (VC: validateByContentType, VS: validateBySchema, V: validate). The default is “V”.
 metadata – Metadata regarding the service (name, description, authors, version, etc.).
 config (0..*) – Configuration parameters for the module to change its behaviour in the validation process.
 input (0..*) – Describes the input parameters for the validation operation, if it is supported.
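To make the abstract model concrete, the following non-normative sketch shows what a returned module definition for a Schematron-based validator might look like; the element names, namespace prefixes (vs, gitb) and parameter attributes are assumptions used only for illustration, the normative form being given by the WSDL and XML schema referenced below.

   <vs:GetModuleDefinitionResponse>
     <vs:module id="urn:example:ubl-invoice-validator"
                uri="http://validator.example.org/services/validation"
                operation="V">
       <vs:metadata>
         <gitb:name>UBL Invoice Validator</gitb:name>
         <gitb:version>1.0</gitb:version>
         <gitb:description>Schematron validation of UBL invoices</gitb:description>
       </vs:metadata>
       <vs:inputs>
         <gitb:param name="document" use="R" desc="The invoice instance to validate"/>
         <gitb:param name="schematron" use="O" desc="Additional Schematron rules (optional)"/>
       </vs:inputs>
     </vs:module>
   </vs:GetModuleDefinitionResponse>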

8.1.2.2 Validation
The ValidationClient should send the validation request message to the service with the following details:
 sessionId – An identifier for the session between the client and the service.
 config (0..*) – Supplied configuration parameter values for the validation process.
 input (1..*) – The supplied input parameters for the validation. The parameters should be supplied in the same order as they are defined in the module definition and should match them in type and encoding.
In response, the validation response containing the test report should be returned.

8.1.3 Web Service Description (WSDL)


1579 8.1.4 XML Schema for Request/Response Messages


1581 8.2 Messaging (Simulation) Service

1582 8.2.1 Service Overview

Figure 8-2 illustrates a scenario where a client (a test bed) drives a Messaging Service to handle the message communication in a test scenario for a specific communication protocol.
1. ms:getModuleDefinition: The client can use this operation to get the module definition, which is in fact the definition of the details related to the inputs and outputs of the actual messaging commands.
2. ms:initiate: Before starting to use the messaging service for messaging simulations, a session should be established between the client and the service to prepare the service for the operations. The client provides the required configurations (ex: network configurations) of the SUTs that will communicate with the simulator. The service should perform all the initializations and preparations for the messaging operations in this phase. In response, a session id and the related configurations of the simulator (ex: network configurations, etc.) should be returned.
3. ms:beginTransaction: Different domains require different communication protocols, which differ in their message exchange patterns. Most protocols (ex: Web Service communication) are based on a request-response pattern over a single network connection. However, there are more complex patterns (ex: DICOM communication) that require multiple message exchanges, not necessarily in request-response pairs, over a single network connection. In order to be generic while handling the connections and message exchanges, separate commands are designed to notify the service accordingly. This command notifies the service that a new communication with a SUT will start with the next messaging command and makes the service ready for it.


Figure 8-2: Messaging Service Scenario
Scenario I: We describe the messaging commands through different scenarios to better explain the expected behaviours. In the first scenario, assume that according to our test case we need to send a message to the SUT, which will then return a response to it.
4. ms:send: The “send” command is used to drive the service to send a message to a SUT. The necessary configurations and message contents should be supplied within the command as described in the module definition. The service should prepare the whole message and send it to the SUT. When the message is sent, the client is notified. According to our assumption regarding the protocol, the SUT immediately returns the response. The Messaging Service is expected to notify the client with this response and its validation report.
5. ms:endTransaction: This command notifies the service that the transaction is completed. The service can release the resources (connections, etc.) related to the transaction.
6. ms:finalize: Finalizes the session between the messaging service and the client.
Scenario II: This time the SUT is expected to initiate the communication by sending a message to our simulator service.
7. ms:NotifyForMessage: The service should be ready to receive the message from the SUT at any time after the beginTransaction command. When the message is received, the service notifies the client with the received message and its report. The client then uses the “send” command to send the response to this message. The service sends the given message to the SUT and returns the report. The GITB Messaging Service does not differentiate between acknowledgements and application-level responses. It is the responsibility of test designers and module implementers to design the service and arrange the commands according to the scenario and setup. For example, a reference implementation or messaging software used as a Messaging Service may return acknowledgements automatically; in that case, the test designer does not need to add another send command for the acknowledgement.

1629 8.2.2 Abstract Service Description

1630 8.2.2.1 Requesting Module Definition (GetModuleDefinition)

The client uses this operation to retrieve the module definition from the service to understand the input and output parameters of the messaging commands. The request does not take any parameters and is used to request the service's description object. In response, the service should return the messaging module definition element.

8.2.2.2 Initiating the Session (Initiate)

The client uses this operation to establish a session with the Messaging Service and provides the configuration parameters of the SUTs that will be involved in the messaging. The details of the initiate request are as follows:
 actorConfiguration (1..*) – Configurations for each SUT that will communicate with this simulator in the process
  ◦ name – An identifier for the actor that the configurations are supplied for.
  ◦ config (1..*) – List of configurations for the given system (playing the given actor)
In response, the service should return the initiate response:
 sessionId – A unique identifier for the session.
 actorConfiguration (1..*) – List of configurations for the simulator.

8.2.2.3 Initiating a Transaction (BeginTransaction)

This command is used to notify the service that communication is expected between the SUT and itself after this point in time. The details of the begin-transaction request are as follows:
 sessionId – The session identifier related to this transaction
 config (0..*) – Further configurations related to the transaction
 from (0..1) – The name of the actor (referring to the actor name given in the actorConfiguration of the Initiate operation) that will initiate the transaction
 to (0..1) – The name of the actor that will be on the other side

8.2.2.4 Commanding Messaging Service to Send a Message (Send)

This command is used to make the service send the given message to a SUT. The details of the send request are as follows:
 sessionId – The session identifier related to this operation
 to – The name of the actor that the message will be sent to
 input (1..*) – The inputs (message parts) supplied to the service. The service will use these inputs to construct the actual message.
In response, the SendResponse should be returned:
 report – The validation report generated for the operation.

8.2.2.5 Notification of the Client for Received or Proxied Messages (NotifyForMessage callback)

When the service receives a message from the SUT as expected according to the scenario, it should use the callback and send the notification to the client:
 sessionId – The session identifier related to the message
 from (0..1) – The name of the actor that the message is received from
 to (0..1) – The name of the actor that the message is sent to (only used for listen/proxy operations)
 report – The received message parts and the validation report generated for the operation.
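To illustrate the interplay of these commands, the sketch below shows a send request issued by the client and a possible NotifyForMessage callback payload returned by the service once the SUT answers. All element names, the ms prefix, the parameter name business_message and the result attribute are assumptions for illustration only; the normative definitions are given by the WSDL and the module definition.

   <ms:SendRequest>
     <ms:sessionId>session-0001</ms:sessionId>
     <ms:to>Supplier</ms:to>
     <ms:input name="business_message">
       <!-- message payload prepared by the client -->
     </ms:input>
   </ms:SendRequest>

   <ms:NotifyForMessageRequest>
     <ms:sessionId>session-0001</ms:sessionId>
     <ms:from>Supplier</ms:from>
     <ms:report result="SUCCESS">
       <!-- received message parts and their validation report -->
     </ms:report>
   </ms:NotifyForMessageRequest>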

1676 1677 8.2.2.6 Closing the Transaction (EndTransaction)

When the communication between two actors is completed according to the scenario, the client can use this operation to notify the service so that it can release the related resources. The end-transaction request is as follows:
 sessionId – The session identifier related to this transaction

8.2.2.7 Closing the Session (Finalize)

When the messaging is finished, this command can be used to finalize the session. The finalize request is used in this operation:
 sessionId – The session identifier

1688 8.2.3 Web Service Description (WSDL)


1690 8.2.4 XML Schema for Request/Response Messages


1691 1692 8.3 Test Bed Service

1693 8.3.1 Service Overview

Figure 8-3 illustrates a scenario where a client system uses the remote test bed to execute a test scenario while providing a monitoring capability for its user (the SUT administrator). The TestbedClient and TestbedService roles represent the client and service sides respectively.
1. tbs:getTestCaseDefinition: The scenario starts with the user selecting the test case to execute, which is out of scope for the service specification. The TestbedClient then calls this operation to retrieve the test case definition, which is presented in the TPL format (as described in Section 6). The TestbedClient should render the description and present it to the user in some way.
2. tbs:getActorDefinition: In the test case definition, only the identifier and role of each actor in the test scenario are returned. The TestbedClient needs to know the required configuration parameters for the actors in order to obtain the configurations from the SUT administrators. This operation is used to retrieve these actor definitions.
3. tbs:initiate: The TestbedClient has to use the initiate operation, supplying the test case identifier, to initiate the test execution. In response, the TestbedService should return a unique identifier for the test case execution session.
4. tbs:configure: After this phase, the TestbedService expects the TestbedClient to send the configurations related to the SUT (or to all SUTs in the case of interoperability tests). In response, the TestbedService also compiles the configurations for the simulated actors and returns them. The TestbedClient will show all these configurations to the user so that the SUT can be configured accordingly (ex: providing the network parameters of the corresponding actor).
5. tbs:initiatePreliminary: After the configuration phase, if the test case description has a preliminary phase, the TestbedClient should use initiatePreliminary to start it. In response, the TestbedService returns all instructions and input requests for the user. The TestbedClient will show these instructions and requests to the user.
6. tbs:provideInput: When the user supplies the requested information, the TestbedClient uses this operation to send these inputs to the TestbedService.

7. tbs:start: When the preliminary phase ends after all the inputs have been collected, the user can start the testing phase at any time. When this happens, the TestbedClient uses the “start” command to initiate the execution. While processing the test steps defined in the test case definition, the TestbedService calls the tbs:updateStatus callback to notify the TestbedClient about the latest status of the execution. If a user interaction step exists within the flow, the tbs:interactWithUsers callback is called to initiate the interaction. The TestbedClient will use the tbs:provideInput operation to supply the inputs if the interaction includes input requests. After their completion, execution continues from the next step. When execution has finished, the overall report for the test case is sent to the TestbedClient.
8. tbs:stop: This operation is not shown in the figure, but it can be used to stop the execution at any time.
9. tbs:restart: This is also not shown in the figure. When the execution has stopped (either finished normally, stopped by the user, or by an exit step), this command can be used to restart the execution with the same configurations and preliminary requirements.

1733 1734 Figure 8-3: Testbed Service Scenario 1735 1736 8.3.2 Abstract Service Description

1737 8.3.2.1 Requesting Test Case Definition (GetTestcaseDefinition)

This operation is used by the TestbedClient to retrieve the test case definition in TPL format. The request sent for the operation contains:
 tcId – The identifier for the test case

The response contains:
 testcase – Definition of the test case

8.3.2.2 Initiating Test Process (Initiate)

This operation is used to initiate the execution of a test scenario. The TestbedService is expected to generate a unique identifier for the execution. The request sent for the operation contains:
 tcId – The identifier for the test case. Used if the execution is not initiated yet.
The response contains:
 tcInstanceId – The identifier for the execution session

8.3.2.3 Requesting Actor Definition (GetActorDefinition)

This operation is used to get the full definition of actors, together with the required configuration parameters for the SUTs that want to play the actor. The request sent for the operation contains:
 tcId – The identifier for the test case
 actorId – The identifier for the actor

The response contains:
 actor – Definition of the actor

8.3.2.4 Configure Test Execution (Configure)

This operation is used to supply the configurations of the SUTs that will participate in the testing process. As each SUT plays an actor defined in the test case definition, the configurations are mapped to the actor name. The request contains:
 tcInstanceId – The identifier for the execution session
 actorConfiguration (1..*) – Configurations for each SUT in the process

The response contains:
 actorConfiguration (1..*) – Configurations for the simulated actors
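For illustration, a configuration request for a single SUT playing the Supplier actor might look as follows; the element names, the tbs prefix and the configuration parameter names (network.host, network.port) are assumptions, shown only to make the actor-to-configuration mapping concrete.

   <tbs:ConfigureRequest>
     <tbs:tcInstanceId>exec-0001</tbs:tcInstanceId>
     <tbs:actorConfiguration name="Supplier">
       <gitb:config name="network.host">192.0.2.10</gitb:config>
       <gitb:config name="network.port">8080</gitb:config>
     </tbs:actorConfiguration>
   </tbs:ConfigureRequest>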

8.3.2.5 Initiate Preliminary Phase (InitiatePreliminary)
This operation is used to initiate the preliminary phase once the TestbedClient is ready after the configurations are done. The TestbedService should execute the preliminary phase and return all the resulting instructions and input requests. The request sent for the operation contains:
 tcInstanceId – The identifier for the execution session

The response contains:
 preliminary – Instructions and input requests for the SUT administrators

8.3.2.6 Providing User Input for Execution (ProvideInput)

This operation is used in both the preliminary phase and the execution phase to supply the inputs requested from the SUT administrators to the TestbedService. The response is just an acknowledgement of the operation. The request contains:
 tcInstanceId – The identifier for the execution session
 input (1..*) – Inputs supplied by the users
 stepId – The id of the step associated with the request

1789 8.3.2.7 Starting the Execution Phase (Start)

When the preliminary phase is completed by providing all requested inputs, the TestbedClient can start the execution at any time. The TestbedService starts to execute the test steps as defined in the test case description and sends status updates by calling the tbs:updateStatus callback. The response to this request is just an acknowledgement. The request sent for the operation contains:
 tcInstanceId – The identifier for the execution session

8.3.2.8 Status Updates for Testcase Execution (UpdateStatus callback)

This is the callback used to notify the TestbedClient about the execution of each test step defined in the test case definition. The TestbedService should send a notification for each test step once when it starts processing it and once when it has completed processing. For completed steps, it also provides the report for the test step. The callback message contains:
 tcInstanceId – The identifier for the execution session
 stepId – The identifier of the test step (the identifier of the corresponding step in the test case definition)
 status – Status of the processing for that step. Values can be:
  ◦ “PROCESSING”: Used when the testbed starts processing the step.
  ◦ “SKIPPED”: Used when a step is skipped (one branch of a decision step).
  ◦ “WAITING”: Used for messaging or interaction steps when some input is expected either from SUTs or SUT administrators (replaces PROCESSING for such steps).
  ◦ “COMPLETED”: Used when processing is completed for the step.
 report (0..1) – When a step is completed, this element is used to provide the report.
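A status update for a completed step could, for instance, be conveyed as follows. The element names are assumptions for illustration, while the status value is one of those listed above and the embedded counters follow the Test Reporting Format of Section 7.

   <tbs:UpdateStatusRequest>
     <tbs:tcInstanceId>exec-0001</tbs:tcInstanceId>
     <tbs:stepId>3</tbs:stepId>
     <tbs:status>COMPLETED</tbs:status>
     <tbs:report result="SUCCESS">
       <tr:counters>
         <tr:nrOfAssertions>4</tr:nrOfAssertions>
         <tr:nrOfErrors>0</tr:nrOfErrors>
         <tr:nrOfWarnings>0</tr:nrOfWarnings>
       </tr:counters>
     </tbs:report>
   </tbs:UpdateStatusRequest>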

8.3.2.9 User Interaction During Execution (InteractWithUsers callback)
This is the callback used to notify the TestbedClient that interaction with certain users (SUT administrators) is required at this step, either to show them some instructions or to request some input from them. As in the InitiatePreliminaryResponse, the TestbedService supplies all the instructions and input requests in this callback. The TestbedClient should interact with the given users, collect the input and call the ProvideInput operation to supply the inputs back to the TestbedService. The callback message contains:
 tcInstanceId – The identifier for the execution session
 interaction – Includes instructions and input requests regarding the expected user interaction

8.3.2.10 Stopping the Execution (Stop)

This operation can be used at any time to stop the test execution. The request contains:
 tcInstanceId – The identifier for the execution session

8.3.2.11 Restarting the Execution Phase (Restart)

If the test execution has completed normally or was stopped for some reason during the execution, this operation can be used to restart the execution phase. In this way, the configuration and preliminary phases do not have to be repeated. However, the TestbedService should initiate a new execution session. The request contains:
 tcInstanceId – The identifier for the execution session

The response contains:
 tcInstanceId – The identifier for the new execution session


1837 1838 8.3.3 Web Service Description (WSDL)


1840 8.3.4 XML Schema for Request/Response Messages


1843 9 GITB Test Description Language (TDL)

1844 9.1 GITB Test Bed Concepts and Interfaces

Before presenting the GITB Test Description Language (TDL), we need to describe the main concepts and assumptions on which it is based in order to set up a global interoperability test bed.

9.1.1 Basic Concepts

1849 The TDL defines the model and the format to describe a conformance or interoperability test scenario in a 1850 way that, when executed, the Test Bed realizes the business process as defined in the target eBusiness 1851 specification between the SUTs and the simulated actors and performs the intended testing procedures. The 1852 definition of such a scenario with TDL is called Test Case definition. In order to check conformance or 1853 interoperability of a system for a target specification, in general, multiple test scenarios may be needed to 1854 test different aspects of the system in alternative scenarios. Therefore, a logical grouping among the Test 1855 Case definitions is required. The concept of Test Suite represents such grouping for a specific objective of 1856 the test designer. Furthermore, a Test Suite includes some common definitions for all included Test Cases. 1857 With these definitions, the testing process can be defined as a business process between the testbed, SUTs, 1858 and SUT administrators which is managed by the testbed itself by getting the Test Case definition and its 1859 attached Test Suite definition as input. 1860 The GITB testing process and model is based on the actor concept. All eBusiness specifications define 1861 some type of actor (party) in their business choreography. These abstract definitions represent systems and 1862 implementations in the real world and software implementers use these concepts to claim conformance to 1863 the target specification. The actors that will participate in the testing process are defined in a Test Case 1864 definition. Furthermore, their roles in the testing process are specified. The actors that are tested should be 1865 indicated as SUT and actors that are simulated by the testbed should be indicated as simulated. Systems 1866 claiming conformance to the corresponding actor, indicated as SUT in the test case definition, can initiate (or 1867 join for interoperability test scenarios) the testing process by playing the role. 1868 1869 9.1.2 Type System and Expressions

Test Beds deal with messages, documents and intermediate results computed from them during test execution. A requirement for all Test Beds is to have mechanisms to temporarily store intermediate results, pass them to other modules as input, navigate the content to reach more granular values or parts, and compute further results from them. As in any programming or scripting language, these needs are addressed by defining a type system and an expression language working on that type system.
Although many of the current eBusiness specifications use XML as the content format, some domains or specifications have different data models and message or document formats (e.g. DICOM for digital images in the medical domain, JSON for many lightweight specifications, and EDI). In order to provide a generic testing platform and support all of them, we need an extensible type system for GITB. Furthermore, it should not be overly complex, in order to keep the test definition process simple.


1881 Figure 9-1: GITB Type System


Figure 9-1 illustrates the GITB type system. The GITB Data Type represents the root abstract type from which all other types are inherited. The Primitive Types represent simple values. The two Container Types are used to hold multiple values. The List Type represents a list of values of a specific GITB Data Type (internally homogeneous). The Map Type holds (key, value) pairs where the value can be any GITB Data Type (not necessarily homogeneous within the map). The Object Type is the root type for complex structures. It is the extension point for registering new types to the Test Bed to support different requirements within a specific domain or specific to a standard. In GITB, the type system concept is rather abstract. For example, we can register a type to represent EDI content (or XML, DICOM, etc.) in order to handle the related operations in our test scenarios. A pluggable type handling mechanism is assumed for the GITB Test Bed, where a new type handler is plugged in for each newly registered type. In other words, a complex type is viewed as a black box system (type handler) implementing the following operations defined in the abstract GITB Data Type class.
 deserialize – A type handler should have the capability to construct an instance of the target type from a byte stream. An abstract type may have different serialization formats. For example, DICOM originally has a non-XML binary data format used in the actual processes; however, a special XML serialization and a string serialization have been defined by some tools for testing and monitoring purposes, as the original format is not human readable. In order to benefit from these formats in our testing scenarios (for example, to supply a message template to a messaging step for sending messages to SUTs), the type registered for DICOM can support these serialization formats. The encoding parameter represents the format in which the byte stream is encoded (in our example XML, the original DICOM encoding, or the special string serialization). This operation is used to construct the instance from a message received from a SUT, or from a file (sample message to send).
 serialize – Similar to deserialization, a type handler should have the capability to serialize an instance of the type to a byte stream in one of the supported serialization formats. The encoding parameter indicates the format for serialization.
 processExpression – In addition to the type system, an expression language is required to navigate the content to reach granular content parts, elements, attributes, etc. As XML is the most frequently used format for eBusiness specifications, an XPath 2.0 based expression language is selected as the default for GITB. Furthermore, many non-XML content specifications already have XML serializations (DOM representations) and support XPath for their models. A type handler should implement a mechanism to evaluate a given XPath expression on its content. This mechanism can simply apply XPath to the special XML serialization of the content.

9.1.3 Modularity for Specific Functionalities

As described in the GITB Testing Framework (CWA 16408:2012), the main testing functionalities like messaging and validation can be handled by pluggable modules within a modular architecture. The GITB Messaging Service and GITB Validation Service are services that enable this modularity remotely between different testing facilities. In order to handle this modularity within a single Test Bed architecture, the GITB POC Test Bed is also designed to be modular in terms of its messaging and validation capabilities. The modules that handle the communication with SUTs within the business process are called Messaging Adapters (for example, an adapter to handle AS4 messaging). On the other side, the modules that perform specific validation procedures are called Validation Adapters (for example, an adapter for Schematron validations). These modules can be internal pluggable modules only available within the Test Bed itself, or they can be implemented as a GITB Validation Service or GITB Messaging Service to open the functionality to the outside world as a reusable remote service.
When plugging these modules into the Test Bed, each module should provide a definition describing its configuration, input and output parameters. The abstract module definition provides the following details:
 id – A unique identifier for the module itself within the Test Bed (Test Case definitions use this identifier to refer to the module).
 uri – The path (address) used to access the module. It will be a URL if the module is a service.
 metadata – Metadata regarding the module (name, description, authors, version, etc).
 inputs (0..*) – Describes the input parameters for the module.
 outputs (0..*) – Describes the outputs of the module (not used for validation adapters).


 isRemote – Indicates if this module is a remote service, such as a GITB compliant Validation or Messaging Service.

The validation module definition extends this abstract definition for validation adapters with the following elements:
 configs (0..*) – Configuration parameters for the module to change its behaviour in the procedure (validation process or messaging process)

The messaging module definition extends it for messaging adapters with the following elements:
 actorConfigs – Generic configuration parameters for the systems that will communicate.
 transactionConfigs – Configuration parameters specific to a transaction.
 listenConfigs – Configuration parameters specific to listen operations.
 receiveConfigs – Configuration parameters specific to receive operations.
 sendConfigs – Configuration parameters specific to send operations.

While writing the related parts of a test scenario, test designers can use these definitions like an API to supply the required configuration and input parameters to a module, or to bind its outputs to internal variables within TDL. Furthermore, the Test Bed itself uses these definitions to behave accordingly when communicating with these modules.
Figure 9-2 illustrates the abstract model of the GITB TDL.

1961 1962 Figure 9-2: GITB Test Description Language Model 1963 9.2 Test Suite Definition

The TestSuite element represents a package (logical grouping) of executable test scenarios used to check the adherence of implementations to one or more normative statements in a specification. The methodology for forming these logical groups is not within the scope of TDL; test designers may choose different strategies in this respect. For example, a Test Suite may represent the package of all conformance test scenarios checking conformance against a whole specification (for example, a conformance test suite for the IHE XDS Profile, or a conformance test suite for PEPPOL Busdox). Or, it may represent a package of conformance test scenarios for a specific actor/party defined in a specification (for example, a conformance Test Suite for the IHE XDS Document Registry, or a conformance Test Suite for the CENBII Tender Notification Customer Role). Or it may correspond to a more granular set (such as a specific business scenario, etc).

The Test Suite definition provides some basic information required for the execution of its child Test Case definitions:
 metadata – Metadata related to the Test Suite definition (name, description, author, creation time, etc).
 actor (1..*) – Definition of the actors that take part in the business processes of the target specification which are related to the test scenarios in this Test Suite.
 testcases (1..*) – List of Test Cases in this Test Suite.

Each test case entry indicates the identifier of the Test Case and its prerequisite Test Cases:
 id – The identifier for the test case
 prequisite (0..*) – Identifiers of prerequisite test cases for this test case
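A minimal Test Suite definition could therefore be sketched as follows; the element and attribute names follow the descriptions above, but the exact serialization (namespaces, nesting, whether prequisite is an element or an attribute) is an assumption, the normative form being given by the TDL schema in Section 9.4.

   <testsuite id="urn:example:peppol-invoice-conformance-suite">
     <metadata>
       <gitb:name>PEPPOL Invoice Conformance Suite</gitb:name>
       <gitb:version>1.0</gitb:version>
     </metadata>
     <actor id="supplier" name="Supplier"/>
     <actor id="customer" name="Customer"/>
     <testcase id="urn:example:send-basic-invoice"/>
     <testcase id="urn:example:send-credit-note">
       <prequisite>urn:example:send-basic-invoice</prequisite>
     </testcase>
   </testsuite>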

9.3 Test Case Definition
The TestCase element represents an executable conformance or interoperability test scenario that evaluates the adherence of implementations to one or more normative statements in a specification. The following attributes and elements make up the Test Case definition:
 id – Defines the unique identifier for the Test Case. It is recommended to use a URN for the value of this attribute (ex: urn:gitb:ihe:xds-document-source-conformance-test, urn:gitb:peppol:lime-protocol-conformance-test).
 metadata – Describes the metadata attributes (name, description, author, version, etc) of the test case.
 namespace (0..*) – The list of namespace declarations and their prefix bindings that will be used in the expressions of the test case.
 import (0..*) – The list of import statements declaring the external test modules or test artifacts required for the execution.
 actor (1..*) – Describes the actors in the business process defined by the target specification of the test scenario and the role assignments regarding the testing process (ex: Supplier in PEPPOL profiles, Document Consumer in IHE profiles).
 variable (0..*) – The global variable definitions for the Test Case execution. Variables are used to temporarily store message/document parts or specific values during the execution.
 preliminary (0..1) – Container for describing the preliminary requirements of the Test Case that should be shown to the SUT administrators before starting the test execution.
 steps – The root container for the definition of test steps and their flows.
 scriplet (0..*) – A subsequence of test steps, or in other words a sub test flow, which can be used within the test case definition more than once. Similar to the concept of a function definition in a programming language.
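The overall shape of a Test Case definition is sketched below; the content of the steps section is elaborated in the following subsections. The element names are taken from the list above, while the serialization details (attribute vs. element placement, prefixes) remain assumptions until the schema in Section 9.4.

   <testcase id="urn:example:send-basic-invoice">
     <metadata>
       <gitb:name>Send basic invoice</gitb:name>
       <gitb:version>1.0</gitb:version>
     </metadata>
     <namespace prefix="inv">urn:oasis:names:specification:ubl:schema:xsd:Invoice-2</namespace>
     <import name="invoice_schematron" type="gitb-types:DOM" encoding="XML"
             uri="http://artifacts.example.org/peppol/invoice.sch"/>
     <actor id="supplier" name="Supplier" role="SUT"/>
     <actor id="customer" name="Customer" role="SIMULATED"/>
     <variable name="invoiceId" type="string"/>
     <steps>
       <!-- messaging, validation and flow steps as described in the following subsections -->
     </steps>
   </testcase>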

9.3.1 Namespace Declarations
The GITB TDL has an abstract expression language concept for processing/selecting elements from message/document contents, or computing further values from such content. Although a default expression format (XPath based) is proposed in this document for the POC Testbed implementation, any expression language designed for these purposes can be used as a TDL expression. Namespaces are important in expressions when referring to element or attribute names in a document or message model. The namespace element is used to declare a namespace and the prefix bound to it within the execution scope. These prefixes can then be used in expressions to refer to the elements defined in the corresponding namespace.
 prefix – The prefix binding for the namespace.
 value – The string representing the namespace.

9.3.2 Importing External Test Modules and Artifacts

The GITB architecture allows a Test Bed to use remote testing facilities and existing Test Artifacts (for example, Schematrons, schemas, sample messages) within the test execution. The import statements provide the necessary details to the test engine: they describe how to import those modules or artifacts so that the test engine can remotely access them and use them during the test execution.

The module import element is used to import external test modules. Its details are the following:
 name – The name of the imported module. This name is used to refer to the module within the test case definition.
 uri – The URI used to access the module.
 config (0..*) – Configuration parameters for the module. The test engine should configure the module with the supplied parameters before the execution.

The artifact import element is used to import external test artifacts required for the test execution. Its details are the following:
 name – A name assigned to the artifact. This name is used to refer to the artifact within the test case definition.
 uri – The URI used to access the artifact.
 type – Indicates the type of the content. Should refer to one of the default GITB types or the plugged-in types of the testbed (ex: gitb-types:DOM for Schematron or XML schema artifacts).
 encoding (0..1) – Indicates the serialization format of the artifact content (ex: XML for Schematron or XML schema artifacts). If not supplied, the default format for the given type is assumed.

9.3.3 Defining the Actors and Roles in the Test Case

As mentioned earlier, the GITB testing process and model is based on the actor concept, which is very common in eBusiness specifications. A Test Case definition should declare the actors participating in the target testing scenario (the details of the actors are defined within the Test Suite definition). Furthermore, based on the objective of the test scenario, the role of each actor in terms of testing should also be declared. For a conformance test scenario, the actor whose conformity will be checked should take the System-Under-Test (SUT) role; the other actors will be simulated by the test engine. For an interoperability test scenario, the actors that will be tested should be indicated as SUT. The TestRole element is used to define an actor along with the following details:
 id – The unique identifier of the actor definition within the GITB Test Bed.
 name – A short name assigned to the actor. This name is used in the Test Case definition to refer to this actor in the related constructs.
 role – The role of the actor in the testing process. The value should be from TestRoleEnumeration:
  ◦ SUT – Indicates that this actor will be tested in this test scenario. It means that the systems that want to be tested can only participate in the test with this actor.
  ◦ SIMULATED – Indicates that this actor will be simulated by the Test Bed itself in the test scenario.

9.3.4 Defining the Variables

Like any programming language, test scripting languages need variables to store intermediate results (message/document parts, computed values, etc) during test execution. The variable element defines a variable with its type and, if supplied, its initial value. In GITB TDL, variable declarations are made either in a Test Case or in a Scriptlet and the scope of the variable is the enclosing container. The variables defined in the Test Case are global variables for the test execution. Variables in Scriptlets are local variables for that Scriptlet.
 name – Name of the variable. Should be unique within the scope (Test Case or Scriptlet). Expressions refer to this name to access the value (see expression handling).
 type – Indicates the type of the variable (see the GITB TDL type system).
 value (0..*) – Provides the initial value assigned to the variable before the execution of the test steps. The value element is a named expression and the evaluated value of this expression is assigned as the value. For composite types (map and list), more than one “value” element can be supplied. For the “list” type, a list is composed from the evaluated values of all supplied “value” elements. For the “map” type, each “value” element provides a key (the name attribute of the element) and value (evaluated expression value) pair.

9.3.5 Preliminary Phase for the Execution

The Preliminary construct is the container for the steps in the preliminary phase of the test scenario, and it is also used for user interactions during execution. This is the phase where SUT administrators are notified of the preliminary requirements of the test scenario. These preliminary requirements can be instructions for the SUT admins to do something on their implementations related to the scenario before the execution begins. Such an instruction will typically be related to a message/document exchanged between the SUT and the test engine (or another SUT). For example, it may ask the SUT admin to create a user profile in the system with the given values (id, name, address, etc) so that the message part related to the created user profile can later be checked against the given requirements. This type of preliminary interaction is represented by the Instruction element and its attributes are as follows:
 desc – The textual instruction to be shown to the SUT administrator.
 with (0..1) – Refers to the actor (name attribute of the TestRole element) to which this instruction will be shown. If not supplied, it is assumed that this instruction is shown to all SUT actors defined in the test.
 type (0..1) – If a value (computed at run time) is supplied together with the instruction, the type of the value should be specified in this attribute. An example instruction can be “Please use the following value for the User identifier for the scenario”.
 encoding (0..1) – If a value is supplied together with the textual instruction, this attribute indicates the representation format of the value for the given type.
 expr (0..1) – The element extends Expression, and the evaluated value of the expression is supplied as the value to be shown in the instruction.

Sometimes, rather than enforcing requirements (specific values in the scenario), it can be more convenient to give SUT administrators the freedom to set certain values in a testing scenario. For the same example, the instruction can be changed to “Please create a user profile in your system and copy the user identifier that your system assigns to the user into the following space”. Such requests are represented by the InputRequest element. The semantics of its attributes are as follows:
 desc – The textual instruction to be shown to the SUT administrator.
 with – Refers to the actor (name attribute of the TestRole element) to which this instruction will be shown and from which input will be requested.
 expr (0..1) – The element extends Expression, and if an expression is supplied, the input taken from the user is assigned to the variable mentioned in the expression. In other words, the supplied expression should be a variable expression (left value). If this element exists, the type and name attributes are not necessary and the test engine should not process them. The type of the expected input is deduced from the type of the variable.
 type (0..1) – If the requested input is not assigned to a variable by using expr, this attribute should be used to indicate the type of the requested input according to the type system of the testbed.
 name (0..1) – If the requested input is not assigned to a variable by using expr, this attribute specifies the name of the input, unique within the container (Preliminary or UserInteraction). This name is then used to access the value in later expressions.
 encoding (0..1) – Indicates the serialization format of the requested input. If not supplied, the default format of the given type is assumed.
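As an illustration, a preliminary phase combining one instruction and one input request might be written as follows. The instruction and request element names follow the UserInteraction description in 9.3.9; the variable userId and the actor name Supplier are assumptions for the example.

   <preliminary>
     <instruction desc="Create a user profile in your system before starting the test" with="Supplier"/>
     <request desc="Copy the user identifier assigned by your system into the field below"
              with="Supplier" expr="$userId"/>
   </preliminary>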

2125 2126 9.3.6 Test Steps and Commands

The Sequence construct is used to represent a sequence of test commands that the test engine executes sequentially. The root steps element in the Test Case definition is the entry point for the main execution phase. A Sequence may include the following constructs:
 btxn
 etxn
 send
 receive
 listen
 if
 while
 forEach
 flow
 exit
 assign
 group
 verify
 call
 interact
Some of these constructs (btxn, etxn, assign, call) are supplementary constructs and are not designated as test steps. The others are actual test steps which are presented to the users and should extend the common test step class. The common attribute for test steps is:
 desc – The textual description of the test step. It should be written as an instruction to the SUT administrators when some action is expected from them.

9.3.7 Messaging Steps

Handling the communication among SUTs and the simulated actors (i.e. the Test Bed itself) based on the target protocol (according to the rules and requirements stated by the target specification) is one of the major parts of automated test processing. Communication can take place between a SUT and a simulated actor, or between two SUTs, and TDL commands and mechanisms are required to handle and drive this communication. The TDL has three messaging commands to represent these operations:
 send – Used when the Test Bed (over a simulated actor) needs to send a message to a SUT based on the target specification for that step.
 receive – Used when a SUT is expected to send a message to the Test Bed (over a simulated actor) for that step based on the target specification.
 listen – Used when a SUT is expected to send a message to another SUT and the Test Bed is expected to listen in (like a proxy) on this message.

Each of these test steps represents only one side of the communication between actors. Communication protocols (ex: SOAP Web Services, RESTful Services, AS2, AS3, AS4, ebMS, etc) are generally based on a request-response scheme at the application layer. Therefore, in order to handle the full communication, two complementary messaging steps should be used. For example, assume that we are testing a Web service client and our Test Bed is simulating the Web service. For this communication, we need a receive command as a first step to get the Web service request. Then, after doing some validations and processing, we can use the send command to send the response message. However, some domain specific protocols (ex: the DICOM communication protocol in eHealth) define more complex messaging schemes (multiple requests and responses in specific orders) at the application layer. In order to support all of these protocols, TDL defines the Transaction concept, which is used to relate the messaging steps that simulate a complete communication between two actors at the application layer.


The btxn (begin transaction) command is used to notify the test engine that a Transaction will start between the given actors in the next messaging steps. The details of the element are as follows:
 txnId – An identifier assigned to the transaction. This identifier is used to relate the messaging steps to this transaction. Therefore, it should be unique among the transactions in the test case definition.
 from – The actor that will participate in the communication related to this transaction. The name attribute of the actor stated in the corresponding TestRole element should be used as the value for the referral. By convention, the actor that will start the communication should be referred to by the from attribute. Generally, specifications define single endpoints for their actor definitions; however, some may define more than one endpoint for an actor, supporting different protocols. If an actor definition has more than one endpoint definition, the value for this attribute should be given in the format “<actor name>.<endpoint name>”.
 to – The other actor that will participate in the communication related to this transaction.

The etxn (end transaction) command is used to notify the test engine that a transaction is finalized and that no following messaging steps will refer to this transaction any more.
 txnId – The id of the transaction.

A common base class is designed for the three messaging steps:
 txnId – The id of the transaction that this messaging step belongs to.
 handler – The unique identifier of the handler (messaging module) within the Test Bed that will handle the communication stated in this messaging step.
 from – The actor that will send the message. The name attribute of the actor stated in the corresponding TestRole element should be used as the value for the referral. If an actor definition has more than one endpoint definition, the value for this attribute should be given in the format “<actor name>.<endpoint name>”.
 to – Refers to the actor that will receive the message (the same rules as for from apply).
 config (0..*) – List of configuration parameters to configure the messaging module for the communication.

In a GITB Test Bed, every registered messaging module has a definition that describes its configuration as well as the input and output parameters required for its operation. The supply of configuration parameters and inputs, and the binding of output parameters in TDL, are performed based on these definitions.

The send command extends the common messaging step with the following extra elements:
 input (0..*) – The list of input elements that will be supplied to the messaging module (most of the time the inputs are the parts of the message that will be sent to the SUT). The Binding class is an Expression with a name attribute. The expression is evaluated and the value is given as the input parameter. The binding of the supplied input elements to the input parameters of the module can be done in two different ways: either by name binding, or by the order of parameters. With name binding, the name attribute of each input element should refer to the parameter name defined in the module definition; in this way, optional parameters may be omitted. With order-based binding, the supplied input elements should be in the same order as defined in the module definition and, for optional parameters that the test designer does not want to supply, an empty input element should be used.
The receive and listen commands extend the common messaging step with the following extra elements:
 output (0..*) – Messaging modules return a set of outputs (message parts) as a result of receive or listen operations. These elements are used to bind the outputs to variables in the Test Case definition. The binding of the returned results to these elements can be done in two ways, either by name binding or by the order of parameters, as in the case of the input elements of the send command. The expression given in the output element should be a variable expression; the value of the corresponding output is then assigned to this variable.


 id (0..1) – TDL provides syntactic sugar for test designers to use the results (received messages) of receive or listen commands. Rather than binding the outputs to variables, the test designer can use the id attribute of the command without using any output elements. In this way, a map type variable is created with the supplied id as its name. The outputs of the step are stored in this map, where the keys are the output names defined in the module definition.
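Putting these constructs together, a simple request/response exchange with a Web-service SUT could be scripted as sketched below; the handler identifiers, parameter names and the way the expression is embedded in the input element are assumptions for illustration.

   <btxn txnId="t1" from="Supplier" to="Customer"/>
   <receive id="request" desc="Supplier sends the invoice message"
            txnId="t1" from="Supplier" to="Customer"
            handler="urn:example:soap-messaging-adapter"/>
   <send desc="Test Bed returns the application response"
         txnId="t1" from="Customer" to="Supplier"
         handler="urn:example:soap-messaging-adapter">
     <input name="business_message">$responseTemplate</input>
   </send>
   <etxn txnId="t1"/>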

9.3.8 Validation Step
The verify step is used to represent validation steps in the test scenario where a specific validation methodology is applied to a given content.
 handler – The identifier (URN) of the validation module that will perform the actual validation.
 config (0..*) – The list of configuration parameters supplied to the validation module for the validation process.
 input (1..*) – The list of inputs (content, schemas, etc) supplied to the validation module. For each input element the expression is evaluated and the value is supplied as an input parameter to the module. The same methodology described for the messaging steps is used to bind the values to the parameters (binding by order or binding by name).
 id (0..1) – If the test designer needs a decision step or loop step that depends on the result of a validation step, the id attribute can be used to give a name to the validation result. In this case, a Boolean variable named with the given id is created and this variable can be accessed by expressions in further steps.
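For example, a verify step applying the imported Schematron artifact to a message part received in the previous sketch could look as follows (the handler identifier and input names are assumptions):

   <verify id="invoiceValid" desc="Validate the received invoice against the Schematron rules"
           handler="urn:example:schematron-validator">
     <input name="document">$request{business_message}</input>
     <input name="schematron">$invoice_schematron</input>
   </verify>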

9.3.9 User Interaction During Execution
The UserInteraction construct is used for steps that interact with SUT administrators during test execution. The Test Bed is expected to interact with the users, show the child instructions and obtain the requested inputs for this step. When the interaction is finalized, the step is assumed to be completed and execution continues with the next step.
 with – The interaction can be with a specific SUT administrator for this step. In that case this attribute should refer to the corresponding actor (TestRole.name). If this attribute is not supplied, the “with” attribute should be supplied for each included Instruction or InputRequest element.
 instruction (0..*) – The list of instructions for this interaction group. The details of the Instruction element are described in the preliminary phase section.
 request (0..*) – The list of input requests for this interaction group. The details of the InputRequest element are also described in the preliminary phase section.

9.3.10 Interim Computations

Test designers may need some interim computations on the received content (messages, documents, inputs) between test steps in order to use the results in later steps. The assign construct is the supplementary test construct designed for this purpose. It extends Expression, and the result of the computation/processing performed by the expression is stored in a variable.
 to – A variable expression indicating the variable that will store the resulting value.
 append (0..1) – This attribute is only used for list type variables. It indicates whether the value calculated by the expression is itself a list type (in which case a normal assignment is performed) or a non-container value that is appended to the list.

9.3.11 Test Flow Steps

Sometimes a test scenario includes decision points where execution continues with a specific branch based on a decision. The if construct is used to indicate such decision points.
 cond – A Boolean expression representing the condition for the decision point
 then (0..1) – The branch of steps that should be executed when the condition evaluates to true.


 else (0..1) – The branch of steps that should be executed when the condition evaluates to false.

Another construct required by any computational language is the loop construct, which executes a part of the script in a loop based on some condition. The while construct is the TDL construct for generic loops.
 cond – A Boolean expression representing the condition that decides whether to continue the loop.
 do – The sequence of steps to loop over.

A second loop construct executes the child steps at least once and then decides whether to loop again based on the given condition.
 do – The sequence of steps to loop over.
 cond – A Boolean expression representing the condition that decides whether to continue the loop.

The forEach construct is another loop construct for executing steps a given number of times (iteration over a list type variable).
 counter (0..1) – Name of the iteration variable (number type). The default value is “i” if not supplied. The scope of the counter variable is the child steps given in the do sequence. It can be used as an index for iteration over lists. For each iteration, the value of the variable is incremented.
 start (0..1) – The starting value for the counter variable. The default value is 0.
 end – The end value for the counter variable. If the value of the counter variable becomes larger than this value, the loop ends.

Some test scenarios need branches of steps that should be executed concurrently. The flow construct is used for this purpose.
 thread (1..*) – Each thread represents a branch that should be executed concurrently.

The exit construct is used to exit from the Test Case execution from any branch.

The group construct, extending the Sequence class, is used to form a logical group of a sequence of steps in order to better present the test scenario to the SUT administrators.
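For instance, a decision on the outcome of the validation step shown earlier, combined with a loop over invoice lines, might be expressed as sketched below; whether cond is serialized as a child element and whether end may hold an expression are assumptions in this sketch.

   <if desc="Branch on the validation result">
     <cond>$invoiceValid</cond>
     <then>
       <forEach counter="i" start="0" end="$lineCount" desc="Check every invoice line">
         <do>
           <!-- further verify or messaging steps -->
         </do>
       </forEach>
     </then>
     <else>
       <exit desc="Stop the test case because the invoice is invalid"/>
     </else>
   </if>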

9.3.12 Modular Test Scripting
Like developers writing code in any programming language, test designers need to group a set of steps, a sub execution flow, and reuse them in their Test Case descriptions more than once. The Scriptlet element is used to define such function-like partial test scripts. Scriptlets can be defined within a Test Case definition or globally within Test Suite packages. The following are the details of the element:
 id – The identifier used to identify the definition within the Test Case definition or a Test Suite package
 metadata (0..1) – The metadata of this partial test script definition.
 namespace (0..*) – The list of namespace declarations and their prefix bindings that will be used in the expressions of the Scriptlet definition.
 import (0..*) – The list of import statements declaring the external test modules or test artifacts required for the execution of this partial definition.
 param (0..*) – The definition of the input parameters of this Scriptlet (like function parameters).
 var (0..*) – Definition of local variables whose scope is this Scriptlet definition.
 steps – The sequence of test steps for this partial test execution flow.


 output (0..*) – Definition of the return values of this partial test script. The caller can access these values from its own scope when the execution of the Scriptlet is finalized. The TypeBinding extends Expression and the evaluated value of the expression is returned as a result. As an extension to the Binding class, TypeBinding indicates the type of the returned result with the type attribute.

The call construct is used to call a Scriptlet within a test case or another Scriptlet definition.
 path – The identifier of the Scriptlet to call.
 input (0..*) – Input parameters supplied to the Scriptlet. Binding is performed similarly to the bindings in other constructs.
 output (0..*) – Defines the bindings of the Scriptlet outputs to variables in the context. Binding is performed similarly to the bindings in other constructs.
 id (0..1) – Can be used to directly access the results of the Scriptlet from a map type variable initialized with this name. In that case, the test designer does not need to use the output elements.

9.3.13 Expressions and Bindings

2345 The Expression element is used to represent TDL expressions and is used in all other TDL constructs 2346 as described in the sections above. 2347  lang (0..1) – Any expression language (both in terms of syntax and semantics) can be used for the 2348 TDL expressions if the Test Bed supports it. This attribute provides the unique identifier (URN) of 2349 the language for this expression. TDL provides a default XPath-based expression language, and if 2350 this attribute is not supplied this scheme should be assumed. 2351  source (0..1) – The input source for the expression; in other words, the expression will be evaluated 2352 based on this source content. It should be a variable expression (left value). 2353  expr (0..1) – The string representing the expression. If not supplied, the result is a Null value. 2354 2355 The default TDL Expression scheme extends XPath 2.0 with the following simple extensions: 2356  The Variable References in the XPath expressions are extended to access the values of GITB 2357 container typed (list and map) variables. The following rules apply: 2358 ◦ $<variable name> is used as usual for the value of a variable (e.g. $x to access x's value). 2359 ◦ $<variable name>{<key>} is used to access an entry (with the given key) in a map typed 2360 variable (e.g. $x{name} to access the entry in x with key “name”). 2361 ◦ $<variable name>{<index>} is used to access an entry of a list typed variable at the 2362 given index (e.g. $x{0} to access the first element in the x list). 2363 2364 The same rules apply to the TDL Variable Expressions (references to variables). 2365 2366 Some test constructs in TDL (messaging steps, interaction steps, validation steps, calling scriptlets) get 2367 inputs and return outputs within the test execution. In order to bind values to these input and output 2368 parameters, the Binding element is used. Binding extends Expression with the following attribute: 2369  name (0..1) – Name of the input parameter to which the evaluated expression value will be bound while 2370 supplying input. Similarly, it can be the name of the output parameter whose return value will be bound 2371 to the given variable reference. As described in the related constructs, if this attribute is not used, the 2372 parameters will be bound in the same order. 2373

2374 9.4 XML Schema for TDL


2377 10 GITB Proof of Concept (PoC) Test Bed Implementation

2378 This section presents an overview of the GITB Test Bed implementation, which has been developed in GITB 2379 Phase 3 as Proof-of-Concept (PoC) for the GITB architecture and specifications.

2380 The GITB PoC Test Bed is an open source project and its source code can be found in the GitHub Repository7. 2381 For code contribution, Git8, which is a distributed revision control and source code management (SCM) 2382 system, is used. Git enables distributed development and provides strong support for non-linear (branching 2383 and merging) development, which is important for the GITB PoC Test Bed implementation. 2384 The collaboration model that is followed in GITB PoC Test Bed development adopts a feature-based 2385 workflow that suggests the creation of a new branch for each new feature. When a new feature is to be 2386 developed, a new development branch, which denotes a slightly different direction in which the development 2387 is proceeding, is created. After the feature is implemented and complete, it is merged into the 2388 master development branch. Therefore, a development branch never affects a stable release.

2389 10.1 Software Architecture

2390 The software architecture behind the GITB PoC Test Bed consists of two main components – the GITB 2391 Testbed component and the GITB Execution Interface.

2392  GITB Testbed is responsible for execution of conformance and interoperability tests through a set of 2393 services.

2394  GITB Execution Interface provides a Graphical User Interface (GUI) and a REST API to manage a 2395 number of user activities (account and SUT registration, conformance statement definition, etc.) and 2396 testing operations by utilizing the services exposed by GITB Testbed.

2397 Figure 10-1 illustrates the interactions between the two main components and their modules.

2398

2399 Figure 10-1: GITB PoC Implementation Components

7 https://github.com/srdc/gitb
8 http://git-scm.com/

2400 10.1.1 GITB Testbed

2401 In order to be able to support and test a wide range of messaging protocols, business document formats and 2402 document exchange choreographies, a modular approach needs to be embraced by GITB Testbed 2403 component. This is achieved by adopting an interface-based architecture. The latter enables modularity and 2404 adaptability, thus, increases maintainability and facilitates development of additional auxiliary modules. In 2405 this way, the GITB Testbed component is built as a collection of modules, and API calls among its modules 2406 can only be established through the defined interfaces. At software level, this structure is realized by utilizing 2407 Apache Maven9, which is a software management and comprehension tool. With the help of interfaces 2408 developed and Maven's powerful module management facilities, new modules can be developed and 2409 existing modules can be integrated without requiring much effort. In this way, the GITB Testbed’s capabilities 2410 can be further extended without hindering the Test Bed execution. Maven is also used for dependency 2411 management, build automation, and for a broad range of plugins.

2412 Each module within the GITB Testbed component has an XML file representation kept in a file named 2413 pom.xml. This representation is called the Project Object Model (POM) and is the central construct of 2414 Maven’s build management philosophy. The POM file describes the intended software project being 2415 developed, its dependencies on other modules or external software libraries, the build order, managed 2416 resources and needed plug-ins. It comes with pre-defined targets for managing the life-cycle phases such as 2417 compilation, packaging and deployment.

2418 A multi-module software project like the GITB Testbed is defined by a parent POM (or top-level POM) 2419 referencing its modules. As a result of doing so, modules are grouped together. When a Maven command is 2420 executed against the parent POM, the same command will be executed for the child modules as well. For 2421 instance, building the parent will eventually build all modules, without the need to build each module 2422 separately. The content of the top-level POM of the GITB Testbed component is summarized below.

2423 The parent POM also defines a set of Maven coordinates: the groupId is com.gitb, the artifactId is GITB and the 2424 version is 1.0-SNAPSHOT. Furthermore, some global properties such as compiler.version and gitb.version are 2425 defined in this POM, so that they are available in all child modules. The parent project does not create a JAR 2426 or a WAR like other modules; instead, it is simply a POM that refers to its child modules. Additionally, every 2427 child module has to specify its parent POM in its own POM file.

2428

9 http://maven.apache.org/

The top-level POM declares the following:

● Maven coordinates and packaging: modelVersion 4.0.0, groupId com.gitb, artifactId GITB, name GITB, version 1.0-SNAPSHOT, packaging pom.
● Global properties: gitb.version = 1.0-SNAPSHOT, compiler.version = 1.7.
● Build plugins: org.apache.maven.plugins:maven-compiler-plugin 3.0 (source and target set to ${compiler.version}) and org.eclipse.jetty:jetty-maven-plugin 9.2.2.v20140723 (configured with -Dorg.eclipse.jetty.annotations.maxWait=180 and the context file gitb-testbed-service/src/main/webapp/WEB-INF/jetty-context.xml).
● Modules: gitb-core, gitb-engine, gitb-lib, gitb-messaging, gitb-remote-testcase-repository, gitb-remote-modules, gitb-testbed-service, gitb-validator-validex, gitb-validators.

2429

2430 10.1.2 GITB Testbed Modules

2431 10.1.2.1 The Central Part of the GITB Testbed: gitb-core

2432 gitb-core module is the central part of the modular architecture of the GITB Testbed component. It provides a 2433 data model for the GITB type system explained in section 9.1.2, exceptions that may be thrown during the 2434 execution of the Test Bed and classes used for messaging operations, as well as an internal API through a 2435 set of interfaces for extendable functionalities. These functionalities encompass development of additional 2436 messaging and validation modules as well as integration of external messaging and validation services,


2437 external test case repositories and additional function registries. The current GITB Test Bed implementation 2438 provides a number of already-developed functionalities which will be explained in the upcoming sections.

2439 The internal APIs provided by gitb-core module are the following:

2440 ● IMessagingHandler: Provides abstract methods for implementing a new messaging service instead 2441 of integrating an existing one. These methods have similar signatures and functionalities as the ones 2442 defined in the Messaging (Simulation) Service Specifications. They are briefly explained below (a simplified interface sketch is also given after the class diagram): 2443 ○ getModuleDefinition: Returns the module details including messaging configurations and 2444 inputs that the module requires and the outputs that the module provides. 2445 ○ initiate: Creates a session between the messaging service client (SUT) and GITB Testbed 2446 before any transaction is realized. 2447 ○ beginTransaction: Creates a communication between SUT and GITB Testbed with the 2448 given session ID and configurations. 2449 ○ sendMessage: Allows GITB Testbed to send given messages to a SUT according to given 2450 configurations over an existing transaction. 2451 ○ receiveMessage: Allows GITB Testbed to receive messages from a SUT according to given 2452 configurations over an existing transaction. 2453 ○ listenMessage: Allows GITB Testbed to listen to messages exchanged between two SUTs according to 2454 given configurations over an existing transaction. 2455 ○ endTransaction: Destroys the given transaction in a session. 2456 ○ endSession: Destroys the session.

2457

2458 Figure 10-2: Class Diagram of IMessagingHandler Interface 2459
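To make the shape of this internal API more concrete, the following is a minimal, hypothetical Java sketch of a messaging handler mirroring the operations listed above. The parameter and return types are simplified placeholders; the actual gitb-core interface uses the generated GITB service types.

import java.util.List;
import java.util.Map;

// Hypothetical, simplified messaging handler; not the exact gitb-core signatures.
public interface SimpleMessagingHandler {

    // Returns the module details: messaging configurations, required inputs
    // and provided outputs.
    Map<String, String> getModuleDefinition();

    // Creates a session between the messaging service client (SUT) and the
    // Test Bed before any transaction is realized; returns the session ID.
    String initiate(List<Map<String, String>> actorConfigurations);

    // Opens a transaction between the SUT and the Test Bed within a session.
    void beginTransaction(String sessionId, String transactionId,
                          Map<String, String> configurations);

    // Sends the given message to a SUT over an existing transaction.
    void sendMessage(String sessionId, String transactionId,
                     Map<String, Object> message);

    // Receives a message from a SUT over an existing transaction.
    Map<String, Object> receiveMessage(String sessionId, String transactionId);

    // Listens to messages exchanged between two SUTs over an existing transaction.
    Map<String, Object> listenMessage(String sessionId, String transactionId,
                                      Map<String, String> configurations);

    // Destroys the given transaction in a session.
    void endTransaction(String sessionId, String transactionId);

    // Destroys the session.
    void endSession(String sessionId);
}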

2460 ● IModuleLoader: Provides abstract methods for retrieving proxies of external Content Validation 2461 Services and Messaging (Simulation) Services implementing IValidationHandler and 2462 IMessagingHandler interfaces. 2463 ○ loadValidationHandlers: Returns all integrated Content Validation Service proxies 2464 implementing IValidationHandler interface. 2465 ○ loadMessagingHandlers: Returns all integrated Messaging (Simulation) Service proxies 2466 implementing IMessagingHandler interface. 2467

2468 2469 Figure 10-3: Class Diagram of IModuleLoader Interface 2470

2471 ● IFunctionRegistry: An interface for writing user-defined, reflective extension functions to offer 2472 additional features beyond those specified in the XPath Specifications. In other words, extension 2473 functions enable invocation of JAVA methods, as if calling XPath functions, during Test Case 2474 processing, since GITB Testbed uses an XPath 2.0 based expression language by default. 2475 An extension function is written in JAVA and invoked by using the prefix:localname() pattern within a 2476 Test Case document. The prefix must be the prefix associated with a namespace declaration in the 2477 Test Case within a tdl:Namespace element. An illustrative registry implementation is sketched after the class diagram below.

2478 ○ getName: Returns the unique name of the function registry to identify it among the other 2479 function registries. 2480 ○ isFunctionAvailable: Returns a Boolean value indicating the existence of a function, which 2481 is referenced from a test case, within a function registry. 2482 ○ callFunction: Invokes a function defined in a function registry with given name and 2483 arguments. 2484

2485 2486 Figure 10-4: Class Diagram of IFunctionRegistry Interface 2487
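As an illustration only, the following hypothetical Java sketch shows how an extension function registry could be provided. It assumes the gitb-core IFunctionRegistry interface is on the classpath with the three operations described above (the exact signatures may differ) and uses the @MetaInfServices annotation from the metainf-services library used by gitb-core for registration; the str:reverse function is purely an example.

import org.kohsuke.MetaInfServices;

// Hypothetical extension-function registry; the IFunctionRegistry signatures
// shown here are assumptions based on the description above.
@MetaInfServices(IFunctionRegistry.class)
public class StringFunctionRegistry implements IFunctionRegistry {

    // Unique name identifying this registry among the other registries.
    public String getName() {
        return "StringFunctionRegistry";
    }

    // Indicates whether a function referenced from a Test Case exists here.
    public boolean isFunctionAvailable(String functionName) {
        return "reverse".equals(functionName);
    }

    // Invoked for e.g. str:reverse($x) in a Test Case expression, assuming the
    // "str" prefix is bound to this registry via a tdl:Namespace declaration.
    public Object callFunction(String functionName, Object... arguments) {
        if ("reverse".equals(functionName)) {
            return new StringBuilder(String.valueOf(arguments[0])).reverse().toString();
        }
        throw new IllegalArgumentException("Unknown function: " + functionName);
    }
}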

2488 ● ITestCaseRepository: This interface defines generic methods for retrieval of various test resources 2489 (Test Cases, Test Suites, scriptlets or various Test Artifacts) on demand. Modules implementing this 2490 interface do not have to contain the test resources locally within the module; instead, they can 2491 provide access to external repositories. Therefore, any remote test resource repository serving on a 2492 specific protocol (local or remote file system, TCP, HTTP, etc.) can be integrated with GITB Testbed 2493 by implementing the ITestCaseRepository methods according to the protocol requirements. 2494 ○ getName: Returns the unique name of the Test Case repository to identify it among the 2495 other Test Case repositories. 2496 ○ isTestCaseAvailable: Returns a Boolean value indicating the existence of a Test Case with 2497 the given ID in a repository. 2498 ○ getTestCase: Retrieves a tdl:TestCase object with the given ID. 2499 ○ isTestSuiteAvailable: Returns a Boolean value indicating the existence of a Test Suite with 2500 the given ID in a repository. 2501 ○ getTestSuite: Retrieves a tdl:TestSuite object with the given ID. 2502 ○ isScriptletAvailable: Returns a Boolean value indicating the existence of a scriptlet with 2503 the given ID in a repository. 2504 ○ getScriptlet: Retrieves a tdl:Scriptlet object with the given ID. 2505 ○ isTestArtifactAvailable: Returns a Boolean value indicating the existence of a test 2506 resource with the given path in a repository. 2507 ○ getTestArtifact: Returns a JAVA InputStream for the test resource from the given path. 2508

2509 2510 Figure 10-5: Class Diagram of ITestCaseRepository Interface 2511

2512 ● IValidationHandler: Provides abstract methods for implementing a new validation service instead of 2513 integrating an existing one. These methods have similar signatures and functionalities as the 2514 methods defined in the Content Validation Service Specifications and they are briefly explained below (a simplified sketch follows the class diagram): 2515 ○ getModuleDefinition: Returns the module details including validation configurations and 2516 inputs that the module requires.

2517 ○ validate: Validates the content with given inputs and configurations and returns a report 2518 about the validation operation, providing the overall result and description of performed 2519 assertions, found errors and warnings. 2520

2521 2522 Figure 10-6: Class Diagram of IValidationHandler Interface 2523
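Analogously to the messaging handler above, a simplified, hypothetical sketch of a validation handler could look as follows; the types are placeholders rather than the actual gitb-core signatures.

import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Hypothetical, simplified validation handler; not the exact gitb-core signatures.
public interface SimpleValidationHandler {

    // Returns the module details: validation configurations and required inputs.
    Map<String, String> getModuleDefinition();

    // Validates the content with the given inputs and configurations and
    // returns a report with the overall result, assertions, errors and warnings.
    ValidationReport validate(Map<String, Object> inputs,
                              Map<String, String> configurations);

    // Minimal placeholder for the report structure referred to above.
    class ValidationReport {
        public boolean success;
        public final List<String> assertions = new ArrayList<>();
        public final List<String> errors = new ArrayList<>();
        public final List<String> warnings = new ArrayList<>();
    }
}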

2524 ● ValidationService: Provides Web service methods of GITB Content Validation Service. 2525 ● MessagingService: Provides Web service methods of GITB Messaging (Simulation) Service. 2526 ● TestbedService: Provides Web service methods of GITB Testbed Service. 2527

2528 As mentioned before, the GITB Testbed component consists of several modules and gitb-core is the center 2529 of this modular architecture. That is because gitb-core enables access to all these functionalities from a 2530 single point, which is, in fact, a singleton class called ModuleManager. In order to achieve this behavior, 2531 each implementor class, which denotes a concrete implementation of either the abovementioned interfaces 2532 or the abstract GITB Data Type class, registers itself as a service provider through the JAVA Service Provider 2533 mechanism. This registration process is managed via one line of code by utilizing an annotation-driven 2534 META-INF/services auto-generator library10. By annotating the implementor classes with the 2535 @MetaInfServices(<interface>.class) annotation provided by this library, a service provider 2536 configuration file, whose name is the fully-qualified binary name of the interface (e.g. 2537 com.gitb.messaging.IMessagingHandler) or abstract GITB Data Type class (com.gitb.types.DataType), is 2538 generated automatically and this file contains the names of the implementor classes, one per line. Then, when 2539 GITB Testbed starts, the ModuleManager class aggregates all the implementations of the interfaces or GITB 2540 Data Types by calling the load method of the java.util.ServiceLoader class, which scans all the service provider configuration files in the run-time 2541 environment and returns them, and serves them to the other modules. 2542 Identification of validation and messaging modules is carried out with the identifiers defined in their module 2543 definitions, whereas each Test Case repository, function registry or GITB Data Type must have a unique 2544 name within a running instance. As can be seen in the class diagram below, ModuleManager is a 2545 singleton class. Concrete implementors are kept inside of private java.util.Map objects and accessed by 2546 public accessors.

10 http://metainf-services.kohsuke.org/
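A hedged sketch of how such a singleton could aggregate the registered service providers via java.util.ServiceLoader is given below; the class and member names are illustrative (the real ModuleManager also manages messaging and validation handlers, repositories and data types) and the IFunctionRegistry interface is assumed as in the sketch above.

import java.util.HashMap;
import java.util.Map;
import java.util.ServiceLoader;

// Illustrative ModuleManager-style singleton; not the actual gitb-core class.
public final class SimpleModuleManager {

    private static final SimpleModuleManager INSTANCE = new SimpleModuleManager();

    private final Map<String, IFunctionRegistry> functionRegistries = new HashMap<>();

    private SimpleModuleManager() {
        // ServiceLoader scans the META-INF/services configuration files that the
        // @MetaInfServices annotation generated at build time and instantiates
        // every registered implementation.
        for (IFunctionRegistry registry : ServiceLoader.load(IFunctionRegistry.class)) {
            functionRegistries.put(registry.getName(), registry);
        }
    }

    public static SimpleModuleManager getInstance() {
        return INSTANCE;
    }

    // Registries are served to other modules through public accessors.
    public IFunctionRegistry getFunctionRegistry(String name) {
        return functionRegistries.get(name);
    }
}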

2547

2548 Figure 10-7: Class Diagram of ModuleManager Class

2549 Another important characteristic of the gitb-core module is that it also provides the object models for the XML 2550 Schemas for TDL, TPL and the Test Report format, and the Web Service stubs for TestbedService, 2551 MessagingService and ValidationService. However, these object models are not provided directly. Instead, 2552 gitb-core converts the XML Schemas and WSDL files given in previous sections into JAVA objects by utilizing 2553 the Maven JAXB plug-in11 during the generate-sources lifecycle phase of Maven. Therefore, every GITB Testbed 2554 module requires the gitb-core module as a dependency in order to coordinate testing activities.

2555 10.1.2.1.1 Utility Classes: gitb-lib

2556 gitb-lib module provides useful utility classes that define a set of methods performing common and reusable 2557 functions to be used by other modules, without encapsulating any state information. These classes are 2558 grouped under com.gitb.utils package and their job is summarized below:

2559 ● ActorUtils: Provides utilities for extracting information such as, actor IDs, actor configurations, 2560 endpoint IDs, endpoint name from gitb:ActorConfiguration objects. 2561 ● BindingUtils: Provides a method determining whether all elements in a list of tdl:Binding have the 2562 optional “name” attribute. 2563 ● ConfigurationUtils: Provides methods for manipulating gitb:Configuration objects. 2564 ● DataTypeUtils: Provides methods for conversion between GITB types and gitb:AnyContent. 2565 ● EncodingUtils: GITB Testbed is able to process messages in different types of encoding schemes. 2566 This class provides utilities for conversion among those encodings. 2567 ● JarUtils: Provides methods for retrieving JAR files of external messaging or validation proxy 2568 modules to access their module definitions. 2569 ● TimeUtils: Provides methods for performing formatting and conversion operations on date & time 2570 information represented by JAVA String and Date classes. 2571 ● XMLDateTimeUtils: Provides methods for manipulating date & time information of type 2572 XMLGregorianCalendar, which is the representation for W3C XML Schema 1.0 date/time datatypes. 2573 ● XMLUtils: Provides many methods for XML processing, conversion, transformation and etc.

2574 10.1.2.1.2 Access to Test Case Artifacts: gitb-remote-testcase-repository

2575 gitb-remote-testcase-repository module provides access to test resources served by GITB Execution 2576 Interface API, over HTTP protocol, so that Test Cases artifacts can be retrieved and processed for test 2577 execution. There are only 2 classes within this module:

11 https://jax-ws-commons.java.net/jaxws-maven-plugin/

2578 ● RemoteTestCaseRepository: Implements the ITestCaseRepository interface to retrieve test resources 2579 from the GITB Execution Interface. It also utilizes an LRU (Least Recently Used) cache mechanism in order 2580 to avoid repetitive HTTP calls for the same resources (a minimal cache sketch is given after the configuration listing below). 2581 ● TestCaseRepositoryConfiguration: Provides configuration parameters such as URLs of the GITB 2582 Execution Interface repository services to retrieve Test Cases and Artifacts. Configurations are 2583 retrieved from remote-testcase-repository.properties located in the resources folder. The content 2584 of the remote-testcase-repository.properties file can be seen below: 2585 #name of the placeholder string in remote.testcase.repository.url configuration remote.testcase.test-id.parameter = test_id

#name of the placeholder string in remote.testresource.repository.url configuration remote.testcase.resource-id.parameter = resource_id

#URL of the test case provider service remote.testcase.repository.url = http://localhost:9000/repository/tests/:test_id/definition

#URL of the test resource provider service remote.testresource.repository.url = http://localhost:9000/repository/suites/:resource_id 2586
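The LRU cache mentioned for RemoteTestCaseRepository can be realized in several ways; one minimal sketch, assuming a LinkedHashMap kept in access order rather than the module's actual implementation, is the following:

import java.util.LinkedHashMap;
import java.util.Map;

// Minimal LRU cache: LinkedHashMap in access order evicts the least recently
// used entry once the configured capacity is exceeded.
public class LruCache<K, V> extends LinkedHashMap<K, V> {

    private final int capacity;

    public LruCache(int capacity) {
        super(16, 0.75f, true); // true = iterate/evict in access order
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity;
    }
}

Such a cache could, for example, map a test artifact path to its previously downloaded content so that repeated HTTP calls for the same resource are avoided.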

2587 10.1.2.1.3 Validators: gitb-validators

2588 This module provides a built-in validation architecture as well as a simple reporting mechanism for 2589 generating validation results with basic validators. The latter are, actually, concrete implementations of 2590 IValidationHandler interface of the internal API provided by gitb-core module. This architecture and 2591 reporting mechanism is responsible for checking the validity and syntactical correctness of the business 2592 messages/documents retrieved by messaging adapters and for reporting the overall result of validation 2593 operation with performed assertions, found errors and warnings.

2594 Currently, gitb-validators module provides 3 validators with their module definition information. Each module 2595 converts the corresponding XML file containing the module definition into a JAVA object with the help of 2596 XMLUtils class provided by gitb-lib module. An example module definition can be seen below:

Module definition excerpt: XSD Validator, version 1.0.

Module definition excerpt: HTTP Messaging, version 1.0. 2750

2751 10.1.2.1.5 Central Processing Module: gitb-engine

2752 gitb-engine is the central processing module of the GITB Testbed component and is responsible for Test Case 2753 execution. It utilizes the internal API provided by gitb-core and its concrete implementations realized in the 2754 abovementioned modules. In addition, it processes the requests received by TestbedService, performs the 2755 required actions according to the Test Bed Service Specifications explained in section 8.3 and returns the 2756 results.

2757 When a request is received by TestbedService, it is delivered to an appropriate manager class to be 2758 processed. The TestEngine singleton class is responsible for test execution, whereas the TestCaseManager 2759 provides utilities for various Test Case operations. Moreover, when Test Bed users request to execute a Test 2760 Case, a session needs to be created for each execution. The SessionManager singleton class manages user 2761 sessions and contains all test execution related data within instances of the TestCaseContext class. 2762 TestCaseContext stores all intermediate results, SUT configurations, messaging context, etc. during a test 2763 session.

2764 Intermediate results must be treated specially, since these results are stored in variables and may be 2765 referenced from other test steps. For this purpose, intermediate results or variables, in short, are stored in a 2766 class called TestCaseScope and they are accessed by their variable names. Variables in TestCaseScope 2767 are globally accessible from each step of the test execution. However, variables created in a Scriptlet are 2768 local variables and cannot be reached outside of it.

2769 Variables defined in a Test Case can be referenced from XPath expressions in the form of $variable_name. 2770 In that case, their values must be resolved in order to evaluate the XPath expression and proceed to the next 2771 steps. For this purpose, the gitb-engine module provides a class VariableResolver which implements the 2772 XPathVariableResolver JAVA XPath API interface, whose resolveVariable method is called automatically

2773 when a variable is referenced. The VariableResolver class has access to TestCaseScope and can retrieve 2774 the value of a variable by its name. If the referenced variable is not found in TestCaseScope, then a null 2775 value is returned.

2776 The same mechanism applies to resolve user-defined extension functions referenced from an XPath 2777 expression. In order to resolve user-defined functions, the gitb-engine module provides a class 2778 FunctionResolver which implements the XPathFunctionResolver JAVA XPath API interface and is consulted 2779 automatically when a function that is not a default XPath function is referenced. Then, the 2780 FunctionResolver scans all the IFunctionRegistry implementations through the ModuleManager, invokes the 2781 first available function having the same signature as the referenced function and returns the value returned 2782 by the function. If no function is found among all IFunctionRegistry implementers, a null value is returned.

2783 As mentioned before, an extension function is invoked by using the prefix:localname() pattern. For a successful 2784 user-defined function invocation, the prefix must also be resolved to a valid namespace URI defined in 2785 a tdl:Namespace element. The gitb-engine module provides a class NamespaceContext which implements the 2786 javax.xml.namespace.NamespaceContext interface for the resolution of prefixes. The NamespaceContext 2787 class has access to the TestCaseContext class through TestCaseScope and can retrieve the corresponding 2788 namespace URIs for each prefix and vice versa. If the prefix is valid, the FunctionResolver class then tries to 2789 resolve the function and invoke it.
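The following hedged Java sketch shows how variable, function and namespace resolution can be plugged into the standard JAVA XPath API, as described in the three paragraphs above. The two maps stand in for the TestCaseScope variables and the tdl:Namespace prefix bindings; the echo function is purely illustrative.

import java.util.Map;
import javax.xml.namespace.QName;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathFactory;
import javax.xml.xpath.XPathFunction;

public class XPathWiringSketch {

    public static XPath newConfiguredXPath(Map<String, Object> scopeVariables,
                                           Map<String, String> prefixToNamespace) {
        XPath xpath = XPathFactory.newInstance().newXPath();

        // $variable_name references are resolved against the test case scope;
        // unknown variables resolve to null, mirroring the behaviour above.
        xpath.setXPathVariableResolver(
                (QName name) -> scopeVariables.get(name.getLocalPart()));

        // Non-default functions (prefix:localname pattern) are resolved to an
        // XPathFunction; a real resolver would consult the IFunctionRegistry
        // implementations through the ModuleManager instead.
        xpath.setXPathFunctionResolver((QName name, int arity) -> {
            if ("echo".equals(name.getLocalPart())) {
                return (XPathFunction) args -> args.isEmpty() ? null : args.get(0);
            }
            return null;
        });

        // Prefixes declared via tdl:Namespace elements are mapped to their URIs.
        xpath.setNamespaceContext(new javax.xml.namespace.NamespaceContext() {
            public String getNamespaceURI(String prefix) {
                return prefixToNamespace.get(prefix);
            }
            public String getPrefix(String namespaceURI) {
                return null; // reverse lookup not needed for evaluation
            }
            public java.util.Iterator<String> getPrefixes(String namespaceURI) {
                return java.util.Collections.emptyIterator();
            }
        });
        return xpath;
    }
}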

2790 2791 Figure 10-13: Class Diagram of Test Engine 2792


2793 In order to enable concurrent and asynchronous execution of several test cases in parallel, 2794 the Akka framework19 is used. Akka is a framework that enables the development of distributed and concurrent 2795 applications. The philosophy underlying Akka embraces the actor model, a mathematical model of 2796 concurrent computation that treats actors as its universal primitives. Akka 2797 concurrency is asynchronous and achieved by actors sending messages to each other. In other words, Akka 2798 actors are lightweight concurrent entities that process messages in an asynchronous way by using an event-driven 2799 receive loop. Actors are created and bound within an ActorSystem and each of them is part of an 2800 actor hierarchy.

2801 The TestEngine singleton class has an ActorSystem object which is responsible for creating actors. There are 2802 two types of actors used within Testbed execution: Session and Processing Actors. When a user requests to 2803 execute a Test Case, a SessionActor is created to manage the test execution with commands such as start, 2804 stop, configure, etc. After gitb-engine retrieves the Test Case with one of the ITestCaseRepository 2805 implementers and receives the start command, it creates a TestCaseProcessActor which takes over the test 2806 execution. The TestCaseProcessActor then parses the Test Case and employs appropriate Test Step Processor 2807 Actors to execute each test step: AssignStepProcessorActor to process the Assign test step, 2808 ReceiveStepProcessorActor to process the Receive step and so on. In other words, there is a corresponding 2809 processor actor for each test step identified in section 9.3.6. After each test step is executed, 2810 the TestCaseProcessActor notifies the corresponding SessionActor. The latter notifies the TestbedService 2811 that the test execution is completed.
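As an illustration of this actor hierarchy, the following sketch uses Akka's classic Java API (UntypedActor, as in the Akka 2.3/2.4 series). The message and actor classes are simplified assumptions and do not reproduce the actual gitb-engine code.

import akka.actor.ActorRef;
import akka.actor.ActorSystem;
import akka.actor.Props;
import akka.actor.UntypedActor;

public class ActorHierarchySketch {

    // Simplified command messages exchanged between the actors.
    public static final class StartTestCase { }
    public static final class TestCaseCompleted { }

    // Conceptually corresponds to a SessionActor managing one test session.
    public static class SessionActor extends UntypedActor {
        public void onReceive(Object message) {
            if (message instanceof StartTestCase) {
                // Delegate the actual execution to a processing actor.
                ActorRef processor =
                        getContext().actorOf(Props.create(TestCaseProcessActor.class));
                processor.tell(message, getSelf());
            } else if (message instanceof TestCaseCompleted) {
                // Here the real implementation would notify the TestbedService.
            } else {
                unhandled(message);
            }
        }
    }

    // Conceptually corresponds to a TestCaseProcessActor executing the steps.
    public static class TestCaseProcessActor extends UntypedActor {
        public void onReceive(Object message) {
            if (message instanceof StartTestCase) {
                // ...parse the Test Case and employ step processor actors here...
                getSender().tell(new TestCaseCompleted(), getSelf());
            } else {
                unhandled(message);
            }
        }
    }

    public static void main(String[] args) {
        ActorSystem system = ActorSystem.create("TestEngine");
        ActorRef session = system.actorOf(Props.create(SessionActor.class), "session");
        session.tell(new StartTestCase(), ActorRef.noSender());
        // The actor system would be shut down once all test sessions complete.
    }
}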

2812 2813 Figure 10-14: Class Diagram of the Actor System Based on Akka 2814

2815 During a test execution, it is natural that run-time exceptions can be thrown as it is very likely that gitb-engine 2816 may have to process erroneous input from SUTs. For instance, a validator may come across an invalid 2817 document or a messaging handler may receive a message that does not conform to transport level 2818 specifications. These exceptions are caught during test execution and sent to gitb-execution-interface as a

19 http://akka.io

2819 test step report indicating the cause of failure. SUT administrators can review the report and fix problematic 2820 parts of their systems.

2821 In order to handle exceptions in each module, a common exception model is created and provided by the 2822 gitb-core module. The abstract GITBEngineRuntimeException class is the base of the common exception 2823 model. Its concrete implementation, GITBEngineInternalError, provides the necessary information 2824 regarding the cause of the exception thrown during test execution. To implement such behaviour, when an 2825 exception (of any run-time or compile-time exception type) is caught, it is wrapped within 2826 GITBEngineInternalError and rethrown so that it is caught within the gitb-engine module. After gitb-engine 2827 catches an exception of the type GITBEngineInternalError, it creates a report from it and sends the report to 2828 gitb-execution-interface, as mentioned before.
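The wrap-and-rethrow pattern described above can be illustrated with the following minimal Java sketch; the GITBEngineInternalError constructor is an assumption and a local stand-in class is used only to keep the example self-contained.

public class ExceptionWrappingSketch {

    // Local stand-in for the gitb-core class; the real constructor may differ.
    public static class GITBEngineInternalError extends RuntimeException {
        public GITBEngineInternalError(String message, Throwable cause) {
            super(message, cause);
        }
    }

    public void processDocument(byte[] content) {
        try {
            parse(content);
        } catch (Exception e) {
            // Any run-time or compile-time exception type is wrapped so that
            // gitb-engine can catch it and turn it into a test step report.
            throw new GITBEngineInternalError("Document processing failed", e);
        }
    }

    private void parse(byte[] content) throws Exception {
        // ...actual parsing/validation of the (possibly erroneous) SUT input...
    }
}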

2829 2830 Figure 10-15: GITB Testbed Exception Model 2831

2832 10.1.2.1.6 GITB Test Bed Service: gitb-testbed-service

2833 The gitb-testbed-service module provides the implementation (TestbedServiceImpl class) of the Testbed 2834 Service Specifications. It packages every GITB Testbed component module JAR file into a WAR 2835 (Web Application Archive) which can then be deployed to an application server. It also contains the 2836 necessary files (deployment descriptor - web.xml, JAX-WS RI deployment descriptor - sun-jaxws.xml, Jetty 2837 deployment descriptor - jetty-context.xml) to deploy the WAR file as a web application. The deployment can 2838 be done either manually, by copying the WAR file into the appropriate application server folder, or automatically by using 2839 Maven plugins. The current GITB Testbed implementation uses maven-jetty-plugin to deploy the WAR file 2840 onto a Jetty Application server20.

20 http://www.eclipse.org/jetty/

2841 10.1.2.1.7 Integration of Remote Modules: gitb-remote-modules

2842 The gitb-remote-modules module enables integration of external validation and messaging services and 2843 implements the IModuleLoader interface methods. To integrate a remote module, a module definition XML 2844 file must be created and put into the resources folder. Additionally, a separate Maven module containing Validation 2845 Service or Messaging Service client implementations for the external service must be developed. In this 2846 way, at the start of the GITB Testbed component, the integrated modules are loaded by the ModuleManager class in the 2847 gitb-core module and will be ready to be used. Currently, Validex21, an online XML message 2848 validation service, has been integrated with GITB Testbed.

2849

2850 10.1.3 GITB Execution Interface

2851 The GITB Execution Interface is designed to be the management interface of the GITB Testbed component 2852 to be exposed to the outside world. Users of the GITB Execution Interface can perform their requests through a 2853 Web GUI to a REST server. The latter then delivers these requests to the GITB Testbed component through 2854 TestbedService. Thus, the GITB Execution Interface is designed to manage the test executions within the 2855 GITB Testbed component on a graphical user interface through REST Services. In this section, the design 2856 and implementation details of the GITB Execution Interface are presented. 2857 At the software level, for the client side implementation, there are numerous JavaScript frameworks providing 2858 different functionalities at different levels. For the GITB Execution Interface, the Play Framework was selected. 2859 The latter is an open source Web application framework written in Scala and Java and follows the MVC 2860 (Model View Controller) architecture. The MVC architecture provides a rich API for models and views, and 2861 integrates with the REST services provided by the framework. Because of the model-view separation as well 2862 as the JSON based REST support, the Play Framework fits perfectly with the requirements of the GITB Execution 2863 Interface.

2864 10.1.3.1 How to Use the GITB POC Interface

2865 Figure 10-16 presents the main screen of the GITB Execution Interface Web GUI. On the upper right corner, 2866 information about the authenticated user is shown along with a dropdown menu to manage settings and 2867 invalidate the session information. The Tutorial button navigates users to a page with a viewlet22 on how to 2868 use GITB Execution Interface. Moreover, Systems returns the users to the main screen.

2869

2870 2871 Figure 10-16: GITB Execution Interface Main Screen

21 https://validex.net/
22 Available on https://www.youtube.com/watch?v=4K7eEvUS1UA

2872

2873 When users log in to the system, the main screen retrieves all the available SUTs and displays them. All 2874 users have to register their systems as SUTs in order to be able to execute Test Cases for them. This is 2875 achieved by clicking on the New System button. A popup window is then displayed asking for some information 2876 regarding the SUT, as illustrated in Figure 10-17.

2877 2878 Figure 10-17: Screen for Adding a New System 2879

2880 After entering the required information, the SUT is registered, but not yet ready to be tested. In order to be able to 2881 initiate a test for a SUT, a conformance statement must be defined after clicking on the SUT name. By defining 2882 a conformance statement, users tell the GITB Execution Interface that the SUT conforms to a specific 2883 eBusiness standard or specification. They get a number of Test Cases to prove their conformance. 2884 Creating a conformance statement is a four-step process. In the first step, the appropriate eBusiness domain 2885 is selected. All available specifications of this domain are then retrieved and the user is asked to select the 2886 relevant specification in the second step. In the third step, the actors implemented by the SUT are selected. 2887 Finally, if the selected actors define any optional parameters, they are selected in the fourth step.


2888 Figure 10-18: Steps for Defining a Conformance Statement 2889

2890 After a conformance statement is defined, it should be selected to continue with the Conformance Statement 2891 Detail page. Each SUT actor defines one or more endpoints to be reached over the network. For each 2892 endpoint, a number of configuration parameters related to the SUT itself or its endpoint have to be provided 2893 to GITB Testbed, so that it can use these parameters during test execution. These configuration parameters 2894 can be simple string values (e.g. IP address, port number, etc.) as well as files (public keys, processable text 2895 files, etc.). Finally, all Test Cases that are related to the selected actors are listed at the bottom of this page. 2896 In order to prove the claim of conformance, all these Test Cases have to be executed and passed.


2897 2898 Figure 10-19: Conformance Statement Details Page 2899

2900 After selecting a Test Case, the user is navigated to the test execution page. Test execution is a three-step 2901 process. In the first step, all the SUT configurations are checked against missing inputs. If there are any 2902 missing configurations, the Web GUI does not let the user proceed to the next steps. In the second step, a 2903 testing session within GITB Testbed is created for the user. After that, configuration parameters of each actor 2904 that is simulated by the GITB Testbed component are listed. At this point, SUT admins have to configure 2905 their SUTs according to these configurations. Finally, the third step is where the actual test execution takes 2906 place. Here all the actors, the messaging choreography between them, the validation steps etc. are 2907 displayed by means of sequence diagram elements. The actor role which the SUT plays is indicated with 2908 (SUT). Actors, that are simulated by the GITB Testbed component and the GITB Testbed itself are indicated 2909 with (SIMULATED) and Test Engine, respectively. Test execution starts by clicking the Start button.

2910 2911 Figure 10-20: Test Execution Interface 2912

2913 After a test case is executed, the Web GUI indicates success or failure of each step with green and red 2914 arrows, respectively. Moreover, GITB Testbed creates reports summarizing the results of each test step as 2915 well as their execution time, message content and headers (if it is a messaging step), validated and validator 2916 document content (if it is a validation step) and so on.


2917 2918 Figure 10-21: Test Report Screen Indicating the Errors 2919

2920 The GITB Execution Interface also provides an Admin Panel for system administrators. Through this panel, 2921 system admins can add new domains and specifications and deploy Test Suites.


2922 2923 Figure 10-22: GITB Execution Interface Admin Panel 2924

2925 10.1.3.2 REST API

2926 The GITB Execution Interface’s REST API is responsible for managing the functionalities explained in 2927 previous section and exposing them to the outer-world as well as invoking the TestbedService methods of 2928 GITB Testbed. Furthermore, the API enables persistence of user and Test Case information within a MySQL 2929 database and provides access to them for authorized users. The GITB Execution Interface’s REST API 2930 provides following services: 2931 ● AccountService: Provides methods for user and vendor related operations such as registration, 2932 user/vendor profile retrieval, etc. 2933 ● AuthenticationService: Provides methods for user sessions by providing them access tokens. 2934 Secure services can only be invoked by access tokens. 2935 ● ConformanceService: Provides methods invoked when defining conformance statement for SUTs 2936 ● ReportService: Enables persistence of test results and reports retrieved from GITB Testbed and 2937 provides them to users on demand. 2938 ● RepositoryService: Provides access to Test Cases and resources on demand. 2939 RemoteTestCaseRepository retrieves test artifacts from this service. 2940 ● SystemService: Provides methods for creating and managing SUTs. 2941 ● TestService: Provides methods to enable access to GITB Testbed through TestbedService. 2942 ● TestSuiteService: Provides methods for deployment/undeployment of test suites 2943 ● WebSocketService: Delivers results of execution of test steps to Web GUI through WebSockets.

2944


2945 10.2 Setting Up GITB PoC Testbed

2946 As mentioned before, the source code of the GITB PoC Testbed implementation is accessible in the GitHub 2947 repository: https://github.com/srdc/gitb. Instructions on how to build and run the source code are presented 2948 in the following sections.

2949 10.2.1 GITB Testbed

2950 10.2.1.1 Building

2951 GITB PoC Testbed component uses Maven as a build automation and dependency management tool. This 2952 project can be built by executing the following command within the root folder of the project:

2953 $ mvn clean install

2954 This command builds and compiles all necessary files, executes tests and creates a WAR file located at 2955 gitb-testbed-service/target/gitb-testbed-service-1.0-SNAPSHOT.war

2956 In order to skip test execution, the following command should be executed:

2957 $ mvn clean install -DskipTests=true

2958 10.2.1.2 Running

2959 GITB PoC Testbed implementation can be run after a successful build by executing the following command 2960 within the root folder of the project:

2961 $ mvn jetty:run -pl ./gitb-testbed-service/

2962 10.2.2 GITB Execution Interface

2963 GITB Execution Interface is located under gitb-ui folder. This project is an SBT23 project that uses the Play 2964 Framework. 2965 10.2.2.1 Dependencies

2966 GITB UI application assumes that the following tools are installed and running:

2967  Redis24 2968  MySQL RDBMS 2969  GITB Testbed

2970 10.2.2.2 Configurations

2971 The gitb-ui project can be configured using the parameters located in gitb-ui/conf/application.conf file.

2972 ● HTTP Server: It listens on port 9000 by default. However, this can be configured either by setting the 2973 http.port configuration in the gitb-ui/conf/application.conf file or by running the application with the 2974 -Dhttp.port=${GITB_UI_PORT} parameter. 2975 ● MySQL Database: For this application to work properly, the following DB configurations must be set 2976 in gitb-ui/conf/application.conf: 2977 ○ db.default.url="jdbc:mysql://localhost/gitb?characterEncoding=UTF- 2978 8&useUnicode=true&autoReconnect=true"

23 http://www.scala-sbt.org/
24 http://redis.io/

2979 ○ db.default.user=... 2980 ○ db.default.password=... 2981 ○ db.default.rooturl="jdbc:mysql://localhost/" 2982 ○ db.default.name="gitb" 2983 ● Redis Server: By default, it is assumed that there is a Redis Server available at the 127.0.0.1:6379 2984 address. This can be configured by setting the parameters in gitb-ui/conf/application.conf: 2985 ○ redis.host = "127.0.0.1" 2986 ○ redis.port = 6379 2987 ● GITB Testbed: To run the GITB Execution Interface, an instance of the GITB Testbed should be up 2988 and running. The TestbedService endpoint for the GITB Testbed instance can be configured by 2989 setting the testbed.service.url parameter in gitb-ui/conf/application.conf.

2990 10.2.2.3 User Management

2991 Currently, the following roles are implemented in the project:

2992 ● Vendor Admin: Vendor system administrator. 2993 ● Vendor User: Vendor system user (tester). 2994 ● System Admin: Manages the test engine, users, and repositories, creates new domains, 2995 specifications, deploys/removes test suites. 2996 2997 Unfortunately, there is no method except directly manipulating the database table to change the user roles in 2998 the current version. So, user roles in Users.role (in the form of table.column) have to be updated manually. 2999 The user roles are represented with the following unsigned integers in the database:

3000 ● Vendor Admin: 1 3001 ● Vendor User: 2 3002 ● System Admin: 4

3003 10.2.3 Building & Running

3004 Since the GITB Execution Interface uses the Play Framework, an executable that takes care of the build & 3005 run process exists in the application folder. To do these operations, the following commands must be executed in the 3006 gitb-ui directory:

3007 ● ./activator clean compile to build the application. 3008 ● ./activator run to run the application in debug mode. 3009 ● ./activator clean dist to create a standalone distribution. This command creates a zip file located 3010 under gitb-ui/target/universal/ folder and it can be decompressed into the deployment folder. After 3011 this, there should be an executable script located in bin/gitb-ui path. This script can be executed 3012 directly to start the application in production mode.

3013 10.2.4 Test Suite Deployment

3014 Currently only users with System Admin roles can deploy test suites, compressed in ZIP files. One of the 3015 critical things about the structure of the test suite archives is the Test Suite definition xml file. This file is 3016 formatted according to the TestSuite type defined in the TDL schema (ns:http://www.gitb.com/tdl/v1/) and it 3017 should be located at the root of the test suite archive. Additionally, Test Cases and resources should be 3018 placed into folders in an organized way. Figure 10-23 shows a typical test suite folder hierarchy.


3019

3020 Figure 10-23: Hierarchy within a Test Suite

3021 To deploy a Test Suite, a domain and a specification related to the Test Suite must be created. Then, in the 3022 specification detail page, by using the Deploy test suite button, a Test Suite archive can be uploaded. If the 3023 deployment is successful, Test Cases and actors related to the Test Suite should be displayed.

3024 10.3 Case Studies with POC Test Bed

3025 10.3.1 UBL Use Case - Conformance Tests for PEPPOL BIS4A Invoice Only Specification

3026 The GITB PoC Test Bed implements a Test Scenario for the PEPPOL BIS4A Invoice Only Profile, which is a 3027 real-world eBusiness specification in the public procurement area (see section 15). A Test Suite with a set of 3028 Test Cases has been developed for this purpose. The Test Suite can be found in the test-suites folder which 3029 comes along with the source code. More details regarding the Test Suite and Test Cases can be found in the 3030 following sections.

3031 10.3.1.1 Test Suite Definition

3032 Basically, a Test Suite defines a number of Test Cases and the actors of the specification that take part in a 3033 Test Case. Furthermore, actors may define endpoints which denote the network protocol that the actor is 3034 accessible from, with a set of configuration parameters (e.g. network host, port, etc). As an example in the 3035 Test Suite below, all the Test Cases and actors of this suite are listed. It should be noted that, the Test Suite 3036 does not specify any relationships among the actors, it rather defines the endpoints and their configurations.

Test Suite: Peppol_BIS_4A_Invoice (version 0.1). Actors defined in the suite:

● Sender Access Point: Sends business messages to a Receiver Access Point using the AS2 protocol.
● Receiver Access Point: Receives business messages from the Sender Access Point using the AS2 protocol and validates them.
● Service Metadata Locator: A service which provides a client with the capability of discovering the Service Metadata Publisher endpoint associated with a particular participant identifier.
● Service Metadata Publisher: Provides a service on the network where information about services of specific participant businesses can be found and retrieved.

3038 10.3.1.2 Development of the Necessary Messaging Handlers

3039 The gitb-messaging module provides a set of widely used network protocols (e.g. TCP, HTTP(S), SOAP). 3040 The messaging architecture behind the gitb-messaging module allows extending these protocols in order to 3041 develop more complex ones. For instance, three additional messaging adapters have been developed to 3042 implement this case study. The AS2 adapter is built by extending the HTTPS adapter, Service Metadata 3043 Publisher (SMP) is developed on the HTTP adapter and Service Metadata Locator (SML) is implemented on 3044 top of the DNS adapter. Each new messaging adapter inherits the capabilities of the base adapter; therefore 3045 developing new adapters is basically adding new rules or abilities on already existing protocol 3046 implementations. 3047 10.3.1.3 Definition of Test Artifacts

3048 Test Cases are very likely to require one or more external resources, such as validation schemas or message 3049 templates, during test execution. Therefore, required Test Artifacts must be included within a Test Case and 3050 be referenced with their paths relative to the Test Case. After that, their content is retrieved by the configured 3051 Test Case repository adapter and utilized during the test execution. 3052 A Test Case defines all the test resource imports, namespace and variable declarations, the target actors of the 3053 specification with their roles, and the test steps to execute. Target actors must be selected from the Test Suite 3054 and their role must be specified as either SUT (if the role will be played by the real SUT) or SIMULATED (if 3055 the actor will be simulated by GITB Testbed). If there is to be any communication between actors, the

3056 messaging choreography between them must be defined by a number of transactions (see the example 3057 below). More information can be found in the Test Case below.

PEPPOL-SenderAccessPoint-Invoice-BusDox CONFORMANCE 0.1 The objective of this Test Scenario is to ensure the Sender Access Point (the System Under Test) is capable of querying both SML and SMP as well as submitting a conformant PEPPOL BIS 4A electronic invoice to a Receiver Access Point using the AS2 protocol. Then submitted document is validated by UBL 2.1 schema and PEPPOL Schematron rules.

Peppol_BIS_4A_Invoice/artifacts/UBL/maindoc/UBL-Invoice-2.1.xsd Peppol_BIS_4A_Invoice/artifacts/PEPPOL/BII CORE/BIICORE-UBL-T10-V1.0.sch Peppol_BIS_4A_Invoice/artifacts/PEPPOL/BII RULES/BIIRULES-UBL-T10.sch Peppol_BIS_4A_Invoice/artifacts/PEPPOL/SBDH.sch Peppol_BIS_4A_Invoice/artifacts/PEPPOL/peppol-smp-metadata- template.xml

0088:12345test


urn:oasis:names:specification:ubl:schema:xsd:Invoice- 2::Invoice##urn:www.cenbii.eu:transaction:biitrns010:ver2.0:extended:urn:www.peppol.eu:bis:peppol4a:ver2.0::2.1 urn:oasis:names:specification:ubl:schema:xsd:Invoice-2 urn:www.cenbii.eu:profile:bii04:ver2.0 Invoice

concat("https://", $SenderAccessPoint{ReceiverAccessPoint}{network.host}, ":", $SenderAccessPoint{ReceiverAccessPoint}{network.port})

B-351cd3bce374194b60c770852a53d0e6.iso6523-actorid- upis.localhost.

urn:oasis:names:specification:ubl:schema:xsd:Invoice- 2::Invoice##urn:www.cenbii.eu:transaction:biitrns010:ver2.0:extended:urn:www.peppol.eu:bis:peppol4a:ver2.0::2.1


urn:www.cenbii.eu:profile:bii04:ver2.0

$as2_output{business_message} $as2_output{business_message} $as2_output{business_message} $as2_output{business_header}

3058

3059 10.3.2 Using a GITB Compliant Validation Service Within A Test Case (here: Validex)

3060 GITB PoC implementation integrates an online validation tool, called Validex, to demonstrate its capabilities 3061 of integrating external testing services according to GITB Service Specifications.

3062 10.3.2.1 How to Integrate

3063 In order to integrate an external content validation tool with GITB Testbed, it must implement the GITB 3064 Content Validation Service Specifications to wrap its functionalities. To realize this integration, a new module, 3065 gitb-validator-validex, has been created with the ValidationServiceImpl class that implements the 3066 ValidationService web service interface. With this implementation, Validex becomes accessible through the 3067 GITB Testbed Validation Service. Requests to this service are delivered to Validex and responses are 3068 wrapped with the internal test reporting format and returned to the tester. 3069 The module definition of the Validex Validator can be seen below. One difference from the module definitions 3070 provided in the gitb-validators module is an additional serviceLocation attribute which denotes the endpoint of 3071 the wrapper validation service. When the test engine utilizes this validator wrapping the functionalities of 3072 Validex, it calls this service which delivers the request to Validex, as mentioned before.

Validex Validator 1.0 Validex wrapper validation service


3073

3074 10.3.2.2 Definition of the Test Case

3075 In order to benefit from external validation tools within a Test Case, it is enough to set the id of the 3076 validator in the Verify step. For instance, as can be seen from the module definition of Validex, the id attribute 3077 ValidexValidator is set as the validation handler in the Verify step below.

PEPPOL-SenderAccessPoint-Invoice-Validex CONFORMANCE 0.1 The objective of this Test Scenario is to ensure the Sender Access Point (the System Under Test) can submit a conformant PEPPOL BIS 4A electronic invoice to a Receiver Access Point using the AS2 protocol. Then submitted document is validated by Validex. 0088:12345test urn:oasis:names:specification:ubl:schema:xsd:Invoice- 2::Invoice##urn:www.cenbii.eu:transaction:biitrns010:ver2.0:extended:urn:www.peppol.eu:bis:peppol4a:ver2.0 ::2.1 urn:www.cenbii.eu:profile:bii04:ver2.0


$as2_output{business_message} "Invoice document" 3078

3079

3080

3081


3082 11 GITB Compliance

3083 11.1 GITB Compliance Levels

3084 Test Beds can achieve different levels of compliance with the GITB recommendations:

3085  GITB Framework Compliance implies that a Test Bed provides the specific functional capabilities 3086 required for eBusiness testing (see GITB Phase 1, CWA 16093:2010, Chapter 5.7). It thereby fulfils 3087 quality criteria in eBusiness testing. 3088  GITB Service Compliance implies that the Test Bed implements the specifications outlined in this 3089 report (see GITB Phase 3, Chapters 5 to 9). It is thereby able to share and reuse testing resources 3090 with other GITB Service Compliant Test Beds.

3091 Both levels of compliance are complementary and fulfil different purposes – functional capabilities for GITB 3092 Framework Compliance and interoperability for GITB Service Compliance (see Table 11-1). The criteria to 3093 achieve GITB Compliance are described in the following sections.

3094 Table 11-1: GITB Compliance Levels (each level is listed with its description, criteria and added value for the Test Bed)

GITB Framework Compliance
● Description: Comply with the functional requirements stated in the GITB Testing Framework.
● Criteria: Identified in GITB Phase 1 - CWA 16093:2010 (Chapter 5.7): provide functional capabilities (test execution & test case model).
● Added value for the Test Bed: Demonstrate the specific functional capabilities required for eBusiness testing.

GITB Service Compliance
● Description: Provide Service interfaces and produce test report artifacts.
● Criteria: Identified in GITB Phase 3 (see Chapters 5 to 9): provide Service interfaces (for all or part of the Test Services defined in this report); produce Test Report artifacts complying with the GITB recommendations.
● Added value for the Test Bed: Be able to share and reuse testing resources with other GITB Service Compliant Test Beds; become part of the GITB network of Test Beds.

3095

3096 11.2 GITB Framework Compliance

3097 The criteria for achieving GITB Framework Compliance have been elaborated in GITB Phase 1 as 3098 engineering-level requirements (CWA 16093:2010, Chapter 5.7) and have been reviewed and 3099 complemented in the current phase.

3100 A Test Bed can claim to be GITB Framework Compliant if it provides the functional capabilities related to

3101 a. the test case model (FUC-TCM) – criteria specified in Table 11-2, 3102 b. the test execution (FUC-TCE) – criteria specified in Table 11-3.

3103 Table 11-2: GITB Framework Compliance – Criteria related to Test Case Model (FUC-TCM). Each criterion is listed with its source (CWA 16093:2010 or GITB3 - Addition) and, where provided, a comment.

FUC-TCM/R01 Capability of representing test configuration information
1) Capability of representing declaration of messaging protocol to be used [CWA 16093:2010]: For each message exchanged in a test case, the protocol to be used needs to be clearly identified.
2) Capability of representing, for each tested actor, the type of configuration parameters that are needed [GITB3 - Addition]: For each tested actor, the list of configuration parameters that are needed from the SUT should be identified.

FUC-TCM/R02 Capability of representing test procedural information
1) Capability of representing message to be sent [CWA 16093:2010]
2) Capability of representing message choreography [CWA 16093:2010]
3) Capability of representing conditional expression (test step) for test case [CWA 16093:2010]
4) Capability of representing iterative expression [CWA 16093:2010]
5) Capability of representing manual steps [CWA 16093:2010]

FUC-TCM/R03 Capability of representing test verification information
1) Capability of using external document for verification [CWA 16093:2010]: The tool offers the capability of using external reference material (documents, services, schemas...) for verification of exchanged messages/content.

FUC-TCM/R05 Capability of representing test suite (containing test cases)
1) Capability of representing precedence relationships between test cases [CWA 16093:2010]
2) Capability of grouping test cases into a test suite [GITB3 - Addition]

FUC-TCM/R05 Capability of representing test data
1) Capability of representation of user's defined values [CWA 16093:2010]: The tool offers the capability of referencing a data value set for the context of the test.
2) Capability of representation of automatically generated values [CWA 16093:2010]: The tool is capable of generating data for testing purposes.

3105 Table 11-3: GITB Framework Compliance – Criteria related to Test Execution Model (FUC-TCE). Each criterion is listed with its source and, where provided, a comment.

FUC-TCE/R01 Capability of test preparation and setup
1) Capability of providing the setup information to SUTs [CWA 16093:2010]: Capability of the tool to provide the SUT operator with the test configuration parameters.
2) Capability of requesting SUT parameters and information [CWA 16093:2010]: Capability of the tool to request the test configuration parameters from the SUT operator.
3) Capability of test case customisation [CWA 16093:2010]: Before the execution of the test.

FUC-TCE/R02 Capability of controlling test steps
1) Capability of display of test flow and test progress [CWA 16093:2010]
2) Capability of requesting/storing user's information [CWA 16093:2010]: The user can upload evidence or input data in the tool.
3) Capability of binding user's information into test [CWA 16093:2010]: The user can provide pointers (URL, URI...) to information external to the test bed.
4) Capability of manual execution of test steps [CWA 16093:2010]

FUC-TCE/R03 Capability of message exchange
1) Capability of sending/receiving message [CWA 16093:2010]: The tool is able to send and receive messages. Removed payload from description.
2) Capability of uploading/downloading message [CWA 16093:2010]: Manual upload/download of exchanged messages.
3) Capability of capturing message [CWA 16093:2010]: Automatic capture of exchanged messages by the test bed.

FUC-TCE/R04 Capability of message pre-/post-processing
1) Capability of decomposing messages [CWA 16093:2010]: The test bed is capable of processing the messages exchanged in the supported protocols. It is capable of using message content for subsequent steps.
2) Capability of retrieving a value from message [CWA 16093:2010]
3) Capability of generating a message template from a schema [CWA 16093:2010]
4) Capability of generating test data for a specific message template [CWA 16093:2010]
5) Capability of message transformation [CWA 16093:2010]: Add here: for better readability (Binary to Text, HL7 ER7 to Tree or XML).

FUC-TCE/R05 Capability of validation and recovery
1) Capability of detecting unknown problems [CWA 16093:2010]: Remove unknown. The test execution can detect errors and report them.
2) Capability of employing the existing validation engines [CWA 16093:2010]: Capability of using external validation tools and displaying/processing their report.
3) Capability of recovery from errors [CWA 16093:2010]: Whenever possible the validation tool shall continue processing message analysis although an error has been identified. Test case execution should not abort upon error discovery when this does not impact the test case.

FUC-TCE/R06 Capability of reporting
1) Capability of display of error location [CWA 16093:2010]: This section is about the reporting of logged evidence to the testers.
2) Capability of display of test log information [CWA 16093:2010]
3) Capability of display of detailed test result [CWA 16093:2010]

FUC-TCE/R07 Capability of B2B system emulation
1) Capability of emulation of an arbitrary business unit [CWA 16093:2010]: The test bed offers the capability to emulate a business unit relevant to the tested context. Simulation tool.

3108 11.3 GITB Service Compliance

3109 The added value of GITB Service Compliance is that Test Beds are able to share and reuse Testing Resources with other GITB Service Compliant Test Beds.

3111 A Test Bed or testing service can claim to be GITB Service Compliant if:

• It provides Service interfaces for all or part of the Test Services defined in this report (see section 8).
• It produces Test Report artifacts complying with the GITB recommendations (see section 7).

3114 A document validation software tool or service can claim to be a GITB Compliant Document Validation Service if:

• It conforms to the GITB Document Validation Service Specification and performs the intended document validations as a result of the validation operation calls.
• It produces test reports conformant with GITB Test Report Format as a result of the validation operations defined in the service specification.

3120 A software system or service can claim to be a GITB Compliant Messaging (Simulator) Service if:

• It conforms to the GITB Messaging (Simulator) Service Specification and performs the intended communications with or between SUTs as a result of the operations defined in the specification.
• It produces test reports conformant with GITB Test Report Format as a result of those messaging operations.

3125 Do you have a Test Bed performing conformance and interoperability testing at the level of choreography, message protocol, and business document content?

3127 A Test Bed service can claim to be a GITB Compliant Test Bed Service if:

• It conforms to the GITB Test Bed Service Specification and performs the intended testing process remotely based on the operation calls on the service.
• It can map its internal testing process definition to the GITB Test Presentation Language (TPL) format and return it as a result of the corresponding operation defined in the specification.
• It produces test reports conformant with GITB Test Report Format as a result of the execution of a test scenario.

3134 The implementers of the GITB service specifications are free to use their own approach, software architecture, or technology to implement their tools. The only requirement is to conform to the web service specifications, so that their functionality is offered to others in a common way.

3137 The criteria to become GITB Service Compliant are summarized in Table 11-4:

3138 Table 11-4: GITB Service Compliance – Criteria

Test Presentation Language
  Definition: A model to represent a conformance or interoperability test scenario. Its purpose is to present the flow and the test steps in a granular way to users and other testing software; it is not executable.
  Methodology: You need to map your internal test execution models or test scripting languages to this model to describe your test scenarios (see section 6).

Test Report Format
  Definition: A model to represent test reports.
  Methodology: You have to wrap your own report format within this model (see section 7).

Document Validation Service
  Definition: A service to wrap an existing validation capability so that it can be reused by others.
  Methodology: You have to adapt your existing implementation to expose your own Test Bed logic through the Document Validation Service (see section 8.1).

Messaging/Simulator Service
  Definition: A service to support communication with SUTs.
  Methodology: You have to adapt your existing implementation to expose your own Test Bed logic through the Messaging/Simulator Service (see section 8.2).

Test Bed Service
  Definition: A service to enable remote control of Test Beds for the execution of a conformance or interoperability testing scenario.
  Methodology: You have to adapt your existing implementation to expose your own Test Bed logic through the Test Bed Service (see section 8.3).

3139 The main benefit of being GITB Service Compliant is that Test Beds can cooperate in a network and share their Testing Capabilities and Testing Resources with other GITB Service Compliant Test Beds. Their own Testing Capabilities will then be usable (as add-ons) by other GITB Service Compliant Test Beds, as they make use of standardized plug-in interfaces and exhibit the expected level of standardization for their Test Artifacts.


3144 The networking of both kinds of Test Beds is illustrated in Figure 11-1.

3145

3146 Figure 11-1: Networking of Test Beds Based on GITB Service Interfaces 3147

3148 11.4 Examples of GITB-Compliant Test Beds

3149 Within GITB phase 3, several legacy Test Beds have started to become GITB-compliant. They will be listed 3150 in the Appendix along with their claim of conformance.

3151


3152 Part III: GITB Test Registry and Repository (TRR) Specifications and Prototype

3153 The GITB Architecture foresees a Test Registry and Repository (TRR) as the access point for any published Testing Resource. Testing Resources include not only Test Artifacts like Test Cases, script rules, and Test Suites, but also test components with their APIs and interfaces.

3156 Part III of this report summarizes GITB Phase 3 outcomes related to the TRR.

3157 • TRR specifications in the form of an ADMS profile (Section 12).
• The prototype implementation based on Joinup (Section 13).

3159 This part is relevant for testing experts and architects that are interested in registries and repositories for 3160 managing, archiving and sharing Test Resources.

3161 12 GITB Test Registry and Repository (TRR) Specifications

3162 12.1 Role of TRR in the GITB Architecture

3163 In the GITB Architecture, the Test Registry and Repository (TRR) is aimed at supporting the Test Bed in managing, archiving and sharing distributed Testing Resources in a central location, accessible by other parties. The TRR can also be used as a long-term archival platform for Testing Resources. The TRR is part of the GITB testing infrastructure; however, it is not considered part of a Test Bed, as it is a software component that is independently deployed, managed and accessed. The TRR is accessible through a graphical user interface called the TRR Client or through a Web service interface.

3169

3170 Figure 12-1: TRR in the GITB Architecture

3171 When it comes to managing and sharing resources in a distributed environment, a distinction is typically made between the concept of a registry, which lists items with pointers to find them, and the concept of a repository, which stores the actual items. The TRR can fulfil both purposes.

3174 The main capability offered by the TRR is the search functionality that allows users to find information about and locate existing Testing Resources and Test Beds. The TRR features are described in detail in section 12.5.


3177 12.2 User Classes and Roles

3178 The following table summarizes the users and user classes of Test Beds, referring to the roles defined in the GITB Testing Framework (see section 4.3).

3180 Table 12-1: List of the Test Bed Actors

Test Experts: provider of test beds or testing services
  - Test Designer: creator and editor of Test Resources
  - Test Manager: executor or execution facilitator of Test Suites
  - Test Bed Provider: operator of a Test Bed

Test Participants: owner or operator of a SUT
  - End User: all organizations, from private and public sectors, which implement eBusiness scenarios
  - Industry Consortia and Formal SDOs: communities of end-users, public authorities and other interested parties
  - Software Vendors: vendors of enterprise applications that intend to be compliant with existing eBusiness Specifications
  - Testing Laboratories: laboratories specialized in increasing efficiency and reliability of interoperable implementation of standards

(Roles as defined in the GITB Testing Framework, see section 4.3.)

3181

3182 As the TRR is part of the GITB testing infrastructure, it is expected that Test Bed users will also be users of the TRR. However, as the TRR is an independent system, new user roles or actors need to be introduced. The suggested TRR user roles are listed in Table 12-2. Test Bed users and business users as previously defined can have any of the roles specific to the GITB Compliant TRR platform.

3186 Table 12-2: List of the TRR Actors

TRR Administrator: administrator of the TRR platform.
Workspace Administrator: Workspace User who has administrator rights on a workspace, a private place for a set of users, where users can administer their folders and Testing Resources.
Workspace User: Authenticated User invited to participate in a workspace by a Workspace Administrator.
Authenticated User: Anonymous User who has created a user account on the platform and has logged in.
Anonymous User: user who is not logged in to the platform.

3187


3188 12.3 Basic Concepts

3189 12.3.1 Testing Resources Managed by the TRR

3190 Testing Resource is a generic term introduced in GITB that designates any part of a Test Bed (Test Artifact, 3191 Test Service interface, core or plug-in Test Bed Component), or a combination of these.

3192 The Testing Resources managed by the TRR are the following resources:

3193 • Test Artifacts like Test Cases, script rules, and Test Suites,
• Test Components (also known as Testing Capability Components) with their APIs and interfaces.

3195 The following table lists the Testing Resources defined in the GITB Testing Framework (see section 4) which 3196 are managed by the TRR.

3197 Table 12-3: Different Types of Testing Resources managed by the TRR

Test Artifact / Test Logic Artifact
  - Document Assertion Set: a package of artifacts used to validate a Business Document, typically including one or more of the following: a schema (XML), consistency rules, codelists, etc. These artifacts are generally machine-processable.
  - Test Case: an executable unit of verification and/or of interaction with an SUT, corresponding to a particular testing requirement, as identified in an eBusiness Specification.
  - Test Suite: defines a workflow of Test Case executions and/or Document Validator executions.

Test Capability Component
  - Messaging Adapter: specialized for messaging protocol stacks such as ebXML Messaging, Web Services with SOAP or REST, AS2/AS4, and the underlying transports: SMTP, HTTP, etc.
  - Document Validator: responsible for validating the content of the documents retrieved from the Messaging Adapters in terms of both structure and semantics, such as EDI: ANSI, EDIFACT, XML.

3199

3200 12.3.2 Metadata

3201 As registry and repository, the TRR can contain either the actual Testing Resource, or a reference to a 3202 Testing Resource contained in another instance of system (TRR, Test Bed), or a reference to an actual Test 3203 Bed. To facilitate their management, discovery and identification, the data stored or referenced within the 3204 TRR need to be associated with metadata.

3205 The National Information Standards Organization25 defines metadata as the “structured information that 3206 describes, explains, locates, or otherwise makes it easier to retrieve, use, or manage an information 3207 resource. Metadata is often called data about data or information about information”. Metadata provides data 3208 or information that enables to make sense of data, concepts and real-world entities. Metadata is a particular 3209 kind of information, associated to Testing Resources and to Test Beds. For example, title, author, creation 3210 date, name are default metadata associated to the Testing Resources.

25 http://www.niso.org/publications/press/UnderstandingMetadata.pdf

3211 For providing proper metadata to the Testing Resources, the good practice26 is to reuse existing 3212 vocabularies developed by standards and specifications. For example, the following general purpose 3213 standards and specification can be reused: for published material (text, images), FOAF or ISA 3214 Core Vocabularies27 for people and organisations, SKOS for concept collections. These standards and 3215 specifications are available as RDF datasets.

3216 The use of RDF to structure, organize and model metadata is aligned with the actual ways to model and 3217 present data and follows the Linked Open Data principles.

3218 12.4 The Asset Description Metadata Schema application profile for TRR

3219 The Asset Description Metadata Schema (ADMS)28 was originally drafted to describe semantic 3220 interoperability assets. An ADMS application profile is a specification for data interchange that adds 3221 additional constraints to the original ADMS, so the scope of the ADMS is extended or restricted for specific 3222 purpose by modifying required terms, classes and properties, etc.

3223 A specific metadata schema has been developed for identifying and describing the Testing Resources 3224 managed by the TRR by reusing and combining existing terms from different standards and specifications. 3225 The metadata schema for Testing Resources is an ADMS application profile called ADMS.TRR.

3227 Table 12-4: The primary concepts introduced by ADMS

Asset Repository: a system or service that provides facilities for storage and maintenance of descriptions of Assets and Asset Distributions, and functionality that allows users to search and access these descriptions. GITB concept: the TRR.

Asset: an abstract entity that reflects the intellectual content of the asset and represents those characteristics of the asset that are independent of its physical embodiment. GITB concept: an abstract design or model of a Testing Resource.

Asset Distribution: a particular physical embodiment of an Asset. A Distribution is typically a downloadable computer file (but in principle it could also be a paper document or API response) that implements the intellectual content of an Asset. GITB concept: a concrete representation of a Test Resource.

3228

3229 In the following sections, the developed ADMS application profile is presented, and in particular we:

3230 • introduce a namespace for Testing Resources (section 12.4.2),
• list the classes to reuse and the ones to introduce (section 12.4.3),
• for each class, list its properties and their scope: mandatory, recommended, optional (section 12.4.4),
• for some properties, point to an existing vocabulary (e.g. Eurovoc domains category) or specify a new vocabulary (section 12.4.5).

26 This is what we see from the European initiatives about software reuse and promotion of the Linked Open Data: https://joinup.ec.europa.eu/community/ods/description

27 For example: http://www.w3.org/TR/vocab-regorg/

28 http://www.w3.org/TR/vocab-adms/

3235 Controlled vocabularies, which are predefined lists of values to be used as values for a specific property in 3236 the metadata schema, are used in the metadata schema for Testing Resource. Common controlled 3237 vocabularies make metadata understandable across systems.

3238 12.4.1 Logical view of the metadata

[Figure 12-2 is a UML class diagram showing the logical view of the metadata: the existing ADMS classes (adms:AssetRepository, adms:Asset, adms:AssetDistribution, adms:Identifier, skos:Concept) and the ADMS.TRR extension classes (admstrr:TestBed, admstrr:TestAsset, admstrr:TestLogicArtifact, admstrr:TestCapabilityComponent, admstrr:TestSuite, admstrr:TestCase, admstrr:DocumentAssertionSet, admstrr:MessagingAdapter, admstrr:DocumentValidator, admstrr:PayloadFile), together with their properties and relationships.]

3240 Figure 12-2: The TRR Metadata Schema

3241 The ADMS.TRR application profile extends the existing ADMS specification29. The extended ADMS specification30 provided within the Joinup platform is the reference for this application profile. This means that some of the classes presented here have been reused as-is from the ADMS specification.

3244 12.4.2 Namespaces

3245 In the following sections, classes and properties are grouped under headings ‘mandatory’, ‘recommended’ 3246 and ‘optional’. These terms have the following meaning.

3247 • Mandatory class: a receiver of data MUST be able to process information about instances of the class; a sender of data MUST provide information about instances of the class.
• Recommended class: a receiver MUST be able to process information about instances of the class; a sender SHOULD provide the information if it is available.
• Optional class: a receiver MUST be able to process information about instances of the class; a sender MAY provide the information but is not obliged to do so.
• Mandatory property: a receiver MUST be able to process the information for that property; a sender MUST provide the information for that property.

29 http://www.w3.org/TR/vocab-adms/

30 https://joinup.ec.europa.eu/catalogue/distribution/Extended_ADMS_Specification_v100zip

3255 • Recommended property: a receiver MUST be able to process the information for that property; a sender SHOULD provide the information for that property if it is available.
• Optional property: a receiver MUST be able to process the information for that property; a sender MAY provide the information for that property but is not obliged to do so.

3259 The table below lists the namespace prefixes that are used in the following sections with the corresponding 3260 namespaces URIs.

3261 Table 12-5: Namespaces of the ADMS application profile

Namespace Prefix Namespace URI

adms http://www.w3.org/ns/adms#

admstrr http://purl.org/adms/trr/

dcat http://www.w3.org/ns/dcat#

dcterms http://purl.org/dc/terms/

doap http://usefulinc.com/ns/doap#

foaf http://xmlns.com/foaf/0.1/

qb http://purl.org/linked-data/cube#

rad http://www.w3.org/ns/radion#

rdfs http://www.w3.org/2000/01/rdf-schema#

schema http://schema.org/

skos http://www.w3.org/2004/02/skos/core#

spdx http://spdx.org/rdf/terms#

swid http://standards.iso.org/iso/19770/-2/2009/

trove http://sourceforge.net/api/trove/index/rdf#

v http://www.w3.org/2006/vcard/ns#

wdrs http://www.w3.org/2007/05/powder-s#

xsd http://www.w3.org/2001/XMLSchema#

3262

3263 12.4.3 Application Profile Classes

3264 These classes include the Test Bed and Test Suite classes (at least one of them is mandatory depending of 3265 the scope) and all classes that appear as the range of mandatory properties in the description of instances of 3266 these two classes.

3267 Table 12-6: Mandatory Classes

Class name | Usage note for the Application Profile | URI
Asset | Abstract entity that reflects the intellectual content of an Asset and represents those characteristics that are independent of its physical embodiment. This abstract entity combines the FRBR31 entities work (a distinct intellectual or artistic creation) and expression (the intellectual or artistic realization of a work). The physical embodiment of an Asset is called an Asset Distribution. A particular Asset may have zero or more Distributions. | adms:Asset
Test Bed | An actual test execution environment for Test Suites or Test Services. Contains Testing Capabilities and various Test Suites or Document Assertions. | admstrr:TestBed
Test Suite | Defines a workflow of Test Case executions and/or Document Validator executions. The Test Suite class is a subclass of the Asset class. | admstrr:TestSuite
Publisher | Organisation making information available. This can be an organisation that has customized a particular standard to answer its specific business needs. | foaf:Agent
Identifier | Identifier of an Asset. | adms:Identifier
Specification Type | The type of specification the Test Bed or the Testing Resource refers to, using a controlled vocabulary (see section 12.4.5). | skos:Concept

3268

3269 The following classes are classified as Recommended to allow the user to give additional details about the 3270 content of a Test Suite or a Test Bed.

3271 Table 12-7: Recommended Classes

Class name | Usage note for the Application Profile | URI
Asset Distribution | Particular physical embodiment of an Asset, which is an example of the FRBR entity manifestation (the physical embodiment of an expression of a work). A Distribution is typically a downloadable computer file (but in principle it could also be a paper document or API response) that implements the intellectual content of an Asset. A particular Distribution is associated with one and only one Asset, while all Distributions of an Asset share the same intellectual content in different physical formats. | adms:AssetDistribution
Document Assertion Set | A package of artifacts used to validate a Business Document, typically including one or more of the following: a schema (XML), consistency rules, codelists, etc. These artifacts are generally machine-processable. The Document Assertion Set class is a subclass of the Asset class. | admstrr:DocumentAssertionSet
Test Case | An executable unit of verification and/or of interaction with an SUT, corresponding to a particular testing requirement, as identified in an eBusiness Specification. The Test Case class is a subclass of the Asset class. | admstrr:TestCase
Payload File | Represents a concrete document or part of it. The Payload File class is a subclass of the Asset Distribution class. | admstrr:PayloadFile
Messaging Adapter | Specialized for messaging protocol stacks such as ebXML Messaging, Web Services with SOAP or REST, AS2/AS4, and the underlying transports: SMTP, HTTP, etc. The Messaging Adapter class is a subclass of the Asset class. | admstrr:MessagingAdapter
Document Validator | Responsible for validating the content of the documents retrieved from the Messaging Adapters in terms of both structure and semantics, such as EDI: ANSI, EDIFACT, XML. The Document Validator class is a subclass of the Asset class. | admstrr:DocumentValidator
Standardization Level | Level of standardization of the Test Artifacts (e.g. Level 1, Level 2, Level 3) of an Asset, using a controlled vocabulary (see section 12.4.5). | skos:Concept

31 Cataloguing Section. Functional Requirements for Bibliographic Records, section 3. Entities. http://archive.ifla.org/VII/s13/frbr/frbr_current3.htm

3273 Table 12-8: Optional Classes

Class name | Usage note for the Application Profile | URI
Asset Repository | System or service that provides facilities for storage and maintenance of descriptions of Assets and Asset Distributions, and functionality that allows users to search and access these descriptions. An Asset Repository will typically contain descriptions of several Assets and related Asset Distributions. | adms:AssetRepository
Representation Technique | Machine-readable language in which a Distribution is expressed, using a controlled vocabulary (see section 12.4.5). | skos:Concept

3274

3275


3276 12.4.4 Application Profile Properties per Class

3277 12.4.4.1 Asset

3278 Table 12-9: Mandatory Properties

Property | Range | Usage note | Card. | GITB Concept
adms:identifier | adms:Identifier | identifier for the Asset | 0..n | artifactId
dcterms:title | rdfs:Literal | name of the Asset | 1..n | artifactName
dcterms:type | skos:Concept | type of the Asset, using a controlled vocabulary (see section 12.4.5) | 1..n |

3280 Table 12-10: Recommended Properties

Property | Range | Usage note | Card. | GITB Concept
owl:versionInfo | rdfs:Literal | version number or other version designation of the Asset | 0..n |
dcterms:publisher | foaf:Agent | organisation making the Asset available | 1..n | authors
dcterms:description | rdfs:Literal | descriptive text for the Asset | 1..n | description
dcterms:spatial | dct:Location | geographic region or jurisdiction to which the Asset applies, using a controlled vocabulary (see section 12.4.5) | 0..n |
dcat:distribution | adms:AssetDistribution | implementation of the Asset in a particular format | 0..n |

3282 Table 12-11: Optional Properties

Property | Range | Usage note | Card. | GITB Concept
dcterms:issued | rdfs:Literal typed as xsd:dateTime | date of formal issuance of this version of the Asset | 0..1 | origDate
dcterms:modified | rdfs:Literal typed as xsd:dateTime | date of latest update of the Asset | 1..1 | modifDate
dcat:keyword | rdfs:Literal | word or phrase to describe the Asset | 0..n | keywords

3283
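To illustrate how these properties are combined, the following non-normative sketch describes an Asset (here a Test Suite) with the mandatory properties of Table 12-9 and some recommended and optional ones, using the Turtle syntax also used in section 12.5.3.2. The resource URIs (:exampleTestSuite, :exampleIdentifier) and the base namespace are hypothetical; the dct:type value reuses the HL7 term of the Specification Type vocabulary of section 12.4.5.1.

    @prefix dct:     <http://purl.org/dc/terms/> .
    @prefix adms:    <http://www.w3.org/ns/adms#> .
    @prefix admstrr: <http://purl.org/adms/trr/> .
    @prefix skos:    <http://www.w3.org/2004/02/skos/core#> .
    @prefix dcat:    <http://www.w3.org/ns/dcat#> .
    @prefix owl:     <http://www.w3.org/2002/07/owl#> .
    @prefix xsd:     <http://www.w3.org/2001/XMLSchema#> .
    @prefix :        <http://example.org/trr/> .   # example base namespace

    # Hypothetical Test Suite described as an Asset: mandatory properties
    # (adms:identifier, dct:title, dct:type) plus some recommended/optional ones.
    :exampleTestSuite a admstrr:TestSuite ;
        adms:identifier :exampleIdentifier ;
        dct:title "Example Test Suite"@en ;
        dct:type <http://purl.org/adms/trr/HL7> ;
        owl:versionInfo "1.0" ;
        dct:description "Illustrative description of the Test Suite."@en ;
        dct:issued "2015-06-01T00:00:00Z"^^xsd:dateTime ;
        dcat:keyword "example"@en .

    # The identifier is an adms:Identifier carrying the identifier string (see section 12.4.4.12).
    :exampleIdentifier a adms:Identifier ;
        skos:notation "TS-0001" .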

3284 12.4.4.2 Asset Distribution

3285 Table 12-12: Mandatory Properties

Property | Range | Usage note | Card. | GITB Concept
dcat:accessURL | rdfs:Resource | URL of the Distribution | 1..n |

3287 Table 12-13: Recommended Properties

Property | Range | Usage note | Card. | GITB Concept
dcat:downloadURL | rdfs:Resource | direct link to a downloadable file in a given format | 0..n |
dcat:mediaType | dct:FileFormat | media type of the Distribution as defined by IANA32, using a controlled vocabulary | 0..1 |
dcterms:license | dct:LicenseDocument | conditions or restrictions for (re-)use of the Distribution | 0..1 |
adms:representationTechnique | skos:Concept | language in which the Distribution is expressed, using a controlled vocabulary (see section 12.4.5). Note: this is different from the file format, e.g. a ZIP file (file format) could contain an XML schema (representation technique) | 0..1 | e.g. XSD, DICOM, EDI messages (X12, EDIFACT, ODETTE, VDA), rule script files (Schematron, JESS, XPATH)

3288
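As an illustration only, a Distribution of the Asset sketched in section 12.4.4.1 could be described as follows. The resource URIs and the concrete media-type and representation-technique term URIs are hypothetical; only the property names come from Tables 12-12 and 12-13.

    @prefix adms: <http://www.w3.org/ns/adms#> .
    @prefix dcat: <http://www.w3.org/ns/dcat#> .
    @prefix :     <http://example.org/trr/> .   # example base namespace

    # Hypothetical Distribution: mandatory dcat:accessURL plus recommended
    # download link, media type and representation technique.
    :exampleTestSuite dcat:distribution :exampleTestSuiteZip .

    :exampleTestSuiteZip a adms:AssetDistribution ;
        dcat:accessURL   <http://example.org/downloads/example-test-suite> ;
        dcat:downloadURL <http://example.org/downloads/example-test-suite.zip> ;
        dcat:mediaType   <http://www.iana.org/assignments/media-types/application/zip> ;
        adms:representationTechnique <http://purl.org/adms/trr/representationtype/XMLSchema> .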

3289 Optional properties

3290 See document about the extended ADMS specification for the exhaustive list of optional properties.

3291 Asset Repository

3292 See document about the extended ADMS specification for the exhaustive list of optional properties.

3293 Test Asset

3294 The Test Asset class is an abstract subclass of the Asset class and therefore inherits all the latter's 3295 properties and relationships.

3296 Table 12-14: Recommended Properties

Property | Range | Usage note | Card. | GITB Concept
admstrr:actor | skos:Concept | the actor or business process role associated with this Asset | 0..n |
admstrr:businessProcess | skos:Concept | the business process associated with this Asset; the type of process that provides a way to unambiguously identify the business activity to which the Asset is associated | 0..n |
admstrr:productType | skos:Concept | the type of product associated with this Asset | 0..n |

32 IANA (Internet Assigned Numbers Authority). MIME Media Types. http://www.iana.org/assignments/media-types

3297

3298 12.4.4.3 Test Bed

3299 The Test Bed class is a subclass of the Asset class and therefore inherits all the latter's properties and 3300 relationships. The expected properties for Test Bed are: identifier, title, publisher and landingPage.

3301 12.4.4.4 Test Capability Component

3302 The Test Capability Component class is an abstract subclass of the Test Asset class and therefore inherits 3303 all the latter's properties and relationships.

3304 The Messaging Adapter and Document Validator classes are subclasses of the Test Capability Component 3305 class.

3306 12.4.4.5 Test Logic Artifact

3307 The Test Logic Artifact class is an abstract subclass of the Test Asset class and therefore inherits all the 3308 latter's properties and relationships.

3309 The Test Suite, Test Case and Document Assertion Set classes are subclasses of the Test Logic Artifact 3310 class.

3311 Table 12-15: Recommended Properties

Property | Range | Usage note | Card.
admstrr:standardizationLevel | skos:Concept | the level of standardization of the Test Artifacts | 0..1

3312

3313 12.4.4.6 Test Suite

3314 The Test Suite class is a subclass of the Test Logic Artifact class and therefore inherits all the latter's 3315 properties and relationships.

3316 Table 12-16: Recommended Properties

Property | Range | Usage note | Card.
admstrr:testCase | admstrr:TestCase | the associated Test Case | 0..n

3317


3318 12.4.4.7 Test Case

3319 The Test Case class is a subclass of the Test Logic Artifact class and therefore inherits all the latter's 3320 properties and relationships.

3321 Table 12-17: Recommended Properties

Property | Range | Usage note | Card.
admstrr:documentAssertionSet | admstrr:DocumentAssertionSet | the associated Document Assertion Set | 0..n
admstrr:payloadFile | admstrr:PayloadFile | the associated Payload File | 0..n
admstrr:uses | admstrr:TestCapabilityComponent | the associated Test Capability Component | 0..n

3322
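The relationship properties of sections 12.4.4.6 and 12.4.4.7 can be combined as in the following non-normative Turtle sketch; all resource URIs and the base namespace are hypothetical.

    @prefix dct:     <http://purl.org/dc/terms/> .
    @prefix admstrr: <http://purl.org/adms/trr/> .
    @prefix :        <http://example.org/trr/> .   # example base namespace

    # Hypothetical Test Suite -> Test Case -> supporting resources,
    # using the relationship properties listed above.
    :exampleTestSuite a admstrr:TestSuite ;
        dct:title "Example Test Suite"@en ;
        admstrr:testCase :exampleTestCase .

    :exampleTestCase a admstrr:TestCase ;
        dct:title "Example Test Case"@en ;
        admstrr:documentAssertionSet :exampleAssertionSet ;
        admstrr:payloadFile :examplePayload ;
        admstrr:uses :exampleValidator .

    :exampleAssertionSet a admstrr:DocumentAssertionSet .
    :examplePayload      a admstrr:PayloadFile .
    :exampleValidator    a admstrr:DocumentValidator .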

3323 Optional Properties

3324 12.4.4.8 Payload File

3325 The Payload File class is a subclass of the Asset Distribution class and therefore inherits all the latter's 3326 properties and relationships.

3327 12.4.4.9 Messaging Adapter

3328 The Messaging Adapter class is a subclass of the Test Capability Component class and therefore inherits all 3329 the latter's properties and relationships.

3330 12.4.4.10 Document Validator

3331 The Document Validator class is a subclass of the Test Capability Component class and therefore inherits all 3332 the latter's properties and relationships.

3333 12.4.4.11 Specification Type

3334 12.4.4.12 Identifier

3335 Table 12-18: Mandatory Properties

Property | Range | Usage note | Card.
skos:notation | rdfs:Literal with a datatype reflecting the identifier scheme | character string for the identifier | 1..1

3336

3337 12.4.4.13 Publisher

3338 Table 12-19: Mandatory Properties

Property | Range | Usage note | Card.
dct:type | skos:Concept | type of the Publisher, using a controlled vocabulary (see section 12.4.5) | 0..n

3339

3340 12.4.4.14 Standardization Level

3341 Table 12-20: Mandatory Properties

Property | Range | Usage note | Card.
rdfs:label | rdfs:Literal | label for the Standardization Level | 0..1

3342

3343 12.4.4.15 Representation Technique

3344 Table 12-21: Recommended Properties

Property | Range | Usage note | Card.
skos:notation | rdfs:Literal | label for the Representation Technique | 0..n

3345

3346 12.4.5 Controlled Vocabularies to be Used

Property | URI used for class | Vocabulary | Vocabulary URI
dcterms:type | Asset | Specification Type vocabulary (see section 12.4.5.1) | http://purl.org/adms/trr/specificationtype/
adms:representationTechnique | Asset Distribution | ADMS Representation Technique Vocabulary | http://purl.org/adms/trr/representationtype/
admstrr:businessProcess | Asset | ADMS.TRR Business Process Vocabulary, based on the UNCEFACT Catalog of Common Business Process | http://www.ebxml.org/specs/bpPROC.pdf (needs to be enriched for each domain not covered)
admstrr:businessProcessRole | Asset | ADMS.TRR Business Process Role Vocabulary, based on the UNCEFACT Catalog of Common Business Process | http://www.ebxml.org/specs/bpPROC.pdf (needs to be enriched for each domain not covered)
admstrr:productType | Asset | Common Procurement Vocabulary (CPV) | http://eur-lex.europa.eu/legal-content/EN/ALL/?uri=CELEX:32008R0213
admstrr:standardizationLevel | Test Logic Artifact | ADMS.TRR Standardization Level Vocabulary | http://purl.org/adms/trr/standardizationLevel
dct:type | Publisher | ADMS Publisher Type vocabulary | http://purl.org/adms/publishertype/
dct:spatial | Asset, Asset Repository | MDR Countries Named Authority List33, MDR Places Named Authority List34 | http://publications.europa.eu/resource/authority/country, http://publications.europa.eu/resource/authority/place/

3347

3348 12.4.5.1 Specification Type of Asset

Code | URI | Definition

HL7 | http://purl.org/adms/trr/HL7 | see CWA 16408:2012 (related terms: IHE)
WS-I-BP2.0 | http://purl.org/adms/trr/WS-I-BP2.0 | see CWA 16408:2012
Autogration

MOSS

ePRIOR

eSENS

OpenPEPPOL

3349

3350 12.4.5.2 Representation Type of Asset Distribution

3351 Based on the Representation technique of ADMS (http://purl.org/adms/representationtechnique/).

33 Publications Office of the EU. Metadata Registry. Authorities. Countries. http://publications.europa.eu/mdr/authority/country/

34 Publications Office of the EU. Metadata Registry. Authorities. Places. http://publications.europa.eu/mdr/authority/place/

Code URI - Definition

Schematron Schematron

JESS

XPATH

DICOM

X12

EDIFACT

ODETTE

VDA

HumanLanguage Human Language

Diagram Diagram

UML Unified Modelling Language

XMLSchema XML Schema

SKOS Simple Knowledge Organization System

RDFSchema Resource Description Framework Schema

Genericode genericode

IDEF Integration Definition

BPMN Business Process Modeling Notation

Archimate Archimate

SBVR Semantics of Business Vocabulary and Rules

DTD Document Type Definition

OWL Full/DL/Lite

SPARQL SPARQL Query Language for RDF

SPIN SPARQL Inference Notation

WSDL Web Service Description Language

WSMO Web Service Modelling Ontology

KIF Knowledge Interchange Format

Prolog Prolog

Datalog Datalog


Code URI - Definition

RuleML Rule Markup Language

RIF

SWRL Rule Language

TopicMaps Topic Maps

CommonLogic

RelaxNG Relax NG

3352 12.4.5.3 Standardization Level of Test Logic Artifact

3353 Extracted from CWA_16408.

Code | Definition
Level 1 | Standardization of a general wrapper or header to the artifact (meta-data standardization).
Level 2 | Level 1 plus standardization of external references or interfaces to other artifacts.
Level 3 | Level 1 plus Level 2 plus whole content standardization (e.g. detailed XML schema reflecting the entire structure of the artifact).

3355

3356 12.5 Features

3357 12.5.1 Overview

3358 The TRR features are similar to the features of a Registry and a Repository, where storage, retrieval and 3359 search are the main features. Figure 12-3 shows the interactions between users and the GITB compliant 3360 TRR. The features of the TRR are:

3361 • the workspace and folders management,
• the Testing Resources management,
• the bulletin board,
• the Testing Resources search,
• the general administration.

3366 Users need to be able to access features both through a graphical user interface and through a Web service 3367 interface. Section 12.7 gives more detail about how users interact with the TRR features.


3368

3369 Figure 12-3: Use Case Diagrams of the TRR

3370 12.5.2 Concepts

3371 The following table introduces the concepts used to describe the features of the TRR.

3372 Table 12-22: TRR Concepts

Workspace: a private place for a set of users, where users can administer their folders and Testing Resources. Some Testing Resources and information about Test Beds require confidentiality, while others are publicly available. (This concept is optional.)
Folder: Testing Resources are organized in folders. It is possible to create a folder tree to organize and store Testing Resources. An archive is a top-level folder.
Testing Resource: the content managed by the TRR, introduced in section 12.3.1.
Bulletin board: a public place in the TRR to share announcements and comment on them.

3373

3374 12.5.3 Search Testing Resources

3375 This is about searching among the existing Test Resources.

Actors All users

Pre condition None

Priority High

3376

3377 Requirements

3378 REQ-1. Perform a free text search: the user enters plain text.
REQ-2. Perform a search on meta-data fields: the user enters values for specific meta-data fields to refine the scope of the search.
REQ-3. Query the system: the user specifies a query in a formal language (e.g. SQL, SPARQL or others).
REQ-4. Download the Test Resources: the user performs a search (free text, on meta-data or through a query) and can download Test Resources associated with the results of the search.
REQ-5. Access the Test Resources or Test Bed: the user accesses a Test Bed or a Test Resource stored in a remote system and referenced within the TRR.

3388 12.5.3.1 Typical searches

3389 • Which testing resources are available for a specific context and a specific testing purpose?
  o Testing resources = as defined
  o Context = Industry A, Country B, e-business specification C, role D (an actor in the e-business specification)
  o Testing purpose = validation or simulation
• Which test beds / test providers have expertise in a particular context?
  o Context = Industry A, Country B, e-business specification C, role D (an actor in the e-business specification)
• Which testing resources are available for different layers of e-business specifications/standards?
  o messaging
  o business documents
  o message choreography
  o profile

3402 12.5.3.2 Examples of search queries and their answer

3403 This section gives examples of business queries and instantiations of the TRR metadata schema that answer these queries. The examples are expressed in RDF, using the Turtle syntax.

3405 The header of the following examples contains the required namespaces:

3406 @prefix :        <http://example.org/trr/> .   # example base namespace
@prefix rdf:     <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix rdfs:    <http://www.w3.org/2000/01/rdf-schema#> .
@prefix owl:     <http://www.w3.org/2002/07/owl#> .
@prefix xsd:     <http://www.w3.org/2001/XMLSchema#> .
@prefix dct:     <http://purl.org/dc/terms/> .
@prefix adms:    <http://www.w3.org/ns/adms#> .
@prefix admstrr: <http://purl.org/adms/trr/> .
@prefix skos:    <http://www.w3.org/2004/02/skos/core#> .
@prefix foaf:    <http://xmlns.com/foaf/0.1/> .
@prefix dcat:    <http://www.w3.org/ns/dcat#> .

3417 • What are the simulators that this Test Bed is providing?

3418 Shows all instances of class MessagingAdapter linked to the specified Test Bed (dcterms:hasPart).

3419 :hl7TestBed1 a admstrr:TestBed ; 3420 dct:title "HL7 Gazelle" ; 3421 dct:hasPart :hl7Simulator1 ; 3422 dct:hasPart :hl7Simulator2 ; 3423 dct:hasPart :hl7Validator1 ; 3424 dct:hasPart :hl7TestSuite1 . 3425 3426 :hl7Simulator1 a admstrr:MessagingAdapter ; 3427 dct:title "Patient Demographics Query simulator" ; 3428 admstrr:businessProcessRole ; 3429 admstrr:businessProcessRole . 3430 3431 :hl7Simulator2 a admstrr:MessagingAdapter ; 3432 dct:title "PIX Identity Cross-Reference Manager" ; 3433 admstrr:businessProcessRole .

3434 • Which validation services are available for a particular eBusiness specification/standard?

3435 Show all instances of class DocumentValidator linked to the specified eBusiness specification/standard 3436 (dcterms:type).

3437 :hl7Validator1 a admstrr:DocumentValidator ; 3438 dct:title "Patient Demographics Query validator" ; 3439 admstrr:businessProcessRole ; 3440 admstrr:businessProcessRole ; 3441 dct:type . 3442 3443 :hl7Validator2 a admstrr:DocumentValidator ; 3444 dct:title "GazelleHL7v2Validator External Validation Service" ; 3445 dct:type .

3446 • Which Test Beds support me in testing a particular eBusiness specification / standard?

3447 List all the Test Beds that contain (dcterms:hasPart) an Asset whose specification type (dcterms:type) is equal to a particular standard (constrained through a controlled vocabulary detailed in paragraph 12.4.5.1). A possible SPARQL formulation is sketched after the example data below.


3449 :hl7TestBed1 a admstrr:TestBed ; 3450 dct:title "HL7 Gazelle" ; 3451 dct:hasPart :hl7Simulator1 ; 3452 dct:hasPart :hl7Simulator2 ; 3453 dct:hasPart :hl7Validator1 ; 3454 dct:hasPart :hl7Validator2 ; 3455 dct:hasPart :hl7TestSuite1 . 3456 3457 :hl7TestBed2 a admstrr:TestBed ; 3458 dct:title "HL7 SRDC TestBed" ; 3459 dct:hasPart :hl7TestSuite2 . 3460 3461 :hl7Simulator1 a admstrr:MessagingAdapter ; 3462 dct:title "Patient Demographics Query simulator" ; 3463 admstrr:businessProcessRole ; 3464 admstrr:businessProcessRole . 3465 3466 :hl7Simulator2 a admstrr:MessagingAdapter ; 3467 dct:title "PIX Identity Cross-Reference Manager" ; 3468 admstrr:businessProcessRole . 3469 3470 3471 :hl7Validator1 a admstrr:DocumentValidator ; 3472 dct:title "Patient Demographics Query validator" ; 3473 admstrr:businessProcessRole ; 3474 admstrr:businessProcessRole ; 3475 dct:type . 3476 3477 :hl7Validator2 a admstrr:DocumentValidator ; 3478 dct:title "GazelleHL7v2Validator External Validation Service" ; 3479 dct:type . 3480 3481 :hl7TestSuite1 a admstrr:TestSuite ; 3482 adms:identifier :identifierTs1 ; 3483 dct:title "HL7V3-P1-TestSuite" ; 3484 admstrr:businessProcessRole ; 3485 admstrr:businessProcessRole ; 3486 owl:versionInfo 1.0; 3487 dct:publisher ; 3488 dct:issued "2010-11-15T10:10:03-07:00"^^xsd:dateTime ; 3489 dct:modified "2011-11-22T10:10:03-07:00"^^xsd:dateTime ; 3490 dct:description "Test Suite for Profile 1 of HL7"@en ; 3491 dct:type ; 3492 dcat:keyword "WebServices"@en ; 3493 dcat:keyword "SOAP"@en ; 3494 dcat:keyword "HTTP"@en ; 3495 dcat:keyword "WSDL"@en ; 3496 dct:spatial ; 3497 admstrr:testCase :tc1 . 3498 3499 :hl7TestSuite2 a admstrr:TestSuite ; 3500 adms:identifier :identifierTs2 ; 3501 dct:title "HL7V3-TestSuite from SRDC" ; 3502 dct:type ; 3503 dcat:keyword "WebServices"@en ; 3504 dcat:keyword "WSDL"@en ; 3505 dct:spatial ; 3506 admstrr:testCase :tc2 .
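Assuming the TRR exposes its ADMS.TRR descriptions through a SPARQL endpoint (SPARQL is one of the formal query languages mentioned in REQ-3), the query above could be expressed as in the following sketch. The HL7 term of the Specification Type vocabulary (section 12.4.5.1) is used as an example value; the endpoint itself and the exact term used as dct:type in the data are assumptions.

    PREFIX dct:     <http://purl.org/dc/terms/>
    PREFIX admstrr: <http://purl.org/adms/trr/>

    # Which Test Beds support testing of the HL7 specification?
    # TestBed -> dct:hasPart -> Asset whose dct:type matches the requested vocabulary term.
    SELECT DISTINCT ?testBed ?title
    WHERE {
      ?testBed a admstrr:TestBed ;
               dct:title ?title ;
               dct:hasPart ?asset .
      ?asset   dct:type <http://purl.org/adms/trr/HL7> .
    }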

3507 • Given a particular actor or profile, what are the simulators, the validation suites and the Test Suites available?

3509 Show all instances of class MessagingAdapter, DocumentValidator and TestSuite linked to the specified 3510 actor (admstrr:businessProcessRole).

3511 • Given a particular validation formalism (e.g. Schematron), which Test Bed supports it?

3512 List all the Test Beds that contain (dcterms:hasPart) an Asset linked to an AssetDistribution (dcat:distribution) or, when applicable, a PayloadFile (admstrr:payloadFile) whose representation technique (adms:representationTechnique) is equal to a particular validation formalism (constrained through a controlled vocabulary detailed in paragraph 12.4.5.2). A possible SPARQL formulation is sketched after the example below.


3515 :das_01_xml a admstrr:PayloadFile ; 3516 dct:description "XML encoding of das1." ; 3517 dcat:accessURL ; 3518 adms:representationTechnique ; 3519 dct:format .
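A possible SPARQL formulation of this query, again assuming a SPARQL endpoint over the TRR metadata and using the Schematron term of the Representation Technique vocabulary (section 12.4.5.2) as an example value, is sketched below.

    PREFIX dct:     <http://purl.org/dc/terms/>
    PREFIX dcat:    <http://www.w3.org/ns/dcat#>
    PREFIX adms:    <http://www.w3.org/ns/adms#>
    PREFIX admstrr: <http://purl.org/adms/trr/>

    # Which Test Beds support the Schematron validation formalism?
    # TestBed -> dct:hasPart -> Asset, linked to an AssetDistribution or a PayloadFile
    # whose representation technique is the Schematron term of the vocabulary.
    SELECT DISTINCT ?testBed
    WHERE {
      ?testBed a admstrr:TestBed ;
               dct:hasPart ?asset .
      { ?asset dcat:distribution ?dist . }
      UNION
      { ?asset admstrr:payloadFile ?dist . }
      ?dist adms:representationTechnique <http://purl.org/adms/trr/representationtype/Schematron> .
    }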

3520

3521 12.5.4 Testing Resources management

3522 This is a set of actions to store and manipulate Testing Resources in the TRR.

Actors If the workspace concept exists (for TRRs that require some confidentiality): Workspace User, Workspace Administrator; otherwise: Authenticated User

Pre condition The user is logged in to the TRR

Priority High

3523

3524 Requirements

3525 REQ-6. Add a Testing Resource: the user adds a new Testing Resource to a folder, or adds a reference to a Test Bed or to an existing Testing Resource stored in another system.
REQ-7. Modify a Testing Resource: the user renames a Testing Resource.
REQ-8. Delete a Testing Resource: the user deletes a Testing Resource.
REQ-9. Share a Testing Resource: the user shares a Testing Resource with other users of the system, or makes it public (publish).
REQ-10. Subscribe to a Testing Resource: the user can subscribe to a Testing Resource to get notifications when its content changes.
REQ-11. Change the version of a Testing Resource: the user manually changes the version of a Testing Resource.
REQ-12. Add meta-data to a Testing Resource: the user modifies the values of the meta-data associated with a Testing Resource.
REQ-13. Add a comment to a Testing Resource: the user can write evaluations and comments associated with a Testing Resource.

3540 12.5.5 Secondary Features

3541 12.5.5.1 Workspace and Folders Management

3542 It is possible to create a folder tree to organize and store Testing Resources. An archive is a top-level folder.

Actors Workspace Administrator

Pre condition The user is logged in to the TRR

Priority Low

3543

3544 Requirements


3545 REQ-14. Add an archive: the user adds a new top-level folder in their workspace.
REQ-15. Add a folder: the user adds a new folder to an archive.
REQ-16. Modify a folder: the user renames a folder.
REQ-17. Delete a folder: the user deletes a folder.
REQ-18. Share a folder: the user shares a folder with other users of the system, or makes it public (publish).
REQ-19. Subscribe to a folder: the user can subscribe to a folder to get notifications when the content of the folder changes.

3554 12.5.5.2 Bulletin board

Actors Authenticated User, Workspace User, Workspace Administrator

Pre condition The user is logged in to the TRR

Priority Low

3555

3556 Requirements

3557 REQ-20. Post an announcement: the user posts an announcement on the bulletin board.
REQ-21. Edit an announcement: the user edits a previously posted announcement.
REQ-22. Delete an announcement: the user deletes an existing announcement.
REQ-23. Comment on an announcement: the user comments on an existing announcement.
REQ-24. Subscribe to an announcement: the user can subscribe to an announcement to get notifications when its content changes or when new comments are posted.

3564 12.5.5.3 General administration

3565 A role is associated with a set of privileges. Depending on the roles given by the administrator, a user has access to specific functionalities.

Actors TRR Administrator

Pre condition The user is logged in to the TRR as administrator

Priority Medium (meta-data management is high)

3567

3568 Requirements

3569 REQ-25. Manage roles: the user creates a new role, and modifies and deletes existing roles.
REQ-26. Manage users: the user creates a new user, and modifies and deletes existing users.
REQ-27. Manage roles of users: the user assigns a role to a user or removes a role from a user.
REQ-28. Manage workspaces: the user creates a new workspace, and modifies and deletes existing workspaces.
REQ-29. Add users to a workspace: the user adds existing users to an existing workspace.
REQ-30. Manage meta-data: the user creates new meta-data fields to describe the Testing Resources.


3577 12.6 Process View

3578 This part explains the TRR processes and how they communicate. It focuses on the runtime behaviour of the 3579 TRR due to the user interactions with the TRR.

3580

3581 Figure 12-4: Publish new Test Suite for the Latest Version of an eBusiness Specification 3582


3583

3584 Figure 12-5: Find Available Testing Resources for an eBusiness Specification 3585


3586

3587 Figure 12-6: Publish a File or an Interface of new Testing Resources in TRR 3588

3589 12.7 External Interfaces

3590 12.7.1 User Interfaces

3591 The TRR provides a web-based user interface to publish, search and download the Test Resources for the 3592 user who does not want to use the GITB Test Bed interface.

3593 12.7.2 Software Interfaces

3594 The TRR is connected with the GITB TestBed component (and particularly the Test Deployment Manager, 3595 part of the Test Bed) through a component called the GITB TRR Client.

3596 12.7.3 Communications Interfaces

3597 The TRR contains a messaging interface called the Test Repository Services, which makes the TRR core functionality available. The messaging interface is based on standard messaging protocols such as ebXML, SOAP and REST. It is specified in CWA 16408, Chapter 17.4, pages 148-151. The GITB TRR Client leverages this messaging interface.

3601 Table 12-23: External Interfaces

General Search Functions
  - Get Test Artifacts Matching a Pattern

Administration Functions
  - Create an Archive
  - Duplicate an Archive
  - Delete an Archive
  - Set Access Rights for an Archive

Archival Functions
  - Store a Test Artifact
  - Download a Test Artifact
  - Select a Test Artifact or a Set of Artifacts
  - Transfer a Test Artifact or a Set of Artifacts

3602

3603


3604 13 Test Registry and Repository (TRR) Prototype Implementation

3605 As part of GITB 3, a prototype implementation for TRR is performed with a reduced set of functionality based 3606 on the Joinup platform. A first working prototype of the TRR is expected to be released on Joinup at the 3607 beginning of September 2015. To simplify the implementation of the TRR within Joinup, the set of features 3608 supported by the TRR and the ADMS.TRR have been simplified.

3609 The following sections provide an overview of the prototype implementation.

3610 13.1 Joinup

3611 Joinup is a collaborative platform created by the European Commission with the following capabilities:

3612 • Sharing of information like news, case studies and events about a project,
• Cataloguing interoperability solutions and software,
• Searching the catalogue.

3615 Joinup is open source and uses ADMS extensively for content description.

3616 The main reasons for using Joinup to host the GITB TRR are:

3617 • the existing features of Joinup cover the required GITB TRR features,
• Joinup is released as an open source project which is actively maintained,
• the sustainability of the GITB TRR is assured after the GITB project ends,
• the mission of ISA, the European Commission programme behind Joinup, is aligned with the mission of GITB and the TRR.

3622 13.2 TRR Joinup Functional Specification

3623 13.2.1 Use Case Diagram

3624 The main features of the TRR are the following:

3625 • management of Test Resources (creation, view, update, deletion),
• search of Test Resources.

3627 The use case model describes the functional requirements for a specific workflow. More specifically, it shows 3628 the interactions between the actors and the system from a user’s point of view. The following figure provides 3629 an overview of the different use cases foreseen for the simplified version of the TRR.

3630


[Figure 13-1 is a use case diagram showing the TRR use cases (Create, Update, Delete, View and Search Test Resource) and the two actors, Joinup Member and Anonymous User.]

3632 Figure 13-1: TRR Joinup Use Cases 3633

3634 13.2.2 Actors

3635 13.2.2.1 Anonymous User

3636 This actor represents anybody who has access or potential access to the Joinup platform, who can be 3637 logged in or not in the platform but does not belong to any Joinup project. This actor is allowed to search and 3638 view the Test Resources.

3639 13.2.2.2 Joinup Member

3640 The Joinup Member actor represents all members of the Joinup platform that belong to at least one 3641 repository. The actor has the same authorisations as the anonymous user and can thus search and view 3642 Test Resources.

3643 When the user is a registered user, the user is able to create, update and delete Test Resources within a 3644 Repository for Test Resources.

3645 The Joinup Member can also reference existing Test Resources inside the Project page he belongs to. Once 3646 a Test Resource has been created in a repository, it can be referenced in different interoperability solutions, 3647 project, repository, etc.

3648 For example, a Test Resource about eSens created in the TRR repository could be referenced in the eSens 3649 project, in the GITB project, etc.


3650 13.2.3 Use Cases

3651 A detailed explanation of each use case is provided in this section, accompanied by the corresponding mock-ups.

3653 13.2.3.1 Search Test Resources within the Joinup Platform

3654 Table 13-1: Search Test Resources in Joinup Actor Anonymous User, Joinup Project Member

Trigger Ad Hoc

Description Whenever the actor wants to consult any of the Test Resources they can access the Joinup platform and search for the relevant Test Resources.

Preconditions The Test Resources are available on the platform.

Post conditions Not applicable

Basic flow 1) The actor browses to the existing search section of the Joinup platform.

2) Search for the Test Resource using the existing search capability of Joinup, filtering on some of the fields introduced in section 13.2.4: Business process, Standard / eBusiness specification, Actor, Type

3) View the result of the search.

Alternative flow Not applicable

Exceptions Not applicable

Includes Not applicable

Priority High priority

Frequency of use Continuous

Business rules Not applicable

Special requirements Not applicable

Assumptions Not applicable

Additional comments No authentication is required to search the Test Resources.

3655

3656 Example of typical search:

3657 Which testing resources are available for a specific context and a specific testing purpose?

3658 • Testing Resources = as defined

• Context = Industry A, Country B, e-business specification C, role D (an actor in the e-business specification)

• Testing purpose = validation or simulation (type of Test Resource)

3662

A new "Test Resource" filter becomes available

3663

3664 Figure 13-2: Search Test Resources – Mock-up

3665 Once the search has been executed, the results page is shown.

3666


3667 13.2.3.2 View Test Resources

3668 Table 13-2: View Test Resources – Specification in Joinup Actor Anonymous User, Joinup Project Member

Trigger Ad Hoc

Description Whenever the actor wants to view any of the Test Resources they can access the Joinup platform and select the relevant Test Resource(s).

Preconditions The Test Resources are available on the platform.

Post conditions Not applicable

Basic flow 1) The actor browses the Joinup platform.

2) The actor searches for Test Resources using the search capability of Joinup.

3) The actor clicks on an individual Test Resource for further details, which brings the user to the detailed view of the Test Resource with its fields and allows the associated distribution to be downloaded.

Alternative flow Not applicable

Exceptions Not applicable

Includes Not applicable

Priority High priority

Frequency of use Continuous

Business rules Not applicable

Special Not applicable requirements

Assumptions Not applicable

Additional No authentication is required to view the Test Resources. comments

3669

3670 After performing a search action, the actor can click the name of the Test Resource he wants to access in order to open the detailed view of that Test Resource.

3672


3673

3674 Figure 13-3: View Test Resource – Mock-up 3675

3676 Note: Some fields, introduced in section 13.2.4 of this document, are not shown here.

3678

3679


3680 13.2.3.3 Create & Update Test Resources

3681 Table 13-3: Create & Update Test Resources – Specification in Joinup Actor Joinup Project Member

Trigger The member wants to create or update a Test Resource.

Description To create a Test Resource, the actor accesses the Joinup platform, clicks on Propose your... and selects "Test Resource".

To update a Test Resource, the actor can click on edit on the view page.

Preconditions The actor needs to be a registered member, i.e. the user needs to have the right to create or update content.

Post conditions Not applicable

Basic flow 1) The actor browses the Joinup platform where he chooses to update a Test Resource or create a new Test Resource.

2) The actor fills out the required fields of the Test Resource.

3) The actor saves and publishes the Test Resource

4) The Test Resource is saved under a federated Repository

Alternative flow 1) The actor browses the Joinup platform where he chooses to update a Test Resource or create a new Test Resource.

2) The actor fills out the required fields of the Test Resource.

3) The actor saves the Test Resource as draft.

4) The actor re-opens and continues the Test Resource.

5) The actor saves and publishes the Test Resource.

Exceptions Not applicable


Includes Not applicable

Priority High priority

Frequency of use Continuous

Business rules Not applicable

Special requirements Not applicable

Assumptions Not applicable

Additional comments Not applicable

3682

3683 The following figures are mock-ups of the Test Resource creation/update form.

3684 Note: these mock-ups give general indications but will probably differ once the form is implemented.


3686

3687 Figure 13-4: Test Resource Creation Form – Mock-up (1) 3688


3689 3690 Figure 13-5: Test Resource Creation/Update Form – Mock-up (2) 3691


3692 13.2.3.4 Delete Test Resources

Actor Joinup Project Member

Trigger The member wants to delete a Test Resource.

Description To delete a Test Resource, the actor clicks on delete on the view page.

Preconditions The user needs to be a registered member, i.e. the user needs to have the right to create, update and delete content on Joinup.

Post conditions Not applicable

Basic flow 1) The actor browses the Joinup platform where he chooses an existing Test Resource.

2) The actor deletes the Test Resource.

Alternative flow

Exceptions Not applicable

Includes Not applicable

Priority High priority

Frequency of use Continuous

Business rules Not applicable

Special requirements Not applicable

Assumptions Not applicable

Additional comments Not applicable


13.2.4 Fields of Test Resources

To limit the complexity of the technical development required to integrate the TRR in Joinup, it has been decided to reuse as much as possible the existing fields of the interoperability solution form (see section 13.2.4.1).

When possible, an existing field is slightly modified to match the vocabulary of the TRR (see section 13.2.4.2).


13.2.4.1 Reused Fields

Field name Field type Comments

ID

Name

Description

Distribution

Solution category Do not add to the form if a specific page for Test Resources is created; otherwise, add Test Resource to the existing select list

Solution type Shall be extended to contain the Test Resource type, that is: Test Bed, Test Suite, Test Case, Document Assertion Set, Messaging Adapter, Document Validator, Test Assertion

Keywords Besides allowing the user to associate keywords with a Test Resource, this field will also be used to specify whether a Test Resource can work with the GITB reference implementation and whether the Test Resource is generic or not.

Geographic coverage

Status

Publisher

Licence

Homepage or Test Resource Link Used to hold a reference to a Test Resource that is stored in a remote repository (covers the case when the Joinup TRR is used as a registry only)


13.2.4.2 Updated Fields

Field name Field type Comments

Business process Select checkbox Called Themes previously, based on an existing taxonomy (http://eurovoc.europa.eu/). The existing taxonomy can be used as it is for now.

The type of process that provides a way to unambiguously identify the business activity to which the Asset is associated

Solution category Select list This is an existing list called Solution category to categorize the interoperability solutions. The list contains the following elements: Framework, Service, Tool. A Test Resource is a particular interoperability solution.


Reference to another Test Resource Asset Called "Reference to another interoperability solution" previously

Allow the user to represent the following relationships:

 hasPart (a TestBed references some Test Resources like TestCase or DocumentValidator, etc.)

 testCase (a TestSuite is composed by a set of TestCase)

 documentAssertionSet (a TestCase is composed by a set of DocumentAssertionSet)

 uses (a TestCase uses a MessagingAdapter or/and a DocumentValidator)

Relation Type Select list Default values are : Next Version, Previous Version, Translation, Included Asset, Related Asset, Sample

Versioning between Test Resources needs to be supported, so Next Version and Previous Version are important.

Desired values are :

 Next Version and Previous Version

 either: Included Asset to have a generic way to link Test Resource (e.g. a Test Suite contains several Test Cases)

 or: contains (instead of hasPart), testCase, documentAssertionSet, uses

Standard / eBusiness specification Asset Called "Reference to another interoperability solution" previously

Allow the user to represent the following relationships:

 Link to an existing standard or specification

Actor List of strings Called Keywords previously

The actor or business process role associated to this Asset. This is specific to a particular standard / eBusiness specification. Ideally, it would be an evolving text, i.e. once a user enters a new text that was not previously in the controlled vocabulary, it is added to the controlled vocabulary and becomes available for other users.

Ex: PatientIdentitySource, PatientDemographicConsumer, etc.


Part IV: GITB Application and Validation based on Use Cases from the Automotive Industry, Healthcare and Public Procurement

In GITB, use cases are the basis for defining Testing Scenarios, instantiating the GITB Testing Architecture and developing Test Artifacts for the PoC implementation or other GITB-compliant Test Beds.

Part IV of this report describes testing scenarios for the previously selected business use cases from different industries. The general approach for applying GITB is first presented in section 14, before describing its application to the use cases:

 Public Procurement  OpenPEPPOL (Chapter 15), eSENS (Chapter 16) and Connecting Europe Facility (CEF) (Chapter 17)

 In eHealth  Clinical Document Architecture (Chapter 18) and IHE XDS (Chapter 19)

 In the automotive and manufacturing industry  (work in progress)

Part IV is targeted at eBusiness users, standards development organizations and industry consortia that are interested in applying the Test Bed Architecture to their eBusiness scenarios.

14 Applying GITB in Use Cases

14.1 Approach

Figure 14-1 outlines a step-wise approach for translating eBusiness scenarios into testing requirements and creating testing solutions based on the GITB Principles and Framework (see section 4).

Figure 14-1: Applying GITB in Use Cases

The starting point is the business user's need for implementing and testing one or more eBusiness Scenarios. Business users define the relevant set of eBusiness Specifications as well as the actors involved in the eBusiness interactions with their roles. From the eBusiness Scenarios, the people testing the business scenarios, typically business users responsible for implementation or the integrators or software vendors working with the business users, elaborate on two types of testing requirements. On the one hand, they analyze "what to test" by deriving the exact Verification Scope from the eBusiness Specifications. On the other hand, they determine "how to test" by specifying the testing environment with its operational requirements. From the testing requirements, the Test Designers and Test Managers can set up the appropriate Test Services and Test Artifacts supporting the Testing Scenarios.

14.2 Deriving Testing Requirements

Two types of testing requirements have to be taken into account prior to designing a testing solution: the Verification Scope ("What to test?"), which can be derived from the eBusiness Specifications, and the testing environment ("How to test?"), which determines the operational testing requirements that have to be met by an appropriate testing solution.

14.2.1 Verification Scope ("What to Test?")

When implementing eBusiness Scenarios, business users rely on one or more eBusiness Specifications referring to the different layers of eBusiness: Business Process, Business Document and Messaging layer.

At the Business Document Layer, the Verification Scope may comprise structural and semantic validations as well as cross-layer validations with the Messaging Layer (see Table 14-1); a minimal validation sketch follows the table.

Table 14-1: Verification Scope ("Test Patterns") for Business Document Layer Validation

Type of Validation Verification Scope Description

Structural validation Document syntax and message structure Testing whether messages conform to the structure definitions, e.g. as defined by EDIFACT or XML document schemas (XSD)

Data types UN/CEFACT Core Data Type Catalogue (CDT Catalogue)

Document assembly Testing whether messages conform to naming and design rules, e.g. as defined by OAGi or UN/CEFACT Core Components Business Document Assembly (CCBDA)

Mandatory / optional fields Testing whether all mandatory fields are correctly filled, e.g. as defined by content definition (e.g. XSD)

Semantic validation Vocabulary and code list verification Testing whether data fields comply with defined vocabulary, code lists (e.g. DUNS, ISO, UNECE, ...) or core components (e.g. UN/CEFACT CCL, ...)

Business Document header definitions Testing whether document headers are correct, e.g. as defined by UN/CEFACT Standardized Business Document Header (BDH) or OAGi BOD's application area

Business rules Testing of business rules, e.g. as specified by Schematron

QoS Equivalent Business Document versions Testing whether "equivalent" versions of the same document could be used for the same transaction

Equivalent syntax versions Testing whether "equivalent" document syntaxes could be used for the same transaction, e.g. different implementations of syntax-neutral Business Document specifications

Consistency of message header and Business Document content Testing whether message header and Business Document content are aligned
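The following sketch illustrates the structural and business-rule test patterns of Table 14-1 using the lxml library in Python. It is a minimal, illustrative example only: the schema and Schematron file names are placeholders, and a real Test Service would be driven by the Test Artifacts referenced from the test case.

# Minimal sketch: structural validation against an XML Schema followed by
# business-rule validation with Schematron. File names are placeholders.
from lxml import etree
from lxml.isoschematron import Schematron

def validate_document(instance_path, xsd_path, schematron_path):
    doc = etree.parse(instance_path)

    # Structural validation: document syntax and mandatory/optional fields (XSD)
    xsd = etree.XMLSchema(etree.parse(xsd_path))
    structurally_valid = xsd.validate(doc)

    # Semantic validation: business rules expressed in Schematron
    sch = Schematron(etree.parse(schematron_path), store_report=True)
    rules_valid = sch.validate(doc)

    return {
        "structure": structurally_valid,
        "structure_errors": [str(e) for e in xsd.error_log],
        "business_rules": rules_valid,
    }

if __name__ == "__main__":
    # Placeholder artifact names; real names come from the relevant Test Artifacts.
    print(validate_document("invoice.xml", "document-schema.xsd", "business-rules.sch"))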


At the Messaging Layer, the Verification Scope comprises structural validation, such as testing messaging protocols, validating message headers and testing the discovery of endpoints. It may also comprise validations for QoS and other validations (see Table 14-2).

Table 14-2: Verification Scope ("Test Patterns") for Messaging Layer Validation

Type of Validation Verification Scope Description

Structural validation Messaging Protocol Testing transport and communication level protocols, e.g. as defined by ebXML Messaging (ebMS), SOAP, EDIFACT X12, RosettaNet Implementation Framework (RNIF), Minimal Lower Layer Message Transport protocol (MLLP)

Message header Testing whether the message header is valid

Addressing Testing the discovery of endpoints

Quality of service (QoS) validation Security Testing security protocols, e.g. as defined by WS-Security

Other QoS Testing QoS, e.g. as defined by WS-Policy

Others Equivalent messaging styles / formats / versions Testing "equivalent" messaging styles / formats / versions that could be used for the same transaction


At the Business Process Layer, structural validation comprises testing message sequence and process choreography, the correct interpretation of roles as well as timing conditions. In addition, cross-layer validations (or profile validation) are performed with the Business Document and Messaging Layer (see Table 14-3).

Table 14-3: Verification Scope ("Test Patterns") for Business Process Layer Validation

Type of Validation Verification Scope Description

Structural validation Sequence of messages / choreography Testing the correct sequence of messages, e.g. as defined by sequence diagrams;

Testing process choreographies which are informally or formally defined

Roles Testing the different roles within a Business Process, e.g. senders and receivers of messages

Timing conditions Testing the timing conditions in business transactions, e.g. as defined by triggering events or reaction times

Cross-layer validation / Profile validation Data consistency across Business Documents Testing data relationships across different messages, e.g. as defined by a common information model

Restrictions on the Business Document format and content Testing syntactic and semantic restrictions on the Business Document format and content

Restrictions on message header Testing restrictions on message header and consistent use of conversation ID

Restrictions on transport protocols Testing restrictions on and correct use of transport protocols


14.2.2 Operational Requirements ("In Which Environment?")

The testing environment determines operational requirements that have to be met by an appropriate testing solution:

The testing context (cf. Section 3.4.2) denotes the situation in which testing is performed. This can be

 during standard development for quality assurance of the developed eBusiness Specifications,

 when implementing new or upgrading existing eBusiness endpoints,

 when new partners are onboarding.

Testing integration in business environment: Several possibilities exist with regard to integration in the business environment.

 Testing system is the in-production system: The user wants to do testing in the in-production system under exact business conditions (with the same firewall setup, security setup and eBusiness gateway setup).

 Testing system is a non-production system: The user does not want to disturb the currently deployed in-production system, but wants to test a system that is configured differently from the current production system.

 No integration in business environment: In this case, testing is not integrated at all with the business environment, but is done manually.

Testing location:

 On-premise testing: In this case, end-users do not want to access a remote server to undergo testing of their own eBusiness endpoints. Instead, they download and install a test server, along with automated Test Suites. On-premise testing avoids external access to an in-production system and reconfiguration of the firewall. It provides the convenience of local control of the test environment. It requires that end-users have the IT expertise to do testing onsite.

 Remote testing: In this case, end-users do not have to handle any test equipment locally, e.g. because of the IT overhead of doing so, or because they want to test their SUT exactly in its production context (not in an off-production test harness). Testing may be controlled by the user (remotely) or operated by a third party.

 Combination of remote and on-premise testing: A combined approach is appropriate if end-users want to decouple test execution from test analysis. For example, test driving may be local on the user premises, whereas test analysis may rely on remote services.

 Testing workshops: In this case, a testing workshop is organized with different Test Participants.

Testing topology:

 Direct connection of systems (point-to-point)

 Mediation via business hub


 Mediation via testing hub

14.3 Deriving Test Scenarios and Solutions

The GITB Methodology for creating testing solutions for eBusiness Specifications relies on the step-wise approach presented in Section 14.1. Ideally, different Test Scenarios are performed sequentially, starting with standalone document validation (Test Scenario 1) and going on to interactive Conformance Testing (Test Scenario 2) and Interoperability Testing (Test Scenario 3). The following table describes how the three test scenarios differ in terms of Verification Scope and integration in the business environment.


Verification Scope Manual testing / no SUT interaction Interaction with SUT Interactions between SUTs

Messaging layer ( Structural validation;  Quality of service (QoS) validation;  Address discovery;  Others) – Test scenario 3

Business Document layer ( Structural validation;  Semantic validation;  Others) – Test scenario 1 (manual testing / no SUT interaction); Test scenario 2 (interaction with SUT)

Business Process layer ( Structural validation;  Cross-layer / profile validation)

Table 13-4: Testing Scenarios, Requirements and Integration in Business Environment

To set up the Testing Architecture and the Test Bed for realizing the test scenarios, Test Designers and Test Managers will search for existing Testing Resources and Artifacts using the TRR. If no existing resources are available, they will have to create the necessary Testing Capability components and artifacts. If a GITB-compliant Test Bed is available, it will provide the non-core components and be used as the testing platform. The required Testing Capabilities can then be implemented as plug-in components.


Part IV.1: Public Procurement

15 OpenPEPPOL

15.1 Background and Testing Requirements

With more than 70 Access Point service providers, the OpenPEPPOL35 community is growing and gaining users in Europe. Some countries have mandated its use and act as a driving force for private and public entities around the EU. Other countries are still looking at this open network infrastructure that enables the interconnection between public entities and private companies to drive electronic public procurement. Besides electronic public procurement, OpenPEPPOL is more and more being used in the private sector to exchange structured documents not only with public entities but also with other private companies.

In order to ensure interoperability, different service providers and Regional Authorities have implemented validation services. Several examples exist:

 The Norwegian DIFI has created a website for validation of document instances, for example http://vefa.difi.no/formatvalidering/invoice-validation-en.html.

 Private providers offer free validation services for PEPPOL instances, for example https://peppol.validex.net/.

Most of these existing validation services use Test Artifacts to ensure the electronic documents that have to be exchanged over the PEPPOL network are conformant to the OpenPEPPOL specifications. With so many Access Point service providers and users, the OpenPEPPOL community faces the challenge of ensuring that every document exchanged follows the OpenPEPPOL specifications; therefore providing a test service to validate electronic documents is key to promoting interoperability.

Apart from conformance to the document specifications, OpenPEPPOL is currently facing another challenge: there has been a decision to move from the START transport protocol, created under the PEPPOL project, to a more common and widely adopted transport protocol called AS2. The AS2 transport protocol has been used for several years now. It therefore offers more tools and is more stable than the protocol developed under the PEPPOL Pilot project. However, moving a community of more than 70 Access Points from one transport protocol to another is not an easy task, and providing tools and services to test for conformance could be a major benefit.

The OpenPEPPOL community has created a Validation and Quality Assurance project intended to ensure that its growing community of companies and service providers implements the specifications properly.

The purpose of the Validation and Quality Assurance project is to further clarify and establish clear directions and rules in terms of responsibilities for quality assurance and validation in the OpenPEPPOL network. The project will also, if necessary, point to existing available resources and/or develop new resources. Consequently, maintaining validation tools might be in scope for the project. The overarching objective is to give parties exchanging information in the OpenPEPPOL network the capability to validate electronic documents based on the Business Interoperability Specifications (BIS) in a consistent manner. More complex tasks can be envisaged for the Validation and Quality Assurance project, providing test scenarios for conformance and interoperability testing.

The Test Scenario described in the following sections addresses a complex scenario, combining the exchange of a document instance using AS2 with the document validation. Its deployment into the Global Interoperability Test Bed (GITB) could be a first step to demonstrate how to develop additional Test Scenarios for the OpenPEPPOL community.


35 http://www.peppol.eu/

15.2 Verification Scope – What Should be Tested?

The business process that will be used as the basis for this Test Scenario is the submission of an electronic invoice through the OpenPEPPOL network using the AS2 protocol. The Test Scenario will focus on submitting an electronic invoice from a sending Access Point to a receiving Access Point.

15.2.1 Parties/Actors

The following parties/actors assume a role in this business process:

 Seller – The original issuer of the electronic invoice. The submission of the electronic invoice to the sending Access Point is out of scope for this Test Scenario.

 Buyer – The original receiver of the electronic invoice. For the purpose of the test, the buyer will always be the one registered in the GITB SMP.

 Sending Access Point – The System Under Test.

 Receiving Access Point – Simulated by the GITB, receives electronic invoices in AS2 and validates them according to the BIS 4A rule set.

 Service Metadata Locator – Simulated by the GITB, receives a request and provides a URL to the Service Metadata Publisher. A service that provides a client with the capability of discovering the Service Metadata Publisher endpoint associated with a particular participant identifier. A client uses this service in order to find where information is held about services for a particular participant business.

 Service Metadata Publisher – Simulated by the GITB, receives a request and provides the AP endpoint. A service metadata publisher offers a service on the network where information about services of specific participant businesses can be found and retrieved. It is necessary for a client application to retrieve the metadata about the services for a target participant business before the client can use those services to send messages to the participant business.

15.2.2 Business Process

The business process has the following steps:

1. The seller creates an invoice based on the PEPPOL BIS 4A Business Interoperability Specification.

2. The seller authenticates with the sending Access Point and submits the electronic invoice. The authentication process of the seller to the sending Access Point is considered out of scope for this Test Scenario.

3. The sending Access Point validates the electronic invoice for conformance to BIS 4A.

4. The sending Access Point looks up the PEPPOL address of the endpoint of the buyer in the SML.

5. Using the SML address, the sending Access Point gets the SMP registry entry.

6. From the SMP entry, the sending Access Point gets the AS2 endpoint of the receiver.

7. The sending Access Point wraps the electronic invoice into a message envelope based on the SBDH specification.

8. The sending Access Point submits the electronic invoice using AS2 to the receiving Access Point.

9. The receiving Access Point validates the electronic invoice using the BIS 4A rule set.


15.2.3 Underlying eBusiness Specifications / Standards

The business process is described in the PEPPOL BIS 4A Invoice Only Specification and in the PEPPOL transport profiles and infrastructure specifications.

Table 15-1: OpenPEPPOL Test Scenario – Relevant eBusiness Specifications

Relevant specifications / standards References

Business Process The Business Process specification is defined in the PEPPOL BIS 4A. The BIS 4A is based on the CEN BII2 Post Award CWA. References:  PEPPOL BIS 4A,  CWA 16562

Business Documents UBL Invoice document customized following the CEN BII transaction data model. References:  UBL Invoice,  CEN BII T10 Trdm

Attributes and code list defined using Genericode by CEN BII.

Business rules defined in schematron by CEN BII2 and PEPPOL.

Transport and Communication (Messaging) Protocols Messaging protocols for the PEPPOL network are based on the OASIS Busdox Technical Specification. The transport protocol is AS2. References:  Busdox,  SML Service,  SMP Service

 RFC 4130

 AS2 PEPPOL

 Policy for use of Identifiers

 Policy for using envelopes (SBDH)

Profiles PEPPOL BIS 4A defines the profile and provides test files for the electronic invoice. References:  PEPPOL Use Case Test Files


15.3 Testing Environment – How Should It Be Tested?

15.3.1 Testing Integration in Business Environment

The SUT is a non-production system, as these tests can be run in parallel to the development process.

15.3.2 Testing Location

Testing can be implemented as a web-based remote self-testing tool. The SUT operators can test their system whenever and wherever they want. The SUT operators connect to the GITB Test Bed, execute the Test Suites/Cases and get the test results.


15.4 Test Scenario

15.4.1 Objectives and Success Criteria

This Test Scenario implements a conformance test of an Access Point to the PEPPOL specifications. The objective of this Test Scenario is to ensure the sending Access Point (the System Under Test) can submit a conformant PEPPOL BIS 4A electronic invoice to a receiving Access Point using the AS2 protocol.

The Access Point has to be able to discover the endpoint for the receiving Access Point based on the information on the electronic invoice header and has to submit the electronic document to this receiving endpoint using the AS2 protocol.

Success criteria:

 The Sending Access Point can obtain the endpoint address of the receiving Access Point

 The Sending Access Point can send the electronic invoice using the AS2 protocol

 The exchanged electronic invoice follows the BIS 4A specifications

15.4.2 Interaction Diagram/Choreography

15.4.2.1 Endpoint lookup

The sending Access Point has to perform a lookup for the receiver's capabilities and technical endpoint information.

1. An electronic invoice is issued by a PEPPOL user and handed over to the sender Access Point for transportation to the receiving Access Point. The invoice is then finally delivered to the ultimate receiver. The method used to communicate between the user and the sender Access Point is out of scope of the test, but the sender Access Point must ensure the authenticity of the PEPPOL user and the validity of the electronic invoice message.

2. The message handed over by the user to the sending Access Point includes an envelope with required information such as:

a. Recipient identifier and identifier type

b. Sender identifier and identifier type

c. Document identifier

d. Process identifier

These identifiers must follow the PEPPOL Policy on use of Identifiers.

3. The sender Access Point constructs a URL based on the business identifier of the receiver and queries the simulated SML.

4. The sender Access Point gets the address of the simulated SMP.

5. The sender Access Point requests service metadata from the receiver's simulated SMP, creating a query with the document identifier and the receiver's identifier.

6. The SMP replies with the metadata for the receiver's Access Point.

7. The sender Access Point validates that the metadata is signed using a PEPPOL certificate.

8. The sender Access Point gets the AS2 endpoint from the SMP reply.

A simplified code sketch of the SML/SMP lookup (steps 3–6) is given after Figure 15-1.


Figure 15-1: Endpoint Lookup
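The SML lookup is DNS-based: the participant identifier is hashed and combined with the SML domain to obtain the SMP hostname, after which the SMP is queried over HTTP. The following Python sketch is illustrative only; the MD5-based hostname construction and the SMP URL layout follow the BusDox/PEPPOL transport infrastructure conventions referenced in Table 15-1, while the SML domain and the example participant identifier are assumptions that would be replaced by the values of the simulated GITB SML/SMP.

# Illustrative sketch of steps 3-6 of the endpoint lookup (not normative).
import hashlib
import urllib.parse

SML_DOMAIN = "edelivery.tech.ec.europa.eu"  # assumption; the test bed would use its own simulated SML domain

def smp_hostname(participant_id, scheme="iso6523-actorid-upis"):
    # Hostname registered in the SML for the participant, e.g. "9920:ESB63276174"
    digest = hashlib.md5(participant_id.lower().encode("utf-8")).hexdigest()
    return "B-{0}.{1}.{2}".format(digest, scheme, SML_DOMAIN)

def smp_service_group_url(participant_id, scheme="iso6523-actorid-upis"):
    # URL of the SMP service group for the participant (step 5)
    host = smp_hostname(participant_id, scheme)
    participant = urllib.parse.quote("{0}::{1}".format(scheme, participant_id), safe="")
    return "http://{0}/{1}".format(host, participant)

if __name__ == "__main__":
    # The SMP reply (signed service metadata) would then be fetched from this URL
    # and the AS2 endpoint extracted from it (steps 6-8).
    print(smp_service_group_url("9920:ESB63276174"))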

15.4.2.2 Document exchange

OpenPEPPOL requires using the AS2 protocol to exchange documents. The workflow between the sender Access Point and the receiving Access Point is as follows:

1. The sending Access Point gets the OpenPEPPOL-issued X.509 certificate and private key for signing from its own certificate stores.

2. The sending Access Point MUST ensure that the message envelope carries the correct headers containing identifiers for recipient and sender, process type and document identifier.

3. The sending Access Point signs the message using the OpenPEPPOL AP certificate private key.

4. The sending Access Point uses HTTPS to send the message securely to the receiving simulated Access Point using the URL as retrieved from the SMP and in accordance with the AS2 specification RFC 4130.

5. The receiving simulated Access Point responds synchronously with a signed proof-of-delivery message to the sending Access Point using the Message Disposition Notification (MDN) mechanism as specified in the AS2 specification RFC 4130.

6. Finally, the sending Access Point archives the MDN as a signed proof-of-delivery of the message.

15.4.3 System Under Test(s)

The System Under Test (SUT) is the sending Access Point and belongs to a Service Provider. The GITB Test Bed simulates the systems of the OpenPEPPOL Service Metadata Locator (SML), the Service Metadata Publisher (SMP) and the receiving Access Point.

15.4.4 Abstract Test Steps

This Test Scenario tests the conformance of a sending Access Point to the OpenPEPPOL specifications. It discovers the endpoint address of the buyer based on its recipient endpoint identifier, submits the electronic invoice using the AS2 protocol, and validates whether the electronic invoice is correct. A minimal sketch of the envelope header check (conformance criteria 3) follows the step list.

 The seller prepares a compliant PEPPOL BIS 4A electronic invoice.

 The seller authenticates with the sending Access Point and submits the electronic invoice.

 The sending Access Point validates the electronic invoice for conformance to BIS 4A.


 The sending Access Point looks up the endpoint of the buyer in the simulated SML.

o The SUT retrieves the buyer endpoint identifier from the electronic invoice and performs a DNS lookup against the GITB SML

. Conformance criteria 1 – Request is well formed

. How to test – The DNS lookup has to be done for a specific receiver.

 Using the SML address, the sending Access Point gets the SMP registry entry.

o The SUT accesses the simulated SMP and retrieves the information about the protocol and endpoint where the electronic invoice has to be delivered.

. Conformance criteria 2 – Request is well formed

. How to test – Check the URI of the SMP record request (e.g. http://smp.b2brouter.com/complete/iso6523-actorid-upis::9920:ESB63276174)

 From the SMP entry, the sending Access Point gets the AS2 endpoint of the receiver, wraps the electronic invoice with an SBDH envelope and submits the envelope using AS2 to the receiving Access Point.

o The SUT creates the AS2 header and submits the electronic invoice using the AS2 protocol to the receiving Access Point

. Conformance criteria 3 – Header well formed

. How to test – The SBDH envelope has the correct format and the proper From and To fields

. Conformance criteria 4 – Valid electronic invoice document format

. How to test – Use the UBL XSD to check the syntax of the electronic invoice

. Conformance criteria 5 – The contents of the electronic invoice are valid

. How to test – The test bed receives the electronic invoice and performs the validation according to the PEPPOL BIS 4A validation artefacts.
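Conformance criteria 3 (the SBDH envelope carries the proper From and To fields) could be checked along the following lines. The SBDH namespace below is the one published by UN/CEFACT for the Standard Business Document Header; the expected sender and receiver identifiers are test-case parameters and are therefore only placeholders in this sketch.

# Sketch of the SBDH envelope check for conformance criteria 3 (well-formed
# header with the proper sender/receiver identifiers). Expected values are
# placeholders supplied by the test case configuration.
from lxml import etree

SBDH_NS = "http://www.unece.org/cefact/namespaces/StandardBusinessDocumentHeader"

def check_sbdh_header(envelope_path, expected_sender, expected_receiver):
    doc = etree.parse(envelope_path)
    ns = {"sbdh": SBDH_NS}
    sender = doc.findtext(
        ".//sbdh:StandardBusinessDocumentHeader/sbdh:Sender/sbdh:Identifier", namespaces=ns)
    receiver = doc.findtext(
        ".//sbdh:StandardBusinessDocumentHeader/sbdh:Receiver/sbdh:Identifier", namespaces=ns)
    errors = []
    if sender is None or receiver is None:
        errors.append("Sender or Receiver identifier missing from the SBDH header")
    if sender != expected_sender:
        errors.append("Unexpected sender: %r" % sender)
    if receiver != expected_receiver:
        errors.append("Unexpected receiver: %r" % receiver)
    return errors

if __name__ == "__main__":
    # Placeholder identifiers; a real test case would take them from its configuration.
    print(check_sbdh_header("envelope.xml", "9908:123456789", "9920:ESB63276174"))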


15.5 Related Existing Test Artifacts/Tools/Services to Reuse in the Domain

15.5.1 Test Artifacts

The electronic invoice can be tested using the following Test Artifacts:

 UBL XSD Invoice schema:

o UBL-Invoice-2.1.xsd

 CEN BII Transaction 10 Schematron validation:

o http://www.invinet.org/BII2conformance/BII2-resources/xslt/BIIRULES-UBL-T10.xsl

o http://www.invinet.org/BII2conformance/BII2-resources/xslt/BIICORE-UBL-T10-V1.0.xsl

 PEPPOL BIS 4A Schematron validation


15.5.2 Test Tools and Services

There are several test services implementing the test artifacts described in the section above, but we have not found services or tools that implement testing for the transport of the documents using AS2.

15.6 Related Stakeholders

Standards Development Organizations (SDOs), industry consortia, companies and public authorities that may be interested in using the tests:

 Industry consortia

o OpenPEPPOL AISBL

o EESPA

 Private companies

o Service Providers

o ERP Vendors


16 eSENS

16.1 Background and Testing Requirements

The aim of the e-SENS large-scale project is to develop the idea of the European Digital Market through innovative ICT solutions and to consolidate, improve and extend experiences from previous large-scale pilots, with the objective of facilitating cross-border processes.

The former large-scale projects are:

 SPOCS (Simple Procedures Online for Cross Border Services)36

 e-CODEX (e-Justice Communication via Online Data Exchange)37

 epSOS (European patient Smart Open Services)38

 PEPPOL (Pan European Public Procurement Online)39

 STORK (Secure idenTity acrOss boRders linKed)40

The e-SENS large-scale pilot has been organized into six core work packages:

Figure 16-1: e-SENS Work Packages

There are four non-technical (general coordination and communication) and two technically oriented work packages.

The objectives of work package 5 are to demonstrate how to deploy real-life ICT services within European countries, and work package 6 shall create the technical building blocks for these pilots to be deployed.

36 www.eu-spocs.eu

37 www.e-codex.eu

38 www.epsos.eu

39 www.peppol.eu

40 www.eid-stork.eu and www.eid-stork2.eu

There are several domains for piloting projects (e-Procurement, e-Health, e-Justice and Business Lifecycle) in e-SENS. This testing scenario will be focused on the e-Procurement domain.

Within the e-Procurement domain, e-SENS stakeholders have suggested several pilots. The Test Scenario to define and deploy in the Global Interoperability Test Bed (GITB) is related to the pre-award area for public procurement. The Test Scenario will be focused on the subscription process, where an economic operator discovers a business opportunity and registers his interest so that the contracting authority sends him the tender documents electronically.

e-SENS work package 6 has a specific requirement to create a conformance and test building block. Its aim is to provide an extensible, highly available and web-based testing infrastructure in order to ensure interoperability conformance of the applications and organizations participating in e-SENS.

This Test Scenario and its deployment into the GITB framework do not compete with e-SENS work package 6. On the contrary, it can be used as a template or initial work to develop additional Test Scenarios for other pilot projects in the e-Procurement or other domains within the e-SENS Large Scale Pilot. The use of an existing global interoperability Test Bed like the one being developed in the CEN WS GITB can be encouraged by reusing the Test Scenario templates. This could potentially simplify the tasks for the different e-SENS domains when creating Test Scenarios and could also provide a common and interoperable set of artifacts to allow the deployment of such Test Scenarios in different Test Beds.

16.2 Verification Scope – What Should Be Tested?

As described below, the business process pilot that will be used to create this Test Scenario is the subscription of interest in a call for tender from the economic operator to the contracting authority.

16.2.1 Actors and Roles

The following actors participate in this business process:

 Customer: The customer is the legal person or organization who is in demand of a product or service. Examples of customer roles: buyer, consignee, debtor and contracting authority.

 Supplier: The supplier is the legal person or organization that provides a product or service. Examples of supplier roles: seller, consignor, creditor, and economic operator.

These actors play the following roles in this business process:

 Contracting authority (CA): ‘Contracting authorities’ means the state, regional or local authorities, bodies governed by public law, associations formed by one or several of such authorities or one or more such bodies governed by public law.

 Economic operator (EO): The terms ‘contractor’, ‘supplier’ and ‘service provider’ mean any natural or legal person or public entity or group of such persons and/or bodies which offers on the market, respectively, the execution of works and/or a work, products or services. The term ‘economic operator’ shall cover equally the concepts of contractor, supplier and service provider.

16.2.2 Business Process

The business process is described in the e-SENS work package D5.1 deliverable and in the CEN BII3 Profile 46. The objective of the pilot is to demonstrate how an economic operator can subscribe interest in a tender published by a contracting authority in a foreign country. The business process has the following steps:

1. CA41 sends a notice to the Publisher

2. Publisher receives the notice and sends an acknowledgement back to the CA

41 Contracting Authority, the public entity that is willing to purchase products, services or works.

3. EO42 starts a search on the Publisher’s site

4. Publisher finds notices meeting the EO’s criteria and sends the results back to the EO

5. EO expresses his interest in one procurement by submitting a subscription request to the CA

6. CA receives the subscription request to the procurement from the EO

7. CA subscribes the interested EO and sends him a subscription response as an acknowledgement

8. EO receives the subscription response


16.2.3 Underlying eBusiness Specifications / Standards

Relevant specifications comprise the e-SENS work package D5.1 deliverable and the CEN BII3 Profile 46.

Table 16-1: e-SENS Test Scenario – Relevant eBusiness specifications

Relevant specifications / standards References

Business Process  Business Process specified in the  Draft CEN BII3 Profile to be published Profile 46 – Subscribe to as a formal standard in CEN BII. procedure in CEN BII3

Business  e-SENS Specification: WP5.1  D5.1 Information Requirements Documents Deliverable eTendering

 Current work in CEN Business  XVergabe Messages Interoperability Interfaces 3 o Subscription request  XVergabe. The documents that will be used are the ones from the o Subscription response XVergabe initiative, according to the CEN BII T81 and T82  CEN BII information models information requirement models o T81 Expression of interest

o T82 Business Opportunity subscription confirmation

Transport and Communication (Messaging) Protocols This test scenario does not test the communication between the parties

Profiles CEN BII3 - Profile 46


42 Economic Operator, the private company that is willing to sell products, services or works.

16.3 Test Scenario

This Test Scenario will check the conformance of an Economic Operator system participating in the e-Tendering e-SENS pilot. The Test Scenario will validate the contents of the submitted subscription request as well as the choreography of the Economic Operator system under test.

16.3.1 Objectives and Success Criteria

This Test Scenario implements a conformance test for the document exchange defined in the e-SENS D5.1 Information Requirements for eTendering following the Profile 46 established in CEN BII. It is not a complete test scenario for the whole pilot, but an initial part to test the electronic document structure and associated business rules for the subscription request document.

The scope of the test is limited to the subscription request and response messages. There are no bindings of these two information requirement models to existing international standard XML languages such as UBL or UN/CEFACT, and this is why CEN BII does not provide any binding for these transactions. The e-SENS project team, though, has been working jointly with the XVergabe initiative from Germany, and as they have a syntax that can support these two information models, this Test Case will use this syntax.

Success criteria:

1. The correct sequence of the messages as defined in the CEN BII Profile 46

2. Validity of the syntax or structure of the documents being exchanged according to the XVergabe syntax

3. Validity of the business rules specified in CEN BII Profile 46 and transactions T81 and T82.


16.3.2 Interaction Diagram/Choreography

The business process activity diagram defined in the CEN BII Profile 46 is as follows:


[BPMN collaboration diagram "BII46 – Subscribe to Procedure": the Supplier pool (Economic Operator lane) expresses interest in a business opportunity (BiiTrdm081) and receives a business opportunity subscription confirmation (BiiTrdm082) from the Customer pool (Contracting Body lane).]

Figure 16-2: CEN BII3 Profile 46 – Subscribe to Procedure

16.3.3 System Under Test(s)

This Test Scenario is used to test the system of the Economic Operator. SUTs are non-production systems as these tests can be run in parallel to the development process.

The GITB simulates the Contracting Authority and the Publisher systems.

16.3.4 Abstract Test Steps

This Test Scenario will check the conformance of an Economic Operator system participating in the e-Tendering e-SENS pilot. The Test Scenario will validate the contents of the submitted subscription request as well as the choreography of the Economic Operator system under test. A minimal sketch of the correlation check (conformance criteria 3) follows the step list.

 CA sends a notice to the Publisher

 Publisher receives notice and sends acknowledgement to CA

 EO sends a search request to the Publisher

 Publisher searches notices

 Publisher sends a set of notices as a result to the EO

 EO shows his interest in one of the received notices

o The Economic Operator has interest in one of the received notices and the SUT creates a "subscription request" transaction with the reference number of the notice.

. Conformance criteria 1 – The document is well formed

. How to test – Check document validity with XSD

. Conformance criteria 2 – The document is valid

. How to test – The electronic document is valid according to the business rules defined in CEN BII for the subscription of interest transaction.

 CA subscribes the interested EO and sends an acknowledgement to the EO including all documents

 CA sends information updates of the procurement project, including documents, to all interested EOs whenever there are changes in the procurement process

 EO sends his tender for the procurement project to CA

o The SUT receives acknowledgment from the test bed and creates a tender document.

. Conformance criteria 3 – Correlation.

. How to test – The reference number has to be in the list of the business opportunities sent from the test bed.

. Conformance criteria 4 – The document is well formed

. How to test – Check document validity with XSD.

. Conformance criteria 5 – Data contents, for instance the submission date and time shall be that of the transaction, and the hash of the document has to be valid.

. How to test – Apply Schematron and code list validation on specified data elements.

. Conformance criteria 6 – Security check

. How to test – Validate the electronic signature of the tender document.
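Conformance criteria 3 (correlation) amounts to checking that the notice reference carried in the tender is one of the business opportunities previously sent by the test bed. The sketch below is purely illustrative: the XPath used to locate the reference is hypothetical, since no XVergabe syntax binding is defined in this document, and a real test case would use the element defined by that binding.

# Illustrative sketch of the correlation check (conformance criteria 3).
from lxml import etree

def check_correlation(tender_path, notice_references_sent_by_testbed):
    doc = etree.parse(tender_path)
    # Hypothetical location of the referenced notice / business opportunity number.
    ref_nodes = doc.xpath("//*[local-name()='NoticeReference']/text()")
    if not ref_nodes:
        return ["No notice reference found in the tender document"]
    ref = ref_nodes[0].strip()
    if ref not in notice_references_sent_by_testbed:
        return ["Notice reference %r is not among the opportunities sent by the test bed" % ref]
    return []

if __name__ == "__main__":
    sent = {"NOTICE-2015-001", "NOTICE-2015-002"}  # placeholder identifiers
    print(check_correlation("tender.xml", sent))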


16.4 Related Existing Test Artifacts/Tools/Services to Reuse in the Domain

There is no syntax binding from the CEN BII information requirements to the XVergabe syntactical electronic documents. There are also no Business Rules identified for these two information requirement models in the CEN BII Profile 46 yet.

Currently, the CEN BII Profile 46 and related transaction models are being reviewed internally by the CEN BII pre-award team.

As per the CEN BII policy on syntax bindings, the Workshop does not create any bindings other than UBL and UN/CEFACT.

For that reason, the syntax binding to XVergabe will have to be created within the e-SENS pilot project.

16.4.1 Test Artifacts

Currently the following artifacts already exist:


 XSD Schema for the Subscription Request document from XVergabe

 XSD Schema for the Subscription Response document from XVergabe

 CEN BII3 T81 Expression of interest information requirement model (Draft)

 CEN BII3 T82 Business Opportunity subscription confirmation information requirement model (Draft)

16.4.2 Test Tools and Services

Currently there are no tools or services for these transactions.

16.5 Related Stakeholders

 Industry consortia

o e-SENS

o XVergabe

o CEN WS BII

 Public institutions

o European Commission

o Publications Office


17 Connecting Europe Facility (CEF)

17.1 Background and Testing Requirements

Connecting Europe Facility (CEF) is the common financing instrument of trans-European networks for the period 2014-2020. During this period, CEF will finance projects of common interest in three different sectors:

 Transport

 Energy

 Telecommunications

Figure 17-1: CEF Structure

Within the telecommunications area, CEF has a budget to work on Digital Service Infrastructures delivering networked cross-border services for citizens, businesses and public administrations.

The objective of the CEF Programme is to improve the competitiveness of the European economy by promoting interconnection and interoperability, thus supporting the development of a Digital Single Market.

The aim is to promote these key Digital Service Infrastructures (DSIs) in order to facilitate the cross-border and cross-sector interaction in Europe. One of the building blocks of the CEF programme is the e-Invoicing DSI. This building block will help public administrations implement electronic invoicing in compliance with the e-Invoicing Directive of the European Parliament and the Council.

The European Committee for Standardization (CEN) is defining a new semantic standard for e-Invoicing in public procurement and the binding of the resulting standard to a number of existing syntaxes in a Project Committee known as PC 434.

The e-Invoicing solution of CEF should provide tools for the public administrations to reduce the effort of complying with the Directive.

This Test Scenario aims at providing a GITB-compliant Test Case to allow the validation of electronic invoices created using different syntaxes against the PC 434 semantic standard for e-Invoicing.


17.2 Verification Scope – What Should Be Tested?

17.2.1 Actors

The following actors participate in the business process:

 Customer: The customer is the legal person or organization who is in demand of a product or service. Examples of customer roles: buyer, consignee, debtor and contracting authority.

 Supplier: The supplier is the legal person or organization that provides a product or service. Examples of supplier roles: seller, consignor, creditor, and economic operator.

These two parties take the following roles:

 Creditor: One to whom a debt is owed. The Party that claims the payment and is responsible for resolving billing issues and arranging settlement. The Party that sends the Invoice. Also known as Invoice Issuer, Accounts Receivable, or Seller.

 Debtor: One who owes debt. The Party responsible for making settlement relating to a purchase. The Party that receives the Invoice. Also known as Invoicee, Accounts Payable, or Buyer.

17.2.2 Business Process

The business process activity diagram defined in the PEPPOL BIS 4a can be used to depict the choreography of the submission of an electronic invoice, although this Test Case does not test the business process but the conformance of the transaction to the CEN PC 434 semantic model.


Figure 17-2: PEPPOL BIS 4a – Invoice Only

17.2.3 Underlying Standards/Specifications

Directive 2014/55/EU of the European Parliament and of the Council of 16 April 2014 on electronic invoicing in public procurement states that the European "Commission shall request that the relevant European standardisation organisation draft a European standard for the semantic data model of the core elements of an electronic invoice". Based on this standardization request from the European Commission, the CEN Project Committee 434 was created on 2014-05-06. The work on PC 434 has been divided into several work streams (WS):

 WS1 Definition of scope

 WS2 Semantic model

 WS3 External relations

 WS4 List of syntaxes

 WS5 Syntax binding

 WS6 Guidelines at transmission level

 WS7 Extension methodology

 WS8 Test methodology and test results


The PC 434 work will be finalized by the end of 2016.

The list of syntaxes and the rest of the deliverables are not available yet. As long as there is no official list of syntaxes, the PEPPOL BIS for the electronic invoice will be taken as the basis for this Test Case. In order to demonstrate the potential use of additional syntaxes, the CEN BII syntax binding to the UN/CEFACT Cross Industry Invoice will also be considered.

Once the syntaxes are selected and the syntax bindings defined within PC 434, they must replace the PEPPOL BIS and CEN BII artefacts that will be used as part of this Test Scenario.

Table 17-1: Test Scenario – Relevant eBusiness Specifications

Relevant specifications / standards References

Business Process  Not applicable

Business  UBL - PEPPOL BIS  CEN PC 434 Documents  Cross Industry Invoice o Draft semantic model

 PEPPOL

o PEPPOL BIS 4a Schematron Validation tools

 CEN BII

o CEN BII Syntax Binding to UBL

o CEN BII Syntax Binding to CII

 UBL 2.1 Invoice XSD

 Cross Industry Invoice XSD

Transport and Communication (Messaging) Protocols This test scenario does not test the communications between the parties

Profiles  Not applicable


17.3 Test Scenario

17.3.1 Objectives and Success Criteria

This Test Scenario implements a conformance test for electronic invoices according to the CEN PC 434 semantic model. Its main objective is that regardless of the syntax used to create the invoice, the Test Service shall identify whether the PC 434 semantic data model is correctly implemented in the electronic invoice instance.

The scope of the test is checking both the compliance to the underlying syntax and to the semantic model as defined by the CEN PC 434. In order to test the underlying syntax, the Test Service must identify it through the root namespace, and once the syntax layer is successfully validated, the corresponding Schematron validation artefact must be used to assess whether there are elements in the document not contained within the PC 434 semantic model, and whether the existing semantic elements in the document instance fulfil the business rules defined by the PC 434.

Success criteria:

1. The electronic invoice is written using one of the syntaxes accepted by the PC 434.

2. The structure of the electronic invoice is valid according to that syntax.

3. The semantics of the PC 434 are correctly implemented in the electronic invoice.

4. The elements in the electronic invoice not part of the semantics of PC 434 are identified.

This Test Case is a document conformance test to ensure an XML electronic invoice is valid according to the PC 434 semantic data model. This means that the document XML instance belongs to one of the selected syntaxes, that it is valid according to the syntax Schema, and that it contains the elements required by the PC 434 semantic data model.

17.3.2 System Under Test(s)

This Test Case is used to test document instances. A document instance can be provided either by the Customer or the Supplier. The System Under Test (SUT) is the one creating the electronic invoice.

SUTs can be production systems, and this test case can be used as the initial step for a certification process of electronic invoices against the CEN PC 434 European Norm.

17.3.3 Abstract Test Steps

The System Under Test (SUT) produces the electronic invoice document. The Test Bed only validates the document instance, not the business process; therefore there is no test on the communication between both actors. A minimal sketch of the syntax identification and validation chain follows the step list.

 The operator submits or uploads the electronic invoice to the GITB-compliant Test Bed

o The operator wants to know whether the XML document instance is compliant with the PC 434.

. Conformance criteria 1 – The document belongs to an accepted syntax

. How to test – Check that the namespace of the document root is in the list of accepted syntaxes

. Conformance criteria 2 – The document structure is valid

. How to test – The electronic document is valid according to the XSD structure of the identified syntax.

. Conformance criteria 3 – The electronic invoice is conformant to PC 434

. How to test – The electronic invoice is valid according to the CEN PC 434 Schematron semantic model and rules.

. Conformance criteria 4 – Identification of additional elements

. How to test – Use a Schematron file to identify elements of the XML electronic invoice not defined in the CEN PC 434.
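A Test Service could implement criteria 1–3 by identifying the syntax from the root element namespace and dispatching to the corresponding XSD and Schematron artifacts, as sketched below. The UBL Invoice namespace is the standard OASIS one; the Cross Industry Invoice namespace and all artifact file names are placeholders, since the official PC 434 list of syntaxes and artifacts is not available yet.

# Sketch of conformance criteria 1-3: identify the syntax via the root
# namespace, validate the structure with the corresponding XSD, then apply
# interim Schematron artefacts standing in for the PC 434 rules.
from lxml import etree
from lxml.isoschematron import Schematron

UBL_INVOICE_NS = "urn:oasis:names:specification:ubl:schema:xsd:Invoice-2"
CII_INVOICE_NS = "urn:un:unece:uncefact:data:standard:CrossIndustryInvoice:100"  # placeholder version

ACCEPTED_SYNTAXES = {
    UBL_INVOICE_NS: ("UBL-Invoice-2.1.xsd", "pc434-rules-ubl.sch"),
    CII_INVOICE_NS: ("CrossIndustryInvoice.xsd", "pc434-rules-cii.sch"),
}

def validate_invoice(path):
    doc = etree.parse(path)
    ns = etree.QName(doc.getroot()).namespace

    # Criteria 1: the document belongs to an accepted syntax
    if ns not in ACCEPTED_SYNTAXES:
        return "rejected: unknown syntax %r" % ns
    xsd_file, sch_file = ACCEPTED_SYNTAXES[ns]

    # Criteria 2: the document structure is valid for that syntax
    if not etree.XMLSchema(etree.parse(xsd_file)).validate(doc):
        return "rejected: structure not valid against %s" % xsd_file

    # Criteria 3: the semantic model / business rules are respected
    if not Schematron(etree.parse(sch_file)).validate(doc):
        return "rejected: PC 434 business rules violated"
    return "accepted"

if __name__ == "__main__":
    print(validate_invoice("invoice.xml"))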


17.4 Related Existing Test Artifacts/Tools/Services to Reuse in the Domain

Currently there are no normative artifacts issued by CEN PC 434 that can be used to perform these tests. Besides, there is no official list of accepted syntaxes and versions yet.

PEPPOL BIS should be taken as the basis for this test; therefore, the syntax bindings and artefacts used in PEPPOL to check the electronic invoice document will be used to implement a first release of this Test Case.

Additionally, the CEN BII2 has a syntax binding to the UN/CEFACT CII 3.0 and there are validation artifacts that will also be implemented to perform this Test Case.

The artifacts issued by the PC 434, once they are published, shall replace these interim artifacts.

17.4.1 Test Artifacts

Currently the following artifacts will be used:

 UBL XSD Schema for the UBL Invoice

 UN/CEFACT XSD Schema for the Cross Industry Invoice

 CEN BII2 T10 Invoice information requirement model (to be substituted by the CEN PC 434 semantic data model)

 PEPPOL BIS 4a Validation Package

 CEN BII2 T10 CII Core Business Rules (to be substituted by the CEN PC 434 Core rules)

 CEN BII2 T10 CII Business Rules (to be substituted by the CEN PC 434 Business Rules)

17.4.2 Test Tools and Services

There is a free GITB-compliant Test Bed service where this Test Case is implemented. It is called Validex.net (https://validex.net).

17.5 Related Stakeholders

 Industry consortia

o OASIS UBL

o UN/CEFACT CII

o OpenPEPPOL AISBL

 Public institutions

o CEF

o CEN PC 434

o CEN WS BII

o European Commission


Part IV.2: e-Health

18 Clinical Document Architecture (CDA)

18.1 Background and Testing Requirements

The HITCH project (http://www.hitch-project.eu/), the Antilope project (http://www.antilope-project.eu/) and the eHealth Governance Initiative (http://www.ehgi.eu/) recommend the use of integration profiles by the European Union member states in order to promote the interoperability of eHealth applications. Among the recommended profiles, we would like to focus on the following two profiles:

 XDS.b for sharing documents,

 XD-Lab for sharing lab reports.

Austria, France, Luxembourg and Switzerland, among other countries, are publishing specifications on how to share lab reports using these profiles. The proposal is to apply GITB to the purpose of testing the implementation of these two profiles in those countries. How could these countries benefit from sharing testing artifacts and thus ensure better interoperability?

The test case described in this document focuses on testing the conformance of CDA documents containing laboratory reports.

4362 18.2 Verification Scope – What Should be Tested?

4363 This Test Case focuses on testing the conformance of CDA lab reports. CDA documents are usually designed as “Russian dolls”, as illustrated in the figure below. Looking at the specifications of the “Lab Report” document as specified by various organizations in Europe and elsewhere shows that all of them refer to the same underlying specifications.

4367 1. In France, ASIP Santé published, within the CI-SIS (Cadre d’interopérabilité des systèmes d’information de santé), the specifications of a “Volet Compte Rendu d’Examens de Biologie Médicale”43.

4370 2. In Austria, Elga44 published the document HL7 Implementation Guide for CDA® R2: Laborbefund45.

4371 3. In Italy, IHE Italy46 published the document Rapporto di medicina di laboratorio47

4372 4. In Switzerland, eHealthSuisse48 published Format d’échange : Rapports de laboratoire soumis à déclaration en Suisse49.

43 http://esante.gouv.fr/services/referentiels/referentiels-d-interoperabilite/cadre-d-interoperabilite-des-systemes-d

44 http://www.elga.gv.at/index.php?id=28

45 http://www.elga.gv.at/fileadmin/user_upload/uploads/download_Papers/Harmonisierungsarbeit/140902__upload/HL7_Implementation_Guide_for_CDA_R2_-_Laborbefund.pdf

46 http://www.hl7italia.it/node/34

47 http://www.hl7italia.it/sites/default/files/Hl7/docs/public/HL7Italia-IG-CDA2%2020RapportoMedicinaLab-v01.00-SI.pdf

48 http://www.e-health-suisse.ch/umsetzung/00252/index.html?lang=fr

49 http://www.e-health-suisse.ch/umsetzung/00252/index.html?lang=fr&download=NHzLpZeg7t,lnp6I0NTU042l2Z6ln1ae2IZn4Z2qZpnO2Yuq2Z6gpJCDdIB5e2ym162epYbg2c_JjKbNoKSn6A--

4375 Figure 18-1: CDA-CH Laboratory Reports for Public Health in relation to other norms and profiles.

4376 The specifications provided by these four countries all rely on the IHE XD-LAB technical framework50. France, Austria and Switzerland also provide a set of testing tools to check the conformance of CDA documents that claim to support their specifications.

4379 The purpose of this use case is to optimize testing and test tool development by exploiting the “Russian doll” architecture of CDA documents.

4381 The use case described here is designed for the specific Laboratory Report document, but it could also be applied to other types of documents that these countries have specified.

4383 The challenge is to reuse test artifacts for the conformance checking of CDA documents across the various national/regional projects that use common references.

4385 The benefit is clear for all parties. The effort spent on testing the common part can be reused: only the tests specific to the requirements of a country specification need to be developed, while the rest remains common. The quality of the testing is harmonized, and the risk of divergent outcomes due to different implementations of the tools is reduced. A sketch of this layered validation is given below.
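As a non-normative illustration of this reuse, the following sketch (Python, assuming the lxml library) layers the validation of a lab report: the common layers (CDA R2 schema and a generic XD-LAB Schematron) are shared by all projects, and only the last, country-specific Schematron differs. All file names are placeholders, not the official artifacts:

    from lxml import etree
    from lxml.isoschematron import Schematron

    # Shared layers, reusable by every national/regional project.
    COMMON_LAYERS = [
        ("CDA R2 schema", lambda doc: etree.XMLSchema(etree.parse("CDA.xsd")).validate(doc)),
        ("IHE XD-LAB rules", lambda doc: Schematron(etree.parse("xd-lab.sch")).validate(doc)),
    ]

    # Only this layer is specific to each country specification.
    NATIONAL_LAYERS = {
        "FR": ("ASIP Sante CI-SIS rules", "ci-sis-lab.sch"),
        "AT": ("ELGA Laborbefund rules", "elga-lab.sch"),
        "CH": ("eHealthSuisse rules", "ch-lab.sch"),
    }

    def validate_lab_report(path, country):
        doc = etree.parse(path)
        for name, check in COMMON_LAYERS:        # common part: developed once, shared by all
            if not check(doc):
                return (name, False)
        name, sch = NATIONAL_LAYERS[country]     # country-specific extension
        return (name, Schematron(etree.parse(sch)).validate(doc))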

4388 18.2.1 Parties/Actors

4389 The following parties play a role in the business process.

4390  Content Creator: the issuer of the CDA Laboratory Report document.

4391  Content Consumer: the consumer of the CDA Laboratory Report document. The consumer shall be able to read the document and “digest” its content.

4393 18.3 Underlying eBusiness Specifications / Standards

Relevant specifications / standards – References:
 Business Process: IHE XD-LAB (see 51)
 Business Documents: HL7 CDA (see 52); IHE XD-LAB (see 50 and 53)
 Transport and Communication (Messaging) Protocols: transport is out of the scope of this document; see Test Case 2 for the protocol
 Profiles:

50 http://ihe.net/uploadedFiles/Documents/Laboratory/IHE_LAB_TF_Vol3.pdf

51 http://ihe.net/uploadedFiles/Documents/Laboratory/IHE_LAB_TF_Vol1.pdf

52 http://www.hl7.org


4395 18.4 Testing Scenarios

4396 18.4.1 Objectives and Success Criteria

4397 The testing is focused on the conformance of the exchanged laboratory report documents.

4398 The criteria for success:

4399 1. Document is a well-formed XML document

4400 2. Document is valid according to the CDA schema (possibly extended by the regional specifications)

4401 3. Document meets the requirements specified in the specifications (syntax and semantics)

4402 This section describes what will be tested and how, based on the target specification. The perspective is that of the software architect responsible for using the GITB framework to set up the appropriate Test Services and Test Artifacts to support the Testing Scenarios.

4405 18.4.2 System(s) Under Test

4406 The systems under test in this case are the Content Creator, i.e. the system that creates the CDA Laboratory Report document, and the Content Consumer that consumes it.

4408 The Content Creator is tested for its ability to create documents that are conformant to the specifications.

4409 The Content Consumer is tested for its ability to read and correctly display the information provided in the documents created by the Content Creator.

4411 18.4.3 Abstract Test Steps

4412 18.4.3.1 Testing the Content Creator

4413 The Content Creator creates a set of documents.

4414 Each created document is checked for conformance.

4415 A conformance report is created; the report includes the list of requirements that were identified and tested in the documents. An illustrative sketch of such a report is given below.
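A minimal, illustrative sketch of such a report (Python, assuming the lxml library), not the normative GITB Test Reporting Format, could look as follows; the element names are placeholders:

    from lxml import etree

    def build_report(results):
        # results: list of (document name, requirement id, passed) tuples
        report = etree.Element("ConformanceReport")
        for doc_name, requirement, passed in results:
            etree.SubElement(report, "Requirement",
                             document=doc_name, id=requirement,
                             result="PASSED" if passed else "FAILED")
        return etree.tostring(report, pretty_print=True).decode()

    # Example: print(build_report([("lab-report-1.xml", "REQ-01", True)]))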

4417 18.4.3.2 Testing the Content Consumer

4418 The Content Consumer consumes (loads) a set of documents.

4419 The Content Consumer shows evidence that the documents are correctly loaded and that it can display the information contained in the consumed documents.

53 http://ihe.net/uploadedFiles/Documents/Laboratory/IHE_LAB_TF_Vol1.pdf

4421 18.5 Related Existing Test Artifacts/Tools/Services to Reuse in the Domain

4422 A number of test artifacts, such as Schematrons or models, can be reused:

4423  ASIP Santé, ELGA and eHealthSuisse provide Schematrons for the validation of their respective specifications.

4425  IHE and NIST provide Schematrons for the validation of XD-LAB CDA documents.

4426  MDHT and IHE provide model-based CDA document validators.

4427 Tooling is also available:

4428  IHE provides a web-based tool54 and a web service to perform the validation of CDA documents, either using a Schematron55 or a model56.

4430  NIST provides a web-based and a web service Schematron validation57.

4431 18.6 Related Stakeholders

4432 The choice of this Test Case is driven by its potential interest for many organizations worldwide. Interested bodies are listed below:

4434  HL7: SDO
4435  IHE: Integrating the Healthcare Enterprise, a not-for-profit organization for the promotion of the interoperability of solutions in healthcare
4437  Public authorities that customized the profile:
4438 o ASIP Santé in France
4439 o ELGA in Austria
4440 o eHealthSuisse in Switzerland
4441 o Agence eSanté in Luxembourg
4442 o NICTIZ in the Netherlands
4443 o Plate-forme eHealth in Belgium
4444  Companies that implement the profile:
4445 o 30 companies worldwide have shown interest in sharing this type of document during the IHE Connectathons in Europe or in North America:
4447 . ALERT Life Sciences Computing
4448 . Allscripts Healthcare Solutions
4449 . Atlas
4450 . Axway
4451 . CapMed
4452 . CareEvolution, Inc.
4453 . eClinicalWorks
4454 . Eclipsys Corporation
4455 . e-MDs
4456 . Engineering Ingegneria Informatica
4457 . Evolucare Technologie
4458 . Fidelity Information Systems
4459 . Forcare BV
4460 . GE Healthcare
4461 . Get Real Health
4462 . Global Care Quest

54 http://gazelle.ihe.net/EVSClient

55 http://gazelle.ihe.net/SchematronValidator

56 http://gazelle.ihe.net/CDAGenerator/home.seam

57 http://cda-validation.nist.gov/cda-validation/

4463 . InterSystems Corporation
4464 . Karos Health
4465 . MEDecision
4466 . Medical Informatics Engineering
4467 . NextGen
4468 . No More Clipboard
4469 . Open Health Tools
4470 . SAIC
4471 . SIEMENS Medical Solutions
4472 . Tiani "Spirit" Gmbh - Cisco Systems Inc.
4473 . Topicus Zorg

4475 18.7 Re-usability of Test artifacts/Tools/Services for GITB3

4476 This Test Case gives GITB3 the ability to reuse Test Artifacts, tools and services in cross-organization scenarios.

4478 Scenario 1: ELGA, ASIP and eHealthSuisse use a common set of tools to check the conformance of the XD-LAB profile implementation in CDA documents targeted to their respective contexts.

4480 Scenario 2: A company can test its implementation of the XD-LAB profile and check its conformance to the different extensions made by ELGA, ASIP and eHealthSuisse.


4484 19 IHE - Cross-Enterprise Document Sharing (XDS)

4485 The purpose of this Test Scenario is the sharing of Test Artifacts for testing the interoperability of systems participating in a document sharing workflow based on XDS.b.

4487 19.1 Background and Testing Requirements

4488 Cross-Enterprise Document Sharing (XDS) provides a standards-based specification for managing the sharing of documents between healthcare enterprises, ranging from a private physician office to a clinic to an acute care in-patient facility, as well as personal health record systems. Many regional/national projects worldwide58 are deploying or specifying the sharing of medical documents using an infrastructure based on the IHE XDS.b suite of profiles. We propose in this scenario to share the test artifacts that could be common to all these projects, namely the XDS.b part of the exchange.

4494 Sharing the same set of Test Artifacts among these projects will help them: a set of Test Artifacts for testing the underlying transport mechanism will be available to them, so that they only have to test the parts specific to their projects. SUTs that have already been tested for one project know that the underlying transport mechanism has been tested and can focus on the project specifics.

4498 The benefits are:

4499  Reuse of test artifacts and test tools,
4500  Harmonization of the infrastructures deployed by the projects, avoiding project-specific implementations,
4501  Reduced cost of test design and testing.

4503 19.2 Verification Scope – What to Test?

4504 The business process that needs to be tested is described in the IHE Technical Frameworks of the IT Infrastructure domain, available on the IHE web site.59

4506 19.3 Actors

4507 The parties involved are the Document Source, Document Consumer, Document Registry and Document Repository, as described in the figure below.

58 See http://motorcycleguy.blogspot.com/2010/01/where-in-world-is-xds.html

59 See http://ihe.net/uploadedFiles/Documents/ITI/IHE_ITI_TF_Vol1.pdf, http://ihe.net/uploadedFiles/Documents/ITI/IHE_ITI_TF_Vol2b.pdf, http://ihe.net/uploadedFiles/Documents/ITI/IHE_ITI_TF_Vol2x.pdf, http://ihe.net/uploadedFiles/Documents/ITI/IHE_ITI_TF_Vol3.pdf

[Diagram showing the XDS.b actors – Patient Identity Source, Document Source, Document Repository, Document Registry and Document Consumer – and their transactions: Patient Identity Feed, Provide and Register Document Set, Register Document Set, Query Documents and Retrieve Document Set]

4510 Figure 19-1

4511  The Document Source Actor is the producer and publisher of documents. It is responsible for sending documents to a Document Repository Actor. It also supplies metadata to the Document Repository Actor for subsequent registration of the documents with the Document Registry Actor.

4514  The Document Repository is responsible both for the persistent storage of these documents and for their registration with the appropriate Document Registry. It assigns a uniqueId to documents for subsequent retrieval by a Document Consumer.

4517  The Document Registry Actor maintains metadata about each registered document in a document entry. This includes a link to the document in the Repository where it is stored. The Document Registry responds to queries from Document Consumer actors about documents meeting specific criteria. It also enforces some healthcare-specific technical policies at the time of document registration.

4521  The Document Consumer Actor queries a Document Registry Actor for documents meeting certain criteria, and retrieves selected documents from one or more Document Repository actors.

4524 19.3.1 Interaction Diagram/Choreography

4525 The following diagram shows the interactions between the actors that need to be covered by the tests.


4527 Figure 19-2

4528 19.3.2 Underlying eBusiness Specifications / Standards

4529 The XDS.b profile relies on the following set of standards and specifications:

4530  ebXML
4531  IHE XDS.b
4532  HL7v3 datatypes
4533  MTOM
4534  HTTP
4535  SOAP
4536  TLS


4538 19.4 Details/Requirements of Test Scenario

4539 19.4.1 Objectives and Success Criteria

4540 The following test scenarios implement conformance and interoperability tests for the XDS.b profile. Different tests need to be performed depending on which role is played by the system under test.

4542 The conformance of the messages exchanged between the SUT and the simulator or partner to the XDS.b profile specification will be verified, as well as the correct behavior of the actors participating in the test.

4544 If we exclude the Patient Identity Feed from the testing scope, there are four actors and four transactions to test for this profile. Testing is described by considering each of the actors in turn playing the role of the SUT.

4547 An affinity domain needs to be defined in order to perform the testing. The SUT and the simulator involved in the testing need to share coded values and certificates.

4549 In preparation for the testing, the Document Registry and Document Repository actors need to be fed with data for testing purposes.

4551 19.4.2 System(s) Under Test

4552 Possible SUTs are considered in the following Test Cases; each scenario describes the test plan for one of them. As described above, the Patient Identity Source is not considered here, restricting the SUT to the following list:

4555  Document Source
4556  Document Consumer
4557  Document Repository
4558  Document Registry

4560 19.4.3 Abstract Test Steps

4561 19.4.3.1 Testing the Document Source

4562 In order to test the Document Source actor we need a simulator playing the role of the Document Repository.

4563 The different test steps required to test the Document Source are presented in the following sequence diagram.


4566 Figure 19-3: Testing the Document Source


4567 19.4.3.2 Testing the Document Consumer

4568 In order to test the Document Consumer actor, we need a simulator playing the role of both the Document Registry and the Document Repository actors. The different test steps required to test the Document Consumer are presented in the following sequence diagram.


4572 Figure 19-4: Testing the Document Consumer

4573 19.4.3.3 Testing the Document Repository

4574 In order to test the Document Repository actor, we need a simulator playing the role of both the Document Consumer and the Document Source actors. The different test steps required to test the Document Repository are presented in the following sequence diagram.


4578 Figure 19-5: Testing the Document Repository

4579 19.4.3.4 Testing the Document Registry

4580 In order to test the Document Registry actor, we need a simulator playing the role of both the Document Consumer and the Document Repository actors. The different test steps required to test the Document Registry are presented in the following sequence diagram.



4584 Figure 19-6: Testing the Document Registry

4585 For each test step, the messages sent by the SUT will be analyzed and tested for conformance with the specifications:

4587  Verification of the TLS layer
4588  Verification of the HTTP transport
4589  Verification of the MTOM layer
4590  Verification of the SOAP header
4591  Verification of the Business Message (a sketch of the last two checks is given below)
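For the last two layers, a minimal, non-normative sketch (Python, assuming the captured SOAP 1.2 envelope has already been extracted from the MTOM package) could check the WS-Addressing action and the presence of the business payload; the expected action value shown in the usage comment is indicative only:

    from lxml import etree

    SOAP_NS = "http://www.w3.org/2003/05/soap-envelope"
    WSA_NS = "http://www.w3.org/2005/08/addressing"

    def check_envelope(xml_bytes, expected_action):
        env = etree.fromstring(xml_bytes)
        # SOAP header verification: the WS-Addressing Action must match the transaction.
        action = env.findtext("{%s}Header/{%s}Action" % (SOAP_NS, WSA_NS))
        if action != expected_action:
            return "unexpected wsa:Action: %r" % action
        # Business message verification: the SOAP Body must carry a payload, which
        # would then be validated against the XDS.b metadata rules.
        body = env.find("{%s}Body" % SOAP_NS)
        if body is None or len(body) == 0:
            return "empty SOAP Body"
        return "business payload root: " + etree.QName(body[0]).localname

    # e.g. check_envelope(captured, "urn:ihe:iti:2007:RegisterDocumentSet-b")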

4593 19.5 Related Existing Test Artifacts/Tools/Services to Reuse in the Domain

4594 The Test Scenarios described in this document are related to the Test Cases used by IHE in both the pre-connectathon and the connectathon testing phases.

4596 Tools are available for simulating the missing actors and checking the conformance of messages to the XDS.b requirements.


4598 One should consider the following existing set of tools:

4599 1. XDSTools2 (footnote 60) for simulation and conformance checking of the XDS actors
4600 2. XDStarClient (footnote 61) for simulation and conformance checking of XDS messages
4601 3. EVS Client (footnote 62) for the validation of messages
4602 4. Sharing Value Set Simulator (footnote 63) for sharing the coded values with the test participants
4603 5. Gazelle TLS tools (footnote 64) for the security testing needs: certificate generation, TLS testing

4605 19.6 Related Stakeholders

4606 Users of the IHE XDS.b profile are or will clearly be interested in using these tests. Although IHE already provides a set of tools to perform this testing, the ability to share and re-use test cases might be of interest to organizations that extend the XDS.b profile for implementation. Regional and national projects (ELGA, ASIP Santé (DMP), Agence eSanté, KELA…) might indeed benefit from re-using the test cases in their context.

4611 Over 150 companies worldwide65 have tested the XDS.b profile at one of the IHE connectathons.

4612 19.7 Re-usability of Test Artifacts/Tools/Services for GITB3

4613 The XDS.b profile requires the exchange of messages over a secure TLS connection. Testing the TLS part of the transaction is not specific to the XDS.b context, and the test artifacts/tools/services used to test it could be shared or common to different domains.

4616 The XDS.b profile uses the MTOM, SOAP and HTTP protocols for the transport of the messages. As with the security aspects, those protocols are not specific to XDS.b either and could be tested using artifacts/tools/services from other domains.


4628 References


60 http://ihexds.nist.gov/xdstools2/

61 http://gazelle.ihe.net/XDStarClient

62 http://gazelle.ihe.net/EVSClient

63 http://gazelle.ihe.net/SVSSimulator

64 http://gazelle.ihe.net/tls

65 http://connectathon-results.ihe.net

4632 [CEN10] CWA 16093:2010, Feasibility study for a global eBusiness interoperability test bed. http://www.cen.eu/cen/Sectors/Sectors/ISSS/CEN%20Workshop%20Agreements/Pages/downloadArea.aspx

4634 [CEN12] CWA 16408:2012 Testing Framework for Global eBusiness Interoperability TestBeds (GITB)

4635 [TAG] OASIS Committee Note Draft 03, "Test Assertions Guidelines", June 2011. http://www.oasis-open.org/committees/download.php/42479/testassertionsguidelines-cnd-03-Jun03.pdf

4637 [TAML] OASIS Committee Specification Draft 05, "Test Assertions, Part 2: Test Assertions Markup Language Version 1.0", June 2011. http://www.oasis-open.org/committees/download.php/42478/testassertionmarkuplanguage-1.0-csd-05-Jun07.pdf

4640 [WSI10] WS-I Testing Tools V2 for Basic Profiles 1.2 and 2.0, Web Services Interoperability, 2010. http://www.ws-i.org/

