EHR Usability Test Report of PediatricXpress, Version 20


Report based on ISO/IEC 25062:2006 Common Industry Format for Usability Test Reports

Date of Usability Test: January 30, 2019 – February 8, 2019
Date of Report: February 11, 2019 (updated 10/28/20)

Report Prepared by: PhysicianXpress, Inc.
Vene Quezada, Product Specialist
[email protected]
877-366-7331
409 2nd Avenue, Suite 201, Collegeville, PA 19426

Table of Contents

1 EXECUTIVE SUMMARY
2 INTRODUCTION
3 METHOD
  3.1 PARTICIPANTS
  3.2 STUDY DESIGN
  3.3 TASKS
  3.4 RISK ASSESSMENT
  3.4 PROCEDURE
  3.5 TEST LOCATION
  3.6 TEST ENVIRONMENT
  3.7 TEST FORMS AND TOOLS
  3.8 PARTICIPANT INSTRUCTIONS
  3.9 USABILITY METRICS
  3.10 DATA SCORING
4 RESULTS
  4.1 DATA ANALYSIS AND REPORTING
  4.2 DISCUSSION OF THE FINDINGS
5 APPENDICES
  5.1 Appendix 1: Participant Demographics
  5.2 Appendix 2: Informed Consent Form
  5.3 Appendix 3: Example Moderator's Guide
  5.4 Appendix 4: System Usability Scale Questionnaire
  5.5 Appendix 5: Acknowledgement Form

EXECUTIVE SUMMARY

A usability test of PediatricXpress, Version 20, an ambulatory EHR, was conducted on selected features of the PediatricXpress Version 20 EHR as part of the Safety-Enhanced Design requirements outlined in 170.315(g)(3) between January 30, 2019 and February 8, 2019 at 409 2nd Avenue, Collegeville, PA 19426. The purpose of this testing was to test and validate the usability of the current user interface and to provide evidence of usability in the EHR Under Test (EHRUT).

During the usability test, eleven (11) healthcare providers and/or other intended users matching the target demographic criteria served as participants and used the PediatricXpress EHR in simulated but representative tasks. This study collected performance data on eighteen (18) testing tasks typically conducted in the PediatricXpress EHR. Each task fit into one of the following categories:

CPOE – Medications (170.315 (a)(1)) – 3 tasks
CPOE – Laboratory (170.315 (a)(2)) – 2 tasks
CPOE – Imaging (170.315 (a)(3)) – 3 tasks
Drug-Drug, Drug-Allergy Interaction (170.315 (a)(4)) – 2 tasks
Demographics (170.315 (a)(5)) – 2 tasks
Clinical Decision Support (170.315 (a)(9)) – 1 task
Implantable Device List (170.315 (a)(14)) – 3 tasks
CCDA Receive, View and Reconcile (170.315 (b)(2)) – 2 tasks

Since the certification categories do not necessarily align with how users optimally utilize the PediatricXpress EHR system, we developed sections of the user testing that better aligned with how users should use PediatricXpress, with each section containing at least one and usually multiple tasks. Listed below are the sections in which users were tested on different tasks:

Enter and edit demographics for a patient
Complete the following medication activities:
  • Access, record and change medication orders
  • Act on an allergy and adjust drug allergy severity level
Access and record a lab order
Access, record and change a diagnostic imaging order
Use the clinical decision support
Access and enter an implantable device
View a received CCDA document and reconcile the data appropriately

Appendix 3 provides the details of the tasks associated with each of these sections as well as how each section and its task(s) link to the specific certification criteria. We scheduled and conducted eleven (11) one-hour, one-on-one usability tests for this study.
During the one-on-one usability test, each participant was greeted by the administrator and asked to review and sign an informed consent/release form (included in Appendix 2); they were instructed that they could withdraw at any time. Participants had prior experience with the EHR. The administrator introduced the test and instructed participants to complete a series of tasks (given one at a time) using the PediatricXpress EHR. During the testing, the administrator timed the test and, along with the data logger(s), recorded user performance data on paper and electronically. The administrator did not give the participant assistance in how to complete the task.

The following types of data were collected for each participant:

Number of tasks successfully completed within the allotted time without assistance
Time to complete the tasks
Number and types of errors
Path deviations
Participant's verbalizations
Participant's satisfaction ratings of the system

All participant data was de-identified; no correspondence could be made from the identity of the participant to the data collected. Following the conclusion of the testing, participants were asked to complete a post-test questionnaire. No compensation was provided to the participants. Various recommended metrics, in accordance with the examples set forth in the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, were used to evaluate the usability of the PediatricXpress EHR. Following is a summary of the performance and rating data collected on the PediatricXpress EHR, Version 20.

Table 1: Performance and Rating Summary

Task time and ratings are reported per scenario (group of tasks) and shown on the scenario rows, except for the CCDA scenario, where they are reported per task. N = number of participants; SD = standard deviation.

| Measure | N | Task Success, mean (SD) | Path Deviation, observed/optimal | Task Time, mean seconds (SD) | Task Time, observed/optimal | Errors, mean (SD) | Rating, mean (SD) (5 = easy) |
| CPOE – Medications | | | | 193 (31) | 120/30 | | 4.2 (0.75) |
| 170.315(a)(1) Task 1 – Access CPOE Medication Orders | 11 | 100% (0%) | 6/5 | | | 0% (0%) | |
| 170.315(a)(1) Task 2 – Record CPOE Medication Order | 11 | 91% (30%) | 30/22 | | | 9% (30%) | |
| 170.315(a)(1) Task 3 – Change CPOE Medication Order | 11 | 91% (30%) | 13/9 | | | 9% (30%) | |
| CPOE – Laboratory | | | | 39 (11) | 38/9 | | 4.9 (0.3) |
| 170.315(a)(2) Task 1 – Record CPOE Laboratory Orders | 11 | 100% (0%) | 14/9 | | | 0% (0%) | |
| 170.315(a)(2) Task 2 – Access CPOE Laboratory Orders | 11 | 100% (0%) | 7/5 | | | 0% (0%) | |
| CPOE – Imaging | | | | 44 (12) | 39/10 | | 4.8 (0.4) |
| 170.315(a)(3) Task 1 – Access CPOE Imaging Orders | 11 | 100% (0%) | 7/5 | | | 0% (0%) | |
| 170.315(a)(3) Task 2 – Record CPOE Imaging Orders | 11 | 100% (0%) | 11/9 | | | 0% (0%) | |
| 170.315(a)(3) Task 3 – Change CPOE Imaging Orders | 11 | 100% (0%) | 9/7 | | | 0% (0%) | |
| Drug-Drug, Drug-Allergy Interaction | | | | 52 (12) | 39/11 | | 4.2 (0.75) |
| 170.315(a)(4) Task 1 – Adjustments (Drug-Allergy severity level) | 11 | 82% (40%) | 12/9 | | | 18% (40%) | |
| 170.315(a)(4) Task 2 – Review and act upon Drug-Allergy interaction alert | 11 | 73% (47%) | 12/10 | | | 27% (47%) | |
| Demographics | | | | 24 (10) | 33/8 | | 5 (0) |
| 170.315(a)(5) Task 1 – Access Demographics | 11 | 100% (0%) | 6/5 | | | 0% (0%) | |
| 170.315(a)(5) Task 2 – Change Demographics | 11 | 91% (30%) | 7/6 | | | 9% (30%) | |
| Clinical Decision Support | | | | 60 (21) | 51/17 | | 4.4 (0.69) |
| 170.315(a)(9) Task 1 – Access and set CDS for Vitals related to Weight > 200 lbs. | 11 | 73% (47%) | 9/7 | | | 27% (47%) | |
| Implantable Device List | | | | 57 (17) | 60/15 | | 4.5 (0.52) |
| 170.315(a)(14) Task 1 – View all implantable devices | 11 | 100% (0%) | 6/5 | | | 0% (0%) | |
| 170.315(a)(14) Task 2 – Record an implanted device ID on patient chart | 11 | 100% (0%) | 14/11 | | | 0% (0%) | |
| 170.315(a)(14) Task 3 – Change implanted device ID on patient chart | 11 | 73% (47%) | 15/12 | | | 27% (47%) | |
| CCDA: Receive – View, Receive – Reconcile | | | | | | | |
| 170.315(b)(2) Task 1 – CCDA Receive and View | 11 | 91% (30%) | 4/3 | 32 (9) | 25 (8) | 9% (30%) | 4.4 (0.52) |
| 170.315(b)(2) Task 2 – CCDA Receive and Reconcile | 11 | 91% (30%) | 6/4 | 47 (10) | 41 (15) | 9% (30%) | 4.4 (0.52) |
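As a reading aid for Table 1, the task success figures are consistent with scoring each participant's attempt as 1 (task completed without assistance within the allotted time) or 0 (not completed) and reporting the sample mean and sample standard deviation across the eleven participants. The sketch below shows that calculation, using 170.315(a)(1) Task 2 (one failure out of eleven attempts) purely as an illustration; the variable and output names are illustrative, not taken from the report.

```python
from statistics import mean, stdev

# One binary outcome per participant: 1 = completed the task without
# assistance within the allotted time, 0 = did not.
# Ten successes and one failure reproduce the 91% (30%) reported for
# 170.315(a)(1) Task 2 in Table 1.
outcomes = [1] * 10 + [0]

task_success = mean(outcomes)   # 0.9090... -> reported as 91%
success_sd = stdev(outcomes)    # 0.3015... -> reported as (30%), sample SD

print(f"Task success: {task_success:.0%} ({success_sd:.0%})")
# Task success: 91% (30%)
```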
The primary tasks of study participants included:

| Certification Criteria | Task Description |
| 170.315 (a)(1) | Access CPOE Medication Orders |
| 170.315 (a)(1) | Record CPOE Medication Orders |
| 170.315 (a)(1) | Change CPOE Medication Orders |
| 170.315 (a)(2) | Record CPOE Laboratory Orders |
| 170.315 (a)(2) | Access CPOE Laboratory Orders |
| 170.315 (a)(3) | Access CPOE Imaging Orders |
| 170.315 (a)(3) | Record CPOE Imaging Orders |
| 170.315 (a)(3) | Change CPOE Imaging Orders |
| 170.315 (a)(4) | Adjustments (Drug-Allergy severity level) |
| 170.315 (a)(4) | Review and act upon Drug-Allergy interaction alert |
| 170.315 (a)(5) | Access Demographics |
| 170.315 (a)(5) | Change Demographics |
| 170.315 (a)(9) | Access and set CDS for Vitals related to Weight > 200 lbs. |
| 170.315 (a)(14) | View all implantable devices |
| 170.315 (a)(14) | Record an implanted device ID on patient chart |
| 170.315 (a)(14) | Change implanted device ID on patient chart |
| 170.315 (b)(2) | View CCDA received for a patient |
| 170.315 (b)(2) | Reconcile a CCDA for an existing patient |

The System Usability Scale scored the subjective satisfaction with the system, based on performance of these tasks, at 93.4. An average System Usability Scale score is considered to be 68, so a score under 68 is considered below average while a score above 68 is above average. Note that the System Usability Scale is a simple scale designed to provide a global view of the usability of a system. The scale was originally developed by John Brooke of Digital Equipment Corporation during the 1980s as a tool to be used in usability engineering of electronic office systems.

Major Findings

Since most of the users in the study had over six months of experience with the application, usability issues seemed to surface only in areas of the application with which the user was not familiar. The SUS, the impression-of-ease survey, and the subjective satisfaction survey all show that users view the application as easy to use overall.
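For context on the 93.4 SUS score reported above: standard SUS scoring converts the ten questionnaire items (Appendix 4), each rated 1–5, into a single 0–100 value. Odd-numbered items contribute (response - 1), even-numbered items contribute (5 - response), and the summed contributions are multiplied by 2.5. The sketch below implements that standard formula; the function name is arbitrary and the example response set is hypothetical, not taken from the study data.

```python
def sus_score(responses):
    """Standard SUS scoring for one completed 10-item questionnaire.

    Odd-numbered items (1, 3, 5, 7, 9) contribute (response - 1);
    even-numbered items (2, 4, 6, 8, 10) contribute (5 - response);
    the sum of contributions is scaled by 2.5 to give a 0-100 score.
    """
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    contributions = [
        (r - 1) if item % 2 == 1 else (5 - r)
        for item, r in enumerate(responses, start=1)
    ]
    return 2.5 * sum(contributions)

# Hypothetical response set, for illustration only (not study data):
print(sus_score([5, 1, 5, 2, 4, 1, 5, 1, 5, 2]))  # 92.5
```

Averaging the per-participant scores obtained this way across all eleven participants yields a study-level figure of the kind reported above.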
