
Article

Evaluation of Key Performance Indicators (KPIs) for Sustainable Postgraduate Medical Training: An Opportunity for Implementing an Innovative Approach to Advance the Quality of Training Programs at the Saudi Commission for Health Specialties (SCFHS)

Abdulrahman Housawi 1,*, Amal Al Amoudi 1, Basim Alsaywid 1,2,3,*, Miltiadis Lytras 1,4,*, Yara H. bin Moreba 1, Wesam Abuznadah 1 and Sami A. Alhaidar 1

1 Saudi Commission for Health Specialties, 11614; [email protected] (A.A.A.); [email protected] (Y.H.b.M.); [email protected] (W.A.); [email protected] (S.A.A.)
2 College of Medicine, King Saud Bin-Abdul-Aziz University for Health Sciences, Jeddah 14611, Saudi Arabia
3 Urology Section, Department of Surgery, King Abdulaziz Medical City, Ministry of National Guard, Jeddah 14815, Saudi Arabia
4 King Abdulaziz University, Jeddah 21589, Saudi Arabia
* Correspondence: [email protected] (A.H.); [email protected] (B.A.); [email protected] (M.L.)

 Received: 4 September 2020; Accepted: 23 September 2020; Published: 29 September 2020 

Abstract: The Kingdom of Saudi Arabia is undergoing a major transformation in response to the revolutionary Vision 2030, in which healthcare reform is one of the top priorities. With the objective of improving healthcare and allied professional performance in the Kingdom to meet international standards, the Saudi Commission for Health Specialties (SCFHS) has recently developed a strategic plan that focuses on expanding the capacity of training programs to align with the increasing demand for the country's healthcare workforce, providing comprehensive quality assurance and control to ensure that training programs uphold high quality standards, and providing advanced training programs benchmarked against international standards. In this research paper, we describe our approach to developing a general framework for key performance indicators (KPIs) and the related metrics, with the aim of contributing to new strategies for better medical training that is compatible with the future. We present the results of a survey conducted in the Kingdom of Saudi Arabia (KSA) for the enhancement of the quality of postgraduate medical training. Recent developments in the field of learning analytics present an opportunity for utilizing big data and artificial intelligence in the design and implementation of socio-technical systems with significant potential social impact. We summarize the key aspects of the Training Quality Assurance Initiative and suggest a new approach for designing a data and services ecosystem for personalized health professionals' training in the KSA. The study also contributes to the theoretical knowledge on the integration of sustainability and medical training and education by proposing a framework that can enhance future initiatives from various health organizations.

Keywords: key performance indicators (KPIs); attitudes; training programs; Saudi Commission for Health Specialties; framework; best practices; healthcare; medical training; residents; postgraduate medical training; learning analytics; artificial intelligence (AI)

Sustainability 2020, 12, 8030; doi:10.3390/su12198030 www.mdpi.com/journal/sustainability

1. Introduction

Sustainability in healthcare is a key initiative towards digital transformation. Numerous emerging technologies provide a novel ecosystem for data and services exploitation. Applied data science and sophisticated computational information processing in the healthcare domain define new highways for research and innovation. This research work aims to develop an overall unique quality framework for healthcare sustainability and technological innovation in the Kingdom of Saudi Arabia (KSA). It summarizes leading-edge research conducted by the Saudi Commission for Health Specialties (SCFHS).

1.1. The Saudi Commission for Health Specialties' Quality Assurance Initiative for Sustainable Personalized Medical Training

The SCFHS has recently developed a strategic plan that focuses on expanding the capacity of training programs to meet the increasing demand for the country's healthcare workforce, providing comprehensive quality assurance and control, ensuring that training programs uphold high quality, and benchmarking advanced training programs against international standards. This research paper summarizes a bold initiative towards the implementation of both this strategic plan for more efficient patient-centric healthcare and an interactive data ecosystem for enhanced decision making in the context of the activities and priorities of the SCFHS. The SCFHS was established by Royal Decree in 1993 to supervise and evaluate training programs, as well as to set regulations and standards for the practice of health and allied professionals. Since its establishment, the number of residency and fellowship training programs has increased: 1200 programs covering 79 health specialties are conducted across the country and the Gulf region [1]. These newly initiated training programs respond to the anticipated increase in demand for health professionals and training needs over the years. Moreover, there will likely be an expansion in training programs over the coming years that will require sound development planning to ensure high-quality training for health professionals. Some countries have taken the initiative to pursue quality measurement in competency-based postgraduate training [2–4]. Saudi Arabia, through the SCFHS, is among the countries that have developed a system that measures the quality outcomes of postgraduate training.
The SCFHS has launched the Training Quality Assurance Initiative (TQAI) which aims to develop a system for setting quality standards, benchmarks, processes, and improvement guidelines for the postgraduate training programs.

1.2. Governance of Training Programs in the SCFHS

The latest developments in the big data and analytics domain bring forward the significance of defining efficiency and performance metrics as a basis for the design of a data ecosystem for personalized value-adding services. The healthcare domain is among the most critical, with great social impact. Key performance indicators (KPIs) serve as a way of gently guiding clinicians in the right direction to improve clinical practice and are being used in different fields of medicine [5–7]. Santana et al. [8] identified nine requirements that KPIs should meet. Specifically, KPIs should be:

• Targeting important improvements (there is a direct connection with sustainability),
• Precisely defined (measurable metrics impose efficiency and enhanced decision making; they also provide a systematic way of monitoring performance over time, with significant managerial implications in the medical sector),
• Reliable,
• Valid,
• Implemented with risk adjustment,
• Implemented with reasonable cost,
• Implemented with low data collection effort,
• Achieving results which can be easily interpreted, and
• Global (overall evaluation).

In June 2017, the SCFHS's Secretary General requested the formation of a Quality Indicators Committee (QIC) to assess the postgraduate training programs' outcomes based on the set quality indicators. These indicators are measurable features that provide evidence reflecting quality and, hence, serve as a basis to help raise the quality of training provided to health professionals. The committee was formed through the collaborative efforts of the two main divisions of the SCFHS (i.e., Academic Affairs, and Planning and Corporate Excellence Administration). It consisted of the Chief of Academic Affairs, the Chief of Planning and Corporate Excellence, the Executive Director of Training (and Assistant), the Executive Director of Assessment (and Assistant), the Director of the Quality Department, and the Director of Knowledge Management. The objectives of the QIC are to:

• Improve the quality of postgraduate medical training (PGMT),
• Provide means for objective assessment of residency programs and training centers,
• Provide guidance to residency programs and training centers periodically and objectively, and
• Assist program directors in reviewing the conduct and educational quality of their programs.

Thus, this research paper presents the determined outcome measures based on the quality indicators to identify which elements need to be modified, which areas need to be improved, and/or which areas might be developed to fulfill the objectives of the SCFHS. Hence, the main aim of this research article is to evaluate the residency training programs being conducted throughout Saudi Arabia through the core set of preformed KPIs. The main objectives are to evaluate the quality of training programs across the nation, assess the degree of achievement of the pre-determined KPIs, and provide baseline data that support decisions for the improvement plan. As an ultimate objective, this research contributes a novel framework for the integration of KPIs into a sophisticated data and services ecosystem in the SCFHS that can be used in similar organizations worldwide.

This research paper is organized as follows. Section 2 integrates multidisciplinary research on analytics, KPIs, personalized medicine, and medical education. In Section 3, we present our integrated research methodology with an emphasis on demographics, methods deployed, and research objectives. Analysis and key findings of the analysis of training programs in the SCFHS are in Section 4, including statistical analysis. The main findings are presented in Section 5. The key contribution of our research, the SCFHS Decision Making Model for Training Programs, is then summarized in Section 6. Finally, in Section 7, we present the conclusions and future research aims for this research work.

2. Critical Literature Review on KPIs in Healthcare Training Programs for Sustainable Healthcare

The analysis of performance and efficiency in training programs; technology-enhanced learning; and the systematic analysis of teaching, learning, and training behavior is a key initiative in our days. In fact, in recent years, a new multidisciplinary domain has emerged: the domain of learning and training analytics. It provides critical insights into the methodologies, measurement tools, and trusted key performance indicators that enhance decision making in the training and learning process [9–11]. In Table 1, we summarize an overview of some key approaches in the literature that provide key insights into our methodological approach. What is more, a systematic evolution in the areas of big data analytics and data science [12,13] offers new unforeseen opportunities for the design, implementation, and functioning of data and services ecosystems, aiming to promote enhanced decision making. Within this diverse area, issues related to data governance, standardization of data, availability of data, and advanced data mining permit new insights and analytical capabilities. Furthermore, the social impact of big data research and the direct connection of sophisticated analytics to strategic management and decision making are also evident in various studies. In a parallel discussion on sustainable higher education [14,15], researchers emphasize the need to measure the impact of investments in training programs and to maximize the return on investment in social capital. A substantial body of literature also communicates that socially inclusive economic development must be driven by innovative education and by enhancing the skills and capabilities of citizens. Within the training domain, there is a fast-developing literature related to learning analytics and key performance indicators that provides significant insights for our research.
In the next paragraphs, we provide a compact synthesis of complementary approaches [16–20].

Table 1. Overview of literature review for the perceptions and utilization of learning analytics and key performance indicators (KPIs) for learning and training.

Table 1 rows (Author(s); Learning Analytics Metaphor; Key Interpretation; Impact on Our Research Model):

Authors [21–24]. Metaphor: Learning analytics and KPIs serve as a trusted measurement tool for enhanced decision making. Key interpretation: There is a critical need for the definition of measurable and efficient learning analytics and KPIs to support decision making. Impact on our research model: We have a tremendous interest in analyzing how KPI research can enhance the quality of training programs in the Saudi Commission for Health Specialties (SCFHS). We are looking for a set of KPIs that will allow for enhanced decision making and adjustment of training programs.

Authors [21–25]. Metaphor: Learning analytics as a key approach for learning behavior analysis and adjustment. Key interpretation: In the recent literature, there is an increasing interest in the use of KPIs and learning analytics for understanding, interpreting, and enhancing the learning behavior and adjustment. Impact on our research model: In our research study, we want to investigate how well-defined KPIs can be used for a short-term and long-term analysis of the learning behavior and attitudes of residents in postgraduate medical training programs.

Authors [22–24,26]. Metaphor: Learning analytics as a key approach for predicting performance and personalizing the learning experience. Key interpretation: The use of learning analytics to predict performance in training programs can be used as the basis for the adjustment of the learning experience for the benefit of trainees. Impact on our research model: In the wide range of the medical training programs of the SCFHS, this is not currently the priority. We are interested, though, in utilizing a newly established big data ecosystem towards this direction in the future.

Authors [21,22,26,27]. Metaphor: Learning analytics as a key approach for real-time monitoring of the learning process. Key interpretation: KPIs can provide a systematic way of monitoring the efficiency of various aspects of the learning and training process. Impact on our research model: In the variety of medical training programs of the SCFHS, there is a necessity to define and maintain a set of KPIs for monitoring the learning and training processes.

Authors [21–27]. Metaphor: Learning analytics as a key approach for customizing the learning experience. Key interpretation: KPIs and learning analytics can be the basis for flexible training programs based on variations of training approaches. Impact on our research model: A fully functional set of KPIs for the purposes of the SCFHS's training programs can be used for a deep understanding of obstacles in training and alteration of the training approach.

Authors [21–27]. Metaphor: Learning analytics as a key approach for standardizing the flow of instruction. Key interpretation: KPIs can be used as a methodological framework for the standardization of instruction. Learning objectives as well as learning outcomes in training programs can be codified as measurable KPIs. Impact on our research model: In the SCFHS, there is a special interest in a quality initiative that will be distributed across all training programs in a transparent way.

Authors [28–33]. Metaphor: Learning analytics as a means for interoperable technology-enhanced learning systems. Key interpretation: A thorough, sophisticated learning analytics and KPI framework can also empower interoperable or distributed technology-enhanced learning systems. Impact on our research model: The development of a reliable set of KPIs for the purposes of the quality initiative in the SCFHS can support interoperable learning services with various other stakeholders and their training services in the near future.

Selected research works in the domain of learning analytics and KPIs provide further insights, especially when dealing with the learning, teaching, and training process. Different methodological approaches emphasize diverse pedagogical approaches and interpret them in meaningful sets of KPIs and learning analytics [25–28]. Other researchers propose KPI interventions that integrate the execution of the learning process with management and decision making at the highest level within organizations [29–34]. In an effort to interpret the literature review of learning analytics and to provide the pillars of our research methodology that will be presented in the next section, some critical insights are useful.

In the context of the Saudi Commission for Health Specialties, thousands of medical training programs are offered. Our interest in maintaining a total quality initiative incorporates the development and monitoring of a reliable, trusted, and efficient set of KPIs. In this research study, we emphasize exactly this aspect. We present the first run and execution of a full set of KPIs related to the total quality initiative. This is a constructive knowledge management approach associated with big data analytics research, with significant implications for data science and sustainability. The standardization of KPIs and learning analytics for measuring the attitudes of residents in postgraduate medical training programs in the Kingdom of Saudi Arabia is a significant effort towards enhanced decision making. In this research study, we analyze the impact of using KPIs in order to develop a framework for sustainable medical education. We also introduce, in Section 6, the SCFHS Training Quality Approach (Medical Training Quality Framework). In fact, we contribute to the body of knowledge of the learning analytics and KPI domain by attaching a managerial approach to total quality management in medical education. Our approach is also related to the latest developments in data science. It also contributes to the sustainability domain and brings forward new challenges and requirements for the digital transformation of healthcare systems. The necessity of flexible, sustainable postgraduate medical education requires the design and implementation of socio-technical services that interconnect knowledge, knowledge processes, knowledge providers, and stakeholders [35–37]. In the next section, we provide the first phase of the quality initiative in the SCFHS. We focus on the design, implementation, and interpretation, for sustainable medical education purposes, of a new, innovative KPI and learning analytics framework.
As explained in this critical literature review, the main purpose of this framework is to support the following seven complementary objectives:

• Measurement tool for enhanced decision making,
• Learning behavior analysis and adjustment,
• Predicting performance and personalizing the learning experience,
• Real-time monitoring tool of the learning process,
• Customizing learning feedback,
• Standardization of instruction flow, and
• Interoperable technology-enhanced learning systems.

3. Research Methodology

The main aim of this research paper is to analyze the interconnections of quality assurance in medical training programs in the KSA as a key enabler of a big data ecosystem in the KSA for the provision of personalized medical training. Our research is directly related to Vision 2030 for digital transformation and is also a bold initiative for analyzing how sustainable data science can promote integrated, sophisticated medical services with social impact. The three integrated research objectives in our research are summarized as follows.

Research Objective 1: KPI definitions for efficient medical training. The main aim is to define and to operationalize a set of KPIs for efficient medical training.

Research Objective 2: KPIs' social impact and sustainability in medical education. The main aim is to understand and to interpret how a set of measurable, compact, and reliable KPIs can be used as the basis of a data ecosystem for efficient medical training.

Research Objective 3: Sustainable data science for medical education and digital transformation of healthcare. We analyze the main impact of our key contribution towards a long-term sustainable plan for the introduction of personalized medical training services for individuals and the community in the KSA.

3.1. Study Design

This research is a prospective, analytical, cross-sectional study that represents the quality outcome measures of the training programs and services provided by the SCFHS in 2018. The key performance indicator list was created by experts in quality and health profession education at the SCFHS. The initial list consisted of 28 KPIs but, after in-depth review and discussion, the committee omitted five KPIs from the final list and only 23 KPIs were selected. The theoretical framework of those KPIs was based on the Kirkpatrick model. The model is used to objectively analyze the impact of training, to work out how well trainees learned, and to improve their learning in the future [11]. The four-level training model, created in 1959, represents a sequence of ways to evaluate training programs. As one moves from one level to the next, the process becomes more difficult and time-consuming, but it also provides more valuable information. The KPIs are combined into domains that are described in Table 2.

Table 2. The List of Generated KPIs and Their Descriptors.

REACTION
GMH QI 1.1: Percentage of trainees' satisfaction
GMH QI 1.2: Percentage of trainers' satisfaction
GMH QI 1.3: Percentage of program directors' (PDs') satisfaction
GMH QI 1.4: Percentage of trainees' burnout

LEARNING
GMH QI 2.1: Percentage of program directors (PDs) who attended a PD training course offered by the SCFHS
GMH QI 2.2: Number of trainers in postgraduate medical training (PGMT) programs who successfully completed SCFHS training
GMH QI 2.3: Percentage of surveyors who have successfully completed the SCFHS's accreditation certification
GMH QI 2.4: Percentage of trainees' compliance with the minimal procedure and case exposure policies required by the competency index
GMH QI 2.5: Percentage of trainees who have received trainees' evaluation by their program in a specific period
GMH QI 2.6: Percentage of research inclusion in curricula
GMH QI 2.7: Percentage of programs with a burnout policy
GMH QI 2.8: Percentage of compliance with implementing the incorporated e-log system in each program
GMH QI 2.9: Percentage of trainees who fulfilled their promotion criteria
GMH QI 2.10: Percentage of trainees who passed the board exam
GMH QI 2.11: Percentage of programs that incorporated simulation in their curricula
GMH QI 2.12: Percentage of programs with trainees receiving an annual master rotation plan
GMH QI 2.13: Percentage of programs in compliance with the annual master plan

GOVERNANCE
GMH QI 3.1: Percentage of programs with complete goals and objectives for residency programs
GMH QI 3.2: Percentage of completed trainer evaluations by trainee per program
GMH QI 3.3: Percentage of adherence to accreditation requirements
GMH QI 3.4: PD turnover rate
GMH QI 3.5: Accreditation compliance score
GMH QI 3.6: Percentage of violations of the matching regulations

There were 23 general KPIs created, defined, and agreed upon by the Quality Indicators Committee, which were intended to provide objective evidence in measuring the level of performance and its progress. Establishing a positive and trusting relationship with the postgraduate training stakeholders is the core of the SCFHS reform strategy and success. To achieve this, all stakeholders were included in the process of reviewing the 23 agreed-upon KPIs in the GMH quality matrix. They were asked specifically whether these KPIs are measurable, usable, reliable, simple to understand, available (in terms of data), robust, and clearly defined. The selected KPIs were distributed to all stakeholders for review, feedback, and approval (20 December 2017).
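To make the stakeholder review concrete, the sketch below shows one possible way to represent a KPI record together with the seven review criteria listed above. The class and field names are hypothetical illustrations, not the SCFHS's actual data model.

```python
from dataclasses import dataclass, field

# The seven stakeholder-review criteria named in the text
REVIEW_CRITERIA = ["measurable", "usable", "reliable", "simple to understand",
                   "available (data)", "robust", "clearly defined"]

@dataclass
class KPI:
    # Hypothetical KPI record; names are illustrative only
    code: str            # e.g., "GMH QI 1.1"
    domain: str          # REACTION, LEARNING, or GOVERNANCE
    descriptor: str
    review: dict = field(default_factory=dict)  # criterion -> stakeholder approval

    def approved(self) -> bool:
        """A KPI passes review only if every criterion was approved."""
        return all(self.review.get(c, False) for c in REVIEW_CRITERIA)

kpi = KPI("GMH QI 1.1", "REACTION", "Percentage of trainees' satisfaction",
          {c: True for c in REVIEW_CRITERIA})
```

A structure like this makes the approval rule explicit: a single unreviewed or rejected criterion is enough to send a KPI back for revision.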

3.2. Sources of Data

The sources of data collected for the 23 KPIs were numerous, including primary and secondary data collection methods. Primary sources were self-administered, semi-structured, validated questionnaires, consisting of a series of closed and open-ended questions, distributed to program directors and trainees. The secondary sources of data were collected from the academic affairs department databases at the Saudi Commission for Health Specialties (SCFHS), which include databases from four different sections (the admission, accreditation, training, and assessment sections). Table 3 represents the matrix of the data sources for each KPI.

Table 3. The Sources of Data for Each KPI.

Table 3 is a matrix mapping each KPI code (GMH QI 1.1 through GMH QI 3.6) to its data sources. The columns are the primary data sources (the program directors' survey and the trainees' survey) and the secondary data sources (the admission, accreditation, training, and assessment department databases of the SCFHS).

3.3. Validity and Reliability of the Primary Data Collection Tool (i.e., Survey)

The questionnaire was first validated for content validity by content experts; after that, face validity was assessed with the help of a medical educationist, who confirmed that the survey fulfilled its objectives and that the questions flowed in a logical sequence. For the reliability of the questionnaire, a pilot study was conducted on 40 participants, and the questionnaire was modified according to the feedback of the respondents. Cronbach's alpha was calculated to be above 0.7, which suffices for reliability.
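The reliability check above can be reproduced in a few lines of code. The sketch below computes Cronbach's alpha from a respondents-by-items score matrix using the standard formula; the Likert responses are illustrative, not the study's pilot data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items
    item_var = items.var(axis=0, ddof=1)        # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1.0 - item_var.sum() / total_var)

# Illustrative 5-point Likert responses from a small pilot (hypothetical data)
pilot = np.array([[4, 5, 4], [3, 3, 4], [5, 5, 5], [2, 3, 2], [4, 4, 5]])
alpha = cronbach_alpha(pilot)
reliable = alpha > 0.7   # the study's acceptance threshold
```

In practice, an alpha below the 0.7 threshold would prompt revising or dropping weakly correlated items, as was done here after the pilot feedback.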

3.4. Program Directors' Survey

A well-designed interview questionnaire for program directors was developed through key informant interviews and extensive focus group discussions of the Quality Indicators Committee (QIC) panels. The questionnaire, consisting of 53 questions, was structured into four main sections. The first section covered the demographic details of the program directors; this section had 14 questions exploring the personal information and the base of the participants' practice. The second section was supervision and training, which contained 15 questions. The third section was educational content, and it had five questions. The fourth section was the support and resources section, which had 19 questions. Initially, the SCFHS had a list of 1145 program directors in the directory; after an extensive checking and verification of the list, it was found that there was some duplication and wrong information, so the

final list came out to be 1095 program directors in the country. During the survey period (December 2018 to February 2019), following a consecutive sampling technique, all 1097 program directors were invited. An online link was created for the survey and sent to all program directors through email, but only 464 replied to the email. Another 252 program directors were contacted, and an interview was conducted through a computer-assisted telephone interview (CATI) by SCFHS personnel. The total number of program directors who responded was 716 (65.3%).
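As a quick arithmetic check, the overall response rate follows directly from the two contact channels; the counts below are the ones reported in this section.

```python
email_responses = 464   # program directors who replied to the emailed survey link
cati_responses = 252    # program directors interviewed by telephone (CATI)
invited = 1097          # program directors invited during the survey period

total_responses = email_responses + cati_responses
response_rate = round(total_responses / invited * 100, 1)
# total_responses = 716; response_rate = 65.3 (percent)
```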

3.5. Trainees' Survey

The survey questionnaires for residents were developed by the PGMT Quality Indicators Committee (QIC) to produce an error-free measure of care quality. Measures should be based on characteristics of best practice such as validity, reliability, and transparency. The questionnaire was created on 25 September 2018, published on 1 October of the same year, and closed on 4 August 2019. The questionnaire consisted of six sections with questions distributed among them. Eight questions covered the demographic section; the second section had 14 questions asking about the trainees' educational activities. The third section had three questions regarding satisfaction with the training program. The fourth section had eight questions regarding perception and personal experience. The fifth and sixth sections had three and five questions on research participation and satisfaction with the SCFHS, respectively. There were a total of 13,688 residents working in different specialties throughout Saudi Arabia; only 3696 (27%) of the residents agreed to participate in the online survey. The trainers were left out of the survey due to time constraints. A total of 41 questions, representing indicators of the training programs' quality, were validated by expert and QIC panels for clarity and content relevance. The KPI working group is responsible for collecting and analyzing data each year and making sure that the indicators remain precise and appropriate, unlike Toussaint et al.'s suggestion that reporting should be on a quarterly basis [12].

3.6. Data Analysis

Data analysis was performed with the statistical program SPSS, version 24. A test was considered significant if the p-value was < 0.05. All data were analyzed based on each KPI's parameters and targets.

3.6.1. Data Management

After the data were collected and entered into Microsoft Excel, the respondents' data were rechecked for blank entries and typographical errors, and then the file was exported to SPSS format for further management and analysis.

3.6.2. Descriptive Statistics

For the presentation of descriptive statistics, categorical data (gender, region distribution, qualification, etc.) were reported as frequencies (n) and percentages (%), while numerical data such as age and years of experience were presented as mean ± standard deviation.
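Although the study used SPSS, the same summaries are straightforward to reproduce programmatically. The sketch below, using hypothetical respondent records rather than the study's data, computes the frequency/percentage and mean ± SD summaries described above.

```python
import pandas as pd

# Hypothetical respondent records (illustrative only, not the study's data)
df = pd.DataFrame({
    "gender": ["Male", "Male", "Female", "Male", "Female"],
    "region": ["Western", "Central", "Western", "Eastern", "Central"],
    "age":    [44, 38, 51, 47, 42],
})

# Categorical variables: frequency (n) and percentage (%)
freq = df["gender"].value_counts()
pct = df["gender"].value_counts(normalize=True) * 100

# Numerical variables: mean and sample standard deviation (ddof=1)
age_mean = df["age"].mean()
age_sd = df["age"].std()
```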

3.6.3. Inferential Statistics

A Chi-square test was used to assess the association between two categorical variables, such as gender and region distribution.
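For illustration, a Chi-square test of independence on a gender-by-region contingency table can be sketched as follows; the counts are hypothetical, not the study's figures.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical gender-by-region counts (rows: regions; columns: male, female)
table = np.array([
    [120, 80],   # Eastern
    [200, 110],  # Western
    [150, 60],   # Central
])

# chi2_contingency returns the statistic, p-value, degrees of freedom,
# and the table of expected counts under independence
chi2, p, dof, expected = chi2_contingency(table)
significant = p < 0.05   # the study's significance threshold
```

A p-value below 0.05 would indicate that the gender distribution differs across regions beyond what chance alone would explain.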

4. Analysis and Main Findings In this section, we provide the initial analysis of our data. We provide the statistical analysis for the key aspects of the research tools (surveys) discussed in the previous section. In Section5, we will Sustainability 2020, 12, x FOR PEER REVIEW 10 of 45

Sustainability 2020, 12, 8030

4. Analysis and Main Findings

In this section, we provide the initial analysis of our data. We present the statistical analysis for the key aspects of the research tools (surveys) discussed in the previous section. In Section 5, we will summarize the key interpretations and the impact of our research towards personalized medical training and the digital transformation of healthcare in the KSA.

4.1. Demographic Features of Our Respondents

4.1.1. Section One: Program Directors' Basic Characteristics

There were 716 program directors who completed the questionnaire and were included in the final analysis. The overall response rate of the program directors was 65.3% (716/1097). The respondents were mainly Saudi nationals (647, 92%) and predominantly male (560, 71.7%) (Figure 1); male directors were more prevalent in the northern and southern regions (Figure 2).
Figure 1. Gender Distribution among Program Directors.

Figure 2. Gender Distribution of Program Directors in Each Country Region.

Almost half (320, 44.7%) of the program directors were in the age group between 41 and 50 years (Figures 3 and 4). A Saudi board qualification was the most common highest qualification among our program directors (58.4%), followed by graduates of different regional programs (15%), such as the Arab, Egyptian, or Syrian boards. Figure 5 describes the distribution of the primary qualification among program directors; the option "other" included boards from Australia, Pakistan, South Africa, etc. In Figure 6, we also provide an overview of the qualification distribution among the different regions.

Figure 3. Age Distribution among Program Directors.

Figure 4. Gender Distribution in Each Age Group among Program Directors.

Figure 5. The Primary Qualification of the Program Directors.

Figure 6. Qualification Distribution among Different Regions (Chi-square test = 3.91, 4 df, p = 0.4).
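The p-value annotated on Figure 6 can be recovered from the chi-square statistic alone. As a quick sketch of that computation (pure Python, using the closed-form chi-square survival function that holds for an even number of degrees of freedom, rather than a statistics library), the reported statistic of 3.91 with 4 df yields p ≈ 0.42, consistent with the p = 0.4 shown after rounding:

```python
import math

def chi2_sf(x, df):
    """Survival function (upper-tail p-value) of the chi-square distribution.
    For even df, P(X > x) = exp(-x/2) * sum_{k=0}^{df/2 - 1} (x/2)^k / k!."""
    assert df % 2 == 0, "this closed form requires an even number of df"
    half = x / 2.0
    return math.exp(-half) * sum(half**k / math.factorial(k)
                                 for k in range(df // 2))

# Figure 6 reports Chi-square = 3.91 with 4 degrees of freedom.
p = chi2_sf(3.91, 4)
print(round(p, 2))  # ~0.42, i.e., p = 0.4 at one decimal place
```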


Most of the program directors (44%) were recently appointed, with less than two years' experience in the job, while 15% have been in the post for more than six years (see Figure 7). In Figure 8, we also provide an indication of the time spent as program director by gender.


Figure 7. Time Spent in Years as Program Director.

Figure 8. Time Spent as Program Director by Gender.

Program directors who responded to our survey resided in over 27 cities. Riyadh was the most common city of residence for the program directors (38.4%), followed by Jeddah (17.9%) (see Figure 9).


Figure 9. Program Directors' City of Residence.

The respondents covered around 60 different appointed programs (Figure 10). Program directors from the disciplines of medicine and its subspecialties, pediatrics, surgical specialties, dentistry, and OBGY constituted over two-thirds of the respondents (Figure 11).


Figure 10. Programs of the Appointed Program Directors Surveyed.

Figure 11. Gender Distribution of Program Directors among Disciplines.

In Figures 12–21, we present some additional characteristics and qualitative features of the participants in our quality initiative, including the age of program directors in each discipline (Figure 12) and the health sectors of the included program directors (Figure 13).

Figure 12. Age of Program Directors in Each Discipline.


Nearly half of the program directors work in the Ministry of Health (49.4%), followed by the Ministry of Defense (16.2%); only three directors work in the Royal Commission for Jubail and Yanbu.

Figure 13. Health Sectors for the Included Program Directors.

Nearly half of the program directors hold other administrative positions at their centers, including head of department or section (29.9%) (see Figure 14 below).

Figure 14. Other Appointed Administrative Positions for the Program Directors.

In Figures 15 and 16, some more interesting data about our participants are presented. Over 27% of the PDs have other special qualifications, including medical education, hospital administration, public health, clinical epidemiology, and others, but only 8.9% of the program directors hold a degree or certificate in health profession education.

Figure 15. Percentage of Program Directors with a Degree in Health Profession Education among Disciplines.

Figure 16. Percentage of Program Directors with a Degree in Health Profession Education in Different Regions.

4.1.2. Trainees’ Demographic Data Figure 16. Percentage of Program Directors with Degree in Health Profession Education in Different Regions.There were 3696 trainees who participated in this self-administered, structured, closed, and open-ended online survey. The47.7% response rate of the trainees through52.3% the online link survey was 27%. There were 1932 (52.3%) male respondents and most of the respondents were Saudi nationals, 91.9% 4.1.2. Trainees’ Demographic Data47.7% (see Figure 17). 52.3% There were 3696 trainees who participated in this self-administered, structured, closed, and open-ended online survey. The response rate of the trainees through the online link survey was 27%. There were 1932 (52.3%) male respondents and most of the respondents were Saudi nationals, 91.9% (see Figure 17).

Male Female

Male Female Figure 17. Gender Distribution among Trainees.

The respondents were representingFigure 17. Gender all majorDistribution disciplines, among Trainees.however, most of the trainees were trainees in medicalFigure disciplines, 17. Gender 90.9% (see Figure Distribution 18, below) among Trainees. The respondents were representing all major disciplines, however, most of the trainees were trainees in medical disciplines, 90.9% (see Figure 18, below) 100.0% 90.0% 100.0% 90.9% 80.0% 90.0% 70.0% 90.9% 80.0% 60.0% 70.0% 50.0% 60.0% 40.0% 50.0% 30.0% 40.0% 20.0% 30.0% 10.0% 20.0% 5.9% 1.7% 0.9% 0.6% 0.0% 10.0% Medicine Dentistry5.9% Nursing 1.7% Pharmacy 0.9% Applied 0.6% Health 0.0% MedicineFigure 18. Dentistry Distribution of Trainees Nursing among Disciplines. Pharmacy Applied Health Figure 18. Distribution of Trainees among Disciplines. Residents in trainingFigure represented 18. Distribution the most of Traineescommon among respondents, Disciplines. 94.5%, while fellows were 5.5%. One-third of the residents were working in the R2 level (1270, 34.4%) followed by 23.3% at the R3 levelResidents (860 =), in (See training Figure represented 19, below). the most common respondents, 94.5%, while fellows were 5.5%. One-third of the residents were working in the R2 level (1270, 34.4%) followed by 23.3% at the R3 level (860 =), (See Figure 19, below). Sustainability 2020, 12, x FOR PEER REVIEW 20 of 45 Sustainability 2020, 12, 8030 15 of 37 40.0% Sustainability 2020, 12, x FOR PEER REVIEW 20 of 45 35.0% 34.4% 40.0% 30.0%

Figure 19. Level of Trainees in Our Study Sample.

In Figure 20, we show that more than one-third of the trainees are in the Central Region of Saudi Arabia (40.4%), followed by approximately one-third in the Western Region (34.5%); a very small number are in the Northern Region (0.4%).

Figure 20. Distribution of Trainees among Different Regions in Saudi Arabia.

The highest number of trainees work in the city of Riyadh (39.6%), followed by 20.1% in Jeddah; only 0.4% work in the city of Hail (see Figure 21, below).

Figure 21. The Distribution of Trainees among Different Cities.

4.2. Results of Key Performance Indicator (KPI) Domains

Reaction domain of Kirkpatrick

The initial set of KPIs was based on the reaction domain of the Kirkpatrick training evaluation model, which measures the value of the training program among stakeholders, i.e., trainees, trainers, and program directors. There were four KPIs developed based on this domain.

4.2.1. Percentage of Trainees' Satisfaction (GMH QI 1.1)

This is a score-based KPI, with a target set at 80% satisfaction. It is an accumulative score of eight different domains retrieved from the trainees' questionnaire: residents' satisfaction with the academic activities in their training centers, with the other residents in their programs, with the administrative component of their training, with the program and their willingness to recommend it to others, with their training center, with their competency, with their specialty, and with SCFHS functions. The calculated score for overall trainees' satisfaction was 69%, so the target has not yet been achieved; more details are described in Figure 22.
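As a sketch of how such an accumulative score can be computed, the snippet below averages per-domain satisfaction rates into an overall KPI and checks it against the 80% target. Note that the domain keys and all numeric values here are illustrative placeholders, not the study's actual per-domain survey results:

```python
# Hedged sketch of an accumulative, score-based KPI in the style of GMH QI 1.1.
# Domain names and scores below are illustrative placeholders only.
TARGET = 0.80

domain_scores = {
    "academic_activities": 0.73,
    "fellow_residents": 0.86,
    "administration": 0.56,
    "program_recommendation": 0.65,
    "training_center": 0.70,
    "competency": 0.63,
    "specialty": 0.65,
    "scfhs_functions": 0.53,
}

def accumulative_score(scores):
    """Unweighted mean of the per-domain satisfaction rates."""
    return sum(scores.values()) / len(scores)

overall = accumulative_score(domain_scores)
print(f"overall satisfaction = {overall:.0%}, target met: {overall >= TARGET}")
```

A weighted mean could be substituted if some domains were deemed more important; the unweighted form is the simplest reading of "accumulative score of eight different domains."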

Figure 22. Trainees' Satisfaction Scores.

4.2.2. Percentage of Trainers' Satisfaction (GMH QI 1.2)

This KPI is defined as the total number of satisfied trainers divided by the total number of surveyed trainers in that period of time. The target score was set at an 80% satisfaction rate. Unfortunately, these data were unavailable; trainers were not surveyed last year.

4.2.3. Percentage of Program Directors' (PDs') Satisfaction (GMH QI 1.3)

This is a score-based KPI, with a target set at 80% satisfaction. This accumulative score contains four main satisfaction domains: satisfaction with the training faculty; with the evaluation process; with the educational components; and with the training center's governance, education environment, and resources. The overall score for PDs' satisfaction was calculated at 76%, slightly below our target value of 80%; this is explained in detail in Figure 23.




Figure 23. Program Directors' Satisfaction Scores.

4.2.4. Percentage of Trainees' Burnout (GMH QI 1.4)

This KPI was measured through the trainees' questionnaire and defined as the number of trainees who feel burnout [38–49] divided by the total number of surveyed trainees in that period of time. The target value was set at a level equal to or below 10%. The calculated score was 66.7%, and the breakdown is demonstrated in Figures 24–26.
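The ratio-style KPIs in this section differ mainly in the direction of their targets: satisfaction KPIs must meet or exceed a threshold, while the burnout KPI must stay at or below one. A minimal sketch of a direction-aware ratio KPI follows; the counts used are illustrative placeholders chosen to reproduce the reported 66.7% rate, not the study's raw survey numbers:

```python
# Hedged sketch of a ratio KPI with a direction-aware target.
# The counts below are illustrative placeholders, not the study's raw data.
from dataclasses import dataclass

@dataclass
class RatioKPI:
    name: str
    numerator: int        # e.g., trainees reporting burnout
    denominator: int      # e.g., total surveyed trainees
    target: float         # threshold expressed as a fraction
    higher_is_better: bool

    @property
    def score(self) -> float:
        return self.numerator / self.denominator

    @property
    def target_met(self) -> bool:
        if self.higher_is_better:
            return self.score >= self.target
        return self.score <= self.target

burnout = RatioKPI("GMH QI 1.4 burnout", numerator=667, denominator=1000,
                   target=0.10, higher_is_better=False)
print(f"{burnout.name}: {burnout.score:.1%}, target met: {burnout.target_met}")
```

The same class covers a satisfaction-type KPI by setting `higher_is_better=True` with, say, an 80% target.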

Figure 24. The Prevalence of Burnout among Residents.

Figure 25. The Mean Burnout Scores According to the Trainees' Level of Training.

Figure 23. Program Directors’ Satisfaction Scores.

4.2.4. Percentage of Trainees’ Burnout (GMH QI 1.4) This KPI was measured through the trainees’ questionnaire and defined as the number of trainees who feel burnout [38–49] divided by the total number of surveyed trainees in that period of time. The target value was set at a level equal to/below 10%. The calculated score was 66.7% and the breakdown score is demonstrated in Figures 24–26.

45.0%

40.0%

35.0% 38.5%

30.0%

25.0% 28.2%

20.0%

15.0% 15.6% 10.0% 13.7%

5.0% 3.9% 0.0% Always Sometimes Rarely It depends Never Sustainability 2020, 12, 8030 18 of 37 Figure 24. The Prevalence of Burnout among Residents.

80.00% 78.00% 77.4% 77% 75.8% 75.8% 76.00% 74% 74.00% 72.6% 72.00% 70.8% 70.00% 68% 68.00% 66.00% 64.00% 62.7% 62.00% 60.00% R 1 R 2 R 3 R 4 R 5 R 6 F 1 F 2 F 3

Figure 25. The Mean Burnout Scores According to the Trainees’ Level of Training. SustainabilityFigure 2020 25., 12,The x FOR Mean PEER REVIEW Burnout Scores According to the Trainees’ Level of Training.24 of 45

Figure 26. The Mean Burnout Scores According to the Trainees' City of Residency.

Learning domain of Kirkpatrick

The second set of KPIs focused on the learning domain, which concerns the capability of developing new knowledge, skills, and attitudes; it contains 13 different KPIs.

4.2.5. Percentage of PDs Who Attended PD Training Course Offered by the SCFHS (GMH QI 2.1)

This KPI is defined as the number of program directors who attended the course divided by the total number of program directors in the same period. There was no target value set for this KPI. In total, 38.4% of the program directors received such courses, which is better than last year, when the figure was calculated at 15% (see Figures 27 and 28, below).

Figure 27.FigurePercentage 27. Percentage of Program of Program Directors Directors Wh Whoo Received Received Orientation Orientation by Gender. by Gender. Sustainability 2020, 12, 8030 19 of 37 Sustainability 2020, 12, x FOR PEER REVIEW 25 of 45

Figure 28. The Percentage of Orientation Received by Program Directors by Region.
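Most of the indicators in this section are simple ratios measured against a target. As a minimal sketch (the helper names and raw counts are our own illustrative assumptions; only the reported percentages and targets come from the text), such a KPI could be computed as:

```python
from typing import Optional

def kpi_percentage(numerator: int, denominator: int) -> float:
    """Ratio-style KPI, e.g. PDs who attended the course / all PDs, as a percent."""
    if denominator <= 0:
        raise ValueError("denominator must be positive")
    return 100.0 * numerator / denominator

def meets_target(value: float, target: Optional[float]) -> Optional[bool]:
    """Compare a KPI value against its target; returns None when no target
    is set, as with GMH QI 2.1."""
    return None if target is None else value >= target

# Hypothetical counts yielding the reported 38.4%:
attended_pct = kpi_percentage(384, 1000)
```

The same two helpers cover every ratio-style indicator in this section; only the numerator, denominator, and target change per KPI.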

4.2.6. Percentage of Trainers in PGMT Programs Who Successfully Completed SCFHS Training (GMH QI 2.2)

This KPI is defined as the total number of trainers certified by the SCFHS in 2018. The data source for this KPI was the Training Department Database. The target score was set at the value of 200. In 2018, there were 276 trainers who completed the SCFHS training course. This KPI achieved its target, and the score is significantly higher than last year, 2017, when it was 66% of the target.

4.2.7. Percentage of Surveyors Who Have Successfully Completed SCFHS's Accreditation Training (GMH QI 2.3)

This KPI is defined as the total number of surveyors certified by the SCFHS divided by the total number of surveyors. The target was set at a level of 70%. The calculated score for this year was 67%, as per the feedback from the Accreditation Department Database.

4.2.8. Percentage of Trainees' Compliance with Minimal Procedure, Case Exposure Policies Required Competency Index (GMH QI 2.4)

This KPI's target value was set at 90%; however, we have not yet identified the data source to be used, and the KPI has not yet been measured. Our expectation is that these data will be available through the One45 program by 2020, and we will work on achieving that.

4.2.9. Percentage of Trainees Who Have Received Trainees' Evaluation by Program in a Specific Period (GMH QI 2.5)

This KPI is defined as the total number of trainees who received their evaluation within two weeks of the end of every rotation (see Figure 29, below). The target was set at the level of 100% compliance. Ideally, this score should be collected from the trainees but, unfortunately, it was not included in their survey; therefore, the only available data were in the program directors' survey. The mean score for compliance was estimated to be 74.2%. This compliance rate was nearly identical in each region, with p = 0.9 (ANOVA test).
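The regional comparison above relies on a one-way ANOVA. As a rough sketch of the statistic behind that test (pure Python; the per-region compliance scores below are invented for illustration, and only the reported p = 0.9 comes from the survey), the F ratio compares between-region to within-region variability:

```python
from typing import List

def mean(xs: List[float]) -> float:
    return sum(xs) / len(xs)

def one_way_anova_f(groups: List[List[float]]) -> float:
    """F = between-group mean square / within-group mean square."""
    all_vals = [v for g in groups for v in g]
    grand = mean(all_vals)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((v - mean(g)) ** 2 for v in g) for g in groups)
    df_between = len(groups) - 1
    df_within = len(all_vals) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Hypothetical per-region compliance scores (percent), nearly identical
# across regions, as the survey found:
regions = [[74.0, 75.1, 73.8], [74.5, 73.9, 74.9], [74.2, 74.8, 73.6]]
f_stat = one_way_anova_f(regions)
```

An F statistic near zero (groups with almost equal means) corresponds to a large p-value, consistent with the reported p = 0.9; in practice a library routine such as SciPy's `f_oneway` would also return the p-value.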


Figure 29. The Satisfaction Rate of the Program Directors with the Compliance of the Training Faculty in Submitting Trainees' Evaluations in Time.

4.2.10. Percentage of Research Included in Curriculum (GMH QI 2.6)

This is defined as the number of programs that included research-related issues. The target value was set at 70%. The source of information for this KPI is the Training Department Database but, unfortunately, these data are not yet available. Research participation was documented in 33.1% of the trainees' surveys.

4.2.11. Percentage of Programs with Burnout Policy (GMH QI 2.7)

This KPI's target value was set at a 100% compliance rate. The source of information was the program directors' survey, which counted compliance with such a policy at 69%. In the future, this KPI should rely on accreditation data as well.

4.2.12. Percentage of Compliance with Implementing Incorporated e-log System in Each Program (GMH QI 2.8)

This KPI is defined as the number of programs that are implementing an incorporated e-log system divided by the number of all programs. The target value was set at 30%. The source of such information is the Training Department Database. Unfortunately, the score is not available, as the data have not yet been received.

4.2.13. Percentage of Trainees Who Fulfilled Their Promotion Criteria (GMH QI 2.9)

This KPI's target was set at 70%, and the Training Department Database is the main source of this information. Unfortunately, the score is not available, as the data have not yet been received.

4.2.14. Percentage of Trainees Who Passed the Board Exam (GMH QI 2.10)

This KPI's target was set at a 70% pass rate in the board exam, and the Training Department Database (and Assessment Department Database) is the main source of this information. The calculated score for this KPI in 2018 was 79%.

4.2.15. Percentage of Programs That Incorporated Simulation in Their Curricula (GMH QI 2.11)

The target value of this KPI was set at 50% or more. At present, the sources of this information were the program directors' survey (Q 3.1) and the trainees' survey (Q10). The score in the program directors' survey was only 3%, while the score in the trainees' survey was 34.2%. The cumulative mean score was only 18.6%.
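The cumulative mean reported here is the unweighted average of the two instrument scores. A one-line sketch (the function name is ours; the two input scores are the ones reported above):

```python
def cumulative_mean(scores):
    """Unweighted mean of several survey-derived scores, in percent."""
    return sum(scores) / len(scores)

# PDS Q 3.1 = 3% and TS Q10 = 34.2% combine to the reported 18.6%:
simulation_kpi = cumulative_mean([3.0, 34.2])
```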

4.2.16. Percentage of Programs with Trainees Receiving Annual Master Rotation Plan (GMH QI 2.12)

This KPI is defined as the number of trainees receiving their annual master training program before 1 October. The target value of this KPI was set at above 90%. The source of this information was the program directors' survey. Around 73% of the program directors apply for the annual master rotation plan for residents in their center early in the academic year, as per program directors' feedback (see Figures 30 and 31, below).

Figure 30. The Rate of Applying for Annual Rotation Plan for Residents Early in the Academic Year.

Figure 31. The Rate of Applying for Annual Rotation Plan for Residents Early in the Academic Year by Region.

4.2.17. Percentage of Programs in Compliance with the Annual Master Plan (GMH QI 2.13)

This KPI is defined as the number of trainees who completed their block rotation. The target value of this KPI was set at a level above 90%. The source of this information was the program directors' survey. The mean score of adherence to the master rotation plan set early in the academic year was 76.9%; this rate was similar across genders, types of training center, and regions (see Figure 32, below).

Figure 32. The Compliance Rate (Adherence) to the Master Rotation Plan Set Early in the Academic Year.

Training domain of Kirkpatrick

The final set of KPIs focuses on the training governance domain and contains six different KPIs.

4.2.18. Percentage of Programs with Complete Goals and Objectives for Residency Programs (GMH QI 3.1)

This is defined as the total number of programs with a clear statement outlining goals and educational objectives for their residency program. The target value for this KPI was set at 80%. Clear goals and objectives of every rotation were provided to 87.7% of residents (see Figure 33, below). This was similar in both genders and all regions. However, program directors who received orientation about the roles and responsibilities of the job were significantly better in providing the residents with those objectives in every rotation.

Figure 33. The Rate of Providing the Residents with Goals and Objectives of Every Rotation among Program Directors Who Did/Did Not Receive Orientation on Their Roles and Responsibilities.

4.2.19. Percentage of Completed Trainer Evaluations by Trainee per Program (GMH QI 3.2)

The target score for this KPI was set at 70%. The trainees had a chance to evaluate the training faculty in a structured process in 48.7% of cases (see Figures 34–36, below). This was not significantly different between genders, regions, or types of program (independent or joint). However, a significant difference was found when the program director spends 2–3 h managing the training activity in comparison to other time frames.

Figure 34. The Rate of Enabling the Trainees to Evaluate the Faculty in a Structured Process According to the Gender of the Program Director.

Figure 35. The Rate of Enabling the Trainees to Evaluate the Faculty in a Structured Process by Region.

Figure 36. The Rate of Enabling the Trainees to Evaluate the Faculty in a Structured Process According to the Time Spent by Program Directors to Manage the Training Activities.

4.2.20. Percentage of Adherence to Accreditation Requirements (GMH QI 3.3)

This is defined as the rate of programs maintaining accreditation standards during the validity period. The target value of this KPI was set at over 90% compliance. The source of information is the Accreditation Department Database. Unfortunately, the data are not yet available; we have not received them yet.

4.2.21. Percentage of PD Turnover Rate (GMH QI 3.4)

This is the number of program directors who did not complete their full term. The target was set below 10%. Unfortunately, these data are not available. The source of such information should be the Training Department Database.

4.2.22. Percentage of Accreditation Compliance Score (GMH QI 3.5)

This is the number of centers that met accreditation standards. The target value of this KPI is to be determined.

4.2.23. Percentage of Violations with the Matching Regulations (GMH QI 3.6)

This is the total number of matching violations in a given year. The target value was set below 1%. The source of this information is the Admission and Registration Database. The rate of matching violations for 2018 was 4%. In Table 4, we provide a summary of the key performance indicator measurements.

Table 4. Summary of Key Performance Indicator Results.

Domain | KPI ID | Definition | Questions and Calculation | Result
Reaction | GMH QI 1.1 | Trainees' satisfaction | TS Q10, TS Q11, TS Q12, TS Q13, TS Q16, TS Q17, TS Q25 | 69%
Reaction | GMH QI 1.2 | Trainers' satisfaction | | Unavailable
Reaction | GMH QI 1.3 | Program directors' satisfaction | PDS Q 2.4, PDS Q 2.6, PDS Q 3.3, PDS Q 4.2 | 76%
Reaction | GMH QI 1.4 | Trainees' burnout | TS Q 14 | 66.7%
Learning | GMH QI 2.1 | Program directors who attended training course offered by SCFHS | PDS Q 2.1 | 38.4%
Learning | GMH QI 2.2 | Trainers in PGMT programs who successfully completed SCFHS training certification | Training Department Database | 276
Learning | GMH QI 2.3 | Surveyors who have successfully completed SCFHS's accreditation training | Accreditation Department Database | 67%
Learning | GMH QI 2.4 | Trainees' compliance with minimal procedure, case exposure policies required competency index | | Unavailable
Learning | GMH QI 2.5 | Trainees who have received trainees' evaluation by program in a specific period | PDS Q 2.6 | 74%
Learning | GMH QI 2.6 | Research included in curriculum | Training Department Database (TS Q 22 = 33.1%) | Unavailable
Learning | GMH QI 2.7 | Programs with burnout policy | PDS Q 4.2 = 69% | 69%
Learning | GMH QI 2.8 | Compliance with implementing incorporated e-log system in each program | Training Department Database | Unavailable
Learning | GMH QI 2.9 | Trainees who fulfilled their promotion criteria | Training Department Database, Assessment Department Database | Unavailable
Learning | GMH QI 2.10 | Trainees who passed the board exam | Assessment Department Database | 79%
Learning | GMH QI 2.11 | Programs that incorporated simulation in their curricula | PDS Q 3.1 = 3%, TS Q 10 = 34.2% | 18.6%
Learning | GMH QI 2.12 | Programs with trainees receiving annual master rotation plan | PDS Q 2.7 | 73%
Learning | GMH QI 2.13 | Programs' compliance with the annual master plan | PDS Q 2.8 | 76.9%
Training Governance | GMH QI 3.1 | Programs with complete goals and objectives for residency programs | PDS Q 3.2 | 87.7%
Training Governance | GMH QI 3.2 | Completed trainer evaluation by trainee per program | PDS Q 2.9 | 48.7%
Training Governance | GMH QI 3.3 | Adherence to accreditation requirements | Accreditation Department Database | Not Available
Training Governance | GMH QI 3.4 | PD turnover rate | TBD | Not Available
Training Governance | GMH QI 3.5 | Accreditation compliance score | TBD | Not Available
Training Governance | GMH QI 3.6 | Violations with the matching regulations | Admission and Registration Database | 4%
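A scorecard like Table 4 lends itself to a simple programmatic roll-up, for example to track data availability and target attainment over time. Below is a sketch using a small subset of the table; the targets are taken from the KPI descriptions in the preceding subsections, while the dictionary layout and variable names are our own assumptions:

```python
# Illustrative subset of the Table 4 scorecard. "result" is None where the
# underlying data were unavailable; "target" is None where no target was set.
kpis = {
    "GMH QI 2.1": {"result": 38.4, "target": None},   # no target set
    "GMH QI 2.3": {"result": 67.0, "target": 70.0},   # surveyors trained
    "GMH QI 2.8": {"result": None, "target": 30.0},   # data unavailable
    "GMH QI 2.10": {"result": 79.0, "target": 70.0},  # board exam pass rate
}

# Share of KPIs for which data were actually available:
available = [k for k, v in kpis.items() if v["result"] is not None]
availability_rate = 100.0 * len(available) / len(kpis)

# Of those with a numeric target, which met it:
met = [k for k in available
       if kpis[k]["target"] is not None and kpis[k]["result"] >= kpis[k]["target"]]
```

Such a roll-up makes the data-availability gap reported throughout Section 4.2 itself a measurable quantity, which could be monitored as the departmental databases come online.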

5. Key Findings Related to the Research Objectives

Our research work contributes to the intersections of the knowledge management, quality assurance, and medical education domains, while providing direct input to applied data science and sustainability. The overall idea that several well-defined, accurate, measurable, and reliable sets of key performance indicators can govern a holistic quality assurance initiative for the provision of high-quality medical training is a bold knowledge management case. It also provides an operational, functional sustainability framework for medical education, supporting short- and long-term analysis of performance. Our work is also significant for the computer science and data analytics domains. The technological infrastructure for collecting all data required for measurement, together with the standardization of these KPIs, serves as a transparent managerial method for measuring efficiency over time. The SCFHS is moving forward with plans for a holistic data science strategy connecting several knowledge-intensive personalized medical services to its users. In this section, we provide the key interpretations of the survey results presented in previous sections. We elaborate further on these findings in Section 6, where we provide our contribution to the theory of sustainability for the healthcare domain by interconnecting the main findings with our strategic proposition for enhanced decision making. In Table 5, below, we present a high-level abstraction of the key findings related to the three significant research objectives of our study.

Table 5. Key Findings Related to the Research Objectives.

1. KPI definition for efficient medical training
• A set of 23 KPIs based on the work of Kirkpatrick provides trusted, measurable, and efficient metrics to support decision making.
• Different aspects of medical training can be enhanced and supported by the continuous monitoring of these KPI sets.
• There are critical requirements for the availability of relevant data related to medical training.

2. KPIs' social impact and sustainability in medical education
• The continuous measurement of the proposed KPIs is directly linked to various components of sustainability in medical education. In the discussion section, we provide our methodological contribution.
• The synthesis of the key interpretations of the KPIs leads to a theoretical contribution, the SCFHS Framework for Sustainable Medical Education, which is presented in a relevant section of this research study.
• The social impact of medical education is also directly linked to a continuous quality assurance and performance process in medical training programs. This effort is progressive, and KPIs enhance the strategic alignment of training programs to sustainable goals and digital transformation.

3. Sustainable data science for medical education and digital transformation of healthcare
• The variety, sophistication, and wide range of medical training programs in the SCFHS require an integrated big data and analytics ecosystem for the management of data and services for personalized training.
• The community of the SCFHS, as well as its stakeholders' network, can be supported and enhanced through digital infrastructures and smart services.
• A social network that integrates the expertise and the human capital of health specialists can be a significant enabler for the digital transformation of health in Saudi Arabia.
• Future research on the requirements of an integrated medical data ecosystem for training and education is also promoted.

5.1. Recommendations of the Program Directors and Trainees in Improving the Quality of Training

Burnout

1. The burnout rate in trainees is multifaceted, but most residents agree that lack of staff at the training center is the main reason. Training centers need to ensure the proper distribution of workload among residents with appropriate staff coverage.
2. The resident dropout rate should be monitored closely for each program and investigated in each training center; the reasons should be clarified and addressed with appropriate actions.
3. The residents' annual vacation is their right, and the training center should protect their ability to choose the time that suits the residents best, not the program.
4. Trainees suggest that one day per month should be offered as mandatory admin leave to every resident (the month and date should be determined by the resident only).
5. Under no circumstances should residents cover the service without appropriate supervision, including the ER rotation, and they should not be prevented from attending academic activities.

Key Requirements for a Sustainable Medical Education Framework:

• A unified ecosystem of smart medical educational content and distance learning facilities can significantly reduce the burnout rate. More flexible procedures and cloud training services can be a bold action for the improvement of the burnout rate. This was one of the most critical findings; the SCFHS takes it seriously into consideration, and various initiatives are planned, closely related to the digital transformation in Vision 2030.

• The analysis of the dropout rate and its critical reasons must be monitored through a transparent web service capable of monitoring responsive actions. In an integrated service, advanced notification and counseling procedures can support residents' training and their commitment to participate and learn.
• Time restrictions related to annual vacations or obligations must be supported, and training must always be a priority as a long-term capability that enhances the full competence of the Saudi healthcare system. Towards this direction, a new hybrid training platform and strategy must be implemented. Human factors are a critical aspect of a Sustainable Medical Education Framework.
• Rewards for training efficiency and performance must be introduced for residents, providing a transparent mechanism for facilitating training without dropouts. This reward system can be part of a holistic data and services ecosystem in the SCFHS.

Working hours

6. Given that residents struggle to find time to read during working hours, centers have to make sure that the educational environment is proper for residents' educational needs by enforcing academic half-days on a weekly basis and the availability of rooms, reading areas, and resources such as libraries with updated textbooks.
7. Residents are struggling with long working hours even after their on-call times; therefore, centers have to ensure that post-call time off is strictly implemented, as per SHF policy.
8. Professional behavior of seniors, including consultants, is of paramount importance in educational gain.
9. Centers need to conduct continuous medical educational activities on professionalism, especially in dealing with junior staff, to enhance proper communication and professional behavior.

Key Requirements for a Sustainable Medical Education Framework:

• A blended virtual and physical space for reading, study, and research is also required. Within such a virtual medical educational space, seniors can support juniors and also advance knowledge management capabilities, codifying know-how, experiences, lessons learned, and similar activities.
• Time and space reservation for sophisticated research activities is also a critical pillar in a sustainable medical education framework.
• Advanced professional social networking of medical experts is also required; thus, the SCFHS is planning a sophisticated social human network of medical experts and professionals in the KSA, aiming to enhance teamwork and collective intelligence management.

Simulation

10. Many centers lack a simulation and skills lab, which is nowadays an imperative tool in training; therefore, centers should invest in preparing such facilities with proper support to enhance educational opportunities, or should formally collaborate with an accredited center.
11. Simulation activity should be incorporated in the curriculum; if the training center does not provide such a facility, a collaborative effort with regional accredited simulation centers and proof of evidence for such activity should be recorded.
12. Simulation training should be offered to all trainees, including all essential procedures, which need to be kept in a resident portfolio.

Key Requirements for a Sustainable Medical Education Framework:

• The new SCFHS data and services ecosystem requires an integrated open simulation lab, partially implemented in the cloud, as well as an enhanced virtual and augmented reality medical lab. Towards digital transformation, a Saudi Agora of Virtual and Augmented Reality Medical Lab and Content Laboratory can be implemented. This requires, of course, significant resources, but will release unexploited capabilities of residents.

Sustainability 2020, 12, 8030 27 of 37

• In a parallel effort and initiative, the SCFHS should lead the adoption of virtual and augmented reality content for medical training and labs. This should be a continuous, ongoing process towards excellence.

Training

13. Candidates for a program director post should not be appointed unless they attend, before starting their duties, a comprehensive orientation on how to perform this job efficiently.
14. Most trainees are not aware of the availability of courses on training; thus, the SCFHS needs better advertisement.
15. External rotations should be offered to all residents twice a year; it is up to residents to choose which rotation they will select, and they should not be pressured to give up their right to choose the elective rotation.
16. The training center has to ensure the availability of diverse specialties to improve the clinical encounter and make sure that residents are trained by highly qualified physicians.
17. The institutional leadership should support the educational process by all means necessary, promptly listen to trainees' complaints about their needs, and address their concerns.

Key Requirements for a Sustainable Medical Education Framework:

• An integrated Training Excellence approach is required, with various constitutional parts including active learning medical practice, mentoring, etc. It is also necessary to support continuous awareness campaigns for the excellent training and research programs of the SCFHS.
• Institutional leadership is also an integral part of the Sustainable Medical Education Framework, with dynamic channels for direct communication and accurate information for trainees. It also provides all the means for the realization of a fully efficient educational medical environment.

On-call duties

18. The on-call duty needs modification from the current practice, which has remained unchanged; the on-call duty should be in shifts (the 24 h on-call duty should be terminated).
19. On-call shifts should not exceed 8 each month, and post-call off duty should be implemented immediately after the handover and monitored by the training center.

Key Requirements for a Sustainable Medical Education Framework:

• Shifting on-call duties is a key action for the enhancement of efficiency in medical education and research. Within an integrated data and services ecosystem, on-call duties can also be supported by electronic services.
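An electronic roster service could enforce the two on-call rules above (no more than 8 on-call shifts per month, and an immediate post-call day off) automatically. A minimal sketch in Python follows; the roster representation and field meanings are illustrative assumptions, not an SCFHS schema:

```python
def check_on_call_rules(shifts, max_per_month=8):
    """Validate a resident's monthly on-call roster against two rules from
    the recommendations: no more than `max_per_month` on-call shifts, and
    a post-call day off taken immediately after each shift.
    `shifts` is a list of (day_of_month, post_call_off_taken) pairs."""
    violations = []
    if len(shifts) > max_per_month:
        violations.append(
            f"{len(shifts)} on-call shifts exceed the limit of {max_per_month}"
        )
    for day, off_taken in shifts:
        if not off_taken:
            violations.append(f"missing post-call off after shift on day {day}")
    return violations

# Hypothetical one-month roster: four shifts, one missing post-call off.
roster = [(2, True), (6, True), (11, False), (17, True)]
print(check_on_call_rules(roster))
# ['missing post-call off after shift on day 11']
```

A monitoring service could run such checks at each handover and route violations to the training center for corrective action.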

Research

20. Research support should be made available by the training center; the Saudi commission should incorporate a research curriculum into every training program, and a research portal should be made available to all training centers with weak research capability.
21. Research facilities vary from one center to another, so the Saudi commission should make sure that all trainees have access to minimum research support tools, to standardize the level of facilities and enhance the quality of outcomes. An e-portfolio of each resident's research should be maintained by the Saudi commission and updated periodically by the residents.

Key Requirements for a Sustainable Medical Education Framework:

• Research integration into training must be an integral part of any medical training program.

• A unified research skills lab and research cloud service must be implemented for providing ubiquitous and effective research training to residents. Sophisticated services and access to medical and other scientific libraries, as well as the cultivation of best practices and standardization of the research process, should be promoted. Finally, collective research towards higher accomplishment must promote team research, increasing the visibility and impact of Saudi medical research.
• An advanced publication program should allow residents to exercise their research skills towards publications with impact. In the long term, we will have highly skilled residents with top publication and research records, promoting the objectives of Vision 2030.

Counseling

22. Residents should receive regular counseling sessions (at minimum quarterly) to address any concerns.
23. Centers should place the needs of trainees as a high priority without hindering the quality of patient care.

Key Requirements for a Sustainable Medical Education Framework:

• Counseling should be offered to residents in a constant and systematic way, also exploiting the data and services ecosystem of the SCFHS.

5.2. Recommendation of the Training Quality Department

Our research has several implications for the quality assurance of medical training programs and for the relevant training departments. A novel integrated approach is required in order to enhance the capabilities and the efficiency of the training quality department. This will be taken seriously into consideration for future initiatives and strategies in the SCFHS. The main implication is related to the need for the design of a fully functional data and services ecosystem for enhanced efficiency in medical training programs and decision making. In the previous section, we highlighted some key aspects of improvements and efficiency parameters for residents in medical training programs. In this section, we try to communicate some key requirements for this new ecosystem in the SCFHS for the Vision 2030 digital transformation. The following items provide the key findings and the main implications of our research study related to the training quality department of the SCFHS.

Novel Data and Services Ecosystem in SCFHS

1. Data quality was a major challenge of this project: availability, completeness, and accuracy were the most worrisome. In fact, some of the KPI information has yet to be retrieved. The collection of data related to the KPIs and quality assurance indicates the need for an integrated data strategy in the SCFHS. The establishment of big data flows for continuous management monitoring and support of the medical education process is a key requirement. In addition, the suggested services proposed in the previous section for residents require a new, sophisticated analytics upper level.
2. Transparency, collaboration, and effective communication are a cornerstone in achieving the objective of the assigned committee and its charges, including conducting this project and preparing the report. The provision of a systematic, manageable workflow within the data and services ecosystem of the SCFHS is also another critical development.
3. Some KPIs need further revisiting to re-define them, or even to modify or define certain measures. This research study focused on 23 KPIs based on Kirkpatrick's model, but further enhancements can be developed. Emphasis can be placed on active learning, team development, research skills and capability, collective intelligence, social impact, etc. A key interpretation of this requirement relates to a continuous improvement process on KPIs for teaching, training, and research excellence. This is a core component of any Sustainability Framework for Medical Education.
4. The developed questionnaire was an eye opener toward a better-constructed questionnaire with pre-determined comprehensive domains, in order to accurately measure what it was made for, covering all aspects of training quality. In a greater context, the establishment of two-way data and communication channels in various training and research initiatives in the SCFHS must be a key target. The continuous provision of real-time data can serve as a basis for numerous personalized medical training and research services.
5. A quality rating system utilizes a unified scoring system for the satisfaction level across all surveys and performs reliability analysis to check for internal consistency. A transparent, multi-component quality rating system, like the one we tested in this research through 23 KPIs, is a bold initiative towards the objective measurement of quality. It is necessary in the next years to design and implement a cloud-based rating system capable of delivering real-time variations and updates on the quality and satisfaction perceptions of participants in training and research conducted by the SCFHS.
6. Setting the target value, i.e., the benchmark, was somewhat arbitrary. Using an objective measure as much as possible would be ideal; however, any subjective measure selected should be agreed upon by the TQC, with acceptable reasoning for its value. In this direction, relevant research in analytics and KPIs, as well as international benchmarks, should also be exploited. For a Sustainability Framework for Medical Education, this quality benchmark can be developed through community collaboration and the intervention of international associations. The SCFHS can be a leader in this area, also by exploiting international collaborations.
7. We should utilize a multi-source feedback system to retrieve the required information; this might enhance the validity of the information and reduce bias. Securing accurate, transparent, trusted feedback allows for enhanced decision making and also integrates knowledge creation processes. The integrated data and services ecosystem has to secure this kind of feedback as a long-term commitment and trust agreement between the stakeholders in the SCFHS ecosystem.
8. Data integration between all departments, or even within the same department, is mandatory. Thus, the new data and services ecosystem of the SCFHS will offer this integration as a value-adding service for enhanced decision making.
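The reliability analysis of item 5 and the benchmark comparison of item 6 are routine computations on survey data. The following is a minimal sketch in Python using only the standard library: Cronbach's alpha for internal consistency, and a mean-versus-target check for one KPI. The Likert responses and the 4.0 benchmark are illustrative assumptions, not SCFHS data:

```python
from statistics import pvariance, mean

def cronbach_alpha(item_scores):
    """Cronbach's alpha (internal consistency) for a satisfaction survey.
    `item_scores` is a list of per-question score lists of equal length,
    one score per respondent per question."""
    k = len(item_scores)
    respondents = list(zip(*item_scores))      # rows = respondents
    totals = [sum(row) for row in respondents]
    item_var = sum(pvariance(q) for q in item_scores)
    return (k / (k - 1)) * (1 - item_var / pvariance(totals))

def kpi_status(scores, benchmark):
    """Compare the mean satisfaction score for one KPI with its target value."""
    avg = mean(scores)
    return avg, "met" if avg >= benchmark else "below target"

# Illustrative 5-point Likert responses for three questions of one KPI.
q1 = [4, 5, 3, 4, 5]
q2 = [4, 4, 3, 5, 5]
q3 = [5, 4, 2, 4, 5]
alpha = cronbach_alpha([q1, q2, q3])
print(round(alpha, 2), kpi_status(q1 + q2 + q3, benchmark=4.0))
```

An alpha of roughly 0.7 or above is conventionally read as acceptable internal consistency; in a cloud-based rating system, both values would be recomputed continuously as new responses arrive.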

6. The SCFHS Sustainable Medical Education Framework

Our research study, based on the implementation of key performance indicators to benchmark, maintain, and improve the quality of postgraduate medical training, provides interesting insights for the integration of knowledge management, medical education, and big data analytics research for sustainability. As an effort to codify the key findings and to generalize the implications, we present, in Figure 37, a proposed Sustainable Medical Education Framework. This framework can be used as a road map as well as a policy-making document for the digital transformation of the SCFHS.


Figure 37. The SCFHS Sustainable Medical Education Framework.

The implications of our study for the training quality department in the context of the previously presented Sustainable Medical Education Framework are significant. Special and general actions towards efficiency improvements were evident in the survey data and are presented in Figure 38 as our proposed Medical Training Quality Framework. The constructive interpretation of the key findings in the previous section can be summarized in Figure 38. Namely, four collaborative, integrative, value-adding processes enhance the medical training impact and also enhance the innovation capability and the social impact: training excellence, research excellence, the simulation and virtual reality cloud medical laboratory, and work–life balance summarize socio-technical factors and contribute to greater medical education and training impacts. In future research, we plan to investigate further the determinants of this model.


Figure 38. The SCFHS Training Quality Approach—Medical Training Quality Framework.

The basic assumption of our proposed Sustainability Medical Education Framework is the integration of four integral pillars of efficiency.

Sophisticated big data ecosystem: At the first layer, a sophisticated big data ecosystem provides a sustainable infrastructure for the aggregation and analysis of the diverse data required for medical training and education, as well as for advanced decision making in the context of the SCFHS and its stakeholders. Data quality and governance allow transparency, collaboration, and effective communication for the optimization of medical training. Special services also facilitate data integration and flow for the provision of value-adding services. Multi-source feedback for all the training programs, in all phases of the relevant workflow, also permits flexible communication. A unified big data framework exploits a sophisticated KPI-driven process for measuring and improving performance in parallel with the execution of medical training. A big data ecosystem requires:

• Data quality;
• Transparency, collaboration, and effective communication;
• Data integration and flow;
• Multi-source feedback; and
• A big data unified framework.

Quality and performance management: The second pillar in the Sustainable Medical Education Framework is related to an integral quality and performance management component. The definition, implementation, execution, and monitoring of diverse, multi-faceted KPIs provide a sophisticated quality assurance and monitoring component. The availability of a quality rating system, together with the exploitation of quality benchmarks, provides a transparent methodological way for enhanced quality and performance management. The execution of strategies for quality assurance, as presented in this research, operates as an analytical meta-level for performance management. The set of KPIs that were designed and implemented offers an initial testbed for further execution. In future developments, more additions can be made beyond Kirkpatrick's framework; emphasis on Bloom's taxonomy of educational goals or sophisticated learning analytics for active learning are just some possible directions. The SCFHS quality and performance management should include:

• KPI management,
• A quality rating system, and
• Quality benchmarks.

Enhanced decision-making capability: The orchestration of the previous two layers, namely the big data ecosystem and quality and performance management, interconnects with the next capability in the Sustainable Medical Education Framework. Performance evaluation, monitoring, and control appear as transparent, ubiquitous processes that offer real-time knowledge and critical data for significant decisions in the SCFHS organization and, overall, in relevant organizations worldwide. This pillar also cultivates a continuous improvement culture towards innovation and medical education impact. In the context of innovation, significant effort is invested in the integration of emerging technologies and modern healthcare management into the medical education process. The provision of new services, e.g., a sophisticated Social Network of Medical Experts for personalized medical education, must be a priority. Furthermore, decision making must be a continuous driver of enhancements and improvement toward measurable education impact. Enhanced decision making improves:
• Performance,
• Monitoring and control,

• Innovation,
• Resource utilization, and
• Education impact.

Sustainability in medical education: The final component in the Sustainable Medical Education Framework is the sustainability layer. All the investments and the initiatives presented in this research paper must be considered as developmental efforts that interconnect various performance capabilities in a long-term, viable, sustainable educational strategy for the healthcare domain. Within the sustainability strategy, various actions could be critical milestones. The continuous sustainable investment in the research competence of the members of the SCFHS, as well as a parallel integration of the healthcare industry with academia and medical education institutions, can lead to a unique, innovative, sustainable Medical Education and Innovation Startup Ecosystem. Through this Saudi Medical Innovation Startup Ecosystem, the vision of digital transformation of the healthcare sector can be achieved in an easier way. In future research, we plan to operationalize the requirements for the Medical Innovation Startup Ecosystem and also specify some key projects in the context of the SCFHS. One of the key priorities within this sustainable vision is community development and enhancement through the so-called Saudi Social Network of Medical Experts in the context of the SCFHS. Sustainability includes:

• Saudi Social Network of the SCFHS,
• Medical Education and Innovation Startup Ecosystem,
• Research competence,
• Social impact,
• SCFHS Digital Transformation, and
• Vision 2030.



The exploitation of our recommended Medical Training Quality Framework can be performed within the context of the previously presented Sustainable Medical Education Framework. The provision of the data and services ecosystem in the SCFHS orchestrates a holistic management of training, research, and technological resources within a fertile life–work balance education context. The first pillar is the training excellence strategy: a transparent, organization-wide strategy for the management of medical training and education as a key enabler of the Vision 2030 goals in the KSA. The key dimensions of this approach are summarized as follows:

• Integrated training excellence organization-wide;
• A unified ecosystem of smart medical educational content and distance learning facilities;
• A blended virtual and physical space for reading, study, and research;
• Flexible procedures and cloud training services;
• Advanced notification and counseling procedures;
• Awareness campaigns; and
• Institutional leadership.

The overall idea is that training must be designed, executed, and supported with an integrated strategy for effective educational active learning strategies, learning analytics for quality assurance, and sophisticated socio-technical systems.

The second pillar is related to research excellence. It is a bold requirement for the evolution of the SCFHS towards the achievement of the Vision 2030 goals to promote, support, and implement total quality management of research excellence. The integration of research into medical training programs will require resources, but the impact will be high. Furthermore, the cultivation of a research culture, the systematic development of research skills, and the implementation of a virtual space for research skills development and research execution are additional aspects of the research excellence approach in the SCFHS. The ultimate objective is the enhancement of the visibility and impact of Saudi medical research, together with a well-designed publication and awareness program. The research excellence dimension incorporates, among others, the following services and key capabilities:

• Research integration into training,
• A unified research skills lab and research cloud service,
• Best practices and standardization of the research process,
• Collective research towards higher accomplishment,
• Visibility and impact of Saudi medical research, and
• An advanced publication program.

The third dimension is related to an open cloud-based simulation lab with advanced Virtual Reality/Augmented Reality (VR/AR) capabilities. The implementation of this open educational and training space will significantly increase the exposure of the members and beneficiaries of the SCFHS to research, with a significant impact on research capability. Special effort must also be paid to the development of VR/AR medical content for training and its integration into the open cloud-based simulation lab. The key dimensions of the simulation lab are summarized as follows, at a general abstract level:

• Integrated open simulation lab;
• Saudi Agora of Virtual and Augmented Reality Medical Lab and Content Laboratory; and
• Virtual and augmented reality content for medical training and labs.

The last dimension of the Medical Training Quality framework is related to the human factor (work–life balance). This is a very sensitive area in our training quality strategy. The design of rewards for training efficiency and performance, together with time and space reserved for sophisticated research activities are necessary for higher satisfaction rates and training impact. Counseling and a professional social network for Saudi medical experts promote the feeling of belonging to a greater community of achievers and contributors to the social good of healthcare in the KSA. The following is a partial list of considerations for work–life balance (human factors):

• Rewards for training efficiency and performance,
• Time and space reservation for sophisticated research activities,
• Advanced professional social networking of medical experts, and
• Counseling.

Our research contributes to the theoretical knowledge on the integration of sustainability and medical training and education in multiple ways. The proposed SCFHS Sustainable Medical Education Framework is a novel theoretical abstraction that facilitates management and innovation in medical training and education. From a theoretical point of view, it organizes and integrates multidisciplinary contributions from the epistemology of medical education and information systems research, and constructs a managerial framework that focuses on the sustainability aspect of the phenomenon. It is also one of the first studies worldwide that connects medical education and training with sustainability. We propose a concrete sustainability layer in our framework, with emphasis on the following pillars:

• Integration of a social network of medical experts, towards continuous input and feedback from the community of the SCFHS;
• Establishment of a Medical Education and Innovation Startup Ecosystem capable of bringing to life and launching smart services and applications for domains of special interest for the evolution of medical research and education;
• The development of a continuous research competence developmental process integrated into the sustainability framework; and
• The direct integration of medical education and training excellence into the digital transformation vision.

The quality initiative described in this research study is a first proof-of-concept approach for the capacity of multidisciplinary approaches to promote this vision to functional managerial and training practices.

The next connection of our research study to sustainability is related to the impact of the key findings. It is our strategic decision to enhance all the medical training programs based on the findings and the related recommendations of this study. This is a bold move towards sustainable medical education. The design of transparent and participatory managerial procedures will enhance the quality of the value proposition of the SCFHS to its members and to the Saudi economy and society. Our focus on KPIs in this research study is intentional: it establishes a managerial framework for sustainable medical training through a reliable, trusted, and transparent process in which the community of the SCFHS is active. The interpretation of this approach from sustainability's point of view is also significant. We promote sustainability in medical training by connecting the beneficiaries and the stakeholders in the SCFHS through the training and quality assurance process. We emphasize human capital, and we set up an integrated approach for measuring diverse aspects of quality and performance. The proposed framework and the quality initiative are bold responses to the need of Saudi healthcare for a continuous improvement approach towards innovation and training excellence. The preparation of highly qualified health specialists is a key aspect of sustainability in healthcare.

7. Conclusions and Future Research

This research study promotes inter-disciplinary research with a significant social impact. We discussed the implementation of key performance indicators to benchmark, maintain, and improve the quality of postgraduate medical training. We analyzed the details, the scientific background, and the implementation of the Training Quality Assurance Initiative by the Saudi Commission for Health Specialties as a joint knowledge management, big data analytics, and sustainability endeavor. The first significant contribution of this research is the introduction of a full set of measurable, trusted, and reliable KPIs defined for efficient medical training. At a glance, 23 KPIs were used to measure the efficiency of medical training along three dimensions, namely, training governance, learning, and reaction. We plan in the near future to update this functional set of KPIs with additional ones related to active learning strategies as well as community engagement in the context of a newly developed Social Network of Medical Experts. Another direction for future enhancement of this analytics approach is to also emphasize research and social impact. Another significant dimension of our research study was the synthesis of the key findings of this Training Quality Assurance Initiative towards the generalization of findings for future exploitation and the communication of best practices. As a result, we introduced the Sustainable Medical Education Framework, with four integrative components, namely the big data and analytics ecosystem, SCFHS quality and performance management, enhanced decision-making capability, and sustainability. This framework can serve as a sophisticated decision-making roadmap for the promotion of social impact and sustainability in the context of medical training and education.
The entire research paper also initiates a multidisciplinary debate on the role of human factors [50–53], knowledge management, KPI research, and sustainable data science for medical education and the digital transformation of healthcare. We also introduced the SCFHS Training Quality Approach and the relevant Medical Training Quality Framework as a key enabler of personalized medical training services for individuals and the community in the KSA. In the next developmental phases, we are looking forward to the integration of social networks for medical expertise sharing and community building [54], as a strategic enabler of leadership and innovation [55]. Our special interest in the resilience and wellbeing of healthcare professions trainees and healthcare professionals during public health crises [56] challenges knowledge creation [57] and the design of new, timely medical training programs capable of promoting the vision of patient-centric healthcare [58]. Our research contributes to the bodies of knowledge on knowledge management, KPI research, sustainable data science, and sustainability, with empirical evidence from a large-scale survey on postgraduate medical training in the KSA. The overall contribution to quality assurance in medical training will be analyzed further in the near future with the integration of more KPIs for active learning, research enhancement, and social impact.

Author Contributions: The authors contributed equally to this research work and were involved in an integrated way in all stages of the research, including conceptualization, methodology, software, validation, formal analysis, investigation, resources, data curation, writing—original draft preparation, writing—review and editing, visualization, supervision, project administration, and funding acquisition. All authors have read and agreed to the published version of the manuscript.

Funding: This research received no external funding.

Conflicts of Interest: The authors declare no conflict of interest.

References

1. Khoja, T.; Rawaf, S.; Qidwai, W.; Rawaf, D.; Nanji, K.; Hamad, A. Health Care in Gulf Cooperation Council Countries: A review of challenges and opportunities. Cureus 2017, 9, e1586. [CrossRef]
2. ten Cate, O.; Scheele, F. Competency-based postgraduate training: Can we bridge the gap between theory and clinical practice? Acad. Med. 2007, 82, 542–547. [CrossRef]

Sustainability 2020, 12, 8030

3. Kjaer, N.K.; Kodal, T.; Shaughnessy, A.F.; Qvesel, D. Introducing competency-based postgraduate medical training: Gains and losses. Int. J. Med. Educ. 2011, 2, 110–115. [CrossRef]
4. Scheele, F.; Teunissen, P.; Van Luijk, S.; Heineman, E.; Fluit, L.; Mulder, H.; Meininger, A.; Wijnen-Meijer, M.; Glas, G.; Sluiter, H.; et al. Introducing competency-based postgraduate medical education in the Netherlands. Med. Teach. 2008, 30, 248–253. [CrossRef]
5. Rees, C.J.; Bevan, R.; Zimmermann-Fraedrich, K.; Rutter, M.D.; Rex, D.; Dekker, E.; Ponchon, T.; Bretthauer, M.; Regula, J.; Saunders, B.; et al. Expert opinions and scientific evidence for colonoscopy key performance indicators. Gut 2016, 65, 2045–2060. [CrossRef]
6. Sandhu, S.S.C. Benchmarking, key performance indicators and maintaining professional standards for cataract surgery in Australia. Exp. Ophthalmol. 2015, 43, 505–507. [CrossRef]
7. Raitt, J.; Hudgell, J.; Knott, H.; Masud, S. Key performance indicators for pre-hospital emergency anaesthesia—A suggested approach for implementation. Scand. J. Trauma Resusc. Emerg. Med. 2019, 27, 42. [CrossRef]
8. Santana, M.J.; Stelfox, H.T.; Trauma Quality Indicator Consensus Panel. Development and evaluation of evidence-informed quality indicators for adult injury care. Ann. Surg. 2014, 259, 186–192. [CrossRef]
9. Lytras, M.D.; Aljohani, N.R.; Hussain, A.; Luo, J.; Zhang, X.Z. Cognitive Computing Track Chairs' Welcome & Organization. In Proceedings of the Companion of the Web Conference, Lyon, France, 23–27 April 2018.
10. Lytras, M.D.; Raghavan, V.; Damiani, E. Big data and data analytics research: From metaphors to value space for collective wisdom in human decision making and smart machines. Int. J. Semant. Web Inf. Syst. 2017, 13, 1–10. [CrossRef]
11. Visvizi, A.; Daniela, L.; Chen, C.W. Beyond the ICT- and sustainability hypes: A case for quality education. Comput. Hum. Behav. 2020. [CrossRef]
12. Alkmanash, E.H.; Jussila, J.J.; Lytras, M.D.; Visvizi, A. Annotation of Smart Cities Twitter Microcontents for Enhanced Citizen's Engagement. IEEE Access 2019, 7, 116267–116276. [CrossRef]
13. Lytras, M.D.; Mathkour, H.I.; Abdalla, H.; Al-Halabi, W.; Yanez-Marquez, C.; Siqueira, S.W.M. An emerging social- and emerging computing-enabled philosophical paradigm for collaborative learning systems: Toward high effective next generation learning systems for the knowledge society. Comput. Hum. Behav. 2015, 5, 557–561. [CrossRef]
14. Visvizi, A.; Lytras, M.D. Editorial: Policy Making for Smart Cities: Innovation and Social Inclusive Economic Growth for Sustainability. J. Sci. Technol. Policy Mak. 2018, 9, 1–10.
15. Visvizi, A.; Lytras, M.D. Transitioning to Smart Cities: Mapping Political, Economic, and Social Risks and Threats; Elsevier: New York, NY, USA, 2019.
16. Arnaboldi, M. The Missing Variable in Big Data for Social Sciences: The Decision-Maker. Sustainability 2018, 10, 3415. [CrossRef]
17. Olszak, C.M.; Mach-Król, M. A Conceptual Framework for Assessing an Organization's Readiness to Adopt Big Data. Sustainability 2018, 10, 3734. [CrossRef]
18. Kent, P.; Kulkarni, R.; Sglavo, U. Finding Big Value in Big Data: Unlocking the Power of High Performance Analytics. In Big Data and Business Analytics; Liebowitz, J., Ed.; CRC Press Taylor & Francis Group: Boca Raton, FL, USA, 2013; pp. 87–102. ISBN 9781466565784.
19. Kharrazi, A.; Qin, H.; Zhang, Y. Urban big data and sustainable development goals: Challenges and opportunities. Sustainability 2016, 8, 1293. [CrossRef]
20. Wielki, J. The Opportunities and Challenges Connected with Implementation of the Big Data Concept. In Advances in ICT for Business, Industry and Public Sector; Mach-Król, M., Olszak, C.M., Pełech-Pilichowski, T., Eds.; Springer: Cham, Switzerland, 2015; pp. 171–189. ISBN 978-3-319-11327-2.
21. Lytras, M.D.; Aljohani, N.R.; Visvizi, A.; De Pablos, P.O.; Gasevic, D. Advanced Decision-Making in Higher Education: Learning Analytics Research and Key Performance Indicators. Behav. Inf. Technol. 2018, 37, 937–940. [CrossRef]
22. Zhang, J.; Zhang, X.; Jiang, S.; de Pablos, P.O.; Sun, Y. Mapping the Study of Learning Analytics in Higher Education. Behav. Inf. Technol. 2018, 37, 1142–1155. [CrossRef]
23. Rajkaran, S.; Mammen, K.J. Identifying Key Performance Indicators for Academic Departments in a Comprehensive University through a Consensus-Based Approach: A South African Case Study. J. Sociol. Soc. Anthropol. 2014, 5, 283–294. [CrossRef]

24. Asif, M.; Awan, M.U.; Khan, M.K.; Ahmad, N. A Model for Total Quality Management in Higher Education. Qual. Quant. 2013, 47, 1883–1904. [CrossRef]
25. Varouchas, E.; Sicilia, M.-A.; Sánchez-Alonso, S. Towards an Integrated Learning Analytics Framework for Quality Perceptions in Higher Education: A 3-Tier Content, Process, Engagement Model for Key Performance Indicators. Behav. Inf. Technol. 2018, 37, 1129–1141. [CrossRef]
26. Varouchas, E.; Sicilia, M.-Á.; Sánchez-Alonso, S. Academics' Perceptions on Quality in Higher Education Shaping Key Performance Indicators. Sustainability 2018, 10, 4752. [CrossRef]
27. Sanyal, B.C.; Martin, M. Quality assurance and the role of accreditation: An overview. In Higher Education in the World 2007: Accreditation for Quality Assurance: What Is at Stake?; Palgrave Macmillan: New York, NY, USA, 2007; pp. 3–23.
28. McDonald, R.; Van Der Horst, H. Curriculum alignment, globalization, and quality assurance in South African higher education. J. Curric. Stud. 2007, 39, 6. [CrossRef]
29. Deming, W. Improvement of quality and productivity through action by management. Natl. Product. Rev. 2000, 1, 12–22. [CrossRef]
30. Dlačić, J.; Arslanagić, M.; Kadić-Maglajlić, S.; Marković, S.; Raspor, S. Exploring perceived service quality, perceived value, and repurchase intention in higher education using structural equation modelling. Total Qual. Manag. Bus. Excell. 2014, 25, 141–157. [CrossRef]
31. Suryadi, K. Key Performance Indicators Measurement Model Based on Analytic Hierarchy Process and Trend-Comparative Dimension in Higher Education Institution. In Proceedings of the 9th International Symposium on the Analytic Hierarchy Process for Multi-criteria Decision Making (ISAHP), Viña del Mar, Chile, 2–6 August 2007.
32. Chalmers, D. Teaching and Learning Quality Indicators in Australian Universities. In Proceedings of the Institutional Management in Higher Education (IMHE) Conference, Paris, France, 8–10 September 2008.
33. Varouchas, E.; Lytras, M.; Sicilia, M.A. Understanding Quality Perceptions in Higher Education: A Systematic Review of Quality Variables and Factors for Learner Centric Curricula Design. In EDULEARN16—8th Annual International Conference on Education and New Learning Technologies; IATED: Barcelona, Spain, 2016; pp. 1029–1035.
34. Yarime, M.; Tanaka, Y. The Issues and Methodologies in Sustainability Assessment Tools for Higher Education Institutions: A Review of Recent Trends and Future Challenges. J. Educ. Sustain. Dev. 2012, 6, 63–77. [CrossRef]
35. Sheng, Y.; Yu, Q.; Chen, L. A Study on the Process Oriented Evaluation System of Undergraduate Training Programs for Innovation and Entrepreneurship. Creative Educ. 2016, 7, 2330–2337. [CrossRef]
36. Toussaint, N.D.; McMahon, L.P.; Dowling, G.; Söding, J.; Safe, M.; Knight, R.; Fair, K.; Linehan, L.; Walker, R.G.; Power, D.A. Implementation of renal key performance indicators: Promoting improved clinical practice. Nephrology (Carlton) 2015, 20, 184–193. [CrossRef]
37. Pencheon, D. The Good Indicators Guide: Understanding How to Use and Choose Indicators; Association of Public Health Observatories & NHS Institute for Innovation and Improvement: London, UK, 2008. Available online: http://fingertips.phe.org.uk/documents/The%20Good%20Indicators%20Guide.pdf (accessed on 10 September 2020).
38. Lunsford, L.D.; Kassam, A.; Chang, Y.F. Survey of United States neurosurgical residency program directors. Neurosurgery 2004, 54, 239–245, discussion 245–247. [CrossRef]
39. Ahmadi, M.; Khorrami, F.; Dehnad, A.; Golchin, M.H.; Azad, M.; Rahimi, S. A Survey of Managers' Access to Key Performance Indicators via HIS: The Case of Iranian Teaching Hospitals. Stud. Health Technol. Inform. 2018, 248, 233–238.
40. Aggarwal, S.; Kusano, A.S.; Carter, J.N.; Gable, L.; Thomas, C.R., Jr.; Chang, D.T. Stress and Burnout among Residency Program Directors in United States Radiation Oncology Programs. Int. J. Radiat. Oncol. Biol. Phys. 2015, 93, 746–753. [CrossRef]
41. Ishak, W.W.; Lederer, S.; Mandili, C.; Nikravesh, R.; Seligman, L.; Vasa, M.; Ogunyemi, D.; Bernstein, C.A. Burnout during residency training: A literature review. J. Grad. Med. Educ. 2009, 1, 236–242. [CrossRef]
42. Krebs, R.; Ewalds, A.L.; van der Heijden, P.T.; Penterman, E.J.M.; Grootens, K.P. Burn-out, commitment, personality and experiences during work and training; survey among psychiatry residents. Tijdschr. Psychiatr. 2017, 59, 87–93.

43. Gouveia, P.A.D.C.; Ribeiro, M.H.C.; Aschoff, C.A.M.; Gomes, D.P.; Silva, N.A.F.D.; Cavalcanti, H.A.F. Factors associated with burnout syndrome in medical residents of a university hospital. Rev. Assoc. Med. Bras. 2017, 63, 504–511. [CrossRef]
44. Dyrbye, L.N.; West, C.P.; Satele, D.; Boone, S.; Tan, L.J.; Sloan, J.; Shanafelt, T.D. Burnout among U.S. medical students, residents, and early-career physicians relative to the general U.S. population. Acad. Med. 2014, 89, 443–451. [CrossRef]
45. Shoimer, I.; Patten, S.; Mydlarski, P.R. Burnout in dermatology residents: A Canadian perspective. Br. J. Dermatol. 2018, 178, 270–271. [CrossRef]
46. Porter, M.; Hagan, H.; Klassen, R.; Yang, Y.; Seehusen, D.A.; Carek, P.J. Burnout and Resiliency Among Family Medicine Program Directors. Fam. Med. 2018, 50, 106–112. [CrossRef]
47. Chaukos, D.; Chad-Friedman, E.; Mehta, D.H.; Byerly, L.; Celik, A.; McCoy, T.H., Jr.; Denninger, J.W. Risk and Resilience Factors Associated with Resident Burnout. Acad. Psychiatry 2017, 41, 189–194. [CrossRef]
48. Holmes, E.G.; Connolly, A.; Putnam, K.T.; Penaskovic, K.M.; Denniston, C.R.; Clark, L.H.; Rubinow, D.R.; Meltzer-Brody, S. Taking Care of Our Own: Multispecialty Study of Resident and Program Director Perspectives on Contributors to Burnout and Potential Interventions. Acad. Psychiatry 2017, 41, 159–166. [CrossRef]
49. Dyrbye, L.; Shanafelt, T. A narrative review on burnout experienced by medical students and residents. Med. Educ. 2016, 50, 132–149. [CrossRef]
50. Jagsi, R.; Griffith, K.A.; Jones, R.; Perumalswami, C.R.; Ubel, P.; Stewart, A. Sexual harassment and discrimination experiences of academic medical faculty. JAMA 2016, 315, 2120–2121. [CrossRef]
51. Karim, S.; Duchcherer, M. Intimidation and harassment in residency: A review of the literature and results of the 2012 Canadian Association of Interns and Residents National Survey. Can. Med. Educ. J. 2014, 5, e50–e57. [CrossRef]
52. Fnais, N.; Soobiah, C.; Chen, M.H.; Lillie, E.; Perrier, L.; Tashkhandi, M.; Straus, S.E.; Mamdani, M.; Al-Omran, M.; Tricco, A.C. Harassment and discrimination in medical training: A systematic review and meta-analysis. Acad. Med. 2014, 89, 817–827. [CrossRef]
53. Bates, C.K.; Jagsi, R.; Gordon, L.K.; Travis, E.; Chatterjee, A.; Gillis, M.; Means, O.; Chaudron, L.; Ganetzky, R.; Gulati, M.; et al. It Is Time for Zero Tolerance for Sexual Harassment in Academic Medicine. Acad. Med. 2018, 93, 163–165. [CrossRef]
54. Giroux, C.; Moreau, K. Leveraging social media for medical education: Learning from patients in online spaces. Med. Teach. 2020, 42, 970–972. [CrossRef]
55. McKimm, J.; McLean, M. Rethinking health professions' education leadership: Developing 'eco-ethical' leaders for a more sustainable world and future. Med. Teach. 2020, 42, 855–860. [CrossRef]
56. Wald, H. Optimizing resilience and wellbeing for healthcare professions trainees and healthcare professionals during public health crises—Practical tips for an 'integrative resilience' approach. Med. Teach. 2020, 42, 744–755. [CrossRef]
57. Naeve, A.; Yli-Luoma, P.; Kravcik, M.; Lytras, M.D. A modelling approach to study learning processes with a focus on knowledge creation. Int. J. Technol. Enhanc. Learn. 2018, 1, 1–34. [CrossRef]
58. Spruit, M.; Lytras, M. Applied Data Science in Patient-centric Healthcare. Telemat. Inform. 2018, 35, 2018. [CrossRef]

© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).