FEATURE ARTICLE

POINT OF VIEW
Competency-based assessment for the training of PhD students and early-career scientists

Abstract The training of PhD students and early-career scientists is largely an apprenticeship in which the trainee associates with an expert to become an independent scientist. But when is a PhD student ready to graduate, a postdoctoral scholar ready for an independent position, or an early-career scientist ready for advanced responsibilities? Research training by apprenticeship does not uniformly include a framework to assess if the trainee is equipped with the complex knowledge, skills and attitudes required to be a successful scientist in the 21st century. To address this problem, we propose competency-based assessment throughout the continuum of training to evaluate more objectively the development of PhD students and early-career scientists. DOI: https://doi.org/10.7554/eLife.34801.001

Michael F Verderame†*, Victoria H Freedman†, Lisa M Kozlowski† and Wayne T McCormack†

*For correspondence: mxv8@psu.edu
†These authors contributed equally to this work
Reviewing editor: Emma Pewsey, eLife, United Kingdom
Copyright Verderame et al. This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.

The quality of formal training received by PhD students and early-career scientists (a label that covers recent PhD graduates in a variety of positions, including postdoctoral trainees and research scientists in entry-level positions) is highly variable, and depends on a number of factors: the trainee's supervisor or research adviser; the institution and/or graduate program; and the organization or agency funding the trainee. The European approach, for example, relies more on one final summative assessment (that is, a high-stakes evaluation at the conclusion of training, e.g. the dissertation and defense), whereas US doctoral programs rely more on multiple formative assessments (regular formal and informal assessments to evaluate and provide feedback about performance) before the final dissertation defense (Barnett et al., 2017). Funding agencies in the US such as the National Science Foundation (NSF) and the National Institutes of Health (NIH) have recently increased expectations for formal training plans for individuals supported by individual or institutional training grants (NIH, 2012); but these agencies support only a small fraction of PhD trainees via these funding and assessment mechanisms. This variation in the quality and substance of training assessment for PhD students and early-career scientists (Maki and Borkowski, 2006) underscores the need for an improved approach to such assessment.

The value of bringing more definition and structure to the training environment has been recognized by professional organizations such as the National Postdoctoral Association, the American Physiological Society/Association of Chairs of Departments of Physiology, and some educational institutions and individual training programs. In addition, a recent NIH Funding Opportunity Announcement places increased emphasis on the development of both research and career skills, with a specific charge that "Funded programs are expected to provide evidence of accomplishing the training objectives". Lists of competencies and skills provide guidelines for training experiences, but they are rarely integrated into training assessment plans.

Based on our experience as graduate and postdoctoral program leaders, we recognized the need both to identify core competencies and to develop a process to assess these competencies. To minimize potential confirmation bias, we deliberately chose not to begin this project with a detailed comparison of previously described competencies.


Each author independently developed a list of competencies based on individual experiences. Initial lists were wide-ranging, and included traditional fundamental research skills (e.g., critical thinking skills, computational and quantitative skills), skills needed for different career pathways (e.g., teaching skills), and business and management skills (e.g., entrepreneurial skills such as the ability to develop a business or marketing plan). Although we recognize that many of the competencies we initially defined are important in specific careers, from the combined list we defined 10 core competencies essential for every PhD scientist regardless of discipline or career pathway (Table 1).

Core competencies and subcompetencies

Broad Conceptual Knowledge of a Scientific Discipline refers to the ability to engage in productive discussion and collaboration with colleagues across a discipline (such as biology, chemistry, or physics).

Deep Knowledge of a Specific Field encompasses the historical context, current state of the art, and relevant experimental approaches for a specific field, such as immunology or nanotechnology.

Critical Thinking Skills focuses on elements of the scientific method, such as designing experiments and interpreting data.

Experimental Skills includes identifying appropriate experimental protocols, designing and executing protocols, troubleshooting, lab safety, and data management.

Computational Skills encompasses relevant statistical analysis methods and informatics literacy.

Collaboration and Team Science Skills includes openness to collaboration, self- and disciplinary awareness, and the ability to integrate information across disciplines.

Responsible Conduct of Research (RCR) and Ethics includes knowledge about and adherence to RCR principles, ethical decision making, moral courage, and integrity.

Communication Skills includes oral and written communication skills as well as communication with different stakeholders.

Leadership and Management Skills includes the ability to formulate a research vision, manage group dynamics and communication, organize and plan, make decisions, solve problems, and manage conflicts.

Survival Skills includes a variety of personal characteristics that sustain science careers, such as motivation, perseverance, and adaptability, as well as participating in professional development activities and networking skills.

Because each core competency is multi-faceted, we defined subcompetencies. For example, we identified four subcompetencies of Critical Thinking Skills: (A) Recognize important questions; (B) Design a single experiment (answer questions, controls, etc.); (C) Interpret data; and (D) Design a research program. Each core competency has between two and seven subcompetencies, resulting in a total of 44 subcompetencies (Table 1—source data 1: Core Competencies Assessment Rubric).

Table 1. Ten Core Competencies for the PhD Scientist.
1. Broad Conceptual Knowledge of a Scientific Discipline
2. Deep Knowledge of a Specific Field
3. Critical Thinking Skills
4. Experimental Skills
5. Computational Skills
6. Collaboration and Team Science Skills
7. Responsible Conduct of Research and Ethics
8. Communication Skills
9. Leadership and Management Skills
10. Survival Skills

DOI: https://doi.org/10.7554/eLife.34801.002
The following source data is available for Table 1:
Source data 1. Core Competencies Assessment Rubric. DOI: https://doi.org/10.7554/eLife.34801.005

Assessment milestones

Individual competencies could be assessed using a Likert-type scale (Likert, 1932), but such ratings can be very subjective (e.g., "poor" to "excellent", or "never" to "always") if they lack specific descriptive anchors. To maximize the usefulness of a competency-based assessment rubric for PhD student and early-career scientist training in any discipline, we instead defined observable behaviors corresponding to the core competencies that reflect the development of knowledge, skills and attitudes throughout the timeline of training.

We used the "Milestones" framework described by the Accreditation Council for Graduate Medical Education: "Simply defined, a milestone is a significant point in development. For accreditation purposes, the Milestones are competency-based developmental outcomes (e.g., knowledge, skills, attitudes, and performance) that can be demonstrated progressively by residents and fellows from the beginning of their education through graduation to the unsupervised practice of their specialties."

Our overall approach to developing milestones was guided by the Dreyfus and Dreyfus model describing five levels of skill acquisition over time: novice, advanced beginner, competent, proficient and expert (Dreyfus and Dreyfus, 1986). As trainees progress through competent to proficient to expert, their perspective matures, their decision making becomes more analytical, and they become fully engaged in the scientific process (Dreyfus, 2004). These levels are easily mapped to the continuum of PhD scientist training: beginning PhD student as novice, advanced PhD student as advanced beginner, PhD graduate as competent, early-career scientist (which includes postdoctoral trainees) as proficient, and science professional as expert (see Table 2).

We therefore defined observable behaviors and outcomes for each subcompetency that would allow a qualified observer, such as a research adviser or job supervisor, to determine if a PhD student or early-career scientist had reached the milestone for their stage of training (Table 1—source data 1: Core Competencies Assessment Rubric). A sample for the Critical Thinking Skills core competency is shown in Table 3.

Recommendations for use

We suggest that such a competency-based assessment be used to guide periodic feedback between PhD students or early-career scientists and their mentors or supervisors. It is not meant to be a checklist. Rather than assessing all 44 subcompetencies at the same time, we recommend that subsets of related competencies (e.g., "Broad Conceptual Knowledge of a Scientific Discipline" and "Deep Knowledge of a Specific Field") be considered during any given evaluation period (e.g., month or quarter). Assessors should read across the observable behaviors for each subcompetency from left to right, and score the subcompetency based on the last observable behavior they believe is consistently demonstrated by the person being assessed. Self-assessment and mentor or supervisor ratings may be compared to identify areas of strength and areas that need improvement. Discordant ratings between self-assessment and mentor or supervisor assessment provide opportunities for conversations about areas in which a trainee may be overconfident and need improvement, and areas of strength that the trainee may not recognize and may be less than confident about.

The competencies and accompanying milestones can also be used in a number of other critically important ways. Combined with curricular mapping and program enhancement plans, the competencies and milestones provide a framework for developing program learning objectives and outcomes assessments now commonly required by educational accrediting agencies. Furthermore, setting explicit expectations for research training may enhance the ability of institutions to recruit outstanding PhD students or postdoctoral scholars. Finally, funding agencies focused on the individual development of the trainee may use these competencies and assessments as guidelines for effective training programs.
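For readers who want to prototype this kind of assessment, the sketch below illustrates one possible way to encode a subcompetency's milestone behaviors and the left-to-right scoring rule described above, and to flag discordant self- and mentor ratings. It is a minimal, hypothetical illustration only: the stage names follow Table 2, while the Python names (Subcompetency, score) and the abbreviated behavior strings are placeholders invented for this sketch, not the published rubric (Table 1—source data 1).

```python
# A minimal, hypothetical sketch (not the authors' tool): one way to encode a
# subcompetency's milestone behaviors and apply the left-to-right scoring rule
# described above. The stage names follow Table 2; the class, function and
# behavior strings are illustrative placeholders, not the published rubric.
from dataclasses import dataclass
from typing import List, Optional

STAGES = [
    "Beginning PhD Student",   # novice
    "Advanced PhD Student",    # advanced beginner
    "PhD Graduate",            # competent
    "Early-Career Scientist",  # proficient
    "Science Professional",    # expert
]

@dataclass
class Subcompetency:
    name: str
    behaviors: List[str]  # one observable behavior per stage, ordered novice -> expert

def score(ratings: List[bool]) -> Optional[str]:
    """Read across the stages from left to right and return the last stage whose
    observable behavior is judged consistently demonstrated (None if none are).
    Milestones are treated as cumulative, so scanning stops at the first gap."""
    last = None
    for stage, demonstrated in zip(STAGES, ratings):
        if not demonstrated:
            break
        last = stage
    return last

# Hypothetical example: compare a trainee's self-assessment with a mentor's rating
# for one Critical Thinking Skills subcompetency to surface discordant scores.
design_experiment = Subcompetency(
    name="Critical Thinking Skills B: Design a single experiment",
    behaviors=["follow protocols", "plan protocols", "design experiments independently",
               "consistently design and execute experiments", "teach and guide others"],
)
self_rating = score([True, True, True, True, False])
mentor_rating = score([True, True, False, False, False])
if self_rating != mentor_rating:
    print(f"Discordant ratings for '{design_experiment.name}': "
          f"self = {self_rating}, mentor = {mentor_rating}")
```

In practice, a program could store one such record for each of the 44 subcompetencies and review a small subset at each evaluation period, as recommended above.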

Table 2. PhD scientist training stages mapped to Dreyfus and Dreyfus levels of skill acquisition. Early-career scientists include researchers undertaking postdoctoral training as well as those in science positions in career pathways that involve other kinds of advanced training, e.g., on-the-job training or certification.

Novice (Beginning PhD Student): rule-based behavior; limited, inflexible.
Advanced beginner (Advanced PhD Student): incorporates aspects of the situation.
Competent (PhD Graduate): acts consciously from long-term goals and plans.
Proficient (Early-Career Scientist): sees situation as a whole and acts from personal conviction.
Expert (Science Professional): has intuitive understanding of situations, zooms in on central aspects.

DOI: https://doi.org/10.7554/eLife.34801.003


Table 3. Sample milestones for one of the subcompetencies within Critical Thinking Skills: B. Design a single experiment (answer questions, controls, etc.). The verbs beginning each entry are the observable behaviors representing each stage of skill acquisition.

Beginning PhD Student: Follow experimental protocols, seek help as needed, describe critical role of controls.
Advanced PhD Student: Plan experimental protocol; include relevant controls; choose appropriate methods; troubleshoot experimental problems.
PhD Graduate: Design and execute hypothesis-based experiments independently; evaluate protocols of others; predict range of experimental outcomes.
Early-Career Scientist: Consistently design and execute experiments with appropriate controls; assess next steps; critique experiments of others.
Science Professional: Teach experimental design; guide others doing experiments.

DOI: https://doi.org/10.7554/eLife.34801.004

Why should PhD training incorporate a competency-based approach?

Some training programs include formal assessments utilizing markers and standards defined by third parties. Medical students, for example, are expected to meet educational and professional objectives defined by national medical associations and societies.

By contrast, the requirements for completing the PhD are much less clear, defined by the "mastery of specific knowledge and skills" (Sullivan, 1995) as assessed by research advisers. The core of the science PhD remains the completion of an original research project, culminating in a dissertation and an oral defense (Barnett et al., 2017). PhD students are also generally expected to pass courses and master research skills that are often discipline-specific and not well delineated. Whereas regional accrediting bodies in the US require graduate institutions to have programmatic learning objectives and assessment plans, they do not specify standards for the PhD. Also, there are few – if any – formal requirements and no accrediting bodies for early-career scientist training.

We can and should do better. Our PhD students, postdoctoral scholars, early-career scientists and their supervisors deserve both a more clearly defined set of educational objectives and an approach to assess the completion of these objectives to maximize the potential for future success. A competency-based approach fits well with traditional PhD scientist training, which is not bound by a priori finish dates. It provides a framework to explore systematically and objectively the development of PhD students and early-career scientists, identifying areas of strength as well as areas that need improvement. The assessment rubric can be easily implemented for trainee self-assessment as well as constructive feedback from advisers or supervisors by selecting individual competencies for review at regular intervals. Furthermore, it can be easily extended to include general and specific career and professional training as well.

In its recent report "Graduate STEM Education for the 21st Century", the National Academies of Sciences, Engineering, and Medicine (2018) briefly outlined core competencies for STEM PhDs. In its formal recommendations specifically for STEM PhD education, the first recommendation is, "Universities should verify that every graduate program that they offer provides for these competencies and that students demonstrate that they have achieved them before receiving their doctoral degrees." This assessment rubric provides one way for universities to verify that students have achieved the core competencies of a science PhD.

We look forward to implementing and testing this new approach for assessing doctoral training, as it provides an important avenue for effective communication and a supportive mentor–mentee relationship. This assessment approach can be used for any science discipline, and it has not escaped our notice that it is adaptable to non-science PhD training as well.


Acknowledgements

We thank our many colleagues at the Association of American Medical Colleges Graduate Research, Education and Training (GREAT) Group for helpful discussions, and Drs. Istvan Albert, Joshua Crites, Valerie C Holt, and Rebecca Volpe for their insights about specific core competencies. We also thank Drs. Philip S Clifford, Linda Hyman, Alan Leshner, Ravindra Misra, Erik Snapp, and Margaret R Wallace for critical review of the manuscript.

Michael F Verderame is in The Graduate School, Pennsylvania State University, University Park, PA, United States. mxv8@psu.edu; http://orcid.org/0000-0001-7046-0879
Victoria H Freedman is in the Graduate Division of Biomedical Sciences, Albert Einstein College of Medicine, Bronx, NY, United States. http://orcid.org/0000-0002-7290-0409
Lisa M Kozlowski is at Jefferson College of Biomedical Sciences, Thomas Jefferson University, Philadelphia, PA, United States
Wayne T McCormack is in the Office of Biomedical Research Career Development, University of Florida Health Sciences Center, Gainesville, FL, United States. https://orcid.org/0000-0002-2117-8727

Author contributions: Michael F Verderame, Victoria H Freedman, Lisa M Kozlowski, Wayne T McCormack: Conceptualization, Writing—original draft, Writing—review and editing

Competing interests: The authors declare that no competing interests exist.

Received: 02 May 2018; Accepted: 25 May 2018; Published: 31 May 2018

Funding
The authors declare that there was no funding for this work.

Decision letter and Author response
Decision letter: https://doi.org/10.7554/eLife.34801.007
Author response: https://doi.org/10.7554/eLife.34801.008

Additional files
Data availability
There are no datasets associated with this work.

References
Barnett JV, Harris RA, Mulvany MJ. 2017. A comparison of best practices for doctoral training in Europe and North America. FEBS Open Bio 7:1444–1452. DOI: https://doi.org/10.1002/2211-5463.12305, PMID: 28979835
Dreyfus H, Dreyfus S. 1986. Mind Over Machine: The Power of Human Intuition and Expertise in the Era of the Computer. New York: The Free Press.
Dreyfus SE. 2004. The five-stage model of adult skill acquisition. Bulletin of Science, Technology & Society 24:177–181. DOI: https://doi.org/10.1177/0270467604264992
Likert R. 1932. A technique for the measurement of attitudes. Archives of Psychology 140:52.
Maki P, Borkowski NA. 2006. The Assessment of Doctoral Education: Emerging Criteria and New Models for Improving Outcomes. Sterling, VA: Stylus Publishing.
NIH. 2012. Biomedical Research Workforce Working Group Report. https://acd.od.nih.gov/documents/reports/Biomedical_research_wgreport.pdf [accessed June 1, 2018].
Sullivan RL. 1995. The Competency-Based Approach to Training. Baltimore, MD: JHPIEGO Corporation.
The National Academies of Sciences, Engineering, and Medicine. 2018. Graduate STEM Education for the 21st Century. Leshner A, Scherer L (Eds). The National Academies Press. DOI: https://doi.org/10.17226/25038
