Assessing Competence in Professional Performance Across Disciplines and Professions, with a Foreword by Lee S. Shulman
Innovation and Change in Professional Education 13

Paul F. Wimmers
Marcia Mentkowski
Editors

Assessing Competence in Professional Performance across Disciplines and Professions

With a Foreword by Lee S. Shulman

Innovation and Change in Professional Education, Volume 13

Series editor: W.H. Gijselaers, School of Business and Economics, Maastricht University, The Netherlands

Associate editors:
L.A. Wilkerson, Dell Medical School, The University of Texas at Austin, TX, USA
H.P.A. Boshuizen, Center for Learning Sciences and Technologies, Open Universiteit Nederland, Heerlen, The Netherlands

Editorial Board:
T. Duffy, School of Education, Indiana University, Bloomington, IN, USA
H. Gruber, Institute of Educational Science, University of Regensburg, Regensburg, Germany
R. Milter, Carey Business School, Johns Hopkins University, Baltimore, MD, USA
EunMi Park, JH Swami Institute for International Medical Education, Johns Hopkins University School of Medicine, Baltimore, MD, USA
Eugene L. Anderson, American Dental Education Association, Washington, DC, USA

SCOPE OF THE SERIES
The primary aim of this book series is to provide a platform for exchanging experiences and knowledge about educational innovation and change in professional education and post-secondary education (engineering, law, medicine, management, health sciences, etc.). The series provides an opportunity to publish reviews, issues of general significance to theory development and research in professional education, and critical analyses of professional practice that enhance educational innovation in the professions. The series promotes publications that deal with pedagogical issues arising in the context of innovation and change in professional education. It publishes work from leading practitioners and cutting-edge researchers in the field.
Each volume is dedicated to a specific theme in professional education, providing a convenient resource of publications dedicated to the further development of professional education.

More information about this series at http://www.springer.com/series/6087

Editors:
Paul F. Wimmers, University of California, Los Angeles, CA, USA
Marcia Mentkowski, Alverno College, Milwaukee, WI, USA

Innovation and Change in Professional Education
ISBN 978-3-319-30062-7
ISBN 978-3-319-30064-1 (eBook)
DOI 10.1007/978-3-319-30064-1
Library of Congress Control Number: 2016932870

© Springer International Publishing Switzerland 2016

This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed.

The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use.

The publisher, the authors and the editors are safe to assume that the advice and information in this book are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or the editors give a warranty, express or implied, with respect to the material contained herein or for any errors or omissions that may have been made.
Printed on acid-free paper

This Springer imprint is published by Springer Nature. The registered company is Springer International Publishing AG Switzerland.

Foreword

Forewords are a delightfully open-ended and forgiving genre of composition. They can be discursive and analytic, offering a conceptual introduction to the volume that follows them. They can also serve as an excuse for personal observations, in which a writer who did not contribute to the volume itself offers observations and anecdotes that might bring readers of the book to approach its contents at a more personal level. I have chosen to indulge in a bit of both options. My conceit is that in the intersection of personal experiences and conceptual/technical challenges lies a narrative that is useful in understanding where we stand and how we got there. I hope to be forgiven for the effort.

For the past 60 years (!), I have nurtured an ongoing and occasionally contentious relationship with the varieties of assessment. The relationship began when, at the age of 16, I spent a week responding to a daunting set of placement examinations, primarily in multiple-choice and essay formats, administered to all undergraduates admitted to the College of the University of Chicago. Those placement exams were samples of the final exams of the yearlong general courses of the College's curriculum. Several weeks after the tests' administration, I learned that I had passed the equivalent of the first two years of the social science curriculum and two-thirds of the first year of the humanities course. I needed only to study first-year art appreciation and art history, and I would be eligible to begin the second year of humanities.

A few years later, as a doctoral candidate, I was already feeling dissatisfied with the limitations of multiple-choice tests, however well designed (and the Chicago tests were outstanding examples of the genre).
I spent a year as a research assistant to one of the university's examiners, Christine McGuire, who was moonlighting at the Center for the Study of Liberal Education for Adults. We created assessments of critical thinking and judgment for adult students in Chicago's Great Books extension program. The assessments asked them to "perform" critical thinking as if they were members of juries hearing cases, or public officials making judgments about competing policies, after having read Plato's Republic or Mill's On Liberty. The assessments asked respondents to make sequential judgments: identifying information they needed, gathering that information by rubbing out locations on data sheets, then making new judgments as they moved toward their decisions. These assessments were forms of performance assessment that would later be elaborated by McGuire in the development of "Patient Management Problems" in the assessment of medical expertise.

The very next year, I became a research assistant to a professor who had served as one of McGuire's mentors in the university examiner's office, Benjamin Bloom. I became enamored of these ways of creating task environments that permitted us to tap into thought processes (I had, after all, written a bachelor's thesis on approaches to epistemology). I designed a doctoral dissertation in which I studied how teachers solved classroom problems by creating a simulation of an entire classroom captured in an "in-basket" assessment. The assessment focused on how the teachers discovered and formulated problems rather than how they solved problems already framed. Studying problem finding required the use of more unstructured task situations. Although Bloom was my academic advisor for several years, I was more attracted to cognition than to measurement, and thus never became a psychometrician.
Another few years passed into the late 1960s and early 1970s, and as a young professor of educational psychology and medical education at Michigan State University, my colleagues and I were designing and using "high-fidelity" simulations to study and assess the ability of medical students to make complex diagnoses. Michigan State's brand-new medical school had opted to open its doors in 1968 with a "focal problems" curriculum that used clinical problems rather than disciplines or organ systems as the basis for the curriculum. In our studies we used a variety of assessment instruments and situations, from written patient management problems to simulations engaging actors and actresses to represent clinical cases. Each encounter was videotaped and then used in stimulated recall to probe deeper into the knowledge, problem-solving skills, and even the interpersonal empathy and doctor–patient relations of the aspiring physicians. Many of these high-fidelity methods became part of the Emergency Medicine specialty assessments that we designed, and they have now found their way into most medical education programs in America as both instruction and assessment methods.

In the mid-1980s, now at Stanford, my students and I responded to the challenge of developing much more valid approaches to the assessment of high levels of pedagogical competence among experienced teachers. We first developed multi-day assessment centers that looked quite similar to what would, in medical education, become OSCEs. Although our assessments met all of the psychometric standards we established (with Lee J. Cronbach and Edward Haertel as our hard-nosed measurement collaborators), we moved beyond those methods and developed yearlong situated portfolio assessments so that the effects of real settings and changes over time could become part of the assessment picture.
These studies led to the development of the National Board for Professional Teaching Standards certification exams that remain in use today. And in the first decade of the present century, while at the Carnegie Foundation, we studied the processes of professional