Concept Inventories in Engineering Education
Teri Reed-Rhoads and P.K. Imbrie
School of Engineering Education, Purdue University

"What is the reason for the seasons?" or "What causes the phases of the Moon?" One would like to believe that if these two seemingly innocuous questions were put to a group of "educated" people, most, if not all, would answer them correctly. However, would you believe that 21 of 23 randomly interviewed graduates, professors, and alumni of a prestigious university responded incorrectly to at least one of these astronomical questions (Harvard, 1977)? If our college graduates have difficulty answering fundamental questions like these, what is their conceptual understanding of topics like chemical equilibrium, thermal and transport sciences, electromagnetics, or nanotechnology? Faculty members would all like to say, "my students understand; look at how well they do on my exams." But for each engineering subject taught, have we ever really asked, "What do our students know?", "What do they think they understand?", and "How do they come to those understandings?"

Concept inventories represent a relatively distinctive form of assessment instrument with a multitude of possible uses, ranging from diagnostic and formative purposes that guide instructional planning to summative purposes for evaluating overall learning and instructional effects at the student, classroom, and/or program level. What sets them apart from typical assessments of student academic achievement is that they tend to be highly focused on a small set of key constructs and understandings within a limited domain of academic content. Great care goes into conceptualizing the situations to be presented and into developing plausible distracters that span a range from partially correct understandings to completely incorrect understandings and misconceptions.
However, it should be noted that there is no agreed-upon definition of a concept inventory. In fact, many definitions have been published, including the following succinct version: "research-based distracter-driven multiple-choice instruments" (Lindell, 2006, pg. 14). Nor is the method for developing concept inventories consistent or agreed upon. Lindell (2006) provides a useful summary of this issue as well, comparing development approaches across physics- and astronomy-related concept inventories. The same variability emerged in the creation of the Concept Inventory Central website (www.purdue.edu/SCI/Workshop), where developers were asked to post similar summaries of their instruments for others to consider when deciding whether to use each one.

In recent years, the science, technology, engineering, and mathematics (STEM) disciplines have increased their use of concept inventory (CI) instruments to measure the value added to student learning by new ways of teaching important material (Evans, 2003). Utilizing a tool such as a CI can provide a learning opportunity for students and professors alike. Students can reveal their misconceptions for the first time, as well as open their minds to accepting scientific points of view. Professors "can form a basis for making instructional decisions, whether to validate students' correct yet unsure ideas, confront student misconceptions, reinforce ideas that are forming, or complement ideas that are accurate but only partial explanations" (http://www.learner.org, 2007). Numerous studies have shown that many students lack correct conceptual understanding of science and engineering concepts, even after successful completion of courses in which these concepts are taught (Duit and Treagust, 2003; Hake, 1998; McDermott, 1991).
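The diagnostic use described above rests on distracter analysis: tallying how often each answer choice is selected per item, so an instructor can see which distracters (and hence which misconceptions) dominate a class. The following sketch illustrates the idea; the item names, response data, and function are hypothetical, not part of any published inventory.

```python
from collections import Counter

def distracter_profile(responses, answer_key):
    """For each item, count how often each answer choice was selected.

    responses  -- list of dicts mapping item id -> chosen option letter
    answer_key -- dict mapping item id -> correct option letter
    Returns {item: {"correct": fraction correct, "choices": Counter}}.
    """
    profile = {}
    for item, correct in answer_key.items():
        picks = Counter(r[item] for r in responses if item in r)
        total = sum(picks.values())
        profile[item] = {
            "correct": picks[correct] / total if total else 0.0,
            "choices": picks,
        }
    return profile

# Hypothetical class of five students on a two-item inventory
responses = [
    {"Q1": "A", "Q2": "C"},
    {"Q1": "B", "Q2": "C"},
    {"Q1": "B", "Q2": "D"},
    {"Q1": "B", "Q2": "C"},
    {"Q1": "A", "Q2": "C"},
]
key = {"Q1": "A", "Q2": "C"}
prof = distracter_profile(responses, key)
# On Q1, only 2 of 5 students chose the right answer, and distracter B
# attracted 3 of 5 -- flagging a shared misconception worth addressing.
```

A profile like this supports exactly the instructional decisions quoted above: a distracter chosen by most of the class signals a misconception to confront, while a near-unanimous correct choice validates ideas already formed.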
These studies point out the challenge of learning some concepts, and they call for discussion about what level of conceptual understanding is needed throughout each aspect of a curriculum. CI assessment instruments can provide valid, reliable data on conceptual understanding. Moreover, the creation, development, and use of these inventories foster constructive conversations among educators.

The Force Concept Inventory (FCI) (Halloun and Hestenes, 1985; Hestenes, et al., 1992) has proven instrumental in identifying effective pedagogy (Hake, 1998) and thus has served as an agent of change in physics education (e.g., Crouch and Mazur, 2001). A cited-reference search of the FCI's major publications illustrates the potential influence of a well-developed instrument. Table 1, below, lists the number of citations (Web of Science Cited Reference Search; October 4, 2008).

Table 1: Cited references of major FCI publications
Year    Authors                          Citations
1985    Halloun & Hestenes               145
1992    Hestenes, Wells, & Swackhamer    119
1998    Hake                             160

Further analysis of the 1985 article highlights the influence of the FCI in fields outside physics as well as continuing intra-disciplinary interest (Figure 1; data from a September 6, 2006 search of Web of Science). The most-cited block is the five-year period ten years after initial publication.

[Figure 1: Cited references of the 1985 FCI publication, by journal type (Physics Education, Cognitive, Other Science, Technology), in five-year blocks from 1985 to 2006.]

Looking back at the FCI, it is apparent that significant development time was required to influence pedagogical practices. The 1985 publication was the result of three years' development, while the compelling results from Hake's 6,000+ students were not presented until 13 years later. After the initial development and publication of the FCI, at least 11 subsequent instruments have been developed in physics alone (see Table 2).
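Hake's comparison across those 6,000+ students rests on the class-average normalized gain, <g> = (post − pre) / (100 − pre), computed from pre- and post-instruction FCI percentage scores. A minimal sketch of that calculation follows; the class averages used are invented for illustration, not taken from Hake's data.

```python
def normalized_gain(pre_mean, post_mean, max_score=100.0):
    """Class-average normalized gain <g>: the fraction of the
    possible improvement (from pre-test mean up to the maximum
    score) that the class actually realized."""
    if pre_mean >= max_score:
        raise ValueError("pre-test mean is already at ceiling")
    return (post_mean - pre_mean) / (max_score - pre_mean)

# Invented example: a class averaging 45% before instruction
# and 78% after realizes 33 of the 55 available points.
g = normalized_gain(45.0, 78.0)
# g = 33 / 55 = 0.6
```

Because <g> normalizes by how much room each class had to improve, it allows comparison of instructional approaches across classes that start from very different pre-test levels, which is what made the 1998 study's contrast between traditional and interactive-engagement teaching so compelling.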
In the areas of science and mathematics, nine general areas are covered (see Table 3).

Table 2: Physics and Astronomy Specific Concept Inventories (Lindell, 2006)
1. Force CI
2. Mechanics Baseline Test
3. Astronomy Diagnostic Test
4. Brief Electricity and Magnetism Assessment
5. Conceptual Survey in Electricity and Magnetism
6. Diagnostic Exam Electricity and Magnetism
7. Determining and Interpreting Resistive Electric Circuits Concept Test
8. Energy and Motion Conceptual Survey
9. Force and Motion Conceptual Evaluation
10. Lunar Phases CI
11. Test of Understanding Graphs in Kinematics
12. Light and Color

Unlike the FCI, engineering CI development is occurring in a different environment. First, very little is known or published about engineering concepts and subject-matter misconceptions. Whereas the FCI generally placed prior educational research in a new format (i.e., a multiple-choice test), engineering faces the challenge of using the inventories themselves as a tool to identify misconceptions, guided only in a general sense by instructors' perceptions of potential misconceptions (Streveler et al., 2008). Second, the engineering inventories are being developed in an era of evolving accreditation standards that focus on learning outcomes. This coincidence could imply that the purpose of the CIs is accreditation rather than increased student learning; the FCI, by contrast, might be viewed as a more charitable contribution to existing research. This difference could infuse a tincture of skepticism regarding the researchers' motivation (i.e., have-to vs. want-to). Third, in part due to the FCI, there is now more than anecdotal evidence that traditional teaching is ineffective, providing a landscape ripe for encouraging faculty to improve their practices, with conceptual assessment imperative for demonstrating the effectiveness of innovations.

Table 3: Science and Mathematics Concept Inventories
1. Biology
2. Calculus – Nelson
3. Calculus – Epstein
4. Chemical Equilibrium
5. Genetics
6. Geosciences
7. Natural Selection
8. Organic Chemistry
9. Physics (see Table 2 for full list)

Despite these challenges, engineering experienced a flurry of development beginning in 2001–2003, primarily associated with the Foundation Coalition (see Evans, et al., 2003 for a summary). In more recent years, however, it appears that continued development, refinement, deployment, and application of concept inventories has waned. A literature review (Allen, 2006) identified 16 engineering concept inventories, a count that has expanded to 21 since a national workshop on concept inventory development and use, sponsored by the National Science Foundation, was held in May of 2007 (see Table 4). Figure 2 shows the number of people who have taken some of the instruments that are the most highly published at this time.

[Figure 2: Cumulative examinees across years (years 1–4) for engineering concept inventories: Statics (reaching 2,700 at year 4), Statistics, Systems & Signals, Dynamics, Chem. I, Chem. II, and Materials.]

The first two years' data for all inventories are similar, suggesting strong foundations with the capability to expand. It is important to note that Steif's Statics CI has achieved its high numbers through the participation of upwards of 20 institutions via online access. The same trend is illustrated by the Statistics CI, which also went online early in its development. This highlights the broad applicability of these instruments, and the potential broad applicability of other instruments if online access is facilitated. Given the increasing intensity of the call to reform engineering education (NAE 2004, etc.), building a cohesive CI community will provide a basis for constructive