
Paper ID #15915

E-Assessment and Direct Competency Modelling in a Chemistry for Mechanical Engineering Course

Dr. Rebecca Jo Pinkelman, Technische Universität Darmstadt

Rebecca J. Pinkelman graduated from Chadron State College with a B.S. in Chemistry in 2008. She received her M.S. and Ph.D. in Chemical Engineering from the South Dakota School of Mines and Technology in 2010 and 2014, respectively. She is currently a post-doctoral research scientist in the Mechanical and Process Engineering Department at the Technische Universität Darmstadt.

Ing. Frank Guido Kühl, Technische Universität Darmstadt

Frank G. Kühl studied Mechanical Process Engineering at Technische Universität Darmstadt, Germany and Lunds Tekniska Högskolan, Sweden. He received his diploma in Mechanical Process Engineering from TU Darmstadt in 2011 and is currently a Ph.D. student at the Institute of Thermal Process Engineering there.

Brian Stephenson

Prof. Manfred J. Hampe, Technische Universität Darmstadt

Manfred J. Hampe graduated from Technische Universität Clausthal in 1976 and received his doctorate in engineering from Technische Universität München in 1980. He worked as a process engineer in the central research division of Bayer AG in Leverkusen before he became full professor of Thermal Process Engineering in the Department of Mechanical Engineering at Technische Universität Darmstadt in 1995. His research interests are in the field of transport phenomena at fluid interfaces. He has been the chairman of the Working Party on Education in Chemical and Process Engineering of the VDI-Society for Chemical and Process Engineering and member of the European Working Party on Education in Chemical Engineering for many years. He is the vice-chairman of the council of the faculties of mechanical and process engineering in Germany and chairman of 4ING, the German Council of University Faculties in Engineering and Informatics. Between 2004 and 2013 he was one of the 19 German Bologna experts. He received the ars legendi award 2013 of the Stifterverband and the German Rectors Conference.

© American Society for Engineering Education, 2016

E-Assessment and Direct Competency Modelling in a Chemistry for Mechanical Engineering Course

In large classes with hundreds of students enrolled, it is difficult to gauge the knowledge level of the students on a regular basis before the exams, by which point it is almost too late to correct and supplement competency deficiencies. Complementing weekly lectures with an online learning platform, coupled with a direct competency model for e-learning and e-assessment, can provide instructors with real-time feedback on student knowledge and deficiencies, help students better prepare for their exams, and increase their core knowledge. A case study of a large Chemistry for Mechanical Engineering course (~400-500 students) at a German technical university used an online platform to develop a direct competency model to assess students’ knowledge of core chemistry competencies. These competencies were based on the learning outcomes of the course. The model consisted of short review questions (2-3 per topic) that tested students’ understanding of each concept, as well as pre and post self-assessments of these core competencies. Students who scored higher on the direct competency questions showed higher mastery of the subject topic by achieving higher scores on correlated exam tasks and subtasks. The pre and post self-assessments also showed that students judged themselves to have made significant gains in core knowledge. This validation of the competency model allows it to be used in real time in an online learning platform to discern how well students understand the material and allows the instructor to correct deficiencies shortly after the material is presented and before the next exam.

Introduction

For instructors, providing formative assessment and feedback on students’ learning is challenging in large, lecture-based courses. The time required to develop, administer, evaluate, and return feedback concerning student competency is substantial. Studies have shown that ongoing feedback during a course enhances the teaching-learning process and actively engages students in their learning1-3.

Assessment can be broadly defined as the process of understanding and improving student learning. It includes setting appropriate criteria and high standards, followed by gathering, analyzing, and interpreting evidence to determine whether those criteria and standards were met4,5. There are two main forms of assessment: summative and formative. Summative assessment generally refers to evaluating student learning in terms of a grade given for a body of student work or for accreditation, whereas formative assessment generally refers to the process of ongoing feedback to improve student performance and teaching practice1-4,6,7. In formative assessment, not only the instructor is key in creating and fostering an active learning environment, but also the learner and peers, with all having a shared responsibility in the learning process1.

With the development of online learning platforms and communities, assessment has also moved online. Online assessment (also called e-assessment) can be defined as a form of assessment (summative or formative) that uses technology within online and blended learning environments to gather and analyze learning evidence where instructors and learners are separated by time and/or space1,3,4,7. Advantages of online assessment include convenience, because time and place are not limiting factors while time and pace can still be controlled, accessibility of online databases, interactive features, inclusion of multimedia, and immediate feedback to both instructors and students1,3,6,7. In large classes, online assessment with immediate feedback gives an instructor a better understanding of the overall knowledge gain of the students3. Effective use of online assessment allows instructors and students to collaboratively engage in the learning process, identify areas of learning need, and find ways to address those needs. To achieve this, the student must become an active learner, and the method of online assessment must match the learning objective1,3. Barak & Rafaeli (2004) have shown that when students were actively engaged in an online assessment, they achieved higher overall exam scores4. They also noted that students had a positive attitude toward online assessment, especially with regard to receiving immediate feedback and grades to encourage self-monitoring and self-regulation in the learning process4.

In large courses, it is hard for instructors to assess knowledge gain and the learning process, which is critical, especially in large introductory courses where the attrition rate is highest. Improving the teaching-learning process through online assessment during the course could have a significant impact on the retention rate and on the knowledge gained, because by the time the exam is taken it is too late to correct any gaps in knowledge2. Online self-assessment quizzes are a good tool for both instructors and students to receive immediate feedback on general knowledge and on where each stands in the teaching-learning process, and they have been shown to positively impact summative assessment later in the course1. Competency-integrated standards of achievement provide guidelines for improvement, including strengths and weaknesses of the course, content changes, methods of content delivery, and assessment8.

Competency is defined as the ability to perform a set task or tasks, focusing on the application of knowledge and not only on the acquisition of knowledge or skills8,9. Using competency-based standards, the strengths and weaknesses of a course can be effectively determined8. One method is to define a competency-based curriculum, in which students have to achieve a minimum level of knowledge in their studies to graduate. These competencies provide a set of guidelines for students to move closer to their educational goals and are typically embedded and linked within courses across the curriculum9-11. Such competency models have been developed in part because of differences in students’ mathematical versus conceptual reasoning: students can perform well on questions using an algorithmic problem-solving method but not on non-mathematical, conceptual questions10,12-14.

In a specific course, direct competency exams are one method to assess these core competencies. These are typically multiple choice questions that cover the basic understanding of fundamental concepts, and the desired performance level is 100% on all questions. They can reveal strengths and weaknesses of the course and student misconceptions, and they signal to students the fundamental and most important concepts in a course. When these direct competency exams are offered throughout the semester, they improve student learning through low-stakes assessment and give immediate feedback to the instructor10,12-14. Mehta and Schlecht (1998) have shown that daily competency exams (1-3 questions) are useful in learning material and are more beneficial for students with a GPA less than 2.7 (American 4.0 scale), especially in large classes2.

This study is the first attempt to define, measure, and assess students’ core chemistry knowledge through an online platform in a traditional lecture-based course blended with online material. The specific aims were twofold: the development of a direct competency model and an assessment method through an online platform for real-time formative assessment of student competence, for both students and instructors, in large classes.

Methodology and methods

Composition of students and course

Five hundred thirty-two students were initially enrolled in a lecture-based Chemistry for Mechanical Engineering course at a German technical university. Of the 532 enrolled students, only 379 subsequently registered for and took the exam, which was the only graded assessment in the course. Only the 379 students that took the exam and chose to answer the developed assessments were included in the data analysis. All data were collected voluntarily and anonymized. Chemistry for Mechanical Engineering is a traditional lecture course consisting of 14 lectures, held once a week for 90 minutes, and is supplemented with a script. It is accompanied by weekly 90-minute recitations for exercises, each limited to 30 students. Lecture and recitation attendance are not mandatory but highly recommended.

Development of the assessments

Core competencies were identified from the learning objectives, course script, and prior exam themes. They were identified as follows (in parentheses following each competency is the shortened version used in Table 1 and in all further references in this paper):

─ Properties of atoms and compounds based on the periodic table and bonds (Properties)
─ Stoichiometry (Stoichiometry)
─ Oxidation/reduction equations (Electrochemistry)
─ Acid/base neutralization (Acid/Base)
─ Identifying, naming, and properties of the different classes of Organic Chemistry (Organic Chemistry)
─ Aromaticity (Aromaticity)
─ Reaction equilibria (Reaction Equilibria)
─ Reaction kinetics (Kinetics)
─ Properties of polymers (Polymers)

From these core competencies, 25 direct competency questions were developed and were grouped into 9 competency quizzes (see Appendix A for example competency questions).

A pre and post self-assessment asked students to evaluate their knowledge level in the above competencies at the beginning and end of the semester using a Likert Scale from 1-4, with 1-have no clue or idea, 2-have seen or heard it, 3-familiar with it, and 4-know it well. Included in the post self-assessment were questions regarding how the students prepared for the exam, including whether they attended lecture and/or recitation and how many times, if they read the course script, worked through the exercises/modules, and found additional exercises and/or books (see Appendix B for the full pre and post self-assessment).

In addition, students had access to sets of 10 multiple choice questions (one set per lecture) aimed at helping them review their own knowledge; these were developed from the script and lectures. These questions are not part of the final grade but are only for the students’ benefit, to self-evaluate their knowledge level.

Method of assessment

Moodle is an online learning platform designed for educators to create a personalized learning environment for their students. It is scalable, flexible, and adaptable to the needs of a particular course and/or its learning objectives, and it is activity and resource based. Moodle was developed from a social constructivist philosophy, wherein learners develop and construct knowledge with others in a social setting, working collaboratively on building new knowledge15.

All course communication and deliverables were handled through Moodle in addition to the course lectures. In particular, Moodle’s quiz function was used to construct and deliver the direct competency and self-assessments. The competency quizzes were opened a week or two following the corresponding lecture(s). They were then available for the remainder of the semester, but students could only take each quiz once. The lecture multiple choice questions became available the week following the lecture and remained available, including for retaking, for the rest of the semester, since students were required to achieve a score of 70% to obtain access to the exercise solutions. The order of the 10 lecture multiple choice questions and their answers was randomized for each attempt.
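The delivery rules described above (per-attempt shuffling of the lecture questions and answers, and the 70% threshold that unlocks the exercise solutions) can be illustrated with a short sketch. This is not Moodle's implementation, only a hypothetical illustration of the course's access rule; the function and field names are invented.

```python
import random

PASS_THRESHOLD = 0.70  # minimum score to unlock the week's exercise solutions

def build_attempt(questions, seed=None):
    """Return a randomized ordering of questions and answer options for one attempt."""
    rng = random.Random(seed)
    attempt = []
    for question in rng.sample(questions, k=len(questions)):  # shuffle question order
        options = list(question["options"])
        rng.shuffle(options)                                   # shuffle answer order
        attempt.append({"prompt": question["prompt"], "options": options})
    return attempt

def solutions_unlocked(best_score, max_score):
    """Exercise solutions become accessible once any attempt reaches the threshold."""
    return best_score / max_score >= PASS_THRESHOLD

questions = [{"prompt": f"Question {i}", "options": ["a", "b", "c", "d"]} for i in range(1, 11)]
print(build_attempt(questions, seed=1)[0])
print(solutions_unlocked(7, 10))  # True: a 7/10 attempt unlocks the solutions
```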

The exam was offered one week after the 14th lecture and had 7 tasks covering the previously described learning objectives and competencies. The students had 90 minutes to complete the exam. The total number of points achievable was 94 and was scaled to the 1-5 scale used in German universities, with 1.0 being the highest score, 4.0 the lowest passing score, and 5.0 failing. Grades were placed into three groups, 0.7-2.7, 3.0-4.0, and 5.0.

Analysis methods

Outliers and normality were assessed using box plots, the Shapiro-Wilk test of normality, and studentized residuals. Homogeneity of variances was assessed with Levene’s test. ANOVA was used to test the equality of means, and Tukey’s post hoc test was used to determine significant differences between groups. Paired t-tests were performed to determine differences between the pre and post self-assessments, and Spearman correlations were performed to determine relationships between various factors. SPSS version 23 (IBM SPSS, New York, NY) statistical software was used for all statistical measures. All averages are reported as average ± standard deviation.
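As an illustration of this pipeline, the sketch below reproduces the same sequence of tests with SciPy and statsmodels rather than SPSS, which the authors actually used. The column names and the randomly generated data are placeholders, not the study's data.

```python
# A minimal sketch of the reported analysis pipeline, assuming hypothetical columns.
import numpy as np
import pandas as pd
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "grade_group": rng.choice(["0.7-2.7", "3.0-4.0", "5.0"], size=90),
    "quiz_score": rng.uniform(0, 100, size=90),   # competency quiz score (%)
    "pre_self": rng.integers(1, 5, size=90),      # pre self-assessment (1-4)
    "post_self": rng.integers(1, 5, size=90),     # post self-assessment (1-4)
})

groups = [g["quiz_score"].to_numpy() for _, g in df.groupby("grade_group")]

# Normality (Shapiro-Wilk) and homogeneity of variances (Levene's test)
print([stats.shapiro(g).pvalue for g in groups])
print(stats.levene(*groups).pvalue)

# One-way ANOVA across grade groups, followed by Tukey's post hoc test
f_stat, p_val = stats.f_oneway(*groups)
print(f_stat, p_val)
print(pairwise_tukeyhsd(df["quiz_score"], df["grade_group"]))

# Paired t-test between pre and post self-assessments
print(stats.ttest_rel(df["pre_self"], df["post_self"]))
```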

Results and discussion

Analysis and comparison of competency quizzes

Nine core competencies were assessed throughout the semester by nine competency quizzes, as shown in Table 1. Each competency quiz typically covered between one and three core competencies: Quiz 1 – properties and stoichiometry; Quiz 2 – stoichiometry and electrochemistry (oxidation/reduction reactions); Quiz 3 – stoichiometry, electrochemistry, and acid/base reactions; Quiz 4 – general organic chemistry knowledge (including identifying, naming, and properties of different organic chemistry classes); Quiz 5 – aromaticity; Quiz 6 – properties and general organic chemistry; Quiz 7 – reaction equilibria; Quiz 8 – kinetics; Quiz 9 – polymers. Overall, students scored lowest on Quiz 4 (45% ± 24.56), Quiz 6 (40.5% ± 34.46), and Quiz 9 (46% ± 36.03), which covered general organic chemistry knowledge and polymers, and highest on Quiz 5 (78.2% ± 34.02), which covered aromaticity. An overall competency was calculated for each student as the average of all competency quizzes that student took. The overall competence was 49.78% ± 28.27. Comparing the three grade groups, 0.7-2.7, 3.0-4.0, and 5.0 (Figure 1), there were statistically significant differences between groups for Quiz 2 (stoichiometry and electrochemistry, F(2, 65) = 4.185, p = 0.020, partial η² = 0.114), Quiz 5 (aromaticity, F(2, 36) = 3.540, p = 0.039, partial η² = 0.164), and Quiz 8 (kinetics, F(2, 30) = 8.493, p = 0.001, partial η² = 0.362). For Quiz 2, there was a significant difference between the 3.0-4.0 and 5.0 grade groups, with the former scoring higher (mean difference = 0.28 ± 0.10, p = 0.014). For Quiz 5, the 0.7-2.7 grade group scored at a statistically higher level of competency than the 5.0 group (mean difference = 0.37 ± 0.14, p = 0.031), and in Quiz 8, the 0.7-2.7 group scored significantly higher than both the 3.0-4.0 (mean difference = 0.33 ± 0.10, p = 0.010) and 5.0 groups (mean difference = 0.49 ± 0.13, p = 0.002).

Analysis and comparison of achieved grades

Grades at German technical universities are awarded on a 1.0-5.0 scale with 1.0 the best score (0.7 means higher than expected, is only a mathematical expression used for internal computing, and is equivalent to a 1.0), 4.0 the lowest possible passing score, and 5.0 failing. Total exam points (Figure 2) were scaled accordingly:

─ 1.0: maximum points to 67.6
─ 1.3: 67.6 to 62.8
─ 1.7: 62.8 to 59.2
─ 2.0: 59.2 to 55.6
─ 2.3: 55.6 to 50.8
─ 2.7: 50.8 to 47.2
─ 3.0: 47.2 to 43.6
─ 3.3: 43.6 to 38.8
─ 3.7: 38.8 to 35.2
─ 4.0: 35.2 to 34.0
─ 5.0: fewer than 34.0

Grades were grouped into three groups: 0.7-2.7, 3.0-4.0, and 5.0. This is approximately equivalent to A-B (0.7-2.7), C-D (3.0-4.0), and F (5.0) in the United States.
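The scaling can be expressed as a simple lookup from total exam points to grade and grade group, as in the sketch below. How exact boundary values were assigned to the higher or lower grade is not stated in the paper, so the inclusive lower bounds here are an assumption.

```python
# Hypothetical mapping from total exam points (max 94) to the German 1.0-5.0
# scale and the three grade groups used in the analysis; treating each lower
# boundary as inclusive is an assumption, not stated in the paper.
GRADE_BOUNDS = [  # (minimum points for this grade, grade)
    (67.6, 1.0), (62.8, 1.3), (59.2, 1.7), (55.6, 2.0), (50.8, 2.3),
    (47.2, 2.7), (43.6, 3.0), (38.8, 3.3), (35.2, 3.7), (34.0, 4.0),
]

def grade_from_points(points: float) -> float:
    for lower, grade in GRADE_BOUNDS:
        if points >= lower:
            return grade
    return 5.0  # fewer than 34.0 points is a failing grade

def grade_group(grade: float) -> str:
    if grade <= 2.7:
        return "0.7-2.7"
    if grade <= 4.0:
        return "3.0-4.0"
    return "5.0"

# The reported class average of 38.43 points maps to the reported average grade of 3.7.
print(grade_from_points(38.43), grade_group(grade_from_points(38.43)))
```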

The overall average of total exam points was 38.43 ± 11.9 of 94 available, with an average grade of 3.7 (Figures 2 and 3). Students received the lowest number of available points on Task 1 (properties and organic chemistry: 2.15 ± 2.23 of 9 available points), Task 5 (stoichiometry, acid/base reactions, and reaction equilibria: 2.17 ± 1.79 of 10 available points), Task 6 (polymers: 3.37 ± 2.38 of 13 available points), and Task 7 (properties: 3.39 ± 2.69 of 9 available points), and the highest number of points on Task 2 (organic chemistry and kinetics: 9.65 ± 3.21 of 15 available points), Task 3 (stoichiometry, reaction equilibria, and kinetics: 11.01 ± 3.92 of 25 available points), and Task 4 (properties, stoichiometry, and electrochemistry: 7.31 ± 3.07 of 13 available points).

Table 1: Core competencies assessed in each competency quiz (marked x; indicated by shading in the original).

Competency          | Q1 | Q2 | Q3 | Q4 | Q5 | Q6 | Q7 | Q8 | Q9
Properties          | x  |    |    |    |    | x  |    |    |
Stoichiometry       | x  | x  | x  |    |    |    |    |    |
Electrochemistry    |    | x  | x  |    |    |    |    |    |
Acid/Base           |    |    | x  |    |    |    |    |    |
Organic Chemistry   |    |    |    | x  |    | x  |    |    |
Aromaticity         |    |    |    |    | x  |    |    |    |
Reaction Equilibria |    |    |    |    |    |    | x  |    |
Kinetics            |    |    |    |    |    |    |    | x  |
Polymers            |    |    |    |    |    |    |    |    | x

[Figure 1: bar chart; y-axis: Average Score (%); x-axis: Competency Quiz (Q1-Q9 and Overall); series: All Students, 0.7-2.7, 3.0-4.0, 5.0.]

Figure 1: Comparison of the average score of competency quizzes and the overall competence for all students and by grade groups, 0.7-2.7, 3.0-4.0, and 5.0.

[Figure 2: bar chart; left y-axis: Average Points Achieved; right y-axis: Maximum Possible Points; x-axis: Exam Tasks (TEP*, Task 1-Task 7); series: Average Points, Maximum Possible Points.]

Figure 2: Average total exam points and points for each task. The minimum passing grade was 34 total exam points. *TEP – Total Exam Points.

[Figure 3: bar chart; y-axis: Frequency of Each Grade; x-axis: Grade (0.7, 1.0, 1.3, 1.7, 2.0, 2.3, 2.7, 3.0, 3.3, 3.7, 4.0, 5.0).]

Figure 3: Grade distribution across the entire course. 1.0 is the highest grade, 4.0 the lowest passing grade, and 5.0 a failing grade. 0.7 means higher than expected and is only a mathematical expression used for internal computing and is equivalent to a 1.0.

Table 2: Core competencies assessed in each exam task and subtask. Dark blue shading indicates explicit assessment and light blue implicit assessment of the corresponding competency. [Shaded matrix of the nine core competencies (rows) against exam tasks 1-7 and their subtasks 1a-1d, 2a-2g, 3a-3j, 4a-4h, 5a-5g, 6a-6h, 7a-7c (columns); the shading is not reproducible in text.]

The core competencies assessed in each exam task were determined by the authors through separate analyses of the exam, followed by bringing together the results and discussing any discrepancies until all were in agreement. The competencies assessed in each exam task and subtask are shown in Table 2. Each core competency is explicitly assessed in at least one subtask (dark blue) and in some cases implicitly assessed (light blue).

Analysis and comparison of self-assessment

Students self-assessed their knowledge in the core competency areas on a scale of 1-4 (1 – have no clue or idea; 2 – have seen or heard it; 3 – familiar with it; 4 – know it well) twice, once at the beginning of lectures and a second time after the lectures were over but before the exam was offered (Figure 4). Overall, students rated themselves significantly higher in all core competencies after attending the lecture (properties: t(22) = -4.159, p < 0.0005; stoichiometry: t(22) = -6.146, p < 0.0005; electrochemistry: t(21) = -5.811, p < 0.0005; acid/base reactions: t(21) = -4.170, p < 0.0005; organic chemistry: t(21) = -5.266, p < 0.0005; aromaticity: t(22) = -12.024, p < 0.0005; reaction equilibria: t(22) = -6.554, p < 0.0005; kinetics: t(22) = -6.747, p < 0.0005; polymers: t(22) = -2.510, p = 0.020). Within the grade groups, students in the 0.7-2.7 group assessed themselves as significantly increasing in all core competencies except properties and polymers (properties: t(11) = -1.593, p = 0.139; stoichiometry: t(11) = -4.022, p = 0.002; electrochemistry: t(11) = -4.168, p = 0.002; acid/base reactions: t(11) = -2.548, p = 0.027; organic chemistry: t(11) = -3.463, p = 0.005; aromaticity: t(11) = -7.000, p < 0.0005; reaction equilibria: t(11) = -3.957, p = 0.002; kinetics: t(11) = -3.957, p = 0.002; polymers: t(11) = -1.332, p = 0.210). The 3.0-4.0 group assessed themselves with significant gains in all areas except polymers (properties: t(7) = -4.965, p = 0.002; stoichiometry: t(7) = -5.000, p = 0.002; electrochemistry: t(6) = -3.286, p = 0.017; acid/base reactions: t(6) = -2.646, p = 0.038; organic chemistry: t(6) = -2.489, p = 0.047; aromaticity: t(7) = -9.000, p < 0.0005; reaction equilibria: t(7) = -5.000, p = 0.002; kinetics: t(7) = -5.227, p = 0.001; polymers: t(7) = -1.183, p = 0.275). The 5.0 group self-assessed significant gains in aromaticity and organic chemistry (p < 0.05) and in reaction equilibria, kinetics, and polymers (p < 0.10), and insignificant gains in stoichiometry, electrochemistry, and acid/base reactions (stoichiometry: t(2) = -1.890, p = 0.199; electrochemistry: t(2) = -2.000, p = 0.184; acid/base reactions: t(2) = -1.732, p = 0.225; organic chemistry: t(2) = -5.000, p = 0.038; aromaticity: t(2) = -7.000, p = 0.020; reaction equilibria: t(2) = -3.500, p = 0.073; kinetics: t(2) = -3.500, p = 0.073; polymers: t(2) = -4.000, p = 0.057). The highest exam group, 0.7-2.7, consistently assessed themselves higher in all competencies than the other two groups before the lecture (pre self-assessment), but there were no statistically significant differences between groups except for polymers (F(2, 110) = 2.794, p = 0.066, partial η² = 0.048), where the 0.7-2.7 grade group rated themselves significantly higher in the pre self-assessment than the 5.0 grade group (mean difference = 0.56 ± 0.24, p = 0.052). There were also no statistically significant differences between groups in their post self-assessment, although the highest grade group typically assessed themselves higher than the two lower groups except in properties, stoichiometry, aromaticity, and kinetics.

[Figure 4: bar chart; y-axis: Average Score (scale 1-4); x-axis: Core Competency; series: 0.7-2.7 Pre, 3.0-4.0 Pre, 5.0 Pre, 0.7-2.7 Post, 3.0-4.0 Post, 5.0 Post.]

Figure 4: Average pre and post self-assessments of the core competencies for all students and by grade groups, 0.7-2.7, 3.0-4.0, and 5.0. Maximum possible score was 4.0.

Analysis and comparison of lecture multiple choice questions

Thirteen sets of 10 multiple choice questions had previously been developed for this course for students to use as a general review. Each set became available one week after its corresponding lecture and remained open until the end of the lecture period. Since students had to answer these questions and receive a score of at least 70% to gain online access to that week’s exercise solutions, they were allowed unlimited attempts. Moodle logs the date and time the student accesses each question set as well as the score and duration of each attempt. All lecture multiple choice question sets were filtered by attempt and time: only first attempts that took longer than three minutes to complete were included in the analysis. Attempts shorter than three minutes were taken as an indication of clicking through the question set to see the answers rather than an actual attempt to assess knowledge. Overall, students scored lower on the lecture multiple choice questions for Lecture 2 (spectroscopy/properties, 66.73% ± 19.57), Lecture 4 (acid/base reactions, 64.15% ± 22.44), Lecture 5 (organic chemistry, 62.59 ± 31.74), and Lecture 8 (properties, 62.90 ± 17.56), and higher for Lecture 1 (properties, 80.50 ± 14.14), Lecture 7 (aromaticity, 86.08 ± 14.09), and Lecture 12 (kinetics, 80.67 ± 13.33). There were no statistically significant differences between the three grade groups except for Lecture 2 (spectroscopy/properties; F(2, 344) = 5.574, p = 0.004, partial η² = 0.031), Lecture 6 (organic chemistry; F(2, 262) = 5.352, p = 0.005, partial η² = 0.039), and Lecture 12 (kinetics; F(2, 269) = 2.475, p = 0.086, partial η² = 0.018). In Lecture 2, both the 0.7-2.7 group (mean difference = 8.5 ± 2.91, p = 0.010) and the 3.0-4.0 group (mean difference = 7.18 ± 2.45, p = 0.010) scored statistically significantly higher than the 5.0 group. The results are similar for Lecture 6, with the 0.7-2.7 (mean difference = 9.22 ± 2.88, p = 0.004) and 3.0-4.0 groups (mean difference = 5.86 ± 2.49, p = 0.050) having significantly higher scores than the 5.0 group.
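The filtering rule applied to the Moodle logs (first attempts only, duration longer than three minutes) can be sketched as follows; the column names are hypothetical and do not correspond to Moodle's actual export format.

```python
# A sketch of the attempt filter: keep only each student's first attempt at a
# question set, and only if it took longer than three minutes. Invented data.
import pandas as pd

attempts = pd.DataFrame({
    "student_id": [1, 1, 2, 3],
    "quiz_id":    ["L2", "L2", "L2", "L2"],
    "attempt_no": [1, 2, 1, 1],
    "duration_s": [410, 95, 60, 705],   # seconds spent on the attempt
    "score_pct":  [55.0, 90.0, 70.0, 80.0],
})

filtered = attempts[(attempts["attempt_no"] == 1) &
                    (attempts["duration_s"] > 180)]   # longer than 3 minutes

# Mean first-attempt score per lecture question set after filtering
print(filtered.groupby("quiz_id")["score_pct"].agg(["mean", "std", "count"]))
```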

Integration and correlations between competency quizzes, exam scores, lecture multiple choice questions, and self-assessment

Across the different assessments, students tended to score consistently lower on organic chemistry and polymers, and on acid/base reactions on the exam and lecture multiple choice questions, and consistently higher on stoichiometry, aromaticity, and kinetics. Interestingly, students also rated themselves lower in acid/base reactions and polymers and higher in stoichiometry, aromaticity, and electrochemistry (Figure 4), corresponding to their exam and competency assessments.

Spearman’s correlations were used to test whether the above observations were statistically significant and to validate the competency model. Competency quiz, post self-assessment, and lecture multiple choice question scores were compared to the total exam points achieved and to each other.
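A minimal sketch of this correlation step is shown below, computing Spearman's ρ between a competency quiz score and total exam points within each grade group; the data frame and its values are invented for illustration, not taken from the course records.

```python
import numpy as np
import pandas as pd
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "grade_group": rng.choice(["0.7-2.7", "3.0-4.0", "5.0"], size=60),
    "quiz2_score": rng.uniform(0, 100, size=60),        # hypothetical quiz score (%)
    "total_exam_points": rng.uniform(0, 94, size=60),   # hypothetical exam points
})

# Spearman's rho and p-value per grade group
for group, sub in df.groupby("grade_group"):
    rho, p = spearmanr(sub["quiz2_score"], sub["total_exam_points"])
    print(f"{group}: rho={rho:.3f}, p={p:.3f}")
```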

As shown in Table 3, there were 7 positive correlations between the competency quizzes and exam scores for the highest grade group, 0.7-2.7 (1), 7 for the second group, 3.0-4.0 (2), and 5 for the last group, 5.0 (3). The expected correlations are highlighted in dark blue (explicit assessment) and light blue (implicit assessment). These included positive correlations with properties, stoichiometry, electrochemistry, acid/base, organic chemistry, and aromaticity for the first group; properties, stoichiometry, electrochemistry, acid/base, and kinetics for the second group; and stoichiometry, electrochemistry, organic chemistry, reaction equilibria, and kinetics for the third. The correlations did not overlap across the different grade groups (Table 3), but the first group had a strong correlation between exam scores and stoichiometry and acid/base, and the third between exam scores and kinetics.

Table 3: Correlations between competency quizzes and exam tasks and subtasks, with the number indicating each grade group’s significant correlations (1: 0.7-2.7; 2: 3.0-4.0; 3: 5.0). Dark blue shading indicates explicit assessment and light blue implicit assessment of the corresponding competency. *TEP – Total Exam Points. [Matrix of competency quizzes Q1-Q9 and the overall competency (rows) against total exam points (TEP*) and exam tasks/subtasks 1-7, 1a-7c (columns); the individual correlations, with Spearman’s ρ and p-values, are listed in Appendix C.]

Table 4 describes the positive correlations found between the competency quizzes and the lecture multiple choice questions. For the highest grade group, 0.7-2.7, the calculated overall competency was positively correlated with the lecture multiple choice questions covering properties (two different lectures), electrochemistry, acid/base, organic chemistry, aromaticity, and reaction equilibria. For the second group, 3.0-4.0, the overall competency was correlated with properties and organic chemistry, along with Quiz 1 with two different properties lectures and Quiz 6 with properties. The third group, 5.0, had correlations between the overall competency and stoichiometry, and between Quiz 4 and organic chemistry.

Table 4: Correlations between competency quizzes and lecture multiple choice questions, with the number indicating each grade group’s significant correlations (1: 0.7-2.7; 2: 3.0-4.0; 3: 5.0). In the original, dark blue shading indicates explicit and light blue implicit assessment of the corresponding competency (shading not reproduced here).

Competency Quiz                                   | L1 | L2 | L3 | L4 | L5 | L6 | L7 | L8 | L9 | L10 | L11 | L12 | L13
Q1 (Properties & Stoichiometry)                   | 2  |    |    |    |    |    |    |    | 2  |     |     |     |
Q2 (Stoichiometry & Electrochemistry)             |    |    |    |    |    |    |    |    |    |     |     |     |
Q3 (Stoichiometry, Electrochemistry, & Acid/Base) |    |    |    |    |    |    |    |    |    |     |     |     |
Q4 (Organic Chemistry)                            |    |    |    |    | 3  |    |    |    |    |     |     |     |
Q5 (Aromaticity)                                  |    |    |    |    |    |    |    |    |    |     |     |     |
Q6 (Properties & Organic Chemistry)               |    |    |    |    |    |    |    | 2  |    |     |     |     |
Q7 (Reaction Equilibria)                          |    |    |    |    |    |    |    |    |    |     |     |     |
Q8 (Kinetics)                                     |    |    |    |    |    |    |    |    |    |     |     |     |
Q9 (Polymers)                                     |    |    |    |    |    |    |    |    |    |     |     |     |
Overall Competency                                | 2  |    | 3  | 1  | 2  | 1  | 1  | 1  | 1  |     | 1   |     |

Between the lecture multiple choice questions and the exam scores, stronger correlations were seen with regard to acid/base and reaction equilibria for the first group, 0.7-2.7, along with properties, aromaticity, and polymers (Table 5). Stronger correlations were also seen in the second grade group, 3.0-4.0, in stoichiometry and organic chemistry, along with properties, electrochemistry, acid/base, kinetics, and polymers. For the third grade group, 5.0, more correlations were seen with properties and organic chemistry, as well as stoichiometry and aromaticity. There was some overlap between groups: both the first and third groups had a positive correlation between total exam points and Lecture 7 covering aromaticity, and groups 1 and 2 between Task 7a and Lecture 10 covering properties.

Table 5: Correlations between lecture multiple choice questions and exam tasks and subtasks, with the number indicating each grade group’s significant correlations (1: 0.7-2.7; 2: 3.0-4.0; 3: 5.0). Dark blue shading indicates explicit assessment and light blue implicit assessment of the corresponding competency. *TEP – Total Exam Points. [Matrix of lecture question sets L1-L13 (rows) against total exam points (TEP*) and exam tasks/subtasks 1-7, 1a-7c (columns); the individual correlations, with Spearman’s ρ and p-values, are listed in Appendix C.]

Only three correlations were observed between the competency quizzes and post self-assessment. All three were found in the first group, 0.7-2.7, and include properties, electrochemistry, and organic chemistry (Table 6).

Table 6: Correlations between competency quizzes and the post self-assessment, with the number indicating each grade group’s significant correlations (1: 0.7-2.7; 2: 3.0-4.0; 3: 5.0). In the original, dark blue shading indicates explicit and light blue implicit assessment of the corresponding competency (shading not reproduced here).

Competency          | Q1 | Q2 | Q3 | Q4 | Q5 | Q6 | Q7 | Q8 | Q9
Properties          |    |    |    |    |    | 1  |    |    |
Stoichiometry       |    |    |    |    |    |    |    |    |
Electrochemistry    |    |    | 1  |    |    |    |    |    |
Acid/Base           |    |    |    |    |    |    |    |    |
Organic Chemistry   |    |    |    | 1  |    |    |    |    |
Aromaticity         |    |    |    |    |    |    |    |    |
Reaction Equilibria |    |    |    |    |    |    |    |    |
Kinetics            |    |    |    |    |    |    |    |    |
Polymers            |    |    |    |    |    |    |    |    |
Overall Competency  |    |    |    |    |    |    |    |    |

More positive correlations were seen between the exam scores and the post self-assessment (Table 7). The first grade group, 0.7-2.7, had correlations between the total exam points and stoichiometry, electrochemistry, acid/base, organic chemistry, aromaticity, and kinetics, and between exam tasks and acid/base, reaction equilibria, and kinetics. The second group, 3.0-4.0, had correlations between exam tasks and stoichiometry, electrochemistry, acid/base, and organic chemistry. The third group, 5.0, had a correlation between total exam points and reaction equilibria, and between exam tasks and organic chemistry and kinetics.

Few correlations were seen between the lecture multiple choice questions and the post self-assessment: one for the first group, 0.7-2.7, in organic chemistry; two for the second group, 3.0-4.0, in properties and acid/base; and none for the third group, 5.0 (Table 8).

Overall, the average core competencies, not broken down by grade group, show positive correlations with the exam tasks in stoichiometry, aromaticity, reaction equilibria, kinetics, and polymers. Between the competency quizzes and the lecture multiple choice questions, positive correlations were seen for properties, stoichiometry, electrochemistry, acid/base, organic chemistry, and reaction equilibria. Between the lecture multiple choice questions and the exam tasks, positive correlations were seen for properties, stoichiometry, electrochemistry, acid/base, organic chemistry, aromaticity, reaction equilibria, and kinetics. For the self-assessments and the competency quizzes, correlations were seen only for properties and organic chemistry; between the exam and the self-assessments, for properties, stoichiometry, electrochemistry, acid/base, kinetics, and polymers; and for the lecture quizzes, only for acid/base and organic chemistry.

Table 7: Correlations between exam tasks and subtasks and the post self-assessment, with the number indicating each grade group’s significant correlations (1: 0.7-2.7; 2: 3.0-4.0; 3: 5.0). Dark blue shading indicates explicit assessment and light blue implicit assessment of the corresponding competency. *TEP – Total Exam Points. [Matrix of the nine core competencies (rows) against total exam points (TEP*) and exam tasks/subtasks 1-7, 1a-7c (columns); the individual correlations, with Spearman’s ρ and p-values, are listed in Appendix C.]

Table 8: Correlations between lecture multiple choice questions and the post self-assessment, with the number indicating each grade group’s significant correlations (1: 0.7-2.7; 2: 3.0-4.0; 3: 5.0). In the original, dark blue shading indicates explicit and light blue implicit assessment of the corresponding competency (shading not reproduced here).

Competency          | L1 | L2 | L3 | L4 | L5 | L6 | L7 | L8 | L9 | L10 | L11 | L12 | L13
Properties          | 2  |    |    |    |    |    |    |    |    |     |     |     |
Stoichiometry       |    |    |    |    |    |    |    |    |    |     |     |     |
Electrochemistry    |    |    |    |    |    |    |    |    |    |     |     |     |
Acid/Base           |    |    |    | 2  |    |    |    |    |    |     |     |     |
Organic Chemistry   |    |    |    |    | 1  |    |    |    |    |     |     |     |
Aromaticity         |    |    |    |    |    |    |    |    |    |     |     |     |
Reaction Equilibria |    |    |    |    |    |    |    |    |    |     |     |     |
Kinetics            |    |    |    |    |    |    |    |    |    |     |     |     |
Polymers            |    |    |    |    |    |    |    |    |    |     |     |     |
Overall Competency  |    |    |    |    |    |    |    |    |    |     |     |     |

The full set of statistical data can be found in Appendix C. Significant correlations were also found between non-expected competencies, exam scores, self-assessments, and lecture multiple choice questions, but these are beyond the scope of this paper, and further analysis of the data is needed to fully examine and discuss those discrepancies.

Analysis of additional factors

At the end of the course, students were asked how often they attended lecture and recitation (since neither is obligatory, only recommended), whether they worked through the given supplementary exercises throughout the duration of the course or at the end before the exam, and whether they read and used the given script in preparation for the exam. There were no statistically significant differences between the total exam points of the three grade groups for any of these variables. Students that regularly attended lecture performed better than those that did not, but attendance at recitation did not seem to have an impact on total exam points achieved (Figure 5). When and how the students decided to prepare for the exam, in terms of the exercises completed and use of the script, did not seem to have a critical impact on their final grade (data not shown).

[Figure 5: bar chart; y-axis: Average Total Exam Points; x-axis: Attendance (None, Few (1-4), Several (5-10), Most (11-13), All (14)); series: Lecture, Recitation.]

Figure 5: Average total exam points for lecture and recitation attendance.

Discussion

Use of an online platform such as Moodle allows quick assessment and feedback to both students and instructors on knowledge gained throughout a course. The fundamental issues for any assessment are validity and reliability. Validity includes authenticity of the assessment, effective feedback, and learner support, while reliability is defined as the degree to which what is being assessed is a sufficient indicator of student knowledge1. This direct competency model indicates a link between core competency questions in Chemistry and knowledge assessed through the exam. The core competency quizzes, coupled with the previously developed lecture multiple choice questions, are indicators of Chemistry knowledge, especially in the areas of properties, stoichiometry, organic chemistry, reaction equilibria, and kinetics, and less so in acid/base, electrochemistry, and polymers. Further validation and reliability testing are needed due to the sample size of this study.

There were weaker or no links between the core competencies and the self-assessment. The only correlations seen were in the areas of properties and organic chemistry. More were seen between the points achieved on exam tasks and subtasks and the self-assessments, especially in the areas of stoichiometry, electrochemistry, acid/base, organic chemistry, reaction equilibria, and kinetics, and less so in properties, aromaticity, and polymers. Even though the differences were not significant, students tended to rate themselves higher in the core competencies in which they performed better on the exam and competency quizzes and lower in those in which they performed worse. Similar studies have confirmed the correlation between self-assessment and objective measures2.

Conclusions

In this study, the development of a direct competency model coupled with an online assessment platform, Moodle, allows real-time feedback for the instructors and students on knowledge level. From this information, instructors can bring more information into the next lecture if there is a gap in knowledge instead of finding out after the exam, and the students can self-regulate their learning process. It is particularly useful in a large class with hundreds of students and no formally graded work. The next step in this research is to carefully look at each question and objectively assess whether it is fully measuring its goal and further develop more competency questions to make a more inclusive competency concept inventory for a short quiz (2-3 questions) every week. With an inventory and online assessment platform, a randomized competency quiz could be given to each student when they access the quiz. For example, for one week the computer program would randomly assign two out of 10 questions from an inventory over a specified competency. This would also lead to a decrease in student dishonesty.
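The proposed randomized delivery could be as simple as sampling two questions per student from the week's competency inventory, as in the hypothetical sketch below; seeding on the student and week keeps each student's selection stable across logins while differing between students. The inventory structure and names are invented.

```python
import random

def assign_weekly_quiz(inventory, student_id, week, n_questions=2):
    """Draw n questions for one student from the week's competency inventory.

    Seeding on (student, week) makes the draw reproducible for that student
    while still varying across students, which would also reduce the chance
    of students sharing answers."""
    rng = random.Random(f"{student_id}-{week}")
    return rng.sample(inventory, k=n_questions)

# Hypothetical ten-question inventory for one competency
stoichiometry_inventory = [f"Stoichiometry Q{i}" for i in range(1, 11)]
print(assign_weekly_quiz(stoichiometry_inventory, student_id=42, week=3))
```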

Because participation in this study was voluntary and the sample was small relative to the total population, these data will be combined with the forthcoming year’s data for a fuller analysis, along with a breakdown of each question in the competency quizzes to help tie them to the course outcomes. To increase the number of participating students, completion of the competency quizzes and lecture multiple choice questions has been strongly recommended for feedback and self-regulation of the learning process. Another possibility would be to require these quizzes and give students a small participation grade. Increasing the sample will increase the validity and reliability of this model.

References

1. Gikandi JW; Morrow D; Davis NE (2011). Online Formative Assessment in Higher Education: A Review of the Literature. Computers & Education, 57:2333-2351.
2. Mehta SI; Schlecht NW (1998). Computerized Assessment Technique for Large Classes. Journal of Engineering Education, 87(2):167-172.
3. Robles M; Braathen S (2002). Online Assessment Techniques. Delta Pi Epsilon Journal, XLIV(1):39-49.
4. Barak M; Rafaeli S (2004). Online Question-Posing and Peer-Assessment as Means for Web-based Knowledge Sharing. International Journal of Human-Computer Studies, 61:84-103.
5. Swan K; Shen J; Hiltz SR (2006). Assessment and Collaboration in Online Learning. Online Learning (formerly Journal of Asynchronous Learning), 45-62.
6. Conole G; Warburton B (2005). A Review of Computer-Assisted Assessment. Alt-J, Research in Learning Technology, 13(1):17-31.
7. Graff M (2003). Cognitive Style and Attitudes Towards Using Online Learning and Assessment Methods. Electronic Journal of e-Learning, 1(1):21-28.
8. Hager P; Gonczi A (1996). What is Competence? Medical Teacher, 18(1):15-18.
9. Brumm TJ; Mickelson SK; Steward BL; Kaleita AL (2006). Competency-based Outcomes Assessment for Agricultural Engineering Problems. International Journal of Engineering Education, 22(6):1163-1172.
10. Terry RE; Harb JN; Hecker WC; Wilding WV (2002). Definition of Student Competencies and Development of an Educational Plan to Assess Student Mastery Level. International Journal of Engineering Education, 18(2):225-235.
11. Witt HJ; Alabart JR; Giralt F; Herrero J; Vernis L; Medir M (2006). A Competency-Based Educational Model in a Chemical Engineering School. International Journal of Engineering Education, 22(2):218-235.
12. Jacobs TJ; Caton JA (2014). An Inventory to Assess Students’ Knowledge of Second Law Concepts. 121st ASEE Annual Conference & Exposition, Indianapolis, IN.
13. Hochstein JL; Perry EH (1999). Direct Competency Testing – Is It For You? 106th ASEE Annual Conference & Exposition, Charlotte, NC.
14. Karimi A; Manteufel RD (2014). Assessment of Fundamental Concepts in Thermodynamics. 121st ASEE Annual Conference & Exposition, Indianapolis, IN.
15. Dougiamas M (2001). Moodle: Open-Source Software for Producing Internet-based Courses. http://moodle.com/.

Appendix A – Direct Competency Questions

The following questions are examples, one per competency, drawn from the 25 questions developed for the direct competency quizzes. The correct answer is marked with an asterisk. Q1.3 indicates that this question was the third question in Quiz 1, and similarly for the rest.

─ Properties of atoms and compounds based on the periodic table and bonds
Q1.3 A material is brittle, can conduct electricity when melted, is a high melting point solid, and is soluble in water. What type of chemical bond does it exhibit?
a. Covalent
b. Ionic*
c. Metallic
d.
e. Van der Waals
f. I don’t know

─ Stoichiometry
Q1.2 How many moles of aluminum oxide are produced from a reaction of 2 moles of aluminum and 1 mole of oxygen?
a. 1 mol
b. 2/3 mol*
c. 2 mol
d. 1/2 mol
e. I don’t know

─ Oxidation/reduction equations
Q2.2 Based on the oxidation-reduction reaction of Cu(s) and nitric acid, which of the following statements is correct?
a. Cu is oxidized*
b. Cu is reduced
c. Cu is a reducing agent
d. HNO3 is an oxidizing agent
e. HNO3 is oxidized
f. I don’t know

─ Acid/base neutralization
Q3.1 Identify the correct acid/base and conjugate pair.

H₂PO₄⁻ + H₂O ↔ HPO₄²⁻ + H₃O⁺
a. H₂PO₄⁻ is the acid; HPO₄²⁻ is the conjugate acid
b. H₂PO₄⁻ is the acid; HPO₄²⁻ is the conjugate base*
c. H₂O is the acid; H₃O⁺ is the conjugate base
d. H₂PO₄⁻ is the base; H₃O⁺ is the conjugate acid
e. I don’t know

─ Identifying, naming, and properties of the different classes of Organic Chemistry
Q4.1 What is the systematic name for the following compound?

a. 3-ethyl-4-methylhexane*
b. 2,3-diethylpentane
c. 3-methylpropylpentane
d. 3,4-diethylpentane
e. 3-methyl-4-ethylhexane
f. I don’t know

─ Reaction equilibria
Q7.3 In the following system, what occurs when the temperature is increased?

2 NO₂(g) ↔ N₂O₄(g), ΔH° = −57.20 kJ
a. Reaction would shift right producing more product
b. Reaction would shift left producing more reactant*
c. Stay in equilibrium with no shifting of the reaction left or right to produce more product or reactant
d. I don’t know

─ Kinetics
Q8.3 What effect does a catalyst have on the Arrhenius equation?

k = A·e^(−Ea/RT)
a. No effect
b. Increase rate
c. Decrease rate
d. Lowers Ea*
e. Increases Ea
f. Affects T
g. Affects A
h. I don’t know

─ Properties of polymers
Q9.2 Identify the monomer in the given polymer.

a. *

b.

c.

d.

e. I don’t know

Appendix B – Pre and Post Self-Assessment

Pre and Post Self-Assessment
Please rate your knowledge of the following core Chemistry concepts. Rank on a scale of 1-4 (4 – know it well; 3 – familiar with it; 2 – have seen or heard it; 1 – have no clue or idea).
1. Determining properties of atoms and compounds with the periodic table and/or type of bonds
2. Setting up chemical reactions using stoichiometry to calculate yields
3. Differentiating oxidation and reduction processes and calculating redox reactions
4. Formulating acid-base equilibria and calculating the progress of a neutralization reaction
5. Identifying and naming different classes of Organic Chemistry
6. Describing reaction equilibria mathematically and applying Le Chatelier’s Principle
7. Determining reaction order from experimental data
8. Determining aromaticity
9. Determining monomers and properties of polymers

Post Self-Assessment Only
How did you prepare for the lectures and exam? Select all that apply.
1. Read script: a. Yes b. No
2. Read other textbooks: a. Yes b. No
3. Attended lecture; if yes, how many? a. All (14) b. Most (11-13) c. Several (5-10) d. Few (1-4)
4. Attended recitation regularly; if yes, how many? a. Every week b. Most (11-13) c. Several (5-10) d. Few (1-4)
5. Worked through modules/exercises; if yes, how often? a. Regularly, as the course progressed and the topic/lecture was introduced b. Occasionally c. At the end of the course to prepare for the exam
6. Found additional exercises to study and/or work through: a. Yes b. No

Appendix C – Correlation Tables with Spearman’s ρ and p-values. Only significant correlations are shown, with p < 0.100.

Correlations between Competency Quizzes and Exam Tasks and Subtasks
(Factor 1 | Factor 2 | Associated Core Competency | Spearman's ρ | p-value)

Grade group 0.7-2.7:
TEP* | Quiz 5 | Aromaticity | 0.485 | 0.093
Task 2 | Quiz 6 | Organic Chemistry | 0.606 | 0.037
Task 4h | Quiz 2 | Electrochemistry | 0.423 | 0.045
Task 5 | Quiz 3 | Stoichiometry/Acid/Base | 0.520 | 0.032
Task 5e | Quiz 3 | Stoichiometry/Acid/Base | 0.502 | 0.004
Task 5f | Quiz 3 | Stoichiometry/Acid/Base | 0.647 | 0.005
Task 7b | Quiz 1 | Properties | 0.079 | 0.079

Grade group 3.0-4.0:
TEP | Quiz 1 | Properties/Stoichiometry | 0.237 | 0.079
TEP | Overall | All competencies | 0.257 | 0.047
Task 2g | Quiz 8 | Kinetics | 0.467 | 0.079
Task 3b | Quiz 2 | Stoichiometry | 0.293 | 0.098
Task 4b | Quiz 3 | Electrochemistry | 0.403 | 0.034
Task 4f | Quiz 1 | Properties | 0.409 | 0.002
Task 5b | Quiz 3 | Acid/Base | 0.384 | 0.043

Grade group 5.0:
TEP | Quiz 2 | Stoichiometry/Electrochemistry | 0.557 | 0.060
Task 2 | Quiz 4 | Organic Chemistry/Kinetics | 0.521 | 0.083
Task 2d | Quiz 8 | Kinetics | 0.667 | 0.102
Task 3 | Quiz 8 | Kinetics | 0.668 | 0.101
Task 3d | Quiz 7 | Stoichiometry/Reaction Equilibrium | 0.690 | 0.058

Overall:
TEP | Overall | All competencies | 0.146 | 0.092
TEP | Quiz 5 | Aromaticity | 0.359 | 0.025
TEP | Quiz 8 | Kinetics | 0.603 | <0.0005
Task 2 | Quiz 8 | Kinetics | 0.393 | 0.024
Task 2g | Quiz 8 | Kinetics | 0.475 | 0.005
Task 3 | Quiz 3 | Stoichiometry | 0.216 | 0.104
Task 3 | Quiz 8 | Kinetics | 0.473 | 0.005
Task 3b | Quiz 2 | Stoichiometry | 0.214 | 0.079
Task 3b | Quiz 7 | Reaction Equilibria | 0.306 | 0.070
Task 3d | Quiz 7 | Reaction Equilibria | 0.365 | 0.029
Task 3h | Quiz 7 | Reaction Equilibria | 0.294 | 0.082
Task 4g | Quiz 2 | Stoichiometry | 0.211 | 0.084
Task 5d | Quiz 1 | Stoichiometry | 0.155 | 0.081
Task 5e | Quiz 3 | Stoichiometry | 0.359 | 0.006
Task 5f | Quiz 3 | Stoichiometry | 0.287 | 0.029
Task 6e | Quiz 9 | Polymers | 0.312 | 0.020

*TEP – Total Exam Points

Correlations between Competency Quizzes and Lecture Multiple Choice Questions
(Factor 1 | Factor 2 | Associated Core Competency | Spearman's ρ | p-value)

Grade group 0.7-2.7:
Overall Competency | L4* | Acid/Base/Electrochemistry (overall) | 0.290 | 0.086
Overall Competency | L6 | Organic Chemistry (overall) | 0.690 | <0.0005
Overall Competency | L7 | Aromaticity (overall) | 0.409 | 0.039
Overall Competency | L8 | Properties (overall) | 0.301 | 0.021
Overall Competency | L9 | Properties (overall) | 0.333 | 0.104
Overall Competency | L11 | Reaction Equilibria (overall) | 0.367 | 0.044

Grade group 3.0-4.0:
Quiz 1 | L1 | Properties | 0.265 | 0.051
Quiz 1 | L9 | Properties | 0.323 | 0.101
Quiz 6 | L8 | Properties | 0.534 | 0.074
Overall Competency | L1 | Properties (overall) | 0.244 | 0.062
Overall Competency | L5 | Organic Chemistry (overall) | 0.293 | 0.039

Grade group 5.0:
Quiz 4 | L5 | Organic Chemistry | 0.572 | 0.066
Overall Competency | L3 | Stoichiometry (overall) | 0.372 | 0.051

Overall:
Overall Competency | L1 | Properties | 0.206 | 0.017
Overall Competency | L3 | Stoichiometry (overall) | 0.220 | 0.019
Overall Competency | L4 | Acid/Base/Electrochemistry (overall) | 0.230 | 0.015
Overall Competency | L5 | Organic Chemistry (overall) | 0.196 | 0.036
Overall Competency | L6 | Organic Chemistry (overall) | 0.261 | 0.009
Overall Competency | L11 | Reaction Equilibria (overall) | 0.228 | 0.012
Quiz 1 | L1 | Properties | 0.229 | 0.010
Quiz 4 | L5 | Organic Chemistry | 0.274 | 0.057

*L – Lecture

Correlations between Lecture Multiple Choice Questions and Exam Tasks and Subtasks
(Factor 1 | Factor 2 | Associated Core Competency | Spearman's ρ | p-value)

Grade group 0.7-2.7:
TEP* | L7* | Aromaticity | 0.272 | 0.047
Task 3 | L11 | Reaction Equilibria | 0.227 | 0.049
Task 3c | L11 | Reaction Equilibria | 0.229 | 0.047
Task 3h | L11 | Reaction Equilibria | 0.221 | 0.055
Task 3i | L11 | Reaction Equilibria | 0.196 | 0.090
Task 4 | L4 | Acid/Base | 0.242 | 0.041
Task 4d | L4 | Acid/Base | 0.308 | 0.009
Task 4h | L4 | Acid/Base | 0.193 | 0.105
Task 6b | L13 | Polymers | 0.222 | 0.094
Task 7a | L10 | Properties | 0.238 | 0.068

Grade group 3.0-4.0:
TEP | L3 | Stoichiometry | 0.194 | 0.017
TEP | L5 | Organic Chemistry | 0.165 | 0.044
Task 2 | L6 | Organic Chemistry | 0.165 | 0.060
Task 2b | L6 | Organic Chemistry | 0.159 | 0.070
Task 2c | L5 | Organic Chemistry | 0.145 | 0.077
Task 2d | L12 | Kinetics | 0.153 | 0.075
Task 4e | L4 | Electrochemistry | 0.162 | 0.056
Task 5 | L3 | Stoichiometry | 0.133 | 0.104
Task 5f | L4 | Acid/Base | 0.152 | 0.072
Task 6e | L13 | Polymers | 0.190 | 0.038
Task 7a | L1 | Properties | 0.150 | 0.046

Grade group 5.0:
TEP | L2 | Properties | 0.176 | 0.083
TEP | L7 | Aromaticity | 0.289 | 0.023
Task 2a | L5 | Organic Chemistry | 0.204 | 0.073
Task 2b | L5 | Organic Chemistry | 0.266 | 0.018
Task 3a | L3 | Stoichiometry | 0.198 | 0.069
Task 4 | L2 | Properties | 0.195 | 0.060
Task 4 | L9 | Properties | 0.244 | 0.106
Task 7 | L1 | Properties | 0.221 | 0.026
Task 7 | L2 | Properties | 0.300 | 0.003
Task 7a | L1 | Properties | 0.169 | 0.089
Task 7a | L2 | Properties | 0.230 | 0.026
Task 7b | L1 | Properties | 0.175 | 0.079
Task 7b | L2 | Properties | 0.284 | 0.006

Overall:
TEP | L1 | Properties | 0.097 | 0.065
TEP | L2 | Properties | 0.169 | 0.002
TEP | L3 | Stoichiometry | 0.138 | 0.016
TEP | L5 | Organic Chemistry | 0.183 | 0.001
TEP | L6 | Organic Chemistry | 0.218 | <0.0005
TEP | L7 | Aromaticity | 0.117 | 0.073
TEP | L11 | Reaction Equilibria | 0.100 | 0.071
TEP | L12 | Kinetics | 0.116 | 0.057
Task 2 | L5 | Organic Chemistry | 0.137 | 0.017
Task 2 | L6 | Organic Chemistry | 0.191 | 0.002
Task 2 | L12 | Kinetics | 0.162 | 0.008
Task 2a | L5 | Organic Chemistry | 0.107 | 0.069
Task 2a | L6 | Organic Chemistry | 0.148 | 0.016
Task 2b | L6 | Organic Chemistry | 0.160 | 0.009
Task 2c | L5 | Organic Chemistry | 0.134 | 0.019
Task 2c | L6 | Organic Chemistry | 0.102 | 0.099
Task 2g | L12 | Kinetics | 0.132 | 0.031
Task 3 | L11 | Reaction Equilibria | 0.137 | 0.014
Task 3 | L12 | Kinetics | 0.138 | 0.024
Task 3a | L3 | Stoichiometry | 0.117 | 0.042
Task 3c | L11 | Reaction Equilibria | 0.101 | 0.070
Task 3g | L11 | Reaction Equilibria | 0.112 | 0.045
Task 3i | L11 | Reaction Equilibria | 0.131 | 0.018
Task 3j | L11 | Reaction Equilibria | 0.092 | 0.098
Task 4 | L2 | Properties | 0.119 | 0.028
Task 4 | L4 | Electrochemistry | 0.147 | 0.013
Task 4c | L4 | Electrochemistry | 0.173 | 0.003
Task 4d | L4 | Electrochemistry | 0.108 | 0.068
Task 4h | L4 | Electrochemistry | 0.121 | 0.090
Task 5 | L3 | Stoichiometry | 0.121 | 0.035
Task 5d | L4 | Acid/Base | 0.103 | 0.082
Task 7 | L1 | Properties | 0.092 | 0.082
Task 7 | L2 | Properties | 0.112 | 0.039
Task 7 | L8 | Properties | 0.105 | 0.086
Task 7 | L9 | Properties | 0.159 | 0.035
Task 7a | L1 | Properties | 0.144 | 0.006
Task 7a | L2 | Properties | 0.180 | 0.001
Task 7c | L1 | Properties | 0.109 | 0.038
Task 7c | L9 | Properties | 0.145 | 0.055

*TEP – Total Exam Points; L – Lecture

Correlations between Competency Quizzes and Post Self-Assessment
(Factor 1 | Factor 2 (Associated Core Competency) | Spearman's ρ | p-value)

Grade group 0.7-2.7:
Quiz 3 | Electrochemistry | 0.558 | 0.094
Quiz 4 | Organic Chemistry | 0.736 | 0.024
Quiz 6 | Properties | 0.784 | 0.021

Grade group 3.0-4.0: None

Grade group 5.0: None

Overall:
Quiz 1 | Properties | 0.431 | 0.084
Quiz 4 | Organic Chemistry | 0.477 | 0.053
Quiz 6 | Properties | 0.572 | 0.026

Correlations between Exam Tasks and Subtasks and Post Self-Assessment
(Factor 1 | Factor 2 (Associated Core Competency) | Spearman's ρ | p-value)

Grade group 0.7-2.7:
TEP* | Stoichiometry | 0.493 | 0.053
TEP | Aromaticity | 0.575 | 0.020
TEP | Kinetics | 0.523 | 0.038
TEP | Acid/Base | 0.480 | 0.060
TEP | Organic Chemistry | 0.687 | 0.003
TEP | Electrochemistry | 0.718 | 0.002
Task 2g | Kinetics | 0.462 | 0.072
Task 3i | Kinetics | 0.519 | 0.039
Task 5 | Reaction Equilibria | 0.452 | 0.079
Task 5e | Acid/Base | 0.640 | 0.008

Grade group 3.0-4.0:
Task 1 | Organic Chemistry | 0.500 | 0.021
Task 4g | Stoichiometry | 0.360 | 0.100
Task 4g | Electrochemistry | 0.365 | 0.104
Task 5 | Acid/Base | 0.398 | 0.074
Task 5g | Acid/Base | 0.369 | 0.091

Grade group 5.0:
TEP | Reaction Equilibria | 0.758 | 0.080
Task 2 | Kinetics | 0.778 | 0.069
Task 2c | Organic Chemistry | 0.775 | 0.070

Overall:
TEP | Acid/Base | 0.315 | 0.040
TEP | Electrochemistry | 0.335 | 0.028
Task 1 | Properties | 0.248 | 0.104
Task 1 | Organic Chemistry | 0.341 | 0.025
Task 2c | Organic Chemistry | 0.283 | 0.066
Task 3i | Organic Chemistry | 0.373 | 0.013
Task 5 | Acid/Base | 0.306 | 0.046
Task 5b | Acid/Base | 0.347 | 0.022
Task 5c | Stoichiometry | 0.263 | 0.084
Task 5c | Acid/Base | 0.259 | 0.093
Task 5e | Acid/Base | 0.505 | 0.001
Task 6c | Polymers | 0.256 | 0.094
Task 6e | Polymers | 0.268 | 0.079

*TEP – Total Exam Points

Correlations between Lecture Multiple Choice Questions and Post Self-Assessment
(Factor 1 | Factor 2 (Associated Core Competency) | Spearman's ρ | p-value)

Grade group 0.7-2.7:
L5* | Organic Chemistry | 0.426 | 0.100

Grade group 3.0-4.0:
L1 | Properties | 0.454 | 0.034
L4 | Acid/Base | 0.466 | 0.080

Grade group 5.0: None

Overall:
L4 | Acid/Base | 0.282 | 0.096
L5 | Organic Chemistry | 0.306 | 0.052

*L – Lecture