Predicting and Understanding Success in an Innovation-Based Learning Course
Lauren Singelmann, Enrique Alvarez, Ellen Swartz, Ryan Striker, Mary Pearson, Dan Ewert
North Dakota State University
[email protected]

ABSTRACT
In order to keep up with the rising demand for new and innovative solutions in an evolving world, ever greater importance is being placed on training engineers who can tackle big problems. However, the process of teaching engineering students to be innovative is not straightforward. There are multiple ways to demonstrate innovation and problem-solving ability, meaning traditional educational data mining methods aren't always appropriate. To better understand the process of problem-solving and innovation, this work collected data from students working on innovation projects within a course and determined appropriate ways to gain information and insight from those data. Students wrote and categorized learning objectives in an online portal, which generated log data when they created, updated, and completed personal learning objectives and corresponding deliverables. Classification models that were both robust (ROC AUC > .95) and interpretable were applied to both the language used in the objectives and quantifiable features such as the number of objectives, the time of completing certain milestones, and the number of deletions and edits. By extracting the most significant features, we are able to see which variables are most likely to lead to student success in innovation-based learning. This can help instructors offer impactful support to students or eventually lead to an online tutoring system. The analysis will help students develop and grow throughout the innovation process in this course and in other open-ended problem-solving environments.

Keywords
Classification, open-ended learning, innovation, problem-solving

1. INTRODUCTION
Thomas Friedman describes the current era as the Age of Accelerations, a time in which technology, the climate, and globalization are all evolving at a rate like we've never seen before [5]. As these areas progress, engineers need to be able to identify and solve problems more quickly and effectively than ever before. ABET [1], the National Academies of Engineering [10], and experts in both engineering and education [12] all stress the growing importance of training engineers who can use their problem-solving skills to create new and innovative solutions. This work explores how students build these skills and solve real-world problems in an Innovation-Based Learning (IBL) course. IBL students apply their content knowledge and skills to a real-world project with the goal of creating value external to the class. For example, successful students have presented their work at conferences, published papers, participated in invited outreach activities, or submitted invention disclosures. Students are required to write their own learning objectives and show evidence of work. However, because there are so many possible approaches to the course, predicting and understanding student success can be challenging.

In order to better understand what makes students successful in this type of course, data were collected from the course's online learning portal. Because of the open-ended nature of the course, classification and knowledge discovery can be challenging. We wanted to build classifier models that were robust but also interpretable in order to better predict and support future students in IBL-style courses. Therefore, three main research questions were explored:

RQ1: What feature sets and models work best for IBL data?
RQ2: How early can student success be predicted?
RQ3: What features are most likely to differentiate between top performers and lower performers?

Finding answers to these questions can help guide instruction in the course and the potential development of an online tutoring system for innovation-based learning courses and other open-ended problem-solving environments. This paper presents literature about open-ended learning environments, gives details about the course and the data collected, elaborates on how the research questions were tested, and shares results and takeaways.
2. OPEN-ENDED LEARNING
Open-ended learning is a pedagogical approach that allows students to use their own motivations and approaches to learn about the world [8]. Students take part in authentic problem-solving, practice metacognition, and create unique pathways through their learning. Examples of open-ended learning environments include computer programming exercises, project-based learning, and inquiry-based learning. Educational research exists about the potential benefits of these experiences [11], but there are still gaps in understanding how students progress through open-ended learning environments. Educational data mining (EDM) has shown great potential in helping shed light on student trajectories and habits within open-ended learning environments [20].

EDM has shown preliminary successes in open-ended learning environments such as learning computer programming [9], project-based learning courses [17], online tutoring platforms [3, 4], learning-by-teaching platforms [6], and language tutoring systems [13]. Continuing to make strides in understanding learning in open-ended environments is imperative because it will be an important step in implementing evidence-based practices and assessment in these environments.

3. METHODS
3.1 Cardiovascular Engineering Course
Data were collected from an upper-level cardiovascular engineering course at a medium-sized research university. Students learned about the main concepts of cardiovascular engineering, including functional block diagrams of the heart, arterial systems, and ECG. The students were assessed on their ability to apply these concepts to a project they worked on during the semester. After identifying a project and a team, they wrote learning objectives and corresponding deliverables that would share what they needed to learn, when they would learn it, and how they would demonstrate it. Students could adjust their learning goals as needed, but they were expected to share their progress on their project multiple times during the semester [14]. To get an A in the course, students needed to work on an innovation project and achieve high external value. High external value is demonstrated by sharing work outside the course and getting review from a subject-matter expert, e.g., publishing a paper, presenting at a conference, or submitting an invention disclosure [2, 19].

3.2 Online Learning Portal
A custom online learning management system (LMS) was created for students to keep track of their learning objectives and deliverables [18]. When students add a learning objective, they give it a title and description, assign it to a level of Bloom's Revised 3D Taxonomy [7], and categorize it using the list of objective categories in Appendix A. When adding a deliverable, students give it a title, description, level of external value, estimated completion time, and status (not started, in progress, or completed). An example of a learning objective and corresponding deliverables is shown in Figure 1. Every time a student adds, edits, or deletes an objective or deliverable, the action is recorded in the portal's log data.
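As a rough illustration, the records the portal collects could be modeled as below. This is a minimal sketch under assumed names (Status, Deliverable, LearningObjective, LogEvent are all hypothetical); the actual LMS schema is not described here.

    # Minimal sketch of the portal's record types; all names are assumptions.
    from dataclasses import dataclass, field
    from datetime import datetime
    from enum import Enum

    class Status(Enum):
        NOT_STARTED = "not started"
        IN_PROGRESS = "in progress"
        COMPLETED = "completed"

    @dataclass
    class Deliverable:
        title: str
        description: str
        external_value_level: int   # level of external value
        estimated_hours: float      # estimated completion time
        status: Status = Status.NOT_STARTED

    @dataclass
    class LearningObjective:
        title: str
        description: str
        blooms_level: str           # level of Bloom's Revised 3D Taxonomy [7]
        category: str               # objective category from Appendix A
        deliverables: list = field(default_factory=list)

    @dataclass
    class LogEvent:                 # one log row per add/edit/delete action
        student_id: str
        action: str                 # "add", "edit", or "delete"
        target: str                 # "objective" or "deliverable"
        timestamp: datetime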
Figure 1: Example of collected learning objective and corresponding deliverables.

[...] students achieved high external value, 10 students worked on an innovation project but did not achieve high external value, and 1 student made some learning goals but did not complete any.

3.4 Feature Collection
Two main types of features were used and compared: quantitative data and text data. The quantitative features extracted from the data include countable features (e.g., number of planned learning objectives, number of logins), quarter-based progress (e.g., number of deliverables completed during quarter 2, number of learning objectives deleted during quarter 4), presence of the specific learning objectives seen in Appendix A (e.g., presence of an Invention Disclosure objective, number of Fundamentals of Research objectives), and the level of learning as defined by Bloom's Revised Taxonomy and the level of external value.

For the text data, all learning objective and deliverable titles and descriptions were extracted for each student. Using the scikit-learn library in Python, all the words that students wrote in their objectives and deliverables were tokenized, counted, and scaled.

4. EXPERIMENTS
4.1 Models and Feature Sets
In order to predict which students would achieve high external value during the course of the semester, three classifier models were tested: Support Vector Machine (SVM), Logistic Regression (LR), and K-Nearest Neighbors (KNN). These three models were chosen because they have some level of interpretability, an important feature in EDM [15]. In order for instructors to use the discovered information, they need to be able to understand where it was derived from. The baseline model was a Majority Class (MC) classifier.
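To make the text pipeline of Section 3.4 concrete, a sketch along these lines would tokenize, count, and scale the students' writing. CountVectorizer and TfidfTransformer are plausible scikit-learn choices, but the exact tokenizer and scaling used in the study are not specified, and the documents below are invented.

    # Sketch of the Section 3.4 text features; the scaling choice is an assumption.
    from sklearn.feature_extraction.text import CountVectorizer, TfidfTransformer

    # One concatenated document per student (placeholder text).
    docs = [
        "invention disclosure fundamentals of research prototype",
        "functional block diagram of the heart ecg arterial systems",
    ]

    counts = CountVectorizer().fit_transform(docs)             # tokenize and count
    text_features = TfidfTransformer().fit_transform(counts)   # scale the counts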
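The model comparison of Section 4.1 can be sketched as follows, with scikit-learn's DummyClassifier standing in for the Majority Class baseline; the feature matrix, labels, and hyperparameters here are placeholders rather than the study's data or settings.

    # Sketch of the Section 4.1 comparison; data and hyperparameters are made up.
    import numpy as np
    from sklearn.dummy import DummyClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 10))        # placeholder feature matrix
    y = rng.integers(0, 2, size=40)      # 1 = achieved high external value

    models = {
        "MC baseline": DummyClassifier(strategy="most_frequent"),
        "SVM": SVC(kernel="linear"),
        "LR": LogisticRegression(max_iter=1000),
        "KNN": KNeighborsClassifier(n_neighbors=3),
    }
    for name, model in models.items():
        auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
        print(f"{name}: mean ROC AUC = {auc:.2f}")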