
Supporting the Learning of Rapid Application Development in a Database Environment


Silviu Risco BEng(Mech), PGradDipInfTech

Thesis submitted in fulfillment of the requirements for the degree of

Doctor of Philosophy

Queensland University of Technology, Australia. August, 2012

Keywords

A list of keywords for this thesis is as follows: Intelligent Tutoring Systems, Student Modelling, Individualised Feedback, Rapid Application Development, Microsoft Access

Abstract

Intelligent Tutoring Systems (ITSs) are systems designed to provide individualised help to students, learning in a problem-solving context. The difference between an ITS and a Computer Assisted Instruction (CAI) system is that an ITS has a Student Model, which allows it to provide a better educational environment. The Student Model contains information on what the student knows, and does not know, about the domain being learnt, as well as other personal characteristics such as preferred learning style.

This research has resulted in the design and development of a new ITS: Personal Access Tutor (PAT). PAT is an ITS that helps students to learn Rapid Application Development in a database environment. More specifically, PAT focuses on helping students to learn how to create forms and reports in Microsoft Access. To provide an augmented learning environment, PAT's architecture is different to that of most other ITSs. Instead of having a simulation module, PAT uses a widely-used database development environment (Microsoft Access). This enables the students to ask for help while developing real applications, using real databases.

As part of this research, I designed and created the knowledge base required for PAT. This contains four models: the domain, student, tutoring and exercises models. The Instructional Expert I created for PAT provides individualised help to the students to help them correctly finish each exercise, and also proposes the next exercise that a student should work on. PAT was evaluated by students enrolled in the Databases subject at QUT, and by staff members involved in teaching the subject. The results of the evaluation were positive and are discussed in the thesis.

Table of Contents

1 Introduction
  1.1 Artificial Intelligence in Education
  1.2 Intelligent Tutoring Systems
      1.2.1 The Architecture of an ITS
      1.2.2 ITS Behaviour
      1.2.3 ITS Knowledge Base
  1.3 Rapid Application Development
  1.4 Research Goal and Objectives
  1.5 Research Scope
      1.5.1 Work done Prior to this Research
      1.5.2 Research Scope
  1.6 Outline of Thesis

2 Literature Review
  2.1 Introduction
  2.2 The Components of an ITS
      2.2.1 The Domain Model
      2.2.2 The Student Model
      2.2.3 The Instructional Expert
      2.2.4 The Simulation Component
  2.3 Knowledge Representation Formalisms
  2.4 Detailed Description of Specific Systems
      2.4.1 ITSs for Learning About Databases
      2.4.2 ITSs for Learning About Programming
      2.4.3 ITSs with a Detailed Student Model
      2.4.4 ITSs that Do/Don't Use a Simulation
  2.5 Implications for Research in this Thesis

3 System Requirements and Architecture
  3.1 System Requirements
      3.1.1 Analysing the Students' Needs
      3.1.2 Functional Requirements
      3.1.3 Non-functional Requirements
  3.2 Teaching MS Access
      3.2.1 Microsoft Access Overview
      3.2.2 Teaching Access at QUT
  3.3 PAT's Architecture
      3.3.1 Architectural Design
      3.3.2 Knowledge Representation
      3.3.3 PAT's Functionality
  3.4 Summary

4 Designing new Models
  4.1 Introduction
  4.2 The Domain Model
      4.2.1 Domain Model Conceptual Schema
      4.2.2 Extend the Use of the Domain Model
  4.3 The Exercises Model
      4.3.1 Exercise Description
      4.3.2 Exercise Solution
      4.3.3 Creating a New Exercise
  4.4 The Student Model
      4.4.1 Student's Knowledge About the Domain
      4.4.2 Student's Personal Characteristics
      4.4.3 Student's History
  4.5 Updating the Student Model
  4.6 The Tutoring Model
  4.7 Summary

5 Giving Feedback to the Student
  5.1 Introduction
  5.2 Analysis of the Student's Solution
  5.3 Synthesis: Generating the Feedback
      5.3.1 Hint Levels
      5.3.2 High Level Pseudo Code for Synthesis
  5.4 Propose a New Exercise
      5.4.1 Considerations when Recommending a New Exercise
      5.4.2 Pseudo Code for Recommending a New Exercise
  5.5 Summary

6 Building a new ITS
  6.1 Technologies Used
  6.2 Interfaces
      6.2.1 First Time Interfaces
      6.2.2 Outer Loop Interface
      6.2.3 Inner Loop Interfaces
      6.2.4 Additional Interfaces
  6.3 Databases and Tables
  6.4 Modules Structure and Code
  6.5 Summary

7 System Evaluation
  7.1 Evaluating an Intelligent Tutoring System
      7.1.1 Aspects of an ITS Evaluation
      7.1.2 Evaluation Methods
      7.1.3 Choosing the Correct Evaluation Method
      7.1.4 How was PAT Evaluated
  7.2 PAT's Summative Evaluation
      7.2.1 Evaluations' Objectives
      7.2.2 Evaluations with Students
      7.2.3 Evaluations with Staff
      7.2.4 Conclusions for Summative Evaluation
  7.3 Pre-Post Test Evaluation
      7.3.1 Information Sources
      7.3.2 The Student Population
      7.3.3 The Topics Considered
      7.3.4 Results of the Pre-Post Evaluation
  7.4 Summary

8 Summary of Contributions and Future Directions
  8.1 Research Contributions
      8.1.1 Design of PAT's Architecture
      8.1.2 Design New Models
      8.1.3 Design PAT's Feedback Mechanism
      8.1.4 Evaluation of PAT
  8.2 Research Limitations
  8.3 Future Directions

A Research Methodology and Process

B Initial Focus Group

C How to Install PAT

D How to Use PAT

List of Figures

1.1 Theoretical foundations for Intelligent Tutoring Systems.
1.2 The general architecture for an ITS.
1.3 The life cycle stages of RAD.

2.1 The general architecture for an ITS.
2.2 Types of student models.
2.3 Example of an ontology.
2.4 The architecture of DB-suite tutors.
2.5 SQL-TUTOR's interface.
2.6 Kermit's user interface.
2.7 Normit's user interface.
2.8 Acharya's architecture.
2.9 Acharya's user interface.
2.10 The architecture of BITS.
2.11 The user interface of BITS.
2.12 An example of an exercise from ELP.
2.13 An example of customised compilation message from ELP.
2.14 The user interface of JITS.
2.15 Inferring students' behaviour in Wayang Outpost.
2.16 The logical view of TILE's student model.
2.17 Attributes' specifications of TILE's student model.
2.18 TAO's user interface.
2.19 The simulation interface for Cardiac Tutor.
2.20 The use of the CAL system.
2.21 The interface of AlgeBrain tutor.
2.22 The interface of the Advanced Tutor.

3.1 The configuration of components in the knowledge base.
3.2 Microsoft Office Access 2007 - graphical interface.
3.3 Microsoft Office Access 2007 - example of a form.
3.4 Microsoft Office Access 2007 - example of a report.
3.5 PAT's architecture and main components.
3.6 Part of the semantic network of PAT.

4.1 Access objects.
4.2 Controls that can be created inside of forms or reports.
4.3 Example of properties for a TextBox control instance.
4.4 Domain model conceptual schema.
4.5 Semantic network for the domain model.
4.6 Access vs OpenOffice Base - controls.
4.7 Access vs OpenOffice Base - properties for objects.
4.8 Access vs OpenOffice Base - properties for a form.
4.9 Exercise model conceptual schema.
4.10 Example of a form, in an exercise.
4.11 Example of a solution graph.
4.12 Student model conceptual schema.
4.13 Solution graph and the correctness of a student solution.
4.14 An example of a diagram associated to an action.
4.15 Tutoring model conceptual schema.

5.1 Types of Errors.
5.2 Generating the hint text for an advice type.

6.1 PAT's menu in Access.
6.2 First Time window.
6.3 Questions window.
6.4 The Welcome window.
6.5 My Profile window.
6.6 The feedback site for PAT.
6.7 Tooltips for a window.
6.8 This Exercise window.
6.9 An example of the Traffic Lights window.
6.10 Advice window.
6.11 Screenshot for the required form.
6.12 All the requirements for an exercise.
6.13 More readings.
6.14 Another example of a diagram associated to an action.
6.15 Example of using PAT.
6.16 Welcome Back window.
6.17 Tables in the knowledge base.
6.18 Student-system interactions.

7.1 Choosing an evaluation method for an ITS.
7.2 The questions given to the students (first page).
7.3 The questions given to the students (second page).
7.4 Students' background.
7.5 Students' answers for Question 4.
7.6 Students' answers for Questions 5 and 8.
7.7 Students' answers for Question 6.
7.8 Students' answers for Question 11.
7.9 Students' answers for Questions 10 and 11.
7.10 The questions given to staff members.
7.11 Teaching staff answers for Question 4.
7.12 Students and teaching staff answers for Questions 11 and 7.
7.13 Topics known at pre and post tests.
7.14 Students confident using Access vs students new to Access.
7.15 Distribution of students by topics learned.

A.1 A multi-methodological approach to IS Research.
A.2 A process for Systems Development Research.

B.1 The questions given to students - focus group.

C.1 First install window.
C.2 Second install window.
C.3 Third install window.
C.4 Last install window.

D.1 PAT's menu in Access.
D.2 First Time window.
D.3 Questions window.
D.4 Welcome window.
D.5 Welcome Back window.
D.6 How to choose an exercise.
D.7 How to choose an exercise.
D.8 How to choose an exercise.
D.9 This exercise window.
D.10 Check your solution.
D.11 Traffic Lights window.
D.12 Check what is wrong with your solution.
D.13 Display Advice window.
D.14 Ask for more help.
D.15 Have fun!
D.16 Select the assignment.
D.17 Check the solution for the assignment.
D.18 Give feedback.

List of Tables

1.1 ITS functions based on the student model.

3.1 Students' answers mapped to the KVL tutoring framework.
3.2 Functionalities implemented in PAT.

4.1 The conceptual schema design procedure (CSDP).
4.2 Example of tasks/subtasks for an exercise.
4.3 Object-properties as steps associated to a subtask within a task.
4.4 Examples of goals and their description.
4.5 Examples of actions and their description.

5.1 Examples of hint texts for an error of commission.
5.2 Example of the CountKnowsForTopics table.

6.1 Services that PAT provides.

7.1 Evaluation methods.
7.2 Evaluation objectives and related questions for students' survey.
7.3 Students' answers for Questions 1 and 2.
7.4 Students' answers for Question 4.
7.5 Students' answers for Questions 5 and 8.
7.6 Students' answers for Questions 6 and 7.
7.7 Students' answers for Questions 9 and 10.
7.8 Students' answers for Question 11.
7.9 Evaluation objectives and related questions for staff survey.
7.10 Teaching staff answers for Question 2.
7.11 Teaching staff answers for Question 3.
7.12 Teaching staff answers for Questions 4 and 5.
7.13 Teaching staff answers for Question 6.
7.14 Teaching staff answers for Question 7.
7.15 Topics analysed in the pre-post test.

B.1 Interview Time-lines

List of Abbreviations

A list of abbreviations used in this thesis is as follows:

AI Artificial Intelligence

CAA Computer Assisted Assessment

CAI Computer Assisted Instruction

GUI Graphical User Interface

ITS Intelligent Tutoring System

MIE Most Important Error

PAT Personal Access Tutor

QBE Query By Example

QUT Queensland University of Technology

RAD Rapid Application Development

RDBMS Relational Database Management System

SDLC Systems Development Life Cycle

SQL Structured Query Language

SM Student Model

VBA Visual Basic for Applications

Statement of Original Authorship

The work contained in this thesis has not been previously submitted to meet the requirements for an award at this or any other higher education institution. To the best of my knowledge and belief, the thesis contains no material previously published or written by another person except where due reference is made.

Signature:

Date:

Acknowledgments

To everybody I have learned from during my life, starting with my mother Elena.

Thank you to my supervisor, Jim Reye. Words simply cannot say enough how much I appreciate all your help throughout this journey.

Thank you to my wife Geanina and daughter Georgia, for their support, patience, and understanding.

Chapter 1

Introduction

One hallmark of the field of AI in education is using intelligence to reason about teaching. Representing what, when, and how to teach requires grounding from within several academic disciplines, including computer science, psychology, and education.

- Beverly Park Woolf - 2008

The chapter presents possible approaches to using Artificial Intelligence for teaching. It presents the general architecture of Intelligent Tutoring Systems, and their components and behaviour. It briefly describes the Rapid Application Development methodology and some of the available tools. It concludes with an overview of the thesis.


1.1 Artificial Intelligence in Education

Since researchers like Alan Turing, John McCarthy and others (early 1950s) suggested that computers can "think", Artificial Intelligence has kept alive the idea that computers can take on human tasks such as tutoring. What exactly can be automated, while still being beneficial for the learners, is still an open research question (Tedre 2006). However, the use of computers can assist in reducing several teaching challenges, in particular:

◦ 24/7 availability of the tutor;
◦ letting the student learn at their own pace; and
◦ individual tuition.

Furthermore, researchers in the field showed that learning outcomes can be improved:

◦ in one-on-one tutoring (Bloom 1984, Regian 1997);
◦ in a problem-solving context (Anderson et al. 1984); and
◦ when receiving immediate feedback (Anderson et al. 1984, Brusilovsky 1994b).

Over time, besides the continuing evolution of technology (smaller, more powerful and more accessible computers), the ways that computers can be used for teaching and learning evolved as well. At the beginning, computer systems were used mainly as a way of presenting exercises and confirming to the student whether their answer was correct.

Computer Assisted Instruction (CAI) systems appeared in the 1960s as programs that could generate sets of problems to enhance students' performance in skill-based domains (Hannafin & Peck 1988). A limitation of CAI systems was that they only recorded the student's actions, instead of trying to model the student's knowledge. Later on, researchers tried to alter the new materials presented to students based on their previous responses. The next step was made in 1982, when Sleeman & Brown (1982) introduced the term "Intelligent Tutoring System" to distinguish these more advanced systems from CAI.

Intelligent Tutoring Systems (ITSs) are computer systems designed to help students in their learning by providing individualised help in a problem-solving context. A key difference between Intelligent Tutoring Systems and Computer Assisted Instruction systems lies in the ability of ITSs to individualise the feedback. This means that for the same error, different students can receive different feedback. To be able to give individualised feedback, an ITS should record students' knowledge about the domain to be learned, students' preferred learning style, and other personal characteristics.

1.2 Intelligent Tutoring Systems

Intelligent Tutoring Systems are educational software systems able to provide individualised help to the student. In order to individualise the help, the system must first gather information about the student. The type of information gathered is: what the student knows or doesn't know about the domain to be taught; the student's preferred learning style; and other personal characteristics of the student.

The base disciplines for ITS research (Computer Science, Psychology and Education) are presented in Figure 1.1, adapted from Woolf (2008).

[Figure: three overlapping discipline circles - Computer Science, Psychology and Education - with ITS at their intersection; the overlaps carry labels such as Human-Computer Interfaces, User Modelling, Interactive Learning Education, Educational Psychology and Theories of Learning.]

Figure 1.1: Theoretical foundations for Intelligent Tutoring Systems.

Psychology as a field includes ideas about how people learn and think; education offers ways of structuring and presenting the material to be taught, and defines the role of the learner. To create an ITS, we use Artificial Intelligence (a sub-field of Computer Science) techniques together with knowledge obtained from psychology and education.

1.2.1 The Architecture of an ITS

Figure 1.2 (adapted from Burns & Parlett (1991)) presents the main components of an Intelligent Tutoring System.

[Figure: within the Intelligent Tutoring System, the Student Model, Domain Model, Instructional Expert and Simulation components sit behind a User Interface, through which the Student interacts with the system.]

Figure 1.2: The general architecture for an ITS.

The components of an ITS are briefly described next.

1. Domain Model
The domain model represents the knowledge about the domain to be taught - knowledge that an expert in the domain should have. It represents the foundation for the entire knowledge base of the ITS.

2. Student Model
The student model contains information representing the student's knowledge about the domain, and other personal characteristics of the student, such as learning styles, motivation, etc.

3. Instructional Expert
This component of the ITS is based on knowledge about both the domain and the student. It performs instructionally useful tasks: it diagnoses the student's attempted solution, provides feedback, changes pedagogical strategies, etc.

4. Simulation
This component of the ITS is responsible for simulating the real environment that the students are learning about. Not all ITSs have a simulation module. In some cases, such as training to manipulate life-support devices or other devices that are too expensive to practise on, simulating the real device is a good approach. In other cases, for example when students are learning to work with new software, a better option is to use the new software itself.

5. User Interface
The user interface allows the student to communicate with the system. This is a two-way communication: the student asks for new exercises, topics or help, and the system gives feedback back to the student. A minimal sketch of how these components fit together is given below.
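To make the division of responsibilities concrete, here is a minimal sketch in Python. It is an illustration only - not PAT's design or code - and all class, attribute and method names are invented; a real ITS would use far richer knowledge structures than the plain topic sets used here.

    class DomainModel:
        # Expert knowledge about the domain to be taught.
        def __init__(self, topics):
            self.topics = set(topics)

    class StudentModel:
        # The system's beliefs about one student.
        def __init__(self):
            self.known_topics = set()
            self.preferred_learning_style = None

    class InstructionalExpert:
        # Diagnoses the student's attempt and decides on feedback.
        def __init__(self, domain, student):
            self.domain = domain
            self.student = student

        def diagnose(self, attempted_topics):
            # Topics used correctly are recorded in the student model;
            # topics still missing from the attempt become feedback.
            self.student.known_topics |= attempted_topics & self.domain.topics
            return self.domain.topics - attempted_topics

    # The user interface would call diagnose() and display the result.
    expert = InstructionalExpert(DomainModel({"forms", "reports"}),
                                 StudentModel())
    print(expert.diagnose({"forms"}))   # -> {'reports'}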

1.2.2 ITS Behaviour

To offer individualised help, the ITS should take advantage of the information acquired from the student's previous interactions. The information acquired is used to update a model of the student's current state of knowledge (what the student knows or doesn't know about the domain) and other personal characteristics (degree of concentration, type of thinking, motivation, preferred learning style). The component responsible for keeping all this information is the Student Model.

Without a student model, an ITS would be doomed to follow a preset sequence of steps regardless of the impact of its actions on a student's learning (Greer & McCalla 1994, preface, p. V)

According to Wenger (1987), who made a comprehensive review of the field of Intelligent Tutoring Systems, an ITS should be able to:

◦ gather data from and about the student;
◦ use the data to represent the student's knowledge and learning processes; and
◦ use the student model to select the best pedagogical strategy for each student.

In order to create a student model, we first have to decide on the functionalities that the model will support. This must be done both because we need to know exactly what data should be gathered (to provide the information required to support the functionalities), and also what data should not be gathered (to avoid an undesired growth in the complexity of the model). Self (1987) listed six types of possible functionalities that an Intelligent Tutoring System can have, supported by a student model (see Table 1.1).

Function      Purpose
Corrective    correct the errors detected in the student's knowledge
Elaborative   extend the student's knowledge about the domain
Strategic     initiate instructional changes in tutorial strategy
Diagnostic    diagnose the student's knowledge
Predictive    predict the student's response to tutorial actions
Evaluative    assess the student's knowledge

Table 1.1: ITS functions based on the student model.

Corrective function
The corrective function is used when the student's knowledge about a topic is missing or wrong. This can be addressed by one of the following: presenting the error (or errors) to the student; reminding the student of a successful use of the same knowledge; using a counter example; asking the student to try again; or proposing an easier exercise.

Elaborative function
The elaborative function is used when the student's knowledge is correct (no error detected) but incomplete. Several approaches are available: the system could propose the next topic based on the existing curriculum; the student chooses the next topic; or the system will choose the next topic based on a comparison between the student's knowledge and what an expert in the field should know.

Strategic function
This involves changing the plan used by the system to successfully finalise the current exercise, or in some cases, proposing to change (abandon for now) the exercise.

Diagnostic function
When the system is uncertain about the student's knowledge, the diagnostic function is used to eliminate possible confusions. An example of when the diagnostic function is required is when the student model indicates that the student knows topic1 or topic2, but does not indicate which one.

Predictive function
The student model can be used not only to analyse the student's past interactions with the system, but also to predict the student's future performance or learning. In this way, the system can select the action which is predicted to have the most beneficial effect for the student.

Evaluative function
The student model can be used to assess the student's knowledge, compared to the curriculum or to preset targets. A toy illustration of how an ITS might choose among some of these functions is given below.
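The selection among these functions is driven by the state of the student model. The following minimal Python sketch shows one invented decision rule over three of the six functions; it illustrates the idea only and is not taken from Self (1987):

    def choose_function(errors_detected, knowledge_complete):
        # A toy rule: correct detected errors first; elaborate when the
        # knowledge is correct but incomplete; otherwise just assess.
        if errors_detected:
            return "corrective"
        if not knowledge_complete:
            return "elaborative"
        return "evaluative"

    print(choose_function(errors_detected=True,  knowledge_complete=False))  # corrective
    print(choose_function(errors_detected=False, knowledge_complete=False))  # elaborative
    print(choose_function(errors_detected=False, knowledge_complete=True))   # evaluative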

More recently, VanLehn (2006) presented a global approach to ITS behaviour - now called the KVL Tutoring Framework (Lester 2006). Although many ITSs have individual, unique characteristics and functionalities, VanLehn presents this general approach as a generalisation of what "tutoring systems tend to do". Based on the concepts of Task (a multi-minute activity that can be skipped or interchanged with other tasks) and Step (one of the multiple user interface events that together complete a task), an ITS is presented as having two loops: the outer loop and the inner loop. The outer loop is responsible for task selection, similar to the elaborative function identified by Self (1987). The inner loop consists of the steps inside the task: assessment of knowledge (Self's diagnostic function), feedback and hints (Self's corrective function), etc.

VanLehn suggests that the outer loop (task selection) can be achieved by using one of the following methods:

◦ the student selects the task from a menu;
◦ the outer loop assigns tasks from a predefined sequence;
◦ the outer loop implements mastery learning (Bloom 1984); and
◦ the outer loop implements the macroadaptation pedagogy.

The inner loop presents the steps within a task, and its services to the student can be grouped into two main categories: step generation and step analysis. While the step generator is about what the student should do next, the step analyser is responsible for other actions such as giving feedback and answering the question "is it correct?". The five services described in VanLehn's paper are:

◦ Minimal Feedback;
◦ Hints on the Next Step;
◦ Error-specific Feedback;
◦ Assessment of Knowledge; and
◦ Review of the Solution.

A sketch of the two-loop structure is given below; each of the five services is then described in turn.
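The following minimal, self-contained Python sketch illustrates the outer/inner loop structure. The Task class, the task selection rule and the per-step check are all invented for the illustration; they are not VanLehn's specification, nor PAT's code.

    class Task:
        def __init__(self, name, steps):
            self.name = name
            self.steps = list(steps)
            self.done = []

        def complete(self):
            return len(self.done) == len(self.steps)

    def outer_loop(tasks):
        # Task selection: here, simply the first task not yet completed.
        for task in tasks:
            if not task.complete():
                return task
        return None

    def inner_loop(task, known_steps):
        # Step analysis: minimal feedback ("correct" or a generic hint)
        # for each step of the selected task.
        for step in task.steps:
            correct = step in known_steps   # stand-in for real analysis
            print(step + ": " + ("correct" if correct else "hint: try again"))
            task.done.append(step)

    # Usage: one tiny task; the student already knows one of its steps.
    tasks = [Task("build a form", ["add a TextBox", "set the Record Source"])]
    known = {"add a TextBox"}
    task = outer_loop(tasks)
    while task is not None:
        inner_loop(task, known)
        task = outer_loop(tasks)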

Minimal Feedback
Minimal feedback usually indicates whether a particular step from a task is correct or not. Another option can be "correct, but not optimal". Minimal feedback can be provided immediately, delayed, or on demand.

Hints on the Next Step
This service is used for students who get stuck and want to know what to do next. To implement this function in an ITS, three important design decisions should be considered: when to give the hint, what step to address, and what kind of hint to give. Ideally, a tutor should only give a hint when it is needed - e.g. when the student asks for help. However, care needs to be taken with the extreme cases of help abuse (students asking for help even if they don't need it) and help refusal (students refusing to ask for help even if they really need it). To be helpful, a hint should be given for only one step, and:

◦ The step must be part of a correct solution.
◦ The step should be consistent with the student's plan for solving the problem (if several plans are possible).

For simple steps, the suggested approach is "Point, Teach and Bottom-out" (Hume et al. 1996): in the first instance, just indicate where the problem is; if that doesn't work, describe the error and show how to fix it; and lastly, tell the student the correct solution for the step. Other approaches are also possible: a sequence of hints (from general to more specific), or contingent tutoring - starting the hints not from the weakest level, but from a level of help close to the student's current level of knowledge.

Error-specific Feedback
As soon as the system detects an error in the student's work, error-specific feedback can be provided. The feedback can be about what the error is, or about what to do next to correct the error. The best way to give hints in such a situation is to start by indicating the error. If this does not help the student to remediate the error, give hints about the next step.

Assessment of Knowledge
The assessment of the student's knowledge can be used by the ITS, by a human tutor, or by the student themself. The main issue regarding this service is the granularity used for the assessment. If the assessment is used by the ITS, it will help in making decisions about what to hint, what level of hint to use, etc. VanLehn recommends that the granularity of the assessment should be correlated with the kind of decisions that will use it (e.g. for an assessment covering the entire semester we should use a coarser granularity than for an assessment used for generating only one hint for an exercise). Some of the issues related to the assessment are:

◦ The difference between "little evidence and mixed evidence" - only considering the ratio between successes and failures is not enough.
◦ Because the state of the student's current knowledge is changing, the "old" evidence should be ignored at some stage - but when?
◦ Should the help given by the system be considered evidence of lack of knowledge, or of newly learnt knowledge?
◦ Because some learning events are more difficult than others, they should weigh more as evidence of knowledge.
◦ The frequency of occurrences of error types should be recorded.

As an example of the last issue presented above, Arroyo, Murray, Woolf & Beal (2004) suggest that correlations can be found between learning and different events from the student/ITS interaction, such as: hints per problem; time spent per problem; or time spent with the hints. The authors suggest that while asking for too many hints to solve an issue is detrimental to learning, spending more time with the hints is beneficial for learning.
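The "Point, Teach and Bottom-out" sequence discussed above can be realised as a simple hint ladder. In the minimal Python sketch below, the step identifier and all three hint texts are invented examples for an Access exercise; only the escalation mechanism is the point:

    HINT_LEVELS = [
        "Point: look again at the form's Record Source property.",
        "Teach: the Record Source names the table or query that the form"
        " displays; open the property sheet and set it.",
        "Bottom-out: set the Record Source property to 'Customers'.",
    ]

    def next_hint(step_id, hints_given):
        # Return an increasingly specific hint for the same step; the last
        # (bottom-out) hint is repeated if the student keeps asking.
        level = min(hints_given.get(step_id, 0), len(HINT_LEVELS) - 1)
        hints_given[step_id] = level + 1
        return HINT_LEVELS[level]

    hints_given = {}
    for _ in range(4):                  # four help requests on one step
        print(next_hint("set-record-source", hints_given))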

Review of the Solution
In the case where feedback is only given after the entire solution is submitted, several issues might occur, mainly because several errors might be detected (not only one). To maximise the learning, the aspects that should be considered are: which errors should be addressed (or skipped); and in what order the errors should be addressed (or even giving the student the opportunity to choose).

In this section I presented what an ITS can do, based on VanLehn's (2006) approach to ITS behaviour. In the next section I will present what kind of information the system should have, to be able to perform these functions.

1.2.3 ITS Knowledge Base

As mentioned in Section 1.2.1, the information that an ITS holds (the knowledge base) contains three categories of information: information about the domain to be taught; information about the student; and information about tutoring. Each of these categories is represented by a model in the ITS: the domain model, the student model and the tutoring model. In addition to these three models in the knowledge base, an ITS must also have a collection of exercises. All the above mentioned models are linked to the exercises model.

The foundation of the knowledge base is the domain model. The domain model is a representation of the knowledge that an expert in the field should have. On top of the domain model we have the student model. The student model must be able to represent what the student knows about the domain, so it can be seen as an overlay to the domain model. In addition, the student model should also contain other information useful for the tutoring model (the student's learning style, personal characteristics, etc.). The tutoring model contains information about teaching the domain, such as tutoring goals and hints for the student.

Although many domains have already been addressed by ITSs, no previous ITS covers Rapid Application Development (RAD) in a database environment. The next section briefly describes what RAD is, and introduces some of the RAD development tools for database environments.

1.3 Rapid Application Development

Rapid Application Development (RAD) (Martin 1991) is a software development methodology focusing on building applications in a short period of time by rapidly producing prototypes for the end user, offering the opportunity to improve and refine the user requirements during the development stage. The life cycle stages of RAD are: planning, design, development, and cutover. During the process, reiterations of the design and development stages help to improve the initial prototype. Figure 1.3 (Hoffer et al. 1999) presents the RAD life cycle - upper part of the figure - mapped to the life cycle of a traditional Systems Development Life Cycle (SDLC).

[Figure: the RAD life cycle stages (Planning, Design, Development, Cutover) shown above, and mapped to, the traditional SDLC phases: Project Identification and Selection, Project Initiation and Planning, Analysis, Logical Design, Physical Design, Implementation and Maintenance.]

Figure 1.3: The life cycle stages of RAD.

It can be clearly seen from the figure how RAD's life cycle merges some parts of the SDLC phases, to rapidly achieve a prototype and a faster delivery of the system.

Some of the advantages of using RAD are:

◦ reduced time for development;
◦ user friendly interfaces;
◦ lower costs; and
◦ less testing effort.

Specialised tools using the RAD methodology have emerged for database applications, web-based applications, cross-platform applications, etc. Some RAD development tools for database environments are:

◦ Microsoft Access (Access) - a member of the Microsoft Office suite. Access supports RAD by providing tools to easily build forms and reports in a database environment.
◦ OpenOffice.org Base - an open source, database-driven RAD development environment for building client (desktop) based applications.
◦ Oracle Application Express (Oracle APEX) - a rapid web application development environment based on the Oracle DBMS. It allows very fast development when creating web based applications, as a front end for an Oracle database.

Of the examples mentioned above, Access, as part of the Microsoft Office suite, is arguably (finchannel.com 2010) the most widely used RAD tool for desktop database applications. Access supports rapid application development by providing specialised tools for:

◦ database design - the relationships manager;
◦ query design - the Query By Example (QBE) tool;
◦ creating forms and reports - to easily retrieve or update data from the database; and
◦ Visual Basic for Applications (VBA) - to further extend the functionality of forms and reports.

In the first part of the Databases subject (a first-year subject at Queensland University of Technology) the students learn the SQL language. In the second part, the students learn how to use RAD in a commercial RDBMS, in a business-like manner. For the reasons presented in the previous paragraph, the chosen RAD environment is Access.

Over several semesters of teaching Access, the staff from the Databases subject noticed that most of the difficulties that students have in learning RAD using Access are related to the creation of forms and reports. Designing forms and reports, in the traditional SDLC, is part of the Logical Design phase. Because in RAD the Analysis and Logical Design phases are merged into the Design phase, this adds an extra burden to students who are new to both RAD and Access. For this reason, an ITS with a focus on creating forms and reports in Access would be a beneficial addition for the students learning RAD. Such an ITS does not need to address all the tools and features that Access has, i.e. it does not need to provide support tools for database design, writing Queries By Example, or using Visual Basic for Applications.

1.4 Research Goal and Objectives

The main goal of my research is to increase the effectiveness of students learning Rapid Application Development in a database environment. One way of increasing the effectiveness of students' learning is to use an ITS to provide individual tuition; however, there are no ITSs for teaching RAD in such an environment. Developing an ITS is not a simple task: "[intelligent] tutors are difficult to build" (Woolf 2008, p. 394). Although ITS research is an established research field, there are still many research problems and difficulties regarding the design and development of an ITS. Two of these difficulties are listed below.

(a) Different ITS developers have adopted different architectures (Nkambou et al. 2010). An important architectural aspect is whether or not to use a simulation. Even though using the real system instead of a simulation has already been noted as being beneficial for the students (Mitrovic 1998), no ITS with a focus on databases has implemented it.

(b) There is a lack of precise definition of the student models in the existing literature. The student models used in ITSs are usually only broadly described; they are not presented in detail. Examples of such student models will be given in Chapter 2.

In the context of developing an ITS for students learning RAD in a database environment, the research questions which address these difficulties are:

1) Can such an ITS be designed and built to use the real learning environment (embedded within the software development environment) instead of a simulation?

2) When developing software systems, do students prefer to have an ITS embedded within the software development environment, rather than the ITS being a separate piece of software?

3) Can the student model for such an ITS be designed - and documented - in a way that clearly shows its structure, enabling the design’s potential reuse in future ITS research?

The tasks that must be performed to address these questions are briefly described next.

1) Design an appropriate ITS architecture for students learning to create forms and reports in Microsoft Access. The ITS will provide the students with augmented learning by using a real working environment instead of a simulation module.

2) Design new models for the knowledge base that can be used with the architecture designed as part of the first task. The information from the models will help the ITS to provide feedback so that an explanation or hint will be different even if the student is still stuck after the previous one.

3) Design a feedback mechanism that enables different students to receive different (individualised) hints, even for the same error.

4) Build an actual ITS for use in the classroom that will be based on the first three tasks.

5) Evaluate the new ITS and its acceptance by students and academic staff.

I describe each of these tasks in more detail next.

1. Design a Suitable Architecture
An ITS architecture usually contains a simulation module. This module is required to simulate the real environment that the student is learning about. However, using the real environment (if possible) is much better than using a simulated one (see Section 2.4.4).

2. Design New Models
New models (as presented in Section 1.2.1) must be created. They must be specifically created for university students learning about databases. In particular, they should focus on Rapid Application Development by creating forms and reports in Microsoft Access.

3. Design a Feedback Mechanism
An ITS must have a mechanism to provide individualised feedback to the students. Such a mechanism must be able to analyse the student's work and generate the most appropriate feedback for the student. The hints should be available on several levels of specificity: from generic to more specific.

4. Build a new ITS
To prove that the theoretical model designed is indeed able to "increase the effectiveness of students learning Rapid Application Development" (the main goal of my research), I have built a new ITS: PAT - Personal Access Tutor.

5. ITS Evaluation
Throughout this research, two categories of evaluations - as described by Mark & Greer (1993) - were performed:

◦ Formative Evaluations - to provide guidance for a better design and implementation of PAT.
◦ Summative Evaluations - to assess the educational impact of PAT on students' learning.

A. Formative Evaluations
To determine the students' needs and opinions about such a system, a focus group was conducted in 2005 (see Appendix B - Initial Focus Group). In addition, an initial prototype was presented to a number of students in the second semester of 2007. The students were asked to comment on what they were doing and what they were expecting to get back while they were using PAT. A video camera recorded the screen of their computer, capturing the students' activity and comments. Furthermore, a web site containing evaluation forms was created. In PAT's interfaces, there are links to the web site, making this an easy way for students to provide feedback about PAT. A big advantage was that the students could give feedback exactly when they noticed a problem with PAT - while using it.

B. Summative Evaluations
The latest version of PAT was used by students for two semesters (semesters 1 and 2, 2008). Minor changes were made to PAT after the first semester. PAT was evaluated in the second semester of 2008: students and academics involved in teaching the subject were asked to answer two (different) questionnaires, which resulted in both qualitative and quantitative data. In addition to the questionnaires for students and staff, a further evaluation was done in the second semester of 2010. During the student-system interaction, PAT records all the actions done by the student, together with useful information such as: a) the time and date when PAT was used; and b) the most important error in the student's solution. Based on this data, I conducted a pre-post evaluation to examine the number of topics that the students learned while using PAT.

1.5 Research Scope

1.5.1 Work done Prior to this Research

PAT, the ITS produced as part of this research, was based on existing (non-ITS) software (Reye 2004/2008), which could get the student's solution for a form and detect its errors - if any. That original version could display which parts of the exercise are done correctly and which incorrectly in the student's solution, in the form of traffic lights - green for correct and red for incorrect. However, the non-ITS software did not have any of the domain, student or tutoring models (which an ITS must have), and could only contain one exercise at a time. Besides indicating which part of the exercise is wrong, it did not provide any more specific help about what the error was or how to fix it. Because no other features existed in the non-ITS software at that time, only an extremely simple user interface was required: just a button to display the traffic lights. While the traffic lights window was kept, all the ITS components were created as part of this research.

1.5.2 Research Scope

As stated in Section 1.4, PAT was created to help students learning Rapid Application Development in a database environment using Access. As already discussed in Section 1.3, when the students enrolled in the Databases subject at QUT start learning how to develop Access applications using RAD principles, they have already learned the SQL standard in the first part of the subject. Because in the second part the focus is on RAD using Access, the examples and exercises the students work on do not involve highly difficult queries to extract the required data. The difficult part is creating complex Access applications. This is why, at this stage, the students have most problems learning how to create forms and reports. For this reason, and because forms and reports are the main tools that Access has for RAD, PAT's focus is on creating forms and reports.

PAT has a limited capability of analysing queries (SQL) and Visual Basic for Applications (VBA) code, and no capability of analysing the design of tables (database design). However, PAT can still analyse queries of average difficulty, which supports the learning of forms and reports. Extending PAT's capability to address more complex SQL queries, VBA coding, and database design is a very difficult task, outside the scope of this research. ITSs with a focus on SQL and database design (presented later in Section 2.4.1) already exist, although not specifically for Access.

1.6 Outline of Thesis

To achieve the goal of my research (as stated in Section 1.4), I employed a multi-methodological research strategy: the Systems Development Research Methodology (Nunamaker et al. 1991). This methodology combines elements of both the social (behavioural) and the engineering (development) approaches to research. The stages of this research methodology are:

1) Construct a conceptual framework

2) Develop a system architecture

3) Design the system

4) Build the prototype

5) Experiment, observe and evaluate the system

This thesis is organised following these stages. Chapters 1 and 2 (Introduction and Literature Review) describe the conceptual framework. Chapter 3 (System Requirements and Architecture) shows how I developed the system's architecture. Chapters 4 and 5 (Designing New Models and Giving Feedback to the Student) represent the "Design the system" stage. Chapter 6 (Building a New ITS) represents the "Build the prototype" stage, and Chapter 7 (System Evaluation) covers the last stage: experiment, observe and evaluate the system. More details about the Systems Development Research Methodology can be found in Appendix A - Research Methodology and Process. A short description of each of the chapters follows.

Chapter 2 - Literature Review
The aim of this literature review is to provide a presentation of Intelligent Tutoring Systems. It describes the components of an ITS through various examples. It also presents some of the approaches used for student modelling.

Chapter 3 - System Requirements and Architecture
This chapter, based on the research methodology presented in Appendix A, gives an overview of the system requirements and architecture. The chapter presents the results of a focus group aimed at developing the requirements for PAT, as part of analysing the students' needs. It briefly describes MS Access and how MS Access is taught at QUT. It also presents the architectural design and high level functionalities of PAT.

Chapter 4 - Designing New Models
This chapter presents a detailed description of the knowledge base that PAT uses. The first part presents the domain model - the knowledge about Access as a domain for PAT. The second part presents the exercises model. It contains information about how exercises are defined and used, and it also presents information regarding the solutions for exercises. The third part presents the student model, with information about what the student knows, the student's personal characteristics, and the student's history. The fourth part presents how PAT's student model is maintained. The last part presents the tutoring model.

Chapter 5 - Giving Feedback to the Student
This chapter describes how the Instructional Expert component works. It presents the two stages that PAT goes through to provide individualised feedback to the student: the analysis and the synthesis. It describes the hint levels used in PAT, and the pseudo code for the high level functions of the two stages. It also presents the mechanism used to propose a new (the next) exercise to a student.

Chapter 6 - Building a New ITS
This chapter presents the technologies used to build PAT. It presents the interfaces and their use. It also presents the modules' structure and interactions, and the implementation of the databases for the student model.

Chapter 7 - System Evaluation
This chapter presents the summative evaluation of PAT. It starts with an overview of evaluation techniques for ITSs. It presents the results of the summative evaluations of the final version of PAT, by both students and academic staff. The results of a pre-post test evaluation are also discussed.

Chapter 8 - Summary of Contributions and Future Directions
This chapter describes the contributions of this research and ways of further enhancing PAT's usefulness to students.

Appendix A - Research Methodology and Process
This appendix presents the research methodology that I followed for this research.

Appendix B - Initial Focus Group
This appendix presents the preparation and the results of a focus group conducted as a formative evaluation.

Appendix C - How to Install PAT
This appendix contains a document explaining how to install PAT on a new computer.

Appendix D - How to Use PAT
This appendix contains a document explaining how the students should use PAT. The document is delivered during the installation of PAT.

Chapter 2

Literature Review

The question ’What can be automated?’ is one of the most inspiring philosophical and practical questions of contemporary civilization.

George Forsythe - 1969

The aim of this literature review is to describe prior research on Intelligent Tutoring Systems that is relevant to this thesis. In particular, prior work on ITSs for learning about databases and programming is described. The chapter also reviews literature on the knowledge representation formalisms used in ITSs, and concludes by summarising the implications for the research in this thesis.


2.1 Introduction

In this chapter I describe previous work in the field that informed my research. This literature review is presented in four parts, as described below. The first part gives a high level view of the components of an ITS, describing them from a broad perspective. This part follows the components of the general architecture of an ITS, as previously described in Section 1.2.1. For the reader's convenience, the figure describing the ITS architecture is repeated here.

[Figure: within the Intelligent Tutoring System, the Student Model, Domain Model, Instructional Expert and Simulation components sit behind a User Interface, through which the Student interacts with the system.]

Figure 2.1: The general architecture for an ITS.

The second part describes several knowledge representation (KR) formalisms that can be used to represent the knowledge required for the ITS components. This part will help the reader to better understand the third part, which gives more details about the components of specific ITSs. The third part gives a more detailed view of some specific ITSs. Because no previous ITS helps students to learn about Microsoft Access, I focused on other ITSs for learning about databases. These ITSs are concerned with learning database aspects such as the Structured Query Language (SQL) and Database Design.

Moreover, because no existing ITS covers Rapid Application Development (RAD) in a database environment - further, there are no ITSs specifically addressing RAD in any kind of environment - I have focused on previous ITS research concerned with one part of application development: teaching programming. The last part of the chapter presents the implications of the literature review's findings for this research.

2.2 The Components of an ITS

2.2.1 The Domain Model

Every ITS needs a domain model to represent its knowledge of the subject material, i.e. the knowledge that an expert in the domain has. The domain model is the foundation for the entire knowledge base of the ITS, and must contain all the information to be taught (Burns & Parlett 1991, Schulmeister 1997). Lynch et al. (2006) suggest that domain models can be categorised by complexity (simple to complex) and type of structure (well-structured or not). The categories of domains that Lynch et al. (2006) suggested are listed below.

◦ Problem solving domains: simple, well-defined domains; correct answers can be identified.
◦ Analytic and unverifiable domains: no absolute measurement of right or wrong answers.
◦ Design domains: complex, ill-structured domains (e.g. architecture, music composition, etc.).

Different types of domain models can be approached in different ways, based on their structure (Kodaganallur et al. 2005, Mitrovic et al. 2003). Two of the well-known approaches are model-tracing (MT) and constraint-based modelling (CBM). MT represents the domain model as declarative knowledge (factual information) and procedural knowledge (the process to achieve a goal). CBM represents the declarative knowledge as a set of constraints on correct solutions. The MT approach looks at each step the student performs to get to a particular stage. If the step is not correct, corrective actions are taken to help the student do the correct step. In contrast, CBM looks only at the current state of the student's solution, searching for violations of constraints (errors), regardless of the process the student used to get there. A toy contrast between the two approaches is sketched below.
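The following minimal Python sketch contrasts the two approaches on an invented micro-domain: the steps and constraints for building a bound form. Neither function is taken from any of the cited systems; both are illustrations of the idea only.

    # Model tracing: compare each step, in order, against a correct path.
    def model_trace(student_steps, correct_path):
        for attempted, expected in zip(student_steps, correct_path):
            if attempted != expected:
                return "wrong step '%s', expected '%s'" % (attempted, expected)
        return "all steps so far follow a correct solution path"

    # Constraint-based modelling: test only the current solution state;
    # each violated constraint signals incomplete or incorrect knowledge.
    constraints = [
        ("the form must have a record source",
         lambda s: s.get("record_source") is not None),
        ("every text box must be bound to a field",
         lambda s: all(tb.get("field") for tb in s.get("textboxes", []))),
    ]

    def violated(state):
        return [name for name, holds in constraints if not holds(state)]

    # Usage:
    print(model_trace(["create form", "add textbox"],
                      ["create form", "set record source", "add textbox"]))
    state = {"record_source": None, "textboxes": [{"field": "Name"}]}
    print(violated(state))   # -> ['the form must have a record source']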

Both Kodaganallur et al. (2005) and Mitrovic et al. (2003) agree that for some domains one approach can be more suitable than the other. However, they have different opinions regarding the characteristics that influence the decision to choose one approach in favour of the other.

Mitrovic & Weerasinghe (2009) suggest that knowing only the type of the domain is not enough; what is also important is the task type. A second dimension (the instructional task) is added, so that we can have four categories:

◦ Well-defined domains with well-defined tasks;
◦ Well-defined domains with ill-defined tasks;
◦ Ill-defined domains with well-defined tasks; and
◦ Ill-defined domains with ill-defined tasks.

Mitrovic & Weerasinghe (2009) argue that CBM can successfully be used for well-defined domains, with either well- or ill-defined tasks.

Kodaganallur et al. (2005), however, suggest that the most important factors when choosing the best approach for a particular domain type are the information richness of the solution and the complexity of the goal structure of the solution process. They argue that while MT can be used for any domain, CBM requires a high degree of information richness in the solution.

2.2.2 The Student Model

In an ITS, the Student Model (SM) represents the system's beliefs regarding the student's knowledge of the domain (Holt et al. 1994), and additional information about the student, such as personal characteristics and learning style (Beck et al. 1996, Parvez & Blank 2008, Reye 2004).

Student’s Knowledge About the Domain

There are several ways of classifying Student Models (Brusilovsky 1994a, Holt et al. 1994, Self 1987). From the point of view of the nature of the knowledge represented, student models can be classified as overlay student models or perturbation student models (see Figure 2.2, adapted from Holt et al. (1994)).

1. The overlay model is a subset of the expert knowledge, which can identify missing parts of the student's knowledge.

2. The perturbation model is an extension to the overlay model that contains information about misconceptions and errors that the student might have.

[Figure: the overlay student model drawn as the student's knowledge, a subset of the domain knowledge; the perturbation student model extends it with the student's misconceptions about the domain.]

Figure 2.2: Types of student models.

The overlay model (as a subset of the expert's knowledge) contains only information about what the student does or does not know. The ITS will detect differences between the expert's knowledge and the student's knowledge through errors in the student's answers. Conati et al. (2002) describe two types of errors that can occur in a student's attempted solution: errors of omission (missing actions) and errors of commission (wrong actions).

Holt et al. (1994) propose a variation of the overlay model - the differential model. This model distinguishes between what an expert in the field knows and what the student is expected to learn. In this way, the parts of the student model can be classified by importance: not all topics are equally important.
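Both the overlay idea and Conati et al.'s two error types reduce naturally to set operations. In the minimal Python sketch below, all topic and action names are invented:

    # Expert knowledge, and the overlay as a subset of it.
    expert_topics  = {"create form", "bind control", "set record source"}
    student_knows  = {"create form", "bind control"}
    still_to_learn = expert_topics - student_knows      # candidate gaps

    # For one attempted solution: required vs performed actions.
    required  = {"create form", "set record source", "bind control"}
    performed = {"create form", "bind control", "add macro"}

    omissions   = required - performed    # missing actions (omission)
    commissions = performed - required    # wrong actions (commission)
    print(sorted(still_to_learn), sorted(omissions), sorted(commissions))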

The perturbation model can be seen as an extension of the overlay model: it also contains information about misconceptions that the student might have, as deviations from the overlay model. As mentioned in Section 2.4.1, SQL-Tutor (Mitrovic 1998) is an example of an ITS using Constraint-Based Modelling (CBM) (Ohllsson 1992). CBM is an approach that represents declarative knowledge in terms of constraints; violation of a constraint indicates incomplete or incorrect knowledge. Further research regarding misconceptions was done by Webber (2004), who presents the Conception Model, a framework that represents and then interprets the student's errors in terms of knowledge, in the context of a specific learning situation.

Besides representing the student's knowledge about the domain, the student model should also contain information about the student's personal characteristics. The next section covers this.

Student’s Personal Characteristics

Beck et al. (1996) state that besides the student's knowledge about the domain, we should also add "general pedagogical information about the student". While Schulmeister (1997) describes learning behaviour as a hard thing for an ITS to analyse, Woolf et al. (2001) described it as very beneficial to the instructional process. Furthermore, Bourdeau et al. (2004) suggest that including personal characteristics in the student model can lead to the identification of the best instructional strategy for each student.

There are many personal characteristics of the student that can be useful to model. For example, Beck et al. (1996) talk about acquisition (a measure of how fast students learn new topics) and retention (how well they recall materials over time), while Reye (2004) indicates that the student's overall-aptitude, reliability-of-claims and student-is-bored are measures that should be part of the student model. Additionally, Woolf (2006) and Arroyo et al. (2009) identified gender and cognitive differences which, once addressed, improved the learning results. Most ITSs include some personal characteristics in their student model, to provide improved functionality. In the student model I have designed for PAT, I included information about some personal characteristics, similar to those described above.

The Life Cycle of a Student Model

When creating a student model, literature about the student model life cycle is valuable. By the life cycle of a student model, we refer to the development, initialization, updating and use of the student model. From this perspective, we can further categorise student models as:

◦ adaptive or adaptable - who updates the model;
◦ static or dynamic - when (how often) the student model is updated;
◦ grouped or individual - whether the model contains data about one student or a group of students (Rich 1983);
◦ internal or external - where the model is stored;
◦ fine grained or coarse grained - the granularity of the information stored;
◦ short term or long term - for how long the information in the student model is stored (Gluga et al. 2010, Rich 1983);
◦ quantitative or qualitative - simple numerical models or more complex models also containing qualitative data (Perez-de-la Cruz et al. 2007); and
◦ open or closed (Bull & Kay 2007, Kay 1997).

A short description of each of the above aspects follows.

Adaptive or adaptable: An adaptive model is updated from the student's interactions with the system; an adaptable (configurable) model is updated directly by the student.

Static or dynamic: If the student model is only populated with data at first use (initialization) or is updated on a periodic basis, then it is static. If the student model is updated at each student-system interaction, the student model is dynamic.

Grouped or individual: If the student model represents characteristics of groups (stereotypes), it is a group student model; if it represents the characteristics of only one person, it is an individual student model.

Internal or external: A student model can be stored in the system itself (internal) or outside the system (external).

Fine grained or coarse grained: The student model can contain very detailed information or only the most important, high-level data. That is, it can have low or high granularity.

Short term or long term: The student model can be stored for a single session only (short term) or for more than one session, keeping a history of the student's activity over many sessions (long term).

Quantitative or qualitative: Perez-de-la Cruz et al. (2007) define quantitative models as simple representations having performance measures as real numbers, while qualitative models are more complex models which take into account other information such as bug libraries, learning preferences and styles, etc.

Open or closed: A student model can be open or closed to the student: the student can or cannot see or change the information recorded about themselves in the student model.

Considering the above classifications, the student model of PAT is an adaptive, dynamic, individual, fine grained and closed student model.
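In these terms, an adaptive, dynamic, individual model is one where the system itself updates a per-student record at every interaction. The sketch below illustrates just that update hook; the event fields are hypothetical.

```python
# Sketch of an adaptive, dynamic, individual student model: it is updated
# by the system (not the student) at every interaction. Event fields are
# hypothetical.
class DynamicStudentModel:
    def __init__(self, student_id: str):
        self.student_id = student_id   # individual, not grouped
        self.fine_grained_log = []     # one entry per interaction (long term)

    def on_interaction(self, event: dict) -> None:
        """Called by the tutor after each student action (dynamic update)."""
        self.fine_grained_log.append(event)

model = DynamicStudentModel("s123")
model.on_interaction({"topic": "form-design", "action": "submit", "correct": False})
```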

2.2.3 The Instructional Expert

The instructional expert is responsible for the pedagogical approach of the ITS, from diagnosing the student's solution to deciding the type, level and individualisation of the feedback. In the next two sections, I first describe the teaching theories that I found to be the most relevant for my research; I then describe literature specifically related to generating feedback for the student.

Teaching Theories

A number of decisions must be made when designing instructional materials and actions but: "in order to obtain science-based ITS design, these decisions need to be based on theories" (Bourdeau et al. 2004, p. 154). A large body of theory surrounds both Learning Strategies (Jonnassen & Grabowski 1993, Nisbet & Shucksmith 1996) and Teaching Theories (Kearsley n.d.). For the domain of learning about computers, Kearsley (n.d.) lists: the GOMS model (Card et al. 1983); Andragogy (Knowles 1975); and the Minimalist Framework (Carroll 1990). In this section, I review the teaching theories that influenced me the most with regards to the pedagogical approach I adopted for PAT.

The GOMS model

In the book "The Psychology of Human-Computer Interaction", Card et al. (1983) introduced the GOMS model (Goals, Operators, Methods, Selection rules), a model of human performance. In the GOMS model, procedural knowledge is described in terms of student goals, operators (elementary actions), methods (sequences of operators that contribute to achieving a goal) and selection rules (for deciding which method to use). Since that time, several variants of the GOMS model have emerged. John & Kieras (1996) describe some of the variants (CMN-GOMS, KLM, NGOMSL and CPM-GOMS) and provide guidance on how to choose the appropriate one, based on the particular design situation. However, they note several limitations of the GOMS model:

◦ The tasks must be described in terms of procedural knowledge (so they can be analyzed).
◦ The model cannot represent misconceptions (incorrect knowledge elements).
◦ A list of goals, or high-level tasks, must be provided.
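To make the GOMS vocabulary concrete, the sketch below encodes one toy goal with two alternative methods (sequences of operators) and a selection rule; the task content and the rule are invented for illustration.

```python
# Toy encoding of GOMS elements: a goal, operators, methods (sequences of
# operators), and a selection rule choosing between methods. The task and
# the rule are invented for illustration.
GOAL = "delete-a-record"

METHODS = {
    "keyboard": ["select-record", "press-delete", "confirm"],            # operators
    "menu":     ["select-record", "open-edit-menu", "click-delete", "confirm"],
}

def select_method(user_prefers_keyboard: bool) -> str:
    """Selection rule: pick a method based on a property of the user."""
    return "keyboard" if user_prefers_keyboard else "menu"

chosen = select_method(user_prefers_keyboard=True)
print(GOAL, "->", METHODS[chosen])
```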

Andragogy

While pedagogy, as the art and science of teaching, is a teacher-focused approach to education, andragogy was defined as "the art and science of helping adults learn" (Knowles 1980, p. 43) - i.e. a student-centred approach for adults. For this reason, andragogy focuses more on the process of learning and less on the content. Andragogy suggests that when designing new educational material, the following aspects should be considered.

◦ Adults have to know why they are learning something.
◦ Adults learn best while doing things.
◦ Adults learn best in a problem-solving context.
◦ Adults learn best if there is an immediate use of their work.

Knowles (1984) presented an example of applying andragogy principles to learning about using computers:

◦ The tutor must explain to the learner why specific skills are taught (functions, commands).
◦ Learners should perform practical tasks instead of memorizing new materials.
◦ Tasks (exercises) for different topics, and on different levels of difficulty, should be available.
◦ Guidance (feedback, tips) should be provided to allow students to learn from their mistakes.

All of the above aspects and principles have been implemented in PAT.

The Minimalist Framework

Carroll (1990, 1998) presents the Minimalist Framework for the design of instructional materials for computer users. Having its roots in constructivism, the Minimalist Framework looks to build on the learner's past experiences. Its focus is to minimise the extent to which instructional materials can prevent the learner from focusing on the activities and tasks that have to be learned. The ideas presented by the Minimalist Framework are as follows.

◦ The learning task has to be meaningful for the student and should be independent (self-contained).
◦ Students should be able to start realistic activities from the beginning.
◦ A number of activities, on several difficulty levels, should be available for students to choose from.
◦ The activities should be as close as possible to those that occur in real situations, in real systems.

PAT addresses all of these principles. For the last bullet point above, PAT is not just close to a real system; it actually uses the real system. The principles that the Minimalist Framework proposes are:

◦ learners should be able to start immediately and on useful tasks;
◦ the amount of reading (and other forms of passive learning) should be minimised;
◦ error recognition mechanisms should be implemented in the instruction; and
◦ learning activities should be independent of sequence and self-contained.

The main ideas from the Minimalist Theory (Carroll 1990) that are beneficial for learning about computers are:

◦ training on real tasks;
◦ exploitation of prior knowledge of the learner;
◦ use of errors as learning opportunities; and
◦ reading (learning) via more than one path.

A brief description of each of these ideas follows.

Training on Real Tasks

Carroll (1990) states that "the unmotivated learner cannot be helped and the motivated learner cannot be stopped" and that learner motivation "is achieved when the learning task is the learner's task" (Carroll 1990, p. 78). PAT uses Access, the real learning environment, in this way giving the learner both justification and motivation for his or her training.

Exploiting Prior Knowledge

Understanding and taking into consideration the students' prior knowledge represents the most important aspect of instructional design. Individualisation of feedback, individualisation of hint text, proposing the next exercise or topic, or even proposing that the student do an easier exercise or further study a topic not yet mastered, are all based on the student's prior knowledge. The hint should contain only known (well understood) concepts, and the proposed exercise should contain some topics not previously approached or not yet mastered by the student. PAT uses the student's prior knowledge to propose a new (the next) exercise.

Errors as Learning Opportunities

The main reason for using errors as learning opportunities is clearly stated by Carroll: "much of what learners do is 'error'" and it is "unrealistic to imagine that the learner error can be eliminated" (Carroll 1990, p. 86). Of course, if we want to use errors as learning opportunities, we must define what an error is and analyse the possible types of errors for the particular domain we are modelling.

Especially in a problem-solving context, the hints that the student receives must address their errors. Many times, the students just don't know that something is wrong, and simply indicating that an error exists can be enough to help the student to fix it (Risco 2005).

Based on this idea, PAT allows the students to simply ask whether their solution is correct or not. When more help is needed, PAT will provide feedback about the error that should be addressed first.

Reading Via More Than One Path

As already mentioned, learning is better when the student is interested in the task. As a consequence, the student must have the facility to select the topic he or she is interested in, or an exercise involving the tasks in which the student wants to improve his or her skills or knowledge.

The possibility of reading (or doing exercises) not just in sequential order, but following different paths, addresses several issues, as presented below. This idea was also implemented in PAT.

◦ Users will not get bored if they can easily skip exercises.
◦ They can work on what they are interested in from the beginning.
◦ They can review an easier exercise or even redo any exercise they like.

While the GOMS model provides some good ideas, Andragogy's principles (especially the example of principles applied to learning about using computers) are more useful for my research. However, the tutoring actions that I have designed for PAT were based on the Minimalist Framework.

Feedback to the Student

One of the options when designing an ITS is to provide immediate feedback on student errors - "when to teach" (Anderson et al. 1984). Corbett & Anderson (2001) refine the idea of error feedback and correction, and present risks and problems associated with immediate feedback, such as possible interference with overall task performance and learning. Four types of feedback are described:

◦ immediate feedback and correction by the system;
◦ error flagging and student control of correction;
◦ feedback on demand and student control of correction; and
◦ feedback only at the end of the student's work.

Immediate feedback involves automatically correcting the student’s error.

Error flagging is different from the previous type because the student is warned about the error, but can continue working and fix the error later. Many times, simply indicating an error can be enough to help the student to fix it (Risco 2005). However, Kumar (2011) shows that although students can have better results when receiving this type of feedback, this is not necessarily a consequence of better learning. Some students just keep revising their solution and eventually get the correct answer: the students can "reach the correct answer through trial and error even though the problems are not of multiple-choice nature." (Kumar 2011, p. 147).

Feedback on demand is when a student can ask for help whenever he or she is stuck. While many ITSs use immediate feedback and error flagging, feedback on demand is also a widely used approach.

Feedback only at the end involves not showing errors to the student during the problem solving process, but only at the end. This can be done by comparing the student's attempted solution with the correct solution(s).

Corbett & Anderson (2001) show the results obtained from an experiment in which they measured the amount of time that students (grouped by the type of feedback received) needed to complete five lessons. The order (in increasing order of time spent) was: immediate feedback, error flagging, feedback on demand, and no tutoring. However, one might argue that just reducing the time spent on a lesson is not enough; the learning outcomes should be taken into consideration as well (e.g. students could take more time to finish the lessons but they will not make the same errors again). More recently, Razzaq & Heffernan (2010) presented a pre-/post-test evaluation in which they compared the results of students receiving different types of hints. The experiment shows that students working on problems did significantly better with hints-on-demand compared to receiving a hint when they made an error. In addition, Singh et al. (2011) show the results of an experiment with eighth grade students comparing immediate computer-supported feedback with next-day feedback. The experiment found that the students gained significantly more with the computer-supported immediate feedback. The feedback mechanism that I designed for PAT offers the student feedback on demand, on several levels of specificity - from general to more specific.
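The general pattern of feedback on demand with increasing specificity can be sketched as follows; the message texts and escalation policy are illustrative placeholders, not PAT's actual feedback.

```python
# Sketch of feedback on demand with escalating specificity: each repeated
# request for help on the same error moves to a more specific level.
# Messages are illustrative placeholders.
FEEDBACK_LEVELS = [
    "Your solution contains an error.",                           # minimal
    "There is a problem with the form's record source.",          # error-specific
    "Set the form's RecordSource property to the Orders table.",  # how to fix
]

class FeedbackOnDemand:
    def __init__(self):
        self.requests_for_error = {}  # error id -> number of help requests

    def help(self, error_id: str) -> str:
        level = self.requests_for_error.get(error_id, 0)
        self.requests_for_error[error_id] = level + 1
        return FEEDBACK_LEVELS[min(level, len(FEEDBACK_LEVELS) - 1)]

fb = FeedbackOnDemand()
print(fb.help("e1"))  # first request: general feedback
print(fb.help("e1"))  # second request: more specific feedback
```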

2.2.4 The Simulation Component

Shannon (1998) defines simulation as "the process of designing a model of a real system and conducting experiments with this model for the purpose of understanding the behavior of the system and/or evaluating various strategies for the operation of the system". A simulation should have the ability to mimic the dynamics of a real system (Ingalls 2008). When used for training, the purpose of using the model of the real system is to provide a learning environment for the students. Bell et al. (2008) state that the learning features of simulation-based training are: content, immersion, and communication. Each of these features is briefly described next, showing their instructional benefits. Content consists of one or more of text, images, videos, music, voice, etc. By taking advantage of rich content, simulation-based training improves the learner's ability to understand the material being taught. Immersion focuses on features that can create a sense of realism. High levels of immersion have the potential to enhance the learner's feeling of presence, given the perception that one is actually using the real system. Bell et al. (2008) argue that this is the greatest benefit of a simulation. Interactivity and communication provide collaboration potential and refer to characteristics that influence the type of interactions between users of the system, and between trainers and trainees. The benefit of communication is that it occurs in real time between trainees, or between trainees and the system. Kincaid & Westerlund (2009) argue for the importance of modelling and simulation. They argue that simulations:
◦ can be used by students of all ages and levels;
◦ allow for technical skills to be taught in an integrated manner;
◦ provide realistic training and skills; and
◦ reduce risks to humans.
While giving credit to the aviation industry for being the first to largely use simulation for training, Kincaid & Westerlund (2009) also describe other fields that nowadays benefit from using simulation for training, such as the medical field and military training.

2.3 Knowledge Representation Formalisms

From an AI perspective, there are several Knowledge Representation (KR) approaches that can be used to store the knowledge of an ITS. In this section, I describe some of the KR approaches that I investigated as possible ways of holding the knowledge in my ITS.

◦ semantic nets;
◦ symbolic rules;
◦ fuzzy logic;
◦ belief networks;
◦ neural networks; and
◦ case-based representations.

Declarative knowledge can be an ontological representation of the domain (Gruber 1995). Such a representation can be used to structure and support student models (Dufresne et al. 2008) and also helps one to choose the appropriate teaching theory to be used (Bourdeau et al. 2004). Figure 2.3, adapted from Russell & Norvig (2010), shows an example of an ontology.

Figure 2.3: Example of an ontology.

A semantic network (Sowa 2006b) is a declarative graphical notation used to represent knowledge via interconnected nodes and arcs. It is used to support reasoning about knowledge. Sowa (1984) presents several types of semantic networks: definitional networks, assertional networks, implicational networks, executable networks, learning networks, and hybrid networks. Implicational networks use implication for connecting the nodes. These networks can be used for representing causality or beliefs, or to make inferences.

Symbolic rules represent knowledge in the form of if-then rules. Symbolic rules are easy to maintain but cannot deal with uncertainty and cannot reach solutions for unknown inputs. They are also not suitable for representing structural and relational knowledge (Hatzilygeroudis & Prentzas 2004).

Fuzzy logic (Zadeh 1996) is a form of KR formalism which reasons using approximate rather than exact truth values. While traditional logic theory uses 0 and 1 as the two values for true and false, fuzzy logic variables take values in the range between 0 and 1. In this way, fuzzy logic can make inferences with degrees of uncertainty.

Bayesian networks (Pearl 2009) are directed acyclic graphs where the nodes represent random variables, which could be observable quantities, latent variables, unknown parameters or hypotheses. Each node has an associated probability function for the variable represented by the node.

Case-based representations (Bergmann et al. 2006) are collections of cases together with their solutions. Case-based reasoning consists of finding a case similar to the one that must be addressed. In contrast with symbolic rules, even when some inputs are not known, a solution can still be reached if a similar case is found. While maintaining the case collection is easy, computational issues can appear for large numbers of cases.

Neural networks are computational models inspired by the structure and functions of biological neural networks. A neural network consists of a group of interconnected nodes similar to the human brain's neurones. Usually, neural networks are adaptive systems that change their parameters during a learning stage. Neural networks are efficient in inferring solutions but are hard to maintain: any change in the structure will affect the whole network.

In addition to these knowledge representation formalisms, there are hybrid approaches (Hatzilygeroudis & Prentzas 2004) which are integrations of two or more single knowledge representation formalisms, such as integrations of rule-based with case-based reasoning (Branting 1999, Prentzas & Hatzilygeroudis 2002) or neuro-fuzzy representations (Abraham 2005).

To represent PAT's domain model, I chose a semantic network because it is suitable for the domain to be taught (Lynch et al. 2006) and can satisfy both user and system requirements (Hatzilygeroudis & Prentzas 2004).
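A semantic network of the kind chosen for PAT's domain model can be held as labelled nodes connected by typed arcs, with traversal supporting simple reasoning. The sketch below shows only this generic structure; the node and relation names are invented and do not come from PAT's actual domain model.

```python
# Minimal semantic network: nodes connected by typed arcs, with a simple
# traversal to support reasoning over relations. Node and relation names
# are invented for illustration.
from collections import defaultdict

class SemanticNet:
    def __init__(self):
        self.arcs = defaultdict(list)  # node -> [(relation, node), ...]

    def add(self, source: str, relation: str, target: str) -> None:
        self.arcs[source].append((relation, target))

    def related(self, node: str, relation: str):
        return [t for r, t in self.arcs[node] if r == relation]

net = SemanticNet()
net.add("subform", "is-a", "form")
net.add("form", "has-property", "record-source")
net.add("report", "has-property", "record-source")

print(net.related("subform", "is-a"))        # ['form']
print(net.related("form", "has-property"))   # ['record-source']
```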

2.4 Detailed Description of Specific Systems

This part of the literature review contains a more detailed view of the components of an ITS through relevant examples. Because there are no existing ITSs specifically addressing RAD or Access, in the first two subsections I focus on ITSs that address databases (Section 2.4.1) and programming (Section 2.4.2) as the domains to be taught. In the third subsection (Section 2.4.3), I focus on ITSs that have a detailed description of their student model in the literature. The focus in the last subsection (Section 2.4.4) is on ITSs that do - or do not - use a simulation.

2.4.1 ITSs for Learning About Databases

DB-suite

DB-suite (Mitrovic et al. 2004, Mitrovic & team 2008) consists of three web-based ITSs in the area of databases:
◦ SQL-Tutor: teaches the SQL query language;
◦ KERMIT: teaches conceptual database modelling using the ER model; and
◦ NORMIT: a data normalisation tutor.
These tutors are constraint-based tutors (Mitrovic & Ohlsson 1999, Mitrovic & Weerasinghe 2009, Ohlsson 1992). In constraint-based tutors, the system analyses the student's solution by checking if any constraints from the domain model are violated. The constraints cover both correctness and completeness. If a solution does not violate any constraints, then the solution is considered correct and complete. The architecture of the DB-suite tutors is shown in Figure 2.4 (reproduced from Mitrovic et al. (2004)).

Figure 2.4: The architecture of DB-suite tutors.

This architecture contains the following models: the student model; the domain model (as a set of constraints); and a collection of predefined problems. The tutors provide feedback on different levels, from general to more specific.
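To make the constraint-checking idea concrete, a constraint can be represented as a pair of predicates: a relevance condition (when the constraint applies) and a satisfaction condition (what must then hold). The sketch below illustrates this generic pattern; it is not the constraint language of SQL-Tutor or the other DB-suite tutors, and the example constraint is invented.

```python
# Generic sketch of Constraint-Based Modelling: each constraint has a
# relevance condition and a satisfaction condition over a solution.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Constraint:
    name: str
    relevant: Callable[[dict], bool]   # does the constraint apply?
    satisfied: Callable[[dict], bool]  # if it applies, is it met?

# Illustrative constraint over an SQL-like solution dictionary.
constraints = [
    Constraint(
        name="GROUP BY requires an aggregate or grouped column in SELECT",
        relevant=lambda sol: bool(sol.get("group_by")),
        satisfied=lambda sol: bool(sol.get("aggregates")
                                   or set(sol.get("select", []))
                                   <= set(sol.get("group_by", []))),
    ),
]

def violated(solution: dict):
    """Violated constraints signal incomplete or incorrect knowledge."""
    return [c.name for c in constraints
            if c.relevant(solution) and not c.satisfied(solution)]

student_solution = {"select": ["name", "salary"], "group_by": ["name"], "aggregates": []}
print(violated(student_solution))
```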

SQL-Tutor (DB-suite)

SQL-Tutor (Mitrovic 1998, Mitrovic & team 2008) is a constraint-based ITS for students learning SQL. When using the tutor, the students have to complete SQL statements satisfying given requirements.

The system contains definitions of several databases and a set of problems, together with their ideal solutions. The domain model of SQL-Tutor contains more than 700 constraints. The knowledge base I designed for PAT also contains a collection of problems with their solutions.

Figure 2.5 shows the user interface of SQL-Tutor. The main part of the interface contains textboxes where the student can fill in the required parts of an SQL statement. It also contains a drop-down list from which the student can select the feedback level, and a submit button.

The top part of the interface offers the student options such as changing the database and selecting another problem. It also contains the "Run Query" button, which runs the query in the back-end MS Access database. The result of the query is returned to the student.

Figure 2.5: SQL-TUTOR’s interface.

To minimise the student’s memory load, the text of the problem and the schema of the database that the exercise uses are shown in the same interface. SQL-Tutor allows the students to select the exercise to work on or it can propose an exercise based on the student model. SQL-Tutor maintains the history of each constraint (Mitrovic & Ohllsson 1999), recording how often the constraint was: ◦ relevant for the ideal solution; ◦ relevant for the student’s solution; and ◦ satisfied or violated. 50 2. Literature Review

SQL-TUTOR’s student model (Mitrovic 2003) contains:

◦ general information about the student, such as their level of expertise, history, etc.; and
◦ a model of the student's knowledge as an overlay upon the constraint base.

Kermit (DB-suite)

Kermit (Suraweera & Mitrovic 2002, 2004) is an ITS for teaching Database Design using the Entity-Relationship (ER) model. The students have to create an ER diagram based on the requirements given by the system. Kermit provides feedback to the students by request only. The students can ask for a hint or can ask for the solution to be evaluated.

Figure 2.6: Kermit’s user interface.

Kermit contains six levels of hints, as presented below.

◦ The first level simply indicates whether the solution is correct or not.
◦ The error flag level indicates the types of constructs with errors.
◦ The hint level gives a hint from the first constraint violated.
◦ The detailed hint level gives a more detailed hint.
◦ The all errors level presents hints on all violated constraints.
◦ The complete solution level presents the full solution.

The feedback level is automatically increased each time the student asks for help, up to the hint level. Kermit contains over 200 constraints, both syntactic and semantic. The feedback from PAT follows this idea of increasing the feedback level if the student asks again for help with the same error. What PAT does differently is that it might start the feedback from a higher (not the lowest) level - if the information from the student model suggests that the student needs more specific feedback.

NORMIT (DB-suite)

Normit (Mitrovic 2002) is an ITS for students learning Database Normalisation, a part of Database Design. Database Normalisation is concerned with data optimisation, to minimise redundancy. Because Database Normalisation is a procedural task, the students have to follow a strict sequence of steps to solve the problem, and the system does not need to store a correct solution. The domain model of NORMIT contains more than 80 constraints (both syntactic and semantic) to check the student's solution. Normit provides feedback on different levels, in the same way as the other tutors from the DB-suite. Normit's user interface (Mitrovic 2005) is shown in Figure 2.7.

Figure 2.7: Normit’s user interface.

Acharya

Acharya (Bhagat et al. 2002) is a web-based ITS for learning SQL. Acharya only analyses SQL for database querying, not updating. This ITS uses Java servlet technology for its web-based front-end and uses PostgreSQL as its back-end. As shown in Figure 2.8, Acharya contains a student module and a pedagogical module.

Figure 2.8: Acharya’s architecture.

This architecture diagram shows that Acharya has two separate databases, one for the student model, and the other for the rest of the models - including the problems and their solutions. The student model contains general information about the student, a history of concepts learned with a confidence factor (the system's belief that the student acquired the concept), knowledge level and the number of hints received. To analyse the student's answer, Acharya compares the student's solution with a pre-set expert solution; this is done individually for each of the clauses of the SQL query. The comparison is done in three stages. The first stage, the pre-processing stage, eliminates complex constructs such as BETWEEN, EXISTS, etc., and replaces them with equivalent conditions combined using either AND or OR. In the next stage, atom processing, each individual condition is compared against the expert solution. At this stage, mistakes such as wrong or negated conditions can be found. The last stage is the truth table processing. At this stage, Acharya checks if the combined expressions are logically equivalent to the expert solution. In PAT, because the hints can be of two types (hints about what is wrong and hints about how to fix it), the student's history also contains the hint type. Acharya, in the same way as SQL-Tutor, uses a real RDBMS to run the students' solutions, and the result of the query is returned to the student - if the student's solution is correct. As shown in Figure 2.9, the interface of this ITS is divided into three parts:

◦ the upper part displays the schema of the database and the problem requirements;
◦ the middle part contains text boxes for each part of the SQL SELECT statement, in which the student can write his or her solution; and
◦ the lower part displays the diagnosis result and the results from the query - if the query successfully ran.

If the student’s solution has errors, the most basic ones are addressed by giving the student a list of relevant diagnostics about the clauses with errors, including 54 2. Literature Review

Figure 2.9: Acharya’s user interface.

links to the corresponding course material. If the student's solution is correct, the result of the SQL query is displayed. Showing the results of the query is very beneficial for the students, as they can see exactly the result of their work.
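The truth table processing stage described above can be illustrated by exhaustively evaluating two boolean condition expressions over all assignments of their atoms. The sketch below is a generic equivalence check of this kind, not Acharya's implementation.

```python
# Generic truth-table equivalence check for boolean condition expressions,
# illustrating the kind of test used in Acharya's third comparison stage.
# Expressions are Python callables over named atoms; this is not Acharya's
# actual code.
from itertools import product

def equivalent(expr_a, expr_b, atoms):
    """True if the two expressions agree on every truth assignment."""
    for values in product([False, True], repeat=len(atoms)):
        env = dict(zip(atoms, values))
        if expr_a(env) != expr_b(env):
            return False
    return True

# e.g. 'p AND q' in the student's query versus the commuted 'q AND p':
student = lambda e: e["p"] and e["q"]
expert  = lambda e: e["q"] and e["p"]
print(equivalent(student, expert, ["p", "q"]))  # True
```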

Acharya can propose problems to the student based on pre-requisite relations. PAT can also propose a new exercise to the student based on pre-requisite relationships and the current topics known by the student. In addition, PAT lets the student specify additional criteria such as "only exercises I did not start yet".

SQL-LTM

SQL Lightweight Tutoring Module (SQL-LTM) (Dollinger 2010) is a system that can provide semantic feedback on SQL statements, pointing out their logic flaws even if they are syntactically correct. SQL-LTM can detect most conceptual errors that SQL learners can make (Dollinger 2010). SQL-LTM consists of two modules: a query parser, which converts the SQL query into an XML representation, and an analyser, which compares the student's query against the reference query created by the instructor. One of the difficult issues in analysing SQL queries is the possibility that students can provide solutions that, even though they are different from the reference solution, are still semantically correct. The analyser recognises the equivalence of such queries and provides recommendations on how to get to the expected solution. Similar to Acharya, SQL-LTM uses a real RDBMS to run the students' queries. However, SQL-LTM does not have a student model, nor does it keep track of the student's history; hence the system cannot individualise its feedback. For the same error, different students receive the same advice. In this section I described ITSs that focus on learning about databases. The next section describes ITSs for learning about programming.

2.4.2 ITSs for Learning About Programming

BITS

Bayesian Intelligent Tutoring System (BITS) (Butz et al. 2006, 2008) is a Web-based ITS for learning computer programming, covering elementary topics in C++. BITS can be used as a stand-alone application or as a Web-based application. BITS's architecture is shown in Figure 2.10.

Figure 2.10: The architecture of BITS.

BITS uses a Bayesian Network to represent the prerequisite relationships among the concepts. Its knowledge base contains class lecture notes (as web pages) and a repository of sample tests and their solutions. The user interface of BITS is shown in Figure 2.11. BITS uses a traffic lights system (Butz et al. 2008) to indicate how well the student knows a concept, or the student's readiness to learn the concept: yellow for known concepts, green for ready-to-learn concepts, and red for not-ready-to-learn concepts.

Figure 2.11: The user interface of BITS.

While the idea of using a traffic lights system is very useful for the students, the meaning of the colours could be confusing. The association of the colours with traffic lights is generally interpreted as: green is good, red is bad, yellow is somewhere in between. In this way, applied to someone's knowledge, one could read the colours as: green (good) is already done; red (bad) is not yet done; and yellow is almost done. This is why in PAT, the traffic lights system describes the tasks in an exercise as: green for correct, red for incorrect, and yellow for partly correct.
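PAT's reinterpretation of the colours amounts to a one-line mapping from task state to colour, as sketched below (the state names are illustrative).

```python
# PAT-style traffic-light mapping for tasks in an exercise; the state
# names are illustrative.
def task_colour(state: str) -> str:
    return {"correct": "green",
            "partly-correct": "yellow",
            "incorrect": "red"}[state]

print(task_colour("partly-correct"))  # yellow
```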

ELP

The Environment for Learning to Program (ELP) (Truong 2007, Truong et al. 2003) is a web-based environment for teaching programming to first-year students at university level. The system uses Java and C# program templates with "fill in the gap" exercises, presented to the students as web pages. After the student completes an exercise, the code is sent to the server and compiled. An example of such an exercise is shown in Figure 2.12 (Truong et al. 2003).

Figure 2.12: An example of an exercise from ELP.

If the exercise compiles without errors, the student can download and execute the program on his or her machine. If there are errors during the compilation, the student will receive customised compilation error messages and comments about the correctness and quality of their program. An example of a customised compilation message is depicted in Figure 2.13 (Truong 2007).

Figure 2.13: An example of customised compilation message from ELP.

JITS

Java Intelligent Tutoring System (JITS) (Sykes 2003) is a programming tutor designed for first-year university students learning programming in Java. JITS contains four distinct components: the curriculum design, the AI module, the distributed web-based infrastructure, and the user interface. Part of the AI module is an expert system which can provide general and specific hints based on two decision trees: the syntactic and semantic decision trees. The user interface of JITS is shown in Figure 2.14. JITS's interface, like ELP's and SQL-Tutor's, is very simple and easy to use. Creating interfaces that are as simple as possible, yet still provide valuable information, was one of the key ideas I pursued when designing PAT's interfaces. In this section I described ITSs that focus on learning programming. The next section examines the levels of detail used to describe student models.

Figure 2.14: The user interface of JITS.

2.4.3 ITSs with a Detailed Student Model

In Section 2.4.1, I described previous ITSs with a focus on the domain of databases. For those ITSs, even though there are explanations of how the student models are used, not much detail is given on how they are designed and constructed. An example of a very simplistic description of a student model is the one from Acharya, which describes the student model as containing general information about the student, a history of concepts learned (with a certainty factor), knowledge level and the number of hints requested by the student (Bhagat et al. 2002).

While some ITS literature gives enough detail that the designs can be easily re-used, other ITSs describe their student models only at a high level, without many design and implementation details. Further, some ITSs only describe their student model components, without giving any detail.

Wayang Outpost

Wayang Outpost (Arroyo, Beal, Murray, Walles & Woolf 2004) is an ITS for the mathematics section of the SAT (Scholastic Aptitude Test). Wayang's student model combines human expertise with a data-centric approach, represented as a Bayesian Network. While human knowledge was used to determine the skills required to solve particular problems, the data-centric approach was used to determine the conditional probabilities of the Bayesian Network - they were collected from previous students' interactions.

Figure 2.15: Inferring students' behaviour in Wayang Outpost.

The main purpose of Wayang’s student model is to provide the required infor- mation for the problem and hint selection. The information contained in the student model is composed of three parts: history of actions, spatial/retrieval ability and proficiency level of skills. Another goal of Wayang’s student model is to predict student’s behaviour after seeing a hint: will the student give a correct answer or ask for another hint. To give an example of the detail pro- 2.4.DetailedDescriptionofSpecificSystems 61 vided when describing the student model, Figure 2.15 (reproduced from (Woolf 2008)) shows the and Bayesian Network used to infer the students’ behaviour. The system provides significant information to a level of detail that will help other researchers to easily re-use the design of the student model.

TILE

Another example of a student model documented in detail is a web-based learning system (Han 2001) based on the TILE framework. This research produced a student model that can be used in a variety of subject domains. Although simplistic, this student model is well documented: the research describes the student model by providing a logical view - see Figure 2.16.

Figure 2.16: The logical view of TILE’s student model.

Furthermore, the entities and their attributes are also described - see Figure 2.17. It can be clearly seen from the previous two figures that a researcher interested in reusing the design of this student model can easily implement it. Some other systems, even though they provide detailed descriptions of their student models, either describe models for very simple examples (Bredeweg & Winkels 1994, Mobus et al. 1994), or only describe some aspects of the student model in detail (Zhou & Evens 1999). Bredeweg & Winkels (1994) propose an integrated qualitative reasoning approach that can be used to build both domain and student models. While

Figure 2.17: Attributes’ specifications of TILE’s student model. the approach is well described, the example given in their paper is a student model for only the simple case of a balance that can have three positions: low, normal or high. Mobus et al. (1994) propose an approach that can model the student’s knowledge growth (from novice to expert) in the domain of functional pro- gramming. Their approach makes the distinction between new knowledge and improved knowledge, represented by sets of rules. The description of the stu- dent model only gives an example of set of rules for six simple programming tasks.

CIRCSIM

CIRCSIM (Evens et al. 1997, Khuwaja et al. 1994) is a cardiovascular physiology ITS. The system helps medical students to understand the negative feedback system that controls blood pressure. In one of the improved versions of the system, Zhou & Evens (1999) describe the student model of CIRCSIM as having several components, as shown below.

◦ The performance model contains the assessment of student competence based on performance.
◦ The student reply history stores information on the topics already covered, as well as a history of the student's answers.
◦ The student solution record records all the errors that the student made while working on the exercise, together with a description of each error.
◦ The tutoring history model includes both the plan history and the discourse history. It does not contain information about what the student knows, but rather the system's history.

However, even though Zhou & Evens (1999, p. 15) say that they "provide detailed analyses of some of the information needed by different modules in CIRCSIM-Tutor", reusing the design would require a more detailed description.

ProTutor

ProTutor (Epp 2010, Epp & McCalla 2011) is an ITS for learning Russian pronunciation. The tutor provides feedback by using two visualizations of the learner model: an open learner model and an historic open learner model. While there is a precise description of what the learner can see through the two visualizations, the underlying data from the learner model is only briefly discussed.

Invention Lab

Invention Lab (Roll et al. 2010) is a system using a hybrid approach of model tracing and constraint-based modelling to offer intelligent support in enquiry environments. Roll et al. (2010) describe how the student's knowledge at the domain level is evaluated and how the system adapts the tasks to the student's demonstrated proficiencies. With regard to the learner model, it is only said that the model contains 59 production rules. Other ITSs just briefly describe their student model, only listing the important components they contain (Bhagat et al. 2002, Epp & McCalla 2011, Roll et al. 2010).

2.4.4 ITSs that Do/Don’t Use a Simulation

In Section 1.2.1, I described a general architecture for an ITS. However, there is no absolute agreement about the architecture of an ITS - different opinions exist (Nkambou et al. 2010). One of the important aspects of an ITS architecture is whether to use a simulation or not (Loftin et al. 2004). In Section 2.2.4, I discussed the simulation component from the point of view of education in general. In this section, I discuss examples of three types of ITSs: a) ITSs using a simulation; b) ITSs using a real system, not a simulation; and c) ITSs for learning abstract domains, such as mathematics, that do not need a simulation.

ITSs Using Simulation

IFT

The Intelligent Flight Trainer (IFT) is an ITS for trainee helicopter pilots (Remolina et al. 2004). IFT is a system that combines a helicopter flight simulator with an ITS. The ITS is based on an Adaptive Instructional System (AIS) architecture, which provides the infrastructure for modelling not just skill mastery but also personality and affective traits. IFT uses a simulation manager to communicate between the Adaptive Instructional System and the simulator.

PORTS TAO

Stottler et al. (2007) describe PORTS TAO, an ITS for the instruction of Tactical Action Officers (TAOs) in training at the Surface Warfare Officers

School (SWOS). The ITS uses the PC-based Open-architecture Reconfigurable Training System (PORTS) architecture. PORTS simulates the TAO console - see Figure 2.18.

Figure 2.18: TAO’s user interface.

A novel extension of the PORTS TAO ITS is described by Ludwig et al. (2010), where the ITS is integrated with the Second Life 3D virtual world. In this integration, the tactical action officer interacts with teammates (which are computer-controlled) and communicates with them using a chat window.

Cardiac Tutor

The Cardiac Tutor (Eliot & Woolf 1996) helps students learn cardiac resuscitation procedures through directed practice, within a real-time simulation, using knowledge-based reasoning. Figure 2.19 shows the Cardiac Tutor's student interface (Woolf 1996). The patient window contains available interventions with the patient, such as compressions, electric therapy, and intubation. In addition, the simulation, which proceeds in a physiologically and clinically plausible manner, also displays status and ECG windows (with Emergency Room sounds and graphics), showing vital signs and heartbeat information.

Figure 2.19: The simulation interface for Cardiac Tutor.

At any time, the student can access the hints window, where they can request more information about procedures and the simulated patient’s arrhythmia.

ITSs Using a Real System

Limited ITS research uses a real system as its educational focus, instead of a simulation. The best examples (in relation to my research) of intelligent systems that use the real environment are the "Excel Algebra Problem Solving Tutor" (Ritter & Koedinger 1996) and the "Lumiere project" (Horvitz et al. 1998), which served as the basis for the office assistant in Microsoft Office '97. The "Excel Algebra Problem Solving Tutor" (Ritter & Koedinger 1996) is a plug-in tutor agent with a focus on helping students acquire software skills. The plug-in tutor agent architecture provides a better learning environment by leveraging the power of existing tools - Excel in the case of this tutor. This tutor was built as a front-end to Excel, called pseudo-Excel. The front-end looked like the usual version of Excel and implemented all of its usual functions. The only difference was an additional "Tutor" menu from which the student can ask for help or quit the tutor. From this perspective,

PAT is similar to this tutor in that it adds a menu to the interface, but the important difference is that PAT does not provide an additional front-end to the real system; it uses the real user interfaces of the system. The Lumiere project's focus was to:
◦ construct Bayesian models for reasoning about users' goals from their actions;
◦ gain access to events from the real application;
◦ transform the events into observational variables that can be represented in the Bayesian models;
◦ develop persistent profiles of the users' expertise; and
◦ develop an overall architecture for an intelligent user interface.
Another system that uses a real system instead of a simulation - though it is a Computer Assisted Learning (CAL) tool rather than an ITS - supports students' training activities in a laboratory with electrical and electronic hardware. The system (Schweinzer 2008) is specifically designed for teaching elementary methods of measurement technology, for the purpose of helping students to achieve accurate results with their measurements. The system is connected to the real devices through a general purpose interface bus (GPIB) - see Figure 2.20. In this way, the system has access to the measured values and is able to inspect the quality of the results.

Figure 2.20: The use of the CAL system.

Some of the services that the system provides are:
◦ Automatic repetition of measurement procedures
◦ Learning support by special
◦ Integrated help-system for actual measuring
◦ Inspection of the measuring process - including hints at possible problems
◦ Examination by multiple-choice tests

ITSs That Do Not Need a Simulation

We have seen in an earlier section examples of ITSs that use a simulation to replace the real environment required for learning. However, some ITSs do not need any particular environment to teach the subject domain. Examples are ITSs for teaching mathematics, where there is no real-world environment that has to be simulated. This kind of tutor only needs useful, user-friendly interfaces in which the lessons or exercises are presented. Examples of ITSs that teach mathematics are tutors with a focus on algebra, such as Ms. Lindquist (Heffernan & Koedinger 2002) and AlgeBrain (Alpert et al. 1999), and tutors with a focus on geometry, such as the Geometry Tutor (Koedinger & Anderson 1993) and the Advanced Geometry Tutor (Matsuda & VanLehn 2005). As illustrations, the next two figures show the user interfaces of AlgeBrain (Alpert et al. 1999) and the Advanced Geometry Tutor (Matsuda & VanLehn 2005).

Figure 2.21: The interface of the AlgeBrain tutor.

Figure 2.22: The interface of the Advanced Geometry Tutor.

Pros and Cons of Using Simulation

Shute & Psotka (1994) describe using a simulation as advantageous: "Simulations are useful wherever real objects are involved in learning or training task, and they provide many benefits over real devices." (Shute & Psotka 1994, p. 23). After conducting several experimental studies using a pre-/post-knowledge assessment, Washbush & Gosen (2001) concluded that using a simulation improved learning. More recently, Schatz et al. (2009) describe the advantages of simulation-based training integrated with intelligent tutoring strategies, and refer to it as "advanced situated tutors".

Some of the situations in which using a simulation with an ITS is beneficial are:

◦ when it provides an environment that is safe even when mistakes are made (Shute & Psotka 1994);
◦ when the real environment is too expensive or too hard to replicate (Fournier-Viger et al. 2008); and
◦ when the students are learning psychomotor skills such as flying helicopters (Remolina et al. 2004).

However, there are disadvantages to using a simulation. One of these disadvantages is the development time: the number of hours spent developing a simulation can be rather large (Chapman 2004). Further, the more realistic the simulation is, the longer the development time required.

From this comparison, it is very easy to see that in the case of expensive and dangerous learning environments, such as military combat situations (Ludwig et al. 2010, Remolina et al. 2004, Stottler et al. 2007) or space station training (Belghith et al. 2011, Fournier-Viger et al. 2010), using a simulation is the obvious choice: when errors are made, no helicopter will crash and no patient will die. In the case of students learning new software systems, however, it is not just safe but also more beneficial for the student to use the real environment (Ritter & Koedinger 1996). Some systems (e.g. ELP (Truong et al. 2003)) follow the idea of simplifying the student's learning process and keep their focus on programming, not on issues that are specific to the code editor or the compiler. In PAT's case, that is the point: give the student the opportunity to use the real system, in real conditions, so the student has access to the full set of features - and the corresponding challenges. While learning to create forms or reports is important, knowing how to use the real system to create them is equally important. However, it is important to mention that while developing a simulation is not an easy task (Chapman 2004), developing an ITS to use the real environment (e.g. a real computer system) is not an easy task either.

2.5 Implications for Research in this Thesis

In this literature review, I have shown (Section 2.4.1) that even though some ITSs (such as SQL-Tutor and Acharya) use Access as a back-end database, none of them help students to learn RAD using Access. This justifies creating a new ITS for learning RAD in a database environment, which is the main goal of my research (stated in Section 1.4): to increase the effectiveness of students learning RAD in a database environment. What I have also identified in this literature review are two of the current difficulties (as stated in Section 1.4) with ITS research. The first of these difficulties is whether to use a simulation or not. In Section 2.4.4, I compared ITSs that use a simulation against those that do not. The second difficulty is that there is still a lack of precise definition of student models. In Section 2.4.3, I discussed examples of ITSs and the ways that their student models are described. In summary, the gaps I identified that provide the motivation for this research are listed below.
◦ There is no ITS for RAD in a database environment.
◦ None of the ITSs that help students learn about databases run within a real system instead of using a simulation.
◦ Because of the previous two statements, we do not know if students would prefer to have such an ITS running from within a real system instead of a simulation.
◦ Not many ITSs have their student models explained in enough detail to help other interested researchers easily apply the models to similar domains.
These gaps are the basis for the research question I gave in Section 1.4. The following chapters describe how I designed, built and evaluated PAT.

Chapter 3

System Requirements and Architecture

Building intelligent tutors requires an appreciation of how people learn and teach.

- Beverly Park Woolf - 2008

This chapter gives an overview of the system requirements and architecture. The chapter includes the results of a focus group that I conducted to help analyse the students' needs of PAT. This chapter also briefly describes MS Access and how MS Access is taught at QUT. Also in this chapter are the architectural design and high-level functionalities of PAT.[1]

1Some information about PAT’s architecture has been already published in (Risco & Reye 2009)


3.1 System Requirements

This section first describes the students' requirements, as gathered from a focus group, then discusses both functional and non-functional requirements.

3.1.1 Analysing the Students’ Needs

The theory underlying the functionalities that an ITS can have is well defined by the literature in the field (Brusilovsky 1994a, Self 1987, VanLehn 2006, Woolf 2008). However, students' perception of ITSs regarding a specific domain (in our case, learning how to use RAD to create forms and reports as interfaces to a database) can confirm the usefulness of some of the functionalities or minimise the importance of others. I identified the following issues that must be clarified in order to better understand students' needs.

1) What are the students’ perceptions and expectations about using an ITS for the Databases subject?

2) What are the students’ opinion regarding several interfaces presented?

From the range of available qualitative research methods, I chose the focus group as a suitable method (Edmunds 1999) for the issues presented above. The preparation of this focus group followed the principles described in the book "Focus Group Interviews in Education and Psychology" (Vaughn et al. 1996); more details about the focus group are given in Appendix B. However, for the reader's ease, I describe next the important aspects of organizing and conducting the focus group.

To narrow down the scope of the focus group, we wanted to know from the students:
◦ What broad functionalities do they expect from an ITS?
◦ What specific functionalities do they expect from an ITS?
◦ What are their opinions about the solutions presented?
The focus group had 6 participants, sufficient for the objectives. The participants were selected from first-year students at QUT. The only requirement in selecting the participants was to have students who had already completed the Databases subject. In that way, we could use their past experience with the non-ITS software (Reye 2004/2008) they used in that Databases subject. The students from this subject also used another learning environment (Truong et al. 2003) to learn programming in Java and C# in another subject, so they already had some understanding of the types of help such a system can offer. At the beginning of the focus group, the participants saw a small presentation (using a projector) about ITSs. The presentation was based on the general architecture of an ITS (see Figure 1.2) and briefly explained the aim of an ITS and its main components. After that presentation, the participants were shown two interfaces:
a. the simple interface of the non-ITS software they used before; and
b. an improved interface that, in addition to the original one, had a button to ask for help and a text box where the help would be displayed.
With the two interfaces still displayed, the students were asked to give ideas on what type of help (features) they should get after clicking on the help button. The identified features are discussed in the next section.

Students’ perception and expectations about ITSs and their use as learning tools

During the focus group, the following features were identified as very useful by the participants:

1) error indication (i.e. to just indicate what is wrong, without detailed feedback);

2) feedback about the error;

3) link to textbook or lecture slides to fix the error;

4) additional resources for the topic not understood;

5) step by step demonstration of how “things” are actually working;

6) check points for each task - i.e. to give feedback on subparts of the exercise (e.g. subparts such as the form appearance or functionality of a subform);

7) possibility to send an email to the “human” tutor, asking for feedback;

8) a list of exercises grouped by topic - “but only those relevant should be displayed” [sic];

9) what to do next;

10) hints on how to start and do the assignment;

11) screen shots at various stages of building the interface.

We can see that many of the features that the students identified are already present in the KVL tutoring framework (VanLehn 2006). Table 3.1 shows how the features identified by students can be mapped to the KVL framework.

             KVL framework                  Focus Group
outer loop   The student selects the task   8
             The ITS assigns the tasks      n/a
inner loop   Minimal feedback               1
             Hints on next step             9
             Error-specific feedback        2
             Assessment of knowledge        n/a
             Review of the solution         6, 11

Table 3.1: Students' answers mapped to the KVL tutoring framework.

As Table 3.1 shows, the students' requests match well the services described by VanLehn (2006). The only two services not asked for by students are the assessment of knowledge and the ITS assigning the next topic or exercise, but this is understandable, as they are important only from the teachers' point of view. The remaining features indicated by students that do not match any of VanLehn's services can be grouped as additions to the services described in the KVL framework. Whether they are links to other resources (error-specific or not), the possibility to send an email to a tutor asking for help, or step-by-step demonstrations, they can all be categorised as additional resources. The results of the focus group also confirmed that students prefer to have a large variety of media to choose from (Yacci 1994). The only request which should not be seen as a service is feature number 10 - "hints on how to start and do the assignment". This is an issue related more to the way that the exercises are presented, and should not be considered a service provided by the ITS. The exercise description can include some hints on how to start the exercise, perhaps some generic hints for the difficult issues as well. Since the time when the focus group was held, the assignment requirements for the Access part of the Databases subject have been improved, rendering this issue obsolete.

Students’ opinions regarding the interfaces presented

Regarding the user interface, the participants were asked to give their comments and suggestions about the non-ITS software that they used in the previous semester (see Section 1.5.1). They definitely liked the idea of a “small box” which “doesn’t take much room”, and all they really wanted was to “add a textbox with the feedback”. In addition, for question 6 (see Appendix B), they had to compare two possible options for the new user interface, but the discussion still centred on functionality; the user interface, at that time, was not a concern for the participants.

3.1.2 Functional Requirements

No ITS so far has implemented all the functions previously presented in Table 1.1, and doing so is not my objective either. However, to achieve my research goal (stated in Section 1.4), I had to develop a system which can:

◦ find the errors (if any) in the student’s solution;
◦ determine which is the most important error;
◦ retrieve the appropriate feedback to address that error;
◦ individualise the feedback based on the information in the student model;
◦ propose the next exercise or topic to help the student attain the knowledge and skills required by the curriculum;
◦ give the student the opportunity to choose what exercise to do next;
◦ allow the student to ask for help on a particular task from the exercise;
◦ provide a mechanism to update the student model during the student-system interaction; and
◦ address the students’ needs as identified in the focus group.
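To illustrate how these requirements combine at run time, the following is a minimal sketch, in Python, of the check-solution cycle. All names and data shapes here (check_solution, the hints dictionary, the prefers_diagrams flag) are hypothetical illustrations, not PAT’s actual implementation.

# A minimal sketch of the requirements above: find the errors, pick the
# most important one, retrieve feedback and individualise it.
def check_solution(student_solution, correct_solution, student_model, hints):
    # find the errors (if any): object-properties whose value differs
    errors = [(mie, key) for mie, key, value in correct_solution
              if student_solution.get(key) != value]
    student_model["last_errors"] = errors          # update the student model
    if not errors:
        return "No errors found - well done!"
    mie, key = min(errors)                         # the most important error
    feedback = hints.get(key, "Check this part of your form: %s" % (key,))
    if student_model.get("prefers_diagrams"):      # individualise the feedback
        feedback += " (see the diagram for this step)"
    return feedback

model = {"prefers_diagrams": True}
correct = [(1, ("MainForm", "RecordSource"), "Customer"),
           (2, ("FamilyName", "ControlSource"), "FamilyName")]
attempt = {("MainForm", "RecordSource"): "Customer"}
print(check_solution(attempt, correct, model,
                     {("FamilyName", "ControlSource"):
                      "The FamilyName textbox is not bound to a field."}))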

3.1.3 Non-functional Requirements

Modularity

Regarding modularity, there are two aspects that I have addressed: decomposition of the knowledge base and decomposition of the programming code. For the code decomposition, the code was split into code used for the interface and code used to implement the functionalities of the system.

The components of an ITS were already presented in Figure 1.2 from Section 1.2.1. However, another way of representing the knowledge base is shown in Figure 3.1. This new representation allows us to more easily group the components of the knowledge base and also to identify which tables will require updating during the use of the model. In the centre of the diagram we have the student model, composed of information about the personal characteristics of the student and the student’s knowledge about the domain - as already described in Section 2.2.2.

Figure 3.1: The configuration of components in the knowledge base (the student model - personal characteristics and the student’s knowledge about the domain - at the centre, surrounded by the domain, tutoring and exercises models).

Privacy and Security

Regarding the privacy and security of data within the Student Model, we took into consideration the Principles of Ethical Conduct as described in the National Statement on Ethical Conduct in Human Research (NHMRC 2007): integrity, respect for persons over scientific outcomes, beneficence, justice, consent, and research merit and safety. Some of the specific measures I put in place to address the privacy and security of the data collected are:

◦ the students can only access their own data (as stored on their own computers); and
◦ the aggregated data used for the evaluation of the system does not contain any personal information that could identify any individuals.

3.2 Teaching MS Access

This section gives an overview of Microsoft Access - the domain that PAT helps students to learn. It presents the particularities of the domain and the types of exercises that students enrolled in the INB210 subject at QUT are required to solve during the semester.

3.2.1 Microsoft Access Overview

Microsoft Office Access (previously known as Microsoft Access) is a Relational Database Management System developed by Microsoft Corporation. From this point on, to simplify the text, Microsoft Office Access will be called Access. Being part of the Microsoft Office suite, Access is one of the most widely used Windows desktop database applications. Access’s interface is presented in Figure 3.2.

Figure 3.2: Microsoft Office Access 2007 - graphical interface.

Several versions of Access have been developed by Microsoft Corporation. The two latest versions are Access 2003 and Access 2007. PAT was initially developed for Access 2003 installed on Windows XP, and later ported to Access 2007 running on Windows Vista. PAT can be installed with any combination of Access 2003 or Access 2007 and Windows XP or Vista. Access is based on the Microsoft Jet Database Engine and provides a graphical user interface which contains RAD tools to easily develop database applications. Through the graphical interface, users can create several types of objects, such as forms and reports, to easily interrogate or update the data in the database. Figure 3.3 and Figure 3.4 show examples of forms and reports created in Access.

Figure 3.3: Microsoft Office Access 2007 - example of a form.

Forms are objects used to enter, view or edit records in a database; reports are read-only views of the content of one or more tables or queries from a database, formatted for easy printing (Adamski & Finnegan 2008). Access users can use built-in wizards to create simple forms and reports, but these have restricted layouts and functionality. While the wizards provide an easy start for users who have just begun to learn Access, they cannot be used to automatically create more complex, advanced forms and reports.

Figure 3.4: Microsoft Office Access 2007 - example of a report.

Such forms and reports must be developed manually (entirely by hand), although the wizards can be used to create a basic version as a starting point.

Forms and reports can contain various types of objects such as TextBoxes, Labels, Subforms, etc. Each object has several properties that define the look and behaviour of the object. A Label object, for example, has Caption (what text the label will display) and FontSize (the size of the text) properties. Different types of objects can have the same properties (e.g. both Label and Form objects have a Caption property) but can also have some special properties - LinkChildField and LinkMasterField are properties of the Subform object only.
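This object type / property structure maps naturally onto simple data structures. The sketch below, a Python illustration with assumed names (it is not part of PAT), shows one way the facts just described - object types, their properties, descriptions and allowed values - could be represented.

from dataclasses import dataclass, field

@dataclass
class Property:
    name: str
    description: str = ""
    allowed_values: tuple = ()     # empty tuple: any value is allowed

@dataclass
class ObjectType:
    name: str
    properties: dict = field(default_factory=dict)
    def add(self, prop):
        self.properties[prop.name] = prop

label = ObjectType("Label")
label.add(Property("Caption", "what text the label will display"))
label.add(Property("FontSize", "the size of the text"))

subform = ObjectType("Subform")
# Linking properties belong to the Subform object type only.
subform.add(Property("LinkChildField"))
subform.add(Property("LinkMasterField"))

form = ObjectType("Form")
form.add(Property("Caption"))  # different object types can share a property name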

3.2.2 Teaching Access at QUT

INB210 (Databases) is a first year subject at Queensland University of Technology (QUT). The first part of the subject covers SQL. The second part provides students with the opportunity to use MS Access - a widely used commercial RDBMS - to apply their knowledge of SQL and to learn about Rapid Application Development. Students have weekly practicals where they are required to solve exercises related to that week’s topic. With Access, students learn how to create queries, forms and reports. First, they have to create simple forms and reports using the wizards; then they are required to improve the initial forms and reports, adding more functionality and improving their appearance.

In terms of the classification of domain models presented in Section 2.2.1, Access (as the domain to be taught) is a domain that is simple and well defined, where the correct solution (or solutions) can be identified. In some situations, the expected behaviour or format for a form or report can be achieved in different ways (different combinations of property name - property value). This requires multiple correct solutions to be available; this issue will be described in detail in a later chapter.

An ITS for Access benefits students’ learning, complementing the lectures, tutorials and workshops. Students use the ITS both during the practicals and at home, in their own time and at their own pace.

3.3 PAT’s Architecture

After the framework for the system was defined and the requirements gathered, the next step in the process of system development was to develop the system architecture. In this section I present the architectural design, the method chosen to represent the knowledge, and the functionalities to be implemented.

3.3.1 Architectural Design

In Section 1.4, the first task I identified was to design an appropriate architecture. Mitrovic (1998) recommends using a real working environment (a Relational Database Management System) instead of a simulation, as being more beneficial for the students. PAT’s architecture, reflecting this improvement, is presented in Figure 3.5. The figure shows that in PAT, compared to the general ITS architecture (see Figure 1.2), there is no simulation module. Instead, PAT uses a real working environment - Access.

Figure 3.5: PAT’s architecture and main components (the Intelligent Tutoring System - user interface, instructional expert, student model, exercises model and domain model - connected through the Access interface to the real working environment, Access, used by the student).

As stated in Section 1.2.1, the Domain Model represents the knowledge about the domain to be taught, knowledge that an expert in the field should know. It represents the foundation for the entire knowledge base. Because PAT’s focus is on helping students to learn how to use RAD tools to create forms and reports in Access, all the objects (together with their properties) that can be created in a form or a report are present in the domain model.

The Student Model contains the system’s beliefs regarding the student’s knowledge of the domain (Holt et al. 1994) and additional information about the student, such as personal characteristics and learning style (Beck et al. 1996, Reye 2004). In PAT, the student model includes information about the student’s preference for learning from diagrams or text, and their interest in the subject and in several topics from the domain. Every time PAT analyses the student’s solution, the student model is updated and the student’s actions are recorded in the student’s history.

Because PAT uses Access - the real working environment - instead of a simulation, it also has the Exercises Model. The exercises model contains a collection of exercises which the students can test in Access while receiving feedback from PAT. The exercises model must contain all the information required to solve the exercises (purpose, justification, requirements, etc.) and to allow PAT to give advice to the students (the correct solution).

The Instructional Expert, based on knowledge both about the domain and the student, diagnoses the student’s attempted solution and provides individualised feedback. As part of the Instructional Expert, the tutoring model contains information about teaching the domain, such as tutoring goals and hints for students. The tutoring model must be able to take advantage of the information provided by the student model (e.g. the student’s learning style and personal characteristics).

When I designed the Instructional Expert for PAT, I applied principles from the Minimalist Framework for designing instructional materials for computer users (Carroll 1990); the GOMS model (Card et al. 1983); and Andragogy, “the art and science of helping adults learn” (Knowles 1980, p. 43) - a student-centred approach for adults.

Access / Access Interface: because PAT will be implemented as an add-in for Access, the student can use PAT from within Access. After installation, PAT appears as a new group in Access’s menu bar (Figure 6.1, later). In this way, the student can work on each exercise, test their solution and receive feedback from PAT without leaving Access’s window.

The User Interface allows the student to communicate with the system (PAT). This is a two-way communication: the student can ask for new exercises or help, and the system returns feedback to the student. Via commands added to Access’s menu bar, the student can open PAT’s interfaces, presented later in Chapter 6.

3.3.2 Knowledge Representation

There are many ways of representing the Student Model. In Section 2.3, I stated that semantic networks can be used for knowledge representation and reasoning. An example of such a representation for Access is provided in Figure 3.6. A detailed description of PAT’s entire knowledge base is given in the next chapter.

In the semantic network presented in Figure 3.6, there are two categories of topics recorded: concrete topics from the domain, and abstract topics, which represent teaching concepts. For example, LinkMasterFields, LinkChildFields and SourceObject are properties for a subform object (concrete topics), while LinkingProperties is the abstract parent topic.


Figure 3.6: Part of the semantic network of PAT.

3.3.3 PAT’s Functionality

The functionalities that PAT implements are presented in Table 3.2, organised according to the services described in the KVL tutoring framework.

Functionalities to be implemented in PAT
Outer loop:
  The student can select the task
  The ITS assigns the tasks
Inner loop:
  Minimal feedback
  Hints on next step
  Error-specific feedback
  Assessment of knowledge
  Review of the solution

Table 3.2: Functionalities implemented in PAT.

In addition to the functionalities presented in Table 3.2, PAT also has additional functionalities as identified by the focus group. An example of this is a “More Readings” function pointing to the materials from the lecture slides or the recommended textbook that are relevant to the detected error.

3.4 Summary

This chapter started by presenting the conceptual framework for this research. Based on information gathered from a focus group that I conducted with students from QUT, the requirements for PAT were defined, as detailed later in the chapter. A description of PAT’s architecture, meeting these requirements, was then presented. This chapter also included an overview of Access (the domain that PAT teaches) and how Access is taught at QUT. The next chapter describes in detail PAT’s knowledge base, addressing each of the four models: the domain model; the student model; the tutoring model; and the exercises model.

Chapter 4

Designing new Models

The sciences do not try to explain, they hardly even try to interpret, they mainly make models. By a model is meant a mathematical construct which, with the addition of certain verbal interpretations, describes observed phenomena. The justification of such a mathematical construct is solely and precisely that it is expected to work.

- Johann von Neumann (1903 - 1957)

This chapter presents a detailed description of the knowledge base that I designed for PAT. The first part describes the domain model - the knowledge about Access as a domain to be taught. The second part describes the exercises model: it contains information about how exercises are defined and used, and also describes how the solutions for exercises are represented. The third part describes the student model, with information about the student’s personal characteristics, what the student knows or does not know, and the student’s history. The fourth part describes how PAT’s student model is maintained, and the last part presents the tutoring model.


4.1 Introduction

Object-Role Modeling (ORM) is “a fact-oriented modeling approach for expressing, transforming and querying information at a conceptual level” (Halpin 2005b, p. 3). The diagrams I present in this chapter use the ORM 2 graphical notation. More information about ORM 2 can be found in Terry Halpin’s book “Information Modeling and Relational Databases” (Halpin 2001) or on the ORM website (Halpin 2005a). ORM is based on a Conceptual Schema Design Procedure (CSDP), which focuses on the analysis and design of data. The resulting conceptual schema specifies:

◦ types of facts;
◦ constraints on these facts; and
◦ derivation rules to derive some facts from the others.

The steps of the Conceptual Schema Design Procedure are presented in Table 4.1.

Step  Description
1     Transform familiar information examples into elementary facts, and apply quality checks
2     Draw the fact types, and apply a population check
3     Check for entity types that should be combined, and note any derivations
4     Add uniqueness constraints, and check arity of fact types
5     Add mandatory role constraints, and check for logical derivations
6     Add value, set comparison and subtyping constraints
7     Add other constraints and perform final checks

Table 4.1: The conceptual schema design procedure (CSDP).

4.2 The Domain Model

As stated in Section 1.2.1, the domain model represents the knowledge about the domain to be taught, knowledge that an expert in the domain should have. It represents the foundation for the entire knowledge base of the ITS. This section presents in detail the new domain model I created and how the model could be used by others in ITSs for similar RAD environments.

4.2.1 Domain Model Conceptual Schema

The objects that an Access database can have are categorised as: tables, forms, reports, and queries (see Figure 4.1). PAT’s focus is on helping students to learn how to create forms and reports.

Figure 4.1: Access objects.

In Section 3.2.1, I described what forms and reports are:

◦ A form is an object used to enter, view, or edit records in a database. Forms can present data in many customised and useful ways.
◦ A report is a read-only view of the content of one or more tables or queries from a database, formatted for easy printing.

To create the required functionality and look of a form, there are several control types available, such as: Label, TextBox, ComboBox, CommandButton, CheckBox, etc. (see Figure 4.2). All the control types, together with forms and reports, are referred to as object types in Access. Because a form or report can hold control types, we can classify forms and reports as container types - a subset of object types. All object types have an object type name.

Figure 4.2: Controls that can be created inside of forms or reports.

From the point of view of the domain model, forms and reports are similar. To simplify the text and to make it easier to read, I will describe the process of creating the domain model for forms, only mentioning reports when there are differences between forms and reports. All the object types have a number of properties (e.g. Caption, Datasource, Visible, etc.) which define the look and functionality of the individual object type. Properties are grouped in four categories: Format, Data, Event, and Other. Each property has a name and a value (see Figure 4.3).

Figure 4.3: Example of properties for a TextBox control instance.

Some properties have allowed values, e.g. the property with the name ‘Enabled’ can only have the values ‘yes’ or ‘no’. Each property has a description. For example, the description for the Control Source property is “specify what data appears in a control”. The description attribute is not part of Microsoft’s supplied software; I have added it to support this ITS. Properties are associated with object types. For example, in Figure 4.3, we can say that the Control Source of the company text box is ‘CompanyName’. In addition, because some properties are more important when associated with a particular object type than when associated with other object types, an importance attribute is associated with each object-property combination. Importance can be ‘high’, ‘low’, or ‘undefined’. Figure 4.4 represents the conceptual schema for the domain model as an ORM 2 diagram.

Figure 4.4: Domain model conceptual schema (the fact types relating Property and ObjectType - with ContainerType {Form, Report} and ControlType as subtypes - and attributes such as PropertyName, Category {Format, Data, Event, Other}, PropertyDescription, AllowedValue, IsAbstract and Importance {Hi, Low, Undefined}, together with the objectified “ObjectProperty” fact type and its HasPrereqObjProp ring fact type).

In Figure 4.4, I introduce the abstract object and abstract property attributes as an easier and more meaningful way to organise the prerequisite relationships between ObjectProperty combinations. Abstract objects and properties allow us to create new object types such as LinkingProperties (which is a prerequisite for the Subcontainer object type), and the Exist property (indicating whether or not the object was created in the form). To represent the prerequisite relationships for ObjectProperties we have the HasPrereqObjProp role (an acyclic ring fact type). The prerequisite relationship instances can be drawn as a semantic network. Figure 4.5 shows only a part of the semantic network representing the prerequisite relationship; for space reasons, the figure contains only some of the important ObjectProperties. A small sketch of traversing these prerequisites follows the figure.

Figure 4.5: Semantic network for the domain model (abstract topics such as CompleteForm, MainForm, AbstractSubform, LinkingProperties, FormProperties and FormObjects, together with concrete object-properties such as Form RecordSource, Subform LinkMasterField, Subform LinkChildField, Subform SourceObject, Combobox RowSource and the like, linked by prerequisite relationships).

4.2.2 Extending the Use of the Domain Model

To illustrate how the domain model I created could be used by others developing ITSs for similar RAD environments, I show the similarities between Microsoft Access and OpenOffice.org Base. As previously mentioned in Section 1.3, Base is a database management system (part of OpenOffice) which is similar to Access. Figure 4.6 shows the types of controls that can be created in the two environments (Access and Base). It can be clearly seen from the image that Base uses the same approach as Access, allowing users to create forms containing several types of controls, many of which are the same as in Access: TextBox, ListBox, ComboBox, Label, etc.

Figure 4.6: Access vs OpenOffice Base Controls.

Furthermore, Figure 4.7 shows how Base uses the same approach as Access by having properties associated with objects - even though it has only three categories of properties, not four as Access has. The properties have names and values.

Figure 4.7: Access vs OpenOffice Base - properties for objects.

In Figure 4.8, we can see how the Form object has its own properties, just like in Access. We can also see that some properties can only have predefined, allowed values e.g. the Content Type property can only have one of the values: Table, Query, or SQL command.


Figure 4.8: Access vs OpenOffice Base - properties for a form. 100 4. Designing new Models

Comparing Access and Base, we can say that in both database management systems, the following facts are true:

◦ forms are a special type of object that can be used as containers to hold other objects;
◦ objects (including forms) have properties;
◦ properties are grouped into categories;
◦ properties have names and values; and
◦ some properties can only use allowed values.

Moreover, the following features from the model I designed to work with Access can also be implemented for Base:

◦ we can attach a description to each property;
◦ we can create abstract objects and properties as a more meaningful way to organise the prerequisite relationships; and
◦ some object-property combinations are more important than others.

In conclusion, although not actually implemented in this thesis, it can be clearly seen that the conceptual schema for the domain model created for PAT could be re-used for OpenOffice Base, as both of them are relational database management systems with similar characteristics. However, when analysing SQL, some changes would need to be made in line with the particularities of Base (e.g. the use of “%” for “any characters”, in contrast to Access, which uses “*”).

4.3 The Exercises Model

Intelligent Tutoring Systems provide help to students learning by solving problems. Therefore, an ITS must have a collection of exercises (or problems) that the students are expected to solve. The exercises are part of the ITS’s knowledge base. Each exercise must contain the following components:

◦ exercise description; and
◦ exercise solution.¹

This section describes the new exercises model I created for PAT.

4.3.1 Exercise Description

Because PAT is used from within Access, without any other documentation about the exercise requirements, exercises must have a purpose and description in the knowledge base. The exercise purpose presents the reason for having such a form or report, e.g. “We want to see all the products from a particular category”. The exercise description presents what has to be done to achieve the purpose, e.g. “Create a new form displaying category name and description. In the form you should also have a subform displaying product name, quantity per unit, units in stock and unit price”.

Each exercise must have a unique exercise name and exercise code. These uniquely identify an exercise. An exercise can be an assignment or not - i.e. an item assessed as part of the students’ study at QUT. An exercise also has an exercise type: it is either an exercise for forms or for reports. These are required so that students can browse for exercises by these two criteria: assignments; and exercises for forms or reports.

¹Some ITSs do not store solutions for exercises but run a problem solver instead (Conati et al. 2002).

In addition to the above classifications, several categories are defined based on the learning purpose of the exercise. Some examples of categories are: simple forms; forms with subforms; adding a combobox to a form; etc. Categories have a category name and a category code. An exercise can belong to more than one category. In addition, each exercise has a difficulty from 1 (easiest) to 5 (hardest).

Some exercises are enhancements to a previous exercise, e.g. improving the layout of an existing form or adding a combobox to a form for easier navigation through the records. In such cases, the exercise code for the prerequisite exercise must be specified in the knowledge base.

It is very important that students understand why a particular exercise should be done. For this reason, each exercise should be presented with an exercise justification related to the student’s learning objectives, e.g. “You will learn how to create a form with a subform and understand how the form and subform are synchronised”.

Some exercises, such as old or future assignments, are better not presented to the students. However, having them in the database is a benefit from the teaching staff’s perspective. To address this issue, an additional property is the visibility of an exercise. Only if an exercise is set as visible will PAT make it available to students.

In Access, forms and reports are stored as objects in the same file as the tables. To create the required form or report, the students must use the corresponding database. Each exercise must be associated with a database described by database ID and database name.

Even though an exercise consists of only one form or report, a form can have one or more subforms and a report can have one or more subreports. For this reason, the form or report containing the other forms or reports is called the main form or main report: it is a main container. The subforms and subreports are called sub containers. Both main containers and sub containers are subtypes of container.

Because the exercise’s description only briefly covers the functionality of the form or report, all the specific requirements for the objects from the exercise (i.e. the subform or the combobox) are presented in the knowledge base with the associated object. In this way, any object in a form, from any exercise, must contain the following information:

◦ a unique object name;
◦ the object type;
◦ object purpose and specific object requirements; and
◦ sometimes additional information on how to approach the problem.

The object type and the object name are in fact properties of the object type as described in the domain model (Section 4.2). Figure 4.9 presents the conceptual schema for the exercises model.

Figure 4.9: Exercise model conceptual schema (exercises with name, code, purpose, description, justification, type {Form, Report}, difficulty {1...5}, visibility, assignment status, prerequisite exercise, associated database (ID and name), tasks and subtasks with their points, and a main container holding objects - with type, name, purpose, requirements and additional information - whose “ObjectProperty” facts carry property values, original values, subvalues and MIE codes).

4.3.2 Exercise Solution

An exercise has several attributes associated with an ObjectProperty (previously defined in Section 4.2.1 as the objectified fact that an object has a property). These attributes are: original value, property value (correct solution), and MIE code.

◦ Original Value - This represents the way that Access internally records all the values of the object-property (e.g. ‘-1’ for ‘yes’ and ‘0’ for ‘no’).
◦ Property Value - This represents the value that the object-property should have in the correct solution for the exercise.
◦ Most Important Error (MIE) Code - This is a measure of the relative importance of each part of the solution. A simple way to explain the MIE code is this: if a student’s solution has more than one error, which error is the “most important one”? Which error should be addressed first?

While the original value and the property value are attributes that come from Access, the MIE code is an attribute that I added to the exercise model. In Access, as in many other development platforms, the final product can be developed step by step using different development paths. This means that we cannot simply compare the student’s path with only one pre-defined (correct) path.

However, using best practices for developing forms and reports in Access, we can differentiate between the steps, because some of the steps are more important than others. As an example, one should not try to add objects to a form or report before the data source of the form or report is correctly defined. Following this approach, if the student’s solution has more than one error, we can identify the most important error as the one corresponding to the most important step. In this way, we can always determine which step the student should work on next - if asking for help. In PAT, the MIE code is manually added by the teaching staff when a new exercise solution is created. The MIE code orders the objects and properties, so the first one that is incorrect is the one that should be addressed first.
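The sketch below illustrates this selection rule in Python. The function name and data shapes are assumptions for illustration; only the rule itself - report the first mismatch in MIE order - comes from the description above.

# Each entry of the correct solution carries its MIE code; the first
# object-property (in MIE order) whose value differs from the student's
# solution is the most important error.
def most_important_error(correct, student):
    for mie_code, obj, prop, value in sorted(correct):
        if student.get((obj, prop)) != value:
            return mie_code, obj, prop
    return None   # no mismatch: the solution is complete

correct = [
    (1, "CustomerRentedMovies", "Exist", "yes"),
    (2, "CustomerRentedMovies", "RecordSource", "Customer"),
    (3, "FamilyName", "Exist", "yes"),
    (4, "FamilyName", "ControlSource", "FamilyName"),
]
student = {("CustomerRentedMovies", "Exist"): "yes",
           ("CustomerRentedMovies", "RecordSource"): "Customer"}

print(most_important_error(correct, student))
# -> (3, 'FamilyName', 'Exist'): create the FamilyName textbox first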

A limitation of this general approach is the case when a student cannot address the most important error in his or her solution, not even with help from PAT. It is possible that in such a situation, the student might try to work on other parts of the form or report and encounter problems in these parts.

To avoid the situation where PAT points out the most important error again and again, even though the student works on another object, I designed a mechanism which allows the student to ask for help for a particular object, rather than the entire solution. A more detailed explanation is given in Section 5.1, and more information about how the student can ask for help on a particular topic can be found in Section 6.2.3.

For ITSs that have exercises with a small number of topics, a possible approach is to show all the errors. However, the exercises from PAT have many topics. The simplest exercise in PAT (AddShipper) has 10 topics, the majority of helping exercises have around 30 or 40 topics, while the assignments have 50 to 70 topics. Considering the fact that a student might check his or her solution soon after starting to work on the exercise, the list of errors could be too long to be beneficial to display in full.

As a graphical representation of a solution, I created solution graphs for exercises, consisting of the objects and their associated properties ordered in the same way that an expert in the domain, using best practices, would create the objects and configure the properties.

To illustrate, Figure 4.10 shows an example of a form which represents one of the very simple exercises from PAT. The solution graph for that exercise is then shown in Figure 4.11. In Section 4.3.1, I have already discussed the components of an exercise description. For the example in Figure 4.10, the students get textual information about the exercise description and purpose, as well as requirements for all the important objects (such as the form, textboxes, etc.).

Exercise description: For a selected customer, which movies has that customer rented, how many times have they rented each such movie, and when was the most recent date each such movie was rented? Create a form with an appearance as close as possible to that in the snapshots in this specification.

Requirements for the important objects: The main form CustomerRentedMovies shows the family name, given name, and address of the customer selected by the user, displayed in the Detail section of the form. The form is presented in single form view; in addition, the main form holds a combobox with the label “SelectCustomer”. You must name your main form “CustomerRentedMovies”.

The subform shows the details of the movies rented by the selected customer. The subform is presented in continuous forms view. You must name your subform “CustomerRentedMoviesSF”. The list is displayed in descending order of number of times rented, and then in ascending order of the title.

The combobox enables rapid navigation to the selected customer. The drop-down list displays the given name and family name separated by a comma and space, the telephone number and the address of each customer, in alphabetical order of family name and then given name.

In addition, PAT provides the student with a screenshot of a complete (and correct) form - see Figure 4.10.

Figure 4.10: Example of a form, in an exercise.

Figure 4.11 shows the objects and their associated properties that must be set to particular values in order to achieve the required functionality or look for the form. The ovals with two lines of text represent objects, described by their name (top line) and their type (bottom line). The ovals with only one line represent properties. On the left side of the figure, the task number is represented. The tasks group together similar object-properties. The column on the right side of the figure represents the order in which an expert in the field would set the values for the properties. Even though it might appear that the students must address the objects and properties in the order of the MIE code, this is not true; the students can address them in any order and, as noted earlier, they can ask for help for a particular object.

Figure 4.11: Example of a solution graph (the Customer_Rented_Movies form and its RecordSource; the FamilyName, GivenName and Address textboxes with their ControlSource properties; the SubformObject subform with its SourceObject; the Customer_Rented_Movies_SF form with its RecordSource, LinkMasterFields and LinkChildFields; the Title, Rentals and MaxOfDateDue textboxes with their ControlSource properties; and the SelectCustomer combobox with its RowSourceType, RowSource, ControlSource, ColumnCount, BoundColumn, ColumnWidths and AfterUpdate properties - MIE codes 1 to 28).

Similar to VanLehn’s approach of describing an activity using tasks and steps, we can separate the activities that must be done to solve an exercise in tasks composed of subtasks. An example of tasks with subtask for an exercise is presented in Table 4.2. The table shows the four tasks that the exercise consists of. Each task has a task number, a task description and the percentage of the mark for the task. The percentage of all the tasks represents a part of the complete exercise. In this case, the combined percentage of each task shown in the table represents 4% of the total mark for the subject (as 4% is the percentage of this assignment). The table also shows the subtask’s number and description, together with the subtask points. Subtasks points are used to calculate the weight of each subtask for the corresponding task percentage. 4.3. The Exercises Model 111 Points 15 30 30 Subtask Name Subtask only Num nated Num 01201 Nameoftheform2 Sourceofthedata Fieldsinthemainform3 Boundcolumn Afterupdateprocedure4 Nocontrolsource-usedaslook-up5 List display:0 do 10 not 15 see1 25 Customer- List display:2 15 names0 are Listdisplay:namesaresorted concate- 20 1 SourceObject2 LinkChildField LinkMasterField 15 Fieldsinthesubform Thetypeofviewforthesubform Sourceofdata 45 10 20 20 30 100 Table 4.2: Example of tasks/subtasks for an exercise. TaskName Subtask Task Num 0 Main form: CustomerRentedMovies (0.50%) 1 Combobox to select the customer (1.25%) 2 Subform control to3 hold subform (0.50%) Subform CustomerRentedMoviesSF (1.75%) 112 4. Designing new Models

Steps can be seen as setting the correct value for an object-property, as presented in the exercise model. In this way, the object-properties are associated with a subtask within a task in an exercise. Table 4.3 presents the steps associated with subtask 0 in task 3, from Table 4.2.

Task Num  Subtask Num  Object Name    Property Name
3         0            Title          ControlSource
3         0            Rentals        ControlSource
3         0            MaxOfDateDue   ControlSource

Table 4.3: Object-properties as steps associated with a subtask within a task.
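The weighting calculation mentioned above can be illustrated with a small sketch: a subtask’s share of the mark is its points divided by the total points in its task, scaled by the task’s percentage. The point values below are hypothetical, not the actual marking scheme.

task_percentage = 1.75          # e.g. Task 3 is worth 1.75% of the subject mark
subtask_points = {              # hypothetical subtask points for Task 3
    "SourceObject": 20,
    "Linking fields": 20,
    "Fields in the subform": 30,
    "Type of view": 10,
    "Source of data": 20,
}

total = sum(subtask_points.values())                      # 100 points in total
weights = {name: task_percentage * points / total
           for name, points in subtask_points.items()}
print(weights["Fields in the subform"])                   # -> 0.525 (% of mark)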

In some cases, the value of an object-property from the correct solution can simply be compared with the value of the same object-property from the student’s solution (a detailed description of the student solution is in Section 4.4); if the object-property in the correct solution has the same value as in the student’s solution, the step is correct. An example of such a property is ScrollBars. The allowed values for the ScrollBars property are “none”, “horizontal”, “vertical”, and “both”, so comparing the correct solution with the student’s solution can easily be done. In those cases, we call the object-property simple. A complex object-property is one that cannot simply be compared with the student’s solution. An example of a complex topic is the DataSource property for a form object. If the value of this object-property is a table or a query name, a simple comparison is possible. However, in many cases, the object-property value contains a Select statement.

An example of such a statement is:

SELECT CustomerNum, Title, DateDue, Count(*)
FROM Movies INNER JOIN Copies INNER JOIN Rentals
GROUP BY CustomerNum, MovieNum
ORDER BY Count, Title

In this case, even though we have only one step (one property of an object), we still have to break down the object-property value into several parts, so a comparison can be made. For this case, we split the property value into several property subvalues:

◦ SELECT CustomerNum; SELECT Title; SELECT DateDue; SELECT Count(*).
◦ FROM Movies; FROM Copies; FROM Rentals.
◦ GROUP BY CustomerNum; GROUP BY MovieNum.
◦ ORDER BY Count; ORDER BY Title.

In this way, we can specify that Movies, Copies, and Rentals are part of the FROM clause; CustomerNum and MovieNum are part of the GROUP BY clause; and so on. Each of these subvalues has an associated subvalue number. To analyse the student’s solution, PAT checks each of these property subvalues, and is therefore able to identify the student’s error in any of the clauses of the SQL statement (i.e. SELECT, FROM, GROUP BY, etc.).
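As an illustration of this decomposition (not PAT’s actual parser), the sketch below splits a Select statement into clause-level subvalues that can be compared one by one.

import re

CLAUSES = ["SELECT", "FROM", "GROUP BY", "ORDER BY"]

def subvalues(sql):
    """Return (clause, item) pairs for each item in each clause."""
    parts = []
    pattern = "|".join(CLAUSES)
    tokens = re.split(rf"\b({pattern})\b", sql, flags=re.IGNORECASE)
    clause = None
    for tok in tokens:
        if tok.upper() in CLAUSES:
            clause = tok.upper()
        elif clause:
            # split the clause body on join keywords and commas
            for chunk in re.split(r"\binner join\b", tok, flags=re.IGNORECASE):
                for item in chunk.split(","):
                    if item.strip():
                        parts.append((clause, item.strip()))
    return parts

sql = ("SELECT CustomerNum, Title, DateDue, Count(*) "
       "FROM Movies inner join Copies inner join Rentals "
       "GROUP BY CustomerNum, MovieNum ORDER BY Count, Title")
for clause, item in subvalues(sql):
    print(clause, item)   # e.g. ('FROM', 'Movies'), ('GROUP BY', 'MovieNum')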

4.3.3 Creating a New Exercise

To create a solution for a new exercise, a teaching staff member from the Databases subject first has to create an example of the form or report. Using a tool similar to the one used to capture the student’s solution, the staff member can extract the solution for the new exercise. This solution is then saved as the correct solution.

After the correct solution is saved, the staff member has to manually (for now) add the MIE code - to indicate the order in which the errors will be addressed - and the tasks and subtasks associated with the corresponding ObjectProperties. In addition, the task and subtask names, together with the weightings given to each subtask, must also be provided. Ideally, this process would be automated; this is something I will look at in the future.

Currently, PAT contains more than 30 exercises for forms and reports, both helping exercises and assignments.

4.4 The Student Model

The student model contains data representing the student’s knowledge about the domain and other personal characteristics of the student. With regard to the domain, only knowledge of the relevant objects is recorded. These objects are the ones required to provide the functionality or look of the form or report. In addition, for more accurate feedback, and because the student’s knowledge about the domain changes over time, I have to represent not only what the student knows at a particular point in time, but also when and how that knowledge changes. For this reason, the system must consider the student’s previous interactions with the system - the student’s history. Accordingly, I describe the student model in three subsections:

◦ the student’s knowledge about the domain;
◦ the student’s personal characteristics; and
◦ the student’s history.

The conceptual schema for the student model is presented in Figure 4.12. To uniquely identify a student, PAT uses a student ID. For each student, PAT records his or her first name - the student name.

4.4.1 Student’s Knowledge About the Domain

The first thing we want to represent when modelling the student is the student’s knowledge of the domain: what the student knows or does not know about each of the elements presented in the domain model. The student can know about specific topics, or the student can know specific facts, at a particular point in time. Any of the elements identified in the domain model can be a topic: object types, containers, properties, etc. For example, a student can have knowledge about textboxes, subforms or the bound column property.

A fact represents a relationship between topics, e.g. a student can know the fact that the bound column property belongs to the combobox object type. In an empty student model (before any interaction between the student and PAT), all the values for the student knows the fact or knows about the topic are set to null. During the interaction between the student and PAT, any time a topic or fact is proven or inferred to be known or not known, the values are increased or decreased accordingly. An example of updating the student model is given in Section 4.5.
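The increase/decrease behaviour can be sketched as follows. This is an illustration only: the null starting value comes from the text above, but the neutral 0.5, the step size and the [0, 1] clamping are assumptions, not PAT’s actual update rules (which are described in Section 4.5).

# Belief values for topics: None means "no evidence yet"; evidence that a
# topic is known (or not) moves the value up (or down).
knowledge = {"Subform LinkChildField": None, "Combobox BoundColumn": None}

def observe(topic, known, step=0.2):
    value = knowledge[topic] if knowledge[topic] is not None else 0.5
    value = value + step if known else value - step
    knowledge[topic] = min(1.0, max(0.0, value))   # clamp to [0, 1]

observe("Subform LinkChildField", known=False)   # an error was detected
observe("Combobox BoundColumn", known=True)      # the value was correct
print(knowledge)
# -> {'Subform LinkChildField': 0.3, 'Combobox BoundColumn': 0.7}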

4.4.2 Student’s Personal Characteristics

Personal characteristics can be categorised as global characteristics (not topic related) and local characteristics (related only to a particular topic) (Reye 2004). Each characteristic has a characteristic description presenting its meaning. The student model records a value for each of the student’s characteristics, as local characteristic values and global characteristic values. A characteristic value must be part of the allowed values for the particular characteristic. The list of personal characteristics I implemented in the student model is presented below, grouped by their type.

Global characteristics
The student:

◦ has used Access before;
◦ prefers to learn from diagrams or text; and
◦ wants to master the subject.

Local characteristics
The student claims to know how to create:

◦ simple forms; ◦ subforms; and 4.4. The Student Model 117

◦ comboboxes.

The global and local characteristics are used for adjusting the help level and for recommending a new exercise. An example of using the global characteristics is decreasing the help for a student who claims to be confident in using Access.

4.4.3 Student’s History

PAT can support many sessions of work for a student by recording a session ID and a session date. A session is regarded as a continuous period of time starting from the moment that the student starts PAT and finishing when the student closes PAT. During a session, a student can work on one or many exercises, identified by their exercise code. From a time perspective, the student’s history with PAT can be seen as having two components: a short-term history (activity during one session); and a long-term history (activity during more than one session). The short-term history, even though it is saved in the student model database, is overwritten each time the student starts a new session. In contrast, the long-term history is never deleted from the student model database.

As part of the short-term history, PAT records the student’s solution. The student’s solution is similar to the exercise solution presented in Section 4.3, but in this case, the property value is the actual value from the student’s form. Every time the student asks PAT to check his or her solution, or asks PAT for help, PAT recaptures the student’s current solution and repopulates the values for ObjectProperty and Property Value.

For the short-term history, we are interested in the errors that the student’s solution has. PAT records in the short-term history the current exercise (exercise code) and the current (last) error. An error is identified by the MIE code, for the current exercise, of the object-property which is not consistent with the correct solution. PAT uses the most recent MIE code during the current session to detect whether or not the student is progressing.

For the long-term history, PAT records which exercises the student has worked on and which exercises were successfully completed. PAT also records all its interactions with the student - all the actions the student performed using PAT’s interfaces. Every button in any of PAT’s interfaces has an action code. By storing the action code for a session ID, exercise code and MIE code, we have a history of all the actions that the student performed with PAT, and of what the most important error was at each time. In addition, PAT records all the advice given to the student. For each object-property there is an advice code identifying an advice. Besides the advice code, PAT records the advice type, which shows how the student asked for help - i.e. using the buttons “what is wrong” or “how to fix”. More information about the advice code and the advice type is presented later, in Section 4.6.

Figure 4.12: Student model conceptual schema (the Student entity with its ID and name; sessions with ID and date; exercises worked on and completed; the student’s solution, errors and MIE codes; the advice received, with advice code and advice type {What, MoreWhat, How, MoreHow}; the actions performed; and the values recorded for topics, facts, and local and global characteristics with their allowed values and descriptions).

4.5 Updating the Student Model

PAT updates the student model when the user:

◦ starts working on a new exercise;
◦ stops working on an exercise; and
◦ asks PAT to recommend a new exercise.

In addition, PAT updates the student model every time it checks the student’s solution.

If no error is detected, PAT records what the student has shown he or she knows (all the object-properties checked in the exercise) and that the exercise was successfully completed. If at least one error is detected, then PAT will record:

◦ what the student has shown he or she knows;
◦ what the most important error was; and
◦ what advice was given.

To better illustrate how PAT records what the student does, or does not, know, an example of a solution graph and the results of two consecutive checks of a student’s solution are presented in Figure 4.13.

The left side of Figure 4.13 shows the solution graph for an exercise, including the MIE code - the most important error. The right side of the diagram shows the results of comparing the solution of a student with the correct solution, at two consecutive checks of the solution, at times T1 and T2 respectively.

Figure 4.13: Solution graph and the correctness of a student solution (the object-properties with MIE codes 1 to 14; at time T1, codes 1-8 are Correct and marked as known, code 9 is Incorrect and codes 10-14 are not relevant; at time T2, codes 9-13 are Correct and marked as known, and code 14 is Incorrect).

At time T1, the first (smallest) MIE code where the student’s solution is incorrect is 9. PAT still checks and records in a temporary table the rest of the object-properties and their values. However, at time T1, the feedback addresses only the object-property corresponding to the most important error. After receiving feedback, the student continues to work on the exercise. After some progress, the student is stuck again, and asks once more for help, at time T2. PAT detects that now the first incorrect MIE code is 14. This means that all the object-properties with a MIE code less than 14 are now correct. Below, I present the way that PAT interprets the student’s progress between times T1 and T2. PAT considers that the student does not know about an object-property which, in the student’s solution, has a value different from the value in the correct solution. However, in the student model, PAT will only record as not known the object-property with the lowest value for the associated MIE code.

Note that in Figure 4.13, some of the ObjectProperties are marked at T1 as “not relevant”. This is because once PAT identifies the most important error, the less important errors are not relevant at that time and will not be recorded. The reason for not recording the other object-properties (those marked as not relevant) with wrong (or missing) values in the student’s solution is that the student may simply not have started to work on those object-properties yet. To eliminate any possible misunderstanding: the topics marked as not relevant are not relevant only at time T1; they are relevant (and therefore considered) at time T2. For the example in Figure 4.13, at time T1, PAT records as not known the object-property with MIE code 9, while at time T2, PAT records as not known the object-property with MIE code 14. The object-properties considered known are those where the value from the student’s solution is identical to the value from the correct solution. However, PAT only records those from the previous most important error up to the current most important error. In this way, the object-properties with MIE codes 1 to 8 are recorded as known only once (at time T1), and the object-properties with MIE codes 9 to 13 only once, at time T2.
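The update rule just described can be sketched as follows; the function and data shapes are hypothetical, but the rule - mark as known only the codes from the previous MIE up to the new one, and as not known only the new MIE - follows the example above.

def update_history(prev_mie, new_mie, history):
    # mark codes [prev_mie, new_mie) as shown-known at this check
    for code in range(prev_mie, new_mie):
        history[code] = True
    history[new_mie] = False       # the new most important error

history = {}
update_history(1, 9, history)      # check at T1: codes 1-8 correct, 9 incorrect
update_history(9, 14, history)     # check at T2: codes 9-13 correct, 14 incorrect
print(sorted(code for code, known in history.items() if known))
# -> [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13]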

4.6 The Tutoring Model

The Tutoring Model provides information for generating or selecting the appropriate instructional action. The tutoring model must be able to take advantage of the information provided by the student model (e.g. the student’s learning style and personal characteristics). The tutoring model gives the student feedback while working on exercises involving forms or reports. To create a new form, several goals can be identified, starting with the first goal: “create the main form”. Other goals, such as creating the required objects or setting the appropriate characteristics for the forms, contribute to the main objective of completing the exercise. Each goal is identified by a goal ID, and every goal has a goal description. Some examples of goals are presented in Table 4.4.

Goal ID  Goal                                Goal Description
1        create the main form                First step is to create a new form object.
2        create the objects in the main      A form can hold several types of objects such as:
         form                                textbox, combobox, buttons, even other forms (subforms).
3        create the ?frmName subform         A form can represent a one-to-many relationship by having
                                             another form (a subform) within the main form.
14       have the appropriate source of      To be able to present information from the database, a form
         data for the ?sfName subform        must have a defined source of data. This can be a table,
                                             a query, or an SQL statement.
21       have the required characteristics   There are some characteristics that can be specified
         for the ?frmName form               for a form (e.g. layout).
115      calculate the correct total for     Record-by-record or group-by-group totals in a report
         the report                          can be calculated.

Table 4.4: Examples of goals and their description.

For some goals, the goal text contains parameters such as ?frmName or ?sfName for object names. When the goal text is used to give feedback to the student, the parameters are replaced with the real object names. As an example, the goal “have the appropriate source of data for the ?sfName subform”, when displayed to the student, might become “have the appropriate source of data for the CustomerRentedMoviesSF subform” (a small sketch of this substitution is given after Figure 4.14).

Moreover, PAT records in its tutoring model the actions that a student must perform in order to achieve a particular goal. As an example, for the goal of having the required main form, two actions must be performed: create a new form, and give the correct name to the new form. An action has an action ID and an action description, e.g. “define the data source for the form” or “add a Subform object to a form”. Some of the actions have more readings associated with them, which indicate slides from the lectures or pages from the recommended textbook.

Furthermore, some actions have associated diagrams or images which describe the context of the action as part of the goal. Some examples of actions with their descriptions are presented in Table 4.5, while Figure 4.14 shows the diagram for the action with ActionID 6 - “add a Subform object to the form”.

ActionID  Action description                    More readings
1         create a new Form object              A&F book: AC 154, AC 269
2         give a name for the Form
3         set the default view of a form        lecture 8: slides 7-8, 63
4         give a name for the Combobox
5         define the data source for the form   lecture 8: slides 9-16
6         add a Subform object to the form      lecture 8: slides 50-56; A&F book: AC 172, AC 302

Table 4.5: Examples of actions and their description.

PAT records the links between actions (tutoring model) and object-properties (the domain model), which are also referred to in the exercises model. More explanation of how PAT links a topic from an exercise to the appropriate goal and action is given in Section 5.3.2.
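To make the parameter substitution described above concrete, here is a minimal C++ sketch; the function name and signature are illustrative assumptions, not PAT's actual code.

#include <iostream>
#include <string>

// Replace every occurrence of a goal-text parameter (e.g. "?sfName")
// with the real object name before the goal is shown to the student.
std::string instantiateGoal(std::string goalText, const std::string& param,
                            const std::string& objectName) {
    for (std::string::size_type pos = goalText.find(param);
         pos != std::string::npos;
         pos = goalText.find(param, pos + objectName.size()))
        goalText.replace(pos, param.size(), objectName);
    return goalText;
}

int main() {
    std::cout << instantiateGoal(
        "have the appropriate source of data for the ?sfName subform",
        "?sfName", "CustomerRentedMoviesSF") << "\n";
    // -> have the appropriate source of data for the
    //    CustomerRentedMoviesSF subform
}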

Figure 4.14: An example of a diagram associated with an action ("Add a Subform object to a Form": a form can hold many different controls; one of them is the Subform control, which holds another form).

As presented in Section 4.4.3, there is an advice code associated with each topic. PAT can have several hint texts for each advice code. The advice text depends on the advice type. The advice type refers to the type of feedback the student asks for: "what is wrong with my solution" and "how can I fix this". Each of these types of feedback is further divided into an initial, general feedback and a more detailed explanation. For this reason, the allowed values for the advice type are: "what", "more what", "how", "more how". In some situations, based on both the student's short-term history and long-term history, the hint text can have an additional text. For example, if the student received some hints for the same advice in the past, the additional text can be a reminder like "you should remember that" followed by the hint text. On the other hand, if the student continues to ask for advice too many times for the same error and in the same session, we should try to avoid help abuse. In such a case, the additional text could be "Try a bit harder! Ask for help ONLY if you need it". More specific details on how hints are generated are given in the next chapter. The conceptual schema for the tutoring model is presented in Figure 4.15.

Figure 4.15: Tutoring model conceptual schema (an ORM diagram relating goals, actions, topics, advice codes, advice types {'What', 'MoreWhat', 'How', 'MoreHow'}, hint texts, additional texts, diagrams and more readings).

4.7 Summary

This chapter describes the four models that I created for PAT: the domain, exercises, student and tutoring models. For each of the models, the conceptual schema is presented as an ORM diagram. As an example of how the domain model could be used for other database environments that support RAD tools, I described the similarities between Microsoft Access and OpenOffice Base. The next chapter explains the way that PAT analyses the student's solution and then generates individualised feedback.

Chapter 5

Giving Feedback to the Student

An education isn’t how much you have committed to memory, or even how much you know. It’s being able to differentiate between what you do know and what you don’t.

- Anatole France (1844 - 1924)

This chapter describes how the Instructional Expert that I designed for PAT works. It presents the two stages that PAT goes through to provide the individualised feedback to the student: the analysis and the synthesis. It describes the hint levels used in PAT and presents the pseudo code for the high-level functions for the two stages. It also presents the mechanism I designed to propose a new (the next) exercise to a student.


5.1 Introduction

In Section 2.2.3 I presented the types of feedback that other ITSs provide. PAT provides feedback on demand and can either indicate the error (what is wrong) or what to do next (how to fix it). In addition, PAT also provides other useful information, such as more readings related to the error. To provide feedback to the student, PAT has to go through two important stages: analyse the student's solution; and then synthesise a response to the student. In the analysis stage, PAT gathers all the information related to the most important error in the student's solution. With that information, in the synthesis stage, PAT generates the individualised feedback addressing this most important error.

In Access, the characteristics and functionality of a form can be achieved by following different development paths. Certain actions must be done in a particular order (e.g. before setting some properties for an object, the object itself must exist) but other actions can be done in any order (e.g. if five textboxes must be created, the order in which they are created does not matter). At the same time, although some actions can be done in any order, there are best practices that should be followed. When a student asks for help, it is because he or she is struggling with a particular task in an exercise. If the student has not yet started all the tasks, PAT will detect not just the task that the student is struggling with, but the incomplete tasks as well. By following best practices for developing an Access application, PAT could then suggest that the student fix a different error first. An example of this: although a student would be expected to finalise the basic functionality of a form first (e.g. adding a combobox), the student could try to finish the required characteristics of the form first (e.g. set the correct values for scroll bars). Obviously, if PAT only gave advice on the error that should be fixed first, the error that the student is trying to deal with would never be addressed.

To be flexible enough to handle different approaches, not just the optimal path, PAT can analyse the student's solution for a particular object only, rather than the entire form. The student can specify what object he or she is working on, and ask for help only for that particular object. PAT will then only consider the properties and MIE for that object, still using the same overall analysis and synthesis approach. In the following sections, I describe how I designed the two stages (analysis and synthesis) that PAT goes through to provide hints to a student, and also how PAT recommends an exercise.

5.2 Analysis of the Student’s Solution

The first step of the analysis stage is to capture the student's solution. One of the problems that PAT has to deal with at that point is to correctly identify the objects in the student's solution. To encourage the students to use best practices for creating forms and reports in Access (as described in Section 4.3.2), all the important objects in the form or report have required names which the student must use (part of the exercise requirements). However, not using the correct names can itself be one or more errors in the student's solution, so PAT must map the objects in the student's solution to those in the correct solution without relying on correct names.

Mapping the objects from the student's solution to the correct solution is a very important task because many objects in the form or report can be of the same type, e.g. textboxes. There would be false error reporting if they were incorrectly mapped. One of the difficulties of this process resides in the variety of object types that can be created in forms and reports. Different types need different properties to be used for mapping, e.g. using the caption property to map labels, the bound column property for comboboxes, etc.

After the student's solution is mapped to the correct solution, a proper analysis can begin. In this analysis stage, PAT checks whether the student's solution contains any errors, and if it does, it will detect the type of the error (omission or commission). Pseudo Code 1 presents the mechanism that PAT uses to analyse the student's solution. More explanation about the types of error follows the pseudo code.

Pseudo Code 1 Analyze the student's solution.

//==========================================================
// Procedure Name: AnalyzeStudentSolution
// Scope: Capture then analyze the student's solution
//==========================================================

// Capture the student's solution
call CaptureStudentSolution

// Map the student's solution to the correct solution
call MapStudentSolution

// Get the most important error (if any) and the error type
call GetMIEomission returning MIEomission
    // MIEomission is the most important error of omission
call GetMIEcommission returning MIEcommission
    // MIEcommission is the most important error of commission

// If no error is detected
if IsNull(MIEomission) and IsNull(MIEcommission) then
    // record the exercise in the exercise history as completed
    call UpdateExercisesHistory with ExerciseCode
    // and set the MIEcode = 0
    set MIEcode = 0    // MIEcode is the Most Important Error code
    exit
end if

// Check if it is an error of omission or commission
// (the lower MIE code is the more important error; a null value
// is treated as larger than any code)
if MIEomission < MIEcommission then
    set MIEcode = MIEomission
    set ErrorType = 'omission'
else
    set MIEcode = MIEcommission
    set ErrorType = 'commission'
end if
exit
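To illustrate the MapStudentSolution step called above, the following simplified sketch matches objects by their type plus an identifying property instead of by name. The data structures are assumed for the illustration; PAT's actual mapping also chooses a different identifying property per object type (e.g. the caption for labels) and handles the objects that remain unmapped.

#include <cstddef>
#include <iostream>
#include <map>
#include <string>
#include <vector>

struct Control {
    std::string name;      // internal name, possibly wrong in the student's form
    std::string type;      // "Label", "TextBox", ...
    std::string identity;  // identifying property value, e.g. a label's caption
};

// Map object names in the correct solution to names in the student's solution.
std::map<std::string, std::string>
mapObjects(const std::vector<Control>& correct,
           const std::vector<Control>& student) {
    std::map<std::string, std::string> mapping;
    std::vector<bool> used(student.size(), false);
    for (const Control& c : correct)
        for (std::size_t i = 0; i < student.size(); ++i)
            if (!used[i] && student[i].type == c.type &&
                student[i].identity == c.identity) {
                mapping[c.name] = student[i].name;  // one-to-one match
                used[i] = true;
                break;
            }
    return mapping;  // correct objects left unmapped are reported as missing
}

int main() {
    std::vector<Control> correct = {{"lblTitle", "Label", "Title"},
                                    {"txtTitle", "TextBox", "Title"}};
    std::vector<Control> student = {{"Text12", "TextBox", "Title"},
                                    {"Label3", "Label", "Title"}};
    for (const auto& [from, to] : mapObjects(correct, student))
        std::cout << from << " -> " << to << "\n";
}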

The first thing we want to know when analysing the student’s solution is whether there is an error. The student’s solution can have: no errors; only one error; or more than one error. If no errors are detected, that means that the current exercise is correctly completed. If errors are detected, then we want to know the type of the errors (omission or commission) because different tutoring approaches are taken if something is simply missing or is wrongly done. If more than one error is detected, it is also important to know which is the most important error. With Access as the domain to be taught, we can have the following types of errors (also presented in Figure 5.1):

◦ Errors of omission:

− Missing form or report
− Missing object
− Wrong property value

◦ Errors of commission:

− Wrong object type
− Wrong property value

Figure 5.1: Types of errors. An error can be an error of omission (subtypes: missing form or report; missing object; wrong property value) or an error of commission (subtypes: wrong object type; wrong property value).

If the student's solution contains more than one error, then only the most important error - the error with the lowest MIE code - will be considered. Because PAT provides help on request (not only at the end of the exercise), there is no reason to give advice on all the existing errors at the same time, but only on one error at a time. The student can ask for help again if he or she needs it. Moreover, the students have available an option to see an overall view of the correctness of their solution - described in the next chapter.

5.3 Synthesis: Generating the Feedback

The next stage is the synthesis stage: generating the feedback to the student. Synthesis relates to the error that PAT detected: either the only error, or the most important error in the student's solution. To simplify the text, for the rest of the chapter, I will refer to the error for which PAT provides feedback as the most important error or simply MIE - regardless of whether it is the most important error, or the only error, or the error for the entire solution or just for a specific object. PAT, like other ITSs, provides help to the student at several levels of detail - from general to more specific. In addition, PAT has the ability to generate a hint based on the exercise's goals, the actions that would achieve those goals, and the property descriptions related to the error, all of which exist in PAT's knowledge base - as explained in detail in Section 5.3.2.

5.3.1 Hint Levels

PAT can provide hints in two categories: hints to indicate what the error is, or hints on how to fix the problem. The hints have two levels of detail for each of the categories previously mentioned. Below are the hint and advice types that PAT offers:

◦ what is the error (What - basic level)
◦ more help on what is the error (What - detailed level)
◦ how to fix the error (How - basic level)
◦ more help on how to fix the error (How - detailed level)

Figure 5.2 presents the hint levels (from general on the left side, to more specific on the right side) for both advice types, and how the hint text is generated.

Figure 5.2: Generating the hint text for an advice type. Hint levels run from 0 (general) to 4 (specific): What - basic level (goal); What - detailed level (goal description); How - basic level (goal description + action); How - detailed level (property description); correct value (property value).

The first level of detail only presents to the student the goal that relates to the error. The second level of detail, presenting a more specific hint, contains the description of the goal - see the first two rows in Table 5.1. By presenting the goal and goal description, the student only receives information about what the error is. The next two levels of hint also provide information on how to fix the error. To provide a helpful level 2 hint to the student, as well as describing the action that the student should do to fix the error, we should also provide the background of the problem. For this reason, and because in some circumstances PAT can provide a level 2 hint the first time the student asks for help, the goal description is included in the hint. Level 3 (the most detailed hint) provides a hint based on the property description for the most important error. Because PAT can be used with assignments, it does not provide the student hints with a greater level of specificity for an assignment. Level 4, which is the correct solution, is only provided for non-assignment exercises. As an example, the hints for an error of commission for the RecordSource property of a form are presented in Table 5.1.

Hint Level                  | Example
What - basic (level 0)      | You don't have the appropriate source of data for the CustomerRentedMovies Form.
What - detailed (level 1)   | To be able to present information from the database, a form must have defined a source of data. This can be a table, a query, or an SQL statement.
How - basic (level 2)       | If you want to have the appropriate source of data for the CustomerRentedMovies Form, you should define the data source for the form.
How - detailed (level 3)    | There is a property used to specify the source of the data for a form.
Correct value (level 4)     | You should set the value for the RecordSource property of the CustomerRentedMovies Form to "Customers".

Table 5.1: Examples of hint texts for an error of commission.
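Read as templates, the hint levels in Figure 5.2 and Table 5.1 suggest a simple assembly rule. The sketch below is my own illustration of that rule; the struct fields and linking words are assumptions, not PAT's actual code.

#include <iostream>
#include <string>

struct ErrorInfo {
    std::string goal;                // e.g. "have the appropriate source of data..."
    std::string goalDescription;     // background text for the goal
    std::string action;              // e.g. "define the data source for the form"
    std::string propertyDescription; // text about the property involved
    std::string correctValue;        // level 4, non-assignment exercises only
};

std::string hintText(const ErrorInfo& e, int level) {
    switch (level) {
        case 0:  return "You don't " + e.goal + ".";              // What - basic
        case 1:  return e.goalDescription;                        // What - detailed
        case 2:  return "If you want to " + e.goal +              // How - basic
                        ", you should " + e.action + ".";
        case 3:  return e.propertyDescription;                    // How - detailed
        default: return e.correctValue;                           // correct value
    }
}

int main() {
    ErrorInfo e{"have the appropriate source of data for the "
                "CustomerRentedMovies Form",
                "To be able to present information from the database, a form "
                "must have defined a source of data.",
                "define the data source for the form",
                "There is a property used to specify the source of the data "
                "for a form.",
                "Set the RecordSource property to \"Customers\"."};
    std::cout << hintText(e, 0) << "\n" << hintText(e, 2) << "\n";
}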

In the next section, I explain how PAT generates the hints presented in Table 5.1, using the goal and action corresponding to the error that PAT addresses, and how the hint is further individualised.

5.3.2 High Level Pseudo Code for Synthesis

The process that I designed to generate a hint is presented in the pseudo code below, and described more fully in the text that follows. Pseudo Code 2 presents the high level structure of the code for the synthesis. The entire synthesis stage is executed only when there is an error in the student's solution. If no errors are detected, the student is congratulated and starting a new exercise is suggested. Because updating the student model occurs in various circumstances, there is not just one procedure to update the student model, but rather several smaller, more specific procedures, called by the high-level procedure shown in Pseudo Code 2.

Pseudo Code 2 Synthesis: generating the feedback.

//==========================================================
// Procedure Name: SynthesisFeedback
// Scope: High level procedures for synthesis
//==========================================================

// Run the whole process of synthesis only if there is at least
// one error in the student's solution
if MIEcode = 0 then
    // there is no error in the student's solution
    set HintText = 'Congratulations! That''s it with this exercise. ' &
                   'You can go back to the My Profile window and select another exercise.'
    display HintText
    exit
end if

// Call the synthesis high level procedures
//==========================================================

// 1. Adjusting the hint level
call AdjustHintLevel

// 2. Get the relevant information about the error
call GetInfoAboutMIEcode

// 3. Preparing the text for a hint
call PrepareHint

// 4. Improving the text for the hint
call ImproveHintText

// 5. Update global variables
call UpdateVariables

display HintText

exit

A description of the above procedures follows next.

1. Adjusting the Hint Level

The first synthesis procedure is AdjustHintLevel. This procedure looks at the student's history, personal characteristics and other factors in order to automatically adjust the hint level. This means that even if the student asks for help for the first time for a particular issue (error), PAT could give a higher level of hint, rather than Level 0 (shown in Figure 5.2). The hint that PAT will prepare can be individualised for a particular student based on several categories of information from the student model. The factors that can influence PAT's hints are presented next.

◦ Factors that make the hint more specific:

− Student asks for help again for the same error.
− Student is not confident about using Access.
− Student is not confident with the topic where the error occurred.

◦ Factors that make the hint more general:

− Student is confident about using Access.
− Student is confident with the topic where the error occurred.
− Student wants to know everything.

Based on these factors, PAT can decide the following.

◦ Should feedback be only on what is wrong or on how to fix it?
◦ What should be the precise level of specificity?
◦ Should we add any reminders because the student has already had that kind of feedback before?
◦ Should we try to stop help abuse?

The pseudo code for AdjustHintLevel (Pseudo Code 3) is presented next, followed by a more detailed description of the procedure.

Pseudo Code 3 Adjusting the hint level.

//==========================================================
// Procedure Name: AdjustHintLevel
// Scope: Adjusting the hint level
//==========================================================

// A. Information about the student's history
//==========================================================

// Check if the student asked for help for the same error or a new error
if MIEcode = gPreviousMIEcode then
    // gPreviousMIEcode is a global variable storing the MIEcode
    // for later checks
    call UpdateStudentDoesNotKnow
    increment HintLevel   // HintLevel is the level of hint provided
    exit
end if

// Update the student model with what the student
// demonstrated to know
call UpdateStudentKnows with MIEcode

// Update the student model with what the student got wrong
call UpdateStudentDoesNotKnow with MIEcode

// B. Information about the student's preferences
//==========================================================

if (StudentNOTconfidentWithAccess) or (StudentNOTinterestedInTopic) then
    increment HintLevel
end if

if (StudentInterestedInAccess) and (HintLevel > 0) then
    decrease HintLevel
end if

exit

A. Information about the student's history

PAT will check in the short-term history whether the most recent MIE refers to a new topic or to the same topic as in the previous request for help. If the request is for the same topic, that means the student is still stuck. PAT will check what the hint level in the previous feedback was and will increase it to the next level. If the student asks for help and PAT detects that the MIE is not the same as the previous one, further investigation is required. From the long-term history, PAT checks how many times (if any) the student:

◦ correctly solved a similar task;
◦ made the same error in previous sessions; and
◦ made the same error in the same (current) session.

B. Information about the student's preferences

As PAT records the student's preferences as both global and local characteristics, in the analysis stage PAT will retrieve all the global characteristics (not related to a topic) and the local characteristics related to the most important error. As an example, if the most important error is about a subform or a combobox, then PAT will check if the student claimed to know how to create subforms or comboboxes.

In addition to adjusting the hint level, at this stage PAT will also update the student model. If the student asks for help again for the same error, PAT will record that the student is still stuck after the previous hint. If the student asks for help for a different topic, then PAT will record not just the current error (MIE) but also the topics correctly solved between the previous error and the current one - as already presented in Section 4.5.

2. Get the Relevant Information About the Error (ref: Pseudo Code 2)

Firstly, PAT looks at the type of the object with the MIE code (as part of the analysis stage, PAT has already determined the MIE code). In the correct solution of an exercise, each MIE code is associated with an object-property. PAT's knowledge base also contains the goal associated with each of the object-properties in the correct solution. In this way, PAT knows which goal the student is trying to achieve. Secondly, PAT knows the actions that a student should perform to achieve the goal. The actions, however, are also related to object-properties. As a result, from the actions associated with the goal that the student wants to achieve, PAT chooses the action that refers to the same object-property as the MIE code. In addition, PAT also needs to know if the MIE refers to a simple or a complex object-property. If the error refers to a complex object-property, then PAT will also get the type of the problem - described in more detail in the next section. Pseudo Code 4 presents the sub-procedures that are part of the GetInfoAboutMIEcode procedure.

Pseudo Code 4 Get relevant information about the error.

//==========================================================
// Procedure Name: GetInfoAboutMIEcode
// Scope: Get relevant information about the error
//==========================================================

// get the object type for the MIE code
call GetObjectType with MIEcode returning ObjectType

// detect the goal that the student tries to achieve
call GetObjectProperty with MIEcode
    returning GoalID, ObjectName, PropertyName

// detect the action that can achieve the goal
call GetActionID with GoalID, ObjectName, PropertyName
    returning ActionID

// get the type of object-property (simple or complex)
call GetComplexity with MIEcode returning Complexity

// get the problem type - if it is a complex object-property
if Complexity = True then
    call GetProblemType with MIEcode returning ProblemType
    // ProblemType is the type of the problem for
    // a complex object-property
else
    set ProblemType = Null
end if

exit

3. Preparing a Hint (ref: Pseudo Code 2)

This procedure prepares the hint text based on the object type, the error type and the complexity of the object-property involved in the error. The pseudo code is followed by a description of its purpose.

Pseudo Code 5 Prepare the hint text.

//==========================================================
// Procedure Name: PrepareHint
// Scope: Prepare the hint text
//==========================================================

if Complexity = False then   // it is a simple object-property
    if ContainerIsMissing then
        call PrepareHintTextForContainerMissing returning HintText
        // HintText is the actual text for the hint
    else
        if ErrorType = omission then
            call PrepareHintTextForErrorOfOmission returning HintText
        else   // if the MIE is an error of commission
            call PrepareHintTextForErrorOfCommission returning HintText
        end if
    end if
else   // if it is a complex object-property
    call PrepareHintTextForComplexObject with ObjectType, ProblemType
        returning HintText
end if

exit

As an example, consider the case where the MIE code detected in the correct solution is 2, and the object-property is (Form, RecordSource). The goal associated with MIE code 2 in this exercise is "have the appropriate source of data for the ?frmName Form" (where ?frmName is a parameter). In addition, the action associated with this goal, and also with the (Form, RecordSource) object-property, is: "define the data source for the form". For this example, by adding the linking words to create a proper sentence for each of the hint levels, the hint text will be as described in Table 5.1 in Section 5.3.1. However, Pseudo Code 5 has a few distinct paths that can be followed to prepare a hint, based on the following aspects.

A) Is the container (the form or report) missing?

B) Is the error an error of omission or commission?

C) Is the error part of a complex object property?

A) Hints about containers

A special case of dealing with errors about containers is the case when the container is missing. If PAT cannot find a container with the expected name, it will search through the existing containers (forms or reports). PAT searches for a container with a name similar to the required one, in case the student misspelled the name. If PAT finds a similar name, it will let the student know that a container with a different name was found, and it will analyse that container. If PAT cannot find a similar name, the hint will state that PAT could not find the container.

B) Errors of omission and commission

A reminder that errors of omission refer to missing information: missing objects or properties not set. Errors of commission are errors where the information required is present (not missing) but is incorrect. In the case of Access, we can have properties for objects set to incorrect values. However, Access has default values for some properties, e.g. the default value for ScrollBars is set to 'Both' for a new form. In many cases, having no scroll bars for a form will give the form a better look. For this reason, in some exercises, PAT asks the students to hide the scrollbars. This can be achieved by setting the ScrollBars property to 'Neither'. When a property is set to a wrong value ('Both' in this case), we do not know for sure whether the student set the property to the wrong value (an error of commission) or the student did not set the property at all and it is just set to the default value (this kind of problem can only occur with properties, not with objects). For this reason, PAT handles errors of commission and omission in the same way for wrong or missing properties.

Hints about containers (the forms and reports themselves) are a special case. As a reminder from the previous chapter, the solution for an exercise contains information in the form of (container name, object name, property name, property value). The container also has some properties of its own: (container name, property name, property value). PAT will detect whether the error is about a property of the container and will follow an appropriate path to deal with that.

C) Hints for a complex object-property

The previous example used the ScrollBars property for a form. The values that Access allows for the ScrollBars property are: 'Both'; 'Horizontal Only'; 'Vertical Only'; and 'Neither'. Obviously, it is not a complicated task to check whether the student's solution is identical to the correct solution. In contrast, complex object-properties are object-properties for which the value of the property cannot be simply compared between the correct solution and the student's solution. An example of such an object-property is the RecordSource property for a form. A RecordSource can contain any of: a table name; a query name; or a select statement.

An example of a Select statement from the correct solution, similar to the example from Section 4.3.2, is:

SELECT MovieNum, Title, YearReleased FROM Movies ORDER BY Title

Comparing this select statement with the statement from the student's solution cannot be done by looking for an exact match of the complete value of the property - the select statement. The solution is to further break down the value of the property into meaningful parts: select, from and order by. These parts are recorded in PAT as types of problems for complex object-properties. Because PAT's focus is on creating forms and reports, and analysing complicated select statements is a very complex task, PAT's hints will only indicate the wrong or missing part, e.g. "you don't have the correct tables in the source of data for the CustomerRentedMovies Form".
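A minimal sketch of this clause-by-clause comparison is shown below. It is my own illustration, not PAT's code: it assumes a well-formed statement of the form SELECT ... FROM ... [ORDER BY ...] and ignores the many SQL constructs a real comparison would have to handle.

#include <algorithm>
#include <cctype>
#include <iostream>
#include <sstream>
#include <string>
#include <vector>

static std::string toLower(std::string s) {
    std::transform(s.begin(), s.end(), s.begin(),
                   [](unsigned char c) { return std::tolower(c); });
    return s;
}

// Split a comma-separated list into trimmed, lower-cased, sorted items
// so that comparison ignores case, spacing and ordering.
static std::vector<std::string> splitList(const std::string& s) {
    std::vector<std::string> items;
    std::stringstream ss(s);
    std::string item;
    while (std::getline(ss, item, ',')) {
        std::size_t b = item.find_first_not_of(" \t");
        std::size_t e = item.find_last_not_of(" \t");
        if (b != std::string::npos)
            items.push_back(toLower(item.substr(b, e - b + 1)));
    }
    std::sort(items.begin(), items.end());
    return items;
}

struct SelectParts {
    std::vector<std::string> select, from, orderBy;
};

// Naive parser for "SELECT ... FROM ... [ORDER BY ...]".
static SelectParts parseSelect(const std::string& sql) {
    SelectParts p;
    std::string low = toLower(sql);
    std::size_t fromPos = low.find(" from ");
    std::size_t orderPos = low.find(" order by ");
    if (fromPos == std::string::npos) return p;
    p.select = splitList(sql.substr(7, fromPos - 7));  // skip "SELECT "
    std::size_t fromEnd = (orderPos == std::string::npos) ? sql.size() : orderPos;
    p.from = splitList(sql.substr(fromPos + 6, fromEnd - fromPos - 6));
    if (orderPos != std::string::npos)
        p.orderBy = splitList(sql.substr(orderPos + 10));
    return p;
}

// Report the first clause that differs, mirroring the "problem types"
// recorded for complex object-properties.
static std::string firstDifference(const SelectParts& correct,
                                   const SelectParts& student) {
    if (student.from != correct.from) return "tables";
    if (student.select != correct.select) return "fields";
    if (student.orderBy != correct.orderBy) return "ordering";
    return "";
}

int main() {
    SelectParts correct = parseSelect(
        "SELECT MovieNum, Title, YearReleased FROM Movies ORDER BY Title");
    SelectParts student = parseSelect(
        "SELECT MovieNum, Title FROM Movies ORDER BY Title");
    std::string diff = firstDifference(correct, student);
    if (!diff.empty())
        std::cout << "You don't have the correct " << diff
                  << " in the source of data for the form.\n";
}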

4. Improving the Hint Text (ref: Pseudo Code 2)

After the hint text is prepared, PAT checks if any improvements to the text can be made. Pseudo Code 6 presents the ImproveHintText procedure, followed by a description of its purpose.

Pseudo Code 6 Improving the hint text.

//==========================================================
// Procedure Name: ImproveHintText
// Scope: Improving the hint text
//==========================================================

if MIEcode = gPreviousMIEcode then
    set AdditionForText = 'You should remember that: '
end if

if StudentAlreadyAskedForSameErrorInCurrentSession then
    set AdditionForText = 'Try a bit harder! Ask for help ONLY if you need it.'
end if

if MaximumHintLevelReached = True then
    set AdditionForText = 'That''s all I can tell you about this error: '
end if

append AdditionForText to HintText

exit

To improve the hint text, two aspects are important: remind the student about previous hints (if any) and warn the student not to overuse the help.

If from the student's history we detect that the same advice was given to the student in some previous sessions, then a reminder will be added to the hint text, e.g. "You should remember that: " followed by the actual hint text.

If the student asks again for help for the same error in the same session, a warning will be given in addition to the hint, e.g. "Try a bit harder! Ask for help ONLY if you need it."

Furthermore, if the hint level reaches the maximum value, PAT warns the student: "That's all I can tell you about this error:" followed by the hint text.

5. Update Global Variables (ref: Pseudo Code 2)

After the hint is prepared and improved, PAT updates the advice history. PAT also records the error that the student had and the hint level for the feedback the student received. In addition, some global variables are updated, as they are required at the beginning of the synthesis stage:

◦ the current MIEcode will be stored as gPreviousMIEcode
◦ the current hint level will be stored as gPreviousHintLevel

Pseudo Code 7 Update global variables.

//==========================================================
// Procedure Name: UpdateVariables
// Scope: Update global variables
//==========================================================

// Update advice history
call UpdateAdviceHistory with MIEcode, HintLevel

// Set variables
set gPreviousMIEcode = MIEcode
set gPreviousHintLevel = HintLevel

exit

5.4 Propose a New Exercise

From the student's history, we can see which topics are recorded as understood or not understood by the student at the current time. Based on this information, and applying the teaching principles described below, PAT can propose to a student the exercise which will try to maximise his or her learning outcome. A student can ask PAT to propose an exercise at any time: when the student starts a new session or when the student gets stuck during an exercise. If the student gets stuck, he or she can ask for a recommendation for a more suitable exercise - one according to his or her skills.

5.4.1 Considerations when Recommending a New Exercise

The teaching principles that PAT follows when proposing a new exercise are:

◦ Expand the user's knowledge: the exercise should contain topics not yet understood; the student should learn/tackle something new.
◦ Master a concept: it is more beneficial for the student to learn all the associated topics for a concept before starting a new topic.

Additionally, and in accordance with the teaching theories discussed in Section 2.2.3, PAT also checks the following aspects.

◦ The exercise must contain the object-property that PAT finds should be addressed.
◦ Do not propose an exercise that contains the same topic that forced the student to abandon the previous exercise; it can be quite de-motivational.
◦ Do not propose a recently abandoned exercise (including the current one).
◦ Do not propose an exercise previously completed.

◦ The student should know the prerequisites for the chosen topic.
◦ If several exercises match the above requirements, propose the easiest one.

To recommend a new exercise, PAT uses a table stored in short-term memory (as presented in Section 4.2.1) containing information about the abstract topics. The table records information about:

◦ the total number of object-properties that belong to an abstract topic;
◦ the number of object-properties that the student has shown that they know; and
◦ the number of object-properties that the student has shown that they do not know.

Table 5.2 presents an example of this table, containing information for an exercise for learning about forms. The table shows the number of object-properties that the student has shown that they know or do not know at a particular point in time - the time when he or she asks PAT to recommend a new exercise.

ObjectName          | Property | CountAll | CountYes | CountNo
AbstractSubform     | Abstract | 4        | 0        | 1
Comboboxes          | Abstract | 7        | 0        | 0
CommandButtons      | Abstract | 2        | 0        | 0
CompleteForm        | Abstract | 2        | 0        | 0
FormAppearance      | Abstract | 7        | 0        | 2
FormCharacteristics | Abstract | 1        | 0        | 0
FormObjects         | Abstract | 2        | 0        | 0
FormProperties      | Abstract | 2        | 0        | 0
LinkingProperties   | Abstract | 3        | 0        | 0
MainForm            | Abstract | 4        | 0        | 1
ObjectsForFields    | Abstract | 1        | 0        | 1
OtherObjects        | Abstract | 3        | 0        | 0
TextBoxes           | Abstract | 1        | 0        | 1

Table 5.2: Example of the CountKnowsForTopics table.

The CountAll column represents the total number of object-properties that belong to a particular abstract topic. CountYes and CountNo represent the number of object-properties that the student has proved to know or not to know, respectively.

5.4.2 Pseudo Code for Recommending a New Exercise

This section presents the pseudo code for recommending a new exercise, together with a description of each of its steps.

Update the Temporary Table

When the student asks PAT to recommend an exercise, the CountKnowsForTopics table will be updated with the current state of the student for the knows and not-knows columns. The number of object-properties for each abstract topic remains constant for the current version of the knowledge base.

Get the Topic to be Addressed

As soon as the table is updated, PAT searches for the most suitable abstract topic to be addressed. The criteria to choose the most suitable topic are:

◦ maximum number of knows;
◦ minimum number of not-knows; and
◦ minimum number of unknowns.

If no data is recorded in the student model (e.g. the first time PAT is used), the procedure will detect that and will propose the exercise recorded in PAT as the exercise to be proposed the first time. This is an easy exercise, involving few abstract topics, suitable as a starting point for a beginner.

Pseudo Code 8 Recommend a new exercise.

//==========================================================
// Procedure Name: RecommendNewExercise
// Scope: Recommend a new exercise
//==========================================================

// update the temporary table containing the counts for the abstract topics
call UpdateCountKnowsForTopics

call GetNextTopicToRecommend returning AbstractTopic

if IsNull(AbstractTopic) then
    // if no topic could be found, it is because the student
    // has not tried any exercises yet
    set RecExCode = FirstExerciseCode
    // PAT has an exercise assigned as the first/easiest exercise that
    // a student can start with
    exit
end if

call GetAbstractTopicDetails with AbstractTopic
    returning ObjectType, PropertyName

call GetNextExercise with ObjectType, PropertyName returning RecExCode

call GetExerciseName with RecExCode returning RecExerciseName

display RecExerciseName

exit

Get the Topic’s Details

After PAT chooses an abstract topic, it looks for the object-properties that belong to that abstract topic. PAT chooses the object-properties that are either:

◦ not known (but not the same object-property as in the last MIE); or
◦ not recorded as either known or not known by the student (unknown).

If many object-properties have the same relevance, then the object-property with the most occurrences in the advice history will be preferred.

Get the Recommended Exercise

Now that PAT has the object-property that should be given to the student, PAT looks for an exercise that:

◦ contains that object-property;
◦ is not recorded as already completed; and
◦ is not the latest exercise the student worked on.

If several exercises satisfy the above criteria, the exercise with the lowest difficulty will be chosen.
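The selection criteria above can be expressed as a simple ranking over the CountKnowsForTopics rows. The sketch below is my own illustration (the struct and function names are assumed, not PAT's); it also covers the first-time case by returning an empty topic so the caller can fall back to the designated first exercise.

#include <iostream>
#include <string>
#include <vector>

// One row of the CountKnowsForTopics table.
struct TopicCounts {
    std::string topic;
    int countAll;  // object-properties belonging to the topic
    int countYes;  // shown to be known
    int countNo;   // shown to be not known
};

// Rank topics: maximum knows, then minimum not-knows, then minimum
// unknowns (unknown = countAll - countYes - countNo).
std::string nextTopicToRecommend(const std::vector<TopicCounts>& rows) {
    bool anyData = false;
    for (const auto& r : rows)
        if (r.countYes > 0 || r.countNo > 0) anyData = true;
    if (rows.empty() || !anyData)
        return "";  // no history yet: fall back to the first exercise

    auto better = [](const TopicCounts& a, const TopicCounts& b) {
        if (a.countYes != b.countYes) return a.countYes > b.countYes;
        if (a.countNo != b.countNo) return a.countNo < b.countNo;
        return (a.countAll - a.countYes - a.countNo) <
               (b.countAll - b.countYes - b.countNo);
    };
    const TopicCounts* best = &rows.front();
    for (const auto& r : rows)
        if (better(r, *best)) best = &r;
    return best->topic;
}

int main() {
    std::vector<TopicCounts> rows = {{"AbstractSubform", 4, 0, 1},
                                     {"FormAppearance", 7, 0, 2},
                                     {"Comboboxes", 7, 0, 0}};
    std::cout << nextTopicToRecommend(rows) << "\n";  // Comboboxes
}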

5.5 Summary

This chapter details the mechanism I created to generate the individualised feedback. It describes the pseudo code for some of the procedures, together with additional explanations. The next chapter describes the technology I used to build PAT and also the interactions between the student and PAT.

Chapter 6

Building a new ITS

The paradigms of software engineering are those of conventional engineering modified to take into account the fact that software is a conceptual, rather than a physical product.

-

This chapter starts by describing the technologies used to build PAT. It shows the interfaces I developed for PAT and their use. It also presents the modules' structure and interactions. The chapter concludes with the implementation of the databases for PAT's knowledge base.


6.1 Technologies Used

Before this research started, PAT existed as an application that could be installed as an add-in in Access. It could extract the student's solution, compare it with a correct solution and display the traffic lights. As part of this research, I created the knowledge base, the interfaces for student-system interaction, and the corresponding code. For completeness, this chapter will present the traffic lights window as well, even though it existed before this research started.

Visual C++

PAT's code is written in Visual C++ using the Windows Template Library (WTL) - an object oriented library for developing applications and user interfaces for Win32. WTL is an extension to the Active Template Library (ATL) and provides a collection of classes for controls, dialogs, windows, etc. More information about WTL can be found in the WTL documentation (WTLDocumentation 2002).

Access DB

Because PAT is an add-in for Access, and was already capable of interacting with an Access database, PAT's knowledge base (the student model and the other models) was created using additional Access databases.

PAT Installation

Being developed in Visual C++, the install kit consists of one file: PATaddin.msi. Installing PAT is an extremely easy process. After double-clicking on this msi file, the interaction between the installation process and the student is minimal. With only four clicks, in about 30 seconds, PAT will be installed on the student's computer. Appendix C contains a document which explains how to install PAT on a computer. The document is given to the students, together with the install file. Appendix D contains the document that explains to the students how to use PAT. The document is delivered as part of PAT's installation process. After the installation, whenever Access is started, a new menu exists in the add-ins group of the ribbon - see Figure 6.1.

Figure 6.1: PAT’s menu in Access.

From the menu, the student can click on the My Profile or This Exercise commands, which will bring up the corresponding window. Descriptions of the two interfaces are given below.

6.2 Interfaces

This section presents PAT's interfaces and the functionality they provide. These interfaces can be categorised (following the KVL framework, previously presented in Section 3.1.1) as interfaces for the outer loop and interfaces for the inner loop. In addition, we have first time interfaces (used only when the student starts using PAT for the first time) and some additional interfaces. While some interfaces appear as modal windows, the others do not. A modal window is a child window which requires the user to finish interacting with it before returning to the parent window.
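In WTL terms, this distinction corresponds to calling DoModal on a dialog (modal) versus Create followed by ShowWindow (modeless). The sketch below is a generic WTL-style illustration, not PAT's actual dialog classes; the class name, resource id and helper functions are assumptions.

#include <atlbase.h>
#include <atlapp.h>

extern CAppModule _Module;  // standard WTL application module

#include <atlwin.h>

// The same CDialogImpl-derived class can be shown modally or modelessly.
class CAdviceDlg : public CDialogImpl<CAdviceDlg> {
public:
    enum { IDD = 100 };  // placeholder dialog resource id

    BEGIN_MSG_MAP(CAdviceDlg)
        COMMAND_ID_HANDLER(IDOK, OnOK)
    END_MSG_MAP()

    LRESULT OnOK(WORD, WORD wID, HWND, BOOL&) {
        EndDialog(wID);  // closes a modal dialog
        return 0;
    }
};

void ShowAdviceModal(HWND hWndParent) {
    CAdviceDlg dlg;
    dlg.DoModal(hWndParent);  // blocks until the student dismisses it
}

void ShowProfileModeless(HWND hWndParent, CAdviceDlg& dlg) {
    dlg.Create(hWndParent);   // modeless: returns immediately, so the
    dlg.ShowWindow(SW_SHOW);  // student can keep working in Access
}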

6.2.1 First Time Interfaces

The first time interfaces are the interfaces that the student will usually see only once, after PAT's installation. Of course, if PAT is re-installed, they will appear again. When started for the first time, PAT will ask for the student's name. In addition, it will ask the student if he or she is willing to answer a few more questions to initialise some of the data in the student model. This first window is called Welcome to PAT for the first time - see Figure 6.2.

Figure 6.2: First Time window.

If the student chooses to answer the additional questions, the next window (see Figure 6.3) will appear.

Figure 6.3: Questions window.

As can be seen from the figure, the student still has the choice not to answer. Regardless of which button is pressed, or even if the student initially decided not to answer, the next window (see Figure 6.4) will thank the student (using the name that the student just typed).

Figure 6.4: The Welcome window.

After greeting the student, the My Profile window will appear.

6.2.2 Outer Loop Interface

The My Profile window (see Figure 6.5) contains the services for the outer loop. From this window, the student can select which exercise he or she would like to do next, or can ask PAT to propose the next exercise, or can even revise (update) his or her profile. The Miscellaneous section of the window contains two elements that exist in the majority of PAT's windows: send feedback and activate tooltips. Send feedback is a link which points to PAT's feedback web page: www.pctotal.net/Feedback

Figure 6.5: My Profile window.

When a student clicks on that link, the feedback web page (see Figure 6.6) will open with an image of the originating window. The image is an HTML image map on which the student can click any of the elements of that window.

Figure 6.6: The feedback site for PAT.

When the student clicks on any part of the image map, a feedback form (specific to the object that the student clicked on) will appear, allowing the student to send any comments or feedback about that particular object of the window.

The activate tooltips check box enables or disables the appearance of tips about the purpose of the objects in the window. When enabled, if the student hovers the mouse over a particular object in the window, an explanation of what the object does appears on the screen (see Figure 6.7).

Figure 6.7: Tooltips for a window.

The Revise My Profile button will redisplay the window with the questions used for initialising the student model (see Figure 6.3). The window will contain the student's previous answers (if any). The student can update the information and save the new answers. To manually choose an exercise (as asked for by the students from the focus group - see Section 3.1.1), the student has a list of topics to choose from and can also specify two other preferences: exercise difficulty and previous work. When the student makes a selection, the list of exercises is updated accordingly. If the student prefers to let the system choose an exercise, the list of exercises will contain only one exercise - the recommended one. Regardless of the method chosen, when an exercise is selected, the justification for doing that exercise will appear on the right side of the window. The student can browse through the exercises, and when he or she chooses one, he or she clicks on the See Detailed Description button. This action closes the My Profile window and opens the This Exercise window, with more information about the chosen exercise.

6.2.3 Inner Loop Interfaces

The This Exercise window (see Figure 6.8) is the interface for the inner loop services. The objects in the window are grouped into six sections:

◦ Exercise;
◦ About this exercise;
◦ Check your solution;
◦ If you are stuck;
◦ Your profile;
◦ Miscellaneous.

The Exercise section presents information about the current exercise: its name, purpose, and short description. This section also contains a list of the important objects in the form or report. When the student selects one of these objects, the requirements and hints (if any) for the object are displayed in the box to the right of the list of objects.

Figure 6.8: This Exercise window.

In Section 3.1.1, I presented the features asked for by the students mapped to the KVL framework (see Table 3.1). I indicated there that some of the features the students asked for do not match any of the services from the KVL framework, and I grouped them as additional resources. Table 6.1 presents the services that PAT provides, matching the KVL framework and the features required by the students from the focus group.

KVL Framework           | Focus Group                                  | PAT Service (button)
Minimal feedback        | Error indication                             | Is my solution correct?
Error specific feedback | Feedback about the error                     | What is wrong with my solution?
Hints on next step      | What to do next                              | How do I fix this?
Assessment of knowledge | Review of the solution                       | Is my solution correct?
Additional resources    | Check-points for each task                   | Is my solution correct?
Additional resources    | Screen-shots at various stages               | Show a screenshot
Additional resources    | Hints on how to start and do the assignment  | Show all the requirements
Additional resources    | Links to textbooks or lectures slides        | More readings
Additional resources    |                                              | Show me an example

Table 6.1: Services that PAT provides.

The services listed in Table 6.1 are described next.

A. Is my solution correct?

At any time, a student can check whether his or her solution is correct. After clicking on this button, the traffic lights window (see Figure 6.9) will appear.

Figure 6.9: An example of the Traffic Lights window.

This window shows the tasks and their subtasks (as described in Section 4.3.2), each of them with a particular colour. The first (leftmost) column is the red colour, the next three are yellow, and the last (rightmost) column is green. The meaning of the colours is:

◦ Red: none of the steps in the subtask are correct;
◦ Green: all the steps in the subtask are correct; and
◦ Yellow: only some of the steps in the subtask are correct.

If the yellow colour is displayed in the column nearest to the red column, then only a few steps are correct; if the yellow colour is displayed in the column nearest to the green column, more than half of the steps in that subtask are correct.
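One plausible way to drive this five-column display is to map the fraction of correct steps in a subtask to a column index; the cut-off points in the sketch below are my own assumption, since the description above only fixes the two end columns.

#include <iostream>

// Map the correct-step count of a subtask to one of the five traffic-light
// columns: 0 = red, 1-3 = yellow shades, 4 = green.
int trafficLightColumn(int correctSteps, int totalSteps) {
    if (correctSteps <= 0) return 0;           // red: nothing correct
    if (correctSteps >= totalSteps) return 4;  // green: everything correct
    double fraction = static_cast<double>(correctSteps) / totalSteps;
    if (fraction <= 1.0 / 3.0) return 1;       // yellow, nearest to red
    if (fraction <= 2.0 / 3.0) return 2;       // middle yellow
    return 3;                                  // yellow, nearest to green
}

int main() {
    std::cout << trafficLightColumn(0, 4)   // 0 (red)
              << trafficLightColumn(1, 4)   // 1
              << trafficLightColumn(3, 4)   // 3
              << trafficLightColumn(4, 4)   // 4 (green)
              << "\n";                      // prints 0134
}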

By presenting this simple but meaningful description of the subtasks in the exercise, the student has:

◦ Basic feedback on what is wrong with his or her solution; and
◦ An assessment of his or her knowledge for the subtasks presented in this window.

B. What is wrong with my solution?

While Is my solution correct? only provides minimal feedback, What is wrong with my solution? is an error specific feedback service. When a student clicks on this button, the analysis, followed by the synthesis (as described in the previous chapter), will run. Individualised feedback will be provided to the student in the Advice window (see Figure 6.10).

Figure 6.10: Advice window.

If the student again asks what is wrong with his or her solution, but without fixing the most important error, PAT will give the student the next level of hint for that error. However, based on the information from the student model, the synthesis can adjust the hint and start from a higher level of hint - as described in Section 5.3.1. The More details section of the Advice window will be explained later in the chapter.

C. How do I fix this?

When the student opens the This Exercise window, if the most important error appears for the first time in that session, the How do I fix this button is disabled. If a student asked what is wrong with his or her solution and received a Level 1 hint (as described in Table 5.1 in Section 5.3), the How do I fix this button will be enabled. By clicking on this button, the student will receive feedback presenting the actions he or she should perform in order to fix the most important error. The feedback will be displayed in the same Advice window, the only difference being the advice text.

Both the What is wrong with my solution? and How do I fix this? services can be used to get feedback about either the student's entire solution or a particular object in the exercise. The student only needs to select an object from the Important objects from the form drop down list and enable the Only for the selected object checkbox (see Figure 6.8) to get feedback for the selected object. I implemented this option in PAT to overcome the problem described in Section 4.3.2, where a student wants to address a different error than the one that PAT regards as the most important one.

D. Show a screenshot

By clicking on this button, the student will see a snapshot of the final layout of the form or report they are being asked to create in this exercise. An example of such a screenshot is shown in Figure 6.11.

Figure 6.11: Screenshot for the required form.

E. Show all requirements

The students participating in the focus group asked for hints on how to start and do the assignment. Since the time of the focus group, the requirements for the assignments have been improved so they are easier to read and understand. However, having all the requirements (including some hints) specified on a single page, accessible from PAT, also helps. The image below (Figure 6.12) is an example of such a requirements list. This can be displayed by clicking on the Show all requirements button in the This Exercise window. This window actually presents all the descriptive information that can be found in the This Exercise window, all at the same time.

Figure 6.12: All the requirements for an exercise.

F. More Readings

Clicking on the More Readings button in the Advice window gives the student the exact lecture and slide numbers from the PowerPoint presentation for the lecture, and any pages from the textbook used in this subject - if an appropriate example is available. These appear in the More Readings window - Figure 6.13.

Figure 6.13: More readings.

G. Show me an example

Clicking on the Show me an example button in the Advice window shows the student a diagram with more information related to the most important error. An example of such a diagram, similar to the one presented in Figure 4.14 in Section 4.6, is shown in Figure 6.14.

Figure 6.14: Another example of a diagram associated with an action ("Data Source for a Form or Report": a form or report can have different data sources - tables, queries, or select statements; the data source is specified in the RecordSource property of the form or report).

From the This Exercise window, the student can, at any time, go back to the My Profile window (by clicking on the Go to My Profile button) and select another exercise. The My Profile and This Exercise windows are non-modal: they can remain on the screen while the student is working on the exercise. The Traffic Lights and Advice windows, however, are modal windows. Because they are only useful (and show the correct information) at a particular point in time, they do not allow the student to make changes to the solution until the student has closed them. An example of how the student can see the traffic lights window while working on an exercise is presented in Figure 6.15.

Figure 6.15: Example of using PAT.

6.2.4 Additional Interfaces

These are the interfaces where additional communication is required. An example of such an interface is the Welcome Back window. Whenever a student restarts Access and uses PAT for the first time in that session, the Welcome Back window appears first. An example of this window is presented in the figure below.

Figure 6.16: Welcome Back window.

In this window, PAT reminds the student of the last exercise he or she was working on (in the previous session). It gives the student the opportunity to go to the This Exercise window opened for the last exercise, or to go to the My Profile window and select another exercise.

6.3 Databases and Tables

In Chapter 4 I presented the kind of information that PAT's knowledge base should contain. In Section 6.1 of this chapter I stated why the knowledge base is an Access database. The data from the knowledge base can be categorised from a time perspective: data that will not change over the time the student uses PAT; and data that will change during (or only appear after) the student uses PAT. From this point of view, we can group the data from the knowledge base into static data and dynamic data. Static data contains information such as: exercise requirements, correct solutions, hints, information about the domain model, etc. Dynamic data contains information about the student and his or her history related to: exercises started, errors made, hints received, etc. Figure 6.17 shows the tables from the knowledge base, grouped by the model they belong to - as previously described in Figure 3.1 in Chapter 3. Figure 6.17 also shows which tables are static and which tables are dynamic. Clearly, we can partition the tables into two databases, one being read-only (not allowing any changes) and the other one writeable. Having a read-only database gives us better security for sensitive data such as the tables containing the correct solution. This is achieved by distributing the read-only database as part of the executable file, not as a separate file. However, the writeable database containing the student model is visible (and accessible) to the student. Two of PAT's "working" tables are stored in the student's database. (The student's database is the database in which they create their forms and reports.) The tables that are created in the student's database are StudentSolution and ObjectNameMapping.

Figure 6.17: Tables in the knowledge base, grouped by the model they belong to (domain model, tutoring model, exercises model and user/student model) and marked as static (e.g. Goals, Actions, Exercises, MasterSolution, TopicsStructure, ObjectsProperties) or dynamic (e.g. Users, ExercisesHistory, ActionsHistory, AdvicesHistory, UsersPreferences, CountKnowsForTopics); the StudentSolution and ObjectNameMapping tables are located in the user's database.

The StudentSolution table contains the property value for each of the object-properties in the form or report created by the student, and is created (or re-created) each time PAT analyses the student's solution. Because students can use different (internal) names for the objects on their forms and reports, the ObjectNameMapping table maps the object names in the master (correct) solution to the object names in the student's solution. This table is also re-created each time the student's solution is analysed. As an additional security measure, the fields containing the object name, property name, and property value are encrypted, and the tables in the student's database are hidden.

6.4 Modules Structure and Code

In Chapter 5, I presented the pseudo code for the high-level procedures for: analysing the student's solution; synthesising the feedback; and proposing a new exercise to the student. In this section, I give a high level presentation of the communication between the student and PAT, as well as PAT's modules structure and interactions. Figure 6.18 presents PAT's modules grouped as interfaces, procedures and databases. The communication between the modules is represented as a dotted line, the arrow indicating the data flow. The communication between the student and PAT is represented by lines with text describing the action of the student or the output from PAT. To simplify the diagram, only the two main interfaces of PAT are presented: My Profile and This Exercise. Similarly, some of the actions that the student can perform (such as updating their profile) are also omitted for simplicity.

(Figure 6.18 shows the student's interactions with PAT's two main interfaces - My Profile and This Exercise - and the data flow between the interfaces, the procedures AnalyzeStudentSolution, SynthesisFeedback and RecommendNewExercise, and the PAT, PAT-SM and PAT-data databases.)

Figure 6.18: Student-system interactions.

6.5 Summary

This chapter briefly describes the technologies I used to create PAT. It presents the advantages of using Visual C++ and WTL, and how easy it is to install PAT on a new computer. The chapter explains the interfaces I created for PAT, following the KVL framework. The chapter concludes with a high-level view of PAT's modules and how they interact with each other. The next chapter discusses the results of the summative evaluation of PAT.

Chapter 7

System Evaluation

An evaluation is a process by which relevant data are collected and transformed into information for decision making.

- Cooley and Lohnes (1976, p. 3)

This chapter contains the summative evaluation of PAT. After starting with an overview of evaluation techniques for ITSs, I present the results of the summative evaluations of the final version of PAT, by both students and academic staff. The results of a pre-post test evaluation are also discussed.


7.1 Evaluating an Intelligent Tutoring System

Although ITSs appeared more than 25 years ago, there is no absolute agreement on how an ITS should be evaluated. This is a result of the many disciplines involved in creating an ITS. An ITS can be seen, and therefore evaluated, as an information system or as an educational system, each having different evaluation methodologies. For an information system, we can be interested in evaluating its architecture design or only some components of the system. From an educational perspective, we are more interested in the outcome that the system produces when used by students, i.e. the improvement engendered in student learning. However, despite these different perspectives, some consensus has been achieved (Inoue 2001):

1. ITS components should be evaluated throughout the ITS development stage, so improvements can be made to the prototype; and

2. the ITS effectiveness as a system can be measured with traditional educational evaluation methods.

7.1.1 Aspects of an ITS Evaluation

Several researchers have focused on the problem of evaluating an ITS (Almstrum et al. 1996, Chin 2001, Inoue 2001, Kinshuk et al. 2000, Littman & Soloway 1988, Mark & Greer 1993, Paramythis et al. 2010). Different aspects of such an evaluation have been identified, as briefly described below.

◦ Formative versus summative: formative evaluations are done during the development stage, to gather information that will contribute to a better final product; summative evaluations are done when the project development is complete.

◦ Quantitative versus qualitative: quantitative evaluations are performed with phenomena that can be measured (we can gather numerical data), while qualitative evaluations are performed with phenomena that are difficult to measure but can provide descriptive data instead.

◦ Field studies versus laboratory studies: field studies observe the students' behaviour in their normal environment, while laboratory studies observe the students in a controlled, predefined environment.

◦ Incremental versus longitudinal: incremental studies look at a particular point in time (or a short period of time) only, while longitudinal studies run over a longer period of time.

◦ Theory-driven analysis versus data-driven analysis: theory-driven analysis tests hypotheses derived from theories (theories about the studied phenomena already exist), while data-driven analysis is not derived from a theory; it relies only on the data gathered (there are no theories yet about the studied phenomena).

◦ External versus internal: an external evaluation assesses the system as a whole, while an internal evaluation assesses only the components of the system.

Internal evaluation - evaluates the ITS architecture:

− the design of the ITS; or
− the components of the ITS (the models used, user interface, procedures, etc.).

External evaluation - evaluates the educational impact and learning outcome:

− usability and users' acceptance; or
− increased learning.

◦ Informal versus formal: a formal evaluation is conducted using standardised techniques such as questionnaires or surveys, with the students involved in the evaluation being aware of the evaluation objectives. An informal evaluation is done using non-standardised techniques. In an informal evaluation, the students are less aware (or unaware) of the evaluation in progress and of its objectives. Informal evaluations with students can bring more valuable (and unpredictable) data and can be used during formative evaluations; formal evaluations are more suitable for summative evaluations.

7.1.2 Evaluation Methods

Iqbal et al. (1999, p. 1) present a classification of evaluation methods because of the lack of "clear guidance" in the literature. Furthermore, the paper suggests a way of selecting the most suitable evaluation method. This classification is based on two dimensions:

1) The degree of the evaluation. This dimension coincides with similar research (Brusilovsky et al. 2004, Littman & Soloway 1988, Mark & Greer 1993) in describing an evaluation as internal or external - evaluating only a component of an ITS, or the system as a whole.

2) The local feasibility of the evaluation method. This is based on the distinction between experimental and exploratory research. The methods are briefly described in the following two sections: Methods for Experimental Research and Methods for Exploratory Research. While experimental research is more suitable for large groups of users, exploratory research is appropriate for small groups of users.

Methods for Experimental Research

This category contains methods that require varying the initial (independent) variables and measuring the outcome (the dependent variable). These methods must satisfy the following conditions:

1. a statistically significant group of students; and

2. random assignment of participants to conditions.

1. Proof of correctness checks if the system achieved the initial goals and requirements.

2. Additive experimental design evaluates "the impact of large ITS components that can be experimentally modified or withheld" (Iqbal et al. 1999, p. 3).

3. Diagnostic accuracy assesses the quality of the micro-theories used by the system, because the pedagogical interactions between the system and the user depend on the correct recognition and interpretation of the user's errors.

4. Feedback/instruction quality. The immediate impact on feedback/instruction quality can be "experimentally estimated through lag sequential analysis procedures" (Iqbal et al. 1999, p. 4). One possible way to calculate this is "the ratio of actions in a specific category to the total number of actions that occur within a specific frame following a target action" (Iqbal et al. 1999, p. 4).

5. Sensitivity analysis examines how changing information in the system's components (or the entire system itself) influences the teaching response in various circumstances.

6. Experimental research provides a way to obtain relationships between the changes in the system and the outcome produced. Iqbal et al. (1999) present the previous work of Mark & Greer (1993) as describing a variety of experimental research designs: single group designs, control group designs, and quasi-experimental designs.

7. Product evaluation is a broad-based ITS evaluation. Three guidelines for product evaluation are:

1) The instructional effectiveness of the ITS must be compared (based on performance data) with human tutors and other traditional teaching methods.

2) Only extensively used ITS applications should be evaluated.

3) One must use large groups of students for an accurate evaluation.

Methods for Exploratory Research

The methods in this category are suitable where there are multiple sources of data and the evaluations can be made in a natural context (the classroom in PAT’s case). Large numbers of students are not required.

8. Expert knowledge is used to assess whether the system meets the required level of performance. The method is suitable for knowledge-based systems and can address the whole system as well as individual components of a system.

9. Level of agreement is used "to estimate consistency of knowledge and beliefs across experts" (Iqbal et al. 1999, p. 5) about the system or the system's components.

10. Wizard of Oz experiments "use a human to simulate the behaviour of a proposed system, so that one can test aspects of a program design before actually implementing it" (Iqbal et al. 1999, p. 5).

11. Performance metrics are methods that analyse individual factors or features of an ITS by using quantitative data from samples which are too small to allow statistical conclusions.

12. Internal evaluation studies the relationship between the ITS architecture and its behaviour. This group of methods can be further categorised into:

1) knowledge level analysis - to evaluate the knowledge base;

2) process analysis - to evaluate the system's processes; and

3) ablation & substitution experiments - observe system performance by turning off components or replacing them with previous versions.

13. Criterion-based evaluation analyses the requirements and specifications of the system in an exploratory way, looking for inadequacies.

14. Pilot testing is used to check the system design while the system is still under development. In accordance with the development stage, there can be: one-to-one tests; small group tests; and, close to the end of the development, final tests.

15. Certification is similar to the methods used to identify competent human teachers by receiving an authoritative endorsement.

16. Outside assessment is based on the opinions of experts or large numbers of users. It can be on-site expert evaluation or panel-of-experts evaluation.

17. Existence proofs is a method that "bases its conclusions on the successful implementation of a system or application of a method to propose a new system architecture or design method" (Iqbal et al. 1999, p. 6).

18. Observation & qualitative classification aims to "identify classes of phenomena, patterns, and trends in the interaction of people with instructional systems" (Iqbal et al. 1999, p. 6). The observation can relate to novice users (learners) or expert users (teachers).

19. Structured tasks & quantitative classification refers to the collection of data by organising (limiting) the responses of the students. To limit and structure the data gathered, the method uses interviews, questionnaires or surveys.

20. Comparison studies are methods that compare similarities and differences in the behaviour or design of an ITS with an existing standard or other ITSs.

7.1.3 Choosing the Correct Evaluation Method

To decide on the appropriate evaluation method, Iqbal et al. (1999) suggest answering the following two questions.

1) What is being evaluated - the entire system or only a part of the system?

2) What is the number of available students and what are the conditions for experimental research?

Figure 7.1 (adapted from Iqbal et al. (1999)) shows how the evaluation methods can be classified along the two dimensions. Horizontally, methods are presented for small groups of users (exploratory research) and large groups of users (experimental research). Vertically, the methods are presented from internal evaluation (ITS components) to external evaluation (the entire ITS). The numbers in the diagram represent the numbers of the methods as described in the previous section and also presented in Table 7.1. In this way, we can see that the methods are separated into four groups, as described below.

◦ The lower left part is suitable for evaluating ITS components and does not require large sample sizes or rigorous statistical methods.

◦ The lower right part is suitable for evaluating ITS components with large groups of participants, which require control group studies.

◦ The upper left part is suitable for evaluating overall system performance and does not require large groups of participants or statistical studies.

◦ The upper right part is suitable for evaluating overall system performance and requires large groups of participants and statistical studies.

The process of selecting the most suitable evaluation method for PAT is described in the next section.

(Figure 7.1 plots the twenty numbered evaluation methods of Table 7.1 on two axes: from internal evaluation to external evaluation vertically, and from exploratory research to experimental research horizontally.)

Figure 7.1: Choosing an evaluation method for an ITS.

1 Proof of correctness
2 Additive experimental design
3 Diagnostic accuracy
4 Feedback/instruction quality
5 Sensitivity analysis
6 Experimental research
7 Product evaluation
8 Expert knowledge
9 Level of agreement
10 Wizard of Oz experiment
11 Performance metrics
12 Internal evaluation
13 Criterion-based evaluation
14 Pilot testing
15 Certification
16 Outside assessment
17 Existence proofs
18 Observation & qualitative classification
19 Structured tasks & quantitative classification
20 Comparison studies

Table 7.1: Evaluation methods.
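To make the two-dimensional selection concrete, the sketch below encodes it as a simple lookup. This is only indicative: the quadrant descriptions paraphrase Figure 7.1 and Section 7.1.2, the example methods named for each quadrant are my own reading of the figure, and the group-size threshold is a hypothetical stand-in for the "conditions for experimental research".

```python
# Illustrative sketch of the two-dimensional method selection behind
# Figure 7.1. The group-size threshold (100) is hypothetical; in practice
# the decision also depends on whether random assignment is possible.

def evaluation_quadrant(whole_system: bool, n_students: int) -> str:
    experimental = n_students >= 100  # hypothetical cut-off
    if whole_system:
        if experimental:
            return ("overall system performance, experimental "
                    "(e.g. product evaluation, experimental research)")
        return ("overall system performance, exploratory (e.g. structured "
                "tasks & quantitative classification - the method chosen "
                "for PAT)")
    if experimental:
        return "ITS components, experimental (e.g. additive designs)"
    return "ITS components, exploratory (e.g. expert knowledge, pilot testing)"

# 84 questionnaire respondents, whole-system evaluation:
print(evaluation_quadrant(whole_system=True, n_students=84))
```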

7.1.4 How was PAT Evaluated

To determine the method used to evaluate PAT, we followed the process described in the preceding section, Section 7.1.3. Method 19 from Table 7.1 (Structured tasks and the quantitative classification of phenomena) involves: 1) the collection of data as exploratory research; and 2) using external evaluations to gather data via interviews, questionnaires or surveys. This method was suitable for evaluating PAT because: 1) we wanted to evaluate PAT as a system (external evaluation); and 2) we had almost 200 students who could potentially use PAT. In terms of the aspects listed in Section 7.1.1 (Aspects of an ITS evaluation), PAT was evaluated as follows:

◦ Summative evaluation: we evaluated the final (latest at that time) version of PAT, not the prototype.

◦ Quantitative evaluation: the conditions required for a quantitative evaluation were met - enough students, measurable objectives.

◦ Laboratory study: this was a controlled environment with predefined questions.

◦ Incremental evaluation: we only analysed how PAT was used during a single semester; we did not attempt to make any correlation between the data gathered during different semesters.

◦ Data-driven analysis: we wanted to analyse the data obtained from the evaluation to perceive the improvement that PAT makes to students' learning.

◦ External evaluation: we evaluated PAT as a system, not components of PAT.

◦ Formal evaluation: PAT had a formal evaluation, with the students responding to the questions in the classroom, during a tutorial.

7.2 PAT’s Summative Evaluation

Evaluations with the final version of PAT were conducted by both students and academic staff. With students, a questionnaire (see Figures 7.2 and 7.3) was used as a source of both qualitative and quantitative data. Another questionnaire (see Figure 7.10) was given to faculty staff involved in teaching the subject, to gather qualitative data: academics’ opinions about PAT.

7.2.1 Evaluations’ Objectives

As stated in Section 1.4, the main goal of my research is to:

increase the effectiveness of students learning Rapid Application Development in a database environment.

To measure the success of this research, we need to know: the improvement engendered in students' learning by using the ITS; and the ITS's acceptance by the students and academic staff. In addition, we recognise that the students involved in the evaluation have diverse backgrounds, and their previous experience with Access (if any) varies significantly. For an accurate interpretation of the students' answers, we must distinguish between students who have used Access before and those who haven't. Furthermore, another important aspect that has to be considered is how much they used PAT during the semester. Students who only rarely used PAT will not give answers as informative as those of students who used PAT extensively.

As a consequence, the objectives for the summative evaluation are:

1) What is the background of the students - in relation to Access?

2) Does the approach of having PAT embedded in Access make learning easier?

3) Is the feedback provided helpful for students? Does PAT offer enough types of help?

4) Is PAT accepted by students and teaching staff?

7.2.2 Evaluations with Students

Databases is a first-year subject at Queensland University of Technology (QUT). PAT is available for download to students enrolled in this subject and its use is optional. PAT is recommended in addition to the traditional methods available, such as: participation in lectures, tutorials and practicals; studying lecture notes; and studying the recommended textbook.

Questions for Evaluation with Students

As mentioned above, a questionnaire was used as a source of both qualitative and quantitative data. The questions given to the students are shown in Figure 7.2 and Figure 7.3. The associated objective of each question is presented in Table 7.2.

Figure 7.2: The questions given to the students (first page).

7. Which type of feedback from PAT did you like least? (Tick only one.)
   Traffic Lights / What is Wrong / How to Fix / Diagrams / More Readings
   Please say why:

8. If you were stuck, and asked PAT to help you, did it provide enough help so that you could fix the problem?
   Strongly Disagree / Disagree / Undecided / Agree / Strongly Agree

9. Do you think PAT is easy to use?
   Strongly Disagree / Disagree / Undecided / Agree / Strongly Agree

10. Would you like to have software similar to PAT to help you in other subjects?
   Strongly Disagree / Disagree / Undecided / Agree / Strongly Agree

11. Overall, did you find PAT helpful?
   Strongly Dislike / Dislike / Undecided / Like it / Like it Very Much

Figure 7.3: The questions given to the students (second page).

What is the background of the students? (Questions 1, 2, 3)
Does the approach of having PAT embedded in Access make learning easier? (Question 4)
Is the feedback provided helpful for students? Does PAT offer enough types of help? (Questions 5, 6, 7, 8)
Is PAT accepted by students and teaching staff? (Questions 9, 10, 11)

Table 7.2: Evaluation objectives and related questions for the students' survey.

Results of Evaluation by Students

Objective 1 - Students’ background

Of the 185 students enrolled in the subject, 84 responded to the questionnaire. However, only 51 of the students answered all questions. For Question 1, 56 students (67%) answered that they had used Access before. The majority of the respondents were Somewhat Confident with using Access, with only 9% being Very Confident (Question 2).

Q1: Yes - 56; No - 28
Q2: Not Confident - 6; Somewhat Confident - 30; Confident - 14; Very Confident - 5

Table 7.3: Students' answers for Questions 1 and 2.

(Figure 7.4 shows two charts summarising the students' background: Q1 - Have you used MS Access before?; Q2 - How confident were you about using MS Access?)

Figure 7.4: Students' background.

Only 49 students answered Question 3 - How often (how much) did you use PAT? - and their answers are as follows:

1) An hour or less - 13 students

2) Between 1 and 3 hours - 20 students

3) More than 3 hours - 16 students

Objective 2 - Does the approach of having PAT embedded in Access make learning easier?

For Question 4 (Is it useful to use PAT directly within Access?), Figure 7.5 shows that Strongly Agree accounts for 59%. With 14% for Agree and only 10% for Disagree, we can say that the students considered having PAT integrated in Access to be a very useful feature.

Figure 7.5: Students’ answers for Question 4.

The results for Question 4 are presented in Table 7.4.

Q4: Strongly Disagree - 1; Disagree - 6; Undecided - 9; Agree - 8; Strongly Agree - 34

Table 7.4: Students' answers for Question 4.

Objective 3 - Is the feedback provided helpful for students? Does PAT offer enough types of help?

The results for Questions 5 and 8 are presented in Table 7.5, and the results for Questions 6 and 7 are presented in Table 7.6.

Q5: Strongly Disagree - 2; Disagree - 8; Undecided - 24; Agree - 10; Strongly Agree - 10
Q8: Strongly Disagree - 3; Disagree - 14; Undecided - 16; Agree - 13; Strongly Agree - 3

Table 7.5: Students' answers for Questions 5 and 8.

Questions 5 and 8 are similar: they ask the students whether the feedback from PAT was useful. What the answers in fact show us is how students overcame easy problems compared with hard problems, in accordance with their knowledge of the topic. Question 5 covers situations where the user asked for help and, with minimal help from PAT, was able to move on, while Question 8 is about hard problems where students needed more help. As Merrill et al. (1992) suggest, the tutor should provide just enough help to keep students' frustration and confusion to a minimum. PAT does not give the correct solution (final answer) for assignments, so in some cases students who needed more than PAT was allowed to say could become unhappy with PAT's help.

Figure 7.6: Students’ answers for Questions 5 and 8.

Questions 6 and 7 show students' preferences for the types of feedback provided. The Traffic Lights (error indication) are by far the most liked type of feedback - see Table 7.6.

The students who had used Access before were looking for a simple way of indicating what is wrong:

“Easy to understand and intuitive way of telling me what needs fixing.”

“Simply pointing out where the problem in my solution was is generally enough to help me solve it.”

“Easy to see where the mistakes were made.”

“Simply tells you what needs work on.”

For students who didn’t use Access before, their responses were slightly differ- ent. They were looking not only for an indication of “what is wrong” but also for an indication on the overall performance and progress:

“Good indicator of how well you are doing overall. Good indication of where you should focus on fixing.”

“... the traffic lights were a good yes/no feedback on the current state of my database.”

“Excellent visual feedback on my progress.”

Q6: Traffic Lights - 34; What is Wrong - 12; How to Fix - 5; Diagrams - 0; More Readings - 0
Q7: Traffic Lights - 0; What is Wrong - 7; How to Fix - 10; Diagrams - 14; More Readings - 12

Table 7.6: Students' answers for Questions 6 and 7.

The least preferred types of feedback were More Readings and Diagrams: More Readings for the students who had not used Access before, and Diagrams for the students who had used Access before. However, some of the students explicitly stated that they did not dislike any of the types of help, with some of their answers being: "N/A - no particular dislike" or simply "none".

Figure 7.7: Students’ answers for Question 6.

From the students’ comments, we can group their reasons for liking least some types of the feedback (as shown by Question 7) to the following: ◦ preferred learning styles; ◦ not fast/helpful enough; and ◦ lack of information in the feedback. For some students, the feedback type they like least is dictated by their learning style: “I dislike too much text”. Some other students only like the type of feedback which is a direct hint, something that can be immediately applied and solve the problem. In those cases, they dislike some of the types of feedback because: “did not help directly to fix the problem” or “didn’t answer much” or even “should include solutions”. In other cases, the cause for a dislike was missing information. For some errors for example, there are no diagrams to help the user so a message appears instead: “there are no diagrams” - immediately admonished by students. 204 7. System Evaluation

Objective 4 - Is PAT accepted by students?

The results for Questions 9 and 10 are presented in Table 7.7, and the results for Question 11 in Table 7.8.

Q9: Strongly Disagree - 6; Disagree - 4; Undecided - 19; Agree - 13; Strongly Agree - 8
Q10: Strongly Disagree - 2; Disagree - 2; Undecided - 14; Agree - 12; Strongly Agree - 20

Table 7.7: Students' answers for Questions 9 and 10.

How easy PAT is to use is reflected in Question 9 (Do you think PAT is easy to use?). Strongly Agree and Agree total 42%, while Strongly Disagree and Disagree total only 20%. Students who had used Access before were happier with the ease of use: 47% answered Agree or Strongly Agree, compared with only 32% of those who had not used Access before.

Q11: Strongly Dislike - 0; Dislike - 3; Undecided - 12; Like it - 18; Like it Very Much - 17

Table 7.8: Students' answers for Question 11.

Question 11 (Overall, did you find PAT helpful?) summarises the students' experience with PAT. 70% of the students answered Like it or Like it Very Much, and only 6% (3 students) answered that they disliked PAT (see Figure 7.8). In addition, having 64% of the students looking forward to similar software (an ITS) in other subjects (Question 10) is a confirmation of the useful experience they had with PAT.

Figure 7.8: Students’ answers for Question 11.

We also wanted to see how much the students liked the idea of using an Intelligent Tutoring System in general (Question 10), and then to compare it with a specific ITS - PAT (Question 11). The results are very similar.

Figure 7.9: Students’ answers for Questions 10 and 11. 206 7. System Evaluation

7.2.3 Evaluations with Staff

Questions for Evaluation with Staff

The questions given to the academic staff members involved in teaching the Databases subject are presented in Figure 7.10. The associated objective of each question is presented in Table 7.9.

What is the background of the teaching staff? (Question 1)
Does the approach of having PAT embedded in Access make learning easier? (Question 2)
Is the feedback provided helpful for students? Does PAT offer enough types of help? (Questions 3, 4, 5)
Is PAT accepted by students and teaching staff? (Questions 6, 7)

Table 7.9: Evaluation objectives and related questions for the staff survey.

PAT evaluation - semester 2, 2008

1. Did you teach in ITB004 before the start of semester 2 this year? If yes, how many semesters?
   Yes / No; Number of semesters: _____

2. Do you think it is useful that students can use PAT directly from within MS Access, rather than using a separate program?
   Strongly Disagree / Disagree / Undecided / Agree / Strongly Agree

3. Do you think PAT is offering enough types of help? If not, what other types of help could be added?
   Strongly Disagree / Disagree / Undecided / Agree / Strongly Agree
   Any suggestions?

4. Which type of feedback from PAT did you like most? (Tick only one.)
   Traffic Lights / What is Wrong / How to Fix / Diagrams / More Readings

5. Which type of feedback from PAT did you like least? (Tick only one.)
   Traffic Lights / What is Wrong / How to Fix / Diagrams / More Readings

6. Do you think PAT is easy to use?
   Strongly Disagree / Disagree / Undecided / Agree / Strongly Agree

7. Overall, did you find PAT helpful?
   Strongly Dislike / Dislike / Undecided / Like it / Like it Very Much

Figure 7.10: The questions given to staff members.

Results of Evaluation by Staff

Objective 1 - Staff members’ background

Seven staff members responded to the questionnaire. Only one was teaching the subject for the first time.

Objective 2 - Does the approach of having PAT embedded in Access make learning easier?

As the results show, staff members considered that accessing and running PAT directly from within Access is indeed useful - see Table 7.10:

Q2: Strongly Disagree - 0; Disagree - 0; Undecided - 2; Agree - 1; Strongly Agree - 4

Table 7.10: Teaching staff answers for Question 2.

Objective 3 - Is the feedback provided helpful for students? Does PAT offer enough types of help?

The results from the staff members' questionnaire for Questions 3, 4 and 5 are presented in Tables 7.11 and 7.12.

Q3: Strongly Disagree - 0; Disagree - 1; Undecided - 2; Agree - 1; Strongly Agree - 3

Table 7.11: Teaching staff answers for Question 3.

Question 3 asked the teaching staff whether they believe that PAT offers enough types of help. This question is based on one of the ideas that emerged from the initial focus group (described earlier in Section 3.1.1): that students would like to have many types of help to choose from, not just an Is it correct? button.

Q4: Traffic Lights - 1; What is Wrong - 2; How to Fix - 3; Diagrams - 0; More Readings - 1
Q5: Traffic Lights - 1; What is Wrong - 0; How to Fix - 0; Diagrams - 0; More Readings - 4

Table 7.12: Teaching staff answers for Questions 4 and 5.

It should be noted that, in comparison to the students, who largely preferred Traffic Lights, the teaching staff preferred How to Fix. In addition, the teaching staff disliked More Readings - in this case in line with the students, who disliked Diagrams and More Readings. One of the teaching staff mentioned that PAT should use diagrams that are "Access screen-shots".

Figure 7.11: Teaching staff answers for Question 4.

Objective 4 - Is PAT accepted by teaching staff?

To see if PAT is accepted by the teaching staff, two questions were asked: "Do you think PAT is easy to use?" and "Overall, did you find PAT helpful?". The answers to these two questions are presented in Table 7.13 and Table 7.14.

Question 6 (Do you think PAT is easy to use?), where two staff members answered Disagree and another two answered Undecided, indicates an area that requires improvement.

Q6: Strongly Disagree - 0; Disagree - 2; Undecided - 2; Agree - 2; Strongly Agree - 1

Table 7.13: Teaching staff answers for Question 6.

From the formative evaluations (as described in Section 1.4), when students were asked to try an initial prototype of PAT, I noticed that students can start using PAT in an "uncontrolled" way, clicking on buttons without knowing what will happen but hoping that they will "get help". In order to avoid this kind of situation, I created two documents: how to install PAT and how to use PAT. What the questionnaire did not reveal, though, is whether the staff members read these documents. At the conclusion of the questionnaire for staff, we asked about their overall impression - Question 7. For this question, two staff members answered Undecided, while the remaining five answered Like it (one of them) or Like it Very Much (the other four).

Q7: Strongly Dislike - 0; Dislike - 0; Undecided - 2; Like it - 1; Like it Very Much - 4

Table 7.14: Teaching staff answers for Question 7.

7.2.4 Conclusions for Summative Evaluation

This section described the summative evaluation of PAT. I presented the results from the evaluation with students, then the results from the evaluation with teaching staff. The results from both students and teaching staff are summarised below, grouped by the evaluation's objectives. Because Objective 4 measures the overall rating of PAT, it is presented first.

Objective 4 - Is PAT accepted by students and teaching staff?

As shown in Figure 7.12, both students and staff found PAT helpful. Furthermore, from Questions 9 and 10 for students we saw that:

◦ the majority of students thought that PAT is easy to use; and
◦ students would like to have software similar to PAT in other subjects.

Figure 7.12: Students' and teaching staff's answers for Questions 11 and 7.

Objective 1 - Students and staff members’ background

We saw that two thirds of the students enrolled in the subject had used Access before, with more than 50% of them being Somewhat Confident in using it. Interestingly, the answers between the two groups (students who had used Access before and students who had not) are similar for the majority of the questions. The questions where the results differ are:

◦ Q7 - Which type of feedback did you like least?
◦ Q11 - Overall, did you find PAT helpful?

Students who had used Access before were not really interested in additional materials such as Diagrams, More Readings, and not even How to Fix, while students who had not used Access before did not like More Readings and What is Wrong but were happy with Diagrams and How to Fix. Although one might expect that PAT would be more useful for beginners (students who had not used Access before), the results show that 56% of them answered that they liked PAT or liked it very much, and 38% answered that they were undecided. In contrast, 76% of the students who had used Access before answered that they liked PAT or liked it very much, with only 18% undecided.

Objective 2 - Does the approach of having PAT embedded in Access make learning easier?

We noticed that both students and teaching staff considered that using PAT directly from Access, while using the real software (no simulation) to work on real problems, is very useful. The students' answers to Question 10 can also be seen as a confirmation: the students would like to have software similar to PAT in other subjects.

Objective 3 - Is the feedback provided helpful for students? Does PAT offer enough types of help?

Some teaching staff and students did not indicate any type of feedback as disliked (Question 7 for students and Question 5 for staff) because they did not dislike any of them: "N/A - no particular dislike" or simply "none". The only suggestion for improvement from staff members (Question 3) was to have not only general diagrams describing the overall concept but also screen-shots from Access showing how to solve some of the possible issues. Some students were unhappy with the content of the feedback received. One possible explanation is the fact that PAT only gives hints, not the solution (correct answer) for assignments. However, future work could look at ways to improve the feedback.

7.3 Pre-Post Test Evaluation

In addition to the evaluation described in the previous section, I gathered measurements of the students' knowledge before and after using PAT. Because PAT helps with the assignments, it would not have been ethical to ask some students not to use PAT just to create a control group. For this reason, I evaluated PAT using the approach described by Woolf (2008, p. 191) as "C1. Tutor alone". This evaluation was conducted during the second semester of 2010, with a different set of students.

7.3.1 Information Sources

To analyse the improvement engendered in students' learning while using PAT, we have two available sources of information: the data gathered by PAT during student-system interactions, and the students' solutions to the assignments. During each interaction between the student and the system, PAT records not just the session ID, MIE code, advice type, advice code, etc. (as part of the student model) but also the name of the dialog box and of the button, any time the student clicks on one of them. The data is recorded in the database containing the student model (described in Section 6.3). The information in this database allows us to see which topics were not initially understood (not known) by students while they were using PAT - either practising on the helping exercises or working on the assignments. If a student asked for help on a particular topic, that topic is considered not known at pre-test. A topic that the student never asked for help on is assumed to be known at pre-test. Because we already had access to this source of information (already in an electronic format), we avoided the extra burden on students of organising another form of pre-test, such as a paper or online test.

Another source of information is the students' solutions to the assignments. The students have three assignments for the Access part. The results from the assignments show that a student knows a topic if the student's solution is correct for that topic, and does not know the topic if the solution is incorrect. This source of information provided the data for the post-test.
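The two inference rules above can be summarised in a short sketch. The record formats below are hypothetical (PAT's actual student model lives in the Access tables described in Section 6.3); only the rules themselves are taken from the text.

```python
# Sketch of the pre/post-test inference rules (record formats hypothetical).

ALL_TOPICS = {"Form.RecordSource", "TextBox.ControlSource", "Label.Caption"}

def pretest_known(help_requests, topics=ALL_TOPICS):
    """Pre-test rule: a topic is 'not known' if the student ever asked for
    help on it; any topic never asked about is assumed known."""
    return topics - set(help_requests)

def posttest_known(assignment_results, topics=ALL_TOPICS):
    """Post-test rule: a topic is known iff the student's submitted
    solution is correct for that topic."""
    return {t for t in topics if assignment_results.get(t, False)}

# Hypothetical student: asked for help on ControlSource, later got it right.
helps = ["TextBox.ControlSource"]
marks = {"Form.RecordSource": True, "TextBox.ControlSource": True,
         "Label.Caption": True}
pre, post = pretest_known(helps), posttest_known(marks)
print(len(pre), len(post))  # topics known before vs after (2 vs 3 here)
```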

7.3.2 The Student Population

Of the 235 students enrolled in the Databases subject in the second semester of 2010, only 199 students submitted all the assignments and student models. Because using PAT was optional, the information received from some of these 199 students was insufficient for an accurate evaluation. Possible criteria for selecting the relevant students (students who used PAT enough to provide useful data) are:

◦ number of interactions (clicks on PAT's interfaces) - a maximum of 1598 and a mean of 182.5;
◦ number of advice messages received - a maximum of 394 and a mean of 30; and
◦ number of sessions started - a maximum of 151 and a mean of 11.

The first criterion above is a good measure of how much a student used PAT, in order to distinguish between significant and insignificant data. There were 108 students with more than 200 clicks. However, for ease of analysis, we only considered the first 100 students (in descending order of their number of interactions), which means a minimum of 208 clicks.
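A sketch of this selection step is shown below; the record layout is hypothetical, only the ranking rule and the cut-off come from the text.

```python
# Sketch of the participant-selection step: rank students by their number
# of interactions with PAT and keep the top 100 (record layout hypothetical).

def select_participants(students, n=100):
    """students: list of dicts, each with an 'interactions' click count."""
    ranked = sorted(students, key=lambda s: s["interactions"], reverse=True)
    return ranked[:n]

# With the semester 2, 2010 data, this cut-off corresponds to a minimum of
# 208 clicks for the 100th-ranked student.
demo = [{"interactions": i} for i in range(199)]
print(len(select_participants(demo)))  # 100
```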

7.3.3 The Topics Considered

From the 38 topics (object-properties) present in the exercises used during both the pre- and post-tests, I selected the 10 most relevant ones based on the following criteria:

◦ the topic should require the student to set a correct value (i.e. not topics that can easily be generated by the wizards or by using default values); and
◦ the topic should be important from a teaching perspective (i.e. some object-properties are more important than others).

These 10 topics are listed in Table 7.15.

ComboBox - RowSource
Form - Name
Form - NavigationButtons
Form - RecordSelectors
Form - RecordSource
Form - ScrollBars
Label - Caption
Report - Name
Subform - SourceObject
TextBox - ControlSource

Table 7.15: Topics (object type - property name) analysed in the pre-post test.

7.3.4 Results of the Pre-Post Evaluation

In Section 7.3.1, I explained that the data used for pre-test purposes provides information about the topics not known initially. Where students did not need help, I take it that they already knew the topic. The post-test data was obtained from the students' solutions to the assignments. This data provides accurate information about topics both known and unknown. From this data, I obtained the average number of topics known at the pre-test versus the average number of topics known at the post-test. These values are shown in Figure 7.13.

Figure 7.13: Topics known at pre and post tests.

On average, the number of topics known increased from 5.4 to 9.9 after using PAT, i.e. the average number of topics learned is 4.5. The mode is 4 topics learned, with a standard deviation of 1.87. Woolf (2008, p. 191) lists the questions that should be addressed in a "tutor alone" evaluation:

◦ Do learners with high or low prior knowledge benefit more?
◦ Do help messages lead to a better performance?

To address the first question, I analysed the pre-post results for students with prior knowledge of Access versus students with no prior knowledge of Access. When a student starts using PAT for the first time, PAT asks the student if they are confident with using Access. The answer is recorded in the student model. Based on this information, 55 students (out of the 100 students analysed) had prior knowledge of Access - i.e. they answered "yes" when asked if they were confident with using Access. The results of the pre-post test for the two categories of students are shown in Figure 7.14.

(Figure 7.14 compares the pre- and post-test numbers of topics known for students with no prior knowledge of Access versus students with prior knowledge of Access.)

Figure 7.14: Students confident using Access vs students new to Access.

It can be seen from the graph that the students with no prior knowledge learned a slightly higher number of topics: an average of almost 6 topics learned, compared with 5 topics learned by students with prior knowledge of Access. Regarding the second question that Woolf (2008) suggests answering: as I stated at the beginning of this section, I could not ask some of the students not to use PAT, or to use a version of PAT without feedback messages or with different feedback messages. However, I presented the students' opinions about the messages received earlier, in Section 7.2.2. A paired t-test was done for each group, to check whether each group improved independently. The improvements for both groups (students with and without previous experience in Access) are statistically significant, with a p-value of 0.0001 (both groups). However, unpaired t-tests on the pre-test and post-test results between the two groups indicated that the differences between the two groups are not statistically significant, with p-values of 0.1035 and 0.2534 respectively. This could be explained by the fact that the group of students who had used Access before was determined only by the students' own claim, which could have been over-confident. The distribution of the number of students by the number of topics learned is depicted in Figure 7.15.
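The tests reported above can be reproduced with standard routines; the sketch below uses SciPy, with placeholder lists standing in for the per-student topic counts (the actual study data is not reproduced here).

```python
# Sketch of the significance tests reported above, using SciPy's standard
# t-test routines. The four lists are placeholders, not the study data:
# each entry is one student's number of topics known.
from scipy.stats import ttest_ind, ttest_rel

pre_prior, post_prior = [5, 6, 4, 7], [10, 10, 9, 11]  # prior Access use
pre_new, post_new = [4, 5, 3, 6], [10, 9, 10, 11]      # new to Access

# Paired t-tests: did each group improve from pre-test to post-test?
print(ttest_rel(pre_prior, post_prior))
print(ttest_rel(pre_new, post_new))

# Unpaired t-tests: do the groups differ at pre-test and at post-test?
print(ttest_ind(pre_prior, pre_new))
print(ttest_ind(post_prior, post_new))
```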

Figure 7.15: Distribution of students by topics learned.

The graph shows how many students learned 1 topic, 2 topics, and so on, up to 9 topics - the maximum number of topics learned out of the 10 topics analysed. It can clearly be seen that the majority of students (about 70%) learned between 3 and 6 topics.

7.4 Summary

This chapter gives an overview of evaluation techniques for ITSs and shows how I chose the appropriate technique for PAT. The chapter also describes how I prepared and conducted the summative evaluation of PAT. From the results of the summative evaluations, it can be seen that both students and academic staff appreciate the help that PAT provides. The results of a pre-post test evaluation give evidence of the improvement that PAT engenders in students' learning. In the next chapter, I present a summary of this thesis's contributions and its limitations, and describe future work that could further enhance PAT's usefulness.

Chapter 8

Summary of Contributions and Future Directions

Everything that can be counted does not necessarily count; every- thing that counts cannot necessarily be counted.

- Albert Einstein (1879 - 1955)

This chapter describes the contributions of this research and ways of further enhancing PAT’s usefulness to students.


8.1 Research Contributions

In Section 1.4, I stated the main goal of my research: to increase the effectiveness of students learning Rapid Application Development in a database environment. As also stated in that section, one way of increasing the effectiveness of students' learning is to use an ITS to provide individual tuition. For ease of reference, I repeat the research tasks I identified in Section 1.4.

1) Design an appropriate ITS architecture for students learning to create forms and reports in Microsoft Access. The ITS will provide the students with augmented learning by using a real working environment instead of a simulation module.

2) Design new models for the knowledge base that can be used with the architecture designed as part of the first task. The information from the models will help the ITS to provide feedback so that an explanation or hint will be different even if the student is still stuck after the previous one.

3) Design a feedback mechanism that enables different students to receive different (individualised) hints, even for the same error.

4) Build an actual ITS for use in the classroom that will be based on the first three tasks.

5) Evaluate the new ITS and its acceptance by students and academic staff.

Next, I describe the contributions of this thesis.

8.1.1 Design of PAT’s Architecture

With regard to tasks 1 and 4 above, I created a new intelligent tutoring system, PAT, which successfully uses the knowledge base and the feedback mechanism created throughout this research.

Section 3.3.1 describes the system's architecture, showing how PAT uses a real RDBMS instead of a simulation. By using this architecture, I created for the students an environment that provides: an augmented learning experience, through directly using Access; and ease of use, as the students can ask for help while in the process of developing forms or reports. Section 6.2.2 shows how PAT can be used from within Access. In this way, the student can ask for help directly from Access, without having to stop his or her work. After receiving a hint, the student simply closes the hint window and continues to work on the exercise. The student can ask for help immediately after any change he or she makes, with only one click on PAT's interface (the This Exercise window). This is possible because the interface does not have to be closed while the student works on the exercise. With a simple click on their form or report, the student returns to Access and can continue his or her work on the exercise. PAT is not limited to a particular application database structure (such as that for a video store or an airline); PAT can be used with any database structure. Over a number of semesters, PAT has been used successfully at QUT in conjunction with a variety of application database structures.

8.1.2 Design New Models

With regard to task 2, I created four new models: the domain model, the exercises model, the student model, and the tutoring model (as described in Chapter 4). The domain model contains information about Access, with a focus on forms and reports. This domain model provides the foundation of PAT's knowledge base. No previous ITS has used Access as its domain for students learning RAD. In Section 4.2.2 I also showed how the domain model I created could be reused for ITSs using other database environments, such as OpenOffice Base.

PAT can be used without the need for any additional documentation; consequently, all the information that the students need to solve the exercises (description, requirements, purpose, justification, etc.) is included in the exercises model. The exercises model also contains the correct solution for each exercise. Because an exercise can be solved following different development paths, I introduced the Most Important Error (MIE) code as a measure of the relative importance of each part of the solution. Based on the MIE code, PAT knows which error to address if there is more than one error in the student's solution. The student model allows PAT to store information about: what the student knows or does not know about Access forms or reports at any particular point in time; the history of the student's use of PAT; and other personal characteristics of the student. PAT uses this information to individualise the hints for the student. The tutoring model contains tutoring goals and actions that help the student to correctly solve the exercises they are working on.
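The role of the MIE code can be illustrated with a small sketch. The data layout here is hypothetical (PAT stores the codes and solutions in the MasterSolution tables shown in Figure 6.17); only the selection rule - address the most important incorrect object-property - comes from the text.

```python
# Sketch of Most Important Error (MIE) selection: among all incorrect
# object-properties, address the one with the highest importance code.
# Data layout is hypothetical.

def most_important_error(master, student, mie_codes):
    """master/student: {(object, property): value}; mie_codes gives the
    relative importance of each object-property (higher = more important)."""
    errors = [key for key, value in master.items()
              if student.get(key) != value]
    if not errors:
        return None  # solution is correct
    return max(errors, key=lambda key: mie_codes[key])

master = {("Form", "RecordSource"): "qryStudents",
          ("TextBox", "ControlSource"): "FirstName"}
student = {("Form", "RecordSource"): "",           # missing record source
           ("TextBox", "ControlSource"): "Name"}   # wrong field
mie = {("Form", "RecordSource"): 90, ("TextBox", "ControlSource"): 40}
print(most_important_error(master, student, mie))  # ('Form', 'RecordSource')
```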

8.1.3 Design PAT’s Feedback Mechanism

With regard to task 3, the Instructional Expert I created for PAT provides feedback that consists of:

◦ individualised help given to the student to help them correctly finish each exercise; or
◦ a proposal for the next exercise that the student should work on.

PAT produces individualised hints on the fly, based on the Access object-properties stored in the domain model. When a hint is produced for the student (at an appropriate level of specificity), PAT also individualises the hint based on the student's history, as described in Chapter 5. In addition to providing individualised help to the student, PAT can also propose to the student a more suitable exercise, one more appropriate for his or her skills.

8.1.4 Evaluation of PAT

With regard to task 5, I conducted a formal, summative evaluation of PAT with students and teaching staff involved in the Databases subject. The results of the evaluation (see Chapter 7) demonstrated its usefulness for students' learning and also its acceptance by both students and staff members. The results of the evaluation also showed the differences between students who had used Access before the subject and students who had not. The differences were in the way the students used PAT and in the types of feedback they preferred. In addition, the results of the evaluation show that the students would like to have ITSs similar to PAT for other subjects. Furthermore, to show the improvement that PAT engenders in students' learning, I used the data that PAT gathers for a pre-post test. The results from the test show that the students who used PAT learned an average of 4.5 topics (out of the 10 most important topics analysed), with a mode of 4 topics learned and a standard deviation of 1.87. The results of the evaluation have already been published (Risco & Reye 2012).

8.2 Research Limitations

In Section 1.3, I stated that an ITS for learning RAD using Access does not need to address all the tools and features of Access. In this context, the limitations of PAT regarding Access are:

◦ limited functionality for analysing queries; and
◦ limited functionality for analysing VBA code.

These topics involve very complex issues and, as I described in Section 2.4.1, there are ITSs that specifically address only one of the above issues. PAT was designed and developed as research work, not as a commercial product. The main effort was focused on answering the research questions. Consequently, I did not provide any authoring tools to help the teaching staff create new exercises. While PAT can be used with any Access application (database), the exercises are still manually created and added to each newly compiled version of PAT. In addition, even though PAT records much useful information in the student model, there are no automated ways to report that information. In Section 4.2.2 I described how the domain model I designed could be used for an ITS built for another database environment, using OpenOffice Base as an example. Nonetheless, time constraints prevented me from actually implementing this.

8.3 Future Directions

Because of PAT’s modular structure, further enhancements can be made. The enhancements to PAT from which the students could benefit are: A) automatically generate the MIE code in the exercise solution;

B) analyse (and provide help on) not just the correctness of the solution but also the readability and usability of the form or report;

C) an open student model; and

D) reports and statistics for teaching staff about students' learning performance.

A) Currently, the most important error code (MIE code) is manually added to the correct solution of an exercise, based on best practices when creating forms and reports in Access (as described in Section 4.3.2). However, automatically generating the MIE code for an exercise, while also considering the student's needs, would be beneficial for the student. As an example, PAT could check which object in the form or report the student worked on last, and the MIE code could be adjusted to follow the same path as the student.

B) When PAT analyses the student's form or report, PAT analyses its correctness from a functional point of view, i.e. whether the form or report produces the correct data. From a human user's perspective, though, the readability and usability of the form or report could also be analysed. To illustrate: from a readability point of view, the objects in the form should have the same size, should be aligned, and should be grouped by their meaning or function. From a usability perspective, the fields should be displayed in the most meaningful order, i.e. the same order in which the data will be entered - first name, last name and address; not last name, address, and only then the first name. (A hypothetical sketch of such a readability check is given after item D below.)

For PAT, this approach would involve analysing other aspects, such as the relative position and size of each of the objects in the form or report.

C) An open student model (as described in Section 2.2.2) would allow the students to check their profile. This would help the students' learning by facilitating metacognitive processes and providing them with an opportunity to reflect on their progress.

D) A beneficial addition for the teaching staff would be to generate reports and statistics about students' learning performance. The aggregated data collected by PAT could help the teaching staff to identify the topics that are hard to learn, suggesting areas for future improvements.
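To make enhancement (B) more concrete, the following is a hypothetical sketch of one such readability check: detecting controls that are almost, but not exactly, left-aligned. The tuple layout and the tolerance value are invented for illustration; a real implementation would read the controls' position properties (Left, Top, Width, Height) from the form.

```python
# Hypothetical sketch of a readability check for enhancement (B): flag
# controls whose left edges are almost, but not exactly, aligned with the
# dominant left edge on the form. Geometry values are invented.
from collections import Counter

def misaligned_controls(controls, tolerance=10):
    """controls: (name, left, top, width, height) tuples, in twips."""
    lefts = Counter(left for _, left, *_ in controls)
    anchor = lefts.most_common(1)[0][0]  # the most common left edge
    return [name for name, left, *_ in controls
            if left != anchor and abs(left - anchor) <= tolerance]

form = [("lblName", 600, 300, 900, 240),
        ("txtName", 600, 600, 1800, 240),
        ("txtAddress", 607, 900, 1800, 240)]  # 7 twips off: flag it
print(misaligned_controls(form))              # ['txtAddress']
```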

Appendix A

Research Methodology and Process

To accomplish the tasks previously defined in Section 1.4, I employed a multi-methodological research strategy, "Systems Development", as presented by Nunamaker et al. (1991). Figure A.1 (adapted from Nunamaker et al. (1991)) presents an integrated approach to Information Systems research. Four research strategies are involved in this multi-methodological approach: theory building, experimentation, observation, and systems development. Figure A.2 (adapted from Nunamaker et al. (1991)) shows the five stages of Systems Development: concept design; constructing the architecture of the system; designing the system; prototyping and product development; and evaluation. A description of what will be done in each of these stages is presented below.

1) Construct a conceptual framework - In order to start constructing any software, a literature review of related research must be done. Relevant disciplines related to my research must be studied, and appropriate ideas incorporated. I also had to investigate what high-level functionalities and requirements the system should have.


(Figure A.1 shows the four strategies and their methods: Theory Building - conceptual frameworks, mathematical models, methods; Systems Development - prototyping, product development, technology transfer; Observation - case studies, surveys, field studies; Experimentation - computer simulations, field experiments, laboratory experiments.)

Figure A.1: A multi-methodological approach to IS Research.

2) Develop a system architecture - Develop the architecture design with modularity (Parnas 2005) and extensibility in mind (future features, such as including in the system mini-lessons to be presented to the student, etc.). Define the functionalities of all system components and the relations between them. At this stage we clarify all the constraints, so that they are measurable at the evaluation stage.

3) Design the system - The main objective of this stage is to design the knowledge base and the processes required to carry out the system functions. Alternative solutions must be compared, and the best one chosen.

(Figure A.2 shows the five stages as a sequence: Construct a Conceptual Framework; Develop a System Architecture; Design the System; Build the (Prototype) System; Observe and Evaluate the System.)

Figure A.2: A Process for Systems Development Research.

4) Build the system - The program code is written and tested. In order to evaluate the research work, a working product must exist. A prototype can be built to allow preliminary evaluations.

5) Experiment, observe and evaluate the system - Having built the system, its performance - as stated in the requirements - can be verified. Additionally, testing the functionality and utility of the student model's contribution represents a large task that could result in several changes to the system over time. This is the reason for having two periods of observation and evaluation in the Research Plan.

As a refinement of the “Design the System” stage, for an AI-based project, the knowledge engineering process is applicable. Russell & Norvig (2003) describe this process as having the following steps.

1) Identify the task: determine the range of questions that the knowledge base will support and the kinds of facts that will be available.

2) Assemble the relevant knowledge: this is the knowledge acquisition part of the process. At this stage, the important issues are to understand the scope of the knowledge base and how the domain actually works.

3) Decide on a vocabulary of predicates, functions and constants: create the ontology of the domain by translating the important domain-level concepts into logic-level names.

4) Encode general knowledge about the domain: this can be achieved by establishing the axioms for all the terms from the vocabulary defined previously; returning to correct the ontology if necessary.

5) Encode a description of the specific problem instance: we write atomic sentences about instances of concepts that are part of the ontology already created.

6) Pose queries to the inference procedure and get answers: we let the inference procedures work on the axioms and problem-specific facts to derive the facts we are interested in knowing.

7) Debug the knowledge base: correct any errors detected.
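To make these steps concrete, here is a toy sketch (in Python, purely illustrative; the predicates, constants and rule are invented for this example and are not PAT's actual knowledge base) covering steps 3 to 6 for a tiny fragment of the forms domain:

    # Step 3 (vocabulary) and step 5 (problem-specific facts):
    # atomic sentences are represented as tuples.
    facts = {
        ("textbox", "txtName"),
        ("bound_to", "txtName", "CompanyName"),
        ("field", "CompanyName", "Customers"),
    }

    # Step 4 (general domain knowledge), as one if-then rule:
    # a control bound to a field of a table displays data from that table.
    def apply_rules(kb):
        derived = set()
        for f in kb:
            if f[0] == "bound_to":
                control, field = f[1], f[2]
                for g in kb:
                    if g[0] == "field" and g[1] == field:
                        derived.add(("displays_from", control, g[2]))
        return derived

    # Step 6 (pose queries): naive forward chaining to a fixed point,
    # then a membership test answers the query.
    kb = set(facts)
    while True:
        new = apply_rules(kb) - kb
        if not new:
            break
        kb |= new

    print(("displays_from", "txtName", "Customers") in kb)  # prints: True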

Appendix B

Initial Focus Group

Setting the Focus Group

The theory underlying the functionalities that an ITS can have is well defined in the literature: Brusilovsky (1994a), Self (1987) and, more recently, VanLehn (2006). However, students' perceptions of ITSs in a specific domain (in our case, learning how to use RAD to create forms and reports as interfaces to a database) can confirm the usefulness of some of the functionalities, refute others, or bring new ideas.

Research Questions for Initial Focus Group

The research questions for this initial evaluation were:

1) What are the students’ perceptions about using Intelligent Tutoring Sys- tems (ITS) in their learning?

2) What are the students’ expectations from an ITS?

From the range of qualitative research methods, I chose the focus group method for two main reasons: first, the settings required for a focus group could easily be met in this particular case; second, the focus group method is suitable (Edmunds 1999, p. 5) for the research questions stated above. The preparation of this focus group was based on the book “Focus Group Interviews in Education and Psychology” (Vaughn et al. 1996).

Purpose of the Initial Focus Group

Based on these research questions, we can define the purpose of this focus group: to ascertain students' perceptions about the use of Intelligent Tutoring Systems as learning tools at university. We are particularly interested in the expectations they have of such a system and their opinions when comparing several solutions offered. The aim of the focus group is to find answers to the following questions. We DO want to know:

◦ What broad functionalities do they expect from an ITS?
◦ What specific functionalities do they expect from an ITS?
◦ What are their opinions about the solutions presented?

We do NOT want to know (as these are out of the scope of this research):

◦ Do they want mini lessons or presentations about a specific topic?
◦ Do they trust the system?

Focus Group Goals

The goals we want to achieve can be grouped by how the information will be used and by the outcomes required to consider the focus group successful (Vaughn et al. 1996).

How the information will be used:

◦ Develop a general understanding of students’ perception about using ITSs as learning tools.

The outcomes required to consider the focus group successful:

◦ “Most Wanted” functionalities.
◦ Data gathered from participants should allow us to further develop the questionnaire for later use.

Setting the Initial Focus Group

Number of Participants

The initial aim was to have between 8 and 10 participants. However, 6 participants can still achieve the declared objectives.

Selection of Participants

Participants will be selected from first-year students enrolled in ITB117. The only requirement in selecting participants is that the students have already completed the Databases subject. In this way, we can use their past experience with the non-ITS software (Reye 2004/2008) they used before.

Time Allocated

The focus group will last 1.5 hours. To minimise the students' effort to participate, we scheduled the focus group to take place right after the students' practical session.

Recording

The preferred method of recording the discussion is video recording. The main reason for having the session recorded is not to miss any of the participants' answers or ideas. Moreover, in this way we can review the participants' answers, observe their attitudes towards others' answers and even analyse their body language.

Presentation about the Topic

Participants will see a short presentation describing what Intelligent Tutoring Systems are and their main components. Nothing will be presented about the specific functions these systems can have, because these represent one of the focus group's objectives: to understand what students expect from an ITS.

Clarification of Terms

The goal of this part is to clarify all the terms used during the presentation. Participants will be encouraged to ask about any word or concept that is not understood or is confusing. A list of definitions for the key terms is available in the Appendix (Definitions for Key Terms), to be used by the moderator if required.

Questions

During this phase, the moderator is required to write the participants' ideas on the white board. Having all the ideas on the white board will give participants the opportunity to refer to them at any time. The first group of questions, besides its introductory purpose, will allow us to ascertain students' perceptions about the use of ITSs. Questions from the second part will help to find students' expectations of an ITS on the one hand, and their opinions regarding the options presented on the other. In interpreting students' responses, their expectations will in fact give us the functions the system should have. Opinions about the solutions offered will provide feedback to be considered for the interface that my system will use.

Interview Time-lines

Stage        Phase                           Time
Starting     Introduction                    5 min
             Presentation                    10 min
             Clarification of Terms          5 min
Questions    Introductory questions          15 min
             Specific questions              35 min
             Wrap-Up                         10 min
Conclusions  Record not-completed ideas      5 min
             Answer any remaining questions  4 min
Closing      Express thanks                  1 min

Table B.1: Interview Time-lines

Questions for the Initial Focus Group

PAT initial evaluation – focus group

No  Question

1   Was using the PAT AddIn useful or not? Why?
2   Compare the PAT AddIn with an ITS. What are the differences?
3   What features would you like from an ITS version of PAT? What do you think an ITS should offer you?
4   How can the interface be enhanced? What “elements” should the interface have?
5   What should happen when you click on the “Help” button?
6   Compare the interfaces. What can be added/removed? Why? Other suggestions?
7   What actions should be available to the user to choose from?
8   Have we missed anything?

Figure B.1: The questions given to students - focus group

Appendix C

How to Install PAT

This appendix contains a document given to the students enrolled in the Databases subject at QUT. The document describes how to install the PAT add-in.

Home Installation of PAT

You can install PAT on your home computer or in a lab at QUT. However, if you install PAT at QUT, the system (PAT) will not be able to keep track of your progress (you will have to reinstall PAT every time you restart the computer).

Before You Start

(a) Supported Operating Systems: PAT may be used with Windows XP and Windows Vista.

(b) Supported versions of MS Office: PAT can be installed for MS Office 2003 and MS Office 2007.


Download and Install the Basic Files

1) Download PatAddin.msi from the Blackboard page for the Databases subject (Assessment - Microsoft Access Assignment).

2) Double-click on PatAddin.msi. The following dialog box appears.

Figure C.1: First install window

3) Click Next. The following dialog box appears.

Figure C.2: Second install window

By default, the add-in will be installed in C:\Program files\QUT\QUT PAT Add-in. You can click on Browse if you want to choose another location. Regardless of the location you choose for the installation of the add-in, another folder (PAT-data), containing the documentation and databases you need to use with PAT, will be created on the C: drive. You cannot choose (or change) the location for these files.

4) Click Next. The following dialog box appears. Now the installer is ready to install PAT on your computer.

5) Click Next. The following dialog box appears.

Figure C.3: Third install window

Figure C.4: Last install window

Now PAT is installed on your computer.

6) Click Close.

7) If you like to keep your folders nice and clean, delete both the zip file and the unzipped files from step 2 above. You can always download them again later, if needed.

8) If you want to uninstall PAT, go to: Control Panel - Add/Remove Programs, then choose QUT PAT - Uninstall.

Start Using PAT

To use PAT, all you have to do is open the NorthwindDemo database, which is installed with PAT: Start - All Programs - PATaddin Documentation - NorthwindDemo.mdb

Using PAT Add-in For the First Time

A description of how you should use PAT is presented in the document called “PATaddin-GettingStarted”. It is really important to read this document before you start using PAT. Start - All Programs - PATaddin Documentation - PATaddin-GettingStarted.Doc

Appendix D

How to Use PAT

This appendix contains a document given to the students enrolled in the Databases subject at QUT. The document tells the student how to get started with PAT after installing the add-in.

Starting PAT For the First Time

The “Personal Access Tutor (PAT) Add-in” is designed to help you learn how to create forms in MS Access, in your own time, and at your own pace. After installing PAT, you will have a new folder on your computer's C: drive - PAT-data. In that folder you have this document and a copy of the NorthwindDemo database. These two files are also accessible from the links: Start - Program Files - PATaddin Documentation. You should now open the NorthwindDemo database. When MS Access starts, you will see a new “Add-Ins” tab in the ribbon. Select this and you will see two new buttons - as in the next image.


Figure D.1: PAT’s menu in Access

For the first time only, regardless of which button you click on, the Welcome window will appear.

Figure D.2: First Time window

Just type your name (or nickname) and click Submit. Your name/nickname will only be used for a personalised interaction with the system. The system does not collect information that can trace any individual. If you are willing to answer a few more questions (7 in total), click “Yes”; if not, click “No”. If you clicked on “Yes”, the window from the picture on the next page will appear:

Figure D.3: Questions window

PAT asks these questions to be able to give you the appropriate feedback. For each question to which your answer is “Yes”, check the associated checkbox. When you are done, click “Submit answers”. You can still change your mind and click “Don't want to answer”, and no information will be recorded. Regardless of whether or not you submit the answers, when the “Questions” window is closed, a new window (see below) will appear.

Figure D.4: Welcome window

Now you can start working with PAT - proceed to step 2 (on page 6 of this document).

Start MS Access Again (with PAT already installed)

After the first time you have used PAT, when you restart MS Access and click on one of the buttons (My Profile or This Exercise), another window will appear first:

Figure D.5: Welcome Back window

You are reminded about the last exercise you were working on; in this way, you can click on “Go to 'This Exercise' window” to continue working on the same exercise, or click on “Go to 'My Profile' window” to select another exercise. If you click on “Go to 'My Profile' window”, the “My Profile” window will appear and you can select another exercise. If you click on “Go to 'This Exercise' window”, the “This Exercise” window will appear and you can see again all the requirements for the previous exercise.

Best Way to Work with PAT

PAT was designed to help students learn how to create forms in MS Access in their own time, at their own pace, as quickly and as easily as possible. To make the most of it, you should follow these steps:

1. Open the “My Profile” window.

2. Select an exercise from the “My Profile” window; first choose the criteria for displaying the exercises:

Figure D.6: How to choose an exercise.

Then click on an exercise in the list:

Figure D.7: How to choose an exercise.

The purpose of each exercise appears in the “Why should you do this exercise” section; when you find an exercise you want to work on, click on “See detailed description”. Even if you let PAT choose an exercise for you, you still have to click on the exercise (to select it) and then click on “See detailed description”.

3. The “This Exercise” window will open and you will be able to see the detailed description and requirements for the selected exercise:

4. Start working on the exercise.

5. Just to be safe, from time to time, you can check if your solution is correct: click on the “Is my solution correct?” button:

Figure D.8: How to choose an exercise.

6. The “Traffic Lights” window will appear, showing you what is correctly done (green light) and what is not correctly done (red light); an orange light means that your answer is not yet complete. If not everything is correct, try to fix the problem by yourself. (A small illustrative sketch of this mapping appears after these steps.)

7. ONLY if you are stuck, ask for help to see what is wrong or how to fix the error:

A new window containing advice will appear:

8. If you still don't know how to get past it, have a look at “More Readings” or see an example:

9. When you have finished an exercise, go back to the “My Profile” window and ... choose another one!

Figure D.9: This exercise window.

Figure D.10: Check your solution.

10. HAVE FUN!

Figure D.11: Traffic Lights window

Figure D.12: Check what is wrong with your solution.
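The following minimal sketch (illustrative only - this is not PAT's actual code, and the requirement names and check results are made up) shows the idea behind the three lights: each requirement of the exercise is checked, and the outcome of the check determines the colour shown:

    # Illustrative only - not PAT's actual code.
    def traffic_light(attempted: bool, correct: bool) -> str:
        # Green = correct, red = incorrect, orange = not yet complete.
        if not attempted:
            return "orange"
        return "green" if correct else "red"

    checks = [
        ("Form has a title label", True, True),
        ("Text box is bound to the right field", True, False),
        ("Command button that closes the form", False, False),
    ]

    for requirement, attempted, correct in checks:
        print(f"{traffic_light(attempted, correct):6}  {requirement}")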

If You Only Want to Check the Assignment

To check the form and/or the report in the assignment, open the “My Profile” window. Then select “Assignments” from the “For the topic:” combo-box. Then select the form or report name from the Exercise list - with the “Show me all exercises” option enabled.

Figure D.13: Display Advice window

Figure D.14: Ask for more help.

Figure D.15: Have fun!

After this, in the “This Exercise” window, click on the “Is my solution correct?” button. If you need help, you can still check “What is wrong with my solution” (see the image on the next page).

Figure D.16: Select the assignment.

Remember!!! What you have to do for the assignment is specified in the assignment requirements! PAT only checks the contents of the form and the report. It does not check the database relationships or the queries.

Limitations

Remember: the version of PAT that you are using now is just a prototype, still under development. You are free to use, copy and distribute PAT for non-commercial use, provided that no fee is charged for its use, copying or distribution, and it is not modified in any way. QUT disclaims all warranties as to this software, whether express or implied, including without limitation any implied warranties of merchantability, fitness for a particular purpose and functionality. This prototype is the result of merging two independent applications: the “Traffic Lights” and the “Helper”. These two parts work independently.

Figure D.17: Check the solution for the assignment.

Giving Feedback about PAT

It is very important to give us your opinion about PAT. Thanks to last semester's students, you have an improved version now. Thanks to you, next semester's students can have an even better version. To give feedback, click on the “send feedback” link on any of PAT's windows.

Figure D.18: Give feedback.

You can also go directly to http://pat.pctotal.net/Feedback.

Bibliography

Abraham, A. (2005), Adaptation of fuzzy inference system using neural learning, in N. Nedjah & L. de Macedo Mourelle, eds, ‘Fuzzy Systems Engineering’, Vol. 181 of Studies in Fuzziness and Soft Computing, Springer-Verlag Berlin Heidelberg, pp. 53–83.

Adamski, J. J. & Finnegan, K. T. (2008), New Perspectives on Microsoft Office Access 2007 - Comprehensive, Thomson Course Technology.

Aleven, V., McLaren, B., Roll, I. & Koedinger, K. (2004), Toward tutoring help seeking; applying cognitive modeling to meta-cognitive skills, in J. C. Lester, R. M. Vicari & F. Paraguaçu, eds, ‘Intelligent Tutoring Systems, 7th International Conference, ITS 2004’, Vol. 3220 of Lecture Notes in Computer Science, Springer, Maceió, Alagoas, Brasil, pp. 227–239.

Almstrum, V. L., Dale, N., Berglund, A., Granger, M., Little, J. C., Miller, D. M., Petre, M., Schragger, P. & Springsteel, F. (1996), Evaluation: turning technology from toy to tool: report of the working group on evaluation, in ‘ITiCSE ’96: Proceedings of the 1st conference on Integrating technology into computer science education’, ACM, New York, NY, USA, pp. 201–217.

Alpert, S. R., Singley, M. K. & Fairweather, P. G. (1999), ‘Deploying intelligent tutors on the web: An architecture and an example’, International Journal of Artificial Intelligence in Education 10(2), 183–197.

Anderson, J. R. (1983), The Architecture of Cognition, Harvard University Press, Cambridge, MA.

Anderson, J. R. (1988), The expert module, in M. C. Polson & J. J. Richardson, eds, ‘Foundations of Intelligent Tutoring Systems’, Lawrence Erlbaum Associates Publishers, pp. 21–54.

Anderson, J. R., Boyle, C. F., Farrell, R. & Reiser, B. (1984), Cognitive principles in the design of computer tutors, in ‘Sixth Annual Cognitive Science Meetings’, pp. 2–10.

Anderson, J. R., Corbett, A. T., Koedinger, K. & Pelletier, R. (1995), ‘Cognitive tutors: Lessons learned’, The Journal of the Learning Sciences 4(2), 167–207.

Arroyo, I. (2000), AnimalWatch: an arithmetic ITS for elementary and middle school students, in G. Gauthier, C. Frasson & K. VanLehn, eds, ‘Intelligent Tutoring Systems, 5th International Conference, ITS 2000’, Vol. 1839 of Lecture Notes in Computer Science, Springer-Verlag, Montréal, Canada.

Arroyo, I., Beal, C., Murray, T., Walles, R. & Woolf, B. P. (2004), Web-based intelligent multimedia tutoring for high stakes achievement tests, in J. C. Lester, R. M. Vicari & F. Paraguaçu, eds, ‘Intelligent Tutoring Systems, 7th International Conference, ITS 2004’, Vol. 3220 of Lecture Notes in Computer Science, Springer, Maceió, Alagoas, Brasil, pp. 468–477.

Arroyo, I., Murray, T., Woolf, B. P. & Beal, C. (2004), Inferring unobservable learning variables from students' help seeking behavior, in J. C. Lester, R. M. Vicari & F. Paraguaçu, eds, ‘Intelligent Tutoring Systems, 7th International Conference, ITS 2004’, Vol. 3220, Springer, Maceió, Alagoas, Brasil, pp. 782–784.

Arroyo, I., Woolf, B. P., Royer, J. M. & Tai, M. (2009), Affective gendered learning companions, in V. Dimitrova, R. Mizoguchi, B. du Boulay & A. C. Graesser, eds, ‘Artificial Intelligence in Education 2009’, IOS Press, Amsterdam, pp. 41–48.

Barrows, H. S. (1988), The tutorial process, rev. edn, Southern Illinois University School of Medicine, Springfield, Ill.

Beal, C. & Cohen, P. (2007), Computational methods for evaluating student and group learning histories in intelligent tutoring systems, in C.-K. Looi, G. McCalla, B. Bredeweg & J. Breuker, eds, ‘Artificial Intelligence in Education: Supporting Learning through Intelligent and Socially Informed Technology.’, Vol. 125, IOS Press, Amsterdam, The Netherlands, pp. 80–87.

Beck, J., Stern, M. & Haugsjaa, E. (1996), ‘Applications of AI in education’, Web Page; http://www1.acm.org/crossroads/xrds3-1/aied.html. Last visited 24 March 2005.

Belghith, K., Nkambou, R., Kabanza, F. & Hartman, L. (2011), ‘An intelligent simulator for telerobotics training’, IEEE Transactions on Learning Technologies 99(PrePrints).

Bell, B. S., Kanar, A. M. & Kozlowski, S. W. (2008), ‘Current issues and future directions in simulation-based training’, Working Paper Series. URL: http://digitalcommons.ilr.cornell.edu/cgi/viewcontent.cgi?article=1493&context=cahrswp

Bergmann, R., Kolodner, J. & Plaza, E. (2006), ‘Representation in case-based reasoning’, The Knowledge Engineering Review 20(3), 209–213.

Bhagat, S., Bhagat, L., Kavalan, J. & Sasikumar, M. (2002), Acharya: An intelligent tutoring environment for learning SQL, in ‘Proceedings of Vidyakash 2002 International Conference on Online Learning’.

Bloom, B. (1984), ‘The 2 sigma problem: The search for methods of group instruction as effective as one-to-one tutoring’, Educational Researcher 13(6), 4–16.

Bourdeau, J., Mizoguchi, R., Psyche, V. & Nkambou, R. (2004), Selecting theories in an ontology-based ITS authoring environment, in J. C. Lester, R. M. Vicari & F. Paraguaçu, eds, ‘Intelligent Tutoring Systems, 7th International Conference, ITS 2004’, Vol. 3220 of Lecture Notes in Computer Science, Springer, Maceió, Alagoas, Brasil, pp. 150–161.

Branting, L. K. (1999), Reasoning with Rules and Precedents - A Computational Model of Legal Analysis, Springer.

Bredeweg, B. & Winkels, R. (1994), Student modelling through qualitative reasoning, in J. E. Greer & G. I. McCalla, eds, ‘Student Modelling: The Key to Individualized Knowledge-Based Instruction’, Vol. 125 of NATO ASI Series F: Computer and Systems Sciences, Springer Verlag, pp. 63–98.

Brusilovsky, P. (1994a), ‘The construction and application of student models in intelligent tutoring systems’, Journal of Computer and Systems Sciences International 32(1), 70–89.

Brusilovsky, P. (1994b), Student model centered architecture for intelligent learning environments, in U. M. Inc., ed., ‘Fourth International Conference on User Modeling’, pp. 31–36.

Brusilovsky, P., Karagiannidis, C. & Sampson, D. (2004), ‘Layered evaluation of adaptive learning systems’, Int. J. Cont. Engineering Education and Lifelong Learning 14(Nos. 4/5), 402–421.

Bull, S. & Gardner, P. (2009), Highlighting learning across a degree with an independent open learner model, in V. Dimitrova, R. Mizoguchi, B. du Boulay & A. C. Graesser, eds, ‘Artificial Intelligence in Education 2009’, IOS Press, Amsterdam, pp. 275–282.

Bull, S. & Kay, J. (2007), ‘Student models that invite the learner in: The smili:() open learner modelling framework’, Int. J. Artif. Intell. Ed. 17(2), 89–120.

Burns, H. L. & Capps, C. G. (1988), The expert module, in M. C. Polson & J. J. Richardson, eds, ‘Foundations of Intelligent Tutoring Systems’, Lawrence Erlbaum Associates Publishers, pp. 21–54.

Burns, H. L. & Parlett, J. W. (1991), The evolution of intelligent tutoring systems: Dimensions of design, in H. L. Burns, J. W. Parlett & C. L. Redfield, eds, ‘Intelligent tutoring systems: evolutions in design’, Lawrence Erlbaum Associates, Hillsdale, NJ, pp. 1–12.

Burstein, F. & Gregor, S. (1999), The systems development or engineering approach to research in information systems: An action research perspective, in B. Hope & P. Yoong, eds, ‘Proceedings of 10th Australasian Conference on Information Systems’, Vol. 1, Wellington, New Zealand, pp. 122–134.

Butz, C., Hua, S. & Maguire, R. (2008), Web-based Bayesian intelligent tutoring systems, in R. Nayak, N. Ichalkaranje & L. Jain, eds, ‘Evolution of the Web in Artificial Intelligence Environments’, Vol. 130 of Studies in Computational Intelligence, Springer Berlin / Heidelberg, pp. 221–242.

Butz, C. J., Hua, S. & Maguire, R. B. (2006), ‘A web-based Bayesian intelligent tutoring system for computer programming’, Web Intelli. and Agent Sys. 4(1), 77–97.

Card, S. K., Moran, T. P. & Newell, A. (1983), The psychology of human-computer interaction, L. Erlbaum Associates, Hillsdale, N.J.

Carroll, J. M. (1990), The Nurnberg funnel: designing minimalist instruction for practical computer skill, MIT Press, Cambridge, Mass.

Carroll, J. M. (1998), Minimalism beyond the Nurnberg funnel, MIT Press, Cambridge, Mass.

Chabay, R. W. & Sherwood, B. A. (1992), A practical guide for the creation of educational software, in J. H. Larkin & R. W. Chabay, eds, ‘Computer-Assisted Instruction and Intelligent Tutoring Systems: Shared Goals and Complementary Approaches’, Lawrence Erlbaum Associates Publishers, chapter 5, pp. 151–186.

Chapman, B. (2004), ‘E-learning simulation products and services’, Web page; http://www.prisim.com/News/. Retrieved June 2011.

Chin, D. N. (2001), ‘Empirical evaluation of user models and user-adapted systems’, User Modeling and User-Adapted Interaction 11(Nos. 1-2), 181–194.

Clancey, W. J. (1987), Knowledge-based tutoring : the GUIDON program, MIT Press.

Cohen, P. R., Beal, C. R. & Adams, N. (2008), The design, deployment and evaluation of the AnimalWatch intelligent tutoring system, in ‘Prestigious Applications of AI (PAIS), European Conference on AI’, IOS Press, pp. 663–677.

Conati, C. (2010), Bayesian student modeling, in ‘Advances in Intelligent Tutoring Systems’, Studies in Computational Intelligence, Springer, New York, pp. 281–299.

Conati, C., Gertner, A. & VanLehn, K. (2002), ‘Using Bayesian networks to manage uncertainty in student modeling’, User Modeling and User-Adapted Interaction 12(4), 371–417.

Corbett, A. T. & Anderson, J. R. (1995), ‘Knowledge tracing: Modeling the acquisition of procedural knowledge’, User Modeling and User-Adapted Interaction (UMUAI) 4, 253–278.

Corbett, A. T. & Anderson, J. R. (2001), Locus of feedback control in computer-based tutoring: impact on learning rate, achievement and attitudes, in ‘Proceedings of the SIGCHI conference on Human factors in computing systems’, ACM Press, pp. 245–252.

Costa, E., Gaspar, G. & Coelho, E. (1994), A formal approach to ILEs, in J. E. Greer & G. I. McCalla, eds, ‘Student Modelling: The Key to Individualized Knowledge-Based Instruction’, Vol. 125 of NATO ASI Series F: Computer and Systems Sciences, Springer Verlag, pp. 281–294.

Curzon, P. & Blandford, A. (2002), From a formal user model to design rules, in ‘DSV-IS ’02: Proceedings of the 9th International Workshop on Interactive Systems. Design, Specification, and Verification’, Springer-Verlag, London, UK, pp. 1–15.

Di Eugenio, B., Fossati, D., Haller, S. M., Yu, D. & Glass, M. (2008), ‘Be brief, and they shall learn: Generating concise language feedback for a computer tutor’, International Journal of Artificial Intelligence in Education 18, 317–345.

Dimitrova, V. G. (2001), Interactive Open Learner Modelling, PhD thesis, The University of Leeds, Computer Based Learning Unit.

Dollinger, R. (2010), SQL lightweight tutoring module - semantic analysis of SQL queries based on XML representation and LINQ, in ‘Proceedings of World Conference on Educational Multimedia, Hypermedia and Telecommunications 2010’, AACE, Toronto, Canada, pp. 3323–3328.

Dufresne, A., Rouatbi, M. & Guerdelli, F. (2008), The use of ontologies to structure and support interactions in LOR, in B. W. et al., ed., ‘Proceedings of the 9th International Conference on Intelligent Tutoring Systems’, Springer-Verlag, Berlin, pp. 551–562.

Edmunds, H. (1999), The focus group research handbook, NTC Business Books.

Eliot, C. R. & Woolf, B. P. (1996), A simulation-based tutor that reasons about multiple agents, in ‘Proceedings of the thirteenth national conference on Artificial intelligence’, Vol. 1 of AAAI’96, AAAI Press, pp. 409–415.

Epp, C. D. (2010), ‘Protutor: a pronunciation tutor that uses historic open learner models’. Department of Computer Science, University of Saskatchewan, Saskatoon, Canada.

Epp, C. D. & McCalla, G. (2011), Protutor: Historic open learner models for pronunciation tutoring, in G. Biswas, S. Bull, J. Kay & A. Mitrovic, eds, ‘15th International Conference, AIED 2011’, LNAI 6738, Springer-Verlag Berlin Heidelberg, Auckland, New Zealand, pp. 441–443.

Evens, M. W., Chang, R.-C., Lee, Y. H., Shim, L. S., Woo, C. W., Zhang, Y., Michael, J. A. & Rovick, A. A. (1997), Circsim-tutor: an intelligent tutoring system using natural language dialogue, in ‘Proceedings of the fifth conference on Applied natural language processing: Descriptions of system demonstrations and videos’, ANLC ’97, Association for Computational Linguistics, Stroudsburg, PA, USA, pp. 13–14.

finchannel.com (2010), ‘Microsoft Office 2010 now available for consumers worldwide’, Web Page; http://finchannel.com/Main News/Tech/65178 Microsoft Office 2010 Now Available for Consumers Worldwide. Last visited 4 October 2011.

Fischer, G. (2001), ‘User modeling in human-computer interaction’, User Mod- eling and User-Adapted Interaction (UMUAI) 11(Issue 1 - 2), 65–86.

Fournier-Viger, P., Nkambou, R. & Mayers, A. (2008), ‘Evaluating spatial representations and skills in a simulator-based tutoring system’, IEEE Transactions on Learning Technologies 1, 63–74.

Fournier-Viger, P., Nkambou, R. & Nguifo, E. M. (2010), Building intelligent tutoring systems for ill-defined domains, in ‘Advances in Intelligent Tutoring Systems’, Studies in Computational Intelligence, Springer, New York, pp. 81–101.

Gamut, L. T. F. (1991), Logic, language, and meaning, Vol. 1, University of Chicago Press, Chicago.

Gertner, A., Conati, C. & VanLehn, K. (1998), Procedural help in andes: Generating hints using a bayesian network student model, in ‘Proceedings of the Fifteenth National Conference on Artificial Intelligence AAAI-98’, The MIT Press, Cambridge, MA, pp. 106–111.

Gluga, R., Kay, J. & Lever, T. (2010), Modeling long term learning of generic skills, in V. Aleven, J. Kay & J. Mostow, eds, ‘Tenth International Conference on Intelligent Tutoring Systems: Bridges to Learning, ITS2010 (1)’, number (LNCS) 6094 in ‘Lecture Notes in Computer Science’, Springer-Verlag Berlin Heidelberg, pp. 85–94.

Goldstein, I. P. (1979), ‘The genetic graph: a representation for the evolution of procedural knowledge’, International Journal of Man-Machine Studies 11(1), 51–77.

Greer, J. E. & McCalla, G. I. (1994), Student modelling: the key to individualized knowledge-based instruction, Vol. 125 of NATO ASI Series F: Computer and Systems Sciences, Springer-Verlag.

Gruber, T. R. (1995), ‘Toward principles for the design of ontologies used for knowledge sharing’, International Journal of Human-Computer Studies 43(5-6), 907–928.

Halpin, T. A. (2001), Information modeling and relational databases : from conceptual analysis to logical design, The Morgan Kaufmann series in data management systems, Morgan Kaufman, San Francisco; London.

Halpin, T. A. (2005a), ‘Object role modelling’, Web Page; http://www.orm.net/. last updated: 2005 March 23; last visited 6 May 2005.

Halpin, T. A. (2005b), ‘Orm 2 graphical notation’, Technical Report. Neumont University.

Han, B. (2001), ‘Student modelling and adaptivity in web based learning sys- tems’, Masters Thesis. Massey University, New Zealand.

Hannafin, M. J. & Peck, K. L. (1988), The design, development, and evaluation of instructional software, Collier Macmillan Publishers, New York.

Hatzilygeroudis, I. & Prentzas, J. (2004), Knowledge representation requirements for intelligent tutoring systems, in J. C. Lester, R. M. Vicari & F. Paraguaçu, eds, ‘Intelligent Tutoring Systems, 7th International Conference, ITS 2004’, Vol. 3220 of Lecture Notes in Computer Science, Springer, Maceió, Alagoas, Brasil, pp. 87–97.

Heffernan, N. T. & Koedinger, K. R. (2002), An intelligent tutoring system incorporating a model of an experienced human tutor, in S. Cerri, G. Gouardares & F. Paraguasu, eds, ‘Intelligent Tutoring Systems ITS2002’, Vol. 2363 of Lecture Notes in Computer Science, Springer Berlin / Heidelberg, pp. 596–608.

Hoffer, J. A., George, J. F. & Valacich, J. S. (1999), Modern Systems Analysis and Design, Addison-Wesley, Reading, Mass.

Holt, P., Dubs, S., Jones, M. & Greer, J. (1994), The state of student modeling, in J. E. Greer & G. McCalla, eds, ‘Student modelling: the key to individualized knowledge-based instruction’, Vol. 125 of NATO ASI Series F: Computer and Systems Sciences, Springer-Verlag, pp. 3–35.

Horvitz, E., Breese, J. S., Heckerman, D., Hovel, D. & Rommelse, K. (1998), The Lumière project: Bayesian user modeling for inferring the goals and needs of software users, in ‘Proceedings of the 14th Conference on Uncertainty in Artificial Intelligence (UAI’98)’, pp. 256–265.

Hume, G., Michael, J., Rovick, A. & Evens, M. (1996), ‘Hinting as a tactic in one-on-one tutoring’, Journal of the Learning Sciences 5(1), 23–49.

Ingalls, R. G. (2008), Introduction to simulation, in ‘WSC ’08: Proceedings of the 40th Conference on Winter Simulation’, Winter Simulation Conference, Miami, Florida, pp. 17–26.

Inoue, Y. (2001), ‘Methodological issues in the evaluation of intelligent tutoring systems’, Journal of Educational Technology Systems 29(3), 251–258.

Iqbal, A., Oppermann, R., Patel, A. & Kinshuk (1999), A classification of evaluation methods for intelligent tutoring systems, in ‘Software-Ergonomie 99, Design von Informationswelten, Gemeinsame Fachtagung des German Chapter of the ACM, der Gesellschaft fur Informatik (GI) und der SAP AG’, Teubner, pp. 169–181.

Jackson, G. T. & Graesser, A. C. (2007), Content matters: An investigation of feedback categories within an its, in R. Luckin, K. Koedinger & J. Greer, eds, ‘Artificial Intelligence in Education: Building technology rich learning contexts that work.’, Vol. 126, IOS Press, Amsterdam, The Netherlands, pp. 127–134.

Jameson, A. (2005), User modeling meets usability goals, in L. Ardissono & A. Mitrovic, eds, ‘User Modeling: Proceedings of the Tenth International Conference’, Springer, Berlin.

John, B. E. & Kieras, D. E. (1996), ‘Using goms for user interface design and evaluation: which technique?’, ACM Transactions on Computer-Human Interaction 3(4), 287–319.

Jonassen, D. & Grabowski, B. (1993), Handbook of individual differences, learning, and instruction, Lawrence Erlbaum Associates, Hillsdale, N.J.

Kassim, A. A., Ahmed, K. S. & Ranganath, S. (2001), A web-based intelligent approach to tutoring, in ‘Proceedings of the International Conference on Engineering Education’.

Kay, J. (1997), Learner know thyself: Student models to give learner control and responsibility, in ‘Control and Responsibility, International Conference on Computers in Education, AACE’, pp. 17–24.

Kearsley, G. (n.d.), ‘Explorations in learning and instruction: The theory into practice database’, Web Page; http://tip.psychology.org/. last visited 24 March 2011.

Khuwaja, R., Evens, M., Rovick, A. & Michael, J. (1994), Architecture of circsim-tutor (v.3): a smart cardiovascular physiology tutor, in ‘Proceedings of IEEE Seventh Symposium on Computer-Based Medical Systems’, pp. 158–163.

Kincaid, J. P. & Westerlund, K. K. (2009), Simulation in education and training, in ‘Proceedings of the 2009 Winter Simulation Conference (WSC)’, IEEE, Austin, TX, USA, pp. 273–280.

Kinshuk, Patel, A. & Russell, D. (2000), ‘A multi-institutional evaluation of intelligent tutoring tools in numeric disciplines’, Educational Technology and Society 4(3), 66–74.

Knowles, M. S. (1975), Self-directed learning: a guide for learners and teachers, Cambridge Adult Education, New York.

Knowles, M. S. (1980), The modern practice of adult education : from pedagogy to andragogy, Cambridge Adult Education, New York.

Knowles, M. S. (1984), Andragogy in action, Proquest Info & Learning.

Koch, N. P. d. (2000), Software Engineering for Adaptive Hypermedia Systems. Reference Model, Modeling Techniques and Development Process, PhD thesis, Ludwig-Maximilians-Universität.

Kodaganallur, V., Weitz, R. R. & Rosenthal, D. (2005), ‘A comparison of model-tracing and constraint-based intelligent tutoring paradigms’, International Journal of Artificial Intelligence in Education 15(2), 117–144.

Koedinger, K. & Anderson, J. R. (1993), Reifying implicit planning in geometry: Guidelines for model-based intelligent tutoring system design, in S. P. Lajoie & S. J. Derry, eds, ‘Computers as cognitive tools’, Hillsdale, NJ: Erlbaum, pp. 15–46.

Krueger, R. A. & Casey, M. A. (2000), Focus groups: a practical guide for applied research, 3rd edn, Sage Publications, Thousand Oaks, Calif.

Kumar, A. N. (2011), Error-flagging support and higher test scores, in G. Biswas, S. Bull, J. Kay & A. Mitrovic, eds, ‘15th International Conference, AIED 2011’, LNAI 6738, Springer-Verlag Berlin Heidelberg, Auckland, New Zealand, pp. 147–154.

Lesta, L. & Yacef, K. (2002), An intelligent teaching assistant system for logic, in S. Cerri, G. Gouardares & F. Paraguasu, eds, ‘Intelligent Tutoring Systems ITS2002’, Vol. 2363 of Lecture Notes in Computer Science, Springer Berlin / Heidelberg, pp. 421–431.

Lester, J. C. (2006), ‘Reflections on the kvl tutoring framework: Past, present, and future’, Int. J. Artif. Intell. Ed. 16(3), 271–276.

Littman, D. & Soloway, E. (1988), Evaluating ITSs: The cognitive science perspective, in M. C. Polson & J. J. Richardson, eds, ‘Foundations of Intelligent Tutoring Systems’, Lawrence Erlbaum Associates Publishers, pp. 191–208.

Loftin, B., Mastaglio, T. W. & Kenney, P. (2004), ‘Outstanding research issues in intelligent tutoring systems’, Scientific and Technical Report; Submitted to Simulation Technology Center at US Army RDECOM. Contract N61339-03-C-0156.

Ludwig, J., Ludwig, E., Ludwig, E., Stottler, R. & Nguyen, A. (2010), Integrating an intelligent tutoring system for TAOs with Second Life, in ‘Proceedings of Interservice/Industry Training, Simulation, and Education Conference (I/ITSEC)’, pp. 1–9. Paper No. 10087.

Lynch, C. F., Aleven, V., Ashley, K. D. & Pinkwart, N. (2006), Defining ’ill-defined domains’; a literature survey, in C. F. Lynch, V. Aleven, K. D. Ashley & N. Pinkwart, eds, ‘Eighth International Conference on Intelligent Tutoring Systems, Workshop on Intelligent Tutoring Systems for Ill-Defined Domains’, Jhongli, Taiwan, pp. 1–10.

Mark, M. & Greer, J. (1993), ‘Evaluation methodologies for intelligent tutoring systems’, Journal of Artificial Intelligence and Education 4(2/3), 129–153.

Martin, B., Koedinger, K., Mitrovic, A. & Mathan, S. (2007), On using learning curves to evaluate ITS, in C.-K. Looi, G. McCalla, B. Bredeweg & J. Breuker, eds, ‘Artificial Intelligence in Education: Supporting Learning through Intelligent and Socially Informed Technology.’, Vol. 125, IOS Press, Amsterdam, The Netherlands, pp. 419–426.

Martin, B. & Mitrovic, A. (2005), Using learning curves to mine student models, in L. Ardissono, P. Brna & A. Mitrovic, eds, ‘User Modeling 2005’, Vol. 3538 of Lecture Notes in Computer Science, Springer Berlin / Heidelberg, pp. 151–151.

Martin, J. (1991), Rapid application development, Macmillan Publishing Co., Inc., Indianapolis, IN, USA. 274 BIBLIOGRAPHY

Mathan, S. A. & Koedinger, K. R. (2002), An empirical assessment of comprehension fostering features in an intelligent tutoring system, in S. Cerri, G. Gouarderes & F. Paraguasu, eds, ‘Intelligent Tutoring Systems’, Vol. 2363 of Lecture Notes in Computer Science, Springer-Verlag Berlin / Heidelberg, pp. 330–343.

Matsuda, N. & VanLehn, K. (2005), Advanced geometry tutor: An intelligent tutor that teaches proof-writing with construction, in C. Looi, G. McCalla, B. Bredeweg & J. Breuker, eds, ‘Proceedings of The 12th International Conference on Artificial Intelligence in Education’, IOS Press, Amsterdam, The Netherlands, pp. 443–450.

Mayo, M. & Mitrovic, A. (2001), ‘Optimising ITS behaviour with Bayesian networks and decision theory’, International Journal of Artificial Intelligence in Education 12, 124–153.

McConnell, S. (1996), Rapid development: taming wild software schedules, Microsoft Press, Redmond, Wash.

Menzel, W. (2006), ‘Constraint-based modeling and ambiguity’, International Journal of Artificial Intelligence in Education 16, 29–63.

Merrill, D. C., Reiser, B. J., Ranney, M. & Trafton, J. G. (1992), ‘Effective tutoring techniques: A comparison of human tutors and intelligent tutoring systems’, Journal of the Learning Sciences 2(3), 277–305.

Millán, E. & Pérez-de-la-Cruz, J. L. (2002), ‘A bayesian diagnostic algorithm for student modeling and its evaluation’, User Modeling and User-Adapted Interaction 12(2-3), 281–330.

Miller, J. R. (1988), The role of human-computer interaction in intelligent tutoring systems, in M. C. Polson & J. J. Richardson, eds, ‘Foundations of Intelligent Tutoring Systems’, Lawrence Erlbaum Associates Publishers, pp. 143–190.

Mitrovic, A. (1998), Learning SQL with a computerized tutor, in ‘Proceedings of the twenty-ninth SIGCSE technical symposium on Computer Science Education’, ACM Press, Atlanta, Georgia, pp. 307–311.

Mitrovic, A. (2002), Normit, a web-enabled tutor for database normalization, in ‘Proceedings of the International Conference on Computers in Education ICCE’, Vol. 2, Auckland, New Zealand, pp. 1276–1280.

Mitrovic, A. (2003), ‘An intelligent SQL tutor on the web’, International Journal of Artificial Intelligence in Education 13, 171–195.

Mitrovic, A. (2005), The effect of explaining on learning: a case study with a data normalization tutor, in C. Looi, G. McCalla, B. Bredeweg & J. Breuker, eds, ‘Proceedings of The 12th International Conference on Artificial Intelligence in Education’, IOS Press, Amsterdam, The Netherlands, pp. 499–506.

Mitrovic, A. & Devedzic, V. (2004), ‘A model of multitutor ontology-based learning environments’, Int. J. Cont. Engineering Education and Lifelong Learning 14(3), 229–245.

Mitrovic, A., Koedinger, K. & Martin, B. (2003), A comparative analysis of cognitive tutoring and constraint-based modeling, in P. Brusilovsky, A. Corbett & F. de Rosis, eds, ‘User Modeling 2003’, Vol. 2702 of Lecture Notes in Computer Science, Springer Berlin / Heidelberg, pp. 313–322.

Mitrovic, A. & Ohlsson, S. (1999), ‘Evaluation of a constraint-based tutor for a database language’, International Journal of Artificial Intelligence in Education 10(3), 238–256.

Mitrovic, A., Suraweera, P. & Martin, B. (2004), ‘Db-suite: Experiences with three intelligent, web-based database tutors’, Journal of Interactive Learning Research 15, 409–432.

Mitrovic, A. & team, T. I. (2008), Constraint-based tutors, in B. P. Woolf, E. Aimeur, R. Nkambou & S. Lajoie, eds, ‘9th International Conference on Intelligent Tutoring Systems’, number (LNCS) 5091 in ‘Lecture Notes in Computer Science’, Springer-Verlag, Montreal, Canada, pp. 29–32.

Mitrovic, A. & Weerasinghe, A. (2009), Revisiting ill-definedness and the consequences for ITSs, in V. Dimitrova, R. Mizoguchi, B. du Boulay & A. C. Graesser, eds, ‘Artificial Intelligence in Education 2009’, IOS Press, pp. 375–382.

Möbus, C., Schröder, O. & Thole, H.-J. (1994), Diagnosing and evaluating the acquisition process of problem solving schemata in the domain of functional programming, in J. E. Greer & G. I. McCalla, eds, ‘Student Modelling: The Key to Individualized Knowledge-Based Instruction’, Vol. 125 of NATO ASI Series F: Computer and Systems Sciences, Springer Verlag, pp. 211–264.

Moore, J. D. (1995), Participating in explanatory dialogues: interpreting and responding to questions in context, MIT Press.

Morgan, D. L., Krueger, R. A. & King, J. A. (1998), Focus group kit, Vol. 4: Moderating Focus Groups, SAGE Publications.

NHMRC (2007), ‘National statement on ethical conduct in research involving humans’, Web page: http://www.nhmrc.gov.au/publications/synopses/e35syn.htm. Last visited: 25 July 2009.

Nisbet, J. D. & Shucksmith, J. (1996), Learning strategies, Routledge and Kegan Paul, London.

Nkambou, R., Belghith, K. & Kabanza, F. (2006), An approach to intelligent training on a robotic simulator using an innovative path-planner, in ‘8th International Conference on Intelligent Tutoring Systems’, Vol. 4053 of Lecture Notes in Computer Science, Springer, pp. 645–654.

Nkambou, R., Bourdeau, J. & Mizoguchi, R. (2010), Introduction: What are intelligent tutoring systems, and why this book?, in R. Nkambou, J. Bourdeau & R. Mizoguchi, eds, ‘Advances in Intelligent Tutoring Systems’, Vol. 308 of Studies in Computational Intelligence, Springer Berlin / Heidelberg, pp. 1–12.

Nunamaker, J. F. J., Chen, M. & Purdin, T. D. M. (1991), ‘Systems development in information systems research’, Journal of Management Information Systems 7(3), 89–106.

Oguejiofor, E., Kicinger, R., Popovici, E., Arciszewski, T. & De Jong, K. (2004), ‘Intelligent tutoring systems: an ontology-based approach’, International Journal of IT in Architecture, Engineering and Construction 2(2), 115–128.

Ohlsson, S. (1992), ‘Constraint-based student modeling’, Journal of Artificial Intelligence in Education 3(4), 429–447.

Paramythis, A., Weibelzahl, S. & Masthoff, J. (2010), ‘Layered evaluation of interactive adaptive systems: framework and formative methods’, User Modeling and User-Adapted Interaction 20, 383–453.

Parnas, D. L. (1972), ‘On the criteria to be used in decomposing systems into modules’, Commun. ACM 15(12), 1053–1058.

Parnas, D. L. (2005), Document driven disciplined development of software., in P. Strooper, ed., ‘Australian Software Engineering Conference’, pp. 2–4.

Parvez, S. M. & Blank, G. D. (2008), Individualizing tutoring with learning style based feedback, in B. W. et al., ed., ‘Proceedings of the 9th International Conference on Intelligent Tutoring Systems’, Springer-Verlag, Berlin, pp. 291–301.

Pearl, J. (2009), Causality: Models, Reasoning and Inference, 2nd edn, Cambridge University Press.

Perez-de-la Cruz, J.-L., Conejo, R. & Guzman, E. (2007), Qualitative and quantitative student models, in C.-K. Looi, G. McCalla, B. Bredeweg & J. Breuker, eds, ‘Artificial Intelligence in Education: Supporting Learning through Intelligent and Socially Informed Technology.’, Vol. 125, IOS Press, Amsterdam, The Netherlands, pp. 531–538.

Prentzas, J. & Hatzilygeroudis, I. (2002), Integrating hybrid rule-based with case-based reasoning, in S. Craw & A. Preece, eds, ‘Procs 2002 European Conference on Case-Based Reasoning’, Vol. LNAI 2416 of Advances in Case-Based Reasoning, Springer-Verlag, pp. 336–349.

Razzaq, L. M. & Heffernan, N. T. (2010), Hints: Is it better to give or wait to be asked?, in V. Aleven, J. Kay & J. Mostow, eds, ‘Tenth International Conference on Intelligent Tutoring Systems: Bridges to Learning, ITS2010 (1)’, number (LNCS) 6094 in ‘Lecture Notes in Computer Science’, Springer-Verlag Berlin Heidelberg, pp. 349–358.

Reeves, T. (1994), Evaluating what really matters in computer-based education, in M. Wild & D. Kirkpatrick, eds, ‘Evaluating What Really Matters in Computer-Based Education’, Perth, Australia, pp. 219–246.

Regian, J. W. (1997), Functional area analysis of intelligent computer-assisted instruction, Technical report, TAPSTEM ICAI-FAA Committee.

Remolina, E., Ramachandran, S., Fu, D., Stottler, R. & Howse, W. R. (2004), Intelligent simulation-based tutor for flight training, in ‘Interservice/Industry Training Simulation and Education Conference (I/ITSEC)’, pp. 1–13. Paper No. 1743.

Reye, J. (2004), ‘Student modeling based on belief networks’, International Journal of Artificial Intelligence in Education 14, 63–96.

Reye, J. (2004/2008), Personal Access Tutor - PAT. Unpublished work.

Rich, E. A. (1983), ‘Users are individuals: individualising user models.’, Inter- national Journal of Man-Machine Studies 18, 199–214.

Risco, S. (2005), Intelligent tutoring systems: student's perception and expectations. Focus Group - unpublished work.

Risco, S. & Reye, J. (2009), Personal access tutor - helping students to learn ms access, in V. Dimitrova, R. Mizoguchi, B. du Boulay & A. C. Graesser, eds, ‘Artificial Intelligence in Education 2009’, IOS Press, pp. 541–548.

Risco, S. & Reye, J. (2012), Evaluation of an intelligent tutoring system used for teaching RAD in a database environment, in M. de Raadt & A. Carbone, eds, ‘Fourteenth Australasian Computing Education Conference (ACE2012)’, Vol. 123 of CRPIT, ACS, Melbourne, Australia, pp. 131–140.

Ritter, S. & Koedinger, K. R. (1996), ‘An architecture for plug-in tutor agents.’, Journal of Artificial Intelligence in Education 7(3-4), 315–347.

Roll, I., Aleven, V. & Koedinger, K. R. (2010), The invention lab: Using a hybrid of model tracing and constraint-based modeling to offer intelligent support in inquiry environments, in V. Aleven, J. Kay & J. Mostow, eds, ‘Tenth International Conference on Intelligent Tutoring Systems: Bridges to Learning, ITS2010 (1)’, number (LNCS) 6094 in ‘Lecture Notes in Computer Science’, Springer-Verlag Berlin Heidelberg, pp. 115–124.

Russell, S. J. & Norvig, P. (2003), Artificial intelligence : a modern approach, 2nd edn, Prentice Hall, Upper Saddle River, N.J.

Russell, S. J. & Norvig, P. (2010), Artificial intelligence: a modern approach, Prentice Hall series in artificial intelligence, 3rd edn, Prentice Hall, Upper Saddle River, N.J.

Salleh, E., Winslett, G., Bruce, C. & Tickle, A. (2005), Diagnostic learning: Using web-based self diagnostic tools for learning abstract concepts in data network education, in ‘A paper presented to the OLT Conference, QUT, Brisbane’, pp. 241–250.

Schatz, S., Bowers, C. & Nicholson, D. (2009), Advanced situated tutors: Design, philosophy, and a review of existing systems, in ‘Human Factors and Ergonomics Society Annual Meeting Proceedings’, Vol. 53, Human Factors and Ergonomics Society, pp. 1944–1948.

Schulmeister, R. (1997), ‘Hypermedia learning systems’, Web Page; http://www.izhd.uni-hamburg.de/paginae/Book/Frames/Start-FRAME.html. Last visited 30 April 2005.

Schweinzer, H. (2008), Computerized add-on tool for inspection and supervision of hands-on electrical and electronic hardware training, in ‘Proceedings of Conference on Human System Interactions’, pp. 852–857.

Self, J. (1987), Student models: what use are they?, in P. Ercoli & R. Lewis, eds, ‘Artificial Intelligence Tools in Education’, Frascati, pp. 73–86.

Self, J. (1990), ‘Theoretical foundations for intelligent tutoring systems’, Journal of Artificial Intelligence in Education 1(4), 3–14.

Self, J. (1994), Formal approaches to student modelling, in J. E. Greer & G. I. McCalla, eds, ‘Student Modelling : The Key to Individualized Knowledge- Based Instruction’, Vol. 125 of NATO ASI Series F: Computer and Systems Sciences, Springer Verlag, pp. 295–354.

Shannon, R. E. (1998), Introduction to the art and science of simulation, in D. Medeiros, E. F. Watson, J. Carson & M. Manivannan, eds, ‘WSC 98: Proceedings of the 1998 Winter Simulation Conference’, Winter Simulation Conference, Miami, Florida, pp. 7–14.

Shute, V. & Psotka, J. (1994), Intelligent tutoring systems: past, present and future, in D. Jonassen, ed., ‘Handbook of Research on Educational Communications and Technology, Scholastic’, pp. 570–600.

Shute, V., Torreano, L. & Willis, R. (1998), DNA - uncorking the bottleneck in knowledge elicitation and organization, in B. Goettl, H. Halff, C. Redfield & V. Shute, eds, ‘Intelligent Tutoring Systems, 4th International Conference, ITS ’98’, Vol. 1452 of Lecture Notes in Computer Science, Springer Berlin / Heidelberg, pp. 146–155.

Siemer, J. & Angelides, M. C. (1994), Embedding an intelligent tutoring system in a business gaming-simulation environment, in J. T. et al, ed., ‘Proceedings of the 26th conference on Winter simulation’, WSC ’94, Society for Computer Simulation International, San Diego, CA, USA, pp. 1399–1406.

Singh, R., Saleem, M., Pradhan, P., Heffernan, C., Heffernan, N. T., Razzaq, L., Dailey, M. D., O'Connor, C. & Mulcahy, C. (2011), Feedback during web-based homework: The role of hints, in G. Biswas, S. Bull, J. Kay & A. Mitrovic, eds, ‘15th International Conference, AIED 2011’, LNAI 6738, Springer-Verlag Berlin Heidelberg, Auckland, New Zealand, pp. 328–336.

Sleeman, D. & Brown, J. S. (1982), Intelligent Tutoring Systems, Computers and people, Academic Press.

Sowa, J. F. (1984), Conceptual structures: information processing in mind and machine, Addison-Wesley, Reading, Mass.

Sowa, J. F. (2006a), Conceptual graphs, in P. Bernus, K. Mertins & G. Schmidt, eds, ‘Handbook on architectures of information systems’, Springer, pp. 295–320.

Sowa, J. F. (2006b), ‘Semantic networks’, Web Page; http://www.jfsowa.com/pubs/semnet.htm. Last Visited 05 June 2009.

Stottler, R., Davis, A., Panichas, S. & Treadwell, M. (2007), Designing and implementing intelligent tutoring instruction for tactical action officers, in ‘Proceedings of Interservice/Industry Training, Simulation, and Education Conference (I/ITSEC)’. Paper No. 7483.

Suraweera, P. & Mitrovic, A. (2002), Kermit: A constraint-based tutor for database modeling, in S. A. Cerri, G. Gouarderes & F. Paraguacu, eds, ‘Intelligent Tutoring Systems’, Vol. 2363 of Lecture Notes in Computer Science, Springer-Verlag, pp. 377–387.

Suraweera, P. & Mitrovic, A. (2004), ‘An intelligent tutoring system for entity relationship modelling’, International Journal of Artificial Intelligence in Education 14, 375–417.

Sykes, E. R. (2003), An intelligent tutoring system prototype for learning to program Java, in ‘The 3rd IEEE International Conference on Advanced Learning Technologies (ICALT’03)’, IEEE Computer Society, p. 485.

Tang, T. Y. & McCalla, G. (2007), Paper annotation with learner models, in C.-K. Looi, G. McCalla, B. Bredeweg & J. Breuker, eds, ‘Artificial Intelligence in Education: Supporting Learning through Intelligent and Socially Informed Technology.’, Vol. 125, IOS Press, Amsterdam, The Netherlands, pp. 654–661.

Tedre, M. (2006), What should be automated?: The fundamental question underlying human-centered computing, in ‘HCM ’06: Proceedings of the 1st ACM international workshop on Human-centered multimedia’, ACM Press, New York, NY, USA, pp. 19–24.

Truong, N. (2007), A Web-based Programming Environment for Novice Programmers, PhD thesis, School of Software Engineering and Data Communications, Faculty of IT, Queensland University of Technology, Brisbane, Australia.

Truong, N., Bancroft, P. & Roe, P. (2003), A web based environment for learning to program, in M. J. Oudshoorn, ed., ‘Twenty-Sixth Australasian Computer Science Conference (ACSC2003)’, Vol. 16 of CRPIT, ACS, Adelaide, Australia, pp. 255–264.

VanLehn, K. (1988), Student modelling, in M. C. Polson & J. J. Richardson, eds, ‘Foundations of Intelligent Tutoring Systems’, Lawrence Erlbaum Associates Publishers, pp. 55–78.

VanLehn, K. (2006), ‘The behavior of tutoring systems’, International Journal of Artificial Intelligence in Education 16, 227–265.

VanLehn, K., Lynch, C., Schulze, K., Shapiro, J. A. & Shelby, R. (2007), The andes tutoring system: Five years of evaluations, in C.-K. Looi, G. McCalla, B. Bredeweg & J. Breuker, eds, ‘Artificial Intelligence in Education: Supporting Learning through Intelligent and Socially Informed Technology.’, Vol. 125, IOS Press, Amsterdam, The Netherlands, pp. 678–686.

VanLehn, K., Lynch, C., Schulze, K., Shapiro, J. A., Shelby, R., Taylor, L., Treacy, D., Weinstein, A. & Wintersgill, M. (2005), ‘The andes physics tutoring system: Lessons learned’, International Journal of Artificial Intelligence in Education 15(3), 147–204.

VanLehn, K., van de Sande, B., Shelby, R. & Gershman, S. (2010), The andes physics tutoring system: An experiment in freedom, in ‘Advances in Intelligent Tutoring Systems’, Studies in Computational Intelligence, Springer, New York, pp. 421–443.

Vaughn, S., Schumm, J. S. & Sinagub, J. M. (1996), Focus group interviews in education and psychology, Sage Publications, Thousand Oaks.

Washbush, J. & Gosen, J. (2001), ‘An exploration of game-derived learning in total enterprise simulations’, Simulation & gaming 32(3), 281–296.

Webber, C. (2004), From errors to conceptions - an approach to student diagnosis, in J. C. Lester, R. M. Vicari & F. Paraguaçu, eds, ‘Intelligent Tutoring Systems, 7th International Conference, ITS 2004’, Vol. 3220 of Lecture Notes in Computer Science, Springer, Maceió, Alagoas, Brasil, pp. 710–719.

Wenger, E. (1987), Artificial intelligence and tutoring systems: computational and cognitive approaches to the communication of knowledge, Morgan Kaufmann Publishers, Los Altos, Calif.

Woolf, B. P. (1984), Context dependent planning in a machine tutor, PhD thesis, University of Massachusetts at Amherst, Graduate School.

Woolf, B. P. (1996), ‘Intelligent multimedia tutoring systems’, Communications of the ACM 39(4), 30–31.

Woolf, B. P. (2006), Gender and cognitive differences in help effectiveness during problem solving, in ‘Tech., Inst., Cognition and Learning’, Vol. 3, Old City Publishing Inc., pp. 89–95.

Woolf, B. P. (2008), Building Intelligent Interactive Tutors: Student-centered Strategies for Revolutionizing E-learning, Morgan Kaufmann.

Woolf, B. P. (2010), Student modeling, in ‘Advances in Intelligent Tutoring Systems’, Studies in Computational Intelligence, Springer, New York, pp. 267–279.

Woolf, B. P., Beck, J., Eliot, C. & Stern, M. (2001), Growth and maturity of intelligent tutoring systems: a status report, in ‘Smart machines in education: the coming revolution in educational technology’, MIT Press, pp. 99–144.

WTLDocumentation (2002), ‘Wtl documentation’, Web page: www.viksoe.dk/wtldoc/. last visited: 03 March 2010.

Yacci, M. (1994), ‘A grounded theory of student choice in information-rich learning environments’, J. Educ. Multimedia Hypermedia 3(3-4), 327–350.

Zadeh, L. A. (1996), Fuzzy Sets, Fuzzy Logic, And Fuzzy Systems, Vol. 6 of Advances in Fuzzy Systems Applications and Theory, World Scientific Publishing.

Zakharov, K., Mitrovic, A. & Ohlsson, S. (2005), Feedback micro-engineering in EER-Tutor, in ‘Proceedings of the 2005 conference on Artificial Intelligence in Education: Supporting Learning through Intelligent and Socially Informed Technology’, IOS Press, Amsterdam, The Netherlands, pp. 718–725.

Zhou, Y. & Evens, M. (1999), A practical student model in an intelligent tutoring system, in ‘Proceedings of 11th IEEE International Conference on Tools with Artificial Intelligence’, pp. 13–18.