
Student-focused evaluation of elearning activities

Karen Fill, Centre for Learning & Teaching, University of Southampton. Email: [email protected]

Paper presented at the European Conference on Educational Research, University College Dublin, 7-10 September 2005

Abstract

Under the auspices of a funded digital library project, a number of new computer- based activities have been developed to enhance learning for geography students. In December 2004, student-focused evaluation of some of these innovations was undertaken on three separate courses of study. This involved observation of students as they were introduced to specific online learning activities, analysis of completed questionnaires and focused discussion. The questionnaire design drew on a specific methodology and generic quality criteria, facilitating comparative analysis of results. Teaching staff were invited to add any questions of particular interest and preliminary findings from the analysis were discussed with them. Their reflections informed the final evaluation reports.

Students were particularly positive about descriptions of learning objectives and content, accessibility of linked resources, inclusion of required tools, appropriateness of assessments and improvement of their knowledge and skills. However, one group who were remotely tutored gave negative feedback about the motivational and support aspects of this format. Female students’ responses, on these aspects, were more negative than those of males. A vocal minority of all students on the three courses reported that their tutors significantly underestimated the time required to complete online learning activities.

Introduction

The JISC/NSF1 funded project, Digital Libraries in Support of Innovative Approaches to Learning and Teaching in Geography (DialogPlus), has enabled geography teachers in two American and two English universities to develop a number of online activities to enhance student learning. In December 2004, the author of this paper undertook specifically student-focused evaluation of some of these innovations. Three separate courses of study were involved, two at level two (intermediate) and one at level three (final year and postgraduate). Two were in the domain of Physical and Environmental Geography, the other in Earth Observation and Geographical Information Systems. Two adopted a blended approach to learning and teaching, the other was delivered in distance learning mode.

In the blended models, students attended traditional lectures and occasional seminars. The eLearning component comprised a number of online activities which introduced the students to a specific topic, offered them a wide range of embedded digital library resources for further study, set them tasks involving the manipulation of real data sets, and assessed their learning both formatively and summatively. In the distance learning model, the students were in fact campus-based but the tutor was in a different country. The lecture material was online, released weekly to the students, who then had to undertake practical, lab-based assessments. Tutor support was via email and an online discussion board.

The evaluation involved observation of two groups of students as they were introduced to specific online activities, design of questionnaires, analysis of completed forms, and the use of a nominal group discussion technique with the distance learners. The questionnaire design drew on the MECA-ODL methodology and quality criteria (Riddy & Fill, 2004), facilitating comparative analysis of results. Teaching staff were invited to add any questions of particular interest and preliminary findings from the analysis were discussed with them. Their reflections informed the final evaluation reports.

Overall, student responses were particularly positive with respect to descriptions of learning objectives and content, accessibility of linked resources, inclusion of required tools, appropriateness of assessments and improvement of their knowledge and skills. The students on the distance course, all young undergraduates with no prior experience of this delivery mode, responded negatively about their own motivation, preferences for learning and support, with respect to the online and remote tutoring components. Female students’ responses, on these aspects, were more negative than those of male students. A vocal minority of all students on the three courses reported that their tutors significantly underestimated the time required to complete eLearning activities.

1 JISC is the UK’s Joint Information Systems Committee; NSF is the USA’s National Science Foundation.

This paper outlines the background to the DialogPlus project, the specific online learning activities reviewed in December 2004, the evaluation processes and detailed outcomes.

The DialogPlus Project

This five-year project started in 2002. Geographers at Pennsylvania State University and the University of California, Santa Barbara (U.S.) and the University of Southampton and the University of Leeds (U.K.) are collaborating in the use of digital library resources and the development of online learning activities. They are supported by Computer Scientists and Educationalists in the partner universities.

Examples of project outputs include digital resources (documents, images, maps, databases, simulations); online activities and complete units of learning that use these and other web-based digital resources; a toolkit to support the design of online activities (Conole and Fill, 2005); and evaluation reports on their effectiveness in supporting student learning. The development phase of the project finishes in 2006 and will be followed by a period of consolidation during which it is intended that the resources and, importantly, the innovative academic practice should become embedded in the partner universities.

Evaluated online learning activities

This paper describes and discusses student-focused evaluation of DialogPlus resources used on three courses of study in the first semester of academic year 2004/5. The courses are anonymised here and referred to by number.

Course 1 was taken by 130 campus based undergraduates, mainly in their second year of study, with a few third years and postgraduates. Project members had developed a number of eLearning activities to complement the traditional lectures and encourage active learning. The evaluated activity introduced students to important environmental indicators, offered them links to many online resources, and required them to manipulate real world data, discuss and critique the results obtained. Numeric and textual answers were entered online and assessed by a mixture of computer and human marking. Students were introduced to the activity in a timetabled practical session and completed it in their own time. Face-to-face support was available in ‘clinic’ sessions and there was an online discussion board.

Course 2 was taken by 58 second year undergraduates. The students were campus based but in academic year 2004/5 the tutor for the first semester unit was working in another country. As part of the DialogPlus project the weekly face-to-face lectures for this unit had been developed into eLearning modules, linking to many useful resources and with associated online multiple choice quizzes (MCQs). The students worked through these at their own pace. They could contact the tutor by email or use an online discussion board. They also attended weekly practical sessions, supervised by postgraduate assistants. In the final week of the course the tutor returned for a face-to-face lecture and group tutorials.

Course 3 was taken by 48 campus based students, mainly third year undergraduates with some postgraduate students. Here again, eLearning activities supplemented lectures in a blended delivery mode. The evaluated activity involved calibrating and using a computerised model. Students were introduced to the activity in a timetabled practical session and completed it in their own time. Some group work was involved in calibrating the model.

Evaluation Methods

The internal project evaluation activities are led by Educationalists at the University of Southampton and use a mix of quantitative and qualitative methods. The quantitative approach adopted for student-focused evaluation is based on the MECA-ODL methodology (Riddy and Fill, 2004). Ten of the user quality criteria proposed there have been used generically on all student surveys for DialogPlus evaluation. Questionnaires are supplemented with further questions, usually another ten, agreed with the academic tutors. The generic quality criteria are shown in Figure 1. Students are asked to score their response to each statement as 0 (No), 1 (Somewhat), 2 (Yes) or N/A (not applicable). This range of responses was chosen, following feedback sessions on the MECA-ODL tool, instead of the five-point Likert scale originally suggested in the methodology. Students’ scored responses are entered into spreadsheets and analysed using standard statistical methods. Students are also asked to make comments if they wish and these are considered alongside other qualitative input.
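The paper does not specify the spreadsheet analysis in detail, but the mean responses reported in the tables that follow are consistent with a simple average of the numeric scores with N/A answers excluded. A minimal sketch of that calculation, under that assumption (the function name is illustrative; the example counts are those reported for criterion 2 on Course 1):

```python
# Minimal sketch of the mean-response calculation, assuming N/A answers are
# excluded from the average (scale: 0 = No, 1 = Somewhat, 2 = Yes, None = N/A).
from statistics import mean

def mean_response(scores):
    """Average the numeric scores, ignoring N/A (None) responses."""
    numeric = [s for s in scores if s is not None]
    return round(mean(numeric), 1) if numeric else None

# Example counts from criterion 2 on Course 1: 2 No, 17 Somewhat, 36 Yes, 1 N/A.
scores = [0] * 2 + [1] * 17 + [2] * 36 + [None]
print(mean_response(scores))  # 89 / 55 = 1.6
```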

1. There was a full description of each online learning activity, including learning objectives.

2. The interface was easy to use.

3. Required tools were included (e.g. database, spreadsheet, note making, bulletin board).

4. The content met the needs of my preferred learning style.

5. The content was relevant, appropriate and clear.

6. All embedded materials were easily accessible.

7. Mechanisms were provided for information and support.

8. Maximum response times to learner queries were defined.

9. The assessed elements of the activity were appropriate for the learning objectives.

10. The online learning activities improved my knowledge and skills in the subject of {x}.

Figure 1: the generic quality criteria

Qualitative methods used in DialogPlus evaluations include observation of students using the resources, discussions with individual or groups of students, a nominal group technique (Harvey, 1998, pp. 44-45) and analysis of contributions to online discussions.

All findings are discussed in detail with the academic tutors and, if necessary and possible, further data are sought for clarification or illumination. Finally, a brief summary of each evaluation is agreed and made available to external stakeholders. All the teaching and learning resources, plus the detailed and summary files, are accessible to team members via the project website.

Findings

There is a wealth of quantitative and qualitative data produced by these evaluation methods. For the purposes of this paper, the simple summaries plus tabulated responses to the generic quality statements for each evaluated course are given first, followed by comparative charts based on those responses.

Course 1

Summary

The evaluation activities undertaken were observation of students’ introduction to and initial use of a specific online activity, plus analysis of their responses to a questionnaire.

Positive aspects from the observation were that the students seemed engaged with, and interested in, what they were doing; there were no technical problems; and students seemed confident as they left the session that they would be able to complete the activity by the deadline. One negative aspect observed was that the navigation through the activity was unsatisfactory (Next/Back only).

Student responses on the questionnaires were particularly positive with respect to the description of content and learning objectives; inclusion of required tools; mechanisms for information and support; appropriateness of assessments; and the improvement of their knowledge and skills. The most negative response was to the suggestion that they might prefer to be assessed by essays rather than eLearning activities, suggesting general acceptance of the assessment components of the online activities.

There was a spread of views about the other aspects. More students were positive that the specifically evaluated eLearning activity improved their knowledge and skills than were positive that they had learnt a lot from all such activities on the course.

Generic quality scores

Fifty-six (43%) of the 130 students taking the unit completed the questionnaire. Their responses to the generic quality statements are shown in the table below.

Responses are shown as the number and percentage of the students present giving each score, followed by the mean response.

1. There was a full description of the learning activity, including learning objectives.
   No: 1 (2%); Somewhat: 5 (9%); Yes: 49 (88%); N/A: 1 (2%). Mean response: 1.9

2. The interface was easy to use.
   No: 2 (4%); Somewhat: 17 (30%); Yes: 36 (64%); N/A: 1 (2%). Mean response: 1.6

3. Required tools were included (e.g. database, spreadsheet, note making, bulletin board).
   No: 1 (2%); Somewhat: 4 (7%); Yes: 50 (89%); N/A: 1 (2%). Mean response: 1.9

4. The content met the needs of my preferred learning style.
   No: 2 (4%); Somewhat: 25 (45%); Yes: 28 (50%); N/A: 1 (2%). Mean response: 1.5

5. The content was relevant, appropriate and clear.
   No: 0; Somewhat: 29 (52%); Yes: 26 (46%); N/A: 1 (2%). Mean response: 1.5

6. All embedded materials were easily accessible.
   No: 2 (4%); Somewhat: 24 (43%); Yes: 29 (52%); N/A: 1 (2%). Mean response: 1.5

7. Mechanisms were provided for information and support.
   No: 1 (2%); Somewhat: 7 (13%); Yes: 47 (84%); N/A: 1 (2%). Mean response: 1.8

8. Maximum response times to learner queries were defined.
   No: 4 (7%); Somewhat: 16 (29%); Yes: 31 (55%); N/A: 5 (9%). Mean response: 1.5

9. The assessed elements of the activity were appropriate for the learning objectives.
   No: 0; Somewhat: 14 (25%); Yes: 40 (71%); N/A: 2 (4%). Mean response: 1.7

10. The activity improved my {subject} knowledge and skills.
    No: 0; Somewhat: 14 (25%); Yes: 41 (73%); N/A: 1 (2%). Mean response: 1.7

Course 2

Summary

The evaluation activities undertaken were a questionnaire and a nominal focus group session for students at the end of the final, face-to-face, lecture.

The student response to the ten eLearning modules was particularly positive with respect to description of content and learning objectives; inclusion of required tools; accessibility of linked resources; and appropriateness of the MCQs. The students, all young undergraduates with no prior experience of distance learning, gave particularly negative feedback about their own motivation and preferences for learning with respect to the online and distant tutoring aspects of the unit delivery.

Female students’ responses, on these aspects, were more negative than those of male students. Subsequent analysis of students’ summative results for the course showed no statistically significant differences between male and female students.

Generic quality scores

Thirty-five (60%) of the fifty-eight students taking the unit attended the last lecture and completed the questionnaire. None of them had studied by distance learning before.

Responses are shown as the number and percentage of the students present giving each score, followed by the mean response.

1. There was a full description of each lecture, including learning objectives.
   No: 0; Somewhat: 1 (3%); Yes: 34 (97%); N/A: 0. Mean response: 2.0

2. The interface was easy to use.
   No: 0; Somewhat: 8 (23%); Yes: 27 (77%); N/A: 0. Mean response: 1.8

3. Required tools were included (e.g. database, spreadsheet, note making, discussion board).
   No: 0; Somewhat: 7 (20%); Yes: 28 (80%); N/A: 0. Mean response: 1.8

4. The content met the needs of my preferred learning style.
   No: 8 (23%); Somewhat: 18 (51%); Yes: 8 (23%); N/A: 0. Mean response: 1.0

5. The content was relevant, appropriate and clear.
   No: 1 (3%); Somewhat: 17 (49%); Yes: 16 (46%); N/A: 1 (3%). Mean response: 1.4

6. All linked resources (e.g. image databases) were easily accessible.
   No: 1 (3%); Somewhat: 10 (29%); Yes: 24 (69%); N/A: 0. Mean response: 1.7

7. Mechanisms were provided for information and support.
   No: 4 (11%); Somewhat: 16 (46%); Yes: 14 (40%); N/A: 1 (3%). Mean response: 1.3

8. My queries were answered in a timely manner.
   No: 6 (17%); Somewhat: 9 (26%); Yes: 15 (43%); N/A: 5 (14%). Mean response: 1.3

9. The multiple choice quizzes were appropriate for the learning objectives.
   No: 3 (9%); Somewhat: 7 (20%); Yes: 24 (69%); N/A: 1 (3%). Mean response: 1.6

10. The activity improved my {subject} knowledge and skills.
    No: 3 (9%); Somewhat: 12 (34%); Yes: 18 (51%); N/A: 2 (6%). Mean response: 1.5

Course 3

Summary

The evaluation activities undertaken were observation of students’ introduction to and use of the computerised model, plus analysis of student responses to a questionnaire.

Positive aspects from the observation were that students seemed engaged with and, largely, interested in what they were doing; and some students worked very collaboratively together. Negative aspects observed were unclear instructions about the group work aspect; computer crashes; data validity problems; and students having to wait overly long for answers to queries or to submit results.

Student responses on the questionnaires were particularly positive with respect to the description of content and learning objectives; inclusion of required tools; and appropriate assessment.

There was a spread of views about the other aspects. More students were positive that this activity improved their knowledge and skills than were positive that they had learnt a lot from all the online activities in the course. A minority positively enjoyed the online activities.

Generic quality scores

Thirty-six (75%) of the forty-eight students taking the unit attended the session and completed the questionnaire.

Responses are shown as the number and percentage of the students present giving each score, followed by the mean response.

1. There is a full description of the ePractical, including learning objectives.
   No: 0; Somewhat: 3 (8%); Yes: 33 (92%); N/A: 0. Mean response: 1.9

2. The interface is easy to use.
   No: 3 (8%); Somewhat: 18 (50%); Yes: 15 (42%); N/A: 0. Mean response: 1.3

3. Required tools are included (e.g. database, spreadsheet, note making, discussion board).
   No: 0; Somewhat: 8 (22%); Yes: 28 (78%); N/A: 0. Mean response: 1.8

4. The content meets the needs of my preferred learning style.
   No: 8 (22%); Somewhat: 16 (44%); Yes: 12 (33%); N/A: 0. Mean response: 1.1

5. The content is relevant, appropriate and clear.
   No: 4 (11%); Somewhat: 16 (44%); Yes: 16 (44%); N/A: 0. Mean response: 1.3

6. All embedded materials are easily accessible.
   No: 5 (14%); Somewhat: 9 (25%); Yes: 22 (61%); N/A: 0. Mean response: 1.5

7. Mechanisms are provided for information and support.
   No: 3 (8%); Somewhat: 15 (42%); Yes: 18 (50%); N/A: 0. Mean response: 1.4

8. Maximum response times to learner queries are defined.
   No: 9 (25%); Somewhat: 14 (39%); Yes: 9 (25%); N/A: 4 (11%). Mean response: 1.0

9. The assessed elements of the ePractical seem appropriate for the learning objectives.
   No: 5 (14%); Somewhat: 7 (19%); Yes: 24 (67%); N/A: 0. Mean response: 1.5

10. The ePractical is improving my {subject} knowledge and skills.
    No: 3 (8%); Somewhat: 11 (31%); Yes: 22 (61%); N/A: 0. Mean response: 1.5

Comparative quality indicators

One of the virtues of the MECA-ODL approach is that the quality of different eLearning activities can be compared. Figures 2 and 3 below show such comparisons in a graphical format for the three DialogPlus evaluations described above.

It is of the utmost importance to understand the rationale behind any evaluation. On DialogPlus, a utilisation focus has been adopted (Patton, 1997) with the objectives that project team members, and other stakeholders, should be able to know, judge and improve the quality of the learning resources. Comparative analysis is not used here to decide which of several possible resources should be used instead of others, but rather to learn from what students have found to be the best and worst aspects of online learning and to promote good practice.

[Figure 2 is a bar chart, “Generic Quality Criteria 1-5”, plotting mean responses (from No to Yes) for Courses 1, 2 and 3 against the criteria Description, Interface, Tools, Learning style, and Relevance & Clarity.]

Figure 2: Comparison of quality criteria 1-5

[Figure 3 is a bar chart, “Generic Quality Criteria 6-10”, plotting mean responses (from No to Yes) for Courses 1, 2 and 3 against the criteria Accessible materials, Support mechanisms, Response times, Assessment, and Skills.]

Figure 3: Comparison of quality criteria 6-10
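Charts of this kind can be reproduced directly from the tabulated mean responses. The project itself used spreadsheets; the following is merely a minimal sketch for criteria 1-5, assuming matplotlib and numpy are available, with the values taken from the three course tables above:

```python
# Minimal sketch: grouped bar chart of mean responses per criterion and course,
# in the spirit of Figure 2 (assumes matplotlib and numpy are installed).
import matplotlib.pyplot as plt
import numpy as np

criteria = ["Description", "Interface", "Tools", "Learning style",
            "Relevance & Clarity"]
# Mean responses for criteria 1-5, taken from the tables above.
means = {
    "Course 1": [1.9, 1.6, 1.9, 1.5, 1.5],
    "Course 2": [2.0, 1.8, 1.8, 1.0, 1.4],
    "Course 3": [1.9, 1.3, 1.8, 1.1, 1.3],
}

x = np.arange(len(criteria))
width = 0.25
for i, (course, values) in enumerate(means.items()):
    plt.bar(x + i * width, values, width, label=course)

plt.xticks(x + width, criteria, rotation=20)
plt.ylim(0, 2)                      # 0 = No, 2 = Yes
plt.ylabel("Mean response")
plt.title("Generic Quality Criteria 1-5")
plt.legend()
plt.show()
```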

Discussion & Conclusions

Students on all three courses have found the elearning activities to be well described and to include the tools necessary to complete the set tasks. Assessments are deemed appropriate and the students acknowledge improvements in their knowledge and skills. Relevance and clarity of content could possibly be improved in all cases, after some further investigation with the students.

The interface on course two scored higher for ease of use than the others and it is interesting to note that it was delivered via a different VLE, Bodington Common. The other two were mounted in Blackboard, with some customisation. This could also be investigated further to better understand interface characteristics that students particularly like.

Course two scored lowest with respect to students' preferred learning style. The qualitative evaluation exercises suggest this was more to do with the remote support than with the actual content of the modules. This appears to be confirmed by the relatively high rating for the accessibility of the materials but low rating for support mechanisms. Course one scored highest on both learning styles and support. The former could be taken as an endorsement of the thoughtful mix of text, image and data resources and tasks that tested both literacy and numeracy. The support mechanism most valued by students, as expressed in interviews, was the face-to-face clinics. The discussion board was little used because students tended to raise queries in these timetabled sessions.

The evaluated activity in course three required good mathematical skills, which may account for its moderate rating on learning style. It also used an adapted simulation model that was error prone; the low rating for response times may be explained by the frustration some students expressed, in qualitative evaluation sessions, with delays in resolving these problems. There was less proactive support for these higher-level students than on the other two courses, but the course still scored slightly higher on support than the remotely tutored version.

Overall, ratings on the MECA-ODL quality criteria were supported by qualitative student feedback. They provide a useful means of comparing instances of elearning and suggesting areas for further investigation and improvement. However, one aspect that is not covered by the generic set of criteria was raised by a number of students in face-to-face evaluation sessions on all three courses. These students found that it took them much longer to complete the online activities than their tutors had estimated. There can be several reasons for this, including lower levels of computer skills, difficulty in accessing computers, and lower functional specification of students' own computers. Students who raised these time concerns were vehement that they would not want elearning to be incorporated into all of their study units. This should be taken into account by developers and teaching staff who are concerned with the quality of the student experience and learning.

Acknowledgements

The author is grateful for the co-operation of the teachers and students who took part in these evaluations.

References

Conole, G. and Fill, K. (2005). A learning design toolkit to create pedagogically effective learning activities. Journal of Interactive Media in Education (Advances in Learning Design. Special Issue, eds. Colin Tattersall, Rob Koper), 2005/08. ISSN:1365-893X [jime.open.ac.uk/2005/08].

DialogPlus website: www.dialogplus.org

Harvey, J. (Ed.) (1998) Evaluation Cookbook. Edinburgh: Heriot-Watt University.

MECA-ODL website: www.adeit.uv.es/mecaodl (Note: the methodological guide is available in English, Spanish, German, Italian and Greek from the Materials page)

Patton, M.Q. (1997) Utilization-focused Evaluation, 3rd edition. Sage Publications Inc., USA.

Priest, S. and Fill, K. (forthcoming) Online learning activities in second year Environmental Geography. In O’Donoghue, J. (ed) Technology Supported Learning and Teaching: A Staff Perspective. Idea Group Inc, USA

Riddy, P. and Fill, K. (2003) Evaluating the quality of elearning resources. Paper presented at British Educational Research Association Annual Conference 2003, Edinburgh, UK, September 2003. Available online: http://www.leeds.ac.uk/educol/documents/00003331.htm

Riddy, P. and Fill, K. (2004). Evaluating eLearning Resources. In Networked Learning 2004, (eds.) Banks, Goodyear, Hodgson, Jones, Lally, McConnell & Steeples, Lancaster & Sheffield University, pp. 630-636. ISBN 1-86220-150-1.
