ACCESSING THE STUDENT VOICE
Using CEQuery to identify what retains students and promotes engagement in productive learning in Australian higher education

Geoff Scott
University of Western Sydney
Final Report, December 2005

A PROJECT FUNDED BY THE HIGHER EDUCATION INNOVATION PROGRAM AND THE COLLABORATION AND STRUCTURAL REFORM FUND, DEPARTMENT OF EDUCATION, SCIENCE AND TRAINING

© Commonwealth of Australia 2006
ISBN: 0 642 77603 2

This work is copyright. It may be reproduced in whole or in part for study or training purposes, subject to the inclusion of an acknowledgment of the source and to there being no commercial usage or sale. Reproduction for purposes other than those indicated above requires the prior written permission of the Commonwealth. Requests and inquiries concerning reproduction and rights should be addressed to Commonwealth Copyright Administration, Attorney-General’s Department, Robert Garran Offices, National Circuit, Barton ACT 2600, or posted at http://www.ag.gov.au/cca

The views expressed in this report do not necessarily reflect the views of the Department of Education, Science and Training.

ACKNOWLEDGMENTS

This project was a truly collaborative effort. It brought together a wide range of expertise from across the higher education sector in Australia. The support and contribution of all of the following is acknowledged:

• The project’s National Steering Committee: Dennis Gibson (Chancellor of RMIT University), Richard Johnstone (Executive Director of the Carrick Institute for Learning & Teaching), Cindy Tilbrook (Executive Director, Graduate Careers Australia), and Claire Atkinson and Regina Camara (from the Department of Education, Science and Training).
• More than 100 senior academic, education development and administrative staff from the 14 universities that participated in the study.
• The ATN Planning and Quality Group, which organised and hosted workshops across Australia on the project.
• The Department of Education, Science and Training (DEST), for its Higher Education Innovation Program (HEIP) funding in 2004 and Collaboration and Structural Reform (CASR) funding in 2005.
• The following people, who worked directly on the project:
  - from UWS: Steve Butcher, Terry Gozzard, Leo Grebennikov, Carolyn Webb and Kim Johnston
  - from Macquarie University: Peter Petocz
  - from QUT: Shane Brown

EXECUTIVE SUMMARY

Although there is a steady interest in the exploitation of textual data in the human and social sciences, the tools and techniques used are still, to a large part, not readily applicable to the domain of large-scale surveys and database research… The inclusion of open-response questions in such studies offers the potential for the identification of responses falling outside the researchers’ preconceived framework and the development of truly constructive proposals. The problems of exploiting data of this type, however, tend to mean that they are poorly utilised, either being totally ignored, analysed non-systematically, or treated as an aside.
— Bolden and Moscarola, 2000: 450

This report is among the first in the world to explore systematically an enormous database of open-ended comments made by university graduates on their tertiary experience. It uses a unique qualitative data analysis system which is IT-enabled and specifically calibrated for use in post-secondary and higher education.

Background

There is increasing interest in both proving and improving the quality of Australian higher education. As part of this trend, a range of tracking systems has been developed, using data from an array of sources.
The Course Experience Questionnaire (CEQ), based on the work of Ramsden and Entwistle (1981, 1983) and Biggs (1987, 1992), is the only nationally generated source of evaluative data on a common set of tertiary study questions currently available from graduates. Over the past decade it has been distributed to every graduate of an Australian university approximately four months after the successful completion of their course. Respondents not only rate a set of course experience items on the CEQ but are also invited to write open-ended comments on the best aspects (BA) of their university course experience and those most needing improvement (NI).

It is estimated that at least 300,000 students have written over half a million ‘best aspects’ and ‘needs improvement’ comments on the CEQ since it was first distributed in the early 1990s, yet no systematic analysis of what graduates have said has been undertaken. This study used a database of the 168,376 comments made by 94,835 graduates from a representative sample of 14 Australian universities between 2001 and 2004 to address this research gap. As these graduates often refer to more than one aspect of their course experience in their comments, the final scored database produced some 285,906 hits on the many components of university life that students encounter.

Opportunity

This study has been made possible by the development of a new IT-enabled qualitative analysis tool, CEQuery. The analytical software was developed and tested through a partnership of 10 Australian universities in 2003 and distributed free, with a user manual and training, to all Australian universities in 2004 and 2005. CEQuery automatically classifies comments into 5 main domains (Outcomes, Staff, Course Design, Assessment, and Support) and 26 subdomains using a custom-tailored dictionary which has been further enhanced during the current project.
The domains and subdomains are outlined in Table 1 below, with further explanatory details of each subdomain provided in Attachment 1.

Table 1. CEQuery domains and subdomains

Outcomes: Intellectual; Work application/career; Further learning; Personal; Interpersonal; Knowledge/skills (current)
Staff: Accessibility and responsiveness; Teaching skills; Practical experience; Quality and attitude
Course design: Practical-theory links; Relevance (to work/life/discipline); Flexibility/responsiveness; Methods of learning and teaching; Structure and expectations
Assessment: Relevance; Marking; Expectations; Feedback/return; Standards
Support: Library; Learning resources; Infrastructure/environment; Student administration; Student services; Social affinity/support

(See Attachment 1 for full details)

The literature review and conceptual framework (Chapters 1 and 2) give the underpinning rationale for the selection of these domains and subdomains and identify the range of issues pursued as the CEQuery results were analysed.

CEQuery ‘scores’ comments by looking for key words, or combinations of words in proximity to each other, from its dictionaries for each of the subdomains. When these are found, the relevant section of the comment is added to the count for that subdomain; this is called a ‘hit’. Because a single comment can cover more than one subdomain, such overlaps are picked up. To check the accuracy of CEQuery’s scoring, the analyst can click on the CEQuery results for any domain or subdomain, and the comments allocated to it are then presented for checking, with the dictionary words used to allocate them to that subdomain highlighted.

CEQuery is particularly flexible.
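The dictionary-based scoring just described can be sketched in outline. The mini-dictionary, subdomain word lists and sample comment below are invented for illustration; CEQuery’s actual dictionaries are far larger and hand-tuned, and it also weighs the proximity of word combinations, which this sketch omits.

```python
# A minimal sketch of dictionary-based comment scoring, loosely modelled
# on the CEQuery approach described in this report. The word lists here
# are invented examples, NOT CEQuery's real dictionaries.
DICTIONARY = {
    "Staff: Accessibility and responsiveness": ["approachable", "available", "accessible"],
    "Assessment: Feedback/return": ["feedback", "marking turnaround"],
    "Support: Library": ["library", "librarian"],
}

def score_comment(comment):
    """Return one 'hit' per subdomain whose dictionary words occur in the
    comment. A comment touching several subdomains yields several hits,
    mirroring the overlap behaviour the report describes."""
    text = comment.lower()
    return [subdomain for subdomain, words in DICTIONARY.items()
            if any(word in text for word in words)]

comment = "Lecturers were always approachable, but feedback on essays was slow."
print(score_comment(comment))
# → ['Staff: Accessibility and responsiveness', 'Assessment: Feedback/return']
```

Note how the single comment above produces two hits, one in a Staff subdomain and one in an Assessment subdomain — the overlap behaviour that lets 168,376 comments yield 285,906 hits.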
Users can undertake a wide range of customised analyses against any of the variables gathered in the CEQ (university, field of education (FOE), award, fees, sex, age, mode of attendance, type of attendance, year in which the CEQ data were gathered, residence, and Aboriginal & Torres Strait Islander [ATSI] status), as well as against the CEQ quantitative results. There is also a custom search facility, which was used in the present study for a detailed analysis of the types of methods cited in the ‘Course Design: methods’ subdomain. Finally, the dictionary itself can be modified.

Need

Why bother undertaking a qualitative analysis of this enormous database of best aspects (BA) and needs improvement (NI) comments on the CEQ? The reasons are compelling.

The Australian higher education sector is beset by a wide range of local, national and international pressures for change. These include the need to secure new sources of income, to manage increasing competition, to deal with a growing student consumer-rights movement and its associated expectations and demands, to respond to much closer scrutiny, and to keep up with rapid, ongoing developments in IT. These pressures are mutually reinforcing, and they make it important for each university to optimise the quality of every student’s experience in order to remain sustainable.

It is particularly important in such a context, therefore, for universities not only to gain but to retain students: morally, in order to develop the total social, intellectual and cultural capital of Australia and to optimise the life chances of those who are first in their family to attend university, but also financially.
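The customised breakdowns described under ‘Opportunity’ — hits cross-tabulated against CEQ variables such as field of education, comment type (BA/NI) or attendance mode — amount to filtered counts over the scored database. The records and field names below are invented purely for illustration; they are not the study’s data.

```python
from collections import Counter

# Invented sample records: each scored hit carries the CEQ variables of
# the graduate who wrote the comment (field of education, BA/NI type, ...).
hits = [
    {"subdomain": "Staff: Quality and attitude", "foe": "Health", "type": "BA"},
    {"subdomain": "Support: Library", "foe": "Health", "type": "NI"},
    {"subdomain": "Staff: Quality and attitude", "foe": "Engineering", "type": "BA"},
    {"subdomain": "Staff: Quality and attitude", "foe": "Health", "type": "BA"},
]

def ba_hits_by_subdomain(hits, foe):
    """Count 'best aspects' hits per subdomain within one field of
    education -- the kind of per-FOE breakdown the report runs."""
    return Counter(h["subdomain"] for h in hits
                   if h["foe"] == foe and h["type"] == "BA")

print(ba_hits_by_subdomain(hits, "Health").most_common())
# → [('Staff: Quality and attitude', 2)]
```

The same filter-then-count pattern works for any of the CEQ variables listed above, simply by changing which record fields the filter tests.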
It is anticipated that what emerges from the analysis of the comments made by the 95,000 graduates involved in the present study will shed sharper light on what universities might best do to address these compelling learning and teaching quality agendas, not just in overall terms but in specific fields of education (FOE). The study’s findings have been compared with an extensive review of recent research and writing on learning and teaching in higher education from a wide range of other sources. In doing this, it is anticipated that those responsible for assuring the quality of learning and teaching at universities will be helped in directing their scarce resources and development efforts.