
Engagement for enhancement: Institutional case studies from a UK survey pilot

Contents

Introduction

1. Canterbury Christ Church University

2. Cardiff Metropolitan University

3. King’s College London

4. Kingston University

5. Manchester Metropolitan University

6. Oxford Brookes University

7. The University of Bath

8. The University of Oxford

9. The University of Warwick

10. York St John University

Introduction

The case studies collected here present institutional experiences of developing, administering and using the results from student engagement surveys. They have been contributed by the institutions participating in the 2013 pilot of selected items from the National Survey of Student Engagement (NSSE), and are designed to supplement the technical report of the pilot. As well as exploring the performance of the selected items in the UK, the project was also designed to provide the participating institutions with student engagement data that would be of practical benefit. These case studies address the latter objective.1

In many cases the process is at an early stage; only York St John University and the University of Warwick had used engagement surveys before their involvement in the pilot project. The novel nature of such surveys in the UK means that we have a limited understanding of how they can contribute practically to the enhancement of learning and teaching. There is a wealth of experience from other parts of the world, but this set of case studies is an important step in building up knowledge in the UK context.

Most of the case studies contained here directly address the use of the items and scales used in the pilot project. There are two exceptions. The contribution from Manchester Metropolitan University describes their use of engagement items developed in parallel to the main project, and has been included because of the light it sheds on alternative approaches. Oxford Brookes University will be using the items in 2014; the case study included here is a clear description of their motivations for taking part in the project.

I would like to thank all those people who, in addition to administering the survey and sharing the results with the HEA, agreed to contribute case studies. They are important illustrations of institutional commitment to the improvement of learning and teaching, and the growing interest in student engagement as a key part of that process.

Dr Alex Buckley, The Higher Education Academy

1 The technical report, Engagement for enhancement: Report of a UK survey pilot, can be accessed at: http://www.heacademy.ac.uk/resources/detail/nss/engagement_for_enhancement

1. Canterbury Christ Church University

By Dr Gill Perkins (University Survey Manager) and Dr John Lea (Assistant Director for Learning and Teaching)

Background

Canterbury Christ Church University’s (CCCU) strategic plan 2011-2015 comprises a number of targets centred on the student experience, some of which are associated with NSS outcomes. A Change Programme was set up to address these targets and other priority themes through the development and implementation of a series of inter-related projects. As part of the Change Programme, the University launched a scheme offering students and staff opportunities to work in partnership on University priority themes.

CCCU embraces and seeks to actively promote the concept of students as active participants in the learning process, in contrast to the idea that students are merely purchasers of educational products, which they evaluate as they would any other product. The literature provides us with a rich context for understanding what is involved in learning partnerships, which goes beyond the student as an informed consumer and into the role of ‘producer’. At CCCU we have been actively exploring how we might negotiate this tricky balance between students as consumers of education, and students as active participants in the learning process.

Partners in Learning at CCCU

In order to enhance these dimensions of student engagement, and as part of the University’s Change Programme, the Learning and Teaching Enhancement Unit has launched a new scheme called Student Ambassadors for Learning and Teaching (SALT). This network of undergraduate and postgraduate students from across all four of the University’s faculties has been working with the University to develop projects which address issues relating to students’ proactive participation and involvement in learning and teaching (including assessment), the student experience and future employability. These initiatives are embedded within departments, faculties and the institution as a whole.

The SALTs are encouraged to explore themes which interest them, creating their own unique projects or working with staff to develop projects in response to existing agendas. On all projects the SALTs are assigned a sponsor - a member of academic or professional service staff whose role is to advise and facilitate the development of projects. A diverse range of projects emerged in the first year of the SALT scheme, some initiated by SALTs and others with SALTs co-opted as student partners.

The University recently provided a new space dedicated to Partners in Learning in the library and student services building, to bring together and facilitate the future development of partnership working and projects. This space provides opportunities to experiment with:

• Students as partners – The Higher Education Academy, along with many universities, is now actively re-exploring the purposes of higher education, and in the process experimenting with enhancing student engagement directly at curriculum level, including seeking ways to measure the level of engagement of students in their studies; encouraging students to see themselves as (co)producers of knowledge, not just its recipients; and actively encouraging students to negotiate their own learning projects.

• Students as change agents – Students are increasingly becoming actively involved in curriculum development and innovation; entering into professional dialogue with academics concerning aspects of HE pedagogy and practice (including the review of teaching); and getting involved in campaigns that directly relate to learning and teaching.

• Open and democratic learning space – to help balance different pedagogic dimensions of higher education, such as andragogic, discursive and dialogic dimensions versus technology and pedagogic instruction.

Student Engagement Survey Pilot

Assisted by the SALTs, we invited students from a range of types of programme to take part in the pilot survey, representing diverse student characteristics. We advertised the pilot as a national research project in which we were participating, and sought comments about the survey questions and response options from students who took part. It was clear from the comments that the questions encouraged students to focus on themselves as learners, as opposed to the consumer perspective that the NSS tends to encourage.

Potential Value of Student Engagement Data

The continuing development of the SALT scheme and the dedicated Partners in Learning space present two key uses for data from a survey of student engagement:

1. The success of the new Partners in Learning space will be measured by how much it is able to contribute to students increasingly seeing themselves as (co) producers of knowledge, not just its consumers. Student engagement data will be integral to that process.

2. The data would help the SALTs and academic and professional service staff identify areas of weakness which will inform future project work in the context of enhancing learning and teaching outcomes through engagement in learning.

Therefore, measuring levels of student engagement is a necessary adjunct for evaluating, reviewing, and delivering the Partners in Learning agenda at CCCU, as well as contributing to achieving the University’s strategic plan targets where student experience is central.

2. Cardiff Metropolitan University

By Nicola Poole (Student Retention Officer)

Cardiff Metropolitan University has been involved with the design and execution of survey work for well over a decade. The institution has developed a number of different internal surveys, including student withdrawal and first-year experience surveys, as well as other less traditional information-gathering methodologies such as the diary room and ‘life through a lens’ photo project. We have also been involved in the National Student Survey since its pilot stage and take part in both the Postgraduate Research and Taught Experience surveys supported by the HEA. It is this increase in the number of surveys available to students that has led to the worry that students are reaching the point where they are too fatigued to complete surveys in any meaningful way – that is, if they complete them at all. This is all the more relevant given that response rates are an important factor in the reliability of the data.

So when it was first suggested that we pilot another ‘new’ survey, it was met with some trepidation. There were, however, a number of factors that assisted in the decision to take part in the pilot project with the HEA. There has been a lot of work within the sector on student engagement and its importance in the creation of a positive and more holistic student experience. This can be seen in the work the Quality Assurance Agency has been carrying out in relation to Chapter B5 of its Quality Code, which is all about student engagement. The National Union of Students has also had a number of initiatives, including WISE here in Wales, an approach which engages students as active participants in the leadership, management, development and delivery of their own educational experience. It is also echoed in the enhancement strand Students as Partners, one of three main pathways for enhancement work driven by the HEA in Wales.

The three sections that were chosen to trial included questions that linked to many of the key drivers and points for action within the institution’s Learning, Teaching and Assessment Strategy, including assessment, employability and personal development, and academic support. They also included questions which can assess engagement with the skills necessary for success in some of these areas, such as critical analysis.

Decisions had to be made regarding the ideal time within the academic year to release the survey, and which students should receive it without affecting performance in other key surveys. It was decided to pilot the survey with first-year students towards the end of their academic year, when they would be able to look back and reflect on their first-year experience and what they had engaged with over that year.

Students were asked to complete a paper version of the survey, with members of the Learning and Teaching Development Unit going into first-year taught sessions and discussing the point of the survey before it was handed out. This provided an opportunity for the survey to be discussed informally with both staff and students, and also helped to elicit a higher response rate. It proved useful, as some students did have queries regarding the wording of one or two of the questions. There was a positive response from the students, who felt that it had made them think about what they had been involved in throughout their first year; in some groups it led to discussion amongst students about what they had or hadn’t been involved in and where and when this had occurred.

Staff reaction was positive too; staff felt that the survey could be used for enhancement purposes in a number of ways. The institution has a mix of students who travel to the university every day and those who are based in halls, and staff suggested it would be interesting to investigate whether there were any differences in the levels or areas of engagement between the two groups. There was also a suggestion that it could be used in relation to personal development and personal tutorial sessions, as a way of discussing progress, what the students felt they were experiencing, where any holes in their experience may exist and where opportunities may lie to enhance their experience further. One concern from staff was the size of the NSSE used in America, and whether this could create issues with response rates if the whole survey were to be undertaken. There was also a need to look at the wording and direction of some areas of the survey in relation to the British and American systems. However, it was felt that this could be achieved with changes to the structure of some sections and a carefully managed plan for delivering the survey to students.

The university will be taking part in phase 2 of the pilot and will this time include a wider range of students: all year one and year two students will be surveyed in the spring of 2014, with the responses of as many year two students as possible compared against the data collected this year. The hope is to investigate patterns in engagement between first and second years, as the more recent retention literature suggests that although the first-year experience is important, the design of the second year is just as essential to the retention and attainment levels of students.

3. King’s College London

By Dr Camille B. Kandiko (Research Fellow) and Dr Frederico Matos (Postdoctoral Research Associate)

As part of an institution-wide curriculum enhancement initiative, King’s College London recently developed an engagement survey, The King’s Experience Survey. To avoid competing with the NSS, the survey was designed for all non-final year undergraduates. As with any major institutional endeavour, there were multiple drivers, agendas and outcomes. Two competing approaches to the data emerged: enhancement and quality assurance. Here we focus on how student engagement data was used for enhancement of academic development provision for new lecturers, whilst developing a partnership approach to also meet quality assurance needs.

Background

Student engagement data can be used for resource allocation and planning, institutional development and the enhancement of teaching and learning. We found that bringing together what are often perceived as opposing sides – management, faculty and students – can actually constitute a partnership based on productive and fruitful cooperation. The positioning of student satisfaction data can otherwise lead to the development of oppositional relationships between staff, students and senior management. This is characterized in responses such as “You Said; We Did” followed by a list of student demands met by the institution. Staff in low-performing or low-response departments are brought to task, even for aspects outside of their responsibility, and there is often little information provided on how to enhance the student experience. Whilst a mutually respectful and positive approach between staff, students and management can be developed through creative uses and applications of satisfaction data, an engagement-based survey reinforces a partnership model for enhancement. To this end, student engagement data was organized around a set of key benchmarks:

National comparative benchmarks:
• Critical Thinking
• Course Challenge
• Academic Integration
• Collaborative Learning

Engagement indicators:
• Academic Challenge
• Learning with Peers
• Student-Academic Relationships

King’s curriculum characteristics:
• Research-rich Environment
• Interdisciplinarity
• Academic Literacy
• Community Engagement
• Global Connectedness

King’s priority areas:
• Feedback
• Assessment
• Academic Support
• Co-curricular Engagement
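As an illustration of how item-level responses are typically rolled up into benchmark scores of this kind, the following is a minimal sketch that rescales Likert items and averages them within each benchmark. The item-to-benchmark mapping, the column names and the 1-4 coding are illustrative assumptions; the case study does not specify King’s actual scoring method.

```python
import pandas as pd

# Hypothetical mapping from survey item columns to two of the benchmarks
# listed above; King's actual item mapping is not given in this case study.
BENCHMARKS = {
    "critical_thinking": ["q_analyse", "q_evaluate", "q_form_new_ideas"],
    "collaborative_learning": ["q_group_work", "q_explain_to_peers"],
}

def benchmark_scores(responses: pd.DataFrame) -> pd.DataFrame:
    """Rescale 1-4 Likert items to 0-100, then average items within each benchmark."""
    scores = {}
    for name, items in BENCHMARKS.items():
        rescaled = (responses[items] - 1) / 3 * 100  # 1 -> 0, 4 -> 100
        scores[name] = rescaled.mean(axis=1)         # row-wise mean over the items
    return pd.DataFrame(scores)

# e.g. benchmark_scores(survey_df).mean() gives institution-level averages,
# and grouping by department supports the School and Department-level
# reporting described later in this case study.
```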

Partnership approach

To gain support for using engagement data, and to manage results, a partnership notion can be strategically tailored for staff, students and senior management. For students, it can be a way to get their voice heard, to give feedback that can change their student experience and to learn about how other students spend their time. For staff, such surveys provide information on how students are engaging with their learning, how low-staff-time-intensive activities such as peer-to-peer learning are used, and how much time students spend on academic work. For management, results can be used for evidence-based planning and resourcing decisions.

Locally-based action groups (at the College, School or Department level) including students, academic staff, professional services and managers can take the lead on making relevant recommendations and action plans in response to the data. Ownership of the data and recommendation process can build collegiality and allows for disciplinary difference and local cultures. Collecting engagement data from early-years students encourages them to become more active and allows for changes to be made during a student’s experience, which in turn allows an end-of-experience satisfaction survey, such as the NSS, to be a more useful barometer of institutional performance.

Institutional context

Our student engagement survey was conducted against the backdrop of some challenging NSS results, and in an environment where NSS results drive league tables and can be expected to do so for the next few years at least. In this environment, the default position for many senior managers is to see student engagement activity and the measurement of this as a ‘nice to have’. However, we found that student engagement data was conducive to promoting enhancement in the context of academic development programmes.

King’s Learning Institute runs a variety of accredited and bespoke academic practice courses. A challenge in such programmes is being able to provide discipline-specific advice and support in College-wide programmes. In the context of the current higher education environment, another challenge is to acknowledge and include the student voice in development initiatives. Data from the King’s Experience Survey has been able to provide insight to help address such concerns.

Contextualising data for academic development

Data from the King’s Experience Survey was reported at the School and Department level, organised around benchmarks and how students spent their time. This data was presented to relevant School Education Committees, KCLSU (the Students’ Union) and to central education committees, providing the opportunity for a collective understanding of the data. However, as part of a partnership approach the data needs to be shared collectively through representation structures and with individuals.

The first stage of sharing data with individual academics has been done through the accredited programmes run by King’s Learning Institute. Data has been presented during sessions on evaluation and student engagement. Since the survey was conducted across the entire institution, data can be shared at the School and Department level on how students spend their time, how often they report active and collaborative learning activities, their relationships with students and staff, and other benchmarks. The data of particular interest to new lecturers has concerned assessment and feedback to and from students. The raw data, together with students’ qualitative comments and examples of best practice, has been developed into tip sheets. These are now being created for all of the survey’s benchmarks, with different versions being developed for staff and students. However, for such efforts to gain wide institutional support they need to be managed alongside data used for public information and league tables, such as KIS and NSS data.

Summary

Student engagement survey data has presented opportunities for the enhancement of teaching and learning, and for the student voice to be shared in the context of academic development. Data collected at the institutional level allowed for comparisons and locally relevant observations. We found that dissemination at collective and individual levels provided the best means for enhancement based on quantitative analysis and local qualitative evaluations of what works. A balance needs to be struck between collecting meaningful data for institutional improvement and pedagogical enhancement, and collecting data for external comparisons and marketing.

4. Kingston University

By Steve May (Senior Researcher)

Overview

Kingston University is engaged in a number of interventions designed to improve the experience and success of students by encouraging a sense of belonging through engagement in university life and extra-curricular activities. These include:

• The University Compact Scheme, to guide students through the application process and provide on-course support
• An Academic Mentoring Programme (AMP) intervention, focusing on specific modules to facilitate second-year students supporting first-year students in their academic progression, and thereby the engagement of both in university life
• Academic Skills Centres, set up to provide additional, flexible, student-led skills support to all students in need

This case study focuses on the AMP intervention, in which questionnaires were distributed to 390 first-year students attending six modules.

Aims and objectives

The AMP is designed to enhance the progression and attainment of first-year students (mentees) and provide the opportunity for second- and third-year students to develop their personal skills through structured interactions with first years. The survey is being used as an instrument to help better understand the needs of mentees and to evaluate the impact of the intervention by measuring and quantifying the engagement of first-year participants against non-participants.

Activity

All survey items were converted to both paper-based and online questionnaires. The electronic version was created using Survey Monkey and distributed, in consultation with the AMP project leaders, by email to all students from the modules in question (the target population).

Reminders were not sent out because of logistical and resource constraints. The results were downloaded as an Excel spreadsheet in preparation for analysis.

The paper-based surveys were distributed by lecturers in the classroom at the end of taught core modules designated as part of the AMP, excluding students who had completed the online version. In this way tutors were able to explain the rationale behind the surveys and answer any queries arising. A spreadsheet equivalent to the output format from the online survey was set up and data from the paper-based questionnaires was input manually.

For both versions of the survey, respondents were asked to give their university identification number so that the results could be linked with the record of student AMP activity and with the detailed demographic and progression data held within the student record system.
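As a minimal sketch of this linking step, the snippet below merges the survey export with student-record and AMP-activity extracts on the identification number. The file and column names are illustrative assumptions, not Kingston’s actual systems.

```python
import pandas as pd

# Illustrative extracts; real file layouts and column names will differ.
survey = pd.read_csv("engagement_survey.csv")   # one row per respondent, incl. 'student_id'
records = pd.read_csv("student_records.csv")    # demographic and progression data
amp_log = pd.read_csv("amp_activity.csv")       # record of AMP participation ('amp_sessions')

linked = (
    survey.merge(records, on="student_id", how="left")
          .merge(amp_log, on="student_id", how="left")
)

# Flag AMP participants and compare their engagement against non-participants.
linked["amp_participant"] = linked["amp_sessions"].fillna(0) > 0
print(linked.groupby("amp_participant")["engagement_score"].describe())
```

The same linked table supports the demographic comparisons mentioned below, for example by grouping on an ethnicity field before summarising engagement.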

The results are now being analysed and are expected to give an indication of whether the AMP has led to significantly greater academic engagement by students. They will also enable us to compare the relationships between engagement and student success for a range of demographic groups, with black and minority ethnic (BME) groups being of particular interest.

Outcome

The response rate for the online questionnaires was poor. All students who were present when the questionnaires were distributed in each module completed the paper-based version, although the proportion of the target population present varied between modules.


Motivation for the survey

The current National Student Survey (NSS) is recognised as an important indicator of student satisfaction and experience; however, it is limited to final-year students and gives little information about the reasons behind the scores given. A recent study by the Higher Education Academy (HEA), the What Works? project, reinforced previous findings that the extent and type of student engagement is key to students’ experience and closely associated with retention.

The requirement for rigorous and effective evaluation of interventions to support students has come to the fore recently, particularly with the Office for Fair Access (OFFA) and the Higher Education Funding Councils increasingly concerned to ensure consistency across the sector and good value for the funding given to HEIs to support their students. Kingston University has an institutional “Led by Learning” strategy and a complementary Education Strategy, each of which highlights the importance of student engagement. These provide a strong motivation for the use of engagement surveys, sitting alongside a range of other evaluation instruments, to enable us to monitor the impact of our interventions and better understand connections between engagement and experience.

Institutional experience

The experience of using the engagement survey has been a positive one inasmuch as students have responded well. It has at times been difficult to fully involve staff in the operational process, which has resulted in paper-based surveys taking longer than expected to distribute to students.

The engagement survey provides an important addition to our evaluation instruments and will provide data that complements the qualitative data collected from focus groups and interviews and the retention, progression and attainment data obtained from linking the survey data with student records. In the light of early findings it is planned, where possible, to use the engagement survey in addition to our evaluation tool kit to assess the impact of interventions designed to support students.

Impact

Early evidence suggests that the AMP, Academic Skills Centres and Compact Scheme interventions have improved the experience of participating students relative to non-participants, and it will be instructive to explore the relationship between the engagement survey data and the measures of student retention, progression and attainment being analysed.

Lessons learnt

The survey is likely to receive a significantly higher response rate if paper-based rather than online. Students are expected to respond to many surveys whilst at university and are likely to suffer from “survey fatigue”. However, they are more likely to engage positively with the questionnaire when the rationale and potential benefits of the survey are explained to them by their tutors. The key to maximising the benefit of the survey is the involvement of “champions” such as lecturing staff and senior students.

Next steps

We intend to utilise the survey in 2013/14 to help determine the effectiveness of a range of interventions. In the short term we will analyse the data from the current survey and report the findings. In the medium to long term we would like to embed the use of the student engagement survey into the evaluation strategy for new interventions designed to improve the satisfaction and experience of students.

5. Manchester Metropolitan University

By Dr Mark Langan (Senior Learning and Teaching Fellow) and Professor Mark Stubbs (Head of Learning Research Technologies)

Background

In March 2013, the Faculty of Science and Engineering at Manchester Metropolitan University (MMU) ran a pilot study of five ‘engagement-style’ questionnaire items (‘ISSe’) appended to the existing eleven ‘NSS-style’ questions that comprise the institution’s Internal Student Survey (‘ISS’). The five new ISSe questionnaire items were based on the core themes identified by the HEA Engagement Surveys project and covered aspects of: engagement with classes, reflection on work, interaction with peers, interaction with tutors and feeling inspired to work hard. There is a strong drive in the institution to take up the messages of recent reviews of student ‘engagement’ and explore meaningful ways of measuring it (for definitions and background see Trowler & Trowler 2010).

Constraints and bespoke questions

In November 2011, MMU adopted a standard survey instrument across the whole university. Delivered twice a year via the front page of the student portal, the survey asks each student a set of standard NSS-style questions about their programme and modules of study. The standard question set had been refined over several years to optimise response rates and qualitative insights. The decision to pilot ISSe questions by extending this survey instrument imposed constraints on the number and style of questions that could be accommodated, and it was not possible to use the same question bank as the other project partners. Key constraints were: (i) questions had to retain the NSS’s five-point Likert scale (from Strongly Agree to Strongly Disagree) used for other questions in the survey and were thus written as statements (for respondent agreement); (ii) to address concerns about over-surveying, only five additional questions were permitted; (iii) questions were asked at programme level only (and not run separately for each module); (iv) questions had to be run as an addition to the institutional survey in a single Faculty as a pilot study. The bespoke questions were rewritten from basic concepts and were designed iteratively using guidance from discussions of the HEA Engagement Survey Pilot Steering Group (creating the basic themes to be surveyed and ideas for wording), feedback from its membership and also, internally, from senior academics. The questions used were:

1. ‘I contribute to my taught sessions (e.g. asking/answering questions, taking part in discussions etc.)’
2. ‘I think through the ideas introduced by my course outside of the taught sessions’
3. ‘I discuss course content with others (e.g. students, family members etc.), face-to-face, by phone or online’
4. ‘I discuss my academic progress with my tutors’
5. ‘Overall my course has inspired me to produce my best work’

Preliminary outcomes

It is acknowledged ‘up front’ that this survey was constrained and appended to an existing NSS-style survey. Overall, the Faculty achieved a response rate of 35% in the survey (n=1377). This survey was carried out to harvest baseline data for first explorations of students’ responses to this style of questionnaire item and also to introduce the concept of engagement-style surveys to students and staff. Preliminary analyses of the responses explored two questions of the survey data: (1) how well do the ISSe items predict the survey’s ‘Overall Satisfaction’ metric? and (2) how do ratings of engagement items change across levels of study (through undergraduate to taught Masters courses)?

To explore the relationship between the ISSe questions and the standard ISS global question for ‘Overall Satisfaction’, a technique called Random Forests Analysis (RF; a robust modelling method; see Langan et al. 2013) was used. The RF analysis revealed that the ‘course organisation’ category was still the main predictor of overall satisfaction, a consistent finding of previous ISS and NSS surveys. Other strong predictors of overall satisfaction were the ‘expected’ questionnaire items from the standard ISS (pertaining to ratings of ‘confidence building’ and ‘teaching’). The strongest predictor of ‘Overall Satisfaction’ from the ISSe questions was not as good a predictor as the existing ISS questions about course organisation or teaching. The strongest ISSe predictor was the item exploring whether students ‘felt inspired to produce their best work’; this question also appeared to be related to students’ agreement that they had discussed their academic progress with tutors. The results were unsurprising, as the evidence remains equivocal that engagement questions in general are good predictors of ‘satisfaction-style’ outcomes such as overall satisfaction with courses (Trowler and Trowler 2010; Taylor et al. 2011).

It is anticipated that, like the NSS, ISSe question responses will vary between subjects due to the nature of the subjects themselves (see Langan et al. 2013). However, some broad trends at Faculty level were explored using all returns and subjects to generate sufficient sample sizes. Exploration of changes by year of study in perceptions of ‘engagement’, based on the ISSe questionnaire items, suggested that the metrics largely dropped throughout the undergraduate lifecycle, with some recovery at Master’s level, particularly regarding discussion of work with tutors (see Figure 1). RF analysis was subsequently run to rank factors predicting whether students claimed that they had ‘felt inspired to produce their best work’. Overall, 58% of the variance could be explained by the model, in which the four most significant factors (importance expressed below as the percentage increase in Mean Squared Error, MSE) were students’ responses to the items listed below; a sketch of this style of analysis follows the list.

• I discuss my academic progress with my tutors (15% increase in MSE)
• The course has helped me to develop confidence and skills to succeed (14% increase in MSE)
• Overall I am satisfied with the quality of my course (12% increase in MSE)
• I contribute to my taught sessions… (12% increase in MSE)
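The following is a minimal sketch of a random-forest importance ranking of this kind, using scikit-learn’s permutation importance with an MSE scorer as an analogue of the ‘% increase in MSE’ measure quoted above. It is not the authors’ actual analysis (see Langan et al. 2013 for the method they used), and the file and column names are illustrative assumptions.

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance

df = pd.read_csv("iss_responses.csv")              # Likert items coded 1-5 (illustrative)
target = "inspired_best_work"                      # hypothetical column name
predictors = [c for c in df.columns if c != target]
X, y = df[predictors], df[target]

forest = RandomForestRegressor(n_estimators=500, random_state=0).fit(X, y)

# Permuting one predictor at a time and measuring the rise in MSE mirrors
# the '% increase in MSE' importance measure reported in the list above.
imp = permutation_importance(
    forest, X, y, scoring="neg_mean_squared_error", n_repeats=30, random_state=0
)
ranking = pd.Series(imp.importances_mean, index=predictors)
print(ranking.sort_values(ascending=False).head(4))
```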

Conclusions

Data for 1,131 science and engineering undergraduates have been analysed, including, for the first time, students’ reflections on their engagement with their course. In line with previous ISS findings, metrics of overall satisfaction were most related to well-organised, well-explained courses that build students’ confidence and skills. However, the new engagement questions revealed an extra determinant of satisfaction: whether students felt inspired to produce their best work. This metric was linked with confidence building, having the opportunity to contribute and, most importantly, discussing progress regularly with tutors. These findings resonate with local focus group findings that students want well-organised courses taught by inspirational tutors who know them as individuals.

At a faculty level, there was a general pattern of decline in the metrics of the engagement questionnaire items as students progressed through the undergraduate levels, with recovery at Master’s level. This has raised questions about students’ perceptions of the questionnaire items within their (dynamic) expectations of their educational provision as their course progresses, and generated debate about: the nature of the ISSe questions; how accurately this snapshot survey reflects the reality of students’ experiences; and how to respond.

Interpretation of this type of survey instrument is always best contextualised with multiple sources of feedback (including the associated respondent textual comments and staff views) about course design and delivery within subject areas. This is being addressed locally and will lead to further work at the individual level, working with both staff and students, to further understand students’ expectations of their courses.

This study has initiated many useful questions about the notion of ‘engagement’ that are not discussed here, as well as drawing attention to literature such as Trowler & Trowler (2010). The university now intends to pilot two of the questions across the institution to begin a process of debating whether these two styles of question can be combined within the survey and the usefulness of the outcomes to enhance the quality of our provision.


Figure 1. Mean rank metrics of three ISSe questions and one standard ISS question (‘Feedback on my work helped to clarify things I did not understand’) for three undergraduate years (1-3) and taught postgraduate (4) in the Faculty of Science and Engineering (all subjects included; n=1377).

References

Langan, A.M., Dunleavy, P.J. and Fielding, A.F. (2013) Applying models to national surveys of undergraduate science students: what affects ratings of satisfaction? Education Sciences. 3 (2), 193-207. doi:10.3390/educsci3020193. Available at: http://www.mdpi.com/journal/education

Trowler, P. and Trowler, V. (2010) Student engagement: evidence summary. York: The Higher Education Academy. Available at: http://www.heacademy.ac.uk/assets/documents/studentengagement/StudentEngagementEvidenceSummary.pdf

Taylor, P., Koskela, J. and Lee, G. (2011) Shaping history. York: The Higher Education Academy. Available at: http://www.heacademy.ac.uk/projects/detail/StudentEngagement_ResearchBid2011_Warwick

6. Oxford Brookes University

By Berry O’Donovan (Principal Lecturer for Student Experience)

Overview

Oxford Brookes is committed to enhancing its student experience based on evidence, and consequently pays close attention to student responses to satisfaction surveys such as the NSS and PTES. However, satisfaction surveys do not ask students to consider their own behaviours and the role that these play in how they experience their university learning environment; indeed, such measures are not always good indicators of educational quality (Gibbs, 2010). Accordingly, Brookes was keen to participate in a national group of universities brought together by the Higher Education Academy to devise and pilot core scales for a UK ‘student engagement’ survey. At Brookes, the four core scales (course challenge, critical thinking, academic integration, collaborative learning) will be augmented by further items that evaluate engagement opportunities and student behaviours in response to a range of institutional initiatives, which together form an integrated enhancement strategy for a valuable and distinctive ‘Brookes’ student experience. The developed Brookes Student Engagement Survey will be piloted in March 2014 with all campus-based, non-final-year undergraduates and evaluated over the subsequent three months.

Aims and objectives

The case for another survey has to be a strong one in an era of survey fatigue and consequently decreasing response rates. Brookes’ motivation for participating in the national pilot is threefold. Firstly, concern over the almost exclusive dominance of student satisfaction ratings as valid measures of the student experience and the quality of education. Secondly, born of an understanding that surveys alter behaviours – encapsulated in the old adage ‘what gets counted gets done’ – the desire to further consider, measure and thereby encourage the type of educational opportunities, practices and student behaviours from which students draw benefit. Finally, the need to measure the educational impact of a suite of evidence-based institutional initiatives that together form a coherent programme to enhance the experience of Brookes’ students. These include: a proactive framework for academic advice and support; the introduction of five graduate attributes as core educational outcomes of a Brookes education; and an assessment compact detailing good assessment and feedback practice and responsibilities.

Specific objectives include:

1. The design and piloting of a student engagement survey for Oxford Brookes, which can measure students’ perceptions of their engagement in educationally purposeful activities (particularly those associated with the Brookes student experience enhancement programme) and the learning gains and outcomes achieved.
2. Collection of data that can be used as a benchmark for assessing changes in the way students engage with educationally purposeful activities in future years.
3. Informing the development of the UK Survey of Student Engagement.

Activity and Timeline

The impetus for participation started in February 2012 when, as part of the Brookes Programme for the Enhancement of the Student Experience, contact was made with the HEA to find out whether any UK institutions were using a UK form of the American National Survey of Student Engagement (NSSE) rather than relying solely on student satisfaction surveys. Clearly, other institutions were thinking along the same lines, and later in the year we joined a national working group of 14 institutions brought together by the Higher Education Academy to develop and trial a UK Student Engagement Survey.

Activities over the next year encompass the initial development, piloting and first large-scale administration of the Brookes Student Engagement Survey, on the following timeline:

• June – July 2013: Identify stakeholders and establish the Project Advisory Group, including internal and external experts. Agree the scope and purposes of the Brookes Student Engagement Survey and the project plan.
• Semester 1 2013/14: Item selection by stakeholder group (Sept); pre-testing of draft survey with students using cognitive interviewing (Oct/Nov); review and finalisation of the survey (Dec/Jan).
• Semester 2 2013/14: Create and pilot online version (Jan/Feb); survey live to students (March/April); initial analysis (May).
• June/July 2014: Final analysis and reporting complete before the end of the academic year.

With ‘surveyitis’ a very real threat, it is important to us that the survey is of value, that it is both valid and reliable, and that items are interpreted consistently and accurately by students. Consequently, the development and piloting of the Brookes Student Engagement Survey instrument entails significant testing, including:

1. Pre-testing of the survey using the cognitive interviewing method with think-aloud and verbal probing techniques, after Willis (2005). The aim is to understand the cognitive processes that students engage in when answering the questions, to highlight any respondent biases that arise from interpretation of the questions and to suggest ways to minimize these errors.
2. Piloting of the online version of the survey using respondent debriefing, after completion of the online survey, in a focus group setting. The aim is to highlight any issues that arise from the question order, presentation and layout, and to gauge the length of time the survey takes to complete.
3. Distribution of the live survey to all non-final-year undergraduates to provide a data set suitable for psychometric analysis of the instrument, to ensure it provides data relevant to the underpinning constructs, and to assess student responses to the items.
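One common ingredient of the psychometric analysis mentioned in step 3 is checking each scale’s internal consistency; the sketch below computes Cronbach’s alpha for a set of items. The case study does not name a specific statistic, so treating alpha as the check, and the column names used, are assumptions on our part.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = items.dropna()
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# e.g. alpha for a hypothetical four-item 'course challenge' scale:
# cronbach_alpha(responses[["cc1", "cc2", "cc3", "cc4"]])
# Values around 0.7 or above are conventionally read as acceptable consistency.
```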

Intended Outcomes and Impact

The results of the survey will be reported to the Brookes Student Enhancement Steering Group in July 2014, and subsequently to Brookes’ departments and, where appropriate and whilst maintaining institutional anonymity, to the Higher Education Academy for public reporting.

Internally, learning the lessons from the NSS about how and how not to use survey data, we hope that the results will not be about rankings or publicity, but will provide us with additional evidence to gauge the educational quality of our offering and the real impact of our current enhancement initiatives, in order to drive future improvements. We hope, for example, that questions on the opportunities for, and behaviours engendered by, assessment and feedback practices known to underpin student achievement, and already embedded in Brookes’ policy, will encourage further good practice and guide student behaviours and staff effort.

Externally, we hope that the results of the national group will together engender discussion about the nature of student surveys and weaken the hold of the National Student Survey, which is arguably now used for enhancement purposes beyond its original intention and for which it is unsuited.

References

Gibbs, G. (2010). Dimensions of Quality. York: Higher Education Academy.

Willis, G. B. (2005) Cognitive Interviewing: A Tool For Improving Questionnaire Design. London: Sage.

7. The University of Bath

By Shaun McGall (Learning and Teaching Development Officer) and Gwen van der Velden (Director of Learning and Teaching Enhancement)

The University of Bath and its Students’ Union (SU) have performed exceptionally well over recent years in a range of ‘satisfaction’ surveys, including the National Student Survey, the Student Barometer and other surveys such as the Postgraduate Taught Experience Survey. This summer we collected student responses to our revised internal programme evaluation much as in years before, but for the first time we included a number of student engagement related items and scales, developed and agreed together with institutional partners and experts in the HEA Student Engagement Survey Working Group. We believe that measuring the actual engagement of students with a range of aspects of their studies will help us understand their learning better – or rather, it will help us understand their learning behaviours, and perhaps their preferences, better. In theory at least, that knowledge could influence the design of our educational provision, especially if we manage to follow up the survey findings with discussions with students on why they might or might not engage with particular aspects of their studies. There is no doubt our Students’ Union and student Academic Reps will also want to have a close look at the survey findings, as there may be much to learn about where further development and enhancement may be needed.

So our first interest is pedagogical in nature. It helps that we can compare ourselves to other institutions taking part in the project, and this is something we will certainly look at closely. We are very aware that student engagement will differ across disciplines, and being able to cross-reference disciplines across institutions will also be helpful.

The next layer of interest relates to national policy and educational developments in relation to how students engage with all aspects of their studies. In a climate of awareness about future debt and the changing expectations of fee-paying students, questions about the value of educational provision become ever more important. Rather than assuming that offering ‘more’ provides a better experience, we believe that enabling learning through means that students can engage with effectively is far more important. More provision does not equal better engagement, and this is an area we wish to understand better in the years ahead. We have made that argument time and again in national as well as local debates, and this study will provide rich data to help us understand this aspect in detail.

And finally, we are aware of the considerations currently taking place at national level about using student engagement type questions for public accountability, potentially as part of Key Information for Students. Although such decisions have not yet been made, when they are considered, underpinning them with properly designed trials and prior findings can only be a good thing. Irrespective of what data and information reviews may bring in the future, this project may help inform those evaluating current practice and future developments. The pilot has influenced our joint University / Students’ Union decision to re-focus our internal programme evaluation tool from ‘satisfaction’ to wider and more meaningful ‘student engagement’ items and scales for 2014/15, to support the University’s ‘Excellence in Education’ Strategy. As the University of Bath is an exceptionally high-achieving institution, we feel it is important to make our contribution to the development of new experience and knowledge in this field.

8. The University of Oxford

By Dr Gosia Turner (Senior Statistical Analyst)

Background

Student satisfaction surveys are the dominant model in the UK and can provide useful information about the student experience. While students’ satisfaction is an important measure that should be continuously evaluated, it does not provide robust evidence to inform policy discussions about quality enhancement.

The University of Oxford currently uses two student satisfaction surveys, the statutory NSS and the Student Barometer. Both surveys record very high satisfaction levels; however, the indicators are also very broad in nature, which somewhat limits their use for purposes other than monitoring satisfaction. For example, neither survey provides useful information on how students learn, what they actually do while studying at Oxford, or how much they gain educationally from the tutorial system, the research environment, their peers and extracurricular activities. In addition, no data are produced that can be used to inform teaching practice at the classroom level, nor do the surveys provide much insight into the defining characteristics of the ‘Oxford experience’.

Aims and objectives

The main objective of the project was to gain insight into what students actually do whilst at Oxford: what they do with their time at University, how they study, how they interact with other people and take advantage of the resources available to them, and how they learn. Linking this information with the student record made it possible to identify (and predict) how student engagement translates into academic performance. The data gathered also has the potential to enable the University to benchmark its results against competitor institutions in the UK and US.

Activity

The National Survey of Student Engagement (NSSE) questionnaire was piloted in spring 2013 at two large Oxford colleges. Over 1,300 undergraduate, postgraduate taught and postgraduate research students were surveyed and 519 replies were received, giving a response rate of 39%. The pilot was administered in collaboration with the Higher Education Academy. Fourteen survey items were harmonised with the other HEIs involved in the project (PGR students were excluded from these items as the questions were less relevant to them). The online survey was then followed by three focus groups with students and two semi-structured interviews with college tutors. Survey results were linked with each student’s complete record, including first-year and final-year exam results obtained in summer 2013.

Outcome

The NSSE pilot achieved all the anticipated aims and more. Many popular beliefs and much anecdotal evidence about the University were confirmed empirically. It became clear that students spend more time on self-study than in class, but that the time spent in class is very intensive due to the Oxford tutorial system. The survey results also uncovered interesting patterns in the different pedagogies used by different disciplines; for example, historians tend to work on their own while physicists choose group work more often. Additionally, it was found that students from underperforming schools (at A-level) tend to engage less during their first year of study but become more engaged than their counterparts in later years. The results also showed an increase in student gains, expressed as increased academic skills proficiency, as students progressed with their studies. Furthermore, a number of statistically significant positive correlations were found between various academic activities and skills development.

Analysis of the engagement survey results linked with final exam performance was challenging due to small numbers (only 76 finalists), but it has shown the potential for more research in this area should larger numbers be available. The figure below illustrates the proportion of Oxford students obtaining a First by how often they engaged during class; it suggests that students who engaged more were also more likely to obtain a First (student numbers in each group are given in brackets).

[Figure: proportion of students obtaining a First by response to ‘During the current academic year, how often have you asked questions or contributed to course discussions in other ways?’ – Never (1): 0.0%; Sometimes (19): 21.1%; Often (22): 31.8%; Very often (34): 35.3%.]
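A minimal sketch of the cross-tabulation behind this figure is given below, assuming a linked data set of finalists with illustrative column names (‘contrib_freq’ for the Likert response, ‘first’ as a boolean for a first-class degree); neither name comes from Oxford’s actual records.

```python
import pandas as pd

finalists = pd.read_csv("linked_finalists.csv")   # survey responses joined to exam results
order = ["Never", "Sometimes", "Often", "Very often"]

summary = (
    finalists.groupby("contrib_freq")["first"]
    .agg(n="size", prop_first="mean")             # group size and share obtaining a First
    .reindex(order)                               # keep the Likert categories in order
)
print(summary)  # with only 76 finalists, treat these proportions as indicative only
```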

Impact

Despite the fact that the project was only a pilot study, it has already had a substantial impact on various policy makers across the University. The college tutors interviewed, due to their close relationship with students, seemed well informed about what students do and how they engage; for that reason they were mostly interested in the correlational analyses that answer questions such as ‘why?’ rather than ‘what?’. All tutors noted the potential value of the internal and external benchmarking allowed by the data. Staff from the Equality and Diversity Unit requested further analyses of how students from less traditional backgrounds engage with their studies at Oxford. Researchers at the Oxford Learning Institute found the initial results very interesting and requested further analyses by discipline; these results will directly inform the further development of discipline-tailored training courses for new academic staff.

The key to the success of this project was involving key stakeholders, such as College Senior Tutors and staff from the Oxford Learning Institute, from the very beginning. Moreover, the follow-up work, such as focus groups and interviews with tutors, made the initiative more visible throughout the university and caught the interest of other colleges in a potential follow-up.

Next steps

The final report describing the results has been forwarded to the Educational Committee. The pilot has certainly shown the potential and research value of the student engagement survey; to fully realise this potential, however, a University-wide administration is necessary. Many stakeholders were particularly interested in the impact of engagement on students’ final outcomes. Since outcomes depend strongly on the discipline studied, more finalists need to be surveyed in order to make valid analyses at the subject level, something that has so far not been possible with the pilot data.

9. The University of Warwick

By Desislava Ivanova and Zimu Xu (undergraduate students in MORSE - mathematics, operational research, statistics and economics)

Introduction

The benefits of implementing a Student Engagement Survey (SES) in the UK are various. For instance, the data collected can be compared with surveys such as the National Survey of Student Engagement (NSSE) or the Australasian Survey of Student Engagement (AUSSE) to provide an international benchmark, or with other universities in the UK to provide a national comparison. However, some of the greatest advantages arise if the survey is iterative, allowing a comparison of results in the same institution year on year. Some of these advantages are explored through this case study, based at the University of Warwick, where a survey of student engagement was carried out in 2011 and 2013. We have analysed the differences in levels of student engagement between 2011 and 2013 and are in the early stages of identifying the benefits of the iterative survey.

Overall Comparisons

An analysis of our data from 2011 and 2013 indicates that for 12 out of 17 questions (71%) there has been a significant increase in the overall scores. The results imply that levels of student engagement have increased considerably over the two-year period. In itself this is a useful finding, but further analysis has provided insights into the reasons for these increases.

One of the questions with the most noticeable change is 4a (Ability to appreciate your subject(s) of study from a range of social and international perspectives), whose score increased by 28%. There could be various reasons for this significant change, which will be followed up through focus groups; one possible explanation is that the response rate from overseas students increased from 6.8% in 2011 to 9.4% in 2013. Another hypothesis is that levels of interaction between students and teachers, as well as among students themselves, have increased, leading to a stronger appreciation of the subject of study from different social and international perspectives. This theory is supported by the responses to questions 1b (Discussed ideas from your course with others outside of taught sessions (students, family members, co-workers, etc., including by email/online)) and 1c (Discussed ideas from your course with teaching staff outside taught sessions, including by email/online), which have also increased. Exploring further, we identified significant correlations between these questions: 23% between 4a and 1b, and 25% between 4a and 1c.
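A minimal sketch of this question-by-question comparison is given below. The case study does not name the procedures used, so the choice of the Mann-Whitney U test (suited to ordinal Likert data) and Spearman correlation, along with the file and column names, are our assumptions.

```python
import pandas as pd
from scipy.stats import mannwhitneyu, spearmanr

s2011 = pd.read_csv("ses_2011.csv")   # Likert responses, one column per question
s2013 = pd.read_csv("ses_2013.csv")   # matching column names, e.g. 'q1b', 'q1c', 'q4a'

# Test each question for a shift in scores between the two administrations.
for q in s2011.columns:
    stat, p = mannwhitneyu(s2011[q].dropna(), s2013[q].dropna())
    change = s2013[q].mean() - s2011[q].mean()
    print(f"{q}: mean change {change:+.2f}, p = {p:.3f}")

# Correlation between items within one administration, e.g. 4a against 1b.
rho, p = spearmanr(s2013["q4a"], s2013["q1b"], nan_policy="omit")
```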

The example above illustrates not only how an institution can use iterative surveys to track levels of student engagement but also how, by identifying questions that correlate with each other, it can see how aspects of engagement indirectly affect one another.

Comparisons by Year

One of the greatest benefits of regularly running the SES is that sequential data can be gathered and analysed by year. For instance, some students answered the survey in both 2011 and 2013, which allows us to calculate well-defined correlations across years and scales. Keeping track of this data on an annual basis enables researchers to perform more complex and useful comparisons, identify patterns and trends across years and even make predictions for forthcoming years. The results could even be extended to forecasting, and perhaps influencing, the outcomes of future surveys.

Comparisons by Faculty

The survey data from 2011 and 2013 suggest an interesting way of classifying the eight faculties at the University of Warwick into two broad types. The first type includes faculties that showed a stable increase in responses to all the questions; the second type shows a stable increase in the majority of responses but a large improvement in some. What we gain from this division is the ability to approach faculties that exhibit strong improvement in one or more aspects of engagement, and then communicate that good practice to other faculties. Doing this successfully would increase the level of student engagement, which might in turn result in higher NSS scores, something of crucial importance for any university's reputation.
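
One way such a two-type classification could be operationalised is sketched below; the 'faculty' column, question list and improvement threshold are illustrative assumptions rather than the analysis actually performed.

```python
# Sketch: classify faculties by their pattern of score changes between years.
import pandas as pd

def classify_faculties(df_2011: pd.DataFrame, df_2013: pd.DataFrame,
                       questions: list[str], large_jump: float = 0.5) -> pd.Series:
    """Label each faculty by whether any question improved beyond a threshold."""
    mean_2011 = df_2011.groupby("faculty")[questions].mean()
    mean_2013 = df_2013.groupby("faculty")[questions].mean()
    change = mean_2013 - mean_2011
    # Type two: at least one question improved by more than `large_jump`
    # (threshold in Likert-scale points; chosen here purely for illustration)
    has_large_jump = (change > large_jump).any(axis=1)
    return has_large_jump.map({False: "steady improvement across all questions",
                               True: "strong improvement on some questions"})
```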

Limitations and Next Steps

Due to the limited data, and the fact that some questions were removed or changed between 2011 and 2013, our statistical comparison was carried out question by question rather than by creating more complex scales and benchmarks. However, if the survey is carried out next year, we will be able to create a more complex model that will give us a better understanding of the drivers of student engagement and allow us to disseminate good practice across the University.

It should be noted that the two surveys were run at different times of the year, which could affect the results. We do not believe this had a significant impact, but the timing of future iterations of the survey would be standardised.

The benefits of the SES could be extended by increasing the response rate among students and by including PhD research students.

Conclusion

From the comparisons carried out so far, the immediate conclusion is that it is valuable to run the student engagement survey on an annual basis. The fact that the SES was carried out twice at the University of Warwick provides a basis for beneficial and interesting statistical analysis. Comparing the results from 2011 with those of 2013 shows various trends and patterns that will be used to improve engagement levels among students throughout the institution. The analysis reveals a satisfactory general trend, with all faculties showing an increase on most questions and some showing considerable improvement. Preliminary theories to explain these results have been generated through the statistical analyses of the two surveys; these will be triangulated through focus groups and one-to-one interviews, allowing the University to confirm good practice that can be generalised and disseminated across the institution.

10. York St John University

By Anthony Payne (Head of Student Experience) and Andrew Fern (Strategic Analyst)

Overview

The University has implemented an annual Student Engagement Survey which targets first- and second-year undergraduates and postgraduate taught students, including those from overseas.

The survey instrument is adapted from the Australasian Survey of Student Engagement (AUSSE) and is administered electronically during April. The survey has received strong support from staff and the Students’ Union. With minimal promotion, response rates of over 20% have been achieved with over 90% of respondents completing all questions. The results have been benchmarked internationally and are now being used by faculties and central services to inform quality enhancement and to further stimulate student engagement.

Aims and objectives

York St John University has a goal of delivering an exceptional student experience, and student engagement underpins a number of the University's strategies. One strategy for achieving this goal is to better understand our students' experience and then work closely with the Students' Union to enhance it.

To improve our evidence base for prioritising enhancement activities, the University has implemented an annual Student Engagement Survey. The survey targets undergraduates in the first and second years, any final-year students not included in the NSS sample, and postgraduate taught students, covering both home and international students. Analysis of the rich data produced by such surveys facilitates a much better understanding of the complex nature of student engagement and helps us to enhance the student experience.

Activity

The survey questionnaire is based on one that has been used extensively in the United States since 2000 (NSSE) and is increasingly being adopted in other countries. It was adapted from the Australasian Survey of Student Engagement (AUSSE) to suit our cultural and institutional context, under licence from Indiana University. Only minor changes were made, in order to ensure comparability with the extensive analyses that have been conducted elsewhere. The instrument and methodology were subject to extensive consultation with the Students' Union and staff before adoption, and feedback on results and outcomes has been provided to students after each survey.

A personalised link to the online survey was sent to all students in the sample. Reminder emails were sent to those students who had not responded. A philanthropic incentive was included whereby £1 would be donated to our student scholarship fund for each response received.

Outcomes

Responses were received from 22% of the sample (n = 950), and over 90% of these respondents completed all aspects of the survey. Responses were data-matched to demographic and course data from the student record system (SITS) and were then anonymised. This minimised unnecessary questions and enabled analysis of results by demographic, equalities and academic-progress variables.
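
A minimal sketch of this matching-then-anonymising step is given below, assuming CSV exports and a shared student identifier; the file names, field names and chosen demographic columns are illustrative assumptions (SITS is the real student record system named above).

```python
# Sketch: match survey responses to student-record data, then anonymise.
import pandas as pd

responses = pd.read_csv("ses_responses.csv")   # includes a student_id column
sits = pd.read_csv("sits_extract.csv")         # demographic/course fields

merged = responses.merge(
    sits[["student_id", "course", "year_of_study", "age_band", "disability_flag"]],
    on="student_id", how="left",
)

# Anonymise: drop the identifier once demographic fields are attached
anonymised = merged.drop(columns=["student_id"])
anonymised.to_csv("ses_matched_anonymised.csv", index=False)
```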

The respondents were representative of the broader student population, except that part-time students were slightly under-represented. No negative feedback was received on the way the survey was conducted; on the contrary, a number of positive messages were received from students who had completed it, suggesting that they had found it interesting.

In the 2013 survey, an additional item was added to enable students to expand on any issues of concern. The question used ('What one thing could have been done to improve your time with us?') enabled direct comparison with results from a similar question in the NSS.

Key questions on the survey were grouped into six 'engagement scales', which enabled benchmarking against results internationally (a sketch of the scale-scoring calculation follows the list):

• Academic Challenge - extent to which the programme challenges students to learn;
• Active Learning - students' efforts to actively construct knowledge;
• Student and Staff Interactions - the level and nature of interaction with staff;
• Enriching Educational Experiences - participation in educational activities;
• Supportive Learning Environment - support from the university community; and
• Work Integrated Learning - integration of employment-focused work experiences.
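
The scale scoring referred to above might look like the following sketch, where each scale score is the mean of a respondent's answers to that scale's items; the item-to-scale mapping shown is illustrative, not the actual AUSSE mapping.

```python
# Sketch: derive engagement scale scores as means of their item groups.
import pandas as pd

scale_items = {
    "academic_challenge": ["q1a", "q1b", "q2c"],
    "active_learning": ["q1c", "q2a"],
    # ... remaining scales would follow the same pattern
}

def add_scale_scores(responses: pd.DataFrame) -> pd.DataFrame:
    scored = responses.copy()
    for scale, items in scale_items.items():
        # Scale score = mean of the respondent's answers to the scale's items
        scored[scale] = scored[items].mean(axis=1)
    return scored
```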

Results at this aggregate level for the two years were very similar, which provided a greater degree of confidence that the survey instrument is yielding reliable data. Initial analysis of the survey results yielded a number of interesting findings that have implications for the student experience. These included the following:

• Having good relationships with other students and University staff, especially teaching staff, is closely correlated with higher levels of engagement, continuation, achievement and satisfaction. This aligns with findings from the HEA's 'What Works?' research.
• Disabled students receiving DSA support have higher levels of engagement, achievement and satisfaction than non-disabled students, whilst disabled students without DSA support have poorer results across all scales.
• First-year students have the highest non-continuation rates, but second and third years are more likely to have considered early departure; however, these students typically stay on and complete their studies. BME students are more likely to have seriously considered leaving the University prematurely.
• Students who are employed for 10-15 hours per week are more engaged and have better outcomes than those who are employed for more or fewer hours.
• On average, students spend 8 hours per week on campus outside of class.
• Students over 30 years of age are likely to have higher rates of achievement, continuation and satisfaction, but they are less likely to be active learners.
• 27% of respondents have caring responsibilities, requiring an average of 15 hours per week.
• 78% of students have used university support services, and use of such services is positively correlated with engagement and outcomes.
• There are interesting variations across subjects, but these broadly align with the results of the NSS.

Impact

These survey findings have been triangulated with other data and used by the Enhancement and Student Engagement Committees at University and Faculty level to inform decision making around priorities and targets for enhancement activities. The results are also used within faculties in the development of Programme Evaluation Reports, and by Deans in writing Annual Evaluation Reports. Overall there has been a significantly increased use of student engagement related evidence across the University to inform planning and enhancement activities.

Next steps

Given the significant benefits accruing from the introduction of the Student Engagement Survey, the University will continue to conduct it on an annual basis for the foreseeable future. A focus in the coming year will be further analysis of qualitative responses, and analysis of data by protected characteristics to inform equalities work.


Contact us

The Higher Education Academy
Innovation Way
York Science Park
Heslington
York
YO10 5BR

www.heacademy.ac.uk | www.twitter.com/heacademy
+44 (0)1904 717500
[email protected]

ISBN: 000-0-0000000-00-0

© The Higher Education Academy, 2013

The Higher Education Academy (HEA) is a national body for learning and teaching in higher education. We work with universities and other higher education providers to bring about change in learning and teaching. We do this to improve the experience that students have while they are studying, and to support and develop those who teach them. Our activities focus on rewarding and recognising excellence in teaching, bringing together people and resources to research and share best practice, and by helping to influence, shape and implement policy - locally, nationally, and internationally.

The HEA supports staff in higher education throughout their careers, from those who are new to teaching through to senior management. We offer services at a generic learning and teaching level as well as in 28 different disciplines. Through our partnership managers we work directly with HE providers to understand individual circumstances and priorities, and bring together resources to meet them.

The HEA has knowledge, experience and expertise in higher education. Our service and product range is broader than any other competitor.

The views expressed in this publication are those of the author and not necessarily those of the Higher Education Academy. No part of this publication may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopying, recording, or any storage and retrieval system without the written permission of the Editor. Such permission will normally be granted for educational purposes provided that due acknowledgement is given.

To request copies of this report in large print or in a different format, please contact the communications office at the Higher Education Academy: 01904 717500 or [email protected]

The Higher Education Academy is a company limited by guarantee registered in England and Wales no. 04931031. Registered as a charity in England and Wales no. 1101607. Registered as a charity in Scotland no. SC043946.

The Higher Education Academy and its logo are registered trademarks and should not be used without our permission.
