What Works Two Year Report
WHAT WORKS DEPARTMENT
Social Mobility & Student Success

What Works at King’s College London
What we’ve found in our first two years

A progress report from the King’s What Works team

What Works is a research and evaluation team embedded in the King’s College London Widening Participation department, with the role of making society fairer in an evidence-based way. Our vision is that all King’s widening participation and student success initiatives are based on evidence and robust evaluation. This report summarises our progress over our first two years in building our understanding of student behaviour and embedding robust evaluation practices.

Authors: Vanessa Todman and Salome Gongadze
With contributions from: Michael Sanders, Susannah Hume, Michael Bennett, Eliza Kozman, Miriam Styrnol and Henry Woodward

Two years of showing the way
Michael Sanders

I arrived at King’s College London as a Reader in the summer of 2018. In my interview, when asked what attracted me to King’s, I said, truthfully, that the college’s commitment to ‘service’ as one of its values was a big attraction, and that this was particularly evident in the widening participation department’s establishment of the What Works team. I had already worked with King’s on the team’s spiritual predecessor, KCLxBIT, and What Works was already becoming an exciting feature on the landscape – running randomised controlled trials to rigorously test various ways of improving the student experience, and making good use of the data that King’s already collected on students. The team’s work on encouraging people to come to the welcome fair, in particular, overturned my initial expectations and forced me to think differently about the student experience. Two years into the team’s life, it’s a good time to stop and reflect.
The team’s architect, Anne-Marie Canning, has left King’s, and Susannah Hume, the inaugural director, has moved within the institution. But the pioneering spirit of the team continues under new leadership, and it’s clear to see that the enthusiasm for bringing better evidence to widening participation and student success goes undimmed. I’m hopeful that future successes lie in wait for What Works at King’s, but perhaps the biggest legacy will lie beyond the institution – in showing that trials can be conducted in this way, that more can be drawn from data, and in laying the groundwork in many ways for the formation of the Centre for Transforming Access and Student Outcomes, the new What Works Centre for Higher Education. With this track record, we should all watch what happens next with interest.

Michael Sanders is a Reader in Public Policy at King’s College London, where he directs the Evidence Development and Incubation Team, and serves as academic lead for the Centre for Transforming Access and Student Outcomes. He is also executive director of What Works for Children’s Social Care.

With thanks to...
Awais Ali, Jawad Anjum, Eireann Attridge, Nasima Bashar, Michael Bennett, Haddi Browne, Eleri Burnhill, Anne-Marie Canning, Nadia Chechlinska, Zoe Claymore, Yasemin Genc, Salome Gongadze, Lewis Hudson, Susannah Hume, Eliza Kozman, Ben Laycock-Boardman, Oliver Martin, Pauline Meyer, Jack Mollart-Solity, Hazel Northcott, Chiamaka Nwosu, Pri Perera, Joe Pollard, Michael Sanders, Miriam Styrnol, Martin Sweeney, Sandra Takei, Matt Tijou, Vanessa Todman, Henry Woodward

Contents

Executive summary 6
Introduction to What Works 8
  The history of the King’s What Works team 8
  What Works’ mission and values 8
Applying behavioural insights to higher education 9
  Dual-Process Theory and choice architecture 9
  The importance of social capital and belongingness 10
  Using nudges and psycho-social interventions in higher education 11
  The ACES framework 12
Access to higher education 15
  Researching behaviours, experiences and perceptions 15
  Interventions for student participation 16
Understanding and improving the student experience 21
  Data analysis 21
  Trials 24
Embedding robust evaluation practices 27
  Impact is important, but processes matter too 27
  Ask the right questions 27
  Use the right data at the right time 27
  Choose the right method for your context 29
  Causal inference matters 29
  Just because something is an RCT, doesn’t mean it’s a universal gold standard 30
  Build evaluations, and their findings, into ‘business as usual’ 30
The Centre for Transforming Access and Student Outcomes 31
  Our vision for TASO 31
What next for What Works? 33

Executive summary

Students from underrepresented groups are likely to have a different experience of university than other students, and feel that difference

We have been collating quantitative data on students’ experiences at King’s and have found that there are measurable differences in reported sense of belonging for some groups. Our focus groups with white working-class girls give us insight into some of the issues that group can experience, such as low confidence and alienation from more privileged peers, suggesting areas to target in future work. Our project to bring together everything the institution knows about the First Year Experience is an example of our work to understand what the differences are so that we can tailor our offer more effectively.

Behavioural insights can be used to ensure that the choices we give to students are framed in such a way as to encourage positive responses

We are all choice architects, and architects have the potential to achieve powerful impacts by thinking about the way we communicate choices. To increase take-up of offers of a place in our halls of residence, we altered the communications that the residence team sent, with positive results: altogether, take-up rose from 45 per cent among those who received our control email to 48 per cent among those who received our treatment email. Half of each group also received a reminder text message. When we isolated the effects of the text message and the email, we found that both together pushed take-up to over half in our second wave. The success of this trial demonstrates that we are all choice architects, and that even a change to something as simple as an email can have a noticeable impact.

Students are powerful actors in their own education, and need to be included in interventions which affect them

By knowing which behavioural biases are common to students, we can help them overcome them. For example, when we think about how we’re doing we tend to focus on the present and forget about previous successes. We are currently trialling a self-reflection tool to help first years think holistically about their performance. This was not an incentivised trial, and the fact that students have signed up and engaged in the survey waves is a good indicator that students want help self-reflecting and improving their wellbeing. The reception by students of the videos we co-created with students for our Your King’s Your Success project was highly positive, with on average 83 per cent of students thinking our videos would be helpful for first years. This demonstrates how well student-led interventions and visible role models can be received by other students. We will know the extent of the impact soon.

In order to succeed at university, students need to have a strong social network

In our first KCLxBIT report¹ we reported that 41 per cent of all students and 45 per cent of widening participation (WP) students (p<0.05) didn’t have many people they could go to if they needed help. Helping students to build their networks has therefore been a focus of the past two years. There’s a lot more to do here – we’re still working to understand how we can help students to make friends (see our Networky project, page 18 of this document) and make the most of their networks (see our PACT project, page 16).

1. Canning, A.M., Hume, S., Makinson, L., Koponen, M., Hall, K., Delargy, C. (2017) KCLxBIT Project Report 2015–2017. Available at: https://www.kcl.ac.uk/study/assets/PDF/widening-participation/What-works-project-report.pdf

Robust evaluation ensures that we are targeting limited resources at the right things

A good evaluation is essential for demonstrating the impact of our projects and understanding how they can be improved. It is important to design robust process and impact evaluations, including designing the correct outcome measures and developing a reasoned and structured theory of change. It is also important to use the right evaluation method for your context. Randomised controlled trials are often the best way of establishing causal inference, but they are not appropriate for every context and are best used alongside qualitative methods to build a good understanding of why and for whom something is working.

Building knowledge is an ongoing process

Robust research can be slow, and even two years in, some of our projects are still ongoing. We’ve run ten randomised controlled trials so far, involving almost 12,000 students, but there has been no good place for us to stop and say we had the answers. Nonetheless, we hope this report will inspire you, and still provide some useful learning. After two years concentrating on enrolled students, we’re turning our attention to outreach – we’ll let you know how we get on.

We’ve helped to set up a body to get this kind of work embedded in the

You don’t need your own What Works department to get involved in this kind of work.