
What Works Department
Social Mobility & Student Success

What Works at King's College London
What we've found in our first two years


What Works is a research and evaluation team embedded in the King's College London Widening Participation department, with a remit to make society fairer in an evidence-based way. Our vision is that all King's widening participation and student success initiatives are based on evidence and robust evaluation. This report summarises our progress over our first two years in building our understanding of student behaviour and embedding robust evaluation practices.

Authors: Vanessa Todman and Salome Gongadze
With contributions from: Michael Sanders, Susannah Hume, Michael Bennett, Eliza Kozman, Miriam Styrnol and Henry Woodward

Two years of showing the way
Michael Sanders

I ARRIVED AT KING’S COLLEGE LONDON as a Reader in the summer of 2018. In my interview, when asked what attracted me to King’s, I said, truthfully, that the college’s commitment to ‘service’ as one of its values was a big attraction, and that this was particularly evident in the widening participation department’s establishment of the What Works team.

I had already worked with King's on the team's spiritual predecessor, KCLxBIT, and What Works was already becoming an exciting feature on the landscape – running randomised controlled trials to rigorously test various ways of improving the student experience, and making good use of the data that King's already collected on students. The team's work on encouraging people to come to the welcome fair, in particular, overturned my initial expectations and forced me to think differently about student experience.

Two years into the team’s life, it’s a good time to stop and reflect. The team’s architect, Anne-Marie Canning, has left King’s, and Susannah Hume, the inaugural director, has moved within the institution. But the pioneering spirit of the team continues under new leadership, and it’s clear to see that the enthusiasm for bringing better evidence to widening participation and student success goes undimmed.

I’m hopeful that future successes lie in wait for What Works at King’s, but perhaps the biggest legacy will lie beyond the institution – in showing that trials can be conducted in this way, that more can be gotten from data, and laying the groundwork in many ways for the formation of the centre for Transforming Access and Student Outcomes, the new What Works Centre for Higher Education. With this track record, we should all watch what happens next with interest.

Michael Sanders is a Reader in Public Policy at King's College London, where he directs the Evidence Development and Incubation Team, and serves as academic lead for the centre for Transforming Access and Student Outcomes. He is also executive director of What Works for Children's Social Care.


With thanks to...

Awais Ali, Jawad Anjum, Eireann Attridge, Nasima Bashar, Michael Bennett, Haddi Browne, Eleri Burnhill, Anne-Marie Canning, Nadia Chechlinska, Zoe Claymore, Yasemin Genc, Salome Gongadze, Lewis Hudson, Susannah Hume, Eliza Kozman, Ben Laycock-Boardman, Oliver Martin, Pauline Meyer, Jack Mollart-Solity, Hazel Northcott, Chiamaka Nwosu, Pri Perera, Joe Pollard, Michael Sanders, Miriam Styrnol, Martin Sweeney, Sandra Takei, Matt Tijou, Vanessa Todman, Henry Woodward

Contents

Executive summary 6
Introduction to What Works 8
The history of the King's What Works team 8
What Works' mission and values 8
Applying behavioural insights to higher education 9
Dual-Process Theory and choice architecture 9
The importance of social capital and belongingness 10
Using nudges and psycho-social interventions in higher education 11
The ACES framework 12
Access to higher education 15
Researching behaviours, experiences and perceptions 15
Interventions for student participation 16
Understanding and improving the student experience 21
Data analysis 21
Trials 24
Embedding robust evaluation practices 27
Impact is important, but processes matter too 27
Ask the right questions 27
Use the right data at the right time 27
Choose the right method for your context 29
Causal inference matters 29
Just because something is an RCT, doesn't mean it's a universal gold standard 30
Build evaluations, and their findings, into 'business as usual' 30
The Centre for Transforming Access and Student Outcomes 31
Our vision for TASO 31
What next for What Works? 33

Executive summary

Students from underrepresented groups are likely to have a different experience of university than other students, and feel that difference

We have been collating quantitative data on students' experiences at King's and have found that there are measurable differences in reported sense of belonging for some groups. Our focus groups with white working-class girls give us insight into some of the issues that group can experience, such as low confidence and alienation from more privileged peers, suggesting areas to target future work. Our project to bring together everything the institution knows about the First Year Experience is an example of our work to understand what the differences are so that we can tailor our offer more effectively.

Behavioural insights can be used to ensure that the choices we give to students are framed in such a way as to encourage positive responses

We are all choice architects, and architects have the potential to achieve powerful impacts by thinking about the way we communicate choices. To increase take up of offers of a place in our halls of residence, we altered the communications that the residence team sent, with positive results: altogether, we saw an increase in take up from 45 per cent to 48 per cent of those who received our treatment email over our control email. Half of each group also received a reminder text message. When we isolated the effect of the text message and the email, we found both together pushed take up to over half in our second wave. The success of this trial demonstrates that we are all choice architects, and even a change to something as simple as an email can have a noticeable impact.

Students are powerful actors in their own education, and need to be included in interventions which affect them

By knowing which behavioural biases are common to students, we can help them overcome them. For example, when we think about how we're doing we tend to focus on the present and forget about previous successes. We are currently trialling a self-reflection tool to help first years think holistically about their performance. This was not an incentivised trial, and the fact that students have signed up and engaged in the survey waves is a good indicator that students want help self-reflecting and improving their well-being. Students' reception of the videos we co-created with students for our Your King's Your Success project was highly positive, with on average 83 per cent of students thinking our videos would be helpful for first years. This demonstrates how student-led interventions and visible role models can be received by other students. We will know the extent of the impact soon.

In order to succeed at university, students need to have a strong social network

In our first KCLxBIT report1 we reported that 41 per cent of all students and 45 per cent of widening participation (WP) students (p<0.05) didn't have many people they could go to if they needed help. Helping students to build their networks has therefore been a focus of the past two years. There's a lot more to do here – we're still working to understand how we can help students to make friends (see our Networky project, page 18 of this document) and make the most of their networks (see our PACT project, page 16).

1 Canning, A.M., Hume, S., Makinson, L., Koponen, M., Hall, K. and Delargy, C. (2017) KCLxBIT Project Report 2015-2017. Available at: https://www.kcl.ac.uk/study/assets/PDF/widening-participation/What-works-project-report.pdf

Robust evaluation ensures that we are targeting limited resources at the right things

A good evaluation is essential for demonstrating the impact of our projects and understanding how they can be improved. It is important to design robust process and impact evaluations, including designing the correct outcome measures and developing a reasoned and structured theory of change. It is also important to use the right evaluation method for your context. Randomised controlled trials are often the best way of establishing causal inference, but they are not appropriate for every context and are best used alongside qualitative methods to build a good understanding of why and for whom something is working.

Building knowledge is an ongoing process

Robust research can be slow, and even two years in, some of our projects are still ongoing. We've run ten randomised controlled trials so far, involving almost 12,000 students, but still there was no good place for us to stop and say we had the answers. Nonetheless, we hope this report will inspire you, and still provide some useful learning. After two years concentrating on enrolled students, we're turning our attention to outreach; we'll let you know how we get on.

We've helped to set up a body to get this kind of work embedded in the sector (TASO)

You don't need your own What Works department to get involved in this kind of work. In 2019 we helped to launch the centre for Transforming Access and Student Outcomes in Higher Education (TASO).2 TASO will exist as an independent hub for higher education professionals to access leading research, toolkits, evaluation techniques and more to help widen participation and improve equality within the sector.

How What Works is delivering our institutional priorities

'What Works is a major institutional priority for King's College London, as we work to transform all aspects of Education and the experience of our students in the context of our Vision 2029. In reaching the highest academic standards in teaching and learning, and supporting our extraordinarily diverse student body at all stages of their education, What Works provides us with the insight and knowledge we need in designing and improving all aspects of the student experience, and understanding the impact of our far-reaching work on access and participation.'
Nicola Phillips, Vice-President and Vice-Principal (Education) and Professor of Political Economy, King's College London

'What Works is contributing to the world-class status of King's by embedding robust research and evaluation methods into student success programmes and outreach activity. It is integral in ensuring robust evidence is available to practitioners to support delivery of our Education strategy commitment to embrace students as co-creators of the educational experience and ensure all King's students are equipped for success.'
Darren Walls, Executive Director, Education and Students, King's College London

2 Taso-he.org

Introduction to What Works

The history of the King's What Works team

The What Works team was launched in January 2018, following a two-year collaborative project between King's College London and the Behavioural Insights Team (BIT). This project, called 'KCLxBIT', was set up to conduct research into the application of behavioural insights approaches to widening participation at King's over the 2015-16 and 2016-17 academic years. KCLxBIT ran ten randomised controlled trials (RCTs) and a six-wave longitudinal survey of King's undergraduates, covering student experience topics like wellbeing, identity, belongingness, faculty interactions and financial behaviour.

What Works is now a small dedicated team working internally and as part of the broader movement within the sector to build an evidence base for what works in improving the student experience for underrepresented groups. In this two-year period, we have focused on running interventions to test the importance of affirming belonging and building social capital for students, understanding psychological factors which influence attainment gaps, and embedding robust evaluation practices across King's. We've helped incubate the sector's new What Works centre (TASO, see page 31 for more information), conducted social research with over 100 students, run a multi-wave wellbeing survey, and run close to ten RCTs (involving nearly 12,000 pupils and students). These have ranged from light-touch text messages to long-term positive psychology interventions.

Within less than two years, King's has demonstrated the feasibility of not only running complex research to inform student support practices, but also using mixed methods approaches and embedding evaluation practices within a higher education institution. This report sets out our findings, our progress so far and our aims for the future. It is intended to help other university teams with similar aspirations and to ensure that what we have learnt from our trials is taken forward into the wider sector. We hope that you find it useful; please get in touch with us at [email protected] if you would like to discuss anything in this report further.

What Works' mission and values

Empirical research and robust evidence are at the core of the department's mission. The What Works team has three main objectives:

1 To contribute to understandings of what works in enabling people to access and succeed at university.

2 To promote empiricism and innovation in widening participation.

3 To support the sector to think differently about designing and evaluating their initiatives.

In addition to finding out what works and doesn't work, the department is also committed to building a better understanding of why and how interventions work. As part of this, we also get feedback from students on their experiences engaging with research such as focus groups. This feedback helps us understand the 'how' of our interventions and how students experience our work more broadly.

What Works also aims to promote our approach across the wider Higher Education (HE) sector and engage in the public dialogue about behavioural insights, evaluation and widening participation, in order to bring the sector along with us in the use of innovative and robust methods. Examples of our efforts to engage the sector include hosting talks and trainings at King's College London, eg a talk with behavioural research expert Dr Ben Castleman and interactive workshops on evaluation and RCTs. We also maintain a blog and Twitter feed (@KCLWhatWorks) dedicated to promoting our work.

What are behavioural insights?

Behavioural insights is a field that applies insights from the behavioural sciences (particularly behavioural economics, psychology and social psychology) to policy. What Works uses behavioural insights to improve services and deliver positive results for young people by designing services around them.

Applying behavioural insights to higher education

In line with our objective to contribute to the knowledge base within our sector, What Works has conducted research on behavioural insights and student experience topics. This section summarises the highlights of this research and introduces the ACES framework, which has been devised by the What Works department as a tool for applying behavioural science to higher education programmes specifically.

Dual-Process Theory, choice architecture, and nudge

We need to understand how students make decisions in order to encourage students to perform the best actions for them and help them overcome the biases that we are all prone to.

Much of the work in behavioural economics is rooted in an understanding of human cognition called dual-process theory, a way of conceptualising how people make decisions under conditions of uncertainty. It was developed in the 1970s by psychologists Daniel Kahneman and Amos Tversky to challenge the old assumption that people can generally be relied upon to make 'rational' choices using cost-benefit analysis.3 Kahneman argues that humans have two ways of approaching a decision (see below).

System 1 is fast and intuitive (the daily commute; 2 + 2 = ?). The intuitive system, characterised by fast, contextual and effortless processing, uses 'heuristics' to simplify complex decisions; it is not 'rational' in the economic sense.

System 2 is slow and reflective (travelling somewhere new; 23 x 47 = ?). The reflective system, characterised by logical, general and effortful processing, approaches economic rationality but consumes cognitive resources.

Because thinking using System 2 takes more cognitive effort, humans tend to prefer System 1, their more impulsive decision-making process. This means that the context around a decision, both intentional and unintentional, has the potential to profoundly influence the outcome of the decision.

The famous book Nudge by Richard Thaler and Cass Sunstein4 calls this context the 'choice architecture'. By considering the choice architecture, we can influence people – or 'nudge' them – to make the best decisions by presenting choices in a way that is mindful of the tendency to rely on System 1 thinking.

Furthermore, research has demonstrated that System 1 thinking can actually be somewhat predictable: people have a number of reliable mental habits,5 meaning that we can design 'nudges' around these predictable habits. Some examples of the habits include:

• Status quo bias.6 People tend to stick with the default options of things instead of making an effort to change them. Practitioners can help influence people to make better choices by changing the default option, or by reducing the amount of friction (time and effort) required to change something. For example, one trial showed that there is value in reducing the level of effort required to get benefits – providing families in the US with simplified information about government funding for university, and help filling out the paperwork, led to an increase in take-up of financial aid and HE enrolment.7

3 Kahneman, D. and Tversky, A. (1979) Prospect theory: An analysis of decision under risk. Econometrica, 47, 263-291.
4 Thaler, R. and Sunstein, C. (2009) Nudge: Improving Decisions About Health, Wealth and Happiness. London: Penguin Books.
5 These habits are described in Nudge. The term 'behavioural insights' is sometimes used to describe these habits, in addition to being used to describe the larger field studying them.
6 Samuelson, W. and Zeckhauser, R. (1988) Status Quo Bias in Decision Making. Journal of Risk and Uncertainty, 1: 7-59.
7 Bettinger, E. P., Long, B. T., Oreopoulos, P. and Sanbonmatsu, L. (2009) The Role of Simplification and Information in College Decisions: Results from the H&R Block FAFSA Experiment. NBER Working Paper No. 15361. Available at: https://www.nber.org/papers/w15361

• Message salience. People tend to respond at higher rates to stimulus that is novel or otherwise eye-catching. A US-based RCT testing whether personalised text messages sent to students to remind them to complete university enrolment tasks found that this intervention increased enrolment among some student groups.8

• Present bias. People tend to overvalue present rewards and desires against possible future rewards. Research suggests this is especially pronounced in teenagers and young adults, who tend to think short term and have a tendency to be influenced by their peer group, which can create self-reinforcing cycles of present-bias behaviour like procrastination.9 What Works ran an RCT (see page 17) focused on reducing present bias to get students to sign up for a place in King's Residences – we found that personalised emails helped encourage some students to accept offers of university accommodation when given a short timeframe to respond.

These examples highlight how the biases we all slip into have the potential to lead to sub-optimal decision making, but also that building nudges can help people overcome these biases. In a university context, decisions which may feel low stakes at the time, such as which modules to enrol on, or whether or not to attend freshers' fair, have long repercussions for the future. We encourage practitioners to seek to understand how they are asking students to make choices, and to trial their interventions to ensure the architecture of those choices subverts these biases. In parts three and four of this report, we highlight several trials we've run here at King's that aim to 'nudge' students to make better decisions.

The importance of social capital and belongingness

In higher education, the evidence suggests that it is important that students feel a part of the university and that they have a strong social network.

In What Works' research on the student experience, the department has identified two important aspects of the student experience that impact success at university and that we might be able to influence with interventions, alongside the important work other teams do to ensure institutional change to help students from underrepresented groups.

Social Capital

Students start university with different levels of social capital:10 personal knowledge and experience within their networks about what it's going to be like and what they need to do to succeed. Social capital refers to the tangible and intangible resources an individual can access via their social networks – friends, family, colleagues and contacts. It is understood to be an important factor in how students navigate and experience university, including their attainment, persistence, and post-university outcomes.11

Widening Participation students – those from backgrounds where university progression is traditionally low – are often found to have less social capital in this context.12 For example, they are less likely to have parents who went to university and who can pass on their experience, help them navigate their new environment, and make the most of their opportunities.

Practitioners can help students overcome low social capital by providing information, either to empower and enable the network students already have or to supplement gaps that students with lower social capital might have in their knowledge of how to succeed. They can also positively influence persistence by helping students increase their social capital through the creation of both good quality networks and a large number of low-intensity 'ties' with others from different backgrounds.

Belongingness

Belongingness (or sense of belonging) refers to the extent to which a student feels supported, accepted and wanted by their institution.13 Sense of belonging is psychological and subjective. In the context of higher education, it's about the feeling of academic and social integration:14

• Feeling connected with and relating to your institution15,16

• Identifying with the institution and feeling proud to be there17

• Feeling that the connections you've built are stable and will last18

• Feeling like you have something in common with the people around you19

• Feeling respected, that you matter and will be supported and included.20

8 Castleman, B. and Page, L. (2015) Summer Nudging: can personalised text messages and peer mentor outreach increase college going among low-income high school graduates. Journal of Economic Behavior & Organization, 115, 144-150.
9 Lavecchia, A., Liu, H. and Oreopoulos, P. (2014) Behavioural Economics of Education: Progress and Possibilities. NBER Working Paper No. 20609.
10 Bourdieu, P. and Wacquant, L. P. D. (1992) An Invitation to Reflexive Sociology. Chicago: University of Chicago Press.
11 Ingram, N., Burke, C., Abrahams, J. and Thatcher, J. (2018) Bourdieu: The Next Generation. London: Routledge.
12 Halpern, D. (2005) Social Capital. Cambridge: Polity Press.
13 Hurtado, S. and Carter, D. F. (1997) Effects of college transition and perceptions of the campus racial climate on Latino college students' sense of belonging. Sociology of Education, 70, 324-345.
14 Tinto, V. (1997) Colleges as communities: Exploring the educational character of student persistence. Journal of Higher Education, 68(6), 599-623.
15 Thomas, L. (2012) Building student engagement and belonging in Higher Education at a time of change: final report from the What Works? Student Retention & Success programme. Available at: https://www.heacademy.ac.uk/system/files/what_works_final_report.pdf
16 Hausmann, L. R. M., Schofield, J. W. and Woods, R. L. (2007) Sense of belonging as a predictor of intentions to persist among African American and white first-year college students. Research in Higher Education, 48(7), 803-839.
17 Hurtado, S. and Carter, D. F. (1997) Effects of college transition and perceptions of the campus racial climate on Latino college students' sense of belonging. Sociology of Education, 70, 324-345.

Figure 1 Five facets of belonging

• Connections – Help students feel welcome and part of a community. Induction is a key chance for the university to signal to students that they are welcome. Use induction to facilitate students' social relationships with students and staff members.

• Identification – Encourage students to develop an identity they are proud of, associated with King's. Communicate a vision of what it means to be a King's student. Give students opportunities to achieve meaningful shared goals with others at King's. Encourage them to acquire tokens like university-branded notebooks.

• Stability – Encourage stability in relationships between staff and students. Create opportunities for students to interact regularly with a small group of staff and students, eg through core subjects or the same teachers for multiple modules.

• Perceived similarity – Emphasise commonalities, including normalising difficulties. Articulate common worries and issues to students (ie through peer discussion) and how they can be overcome, so students know that they are not alone in these experiences.

• Respect – Seek students' input in matters that affect them. Treat students as respected and valued adults and seek their input on matters that affect them. When creating initiatives to help students, avoid the implication that they lack something they need to succeed.

Belongingness can be a predictor of whether a student completes their degree.21 Research from What Works and elsewhere suggests students from non-traditional backgrounds are less likely to feel like they belong. Black and minority ethnic students,22 and working-class students,23 have been found to have a lower sense of belonging than white/Caucasian students in American universities. This has also been mirrored in our own analysis (see page 21). We aim to build a stronger sector understanding of belonging in English universities. We reviewed the available evidence and identified five facets of belonging in higher education – key elements in a student's sense of belonging, which we are now testing in our own research – and how universities can improve them for students. They are shown in Figure 1 above.

Using nudges and psycho-social interventions in higher education

We first started testing whether behavioural nudges could be used at King's for the KCLxBIT project.24 We have continued to test nudges to positively influence student behaviours (eg encouraging them to seek personal tutor support).

Nudges are just one tool in the behavioural science toolbox, and they are mostly effective on the margins when there is an inclination towards the desired behaviour already. As our knowledge base of students' behaviours and perceptions grew through our social research, we also began to test psycho-social interventions.

18 Baumeister, R. and Leary, M. (1995) The need to belong: Desire for interpersonal attachments. Psychological Bulletin, 117(3), 497.
19 Walton, G. M. and Cohen, G. L. (2007) A question of belonging: Race, social fit, and achievement. Journal of Personality and Social Psychology, 92(1), 82-96.
20 Goodenow, C. (1993) Classroom belonging among early adolescent students: relationships to motivation and achievement. Journal of Early Adolescence, 13, 21-43.
21 Thomas, L. (2012) Building student engagement and belonging in Higher Education at a time of change: final report from the What Works? Student Retention & Success programme. Paul Hamlyn Foundation.
22 Walton, G. M. and Cohen, G. L. (2007) A question of belonging: Race, social fit, and achievement. Journal of Personality and Social Psychology, 92(1), 82-96.
23 Johnson, D. R., Alvarez, P., Longerbeam, S., Solner, M., Inkelas, K. K., Leonard, J. B. and Rowan-Kenyon, H. (2007) Examining sense of belonging amongst first-year undergraduates from different racial/ethnic groups. Journal of College Student Development, 48(5), 525-542.
24 Canning, A.M., Hume, S., Makinson, L., Koponen, M., Hall, K. and Delargy, C. (2017) KCLxBIT Project Report 2015-2017. Available at: https://www.kcl.ac.uk/study/assets/PDF/widening-participation/What-works-project-report.pdf

Psycho-social interventions in education change how students think or feel about learning, or about themselves, in a learning environment. For example, the Behavioural Insights Team have had some success with approaches that strengthen school and college pupils' support networks and help them to approach learning more positively.25, 26

In our first two years we have focused on three questions:

1 How can we encourage university participation amongst underrepresented groups?
Outreach work is labour intensive and requires a university presence in schools, to provide the 'hot knowledge'27 students prefer (but don't always have access to) when making decisions.28 We worked with the Behavioural Insights Team to see if providing more information on report cards and asking parents to sign a pledge to discuss university with their children could increase students' and parents' reported positivity about university more cheaply and easily (see PACT, page 16). We have also been conducting research to understand the access experiences of white working-class girls (page 15).

2 How can we encourage students to turn up to things?
The most common question we are asked by colleagues at King's is how we can encourage students to turn up to events and to read, and engage with, their university communications. Attendance at university events may help students make friends and improve belongingness, helping them to feel connected to their institution. During the KCLxBIT project we focused very much on this question, encouraging attendance at welcome events in particular,29 and in the first two years of What Works we have been drawing on those findings, testing out new ways to encourage student attendance and engagement. In our 2018 Residences Trial (see page 17) we tested if we could use behavioural insights to reduce the intention-behaviour gap, the common discrepancy between our intentions and our actions.30

3 How can we influence attainment gaps?
Several attainment gaps – differences in the average performance of different groups – exist between student groups within most, if not all, higher education institutions. Currently, the most significant gap nationally is the gap between the attainment of black and minority ethnic (BAME) and white students.31 Attainment gaps are caused by complex issues that will require multiple approaches from universities. Addressing the attainment gap is a core part of King's Education Strategy, with faculties taking the lead supported by our Student Outcomes team. The issues are likely to be too big and too entrenched to be solved with a nudge, but a trial may help identify psycho-social areas where we could have a positive impact, to suggest further study.

The ACES framework

The ACES framework is a tool designed by the What Works department for applying behavioural science to higher education programmes. This framework provides a way to design initiatives specifically for a higher education setting. It builds on existing BI frameworks, such as MINDSPACE32 and the Behavioural Insights Team's EAST33 framework, but also looks more broadly than just the context of a specific decision, reflecting the complex factors that feed into student success. Applying ACES to thinking about student experience at King's can help to improve student outcomes at King's, and their satisfaction with the institution.

ACES stands for 'Affirm belonging', 'Consider the choice architecture', 'Empower and enable', and 'Support social connections'.

25 Miller, S., Davison, J., Yohanis, J., Sloan, S., Gildea, A. and Thurston, A. (2016) Texting Parents: Evaluation report and executive summary. The Education Endowment Foundation. Available at: https://educationendowmentfoundation.org.uk/public/files/Projects/Evaluation_Reports/Texting_Parents.pdf
26 Hume, S., O'Reilly, F., Groot, B., Kozman, E., Barnes, J., Soon, XZ., Chande, R. and Sanders, M. (2018) Retention and Success in Maths and English Classes: a Practitioner Guide to Applying Behavioural Insights. Available at: http://38r8om2xjhhl25mw24492dir.wpengine.netdna-cdn.com/wp-content/uploads/2018/03/ASK-guide-27-Feb-2.pdf
27 Ball, S. J. and Vincent, C. (1998) 'I heard it on the grapevine': 'Hot' knowledge and school choice. British Journal of Sociology of Education, 19(3), 377-400.
28 Ball, S. J., Davies, J., David, M. and Reay, D. (2002) Classification and judgement: Social class and the cognitive structures of choice in higher education. British Journal of Sociology of Education, 23(1), 51-72.
29 Canning, A.M., Hume, S., Makinson, L., Koponen, M., Hall, K. and Delargy, C. (2017) KCLxBIT Project Report 2015-2017. Available at: https://www.kcl.ac.uk/study/assets/PDF/widening-participation/What-works-project-report.pdf
30 Sheeran, P. (2002) Intention-behaviour relations: A conceptual and empirical review. In M. Hewstone and W. Stroebe (Eds.), European Review of Social Psychology (Vol. 12, pp. 1-36).
31 HESA (2018) Students statistical report (table 3.7).
32 Institute for Government (2010) MINDSPACE. Available at: https://www.instituteforgovernment.org.uk/sites/default/files/publications/MINDSPACE.pdf
33 Behavioural Insights Team (2015) EAST: Four Simple Ways to Apply Behavioural Insights. Available at: http://www.behaviouralinsights.co.uk/wp-content/uploads/2015/07/BIT-Publication-EAST_FA_WEB.pdf

• Affirm belonging. Students who feel like they belong at King's are more likely to engage academically and socially, achieve better grades, and have higher wellbeing.34 Belonging can be promoted through a relational approach to interacting with students,35 promotion of social networks,36 and by helping students reframe ambiguous social cues positively.37

• Consider the choice architecture. The specific decision context (such as the timing, framing, who presents the options, the number of actions required) can have a surprisingly large impact on whether students complete an action.38

• Empower and enable. Students are active partners in their university experience. By giving students the confidence, information and authority to make their own decisions about their priorities, we can help them get what they want out of their time at King's, not just what we think they should want.

• Support social connections. Social connections to other students, to faculty, and to supporters in their own networks are crucial for students. They enable them to receive the greatest benefit from their time at university and realise those benefits in the labour market. We should be aiming to help students increase their social capital through participation in university and civic life.39

What Works supports staff in applying these four elements whenever designing an intervention for students – whether large or small, a special project or in the day-to-day. Every communication between the university and a student is an opportunity to make their experience as frictionless as possible and reinforce that they are a valued member of the student community. Such communications are also a useful way of learning more about students. By testing which message has the best reaction, we learn what students care about. To do this we often use RCTs, which are explained in Figure 2 on page 14.


34 Thomas, L. (2012) Building student engagement and belonging in Higher Education at a time of change: final report from the What Works? Student Retention & Success programme. Paul Hamlyn Foundation.
35 Hurtado, S. and Carter, D. F. (1997) Effects of college transition and perceptions of the campus racial climate on Latino college students' sense of belonging. Sociology of Education, 70, 324-345.
36 Thomas, L. (2012) Building student engagement and belonging in Higher Education at a time of change: final report from the What Works? Student Retention & Success programme. Paul Hamlyn Foundation.
37 Walton, G. M. and Cohen, G. L. (2007) A question of belonging: race, social fit, and achievement. Journal of Personality and Social Psychology, 92(1), 82.
38 Behavioural Insights Team (2015) EAST: Four Simple Ways to Apply Behavioural Insights. Available at: http://www.behaviouralinsights.co.uk/wp-content/uploads/2015/07/BIT-Publication-EAST_FA_WEB.pdf
39 Ingram, N., Burke, C., Abrahams, J. and Thatcher, J. (2018) Bourdieu: The Next Generation. London: Routledge.

Figure 2 Randomised Controlled Trials: The Basics

When evaluating whether a programme or an intervention has been effective, you can be most sure if you know the counterfactual, ie what would have happened if the programme or intervention hadn't been run. The best way to do this is to randomise all the possible participants, give the treatment to half the group and then compare the results. This is called a randomised controlled trial (RCT).

Non-experimental: measure outcomes for each participant pre- and post-intervention.
Quasi-experimental: measure outcomes for participants and non-participants, 'controlling' for bias.
Experimental (RCT): randomise participation for treatment and control groups and measure outcomes for both.
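As a minimal sketch of the experimental approach in practice (hypothetical participant IDs and simulated outcomes, not code from any King's trial), the randomisation and comparison steps might look like this:

```python
import random
import statistics

# Hypothetical list of eligible participants (illustrative IDs only)
participants = [f"student_{i:04d}" for i in range(1000)]

# Randomise: shuffle the list and give the treatment to half the group
random.seed(42)  # fixed seed so the allocation is reproducible
random.shuffle(participants)
half = len(participants) // 2
assignment = {p: ("treatment" if i < half else "control")
              for i, p in enumerate(participants)}

# In a real trial the outcome would come from administrative data
# (eg attended an event: 1/0). Here it is simulated so the sketch runs.
outcome = {p: int(random.random() < (0.55 if assignment[p] == "treatment" else 0.45))
           for p in participants}

# Because allocation was random, the control group estimates the counterfactual,
# so the difference in mean outcomes estimates the effect of the treatment.
def group_mean(arm):
    return statistics.mean(outcome[p] for p in participants if assignment[p] == arm)

effect = group_mean("treatment") - group_mean("control")
print(f"Estimated treatment effect: {effect:+.3f}")
```

In a real evaluation the difference would be reported with a confidence interval or p-value rather than as a raw point estimate.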

Example

SHIFT

SHIFT is a project to improve teacher motivation. Teachers currently receive very little feedback about the post-school progress of their former students. SHIFT was a collaborative trial between a number of universities to test if positive student feedback could increase teacher motivation – we tested their willingness to visit a website with university application tips for teachers. This trial is ongoing.

The trial flow: postcards from students to their former teachers were collected; postcards were randomised at school level; and website visits were compared. In the treatment arm, teachers were sent students' postcards with a link to a website with HE application advice; in the control arm, teachers were sent plain postcards with a link to the same application advice website.
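Because SHIFT randomises postcards at school level rather than teacher level, whole schools are the unit of randomisation. A hedged sketch of that allocation step (invented school names, not the actual trial code) might be:

```python
import random

# Hypothetical participating schools (illustrative names only)
schools = ["School A", "School B", "School C", "School D", "School E", "School F"]

# Shuffle and split so that whole schools, not individual teachers, are assigned
# to the treatment (student postcards) or control (plain postcards) arm.
random.seed(2019)
random.shuffle(schools)
half = len(schools) // 2
allocation = {school: ("treatment" if i < half else "control")
              for i, school in enumerate(schools)}

# Every teacher inherits the arm of their school; the outcome (website visits)
# is then compared between arms, respecting the clustering by school.
print(allocation)
```

Analysing the outcome then needs to account for the clustering, for example by comparing school-level visit rates or clustering standard errors by school.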

Access to higher education

One major component of What Works' activities is supporting widening participation at King's to help students access university and have a good start. This will become increasingly important to the department's work following our 2020 move into the Widening Participation department (KCLWP) at King's.

What does the Widening Participation department do?

The Widening Participation department, within the Social Mobility & Student Success division, aims to open up higher education to underrepresented groups in order to ensure that student bodies at universities reflect the diversity represented in wider society. Such groups include young people from low income backgrounds or low university-participation neighbourhoods, young people from an ethnic minority group, young people who have left care or who are estranged from their families, disabled students and mature students, among others.

The What Works department works with the teams in KCLWP to help them to make sure all their initiatives are evidence-based, informed by cutting edge behavioural insights and data analytics, and robustly evaluated. We have conducted a number of projects to help build the evidence base which supports them with their outreach activities.

Researching behaviours, experiences and perceptions

White working-class girls

One major project What Works is conducting around access is research into the experiences of white working-class girls. This project was sparked by observing that a high proportion of students from this group were studying just one subject at King's – nursing. We conducted a literature review, which uncovered a shortage of evidence specifically for this group, especially in comparison to their male peers. This is despite findings that white British pupils of both sexes are among the poorest performing ethnic groups at GCSE and have been for some time.40

We set out to understand the white working-class female journey into King's using focus groups. The aim of this work was to help us design interviews with this group within schools and further education colleges. In spring 2019, we conducted three focus groups and three interviews with a total of 19 King's students who identified as white, working class and female. This included three estranged41 students and three mature students. The focus groups yielded insights around three aspects of the university application process.

• Belonging and self-efficacy. Even though they'd been accepted to King's, the students we spoke to were aware they weren't typical university students and so lacked a feeling of fitting in, expressing surprise when they got in, rather than validation. They were also often breaking social norms to come to King's, which provided its own pressure. It was also marked how low the confidence levels of many of them were, and how different they felt. One student told us she thought King's had let her in by mistake. Many also spoke about 'almost' not coming to King's, either because they'd decided they wouldn't get in before they'd got their offer or because they thought they wouldn't be good enough.

• Networks and support. The students we spoke to told us that they had limited access to resources and needed help making the best choice for them. Though a key place for students to get information about higher education is at school, the students we spoke to described the support they got from their school as sporadic and focused mainly on Oxford and Cambridge. They also told us they found UCAS confusing and struggled to know how to sell themselves in personal statements. Due to a reported lack of support from schools, and often limited information available from family, the students relied on online information (eg league tables, UCAS), but many told us they lacked time to research thoroughly. In conversation, league tables came up most commonly as a source of information.

40 The Sutton Trust (2016) Class differences: Ethnicity and disadvantage. Available at: https://www.suttontrust.com/wp-content/uploads/2016/11/Class-differences-report_References-available-online.pdf
41 'Estranged students' are defined as 'young people studying without the support and approval of a family network'. They are recognised as an underrepresented group.

As with our other projects in this area, the networks that students had were important factors in their success. Families were described as both barriers and major enablers, motivators and de-motivators. Although this group may have struggled with lower HE-relevant social capital, helpers such as school staff (often unlikely staff, such as the librarian) or friends were found to be a key resource for some.

• Delaying access. Students faced financial, informational and home challenges in thinking about university that other students probably hadn't. Consequently, many of the students we spoke to had taken time out to save money, retrained through access courses, or sought help for mental health problems before coming to King's. Many had also changed degree, one student more than once. This may be a result of not having the correct information to make an informed decision at an earlier point when it could have been more useful.

Next steps

These focus groups provided us with useful insight into the experiences, barriers and needs of white working-class female students at King's. In particular, they highlighted that this group may not be confident of their place in university and also may benefit from help finding people like them. This research confirmed to us that we are on the right track in focusing on social capital and feeling of belonging.

However, we need to be mindful of survivorship bias: the challenges they identify are challenges they have been able to overcome. It is important to understand the challenges that prospective students are currently unable to overcome, or we risk focusing our effort in the wrong place. In the next phase, we will be doing further exploratory research with white working-class female students in schools and further education colleges to understand what their views on university are, and what challenges they face. Interviews for this portion of the project will be running in 2020.

Interventions for student participation

Through investigating what can affect members of underrepresented groups applying to attend university, we have established that a number of factors play a prominent role. These include GCSE attainment, geography, cultural values, and influence of family. One prominent factor in both the literature and our discussions with students, for both their choice to access higher education and their success once enrolled, was social capital.

Some potential university students, especially those from disadvantaged backgrounds, may not have the required social capital through their networks to access the types of knowledge that will help them make the right choice for them when deciding whether to go to university. (Although it is important to acknowledge that they may not lack social capital in other areas.42)

This section sets out three projects aiming to positively influence prospective students' networks. In our Parental Aspirations for their Children (PACT) project, we attempted to boost students' social capital by boosting the knowledge of the networks students already had in place. This trial focused on the early conversations during school years where students are deciding if they want to go, and if so – where and to study what. Our Residence Offers trials encouraged students interested in staying in halls of residence to accept their offer so that they could take advantage of the opportunities to develop their social networks. Lastly, our Networky trial aimed to boost Welcome Week attendance by introducing students to one another so that they could attend activities together.

Parental Aspirations for their Children (PACT)

The purpose of our work is not to push students who don't want to go to university to apply, but rather to help students from underrepresented backgrounds who do want to go to university to overcome the barriers they might experience in realising their aim. Parents are an essential ally in our work. The KCLWP team has a core objective to help parents be strong allies in getting the best for their children and has a three-pronged model to address this. PACT is one of these. The other two approaches are empowering and equipping parents to campaign to make changes in educational inequality (Parent Power) and amplifying the parent voice on various campaigns (such as child citizenship fees).

With PACT, KCLWP wanted to see if a low-cost 'nudge' type intervention could be used to increase parental engagement and drive higher application rates to university. During the initial phase a group of behavioural science students from Harvard Business and Kennedy Schools were asked to design a potential intervention. What Works then commissioned the Behavioural Insights Team to run the intervention they designed, PACT (Parental Aspirations for their Children), in 2019.

In pitching this project to King's, the Harvard students argued that, given the powerful effect parents can have on shaping the aspirations and attainment of their children,43 conversations between parents and children about university were a key area to focus on. The students identified a commitment device, in this case a written pledge, as a possible way of encouraging conversations about university between parents and children, after they had received a 'report card' about the students' university readiness (see below). Commitment devices work by having people commit to taking future actions. What Works commissioned the Behavioural Insights Team (BIT) to test this intervention. The trial focused on parents of students in Years 7-9 with Key Stage 2 UCAS points of 100 or above, in eight schools, and involved parents of approximately 3.5k students.

42 Todman, V. (2018) The Importance of Social Capital. Available at: https://blogs.kcl.ac.uk/behaviouralinsights/2018/12/10/the-importance-of-social-capital/
43 Goodman, A. and Gregg, P. (2010) Poorer Children's Educational Attainment: how important are attitudes and behaviour? Joseph Rowntree Foundation. Available at: https://www.jrf.org.uk/report/poorer-children%E2%80%99s-educational-attainment-how-important-are-attitudes-and-behaviour

The trial, which was a two-arm RCT, tested two interventions:

1 'Report card' cover letters. Parents were sent letters from their child's school which confirmed that they were 'on track' for university, along with a list of subjects they were performing well in and examples of courses the child could pursue at a selection of universities. The report cover letter aimed to further increase parental aspirations for their children.

2 A 'University pledge'. Shortly after report cards had been sent out, schools sent home a pledge which asked parents or guardians to commit to speaking to their child about university in the near future. The pledge was accompanied by a conversation guide to help parents structure the conversation, as well as a brief list of resources about how to learn more about university options.

Figure 3: Student and parent reported aspirations scores
[Bar chart comparing treated and untreated groups on student reported aspiration (4-point Likert scale, 1,874) and parent reported aspiration (4-point Likert scale, 62).]

Figure 4 BIT: PACT 'report card'

Dear Parent/Guardian

Your child is on track to attend university.

She shows high potential in Biology, Geography, and History. If she continues to do well in GCSE and A-level Biology, she may have the option of taking the following courses at university:

• Biological Sciences, University of
• Marine Biology with Study Abroad, University of Southampton
• and Environmental Biology, Imperial College London

This could lead to the following careers:

• Marine Biologist
• Nature Conservation Worker
• Vet
• Doctor
• Pharmacologist

To continue the conversation about university, ask your child's teacher.

The results of PACT

To evaluate the effect, the Behavioural Insights Team created an index of aspiration and goals for both parents and for students; each index is composed of five sub-questions. The questions were modified from a subscale of the Student Engagement Instrument (SEI), featured on the Education Endowment Foundation Spectrum database.44 All questions were measured on a 1-4 scale, where 1 represented 'Strongly Disagree' and 4 represented 'Strongly Agree'. Outcome scores were generated by taking the average from each set of questions.

We did not find a significant overall effect from these communications on either parents' or students' reported aspirations towards university (see Figure 3). Treated parents reported having significantly more conversations with their children about higher education – however the children did not report a significant difference. This may be explained partly by different factors: different survey timings and sample sizes, as well as children potentially not registering the conversation taking place, or parents being aware that this was perceived to be important and wanting to be seen to conform to the project.

In order to understand the results of this trial further, BIT also ran nine interviews with parents from the treatment group. This was not a representative group, and so the findings from these interviews cannot be treated as evidence, but they did raise some interesting possible insights and suggestions for how we take these findings forward. They told us that they found the report card to be too computer generated and suggested the intervention should include more explanation. This, if tested, could support the theory that students respond better to 'hot' in-person information than 'cold' factual information provision.45
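To make the outcome construction concrete, the sketch below shows how an index of this kind (the average of five Likert items per respondent) could be computed and compared between arms. The respondent IDs and answers are invented for illustration and are not the PACT data.

```python
from statistics import mean

# Hypothetical survey responses: five sub-questions per respondent,
# each on a 1-4 scale (1 = Strongly Disagree, 4 = Strongly Agree).
responses = {
    "parent_001": {"arm": "treated",   "items": [4, 3, 4, 4, 3]},
    "parent_002": {"arm": "untreated", "items": [3, 3, 4, 3, 3]},
    "parent_003": {"arm": "treated",   "items": [2, 3, 3, 4, 3]},
    "parent_004": {"arm": "untreated", "items": [4, 4, 3, 3, 4]},
}

# The aspiration index for each respondent is the average of their five items.
def index_score(record):
    return mean(record["items"])

# Compare average index scores between the treated and untreated groups.
for arm in ("treated", "untreated"):
    scores = [index_score(r) for r in responses.values() if r["arm"] == arm]
    print(arm, round(mean(scores), 2))
```

In the real analysis this comparison would be run on the full survey samples and tested for statistical significance, which is how the null result reported above was established.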

44 Education Endowment Foundation (2020) Student Engagement Instrument. Available at: https://educationendowmentfoundation.org.uk/projects-and-evaluation/evaluating-projects/measuring-essential-skills/spectrum-database/student-engagement-instrument/
45 Ball, S. and Vincent, C. (1998) 'I Heard It on the Grapevine': 'Hot' Knowledge and School Choice. British Journal of Sociology of Education, 19(3), 377-400.

Figure 5 BIT: PACT pledge letter

Dear Parent/Guardian

We’re aiming high – sign this pledge to join us.

At XSCHOOL, we want all of our students to thrive in school and beyond. Research shows that having high aspirations for your children makes them more likely to succeed.

In many ways, GCSEs give students a fresh start, as they can choose their courses for the first time and really begin to shape their future.

That’s why, to support your child, I’ve signed the University Pledge. As part of the pledge, I – on behalf of the rest of the teachers at the school – will talk to students about their choices and how they impact not just the next few years but the rest of their lives.

I’m asking you to join me in taking the same pledge in the box below. You don’t need to return it to the school – instead, you can keep it as a reminder of your commitment.

Yours,
X Headteacher

University Pledge (to keep)

School will:

• Do everything we can to help you monitor your child's progress.
• Help you understand the academic achievements required for your child's desired path.
• Keep you connected to your child's teachers.

And I:

• Will speak with my child about university on / /2019 and encourage their aspirations. (Suggested topics overleaf. Choose a time that works for you in the next two weeks.)
• Will keep in touch with the school and attend parent/teacher events whenever possible.
• Will help my child take advantage of the opportunities the school offers.
• Will continue to have conversations about university with my child in the future.

Signed: Headteacher
Signed:

This pledge is yours to keep. If you’d like, you can put it on your fridge as a reminder to speak about university with your child.

Null results are still useful: these findings suggest further trials should be along these more relational lines. Through running this research, we also uncovered interesting socio-demographic differences in attitudes towards university. Based on our findings, in an extreme case, a white student with a Key Stage 2 score of 100 (exactly average) can be expected to have the same level of aspiration as a mixed-ethnicity student with a KS2 score of only 84 (considerably below average). We are taking this forward in our research to understand white working-class female aspirations and expectations in 2020.

Residence offers trial

We also want students from underrepresented groups to have the best possible start at university. Living at home while studying is an understandable draw for students from low income households. Living in halls of residence has been found to have only a small impact on grades, but a large impact on student engagement.46 It has also been found to influence feelings of adaptation to university life, and persistence in continuing their studies, in comparison to commuting students.47

46 Parameswaran, A. and Bowers, J. (2014) Student residences: from housing to education. Journal of Further and Higher Education, 38(1).

persistence in continuing their studies, in comparison to commuting students.47

During Spring 2018 we ran a trial to increase the proportion of King's students accepting an offer to live in King's residences. Prospective students are offered a place in halls via email a few months before they are due to start. The email offers are sent out in three waves. Once received, prospective students have five days to accept the room. The intervention consisted of a treatment email, which half of the first two waves received, the other half receiving the normal email. Half of each group also received a treatment text message. In this intervention we attempted to close an intention–behaviour gap,48 encouraging students who wanted to live in halls to perform the action of accepting within the five-day window.

When looked at holistically, those who received the treatment email were significantly more likely to be living in residence than those in the control. Altogether, we saw take up increase from 45 per cent with our control email to 48 per cent among those who received our treatment email. Half of each group also received a reminder text message. When we isolated the effect of the text message and the email, we found that both together pushed take up to over 50 per cent in our second wave (see Figure 6). Students were contacted in two waves; the second wave were later to apply for their place in halls. Not everyone in our population had a mobile number. When those who did not are excluded, there were no significant impacts of any of the interventions for Wave 1, although directionally they were all lower than the control. However, in Wave 2 the effects of all treatments are significant and positive, with the combination of SMS and treatment email showing the strongest effect. We hypothesise from this that an intervention to encourage action may be most effective with those who have already displayed signs of delaying engagement with residence offer emails.

On page 20 (Figure 7) is an example of how we altered the original residence offer email. This project was a first step in a larger ongoing project to test out ways to improve student engagement. We regularly review communications for departments and faculties in this way to help make sure they are getting the most out of their messages, at the same time building our knowledge with each communication.

Our project to influence take up of offers of a place in halls of residence demonstrates the value of looking for opportunities to embed behavioural experimentation into more of the work we do. With a little creative thinking, we turned a business as usual email into experimental research and, in doing so, added to our knowledge about what works in student engagement.

Networky

Following on from our successful text message trials to improve Welcome Week attendance,49 in 2018 What Works ran a three-arm randomised trial exploring the impact of early friendships.50 The aim was to increase enrolment into King's and Welcome Week attendance by virtue of students already knowing someone there.
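The wave-by-wave comparison described above lends itself to a simple regression check. The sketch below is illustrative only: it assumes a hypothetical dataset with indicator columns for the treatment email, the reminder SMS and the wave, which is not the team's actual data.

```python
# Minimal sketch only: estimating the effect of the treatment email and the SMS
# reminder on residence take-up, wave by wave. File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

offers = pd.read_csv("residence_offers.csv")
# Expected columns: accepted (0/1), treat_email (0/1), treat_sms (0/1), wave (1 or 2)

for wave in (1, 2):
    subset = offers[offers["wave"] == wave]
    # The interaction term captures the combined effect of email plus SMS.
    model = smf.logit("accepted ~ treat_email * treat_sms", data=subset).fit(disp=False)
    print(f"Wave {wave}")
    print(model.summary().tables[1])
```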

Figure 6: Comparison of percentage living in residence among students offered a place in Waves 1 or 2

[Bar chart comparing the percentage of students living in residence across four groups – Control, Control + SMS, Treatment and Treatment + SMS – for Wave 1 and Wave 2. Guide to significance levels: * significant at p < .05, ** significant at p < .01, *** significant at p < .001.]

47 Holdsworth, C. (2006) 'Don't you think you're missing out, living at home?' Student experiences and residential transitions. The Sociological Review, 54(3), 495–519.
48 Sheeran, P. (2002) Intention–behaviour relations: A conceptual and empirical review. In M. Hewstone and W. Stroebe (Eds.), European Review of Social Psychology, Vol. 12, 1–36.
49 Canning, A.M., Hume, S., Makinson, L., Koponen, M., Hall, K. and Delargy, C. (2017) KCLxBIT Project Report 2015–2017. Available at: https://www.kcl.ac.uk/study/assets/PDF/widening-participation/What-works-project-report.pdf
50 Sacerdote, B. (2006) How do friendships form? The Quarterly Journal of Economics, 121(1), 79–119.

Figure 7 How to encourage student engagement

Lessons from our Residence Trial

Behavioural principles applied:

• Personalisation: we respond better to things directed to us
• Salience: make clear why they should care
• Belonging: feeling part of something
• Endowment effect: we are attached to the things we have
• Planning: we're more likely to do the things we make an active plan to do

Dear #Name

We have reserved a room for you in King's Residences and look forward to welcoming you in September.

Living in halls is a great way to get a head start on settling in and making new friends at King's. You can view the details of your offer here.

This offer expires XXX on XXXX (BST), at which point we will have to offer your place to another student.

Therefore, if you can't confirm now, set yourself a reminder so you don't forget.

You will need to use your Student ID number as your username (found on your academic offer letter), and your date of birth (in the format DDMMYY) as your password. If you have any trouble logging in, please email us on [email protected].

Please note we are unable to offer residence/room swaps at this stage.

Kind regards, Reservations team

In this trial, incoming students who were assigned to treatment groups had the opportunity to chat on a Behavioural Insights Team social media site, Networky, to three to four other new students studying the same subject. They received four texts with conversational prompts in the run-up to arrival at King's. Around 2,000 students were enrolled in the first treatment group; a further 405 were also invited to participate in a scavenger hunt around the King's campus. Overall, enrolment was high for both groups with no statistically significant difference, and Welcome Week attendance was low for both groups, again with no statistically significant difference. Attendance at the scavenger hunt was also low.

How students start at university sets the tone for the rest of their experience.51 Ideally, if we can help students to feel a connection to the university and prepared to make friends, then we can empower them for the challenges ahead. Although using the ideas of belonging and social capital has proved to work in encouraging attendance of underrepresented students at events,52 it may be harder to encourage students to actively make friends, as we found in our Networky project. Encouraging students who are unsure about university to explore their options is also a harder issue to nudge, and we need to do more work in this area.

51 Fyvie-Gauld, M., Winn, S. and Wilcox, P. (2006) 'It was nothing to do with the university, it was just the people': the role of social support in the first-year experience of higher education. Studies in Higher Education, 30(6), 707–722.
52 Canning, A.M., Hume, S., Makinson, L., Koponen, M., Hall, K. and Delargy, C. (2017) KCLxBIT Project.

Understanding and improving the student experience

The second component of What Works' two-year workplan has been a focus on the experience of students once they have arrived at university. Our remit is to focus on students from disadvantaged backgrounds or with characteristics which are underrepresented in higher education, though our lessons learned and insights may be applicable to all students.

Data Analysis

Settling into King's

The Settling into King's questions are a set of wellbeing and belonging questions that have been added into the Enrolment Task (an online task all students are required to complete when re-enrolling to university every autumn). This means that students respond to them at some point during the enrolment window (August to September), starting from the 2018–19 academic year. These questions were developed to give King's a picture of the wellbeing and belongingness of students and how this varies across different groups. Crucially, it provides a tool for the evaluation of initiatives aimed at supporting students settling in to the university.

All students are asked the first three questions (the 'Big 3') while returning students are asked all six (the 'Big 6'). The questions are asked of all undergraduate, PGT and PGR students. Students are asked to indicate their level of agreement with the statements on a scale from Strongly Disagree (1) to Strongly Agree (5). They are also informed that the questions will be used to improve services offered to them, and that the questions are optional. Table 1 shows the questions and the response rates for September 2018.

The questions are drawn from a range of academic sources, including Bean's Student Attrition Model53 and work done at the University of Toronto.54 Our analysis suggests that responses to these questions are inter-correlated, suggesting that it is sensible to treat them as assessing aspects of the same construct.

Figure 8 (page 22) gives the average agreement out of five with each of the questions, for all students who responded. Average agreement is relatively high for all questions, but as students return their overall sense of belonging reduces slightly. In returning students, responses to the questions which measure feeling of fit, feeling supported and participation in student life are lower than their confidence about their university decision, studies and the year ahead. This suggests these are the areas to focus on if designing an intervention to improve reported sense of belonging.

Table 1 Questions and response rates for undergraduate, PGT and PGR students, September 2018

Question | Asked | Number responding | Declined to answer | Inter-item correlation (Big 6)
I made the right decision in choosing to study at King's | New and returning | 16,420 | 2,152 | 0.92
I feel optimistic about the year ahead | New and returning | 16,561 | 2,011 | 0.93
I feel confident that I can cope with my studies | New and returning | 16,552 | 2,020 | 0.91
I fit in at King's | Returning only | 10,788 | 1,702 | 0.92
I feel supported by King's | Returning only | 10,791 | 1,699 | 0.85
I participate in student life beyond my academic commitments | Returning only | 10,636 | 1,839 | 0.79

53 Bean, J. P. (1980) Dropouts and turnover: The synthesis and test of a causal model of student attrition. Research in Higher Education, 12, 155–187.
54 Beattie, G., Laliberté, J. and Oreopoulos, P. (2017) Thrivers and Divers: Using Non-Academic Measures to Predict College Success and Failure. NBER Working Paper No. 22629.
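As an aside for readers interested in the inter-item checks mentioned above, the sketch below shows one common way of examining whether a set of Likert items behaves as a single construct, via the correlation matrix and Cronbach's alpha. The file and column names are hypothetical, not the actual Settling into King's dataset.

```python
# Minimal sketch only: inter-item correlations and Cronbach's alpha for the six
# Settling into King's items. File and column names are hypothetical.
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Standard internal-consistency estimate for a set of Likert items."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

cols = ["right_decision", "optimistic", "cope", "fit_in", "supported", "participate"]
big6 = pd.read_csv("sitk_2018.csv")[cols].dropna()

print(big6.corr().round(2))                      # inter-item correlation matrix
print(f"Cronbach's alpha: {cronbach_alpha(big6):.2f}")
```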

Figure 8: Average response to Settling into King’s questions 2018

[Bar chart: average response to each question, from 1 (Strongly Disagree) to 5 (Strongly Agree), shown separately for new and returning students. Questions: I made the right decision in choosing to study at King's; I feel optimistic about the year ahead; I feel confident that I can cope with my studies; I fit in at King's; I feel supported by King's; I participate in student life beyond my academic commitments.]

The responses to these questions are available to King's staff to understand student experience and evaluate services. With each year the data we receive from these questions will be stronger. In 2018 we saw significantly lower agreement with the SitK questions among female students, BAME students, first generation students and 'WP students' (here defined using Acorn Quintiles 1 & 2). When broken down by year of study (Figure 9) we also see that sense of belonging is highest for first years and lowest for third years. As our current first years re-enrol into second year and on to third year we will monitor whether this trend continues for the same cohort.

This analysis provides us with rich data with which to identify areas to target and to assess the performance of our projects. As a result of our detailed analysis of these questions we are able to target our limited resources on those most in need of our help. As a consequence of this analysis, in 2020 we are running focus groups with students who identify with the ethnicities associated with the lowest levels of reported belonging. We will construct an intervention based on the findings of this research and then test the intervention with students. The benefit of these questions is that not only will we be able to see if our intervention improves overall belonging, but also whether it impacts on the separate factors.

First-year experience project

This project originated in a collaboration between What Works and the Academic Support departments to produce a summary report of all the information we have about the experience of first years. Sources of information for this report included admission data, academic results, reports on the clearing process, Welcome survey data, a report on changes of circumstance requests, previous What Works research, and evidence from other universities. Figure 10 shows some of our major findings.

Figure 9: Average Big 3 score for each year cohort by characteristic, 2019 (note this is cross-sectional, not time series)

[Bar chart of average Big 3 score for 1st, 2nd and 3rd year students, broken down by ethnicity (White, BAME), gender (male, female), first generation status (no, yes) and WP status (non-WP, WP).]
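A breakdown of the kind shown in Figure 9 can be produced with a simple group-by over the enrolment responses. The sketch below is illustrative only; the file and column names are hypothetical.

```python
# Minimal sketch only: average Big 3 score by year of study and characteristic,
# the kind of breakdown shown in Figure 9. File and column names are hypothetical.
import pandas as pd

sitk = pd.read_csv("sitk_responses.csv")
sitk["big3_score"] = sitk[["right_decision", "optimistic", "cope"]].mean(axis=1)

for characteristic in ["ethnicity", "gender", "first_generation", "wp_status"]:
    breakdown = sitk.groupby(["year_of_study", characteristic])["big3_score"].mean()
    print(breakdown.round(2))
```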

Figure 10 First year students: what we know

Student behaviours and expectations

• For all faculties, the majority of withdrawals occur in first year.
• Less than 50% expect that they will work with their peers on projects or assignments.
• Students feel there are too many apps/platforms for information.
• 23% of students felt they didn't have enough of the information needed prior to starting their course.
• 66% don't use reading week for university work.
• First years underestimate the amount of time required to prepare for class and participate in extra-curricular activities.

Social

• Students overwhelmingly identify with their course as opposed to King's societies, their campus, department or faculty.
• The majority of the students interviewed emphasised the social aspect of first year over the academic.
• 23% of students feel that their peers wouldn't listen or help if they had a personal problem.
• 31% felt they had a close friend studying at King's.
• The percentage of students missing friends/family increased from 12% to 20.8% post-Welcome event.
• First year students who live at home struggle to build connections and lack a sense of belonging.

Student support and wellbeing

• For students, word of mouth is the most compelling source of information.
• Students said they would like more information about living in London.
• Facebook is the most frequent source of information on events in and around King's.
• 38% of students were worried about managing their money before, and during, their first year.
• Despite many students struggling and worrying about finances, only 10% sought advice directly through King's.

Through this work we were able to plot institutional data, such as withdrawal rates and change of circumstance requests. We then merged our analysis with reported information, such as activity during reading week and reported sense of belonging, to get a picture of the overall student experience for first years. We provided this information to our faculties' Student Experience Managers to help them target their work. Of particular use was the Pulse Survey which we produced as part of our KCLxBIT project.55 It tracked markers such as perceived stress over the year, and also asked important questions about level of identification with different aspects of the university and the extent to which students felt they had a support network. In this research, over a quarter of students agreed with the statement 'few of the students I know would be willing to listen to me and help if I had a personal problem'. The KCLxBIT team also reported that WP students had a lower overall average rating of peer group interactions in both waves where this was measured (p<0.05). This is something we are now exploring further.

We are complementing this project with semi-structured interviews with care-experienced, forced migrant and estranged first year students. The initial findings show how varied their experiences are, underlining that no student cohort is homogenous and that great care should be taken when generalising students' behaviours and experiences.

Despite this, certain themes were prevalent across the interviews. Forming relationships with others was an important aspect of student life, but many had found it harder than expected to make friends. Participants took studying seriously and were positive about being at university. For some, the academic adjustment to university had been challenging; STEM-based students were more likely to have experienced this. In comparison, other participants felt their university course was currently easier than their previous studies. Many were relaxed about finances, even those not confident of having enough money for the full year. Part-time work alongside studies was common. However, some students who had planned to work were unable to do so due to the demands of their course.

The final report will explore these experiences in greater detail and highlight the different contexts of our students.

55 Canning, A.M., Hume, S., Makinson, L., Koponen, M., Hall, K. and Delargy, C. (2018) KCLxBIT Project Report 2015–2017.

This will help practitioners consider how to adapt services accordingly.

Trials

Your King's Your Success

The Your King's Your Success (YKYS) trial was specifically intended to target attainment gaps at King's, where there is a clear disparity in the rates at which White and BAME (Black, Asian and Minority Ethnic) students are awarded First and 2.1 degrees.

No demographic group of students is homogenous and therefore the reasons for any attainment gap will differ between individuals. Many complex and interrelated factors can drive attainment gaps, including psychological factors. For example, seeing few students or staff from similar backgrounds could lead to students feeling like they don't belong,56 feeling that they risk confirming a negative stereotype about their group,57 or could impact feelings of task-specific confidence.58

For our pilot we identified three faculties with high BAME student numbers where we felt such an intervention could potentially impact attainment. We invited all second-year students from the three faculties to take part. Of a potential 2,500 students, 301 took part in the pilot of this project. The intervention was delivered in three waves to four randomised groups. Each intervention was designed to influence possible psychological factors that may be influencing attainment, such as low academic self-efficacy. The interventions were sent out in waves, focusing on themes of belonging, personal tutors and study skills. Figure 11 shows the intervention design of the pilot.

One group watched videos in which students and staff talked about their experiences and gave advice. The second group was asked to complete written tasks (using techniques such as self-persuasion), and the third group received both. The fourth group was the control group. All groups (treatment and control) were then asked to fill out a final survey at the end. Figure 12 shows the various interventions and the psychological factors they were intended to target, with a summary of the different treatments the three groups received (group three seeing both the video and the task). Students received a £10 voucher after completing wave three.

What didn't work

The What Works approach has always been to be transparent and publish the results of interventions that go well, and of those that do not do so well. Even though it generated encouraging results, the pilot run of YKYS did not run exactly as intended. There were delays in getting ethical approval, leading to a compressed timetable, which in turn led to issues with data cleaning to provide quality assurance. This resulted in us having to remove some cases from our final analysis of impact on belonging and attainment data for the pilot. In the end, our sample was too small for us to robustly say whether we had an impact on our two outcomes for the pilot.

Figure 11 Your King’s Your Success pilot, January to March 2019

[Flow diagram: second-year students in the selected faculties were invited to opt in and completed a baseline survey, then were randomised into four groups. Group 1 received a belonging video, a personal tutor video and a study skills video; Group 2 received a belonging task, a personal tutor task and a study skills task; Group 3 received both the task and the video at each wave; Group 4 was the control. All groups then completed an evaluation survey.]

56 Walton, G. M. and Cohen, G. L. (2007) A question of belonging: Race, social fit, and achievement. Journal of Personality and Social Psychology, 92(1), 82–96.
57 Steele, C. M. and Aronson, J. (1995) Stereotype threat and the intellectual test performance of African Americans. Journal of Personality and Social Psychology, 69(5), 797–811.
58 Margolis, H. and McCabe, P. (2006) Improving Self-Efficacy and Motivation: What to Do, What to Say. Intervention in School and Clinic, 41(4), 218–227.
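For illustration, random assignment into four arms of the kind described above can be done in a few lines. The sketch below is hypothetical: the opt-in file, its columns and the fixed seed are invented for the example, and a real trial would normally follow a pre-registered randomisation protocol.

```python
# Minimal sketch only: assigning opted-in students to four trial arms.
# File name, column names and seed are hypothetical.
import pandas as pd

arms = ["video_only", "task_only", "task_and_video", "control"]
students = pd.read_csv("ykys_opt_ins.csv")          # one row per opted-in student

# Shuffle the opt-in list, then deal students round-robin so arm sizes stay balanced.
shuffled = students.sample(frac=1, random_state=2019).reset_index(drop=True)
shuffled["arm"] = [arms[i % len(arms)] for i in range(len(shuffled))]

print(shuffled["arm"].value_counts())
```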

Figure 12 Your King's Your Success project design

Wave 1
• Video: Students receive a video of students discussing 'What King's means to them'. Designed to promote a feeling of belonging through identification.
• Task: Students are asked to reflect on the interactions they have at university each day, to prime them to think of the quality conversations they have. They are asked to rate their feeling of fit, defend that score, and provide advice for first-year students who feel that they are struggling to fit in at King's.

Wave 2
• Video: Students receive a video showcasing advice and testimonials from staff and students on their experience of personal tutoring. Designed to provide role models and increase confidence in contacting personal tutors and building a stable relationship with them.
• Task: Students are asked to consider commonalities between them and their personal tutor. They are then asked to provide advice to first-year students on how they should get the most out of their personal tutors.

Wave 3
• Video: Students receive a video where students hold up signs displaying, and acting out, study tips. Designed to increase self-efficacy by communicating information about how to study.
• Task: Students are asked to reflect on why they studied their degree and are primed to think of their values. They are then asked to estimate the number of hours they commit to independent study after reading an anchor of 16 hours, and to provide advice on study skills to first-year students.

YKYS results

We learned from our pilot. We ran the project again for the 2019–20 academic cohort with an additional faculty, a longer timescale of interventions, an embedded quality assurance process and incentives for participation (in the form of King's café vouchers) staggered across all three waves. We need to wait for weighted attainment data, and for the 2020 responses to the Settling into King's re-enrolment questionnaire (see page 26), before we can do our impact analysis.

However, Figure 13 shows the feedback we received on our videos (seen by groups 1 and 3, but not the task or control groups) when data from both the 2018–19 pilot and the 2019–20 full roll out are merged. We found on average 83 per cent of students thought our videos would be helpful for first years, which we found reassuring. Though admittedly this is not evidence of any impact, these films were largely created and filmed by students, and the positive reception demonstrates how well received student-led interventions can be. This supports previous research on the impact of role models59 and video interventions.60

Staggering incentives helped us mitigate against a drop in response rate for Wave 2. However, we achieved a slightly lower response for Wave 3 than in our pilot (83 respondents instead of 89), perhaps due to the longer timeframe and the lack of a big end incentive.

Figure 13: 'Do you think this video will be helpful for first year students?' Merged responses for Pilot (2018–19) and Full (2019–20) surveys, Groups 1 and 3. [Stacked bar chart of the percentage answering Definitely yes, Probably yes, Might or might not, Probably not and Definitely not in each wave. Respondents: Wave 1 – Pilot 109, Full 152; Wave 2 – Pilot 57, Full 86; Wave 3 – Pilot 89, Full 83.]

Next steps

We won't know if our intervention worked until we receive the results of our sample's responses to 2020's enrolment questions and attainment data. Once that data is collected, we will de-brief the students on the purpose of our trial, providing the same information we provided

59 Silva, A., Sanders, M. and Chonaire, A. N. (2016) Does the Heart Rule the Head? Economic and Emotional Incentives for University Attendance. London: Behavioural Insights Team.
60 Behavioural Insights Team (2016) The Behavioural Insights Team Update Report 2015–16. Available at: http://www.behaviouralinsights.co.uk/wp-content/uploads/2015/07/BIT_Update-Report-Final-2013-2015.pdf

when they consented, but giving them a final chance to have their data withdrawn. In our pilot, only one student withdrew from the study at this point, so we don't anticipate a large drop in response rate, and we hope to report our findings in full in the autumn.

AWARE

Our research suggests that providing information to students which helps them to understand how they are progressing may help promote student wellbeing, and potentially even attainment. We have therefore designed a self-reflection tool to help first year students more accurately assess their first year as it happens. This self-reflection tool is set up like a survey, with the rationale that most surveys students receive are there to collect information that the institution needs, rather than information that students want to provide, and that a lot of key King's surveys happen at the end of the student lifecycle. These surveys provide useful information for institutions, but don't allow students to act on their reflections. We wanted to shift that.

Students were asked during Welcome Week to sign up to this service, and are then asked a series of wellbeing questions at five points in their first year (namely on arrival, in winter, after winter, in reading week and after exams). Questions include asking students about their expectations, and then later whether those expectations are met, alongside questions taken from validated self-efficacy and wellbeing scales. We also include 'fun' questions, such as what their favourite things to do in London are, to provide a personal and human element to the survey.

The answers to these surveys will feed into a personalised report that students will receive after their summer exams. This will give those who take part an overview of their first-year highlights, how these compared against their expectations, and other interesting key insights. This report serves both as an incentive for students to participate – the popularity of personality quizzes suggests that people like gaining insight about themselves61 – and as a way to help with belongingness. There is no paid incentive for taking part.

We hypothesise that by underlining to students where their experiences match social norms we help to normalise difficulties (a key component of belonging62), and help students feel like they are fitting in. In producing this tool we aim to signal to first year students that King's cares about their wellbeing and wants to help them to understand their limitations, motivations and how they learn.

So far, over 600 students have signed up to the tool and have already submitted some interesting results, for example in Figure 14 below, which indicates that students generally expect tasks to be harder before university than they find them to be once at university, with interactions with faculty being the main exception. The full results of the trial of this service will be available after October 2020.

Next steps

By embedding the 'Settling into King's' questions into the enrolment and re-enrolment surveys that all students are asked to fill out, not only do we have a picture of how feelings of belonging are changing within the university, and can identify areas to target interventions, but we also have a robust outcome measure for our interventions. We are seeing good signs from our Your King's Your Success pilot and are excited to see the results of our AWARE tool.

Figure 14: Students were asked to rate expected and experienced difficulties of various university-related tasks

[Bar chart: average difficulty rating (scale 0–4) before university (expectation) versus during first term (reality) for: learning course material; managing your time; paying university fees and expenses; getting help with academic work; making new friends; interacting with faculty; and passing assessments such as assignments and exams.]

61 NBC News (2014) Why do you love personality quizzes? Experts break it down, 6 Jan. Available at: https://www.nbcnews.com/science/weird-science/why-do-you-love-personality-quizzes-experts-break-it-down-n169896
62 Walton, G. M. and Cohen, G. L. (2007) A question of belonging: Race, social fit, and achievement. Journal of Personality and Social Psychology, 92(1), 82–96.

Embedding robust evaluation practices

A strong evidence base is essential for making good progress in widening participation and student success. Our colleagues working across the student lifecycle are constantly seeking to improve programmes and initiatives, and robust evaluation is key to that. It is also key to meeting our commitments to the Office for Students and to making sure we are investing smartly. Since 2018, the What Works team has made the evaluation of initiatives as important as the initiatives themselves. As a result, our evaluations have not only produced more robust and comparable findings but have also fostered better collaboration and knowledge exchange within the institution, growing the evidence base at King's. It means we are always thinking about the why behind our work: what it's for, what we want to achieve, and what we want the learners we engage with to achieve. It also means we can establish whether our programmes are doing the things we claim. Within our Widening Participation (outreach) and Student Success teams, we are still learning about the most effective interventions to support our respective target groups, but our understanding grows with each evaluation. Below are our top tips for a successful evaluation, and for building theories of change and research protocols for all key projects.

Ask the right questions

All KCLWP and most Student Success initiatives double up as research projects, feeding into our understanding of what works in enabling underrepresented groups to access and succeed at university. To help simplify the often-messy environment in which these interventions work, we advise practitioners to distil the essence of their intervention into a research question – an overarching question that the evaluation will seek to answer.

The research question helps everyone involved in the planning, running, evaluation and review of the initiative to maintain focus on the key questions it was set up to answer in the first place. Devising these research questions for both the impact and process evaluations prior to delivery not only helps cut through the complexity of the programme and focus on appropriate outcomes, but also highlights what data need to be collected to answer those questions.

Impact is important, but processes matter too

When evaluating projects, we recommend a unified approach, where implementation (process evaluation) as well as impact is evaluated. Well-integrated Process and Impact Evaluation ensures that both strands support each other and makes the overarching evaluative findings much more robust. With the help of a logic model (in this case a theory of change, or ToC) we can integrate our intended process and impact evaluation and ensure they are aligned. Reporting each element also potentially allows others to replicate promising interventions. Replicability is an important aspect of contributing to the broader research community and something that What Works promotes and provides training on within the sector.

The evaluation of a programme's inputs, activities and outputs forms the Process and Implementation Evaluation, which then feeds into the outcomes and impacts of a programme – the Impact Evaluation. A Theory of Change tool allows practitioners to lay out how every input and aspect of activity can be measured, what data need to be collected to monitor relevant participants' outcomes, and which methodology will be applied to test the intervention's specific impact. It forces practitioners to ask three crucial questions:

1 Is this the right intervention?

2 Are your aims realistic?

3 Is your intervention testable?

By providing a map that lays out the intended mechanisms behind an intervention, a good Theory of Change also provides the tools to distinguish, in the event of a negative or null result, why this was the case. For instance, this may be a result of theory failure (the intervention does not work as theorised to achieve the intended outcomes), implementation failure (the implementation of the intervention was not as intended), or methodology failure (the evaluation methodology was inadequate or conducted inadequately). And when there is a positive result, it means we can point to why, and establish key follow-up questions.

Use the right data at the right time

Research questions must employ appropriate outcome measures and data collection strategies to demonstrate the success (or not) of initiatives. In practice, this means we need to find the best data for each question to be able to say: 'I'll know [outcome reached] when I see [indicator]'.

Figure 15 Theory of change

[Flow from Start to Change:]

• Situation: First identify what is the problem that you will be trying to resolve.
• Aim: What solution do you propose in your intervention, and what is your main goal?
• Activities: What activities need to be undertaken during the intervention?
• Inputs: Think about what you will need to invest in the intervention, eg time, people, funds.
• Outputs: Try to define the direct outputs of your intervention.
• Outcomes: Define the short-term outcomes of the intervention you want to achieve.
• Impact: What is the long-term goal of your intervention? What will change in the long run?
• Rationale and assumptions: Now revisit all the assumptions you've made and the rationale behind them. Are you sure that A leads to B, and are you sure that it happens in this direction?

Figure 16 Constructing a research question

These are overarching questions that your evaluation will seek to answer. They will determine the scope and approach of your evaluation

Primary research question – causal impact of your evaluation
• Did [scheme] increase [main outcome] among [group]?
• Did contextual admissions improve enrolment rates among students from vulnerable groups?

Secondary research question – focus on specific groups or intermediate outcomes
• Did [scheme] increase [main outcome/secondary outcome] among [group/subgroup]?
• Did contextual admissions improve enrolment rates among estranged students?

Process evaluation question – focus on implementation and efficiency of your set-up
• Was the initiative delivered the way we expected?
• Are we targeting the right students?
• What was the cost-effectiveness of the initiative?

Choose the right method for your context

Once the data from our indicators is collected, there are many different methods that can be used to try and understand whether the initiative is having an impact, and how it's operating in practice. Depending on the specific programme and its context, some research methods are better suited to this question than others. Following the OfS' Standards of Evidence, we conceptualise three levels of impact evaluation: Monitoring, Comparing and Identifying, as shown in the diagram below.

Over time, we have seen a lot of our interventions move upwards on this pyramid and, where feasible, have started evaluation strategies set up at Level 3. The task of eliminating equality gaps in access, success and progression at King's is large. It is a slow and deliberative process. But we are challenging ourselves and our practitioners to start small and gradually build their programmes towards becoming more effective in isolating the actual effect of their initiative on the desired outcome(s).

Observable indicators: these are indicators that we can build into the evaluation and control out, eg demography, observed behaviour, measured attitudes.

What Works encourages the use of reliable and validated (tested63) measurements, for example by using academically validated scales to measure socio-psychological constructs instead of informal questionnaires. For these observable outcomes, a verified scale ensures we are measuring them in a consistent and reliable way. Similarly, when surveying participants, we promote the use of knowledge-based questionnaires to infer whether participants have improved knowledge or skills, rather than relying upon whether the participant thinks they have. If survey measures are not validated, it is unclear to what extent they measure what they should, and whether changes across time are due to a fault in the measurement or can be attributed to real change.

At What Works, we've tried to embed data collection in as many 'business as usual' processes as possible in order to minimise friction and participant burden. A great example of this is our Settling into King's questions (see page 26), which are integrated into the enrolment task students complete each year.

Causal inference matters

Previously, it was common practice across our initiatives to track outcomes and experiences of participants and claim any effect as a positive impact of the intervention (Level 1). Unfortunately, this doesn't isolate how the activity has actually influenced these outcomes and experiences. We are growing the culture in KCLWP and Student Success to aim for more.

Figure 17 Monitoring, comparing and identifying

Level 3 – Identify (RCT/quasi-experimental method): The evaluation is designed to provide evidence of a causal effect of the intervention, either via the allocation mechanism (ie we're running a randomised controlled trial) or because we are able to run a high-quality quasi-experimental approach (eg difference-in-differences, discontinuity design).

Level 2 – Compare (comparison with non-participants, dosage response, qualitative research with participants and a comparison group): We are comparing participants with others who have not participated in the programme (or have not participated with the same intensity) to establish whether those who participate have better outcomes and experiences (although we need to be cautious about making causal inferences as a result).

Level 1 – Monitor (rationale/Theory of Change, secondary research, pre/post, tracking, qualitative research with participants only): We have a coherent strategy and activities are selected to contribute to that strategy. We know why we expect particular activities to work (based on a Theory of Change and secondary research) and we are tracking participants' outcomes and experiences.

63 A validated survey is one that has been administered to a representative sample of the population for which it was created (eg measuring academic self-efficacy). The results should meet certain validity and reliability criteria. These include internal consistency, which means that the responses of one participant to different questions should be correlated with each other: the variance between questions for one person answering the survey should be smaller than the total variance across survey takers. One measure of reliability is that the same person will score very similarly when they take the survey twice within a short time span (eg with a one-week gap). There are good public guides for survey validation available online, such as Tsang, Royse and Terkawi (2017), available at: https://www.ncbi.nlm.nih.gov/pubmed/28616007

As a result, some of our programmes were encouraged to introduce matched comparator groups, comparing the results with a sample of individuals who were as similar to our participant group as possible (ie we matched them according to observable factors), but did not take part. This already elevates the evaluation findings (Level 2) by offering a wider perspective on how many of the observed changes within our participant group are, indeed, specific to our participant group only (or whether we are observing effects that we would have seen anyway without the intervention).

While the use of comparator groups gives a more robust insight into the potential impact(s) of an intervention than merely tracking participants, this kind of evaluation does not offer causal impact, as we can't control for unobservable variables (see box). The only path to inferring causality is therefore to employ an experimental or quasi-experimental methodology that balances both observable and unobservable indicators across groups. The introduction of a randomly assigned control group enables you to compare the effectiveness of new interventions against what would have happened if you had changed nothing.

Unobservable indicators: these are indicators that we can't observe and therefore cannot build into the evaluative model, eg motivation, unobserved behaviour, unmeasured attitudes; anything that influences the outcome that we don't know about or can't measure.

Just because something is an RCT, doesn't mean it's a universal gold standard

Practitioners also need to be pragmatic in deciding their methodology. RCTs are the best way to establish causality, in the correct circumstances. But practitioners need to be pragmatic about what is logistically possible and also about what an RCT can actually show. Interviews and focus groups, surveys and data analysis can also help to broaden the understanding of a project's impact. A well conducted pre/post survey can be just as informative (if not more so) as a large-scale experimental design that is run in the wrong context or with a low sample size.

Nonetheless, RCTs have generated useful insight and allowed us to work collaboratively with King's practitioners, academics and senior managers on issues that matter for young people in everyday life. This cooperation – paired with an emphasis on transparency of research design – has led to the rapid upscaling and identification of the most effective interventions for us at King's, and we would hope that this report might also inspire some readers to trial similar intervention designs in their organisation.

Build evaluations, and their findings, into 'business as usual'

We encourage staff to integrate evaluation into continuous service improvement. Having collected indicators, analysed the data, and reported the findings, they need to reflect on these findings and use them to iterate their initiative and revise their Theory of Change. What Works does not make recommendations regarding the future of services or schemes under evaluation. If an evaluation results in a neutral or negative result, however, What Works is likely to recommend a more in-depth and rigorous evaluation approach for the next phase of the service or scheme.
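As a very rough illustration of the matched comparator approach described above, the sketch below matches each participant to the most similar non-participant on observable factors via a propensity score. It is hypothetical (the file, covariates and outcome column are invented), and the caveat in the text still applies: matching on observables cannot rule out differences in unobservable factors.

```python
# Minimal sketch only: building a matched comparator group on observable factors
# using a propensity score. File and column names are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

df = pd.read_csv("programme_data.csv")
covariates = ["ks2_score", "acorn_quintile", "first_generation"]

# 1. Estimate each student's probability of participating from observables.
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["participated"])
df["pscore"] = ps_model.predict_proba(df[covariates])[:, 1]

# 2. For every participant, find the non-participant with the closest score.
participants = df[df["participated"] == 1]
pool = df[df["participated"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(pool[["pscore"]])
_, idx = nn.kneighbors(participants[["pscore"]])
matched_controls = pool.iloc[idx.ravel()]

# 3. Compare outcomes between participants and their matched comparators.
print(participants["outcome"].mean() - matched_controls["outcome"].mean())
```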

Figure 18 Consider the context of an evaluation

• Research question: an experimental design, for example, can only tell you whether something works, not how or why (mixed methods can help here).

• Sample size: ideally over 1,000 across treatment and control, or around 200 if there is good baseline data on the outcome (see the sketch after this figure).

• Data: collected consistently and universally (administrative datasets are best).

• Fidelity and validity: try to maintain protocol consistency within your intervention; this can sometimes be difficult – and can sometimes reduce external validity.

• Ethics: if there is substantial, consistent, high-quality evidence that something is effective, it shouldn't be withheld from anyone who could benefit.
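To make the sample size guidance above concrete, the sketch below runs a textbook back-of-the-envelope power calculation for a two-arm trial with a binary outcome. The baseline and target rates are invented for the example; they are not taken from any King's trial.

```python
# Minimal sketch only: approximate sample size per arm for detecting a difference
# in proportions at 5% significance and 80% power. Example rates are made up.
from scipy.stats import norm

def n_per_arm(p_control: float, p_treatment: float,
              alpha: float = 0.05, power: float = 0.8) -> int:
    """Normal-approximation sample size for a two-proportion comparison."""
    z_alpha = norm.ppf(1 - alpha / 2)
    z_beta = norm.ppf(power)
    p_bar = (p_control + p_treatment) / 2
    needed = ((z_alpha + z_beta) ** 2 * 2 * p_bar * (1 - p_bar)
              / (p_treatment - p_control) ** 2)
    return int(round(needed))

# e.g. detecting a rise in a binary outcome from 40% to 50%:
print(n_per_arm(0.40, 0.50), "students per arm")
```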

The Centre for Transforming Access and Student Outcomes

The final part of this report outlines our part in an initiative to scale up our evidence-based approach to WP and student success for the whole higher education sector. In collaboration with Nottingham Trent University and The Behavioural Insights Team, and with funding from the Office for Students, What Works was commissioned to set up a 'What Works' centre for higher education. This new centre is called the Centre for Transforming Access and Student Outcomes (or TASO for short) and will be spun out to work independently from King's in 2020, after a year-long establishment phase. TASO is an independent organisation which will collate, generate and disseminate evidence on the most effective approaches to WP and student success.

Our vision for TASO

TASO will be set up as an independent organisation which will collate, generate and disseminate evidence on the most effective approaches to access and student success in order to address inequality of opportunity. TASO aims to encourage a culture of methodological plurality in the evidence it promotes, building on the lesson we learned at What Works that good evaluation is all about choosing the right 'tools for the job'. While we advocate and champion all methods of evaluation, we are committed to advancing the evidence base by developing more causal evidence – that is, more evidence which can tell us definitively whether activities are having the desired effect. TASO has been engaging heavily with HE practitioners and evaluators from the start by forming dedicated advisory groups, theme working groups, and a consultancy network.

Perhaps the main difference between What Works and TASO will be that, while we are situated within a single institution (for the time being), TASO will be operating across many providers. This means there is a high number of people and a breadth of providers which need to be engaged. First and foremost, TASO will need to create resources which appeal to a wide range of audiences – practitioners, evaluators, researchers, administrators and senior managers will all need to be able to extract what they need easily.

The establishment of TASO is an exciting opportunity for all HE providers to benefit from a 'What Works' centre devoted to closing the gaps in access and participation. We hope that we are a good example of how this approach can apply within a single institution and we are looking forward to helping TASO learn from our experiences as it scales up across the sector.

What is the What Works Network?

In 2013, the Cabinet Office founded an initiative which 'aims to improve the way government and other public sector organisations create, share and use (or "generate, translate and adopt") high quality evidence in decision-making.' The Network is made up of nine What Works Centres, three affiliate members, and one associate member. TASO is one affiliate member.

What Works in widening participation

Since our launch in January 2018, we have focused on building our understanding of student behaviour and embedding robust evaluation practices at King's. Our trials have focused on exploring how increasing sense of belonging and students' social capital can improve the student experience for underrepresented groups. Our Residence offers trial, and the tentative results from the pilot of Your King's Your Success, suggest these were valid areas of focus. The high sign-up rate for the unincentivised AWARE self-reflection tool also suggests students like these sorts of interventions.

In December 2019, What Works merged with the King’s widening participation team. This move heralds a new and exciting focus on access. What Works is starting to do more work in schools, as well as continuing to run its student success initiatives and ensuring students from underrepresented groups have the best start when they enter King’s. Our PACT study suggests we need to do more work to understand how to make our messages salient for students and parents when using remote methods. We also have more to do to understand the working-class female journey into university. On the analytics side, we are starting a project looking at the progress of students who enter King’s through contextual admissions and continuing research and evaluation into the impact of financial support on decision-making, access and outcomes.

Our focus on understanding the first year of university, using both data analysis and deep dive interviews, has also provided helpful context for practitioners aiming to increase student retention. Our learning will be applicable to work with students in school or college. Insights into today's first years can also inform activity with future students. We plan to use these findings to run focussed work in the coming years with bursary recipients and students entering King's via King's widening participation programmes.

What next for What Works? Michael Bennett

WHAT WORKS AT KING’S IS ON A SOLID FOOTING, with a mandate to look at Widening Participation activity across the lifecycle, providing evaluation and guidance, testing and trialling new work all while seeking to demonstrate impact. This by definition will take our work in directions that we can’t always anticipate, which for me is very exciting.

You’ve read about the breadth and depth of the work the team has done over the past two years: sometimes successful, sometimes not so much! But always ground-breaking and focussed, and with things to learn as a result. The team has worked hard to get TASO up and running and King’s is now well placed to meet the expectations of the OfS on developing rigorous approaches to evaluation. We have a five-year strategy to guide the way. So we can now consolidate what’s gone before and set direction for what’s to come: making sure that underrepresented students are supported to make smart decisions and have the confidence and knowledge to pursue the degree they are most passionate about, helping the transition into university life and providing sources of support and opportunities to participate that help our students make the most of their studies.

In a sense, at this point the What Works team has come full circle, having first emerged out of KCLWP and now been brought back to work alongside it, to foster a culture of rigorous evidence-based decision making and innovative, creative and unexpected approaches to investigating what works, how and for whom. We owe this to our current and future students, the communities we work with (including our professional community), and the ongoing health of the widening participation mission across the sector. All to make the world a little bit fairer.

Thank you for reading this report. I am very excited for what comes next, and I invite you to get in touch at [email protected] if you would like to discuss things further.

Michael Bennett is Associate Director of Widening Participation at King’s College London and leads the What Works team. @KCLWhatWorks [email protected]

More results and information at: www.blogs.kcl.ac.uk/behaviouralinsights

What Works King’s College London 57 Waterloo Road London SE1 8WA

© King’s College London Approved by Brand May 2020 Designed by Calverts calverts.coop