
National Student Survey

Analysis of national results for 2011

Alex Buckley, August 2012

Contents

Key findings
1. How to use this report
1.1 Statistical significance
2. Overall results
2.1 Overall item scores
2.2 Overall scale scores
2.3 Relationships between aspects of the student experience
2.4 Impact of aspects of the student experience on overall satisfaction
3. Student characteristics
3.1 Differences between age groups
3.2 Differences between genders
3.3 Differences between disability and no known disability
3.4 Differences between subject clusters
3.5 Differences between student domicile
3.6 Differences between part-time and full-time students
3.7 Differences between part-time Open University students and other part-time students
4. Institutional characteristics
4.1 Differences between UK nations
4.2 Differences between mission groups
4.3 Differences between institution types
4.4 Comparison between HE and FE institutions
5. Comparison between NSS and PTES
6. Additional HEA resources
6.1 Discipline reports
6.2 Research
6.3 Case studies of enhancement activities
6.4 Postgraduate surveys
6.5 Consultancy and change programmes
7. Further reading
Appendix A: Brief description of analyses
Appendix B: Information about the NSS
Appendix C: Core NSS items


Key findings

Quality of teaching and learning (Q1-4)

This is the most positive area of the survey. It is also the scale that has the strongest impact on students’ overall satisfaction with their courses. While there are few large differences between the different student groups for these items, students at Pre-1992 institutions are more positive than students at other institutions.

Assessment and feedback (Q5-9)

This is the least positive area, and the three items specifically relating to feedback (Q7-9) are the three least positive in the survey. Despite the negativity expressed by a substantial minority of students, and the emotive nature of assessment and feedback, there appears to be a relatively weak connection between students’ experiences of assessment and feedback and their overall satisfaction with their courses.

There are a number of large differences between student groups for these items. Positivity is lowest among students from the EU, those studying at Pre-1992 institutions and those studying at institutions in Scotland. Conversely, positivity is higher among students in older age groups, those studying Arts and Humanities subjects, and those at further education colleges (FECs).

Academic support (Q10-12)

Within the three items in this group, there is a much higher level of positivity for Q11, relating to the ability to contact staff, than for Q10 and Q12, which relate to advice and support. There are very few large differences between student groups for this area of the survey. Students at Pre-1992 institutions are more positive than other student groups about their ability to contact staff when they need to, while students at FECs are more positive about the advice and support they have received.

Organisation and management (Q13-15)

There is a strong connection between this scale and students’ overall satisfaction with their courses. Disabled students report less positivity for the items in this scale, while students at Pre-1992 institutions are more positive.

Learning resources (Q16-18)

Not only is the connection between this scale and students’ overall satisfaction with their courses very weak, there are also only weak correlations between this scale and the other scales, rendering the learning resources scale a relatively isolated element of the survey. Students at Russell Group institutions, and Pre-1992 institutions generally, are more positive about these items. Students in older age groups, and students at the OU, are notably less positive about Q18, which is related to the ability to access specialised equipment and tends to receive far fewer responses than the other items in the survey.

Personal development (Q19-21)

As with the academic support scale, there are few large differences between student groups for the items in this scale. Students in older age groups are less positive, while students studying subjects in Health Sciences are more positive. Results from the Postgraduate Taught Experience Survey (PTES), which uses similar questions but surveys a different group of students, suggest that undergraduate students are more positive about personal development than taught postgraduate students.


Overall satisfaction (Q22)

The results in this report suggest that the items in the six scales explain around two thirds of the variation in students’ overall satisfaction with their courses. This is reassuring, but it also means that there are important elements of their experience that the NSS items do not capture.

There are some large differences between the levels of overall satisfaction for different student groups. Students at Russell Group and 1994 Group institutions, and Pre-1992 institutions more generally, are more satisfied. Students at institutions in Scotland and students in older age groups also express greater overall satisfaction.

Age groups

There is a general trend for positivity to increase with age. This trend can be seen for most areas, with the personal development scale being a prominent exception.

Part-time students

Very pronounced differences can be seen between the responses of full-time and part-time students. However, further investigation suggests that many of these differences are attributable to the distinctive responses of students at the OU. In general, full-time students and part-time students studying at institutions other than the OU report broadly similar experiences, whilst OU students report experiences with a very different character.

Further education colleges

There are large differences between the reported experiences of students at FECs and students at higher education institutions (HEIs). However, in many cases these large differences between FECs and HEIs as two aggregated groups are less pronounced than differences between types of HEI. For instance, students at HEIs report greater overall satisfaction than those at FECs. However, closer examination shows that FEC students have very similar levels of overall satisfaction to students at both Post-1992 and Small and Specialist institutions, and it is in fact the Pre-1992 student responses that account for the higher result for HEIs.

Postgraduate Taught Experience Survey (PTES)

The comparison between NSS results and the results for selected items from PTES suggests that undergraduate students are broadly more positive about their experiences than students on taught postgraduate courses. However, caution should be exercised due to the differing nature of the respondents, their courses, and the survey response rates.


1. How to use this report

This report presents data from the 2011 administration of the National Student Survey (NSS), broken down by a range of student and institutional characteristics. It is designed to provide high-level information about the perceptions of students in UK higher education, and to assist universities and colleges in targeting, designing and evaluating their efforts to enhance learning and teaching.

When used with an awareness of their limitations, NSS data can play a useful role within institutions in supporting improvements in learning and teaching. By allowing comparisons and benchmarking, the data can highlight issues that would reward further investigation, either as areas of apparent success or challenge. NSS results can be a useful starting point for discussions about learning and teaching, either with colleagues, senior managers, student representatives or students themselves. It is also important to triangulate the data with quantitative and qualitative information from other sources in order to have an accurate sense of students’ experiences.

This report presents a high-level picture of UK undergraduates’ perceptions of their courses, through the lens of the NSS data. It does not provide a detailed picture of students’ learning experiences, nor does it dictate specific areas for intervention. However, it can be used in conjunction with local NSS data to provide a good starting point for further investigation and discussion.

The inherently hierarchical nature of the data (students within departments within institutions, with each of these levels able to be grouped in a number of ways) means that multi-level modelling is necessary to isolate the specific contribution of any variable to NSS responses. This report contains simple comparisons between results for different groups rather than multi-level modelling, and is only designed to provide indicative information about the responses of different student groups. 1
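To illustrate what such an analysis might look like, the following is a minimal sketch of a random-intercept multi-level model, assuming a hypothetical respondent-level dataset with invented column and file names (q22, age_band, institution); it is purely illustrative and is not the analysis carried out in this report.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical respondent-level data: one row per student, with the Q22
# response (1-5), an age band, and an institution identifier.
df = pd.read_csv("nss_respondents.csv")  # invented file name

# Random-intercept model: students nested within institutions, with age band
# as a fixed effect. Purely illustrative of the multi-level approach.
model = smf.mixedlm("q22 ~ C(age_band)", data=df, groups=df["institution"])
result = model.fit()
print(result.summary())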

In this report the aggregated experiences of all students responding to the NSS across all disciplines in 2011 are analysed. It can be used in conjunction with the 28 discipline-specific NSS reports produced by the HEA, which can be found at: http://www.heacademy.ac.uk/nss

This report focuses solely on the results for the 2011 administration of the survey. Analyses of NSS results from previous years are available from the HEFCE website: http://www.hefce.ac.uk/whatwedo/lt/publicinfo/nationalstudentsurvey/

Notes on the tables

• The percentage values included in the tables correspond to the percentage agreement, i.e. the proportion of students who agreed with the relevant statement (survey item) by selecting either ‘definitely agree’ or ‘mostly agree’. This figure is used as the measure of positivity throughout the report. Accordingly, a lower level of positivity reported here may be due to a larger proportion of respondents selecting ‘neither agree nor disagree’, rather than a larger proportion of respondents disagreeing (selecting ‘definitely disagree’ or ‘mostly disagree’). An illustrative calculation is sketched after these notes.

1 For more information about multilevel modelling of NSS data, see Surridge, P. (2008) Interpreting National Student Survey data using multi-level modelling: A non-technical guide. York: HEA. Available from: http://www.heacademy.ac.uk/assets/documents/nss/NSS_interpreting_data_nontechnical_guide.pdf [accessed 31 August 2012], and Marsh, H. and Cheng, J. (2008) National Student Survey of teaching in UK universities: Dimensionality, multilevel structure and differentiation at the level of university and discipline: Preliminary results. York: HEA. Available from: http://www.heacademy.ac.uk/assets/documents/research/surveys/nss/NSS_herb_marsh-28.08.08.pdf [accessed 31 August 2012].

• The tables also include the number of responses to each item for the relevant groups. This figure covers all responses, including those who disagreed. Because response rates differ between items, the range between the lowest and the highest number of responses is shown.

• The threshold for statistical significance used in this report is a significance level of 0.05 or less, meaning that a difference of the observed size would be unlikely to arise by chance if there were no difference in the student population as a whole. Where two scores are compared, a statistically significant difference is indicated by the higher result being in bold text. Where three or more scores are compared, a statistically significant difference is indicated by the significance level being in bold text; this indicates that there is at least one significant difference between two of the scores. (For more about statistical significance as used in this report, see Section 1.1 below and Appendix A.)

• While the percentage scores in the tables solely relate to the ‘definitely agree’ and ‘mostly agree’ responses, the significance levels relate to the full range of options. There may therefore be cases where differences in responses as a whole are statistically significant, even though the difference between the ‘agree’ responses shown in the tables is very small (see Appendix A for more information).

• To aid visual comparison, the data in the tables are also represented graphically. The full item wording can be found in the tables and in Appendix C.
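As a concrete illustration of two of the points above (the percentage agreement measure and the significance testing over the full range of grouped responses), the sketch below uses invented counts and a chi-square test of independence; the actual procedure used for this report is described in Appendix A, and the chi-square test shown here is only one standard way of comparing grouped (agree / neither / disagree) responses.

from scipy.stats import chi2_contingency

# Illustrative only: invented counts of responses grouped into
# agree / neither / disagree for two groups being compared.
counts = {
    "Group A": [90_000, 25_000, 10_000],
    "Group B": [62_000, 20_000, 9_000],
}

# Percentage agreement, as reported in the tables.
for group, (agree, neither, disagree) in counts.items():
    total = agree + neither + disagree
    print(f"{group}: {100 * agree / total:.1f}% agreement")

# Significance relates to the full grouped distribution, not just % agreement.
chi2, p, dof, expected = chi2_contingency(list(counts.values()))
print(f"p = {p:.4f} (significant at the 0.05 threshold: {p <= 0.05})")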

1.1 Statistical significance

As with all uses of quantitative data, caution should be exercised when interpreting small differences between respondent groups. Small differences may be due to random variations in response, demographic characteristics of the respondents, method of response and many other factors. A standard method of evaluating whether patterns in the survey sample are likely to reflect patterns in the wider population is to use tests of statistical significance. Significance levels are included in the tables; for ease of use, significance levels of 0.05 or lower have been highlighted in bold. This is the level at which results are conventionally taken to be significant, and indicates that a pattern of the observed size would be unlikely to have arisen by chance alone if there were no corresponding pattern in the final-year undergraduate population as a whole. However, even where patterns are shown to be statistically significant it does not necessarily follow that they are of practical importance or are substantively different, particularly where differences are small.

It should also be noted that significance testing assumes that the survey has been conducted using a random sample, or a design that approximates this. In fact, the NSS attempts to survey the whole final-year undergraduate population and, while all surveys may experience non-response bias, it can be more difficult to correct for this in a ‘census’-type survey. A review by Paula Surridge for the HEA described tests for non-response bias that found no significant effect, 2 and the overall profile of NSS respondents is broadly representative of the wider student body. However, it is not possible to say whether each subgroup explored in this report (such as part-time students, or the results for HEI mission groups) is similarly representative. For this reason, the significance levels included in this report should only be taken as indications of confidence in the survey results, and we recommend that caution be exercised when interpreting, using or relying on small differences.

2 Surridge, P. (2009) The National Student Survey three years on: What have we learned. York: HEA. Available from: http://www.heacademy.ac.uk/assets/documents/research/surveys/nss/NSS_three_years_on_surridge_02.06.09.pdf [accessed 31 August 2012].


More information about the methods of analysis used in this report is included in Appendix A.

The HEA acknowledges the assistance of the Higher Education Funding Council for England (HEFCE) in providing the NSS dataset used in this report.

The author also thanks Paul Bennett and Gosia Turner for statistical and editorial input.


2. Overall results

2.1 Overall item scores

Table 2.1 shows the percentage of all respondents answering each item who selected ‘definitely agree’ or ‘mostly agree’. The profile of responses can be seen visually in Figure 2.1.

Table 2.1: Responses to the NSS 2011 across all disciplines, institutions and student groups

Item | % agreement
Q1. Staff are good at explaining things | 88.0%
Q2. Staff have made the subject interesting | 80.9%
Q3. Staff are enthusiastic about what they are teaching | 85.3%
Q4. The course is intellectually stimulating | 83.7%
Q5. The criteria used in marking have been made clear in advance | 73.1%
Q6. Assessment arrangements and marking have been fair | 74.4%
Q7. Feedback on my work has been prompt | 62.6%
Q8. I have received detailed comments on my work | 66.9%
Q9. Feedback on my work has helped me clarify things I did not understand | 61.4%
Q10. I have received sufficient advice and support with my studies | 74.9%
Q11. I have been able to contact staff when I needed to | 82.9%
Q12. Good advice was available when I needed to make study choices | 72.1%
Q13. The timetable works effectively as far as my activities are concerned | 78.4%
Q14. Any changes in the course or teaching have been communicated effectively | 73.4%
Q15. The course is well organised and is running smoothly | 72.4%
Q16. The library resources and services are good enough for my needs | 81.0%
Q17. I have been able to access general IT resources when I needed to | 83.4%
Q18. I have been able to access specialised equipment, facilities, or rooms when I needed to | 75.6%
Q19. The course has helped me to present myself with confidence | 79.0%
Q20. My communication skills have improved | 81.9%
Q21. As a result of the course, I feel confident in tackling unfamiliar problems | 79.2%
Q22. Overall, I am satisfied with the quality of the course | 83.1%
Number of responses to each item (range lowest - highest) | 240,822 - 264,321



Figure 2.1: Responses to the NSS 2011 across all disciplines, institutions and student groups

In terms of total NSS responses, the highest level of positivity is for Q1, ‘Staff are good at explaining things’, with 88.0% agreeing with the statement. In fact, the three most positive items (Q1, Q3 and Q4) are all in the first scale, relating to the quality of learning and teaching. The lowest level of positivity is for Q9, ‘Feedback on my work has helped me clarify things I did not understand’, with 61.4% agreement, and the three least positive items (Q7, Q8 and Q9) are all in the assessment and feedback scale and, more specifically, all relate to feedback. The gap between the highest and lowest levels of agreement (Q1 and Q9) is marked, at 26.6%. The level of agreement for Q22, ‘Overall, I am satisfied with the quality of the course’, is 83.1%, making it the fifth most positive item.


2.2 Overall scale scores

Twenty-one items in the NSS are grouped into six scales, each measuring a different aspect of the student experience, while Q22 examines overall satisfaction (see Appendix C). The mean results for the six scales and Q22 have been calculated by taking the total ‘mostly agree’ and ‘definitely agree’ responses for the items within a scale as a proportion of the total responses for those items.
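Restated as a formula, the score for a given scale is:

\[
\text{scale \% agreement} = 100 \times \frac{\sum_{i \in \text{scale}} a_i}{\sum_{i \in \text{scale}} n_i}
\]

where \(a_i\) is the number of ‘definitely agree’ or ‘mostly agree’ responses to item \(i\) and \(n_i\) is the total number of responses to item \(i\).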

Table 2.2: Mean percentage agreement for each NSS scale

Scale | Items in scale | Total scale responses | Scale mean % agreement
Quality of learning and teaching 3 | 1-4 | 1,054,914 | 84.5%
Assessment and feedback | 5-9 | 1,318,729 | 67.7%
Academic support | 10-12 | 786,394 | 76.7%
Organisation and management | 13-15 | 788,996 | 74.8%
Learning resources | 16-18 | 764,227 | 80.1%
Personal development | 19-21 | 788,909 | 80.0%
Overall satisfaction | 22 | 264,321 | 83.1%


Figure 2.2: Mean percentage agreement for each NSS scale

The difference between the mean percentage of students agreeing with items in the most positive scale (quality of learning and teaching) and the least positive scale (assessment and feedback) is 16.8%. The quality of learning and teaching scale is the only scale that is more positive than Q22, ‘Overall, I am satisfied with the quality of the course’.

3 On the NSS questionnaire itself, this scale is titled “The teaching on my course”: http://www.thestudentsurvey.com/content/nss2012_questionnaire_english.pdf [accessed 21 August 2012].

2.3 Relationships between aspects of the student experience

Table 2.3 shows the extent to which the six scales and the overall satisfaction item are correlated with one another. In other words, it gives an indication of the strength of the relationship between different aspects of the student experience. Values nearer one indicate a stronger relationship. Because this analysis shows correlations rather than causal relationships, it is not possible to conclude that improving one aspect of the student experience will automatically lead to improvements in another, even where the relationship appears strong.
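The report does not show how these correlations were computed. One plausible reconstruction, assuming respondent-level scale scores (for example, each respondent’s mean response to the items in a scale on the 1-5 response scale) held in a hypothetical data frame with invented column names, is sketched below.

import pandas as pd

# Hypothetical respondent-level scale scores (invented column names).
df = pd.read_csv("nss_scale_scores.csv")
scales = ["teaching", "assessment_feedback", "academic_support",
          "organisation_management", "learning_resources",
          "personal_development", "q22_overall"]

# Pairwise Pearson correlations between the scales and overall satisfaction.
print(df[scales].corr(method="pearson").round(3))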

Table 2.3: Correlations between scales and item 22

Scale | Q22 overall satisfaction | Quality of learning and teaching | Assessment and feedback | Academic support | Organisation and management | Learning resources | Personal development
Q22. Overall, I am satisfied with the quality of the course | 1 | 0.719 | 0.585 | 0.653 | 0.622 | 0.372 | 0.614
Quality of learning and teaching scale | | 1 | 0.582 | 0.630 | 0.549 | 0.331 | 0.571
Assessment and feedback scale | | | 1 | 0.608 | 0.523 | 0.325 | 0.457
Academic support scale | | | | 1 | 0.566 | 0.381 | 0.533
Organisation and management scale | | | | | 1 | 0.364 | 0.432
Learning resources scale | | | | | | 1 | 0.359
Personal development scale | | | | | | | 1

All correlations are statistically significant (p < 0.001).

The strongest relationship is between overall satisfaction and the quality of learning and teaching. Other strong relationships exist between academic support (e.g. receiving advice on studies) and both overall satisfaction and the quality of learning and teaching. The weakest relationship is between learning resources and assessment and feedback, and in general the learning resources scale appears to be somewhat isolated, bearing relatively weak relationships with all of the other scales.


2.4 Impact of aspects of the student experience on overall satisfaction

The different aspects of the student experience, as measured by the six item scales in the NSS, are likely to affect students’ overall satisfaction with their course, as measured by Q22. To test this, a multiple regression has been performed, examining the extent to which the results for the different item scales explain or predict overall satisfaction. In the table below, the larger the standardised coefficient, the greater the influence of that aspect of the student experience on overall satisfaction.
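A minimal sketch of such a regression is given below, assuming the same kind of hypothetical respondent-level scale scores as in Section 2.3; it is not the exact specification behind Table 2.4, which is described in Appendix A.

import pandas as pd
import statsmodels.api as sm

# Hypothetical respondent-level scale scores (invented column names).
df = pd.read_csv("nss_scale_scores.csv")
predictors = ["teaching", "assessment_feedback", "academic_support",
              "organisation_management", "learning_resources",
              "personal_development"]

# Unstandardised coefficients (B) and adjusted R-squared.
fit = sm.OLS(df["q22_overall"], sm.add_constant(df[predictors])).fit()
print(fit.params)
print(fit.rsquared_adj)

# Standardised coefficients (beta): refit on z-scored variables.
cols = predictors + ["q22_overall"]
z = (df[cols] - df[cols].mean()) / df[cols].std()
beta_fit = sm.OLS(z["q22_overall"], sm.add_constant(z[predictors])).fit()
print(beta_fit.params)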

Table 2.4: Linear regression of scales on question 22

Scale | B (unstandardised) | Std. error | Beta (standardised) | t | Sig.
(Constant) | -.796 | .008 | | -98.697 | .000
Quality of learning and teaching scale | .453 | .002 | .326 | 182.333 | .000
Assessment and feedback scale | .091 | .002 | .082 | 49.379 | .000
Academic support scale | .174 | .002 | .156 | 86.244 | .000
Organisation and management scale | .219 | .002 | .209 | 130.644 | .000
Learning resources scale | .030 | .002 | .027 | 19.556 | .000
Personal development scale | .248 | .002 | .211 | 133.037 | .000

All scales combined explain 66% (adjusted R² = 0.655) of the variability of the overall satisfaction item. This is a strong effect but nevertheless suggests the existence of other factors affecting the overall experience that are not measured by items Q1-21.

By some distance, the scale with the strongest impact on overall satisfaction is the quality of learning and teaching, which is also the most positive scale (see Table 2.2). The scale with the weakest impact on overall satisfaction is learning resources. The scale with the second weakest impact on overall satisfaction is assessment and feedback, which is also the least positive scale. The weak impact of assessment and feedback is notable, given both the emphasis it receives from institutions and its emotive nature for students themselves.


3. Student characteristics

3.1 Differences between age groups

Table 3.1: Percentage agreement for each NSS item by age group

Item | 0-22 | 23-24 | 25-34 | 35-44 | 45-54 | 55-64 | 65+ | Sig.
Q1. Staff are good at explaining things | 88.5% | 87.1% | 86.9% | 87.6% | 87.5% | 87.4% | 89.2% | .000
Q2. Staff have made the subject interesting | 80.1% | 80.3% | 82.3% | 84.6% | 84.9% | 86.5% | 89.2% | .000
Q3. Staff are enthusiastic about what they are teaching | 85.2% | 83.9% | 85.4% | 87.2% | 88.1% | 90.6% | 90.7% | .000
Q4. The course is intellectually stimulating | 82.7% | 82.8% | 85.3% | 88.6% | 88.9% | 91.9% | 93.1% | .000
Q5. The criteria used in marking have been made clear in advance | 71.2% | 71.8% | 76.3% | 81.8% | 82.7% | 84.4% | 83.7% | .000
Q6. Assessment arrangements and marking have been fair | 73.7% | 72.5% | 74.7% | 79.3% | 80.5% | 85.3% | 87.8% | .000
Q7. Feedback on my work has been prompt | 61.1% | 59.7% | 64.4% | 71.0% | 73.2% | 79.3% | 82.5% | .000
Q8. I have received detailed comments on my work | 65.0% | 62.5% | 69.5% | 77.7% | 81.9% | 86.4% | 88.1% | .000
Q9. Feedback on my work has helped me clarify things I did not understand | 59.6% | 58.9% | 63.4% | 71.1% | 74.7% | 77.4% | 79.2% | .000
Q10. I have received sufficient advice and support with my studies | 74.7% | 73.7% | 75.5% | 76.8% | 76.5% | 78.6% | 81.4% | .000
Q11. I have been able to contact staff when I needed to | 83.6% | 82.0% | 81.0% | 81.7% | 81.5% | 85.0% | 86.9% | .000
Q12. Good advice was available when I needed to make study choices | 72.2% | 71.7% | 72.2% | 72.7% | 71.8% | 71.0% | 71.0% | .000
Q13. The timetable works effectively as far as my activities are concerned | 79.8% | 75.9% | 73.3% | 76.3% | 77.8% | 83.1% | 85.7% | .000
Q14. Any changes in the course or teaching have been communicated effectively | 74.4% | 70.9% | 69.3% | 72.8% | 75.1% | 78.8% | 82.2% | .000
Q15. The course is well organised and is running smoothly | 73.3% | 69.3% | 67.9% | 73.4% | 75.6% | 81.3% | 84.5% | .000
Q16. The library resources and services are good enough for my needs | 81.7% | 81.1% | 78.9% | 79.0% | 79.1% | 77.4% | 82.3% | .000
Q17. I have been able to access general IT resources when I needed to | 83.6% | 83.9% | 83.2% | 82.4% | 80.5% | 79.4% | 80.7% | .000
Q18. I have been able to access specialised equipment, facilities, or rooms when I needed to | 77.0% | 76.5% | 73.3% | 69.7% | 65.1% | 60.2% | 58.8% | .000
Q19. The course has helped me to present myself with confidence | 78.7% | 79.6% | 79.6% | 80.9% | 79.3% | 75.1% | 71.7% | .000
Q20. My communication skills have improved | 82.3% | 83.0% | 80.9% | 80.9% | 79.1% | 72.7% | 71.4% | .000
Q21. As a result of the course, I feel confident in tackling unfamiliar problems | 79.2% | 80.0% | 79.1% | 79.6% | 78.4% | 73.9% | 72.6% | .000
Q22. Overall, I am satisfied with the quality of the course | 83.1% | 81.7% | 82.0% | 84.5% | 85.3% | 88.2% | 91.0% | .000
Number of responses to each item (range lowest - highest) | 165,143 - 176,541 | 25,670 - 27,251 | 27,916 - 31,537 | 13,328 - 16,723 | 6,556 - 8,808 | 1,398 - 2,303 | 398 - 726



Figure 3.1: Percentage agreement for each NSS item by age group, Q1-9


Figure 3.2: Percentage agreement for each NSS item by age group, Q10-15



Figure 3.3: Percentage agreement for each NSS item by age group, Q16-22

A broad pattern of positivity increasing with age is seen in 13 of the 22 items (Q2-10, Q13-15 and Q22). This is pronounced for the assessment and feedback scale, particularly Q8, ‘I have received detailed comments on my work’. This raises interesting questions about the range of expectations of different age groups regarding the level of feedback received, or the potential impact of institutions with large numbers of older students (such as The Open University). The reverse pattern, with positivity broadly decreasing with age, is seen for the personal development scale (Q19-21) and for Q18, ‘I have been able to access specialised equipment, facilities, or rooms when I needed to’. The lower positivity for the personal development scale among older students is possibly unsurprising, given the items’ focus on the development of skills that are acquired throughout life. No clear pattern is seen in the remaining five items (Q1, Q11-12 and Q16-17).

It should be noted that even where positivity generally increases with age, the two youngest age groups (0-22 and 23-24) often do not conform to that pattern (e.g. Q7-9 and Q13-15). This is important because these two groups account for over 75% of the total respondents.


3.2 Differences between genders

Table 3.2: Percentage agreement for each NSS item by gender

Item | Female | Male | Sig.
Q1. Staff are good at explaining things | 88.1% | 87.9% | .078
Q2. Staff have made the subject interesting | 81.9% | 79.3% | .000
Q3. Staff are enthusiastic about what they are teaching | 86.2% | 84.1% | .000
Q4. The course is intellectually stimulating | 84.6% | 82.4% | .000
Q5. The criteria used in marking have been made clear in advance | 73.7% | 72.2% | .000
Q6. Assessment arrangements and marking have been fair | 74.1% | 74.9% | .000
Q7. Feedback on my work has been prompt | 63.6% | 61.1% | .000
Q8. I have received detailed comments on my work | 67.8% | 65.6% | .000
Q9. Feedback on my work has helped me clarify things I did not understand | 60.9% | 62.1% | .000
Q10. I have received sufficient advice and support with my studies | 74.2% | 76.0% | .000
Q11. I have been able to contact staff when I needed to | 82.5% | 83.7% | .000
Q12. Good advice was available when I needed to make study choices | 71.4% | 73.2% | .000
Q13. The timetable works effectively as far as my activities are concerned | 78.4% | 78.4% | .588
Q14. Any changes in the course or teaching have been communicated effectively | 72.8% | 74.4% | .000
Q15. The course is well organised and is running smoothly | 71.2% | 74.3% | .000
Q16. The library resources and services are good enough for my needs | 79.6% | 83.0% | .000
Q17. I have been able to access general IT resources when I needed to | 82.5% | 84.6% | .000
Q18. I have been able to access specialised equipment, facilities, or rooms when I needed to | 74.9% | 76.8% | .000
Q19. The course has helped me to present myself with confidence | 79.1% | 78.8% | .000
Q20. My communication skills have improved | 82.7% | 80.7% | .000
Q21. As a result of the course, I feel confident in tackling unfamiliar problems | 78.6% | 80.0% | .000
Q22. Overall, I am satisfied with the quality of the course | 83.3% | 82.8% | .000
Number of responses to each item (range lowest - highest) | 142,968 - 158,147 | 97,854 - 106,174



Figure 3.4: Percentage agreement for each NSS item by gender

There is higher positivity among female students for the quality of learning and teaching scale (Q1-4), including a 2.6% difference for Q2, ‘Staff have made the subject interesting’. Male students demonstrate higher positivity for the learning resources scale, including a 3.4% difference for Q16, ‘The library resources and services are good enough for my needs’ (the greatest difference between the genders). Male students are also more positive about the academic support scale, although to a lesser extent.


3.3 Differences between disability and no known disability

Table 3.3: Percentage agreement with each NSS item by whether or not respondent has a disability

Item | Disabled | No known disability | Sig.
Q1. Staff are good at explaining things | 85.6% | 88.3% | .000
Q2. Staff have made the subject interesting | 81.7% | 80.7% | .000
Q3. Staff are enthusiastic about what they are teaching | 85.0% | 85.4% | .000
Q4. The course is intellectually stimulating | 83.4% | 83.5% | .000
Q5. The criteria used in marking have been made clear in advance | 69.9% | 73.1% | .000
Q6. Assessment arrangements and marking have been fair | 72.7% | 74.3% | .000
Q7. Feedback on my work has been prompt | 61.6% | 62.3% | .000
Q8. I have received detailed comments on my work | 66.5% | 66.5% | *.000
Q9. Feedback on my work has helped me clarify things I did not understand | 60.6% | 61.1% | .000
Q10. I have received sufficient advice and support with my studies | 74.6% | 74.9% | .000
Q11. I have been able to contact staff when I needed to | 80.4% | 83.1% | .000
Q12. Good advice was available when I needed to make study choices | 71.5% | 72.1% | .000
Q13. The timetable works effectively as far as my activities are concerned | 74.3% | 78.7% | .000
Q14. Any changes in the course or teaching have been communicated effectively | 68.1% | 73.8% | .000
Q15. The course is well organised and is running smoothly | 67.7% | 72.6% | .000
Q16. The library resources and services are good enough for my needs | 77.8% | 81.3% | .000
Q17. I have been able to access general IT resources when I needed to | 81.4% | 83.6% | .000
Q18. I have been able to access specialised equipment, facilities, or rooms when I needed to | 74.4% | 76.0% | .000
Q19. The course has helped me to present myself with confidence | 76.8% | 79.3% | .000
Q20. My communication skills have improved | 79.9% | 82.3% | .000
Q21. As a result of the course, I feel confident in tackling unfamiliar problems | 76.8% | 79.5% | .000
Q22. Overall, I am satisfied with the quality of the course | 80.1% | 83.2% | .000
Number of responses to each item (range lowest - highest) | 22,129 - 23,715 | 215,065 - 234,570

* As explained in Section 1 and Appendix A, significance levels relate to differences between the full range of results (grouped into ‘agree’, ‘neither agree nor disagree’, and ‘disagree’), while only the percentage agreement is shown in the table. Therefore, as in this case, a small or zero difference can be associated with a low significance level, due to differences in the ‘neither agree nor disagree’ or ‘disagree’ scores.



Figure 3.5: Percentage agreement with each NSS item by whether or not respondent has a disability

There are some pronounced differences between these groups. For the items in the organisation and management scale (Q13-15), the level of agreement of students without a disability is an average of 5% higher than that of students with a disability. The items in the scale address issues around timetabling, communication and course management. There are also markedly lower levels of positivity among disabled students for the learning resources and personal development scales (Q16-18 and Q19-21). The level of agreement of disabled students to Q22, ‘Overall, I am satisfied with the quality of the course’, is 3.1% lower than for non-disabled students. The only item for which the positivity of disabled students is higher is Q2, ‘Staff have made the subject interesting’, with a 1% difference in agreement.


3.4 Differences between subject clusters

These subject clusters have been created using the HEA’s categorisation. Health Sciences comprises: Health; Medicine and Dentistry; Nursing. STEM comprises: Biological Sciences; Built Environment; Computing; Engineering; Geography, Earth and Environmental Sciences; Mathematics, Statistics and Operational Research; Physical Sciences; Psychology. Social Sciences comprises: Business and Management; Economics; Education; Finance and Accounting; Hospitality, Leisure, Sport and Tourism; Law; Marketing; Politics; Social Work and Social Policy; Sociology. Arts and Humanities comprises: Art and Design; English; History; Languages; Media Communications; Music, Dance and Drama; Philosophical and Religious Studies.

NSS reports for all 28 disciplines are available from the HEA website: http://www.heacademy.ac.uk/nss. Only students studying a subject at 100% full-person equivalent (FPE) have been included in this comparison.
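For local analyses that need to reproduce this grouping, the clusters listed above can be encoded directly; the mapping below simply restates the HEA categorisation, and how the 100% FPE rule is applied will depend on how FPE is recorded in the local dataset.

# The HEA subject clusters, as listed above.
SUBJECT_CLUSTERS = {
    "Health Sciences": ["Health", "Medicine and Dentistry", "Nursing"],
    "STEM": ["Biological Sciences", "Built Environment", "Computing",
             "Engineering", "Geography, Earth and Environmental Sciences",
             "Mathematics, Statistics and Operational Research",
             "Physical Sciences", "Psychology"],
    "Social Sciences": ["Business and Management", "Economics", "Education",
                        "Finance and Accounting",
                        "Hospitality, Leisure, Sport and Tourism", "Law",
                        "Marketing", "Politics", "Social Work and Social Policy",
                        "Sociology"],
    "Arts and Humanities": ["Art and Design", "English", "History", "Languages",
                            "Media Communications", "Music, Dance and Drama",
                            "Philosophical and Religious Studies"],
}

# Reverse lookup: discipline -> cluster.
CLUSTER_OF = {subject: cluster
              for cluster, subjects in SUBJECT_CLUSTERS.items()
              for subject in subjects}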

Table 3.4: Percentage agreement with each NSS item by discipline cluster

Item | Health Sciences | STEM | Social Sciences | Arts and Humanities | Sig.
Q1. Staff are good at explaining things | 89.4% | 87.5% | 88.0% | 87.0% | .000
Q2. Staff have made the subject interesting | 84.6% | 78.6% | 78.3% | 84.3% | .000
Q3. Staff are enthusiastic about what they are teaching | 87.5% | 83.7% | 83.8% | 87.5% | .000
Q4. The course is intellectually stimulating | 87.5% | 84.1% | 81.6% | 82.1% | .000
Q5. The criteria used in marking have been made clear in advance | 73.1% | 72.1% | 75.6% | 71.4% | .000
Q6. Assessment arrangements and marking have been fair | 71.6% | 75.1% | 74.9% | 74.2% | .000
Q7. Feedback on my work has been prompt | 62.5% | 59.7% | 62.6% | 66.2% | .000
Q8. I have received detailed comments on my work | 62.9% | 61.5% | 67.6% | 75.3% | .000
Q9. Feedback on my work has helped me clarify things I did not understand | 59.2% | 58.8% | 60.6% | 67.9% | .000
Q10. I have received sufficient advice and support with my studies | 76.3% | 75.3% | 74.2% | 75.6% | .000
Q11. I have been able to contact staff when I needed to | 81.7% | 84.0% | 82.1% | 82.4% | .000
Q12. Good advice was available when I needed to make study choices | 74.3% | 72.4% | 71.1% | 73.4% | .000
Q13. The timetable works effectively as far as my activities are concerned | 70.7% | 80.0% | 79.7% | 78.8% | .000
Q14. Any changes in the course or teaching have been communicated effectively | 65.0% | 76.7% | 74.8% | 70.8% | .000
Q15. The course is well organised and is running smoothly | 63.9% | 75.6% | 74.5% | 68.3% | .000
Q16. The library resources and services are good enough for my needs | 84.1% | 83.4% | 79.1% | 78.9% | .000
Q17. I have been able to access general IT resources when I needed to | 87.3% | 84.4% | 82.0% | 81.7% | .000
Q18. I have been able to access specialised equipment, facilities, or rooms when I needed to | 79.9% | 78.1% | 74.0% | 71.9% | .000
Q19. The course has helped me to present myself with confidence | 85.5% | 76.1% | 80.8% | 77.1% | .000
Q20. My communication skills have improved | 89.7% | 78.3% | 83.2% | 80.3% | .000
Q21. As a result of the course, I feel confident in tackling unfamiliar problems | 85.8% | 78.2% | 79.8% | 76.4% | .000
Q22. Overall, I am satisfied with the quality of the course | 84.4% | 83.4% | 83.0% | 80.7% | .000
Number of responses to each item (range lowest - highest) | 31,969 - 33,857 | 57,222 - 61,286 | 62,550 - 70,344 | 47,238 - 51,594


Figure 3.6: Percentage agreement with each NSS item by discipline cluster, Q1-9


Figure 3.7: Percentage agreement with each NSS item by discipline cluster, Q10-15



Figure 3.8: Percentage agreement with each NSS item by discipline cluster, Q16-22

Of the four subject clusters, Health Sciences demonstrates the most distinctive levels of agreement. For the items in the organisation and management scale (Q13-15), Health Sciences students express a level of agreement that is on average 6.1% lower than the next lowest cluster (Arts and Humanities for all three items). Conversely, for the items in the personal development scale (Q19-21), students in Health Sciences subjects express a level of agreement that is on average 5.7% higher than the next highest cluster (Social Sciences for all three items). Arts and Humanities students express higher positivity about feedback (Q7-9), with particularly marked positivity for Q8, ‘I have received detailed comments on my work’. For the organisation and management scale (Q13-15), students studying STEM subjects are the most positive.


3.5 Differences between student domicile

Table 3.5: Percentage agreement with each NSS item by student domicile

Item | UK | EU | Non-EU | Sig.
Q1. Staff are good at explaining things | 88.0% | 88.0% | 88.4% | .000
Q2. Staff have made the subject interesting | 81.3% | 78.9% | 75.8% | .000
Q3. Staff are enthusiastic about what they are teaching | 85.7% | 82.2% | 82.7% | .000
Q4. The course is intellectually stimulating | 84.0% | 79.8% | 81.7% | .000
Q5. The criteria used in marking have been made clear in advance | 73.0% | 72.4% | 74.1% | .000
Q6. Assessment arrangements and marking have been fair | 74.6% | 71.9% | 72.8% | .000
Q7. Feedback on my work has been prompt | 62.3% | 60.8% | 69.1% | .000
Q8. I have received detailed comments on my work | 67.1% | 61.2% | 67.8% | .000
Q9. Feedback on my work has helped me clarify things I did not understand | 61.3% | 58.2% | 65.1% | .000
Q10. I have received sufficient advice and support with my studies | 75.0% | 74.7% | 74.2% | .000
Q11. I have been able to contact staff when I needed to | 82.7% | 85.7% | 84.6% | .000
Q12. Good advice was available when I needed to make study choices | 72.0% | 71.7% | 74.5% | .000
Q13. The timetable works effectively as far as my activities are concerned | 78.3% | 78.7% | 79.7% | .000
Q14. Any changes in the course or teaching have been communicated effectively | 72.8% | 77.3% | 80.5% | .000
Q15. The course is well organised and is running smoothly | 71.8% | 75.6% | 80.2% | .000
Q16. The library resources and services are good enough for my needs | 80.9% | 82.3% | 81.4% | .000
Q17. I have been able to access general IT resources when I needed to | 83.1% | 85.2% | 86.2% | .000
Q18. I have been able to access specialised equipment, facilities, or rooms when I needed to | 75.2% | 79.4% | 79.1% | .000
Q19. The course has helped me to present myself with confidence | 79.1% | 76.8% | 78.7% | .000
Q20. My communication skills have improved | 81.8% | 83.9% | 81.8% | .000
Q21. As a result of the course, I feel confident in tackling unfamiliar problems | 79.3% | 78.7% | 78.5% | .000
Q22. Overall, I am satisfied with the quality of the course | 82.9% | 82.9% | 85.4% | .000
Number of responses to each item (range lowest - highest) | 215,936 - 238,167 | 9,656 - 10,268 | 15,230 - 15,886



Figure 3.9: Percentage agreement with each NSS item by student domicile

Students from outside the EU are the most positive group for 11 of the 22 items. The areas of largest difference are the feedback questions in the assessment and feedback scale (Q7-9), for which all groups report less positivity than for other items. The most pronounced difference is for Q7, ‘Feedback on my work has been prompt’, for which the difference between non-EU students (the most positive) and EU students (the least positive) is 8.3%. Non-EU students are similarly the most positive group for the organisation and management scale (Q13-15). One area where EU students are similarly, or more strongly, positive is the learning resources scale (Q16-18), for which UK students are the least positive. UK students are the most positive for the quality of learning and teaching scale (Q1-4), particularly Q3, ‘Staff are enthusiastic about what they are teaching’. For Q22, ‘Overall, I am satisfied with the quality of the course’, while there is no difference between the positivity of UK and EU students, non-EU students express a level of agreement 2.5% higher.


3.6 Differences between part-time and full-time students

Table 3.6: Percentage agreement with each NSS item by mode of study

Item | Full-time | Part-time | Sig.
Q1. Staff are good at explaining things | 88.2% | 86.7% | .000
Q2. Staff have made the subject interesting | 80.8% | 82.1% | .000
Q3. Staff are enthusiastic about what they are teaching | 85.2% | 86.5% | .000
Q4. The course is intellectually stimulating | 83.2% | 88.1% | .000
Q5. The criteria used in marking have been made clear in advance | 72.2% | 81.6% | .000
Q6. Assessment arrangements and marking have been fair | 73.5% | 83.2% | .000
Q7. Feedback on my work has been prompt | 61.4% | 73.8% | .000
Q8. I have received detailed comments on my work | 65.4% | 80.8% | .000
Q9. Feedback on my work has helped me clarify things I did not understand | 60.2% | 73.0% | .000
Q10. I have received sufficient advice and support with my studies | 74.7% | 77.7% | .000
Q11. I have been able to contact staff when I needed to | 83.0% | 82.9% | .163
Q12. Good advice was available when I needed to make study choices | 72.0% | 73.3% | .000
Q13. The timetable works effectively as far as my activities are concerned | 78.3% | 79.5% | .000
Q14. Any changes in the course or teaching have been communicated effectively | 73.1% | 77.0% | .000
Q15. The course is well organised and is running smoothly | 71.9% | 78.0% | .000
Q16. The library resources and services are good enough for my needs | 81.1% | 79.9% | .000
Q17. I have been able to access general IT resources when I needed to | 83.4% | 82.7% | .000
Q18. I have been able to access specialised equipment, facilities, or rooms when I needed to | 76.4% | 65.3% | .000
Q19. The course has helped me to present myself with confidence | 79.1% | 77.6% | .000
Q20. My communication skills have improved | 82.6% | 75.2% | .000
Q21. As a result of the course, I feel confident in tackling unfamiliar problems | 79.5% | 76.4% | .000
Q22. Overall, I am satisfied with the quality of the course | 82.7% | 86.7% | .000
Number of responses to each item (range lowest - highest) | 223,811 - 239,139 | 17,011 - 25,229



Figure 3.10: Percentage agreement with each NSS item by mode of study

There are markedly higher levels of positivity among part-time students for the assessment and feedback scale (Q5-9) including a 15.4% difference for Q8, ‘I have received detailed comments on my work’. This pattern of higher positivity for part-time students is also seen for the organisation and management scale (Q13-15). Areas of higher positivity for full-time students include the learning resources and personal development scales (Q16-18 and Q19-21 respectively) with a large difference of 11.1% for Q18, ‘I have been able to access specialised equipment, facilities, or rooms when I needed to’.


3.7 Differences between part-time Open University students and other part-time students

A high proportion of part-time students study at The Open University: 10,968, or 43.4% of the total number of part-time students who responded to the NSS. It is therefore important to compare OU part-time students with the rest of the part-time students, in order to ascertain the impact of that individual institution on the overall results for part-time students. This is the only instance of institutional-level results being included in this report.

Table 3.7: Percentage agreement with each NSS item by attendance at the OU

Item | Part-time non-OU | Part-time OU | Sig.
Q1. Staff are good at explaining things | 86.4% | 87.1% | .000
Q2. Staff have made the subject interesting | 80.7% | 84.1% | .000
Q3. Staff are enthusiastic about what they are teaching | 85.3% | 88.2% | .000
Q4. The course is intellectually stimulating | 83.5% | 94.0% | .000
Q5. The criteria used in marking have been made clear in advance | 77.8% | 86.6% | .000
Q6. Assessment arrangements and marking have been fair | 79.2% | 88.5% | .000
Q7. Feedback on my work has been prompt | 64.3% | 86.2% | .000
Q8. I have received detailed comments on my work | 72.4% | 91.7% | .000
Q9. Feedback on my work has helped me clarify things I did not understand | 65.7% | 82.5% | .000
Q10. I have received sufficient advice and support with my studies | 75.1% | 81.2% | .000
Q11. I have been able to contact staff when I needed to | 79.2% | 87.8% | .000
Q12. Good advice was available when I needed to make study choices | 70.8% | 76.7% | .000
Q13. The timetable works effectively as far as my activities are concerned | 75.7% | 84.5% | .000
Q14. Any changes in the course or teaching have been communicated effectively | 69.4% | 87.8% | .000
Q15. The course is well organised and is running smoothly | 68.8% | 90.1% | .000
Q16. The library resources and services are good enough for my needs | 79.0% | 81.1% | .000
Q17. I have been able to access general IT resources when I needed to | 80.8% | 85.3% | .000
Q18. I have been able to access specialised equipment, facilities, or rooms when I needed to | 67.8% | 59.9% | .000
Q19. The course has helped me to present myself with confidence | 79.2% | 75.3% | .000
Q20. My communication skills have improved | 76.7% | 73.1% | .000
Q21. As a result of the course, I feel confident in tackling unfamiliar problems | 77.2% | 75.3% | .000
Q22. Overall, I am satisfied with the quality of the course | 82.1% | 92.7% | .000
Number of responses to each item (range lowest - highest) | 11,592 - 14,277 | 5,419 - 10,952



Figure 3.11: Percentage agreement with each NSS item by attendance at the OU

Part-time OU students are more positive than other part-time students for all items except Q18, ‘I have been able to access specialised equipment, facilities, or rooms when I needed to’, and the items in the personal development scale (Q19-21). One possible explanation for the differences on the personal development scale may lie in the tendency of OU students to be older, and the tendency of older students to be less positive regarding that scale (see Figure 3.3).

In general, non-OU part-time students tend to express broadly similar levels of positivity to full-time students, whereas the results for part-time OU students suggest an experience of a different character. For instance, whereas both full-time and non-OU part-time students have markedly lower levels of positivity for the assessment and feedback scale (Q5-9) compared to other items, part-time OU students do not report lower positivity for that scale. The difference between OU and non-OU part-time students for Q7, ‘Feedback on my work has been prompt’, is very large, at 21.9%. It is also large for Q15, ‘The course is well organised and is running smoothly’, at 21.3%.

The most pronounced area of lower positivity among part-time OU students is for Q18, ‘I have been able to access specialised equipment, facilities, or rooms when I needed to’, where the level of agreement is 7.9% lower than for part-time non-OU students.

These patterns of higher and lower levels of positivity are pronounced among part-time OU students, but they do have echoes in the results for non-OU part-time students, despite the latter group’s overall similarity to full-time students. For example, non-OU part-time students are more positive than full-time students about the assessment and feedback scale (the difference for Q8, ‘I have received detailed comments on my work’, is 7%), and less positive about Q18, ‘I have been able to access specialised equipment, facilities, or rooms when I needed to’ (the difference being 8.6%).


4. Institutional characteristics

4.1 Differences between UK nations

These national groups are formed by the location of the institution, rather than the domicile of the student. Due to the dispersed nature of the institution, students from The Open University have been excluded from these figures.

Table 4.1: Percentage agreement with each NSS item by location of institution

Item | England | Scotland* | Wales | Northern Ireland | Sig.
Q1. Staff are good at explaining things | 87.9% | 89.8% | 88.6% | 87.0% | .000
Q2. Staff have made the subject interesting | 80.6% | 83.5% | 81.1% | 78.1% | .000
Q3. Staff are enthusiastic about what they are teaching | 85.2% | 86.0% | 85.0% | 83.3% | .000
Q4. The course is intellectually stimulating | 83.0% | 86.0% | 83.1% | 83.0% | .000
Q5. The criteria used in marking have been made clear in advance | 72.5% | 72.5% | 72.5% | 72.0% | .234
Q6. Assessment arrangements and marking have been fair | 73.7% | 75.3% | 75.2% | 71.7% | .000
Q7. Feedback on my work has been prompt | 62.4% | 54.0% | 60.0% | 57.1% | .000
Q8. I have received detailed comments on my work | 66.6% | 58.0% | 66.3% | 57.6% | .000
Q9. Feedback on my work has helped me clarify things I did not understand | 60.9% | 56.5% | 61.0% | 55.8% | .000
Q10. I have received sufficient advice and support with my studies | 74.5% | 75.9% | 76.3% | 71.8% | .000
Q11. I have been able to contact staff when I needed to | 82.4% | 85.4% | 84.1% | 83.8% | .000
Q12. Good advice was available when I needed to make study choices | 71.9% | 71.6% | 73.5% | 70.5% | .000
Q13. The timetable works effectively as far as my activities are concerned | 77.9% | 79.9% | 79.0% | 80.1% | .000
Q14. Any changes in the course or teaching have been communicated effectively | 72.7% | 73.6% | 72.3% | 78.0% | .000
Q15. The course is well organised and is running smoothly | 71.6% | 71.5% | 72.0% | 76.1% | .000
Q16. The library resources and services are good enough for my needs | 80.8% | 83.2% | 79.1% | 87.0% | .000
Q17. I have been able to access general IT resources when I needed to | 82.9% | 87.3% | 83.3% | 86.9% | .000
Q18. I have been able to access specialised equipment, facilities, or rooms when I needed to | 75.7% | 79.5% | 75.2% | 80.4% | .000
Q19. The course has helped me to present myself with confidence | 79.0% | 80.8% | 78.8% | 79.3% | .000
Q20. My communication skills have improved | 82.1% | 84.2% | 81.7% | 84.6% | .000
Q21. As a result of the course, I feel confident in tackling unfamiliar problems | 79.2% | 81.3% | 79.1% | 79.9% | .000
Q22. Overall, I am satisfied with the quality of the course | 82.4% | 85.7% | 83.0% | 82.5% | .000
Number of responses to each item (range lowest - highest) | 210,238 - 216,537 | 16,588 - 17,939 | 12,905 - 14,029 | 4,672 - 4,868
* Participation in the NSS is voluntary for Scottish institutions, and 14 took part in 2011.



Figure 4.1: Percentage agreement with each NSS item by location of institution, Q1-9


Figure 4.2: Percentage agreement with each NSS item by location of institution, Q10-15



Figure 4.3: Percentage agreement with each NSS item by location of institution, Q16-22

There are marked differences in levels of positivity expressed by students within the four UK nations. Students at Scottish institutions are the most positive group for ten of the 22 items (Q1-4, Q6, Q11, Q17, Q19 and Q21-22). This includes all of the items in the quality of learning and teaching scale. By contrast, students at Scottish institutions are less positive than students at English institutions (by an average of 7.13%) for the feedback items (Q7-Q9). For Q7, ‘Feedback on my work has been prompt’, and Q8, ‘I have received detailed comments on my work’, students at English institutions express the highest level of positivity. For Q8, 66-67% of students at English and Welsh institutions express agreement, compared with 58% of students at Scottish and Northern Irish institutions. Students at Northern Irish institutions express the greatest positivity of the four nations for a number of items, including all three items in the organisation and management scale (Q13-15). For Q22, ‘Overall, I am satisfied with the quality of the course’, students at institutions in England, Wales and Northern Ireland are clustered within a 0.6% range, while students studying at Scottish institutions express a level of agreement 2.7% higher.


4.2 Differences between mission groups

Mission group membership is as at the time the survey took place (spring 2011).

Table 4.2: Percentage agreement with each NSS item by mission group

Item | Russell Group | 1994 Group | GuildHE** | University Alliance* | Million+ | Sig.
Q1. Staff are good at explaining things | 90.3% | 91.6% | 88.3% | 86.4% | 86.1% | .000
Q2. Staff have made the subject interesting | 82.8% | 84.6% | 84.2% | 78.0% | 79.0% | .000
Q3. Staff are enthusiastic about what they are teaching | 87.7% | 89.2% | 87.0% | 83.2% | 83.1% | .000
Q4. The course is intellectually stimulating | 89.5% | 89.1% | 81.3% | 79.5% | 79.6% | .000
Q5. The criteria used in marking have been made clear in advance | 66.9% | 73.3% | 75.4% | 73.7% | 74.6% | .000
Q6. Assessment arrangements and marking have been fair | 73.5% | 77.4% | 73.6% | 71.7% | 72.6% | .000
Q7. Feedback on my work has been prompt | 60.6% | 65.1% | 64.9% | 60.8% | 61.4% | .000
Q8. I have received detailed comments on my work | 57.9% | 66.7% | 75.1% | 66.2% | 68.5% | .000
Q9. Feedback on my work has helped me clarify things I did not understand | 55.2% | 60.6% | 65.1% | 59.3% | 62.8% | .000
Q10. I have received sufficient advice and support with my studies | 73.6% | 77.0% | 78.6% | 73.7% | 73.7% | .000
Q11. I have been able to contact staff when I needed to | 87.3% | 87.9% | 81.9% | 80.1% | 78.8% | .000
Q12. Good advice was available when I needed to make study choices | 71.2% | 74.4% | 75.2% | 71.0% | 71.0% | .000
Q13. The timetable works effectively as far as my activities are concerned | 82.2% | 84.1% | 75.5% | 75.0% | 75.9% | .000
Q14. Any changes in the course or teaching have been communicated effectively | 79.1% | 81.7% | 68.9% | 70.5% | 68.1% | .000
Q15. The course is well organised and is running smoothly | 78.8% | 82.6% | 68.1% | 68.2% | 66.3% | .000
Q16. The library resources and services are good enough for my needs | 86.9% | 79.4% | 77.3% | 82.1% | 78.3% | .000
Q17. I have been able to access general IT resources when I needed to | 87.5% | 83.8% | 80.5% | 82.2% | 81.9% | .000
Q18. I have been able to access specialised equipment, facilities, or rooms when I needed to | 81.0% | 78.2% | 72.1% | 75.6% | 74.3% | .000
Q19. The course has helped me to present myself with confidence | 78.0% | 79.9% | 81.2% | 78.8% | 79.4% | .000
Q20. My communication skills have improved | 81.8% | 82.5% | 83.7% | 82.6% | 82.8% | .000
Q21. As a result of the course, I feel confident in tackling unfamiliar problems | 80.2% | 80.5% | 80.1% | 78.6% | 78.8% | .000
Q22. Overall, I am satisfied with the quality of the course | 86.3% | 88.2% | 81.9% | 80.7% | 79.6% | .000
Number of responses to each item (range lowest - highest) | 48,556 - 52,821 | 26,664 - 29,349 | 9,516 - 10,053 | 48,536 - 51,590 | 42,209 - 45,146

* Excluding Bucks New University (included in Million+) and The Open University, which accounts for around 17.5% of the University Alliance respondents and thus has a very large impact on the results.
** Excluding (included in Million+).


Figure 4.4: Percentage agreement with each NSS item by mission group, Q1-9


Figure 4.5: Percentage agreement with each NSS item by mission group, Q10-15


Figure 4.6: Percentage agreement with each NSS item by mission group, Q16-22

Students at Russell Group institutions express higher positivity for the learning resources scale (Q16-18), especially Q16, ‘The library resources and services are good enough for my needs’, where the gap between the Russell Group and the next most positive group is 5%. An area of low positivity for the Russell Group is the assessment and feedback scale, particularly Q8, ‘I have received detailed comments on my work’ and Q9, ‘Feedback on my work has helped me clarify things I did not understand’; for Q8, the difference between the Russell Group and the next least positive (University Alliance) is 8.3%, while the difference between the Russell Group and the most positive (GuildHE) is 17.2%. To some extent these patterns of higher and lower positivity for Russell Group institutions mirror those of full-time students (see Figure 3.11), and may reflect the very low proportions of part-time students at Russell Group institutions.

Clustering can be seen for most of the items in the quality of learning and teaching scale (Q1-4), with students at Million+ and University Alliance institutions (predominantly newer institutions) reporting lower levels of positivity than the other groups. GuildHE students report markedly higher levels of positivity for Q8, ‘I have received detailed comments on my work’, while students at University Alliance institutions are relatively positive about Q16, ‘The library resources and services are good enough for my needs’. Million+ students are the second most positive group for Q20, ‘My communication skills have improved’.


4.3 Differences between institution types

In this section a broader classification of institution is used, encompassing all HE institutions participating in the NSS. The students included in the ‘FEC’ group are students studying on HE courses at further education colleges. Only students on HE courses are invited to participate in the NSS.

Table 4.3: Percentage agreement with each NSS item by institution type

Item | Pre-1992* | Post-1992 | Small and specialist | FEC | Sig.
Q1. Staff are good at explaining things | 90.4% | 86.3% | 86.5% | 86.1% | .000
Q2. Staff have made the subject interesting | 82.8% | 78.8% | 82.3% | 80.1% | .000
Q3. Staff are enthusiastic about what they are teaching | 87.5% | 83.3% | 86.0% | 83.5% | .000
Q4. The course is intellectually stimulating | 88.1% | 79.7% | 79.9% | 79.2% | .000
Q5. The criteria used in marking have been made clear in advance | 70.0% | 74.2% | 70.5% | 77.5% | .000
Q6. Assessment arrangements and marking have been fair | 74.7% | 72.3% | 70.9% | 79.1% | .000
Q7. Feedback on my work has been prompt | 61.9% | 60.9% | 62.0% | 64.3% | .000
Q8. I have received detailed comments on my work | 61.6% | 67.8% | 69.8% | 76.0% | .000
Q9. Feedback on my work has helped me clarify things I did not understand | 57.8% | 61.2% | 62.7% | 70.6% | .000
Q10. I have received sufficient advice and support with my studies | 75.0% | 74.0% | 73.3% | 77.5% | .000
Q11. I have been able to contact staff when I needed to | 86.7% | 79.5% | 80.1% | 81.1% | .000
Q12. Good advice was available when I needed to make study choices | 72.4% | 71.1% | 71.7% | 74.5% | .000
Q13. The timetable works effectively as far as my activities are concerned | 82.0% | 75.4% | 71.1% | 76.2% | .000
Q14. Any changes in the course or teaching have been communicated effectively | 78.8% | 69.1% | 65.2% | 66.0% | .000
Q15. The course is well organised and is running smoothly | 78.9% | 67.1% | 62.4% | 63.2% | .000
Q16. The library resources and services are good enough for my needs | 83.8% | 80.1% | 79.0% | 71.6% | .000
Q17. I have been able to access general IT resources when I needed to | 86.0% | 81.8% | 81.1% | 77.6% | .000
Q18. I have been able to access specialised equipment, facilities, or rooms when I needed to | 79.6% | 74.6% | 70.5% | 67.3% | .000
Q19. The course has helped me to present myself with confidence | 79.0% | 79.2% | 77.7% | 80.1% | .000
Q20. My communication skills have improved | 82.3% | 82.6% | 81.5% | 80.1% | .000
Q21. As a result of the course, I feel confident in tackling unfamiliar problems | 80.4% | 78.6% | 77.6% | 78.8% | .000
Q22. Overall, I am satisfied with the quality of the course | 86.4% | 80.1% | 78.9% | 78.5% | .000
Number of responses to each item (range lowest - highest) | 99,499 - 108,294 | 108,050 - 115,349 | 9,404 - 9,771 | 18,194 - 19,705 | –

* Excluding The Open University, as it accounts for around 9.2% of the Pre-1992 respondents and thus has a very large impact on the results.


Figure 4.7: Percentage agreement with each NSS item by institution type, Q1-9


Figure 4.8: Percentage agreement with each NSS item by institution type, Q10-15


Figure 4.9: Percentage agreement with each NSS item by institution type, Q16-22

As the mission groups are clustered within these broader ‘institution type’ groups, many of these observations are similar to those in Section 4.2.

For 13 of the 22 items, the highest levels of positivity (though often by small margins) are reported by students at pre-1992 institutions, while FEC students are most positive for eight items. Pre-1992 students are most positive for the quality of learning and teaching scale generally (Q1-4), Q11, ‘I have been able to contact staff when I needed to’, and the organisation and management and learning resources scales (Q13-15 and Q16-18 respectively). This higher positivity is particularly notable for Q15, ‘The course is well organised and is running smoothly’, for which the level of agreement expressed by pre-1992 students is 11.8% higher than that of the next most positive (post-1992). Pre-1992 students are least positive of all the groups about Q8, ‘I have received detailed comments on my work’, and Q9, ‘Feedback on my work has helped me clarify things I did not understand’.

There are a number of areas where clustering can be seen among the post-1992, small and specialist, and FEC institutions. For example, for Q22, ‘Overall, I am satisfied with the quality of the course’, those three groups all fall within 1.6% of each other, whereas pre-1992 students express a level of agreement 6.3% higher than the next most positive group (post-1992). Of the eight items where FEC students are the most positive, five make up the assessment and feedback scale (Q5-9); the difference is most marked for Q8 and Q9.


4.4 Comparison between HE and FE institutions

As in Section 4.3, the students included in the ‘FE’ group are students studying on HE courses at further education colleges. Only students on HE courses are invited to participate in the NSS.

Table 4.4: Percentage agreement with each NSS item by HE and FE institutions

Item | HE | FE | Sig.
Q1. Staff are good at explaining things | 88.2% | 86.1% | .000
Q2. Staff have made the subject interesting | 81.0% | 80.1% | .000
Q3. Staff are enthusiastic about what they are teaching | 85.5% | 83.5% | .000
Q4. The course is intellectually stimulating | 84.0% | 79.2% | .000
Q5. The criteria used in marking have been made clear in advance | 72.7% | 77.5% | .000
Q6. Assessment arrangements and marking have been fair | 74.1% | 79.1% | .000
Q7. Feedback on my work has been prompt | 62.5% | 64.3% | .000
Q8. I have received detailed comments on my work | 66.2% | 76.0% | .000
Q9. Feedback on my work has helped me clarify things I did not understand | 60.6% | 70.6% | .000
Q10. I have received sufficient advice and support with my studies | 74.7% | 77.5% | .000
Q11. I have been able to contact staff when I needed to | 83.1% | 81.1% | .000
Q12. Good advice was available when I needed to make study choices | 71.9% | 74.5% | .000
Q13. The timetable works effectively as far as my activities are concerned | 78.6% | 76.2% | .000
Q14. Any changes in the course or teaching have been communicated effectively | 74.0% | 66.0% | .000
Q15. The course is well organised and is running smoothly | 73.2% | 63.2% | .000
Q16. The library resources and services are good enough for my needs | 81.7% | 71.6% | .000
Q17. I have been able to access general IT resources when I needed to | 83.8% | 77.6% | .000
Q18. I have been able to access specialised equipment, facilities, or rooms when I needed to | 76.3% | 67.3% | .000
Q19. The course has helped me to present myself with confidence | 78.9% | 80.1% | .000
Q20. My communication skills have improved | 82.0% | 80.1% | .000
Q21. As a result of the course, I feel confident in tackling unfamiliar problems | 79.2% | 78.8% | .124
Q22. Overall, I am satisfied with the quality of the course | 83.5% | 78.5% | .000
Number of responses to each item (range lowest - highest) | 222,628 - 244,616 | 18,194 - 19,705 | –


Figure 4.10: Percentage agreement with each NSS item by HE and FE institutions

There are marked differences between students studying higher education courses at HE and FE institutions; however, note should be taken of Section 4.3, which suggested that differences between types of HE institution are often more pronounced than differences between HE and FE institutions.

The differences between HE and FE institutions are concentrated in the assessment and feedback scale, where FE students are more positive, and in the organisation and management and learning resources scales, where HE students are more positive. The largest differences in favour of FE students are for Q8, ‘I have received detailed comments on my work’, and Q9, ‘Feedback on my work has helped me clarify things I did not understand’, for which the level of agreement of FE students is around 10% higher. The largest difference in favour of HE students is for Q16, ‘The library resources and services are good enough for my needs’, where the gap is 10.1%. Students at HE institutions are also more positive about their overall experience, with a level of agreement 5.0% higher than students at FE institutions; however, note should be taken of the observation in Section 4.3 that the only marked difference here is between FE institutions and pre-1992 institutions, rather than HE institutions in general.


5. Comparison between NSS and PTES

The national Postgraduate Taught Experience Survey (PTES) is run annually by the Higher Education Academy in conjunction with institutions. Table 5.1 shows comparisons between data from NSS items and data from relevant items in PTES. The PTES data are from the 2011 administration of the survey, and the full report can be accessed on the HEA’s website. There are comparable items in PTES for all NSS items except items Q10, Q11, Q12 and Q22. For NSS items Q7 and Q16 there are multiple relevant items in PTES. Unless otherwise stated, the relevant item wording in PTES is either identical to the NSS item, or contains only insignificant differences. The relevant PTES item numbers are in square brackets.

Please note that whereas the NSS is compulsory for HE providers in England, Wales and Northern Ireland, PTES is voluntary. Eighty institutions took part in PTES 2011, as opposed to the 253 institutions that took part in NSS 2011. Differences in results between PTES and the NSS may, therefore, reflect differences between the institutions taking part rather than genuine differences in experience. Nonetheless, PTES includes many of the same questions as found in the NSS as well as some that add further information to the NSS-type questions. Please also note that no tests for significance have been undertaken for this table; the differences between results for NSS and PTES items are provided solely for interest, and should only be taken as indicative.

Table 5.1: Percentage agreement with comparable items in the NSS and PTES in 2011

NSS item [PTES item] | NSS | PTES
Q1. Staff are good at explaining things [PTES Q4a] | 88.0% | 81.3%
Q2. Staff have made the subject interesting [PTES Q4b] | 80.9% | 77.1%
Q3. Staff are enthusiastic about what they are teaching [PTES Q4c] | 85.3% | 83.8%
Q4. The course is intellectually stimulating [PTES Q3d] | 83.7% | 82.8%
Q5. The criteria used in marking have been made clear in advance [PTES Q11a] | 73.1% | 72.8%
Q6. Assessment arrangements and marking have been fair [PTES Q11b] | 74.4% | 72.8%
Q7. Feedback on my work has been prompt [PTES Q11c] | 62.6% | 60.2%
I received feedback in time to allow me to improve my next assignment [PTES Q11d – no direct NSS equivalent] | N/A | 59.3%
Q8. I have received detailed comments on my work [PTES Q11e] | 66.9% | 68.7%
Q9. Feedback on my work has helped me clarify things I did not understand [PTES Q11f] | 61.4% | 60.8%
Q13*. The timetable works efficiently as far as my activities are concerned [PTES Q14a] | 78.4% | 76.8%
Q14. Any changes in the course or teaching have been communicated effectively [PTES Q14b] | 73.4% | 73.9%
Q15. The course is well organised and is running smoothly [PTES Q14c] | 72.4% | 70.8%
Q16. The library resources and services are good enough for my needs [PTES Q16a] | 81.0% | 74.7%
The library resources and services are easily accessible [PTES Q16b – no NSS equivalent] | N/A | 79.3%
I am satisfied with the quality of learning materials available to me (print, online material, DVDs etc.) [PTES Q16f – no NSS equivalent] | N/A | 76.2%
Q17. I have been able to access general IT resources when I needed to [PTES Q16c] | 83.4% | 79.3%
Q18. I have been able to access specialised equipment, facilities or rooms when I needed to [PTES Q16e] | 75.6% | 67.7%
Q19. The course has helped me to present myself with confidence [PTES Q17d] | 79.0% | 69.6%
Q20. My communication skills have improved [PTES Q17e] | 81.9% | 67.8%
Q21. As a result of my course, I feel confident in tackling unfamiliar problems [PTES Q17f] | 79.2% | 70.8%
Number of responses to each item (range lowest - highest) | 240,822 - 264,321 | 26,910 - 38,327

* PTES Q14a, the equivalent of NSS item Q13, is slightly differently worded: ‘The timetable fits well with my other commitments’.

In general, similar patterns can be seen in responses to the NSS and PTES, suggesting that (final-year) undergraduates and taught postgraduates have broadly similar experiences. The familiar pattern in the NSS of lower positivity regarding feedback (Q7-9) can be seen in the PTES results (Q11c-f), as can the higher levels of positivity regarding the quality of learning and teaching (NSS items Q1-4). Areas of difference, where the results for PTES are less positive, are Q18, ‘I have been able to access specialised equipment, facilities, or rooms when I needed to’, and the personal development scale (NSS items Q19-21). Q20, ‘My communication skills have improved’, demonstrates the most pronounced difference, with PTES respondents having a level of agreement 14.1% lower than that of NSS respondents. One possible explanation of this different pattern for Q18-21 could lie in the tendency of PTES respondents to be older, and the evidence from the NSS that older students are less positive about those issues (see Section 3.1). However, the different nature of the respondent populations and their courses and the lack of tests for statistical significance mean that any interpretation should be approached with caution.


6. Additional HEA resources

The Higher Education Academy supports institutions and discipline communities to use student survey data to enhance the student learning experience. For more about our work on the National Student Survey please visit: http://www.heacademy.ac.uk/nss .

6.1 Discipline reports

The HEA has produced 28 reports containing discipline-specific analysis of the 2011 NSS results. Data are provided for 67 subjects, including comparisons between student and institution characteristics, and regression and correlation analyses.

The reports are freely available to download at: http://www.heacademy.ac.uk/nss .

6.2 Research

The HEA has produced a number of key pieces of research relating to the NSS:

Dimensions of Quality (2010)

Produced by Graham Gibbs, this report sets out to identify those factors that give a reliable indication of the quality of student learning. Its focus is broader than just the use of student survey data, but it provides a useful overview of different mechanisms of evaluating educational quality.

Available from: http://www.heacademy.ac.uk/assets/documents/evidence_informed_practice/Dimensions_of_Quality.pdf .

The National Student Survey three years on: What have we learned? (2009)

This report by Paula Surridge summarises some key pieces of research to give an overview of findings relating to the NSS. It also gives recommendations for future work. It is a very useful guide to NSS data, especially regarding the important question of what they can and cannot tell us.

Available from: http://www.heacademy.ac.uk/assets/documents/research/surveys/nss/NSS_three_years_on_surridge_02.06.09.pdf .

National Student Survey of Teaching in UK Universities: Dimensionality, multilevel structure and differentiation at the level of university and discipline: preliminary results (2008)

This report, by Herb Marsh and Jacqueline Cheng, is a technical investigation of a number of issues, focusing in particular on the relative effects on NSS scores of various factors such as institution and discipline. It is a rich source of information that can help to illuminate raw NSS data.

Available from: http://www.heacademy.ac.uk/assets/documents/research/surveys/nss/NSS_herb_marsh-28.08.08.pdf .


6.3 Case studies of enhancement activities

Through its Institutional Working Group, the HEA has collected case studies describing how NSS data have been used to enhance learning and teaching within institutions:

• 12 case studies from 2007, available from: http://www.heacademy.ac.uk/assets/documents/subjects/bioscience/nss-case-studies.doc ;

• five case studies from 2010, available from: http://www.heacademy.ac.uk/assets/EvidenceNet/Case_studies/NSS_case_studies_Nov_2010.pdf .

6.4 Postgraduate surveys

In addition to supporting the sector to use NSS data for the enhancement of learning and teaching, the HEA has also developed its own national surveys, looking at the postgraduate student experience.

Postgraduate Taught Experience Survey

PTES has been running since 2009; in 2011 about 39,000 students from 80 institutions completed the survey. The survey asks students about a wide range of elements of their learning experience, including feedback, teaching and skills development. It also asks about the depth and sophistication of the learning they have engaged in. In 2012, 83 institutions took part in the survey.

For more information visit: http://www.heacademy.ac.uk/ptes .

Postgraduate Research Experience Survey

PRES is the sister survey of PTES and is aimed at postgraduate research students. It runs every two years, and in 2011 over 31,000 students from 102 institutions completed the survey. The survey will next run in 2013.

For more information please visit: http://www.heacademy.ac.uk/pres .

6.5 Consultancy and change programmes

The HEA runs regular change programmes for departments and faculties wishing to explore their NSS results. More information can be found here: http://www.heacademy.ac.uk/change .

The HEA is also currently developing an institutional consultancy service, which will provide senior managers with advice, tailored analysis and support to help them use survey data to strategically address issues in learning and teaching. If you are interested in this service then please email: [email protected] .


7. Further reading

In addition to the research produced by the HEA described in the previous section, there are a number of other studies and reviews that provide useful information about the strengths and limitations of NSS data.

• Alan Fielding, Peter Dunleavy and Mark Langan (2010) Interpreting context to the UK’s National Student (Satisfaction) Survey for science subjects. Journal of Further and Higher Education. 34 (3), 347-368.

This is an investigation into the complex issues that can arise when interpreting NSS data. A number of important findings are contained in the article, such as the absence of a strong correlation between the experience of feedback and overall satisfaction, and the important subject differences in students’ responses to the NSS items.

• Abbi Flint, Anne Oxley, Paul Helm and Sally Bradley (2009) Preparing for success: one institution’s aspirational and student focused response to the National Student Survey. Teaching in Higher Education. 14 (6), 608-618.

This article discusses the involvement of students in the process of using NSS data for quality enhancement purposes. Various activities are described, including an event to allow academics to hear student perspectives in detail, and the publication of a ‘You Said, We Did...’ document to inform students of the changes that had resulted from their feedback.

• HEFCE (2011) National Student Survey: Findings and trends 2006 to 2010. Bristol: HEFCE.

This is the latest annual report on the NSS by HEFCE. It provides an overview of the 2010 data, as well as looking at trends in the data from 2006 to 2010 around various demographic characteristics of the student population.

• Paul Ramsden, Denise Batchelor, Alison Peacock, Paul Temple and David Watson (2010) Enhancing and developing the National Student Survey: report to HEFCE. Bristol: HEFCE.

This report, commissioned by HEFCE, provided an interim evaluation of the functions and performance of the NSS, in order to arrive at recommendations about whether the survey should be updated or developed. The study proposed no substantial changes to the survey, but recommended that a full review be undertaken in 2015.

• John Richardson (2005) Instruments for obtaining student feedback: a review of the literature. Assessment & Evaluation in Higher Education. 30 (4), 387-415.

This is a very useful review of the research literature concerning the different kinds of survey tools that can be used to gather information about students’ learning experiences.

• John Richardson, John Slater and Jane Wilson (2007) The National Student Survey: development, findings and implications. Studies in Higher Education. 32 (5), 557-580.

This article describes the history and development of the NSS, focusing on the mechanisms and findings of the two pilot surveys that took place in 2003 and 2004.


• Ruth Williams and John Brennan (2003) Collecting and using student feedback on quality and standards of learning and teaching in HE. Bristol: HEFCE.

This is a report commissioned by HEFCE in order to: i) identify good practice in collecting feedback from students, for quality enhancement; and ii) make recommendations about the design and implementation of a national survey of students. This report played an important role in the development of the NSS.

• Mantz Yorke (2009) Student experience surveys: some methodological considerations and an empirical investigation. Assessment & Evaluation in Higher Education. 34 (6), 721-739.

This article looks at a number of issues and controversies around the design and administration of sector-wide student surveys, including the NSS.


Appendix A: Brief description of analyses

Differences between the various student groups were analysed using the chi-square test. Each NSS item was recoded from the five-point Likert scale into a three-point scale (‘agree’, ‘neither agree nor disagree’, ‘disagree’) and treated as a discrete variable. Pearson chi-square statistics were calculated for each NSS item separately, and the corresponding statistical significance values are reported in the tables. If the statistical significance value is equal to or less than 0.05, the difference between groups on that NSS item is statistically significant.

For ease of reporting and interpretation, only the percentage agreeing with each item is shown in the tables in this report. However, statistical significance testing makes use of the full range of data (coded into the three-point Likert scale). Occasionally, therefore, a difference may be statistically significant even where the difference shown in the percentage agreeing is very small. In these cases, the source of difference is likely to lie in the profile of ‘disagree’ and ‘neither agree nor disagree’ responses.
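
To make the procedure concrete, the minimal sketch below (in Python, using scipy) recodes five-point response counts for a single item into the three-point scale, reports the percentage agreeing and runs the Pearson chi-square test across groups. The group names and counts are invented purely for illustration; they are not NSS data.

    # Illustrative sketch only: the group names and response counts below are
    # invented for demonstration and are not NSS data.
    from scipy.stats import chi2_contingency

    # Five-point counts for a single item, per group:
    # (definitely disagree, mostly disagree, neither, mostly agree, definitely agree)
    five_point = {
        "Group A": [40, 110, 250, 900, 700],
        "Group B": [60, 140, 300, 850, 650],
    }

    def recode(counts):
        """Collapse five-point counts into disagree / neither / agree."""
        dd, md, neither, ma, da = counts
        return [dd + md, neither, ma + da]

    # Groups x categories contingency table on the three-point scale
    table = [recode(c) for c in five_point.values()]
    chi2, p, dof, expected = chi2_contingency(table)

    # Only the 'agree' percentage appears in the report's tables, but the test
    # uses the full three-point distribution, so a difference can be significant
    # even when the agreement percentages are close.
    for group, counts in five_point.items():
        disagree, neither, agree = recode(counts)
        print(f"{group}: % agree = {100 * agree / (disagree + neither + agree):.1f}")
    print(f"chi-square = {chi2:.1f}, p = {p:.3f}")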

The correlation and regression analyses used mean scale scores, calculated for each student as the average over all items belonging to the same scale. The averages were taken over the original five-point Likert scale items, so they take values between one and five. Some missing values were allowed, i.e. a scale score was computed even if a student did not answer every question in the scale; the minimum number of answers required per scale was three. The correlation coefficients between the scale scores and Q22 (the overall satisfaction item) were calculated using bivariate correlation (two-tailed)4.
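
The construction of the scale scores can be sketched as follows, assuming a hypothetical pandas DataFrame with columns 'Q1' to 'Q22' holding the original 1-5 codes (missing or ‘not applicable’ answers stored as NaN). The column names, DataFrame and function are illustrative only, not part of the NSS dataset or its processing pipeline.

    import pandas as pd

    # Items belonging to each scale, as listed in Appendix C.
    SCALES = {
        "teaching":     ["Q1", "Q2", "Q3", "Q4"],
        "assessment":   ["Q5", "Q6", "Q7", "Q8", "Q9"],
        "support":      ["Q10", "Q11", "Q12"],
        "organisation": ["Q13", "Q14", "Q15"],
        "resources":    ["Q16", "Q17", "Q18"],
        "development":  ["Q19", "Q20", "Q21"],
    }

    def scale_scores(responses: pd.DataFrame, min_answers: int = 3) -> pd.DataFrame:
        """Per-student mean of the original 1-5 item codes for each scale.
        A score is only computed where at least `min_answers` items were answered."""
        scores = pd.DataFrame(index=responses.index)
        for name, items in SCALES.items():
            block = responses[items]
            answered = block.notna().sum(axis=1)
            scores[name] = block.mean(axis=1).where(answered >= min_answers)
        return scores

    # Bivariate correlation of each scale score with Q22, where `df` is the
    # hypothetical DataFrame of item responses on the 1-5 scale:
    # scale_scores(df).corrwith(df["Q22"])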

The regression analysis employs multiple linear regression. The dependent variable – Q22 on a five-point Likert scale – was treated as a continuous variable, which is a fairly common, although not the only, way to analyse this type of data. This method was chosen for the simplicity of the output and the interpretation of the regression coefficients. The NSS scale scores were used as the explanatory variables in the regression equation. Some of them are highly correlated, so the regression analysis may be subject to multicollinearity. Multicollinearity usually inflates the standard errors of the coefficients, but it does not affect the main conclusions drawn from the model.
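
A corresponding sketch of the regression step, again using the hypothetical scale scores from the previous sketch, fits the model by ordinary least squares via statsmodels; this is one reasonable way to implement the approach described above, not a description of the actual analysis code.

    import statsmodels.api as sm

    def overall_satisfaction_model(scores, q22):
        """Regress Q22 (treated as continuous) on the six scale scores using
        ordinary least squares."""
        data = scores.copy()
        data["Q22"] = q22
        data = data.dropna()                        # complete cases only
        X = sm.add_constant(data.drop(columns="Q22"))
        return sm.OLS(data["Q22"], X).fit()

    # model = overall_satisfaction_model(scale_scores(df), df["Q22"])
    # model.params     # coefficients: relative impact of each scale on Q22
    # model.bse        # standard errors, which multicollinearity can inflate
    # model.rsquared   # proportion of variance in Q22 explained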

Please see Section 1.1 for further guidance on the presentation of statistical significance in this report.

4 Mean scale scores can oversimplify Likert scale categories, and have only been used in this report where necessary – to undertake correlations and multiple regression analyses.

Appendix B: Information about the NSS

The NSS is a survey of final-year students on undergraduate programmes. It is compulsory for publicly funded HE providers in England, Wales and Northern Ireland, and some Scottish institutions take part on a voluntary basis 5. Ipsos MORI administers the survey on behalf of HEFCE, contacting all eligible students using a variety of methods (including email and telephone). The survey was introduced in 2005; in 2011, 154 HEIs and 99 FECs took part and 265,000 students responded – an overall response rate of 65%.

NSS data are currently available primarily from the Unistats website ( http://unistats.direct.gov.uk ), which allows visitors to compare overall satisfaction results at course and institutional level, as well as download spreadsheets with more comprehensive information. In addition HEFCE releases headline figures, as well as annual reports providing national-level analysis. From September 2012, course-level NSS data will be incorporated into Key Information Sets, which will be available on a new central website and on institutional websites through a ‘widget’.

For reasons of reliability and confidentiality, the threshold for public reportability of the results is a minimum of 23 responses, which must also represent at least 50% of the eligible students. Where there are fewer than 23 responses, responses from more than one year, or from across different courses, can be aggregated to produce publicly reportable data.
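
As a small illustration of this rule, the hypothetical function below checks whether a set of results meets the public reportability threshold described above; where it returns False, aggregation across years or courses would be applied before publication.

    def publicly_reportable(responses: int, eligible: int) -> bool:
        """At least 23 responses, which must also be at least 50% of the
        eligible students."""
        return responses >= 23 and responses >= 0.5 * eligible

    print(publicly_reportable(25, 60))   # False: 25 responses is under 50% of 60
    print(publicly_reportable(30, 50))   # True: meets both conditions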

In addition to the public availability of the data, institutions receive their own data at a more detailed subject level. The reportability threshold for the data that institutions receive is ten responses, rather than 23. Data at the individual student level are also available for researchers on application to HEFCE. Data at that level have been used in this report.

The NSS is based to a significant extent on the Course Experience Questionnaire (CEQ), which has been in use in Australia since 1993. There has been a significant amount of research on the CEQ, and a more limited amount on the NSS, and this research indicates that the two surveys are both reliable – they yield consistent and repeatable data – and valid – they measure what they purport to measure.

The NSS asks participants to rate their level of agreement with 22 positive statements, on a five-point scale (in addition to ‘not applicable’): definitely disagree; mostly disagree; neither agree nor disagree; mostly agree; definitely agree. The statements are grouped into six areas, or ‘scales’, plus the overall statement: quality of teaching and learning; assessment and feedback; academic support; organisation and management; learning resources; personal development.

As well as asking participants to rate their agreement with the 22 statements, the survey also invites them to add free-text comments about particular positive or negative aspects of their experience. Institutions can choose to use a bank of optional statements in addition to the 22 core statements; responses to the optional statements are not publicly reported.

5 Fourteen Scottish institutions took part in 2011.

Appendix C: Core NSS items

The teaching on my course

1. Staff are good at explaining things
2. Staff have made the subject interesting
3. Staff are enthusiastic about what they are teaching
4. The course is intellectually stimulating

Assessment and feedback

5. The criteria used in marking have been clear in advance
6. Assessment arrangements and marking have been fair
7. Feedback on my work has been prompt
8. I have received detailed comments on my work
9. Feedback on my work has helped me clarify things I did not understand

Academic support

10. I have received sufficient advice and support with my studies
11. I have been able to contact staff when I needed to
12. Good advice was available when I needed to make study choices

Organisation and management

13. The timetable works efficiently as far as my activities are concerned
14. Any changes in the course or teaching have been communicated effectively
15. The course is well organised and is running smoothly

Learning resources

16. The library resources and services are good enough for my needs
17. I have been able to access general IT resources when I needed to
18. I have been able to access specialised equipment, facilities, or rooms when I needed to

Personal development

19. The course has helped me to present myself with confidence
20. My communication skills have improved
21. As a result of the course, I feel confident in tackling unfamiliar problems

Overall satisfaction

22. Overall, I am satisfied with the quality of the course
