Students' Expectations and Experiences of the Digital Environment
Institutional interviews
Method

An email 'interview' was conducted with respondents identified through email discussion lists and personal contact. Respondents were staff – in a variety of roles and institutional locations – with responsibility for collating information about students' digital experiences and expectations. They were based at 12 institutions, one in Scotland and the rest in England, comprising five pre-92 and six post-92 universities and one specialist postgraduate institution. Because of the methods of contact, the sample is almost certainly biased towards institutions that are forward-thinking in their engagement with students around digital issues: responses are indicative of the range of activities currently undertaken at UK HEIs rather than the norm.
Respondents were asked about the situation at their institution as a whole and in some cases this entailed a degree of fact-finding with colleagues in other services. However, the location of respondents may have influenced their perspective. For example, respondents usually reported that their own service (typically e-learning or IT) was responsible for the majority of current activity. This may indicate that our contact methods were successful in reaching the sites where this work is being initiated, or may indicate that activities are distributed within institutions and the results are not being shared effectively across all the relevant services.
Findings are broken down by question.
Qu 1: How, if at all, does your university collect information about students' expectations and experiences of the digital environment?
All respondents reported conducting surveys, mostly online or a mix of online and paper, though one institution gave the survey on tablets to students waiting for registration and achieved better responses with this approach. Arriving students were separately surveyed at four institutions; others surveyed all enrolled students, sometimes, in the case of large-scale surveys, broken down by year and/or school of study. Annual or biennial surveys were the norm.
Most used at least one other method of collecting information. These other methods were:

- Analysis of data from more general surveys e.g. NSS/USS/PTES/ISB etc. This was typically by keyword searching of comments, but in one case 'we performed a random-forest analysis on the NSS to identify the questions of most significance for predicting overall satisfaction and used a minimum set of these to gather quantitative data but also to act as prompts so that students might provide a free text explanation of the scores they’d awarded' (an illustrative sketch of this kind of analysis follows this list).
- Module evaluations, sometimes triangulated with NSS scores
- Focus groups and interviews, often following up issues identified in surveys as above
- Information-gathering around project work e.g. externally funded by JISC, or internal initiatives including personal initiatives of teaching and professional staff
- Analysis of VLE stats and other learner data (though NB only one institution was cross-referencing this data with other measures of student experience and satisfaction)
- Student engagement and representation (course and faculty reps, library and IT user groups). For example: 'The Faculty Forums have stands from all central services... where student reps can raise issues or discuss problems or simply stick a post-it on the wall.' 'The results of [a recent Student Union/Guild] consultation are already feeding into the institutional e-learning strategy and implementation plan, resulting in a drive towards minimum VLE requirements for modules across the university.'
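The random-forest approach quoted above can be sketched in a few lines. The following is an illustrative example only, not the respondent's actual analysis: the file and column names (responses.csv, Q-prefixed question columns, overall_satisfaction) are hypothetical placeholders, and pandas and scikit-learn are assumed to be available.

```python
# Illustrative sketch: rank survey questions by how strongly they predict
# overall satisfaction, then keep a 'minimum set' of the top items.
# Assumed layout: one row per student, Likert scores in columns Q1..Qn.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

df = pd.read_csv("responses.csv")  # hypothetical export of survey scores
question_cols = [c for c in df.columns if c.startswith("Q")]

model = RandomForestRegressor(n_estimators=500, random_state=0)
model.fit(df[question_cols], df["overall_satisfaction"])

# Feature importances approximate each question's predictive weight;
# the top-ranked questions could form the reduced quantitative set and
# double as prompts for free-text explanations.
importance = pd.Series(model.feature_importances_, index=question_cols)
print(importance.sort_values(ascending=False).head(10))
```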
Several respondents noted a high level of survey 'fatigue' among students, and although incentives such as printer credits have been used to improve response rates, there are difficulties obtaining thoughtful responses from a heavily-surveyed population through survey methods alone.

'internal survey data is triangulated with demographic, success and engagement data to establish a broader understanding of our students’ experiences and expectations'

'the survey contagion has reached us as well. This isn't necessarily negative, so long as it sits alongside... other approaches to student engagement.'
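The triangulation described in the first quote amounts to joining survey responses against student records and VLE activity so that satisfaction can be read alongside background and usage. The following is a minimal sketch under assumed file and column names (student_id, vle_logins, year_of_study, overall_satisfaction), not any institution's actual pipeline.

```python
# Minimal sketch: join survey responses to demographic and engagement data.
# All file and column names are hypothetical.
import pandas as pd

survey = pd.read_csv("survey_responses.csv")   # one row per respondent
records = pd.read_csv("student_records.csv")   # demographics, course, year
vle = pd.read_csv("vle_activity.csv")          # logins, downloads, posts

combined = (survey
            .merge(records, on="student_id", how="left")
            .merge(vle, on="student_id", how="left"))

# Example cut: mean satisfaction by year of study and quartile of VLE logins.
combined["activity_band"] = pd.qcut(combined["vle_logins"], 4,
                                    labels=["low", "mid-low", "mid-high", "high"])
summary = (combined
           .groupby(["year_of_study", "activity_band"], observed=True)
           ["overall_satisfaction"].mean())
print(summary)
```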
There is no consensus on whether digital issues should be addressed through specialist surveys or through general data-collection relating to students' experiences and satisfaction. One respondent cited no fewer than six current surveys or data-collection initiatives into which questions about digital issues had been successfully embedded. Another described how: 'we use the NSS, module evaluations and the Student Barometer to collect student expectations and experiences, but since there are no questions explicitly about digital literacy in these, [we have] decided to run a pilot of the NSSE from March 2014, which will include [specific questions about] digital literacy.'
Two respondents had used or adapted a survey instrument developed by the JISC (originally for the 'Learners' Experiences of e-Learning' programme) and several others commented that it would be useful to have a shared survey instrument – even if adapted per institution – to enable comparison, and to ensure good practice in survey design and testing.
Qu 2: What kind of information is collected (e.g. broadly what kind of questions are asked)?

There was considerable consensus about the kinds of questions that were asked in surveys, though analysis of the actual survey instruments (where available) shows differences in the phrasing and structuring of question sets which make cross-comparisons problematic. Again this is an area where shared question sets were considered helpful by some.
In descending order, questions focused on:

- Satisfaction with current provision/expectations of future provision (e.g. 'if they find the VLE useful', 'what they think of IS services')
- Personal access, ownership and use of digital devices and services (e.g. 'computers, the web, smart phones and tablets')
- Personal practices/habits with technology (e.g. social media use, use of personal technologies for study)
- Preferences/perceptions/attitudes to technology in learning (e.g. '[this year we have focused] entirely on their experiences and attitudes towards using their own devices on campus to support their learning and their attitudes towards social media')
Several respondents expressed frustration with methods that asked only about satisfaction with existing services, e.g. with Likert-type scoring. In these and other cases there was evidence of more open-ended questions being asked, for example about 'perceptions, and suggestions of what makes for a good e-learning environment', 'what students want from technology to enhance their learning', and 'what kinds of technologies they might use for particular tasks'. Of course, such questions required considerably more time and expertise to design, administer and analyse/interpret effectively.
Emerging areas of interest were the use of mobile devices in learning contexts, and what is broadly termed 'digital literacy':

[next year's] pilot will have three specific questions about digital literacy, which can be paraphrased roughly thus:
- How often have your assessments asked you to demonstrate how you use technology to support your studies in other ways than those required by the teacher?
- How often have you used technology to collaborate with others or engage in online communities as part of your studies?
- How often have you used technology to reflect on or record your learning?
Two respondents noted that these activities are not only used to gather information but also to convey information about the services and facilities available to students, and to support student engagement more generally.
Qu 3: Who (role/department/service) is responsible for collecting this information? Is it a one-off activity or embedded into ongoing practice?
The activities reported by respondents were carried out by a variety of institutional teams. In some cases several activities were carried out by different teams, and/or responsibility was shared. In descending order of frequency, the teams responsible were:
- Learning technology/e-learning (6)
- IT/IS (4)
- Learning and teaching, educational development (or similar) (3)
- Specialist digital literacies project or working group (2)
- Corporate/academic affairs (2)
- Research information unit (1)
- Student guild (1)
- Quality team (1)
As might be anticipated, different kinds of information were gathered by these different teams: 'if it is about expectations and experiences of digital environment in the context of LEARNING it will be us [educational development]'.
The majority of surveys had become, or were in the process of becoming, embedded in the core responsibilities of one of these teams (with start dates ranging from 2004 to 2013). At one institution survey activity had not been sustained: 'I don't think any of the studies are embedded in ongoing practice - although I argued that they should be as part of the commitment to digital literacy.'
While surveys and representative forms of student engagement seem to be relatively well embedded – at these institutions at least – other activities are far more likely to be occasional or one-off. At one institution, a recommendation to carry out qualitative data collection on an annual basis was rejected despite evidence that it provided a rich picture of students' practices with technology that could support curriculum change. The failure to sustain qualitative methods may reflect the greater investment of time and expertise required.
Qu 4: What happens with this information (e.g. does it inform decisions, are issues raised by students responded to)?
Responses to this question were heartening. First, there seems to be a strong connection between information gathering and positive change: at only one institution was it reported that information had not been distributed or leveraged effectively. And second, information is used to influence change in various and sometimes creative ways. Respondents reported (in descending order) that it is:
- Fed into high-level strategies and policies
- Used to inform service improvement e.g. IT support, library
- Provided to programme leaders to support curriculum change (where data is analysed on a course-by-course basis)
- Published to students, showing how issues have been responded to
- Used in staff training/development (e.g. to share best practice)
- Published more widely e.g. in academic papers
- Used to make a case e.g. for particular IT provision or upgrade projects
'The outcomes have been used:
- To lobby Estates for additional space for open access computers
- To prioritise major projects (for instance wifi in halls of residence; upgrade the student printing service; introduce laptops for loan)'
We also use the outcomes and comments segmented at faculty level when discussing local e-learning plans with faculty and departmental education leads.
Qu 5: How did this process/practice originate (e.g. was it with a special project or initiative)? How is it evolving/changing?

Responses to this question did not suggest any clear pattern. However, the most common driver for gathering information on students' digital experiences was a more general student voice/student experience agenda or an initiative of the students' union/student guild (x5). In five cases the practice had definitely been brought into the institution from elsewhere, either another institution (x2) or in response to a JISC programme (x3). Internal initiatives mentioned as factors were academic quality and curriculum enhancement, digital literacy projects, and assessment change.

'[projects were] originated through practitioners' curiosity and desire to improve students' experience and learning - but continuation is always difficult, as staff time needs to be paid for out of externally provided funds'
The survey landscape is certainly evolving. What begins as the brainchild of someone in IT services, the library or e-learning becomes an annual event involving several services and dedicated team members' time: 'in early years it was me going round battering on colleagues' doors [but now] the responsibility for making change is shared'. There is, however, a strong sense of 'survey fatigue', expressed through diminishing returns of thoroughness and thoughtfulness from those who do respond, and strict limits on who can survey students during the academic year. Alternatives to surveys are emerging, involving smaller numbers of students thinking more deeply about the issues: email interviews, focus groups, software for crowdsourcing ideas, vox pops and pop-up student engagement events. Even among these relatively forward-thinking participants, however, there were concerns about using these techniques effectively, and about how persuasive the outcomes would be to decision-makers. There was interest from several in the possibility of sharing these experiences more widely: 'we would be very interested in a sector-wide approach along the lines of the ECAR one.'
Comments on student expectations

We’re in the middle of Freshers’ week here, and I’ve already had students ask about Facebook groups for their courses... I also notice that they expect a Moodle course to be set up for each subject.
Students have expected ubiquitous computing access and wifi/mobile for at least the last 5 years... The lack of provision is more of a let-down when the expectation becomes an assumption.
We're expecting a lot more BYOD and a lot more demand for lecture capture – but need to educate students about their own obligations in HE. They seem likely to express a demand for digital spoon-feeding which we need to counter.
We have a gap between institutional software and student software.
Students remain frustrated by the lack of open access computers and lack of space for them to work using their own computers.
Students are pretty conservative and are largely happy with the use of the VLE as a file store and delighted if lecture recordings are available. Live streaming and recording of classes are the top requests for e-learning development. The usage of tools such as wikis, blogs, e-portfolios etc in teaching is limited.
Comments on good practice in gathering information/engaging students
Engage with schools, and school children - ask them what they expect.
Know your own institution first, then look at others.
Personally, I would suggest taking the time to ask the students how they want to approach a digital task, what experience they have, and then how this aligns with the vision of what the academic wishes to achieve.
Do we spend enough time trying to understand what 'digital knowledge/skills/expectations/requirements' our students are entering university with? What is actually driving the shape of a university digital ecosystem?
Information gathering should go beyond access and satisfaction rates; such data collection needs to include a wider snapshot of student experiences – and needs to be an ongoing process, not a one-off.
The university is making increasing use of Twitter to be alert to students’ reactions to its initiatives.
I'm of the opinion that survey tools like this will be relatively unhelpful in characterising digital literacies in disciplines. Richer qualitative data will be needed for this. [T]he fact that we have already mapped digital literacies onto programme outcome specifications offers an interesting opportunity to look for correlations between the two sets of data. On the negative side, we are already seeing digital literacies becoming mixed up with skills, the measurement and training of which are comparatively easy; over-emphasis on surveys may hasten this trend.
My position is that we need to increase our efforts to engage students and staff in … partnerships in order for the University collectively to gain greater insight into the nature of digital literacy in practice.