Higher education experiences fads, some of which wane unlamented. Of late, one of the most ubiquitous buzzwords is “student engagement”. From funding councils, the Quality Assurance Agency and the National Union of Students to institutions and teaching development units, everyone is pressing the “student engagement” button. However, a bit like the term “student-centred learning”, student engagement is now used to refer to so many different things that it is difficult to keep track of what people are actually talking about.
Small campus-based universities with full-time resident students are in a much better position to stop people dropping out than are inner-city universities with large classes, dispersed buildings and a large number of part-time students living at home. Decades ago, Vincent Tinto, the education theorist, described what needed to be done to improve retention as “academic and social integration”. Explanations of this phenomenon have been largely sociological rather than pedagogical, and only recently has the term “engagement” been used to refer to it.
Then came the eminent educationalist Alexander Astin’s studies on what specific aspects of the years spent at college made any difference to what students learned. Repeated in subsequent decades, the studies consistently identified the same teaching, learning and assessment practices that predict learning gains, such as close contact with teachers, prompt feedback, clear and high expectations, collaborative learning and “time on task”.
What these practices have in common is that they engender “engagement”, and it is student engagement that has been found to predict learning gains, not such variables as class contact hours or research prowess. Here, “engagement” is about students’ engagement with their studies, not with their social group or their institution, and the focus is clearly on pedagogic practices.
The key variables that make up engagement have been captured in a questionnaire, the National Survey of Student Engagement (NSSE), that is used throughout the US and in various forms round the world. When lecturers in the US refer to “engagement”, this has come to mean “scores on the NSSE”, just as National Student Survey (NSS) scores have become the currency of quality in the UK. The Higher Education Academy has piloted its own short version of the NSSE for use in the UK. Hopefully, valid questions will eventually find their way into the NSS, because engagement is a better indicator of educational quality than “satisfaction”.
Once you have a measure as widely used as the NSSE, it is possible to spot which methods are being used in situations in which students’ engagement with their studies is found to be especially high (or low).
More engaged all round
George Kuh, professor of higher education at Indiana University Bloomington, has summarised this research for the benefit of administrators, pointing out that students who are more engaged with their studies are also more engaged with their institution’s governance, with volunteering, with student activities, and so on.
The assumption has developed that if you can engage students outside of the curriculum then they will also be more engaged inside the curriculum.
Sometimes this assumption rests on a misunderstanding of statistics: some students are engaged in pretty much everything while others are engaged in nothing, so what is being described may be differences between students rather than between courses or institutions. Nevertheless, there is a growing belief underlying some efforts in the UK that engaging students outside the curriculum will cause engagement with their studies, rather than simply compete for students’ time.
At the same time, a good deal of effort is being put into making student engagement in quality assurance work better.
For example, at Coventry University, students administer and interpret student feedback questionnaires themselves. At the University of Winchester, one student per degree programme per year is given a bursary to undertake an educational evaluation that feeds in to the annual course review. Whether this has positive impacts on engagement with studies, or on learning gains, is yet to be demonstrated unambiguously, although it probably develops the employability skills of those few students who are course reps or evaluators.
Teaching the teachers
Some institutions have become much more radical and have involved students not just in spotting quality problems, but in solving them. For example, one student per department at the University of Exeter was hired and trained to teach their lecturers how to use the open-source learning platform Moodle, transforming the use of technology in teaching.
The race is on to find new ways to employ students as change agents, and this too is being described as student engagement.
There is some evidence that universities with more developed student engagement mechanisms, of one kind or another, are improving their NSS scores faster than others. The difficulty in interpreting this evidence is that student engagement means different things at different institutions, and those institutions that are serious about student engagement are probably serious about all kinds of other quality enhancement mechanisms at the same time.
For decades students have also been co-opted into teaching and assessment roles as a part of educational innovations, for example through self and peer assessment, peer tutoring and peer mentoring. Many such practices used to be called student-centred learning. Some of these “engagement with teaching” practices have been reified into formal systems (eg, Supplemental Instruction), marketed all over the world and researched in detail.
In summary, you can often produce quite large educational gains when students do for themselves and for each other what teachers previously did for them – and all this is often free. The University of Lincoln even engages students in developing the curriculum.
Finally, some institutions focus primarily on engaging students with research. For example, the Massachusetts Institute of Technology provides internships in real research groups for 80 per cent of its undergraduates, and this has measurable impacts on students’ aspirations to be researchers, among other benefits.
When the Higher Education Funding Council for England, the QAA, the HEA or the NUS sponsor student engagement initiatives, it is to be hoped that, first, they are clear about what they are referring to and, second, they make use of the extensive research findings already available on the many varied forms of this often opaque term.