ChatGPT’s potential to enhance student skills must not be dismissed

University educators may not fully understand generative AI or its long-term impact on society, but they must seek to integrate the technology into degree courses, says Stephanie Marshall

September 2, 2023

Those worried about ChatGPT’s potential to undermine university teaching certainly have good cause for their concern. But educators should not close their minds to the vast potential that AI can offer to higher education.

From AI tutors and virtual assistants as teachers to AI-generated insights for professional services and AI-powered marketing content for student recruitment, there are numerous ways that higher education might harness technology to improve student engagement and outcomes. Yet the same tools, if deployed without careful thought, could leave students disengaged and disillusioned as technology replaces in-person interaction.

Discussions on how to frame university responses to AI in the UK have tended to revolve around, at one end of the spectrum, wholeheartedly adopting and embracing digital technology, or, at the other, stubbornly resisting change. In today’s educational discourse, the challenges of incorporating technology into education tend to be oversimplified, reduced to a crude list of its beneficial and harmful uses.

The fundamental question in considering appropriate uses of AI is far more complex: how can we embrace digital innovations while taking their psychological, social, cultural, economic and political impacts into account?


Some argue it will take higher education institutions at least 10 years to address the challenges posed by AI. That timeline may be an exaggeration, but it is undeniable that most universities have been caught off guard and unprepared. As the vice-principal responsible for oversight of education at Queen Mary University of London, I believe we must reflect on the full potential impact of embracing AI alongside the ethical issues it presents.

The confusion around AI today, especially within universities, is a result of its inherent complexity and unintelligibility. The more complex a tool is, the harder it is to frame policies to guide its uses and shape it for our purposes. The challenge facing education practitioners is not so much how to use AI, but how to integrate a technology that most of us have not mastered.


A second concern is that few know what is gained or lost as we become more dependent on these technologies. However, most appreciate that graduating students will need the ability to engage creatively and intelligently with this rapidly evolving technological environment. For this to happen, students need to acquire a certain degree of mastery over the various technological tools and to understand the broad impact of such developments on individuals and societies. This requires universities to deploy appropriate scaffolding approaches in both teaching and research.

There are several ideas universities could consider as viable options. First, students should have the technical literacy required to read and write code. Offering this through optional courses, for example, would let them learn in a more relaxed environment alongside the core parts of their degrees. While not everyone will pick up these skills easily, offering them widely will be crucial to ensuring they are not limited to a few elite sections of society.

Second, it will not be enough to have individuals who can read and write code. We need to adopt an interdisciplinary approach. The humanities and social sciences are particularly important here, so that ethicists, lawyers and policy experts, to name but a few, can offer their perspectives. It’s not just a case of designers and users: intelligent deployment of AI must match our vision of a well-informed society.

Third, if we start to use AI to tutor students routinely, we need to consider AI’s potential impact on our analytical, imaginative and interpretive capacities. Our ambition must be for AI to improve these capacities in our students, not render them obsolete. Hence the importance of an interdisciplinary, scaffolded approach to developing the higher-level skills that AI affords.

The above list of ideas undoubtedly can be refined and supplemented, but my hope is that our solutions to problems of education fully address both the positive and negative aspects of AI. AI should help us to clarify our vision of the society we wish to create and support universities as institutions committed to embracing diverse ways of living and thinking, rather than just becoming a shortcut to an easier working life.

Stephanie Marshall is vice-principal (education) at Queen Mary University of London.

Reader's comments (1)

Give a child a toy hammer and they’ll go looking for toy nails. Further, they will fight to convince their friends that a hammer is the best possible toy because you can whack so many nails. The cultured or ideally educated person is not necessarily one who is well read or learned, but one who likes and dislikes the right things. To know what to walk away from is to have taste in knowledge.
