Does the UK need medical schools?

Universities’ nominal control of undergraduate medical training belies the fractured reality, says Jonathan Rees

May 20, 2021

Until the late 1950s, medical graduates in the UK were supposed to be fit for independent general practice from the day of graduation.

Our task in modern medical schools is in many ways much easier. Doctors can now practise independently and unsupervised in any meaningful way only once they have postgraduate specialist qualifications, obtained five years or more after their primary medical degree. Medical schools are just the start.

Yet they increasingly look out of time. Their ability to serve three different purposes simultaneously – educating doctors, providing clinical leadership and conducting medical research – is being undermined by rising tensions between those roles.

Academic clinicians in the UK used to be contractually expected to spend half their time in clinical practice – although some spent more, some less. As with musical skills, medical skills can atrophy as well as grow. A certificate of competence from yesteryear does not obviate the need for continued daily practice and interaction. In medicine and surgery, that means seeing and treating patients.

That is especially important in the modern world, where clinical medicine is more complex and demanding – not to mention (for good reasons) more accountable. However, clinical academics now make up only a small percentage of doctors. This reflects the value universities place on research, and the associated institutional league-table neuroses. It also reflects funding formulas: universities are not funded to deliver patient care or lead a clinical service, so allowing some of their highest-paid staff to spend half their time doing so for another organisation makes little corporate sense.

Clinical academics also flee the lecture theatres for their labs. They know that involvement in undergraduate teaching – and its administration – early in their careers does not pay off. Better to concentrate on research, or on the postgraduate taught courses that bring in additional revenue.

Explaining to non-medical colleagues how clinical undergraduate medical education in the UK is organised usually meets with polite scepticism. The medical school chooses whom to admit and provides most of the initial teaching. But after the first two years, things change. The bulk of teaching – perhaps over 90 per cent – is then delivered by NHS staff who have no contract with, or even meaningful contact with, the university.

At one time, most teaching was located within a largely monogamous partner hospital. But many medical schools now admit 10 times as many students as the archetypal London medical school, on which the model was founded. Students from several medical schools, with different structures, may mingle at far-flung hospitals or clinics, raising the issue of what any single university contributes. Add to that the rise of specialisation and the pressure on doctors’ time and a modern medical student might be “taught” by 500 people during their clinical years, in fleeting, anonymous rotations across scores of sites.

This quasi-outsourcing model of teaching, with the medical school acting as purchaser and the NHS as provider, has allowed many medical schools, particularly the newer ones, to dispense with most of their full-time academic staff. They simply do not possess clinical academic expertise across the breadth of medicine.

NHS staff have noticed these shifts. The challenge to their professional status and autonomy by the growing ethos of medicine as a service industry has sensitised them to the fact that their relationship with their academic colleagues is increasingly instrumental rather than collegial. Besides, everybody – including the students – knows that meeting NHS waiting-list targets and treating the emergency patients waiting on trolleys trump teaching medical students. So the students get a raw deal, too.

In much of Europe, medical schools have never claimed to deliver the immersive apprentice model of UK medical education, and their graduates’ immediate employment roles reflect this. In the US, meanwhile, medicine is a graduate-only course, and medical schools are explicitly in the business of providing clinical care, so their priorities are not so split. However, I doubt that anybody in the UK would seriously contemplate all would-be UK doctors paying for both undergraduate and graduate medical degrees.

Let us reflect on the fact that those wishing to become hospital specialists train on the job at a single department (or across a few geographically linked departments), but with teaching and certification supervised at the UK level by the Royal Colleges. We happily accept that universities could not perform either function.

We should also recall the now largely historical Oxbridge “3+3” model, whereby three years of university education in the basics and fundamentals was followed by three years’ immersion in a hospital, usually in London. The students were not always paid as apprentices, but they were certainly treated as such.

I suggest that we likewise move the second “3” of the primary medical qualification – which is now a faux apprenticeship – to where it belongs: the workplace. Trainees should graduate, then learn and work as paid employees before obtaining professional certification, as lawyers and accountants do. Given that NHS funding already supports clinical placements, this would probably cost less overall than the current system.

Meanwhile, medical schools would be liberated from the constraints of professional certification and the associated pretence that they have full control of the clinical experience. Their core function would be the age-old one of providing the foundational knowledge and intellectual skills that equip graduates to make sense of – and grow within – professional practice. While all students would take core modules on physiology, cell biology and public health, we could offer options in medical history and innovation, in how law and economics bear on healthcare, and in the nature of evidential claims and statistics. This would allow us to produce graduates who bring diverse intellectual qualities to the “trade” aspects of their profession.

Finally, this readjustment would open up medicine to those who do not want to practise clinically. It would allow medicine, as an area of contemporary thought and scholarship, to diffuse more widely. After all, medicine is too important to be left just to doctors.

Jonathan Rees is an emeritus professor of dermatology at the University of Edinburgh.
