Past mistakes

Whatever the genuine lessons of history, policymakers constantly make opportunistic use of the past to justify their decisions. Matthew Reisz introduces a team of historians who are fighting back against the 'Bad History' all around us

October 15, 2009

Like everybody else, historians disagree violently about "the lessons of history". Some think there aren't any. And even among those who believe that the past is clearly relevant to the present, many are scrupulous about letting other people draw their own moral lessons. Others are happy to state, and underline, what the lessons are.

Take Vic Gatrell's City of Laughter: Sex and Satire in Eighteenth-Century London (2006), which won the prestigious Wolfson History Prize. The book is a study of the satirical prints, many of them gleefully lavatorial or obscene, that poured from the presses in the late 18th century.

They eventually faded away around 1820 as "respectability" set in: in Lord Byron's words, "the age of cant" replaced "the age of cunt". As the author notes, it is an intriguing, perhaps significant if little-remarked fact that "no Victorian produced an image of Queen Victoria farting".

It would be possible to tell this story in fairly neutral terms. We could enjoy the social history, the dirty pictures and Gatrell's expert elucidation of their imagery, while remaining free to decide for ourselves whether the shift in sensibility he describes was a good thing, a bad thing, a mixed blessing or a matter of complete indifference to us.

But Gatrell, professor of history at the University of Essex, doesn't go in for such neutrality. He constantly buttonholes his readers, celebrating the prints' scenes of brawling, drunkenness and low-life pleasure, and launching broadsides against piety, puritanism and political correctness.

He makes it abundantly clear that he believes the attitude of total disrespect towards authority is something we should learn from. His subject may sound fairly obscure, but he is not going to let us forget that it has huge implications for a number of ongoing debates.

Many historians, of course, explore topics far more obviously contentious and emotionally charged than late-18th-century satire. So how far do they see themselves as directly useful, offering us insights that can help us face contemporary challenges and lead better lives?

Jonathan Phillips, professor of Crusading history at Royal Holloway, University of London, is wary of drawing facile parallels with - or citing the past's lessons for - today's Middle East. Indeed, he believes that studying the period may help us understand "the sheer complexity of the region, then as now".

"It is far too simplistic to see the Crusades as a battle between Christians or the West and Muslims, since there were Christians fighting Christians, Muslims fighting Muslims, alliances across the religious divide - and the Greek Orthodox Church was always opposed to Crusading," he says.

But although they have to embrace complexity, Phillips adds, historians must also accept and be sensitive to the fact that "for much of the Muslim world, the Crusades have acquired a toxic meaning as part of a historical continuum - of Westerners invading, killing and conquering, as they were to do again in colonial times".

"Policymakers have come to realise that something serious underlay the Islamic response to George W. Bush's unthinking use of the word 'crusade'," he says. "Tony Blair learnt the lesson and made it very clear to a radio interviewer that he didn't want to be described even as 'a crusader for social justice'."

Miri Rubin is professor of medieval and early modern history at Queen Mary, University of London, and recently published Mother of God: A History of the Virgin Mary.

That past and present always interact is inevitable, she says, and "ought to be acknowledged and addressed".

"Studying other times and places is not a search for 'rules' or 'formulae of historical dynamics' - although patterns can be discerned - but equips us with cases of human action that offer alternatives or critiques of the present ways of dealing with fundamentally similar challenges and aspirations: for safety, for support, for friendship, for order, for understanding beginnings and ends."

There are also a number of specific areas where medievalists can shed light on current conflicts and dilemmas, she adds.

"I find people have real misconceptions about issues of religious prejudice and the sometimes-related violence," Rubin explains. "People think of pre-modern Europe as a place where crowds - the mob, the unlettered - took to the streets in religious violence, especially against Jews.

"Such behaviour is invoked in our own times as 'medieval' and people who do such things - in the Balkans in the 1990s, for example - are deemed to be 'throwbacks' to another time. In this manner, they are classed as 'aberrant', and so can be bracketed and put aside.

"The truth is that, then as now, violence in the streets is inspired by key actors, who act knowingly, and who are informed and often linked up with privileged access to media. There are agents provocateurs - preachers, journalists, politicians - who endorse behaviour by those who respect their authority. So, rather than the product of 'ignorance' or 'age-old hatred', responsibility for violence ought to be identified along lines of communication and excitation."

Ruth Harris, lecturer in modern history at the University of Oxford, has also dealt with issues of religion and intolerance, both in her book Lourdes: Body and Spirit in the Secular Age (1999), about the famed pilgrimage site, and in a forthcoming study of the Dreyfus affair in late 19th-century France, when a Jewish army officer was convicted of treason on blatantly spurious grounds.

On one level, Harris says, "there is no direct application of historical research to the tasks of policymakers. Occasionally, a history of banking or recent foreign policy might provide easy, transparent lessons. But the 'big' lessons are harder to extract."

Some of the lessons people want to draw from history are so obvious that they hardly require detailed analysis or spelling out at length. "In my own work on the Dreyfus affair," Harris continues, "many people might think that the real lesson is about the iniquity of anti-Semitism and the need to combat it. There is no doubt that this is the case, but this is hardly a difficult lesson in a post-Holocaust world.

"What is more illuminating - and perhaps useful - is to go beneath the legend of the affair and see how campaigners for Dreyfus were ready to battle intolerance with intolerance - mainly by imposing an idea of laicite, or separation of Church and State, which sought to erase all expressions of religious belief and symbolism from the public arena.

"Even today, many French policymakers insist that the doctrine of laicite is unchanging and essential for a tolerant society. They find it difficult to consider its darker side, let alone reshape it for a nation with a large Muslim community. For me, the Dreyfus affair's history provides important insights into how such assumptions developed, but such a line is not an easy one either for a historian to teach or for a policymaker to learn."

Harris argues that "history has a role in suggesting to policymakers that they should continually question their own assumptions". The issues become particularly fraught in relation to national histories, and the place of historians in shoring them up or calling them into question.

Stefan Berger is professor of modern German and comparative European history at the University of Manchester. He followed closely and sceptically the plans to create a new German "master narrative" after reunification, and later became director of a major research project for the European Science Foundation, "Representations of the Past: The Writing of National Histories in Europe", which brought together 150 scholars from 28 countries. He was recently invited to speak at the House of Commons on "how the teaching of history can help overcome mistrust and conflict between nations".

"It is still widespread to anchor national identity in history," Berger says, "so citizenship tests include a large dose of history - although I would tend to avoid this. In even the most inclusive national histories, a 'them' is necessary in order to define an 'us'; an 'other' who can easily become an enemy."

Although Berger admits that "the British Whiggish tradition" stressing tolerance, liberty and common democratic values may sound like a more benign form of national identity than many others, he argues that it was only partially so in practice.

"In the early 19th century, it went with a duty to bring those benefits to the Welsh and the Scots - and later to 'civilise' much of the world," he says.

Nor does he seem worried about what would happen "if we no longer believe that citizenship has to be based on a common sense of history. Do immigrants really need to know about William the Conqueror in order for the whole community to coexist?

"History at school level doesn't have to be the history of the nation. It can teach lessons about human behaviour, tolerance and the dangers of intolerance - so it might be better to deconstruct notions of national identity. Teaching about Nazism already has this kind of role."

How effective is such teaching? Dan Stone, professor of modern history at Royal Holloway, University of London, has devoted much of his career to studying the Holocaust. While he acknowledges a moral dimension to such work, he is wary of "lessons" or the claim that people who have studied Nazism at school or university automatically become more decent or less prejudiced.

"I teach the Holocaust", he says, "because I think it is reasonable that any thinking person should be astounded that such a thing happened, and I want (and expect my students to want) to know how and why it happened. But I don't believe that teaching it makes my students nicer, more tolerant people.

"The assumption is usually made that by teaching kids between the ages of about ten and 14 about the Holocaust, we will produce nicer, more caring people. But how do we know? There is plenty of evidence that kids of 13 and 14 break out into embarrassed giggles when presented with atrocity photographs. How do we know that it doesn't make them cynical, or lead them to believe that certain groups are innate victims?

"I don't believe there are 'lessons' to be learnt from the Holocaust. If we want to teach our children that racism is bad and that they should be nice people, there are less brutal ways to do it."

So history may (or may not) be useful in all these different and complex ways. But now we turn to an area where historians can certainly make a difference - exposing some of the daft arguments that government ministers and their critics constantly come up with. In the spirit of Ben Goldacre's celebrated "Bad Science" column in The Guardian, we are delighted to launch "Bad History".

The parallel is pretty clear. Bad science can be pernicious, even lethal. If someone believes a disease is caused by gamma rays from the planet Zog, this is not merely silly or inaccurate. Effective medical interventions depend on defining what has gone wrong and why.

Bad diagnoses - attributing symptoms to causes that are vague or non-existent - can easily lead to treatments that are pointless, costly or dangerous. So anyone who exposes scientific illiteracy or mumbo-jumbo is performing an important public service.

Exactly the same thing applies to diseases of the body politic. Pundits and politicians often tell us that something has or has not worked in the past, or that we need to "get back" to something better, be it "traditional family values", independent-minded backbenchers or increased social mobility. "History shows", we are told, that a particular approach will reliably produce the right results - to which someone else will reply, equally baldly, that it is bound to lead to disaster.

Take the fierce disputes about "talking to terrorists". One side thinks it's a terrible, even immoral idea, because it legitimises them, insults their victims and leads only to further demands. Others believe it is essential as a route to pragmatic compromise. All boost their case with historical examples that they claim prove the point.

John Bew, lecturer in modern British history at the University of Cambridge, recently explored the claim, often repeated by the Government, that open negotiations proved crucial "in the search for peace in Northern Ireland" - and are likely to be just as important for resolving conflict in Afghanistan and elsewhere.

Yet such universal recipes for peace are simplistic. "Every conflict is different," Bew writes in a recent paper. "But if there is a lesson from Northern Ireland, it is that there is a great difference in talking to terrorists who are on the crest of a wave and believe they have momentum on their side, and talking to those who have been made to realise - by hard power as well as soft power - that their aims are unattainable through violence."

It is part of the historian's job to point out that things weren't quite as simple as is usually claimed. While this can sometimes feel like tiresome nitpicking, in cases such as this - where issues of life and death are at stake - it is hugely important and salutary.

Bew's paper appeared on the website of the History and Policy group (www.historyandpolicy.org), a partnership between the universities of Cambridge and London that "works for better public policy through an understanding of history". The group is now starting a "Bad History" series of articles to be published on its website. Seven striking initial examples appear in these pages.

The idea is not just to give historians a chance to mock or tick off politicians for the kind of blunders they would mark down in an undergraduate essay. That may be fun, but it isn't really the point. It is hardly surprising that historians know more about history than policymakers, and inevitable that the latter have to make decisions on the basis of inadequate evidence. Nor is it merely a matter of intellectual hygiene - although if prominent people are going to talk about history or science, it's probably better if they get their facts straight.

The real case against "bad history" is much more serious. In the words of Pat Thane, Leverhulme professor of contemporary British history at the Centre for Contemporary British History, University of London, "bad history can lead to bad policy analysis and to bad policy". Far more is at stake than the tendency of some policymakers to spout silly nonsense.

It is not unusual, for example, for politicians to propose policies that have already been tried and failed. They may assume something has changed since some mistily recalled moment in the past, and then claim that this has caused some current problem, real or imagined, as when "family breakdown" is made responsible for "antisocial behaviour". Or they may put forward a figure from the past as a role model - such as Prince Charles promoting the "green" credentials of Henry VIII.

In every case, as the team assembled for this feature demonstrates with wit and panache, it is historians who are best placed to expose such "bad history", not only as factually inaccurate, but as highly likely to lead to bad decisions.

By attacking historical error, they are clearing away the rubbish - and perhaps laying the foundations for better public decision-making.

SAME OLD, SAME OLD

The claim that the independent-minded MP is in decline is a hardy perennial among commentators and self-flagellating parliamentarians.

Roy Hattersley, for example, once described recent Labour backbenchers as "the most supine Members of Parliament in British history" (The Times, 3 November 2005).

Yet anyone who clings to the myth of the independent member of yesteryear needs to remember that in the 1950s there were two sessions - two whole years - during which not one government MP rebelled. Today's whips would sell their souls (those that still have them, anyway) for that level of discipline.

High levels of party cohesion were first identified at the beginning of the 20th century. But since the 1970s, there has been a rise in dissent and a weakening of party ties. The current Parliament, from 2005 onwards, is on course to be the most rebellious of the postwar era.

The truth is that the complaint never changes: there are too few quality politicians, they are never brave enough, and they were always better 20 or 30 years ago.

Take, for example, this moan about MPs who "represented not their country but themselves, and always kept together in a close and undivided phalanx, impenetrable either by shame or honour, voting always the same way, and saying always the same things". It was written in 1698.

There isn't much new in today's complaints and we'd be better off recognising that, because otherwise all attempts to reform Parliament and to raise it in the public's esteem are doomed to failure.

Philip Cowley is professor of politics at the University of Nottingham.

AN ILL-STARRED CHAMBER

It was recently reported that George Osborne, the Shadow Chancellor, has plans to "require Cabinet ministers in big-spending departments to attend a 'star chamber' ... to justify their departmental budgets" before a committee of colleagues (The Guardian, 24 June 2009).

Yet this idea has been twice tried - and twice found wanting - over the past 50 years by governments committed to "efficiency gains": under Harold Wilson in 1964-65 and again under Margaret Thatcher in the 1980s.

The Wilson experiment failed because of objective constraints on spending: in the case of Concorde, for example, it was thought too expensive to break longstanding legal agreements with the French. In addition, ministers resented having to seek the approval of their colleagues.

Willie Whitelaw, who chaired Thatcher's "star chamber" in the 1980s, was a well-respected and highly skilled negotiator. But only a few cases came before his committee, and the body came to seem less and less relevant.

Osborne's initiative is another of those novelties, beloved of politicians, that promise to circumvent hard choices.

History demonstrates how inevitable political rivalries, the lack of any "neutral" ministers and prior spending commitments make such bodies very ineffective tools for cutting public expenditure.

Glen O'Hara is senior lecturer in modern history, Oxford Brookes University.

THE END OF HISTORY? NOT QUITE

In 2000, Condoleezza Rice published an article in the journal Foreign Affairs that spoke of "history march[ing] toward markets and democracy".

Five years later, soon after she was appointed Secretary of State, she contributed to a lengthy US Government report, Supporting Human Rights and Democracy: The US Record 2004-2005, in which she said that American "support of the inalienable rights of freedom-loving people everywhere" was encouraged by the fact that "history shows us that progress toward democracy is inevitable".

Perhaps these claims were just naively optimistic; more probably they were deliberate attempts to build support for her foreign-policy goals. Either way, they miss the extent to which democracy is failing to progress in much of the world. Indeed, it is only by ignoring the experience of the developing world, in particular Africa, since the Second World War that one could argue that democracy is prospering at all.

Consider the case of Nigeria: although it was an independent state ruled by elected politicians in 1960, a military coup in 1966 led to a civil war in 1967-70, followed by 40 years of instability and political violence. Many other cases refute any notion of "inevitable progress".

Determinism such as Rice's, however, is not merely mistaken, but dangerous. It implied that democracy would simply and inexorably triumph in Iraq after the defeat of Saddam Hussein. It distracted attention from historical contexts and stifled the practical planning of exactly how (if at all) a fractured state, freed from tyranny but riven by ethnic and religious conflict, could be transformed into a functioning democracy. The consequences could hardly have been more costly.

Gervase Phillips is principal lecturer in history, Manchester Metropolitan University.

THE ILLUSION OF HAPPY FAMILIES

Every Family Matters, a recent report from the Centre for Social Justice (the think-tank set up by Iain Duncan Smith, the former Conservative leader), accused historians of "painting a picture of past marriage practices that earlier generations would not have recognised ... only since the 1970s has marriage come under threat with the rise of cohabitation".

This argument relies on highly selective evidence, notably formal legal records. By contrast, sources such as diaries, parish and hospital archives, court reports, Foundling Hospital petitions and Royal Commissions on marriage law make clear how extensive cohabitation was in the 19th and early 20th centuries.

We also have to remember how many marriages were broken by relatively early death, especially of the husband, until the 1930s. There were as many impoverished lone mothers in the 1880s as in the 1980s.

Serial relationships, step-parenthood and boys without resident male role models have long been commonplace.

People advancing simplistic and inaccurate claims about "the breakdown of the family" often use family breakdown as a scapegoat for many social ills, thereby distracting attention from other causes. They also neglect the plight of families trapped in unhappy marriages before the 1970s. The result is bound to be simplistic policymaking.

Pat Thane is Leverhulme professor of contemporary British history at the Centre for Contemporary British History, University of London; Tanya Evans is research fellow in modern history, Macquarie University, Sydney.

OPPORTUNITY KNOCKS ... BUT NOT FOR EVERYONE

The postwar "meritocracy" provokes nostalgia among many politicians. Alan Milburn, the former Secretary of State for Health, claims he was "part of the most socially mobile generation this country has ever seen" (The Independent, 12 January 2009).

Some commentators attribute this to grammar schools: Stephen Pollard, editor of The Jewish Chronicle, said in The Times that "grammar schools did a fine job of lifting children out of poverty and into opportunity" (24 June 2008).

This is a myth. During the 1950s and 1960s, more than 60 per cent of high-status professionals' children attended grammar schools, but less than 20 per cent of manual workers' children. More than 70 per cent of children were educated in secondary moderns, which prepared them for manual or lower-grade clerical jobs.

Grammar schools did not overcome class inequality. Working-class children were most likely to leave school before sixth form - not because of low parental aspirations, but owing to families' financial needs. Less than 4 per cent of manual workers' children entered university.

Although the proportion of the workforce employed in professional work doubled, this increase was concentrated in school teaching, nursing and technical occupations that did not pay more than skilled manual jobs. The "top" professions - law, politics and the Civil Service - recruited from the ex-public-school Oxbridge graduates they still rely on.

Policymakers' promotion of a "meritocracy" ignores the historical evidence that life chances cannot be divorced from class. Politicians would do better to tackle the underlying causes of economic inequality.

At the very least, they should address the really serious change that has occurred since the 1960s: the loss of economic security for manual and service-sector workers, which makes it hard for people to plan for the future.

Selina Todd is a lecturer in modern history and a fellow of the Centre for Research into Socio-Cultural Change, University of Manchester.

MINDING ITS BUSINESS

It is widely believed that Margaret Thatcher's election ushered in a continuing policy of non-intervention in British industry.

It was recently reported, for example, that EEF, the manufacturers' organisation, had called on the Government "to rethink the laissez faire approach to industry" introduced in 1979 (The Guardian, 1 July 2009).

Yet the assumption is simply untrue. Every government since 1979 has intervened in industry: Thatcher with British Leyland in 1983, John Major with the coal industry in 1992, Tony Blair with Rover in 2005, and Gordon Brown with the banks and the motor industry (again) in 2009.

British governments have never given up on economic interventionism, any more than industrial failure has gone away. As long as the latter continues, politics will dictate when the former will follow. The myth of non-intervention is often linked to the belief that laissez faire policies explain the relative success of the British economy since the 1980s.

Although it is based on false premises, this prevalent view makes it far harder to make the case for intervention. As a result, we see, for example, governments failing to support industries such as renewable energy, which may yet prove crucial to the nation's future.

Niall G. MacKenzie is a research fellow in enterprise and innovation at Judge Business School, University of Cambridge.

AS GREEN AS HE WAS COMPASSIONATE

In a recent lecture, Prince Charles argued that Henry VIII "instigated the very first piece of green legislation in this country", when he "passed laws to protect forests by preventing shipbuilders from felling too many oak trees".

He said: "What was instinctively understood by many in King Henry's time was the importance of working with the grain of nature to maintain balance between keeping the Earth's natural capital intact and sustaining humanity on its renewable income."

This is a very partisan account. The Act of 1543 did order the preservation of large trees for naval timber, but, far from being an early modern ecowarrior, Henry VIII was primarily concerned with hunting.

In 1539, he had created the completely new 10,000-acre forest of Hampton Court Chase, where "forest law was to prevail". This protected deer for him to hunt, and the vegetation where they lived (including trees), on land that largely belonged to other people.

From medieval to early modern times, there were literally hundreds of statutes and royal regulations introduced to protect forests for hunting. None could be described as "green legislation" as we understand it today, and punishments for offenders were harsh. Indeed, many historians consider forests to be among the most potent symbols of arbitrary regal power.

John Langton is a fellow in geography, St John's College, Oxford.
