Data journalism is booming. Every modern newsroom embraces new analytical tools and innovative visual ways of telling stories. A few years ago there were barely a dozen students in the data class at City’s journalism department; this semester we are holding multiple workshops to accommodate almost 200 MA students eager to learn these skills. Accreditation panels enthuse about data modules, and students are well aware that these are key skills for employability.
Chris Anderson’s book provides an interesting account of the historical evolution of data journalism and shows how it incorporates frameworks and analytical tools typically associated with the social sciences. Indeed, another scholar has described data journalism as “social science on a deadline”. However, the “certainty” that the book highlights points to a fundamental division in how data journalism is perceived – among both academics and journalists.
In many newsrooms, the increasingly sophisticated analysis and visualisation of data continues to be done by a separate group of (probably nerdy) colleagues seated far away from the main news desk. This feeds a belief that interrogating data will yield an optimum way of reporting, because it removes subjectivity and leads to independent and self-evidently accurate news stories. The alternative view is that data journalism is simply part of a wider jigsaw in which analysing information, albeit in new and innovative ways, is only the start. The traditional skills of finding interviewees, talking to people and crafting the story remain crucial to the production of journalism. On this view, data skills are not a new form of journalism but one more technique among many, and the notion of achieving a mythical “certainty” is misplaced.
The very idea of separating data journalists within the newsroom is therefore problematic – analysis of data needs to be taught and understood as part of a wider storytelling framework. Fancy work in Excel or Python is not by itself going to yield satisfactory journalism; data analysis is the start, not the whole picture. The problem is that today’s audiences crave certainty. In an era of “post-truth” and declining trust, it is highly desirable that we should be able to find reliable, agreed facts and base our judgements upon them. But journalism is never going to be a purely objective pursuit – just as social science, and indeed much of physical science, is subject to changing interpretations.
Anderson’s analysis demonstrates how data have been weaponised to fight proxy wars in the media. He shows how the gradual attempt to introduce scientific certainty into journalism has not yielded a better politics, and he rightly calls for journalism that will “embrace a more contextualised form of uncertainty”.
The book’s account of the evolution of “precision journalism” is an interesting contribution, although it is framed largely within a US social-science context. Yet a wider readership will probably agree that Anderson’s ethnography of a version of computational journalism in which “structured stories” are produced with minimal human input is a grim warning of where we may be headed if reason and common sense do not prevail.
Suzanne Franks is professor of journalism and head of the journalism department at City, University of London.
Apostles of Certainty: Data Journalism and the Politics of Doubt
By C. W. Anderson
Oxford University Press, 240pp, £64.00 and £16.99
ISBN 9780190492335 and 9780190492342
Published 25 October 2018