Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, by Cathy O’Neil

Book of the week: If all-seeing ‘miracle’ tech is making the decisions, we must demystify the tricks, says Danny Dorling

September 8, 2016
Image: still from The Wizard of Oz, 1939 (source: Alamy)
All-knowing deities: ‘Facebook is more like the Wizard of Oz: we do not see the human beings involved’, explains Cathy O’Neil of a modern world ordered by algorithms

Many years ago I attended an event at a packed lecture theatre at the annual meeting of the American Association of Geographers. A huge audience had gathered to hear a little-known academic speak on the subject “What your credit card record tells them about you”. But the speaker never appeared. Was it a successful stunt to illustrate just how paranoid we all are about what they know about us? Perhaps the speaker had been trying to show the audience just how little social scientists knew about “big data”, long before the phrase had ever been thought of – or perhaps he had just slept in.

Cathy O’Neil, an academic and former hedge-fund quant, or quantitative analyst, has a story to tell, and it is a story about you. She taps the same deep well of fear that drew the crowds at that AAG conference: the suspicion that we are all being observed by hidden forces, algorithms we cannot understand, designed by faceless quants who work to maximise the bottom line for their masters. In the past, there was just one all-seeing god we had to fear. Now we live in a world with multiple all-knowing deities, each a little different, each oblivious to the fate of most individuals, each unbelievably powerful and each potentially malign.

As O’Neil explains of one of the biggest and most ubiquitous of those deities, “Facebook is more like the Wizard of Oz: we do not see the human beings involved.” We can’t see the quants who decide which of our many friends’ posts we view first, and it turns out that the quants play games with our emotions, testing to see how some groups react to being fed, say, more bad news than good. A majority of users (62 per cent, according to the data O’Neil cites) are completely unaware of this.

These are newly emerging gods, and currently most of them are thought of as benign corporations distributing their software for free, presumably to enhance the common good. A majority (73 per cent) of Americans believe that the search results offered up to them by Google are both accurate and impartial. O’Neil asks how anyone could know if the results we see have been skewed to “favour one political outcome over another”. She reports that Google has prohibited researchers from creating scores of fake profiles in order to map the biases of its search engines. But then again – if they had done so, how would Google know?

US voters, O’Neil claims, have been “microtargeted” by political parties and other unknown groups; for her, this helps explain why 43 per cent of Republicans continue to believe that Barack Obama is a Muslim, because “microtargeting does its work in the shadows”. Evidence for these and similar claims made in the book is scant; references are generally restricted to the name of a researcher and the university at which they work, and so, ironically, the reader has to rely on Google to find the source material. Google’s quants could map out who had most likely read this book and found it most interesting by focusing on such searches. But do they really have the time or inclination? Or do they abide by Google’s infamous dictum, “don’t be evil”?

The pressure to be evil comes from the famous root of all kinds of evil: money. Given the peculiarly undemocratic nature of the American presidential voting system, only the 1 per cent of the electorate who are swing voters living in swing states can be key to the outcome. According to O’Neil, “the money from the financial 1 percent underwrites the microtargeting to secure the votes of the political 1 percent”. But can such voters be targeted that effectively, and where is this book’s reference to the smoking gun – the political quant who came in from the cold and explained how it was all done?

I have a great deal of sympathy for O’Neil’s suspicions. In the late 1980s I was given access, as a doctoral student, to magnetic tapes containing the electoral roll of the UK, ordered by the geographical regions the Conservative Party then used. I was never told why the data lab I worked in had been given access to those data. We also later had data from the firm that would become Experian, and the credit card company Capital One, both of which are mentioned in Weapons of Math Destruction. Apparently, Capital One now carries out “rapid-fire calculations as soon as someone shows up on their website. They can often access data on web browsing and purchasing patterns.” But how do they do this?

A quarter of a century ago I concluded that the nascent big data industry was full of big claims and big errors, and I ignored all commercial sources other than mortgage records when writing a PhD thesis on how the social structure of a country could be visualised. Things will, of course, have moved on greatly since then. But when research is done in secret, without peer review or conference presentations, it is easy to make great claims for your rapid-fire calculations, your “powerful algorithms” and your Big Brother-esque surveillance abilities.

The reality is often quite different. It is the research student or young quant trying to keep their bosses happy and promising that they really have devised a clever algorithm that can give the firm the edge. It is the sleek saleswoman in an expensive suit with a flash job title and a fancy set of PowerPoint slides, explaining to the board how they can zoom in, target market, segment and augment profit. And it is the board member nodding sagely and signing the cheque for the company to do work that neither he nor she understands, nor could understand, nor feels the need to understand – just as long as everyone is getting paid.

Make a profit or win an election every so often, and the target marketers can take the credit. Make a loss, and it is down to “external factors”. No one from outside can scrutinise your work because it is a trade secret. There is nothing very clever or complex about how Facebook or Google works. Almost anyone could have started any of these companies, but they needed to be in the right place at the right time. They are huge now because they were first and are still the most voracious. But just as the computer programming required to land a spaceship on the Moon is not beyond the wit of millions today, nor is the maths of mass marketing especially mysterious. Future generations will be taught how it all worked, just as I was taught at school how spinning jennies and the centrifugal governors of steam engines worked in the more distant past. Today’s miracle technology is tomorrow’s textbook example.

Weapons of Math Destruction is a well-written, entertaining and very valuable book. It explains why you are more likely to be put on hold if your credit rating is too low; how “red-lining” (allocating or denying access to finance based on the ethnic make-up of applicants’ neighbourhoods) still operates in the US; the racism that is inherent in most commercial uses of big data; how crime-targeting programmes actually serve to increase crime in poor areas; why ranking universities ultimately reduces the quality of all institutions; and how “profits end up serving as a stand-in, or proxy, for truth”.

As O’Neil says, she had only to type the two words “data scientist” into her CV to enter this world, and what she reveals is fascinating. But what her book doesn’t do is provide references to information that is not already in the public domain, and neither does it contain a single equation or algorithm. If we are really to demystify these processes, at some point we will have to draw back the curtain to explain how the machines work and don’t work, and how the giant data corporations are not new gods but fallible, recent human creations that we have yet to collectively control. Only then will they do less evil.

Danny Dorling is Halford Mackinder professor of geography, University of Oxford, and author of A Better Politics: How Government Can Make Us Happier (2016).


Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy
By Cathy O’Neil
Allen Lane, 272pp, £12.99
ISBN 9780241296813
Published 6 September 2016


The author

Author photo: Cathy O’Neil (source: Adam Morganstern)

“My parents were mathematicians, and my mom was and is a computer science professor,” says Boston native Cathy O’Neil. “They raised me to be very logical and careful about my assumptions, but it was an apolitical, pro-science perspective.”

“Serious-minded” as a child, she wanted to become a musician. “I was in love with the movie Amadeus, but my parents and teachers thought it would be better for me to be more practical, and they more or less bribed me into going to math camp when I was 14. I loved it and never looked back. I now have a bluegrass band, so I still enjoy music.”

In 2007, she left an academic post at Barnard College for the investment group D. E. Shaw. “I had been lightly headhunted – I’d received emails from them asking if I was interested – but in the end I asked to be interviewed. I wanted to be in an environment where I had the sense that what I figured out had an impact on the real world, which is hard to come by in academia. What I didn’t think enough about was that I’d also like it to be a positive impact. I was extremely naive when I got there. But not for long.”

Best known for her popular blog Mathbabe.org, O’Neil founded Columbia University’s Lede Program in Data Journalism in 2014. Are journalists insufficiently numerate? “More insufficiently sceptical. I don’t blame them: computer science, mathematics and statistics are taught to seem intractable and magical. It’s completely unnecessary and misleading, but it serves a purpose, namely to have algorithms and other mathematical objects seem beyond scrutiny.”

What she wants journalism students to bear in mind, O’Neil says, “is that an algorithm represents a decision-making process, whether it’s deciding who gets hired, who gets fired or who goes to jail. Don’t we, as a society, deserve to understand how that decision is made? I think we do.”

Karen Shook

POSTSCRIPT:

Print headline: In American gods we trust
