Today’s undergraduates: born into a second life

A rift will always separate digital natives from others, Shahidha Bari believes

January 29, 2015

Benedict Cumberbatch’s latest film, The Imitation Game, a dramatisation of the life of wartime codebreaker and logician Alan Turing, adds some jeopardy and sprinkles some glitter over a life so remarkable in reality that it barely needed Hollywood embellishment. Turing was, perhaps, the greatest alumnus of my hallowed place of learning, King’s College, Cambridge, although as an undergraduate, hurtling from essay to essay, I was only dimly aware of stories of cyanide-injected apples and chemical castration. Mostly, I felt rather sorry for him, nodding apologetically whenever I sped past his badly framed yellowed portrait, a poor-quality, vaguely sepia-tinted A3 photographic reproduction, which hung at a wonky angle at the bottom of a staircase heading into the college computer rooms. Some conscientious fellow had brightly thought to name the rooms after the college’s greatest son, not realising that the tribute being offered was little more than a strip-lit basement with an array of blocky PCs banked on one side and lurid Apple Macs on the other, the air always slightly damp, humming with the sounds of whirring computers and chugging printers that regularly lost the will to live.

When my cheery friend Tim, who was enthusiastically studying “computer science”, explained that Turing had invented “computing”, it was a notion that I grasped as vaguely as Tim’s clearly hare-brained plans to invent energy-efficient data storage. (NB: I’ve not seen Tim since he sold his company to Google and moved to San Francisco.) At the time, these were all things I found faintly interesting but couldn’t quite compute (excuse the pun) whenever I poked my head up from a 19th-century novel. I had felt similarly baffled when a boy called Omkar had taken pity on me at sixth-form college and set me up with this thing called an “email address”. Did I want an “underscore” in my name? No, I wanted to read Sylvia Plath, and Omkar could do what he liked. Eighteen years later, I diligently log in to that email account from my laptop, my iPad and my Android phone so many times a day that my loved ones scold me. All that time as a student, while I was busy losing myself in great Shakespearean abstractions, I was surrounded by real-world visionaries who could see an actual brave new (PC) world ahead of them. Luckily, they were determined to drag me along with them.

My technophile sibling, who had once bought and then disastrously dismantled an Amstrad CPC 464 in the late 1980s, insisted on tooling me up with my very own PC for university, charitably helping me to lug it from room to room each year. In 2000, this was a luxury. In 2015, when I look up at the ranks of first-year students waiting expectantly for my lecture, I face a sea of laptops, impossibly small, sleek and silent, swiftly snapped shut and slipped into synthetic sleeves as soon as I step from the lectern. Truthfully, I don’t think of myself as that old, but sometimes I catch them looking at me quizzically when I pull out a paper diary to actually write down appointments. I wonder if their expression of amusement and faint horror resembles the one that must have crossed my face when, as a graduate student, I spent a day flicking through painstakingly typewritten PhDs in the university library and noticed all the Tipp-Ex.

In my field, we are accustomed to rehearsing all the usual anxieties about threats to material book culture: we lament the loss of research skills, worry about deserted archives and the lost art of palaeography, champion independent booksellers over Kindles, and heatedly debate the merits of open-access publishing. In our wider culture, we are nostalgic for elegant penmanship and issue apocalyptic cautions about diminished attention spans, physical inactivity and eroded social ties.

For my own part, I don’t fear the tide of technological progress, but I am conscious of how hard it is, even when game, to keep up to speed with the latest advances. I am foxed by online grade books and iTunes alike. Even the language can leave us behind. “Virtual reality” rings with a Nineties naffness, as though its claim to reality no longer needed the qualification. We are asked to trust in innocuous “cloud” technology as though the substance of our thoughts and lives could be dissolved into thin air.

In this brave new world, I worry at how easily we are left, and leave others, behind. I am conscious that even if it is not too late to learn, it is impossible to mend the rift that has opened between a generation who can remember a life before email and those who cannot. It is not just that their daily lives are documented by their smartphone cameras and cloud storage, or that they have at their slightest touch an online archive from which memory can be downloaded, but that this experience is profound, that it somehow constitutes and contours their very sense of being, and that is what makes it different from mine. In my favourite kind of French philosophy, we talk airily of being “born into language”, but the generations of students I teach now and into the future will also have been born into an online life that is not only parallel to their real lives but profoundly entwined with them. I can keep trying to refresh the browser, but the truth is that they are a different operating system altogether.
