Tom Chatfield's Blog, page 2
October 31, 2014
The Dark Net
Jamie Bartlett’s new book The Dark Net is the fruit of years’ research into digital crannies dank enough to make concerned parents immolate their child’s iPhone: trolling and cyber-stalking, the politics of hate and terror, the consumption and performance of pornography, illegal drugs and suicide pacts.
It’s a roll call of tabloid bogeymen. But, disappointingly for any journalist in search of straw men to burn, what’s actually on offer is a meticulous, discomforting account of the human stories behind each headline. And perhaps the greatest discomfort is the fact that – no matter how distant the digital underworld may feel from ‘real’ life – the temptation to place it in some safe, separate box proves in every case misguided.
Take the second chapter’s protagonist, Paul. The author first meets Paul in a working men’s club: a young man ‘with a handsome face, short dark hair, and tattoos that climbed up his neck. He was good company… until, that is, talk turned to politics.’ At which point Paul begins to spill out his devotion to a cause: White Pride. ‘What do you think the world will be like under black or Paki or brown rule? Can you imagine it? When we’re down to the last thousand whites, I hope one of them scorches the fucking earth, and everything on it.’
What’s new is not Paul’s fear, hatred or one-dimensional view of the world. It’s the fact that technology has turned him into a ‘one-man political party,’ spreading his message on social media, painstakingly gathering and contributing to meetings of like minds, rising through the ranks of the English Defence League’s Facebook presence. Far from skulking in some outer darkness, his is a strident public voice, mere clicks away from every screen’s daily churn of news, status updates and noise.
For many people who care about politics, Facebook is little better than a punchline: something to do if you don’t actually want to do anything. Stories like Paul’s give the lie to this. For him, Bartlett makes clear, it is online that his actions have reach and resonance. Paul himself is ambivalent about the power he taps into onscreen – ‘I was becoming too hate-filled, too paranoid, it was seeping into my blood,’ he tells Bartlett, explaining his second thoughts about his digital presence – but at the same time he hungers for community, validation, influence.
Is the internet a breeding ground for terrorism and horror, for abuse and exploitation? Unquestionably – just as it unquestionably fuels many of modernity’s most remarkable stories of collaboration, education, co-operation and investigation. How do the dark and the light balance out? This is a harder question, and one for which Bartlett offers no easy answers, not least because ‘the internet’ itself is something of a fiction: a noun containing multitudes. Instead of weighing its vices and virtues in the scales, he suggests, what we need is to listen to the news it brings about our own condition in the 21st century – and to ask what freedoms we value most.
‘The inflicter of suffering may be fooled, but the sufferer never is.’ This was the poet Philip Larkin’s comment on his great 1950 poem ‘Deceptions’ – but it’s also a line that whispered through the back of my mind while reading The Dark Net.
As soon becomes clear from Bartlett’s time among cyber-libertarians, dissidents and those preaching the gospel of anything goes, political posturing often masks a version of freedom that avoids all questions of consequence: of who is paying what price, and where, for your pleasure or passion. Technology can be an astonishing leveller in giving voice to the marginalised, the timid, the disadvantaged, those far from the centre of things. But it can also be an astonishing excuse for narcissism and collusion: for fooling yourself about others’ suffering precisely as much as you wish to be deceived.
There’s much that’s precious about the open internet, and urgently worth protecting; but there are also profound trade-offs entailed whenever one person’s freedom “to” meets another’s freedom “from”: freedom from surveillance and snooping; from stalking and abuse; from theft, piracy, illicit trade; from invaded privacy, or a door kicked down by those who don’t like what you’re saying; from stolen secrets and confidences.
In his final chapter, Bartlett traces the anything-goes vision of freedom to its logical conclusion in the techno-optimistic cult of transhumanism: a place in which ‘there is no ‘natural’ state of man. Freedom is the ability to do anything, to be anything, to go as far as our imagination can take us. We’re always changing and adapting, and embracing technology is simply the next step.’ Against this are ranged the ‘anarcho-primitivists’, who claim that ‘technology tends to distract and detract from our natural state, pushing us ever further away from what it really is to be free humans.’
Both tribes consider our current relationships with technology inadequate. Yet both also offer — it seems to me — willfully inadequate prescriptions for the future, based on a fundamental misreading of what it means to live with technology.
Our responsibility is not to some abstract vision of human potential – whether enhanced by ultimate technology, or denuded of it entirely. It is to each other, as we can best understand our circumstances: compromised, enmeshed in history and contingency, bound by ties we have not chosen.
To become more free, here, is not to pretend that the mirrors our machines hold up to us show anything entirely new. It is to look more carefully at how we live, with them and without them, and hope to become less deceived.
A version of this article first appeared in the Demos Quarterly
September 1, 2014
Digital reflections
I was interviewed by the site Create Hub recently, around the idea of “digital reflections” and our everyday relationships with technology. An excerpt is below; click through for the full discussion.
Q: You recently gave a talk on “Digital Reflections” at Southbank Centre. What was the talk about?
A: I was looking at some of our daily relationships with technology – and how these relationships can shape how we think and feel. Many of us have an incredibly intimate relationship with our phones, for example. They are the first objects we touch when we wake in the morning, the last objects we touch when we go to sleep at night; they are always with us, bringing with them many of the things we care about most. Much of the time, this is great. But I worry that if we have an unexamined relationship with tools like our phones, we risk developing a distorted sense of ourselves; of being excessively influenced by our onscreen reflections and projections.
I struggle with this myself. I get anxious if people don’t reply to my emails or texts fast enough; I feel like I’m missing out, or like my life is inadequate, when I scroll through other people’s timelines; I risk turning every moment of every day into the same kind of time, because I always have the same options available onscreen with me. I risk living in a kind of technological bubble – and being seduced by how cosy and connected it feels in there. And so I try not to react by violently opposing technology, but instead to put it in perspective; to use and experience it differently; to build different kinds of time and space into my life.
June 27, 2014
What will our descendants deplore?
I have a new essay on the BBC Future website today, exploring a question that I took to a selection of the world’s brightest minds: from James Lovelock to Peter Singer, via Tim Harford and Greg Bear. The opening is below, and you can read the whole thing on the BBC Future website.
Earlier this year, I had a discussion that made me ask a disconcerting question: how will I be viewed after I die? I like to think of myself as someone who is ethical, productive and essentially decent. But perhaps I won’t always be perceived that way. Perhaps none of us will.
No matter how benevolent the intention, what we assume is good, right or acceptable in society may change. From slavery to sexism, there’s plenty we find distasteful about the past. Yet while each generation congratulates itself for moving on from the darker days of its parents and ancestors, that can be a kind of myopia.
I was swapping ideas about this with Tom Standage, author and digital editor of The Economist. Our starting point was those popular television shows from the 1970s that contained views or language so outmoded they probably couldn’t be aired today. But, as he put it to me: “how easy it is to be self-congratulatory about how much less prejudiced we are than previous generations”. This form of hindsight can be dangerously smug. It can become both a way of praising ourselves for progress rather than looking for it to continue, and of distracting ourselves from uncomfortable aspects of the present.
Far more interesting, we felt, is this question: how will our generation be looked back on? What will our own descendants deplore about us that we take for granted?
April 17, 2014
Origins of Apple’s command
Over at Medium, I’ve just posted my latest piece of techy-etymological exploration, looking this time at the unlikely origins of Apple’s command key – ⌘ – in pre-medieval Scandinavia.
Sometimes known as the St John’s Arms, it’s a knot-like heraldic symbol that dates back at least 1,500 years in Scandinavia, where it was used to ward off evil spirits and bad luck. A picture stone discovered in a burial site in Havor, Gotland, prominently features the emblem and dates from 400-600 AD. It has also been found carved on everything from houses and cutlery to a pair of 1,000-year-old Finnish skis, promising protection and safe travel.
It’s still found today on maps and signs in northern and eastern Europe, representing places of historical interest. More famously, though, it lurks on the keyboard of almost every Apple computer ever made – and, for everyone else, at Unicode code point U+2318, under the designation “place of interest sign”.
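For the curious, the character is easy to summon with nothing more than Python’s standard library; here’s a quick sketch (not part of the original piece) that prints the symbol, its official Unicode name and its code point:

```python
import unicodedata

# The command-key symbol sits at Unicode code point U+2318.
symbol = chr(0x2318)

print(symbol)                      # ⌘
print(unicodedata.name(symbol))    # PLACE OF INTEREST SIGN
print(f"U+{ord(symbol):04X}")      # U+2318
```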
March 31, 2014
Automated ethics
My latest essay for Aeon magazine asks when it’s ethical to hand our decisions over to machines, and when external automation becomes a step too far. The first few paras are below: read the rest on the magazine’s site.
For the French philosopher Paul Virilio, technological development is inextricable from the idea of the accident. As he put it, each accident is ‘an inverted miracle… When you invent the ship, you also invent the shipwreck; when you invent the plane, you also invent the plane crash; and when you invent electricity, you invent electrocution.’ Accidents mark the spots where anticipation met reality and came off worse. Yet each is also a spark of secular revelation: an opportunity to exceed the past, to make tomorrow’s worst better than today’s, and on occasion to promise ‘never again’.
This, at least, is the plan. ‘Never again’ is a tricky promise to keep: in the long term, it’s not a question of if things go wrong, but when. The ethical concerns of innovation thus tend to focus on harm’s minimisation and mitigation, not the absence of harm altogether. A double-hulled steamship poses less risk per passenger mile than a medieval trading vessel; a well-run factory is safer than a sweatshop. Plane crashes might cause many fatalities, but refinements such as a checklist, computer and co-pilot insure against all but the wildest of unforeseen circumstances.
Similar refinements are the subject of one of the liveliest debates in practical ethics today: the case for self-driving cars. Modern motor vehicles are safer and more reliable than they have ever been – yet more than 1 million people are killed in car accidents around the world each year, and more than 50 million are injured. Why? Largely because one perilous element in the mechanics of driving remains unperfected by progress: the human being.
March 17, 2014
Big data and artificial idiocy
I wrote a piece earlier this year for the Guardian about the perils and delights of big data, and the special stupidity it can breed. The first few paras are below: click through for the whole piece.
Massive, inconceivable numbers are commonplace in conversations about computers. The exabyte, a one followed by 18 zeroes’ worth of bytes; the petaflop, one quadrillion calculations performed in a single second. Beneath the surface of our lives churns an ocean of information, from whose depths answers and optimisations ascend like munificent kraken.
This is the much-hyped realm of “big data”: unprecedented quantities of information generated at unprecedented speed, in unprecedented variety.
From particle physics to predictive search and aggregated social media sentiments, we reap its benefits across a broadening gamut of fields. We agonise about over-sharing while the numbers themselves tick upwards. Mostly, though, we fail to address a handful of questions more fundamental even than privacy. What are machines good at; what are they less good at; and when are their answers worse than useless?
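For a rough sense of those magnitudes, here’s a back-of-the-envelope sketch in Python, using purely illustrative arithmetic and the standard decimal definitions rather than figures from the article:

```python
# Orders of magnitude behind the "big data" vocabulary.
EXABYTE = 10**18    # bytes: a one followed by 18 zeroes
PETAFLOP = 10**15   # floating-point operations per second: one quadrillion

# Illustrative only: how long a one-petaflop machine would need to perform
# a single operation for every byte in an exabyte of data.
seconds = EXABYTE / PETAFLOP
print(f"{seconds:,.0f} seconds, or roughly {seconds / 60:.0f} minutes")
```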
February 6, 2014
Technology’s greatest myth
I wrote this at the end of last year as my final column for BBC Future, aiming to make 2014 a year for longer essays and projects (and paying attention to my young son). It’s a reflection on a couple of years of fortnightly writing about technology, ideas, and tech’s larger place in our sense of the world.
Lecturing in late 1968, the American sociologist Harvey Sacks addressed one of the central failures of technocratic dreams. We have always hoped, Sacks argued, that “if only we introduced some fantastic new communication machine the world will be transformed.” Instead, though, even our best and brightest devices must be accommodated within existing practices and assumptions in a “world that has whatever organisation it already has.”
As an example, Sacks considered the telephone. Introduced into American homes during the last quarter of the 19th century, it made instantaneous conversation across hundreds or even thousands of miles seem close to a miracle. For Scientific American, editorializing in 1880, this heralded “nothing less than a new organization of society — a state of things in which every individual, however secluded, will have at call every other individual in the community, to the saving of no end of social and business complications…”
Yet the story that unfolded was not so much “a new organization of society” as the pouring of existing human behaviour into fresh moulds: our goodness, hope and charity; our greed, pride and lust. New technology didn’t bring an overnight revolution. Instead, there was strenuous effort to fit novelty into existing norms.
The most ferocious early debates around the telephone, for example, concerned not social revolution, but decency and deception. What did access to unseen interlocutors imply for the sanctity of the home — or for gullible or corruptible members of the household, such as women or servants? Was it disgraceful to chat while improperly dressed? Such were the daily concerns of 19th-century telephonics, matched by phone companies’ attempts to assure subscribers of their propriety.
As Sacks also put it, each new object is above all “the occasion for seeing again what we can see anywhere” — and perhaps the best aim for any writing about technology is to treat novelty not as an end in itself, but as an opportunity to re-scrutinize ourselves.
I’ve been writing a fortnightly column for the BBC since the start of 2012, and in the last two years have watched new devices and services become part of similar negotiations. By any measure, ours is an age preoccupied with novelty. Too often, though, it offers a road not to insight, but to a startling blindness about our own norms and assumptions.
Take the litany of numbers within which every commentary on modern tech is couched. Come the end of 2014, there will be more mobile phones in the world than people. We have moved from the launch of modern tablet computing in 2010 to tablets likely accounting for over half the global market in personal computers in 2014. Ninety per cent of the world’s data was created in the last two years. Today’s phones are more powerful than yesterday’s supercomputers. Today’s software is better than us at everything from chess to quiz shows. And so on.
It’s a story in which both machines and their capabilities increase for ever, dragging us along for the exponential ride. Perhaps the defining geek myth of our age, The Singularity, anticipates a future in which machines cross an event horizon beyond which their intellects exceed our own. And while most people remain untouched by such faith, the apocalyptic eagerness it embodies is all too familiar. Surely it’s only a matter of time — the theory goes — before we finally escape, augment or otherwise overcome our natures and emerge into some new phase of the human story.
Or not. Because — while technological and scientific progress is indeed an astonishing thing — its relationship with human progress is more aspiration than established fact. Whether we like it or not, acceleration cannot continue indefinitely. We may long to escape flesh and history, but the selves we are busy reinventing come equipped with the same old gamut of beauties, perversities and all-too-human failings. In time, our dreams of technology departing mere actuality — and taking us along for the ride — will come to seem as quaint as Victorian gentlemen donning evening dress to make a phone call.
This is one reason why, over the last two years, I’ve devoted a fair share of columns to the friction between the stories we tell about tech and its actual unfolding in our lives. From the surreptitious erosion of digital history to the dumbness of “smart” tech, via email’s dirty secrets and the importance of forgetfulness, I love exploring the tensions between digital tools and analogue selves — not because technology is to be dismissed or deplored, but because it remains as mired in history, politics and human frailty as everything else we touch.
On which note: what do you think is most ripe for abandonment around technology today? Which habit will come to be seen by future generations as quaint — our equivalent of putting on bow ties for telephones? If you have any thoughts, please Tweet me at @TomChatfield and let me know what you think.
January 23, 2014
On video games: difficulty is the point, not the problem
Here’s a piece exploring the difficulties of discussing games compared to other media. It was written first for the book Early Modernity and Video Games (Cambridge Scholars Publishing, February 2014), then republished by Wired and Ars Technica – and, now, here.
Difficulty is built into video games in a way unlike any other medium.
A movie may be difficult, conceptually or in terms of subject matter; it may be hard to understand or to enjoy. Yet all you have to do to access its entirety is to sit and watch from beginning to end.
Written words can be still more difficult. For these, you need a formidable mastery of language, concepts and context; you must convert text into sense. Still, the raw materials are all there for you to work with. You do not have to pass a tricky test in order to get beyond the first chapter, or find yourself repeatedly sent back to the start of a book if you fail. You do not have to practice turning a page at precise moments in order to progress.
Yet this is what the difficulty of many video games embodies: a journey that the majority of players will not complete, filled with trials, tribulations and inexorable repetitions.
There’s no one agreed-upon definition of a video game, or indeed a game, but Bernard Suits’s phrase “the voluntary attempt to overcome unnecessary obstacles” captures a good deal of what matters. A player contends with obstacles according to a set of limiting rules — and does so, in the case of a video game, by entering a virtual realm that itself embodies those rules.
A good game is one that is rewarding to play; where the journey of discovery and incremental mastery is balanced between excessive frustration and simplicity. There may be many incidental delights, but without some measure of difficulty and repetition there is no heart to the game: no mechanic inviting iterative exploration or breeding the complex satisfactions of play.
Yet video games are not only difficult to play. They are also difficult to write about and to discuss, and for related reasons.
For a start, they embrace aspects of many other media and disciplines: images, sound, music, text and speech, architecture and design, animation and modelling, interface and interaction, social dynamics and artificial intelligence. This brings a bewildering — and rich — load of baggage to a field that has only existed for around half a century. Like players, the would-be investigator of video games is often running in order to stay still.
Time is of the essence when it comes to almost every aspect of the field. Even the most difficult works of literature or philosophy tend to take at most tens of hours to read. Yet far simpler games can demand a hundred hours or more of play if they are to be exhaustively explored, while some online games raise the pitch of this expertise to thousands (hello, EVE).
Then there’s the fact that games themselves don’t stand still. With patches and expansions standard across the industry, and player communities constantly evolving, many titles consist of a steadily updated environment and evolving social context. What does it mean to study the definitive version of a game, or come up with a universal system of reference for research in the field? Are aging games to be understood in emulated form, or on original systems? What does it mean to play a game outside its original context?
These are awkward questions. Yet addressing them doesn’t just demand the exhaustive amassing of software and hardware. It also means paying close attention to the only place in which a game truly exists as itself: the minds of its players.
When writing well about almost any video game, authors are more like anthropologists reporting from the boundaries of a brave new world than critics dissecting a work of fiction. Their data is fieldwork, their analysis mixed with reportage, their most precious skills the arts of looking, listening and recording. Simply Google “vanilla WoW” to find a trove of tales about the first version of World of Warcraft, for example: a vanished world of slow-levelling and epic corpse-runs that now lives only in memory.
Similar reminiscences are everywhere in gaming, their genre somewhere between confession and myth-making. Talk to someone about even something as banal as Candy Crush Saga, and the passions will soon start to flow: what levels they love or hate, where they’re stuck, when and if they weakened and bought their way out of trouble.
Indeed, some games can touch us sufficiently deeply to be labelled an addictive hazard — and to suggest a special species of reverse-engineering, in which systems expressly designed to challenge and enthral us become an extraordinarily concentrated chunk of experience. Set apart from actuality, games are a machine zone that can be refuge, meeting place or unfallen Eden, and within which we are at once the heroes and narrators of our own journeys.
You can talk, then, about a game’s art, politics, script, music, sounds, making, impact, legacy, sociological significance, and all the intricacies of design and data that conjure these. But you should never forget the fundamental contract every game seeks to forge with its players: accept this world and these obstacles in the name of experience, and make of them what you will. Difficulty is the point, not the problem. The play’s the thing.


