Preview — Future Ethics by Cennydd Bowles
Design is applied ethics. Sometimes this connection is obvious: if you design razor wire, you’re saying that anyone who tries to contravene someone else’s right to private property should be injured. But whatever the medium or material, every act of design is a statement about the future. Design changes how we see the world and how we can act within it; design turns beliefs about how we should live into objects and environments people will use and inhabit.
A market that ethically self-corrects requires perfect information and full agreement on what’s right and wrong.
The general public has no idea what sorts of unwelcome acts are happening inside their gadgets. The idea of market self-correction is a fantasy.
When you invent the ship, you also invent the shipwreck; when you invent the plane, you also invent the plane crash; and when you invent electricity, you invent electrocution... Every technology carries its own negativity, which is invented at the same time as technical progress.
According to the law of unintended consequences, there will always be outcomes we overlook, but unintended does not mean unforeseeable. We can – and must – try to anticipate and mitigate the worst potential consequences.
Unintended consequences affect familiar people in unknown ways, while externalities happen to people we’ve ignored. In other words, we overlook unintended consequences by not looking deeply enough, but we miss externalities because we were looking in the wrong places.
We badly need an expanded concept of justice and fairness that takes mortgaging the future into account.
Data always looks backward, meaning historical prejudice is frozen into a training corpus.
A team is only as trustworthy as its sleaziest partner. Build on a vulnerable platform, lock yourself into a disreputable social network, or share data with an abusive advertiser, and you will rightly bear some blame.
Convinced that technology is neutral and objective, we mistakenly assumed bias was impossible; concerned only with the productivity of the primary user, we overlooked impacts on wider society.
Strong core values and design principles taken to heart are powerful tie-breakers for ethical dilemmas.
‘Make it easy for users’ is a platitude, not a design principle; the opposite would be absurd.
Homogenous teams tend to focus on the potential upsides of their work for people like them, and are blind to the problems they could inflict on a wider audience. The same divisions that pervade today’s world are seen and even amplified in today’s tech industry.
The modern adage ‘If you’re not paying for the product, you are the product being sold’ is facile. It implies that paying with attention is less ethical than paying with cash, a discriminatory position that suggests the poor are somehow less deserving of technology than the rich.
The average American spends 3.1 hours on a mobile device each day, compared with just eighteen minutes in 2008.
With persuasive content tailored to the individual and delivered on a personal device, it will be harder to unify in opposition, and authorities will struggle to take corrective action.
The golden rule’s biggest flaw is its egocentrism. It encourages everyone to see themselves as the ideal ethical arbiter, whether their interests align with others’ or not.
Users don’t care whether we intend harm or not; they care whether we cause harm.
What if everyone did what I’m about to do? This simplified version of Kant’s most important theory is an invaluable ethical prompt for technologists.
We use our phones 150 times a day.
Seeing ethics as just another equation to be solved is a sad, solutionist point of view. It ignores the most important parts of ethics: dialogue, consensus, resolve.
‘Those who cannot perceive the network cannot act effectively within it, and are powerless.’
The idea of privacy as a luxury good should worry us, since it helps only the already powerful.
Major speech recognition platforms have accuracy above 95%.
In 2017, security researchers revealed they’d discovered 234 Android apps ‘constantly listening for ultrasonic beacons in the background without the user’s knowledge’; two years prior, Samsung admitted their smart TVs were sharing audio with unnamed third parties.
Technologists have a responsibility to help people form accurate mental models of changing technologies.
Even in the face of tax cuts, the tech giants hold immense cash reserves offshore, beyond the grasp of authorities. This clear evasion of social contract duties goes sadly unpunished: monopolistic power makes it easier to dodge your end of the deal.
As an entry point to thinking about virtue, we can apply another ethical test: would I be happy for my decision to appear on the front page of tomorrow’s news?
Former NSA and CIA director Michael Hayden is on record: ‘We kill people based on metadata.’
Russia also has autonomous intentions, with Vladimir Putin saying ‘Whoever becomes the leader in [AI] will become the ruler of the world.’
In total, 70% of global internet traffic is routed through Loudoun County, Virginia.
Saving to a local hard drive uses around one-millionth of the energy of saving the same file to cloud storage.
‘There is no word for “nature” in my language. “Nature” in English seems to refer to that which is separate from human beings. It is a distinction we don’t recognize.’