Kindle Notes & Highlights
Read between April 20 and April 27, 2020
But by directly communicating select messages to select voters, the microtargeting of the Obama campaign had started a journey toward the privatisation of public discourse in America.
The town square, the very foundation of American democracy, was incrementally being replaced by online ad networks. And without any scrutiny, campaign messages no longer even had to look like campaign messages. Social media created a new environment where campaigns could now appear, as Obama’s campaign piloted, as if your friend was sending you a message, without your realising the source or calculated intent of that contact.
America is now living in the aftermath of the first scaled deployment of a psychological weapon of mass destruction.
It turns out that Republicans can accept a batshit insane candidate, so long as it’s consistent insanity.
‘Think about it,’ I said to Bannon. ‘The message at a Tea Party rally is the same as at a Gay Pride parade: Don’t tread on me! Let me be who I am!’
…the site’s algorithm will start to funnel the user similar stories and pages – all to increase engagement. For Facebook, rising engagement is the only metric that matters, as more engagement means more screen time to be exposed to advertisements. This is the darker side of Silicon Valley’s much celebrated metric of ‘user engagement’. By focusing so heavily on greater engagement, social media tends to parasitise our brain’s adaptive mechanisms. As it happens, the most engaging content on social media is often horrible or enraging.
According to evolutionary psychologists, in order to survive in premodern times, humans developed a disproportionate attentiveness toward potential threats.
Social media platforms also use designs that activate ‘ludic loops’ and ‘variable reinforcement schedules’ in our brains. These are patterns of frequent but irregular rewards that create anticipation, but where the end reward is too unpredictable and fleeting to plan around. This establishes a self-reinforcing cycle of uncertainty, anticipation and feedback. The randomness of a slot machine prevents the player from being able to strategise or plan, so the only way to get a reward is to keep playing.
In gambling, a casino makes money from the number of turns a player takes. On social media, a platform makes money from the number of clicks a user performs. This is why there are infinite scrolls on newsfeeds – there is very little difference between a user endlessly swiping for more content and a gambler pulling the slot machine lever over and over.
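The variable-ratio mechanics described above can be sketched as a small simulation (illustrative only; the function names and parameters here are my own, not from the book): each pull – of a lever, or of an infinite-scroll feed – pays out with a fixed probability, so the gaps between rewards vary unpredictably and cannot be planned around.

```python
import random

def variable_ratio_session(n_pulls: int, reward_prob: float, seed: int = 42) -> list[int]:
    """Simulate a variable-ratio reinforcement schedule.

    Each pull independently yields a reward with probability
    `reward_prob`. Returns the pull indices at which a reward arrived.
    """
    rng = random.Random(seed)
    return [i for i in range(n_pulls) if rng.random() < reward_prob]

def gaps_between_rewards(reward_pulls: list[int]) -> list[int]:
    """Number of pulls separating consecutive rewards."""
    return [b - a for a, b in zip(reward_pulls, reward_pulls[1:])]

rewards = variable_ratio_session(n_pulls=200, reward_prob=0.1)
gaps = gaps_between_rewards(rewards)
# The gaps vary widely: there is no fixed interval a user could
# strategise around, so the only route to the next reward is to
# keep pulling (or swiping).
```

Because the schedule is random rather than fixed-interval, no amount of observation lets the player predict the next payout – which is exactly why, as the passage notes, the only viable strategy is to keep playing.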
It created a wicked reinforcement cycle in which the cohort would strengthen their racialised views when they were exposed to criticism. This may be in part because the area of the brain that is most highly activated when we process strongly held beliefs is the same area that is involved when we think about who we are and our identity.
By making people angry in this way, CA was following a fairly wide corpus of research showing that anger interferes with information seeking. This is why people can ‘jump to conclusions’ in a fit of rage, even if later they regret their decisions.
What CA observed was that when respondents were angry, their need for complete and rational explanations was also significantly reduced.
In particular, anger put people in a frame of mind in which they were more indiscriminately punitive, particularly to out-groups. They would also underestimate the risk of negative outcomes. This led CA to discover that even if a hypothetical trade war with China or Mexico meant the loss of American jobs and profits, people primed with anger would tolerate that domestic economic damage if it meant they could use a trade war to punish immigrant groups and urban liberals.
Bannon wanted his targets to ‘discover themselves’ and ‘become who they really were’. But the tools created at Cambridge Analytica in 2014 were not about self-actualisation; they were used to accentuate people’s innermost demons in order to build what Bannon called his ‘movement’. By targeting people with specific psychological vulnerabilities, the firm victimised them into joining what was nothing more than a cult led by false prophets, where reason and facts would have little effect on its new followers, digitally isolated as they now were from inconvenient narratives.
In the last discussion I ever had with Bannon, he told me that to fundamentally change society, ‘you have to break everything’. And that’s what he wanted to do – to fracture ‘the establishment’.
In short, it is far harder to make angry people fearful. The ‘affect bias’ arising out of anger mediates people’s estimation of negative outcomes, which is why angry people are more inclined to engage in risky behaviour – the same is true whether they are voting or starting a bar fight. If you have ever been in a bar fight, you know that literally the worst way imaginable to make your opponent think twice about a rash move is to yell threats at him. It only eggs him on.
Remain’s focus on the economy also neglected to stop and ask people what they thought the economy was in the first place. Cambridge Analytica identified that many people in non-urban regions or in lower socioeconomic strata often externalised the notion of ‘the economy’ to something that only the wealthy and metropolitan participated in. ‘The economy’ was not their job in a local store; it was something that bankers did. This is also what made certain groups comfortable with economic risks and even trade wars, since, in their minds, that chaos would be unleashed upon the people who worked in …
In March 2018, the UN concluded that Facebook had played a ‘determining role’ in the ethnic cleansing of the Rohingya people. Violence was enabled by Facebook’s frictionless architecture, propelling hate speech through a population at a velocity previously unimaginable. Facebook’s apathetic response was positively Orwellian.
What was supposed to be so brilliant about the internet was that people would suddenly be able to erode all those barriers and talk to anyone, anywhere. But what actually happened was an amplification of the same trends that took hold of a country’s physical spaces. People spend hours on social media, following people like them, reading news articles ‘curated’ for them by algorithms whose only morality is click-through rates – articles that do nothing but reinforce a unidimensional point of view and take users to extremes to keep them clicking. What we’re seeing is a cognitive segregation, …
Shared experience is the fundamental basis for solidarity among citizens in a modern pluralistic democracy, and the story of the civil rights movement is, in part, the story of being able to share space together: being in the same part of the movie theatre or using the same water fountain or bathroom. Segregation in America has always manifested itself in insidiously mundane ways – through separate bus seats, water fountains, schools, theatre tickets and park benches.
From social isolation comes the raw material of both conspiracism and populism: mistrust. Cambridge Analytica was the inevitable product of this balkanised cyberspace. The company was able to get its targets addicted to rage only because there was nothing to prevent it from doing so – and so, unimpeded, the company drowned them in a maelstrom of disinformation, with predictably disastrous results. But simply stopping CA is not enough. Our newfound crisis of perception will only continue to worsen until we address the underlying architectures that got us here.
In February 2013, a Russian military general named Valery Gerasimov wrote an article challenging the prevailing notions of warfare. Gerasimov, who was Russia’s chief of the general staff (roughly equivalent to chairman of the US Joint Chiefs of Staff), penned his thoughts in the Military-Industrial Kurier under the title ‘The Value of Science is in the Foresight’ – a set of ideas that some would later dub the Gerasimov Doctrine. Gerasimov wrote that the ‘“rules of war” have changed’ and that ‘the role of nonmilitary means of achieving political and strategic goals has grown’. He addressed the …
It’s difficult for military strategists to envision new forms of battle when they’re focused on those at hand. Before the advent of flight, military commanders cared only about how to wage combat on land or at sea. It wasn’t until 1915, when the French pilot Roland Garros flew a plane jerry-rigged with a machine gun, that military strategists realised that war could actually be waged from the skies. Then, once aircraft began engaging in attacks, army units on the ground pivoted as well, creating compact, rapid-fire antiaircraft guns. And so the evolution of war continued.
You can draw a straight line from the groundwork laid by Gerasimov, Chekinov and Bogdanov, right through the actions of Cambridge Analytica, to the victories of the Brexit and Trump campaigns. In only five or so years, the Russian military and state have managed to develop the first devastatingly effective new weapon of the twenty-first century. They knew it would work, because companies such as Facebook would never take the ‘un-American’ step of reining in their users. So Russia didn’t have to disseminate propaganda. They could just get the Americans to do it themselves, by clicking, liking …
In this new economy of surveillance capitalism, we are the raw materials. What this means is that there is a new economic incentive to create substantial informational asymmetries between platforms and users. In order to be able to convert user behaviour into profit, platforms need to know everything about their users’ behaviour, while their users know nothing of the platform’s behaviour.
For the first time in human history, we will immerse ourselves in motivated spaces influenced by these silicon spirits of our making. No longer will our environment be passive or benign; it will have intentions, opinions and agendas.
Over centuries, the law has developed several fundamental presumptions about human nature. The most important of these is the notion of human agency as an irrefutable presumption in the law – that humans have the capacity to make rational and independent choices on their own accord. It follows that the world does not make decisions for humans, but that humans make decisions inside of that world.
This notion of human agency serves as the philosophical basis for criminal culpability, and we punish transgressors of the law on the grounds that they made a condemnable choice. A burning building may indeed harm people, but the law does not punish that building, as it has no agency.
The rights to life, liberty, association, speech, vote and conscience are all underpinned with a presumption of agency, as they are outputs of that agency. But agency itself has not been articulated as a right per se, as it has always been presumed to exist simply by virtue of our personhood. As such, we do not have an express right to agency that is contra mundum – that is, a right to agency that is exercisable against the environment itself. We do not have a right against the heavens or the undue influence of motivated and thinking spaces to mediate the exercise of our agency.
We like to think of ourselves as immune from influence or our cognitive biases, because we want to feel like we are in control, but industries like alcohol, tobacco, fast food and gaming all know we are creatures that are subject to cognitive and emotional vulnerabilities. And tech has caught on to this with its research into ‘user experience’, ‘gamification’, ‘growth hacking’, and ‘engagement’ by activating ludic loops and reinforcement schedules in the same way slot machines do.
We risk creating a society obsessive about remembering, and we may have overlooked the value of forgetting, moving on or being unknown.
The most common job titles in most Silicon Valley companies are engineer and architect, not service manager or client relations. But unlike engineering in other sectors, tech companies do not have to perform safety tests to conform to any building codes before releasing their products. Instead, platforms are allowed to adopt dark pattern designs that deliberately mislead users into continual use and giving up more data. Tech engineers intentionally design confounding mazes on their platforms that keep people moving deeper and deeper into these architectures, without any clear exit.
Social media and internet platforms are not services; they are architectures and infrastructures. By labelling their architectures as ‘services’, they are trying to make responsibility lie with the consumer, through their ‘consent’. But in no other sector do we burden consumers in this way. Airline passengers are not asked to ‘accept’ the engineering of planes, hotel guests are not asked to ‘accept’ the number of exits in the building, and people are not asked to ‘accept’ the purity levels of their drinking water.
For too long the congresses and parliaments of the world have fallen for a mistaken view that somehow ‘the law cannot keep up with technology’. The technology sector loves to parrot this idea, as it tends to make legislators feel too stupid or out of touch to challenge their power. But the law can keep up with technology, just as it has with medicines, civil engineering, food standards, energy and countless other highly technical fields. Legislators do not need to understand the chemistry of molecular isomers inside a new cancer drug to create effective drug review processes, nor do they need …
If we as software engineers and data scientists are to call ourselves professionals worthy of the esteem and high salaries we command, there must be a corresponding duty for us to act ethically.
Regulations on tech companies will not be nearly as effective as they could be if we do not start by putting skin in the game for people inside these companies. We need to put the onus on engineers to start giving a damn about what they build.
And if, upon due consideration, an employer’s request to build a feature is deemed to be unethical by the engineer, there should be a duty to refuse and a duty to report, where failure to do so would result in serious professional consequences. Those who refuse and report must also be protected by law from retaliation from their employer.
The regulation of internet utilities should recognise the special place they hold in society and commerce and impose a higher standard of care toward users. These regulations should take the form of statutory duties, with penalties benchmarked to annual profits as a way to stop the current situation, in which regulatory infractions are negotiated and accounted for as a cost of doing business.
In this light, principle-based rather than technology-based regulation should be created so that we are careful not to embed old technologies or outdated business models into regulatory codes.

