Hello World: How to be Human in the Age of the Machine
Read between September 17 and October 2, 2021
2%
No object or algorithm is ever either good or evil in itself. It’s how they’re used that matters. GPS was invented to launch nuclear missiles and now helps deliver pizzas.
2%
Forming an opinion on an algorithm means understanding the relationship between human and machine. Each one is inextricably connected to the people who build and use it.
Megan Hovvels
Giving me a much more objective view of algorithms
3%
algorithm (noun): A step-by-step procedure for solving a problem or accomplishing some end especially by a computer.
Megan Hovvels
Good soup
4%
You give the machine data, a goal and feedback when it’s on the right track – and leave it to work out the best way of achieving the end.
Megan Hovvels
Think chess engines
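A rough sketch of what "data, a goal and feedback" can look like in code: a toy learner that is only told when a choice paid off and gradually works out the best option on its own. The option names and payout numbers are invented for illustration, not taken from the book.

```python
import random

# The "feedback": a reward that arrives only after a choice is made.
# True payout rates are hidden from the learner and made up for this sketch.
true_payouts = {"opening_a": 0.3, "opening_b": 0.6, "opening_c": 0.5}
estimates = {option: 0.0 for option in true_payouts}
counts = {option: 0 for option in true_payouts}

for trial in range(10_000):
    # Mostly exploit the best-looking option, occasionally explore a random one.
    if random.random() < 0.1:
        choice = random.choice(list(true_payouts))
    else:
        choice = max(estimates, key=estimates.get)
    reward = 1 if random.random() < true_payouts[choice] else 0  # the feedback
    counts[choice] += 1
    estimates[choice] += (reward - estimates[choice]) / counts[choice]

print(max(estimates, key=estimates.get))  # almost always "opening_b"
```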
5%
Although AI has come on in leaps and bounds of late, it is still only ‘intelligent’ in the narrowest sense of the word. It would probably be more useful to think of what we’ve been through as a revolution in computational statistics than a revolution in intelligence.
Megan Hovvels
This was very much the conclusion I came to when studying AI Law and autonomous vehicles. I think the public perception of AI is more optimistic than the reality of its advances. I think in actuality, when you break it down, it is simpler (for want of a better word; comprehensible, maybe?) than I first thought.
6%
Perhaps more ominous, given how much of our information we now get from algorithms like search engines, is how much agency people believed they had in their own opinions: ‘When people are unaware they are being manipulated, they tend to believe they have adopted their new thinking voluntarily,’
8%
Meehl systematically compared the performance of humans and algorithms on a whole variety of subjects – predicting everything from students’ grades to patients’ mental health outcomes – and concluded that mathematical algorithms, no matter how simple, will almost always make better predictions than people.
Megan Hovvels
Scope of human error
8%
It’s like the saying among airline pilots that the best flying team has three components: a pilot, a computer and a dog. The computer is there to fly the plane, the pilot is there to feed the dog. And the dog is there to bite the human if it tries to touch the computer.
Megan Hovvels
Hehe
8%
It’s known to researchers as algorithm aversion. People are less tolerant of an algorithm’s mistakes than of their own – even if their own mistakes are bigger.
Megan Hovvels
This is entirely true, autonomous vehicles are over 99% less likely to cause an accident than humans, but think about the backlash received when Tesla cars have caused an accident. It is weird to me how ready people are to use this as an argument against delegating functions like driving to AI.
9%
Facebook scandal, these words were repeatedly reprinted by journalists wanting to hint at a Machiavellian attitude to privacy within the company.
Megan Hovvels
See above
9%
we’re not always aware of the longer-term implications of that trade. It’s rarely obvious what our data can do, or, when fed into a clever algorithm, just how valuable it can be. Nor, in turn, how cheaply we were bought.
Megan Hovvels
It is almost entirely unregulated; this is the problem.
10%
Palantir Technologies is one of the most successful Silicon Valley start-ups of all time.
11%
Palantir is just one example of a new breed of companies whose business is our data. And alongside the analysts, there are also the data brokers: companies who buy and collect people’s personal information and then resell it or share it for profit. Acxiom, Corelogic, Datalogix, eBureau – a swathe of huge companies you’ve probably never directly interacted with, that are none the less continually monitoring and analysing your behaviour.
Megan Hovvels
That is creeeeeepy and I don’t like it at all
11%
In the most literal sense, within some of these brokers’ databases, you could open up a digital file with your ID number on it (an ID you’ll never be told) that contains traces of everything you’ve ever done.
Megan Hovvels
TF
12%
In whatever corner of the internet you use, hiding in the background, these algorithms are trading on information you didn’t know they had and never willingly offered. They have made your most personal, private secrets into a commodity.
Megan Hovvels
A compelling but somewhat emotive way to put it.
13%
And that is where we start to stray very far over the creepy line. When private, sensitive information about you, gathered without your knowledge, is then used to manipulate you. Which, of course, is precisely what happened with the British political consulting firm Cambridge Analytica.
15%
But the advertisers aren’t injecting their messages straight into the minds of a passive audience. We’re not sitting ducks. We’re much better at ignoring advertising or putting our own spin on interpreting propaganda than the people sending those messages would like us to be.
Megan Hovvels
A perspective not shared enough
15%
Trump won Pennsylvania by 44,000 votes out of six million cast, Wisconsin by 22,000, and Michigan by 11,000, perhaps margins of less than 1 per cent might be all you need.
15%
All around the world, people have free and easy access to instant global communication networks, the wealth of human knowledge at their fingertips, up-to-the-minute information from across the earth, and unlimited usage of the most remarkable software and technology, built by private companies, paid for by adverts.
15%
That was the deal that we made. Free technology in return for your data and the ability to use it to influence and profit from you. The best and worst of capitalism in one simple swap.
Megan Hovvels
‘Best and worst of capitalism’, I think that summarises things rather well.
15%
It’s known as Sesame Credit, a citizen scoring system used by the Chinese government.
15%
If you live in the EU, there has recently been a new piece of legislation called GDPR – General Data Protection Regulation – that should make much of what data brokers are doing illegal.
Megan Hovvels
Brexit blues here
16%
Apple has now built ‘intelligent tracking prevention’ into the Safari browser.
16%
Europe might be ahead of the curve, but there is a global trend that is heading in the right direction.
16%
Whenever we use an algorithm – especially a free one – we need to ask ourselves about the hidden incentives. Why is this app giving me all this stuff for free? What is this algorithm really doing? Is this a trade I’m comfortable with? Would I be better off without it?
Megan Hovvels
Definitely something to think about more going forward.
19%
Because the algorithm’s predictions are based on the patterns it learns from the data, a random forest is described as a machine-learning algorithm, which comes under the broader umbrella of artificial intelligence.
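For concreteness, a random forest is an ensemble of decision trees that votes on each prediction. A minimal scikit-learn sketch on synthetic data; a real recidivism model would of course be trained on historical case records, not random numbers.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in data: 1,000 "cases" with 10 features each.
X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

forest = RandomForestClassifier(n_estimators=100, random_state=0)
forest.fit(X_train, y_train)              # learn patterns from past examples
print(forest.predict_proba(X_test[:1]))   # predicted probabilities for one new case
print(forest.score(X_test, y_test))       # accuracy on held-out cases
```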
20%
The researchers argued that, whichever way you use it, their algorithm vastly outperforms the human judge. And the numbers back them up.
20%
These two kinds of error, false positive and false negative, are not unique to recidivism. They’ll crop up repeatedly throughout this book. Any algorithm that aims to classify can be guilty of these mistakes.
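To pin the two error types down: a false positive flags someone who would not have gone on to reoffend, a false negative clears someone who does. A tiny worked example with invented labels:

```python
# 1 = reoffended, 0 = did not; predictions are the algorithm's calls.
actual    = [1, 0, 1, 1, 0, 0, 1, 0]
predicted = [1, 1, 0, 1, 0, 0, 1, 1]

false_positives = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 1)
false_negatives = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 0)
print(false_positives, false_negatives)  # 2 people wrongly flagged, 1 wrongly cleared
```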
21%
The algorithm’s false positives were disproportionately black.
24%
Weber’s Law states that the smallest change in a stimulus that can be perceived, the so-called ‘Just Noticeable Difference’, is proportional to the initial stimulus.
24%
And yet, instead of adding a few months on, judges will jump to the next noticeably different sentence length, which in this case is 25 years.
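Weber's Law in symbols is roughly delta_S = k * S: the Just Noticeable Difference grows in proportion to the starting stimulus. A quick sketch with an assumed Weber fraction of 0.25 (an illustrative number, not one from the book), showing why a jump from, say, a 20-year sentence lands near 25 years rather than 20 years and a few months:

```python
def just_noticeable_difference(stimulus, weber_fraction=0.25):
    # Weber's Law: the smallest perceptible change is proportional to the stimulus.
    # The fraction 0.25 is an assumed, illustrative value, not a figure from the book.
    return weber_fraction * stimulus

for years in (2, 10, 20):
    jump = just_noticeable_difference(years)
    print(f"{years}-year sentence -> next noticeably different length ~ {years + jump} years")
# 2 -> 2.5, 10 -> 12.5, 20 -> 25.0: the longer the sentence, the bigger the jump.
```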
25%
One London-based defence lawyer I spoke to told me that his role in the courtroom was to exploit the uncertainty in the system, something that the algorithm would make more difficult. ‘The more predictable the decisions get, the less room there is for the art of advocacy.’
25%
want someone to use a reasoned strategy. We want to keep judicial discretion, as though it is something so holy.
27%
Andy Beck, a Harvard pathologist and founder of PathAI, a company created in 2016 that creates algorithms to classify biopsy slides.
27%
The trick is to shift away from the rule-based paradigm and use something called a ‘neural network’. You can imagine a neural network as an enormous mathematical structure that features a great many knobs and dials.
27%
But with every picture you feed into it, you tweak those knobs and dials. Slowly, you train it.
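A stripped-down version of the knobs-and-dials picture: a single artificial neuron whose two weights get nudged a little after every example it sees. The training data and the rule it learns are invented for this sketch; a real image classifier has millions of such dials arranged in layers.

```python
import math
import random

weights = [random.uniform(-1, 1), random.uniform(-1, 1)]  # two "knobs"
bias = 0.0                                                # one more knob
learning_rate = 0.1

def predict(x):
    z = weights[0] * x[0] + weights[1] * x[1] + bias
    return 1 / (1 + math.exp(-z))  # squash to a 0-1 confidence score

# Made-up examples: label 1 when the first feature dominates.
examples = [([0.9, 0.1], 1), ([0.8, 0.3], 1), ([0.1, 0.7], 0), ([0.2, 0.9], 0)]

for _ in range(2000):
    for x, label in examples:
        error = predict(x) - label                   # how wrong was the guess?
        weights[0] -= learning_rate * error * x[0]   # tweak each knob slightly
        weights[1] -= learning_rate * error * x[1]
        bias -= learning_rate * error

print(round(predict([0.95, 0.05])), round(predict([0.05, 0.95])))  # 1 0
```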
28%
The surprising thing about neural networks is that their operators usually don’t understand how or why the algorithm reaches its conclusions.
Megan Hovvels
!?
28%
The problem is that refining an algorithm often means making a choice between sensitivity and specificity. If you focus on improving one, it often means a loss in the other.
Megan Hovvels
Balancing act needing more critical analysis
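A small illustration of that trade-off: for a fixed set of scores, raising the decision threshold catches fewer of the real cases (lower sensitivity) but correctly clears more of the healthy ones (higher specificity). The scores and labels below are invented.

```python
# 1 = has the condition, 0 = does not; scores are the model's confidence.
scores = [0.95, 0.90, 0.80, 0.70, 0.60, 0.40, 0.30, 0.20, 0.10, 0.05]
labels = [1,    1,    1,    0,    1,    0,    1,    0,    0,    0]

for threshold in (0.25, 0.50, 0.75):
    preds = [1 if s >= threshold else 0 for s in scores]
    true_positives = sum(1 for p, l in zip(preds, labels) if p == 1 and l == 1)
    true_negatives = sum(1 for p, l in zip(preds, labels) if p == 0 and l == 0)
    sensitivity = true_positives / sum(labels)                  # true cases caught
    specificity = true_negatives / (len(labels) - sum(labels))  # healthy correctly cleared
    print(f"threshold {threshold}: sensitivity {sensitivity:.1f}, specificity {specificity:.1f}")
# 0.25 -> 1.0 / 0.6,  0.50 -> 0.8 / 0.8,  0.75 -> 0.6 / 1.0
```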
29%
Ninety per cent of the nuns who went on to develop Alzheimer’s had ‘low linguistic ability’ as young women, while only 13 per cent of the nuns who maintained cognitive ability into old age got a ‘low idea density’ score in their essays.
33%
lack of a single, connected medical history meant that it was impossible for any individual doctor to fully understand the severity of her condition.
37%
Hammond, the British Chancellor of the Exchequer, announced the government’s intention to have fully driverless cars – without a safety attendant on board – on British roads by 2021.
Megan Hovvels
Lmao didn’t age well
39%
Unlike cameras, lasers can measure distance. Vehicles that use a system called LiDAR (Light Detection and Ranging, first used at the second DARPA Grand Challenge
39%
the camera, the LiDAR, the radar – can do enough to understand what’s going on around a vehicle. The trick to successfully building a driverless car is combining them.
39%
blue circle on Google Maps that surrounds your location – it’s there to indicate the potential error in the GPS reading.
Megan Hovvels
The more you know!
39%
This is all Bayes’ theorem does: offers a systematic way to update your belief in a hypothesis on the basis of the evidence.
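Bayes' theorem in one line: P(hypothesis | evidence) is proportional to P(evidence | hypothesis) x P(hypothesis). A toy sketch of the update for the driverless-car setting: which of three road segments is the car on, given a noisy reading that suggests segment B? The prior and likelihood numbers are made up for illustration.

```python
# Prior belief about where the car is, before the new sensor reading.
prior = {"segment_a": 0.5, "segment_b": 0.3, "segment_c": 0.2}

# Likelihood: how probable a "reading says segment_b" is from each true location.
likelihood = {"segment_a": 0.2, "segment_b": 0.7, "segment_c": 0.1}

# Bayes' rule: posterior is proportional to likelihood x prior, then normalise.
unnormalised = {seg: prior[seg] * likelihood[seg] for seg in prior}
total = sum(unnormalised.values())
posterior = {seg: value / total for seg, value in unnormalised.items()}

print(posterior)
# ~{'segment_a': 0.30, 'segment_b': 0.64, 'segment_c': 0.06}
# The noisy reading shifts belief towards segment_b without throwing the prior away.
```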
41%
Mercedes position. ‘Save the one in the car.’ He went on: ‘If all you know for sure is that one death can be prevented, then that’s your first priority.’
Megan Hovvels
Marketing
41%
challenge how we feel about an algorithm making a value judgement on our own, and others’, lives.
44%
Build a machine to improve human performance, she explained, and it will lead – ironically – to a reduction in human ability.
45%
The good still outweighs the bad. Driving remains one of the biggest causes of avoidable deaths in the world. If the technology is remotely capable of reducing the number of fatalities on the roads overall, you could argue that it would be unethical not to roll it out.