Kindle Notes & Highlights
Read between January 18 - February 25, 2024
invented the integrated circuit at Fairchild Semiconductor in the late 1950s and the 1960s, imprinting multiple transistors on silicon wafers to produce what came to be called silicon chips.
Gordon [Moore]... proposed his eponymous “law”: every twenty-four months, the number of transistors...
Since the early 1970s the number of transistors per chip has increased ten-million-fold. Their power has increased by ten orders of magnitude—a seventeen-billion-fold improvement.
Fairchild Semiconductor sold one hundred transistors for $150 each in 1958. Transistors are now produced in the tens of trillions per second, at billionths of a dollar per transistor: the fastest, most extensive proliferation in history.
Back in 1983, only 562 computers in total were connected to the primordial internet.
Someone from the postwar world would be staggered by the scale and reach of what had seemed a niche technology.
Technology’s unavoidable challenge is that its makers quickly lose control over the path their inventions take once introduced to the world.
Thomas Edison invented the phonograph so people could record their thoughts for posterity and to help the blind. He was horrified when most people just wanted to play music.
Gutenberg just wanted to make money printing Bibles. Yet his press catalyzed the Scientific Revolution and the Reformation, and so became the greatest threat to the Catholic Church since its establishment.
Understanding technology is, in part, about trying to understand its unintended consequences, to predict not just positive spillovers but “revenge effects.”
Containment is the overarching ability to control, limit, and, if need be, close down technologies at any stage of their development or deployment.
As the printing press roared across Europe in the fifteenth century, the Ottoman Empire had a rather different response. It tried to ban it.
Fear and suspicion of anything new and different are endemic.
People throughout history have attempted to resist new technologies because they felt threatened and worried their livelihoods and way of life would be destroyed.
Where there is demand, technology always breaks out, finds traction, builds users.
China kept the secret of silk making under wraps for centuries, but it got out in the end thanks to two determined Nestorian monks in 552 CE.
Technologies are ideas, and ideas cannot be eliminated.
Ernest Rutherford
Leo Szilard
The day after Rutherford called it moonshine, Szilard conceptualized a nuclear chain reaction. The first nuclear explosion came just twelve years later. On July 16, 1945, under the auspices of the Manhattan Project, the U.S. Army detonated a device code-named Trinity in the New Mexico desert. Weeks later a Boeing B-29 Superfortress, the Enola Gay, dropped a device code-named Little Boy containing sixty-four kilograms of uranium-235 over the city of Hiroshima, killing 140,000 people.
To date only nine countries have acquired them.
The biggest explosion ever recorded was a test of an H-bomb called the Tsar Bomba.
The blast was ten times more powerful than the combined total of all the conventional explosives deployed in World War II.
hemmed
Lack of widespread demand has meant little pressure to reduce costs and grow access; they are not subject to the classic cost curves of modern consumer technology. These were never going to spread like transistors or flat-screen TVs;
guises.
Vasili Arkhipov,
The engineer A. Q. Khan helped Pakistan develop nuclear weapons by stealing centrifuge blueprints and fleeing the Netherlands.
The worrying truth of this fearsome technology is that humanity has tried to say no and only partially succeeded. Nuclear weapons are among the most contained technologies in history, and yet the containment problem—in its hardest, most literal sense—even here remains acutely unsolved.
While the EU bans GMOs in the food supply, they’re ubiquitous in other parts of the world.
AlphaGo initially learned by watching 150,000 games played by human experts. Once we were satisfied with its initial performance, the key next step was creating lots of copies of AlphaGo and getting it to play against itself over and over.
in March 2016, we organized a tournament in South Korea. AlphaGo was pitted against Lee Sedol, a virtuoso world champion.
with just a day’s training, AlphaZero was capable of learning more about the game than the entirety of human experience could teach it.
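The self-play idea in these highlights can be made concrete with a toy sketch. The code below is emphatically not DeepMind's: it uses simple tabular value learning on a trivial take-away game (players alternately remove one or two stones; whoever takes the last stone wins), and every name in it is hypothetical. What it shares with AlphaGo and AlphaZero is only the core loop the highlights describe: the agent generates its own training data by playing against itself, then learns from the outcomes.

```python
import random
from collections import defaultdict

# Toy self-play sketch (hypothetical, not DeepMind's code).
# Game: players alternately take 1 or 2 stones from a pile of 10;
# whoever takes the last stone wins.

Q = defaultdict(float)   # learned value of (stones_left, move)
EPS, LR = 0.2, 0.1       # exploration rate, learning rate

def choose(stones, explore=True):
    moves = [m for m in (1, 2) if m <= stones]
    if explore and random.random() < EPS:
        return random.choice(moves)              # occasionally explore
    return max(moves, key=lambda m: Q[(stones, m)])

def self_play_game():
    history, stones, player = [], 10, 0
    while stones > 0:                            # the agent plays both sides
        move = choose(stones)
        history.append((player, stones, move))
        stones -= move
        player = 1 - player
    winner = 1 - player                          # whoever moved last won
    for who, s, m in history:                    # learn from the outcome
        reward = 1.0 if who == winner else -1.0
        Q[(s, m)] += LR * (reward - Q[(s, m)])

for _ in range(20000):
    self_play_game()

# Taking 1 from 10 (leaving the opponent a multiple of 3) is the
# known optimal strategy, so its learned value should come out higher.
print(Q[(10, 1)], Q[(10, 2)])
```

No human game records are used: as in the highlighted passage, the agent's opponent is a copy of itself, and its skill and its training data improve together.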
Technology is hence like a language or chemistry: not a set of independent entities and practices, but a commingling set of parts to combine and recombine.
buttresses,
backpropagation
when an error is spotted, adjustments propagate back through the network to help correct it in the future. Keep doing this, modifying the weights again and again, and you gradually improve the performance of the neural network so that eventually it’s able to go all the way from taking in single pixels to learning the existence of lines, edges, shapes, and then ultimately entire objects in scenes. This, in a nutshell, is deep learning.
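A minimal sketch of that loop, assuming a tiny two-layer network trained on the XOR toy problem with NumPy (an illustration of the mechanism the highlight describes, not anything from the book): errors measured at the output are propagated backward through the network, and the weights are nudged again and again until the predictions improve.

```python
import numpy as np

# Minimal backpropagation sketch (illustrative only): a two-layer
# network learns XOR. Errors at the output flow backward through
# the network, and every weight gets a small correction each step.

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(0, 1, (2, 8)), np.zeros(8)
W2, b2 = rng.normal(0, 1, (8, 1)), np.zeros(1)
lr = 0.5

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

for step in range(10000):
    # Forward pass: inputs -> hidden features -> prediction.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: the prediction error is propagated back,
    # layer by layer, via the chain rule.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Nudge every weight against its error gradient.
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(0)

print(out.round(2))  # should approach [[0], [1], [1], [0]]
```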
AlexNet was built by the legendary researcher Geoffrey Hinton and two of his students, Alex Krizhevsky and Ilya Sutskever, at the University of Toronto.
In 2012, AlexNet beat the previous winner by 10 percent.
Your smartphone recognizes objects and scenes, while vision systems automatically blur the background
Computer vision is the basis of Amazon’s checkout-less supermarkets
Following the AlexNet breakthrough, AI suddenly became a major priority in academia, government, and corporate life. Geoffrey Hinton and his colleagues were hired by Google.
Shortly after DQN, we sold DeepMind to Google, and the tech giant soon switched to a strategy of “AI first” across all its products.
in November 2022, the AI research company OpenAI released ChatGPT.
large language models (LLMs)—including
The model reads very large numbers of sentences, learns an abstract representation of the information contained within them, and then, based on this, generates a prediction about what should come next.
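That predict-the-next-token loop can be sketched in a few lines. In the toy below (not any real model's code), the deep network is replaced by a simple bigram count table over a tiny made-up corpus, but the generation loop has the same shape: compute a distribution over the next token, sample one, append it, repeat.

```python
import numpy as np

# Toy next-token generator (not a real LLM): the "model" is a
# bigram count table, but the loop is the same shape as an LLM's:
# predict a distribution over the next token, sample, repeat.

corpus = "the cat sat on the mat the cat ran".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}

counts = np.ones((len(vocab), len(vocab)))    # add-one smoothing
for a, b in zip(corpus, corpus[1:]):
    counts[idx[a], idx[b]] += 1               # tally observed pairs

def next_token_probs(tokens):
    row = counts[idx[tokens[-1]]]             # condition on the last token
    return row / row.sum()                    # normalize to probabilities

rng = np.random.default_rng(0)
tokens = ["the"]
for _ in range(5):
    probs = next_token_probs(tokens)
    tokens.append(vocab[rng.choice(len(vocab), p=probs)])
print(" ".join(tokens))   # e.g. something like "the cat sat on the mat"
```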
When a large language model ingests a sentence, it constructs what can be thought of as an “attention map.” It first organizes commonly occurring groups of letters or punctuation into “tokens,”
humans do this with words, of course, but the model doesn’t use our vocabulary.
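A rough sketch of how such tokens can arise, assuming the byte-pair-encoding style of merging that many real tokenizers use (the book names no specific algorithm): start from single characters and repeatedly fuse the most frequent adjacent pair into a new token, so commonly occurring groups of letters become single units.

```python
from collections import Counter

def train_bpe(text, num_merges):
    """Byte-pair-encoding sketch: start from single characters and
    repeatedly merge the most frequent adjacent pair into one token."""
    tokens = list(text)
    merges = []
    for _ in range(num_merges):
        pairs = Counter(zip(tokens, tokens[1:]))
        if not pairs:
            break
        (a, b), _ = pairs.most_common(1)[0]   # most frequent adjacent pair
        merges.append((a, b))
        merged, i = [], 0
        while i < len(tokens):
            if i + 1 < len(tokens) and (tokens[i], tokens[i + 1]) == (a, b):
                merged.append(a + b)          # fuse the pair into one token
                i += 2
            else:
                merged.append(tokens[i])
                i += 1
        tokens = merged
    return tokens, merges

tokens, merges = train_bpe("low lower lowest", 4)
print(tokens)   # frequent chunks like "low" emerge as single tokens
```

The resulting token inventory reflects whatever groupings are statistically common in the training text, which is why, as the highlight notes, it doesn't line up with a human vocabulary.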