Kindle Notes & Highlights
Mathematics is a form of extended cognition.
‘The ultimate goal of mathematics is to eliminate any need for intelligent thought.’
‘Everything is vague to a degree you do not realise till you have tried to make it precise.’
‘In biology, we use several arguments to convince ourselves that problems that require calculus can be solved with arithmetic if one tries hard enough and does another series of experiments.’
spherical cow in a vacuum …
Lotka–Volterra model and
‘Understanding [a complex] system without formal analytical tools requires geniuses, who are so rare even outside biology.’
Mathematical formulation of a model forces it to be self-consistent and, although self-consistency is not necessarily truth, self-inconsistency is certainly falsehood.
All models are wrong the same way all poems are wrong; they capture an essence, if not a perfect literal truth. ‘All models are wrong but some are useful,’ says Box. If the farmer in the
The art of mathematical modelling is in deciding which details matter and steadfastly ignoring those that do not.
vitalism, the idea that life relied on a Lebenskraft, or vital organising force, that went beyond mere chemical and physical interactions.
The higher the voltage was across the capacitor, the stronger this movement of charge – or current
Unfortunately for Galvani, Volta was a younger man, more willing to engage in public debate and on his way up in the field.
Action potentials are also a way for a cell to say something to other cells.
Edgar Adrian in the 1920s is the ‘all-or-nothing’ principle.
the nervous system cares more about quantity than quality.
physiology as ‘the mechanical engineering of living machines’.
The inside of the cell is briefly as positive as the outside, and then more so – the ‘overshoot’. As this is happening potassium channels are opening, letting positively charged potassium ions fall out of the cell.
this exodus of potassium again makes the inside of the cell more negative, and sodium channels close.
whole event takes less than one-half of one-hundredth of a second.
action potential is a delicately controlled explosion occurring a billion times a second in your brain.
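The depolarise-overshoot-repolarise sequence described in the highlights above can be sketched numerically. This is a minimal illustration rather than a model from the book: it uses the FitzHugh–Nagumo equations, a standard two-variable caricature of the action potential, and the parameter values are conventional textbook choices (my assumptions, not the author's).

```python
import numpy as np

# FitzHugh-Nagumo: a two-variable caricature of the action potential.
# v ~ membrane voltage (fast, regenerative, "sodium-like")
# w ~ recovery variable (slow, restorative, "potassium-like")
# Parameters are conventional textbook values, not taken from the book.
def simulate(I=0.5, a=0.7, b=0.8, eps=0.08, dt=0.01, steps=5000):
    v, w = -1.0, -0.5
    trace = []
    for _ in range(steps):
        dv = v - v**3 / 3 - w + I    # fast upstroke and overshoot
        dw = eps * (v + a - b * w)   # slow repolarising recovery
        v += dt * dv
        w += dt * dw
        trace.append(v)
    return np.array(trace)

spikes = simulate()
print(f"peak of the overshoot: {spikes.max():.2f}")
```

The fast variable plays the part of the inrushing sodium, the slow one the part of the potassium that pulls the cell back down; the whole cycle then repeats, spike after spike.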
Specifically, it imbues neurons with the ability to identify sequences.
neurons in the retina have this kind of ‘direction selectivity’. This lets them signal which way objects in the visual field are moving.
these biological details are congruent with Boolean logic.
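The congruence with Boolean logic is easiest to see with threshold neurons in the spirit of McCulloch and Pitts: an all-or-nothing output fires only if the weighted sum of its binary inputs reaches a threshold. The weights and thresholds below are illustrative choices, not examples from the book.

```python
def threshold_neuron(inputs, weights, threshold):
    """Fire (1) if the weighted sum of binary inputs reaches the threshold."""
    return int(sum(w * x for w, x in zip(weights, inputs)) >= threshold)

# AND needs both inputs active; OR needs either one. (Illustrative settings.)
AND = lambda x, y: threshold_neuron([x, y], [1, 1], threshold=2)
OR  = lambda x, y: threshold_neuron([x, y], [1, 1], threshold=1)

assert AND(1, 1) == 1 and AND(1, 0) == 0
assert OR(0, 1) == 1 and OR(0, 0) == 0
```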
Anderson argued in ‘More is different’ that ‘the behaviour of large and complex aggregates of elementary particles, it turns out, is not to be understood in terms of a simple extrapolation of the properties of a few particles’.
engram, was defined as ‘the enduring though primarily latent modification in the irritable substance produced by a stimulus’.
engram and ecphory (or between the processes that lay down a memory and those that retrieve it)
activity determines connectivity and connectivity determines activity.
The Hopfield network (see Figure 8) is a mathematical model of neurons that can implement what Hopfield described as ‘content-addressable memory’.
The Hopfield network is recurrent, meaning that each neuron’s activity is determined by that of any of the others in the network.
Therefore, each neuron’s activity serves as both input and output to its neighbours.
each input a neuron receives from another neuron is multiplied by a particular number – a synaptic weight. These weighted inputs are then add...
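A minimal sketch of the update the truncated highlight describes: each neuron sums its weighted inputs and sets itself to +1 or -1 according to the sign of that sum. The network size, the random symmetric weight matrix, and the asynchronous update order here are illustrative assumptions, not the book's example.

```python
import numpy as np

rng = np.random.default_rng(0)

def hopfield_step(weights, state):
    """Asynchronously update each +1/-1 neuron from the sign of its weighted input."""
    for i in rng.permutation(len(state)):
        state[i] = 1 if weights[i] @ state >= 0 else -1
    return state

# Toy network: random symmetric weights, no self-connections.
# (Illustrative only; stored memories would normally set these weights.)
n = 8
w = rng.standard_normal((n, n))
w = (w + w.T) / 2
np.fill_diagonal(w, 0)

state = rng.choice([-1, 1], size=n)
for _ in range(5):              # repeated updates settle into an attractor
    state = hopfield_step(w, state)
print(state)
```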
In the language of physics, a fully retrieved memory is an example of an attractor. An attractor is, in short, a popular pattern of activity.
Always fond of describing things in terms of energy, physicists consider attractors ‘low energy’ states. They’re a comfortable position for a system to be in; that is what makes them attractive and stable.
Hopfield network can sustain multiple attractors.
room or a trip to a beach that ignites the memory of a childhood holiday – are said to be in that memory’s ‘basin of attraction’.
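In Hopfield's standard formulation (textbook notation, not a quote from the book), a state of binary neurons $s_i = \pm 1$ with symmetric weights $w_{ij}$ has energy

$$E = -\tfrac{1}{2} \sum_{i \neq j} w_{ij}\, s_i s_j,$$

and each asynchronous update can only lower or preserve $E$. The network therefore rolls downhill until it settles in a local minimum; every starting state that rolls into the same minimum lies in that memory's basin of attraction.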
For every experience in which two neurons are either both active or inactive, the connection between them is strengthened.
weights in the Hopfield network are symmetric
If the number of memories remains less than about 14 per cent of the number of neurons, each memory will be restored with minimal error.
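The strengthening rule described above is the Hebbian outer-product prescription. For $P$ stored patterns $\xi^{\mu}$ with components $\pm 1$ across $N$ neurons, the standard form (again textbook notation rather than the book's wording) is

$$w_{ij} = \frac{1}{N} \sum_{\mu=1}^{P} \xi_i^{\mu} \xi_j^{\mu}, \qquad w_{ii} = 0,$$

so any pair that is co-active or co-silent in a pattern ($\xi_i^{\mu} = \xi_j^{\mu}$) contributes $+1/N$ to their connection. The classical capacity analysis puts the limit at roughly $P_{\max} \approx 0.138\,N$, the ‘about 14 per cent’ figure quoted above.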
When pushed past its capacity, the Hopfield network collapses: inputs go towards meaningless attractors and no memories are successfully recovered. It’s a phenomenon given the appropriately dramatic name ‘blackout catastrophe’.
recognition is not recall.
feeling of familiarity when seeing an image can happen without the ability to regenerate that image from scratch.
The Hopfield network is remarkable for being capable of the latter, more difficult task – it fully completes a memory from a partial bit of it....
have a significantly higher capacity: 1,000 neurons can now recognise as many as 23,000 images.
gooey richness of biology.
‘destinesia’ or amnesia about why you’ve gone to where you are.
cells in the prefrontal cortex were different. The neurons there that responded to the visual patterns kept firing even after the patterns disappeared; that is, they maintained their activity during the delay period. A physical signature of working memory at work!
delay activity can only be generated by a network of neurons working together, the connections between them conspiring to keep the activity alive. This is where the idea of attractors comes back into play.
because an attractor stays put. Attractors are defined by derivatives. If we know the inputs a neuron
neuron is part of a recurrent network, it not only gets input but also serves as input to other neurons.
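A toy way to see how recurrence can keep activity alive once the input disappears (a minimal sketch with made-up parameters, not a model quoted from the book): a single rate unit that feeds its own output back to itself through a strong enough weight settles at an elevated firing rate even after the cue is gone.

```python
import numpy as np

def f(x):
    """Saturating firing-rate nonlinearity."""
    return np.tanh(x)

# One self-exciting unit: dr/dt = -r + f(w * r + stimulus)
# Parameters are illustrative; w > 1 gives a stable elevated fixed point.
w, dt = 1.5, 0.01
r = 0.0
trace = []
for step in range(3000):
    stimulus = 1.0 if step < 500 else 0.0   # brief cue, then a delay period
    r += dt * (-r + f(w * r + stimulus))
    trace.append(r)

print(f"rate at end of cue:   {trace[499]:.2f}")
print(f"rate after the delay: {trace[-1]:.2f}  # stays elevated with no input")
```

The elevated rate during the delay is the attractor doing the remembering: the derivative is zero there, so the activity simply stays put until something knocks it out of the basin.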