Models of the Mind: How Physics, Engineering and Mathematics Have Shaped Our Understanding of the Brain
1%
Mathematics is a form of extended cognition.
2%
‘The ultimate goal of mathematics is to eliminate any need for intelligent thought.’
2%
‘Everything is vague to a degree you do not realise till you have tried to make it precise.’
2%
‘In biology, we use several arguments to convince ourselves that problems that require calculus can be solved with arithmetic if one tries hard enough and does another series of experiments.’
2%
spherical cow in a vacuum …
2%
Lotka–Volterra model and …
2%
‘Understanding [a complex] system without formal analytical tools requires geniuses, who are so rare even outside biology.’
3%
Mathematical formulation of a model forces it to be self-consistent and, although self-consistency is not necessarily truth, self-inconsistency is certainly falsehood.
3%
All models are wrong the same way all poems are wrong; they capture an essence, if not a perfect literal truth. ‘All models are wrong but some are useful,’ says Box. If the farmer in the …
3%
The art of mathematical modelling is in deciding which details matter and steadfastly ignoring those that do not.
4%
vitalism, the idea that life relied on a Lebenskraft, or vital organising force, that went beyond mere chemical and physical interactions.
4%
The higher the voltage was across the capacitor, the stronger this movement of charge – or current
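To make this precise with the standard circuit relations (textbook formulas, not equations quoted from the book): the charge stored on a capacitor grows with the voltage across it, and when that charge drains through a resistance the current grows with the voltage too, decaying exponentially as the capacitor empties. Here Q is the stored charge, C the capacitance, V the voltage, R the resistance of the discharge path and V_0 the starting voltage:

\[ Q = CV, \qquad I = \frac{V}{R}, \qquad V(t) = V_0\, e^{-t/RC} \]

The same capacitor-plus-resistor combination is the standard electrical model of the passive nerve membrane.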
5%
Unfortunately for Galvani, Volta was a younger man, more willing to engage in public debate and on his way up in the field.
5%
Action potentials are also a way for a cell to say something to other cells.
7%
[One principle demonstrated by] Edgar Adrian in the 1920s is the ‘all-or-nothing’ principle.
7%
the nervous system cares more about quantity than quality.
8%
physiology as ‘the mechanical engineering of living machines’.
9%
The inside of the cell is briefly as positive as the outside, and then more so – the ‘overshoot’. As this is happening potassium channels are opening, letting positively charged potassium ions fall out of the cell.
9%
[As] this exodus of potassium again makes the inside of the cell more negative, sodium channels close.
9%
whole event takes less than one-half of one-hundredth of a second.
9%
action potential is a delicately controlled explosion occurring a billion times a second in your brain.
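The sequence in the last few highlights – sodium rushing in, the overshoot, potassium falling out, the whole event over in a few milliseconds – is what the Hodgkin–Huxley equations formalise. Below is a minimal Python sketch of that classic model using the standard textbook parameter set; the injected current, the forward-Euler scheme and all numerical values are my own illustrative choices, not code or figures from the book.

import math

# Maximal conductances (mS/cm^2), reversal potentials (mV), capacitance (uF/cm^2)
g_na, g_k, g_l = 120.0, 36.0, 0.3
e_na, e_k, e_l = 50.0, -77.0, -54.4
c_m = 1.0

# Voltage-dependent opening/closing rates for the gating variables m, h, n
def alpha_m(v): return 0.1 * (v + 40.0) / (1.0 - math.exp(-(v + 40.0) / 10.0))
def beta_m(v):  return 4.0 * math.exp(-(v + 65.0) / 18.0)
def alpha_h(v): return 0.07 * math.exp(-(v + 65.0) / 20.0)
def beta_h(v):  return 1.0 / (1.0 + math.exp(-(v + 35.0) / 10.0))
def alpha_n(v): return 0.01 * (v + 55.0) / (1.0 - math.exp(-(v + 55.0) / 10.0))
def beta_n(v):  return 0.125 * math.exp(-(v + 65.0) / 80.0)

def simulate(i_ext=10.0, t_max=20.0, dt=0.01):
    """Inject a constant current (uA/cm^2) and return (time, voltage) samples."""
    v, m, h, n = -65.0, 0.05, 0.6, 0.32   # approximate resting state
    trace = []
    t = 0.0
    while t < t_max:
        i_na = g_na * m**3 * h * (v - e_na)   # sodium current: drives the upstroke and overshoot
        i_k = g_k * n**4 * (v - e_k)          # potassium current: the repolarising 'exodus'
        i_l = g_l * (v - e_l)                 # leak current
        v += dt * (i_ext - i_na - i_k - i_l) / c_m
        m += dt * (alpha_m(v) * (1 - m) - beta_m(v) * m)
        h += dt * (alpha_h(v) * (1 - h) - beta_h(v) * h)
        n += dt * (alpha_n(v) * (1 - n) - beta_n(v) * n)
        trace.append((t, v))
        t += dt
    return trace

# Each spike, overshoot included, plays out in a few milliseconds, consistent
# with the 'less than one-half of one-hundredth of a second' quoted above.
peak = max(v for _, v in simulate())
print(f"peak membrane voltage: {peak:.1f} mV")  # overshoots well past 0 mV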
10%
Specifically, it imbues neurons with the ability to identify sequences.
10%
neurons in the retina have this kind of ‘direction selectivity’. This lets them signal which way objects in the visual field are moving.
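One minimal way to see how a delay plus a comparison can signal direction is a Reichardt-style 'delay and correlate' toy. This is my own simplified sketch, not necessarily the dendritic mechanism the book describes for retinal neurons; the frame format and delay are illustrative assumptions.

def direction_response(frames, delay=1):
    """frames: list of (left, right) activations over time.
    One input is delayed and multiplied with the other; motion in the preferred
    direction lines the two signals up, motion the other way does not."""
    response = 0.0
    for t in range(delay, len(frames)):
        left_delayed = frames[t - delay][0]
        right_now = frames[t][1]
        response += left_delayed * right_now
    return response

rightward = [(1, 0), (0, 1), (0, 0)]   # stimulus hits the left point first, then the right
leftward = [(0, 1), (1, 0), (0, 0)]    # stimulus hits the right point first, then the left
print(direction_response(rightward))   # 1.0 -> preferred direction
print(direction_response(leftward))    # 0.0 -> null direction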
12%
these biological details are congruent with Boolean logic.
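This is the move McCulloch and Pitts made: treat the neuron as a binary threshold unit, and Boolean logic falls out. A small sketch follows; the particular weights and thresholds are my own illustrative choices.

def threshold_unit(inputs, weights, threshold):
    """Fire (return 1) only if the weighted sum of 0/1 inputs clears the threshold."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= threshold else 0

def and_gate(a, b): return threshold_unit([a, b], [1, 1], threshold=2)
def or_gate(a, b):  return threshold_unit([a, b], [1, 1], threshold=1)
def not_gate(a):    return threshold_unit([a], [-1], threshold=0)

print(and_gate(1, 1), or_gate(0, 1), not_gate(1))  # 1 1 0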
18%
[Anderson argued in] ‘More is different’ that ‘the behaviour of large and complex aggregates of elementary particles, it turns out, is not to be understood in terms of a simple extrapolation of the properties of a few particles’.
18%
[The] engram was defined as ‘the enduring though primarily latent modification in the irritable substance produced by a stimulus’.
18%
engram and ecphory (or between the processes that lay a memory and those that retrieve it)
19%
activity determines connectivity and connectivity determines activity.
20%
The Hopfield network (see Figure 8) is a mathematical model of neurons that can implement what Hopfield described as ‘content-addressable memory’.
20%
The Hopfield network is recurrent, meaning that each neuron’s activity is determined by that of any of the others in the network.
20%
Therefore, each neuron’s activity serves as both input and output to its neighbours.
20%
each input a neuron receives from another neuron is multiplied by a particular number – a synaptic weight. These weighted inputs are then added …
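Written out, the computation being described is just a weighted sum followed by a threshold. A minimal sketch with +1/-1 units (a common convention for Hopfield networks; the book's exposition may use 0/1 units instead):

import numpy as np

def update_unit(state, weights, i):
    """One asynchronous update of unit i: form the weighted sum of the other
    units' activities and switch unit i on (+1) or off (-1) with its sign."""
    weighted_input = weights[i] @ state   # sum_j w_ij * s_j (diagonal w_ii kept at 0)
    state[i] = 1 if weighted_input >= 0 else -1
    return state

Sweeping this update across all the units, over and over, is what lets a partial or noisy pattern settle into a stored one.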
20%
In the language of physics, a fully retrieved memory is an example of an attractor. An attractor is, in short, a popular pattern of activity.
20%
Always fond of describing things in terms of energy, physicists consider attractors ‘low energy’ states. They’re a comfortable position for a system to be in; that is what makes them attractive and stable.
20%
Hopfield network can sustain multiple attractors.
20%
room or a trip to a beach that ignites the memory of a childhood holiday – are said to be in that memory’s ‘basin of attraction’.
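The 'energy' physicists assign to a network state s is the standard Hopfield energy function (the textbook expression, not a formula quoted from the book), where w_ij are the synaptic weights and s_i the unit activities:

\[ E(s) \;=\; -\tfrac{1}{2} \sum_{i \neq j} w_{ij}\, s_i\, s_j \]

Because the weights are symmetric (as a later highlight notes), each asynchronous update can only lower this quantity or leave it unchanged, so activity slides downhill until it settles in a local minimum – an attractor – and every starting state that drains into the same minimum lies in that memory's basin of attraction.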
21%
For every experience in which two neurons are either both active or inactive, the connection between them is strengthened.
21%
weights in the Hopfield network are symmetric
21%
If the number of memories remains less than about 14 per cent of the number of neurons, each memory will be restored with minimal error.
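The learning rule in these highlights (strengthen a connection when two units are both active or both inactive, keep the weights symmetric) and the roughly-14-per-cent capacity can be sketched directly. Below is my own illustration of the standard outer-product ('Hebbian') storage scheme with +1/-1 patterns, not code from the book; the network size and corruption level are arbitrary choices.

import numpy as np

def store_patterns(patterns):
    """Outer-product storage: pairs that agree within a memory strengthen their
    connection, mismatched pairs weaken it. Weights come out symmetric.
    patterns: array of shape (num_memories, num_neurons) with entries +1/-1."""
    num_memories, num_neurons = patterns.shape
    weights = patterns.T @ patterns / num_neurons
    np.fill_diagonal(weights, 0.0)          # no self-connections
    return weights

def recall(weights, cue, sweeps=10):
    """Repeatedly apply the sign-threshold update until the state settles."""
    state = cue.copy()
    for _ in range(sweeps):
        for i in range(len(state)):
            state[i] = 1 if weights[i] @ state >= 0 else -1
    return state

rng = np.random.default_rng(0)
num_neurons = 100
memories = rng.choice([-1, 1], size=(10, num_neurons))   # 10 memories, below 0.14 * 100
W = store_patterns(memories)
noisy = memories[0].copy()
noisy[:20] *= -1                                          # corrupt a fifth of the cue
print(np.array_equal(recall(W, noisy), memories[0]))      # usually True below capacity

Push the number of stored patterns past roughly 0.14 times the number of neurons and retrieval degrades sharply: the 'blackout catastrophe' of the next highlight.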
21%
When pushed past its capacity, the Hopfield network collapses: inputs go towards meaningless attractors and no memories are successfully recovered. It’s a phenomenon given the appropriately dramatic name ‘blackout catastrophe’.
21%
recognition is not recall.
21%
feeling of familiarity when seeing an image can happen without the ability to regenerate that image from scratch.
21%
The Hopfield network is remarkable for being capable of the latter, more difficult task – it fully completes a memory from a partial bit of it....
21%
have a significantly higher capacity: 1,000 neurons can now recognise as many as 23,000 images.
22%
gooey richness of biology.
22%
‘destinesia’ or amnesia about why you’ve gone to where you are.
23%
cells in the prefrontal cortex were different. The neurons there that responded to the visual patterns kept firing even after the patterns disappeared; that is, they maintained their activity during the delay period. A physical signature of working memory at work!
23%
delay activity can only be generated by a network of neurons working together, the connections between them conspiring to keep the activity alive. This is where the idea of attractors comes back into play.
23%
because an attractor stays put. Attractors are defined by derivatives. If we know the inputs a neuron …
23%
[When a] neuron is part of a recurrent network, it not only gets input but also serves as input to other neurons.
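A common way to make this precise is the generic firing-rate equation used in attractor models of delay activity (the standard form, not necessarily the exact equation in the book). Here r_i is a neuron's firing rate, w_ij the recurrent weights, I_i the external input, f a saturating nonlinearity and tau a time constant:

\[ \tau \frac{dr_i}{dt} = -r_i + f\!\left(\sum_j w_{ij}\, r_j + I_i\right) \]

An attractor is a state where every such derivative is zero: once the stimulus current I_i is gone, the recurrent term alone can hold the rates up, which is how the network keeps 'remembering' through the delay.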