Models of the Mind: How Physics, Engineering and Mathematics Have Shaped Our Understanding of the Brain
23%
When a network is in an attractor state, the derivative of each and every neuron in that network is zero.
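In equations: for a rate network obeying dr/dt = -r + f(Wr), an attractor is a fixed point, a state where that right-hand side is zero for every neuron at once. A minimal sketch of checking this numerically, using a standard rate model with illustrative weights and nonlinearity (not taken from the book):

import numpy as np

def drdt(r, W):
    # Rate dynamics dr/dt = -r + tanh(W r); an attractor makes this zero.
    return -r + np.tanh(W @ r)

rng = np.random.default_rng(0)
# Symmetric (Hopfield-style) weights tend to produce attractor states.
W = rng.normal(0.0, 1.5 / np.sqrt(50), size=(50, 50))
W = (W + W.T) / 2

# Run the dynamics until they settle, then check every neuron's derivative.
r = rng.normal(0.0, 1.0, 50)
for _ in range(5000):
    r = r + 0.01 * drdt(r, W)

print(np.abs(drdt(r, W)).max())  # close to 0 for all 50 neurons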
23%
if the connections between neurons are just right, memories started at one point in time can last for much longer. All the cells can maintain their firing rate because all the cells around them are doing the same. Nothing changes if...
23%
For working memory to function, the network needs to be good at resisting the influence of such distractors.
23%
‘ring network’, a hand-designed model of a neural circuit that would be ideal for the robust maintenance of working memories.
23%
Attractors in a ring network, on the other hand, are continuous. With continuous attractors, transitioning between similar memories is easy.
23%
A ring network allows for small, sensible errors.
23%
colours lie on a wheel.
23%
gutter-like nature of a continuous attractor – it has low resistance for moving between nearby states, but high resistance to perturbations otherwise.
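A rough sketch of the ring idea in code, assuming the usual recipe of local excitation plus broader inhibition arranged around a circle (all parameter values are illustrative, not the book's):

import numpy as np

N = 100
theta = np.linspace(0, 2 * np.pi, N, endpoint=False)

# Cosine connectivity: nearby cells on the ring excite each other,
# distant cells are suppressed by a uniform inhibitory background.
W = -2.0 + 3.0 * np.cos(theta[:, None] - theta[None, :])

def step(r, dt=0.02):
    drive = W @ r / N + 1.0                      # recurrent input plus steady drive
    return r + dt * (-r + np.maximum(drive, 0.0))

# Plant a 'memory' as a bump of activity near pi and let the network settle.
r = np.exp(-0.5 * ((theta - np.pi) / 0.3) ** 2)
for _ in range(3000):
    r = step(r)

print(theta[np.argmax(r)])  # the bump persists, still centred near pi
# Sliding the bump to a neighbouring angle costs almost nothing (a small,
# 'sensible' error), while perturbations that don't look like a shifted
# bump decay away: the gutter.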
24%
Because to the biologist, of course, ‘finely tuned’ are dirty words.
24%
The ellipsoid body is centrally placed in the fly brain and it has a unique shape: it has a hole in the middle with cells arranged all around that hole, forming a doughnut made of neurons
25%
It links ions to experiences.
25%
the perceptron is a feedforward (not recurrent) network. Recurrence means that the connections can form loops: neuron A connects to neuron B, which connects back to neuron A, for example.
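A toy illustration of the difference (not from the book): in a feedforward network, activity passes through the weights once; in a recurrent one, activity is fed back through the same weights, so loops like A to B and back to A become possible.

import numpy as np

x = np.array([1.0, -0.5, 0.2])

# Feedforward (perceptron-style): input goes through the weights exactly once.
W_ff = np.array([[0.4, -0.2, 0.1]])
y = np.heaviside(W_ff @ x, 0.0)                # one pass, no loops

# Recurrent: the weight matrix is applied again and again, so neuron A
# drives neuron B and neuron B drives neuron A on the next step.
W_rec = np.array([[0.0, 0.8],
                  [0.6, 0.0]])
r = np.array([1.0, 0.0])
for _ in range(5):
    r = np.tanh(W_rec @ r)                     # activity circulates around the loop

print(y, r)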
25%
Eavesdrop on a neuron that should be doing the same thing over and over – for example, one in the motor system that is producing the same movement repeatedly – and you’ll find its activity surprisingly irregular.
25%
how a neuron responded to any given pulse seemed a matter of pure chance.
25%
‘We were struck by the kaleidoscopic appearance of [responses] obtained from large nerves under absolutely constant conditions.’
26%
‘The variability of cortical neuron response[s] is known to be considerable.’
26%
‘Successive presentations of identical stimuli do not yield identical responses,’
26%
neurons have ‘more in common with the ticking of a Geiger counter than of a clock’.
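The comparison can be made quantitative: a Poisson ('Geiger counter') spike train has inter-spike intervals whose coefficient of variation is about 1, while a clock-like train has a CV near 0. A quick check with illustrative numbers:

import numpy as np

rng = np.random.default_rng(1)

# 'Geiger counter' spiking: exponential waiting times between spikes (Poisson).
poisson_isis = rng.exponential(scale=0.02, size=10_000)       # mean 20 ms
# 'Clock' spiking: the same mean interval with only a tiny jitter.
clock_isis = 0.02 + rng.normal(0.0, 0.0005, size=10_000)

print(poisson_isis.std() / poisson_isis.mean())   # ~1.0, highly irregular
print(clock_isis.std() / clock_isis.mean())       # ~0.025, nearly regular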
26%
Petri dish they behave remarkably more reliably: stimulating these neurons the same way twice will actually produce similar results.
26%
the very nature of how neurons work makes them noise reducers
26%
the ‘noisiness’ of your athletic ability gets averaged out over time.
26%
If it only uses a quick snapshot of its input, however, the noise will dominate.
26%
So how much time does a neuron combine its inputs over? About 20 milliseconds.
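A back-of-the-envelope version of the argument (the numbers are illustrative): averaging over a longer window shrinks the noise roughly as one over the square root of the window length, so a 20-millisecond integration window is far less at the mercy of noise than a 1-millisecond snapshot.

import numpy as np

rng = np.random.default_rng(2)

# A noisy input sampled every millisecond, over many trials.
trials, window_ms = 2000, 20
noisy_input = rng.normal(loc=1.0, scale=1.0, size=(trials, window_ms))

snapshot = noisy_input[:, 0]            # what a 1 ms snapshot sees
averaged = noisy_input.mean(axis=1)     # what a 20 ms integration window sees

print(snapshot.std())   # ~1.0: the noise dominates
print(averaged.std())   # ~0.22: roughly 1/sqrt(20) of the snapshot's noise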
26%
spike only takes about 1 millisecond and a cell can be receiving many at a time from al...
26%
Neuroscientists William Softky and Christof Koch used a simple mathematical model of a neuron – the ‘leaky integrate-and-fire’ model
26%
Yet the neuron itself – because it integrated these incoming spikes over time – still produced output spikes that were much more regular than the input it received.
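A minimal sketch of that kind of simulation (the parameters here are mine, not Softky and Koch's): a leaky integrate-and-fire neuron driven by many independent Poisson inputs ends up firing far more regularly than any of the spike trains it receives.

import numpy as np

rng = np.random.default_rng(3)

dt = 0.0005                       # 0.5 ms time steps
steps = 40_000                    # 20 seconds of simulated time
n_inputs, rate_hz = 1000, 10.0    # each input train is Poisson, so its CV is ~1
tau, weight, threshold = 0.020, 0.01, 1.0

v, spike_times = 0.0, []
for t in range(steps):
    n_in = rng.poisson(n_inputs * rate_hz * dt)   # input spikes arriving this step
    v += dt * (-v / tau) + weight * n_in          # leak plus integration
    if v >= threshold:                            # fire and reset
        spike_times.append(t * dt)
        v = 0.0

isis = np.diff(spike_times)
print(isis.std() / isis.mean())
# The output CV comes out well below 1: because ~20 ms of input is integrated
# before each spike, most of the Poisson irregularity averages away.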
27%
When passed through a neuron, noise should get weaker.
27%
Not only is the brain unpredictable, then, but it seems to be encouraging that unpredictability – going against the natural tendency of neurons to squash it.
27%
GABA, as it is more commonly known) was the first identified inhibitory neurotransmitter.
27%
These receptors are like little protein padlocks.
27%
receptor that GABA attaches to, for example, only lets chloride ions into the cell.
27%
Neurons tend to release the same neurotransmitter on to all of their targets, a principle known as Dale’s Law
27%
Neurons that release GABA are called ‘GABAergic’,
27%
‘Whatever the brain does for the mind, we can be sure that GABA plays a major role in it.’
27%
In computers, numbers can only be represented with a certain level of precision.
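The standard illustration, in any language with ordinary floating-point arithmetic:

# 64-bit floats carry roughly 15-16 significant decimal digits, no more.
print(0.1 + 0.2)            # 0.30000000000000004
print(0.1 + 0.2 == 0.3)     # False
print(1.0 + 1e-16 == 1.0)   # True: the added term falls below the precision limit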
28%
In general, the stronger that two competing powers are, the bigger the swings in the outcome of their competition.
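A short statistical illustration of that point (made-up numbers): when two opposing forces each fluctuate independently, the swings in their difference grow as the forces themselves grow.

import numpy as np

rng = np.random.default_rng(4)
for strength in [10, 100, 1000]:
    excitation = rng.poisson(strength, size=100_000)
    inhibition = rng.poisson(strength, size=100_000)
    # The net input swings as sqrt(2 * strength): stronger competitors, bigger swings.
    print(strength, (excitation - inhibition).std())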
28%
And the network needs to be self-consistent – that is, each neuron needs to produce the same amount of noise it receives, no more nor less.
28%
physics is full of situations where self-consistency is important:
29%
average strength of a connection was roughly equal to one divided by the square root of the number of connections
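Numerically, that scaling is what makes the balance work (a sketch with made-up numbers): with K connections of strength 1/sqrt(K), excitation and inhibition each grow like sqrt(K), yet their near-cancellation leaves fluctuations of order one, which is what keeps the neuron's firing irregular.

import numpy as np

rng = np.random.default_rng(5)
trials = 100_000
for K in [100, 1_000, 10_000]:
    w = 1.0 / np.sqrt(K)             # connection strength: one over root K
    # The summed drive from K independent Poisson(1) inputs is Poisson with mean K.
    exc = w * rng.poisson(K, size=trials)
    inh = w * rng.poisson(K, size=trials)
    print(K, round(exc.mean(), 1), round((exc - inh).std(), 2))
# Excitation and inhibition each grow like sqrt(K), but their difference keeps
# fluctuations of about the same size no matter how many connections there are.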
29%
The auditory cortex, for example, needs to respond to quick changes in sound frequency to process incoming information. This makes the quick responsiveness of well-balanced neurons a good match.
29%
This counter-intuitive fact that good behaviour can produce bedlam is important.
29%
‘The scientist must always be on the lookout for other explanations than those that have been commonly disseminated.’
30%
Chaotic processes produce outputs that look random but in fact arise from perfect rule-following.
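The textbook example (not the book's neural-network case) is the logistic map: a one-line deterministic rule whose output looks random, and where two almost identical starting points quickly go their separate ways.

# Logistic map: x_{n+1} = 4 * x_n * (1 - x_n), a fixed rule with no randomness.
x, y = 0.2, 0.2000001             # two almost identical starting points
for n in range(50):
    x, y = 4 * x * (1 - x), 4 * y * (1 - y)
    if n % 10 == 9:
        print(n + 1, round(x, 4), round(y, 4))
# The sequences look noisy, and after a few dozen steps they bear
# no resemblance to each other despite following the identical rule.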
30%
When patients are asleep (particularly in deep dreamless sleep), the EEG makes waves: large movements upwards then downwards extending over a second or more.
30%
When the event of interest – a seizure – occurs, the movements are even starker. The signal traces out big, fast sweeps up and down, three to four times a second, like a kid scribbling frantically with a crayon.
30%
seizure is the opposite of randomness – it is perfect order and predictability.
30%
with the inhibitory connections slightly stronger than excitatory ones – is just off to the right of the middle.
30%
bifurcations.