It turns out that when the weights of the network have been set using the Hebbian learning rule, then the following are true:

- In the stable state, which represents a stored memory, the network's energy (as defined by the equation above) is at a local minimum.
- The network can have multiple local minima (each potentially representing a different stored memory).
- In a stable state, neurons don't flip their outputs any further, and the network remains at that energy minimum.

However, if you were to perturb the network, say, by making it store a pattern that's a slightly corrupted […]
From Why Machines Learn: The Elegant Math Behind Modern AI
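The "equation above" referenced in the highlight is not captured here; it is presumably the standard Hopfield energy, E = -1/2 * sum over i,j of w_ij * s_i * s_j, for binary (+1/-1) neuron states s and Hebbian weights w. As a rough illustration of the idea described in the passage (a minimal sketch, not code from the book; all names are hypothetical), here is a small Python example of Hebbian storage, the energy function, and recall from a corrupted probe:

```python
import numpy as np

def hebbian_weights(patterns):
    """Hebbian rule: w_ij proportional to the sum of x_i * x_j over
    stored patterns, with zero self-connections (w_ii = 0)."""
    W = patterns.T @ patterns / len(patterns)
    np.fill_diagonal(W, 0.0)
    return W

def energy(W, s):
    """Hopfield energy E = -1/2 * s^T W s; stored memories sit at
    local minima of this function."""
    return -0.5 * s @ W @ s

def recall(W, s, max_sweeps=100):
    """Asynchronous updates: flip one neuron at a time toward lower
    energy until no neuron changes, i.e. a stable state is reached."""
    s = s.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in np.random.permutation(len(s)):
            new = 1 if W[i] @ s >= 0 else -1
            if new != s[i]:
                s[i] = new
                changed = True
        if not changed:  # stable state: a local energy minimum
            break
    return s

# Store two +/-1 patterns, then probe with a corrupted copy of the first.
patterns = np.array([[1, -1, 1, -1, 1, -1],
                     [1, 1, 1, -1, -1, -1]])
W = hebbian_weights(patterns)
probe = patterns[0].copy()
probe[0] = -probe[0]  # a slightly corrupted version of a stored memory
print("energy before:", energy(W, probe))
print("energy after: ", energy(W, recall(W, probe)))
```

Running the sketch, the corrupted probe starts at a higher energy and the asynchronous updates settle it into a nearby local minimum, typically recovering the stored pattern, which is the associative-memory behavior the passage describes.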