Neural networks comprise layers of nodes, each node connected to the previous layer by weights, rather like a series of logistic regressions stacked on top of each other. The weights are learned by an optimization procedure, and, rather as with random forests, multiple neural networks can be constructed and their outputs averaged. Neural networks with many layers have become known as deep-learning models: Google’s Inception image-recognition system is said to have over twenty layers and over 300,000 parameters to estimate.
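The stacked-logistic-regression view above can be sketched in a few lines of NumPy. This is a minimal illustration, not the Inception architecture: the layer sizes and random weights below are made up for demonstration, and in practice the weights would be learned by an optimizer rather than drawn at random.

```python
import numpy as np

def sigmoid(z):
    # The logistic function: each node applies it to a weighted
    # sum of the previous layer's outputs.
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, weights, biases):
    # A feed-forward pass: every layer is effectively a bank of
    # logistic regressions run on the layer below it.
    a = x
    for W, b in zip(weights, biases):
        a = sigmoid(W @ a + b)
    return a

# Toy network with illustrative layer sizes (4 inputs -> 5 -> 3 -> 1 output).
rng = np.random.default_rng(0)
sizes = [4, 5, 3, 1]
weights = [rng.normal(size=(m, n)) for n, m in zip(sizes[:-1], sizes[1:])]
biases = [rng.normal(size=m) for m in sizes[1:]]

output = forward(rng.normal(size=4), weights, biases)
print(output)
```

With a single layer this reduces exactly to ordinary logistic regression; stacking layers is what turns it into a "deep" model, and averaging several independently initialized networks gives the ensemble effect the passage compares to random forests.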
The Art of Statistics: Learning from Data