Mark Gerstein

Each element of this vector will be an analytic expression that can be calculated using the rules of calculus. Once you have the expressions, you just plug in the current values of the weights to get the gradient, which you can then use to calculate the new weights. The problem: you need calculus, and while our gradient has only three elements, in practice it can have tens, hundreds, thousands of elements, or even more. Widrow and Hoff were after something simpler. This is what they came up with:

w_new = w_old + μ(−∇_est)

Instead of calculating the entire gradient, they …
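A minimal sketch of the idea behind this update rule, assuming the standard Widrow-Hoff (LMS) setting: rather than deriving the full analytic gradient, the gradient of the squared error on a single example is used as the estimate ∇_est, and the weights are nudged in the opposite direction. The synthetic data, learning rate, and three-element weight vector below are illustrative assumptions, not taken from the book.

```python
import numpy as np

rng = np.random.default_rng(0)

true_w = np.array([2.0, -1.0, 0.5])           # "unknown" weights we want to recover
X = rng.normal(size=(200, 3))                 # 200 input vectors, 3 features each
d = X @ true_w + 0.01 * rng.normal(size=200)  # desired outputs (slightly noisy)

w = np.zeros(3)   # current weights (w_old)
mu = 0.05         # step size μ

for x, target in zip(X, d):
    y = w @ x                  # output with the current weights
    e = target - y             # error on this single example
    grad_est = -2.0 * e * x    # gradient of the squared error e**2 for this example
    w = w + mu * (-grad_est)   # w_new = w_old + μ(−∇_est)

print("learned:", np.round(w, 3))
print("true:   ", true_w)
```

No calculus over the whole dataset is needed at any step: each pass uses only the current example's error, which is what makes the rule so cheap compared with computing the exact gradient expression for every weight.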
Why Machines Learn: The Elegant Math Behind Modern AI