Status Updates From Neural Networks and Deep Learning
Neural Networks and Deep Learning by Michael Nielsen
Showing 1-27 of 27
Khaula Nauman
is on page 50 of 224
"In the early days of AI research people hoped that the effort to build an AI would also help us understand the principles behind intelligence and, maybe, the functioning of the human brain. But perhaps the outcome will be that we end up understanding neither the brain nor how artificial intelligence works!"
this cracked me up
— Sep 19, 2025 06:09AM
Khaula Nauman
is on page 50 of 224
Usually, when programming we believe that solving a complicated problem like recognizing the MNIST digits requires a sophisticated algorithm. But even the neural networks in the Wan et al paper just mentioned involve quite simple algorithms, variations on the algorithm we've seen in this chapter.
— Sep 19, 2025 06:08AM
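The surprising part of that quote is how short the core training loop really is. A rough sketch (not the book's exact code; `update_mini_batch` is a hypothetical stand-in for the routine that computes gradients, e.g. via backpropagation, and nudges each parameter by -eta * gradient):

```python
import random

def sgd(training_data, epochs, mini_batch_size, eta, update_mini_batch):
    """Plain stochastic gradient descent: shuffle the data each epoch,
    slice it into mini-batches, and take one small gradient step per batch."""
    for _ in range(epochs):
        random.shuffle(training_data)
        batches = [training_data[k:k + mini_batch_size]
                   for k in range(0, len(training_data), mini_batch_size)]
        for batch in batches:
            update_mini_batch(batch, eta)
```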
Khaula Nauman
is on page 50 of 224
The lesson to take away from this is that debugging a neural network is not trivial, and, just as for ordinary programming, there is an art to it. You need to learn that art of debugging in order to get good results from neural networks. More generally, we need to develop heuristics for choosing good hyper-parameters and a good architecture.
— Sep 19, 2025 06:07AM
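One basic heuristic for the hyper-parameter problem the quote mentions is a coarse grid search over a handful of settings. A hypothetical sketch, where `evaluate` stands in for training a network with the given settings and returning validation accuracy:

```python
import itertools

def grid_search(evaluate):
    """Try a small grid of learning rates and hidden-layer sizes and
    keep whichever combination scores best on validation data."""
    best_params, best_acc = None, -1.0
    for eta, hidden in itertools.product([0.1, 0.5, 3.0], [15, 30, 100]):
        acc = evaluate(eta=eta, hidden_neurons=hidden)  # hypothetical helper
        if acc > best_acc:
            best_params, best_acc = (eta, hidden), acc
    return best_params, best_acc
```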
Khaula Nauman
is on page 32 of 224
i now know what
- a neural network is
- perceptron neurons are
- sigmoid neurons are
- a sigmoid function is and why sigmoid
- feedforward and recurrent neural networks are
- a cost function is (i think)
- how gradient descent helps us minimize our cost function and get closer to the correct output
- stochastic gradient descent is and what online learning is (tiny sketch below)
— Sep 06, 2025 12:51PM
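A tiny sketch tying a few of those together: the sigmoid function, and gradient descent walking a toy one-parameter cost down to its minimum (all values here are illustrative):

```python
import numpy as np

def sigmoid(z):
    """Squashes any real input into (0, 1): a smooth version of a
    perceptron's hard 0/1 threshold, which is why it is used."""
    return 1.0 / (1.0 + np.exp(-z))

# Toy cost C(w) = (w - 3)^2, with gradient dC/dw = 2*(w - 3).
# Gradient descent repeatedly steps against the gradient.
w, eta = 0.0, 0.1
for _ in range(50):
    w -= eta * 2 * (w - 3)

print(sigmoid(0.0))  # 0.5: the sigmoid's midpoint
print(round(w, 4))   # ~3.0: gradient descent found the minimum
```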
em
is starting
no way this is free. anyway, pov: you are speedrunning learning neural networks
— Jan 02, 2023 05:20PM
Valéria
is starting
I love being a STEM major!! 😍 (I'm currently being held at gunpoint)
— Dec 12, 2022 02:07AM
John Zobolas
is 83% done
Finished "Why are deep neural networks hard to train?". Mostly explaining the Vanishing Gradient problem.
— Jul 20, 2022 04:46PM
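The vanishing-gradient problem that chapter explains can be seen with almost no code: the gradient reaching early layers picks up one factor of σ′(z) per layer, and σ′ never exceeds 1/4, so (under the simplifying assumption of weights near 1) the signal shrinks geometrically with depth:

```python
import numpy as np

def sigmoid_prime(z):
    """Derivative of the sigmoid; its maximum value is 0.25 at z = 0."""
    s = 1.0 / (1.0 + np.exp(-z))
    return s * (1.0 - s)

# Each extra sigmoid layer multiplies the early-layer gradient by
# another sigma'(z) <= 0.25, so depth makes the gradient collapse.
for depth in (2, 5, 10):
    print(depth, sigmoid_prime(0.0) ** depth)
# 2  0.0625
# 5  0.0009765625
# 10 9.5367431640625e-07
```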
John Zobolas
is 67% done
Finished "A visual proof that neural nets can compute any function". Amazing chapter, easy to digest with all the visual helpers provided
— Jul 17, 2022 01:25PM
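The visual proof's key trick is that a steep sigmoid behaves like a step function, so two hidden neurons make a "bump", and enough bumps approximate any continuous function. A sketch of one such bump, with weights chosen by hand purely for illustration:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def bump(x, left=0.3, right=0.6, height=0.8, steep=50.0):
    """A step up at `left` minus a step down at `right`: two steep
    sigmoid neurons whose outputs combine into a rectangular bump."""
    return height * (sigmoid(steep * (x - left)) - sigmoid(steep * (x - right)))

x = np.linspace(0.0, 1.0, 11)
print(np.round(bump(x), 2))
# ~0 outside the interval, ~0.8 well inside it, 0.4 right at the edges
```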
John Zobolas
is 50% done
"Improving the way neural networks learn" chapter finished. Excellent book.
— Jul 17, 2022 11:46AM
John Zobolas
is 33% done
Backpropagation chapter finished, really good!
— Jul 12, 2022 12:34PM
John Zobolas
is 16% done
Finished Chapter 1: Using neural nets to recognize handwritten digits
— Jul 23, 2020 09:45AM
John Zobolas
is 15% done
Picked it up after almost 1 year, yay!
— Jul 15, 2020 05:40PM
Ben Tomlin
is 45% done
This book went on to explain various modifications that can be made to the vanilla neural network model to better fit certain problems. The elegance and flexibility of neural networks and deep learning continues to amaze and inspire me.
— May 27, 2020 10:36AM
Ben Tomlin
is 36% done
I got caught up with other hobbies and schoolwork and unfortunately diverted from this book for some time. But I recently dove back into it and learned more about the mechanism behind machine learning called backpropagation. The author skillfully makes a relatively complex subject seem intuitive and elegant. I look forward to expanding and applying my newfound knowledge.
— May 22, 2020 11:40PM
Ben Tomlin
is 22% done
The topic of neural networks and deep learning has always fascinated me, and I am delighted to have found a book that teaches it so well. So far I have learned the basic structure of a neural network and have been able to draw comparisons between neural networks and the human brain. I look forward to delving further into this topic.
— Apr 13, 2020 01:29PM
Haigeng W
is on page 73 of 224
end of chap 2, how the backpropagation algorithm works
— Jun 06, 2018 09:24PM
Haigeng W
is on page 59 of 224
four fundamental equations of backpropagation
— May 20, 2018 08:04AM
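For reference, those four equations, in the book's notation (δ^l is the error vector in layer l, ⊙ the elementwise product):

```latex
\begin{align}
  \delta^L &= \nabla_a C \odot \sigma'(z^L)                           && \text{(BP1)}\\
  \delta^l &= \bigl((w^{l+1})^T \delta^{l+1}\bigr) \odot \sigma'(z^l) && \text{(BP2)}\\
  \frac{\partial C}{\partial b^l_j} &= \delta^l_j                     && \text{(BP3)}\\
  \frac{\partial C}{\partial w^l_{jk}} &= a^{l-1}_k \,\delta^l_j      && \text{(BP4)}
\end{align}
```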
Theoderik Trajanson
is starting
2 / 6: The four fundamental equations behind backpropagation - Plan of attack
— Aug 15, 2017 08:35PM
Theoderik Trajanson
is starting
2 / 6: The four fundamental equations behind backpropagation
— Aug 15, 2017 08:21PM
Theoderik Trajanson
is starting
2 / 6: The two assumptions we need about the cost function
— Aug 15, 2017 02:29PM
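For context, the two assumptions that section makes are that the cost is an average over per-example costs, and that each per-example cost can be written as a function of the network's output activations:

```latex
C = \frac{1}{n} \sum_x C_x, \qquad C_x = C_x(a^L)
```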
Theoderik Trajanson
is starting
2 / 6: Warm up: a fast matrix-based approach to computing the output from a neural network
— Aug 15, 2017 02:23PM
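That matrix-based approach amounts to one line per layer, a′ = σ(wa + b). A minimal sketch with illustrative 784-30-10 shapes (the digit-classifier sizes used in the book); the random parameters are placeholders for trained ones:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def feedforward(a, weights, biases):
    """Propagate an input column vector `a` through the network:
    one matrix multiply and one elementwise sigmoid per layer."""
    for w, b in zip(weights, biases):
        a = sigmoid(w @ a + b)
    return a

rng = np.random.default_rng(0)
weights = [rng.standard_normal((30, 784)), rng.standard_normal((10, 30))]
biases  = [rng.standard_normal((30, 1)),   rng.standard_normal((10, 1))]
print(feedforward(rng.standard_normal((784, 1)), weights, biases).shape)  # (10, 1)
```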
Theoderik Trajanson
is starting
1 / 6: Implementing our network to classify digits
— Aug 12, 2017 08:38PM
Theoderik Trajanson
is starting
1 / 6: Learning with gradient descent
— Aug 12, 2017 07:02PM
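The heart of that section is the update rule: repeatedly nudge every weight and bias a small step against the gradient of the cost, with η the learning rate:

```latex
w_k \;\rightarrow\; w_k' = w_k - \eta \frac{\partial C}{\partial w_k},
\qquad
b_l \;\rightarrow\; b_l' = b_l - \eta \frac{\partial C}{\partial b_l}
```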
Theoderik Trajanson
is starting
Chapter 1 / 6 - The architecture of neural networks
— Aug 12, 2017 06:35PM

