Serious science and mathematics readings discussion

Understanding Deep Learning


message 1: by Swapnam (new) - added it

Swapnam | 24 comments Mod
This is the discussion thread for Understanding Deep Learning, the pick for February 2025.
Happy reading and let's hope we can have illuminating discussions on the same!


message 2: by Swapnam (last edited Feb 04, 2025 08:33AM) (new) - added it

Swapnam | 24 comments Mod
Chapter 1 notes

We want to perform Inference, i.e. predict an output y from an input x, using a Model f (a family of possible input-output relations), written as:
y = f(x)
Here x and y are (multi-dimensional) vectors encoding the input and output in a suitable manner, and f is a function.

A particular relation from this family is selected by the choice of the Parameters p:
y = f(x, p)
The goal of Supervised Learning is then to learn the model's parameters p from a Training Dataset: a collection of pairs {x_i, y_i} of input and output vectors.
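To make this concrete, here is a minimal Python sketch of a parameterized model and a toy training dataset. The linear form of f and all the numbers are illustrative choices of mine, not taken from the book:

```python
# A model f(x, p): here a simple 1-D linear relation y = p[0] + p[1] * x.
# The family of relations is "all lines"; a choice of p picks one line.
def f(x, p):
    return p[0] + p[1] * x

# Toy training dataset: pairs (x_i, y_i), here generated by y = 1 + 2x.
training_data = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0), (3.0, 7.0)]

p = [0.0, 0.0]  # parameters before training
print(f(2.0, p))  # prediction with untrained parameters -> 0.0
```

With the "right" parameters p = [1.0, 2.0], the same function reproduces the training outputs exactly; learning is the process of finding such parameters automatically.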

Define the Loss L as the degree of mismatch (which must be precisely quantified) between the model's predictions f(x_i, p) and the observed outputs y_i.
Our objective is then to find the parameters p_m that minimize the loss function:
p_m = argmin_p L(p, {x_i, y_i})
The practical challenge is to find the optimal parameters p_m while consuming as few resources (compute, data, time) as possible.
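The argmin above can be approximated numerically. Below is a hedged sketch using a mean-squared-error loss and plain gradient descent on the toy linear model; the loss choice, learning rate, and step count are my illustrative assumptions, not prescriptions from the book:

```python
# Linear model and mean squared error loss L(p, D).
def f(x, p):
    return p[0] + p[1] * x

def loss(p, data):
    return sum((f(x, p) - y) ** 2 for x, y in data) / len(data)

def grad(p, data):
    # Analytic gradient of the MSE for the linear model above.
    n = len(data)
    g0 = sum(2 * (f(x, p) - y) for x, y in data) / n
    g1 = sum(2 * (f(x, p) - y) * x for x, y in data) / n
    return [g0, g1]

data = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0), (3.0, 7.0)]  # from y = 1 + 2x
p = [0.0, 0.0]
for _ in range(2000):
    g = grad(p, data)
    p = [p[0] - 0.05 * g[0], p[1] - 0.05 * g[1]]
# p converges toward [1, 2], the parameters that generated the data,
# i.e. an approximation of p_m = argmin_p L(p, {x_i, y_i}).
```

Real deep learning replaces the hand-derived gradient with automatic differentiation and plain gradient descent with variants such as SGD, but the objective being minimized is the same.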

After the training phase is done, we run the model on held-out Test Data to evaluate its Generalization, i.e. its performance on instances it never saw during training.
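Continuing the toy example, evaluating generalization just means computing the same loss on pairs that were excluded from training (again, the specific model and numbers are illustrative assumptions of mine):

```python
# Model and loss as before.
def f(x, p):
    return p[0] + p[1] * x

def mse(p, data):
    return sum((f(x, p) - y) ** 2 for x, y in data) / len(data)

p_trained = [1.0, 2.0]  # pretend these came out of the training phase
test_data = [(4.0, 9.0), (5.0, 11.0)]  # unseen pairs from y = 1 + 2x

print(mse(p_trained, test_data))  # low test loss -> good generalization
```

A model can achieve near-zero training loss yet a large test loss; that gap (overfitting) is exactly why the test set must stay separate from the training set.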

