We will take a look at the most widely used activation function, ReLU (Rectified Linear Unit), and understand why it is the default choice for neural networks. This article covers the most important points about this function.
Table of Contents:
- Brief Overview of Neural Networks
- What is an activation function?
- What is ReLU?
- Implementing ReLU function in Python
- Why is ReLU non-linear?
- Derivative of ReLU
- Advantages of ReLU
- Disadvantages of ReLU
Prerequisites:
Types...