We will look at the promising but relatively uncommon activation function SELU (Scaled Exponential Linear Unit) and examine its main advantages over more widely used activation functions such as ReLU (Rectified Linear Unit). This article covers the essential points about this function.
Table of Contents:
Brief recap of the pre-requisites
What are Neural Networks?
What is an Activation Function?
What is ReLU?
What is SELU?
Implementing SELU function in Python
What is normalization?
...