Swish Activation Function

In this article, we explore the Swish activation function in depth. It was developed by researchers at Google as an alternative to the Rectified Linear Unit (ReLU).

Table of contents:

- Overview of Neural Networks and the use of Activation functions
- Swish Activation function
- Derivative function of Swish
- Conclusion

Pre-requisite: Types of Activation Functions used in Machine Learning

Overview of Neural Networks and the use of Activation functions

The mechanism of neural networks works similarly t...
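Since the table of contents covers both the Swish function and its derivative, here is a minimal NumPy sketch of the standard formulation, f(x) = x · sigmoid(βx), along with its derivative. The parameter name `beta` and the helper functions are illustrative choices, not from the original article.

```python
import numpy as np

def sigmoid(x):
    # Standard logistic sigmoid
    return 1.0 / (1.0 + np.exp(-x))

def swish(x, beta=1.0):
    # Swish: f(x) = x * sigmoid(beta * x); beta = 1 gives the common SiLU form
    return x * sigmoid(beta * x)

def swish_derivative(x, beta=1.0):
    # f'(x) = sigmoid(beta*x) + beta * x * sigmoid(beta*x) * (1 - sigmoid(beta*x))
    s = sigmoid(beta * x)
    return s + beta * x * s * (1.0 - s)

# Unlike ReLU, Swish is smooth and allows small negative outputs
print(swish(np.array([-2.0, 0.0, 2.0])))
```

Note that Swish is smooth and non-monotonic: it dips slightly below zero for negative inputs before approaching zero, which distinguishes it from ReLU's hard cutoff.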

Published on December 21, 2021 01:29