Gaussian Error Linear Unit (GELU)

In this article, we will look at a relatively new activation function that also tends to perform somewhat better than its predecessors: the Gaussian Error Linear Unit, or GELU.

Table of Contents:

Introduction
Gaussian Error Linear Unit
Differences Between Major Activation Functions and GELU
Experimenting with GELU on MNIST
Experimenting with GELU on CIFAR-10
Summary

Pre-requisites:

Types of Activation Functions Used in Machine Learning
Swish Activation Function

Introduction

For neural networks, t...
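For reference, GELU is defined as GELU(x) = x·Φ(x), where Φ is the cumulative distribution function of the standard normal distribution. A minimal sketch in Python, showing both the exact form and the tanh approximation proposed in the original GELU paper:

```python
import math

def gelu(x):
    """Exact GELU: x * Phi(x), where Phi is the standard normal CDF."""
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

def gelu_tanh(x):
    """Tanh approximation of GELU, as given in the original paper."""
    return 0.5 * x * (1.0 + math.tanh(math.sqrt(2.0 / math.pi)
                                      * (x + 0.044715 * x ** 3)))

# GELU passes positive inputs through almost unchanged and smoothly
# gates negative inputs toward zero, rather than cutting them off
# sharply the way ReLU does.
for x in (-3.0, -1.0, 0.0, 1.0, 3.0):
    print(f"x={x:+.1f}  gelu={gelu(x):+.4f}  tanh approx={gelu_tanh(x):+.4f}")
```

The two forms agree closely over typical activation ranges, which is why many deep learning libraries expose the tanh version as a faster drop-in.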

Published on December 22, 2021 13:46