Contributor
Mayank Raj

Activation functions and Pooling methods


Mentors
Marcus Edel, Abhinav Anand
Organization
mlpack
Technologies
C++
Topics
Activation functions and Pooling methods
Two new activation functions have recently been proposed, FTSwish and LiSHT, and I am proposing to add them to mlpack along with two pooling methods. Activation functions provide the non-linearity vital to allowing deep learning networks to perform an impressive number of tasks. ReLU (Rectified Linear Unit) has been the default workhorse in deep learning for some time. However, concerns over ReLU's removal of all negative values and the associated dying-gradient problem have prompted new activation functions that handle negative values differently.
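As a rough illustration of how these functions treat negative inputs, below is a minimal element-wise sketch of the two formulas as described in their papers (not mlpack's layer API; an actual contribution would follow mlpack's ANN layer conventions). The FTSwish threshold T is a tunable parameter; -0.20 is the value suggested in the paper.

#include <cmath>

// Flatten-T Swish (FTSwish): x * sigmoid(x) shifted by a negative
// threshold T for x >= 0; the constant T for x < 0. Negative inputs
// map to a small negative constant rather than zero (as in ReLU).
double FTSwish(const double x, const double T = -0.20)
{
  if (x >= 0.0)
    return x / (1.0 + std::exp(-x)) + T;  // x * sigmoid(x) + T
  return T;
}

// LiSHT (Linearly Scaled Hyperbolic Tangent): x * tanh(x).
// Negative inputs produce positive outputs instead of being zeroed,
// so the gradient does not vanish identically on the negative side.
double LiSHT(const double x)
{
  return x * std::tanh(x);
}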