Implementing activation functions from scratch in TensorFlow.
ActivationFunctions using Custom Layers in Keras


Activation functions are an important area of deep learning research. Many new activation functions are being developed, including bio-inspired activations, purely mathematical activation functions, and others. Despite such advancements, we usually find ourselves using ReLU and LeakyReLU without thinking about the alternatives. In the following notebooks I showcase how easy (or difficult) it is to port an activation function using custom layers in Keras and TensorFlow!

Link to main notebook --> Activations.ipynb
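
As a rough illustration of the pattern used throughout the notebook (a minimal sketch, not necessarily the exact code in Activations.ipynb), a fixed activation such as Swish can be wrapped in a custom layer by subclassing tf.keras.layers.Layer and overriding call:

import tensorflow as tf

class Swish(tf.keras.layers.Layer):
    """Swish activation: f(x) = x * sigmoid(x)."""

    def call(self, inputs):
        # Element-wise transform; no trainable weights are needed here.
        return inputs * tf.keras.activations.sigmoid(inputs)

# The layer drops into a model like any built-in activation layer.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, input_shape=(784,)),
    Swish(),
    tf.keras.layers.Dense(10, activation="softmax"),
])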

Implemented activations:

  • LeakyReLU
  • ParametricReLU (sketched below)
  • ELU
  • SELU
  • Swish
  • GELU
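
ParametricReLU is the case where a custom layer really earns its keep, because the negative-side slope alpha is a trainable weight rather than a constant. A hedged sketch, assuming a single alpha shared across all units (Keras' built-in tf.keras.layers.PReLU learns one alpha per channel by default):

import tensorflow as tf

class ParametricReLU(tf.keras.layers.Layer):
    """PReLU: f(x) = max(0, x) + alpha * min(0, x), with alpha learned."""

    def build(self, input_shape):
        # Illustrative choice: one shared slope for the whole layer.
        self.alpha = self.add_weight(
            name="alpha",
            shape=(1,),
            initializer=tf.keras.initializers.Constant(0.25),
            trainable=True,
        )

    def call(self, inputs):
        return tf.maximum(0.0, inputs) + self.alpha * tf.minimum(0.0, inputs)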

Structure

src
|
|-- Activations.ipynb
|-- utils
|    |-- Utils.ipynb
|    |-- utils.py

references
|
|-- Ref1
|-- Refn

Usage

 git clone https://github.com/Agrover112/ActivationFunctions.git

References