Subhaditya's KB
Activation Functions
Sep 18, 2024
1 min read
architecture
General rule of thumb for "which is better": SELU > ELU > Leaky ReLU > ReLU > Tanh > Sigmoid
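The ranking above can be made concrete with elementwise definitions of each activation. A minimal NumPy sketch (the SELU constants are the standard fixed values from the self-normalizing networks formulation; everything else is the textbook definition):

```python
import numpy as np

def sigmoid(x):
    # Squashes input to (0, 1); saturates for large |x|.
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Zero-centered squashing to (-1, 1); still saturates.
    return np.tanh(x)

def relu(x):
    # Cheap and non-saturating for x > 0, but gradient is 0 for x < 0.
    return np.maximum(0.0, x)

def leaky_relu(x, negative_slope=0.01):
    # Small negative slope keeps a gradient flowing for x < 0.
    return np.where(x >= 0, x, negative_slope * x)

def elu(x, alpha=1.0):
    # Smooth exponential tail for x < 0; mean activations closer to zero.
    return np.where(x >= 0, x, alpha * (np.exp(x) - 1.0))

def selu(x, alpha=1.6732632423543772, scale=1.0507009873554805):
    # Scaled ELU with fixed constants chosen so activations
    # self-normalize toward zero mean and unit variance.
    return scale * np.where(x >= 0, x, alpha * (np.exp(x) - 1.0))
```

Each function moves one step up the ranking by addressing a weakness of the one below it: tanh is zero-centered where sigmoid is not, ReLU avoids saturation, Leaky ReLU avoids dead units, ELU smooths the negative side, and SELU adds self-normalization.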