Subhaditya's KB

Relu

Sep 18, 2024 · 1 min read

  • architecture

  • $\mathrm{ReLU}(x) = \max(0, x)$
  • $\frac{d}{dx}\,\mathrm{ReLU}(x) = \begin{cases} 1 & x \ge 0 \\ 0 & \text{otherwise} \end{cases}$
  • Pair with He initialization
  • Typically used in the hidden layers of MLPs and CNNs
  • Variants: Leaky ReLU
  • PReLU
  • Noisy ReLU
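
The definitions above can be sketched in NumPy (function names here are illustrative, and the Leaky ReLU slope `alpha` is just a common default):

```python
import numpy as np

def relu(x):
    # ReLU(x) = max(0, x), applied elementwise
    return np.maximum(0, x)

def relu_grad(x):
    # Derivative: 1 where x >= 0, 0 otherwise
    return (x >= 0).astype(x.dtype)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU keeps a small slope alpha for negative inputs
    # instead of zeroing them out
    return np.where(x >= 0, x, alpha * x)

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(relu(x))        # zeros out the negative entries
print(relu_grad(x))   # gradient mask
print(leaky_relu(x))  # negative entries scaled by alpha
```

PReLU is the same shape as Leaky ReLU, except `alpha` is a learned parameter rather than a fixed constant.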

Backlinks

  • Chapter 10 - CNNs
  • Chapter 4 - Deep Neural Networks
  • Chapter 7 - Gradients and Initialization
  • Activation Functions
  • Alex Net
  • Basic Transformer
  • ConvNeXt
  • Dense Skip Connections
  • FTSwish
  • GELU
  • Lisht
  • MobileOne
  • Noisy Relu
  • Parametric Relu
  • RepVGG
  • Swish
  • _Index_of_Models
  • architecture
  • Composing shallow neural networks to get deep
  • He Initialization
  • SELU

Created with Quartz v4.3.1 © 2025
