Non Relational Inductive Bias

Oct 14, 2025 · 1 min read



  • Activation Functions
    • allow the model to capture the non-linear structure hidden in the data; without them, stacked linear layers collapse into a single linear map (see the first sketch below)
  • Dropout
    • discourages the network from memorizing the training data by zeroing out random subsets of units, so that many overlapping subnetworks must each learn the underlying pattern; the resulting model generalizes better (sketch below)
  • Weight Decay
    • constrains the model’s weights by penalizing their magnitude, biasing training toward simpler solutions (sketch below)
  • Batch Normalization, Layer Normalization, Instance Normalization
    • reduce internal covariate shift by normalizing activations, which stabilizes training (sketch below)
  • Augmentation
    • encodes invariances (e.g., flips, small rotations) by training on transformed copies of the data (sketch below)
  • Optimizers
    • the choice of optimizer and its hyperparameters biases which solutions training converges to (covered together with the weight decay sketch below)
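
A minimal PyTorch sketch (assuming torch is installed) of the activation-function point: two stacked linear layers without a nonlinearity are still just one linear map, while inserting a ReLU lets the network represent non-linear functions.

```python
import torch
import torch.nn as nn

# Two stacked linear layers with no activation are equivalent to a single linear map.
linear_only = nn.Sequential(nn.Linear(2, 16), nn.Linear(16, 1))

# Inserting a nonlinearity (ReLU here) lets the network bend its decision surface
# and capture non-linear structure in the data.
nonlinear = nn.Sequential(nn.Linear(2, 16), nn.ReLU(), nn.Linear(16, 1))

x = torch.randn(4, 2)
print(linear_only(x).shape, nonlinear(x).shape)  # both: torch.Size([4, 1])
```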
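
Dropout, sketched with torch.nn.Dropout: during training random units are zeroed (and survivors rescaled), so no single subnetwork can memorize the data; at evaluation the full network is used unchanged.

```python
import torch
import torch.nn as nn

drop = nn.Dropout(p=0.5)  # each element is zeroed with probability 0.5 during training
x = torch.ones(1, 8)

drop.train()
print(drop(x))  # roughly half the entries are 0, survivors scaled by 1 / (1 - p)

drop.eval()
print(drop(x))  # identity at inference time: all ones
```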
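
Weight decay is usually applied through the optimizer, which is where the optimizer bullet comes in as well: the sketch below (assuming torch) adds an L2-style shrinkage via the `weight_decay` argument, and swapping SGD for AdamW changes the implicit bias of training itself.

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 1)

# weight_decay shrinks every weight a little on each update,
# biasing the model toward small-norm (simpler) solutions.
sgd = torch.optim.SGD(model.parameters(), lr=0.1, weight_decay=1e-4)

# The optimizer choice is itself an inductive bias: AdamW rescales gradients
# adaptively and decouples the decay term from that scaling.
adamw = torch.optim.AdamW(model.parameters(), lr=1e-3, weight_decay=1e-2)
```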
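
The three normalization layers differ mainly in which axes they compute statistics over; a small sketch of the PyTorch modules on a (batch, channels, length) tensor, assuming torch.

```python
import torch
import torch.nn as nn

x = torch.randn(8, 4, 32)  # (batch, channels, length)

bn = nn.BatchNorm1d(4)        # statistics over the batch and length, per channel
ln = nn.LayerNorm([4, 32])    # statistics over all features of each sample
inorm = nn.InstanceNorm1d(4)  # statistics per sample, per channel

for layer in (bn, ln, inorm):
    print(layer(x).shape)  # all torch.Size([8, 4, 32])
```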
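
Augmentation injects a bias by declaring which transformations should not change the label; a sketch with torchvision.transforms (assuming torchvision is installed) for an image task.

```python
from torchvision import transforms

# Each transform encodes an invariance we want the model to respect:
# horizontal flips and small rotations should not change the class label.
augment = transforms.Compose([
    transforms.RandomHorizontalFlip(p=0.5),
    transforms.RandomRotation(degrees=10),
    transforms.ToTensor(),
])
# Typically passed as the `transform` argument of a dataset,
# e.g. torchvision.datasets.CIFAR10(root, train=True, transform=augment).
```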
