KL Divergence

  • Classification
  • Entropy + Cross Entropy
  • Distribution-based metric
  • Measures the difference between two PDFs (probability density functions)
  • We first define x log x to handle the edge case x = 0, taking 0 log 0 = 0 (since x log x → 0 as x → 0⁺)

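The x log x convention above can be sketched as a small helper. This is a minimal sketch (the function name `xlogx` is my own); the only special case is x = 0, where the limit justifies returning 0:

```python
import math

def xlogx(x):
    # Convention: 0 * log(0) = 0, justified by the limit x*log(x) -> 0 as x -> 0+.
    # For x > 0 this is just the ordinary product x * ln(x).
    if x == 0.0:
        return 0.0
    return x * math.log(x)
```

Without this convention, a distribution with any zero-probability outcome would produce `log(0)` and blow up the entropy sums below.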
Then Entropy

Then CCE (categorical cross-entropy), as defined before
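Cross-entropy compares a true distribution p against a model distribution q: H(p, q) = −Σᵢ pᵢ log qᵢ. A minimal sketch under the same list-of-probabilities assumption (function name is my own); terms with pᵢ = 0 are dropped per the 0 log 0 = 0 convention:

```python
import math

def cross_entropy(p, q):
    # H(p, q) = -sum_i p_i * log(q_i); a term contributes 0 when p_i == 0.
    return -sum(0.0 if pi == 0.0 else pi * math.log(qi)
                for pi, qi in zip(p, q))
```

Note that H(p, p) recovers the entropy H(p), and H(p, q) ≥ H(p) in general, which is exactly the gap KLD measures.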

Finally KLD, the KL divergence itself: D_KL(p ‖ q) = H(p, q) − H(p) = Σᵢ pᵢ log(pᵢ / qᵢ)
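Putting the pieces together, KL divergence is the cross-entropy minus the entropy, D_KL(p ‖ q) = Σᵢ pᵢ log(pᵢ / qᵢ). A minimal sketch for discrete distributions as lists (function name is my own); it assumes qᵢ > 0 wherever pᵢ > 0, otherwise the divergence is infinite:

```python
import math

def kl_divergence(p, q):
    # D_KL(p || q) = sum_i p_i * log(p_i / q_i),
    # with the 0 * log(0) = 0 convention (terms where p_i == 0 contribute 0).
    total = 0.0
    for pi, qi in zip(p, q):
        if pi > 0.0:
            total += pi * math.log(pi / qi)
    return total
```

D_KL(p ‖ p) = 0, and D_KL is always nonnegative (Gibbs' inequality), but it is not symmetric: D_KL(p ‖ q) ≠ D_KL(q ‖ p) in general, so it is a divergence rather than a true distance metric.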