Information Gain

  • In decision forests, the difference between a node’s Entropy and the weighted (by number of examples) sum of the Entropies of its child nodes. A node’s Entropy is the Entropy of the examples in that node; see the sketch below.
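
  A minimal sketch of this definition, assuming binary class labels and illustrative function names (`entropy`, `information_gain`) that are not part of any particular library:

  ```python
  import math
  from collections import Counter

  def entropy(labels):
      """Shannon entropy (in bits) of a collection of class labels."""
      total = len(labels)
      counts = Counter(labels)
      return -sum((c / total) * math.log2(c / total) for c in counts.values())

  def information_gain(parent_labels, children_labels):
      """Parent entropy minus the example-weighted sum of child entropies."""
      n = len(parent_labels)
      weighted_child_entropy = sum(
          (len(child) / n) * entropy(child) for child in children_labels
      )
      return entropy(parent_labels) - weighted_child_entropy

  # Hypothetical example: a parent node with 10 examples split into two children.
  parent = ["pos"] * 5 + ["neg"] * 5          # parent entropy = 1.0 bit
  children = [["pos"] * 4 + ["neg"] * 1,      # child entropy ≈ 0.722 bits, weight 0.5
              ["pos"] * 1 + ["neg"] * 4]      # child entropy ≈ 0.722 bits, weight 0.5
  print(information_gain(parent, children))   # ≈ 0.278 bits
  ```

  In this example the split reduces the weighted entropy from 1.0 bit to roughly 0.722 bits, so the information gain is about 0.278 bits; a decision-forest splitter would prefer the candidate split with the largest such gain.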