Memory-based Learning

  • Lazy learning
  • All encountered examples are stored in memory in a multi-dimensional array, positioned according to their relevant features
  • New items are classified (comprehension) or generated (production) by searching for an example in memory that is closest to the target
  • Because exemplars are represented by their features, even novel forms can be classified
  • A generalization of the k-NN (k-nearest neighbors) algorithm
  • Don’t remove infrequent or even unique (solo) forms; that information may be needed later
  • Don’t trim down the number of examples of a frequent form stored in the model; doing so affects its behavior
  • Learning is storing, classification is analogy
  • Can handle multiple long-distance dependencies
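
The approach above can be sketched as a minimal exemplar-based classifier: a k-NN variant that stores every training example and classifies by analogy to the closest stored exemplars. This is an illustrative sketch, not any particular implementation; the class and function names, and the simple overlap distance, are assumptions for the example.

```python
from collections import Counter

def overlap_distance(a, b):
    """Simple overlap metric: count mismatching feature values."""
    return sum(x != y for x, y in zip(a, b))

class MemoryBasedLearner:
    """Lazy learner: learning is storing, classification is analogy.

    Hypothetical sketch; real memory-based learners (e.g. TiMBL)
    add weighted distances and efficient indexing.
    """
    def __init__(self, k=1):
        self.k = k
        self.memory = []  # every exemplar is kept, frequent or not

    def store(self, features, label):
        # Learning step: just store the exemplar in memory.
        self.memory.append((tuple(features), label))

    def classify(self, features):
        # Classification step: find the k nearest stored exemplars
        # and return the majority label among them. Novel feature
        # combinations still get a label by analogy.
        neighbors = sorted(
            self.memory,
            key=lambda ex: overlap_distance(ex[0], features),
        )[: self.k]
        votes = Counter(label for _, label in neighbors)
        return votes.most_common(1)[0][0]
```

Because classification compares against all stored exemplars, removing rare or solo forms would silently change the neighbors found for nearby novel items, which is why the notes warn against trimming memory.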