Using noise to compute error surfaces in connectionist networks: a novel means of reducing catastrophic forgetting

Neural Comput. 2002 Jul;14(7):1755-69. doi: 10.1162/08997660260028700.

Abstract

In error-driven distributed feedforward networks, new information typically interferes, sometimes severely, with previously learned information. We show how noise can be used to approximate the error surface of previously learned information. By combining this approximated error surface with the error surface associated with the new information to be learned, the network's retention of previously learned items can be improved and catastrophic interference significantly reduced. Further, we show that the noise-generated error surface is produced using only first-derivative information and without recourse to any explicit error information.
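The idea of probing a trained network with noise and reusing its responses as surrogate training items can be illustrated with a minimal NumPy sketch in the spirit of pseudorehearsal. Everything here — the tanh network, layer sizes, learning rate, seed, and data — is an illustrative assumption, not the paper's actual setup or results:

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny one-hidden-layer tanh network (illustrative sizes, not the paper's).
def init(n_in=4, n_hid=8, n_out=2):
    return [rng.normal(0, 0.5, (n_in, n_hid)),
            rng.normal(0, 0.5, (n_hid, n_out))]

def forward(W, X):
    h = np.tanh(X @ W[0])
    return h, np.tanh(h @ W[1])

def train(W, X, Y, epochs=2000, lr=0.1):
    # Plain batch gradient descent on squared error.
    for _ in range(epochs):
        h, y = forward(W, X)
        dy = (y - Y) * (1 - y ** 2)          # back through output tanh
        dh = (dy @ W[1].T) * (1 - h ** 2)    # back through hidden tanh
        W[1] -= lr * h.T @ dy / len(X)
        W[0] -= lr * X.T @ dh / len(X)
    return W

def mse(W, X, Y):
    return float(np.mean((forward(W, X)[1] - Y) ** 2))

# Learn a set of "old" items.
X_old = rng.uniform(-1, 1, (6, 4))
Y_old = rng.uniform(-0.8, 0.8, (6, 2))
W = train(init(), X_old, Y_old)

# Probe the trained network with random noise inputs; its own responses
# act as pseudo-items approximating the old error surface.
X_noise = rng.uniform(-1, 1, (32, 4))
Y_noise = forward(W, X_noise)[1]

# New items to be learned.
X_new = rng.uniform(-1, 1, (3, 4))
Y_new = rng.uniform(-0.8, 0.8, (3, 2))

# Naive sequential training on new items alone (catastrophic interference).
W_naive = train([w.copy() for w in W], X_new, Y_new)

# Combined training: new items plus noise-generated pseudo-items.
W_mixed = train([w.copy() for w in W],
                np.vstack([X_new, X_noise]),
                np.vstack([Y_new, Y_noise]))

print("old-item MSE, naive: ", mse(W_naive, X_old, Y_old))
print("old-item MSE, mixed: ", mse(W_mixed, X_old, Y_old))
```

With the mixed objective, the noise-derived pseudo-items anchor the network's function near its old mapping while the new items are learned, so retention error on the old items stays far lower than under naive sequential training.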

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Animals
  • Artifacts*
  • Computer Simulation
  • Memory Disorders*
  • Memory*
  • Neural Networks, Computer*