Memory maintenance is widely believed to involve long-term retention of the synaptic weights that are set within relevant neural circuits during learning. However, despite recent exciting technical advances, it has not yet proved possible to confirm this intuitively appealing hypothesis experimentally. Artificial neural networks offer an alternative methodology, as they permit continuous monitoring of individual connection weights during learning and retention. In such models, ongoing alterations in connection weights are required if a network is to retain previously stored material while learning new information. Thus, the duration of synaptic change does not necessarily define the persistence of a memory; rather, a regulated balance of synaptic stability and synaptic plasticity is likely required for optimal memory retention in real neuronal circuits.
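The point that a memory can persist while the underlying weights continue to change can be illustrated with a minimal simulation. The sketch below is purely illustrative and not any specific model from the literature: a tiny linear associator is trained with the delta rule on one input-output association ("memory A"), then a second association ("memory B") is learned with interleaved rehearsal of A. Recall of A remains accurate, yet the individual weights have moved away from the values they held when A was first stored. All pattern names and parameters (`xA`, `yA`, learning rate, iteration counts) are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny linear associator: output = W @ x, trained with the delta rule.
n_in, n_out = 8, 4
W = np.zeros((n_out, n_in))

# Two random input->output associations standing in for two "memories".
xA, yA = rng.standard_normal(n_in), rng.standard_normal(n_out)
xB, yB = rng.standard_normal(n_in), rng.standard_normal(n_out)

def train_step(W, x, y, lr=0.05):
    """One delta-rule update nudging W toward mapping x to y."""
    err = y - W @ x
    return W + lr * np.outer(err, x)

# Phase 1: store memory A alone, then snapshot the weights.
for _ in range(500):
    W = train_step(W, xA, yA)
W_after_A = W.copy()

# Phase 2: learn memory B while rehearsing A (interleaved training),
# so that A is retained rather than overwritten.
for _ in range(500):
    W = train_step(W, xB, yB)
    W = train_step(W, xA, yA)

# Memory A is still recalled accurately...
recall_A = np.linalg.norm(W @ xA - yA)
# ...yet the weights are no longer those that originally stored it.
drift = np.linalg.norm(W - W_after_A)

print(f"recall error for A after learning B: {recall_A:.4f}")
print(f"weight change since A was stored:    {drift:.4f}")
```

The small recall error alongside a clearly nonzero weight drift is the network-level analogue of the argument above: retention of the memory does not require retention of the particular synaptic weights that first encoded it.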