Although spike-frequency adaptation is a commonly observed property of neurons, its functional implications are still poorly understood. In this work, using a leaky integrate-and-fire neural model that includes a Ca2+-activated K+ current (IAHP), we develop a quantitative theory of the temporal dynamics of adaptation and compare our results with recent in vivo intracellular recordings from pyramidal cells in the cat visual cortex. Experimentally testable relations between the degree and the time constant of spike-frequency adaptation are predicted. We also contrast the IAHP model with an alternative adaptation model based on a dynamical firing threshold. Possible roles of adaptation in temporal computation are explored, as a time-delayed neuronal self-inhibition mechanism. Our results include the following: (1) given the same firing rate, the variability of interspike intervals (ISIs) is either reduced or enhanced by adaptation, depending on whether the IAHP dynamics is fast or slow compared with the mean ISI in the output spike train; (2) when the inputs are Poisson-distributed (uncorrelated), adaptation generates temporal anticorrelation between ISIs; we suggest that measurement of this negative correlation provides a probe to assess the strength of IAHP in vivo; (3) the forward masking effect produced by the slow dynamics of IAHP is nonlinear and effective at selecting the strongest input among competing sources of input signals.
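The adaptation mechanism described above can be sketched as a minimal simulation: a leaky integrate-and-fire neuron whose membrane carries an extra hyperpolarizing current proportional to an intracellular Ca2+ variable that jumps at each spike and decays between spikes. All parameter values below are illustrative assumptions, not the paper's fitted values.

```python
import numpy as np

# Minimal sketch of a leaky integrate-and-fire neuron with a
# Ca2+-activated K+ adaptation current (I_AHP).
# Parameters are illustrative assumptions.
dt = 0.1          # integration step (ms)
T = 500.0         # simulation length (ms)
tau_m = 20.0      # membrane time constant (ms)
tau_ca = 100.0    # Ca2+ decay time constant (ms); sets the adaptation timescale
V_rest, V_th, V_reset = -70.0, -50.0, -60.0   # resting, threshold, reset (mV)
E_K = -80.0       # K+ reversal potential (mV)
g_ahp = 0.1       # adaptation conductance strength (dimensionless here)
d_ca = 0.2        # Ca2+ influx per spike (arbitrary units)
I_ext = 25.0      # constant input drive (mV equivalent)

V, Ca = V_rest, 0.0
spike_times = []
for step in range(int(T / dt)):
    t = step * dt
    # I_AHP = g_ahp * Ca * (V - E_K): grows with each spike and
    # hyperpolarizes the membrane, slowing subsequent firing.
    dV = (-(V - V_rest) - g_ahp * Ca * (V - E_K) + I_ext) / tau_m
    V += dt * dV
    Ca += dt * (-Ca / tau_ca)   # Ca2+ decays between spikes
    if V >= V_th:
        spike_times.append(t)
        V = V_reset
        Ca += d_ca              # spike-triggered Ca2+ influx

isis = np.diff(spike_times)
# Spike-frequency adaptation: ISIs lengthen as Ca2+ (and thus I_AHP)
# accumulates toward its steady state.
```

With these settings the Ca2+ variable builds up over the first few spikes, so later ISIs are longer than early ones, reproducing the basic adaptation phenomenology the abstract analyzes.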