In time-division-duplex (TDD) wireless communications, the downlink beamforming performance of a smart antenna system at the base station can be degraded by variation of the spatial signature vectors of mobile users, especially in fast-fading scenarios. To mitigate this, the downlink beams must be controlled by properly adjusting their weight vectors in response to the changing propagation dynamics. This can be achieved by modeling the spatial signature vectors during the uplink period and then predicting them for use as beamforming weight vectors at the new mobile position in the downlink transmission period. We show that prediction of spatial signatures based on ADAptive LInear NEuron (ADALINE) network modeling provides a certain level of performance improvement over the conventional beamforming method, which employs the spatial signature obtained in the previous uplink interval. We compare the performance of ADALINE with autoregressive (AR) modeling based prediction under varying channel propagation conditions (mobile speed, multipath angle spread, and number of multipaths) and filter order/delay settings. ADALINE modeling outperforms AR modeling in terms of downlink SNR improvement and relative error improvement, especially at high mobile speeds, e.g., V = 100 km/h.
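The core idea of the ADALINE approach can be illustrated with a minimal sketch: an adaptive linear combiner trained by the LMS rule performs one-step-ahead prediction of a complex-valued sequence, standing in for one entry of a time-varying spatial signature vector. The function name `adaline_predict`, the filter order, the step size, and the toy rotating-phasor input (a crude proxy for a Doppler-induced phase drift) are all illustrative assumptions, not the paper's actual simulation setup.

```python
import numpy as np

def adaline_predict(x, order=4, mu=0.1):
    """One-step-ahead prediction of a complex sequence x with an
    ADALINE: a linear combiner whose weights are adapted by the
    complex LMS rule. (Illustrative sketch, not the paper's code.)"""
    w = np.zeros(order, dtype=complex)        # adaptive weight vector
    preds = np.zeros(len(x), dtype=complex)
    for n in range(order, len(x)):
        u = x[n - order:n][::-1]              # most recent samples first
        preds[n] = np.conj(w) @ u             # linear prediction y = w^H u
        e = x[n] - preds[n]                   # prediction error
        w += mu * np.conj(e) * u              # complex LMS weight update
    return preds

# Toy example: a slowly rotating unit phasor, mimicking the phase drift
# of one spatial-signature entry between uplink and downlink intervals.
n = np.arange(200)
x = np.exp(1j * 2 * np.pi * 0.01 * n)
p = adaline_predict(x, order=4, mu=0.1)
err = np.mean(np.abs(x[100:] - p[100:]) ** 2)  # post-convergence MSE
```

In the paper's setting, the predicted vector (rather than the last observed uplink snapshot) would then be used as the downlink beamforming weight vector.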