Objective: To model the temporal variations in malaria episodes in a hypo-endemic area of Iran and to assess the feasibility of an epidemic early warning system.
Methods and materials: Malaria episode data for Kahnooj District, south-east Iran, were collected from the local health system for the period 1994-2002. Plasmodium species-specific models were generated using Poisson regression. Starting with a simple model that included only temporal effects (secular trend and seasonality), we iteratively added further explanatory variables to maximize goodness of fit.
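The following is a minimal sketch of the kind of baseline model described above: a Poisson regression of dekadal (10-day) case counts on a secular trend and annual harmonic seasonality terms. It is illustrative only; the data frame, column names and harmonic specification are assumptions, not the exact formulation used in the study.

```python
# A minimal sketch, assuming a dekadal time series of case counts in a
# pandas DataFrame `df` with a 'cases' column; names are illustrative.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def fit_temporal_poisson(df: pd.DataFrame):
    """Fit a Poisson regression with a secular trend and annual seasonality."""
    n = len(df)
    t = np.arange(n)                              # time index in dekads
    period = 36.0                                 # ~36 dekads per year
    X = pd.DataFrame({
        "trend": t,                               # secular trend
        "sin1": np.sin(2 * np.pi * t / period),   # annual harmonic (sine)
        "cos1": np.cos(2 * np.pi * t / period),   # annual harmonic (cosine)
    })
    X = sm.add_constant(X)
    model = sm.GLM(df["cases"], X, family=sm.families.Poisson())
    return model.fit()
```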
Results: Of 18,268 recorded malaria episodes, more than 67% were due to P. vivax. In addition to seasonality and secular trend, we found that incorporating a 1-month time lag between key meteorological variables and the predicted number of cases maximized goodness of fit. Maximum temperature, mean relative humidity and previous numbers of malaria cases were the most important predictors. These entered the model with lags of no less than three dekads (three 10-day periods, effectively 1 month).
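As an illustration of how such lagged predictors can be built and combined with the temporal terms, the sketch below shifts the meteorological series and past case counts back by three dekads before refitting the Poisson model. The 3-dekad lag mirrors the result reported above, but the variable names, the log transform of the autoregressive term and the harmonic seasonality are assumptions, not the published specification.

```python
# A minimal sketch, assuming `df` also holds dekadal 'max_temp' and
# 'rel_humidity' columns alongside 'cases'; names are illustrative.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def add_lagged_predictors(df: pd.DataFrame, lag: int = 3) -> pd.DataFrame:
    """Shift meteorological and case series back by `lag` dekads (~1 month)."""
    out = df.copy()
    out["max_temp_lag"] = out["max_temp"].shift(lag)
    out["rel_humidity_lag"] = out["rel_humidity"].shift(lag)
    out["cases_lag"] = out["cases"].shift(lag)    # autoregressive term
    return out.dropna()                           # drop rows lacking lag history

def fit_lagged_poisson(df: pd.DataFrame, lag: int = 3):
    """Poisson model combining trend, seasonality and 3-dekad lagged predictors."""
    lagged = add_lagged_predictors(df, lag)
    t = np.arange(len(lagged))
    X = pd.DataFrame({
        "trend": t,
        "sin1": np.sin(2 * np.pi * t / 36.0),
        "cos1": np.cos(2 * np.pi * t / 36.0),
        "max_temp_lag": lagged["max_temp_lag"].to_numpy(),
        "rel_humidity_lag": lagged["rel_humidity_lag"].to_numpy(),
        "log_cases_lag": np.log1p(lagged["cases_lag"].to_numpy()),
    })
    X = sm.add_constant(X)
    return sm.GLM(lagged["cases"].to_numpy(), X,
                  family=sm.families.Poisson()).fit()
```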
Conclusion: Simple models based on climatic factors and information on past case numbers may be useful in improving the quality of the malaria control programme in Iran, particularly by helping to target interventions accurately in time and space. The models developed in this study are based on explanatory data that incorporate a lag of 1 month (i.e. data that were recorded 21-50 days previously). In practice, this translates into an operational 'window' of 1 month. Provided suitable modes of data exchange exist between key stakeholders and systems for operational response are in place, this type of early warning information has the potential to lead to significant reductions in malaria morbidity in Iran.