Food fortification is likely to have played an important role in the current nutritional health and well-being of populations in industrialized countries. Starting in the early part of the 20th century, fortification was used to target specific health conditions: goitre with iodized salt; rickets with vitamin D-fortified milk; beriberi, pellagra and anaemia with B-vitamin- and Fe-enriched cereals; and, more recently in the USA, the risk of pregnancies affected by neural-tube defects with folic acid-fortified cereals. A relative lack of appropriate centrally-processed food vehicles, less-developed commercial markets, and relatively low consumer awareness and demand mean that it has taken about another 50 years for fortification to be seen as a viable option for less-developed countries. The present paper reviews selected fortification initiatives in developing countries to identify the factors that contributed to their successful implementation, as well as the challenges that continually threaten the future of these programmes. Ultimately, the long-term sustainability of fortification programmes is ensured only when consumers are willing and able to bear the additional cost of fortified foods. There has been an enormous increase in fortification programmes in developing countries over the last couple of decades. Considerable progress has been made in reducing vitamin A and I deficiencies, although less so with Fe, even as Zn and folic acid deficiencies are emerging as important public health problems. Food fortification based on sound principles and supported by clear policies and regulations can play an increasingly large role in this progress towards the prevention and control of micronutrient malnutrition.