Intravenous (IV) iron is required for optimal anemia management in the majority of hemodialysis (HD) patients. Although IV iron prescription has increased over time, the best dosing strategy is unknown, and any effect of IV iron on survival remains unclear. Here we used adjusted Cox regression to analyze associations between IV iron dose and clinical outcomes in 32,435 HD patients in 12 countries from 2002 to 2011 in the Dialysis Outcomes and Practice Patterns Study. The primary exposure was total prescribed IV iron dose over the first 4 months in the study, expressed as an average dose/month. Compared with 100-199 mg/month (the most common dose range), case-mix-adjusted mortality was similar for the 0, 1-99, and 200-299 mg/month categories but significantly higher for the 300-399 mg/month (hazard ratio [HR] 1.13, 95% confidence interval [CI] 1.00-1.27) and 400 mg/month or more (HR 1.18, 95% CI 1.07-1.30) groups. Convergent validity was supported by an instrumental variable analysis using HD facility as the instrument and by an analysis expressing IV iron dose per kg body weight. Associations with cause-specific mortality (cardiovascular, infectious, and other) were generally similar to those for all-cause mortality. Hospitalization risk was elevated among patients receiving 300 mg/month or more compared with 100-199 mg/month (HR 1.12, 95% CI 1.07-1.18). Given these associations, a well-powered clinical trial evaluating the safety of different IV iron dosing strategies in HD patients is urgently needed.
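The exposure definition above (average monthly dose over the first 4 study months, mapped to the reported dose categories) can be sketched as follows. This is a minimal illustrative sketch only: the function names and the exact handling of category boundaries are assumptions, with bin edges taken from the dose ranges reported in the abstract.

```python
# Hypothetical sketch of the exposure categorization described above.
# Bin edges follow the abstract's reported ranges (mg/month); boundary
# handling is an assumption for illustration.

def average_monthly_dose(total_dose_mg_first_4_months: float) -> float:
    """Average prescribed IV iron dose per month over the first 4 study months."""
    return total_dose_mg_first_4_months / 4.0

def iron_dose_category(avg_dose_mg_per_month: float) -> str:
    """Map an average monthly dose to the study's exposure categories."""
    if avg_dose_mg_per_month <= 0:
        return "0"
    if avg_dose_mg_per_month < 100:
        return "1-99"
    if avg_dose_mg_per_month < 200:
        return "100-199"  # reference category (most common dose range)
    if avg_dose_mg_per_month < 300:
        return "200-299"
    if avg_dose_mg_per_month < 400:
        return "300-399"
    return ">=400"
```

For example, a patient prescribed 1,600 mg over the first 4 months averages 400 mg/month and falls in the highest-exposure group, for which the abstract reports elevated adjusted mortality (HR 1.18, 95% CI 1.07-1.30) relative to the 100-199 mg/month reference.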