Purpose: Longitudinal data suggest that time outdoors may be protective against myopia onset. We evaluated the hypothesis that time outdoors might create differences in circulating levels of vitamin D between myopes and non-myopes.
Methods: Subjects provided 200 μl of peripheral blood in addition to survey information about dietary intakes and time spent in indoor or outdoor activity. The 22 subjects ranged in age from 13 to 25 years. Myopes (n = 14) were defined as having at least -0.75 diopter of myopia in each principal meridian and non-myopes (n = 8) had +0.25 diopter or more hyperopia in each principal meridian. Blood level of vitamin D was measured using liquid chromatography/mass spectrometry.
Results: Unadjusted blood levels of vitamin D were not significantly different between myopes (13.95 ± 3.75 ng/ml) and non-myopes (16.02 ± 5.11 ng/ml; p = 0.29), nor were the hours spent outdoors (myopes = 12.9 ± 7.8 h; non-myopes = 13.6 ± 5.8 h; p = 0.83). In a multiple regression model, total sugar and folate from food were negatively associated with blood vitamin D, whereas theobromine and calcium were positively associated. Myopes had lower levels of blood vitamin D by an average of 3.4 ng/ml compared with non-myopes when adjusted for age and dietary intakes (p = 0.005 for refractive error group; model R² = 0.76). Gender, time outdoors, and dietary intake of vitamin D were not significant in this model.
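Purely as an illustration of the analysis structure (not part of the study), the adjusted comparison above has the general form of an ordinary least-squares regression of blood vitamin D on refractive error group plus age and dietary covariates. A minimal sketch with simulated data follows; the variable names, units, and simulated values are assumptions for demonstration only, and the statsmodels library is used for the model fit.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated data only -- NOT the study's data. A -3.4 ng/ml myopia effect is
# built into the outcome to echo the reported adjusted group difference.
rng = np.random.default_rng(0)
n = 22
df = pd.DataFrame({
    "myope": rng.integers(0, 2, n),        # 1 = myope, 0 = non-myope
    "age": rng.uniform(13, 25, n),         # years
    "sugar": rng.normal(90, 20, n),        # total sugar from food (hypothetical g/day)
    "folate": rng.normal(350, 80, n),      # folate from food (hypothetical ug/day)
    "theobromine": rng.normal(40, 15, n),  # hypothetical mg/day
    "calcium": rng.normal(900, 200, n),    # hypothetical mg/day
})
df["vitd"] = (20 - 3.4 * df["myope"] - 0.02 * df["sugar"]
              + 0.005 * df["calcium"] + rng.normal(0, 1.5, n))

# Blood vitamin D adjusted for group, age, and dietary intakes
model = smf.ols("vitd ~ myope + age + sugar + folate + theobromine + calcium",
                data=df).fit()
print(model.params["myope"])  # adjusted myope vs. non-myope difference (ng/ml)
print(model.rsquared)         # proportion of variance explained by the model
```

The coefficient on the group indicator is what the abstract reports as the adjusted difference of 3.4 ng/ml; with real data the covariates would come from the dietary survey rather than simulation.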
Conclusions: The hypothesis that time outdoors might create differences in vitamin D could not be evaluated fully because time outdoors was not significantly related to myopia in this small sample. However, when adjusted for differences in the intake of dietary variables, myopes appear to have lower average blood levels of vitamin D than non-myopes. Although this finding is consistent with the hypothesis above, it requires replication in a larger sample.