Background: The notion that sun exposure is a risk factor for age-related macular degeneration (AMD) is widespread, but studies have not shown this conclusively.
Methods: To test the hypothesis that AMD cases have greater ocular sun exposure than control subjects, the authors compared 409 cases with 286 control subjects resident in Newcastle, Australia. Participants' sensitivity to sun and glare was characterized. Sun exposure was estimated from detailed histories and was validated against the sun-seeking or sun-avoidance behavior expected given each participant's sun sensitivity and history of treatment for skin neoplasia.
Results: Contrary to the authors' hypothesis, control subjects had greater median annual ocular sun exposure (865 hours) than cases (723 hours), Mann-Whitney U = 45704, z = -4.9, P < 0.0001. Cases tanned more poorly than did control subjects (chi-square = 18.2, df = 4, P = 0.001) and as young adults were more sensitive to glare (odds ratio [OR], 2.5; 95% confidence interval [CI], 1.8 to 3.5). After stratifying by tanning ability, in the poor-tanning group the median annual sun exposure of control subjects (685 hours) exceeded that of cases (619 hours), U = 6556, z = -1.9, P = 0.06. Among people who tanned well, control subjects also had significantly greater annual sun exposure than did cases (940 vs. 770 hours), U = 16263, z = -3.7, P = 0.0002.
Conclusions: Sensitivity to glare and poor tanning ability are markers of increased AMD risk. Sun sensitivity therefore confounds study of the postulated AMD-sunlight link. Even in analyses stratified by sun sensitivity, sun exposure remained greater in control subjects than in cases with AMD.