We rely on color to select objects as the targets of our actions (e.g., the freshest fish, the ripest fruit). To be useful for selection, color must provide accurate guidance about object identity across changes in illumination. Although the visual system partially stabilizes object color appearance across illumination changes, how such color constancy supports object selection is not understood. To study how constancy operates in real-life tasks, we developed a novel paradigm in which subjects selected which of two test objects presented under a test illumination appeared closer in color to a target object presented under a standard illumination. From subjects' choices, we inferred a selection-based match for the target via a variant of maximum likelihood difference scaling (MLDS), and used it to quantify constancy. Selection-based constancy was good when measured using naturalistic stimuli, but was dramatically reduced when the stimuli were simplified, indicating that a naturalistic stimulus context is critical for good constancy. Overall, our results suggest that color supports accurate object selection across illumination changes when both stimuli and task match how color is used in real life. We compared our selection-based constancy results with data obtained using a classic asymmetric matching task and found that the adjustment-based matches predicted selection well for our stimuli and instructions, indicating that the appearance literature provides useful guidance for the emerging study of constancy in natural tasks.
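To illustrate the kind of inference the paradigm relies on, the sketch below simulates how a selection-based match might be recovered from binary "which test is closer to the target" choices via a maximum-likelihood fit in the spirit of maximum likelihood difference scaling. Everything here is illustrative and assumed, not the paper's actual stimuli, parameters, or fitting code: six hypothetical competitor stimuli and the target are each assigned a position on a one-dimensional perceptual scale, trial-by-trial choices follow a noisy signed-difference model, and the scale positions are then re-estimated from the simulated choices.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)

# Assumed setup (illustrative only): six competitor stimuli with "true" 1-D
# perceptual scale values, plus a target whose position on that scale we want
# to recover from paired-comparison choices.
true_psi = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])
true_target = 0.42
sigma = 0.15  # decision noise, held fixed as in standard MLDS-style fits

# Simulate trials: the observer sees competitors i and j and picks the one
# that appears closer to the target, with
#   P(choose i) = Phi((|psi_j - psi_t| - |psi_i - psi_t|) / sigma).
n_trials = 2000
i_idx = np.empty(n_trials, dtype=int)
j_idx = np.empty(n_trials, dtype=int)
for t in range(n_trials):
    i_idx[t], j_idx[t] = rng.choice(len(true_psi), size=2, replace=False)
d_true = np.abs(true_psi[j_idx] - true_target) - np.abs(true_psi[i_idx] - true_target)
chose_i = rng.random(n_trials) < norm.cdf(d_true / sigma)

def neg_log_likelihood(params):
    # Gauge-fix the scale by pinning the first and last competitors at 0 and 1;
    # free parameters are the four interior competitors and the target position.
    psi = np.concatenate(([0.0], params[:4], [1.0]))
    psi_t = params[4]
    d = np.abs(psi[j_idx] - psi_t) - np.abs(psi[i_idx] - psi_t)
    p_i = np.clip(norm.cdf(d / sigma), 1e-9, 1 - 1e-9)
    return -np.sum(np.where(chose_i, np.log(p_i), np.log(1.0 - p_i)))

x0 = np.array([0.25, 0.45, 0.65, 0.85, 0.5])  # rough starting guess
fit = minimize(neg_log_likelihood, x0, method="Nelder-Mead",
               options={"maxiter": 5000, "xatol": 1e-4, "fatol": 1e-4})
psi_hat = np.concatenate(([0.0], fit.x[:4], [1.0]))
target_hat = fit.x[4]

# The selection-based match is the competitor whose fitted scale value lies
# closest to the fitted target position.
match_index = int(np.argmin(np.abs(psi_hat - target_hat)))
print(f"recovered target position: {target_hat:.3f}, match: competitor {match_index}")
```

With enough trials the fitted target position lands near its generating value, and the competitor closest to it on the fitted scale serves as the inferred match; a constancy index could then compare this match across illumination conditions.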