Electrophysiological and hemodynamic correlates of processing isolated faces have been investigated extensively over the last decade. A question not addressed thus far is whether the visual scene that normally surrounds a face or a facial expression influences how the face is processed. Here we investigated this issue by presenting faces in natural scene contexts and measuring whether the emotional content of the scene influences processing of a facial expression. Event-related potentials were recorded to faces (fearful/neutral) embedded in scene contexts (fearful/neutral) while participants performed an orientation-decision task (face upright or inverted). Two additional experiments were run: one to examine the effects of contexts presented without a face, and the other to evaluate the effects of faces isolated from contexts. Faces without any context elicited the largest N170 amplitudes. A face in a fearful context enhanced the N170 amplitude relative to a face in a neutral context, an effect that was strongest for fearful faces at left occipito-temporal sites. This N170 effect, and the corresponding topographic distribution, was not found for contexts presented alone, indicating that the increased N170 amplitude results from the combination of a face with a fearful context. These findings suggest that the context in which a face appears may influence how the face is encoded.