Paying attention to faces is of special importance for humans as well as for scientific research. The experimental manipulation of facial information offers an ecologically valid approach to investigating emotion, attention, and social functioning. Humans are highly specialized in face perception, and event-related brain potentials (ERPs) provide insight into the temporal dynamics of the underlying neuronal mechanisms. Here, we summarize ERP research from the last decade examining the processing of emotional compared to neutral facial expressions along the visual processing stream. A particular focus lies on the impact of attention tasks on early (P1, N170), mid-latency (P2, EPN), and late (P3, LPP) stages of processing. This review systematizes facial emotion effects as a function of different attention tasks: (1) when faces serve as mere distractors, (2) when faces are viewed passively, (3) when attention is directed at the faces themselves, and (4) when attention is directed at the facial expression. We find that fearful and angry expressions reliably modulate the N170, EPN, and LPP components, with the latter benefiting from attention directed at the emotional facial expression.