Determinants of Shared and Idiosyncratic Contributions to Judgments of Faces.
- Author
Albohn, Daniel N., Martinez, Joel E., and Todorov, Alexander
- Abstract
Recent work has shown that the idiosyncrasies of the observer can contribute more to the variance of social judgments of faces than the features of the faces. However, it is unclear what conditions determine the relative contributions of shared and idiosyncratic variance. Here, we examine two conditions: type of judgment and diversity of face stimuli. First, we show that for simpler, directly observable judgments that are consistent across observers (e.g., masculinity), shared variance exceeds idiosyncratic variance, whereas for more complex and less directly observable judgments (e.g., trustworthiness), idiosyncratic variance exceeds shared variance. Second, we show that judgments of more diverse face images increase the amount of shared variance. Finally, using machine-learning methods, we examine how stimulus variables (e.g., incidental emotion resemblance, skin luminosity) and observer variables (e.g., race, age) contribute to shared and idiosyncratic variance of judgments. Overall, our results indicate that an observer's age is the most consistent and best predictor of idiosyncratic variance contributions to face judgments measured in the current research.

Public Significance Statement: Group-level models of judgment have provided important insights into how individuals view others. However, emerging evidence suggests that group-level averages explain only a portion of the reliable variance, necessitating models that represent the idiosyncrasies of the individual. Despite this evidence, little is known about the relative contributions of the stimulus and the perceiver. In the current work, we provide evidence that the variance explained by the stimulus and the perceiver is not fixed, but rather depends on theoretically and practically important factors, including the type of judgment being examined and the diversity of the stimulus set being evaluated.
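Note: the abstract's central quantity, the split of judgment variance into a shared (face-driven) part and an idiosyncratic (observer-driven) part, can be illustrated with a minimal sketch. The code below is not the authors' analysis pipeline; it is a hedged example, assuming a hypothetical faces-by-raters rating matrix and a textbook two-way mean-squares decomposition (NumPy only), to show one common way the two components can be separated.

```python
# Illustrative sketch only: a standard two-way (face x rater) variance
# decomposition of a rating matrix. NOT the paper's method; it simply shows
# how "shared" (face-driven) and "idiosyncratic" (rater-specific plus
# rater-by-face) variance can be pulled apart.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 100 faces rated by 30 observers on one trait.
n_faces, n_raters = 100, 30
face_effect = rng.normal(0, 1.0, size=(n_faces, 1))     # shared signal
rater_effect = rng.normal(0, 0.5, size=(1, n_raters))   # rater scale use
noise = rng.normal(0, 1.0, size=(n_faces, n_raters))    # idiosyncratic + error
ratings = 4 + face_effect + rater_effect + noise         # faces x raters matrix

def variance_components(R):
    """Mean-squares decomposition for a faces x raters matrix (no replication)."""
    f, r = R.shape
    grand = R.mean()
    face_means = R.mean(axis=1, keepdims=True)
    rater_means = R.mean(axis=0, keepdims=True)

    ms_face = r * ((face_means - grand) ** 2).sum() / (f - 1)
    ms_rater = f * ((rater_means - grand) ** 2).sum() / (r - 1)
    resid = R - face_means - rater_means + grand
    ms_resid = (resid ** 2).sum() / ((f - 1) * (r - 1))

    var_face = max((ms_face - ms_resid) / r, 0.0)    # shared (stimulus-driven)
    var_rater = max((ms_rater - ms_resid) / f, 0.0)  # rater main effect
    var_resid = ms_resid                             # rater-by-face + error
    return var_face, var_rater, var_resid

v_face, v_rater, v_resid = variance_components(ratings)
total = v_face + v_rater + v_resid
print(f"shared (face) variance:         {v_face / total:.2%}")
print(f"rater main-effect variance:     {v_rater / total:.2%}")
print(f"idiosyncratic + error variance: {v_resid / total:.2%}")
```

With the simulated effect sizes above, the face component dominates; shrinking the face effect or inflating the noise term shifts the balance toward idiosyncratic variance, mirroring the judgment-type contrast described in the abstract.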
- Published
2024