Items Outperform Adjectives in a Computational Model of Binary Semantic Classification
- Author
- Diachek, Evgeniia, Brown‐Schmidt, Sarah, and Polyn, Sean M.
- Subjects
- SEMANTIC memory, THEORY of knowledge, VECTOR spaces, UTILITY theory, ADJECTIVES (Grammar)
- Abstract
Semantic memory encompasses one's knowledge about the world. Distributional semantic models, which construct vector spaces with embedded words, are a proposed framework for understanding the representational structure of human semantic knowledge. Unlike some classic semantic models, distributional semantic models lack a mechanism for specifying the properties of concepts, which raises questions regarding their utility for a general theory of semantic knowledge. Here, we develop a computational model of a binary semantic classification task, in which participants judged target words for the referent's size or animacy. We created a family of models, evaluating multiple distributional semantic models, and mechanisms for performing the classification. The most successful model constructed two composite representations for each extreme of the decision axis (e.g., one averaging together representations of characteristically big things and another of characteristically small things). Next, the target item was compared to each composite representation, allowing the model to classify more than 1,500 words with human‐range performance and to predict response times. We propose that when making a decision on a binary semantic classification task, humans use task prompts to retrieve instances representative of the extremes on that semantic dimension and compare the probe to those instances. This proposal is consistent with the principles of the instance theory of semantic memory. [ABSTRACT FROM AUTHOR]
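The decision mechanism the abstract describes, averaging exemplars of each extreme into a composite representation and comparing the probe to both, can be sketched roughly as follows. The embedding values, exemplar lists, and function names are illustrative assumptions for a toy example, not the authors' actual model, vectors, or stimuli.

```python
# Minimal sketch of composite-comparison classification in a word-vector space.
# All vectors and word lists below are made-up placeholders, not the paper's data.
import numpy as np

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def composite(words, embeddings):
    """Average the vectors of exemplar words into one composite representation."""
    return np.mean([embeddings[w] for w in words], axis=0)

def classify(target, pole_a_words, pole_b_words, embeddings):
    """Compare the target to each pole's composite; return the closer pole's label."""
    sim_a = cosine(embeddings[target], composite(pole_a_words, embeddings))
    sim_b = cosine(embeddings[target], composite(pole_b_words, embeddings))
    return ("pole_a", sim_a) if sim_a >= sim_b else ("pole_b", sim_b)

# Toy usage with 3-dimensional stand-ins for a real distributional semantic model.
emb = {
    "elephant": np.array([0.9, 0.1, 0.2]),
    "whale":    np.array([0.8, 0.2, 0.1]),
    "ant":      np.array([0.1, 0.9, 0.3]),
    "pebble":   np.array([0.2, 0.8, 0.2]),
    "horse":    np.array([0.7, 0.3, 0.2]),
}
label, sim = classify("horse", ["elephant", "whale"], ["ant", "pebble"], emb)
print(label, round(sim, 3))  # classifies "horse" with the "big" composite (pole_a)
```

In the full model, the similarity margin between the two composites could also be read out as a proxy for decision difficulty, which is the kind of quantity the authors use to predict response times.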
- Published
- 2023