Semantic coordination of speech and gesture in young children
- Source :
- PUB-Publications at Bielefeld University
Abstract
- People use speech and gesture together when describing an event or action, with each modality offering different expressive opportunities (Kendon, 2004). One open question is how the two modalities are semantically coordinated, i.e., how meaning is distributed across speech and the accompanying gestures. While this has so far been studied only for adult speakers, in this paper we present a study of how young children (4 years of age) semantically coordinate speech and gesture, and how this coordination relates to their cognitive and, indirectly, their verbal skills. Results indicate significant positive correlations between the children's cognitive skills and their gesture-speech coordination. In addition, high cognitive skills correlate with the number of semantically relevant child descriptions, revealing a link between verbal and cognitive skills.
- Subjects :
- Information Systems / Information Interfaces and Presentation (e.g., HCI)
Details
- Database :
- OpenAIRE
- Journal :
- PUB-Publications at Bielefeld University
- Accession number :
- edsair.dedup.wf.001..3e8b24d6839b7fd4642aaeaa89d68f18