Articulatory constraints on spontaneous entrainment between speech and manual gesture

Grégory Zélic, Jeesun Kim, Chris Wayne Davis

Research output: Contribution to journal › Article › peer-review

13 Citations (Scopus)


The present study examined the extent to which speech and manual gestures spontaneously entrain in a non-communicative task. Participants repeatedly uttered nonsense /CV/ syllables while continuously moving the right index finger in flexion/extension. No instructions to coordinate were given. We manipulated the type of syllable uttered (/ba/ vs. /sa/) and vocalization (phonated vs. silent speech). On the basis of principles of coordination dynamics, stronger entrainment between the finger oscillations and the jaw motion was predicted (1) for /ba/, due to the expected larger amplitude of jaw motion, and (2) in phonated speech, due to auditory feedback. Fifteen out of twenty participants showed simple ratios of speech to finger cycles (1:1, 1:2 or 2:1). Contrary to our predictions, speech-gesture entrainment was stronger when vocalizing /sa/ than /ba/, and was also more widely distributed around an in-phase mode. Furthermore, results revealed spatial anchoring and increased temporal variability in jaw motion when producing /sa/. We suggest that this indicates greater control of the speech articulators for /sa/, making the speech performance more receptive to environmental forces and thus resulting in the greater entrainment observed with the gesture oscillations. The speech-gesture coordination was maintained in silent speech, suggesting a somatosensory basis for their endogenous coupling.

Original language: English
Pages (from-to): 232-245
Number of pages: 14
Journal: Human Movement Science
Publication status: Published - 1 Aug 2015


  • Rhythmic coordination dynamics
  • Speech articulatory constraints
  • Spontaneous entrainment processes

