How visual timing and form information affect speech and non-speech processing

Jeesun Kim, Chris Wayne Davis

Research output: Contribution to journal › Article › peer-review

13 Citations (Scopus)

Abstract

Auditory speech processing is facilitated when the talker's face/head movements are seen. This effect is typically explained in terms of visual speech providing form and/or timing information. We determined the effect of both types of information on a speech/non-speech task (the non-speech stimuli were spectrally rotated speech). All stimuli were presented paired with the talker's static or moving face. Two types of moving-face stimuli were used: full-face versions (both spoken form and timing information available) and modified-face versions (only timing information, provided by peri-oral motion, available). The results showed that the peri-oral timing information facilitated response times for speech and non-speech stimuli compared to a static face. An additional facilitatory effect was found for full-face versions compared to the timing condition; this effect only occurred for speech stimuli. We propose that the timing effect was due to cross-modal phase resetting and the form effect to cross-modal priming.
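The non-speech stimuli were created by spectral rotation, which mirrors a band-limited speech spectrum about a midpoint frequency so that the signal keeps speech-like temporal and amplitude structure but is unintelligible. A minimal sketch of the standard modulation-based approach is below; the file name, 16 kHz assumption, and 4 kHz band limit are illustrative choices, not the parameters reported in the study.

```python
# Sketch of spectral rotation of a speech recording (illustrative parameters).
import numpy as np
from scipy.io import wavfile
from scipy.signal import butter, sosfiltfilt

rate, speech = wavfile.read("talker.wav")   # hypothetical mono input file
speech = speech.astype(np.float64)

# Band-limit to 0-4 kHz so the rotated spectrum stays within the same band.
sos = butter(8, 4000, btype="low", fs=rate, output="sos")
lowpassed = sosfiltfilt(sos, speech)

# Multiplying by a 4 kHz carrier mirrors the spectrum about 2 kHz;
# a second low-pass removes the shifted upper image.
t = np.arange(len(lowpassed)) / rate
rotated = sosfiltfilt(sos, lowpassed * np.cos(2 * np.pi * 4000 * t))

# Normalise and save as 16-bit PCM.
scaled = rotated / np.abs(rotated).max() * 32767
wavfile.write("rotated.wav", rate, scaled.astype(np.int16))
```

The rotated signal preserves the original's envelope and rhythm (the timing cues of interest) while removing its phonetic form, which is what makes it a useful non-speech control.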

Original language: English
Pages (from-to): 86-90
Number of pages: 5
Journal: Brain and Language
Volume: 137
DOIs
Publication status: Published - 1 Oct 2014

Keywords

  • Auditory and visual speech processing
  • Visual form and timing information
  • Visual speech
