Hearing a point-light talker: An auditory influence on a visual motion detection task

Jeesun Kim, Christian H. Kroos, Chris Wayne Davis

Research output: Contribution to journal › Article › peer-review

7 Citations (Scopus)

Abstract

Parsing of information from the world into objects and events occurs in both the visual and auditory modalities. It has been suggested that visual and auditory scene perception involve similar principles of perceptual organisation. Here we investigated cross-modal scene perception by determining whether an auditory stimulus could facilitate visual object segregation. Specifically, we examined whether the presentation of matched auditory speech would facilitate the detection of a point-light talking face amid point-light distractors. An adaptive staircase procedure (3-down-1-up rule) was used to estimate the 79% correct threshold in a two-alternative forced-choice procedure. To determine whether different degrees of speech motion would produce auditory influences of different sizes, two speech modes were tested (speech produced in quiet and Lombard speech). A facilitatory auditory effect on talking-face detection was found; the size of this effect did not differ between the two speech modes.
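The 3-down-1-up rule mentioned in the abstract is Levitt's (1971) transformed up-down method, which converges on the stimulus level yielding roughly 79.4% correct: the level is made harder after three consecutive correct responses and easier after any error. A minimal sketch, assuming a generic difficulty scale; the function name, parameters, and reversal-averaging threshold estimate are illustrative, not taken from the paper:

```python
def staircase_3down_1up(start_level, step, n_reversals, respond):
    """Transformed 3-down-1-up staircase (Levitt, 1971).

    Converges on the stimulus level producing ~79.4% correct:
    the level decreases (task gets harder) after 3 consecutive
    correct responses and increases after any incorrect response.
    `respond(level)` must return True (correct) or False.
    """
    level = start_level
    correct_run = 0
    last_direction = None
    reversal_levels = []
    while len(reversal_levels) < n_reversals:
        if respond(level):
            correct_run += 1
            if correct_run == 3:            # 3 correct in a row -> harder
                correct_run = 0
                if last_direction == "up":  # direction change = reversal
                    reversal_levels.append(level)
                last_direction = "down"
                level -= step
        else:
            correct_run = 0                 # any error -> easier
            if last_direction == "down":
                reversal_levels.append(level)
            last_direction = "up"
            level += step
    # Threshold estimate: mean level at the recorded reversals
    return sum(reversal_levels) / len(reversal_levels)
```

With a deterministic simulated observer that is correct whenever the level is at or above 5, the staircase oscillates around that boundary and the reversal average lands between the two adjacent levels, e.g. `staircase_3down_1up(10, 1, 8, lambda lvl: lvl >= 5)` returns 4.5.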

Original language: English
Pages (from-to): 407-416
Number of pages: 10
Journal: Perception
Volume: 39
Issue number: 3
DOIs
Publication status: Published - 2010

