Visual form predictions facilitate auditory processing at the N1

Tim Paris, Jeesun Kim, Chris Wayne Davis

Research output: Contribution to journal › Article › peer-review



Auditory-visual (AV) events often involve a leading visual cue (e.g. auditory-visual speech) that allows the perceiver to generate predictions about the upcoming auditory event. Electrophysiological evidence suggests that when an auditory event is predicted, processing is sped up, i.e., the N1 component of the ERP occurs earlier (N1 facilitation). However, it is not clear (1) whether N1 facilitation is based specifically on prediction rather than on multisensory integration, and (2) which particular properties of the visual cue it is based on. The current experiment used artificial AV stimuli in which visual cues predicted but did not co-occur with auditory cues. Visual form cues (high and low salience) and the auditory-visual pairing were manipulated so that auditory predictions could be based on form and timing, or on timing only. The results showed that N1 facilitation occurred only for combined form and temporal predictions. These results suggest that faster auditory processing (as indicated by N1 facilitation) depends on predictive processing generated by a visual cue that clearly signals both what the auditory stimulus will be and when it will occur.

Original language: English
Pages (from-to): 157-164
Number of pages: 8
Publication status: Published - 20 Feb 2017


Keywords
  • audiovisual
  • EEG
  • N1 latency
  • prediction


