Cross-situational learning of sign-like gestures in children and adults: a behavioural and event-related potential study

Arianna Colombani, Varghese Peter, Qian Yin Mai, Amanda Saksida, Natalie Boll-Avetisyan, Outi Tuomainen, Mridula Sharma

Research output: Contribution to journal › Article › peer-review


Abstract

Drawing on the innate human ability to detect regularities in the language input (statistical learning), this study applies a cross-situational learning paradigm to test the learning of unfamiliar sign-like gestures (in the form of pseudosigns) for familiar spoken words in children and adults. Twenty-five children (8–11 years) and 19 adults (18–35 years) were familiarised with 8 word-pseudosign pairs and tested on a recognition and a semantic categorisation task, with simultaneous EEG recording. Both groups demonstrated above-chance accuracy, indicating successful learning of word-pseudosign pairs and their meanings, with an advantage of adults over children. Both groups also showed an N400 followed by an LPC response during the recognition task. During categorisation, adults demonstrated an N400 response, whereas, in children, an N400 emerged only when the correctly identified trials were considered. These results suggest that pseudosigns are highly salient linguistic inputs, likely to be learned through statistical computations.

Original language: English
Pages (from-to): 1324–1349
Number of pages: 26
Journal: Language, Cognition and Neuroscience
Volume: 40
Issue number: 10
DOIs
Publication status: Published - Dec 2025

Keywords

  • associative learning
  • cross-situational learning
  • LPC
  • N400
  • pseudosigns
  • semantic categorisation
  • statistical learning
