Kinematics in context: Predicting other's action intentions entails the perception of affordances

Ayeh Alhasan, Eyal Karin, Nathan Caruana, Emily Cross, David Kaplan, Michael J. Richardson

Research output: Contribution to journal › Article › peer-review


Abstract

Intention prediction is essential for successful social interaction, but traditional research focusing solely on movement kinematics often overlooks the array of action possibilities in natural settings. This study employs a mixed-methods approach to explore intention prediction, analysing free-text responses from participants who watched videos of an actor reaching for a cup, bottle, or spoon, each with a distinct intention. Each video included varied environmental contexts to suggest specific intentions (e.g., full cups for drinking, empty cups for clearing) or presented ambiguous contexts (e.g., half-full cups). We found that participants’ intention predictions depended on the variety of action possibilities presented by both kinematics and context. Participants tended to identify the primary action possibility of the grasped item as the intended action when both kinematics and context supported its feasibility. Predictions diversified when kinematics or context suggested that the object's primary action was less likely. Our findings suggest that while intention predictions can sometimes be inaccurate, they align with the (most functional) action possibilities (i.e., affordances) indicated by the actor's movements within a given context.

Original language: English
Article number: 106122
Number of pages: 13
Journal: Cognition
Volume: 260
Publication status: Published - Jul 2025

Keywords

  • Action observation
  • Action prediction
  • Action understanding
  • Affordances
  • Environmental context
  • Intention
  • Intention prediction
  • Kinematics

