Interfaces for Discourse Summarisation: A Human Factors Analysis

Agata McCormac, Kathryn Parsons, Marcus Butavicius, Aaron Ceglar, Derek Weber, Tim Pattison, Richard Leibbrandt, Kenneth Treharne, David Powers

    Research output: Contribution to conference › Paper › peer-review

    1 Citation (Scopus)


    Empirical studies assessing the effectiveness of novel document interfaces are becoming more prevalent; however, relatively little attention has been paid to how such tools could work with less structured documents featuring multiple contributors. Participants in this study used different interfaces to answer questions requiring the exploration of collaborative discourse. User performance was influenced by an interaction of interface, transcript, and question type. Individual differences also affected performance, with higher education levels and higher general knowledge scores associated with better task performance. The results also revealed that unnecessary interface functionality can hinder performance.

    Original language: English
    Number of pages: 4
    Publication status: Published - 1 Jan 2013
    Event: 25th Australian Computer-Human Interaction Conference
    Duration: 25 Nov 2013 → …


    • Empirical Study
    • Human Computer Interaction
    • Human Factors

