Interfaces for Discourse Summarisation: A Human Factors Analysis

Agata McCormac, Kathryn Parsons, Marcus Butavicius, Aaron Ceglar, Derek Weber, Tim Pattison, Richard Leibbrandt, Kenneth Treharne, David Powers

    Research output: Contribution to conference › Paper

    1 Citation (Scopus)

    Abstract

    Empirical studies assessing the effectiveness of novel document interfaces are becoming more prevalent; however, relatively little attention has been paid to how such tools could work with less structured documents featuring multiple contributors. Participants in this study used different interfaces to answer questions that required exploring collaborative discourse. User performance was influenced by an interaction of interface, transcript, and question type. Individual differences also affected performance, with higher education levels and higher general knowledge scores associated with better task performance. The results also revealed that unnecessary interface functionality can hinder performance.

    Original language: English
    Pages: 139-142
    Number of pages: 4
    DOIs: 10.1145/2541016.2541069
    Publication status: Published - 1 Jan 2013
    Event: 25th Australian Computer-Human Interaction Conference
    Duration: 25 Nov 2013 → …

    Conference

    Conference: 25th Australian Computer-Human Interaction Conference
    Period: 25/11/13 → …

    Keywords

    • Empirical Study
    • Human Computer Interaction
    • Human Factors


    Cite this

    McCormac, A., Parsons, K., Butavicius, M., Ceglar, A., Weber, D., Pattison, T., Leibbrandt, R., Treharne, K., & Powers, D. (2013). Interfaces for Discourse Summarisation: A Human Factors Analysis. 139-142. Paper presented at 25th Australian Computer-Human Interaction Conference. https://doi.org/10.1145/2541016.2541069