Abstract
Empirical studies assessing the effectiveness of novel document interfaces are becoming more prevalent; however, relatively little attention has been paid to how such tools might handle less structured documents with multiple contributors. Participants in this study used different interfaces to answer questions requiring the exploration of collaborative discourse. User performance was influenced by an interaction of interface, transcript, and question type. Individual differences also affected performance: higher education levels and higher general knowledge scores were associated with better task performance. The results also revealed that unnecessary interface functionality can hinder performance.
Original language | English |
---|---|
Pages | 139-142 |
Number of pages | 4 |
DOIs | |
Publication status | Published - 1 Jan 2013 |
Event | 25th Australian Computer-Human Interaction Conference - Duration: 25 Nov 2013 → … |
Conference
Conference | 25th Australian Computer-Human Interaction Conference |
---|---|
Period | 25/11/13 → … |
Keywords
- Empirical Study
- Human Computer Interaction
- Human Factors