Abstract
Among the different personality traits that guide our behaviour, curiosity is particularly interesting for context-aware assistive systems, as it is closely linked to our well-being and the way we learn. This work proposes eye movement analysis for automatic recognition of different levels of curiosity. We present a 26-participant gaze dataset recorded during a real-world shopping task, with empirically validated curiosity questionnaires as ground truth. Using a support vector machine classifier and a leave-one-person-out evaluation scheme, we can discriminate among two to four classes of standard curiosity scales well above chance. These results are promising and point towards a new class of context-aware systems that take the user's curiosity into account, thereby enabling new types of interaction and user adaptation.
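For readers who want to reproduce the general evaluation setup, the following is a minimal sketch of a leave-one-person-out SVM evaluation, assuming scikit-learn. The dataset, feature count, window count per participant, and SVM hyperparameters below are placeholders for illustration only; they are not the features or settings used in the paper.

```python
# Sketch of leave-one-person-out evaluation with an SVM, assuming scikit-learn.
# All data here is synthetic; real eye-movement features (e.g. fixation and
# saccade statistics) would replace the random placeholders.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import LeaveOneGroupOut

rng = np.random.default_rng(0)

n_participants = 26        # dataset size reported in the abstract
windows_per_person = 40    # hypothetical number of gaze windows per participant
n_features = 10            # hypothetical number of eye-movement features
n_classes = 2              # two to four curiosity levels from questionnaire scores

# Placeholder feature matrix, labels, and participant IDs (groups).
X = rng.normal(size=(n_participants * windows_per_person, n_features))
y = rng.integers(0, n_classes, size=n_participants * windows_per_person)
groups = np.repeat(np.arange(n_participants), windows_per_person)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))

# Leave-one-person-out: each participant's data is held out exactly once.
scores = []
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups):
    clf.fit(X[train_idx], y[train_idx])
    scores.append(clf.score(X[test_idx], y[test_idx]))

print(f"mean leave-one-person-out accuracy: {np.mean(scores):.2f} "
      f"(chance level ≈ {1.0 / n_classes:.2f})")
```

Grouping the splits by participant rather than by sample is what makes the reported accuracies person-independent: no gaze data from the test participant is seen during training.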
| Original language | English |
| --- | --- |
| Pages | 185-188 |
| Number of pages | 4 |
| DOIs | |
| Publication status | Published - 7 Sept 2015 |
| Event | ACM International Joint Conference on Pervasive and Ubiquitous Computing - Duration: 7 Sept 2015 → … |
Conference
| Conference | ACM International Joint Conference on Pervasive and Ubiquitous Computing |
| --- | --- |
| Period | 7/09/15 → … |