Abstract
We present FakeET, an eye-tracking database for understanding human visual perception of deepfake videos. Given that the principal purpose of deepfakes is to deceive human observers, FakeET is designed to understand and evaluate viewers' ability to detect synthetic video artifacts. FakeET contains viewing patterns compiled from 40 users via the Tobii desktop eye-tracker for 811 videos from the Google Deepfake dataset, with a minimum of two viewings per video. Additionally, EEG responses acquired via the Emotiv sensor are also available. The compiled data confirm (a) distinct eye-movement characteristics for real vs. fake videos; (b) the utility of eye-track saliency maps for spatial forgery localization and detection; and (c) Error-Related Negativity (ERN) triggers in the EEG responses, along with the ability of the raw EEG signal to distinguish between real and fake videos.
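The abstract refers to eye-track saliency maps used for spatial forgery localization. As a rough illustration of what such a map involves, the sketch below aggregates fixation coordinates into a Gaussian-smoothed heatmap; the function name, frame size, and smoothing parameter are illustrative assumptions, not the authors' actual pipeline.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def fixation_saliency_map(fixations, frame_size=(1080, 1920), sigma_px=35.0):
    """Accumulate (x, y) fixation points into a Gaussian-smoothed saliency map.

    fixations  : iterable of (x, y) pixel coordinates from an eye-tracker
    frame_size : (height, width) of the video frame (assumed, not from the paper)
    sigma_px   : smoothing radius in pixels (assumed, not from the paper)
    """
    h, w = frame_size
    heat = np.zeros((h, w), dtype=np.float64)
    for x, y in fixations:
        xi, yi = int(round(x)), int(round(y))
        if 0 <= xi < w and 0 <= yi < h:
            heat[yi, xi] += 1.0          # count fixations per pixel
    heat = gaussian_filter(heat, sigma=sigma_px)  # spread counts spatially
    if heat.max() > 0:
        heat /= heat.max()               # normalise to [0, 1]
    return heat

# Example: synthetic fixations clustered around a hypothetical face region
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fixations = rng.normal(loc=(960, 400), scale=40, size=(200, 2))
    saliency = fixation_saliency_map(fixations)
    print(saliency.shape, saliency.max())
```

A map like this can be overlaid on video frames to compare where viewers look in real versus manipulated regions; any thresholding or localization logic beyond that is specific to the paper and not reproduced here.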
Original language | English |
---|---|
Title of host publication | ICMI '20 |
Subtitle of host publication | Proceedings of the 2020 International Conference on Multimodal Interaction |
Place of Publication | New York, NY |
Publisher | Association for Computing Machinery, Inc |
Pages | 519-527 |
Number of pages | 9 |
ISBN (Electronic) | 9781450375818 |
DOIs | |
Publication status | Published - 22 Oct 2020 |
Externally published | Yes |
Event | 22nd ACM International Conference on Multimodal Interaction, Virtual/Online, Netherlands, 25 Oct 2020 → 29 Oct 2020 (Conference number: 22nd) |
Publication series
Name | Proceedings of the International Conference on Multimodal Interaction |
---|---|
Volume | 2020 |
Conference
Conference | 22nd ACM International Conference on Multimodal Interaction |
---|---|
Abbreviated title | ICMI 2020 |
Country/Territory | Netherlands |
City | Virtual, Online |
Period | 25/10/20 → 29/10/20 |
Keywords
- deepfake
- EEG
- eye-tracking
- visual perception