Abstract
Social distancing and remote work have become more prevalent in the post-COVID world, and demand for remote healthcare sessions has grown accordingly. Although a growing number of such sessions now use online platforms as the medium of communication, critical signals such as the trainee's affective state and other feedback opportunities are lost in the transmission of this digital information. This paper presents a solution that leverages a brain-computer interface system for affective feedback and a humanoid robot for effective teaching during remote sessions. The solution uses a Kinect as the sensing mechanism on the trainer's side and state-of-the-art deep learning algorithms at the back-end to infer the trainee's emotional state. Training poses (from the humanoid's camera feed and the Kinect) are estimated using AlphaPose and compared using inverse kinematics. To ascertain the trainee's state (high vs. low valence and arousal), a Capsule Network is used, achieving an average classification accuracy of 90.4% with a low average inference time of 14.3 ms on the publicly available DREAMER and AMIGOS datasets. The system also allows real-time communication through the humanoid, making the experience even more distinct for the trainee.
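The pose comparison described in the abstract can be illustrated with a simple joint-angle check. The sketch below is an assumption-laden stand-in for the paper's full inverse-kinematics comparison: it takes AlphaPose-style 2D keypoints for trainer and trainee and scores how closely the trainee's joint angles match. The keypoint names and the `joint_angle` / `pose_error` helpers are hypothetical, not from the paper.

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at joint b, formed by segments b->a and b->c.

    a, b, c are (x, y) keypoints, e.g. shoulder, elbow, wrist.
    """
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norm = math.hypot(*v1) * math.hypot(*v2)
    # Clamp to [-1, 1] to guard against floating-point drift.
    cos_theta = max(-1.0, min(1.0, dot / norm))
    return math.degrees(math.acos(cos_theta))

def pose_error(trainer, trainee, triplets):
    """Mean absolute joint-angle difference over the given joint triplets.

    trainer/trainee map keypoint names to (x, y); triplets lists
    (end, joint, end) name tuples, e.g. ("shoulder", "elbow", "wrist").
    """
    diffs = [
        abs(joint_angle(*(trainer[k] for k in t))
            - joint_angle(*(trainee[k] for k in t)))
        for t in triplets
    ]
    return sum(diffs) / len(diffs)

# Hypothetical right-arm keypoints for a trainer and a trainee.
trainer = {"shoulder": (0.0, 1.0), "elbow": (0.0, 0.0), "wrist": (1.0, 0.0)}
trainee = {"shoulder": (0.0, 1.0), "elbow": (0.0, 0.0), "wrist": (1.0, 0.5)}
err = pose_error(trainer, trainee, [("shoulder", "elbow", "wrist")])
```

A real system would run this over all limb triplets per frame and feed the residual back to the humanoid as corrective guidance; angle-based comparison has the advantage of being invariant to the trainee's position and scale in the camera frame.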
Original language | English |
---|---|
Pages (from-to) | 17606-17614 |
Number of pages | 9 |
Journal | IEEE Sensors Journal |
Volume | 22 |
Issue number | 18 |
Early online date | 5 Jan 2021 |
DOIs | |
Publication status | Published - 15 Sept 2022 |
Externally published | Yes |
Keywords
- Feature extraction
- Humanoid robots
- Medical treatment
- Real-time systems
- Robots
- Sensors
- human-robot interaction
- statistical learning
- brain-computer interface
- Affective computing
- emotion analysis
- skeletal tracking