Investigating the Use of Eye Fixation Data for Emotion Classification in VR
Main Article Content
Abstract
Eye-tracking technology has recently become popular and is widely used in emotion-recognition research because of its usability. In this paper, we present a preliminary investigation of a novel approach that uses eye-tracking data collected in virtual reality (VR) to classify emotions into the four quadrants of Russell's circumplex model of affect. 360° videos are presented as the experimental stimuli to evoke emotions in VR, and an add-on eye tracker inside the VR headset records the eye-tracking data. Fixation data are extracted and chosen as the eye feature for this investigation. A support vector machine (SVM) with a radial basis function (RBF) kernel is used as the machine learning classifier. The best classification accuracy achieved is 69.23%. The findings show that emotion classification using fixation data yields promising prediction accuracy compared with chance level for a four-class problem.
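The classification pipeline described in the abstract (fixation features fed to an SVM with an RBF kernel, four quadrant labels) can be sketched as follows. This is a minimal illustration with synthetic stand-in data, not the authors' dataset or feature set; the fixation statistics named in the comments are assumptions about what such a feature vector might contain.

```python
# Hypothetical sketch: four-quadrant emotion classification from fixation
# features with an SVM (RBF kernel). Data and feature choices are
# illustrative assumptions, not the study's actual dataset.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Each sample is a per-trial vector of fixation statistics
# (e.g. fixation count, mean fixation duration, dispersion).
X = rng.normal(size=(200, 3))
# Quadrant labels 0-3 for Russell's circumplex (valence x arousal quadrants).
y = rng.integers(0, 4, size=200)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

# Standardize features, then fit an SVM with an RBF kernel.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
clf.fit(X_train, y_train)
print(f"test accuracy: {clf.score(X_test, y_test):.2%}")
```

With real fixation features the accuracy would be compared against the 25% chance level of a four-class problem, as the abstract does.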
Article Details
Licensing
TURCOMAT publishes articles under the Creative Commons Attribution 4.0 International License (CC BY 4.0). This licensing allows for any use of the work, provided the original author(s) and source are credited, thereby facilitating the free exchange and use of research for the advancement of knowledge.
Detailed Licensing Terms
Attribution (BY): Users must give appropriate credit, provide a link to the license, and indicate if changes were made. Users may do so in any reasonable manner, but not in any way that suggests the licensor endorses them or their use.
No Additional Restrictions: Users may not apply legal terms or technological measures that legally restrict others from doing anything the license permits.