Please use this identifier to cite or link to this item:
https://idr.l3.nitk.ac.in/jspui/handle/123456789/16526
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Ashwin T.S. | |
dc.contributor.author | Guddeti R.M.R. | |
dc.date.accessioned | 2021-05-05T10:30:45Z | - |
dc.date.available | 2021-05-05T10:30:45Z | - |
dc.date.issued | 2020 | |
dc.identifier.citation | Future Generation Computer Systems, Vol. 108, pp. 334-348 | en_US |
dc.identifier.uri | https://doi.org/10.1016/j.future.2020.02.075 | |
dc.identifier.uri | http://idr.nitk.ac.in/jspui/handle/123456789/16526 | - |
dc.description.abstract | Automatic recognition of students’ affective states is a challenging task. These affective states are recognized using their facial expressions, hand gestures, and body postures. Intelligent tutoring systems and smart classroom environments can be made more personalized through students’ affective state analysis, which is typically performed using machine or deep learning techniques. Effective recognition of affective states depends heavily on the quality of the database used. However, very few standard databases exist for students’ affective state recognition and analysis that work for both e-learning and classroom environments. In this paper, we propose a new affective database for both e-learning and classroom environments based on students’ facial expressions, hand gestures, and body postures. The database consists of both posed (acted) and spontaneous (natural) expressions, with single and multiple persons in a single image frame, and contains more than 4000 manually annotated image frames with object localization. Classification was performed manually using a gold-standard study for both Ekman's basic emotions and learning-centered emotions, including neutral. The annotators reliably agree when discriminating between the recognized affective states, with Cohen's κ = 0.48. The created database is more robust as it considers various image variants such as occlusion, background clutter, pose, illumination, cultural and regional background, intra-class variations, cropped images, multi-point views, and deformations. Further, we analyzed the classification accuracy of our database using several state-of-the-art machine and deep learning techniques. Experimental results demonstrate that the convolutional neural network based architecture achieved accuracies of 83% and 76% for detection and classification, respectively. © 2020 Elsevier B.V. | en_US |
dc.title | Affective database for e-learning and classroom environments using Indian students’ faces, hand gestures and body postures | en_US |
dc.type | Article | en_US |
Appears in Collections: | 1. Journal Articles |
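The abstract reports inter-annotator agreement of Cohen's κ = 0.48. For reference, Cohen's kappa compares the annotators' observed agreement against the agreement expected by chance, derived from each annotator's marginal label frequencies: κ = (p_o − p_e) / (1 − p_e). A minimal sketch of that computation follows; the label values are hypothetical and are not drawn from the database described above.

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa between two annotators' label sequences."""
    assert len(labels_a) == len(labels_b) and labels_a
    n = len(labels_a)
    # Observed agreement: fraction of items both annotators labeled identically.
    p_o = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Chance agreement: product of each annotator's marginal label frequencies.
    counts_a, counts_b = Counter(labels_a), Counter(labels_b)
    p_e = sum(counts_a[k] * counts_b[k] for k in counts_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical affective-state labels from two annotators:
a = ["happy", "bored", "happy", "neutral"]
b = ["happy", "bored", "neutral", "neutral"]
print(round(cohens_kappa(a, b), 2))
```

A κ near 0 means agreement no better than chance, while κ = 1 is perfect agreement; values around 0.4-0.6, like the 0.48 reported here, are conventionally read as moderate agreement.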
Files in This Item:
There are no files associated with this item.