Personality Assessment Based on Multimodal Attention Network Learning With Category-Based Mean Square Error
Published in: IEEE Transactions on Image Processing: A Publication of the IEEE Signal Processing Society, Vol. 31 (2022), pp. 2162-2174
Author:
Additional authors:
Format: Online article
Language: English
Published: 2022
Access to the parent work: IEEE Transactions on Image Processing: A Publication of the IEEE Signal Processing Society
Keywords: Journal Article
Abstract: Personality analysis is widely used in occupational aptitude tests and entrance psychological tests. However, answering hundreds of questionnaire items in one sitting is a burden. Inspired by personality psychology, we propose a multimodal attention network with a category-based mean square error (CBMSE) loss for personality assessment. With this method, we can obtain information about a person's behaviour from his or her daily videos, including gaze distribution, speech features, and facial expression changes, to accurately determine personality traits. In particular, we propose a new approach to implementing an attention mechanism based on the facial Region of No Interest (RoNI), which achieves higher accuracy while reducing the number of network parameters. Simultaneously, we use CBMSE, a loss function that imposes a higher penalty on fuzzy boundary cases in personality assessment, to help the network distinguish boundary data. After effective data fusion, this method achieves an average prediction accuracy of 92.07%, higher than any other state-of-the-art model on the dataset of the ChaLearn Looking at People challenge held in association with ECCV 2016.
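The abstract does not give the exact form of CBMSE, only that it penalizes predictions near the fuzzy category boundary more heavily. The sketch below is one plausible reading of that idea, not the paper's formula: it re-weights a standard MSE with a Gaussian bump centred on an assumed boundary value of 0.5 for trait scores normalized to [0, 1]. The function name `cbmse_loss` and the parameters `boundary`, `alpha`, and `sigma` are illustrative assumptions.

```python
import torch

def cbmse_loss(pred, target, boundary=0.5, alpha=4.0, sigma=0.1):
    """Illustrative category-based MSE: squared errors are weighted more
    heavily when the ground-truth trait score lies near the assumed
    category boundary (0.5 on a [0, 1] scale)."""
    # Gaussian bump around the boundary gives weights in [1, 1 + alpha]
    weight = 1.0 + alpha * torch.exp(-((target - boundary) ** 2) / (2 * sigma ** 2))
    return torch.mean(weight * (pred - target) ** 2)

# Usage sketch: a batch of 8 clips, each scored on the five Big Five traits
pred = torch.rand(8, 5, requires_grad=True)   # stand-in for network output
target = torch.rand(8, 5)                     # stand-in for annotations in [0, 1]
loss = cbmse_loss(pred, target)
loss.backward()                               # gradients are scaled by the boundary weights
```

Under this assumed form, borderline samples contribute up to 1 + alpha times the usual squared-error penalty, which matches the stated intent of helping the network separate boundary data.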
Description: Date Completed: 10.03.2022; Date Revised: 11.03.2022; Published: Print-Electronic; Citation Status: MEDLINE
ISSN: 1941-0042
DOI: 10.1109/TIP.2022.3152049