Exploring Automatic Translation between Different Affect Representation Schemes

Affective Image Content Analysis


Abstract

Images can convey a wide range of emotions, and extracting affective information from images is crucial for affect prediction systems; machine learning algorithms make this extraction possible. Categorical Emotion States (CES) and Dimensional Emotion Space (DES) are the two typical models for representing emotions, and a mapping scheme between these representations would benefit research in both AI and psychology. This study therefore investigates the feasibility of translating emotions from DES to CES. To this end, relevant databases are identified and combined as training data, and machine learning models, namely Naive Bayes, K-nearest Neighbors, and Decision Tree, are employed to perform the classification task. The results indicate that the K-nearest Neighbors classifier performs best, with the highest mean accuracy (60%) and a low standard deviation (0.004) among the implemented classifiers. Overall, translating from DES to CES offers several advantages, including a simplified and interpretable representation of emotions and a common language for discussing and expressing them.
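The DES-to-CES translation described above amounts to classifying points in a continuous dimensional space (e.g. valence-arousal-dominance) into discrete emotion categories. The sketch below illustrates the idea with the best-performing approach from the abstract, K-nearest Neighbors; the training triples and emotion labels are invented toy values, not data from the databases used in the study, which would supply the real annotations.

```python
import math
from collections import Counter

# Hypothetical toy data: (valence, arousal, dominance) triples in [0, 1]
# paired with categorical emotion labels. A real experiment would draw
# these from affect-annotated image databases instead.
TRAIN = [
    ((0.90, 0.70, 0.80), "joy"),
    ((0.95, 0.55, 0.75), "joy"),
    ((0.20, 0.80, 0.30), "fear"),
    ((0.10, 0.70, 0.70), "anger"),
    ((0.20, 0.20, 0.20), "sadness"),
    ((0.15, 0.30, 0.25), "sadness"),
]

def knn_predict(point, train=TRAIN, k=3):
    """Map a DES point to a CES label by majority vote among
    its k nearest neighbors (Euclidean distance)."""
    ranked = sorted((math.dist(point, p), label) for p, label in train)
    votes = Counter(label for _, label in ranked[:k])
    return votes.most_common(1)[0][0]

# A high-valence, moderately aroused point lands in the "joy" region.
print(knn_predict((0.85, 0.60, 0.70)))
```

In practice one would also cross-validate the choice of `k`, which is presumably how the reported mean accuracy and standard deviation were obtained.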