Beauty Artificial Intelligence and Context-aware Emotion Recognition Technologies, Neuromorphic and Bio-inspired Computing Systems, and Big Data-driven Facial Retouching Tools for Visually Appealing Self-presentations and Negative Emotional States
Lucia Michalkova1, Ana-Maria-Sonia Petreanu2, Camelia Chitcă2, Petris Geambazi2, Kriselda Gura3, and Georgeta-Anca Petre4

ABSTRACT. This paper draws on a substantial body of theoretical and empirical research on artificial intelligence-generated hyper-realistic body-inclusive avatars, beauty artificial intelligence and context-aware emotion recognition technologies, and neuromorphic and bio-inspired computing systems. A quantitative literature review of ProQuest, Scopus, and the Web of Science was carried out throughout May 2023, with search terms including “visually appealing self-presentations” and “negative emotional states” + “beauty artificial intelligence and context-aware emotion recognition technologies,” “neuromorphic and bio-inspired computing systems,” and “big data-driven facial retouching tools.” Of the research published between 2019 and 2022, only 174 articles satisfied the eligibility criteria, and 25 mainly empirical sources were selected. Data visualization tools included Dimensions (bibliometric mapping) and VOSviewer (layout algorithms). Reporting quality was assessed with PRISMA; methodological quality was assessed with AMSTAR, DistillerSR, ROBIS, and SRDR.
Keywords: beauty; artificial intelligence; emotion recognition; big data; facial retouching; visually appealing self-presentation; negative emotional state
How to cite: Michalkova, L., Petreanu, A.-M.-S., Chitcă, C., Geambazi, P., Gura, K., and Petre, G.-A. (2023). “Beauty Artificial Intelligence and Context-aware Emotion Recognition Technologies, Neuromorphic and Bio-inspired Computing Systems, and Big Data-driven Facial Retouching Tools for Visually Appealing Self-presentations and Negative Emotional States,” Journal of Research in Gender Studies 13(2): 9–23. doi: 10.22381/JRGS13220231.
Received 22 June 2023 • Received in revised form 22 December 2023
Accepted 28 December 2023 • Available online 30 December 2023