3D Augmented Reality Try-on and Context-aware Emotion Recognition Technologies, Big Data-driven Facial Retouching and Artificial Intelligence-based Beauty Tools, and Deep Learning Sensor Fusion and Visual Perception Algorithms for Social Media-related Body Dissatisfaction, Self-Presentation Perceptions and Behaviors, and Unrealistic Physical Appearance
Elvira Nica1, Asser Khamis2, Silviea Crețu3, Nurul Lizzan Kamarudin4, Muzaffer Aydemir5, Emilia Pascu6, Cătălina-Oana Dumitrescu1

ABSTRACT. This article advances existing literature concerning 3D augmented reality try-on and context-aware emotion recognition technologies, big data-driven facial retouching and artificial intelligence-based beauty tools, and deep learning sensor fusion and visual perception algorithms for social media-related body dissatisfaction, self-presentation perceptions and behaviors, and unrealistic physical appearance. The screening software tools and evidence synthesis technologies deployed include Abstrackr, CADIMA, DistillerSR, Eppi-Reviewer, JBI SUMARI, Litstream, METAGEAR package for R, Nested Knowledge, PICO Portal, Rayyan, and SRDR+. The case studies cover AirBrush, B612, BeautyPlus, Evoto, Facetune, Filmora, Fotogenic, GlamAR, Meitu, Perfect365, Vivid Glam, and YouCam.
Keywords: 3D augmented reality try-on; big data-driven facial retouching; beauty; visual perception; self-presentation; unrealistic physical appearance
How to cite: Nica, E., Khamis, A., Crețu, S., Kamarudin, N. L., Aydemir, M., Pascu, E., and Dumitrescu, C.-O. (2025). “3D Augmented Reality Try-on and Context-aware Emotion Recognition Technologies, Big Data-driven Facial Retouching and Artificial Intelligence-based Beauty Tools, and Deep Learning Sensor Fusion and Visual Perception Algorithms for Social Media-related Body Dissatisfaction, Self-Presentation Perceptions and Behaviors, and Unrealistic Physical Appearance,” Journal of Research in Gender Studies 15(2): 53–64. doi: 10.22381/JRGS15220254.
Received 12 July 2025 • Received in revised form 21 December 2025
Accepted 26 December 2025 • Available online 30 December 2025
