Multi-sensory Perception and Deep Learning-based Virtual Try-on Technologies for Body Shape and Attractiveness, Unrealistic Societal Beauty Standards, and Thinness Ideal Internalization
Katarina Zvarikova1, Katarina Frajtova Michalikova1, Jacqueline-Nathalie Harba2, Andreea Diana Stupu3, and Ioana Alexandra Pârvu2

ABSTRACT. This article reviews and advances the existing literature on artificial intelligence-generated hyper-realistic body-inclusive avatars, photoshopped and filtered images, and beauty apps and filters. The research cumulates previous findings showing that 3D digital clothing technologies, wearable artificial intelligence devices, and beauty filter algorithms configure body shape and attractiveness, unrealistic societal beauty standards, and thinness ideal internalization. Throughout June 2023, a quantitative literature review of the Web of Science, Scopus, and ProQuest databases was performed, with search terms including “body shape and attractiveness,” “unrealistic societal beauty standards,” and “thinness ideal internalization” + “multi-sensory perception and deep learning-based virtual try-on technologies.” As only research published between 2019 and 2022 was inspected, 169 articles satisfied the eligibility criteria, and, of these, 26 mainly empirical sources were selected. Dimensions (bibliometric mapping) and VOSviewer (layout algorithms) served as data visualization tools, PRISMA was employed to assess reporting quality, and AXIS, MMAT, ROBIS, and SRDR were used to assess methodological quality.
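For reproducibility, the search strategy above can be expressed as a Boolean query. The sketch below is an illustrative reconstruction only: the exact field codes and operator syntax differ across Web of Science, Scopus, and ProQuest, and the OR/AND pairing of terms is an assumption inferred from the “+” notation in the abstract.

```python
# Illustrative reconstruction of the literature-search query described in the
# abstract. The grouping of outcome terms (OR) against the technology term
# (AND) is an assumption; real database syntax varies by platform.
OUTCOME_TERMS = [
    "body shape and attractiveness",
    "unrealistic societal beauty standards",
    "thinness ideal internalization",
]
TECHNOLOGY_TERM = (
    "multi-sensory perception and deep learning-based virtual try-on technologies"
)

def build_query(outcomes: list[str], technology: str) -> str:
    """Combine quoted outcome terms with OR, then AND the technology term,
    mirroring the '+' pairing used in the abstract."""
    outcome_clause = " OR ".join(f'"{term}"' for term in outcomes)
    return f'({outcome_clause}) AND "{technology}"'

if __name__ == "__main__":
    print(build_query(OUTCOME_TERMS, TECHNOLOGY_TERM))
```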
Keywords: multi-sensory perception; body shape and attractiveness; virtual try-on; unrealistic societal beauty standards; thinness ideal internalization
How to cite: Zvarikova, K., Frajtova Michalikova, K., Harba, J.-N., Stupu, A. D., and Pârvu, I. A. (2023). “Multi-sensory Perception and Deep Learning-based Virtual Try-on Technologies for Body Shape and Attractiveness, Unrealistic Societal Beauty Standards, and Thinness Ideal Internalization,” Journal of Research in Gender Studies 13(2): 39–53. doi: 10.22381/JRGS13220233.
Received 21 July 2023 • Received in revised form 25 December 2023
Accepted 28 December 2023 • Available online 30 December 2023