Human-like Tactile Sensing and Perception Technologies, Beauty Filter-based Appealing Self-Presentation and Realistic Makeup Simulation Tools, and Deep Learning-based Facial Expression Recognition and Artificial Intelligence-enabled Emotion Communication Systems for Mediated Virtual Self-Expression and Representation
Ioana Alexandra Pârvu1, Zoran Simonović2, Vloreen Nity Anak Mathew3, Alexandra Teodora Ababii4, Firdaus Abdullah3, Cristian Florin Ciurlău5

ABSTRACT. The purpose of this paper is to explore human-like tactile sensing and perception technologies, beauty filter-based appealing self-presentation and realistic makeup simulation tools, and deep learning-based facial expression recognition and artificial intelligence-enabled emotion communication systems for mediated virtual self-expression and representation. Study screening tools, machine learning classifiers, and reference management software employed include AMSTAR, BIBOT, Catchii, DistillerSR, Eppi-Reviewer, JBI SUMARI, Litstream, the METAGEAR package for R, Nested Knowledge, PICO Portal, ROBIS, and SluRp. The case studies cover AirBrush, BeautyPlus, Evoto, FaceApp, FaceTune, Fotor, Maybelline, Media.io, Meitu, Picsart, PhotoDirector, PrettyUp, VanceAI, Vivid Glam, Wondershare UniConverter, and YouCam.
Keywords: beauty filter-based appealing self-presentation; realistic makeup simulation; tactile sensing and perception; facial expression recognition; mediated virtual self-expression and representation; artificial intelligence-enabled emotion communication
How to cite: Pârvu, I. A., Simonović, Z., Mathew, V. N. A., Ababii, A. T., Abdullah, F., and Ciurlău, C. F. (2025). “Human-like Tactile Sensing and Perception Technologies, Beauty Filter-based Appealing Self-Presentation and Realistic Makeup Simulation Tools, and Deep Learning-based Facial Expression Recognition and Artificial Intelligence-enabled Emotion Communication Systems for Mediated Virtual Self-Expression and Representation,” Journal of Research in Gender Studies 15(2): 76–87. doi: 10.22381/JRGS15220256.
Received 14 June 2025 • Received in revised form 18 December 2025
Accepted 21 December 2025 • Available online 30 December 2025
