Context Awareness and Multi-Sensory Data Fusion Technologies, 3D Virtual Reality Simulation and Deep Generative Modeling Tools, and Image Processing Computational and Visual Perception Algorithms in Artificial Intelligence-based Digital Twin Industrial Metaverse
Danuta Szpilko1, Firdaus Abdullah2, Ioana Alexandra Pârvu3, and Cristian Florin Ciurlău4
ABSTRACT. This paper draws on a substantial body of theoretical and empirical research on cloud computing machines, industrial data semantics, artificial intelligence-driven computer vision and deep reinforcement learning algorithms, and autonomous robotic swarms. A quantitative literature review of ProQuest, Scopus, and the Web of Science was carried out in June 2023, with search terms including “artificial intelligence-based digital twin industrial metaverse” + “context awareness and multi-sensory data fusion technologies,” “3D virtual reality simulation and deep generative modeling tools,” and “image processing computational and visual perception algorithms.” Because only research published in 2022 and 2023 was inspected, just 166 articles satisfied the eligibility criteria, of which 19 mainly empirical sources were selected. Data visualization tools included Dimensions (bibliometric mapping) and VOSviewer (layout algorithms). Reporting quality was assessed with PRISMA. Methodological quality assessment tools included ASReview Lab, AXIS, DistillerSR, EPPI-Reviewer, Nested Knowledge, and Rayyan.
JEL codes: D53; E22; E32; E44; G01; G41
Keywords: context awareness; multi-sensory data fusion; 3D virtual reality simulation; artificial intelligence; digital twin; industrial metaverse
How to cite: Szpilko, D., Abdullah, F., Pârvu, I. A., and Ciurlău, C. F. (2023). “Context Awareness and Multi-Sensory Data Fusion Technologies, 3D Virtual Reality Simulation and Deep Generative Modeling Tools, and Image Processing Computational and Visual Perception Algorithms in Artificial Intelligence-based Digital Twin Industrial Metaverse,” Economics, Management, and Financial Markets 18(4): 24–38. doi: 10.22381/emfm18420232.
Received 15 July 2023 • Received in revised form 23 December 2023
Accepted 28 December 2023 • Available online 30 December 2023