Abstract
Extended Reality (XR) technologies have continuously evolved, integrating virtual and physical experiences while converging with artificial intelligence (AI) to provide immersive services across various domains. Despite these technological and theoretical advances, however, the user experience (UX) of XR remains significantly constrained and often falls short of user expectations.
These limitations primarily stem from discrepancies in the relational balance among sensory inputs. In the physical world, sensory modalities operate in a harmonious and balanced manner. However, interactions with virtual objects often disrupt this equilibrium, leading to inconsistencies in sensory input integration.
This study identifies such sensory discrepancies as a critical challenge in human-centered XR technology development and experience design. It investigates the unmet aspects of immersive experiences and emphasizes a strategic approach to interaction design that ensures a balanced multisensory experience.
The methodology combined a literature review with empirical research, including expert interviews. Using Python, the relationships between sensory elements were defined and simulated, leading to the development of a strategic research model for human-centered integrated design, referred to as the “Relational Process Model for Immersive Experience.” The results provide a foundational framework for developing standardized usability metrics and design methodologies. They also offer insight into overcoming the dual-system development process that hinders the advancement of XR systems, contributing to smoother and more immersive user interactions.
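As a rough illustration of what such a simulation might look like, the sketch below models the abstract's notion of "relational balance among sensory inputs" as a distribution of modality weights and scores how far an interaction deviates from an even distribution. The modality names, weights, and the balance metric are all hypothetical assumptions for illustration, not the study's actual model.

```python
# Hypothetical sketch: treat each sensory modality's contribution to an
# interaction as a weight, and quantify "relational balance" as closeness
# to an even distribution across modalities.

MODALITIES = ["visual", "auditory", "haptic", "proprioceptive"]

def balance_score(weights):
    """Return a score in [0, 1]; 1.0 means perfectly balanced input.

    Uses total variation distance from the uniform distribution,
    inverted so that higher means more balanced.
    """
    total = sum(weights)
    normalized = [w / total for w in weights]
    uniform = 1.0 / len(normalized)
    return 1.0 - 0.5 * sum(abs(w - uniform) for w in normalized)

# A physical-world interaction: modalities contribute evenly.
physical = balance_score([1, 1, 1, 1])      # -> 1.0
# A typical XR interaction: vision dominates, haptics are weak.
virtual = balance_score([8, 3, 0.5, 0.5])   # noticeably lower

print(physical, virtual)
```

A metric of this kind could serve as one building block for the standardized usability metrics the abstract calls for, by making sensory imbalance measurable rather than anecdotal.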