Optimizing Depth Perception in Virtual and Augmented Reality through Gaze-contingent Stereo Rendering
Event Type: Technical Papers, Technical Papers Q&A
Time: Sunday, 13 December 2020, 10:10 - 10:15 SGT
Location: Zoom Room 7
Description: Virtual and augmented reality (VR/AR) displays crucially rely on stereoscopic rendering to enable perceptually realistic user experiences. Yet, existing near-eye display systems ignore the gaze-dependent shift of the no-parallax point in the human eye. Here, we introduce a gaze-contingent stereo rendering technique that models this effect and conduct several user studies to validate its effectiveness. Our findings include experimental validation of the location of the no-parallax point, which we then use to demonstrate significant reductions in disparity and shape distortion in a VR setting, and consistent alignment of physical and digitally rendered objects across depths in optical see-through AR. Our work shows that gaze-contingent stereo rendering improves perceptual realism and depth perception of emerging wearable computing systems.
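The core idea described above is that the eye's no-parallax point moves as the eye rotates, so a per-eye render camera should follow the gaze rather than stay fixed. The following minimal sketch illustrates this geometrically; the offset value, function names, and the 64 mm interpupillary distance are illustrative assumptions, not the paper's calibrated parameters.

```python
import numpy as np

# Assumed distance (mm) from the eye's center of rotation to the
# no-parallax point along the gaze direction. Illustrative value only.
ROTATION_CENTER_TO_NPP = 8.0

def gaze_contingent_camera(eye_center, gaze_dir):
    """Place the per-eye virtual camera at the gaze-dependent
    no-parallax point instead of a fixed position.

    eye_center : (3,) center of rotation of the eye, in head space (mm)
    gaze_dir   : (3,) gaze direction in head space (normalized inside)
    """
    g = np.asarray(gaze_dir, dtype=float)
    g = g / np.linalg.norm(g)
    return np.asarray(eye_center, dtype=float) + ROTATION_CENTER_TO_NPP * g

# Left eye assuming a 64 mm interpupillary distance; -z is "forward".
left_eye = np.array([-32.0, 0.0, 0.0])

# Camera origin when looking straight ahead vs. 30 degrees to the right:
# a conventional fixed-camera stereo renderer would ignore this shift.
cam_straight = gaze_contingent_camera(left_eye, [0.0, 0.0, -1.0])
cam_rotated = gaze_contingent_camera(
    left_eye, [np.sin(np.pi / 6), 0.0, -np.cos(np.pi / 6)])
```

The few-millimetre displacement between `cam_straight` and `cam_rotated` is small, but the abstract argues it is perceptually relevant for disparity and for aligning rendered content with physical objects across depths.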