Spatialgaze: towards spatial gaze tracking for extended reality

Songzhou Yang, Yuan He, Yulong Chen

CCF TRANSACTIONS ON PERVASIVE COMPUTING AND INTERACTION (2023)

Abstract
With the rise of the Metaverse, Extended Reality (XR) and its enabling techniques have received increasing attention. Spatial gaze tracking is one such technique: it captures a user's visual attention so as to support immersive 3D experience and interaction. Due to limitations in the employed visual models and algorithms, existing gaze-tracking proposals can only provide planar gaze tracking or approximate spatial gaze tracking. The underlying problem is that there is not yet an accurate and efficient approach for XR devices to sense the spatial gaze, which is modeled based on the vergence of the binocular visual axes. To address this problem, this paper proposes SpatialGaze, a spatial gaze tracking approach based on a realistic parallax-contingent visual model. SpatialGaze contains a tailored design for XR devices that is accurate, lightweight, and practical for use. Our implementation and evaluation demonstrate that SpatialGaze achieves an average error of 0.52° in direction tracking and an average error of 75.52 cm in depth perception. Compared to the baseline approach, SpatialGaze reduces the direction and depth errors by up to 52.62% and 75.15%, respectively.
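The abstract models spatial gaze via the vergence of the two visual axes: the 3D gaze point lies where the left and right gaze rays converge. As a minimal illustrative sketch (not the paper's actual algorithm), the gaze point can be estimated as the midpoint of the shortest segment between the two rays; all names and values below are hypothetical.

```python
import numpy as np

def vergence_point(o_l, d_l, o_r, d_r, eps=1e-9):
    """Estimate the 3D gaze point from binocular vergence:
    the midpoint of the shortest segment between the left and
    right visual-axis rays (origins o_*, directions d_*)."""
    d_l = d_l / np.linalg.norm(d_l)
    d_r = d_r / np.linalg.norm(d_r)
    w = o_l - o_r
    a, b, c = d_l @ d_l, d_l @ d_r, d_r @ d_r
    d, e = d_l @ w, d_r @ w
    denom = a * c - b * b
    if abs(denom) < eps:
        return None  # near-parallel axes: gaze at (effectively) infinity
    t_l = (b * e - c * d) / denom   # parameter of closest point on left ray
    t_r = (a * e - b * d) / denom   # parameter of closest point on right ray
    p_l = o_l + t_l * d_l
    p_r = o_r + t_r * d_r
    return (p_l + p_r) / 2          # midpoint of the connecting segment

# Example: eye centers 6 cm apart, both fixating a target 50 cm ahead.
o_l = np.array([-0.03, 0.0, 0.0])
o_r = np.array([ 0.03, 0.0, 0.0])
target = np.array([0.0, 0.0, 0.5])
p = vergence_point(o_l, target - o_l, o_r, target - o_r)
```

With noise-free directions the two rays intersect exactly at the target; with noisy eye-tracking directions the rays are skew, which is why the midpoint construction (rather than an exact intersection) is used.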
Keywords
Extended reality, Gaze tracking, User attention, Spatial gaze