CHI ’24 Heads-Up Multitasker: Simulating Attention Switching On Optical Head-Mounted Displays

Authors: Yunpeng Bai, Aleksi Ikkala, Antti Oulasvirta, Shengdong Zhao, Lucia J. Wang, Pengzhi Yang, Peisen Xu.

In this paper, we introduce a model grounded in computational rationality that simulates how users shift attention while multitasking, focusing on reading on Optical Head-Mounted Displays (OHMDs) while walking. The model is built on a hierarchical reinforcement learning framework that captures the key cognitive mechanisms involved, and is implemented in the MuJoCo physics engine for realistic and flexible simulation. Beyond providing a dynamic simulation environment, the model produces predictions that closely match real-world user behaviors, including attention switching, walking-speed control, and reading behavior. Please read our paper for more details!
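To give a rough feel for the hierarchical structure described above (this is a minimal illustrative sketch, not the implementation from the paper: the option names, the toy state, and the reward trade-off between reading progress and walking risk are all assumptions made for this example), the snippet below shows a supervisory policy that repeatedly chooses an attention target, reading on the OHMD or attending to walking, while stub low-level controllers return a reward for the chosen sub-task:

```python
import numpy as np

# Hypothetical sub-task identifiers: the high-level policy decides where
# visual attention goes; low-level policies handle the chosen sub-task.
READ, WALK = 0, 1

class HighLevelPolicy:
    """Chooses the current attention target (reading vs. the environment)."""
    def __init__(self, epsilon=0.1):
        self.q = np.zeros(2)   # crude value estimate per sub-task
        self.epsilon = epsilon

    def select(self):
        # Epsilon-greedy choice between the two attention targets.
        if np.random.rand() < self.epsilon:
            return np.random.randint(2)
        return int(np.argmax(self.q))

    def update(self, option, reward, lr=0.1):
        self.q[option] += lr * (reward - self.q[option])

def low_level_step(option, state):
    """Placeholder low-level controllers returning a reward signal.
    In the paper's setting these would control gaze/reading and gait;
    here they are stubs trading off reading progress against walking risk."""
    reading_demand, walking_risk = state
    if option == READ:
        return reading_demand - walking_risk   # progress minus hazard exposure
    return 0.5 * walking_risk                  # attending to walking mitigates risk

def simulate(steps=1000, seed=0):
    rng = np.random.default_rng(seed)
    policy = HighLevelPolicy()
    total = 0.0
    for _ in range(steps):
        state = rng.random(2)                  # toy environment state
        option = policy.select()
        reward = low_level_step(option, state)
        policy.update(option, reward)
        total += reward
    return total, policy.q

if __name__ == "__main__":
    ret, q = simulate()
    print(f"return={ret:.2f}, option values={q}")
```

In the actual model, the low-level stubs would be replaced by learned policies acting on an embodied user simulated in MuJoCo; this sketch only conveys the two-level decision loop.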
