Lost in Style: Gaze-driven Adaptive Aid for VR Navigation

Rawan Alghofaili, Yasuhito Sawahata, Haikun Huang, Hsueh Cheng Wang, Takaaki Shiratori, Lap Fai Yu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

5 Scopus citations

Abstract

A key challenge for virtual reality level designers is striking a balance between maintaining the immersiveness of VR and providing users with on-screen aids after designing a virtual experience. These aids are often necessary for wayfinding in virtual environments with complex paths. We introduce a novel adaptive aid that maintains the effectiveness of traditional aids while giving designers and users control over how often help is displayed. Our adaptive aid uses gaze patterns to predict a user's need for navigation aid in VR and displays mini-maps or arrows accordingly. Using a dataset of gaze angle sequences of users navigating a VR environment, together with markers of when users requested aid, we trained an LSTM to classify a user's gaze sequences as needing navigation help and display an aid accordingly. We validated the efficacy of the adaptive aid for wayfinding compared to other commonly used wayfinding aids.
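A minimal sketch of the gaze-sequence classifier described above is given below. It assumes PyTorch and uses illustrative choices (per-step (yaw, pitch) gaze angles, hidden size, sequence length, and the GazeAidClassifier name are hypothetical); the abstract does not specify the authors' exact architecture, framework, or hyperparameters.

    # Hypothetical sketch: an LSTM that labels a gaze-angle sequence as
    # "needs navigation aid" vs. "does not"; all sizes are illustrative.
    import torch
    import torch.nn as nn

    class GazeAidClassifier(nn.Module):
        def __init__(self, input_dim=2, hidden_dim=64):
            super().__init__()
            # input_dim=2 assumes (yaw, pitch) gaze angles per time step.
            self.lstm = nn.LSTM(input_dim, hidden_dim, batch_first=True)
            self.head = nn.Linear(hidden_dim, 1)  # binary "needs aid" logit

        def forward(self, gaze_seq):
            # gaze_seq: (batch, seq_len, input_dim)
            _, (h_n, _) = self.lstm(gaze_seq)
            return self.head(h_n[-1])  # one logit per sequence

    # Usage: a batch of 8 sequences, 120 time steps of (yaw, pitch) angles.
    model = GazeAidClassifier()
    logits = model(torch.randn(8, 120, 2))
    needs_aid = torch.sigmoid(logits) > 0.5  # show a mini-map/arrow when True

At run time, such a classifier would be fed a sliding window of recent gaze angles and, when it fires, the system would display the mini-map or arrow aid described in the abstract.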

Original language: English
Title of host publication: CHI 2019 - Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems
Publisher: Association for Computing Machinery
ISBN (Electronic): 9781450359702
DOIs
State: Published - 2 May 2019
Event: 2019 CHI Conference on Human Factors in Computing Systems, CHI 2019 - Glasgow, United Kingdom
Duration: 4 May 2019 – 9 May 2019

Publication series

Name: Conference on Human Factors in Computing Systems - Proceedings

Conference

Conference: 2019 CHI Conference on Human Factors in Computing Systems, CHI 2019
Country: United Kingdom
City: Glasgow
Period: 4/05/19 – 9/05/19

Keywords

  • Eye tracking
  • Games/Play
  • Virtual/Augmented reality
