Visual Appearance Modulates Prediction Error in Virtual Reality

Avinash Kumar Singh, Hsiang Ting Chen*, Yu Feng Cheng, Jung Tai King, Li-Wei Ko, Klaus Gramann, Chin Teng Lin

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

8 Scopus citations


Different rendering styles induce different levels of agency and user behaviors in virtual reality environments. We applied an electroencephalogram-based approach to investigate how the rendering style of the users' hands affects behavioral and cognitive responses. To this end, we introduced prediction errors due to cognitive conflicts during a 3-D object selection task by manipulating the selection distance of the target object. The results showed that, for participants with high behavioral inhibition scores, the amplitude of the negative event-related potential at approximately 50-250 ms correlated with the realism of the virtual hands. Concurring with the uncanny valley theory, these findings suggest that the more realistic the representation of the user's hand is, the more sensitive the user becomes toward subtle errors, such as tracking inaccuracies.
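As an illustrative sketch only (not the authors' analysis pipeline), the quantity discussed above, the amplitude of an event-related potential in the roughly 50-250 ms post-stimulus window, is commonly summarized as the mean voltage per trial within that window. The sampling rate, epoch shape, and function name below are assumptions for demonstration:

```python
import numpy as np

def mean_window_amplitude(epochs, sfreq, tmin=0.05, tmax=0.25):
    """Mean amplitude per trial within [tmin, tmax] seconds post-stimulus.

    epochs: array of shape (n_trials, n_samples), t=0 at stimulus onset.
    sfreq:  sampling frequency in Hz.
    """
    start = int(round(tmin * sfreq))
    stop = int(round(tmax * sfreq))
    return epochs[:, start:stop].mean(axis=1)

# Synthetic example data: 10 trials, 1 s of EEG at 250 Hz (hypothetical values)
rng = np.random.default_rng(0)
epochs = rng.normal(0.0, 1.0, size=(10, 250))
amps = mean_window_amplitude(epochs, sfreq=250)
print(amps.shape)  # one mean-amplitude value per trial
```

In practice, such per-trial amplitudes would then be compared across rendering-style conditions or correlated with behavioral measures.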

Original language: English
Pages (from-to): 24617-24624
Number of pages: 8
Journal: IEEE Access
State: Published - 4 May 2018


Keywords:

  • EEG
  • Virtual reality
  • body ownership
  • cognitive conflict
  • prediction error
  • virtual hand illusion

