Augmenting Human Perception: Mediation of Extrasensory Signals in Head-Worn Augmented Reality


Conference paper


A. Erickson, D. Reiners, G. Bruder, G. Welch
2021 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), 2021


Cite

APA
Erickson, A., Reiners, D., Bruder, G., & Welch, G. (2021). Augmenting Human Perception: Mediation of Extrasensory Signals in Head-Worn Augmented Reality. 2021 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct).


Chicago/Turabian
Erickson, A., D. Reiners, G. Bruder, and G. Welch. “Augmenting Human Perception: Mediation of Extrasensory Signals in Head-Worn Augmented Reality.” 2021 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct) (2021).


MLA
Erickson, A., et al. “Augmenting Human Perception: Mediation of Extrasensory Signals in Head-Worn Augmented Reality.” 2021 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), 2021.


BibTeX

@inproceedings{erickson2021augmenting,
  title = {Augmenting Human Perception: Mediation of Extrasensory Signals in Head-Worn Augmented Reality},
  author = {Erickson, A. and Reiners, D. and Bruder, G. and Welch, G.},
  booktitle = {2021 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)},
  year = {2021}
}

Abstract

Mediated perception systems are systems in which sensory signals from the user's environment are mediated to the user's sensory channels. Such systems have great potential for enhancing a user's perception by augmenting and/or diminishing incoming sensory signals according to the user's context, preferences, and perceptual capability. They can also extend the user's perception, enabling them to sense signals typically imperceptible to human senses, such as regions of the electromagnetic spectrum beyond visible light.

In this paper, we present a prototype mediated perception system that maps extrasensory spatial data into visible light displayed within an augmented reality (AR) optical see-through head-mounted display (OST-HMD). Although the system is generalized such that it could support any spatial sensor data with minor modification, we chose to test it using thermal infrared sensors. The system improves upon previous extended perception AR prototypes in that it can project registered egocentric sensor data in real time onto a 3D mesh, generated by the OST-HMD, that represents the user's environment. We present the lessons learned through iterative improvements to the system, as well as a performance analysis and recommendations for future work.
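The central idea of the abstract, remapping an extrasensory signal (here, thermal infrared) into the visible range for display, can be illustrated with a minimal sketch. This is not the authors' implementation; the temperature range, frame size, and blue-to-red colormap below are illustrative assumptions only:

```python
import numpy as np

def thermal_to_rgb(temps, t_min=20.0, t_max=40.0):
    """Map a 2D array of temperatures (assumed °C) to RGB colors
    suitable for an AR overlay: cold pixels render blue, hot pixels red.

    The range [t_min, t_max] is a hypothetical calibration window;
    values outside it are clamped.
    """
    t = np.clip((np.asarray(temps, dtype=float) - t_min) / (t_max - t_min), 0.0, 1.0)
    rgb = np.empty(t.shape + (3,))
    rgb[..., 0] = t          # red channel grows with temperature
    rgb[..., 1] = 0.0        # no green in this simple two-color ramp
    rgb[..., 2] = 1.0 - t    # blue channel fades with temperature
    return rgb

# Example: a synthetic 8x8 thermal frame with a warm spot in the center
frame = np.full((8, 8), 22.0)
frame[3:5, 3:5] = 38.0
colors = thermal_to_rgb(frame)
```

In the full system described by the paper, each such colorized sensor reading would additionally be registered to the user's viewpoint and textured onto the OST-HMD's 3D environment mesh; this sketch covers only the signal-to-visible-light mapping step.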