Main Publications

Can Retinal Projection Displays Improve Spatial Perception in Augmented Reality?

Peillard, E., Itoh, Y., Normand, J., Sanz, F. A., Moreau, G., & Lécuyer, A. (2020)
2020 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)

Commonly used Head Mounted Displays (HMDs) in Augmented Reality (AR), namely Optical See-Through (OST) displays, suffer from a major drawback: their lenses can only provide a fixed focal distance. This limitation is suspected to be one of the main causes of distance misperception in AR. In this paper, we studied the use of an emerging kind of AR display to tackle such perception issues: Retinal Projection Displays (RPDs). With RPDs, virtual images have no focal distance and the AR content is always in focus. We conducted the first reported experiment evaluating egocentric distance perception of observers using Retinal Projection Displays. We compared the precision and accuracy of depth estimation between real and virtual targets, displayed by either OST HMDs or RPDs. Interestingly, our results show that RPDs provide depth estimates in AR closer to real ones than OST HMDs. Indeed, the use of an OST device was found to lead to an overestimation of the perceived distance by 16%, whereas the distance overestimation bias dropped to 4% with RPDs. Moreover, the task was reported to be equally difficult in both conditions, and no difference in precision was found. As such, our results shed first light on the perceptual benefits of retinal projection displays in Augmented Reality, suggesting that RPDs are a promising technology for AR applications that require accurate distance perception.

Studying Exocentric Distance Perception in Optical See-Through Augmented Reality

E. Peillard, F. Argelaguet, J. Normand, A. Lécuyer and G. Moreau
2019 IEEE International Symposium on Mixed and Augmented Reality (ISMAR).

While perceptual biases have been widely investigated in Virtual Reality (VR), very few studies have considered the challenging environment of Optical See-Through Augmented Reality (OST-AR). Moreover, regarding distance perception, existing works mainly focus on the assessment of egocentric distance perception, i.e. the distance between the observer and a real or virtual object. In this paper, we study exocentric distance perception in AR, defined here as the distance between two objects, neither of which is directly linked to the user. We report a user study (n=29) aiming at estimating distances between two objects lying in a frontoparallel plane at 2.1 m from the observer (i.e. in the medium-field perceptual space). Four conditions were tested in our study: real objects on both the left and the right of the participant (called real-real), virtual objects on both sides (virtual-virtual), a real object on the left and a virtual one on the right (real-virtual), and finally a virtual object on the left and a real object on the right (virtual-real). Participants had to reproduce the distance between the objects by spreading two identical real objects presented in front of them. The main finding of this study is an overestimation (by about 20%) of exocentric distances in all tested conditions. Surprisingly, distances in the real-real condition were overestimated significantly more (by about 4%, p=.0166) than in the virtual-virtual condition, i.e. participants obtained better estimates of the exocentric distance in the virtual-virtual condition. Finally, for the virtual-real/real-virtual conditions, the analysis showed a non-symmetrical behavior, which suggests that the relationship between real and virtual objects with respect to the user might be affected by other external factors.
Considered together, these unexpected results illustrate the need for additional experiments to better understand the perceptual phenomena involved in exocentric distance perception with real and virtual objects.

Influence of virtual objects' shadows and lighting coherence on distance perception in optical see-through augmented reality

Gao, Y., Peillard, E., Normand, J.-M., Moreau, G., Liu, Y., & Wang, Y.
J Soc Inf Display. 2020

This paper focuses on how virtual objects' shadows, as well as differences in alignment between virtual and real lighting, influence distance perception in optical see-through (OST) augmented reality (AR). Four hypotheses are proposed: (H1) participants underestimate distances in OST AR; (H2) virtual objects' shadows improve distance judgment accuracy in OST AR; (H3) shadows with different realism levels have different influences on distance perception in OST AR; (H4) different levels of lighting misalignment between real and virtual lights have different influences on distance perception in OST AR scenes. Two experiments were designed with an OST head-mounted display (HMD), the Microsoft HoloLens. Participants had to match the position of a virtual object displayed in the OST-HMD with a real target. Distance judgment accuracy was recorded under the different shadow and lighting conditions. The results validate hypotheses H2 and H4 but surprisingly showed no impact of the shape of virtual shadows on distance judgment accuracy, thus rejecting hypothesis H3. Regarding hypothesis H1, we detected a trend toward underestimation; given the high variance of the data, more experiments are needed to confirm this result. Moreover, the study also reveals that perceived distance errors and trial completion times increase with target distance.

Virtual Objects Look Farther on the Sides: The Anisotropy of Distance Perception in Virtual Reality

E. Peillard, T. Thebaud, J. Normand, F. Argelaguet, G. Moreau and A. Lécuyer
2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 2019

The topic of distance perception has been widely investigated in Virtual Reality (VR). However, the vast majority of previous work focused on distance perception of objects placed in front of the observer. But what happens when the observer looks to the side? In this paper, we study differences in distance estimation between objects placed in front of the observer and objects placed to their side. Through a series of four experiments (n=85), we assessed participants' distance estimation and ruled out potential biases. In particular, we considered the placement of visual stimuli in the field of view, users' exploration behavior, as well as the presence of depth cues. For all experiments, a standardized two-alternative forced choice (2AFC) psychophysical protocol was employed, in which the main task was to determine which stimulus seemed to be the farthest. In summary, our results showed that the orientation of virtual stimuli with respect to the user introduces a distance perception bias: objects placed to the sides are systematically perceived as farther away than objects in front. In addition, we observed that this bias increases with the angle, and appears to be independent of both the position of the object in the field of view and the quality of the virtual scene. This work sheds new light on one of the specificities of VR environments within the wider subject of visual space theory. Our study paves the way for future experiments evaluating the anisotropy of distance perception in real and virtual environments.
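To give an intuition for the 2AFC protocol mentioned above, here is a minimal simulation sketch (this is an illustration only, not the authors' experimental code): a standard stimulus at a fixed distance is repeatedly compared against comparison stimuli at several distances, a noisy simulated observer picks the one that seems farther, and the proportion of "comparison farther" responses is tallied per comparison distance. All function names and parameter values are hypothetical.

```python
import random


def simulated_observer(standard_d, comparison_d, noise=0.1):
    """Toy observer model: each distance is perceived with Gaussian noise.
    Returns True if the comparison stimulus seems farther (a 2AFC response)."""
    perceived_standard = random.gauss(standard_d, noise)
    perceived_comparison = random.gauss(comparison_d, noise)
    return perceived_comparison > perceived_standard


def run_2afc(standard_d, comparison_ds, trials_per_level=200):
    """Method-of-constant-stimuli 2AFC block: for each comparison distance,
    return the proportion of trials where it was judged farther than the
    standard. The distance where this proportion crosses 50% estimates the
    point of subjective equality."""
    proportions = {}
    for d in comparison_ds:
        farther = sum(
            simulated_observer(standard_d, d) for _ in range(trials_per_level)
        )
        proportions[d] = farther / trials_per_level
    return proportions


if __name__ == "__main__":
    random.seed(0)
    props = run_2afc(2.0, [1.8, 1.9, 2.0, 2.1, 2.2])
    for d, p in sorted(props.items()):
        print(f"{d:.1f} m -> judged farther in {p:.0%} of trials")
```

In a real study, a perceptual bias (such as the side-versus-front anisotropy reported above) would shift this psychometric curve, moving the 50% crossing point away from the physical standard distance.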

Other Publications

See complete list on Google Scholar