24-month postdoctoral position, associated with a project aiming to propose visualization techniques for augmented reality adapted to spatially complex learning tasks.
11-month postdoctoral position (starting February 2022), related to a project at the intersection of Augmented Reality and Brain-Computer Interfaces.
I am an Associate Professor at IMT Atlantique and a member of the Inuit team of the Lab-STICC laboratory.
In November 2020, I defended my PhD thesis, entitled “Toward a Characterization of Perceptual Biases in Mixed Reality: A Study of Factors Inducing Distance Misperception”, conducted in cooperation between Centrale Nantes, the Inria Hybrid team, and the AAU Crenau, and supervised by Guillaume Moreau, Ferran Argelaguet, and Jean-Marie Normand.
Before that, I obtained an engineering degree from Centrale Nantes in 2017.
My current research interests include human perception in Virtual and Augmented Reality, in particular spatial perception in virtual and augmented environments and, more generally, the effects of perceptual biases in mixed environments.
PhD in Virtual and Augmented Reality, 2020
Inria Rennes – Hybrid / UMR AAU Crenau – Centrale Nantes
'Diplôme d’Ingénieur' (equivalent to MSc), 2017
Centrale Nantes
The main use of Augmented Reality (AR) today for the general public is in smartphone applications. In particular, social network applications offer many AR filters that modify users' environments but also their own image. These filters are increasingly used and can distort users' facial traits in many ways. Yet, we still do not clearly know how users perceive their own faces when augmented by such filters. In this paper, we present a study that aims to evaluate the impact of different filters, modifying several facial features such as the size or position of the eyes, the shape of the face or the orientation of the eyebrows, or adding virtual content such as virtual glasses. These filters are evaluated via a self-evaluation questionnaire, asking the participants about the personality, emotion, appeal and intelligence traits that their distorted face conveys. Our results show relative effects between the different filters in line with previous results regarding the perception of others. However, they also reveal specific effects on self-perception, showing, inter alia, that facial deformation decreases participants' credence in their own image. The findings of this study, covering multiple factors, allow us to highlight the impact of face deformation on user perception but also the specificities of this use in AR, paving the way for new work focusing on the psychological impact of such filters.
Commonly used Head-Mounted Displays (HMDs) in Augmented Reality (AR), namely Optical See-Through (OST) displays, suffer from a major drawback: their focal lenses can only provide a fixed focal distance. Such a limitation is suspected to be one of the main factors behind distance misperception in AR. In this paper, we studied the use of an emerging kind of AR display to tackle such perception issues: Retinal Projection Displays (RPDs). With RPDs, virtual images have no focal distance and the AR content is always in focus. We conducted the first reported experiment evaluating the egocentric distance perception of observers using Retinal Projection Displays. We compared the precision and accuracy of depth estimation between real and virtual targets, displayed by either OST HMDs or RPDs. Interestingly, our results show that RPDs provide depth estimates in AR closer to real ones than OST HMDs do. Indeed, the use of an OST device was found to lead to an overestimation of the perceived distance by 16%, whereas the overestimation bias dropped to 4% with RPDs. In addition, the task was reported to be equally difficult with both displays, and no difference in precision was found. As such, our results shed a first light on the benefits of retinal projection displays for user perception in Augmented Reality, suggesting that RPDs are a promising technology for AR applications in which accurate distance perception is required.
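To make the reported percentages concrete, here is a minimal sketch of how such a signed overestimation bias can be computed from perceived versus true target distances. This is an illustration only, not the study's actual analysis code, and all data and names below are hypothetical.

```python
import numpy as np

def overestimation_bias(perceived, actual):
    """Mean signed bias: +0.16 means distances judged 16% too far.

    perceived, actual: per-trial depth estimates and true target
    distances, in the same units (e.g. metres).
    """
    perceived = np.asarray(perceived, dtype=float)
    actual = np.asarray(actual, dtype=float)
    return np.mean((perceived - actual) / actual)

# Hypothetical trials: true targets at 2 m, judged slightly too far.
ost = overestimation_bias([2.33, 2.27, 2.36], [2.0, 2.0, 2.0])  # ~ +16%
rpd = overestimation_bias([2.09, 2.05, 2.10], [2.0, 2.0, 2.0])  # ~ +4%
print(f"OST bias: {ost:+.0%}, RPD bias: {rpd:+.0%}")
```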
The topic of distance perception has been widely investigated in Virtual Reality (VR). However, the vast majority of previous work mainly focused on distance perception of objects placed in front of the observer. What happens, then, when the observer looks to the side? In this paper, we study differences in distance estimation when comparing objects placed in front of the observer with objects placed at their side. Through a series of four experiments (n=85), we assessed participants' distance estimation and ruled out potential biases. In particular, we considered the placement of visual stimuli in the field of view, users' exploration behavior, as well as the presence of depth cues. For all experiments, a two-alternative forced choice (2AFC) standardized psychophysical protocol was employed, in which the main task was to determine the stimulus that seemed to be the farthest one. In summary, our results showed that the orientation of virtual stimuli with respect to the user introduces a distance perception bias: objects placed at the sides are systematically perceived as farther away than objects in front. In addition, we observed that this bias increases with the angle, and appears to be independent of both the position of the object in the field of view and the quality of the virtual scene. This work sheds new light on one of the specificities of VR environments regarding the wider subject of visual space theory. Our study paves the way for future experiments evaluating the anisotropy of distance perception in real and virtual environments.
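For readers unfamiliar with 2AFC protocols, the sketch below illustrates the standard analysis under the usual assumptions: fit a cumulative Gaussian to the proportion of trials in which the side object was judged farther, as a function of the physical distance difference between the two stimuli, and read the point of subjective equality (PSE) off the fit. All data and names are hypothetical; this is not the paper's analysis code.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def psychometric(x, pse, jnd):
    """Cumulative Gaussian: P('side object judged farther') as a
    function of x = d_side - d_front (in metres)."""
    return norm.cdf(x, loc=pse, scale=jnd)

# Hypothetical pooled responses: tested distance differences and the
# proportion of trials where the side object was judged farther.
x = np.array([-0.30, -0.20, -0.10, 0.0, 0.10, 0.20, 0.30])
p = np.array([0.08, 0.20, 0.45, 0.70, 0.88, 0.95, 0.99])

(pse, jnd), _ = curve_fit(psychometric, x, p, p0=(0.0, 0.1))
# A negative PSE means the side object already looks farther when it
# is physically at the same distance as the front one, i.e. the bias.
print(f"PSE = {pse:+.3f} m, JND = {jnd:.3f} m")
```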
While perceptual biases have been widely investigated in Virtual Reality (VR), very few studies have considered the challenging environment of Optical See-Through Augmented Reality (OST-AR). Moreover, regarding distance perception, existing work mainly focuses on the assessment of egocentric distance perception, i.e. the distance between the observer and a real or virtual object. In this paper, we study exocentric distance perception in AR, here considered as the distance between two objects, neither of which is directly linked to the user. We report a user study (n=29) aiming at estimating distances between two objects lying in a frontoparallel plane at 2.1 m from the observer (i.e. in the medium-field perceptual space). Four conditions were tested: real objects on both the left and the right of the participant (real-real), virtual objects on both sides (virtual-virtual), a real object on the left and a virtual one on the right (real-virtual), and finally a virtual object on the left and a real object on the right (virtual-real). Participants had to reproduce the distance between the objects by spreading two real, identical objects presented in front of them. The main finding of this study is an overestimation (by about 20%) of exocentric distances in all tested conditions. Surprisingly, the real-real condition was significantly more overestimated (by about 4%, p=.0166) than the virtual-virtual condition, i.e. participants obtained better estimates of the exocentric distance in the virtual-virtual condition. Finally, for the virtual-real/real-virtual conditions, the analysis showed a non-symmetrical behavior, which suggests that the relationship between real and virtual objects with respect to the user might be affected by other external factors. Taken together, these unexpected results illustrate the need for additional experiments to better understand the perceptual phenomena involved in exocentric distance perception with real and virtual objects.
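As a rough illustration of this kind of reproduction-task analysis, the sketch below computes a per-condition reproduction ratio and runs a paired comparison between two conditions. A simple paired t-test stands in here for whatever statistical procedure the study actually used, and all data and names are hypothetical.

```python
import numpy as np
from scipy.stats import ttest_rel

def reproduction_ratio(reproduced, true_distance):
    """Reproduced/true distance per participant; 1.20 = 20% overestimation."""
    return np.asarray(reproduced, dtype=float) / true_distance

# Hypothetical per-participant mean reproductions (metres) of a 1.0 m gap.
real_real       = reproduction_ratio([1.26, 1.22, 1.31, 1.19], 1.0)
virtual_virtual = reproduction_ratio([1.21, 1.18, 1.25, 1.16], 1.0)

# Paired comparison: each participant did both conditions.
t, p = ttest_rel(real_real, virtual_virtual)
print(f"real-real: {real_real.mean():.2f}, "
      f"virtual-virtual: {virtual_virtual.mean():.2f}, p = {p:.4f}")
```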