In November 2020, I defended my PhD thesis, entitled “Toward a Characterization of Perceptual Biases in Mixed Reality: A Study of Factors Inducing Distance Misperception”, conducted in cooperation between Centrale Nantes, the Inria Hybrid team, and the AAU Crenau, and supervised by Guillaume Moreau, Ferran Argelaguet and Jean-Marie Normand.
Before that, I obtained an engineering degree from Centrale Nantes in 2017.
My current research interests include human perception in Virtual and Augmented Reality, spatial perception in virtual and augmented environments, and, more generally, the effects of perceptual biases in mixed environments.
PhD in Virtual and Augmented Reality, 2020
Inria Rennes - Hybrid / UMR AAU Crenau – Centrale Nantes
'Diplôme d’Ingénieur' (equivalent to MSc), 2017
This project explores Augmented Reality (AR) user identification with AR virtual avatars from a human-centered perspective. Identification is a psychological phenomenon in which user and avatar merge to form a new identity.
This project develops a theme that is currently underexplored in virtual and augmented reality: proposing mixed reality visualization techniques adapted to spatially complex learning tasks. The first objective is methodological, exploring the different types of possible representations based on perceptual evaluations and metrics. This preliminary work also aims to propose evaluations specific to certain learning tasks, thereby facilitating the design of mixed reality applications adapted to this use case.
Today, the main use of Augmented Reality (AR) by the general public is in smartphone applications. In particular, social network applications offer many AR filters that modify users’ environments but also their own image. These AR filters are used increasingly often and can distort users’ facial traits in many ways. Yet we still do not know clearly how users perceive their faces augmented by these filters. In this paper, we present a study that evaluates the impact of different filters that modify several facial features, such as the size or position of the eyes, the shape of the face, or the orientation of the eyebrows, or that add virtual content such as virtual glasses. These filters are evaluated via a self-evaluation questionnaire asking participants about the personality, emotion, appeal, and intelligence traits their distorted face conveys. Our results show relative effects between the different filters consistent with previous results on the perception of others. However, they also reveal specific effects on self-perception, showing, inter alia, that facial deformation decreases participants’ credence towards their image. The findings of this study, covering multiple factors, highlight the impact of face deformation on user perception as well as the specificities of this use in AR, paving the way for new work focusing on the psychological impact of such filters.
The topic of distance perception has been widely investigated in Virtual Reality (VR). However, the vast majority of previous work mainly focused on distance perception of objects placed in front of the observer. What happens, then, when the observer looks to the side? In this paper, we study differences in distance estimation when comparing objects placed in front of the observer with objects placed on their side. Through a series of four experiments (n=85), we assessed participants’ distance estimation and ruled out potential biases. In particular, we considered the placement of visual stimuli in the field of view, users’ exploration behavior, and the presence of depth cues. For all experiments, a two-alternative forced choice (2AFC) standardized psychophysical protocol was employed, in which the main task was to determine which stimulus seemed to be the farthest. In summary, our results showed that the orientation of virtual stimuli with respect to the user introduces a distance perception bias: objects placed on the sides are systematically perceived as farther away than objects in front. In addition, we observed that this bias increases with the angle and appears to be independent of both the position of the object in the field of view and the quality of the virtual scene. This work sheds new light on one of the specificities of VR environments with regard to the wider subject of visual space theory. Our study paves the way for future experiments evaluating the anisotropy of distance perception in real and virtual environments.