Eating, Smelling, and Seeing: Investigating Multisensory Integration and (In)congruent Stimuli while Eating in VR

Integrating taste in AR/VR applications has various promising use cases - from social eating to the treatment of disorders. Despite many successful AR/VR applications that alter the taste of beverages and food, the relationship between olfaction, gustation, and vision during the process of multisensory integration (MSI) has not been fully explored yet.

Full Description

Bibliographic Details
Published in: IEEE transactions on visualization and computer graphics. - 1996. - PP(2023), 22 Feb.
Main Author: Weidner, Florian (Author)
Other Authors: Maier, Jana E, Broll, Wolfgang
Format: Online Article
Language: English
Published: 2023
Access to collection: IEEE transactions on visualization and computer graphics
Subjects: Journal Article
LEADER 01000caa a22002652c 4500
001 NLM355323842
003 DE-627
005 20250304152354.0
007 cr uuu---uuuuu
008 231226s2023 xx |||||o 00| ||eng c
024 7 |a 10.1109/TVCG.2023.3247099  |2 doi 
028 5 2 |a pubmed25n1184.xml 
035 |a (DE-627)NLM355323842 
035 |a (NLM)37027726 
040 |a DE-627  |b ger  |c DE-627  |e rakwb 
041 |a eng 
100 1 |a Weidner, Florian  |e verfasserin  |4 aut 
245 1 0 |a Eating, Smelling, and Seeing  |b Investigating Multisensory Integration and (In)congruent Stimuli while Eating in VR 
264 1 |c 2023 
336 |a Text  |b txt  |2 rdacontent 
337 |a Computermedien  |b c  |2 rdamedia 
338 |a Online-Ressource  |b cr  |2 rdacarrier 
500 |a Date Revised 07.04.2023 
500 |a published: Print-Electronic 
500 |a Citation Status Publisher 
520 |a Integrating taste in AR/VR applications has various promising use cases - from social eating to the treatment of disorders. Despite many successful AR/VR applications that alter the taste of beverages and food, the relationship between olfaction, gustation, and vision during the process of multisensory integration (MSI) has not been fully explored yet. Thus, we present the results of a study in which participants were confronted with congruent and incongruent visual and olfactory stimuli while eating a tasteless food product in VR. We were interested in (1) whether participants integrate bi-modal congruent stimuli and (2) whether vision guides MSI during congruent/incongruent conditions. Our results contain three main findings: First, and surprisingly, participants were not always able to detect congruent visual-olfactory stimuli when eating a portion of tasteless food. Second, when confronted with tri-modal incongruent cues, a majority of participants did not rely on any of the presented cues when forced to identify what they were eating; this includes vision, which has previously been shown to dominate MSI. Third, although research has shown that basic taste qualities like sweetness, saltiness, or sourness can be influenced by congruent cues, doing so with more complex flavors (e.g., zucchini or carrot) proved to be harder to achieve. We discuss our results in the context of multimodal integration, and within the domain of multisensory AR/VR. Our results are a necessary building block for future human-food interaction in XR that relies on smell, taste, and vision, and are foundational for applications such as affective AR/VR. 
650 4 |a Journal Article 
700 1 |a Maier, Jana E  |e verfasserin  |4 aut 
700 1 |a Broll, Wolfgang  |e verfasserin  |4 aut 
773 0 8 |i Enthalten in  |t IEEE transactions on visualization and computer graphics  |d 1996  |g PP(2023) vom: 22. Feb.  |w (DE-627)NLM098269445  |x 1941-0506  |7 nnas 
773 1 8 |g volume:PP  |g year:2023  |g day:22  |g month:02 
856 4 0 |u http://dx.doi.org/10.1109/TVCG.2023.3247099  |3 Volltext 
912 |a GBV_USEFLAG_A 
912 |a SYSFLAG_A 
912 |a GBV_NLM 
912 |a GBV_ILN_350 
951 |a AR 
952 |d PP  |j 2023  |b 22  |c 02