Video2Haptics : Converting Video Motion to Dynamic Haptic Feedback with Bio-Inspired Event Processing

In cinematic VR applications, haptic feedback can significantly enhance the sense of reality and immersion for users. The increasing availability of emerging haptic devices opens up possibilities for future cinematic VR applications that allow users to receive haptic feedback while they are watching videos...
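The abstract describes a pipeline from event-camera signals to vibrotactile output: events are a lightweight representation of video motion, an intensity is estimated in the event domain, and that intensity drives haptic gloves. The following is a purely illustrative sketch of that idea, not the paper's actual estimators; the `Event` fields, the 1000-event budget, and the 8-bit amplitude mapping are all assumptions made for the example.

```python
# Illustrative sketch only: maps the density of brightness-change events in a
# time window to a vibrotactile amplitude, mimicking the idea of deriving
# haptic intensity from event-camera motion. Not the paper's actual method.

from dataclasses import dataclass
from typing import List

@dataclass
class Event:
    x: int         # pixel column
    y: int         # pixel row
    t: float       # timestamp in seconds
    polarity: int  # +1 brightness increase, -1 decrease

def motion_intensity(events: List[Event], t_start: float, t_end: float,
                     max_events: int = 1000) -> float:
    """Normalized event count in [t_start, t_end), clipped to [0, 1].

    max_events is an assumed saturation budget, not a value from the paper.
    """
    n = sum(1 for e in events if t_start <= e.t < t_end)
    return min(n / max_events, 1.0)

def to_vibration_amplitude(intensity: float) -> int:
    """Map a normalized intensity to an assumed 8-bit glove amplitude."""
    return round(255 * intensity)

# Example: 250 events in a one-second window -> quarter-strength vibration.
window_events = [Event(0, 0, 0.001 * i, 1) for i in range(250)]
amp = to_vibration_amplitude(motion_intensity(window_events, 0.0, 1.0))
```

Because the estimate operates on sparse event counts rather than dense pixel frames, per-window cost scales with the number of events, which is the intuition behind the speedups the abstract reports.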

Detailed Description

Bibliographic Details
Published in: IEEE transactions on visualization and computer graphics. - 1996. - 30(2024), 12, from: 06 Oct., pages 7717-7735
Main Author: Chen, Xiaoming (Author)
Other Authors: Hu, Zeke Zexi, Zhao, Guangxin, Li, Haisheng, Chung, Vera, Quigley, Aaron
Format: Online Article
Language: English
Published: 2024
Access to parent work: IEEE transactions on visualization and computer graphics
Subjects: Journal Article
LEADER 01000caa a22002652 4500
001 NLM367852292
003 DE-627
005 20241029231946.0
007 cr uuu---uuuuu
008 240201s2024 xx |||||o 00| ||eng c
024 7 |a 10.1109/TVCG.2024.3360468  |2 doi 
028 5 2 |a pubmed24n1584.xml 
035 |a (DE-627)NLM367852292 
035 |a (NLM)38294913 
040 |a DE-627  |b ger  |c DE-627  |e rakwb 
041 |a eng 
100 1 |a Chen, Xiaoming  |e verfasserin  |4 aut 
245 1 0 |a Video2Haptics  |b Converting Video Motion to Dynamic Haptic Feedback with Bio-Inspired Event Processing 
264 1 |c 2024 
336 |a Text  |b txt  |2 rdacontent 
337 |a Computermedien  |b c  |2 rdamedia 
338 |a Online-Ressource  |b cr  |2 rdacarrier 
500 |a Date Revised 28.10.2024 
500 |a published: Print-Electronic 
500 |a Citation Status PubMed-not-MEDLINE 
520 |a In cinematic VR applications, haptic feedback can significantly enhance the sense of reality and immersion for users. The increasing availability of emerging haptic devices opens up possibilities for future cinematic VR applications that allow users to receive haptic feedback while they are watching videos. However, automatically rendering haptic cues from real-time video content, particularly from video motion, is a technically challenging task. In this article, we propose a novel framework called "Video2Haptics" that leverages the emerging bio-inspired event camera to capture event signals as a lightweight representation of video motion. We then propose efficient event-based visual processing methods to estimate force or intensity from video motion in the event domain, rather than the pixel domain. To demonstrate the application of Video2Haptics, we convert the estimated force or intensity to dynamic vibrotactile feedback on emerging haptic gloves, synchronized with the corresponding video motion. As a result, Video2Haptics allows users not only to view the video but also to perceive the video motion concurrently. Our experimental results show that the proposed event-based processing methods for force and intensity estimation are one to two orders of magnitude faster than conventional methods. Our user study results confirm that the proposed Video2Haptics framework can considerably enhance the users' video experience. 
650 4 |a Journal Article 
700 1 |a Hu, Zeke Zexi  |e verfasserin  |4 aut 
700 1 |a Zhao, Guangxin  |e verfasserin  |4 aut 
700 1 |a Li, Haisheng  |e verfasserin  |4 aut 
700 1 |a Chung, Vera  |e verfasserin  |4 aut 
700 1 |a Quigley, Aaron  |e verfasserin  |4 aut 
773 0 8 |i Enthalten in  |t IEEE transactions on visualization and computer graphics  |d 1996  |g 30(2024), 12 vom: 06. Okt., Seite 7717-7735  |w (DE-627)NLM098269445  |x 1941-0506  |7 nnns 
773 1 8 |g volume:30  |g year:2024  |g number:12  |g day:06  |g month:10  |g pages:7717-7735 
856 4 0 |u http://dx.doi.org/10.1109/TVCG.2024.3360468  |3 Volltext 
912 |a GBV_USEFLAG_A 
912 |a SYSFLAG_A 
912 |a GBV_NLM 
912 |a GBV_ILN_350 
951 |a AR 
952 |d 30  |j 2024  |e 12  |b 06  |c 10  |h 7717-7735