ViCom at SemDial 2025

Credit: Andy Lücking

Andy Lücking from the GeMDiS project introduced FraGA, a VR-based corpus of direction-giving dialogues designed for analyzing multimodal communication, specifically the interplay of speech, head movements, and hand gestures, during turn transitions. The research compares avatar-mediated VR dialogues with real-world (RW) interactions, focusing on gaze patterns, hand movements, and the timing of turn transitions.
