Avatars Grow Legs: Generating Smooth Human Motion from Sparse Tracking Inputs with Diffusion Model
Yuming Du, Robin Kips, Albert Pumarola, Sebastian Starke, Ali Thabet, Artsiom Sanakoyeu
Conference on Computer Vision and Pattern Recognition (CVPR)
With the adoption of consumer AR/VR devices, a user's body is observed only through very sparse tracking signals: the headset and the two hand controllers provide reliable positions and orientations, while the lower body is not tracked at all. This paper presents AGRoL, a conditional diffusion model that generates smooth, plausible full-body avatar motion from these sparse upper-body tracking inputs. The model is built on a lightweight MLP-based architecture with the diffusion timestep embedding injected at every block, supports real-time inference, and remains robust when parts of the tracking signal are lost. Evaluated on the AMASS motion-capture dataset, it achieves state-of-the-art accuracy while producing smoother, less jittery motion than prior approaches.
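The sketch below is not the authors' implementation; it is a minimal, self-contained illustration of the general idea described above: an MLP denoiser conditioned on sparse tracking features and a diffusion timestep, trained with a standard DDPM-style noise-prediction loss. All names, tensor shapes, layer sizes, and hyperparameters (SparseConditionedDenoiser, body_dim, the beta schedule, etc.) are placeholder assumptions, not values from the paper or its code.

```python
# Illustrative sketch only: an MLP denoiser conditioned on sparse tracking
# inputs and a diffusion timestep, trained with the standard epsilon-prediction
# objective. Shapes and hyperparameters are hypothetical placeholders.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SparseConditionedDenoiser(nn.Module):
    def __init__(self, body_dim=132, sparse_dim=54, hidden=512, num_blocks=4):
        super().__init__()
        self.in_proj = nn.Linear(body_dim + sparse_dim, hidden)
        self.time_mlp = nn.Sequential(nn.Linear(1, hidden), nn.SiLU(), nn.Linear(hidden, hidden))
        self.blocks = nn.ModuleList([
            nn.Sequential(nn.LayerNorm(hidden), nn.Linear(hidden, hidden), nn.SiLU())
            for _ in range(num_blocks)
        ])
        self.out_proj = nn.Linear(hidden, body_dim)

    def forward(self, noisy_body, sparse_track, t):
        # noisy_body: (B, T, body_dim), sparse_track: (B, T, sparse_dim), t: (B,)
        h = self.in_proj(torch.cat([noisy_body, sparse_track], dim=-1))
        t_emb = self.time_mlp(t.float().unsqueeze(-1)).unsqueeze(1)  # (B, 1, hidden)
        for block in self.blocks:
            # re-inject the timestep embedding at every block (residual update)
            h = h + block(h + t_emb)
        return self.out_proj(h)  # predicted noise, same shape as noisy_body


def training_step(model, body, sparse_track, num_steps=1000):
    # Toy linear beta schedule and one DDPM-style training step.
    betas = torch.linspace(1e-4, 2e-2, num_steps)
    alphas_cumprod = torch.cumprod(1.0 - betas, dim=0)
    t = torch.randint(0, num_steps, (body.shape[0],))
    a = alphas_cumprod[t].view(-1, 1, 1)
    noise = torch.randn_like(body)
    noisy_body = a.sqrt() * body + (1.0 - a).sqrt() * noise
    pred_noise = model(noisy_body, sparse_track, t)
    return F.mse_loss(pred_noise, noise)


if __name__ == "__main__":
    model = SparseConditionedDenoiser()
    body = torch.randn(2, 196, 132)    # full-body pose sequences (placeholder encoding)
    sparse = torch.randn(2, 196, 54)   # head + hand tracking features (placeholder encoding)
    print(float(training_step(model, body, sparse)))
```

At inference time, the same denoiser would be applied iteratively to refine a noise sample into a full-body motion sequence consistent with the observed head and hand trajectories; the sampling loop is omitted here for brevity.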