Avatars Grow Legs: Generating Smooth Human Motion from Sparse Tracking Inputs with Diffusion Model
Yuming Du, Robin Kips, Albert Pumarola, Sebastian Starke, Ali Thabet, Artsiom Sanakoyeu
CVPR 2023

IEEE World Haptics Conference
Couples often communicate their emotions, e.g., love or sadness, through physical expressions of touch. Prior efforts have used visual observation to distinguish emotional touch communications by certain gestures tied to the toucher's hand contact, velocity, and position. The work herein describes an automated approach to eliciting the essential features of these gestures. First, a tracking system records the timing and location of contact interactions in 3-D between a toucher's hand and a receiver's forearm. Second, data post-processing algorithms extract dependent measures, derived from prior visual observation, tied to the intensity and velocity of the toucher's hand, as well as the areas, durations, and parts of the hand in contact. Third, behavioral data were obtained from five couples who sought to convey a variety of emotional word cues. We found that certain combinations of six dependent measures distinguish the touch communications. For example, a typical sadness expression invokes more contact, evolves more slowly, and impresses less deeply into the forearm than a typical attention expression. Furthermore, cluster analysis indicates that 2-5 distinct expression strategies are common per communicated word. Specifying the essential features of touch communications can guide haptic devices in reproducing naturalistic interactions.
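The pipeline the abstract describes (contact tracking, dependent-measure extraction, cluster analysis) lends itself to a short sketch. Below is a minimal, hypothetical Python example using NumPy and scikit-learn; the frame format, function names, and the four measures shown are illustrative assumptions, not the paper's actual six dependent measures, and silhouette-based selection of the cluster count is one common criterion, not necessarily the one the authors used.

```python
# Hypothetical sketch of the measure-extraction and clustering steps.
# The frame format and the four measures below are illustrative
# assumptions; the paper defines six dependent measures.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score
from sklearn.preprocessing import StandardScaler

def dependent_measures(frames):
    """Reduce one recorded touch expression to a feature vector.

    `frames`: list of (t, points, depths) tuples -- a timestamp in
    seconds, an (N, 3) array of 3-D contact positions on the forearm,
    and an (N,) array of indentation depths (assumed data format).
    """
    times = np.array([t for t, _, _ in frames])
    centroids = np.array([pts.mean(axis=0) for _, pts, _ in frames])
    if len(frames) > 1:
        # Speed of the hand's contact centroid between frames.
        speeds = np.linalg.norm(np.diff(centroids, axis=0), axis=1) / np.diff(times)
    else:
        speeds = np.zeros(1)
    return np.array([
        times[-1] - times[0],                         # contact duration (s)
        np.mean([len(pts) for _, pts, _ in frames]),  # mean contact extent
        speeds.mean(),                                # mean hand velocity
        np.mean([d.mean() for _, _, d in frames]),    # mean indentation depth
    ])

def expression_strategies(feature_vectors, k_range=range(2, 6)):
    """Cluster all expressions of one emotion word into strategies.

    Tries k = 2..5 (matching the 2-5 strategies reported) and keeps
    the k with the best silhouette score; needs > 5 expressions.
    """
    X = StandardScaler().fit_transform(np.asarray(feature_vectors))
    best_labels, best_score = None, -1.0
    for k in k_range:
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
        score = silhouette_score(X, labels)
        if score > best_score:
            best_labels, best_score = labels, score
    return best_labels
```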