A Method for Animating Children’s Drawings of the Human Figure
Harrison Jesse Smith, Qingyuan Zheng, Yifei Li, Somya Jain, Jessica K. Hodgins
IEEE World Haptics Conference
Couples often communicate their emotions, e.g., love or sadness, through physical expressions of touch. Prior efforts have used visual observation to distinguish emotional touch communications by certain gestures, characterized by the toucher's hand contact, velocity, and position. The work herein describes an automated approach to eliciting the essential features of these gestures. First, a tracking system records the timing and location of contact interactions in 3-D between a toucher's hand and a receiver's forearm. Second, data post-processing algorithms extract dependent measures, derived from prior visual observation, that capture the intensity and velocity of the toucher's hand, as well as the areas, durations, and parts of the hand in contact. Third, behavioral data were obtained from five couples who sought to convey a variety of emotional word cues. We found that certain combinations of six dependent measures distinguish the touch communications. For example, a typical sadness expression invokes more contact, evolves more slowly, and impresses less deeply into the forearm than a typical attention expression. Furthermore, cluster analysis indicates that 2-5 distinct expression strategies are common for each word communicated. Specifying the essential features of touch communications can guide haptic devices in reproducing naturalistic interactions.
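As a rough illustration of the kind of cluster analysis described above, the minimal Python sketch below groups recorded expressions by their dependent measures and selects the number of strategies (2-5) by silhouette score. The feature matrix, measure names, and the use of k-means are assumptions for illustration, not the authors' published pipeline.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score
from sklearn.preprocessing import StandardScaler

# Hypothetical feature matrix: one row per recorded expression, one column
# per dependent measure (e.g., contact area, duration, intensity, hand
# velocity, indentation depth, parts of the hand in contact).
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 6))  # placeholder data; real rows would come from the tracking system

# Standardize so no single measure dominates the distance metric.
X = StandardScaler().fit_transform(X)

# Try 2-5 clusters and keep the count with the best silhouette score.
best_k, best_score = None, -1.0
for k in range(2, 6):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    score = silhouette_score(X, labels)
    if score > best_score:
        best_k, best_score = k, score

print(f"{best_k} expression strategies (silhouette = {best_score:.2f})")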