A Method for Animating Children’s Drawings of the Human Figure
Harrison Jesse Smith, Qingyuan Zheng, Yifei Li, Somya Jain, Jessica K. Hodgins
ACM Transactions on Graphics, 2023

IEEE World Haptics Conference (IEEE WHC)
Multi-sensory stimuli can greatly enhance immersion in interactive virtual environments. Advances in graphics algorithms and technologies such as VR displays have pushed the appearance of interactive virtual worlds to unprecedented fidelity, but rendering sound and, especially, touch feedback of comparable quality remains a challenge. We describe a method for real-time synthesis of vibrotactile haptic and audio stimuli for interactions with textured surfaces in 3-D virtual environments. Standard descriptions of object geometry and material properties, including the displacement and roughness texture maps typically used for physically-based visual rendering, are employed to generate realistic sound and touch feedback consistent with appearance. Our method reconstructs meso- and microscopic surface features on the fly along a contact trajectory and runs a micro-contact dynamics simulation whose outputs drive vibrotactile haptic actuators and modal sound synthesis. An exploratory, absolute-identification user study was conducted as an initial evaluation of our synthesis methods.
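The abstract mentions modal sound synthesis driven by the output of a micro-contact dynamics simulation. The Python sketch below is an illustration of that general idea only, not the authors' implementation: it renders audio from a contact-excitation signal using a small bank of exponentially decaying sinusoid modes, with mode frequencies, dampings, and gains chosen arbitrarily for the example.

```python
import numpy as np

def modal_synthesis(excitation, sample_rate=44100,
                    freqs_hz=(320.0, 775.0, 1340.0),
                    dampings=(60.0, 90.0, 140.0),
                    gains=(1.0, 0.6, 0.35)):
    """Convolve a contact-excitation signal with a bank of damped sinusoid modes.

    All mode parameters are assumed values for illustration, not measured data.
    """
    n = len(excitation)
    t = np.arange(n) / sample_rate
    out = np.zeros(n)
    for f, d, g in zip(freqs_hz, dampings, gains):
        # Impulse response of a single resonant mode: decaying sinusoid.
        mode_ir = g * np.exp(-d * t) * np.sin(2.0 * np.pi * f * t)
        out += np.convolve(excitation, mode_ir)[:n]
    return out

# Usage example: a short impulse train standing in for micro-contact events
# along a sliding trajectory.
excitation = np.zeros(4410)
excitation[[0, 1500, 3200]] = 1.0
audio = modal_synthesis(excitation)
```

In a real system the excitation would come from the contact simulation rather than a hand-placed impulse train, and the same signal could be low-pass filtered and routed to vibrotactile actuators.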