Avatars Grow Legs: Generating Smooth Human Motion from Sparse Tracking Inputs with Diffusion Model
Yuming Du, Robin Kips, Albert Pumarola, Sebastian Starke, Ali Thabet, Artsiom Sanakoyeu
Optimal Transport and Machine Learning (OTML) Workshop at NeurIPS
The gradients of convex functions are expressive models of non-trivial vector fields. For example, the optimal transport map between any two measures on Euclidean spaces under the squared distance is realized as a convex gradient via Brenier’s theorem, which is a key insight used in recent machine learning flow models. In this paper, we study how to model convex gradients by integrating a Jacobian-vector product parameterized by a neural network, which we call Input Convex Gradient Networks (ICGNs). We theoretically study ICGNs and compare them to modeling the gradient by taking the derivative of an input-convex neural network, demonstrating that ICGNs can efficiently parameterize convex gradients.
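The abstract contrasts two ways of representing a convex gradient field: differentiating an input-convex neural network (ICNN), and integrating a network-parameterized matrix-vector product along a path. The sketch below illustrates both in PyTorch under simplifying assumptions: the names (SimpleICNN, PSDField, icgn_gradient), the two-layer ICNN, and the midpoint-rule line integral of a pointwise PSD matrix field are illustrative choices, not the paper's exact construction.

```python
# Minimal sketch of two ways to model a convex gradient, assuming simplified
# constructions; see the paper for the actual ICGN parameterization.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SimpleICNN(nn.Module):
    """Small input-convex network f(x): convexity in x follows from non-negative
    weights on the hidden-to-hidden path and convex, non-decreasing activations."""

    def __init__(self, dim: int, hidden: int = 64):
        super().__init__()
        self.Wx0 = nn.Linear(dim, hidden)
        self.Wz1 = nn.Linear(hidden, hidden, bias=False)  # clamped >= 0 in forward
        self.Wx1 = nn.Linear(dim, hidden)
        self.Wz2 = nn.Linear(hidden, 1, bias=False)       # clamped >= 0 in forward
        self.Wx2 = nn.Linear(dim, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = F.softplus(self.Wx0(x))
        z = F.softplus(F.linear(z, self.Wz1.weight.clamp_min(0)) + self.Wx1(x))
        return F.linear(z, self.Wz2.weight.clamp_min(0)) + self.Wx2(x)


def icnn_gradient(f: SimpleICNN, x: torch.Tensor) -> torch.Tensor:
    """Baseline from the abstract: the convex gradient obtained by
    differentiating the scalar ICNN output with respect to its input."""
    x = x.requires_grad_(True)
    (grad,) = torch.autograd.grad(f(x).sum(), x, create_graph=True)
    return grad


class PSDField(nn.Module):
    """A(y)^T A(y) is symmetric positive semi-definite at every point y,
    mimicking the Hessian structure a convex gradient must have."""

    def __init__(self, dim: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, hidden), nn.Tanh(),
                                 nn.Linear(hidden, dim * dim))
        self.dim = dim

    def forward(self, y: torch.Tensor) -> torch.Tensor:
        A = self.net(y).view(-1, self.dim, self.dim)
        return A.transpose(1, 2) @ A


def icgn_gradient(H: PSDField, x: torch.Tensor, steps: int = 16) -> torch.Tensor:
    """ICGN-flavored map: integrate the matrix-vector product H(t*x) x along the
    segment from 0 to x, approximated here with midpoint quadrature."""
    ts = (torch.arange(steps, dtype=x.dtype) + 0.5) / steps
    g = torch.zeros_like(x)
    for t in ts:
        g = g + torch.einsum("bij,bj->bi", H(t * x), x)
    return g / steps


if __name__ == "__main__":
    x = torch.randn(8, 2)
    print(icnn_gradient(SimpleICNN(2), x).shape)  # torch.Size([8, 2])
    print(icgn_gradient(PSDField(2), x).shape)    # torch.Size([8, 2])
```

The comparison made in the abstract is visible here: the ICNN route needs an autograd pass through a scalar potential, while the ICGN-style route models the vector field directly and enforces the required structure through the integrated PSD term.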