Regression-aware decompositions
Mark Tygert
SIAM Journal on Matrix Analysis and Applications
Linear least-squares regression with a "design" matrix A approximates a given matrix B via minimization of the spectral- or Frobenius-norm discrepancy ||AX − B|| over every conformingly sized matrix X. Also popular is low-rank approximation to B through the "interpolative decomposition," which traditionally has no supervision from any auxiliary matrix A. The traditional interpolative decomposition selects certain columns of B and constructs numerically stable (multi)linear interpolation from those columns to all columns of B, thus approximating all of B via the chosen columns. Accounting for regression with an auxiliary matrix A leads to a "regression-aware interpolative decomposition," which selects certain columns of B and constructs numerically stable (multi)linear interpolation from the least-squares solutions for the selected columns to the least-squares solutions for all columns of B, that is, to the columns of the X minimizing ||AX − B||. The regression-aware decompositions reveal the structure inherent in B that is relevant to regression against A; they effectively let supervision inform classical dimensionality reduction, which has classically been restricted to strictly unsupervised learning.
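To make the construction concrete, the following is a minimal sketch in Python (NumPy/SciPy), not code from the paper: it forms the least-squares solutions X* = A+B for all columns of B, selects k columns via a column-pivoted QR on X* (a standard selection heuristic for interpolative decompositions; the paper's exact scheme may differ), and builds an interpolation matrix T so that X* is approximately X*[:, cols] T. The function name regression_aware_id and the random test data are purely illustrative.

    import numpy as np
    from scipy.linalg import lstsq, qr

    def regression_aware_id(A, B, k):
        """Sketch of a regression-aware interpolative decomposition.

        Picks k columns of B such that the least-squares solutions for
        those columns interpolate (multi)linearly to the least-squares
        solutions for all columns of B.
        """
        # X* minimizes ||A X - B|| in the Frobenius norm; its j-th column
        # is the least-squares solution for the j-th column of B.
        X_star, _, _, _ = lstsq(A, B)

        # Column-pivoted QR on X* picks k "representative" columns; a
        # common ID selection heuristic (the paper's scheme may differ).
        _, R, piv = qr(X_star, mode='economic', pivoting=True)
        cols = piv[:k]

        # Interpolation matrix T with X* ~= X*[:, cols] @ T; the pivoting
        # keeps R[:k, :k] well conditioned, so solving against it is
        # numerically stable.
        n = B.shape[1]
        T = np.zeros((k, n))
        T[:, piv[:k]] = np.eye(k)
        T[:, piv[k:]] = np.linalg.solve(R[:k, :k], R[:k, k:])
        return cols, T

    # Tiny usage example on random data (shapes are arbitrary).
    rng = np.random.default_rng(0)
    A = rng.standard_normal((100, 20))
    B = rng.standard_normal((100, 50)) @ np.diag(np.linspace(1.0, 0.01, 50))
    cols, T = regression_aware_id(A, B, k=10)
    X_star, _, _, _ = lstsq(A, B)
    rel_err = np.linalg.norm(X_star - X_star[:, cols] @ T) / np.linalg.norm(X_star)
    print(f"selected columns of B: {sorted(cols.tolist())}")
    print(f"relative error interpolating the least-squares solutions: {rel_err:.2e}")

Note that the pivoted QR is applied to X* rather than to B itself: that is what makes the selection "regression-aware," since the columns are chosen for how well their least-squares solutions span the remaining least-squares solutions, not for how well they span B.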