### A Method for Animating Children’s Drawings of the Human Figure

Harrison Jesse Smith, Qingyuan Zheng, Yifei Li, Somya Jain, Jessica K. Hodgins

SIAM Journal on Matrix Analysis and Applications

Linear least-squares regression with a “design” matrix *A* approximates a given matrix *B* by minimizing the spectral- or Frobenius-norm discrepancy ||*AX − B*|| over all conformably sized matrices *X*. Also popular is low-rank approximation of *B* via the “interpolative decomposition,” which traditionally receives no supervision from any auxiliary matrix *A*. The traditional interpolative decomposition selects certain columns of *B* and constructs a numerically stable (multi)linear interpolation from those columns to all columns of *B*, thus approximating all of *B* via the chosen columns. Accounting for regression against an auxiliary matrix *A* leads to a “regression-aware interpolative decomposition,” which selects certain columns of *B* and constructs a numerically stable (multi)linear interpolation from the corresponding least-squares solutions to the least-squares solutions *X* minimizing ||*AX − B*|| for all columns of *B*. The regression-aware decompositions reveal the structure inherent in *B* that is relevant to regression against *A*; they effectively allow supervision to inform classical dimensionality reduction, which has classically been restricted to strictly unsupervised learning.
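The construction described above can be sketched in a few lines of NumPy/SciPy. This is a minimal illustration, not the paper's implementation: the synthetic matrices `A` and `B`, the target rank, and the use of a column-pivoted QR factorization to select columns are all assumptions made here for concreteness. The key point the sketch captures is that the columns of *B* are chosen by examining the least-squares solutions *X* (the regression-aware view), rather than *B* itself (the traditional, unsupervised view).

```python
import numpy as np
from scipy.linalg import lstsq, qr

rng = np.random.default_rng(0)

# Hypothetical small example: A is the design matrix, B holds the targets.
m, n, k, rank = 60, 20, 30, 5
A = rng.standard_normal((m, n))
# Construct B so that its least-squares solutions are nearly rank-deficient.
B = A @ rng.standard_normal((n, rank)) @ rng.standard_normal((rank, k))
B += 1e-6 * rng.standard_normal((m, k))

# Least-squares solutions X minimizing ||A X - B|| for all columns of B.
X, *_ = lstsq(A, B)

# Regression-aware interpolative decomposition (sketch): select columns of B
# via a rank-revealing, column-pivoted QR of X -- not of B itself.
_, _, piv = qr(X, mode="economic", pivoting=True)
J = piv[:rank]  # indices of the selected columns of B

# Interpolation matrix T maps the selected least-squares solutions X[:, J]
# to (approximations of) the least-squares solutions for all columns of B.
T, *_ = lstsq(X[:, J], X)

err = np.linalg.norm(X - X[:, J] @ T) / np.linalg.norm(X)
print(f"relative error in X: {err:.2e}")
```

Because the selection is driven by *X*, the chosen columns of *B* are those most informative for regression against *A*; swapping `X` for `B` in the pivoted QR recovers the traditional, unsupervised interpolative decomposition.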
