A Method for Animating Children's Drawings of the Human Figure
Harrison Jesse Smith, Qingyuan Zheng, Yifei Li, Somya Jain, Jessica K. Hodgins
Conference on Neural Information Processing Systems (NeurIPS)
Matrix square roots and their inverses arise frequently in machine learning, e.g., when sampling from high-dimensional Gaussians 𝒩(μ, K) or "whitening" a vector b against covariance matrix K. While existing methods typically require O(N³) computation, we introduce a highly efficient quadratic-time algorithm for computing K^{1/2}b, K^{-1/2}b, and their derivatives through matrix-vector multiplications (MVMs). Our method combines Krylov subspace methods with a rational approximation and typically achieves four decimal places of accuracy with fewer than 100 MVMs. Moreover, the backward pass requires little additional computation. We demonstrate our method's applicability on matrices as large as 50,000 × 50,000 (well beyond traditional methods) with little approximation error. Applying this increased scalability to variational Gaussian processes, Bayesian optimization, and Gibbs sampling results in more powerful models with higher accuracy. In particular, we perform variational GP inference with up to 10,000 inducing points and perform Gibbs sampling on a 25,000-dimensional problem.
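The Krylov-subspace idea behind computing K^{1/2}b from MVMs alone can be illustrated with a minimal Lanczos sketch. This is an assumption-laden illustration, not the paper's method: it applies the square root to the small tridiagonal matrix spectrally rather than using the paper's rational approximation, and the names `lanczos_sqrt_mvm` and `matvec` are our own.

```python
import numpy as np

def lanczos_sqrt_mvm(matvec, b, num_iters=100):
    """Approximate K^{1/2} b using only matrix-vector products with K.

    Lanczos builds an orthonormal Krylov basis Q and a small symmetric
    tridiagonal T = Q^T K Q, so that K^{1/2} b ~ ||b|| * Q @ sqrtm(T) @ e1.
    Illustrative sketch only; not the paper's rational-approximation method.
    """
    n = b.shape[0]
    m = min(num_iters, n)
    Q = np.zeros((n, m))
    alpha = np.zeros(m)          # diagonal of T
    beta = np.zeros(m)           # off-diagonal of T
    Q[:, 0] = b / np.linalg.norm(b)
    k = m                        # effective Krylov dimension
    for j in range(m):
        v = matvec(Q[:, j])
        alpha[j] = Q[:, j] @ v
        # full reorthogonalization against all previous Lanczos vectors
        # (also removes the alpha_j q_j and beta_{j-1} q_{j-1} components)
        v = v - Q[:, : j + 1] @ (Q[:, : j + 1].T @ v)
        nv = np.linalg.norm(v)
        if j == m - 1 or nv < 1e-10:   # done, or Krylov space exhausted
            k = j + 1
            break
        beta[j] = nv
        Q[:, j + 1] = v / nv
    T = (np.diag(alpha[:k])
         + np.diag(beta[:k - 1], 1)
         + np.diag(beta[:k - 1], -1))
    # sqrtm(T) @ e1 via the eigendecomposition of the small matrix T
    w, U = np.linalg.eigh(T)
    sqrtT_e1 = U @ (np.sqrt(np.maximum(w, 0.0)) * U[0])
    return np.linalg.norm(b) * (Q[:, :k] @ sqrtT_e1)
```

Because K is touched only through `matvec`, K never needs to be formed densely; structured or kernel matrices with fast MVMs keep the cost per iteration low, which is what enables the quadratic-time scaling described in the abstract.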