Can recurrent neural networks warp time?
Corentin Tallec, Yann Ollivier
International Conference on Learning Representations (ICLR), 2018
Successful recurrent models such as long short-term memories (LSTMs) and gated recurrent units (GRUs) use ad hoc gating mechanisms. Empirically, these models have been found to improve the learning of medium- to long-term temporal dependencies and to help with vanishing-gradient issues.
We prove that learnable gates in a recurrent model formally provide quasi-invariance to general time transformations in the input data. We recover part of the LSTM architecture from a simple axiomatic approach.
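As a rough sketch of why gating yields this quasi-invariance (a reconstruction of the style of argument from the abstract's statement, not the paper's exact proof), view the hidden state in continuous time and apply a monotonic time change t -> c(t):

```latex
% Continuous-time recurrent dynamics:
%   \frac{dh(t)}{dt} = \Phi\big(h(t), x(t)\big).
% Under the time change t \mapsto c(t), the chain rule introduces c'(t):
%   \frac{dh}{dt} = c'(t)\,\Phi\big(h(t), x(t)\big).
% Euler discretization with unit step, taking \Phi = \tilde h_t - h_t with
% \tilde h_t = \tanh(W x_t + U h_t + b), and a learnable gate g_t standing
% in for the unknown c'(t), gives
\[
  h_{t+1} = h_t + g_t\,(\tilde h_t - h_t) = g_t\,\tilde h_t + (1 - g_t)\,h_t .
\]
% This is the leaky, gated update of GRUs (and of the LSTM cell state, with
% forget gate f_t = 1 - g_t): if the gate learns g_t \approx c'(t), the time
% transformation is absorbed, hence quasi-invariance.
```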
This result leads to a new way of initializing gate biases in LSTMs and GRUs. Experimentally, this new chrono initialization is shown to greatly improve the learning of long-term dependencies, with minimal implementation effort.
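For concreteness, here is a minimal PyTorch sketch of chrono initialization: each forget-gate bias is drawn as log(u) with u ~ Uniform(1, T_max - 1), and the corresponding input-gate bias is set to its negative, where T_max is the longest dependency range expected in the data. The helper name chrono_init is illustrative; the only framework-specific assumption is nn.LSTM's packed (input, forget, cell, output) bias layout.

```python
import torch
import torch.nn as nn

def chrono_init(lstm: nn.LSTM, t_max: float) -> None:
    """Chrono initialization: b_f = log(u) with u ~ Uniform(1, t_max - 1),
    b_i = -b_f, so per-unit forgetting times spread over roughly [1, t_max]."""
    for name, bias in lstm.named_parameters():
        if "bias" not in name:
            continue
        hidden = bias.size(0) // 4  # PyTorch packs gate biases as [i, f, g, o]
        with torch.no_grad():
            bias.zero_()
            # Write the chrono values into bias_ih only and leave bias_hh at
            # zero, because PyTorch adds the two bias vectors together.
            if name.startswith("bias_ih"):
                u = torch.empty(hidden).uniform_(1.0, t_max - 1.0)
                bias[hidden:2 * hidden] = torch.log(u)    # forget-gate bias b_f
                bias[:hidden] = -bias[hidden:2 * hidden]  # input-gate bias b_i = -b_f

# Usage: expect dependencies of up to ~1000 steps.
lstm = nn.LSTM(input_size=32, hidden_size=128)
chrono_init(lstm, t_max=1000.0)
```

With this choice, forget-gate activations start near sigma(log u) = u/(1+u), which is close to 1 for large u, so most units retain information over long horizons by default.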