Bayesian Deep Learning Workshop at NeurIPS 2018
At Facebook, we are becoming increasingly interested in incorporating uncertainty into the models we use for decision making. As a result, we are building an in-house universal probabilistic programming language that aims to make modeling more accessible to developers and to unify the tooling experience for existing users of Bayesian modeling. In addition, we provide interfaces for common probabilistic models on top of this language and apply them to datasets within the company.
For the less familiar reader, probabilistic programming languages (PPLs) provide a convenient syntax that lets users describe generative processes composed of various sources of uncertainty. The user may then pose probabilistic queries about their world, which are resolved by an inference engine. Some of the more mature languages are domain-specific languages such as WinBUGS [1], JAGS [2], and Stan [3], which place restrictions on the models a user may write so that inference can run more efficiently. On the other hand, newer universal PPLs such as Church [4], WebPPL [5], and Anglican [6] extend existing general-purpose languages and resolve queries through a generic inference engine. Users of these languages are constrained only by the limitations of the underlying language; however, the resulting models may not always be the most efficient to run inference on, and this tradeoff between model expressivity and inference efficiency is an ongoing area of research.
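HackPPL's own syntax is not shown in this abstract, but the flavor of a universal PPL can be illustrated with a minimal sketch in Pyro [14]. The coin-flip model, priors, and sample-site names below are illustrative, not taken from the paper: a generative model is written as ordinary code with sample statements, and a posterior query is answered by a generic inference engine.

```python
import torch
import pyro
import pyro.distributions as dist
from pyro.infer import MCMC, NUTS

def coin_model(flips):
    # Latent quantity: the unknown bias of a coin, with a uniform prior.
    p = pyro.sample("p", dist.Beta(1.0, 1.0))
    # Observed quantities: each flip is conditioned on the latent bias.
    with pyro.plate("data", len(flips)):
        pyro.sample("obs", dist.Bernoulli(p), obs=flips)

# Query: what is the posterior over p given four observed flips?
# A generic inference engine (here, NUTS) resolves it without any
# model-specific derivations by the user.
flips = torch.tensor([1.0, 1.0, 0.0, 1.0])
mcmc = MCMC(NUTS(coin_model), num_samples=500, warmup_steps=200)
mcmc.run(flips)
print(mcmc.get_samples()["p"].mean())
```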
Traditionally, Bayesian neural networks (BNNs) are neural networks with priors on their weights and biases [7, 8]. Their main advantages include providing uncertainty estimates for predictions rather than point estimates, built-in regularization through priors, and better performance in problem settings such as the robot arm problem [7, 9]; however, they are generally expensive in terms of compute time. While probabilistic interpretations of neural networks have been studied in the past, BNNs have seen a resurgence in popularity in recent years, particularly with alternative probabilistic approaches to dropout [10] and backpropagation [11] and the renewed investigation of variational approximations [12], which have made computation more tractable. There have also been more recent advances in combining probabilistic programming and deep learning, notably by Edward [13] and Pyro [14]. These languages are built on top of existing tensor libraries and have so far focused on variational approaches for scalable inference.
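Concretely, a BNN replaces a network's point-valued weights with random variables. Below is a minimal sketch in Pyro of a one-hidden-layer binary classifier with standard normal priors on all weights and biases; the architecture and priors are an illustrative assumption, not the paper's model.

```python
import torch
import pyro
import pyro.distributions as dist

def bnn_model(x, y=None, hidden=8):
    # Priors over all weights and biases: standard normals
    # (an illustrative choice; priors act as built-in regularization).
    w1 = pyro.sample("w1", dist.Normal(torch.zeros(x.shape[1], hidden), 1.0).to_event(2))
    b1 = pyro.sample("b1", dist.Normal(torch.zeros(hidden), 1.0).to_event(1))
    w2 = pyro.sample("w2", dist.Normal(torch.zeros(hidden, 1), 1.0).to_event(2))
    b2 = pyro.sample("b2", dist.Normal(torch.zeros(1), 1.0).to_event(1))
    # Forward pass through the one-hidden-layer network.
    h = torch.tanh(x @ w1 + b1)
    logits = (h @ w2 + b2).squeeze(-1)
    # Likelihood: each prediction is a distribution, not a point estimate.
    with pyro.plate("data", x.shape[0]):
        pyro.sample("obs", dist.Bernoulli(logits=logits), obs=y)
```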
In this study, we present HackPPL, a probabilistic programming language embedded in Facebook’s server-side language, Hack. One of the aims of our language is to support deep probabilistic modeling by providing a flexible interface for composing deep neural networks with encoded uncertainty, together with a rich inference engine. We demonstrate the Bayesian neural network interface in HackPPL and present results on a multi-class classification problem for predicting user location states using several inference techniques. Through HackPPL, we aim to provide tools for interacting with and debugging Bayesian models, and to integrate them into the Facebook ecosystem.
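The abstract does not detail HackPPL's inference API, but the idea of fitting one model with several inference techniques can be sketched in Pyro, reusing the bnn_model sketch above: the same model function is handed either to a variational engine or to an MCMC engine. The training data below is synthetic and purely illustrative.

```python
import torch
from pyro.infer import SVI, Trace_ELBO, MCMC, NUTS
from pyro.infer.autoguide import AutoNormal
from pyro.optim import Adam

# Illustrative synthetic data standing in for real features/labels.
x_train = torch.randn(100, 4)
y_train = (torch.rand(100) > 0.5).float()

# Technique 1: stochastic variational inference with a mean-field
# Gaussian approximation over all weights and biases.
guide = AutoNormal(bnn_model)
svi = SVI(bnn_model, guide, Adam({"lr": 1e-2}), loss=Trace_ELBO())
for _ in range(1000):
    svi.step(x_train, y_train)

# Technique 2: the No-U-Turn Sampler, asymptotically exact but
# more expensive per posterior sample.
mcmc = MCMC(NUTS(bnn_model), num_samples=300, warmup_steps=100)
mcmc.run(x_train, y_train)
```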