In the past few years, we have seen an explosion of interest in topics at the intersection of programming languages and machine learning. This is not a coincidence: real-world applications that require probabilistic thinking have grown rapidly. Additionally, the community has realized that probabilistic methods play a genuinely useful role in program analysis – for example, in ranking facts deduced by static analyses, in type reconstruction, and more generally in building explainable generative models. Machine learning techniques such as efficient automatic differentiation are no longer esoteric; they form the basis of popular deep learning frameworks such as TensorFlow and PyTorch, and of differentiable programming languages like Swift for TensorFlow. Deep learning also relies on compiler and code generation techniques to target GPUs and special-purpose accelerator hardware.
At Facebook, we are pursuing forward-looking research as well as putting concrete results from several of these threads into production. We introduced HackPPL, which extends our internal PHP dialect into a full-fledged probabilistic programming language, and we are creating extensions to Python to eliminate string-based API patterns. We have started various language-centric projects around acceleration and differentiable programming. We also have a portfolio of projects in the “big code” space, exploring topics such as code search and recommendation, automatic bug fixing, and program synthesis using machine learning. Together, we hope this work will have impact across all of Facebook’s infrastructure.
To foster further innovation in these areas, and to deepen our collaboration with academia, Facebook is pleased to invite faculty and graduate students to respond to this call for research proposals on the topics described above. We anticipate making a total of ten awards, each in the $50,000 range. Payment will be made to the proposer’s host university as an unrestricted gift.