July 29, 2021

Investing in academic research to improve privacy technology: Our approach and recent RFP winners

By: Meta Research

One of our goals over the next decade is to build stronger privacy protections for everyone who uses our apps and services. Our latest research award opportunity in privacy-enhancing technology and the recently launched request for proposals on Building Tools to Enhance Transparency in Fairness and Privacy are the next of many steps toward that goal, and they represent a continuation of several years of investments in the privacy research space.

Our approach to academic research and investments

Through a variety of programs, partnerships, and collaborations, Facebook researchers work with the global academic community on topics that align with our mission to give people the power to build community and bring the world closer together. “We are sponsoring labs and conferences, partnering with academics on short- and long-term projects, and supporting PhD students through our Fellowship program,” says Sharon Ayalde, Research Program Manager, Facebook Academic Engagements. “We also provide research award opportunities through open requests for proposals.”

Requests for proposals (RFPs) in particular help us strengthen our ties to academia and foster community. Through RFPs, we are able to discover activities and key players in academia that are aligned with our research challenges. Research funds are generally awarded as unrestricted gifts to accredited universities to help finance winning proposals. In general, there are 15 to 20 RFP opportunities each year across a variety of research topics, such as privacy, networking, data science, probability, machine learning, and UX.

Investing in these research projects helps accelerate the field for everyone and allows us to apply the most cutting-edge technologies to our apps and services. In the privacy research space, we’ve steadily increased opportunities for academic collaboration, and research project funding continues to be available. Last year, we granted research awards in key topics such as privacy-preserving technologies and cryptography, user experiences in privacy, and privacy in AR/VR and smart device products. These opportunities alone attracted more than 300 applications, with over $2 million in total funding.

The 2020 People’s Expectations and Experiences with Digital Privacy RFP, in particular, received 147 proposals from 34 countries and 120 universities. The five winning proposals represented 14 universities, including Cornell University, Carnegie Mellon University, the Hebrew University of Jerusalem, the Indian Institute of Technology, Brigham Young University, Northwestern University, and Hamad Bin Khalifa University.

What’s next

In 2021 and beyond, we will continue our investment in research and innovation to help us develop new ways to build products and process data with privacy in mind. We’ll also continue to work with policymakers, privacy experts, global organizations, and developers to build solutions that ensure people feel safe and comfortable using our products.

“The role of technology in our lives and society is evolving faster than ever before,” says Scott Renfro, Facebook Software Engineer. “It’s critical that we work hard to put privacy, safety, and security first and work with people at the forefront of emerging technologies and scientific understanding to find better solutions. This is why we want to collaborate with academia and support the important work academics do by launching another research award opportunity.”

As part of our continued investment, we are pleased to announce the winners and finalists of the 2021 Privacy-Enhancing Technologies RFP, which sought proposals from academics conducting research in applied cryptography, data policies and compliance, differential privacy, and privacy in AI. The research award opportunity attracted 159 proposals from 102 universities. Thank you to everyone who took the time to submit a proposal, and congratulations to the winners.

Research award recipients

Principal investigators are listed first unless otherwise noted.

Bridging secure computation and differential privacy
Jonathan Katz (University of Maryland College Park)

Cryptographic enforcement of end-to-end data privacy
Anwar Hithnawi (ETH Zurich)

Implementing a flexible framework for privacy accounting
Salil Vadhan (Harvard University)

InferViz: Weighted inference and visualization of insecure code paths
Musard Balliu (KTH Royal Institute of Technology), Marco Guarnieri (IMDEA Software Institute)

Practical differential privacy: Using past and present to inform future
Aleksandra Korolova, Brendan Avent (University of Southern California)

Privacy-preserving machine learning via ADMM
Yupeng Zhang (Texas A&M University)

Private authentication with complex assertions and abuse prevention
Ian Miers (University of Maryland College Park)

Safeguarding user data against cross-library data harvesting
Luyi Xing, Xiaojing Liao (Indiana University Bloomington)

SEBRA: SEcuring BRowser Extensions by Information Flow Analysis
Andrei Sabelfeld (Chalmers University of Technology)

Towards privacy-preserving and fair ad targeting with federated learning
Golnoosh Farnadi (HEC Montreal and MILA), Martine De Cock (University of Washington Tacoma)


Finalists

A methodological approach to privacy-preserving data analysis pipelines
Patrick Thomas Eugster, Savvas Savvides (Università della Svizzera italiana)

A toolkit for locally private statistical inference
Clement Canonne, Vincent Gramoli (University of Sydney)

Advancing differential privacy accounting
Yu-Xiang Wang (University of California Santa Barbara)

An informed consent management engine to control the privacy of IoT devices
John Grundy, Mohan Chhetri, Zubir Baig, Chehara Pathmabandu (Monash University)

Beyond cookies: Private personalization for the tracker-free web
Henry Corrigan-Gibbs (Massachusetts Institute of Technology)

Challenges in E2E encryption
Yevgeniy Dodis (New York University)

Consent flows tracking for OAuth2.0 standard protocol
Alex Pentland, Thomas Hardjono (Massachusetts Institute of Technology)

Deletion compliance in data systems
Manos Athanassoulis (Boston University)

Differentially private analyses of textual data, such as Facebook posts
Gary King (Harvard University)

Differentially private collection of key-value pairs using multi-party computation
Florian Kerschbaum (University of Waterloo)

Differentially private analysis of streaming and graph data
Jerome Le Ny (Polytechnique Montreal)

Differentially private multi-task learning
Virginia Smith, Steven Wu (Carnegie Mellon University)

DragonFLy: Private, efficient, and accurate federated learning
Adam O’Neill, Amir Houmansadr (University of Massachusetts Amherst)

Efficient sparse vector aggregation for private federated learning
Giulia Fanti, Elaine Shi (Carnegie Mellon University)

End-to-end privacy compliance in distributed web services
Malte Schwarzkopf (Brown University)

Fast identity online with attributes and global revocation (sFIDO)
Lucjan Hanzlik (CISPA Helmholtz Center for Information Security)

Practical private information retrieval with privacy-enhancing applications
Ling Ren (University of Illinois Urbana-Champaign)

Privacy-preserving machine learning through label differential privacy
Prateek Mittal, Amir Houmansadr (Princeton University)

Privacy in sketches for big data analytics
Pedro Reviriego-Vasallo (University Carlos III de Madrid)

Privacy of data set properties in machine learning
Olga Ohrimenko (University of Melbourne)

Searching for accurate and efficient private models
Reza Shokri (National University of Singapore)

Symmetric homomorphic encryption for fast privacy-preserving data analysis
Patrick Thomas Eugster, Savvas Savvides (Università della Svizzera italiana)

Scalable and secure protocols for data linking and analytics
Xiao Wang (Northwestern University)