I am a PhD student at MIT; my research page can be found here. My research focuses on distributed computation in statistics and machine learning under constraints of privacy, communication, and efficiency. From a foundational standpoint, this work draws on non-asymptotic statistics, randomized algorithms, combinatorics, and at times systems design.

I was selected as a SERC Scholar (Social and Ethical Responsibilities of Computing Scholar) by MIT’s Schwarzman College of Computing. My work on "FedML: A research library and benchmark for federated machine learning" won a Baidu Best Paper Award at NeurIPS 2020-SpicyFL, and my work on "NoPeek-Infer: Preventing face reconstruction attacks in distributed inference after on-premise training" won the Mukh Best Paper Award at IEEE FG-2021. I was interviewed in the book "Data Scientist: The Definitive Guide to Becoming a Data Scientist", and my work on Split Learning was featured in Technology Review.

A small sampling of problems I work on includes a) private independence testing and private k-sample testing in statistics, b) bridging privacy with social choice theory, c) private mechanisms for training and inference in ML, d) privately estimating non-linear measures of statistical dependence between multiple parties, and e) split learning.

I was previously a scientist at Amazon (AWS), Motorola Solutions, and various startups, all of which were eventually acquired. I have interned at Apple, Corning, and TripleBlind. I hold an MS in Mathematical and Applied Statistics from Rutgers University, New Brunswick.