December 15, 2022

I’m a researcher — here’s why I work at Meta

By: Umer Farooq

As 2022 comes to an end, we wanted to share some more personal, behind-the-scenes stories about research at Meta. These stories remind us of all the hard work that goes into our mission of connecting people.


My name is Umer Farooq, and I lead Research at Meta for Integrity, including our family of apps (Facebook, Instagram, Messenger, and WhatsApp). After I share my own perspectives on our work and why it matters to me, five researchers from my team will discuss why they chose to pursue a career in integrity research at Meta, and how it’s going for them now.

Every day, I wake up thinking about the important work my team does: enabling my friends, family, and small businesses around the world to express themselves and realize economic opportunity on our platform, while working to keep them safe. I joined the Integrity team in 2018 because I believed — as I still do — that being able to express oneself is of paramount importance in a highly connected world. I also believe that to realize this vision, we have to curb abuse that can hinder people’s freedom of expression and economic opportunity.

My team is currently composed of more than 75 researchers based in Menlo Park, Seattle, New York, Washington D.C., and London, as well as remotely across the world. The integrity researchers on my team come from a wide range of professional backgrounds: psychology, sociology, political science, statistics, economics, human-computer interaction, public health, and more. They use qualitative and quantitative methods such as surveys and interviews to understand ambiguous, complex, and emerging topics across integrity, such as hate speech and scams.

I’m often asked about the role of integrity research at a company like Meta, and how research contributes to product development. Simply put, research seeks to better understand the experiences of people who use our products, including their preferences regarding product design and function, how easy or difficult it is to use our products, and why and how people use or don’t use our products. We also conduct research with external stakeholders such as policymakers, academics, and nonprofit organizations. The goal of this research is to enable our teams to make data-informed decisions regarding product, policy, operations, and long-term strategy.

Here are some examples of topics that integrity teams work on:

  • Tools like Facebook Protect, our security program for groups of people who are frequently targeted by malicious hackers, such as human rights defenders, journalists, and government officials.
  • Features across our family of apps, such as Instagram, that help protect people against unwanted interactions or harassment on the platform.
  • Our strategic approach to maintaining a safe online environment in countries at risk.
  • Efforts against the spread of non-consensual intimate images (photos or videos of a person that feature nudity or are sexual in nature), including tools to help people who are concerned their intimate images may be shared without their consent.
  • Establishing structure and governance for the Oversight Board, an independent body that people can appeal to if they disagree with decisions we made about content on Facebook or Instagram.

I’m honored to work alongside a team of talented, thoughtful, hardworking researchers who eagerly tackle difficult and highly impactful issues and topics. Below, a few of them share their perspectives on their work.


About a year ago, I was offered what would turn out to be the most interesting, challenging, fulfilling job of my career: leading global integrity research for Instagram. This work covers everything from showing warning screens for graphic violence to strengthening the security of your account.

Every day, I get to work with some of the smartest, most engaged people I’ve ever met, on some of the hardest issues on social media. We tackle questions like: if someone is exploring their sexual identity, how can we support their efforts to find community, while protecting them from potential predators? How can we protect freedom of expression while keeping people safe from unwanted behavior and harassment?

In the last year alone, Instagram has instituted controls that reduce unwanted advances, made reporting bad experiences easier, helped people understand what behaviors might get their accounts disabled, increased access to accurate vaccination information globally, and most recently, protected our Ukrainian users from harassment and physical attacks. Not a week goes by that I don’t feel my work is meaningful and my input matters.



My reasons for working at Meta are rooted in my decision to pursue a PhD in economics. I entered college assuming I’d be an engineer. But by my junior year, despite being good at math and programming, I had discovered a new passion: economic growth. It turns out we still don't know why some economies are stuck in the poverty trap, or why others can quickly bounce back from poverty only to find their economic growth coming to a halt.

When you hear the word Facebook, Instagram, Messenger, or WhatsApp, the first thing that comes to mind might be social media, connecting with friends, or entertainment. But Meta platforms also have a substantial impact on economic empowerment, especially for disadvantaged segments of society, smaller businesses, and less-developed nations.

Meta has made marketing accessible to a vast number of small businesses across the globe that could never afford TV ads, billboards, magazine pages, or other traditional marketing. Optimized ad campaigns have not only made reaching customers possible for thousands of small businesses, but also made that outreach more relevant and valuable for the users on the receiving end. Businesses with tight budgets can track the performance of their campaigns in terms of customer acquisition, something they could never do with a billboard. Especially in emerging economies, businesses use WhatsApp to replace phone calls, personalize online shopping consultations, process payments within a messaging thread, communicate sensitive information protected by end-to-end encryption, and more. Meta's free and reliable products have also become an indispensable way for people to do day-to-day tasks and stay connected with family and friends.

All these interactions depend on trust. People won’t engage with ads if the landing pages are unsafe. Customers won’t start a chat thread with a business that doesn’t seem legitimate. Businesses don't want to promote their products alongside low-quality or counterfeit goods. I’m proud to take part in initiatives that address these complex problems, and passionate about making sure we’re building products to earn the trust of people and businesses worldwide every day.



When I interview job candidates, they often ask me why I work at Meta. It’s an easy question to answer: I value the standards of rigor and quality we hold our research to, I like the collaborative spirit with which we approach new challenges, and I like being able to understand how people use our platforms.

A lot of these job candidates come from the academic world, as I did when I joined Meta over four years ago. At that time, I was worried that the quality of my work might suffer — that the pressures of speed and efficiency might not allow for carefully designing research projects or thoroughly testing survey questionnaires. As I tell all the candidates, I was very wrong. I didn’t expect my new coworkers’ excitement when I told them my background is in survey methodology. I’d been worried that my lack of experience in user experience research would make me less valuable, but my new colleagues actually sought my advice on their survey problems and were eager to carry out the quality checks I recommended to make surveys more rigorous. High-quality research leads to high-quality data and a more accurate representation of what people who use our products think and need.

We have a saying at Meta: nothing is someone else’s problem. How I understand this mantra has evolved over time, from feeling empowered to say and do something if I see something that could be better, to accepting continuous feedback from others who are invested in making me better.


There’s nothing more frustrating — but also motivating — than someone saying “it can’t be done.” Integrity is a space in which many people have said just that, yet we continue to innovate and lead the industry with novel product experiences to support people in building safe online communities. During my time at Meta, I’ve led or managed incredibly fulfilling applied research with demonstrable positive impact on the safety of our global user community.

Since I joined as a researcher in 2018, my career at Meta has expanded my understanding of people, culture, and the value of interconnectedness online. It’s also taught me how to use this knowledge to develop delightful product experiences for our community. As someone with a background in social psychology, I treasure the opportunity to study the perceptions and behaviors of online communities on a global scale. There are few other companies for which a researcher would be tasked with understanding deep human phenomena such as online identity expression — how it differs around the world, how it changes over time, how it’s affected by who is in your network, and how it can be a tool for online protection or abuse.

As an immigrant living far from many of my family and friends, I believe in our company’s mission to give people the power to build community and bring the world closer together. However, a mission even closer to my heart is that of the Integrity team, for which I’m a research manager. Our mission is to protect people and businesses and ensure that when we build products, we have the knowledge and technology to protect voice and economic opportunity. No matter your thoughts about social media, everyone is invested in making it a safe and enjoyable place. That’s why I love what we do and I’m proud to work on the Integrity research team.


I’ve always been passionate about trust and safety work, and spent almost my entire PhD doing academic research about it. While I was planning on looking for an academic job and doing the right things to get there, I always doubted a little bit whether my research would be useful outside the scope of research papers.

Then, the year before I finished my PhD, I had the opportunity to do an internship at Meta (Facebook at the time) on its Integrity team, and the experience blew my mind. It was the first time I got to work on real content moderation problems at scale, and it completely transformed my view of not only integrity research but also what makes for good research in general. I knew then that this was the career I had always wanted.

Today, I’m a quantitative researcher at Meta, still on the Integrity team, and I can confidently say that I am doing my dream job. I still find the work I’m doing every day extremely impactful, intellectually challenging, and personally fulfilling. One project that stands out was when I led the research that informed how Meta should prioritize content for review. Our research identified several factors that eventually led to the creation of a dynamic content prioritization system. This system now powers Meta’s human review prioritization, and I could not be more proud of this work and the progress we’ve made in addressing some of the hardest problems impacting social media.