We continually review our Community Standards to ensure they are keeping pace with global dynamics. The Content Policy Research Initiative will help us bring diverse analyses — both qualitative and quantitative — from researchers around the world into this important work.
To benefit from the community of experts and researchers invested in understanding the relationship between online content and offline harm, Facebook is convening a global series of workshops to share more about research efforts currently underway and to identify opportunities for collaboration.
These workshops will address Facebook's content policy-making broadly and focus on evidence-based approaches to keeping users and communities safer. We want to kick off an important dialogue that leads to additional research collaborations.
The Content Policy Research Initiative seeks to engage outside scholars on a number of content policy issues throughout 2019.
Read about the first workshops in DC and Paris, Latin America, Sydney and Auckland, and Dar es Salaam and Rome.
As part of the Content Policy Research Initiative, we are supporting research that will inform our content policy efforts and add to the broader research community's conversation on social media content. This research will take place globally and across disciplines, and we anticipate it will make a robust contribution to scholarship in this space.
In 2019, we held two calls for proposals. The first focused on hate speech and preventing offline harm, and we announced 19 winners in May. The second sought research on bullying and harassment and on fairness in global enforcement; we announced those winners in September.
Most recently, we launched a new call for proposals on misinformation and polarization, which closes April 1, 2020. More information can be found here.
The Data Transparency Advisory Group (DTAG) was established as part of a formal process to solicit feedback on and public assessment of our metrics. It is composed of international experts in measurement, statistics, criminology, and governance who have provided an independent, public assessment of our Community Standards Enforcement Report specifically and of our measurement efforts related to content moderation more broadly.
Read their final report here.