18 June 2025
- Empowering Local Communities for Online Fact-Checking and Moderation
SYNOPSIS
Meta’s move from independent third-party fact-checkers to community-driven fact-checking in the United States, similar to X’s Community Notes, has ushered in a shift toward participatory moderation, and the need for new ways to navigate the digital space. Local communities in ethnic and religious spaces can collaborate with fact-checking organisations to build capacity amongst members and contribute to the fight against the dissemination of false information.

COMMENTARY
On 7 January 2025, Meta CEO Mark Zuckerberg announced that Facebook and Instagram would shut down their fact-checking programme in the US and opt for the community-based approach of Community Notes. The move marks Meta’s retreat from its responsibility to detect false information with the help of independent, certified third-party fact-checkers.
Consequently, the responsibility of identifying false information will be left to the community, where diverse social media users will have to come to a consensus on information reliability via Community Notes, a feature first introduced by X (formerly Twitter) in 2021. However, with Meta’s extensive reach in Southeast Asia, the reliance on Community Notes has several implications.
First, social media is widely used in the region – Facebook and Instagram are among its most prominent platforms – and their extensive use and reach make them significant breeding grounds for false information, misinformation, and disinformation.
Second, Southeast Asia is a religiously and ethnically diverse region with relatively lower digital literacy rates and an imbalance of power over internet governance – in some countries, governments hold significantly more power in defining what constitutes fake news. This has raised questions about whether community-driven efforts to counter divisive online narratives and behaviours can be effective without undermining the social fabric of societies.
The challenge is further complicated by mixed results on the effectiveness of Community Notes in combating misinformation. For example, one study reported no significant reduction in misleading tweets on X. It also highlighted that the community-based approach may not be responsive enough to curb the rapid spread of tweets when they go viral.
On the other hand, other studies have reported more positive results. One study found that when Community Notes were publicly displayed, users who had actively engaged in misinformation were more likely to retract their tweets. This suggests that community feedback can encourage content removal among those who spread misinformation.
Despite the mixed results, Community Notes appears to be a likely path forward for social media platforms, and the opportunity to mitigate the challenges ahead remains open across the region. With the United States serving as the experimental ground for Meta’s new intervention, there is a need to build capacity at the community level and to recalibrate online norms in our region to better adapt to community-driven fact-checking and moderation.
Building Capacity for Community-Based Fact-Checking and Moderation
The shutdown of Meta’s fact-checking programme does not mark the end of fact-checking efforts. Instead, it has created new opportunities for organisations to collaborate more closely with local communities, especially ethnic and faith-based groups in the region, to strengthen digital literacy and build leadership in navigating online spaces. By equipping communities with fact-checking knowledge and frameworks, organisations can empower them to actively participate in crowdsourced fact-checking efforts.
One way to start building capacity is to encourage broad-based participation, since fact-checking efforts and outcomes depend on who participates in the process. For instance, studies have shown that other collaborative platforms like Wikipedia have a pronounced gender gap, with women making up only 10 to 15 per cent of the knowledge platform’s editors – a disparity possibly linked to a lack of confidence, or to caregiving and other responsibilities that constrain volunteer activities.
Such underrepresentation needs to be addressed, as misinformation targeting or misrepresenting women may be left unchallenged. Thus, encouraging participation across diverse communities is essential to ensure effective fact-checking.
Similarly, when misinformation involves faith or ethnic content, the participation of minority ethnic and religious communities, with majority groups as allies, would help to ensure that fact-checking and moderation are contextually informed. Interfaith and interethnic knowledge and networks enable prompt detection and response to misinformation before the narratives perpetuate harmful stereotypes or misperceptions.
Beyond participation, all communities need to foster a culture of collective learning to keep pace with changes to digital platforms. Fact-checking organisations, with their knowledge and skills, can help local communities understand how algorithms amplify content and how certain groups can exploit digital platforms to skew online public opinion.
For starters, community members can be trained to use X’s Community Notes – from creating an account, to writing and rating notes, to grasping how bridging-based ranking determines which notes are shown to users. Beyond learning its functionality, it is equally important to foster critical thinking and to guide members on how to raise concerns about misinformation without deepening divisions or eroding trust within or between communities.
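To give trainers a concrete sense of the intuition behind bridging-based ranking, the sketch below is a deliberately simplified, hypothetical illustration in Python. It is not X’s actual algorithm, which infers rater and note factors through matrix factorisation over large rating datasets; the rater “viewpoint” scores, thresholds and sample data here are invented for illustration only. What it shows is the core idea: a note surfaces when raters who usually disagree both find it helpful, not simply when it collects many ratings from one side.

```python
# Simplified, hypothetical illustration of bridging-based ranking.
# Not X's production algorithm: the real system learns rater and note
# factors via matrix factorisation over large rating datasets.

def bridging_score(ratings, viewpoint, note_id, min_helpful=3):
    """Score a note by rewarding helpful ratings from both 'sides'.

    ratings   : list of (rater_id, note_id, found_helpful) tuples
    viewpoint : dict mapping rater_id to a score in [-1, 1], assumed here
                to be pre-computed from past rating behaviour
    """
    helpful = [rater for rater, note, ok in ratings if note == note_id and ok]
    if len(helpful) < min_helpful:
        return 0.0
    left = [r for r in helpful if viewpoint[r] < 0]
    right = [r for r in helpful if viewpoint[r] >= 0]
    # The weaker side drives the score: broad one-sided support is not enough.
    return min(len(left), len(right)) / len(helpful)


# Invented example data.
ratings = [
    ("alice", "note1", True), ("bob", "note1", True), ("carol", "note1", True),
    ("alice", "note2", True), ("bob", "note2", False), ("dana", "note2", False),
]
viewpoint = {"alice": -0.8, "bob": 0.7, "carol": 0.2, "dana": 0.9}

for note in ("note1", "note2"):
    print(note, round(bridging_score(ratings, viewpoint, note), 2))
# note1 scores above zero because its helpful ratings span both viewpoints;
# note2 scores zero because too few raters, from only one side, found it helpful.
```

Walking community members through even a toy example like this can demystify why a well-written note may still not be displayed: visibility depends on agreement across differing perspectives, not on volume alone.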
Education and digital literacy must therefore go beyond individual-level skill-building to include group-based dialogue and learning. This broader, communal approach can build shared awareness of algorithmic biases and help communities recognise coordinated behaviours that attempt to distort online discussions.
A Holistic Approach to Supporting Community-Driven Fact-Checking and Moderation
While Meta’s shift toward community-driven moderation opens new opportunities for local communities to take a more active role, the platform still bears significant responsibility. In Southeast Asia’s diverse religious, ethnic and linguistic contexts, a one-size-fits-all approach may not suffice. Meta must engage meaningfully with local communities to understand their concerns and co-develop features and processes that are culturally relevant and effective. Moreover, when false information still spreads despite these efforts, Meta must take accountability and continue to refine its strategies to address such challenges.
At the same time, broad-based participation requires intentional support. For local communities – particularly faith-based groups – participating effectively in community fact-checking and moderation can be framed as an extension of values and teachings around community responsibility and care. Reframing digital engagement in this way encourages participation in online spaces and helps reshape digital norms and behaviours within and across communities.
Conclusion
Community-driven fact-checking and moderation hold promise, but they cannot replace the expertise of journalists and professional fact-checkers, particularly when addressing deepfakes or verifying information that requires domain-specific knowledge. A coordinated effort, such as creating a cross-sector network linking ethnic and faith-based organisations, journalists, third-party fact-checkers, and policymakers, is necessary to ensure a responsive and resilient ecosystem.
Such partnerships will be especially valuable in situations where online misinformation can have far-reaching consequences, such as during pandemics and national elections.
About the Author
Lam Teng Si is a Senior Analyst at the Social Cohesion Research Programme at S. Rajaratnam School of International Studies (RSIS), Nanyang Technological University (NTU), Singapore. This commentary is part of a series leading up to the International Conference on Cohesive Societies 2025.