The Oversight Board has accepted Meta’s request for a policy advisory opinion on how the company should approach expanding its community notes program outside the United States.
The company has asked the Board to advise on the factors it should consider when deciding whether a country should be excluded from the rollout, noting that local context may affect how the system functions. It has also sought guidance on how those factors should be weighed against one another in a framework that can scale globally.
In its submission, the company said the community notes program remains in an "early stage of product development" and that it currently has "limited data from the US beta rollout." It added that its "primary interest lies in establishing fundamental guiding principles" for expanding the program worldwide.
On January 7, 2025, the company announced it was discontinuing its third-party fact-checking program in the US and transitioning to community notes. At the time, it said it would refine the feature before introducing it to users outside the US. Unlike the previous model, which relied on partner fact-checking organizations, community notes allow users themselves to add context to potentially misleading posts.
Meta detailed how the program functions. Users can apply to become contributors, and those who meet the eligibility criteria are "gradually and randomly" admitted from a waitlist. Contributors can write and rate notes on public content originating in the United States on Facebook, Instagram and Threads, and they see a dedicated feed of posts that users have flagged as needing additional context. Notes must include a supporting link, and contributors can rate others' notes as "helpful" or "not helpful."
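The workflow described above can be summarised in a few lines of code. The following Python sketch is purely illustrative: the function names, the admission batch size, and the link check are assumptions for clarity, not Meta's actual system or API.

```python
# Illustrative only: a toy model of the contributor pipeline described above
# (waitlisting, gradual random admission, link-checked notes).
import random

WAITLIST = ["user_a", "user_b", "user_c", "user_d"]
CONTRIBUTORS = set()

def admit_batch(k=2):
    """Gradually and randomly admit up to k eligible users from the waitlist."""
    batch = random.sample(WAITLIST, min(k, len(WAITLIST)))
    for user in batch:
        WAITLIST.remove(user)
        CONTRIBUTORS.add(user)
    return batch

def submit_note(author, post_id, text, supporting_link):
    """Notes must come from an admitted contributor and include a source link."""
    if author not in CONTRIBUTORS:
        raise PermissionError("author has not been admitted from the waitlist")
    if not supporting_link.startswith(("http://", "https://")):
        raise ValueError("a note must include a supporting link")
    return {"post": post_id, "text": text, "link": supporting_link, "ratings": []}
```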
Meta said it built its system using the open-source algorithm from X’s community notes program. It described the tool as a “consensus algorithm that uses separate measures of ‘helpfulness’ and ‘consensus’ to calculate an overall ‘helpful consensus’ score.”
According to the company, the algorithm identifies agreement among a sufficient number of contributors who typically disagree with each other. If a note's combined score passes a "certain threshold" and does not violate Meta's Community Standards, it is published and appears as a banner beneath the relevant post. The company said this approach "helps ensure that notes reflect a range of perspectives and reduces the risk of bias."
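To make the consensus idea concrete, here is a minimal Python sketch of a bridging-style check: a note is published only when raters who usually land on opposite sides both find it helpful. The two-cluster viewpoint labels, the helpful_consensus function and the 0.7 threshold are illustrative assumptions; the open-source X algorithm that Meta says it built on is considerably more involved, using matrix factorisation over the full rating history rather than fixed clusters.

```python
from collections import defaultdict

def helpful_consensus(ratings, viewpoint, threshold=0.7):
    """Toy 'helpful consensus' check.

    ratings: iterable of (rater_id, note_id, is_helpful) tuples.
    viewpoint: rater_id -> cluster label ('A'/'B'), assumed to be derived
    elsewhere from raters' historical disagreement patterns.
    A note clears the bar only if the helpful-rate in every cluster that
    rated it meets the threshold, i.e. people who typically disagree agree.
    """
    per_note = defaultdict(lambda: defaultdict(lambda: [0, 0]))  # note -> cluster -> [helpful, total]
    for rater, note, is_helpful in ratings:
        tally = per_note[note][viewpoint[rater]]
        tally[0] += int(is_helpful)
        tally[1] += 1
    return [
        note
        for note, clusters in per_note.items()
        # Require ratings from at least two opposing clusters, each clearing the bar.
        if len(clusters) >= 2
        and all(helpful / total >= threshold for helpful, total in clusters.values())
    ]

# Demo: "n1" is endorsed across both clusters; "n2" splits along cluster lines.
ratings = [
    ("r1", "n1", True), ("r2", "n1", True),   # cluster A
    ("r3", "n1", True), ("r4", "n1", True),   # cluster B
    ("r1", "n2", True), ("r3", "n2", False),  # disagreement, no consensus
]
viewpoint = {"r1": "A", "r2": "A", "r3": "B", "r4": "B"}
print(helpful_consensus(ratings, viewpoint))  # -> ['n1']
```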
The request also outlines steps Meta is taking to retain volunteer contributors and prevent coordinated manipulation of submissions and ratings.
Meta noted that its enforcement of the Misinformation and Harm Community Standard remains unchanged. It continues to remove content that could lead to “imminent physical harm” or “interference with the functioning of political processes,” and still relies on trusted partners to help identify violations.
As part of its inquiry to the Board, the company listed several factors it may use to determine whether a country should be excluded from the program. These include low levels of freedom of expression, lack of press freedom, government restrictions on the internet, low digital literacy, and whether contributors in a country have shown, past and present, the pattern of disagreement the consensus-based algorithm needs to function effectively. It said the list is not exhaustive.
The Board has asked for public comments on issues including the risks and opportunities of community-sourced moderation, the suitability of consensus-driven algorithms in different political environments, Meta's human rights responsibilities, and best practices for rolling out global products in contexts marked by conflict, polarisation, or limited rights protections. It is also seeking research on responses to misleading content beyond removal, and studies on how country-specific factors influence the functioning of social media systems.
The Board has previously ruled on cases involving the company's misinformation policies and labelling practices, including decisions related to elections in Iraqi Kurdistan (2025), the UK riots (2025), Australian voting rules (2024), an altered video of US President Joe Biden (2024), COVID-19 misinformation (2023), and lockdown-related posts in Brazil (2021).