Oversight Board | Independent Judgment. Transparency. Legitimacy.
Meta initially removed the content for violating its Hate Speech policy but restored it after the user appealed. After being
reported by another user, Meta then removed the content again for violating its Hate Speech policy. According to Meta, before
the Board selected this case, the content was escalated for additional internal review which determined that it did not, in fact,
violate the company’s Hate Speech policy. Meta then restored the content to Instagram. Meta explained that its initial decisions
to remove the content were based on reviews of the pictures containing the terms “z***l” and “t***e/t***a.”
Key findings
The Board finds removing this content to be a clear error which was not in line with Meta’s Hate Speech policy. While the post
does contain slur terms, the content is covered by an exception for speech “used self-referentially or in an empowering way,” as
well as an exception which allows the quoting of hate speech to “condemn it or raise awareness.” The user’s statements that they
did not “condone or encourage the use” of the slur terms in question, and that their aim was “to reclaim [the] power of such
hurtful terms,” should have alerted the moderator to the possibility that an exception may apply.
For LGBTQIA+ people in countries which penalize their expression, social media is often one of the only means to express
themselves freely. The over-moderation of speech by users from persecuted minority groups is a serious threat to their freedom
of expression. As such, the Board is concerned that Meta is not consistently applying exceptions in the Hate Speech policy to
expression from marginalized groups.
The errors in this case, which included three separate moderators determining that the content violated the Hate Speech policy,
indicate that Meta’s guidance to moderators assessing references to derogatory terms may be insufficient. The Board is
concerned that reviewers may not have sufficient resources in terms of capacity or training to prevent the kind of mistake seen in
this case.
Providing guidance to moderators in English on how to review content in non-English languages, as Meta currently does, is
innately challenging. To help moderators better assess when to apply exceptions for content containing slurs, the Board
recommends that Meta translate its internal guidance into dialects of Arabic used by its moderators.
The Board also believes that to formulate nuanced lists of slur terms and give moderators proper guidance on applying
exceptions to its slurs policy, Meta must regularly seek input from minorities targeted with slurs on a country- and
culture-specific basis. Meta should also be more transparent about how it creates, enforces, and audits its market-specific
lists of slur terms.
The Oversight Board’s decision
The Oversight Board overturns Meta’s original decision to remove the content.
As a policy advisory statement, the Board recommends that Meta:
1. Translate the Internal Implementation Standards and Known Questions into dialects of Arabic used by its
content moderators. Doing so could reduce over-enforcement in Arabic-speaking regions by helping
moderators better assess when exceptions for content containing slurs are warranted.
2. Publish a clear explanation of how it creates its market-specific slur lists. This explanation should include the
processes and criteria for designating which slurs and countries are assigned to each market-specific list.
3. Publish a clear explanation of how it enforces its market-specific slur lists. This explanation should include the
processes and criteria for determining precisely when and where the slurs prohibition will be enforced: whether
for posts originating geographically from the region in question, for posts originating outside but relating to
the region in question, and/or for all users in the region in question, regardless of the geographic
origin of the post.
4. Publish a clear explanation of how it audits its market-specific slur lists. This explanation should include the
processes and criteria for removing slurs from, or keeping slurs on, Meta's market-specific lists.
*Case summaries provide an overview of the case and do not have precedential value.
Full case decision
https://www.oversightboard.com/decision/IG-2PJ00L4T