Facebook’s leaked guidelines offer insight into the codes and rules used by Facebook moderators to determine what content is allowed to be posted and remain on the site. According to The Guardian, the following guidelines (along with many others) are mentioned in documents supplied to Facebook moderators within the last year:
- Remarks such as “Someone shoot Trump” should be deleted because, as a head of state, he is in a protected category. But it can be permissible to say: “To snap a bitch’s neck, make sure to apply all your pressure to the middle of her throat”, or “fuck off and die”, because these are not regarded as credible threats.
- Videos of violent deaths, while marked as disturbing, do not always have to be deleted because they can help create awareness of issues such as mental illness.
- Some photos of non-sexual physical abuse and bullying of children do not have to be deleted or “actioned” unless there is a sadistic or celebratory element.
- Photos of animal abuse can be shared, with only extremely upsetting imagery to be marked as “disturbing”.
- All “handmade” art showing nudity and sexual activity is allowed but digitally made art showing sexual activity is not.
- Videos of abortions are allowed, as long as there is no nudity.
- Facebook will allow people to livestream attempts to self-harm because it “doesn’t want to censor or punish people in distress”.
- Anyone with more than 100,000 followers on a social media platform is designated as a public figure – which denies them the full protections given to private individuals.
Comments such as “Little girl needs to keep to herself before daddy breaks her face” and “I hope someone kills you”, among others, are allowed to remain on the site because they are “regarded as either generic or not credible.”
In one of the leaked documents, Facebook acknowledges that “people use violent language to express frustration online” and feel “safe to do so” on the site. Facebook also admits (as many who use the site already know) that “not all disagreeable or disturbing content violates community standards.”
Many moderators are said to have concerns about the inconsistency and peculiar nature of some of the policies. Moderators also complained about the volume of work, claiming they often have “just ten seconds” to make decisions.
According to Monika Bickert, Facebook’s head of global policy management, “We feel responsible to our community to keep them safe and we feel very accountable. It’s absolutely our responsibility to keep on top of it. It’s a company commitment. We will continue to invest in proactively keeping the site safe, but we also want to empower people to report to us any content that breaches our standards.”