Facebook declined to confirm the Guardian's reporting, but it did not dispute it. Perhaps the most contentious issue is Facebook's highly variable policy on threats of death and violence. The leaked slides and manuals attempt to set out rules and scenarios for deciding which content needs to be moderated and which can be ignored.
Leaked internal documents detail the complicated and, at times, controversial techniques used to monitor content while preserving free speech, causing some critics to wonder whether Facebook can effectively keep watch over the overwhelming volume of posts on its site. One source says the company simply can't handle all the content and that it has grown "too big, too quickly".
"Not all disagreeable or disturbing content violates our community standards".
The guidelines for removal of content on Facebook have been revealed, and they may be more lenient than you think.
Stretching to thousands of documents, the trove of papers illustrates the complex web of rules and regulations governing what is allowed and what is not, including apparent inconsistencies in approach: a threat to kill the United States president is treated far more seriously than a similar threat against an ordinary member of the public. These are not terms or words usually used on Newstalk.com, but we have made a decision to publish them to illustrate how Facebook's policing process works.
Threats against so-called "protected categories" such as President Trump should be deleted, according to the files; "Someone shoot Trump", for example, is not acceptable to post.
Under the rules, you can't say you want to hurt heads of state or candidates, journalists, activists, witnesses and informants, specific law enforcement officers, foreigners, or homeless people.
Videos of deaths don't always have to be deleted, because they can raise awareness of issues such as mental illness. In some cases, such videos remained on Facebook for 24 hours before they were taken down.
Facebook acknowledges that censoring all forms of violent language, especially between friends, would prevent people from airing frustrations. People have different views on what is appropriate to share, Facebook said. While threatening political figures is an obvious red flag, a statement like "I'm going to kill you" is considered too generic to be treated as a genuine threat.
Another example was "Let's beat up fat kids", which was permissible.
"This requires a lot of thought into detailed and often hard questions, and getting it right is something we take very seriously", Monika Bickert, head of global policy management at Facebook, said in a statement on Monday. "For instance, the line between satire and humour and inappropriate content is sometimes very grey", she said. The broad exception is any imagery of violence that is shared with "sadism and celebration", which should be removed, according to the guidelines. The same goes for non-sexual physical abuse and bullying.