
Facebook’s secret rulebooks reveal how the site deals with violent content

Come for the puppy pictures, leave because of unfiltered harassment.
Image: Sean Gallup/Getty Images

Facebook’s closely guarded guidelines for moderating violence, hate speech, and revenge porn have been revealed for the first time after an investigation by the Guardian. They show the extremely fuzzy and imperfect line between what’s considered dangerous material and acceptable content on the world’s leading social network.

Internal documents obtained by the Guardian explore the murky waters that moderators and executives must wade through on a daily basis as they judge user-generated content. A Facebook user who exclaims “Let’s beat up fat kids” gets a pass, but a commenter who urges “Someone shoot Trump” is taken seriously.

The Guardian’s investigation, which rolled out on Sunday, arrives at a pivotal moment for the social media giant.

Facebook faces mounting public pressure to rein in the ugliest impulses of its 2 billion users, and shield our eyes from disturbing and triggering videos, such as murders and sexual assaults that are broadcast on Facebook Live. At the same time, Facebook says it’s trying to respect its users’ freedom of expression and avoid overt censorship.

Image: Carl Court/Getty Images

Yet, like many tech companies, Facebook has kept extremely mum on the details of its content moderation strategy, even amid the deafening criticism.

The Guardian talked to overwhelmed moderators and said it saw more than 100 internal training manuals, spreadsheets, and flowcharts that form the blueprint for how Facebook moderates issues such as violence, hate speech, terrorism, racism, revenge porn, and self-harm.

Other stories in the British newspaper’s “Facebook Files” series explore how Facebook lets users livestream self-harm videos, and how the social media site is attempting to address criticism that Facebook is a forum for misogyny and racism.

Mashable contacted Facebook for comment and will update this story with any response.

Facebook’s trove of violence-related documents details the ever-growing array of terrible situations that moderators must navigate. Many moderators have said they find the policies inconsistent and confusing. For example, not all rape threats are treated equally.

“Facebook cannot keep control of its content,” an unnamed source told the Guardian. “It has grown too big, too quickly.”

Documents supplied to Facebook moderators within the last year included lists of comments that are considered unacceptable, and others that are allowed to remain. They include:

Credible Violence (Calls for Action):

UNACCEPTABLE: “Someone shoot Trump.”

ALLOWED: “Kick a person with red hair.”

ALLOWED: “To snap a bitch’s neck, make sure to apply all your pressure to the middle of her throat.”

ALLOWED: “Let’s beat up fat kids.”

UNACCEPTABLE: “#stab and become the fear of the Zionist.”

As a head of state, President Donald Trump is in a protected category, so threats against Trump, even if they’re hollow, aren’t allowed. Yet instructions on how to snap someone’s neck aren’t considered a credible threat, ostensibly because they’re only hypothetical misogyny and abuse.

Credible Violence (Aspirational/Conditional Statements):

ALLOWED: “Little girl needs to keep to herself before daddy breaks her face”

ALLOWED: “You assholes better pray to God that I keep my mind intact because if I lose I will literally kill HUNDREDS of you.”

ALLOWED: “Unless you stop bitching I’ll have to cut your tongue out”

The leaked documents show certain “aspirational” or “conditional” statements that are permitted, because they’re also considered to be generic or not credible, even if they’re alarming.

Revenge Porn, Current Policy:

High-level: Revenge porn is sharing nude/near-nude photos of someone publicly or to people that they didn’t want to see them in order to shame or embarrass them.

Abuse Standards: Sharing imagery as “revenge porn” if it fulfills all three conditions:

– Image produced in a private setting. AND

– Person in image is nude, near nude, or sexually active. AND

– Lack of consent confirmed by: Vengeful context (e.g. caption, comments, or page title); OR independent sources (e.g. media coverage, or LE record)

This partial excerpt comes from a slide on Facebook’s revenge porn policy. Moderators told the Guardian that policies on sexual content are among the most complex and confusing.

The same goes for artwork. According to the documents, all “handmade” art showing nudity and sexual activity is allowed, but digitally made art showing sexual activity is not. A separate document showed that videos of abortion are allowed, so long as there is no nudity.

Graphic Violence (Animal Abuse):

– Generally, imagery of animal abuse can be shared on the site.

– Some extremely disturbing imagery may be “marked as disturbing.”

– Sadism and celebration restrictions apply to all imagery of animal abuse.

According to another slide, Facebook’s policies on animal abuse allow certain photos and videos for “awareness,” although Facebook is allowed to flag content as “extremely disturbing” to warn users before they take a look.

“Generally, imagery of animal abuse can be shared on the site,” one slide says. “Some extremely disturbing imagery may be marked as disturbing.”

In case Facebook forgot to mention it within its long list of rules, here’s an easy one: Be nice to people, and stop being horrible.

Read more: http://mashable.com/2017/05/21/facebook-secret-guidelines-violent-content-guardian/
