Facebook's guidelines indicate the company tries to get people help during self-harm live streams, and will remove the video if it is unsuccessful

May 22, 2017 21:12 GMT

Leaked internal documents reveal that Facebook allows the live streaming of self-harm acts, mostly because the social network doesn't want to censor or punish people in distress who are attempting suicide. 

According to The Guardian, which published a series of documents that allegedly belong to Facebook and are meant for its employees, such videos may be removed from the site only once there is no longer an opportunity to help the person, unless the incident is newsworthy. Facebook's Mark Zuckerberg has said the company has helped people attempting suicide on live streams by reaching out to friends, family, and authorities, but acknowledged there were also times when it couldn't help.

What matters

The policy was found among roughly 100 internal documents and manuals The Guardian obtained. These cover not only self-harm but also violence, hate speech, pornography, terrorism, racism, and more, detailing how Facebook employees are to handle each topic.

"We're now seeing more video content -- including suicides -- shared on Facebook. We don't want to censor or punish people in distress who are attempting suicide. Experts have told us what's best for these people's safety is to let them livestream as long as they are engaging with viewers," the documents state.

"However, because of the contagion risk, what's best for the safety of people watching these videos is for us to remove them once there's no longer an opportunity to help the person. We also need to consider newsworthiness, and there may be particular moments or public events that are part of a broader public conversation that warrant leaving up."

Facebook Live has been around for a little over a year, and since its launch it has been extremely popular with all types of users, from teenagers to journalists, politicians, and Hollywood stars. That popularity has led to plenty of disturbing footage being shared, including murders, suicides, and beatings, raising the question of when it is acceptable to censor videos and when it is not. Facebook's response to numerous cases indicates the company isn't entirely sure how to proceed either.

Facebook has vowed to hire 3,000 more people over the next year to help monitor reports about violent videos and other types of material. They will join the 4,500 people already doing this work.