Finally, an MPAA & RIAA idea that's good for something

Jun 27, 2016 21:05 GMT  ·  By

A Reuters report from June 25 revealed that Google and Facebook have moved to automated systems for blocking extremist and violent videos, similar to the ones they use to identify copyrighted content.

The report cites two anonymous sources from each company, revealing that Facebook and YouTube have deployed systems capable of detecting content flagged as extremist or violent and automatically removing it without human intervention.

The system is similar to the one used by the two companies to identify illegal copies of copyrighted content, such as movies, music videos, or music playing in the background of homemade videos, and take them down. Both companies created those original systems under pressure from copyright protection groups such as the RIAA and MPAA.

Some sort of human supervision is still needed

Reuters says both Google and Facebook appear to be using a modified version of this system but to identify Islamic State videos instead.

The only downside is that the system can't detect original videos posted for the first time on a network. Both companies will still rely on users to flag such content, and on their employees to manually review it.

Once a video is added to their databases, both networks will be able to recognize it when another user tries to re-upload or repost it.
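The flow described above can be sketched as a simple fingerprint lookup: reviewed-and-removed videos are fingerprinted into a database, and every new upload is checked against it. This is only an illustrative sketch, not either company's actual system; the function names are invented, and a plain SHA-256 hash is used where real systems rely on perceptual fingerprints that survive re-encoding and trimming.

```python
import hashlib

# Fingerprints of videos a human reviewer has already flagged and removed.
blocked_fingerprints: set[str] = set()

def fingerprint(video_bytes: bytes) -> str:
    """Fingerprint an upload. SHA-256 is used here for illustration only;
    it matches byte-identical copies, unlike real perceptual fingerprints."""
    return hashlib.sha256(video_bytes).hexdigest()

def flag_video(video_bytes: bytes) -> None:
    """Called after a user report is confirmed by a human reviewer."""
    blocked_fingerprints.add(fingerprint(video_bytes))

def check_upload(video_bytes: bytes) -> bool:
    """Return True if the upload matches previously removed content."""
    return fingerprint(video_bytes) in blocked_fingerprints

# Re-uploads of flagged content are caught automatically; a brand-new
# video passes and still needs user flagging plus manual review.
flag_video(b"previously removed clip")
print(check_upload(b"previously removed clip"))   # re-upload detected
print(check_upload(b"never-before-seen video"))   # not detected
```

The key limitation the article points out falls out of this design: the database can only ever match content that has already been reviewed once, so first-time uploads always get through.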

Systems created at the request of US and EU governments

Reuters claims that both services created the systems at the behest of US President Barack Obama and other US and European leaders, who were worried that terrorists had become extremely successful at promoting extremist content online and using it to recruit new members.

The meeting, held under the Counter Extremism Project, took place in late April, and Twitter and Cloudflare are said to have attended as well. A Twitter spokesperson said the company was still evaluating its position on the matter.

No company wanted to provide further technical details on how its system works, as none of them is keen on admitting to censoring content at the request of US and EU governments.