Facebook: AI removes most ISIS and Al Qaeda content from the platform early


Facebook says that 99 percent of content related to the terror groups ISIS and Al Qaeda is removed from the platform before anyone flags it as terrorist content, and in some cases before it ever appears on Facebook.

Content attributable to ISIS or Al Qaeda is removed by automated systems that compare uploaded photos and videos with previously identified terrorist material. Machine learning is also used to recognize terrorist text. Once a specific piece of content has been confirmed as terrorist material, 83 percent of subsequently uploaded copies are deleted within an hour, Facebook said.
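The article does not describe Facebook's internal implementation, but the general idea of matching uploads against previously identified material can be pictured as a hash lookup. The Python sketch below is purely illustrative: the function names, the `KNOWN_TERROR_HASHES` set, and the use of an exact SHA-256 digest are assumptions; production systems rely on perceptual hashes that still match after re-encoding or cropping.

```python
import hashlib

# Hypothetical set of digests for media that reviewers have already
# confirmed as ISIS or Al Qaeda propaganda. In a real system this would
# be a large database of perceptual hashes, not exact digests.
KNOWN_TERROR_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def media_digest(data: bytes) -> str:
    """Return a SHA-256 digest of the raw media bytes (illustrative only)."""
    return hashlib.sha256(data).hexdigest()

def is_known_terror_content(uploaded_bytes: bytes) -> bool:
    """Check an upload against the set of previously identified content."""
    return media_digest(uploaded_bytes) in KNOWN_TERROR_HASHES

# An upload whose bytes match known material can be blocked before it
# ever becomes visible on the platform.
if is_known_terror_content(b"...raw video bytes..."):
    print("block upload and queue for review")
```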

Facebook acknowledges that the systems used for this are specifically designed to counter content from ISIS and Al Qaeda, making them less effective at finding and blocking content from other terrorist groups. Facebook is deliberately targeting these two groups because, according to the company, they pose the greatest threat worldwide. In the long run, Facebook hopes to expand the automated systems so that content from smaller, local terrorist groups can also be detected at an early stage.

In June, Facebook began using artificial intelligence to detect terrorist content. Also in June, Facebook and Twitter set up a working group to combat the spread of terrorist content. Among other things, the group maintains a database of hashes of removed extremist and terrorist content, so that the same content can be quickly recognized by other platforms.
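The shared database works on the same principle: once one company removes a piece of extremist content, it contributes a hash so that other participants can recognize re-uploads without exchanging the content itself. A minimal sketch, assuming a simple in-memory set standing in for the shared industry database and hypothetical helper functions:

```python
import hashlib

# Hypothetical shared hash store; in practice this is a database exchanged
# between participating companies, not a Python set.
shared_hashes: set[str] = set()

def contribute(removed_content: bytes) -> None:
    """A platform that removes extremist content shares only its hash."""
    shared_hashes.add(hashlib.sha256(removed_content).hexdigest())

def seen_elsewhere(upload: bytes) -> bool:
    """Another platform checks new uploads against the shared hashes."""
    return hashlib.sha256(upload).hexdigest() in shared_hashes

contribute(b"propaganda video removed by platform A")
print(seen_elsewhere(b"propaganda video removed by platform A"))  # True
```

Sharing hashes rather than the content itself lets companies flag re-uploads quickly without redistributing the material.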
