Facebook took down more than 12 million pieces of terrorist content on its social network between April and September, the company disclosed on Thursday. Facebook defines terrorist content as posts that praise, endorse or represent ISIS, al-Qaeda and their affiliate groups.
The removal of the terrorist content is part of an ongoing effort by Facebook to rid its service of harmful content, which also includes misinformation, propaganda and spam.
Facebook said, “We measure how many pieces of content (such as posts, images, videos or comments) we took action on because they went against our standards for terrorist propaganda, specifically related to ISIS, al-Qaeda and their affiliates.”
The company said it removed 9.4 million pieces of terrorist content during the second quarter and another 3 million posts during the third quarter. By comparison, the company in May announced that it had removed 1.9 million posts during the first quarter of 2018.
“Terrorists are always looking to circumvent our detection and we need to counter such attacks with improvements in technology, training, and process,” the company said in a blog post. “These technologies improve and get better over time, but during their initial implementation such improvements may not function as quickly as they will at maturity.”
Much of the material removed was old, but Facebook said it took down 2.2 million new terrorist posts in the second quarter and 2.3 million in the third quarter, up from 1.2 million in the first quarter.
Facebook explained that it has focused its efforts on removing terrorist content before it is viewed by a wide audience. With that focus in mind, Facebook has reduced the median time between when a user first reports a terrorist post and when Facebook takes it down. That median time was 43 hours in the first quarter, but fell to 22 hours in the second quarter and 18 hours in the third quarter.
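The median takedown times above are simple order statistics over per-post report-to-removal intervals. As a minimal sketch of how such a metric could be computed (the sample interval lists and the function name are illustrative assumptions, not Facebook's data):

```python
from statistics import median

def median_takedown_hours(intervals):
    """Median time, in hours, from user report to removal.

    `intervals` is a hypothetical list of per-post durations; only the
    quarterly medians (43, 22, 18) come from the article itself.
    """
    return median(intervals)

# Illustrative samples chosen so the medians match the reported figures.
q1 = [10, 43, 96]
q2 = [5, 22, 80]
q3 = [4, 18, 60]

for label, hours in [("Q1", q1), ("Q2", q2), ("Q3", q3)]:
    print(label, median_takedown_hours(hours))
```

Tracking a median rather than a mean keeps the metric robust to a few posts that linger for a very long time before removal.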
The company said it has relied on machine learning technology to detect terrorist content. In most cases, that terrorist content is reviewed and removed by trained humans, but the machine learning technology can remove content on its own if its “confidence level is high enough that its ‘decision’ indicates it will be more accurate than our human reviewers,” the company said.
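The routing rule described here, that is, auto-remove when the classifier's confidence clears a threshold, otherwise escalate to a human reviewer, can be sketched as follows. The names and the 0.99 cutoff are illustrative assumptions, not Facebook's actual system:

```python
# Hypothetical threshold: the point at which the model's decision is
# believed to be more accurate than a human reviewer's.
AUTO_REMOVE_THRESHOLD = 0.99

def route_flagged_post(confidence: float) -> str:
    """Route a post the classifier has flagged as terrorist content."""
    if confidence >= AUTO_REMOVE_THRESHOLD:
        return "auto_remove"   # machine removes the post on its own
    return "human_review"      # trained reviewer makes the call

print(route_flagged_post(0.995))  # auto_remove
print(route_flagged_post(0.70))   # human_review
```

Keeping lower-confidence cases with human reviewers limits wrongful removals while still letting the model act instantly on the clearest violations.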