In wake of Christchurch terrorist attack, Facebook to train AI systems using police video

Facebook will work with law enforcement organisations to train its artificial intelligence systems to recognise videos of violent events as part of a broader effort to crack down on extremism.

Facebook's AI systems were unable to detect live-streamed video of the Christchurch terrorist attack.

The effort will use body-cam footage of firearms training provided by US and UK government and law enforcement agencies. The aim is to develop systems that can automatically detect first-person violent events without also flagging similar footage from movies or video games.

Facebook has been working to crack down on extremist material on its service, so far with mixed success.

A police officer stands guard in front of the Masjid Al Noor mosque in Christchurch. Source: Associated Press

The social media giant also plans to de-radicalise users who search for white supremacy pages by redirecting them away from those pages.

In a statement today, New Zealand's chief censor, David Shanks, said he welcomed "any move by any of the major internet platforms to address the threat posed by online extremism".

Facebook's announcement comes ahead of a US congressional hearing on how Facebook, Google and Twitter handle violent content.

