Instagram apologises after Christchurch attack video 'meme' left online despite multiple complaints

A "meme" version of the Christchurch terrorist attack video had to be reported multiple times before Instagram took it down, with the social media site repeatedly saying it didn’t breach their guidelines.

It took direct intervention from New Zealand police for the video to be removed over the weekend.

Instagram has now apologised and says it's launching an investigation to find out why it wasn't removed earlier.

The terrorist attack on Christchurch mosques on March 15 was streamed live on Facebook, and the video of the attack has since been deemed objectionable by the New Zealand Classification Office.

As a result, possessing or distributing the video is illegal.

Since then, Facebook says it's taken down more than 4.5 million versions of the footage.

But a video posted on Instagram late last week, described as a "meme" version of the video, remained online for more than 24 hours before being taken down.

In that time, it was reported multiple times by multiple people. Each time, they were advised the video wouldn't be removed as it "likely doesn't go against our Community Guidelines".

Facebook, which owns and operates Instagram, admits that wasn't right.

"We have blocked and disabled the account which shared this post, and apologise this content wasn’t removed sooner," a spokesperson told 1 NEWS.

"Imagery which glorifies violence is not allowed on Instagram, and we remove the vast majority of extreme content before it is reported by our community."

It's understood this wasn't the first time the poster had shared the video, with one commenter asking: "Why the other one get deleted for? It was funny asf lmao".

Kiana, one of the people who reported the video, says she's frustrated it took so long for the video and account to be removed.

"The way it was posted as a 'meme' like it was a video game was truly sickening and they had so many comments treating it like it was a joke," she says.

"I was pretty pissed off [with Instagram] if I'm honest, it was like an absolute joke that they didn’t take it seriously."

The video was then reported to the New Zealand police. Within an hour, the entire account was deleted.

Superintendent Mike Johnson, Acting Assistant Commissioner for Investigations, Serious and Organised Crime, says police have "strong relationships" with major platforms like Facebook, which help them escalate these kinds of issues.

"In this instance, the content came through internally to our social media team and it was immediately raised with Instagram for review," he says.

"Once we were made aware of the content, which we believed breached the community guidelines, it was removed within an hour."

People who find the video online should first report it to the platform it's published on, but they can also report it to police and Netsafe, Mr Johnson says.

Facebook says it's investigating why the video wasn't immediately removed after being reported by the community. 

As well as acting on community reports, Facebook adds new versions of the video to a database of digital fingerprints, which is used to match and automatically remove future uploads.
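Facebook has not detailed its matching system here, but the general technique it describes is fingerprint matching: compute a compact hash of known footage, store it, and compare hashes of new uploads, tolerating small differences from re-encoding or edits. The sketch below is an illustration of that general idea only, not Facebook's actual system; the average-hash fingerprint, distance threshold, and frame data are all hypothetical.

```python
# Minimal sketch of fingerprint-based matching for flagged footage.
# Illustrative only: the hash scheme, threshold, and frames are
# hypothetical, not Facebook's real pipeline.

def average_hash(frame):
    """Fingerprint a grayscale frame (2D list of 0-255 values) as a
    bit tuple: 1 where a pixel is brighter than the frame's mean."""
    pixels = [p for row in frame for p in row]
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

def hamming(a, b):
    """Count of differing bits between two fingerprints."""
    return sum(x != y for x, y in zip(a, b))

class FingerprintDatabase:
    """Stores fingerprints of known objectionable frames and checks
    new uploads against them."""
    def __init__(self, max_distance=3):
        self.known = []
        self.max_distance = max_distance  # tolerance for re-encodes/edits

    def add(self, frame):
        self.known.append(average_hash(frame))

    def matches(self, frame):
        h = average_hash(frame)
        return any(hamming(h, k) <= self.max_distance for k in self.known)

# Hypothetical usage: an 8x8 downscaled frame from a flagged video.
db = FingerprintDatabase()
original = [[(r * 8 + c) * 4 % 256 for c in range(8)] for r in range(8)]
db.add(original)

# A re-encoded copy with slight pixel noise still matches, so the
# upload would be removed automatically without a new report.
noisy = [[min(255, p + 2) for p in row] for row in original]
print(db.matches(noisy))  # True
```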

Facebook is one of the signatories to the Christchurch Call, an international commitment to stamp out violent extremist content online.
