Netsafe urges Kiwis to report 'Momo Challenge', violent game blamed for child deaths overseas

February 21, 2019

Only one complaint has been made in New Zealand regarding harm caused by the horror online character Momo, according to online safety organisation Netsafe.

The “Momo Challenge" is an online game where users are sent violent images via WhatsApp and then given a series of tasks to complete. The tasks progressively become more violent and the final task has been reported as being suicide or extreme violence.

The avatar used is a horror image of a woman with bulging eyes and frightening features.

The game has been linked to several child deaths in Colombia, according to UK magazine The Week.

Netsafe CEO Martin Crocker says although the character Momo has been quite heavily discussed in the media, Netsafe has only received one report regarding harm caused by it.

“That’s the thing, people are worried about it, that’s what Netsafe is here for and that’s what gives us material to go to multinationals with,” he says.

He says worried parents or teachers should approach Netsafe in the first instance if they are concerned about anything disturbing they see online, including the Momo Challenge.

Mr Crocker said that platforms such as Facebook, Twitter, Instagram and YouTube work constructively with Netsafe under New Zealand’s Harmful Digital Communications Act.

“I think people would be surprised how many hosts around the world try and adhere to New Zealand censorship laws, not because they have to but because they want to do the right thing,” said Mr Crocker.

He said that because of the professional relationship Netsafe has with online hosts, it is able to address concerns more quickly.

“Netsafe is officially recognised as a trusted flagger by Google so we have some ability to cut through the reporting process and it goes further up the queue,” he said.

In the July-September 2018 quarter, YouTube removed almost eight million videos through its flagging process, according to its transparency report.

Earlier this month, Instagram vowed to remove all images of self-harm after a 14-year-old UK girl’s death was linked to viewing images on the social media platform.
