Ambulance taking a man from outside a mosque in central Christchurch, New Zealand
Image courtesy: abcnews.go.com

The horrific mass shootings at New Zealand mosques have ignited strong demands for YouTube, Facebook and Twitter to regulate violent hate speech on their platforms.

The shootings left at least 49 people dead, making this the worst mass shooting in New Zealand’s history. The shooter reportedly posted a 74-page anti-Muslim manifesto to Twitter warning of “white genocide.” He posted similar content on 8chan, a discussion site where users frequently post hateful content. Then on Friday, someone posted that he would “carry out an attack against the invaders” and would even livestream the attack via Facebook, Reuters reported.

It is clear that online platforms played a huge role in these shootings. One of the shooters even live-streamed the attack on Facebook: a 17-minute video of him shooting multiple people in the Al Noor Mosque in Christchurch. Many are now demanding that these platforms do more to prevent hate speech from being posted on their sites.

Authorities at Christchurch, New Zealand
Image courtesy: MARK BAKER/AP/REX/SHUTTERSTOCK

There’s strong demand for online platforms to better regulate hate speech

Facebook, Twitter and YouTube have responded to the shootings by deleting any footage of the incident.

The Facebook live stream video has been taken down, though it’s not known how quickly Facebook deleted it. Facebook spokeswoman Mia Garlick said: “Police alerted us to a video on Facebook shortly after the live stream commenced, and we quickly removed both the shooter’s Facebook and Instagram accounts and the video. We’re also removing any praise or support for the crime and the shooter or shooters as soon as we’re aware.”

As for Twitter, the platform suspended the profile of the alleged attacker. YouTube also said it is working vigilantly to remove any footage related to the attacks.

Nevertheless, related content remained accessible even a couple of hours after the shooting. For instance, some users re-uploaded the video or posted the text of the manifesto. The continued availability of this content has law enforcement worried about copycat crimes.

Sajid Javid, Britain’s Home Secretary, who oversees public safety and security, directly called out the social media platforms. He demanded they do more to stop this kind of violence from spreading online. He tweeted: “You really need to do more @YouTube @Google @facebook @Twitter to stop violent extremism being promoted on your platforms. Take some ownership. Enough is enough.”

Facebook, YouTube and Twitter all say they work hard to keep violent hate speech off their platforms. Sadly, even with those efforts, they were unable to prevent the attack from being streamed online and the footage from circulating for hours afterward.

Would stricter regulations have prevented the New Zealand shootings?

This raises the question: Would the shootings have been prevented if Facebook and other platforms had done more to remove hate speech? The answer is unclear, but stricter moderation would certainly have limited the spread of the hateful propaganda. The social media platforms may not be the direct cause of the New Zealand shootings, but they undeniably played a significant role in spreading the attacker’s hateful rhetoric.

Our hearts go out to everyone affected by this horrific event.
