Shootings in New Zealand: what should social networks do to their far right content?

Experts say social networks are being used to promote “toxic and violent ideologies”, although right-wing groups have in the past denounced what they consider “censorship” of their activity on these platforms.

As he opened fire indiscriminately on hundreds of people, he broadcast it live to the whole world through Facebook.

Brenton Tarrant was formally charged with murder on Saturday as the main suspect in the mass shootings at two mosques in Christchurch, New Zealand, which left at least 50 people dead.

The 28-year-old Australian had previously been identified as the person who streamed 17 minutes of video of the attack on the Al Noor mosque through a camera mounted on his head.

What followed was a race against time for the platforms to remove the footage as quickly as possible, as it was widely shared and redistributed after the attacks.

From social networks, the footage reached the front pages of some of the world’s biggest news websites as still images, GIFs and even the full video.

Once again, the situation highlighted how sites such as Twitter, Facebook, YouTube and Reddit struggle, often unsuccessfully, to manage the far-right content on their platforms.

Image caption Brenton Tarrant broadcast the attack on the Al Noor mosque live on Facebook. (Image copyright: Facebook)

As the video continued to spread, some users posted messages asking to stop sharing.

“That’s what the terrorists wanted,” some said.

What was shared?

The video, which shows a first-person view of the murders, was widely disseminated in networks.

  • About 10-20 minutes before the attack in New Zealand, someone posted in the /pol/ section of 8chan, an image-based discussion forum popular among supporters of the alt-right. The post included a link to the suspect’s Facebook profile, where he announced he would be broadcasting live, and a document filled with hate messages.


  • The document, as Bellingcat analyst Robert Evans points out, is packed with “huge amounts of content, most of it ironic, low-quality trolling” and memes intended to distract and confuse.


  • The suspect also referenced a meme during his live broadcast. Before opening fire, he shouted “Subscribe to PewDiePie”, referring to a campaign to keep the YouTube star’s channel as the platform’s most subscribed. PewDiePie has been involved in controversies over racist comments in the past, which is why some believe the attacker knew that mentioning his name would provoke an online reaction. PewDiePie later said on Twitter that he felt “disgusted” that the attacker had used his name.

At least 50 people were killed in shootings at two mosques in New Zealand.
  • The attacks were broadcast live on Facebook and, although the original video was deleted, it was quickly replicated and spread on other platforms such as Twitter and YouTube.


  • Some users reported that the video could still be found online. Although the networks acted quickly to remove the original and its copies, new uploads continued to appear on YouTube faster than the platform could delete them.


  • Several Australian media outlets, as well as some major newspapers around the world, broadcast part of the video footage of the shooting.


  • Ryan Mac, a BuzzFeed technology reporter, compiled a timeline of where he had been able to watch the video, including a verified Twitter account with 694,000 followers where, he reported, it remained visible for two hours.

What were the reactions?

While many people downloaded and shared the video, others expressed their discomfort and urged people online not only to refrain from sharing it, but not even to watch it.

“Please do not circulate the video of the terrorist shooting our brothers and sisters. That’s what he wanted,” wrote Omar Suleiman, president of the Yaqeen Institute for Islamic Research, based in Texas, United States.

Many people were particularly angry with the media that published the images.

Channel 4 News anchor Krishnan Guru-Murthy, for example, named two British newspaper websites and accused them of stooping to “a new low in clickbait” (the technique of chasing clicks through sensational or misleading headlines and content).

BuzzFeed reporter Mark Di Stefano also reported that MailOnline had allowed readers to download the attacker’s 74-page “manifesto”. The website later deleted the document and issued a statement calling it “an error”.

The editor of the British newspaper Daily Mirror, Lloyd Embley, also tweeted that they had removed the images and that their publication was not “in line with our policy on terrorist propaganda videos.”

How did the social media companies respond?

All the social media companies sent condolences to the victims of the shootings and reiterated that they act as quickly as possible to remove inappropriate content from their platforms.

“The police alerted us to a video on Facebook shortly after the live broadcast began and quickly removed the video and the Facebook and Instagram accounts of the attacker,” said Mark Zuckerberg’s company.

“We are also removing any praise or support for the crime and the attacker or attackers as soon as we become aware of it,” the company added.

“Our hearts are shattered by the terrible tragedy of today in New Zealand, please know that we are working and watching closely to eliminate any violent video,” YouTube posted.

As for what they have done previously to combat the threat of the far right, the record of the social media companies is mixed.

Twitter moved to remove alt-right accounts in December 2017. It had previously suspended and then reinstated the account of Richard Spencer, the American white nationalist who popularized the term “alt-right”.

Facebook, which suspended Spencer’s account in April 2018, admitted at the time that it was difficult to distinguish between hate speech and legitimate political discourse.

Image caption Social media accounts of the white supremacist Richard Spencer were suspended in the past, although some were reactivated shortly afterwards. (Image copyright: Getty Images)

This month, YouTube was accused of being incompetent and irresponsible in its handling of a video promoting the banned neo-Nazi group National Action.

British MP Yvette Cooper said the video platform had repeatedly promised to block the video, only for it to reappear online.

What has to happen now?

Ciaran Gillespie, a political scientist at the University of Surrey in the United Kingdom, believes the problem in this controversy goes far beyond a single video, however shocking its content may have been.

“It’s not just a question of broadcasting a live massacre. Social media platforms were quick to remove it, and there’s not much they can do about who shares it, given the nature of the platforms. More important is what happens before that,” he said.

Image caption At least 49 people were killed in the Christchurch shootings that were partially broadcast via Facebook Live. (Image copyright: Getty Images)


As a political researcher, he uses YouTube often and says the platform frequently recommends far-right content to him.

“There is a lot of this kind of content on YouTube, and there is no way of knowing how much. YouTube has dealt well with the threat posed by Islamist radicalization, because it considers that content illegitimate, but there is not the same pressure to remove far-right content, although it poses a similar threat,” he said.

“There will be more calls for YouTube to stop promoting racist and far-right channels and content.”

“Legitimate controversy”

His views are shared by Bharath Ganesh, a researcher at the Oxford Internet Institute in the United Kingdom.

“Deleting the video is obviously what needs to be done, but social networking sites have allowed far-right organizations to have a place to debate and there has not been a coherent or integrated approach to managing it,” he said.

“There has been a misguided tendency to defer to freedom of expression, even when it is obvious that some people are promoting toxic and violent ideologies.”

Image caption The well-known YouTuber PewDiePie has been involved in numerous controversies over anti-Semitic and Nazi references in some of his videos.

Now social media companies must “take the threat posed by these ideologies much more seriously”, he added.

“It can mean the creation of a special category for the far right, recognizing that it has a global reach and global networks.”

Neither underestimates the scale of this task, especially because many people with far-right views are experts at what Gillespie calls “legitimate controversy”.

“Some will discuss the threat posed by Islam, and will acknowledge that it is controversial, but will say that it is a legitimate argument to make,” he said.

These gray areas will be extremely difficult for social media companies to manage, both researchers say, but after the tragedy in New Zealand, many believe they must try harder to confront this reality.

About author

Rava Desk

Rava is an online news portal providing recent news, editorials, opinions and advice on day to day happenings in Pakistan.

