Members of the local Muslim community enter the Al Noor mosque after it reopened in Christchurch on March 23, 2019. Muslims returned on March 23 to Christchurch's main mosque for the first time since a white supremacist launched a massacre of 50 worshippers there, as New Zealand sought to return to normal following the searing tragedy. (Photo: AFP/William West)
The council said it was suing the French branches of the two tech giants for "broadcasting a message with violent content abetting terrorism, or of a nature likely to seriously violate human dignity and liable to be seen by a minor," according to the complaint, a copy of which was seen by AFP.
In France, such acts can be punished by three years' imprisonment and a 75,000 euro (US$85,000) fine.
Facebook said it "quickly" removed the live video showing the killing of 50 people by a white supremacist in twin mosque attacks in Christchurch on March 15.
But the 17-minute livestream was shared extensively on YouTube and Twitter, and Internet platforms had to scramble to remove reposted videos of the gruesome scene.
The CFCM, which represents several million Muslims in France, said it took Facebook 29 minutes after the beginning of the broadcast to take it down.
Major Internet platforms have pledged to crack down on the sharing of violent images and other inappropriate content through automated systems and human monitoring, but critics say this is not working.
Internet platforms have cooperated to develop technology that filters child pornography, but have stopped short of joining forces on violent content.
A US congressional panel last week called on top executives from Facebook and YouTube, as well as Microsoft and Twitter, to explain the online proliferation of the "horrific" New Zealand video.
The panel, the House Committee on Homeland Security, said it was "critically important" to filter the kind of violent images seen in the video.