One year ago, the European Commission and four major social media platforms announced a Code of Conduct on countering illegal online hate speech. It included a series of commitments by Facebook, Twitter, YouTube and Microsoft to combat the spread of such content in Europe.
On 1 June the European Commission released the results of an evaluation of the Code of Conduct, carried out by NGOs and public bodies in 24 EU countries.
This is the second time that the Code has been evaluated. The results of the first evaluation were published in December 2016.
The results show that, one year after its adoption, the Code of Conduct has delivered some important progress, while some challenges remain. Some of the key points are:
- On average, the social media platforms responded to notifications of illegal hate speech by removing the content in 59% of cases. This is more than twice the rate of 28% recorded six months earlier.
- The share of notifications reviewed within 24 hours improved from 40% to 51% over the same six-month period. Facebook is, however, the only company that fully achieves the target of reviewing the majority of notifications within 24 hours.
- Compared with six months ago, the social media platforms have become better at treating notifications from citizens in the same way as those from organisations using trusted reporter channels. However, some differences in removal rates persist: overall removal rates remain lower when a notification originates from the public.
- Finally, the monitoring showed that while Facebook sends systematic feedback to users on how their notifications have been assessed, practices differ considerably among the social media platforms. The quality of feedback explaining the reasons for a decision is an area where further progress can be made.
For more information: