Question: How is the EU tackling incitement on the net?

As part of my parliamentary scrutiny role as a Member of the European Parliament, I can put questions to the European Commission, which is obliged to answer them.
Together with other Members, I put the following questions to the Commission:

Question with priority for written answer P-004488/2020 to the Commission

Subject: The Commission's efforts to tackle the spread of racism, hatred and incitement on major technology platforms such as Facebook.

Recent reports indicate that Facebook is not doing enough to combat racism, hatred and hate speech [1]. Facebook does not consider itself a news media company and is therefore not obliged to comply with journalistic standards; its commitment to combating hatred, incitement and racism is thus voluntary. Facebook's unwillingness to act against hate and incitement has also led to an advertising boycott by more than 400 brands [2].

1.    How does the Commission ensure that major technology companies such as Facebook adhere to standards that prevent and minimise the spread of hatred, incitement and racist ideologies on their platforms?

2.    What are the findings of the annual monitoring of Facebook under the EU Code of Conduct on Countering Illegal Hate Speech Online, and how does the Commission assess the effectiveness of the Code given Facebook's reluctance to tackle hatred, hate speech and racism?

3.    How does the Commission intend to effectively monitor, assess and curb the spread of hatred, incitement and racism on online platforms such as Facebook?



Answer given by Commissioner Didier Reynders on behalf of the European Commission on 03.11.2020:

To counter the spread of illegal hate speech online, the Commission agreed in 2016 with a number of IT companies, including Facebook, on an EU Code of Conduct[1]. Under the Code, the companies commit to reviewing user notifications within 24 hours and, where necessary, removing the content concerned. The Code also encourages cooperation with civil society organisations and national authorities.

The Commission regularly monitors the implementation of the Code of Conduct[2]. The latest evaluation shows that, on average, the IT companies review 90% of notifications within 24 hours and remove 71% of the content deemed illegal hate speech. While Facebook removed only 28% of such content in 2016, its removal rate has since improved to over 80%.

The Code of Conduct has not only brought progress in removing illegal hate speech, but has also fostered synergies between companies, civil society and Member State authorities. The results achieved with the Code of Conduct will feed into the ongoing reflection on the Digital Services Act[3]. The proposed Digital Services Act aims to harmonise and clarify the roles and responsibilities of online platforms in combating illegal content disseminated through their services, including illegal hate speech online. These new rules will also adequately protect the fundamental rights enshrined in the Charter of Fundamental Rights of the European Union, including freedom of expression online.


[2] The results of the latest round of assessments were published on 22 June 2020 and are available at

[3] The Digital Services Act was announced in the Commission's Communication on "Shaping Europe's Digital Future",