By Bente van Dijk
Examining the drafted European Media Freedom Act: Combating Harmful Content in Today's Digital Landscape
Media freedom and diversity are under pressure. Amid numerous threats to journalists and the spread of disinformation during the coronavirus pandemic and the war in Ukraine, hope seems to take the form of the European Commission's drafted European Media Freedom Act (EMFA). The EMFA is a landmark piece of legislation for Europe's digital rights. However, the EMFA as currently drafted may not be sufficiently effective in preventing the spread of harmful content via platforms. In this blog post, I will discuss the deficiencies of the drafted EMFA and offer recommendations to address its shortcomings in order to improve the protection of media freedom and pluralism in Europe.
Basis and legal framework
The EMFA is a set of rules to protect media pluralism and independence in the European Union (EU), as set out in the European Democracy Action Plan. Until now, media freedom has not been given an effective legal framework of its own at EU level. Pluralism and media freedom are usually areas reserved to the authorities of individual Member States, since they are in a better position to design media regulation according to their community needs, traditions and the specifics of domestic markets. However, several problems have arisen due to divergent approaches in Member States regarding media pluralism, causing fragmentation of the internal market. The EMFA now seeks to establish a European framework for the protection of media freedom and pluralism, as well as to combat disinformation and hate speech. The EMFA also introduces provisions that aim to ensure that digital platforms take appropriate measures to address harmful content.
In terms of content, the EMFA should relate to already existing forms of regulation. The EMFA builds on the Audiovisual Media Services Directive (AVMSD) by introducing a set of measures to protect media freedom and pluralism. Additionally, the EMFA amends and updates the AVMSD in order to strengthen the legal framework on disinformation and hate speech. The EMFA seeks to ensure that digital platforms are held accountable for the content they host, a topic on which the EMFA will complement the Digital Services Package: the Digital Services Act and the Digital Markets Act. Specifically, the EMFA includes measures to ensure that audiovisual media services are not used to disseminate content that incites violence, hatred or discrimination, or that incites terrorism or other forms of criminal activity. It also requires that audiovisual media services take appropriate measures to ensure that content that incites violence, hatred or discrimination is not made available to minors.
Omissions and points of improvement
Regarding content moderation and the dissemination of disinformation on platforms, the EMFA includes provisions for increased transparency in online platforms' algorithms and sets up mechanisms for users to report false information and receive corrections. While the EMFA proposes significant improvements in terms of content moderation and the dissemination of disinformation on platforms, there are still areas where it could be strengthened. Firstly, the EMFA should provide clearer definitions of disinformation and hate speech to ensure that platforms are consistent in their content moderation practices. This is especially important given the inconsistent application of the term hate speech in ECtHR case law since Handyside. Neither term is defined in Article 2. This lack of clarity can lead to inconsistent enforcement of the law and can create confusion among digital platforms as to what content is considered acceptable. Clearer definitions would also help to prevent any ambiguity or confusion about what constitutes harmful content.
While it is necessary to combat disinformation and hate speech, the EMFA's reliance on algorithms and automated tools raises concerns. The proposal should require platforms to be more transparent about their content moderation practices and algorithms, and to provide users with clear explanations of why their content was removed. This can be achieved by requiring media organizations to disclose their content moderation policies and practices, including how decisions are made and how appeals can be lodged. In addition, the EMFA should include stronger sanctions for platforms that fail to comply with its requirements. These could include fines, suspension or revocation of licenses, or even criminal penalties. Stronger sanctions would help to ensure that platforms take their responsibilities seriously and act in the best interests of users. Overall, it is crucial that the proposed EMFA strikes a balance between protecting media freedom and ensuring responsible content moderation practices. This would help build trust between platforms and users, and ensure that platforms are held accountable for their actions.
Article 17(1) of the drafted EMFA proposes to identify media service providers on a self-declaratory basis. The suggested prior notice system for self-declared media outlets creates quick and non-transparent procedures for certain privileged actors, which will have a major negative impact on the right to freedom of expression and information, and may even open the door to actors who intend to distort the democratic public debate. This mechanism would allow these self-appointed media, whether or not they actually meet the requirements of Article 17(1), to more easily disseminate disinformation, misinformation and propaganda with less oversight. Given the public interest in preventing the dissemination of disinformation, this article should be removed. These improvements would help to ensure that the EMFA is effective in combating disinformation and hate speech, and in promoting transparency and accountability in content moderation practices.
In conclusion, the EMFA has the potential to significantly impact media freedom and content moderation practices in the EU. However, there are several areas in which the EMFA can be improved to ensure that it is effective in tackling hate speech and disinformation. These areas include providing more clarity on the definitions of hate speech and disinformation, and providing clear guidance on how digital platforms should respond to hate speech and disinformation. Therefore, it is crucial that the proposal undergoes careful consideration and revision to ensure that it strikes a balance between protecting media freedom and ensuring responsible content moderation practices.