GNP Guia Naghi & Partners The Legal 500 – The Clients Guide to Law Firms

Cristina Badea – The Hard-Knock Life of Online Intermediaries in Romania – The Case of Liability for User’s Discriminatory Content

1. Context – Liability exemptions for online intermediaries under the eCommerce Directive

The eCommerce Directive[1], transposed into Romanian law by Law no. 365/2002 on e-commerce, exempts online intermediaries from liability for illegal content published by their users, under certain conditions. Intermediaries that provide hosting services for their users (such as online marketplaces) and play a passive role in relation to their users’ publications are thus not liable for users’ illegal content. However, if the intermediary becomes aware of illegal user content, usually through a notification received from a third party, it must act as quickly as possible to remove or disable access to that content – hence the notice-and-takedown procedures that most platforms have in place. Awareness of illegal content must be actual: in the YouTube/Cyando[2] case, the CJEU ruled that “abstract knowledge that protected content is being made available illegally on its platform” is not enough to trigger liability. The case concerned copyright, but the interpretation should apply equally to other illicit user actions on online hosting platforms.

Moreover, art. 15(1) of the eCommerce Directive prohibits Member States from imposing on intermediaries a general obligation to monitor hosted content or a general obligation to actively seek facts or circumstances indicating illegal activity.

As such, the main rules seem simple: a) the owner of an online platform that hosts third-party content has no obligation to generally monitor and actively moderate that content, and b) to dodge (secondary) liability, it must act quickly to disable illegal content once it becomes aware of it.

However, the recent practice of the National Council for Combating Discrimination (“NCCD”), confirmed by the court system, goes in a different direction. In the following, I will analyze two cases handled by the NCCD, one of which was upheld by the Supreme Court.

2. Online intermediaries before the NCCD – the practice so far

The NCCD is the public authority in charge of combating discrimination in Romania.[3] It engages in a broad range of activities aimed at raising awareness of discrimination against vulnerable groups, and these appear to work – the number of discrimination complaints filed with the authority has risen steadily in recent years.[4] While the NCCD’s goals are honorable, and their attainment is much needed, one cannot help but disagree with its approach to online platform liability for user-generated content containing discriminatory texts.

The NCCD has analyzed (at least) two cases of online platforms’ liability for users’ racist content.[5] In both cases, the platforms were online marketplaces that allowed users to publish offers to sell, buy or rent various products. They appear to be passive intermediaries, which under EU law should mean that they are not liable for users’ racist content of which they were not made aware.

A. The first case was settled by the NCCD in 2016 and by the Supreme Court in December 2020. The object of the complaint before the NCCD was “the discriminatory posts on an auto selling site, by the sellers of vehicles”. The online marketplace was directly accused of discrimination committed through a seller’s offer, which excluded an ethnic minority from its prospective buyers. The online marketplace relied on the liability exemption under the eCommerce Law (and the eCommerce Directive).

The NCCD, however, considered that the defendant “allows its users to post offers for the sale of cars, that contain discriminatory texts” and that “it can and has the obligation to conduct all the due diligence necessary, by technical and personal support, to eliminate the discriminatory offers”. It therefore imposed a RON 4,000 fine on the company running the marketplace. That, however, is the extent of the reasoning. The decision says nothing about the eCommerce Directive or Law, nor about the liability exemption, and provides no legal basis for the obligations it considered incumbent on an online marketplace. There is no mention of case law or of the general law governing liability.

This decision appears thin enough to fall in court, especially given existing EU case law. Surprisingly enough, however, it was upheld by the Supreme Court in its December 2020 ruling[6], which provided an even more questionable reasoning. The Court considered that the company running the marketplace had the “obligation of a priori control, regarding the publishing of offers from users to whom it gave the means for online publication”. In other words, the court held that the marketplace has a general obligation to censor, prior to publication, any offer that may contain racist texts.

But what about the eCommerce regime? This time, the defendant received its answer, though not at all the one it expected. The court rejected its defence very briefly, holding in a rather spiteful tone that the defendant had brought only “mere assertion with no logic or evidence” and that the liability exemption invoked was “a subjective and personal interpretation” that “extrapolates the meaning of legal norms for a situation in which they are inapplicable”.

B. The second case was settled by the NCCD in March 2022. It again concerned an offer published by a marketplace user, this time to rent a room, excluding people belonging to an ethnic minority. This decision echoes the Supreme Court’s 2020 ruling and holds that the company running the marketplace “had the obligation to censor, verify and limit the offers with such discriminatory content in order to prevent their publication”. The NCCD imposed a RON 2,000 fine on the company.

3. Why are the decisions wrong?

First, given the need to develop case law and legal discourse on the digital environment, both authorities and courts need to provide more thorough and well-grounded reasoning for their decisions. Online platforms may differ considerably from one another, and there is an acute need for nuanced distinctions between them and the legal obligations they may or may not have. Moreover, parties must know why they are found liable and why their legal arguments are dismissed. The reasoning provided by the NCCD and the courts so far is insufficient.

Second, the decisions mentioned above seem to rest on a wrong interpretation of a landmark case before the European Court of Human Rights (“ECHR”) – Delfi v. Estonia, decided on June 16, 2015.[7] The judgment is a long but interesting read, together with the concurring and dissenting opinions, as it presents the ECHR’s view on the liability of an online newspaper for racist and hateful comments published by its users. It looks into EU law and the interplay between freedom of speech and the right to dignity, through the lens of intermediary liability.

The ruling essentially stated that an online newspaper may be fined for user comments containing hate speech. However, the ECHR expressly mentioned that its reasoning applied only to Internet news portals, considered active intermediaries, and did not concern other fora on the Internet. Moreover, it considered that a notice-and-takedown procedure followed by rapid removal of the comments could have sufficed to avoid liability, which has been interpreted as excluding an obligation of prior monitoring of comments.[8] The ECHR also did not analyse how the eCommerce Directive applies, since that competence remains with the national courts.

As such, Delfi v. Estonia is not (directly) applicable when analysing the liability of online marketplaces or other intermediaries. There are further arguments to this effect, which should be considered by the NCCD and the courts when analysing discrimination by users of online platforms.

4. Conclusion

It is disconcerting that the NCCD’s approach is that “websites” must prevent and disable, apparently even without any actual knowledge on the part of the platform holder, any content that may be discriminatory. This goes against the prohibition on states imposing a general obligation to monitor and censor allegedly illicit content, as well as against the liability exemption under the eCommerce Directive. It also runs counter to the Recommendation issued by the Council of Europe Committee of Ministers[9], and to CJEU case law and the correct reading of ECHR case law.

To prevent NCCD fines for user content, online platforms should have filter systems in place and regularly revise how they apply them, together with any other moderation actions they take manually. They should also have clear notice-and-takedown procedures in force, and their terms and conditions should clearly inform users that illicit content is prohibited. If these measures do not ward off an NCCD fine, they may at least prove useful in court.
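By way of illustration only, a minimal sketch of such a pre-publication filter might flag listings that match a blocklist of discriminatory phrases and hold them for human review. The function name, blocklist entries and review workflow below are hypothetical, not any platform’s actual system:

```python
# Illustrative sketch of a keyword-based pre-publication filter for
# marketplace listings. BLOCKLIST entries are placeholders; a real
# system would maintain a curated, language-specific list and pair
# automated flagging with human review and notice-and-takedown.
BLOCKLIST = [
    "no members of",       # hypothetical exclusionary phrase
    "only for ethnicity",  # hypothetical exclusionary phrase
]

def flag_listing(text: str) -> bool:
    """Return True if the listing text matches a blocklisted phrase
    and should therefore be held for human review before publication."""
    lowered = text.lower()
    return any(phrase in lowered for phrase in BLOCKLIST)

print(flag_listing("Room for rent, no members of X need apply"))  # True
print(flag_listing("Room for rent, city center, utilities included"))  # False
```

A simple substring match like this over-blocks and under-blocks in equal measure, which is precisely why the filter should route flagged offers to a human moderator rather than reject them automatically.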

[1] Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market

[2] See the CJEU Judgment dated June 22, 2021, in joined Cases C‑682/18 and C‑683/18, para. 111.


[4] Available in Romanian.

[5] NCCD Decision 662/26.10.2016 and NCCD Decision 188/23.03.2022.

[6] The Supreme Court, Administrative and Fiscal Division, Decision no. 6749/11 December 2020

[7] ECHR Grand Chamber, Case of Delfi AS v. Estonia, Application no. 64569/09

[8] As also mentioned here. However, the ECHR mentions in its ruling that Internet news portals may prevent or rapidly remove comments that amount to hate speech (see for example par. 158). Other judges took issue with the idea of preventing comments, which would imply a general obligation to monitor. Others argued that this ruling may lead to collateral censorship (i.e., while not legally enshrining censorship, public authorities sanction private Internet news portals for not censoring their users) – see Joint Dissenting Opinion of Judges Sajo and Tsotsoria.

[9] See Recommendation CM/Rec(2018)2 of the Committee of Ministers to member States on the roles and responsibilities of internet intermediaries, Recommendation CM/Rec(2011)7 on a new notion of media, Declaration on freedom of communication on the Internet.
