Twenty years after the e-Commerce Directive, the EU has significantly increased the number of regulations affecting online service providers. From the Digital Single Market Strategy in 2015 to the Commission’s vision for a digital transformation by 2030, the EU’s objective has been to create an environment where digital networks and services can prosper and where citizens, businesses, and civil society at large can benefit from them. As part of these strategies, the EU has adopted multiple regulations and directives over the last five years that affect content on digital platforms, with many more pieces of legislation to be implemented in the coming years.
If you are an online service provider, the following regulations can directly or indirectly impact you:
Digital Regulations on Online Service Providers
Regulation | Type | Date | Aim | Penalty |
---|---|---|---|---|
E-Commerce Directive | Directive | 2000 | Establish an internal market framework for information society services, including conditional liability exemptions for intermediaries that transmit or host illegal content. | Enforcement, penalties and sanctions vary from one member state to another. |
Code of Conduct on Hate Speech | Voluntary Code | 2016 | Prevent and counter the spread of illegal hate speech online. | N/A |
Code of Practice on Disinformation | Voluntary Code | 2018 | Commitments ranging from transparency in political advertising and the closure of fake accounts to the demonetisation of purveyors of disinformation. | N/A |
General Data Protection Regulation (GDPR) | Regulation | 2018 | Protect users’ personal data. Restrict non-consensual processing and movement of personal data. Facilitate business in the digital single market. | 20 000 000 EUR or 4% of the firm’s total worldwide annual turnover, whichever is higher (see the sketch below the table). |
Directive on copyright and related rights in the Digital Single Market | Directive | 2019 | Ensure fairer remuneration for creators and rightsholders, press publishers and journalists, in particular when their works are used online. Obligations include obtaining authorisation from rightsholders for content uploaded to the platforms of online content-sharing service providers. | Enforcement, penalties and sanctions vary from one member state to another. |
Audiovisual Media Services Directive (AVMSD) | Directive | 2020 | Provides EU-wide content standards for all audiovisual media, including video-sharing platforms. Protects minors against harmful content and reinforces protection against incitement to violence or hatred. | Enforcement, penalties and sanctions vary from one member state to another. |
Promoting fairness and transparency for business users of online intermediation services | Regulation | 2020 | Ensure that business users are granted appropriate transparency, fairness, and effective redress possibilities. Applies to online intermediation services and online search engines. | Member states shall ensure that effective, dissuasive, and proportionate penalties are applied to infringements. |
Terrorist Content Online Regulation (TCO) | Regulation | 2022 | Ensure that hosting service providers take down identified terrorist content within one hour of receiving a removal order from the relevant authorities. | 4% of the platform’s annual turnover. |
Digital Services Act | Regulation | 2023** | Establish transparency and accountability frameworks for platforms, and encourage innovation, growth and competition within the European market. Obligations include measures to counter illegal goods, services, or content online and to trace sellers of illegal goods. Audit and transparency measures as well as complaint mechanisms to be implemented. | 6% of the platform’s annual global turnover. Platforms that refuse to comply can be taken to court and given a temporary suspension. |
Data Governance Act | Regulation | 2023* | Outlines rules on who can use and access data generated in the EU. Applies to businesses, consumers, and the public sector. Facilitates data sharing across sectors and member states. | Administrative fines: 4% of total worldwide annual turnover or 20 000 000 EUR. Member states can also implement additional penalties. |
Proposal for a regulation on General Product Safety | Regulation | 2024* | Address product safety challenges. Enhance market surveillance of dangerous products in the EU. Increase protection of EU consumers. | 4% of the economic operator’s or online marketplace’s annual turnover. Member states may choose to impose periodic penalty payments. |
Proposal for a regulation on AI | Regulation | 2024* | Ensure that AI practices and systems on the market are safe and respect existing law, values, and fundamental rights. Establishes a list of prohibited AI systems whose use is considered unacceptable. Applies to providers of AI systems, users, and other participants across the AI value chain. | Non-compliance: 30 000 000 EUR or 6% of total worldwide annual turnover. Supply of incorrect, incomplete, or misleading information: 10 000 000 EUR or 2% of total worldwide annual turnover. |
Proposal for a Regulation laying down rules to prevent and combat child sexual abuse | Regulation | 2025* | Combat child sexual abuse material (CSAM) online, with obligations for online service providers to detect, report, and remove CSAM from their services. Obligations include mandatory risk assessment and risk mitigation measures, reduction of exposure to grooming, proactive content detection, effective removal, reporting obligations, data collection and transparency obligations, and a single point of contact. | 6% of the annual income or global turnover of the provider. |
Code of Conduct: Voluntary initiatives that establish self-regulatory standards to achieve an objective.
Directive: Requires EU countries to achieve certain objectives but gives the country flexibility in how they choose to do so. Countries must incorporate directives into national law.
Regulation: Legal acts that apply automatically and uniformly to all EU countries as soon as they are in force, without needing to be transposed into national law. They are binding in their entirety on all EU countries.
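The GDPR penalty in the table is expressed as the higher of a fixed amount and a share of turnover. As a minimal sketch of how that ceiling scales with company size, the snippet below computes the GDPR maximum (20 000 000 EUR or 4% of total worldwide annual turnover, whichever is higher) for a few turnover figures; the function name and the turnover values are illustrative assumptions, not figures from any actual case.

```python
def gdpr_fine_ceiling(worldwide_annual_turnover_eur: float) -> float:
    """Maximum GDPR administrative fine for the most serious infringements:
    20 000 000 EUR or 4% of total worldwide annual turnover, whichever is higher."""
    return max(20_000_000.0, 0.04 * worldwide_annual_turnover_eur)

# Hypothetical turnover figures, chosen purely for illustration.
for turnover in (100_000_000, 600_000_000, 5_000_000_000):
    ceiling = gdpr_fine_ceiling(turnover)
    print(f"turnover {turnover:>14,.0f} EUR -> fine ceiling {ceiling:>14,.0f} EUR")
```

Below 500 000 000 EUR in turnover the fixed 20 000 000 EUR amount sets the ceiling; above that, the 4% share dominates.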
The acceleration of legislation over the last few years, along with the many proposals still under development, reflects the rapidly changing expectations of different stakeholders in the digital environment. The pace of legislation shows no sign of slowing down, and in a dynamic environment like the internet, issues of trust & safety become ever more important and relevant.
More importantly, for businesses this creates a new and increasingly complex environment to navigate. As new obligations emerge, Tremau is committed to helping online platforms steer through the European legal ecosystem, both by clarifying how these laws affect their business and operational models and by providing solutions that ease their compliance efforts.
To find out more about how the TCO will impact you, contact us at info@tremau.com