The ruling follows the EU’s Digital Services Act (DSA), which came into operation across all 27 member states yesterday and requires platforms to report their user numbers by 17 February 2023.
European Commissioner and executive vice-president Margrethe Vestager said: “Online platforms are at the core of some of the key aspects of our daily lives, democracies, and economies. It’s only logical that we ensure that these platforms live up to their responsibilities in terms of reducing the amount of illegal content online and mitigating other online harms, as well as protecting the fundamental rights and safety of users.”
The new legislation will directly affect all platforms, but there will be tougher measures on those with more than 45 million users in the EU – a group likely to include Amazon, Apple, Google, Meta (including Facebook and WhatsApp), Microsoft, Netflix, Spotify, Twitter and others.
Vestager’s fellow European Commissioner Thierry Breton, who was CEO of France Télécom before it became Orange, said on Twitter: “Social media platforms will no longer behave like they are ‘too big to care’. Whether they have feathers or not.”
The DSA follows the EU’s Digital Markets Act (DMA), which was agreed in March 2022. The two acts share two main goals, said the Commission: “to create a safer digital space in which the fundamental rights of all users of digital services are protected; [and] to establish a level playing field to foster innovation, growth, and competitiveness, both in the European Single Market and globally.”
Under the new DSA, the Commission will decide whether a platform should be designated a very large online platform or search engine. If so, the platform will have a further four months – until June 2023 – “to comply with the obligations under the DSA, including carrying out and providing to the Commission the first annual risk assessment exercise”, said the Commission.
The DSA applies to all digital services that connect consumers to goods, services or content, said the Commission. “It creates comprehensive new obligations for online platforms to reduce harms and counter risks online, introduces strong protections for users’ rights online, and places digital platforms under a unique new transparency and accountability framework.”
The EU described the DSA as a “first-of-a-kind regulatory toolbox globally” that “sets an international benchmark for a regulatory approach to online intermediaries”.
It applies to all online services, “but a special regime is introduced for platforms with more than 45 million users”, said the Commission. That threshold is equivalent to 10% of the EU’s population.
“For such very large online platforms or search engines, further obligations include wide-ranging annual assessments of the risks for online harms on their services – for example with regard to exposure to illegal goods or content or the dissemination of disinformation.”
They will have to put in place “suitable risk mitigation measures”, and they will be “subject to independent auditing of their services and mitigation measures”.
The Commission and national regulators will set up a European Board for Digital Services to manage the process.