Tech companies will have to meet new European Union (EU) requirements to curb illegal content and disinformation on their platforms. 

This comes after negotiators reached a landmark deal on how Europe governs the Internet: EU lawmakers have agreed on new rules requiring tech giants such as Google, Twitter and Facebook, among others, to do more to moderate illegal content on their platforms.

Under the wide-ranging Digital Services Act (DSA), a company can be fined up to 6% of its global turnover for violating the rules, which would amount to roughly $7bn (£5.9bn) in the case of Facebook’s owner, while repeated breaches could result in a tech firm being banned from doing business in the EU.

EU countries have agreed on the broad terms of the DSA, which will force tech companies to take greater responsibility for content that appears on their platforms. New obligations include removing illegal content and goods more quickly, explaining to users and researchers how their algorithms work, and taking stricter action on the spread of misinformation.

“Today’s agreement on the Digital Services Act is historic, both in terms of speed and of substance. The DSA will upgrade the ground rules for all online services in the EU,” said European Commission President Ursula von der Leyen in a statement. “It gives practical effect to the principle that what is illegal offline should be illegal online. The greater the size, the greater the responsibilities of online platforms.” 

Executive Vice-President for a Europe Fit for the Digital Age, Margrethe Vestager, added: “With the DSA we help create a safe and accountable online environment… Platforms should be transparent about their content moderation decisions, prevent dangerous disinformation from going viral and avoid unsafe products being offered on marketplaces. With today’s agreement we ensure that platforms are held accountable for the risks their services can pose to society and citizens.”

The DSA is the second part of the EU’s massive project to regulate tech companies. In a press release, the European Parliament said as part of the act, the European Commission and member states will have access to the algorithms of large online platforms. Illegal content will be removed swiftly, and online marketplaces will be made safer. 

Perhaps the most important difference between the new EU rules and those in the US, where most of the Internet companies are based, relates to liability for material posted by third parties on platforms.

In the US, host companies enjoy near-total immunity from liability for the material posted by third parties. Under the EU’s new rules, these companies could face prosecution if they are notified that third-party content on their site contravenes laws in the EU and then fail to take action to remove it. 

Large companies could face fines of up to 6% of their worldwide turnover for non-compliance, and repeated breaches could get them banned from doing business in the EU. 

The companies also face a yearly fee of up to 0.05% of worldwide annual revenue to cover the costs of monitoring their compliance, although smaller companies will be exempted. Whether this exemption will provide some larger organisations a loophole to evade the regulatory fees remains to be seen. 
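To make the scale of these caps concrete, here is a minimal sketch of how the 6% fine ceiling and the 0.05% supervisory fee ceiling scale with a company's turnover. The two percentages come from the figures above; the example turnover figure is an assumption chosen to roughly match the ~$7bn fine the article cites for Facebook's owner, not an official number.

```python
# Illustrative sketch of the DSA's penalty and fee ceilings.
# The 6% and 0.05% rates are from the article; the turnover used
# below is an assumed example, not an official figure.

FINE_CAP_RATE = 0.06    # fines of up to 6% of worldwide annual turnover
FEE_CAP_RATE = 0.0005   # yearly supervisory fee of up to 0.05%

def max_fine(annual_turnover: float) -> float:
    """Upper bound on a DSA fine for non-compliance."""
    return FINE_CAP_RATE * annual_turnover

def max_supervisory_fee(annual_turnover: float) -> float:
    """Upper bound on the yearly fee funding compliance monitoring."""
    return FEE_CAP_RATE * annual_turnover

# Assumed turnover of ~$117bn, roughly the scale implied by the
# article's $7bn figure for Facebook's owner.
turnover = 117e9
print(f"Maximum fine: ${max_fine(turnover) / 1e9:.1f}bn")
print(f"Maximum yearly fee: ${max_supervisory_fee(turnover) / 1e6:.1f}m")
```

At that assumed turnover, the fine ceiling works out to about $7.0bn, while the yearly supervisory fee ceiling is far smaller, in the tens of millions, which is why the fee is framed as cost recovery rather than a penalty.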

The platforms will be made more transparent, and special care will be taken to protect minors, according to the EU. Dark patterns, which are tactics that mislead people into giving personal data to companies online, will also be prohibited. “As the law is finalised and implemented, the details will matter. We look forward to working with policymakers to get the remaining technical details right to ensure the law works for everyone,” said Google in a statement. 

Although the legislation only applies within the EU, the effect of these laws will certainly be felt in other parts of the world, and global technology companies may decide it is more cost-effective to implement a single strategy for policing content, taking the EU’s comparatively stringent regulations as their benchmark. 

The EU has detailed the types of organisation that will be governed by the DSA: 

  • Intermediary services offering network infrastructure: Internet access providers, domain name registrars.
  • Hosting services such as cloud computing and web-hosting services.
  • Very large online search engines, reaching more than 10% of the 450 million consumers in the EU, which therefore bear more responsibility for curbing illegal content online.
  • Online platforms bringing together sellers and consumers such as online marketplaces, app stores, collaborative economy platforms and social media platforms.
  • Very large online platforms, with a reach of more than 10% of the 450 million consumers in the EU, which could pose particular risks in the dissemination of illegal content and in societal harms.

This legal framework is a bid to crack down on counterfeit luxury goods, fake medication and illegal rentals, meaning that online marketplaces such as Amazon, Airbnb, eBay, AliExpress and Etsy will need to verify that they hold actionable information about the traders using their platforms.

The DSA will contain the following obligations on these organisations:

  • Enhanced supervision and enforcement by the EU Commission when it comes to very large online platforms. The supervisory and enforcement framework also confirms the important role of the newly created independent Digital Services Coordinators and the Board for Digital Services.
  • Measures to assess and mitigate risks, such as obligations for very large platforms and very large online search engines to take risk-based action to prevent the misuse of their systems and undergo independent audits of their risk management systems.
  • New measures to empower users and civil society, including the possibility to challenge platforms’ content moderation decisions and seek redress, either via an out-of-court dispute mechanism or judicial redress.
  • Provision of access for vetted researchers to the key data of the largest platforms, and access for NGOs to public data, to provide more insight into how online risks evolve.
  • Measures to counter illegal goods, services or content online, including a mechanism for users to easily flag such content and for platforms to cooperate with so-called ‘trusted flaggers’.
  • New obligations on traceability of business users in online marketplaces.
  • Transparency measures for online platforms on a variety of issues, including on the algorithms used for recommending content or products to users.
  • Mechanisms to adapt swiftly and efficiently in reaction to crises affecting public security or public health.
  • New safeguards for the protection of minors and limits on the use of sensitive personal data for targeted advertising.

The DSA will distinguish between tech companies of different sizes, placing greater obligations on bigger companies. The largest, those with at least 45 million users in the EU, such as Meta and Google, will face the most scrutiny.

Source: Cyber Security Intelligence