New Rules for Online Safety and Fairness: Punch and Judy
For those of you who remember children’s puppet shows (Ireland’s ‘Wanderly Wagon’, the UK’s ‘Punch and Judy’, France’s ‘Guignol’ and so on), you may well remember kids in the audience shouting at the puppets on stage ‘he’s behind you’ and the characters shouting back ‘oh no he isn’t’. Right now I feel like Big Tech is on stage, ‘Punch and Judy’ style, and the kids are shouting ‘he’s behind you’. Who is behind them? Two pieces of legislation: the Digital Services Act (DSA) and the Digital Markets Act (DMA).
The landscape of digital services is significantly different today from that of 20 years ago. Online intermediaries have become vital players in digital transformation. Online platforms in particular have 1) created significant benefits for consumers and innovation, 2) facilitated cross-border trading within and outside the Union, and 3) opened up new opportunities to a variety of European businesses and traders. At the same time, online platforms can be used as a vehicle for disseminating illegal content, or selling illegal goods or services online. Some very large players have emerged as quasi-public spaces for information sharing and online trade. They have become systemic in nature and pose particular risks for users’ rights, information flows and public participation. Thus, these regulations are intended to accomplish two goals: first, to create a safer digital space in which the fundamental rights of all users of digital services are protected; and second, to establish a level playing field to foster innovation, growth, and competitiveness, both in the European Single Market and globally.
Many of the organisations affected by these acts claim that they have been complying with self-regulatory codes for years (such as the EU Code of Conduct on Countering Illegal Hate Speech Online) and are now struggling with the overheads of dealing with all these new pieces of legislative oversight and with all the agencies associated with them. The new regulations address illegal and harmful content by getting platforms such as the GAFAM companies (Google, Amazon, Facebook/Meta, Apple, and Microsoft) to more readily ‘take down’ inappropriate content. There are suggestions, however, that this is the EU interfering with, and taking control of, markets. The truth is that the organisations now under scrutiny were not doing all that they could on content regulation and fair play – so the interference seems justified and warranted.
“He’s behind you”.
The DSA regulates everything from “dark patterns” and algorithms to public safety threats and illegal content. The DSA prohibits misleading interfaces that prompt users to make decisions they might not otherwise make, and compels large platforms to comply with stricter obligations on disinformation, political ad transparency and hate speech, among other things. It also provides options for users to opt out of behavioral algorithms, bans micro-targeted advertising, and requires large platforms to undertake annual analysis and reporting with respect to what the EU says are the systemic risks of their services. Its scope includes:
· Intermediary services (e.g., network infrastructure services, internet access providers, domain name registrars),
· Hosting services such as cloud and web services,
· Online platforms such as online marketplaces, app stores, collaborative economy platforms and social media platforms,
· Very large online platforms (VLOPs) and very large online search engines (VLOSEs), i.e., platforms reaching more than 45 million users – 10% of the 450 million consumers in Europe.
The European Commission summarises the key principles enshrined in these regulations in the table below:
The DMA applies only to the largest digital platforms, known as “gatekeepers”, and mandates rules preventing these larger players from imposing unfair conditions on businesses and consumers. For example, a company like Amazon wouldn’t be allowed to rank products on its site in a way that gives Amazon’s own products and services an advantage.
“Oh no he isn’t… Oh yes he is”
Unlike prior discretionary standards, both the DMA and DSA come with teeth. The European Commission is given the power to carry out investigations, sanction breaches, and update the laws’ obligations as needed. For the DSA, platforms that reach more than 10% of the EU’s population (45 million users) are considered systemic in nature, and are subject not only to specific obligations to control their own risks but also to a new oversight structure. This new accountability framework will comprise a board of national Digital Services Coordinators, with special powers for the Commission in supervising very large platforms, including the ability to sanction them directly. This builds on the country-of-origin principle – a similar enforcement regime to that applied under the GDPR. However, this enforcement regime also recognises a number of layers of accountability and auditability. The first is the power of the Commission and the local authorities themselves to directly influence audit (mandated annually), meaningful reporting, and enforcement. The next level recognises the importance of researchers’ output, journalist investigations and civil society investigations (e.g., ProPublica), as transparency and accountability levers have been built into both the DSA and DMA. And then finally there are the users themselves, who have also been empowered with rights under the DSA.
Finally: What does this all mean for those working in data protection and privacy?
The IAPP suggests that competition and data protection laws are increasingly intertwined in the world of digital marketplaces. Privacy professionals need to be aware of these developments to ensure that companies comply with these requirements in a manner that satisfies not only the DMA and the DSA but also the EU GDPR. Some examples presented by the IAPP of these crossovers are:
· Growing concerns about targeted advertising based on tracking online behavior have led to an obligation to “refrain from combining personal data sourced from these core platform services with personal data from any other services offered by the gatekeeper or with personal data from third-party services, and from signing in end-users to other services … in order to combine personal data, unless the end-user has been presented with the specific choice and provided consent”.
· A prohibition to require “business and end users to subscribe to or register with any other core platform services …” offered by the gatekeeper, thereby obviously limiting the amount of personal data that gatekeepers can accumulate.
· An obligation to “allow end-users to un-install any preinstalled software applications” on the platform.
· An obligation to “provide effective portability of data generated through the activity of a business user or end user … and in particular to provide tools for end-users to facilitate the exercise of data portability”.
· An obligation to provide “real-time access and use to aggregated and non-aggregated data”, with a specific reference to the need to comply with the GDPR and its consent requirements for access to personal data.
· An obligation for gatekeepers to provide third-party providers of online search engines with access to “query, click and view data,” subject to anonymization for personal data.
· A requirement for gatekeepers to “take the necessary steps to either enable business users to obtain (any) required consent to their processing (of personal data) where required” or to “provide duly anonymized data where appropriate”.