What Changes Are Coming to the Transatlantic Digital Landscape?
In late March, political agreement was reached on the European Union’s (EU) Digital Markets Act (DMA), followed a month later by the Digital Services Act (DSA). The implementation of the two acts over the coming months and years means that the EU is poised to reshape the digital landscape for millions, perhaps even beyond its borders. The final text of the DMA leaked on April 14 and is expected to pass a vote in early May, while the final details of the DSA are still forthcoming. Together, these bills promise greater openness and transparency for users, regulators, and researchers. Policymakers in the United States have identified challenges in the digital economy similar to those framed by the DSA and DMA and have proposed legislation that often parallels these EU initiatives. But US legislative action remains stymied by partisan debates and endless congressional hearings. While the transatlantic policy agenda is not exactly moving in lockstep, a window of opportunity may still be open given congressional focus on antitrust legislation. With US-EU cooperation—including on digital policy—heightened by the war in Ukraine and strengthened through the Trade and Technology Council, these major recent developments in platform regulation will be watched closely, including by those outside the tech policy world.
The EU’s Digital Agenda Charges Forward
The DMA is the EU’s attempt to address the use of dominant market power by certain companies to stifle competition. European officials often use the expression “breaking open, not breaking up” to describe the DMA’s approach to these digital behemoths—allowing space for competing products and promoting interoperability to create a more open digital environment and not, for example, dividing Meta (formerly Facebook) into separate companies for Facebook, WhatsApp, and Instagram.
The DMA targets “gatekeepers” of “core platform services”—search engines, app stores, social networks, video-sharing platforms, communications apps, or cloud services. Gatekeepers are identified as companies with an EU market capitalization of at least €75 billion and at least 45 million monthly users (about 10 percent of the EU population). The list of gatekeepers skews heavily—but not exclusively—toward Silicon Valley companies. The DMA sets out rules for these gatekeepers: they will not be able to favor proprietary services over similar third-party ones or use data collected from third-party sellers to offer competing products, and they will need to offer users a choice of search engine, virtual assistant, and web browser. Large platforms will no longer be able to combine user data from across their services to target advertising without explicit user consent. New companies will be able to enter a market more easily if, for example, consumers and developers have access to app stores that do not charge a high developer fee. Transparency requirements also mean that companies that advertise on these platforms will have more access to their marketing and performance data.
The case of interoperability—the ability of different platforms and services to communicate with each other—has been among the most debated aspects of the DMA and offers a good case study in the ongoing disputes around some of these requirements. The DMA presents interoperability as a way to lessen gatekeepers’ control over communication services and allow consumer choice. Opponents claim that interoperability that is both user-friendly and privacy-protecting is technically impossible. Some argue that interoperability that breaks end-to-end encryption could in fact strengthen the gatekeeper, which could market encryption as an enticing reason to stay inside one system. Another argument holds that if users of a service with weaker privacy protection are allowed to communicate with users of a paid app offering greater privacy, the paid messaging service would lose its source of revenue as new, outside users essentially gain access for free. Beyond these debates, there are outstanding questions about what interoperability would look like in practice: When sending a message from encrypted Service A to non-encrypted Service B, would a pop-up warn you that your message is leaving an encrypted service? What level of transparency is needed for users to decide whether a potential weakening of privacy is a worthwhile trade-off? And would this be more difficult to scale in a US context, given that Americans rely far more heavily on non-encrypted SMS than on messaging apps like Signal or WhatsApp? Such open-ended questions will have to be worked out as these new rules are implemented in practice.
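To make one of those questions concrete, the sketch below shows, in rough Python, how a messaging client might surface the kind of warning contemplated above when a message crosses from an end-to-end-encrypted service to a non-encrypted one. This is a minimal illustration only; the service names, data structure, and warning text are invented for the example, and neither the DMA text nor any platform prescribes this particular mechanism.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Service:
    """A hypothetical messaging service and whether it offers end-to-end encryption."""
    name: str
    end_to_end_encrypted: bool

def interop_warning(sender: Service, recipient: Service) -> Optional[str]:
    """Return a warning message if sending would take the message outside E2E encryption."""
    if sender.end_to_end_encrypted and not recipient.end_to_end_encrypted:
        return (
            f"Your message is leaving {sender.name}'s end-to-end encryption and will be "
            f"delivered to {recipient.name} without it. Send anyway?"
        )
    return None

# Example: a message sent from hypothetical encrypted "Service A" to non-encrypted "Service B"
print(interop_warning(Service("Service A", True), Service("Service B", False)))
```

Even this toy version raises the policy questions at stake: how prominent such a warning should be, who decides its wording, and whether users can meaningfully weigh the trade-off it describes.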
The DSA updates the 2000 e-Commerce Directive governing online platforms and marketplaces to harmonize the laws across EU member states and create one set of rules regarding illegal content, including how users notify platforms of illegal content and the subsequent actions that platforms must take. What is illegal offline is already illegal online—the DSA aims to ensure this principle holds in practice, too. The e-Commerce Directive laid out the liability regime for platforms, under which they have no general obligation to monitor for illegal content but are governed by a notice-and-action mechanism: once informed about illegal content, they must act or face liability. This would not change under the DSA. In contrast to the broader protections afforded by the US First Amendment, European countries have stricter speech laws, including hate speech laws and laws outlawing, for example, Holocaust denial. Partly because these speech rules differ across EU countries, outstanding questions remain about over-enforcement and preemptive blocking that could result in takedowns of otherwise legal speech.
Obligations in the DSA apply across online intermediaries, but special importance and extra requirements attach to very large online platforms (VLOPs) with over 45 million monthly users. These VLOPs have new due diligence obligations under the DSA and must assess the systemic risks that could be caused or exacerbated by their products or design. The DSA applies this notion to three main risks in the context of VLOPs: 1) the spread of illegal content; 2) negative impact on fundamental rights such as free expression and privacy; and 3) deceptive practices. These assessments must be followed by risk-mitigating measures, such as content moderation approaches (like warning labels) or codes of conduct, which can then be reviewed by independent auditors. The DSA also sets out parameters and procedures for crisis protocols, including the prominent display of information provided by member state authorities in the event of a public health emergency or natural disaster.
The DSA also requires different levels of transparency from platforms toward the public, regulators, and vetted researchers. These requirements mean platforms would provide data and information about their content moderation decisions, their advertising, and the way algorithmic amplification shapes what users see. Redress mechanisms also mean that users will be able to request information about why their content was removed. This more systematic approach seeks to move away from separate debates over individual pieces of content, and transparency around internal algorithmic decision-making will allow a more targeted understanding of how these algorithms present and personalize content and of the harms that may result. Outstanding questions remain about these algorithmic transparency requirements, including how to ensure the privacy of user data, and US observers have asked whether comparable transparency requirements for platforms’ algorithms would be constitutional under the US First Amendment (under which mandatory transparency could be viewed as a form of compelled speech). Overall, however, the DSA marks a shift from a largely self-regulatory approach to one with more binding rules. The self-regulatory Code of Practice on Disinformation, for example, will evolve into a co-regulatory instrument under the DSA.
As the DMA and DSA near the finish line, the question turns to enforcement. The DMA will be enforced by the EU Commission, with the potential for fines and penalties of up to 10 percent of global turnover, or 20 percent for repeat offenders. DSA enforcement includes fines of up to 6 percent of global turnover. VLOPs will be supervised by the EU Commission, which will charge a proportional supervisory fee of up to 0.05 percent of global annual income. National regulators will oversee smaller platforms, and new digital services coordinators will have oversight and investigatory powers. Questions remain, however, about the DSA’s proposed European board for digital services and the future relationship between national-level digital services coordinators and other national regulators.
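As a back-of-the-envelope illustration of what those ceilings mean, the snippet below applies the stated percentages to a hypothetical firm with €100 billion in global annual turnover (treating turnover and “global annual income” as the same base for simplicity). These are statutory maximums, not predictions of actual fines or fees.

```python
# Hypothetical firm: €100 billion in global annual turnover
turnover_eur = 100e9

dma_fine_cap        = 0.10   * turnover_eur   # up to 10% of global turnover under the DMA
dma_repeat_fine_cap = 0.20   * turnover_eur   # up to 20% for repeat offenders
dsa_fine_cap        = 0.06   * turnover_eur   # up to 6% of global turnover under the DSA
supervisory_fee_cap = 0.0005 * turnover_eur   # up to 0.05% of global annual income (VLOP fee)

print(f"DMA fine cap: €{dma_fine_cap/1e9:.0f}bn (repeat offenders: €{dma_repeat_fine_cap/1e9:.0f}bn)")
print(f"DSA fine cap: €{dsa_fine_cap/1e9:.0f}bn; supervisory fee cap: €{supervisory_fee_cap/1e6:.0f}m")
```

For such a firm, the ceilings would be €10 billion (or €20 billion for repeat offenders) under the DMA, €6 billion under the DSA, and a supervisory fee of up to €50 million.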
One issue hanging over centralized enforcement of the DMA at the EU Commission level is personnel: the initial draft proposed an 80-member enforcement team, a figure now widely seen as inadequate. The EU Commission’s chief competition economist is reported to have said that the commission will be “shorthanded” for the “first few years” of implementing the DMA—“a consequence of budgetary constraints.” Debates around enforcement, or under-enforcement, of the General Data Protection Regulation, where the data protection authorities in Luxembourg and Ireland lack the resources to pursue investigations, remain front of mind. In the United States, the Federal Trade Commission (FTC) lacks similar enforcement capabilities, as the Build Back Better agenda, which would have allotted significant resources to the FTC, failed to move forward. The level of specificity in the final draft and the resources devoted to enforcement, keeping in mind that gatekeeper firms will likely also challenge enforcement in court, should be considered when thinking about the DMA’s future impact.
The Antitrust Angle
In contrast to the EU, policymaking in the United States may seem like endless cycles of new bills that go nowhere and perpetual congressional hearings, with the November midterm elections dictating and narrowing the chances of passing major legislation. Nevertheless, one area of unexpected cooperation has been antitrust. Senator Amy Klobuchar’s focus on increasing competition has found a rare bedfellow in co-sponsor Senator Chuck Grassley and an apparently accommodating framework for Republican diatribes against so-called anti-conservative content moderation policies. In this context, two recent bills stand a better chance than most of becoming law, and it is worth examining the approach they take to platform regulation and the changes they would enact.
The American Innovation and Choice Online Act passed out of committee in the Senate in January 2022, despite some reservations from California Democrats who, while voting for the bill, have shown reticence toward bills targeting home-state companies, a pattern similarly observed in the House. It applies to platforms with a US market capitalization above $550 billion or with 50 million monthly active US users (or 100,000 monthly active business users) and would address conduct that disadvantages competitors, such as self-preferencing, in which a platform like Amazon or Google favors its own products and services. What would this look like in practice? Platforms would be sanctioned if they restrict interoperability or access to data, block users from uninstalling pre-installed apps, boost their own products in search rankings, or condition platform access on the purchase of other products. The law would be enforced by the FTC, which, along with the Department of Justice and state attorneys general, would be able to bring lawsuits.
The Open App Markets Act passed out of committee on an even stronger vote in February. It is slightly narrower in scope, targeting large app stores with over 50 million users, such as Apple’s and Google’s. The bill would loosen restrictions on app developers, allowing them to use alternative payment systems—a significant blow to the commissions of app store owners—and to access the store’s operating system. It would require parity in search results and prohibit the harvesting of data from third-party apps to build competing products. Users would be allowed to sideload apps—that is, download them from non-proprietary and alternative app stores—something that is currently unavailable on Apple devices.
As in the EU, companies in the United States have predictably protested that these bills would undermine privacy and the integrity of the user experience. Valid critiques about the risks of sideloading—Android devices, which already allow outside app downloads, see more malware than Apple devices, for example—or those raised by legal experts about potential blind spots in the laws can be hard to disentangle from lobbying talking points.
As midterm elections approach, time is of the essence for the antitrust bills in Congress. Despite the bipartisan support behind them, it is an open question to what extent these shared priorities will extend beyond November. There is strong opposition to the antitrust push among some Republicans, and there is a real possibility that the anti-Big Tech agenda could shift from competition to a focus on the “censoring” of conservatives and debates around Section 230, creating lots of noise but leaving the basic business model untouched.
What Is Next for Transatlantic Digital Policy?
With the DMA and DSA—along with the GDPR and the forthcoming Data Governance Act, Data Act, and AI Act—the EU is seeking to set the standard for digital legislation and to take advantage of the so-called Brussels effect, in which companies often end up adopting EU standards worldwide. Decisions imposed on platforms in the EU are likely to affect their operations globally, including in the United States. There is hope among US researchers that they will be able to piggyback on some of the transparency disclosures required by the DSA, regardless of whether similar requirements ever become law in the United States.
The coming months will see new debates and challenges arise around these pieces of legislation, which have the potential to change the online experience for hundreds of millions. The outcome may not satisfy those eager to reshape digital marketplaces and peer under the hood of these powerful platforms, but only time will tell whether these transatlantic efforts will achieve their desired ends.