Brussels' Tech Crackdown: Inside the EU's Expanding War on Major Platforms


From X to TikTok to Chinese e-commerce giants, the Digital Services Act has become Europe's most powerful weapon against Big Tech—with billions in fines hanging in the balance

September 30, 2025

While Meta's impending charges under the European Union's Digital Services Act have captured headlines, the social media giant is just one target in Brussels' unprecedented regulatory assault on the world's largest technology platforms. The European Commission has opened formal investigations into at least 10 major platforms, with preliminary findings already issued against several—each facing potential fines of up to 6% of their global revenue.
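The 6% ceiling is a simple proportion of worldwide annual turnover. As a minimal sketch of the arithmetic (the revenue figures below are hypothetical placeholders, not reported numbers for any company):

```python
def max_dsa_fine(global_annual_revenue: float) -> float:
    """Upper bound on a DSA fine: 6% of worldwide annual turnover."""
    DSA_FINE_CAP = 0.06
    return global_annual_revenue * DSA_FINE_CAP

# Hypothetical revenue figures (in billions) for illustration only.
for name, revenue_bn in [("Platform A", 130.0), ("Platform B", 40.0)]:
    print(f"{name}: maximum fine = ${max_dsa_fine(revenue_bn):.1f}bn")
```

Even at the hypothetical low end, the cap runs into billions of dollars, which is why preliminary findings alone move markets.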

The enforcement pattern reveals a coordinated effort to reshape how digital platforms operate in Europe, with American tech companies and Chinese e-commerce giants bearing the brunt of regulatory scrutiny. As investigations multiply and preliminary charges mount, the DSA is emerging as one of the most consequential—and controversial—pieces of technology legislation in history.


X: The First to Face Preliminary Findings

Elon Musk's X (formerly Twitter) holds the dubious distinction of being the first platform to receive preliminary findings of DSA violations. In July 2024, the European Commission formally accused X of breaching the regulation in three critical areas, marking a significant escalation in the EU's enforcement efforts.

The Charges Against X

Blue Checkmark Deception: The Commission determined that X's paid verification system constitutes a "dark pattern" that deceives users. Before Musk's 2022 acquisition, blue checkmarks indicated that Twitter had verified an account's authenticity—typically reserved for celebrities, politicians, and journalists. After the takeover, X began selling verification badges for approximately €7 per month to anyone willing to pay.

European regulators argue this fundamentally undermines users' ability to distinguish authentic accounts from imposters. "Back in the day, BlueChecks used to mean trustworthy sources of information," said European Commissioner Thierry Breton. "Now with X, our preliminary view is that they deceive users and infringe the DSA."

The Commission's investigation found that malicious actors have exploited this system to deceive users, particularly during sensitive moments like elections or breaking news events, when determining source credibility is crucial.

Advertising Opacity: X's advertisement repository—a database that platforms must maintain showing who paid for ads and how they targeted users—was deemed non-compliant. The Commission found that X's ad database is neither searchable nor reliable, with design features and access barriers that defeat its transparency purpose.

This lack of transparency prevents researchers from analyzing how advertising campaigns spread, who funds them, and whether they target vulnerable populations—information considered essential for detecting coordinated disinformation campaigns and protecting democratic processes.

Blocking Researchers: Perhaps most damaging, the Commission accused X of deliberately obstructing academic research. The DSA requires platforms to provide eligible researchers with access to public data so they can study how platforms work and how online risks evolve.

However, X prohibits independent data access through scraping, while the process to request access through the company's API is prohibitively expensive and bureaucratically complex. Researchers told investigators they often have no viable option but to pay disproportionately high fees—if they can gain access at all. Many simply abandon their research projects rather than navigate X's obstacles.


Political Implications

The X investigation carries significant geopolitical weight. Musk has accused the EU of attempting a "secret deal" to suppress content, while the Commission denies any such arrangement. The investigation has become a flashpoint in tensions between the Trump administration—where Musk holds significant influence—and European regulators.

Reports suggest the EU is preparing to issue X with a fine potentially reaching $1 billion, though the exact amount remains undisclosed. President Trump has publicly attacked the DSA as an assault on American free speech and has warned of retaliatory measures against European companies.

The investigation continues into other aspects of X's operations, including whether the platform adequately combats illegal content like hate speech and terrorism incitement, and whether its crowd-sourced Community Notes fact-checking feature effectively counters misinformation.

TikTok: Multiple Investigations, Multiple Threats

ByteDance's TikTok faces a more complex web of DSA proceedings than perhaps any other platform, with investigations examining everything from advertising transparency to election interference to the psychological manipulation of children.

The Advertising Repository Failure

In May 2025, the European Commission issued preliminary findings that TikTok violated the DSA's requirement to maintain a searchable advertisement repository. This marked the first formal charge against TikTok under the regulation and could result in fines up to 6% of the company's global revenue.

The Commission found that TikTok fails to provide essential information about advertisements on its platform, including who paid for them, how users were targeted, and what content was displayed. More critically, TikTok's ad repository doesn't allow comprehensive searches based on this information, severely limiting its usefulness for detecting scam advertisements, coordinated disinformation campaigns, and election interference.

"Transparency in online advertising—who pays and how audiences are targeted—is essential to safeguarding the public interest," said Executive Vice-President Henna Virkkunen when announcing the preliminary findings.

The timing is particularly sensitive. The advertising transparency investigation stems from a probe launched in February 2024, but Commission officials acknowledged that TikTok's failure to maintain a functional ad repository made it significantly harder to assess potential misuse during election campaigns—including the contentious Romanian presidential elections.

Election Interference Concerns

In December 2024, the Commission opened a separate formal investigation into TikTok's handling of election-related risks, specifically examining whether the platform failed to properly assess and mitigate threats to electoral integrity during Romania's presidential race.

That election was marred by allegations of Russian disinformation campaigns using TikTok to amplify pro-Kremlin candidate Călin Georgescu, who unexpectedly surged in polls. The investigation examines whether TikTok's risk assessment and mitigation measures were adequate, whether its content moderation policies effectively addressed coordinated inauthentic behavior, and whether the platform provided sufficient transparency about how its algorithms amplified political content.

Protecting Children—Or Addicting Them?

The Commission's February 2024 investigation also examines whether TikTok's design features violate protections for minors. Specific concerns include:

Algorithmic Addiction: Regulators question whether TikTok's recommendation systems create "rabbit-hole effects" that expose children to increasingly extreme content, and whether gamification features stimulate behavioral addictions.

Age Verification Failures: The Commission is scrutinizing whether TikTok's age assurance methods adequately prevent children from accessing age-inappropriate content or from being targeted with manipulative advertising.

Privacy and Data Protection: Investigators are examining whether TikTok ensures adequate privacy and security protections for underage users, particularly given concerns about data sharing with TikTok's Chinese parent company.

The defunct TikTok Lite rewards program—which allowed users to earn points by performing tasks on the platform—became the subject of emergency proceedings in April 2024. The Commission argued the feature was launched without proper risk assessment and could create addictive behaviors. TikTok agreed to withdraw the feature from EU markets in August 2024, effectively settling that particular aspect of the investigation.


TikTok's Defense

TikTok has maintained it takes DSA obligations seriously and is committed to compliance. However, the company has criticized what it views as unclear standards, saying it disagrees with some of the Commission's interpretations and arguing that regulators are relying on preliminary findings rather than clear, public guidelines.

The multiple, overlapping investigations put TikTok in a particularly precarious position. Unlike X or Meta, which face one or two major probes, TikTok must defend against charges spanning advertising transparency, election interference, child protection, addictive design, and researcher access—any one of which could result in massive fines.

Chinese E-Commerce: Temu and AliExpress Under Fire

The EU's regulatory crosshairs have increasingly focused on Chinese e-commerce platforms, reflecting broader European concerns about product safety, consumer protection, and the flood of cheap goods entering the single market.

Temu: Illegal Products and Addictive Shopping

Temu, the ultra-discount shopping platform that exploded in popularity across Europe, received preliminary findings of DSA violations in July 2025. Despite only entering European markets in 2023, Temu now claims over 92 million monthly active users in the EU—making it one of the fastest-growing platforms ever to face DSA scrutiny.

The Commission's investigation produced damning findings. A mystery shopping exercise conducted by EU officials found that consumers shopping on Temu were "very likely to find non-compliant products among the offer, such as baby toys and small electronics." The platform's risk assessment was described as "inaccurate," relying on general industry information rather than specific analysis of Temu's own marketplace.

"We shop online because we trust that products sold in our Single Market are safe and comply with our rules," said Executive Vice-President Virkkunen. "In our preliminary view, Temu is far from assessing risks for its users at the standards required by the Digital Services Act."

The investigation extends beyond product safety to examine Temu's use of gamification features—including wheel-spinning games, countdown timers, and reward systems—that regulators believe create addictive shopping behaviors. These "potentially addictive design" elements are suspected of harming users' "physical and mental well-being" by encouraging impulsive purchases and overconsumption.

Consumer protection groups have been particularly vocal about Temu. "Just to give you an idea, every year the number of parcels that enter into the EU through TEMU doubles," explained a director at the European Consumer Organisation (BEUC). "So we're talking about a significant number of products that enter the EU which are not compliant. And consumers do not know about that."

The investigation also examines Temu's recommendation systems, transparency around how products are promoted to users, and whether the platform provides adequate data access to researchers studying its impact on consumer behavior.

AliExpress: Partial Settlement, Ongoing Concerns

Alibaba's AliExpress presents a more nuanced case. The platform, which has operated in Europe longer than Temu and has more established compliance systems, has taken a different approach: offering commitments to settle some allegations while continuing to contest others.

In June 2025, the Commission accepted binding commitments from AliExpress to address concerns about advertising and recommender system transparency. The platform agreed to enhance how it discloses targeted advertising and to improve its content flagging mechanism.

However, the Commission simultaneously issued preliminary findings that AliExpress breached its obligation to assess and mitigate risks related to illegal products. Despite the partial settlement, AliExpress remains under investigation for multiple alleged violations, including inadequate risk management systems, deficient content moderation, insufficient trader traceability, and restricted researcher data access.

The parallel track of partial settlement and ongoing enforcement illustrates the EU's flexible approach: platforms can earn credit by demonstrating good faith efforts to comply, but serious violations will still result in formal charges and potential fines.

The Broader Chinese Platform Problem

The aggressive enforcement against Chinese platforms reflects several concerns beyond DSA compliance:

Product Safety Crisis: European officials worry that the ultra-cheap business models of Temu, AliExpress, and similar platforms create incentives to bypass safety regulations. Products that would be rejected at traditional retail checkpoints flood into Europe through millions of individual parcels.

Economic Disruption: Traditional European retailers complain they cannot compete with platforms that undercut prices by exploiting regulatory arbitrage and circumventing product safety requirements.

Environmental Impact: Critics argue that ultra-fast fashion and disposable goods promoted by these platforms fuel overconsumption and environmental harm. France has proposed legislation explicitly targeting Temu and Shein for encouraging "throwaway culture."

Data Security: While not primarily a DSA concern, European intelligence officials worry about Chinese platforms' data collection practices and potential surveillance risks, particularly as user numbers grow exponentially.

The Pornography Platform Dragnet

Perhaps the most surprising expansion of DSA enforcement targets adult content platforms. The Commission has opened investigations into Pornhub, XNXX, XVideos, and Stripchat—all focused on alleged failures to protect minors from accessing explicit content.

These investigations examine age verification systems, content moderation practices, and whether the platforms adequately assess risks that minors might access their services. Some platforms have challenged their designation as "Very Large Online Platforms," arguing the Commission's user count methodology is flawed, but courts have largely upheld the EU's authority to regulate these services.

The investigations reflect the DSA's expansive scope—it's not just about social media and e-commerce, but about any platform with sufficient users to pose "systemic risks," regardless of the content they host.

Shein: The Next Target

While not yet facing formal charges, fast-fashion giant Shein has been designated as a VLOP and faces intensive scrutiny. In February 2025, the Commission sent a request for information demanding details about Shein's systems for preventing illegal product sales and its recommendation algorithms.

Given Shein's business model—ultra-cheap clothing produced rapidly in response to trending designs—many observers expect formal proceedings similar to those facing Temu and AliExpress. European consumer groups have filed complaints arguing Shein uses manipulative design features and inadequate trader transparency.

The Enforcement Pattern: What It Reveals

Examining the full scope of DSA enforcement reveals several consistent patterns:

American Social Media vs. Chinese Commerce

The investigations cluster around two categories: American social media platforms (X, Meta's services, potentially others) and Chinese e-commerce platforms (Temu, AliExpress, Shein). This geographic concentration has fueled accusations that the DSA disproportionately targets non-European companies while giving preferential treatment to EU-based platforms.

European officials reject this characterization, noting they enforce against platforms based on user numbers and compliance failures, not national origin. However, critics point out that nearly all VLOPs facing serious enforcement actions are American or Chinese, while major European platforms have largely avoided similar scrutiny.

Transparency as a Weapon

Nearly every investigation includes allegations that platforms fail to provide adequate transparency—about advertising, content moderation, recommendation algorithms, or research access. This reflects the DSA's core philosophy: if the Commission can't see how platforms operate, it can't assess whether they're managing risks properly.

But critics argue this transforms transparency requirements into a tool for extracting information that regulators then use to build cases against platforms. Companies complain they're trapped in a catch-22: provide data that enables enforcement against you, or face charges for failing to provide transparency.

The "Preliminary Finding" Pressure Campaign

The Commission has been strategic about issuing preliminary findings—formal accusations that precede final decisions. These findings generate negative publicity, create uncertainty for investors, and pressure platforms to settle by offering binding commitments.

X is so far the only platform to receive preliminary findings across multiple violation categories (dark patterns, ad transparency, researcher access), though Meta, TikTok, Temu, and AliExpress have each received preliminary findings on at least some allegations. The Commission argues this measured approach gives platforms opportunities to remedy problems before facing fines; critics see it as a calculated pressure campaign designed to force compliance without the procedural protections of formal adjudication.

No Fines Yet—But That May Change Soon

Despite dozens of ongoing investigations and multiple preliminary findings, the Commission has not yet issued a single DSA fine. This restraint may be tactical, allowing enforcement to mature before courts scrutinize the first penalties. However, industry observers expect the first fines to arrive in late 2025 or early 2026, potentially reaching billions of dollars.

The delay also reflects legal complexity. Companies have the right to defend themselves, review investigation files, and propose remedies. Each investigation generates thousands of pages of documentation. The Commission must build cases that can withstand inevitable court challenges from companies with unlimited legal resources.

The Compliance Cost Crisis

Beyond potential fines, the DSA imposes massive ongoing compliance burdens. Industry estimates suggest major platforms spend hundreds of millions of dollars annually just meeting DSA requirements:

  • Hiring thousands of content moderators with language expertise across 24 EU languages
  • Developing sophisticated systems to detect and categorize potentially illegal content
  • Conducting annual risk assessments and independent audits
  • Building advertisement repositories and researcher access portals
  • Creating detailed transparency reports every six months
  • Maintaining appeals processes and out-of-court dispute mechanisms

The Computer & Communications Industry Association estimates EU digital regulations cost U.S. tech firms billions annually in compliance expenses—effectively functioning as a tax on American innovation. Smaller platforms struggle even more, as compliance costs don't scale proportionately with revenue.

The Geopolitical Dimension

DSA enforcement has become inextricably linked to broader U.S.-EU tensions over technology governance, data privacy, and economic competition. The Trump administration views European digital regulations as disguised protectionism designed to handicap American companies while allowing European and Chinese competitors to operate under different standards.

FCC Chairman Brendan Carr has publicly criticized the DSA as "incompatible with America's free speech tradition." Vice President JD Vance has condemned EU enforcement as overreach. President Trump himself has threatened reciprocal measures against European companies and suggested the DSA violates international trade commitments.

The potential $1 billion fine against X—owned by Musk, a major Trump ally—has heightened these tensions. If the EU proceeds with substantial penalties against American platforms while Chinese platforms escape similar punishment, it could trigger a serious transatlantic trade dispute.

What Happens Next

As investigations mature and preliminary findings accumulate, several scenarios could unfold:

Scenario 1: The First Fines Drop
The Commission issues significant penalties against one or more platforms—likely X, given its advanced investigation status. This would test whether the DSA's enforcement mechanisms survive legal challenges and whether fines are large enough to compel behavior changes.

Scenario 2: Negotiated Settlements
Platforms facing preliminary findings offer binding commitments to address EU concerns, similar to AliExpress's partial settlement. This allows the Commission to claim victory while avoiding protracted court battles.

Scenario 3: Escalating Resistance
Major platforms, backed by the U.S. government, mount coordinated legal and political resistance. Court challenges could delay enforcement for years, while diplomatic pressure might force modifications to the DSA's application.

Scenario 4: Fragmentation Accelerates
Unable to meet conflicting requirements across jurisdictions, platforms increasingly operate different versions of their services in different regions. The global internet fractures into incompatible regional systems with divergent content rules.

The Fundamental Question

Beneath all the technical disputes about ad repositories, age verification, and researcher access lies a more fundamental question: Who should decide what content is acceptable online?

The DSA represents Europe's answer: democratically elected governments, through appointed regulators, should set standards and enforce compliance. American critics argue this grants government authorities dangerous control over speech and sets precedents that authoritarian regimes will exploit.

The outcome of ongoing DSA investigations will shape not just how major platforms operate in Europe, but how governments worldwide approach internet governance. If the EU successfully enforces its vision through billion-dollar fines and mandated product changes, other countries will likely adopt similar frameworks.

If, however, the DSA crumbles under legal challenges, political pressure, or practical unworkability, it may represent the high-water mark of government control over online platforms—a cautionary tale about regulatory overreach rather than a template for the future.

Conclusion: The Stakes Couldn't Be Higher

The expanding universe of DSA investigations represents far more than technical compliance disputes. It's a battle for the future of the internet itself—how it's governed, who controls it, and what role governments play in determining acceptable online behavior.

For the platforms under investigation, billions of dollars and their European business models hang in the balance. For the European Commission, the DSA's credibility depends on demonstrating it can enforce compliance even against the world's most powerful tech companies. For users, the outcome will determine how much content they can access, what information they can share, and how much surveillance they'll endure in the name of safety.

As Meta joins X, TikTok, Temu, AliExpress, and others in facing DSA enforcement, the pattern is clear: Brussels has declared war on Big Tech, and the casualties are just beginning to mount. Whether this crusade protects Europeans from genuine harms or represents dangerous government overreach into online speech may not be clear for years.

What is certain is that the Digital Services Act has fundamentally altered the relationship between technology platforms and government authority. The investigations proceeding through Brussels today will define digital governance for a generation—for better or worse.


The European Commission continues to expand DSA enforcement, with additional platforms likely to face formal proceedings in the coming months. Companies facing preliminary findings will have opportunities to respond before final decisions are made, though observers expect the first significant fines to arrive before the end of 2025.
