EC Finds Meta and TikTok Breached Transparency Rules Under the DSA
An EU investigation has found that Meta and TikTok breached transparency rules under the DSA, raising fresh concerns about how the two social media giants handle data access and content moderation within the European Union.
The European Commission (EC) announced that both companies failed to comply with the Digital Services Act (DSA)—the EU’s landmark legislation designed to regulate online content and platform accountability. The findings accuse Meta and TikTok of restricting researcher access to data and making user reporting unnecessarily complex.
EC’s Findings: Meta and TikTok Under Fire
According to the Commission, Meta and TikTok breached transparency rules under the DSA by not providing adequate public data access for researchers studying illegal or harmful content. The EC described both companies' data access tools as "burdensome," preventing researchers from conducting effective studies on how users, particularly minors, are exposed to risks online.
These limited data-sharing processes, the EC said, undermine the ability to monitor how platforms manage harmful content, misinformation, and algorithmic influence.
Meta’s “Dark Patterns” and Content Reporting Issues
Meta’s social platforms, Facebook and Instagram, were singled out for violating DSA obligations to provide users with straightforward methods for reporting illegal content. Instead, the EC found multiple layers of unnecessary steps that discourage users from completing the reporting process.
The Commission also accused Meta of using “dark patterns”—design tactics that subtly manipulate users into making certain choices or abandoning reports. Such tactics, regulators argue, weaken users’ ability to hold the company accountable.
“Meta’s mechanisms to flag and remove illegal content may therefore be ineffective,” the EC noted, calling the user experience confusing and discouraging.
Flawed Appeals System for Users
Beyond reporting issues, Meta’s appeal mechanisms were criticized for being restrictive. The EC said EU users are not allowed to fully explain their cases or provide evidence during appeals of moderation decisions.
This limitation, the Commission argued, prevents users from defending themselves against unfair removals or restrictions, reducing transparency in Meta’s content governance.
TikTok’s Response: Data Access vs Privacy Law
TikTok responded by emphasizing its commitment to data transparency, stating that it has granted data access to nearly 1,000 research teams. However, the company expressed concern that the DSA and the General Data Protection Regulation (GDPR) may impose conflicting requirements.
“If it is not possible to fully comply with both, we urge regulators to provide clarity on how these obligations should be reconciled,” a TikTok spokesperson said.
TikTok maintains that it has invested heavily in secure data-sharing systems but insists that privacy protections must remain intact.
Meta’s Defense: “We’re Complying with the DSA”
Meta pushed back against the EC’s findings, claiming it has already updated its internal tools and processes to meet DSA compliance standards.
“We disagree with any suggestion that we have breached the DSA,” a Meta spokesperson stated. “We’ve introduced new content reporting options, appeals processes, and data access tools since the DSA came into force.”
Meta insists that its changes align with EU requirements and that ongoing discussions with regulators will further clarify any remaining issues.
Broader DSA Investigations
The findings stem from 2024 investigations targeting both TikTok and Meta. TikTok’s case focused on advertising transparency, data access, content moderation, and protection of minors. Meta’s investigation, meanwhile, explored whether Facebook and Instagram violated rules related to election integrity and algorithmic transparency.
The European Commission emphasized that these are preliminary findings, and further action—including potential fines—could follow depending on each platform’s compliance progress.
What the Digital Services Act Means for Platforms
The Digital Services Act (DSA) represents a major shift in how online platforms must operate in Europe. It requires companies like Meta and TikTok to:
- Provide researchers and regulators with access to public data.
- Offer simple, transparent ways for users to report illegal or harmful content.
- Ensure fair moderation and appeals processes.
- Avoid manipulative design practices ("dark patterns").
- Disclose how algorithms recommend or filter content.
Noncompliance with DSA rules can lead to fines of up to 6% of a company’s global annual revenue, meaning billions of euros for tech giants like Meta and TikTok.
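To put that 6% ceiling in concrete terms, the short sketch below works through the arithmetic. It is purely illustrative: the revenue figure is a hypothetical placeholder, not a number taken from either company's financial reports.

```python
# Illustrative only: the DSA caps fines at 6% of global annual revenue.
DSA_FINE_CAP = 0.06

def max_dsa_fine(global_annual_revenue_eur: float) -> float:
    """Return the maximum possible DSA fine for a given global annual revenue."""
    return DSA_FINE_CAP * global_annual_revenue_eur

# A platform with a hypothetical €100 billion in yearly revenue could face
# a fine of up to €6 billion.
print(f"€{max_dsa_fine(100_000_000_000):,.0f}")  # prints €6,000,000,000
```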
Potential Consequences for Meta and TikTok
If the EC’s final ruling confirms that Meta and TikTok breached transparency rules under the DSA, both companies could face severe penalties. These include:
- Heavy fines for violating transparency and data-sharing obligations.
- Operational restrictions within the EU market.
- Increased regulatory scrutiny over advertising, algorithms, and user safety.
The outcome could also set a precedent for other major platforms, signaling how strictly the EU intends to enforce digital accountability.
Why Transparency Matters Under DSA
Transparency is at the heart of the DSA’s mission. By ensuring platforms open up their systems for external scrutiny, regulators aim to:
- Protect minors and vulnerable users.
- Promote accountability in algorithmic decision-making.
- Support independent research on online safety.
- Curb misinformation and harmful content.
When platforms limit access to essential data, researchers and watchdogs cannot effectively assess risks or verify company claims. This lack of visibility is what led to the EC’s sharp rebuke of Meta and TikTok.
What Happens Next
The EC’s findings mark a critical phase in enforcing the DSA. Both Meta and TikTok now have an opportunity to respond and demonstrate compliance before any formal penalties are imposed.
If they fail to convince regulators, the European Commission could move forward with enforcement actions, including substantial fines or orders to modify platform operations in the EU.
As the investigations progress, the case will likely serve as a benchmark for how strictly EU regulators apply the DSA to other major tech players like X (formerly Twitter), YouTube, and Snapchat.
The EC’s preliminary finding that Meta and TikTok breached transparency rules under the DSA highlights Europe’s growing determination to hold Big Tech accountable. While both companies maintain they are working to comply, the findings show just how complex the balance between privacy, transparency, and user protection has become in the digital age.
With DSA enforcement ramping up, this may only be the beginning of a new era of stricter oversight—and greater public accountability—for social platforms operating across Europe.