
The European Commission announced preliminary findings against Meta and TikTok under the DSA.
EU Commission Preliminarily Finds Meta and TikTok in Violation of DSA Transparency Obligations
The Digital Services Act (DSA) is a European Union law that establishes rules for large online platforms to ensure transparency, accountability, and access to platform data for users and researchers. On October 24, 2025, the EU Commission released its preliminary findings that Meta Platforms and TikTok may have violated key parts of these rules. The news is significant because it shows that large social media platforms can no longer escape the "rules of the game": the scope of regulatory action has expanded. In this article, we cover the following points:

What is the DSA, and what are its transparency obligations?
On what grounds did the EU find Meta and TikTok in violation?
How did the platforms allegedly violate the rules: mechanisms, examples, and allegations.
Possible consequences: penalties, changes, and reforms.
Challenges and points of contention.
Global and India-relevant implications.
What could happen next?
Conclusion.
DSA and Transparency Obligations: Background
The DSA imposes special obligations on Very Large Online Platforms (VLOPs).

Key Transparency Obligations of the DSA

The DSA includes several obligations, two of which are particularly important to this case:

Data Access for Researchers: Platforms must provide data that allows independent researchers to analyze their impact, for example on mental health, on children, and on the spread of defamation and violence. The EU Commission has stated: "Allowing researchers access to platforms' data is an essential transparency obligation under the DSA. …"

User Notice-and-Action and Illegal-Content Complaint Mechanisms: Platforms must establish simple and accessible mechanisms through which users can report illegal content (for example, child sexual abuse material or terrorist propaganda), and platforms must take appropriate action on those reports.
What conclusions did the EU reach about Meta and TikTok?
At this stage, no final decision has been made; the companies have the opportunity to respond. The key points are:

Both platforms (Meta-owned Facebook and Instagram, and TikTok) allegedly failed to fulfill the DSA obligation to make public data available to researchers, with the Commission citing "burdensome procedures and tools … to request access to public data."

For Meta's Facebook and Instagram, the findings suggest that users were not provided with accessible and simple mechanisms to report illegal content or to challenge content moderation decisions. In particular, the Commission noted the use of "deceptive interface designs" (dark patterns) that could discourage users from complaining.

The Commission warned that if a violation is confirmed, the companies could face fines of up to 6% of their total worldwide annual turnover.
Analysis of the Violation Mechanism and Allegations
Barriers to Research Data Access: The Commission found that the platforms have adopted complex and burdensome mechanisms for granting researchers access to data. This complicates research and undermines open inquiry and public scrutiny. For example, researchers report that TikTok's Research API lacks metadata for many videos, compromising the credibility of studies built on it. This lack of data access makes it difficult to publicly analyze the impacts of platforms, for example on children's mental health or through advertising.

Complaint Mechanism and User Interface Issues: Regarding Meta platforms such as Facebook and Instagram, the Commission found that users must take several additional steps to report illegal content, such as child sexual abuse material (CSAM) or terrorist propaganda, and that the interface may discourage users from reporting. There are allegations of "dark patterns": interface design that shows users only the options the platform wants them to see, making it difficult to file a complaint. Furthermore, users have reportedly been unable to appeal content moderation decisions effectively, meaning basic information, such as why a piece of content was removed, is not available.

Advertising Transparency and Other Concerns: The Commission has not specifically addressed advertising transparency in this case, but platforms already have an obligation under the DSA to disclose information about advertisements, who is paying for them, and which election or social issue they target. Companies' risk assessments and the operation of their recommendation algorithms are also under the Commission's scrutiny; for example, Meta is suspected of not adequately addressing political content in its recommendation engine.
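To make the data-access complaint concrete, the sketch below shows how an independent researcher might audit metadata completeness in records returned by a platform research API. The endpoint URL, request body, and field names are illustrative assumptions for this sketch, not the actual TikTok Research API schema.

```python
# Hypothetical sketch: auditing metadata completeness in research-API results.
# The endpoint URL, request body, and field names below are illustrative
# assumptions for this sketch, not the real TikTok Research API schema.
import requests

API_URL = "https://api.example-platform.com/research/videos/query"  # hypothetical
REQUIRED_FIELDS = ["id", "create_time", "view_count", "hashtags"]   # hypothetical

def fetch_page(token: str, cursor: int = 0) -> dict:
    """Request one page of public video records from the (hypothetical) API."""
    resp = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {token}"},
        json={"query": {"region": "EU"}, "cursor": cursor, "max_count": 100},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

def audit_completeness(records: list[dict]) -> dict[str, int]:
    """Count how many records are missing each field a study depends on."""
    missing = {name: 0 for name in REQUIRED_FIELDS}
    for record in records:
        for name in REQUIRED_FIELDS:
            if record.get(name) in (None, "", []):
                missing[name] += 1
    return missing

if __name__ == "__main__":
    page = fetch_page(token="YOUR_RESEARCH_TOKEN")  # placeholder credential
    videos = page.get("videos", [])
    gaps = audit_completeness(videos)
    print(f"Checked {len(videos)} records; missing-field counts:")
    for field_name, count in gaps.items():
        print(f"  {field_name}: {count}")
```

If many records lack fields a study depends on, any findings built on that sample inherit the gap, which is exactly the credibility problem researchers have raised.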
Possible Outcomes and Next Steps in the Meta and TikTok DSA Case
Fines and Regulatory Action: If the preliminary findings are confirmed, the companies could face fines of up to 6% of their annual global turnover. Additionally, the Commission could issue remedial orders to the platforms, such as additional transparency measures, better information for complaining users, improved complaint mechanisms, and simplified access to research data. If the companies fail to improve, periodic penalty payments, accruing per day or per instance of non-compliance, may be imposed.
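To put the 6% ceiling in perspective, here is a back-of-the-envelope calculation. The turnover figures are purely hypothetical placeholders, not the companies' reported numbers; only the 6% rate comes from the DSA.

```python
# Back-of-the-envelope: the maximum DSA fine is 6% of annual worldwide turnover.
# The turnover figures below are hypothetical placeholders, not reported numbers.
DSA_MAX_FINE_RATE = 0.06

hypothetical_turnover_eur = {
    "Platform A": 150_000_000_000,  # EUR 150 billion (illustrative)
    "Platform B": 20_000_000_000,   # EUR 20 billion (illustrative)
}

for platform, turnover in hypothetical_turnover_eur.items():
    max_fine = turnover * DSA_MAX_FINE_RATE
    print(f"{platform}: maximum fine EUR {max_fine:,.0f}")
```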
Guidelines for Improvement:

Platforms must develop simple, straightforward, and user-friendly complaint mechanisms so that users can easily report illegal content (a minimal sketch of such a flow follows below).
Researchers should have more direct access to data: data should be made available by the platform itself rather than gated behind complex processes.
Platforms must provide transparency and user-controlled options in interface design, avoiding dark patterns.
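As a thought experiment on the first point, the following is a minimal sketch of a one-step notice-and-action flow in the spirit of the DSA's reporting obligations. The framework choice (Flask), endpoint paths, and field names are assumptions for illustration, not a DSA-mandated or platform-specific design.

```python
# Hypothetical sketch of a one-step notice-and-action flow: a user reports
# suspected illegal content in a single request and immediately receives a
# tracking ID. Endpoint paths and field names are illustrative assumptions.
from datetime import datetime, timezone
from uuid import uuid4

from flask import Flask, jsonify, request

app = Flask(__name__)
NOTICES: dict[str, dict] = {}  # in-memory store; a real system would persist this

@app.post("/report")
def report_content():
    payload = request.get_json(force=True)
    # The only things a reporter must supply: what content, and why it is illegal.
    content_id = payload.get("content_id")
    reason = payload.get("reason")
    if not content_id or not reason:
        return jsonify(error="content_id and reason are required"), 400
    notice_id = str(uuid4())
    NOTICES[notice_id] = {
        "content_id": content_id,
        "reason": reason,
        "submitted_at": datetime.now(timezone.utc).isoformat(),
        "status": "received",  # the platform must review and act on the notice
    }
    # Confirm receipt with a reference the user can track.
    return jsonify(notice_id=notice_id, status="received"), 201

@app.get("/report/<notice_id>")
def report_status(notice_id: str):
    notice = NOTICES.get(notice_id)
    if notice is None:
        return jsonify(error="unknown notice id"), 404
    return jsonify(notice)
```

The point of the sketch is the shape of the flow: one request to file a notice and one reference ID to track it, rather than a multi-step funnel that discourages reporting.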
Challenges and Points of Contention
Balancing Privacy Against Research Transparency: Opening platform data to researchers must be reconciled with users' privacy rights; under the DSA, data access for researchers still has to comply with EU data protection law such as the GDPR.

Identifying and Proving Dark Patterns: Allegations of dark patterns turn on interface design. Establishing which design choice steered a user toward which option is a technical and evidence-gathering challenge. Platforms will often claim to have provided a complaint mechanism; determining whether that mechanism is truly accessible is not easy.
The Difficulty of Measuring Impact: Obtaining platform data is one thing; understanding whether platforms' algorithms, advertising mechanisms, and recommendation systems actually work as claimed is another. One research study found inconsistencies in the data recorded in the Transparency Database created under the DSA, indicating that transparency mechanisms are not yet fully mature (a toy consistency check of the kind such audits run appears after this section).

Global Competition and Regulatory Enforcement: Platforms are multinational, so European regulations can affect other regions as well, yet different countries have different rules, which complicates uniform enforcement.
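To picture the kind of inconsistency such audits look for, here is a toy consistency check over moderation "statement of reasons" records. The record schema is a simplified assumption for illustration, not the DSA Transparency Database's real export format.

```python
# Toy consistency check over moderation "statement of reasons" records, in the
# spirit of the audits researchers run against the DSA Transparency Database.
# The schema below is a simplified assumption, not the real export format.
from datetime import datetime

REQUIRED = ("decision_id", "platform", "decision_ground", "content_date", "applied_date")

records = [  # illustrative sample rows
    {"decision_id": "1", "platform": "ExampleApp", "decision_ground": "illegal",
     "content_date": "2025-10-01", "applied_date": "2025-10-02"},
    {"decision_id": "2", "platform": "ExampleApp", "decision_ground": "",
     "content_date": "2025-10-05", "applied_date": "2025-10-03"},  # two problems
]

def check(record: dict) -> list[str]:
    """Flag missing required fields and logically impossible date orderings."""
    problems = [f"missing {key}" for key in REQUIRED if not record.get(key)]
    try:
        # A decision cannot be applied before the content it concerns existed.
        applied = datetime.fromisoformat(record["applied_date"])
        created = datetime.fromisoformat(record["content_date"])
        if applied < created:
            problems.append("applied before content date")
    except (KeyError, ValueError):
        problems.append("unparseable dates")
    return problems

for rec in records:
    issues = check(rec)
    if issues:
        print(f"decision {rec.get('decision_id')}: {', '.join(issues)}")
```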
Global and India-Relevant Implications of the Meta and TikTok DSA Case
Globally, platforms will have to adapt to local regulations, and this case signals a shift in spending and strategy: opening up data for research, improving grievance mechanisms, and increasing international compliance. Special considerations apply to India, where the use of digital platforms and social media apps is very high, with users numbering in the hundreds of millions. If platforms make changes due to European regulations, their global strategy will also be affected in markets like India. India has recently introduced digital regulations (e.g., the new IT Rules) aimed at making digital platforms accountable.
The European enforcement model could inspire Indian policymakers. For example, if platforms simplify their grievance mechanisms in Europe to satisfy the DSA, similar expectations could arise in India, including improved user access to content removal, appeals, and complaints. Data access for research and accountability for platform systems also matter in India, enabling study of how platforms' algorithms operate and how they affect children.
What could happen next?
Penalties: If a company fails to improve, fines and periodic penalty payments could be imposed. This would be a major step in platform enforcement.
Strengthening the regulatory enforcement base: The Commission could investigate other platforms as well, increasing the pressure for compliance with the DSA.
Changes in platform behavior: User complaint mechanisms could become simpler, research data could become more available, and interface design could become more transparent.
New markets and strategy shifts: Platforms may rebalance their activities toward other markets if regulatory pressure increases in Europe, but this would require a globally coordinated strategy.
Regulations in India and other countries as a catalyst: This case could inspire other countries to regulate digital platforms, strengthening the direction of global platform-policy change.
Conclusion: The Meta and TikTok DSA Case
This case is an important signal that the days of "platform independence" in the digital age may be coming to an end: large platforms will now have to demonstrate transparency, accountability, and regulatory compliance. The preliminary findings issued by the European Commission against Meta Platforms and TikTok indicate that simply staying in growth mode is not enough for social media companies; they must also take their social, ethical, and regulatory obligations seriously. This is an opportunity for users, policymakers, and researchers around the world, including in India, to look at platform governance from a new perspective.