The European Commission has preliminarily found both TikTok and Meta in breach of their obligation to grant researchers adequate access to public data under the EU’s internet rules, the Digital Services Act (DSA).

The Commission has also preliminarily found Meta, for both Instagram and Facebook, in breach of its obligations to provide users with simple mechanisms to flag illegal content, as well as to allow them to effectively challenge content moderation decisions.
Facebook, Instagram and TikTok are accused of putting in place burdensome procedures and tools for researchers seeking access to public data. This often leaves researchers with partial or unreliable data, undermining their ability to investigate, for example, whether users, including minors, are exposed to illegal or harmful content.
Allowing researchers access to platforms’ data is seen as an essential transparency obligation under the DSA, as it enables public scrutiny of the potential impact of platforms on our physical and mental health.
As regards Meta, neither Facebook nor Instagram appears to provide a user-friendly and easily accessible ‘Notice and Action’ mechanism for users to flag illegal content, such as child sexual abuse material and terrorist content. The mechanisms that Meta currently applies seem to impose several unnecessary steps and additional demands on users.
In addition, both Facebook and Instagram appear to use so-called ‘dark patterns’, or deceptive interface designs, when it comes to the ‘Notice and Action’ mechanisms.
Such practices can be confusing and dissuasive, says the Commission. Meta’s mechanisms to flag and remove illegal content may therefore be ineffective. Under the DSA, ‘Notice and Action’ mechanisms are key to allowing EU users and trusted flaggers to inform online platforms that certain content does not comply with EU or national laws. Online platforms do not benefit from the DSA’s liability exemption in cases where they have not acted expeditiously after being made aware of the presence of illegal content on their services.
The DSA also gives users in the EU the right to challenge content moderation decisions when platforms remove their content or suspend their accounts. At this stage, the appeal mechanisms of both Facebook and Instagram do not appear to allow users to provide explanations or supporting evidence to substantiate their appeals. This makes it difficult for users in the EU to explain why they disagree with Meta’s content decisions, limiting the effectiveness of the appeals mechanism.
The Commission says its views on Meta’s reporting tool, dark patterns and complaint mechanism are based on an in-depth investigation, including co-operation with Coimisiún na Meán, the Irish Digital Services Coordinator.
The EU executive stresses that these are preliminary findings which do not prejudge the outcome of the investigation.
Facebook, Instagram and TikTok can now examine the documents in the Commission’s investigation files and reply in writing to the Commission’s preliminary findings. The platforms can then take measures to remedy the breaches.
The preliminary findings form part of the formal proceedings the Commission has opened into Meta, and of its separate formal proceedings investigating TikTok.