EU Commission Flags Meta for Breaching Digital Services Act: A Look at User Reporting Inefficiencies
      
The European Commission has issued preliminary findings indicating that Instagram and Facebook have violated EU law by failing to offer users straightforward ways to report illegal content, including child sexual abuse material and terrorist content. The findings put a spotlight on Meta, the roughly $1.8 trillion company behind both platforms, and raise questions about its compliance with the Digital Services Act (DSA).
On Friday, the Commission said that Meta appears to have built unnecessary steps into its reporting processes, creating a cumbersome experience for users. Such friction may discourage people from flagging harmful content, which the Commission views as a breach of Meta's legal obligations under the DSA.
Meta denies any wrongdoing, asserting that both platforms provide adequate reporting mechanisms. The Commission's assessment, however, found that neither Facebook nor Instagram offers a user-friendly way to report illegal content.
Campaigners have consistently expressed concerns about safety shortcomings within Meta’s platforms. Recently, whistleblower Arturo Béjar disclosed research asserting that most new safety measures on Instagram are ineffective, leaving children under 13 particularly vulnerable.
In response to such criticism, Meta introduced mandatory teen accounts on Instagram in September 2024 and later added content restrictions modeled on the PG-13 cinema rating to strengthen parental controls. Nonetheless, the Commission continues to find fault with Meta's mechanisms, especially for users whose content has been blocked or whose accounts have been suspended: the appeal process appears inadequate, limiting users' ability to submit documentation or evidence in their favor.
The investigation, conducted in cooperation with Coimisiún na Meán, the Irish digital services coordinator, is ongoing. If the preliminary findings are confirmed, Meta could face penalties of up to 6% of its total annual worldwide turnover, as well as periodic fines to compel compliance.
The Commission has also raised similar issues with TikTok, stating that both TikTok and Meta have failed to give researchers adequate access to public data that would let them analyze minors' exposure to harmful content. Such access is deemed essential for transparency and accountability under the DSA, because it enables public scrutiny of the platforms' impact on users' physical and mental health.
Henna Virkkunen, the Commission's executive vice president for tech sovereignty, security, and democracy, stated, "Our democracies depend on trust. That means platforms must empower users, respect their rights, and open their systems to scrutiny. The DSA makes this a duty, not a choice."
In light of these developments, Meta has maintained its position, expressing disagreement with the Commission's assessments. The company points to the reporting mechanisms and data access tools it has introduced since the DSA came into effect, which it says fulfill the legal requirements, and notes that discussions with the Commission are ongoing. TikTok has yet to comment on the findings.
As the investigation progresses, the implications of these findings could have profound effects on Meta and its operational practices in the EU, potentially setting new standards for accountability and user empowerment across digital platforms.