EU Accuses Meta and TikTok of Violating Digital Content Regulations, Putting Both at Risk of Hefty Fines

On October 23, the European Union accused Meta and TikTok of breaching its digital content regulations, raising the prospect of significant fines for both companies. The move underscores the EU's commitment to a legal framework aimed at preventing the spread of illegal content and ensuring competitive digital markets.

The European Commission stated that Meta's platforms, Facebook and Instagram, along with TikTok, had violated the Digital Services Act (DSA), which governs content moderation within the EU. It is the first time Meta has been formally accused of violating the DSA, a charge the company strongly denies. The inclusion of TikTok, owned by China's ByteDance, could escalate tensions with U.S. President Donald Trump, who has threatened tariffs on countries that impose regulations he considers harmful to American technology companies. Despite those threats, the EU reiterated its determination to enforce its rules, saying that both Meta and TikTok were failing to grant researchers sufficient access to public data. EU regulators argue that transparency is not merely a bureaucratic requirement but is essential for researchers to understand the extent of children's exposure to potentially harmful content on these platforms.

In response, TikTok expressed its commitment to transparency but raised concerns about a conflict between the DSA and the General Data Protection Regulation (GDPR). "Requirements to ease data safeguards place the DSA and GDPR in direct tension," a company spokesperson said, urging regulators to clarify how to balance the two sets of obligations.

The Commission's findings also indicated that Meta's platforms lacked user-friendly mechanisms for flagging illegal content and effective systems for users to contest content moderation decisions.
The Commission accused Facebook and Instagram of employing deceptive design practices known as dark patterns, which can disorient users and hinder the reporting of illegal content. The DSA also requires platforms to explain their content moderation decisions, a requirement the EU says Meta has failed to meet.

Meta disputes the accusations and says it continues to engage in discussions with the EU. The company noted that since the DSA took effect, it has made significant changes to its content reporting options, appeals process, and data access tools, and expressed confidence that these measures meet its legal obligations.

Both Meta and TikTok will now be granted access to the EU's investigative files, allowing them to propose remedies to address the concerns raised by Brussels. If the proposals are deemed unsatisfactory, the EU could fine the companies for each violation on each platform; under the DSA, penalties can reach up to 6% of a company's global annual turnover.

EU digital spokesperson Thomas Regnier defended the DSA against accusations of censorship, particularly from the United States, asserting that the act protects free speech: "When accused of censorship, we prove that the DSA is doing the opposite. It is protecting free speech, allowing citizens in the EU to fight back against unilateral content moderation decisions taken by Big Tech."

Meta and TikTok are also the subject of several other EU probes, including one examining the addictive design of their platforms and its impact on children. The outcome of these investigations, and the remedies the companies propose, will help shape the future of digital content regulation in the EU.