EU Accuses Meta and TikTok of Failing to Protect Children Online

The European Commission has preliminarily concluded that Meta Platforms, the parent company of Facebook and Instagram, and TikTok have violated the Digital Services Act (DSA) by failing to adequately protect children on their platforms. This landmark legislation, enacted to enhance online safety, mandates that large online platforms implement robust measures to safeguard users, particularly minors, from harmful and illegal content.

Allegations Against Meta and TikTok

The Commission’s investigation found that both companies have imposed excessively burdensome procedures on researchers seeking access to public data, hampering outside assessment of whether children are exposed to illegal or harmful content on these platforms. The DSA requires platforms to grant researchers appropriate access to data so that user protection measures can be verified. By restricting this access, Meta and TikTok are accused of breaching the DSA’s transparency and accountability requirements.

Specifically, Meta’s Facebook and Instagram platforms have been criticized for not providing user-friendly mechanisms for reporting illegal content, such as child sexual abuse material and terrorist content. The Commission found that the current reporting processes are complicated and confusing, potentially dissuading users from flagging harmful content. Additionally, Meta is accused of employing dark patterns—deceptive interface designs that intentionally mislead users or prevent them from performing certain actions. In this case, these design choices have allegedly made the reporting process for illegal content more difficult, contravening the DSA’s requirements for straightforward Notice and Action processes.

Potential Consequences

If these preliminary findings are confirmed, Meta and TikTok could face fines of up to 6% of their global annual turnover. Given the companies’ substantial revenues, such penalties could amount to billions of euros. The action underscores the EU’s commitment to holding major tech platforms accountable for their transparency obligations and to ensuring the privacy and safety of all users, especially children.

Historical Context

This is not the first time these platforms have faced scrutiny over child protection issues. In September 2023, TikTok was fined €345 million by Ireland’s Data Protection Commission for failing to protect children’s privacy. The investigation found that TikTok accounts belonging to teens were set to public by default at sign-up, allowing anyone to view and comment on their videos. In addition, the family pairing feature, designed to let parents manage a child’s settings, did not verify that the paired adult was in fact a parent or guardian, allowing such adults to turn on direct messaging for users aged 16 and 17 without their consent. TikTok said it had made changes well before the investigation began, including making accounts for teens under 16 private by default and disabling direct messaging for 13- to 15-year-olds.

Similarly, Meta has faced significant fines for data protection violations concerning children. In 2022, Instagram was fined €405 million for setting business accounts created by minors to public by default, exposing their contact details without consent. In late 2024, Facebook was fined €251 million over a 2018 data breach that exposed users’ personal data, including that of minors. These and other penalties make Meta the most fined company under the GDPR, with fines totaling roughly €2.7 billion, a significant share of them for violations involving children’s data.

Responses from Meta and TikTok

Both companies have responded to the Commission’s findings. A Meta spokesperson stated: “We disagree with any suggestion that we have breached the DSA, and we continue to negotiate with the European Commission on this.” The spokesperson added that Meta has introduced changes to its content reporting options, appeals process, and data access tools since the DSA came into force, and that it is confident these solutions match what is required under the law in the EU.

TikTok has not yet publicly responded to the latest allegations. In response to the earlier fine, however, the company maintained that it had made the relevant changes, such as default-private accounts for teens under 16 and disabled direct messaging for 13- to 15-year-olds, well before that investigation began.

Broader Implications

The EU’s actions against Meta and TikTok highlight the ongoing challenges in regulating large tech platforms and ensuring they adhere to stringent data protection and child safety standards. The Digital Services Act represents a significant step in holding these companies accountable and protecting vulnerable users. As the investigations continue, the outcomes will likely have far-reaching implications for how social media platforms operate within the EU and potentially influence global standards for online safety and data protection.