The EU is investigating Meta for potential child safety violations under its comprehensive tech regulation.

Facebook parent company Meta is facing a major investigation by the European Union over potential breaches of the bloc's strict online content rules, specifically child safety risks on its Facebook and Instagram platforms.

The European Commission announced that it is looking into whether these social media platforms could be contributing to behavioral addictions in children and creating what is known as ‘rabbit-hole effects’. Privacy risks related to Meta’s recommendation algorithms and issues with age verification processes are also under scrutiny.

A Meta spokesperson said the company has spent the past decade developing tools and policies designed to protect young users online, and that it looks forward to sharing details of that work with the European Commission as the industry confronts these challenges.

The EU's commissioner for the internal market, Thierry Breton, expressed skepticism that Meta has complied with its obligations under the Digital Services Act (DSA) to mitigate negative effects on the physical and mental health of young Europeans using its platforms. The Commission has therefore opened an in-depth investigation into Meta's child protection measures, treating the issue as a priority for the European Union.

The EU will gather evidence through requests for information, interviews, and inspections. Opening the formal probe also allows it to take further enforcement steps, including interim measures and non-compliance decisions, and to accept commitments from Meta to address the concerns raised.

Under the DSA, companies like Meta can be fined up to 6% of their global annual revenue for violations. Although the law is in force, the EU has yet to fine any tech giant under it. Meta is not alone in facing EU scrutiny: other U.S. tech giants have also come under the spotlight over their compliance with the Digital Services Act.

Meta is also under a separate EU investigation over alleged infringements related to its handling of election disinformation, and the company has long been the subject of scrutiny over a range of issues, including harmful content and its impact on society.

Mark Zuckerberg, CEO of Meta, testifies before the Senate Judiciary Committee in Washington, DC. (Photo: Alex Wong | Getty Images)

Concerns over child safety are not limited to the EU. In the United States, Meta faces a lawsuit from New Mexico's attorney general alleging that its platforms enable child sexual abuse, solicitation, and trafficking.

In response to those allegations, Meta has pointed to its use of sophisticated technology and other preventive measures to combat predators and provide a safer online environment for users.

The investigations and enforcement actions by regulators in the EU and the U.S. reflect a growing focus on making online platforms prioritize user safety, especially for vulnerable groups such as children.

As Meta navigates these investigations and regulatory challenges, the tech industry as a whole is under pressure to improve transparency, accountability, and mechanisms for protecting users, particularly minors, from potential harms on digital platforms.

Time will tell how Meta and other big tech companies respond to these investigations and whether regulatory measures lead to substantial changes in how online platforms operate in the interest of user safety and well-being.
