Meta is back in hot water for its methods (or lack thereof) for protecting children. The European Commission has launched formal proceedings to determine whether the owner of Facebook and Instagram has violated the Digital Services Act (DSA) by contributing to children’s social media addiction and not ensuring they have high levels of safety and privacy.
The Commission’s investigation will specifically examine whether Meta is properly assessing and acting against risks brought on by its platforms’ interfaces. It’s concerned about how their designs could “exploit the weaknesses and inexperience of minors and cause addictive behavior, and/or reinforce so-called ‘rabbit hole’ effect. Such an assessment is required to counter potential risks for the exercise of the fundamental right to the physical and mental well-being of children as well as to the respect of their rights.”
The proceedings will also explore whether Meta takes the necessary steps to prevent minors from accessing inappropriate content, whether its age-verification tools are effective, and whether minors have straightforward, strong privacy tools, such as privacy-protective default settings.
The DSA sets standards for very large online platforms and search engines (those with 45 million or more monthly users in the EU) like Meta. Obligations for designated companies include transparency about advertising and content moderation decisions, sharing their data with the Commission and looking into risks their systems pose related to areas such as gender-based violence, mental health and protection of minors.
Meta responded to the formal proceedings by pointing to features such as parental supervision settings, quiet mode and automatic content restrictions for teens. “We want young people to have safe, age-appropriate experiences online and have spent a decade developing more than 50 tools and policies designed to protect them. This is a challenge the whole industry is facing, and we look forward to sharing details of our work with the European Commission,” a Meta spokesperson told Engadget.
However, Meta has repeatedly failed to prioritize the safety of young people. Previous alarming incidents include Instagram’s algorithm recommending content featuring child sexual exploitation, as well as claims that Meta designs its platforms to be addictive for young people while surfacing psychologically harmful content, such as material promoting eating disorders and body dysmorphia.
Meta has also famously served as a hub of misinformation for people of all ages. The Commission already launched formal proceedings against the company on April 30 due to concerns around deceptive advertising, data access for researchers and the lack of an “effective third-party real-time civic discourse and election-monitoring tool” before June’s European Parliament elections, among other concerns. Earlier this year, Meta announced that CrowdTangle, which has publicly shown how fake news and conspiracy theories move around Facebook and Instagram, would be completely shut down in August.