European Commission Investigates Facebook And Instagram “Rabbit-Hole”
The European Commission has launched a formal investigation into Meta, saying it may have breached the Digital Services Act over the protection of minors on Facebook and Instagram.
It’s concerned, it says, that the company’s algorithms may lead to addiction in children, and create so-called “rabbit-hole effects.” It also plans to investigate the sites’ age-assurance and verification methods.
“With the Digital Services Act we established rules that can protect minors when they interact online,” said Margrethe Vestager, executive vice president for a Europe fit for the digital age.
“We have concerns that Facebook and Instagram may stimulate behavioral addiction, and that the methods of age verification that Meta has put in place on their services are not adequate,” Vestager added, saying the commission will now carry out an in-depth investigation. “We want to protect young people’s mental and physical health.”
The opening of proceedings is based on a preliminary analysis of the risk assessment report sent by Meta in September last year, along with Meta’s replies to the commission’s formal requests for information and publicly available reports, as well as the commission’s own analysis.
Specifically, the commission believes that the sites’ interfaces may exploit the weaknesses and inexperience of minors to cause addictive behavior, and that their suggested content may lead children down a rabbit hole of ever-more harmful material. It also says that the company’s age-verification measures to prevent minors from accessing inappropriate content may not be reasonable, proportionate and effective.
And, it suggests, Meta may be failing to comply with DSA obligations to put in place appropriate and proportionate measures to ensure a high level of privacy, safety and security for minors—particularly in terms of the default privacy settings for minors used in its recommender systems.
If proven, these failures would constitute infringements of Articles 28, 34, and 35 of the DSA.
“We are not convinced that it has done enough to comply with the DSA obligations to mitigate the risks of negative effects to the physical and mental health of young Europeans on its platforms Facebook and Instagram,” said commissioner for internal market Thierry Breton.
“We will now investigate in-depth the potential addictive and ‘rabbit-hole’ effects of the platforms, the effectiveness of their age verification tools, and the level of privacy afforded to minors in the functioning of recommender systems,” Breton added. “We are sparing no effort to protect our children.”
Meta says it has numerous features designed to protect children, including parental supervision tools, “take a break” notifications and a Quiet Mode. It hides potentially harmful content from teens, it says, and on Instagram now blocks young users by default from receiving DMs from anyone they don’t follow or aren’t connected to, including other teens.
“We want young people to have safe, age-appropriate experiences online and have spent a decade developing more than 50 tools, features and resources designed to protect them,” says a spokesperson.
“This is a challenge the whole industry is facing, which is why we’re continuing to advance industry-wide solutions to age-assurance that are applied to all apps teens access. We look forward to sharing details of our work with the European Commission.”
The announcement is just the latest in a flurry of investigations based on the DSA. Last month, for example, the commission launched an investigation into Meta over its policies and practices around deceptive advertising and political content. It’s also investigating TikTok, X and AliExpress.