Meta was today hit with a new European Union investigation over its alleged failure to protect children, in part because of Facebook and Instagram’s “addictive” algorithms.
The European Commission said it suspected the company of violating the bloc’s online content-moderation rulebook, the Digital Services Act (DSA), in its second probe into Meta.
Meta’s popular platforms are accused of potentially “stimulating behavioral addiction in children” through so-called rabbit holes, in which recommendation algorithms steer users toward ever more of the same content. Instagram and Facebook may also gather too much data from underage users to push personalized content with their algorithms.
The Commission said Meta may not have sufficiently assessed the risks of underage users accessing Facebook and Instagram and being exposed to inappropriate content. The company may also not have put in place sufficiently strong age-verification tools to block children from accessing its platforms.
The probe into Meta’s potentially inadequate protection of minors comes on top of another investigation, opened in April, into whether the company has done too little to limit the spread of disinformation. The European Commission opened a similar investigation into TikTok’s addictive design in February.
Under the DSA, very large online platforms like Instagram and Facebook must comply with stringent content-moderation rules and mitigate a number of major societal risks, including negative effects on mental health and on minors.
Probes can lead to stiff penalties of up to 6 percent of a company’s annual global revenue if the Commission concludes that the company has infringed the DSA.
Meta didn’t immediately reply to a request for comment.