26 November 2023
Unsealed US court documents allege Meta “routinely” collected personal data on children under 13 via Instagram, according to CNN and The New York Times.
The court filings from 33 State Attorneys General claim Meta has received 1.1 million reports of users under 13 since 2019, but “disabled only a fraction” of the accounts and collected children’s personal data without parental consent, in violation of several state consumer protection statutes and the Children’s Online Privacy Protection Act (COPPA).
Internal Meta information, including employee emails, chats and presentations, is cited in the complaint as evidence that Meta “coveted and pursued” under-13 users and “continually failed” to implement effective age verification systems.
According to CNN: “The unsealed complaint also alleges that Meta knew that its algorithm could steer children toward harmful content, thereby harming their well-being.”
Meta released a statement on Saturday stating it has worked for a decade to make experiences safe and age-appropriate, and the court filing “mischaracterizes our work using selective quotes and cherry-picked documents,” according to The New York Times.
Meta argues that age verification is complex for online services, calling for federal legislation to require app stores to obtain parental approval when apps are downloaded, rather than requiring age-verification by each app.
Alongside the Online Safety Act in the UK and the Digital Services Act in the EU, the US lawsuits form part of international efforts to protect children’s online safety in the age of social media.
But is Meta right? Should app stores be responsible for age verification? There may be debate on this topic, but most privacy professionals take a very different view from the position adopted by the tech giant.