19 Jan, 2024 16:13

Meta admits 100k underage users are sexted daily – media

The disturbing statistic is among other revelations contained in a newly-unsealed court filing

Facebook and Instagram users sexually harass about 100,000 children on both platforms every day, the social media brands' parent company, Meta, has reportedly admitted.

The revelation was contained in a legal filing from a lawsuit brought against Meta by the state of New Mexico last month, which was unsealed on Wednesday.

Internal company documents detail unsolicited messages received by underage users, including “pictures of adult genitalia,” The Guardian, TechCrunch, and other outlets have reported. The documents also show that multiple Meta employees had raised concerns about child exploitation on the company’s platforms.

The social media giant has long been aware of sexual predators targeting its youngest users. One senior executive even testified before Congress last year about how his daughter had been sexually solicited on Instagram – and about his own ultimately futile efforts to end the phenomenon.

Some employees worried the predator problem could have serious consequences for the company too, according to the filing, which describes a 2020 incident in which an Apple executive’s 12-year-old daughter received lewd messages on IG Direct – Instagram’s private-messaging platform analogous to Facebook Messenger.

“This is the kind of thing that pisses Apple off to the extent of threatening to remove us from the App Store,” a Meta employee wrote in one document.

While Meta acknowledged the risks posed by allowing children to receive unsolicited direct messages on Facebook and Instagram, the company declined to implement proposed safeguards and even stood in the way of adopting child-safety fixes because there was no money in it, according to New Mexico Attorney General Raul Torrez.

“For years, Meta employees tried to sound the alarm about how decisions made by Meta executives subjected children to dangerous solicitations and child exploitation,” Torrez told TechCrunch on Wednesday, accusing management of habitually making “decisions that put growth ahead of children’s safety.”

“While the company continues to downplay the illegal and harmful activity children are exposed to on its platforms, Meta’s internal data and presentations show the problem is severe and pervasive,” he continued, referring to the newly unsealed filing.

The state’s lawsuit accused Meta of allowing Facebook and Instagram to devolve into “a marketplace for predators in search of children upon whom to prey,” and of failing to remove even reported child pornography and other abusive content. Some child exploitation material is “over ten times more prevalent on Facebook and Instagram than it is on Pornhub and OnlyFans,” the suit claimed.

While Meta responded to the suit by claiming it had spent “over a decade working on these issues,” the newly public documents show the company deliberately limiting child-safety features even as it tried to lure more children and teens to its direct-messaging function. Executives internally admitted they were losing youth market share to platforms like Snapchat, and argued against scanning messages for “harmful content” lest young users view it as an invasion of their privacy.