
(LifeSiteNews) — Social-media giant Meta was aware that its subscription tools were being used to facilitate child sexual exploitation but neglected to solve the issue, according to two recent explosive reports in the Wall Street Journal and the New York Times.

The reports concern warnings the company’s safety staff issued to their superiors in 2023 about Facebook and Instagram’s paid subscription tools, which were being used by hundreds of “parent-managed minor accounts” to sell adult male users images of their own young daughters wearing swimsuits and leotards.

The photos themselves were not sexual, nude, or otherwise illegal, but many customers made perfectly clear to the mothers running the accounts that they were deriving sexual enjoyment from them. “Sometimes parents engaged in sexual banter about their own children or had their daughters interact with subscribers’ sexual messages,” the Journal reported. “Meta’s recommendation systems were actively promoting such underage modeling accounts to users suspected of behaving inappropriately online toward children.”

“One calculation performed by an audience demographics firm found 32 million connections to male followers among the 5,000 accounts examined,” the Times report says. “Interacting with the men opens the door to abuse. Some flatter, bully and blackmail girls and their parents to get racier and racier images. The Times monitored separate exchanges on Telegram, the messaging app, where men openly fantasize about sexually abusing the children they follow on Instagram and extol the platform for making the images so readily available.”

Yet many parents went along with it thanks to the substantial financial incentives at play: “The bigger followings look impressive to brands and bolster chances of getting discounts, products and other financial incentives, and the accounts themselves are rewarded by Instagram’s algorithm with greater visibility on the platform, which in turn attracts more followers.”

The Times added that Instagram users who report sexually explicit images and suspected predators “are typically met with silence or indifference,” and that users who blocked “many” such accounts were themselves penalized with limits on their own access to certain features.

“Anyone on Instagram can control who is able to tag, mention or message them, as well as who can comment on their account,” Meta spokesperson Andy Stone told the Times. “On top of that, we prevent accounts exhibiting potentially suspicious behavior from using our monetization tools, and we plan to limit such accounts from accessing subscription content.”

But former employees who worked on the company’s safety division say Meta has recognized the general situation for years yet lacks the manpower and resources to keep from being “overwhelmed” by it.

The Journal added that the teams who alerted Meta to the latest issue last year recommended that the company prohibit accounts featuring child models from offering subscriptions or require accounts selling child-related content to register for more stringent monitoring. But the company rejected those proposals, and instead settled for an automated system that would theoretically ban suspected pedophiles from subscribing – but “the technology didn’t always work, and the subscription ban could be evaded by setting up a new account.”

Meta is currently the subject of a lawsuit by New Mexico Attorney General Raúl Torrez, who after investigating the company’s platforms for several months accused it and CEO Mark Zuckerberg of facilitating child sex trafficking as well as the distribution of child sex abuse material.
