(LifeSiteNews) — According to recent testimony during a U.S. Senate committee hearing, while social media giant Meta was focused on cracking down on conservative speech in coordination with the Biden administration, pedophiles were not only operating “vast” networks on the company’s platforms, but were often connected with each other via the company’s own algorithms.
During a November 7 Senate Judiciary Committee hearing on “social media and the teen mental health crisis,” Senator Josh Hawley (R-MO) heard testimony from Facebook’s former Director of Engineering for Protect and Care, Arturo Bejar, about the apparent lack of effort Meta puts into cracking down on the “vast pedophile network” that operates on its popular Facebook and Instagram platforms. At the same time, Bejar testified, the company went to great lengths to censor conservative speech under direction from U.S. President Joe Biden’s administration, as evidenced by the Murthy v. Missouri (previously known as Missouri v. Biden) case.
Answering questions from Hawley, Bejar, who had previously stated that his own daughter had been a target of online sexual predators, confirmed to the committee that on October 5, 2021, he had sent an email to Meta CEO Mark Zuckerberg and Chief Operating Officer (COO) Sheryl Sandberg that “one in eight children” on Facebook had received sexually inappropriate messages on the platform within the “last seven days,” and nearly one in three children had experienced similar “sexual advances” in general.
Despite the shocking nature of his findings, Bejar confirmed to Hawley that neither Zuckerberg nor Sandberg met with him to discuss his email, leading Hawley to characterize the Big Tech executives as “turning a blind eye” to information they did not find favorable to the company.
Referencing a June 7 investigative article by the Wall Street Journal (WSJ), in which the Journal’s reporters teamed up with researchers from Stanford University and the University of Massachusetts Amherst and found that Instagram’s algorithm “helps connect and promote a vast network of accounts openly devoted to the commission and purchase of underage-sex content,” Hawley asked Bejar why he thinks this is happening.
Bejar told Hawley that because Meta funnels nearly all of its resources into combatting a “very narrow definition of harm,” the company acts on only a “fraction of a percent” of complaints, even when users report accounts for dealing in child exploitation material or behavior. Bejar also explained that because the definition of harm is so narrowly applied, when users “like” or “follow” pedophilic content, the algorithm, detecting no issue with the content, will actually “promote” similar content to those users, thereby creating an algorithm-driven pedophile network.
Hawley asked Bejar if much of this problem has been caused by Meta’s shift in the mid-to-late 2010s to an automated, AI-driven process of monitoring content, as opposed to having actual human employees sift through reports and online posts.
Bejar said that while he wasn’t employed by Facebook during the time of the transition to AI, he knows that AI-driven systems are “only as good as their inputs,” which he said are lacking on Meta’s platforms.
The example Bejar raised was that users on Instagram and Facebook can report advertisements as “sexually inappropriate,” which tells the system to stop showing that ad or similar ads to the user. But no such reporting system exists for sexually explicit messages sent to children, which means children have little to no recourse if they are contacted by a user looking to exploit them.
Concluding the back-and-forth, Hawley pointed out that despite Bejar raising the issue of child exploitation to Facebook, the company failed to address the problem or allocate any additional resources to combatting it. Hawley told the committee that he found this especially egregious considering that around the same time, according to evidence from the Murthy v. Missouri case, Meta was actively pouring resources into censoring political speech when told to do so by the Biden administration.
“[The courts that have heard the Murthy v. Missouri case] have found that Facebook, among others, actively coordinated with the present administration to censor First Amendment protected speech, not this garbage that is not protected by anything in our Constitution, but First Amendment protected speech,” Hawley charged.
“Here’s what gets me,” he continued. “The courts found… that Facebook devoted all kinds of resources and people, actual human people, to doing things like monitoring posts on COVID-19 vaccine efficacy… but the things that your daughter experienced, this ring of pedophiles… that, Facebook just can’t find the time for…”