
WASHINGTON, D.C., April 11, 2018 (LifeSiteNews) – Several lawmakers continued to raise specific examples of conservative and religious content suppressed by Facebook during founder Mark Zuckerberg’s second day of congressional testimony.

Rep. Fred Upton, R-MI, raised the case of Aric Nesbitt, a Republican running for the Michigan state senate who attempted to pay to boost a Facebook ad vowing to “strengthen our economy, limit government, lower our auto insurance rates, balance the budget, stop sanctuary cities, pay down government debt and be a Pro-Life, Pro-Second Amendment” lawmaker.

Facebook rejected the ad, claiming it violated its policies against ads that “contain shocking, disrespectful or sensational content, including ads that depict violence or threats of violence.”

“I’m not sure where the threat was based on what he tried to post,” Upton said.

Zuckerberg answered that he was not familiar with the case. “It’s quite possible that we made a mistake, and we’ll follow up afterwards on that.” But he went on to suggest that such mistakes are occasionally to be expected, given the sheer volume of content Facebook is responsible for reviewing.

“By the end of this year, we’ll have about 20,000 people at the company who work on security and content review-related issues,” he said. “But there’s a lot of content flowing through the systems, a lot of reports, and unfortunately we don’t always get these things right when people report it to us.”

Rep. Cathy McMorris Rodgers, R-WA, asked if Facebook “adequately and clearly defines” its content standards. Zuckerberg conceded he was “worried we’re not doing a good enough” job of explaining the company’s criteria to the public.

The congresswoman went on to note that there were several well-known examples of providers “blocking and censoring religious and conservative political content,” quoting Federal Communications Commission chair Ajit Pai’s assessment that such discrimination occurs “routinely.” She asked what Facebook was doing to ensure user content was being treated fairly and objectively.

“The principle that we’re a platform for all ideas is something that I care very deeply about,” Zuckerberg claimed. “I am worried about bias, and we take a number of steps to make sure that none of the changes that we make are targeted in any kind of biased way.” He didn’t elaborate on those steps, but offered to follow up on them later.

McMorris Rodgers responded by citing the recent case of Facebook rejecting the Franciscan University of Steubenville’s Holy Week ad featuring the San Damiano Cross. The Catholic university received a notification that images could not contain “shocking, sensational, or excessively violent content.” Facebook reversed the decision following a public outcry, and “apologize[d] for the error.”

Regardless of Facebook’s reversal, she said, “that it happened at all is deeply disturbing.”

“How can users know that their content is being viewed or judged according to objective standards?” she asked.

“It sounds like we made a mistake there, and I apologize for that,” Zuckerberg answered, before reiterating that mistakes are bound to happen given the amount of content reviewers process. But he insisted that Facebook’s overall track record was solid. “We make a relatively small percent of mistakes in content review,” he claimed, “but that’s too many, and this is an area where we need to improve.”

Zuckerberg also denied that Facebook was intentionally discriminating against anyone. “I wouldn’t extrapolate from a few examples to assuming that the overall system is biased,” he said. “I get how people can look at that and draw that conclusion, but I don’t think that that reflects the way that we’re trying to build the system or what we have seen.”

As her time expired, McMorris Rodgers ended by reiterating that “this is an important issue in building trust…as we move forward.”