Facebook Ignored Racial Bias Research, Employees Say

An anonymous reader quotes a report from NBC News: In mid-2019, researchers at Facebook began studying a new set of rules proposed for the automated system that Instagram uses to remove accounts for bullying and other infractions. What they found was alarming. Users on the Facebook-owned Instagram in the United States whose activity on the app suggested they were Black were about 50 percent more likely under the new rules to have their accounts automatically disabled by the moderation system than those whose activity indicated they were white, according to two current employees and one former employee, who all spoke on the condition of anonymity because they weren't authorized to talk to the media. The findings were echoed by interviews with Facebook and Instagram users who said they felt that the platforms' moderation practices were discriminatory, the employees said.

The researchers took their findings to their superiors, expecting that it would prompt managers to quash the changes. Instead, they were told not to share their findings with co-workers or conduct any further research into racial bias in Instagram's automated account removal system. Instagram ended up implementing a slightly different version of the new rules but declined to let the researchers test the new version. It was an episode that frustrated employees who wanted to reduce racial bias on the platform, but one that they said did not surprise them. Facebook management has repeatedly ignored and suppressed internal research showing racial bias in the way that the platform removes content, according to eight current and former employees, all of whom requested anonymity to discuss internal Facebook business.
The lack of action on this issue from management has contributed to a growing sense among some Facebook employees that a small inner circle of senior executives — including Chief Executive Mark Zuckerberg, Chief Operating Officer Sheryl Sandberg, Nick Clegg, vice president of global affairs and communications, and Joel Kaplan, vice president of global public policy — is making decisions that run counter to the recommendations of the subject matter experts and researchers below them, particularly around hate speech, violence and racial bias, the employees said. Facebook did not deny that some researchers were told to stop exploring racial bias, but said that this was because the methodology used was flawed. "We are actively investigating how to measure and analyze internet products along race and ethnic lines responsibly and in partnership with other companies," Facebook spokeswoman Carolyn Glanville added, noting that the company established a team of experts last year, called Responsible AI, focused on "understanding fairness and inclusion concerns" related to the deployment of artificial intelligence in Facebook products.

Read more of this story at Slashdot.

Source:
https://tech.slashdot.org/story/20/07/23/2017243/facebook-ignored-racial-bias-research-employees-say?utm_source=rss1.0mainlinkanon&utm_medium=feed