Facebook said Tuesday that the main social network and Instagram, its photo service, will step up efforts to address challenges minorities face on its platforms and examine potential racial bias in its algorithms and products.
The move comes as Facebook is under more pressure to combat hateful content on its site in the wake of the police killing of George Floyd, a 46-year-old Black man in Minneapolis. Civil rights groups have criticized the company for not doing enough to address hate speech, prompting major advertisers to halt spending on Facebook this month as part of a campaign.
Black users and other minorities have also complained that the company mistakenly flags their posts about racism as hate speech. Mark Luckie, a former manager at Facebook who is Black, has accused the social network of “failing” its Black users, noting concerns that their content is taken down more often than that of other groups.
“The racial justice movement is a moment of real significance for our company,” said Vishal Shah, Instagram’s vice president of product, in a statement. “Any bias in our systems and policies runs counter to providing a platform for everyone to express themselves.”
Instagram is creating a new team, called the Equity team, focused on “ensuring fairness and equitable product development.” The team will also support features that promote equality, such as tools that help diverse businesses, and will work with Facebook’s Responsible AI team. Facebook is also creating an Inclusivity product team, and it recently launched a council to help incorporate the views of Black users and other minorities as it develops new products.
The Wall Street Journal, which earlier reported the creation of the new teams, said that studying racial bias has been a “contentious” issue within Facebook and Instagram in the past. An internal study found that, under proposed changes to how accounts get deleted or suspended, users whose activity suggested they were Black would have been 50% more likely than other users to have their accounts disabled. Instagram addressed those concerns but barred further research into the issue.
Facebook told the Journal that it was concerned the metric used to infer a user’s race wasn’t entirely reliable. As part of the study, workers looked at users’ “multicultural affinity,” a label the social network has assigned based on users’ activity and that in the past suggested to advertisers whether a user was interested in ads related to African American, Hispanic American or Asian American communities. Facebook doesn’t ask users to provide their race, so this metric could serve as a proxy for identifying which users might be minorities.
The company disabled the ability for advertisers to target users by “multicultural affinity” in 2017 because of concerns that advertisers could use the tool to exclude people of certain races.
Facebook said that its work to study and address potential racial bias is still in the early stages, but it plans to share more in the coming months.