Facebook users who recently watched a video from a British tabloid featuring Black men saw an automated prompt from the social network that asked if they would like to "keep seeing videos about Primates," causing the company to investigate and disable the artificial intelligence-powered feature that pushed the message.
Facebook on Friday apologized for what it called "an unacceptable error" and said it was looking into the recommendation feature to "prevent this from happening again."
The video, dated June 27, 2020, was posted by The Daily Mail and featured clips of Black men in altercations with white civilians and police officers. It had no connection to monkeys or primates.
Darci Groves, a former content design manager at the social network, said a friend had recently sent her a screenshot of the prompt. She then posted it to a product feedback forum for current and former Facebook employees. In response, a product manager for Facebook Watch, the company's video service, called it "unacceptable" and said the company was "looking into the root cause."
Ms. Groves said the prompt was "horrifying and egregious."
Dani Lever, a Facebook spokeswoman, said in a statement: "As we have said, while we have made improvements to our A.I., we know it's not perfect, and we have more progress to make. We apologize to anyone who may have seen these offensive recommendations."
Google, Amazon and other technology companies have been under scrutiny for years for biases within their artificial intelligence systems, particularly around issues of race. Studies have shown that facial recognition technology is biased against people of color and has more trouble identifying them, leading to incidents where Black people have been discriminated against or arrested because of computer error.
In one example in 2015, Google Photos mistakenly labeled pictures of Black people as "gorillas," for which the search giant said it was "genuinely sorry" and would work to fix the issue immediately. More than two years later, Wired found that Google's solution was to censor the word "gorilla" from searches, while also blocking "chimp," "chimpanzee," and "monkey."
Facebook has one of the world's largest repositories of user-uploaded images on which to train its facial- and object-recognition algorithms. The company, which tailors content to users based on their past browsing and viewing habits, sometimes asks people if they would like to continue seeing posts under related categories. It was unclear whether messages like the "primates" one were widespread.
Facebook and Instagram, its photo-sharing app, have struggled with other issues related to race. After July's European Championship in soccer, for instance, three Black members of England's national soccer team were racially abused on the social network for missing penalty kicks in the championship game.
Racial issues have also caused internal strife at Facebook. In 2016, Mark Zuckerberg, the chief executive, asked employees to stop crossing out the phrase "Black Lives Matter" and replacing it with "All Lives Matter" in a communal space in the company's Menlo Park, Calif., headquarters. Hundreds of employees also staged a virtual walkout last year to protest the company's handling of a post from President Donald J. Trump about the killing of George Floyd in Minneapolis.
The company later hired a vice president of civil rights and released a civil rights audit. In an annual diversity report in July, Facebook said 4.4 percent of its U.S.-based employees were Black, up from 3.9 percent the year before.
Ms. Groves, who left Facebook over the summer after four years, said in an interview that a series of missteps at the company suggested its leaders weren't prioritizing ways to deal with racial problems.
"Facebook can't keep making these mistakes and then saying, 'I'm sorry,'" she said.