Google apologizes after photo software tags black people as 'gorillas'
The cringeworthy mishap began when New Yorker Jacky Alcine was going through his pictures. Noticing an album titled 'Gorillas,' he clicked on it. But when he opened it, he didn't find any primates. Instead, he saw images of himself and a friend.
He immediately took to Twitter, telling the tech giant: “Google Photos, y'all f****d up. My friend's not a gorilla.” He added that only photos with that particular friend were ending up in the album. Alcine also asked the tech company: “What kind of sample image data you collected that would result in this son?”
And it's only photos I have with her it's doing this with (results truncated b/c personal): pic.twitter.com/h7MTXd3wgo
— diri noir avec banan (@jackyalcine) June 29, 2015
However, Google insists it tests its image recognition system on “people of all races and colors.”
Google engineer Yonatan Zunger responded to Alcine's tweets, admitting the “shudder” factor of the mistake and saying “this is 100% Not OK.”
@jackyalcine Thank you for telling us so quickly! Sheesh. High on my list of bugs you *never* want to see happen. ::shudder::
— Yonatan Zunger (@yonatanzunger) June 29, 2015
Zunger asked if Google could access Alcine's account to see why the problem occurred, and later said the bug had been fixed.
But Alcine reported that two photos were still showing up under the terms 'gorilla' and 'gorillas' after the supposed fix. In response, Zunger said the company had turned off the ability for photographs to be labeled 'gorillas' altogether.
“Lots of work being done, and lots still to be done. But we're very much on it,” he said.
The feature is designed to spot characteristics in a photo and sort them together, so that certain categories of images can be easily found in one place. For instance, airplane photos would be grouped together in an album titled 'Airplanes.' The program was launched at Google's I/O developer conference in May.
But for now, Google admits the software is far from perfect.
“We're appalled and genuinely sorry that this happened. We are taking immediate action to prevent this type of result from appearing. There is still clearly a lot of work to do with automatic image labeling, and we're looking at how we can prevent these types of mistakes from happening in the future,” a Google spokesperson said.
Other mistakes have also been reported, though none have been as offensive. For example, dogs were mislabeled as horses and landscape photos were misinterpreted as collages of animal faces.
It comes just weeks after Flickr's auto-tagging feature encountered a similar problem, tagging black people as “apes” and the gates of Dachau concentration camp as a “jungle gym.”