Google’s Photo Recognition Software Thinks Two Black People Are ‘Gorillas’


Google Photos’ image-recognition software uses automatic image labeling to classify pictures — of people, pets, objects and more — according to their content.
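Automatic image labeling typically works by running each photo through a classifier that outputs a confidence score for every known label, then keeping only the most confident labels as tags. A minimal sketch of that final step — the scores, labels and thresholds here are made up for illustration, not Google’s actual system:

```python
import math

def softmax(scores):
    """Convert raw classifier scores into probabilities that sum to 1."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def top_labels(label_scores, k=3, min_prob=0.2):
    """Return up to k labels whose probability clears the threshold."""
    labels = list(label_scores)
    probs = softmax([label_scores[name] for name in labels])
    ranked = sorted(zip(labels, probs), key=lambda pair: pair[1], reverse=True)
    return [(name, round(p, 2)) for name, p in ranked[:k] if p >= min_prob]

# Hypothetical raw scores from a classifier for one photo.
scores = {"cat": 4.1, "dog": 2.0, "shark": 0.3, "car": -1.5}
print(top_labels(scores))
```

A low-confidence cutoff like `min_prob` is one reason misfires still slip through: a wrong label that the model is nonetheless confident about — a cat scored as a shark — passes the filter anyway.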

However, image recognition algorithms can also get a picture’s subjects wrong, like when a photo of a cat is misidentified as a shark.

While such errors are often laughable, Google Photos’ misidentification of Brooklyn computer programmer Jacky Alciné and his friend, as reported by Slate, was anything but. Looking through his Google Photos app, Alciné found a photo of himself and his friend, both of whom are black, filed under the label “Gorillas.”

Alciné took to Twitter and said:

“Google Photos, y’all fucked up. My friend’s not a gorilla.”


Google’s chief social architect, Yonatan Zunger, tweeted back:

@jackyalcine Holy fuck. G+ CA here. No, this is not how you determine someone’s target market. This is 100% Not OK.


Zunger responded swiftly, saying a team was being formed to address the racist label, which was deleted from Alciné’s Google Photos account within 15 hours. Zunger also said Google was working on “longer-term fixes” for Google Photos, including identifying “words to be careful about in photos of people” and “better recognition of dark skinned faces.”
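A short-term fix of the kind Zunger described — being careful about certain words in photos of people — can be sketched as a blocklist applied after classification. The label set and blocklist below are assumptions for illustration; Google’s actual implementation and word list are not public:

```python
# Labels the system should never apply to photos containing people.
# This blocklist is illustrative, not Google's actual list.
SENSITIVE_LABELS = {"gorilla", "ape", "monkey", "animal"}

def filter_labels(predicted_labels, contains_person):
    """Drop sensitive labels whenever a person is detected in the photo."""
    if not contains_person:
        return predicted_labels
    return [label for label in predicted_labels
            if label.lower() not in SENSITIVE_LABELS]

print(filter_labels(["Gorilla", "Portrait"], contains_person=True))
print(filter_labels(["Gorilla", "Zoo"], contains_person=False))
```

A blocklist like this only suppresses the symptom; the underlying fix Zunger pointed to — better recognition of dark-skinned faces — requires retraining the model itself.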

In an official statement, a Google spokesperson said:

“We’re appalled and genuinely sorry that this happened. We are taking immediate action to prevent this type of result from appearing. There is still clearly a lot of work to do with automatic image labeling, and we’re looking at how we can prevent these types of mistakes from happening in the future.”

Unfortunately, unintentionally offensive photo classifications have happened before, disproportionately affecting dark-skinned people. The Independent reported this past May that Flickr’s auto-tagging feature for sorting photos had labeled a black person as both “ape” and “animal.”

While Google did its best to remove the offensive labels as quickly as it could, there’s still a concern that this mistake happened at all. As Alciné tweeted:

“Like I understand HOW this happens; the problem is moreso on the WHY. This is how you determine someone’s target market.”


NextShark is a leading source covering Asian American News and Asian News including business, culture, entertainment, politics, tech and lifestyle.
