Yelp has been getting flak for showing Chinese and Korean restaurants in search results related to “dog meat” and “cat meat,” perpetuating a racist stereotype attached to Asians.
The information first came from the Tampa Bay Times, which ran searches in a dozen U.S. locations, including Tampa Bay, Chicagoland, the San Francisco Bay Area, New York, Philadelphia, Denver, Boston, Austin, Seattle, Atlanta, and Jacksonville.
According to the outlet, typing “dog menu” in Yelp’s search engine yielded items on dog massage, hot dogs, and pet groomers, but also, controversially, dog meat.
As it turned out, searching for dog meat “almost always suggested a Korean restaurant,” while looking for cat meat “pointed toward Chinese restaurants,” the outlet noted.
In response to the observations, a Yelp spokesperson denied deliberate programming and explained that searches rely on “real-world consumer user data and human behavior patterns.” These include keywords in reviews, previous searches and “behaviors on the app.”
“Included in the huge volume of search queries Yelp receives are some very rare, atypical ones that computer-generated models still try to match … which is made more difficult by the rarity of this search occurrence. To be clear, no human programmed these results or matches and we are taking prompt action to remove them from autocomplete and our other systems,” the spokesperson told the Tampa Bay Times.
“Thank you for bringing this to our attention and allowing us to correct it.”
Gizmodo, which carried out its own tests on Wednesday, reported that the search terms in question “brought up no results,” apparently after Yelp took action.
Ken Lee, CEO of the Asian American advocacy group OCA, said the findings point to “algorithmic bias.”
“Our small businesses must already combat racist, inflammatory reviews from users on Yelp. These biased search results for dog and cat meat vendors should not be an additional concern.
“The company has a social and corporate responsibility to their consumers and local businesses to keep their app and website clear of prejudice and misinformation. Algorithmic bias against communities of color continues to plague new technology and online platforms — this incident is another in a long line of incidents.”
This, of course, is not the first time an American company has faced accusations of algorithmic bias. In 2015, for instance, Google Photos automatically classified black people as “gorillas.” And just last year, “jew-hunters” showed up in Facebook’s internal ad platform as a valid entry for “occupation.”