Sony researchers uncover bias against skin with yellow hues in AI algorithms

Ryan General
October 4, 2023
Sony AI researchers have uncovered hidden layers of skin tone bias in AI algorithms, challenging existing skin tone scales used by tech giants like Google and Meta.
About the study: For their study published on Sept. 10, Sony AI’s research team employed colorimetry to derive quantitative metrics and assess standardized skin tone scales that AI tech companies have adopted in their tools, such as the Monk Skin Tone Scale and the Fitzpatrick scale. 
The researchers quantified skin color bias in face datasets and generative models, and broke down the results of saliency-based image cropping and face verification algorithms by skin color. For fairness benchmarking, Sony used multidimensional skin color scores rather than one-dimensional ones. To create a robust dataset, the team generated approximately 10,000 images using generative adversarial networks and diffusion models.
Unmasking hidden bias: According to the paper, the standardized skin tone scales fail to capture the full spectrum of human skin diversity as they ignore the nuanced contribution of yellow and red hues to human skin color. The paper further highlights that AI datasheets and model cards still leave ample room for discrimination, particularly against under-represented groups. 
Hue nuances: Underscoring that the problem extends beyond skin tone alone, the study noted that existing scales also fail to account for the nuances of skin hue, which significantly affect how AI classifies people and emotions.
For example, editing skin color toward a lighter or redder hue increases the chances of AI misclassifying non-smiling individuals as smiling, and vice versa. AI classifiers also skew their predictions in other ways, labeling people with lighter skin tones as more feminine and those with redder skin hues as happier. These findings suggest that bias in AI models is not limited to skin tone but extends to skin hue as well.
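To make the kind of counterfactual test described above concrete, here is a minimal sketch, under assumed details rather than Sony's actual pipeline, of how a face crop's hue could be nudged toward red in the CIELAB color space before re-scoring it with a classifier; the `smile_classifier` name and the random stand-in image are hypothetical.

```python
import numpy as np
from skimage import color

def shift_hue(rgb_image, delta_degrees):
    """Rotate the CIELAB hue angle of every pixel by delta_degrees.

    Negative values push hues toward red, positive values toward yellow,
    while leaving perceptual lightness (L*) untouched.
    """
    lab = color.rgb2lab(rgb_image)            # expects floats in [0, 1]
    theta = np.radians(delta_degrees)
    a, b = lab[..., 1].copy(), lab[..., 2].copy()
    lab[..., 1] = a * np.cos(theta) - b * np.sin(theta)
    lab[..., 2] = a * np.sin(theta) + b * np.cos(theta)
    return np.clip(color.lab2rgb(lab), 0.0, 1.0)

# Hypothetical usage: compare a classifier's output on the original crop
# and on a slightly redder counterfactual of the same face.
face = np.random.default_rng(0).random((64, 64, 3))  # stand-in for a real face crop
redder_face = shift_hue(face, -10.0)                  # rotate hue 10 degrees toward red
# smiling_prob_original = smile_classifier(face)      # hypothetical classifier
# smiling_prob_edited = smile_classifier(redder_face)
```

If the classifier's smile probability moves when only the hue changes, that is the kind of hue-dependent behavior the study flags.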
Tipping the scales: To rectify this oversight, Sony proposes measuring skin color along a second dimension, the “hue angle,” which captures the gradation from red to yellow alongside the familiar light-to-dark axis. By adopting this multidimensional approach, the researchers hope to give AI developers a more accurate and inclusive tool for assessing skin color diversity.
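For a sense of what such a measurement looks like in practice, here is a minimal sketch of standard colorimetry (not Sony's code): it converts an sRGB skin pixel to CIELAB and reports its perceptual lightness L* along with the hue angle of (a*, b*), which sweeps from red toward yellow; the sample pixel values are invented for illustration.

```python
import numpy as np

def srgb_to_lab(rgb):
    """Convert an sRGB triple in [0, 1] to CIELAB (D65 white point)."""
    rgb = np.asarray(rgb, dtype=float)
    # Undo the sRGB gamma to get linear RGB.
    linear = np.where(rgb <= 0.04045, rgb / 12.92, ((rgb + 0.055) / 1.055) ** 2.4)
    # Linear RGB -> CIE XYZ (sRGB primaries, D65).
    m = np.array([[0.4124, 0.3576, 0.1805],
                  [0.2126, 0.7152, 0.0722],
                  [0.0193, 0.1192, 0.9505]])
    xyz = m @ linear
    # Normalize by the D65 reference white.
    xyz /= np.array([0.95047, 1.0, 1.08883])
    # XYZ -> CIELAB.
    delta = 6 / 29
    f = np.where(xyz > delta ** 3, np.cbrt(xyz), xyz / (3 * delta ** 2) + 4 / 29)
    L = 116 * f[1] - 16
    a = 500 * (f[0] - f[1])
    b = 200 * (f[1] - f[2])
    return L, a, b

def skin_color_score(rgb):
    """Two-dimensional skin color score: perceptual lightness and hue angle.

    The hue angle (in degrees) moves from red (near 0) toward yellow (near 90),
    the axis the Sony paper argues existing skin tone scales ignore.
    """
    L, a, b = srgb_to_lab(rgb)
    return L, np.degrees(np.arctan2(b, a))

# Illustrative skin-toned pixels (assumed values, not from the paper).
for name, rgb in [("lighter, yellower", (0.95, 0.80, 0.68)),
                  ("darker, redder", (0.45, 0.28, 0.22))]:
    L, h = skin_color_score(rgb)
    print(f"{name}: L* = {L:.1f}, hue angle = {h:.1f} degrees")
```

On typical skin-toned pixels the hue angle falls roughly between 40 and 70 degrees, which is why a single light-to-dark scale cannot distinguish a redder complexion from a yellower one of similar lightness.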
Response from the tech community: In response to Sony’s research, Ellis Monk, co-creator of the Monk Skin Tone Scale, defended his system in a statement to Wired, emphasizing the thought that went into addressing undertones and hue variations.
Meanwhile, X. Eyeé, CEO of AI ethics consultancy Malo Santo, acknowledged to Wired the importance of Sony’s work but cautioned that rolling out a new measurement system could pose challenges.