Unsurprisingly, computers are better than humans at differentiating Chinese, Korean and Japanese ethnicities based on their faces.
Academics at the University of Rochester collected thousands of pictures of East Asian faces and fed them through an algorithm to determine what made Chinese, Korean and Japanese people look different.
They found that the computer achieved an accuracy rate of more than 75%, Stuff.co.nz reported.
This figure outperformed humans, who scored an average accuracy rate of about 39% in an earlier test created by Japanese-American web designer Dyske Suematsu. Pure guessing among the three options would yield about 33%.
Jiebo Luo, a computer science professor who co-created the new test, said (via Washington Post):
“This is a challenging task even for humans. I asked some of my students to take the test and they all failed horribly — even though all of them were Asian.”
What made computers better, then? Luo said it’s about the computer’s ability to harness a vast library of faces:
“Our machine has seen far more examples than any living person.”
Interestingly, Luo and his team found that the computer did not rely on physical proportions alone. Cultural cues, such as accessories, hairstyles and facial expressions, also helped the algorithm reach its high accuracy.
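The article does not describe the study's actual model, but Luo's point that accuracy comes from having "seen far more examples" can be illustrated with a toy sketch: a simple nearest-neighbour classifier on entirely synthetic, hypothetical feature vectors, whose accuracy tends to rise as it is given more labelled examples.

```python
import math
import random

# Toy illustration only (not the study's model): a 1-nearest-neighbour
# classifier on synthetic 2-D "feature" points. The cluster centres and
# all data below are hypothetical, invented for this sketch.
random.seed(0)
LABELS = ["Chinese", "Korean", "Japanese"]
CENTRES = {"Chinese": (0.0, 0.0), "Korean": (1.0, 1.0), "Japanese": (2.0, 0.0)}

def sample(label):
    """Draw a noisy synthetic feature point around the label's centre."""
    cx, cy = CENTRES[label]
    return (cx + random.gauss(0, 0.6), cy + random.gauss(0, 0.6))

def predict(point, examples):
    """Return the label of the closest labelled example."""
    return min(examples, key=lambda e: math.dist(point, e[0]))[1]

def accuracy(n_per_label, n_test=300):
    train = [(sample(l), l) for l in LABELS for _ in range(n_per_label)]
    test = [(sample(l), l) for l in LABELS for _ in range(n_test // 3)]
    hits = sum(predict(p, train) == l for p, l in test)
    return hits / len(test)

small = accuracy(5)    # classifier that has seen few examples per group
large = accuracy(200)  # classifier that has seen many examples per group
print(f"5 examples/group: {small:.2f}  |  200 examples/group: {large:.2f}")
```

With the fixed seed the larger training set generally scores higher, mirroring the machine's advantage of having seen more faces than any one person.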
Can you differentiate the three without difficulty?