This kind of discrimination started long before the machine-learning and AI era... The development of film, imaging sensors, and even a camera's color science is biased. In the lab, scientists used the picture of a single, atypical Asian girl — nobody even knows who she is — to test all of the film's color science.


As a result, Asian skin tones were not accurately represented, while white skin tones were. The good news is that, with the growth of the camera market in Asia and the shift to digital cameras, this bias has been reduced...

Mastodon @ SDF