To help protect personal data, GitHub user ffffffff0x has put together Digital-Privacy, an all-in-one resource for gathering, protecting, and cleaning up digital privacy information, plus countermeasures against open-source intelligence (OSINT) collection.
It includes tools for checking password leaks, inspecting browser fingerprints, scrubbing personal data from the web, generating IDs, creating decoy image data, extracting EXIF metadata from photos, and more.
GitHub: github.com/ffffffff0x/Digital-Privacy
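To make the EXIF item concrete, here is a minimal sketch of what extracting a photo's EXIF metadata looks like. It is not code from the Digital-Privacy repo; it uses the Pillow library, and the file name photo.jpg is just a placeholder.

```python
# Minimal sketch: read EXIF metadata from a photo with Pillow.
# Not taken from the Digital-Privacy repo; "photo.jpg" is a placeholder.
from PIL import Image
from PIL.ExifTags import TAGS

def read_exif(path):
    """Return a dict mapping human-readable EXIF tag names to their values."""
    with Image.open(path) as img:
        exif = img.getexif()
        # Map numeric tag IDs to readable names (e.g. 271 -> "Make").
        return {TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}

if __name__ == "__main__":
    for name, value in read_exif("photo.jpg").items():
        print(f"{name}: {value}")
```

The same fields are what privacy-cleanup tools strip before a photo is shared, since tags such as camera model and timestamps can reveal more than the image itself.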
As a result, Asian skin tones are not accurately represented, while white skin tones are. The good news is that the bias has been reduced in recent years, thanks to the growth of the camera market in Asia and the shift to digital cameras.
This kind of discrimination started long before the machine-learning or AI era. The development of film, of imaging sensors, and even of camera color science has been biased. In the lab, scientists calibrated all of a film's color science against the picture of a single Asian girl, not even a typical one, and nobody knows who she is.
24 years old, student, ham radio operator