
China’s Explicitly Biased Face Recognition Model


 

According to an article by Techdirt, the Chinese government has built “Uyghur alarms” on top of an explicitly biased face recognition service. China uses face recognition to identify and target Uyghur people. Under the guise of distinguishing China’s different ethnic groups, the model appears to specifically identify Uyghur and Tibetan facial features and not those of the other 53 of China’s 55 recognized ethnic minorities. Based on our research, face recognition models do tend to identify some groups of people better than others, but China clearly isn’t treating that bias as a problem to fix.

The model used in China targets groups of people based on how they look. Huawei will build this system using the models provided, likely integrating it into the surveillance network it already operates. Using skin color and the measured distances between facial features, the face recognition program reduces each face to a set of numbers and compares those numbers to a reference set that identifies Uyghur faces.
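To make that comparison concrete, here is a minimal sketch of how a match against a reference set typically works: each face is reduced to a fixed-length numeric embedding, and a small enough distance between embeddings counts as a match. The random vectors, 128-number length, and 1.0 threshold below are illustrative stand-ins, not details of the actual system.

```python
import numpy as np

# Toy stand-ins for face embeddings: fixed-length vectors a trained
# model would derive from skin tone, feature distances, and so on.
# These random vectors are purely illustrative.
rng = np.random.default_rng(0)
reference_set = [rng.normal(size=128) for _ in range(3)]     # "target" faces
probe = reference_set[0] + rng.normal(scale=0.05, size=128)  # a near-match

def distance(a: np.ndarray, b: np.ndarray) -> float:
    """Euclidean distance between two embeddings: smaller = more alike."""
    return float(np.linalg.norm(a - b))

def matches(embedding: np.ndarray, references: list, threshold: float) -> bool:
    """Flag the face if it lands within the threshold of any reference.
    The threshold here is arbitrary; real systems tune it carefully."""
    return any(distance(embedding, ref) < threshold for ref in references)

print(matches(probe, reference_set, threshold=1.0))  # True for the near-match
```

In a deployed system the embeddings come from a trained neural network, which is exactly where the training biases discussed next enter.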

Breaking it down further, bias in face recognition comes from inaccuracy in matching these numbers. A trained model may not account for some variations in the faces of a group of people; if more training is done for one group, the model can account for more of that group’s variations. A properly configured model will also report how confident it is in each match, and variations it hasn’t seen will lower that confidence score. See our previous research on biases in training.
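As a rough illustration of that confidence mechanism, the sketch below maps a match distance to a confidence score, so a face with variation the model hasn’t seen (a larger distance from everything it knows) scores lower and falls under the acceptance cutoff. The exponential mapping and 0.5 cutoff are our own illustrative choices, not any vendor’s published method.

```python
import math

def confidence(distance: float, scale: float = 1.0) -> float:
    """Map an embedding distance to a 0..1 confidence score: identical
    faces score 1.0, and confidence decays as the distance grows.
    The exponential decay is an illustrative choice, not a standard."""
    return math.exp(-distance / scale)

# A face similar to the training data vs. one with unseen variation.
for label, dist in [("well-represented face", 0.4),
                    ("under-represented face", 2.5)]:
    score = confidence(dist)
    verdict = "confident match" if score >= 0.5 else "low confidence"
    print(f"{label}: distance={dist:.1f}, confidence={score:.2f} -> {verdict}")
```

A model trained heavily on one group keeps its distances, and therefore its confidence, high for that group while everyone else drifts toward the low end.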

China continues to persecute Uyghurs in the northwest region of China. The addition of the Uyghur alarm will help China’s government identify any Uyghur who dares to travel outside the region. Combined with China’s vast surveillance system, it would make any Uyghur travel not approved by the government almost impossible.

 

 
