Speaking to BBC’s Panorama, the software engineer, who preferred to remain anonymous, said he installed these systems in police stations in Xinjiang province.
China has always maintained that surveillance of the region is important given that separatists, who want their own state, have killed hundreds of people in attacks. Xinjiang, home to at least 12 million ethnic minority Uighurs, most of whom are Muslim, has seen widespread human rights violations and poor treatment of Uighurs. China has also set up “re-education centres” for them in the area, which have been criticised for human rights abuses, mistreatment, rape and torture.
The revelation has shocked many. The Chinese embassy in London maintained that “political and social rights of all ethnic groups are guaranteed” and that “People live in harmony regardless of their ethnic backgrounds and enjoy a stable and peaceful life with no restriction to personal freedom.”
The software engineer, fearing for his safety, has also not revealed the name of the company he worked for. However, he showed photographs of five Uighurs on whom he claimed the government tested the facial recognition system.
He told BBC’s Panorama: “The Chinese government use Uighurs as test subjects for various experiments just like rats are used in laboratories.”
“We placed the emotion detection camera 3m from the subject. It is similar to a lie detector but far more advanced technology,” he said.
He also explained his role in installing cameras in police stations in the province.
In Xinjiang, police officers use restraint chairs in which one’s wrists and ankles are locked by metal restraints, the engineer said. He spoke about how the AI is trained to recognise and analyse “even minute changes in facial expressions and skin pores.”
According to his claims, the software then generates a pie chart, with the red segment indicating a negative or anxious state of mind.
Sophie Richardson, China director of Human Rights Watch, who saw the evidence provided by the software engineer, said: “It is shocking material. It’s not just that people are being reduced to a pie chart, it’s people who are in highly coercive circumstances, under enormous pressure, being understandably nervous, and that’s taken as an indication of guilt, and, I think, that’s deeply problematic.”
In July 2018, Huawei and the Chinese Academy of Sciences filed a patent describing a face recognition product capable of identifying people on the basis of their ethnicity. Huawei said that it did “not condone the use of technology to discriminate or oppress members of any community” and that it was “independent of government” wherever it operated.