The program aggregates data about people – often without their knowledge – and flags those it deems potentially threatening to officials.
According to interviewees, some of those targeted are detained and sent to extralegal “political education centers” where they are held indefinitely without charge or trial, and can be subject to abuse.
“For the first time, we are able to demonstrate that the Chinese government’s use of big data and predictive policing not only blatantly violates privacy rights, but also enables officials to arbitrarily detain people,” said Maya Wang, senior China researcher at Human Rights Watch. “People in Xinjiang can’t resist or challenge the increasingly intrusive scrutiny of their daily lives because most don’t even know about this ‘black box’ program or how it works.”
Human Rights Watch said Xinjiang authorities in recent years have increased mass surveillance measures across the region, augmenting existing tactics with the latest technologies. Since around April 2016, Human Rights Watch estimates, Xinjiang authorities have sent tens of thousands of Uyghurs and other ethnic minorities to “political education centers.”
These actions are part of the regional authorities’ ongoing “Strike Hard” campaign, and of President Xi’s “stability maintenance” and “enduring peace” drive in the region. Authorities say the campaign targets “terrorist elements,” but it is in practice far broader, and encompasses anyone suspected of political disloyalty, which in Xinjiang could mean any Uyghur, particularly those who express, even peacefully, their religious or cultural identity.
Since August 2016, the Xinjiang Bureau of Public Security has posted procurement notices confirming the establishment of the “Integrated Joint Operations Platform” (IJOP, 一体化联合作战平台), a system that receives data on individuals from many different sources. Kashgar Prefecture appears to be one of the first areas where the system is complete and in regular use.
These notices reveal that the IJOP gathers information from multiple sources or “sensors.” One source is CCTV cameras, some of which have facial recognition or infrared capabilities (giving them “night vision”). Some cameras are positioned in locations police consider sensitive: entertainment venues, supermarkets, schools, and homes of religious figures. Another source is “wifi sniffers,” which collect the unique identifying addresses of computers, smartphones, and other networked devices. The IJOP also receives information such as license plate numbers and citizen ID card numbers from some of the region’s countless security checkpoints and from “visitors’ management systems” in access-controlled communities. The vehicle checkpoints transmit information to IJOP, and “receive, in real time, predictive warnings pushed by the IJOP” so they can “identify targets… for checks and control.”
The IJOP also draws on existing information, such as records of one’s vehicle ownership, health, family planning, banking, and legal history, according to official reports. Police and local officials are also required to submit to the IJOP information on any activity they deem “unusual” and anything “related to stability” they have spotted during home visits and policing. One interviewee said that possession of many books, for example, would be reported to the IJOP unless there was a ready explanation, such as the owner working as a teacher.
Police officers, local Party and government cadres, and fanghuiju (访惠聚, an acronym which stands for Visit the People, Benefit the People, and Get Together the Hearts of the People [访民情、惠民生、聚民心]) teams are also deployed to visit people at home to gather data. Fanghuiju teams consist of officials from different agencies who have since 2013 been sent out to villages and local communities for the overarching purpose of “safeguarding social stability.” According to official reports, the frequency of fanghuiju visits to a given family – as often as every day to once every two months – depends on whether the family is considered politically “untrustworthy.” During the visits, people are required to provide a range of data about their family, their “ideological situation,” and relationships with neighbors. Official reports say these teams use mobile apps to ensure that “the information for every household” is “completely filled in” and submitted to IJOP.
Police officers and local officials tasked with data collection do not appear to explain the reasons for such data collection, nor give residents a choice to decline to provide the data, according to interviewees.
An Urumqi-based businessman shared with Human Rights Watch a form he was made to fill out for submission to the IJOP program in 2017. The form asked questions about religious practices, such as how many times the person prays every day and the name of the person’s regular mosque; whether and where the person had traveled abroad, including to any of “26 [sensitive] countries”; and their “involvement with [political] instability,” including via relatives. The form also asked whether the person is a Uyghur, has been flagged by the IJOP, and is “trustworthy” to the authorities.
Another interviewee told Human Rights Watch he had observed the IJOP computer interface in the neighborhood committee office on multiple occasions in the past year:
I saw with my own eyes, on designated computers…the names, gender, ID numbers, occupation, familial relations, whether that person is trusted, not trusted, detained, subjected to political education (and year, month, date) for every Uyghur in that district. Those detained or not trusted, their color [coding] is different. Also, the content of the form is different depending on what has [already] been filled in. For example, for Uyghurs who have passports: when they got it, where did they go, how long did they stay, when did they come back, did they give their passports [to the police], did they come back from abroad, the reasons for travelling abroad such as family visits, tourism, pursuing studies, business, or others.
According to official and state media reports, the IJOP regularly “pushes” information of interest and lists of names of people of interest to police, Chinese Communist Party, and government officials for further investigation. Officials then are supposed to act on these clues that same day (不过夜), including through face-to-face visits. The IJOP data is evaluated together with other sources of information, such as the person’s “general performance” during “study meetings.”
Upon “inspection,” individuals “who ought to be taken, should be taken” (应收尽收) into custody, two work reports by local fanghuiju teams say. Two people told Human Rights Watch that they had observed the IJOP computer interface generate lists of individuals to be rounded up by the police. One heard police saying that some of those on the list would be detained and/or sent to political education centers. The other said:
Those pushed by IJOP are detained and investigated. As to how long that investigation takes place, nobody knows. During investigation, the person may be held in the detention center or in the “political education” center. [Afterwards] that person can be sentenced to prison or subjected to [further] “political education.”
Most reports provide little detail about precisely how the IJOP conducts its analysis. An August 2017 post by a fanghuiju team noted that IJOP flagged those “villagers who, without reason, failed to pay for their mobile phone bills and got disconnected,” as well as those “whose phone and video calls involve terrorism and violence.” An earlier press article dated October 2016 about an unnamed “big data platform” in Jiashi County (or Peyziwat County), Kashgar Prefecture, says it analyzes geographic, migrant population, fertilizer, gas, vehicle, and other data about people’s daily lives and alerts the police if it discovers any “unusual activity.” A police researcher involved with the project explained:
For example, if a person usually only buys 5 kilos of chemical fertilizers, but suddenly [the amount] increases to 15 kilos, then we would send the frontline officers to visit [the person] and check its use. If there is no problem, [they would] input into the system the situation, and lower the alert level.
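The threshold rule the police researcher describes can be sketched schematically. The code below is a purely hypothetical illustration of that kind of rule-based alert; the function name, baseline, and multiplier are assumptions for illustration, not details of any actual system, whose logic and data schema are not publicly documented.

```python
# Hypothetical sketch of the rule-based anomaly flag described above.
# The baseline (5 kg) and the flagged purchase (15 kg) come from the
# researcher's example; the multiplier and alert labels are invented.

USUAL_FERTILIZER_KG = 5   # baseline purchase from the example
ALERT_MULTIPLIER = 2      # assumed: flag purchases well above baseline

def fertilizer_alert(purchase_kg, baseline_kg=USUAL_FERTILIZER_KG):
    """Return an alert level when a purchase far exceeds the baseline."""
    if purchase_kg > baseline_kg * ALERT_MULTIPLIER:
        return "high"    # in the account above, this triggers a home visit
    return "normal"

print(fertilizer_alert(15))  # the 15 kg purchase from the example -> high
print(fertilizer_alert(5))   # the usual 5 kg purchase -> normal
```

The point of the sketch is how crude such a rule is: a simple threshold on one purchase category, with a human visit as the follow-up that can “lower the alert level.”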
While official references to IJOP are rare, one official WeChat report acknowledged that the IJOP is contributing analytics that land people in political education centers in the campaign against “Two-Faced” Uyghur officials thought to be disloyal to the Party:
Finally, after the political legal [authorities] and public security used the IJOP to…again analyze and study [the cadres], they are sent to the county’s Occupational Skills and Education Training Center to be [politically] educated.
“If the Chinese government’s goal is to prevent bona fide crimes, it could train police and procurators in professional, rights-respecting methods, and empower defense lawyers,” Wang said. “Arbitrary mass surveillance and detention are Orwellian political tools; China should abandon use of them and release all those held in political education centers immediately.”
For more information about the use of IJOP in Xinjiang, please see the information below.
Procurement notices for the IJOP show that it is supplied by the Xinjiang Lianhai Cangzhi Company (新疆联海创智公司). That firm is a wholly owned subsidiary of China Electronics Technology Group Corporation (CETC, 中国电子科技集团公司), a major state-owned military contractor in China. At a March 2016 press conference, CETC announced that it had been awarded a government contract to build a big data program that would collate citizens’ everyday behavior and flag unusual activities in order to predict terrorism.
Integrated joint operations are a new People’s Liberation Army doctrine that depends on a hi-tech C4ISR (command, control, communications, computers, intelligence, surveillance, and reconnaissance) “system of systems,” according to an expert who has studied it. The application of this military doctrine, and the supporting technology, to civilian policing is a worrying development that indicates the extent to which policing in Xinjiang is being based on a military model.
A number of academic articles by researchers affiliated with the People’s Public Security University of China, CETC, and the Xinjiang Public Security Bureau Special Investigation Unit discuss predictive policing algorithms. One addresses whether individuals’ patterns of electricity use are unusual, and describes an official police list that outlines 75 behavioral indicators of “religious extremism,” including, for example, whether someone “stores large amounts of food in their homes.” In July 2017, these three institutions jointly established a national research institute in Urumqi, with the aim of better equipping regional authorities with big data capabilities to discover “hidden social security incidents.”
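An indicator checklist like the 75-item list these articles describe amounts, in schematic form, to a weighted tally of observed behaviors. The sketch below is hypothetical: the two indicator names echo examples from the reporting, but the weights, scoring scheme, and function names are invented for illustration and do not reflect the actual police list.

```python
# Schematic sketch of a behavioral-indicator checklist. Only two of the
# reported 75 indicators are shown, and all weights are placeholders.

INDICATORS = {
    "unusual_electricity_use": 1,       # from the reporting above
    "stores_large_amounts_of_food": 1,  # from the reporting above
    # ...the reported police list contains 75 such indicators
}

def flag_score(observed):
    """Sum the weights of the checklist indicators observed for a person."""
    return sum(weight for name, weight in INDICATORS.items()
               if name in observed)

print(flag_score({"stores_large_amounts_of_food"}))  # one indicator -> 1
print(flag_score(set()))                             # no indicators -> 0
```

Even in this toy form, the mechanism shows why such lists sweep broadly: each indicator is an ordinary behavior, and a score accumulates regardless of context or intent.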
IJOP and the Strike Hard Campaign in Xinjiang
Xinjiang, in northwestern China, is home to 11 million Uyghurs and other predominantly Muslim ethnic minorities. The Chinese government has imposed pervasive restrictions on fundamental human rights, including freedom of religion, on these minorities, primarily Uyghurs. These controls are intrusive and personal, including, for example, restrictions on what kind of dress or beard Uyghurs may wear, or what name they may give their children.
Authorities treat expressions of Uyghur identity, including language, culture, and religion, as well as aspirations for independence, as one of the “three [evil] forces” (三股势力), that is, “separatism, terrorism, and extremism.” The Chinese government has a long tradition of conflating violent and nonviolent forms of political advocacy in Xinjiang, and authorities justify many repressive measures and the heavy security presence in the region as necessary in their fight against terrorism.
Since May 2014, the Chinese government has waged a “Strike Hard Campaign against violent activities and terrorism” (严厉打击暴力恐怖活动专项行动), a campaign that seems to have been brought to new repressive heights by Party Secretary Chen Quanguo, appointed in August 2016.
Official reports suggest that the IJOP supports several objectives of the Strike Hard Campaign. One is to uncover hidden “violent terrorists” and “criminal groups,” as well as those who “challenge…state security, ethnic unity and social stability” – labels that can include Uyghurs who disagree with the state, including on trivial matters or on issues plainly protected by fundamental human rights. Another is to strengthen the monitoring and control of people who “float” – meaning anyone living in a location other than that of their official household registration (hukou), including migrant workers as well as anyone who has traveled abroad.
In practice, what the campaign means for Xinjiang residents who are not ethnic Han (China’s predominant ethnic group) is that authorities in the past year have redoubled efforts at forced assimilation and at severing any foreign ties such residents may have. These efforts include restricting foreign travel by recalling passports, forcing those living abroad to return, imprisoning those with foreign connections, strengthening the use of Mandarin in education while deprioritizing minority languages, targeting “Two-faced” minority officials, and detaining people in “political education” centers. Xinjiang authorities have also heightened surveillance efforts, including instituting mass collection of DNA and voice biometrics from individuals between ages 12 and 65, routinely inspecting smartphones for “subversive” content, creating numerous checkpoints on roads and at train stations, hiring thousands of new security personnel, and building “convenient” police stations.
IJOP and the Lack of Privacy Protections
There are few checks on police surveillance powers, or effective privacy protections against government intrusions in China. The police do not have to obtain any sort of court order to conduct surveillance, or provide any evidence that the people whose data they are collecting are associated with or involved in criminal activity. Police bureaus are not required to report surveillance activities to any other government agency, or to publicly disclose this information. It is very difficult for people to know what personal information the government collects, and how the government uses, shares, or stores their data.
China does not have a unified privacy or data protection law protecting personally identifiable information from misuse, especially by the government. Very little information is available about how, and how securely, the data collected by the IJOP is stored; who can receive or share the data, and under what circumstances; and when, if ever, the data is deleted. There is no formal system for people to find out what information the IJOP holds about them, and no way to obtain redress for abuses associated with it.
Across China, Human Rights Watch has also documented the authorities’ efforts in implementing new technological systems for mass surveillance, including the use of big data in the “Police Cloud” program. It is unclear how the IJOP and Police Cloud are related, but they share similar objectives: integrating massive data collections on citizens, sharing them across multiple agencies, and explicitly prioritizing “focus personnel” – a term authorities use to describe people they find problematic, including Uyghurs, drug users, and those with mental health problems.
It is also unclear if, and how, the IJOP connects to other databases that the police manage or can access, including biometrics (DNA, voice samples, fingerprints), hukou and residency information (which includes details such as religious and political affiliation), and registration records from internet cafés, hotels, flights, and trains.
The foundation of these systems is the digital national identification card system, which makes a citizen’s card number the key to accessing many public and private services, as well as the identifier for vast databases of personal information the government accesses, collects, and collates on each individual. In Xinjiang, residents are required to present their IDs in an even wider array of situations than elsewhere in China, including when going through the region’s countless security checkpoints, buying knives, and filling the tank at the gas station. Although some of the data fed into the IJOP may not be secret or undisclosed – such as the location of a car – when various points or types of data are aggregated, it can be highly revealing of private life.
The government’s use of big data and predictive policing exacerbates already widespread violations of the right to privacy in China. Predictive algorithms require large datasets to train on for accuracy. As more police departments build cloud-based policing systems, they collect more and more personal data, including through their own increased surveillance activities and through cooperation with the private sector. As conceived, these systems will lead to enormous national and regional databases containing sensitive information on broad swaths of the population, which can be kept indefinitely and used for unforeseen future purposes. Such practices will intrude on the privacy of hundreds of millions of people – the vast majority of whom will not be suspected of crime. And of those who are suspected of “unlawful” behavior, many will be targeted for acts, including dissent or religious expression, that are protected under international human rights law but are crimes in China.
Directly at risk are the right to be presumed innocent until proven guilty and the freedom of association. The IJOP flags people who may have acted in a manner authorities deem unusual but that in no way constitutes a crime. These people are then at the mercy of a judicial system rife with abuse, including torture, that gives defendants only limited scope to contest the state’s accusations, even for ordinary, non-political crimes. A predictive policing system such as the IJOP that focuses on individuals’ relationship networks could also place people under suspicion and surveillance merely because they have associated with individuals whom authorities deem politically threatening.