The scale and reach of Chinese surveillance of its own citizens are well documented, but a new report shows that the country's government is now trying to use this vast trove of data to predict crimes and protests before they happen.
The Supreme Court ruling on abortion is also raising fresh concerns about the way personal data may be used to prosecute women. We're living in a world where Apple's decision to make privacy a major focus looks increasingly prescient – but even the Cupertino company may now need to do more …
Chinese surveillance: The story so far
The sheer scale of Chinese surveillance of its citizens is mind-boggling. It’s estimated that there are more than a billion surveillance cameras in use around the world – and around half of them are in China.
In 2016, a new cybersecurity law required cloud companies to store the personal data of Chinese citizens within the country, on servers run by state-owned companies that are widely believed to be fully accessible to the government. Apple was forced to comply.
In 2020, Chinese police officers were given ‘smart helmets’ which can check forehead temperatures, touted as a means of spotting COVID infections. But it was later discovered that these helmets had far more extensive capabilities.
The country uses both offline and online surveillance to assign ‘social credit’ scores. Negative scores can be accumulated for everything from jaywalking to failing to visit parents regularly. Those with poor scores can be prevented from travelling, attending college, or obtaining better jobs – and can be subjected to public shaming.
‘How China is policing the future’
A New York Times piece reveals how China is now attempting to use this data to predict the future – aiming to detect everything from someone leaving home to attend a protest to criminals getting together to plan or carry out a robbery.
Data being ‘weaponized’ in the US after Roe overturned
When the Supreme Court ruling was leaked back in May, pro-choice and privacy campaigners warned that app data could be used to prosecute women who have had abortions.
Now that the ruling is official, there are calls for tech companies to respond by protecting user data – but there are also suggestions that they won't, because the use and sale of personal data is big business.
Apple looks increasingly prescient on privacy
Apple made an early decision to make customer privacy a key priority, going to some extreme lengths to do so.
Apple engineers say the rules have resulted in fewer app features and slower development.
Even paying that price hasn’t always worked out for the company. Ironically, it was Apple’s attempt to do CSAM scanning in a more privacy-focused way than other companies that mired it in so much controversy. Siri has also been lambasted for being dumber than other intelligent assistants, in part because Apple’s privacy rules give the service much less access to personal data. And, of course, there was the furore when Apple refused to create a backdoor into iPhones at the request of the FBI.
Apple doesn’t have a perfect record on privacy. When Europe’s tough GDPR privacy law came into effect in 2018, the Cupertino company was forced to introduce new protections in order to comply. The company ran into trouble on ‘Siri grading’ back in 2019. There are other examples, but there’s no question that the company has gone further than any other tech giant to protect user privacy.
China surveillance techniques being used in the US
There was a time when the average non-tech Apple customer might have seen privacy as a somewhat academic issue. Nice to have, but not something that ordinary people needed to worry too much about. That attitude is now rapidly changing.
Machine learning in particular is enabling unprecedented new surveillance capabilities. Where once the sheer volume of data was a limitation as well as an enabler – giving governments more information than they could possibly analyze – AI systems can now sift vast amounts of data in ways that were never before practical.
China may represent an extreme, but other countries – including the US – are taking steps in this direction. The Brookings Institution noted last year that China is not alone in trying to predict crimes before they happen.
Apple may need to go even further on privacy
The US Supreme Court seems set to overturn other landmark rulings, which could fundamentally change the rights of citizens. Personal data that was once innocuous may become incriminating, as in the case of period-tracking apps.
Apple was at the forefront of protecting personal data, and may need to go even further in the face of increasing threats to civil liberties.
Apple already requires apps to carry privacy labels, revealing which categories of data an app collects. But as the threats grow, the company may need to respond with tougher protections.
Apple-certified safe data storage is one possibility that occurs to me. Apps that collect sensitive data could apply for Apple validation of their personal data storage. This might require all data to use end-to-end encryption, for example, so that not even developers have access to it (see the sketch below).
Any law enforcement agency – or even private citizen – serving demands for access to personal data could then simply be told by developers and Apple alike that they do not have any way to obtain it.
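To make the idea a little more concrete, here's a minimal sketch of what on-device encryption of sensitive records might look like, written in Swift using Apple's CryptoKit framework. The key stays on the user's device, so only ciphertext would ever reach a developer's servers. The type and function names here are purely hypothetical, for illustration – nothing like this certification scheme exists today.

```swift
import Foundation
import CryptoKit

// Hypothetical record a sensitive app (e.g. period tracking) might store.
struct HealthEntry: Codable {
    let date: Date
    let note: String
}

enum SecureStore {
    // In a real app the key would be generated once, kept in the Keychain
    // (ideally protected by the Secure Enclave), and never sent to a server.
    static let localKey = SymmetricKey(size: .bits256)

    // Encrypt an entry on-device; only this ciphertext would be uploaded,
    // so the developer cannot read – or hand over – the contents.
    static func seal(_ entry: HealthEntry) throws -> Data {
        let plaintext = try JSONEncoder().encode(entry)
        let box = try AES.GCM.seal(plaintext, using: localKey)
        guard let combined = box.combined else {
            throw CocoaError(.coderInvalidValue)
        }
        return combined // nonce + ciphertext + authentication tag
    }

    // Decrypt on-device with the same locally held key.
    static func open(_ ciphertext: Data) throws -> HealthEntry {
        let box = try AES.GCM.SealedBox(combined: ciphertext)
        let plaintext = try AES.GCM.open(box, using: localKey)
        return try JSONDecoder().decode(HealthEntry.self, from: plaintext)
    }
}

// Usage: encrypt before syncing, decrypt after downloading.
do {
    let blob = try SecureStore.seal(HealthEntry(date: Date(), note: "example"))
    let restored = try SecureStore.open(blob)
    print(restored.note)
} catch {
    print("Round-trip failed: \(error)")
}
```

The point of the design is simply that the decryption key never leaves the device: a subpoena served on the developer (or on Apple) would yield nothing readable.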
What are your thoughts? Please take our poll, and share your views in the comments. Given the sensitive nature of the topic, please ensure that all comments are respectful of opposing views, arguing your own case rather than insulting those who disagree with you.
Image: Mohamed Hassan/PxHere