Inside the massive (and unregulated) world of surveillance tech
A few years ago, an American defense consultant I know told me about a trip he took to Uzbekistan. His role there was to help sell technology that the Uzbek government could use to spy on its own citizens. He eventually shared with me the marketing material he’d presented to the Uzbek government. One glossy brochure featured technology that could not just intercept phone calls, but identify the caller, regardless of what phone number they were using, based on their unique voiceprint, and then identify their exact geographic location.
And yet, according to the US government today, tools of surveillance that an authoritarian regime could use to spy on its own citizens, on dissidents, on journalists are not weapons. They are, however, part of a growing, secretive, multibillion-dollar industry.
The Israeli-based company NSO Group has reportedly sold its technology to the regime in Saudi Arabia, which has been accused of harassing, and even, in one case, killing one of its political opponents. And we do think of weapons as things that kill people. But in the information age, some of the most powerful weapons are things that can track and identify us.
The future of digital communication and privacy
People send 100 billion WhatsApp messages every day – and they’re all encrypted to protect them from potentially curious entities like companies, governments and even WhatsApp itself. With our increased reliance on digital communication tools during the COVID-19 pandemic, our fundamental right to privacy is more important than ever, says Will Cathcart, head of WhatsApp. He describes the tech and protocols the company built to prevent encryption services from being misused to spread disinformation or commit crimes – while still safeguarding privacy.
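The core promise of end-to-end encryption is that only the sender and receiver can read a message, while anyone relaying it sees only ciphertext. A toy sketch of that idea, in Python, is below. This is purely illustrative: it uses a one-time-pad-style XOR with a pre-shared key, not the Signal protocol that WhatsApp actually uses, and the message and key here are invented for the example.

```python
import secrets

def xor_bytes(data, key):
    """XOR each byte of data with the corresponding key byte."""
    return bytes(d ^ k for d, k in zip(data, key))

# Toy model: sender and receiver share a secret key out of band;
# the relay server only ever sees the ciphertext.
message = b"see you at noon"
key = secrets.token_bytes(len(message))  # random one-time key

ciphertext = xor_bytes(message, key)     # what the server relays
plaintext = xor_bytes(ciphertext, key)   # receiver decrypts
assert plaintext == message
```

The design point the excerpt makes survives even in this toy: without the key, the server holding `ciphertext` learns nothing about the message, which is why even WhatsApp itself cannot read users' messages.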
What tech companies know about your kids
I’m an anthropologist, and I’m also the mother of two little girls. I started to become interested in this question in 2015, when I suddenly realized that vast – almost unimaginable – amounts of data traces were being produced and collected about children. So I launched a research project, called Child Data Citizen, that aims to fill in the blanks.
We have no knowledge or control over the ways in which those who buy, sell and process our data are profiling us and our children. But these profiles can come to impact our rights in significant ways.
But we need to abandon the belief that these technologies can objectively profile humans and that we can rely on them to make data-driven decisions about individual lives. Because they can’t profile humans. Data traces are not the mirror of who we are. Humans think one thing and say the opposite, feel one way and act differently. Algorithmic predictions of our digital practices cannot account for the unpredictability and complexity of human experience.
What you need to know about face surveillance
So why do you do these things? My guess is, it’s because you care about your privacy. The idea that privacy is dead is a myth. The idea that people don’t care about their privacy because “they have nothing to hide” or they’ve done nothing wrong is also a myth. I’m guessing that you would not want to publicly share on the internet, for the world to see, all of your medical records. Or your search histories from your phone or your computer. And I bet that if the government wanted to put a chip in your brain to transmit every one of your thoughts to a centralized government computer, you would balk at that.
Nonetheless, we face a daily propaganda onslaught telling us that we have to give up some privacy in exchange for safety through surveillance programs. Face surveillance is the most dangerous of these technologies. There are two primary ways governments use technologies like this today. One is face recognition – identifying someone in an image. The second is face surveillance, which can be used in concert with surveillance-camera networks and databases to create records of all people’s public movements, habits and associations, effectively creating a digital panopticon.
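The distinction between the two uses can be sketched in code. The toy Python below is not any real system: the names, embedding vectors, and matching threshold are invented, and production systems derive face embeddings from trained neural networks rather than hand-written lists. Recognition answers "who is in this one image?"; surveillance runs that same matching continuously over a camera feed to build a time-stamped log of who appeared where.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical database mapping identities to face embeddings.
database = {
    "alice": [0.9, 0.1, 0.2],
    "bob":   [0.1, 0.8, 0.3],
}

def recognize(probe, threshold=0.9):
    """Face recognition: identify one person in one image."""
    best_name, best_score = None, 0.0
    for name, embedding in database.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None

def surveil(frames):
    """Face surveillance: run recognition over a camera feed,
    logging who appeared when, i.e. a record of public movements."""
    log = []
    for timestamp, probe in frames:
        name = recognize(probe)
        if name is not None:
            log.append((timestamp, name))
    return log
```

The difference is not the matching algorithm, which is identical, but the loop around it: `surveil` turns a one-off identification tool into the "press a button and collect a record" capability the excerpt warns about.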
Just consider how trivial it would be for a government agency to put a surveillance camera outside a building where people meet for Alcoholics Anonymous meetings. They could connect that camera to a face-surveillance algorithm and a database, press a button and sit back and collect a record of every person receiving treatment for alcoholism. It would be just as easy for a government agency to use this technology to automatically identify every person who attended the Women’s March or a Black Lives Matter protest. Even the technology industry is aware of the gravity of this problem.