The surveillance state has arrived
I have been interested in computers and technology since junior high school, when my school acquired its first computer – a lonely RadioShack TRS-80, housed up in the library. Because I was a strong math student, I was selected as one of two kids from each class to visit with the computer a couple of times a week to learn how to program in BASIC. From that point forward, I was enchanted.
In those days (we are talking about the early 1980s now) and for the two decades that followed, the power and sophistication of technology grew exponentially, accompanied by optimism about the promise of amazing services and the solving of big problems. Sure, there were people of great foresight who saw the darker implications just over the horizon of this rise in processing power and the increasing ubiquity of computer hardware. But these were lone voices in the wilderness, for the most part; I consider myself a critical person, yet I certainly did not pay a whole lot of attention to these concerns.
As I’ve said in a previous post, I miss those days of technological idealism and optimism. Now, it seems like we’ve entered a tech dystopia, in which our tech titans still speak the idealistic language of “bringing the world together,” but where the platforms do otherwise: monopolize competition out of existence, amplify people’s fears and hatreds, undermine truth in ways that Orwell warned about, and foster in users a sense of general anxiety and distrust. Now, instead of getting excited about how tech tools can help me, I eye each new one warily and with apprehension, wondering what hidden perils lurk beneath the shiny surface of the app, waiting to steal my personal information or manipulate my behavior. As tech investor and critic Roger McNamee has observed, this erosion of trust is terrible for society and ultimately, may undermine the business of the tech companies themselves.
One area that I believe is a cause for serious concern, and has justifiably received a great deal of attention of late, is the danger to privacy – and, I would say, without meaning to be at all dramatic, to liberty – posed by the rise in the power and accuracy of surveillance technology. This is something that every thoughtful person needs to be informed and concerned about. (Start with the New York Times’s remarkable “Privacy Project” series – essential reading on these issues.)
For those who believe that the surveillance state that China is building based on its prowess in AI is something that is “over there,” take a listen to this Atlantic magazine podcast about how surveillance and facial recognition were deployed by the manager of a New York City subsidized housing development. Then listen to the New York Times podcast (and read the article) about Clearview AI, which amassed an enormous collection of images available on the internet and social media platforms and used it to create a facial recognition engine the likes of which has not previously been seen. (I have to wonder whether they violated, for example, the Illinois Biometric Information Privacy Act, about which goodcounsel has previously posted; I have to think that they’ll be hearing from enterprising plaintiffs’ lawyers soon. [Update: yup.])
It was concerning enough that law enforcement agencies signed on to be Clearview customers, without any public disclosure or debate. (Hey, why not? They even offered a 30-day free trial!) What was incredible was how Clearview itself – a privately owned company – was monitoring what law enforcement was doing. Surveilling the surveillers!
I’m not saying that everyone who creates or uses these products is evil (though some of them probably are). Clearview’s law enforcement customers certainly have good motives. The Times story describes how Clearview helped law enforcement quickly solve some cold cases. This usefulness is, of course, part of the seduction. Who wouldn’t appreciate the convenience of an ATM recognizing us so we don’t have to mess with ATM cards? Who doesn’t want heinous crimes to be solved? But how far do we take that logic? If citizens were implanted with geolocation chips at birth, that would help law enforcement a lot, too. It would also serve as the infrastructure of totalitarianism. In fact, to an extent that might shock you, by carrying our mobile phones we are already trackable – not by the government but simply by private companies that trade in this information. (Great New York Times pieces here and here explain how this tracking is happening, today, with little regulation; also, this radio story and this article report how the Trump campaign is targeting potential Catholic voters by using commercial marketing databases to geo-locate them as they leave church on Sundays.) In China, the government is perfecting ubiquitous surveillance. These issues are upon us, right now. We need to decide how much privacy we are willing to give up, and how much liberty we are willing to risk, in return for more convenient shopping or more effective law enforcement. We need to consider whether the use of technologies like facial recognition should be banned entirely.
Too often, we talk about the development and deployment of new technologies as inevitable, rather than as the product of choices, including the choice not to regulate. Of course, an attitude of resignation is helpful to the tech giants that would like to use and profit from these technologies in an unregulated manner. Our society’s failure to establish clear lines in the past has led to a pernicious “Surveillance Capitalism,” in which private companies surveil us without our knowledge, or even by actively deceiving us about what they do, and thus manipulate and shape our choices rather than catering to them. I believe that it is incumbent upon everyone – we as citizens, our elected officials, and those in tech who are trying to make technology more humane and less destructive – to learn and talk about these issues and to pass appropriate legislation. Many of our leading politicians are of an age that they struggle to understand what is happening. Right now, anything goes, and that is dangerous. This is a genie that has to be put back into the bottle.