
Too Smart for Their Own Good

Writer and urbanist Adam Greenfield speaks with Triple Canopy associate editor Matthew Shen Goodman about the promises and perils of intelligent objects, the futility of air-quality sensors, and the prospect of data streams being weaponized by “men’s rights” advocates whose power to wreak havoc may supersede that of the state or Google (or of the state conspiring with Google). How might the incipient class of sensor-equipped objects and data-hungry infrastructure, meant to empower us through the provision of massive amounts of information about the world and our behaviors, in fact stymie or delude us—or even empower the most noxious elements of society?

This conversation is part of Trusting Objects, a series of interviews that examines our interactions with the many instances and modes of the contemporary object, from the driverless, networked smart car to the visual marks that constitute the form of a poem.

Matthew Shen Goodman: In Everyware: The Dawning Age of Ubiquitous Computing, published in 2006, you describe how smart technologies, from radio-frequency identification of pets to networked street signs, are becoming increasingly pervasive in our lives. You suggest that “if we make wise choices about the terms on which we accept it, we can extend the utility and convenience of ubiquitous computing to billions of lives, addressing dissatisfactions as old as human history.” Do you remain hopeful about ubiquitous computing and smart technology?

Adam Greenfield: I think that then, like many of the people around me, I was enthusiastic about technology qua technology. I thought there would turn out to be a more or less linear progression between the development of networked information technologies and their emancipatory potential. Particularly, democratization of access to these tools—I thought that any advance in that domain was inherently interesting and good. But in time I couldn’t help but notice that each new information technology seemed to be coopted really quickly, whether through commercial exploitation or neoliberal valorization.

The gap between my hopes for some of these technologies and what they’ve actually become can be seen in Google Glass, for example. Who uses Google Glass, and what sorts of social practices are reflected in that use, or extended by it? You realize very quickly that Glass tends to reinscribe certain kinds of value in the world; my hopes for its liberating potential were naive.

Shen Goodman: What were your hopes for Google Glass?

Greenfield: Well, for example, I’m face-blind, and I find it socially mortifying. For me the prospect of Glass was really simple and enticing: I’d be able to drop a pair of augmented-reality glasses on my face that would recognize people, tell me who they are and where I last saw them. In principle, this would happen quickly and, if you’ll excuse the expression, seamlessly, to such a degree that there wouldn’t be a noticeable lacuna in our conversation.

We could probably do something like that now with publicly available databases. But there are technical constraints: Unless I want each and every face in my field of vision to be tagged, I’d probably have to invoke a command as I get within a few dozen meters of you; I’d have to stare straight at you for Glass to acquire your face. Even in the best-case technical scenario, in which I have a fast network connection and get a quick and accurate facial reading, there’d still be this really weird pause in the social interaction. I’m not sure that’s any better than the awkwardness, discomfort, and occasional offense that I deal with now.

To optimize this feature of Glass would also entail a fairly significant social breach. Where does the data come from? How would the databases work? I imagine eighty-one separate indices of your facial bone structure that exist in an open database and can be correlated with your identity in the amount of time it takes to blink. I’m not sure that we as a society have consciously and in an informed way decided that we want unique biometric identifiers floating around on open databases.
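To make the scenario concrete, here is a minimal sketch, in Python, of the kind of lookup Greenfield is describing: a short descriptor vector standing in for the facial measurements he imagines, matched by nearest neighbor against a hypothetical open index. The names, values, and threshold are invented for illustration; no real service, device API, or dataset is implied.

```python
"""A minimal sketch of the face lookup Greenfield imagines: a wearable
captures a descriptor (a short vector standing in for the facial-bone
indices he describes) and matches it against a hypothetical open
database. Everything here is illustrative; no real service or dataset
is implied."""

import math

# Hypothetical open index: identity -> facial descriptor vector.
OPEN_INDEX = {
    "A. Acquaintance (last seen: book launch, October)": [0.12, 0.88, 0.43, 0.07],
    "B. Colleague (last seen: studio visit, January)":   [0.55, 0.21, 0.64, 0.90],
}

def euclidean(a, b):
    """Plain Euclidean distance between two descriptor vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(descriptor, threshold=0.25):
    """Return the closest identity within `threshold`, else None.
    In Greenfield's scenario this round trip would have to finish
    faster than a conversational pause to feel 'seamless'."""
    best_name, best_dist = None, float("inf")
    for name, stored in OPEN_INDEX.items():
        d = euclidean(descriptor, stored)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist <= threshold else None

# A captured face that happens to sit close to the first entry.
print(identify([0.13, 0.86, 0.44, 0.08]))
```

Even granting the lookup itself, everything in that index would have to exist somewhere, in the open, keyed to real identities, which is exactly the social breach he is pointing to.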

Shen Goodman: Consider the glut of smart objects now available to consumers, as well as the increasing concern over privacy and data precipitated by revelations concerning NSA surveillance. Do you feel that the public is finally scrutinizing not only objects like Google Glass but also the systems and networks that activate or are activated by those objects?

Greenfield: Now that people are dealing with these issues through the products and services in their daily lives, sure, there’s a national conversation. Typically when something like Nest—the home-automation company known for its smart thermostats and recently acquired by Google—comes into being, people pose provocative questions about it. But even before Nest’s products shipped, we could say that gathering information about the interiors of our private spaces could serve the purpose of characterizing the activities of the people in those spaces.
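As a rough illustration of how interior sensor data characterizes a household, here is a minimal sketch in Python that turns a toy log of hourly motion readings into an occupancy profile. The samples are invented and no real thermostat’s log format is implied; the point is only how little data it takes for a daily rhythm to become legible.

```python
"""A minimal sketch of how interior sensor data 'characterizes the
activities of the people in those spaces'. The event log is invented;
no real device log format is implied."""

from collections import Counter

# Hypothetical (hour, motion_detected) samples gathered over several days.
samples = [
    (7, True), (8, True), (9, False), (13, False), (18, True),
    (19, True), (22, True), (7, True), (9, False), (13, False),
    (18, True), (23, True),
]

def occupancy_profile(samples):
    """Fraction of observations with motion in each sampled hour: a crude
    picture of when the household is usually home."""
    seen, active = Counter(), Counter()
    for hour, motion in samples:
        seen[hour] += 1
        if motion:
            active[hour] += 1
    return {hour: active[hour] / seen[hour] for hour in sorted(seen)}

print(occupancy_profile(samples))
# Even this toy profile makes the household's rhythm legible:
# mornings and evenings at home, midday reliably empty.
```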

One point I make in Everyware, which I stand by, is that we don’t want to live in a culture in which we are perforce compelled to tell the truth to one another about everything at all times. I couldn’t possibly have told all of the people I interacted with today everything I was thinking about them. Sometimes you need to do business with a jackass, and you can’t be upfront about your feelings. The same goes for our locations, our patterns of activity, even the music that we listen to. I surely don’t need to tell everyone in the world that I’ve been listening to the Carpenters all day. We all need some deniability.

Shen Goodman: Along the same lines, do we want to be constantly apprised of the information being gathered and the functions being performed by the objects that we interact with on a daily basis, so that we might constantly modify our behavior? We want a certain seamlessness when we swipe a MetroCard. Being totally aware of every system we’re engaging, every input and output, at every moment—that sounds like hell.

Greenfield: This is a great question: What are the areas of life that we should sand down until they are essentially frictionless, and what are the areas in which friction might be useful? If you were constantly exposed to all of the information produced by these objects and to records of every interaction with the systems in which they’re embedded, you’d be overwhelmed. Ideally, we can create interfaces that allow you to negotiate the complexity of the system that you’re operating, to progressively reveal or steadily conceal information, depending on the level of engagement you want.
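One way to picture such an interface is a simple progressive-disclosure scheme: the same record rendered at different levels of engagement. The sketch below, in Python, uses an invented transit-swipe record and invented field names; it is not any real system’s schema, just an illustration of revealing or concealing detail on demand.

```python
"""A minimal sketch of progressive disclosure: one transaction record,
exposed at different levels of engagement. Field names and levels are
illustrative, not any real system's schema."""

# Which fields of a transit-card swipe are exposed at each level.
DISCLOSURE_LEVELS = {
    "glance": ["balance"],
    "review": ["balance", "station", "time"],
    "audit":  ["balance", "station", "time", "card_id", "reader_id"],
}

def render(record, level="glance"):
    """Return only the fields appropriate to the chosen engagement level."""
    return {k: record[k] for k in DISCLOSURE_LEVELS[level]}

swipe = {
    "balance": 12.75, "station": "Bergen St", "time": "08:42",
    "card_id": "card-4417", "reader_id": "R-103",
}

print(render(swipe))            # frictionless: just the balance
print(render(swipe, "audit"))   # full detail, only when asked for
```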

In this case, friction could be an expressive instrument. You could calibrate the amount of friction you experience so as to attain a plateau or peak of what we might call productivity—but not in the economic sense of the term. A certain amount of friction within the physical infrastructure of urban life is necessary for accidents to occur, for us to have those conversations and encounters that give life value and meaning. If I magically take that away, I diminish the experience of existing in four-dimensional space-time, which involves feeling some need and walking down to the corner to satisfy it.

Shen Goodman: In terms of civic applications, the ability of smart objects to measure activities and produce data often seems to be confused with the capacity to effect meaningful change.

Greenfield: One of the arguments most frequently made in support of networked smart systems and sensors is that they can be used for citizen science, or scientific research undertaken by nonprofessional scientists. Take air quality as an example: We’re going to give people everywhere their own air-quality readers, which they’ll use to gather and correlate and aggregate information. They’ll learn about how science is done and about their community’s air quality. That may be true, but the ultimate implication here is that we’re going to collectively measure the air quality and therefore collectively change it, which I don’t buy at all.

We already know who the worst polluters are. To measure this pollution and not take concrete action to change it is to set people up for failure on a fairly large scale. I could construct a grid of air-quality sensors placed three meters apart, turning the city into a pincushion of data-collection sources. Would it tell me anything that I don’t already know about who the bad actors are? And would we get any closer to the day when we can effectively bring power to bear and change the situation?
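For what it’s worth, the measurement half of that thought experiment is trivial to express. A minimal sketch in Python, with invented readings and an assumed three-meter sensor spacing, aggregates point readings into per-block averages; as Greenfield argues, producing the map is the easy part, and it tells us little we don’t already know.

```python
"""A minimal sketch of the citizen-science aggregation described above:
readings from a dense grid of hypothetical air-quality sensors, averaged
per block. The sensor data and grid spacing are invented."""

from collections import defaultdict
from statistics import mean

# Hypothetical readings: (x_meters, y_meters, PM2.5 in micrograms/m3),
# one sensor every three meters, as in the thought experiment.
readings = [
    (0, 0, 34.2), (3, 0, 35.1), (0, 3, 33.4), (3, 3, 36.0),   # near a depot
    (90, 0, 11.3), (93, 0, 11.8), (90, 3, 12.0),              # a quieter block
]

def aggregate_by_block(readings, block_size=30):
    """Bucket point readings into block_size x block_size meter cells
    and report the mean PM2.5 for each cell."""
    cells = defaultdict(list)
    for x, y, pm in readings:
        cells[(x // block_size, y // block_size)].append(pm)
    return {cell: round(mean(values), 1) for cell, values in cells.items()}

print(aggregate_by_block(readings))
# Prints one mean PM2.5 value per 30 m cell; the polluted block stands
# out immediately, which is the point: we already knew where it was.
```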

Shen Goodman: The cynical response would be that it’s an initiative pushed by people who sell air-quality sensors.

Greenfield: I think this kind of enterprise further erodes any faith that people might have in their ability to change things. I believe the advocacy for citizen science and measurement is sincere, but I don’t see any evidence that data does actually drive policy.

Three or four years ago I thought that opening up access to the information produced by public objects gathering data was the best way to avoid concentrations and asymmetries of power, and that the two most potent threats to individuals were the private sector and the state, which could bring power to bear on us in ways that we’d ultimately find not to be very congenial. What Gamergate, of all things, has shown me is that directed swarms of sociopathic individuals might present much more of a threat than the state, commercialism, or any confluence of the two. (The fact that I hadn’t recognized this earlier is certainly a failure of imagination on my part and a marker of my comfort and privilege.) The people who could really fucking ruin your day turn out to be “men’s rights” advocates, one of whom may have a law degree and be able to actually hijack the processes of the state and file a ton of bullshit claims against you. Someone like that is able to exploit the latent vulnerabilities of our system and cause material as well as psychic damage in a way that the state and the private sector generally cannot. Unlimited open access to the data gathered by public objects would give people like this a very effective harassment tool, which to me is sufficient cause to rethink the whole notion from the ground up. I’m only sorry it took a concrete, real-world example of networked data being used to hound someone out of public life for me to understand the risks.

In network security there’s a notion called the Advanced Persistent Threat, or APT. Generally this is an organization or quasi-organization that has the capability, intention, and longevity to exploit the weaknesses that you present to the world. The point isn’t that an APT is going to attack you today; an APT will gather information slowly and patiently for a long time, build a total picture of you, and maybe place a piece of code on a server somewhere that will be exploited by some other piece of malware five years down the line. The APT is playing the long game, and that’s what makes it a persistent threat.

What we’ve learned is that you don’t need to be Visa or FedEx or Amazon to have APTs interested in you; all you need is to be someone sharing her or his opinion online. In this context it’s all too easy to see how the data streams produced by public objects could become weaponized. God forbid you should piss off somebody who’s able to figure out where you live and when you’re at home, and to use those facts to do something really upsetting or disturbing to you or your family.

Shen Goodman: These sorts of attacks by, or enabled by, smart objects could scale up or down, from the domestic to the civic and back again.

Greenfield: You know, someone once told me that Iranian state Internet censorship is apparently much more sophisticated than Chinese censorship. The Great Firewall is very binary and clumsy. You know when you’re blocked and why the Internet’s not working, and you can take measures to get around the firewall. Apparently the Iranian strategy has been to institute slowdowns, to throttle Internet connections and make it incredibly slow for a page to load. People give up; they assume the Internet connection is bad, and they don’t think the state is actually censoring them.

You see how that could be another really dangerous strategy for harassing people: Just keep adding friction. Keep someone from being able to buy one thing in particular. If somebody really wants to fuck with your life, he’ll make it so that you absolutely cannot ring up toilet paper anywhere you go.
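The difference between the two strategies is easy to state in code. The sketch below, in Python, is purely illustrative: a hard block that fails loudly versus a throttle that merely injects delay, with made-up timings and no real network stack behind it.

```python
"""A toy contrast between the two censorship styles described above:
a binary block, which announces itself, versus a throttle, which just
adds friction until people give up and blame their connection.
Everything here is invented for illustration."""

import time

def fetch_blocked(url):
    """'Great Firewall' style: the request fails loudly and immediately,
    so the user knows a policy is in effect and can try to route around it."""
    raise ConnectionError(f"{url}: connection reset")

def fetch_throttled(url, delay_seconds=8.0):
    """Slowdown style: the page eventually loads, but so slowly that the
    user is more likely to blame a bad connection than a deliberate policy."""
    time.sleep(delay_seconds)
    return f"(contents of {url}, delivered {delay_seconds} seconds late)"
```

The second function is the one Greenfield finds more dangerous, precisely because nothing about its output looks like censorship.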

“Too Smart for Their Own Good” was published as part of Triple Canopy’s Research Work project area, which receives support from the Andy Warhol Foundation for the Visual Arts, the Brown Foundation, Inc., of Houston, the Lambent Foundation Fund of Tides Foundation, and the New York City Department of Cultural Affairs in partnership with the City Council.