Forum 2: SURVEILLANCE

Saturday, May 4, 2013
11:00 a.m.–12:30 p.m.
Wong Auditorium, MIT

It is a truth universally acknowledged that digital technologies have immensely enhanced existing means of surveillance by government and corporations and have created powerful new instruments to monitor individual behavior. Do the ramifying systems for observing and recording our routine activities fundamentally threaten our privacy and freedom, as many have argued? In an era of data mining and smart algorithms, is our awareness that we are being monitored, converted to bits, and distributed among databases changing the way we behave as citizens and individuals? Should it do so? Or is this framing of the question too pessimistic, ignoring the fact that many of the world's data collectors are, or claim to be, improving our lives through expanded productivity, services tailored to individual users, and advances not merely in shopping but in health, education, and public safety?

Speakers

Goran Bolin is a professor of media and communication studies at Sodertorn University, Stockholm. He is the author of Value and the Media: Cultural Production and Consumption in Digital Markets (2011) and the editor of Cultural Technologies (2012).

Kelly Gates is an associate professor of communication, science studies, and critical gender studies at the University of California, San Diego. She is the author of Our Biometric Future (2011).

Jose van Dijck is a professor of comparative media studies at the University of Amsterdam, where she served as Dean of Humanities. She is the author of six books, most recently The Culture of Connectivity: A Critical History of Social Media (2013).

Moderator: Ethan Zuckerman is director of the Center for Civic Media and a principal research scientist in the Media Lab at MIT. He is co-founder of international citizen media community Global Voices, and his book Rewire: Digital Cosmopolitans in the Age of Connection will be published in June.

Video

Video of Surveillance is available.

Podcast

A downloadable podcast of Surveillance is available.

Audiocast

Streaming audio of Surveillance is available.

Summary

By Elyse Graham

Photos by Greg Peverill-Conti

[this is an edited summary, not a verbatim transcript]

Ethan Zuckerman

Ethan Zuckerman opened the discussion by pointing out that the arrival of “a more digitized and digitally mediated age” initially brought a strain of hope that new technologies would enable citizens to have greater power as observers from below. The Canadian tech pioneer Steve Mann coined the term “sousveillance” to suggest the idea, as Zuckerman put it, that “once we all have cameras, we can watch the folks in power.”

But reality has not lived up to these dreams, Zuckerman said. Instead, we have entered a culture in which large institutions, as a routine matter, exercise powers of surveillance that far exceed those of ordinary individuals. Companies follow our movements and interests online; grocery stores keep track of our purchases. This situation looks to intensify with the recent appearance of “big data,” in which technology delivers information in massive volumes that may require special tools for access and understanding.

Not everyone sees this situation as a cause for concern, Zuckerman said. Consumers, in particular, seem content to share information if it means they can save money. But the world of big data has begun to attract critical attention that takes up issues of privacy in new ways: “ideas that are thoughtful and poetic, like a ‘right to be forgotten,’ that basically says, ‘Yes, I realize that all of my movements are going to be watched, yes, I know that my entire social life online is going to be tracked, but could you please, after analyzing me, building me into your model, at some point eventually allow me to be forgotten?’”

 “Big data is promising big innovations in health, big innovations in governance,” he added. The time may have arrived when we have to decide whether the trade-offs these promises entail are “ones that we actually want to make or want to question.”

Goran Bolin

Goran Bolin argued that the rise of new methods of surveillance based on algorithmic processing has the potential to change our sense of self.

Bolin began by making a distinction between audiences and media users. Audiences are a commodity that media companies sell to advertisers; they are statistical representations crafted by marketing departments and statisticians, and thus have no agency or self-consciousness. By contrast, media users are the individual “social subjects” who actually watch television and read newspapers. They have agency and self-consciousness, and are able to compare their own media use with statistical reports about audiences.

The digital world relies on yet another model of audience construction, he said, namely that of mass personalization. In this model, media producers collect and sell information about specific users using hidden tracers and algorithms. “They suck the DNA out of our behavior and construct and package this behavior for advertisers,” Bolin said.

Most of us know advertisers are tracking our behavior online, he said, but we don’t know much about what this information says about us as users. “We know when we go to Amazon that we are targeted and we get suggestions for books,” he said, “but the exact measure of how we are constructed, how our behavior is turned into this commodity, is much more complex.”

Kelly Gates

Kelly Gates questioned “the calls for increased surveillance in the wake of the Boston Marathon bombings.”

“As soon as the FBI released photos of the bombers,” she said, “we began to hear public expressions of appreciation for all the surveillance at the disposal of investigators.”

But these celebrations are premature, she argued. Firstly, the actual value of camera surveillance for crime prevention is not clear. Adding CCTV cameras to the streets of DC has not decreased street crime; it isn’t even clear that the images of the bombers that the FBI released to the public were of any real use in the capture of the suspects. Secondly, the columnists and commentators who are leading the conversation about surveillance see only small parts of the issue. Many people seem “to equate more surveillance with more cameras,” she said, but more surveillance could also mean better cameras, or new forms of body analysis, or softer wiretap laws.

In short, when we talk about surveillance, we need to take care to consider all of the moving parts involved, Gates said. She offered the example of a suggestion made at Friday’s panel that we design a system that warns people when someone nearby is making a video recording: “Who gets pinged and who doesn’t? Would we get pinged when we come into the purview of the FBI?”

Jose van Dijck

Jose van Dijck said we are entering a new technological order in which the question of whether to have more surveillance or less has changed meaning.

Citing the authors Viktor Mayer-Schönberger and Kenneth Cukier, Van Dijck argued that we are living in a new era of “datafication,” in which technology codes and quantifies our lives in minute detail, from social connections to conversations. As a result, law enforcement agencies are struggling to make sense of ever larger and messier quantities of data. But more data does not always mean more usable information. We rely on platforms to generate data, which means it is never unmediated. The answers that a data search produces always depend, on some level, on the questions we ask; for instance, on how we define a terrorist. And people themselves are messy and hard to pin down: they play around, they engage in irony, they make mistakes. For all of these reasons, Van Dijck said, we should be skeptical of the ability of big data to create a more legible, more governable world.

“We should not embrace the acceptance of our datafication as a belief,” she said. “Do we need more data about ourselves and what we do? Will that improve our surveillance?”

Zuckerman asked Van Dijck how unpredictability fits into the use of big data to sort out the world statistically. “We can predict most people’s movements eighty percent of the time, but nothing is preventing me from walking away from the stage right now,” he said. “How does unpredictability creep into the digital media space?”

Van Dijck replied that this issue exemplifies the fact that the answers we receive from any query rely on what questions we ask. This makes it difficult to prepare for truly unexpected events, she said: “Things have to be designed in. Questions have to be designed in. There’s a misplaced faith in the possibility of being able to predict everything in advance and do it in advance and prevent it.”

Zuckerman noted a project imagined by one of his colleagues that would install sound-activated cameras in conflict-ridden zones. In theory, loud noises, like gunfire, would trigger the cameras, creating a record of anomalous events. But, as the thought experiment is intended to demonstrate, loud noises are more common than we think, and the cameras would likely end up recording a lot of business as usual.

“Looking for the exception is harder than it seems,” he said. “Cars backfire even in war zones.”

Audience Questions and Answers

One audience member asked on what grounds we can take confidence in interpretations of large data sets. Is it even possible to develop a hermeneutics of big data, or is this a fantasy?

Van Dijck described an experience in which an information scientist showed her a set of data that tracked people using certain search queries online. The information scientist noted a pattern in which people who searched for “mortgage” later searched for “interior design,” then “garden.” But when she looked at the same data, Van Dijck said, she saw a different pattern: “mortgage,” then “bankruptcy,” then “YMCA.”

“It was the same set of data,” she said, “but with totally different interpretations of what they meant.”

The same audience member also asked how “sousveillance,” or observing from below, relates in practice to surveillance by the government and other large institutions.

Gates said that the aftermath of the Boston Marathon bombing demonstrated the limitations of crowdsourcing for criminal investigation. After the marathon, the FBI asked citizens to send in photographs and videos taken that day; at the same time, crowds of amateur sleuths on the internet pored over the same footage online. As it turned out, the internet was able to provide very little help in solving the crime. (The popular social networking website Reddit initially identified the wrong man as a suspect.) “Crowdsourcing didn’t have the same capacities to sift through the data as the FBI,” Gates said. “So crowdsourcing and ‘sous-veillance’ don’t create a level playing field.”

Another audience member asked the panelists to discuss the potential downsides of the “collection of big data by government agencies.”

Gates and Zuckerman noted that big data often produces inaccuracies, and that it may paint a picture of us that, as Gates put it, “doesn’t quite match how we view ourselves.” Zuckerman said a data service has “determined” that he is “unmarried, make about one fifth of my income, am a gun owner, and am a Republican.” (These details present a very poor picture of the real Ethan Zuckerman.)

Van Dijck asked the panel whether it would help society for citizens to make a point of learning more about statistics. “Should we train people in data literacy?” she asked.

“With questions of terror and threats, our innumeracy about risk is an enormous piece of the picture,” Zuckerman said. Would the columnists who write in favor of more surveillance “be making the same point if we had, as a public, a better sense of how exceedingly rare public acts of violence are compared with handgun shootings?”

Gates argued that statistical knowledge will never be sufficient to counter the visual power of acts of terrorism: “The power of images can always counter our rational knowledge of how rare these events are.”

