By Gary T. Marx, Professor Emeritus, MIT
A Slovak-language version of this article, translated by Malina Olszewska.
This article considers privacy and surveillance filtered through specification of some individual and social dimensions of information control. The two can be related in a variety of empirical and ethical configurations. In both academic and popular discussion privacy is too often justified as a value because of what it is presumed to do for individuals. But as this volume shows, it can also be a positive social value because of what it does for the group. An additional point (neglected by some of privacy’s more strident supporters) is that it can also be an anti-social value tied to private property and modernization. 1.
In contrast, surveillance is too often criticized for what it is presumed to do for more powerful groups (whether government or corporations) relative to the individual. But it can also be a pro-social value. Just as privacy can support the dignity and freedom to act of the person, surveillance can protect the integrity and independence of groups vital to a pluralistic democratic society and it can offer protection to individuals, whether for the dependent such as children and the sick, or to those who like clean water and industrial safety and do not want their precious liberties destroyed by enemies. Surveillance, like privacy, can be good for the individual and for the society, but like privacy it can also have negative consequences for both.
As with most interesting questions, "it all depends." But what does it depend on? To begin with we must come to terms with the meaning of some basic terms for information control. A map and a common language are required to explain and evaluate fundamental properties, contexts and behaviors involving personal information. The empirical richness of information protection and revelation needs to be disentangled and parsed into basic categories and dimensions. Some of the confusion and debate about privacy and surveillance is caused by the failure to consider different types of these and the dimensions which may cross-cut and divide them. Varying time periods, the particular groups and individuals in question and the variety of positive and negative consequences (and ways of measuring and weighing these) must be noted.
Little can (or better should) be universally said about the topic apart from such specification. Terms must be defined and connections noted. What we see and conclude depends on how we turn the conceptual kaleidoscope. In broadening and turning the kaleidoscope this article considers some connections between surveillance, privacy and publicity; elaborates on some meanings of surveillance and privacy and then considers contexts and goals (and conflicts between goals) that inform the normative questions and make them so challenging.
Related but Distinct: Surveillance and Privacy, Privacy and Publicity
How do surveillance and privacy relate? Before considering their logical, empirical and ethical connections they need to be seen as elements within a broader sociology of information control framework. They both are about the control of information—in one case as discovery, in the other as protection. At the most basic level, surveillance is a way of accessing data. Surveillance implies an agent who accesses (whether through discovery tools, rules or physical/logistical settings) personal data. Privacy, in contrast, involves a subject who restricts access to personal data through the same means. 2.
In popular and academic dialogue surveillance is often wrongly seen to be the opposite of privacy and, in simplistic dramaturgy, the former is seen as bad and the latter good. For example, social psychologist Kelvin (Kelvin 1973) emphasized privacy as a nullification mechanism for surveillance. But Kelvin’s assertion needs to be seen as only one of four basic empirical connections between privacy and surveillance. Surveillance is not necessarily the dark side of the social dimension of privacy.
Privacy-publicity (nouns)
The noun surveillance and the verb to surveil are the same parts of speech as privacy and privatization. The latter however have their opposites in publicity and publicization. But where are the equivalent opposites for surveillance as a noun and a verb? If privacy and publicity are opposites can we say that privacy and surveillance are also opposites? They can certainly be in opposition. But it does not then follow that to surveil and to publicize are automatically joined, although, as the next paragraph suggests, they can be linked.
In English there is no easy term for the action which is the opposite of surveillance. The verb form to surveil suggests actively surveying by an agent just as the verb form to privatize suggests actively protecting (although the more common usage involves property rights as with privatization.) While publicize is the opposite of privatize, the closest term we have for a potential surveillance agent who does not act is to say that he or she demonstrates anti- or non-surveillance. 3. The agent chooses not to act. He or she doesn’t want to know (as with the proverbial three monkeys who were capable of surveilling but did not.) One form here is minimization in which a surveillance agent engages in self-restraint given laws or policies (e.g., wiretap restrictions in a search warrant.) In contrast, with privatization the subject who has the data chooses to act to protect-restrict its being known by unspecified others. But the subject could also take actions of publicization in efforts to broadcast it. 4.
The distinct activities covered by the umbrella term surveillance do not have equivalent implications for privacy. The most common meaning refers to an act of data discovery-collection, but these occur within a broader system of connected activities. Seven kinds of activity conceived as surveillance strips can be noted: tool selection; subject selection; collection; processing/analysis; interpretation; uses/action; and data fate. Considered together these strips constitute the surveillance occasion and offer a way to bind a given application. Privacy is most likely to be at issue with respect to data collection and uses that involve the communication of results.
The discovery of information and its communication are sequentially linked. Surveillance involves the ferreting out (or at least reception) of data. It thus has an element of publicization—at least to the surveillant who finds it. In this sense, to make public is to make available to at least some persons beyond those who initially have the data. The public (or better audience) for results may be minimal as with surveillance data classified as top secret, proprietary or confidential. Here the results of surveillance are "private" (even in the act of their becoming "public" to some, although the agent may not share it.) Yet this involves an act of restricting information rather than an offering or broadcasting to "the public" as the term is usually understood (e.g., in the form of a newspaper story, a posting on a webpage, or FOIA results.)
Privacy and Publicity
Privacy, like surveillance, is a multi-dimensional concept whose contours are often ill-defined, contested, negotiated and fluid, dependent on the context and culture. Consideration needs to be given to how the different forms of privacy and surveillance relate beyond considering these as generic forms. Among some major forms are informational (Westin 1967), aesthetic (Rule et al. 1983), decisional (DeCew 1997) and proprietary (Allen 2007) privacy.
Informational privacy is the most significant and contested contemporary form and involves the rules and conditions around personal information. Violations of aesthetic privacy, while usually carrying minimal implications for life chances, are often the most shocking (as with a hidden video camera in a girls’ locker room.) Breaches of decisional or proprietary privacy involve application or use of private information, rather than information discovery which is the core of surveillance. However, if individuals can nullify surveillance (e.g., hiding their use of contraceptives when that was illegal, blocking paparazzi from taking pictures, encryption) then they need not worry about that information being used.
Brief mention can be made of the term "public" in relation to the term "private" and they can be linked within the same framework. Both involve rules about the protection and revelation of information. Privacy norms are discretionary in generally giving individuals the right to control their personal information and restrict surveillance agents. Publicity norms require that information not be restricted—that is that it be made public, in effect legitimating the surveillant’s discovery actions. 5.
When the rules specify that a surveillance agent is not to ask certain questions of (or about) a person and the subject has discretion about what to reveal, we can speak of privacy norms. When the rules specify that the subject must reveal the information or the agent must seek it, we can speak of publicity norms (or, better perhaps, disclosure norms.) With publicity norms the subject has an obligation to reveal and/or the agent to discover (Marx 2011.) 6.
As with surveillance there are a multiplicity of legitimate goals for privacy and publicity and the social consequences depend on the context, the time period and the particular interests involved. In and of themselves, and viewed abstractly, as will be argued with surveillance, they are neither good nor bad. There is often an optimal point, and going too far in either direction may have negative consequences. No one wants to live in a fish bowl or spot light all the time and privacy can protect the ability to act strategically and can protect the individual against discrimination and other forms of unfair treatment. Privacy can aid in presenting a positive sense of self and give a feeling of being in control. Selectively sharing information can be a resource for intimacy and trust. The contribution of information control to autonomous group action is central for a democracy and a free market. Yet privacy can also protect illegality and hiding information is a central feature of deception and can be destructive of community. A rich mixture of consequences can also be seen for publicity—as visibility it can bring accountability and fairness but in violating legitimate privacy it can be invasive and can make it difficult for groups to pursue their goals. The last section considers some goal and value conflicts. Let us next turn to some further elements of surveillance.
What is Surveillance?
The English noun surveillance comes from the French verb surveiller. It is related to the Latin term vigilare with its hint that something vaguely sinister or threatening lurks beyond the watchtower and town walls. Still, the threat might be successfully warded off by the vigilant. This ancient meaning is reflected in the association many persons still make of surveillance with the activities of police and national security agencies. Yet in contemporary society the term has a far wider meaning.
The dictionary, thesaurus, and popular usage suggest a set of related activities: look, observe, watch, supervise, control, gaze, stare, view, scrutinize, examine, check-out, scan, screen, inspect, survey, glean, scope, monitor, track, follow, spy, eavesdrop, test, guard. While some of these are more inclusive than others and can be logically linked (e.g., moving from look to monitor), and while we might tease out subtle and distinctive meanings for each involving a particular sense, activity, or function, they all reflect what the philosopher Ludwig Wittgenstein calls a family of meanings within the broader concept (Wittgenstein 1958.)
At the most general level surveillance of humans (which is often, but need not be synonymous with human surveillance) can be defined as regard or attendance to others (whether a person, a group or an aggregate as with a national census) or to factors presumed to be associated with these. A central feature is gathering some form of data connectable to individuals (whether as uniquely identified or as a member of a category.) This may or may not involve revealing what was "private" as in not knowing and/or supporting or violating a norm about how a subject’s information is to be responded to.
A verb such as "observe" is not included in the definition because the nature of the means (or the senses involved) suggests subtypes and issues for analysis and ought not to be foreclosed by a definition (e.g., how do visual, auditory, text and other forms of surveillance compare with respect to factors such as intrusiveness, validity and the perception of a privacy invasion?) If such a verb is needed, to "scrutinize," "regard" or "attend to" is preferable to observe, with its tilt toward the visual.
The Multiplicity of Surveillance Goals
Many contemporary theorists offer a narrower definition tied to the goal of control (e.g., Rule et al. 1983, Dandeker 1990, Lyon 2001, Manning 2008, Monahan 2010)—a factor that contributes to surveillance being viewed as on the dark side. Taking a cue from Foucault’s earlier writings, control as domination is emphasized (whether explicitly or implicitly) rather than as a more positive direction or neutral discipline. Yet as Lianos (Lianos 2003) observes, the modern role of surveillance as control must be placed in perspective alongside its fundamental importance in enhancing institutional efficiency and services and also in offering protection to individuals and societies.
Surveillance, particularly as it involves the state and organizations, but also in role relationships as in the family, commonly involves power differences and on balance favors the more powerful. Understanding this is furthered with comparisons to settings where control and domination are not central as with other goals such as surveillance for protection, entertainment or contractual relations; where surveillance is reciprocal; and where it does not only, or necessarily, flow downward or serve to disadvantage the subject.
Authority and power relations are closely related to the ability to collect and use data. The conditions for accessing and using information are elements of a democratic society. (Haggerty and Samatas 2010) The greater the equality in subject-agent settings, the more likely it is that surveillance will be bi-lateral. Given the nature of social interaction and a resource-rich society with civil liberties, there is appreciable data collection from below as well as from above and also across settings. Reciprocal surveillance can also be seen in many hierarchical settings. Mann (Mann 2003) refers to watchful vigilance from below as sousveillance.
The historical changes Foucault observed in Discipline and Punish (Foucault 1975) are central for the analysis of contemporary events, even if in that book he does not go beyond 1836 (no examples of computer dossiers or biometric analysis bolster his case.) Yet one unfortunate legacy of his work is to call attention away from the pro-social aspects of surveillance and technology more broadly. Foucault’s empirical documentation is illustrative rather than systematic and tends to exclude important surveillance topics beyond the control of superordinates in hierarchical organizations. His tone and examples give a subversive, even conspiratorial, twist to the hallowed ideals of the Renaissance and the Enlightenment regarding the consequences of seeking truth and social betterment. Rather than ensuring freedom and universal benefits, knowledge serves the more powerful. However, he does not offer an adequate theory of why hierarchy is necessarily undesirable.
With respect to categories I suggest elsewhere (Marx, forthcoming) Foucault focuses on the watchers who are directly carrying out internal-constituency, non-reciprocated, rule-based, organizational surveillance of individuals on behalf of the organization’s goals. The hope behind such watching is that subjects’ fear of possible discovery will lead to self-surveillance and that rational analysis will improve outcomes desired by agents. The social significance of these forms is clear.
Yet other forms neglected by Foucault—for example, organizational surveillance for more benign ends, inter-organizational surveillance, and the non-organizational surveillance of individuals by one another—also need consideration. His analysis, as with that of many contemporary observers, does not give sufficient attention to the multiplicity and fluidity of surveillance goals and the conflicts between them. Surveillance may serve parallel or shared goals of the individual as well as the organization. It may be initiated by the individual and used against an organization. It may focus on rule-based standards involving kinds of behavior or it may involve social, psychological and physiological characteristics used to classify persons—whether to favor or disfavor them. Again we see there is no simple relationship nor evaluation possible when considering surveillance-privacy connections.
Foucault, and many of those in the surveillance essay and dystopian novelist tradition, collapse or reduce the more general process or activity of surveillance to just one context—the organizational—and to one goal, which is control, a term often used interchangeably with domination and repression. Control needs to be seen alongside other goals, a comparison that illustrates the potentially pro-social aspects of surveillance. Table 1 identifies 12 major goals for the collection of information on persons. This list is hardly exhaustive. Additional goals that may cut across these or fit within them include kinds of control (whether involving coercion or care), categorization, determination of accountability, and inclusion or exclusion involving access to the person and the person’s access (whether to resources, identities or physical and social egress and exit.)
Table 1: Surveillance Goals for Collecting Personal Information
Some natural clusters of goals and temporal patterns tend to be associated with specific contexts. Yet there may also be tension between goals (e.g., prevent vs. document.) Goals A-J are disproportionately associated with organizations, while K and L are more likely to involve individuals acting in a personal rather than an organizational capacity (although individuals may also seek many of the other goals such as documentation, influence and prevention.)
The definition of surveillance as hierarchical watching over or social control is inadequate. The broader definition offered here is based on the generic activity of surveilling (the taking in of data.) It does not build in the goal of control, nor specify directionality. In considering current forms we need to appreciate bi-directionality and horizontal as well as vertical directions. Control needs to be viewed as only one of many possible goals and/or outcomes of surveillance.
Contexts, Goals and Conflicts
Goals along with rules are a central factor for contextual analysis. Attention to the appropriateness of goals and of means for a given setting illustrates a central argument of this paper—that surveillance and privacy must in general be judged according to the legitimate expectations of the institution or organization in question (Marx 1988, Nissenbaum 2010.) To articulate a goal and apply a surveillance or privacy protection-revelation tool brings questions of empirical and ethical judgment. That is, how well does the tactic achieve both immediate and broader goals appropriate for the context? How does it compare to other means for achieving the goal or doing nothing? Is the goal desirable and if so, by what standard? The clarity and consequences of publicly stated goals along with the appropriateness of means are central to understanding surveillance.
The surveillance-privacy relationship will vary depending on the kind of surveillance with respect to contexts and goals, as well as the kind of privacy. One element here is the distinction between non-strategic and strategic surveillance. In his analysis of "The Look" Sartre (Sartre 1993) describes a situation in which an observer is listening from behind a closed door while peeking through a keyhole when "all of a sudden I hear footsteps in the hall." He becomes aware that he himself will now be observed. In both cases he is involved in acts of surveillance, but these are very different forms. In the latter case he simply responds and draws a conclusion from a state of awareness. Only the former, where he has taken the initiative, actively and purposively using his senses to collect data on others, raises important privacy issues.
Non-strategic surveillance refers to the routine, auto-pilot, semi-conscious, often even instinctual awareness in which our sense receptors are at the ready, constantly receiving inputs from whatever is in perceptual range. Smelling smoke or hearing a noise that might or might not be a car's backfire are examples. In contrast, strategic surveillance involves a conscious strategy to gather information. This may be in a cooperative or adversarial setting—contrast parents watching a toddler with corporations intercepting each other’s telecommunications. Much non-strategic surveillance involves data that is publicly available to an observer (in not being protected) and as such is freely (if not necessarily voluntarily in a purposive sense) offered.
Within the strategic form—which to varying degrees ferrets out what is not freely offered—we can identify two mechanisms intended to create (or prohibit) conditions of visibility and legibility—material tools that enhance (or block) the senses and rules about the surveillance itself. While these are independent of each other, they show common linkages, as with rules requiring reporting when there are no available tools for discovery or rules about the conditions of use for tools that are available. A stellar example is the "Lantern Laws" which prohibited slaves from being out at night unless they carried a lantern (Browne 2012.) Here the emphasis is on requiring the subject to make him or herself visible given the limitations brought by darkness. But note also efforts to alter environments to make them more visible as with the creation of "defensible space" via taking down shrubs or using glass walls (Newman 1972) or less visible à la the architecture of bathrooms.
Within the strategic form we can distinguish traditional from the new surveillance. Examples of the new surveillance include computer matching and profiling, big data sets, video cameras, DNA analysis, GPS, electronic work monitoring, drug testing and the monitoring made possible by social media and cell phones. The new surveillance tends to be more intensive and extensive; it extends the senses, draws on aggregates and big data, has lower visibility, involves involuntary (often categorical) compliance of which the subject may be unaware, costs less, and can operate from remote locations. While the historical trend here is clear, it is more difficult to generalize about other characteristics such as whether or not surveillance has become more deceptive or more difficult to defeat than previously. Many forms are more omnipresent and often presumed to be omnipotent.
The new surveillance may be defined as scrutiny of individuals, groups and contexts through the use of technical means to extract or create information. In this definition the use of "technical means" to extract and create the information implies the ability to go beyond what is naturally offered to the senses and minds unsupported by technology, or what is voluntarily reported. Many of the examples extend the senses and cognitive abilities by using material artifacts, software and automated processes, but the technical means for rooting out can also involve sophisticated forms of manipulation, seduction, coercion, deception, infiltrators, informers and special observational skills. The new surveillance is at the core of contemporary privacy concerns.
Including "extract and create" in the definition calls attention to the new surveillance’s interest in overcoming the strategic or logistical borders that inhibit access to personal information. These inhibitors may involve willful hiding and deception on the part of subjects or limits of the natural world, senses and cognitive powers. "Create" also suggests that data reflect the output of a measurement tool. The tool itself reflects a decision about what to focus on and the results are an artifact of the way they were constructed. Of course constructions vary in their usefulness, validity and reliability. Our perceptions of the empirical world are conditioned by where and how we look and these may vary in their fidelity to that world. It is this powerful, border busting quality that raises profound implications for privacy.
The use of "contexts" along with "individuals" recognizes that much modern surveillance attends to settings, or patterns of relationships and groups, beyond focusing on a given, previously identified individual. Meaning may reside in cross-classifying discrete sources of data (as with computer matching and profiling) that, when considered separately, are not revealing. Systems as well as persons are of interest. The collection of group data or the aggregation of individual data into group data offers parameters against which inferences about individuals are drawn for purposes of classification, prediction and response. Depending on the parameters, this may bring rationality and efficiency, but there is always an inferential leap in going from group characteristics based on past events to future predictions about a given individual. Here is another factor that can confuse the issue—while there may seem to be no privacy concern in collecting group rather than individually identified data, we need to think about whether identity groups should have rules protecting the privacy or at least image of the group (Alpert 2003.) Group surveillance raises a new privacy question and one that may become more important in an age of big data collections and mergings. Should some groups have a right to privacy just as individuals do? With respect to that consider the possible stigmatizing effect of public information on disease or crime data by ethnic or religious groups.
Neither Dark nor Light
Context directly involves the normative questions. The discussion above suggested some of the ways that privacy and surveillance can be approached and illustrates the variety of empirical connections that can be seen depending on the types and dimensions of interest. Let us now turn more directly to normative issues and a central argument: surveillance and privacy as such are "neither good nor bad, but context and comportment make it so" (Marx, forthcoming.) Context refers to the type of institution and organization in question and to the goals, rules and expectations they are associated with (Marx 1988, forthcoming and Nissenbaum 2010.) Comportment refers to the kind of behavior actually shown by those in various surveillance roles relative to what is expected.
Snaking through these are expectations about the means and ends of information collection, communication, use and protection. Apart from the specifics of empirical settings, we can say little about our topics. Privacy for whom, of what, why and under what conditions and surveillance of whom, of what by whom, why and under what conditions need to be specified. This yields a rich array of information control games and calls attention to the myriad desirable and undesirable empirical consequences for the individual and society.
While sharing some elements, differences in four basic surveillance and privacy contexts involving coercion (government), care (parents and children), contracts (work and consumption) and free floating accessible personal data (the personal and private within the public) will lead to different normative conclusions for the same privacy invading or protecting behavior on the part of agents and subjects. Informational privacy and its surveillance must be judged in light of the institutional contexts and physical-logistical settings (e.g., financial, educational, health, welfare, employment, criminal justice, national security, voting, census); places (a street, a bathroom) and times (dinner time, day or night); the kind of data involved, such as about religion or health; participant roles; and aspects of technology and media, such as audio or visual, wire or wireless, print, phone, computer, radio, or TV. Considerations of setting, location, time, data type and means offer the contingencies for righteousness or righteous indignation and are central to legislation and regulation. They are however rich in anomalies and cross cultural differences.
It is one thing to defer to context and the rules and goals and broader values underlying them as the Rosetta stone for deciding about the ethics of the social control of information. Applying them to reach judgments is another matter. Even if we can agree on what a goal and related legitimating values associated with it mean, the further nettlesome issue of prioritizing values and resolving conflicts between them remains. For many purposes (at least for persons and organizations of good will) the struggle is often between the good and the good.
Goal and Value Conflicts
Value conflicts are everywhere. Thus we seek privacy and often in the form of anonymity, but we also know that secrecy can hide dastardly deeds and that visibility can bring accountability. On the other hand, too much visibility may inhibit experimentation, creativity and risk taking. And while we value disclosure and "permanent records" in the name of fairness and just deserts, we also believe in redemption. New beginnings after individuals have been sanctioned, or after they have otherwise overcome limitations or disadvantages, are fundamental to the American reverie.
In our democratic, media-saturated, impression-management societies, many of us want to both see and be seen (e.g., social media) even as we also want to look the other way and be left alone. We may want to know but also be shielded from knowing. We value freedom of expression and a free press but do not wish to see individuals defamed, harassed or unduly humiliated (whether by the actions of others or their own.) Also as ideals, we desire honesty in communication and also civility and diplomacy. In interpersonal relations (in contrast to the abrasive talk shows) we may work hard to avoid embarrassing others by not seeking certain information or by holding back on what we know. We value the right to know, but also the right to control personal information. The absence of surveillance may bring freedom from censorship, but also open the door to the worst demagogues, liars, and self-deluded snoops. Yet undue surveillance chills non-conforming communication and is the companion of repression.
Individuals expect organizations to treat them in a fair, accurate and efficient manner, and to judge them as unique, not as undifferentiated members of a general category, while at the same time, they hesitate to reveal personal information and desire to have their privacy and confidentiality protected. Meeting the first set of goals necessarily requires giving up personal data, and up to some point, the more one gives up, the more accurate and distinctly reflective it will be of the unique person. Yet the more data one reveals, the greater the risk of manipulation, misuse and privacy violation. At the same time, knowing more can bring new questions and less certainty to surveillance agents. Depending on their role and social location, individuals and groups differ in the relative importance they give to privacy as compared to accuracy.
The individual’s expectation of being assessed in his or her full uniqueness may conflict with an organization’s preference for responding to persons as part of broad common aggregates—something seen as more rational, effective and even efficient. The ideal of due process and fairness determined in each individual case can radically conflict with an organization’s utilitarian goals and bottom line. In the criminal justice context, for example, civil liberties sometimes conflict with the goal of effective enforcement. The case for categorical surveillance (without cause) versus particularized surveillance (only with cause), and for prevention versus after-the-violation responses, can be well argued either way.
Culture sends contradictory messages. On the one hand, individuals are expected to submit to surveillance as members of a community that supports the common good and fairness (e.g., the census or social security number required of all), or in order to participate in certain activities such as traveling, buying on credit or obtaining an entitlement. Yet fairness apart, when such surveillance goes beyond minimal verification and is done coercively, it may conflict with the expectation that before personal information borders are crossed there must be some grounds for suspicion. If agents must wait until they have cause before conducting surveillance, many violators get a free ride in situations where only preparatory actions are visible or where violations are of low visibility or hidden outright. This limitation protects the innocent against unnecessary searches. Yet it can also mean failing to prevent terrible events—as with 9/11, where well-intentioned policies from another era, along with many informal factors, blocked the FBI and CIA from exchanging information about the perpetrators.
If your tools work and you search everyone, you will likely catch the guilty, but you will also sweep in the innocent. Profiling as a surveillance tool permeates society far beyond ethnicity, religion or national origin. In contemporary society, with its emphasis on prevention, the push is toward broader and deeper searching absent cause. The dilemma can be identified but not solved, because observers differ in how they judge the tradeoffs among equality, fairness, grounds for suspicion, invasiveness, prevention and effectiveness, and the likelihood and seriousness of risks.
The above discussion involves conflicts between abstract values, but more concrete conflicts may also appear in applying the tools. The intrinsic properties of a device may work against the agent’s desire for secrecy. While much contemporary surveillance is defined by its ability to root out the unseen and unknown, it may paradoxically reveal itself through electrical, chemical and other forms of data. That which silently gathers the emanations of others, if not exactly a mirror image, nonetheless emanates itself, offering discovery possibilities and means of neutralization to technically competent adversaries. The watchers may also be watched by the means they apply to others.

Also, if an agency publicizes a surveillance system intended in part to make citizens feel more secure, the publicity may have the opposite effect, since it sends the message that dangers are great enough to warrant such a system. Or the same publicity may alert committed malefactors to the presence of surveillance, triggering evasive, blocking or displacement responses—a kind of unfair (at least to the law-abiding public) warning. Thus, advertising the means versus keeping them secret highlights the potential conflict between the goals of deterrence and apprehension so apparent in undercover work.

The existence of practices with a strong potential for abuse traditionally leads to demands for regulation. A bureaucratic and legalistic response may lessen problems, but ironically it can also lead to expanded use of potentially troubling means. In contrast, without a formal mandate legitimating and acknowledging a tactic, agents may hesitate to use it because of uncertainty about where to draw the lines.
In sum, this paper has sought to illustrate the complexity of the information control relationships between privacy and surveillance. It argues that little of use can be said in general terms without specifying types of privacy and surveillance and the dimensions that may cut across or unite them. Judgments must attend to the context and comportment of concern. Surveillance, like privacy, can be good for the individual and for the society, but, like privacy, it can also have negative consequences for both. Appreciation of this complexity hardly solves any problem, but it might bring a little more light and less heat to issues of great importance.
Privacy may be easier to think about than surveillance because it has opposites. Consider:
Privatization-publicization (action nouns)
Privatize-publicize (verbs)
The concepts in Table 1 were developed using an inductive method, sifting hundreds of examples to answer the question, "what is the use of the tool intended to accomplish?" I added categories until any new example fit within the existing ones. I do not argue that any given application will necessarily fit into only one category (although one may be dominant), that goals should only be studied statically, or that observers would all necessarily agree on how to categorize what agents say about goals. For example, the point of view of the respondent may differ from that of the analyst: a respondent might categorize surveillance of children as protection, while an analyst might code it as a form of control.

The private and the public may be present as expectations stemming from formal rules or from manners, and may exist as conditions regarding the visibility of the information independent of the rules.
Rather than being bi-modal dimensions with values at opposite ends of a continuum (such as whether a technology requires a power source or can be invisible), each goal here is an end point in itself (although it could be treated as bi-modal if scored as present or absent). The goals may occur together (e.g., compliance and documentation), and some are broader than others. For example, the goals of organizational functioning and documentation are perhaps the broadest and most content-neutral on the list, potentially touching most of the others.
behavioral rules
certification standards
subjective rules (correct inner attitudes and feelings)