Journal of Social Issues, forthcoming May 2003, vol. 59 (2)
I am grateful to David Johnson, Richard Leo and David Shulman for critical
comments and to the supportive environment of the Jurisprudence and Social
Policy Program of the University of California, Berkeley Law School, where I
was a Visiting Professor when this was written.
By Gary T. Marx
Professor Emeritus, MIT
In
light of contemporary efforts to intensify the collection of personal
information, this article, as well as articles elsewhere on this web site dealing
with the engineering of social control and computer matching and profiling, may
be of more than academic interest.
Abstract: Eleven behavioral
techniques of neutralization intended to subvert the collection of personal
information are discussed: discovery
moves, avoidance moves, piggybacking moves, switching moves, distorting moves,
blocking moves, masking moves, breaking moves, refusal moves, cooperative moves
and counter-surveillance moves. In Western liberal democracies the advantages
of technological and other strategic surveillance developments are often
short-lived and contain ironic vulnerabilities. The logistical and economic
limits on total monitoring, the interpretive and contextual nature of many
human situations, system complexity and interconnectedness, and the
vulnerability of those engaged in surveillance to being compromised, provide ample
room for resistance. Neutralization is a dynamic adversarial social dance
involving strategic moves and counter-moves and should be studied as a conflict
interaction process.
It may well be doubted whether human ingenuity can
construct an enigma of the kind which human ingenuity may not, by proper
application, resolve.
--Edgar Allan Poe, The Gold-Bug
Efforts
to protect information on the part of the actor have their logical
counterpart in the discovery efforts of those engaged in surveillance.
That is, in resisting surveillance individuals are protecting their privacy,
while those involved in surveillance seek to break through the personal
borders which protect privacy. We can view contexts of personal information
discovery and protection behaviorally and make inferences about what the
individuals are attempting to do. We can also view these concepts in terms of
cultural standards that judge whether behavior is appropriate, ethical and
legal.
The study of privacy and secrecy overlaps the study of deviance and social control. In
many settings privacy and surveillance are different sides of the same nickel.
Privacy can serve as a nullification mechanism for the power offered by
surveillance (Kelvin, 1973). Surveillance seeks to eliminate privacy in order
to determine normative compliance or to influence the individual or for its own
ends as with voyeurism (Marx, 2002b).
There
are increasingly sophisticated technologies for collecting personal
information, such as surveillance cameras and sensors, and for
predicting behavior and assessing the truth, such as expert systems (Brin,
1999; Froomkin, 2000; Garfinkel, 2000; Gutwirth, 2002; Staples, 2000). The increased
prominence of these surveillance technologies is "pulled" by concerns
over issues such as crime, terrorism, and economic competitiveness, and
is pushed by newly perceived opportunities offered by technical
developments in electronics, computerization, and artificial intelligence. The
scale, mobility and anonymity of mass society and, ironically, increased
expectations of, and protections for, privacy have furthered reliance on both
surveillance technologies and on data-base memories that locate, identify,
record, register, classify, and validate or generate grounds for suspicion
(Agre & Rotenberg, 1997; Bennett & Grant, 1999; Gandy, 1993; Lyon,
2001; Regan, 1995). The fear of catastrophic risks in an interdependent world
relying on complex technologies, and the entrepreneurial efforts of the security
industry and governments, such as the United States with its war on drugs,
have helped spread the technologies internationally (Andreas, 2000; Ericson
& Haggerty, 1997; Nadelmann, 1993).
One
noteworthy aspect is the extent to which individuals go along with requests for
personal information. This is likely related to beliefs about the advantages
of, and need for, providing such information, and trust in authority --factors
which often override the ambivalence resulting from traditional privacy and
autonomy concerns. Moreover, a lack of
resistance to intrusive surveillance may masquerade as acceptance because of a fear
of being sanctioned or losing one's job, position or privilege, or as a
necessary condition for something desired such as employment, credit, apartment
or car rental, air travel or government benefits. There may also be fatalism
and resignation, believing it is impossible to resist.
Many
cultural beliefs support the legitimacy of surveillance. Consider statements I
heard such as, “I have nothing to hide”, "it’s for my own good",
"I support the goals", “I’m getting paid”, “it’s just the way they do
things here”, “they have to do it to …[stay competitive, obtain insurance, stop
crime, avoid risks]”, "the measure is valid", and "they promise
to protect confidentiality." Lack of awareness of the extent and nature of
surveillance, or of the potential for abuse and misuse of personal information,
may also support acquiescence.
Since
completing a study of humans as covert information discoverers (Marx, 1988), I
have been studying the social, cultural, ethical and policy implications of new
technologies for the collection of personal information (e.g., Corbett &
Marx, 1991; Marx 1995, 1998, 1999, 2001, 2002 a & b, forthcoming). One
aspect involves individual resistance to surveillance, the topic considered
here.
My
method involves observation, interviews, document collection and mining the
literature in order to offer a conceptual framework, and eventually testable
hypotheses, about personal information collection and protection. Among those
interviewed are subjects (e.g., employees subjected to work monitoring, drug
tested athletes, political activists) and practitioners of surveillance (e.g.,
police, private detectives, managers, technology providers).
The
identification of factors encouraging the spread of surveillance and the
ready availability of newsworthy horror stories of privacy invasions such as
the selling of information from AIDS tests or cameras hidden in dressing rooms
(e.g., Smith, 1990), too often lead to an unreflective "the sky is
falling" view of contemporary surveillance, whether explicit or implicit.
The potential of a technology for harm needs to be kept distinct from its
realization. Just because something negative could happen, does not mean
that it must happen. In short,
little consideration is given to how the dystopia will actually be produced or
to factors working against it, such as practicality, cost, laws and fear of
lawsuits, organizational policies, morality, doubts about effectiveness or the
ability to control the technology, and concern with public opinion. Control
systems are not usually as effective and efficient as their advocates claim and
they often have a variety of unintended consequences (Sieber, 1981; Marx, 1995;
Tenner, 1996).
There
is frequently a gap between visible conforming behavior and less visible
attitudes, emotions and fantasies. Moreover, new technologies rarely enter
passive environments of total inequality.
Instead they become enmeshed in complex pre-existing systems. They are
as likely to be altered as to alter. Professional associations, oversight organizations, and political
and social movements are also factors.
In contrast to these collective responses, the focus in this
article is on individual responses.
Individual
and collective responses are often linked as when protest movements grow out of
or encourage individual resistance and provide models, resources and
legitimation (McAdam & Snow, 1997). However, more spontaneous individual
responses can be contrasted with those growing out of explicit organizational
efforts. The former may be collective in the sense that many persons respond
the same way to the same stimulus, but they are not necessarily organizational.
The social and political implications of such individual forms are relatively
unstudied.
Whether
at work (Gabriel, 1999), in prison (Sykes, 1971), in the family, or in efforts
to create a carceral society as with the former East Germany (Pfaff, 2000),
surveillance targets often have space to maneuver and can use
counter-technologies. The individual is often something more than a passive and
compliant reed buffeted about by the imposing winds of the more powerful, or
dependent only on protest organizations for ideas about resistance. Humans are
wonderfully inventive at finding ways to beat control systems and to avoid
observation. Most surveillance systems have inherent contradictions,
ambiguities, gaps, blind spots and limitations, whether structural or cultural,
and, if they do not, they are likely to be connected to systems that do. As
Goffman (1961) notes in his study of the underlife of organizations, when
individuals feel that surveillance is wrong, or that they are unfairly
disadvantaged by it, it will often be challenged. Systems also may be challenged for reasons of
self-interest. The scale, complexity and limitations of omnipresent and
omnipotent surveillance offer room for this subversion. Boyne (2002) discusses
limits of panoptical control as applied to contemporary society. Resistance is a
central theme in much contemporary science fiction (Wood, 2002).
Behavioral
techniques of neutralization are a major form of such challenges. These are one
strand of what Scott (1985) calls everyday forms of resistance: “…the ordinary
weapons of relatively powerless groups: foot dragging, dissimulation, false
compliance, pilfering, feigned ignorance, slander, arson, sabotage and so
forth” (p. 29). However, “powerless” takes on a new meaning (beyond its usual
association with lower social class or minority status) when we consider the
demands of the modern organization (whether private or governmental) for
personal information.
This
article considers 11 generic techniques of neutralization that
observation suggests can be weapons of the strong, as well as the weak. They
involve a strategic focus on directly resisting a particular privacy-invading
information technology. This strategic focus is in contrast to the sheer
contrariness to authority that Foucault (1977) discusses and the
non-instrumental forms noted by Scott (1985). The techniques may be accompanied
by, or eventually lead to, other individual and collective responses expressing
indignation, rejection and rebellion (often with a symbolic element), apart
from direct efforts in the immediate context of surveillance. Our emphasis here,
however, is on the former.
The
behavioral techniques to be discussed have cultural support. There are parallels to more general cultural
beliefs supportive of rule breaking identified by Sykes and Matza (1957)
as “techniques of neutralization” and
Bandura (1999) as “moral disengagement.” People will break rules if they regard
an organization or its surveillance procedures as unacceptable or illegitimate,
untrustworthy or invalid, demeaning, unnecessary or irrelevant.
These approaches help answer the question: "How can the rules be broken
when the culture is clear in defining this behavior as wrong?” However, in this case, just who is breaking
the rules may be disputed (e.g., is it the authorities seeking information to
which they are not entitled, or is it those subject to legitimate surveillance
seeking to avoid it?).
General
cultural counter-beliefs that neutralize the conventional beliefs include:
"My rule breaking behavior doesn’t hurt anyone"; "it’s not my
fault"; "they had it coming
to them"; and “they’ll never know”.
More specific cultural beliefs that may accompany efforts to thwart
surveillance encountered in my research include: “They have no right to that
information”; "it’s none of their business"; "the measure is not
accurate"; "it’s unfair"; "it’s discriminatory --they don't
monitor the communications of the managers and executives"; "I don't
trust them to keep it confidential"; "what if they use it for some
other purpose?", "it's irrelevant to how I do my job"; "I
did not consent to provide the information"; "it’s sneaky";
"it means they don't trust me"; "it makes me feel like a child"; "my personal
information is my property and I am not being paid for its use";
"providing that information puts me at a strategic disadvantage."
Such
justifications serve to soften a culturally induced tendency toward deference
to authority and are counters to the cultural beliefs that legitimate
surveillance. Consequently, they may free individuals to resist in the
ways discussed below and, after the fact, alleviate guilt.
As
with any normative system, there is a moral economy within which individuals
may weigh the costs and benefits of compliance and violation and draw personal
lines. For example, Gilliom (2001)
studied welfare recipients who justified evading a sophisticated surveillance
system by the cost to their families of not seeking ways around
it, given very stringent limits on the income they were permitted to legally
have. However, the inherent
value conflicts involving surveillance and the self-interested reasons
for evasion hardly require elaborate ideologies of resistance. Moreover, because efforts to counter many of
the kinds of surveillance considered here often occur on morally contested
ground and are defensive, the need for such justifications may not be as strong
as in more conventional crime and deviance settings.
The
11 forms of surveillance neutralization, in the next section, are
inductively developed concepts that are the result of several decades of
interviews and observations regarding social control. As responses to social
control they parallel prior work on strategies for the engineering of social
control (Marx, 1995, 2002a). These
concepts suggest that human creativity seeking to thwart systems of
surveillance is aided by logistical and economic limits on total monitoring,
the vulnerability of those engaged in surveillance to being compromised, and by
the interpretive and contextual nature of many human situations.
The
11 prominent types of response to privacy-invading surveillance
are: 1) discovery moves, 2) avoidance moves, 3) piggybacking moves, 4)
switching moves, 5) distorting moves, 6) blocking moves, 7) masking
(identification) moves, 8) breaking
moves, 9) refusal moves, 10)
cooperative moves, and 11) counter-surveillance moves.
At
a general level these are forms of resistance or non-compliance. The criteria
reflect the point of view of the observer and emphasize visible behavior,
although some inferences are made about goals. As with most social science
categorizations of complex and fluid behavior, this conceptualization is not
exhaustive. The categories are not
necessarily mutually exclusive and many can be systematically related.
They
may be temporally linked, as when discovery of surveillance leads to an effort
to avoid it. They may be simultaneously present as when a person wearing gloves
to block his fingerprints also masks his true prints by leaving
items containing another’s fingerprints. They may be logically linked. Some
broad dimensions can be seen to form an umbrella with others serving as ribs or
nestled subtypes (e.g., refusal may involve literally saying “no” as when the
surveillance is just ignored, or it may involve the refusal to fully cooperate
on the grounds desired by surveillors).
In
spite of some conceptual and operational haze, these responses are distinctive
enough to warrant separate treatment and I have found them useful in capturing
commonly occurring and analytically interesting forms. As with any beginning
effort, greater specification of the defining criteria and perhaps the addition
of other forms are welcomed.
The
strategic actions of both watchers and the watched can be thought of as moves
in a game, although unlike traditional games, the rules may not be equally
binding on all players. There are
likely common resistance moves shared by a citizen concerned with protecting
personal privacy and a criminal seeking to avoid detection. In spite of the obvious moral difference, I
treat these two generic types as behaviorally equivalent in efforts to protect
information and to neutralize others’ discovery moves.
Unless
otherwise noted, all examples are drawn from my observations and interviews.
1. Discovery Moves
Known
as surveillance detection in the intelligence trade, the goal is to find
out if surveillance is in operation and where it is. One form involves self-regulation. The subject’s behavior varies
depending on whether or not surveillance has been found to be in operation.
Drivers slow down when their anti-radar "fuzz buster" warns them that
police radar is in use. Here the
surveillance "works," at least as long as it is believed to be
present.
What
Goffman (1974, pp. 97-99) calls “vital tests” may be applied. For example, a criminal may
test a would-be partner by requiring
that an act of violence, theft, or drug use occur before a drug deal is
completed. The film Battle of Algiers
offers a riveting example when a potential recruit to the Algerian independence
movement is suddenly handed a gun on the street and told to kill a nearby
policeman (the recruit himself is a police infiltrator). A similar incident occurs in Clint
Eastwood’s In the Line of Fire.
Establishing
credibility by having a trusted person vouch for the individual or by using back
channels into official records or publicly available data are further examples.
Major users of freedom of information acts are criminals seeking to determine
the identity of informers. In Britain,
in the interest of defending their clients and perhaps more, criminal defense
attorneys have even created a database with the names of those known to be
government informers.
Discovery
is aided by a thriving private security industry that routinely sweeps offices,
homes and vehicles for listening devices and that sells do-it-yourself
anti-bugging and other devices --e.g.,
pocket bug and tape recorder detectors for (as put by a character in the film The
Conversation) “your catalogue suckers”. Consider, for example, a small
flashlight-like device that sells for several hundred dollars which permits
finding hidden video cameras.
Everyday objects may be examined to see if they are other than what they appear to
be --does a towel dispenser, book, or teddy bear hide a video lens? The
appearance of a space between one’s fingers and the glass on a mirror may
suggest a two-way mirror. Door handles, documents and drawers may be examined
under ultra-violet light to see if they have been coated with fluorescent dust.
Access keypads to a safe or to a telecommunications device may be inspected to
see if they have been coated with a waxy film that will reveal what keys were
touched to gain access.
2. Avoidance Moves
These
moves may follow the discovery that surveillance is present or it may be
assumed that, because surveillance might be present, avoidance is a prudent
response. Avoidance moves are passive
rather than active and involve withdrawal. There is no effort to directly
engage, or tamper with, the surveillance. Rather, there is a temporal,
geographical or methodological displacement to times, places and means in which
the identified surveillance is presumed to be absent or irrelevant.
Displacement
can be across settings (e.g., avoiding supermarkets with frequent shopper cards
or making calls from a pay phone which cannot be traced to the location of a
telephone subscriber). Displacement may
also occur within a given setting (e.g., shoplifters who operate within the
interstitial area of surveillance camera blind spots or thieves who know that
not all goods or library books are electronically tagged and apply the
"five-finger discount" only to untagged items).
Because
of concerns over leakage, security consultants advise clients with sensitive
information to only use the unsecured telephone, fax or Internet for
communications they would not mind seeing in the newspaper the next day and to
never use cordless microphones for presentations in non-public meetings.
Caution is advised even in face-to-face conversations, unless the room has been
recently swept for bugs and the person they are talking to can be checked for
electronic signals suggesting transmission or recording.
Beyond
the presumed security of face-to-face meetings and dealing only with those who
are known or vouched for, physical or social locations presumed to be safe may
be favored for secret conversations such as in an open field or boat. Consider,
for example, the film The Conversation, in which criminals mistakenly
felt safe talking in a rowboat in the middle of a lake, or the Philadelphia
organized crime figures who were arrested as a result of electronic
surveillance of meetings they held in the offices of their doctor and
lawyer. They wrongly assumed that the
doctor-patient and lawyer-client privilege precluded such places from police surveillance (New York Times, March 17,
1994).
The
introduction of federally mandated computer record checks offers a nice example
of displacement to means less available for surveillance. Computer record
matching has made it much more difficult for those on welfare to obtain extra
income from a job, workmen’s compensation, retirement or assistance in another
state without being discovered. This
tightening has likely led to some increase in the use of fraudulent identities
and under-the-table forms of payment for work (cash, exchanges) that are beyond
the reach of the organizational dossier screen (Gilliom, 2001).
Another
form of avoidance is not raising the red flag. Thus, knowing that
certain profiles or crossing certain thresholds will trigger surveillance or at
least suspicion, individuals stop short of this or avoid triggering
characteristics. For example, bank
deposits under $10,000 do not require a report to the federal government.
However, there can be ironic vulnerabilities here as well. It looks suspicious
if 50 deposits of $9,999 each are made on a given day.
Drug smugglers, believing that a profile is in effect that targets
younger men in late model cars driving on US 90, may use as couriers older
women in older cars taking back roads.
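The threshold logic described above can be rendered as a simple screening rule. Apart from the $10,000 reporting threshold, the near-miss band, cluster size and function names below are illustrative assumptions for this sketch, not actual banking practice:

```python
# Toy screening rule: deposits at or over the reporting threshold trigger a
# report, while several same-day deposits just under it (classic
# "structuring") raise suspicion instead. The near-miss band and cluster
# size are hypothetical illustrations, not actual banking rules.

REPORT_THRESHOLD = 10_000
NEAR_MISS_FLOOR = 9_000       # hypothetical "just under" band
STRUCTURING_COUNT = 3         # hypothetical same-day cluster size

def screen_deposits(deposits_by_day):
    """Map each day to the list of flags its deposits raise."""
    flags = {}
    for day, amounts in deposits_by_day.items():
        day_flags = [f"report: {a:.0f}" for a in amounts
                     if a >= REPORT_THRESHOLD]
        near_misses = [a for a in amounts
                       if NEAR_MISS_FLOOR <= a < REPORT_THRESHOLD]
        if len(near_misses) >= STRUCTURING_COUNT:
            day_flags.append(
                f"structuring suspicion: {len(near_misses)} near-threshold deposits")
        flags[day] = day_flags
    return flags
```

The irony the text notes falls out directly: staying under the reporting line is itself a pattern, and a second rule keyed to that pattern restores the suspicion the first rule missed.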
Another
threshold involves repetition and familiarity.
Rotation is a device favored by some violators. For example, prostitutes
may move frequently to new cities making it more difficult for police to
identify them. Those interested in
thwarting or avoiding surveillance share information such as where the
hidden cameras and police radar traps are and how to identify unmarked police
vehicles. Cell phones and websites have made this easier. Here discovery and avoidance are linked. Being able to identify a police car doesn’t
necessarily prevent visual surveillance, but it does permit adjusting behavior
so as not to call additional attention to oneself.
3. Piggybacking Moves
Here
surveillance is directly faced rather than avoided. A control is evaded or
information protected by accompanying or being attached to a legitimate subject
or object. An important context for this form is entrance and exit controls.
For example, systems requiring an access card can sometimes be thwarted by
walking or, in the case of a parking structure, driving quickly behind a person
with legitimate access. The fact that
the door or gate must remain open long enough to permit the legitimate entry
may offer a window for others. This nicely illustrates the ironic vulnerability of control systems in
which there is exchange across borders. The need to open creates the
possibility for illegitimate as well as legitimate entrance and egress.
Steganography, which permits hiding information within art, documents and other
seemingly innocuous items, is freely available and easy to use on the Internet.
It draws on conventionally communicated computer information to bootleg other
information available only to someone with the appropriate software (e.g.,
hiding messages in digital pictures). The viewer without the software sees only
what is immediately visible and will have no reason to be suspicious.
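The digital-picture case typically works by hiding message bits in the least significant bits of pixel values, changes too small for the eye to notice. A minimal sketch of that LSB technique follows; the bare bytearray standing in for image data, the function names and the 4-byte length prefix are assumptions of this sketch rather than any particular tool:

```python
# Illustrative LSB (least-significant-bit) steganography. The "image" is a
# bare bytearray of pixel values; a real tool would read and write an
# actual picture file.

def embed(pixels, message):
    """Hide message bits in the lowest bit of successive pixel bytes."""
    out = bytearray(pixels)
    bits = []
    for byte in len(message).to_bytes(4, "big") + message:  # length prefix
        bits.extend((byte >> i) & 1 for i in range(7, -1, -1))
    if len(bits) > len(out):
        raise ValueError("cover image too small for message")
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit  # change at most the lowest bit
    return out

def extract(pixels):
    """Recover the hidden message by reading the pixel LSBs back."""
    def read_bytes(start_bit, n):
        value = bytearray()
        for b in range(n):
            byte = 0
            for i in range(8):
                byte = (byte << 1) | (pixels[start_bit + b * 8 + i] & 1)
            value.append(byte)
        return bytes(value)
    length = int.from_bytes(read_bytes(0, 4), "big")
    return read_bytes(32, length)
```

Each carrier byte changes by at most one, which is why the viewer without the software sees nothing amiss.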
4. Switching Moves
In
settings involving testing, certification and validation, an authentic result
is transferred to someone or some thing to which it does not apply. While accurate in what it purports to show,
the accuracy is misplaced. Consider test substitution, whether in the “soft”
college environment of the large classroom or the more controlled testing
centers. The Educational Testing Service is on guard for substitute test takers
(Newsweek Nov. 11, 1996). Assuming the test was taken without cheating, the
results are valid in assessing the score of the actual test taker, but not the
person they purport to represent. Insurance companies also seek to identify
healthy substitutes who take medical exams for persons whose
pre-existing conditions would exclude them from coverage.
A
common form of switching involves certification transference: A ticket, entry
card, license, entitlement or identity marker belonging to someone else is
used. South Africa provides an unusual example. There, welfare payments can be
obtained from ATMs. The recipient enters the correct information
into the computer and offers a thumb print for verification. A colleague reported one enterprising family
that collected welfare payments long after an elderly relative had died. They
cut off her thumb and continued to use it.
In
a urine drug test, "clean" urine, from an acquaintance or
commercially purchased, may be substituted for one's own. In an extreme
example, an Olympic competitor passed his drug test twice, but the tests also
revealed that he was pregnant! Using a catheter, the athlete had inserted urine
from his pregnant, drug-free girlfriend and then passed it back as the
temporarily duped observers watched.
One
response to late night roadblocks in search of drunk drivers is to have the driver
who has been drinking switch places with someone who hasn’t been drinking. As a
condition of sentencing, those arrested for drunk driving may agree to have a
breathalyzer attached to their car's ignition. The alcohol ignition interlock
prevents a car from starting until the driver blows into a funnel-like device
that analyzes the alcohol content of the driver's breath. For the car to start
the alcohol level must be below 0.05% weight by volume. Beyond avoidance (e.g.,
using a different car), the interlock initially could be tricked by having a
friend breathe into the device, or through a temporal switch by using air from
a balloon filled with air before drinking.
5. Distorting Moves
Distorting
moves manipulate the surveillance collection process such that, while offering
technically valid results, the inferences drawn from a test or inspection about
performance, behavior or attribute are invalid. The technical data do
not mean what they appear to say. This contrasts with switching in which the
socially misleading inference involves identity. Consider a tactic used by some
data entry clerks who are judged by the number of keystrokes they enter. At the end of the day some workers simply
held down one key for a few minutes. This generated the impression of greater
productivity than was actually the case. The actual number of keystrokes was
accurately assessed but, given how they were produced, the result
is not socially meaningful.
A
way of beating the polygraph involves stepping on a tack hidden in one's shoe
in response to initial factual questions (e.g., regarding name and age). These are used to create a presumed truthful
baseline for comparison to answers to later incriminating questions. Meditation and pills are other means
intended to distort results.
Eating
peanuts or sucking on a penny before taking a breath analyzer test are said to
distort results. Drug test results may
be distorted by having bleach on one’s hand when generating a urine sample, by
taking medication or by eating certain confounding foods.
A
more subtle form involves conversational ploys in which a surveillance agent is
duped into believing that a machine is invalid. Consider the story told me by a
Russian. A family coming back from a picnic is stopped by police and the driver
fails a breathalyzer test. He protests,
"That's impossible, I haven't been drinking, your machine must be broken.
Please try it on my wife." She
also fails the test. The man gets even more insistent that the machine is
broken and says, "Please try it on my young daughter." She is tested and also fails. At which point the police officer, doubting
his own machine, lets the man go. The
man later remarks to his wife, "That was really a good idea to let her
drink with us.”
6. Blocking Moves
Blocking
and masking, the next two forms, call explicit attention to the communicative
aspects of surveillance. Surveillors desire to read the signals given off by
their subjects. With blocking, subjects
seek to physically block access to the communication or, if unable or
unwilling to do that, to render it (or aspects of it such as the identity,
appearance or location of the communicator) unusable.
The
Faraday cage, which encapsulates space in metal or a metallic net,
blocks electronic transmissions. (There is even a portable version available
for travelers.) Shoplifters may seek to block
the sensors for electronically tagged consumer goods by using a metallic shield
which prevents signal transmission (e.g., a large shopping bag with a familiar
logo conceals within it a second bag lined with aluminum or duct tape). A move
reflecting the ironic vulnerability of locks requiring keys involves using a
portable device to desensitize the sensor in the same fashion as the checkout
stand does.
A
means of stopping eavesdropping via the traditional telephone was to turn the
dial and hold it in place by inserting a pencil, effectively blocking the
microphone within the phone. Playing loud music and whispering accomplishes the
same goal. Caller-ID in most states can now be stopped by line or per-call
blocking. Those making credit card phone calls in public places, such as
airports, are advised to shield the numbers when they are entered to
prevent theft of access codes by lurkers, some with binoculars who may not even
be seen by the caller.
Commercially
available technical means such as 900 numbers that forward calls, and then
immediately erase the source of the call, are a way of avoiding telephone
number identification systems. Anonymous re-mailers who forward computer
messages stripped of their sender's address also stop identification.
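At its core, what a remailer does is strip the headers that identify the sender before forwarding under its own address; real remailer networks add encryption, message pooling and delays. A minimal sketch of the header-stripping step, in which the header list and addresses are illustrative assumptions:

```python
from email import message_from_string
from email.message import EmailMessage

# Sketch of the core of an anonymous remailer: remove sender-identifying
# headers, then forward under the remailer's own address. The header list
# is an illustrative assumption, not an exhaustive one.

IDENTIFYING_HEADERS = {"from", "sender", "reply-to", "received", "message-id"}

def anonymize(raw_message, remailer_address):
    """Return a copy of the message with sender-identifying headers removed."""
    msg = message_from_string(raw_message)
    out = EmailMessage()
    for header, value in msg.items():
        if header.lower() not in IDENTIFYING_HEADERS:
            out[header] = value
    out["From"] = remailer_address          # only the remailer is visible
    out.set_content(msg.get_payload())
    return out
```

The recipient sees the content intact but only the remailer's address, which is exactly the blocking of identity the text describes.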
Another
form of blocking is wearing a veil or loose fitting, buttoned up clothes
that reveal little about the physical appearance of the person within. In
response to the phenomenon of covert “upskirt and down blouse” videotaping, one
policewoman reports always wearing pants to the mall. A woman who was secretly
videotaped in her home reports, “I sometimes take a shower with my clothes on.”
Generic
ski or Halloween masks worn by bank robbers and the helmets with visors or
masks worn in some protest demonstrations, whether by police or demonstrators,
are other examples. Writing in invisible ink is a familiar children's game and
it has its adult counterparts, although these may rely on bad science. Thus, a
bank robber was identified and arrested in spite of rubbing lemon juice on his
face because he had been told that it would prevent the surveillance camera
from creating a clear picture.
Situations
in which those watching are unaware that specific blocking has occurred can be
contrasted with those where the fact of blocking is obvious. In the latter situations, information is
communicated but it is useless. The encryption of communications is an example.
Encrypted communications are easy to intercept but meaningless absent
decryption. A “photo flash deflector” fluid which blocks the photographing of a
license plate became available soon after systems for monitoring red light
runners appeared. Some “fuzz busters” send white noise back to a police radar
gun, producing a blank reading.
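The logic of encryption as obvious-but-useless blocking can be shown with a deliberately toy cipher (a repeating-key XOR, far too weak for real use, but the structure is the same: interception is trivial, recovery without the key is not intended to be):

```python
from itertools import cycle

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy repeating-key XOR: the same operation encrypts and decrypts."""
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

plaintext = b"meet at the usual place"
ciphertext = xor_cipher(plaintext, b"shared-secret")
# An eavesdropper intercepts the ciphertext easily, but it reads as noise;
# with the shared key the message is recovered exactly.
assert xor_cipher(ciphertext, b"shared-secret") == plaintext
```

Unlike masking, the watcher knows perfectly well that blocking has occurred; the communication is simply meaningless absent decryption.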
In
contrast to real-time protection of information is its destruction after the
fact. An advertisement for “Spy paper” reports that it “looks and feels
like high quality notepad paper yet dissolves completely in a matter of
seconds when it comes into contact with water.” The delete command along with a
“wiper program” are available for computer entries to be sure they can be
eliminated, although as Oliver North discovered this does nothing for backup copies
elsewhere in the system (Tower, 1987). Those disposing of hard drives are
advised to first purge them of their data. “Piano roll” faxes have to be
carefully destroyed as these contain records of what has been received.
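The advice to purge a drive rather than merely delete files can be sketched as a minimal wiper in Python (a simplification: as the text notes, this does nothing about backup copies elsewhere, and on journaling or flash storage old blocks may still survive):

```python
import os
import secrets

def wipe_file(path: str, passes: int = 3) -> None:
    """Overwrite a file with random bytes before unlinking it.

    Deleting a file normally removes only the directory entry; the data
    remains recoverable until overwritten, which is what a 'wiper
    program' adds to the delete command.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(secrets.token_bytes(size))
            f.flush()
            os.fsync(f.fileno())  # force the overwrite to disk
    os.remove(path)
```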
7. Masking Moves
Masking
involves blocking in that the original information is shielded, but it goes
beyond it to involve deception with respect to the identity, status and/or
location/locatability of the person or material of surveillance interest. Specifically, masking shares with one form
of blocking the goal of eliminating genuine identifying marks (e.g., by
removing serial numbers or wearing gloves or a generic mask) but it differs
from them by replacing what is blocked with information intended to mislead,
such as using a disguise or fake serial numbers. It also differs because such
blocking without masking may call attention to itself (e.g., a car with no
license plate number, a weapon with the serial number removed, an anonymous
email) while masking does not (e.g., altered numbers, a pseudonymous email
address). As a result of masking, the surveillance mechanism operates as
intended but the information collected is misleading and useless. Surveillors may not even realize that this
has happened.
Efforts
to disguise identity, whether involving wigs, dyed hair, elevator shoes, padded
clothing, plastic surgery or fake documents, fit here and again contrast with
generic give-away forms such as a mask. The film Personal Record, in
which a woman wears a wig and uses fake documents to create a seemingly
untraceable paper trail in committing a crime of revenge, is a nice example.
Masking
can be seen in giving a false social security number or name. There is a rock
group called “Gary Marx and the Sisters of Mercy”. The lead singer Marc Pairman
uses the name Gary Marx. Not wanting to be outdone, I sometimes use the name
Marc Pairman when a website requests my name. I also have supermarket cards
with the names Karl Marx, Groucho Marx and Georg Simmel.
Remote
computer entries, whether taking or sending information, by using another's
identification and password, are a nice example of masking. Surveillance
agents may not know there has been deception. The computer security system
accurately records transactions and the use of a particular entry code from a
given machine, but it cannot determine whether the entrant is the person or
organization it technically appears to be, absent additional means such as
biometric identification and video recording.
Controllers concerned with identity verification/authentication must
determine 1) is this a valid identity, authentication or access code and, if
so, 2) is this the authorized user?
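The two-step determination can be sketched as a toy access check (the credential store and user names are hypothetical; the comment marks exactly the gap that masking exploits):

```python
# Hypothetical credential store: access code -> the user authorized to use it.
AUTHORIZED = {"door-7741": "j.smith", "door-9015": "a.jones"}

def check_entry(code: str, claimed_user: str) -> str:
    # Step 1: is this a valid access code at all?
    if code not in AUTHORIZED:
        return "invalid code"
    # Step 2: is the person presenting it the authorized user?
    # The transaction log alone cannot answer this; absent biometric or
    # video verification, the system must trust the claim.
    if AUTHORIZED[code] != claimed_user:
        return "valid code, unauthorized user"
    return "granted"
```

A masker who has obtained a valid code passes step 1 cleanly; the system operates as intended while recording misleading information.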
8. Breaking Moves
The
goal of breaking moves is to render the surveillance device inoperable. However,
as with blocking moves, surveillors are likely to discover this. Breaking moves are the crudest
form of neutralization. Examples include disabling electrical and phone lines,
spray painting a video monitor, and immobilizing a video monitor by aiming a
laser pointer at it. When radar
location detection devices were first attached to police cars, so that
supervisors would be able to know where officers were, some officers in Boston
responded by simply smashing the device with their clubs. More subtly, the system also was
defeated by driving beyond city limits and entering at a different point. Leaving the system in this way caused the
monitor to lose track of the police car.
A male guard dog may be neutralized with a tranquilizer dart, mace,
poisoned food or a female dog in heat.
Those
under judicial supervision may remove electronic location monitors. Ironically, the effort to interrupt a signal
itself becomes a signal, but only if it is noted and acted upon. In a
tragic case, an abusive spouse sentenced to home confinement simply took off
the device, went to his former wife's home, and killed her. Although a message about his removal of the
device was sent over a weekend, no one was on duty until Monday to learn of his action.
9. Refusal Moves
The
above strategies suggest that any move away from participation on the terms
desired by those watching can be seen as a kind of refusal. A more extreme form
of refusal is to “just say no” and ignore the surveillance. This surprisingly
simple response is not as common as one might expect. Partly this reflects
deference to authority and fear of sanctioning or denial of service. Politeness and the desire to avoid conflict,
or be labeled a troublemaker, also may be factors.
It
is now routine in some large retail chains to ask all customers for their phone
numbers, whether they pay with cash or with a credit card. In response, the
individual can refuse to give any number or say, “I don’t have a
phone." Beyond increasing consumer
autonomy, this prevents surveillor access to the rich array of personal and
census tract data that can be linked to phone numbers and then, using powerful
"data dredging techniques," analyzed in relation to the record of
bar-coded purchases.
Refusal
can be specific to certain kinds of information. For example, while many
persons do not realize this, the Privacy Act of 1974 restricts the collection
of social security numbers to a limited number of governmental purposes and
offers no mandate at all for unrelated private sector uses. Yet that
restriction hardly stops private organizations from requesting the number.
As
part of the employment process at a large state university, I was asked to sign
a form swearing that I was a loyal citizen and supported the laws of the state
and country. I also was asked if
I had ever belonged to various political groups. The latter question seemed inconsistent with the First Amendment
protection of the former that I had just indicated I supported. With more curiosity than trepidation, I ignored
the second question and waited. But as with so much on bureaucratic forms, this
was overlooked and nothing happened.
The
surveillance also might be ignored not on principle, but on the assumption that
the risks of being identified are small and the risk worth the gain. Drug
smugglers flood borders with many couriers carrying small amounts knowing that
most will get through.
Requests
to participate in public opinion surveys, whether over the phone, in person, or
by computer or mail, also may be refused.
Indeed, there is evidence that refusal rates have been increasing. With
respect to unwanted phone requests, whether for interviews or sales pitches,
one response is to say, "This isn't a good time for me to talk, but if you
give me your home phone number I will call you back late tonight".
Refusal
is a broad term which may mean literally ignoring a request for information.
But it may also involve feigned participation in which expectations about how
one is to participate are violated. Here masking can be seen as a subtype of
refusal. Consider, for example, offering false information. This can involve checking the wrong column
for income or indicating a preference for golf when one's preference is really
for cooking. According to some research, 30% or more of website visitors admit
providing incorrect information. Changing the spelling of one's name or middle
initial permits tracking the solicitation networks through which names are sold
and also might impede database matching.
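The seeding tactic described above can be sketched as a small generator of traceable name variants (the organization names are hypothetical, and varying the middle initial is just one of many possible systematic alterations):

```python
def seeded_names(base: str, recipients: list[str]) -> dict[str, str]:
    """Give each organization a uniquely varied form of one's name, so
    that junk mail arriving later reveals which recipient sold the list."""
    variants = {}
    for i, org in enumerate(recipients):
        # Vary the middle initial: 'Gary A. Marx', 'Gary B. Marx', ...
        initial = chr(ord("A") + i % 26)
        variants[org] = base.replace(" ", f" {initial}. ", 1)
    return variants
```

When mail addressed to "Gary B. Marx" arrives from a third party, the path through which the name traveled is exposed; as a side effect, the variant spellings also impede exact-match database linkage.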
The
above cases are ones in which providing the information is, in principle,
optional. There are other cases where
it is required, but is refused as an act of civil disobedience. For example, two marines refused to give DNA
samples because they did not trust the government to only use it for
identification purposes in case of their death (Los Angeles Times, Dec. 27,
1995). Apart from organizational demands, individuals also may avoid certain
medical tests or DNA analysis because they do not want this on their insurance
and other records (see Alpert, this issue).
10. Cooperative Moves
One
of the findings from research on white collar crime is the frequency of insider
perpetration and cooperation with violators beyond the organization (Rosoff,
Pontell, & Tillman, 2002). Insiders often have access to, and knowledge of,
control systems and this knowledge can be tempting. Thus, it is not surprising
that efforts to resist surveillance sometimes involve collusion with
surveillors. A market for information and its distortion is created and
controllers possessing the information have an illicit product to sell. Or this
may be done for personal reasons as in the early days of computerized records
when a police chief simply erased the record of his son’s drunk driving arrest.
Given
the complexity of the social controller's job (e.g., the often vast and
dispersed activities subject to control) and various restrictions and
constraints on them (e.g., applicable laws and organizational policies), social
controllers also might need the cooperation of those they are charged with
controlling (e.g., Sykes, 1971). This
can result in what Marx (1981) terms non-enforcement, that is, controllers
exchange a resource they have for some form of cooperation from the controlled.
Cooperative
moves seem particularly characteristic of control systems where agents are
poorly motivated or indifferent, feel fatigued, and are under-rewarded. They also may sympathize with those they are
to surveil. This can lead to the
routinization of surveillance, lessened attention and taking action only when
it cannot be avoided. Moreover, the
agents may cooperate with those watched in order not to get involved. Consider, for example, Shelley's (1996) finding that
more than one million persons managed to live illegally in Moscow in the mid-1980s
because illegal residents of apartment complexes would routinely bribe
superintendents or guards who might otherwise inform on them. In short, trying to motivate and control
first-line surveillance agents in low visibility field settings is a major issue
for organizations.
Cooperative
moves also may be ideologically motivated.
For example, given ideological disagreements with authority and sympathy
for those controlled, some welfare workers bend rules and look the other way in
the face of system rules and procedures seen to be unreasonable. Gilliom (2001)
finds a pattern of widespread, if unorganized, resistance on the part of
welfare caseworkers in response to a new and very comprehensive computer
monitoring program of those on assistance.
11. Counter-surveillance Moves
These
moves are of a different order than the moves discussed above. They involve turning the tables and
surveilling those who are doing the surveillance. Knowing that targets of
surveillance may respond in kind also can be a factor limiting or inhibiting
the initial use of surveillance. The
extent to which there has been a "democratization of surveillance” is an
important topic. Certainly there is
greater equality in access to and use of surveillance technologies today than in
much of recorded history. However, we
are certainly far from equivalence here. The kind of technologies that are
developed, apart from who has them, is also much affected by inequality in
resources. Yet many techniques are inexpensive, easy to use, and widely
available.
If
counter-measures uncover questionable practices, which are then publicized, it
also may lead to their moderation or cessation. This situation overlaps
discovery moves. It can lead to
deceptive actions taken to ensnare surveillance agents and document
discreditable surveillance. A classic case, that certainly must have had ripple
effects on police behavior, is that of a Jewish Defense League informant and
later defendant, who tape-recorded highly incriminating conversations with his
police handler. This came out in court
after the police officer committed perjury in denying that he threatened the
informant and then to his great surprise, heard his tape-recorded threats
played back (Dershowitz, Silverglate, & Baker, 1976).
The
results of counter-surveillance, if incriminating, may be used to compromise
those
doing the initial surveillance. Those controlling surveillance systems may be
seduced, blackmailed or otherwise coerced into cooperation in return for the
silence of those they originally watched.
I
have suggested some concepts that can help organize responses to privacy
invasions. Yet as a Yiddish expression holds, “for instance isn’t proof.” A
number of questions involving the patterning and correlates of neutralization
moves, differences between acceptable and questionable forms, the
generalizability of the forms and understanding cycles of surveillance,
neutralization and counter-neutralization can be identified for more systematic
research.
How
does neutralization relate to attitudes towards authority, politics and
privacy? The pattern likely transcends conventional left-right distinctions.
How are these forms of resistance socially distributed? The social class
implications appear distinctive. Resistance to control and domination is
usually approached as a response of the poor and powerless (Scott, 1985). However, responses to new forms of
organizational surveillance are more evenly distributed and much resistance
appears positively (rather than inversely) associated with class position. The privileged who participate most fully in
conventional society (e.g., with respect to communications, computer uses,
credit cards, and consumption) are prominent in resistance. Orwell’s (1990) dystopia is prescient here,
as those who were watched most closely were the more privileged. Their greater
exposure to organizational demands for personal information and greater
resources enhances the likelihood and multiplicity of forms of resistance.
There
is a spatial and relational geography to the protection and revelation of
personal information that could be usefully explored (Margulis, this issue;
Marx, 2001; Stinchcombe, 1963). There could be mapping of the ecology of places
where information has greater protection, for either cultural reasons, as in
the sanctuary offered by a church, or in no (or fewer) questions asked places,
such as frontiers, ports, vice and skid row areas of cities and some isolated
places (e.g., Key West). Relationships also serve to contour the perceived need
for information protection. Notable
examples include the legal protections of much medical, legal and religious
communication and the confidentiality of spousal communication. There may be
“natural” limitations (until a technology changes this) such as those involving
the senses (e.g., the traditional protection of darkness) or environmental limitations
such as dead spaces where radio transmission is poor or blocked. This mapping also could have a temporal
dimension, such as special periods (e.g., time-outs and times of holiday
license), when there may be a greater tendency to look the other way.
How
can legitimate and illegitimate forms of resistance be differentiated and how
might they be connected? While there
will be some common social forms and processes, the moral distinction is
central. We can contrast legitimate
forms of resistance to unwarranted privacy invasions with actions that,
although protecting personal information, are morally indefensible, as with the
discovery, masking and blocking moves of serious crime. There is an important
distinction between neutralizing the unwarranted surveillance of legitimate
activities in which symbolic protest motivations are important as ends in themselves,
as against neutralizing legitimate surveillance as a means to other
illegitimate ends.
Criteria
are needed which would permit us to speak of “good” and “bad,” or appropriate
and inappropriate, efforts to neutralize the collection of personal data. Beyond the easy contrasts provided by
extremes in actual behavioral patterns (e.g., a person who uses encryption as
part of a serious crime or who seeks to destroy fingerprints to avoid
prosecution vs. the dismantling of “cookies” implanted without consent on one’s
computer by a website), how should efforts to neutralize surveillance be
judged?
Elsewhere
I have argued that the ethics of any given form of the new surveillance can be
evaluated by differentiating the means of data collection, the context and
conditions under which data are gathered, and uses or goals. I suggest 29 questions embodying a cluster
of values respecting the dignity and autonomy of the individual, trust and
community (Marx 1998). The more the
questions can be answered in a way affirming the underlying values, the more
justified surveillance is. Conversely
the more answers to these questions negatively reflect on these values, the
more justified are neutralization efforts.
There
are research questions about how generalizable this classification of moves
is. For example, does it apply to other
contexts of information discovery and protection beyond the individual relative
to organizations considered here, such as in the management of stigma (Goffman,
1963) or the espionage and counter-espionage activities of organizations
relative to each other? Can some of the
same concepts (e.g., discovery, blocking) be usefully applied to those engaged
in surveillance, whether in the private sector, government or in interpersonal
relations, as they in turn attempt to neutralize the efforts of those
responding to surveillance, or do we need additional concepts?
Can
the categories be usefully applied to efforts to neutralize the imposition of
information on persons, as well as taking it from them? I have focused
primarily on the latter, but another component of privacy is crossing personal
borders and invading personal space in order to impose something upon a person
(e.g., loud music, cell phone conversations in public, cooking smells, subliminal
messages, advertisements) without their consent and sometimes even
knowledge. How do efforts to protect
against these forms of privacy invasion relate to efforts to stop information
from being taken from the person? Are the same categories useful or are
additional ones required? What does the mute button on the TV remote have in
common with encryption?
There
also are research questions about how neutralization moves, that expand zones
of “insulation from observability” (Coser, 1961), relate to other forms of
resistance to organizational power (Brook & Boal, 1995; Jermier, Knights,
& Nord, 1995; Scott, 1985, 1990).
For example, what is the impact of individual responses? That is, under what conditions are they
effective in meeting material needs and enhancing the individual’s sense of
dignity and autonomy? Under what
conditions do such individual actions additively result in unplanned social
change, apart from any formal political pressure or legal or policy changes? When do they serve as a kind of consciousness
raising and pre-politicization in which individual resistance eventually leads
to more organized political challenges, as against simply remaining
individualistic responses that inhibit such organized challenges (e.g., compare
the contrasting views in Gilliom, 2001; Martin, 1993; McCann & March, 1996;
Scott, 1985, 1990).
Surprisingly,
social researchers who generally rush to study protest groups have tended to
ignore the social dynamics, impact, and strategies of organizations that seek
to protect privacy and challenge surveillance (e.g., the Electronic Privacy
Information Center, the Electronic Frontier Foundation, the American Civil
Liberties Union, The Privacy Rights Clearinghouse and groups focusing on
particular forms such as junk mail, telemarketing, medical, genetic, workplace
and social security privacy). Such groups seek to directly affect political
processes, as well as offering information and resources to the public.
A
mapping of the organizational forms of resistance to the indiscriminate
collection and use of personal information could begin by analysis of the many
privacy groups, privacy media, privacy-enhancing services and tools, and
consumer and legal resources found on the “Privacy links” web page of the
Privacy Rights Clearinghouse (http://www.privacyrights.org).
Of
equal interest would be studies of how organizations involved in surveillance
and personal data collection activities, and groups representing them such as
the Direct Marketing Association, the American Association of Manufacturers,
the American Chamber of Commerce and various banking, insurance and credit card
companies, promote their activities and respond to challenges. Under what conditions do individual forms of
resistance, such as the reaction to the Lotus Marketplace and
related controversies (e.g., the Intel Pentium III chip with its unique
identifier) lead to retreat rather than intensified public relations and
lobbying efforts? In the Lotus MarketPlace
case, an effort to sell millions of names and addresses was met with a
massive public outcry that ironically was mobilized by email messages (Gurak,
1997).
Gabriel
(1999), in noting the hubris, pathos (and we might add bathos) surrounding
Foucault’s combining the Catholic omnipresent eye of God with the
Protestant will to sterile efficiency sought with modern control efforts,
observes “that our lives are controlled by diverse forces operating both on us
and through us cannot be doubted. That our lives can be reduced to these forces
in a totalitarian gloom runs against what history has to tell us” (p. 193).
As
the examples of neutralization suggest, the human spirit valiantly expresses
itself in the face of the machine. It frequently proves richer than the possibilities
anticipated and built into the latter.
However, victory may be short-lived. While the present analysis is
static, in reality the processes are fluid and dynamic. That is, just as new means of information
collection can lead to innovations in resistance, those in the surveillance
business respond to neutralization efforts with their own innovations that are
then responded to in a re-occurring pattern.
This
is not to deny the greater power of employers, manufacturers, merchants,
landlords, professionals, parents and state agents, such as welfare workers,
teachers, police, and prison guards, over those subordinate to them. Issues of
power and control are central to the kinds of surveillance that become social
issues. Yet power is rarely a zero-sum game. Beyond varying degrees of
normative constraint on power holders, power is not unlimited because it is
often rooted in interdependency and occurs on a broad and decentralized scale.
In informational-conflict settings in democratic societies, the advantages of
technological and other strategic surveillance advances are often short-lived
and contain ironic vulnerabilities.
Neutralization
is a dynamic adversarial social dance involving strategic moves and
counter-moves. It has the quality of an
endless chess game. This emergent
phenomenon is well worth studying as a conflict interaction process. Of particular interest here is escalation
and "the see-saw principle" of military technology in which new
developments are balanced by counter-developments. Consider for example the
appearance of the “unshredder,” a computerized document reconstruction process
that appeared in response to the initial strip-cut type of shredder.
This
cat-and-mouse reciprocity raises questions such as: How are neutralization and
counter-neutralization techniques discovered, chosen, combined and diffused?
Useful in studying this would be the “how to do it” literature made possible by
a free market and free press (with an enormous boost from the Internet) and
communications of the surveillance industry. What are the major “career paths”
and “life cycles” of techniques of surveillance and neutralization? Does the
lag time vary by the properties of the technique and characteristics of the
players, by resource and skill factors, or
by the perceived importance of success and cost of failure?
In summary, I have noted factors encouraging compliance with and subverting the vast expansion in the collection of personal information. I have identified 11 behavioral techniques of neutralization intended to subvert the collection of personal information and some issues for research. In spite of doomsday scenarios with respect to the death of privacy, in societies with liberal democratic economic and political systems, the initial advantages offered by technological developments may be weakened by their own ironic vulnerabilities and, in Poe’s (1967, p. 153) term, “human ingenuity.” This is certainly not to argue for complacency because power imbalances (both legitimate and illegitimate) are central to a majority of surveillance situations. It is, however, to ask that our vigilance be informed by careful empirical research and analysis.
Agre, P., & Rotenberg,
M. (1997). Technology and privacy: the new landscape.
Cambridge: M.I.T. Press.
Alpert,
S. (this issue). Protecting medical privacy: Challenges in the age of genetic
information. Journal of
Social Issues.
Andreas,
P. (2000). Border Games. Ithaca: Cornell University Press.
Bandura,
A. (1999). Moral disengagement in the perpetration of inhumanities. Journal
of Personality and Social Psychology Review, 3, 193-209.
Bennett, C., & Grant, R. (1999). Regulating privacy: Data
protection and public policy in Europe and the United States. Toronto: University of
Toronto Press.
Boyne,
R. (2000). Post-panopticism. Economy and Society, 29 (3),
285-307.
Brin,
D. (1999). The transparent society. New York: Perseus.
Brook,
J., & Boal, I.A. (1995). Resisting the virtual life: The culture and
politics of information. San Francisco: City Lights.
Corbett,
R., & Marx, G. (1991). No soul in
the new machine: Technofallacies in the electronic monitoring movement. Justice
Quarterly, 8 (3), 399-414.
Coser,
R. (1961). Insulation from observability and types of social conformity. American Sociological Review, 26, 28-39.
Dershowitz,
A., Silverglate, H., & Baker, J. (1976).
The JDL informer and the bizarre bombing case. The Civil Liberties Review,
April/May.
Ericson,
R., & Haggerty, K. (1997). Policing the risk society. Toronto: University of
Toronto Press.
Foucault,
M. (1977). Discipline and punish: The birth of the prison. New York:
Pantheon.
Froomkin,
A. M. (2000). The death of privacy? Stanford Law Review, 52(5),
1461-1543.
Gabriel,
Y. (1999). Beyond happy families: A critical re-evaluation of the
control-resistance-identity triangle. Human Relations, 52 (2), 179-203.
Gandy,
O., Jr. (1993). The panoptic sort: A political economy of personal
information. Boulder, CO: Westview.
Garfinkel,
S. (2000). Database nation. Sebastopol, CA: O’Reilly.
Gilliom,
J. (2001). Overseers of the poor. Chicago: University of Chicago Press.
Goffman,
E. (1961). Asylums: Essays in the social situation of mental patients and
other inmates. Garden City, NY: Anchor Books.
Goffman,
E. (1963). Stigma: Notes on the
management of spoiled identity. Englewood Cliffs, NJ: Prentice-Hall.
Goffman,
E. (1974). Frame analysis. NY:
Harper Colophon.
Gurak,
L. (1997). Persuasion and privacy in cyberspace: The online protests over
Lotus Marketplace and the Clipper Chip. New Haven, CT: Yale University
Press.
Gutwirth,
S. (2002) Privacy in the Information Age. Boulder: Rowman and
Littlefield.
Jermier,
J., Knights, D., & Nord, W. (1995). Resistance & power in
organizations. London: Routledge.
Kelvin,
P. (1973). A social-psychological examination of privacy. British Journal of
Social and Clinical Psychology, 12, 248-261.
Lyon,
D. (2001). Surveillance society:
Monitoring everyday life. Buckingham, U.K.: Open University Press.
Margulis,
S. T. (this issue). On the status and contribution of Westin's and Altman's
theories of privacy. Journal of Social Issues.
Martin, B. (1993). Antisurveillance. Anarchist Studies,1, 111-129.
Marx,
G. T. (1981). Ironies
of social control: Authorities as contributors to deviance through escalation,
nonenforcement and covert facilitation. Social Problems, 28 (3),
221-246.
Marx,
G. T. (1988). Undercover: Police surveillance in America. Berkeley:
University of California Press.
Marx, G. T. (1995). The
engineering of social control: The search for the silver bullet. (pp.
225-246). In J. Hagan, & R. Peterson
(Eds.), Crime and inequality. Stanford, CA: Stanford University Press.
Marx, G. T. (1998). An
ethics for the new surveillance. The Information Society, 14
(3), 171-185.
Marx, G. T. (1999). Measuring
everything that moves: The new surveillance at work. (pp. 165-189). In I. Simpson, & R. Simpson (Eds.), Deviance
in the workplace. Greenwich, CT: JAI.
Marx, G. T. (2001). Murky
conceptual waters: The public and the private. Ethics and Information
Technology, 3 (3), 157-169.
Marx, G. T. (2002a). Technology
and social control. (pp. 15506-15511). In N. Smelser, & P. Baltes (Eds.), International Encyclopedia of the Social
and Behavioral Sciences. Oxford, England: Pergamon.
Marx, G. T. (2002b). Technology and gender: Thomas I. Voire and
the case of the Peeping Tom. Sociological Quarterly, 43 (3).
Marx, G.
T. (forthcoming). Windows into the
soul: Surveillance and society in an age of high technology. Chicago:
University of Chicago Press.
McAdam,
D., & Snow, D. (1997). Social
movements: Readings on their emergence, mobilization, and dynamics. Los
Angeles: Roxbury Press.
McCann,
M., & March, T. (1996). Law and
everyday forms of resistance: A socio-political assessment. (pp. 201-236). In A. Sarat, & S. Silbey (Eds.), Studies
in Law, Politics and Society. Beverly Hills, CA: SAGE.
Nadelmann,
E. (1993). Cops across borders.
University Park, PA: The Pennsylvania State University Press.
Orwell,
G. (1990). 1984. New York: New American Library Classics.
Pfaff,
S. (2000). The limits of coercive surveillance. Punishment and Society,
3 (3), 381-407.
Poe,
E. A. (1967). Tales of mystery and imagination. New York: Gramercy
Books.
Privacy
Act, 5 U.S.C. Sec. 552a et seq. (1974)
Regan,
P. (1995). Legislating privacy: Technology, social values and public policy.
Chapel Hill: University of North Carolina Press.
Rosoff,
S., Pontell, H., & Tillman, R. (2002). Profit without honor. Upper
Saddle River, NJ: Prentice-Hall.
Scott, J. C. (1985). Weapons of the weak:
Everyday forms of peasant resistance.
New Haven, CT: Yale University Press.
Scott, J. C. (1990). Domination and the arts of
resistance: Hidden transcripts. New Haven, CT: Yale University Press.
Shelley,
L. (1996). Policing soviet society: The evolution of state control.
London: Routledge.
Sieber, S. (1981). Fatal remedies. New York:
Plenum.
Smith,
R.E. (1990). War stories. Providence, RI: Privacy Journal.
Staples,
W.G. (2000). Everyday surveillance: Vigilance and visibility in postmodern
life. New York: St. Martin’s Press.
Stinchcombe,
A. (1963). Institutions of privacy in the determination of police administrative
practice. American Journal of
Sociology, 69, 150-160.
Sykes,
G. (1971). The society of captives. Princeton, NJ: Princeton University
Press.
Sykes,
G., & Matza, D. (1957). Techniques of neutralization: A theory of
delinquency.
American
Sociological Review, 22, 664-670.
Tenner,
E. (1996). Why things bite back.
New York: Knopf.
Tower,
J. (1987). The Tower Commission
report. New York: Bantam Books.
Wood,
D. (2002). Panoptic Prophecies: Science Fiction Literature, Surveillance and
Social Control. Paper delivered at Technotopias Conference, University of Strathclyde.