Technology and Social Control: The Search for the Illusive Silver Bullet Continues
Encyclopedia of the Social & Behavioral Sciences, 2nd edition, forthcoming

By Gary T. Marx, Professor Emeritus, MIT


Since the latter half of the 20th century there has been a significant expansion in the use of science and technology for purposes of social control. The wheels that began turning with industrialization continue to gain momentum. Control through technology is central to the working of modern society. 9/11 and the war on terror have brought increased attention to the issue, but they reflect continuity rather than disjuncture. Consider the intensification of control seen in the Patriot Act and other legislation; the creation of the Department of Homeland Security; vastly augmented expenditures for research and development; the spread of tools such as DNA fingerprinting, RFID chips, drones, enhanced travel and border controls, and data-based risk management and predictive tools; and the increased sharing of information and integration among public and private and local, national and international agencies. Many technologies developed for the military, such as the Internet, satellites and sensors, have diffused to other institutions. Yet the use of tools to engineer behavior goes far beyond particular historical events and national security to everyday life, whether this involves work, consumption, health, recreation, school, or families and friends.

The engineering of social control is one of the defining characteristics of modern society. It is so prominent, ubiquitous and transparent in daily life that it is often taken for granted. Our personal spatial, communication, social, cultural and psychological environments and borders are increasingly subject to technological strategies designed to influence behavior, whether involving conformity with rules, safety, consumption or attitudes. In some ways, contemporary social control can be seen to reflect the ethos of the highly regulated prison, in which conformity is sought through the design of ever more features of the environment rather than by trusting the individual or facing the uncertainty of human will and choice. Are new threats, technologies, expectations and ways of living resulting in a move toward our becoming a "maximum security society"? The engineering of control (in both its hard and soft forms) is an important component of our contemporary surveillance society. (The Marx article on surveillance studies expands on these developments.)

A major strand of social control involves the enforcement of rules and standards. This article provides a classification framework for technology-based efforts to prevent or reduce the harm from, or the attractiveness of, rule violations and, when that is not possible, to increase the likelihood of discovering and apprehending violators. Such enforcement activities are distinct from other aspects of social control such as the creation of norms, processes of adjudication and sanctioning, or the broad societal guidance and integration that was of concern to early theorists of industrialization and urbanization (see the articles on Ross and Durkheim).

The engineering of rule enforcement through material artifacts (architectural and product design, sensors and alarms, access controls, software) is also distinct from forms of social control such as the creation and manipulation of culture, socialization, the redistributive rewards and penalties of the welfare state, and interpersonal influences that are likewise designed to shape behavior.

Contemporary efforts build on, but go far beyond, a medieval fortification ethos. Of course the inventors and builders of the first locks, safes, moats and walled castles and the developers of early biometric identification systems (e.g., the Italian criminologist Cesare Lombroso 1835-1909) were engaged in the engineering of social control. What is new is the scale and relatively greater scientific precision, power, omnipresence, continual invention and experimentation and rapid global diffusion. Technical means of control saturate modern society, colonizing, documenting and in some ways restricting ever more areas of life.

The roots of contemporary social control lie in the development of large organizations and standardized control technologies (Beniger 1986). They are one strand of broad processes of rationalization, professionalization and specialization occurring with modernization (Weber 1964; Rule 1973; Foucault 1977; Cohen 1985; Laudon 1986; Zuboff 1988; Gandy 1993; Lyon 1994; Shenhav 1999). The heterogeneity, scale, mobility and anonymity of mass society and the pursuit of efficiency and effectiveness encourage reliance on external, impersonal, distance-mediated, technical means and database memories that locate, identify, register, record, classify and direct individuals.

The perception of catastrophic risks in an interdependent, global world relying on complex technologies drives the search for definitive technical solutions. Consider issues such as terrorism, crime, obesity, drug abuse, AIDS, border controls, the need for economic competitiveness and unprecedented flows of persons, information and goods. Developments in electronics, computerization, artificial intelligence, cognitive science, biochemistry, architecture, materials science and many other areas promise new control possibilities. Entrepreneurial efforts (whether by private sector actors such as the security industry or by governments such as the United States) have helped spread the technologies at home and abroad.

This use of contemporary technology contrasts with traditional approaches, in which environments were less likely to be designed with rule enforcement in mind, nor was there much probing beneath personal informational borders before untoward incidents occurred. Current preventive and anticipatory responses contrast with earlier reactive forms of control in which enforcement agents tended to become involved only after a violation occurred. The engineering emphasis may be on environmental conditions, the control agent or the actual or potential offender. Consistent with the classical deterrence ideas that Thomas Hobbes (1588-1679) emphasized, some strategic efforts aim to create self-control and rational calculation on the subject's part; others, however, seek greater certainty through applying means believed to be fail-proof. Rather than attend to the consciousness or will of subjects, the emphasis through automation is on eliminating, or at least significantly inhibiting, their ability to violate the rule. Engineered efforts also seek to limit the ability of social control agents to demonstrate incompetence, mistakes, corruption or discrimination, as the machine is geared to get "the human out of the loop."

Software programs aimed at prevention and early intervention have proliferated. There are many actuarial assessment protocols that profile and assign predictive scores to places and persons with respect to their presumed threat, vulnerability and risk. This has resulted in the expansion of various kinds of watch lists for at-risk populations such as the mentally ill and sex offenders and those believed to be at risk of becoming homicide offenders or victims.

Any effort to influence other persons or events can be seen as a form of engineered social control using tools exerted by an agent on a subject (or a population of potential subjects) or on an object. Engineering is present whenever a strategy based on beliefs about means-ends relations is applied, and it need not involve material factors. Praying as well as magic would thus be examples, although they are far from the direct, science-based efforts of interest here that involve rule enforcement. Failing to take any action, as part of a strategy to avoid seeing a situation escalate, is also an example.

Closer to the topic of this article, although also distinct, is the design of presumably fail-safe products automated for safety, as with the "dead man's switch" on a train or the protective shield on a table saw, which are intended to transcend human will. Engineering is also present in non-automated situations designed to persuade or guide behavior (but with choice remaining): advertisements to discourage smoking; advice to political candidates to present themselves as competent and warm and, rather than denying the claim of a negative advertisement, to affirm an alternative positive idea; a waiter reporting the kinds of beers offered by moving from the least to the most expensive, anticipating that the customer will be prone to opt for the most recently mentioned beer; manufacturers increasing the nicotine or sugar content of cigarettes and food; or providing uncomfortable seating in fast-food restaurants to encourage rapid turnover. Also related are strategic efforts that permit, or even facilitate, rule violations, as with deception or reverse engineering. Consider police undercover opportunities, spoofing computer security or creating situations that permit blackmail. While sharing a strategic logic, the above efforts are distinct from rule enforcement efforts.

Six Strategies

Six engineering (primarily machine- or material-based) efforts to eliminate or limit violations through control of the physical and social environment, rather than through mere appeals to do the right thing (or at least what an agent desires), are considered below. This is followed by discussion of some social and ethical implications of such efforts.

The six ways of controlling persons and/or environments emphasize protecting or altering the victim, or whatever is desired in the violation, so as to make it impossible, more difficult or less inviting for the potential offender to act; or, should prevention fail, the tactics are intended to increase the likelihood of identification and apprehension. Some engineering efforts involve traditional notions of target hardening. But the engineering of control may also involve the idea of suspect softening or weakening.

Strategy 1 (and sometimes strategies 4 and 5 below) involves primary, direct prevention efforts. These are designed to eliminate the offense or to increase the difficulty of carrying it out. With the primary engineering strategy it is not necessary to affect the will or calculation of the potential rule breaker. The subjective orientations of the actor (whether based on calculation, a content-filled socialization, or a contentless discipline) are simply ignored. The emphasis is on altering opportunity structures and capabilities rather than the person's conscious choices. The social engineering example of castration as a device to control sexuality (whether literal or, as currently may be done, chemical) clearly contrasts with appeals to virtue to accomplish the same end.

But primary strategies are not always available and may not live up to their promise. Hence we see a series of secondary engineering strategies (2, 3, 5 and 6 below) where concern with the will of the violator can be a factor. The more traditional goal of deterrence may be sought by affecting the calculations of potential violators: devaluing and insulating targets, increasing the costs of non-conformity, and increasing the likelihood that violations and violators will be discovered and that evidence can be traced. What cannot literally be prevented may nonetheless be deterred by eliminating the gain, altering the cost-benefit ratio, or enhancing the chances for identification and apprehension.

As in other areas of social intervention such as public health, contemporary approaches put a strong emphasis on prevention and harm minimization. Rather than relying on persuasion (which assumes situations where individuals have a choice about whether to respect a rule or to behave in a way that an agent wants them to), the emphasis is on direct interventions to design problems away. Ideally problems are anticipated and eliminated (1, below); or where that is not possible, the goal is to create deterrence by reducing the gain (2), making violation more difficult and more costly (3, 4, 5), or by increasing the likelihood of identification and apprehension (6).

  1. Target or facility removal. The logic of prevention is clearest and most effective here. Something that is not there cannot be taken or used. The move toward a cashless society is one example. Merchants who accept only credit or debit cards, or whose registers never hold more than a modest amount of cash, are unlikely to be conventionally robbed. Furniture built into the wall cannot be stolen. Subway cars and buses made with graffiti-resistant metals are hard to draw upon. Police stings that pass off a harmless white substance as cocaine, substitute fake for real dynamite or provide inoperable weapons offer evidence of intent to commit a crime, but preclude harm resulting from it. Conversely, deception as disguise may work in the other direction, hiding the appearance of something valuable, as with painting gold bars black.

  2. Target devaluation. Here the goal is to reduce or eliminate the value of a potential target to anyone but authorized users. The target remains, but its uselessness makes it unattractive to predators. Examples include products that self-destruct, as with some car radios when stolen, or that leave clear proof of theft, as with exploding red dye packs that stain money taken in bank robberies. The broken tamper-proof seal on food products serves as a warning and is intended to deter those who would tamper. Telephones, computers, automobiles and even guns that can be used only with a unique, presumably non-transferable biometric means of identification (e.g., a retinal, voice or hand geometry pattern) offer another example. Disposable access codes that can be used only once need not be protected. Encrypted messages can often be easily intercepted; however, absent the decryption key, the data are useless. As a tool against teenage congregating, some mall shops play classical music in front of their stores.

  3. Target insulation. With this ancient technique the object of desire remains, but it is protected. Perimeter-maintaining strategies such as fences, walls, moats, guards and guard dogs can be separated from more specific protections surrounding an object, such as safes, armor, chastity belts, goods kept in locked cases or chained to an immovable object, and the hiding or disguising of valuables or persons. Other examples are high-security gated communities in which access and egress are carefully controlled, and the use of networked sensors, alarms, internet video and bulletproof barriers in banks separating customers from tellers (a form in Europe controls both entry into and exit from a bank through an enclosed booth). The architectural development of "sky-walks" linking downtown private buildings creates "sanitary zones" more subject to control than the potentially disorderly public streets below. Softer forms of border protection lie in the use of access codes based on PINs, passwords and biometric measures.

  4. Offender weakening or incapacitation. This classic strategy seeks to render potential offenders harmless with respect to the will, or the ability, to violate the norm in question or, if the violation occurs, to escape. The means may act directly on the body, permanently altering it to make certain offenses impossible (cutting off the hands of thieves) or at least uninviting, as exemplified in the operant conditioning seen in the novel A Clockwork Orange (Burgess 1962). A negative association with a given undesirable form of behavior may be created. After ingestion of a substance such as Antabuse, an unpleasant physical reaction (gagging or vomiting) follows when alcohol is consumed. The morphine derivative Trexan is used in detoxification from methadone and heroin addiction.

    Drugs such as Depo-Provera may be used to reduce sex drive, and birth control implants, as well as sterilization, are sometimes judicially mandated. Altering serotonin levels and psychosurgery have been used to curb violence. Passivity, disorientation or the inability to flee may be created by sensory weapons. Various citizen protection devices that can be used defensively, such as mace, fit here, as do non-lethal crowd control devices such as electrical, chemical, strobe and acoustical immobilizers that disorient, stop, restrain or block individuals (e.g., Tasers, pepper spray, loud music, flash-bang devices, beanbag shotguns, sticky foam released on a floor, straitjackets, and a cage dropped on, or a net fired at, those to be controlled). Spikes in the road may be used to stop a vehicle, and some cars can be remotely stopped.

    Related efforts deal not with the body of the offender but with the instrumentalities involved in the offense. Examples include anti-drunk-driving interlock systems, which require passing a breath analyzer test attached to the automobile ignition system before a car will start; cell phone muzzling devices for automobiles that use GPS sensors to prevent the phone from being used while the car is being driven; and mixing a bad-smelling chemical into a product to prevent it from being inhaled for its hallucinatory effects. An anti-paparazzi device identifies the presence of a digital camera and can then remotely disable it through projection of a light beam.

  5. Exclusion. Potential offenders have traditionally been kept away from targets or tempting environments by exile, prison, curfew and place or activity exclusions (e.g., alcohol and cigarettes for juveniles). A related form is the visible warning offered by a stigma such as the brand or clipped ear of offenders in medieval Europe, which encouraged others to stay away. Electronic monitoring or location devices based on GPS are contemporary examples. In one form an alarm goes off and a message is sent to authorities if an adjudicated person wearing a transmitter gets too close to a prohibited person (such as an abused spouse) or leaves (or enters) a restricted area.

    Capital punishment is the ultimate form of exclusion. At the other extreme is prevention of birth. With the Human Genome Project completed, eugenics is likely to become more contentious. For example, the belief (which ignores interactions with the environment and the socially crafted character of most rules) that DNA is linked to violence and other anti-social behavior could generate another ultimate form of exclusion: requiring a license indicating an "acceptable" genetic pattern before a child could be born. (See the entries on DNA and eugenics.)

  6. Offense/offender/target identification. Where it is not actually possible to prevent the violation physically, or where that is too expensive, it may be possible at least to know that it took place and perhaps who is responsible and where they are, and/or to locate contraband (as with x-ray, metal and chemical detectors). The goal is to document the violation and identify the violator. A major goal of nineteenth-century forensic science was to develop reliable biometric measures of identity based on the analysis of fingerprints, facial measurements and chemical properties (Thorwald 1965). These have expanded significantly to include a person's gait, voice and even distinctive smell; one technique used by the former East Germany involved identifying individuals by their unique scent. Architectural design emphasizing visibility as a deterrent fits here (Newman 1972), as do video, audio, motion and heat detection means and access codes that are presumed to document who enters or leaves an area, or who is using a resource such as a computer. Hand-activated personal alarm systems, luggage alarms that go off if a purse or suitcase is illegitimately moved or opened, gunshot detection and location devices, cameras to detect speeders and red-light violations, and the electronic tagging of consumer items in stores or books in libraries are other examples.

    Hidden identification marks left on paper by the manufacturer or by photocopying machines, and identifiers in material that can be used to make explosives, would be included here. The various "see something, say something" campaigns, dependent on the ease and ubiquity of multi-purpose cell phones and e-mail, are a related example. Citizens are encouraged to use hot lines to report, for example, suspicious persons and objects, erratic highway drivers, drug dealing, poaching and organizational malfeasance. The police in turn use mass communications media to help identify and locate wanted persons via Amber alerts, posting warrant information on web sites and crime re-enactments on television.

    Enhancements to criminal history data systems involving new tools for crime mapping of hot spots and analysis (e.g., COMPSTAT) have appeared and there is increased sharing of information within and between control agencies and between the public and private sector. There has been a significant expansion of surveillance practices for which there are no, or minimal, specific reporting requirements, as with GPS data, stored communication and subscriber records (e.g., social media sites, texting, web browsing and searching).

Some Other Social Control Dimensions

The concepts above are based on combining several aspects, such as whether the focus is on the potential offender, the victim or a resource that is part of the violation, and whether the means literally prevents violation or simply raises its risks and costs. Combining separate dimensions into such ideal types can be useful as a shorthand for classification and comparison. Yet this approach can also distort by merging sources of variation that can be analyzed separately and by excluding other variables.

Another approach starts with single dimensions that may cut across the different forms. Among relevant dimensions are: visibility or invisibility; openness or secrecy regarding use of the tactic, both generally and specifically; control of access into or out of a system; emphasis on the individual, an organization or a network; a focus on the body, consciousness or the environment; reliance on machines, humans or both; normative or non-normative influences and, if the former, whether this involves criminal or civil law, policies, or manners; the relative costs of a tactic, including errors and mistakes; reliability and validity and the ways of determining these; the presence or absence of democratic decision-making and review processes regarding the adoption and application of a tactic; and whether the goals of subjects and agents are the same, or at least overlap, or are in conflict. While all technological control efforts have elements in common, those involving dissensus and inequality are particularly important because of the centrality of human rights issues and the prominence of control and counter-control efforts.

Some Social and Ethical Implications

Whether dealing with questions of justice, quality of life, security, health, the environment or other issues, the potential benefits of science and technology are hardly deniable. In the case of criminal justice, for example, Clarke (1997) documents a number of successes. Yet given the sense of urgency about many problems and what is often the self-justifying, tunnel-vision rhetoric of those offering solutions, caution is needed and, often, mid-course corrections, limitations or prohibitions. To argue for a yellow light is certainly not a call to cease innovation or the search for better solutions. However ideal a technical control system may appear in the abstract or under laboratory conditions, and however successful in the short run, the world of application is often much messier and more complicated than public relations efforts claim. There is rarely a perfect, or cost-free, technical fix (if nothing else, a given choice is likely to involve using resources that might have gone elsewhere). The technology's narrowing of focus on a given problem may come at the cost of failing to see larger systemic contexts, alternatives and longer range consequences. The complexity and fluidity of human situations make this a rich area for the study of trade-offs, irony and paradox. Goal conflicts, unintended consequences and means of neutralization are among the factors limiting technical efforts to ensure conformity discussed below.

Goal conflicts

At an abstract level, consider the possible tension between values. In the case of the new super-maximum security prisons there is the enduring tension between the values of custody and punishment as against care and some form of rehabilitation (Rhodes 2004). More intensive mechanical control, whether within the prison or the community, often comes with a diminution of human contact and of help with efforts to overcome the social and personal deficits that contribute to violations. Short-term gains in control may come at the cost of longer-term losses (Byrne, Taxman and Hummer 2007). Informing subjects that a technique is in use is intended to serve the goal of deterrence, but that information may in turn serve as a strategic support for clever rule breakers who will then seek ways around it.

The expectation that one should be judged as an individual and in context may conflict with the greater rationality and predictive success believed to be found in categorical responses based on aggregate data and models divorced from the richness of particular situations. An automatic process that eliminates the misuse of authority and appears to offer fairness (as in universal treatment) can conflict with the need to respond to the uniqueness of particular contexts. The latter requires discretion to override action by a machine unable to take account of the richness of the situation. More concretely, goal conflicts in the immediate situation can be considered. Barriers need to keep out those who are uninvited, while making it possible for those contained within to leave in the case of an emergency. Bars over windows to keep out thieves may also prevent occupants from escaping through the window in the event of a fire. Conversely, barriers intended to keep persons or animals contained within a facility may leave them unable to get out when there is a fire (e.g., in a prison or a horse stall).

In commercial settings where access to merchandise is important, attaching expensive clothes (e.g., leather jackets) to a rack with a locked cable reduces the likelihood that an item will be stolen, but it also complicates trying clothes on and discourages impulse buying. Encryption of information offers security, but at the cost of added expense, slower transactions and the risk of losing the key.

Unintended consequences

Situations involving unexpected and unwanted results offer a rich area for analysis (Merton 1957; Sieber 1982; Marx 1981). It may be difficult to limit the impact of a technology. Terms such as blowback, collateral damage, backfire and overshooting the target capture this. Techniques that immobilize suspects may do the same for control agents (e.g., sound wave technology intended to cause suspects to lose control of their bowels, or a slippery banana-peel-like substance that made it difficult to walk). Uncontrollable wind patterns may send tear gas to places where it is not directed (including back on the controllers). Enhanced lighting and lines of visibility can help perpetrators identify victims or control agents, as well as the reverse. The roads used by the Roman legions venturing forth to conquer became equally available to other conquerors who later marched on Rome. President Nixon, in secretly taping others, also taped himself, leading to his downfall. Conversely, a protective device can lock everyone out if the keys are lost. The removal of benches from public areas denies the homeless, as well as others, a place to sit.

Automatic processes can result in punishment without trial. For example, a thief in Mobile, Alabama, was killed in a trap set by a homeowner. The trap consisted of two hunting rifles in separate locations, one pointed down a staircase; the rifle fired when the thief stepped on a wire rigged to the trigger. A neighbor called police when he heard a shot fired and then entered the house himself (New York Times, Dec. 28, 1989). It is easy to imagine good Samaritan scenarios that end disastrously (e.g., a passerby who is shot by a homemade burglar alarm after seeing a fire and rushing in to help).

There may be second-order effects. In their initial use, barrier strips intended to stop fleeing cars almost instantly released the air in the tires, sometimes causing high-speed crashes. Those publicly identified as sex offenders may face vigilante attacks. The death of thirty people in a fire in the London King's Cross subway station was attributed to fumes from anti-graffiti paint. Enhanced technical enforcement along the U.S.-Mexico border has led to a funnel effect in which immigrants seek to enter through more dangerous desert areas, with an increase in mortality (Cornelius 2001).

An intervention may interact with other conditions to produce an undesired outcome. Pepper spray is intended as a non-lethal alternative, but it may be fatal to those with severe asthma or other respiratory problems. Consider also the warnings to those with pacemakers that electronic sensors are in operation in retail settings. Will a non-lethal antitheft device that delivers a 50,000-volt shock be lethal to the driver of a stolen car who has a weak heart? There may be longer-term health consequences that are not immediately visible. Questions have been raised, for example, about the effect of repeated exposure to radiation from x-ray search machines.

Displacement

Several forms of displacement can be noted, involving place, time, type of offense and offender (Reppetto 1976; Norris 1998). Issues of displacement are central to many control settings where there are conflicts of interest and where rule breakers, having some resources, find ways to beat control efforts. The commercialization of protection, as with hidden transmitters embedded in cars that permit locating them by remote activation, or gated communities to keep out would-be thieves, may lead to greater victimization of those unable to afford enhanced levels of security.

Derivative offenses may appear. The discovery that a target has been rendered useless to an offender may increase violence, whether as a resource to gain the needed access or out of frustration. For example, the appearance of "car-jacking" is related to more sophisticated anti-theft devices on cars. The use of access codes to activate autos and appliances may mean that the crime of burglary is converted to robbery or kidnapping, as thieves confront property owners and demand not only the property but also the code to make it work. A frustrated thief may respond to a self-destructing automobile radio by fire-bombing the car. New secondary laws that criminalize the possession of artifacts, or activities designed to thwart enforcement, may develop. Consider laws prohibiting the production, distribution and use of products intended to falsify drug tests, or prohibiting possession of a detector that identifies police use of radar for traffic enforcement. For the latter, the guilty face charges for the secondary offense of possession, even if they were not speeding.

Neutralization and Escalation

Whether out of self-interested rule breaking, principled rebellion or human contrariness, individuals can be very creative in neutralizing systems of control. In a free market economy new controls create incentives to develop means of neutralization. This may lead to a higher level of play but not fundamentally alter the game. A dynamic struggle may escalate—police use of body armor may lead to offenders using more powerful weapons and wearing armor as well.

That locks open with keys and borders require access points means they are eternally vulnerable. Marx (forthcoming) identifies a number of behavioral techniques of neutralization: strategic moves by which subjects seek to subvert controls. Reverse engineering permits a breaking move; not long after anti-theft ignition protection systems appeared on automobiles, a device that permitted bypassing the lock was available. No special tool is needed to spray paint over the lens of a video camera. In a distorting move, the initial anti-drunk-driving car interlock systems could be beaten by saving air in a balloon or by having someone else blow into the device to start the car. A variety of moves can be seen in efforts to defeat urine drug tests, from contaminating the sample with bleach on one's hand to the switching move of using a catheter to insert drug-free urine into the body. Discovery moves are illustrated by drivers' use of radar detectors to locate police radar. When systems cannot be technically defeated, as with very sophisticated encryption, their human context may be compromised, whether through coercion, corruption or deception. For example, a thief who could not break a manufacturer's sophisticated encryption nevertheless managed to embezzle millions of dollars by generating fake invoices; he did so by having an affair with the individual who had the decryption codes.

In many settings subjects have some rights and are entitled to challenge techno-controls. We thus see explanatory moves, wherein an unfavorable result is reframed in an acceptable way by offering alternative data and the claims of rival experts, or by making claims about rights and procedural violations. An empirically valid result does not guarantee a socially meaningful result. A DNA match between material from a crime scene and a suspect cannot reveal whether a death resulted from homicide or self-defense; the sample might have been planted, or a secure chain of evidence custody not maintained. A computer match between persons on welfare and those with bank accounts may reveal a person over the savings limit, but that is not proof of cheating, since funds may be held in trust for a funeral, something legally permitted but not necessarily built into the computer program. Audio and video recordings may reflect what was done and said, but they may not reveal why, or what a suspect intended. Seeing should not automatically mean believing: a suspect in an undercover scheme may have been threatened or entrapped off-camera. Nor is a drug test, even if "valid" in indicating the presence of drugs within a person's system, a necessary indication of a violation. Depending on the assessment used, if the threshold is set low enough it is possible to have a "false positive" reading merely from being in a room where marijuana was smoked, or as a result of medications or eating certain foods.

Some Additional Factors

The overselling of technical solutions may exaggerate risks, engendering immobilizing fear and an unduly suspicious and distrustful society, and waste resources. Or, instead, an unexamined faith in a tactic's supposedly fail-safe nature can lead to complacency and a false sense of security in which individuals are lulled into not taking other necessary precautions. Consider a Los Angeles case in which a man sentenced to house arrest and required to wear an electronic surveillance bracelet shot and killed his estranged wife. She had not reported his threats to police because she thought she was safe as long as he had the bracelet on.

The seemingly ever greater ease and efficiency offered by technological means can conflict with traditional liberty-protecting ideas of reasonable suspicion, minimization and impracticality. Of course, as a Texas judge reportedly said, "if you hang them all you will certainly get the guilty." Risk prediction technology, as in the case of profiling, may be statistically accurate across many cases but inaccurate with respect to a given case (this is the issue of aggregate rationality versus individual cases). Even if accurate in the individual case, it may conflict with the expectation of the individual to be judged on the basis of actual behavior, not a prediction of future behavior.
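
The gap between aggregate accuracy and the individual case can be made concrete with a small numerical sketch. The figures below are hypothetical, chosen only for illustration rather than drawn from any actual profiling system; they show how a screening tool that classifies 99 percent of all cases correctly can still be wrong about most of the individuals it flags when the predicted behavior is rare.

```python
# Hypothetical illustration of aggregate vs. individual accuracy.
# All numbers are assumptions chosen for the example, not empirical values.

population = 1_000_000   # persons screened
base_rate = 0.001        # 1 in 1,000 is an actual threat
sensitivity = 0.99       # chance the tool flags an actual threat
specificity = 0.99       # chance the tool clears a non-threat

threats = population * base_rate
non_threats = population - threats

true_positives = threats * sensitivity             # threats correctly flagged
false_positives = non_threats * (1 - specificity)  # non-threats wrongly flagged

flagged = true_positives + false_positives
ppv = true_positives / flagged  # share of flagged persons who are actual threats

print(f"Persons flagged: {flagged:,.0f}")                     # 10,980
print(f"Share of flagged who are actual threats: {ppv:.1%}")  # about 9%
```

Under these assumed numbers, roughly nine of every ten persons flagged are not in fact threats, which is the sense in which a statistically "accurate" tool can nonetheless mislabel the individual case.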

Improvements in the ability to identify rule breaking may vastly expand the pool of those subject to social control. Control systems may become overloaded. This may lower morale among enforcers who feel overwhelmed, or offer corruptible officials a resource (non-enforcement) to market. Since resources for acting on all the information may not be available, authorities may face charges of discriminatory enforcement.

Even if adequate resources for full enforcement action were available, organizational effectiveness could be harmed. Automatic technical solutions developed without adequate appreciation of complexity and contingency run the risk of eliminating the discretion, negotiation, compromise and informal understandings that are often central to morale and the effective working of organizations (Dalton 1959; Goffman 1961). In broadening the documented pool of violations and violators, authorities may feel compelled to take action in cases they feel it would be best to ignore. The rigidity of the machine and the limited possibilities for immediate innovation can be damaging. One strand of humor involves the automatic, unthinking and repetitive quality of many mechanical devices (note the classic Charlie Chaplin film Modern Times).

If technical solutions could somehow be effective at eliminating all rule breaking (setting aside the conflict between, the ambiguity of, and the lack of consensus on many rules), systems might become unduly rigid. Much innovation is initially seen as deviance. Experimentation and risk taking require room and can be aided by anonymity and secrecy. A socially transparent, engineered society would be more orderly, but likely less creative and dynamic and less able to respond to changing conditions.

If order depended only on technical means of blocking infractions, rather than on legitimacy, how would people behave if the means failed (e.g., in the case of a power failure and the failure even of back-up systems)? A social order based primarily on technical fixes is likely to be as fragile over time as one based primarily on overt repression.

Even if systems could somehow be made fool- and fail-proof with ever more, and more advanced, technology, there is a danger of viewing humans as robots rather than as creative beings with a conscience, capable of making choices about how they behave. The former image is inconsistent with belief in the dignity of the autonomous individual in a democratic society. Whatever a technology is capable of, the view of humans as volitional (and hence responsible for their behavior), and beliefs about the inviolability (absent clear justification) of the borders that protect private personal zones around one's body, mind, relationships, communications, physical space and past history, are central to ideas of respect for personhood. The tools we use to communicate say something about how we see human beings and what kind of a society we are and seek to become. Symbolism matters. So do precedents.

With a new and seemingly effective technique applied to important goals, it is important to ask, "where might this lead and what kind of a society is being created?" In the United States a future radically at odds with the nation's higher ideals is not likely to come by cataclysmic change, but gradually, in a thousand little ways, each perhaps understandable (if often aided by fear or seduction) but, in totality, creating a very different world, one arrived at by accretion under the radar rather than through full public dialogue.

The search for stand-alone mechanical solutions also avoids the need to ask why some individuals and groups break the rules, and it points away from examining the social conditions which may contribute to violations and the possibility of changing those conditions rather than changing the individual. Technical solutions seek to bypass the need to create consensus and a community in which individuals act responsibly as a result of voluntary commitment to the rules, not because they have no choice or act only out of fear of reprisals. This emphasis can further social neglect and subsequent problems, leading to calls for more intensive and extensive reliance on technology in a seemingly endless reinforcing spiral.

There is a magisterial, legitimacy-granting aura around both law and science (Ericson and Shearing 1986). Technological controls, presumably being science based, are justified as valid, objective, neutral, universal, consensual and fair. Their legitimacy may be strengthened in free market societies where the tactics can often be used by citizens (e.g., video cameras to record police behavior or DNA analysis offered by a criminal defendant) and internally by police managers for guarding the guards.

Yet tools are socially created, and results are interpreted (and thus potentially disputable). They exist in dynamic, interdependent systems where interests may conflict, inequality is often present and full impacts may be difficult to envision. Critical inquiry and humility are as needed as innovation and experimentation. It is important to maintain, if not a doubtful, then at least a respectfully skeptical attitude toward claims for the new, at least until they have been examined both empirically and morally.

Our age has two rather distinct fears of technology. One, à la George Orwell, is that it will work too well, creating a manipulated, totalitarian society naively taking pride in how free it is. The other fear, reflective of Franz Kafka, is that it won't work well enough. This suggests a crazily complex, out-of-control, interdependent, opaque society steeped in technological errors and catch-22 absurdities.

A well-known, if often naïve, expression (given that individuals and groups do not start with equivalent resources) holds that where there is a will there is a way. This speaks to the role of human effort in attaining goals. With the control possibilities made available by science and technology, this may be reversed: where there is a way there is a will.

The myth of Frankenstein implies that we must be ever vigilant to be sure that we control the technology rather than the reverse. As Jacques Ellul (1964) argues, there is a danger of self-amplifying technical means silently coming to determine the ends, or even becoming ends in themselves, divorced from a vision of, and the continual search for, the good society.


References

Altheide D 1975 "The Irony of Security." Journal of Urban Life, 4:175-195

Andreas P 2000 Border Games: Policing the U.S.-Mexico Divide. Ithaca, NY: Cornell University Press

Beniger J 1986 The Control Revolution: The Technological and Economic Origins of the Information Society. Cambridge, MA: Harvard University Press

Bennett C and Grant R (eds.) 1999 Visions of Privacy: Policy Choices for the Digital Age. Toronto: University of Toronto Press

Brodeur J 2011 The Policing Web. Oxford: Oxford University Press

Byrne J, Taxman F and Hummer D (eds.) 2007 The New Technology of Crime, Law, and Social Control. Monsey, NY: Criminal Justice Press

Byrne J and Marx GT 2011 "Technological Innovations in Crime Prevention and Policing: A Review of the Research on Implementation and Impact." Journal of Police Studies, 20(3): 17-38

Clarke R (ed.) 1997 Situational Crime Prevention: Successful Case Studies. New York: Harrow and Heston

Clarke R 1988 "Information Technology and Dataveillance." Communications of the ACM, 31:488

Cohen S 1985 Visions of Social Control. Cambridge, MA: Polity Press

Dalton M 1959 Men Who Manage. New York: Wiley & Sons

Issenberg S 2012 The Victory Lab. New York: Crown

Ellul J 1964 The Technological Society. New York: Vintage Books

Ericson R, Shearing C 1986 "The Scientification of Police Work." In Bohme G and Stehr N (eds.) The Knowledge Society: The Growing Impact of Scientific Knowledge on Social Relations. Reidel: Dordrecht

Foucault M 1977 Discipline and Punish: The Birth of the Prison. Vintage: New York

Gandy O 1993 The Panoptic Sort: Towards a Political Economy of Information. Boulder, CO: Westview Press

Gibbs J 1989 Control Sociology's Central Notion. Urbana, IL: University of Illinois Press

Goffman E 1961 Asylums. Garden City, NY: Anchor Books

Graham S, Marvin S 1996 Telecommunications and the City: Electronic Spaces, Urban Places. London: Routledge

Haggerty K, Ericson R 1997 Policing the Risk Society. Toronto: University of Toronto Press

Janowitz M 1975 "Sociological Theory and Social Control." American Journal of Sociology, 81:82-108

Laudon K 1986 The Dossier Society: Value Choices in the Design of National Information Systems. New York: Columbia University Press

Leman-Langlois, S. 2013. Technocrime, Policing and Surveillance. London: Routledge

Lessig, L. 2006 Code: And Other Laws of Cyberspace. New York: Basic Books.

Lowman J, Menzies R, Palys T (eds.) 1987 Transcarceration: Essays in the Sociology of Social Control. Aldershot: Gower

Lykken D 1981 A Tremor in the Blood: Uses and Abuses of the Lie Detector. New York: McGraw-Hill

Lyon, D. 2003 Surveillance as Social Sorting: Privacy, Risk, and Digital Discrimination. London: Routledge

Marx GT 1988 Undercover: Police Surveillance in America. Berkeley: University of California Press

Marx GT forthcoming Windows into the Soul: Surveillance and Society in an Age of High Technology. Chicago: University of Chicago Press

Manning P 1992 "Information Technology and the Police." In M. Tonry and N. Morris (eds.) Modern Policing. Chicago: University of Chicago Press

Norris C, Moran J, Armstrong G 1998 Surveillance, Closed Circuit Television, and Social Control. Aldershot: Ashgate

Office of Technology Assessment 1985 Federal Government Information Technology: Electronic Surveillance and Civil Liberties. Washington DC: USGPO

Newman O 1972 Defensible Space. New York: Macmillan

Perrow C 1984 Normal Accidents. New York: Basic Books

Rule J 1973 Private Lives, Public Surveillance. London: Allen Lane

Sherman L 1992 "Attacking Crime: Policing and Crime Control." In M. Tonry and N. Morris Modern Policing. Chicago: University of Chicago Press

Shenhav Y 1999 Manufacturing Rationality: The Engineering Foundations of the Modern Managerial Revolution. New York: Oxford University Press

Blakely E and Snyder M 1997 Fortress America: Gated Communities in the United States. Washington DC: Brookings Institution

Tenner E 1997 Why Things Bite Back: Technology and the Revenge of Unintended Consequences. New York: Vintage Books

Thorwald J 1965 The Century of the Detective. New York: Harcourt, Brace & World

Weber, M 1958 From Max Weber: Essays in Sociology. Gerth H, Mills CW (eds.) New York: Oxford University Press

Welsh, B.C. and Farrington, D.P. 2007 "Crime Prevention and Hard Technology: The Case of CCTV and Improved Street Lighting", in Byrne, J. and Rebovich, J. The New Technology of Crime, Law, and Social Control. Monsey, NY: Criminal Justice Press

Zuboff S 1988 In the Age of the Smart Machine. New York: Basic Books
