Science, Technology, Ethics,
and Public Decision Making
Charles M. Vest
Context and Introduction
In universities we generally view ethics in science and technology in terms of critically important values and processes to be upheld in research and publication. Above all else, we are committed to maintaining a culture and processes that ensure scholarly integrity and scientific objectivity. As the research enterprise has become larger, more global, faster, more complex, and associated with valuable intellectual property, this commitment to scientific integrity has become more important than ever.
At a more macroscopic level, scientific knowledge is increasingly critical to major public and political decisions, e.g., about global warming, embryonic stem cells, privacy of digital communications, nanotechnology, and genetically-modified foods.
I think it is worth examining three past successes of science, technology, and public policy – recombinant DNA, the phase-out of ozone-depleting chemicals, and the deployment of the World Wide Web. Each of these involved people associated with MIT.
None of them map directly onto today’s issues, but they offer hints about better ways of proceeding.
Recombinant DNA technology (gene splicing) was a stunning new scientific tool in the 1970s. It enables scientists to transplant genes from one species into cells of a host organism of a different species. The public, and many scientists, worried that splicing together DNA from different species might create new organisms that would pose fundamental risks to life on our planet. Indeed, in 1974 a group of highly respected scientists, including many of the leading researchers in the field, called for a voluntary moratorium on certain classes of experimentation until the risks of gene splicing could be carefully assessed.
Subsequently, Chemistry Nobel Laureate Paul Berg of Stanford chaired a committee that considered the issues and organized a meeting at the Asilomar Conference Center in California in February 1975. About 140 scientists from 13 countries, including Phil Sharp and David Baltimore of MIT, as well as attorneys, government officials, and members of the press attended. The purpose of this conference was to decide whether to lift the moratorium and, if so, to define experimental conditions and protocols for safely conducting gene-splicing work.
Indeed the conferees decided that the moratorium should be lifted. They also outlined strict biosafety guidelines that were subsequently adopted by the U.S. National Institutes of Health, and ultimately were adopted in many other countries. These remain the basis of guidelines followed today.
Recombinant DNA technology has flourished as a ubiquitous tool in biological research, as the basis of important new drugs, diagnostics, and therapies, and indeed as the basis for the biotechnology industry, of which Cambridge and Boston form a major center. In recent years there has been some controversy, especially in Europe, about its application to agriculture and nutrition, but on the whole it is an accepted basis of important and wide-ranging scientific, medical, and agricultural endeavors.
It is noteworthy that:
- The Asilomar meeting and process, in my view, were wise and successful.
- The process was driven by key scientists.
- The moratorium and conference focused on both fundamental and practical risk assessment and safety.
- The conclusions of the Asilomar Conference were reached by consensus, but were not unanimous.
- Some historians and policy experts have criticized the work as having paid insufficient attention to ethical and legal considerations, and to implications for biological warfare.
- National governments adopted regulations based on the Asilomar recommendations.
- The process engendered considerable public trust.
CFCs and the Ozone Layer
Incredibly complex and delicate balances maintain our environment and the life forms that have evolved on earth. In recent history, humans have applied scientific knowledge and engineering principles to develop technologies that extend our capabilities, help us adapt to unwelcoming environments, build our economies, and increase our comfort. Sometimes our technologies upset the delicate balance in unexpected ways. Refrigeration turned out to be an unexpected example of such unintended consequences.
Life on earth depends on the naturally occurring trace gas ozone that resides in our atmosphere and protects us from being exposed to too much ultraviolet radiation. Refrigeration is one of the most important technological developments of the last century. It enables us to ship and store food, improves our comfort and health, and is critical to many industries.
In the 1930s new chlorofluorocarbon refrigerants (CFCs) were developed and heralded as “wonder chemicals” because, unlike the noxious refrigerants used in earlier refrigeration, they were nontoxic, nonflammable, and very useful. However, in 1974 at the University of California, Irvine, Sherwood Rowland and Mario Molina (who subsequently moved to MIT and became an Institute Professor) hypothesized that by a complex process human-made CFCs were causing a depletion of atmospheric ozone. Molina, Rowland, and Paul Crutzen shared the 1995 Nobel Prize in Chemistry for this work.
These scientists engaged the public, industry, and the political process to call attention to the dangers of ozone depletion and to stop it and allow the environment to heal.
In 1977 the United Nations, through its environmental program (UNEP), established the Coordinating Committee on the Ozone Layer. In 1978 the U.S., Canada, and several Scandinavian countries banned spray cans with CFC propellants. In 1985 the UNEP Vienna Convention on the Protection of the Ozone Layer was signed to promote cooperative research, the development of alternative refrigerants, and work on legal and policy matters, and to facilitate technology transfer.
In 1987, 24 countries signed the Montreal Protocol on Substances that Deplete the Ozone Layer. This protocol froze consumption of key CFCs at 1986 levels and reduced consumption by 50 percent over 10 years. Less developed nations were given a longer time to stop using CFCs than wealthier nations. Remarkably, the elimination of CFCs then accelerated beyond the Protocol’s schedule: Europe phased out CFCs by 1995, and U.S. production was zero by 1996.
Mario Molina has stated “The Protocol demonstrates how the different sectors of society – industrialists, scientists, environmentalists and policy makers – can be productive by working together, rather than functioning in adversary mode.”
It is noteworthy that:
- Science and scientists drove the process.
- The U.N. played a key role.
- The science was still somewhat speculative while the treaty was being negotiated.
- Industry came on board once the science was clear.
- The world moved forward to reduce risk.
- New technologies for replacing and recycling existing CFCs were important to solving the problem.
Deployment of the World Wide Web
Let me end with a very different kind of success story – that of the World Wide Web (WWW). The Web is a world-changing technology that seems to have evolved as a public good through a remarkably successful global collaboration. Its rapid, massive deployment followed an unusually good path in large measure because of the vision and leadership of Tim Berners-Lee, and a distinctive culture and worldview that dominated much of the computer science community.
In 1980, Tim Berners-Lee began the work at CERN in Geneva that was to form the basis of the WWW. In 1989 he authored an internal memo, Information Management: A Proposal, that noted that “Many of the discussions of the future at CERN and the LHC [Large Hadron Collider] era end with the question ‘Yes, but how will we ever keep track of such a large project?’” He proposed that hypertext would be the key, and that CERN engineers and scientists should involve themselves with hypertext “so that individually and collectively we may understand what we are creating.” In my view, this latter injunction is a critically important one that all too often has not been advanced or heeded in development of other technologies.
In 1990, Berners-Lee and two students began work on the line-mode browser. In 1993 the first World Wide Web Wizards Workshop was held here in Cambridge, MA. In 1994, Marc Andreessen formed Mosaic Communications Corp., which later became Netscape; another international conference, the “Woodstock of the Web,” was held at CERN; and European Commissioner Martin Bangemann of Germany reported on the European Union’s Information Superhighway plan.
Also in 1994, as a result of leadership from our late colleague Michael Dertouzos, and many others around the world, MIT and CERN announced an agreement to establish the World Wide Web Consortium (W3C) to “lead the World Wide Web to its full potential by developing protocols and guidelines that ensure long-term growth for the Web.”
The W3C is headquartered in MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and is jointly administered by CSAIL, the European Research Consortium for Informatics and Mathematics (ERCIM) in France, and Keio University in Japan.
Few technologies have so rapidly transformed the way we work, live, learn, and play as the WWW.
It is noteworthy that:
- Deployment of the Web was driven by key technical players (in an engineering and computer science culture).
- The vision for the Web was global from the very beginning.
- There were international discussions throughout.
- Informal and formal cooperation among CERN, MIT, the European Commission, DARPA, and others enabled the establishment of the W3C.
- Those who developed the Web were dedicated to open, vendor-neutral standards.
- The consortium is voluntary and global.
Questions, Lessons, and Recommendations
Among the questions raised by these brief case studies are:
- Could the Asilomar process work today when most scientific communities are so large and even more globally dispersed than in the past?
- What is different between today’s global warming challenge and that of the CFC phase-out? Is it the magnitude of near-term economic consequences? Is it the greater complexity and cost of mitigating technologies?
- Why did ideology apparently play a less dominant role in the CFC debates than in today’s issues?
- Could the informal, multi-national, multi-sector discussions that led to the World Wide Web Consortium occur today?
- How should we decide when technology deployment should be open, and when it should be market-driven?
Despite today’s changing context, these examples suggest some recommendations and remind me of important responsibilities we have as faculty and leaders in science, technology, and policy:
- Above all, maintain the integrity and objectivity of research and scholarship.
- Maintain the openness of our campus, scientific communities, and scholarly communication.
- Promote governmental and industrial investment in the future and create opportunity – globally.
- Help the public to understand risk.
- Maintain continual, respectful dialog with the public and political leaders.
- Fight – without arrogance – the rise of anti-rationality.
- Create cultures of innovation.
- Recognize the increasing role of industry and NGOs in policy matters, innovation, and problem solving.
- Continue to build and sustain good colleagueship with rising nations.
Editor's Note: This article was based on a talk given by Prof. Vest at a dinner sponsored by the Ethics Lab at MIT held on April 11, 2006.