EPA Center on Airborne Organics:
1996 Summer Symposium Report
"Advanced Instrumentation for Air Quality Measurements"


Contents:


Introductory Remarks

The 1996 EPA Center on Airborne Organics Summer Symposium on Advanced Instrumentation for Air Quality Measurements was co-organized by Richard Flagan of the California Institute of Technology and Charles Kolb of Aerodyne Research, Inc.

Overview and Sponsor Acknowledgment

Charles E. Kolb, Aerodyne Research Inc.
Chuck Kolb opened the symposium with an overview and acknowledgments. He pointed out that our understanding of atmospheric processes is largely based on measurement data. Air quality models are formulated to integrate our understanding in various fields and to give predictions that are used in developing regulations. The collection of real-time measurements with high spatial and temporal resolution is important for the advancement of the field. Another critical need is for measurements that reveal the role of atmospheric aerosols and cloud droplets, including their condensed-phase chemistry and heterogeneous reactions with trace gaseous species. Dr. Kolb acknowledged Richard Flagan, the co-organizer of the symposium; the EPA Center on Airborne Organics and its director, Adel Sarofim; the sponsors of the symposium; and Emmi Snyder, the coordinator of the symposium.

Need for Advanced Monitoring of Air Pollutants at EPA

Larry Cupitt, Atmospheric Process Research Division, EPA
Larry Cupitt next spoke about the need for advanced instrumentation at the EPA. EPA needs advanced monitoring technologies for two reasons: first, to improve the scientific understanding of sources, processes and effects of air pollution, and second, to monitor regulatory compliance, both for source emissions and ambient standards. These needs are intimately tied to the environmental health paradigm, which connects the emissions sources and atmospheric processes to exposure and effects. A number of issues are noteworthy: ecosystem protection, "one atmosphere" (holistic approach to environmental problems), fine particulate matter, endocrine disrupters, and source characterization, especially for fugitive emissions. For source emissions, the regulatory drivers are the Operating Permit Program and compliance monitoring. Here, "better, cheaper, faster" is the key.


Session 1: What Do We Do Now?

Moderator: Praveen Amar, NESCAUM

Measurement Lessons from Southern Oxidant Study (SOS)

Ellis B. Cowling, North Carolina State University
The goal of the 1995 Southern Oxidant Study (SOS) was policy-relevant research to further scientific understanding of ozone and to develop ozone management strategies. SOS provided data to address many assumptions in ozone management approaches, such as the form of the National Ambient Air Quality Standard (NAAQS), natural and human sources of NOx, VOC, and CO, and the relative importance of NOx and VOC controls. Experimentalists and modelers alike need to pay attention to processes with different time scales. Emissions vary greatly with time and ambient conditions. Biogenic emissions depend on soil moisture and the availability of light. Vehicular emissions change with the day of the week, traffic speed, and traffic volume. Ambient temperature also affects the emission rates from most sources.

Cowling summarized factors for the success of measurement campaigns. They include:

"The purpose of research is to create simple declarative sentences that tell the truth." "Truth is a perception of reality that is consistent with all relevant evidence and not contradicted by any important evidence." One of the traps for measurement campaigns is insufficient analysis and interpretation compared to data collection efforts. The solution includes involving analysts in the measurement phase of the program, realizing the need for "customers" who derive value from analysis and interpretation of data, and the formulation of specific science and policy questions that are to be addressed by specifically designed measurement systems.

Measurement lessons from the SOS include:

Three areas were discussed after the talk. The policy relevance of a study like SOS was reiterated -- SOS led to the consideration of a new secondary NAAQS for ozone because a seasonal average level is more relevant to the effects on vegetation. Ken Colburn observed that data analysis is not regarded as a science by some, and less funding is available for that purpose. Glen Cass argued that measurements deplete the funding available for analysis. Dr. Cowling agreed that when funding is limited, fewer measurements may have to be made so that comparable resources can be spent on measurement, analysis, and communication. The politics involved in funding research was also raised. There is a need to connect the research and policy agendas so that science can be used to address policy questions. Research is too frequently used as an alternative to control programs. Monitoring should not be neglected after a political decision is made. Measurement programs are popular among politicians, industry, and the public. Analysis of the data results in information that may require effective action to solve the problem. The political benefit seems to be greater in taking the first steps to understand a problem than in providing the solution.

Merging Models and Measurements

Gregory J. McRae, Massachusetts Institute of Technology
There is a critical need for a rigorous science base to design cost-effective strategies for improving human health and welfare. Much can be gained from using models, data, and uncertainty analysis to guide measurement programs that answer policy questions. Advanced instrumentation should be used to supplement current measurement strategies.

Meaningful verification of complex atmospheric models requires estimates of uncertainties in both predictions and observations. Uncertainty analysis methodologies must be compatible with existing modeling systems and be orders of magnitude faster than the traditional Monte Carlo method. DEMM (Deterministic Equivalent Modeling Method) directly embeds uncertainty into the model to predict the probability density functions of uncertain outcomes based on uncertain inputs, and to identify the parameters that contribute most to uncertainties in predictions. Applications of uncertainty information include model discrimination, resource allocation for measurement and modeling programs, alternative policy formulation, and design of stopping rules for model building and data collection. Dr. McRae gave examples from the uncertainty analysis of a chemical mechanism. In a case study, it was found that out of 19 photolysis rate constants in the SAPRC (Statewide Air Pollution Research Center) mechanism, those for the photolysis of nitrogen dioxide and formaldehyde are the main contributors to uncertainties in ozone prediction. This led to recommendations for routine monitoring of actinic flux and for further measurement of the quantum yield for formaldehyde photolysis. Other implications for measurement programs include the need for better initial conditions and source information, and for measurement of the nitric acid sink reaction rate. Moreover, knowledge of the spatial distribution of total organics concentrations provides more benefit for reducing model uncertainties than measurements of exact composition.
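To make the idea concrete, the sketch below shows this kind of variance-contribution analysis in Python. It uses brute-force Monte Carlo sampling rather than the DEMM polynomial expansion described in the talk, and the two-parameter toy model and its uncertainty factors are purely hypothetical; in a real application the toy function would be replaced by the photochemical mechanism embedded in an air quality model.

```python
# Sketch only: ranks uncertain rate constants by their contribution to the
# variance of a model output. A toy output function stands in for a full
# photochemical mechanism; DEMM itself uses polynomial (deterministic
# equivalent) expansions instead of the brute-force Monte Carlo shown here.
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Hypothetical uncertain inputs: two photolysis rate constants (relative units)
# with assumed lognormal uncertainty factors.
j_no2  = rng.lognormal(mean=np.log(1.0), sigma=0.2, size=n)
j_hcho = rng.lognormal(mean=np.log(1.0), sigma=0.4, size=n)

def toy_ozone(j1, j2):
    """Stand-in for a chemical mechanism's peak-ozone prediction."""
    return 80.0 + 40.0 * j1 + 15.0 * j2 ** 0.5

o3 = toy_ozone(j_no2, j_hcho)

# First-order variance contribution of each input (squared correlation).
for name, x in [("j_NO2", j_no2), ("j_HCHO", j_hcho)]:
    r = np.corrcoef(x, o3)[0, 1]
    print(f"{name}: ~{100 * r**2:.0f}% of output variance")
```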

Inverse modeling can be used to determine the uncertainties in model inputs from model outputs. One source of uncertainty in air quality models is the emissions inventory. Using Karhunen-Loeve (KL) expansions to represent emissions fields, the true emissions field can be estimated by finding the KL coefficients that minimize the error of the predictions. This procedure was applied to the Los Angeles basin SCAQS data. The optimized results provide much better agreement with the observed data than the base case results; VOC emissions had previously been underestimated by more than 25%.
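A schematic illustration of the inversion step follows: the emissions field is written as a prior field plus a small number of Karhunen-Loeve modes, and the coefficients are chosen to minimize the mismatch with observations. The basis vectors, observation operator, and data below are synthetic placeholders, not the SCAQS configuration.

```python
# Sketch of the inverse-modeling idea: adjust a small number of
# Karhunen-Loeve (basis-function) coefficients so that model predictions
# best match observations. Everything here (basis, observation operator,
# "true" field) is synthetic; a real application uses an air quality model.
import numpy as np

rng = np.random.default_rng(1)
ncells, nobs, nmodes = 50, 20, 3

base = np.ones(ncells)                   # prior (inventory) emissions field
Phi  = rng.normal(size=(ncells, nmodes)) # assumed KL basis vectors
H    = rng.random(size=(nobs, ncells))   # linearized model: emissions -> observations

a_true = np.array([0.8, -0.3, 0.5])      # coefficients of the "true" field
obs = H @ (base + Phi @ a_true) + rng.normal(scale=0.05, size=nobs)

# Least-squares estimate of the KL coefficients from the residual.
a_hat, *_ = np.linalg.lstsq(H @ Phi, obs - H @ base, rcond=None)
print("recovered coefficients:", np.round(a_hat, 2))
```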

In addition to taking advantage of model-driven measurement strategies, experimentalists should also make use of advanced instrumentation to gain useful information for modeling and regulatory purposes. Tunable diode laser (TDL) spectroscopy has the potential to contribute to the reduction of uncertainty in atmospheric modeling because of its fast response, high sensitivity, lack of artifacts, and ability to provide absolute measurements. Two examples of measurements by TDL were given. One is the simultaneous closed-path measurement of ozone, nitric oxide, and nitrogen dioxide in the same sample, which allows fluctuations, fluxes, and correlations of these pollutants to be studied. The other example is the real-time open-path measurement of vehicular emissions of nitrogen oxides, carbon monoxide, and carbon dioxide to reduce the uncertainties of mobile emission inventories.

Data Requirements from a Regulator's Viewpoint

John Elston, New Jersey Department of Environmental Protection
The speaker outlined the objectives for an air monitoring program, which are to collect data that are useful for (1) the assessment of public health standards, (2) public policy decision making, and (3) public communication. Such programs face challenges due to the changing social environment, including: increased federal/state partnerships, regionalization of air quality problems, increased cost of monitoring programs, and the changing focus between basic and applied research.

Air monitoring programs that are currently supported by state and federal governments include: NAMS/SLAMS, CASTNET, PAMS and special intensive projects. These programs typically include the following components: siting, operation and maintenance, auditing and assessment.

The speaker summarized recommendations for the primary ozone and the particulate matter NAAQS. For both standards, it has been recommended that the existing standards be revised based on health effects evidence. For ozone, the recommendation included: replacing the 1-hr primary standard with an 8-hour standard at a level in the range 0.07-0.09 ppm; consider replacing the current one-expected exceedance form with a standard that

For particulate matter, the recommendation included:

North American Research Strategy for Tropospheric Ozone (NARSTO) - Northeast

Al Ferullo
The speaker discussed the past and future NARSTO field studies. The field program conducted in 1995 included:

During the 1996 program, the following changes were made:

The speaker presented the lessons learned from the studies conducted to date, including the need for the following:


Session 2: What Do We Need To Do?

Moderator: Kenneth L. Demerjian, SUNY Albany

State of the Art in VOC/NOx Measurements

Bruce Harris, EPA Risk Management Research Laboratory
Bruce Harris provided an overview of new emissions measurement methods and programs developed at EPA. First, he reviewed the advances in manual methods for monitoring the species regulated under the 1990 Clean Air Act. Liquid impinger sampling was discussed for the measurement of reactive species such as isocyanates, aldehydes, and methanol. EPA Method 0023A, which uses isotopic standards for measuring dioxins and furans, was also mentioned.

Second, the EPA program of instrumenting heavy-duty diesel trucks was described. Diesel trucks are estimated to be a significant source of several pollutants, especially NOx and particulates. The emissions of 7 truck tractors are being characterized as a function of operating parameters such as speed, acceleration, load, and grade. Emissions measurements include O2, CO2, CO, NOx, THC, opacity, and exhaust flow rate and temperature. The goal of the study is to develop an emissions model for inventory and regulatory purposes.

Finally, a remote sensing study of volume source emission factors was described. An FTIR beam is scanned continuously along different paths on a plane perpendicular to the prevailing wind. Emission rates for several pollutants are derived from the FTIR measurements and meteorological measurements. The method has been demonstrated with species such as SF6, CH4, CF4, propylene, and ethane. It has proved accurate to within 10% in two series of field tests in which known amounts of tracers were released.

NOy and VOC Atmospheric Measurements: State of the Art

Fred L. Fehsenfeld, NOAA, Aeronomy Laboratory
State-of-the-art measurements can be used to elucidate the nature of processes, the sources of compounds, and the effectiveness of controls. The focus of the talk was the use of data analysis as a means of assuring the reliability of measurements.

A series of tests to assure data quality was described. The first is intercomparison between different measurement procedures for the same compound. The second test is an auto-comparison of compounds in the same data set: the ratio of compounds is expected to be bounded, based on the physical processes of dilution and hydroxyl radical (HO) reaction. The third test is species similarity: the ratio of species should be independent of the concentrations of the species. The fourth test is to use ratios of mixing ratios to relate two compounds to a third; the relations of the ratios are also expected to be bounded by the extreme values expected from pure dilution and HO destruction. These simple tests can be used to identify possible artifacts, interferences, and other problems in the data.
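As an illustration of the second and fourth tests, the sketch below checks whether an observed ratio of two co-emitted species falls between the bounds set by pure dilution and by HO-reaction aging. The emission ratio, rate constants, and HO exposure are hypothetical values chosen only for the example.

```python
# Sketch of a simple internal-consistency ("auto-comparison") test: the
# ratio of two co-emitted hydrocarbons should fall between the fresh
# emission ratio (pure dilution) and the value expected after OH-reaction
# aging. Emission ratio, rate constants, and OH exposure are hypothetical.
import numpy as np

k_a, k_b = 2.6e-11, 0.9e-11     # assumed OH rate constants, cm^3/molec/s
r_emit = 2.0                    # assumed fresh emission ratio A/B
oh_exposure = 5.0e10            # assumed maximum [OH]*t, molec s / cm^3

# Aged ratio after the maximum plausible OH exposure.
r_aged = r_emit * np.exp(-(k_a - k_b) * oh_exposure)
lo, hi = sorted((r_emit, r_aged))

def flag_suspect(ratio_obs):
    """Return True if an observed A/B ratio is outside the dilution/OH bounds."""
    return not (lo <= ratio_obs <= hi)

for r in (1.8, 0.3, 2.4):
    print(r, "suspect" if flag_suspect(r) else "consistent")
```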

A specific example was given of the quality assessment of NOy measurements. Seven NOy measurements were made in Nashville in 1995, six of which agreed to within 15% of each other throughout the range tested. Using the Trainer plot, which shows ozone concentration on the ordinate and the difference between NOy and NOx on the abscissa, only 5 of the measurements were shown to be "good enough" for modeling purposes.

Nitric acid (HNO3) is an important member of the NOy family and a major sink for both NOx and VOCs. A new measurement of HNO3 based on chemical ionization mass spectrometry was introduced for detecting ambient levels of HNO3. This technique was found to give excellent agreement with the traditional filter pack method except when the wind blew from farming and ranching areas; interference from nitrate-containing aerosols was suspected for the filter pack method. Dr. Fehsenfeld stressed the importance of simple modeling and analysis of the data as part of the QA/QC plan for measurements. Reliability needs to be tested at new sites and in new environments. Reliable and precise measurements may be beyond the state of the art and the present budget. In the question session, Dr. Fehsenfeld addressed the effects of humidity and of the presence of other nitrogen species on the accuracy of the HNO3 measurements.

Particulate and Particulate Precursor Measurements

Glen R. Cass, California Institute of Technology
Glen Cass presented the methods that are currently being used to determine the nature of particulate matter and particulate matter processes in the atmosphere. Measurements of the size distribution and chemical composition of airborne particles and of particle emissions from sources, as well as simultaneous measurements of gas-phase pollutants, are necessary. These data are needed in several areas, including atmospheric characterization, compliance monitoring, research on atmospheric processes, health effects studies, and air quality model evaluation.

Dr. Cass focused on advanced methods that can be applied today in large enough numbers to be used in air monitoring networks. He summarized the state of science in the area by listing the parameters that can be measured, including:

The equipment currently available to measure these parameters include:

Dr. Cass presented data collected in the Los Angeles basin that described the nature of particles that are indicative of important sources and atmospheric processes. In addition, many sources were tested to develop source profile "fingerprints." Molecular tracer techniques can be used to apportion sources from individual particulate measurements. Examples include cholesterol as an indicator of meat charbroiling, resin acids for wood smoke, and hopanes and steranes for motor vehicle exhaust.

The talk concluded with a discussion of the many challenges that remain in this area, including reducing the cost of detailed measurements and increasing the use of these methods in routine monitoring networks.

A question was raised concerning the health effects of particulate matter exposure. The discussion that followed concluded that epidemiological evidence is available, but toxicological evidence is not strong.


Session 3: What's Happening in Related Fields?

Moderator: Robert S. Slott, Shell

Impact of Advanced Technology on Global Change Atmospheric Measurement Systems

Paul Wennberg, Harvard University
Paul Wennberg presented the results of several studies conducted as part of the Airborne Southern Hemisphere Ozone Experiment (ASHOE). As an introduction, the speaker presented the theories that describe the polar stratospheric ozone hole and the history of our understanding of the problem. Ground-based measurements showed the onset of a sharp decline in the springtime ozone column, which was confirmed by satellite observations (Stolarski et al., 1986). By 1989, 70 percent of the total O3 column over the Antarctic was removed during September and October.

To explain this phenomenon, Tung suggested that dynamical lifting of air masses over the pole leads to the lower ozone column. Callis et al. suggested that solar proton events could lead to large NOx concentrations over the pole. Molina and Molina, and McElroy et al., proposed halogen-radical-driven chemistry.

The development of supersonic transports (SSTs) led to several important questions related to stratospheric processes. The SST's effect on stratospheric ozone needed to be explored because:

The ASHOE project has attempted to answer some of these questions. These studies were conducted using the NASA ER-2 airplane during 8-hour flights toward the South Pole at an altitude of 70,000 ft.

Modeling studies based on the ASHOE data have shown that in the low-NOx regime, halogens control the amount of O3 loss; in the high-NOx regime, however, NOx controls O3 loss. At midlatitudes, CO2 profiles reveal that air masses with different "transport histories" are encountered. Dr. Wennberg noted that field measurements of the CO2/nitrous oxide (N2O) ratio can be used to:

Dr. Wennberg raised the following questions:

The ASHOE study has provided measurements of CO2, NO, NO2, condensation nuclei, HO2, N2O and HO in the stratosphere. These data have revealed:

Remote Sensing of Industrial Process Fugitive Emissions

Ian Archibald, Shell Research and Technology Center - Thornton, England
Ian Archibald provided an overview of the use of remote sensing and inverse modeling to determine the distribution of fugitive emissions from an industrial facility. Fugitive emissions were defined as direct releases to the atmosphere other than through venting and flaring. The cost of these losses can be substantial: for example, a 0.2 weight percent loss of feedstock can cost a refinery $2 million/year; however, the cost of detection can be just as much.

New measurement techniques offer the opportunity to locate sources of emissions and quantify release rates. These techniques allow for more detailed assessments of environmental impacts, process improvement monitoring, and regulatory requirements. Locating fugitive emissions eases remediation and allows for "just in time" maintenance. There is a range of options for the estimation of emission rates, including equipment counts, point sensor technology, open path remote sensing, and laser radar (LIDAR/DIAL). The speaker discussed several potential problems with point sensor methodology, including order-of-magnitude uncertainties, incomplete coverage, and high sensitivity to instrument damping.

Calculation of the source distribution involves solving an integral (Fredholm) equation. A maximum entropy formulation was used to find the most likely solution consistent with the data. This approach was compared to a least squares regression using a simple example in which the emission rates from four emission cells (2x2) are to be determined from measurements of the total rate across each row and column. The least squares solution yields negative emission rates, whereas the maximum entropy approach produces a physically realistic result. A Gaussian plume model was used to simulate the dispersion of emissions from the site in this study. Estimation of the growth of plume size with distance is required and can be calculated by a correlation classified by meteorological conditions (e.g., Pasquill-Gifford), or by methods based on direct measurement of turbulence.
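The 2x2 example can be reproduced with a few lines of Python. The measured row and column totals below are invented, but they show the qualitative behavior described in the talk: the minimum-norm least squares solution contains a negative cell, while the maximum entropy solution consistent with the same margins does not.

```python
# Sketch of the 2x2 example above: recover four cell emission rates from
# row and column totals. The measured totals are made up. Plain
# (minimum-norm) least squares can return a negative rate, while the
# maximum-entropy solution -- which for pure row/column constraints reduces
# to the outer product of the margins divided by the grand total -- stays
# non-negative.
import numpy as np

rows = np.array([10.0, 1.0])    # hypothetical measured row totals (l/min)
cols = np.array([10.0, 1.0])    # hypothetical measured column totals (l/min)

# Least squares: unknowns x = [e11, e12, e21, e22], constraints A x = b.
A = np.array([[1, 1, 0, 0],     # row 1 total
              [0, 0, 1, 1],     # row 2 total
              [1, 0, 1, 0],     # column 1 total
              [0, 1, 0, 1]])    # column 2 total
b = np.concatenate([rows, cols])
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)
print("least squares:   ", np.round(x_ls.reshape(2, 2), 2))  # note the negative cell

# Maximum entropy solution consistent with the same margins.
x_me = np.outer(rows, cols) / rows.sum()
print("maximum entropy: ", np.round(x_me, 2))
```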

A controlled release experiment was conducted to test the method. The maximum entropy reconstruction of a release from 2 sources at 20 l/min each was calculated to be 32 +/- 12.7 l/min. The speaker noted that another method of locating emission sources is to study the concentration time series, since the effect of an increasing distance to source on this signal is to reduce peaks and increase the characteristic length scales of variation.

The conclusions of the talk were:


Session 4: How Could New Technology Improve Gaseous Pollutant Measurements?

Moderator: Charles E. Kolb, Aerodyne Research, Inc.

LIDAR Profiling of Gaseous Pollutants

Edward V. Browell, NASA Langley Research Center
Edward Browell provided an overview of the airborne LIDAR activities at NASA Langley. The different LIDAR techniques (elastic scattering, Raman scattering, fluorescence, and DIAL) were described, and the choice of DIAL for gaseous species and aerosols was justified in terms of the range achievable: with this technique, an aircraft flying at 10 km can provide the concentration profile from the ground to the tropopause.

Airborne LIDAR systems used at NASA Langley were described. These include UV-DIAL for ozone and aerosols and IR-DIAL for water vapor, aerosol and cloud measurements. NASA has conducted a large number of measurement programs since 1980, in which these instruments were mounted on NASA DC-8 and ER-2 airplanes. Some of the results from the TRACE, ABLE, and PEM-WEST experiments were presented. The vertical profile of aerosols and ozone or water vapor made possible the identification of air masses and the investigation of large-scale atmospheric processes.

Future developments of LIDAR systems were described. Capabilities for water vapor in the upper troposphere and for SO2, CH4, and temperature in the lower troposphere are expected to be operational in less than 3 years. In the longer term, CO, NH3, NO2, and NO capabilities may also be developed.

Finally, the potential for active remote sensing from space was described. High vertical resolution (<2 km) measurements would be best from an atmospheric chemistry point of view. Spaceborne LIDAR systems could provide such data for H2O, aerosols, O3, CO, and CH4. In addition, the total column burdens of NH3, NO2, and possibly SO2 could be obtained by this technique.

Remote Sensing Technology for Mobile Emissions Measurements

Steven H. Cadle, GM R&D Center
Steven Cadle provided an overview of the emissions inventory from mobile sources. Cars and trucks are a significant source of HC, CO, NOx, PM10, and PM2.5. Inventories are based on emissions data obtained in the laboratory while cars follow a driving profile on dynamometers. Evaporative emissions are determined by maintaining the car in a sealed enclosure under a controlled temperature schedule. Current emission inventories are thought to be deficient and very uncertain, because driving patterns and emissions can be very different for every vehicle and the number of cars tested by the above methods is very small.

Several alternative methods to quantify real-world motor vehicle emissions exist: on-road instrumented vehicles, tunnel studies, remote sensing, and ambient measurements. Of these, remote sensing is the most promising as it can determine the emissions from a large number of individual cars under real-world conditions. Remote sensing of CO by non-dispersive infrared absorption (NDIR) was pioneered by Don Stedman of the University of Denver in 1987, and is now commercially available.

Remote sensing of CO and HC has been shown to be accurate to 5% and 15%, respectively, in double-blind comparisons with instrumented vehicles. The quantity directly determined by remote sensing is the pollutant/CO2 ratio, from which the pollutant concentration in the exhaust can be determined. Automated license plate readers are now available for vehicle identification. NO and PM remote sensing are under development. There are serious problems with the interpretation of the HC measurements, as the response of different hydrocarbons to current remote sensors varies widely.
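A minimal sketch of how the measured ratio is turned into an exhaust concentration, assuming a fixed CO2 fraction in the exhaust; operational systems instead apply a full carbon balance over CO, CO2, and HC, so the numbers here are purely illustrative.

```python
# Sketch of converting the remotely sensed pollutant/CO2 ratio into an
# approximate tailpipe concentration. The assumed exhaust CO2 fraction is
# illustrative; operational systems use a carbon balance on CO, CO2, and HC
# rather than this simplification.
co_to_co2_ratio = 0.08        # hypothetical measured column ratio CO/CO2
exhaust_co2_percent = 14.0    # assumed CO2 fraction for near-stoichiometric gasoline exhaust

co_percent = co_to_co2_ratio * exhaust_co2_percent
print(f"Estimated exhaust CO: {co_percent:.1f}%")   # ~1.1% CO for these assumed values
```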

Remote sensing has two potential uses: the compilation of better emission inventories and the surveillance of the car fleet between inspections. A problem for the latter application is the correspondence between the 1-second remote sensing emissions measurement and the regulated emissions. There is a concern that some clean cars could be misidentified as dirty, or vice versa, by a remote sensor.

The distribution of CO emissions versus model year obtained in a number of remote sensing studies was presented. Although emissions tend to increase monotonically with vehicle age, the decreasing number of older vehicles means that the highest contributors to total CO emissions are cars about 10 years old.

Tunable Infrared Laser Absorption Systems for Remote Sensing of Mobile Emissions and Mapping of Emission Fluxes

Mark S. Zahniser, Aerodyne Research
Mark Zahniser discussed several applications of Tunable Infrared Laser Differential Absorption Spectroscopy (TILDAS) to gaseous pollutant measurements. The heart of a TILDAS instrument is an infrared laser that is tuned across a known absorption feature of the species of interest. This technique provides very sensitive (<1 ppb), fast-time-response, unequivocal measurement of many small-molecule gaseous species, including CH4, N2O, CO, CO2, O3, NO, NO2, SO2, H2O2, H2CO, NH3, HOCl, HO2, and HNO3. Another advantage is that the measurement is absolute, based only on spectral line parameters.
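The sense in which the measurement is "absolute" can be illustrated with a simple Beer-Lambert calculation: the integrated absorbance of an isolated line, the tabulated line strength, and the optical path length give the absorber number density directly, with no calibration gas. The values below are illustrative and not from the talk.

```python
# Sketch of why TILDAS measurements can be "absolute": with the integrated
# absorbance of an isolated line, the tabulated line strength, and the path
# length, Beer-Lambert gives the absorber number density directly. All
# numbers below are illustrative assumptions.
integrated_absorbance = 1.2e-4    # hypothetical, cm^-1 (area under -ln(I/I0) vs wavenumber)
line_strength = 1.0e-19           # hypothetical HITRAN-style value, cm^-1/(molecule cm^-2)
path_length_cm = 10000.0          # e.g. ~100 m folded path in a multipass cell

n_absorber = integrated_absorbance / (line_strength * path_length_cm)  # molecules/cm^3
n_air = 2.46e19                   # molecules/cm^3 at ~296 K and 1 atm
print(f"mixing ratio ~ {n_absorber / n_air * 1e9:.1f} ppb")
```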

This technique works best at reduced pressures because of the increased sharpness of the spectral absorption lines. Long absorption paths in small volumes are desirable for sensitivity, response time, and compactness. Aerodyne researchers have developed an astigmatic multipass cell that improves on all of these characteristics. Eddy correlation fluxes can be obtained because of the better time response. Also, multiple species can be measured by spatially combining the light from several diodes; for example, simultaneous measurements of ambient NO, NO2, and O3 are underway.

The TILDAS technique has been used to characterize pollutant emissions in several ways. In one study, a TILDAS system was mounted on a truck and driven around several towns to characterize sources of CH4. Another application involved roadside measurements of NO and NO2.

Remote sensing of motor vehicle NO is being developed with this technique. NO and CO2 are simultaneously measured at about 100 Hz, and their ratio is used to determine the NO concentration in the exhaust. This technique is much more sensitive than other NO remote sensing methods because of its much better spectral resolution, which avoids interferences from other species in the exhaust. Other pollutants, such as CO, NO2, and CH2O, can also be remotely sensed by this method.

In response to a question, Dr. Zahniser estimated the cost of a remote sensing TILDAS instrument at $150,000 and the cost of an individual measurement at about 50 cents.

Atmospheric Pressure Chemical Ionization Mass Spectrometry

Albert A. Viggiano, Phillips Laboratory
Albert Viggiano discussed the application of chemical ionization mass spectrometry (CIMS) to atmospheric measurements. The main advantage of this technique is its sensitivity, which ranges from 1 part in 10^11 to 1 part in 10^18; neutral mass spectrometry can only reach 1 part in 10^10. Ambient ions can be detected if they are present; otherwise, analyte ions can be created by chemical reaction.

The technique has been applied to determine the ions present in the stratosphere, and also to obtain H2SO4 and free electron concentration profiles. The latter controls radio propagation, but its measurement is difficult because of its low abundance; titration by SF6 has been used successfully for this purpose. Other species measured by this technique include ClNO2 and OH (by titration with isotopically labeled SO2 to create H2SO4).

This technique was also used during the SUCCESS measurement campaign, in which aircraft exhaust was directly sampled. SO2, NO2, HSO4, HNO3, and HONO were measured using CIMS.

Measurements of a total of 22 species have been reported with the CIMS technique, including NH3, HOCl, HNO3, HNO2, SO3, DMS, pyridine, acetone, and some larger organic molecules such as picoline (C6H7N). The measurement of some other species, such as HCl and ClO, is underway.


Session 5: How Could New Technology Improve Particulate Measurements?

Moderator: Glen Cass, Caltech

Advances in Aerosol Sampling and Sizing

Peter H. McMurry, University of Minnesota
The speaker reviewed size and size-resolved composition measurement techniques for particles in the size range 3 nm to 10 µm, with particular emphasis on ultrafine aerosols (UFA) and advances over the last 5 years. The instruments discussed were:

Optical Particle Counters (OPC) measure the optical size of particles larger than 0.1 µm by light scattering. Instrument response is a function of refractive index, particle shape, and size. A large fraction of atmospheric particles generate an OPC response similar to that of oleic acid particles; however, another fraction of particles yields a response in the OPC between those of oleic acid and polystyrene latex (PSL). This can cause large errors in the measured size distribution of atmospheric aerosols if the OPC is calibrated with PSL particles.

Aerodynamic Particle Sizers (APS) determine particle size by measuring the velocity of particles accelerated through a nozzle. Units that use subsonic flow measure the aerodynamic size of particles in the range 0.5-20 µm; those with supersonic flows measure the geometric size of particles in the range 0.2-700 µm.

Differential Mobility Analyzers (DMA) determine the size of particles in the range 3-500 nm by measuring the mobility of charged particles in an electric field. Recent advances in DMA operation and design have reduced measurement times by a factor of 10 and reduced diffusional losses. This is significant for measurement of UFA, as 60-90% of 5 nm particles are lost by diffusion in conventional DMA designs.
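A sketch of the underlying sizing relation: the DMA voltage selects one electrical mobility, which is then inverted to a mobility diameter through the Stokes-Millikan relation with slip correction. The cylindrical geometry and sheath flow below are assumed example values, not the specifications of any particular instrument.

```python
# Sketch of the DMA sizing principle: a given voltage selects particles of
# one electrical mobility, which is inverted to a mobility diameter through
# the Stokes-Millikan relation with slip correction. The cylindrical DMA
# geometry and flow below are assumed, not instrument specifications.
import math

E_CHARGE = 1.602e-19       # C
MU_AIR   = 1.81e-5         # Pa s, air viscosity at ~20 C
MFP      = 68e-9           # m, mean free path of air at ~1 atm

def slip_correction(dp):
    """Cunningham slip correction factor for a particle of diameter dp (m)."""
    kn = 2.0 * MFP / dp
    return 1.0 + kn * (1.257 + 0.4 * math.exp(-1.1 / kn))

def mobility_from_voltage(voltage, q_sheath=5.0e-3 / 60, r1=0.00937, r2=0.01961, length=0.4444):
    """Electrical mobility selected by a cylindrical DMA (assumed geometry, SI units)."""
    return q_sheath * math.log(r2 / r1) / (2.0 * math.pi * length * voltage)

def diameter_from_mobility(zp, charges=1):
    """Invert Zp = n e Cc(dp) / (3 pi mu dp) for dp by fixed-point iteration."""
    dp = 50e-9                                   # initial guess, m
    for _ in range(100):
        dp = charges * E_CHARGE * slip_correction(dp) / (3.0 * math.pi * MU_AIR * zp)
    return dp

zp = mobility_from_voltage(1000.0)               # mobility selected at 1 kV
print(f"mobility diameter ~ {diameter_from_mobility(zp) * 1e9:.0f} nm")
```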

Tandem Differential Mobility Analyzers (TDMA): in this scheme, particles are sized in the first DMA, conditioned in a constant relative humidity chamber, and the resulting particle size is measured by a second DMA. Measurements of atmospheric particles with the TDMA show that two subpopulations - "more hygroscopic" and "less hygroscopic" particles - exist in atmospheric aerosols.

Condensation Nuclei Counters (CNC) count particles by condensing n-butanol on them in a saturation chamber and optically detecting the resulting large particles. Uneven saturation ratios in the chamber had led to poor detection efficiencies for particles smaller than 10 nm. Recent advances in CNC design have corrected this and allow accurate detection of particles as small as 3 nm.

Impactors collect particles for analysis by inertial deposition. Modern impactors segregate aerosols into 10 size fractions from 0.05 to 18 µm. Hygroscopic particles may either shrink in the low pressures of the final impactor stages or grow due to flow-induced cooling and condensation. These artifacts are most important at relative humidities greater than 80%. At low humidities, on the other hand, up to 80% of atmospheric particles bounce off ungreased aluminum plates. In addition, collected semi-volatile compounds may evaporate during sampling; these evaporative losses have been predicted, and the predictions agree with sampling results.

Advances in Real-Time Ambient Aerosol Characterization

Kimberly A. Prather, University of California, Riverside
Seven laboratories are in the process of developing, testing, and using single-particle mass spectrometers (MS). The speaker summarized recent advances in the field relevant to the characterization of atmospherically relevant aerosols. One motivation for single-particle analysis is to determine the degree of inhomogeneity of atmospheric particles - how chemical composition varies from particle to particle. Another motivation is to efficiently collect real-time data, including particle size and chemical speciation, with short averaging times and without costly off-line analytical techniques.

The general design of the single-particle MS is as follows. The size of each incoming particle is determined by an aerodynamic particle sizer. A laser is then fired at the particle to ablate it and ionize the ablated material. The mass-to-charge ratio of the ions is then determined by a time-of-flight MS. The instrument can be configured to monitor positive ions, negative ions, or, with two mass spectrometers, both.
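A small sketch of the time-of-flight step: after acceleration through a fixed potential, an ion's flight time scales as the square root of its mass-to-charge ratio, so m/z is recovered from the arrival time. The drift length and acceleration voltage below are assumed for illustration only.

```python
# Sketch of how a time-of-flight mass spectrometer turns an ion's flight
# time into a mass-to-charge ratio: after acceleration through potential U,
# t = L * sqrt(m / (2 q U)), so m/q grows as t^2. Instrument parameters
# below are assumed for illustration.
E_CHARGE = 1.602e-19      # C
AMU      = 1.661e-27      # kg

def mass_to_charge(flight_time_s, drift_length_m=1.0, accel_voltage=3000.0):
    """m/z (in Da per elementary charge) from a measured flight time."""
    m_over_q = 2.0 * accel_voltage * (flight_time_s / drift_length_m) ** 2  # kg/C
    return m_over_q * E_CHARGE / AMU

# An ion arriving 25 microseconds after the ablation laser pulse:
print(f"m/z ~ {mass_to_charge(25e-6):.0f}")
```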

Recently these instruments have been used to monitor particles generated in reaction chambers in the laboratory, particles from emission sources, and atmospheric particles. Studies of laboratory-generated particles have demonstrated that the instruments can accurately quantify species in the aerosol. Other laboratory studies are using the single-particle MS to test aerosol equilibration and reaction mechanisms. Source characterization studies have been made for cigarette smoke and for diesel and automobile engine exhaust. Analyses of atmospheric particles indicate that particle composition is inhomogeneous. Atmospheric particles have been classified into categories such as inorganic oxide, marine, soil, organic carbon, elemental carbon, and fireworks particles based on their mass spectra. The speaker showed a sample positive-ion mass spectrum of an organic carbon particle.

These instruments are now able to determine particle size and chemical composition in real time. Future studies are planned to collect source samples to establish fingerprints for source attribution, to monitor aerosol composition in coordination with health effects studies, and to compare results with traditional techniques for the collection and chemical analysis of aerosols.

Petros Koutrakis asked: What is the pressure in the sampling chamber? Will the particles change at these low pressures?

A: The operating pressure of the sampling chamber is 10^-7 torr. The analysis occurs within 600 ms of the particle entering the sampler. Because of the rapid analysis, it is unlikely that the particle size or composition changes significantly. Evidence for this is that volatile components like naphthalene and nicotine can be detected on the particles.

Q: How well does laser ablation create ions representative of the particle composition?

A: By varying the laser power, one can study the extent of ionization. At low laser power, K and Na ions attach to organic compounds, creating positive ions by a "soft" process. These positive ions are then detected in the MS. At very high laser power, fragmentation of the organic ions seen at low laser power is evident, but peaks indicative of additional compounds are not, indicating that at low power the laser is able to ionize all the species.

Challenges in Instrumentation Commercialization

David H. Fine, Thermedics Detection, Inc.
The speaker's company develops and manufactures analytical instrumentation for the air pollution and explosives sensing markets. He traced the life cycle for the development of a new instrument and discussed the challenges and issues typically encountered at each stage in the development process.

The first and final consideration in any instrument development is to make a healthy profit. A new instrument is positioned to address a perceived need, building on the company's existing technology and patent position. The desired result is a viable product at a reasonable price.

After concept development, one or two breadboard units are built to prove the concept and for in-house testing. Next, two to five engineering models are built; these units are built by skilled workers, usually with Ph.D. degrees, to determine the product design and specifications. From the detailed specifications developed with the engineering model, 5-10 prototype units are produced by unskilled labor for beta testing. Next, pilot production is undertaken to produce approximately 50 units. Following successful pilot production, full-scale production begins.

Specifications are developed with the engineering model. Included in the specifications are failure analysis, serviceability, data management, and environmental testing. Failure analysis must examine the critical failure modes and the consequences of failure; the instrument is then designed for the appropriate level of reliability. Similarly, designing for serviceability requires an examination of the consequences of downtime, the accessibility of the instrument and equipment, and the time to service. Design of data management requires development of user-friendly screens appropriate to the skill level of the operators. Environmental testing includes exposure of the instrument to "shake & bake", water, dropping, and electromagnetic interference, as well as compliance with safety and other product regulations.

Larry Cupitt asked: How does one assure compliance with relevant ISO standards?

A: The European Community "CE" mark requirements are similar to, and generally more stringent than, the ISO standards. Therefore we use "CE" compliance to ensure ISO compliance.

Q: How long does it take to produce a new instrument?

A: The industry norm for implementing an established technology, e.g. mass spectrometry, is 18 months. Thermedics is able to develop a product based on an established technology in 6-8 months. The company can develop a prototype of a new technology in approximately 4 months.

Glen Cass asked: My experience with instruments is that approximately half fail soon after installation. How can the consumer evaluate the reliability of a new instrument? For the small air pollution market, where the market for instruments is 10-100 units, how can manufacturers make reliable instruments?

A: In the explosive detection equipment market, the prospective purchasing nation tests units before placing a large order. Thermedics is producing a high speed GC for methane/non-methane hydrocarbon detection with a projected market of 1000 units. Smaller markets probably cannot return the development costs required for reliable instrument development.


Session 6: Summary

Symposium Summary - What Have We Learned

John H. Seinfeld, California Institute of Technology
Key ideas presented about the future of air quality monitoring were

The biggest measurement challenge is the measurement of PM2.5. This is because particle chemical composition data are necessary to develop effective control strategies for PM2.5 and because current filter-based sampling technologies are inadequate for semi-volatile species. After the summary, the floor was opened for discussion.

A number of attendees questioned EPA's proposed gravimetric standard for PM2.5 monitoring. Petros Koutrakis noted that the PM2.5 standard is motivated by epidemiological studies on PM10 and PM2.5 and that, until the scientific community is able to provide data linking chemical composition to health effects, EPA must stay with a gravimetric standard. He also noted that there are no field-demonstrated instruments able to correct for semi-volatile losses and the effect of humidity on particle mass. He challenged the research community to develop and prove such an instrument to replace the proposed gravimetric PM2.5 standard in 3-4 years. Partnerships with industry will be essential to meet this goal. The instrumentation research community also needs to have a dialog with the health effects community on the possible mechanisms of PM2.5 health effects and what in PM2.5 to measure.

Larry Cupitt noted that PM2.5 monitoring is motivated by the risk management paradigm. In addition, the PM2.5 measurement instruments need to be as simple as possible so that they can be quickly deployed, reliably used, and able to generate a high-quality data set. He also stated that nothing precludes the collection of data in addition to PM2.5 mass.

Chuck Kolb stated his concern that PM2.5 will come to be defined as whatever the EPA-defined PM2.5 monitor measures, and that it is unclear which size parameter will actually be measured. Any later criticism of the standard will reflect badly on both EPA and the environmental research community. He also noted that, as a result of NRC recommendations, funding for instrumentation has been increased by EPA, NOAA, NASA, and others. He asked researchers doing field measurements to include new instruments in their programs to help the new instruments through the "valley of death" between development and commercialization.

Praveen Amar expressed concern from a state's perspective that in 2-3 years PM2.5 data will be generated, but the states will be unable to use the data to design a compliance strategy.

Glen Cass noted that in designing a total environmental abatement program, the monitoring network is a minor expense. He is concerned that a large error in the measurement of PM2.5 might lead to a poor allocation of billions of dollars spent on abatement. He suggested that with existing technology 500 sites could be monitored for fine particle chemical composition which would provide data for source apportionment. He recommended this alternative over the equally expensive one of collecting only gravimetric data at the existing 1500 PM10 sites.

