EPA Center on Airborne Organics
1998 Summer Symposium Report
 
 

Costs and Benefits Estimation
in Air Quality Regulations

Table of Contents


 
Session I. Introduction, Robert Slott, MIT

Opening Talk, Paul Portney, Resources for the Future

A Political View, Charles Ingebretson, House Commerce Committee

EPA "812" Study, Richard Morgenstern (on leave from EPA)

An Outsider's View of EPA "812," Tom Lareau, API

Session II. Air Modeling and Monitoring, John Seinfeld, Caltech

Air Models, Jana Milford, University of Colorado

Monitoring Emissions, Ken Demerjian, SUNY Albany

Meteorological Variation, S. T. Rao, SUNY Albany and NY DEC

Session III. Health Effects, Robert Sawyer, UC Berkeley

The Challenge of Measuring the Benefits of Air Pollution Regulation, Daniel Greenbaum, Health Effects Institute

Indoor Exposure to Outdoor Pollutants, Joan Daisey, LBL

Session IV. Stationary Source Case Histories, Praveen Amar, NESCAUM

Stationary NOx - A Retrospective Look at the Costs of Controlling Air Pollution, Praveen Amar, NESCAUM

The Costs of Title IV SO2 Reductions, Denny Ellerman, MIT

Revisiting SO2, Jeremy Platt, EPRI

Session V. Transportation Case Histories, Richard Morgenstern, RFF (on leave from EPA)

Inspection and Maintenance, Winston Harrington, RFF

Colorado's I/M Program, Richard Barrett, Colorado Department of Public Health and the Environment

Low Emission Vehicle (LEV) Technology and California Reformulated Gasoline (RFG), Tom Cackette, California ARB

Reformulated Gasoline, Robert Anderson, Resource Consulting Associates, Inc.

Air Quality and Other Benefits and Costs of Transportation Control Measures, John Suhrbier, Cambridge Systematics, Inc.

Session VI. Summing Up, Art Fraas, OMB

Panel Discussion: What Can be Done to Improve the Accuracy of Cost and Benefit Predictions?

John Bachmann, EPA

Sam Leonard, General Motors

Tom Lareau, American Petroleum Institute

Jason Grumet, NESCAUM

Jeremy Platt, EPRI

Review and Summary of the Conference, Robert Sawyer, UC Berkeley

 
 

Session I. Introduction
Moderator: Robert Slott, MIT

Dr. Slott opened the session by observing that Federal regulations cost about 4 - 6% of GDP. The benefits of these regulations are estimated to be greater than the costs, according to EPA's 812 study of the Clean Air Act. A presentation on this study will be heard later in the morning.

Opening Talk
Paul Portney, Resources for the Future

Dr. Portney opened his talk by stating that air pollution control is arguably the most important regulatory function provided by the federal government. Citizens are directly and visibly affected by air pollution. This program has been the most successful social intervention in the US (at least in terms of improvements in air quality since 1970). Every criteria pollutant has improved nearly everywhere in the US, except in cities experiencing dramatic growth (e.g., Phoenix). In spite of this success, recent polls indicate that people still believe that air quality is getting worse.

According to EPA's 812 study, over $500 billion has been spent on this success (from 1970-90) and it is estimated that $50 billion/yr. is continuing to be spent to meet the Clean Air Act Amendments of 1990. This represents about one third of the regulatory burden for environmental compliance. The US spends the largest share (2.3%) of GDP on environmental compliance of any country in the world. Similarly developed countries spend approximately 1.6-1.8% of GDP.

EPA claims that the program provides benefits of over one trillion dollars and staves off 200,000 premature deaths. While the exact number is debatable, no serious study suggests that the benefits are less than costs. However, to claim that the benefits are worth 15% of GDP (based on $7 trillion GDP) seems unreasonable. Also, total annual mortality in the US is about 2 million deaths; it seems unlikely that 10% of annual deaths are being saved.
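As a rough illustration of Dr. Portney's plausibility check, the arithmetic can be written out directly. This is a sketch using the approximate figures quoted in the talk ($1 trillion in benefits, $7 trillion GDP, 200,000 avoided deaths, about 2 million annual US deaths):

```python
# Plausibility check on the headline 812-study numbers cited in the talk.
benefits = 1.0e12         # claimed annual benefits, dollars
gdp = 7.0e12              # approximate US GDP, dollars
deaths_avoided = 200_000  # claimed premature deaths staved off per year
total_deaths = 2.0e6      # approximate annual US mortality

print(f"Benefits as share of GDP: {benefits / gdp:.0%}")  # ~14%, i.e. the roughly 15% questioned above
print(f"Avoided deaths as share of annual mortality: {deaths_avoided / total_deaths:.0%}")  # 10%
```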

Regulatory reform bills that have promoted cost-benefit analysis have been unsuccessful for various reasons. The current bill (S981) has many of the objectionable parts removed. This might give us a better regulatory framework. Recent regulation has promoted the use of cost-benefit analysis even while recognizing that many costs and benefits are difficult to quantify. The Safe Drinking Water Act (SDWA) and the Food Quality Protection Act have been expanded to include cost-benefit requirements.

Cost estimation needs to be improved, as there are categories of cost that are often not considered. Economic analysis is only part of the battle. The physical benefits need to be quantified in a much better manner. Premature mortality and acute illnesses provide the major contribution to EPAās estimated benefits (approximately 90%). A significant reduction in the estimated benefit of reduced pollution on premature death would significantly alter the benefit calculations. Estimating the value of extending life from 75 to 77 may be different from extending life from 40 to 77. With the potential for further tightening of particulate standards, we need to know whether the additional costs are worth the benefits.

Q: Praveen Amar, NESCAUM Can we compare costs of CAA regulations to healthcare costs?

A: Yes, some benefits are calculated by looking at avoided healthcare costs. Healthcare cost could be argued to be $1 trillion higher without regulations.

Q: Richard Magee, NJIT In implementing the CAA, we did a lot of good early on ("low hanging fruit"). Should we look at this in comparing costs and benefits?

A: Early costs were not compared with further progress. Incremental cost/benefit analysis would tell you how far to go.

Q: Philip Lorang, EPA Didn't the 1977 regulations provide the real benefit, not the 1970 ones?

A: No, 1970 regulations were substantial.

Q: Kenneth Colburn, NH Dept. of Env. Serv. Cost/benefit is not linear. Are you looking at that?

A: Yes, linearity can be misleading.

A Political View
Charles Ingebretson, House Committee on Commerce

As a matter of introduction, Mr. Ingebretson informed the group that the House Commerce Committee is chaired by Tom Bliley and is claimed to be the oldest committee in the House. From patrolling the borders in the late 1700s, it has evolved into a committee with exclusive jurisdiction over the Clean Air Act. The Committee includes John Dingell and Henry Waxman.

There has been a remarkable amount of progress in the recognition of the value of cost-benefit analysis in recent years. There has been considerable debate over the proposed NAAQS changes. There has also been a perception that cost-benefit analysis does not apply to setting NAAQS. This has gradually been changing. Recent appropriations bills have required "regulatory accounting" for federal activities. There are proposals to make these requirements permanent.

Mr. Ingebretson highlighted milestones in the use of cost-benefit analysis. These included the decision in Lead Industries Association v. EPA (1980) and legislation including the Unfunded Mandate Reform Act of 1995 (P.L. 104-4) and the Small Business Regulatory Enforcement Fairness Act of 1996 (P.L. 104-121). Reauthorization bills have been a vehicle for requiring cost-benefit analysis for new regulations. As an example, in the Safe Drinking Water Act, consensus language was passed to give the administrator the power to promulgate contaminant levels "that maximize health risk reduction benefits at a cost that is justified by the benefits." There is some nervousness about the data coming from EPA. There will be requirements for other agencies to corroborate EPA data. There also seems to be more regional activity for clean air (vs. national), as there is recognition that although air pollution does not stop at the state border, its effects tend to be more regional than national.

Q: Jeffrey MacGillivray, NH State Rep. The cost-benefit analysis was only intended to be a six-month study. Economic analysis needs to be done from day 1 and not be considered a zero cost effort. Has anything been done in this regard?

A: Cost-benefit analysis is intended to be done before regulation in the newest language. This has more to do with language, but it's a start.

Q: John Elston, NJ EPA In my opinion, Congress can't make tough decisions; it seems to come down to the courts.

A: Congress doesnāt do the day-to-day job, but they should set policy objectives.

Q: What does Congress expect to accomplish on these science reviews?

A: Congress can make sure that the agency is on the right track.

Q: Can you give an example of why they held these reviews?

A: I don't know.

EPA "812" Study
Richard Morgenstern, RFF (on leave from EPA)

Dr. Morgenstern opened his talk by explaining that Section 812 of the CAAA of 1990 required a cost-benefit analysis of the air regulations from 1970 to 1990 (as well as a biennial forecast of future costs and benefits). This study took 6 years and $5 million to carry out. Economic benefits cover individuals, production/consumption, economic assets, and environmental assets. Costs cover private sector and government costs as well as economic and social impacts. The major contributor to benefits is mortality. The major contributor to costs is private sector costs. Direct costs were estimated to be $500 billion over the 20-year period with controls.

The cost-benefit comparison was intended to be against what 1990 would have looked like in the absence of controls (i.e., absence of the Clean Air Act). Determining such a baseline is a difficult task. The report's authors chose to freeze unit emissions at 1970 rates. This assumption has the effect of overstating the emissions reduction and, hence, the benefits. This approach is not intended to be completely accurate; it is, however, the only feasible method to achieve a reasonable comparison.

Based on this analysis, emissions reductions have definitely occurred. Based on the 1970 rates, most reductions are in the range of 40 - 50%, with the exception of lead, which was 99%. Valuing mortality on a value-of-statistical-life basis, assuming $4.8 million per death avoided, the benefits averaged out to $22 trillion over the 20-year period. Using a statistical life-year approach, the value was reduced to $14 trillion ($293K/life-year). In either case, the benefits were estimated to be well in excess of the cost. Premature mortality accounted for 80% of the calculated benefits, the bulk of which was attributed to particulate matter. One of the critiques of the study points to the question of whether there were parts of the regulations that were particularly effective (or ineffective). The baseline question is also critical. The epidemiology and toxicology of many of these pollutants are not well understood. The values associated with these effects are also being called into question. A study of CAAA costs and benefits since 1990 by Robert Hahn calculated net benefits of $90 billion (costs: $124.7 billion; benefits: $214.7 billion).

An Outsider's View of EPA "812"
Tom Lareau, API

Dr. Lareau opened his talk by reiterating the major findings of EPA's 812 study. He also noted that:

- 81% of the claimed benefits are from reduced mortality;

- over 200,000 lives are supposedly saved in 1990; and

- 75% of the claimed benefits are linked to one major study.

As an initial plausibility check, consider that the estimated benefits in 1990 amounted to $1.2 trillion; at roughly 20% of GDP this seems implausible as an estimate of society's willingness-to-pay for air quality improvement.

The EPA study, while an impressive effort, is deficient in several ways. First, the study was designed to measure total benefits and costs, which have little policy relevance. The study should have (and to a limited degree could have) developed estimates of marginal benefits and costs. In other words, the relevant question is whether or not the next regulation increment is worthwhile.
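A toy example may clarify why totals can mislead. The numbers below are hypothetical, not values from the study; the point is only that total benefits can exceed total costs even when the last regulatory increment is not worthwhile:

```python
# Hypothetical regulatory increments: (name, cost, benefit), in $ billions.
increments = [
    ("increment 1", 50, 400),   # early "low-hanging fruit"
    ("increment 2", 100, 200),
    ("increment 3", 150, 100),  # marginal cost now exceeds marginal benefit
]

total_cost = sum(cost for _, cost, _ in increments)
total_benefit = sum(benefit for _, _, benefit in increments)
print(f"Totals: cost = {total_cost}, benefit = {total_benefit}")  # 300 vs 700: looks favorable

# But the policy-relevant question is whether the *next* increment pays off.
for name, cost, benefit in increments:
    verdict = "worthwhile" if benefit > cost else "not worthwhile"
    print(f"{name}: cost = {cost}, benefit = {benefit} -> {verdict}")
```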

The second shortcoming involves baseline issues. As Richard Morgenstern indicated, assuming the CAA was responsible for all emission reductions after 1970 inevitably led to an overstatement of emissions reduction. In addition, the large reduction of particle emissions EPA reports is questionable. Particle emissions were decreasing prior to 1970, but EPA models show them increasing after 1970 assuming no further control. As a consequence, modeled decreases in PM associated with the CAA are quite large. This may not be correct.

The third shortcoming arises from EPA's selection of the Pope study for their mortality estimates and from EPA's valuation of reduced mortality. The Pope study might be a reasonable basis for a high-end estimate of mortality, but is not a reasonable basis for a mid-point or low-end estimate. Other studies indicate weak or no linkage between PM concentrations and mortality. How EPA values reduced mortality is also controversial. Most of the saved lives occur by reduced mortality of individuals over age 65. EPA values each saved life at $4.8 million. However, that estimate reflects a willingness-to-pay to avoid mortality in the prime of life; that is, it represents 30 to 40 life-years saved. A life-year-based estimate would reduce EPA's estimated mortality benefit by more than a factor of five.
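A back-of-envelope version of the life-year argument follows. The $4.8 million value and the 30-40 life-years it implicitly represents are from the talk; the remaining life-years assumed for an over-65 death avoided is an illustrative assumption, not a figure Dr. Lareau gave:

```python
vsl = 4.8e6                       # EPA's value per statistical life, dollars
implied_per_life_year = vsl / 35  # mid-point of the 30-40 life-years cited
elderly_life_years = 6            # assumed remaining life-years for an over-65 death avoided

value_per_elderly_life = implied_per_life_year * elderly_life_years
print(f"Implied value per life-year: ${implied_per_life_year:,.0f}")       # ~$137,000
print(f"Reduction factor vs. $4.8M: {vsl / value_per_elderly_life:.1f}x")  # ~5.8x, 'more than a factor of five'
```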

EPA's analysis of costs is much closer to being on target, though probably underestimated. Compliance expenditures or changes in GNP are not good estimates of social cost. To its credit, EPA uses a general equilibrium model from which social cost can be estimated correctly. Still, other studies of regulatory cost find that social costs increase substantially over time as productivity is eroded due to substitution of environmental capital for output-enhancing capital, a result at odds with EPA's models.

Finally, EPA's uncertainty analysis falls short of the ideal, though again EPA should be commended for their extensive Monte Carlo simulations. Unfortunately, important sources of uncertainty are not estimated or fully characterized, particularly for emission projections and health effects.

The result of the above criticisms is that the difference between costs and benefits is likely to be much less than what EPA's 812 study shows. The interesting thing is that some regulations flowing from the CAA likely were not net beneficial. In conclusion, one can improve the accuracy of estimates by reducing and correctly characterizing uncertainty. Fundamental studies of joint pollutants are needed. Surprisingly, the greatest uncertainties are those associated with emissions and dose-response functions, not uncertainties related to economic valuation. Finally, peer review needs to be strengthened and set up both inside and outside EPA. It may be necessary to assign primary responsibility for such analysis to an agency other than the one responsible for the regulation.

Q: Philip Lorang, EPA Old people spend a lot of money to stay alive; how can we account for this willingness-to-pay for life?

A: Willingness-to-pay is the right metric. Though evidence is scanty, what is available is consistent with a life-year approach. One must capture society's value of a saved life, not an individual's willingness-to-spend to postpone the inevitable ravages of old age.

Q: What is the state-of-the-art in disaggregating costs and benefits?

A: This is a difficult problem, but we can do something. It is harder to disaggregate benefits.

Q: Robert Harley, UCB What about CO deaths in automobiles?

A: Cars cause deaths in all kinds of ways. Consumers should be partly responsible for safe use. [Note: I may have misunderstood the question. I thought R.A. was referring to CO auto suicides. You might want to ask him.]

Q: Life expectancy is going up. Were baseline death rates frozen?

A: I believe life expectancy changes were accounted for properly.






Session II. Air Modeling and Monitoring
Moderator: John Seinfeld, Caltech

Air Models
Jana Milford, University of Colorado

Dr. Milford pointed out that the relationship between NOx and ozone has some perversities related to the ratio of VOCs and NOx. Thus, air modeling is needed to determine the impacts of a given change in emissions levels. An episode that occurred in the California South Coast region during 2 days in 1987 was used as a base for the model. A 3-D model with a 5 km grid and 800 source types was developed for hourly resolution. Local data were used for both weather and inventory effects. A chemical model with 186 species was applied. A baseline emissions estimate was made and input to the model, and the results were compared to actual data. The model slightly overpredicted (~30%) the ozone concentration levels over the hourly record. An uncontrolled case was run using emissions levels from vehicles without controls. This showed a roughly 50% increase in the ozone level. Uncertainty analysis looked at reaction rate constants, vehicle emissions, and weather data. A trajectory model was used over several sites. Values were compared to observed results. Uncertainty levels of 25% were estimated. The results compared well with observed data, which also had a comparable error band. In one county, the response to NOx reduction indicated that NOx control would be counterproductive (i.e., higher ozone levels resulted), while VOC control was effective. In another county, NOx reductions were more effective. This is for the same ozone episode. Clearly a lot more episodes need to be modeled. These models are very detailed and dependent upon a lot of data and complex relationships. Uncertainty levels are on the order of 25 - 50%. The models are reasonable, but the input data are limited. The Los Angeles area has a lot of data available, which reduces the uncertainty.
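The flavor of such an uncertainty analysis can be sketched with a Monte Carlo calculation: sample the uncertain inputs, run the model, and look at the spread of predicted ozone. The stand-in model and its exponents below are purely illustrative, not the actual airshed model:

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_model(emissions_scale, rate_scale):
    # Stand-in for a full photochemical simulation; exponents are illustrative.
    return 120.0 * emissions_scale**0.6 * rate_scale**0.3  # peak ozone, ppb

# ~25% (1-sigma) uncertainty in emissions and rate constants, as in the talk.
emis = np.abs(rng.normal(1.0, 0.25, 10_000))
rate = np.abs(rng.normal(1.0, 0.25, 10_000))
ozone = toy_model(emis, rate)

print(f"mean = {ozone.mean():.0f} ppb, "
      f"relative uncertainty = {ozone.std() / ozone.mean():.0%}")
```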

Q: Philip Lorang, EPA Were trajectories considered with uncertainties?

A: Yes, very much. The Los Angeles area is blessed with lots of data.

Q: What about uncertainties in speciation?

A: On an aggregate level, speciation is not far off. On the level of individual sources, there is a lot of uncertainty.

Comment: Robert Sawyer, UCB An uncertainty analysis of the LA inventory is currently being done (from the "ground up").

Q: Winston Harrington, RFF Do you eliminate some of the non-linearity if you roll back the graph?

A. Yes.

Monitoring Emissions
Ken Demerjian, SUNY Albany

Dr. Demerjian posed what he believes to be one of the key questions: "Can historical trends in ambient air be directly attributed to air regulations?" This gets back to a process of accountability. The evaluation of control alternatives, diagnosis of the efficacy of controls, and corrective actions are all part of the process. A key issue is identifying the organizations that have authority and responsibility for doing the evaluations and taking these actions. Principal steps are verification that implemented controls are actually performing, verification that the environment responds as expected, and verification that the response of identified public health and welfare receptors agree with expectations.

An example is lead emissions. From 1970 to 1994, lead emissions were reduced by more than 99%. The lead level in ambient air also dropped by a similar amount. Blood lead levels in children aged 1 - 5 went from 15 µg/dL in 1976-1980 down to 2.7 µg/dL in 1991-1994. Although there are several ways for children to ingest lead, many are related to atmospheric deposition. Finally, it would be helpful to show that there were fewer cases of retardation, mental illness, etc. as a result of lower blood lead levels. This has not been done. The lead case is probably the simplest case. A similar situation exists for SO2. Emissions dropped from 9.4 million tons in 1980 to 4.5 million tons in 1996. The observed ambient levels on the Whiteface Mountain summit dropped from 4.2 to 1.7. Other area maps show similar results. Closing the loop with regard to the response in terms of impacts on lakes, forests, and buildings still needs to be done. Thus monitoring requirements are increasing in order to provide verification.

National networks must establish a systematic appraisal process. The network must assess data quality and utility. The process must include the broad community of data users. A feedback mechanism is needed to correct situations where "bad" data are being generated. One analysis showed the trend in CO emissions from 1980 to 1996 being reduced by about 20%. However, the maximum 8-hr concentration was reduced by 60%. A similar plot for NOx emissions and NOx concentrations shows a larger change in emissions than in ambient concentration. For SO2, emissions dropped 20% while concentrations dropped 50%. A question was raised about the particulate level issue. Data are lacking on the chemical composition of the particulate matter. Monitoring is needed to demonstrate that reductions in particulate emissions actually result in lower ambient concentrations. Finally, measuring the desired benefits, such as lower death rates and fewer hospital admissions, is required. Canada is trying to do this. They are in a somewhat better position to do so, with a smaller population and a national healthcare service.

Q: Thomas Peterson, U. of Ariz. Where do the estimates come from?

A: Some come from different input data, some come from actual measurements. Emission factors, miles traveled, and other changes are made to refine the estimate.

Q: Once a control is implemented, it won't be removed; will your model help forecast the best approach?

A: We are trying to integrate both so we can use it to predict. Also, there is some advantage relative to what is in place. For example, telling people not to drive on certain days.

Q: Kenneth Colburn, NH Dept. of Env. Serv. How do you use accountability effectively? Geographically, it will vary due to politics.

A: Yes, it would be very difficult to implement regionally.

Q: Thomas Lareau, API Accountability might allow us to say we have controlled mobile sources enough and we can move on.

A: Yes, there is a lot of work to be done first, especially in monitoring.

Meteorological Variation
S. T. Rao, SUNY Albany and NY DEC

Dr. Rao presented approaches to determine changes in ozone levels due to changes in meteorological conditions. There are on the order of 900 monitoring stations throughout the US, many of these are around urban areas in the Northeast and California. For the summer of 1995, only a few exceedances of the 1-hour 124 ppb standard were measured in the eastern US. However, photochemical modeling results indicate more exceedances than observed over a large portion of the eastern U.S. The 8-hr standard of 84 ppb would encompass even more areas with more violations than the 1-hr standard threshold of 124 ppb.

Detection and attribution are key factors in looking at the ozone problem. Detection looks at the identification of a statistically significant trend. Attribution links these changes to a specific forcing function. Natural variations and instrument errors can dwarf the variation in ozone resulting from anthropogenic forcing. In much of the environmental literature, the process of trend detection is synonymous with linear regression. Linear regression analysis on the raw data reduces the temporal information about the pollutant. Changes in monitoring techniques, weather, and climate profoundly influence ozone concentrations. Furthermore, variations on intra-day, diurnal, synoptic, seasonal, and longer time scales are present in ozone concentrations. These influences cannot be discerned from a linear trend estimate, since it reduces the information to something like a change of a few ppb of ozone per year. Separation of these effects is critical to establishing trends for policy analysis. Currently, techniques such as extreme value statistics, distributional analysis, and filtering plus multivariate regression analysis are being applied by researchers. However, there are some areas where there is an increasing trend.
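One filtering approach used in this literature is an iterated moving average (the Kolmogorov-Zurbenko filter). The sketch below is illustrative, with a synthetic ozone series and assumed window lengths; it is not Dr. Rao's actual analysis:

```python
import numpy as np

def kz_filter(x, window, iterations):
    """Iterated centered moving average (Kolmogorov-Zurbenko filter)."""
    kernel = np.ones(window) / window
    for _ in range(iterations):
        x = np.convolve(x, kernel, mode="same")
    return x

# Synthetic ten-year daily ozone series: slow trend + seasonal cycle + weather noise.
rng = np.random.default_rng(1)
days = np.arange(3650)
ozone = (50.0 + 0.002 * days
         + 15.0 * np.sin(2 * np.pi * days / 365.25)
         + rng.normal(0.0, 10.0, days.size))

baseline = kz_filter(ozone, window=15, iterations=5)    # removes synoptic fluctuations
long_term = kz_filter(ozone, window=365, iterations=3)  # isolates the slow component

inner = slice(550, -550)  # ignore edge distortion from the zero-padded convolution
short_term = ozone - baseline
seasonal = baseline - long_term
print(f"short-term variance share: {short_term[inner].var() / ozone[inner].var():.0%}")
print(f"seasonal variance share:   {seasonal[inner].var() / ozone[inner].var():.0%}")
```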

For data at Whiteface Mountain, short-term fluctuations accounted for 53% of the variance in the raw data. Seasonal variations covered 39%. There is a long cycle wave of 3.5 years, which gave a repeatable cycle over the study period. The explanation for the 3.5-year periodic oscillation is under investigation. Analysis of ozone time series data from the LA Basin showed no trend in the intra-day component over the 10-year ozone record, even though NOx and VOC emissions both decreased by 25%. However, the long-term component of ozone showed a downward trend during this 10-year period. Meteorological variables explained 54% of the variations in ozone.

Q: Jana Milford, U. of Col., Boulder What is meant by locations that have a green and red dot?

A: These are distinct monitoring stations within an area, each showing a different trend.






Session III. Health Effects
Moderator: Robert Sawyer, UCB

Opening remarks
Robert Sawyer, UCB

Prof. Sawyer thanked Robert Slott for doing much of the organizational work for the Symposium. He went on to note that health effects are the driving force for legislation, and thus deserve an important place in the discussion. However, the health effects benefits of air pollution regulations are difficult to measure. This topic is discussed in the first presentation this session.

Most air quality regulations control emissions into outdoor air, but in today's society most people spend over 80% of their time indoors. There is a disconnect between the air quality that we measure and model outside and people's actual exposure. The second presentation this session addresses this issue.

The Challenge of Measuring the Benefits of Air Pollution Regulation
Daniel Greenbaum, Health Effects Institute

Mr. Greenbaum outlined his talk as consisting of Basic Concepts, State-of-the- Art, Challenges, and Future Directions.

The Basic Concept is simple: as exposure goes down, health effects go down. An "easy" case is cited, that of lung cancer and smoking. Lung cancer in males has gone down, as has the rate of smoking in males. Mr. Greenbaum then discussed the state-of-the-art in estimating health benefits. There are four main ingredients: retrospective concentration-response studies, probabilistic models of likely effects of reductions, estimates of effects reductions as a result of air pollution reductions, and generalization to nationwide benefits (e.g., the EPA "Staff Paper") and to all health endpoints (e.g., the 812 study). Mr. Greenbaum described the example of asthma hospitalizations as a health effect, with ozone levels as the air pollutant. One can generate curves of asthma hospitalizations vs. ozone concentration levels, perform a probabilistic analysis, estimate the change in hospital admissions from achieving different levels of air quality standards, and perform benefit/cost studies (a sketch of this calculation appears below). There are, however, challenges to be met with this approach:

1. Benefits are not actually measured.
2. Retrospective studies are being used to look forward.
3. Underlying health trends may be reducing or increasing the incidence of health effects independent of the effects of air pollution.
4. The relative size of the benefits.
5. There is an absence of routinely collected health surveillance data.

Mr. Greenbaum gave two suggestions for the future: 1. Establish and maintain a national public health surveillance database using emerging HMO databases. 2. Design and implement long-term tracking and study of selected population cohorts.
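As promised above, here is a minimal sketch of the concentration-response step, using the log-linear form common in this epidemiology literature. The coefficient, baseline admissions, and ozone change below are hypothetical placeholders, not values from the talk:

```python
import math

beta = 0.0005                 # C-R coefficient per ppb of ozone (hypothetical)
baseline_admissions = 10_000  # asthma hospitalizations per year in the study area (hypothetical)
delta_ozone = -20.0           # change in ozone, ppb, from meeting a tighter standard (hypothetical)

relative_risk = math.exp(beta * delta_ozone)
avoided = baseline_admissions * (1.0 - relative_risk)
print(f"Estimated avoided admissions: {avoided:.0f} per year")  # ~100
```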

Q: John Bachmann, EPA Do you think you can measure effects of some place dirty becoming clean? Can we expect to see anything even with good statistics?

A: You would need to do detailed power calculations. Some levels of effects have been very low. It will get harder as air pollution levels decrease. As an example, HEI has funded a study in the former East Germany, but the economy collapsed and the factories shut down, so the much higher pollution levels we had expected did not exist, and the study has been more difficult. It is possible that Kaiser Permanente in Los Angeles would be a candidate for such a study.

Q: Michael Redemer, Texaco Do the models assume a linear response for PM and O3 ?

A: People generally agree on a linear response for O3. PM is not known so well. The epidemiology might suggest a linear relationship, but in the absence of a known biological mechanism, it is hard to confirm this.

Q: Ken Demerjian, SUNY Albany I haven't seen the staff papers. Would it be better to recombine the pollutants?

A: The NMMAPS study is looking at "packages" of pollutants vs. mortality. Getting a firm handle is a challenge. Some index of general pollution is required. The problem is defining a control strategy, since each source of a different pollutant will want to know whether it is its source/pollutant that is causing the problem.

Indoor Exposure to Outdoor Air Pollutants
Joan Daisey, LBL

Dr. Daisey started by noting that the outdoor air quality community does not seem to talk to the indoor community. This may be due to the idea held by many outdoor people that if we focus on indoor issues, nothing will be done about outdoor issues. This is not true in her opinion.

Exposure analysis is essential to understanding causal relationships to health effects. Exposure is defined as contact at a boundary between a human and an environmental medium at a specific contaminant concentration for a specific interval of time. Total exposure is the sum of exposures to a given contaminant.

The significance of indoor environments is that 90% of our time is spent indoors, that buildings modify the exposures to outdoor air pollutants, and that there are indoor sources of some pollutants. Indoor concentrations are sometimes an order of magnitude higher than outdoor concentrations because of the local nature of use (e.g., a spray can or cigarette). One smoker in a house adds about the equivalent of outdoor air levels of particulates (with a much smaller particle size). Data from a sample of California homes show that 76% of PM2.5 comes from outdoor sources.

Buildings have a relatively high surface-to-volume ratio, a relatively low air exchange rate, and a relatively low volume. The concentration of a given contaminant is influenced by ventilation rates, filtration effects, deposition losses, chemical reactions, and re-entrainment. For closed residential buildings, ventilation rates vary from 0.2 - 2 air changes per hour (ach). Commercial buildings are often under positive pressure from the HVAC system; their ventilation rates are in the range of 0.2 - 4 ach. Ventilation by itself doesn't change the cumulative exposure to a spike in outdoor concentration, assuming no losses and constant human activity. Ozone, NOx, air toxics, and particulates are generally subject to losses. With a high ventilation rate, the ozone is perhaps 70% of the outdoor level. For a low exchange rate, the ozone is only about 22% of the outdoor level, as ozone has deposited or decomposed and not been replaced rapidly enough by outdoor air.
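The ventilation and loss behavior described above follows from a single-zone mass balance, dC_in/dt = ach * (C_out - C_in) - k * C_in, whose steady state gives an indoor/outdoor ratio of ach / (ach + k). In the sketch below, the ozone loss rate is an assumption chosen to roughly reproduce the 70% and 22% figures from the talk, not a reported value:

```python
def indoor_outdoor_ratio(ach, k):
    """Steady-state indoor/outdoor concentration ratio.
    ach: air exchange rate (1/h); k: first-order indoor loss rate (1/h)."""
    return ach / (ach + k)

K_OZONE = 0.8  # assumed ozone deposition/decomposition rate, 1/h (illustrative)
for ach in (2.0, 0.2):  # high vs. low ventilation
    ratio = indoor_outdoor_ratio(ach, K_OZONE)
    print(f"{ach} ach: indoor ozone is ~{ratio:.0%} of outdoor")
# ~71% at 2 ach and ~20% at 0.2 ach, close to the 70% and 22% cited above.
```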

Particle deposition data show some deviation from theory at small particle sizes (more actual deposition), while at 1-2 µm the deposition is about 10 times larger than predicted. For fine particles in household data, the penetration losses were negligible, but the decay rates inside the building are high (0.65/h for PM10 and 0.39/h for PM2.5). Thus buildings offer more protection from PM than was anticipated. The process is not well understood.

Semi Volatile Organic Compounds (SVOCs) can be in particle or vapor form. In an indoor environment the vapor phase SVOCs are deposited to the walls, which causes some SVOCs to go from the particle to the vapor phase to reestablish equilibrium. This causes the particle mass and size to decline with time.

Dr. Daisey concluded that more work needs to be done in the area before cost-benefit analysis can start. Most of the work has been done on residential buildings, while little work has been done on complex commercial buildings. Many needed parameters are not available, and indoor chemical reactions, sorption, and desorption are not well understood.

Comment: Robert McCunney, MIT There is a very limited link between indoor air quality and health effects.

Q: McCunney How many cigarettes cause the 20-30 µg/m3 PM increase?

A: About half a pack a day, maybe 8 cigarettes.

Comment: McCunney The PM excursions with adverse outcomes are only 20-30 µg/m3, which is comparable to indoor ETS effects. It is hard to sort them out.

A: Maybe the homes with smokers are more predisposed to have health effects. There is also a new paper that indicates that ETS is less potent than other kinds of PM.

Q: McCunney We are often asked to do indoor AQ evaluations at MIT. We don't measure NOx or ozone routinely. Should we do it?

A: I don't know how cost effective it is. There are certainly lots of VOC emissions indoors, as well as other pollutants that might cause adverse health effects and/or "sick building syndrome" symptoms.

Comment: McCunney To study the interaction of substances makes a lot of sense. We don't know if the substances emitted, such as VOCs from carpets, are neutralized or made worse indoors.

A: We also have to develop metrics to combine the different VOCs, based not on mass but on health effects. It is promising but hard.






Session IV. Stationary Source Case Histories
Moderator: Praveen Amar, NESCAUM

Stationary NOx - A Retrospective Look at the Costs of Controlling Air Pollution
Praveen Amar, NESCAUM

Mr. Amar began his talk with the conclusion that actual control costs have been lower than initial estimates, for various pollutants and industries. The reasons he cited were conservative overestimates, market competition, technological innovation, market-based approaches, and regulatory trends against mandating specific technologies. He went on to say that policy decisions based on inflated costs lead to a lower level of environmental protection than is reasonably affordable. Several examples of cost overestimation were given: Kinnear of Texaco's estimate of hydrocarbon and NOx reduction costs; the GAO94, EPRI93, EPRI95, ICF89T, and ICF89NT estimates of the cost of compliance with Title IV; and the CARB and AAMA estimates of the costs of the LEV program.

A status report by NESCAUM and MARAMA was presented on NOx control technologies and cost effectiveness for utility boilers. Mr. Amar raised the question of whether reductions of 84% of NOx by flue gas treatment are justified, given the increased cost for smaller fractional returns. The program objectives were to review the Ozone Transport Region (OTR) NOx inventory, provide a status report on post-RACT control technology, and provide technology costs. This represents the first major study since RACT was implemented in the OTR, and is unique in the use of Continuous Emissions Monitoring (CEM) and the use of real project experience and real costs. Most of the emission reductions have come from coal-burning units, and in Pennsylvania. The technology responsible for the most reduction is low-NOx burners. Post-RACT NOx control technologies include Selective Catalytic Reduction (SCR), Selective Non-Catalytic Reduction (SNCR), gas and coal reburning, primary controls for gas/oil-fired units, and hybrids of these technologies. The performance and cost analysis relied on real project data from users of the technologies. SCR and SNCR performance and cost were analyzed. The conclusions are that post-RACT NOx control technologies have been implemented and are operating reliably on U.S. utility boilers, and actual performance and cost information is available. Some concerns about SNCR raised by industry are control of ammonia slip, application to large boilers, and process applicability (it is very process specific). Concerns about SCR include applicability to high-sulfur coal units, applicability to boilers that reinject fly ash, whether NOx emissions can be reduced enough to make trading viable, and costly retrofits, including forced-draft to balanced-draft conversions.

Comment: John Elston, NJ EPA The cost was the constant before in the cost/benefit analysis, but it appears that the costs may vary as much as the benefits. It is amazing that when industry is told they must pay whatever the cost differential is, the costs immediately come down.

Q: Rob Harley, UCB What is the precision of the CEM measurements?

A: I assume the CEMs are accurate.

Comment: Jeremy Platt, EPRI They probably are not as accurate as you would think.

Comment: Ken Colburn, NH Dept. of Env. Serv. We designated SCR as RACT.

A: Good point.

Comment: John Beer, MIT There is a significant reserve in low-NOx burners. Low-NOx burners have been shown to go below 0.2 lb/MMBtu.

A: Yes, you would not have to retrofit.

Q: Michael Bradley, M.J. Bradley and Assoc. What are the costs when you look at it in five years?

A: I don't think the estimates will be as far off as in the last five years.

Q: Robert Sawyer, UCB What is the possibility of mobile source NOx trading?

A: The consensus is that it will be in stationary sources first.

Comment: Robert Slott, consultant If NOx output were constant per gallon, that would favor a mobile trading scheme.

The Costs of Title IV SO2 Reductions
Denny Ellerman, Center for Energy and Environmental Policy Research, MIT

Looking back at the U.S. acid rain program, half of the SO2 emissions reduction has come from scrubbers, the rest from switching to low-sulfur coal. The reductions were not evenly distributed; two-thirds of the units in use made no reductions. Most of the reduction took place in the Midwest, and the places that burn the highest-sulfur coal are the cheapest places of abatement. Mr. Ellerman pointed out that SO2 can be reduced considerably by scrubbing high-sulfur coals, and that scrubbing units have been used more extensively than one would have predicted. The actual costs of compliance have come in at ~$200/ton, on the lower end of predictions. On the subject of SO2 allowances, Mr. Ellerman pointed out that the price is much lower than expected. Allowance prices represent short-run marginal cost, and understanding the difference between allowance prices and costs will be necessary in the future.

Q: Richard Morgenstern, RFF If I sell you a ton for $100 and complying will cost $200/ton, why shouldn't I hold on to my ton?

A: At $200/ton, people will start to build scrubbers; at $100/ton, people will buy allowances. Today's price does reflect the future price of compliance.

Q: Phil Lorang, EPA Do these allowances expire?

A: No.

Q: Jason Grumet, NESCAUM Do you think the $1600/ton estimate was intended to confuse Congress, or were people ignorant?

A: It's like today with CO2; I think there are ridiculous estimates out there by the present administration. It's part of the political process. There was also a big debate about how much trading would take place.

Q: Jana Milford, U. Col., Boulder Can you speculate on the volatility in current prices?

A: We have a lot of Phase II utilities calculating their costs and finding they can't beat $100/ton.

Q: Ken Colburn, NH Dept. of Env. Serv. What about the inflation of allowances?

A: There is optimization. Allowances look like any other commodity in forward projections.

Revisiting SO2
Jeremy Platt, EPRI

Mr. Platt began by stating that there has been an enormous amount of scrutiny of SO2. He cautioned that Phase I is a phase, and one should not be too dogmatic about price forecasting. He added that one should avoid learning the wrong, or too many, lessons from the history of SO2 allowances. In particular, the "wrong" lessons would be that the low allowance prices indicate the efficiency of trading, the extent of technological advances, or the extent of industry overestimation of compliance costs. The "right" lessons are that trading is a success, savings have been had, trading has brought political support, and price forecasting mistakes were made. Mr. Platt provided the use of Powder River Basin (PRB) coal and lower western rail rates as examples of unforeseen factors in the economics of allowance trading. There is a natural desire to look for an "alpha" factor to adjust cost estimates to true costs, but it should be avoided. He went on to say that the future is becoming more uncertain, and engineering economic models cannot be used to obtain great predictive accuracy. Mr. Platt gave various analytical price-effect results for Phase II: banking: >$200/ton; retirements: >$200/ton; different levels of coal use: >$150/ton; no more scrubbers: $150/ton; FGD improvements: $60/ton; freezing coal prices or 25% lower rail rates: $50/ton.

Q: There was a lot of pessimism in the late '80s, but there appears to be a change in culture; please comment.

A: It takes time to learn how to play the game. Lower prices drove interest.

Comment: Denny Ellerman, MIT There is evidence that utilities embraced intra-utility trading from the start. They just didn't believe in the external market.

Q: Jeff MacGillivray, NH State Rep. There are two things that affect the price of this future, compliance and politics. Are we really looking at futures in allowances or politics?

A: I don't think politics plays a part.

Comment: Jeff MacGillivray, NH State Rep. It may be better to have inter-pollutant trades.

Comment: Ken Colburn, NH Dept. of Env. Serv. A comment on the alpha stuff: it is a learned behavior.

Q: Glenn Cass, Caltech Are non-users buying allowances?

A: About half the inter-traded allowances are bought by brokers, the other half are utilities.

Q: Glenn Cass, Caltech Is there an open registry of allowance holders?

A: Denny Ellerman, MIT Yes, there is. But you can have trades that don't get registered until later.

Q: Jason Grumet, NESCAUM You say that prediction is hard, but it seems like you give a prediction.

A: You have to say something. We're changing a lot of things with Phase II.

Q: John Beer, MIT I thought in the early part of your talk, the answer was to go to low sulfur coal, but when I look at the mass balance, it is clear in the future it would have to be scrubbers.

A: We've been wrong for 10 years about what PRB coal can do.

Comment: John Beer, MIT I feel there will be plenty of room for trading.

Comment: Robert Anderson, Consultant Phase I costs have come in lower and benefits higher than projections.

A: Phase I prices have come in lower.

Q: Winston Harrington, RFF What is the accuracy of benefit projections?

A: We know deposition has gone down.

Q: Robert Anderson, Consultant Were there unforeseen health effects?

A: John Bachmann, EPA We did foresee the benefits.






Session V. Transportation Case Histories
Moderator: Richard Morgenstern, RFF (on leave from EPA)

Inspection and Maintenance
Winston Harrington, RFF

Dr. Harrington presented the economics of the Arizona Enhanced I/M Program. Testing data from 17 months during 1995 and 1996 were reviewed. Results showed that 860,000 vehicles passed and 130,000 vehicles failed. There were 113,000 rounds of repair and 83,000 vehicles retested to pass. About 3% of the total vehicles went through the system and "disappeared" (i.e. failed, and didn't show up again). The resulting overall emissions have been reduced by about 12% for CO and HC and 7% for NOx. The average repair cost per repaired vehicle was $123, while costs reported by motorists were $74, the difference being mostly attributed to warranty costs. There was an estimated fuel economy benefit of $35. About 5% of the repair costs were over $400.

He summarized simulations to determine if economic incentives would reduce costs. The basic approach was to allow a fee vs. repair choice for the vehicle, with the fee increasing with increasing emissions. The simulations showed a reduction in cost compared to command-and-control approaches. However, problems such as disappearing cars and the lack of repair cost-effectiveness are not solved by an economic incentives program.

The costs were higher than originally estimated. This was attributed to cars being cleaner by the time the I/M program was initiated than at the time of proposal. Total cost was estimated at $32 million, allocated to testing (44%), repair (34%), driving and waiting time for the public (31%), and a negative cost from the fuel economy improvement (-9%). Overall, the I/M program was not terribly expensive. However, the actual emission reductions are small, mainly due to test avoidance, and should not be estimated with I/M data but with remote sensing.

In Dr. Harrington's opinion, the basic problem of I/M programs is that they make the drivers responsible for the cost, and thus provide an incentive to avoid the program. It would be better if someone else paid for the costs.

Colorado's I/M Program
Richard Barrett, Colorado Department of Public Health and the Environment

Mr. Barrett reviewed the cost effectiveness of Colorado's I/M program. The original program was established in 1980. It was based on annual decentralized idle tests, with the objective of carbon monoxide control. An enhanced program was added in the 1993 State Implementation Plan (SIP) to try to reach carbon monoxide attainment in some areas around metro Denver. Originally the goal was attainment by 1996, but this was subsequently changed to 2001. The enhanced program began in 1995. Exemptions are given to the four newest model-year vehicles because of low cost-effectiveness. The initial failure rate was about 6%. For the failed vehicles, repair and retest resulted in a 70% reduction in emissions.

The first evaluation of the program was done in 1996 by the Colorado DPHE. It indicated that 95 - 97% of the goals were being achieved relative to the CO target reduction. An additional year of data showed the program meeting 80% of the target goal. There were about 4% "unresolved" vehicles which were still on the road, and the analysis was highly dependent on the assumptions about these. The I/M program cost was estimated to be $34 million, allocated to testing (60%) and repair (40%). No attempt was made to quantify the cost to the public of driving and waiting. The cost per ton of pollutant reduced was $237. The average repair cost was $130 per failed vehicle.
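As a rough consistency check, the program cost and cost-per-ton figures above imply the tonnage of pollutant reduced (a sketch using only the numbers quoted in the talk):

```python
program_cost = 34e6   # total I/M program cost, dollars (from the talk)
cost_per_ton = 237    # dollars per ton of pollutant reduced (from the talk)

tons_reduced = program_cost / cost_per_ton
print(f"Implied reduction: {tons_reduced:,.0f} tons")  # ~143,000 tons
```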

Evaluation by remote sensing was initiated by Don Stedman in 1995-96, finding reductions in CO of about 4 - 8%. This was below the 11% reduction, relative to not having an enhanced program, that was originally estimated. Changes in HC and NOx were not statistically significant. Later analysis with additional data showed a 10 - 11% reduction for CO, and no statistically significant changes for HC and NOx. It was found that the cars whose registration was renewed had lower emissions than the ones changing owners. Remote sensing data have some problems with out-of-area vehicles, license plate tracking, and other attributions.

In 1997, an outside firm (Environ) was hired by the state to study the program's effectiveness. Their analysis of the ambient air quality data was inconclusive. Their modeling study gave a 30 - 34% reduction in CO over the no-I/M case, which is larger than the 1993 SIP estimates. The analysis is still very dependent upon base-case assumptions and assumptions for "disappearing" or out-of-area vehicles. IM240 resulted in longer-lasting repairs than idle tests, but it was judged that the repair industry was not doing well enough. The cost per ton of CO was estimated to be about $300/ton, which is similar to other existing programs.

Mr. Barrett concluded that the enhanced I/M program is indeed reducing CO emissions. IM240 is the best test for detecting high-emitting vehicles. Overall the program is cost-effective compared to other programs. For the future, lower cut points are being planned, and they will cause the failure rate to double. There will be more emphasis on OBD. Finally, a clean screen program is planned for implementation next year which will utilize remote sensing and model-year exemptions to pass 60% of the vehicles to reduce costs, while losing only an estimated 6% of the emission reductions achieved with no clean screen.

Questions for Winston Harrington and Richard Barrett

Q (for RB): Terry Keating, Harvard How has the clean screen program been designed?

A: RB A pilot remote sensing program has been in place since 1995 in an area (Greeley) where only basic I/M is required. Lots of remote sensing measurements have been done there, with the intention of finding all vehicles. There have also been lots of IM240 runs to provide data for the comparison. Based on the results from the pilot program, we are thinking of requiring 2 or 3 readings per vehicle. If all the readings are cleaner than a threshold we will send the person a letter saying that their vehicle is exempt from I/M. They will still have to pay the same fee ($15) as the people who do the test, but they will save the inconvenience.

Q: Jeremy Platt, EPRI Is there any experience with a line where people can call in to report "smokers?"

A: RB We have a program like that in Colorado, but its effectiveness has not been tracked. We send a letter to the owner of the vehicle but very few people actually come in for testing.

Q: Both studies show that the cost of repair is only 1/3 of the total cost. Could the state pay the cost of all the repairs?

A: WH That could improve the amount and quality of the repairs. It is a potential solution that should be looked at.

A: RB Many people feel that those who pollute should pay for the repairs, or even pay an emissions-based tax or registration fee. But the people who do the polluting are the least able to pay for it.

Comment: Sam Leonard, GM If the state pays for repairs, owners have no incentive to maintain their vehicles.

Q: Don Stedman set up a "happy car sign" in Denver. The sign tells the owner that their car is "sick" in 4% of the cases. 50% of those people self-reported that they tried to repair their car. Does anyone have any experience with that?

A: RB We haven't done it, but I think it's a good idea.

Q (for RB): Daniel Greenbaum, Health Effects Institute Are the trends of emission reductions leveling off?

A: RB The trends are leveling off for all pollutants because of the increase in vehicle-miles traveled.

Q: John Elston, Dept. Env. Prot. NJ The disappearing vehicles have high emissions. Are there any ideas on how to deal with them?

A: RB We are trying very hard to figure out what is going on with the disappearing cars. The preliminary results indicate that they don't go right over the edge of the I/M area, but we still don't understand what's going on.

Q: Could insurance information be used for that purpose?

A: RB No, there are lots of people without insurance.

Q: Jeffrey MacGillivray, NH State Rep. Some cars pass on the second try without doing anything. Could the variability of IM240 tests explain this?

A: RB We have not assessed that point with the public's cars. We did driver-to-driver variability studies at the beginning and saw good test-to-test correlation. Even if the phenomenon is real for marginally-passing and marginally-failing cars, we feel that at high emission levels variability has no impact.

A: WH The clean cars are always clean. Some dirty cars are variable.

Comment: Sam Leonard, GM The most important thing is to precondition the vehicle. If you wait a long time before the test and the catalyst is cold you get a high failure rate.

A: RB Sometimes a second IM240 is done for that reason, and the first serves as preconditioning.

Q: Glenn Cass, Caltech Are super-emitting vehicles captured by the tests?

A: RB We don't know. The disappearing vehicles are the dirtiest on average.

Q: Glenn Cass, Caltech What happens to the cars that need expensive ($500) repairs?

A: RB In Colorado you don't get a waiver unless you spend $450 attempting to repair the vehicle. Any missing or tampered-with parts are not counted towards that cost.

Comment: Robert Slott, MIT A substantial part of the population is trying to avoid having to spend money. That is very easy in California; you just have to go to the "right" garage. In Colorado you can register the car outside the I/M area because the enforcement is lax. As enforcement becomes stricter, people will become more ingenious at avoiding repairs. They may start registering their cars in Wyoming, and since there is no national VIN registry it is very difficult to know where the car went. Until these issues are resolved it is difficult to know what the effectiveness of the I/M programs can be. It all comes back to the willingness of the states to deal with the motorists.

Q: Robert Sawyer, UC Berkeley There is a big difference in CO emissions between summer and winter. Has this been taken into account?

A: RB No, all studies used data for both summer and winter. You can see the temperature and oxygenated-fuel effects in the lane data, but the studies haven't been done separately.

Low Emission Vehicle (LEV) Technology and California Reformulated Gasoline (RFG)
Tom Cackette, California ARB

Mr. Cackette presented two case studies for the costs of air quality regulations: reformulated gasoline (RFG) and low emission vehicles (LEVs).

Air quality has been improving in California. Maximum ozone concentration in Los Angeles has been reduced from 0.35 ppm in 1988 to 0.21 ppm in 1997 in spite of increased population, increased vehicles, and increased vehicle miles. The RFG and LEV programs have contributed to this success.

The RFG program in California has targeted lower CO, HC, NOx, and toxics. It includes specifications for 8 fuel properties, with a wide variety of compliance choices. The regulation was adopted in 1991 and implemented statewide in 1996. Targeted reductions were in the range of 11 - 17%. A CARB study on RFG indicated that $5.5 billion in capital investment would be needed to modify refineries to produce the RFG, which would mean an increased gasoline cost of 12 - 17 cents/gal. Cost effectiveness was estimated to be $3.50/lb of reduced pollutants, which compares favorably with other emission reduction programs. The oil industry did another study which came to a figure of 23 cents/gal. According to Mr. Cackette this was a very conservative or "worst-case" analysis. By 1996, the estimates of the actual capital costs amounted to $4 billion, or 10 cents/gal. The breakdown was 35% capital cost, 50% O&M, and 15% cost of oxygenates. A price study was done in 1997. Compared to other cities the price differential was estimated to be +5.4 cents/gal, though the prices were very volatile due to market forces.

CARB carried out a study to estimate the costs of the program, and compare them to prior estimates made by CARB and the automobile industry. Incremental costs for the new or revised emission control devices installed on three 1998 models meeting the LEV standard were determined, and compared to earlier estimates made in 1994. For a 4 cylinder Honda Civic, the actual incremental cost at the retail level is $75, compared to an $85 estimate by CARB in 1994. For a 6 cylinder Camry, the current cost is $79, compared to a $137 estimate in 1994. For a Ford V8, the current cost is $152 compared to a $140 estimate in 1994. In general the technologies CARB projected to be needed in 1990 were used in actual production vehicles in 1998, with the exception that lower cost, conventional catalysts were sufficient to meet the LEV standards.

In 1994 consultants to the automobile manufacturers estimated a $788 cost of compliance to meet LEV standards. The main reasons the consultants overestimated the costs were overestimating the use of expensive active catalyst systems, overestimating the costs of components, overestimating the warranty costs, and including costs for unplanned sheet metal changes (e.g., the floor pan, to accommodate relocated catalysts) which actually were made during normal model revisions.

In 1998 EPA is estimating $95 per vehicle as the cost of the national NLEV, which is similar to the California LEV program, while industry estimates are in the range of $95 - 200. For the next stage of implementation, for ultra-low emission vehicles (ULEVs), the additional cost with respect to Tier I vehicles is estimated at $200 - 300, although GM has quoted costs of up to $1000.

The incremental cost effectiveness is $0.50/lb of pollutant reduced, which compares favorably to other programs. We are not seeing diminishing returns with every new program; in fact, the cost effectiveness of new programs has stayed quite constant.

Mr. Cackette concluded that the CARB methodology reasonably predicts actual costs. Technology forecasting is the key to accurate cost estimates. Industry and consultant estimates are systematically pessimistic, probably indicating resistance to the programs. Both the RFG and the LEV programs have lower costs than predicted and are cost effective.

Reformulated Gasoline
Robert Anderson, Resource Consulting Associates, Inc.

Dr. Anderson presented an economic analysis of the federal RFG program. The program began as an industry initiative, ARCO's EC-1 gasoline for older vehicles; the Clean Air Act Amendments of 1990 expanded it. The legislation set emission performance targets for RFG and specified that RFG must contain oxygenates. Phase I implementation required a minimum oxygen content of 2%, VOC and toxics emission reductions of 15%, and no increase in NOx emissions. Phase II, scheduled for 1999, required a 25% improvement. A baseline fuel composition was established for summer and winter conditions, upon which baseline emissions were based. The Phase I RFG incorporated oxygen and reduced vapor pressure, giving a projected emission reduction of 16.5%.

Cost estimates before the program was implemented ranged from 2.3 cents/gal (EPA) to 7 - 16 cents/gal (industry); DOE estimated 4 - 6 cents/gal. The actual costs were reported as 2.6 - 4.0 cents/gal. There is an energy penalty of about 1.8 cents/gal due to the lower energy density of RFG, which one industry study had overestimated by a factor of 3. The cost effectiveness could be in the area of $2,200/ton. The toxics goal was set to the minimum allowed by the Clean Air Act, as the cost per cancer case avoided was quite high (around $40 million).
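
As a rough illustration of how an energy penalty of this size arises: RFG carries slightly less energy per gallon, so drivers buy proportionally more fuel to cover the same miles. The price and energy-loss fraction below are assumptions chosen only to reproduce the ballpark figure.

    # Energy penalty of a lower-energy-density fuel, in cents per gallon.
    # Both inputs are assumptions for illustration.

    PRICE = 1.20           # $/gal, assumed gasoline price
    ENERGY_LOSS = 0.015    # assumed fractional reduction in RFG energy content

    penalty = PRICE * ENERGY_LOSS   # cost of the extra fuel purchased per gallon
    print(f"energy penalty: {penalty * 100:.1f} cents/gal")  # ~1.8 cents/gal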

Phase II RFG should cost 3.5 cents/gal according to EPA, which has done a good job of estimating costs so far. The incremental benefit of increasing the oxygen content beyond 2% does not appear to be cost effective. Dr. Anderson suggested that setting the minimum oxygen content as high as 2% may not have been cost effective either, though EPA did not report on that, since the 2% minimum was required by the Clean Air Act.

Questions for Tom Cackette and Robert Anderson

Q (to RA): Praveen Amar, NESCAUM You said that $500/ton was cost-effective and $5,000/ton was not, but both may be effective from the State Implementation Plan's point of view.

A: RA I agree, it depends on the locations and the control costs at other sources.

Q: Winston Harrington, Resources for the Future The oxygenated fuel requirement means that 15% of the gasoline is MTBE. Hasn't this created a large MTBE industry that may be hard to remove?

A: RA Yes, there has been a lot of investment, they may not like a change in the requirement.

Comment: Robert Slott, MIT The oxygenates are also used to increase the octane rating of gasoline and compensate for the reduction in aromatics. Without them, refiners would have a hard time meeting the octane requirement. We may not want to require MTBE, but we may want to allow it.

A: TC There would also be a fuel shortage in California if MTBE was banned, because the industry is running at 90% of full capacity and MTBE is 15% of the gasoline.

A: RA But there are severe problems with MTBE such as leakage into groundwater. MTBE will stay in the short-term but its future is not clear in the medium term. I would not be surprised if it was gone in 10 years.

Q: Daniel Pedersen, MIT RFG is expensive, but is it effective? And if so on which time scales?

A: TC There is a large overall reduction in ROG. RFG has an immediate effect since it is used by all cars. It had the 2nd largest effect of any measure ever used in California. With cleaner new vehicles you have to wait 10-15 years.

Q (to TC): Terry Keating, Harvard Why did you attribute 1/2 of the cost to toxics and 1/2 to NOx + ROG?

A: TC Mobile sources are the largest emitters of toxics such as benzene and butadiene. Our cost allocation is not strictly analytical, but we have computed it attributing the costs to NOx + ROG, to NOx + ROG + CO/7, and to that plus toxics, and all come to $1 - $2 a pound. Any way you look at it, it is still extremely cost effective.

Q: Jeffrey McGillivray, New Hampshire State Representative What is the cost of OBD2 mandated components?

A: TC The actual cost is $56, CARB had estimated $35. It is not included in the LEV calculations since it is mandated nationally.

Q: McGillivray How do LEVs & ULEVs split as far as ROG + NOx?

A: TC You can't estimate them separately because the techniques are the same (catalyst and air-fuel ratio control).

Q: Would the cost be lower if there was only a NOx or ROG requirement?

A: TC Probably, but that was not studied.

Comment: Daniel Greenbaum, Health Effects Institute The RFG law should be a historical lesson about how detailed a law should be.

Q: Greenbaum Metals in fuels are historically known to be harmful, and we have spent a lot of time dealing with them. European data show that catalyst metals are stripped into the atmosphere. What is the effect of your recommendation to increase metal loadings in catalysts?

A: TC It hasn't come up as an issue. My uninformed assessment is that it is very small. I am unaware of any data that say that it is important.

Q (to TC): Michael Redemer, Texaco What did you assume for the capital recovery rate?

A: TC For the cars I don't know. For the oil industry, 10-11%. This is based on what they actually get when they do a project, which is different from what they estimate when they are deciding whether to do a project.

Comment: Sam Leonard, GM CARB cost estimates for LEVs show the result of a good business decision: doing the least costly things first, picking the low hanging fruit. When things are done across the board the costs will be higher.

A: TC True

Comment: Sam Leonard, GM The dealer markup is 20-30%, that is where the industry estimate of increased cost for that item came from.

A: TC We asked dealers what they got.

Q (to RA): Was the cost-effectiveness of reducing sulfur down to 50ppm attractive for NOx?

A: RA Yes it is. It was $7,700/ton when you go from 100ppm to 50ppm S. NOx is going to be expensive. It would not be cost effective to do it for VOCs ($32,000/ton).

Q: John Elston, Dept. Env. Prot., NJ Are the costs of gasoline storage estimated?

A: RA EPA did not estimate those. The industry study took a country average. My guess is that they are 0.1-0.2 cents/gallon.

[UP]
Air Quality and Other Benefits and Costs of Transportation Control Measures
John Suhrbier, Cambridge Systematics, Inc.

Mr. Suhrbier summarized results of transportation air quality research being conducted for the National Cooperative Highway Research Program (NCHRP). Administered by the National Academy of Sciences, this three-year project aims to develop and test an improved analytical framework for evaluating the air quality and other benefits and costs of transportation control measures (TCMs). The framework, though, is designed to be equally applicable to the full range of potential air quality transportation control strategies. This will facilitate the use of consistent methodologies, data, and assumptions in developing emission inventories, evaluating candidate strategies, and conducting conformity analyses.

An important secondary objective of the research is to examine the relationship between implemented transportation measures and monitored ambient air quality levels. State transportation officials, quite understandably, want to know whether projects implemented to reduce air pollution are in fact doing so. An important finding of this portion of the research is that monitoring techniques are adequate for detecting changes in direct pollutants such as carbon monoxide. Attributing changes in ozone levels to a particular transportation measure, though, is much more difficult. In this case, a combination of monitoring and modeling techniques can be used to correct for meteorological and control conditions.

As an experiment, monitored air quality and traffic data were examined before, during, and after the Atlanta Olympic Summer Games. Although ozone levels were measurably lower during the Olympic period and the implemented set of travel demand management measures were demonstrably effective, it is not possible to conclude statistically that the improved ozone levels were caused by the set of transportation actions. In fact, ozone levels were correspondingly lower in many areas located far to the west of the immediate Atlanta urban area. Further, the implemented transportation measures had the effect of accommodating a higher total demand for transportation services and redistributing both the temporal and spatial patterns of automobile traffic. Total daily vehicle miles of travel (VMT) did not change significantly.

Existing transportation benefit and cost models were not designed to support air quality analyses but have been adapted to that purpose. Consequently, existing analysis techniques have numerous and extensive deficiencies. In addition, the state of the practice in many agencies lags seriously behind even the current unsatisfactory state of the art. This is most apparent in the variables used to link transportation and emissions analyses. Vehicle travel speeds can be a particularly unreliable output of travel demand analyses, yet they are a critical determinant of vehicle emissions. Further, cost-effectiveness analyses are often conducted incorrectly from an economic perspective, and in a manner that does not allow valid comparison of alternative measures; a sketch of an economically consistent calculation follows.
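
A minimal sketch of what an economically consistent comparison looks like, assuming a standard capital recovery factor; the discount rate, lifetime, and measure data below are hypothetical.

    # Sketch of an economically consistent cost-effectiveness comparison:
    # annualize capital with a capital recovery factor, then rank measures
    # by cost per ton of emissions reduced. All inputs are hypothetical.

    def capital_recovery_factor(rate: float, years: int) -> float:
        """Fraction of a capital outlay charged per year over its lifetime."""
        return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

    def cost_per_ton(capital: float, annual_om: float, tons_per_year: float,
                     rate: float = 0.07, years: int = 20) -> float:
        annualized = capital * capital_recovery_factor(rate, years) + annual_om
        return annualized / tons_per_year

    # Two hypothetical transportation control measures:
    print(f"signal retiming: ${cost_per_ton(5e6, 2e5, 150):,.0f}/ton")
    print(f"vanpool program: ${cost_per_ton(1e6, 5e5, 60):,.0f}/ton")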

The NCHRP research is examining both short range improvements that can be immediately implemented by transportation agencies, and longer range, more fundamental changes. The short range improvements have been tested using data from the Sacramento, CA urban area. The longer range improvements are being tested for the Portland, OR metropolitan region.

Over the next several years, it is likely that modally-based emissions modeling will be gradually introduced into practice. This will be matched on the transportation side by the use of activity-based travel demand models. Rather than examining trips as independent units, daily activities are examined on a household and individual basis. This facilitates consideration of a broader range of transportation options, such as bicycling and walking. It also allows an activity to be undertaken by staying at home without taking a trip, such as telecommuting to work. Finally, activity-based models treat "tours" as the basic unit of travel, with tours composed of a series of linked trips. A major benefit of activity-based modeling is the ability to examine indirect as well as direct transportation impacts.
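
A sketch of the "tour" representation described above, with hypothetical field names: a tour is an ordered chain of linked trips, and quantities such as auto VMT fall out of the chain directly.

    # Sketch of a tour as a chain of linked trips. Field names are illustrative,
    # not drawn from any particular activity-based model.

    from dataclasses import dataclass, field

    @dataclass
    class Trip:
        origin: str
        destination: str
        mode: str          # e.g. "auto", "walk", "bike", "transit"

    @dataclass
    class Tour:
        purpose: str                          # e.g. "work", "shopping"
        trips: list = field(default_factory=list)

        def auto_vmt(self, miles: dict) -> float:
            """Vehicle miles of travel for the auto legs of this tour."""
            return sum(miles[(t.origin, t.destination)]
                       for t in self.trips if t.mode == "auto")

    # A home-work-shop-home tour; a telecommuting day simply has no tour at all.
    tour = Tour("work", [Trip("home", "work", "auto"),
                         Trip("work", "shop", "walk"),
                         Trip("shop", "home", "auto")])
    miles = {("home", "work"): 12.0, ("work", "shop"): 0.5, ("shop", "home"): 11.5}
    print(tour.auto_vmt(miles))  # 23.5 -- only the auto legs count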

Many, if not most, transportation actions have potentially important air quality impacts and it is important that the air quality and other benefits and costs of this full range of measures be accurately evaluated. While only a small handful of these transportation actions may end up being incorporated in a regulatory State Implementation Plan, other transportation actions still have important implications for emission inventories and conformity analyses.

In this regard, it is important to take two additional factors into consideration in conducting state-of-the-art transportation air quality analyses. The first is the need to examine changes in the temporal and spatial distribution of mobile source emissions. The second is the need to examine air quality impacts on the basis of multi-state regions rather than just for an individual metropolitan area.

Additional information on the NCHRP 8-33 research, including the interim reports produced to date, can be found on the project's web site accessible via http://webservices.camsys.com.


[UP]

Session VI. Summing Up

Moderator: Art Fraas

Panel Discussion: What can be done to improve the accuracy of cost and benefit predictions?

John Bachmann, EPA
Sam Leonard, General Motors
Tom Lareau, API
Jason Grumet, NESCAUM
Jeremy Platt, EPRI

Art Fraas introduced the speakers and stated that OMB is a user of cost-benefit analysis, so it would be helpful to hear suggestions for improving the methodology.

[UP]
John Bachmann, EPA

Mr. Bachmann began by stating that it is unreasonable to expect a high level of accuracy in cost-benefit analyses when they are limited by the available data, the geographical and economic scope of the project, the often extended time horizon for forecasting, and related inherent uncertainties. He added that it is often difficult to address and integrate combined effects for both benefits and costs of integrated control programs, as in the case of NOx reductions resulting in reduced ozone, particulates, and air toxics. He suggested that current regulatory air quality models may be inadequate for such analyses, because evaluating annualized benefits for ozone requires knowledge of pollutant concentrations for each day, rather than the peak-episode concentrations for which the models were implemented.
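
A toy illustration of the point about annualized benefits, with an entirely hypothetical dose-response function and concentration series: applying a benefits function to every day's concentration gives a total that no single peak-episode value can stand in for.

    # Why annualized ozone-benefit estimates need day-by-day concentrations.
    # The dose-response form and all numbers are hypothetical.

    daily_ozone = [0.11 if d % 7 == 0 else 0.06 for d in range(365)]  # ppm, assumed

    def benefit(reduction_ppm: float) -> float:
        """Hypothetical linear valuation of one day's ozone reduction, $M."""
        return 50.0 * reduction_ppm

    # A strategy assumed to shave 10% off each day's concentration:
    annual_total = sum(benefit(0.10 * c) for c in daily_ozone)
    peak_episode = benefit(0.10 * max(daily_ozone))
    print(f"all days: ${annual_total:.0f}M; peak day alone: ${peak_episode:.2f}M")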

He proposed the following "framing" issues for benefit analysis: Should it be used explicitly or merely for information? How do we place values on life, illness, and aesthetics? How do we consider disbenefits (for example, how smog reduction increases UVB penetration)? How do we extrapolate small studies to the national scale? Mr. Bachmann then outlined some requirements for improved cost-benefit analysis, including improved estimations of future emissions, baseline air quality, potential control strategies, control costs, economic impacts, future policies, an improved understanding of benefits functions (ex: dose-response), and better atmospheric models. He concluded that benefit valuations should be checked for plausibility and that people may be willing to pay more for some benefits categories than the totals derived from indirect assessments.

Q: Michael Redemer, Texaco Should there be a body separate from the regulatory agency to do the evaluation, so that there are checks and balances?

A: I am not willing to accept that such an approach is desirable or necessary. I believe that independent peer review can address the most important issues with respect to the credibility of such analyses. A rigorous process is necessary to evaluate the estimates of costs and benefits.

Q: Terry Keating, Harvard University Are the simpler engineering models going to be effective and feasible when the trend is to make the models more complex?

A: The first step is to make a believable complex model, during which time computer advancements may solve many of the computation problems. From this we should be able to put together a simpler model which will be necessary for making last minute decisions.

[UP]
Sam Leonard, General Motors

Mr. Leonard suggested that benefits are more difficult to quantify than cost effectiveness, at least from an engineering standpoint. He critiqued EPA's draft Tier II analysis, stating that it used outdated emissions models, overestimated benefits, underestimated costs, did not allow enough time for peer review, ignored the interactive effects of technologies, and did not consider the added costs of larger vehicles.

Evaluating a study of HC and NOx cost effectiveness relative to NLEV, Mr. Leonard calculated that if one included a time-value discount, the fact that half the country is already in attainment, per-vehicle costs double EPA's estimates, a half-year seasonal benefit, and other costs, the EPA estimate of $2,400/ton would climb to over $40,000/ton. He emphasized that these issues must be taken into account for a correct analysis.
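
The arithmetic of such stacked adjustments is multiplicative, which is why the number climbs so quickly. The factors below are hypothetical stand-ins for the items Mr. Leonard listed, chosen only to show the compounding effect, not his actual figures.

    # Compounding of stacked adjustments to a cost-per-ton estimate.
    # The adjustment factors are hypothetical, not Mr. Leonard's actual numbers.

    base = 2_400  # $/ton, the EPA estimate

    adjustments = {
        "time-value discounting of future benefits": 1.8,
        "half the country already in attainment":    2.0,
        "per-vehicle costs double EPA's estimate":   2.0,
        "ozone benefits accrue only half the year":  2.0,
        "other costs":                               1.2,
    }

    adjusted = base
    for reason, factor in adjustments.items():
        adjusted *= factor
    print(f"adjusted estimate: ${adjusted:,.0f}/ton")  # ~$41,000/ton here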

Q: Kenneth Colburn, N.H. Dept. of Env. Serv. Has anyone monitored disbenefits?

Q: Robert Sawyer, UCB You can see an ozone peak on weekends. Is this due to lower NOx emissions from trucks?

A: This must be modeled, not ignored.

C: Praveen Amar, NESCAUM Benefits overwhelm costs, so the benefited area will show a net cost effectiveness.

Q: John Holmes, CARB Can we address the ozone problems through NOx and hydrocarbons?

A: Local NOx control would be better than on the national level.

C: John Bachmann, EPA Don't mix NOx benefits with cost effectiveness. Costs are fixed, but the benefits may be reduced.

A: If you list cost effectiveness numbers as justification, then you must consider the benefits, because projects with equal cost effectiveness may not have equal ozone benefits.

Q: Richard Morgenstern, RFF If you could have suggested from the outset how the study should have been done, how would it have changed things?

A: We have been arguing the ineffectiveness of these types of cost-benefit analyses for 10 years, to no avail.

C: John Bachmann, EPA There is a case where we have sat down with industry to work out a procedure before the analysis was done.

C: Philip Lorang, EPA In its defense, the Tier II study is the first of three generations and is not a basis for action.

A: EPA did not have enough time to do the job.

Q: Jason Grumet, NESCAUM Will industry accept letting states tailor restrictions, rather than enforcing a national standard?

A: On a national basis, automobiles are the only source that have consistently reduced emissions.

Q: Jeff MacGillivray, N.H. State Rep. If there are financial incentives for people to buy cleaner cars in some areas, how will this drive up the cost of making cleaner cars?

A: Having only 7% of national car sales in California allows us to test more risky technology at a higher cost. This works better than a national standard; we can prove the technology and experience the learning curve before doing expensive retooling.

C: Jeff MacGillivray, N.H. State Rep. We don't want to spend money on the hydrocarbon controls that California requires when they exceed what is needed for ozone reduction.

[UP]
Tom Lareau, American Petroleum Institute

Mr. Lareau stated that cost-benefit studies are expensive and time consuming, and that the agencies are often not given the time or resources necessary to get the right answers.

He said that more regulatory alternatives need to be considered (ex: stringency of regulatory changes) and confidence intervals must be presented so that a false level of accuracy is not suggested. He suggested that an independent agency such as the GAO or OMB should be utilized to resolve differences between EPA and industry analyses.

[UP]
Jason Grumet, NESCAUM

Mr. Grumet asked, "How do bad estimates happen to good people?" He offered two possibilities: that we have good tools that are used poorly, or that we do not have good tools at all. He felt that we have good tools, but that anticipation of "good news" (such as the fall in railroad rates for hauling low-sulfur coal) is rare, and actual costs are often only 10% of what was predicted. He agreed that uncertainty analysis is needed. He stated that we should have set SO2 levels at the threshold for environmental benignness instead of basing them on costs, because costs always come down. Mr. Grumet then asked why EPA is reviled for its low cost projections when they turn out to be accurate. He suggested adopting "covenant cost projections," a mechanism to make people more responsible for their estimates by penalizing those who overestimate and reimbursing those who underestimate. He emphasized the need for those who do the analyses to make sure that their results are not used by others in inappropriate ways to support a cause.
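
No mechanism was specified in the talk, but one minimal reading of the "covenant" idea is a settlement rule keyed to the gap between estimated and actual costs; the function and rate below are purely hypothetical.

    # A purely hypothetical settlement rule for "covenant cost projections":
    # overestimators pay a penalty, underestimators are reimbursed.

    def covenant_settlement(estimated: float, actual: float,
                            rate: float = 0.5) -> float:
        """Positive = penalty owed by the estimator; negative = reimbursement."""
        return rate * (estimated - actual)

    print(covenant_settlement(estimated=788.0, actual=150.0))  # overestimate: pays 319.0
    print(covenant_settlement(estimated=95.0, actual=150.0))   # underestimate: -27.5 back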

[UP]
Jeremy Platt, EPRI

Mr. Platt stated the need for more scenario analysis, since environmental controls do not simply "ratchet down" at some arbitrary time. He admitted to errors in forecasting and to being too possessive of certain scenarios. He noted that market fluctuations do not necessarily equate to errors in estimates, and that the Phase II studies are not complete: results that look correct may be correct for the wrong reasons. Mr. Platt stated that work must be done to get the bias out of the estimates, and that regulators and industry must work together to create a common modeling platform before discussing interpretations.

Comment: Robert Anderson, Consultant Sweden's NOx program is remarkable insofar as it follows Jason's idea: companies are charged the estimated $6,000 per ton and paid back based on usage. Costs are now down to $1,100 per ton.
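
A sketch of how such a refunded emissions charge works, assuming hypothetical plant data: each plant pays the fee on its emissions, and the pooled revenue is refunded in proportion to useful output, so the scheme is revenue-neutral and rewards clean production.

    # Refunded emissions charge in the style of Sweden's NOx fee.
    # Plant data are hypothetical; the fee matches the figure cited above.

    FEE_PER_TON = 6_000  # $/ton of NOx emitted

    plants = [  # (name, NOx emitted in tons, useful energy output in GWh)
        ("A", 120.0, 900.0),
        ("B", 200.0, 800.0),
        ("C",  60.0, 700.0),
    ]

    pool = sum(tons for _, tons, _ in plants) * FEE_PER_TON
    total_output = sum(gwh for _, _, gwh in plants)

    for name, tons, gwh in plants:
        refund = gwh / total_output * pool
        print(f"plant {name}: net ${refund - tons * FEE_PER_TON:+,.0f}")
    # Low-emission-per-GWh plants come out ahead; the pool nets to zero.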

Q: Richard Morgenstern, RFF The analytic framework is established. The issue is in the detail and assumptions. How do we set standards when everybody is an expert in their own field?

Comment: Jason Grumet, NESCAUM Underwriters Laboratories set a standard and everybody followed it.

Comment: John Bachmann, EPA The process must be continuous. It would be nice to discuss assumptions with industry, health experts, etc. to get a consensus on the benefits of any regulation. We are always advancing.


[UP]

Review and Summary of the Conference
Robert Sawyer, UCB

Dr. Sawyer thanked the presenters, the MIT host, and the Endicott staff. He expressed appreciation for people telling not only what they know, but also what they don't know. He then summarized the presentations of the previous speakers.

Dr. Sawyer then stated that the bottom line is that we need more good data on health, emissions, and economics, and that we must devote more effort to quality assurance and uncertainty analysis. He hypothesized that the lessons learned here may prove valuable in dealing with the global warming problem.


-- End --



