RhumbRunner13
No alts, no "Iggy"
- Joined: Jan 4, 2002
- Posts: 3,463
I realize we have had some threads discussing this topic already, and this is a long testimony that I've copied and pasted.
The statement Michaels made on CNN this morning piqued my interest. He said if Kyoto were to be implemented by the US, the anticipated "savings" in "Global Warming" would be 1/700 of a degree centigrade over the next 50 years!
There are 7 sections to his points, any of which would be a good topic for discussion, but I pasted the whole thing.
I think shedding light on a subject always increases knowledge and understanding......or chases the roaches back to their hiding places!
What are your thoughts?
Testimony of Patrick J. Michaels, Department of Environmental Sciences, University of Virginia, to the
Committee on Science, U.S. House of Representatives, March 6, 1996
This text represents the personal testimony of Patrick J. Michaels and is not an official representation of the University or the Commonwealth of Virginia. This testimony is tendered with the traditional protection of academic freedom.
I have been asked to offer my testimony on the state of our knowledge about global change, including recent developments in the field, and on the priority for data collection relating to this important problem.
In science, data are the philosopher's stone that confirm or reject the hypotheses of theoreticians. In the science of global warming, it is the data that have moved the theoretical models of climate change from prognostication of rapid and disastrous warming to reduced and slower change much more consistent with a "moderate," rather than "dangerous," synthesis of the issue.
It is also the data that demonstrate that current hypotheses as to the cause of the failure of the climate models that formed the basis for the 1992 Framework Convention on Climate Change are themselves likely to be incorrect. And it is the data that demonstrate that statements ascribing increases in large summer rainfall events to changes in the greenhouse effect are not supported.
In my testimony I will cite seven examples of how climate data have changed the paradigm on climate change from "dangerous" to "moderate." In each example, I cite the source of data in order to underscore the need to continue to monitor and collate important environmental information.
At the outset, it is worth noting that the purpose of the Framework Convention, according to the treaty itself, is to "prevent dangerous anthropogenic interference in the climate system."
Even though accurate environmental data are essential, they are also, in some cases, expensive. There may be a novel solution to the problem of funding climate change research that relieves some of the public burden and shifts the costs to some of the principal interest groups concerned with this problem. I note this in a final comment.
1. Climate models that were heavily cited as providing the scientific basis for the Framework Convention were wrong.
In 1992, the United Nations Intergovernmental Panel on Climate Change published a "Supplementary Report" prepared specifically to provide the technical background for the 1992 Framework Convention on Climate Change. Unlike their previous (1990) report, which relied on climate models that "instantaneously," and therefore unrealistically, doubled their carbon dioxide greenhouse effect, the 1992 report used models that gradually increased the greenhouse effect and also included a more elaborate calculation of the ocean-atmosphere interaction. The net warming predicted by these models for doubling carbon dioxide generally ranged between 4 and 4.5 C.
The models could be checked in a variety of ways. One simple test is to calculate the global mean temperature change that would be expected by the models for current changes in the greenhouse effect. As early as 1987, T.M.L. Wigley calculated that the mean planetary warming to that point should have been 1-1.5 C; the observed value of 0.45 C valid at that time indicated that errors, averaged globally, were in the 200-300% range. Using the results of the transient coupled ocean-atmosphere model of Manabe et al. (1991), and allowing the model even more time to warm than was allowed by reality, I found that the error was of the same magnitude found by Wigley. This result was eventually published in the Bulletin of the American Meteorological Society and elsewhere in the scientific literature.
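A minimal sketch of the arithmetic behind an error figure in that range, using only the round numbers quoted above (the values, and the choice to express the error as the ratio of predicted to observed warming, are illustrative assumptions, not part of the published calculation):

    # Back-of-the-envelope check of the ~200-300% error range, using the round
    # figures quoted above; values are illustrative only.
    observed = 0.45                # observed global warming to 1987, deg C
    predicted_range = (1.0, 1.5)   # warming implied by the models for the same period, deg C

    for predicted in predicted_range:
        ratio_pct = predicted / observed * 100                  # prediction as % of observation
        excess_pct = (predicted - observed) / observed * 100    # overshoot as % of observation
        print(f"predicted {predicted} C: {ratio_pct:.0f}% of observed, "
              f"overshoot of {excess_pct:.0f}%")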
Parenthetically, I must state that it was common knowledge in the modelling community that, at the time of the 1992 Supplementary Report, the models were making errors this large; as evidence for this I have appended the reviews of my original manuscript that was sent to Science. Nowhere did the one negative reviewer deny the reality of the calculations; rather, it was stated that "Those who work with GCMs [climate models] know the problems associated with their models." Somehow, a public airing of the overprediction of warming was apparently inappropriate, even though the overprediction itself was well known.
The most recent quantitative argument concerning models of the type featured in IPCC (1992) is by Mitchell (1995), and indicates that the current error in those models is a minimum of 160% and a maximum of 360%.
Data: The data for these analyses came from ground-based weather stations. In the United States, we are currently changing the measurement systems, and there is concern that the "newer" measurements may be incompatible with older ones. Research and data collection that preserves the integrity of the land-based climate record must be given the highest priority.
2. The failure of the earlier forecasts is not explainable by sulfate aerosol.
The 1995 IPCC "Second Assessment" builds the paradigm that the cooling effect of sulfate aerosols is sufficient to explain the disparity between the warming predicted by models in the 1992 Supplementary Report and observed temperatures.
The logic employed by IPCC (1995) in making this assertion is questionable. While models that attempt to mimic the effect of sulfate aerosols must provide a better explanation for the lack of warming, that does not imply that aerosols are a sufficient cause. In fact, addition of any mathematical parameter that reduces radiation reaching the ground will simulate less global warming and therefore appear to be a better model than those employed in 1992.
A more logical defense of the sulfate hypothesis would be to take the non-sulfate models and to see if their failure was minimized in large areas where the sulfate effect would be minimized. I have formally tested this hypothesis (Michaels et al., 1994) and found that the nonsulfate models performed best where the sulfate cooling was supposed to be greatest.
Another test would be simply to compare the Northern and Southern Hemispheres. The southern half of the planet is virtually sulfate-free, and yet satellite-measured temperatures there are falling, both in absolute value and in comparison to the Northern Hemisphere, which is the half of the planet that should be relatively cooled by sulfates. The chance that there is in fact no negative trend in the Southern Hemisphere satellite data is less than 1 in 10,000.
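For readers wondering what lies behind a "1 in 10,000" statement, the standard approach is to fit a least-squares trend to the anomaly series and test whether the slope differs from zero. The sketch below is a generic version of such a test; the anomaly series is a placeholder, not the actual satellite record.

    # Generic least-squares trend test of the kind behind a "1 in 10,000" claim.
    # The anomaly series below is a placeholder, not real satellite data.
    import math

    def trend_and_t(anomalies):
        """Return (slope per time step, t statistic for slope != 0)."""
        n = len(anomalies)
        x_mean = (n - 1) / 2.0
        y_mean = sum(anomalies) / n
        sxx = sum((i - x_mean) ** 2 for i in range(n))
        sxy = sum((i - x_mean) * (y - y_mean) for i, y in enumerate(anomalies))
        slope = sxy / sxx
        intercept = y_mean - slope * x_mean
        sse = sum((y - (intercept + slope * i)) ** 2 for i, y in enumerate(anomalies))
        se_slope = math.sqrt(sse / (n - 2) / sxx)
        return slope, slope / se_slope

    # Placeholder monthly anomalies (deg C) with a slight downward drift.
    series = [0.1 - 0.002 * i + 0.05 * math.sin(i) for i in range(200)]
    slope, t = trend_and_t(series)
    print(f"slope = {slope:.4f} C per month, t = {t:.1f}")
    # A large |t| on a long series corresponds to a probability far below
    # 1 in 10,000 that the true trend is actually zero.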
Data: The data sources consist of the surface temperature records noted above and temperatures sensed by microwave-sounding satellites. While these are NOAA platforms, the data capture and analyses are performed by NASA. This system is currently threatened by machine failures related to expected lifetimes.
Maintenance of this datastream is of the highest priority, as it provides strong and objective evidence against the "dangerous" warming paradigm.
3. Stratospheric ozone depletion does not explain the lack of warming.
In response to the first two items noted in this testimony, a novel explanation is emerging: that stratospheric ozone depletion explains the lack of warming obvious in the satellite data, and also explains the disparity between ground-measured temperatures and the satellite data. NASA modeler James Hansen noted this in a January report about satellite temperatures in the Washington Post. Last week, climatologist Phil Jones noted the same in a presentation at the head office of the Australian Meteorological Bureau in Melbourne.
Ozone depletion is greatest in the high latitudes, particularly in the Southern Hemisphere, and there is virtually no depletion in the tropical half of the planet. Thus the disparity between ground and satellite temperatures should be greatest in the high latitudes and least in the lower latitudes. The opposite is true: satellite and ground-measured temperatures match very well in high latitudes and mismatch in the tropics.
A parenthetical argument has been that somehow the satellite data are in error in the free atmosphere. In fact, on a global basis, the agreement between satellite data and the mean temperature sensed by weather balloons in the layer between 5,000 and 30,000 feet is truly remarkable, with a statistical correspondence of nearly 95%.
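The "statistical correspondence" quoted here is presumably a correlation-type measure between the two records; the following is a minimal sketch of such a comparison, with placeholder anomaly pairs standing in for the satellite and radiosonde layer means.

    # Minimal correlation check between two co-located temperature series.
    # The numbers are placeholders, not the actual satellite/radiosonde data.
    import statistics

    satellite  = [0.10, -0.05, 0.20, 0.00, -0.15, 0.05, 0.12, -0.08]   # deg C anomalies
    radiosonde = [0.12, -0.02, 0.18, 0.01, -0.17, 0.04, 0.10, -0.06]

    r = statistics.correlation(satellite, radiosonde)   # Pearson r (Python 3.10+)
    print(f"Pearson r = {r:.3f}, shared variance r^2 = {r * r:.3f}")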
Data: In addition to the two data sources mentioned previously, this analysis uses the global weather balloon (radiosonde) network data that are collated in large part by NOAA scientists and the National Climatic Data Center. Maintenance and analysis of this network at the highest level of quality is clearly as high a priority as the other data priorities noted above.
4. The most likely explanation for the failure of the models is in their forecast of temperature in the zone from 5,000 to 50,000 feet.
All transient climate models, including those modified with sulfate aerosol, predict a rather smooth and consistent warming of the entire troposphere, or bottom 50,000 feet of the atmosphere. It is apparent that the atmosphere has cooled above 30,000 feet and that there has been no net change in the last 20 years in the region between 5,000 and 30,000 feet. There was a sudden jump in the temperature of this layer in 1977; no change was noted prior to then, either.
The implications of the behavior in these layers are profound. Everything else being equal, the cooling from 30,000 to 50,000 feet increases the vertical transport of moisture in the atmosphere and could be responsible for observed increases in large-area cloudiness. In turn, the increases in cloudiness could explain the overall lack of warming and its distribution primarily into the winter and night; in the region of the planet that has shown the most warming (Siberia), the ratio of winter to summer warming (which is, essentially, night (or low sun) to day) is approximately 4.2 to 1. The primary climatic effect would be to slightly lessen the severity of the coldest airmasses in the Northern Hemisphere.
Several scientists, including Dr. Lindzen on this Panel, have hypothesized that model calculations of the vertical distribution of heat in the atmosphere are likely to be wrong or at least unreliable; the failure of the models to even simulate the sign of observed changes in the last two decades bears strong witness to this hypothesis.
Data: While there are many hypotheses concerning the inaccuracy of the prediction of vertical temperature change, data required to resolve this will come from atmospheric sounding devices such as weather balloons and satellite profilers. These should therefore be high priorities for support.
5. Lack of data causes embarrassing errors.
In early January we were treated to a number of stories stating that ground-based thermometers indicated that 1995 was the warmest year in the instrumental record. In fact, the temperatures used to make this calculation included data only through November 1995, plus an assumption that the December temperature departure from normal would be the same as it was for the rest of the year. This temperature history was compiled by Phil Jones of the University of East Anglia. As noted in the satellite data (Figure 1), the December temperature departure from the mean in fact reflected the largest single one-month drop in the entire record for the Northern Hemisphere, declining by 0.72 C from November.
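The arithmetic point is easy to see in a short sketch: an annual mean built from eleven observed months plus an assumed December can change noticeably once the real December arrives. The monthly values below are placeholders, chosen only to show how a 0.72 C one-month revision propagates into the annual figure.

    # Illustration of how an assumed December affects an annual mean.
    # All monthly anomalies are placeholders (deg C departures from normal).
    jan_to_nov = [0.40] * 11
    assumed_december = 0.40              # "same as the rest of the year"
    observed_december = 0.40 - 0.72      # the one-month drop cited above

    provisional = (sum(jan_to_nov) + assumed_december) / 12
    revised = (sum(jan_to_nov) + observed_december) / 12
    print(f"provisional annual mean: {provisional:.3f} C")
    print(f"revised annual mean:     {revised:.3f} C")
    # A single month revised downward by 0.72 C lowers the annual mean by
    # 0.72 / 12 = 0.06 C.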
These types of errors could be expected to propagate throughout the climate measurement system if surface data collection and analysis is allowed to degrade.
6. Increases in strong summer rainfall are not explainable by climate models, and are more likely to have been beneficial rather than "dangerous."
In his March 17, 1995 "Earth Day" address at George Washington University, the vice-president noted that, with regard to global climate change, "torrential rains have increased in the summer during agricultural growing regions." (Gore)
The source for his statement was a then-unpublished manuscript by Thomas Karl et al. that showed a statistically significant, but very small increase in the percent of United States rainfall that occurred from rainstorms that produced at least two inches of rain in 24 hours. The equivalent change is that, on the average, there is now one more day in every 730 days in which it rains more than two inches.
Examination of Karl's own graphics indicates that the largest increase in this parameter took place between 1930 and 1950, or prior to 70% of the human-induced changes in the greenhouse effect. Ascribing this change to the greenhouse effect is therefore dubious at best. Perhaps as important, the vast majority (70%, nationally) of summer 24-hour rainfalls greater than two inches are less than three inches. It is very hard to entertain the notion that a 2-3 inch rain in summer is "torrential," and it is, in toto, hardly a "dangerous" change in climate. At that time of year, all major agricultural regions of the U.S. are, on average, losing more moisture to evaporation than they gain from rainfall, and this rain is much more likely to be welcomed by farmers than feared.
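For context, the Karl-type statistic at issue is simply the share of total rainfall delivered by days with at least two inches; a minimal sketch follows, using placeholder daily totals rather than the actual cooperative-observer record.

    # Share of total rainfall coming from days with >= 2 inches (placeholder data).
    daily_rain_inches = [0.0, 0.3, 0.0, 2.4, 0.1, 0.0, 1.1, 0.0, 2.1, 0.0, 0.6]

    total = sum(daily_rain_inches)
    heavy = sum(r for r in daily_rain_inches if r >= 2.0)
    print(f"{100 * heavy / total:.1f}% of the rain fell on 2-inch-plus days")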
Data: The data for this study came from the cooperative-observer network of raingauges, maintained by the National Climatic Data Center. Maintenance of this high-quality record is a very high priority. Recent attempts to augment this history with satellite measurements have met with limited success, but it is nonetheless important to develop a system that reliably provides global monitoring of rainfall, especially over the ocean and in unpopulated regions. This should also be a very high priority.
7. Model calculations have made large errors in projections of jet stream positions.
These areas have been discussed in the testimony of Robert Davis.
Data: The data required to examine these aspects of model performance include weather balloon soundings and ground-based rainfall measurements. As noted above, these can and should be augmented with an improved remotely sensed rainfall network as well as satellite-based vertical profilers of atmospheric temperature, pressure, and moisture.
Thoughts on funding: It is clear that we live in an era of declining federal financial support of many aspects of the nation's life. One cannot expect that federal research budgets will be immune to this trend. In the area of global climate change, there is little doubt that several public servants feel very strongly that research indicates the "dangerous" view of climate change is more likely than the newer, "moderate" synthesis. Thus the federal research presence in this area will always raise suspicions, real and imagined, that the funding process has become politicized. One can change neither this perception nor the ambitions of those who champion any point of view; rather, the best tactic may be to admit to the reality of the problem and purposefully attempt to broaden the research base to explicitly include as providers some of the communities that are especially concerned with the issue of climate change.
Might it not be appropriate, in this era of declining funding, for interested parties other than government to begin to assume some of the research burden? I am referring specifically to two groups with considerable resources: industry and the environmental community. Perhaps you can develop a mechanism where both of these groups explicitly demonstrate financial support for research on climate change, and then the federal outlay is reduced an equivalent amount. I do not know much of these matters, but I suspect there are some incentives that can aid in this process.
This proposal would have the effect of maintaining financial support for research on climate change while broadening the base of support. There is no constitutional fiat that I know of that requires that the federal government be the sole provider of funding for climate change research, even though that is virtually the case today. And further, there is demonstrable evidence that removal of the "monopoly" provider status of the federal government may in fact result in a more diverse, and therefore healthier, research culture on this important issue.
REFERENCES
• Intergovernmental Panel on Climate Change (IPCC), 1990. U.N. Environment Programme. 200 pp.
• Intergovernmental Panel on Climate Change (IPCC), 1992. Climate Change 1992: The Supplementary Report to the IPCC Scientific Assessment.
• Karl, T.R., et al., 1995. Nature 337, 217-220.
• Michaels, P. J., and D. E. Stooksbury, 1992. Bull. Amer. Met. Soc.
• Michaels, P. J. et al., 1994. Technology: J. Franklin Inst. 331A, 123-133.
• Mitchell, J.F.B., et al., 1995. J. Climate 8, 2364-2385.
• Wigley, T.M.L., 1987. Climate Monitor 16, 14-28.
Rhumb