A changing climate for climate modeling

David I. Lewin, [email protected]

Is the Earth’s climate changing, and to what degree are these changes due to human actions? These are the key questions that must be answered before ratification of the Kyoto Protocol, which would limit the emission of greenhouse gases by industrialized and industrializing nations. Measurements alone—whether land- or sea-based or by satellites—cannot give the answers, but must be used to evaluate the credibility of supercomputer models of the atmospheric, oceanic, and biotic systems that determine the world’s climate. Unlike weather models, which deal, at most, with forecasts of atmospheric phenomena a week in advance, climate models deal with the entire atmosphere, ocean, and ice/land-surface system for decades or even centuries.

“Weather is an initial value problem because it starts with a particular aspect of the system that uses deterministic results of the hydrodynamic equations,” states Jerry Mahlman, director of the Geophysical Fluid Dynamics Laboratory, a National Oceanic and Atmospheric Administration research center at Princeton University. In contrast, he says, “Climate is a time-dependent boundary-value problem.” Instead of trying to forecast the weather 50 years from now, climate modelers are interested in the statistics for the weather a half-century hence—what the range of temperatures and other weather phenomena is most likely to be.
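To make Mahlman’s distinction concrete, here is a minimal numerical sketch in Python, using the Lorenz-63 equations as a toy stand-in for the atmosphere. The choice of system and every parameter are illustrative assumptions; no real climate model is this simple. Two runs from nearly identical initial states soon diverge, which is what limits weather forecasts, yet their long-run statistics agree closely, which is what climate modeling computes.

import numpy as np

SIGMA, RHO, BETA = 10.0, 28.0, 8.0 / 3.0  # classic Lorenz-63 parameters

def lorenz(s):
    """Right-hand side of the Lorenz-63 equations (toy stand-in only)."""
    x, y, z = s
    return np.array([SIGMA * (y - x), x * (RHO - z) - y, x * y - BETA * z])

def rk4_step(s, dt):
    """One fourth-order Runge-Kutta step."""
    k1 = lorenz(s)
    k2 = lorenz(s + 0.5 * dt * k1)
    k3 = lorenz(s + 0.5 * dt * k2)
    k4 = lorenz(s + dt * k3)
    return s + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

def trajectory(s0, n_steps=100_000, dt=0.01):
    """Integrate from s0 and return the whole trajectory."""
    out = np.empty((n_steps, 3))
    s = np.asarray(s0, dtype=float)
    for i in range(n_steps):
        s = rk4_step(s, dt)
        out[i] = s
    return out

a = trajectory([1.0, 1.0, 1.0])
b = trajectory([1.0, 1.0, 1.0 + 1e-6])   # a tiny initial perturbation

# The initial-value (weather) view: the two final states are unrelated.
print("final-state difference:", np.abs(a[-1] - b[-1]))

# The boundary-value (climate) view: the long-run statistics barely differ.
print("mean, std of x (run a):", a[:, 0].mean(), a[:, 0].std())
print("mean, std of x (run b):", b[:, 0].mean(), b[:, 0].std())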

Although the atmosphere adjusts to change relatively quickly, the ocean can take decades or centuries to adjust to a change, says oceanographer Yi Chao of the National Aeronautics and Space Administration’s Jet Propulsion Laboratory. He believes that to study the natural variability in the climate due to the interactions of the ocean and atmosphere, researchers need to integrate the coupled climate models over long periods of time—hundreds of simulated years. “There are only a few places in this country—or in the world—that have the capability to run century time-scale climate models,” he says.

CLIMATE MODELING’S HISTORY

Climate modeling goes back to the mid-1960s, beginning with one-dimensional models (altitude only) of the atmosphere, although mathematical modeling for weather prediction began two decades earlier. By the end of the 1960s, researchers had created complex ocean models for coupling with atmospheric models. More recently, researchers have developed models that incorporate sea ice, clouds, and biogenic gases (such as carbon dioxide and methane).

Contemporary ocean models can produce realistic simulations for several decades, including boundary currents, eddies, and El Niño, notes Albert J. Semtner of the Naval Postgraduate School, Monterey, California, but “the most recent calculations for the Atlantic Ocean only, done at Los Alamos National Laboratory, showed that you need a 0.1-degree grid with 40 vertical levels to represent these phenomena.” Figure 1 shows an example of this. Such a model would require 100 gigaflops (where one gigaflop equals a billion floating-point operations per second) of sustained computer power to carry out multidecade integrations.
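A back-of-envelope calculation suggests why such a grid demands sustained gigaflops. The time step and per-point operation count below are illustrative assumptions, not figures from the Los Alamos runs; only the grid dimensions and the 100-gigaflops rate come from the article.

# Rough cost of the 0.1-degree, 40-level global ocean grid Semtner cites.
nx = round(360 / 0.1)          # longitude points
ny = round(180 / 0.1)          # latitude points
nz = 40                        # vertical levels
points = nx * ny * nz          # about 2.6e8 grid points

dt_seconds = 600               # assumed ~10-minute time step (CFL-limited)
steps_per_year = 365 * 24 * 3600 // dt_seconds
flops_per_point_step = 500     # assumed cost of the dynamics per point

flops_per_sim_year = points * steps_per_year * flops_per_point_step
sustained = 100e9              # 100 gigaflops sustained, as quoted

hours_per_sim_year = flops_per_sim_year / sustained / 3600
print(f"{points:.2e} points, ~{hours_per_sim_year:.0f} h per simulated year")
print(f"a 50-year integration: ~{hours_per_sim_year * 50 / 24:.0f} days")

Under these assumptions, a single 50-year integration ties up the full machine for more than a month, which is why multidecade runs were out of reach without sustained (not merely peak) performance.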

“There has been a lot of progress in understanding the climate system,” Mahlman notes. In his opinion, some of the biggest problems that researchers must address include the proper treatment of clouds and the best approach to ocean-atmosphere coupling. Another problem needing better treatment is sea ice modeling, says Warren Washington of the National Center for Atmospheric Research (NCAR) in Boulder, Colorado. Models for sea ice transport, especially near the edge of the ice, need to be improved.

The results of such climate models have led to fierce debates, as much political as scientific, in the US over the reality of global warming. “Most scientists [agree] that the climate is changing due to greenhouse gas buildup, and the people who argue against this are in the minority,” says Semtner, “but the details are very uncertain—the models are aiming at defining the timing and spatial distribution of climate change.” But there is still sharp disagreement over whether existing models are adequate for policy setting.

RESTRICTIONS ON MODELS

One of the limiting factors on the use of these models is the availability of computer power to perform the simulations. The adequacy of computer resources and appropriateness of institutional arrangements for climate modeling in the United States was the subject of a National Research Council (NRC) report released in January—Capacity of US Climate Modeling to Support Climate Change Assessment Activities.

Although the US climate-modeling research community is an acknowledged leader in improving our understanding of specific aspects of the climate system, it has lagged behind European researchers in producing large-scale models used in recent international assessments of global climate change and its impacts, according to the NRC report. The report concludes that “although individual federal agencies may have established well-defined priorities for climate modeling research, there is no integrated national strategy,” unlike that seen in Europe and Japan. Furthermore, the NRC report notes that “insufficient human and computational resources are being devoted to high-end, computer-intensive comprehensive modeling.” Finally, the report expresses concern about US reliance on other countries for high-end climate modeling, because other countries’ models can embody different geographic and scientific priorities.

One of the events leading the NRC to undertake the two-year study was an October 1995 letter from four prominent modelers to program managers at several federal agencies that support climate modeling research. The letter expressed concern about the lack of funding for this research, the inadequacy of computing resources as compared to those abroad, and the lack of national coordination.

“What has happened since the time of that letter is that the computing situation [in the US] has gotten worse,” says Semtner. An ocean modeler, Semtner was one of the authors of the 1995 letter and notes that European researchers have access to NEC’s top-of-the-line vector supercomputers, which US research centers—particularly NCAR—have been unable to acquire because of a prohibitive 400% tariff imposed by the Department of Commerce, which protects the domestic supercomputer industry. “We’ve really lost about three years in terms of having access to the best computers in the world for these models, which happen to be made in Japan [the new NEC SX-5 and machines from Fujitsu and Hitachi].”

Mahlman agrees that, in the short term, the tariff imposed on the SX systems is a real problem for climate researchers. In a competition on sustained performance for climate modeling, “my opinion is that right now the Japanese [computers] would win,” he says. However, within five years, he expects the odds to favor massively parallel distributed-memory systems.

The comparable US supercomputer, the T-90 from Cray Research (now part of Silicon Graphics Inc.), turned out to be far too expensive and not as fast as it had been initially reported to be, Semtner says. “My impression is that Cray Research did not pursue the refinements that were needed in the vector architecture while they were pursuing the massively parallel path that the High Performance Computing and Communications Program was steering them towards,” he says. Because of this, Cray Research was unable to match the improvements in performance and cost of the Japanese SX-4 vector machine, which delivers 25 gigaflops. If NCAR had been able to procure the four SX-4s that it wanted, this would have given climate modelers 100 gigaflops of sustained computing power, Semtner says.

Figure 1. The two sides of this figure compare the sea-surface temperature (SST) in the Gulf Stream (GS) region measured by a satellite radiometer with the SST from the 0.1-degree POP simulation of the Atlantic Ocean. The comparison is only qualitative because the color scale used in the satellite image is not available and therefore could not be reproduced in the model image. However, important points of comparison are the narrowness of the core of the GS, separation of the GS at Cape Hatteras, the scale of meanders downstream of the separation, and the formation of cold (warm) core eddies south (north) of the GS. A notable discrepancy is the appearance in the model of cold water along the coastline from Cape Hatteras to southern Florida, instead of the relatively warm water seen in the observations. Simulation by Richard Smith, Mathew Maltrud (Los Alamos National Laboratory), Frank Bryan, and Matthew Hecht (National Center for Atmospheric Research) with support from the Department of Energy CHAMMP program and the National Science Foundation; visualization by Patrick McCormick (LANL Advanced Computing Laboratory).

With the best vector supercomputers out of reach in the US, researchers have turned to running their models on distributed and massively parallel systems. “There is considerable optimism in the US high-performance computing community that parallelism will triumph” and that the US mode of using many cheap microprocessors together “will ultimately be the only game in town,” Semtner says. Despite this, “there is no substitute for having high speed built into a single CPU.”

Climate models require calculations at sustained speeds to carry out very long time integrations for spatial structures that are not large in comparison with the typical problems solved on massively parallel computers. “Something that’s relatively unique to climate models is that we take relatively coarse-grained data and take the results out for centuries,” says Steve Hammond, a computer scientist involved in parallel computing projects at NCAR. However, NCAR and the Department of Energy have invested many man-years in studies that successfully ran the global climate change model.

Because climate models are not as strongly parallel as many fluid dynamics models, researchers have tended to use vector computers, says JPL’s Chao, who also notes that climate modelers are still learning how to use parallel computers efficiently.

Vector computers have a single, central memory that supplies all of the processors, while each processor in a massively parallel computer has its own memory, Chao says. One problem with trying to run climate models on massively parallel, distributed-memory computers is that, for these models, the performance is only 5%–10% of the maximum advertised by the manufacturer, because the memories are slow and the communication between the processors is limited, Semtner says. Another problem with regard to parallel systems is that they often link the commodity processors together into clusters. However, says Hammond, the debugging and performance-measurement tools used on vector supercomputers do not always work on processor clusters, let alone across such clusters.
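The communication cost Semtner describes comes from the halo (ghost-cell) exchange that a domain-decomposed grid code typically performs every time step: each processor must receive its neighbors’ boundary rows before it can apply its stencil. The sketch below, written with mpi4py, shows the pattern on a toy grid split into latitude bands; the decomposition, array sizes, and stencil are illustrative assumptions, not any production model’s code.

from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

nlat, nlon = 64, 128                       # local latitude band per process
field = np.full((nlat + 2, nlon), float(rank))   # +2 halo rows (top, bottom)

north = rank - 1 if rank > 0 else MPI.PROC_NULL
south = rank + 1 if rank < size - 1 else MPI.PROC_NULL

# Exchange boundary rows with both neighbors; the computation below must
# wait on this step, which is where distributed-memory machines lose time.
comm.Sendrecv(sendbuf=field[1, :], dest=north,
              recvbuf=field[-1, :], source=south)
comm.Sendrecv(sendbuf=field[-2, :], dest=south,
              recvbuf=field[0, :], source=north)

# Only now can each process apply its stencil (a toy 4-point average here).
interior = 0.25 * (field[:-2, 1:-1] + field[2:, 1:-1]
                   + field[1:-1, :-2] + field[1:-1, 2:])

Run, for example, as mpiexec -n 4 python halo.py; the point is that every time step pays a latency and bandwidth cost that a vector machine with one shared memory simply does not have.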

MODIFICATIONS

Researchers such as Chao are working to optimize climate models to run on massively parallel computers such as the Cray T3E, which has 512 processors. Their goal is to take coupled atmosphere-ocean models and run them ever faster on such massively parallel computing systems. As a NASA Grand Challenge project, Chao and his colleagues have increased the speed at which their general ocean circulation model runs within the coupled ocean-atmosphere system model by two- to threefold, from 10 gigaflops during the first year of their project to 40 gigaflops in the second year, representing about 20% of peak speed. In the project’s third year, they are coupling the ocean-atmosphere model with component models of atmospheric chemistry, ocean chemistry, and biology.
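In outline, such a coupled system advances each component model independently and exchanges boundary fields at a fixed coupling interval. The toy sketch below shows only that pattern; the classes, placeholder dynamics, and daily coupling interval are hypothetical illustrations, not the interfaces of the actual JPL coupled model.

import numpy as np

class Atmosphere:
    """Toy atmospheric component; holds the field it passes to the ocean."""
    def __init__(self, shape=(90, 180)):
        self.surface_wind = np.zeros(shape)

    def step(self, sst, dt):
        # Placeholder dynamics: relax the winds toward a function of the
        # sea-surface temperature on a 30-day timescale (made-up numbers).
        tau = 30 * 86400.0
        self.surface_wind += (dt / tau) * (0.5 * sst - self.surface_wind)

class Ocean:
    """Toy ocean component on the same grid as the atmosphere."""
    def __init__(self, shape=(90, 180)):
        self.sst = np.full(shape, 15.0)    # sea-surface temperature, deg C

    def step(self, surface_wind, dt):
        # Placeholder dynamics: a slow response to the imposed winds.
        tau = 90 * 86400.0
        self.sst += (dt / tau) * (15.0 + 0.2 * surface_wind - self.sst)

atm, ocn = Atmosphere(), Ocean()
couple_dt = 86400.0                        # exchange boundary fields daily

for day in range(365):                     # one simulated year
    # Each component advances using the other's last exchanged state;
    # a real coupler would also regrid and conserve the exchanged fluxes.
    atm.step(ocn.sst, couple_dt)
    ocn.step(atm.surface_wind, couple_dt)

print("mean SST after one year:", ocn.sst.mean())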

This increased speed enables the researchers to do longer runs at an increased resolution, “which is important for ocean modeling,” Chao says. Because of the differences between air and water, important ocean features such as mesoscale eddies, the equivalent of storms in the atmosphere, are much smaller in scale. For most ocean models, he notes, researchers use a less than 100-km grid size. This loses substantial detail, and the JPL group is now trying to model with a finer mesh scale, approximately 15 km, which lets the researchers explore what the proper scale needs to be to determine climate effects.
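The cost of that refinement is steep. Shrinking the horizontal spacing multiplies the number of grid columns by the square of the refinement factor, and the CFL stability limit shortens the allowable time step by roughly the same factor, so total cost grows about as the cube. That cubic rule of thumb is an assumption here, not a figure from the JPL project.

# Rough cost scaling from a ~100-km to a ~15-km ocean mesh.
coarse_km, fine_km = 100.0, 15.0
r = coarse_km / fine_km
print(f"refinement factor:  ~{r:.1f}x")
print(f"horizontal points:  ~{r**2:.0f}x more")
print(f"total cost (points x time steps): ~{r**3:.0f}x")   # roughly 300x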

Atmospheric models that cover the Earth’s surface also need refining. Models covering the western US, for example, need improved vertical and horizontal resolution, NCAR’s Washington says, to take into account that as you go inland from the coast, the surface rises into a series of mountains and basins from California to Colorado, rather than the linear increase in elevation used in current models.

In addition to concerns about adequate computer resources, researchers in the field are concerned about the level of funding and the goals of climate modeling. There is an implicit assumption that climate modeling is solely directed to support global change assessment activities, says Mahlman. Due to this assumption, he notes, “there are issues that are being under-addressed because of lack of funds.”

For example, little effort is being put into the question of detection and attribution—researchers might be able to detect a signal of the kind they are looking for, but is the signal evidence of global change or of bad data? Computing power alone cannot let researchers distinguish between these two options, Mahlman says. Further, the infrastructure for disseminating the results of modeling is almost nonexistent, he says, in contrast to the situation for weather forecasting. These and other issues show the need for a better balance between the short-term needs of policy-making and the long-term needs of climate science.

In one area, the conclusions of the NRC report are nearly obsolete. A number of the climate researchers interviewed noted that coordination among the federal agencies funding climate modeling has improved greatly. There will not be one national model as proposed by Semtner and his coauthors of the 1995 letter, says Washington. Rather, each agency is supporting a few modeling groups, and those groups are exchanging modules between their models. Such interchange will allow researchers to choose the best computational approach for each physical component contributing to a model, he says.

David I. Lewin is a contributing editor to Concurrency, and served as contributing editor for Computers in Physics. He writes on science, medicine, and technology from Silver Spring, Maryland. Contact him at [email protected].