CCSM Testing Status Tony Craig Lawrence Buja Wei Yu CCSM SEWG Meeting Feb 5, 2003

CCSM Testing Status

Tony Craig

Lawrence Buja

Wei Yu

CCSM SEWG Meeting

Feb 5, 2003

Outline

• Testing: when, who, why, what
• What we do now
• Where do we go next

Test Strategy: When-Who-Why

WHEN          WHO                        WHY
pre-commit    developer                  validate changes
post-commit   test engineer, automated?  verify commits
pre-release   release team               verify model is ready for release
post-release  test engineer, automated?  verify platform is not changing

Test Strategy: What

• Exact restart
• Bit-for-bit, round-off, other
• Different platforms
• Different resolutions
• Serial, multitasked, threaded, hybrid
• Physics packages, dynamical cores
• Scientific validation - long climate runs
• Performance
• I/O, data
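The bit-for-bit vs. round-off distinction above can be sketched as a small comparison routine. This is a hypothetical helper for illustration only; the real CCSM tools compare model history files, not Python lists:

```python
import struct

def compare(a, b, roundoff=1e-12):
    """Classify agreement between two sequences of floats as
    'bit-for-bit', 'round-off', or 'other' (illustrative helper,
    not a CCSM tool)."""
    if len(a) != len(b):
        return "other"
    # Bit-for-bit: identical IEEE-754 bit patterns, not just ==
    # (so -0.0 vs 0.0 counts as a difference).
    if all(struct.pack("<d", x) == struct.pack("<d", y) for x, y in zip(a, b)):
        return "bit-for-bit"
    # Round-off: relative differences near machine epsilon.
    if all(abs(x - y) <= roundoff * max(abs(x), abs(y), 1.0)
           for x, y in zip(a, b)):
        return "round-off"
    return "other"

print(compare([1.0, 2.0], [1.0, 2.0]))          # bit-for-bit
print(compare([1.0, 2.0], [1.0, 2.0 + 1e-15]))  # round-off
print(compare([1.0, 2.0], [1.0, 2.5]))          # other
```

"Other" is the failure class: a difference larger than round-off means the change was not climate-neutral and needs scientific review.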

Testing Process for each Component

              ccsm          atm           lnd           ocn          ice          cpl   data
pre-commit                  comp dev      dev           gate-keeper  gate-keeper  ?     ?
post-commit   test engr     test engr     gate-keeper   gate-keeper  gate-keeper  ?     ?
pre-release   release team  release team  release team  N/A          N/A          N/A   ?
post-release  test engr     ?             ?             N/A          N/A          N/A   ?

What do we do now

• CAM test-model
  – used by developers
• CAM dev branch testing
  – automated testing after each commit
• CCSM beta tag tests
  – manual testing of entire system periodically
• CCSM release tests
  – automated testing of release versions regularly

CAM test-model

• Script that runs many CAM cases automatically
• Runs on many platforms (SGI, IBM, CPQ, PC/Linux)
• Many automated tests
  – dynamical cores
  – tasks and threads
  – error growth
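The sweep over dynamical cores and task/thread layouts can be sketched as a configuration matrix. The axis values below are illustrative, not the actual lists test-model uses:

```python
from itertools import product

# Illustrative configuration axes; the real test-model script
# builds and runs CAM for each combination it covers.
dycores = ["eul", "sld", "fv"]
layouts = [(4, 1), (2, 2), (1, 4)]   # (MPI tasks, OpenMP threads)

def test_matrix():
    """Enumerate (dycore, tasks, threads) cases to run."""
    return [(d, t, th) for d, (t, th) in product(dycores, layouts)]

for case in test_matrix():
    print("would run CAM with dycore=%s tasks=%d threads=%d" % case)
```

Crossing every dycore with every layout is what makes the suite catch task/thread bugs that only appear in one core.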

CAM dev branch testing

• Automatic testing of CAM every night after a commit on the primary development branch
• Uses test-model
• Runs on chinook, blackforest, anchorage
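The nightly driver pattern amounts to: check whether the branch head moved since the last run, and launch the suite only if it did. A minimal sketch, with an invented repository interface (the real setup polled the CAM development branch):

```python
import os

STATE = "last_tested.txt"

def head_revision():
    # Stand-in for querying the repository head; FAKE_HEAD is an
    # invented hook so this sketch runs without a repository.
    return os.environ.get("FAKE_HEAD", "rev-001")

def needs_testing():
    """True if the branch head changed since the last nightly run."""
    head = head_revision()
    last = open(STATE).read().strip() if os.path.exists(STATE) else None
    return head != last

def record_tested():
    # Remember which revision the suite last ran against.
    with open(STATE, "w") as f:
        f.write(head_revision())

if needs_testing():
    print("new commit detected; would launch test-model suite")
    record_tested()
else:
    print("no new commits; skipping")
```

Recording the tested revision keeps the cron job idle on quiet nights instead of burning machine time rebuilding an unchanged branch.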

CCSM beta tag tests

• Exact restart testing
  – various configurations
  – various resolutions
• Comparison with previous beta tag
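Exact restart testing runs the model straight through, then again with a restart in the middle, and requires bit-for-bit identical final states. A toy sketch, where a simple integer recurrence stands in for the model (invented for illustration, not CCSM physics):

```python
def step(state):
    # Stand-in for one model timestep (a deterministic recurrence).
    return (state * 1664525 + 1013904223) % 2**32

def run(state, nsteps):
    for _ in range(nsteps):
        state = step(state)
    return state

# Straight-through run of 10 steps.
baseline = run(42, 10)

# Run 5 steps, "write a restart file" (here: just keep the state),
# then resume for the remaining 5 steps.
restart_state = run(42, 5)
resumed = run(restart_state, 5)

assert resumed == baseline  # exact restart: bit-for-bit identical
print("exact restart OK")
```

In the real model the restart file must capture every piece of prognostic state; any field left out shows up as a difference in this test.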

CCSM Release Tests

• Weekly testing of CCSM2.0 and CCSM2.0.1 releases on
  – chinook
  – blackforest
  – bluesky
  – seaborg
• Includes some patches

CCSM performance testing

• Carry out a large suite of timing tests
• Comparison with previous versions
• Determine appropriate load balance
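Determining a load balance amounts to distributing processors so that concurrently running components finish at about the same time. A rough sketch, using invented per-component costs (real numbers come from the timing runs above):

```python
# Invented per-component costs (seconds per simulated day on one PE);
# real values come from CCSM timing tests.
cost = {"atm": 240.0, "ocn": 120.0, "lnd": 30.0, "ice": 60.0}

def wall_time(pes):
    """Concurrent components: the slowest one sets the pace
    (assumes perfect scaling, which real components do not have)."""
    return max(cost[c] / pes[c] for c in cost)

def balance(total_pes):
    """Split PEs proportionally to cost so components finish together."""
    total_cost = sum(cost.values())
    return {c: max(1, round(total_pes * cost[c] / total_cost))
            for c in cost}

layout = balance(64)
print(layout, "->", round(wall_time(layout), 1), "s/day")
```

A proportional split is only a starting point; the timing suite then measures actual scaling curves and shifts PEs away from components that stop scaling.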

What next

• Develop a formal test plan document
• Determine Who, When, Why, What
• Establish formal processes for each component and for the CCSM overall
• Consider resources required (both people and computer time)
• Unit tests

Notes

• Test test-model
• Test configuration thoroughly; what’s supported
  – configuration = dycores, large components, physics, machine dependencies
• Chunking
• Testing <-> requirements
• test-model lite
• Add tests - cost?
• Difference between dev tests, nightly tests, release testing; test-model is same for all
• Community testing

Notes - 2

• POP - coarse and high resolution tests cover most of physics space, also different PE configurations
• test-model compares to previous version; very useful
• test-model for land exists
• Make stand-alone and CCSM versions of the model consistent; unify compiler options?
• CCSM requirements vs component requirements
• Specs for makefile, specs in general?
• Unit tests - physics wrapper

Notes - 3

• Decide where tests take place and who is responsible for which parts
• Recommendation that working groups test coupling aspects, and also verify make in the coupled system
• Debug flags testing
• Gap in test process wrt make
• Track make fixes/changes through the bug tracker
• Go to working groups, get test requirements

Notes - 4

• Performance requirement
• Science requirement
• Is it just the “control run”?
• Get developers’ input on what should be in the test suite
• Library issues, internal libraries (ESMF, MCT), MASS
• Test cost for test-model: chinook (1 hour, 16 PEs), blackforest (30-40 minutes, 32 PEs), babyblue (30 minutes), anchorage (30 minutes); lots of time spent building
• Regular testing, automated testing, different frequencies

Notes - 5

• Have components provide test suite to CSEG
• Different levels of testing

Recommendations

• Component working groups test coupling; CSEG liaisons coordinate
• Develop specs for makefile
• Go to working groups, find out test requirements, then decide who does what
• Develop CCSM test requirements
• Revisit benchmarking requirements

Open Discussion

• Community has a difficult time keeping up with what’s happening at NCAR
• Could have more forward planning: IPCC, resolutions, chemistry, BGC
• SE and science status
• Should we make component development work more visible?
• CRBs, adaptation, software practices
• Need more background on decisions made
• Code review - need? time?
• Design walkthroughs and code reviews are working for the ocean model
• Code reviews help educate others on the code

Open Discussion - 2

• SEWG encourages the use of code reviews; SEWG will explain what we encourage. Recommend a brown bag.
• Recommend a code walkthrough of cpl6
• Discuss status of code review plans at the June workshop, internal to SEWG
• ccsm-progs, ccsm-sci, wg notes to be distributed to ?
• Get mailing lists under control
• Who do we make recommendations to? SSC, SE dev guide, word of mouth

Open Discussion - 3

• Get input from many developers on code review plans