Work Group Meeting on
IT Techniques, Tools and Philosophies for Model Intercomparison
Ispra, JRC, March 14, 2004
Data handling approaches, software tools, participant involvement, data dissemination, integration issues
Program:
09:00-09:15 Welcome/Background — Dentener
09:15-10:00 Thunis/Cuvelier — CityDelta/EuroDelta
10:00-10:45 Michael Schulz/Stefan Kinne — AeroCom
10:45-11:00 Coffee
11:00-11:45 Charles Doutriaux — Ocean modelling
11:45-12:30 Rudolf Husar — American aerosol intercomparisons
12:30-14:00 Lunch + discussion
14:00-14:30 Christiane Textor — GEMS
14:30-15:15 Stefano Galmarini — Radioactivity alert system
15:15-15:30 Tea
15:30-17:00 Discussion + summarizing recommendations
Meeting Background & Purpose
F. Dentener, JRC, Ispra
Background
• There is an increasing number of model intercomparison studies
• These cover different modelling scales (global, regional, local), pollutants (gases, aerosols, metals, radioactivity) and comparison methodologies
• Intercomparison participants may not be familiar with similar work elsewhere
• A common problem for all these studies is the intensive use of information technologies for bringing together, homogenizing and comparing the models
Meeting Goals
• Present the major intercomparison studies, with emphasis on IT issues
• Discuss IT issues and solutions (e.g. common formats, conventions)
• Recommend actions for joint IT-related efforts
Past Model Intercomparison Studies
• ACCENT-PhotoCom
• EuroDelta
• AEROCOM
• MICS-Asia
• Mercury Transport
The Euro- and City-Delta Model Intercomparison
P. Thunis, K. Cuvelier, JRC, Ispra
EuroDelta: evaluates the uncertainty of source-receptor relationships used in air quality policy (CHIMERE, REM, EMEP, MATCH, LOTOS, TM5)
CityDelta: includes sub-grid effects in a Europe-wide health impact assessment for PM and O3 (CALGRID, MUSE, CAMX, TRANSCHIM, MUSCAT, EUROS, OFIS, MOCAGE, STEM, REM, LOTOS, CHIMERE, EPISODE, EMEP)
Data flow: original participant data (~TB) → pre-processing (JRC) → processed data (20 GB), analysis tool, on the Web
AeroCom – Aerosol Model Comparison
C. Textor, S. Guibert, S. Kinne, J. Penner, M. Schulz, F. Dentener
• Questionnaire 2002, 4 workshops, ~40 participants
• 20 models, 3 experiments (original model setup + emissions 2000 + emissions 1750)
• Central model database (~2 TB), public web interface to images, joint papers
http://nansen.ipsl.jussieu.fr/AEROCOM/data.html
Proposal for cooperation: "Atmospheric Tracer Model Intercomparison Tools Initiative"
GEMS – Global Earth-system Monitoring using Space and in-situ data
C. Textor
• Goal: Build validated, operational assimilation system for atmospheric composition and dynamics, by 2008.
• Integrated Project co-funded by the EC, 17 M€, 31 consortium members, 4 years (started in March 2005)
Reg. AQ: Ensemble forecasts
Model Evaluation Methods?
• "Eyeball" methods
• Basic statistical evaluation
• Sophisticated skill scores
Model Evaluation Tools?
• MetPy (ECMWF)
• MMAS (FMI)
• AeroCom (LSCE)
• CDO, MetView?, CDAT, ...
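The "basic statistical evaluation" named above typically reduces model-vs-observation pairs to a few bulk scores. A minimal pure-Python sketch (the function name and score selection are illustrative, not taken from any of the tools listed):

```python
import math

def evaluation_scores(model, obs):
    """Bulk model-vs-observation scores: mean bias, RMSE, Pearson r."""
    n = len(model)
    assert n == len(obs) and n > 1, "need paired series of equal length"
    bias = sum(m - o for m, o in zip(model, obs)) / n
    rmse = math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / n)
    mean_m = sum(model) / n
    mean_o = sum(obs) / n
    cov = sum((m - mean_m) * (o - mean_o) for m, o in zip(model, obs))
    var_m = sum((m - mean_m) ** 2 for m in model)
    var_o = sum((o - mean_o) ** 2 for o in obs)
    r = cov / math.sqrt(var_m * var_o)
    return {"bias": bias, "rmse": rmse, "r": r}
```

A constant offset between model and observations shows up in the bias and RMSE while leaving the correlation at 1, which is why all three scores are usually reported together.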
Towards Automated Model Output Analysis
Charles Doutriaux
• Python-based system
• Packages added by the community
• One environment
• Community software
• From central to distributed resources: data, visualization, supercomputers
• Collaboration and data sharing
• Analysts focus on the work, not the mechanics
AutoMOD – Automated Model Diagnostic Facility
CDAT / ESG
• Data adheres to standards: NetCDF format, CF-compliant
• Data pre-processed via code
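"CF-compliant" in practice means each physical variable carries agreed metadata attributes. A minimal sketch of such a check, assuming the variable attributes have already been read into a dict (e.g. from a NetCDF file); this is illustrative, not a full CF checker:

```python
# CF requires 'units' on physical variables; a recognizable name should
# come from 'standard_name' (controlled vocabulary) or 'long_name'.
REQUIRED_ATTRS = ("units",)
NAME_ATTRS = ("standard_name", "long_name")

def cf_issues(var_name, attrs):
    """Return a list of human-readable CF-compliance problems."""
    issues = []
    for attr in REQUIRED_ATTRS:
        if attr not in attrs:
            issues.append(f"{var_name}: missing '{attr}' attribute")
    if not any(a in attrs for a in NAME_ATTRS):
        issues.append(f"{var_name}: needs 'standard_name' or 'long_name'")
    return issues
```

Run against every variable in a submitted file, this kind of check catches most metadata problems before the data ever enters a shared intercomparison database.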
ENSEMBLE: Reconciliation of Disparate National Medium/Long Range Dispersion Forecasts
Stefano Galmarini, JRC, Ispra
• Following Chernobyl, accidental radioactive dispersion (concentration, deposition) needs to be forecast over the next few days (60 h)
• National services use LRTP models
DataFed: Federated Data System for Air Quality
R. B. Husar, Washington University
• AQ info is distributed over many 'dimensions': geography, content, agency...
• Info content includes emissions, ambient & satellite data, and models
• Info is provided and consumed by different agencies (NASA, NOAA, EPA...)
• Providers have different access protocols, formats, and information usage
• Standardization is a key need for agile IT systems
• Non-intrusive mediators can achieve virtual standardization
• Technologies are currently available for dynamic NETWORKING
• Eager to cooperate, share networked data, tools for model comparison
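The "non-intrusive mediator" idea above can be sketched as a thin adapter: each provider keeps its native record layout, and a mapping translates records onto a common schema only at read time. All field names and the provider mapping below are invented for illustration:

```python
# Common schema every mediated record is translated into.
COMMON_FIELDS = ("site", "time", "param", "value", "units")

def mediate(record, mapping):
    """Translate one provider record into the common schema without
    touching the provider's own data store."""
    return {common: record[native] for common, native in mapping.items()}

# Hypothetical per-provider mapping: common field -> native field name.
provider_a = {"site": "STATION_ID", "time": "DATE_UTC",
              "param": "POLLUTANT", "value": "CONC", "units": "UNIT"}

# A raw record as this hypothetical provider would serve it.
raw = {"STATION_ID": "IT0477A", "DATE_UTC": "2005-07-01T12:00Z",
       "POLLUTANT": "PM10", "CONC": 31.5, "UNIT": "ug m-3"}
```

Adding a new provider then means writing one mapping table, not changing any consumer code, which is what makes the standardization "virtual".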
Proposal for cooperation: Atmospheric Tracer Model Intercomparison Tools Initiative
Goals
General goals:
• Accelerate analysis of models and feedback to model participants
• Jointly develop intercomparison tools through transformation, integration, adaptation and development of tools
• Allow participants to join more easily in the analysis of an intercomparison
Specific objectives to which the network should contribute, for any intercomparison:
• Model documentation
• Model quality control
• Model comparison
• Model benchmarking
• Model improvement
• Scientific understanding
Procedure
• First, tutorial institutions are identified which provide general support for the planning and implementation of the initiative (JRC, LSCE/IPSL + ??)
• A steering committee is put in place to develop the initiative and report to the tutorial institutions.
• A work plan is elaborated by early summer 2006 to structure and prioritise the actions to be undertaken.
• A pilot project is the support of the planned first phase of the intercomparison under the HTAP convention, by reusing and integrating tools prepared for EuroDelta, ACCENT and AeroCom.
• Present the initiative at forthcoming meetings (e.g. AeroCom 17-19 October, HTAP workshops, etc.) and promote its existence through publication (IGAC newsletter? + ?)
Workplan components
The workplan aims to define a detailed implementation plan: what steps are taken, who is doing them, and when they will be ready.
1) Identify the formatting standards needed > units, CF or less strict??
2) Identify standard names for specific tracer variables > report to CF
3) Identify the standard interface specifications between any of the tools and any of the databases (see slide 5), so that different tools can call and address each database
4) Develop a standard protocol form to be used for different intercomparisons and subparts of them (reference + compliance with check tools)
5) Define standards for file names and image names for databases
6) Develop compliance test tools to test whether data and tools fulfil the standards set under 1-5
7) Implement a pilot database based on AutoMOD for testing of different existing tools (AeroCom catalogues etc.)
8) Build a web-based repository for users to find tools:
   • to prepare CF-compliant model output (Fortran routines)
   • to rename and reformat files (NCO examples)
   • to do specific diagnostics (regional budgets, aerosol size fractions etc.)
   • to do regridding compliant with the standards
   • to handle/replace/identify missing data
9) Develop check tools for the physical meaning of data (budgets, units, order of magnitude, ocean/land contrast)
10) Develop observational data comparison tools which interface to the model data as defined under 1-4
11) Develop tools for higher-order analysis results (ensemble averages, spatial correlation)
12) Documentation tool for keeping track of model version characteristics
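The "physical meaning" checks of item 9 can start very simply: flag values that fall outside a plausible range for each variable. A sketch, where the variable names and ranges are illustrative placeholders, not agreed intercomparison limits:

```python
# Plausible (min, max) per variable, in the variable's expected units.
# These bounds are invented for illustration only.
PLAUSIBLE = {
    "sconcso4": (0.0, 1e-6),   # surface SO4 concentration, kg m-3
    "od550aer": (0.0, 10.0),   # aerosol optical depth at 550 nm
}

def flag_implausible(var, values):
    """Return indices of values outside the plausible range for `var`.

    Catches unit mistakes (orders of magnitude off) and sign errors
    (e.g. negative concentrations) before deeper analysis.
    """
    lo, hi = PLAUSIBLE[var]
    return [i for i, v in enumerate(values) if not (lo <= v <= hi)]
```

A submission whose optical depths come back flagged at nearly every point usually signals a unit or scaling error rather than bad science, so such checks give fast feedback to participants.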
Recommended