
PT postprocessing Fieldextra


Page 1: PT postprocessing Fieldextra

Federal Department of Home Affairs FDHA
Federal Office of Meteorology and Climatology MeteoSwiss

PT postprocessing
Fieldextra

Jean-Marie Bettems

07.09.2010

Moscow (RU)

Page 2: PT postprocessing Fieldextra

2 COSMO GM, PT postprocessing / Moscow, September 2010

Agenda

16h30 - 16h45  [JMB] Fieldextra - a short refresher
               [JMB] Highlights of version 10.2

16h45 - 17h15  [JMB] Live demo (if technical infrastructure allows)

17h15 - 17h45  [JMB] Proposed extension of PT post-processing
               [JMB] Other planned developments for the 2010-2011 period
               [JMB] High performance fieldextra
               [JMB] Sharing development effort
               [ALL] Other requirements (e.g. GUI, DWD headers)

17h45 - 18h15  [Uli, JMB] Grib 2 implementation
               → Vertical coordinate parameters in Grib2
               → Problems with coding the reference atmosphere
               → How to recognize the originating model (e.g. COSMO-2, COSMO-7)
               → Coordinate usage within COSMO
                 (short names, local tables, local section, local usage, fxtr dictionary)

18h15 - 18h30  [ALL] Feedback from participants
               → Do you plan to use fieldextra? For which type of application?
               → Who already tried the program? On which platform?
               → Any problems with installation? Or when running the code?
               → What is missing in the documentation? ...

Page 3: PT postprocessing Fieldextra


Fieldextra – Identity card (1)

• Some examples of typical model pre-/post-processing tasks:

• merge surface temperature from IFS over sea and from COSMO over land to produce a single field suited for the assimilation cycle

• interpolate Swiss radar composite onto the COSMO-2 grid for feeding the latent heat nudging process

• compute EPS probabilities from COSMO-LEPS members

• compute neighbourhood probabilities from COSMO-7

• compute convective index like KO or CAPE

• compute time series of W_SO, stratified by soil type, region averaged, and for a set of standard depths

• create a single XML file with time series of parameters from COSMO-2 / -7 / -EPS and IFS for a set of locations

→ use fieldextra !
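The EPS probability products in this list boil down to counting, per grid point, the ensemble members that satisfy a condition. A minimal sketch of that idea (illustrative only, not fieldextra code; names and the toy ensemble are made up):

```python
import numpy as np

def exceedance_probability(members, threshold):
    """Fraction of EPS members exceeding a threshold at each grid point.

    members: array of shape (n_members, ny, nx)
    Returns an array of shape (ny, nx) with values in [0, 1].
    """
    return (members > threshold).mean(axis=0)

# Toy ensemble: 4 members on a 1x2 grid
members = np.array([
    [[0.0, 5.0]],
    [[2.0, 5.0]],
    [[0.0, 0.0]],
    [[3.0, 5.0]],
])
prob = exceedance_probability(members, threshold=1.0)
```

Neighbourhood probabilities follow the same counting idea, but pool the grid points of a surrounding window into the sample as well.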

Page 4: PT postprocessing Fieldextra


Fieldextra – Identity card (2)

• Fortran program designed as a generic tool to manipulate NWP model data and gridded observations

• Built as a toolbox ...

• implement a set of primitive operations, which can be freely combined and iterated

• controlled by Fortran namelists

• File based input/output ...

• support both GRIB1 and GRIB2 (input/output)

• support local extension of GRIB standard

• understand naming conventions of COSMO output

• rich set of output formats in addition to GRIB (CSV, XML ...)

Page 5: PT postprocessing Fieldextra


Fieldextra – Identity card (3)

• Primary focus is the production environment

• high quality standard (code design, testing)

• robust handling of exceptions

• comprehensive diagnostics

• optimized code:

• input/output (read model output once, produce as many products as desired)

• memory footprint

• time criticality

• inter-process communication (support parallel production suite)

Page 6: PT postprocessing Fieldextra


Fieldextra – Identity card (4)

• More than 60k lines of Fortran 2003

• Linked with DWD grib library (GRIB1), ECMWF grib API (GRIB2), JasPer (JPEG in GRIB2) and some COSMO modules

• Standalone package available on COSMO web site

• Single threaded code

• Portable code (used on SGI Origin, IBM Power, Cray Opteron …)

• Documented code (examples, user manual, developer manual …)

• Resources allocated at MeteoSwiss for further development

Page 7: PT postprocessing Fieldextra


Fieldextra – Identity card (5)

• Core non-graphical NWP production tool at MeteoSwiss

• Official COSMO post-processing software

• Official tool for the EUMETNET SRNWP interoperability project

Page 8: PT postprocessing Fieldextra


Activities since last COSMO GM – Summary

• Releases

• 10.0.0 15 Nov. 2009 (first official COSMO release)

• 10.1.0 31 Jan. 2010 (MeteoSwiss internal only)

• 10.2.0 26 Aug. 2010

• Focus 10.0.0 – 10.2.0

• Improve user experience (package, namelist, diagnostic, ...)

• More versatile tool, less dependent on MeteoSwiss specificities

• Support for additional models besides COSMO

• Robust implementation of GRIB 2 input and output

Page 9: PT postprocessing Fieldextra


Activities since last COSMO GM – Highlights (1)

• Create standalone distribution package

• source code, incl. libraries

• resource files (dictionary, location ...)

• examples, with reference results

• documentation (installation, user, developer)

• support for PathScale and GNU compiler

• Support of GRIB2 for input and output data

• support product templates 0 & 8, 1 & 11 (EPS member), 2 & 12 (EPS derived), 5 & 9 (probability)

• support both simple and JPEG data packing

• based on ECMWF GRIB API v1.9.0

• on a single core as efficient as the DWD grib library for GRIB1

• still missing features (local section, kilometric grid ...)

Page 10: PT postprocessing Fieldextra


Activities since last COSMO GM – Highlights (2)

• Extend set of recognized products (model, product category, vertical coord., EPS information)

                        gme    ifs    cosmo
  -------------------------------------------------
  deterministic         YES    YES    YES
  eps member            NA     NO     YES (1)
  eps mean              NA     NO     YES (2)
  eps median            NA     NO     YES (2)
  eps probability       NA     NO     YES (2)
  neighb. probability   YES    YES    YES
  -------------------------------------------------
  (1) both COSMO LEPS and COSMO DE EPS suites
  (2) only COSMO LEPS

• Operators to compute HFL and HHL by integrating the hydrostatic equation
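Layer-by-layer integration of the hydrostatic equation, as these operators do, can be sketched with the standard hypsometric relation (an illustrative reconstruction, not the fieldextra implementation; function and variable names are hypothetical):

```python
import math

RD = 287.05   # gas constant for dry air [J kg-1 K-1]
G  = 9.80665  # gravitational acceleration [m s-2]

def half_level_heights(p_half, t_virt, z_surface=0.0):
    """Integrate the hydrostatic equation upward from the surface.

    p_half: pressures at half levels, ordered surface -> top [Pa]
    t_virt: layer-mean virtual temperatures, one per layer [K]
    Returns the heights of the half levels [m].
    """
    z = [z_surface]
    for k in range(len(t_virt)):
        # hypsometric equation: dz = Rd * Tv / g * ln(p_bottom / p_top)
        dz = RD * t_virt[k] / G * math.log(p_half[k] / p_half[k + 1])
        z.append(z[-1] + dz)
    return z

# Isothermal layer from 1000 hPa to 900 hPa at Tv = 280 K
z = half_level_heights([100000.0, 90000.0], [280.0])
```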

Page 11: PT postprocessing Fieldextra


Activities since last COSMO GM – Highlights (3)

• Other new features

• localisation of operators (fxtr_operator_generic, fxtr_operator_specific) and of output routines (fxtr_write_generic, fxtr_write_specific)

• new operator to interpolate model fields from half to full levels

• introduce strict_usage in &RunSpecification

• introduce soft_memory_limit in &GlobalSettings

• implement flag-file based inter process communication (force stop, ready files)

• and more ...

• Bug correction, code clean-up, memory optimization

• See the HISTORY file for the complete list of modifications(pay attention to the ... ATTENTION section!)
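The two new switches mentioned above live in the control namelists; a minimal sketch (the values and comments are illustrative, exact semantics and any further keys follow the fieldextra user manual):

```fortran
&RunSpecification
  strict_usage      = .true.   ! stricter checking of namelist usage (see user manual)
/
&GlobalSettings
  soft_memory_limit = 2000.    ! illustrative value; see user manual for units
/
```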

Page 12: PT postprocessing Fieldextra


... 30 minutes

Live demo

Page 13: PT postprocessing Fieldextra


Design – Input / output

[Diagram: inputs 1-3 are read in a first pass (INCORE), then again in a second pass (standard output) to produce Output 1 ... Output 4]

INCORE / two-pass processing:

• Each input is read once.

• Storage is allocated on demand for each output.

• Each record is evaluated, and dispatched into the corresponding output storage when requested (in memory).

• When all data for some output has been collected, the corresponding data is written to disk and the memory is released.

• For output supporting append mode, data is processed piecewise after reading each related validation time ('just on time' mode).

Page 14: PT postprocessing Fieldextra


Design – Iterative processing

[Diagram: inputs 1..n are combined through chains of field operators into an output; intermediate results (in_field, tmp1_field, out_field) can be kept in the non-volatile INCORE storage and re-used in later iterations]

F11(Ψ) : 1 to 1 field operator
Fn1(Ψ) : n to 1 field operator
Fnm(Ψ) : n to m field operator

Page 15: PT postprocessing Fieldextra


Design – Field transformations

Within each recipient, after a new set of fields is produced, the following transformations can be applied:

1. Lateral transformation (hoper)
2. Scale/offset
3. Vertical transformation (voper)
4. Temporal transformation (toper)
5. Local transformation (poper)
6. Spatial filter

The order of operations is hard coded. Operators can be chosen independently for each field.
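The hard-coded ordering can be mimicked with a tiny pipeline (illustrative only; in fieldextra the operators are selected through namelists, not callables):

```python
# The six steps run in a fixed order; each step is optional,
# but their relative order never changes.
PIPELINE = ["hoper", "scale_offset", "voper", "toper", "poper", "filter"]

def apply_transformations(field, steps):
    """Apply the requested transformations in the fixed pipeline order.

    field: any value the step functions accept
    steps: dict step-name -> callable; unknown step names are rejected
    """
    unknown = set(steps) - set(PIPELINE)
    if unknown:
        raise ValueError(f"unknown steps: {unknown}")
    for name in PIPELINE:              # order is hard coded here
        f = steps.get(name)
        if f is not None:
            field = f(field)
    return field

# Scale/offset runs before the temporal operator, regardless of dict order:
out = apply_transformations(10.0, {"toper": lambda x: x + 1,
                                   "scale_offset": lambda x: x * 2})
```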

Page 16: PT postprocessing Fieldextra


Live demo

• Example 1: meteogram of total precipitation at chosen locations (ASCII output)

• resources: dictionary, locations

• time loop construct

• horizontal and temporal operator

• deriving a new field

• Example 27: regrid COSMO-7 wind field components (GRIB1 output)

• working with Arakawa staggering

• working with vector fields

• re-gridding operators

• Example 22: deriving probability fields from an EPS system (GRIB1 output)

• EPS loop construct

• deriving EPS products

• Example 7: conditional selection of grid points (N-TUPLE output)

• mask usage

Page 17: PT postprocessing Fieldextra


Done !

Live demo

Page 18: PT postprocessing Fieldextra


Priority task postprocessing – Proposal for a one year extension: Rationale

Two new major fieldextra releases have been delivered since the last COSMO GM, with a focus on providing

a better user experience (package, documentation, diagnostic, user interface) ;

a more versatile tool, less dependent on a specific COSMO configuration ;

a robust implementation of GRIB 2 input and output.

Nevertheless, not all planned tasks have been fulfilled, and some new needs have been detected. The large spectrum of potential applications for this tool (pre-/post-processing, EPS processing, SRNWP interoperability) and the significant interest raised within the COSMO community make an extension of this PT meaningful.

Page 19: PT postprocessing Fieldextra


Priority task postprocessing – Description of individual sub-tasks (2010 – 2011)

1. [EXTENDED] Support usage outside of MeteoSwiss [0.05 FTE, MCH resources]
   > support for initial installation at interested centers, bug fixes
   > organize one day tutorial at DWD

2. [EXTENDED] Support of GME and ICON on native grid [??? FTE, no resources available]
   > utility routines to work with GME/ICON grids are available in INT2LM
   > fieldextra currently assumes a regular grid; this should be relaxed

3. [DELAYED, MODIFIED] Implement NetCDF support (out) [0.05 FTE, MCH resources]
   > use support from ECMWF GRIB API to implement NetCDF output
   > necessity to implement NetCDF input has to be re-evaluated

4. [DELAYED] Consolidate interface COSMO/fieldextra (fieldextra code) [0.015 FTE, MCH resources]
   > use latest COSMO module, support new reference atmosphere

Page 20: PT postprocessing Fieldextra


Priority task postprocessing – Description of individual sub-tasks (2010 – 2011)

5. [DELAYED] Provide a common library of modules for physical and mathematical constants, meteorological functions, etc. [0.2 FTE, DWD resources]

6. [DELAYED] Extend fieldextra functionality [0.075 FTE, MCH resources; ??? FTE from other centres]
   > review all formulas in fieldextra and use common library when possible
   > implement vorticity on p-surfaces using metric terms from COSMO
   > implement parameters requested by COSMO members

7. [NEW] Consolidate GRIB2 implementation [0.15 FTE, MCH resources]
   > further adapt internal code structure (e.g. level representation)
   > implement additional features (e.g. product template for radar)
   > coordinate usage within COSMO (e.g. short names, local tables)

8. [NEW] Consolidate documentation [0.075 FTE, MCH resources; 0.05 FTE DWD resources]
   > finalize developer documentation
   > format in LaTeX

Page 21: PT postprocessing Fieldextra


Roadmap (1)

December 2010 – Version 10.2.1 (MeteoSwiss internal)

→ Needs MeteoSwiss / Vorticity as showcase for use of metric terms

• Kalman corrections

• Support latest SLEVE coordinates

• Consolidate interface COSMO/fieldextra (latest COSMO code, new ref. atmosphere, vorticity)

Page 22: PT postprocessing Fieldextra


Roadmap (2)

March 2011 – Version 10.3

→ Consolidate code / NetCDF & GRIB2 / SRNWP interoperability

• Internal code improvements

• Finalize support for SRNWP interoperability (incl. support of additional grids if necessary)

• use other SRNWP LAM to generate MeteoSwiss products
• generate SRNWP interoperability file from COSMO output

• Consolidate GRIB2 implementation (treatment of level, missing features, ...)

• Coordinate usage of GRIB2 within COSMO

• NetCDF output

Page 23: PT postprocessing Fieldextra


Roadmap (3)

September 2011 – Version 11.0

→ COSMO priority task

• Graceful handling of missing input files in a temporal series

• Improve developer documentation

• Consolidate interface COSMO/fieldextra (use COSMO common library when possible)

Page 24: PT postprocessing Fieldextra


High Performance Fieldextra

[Diagram: inputs 1-3 feed parallel workers, each producing one of Output 1 ... Output 4]

• From sequential to parallel processing of output

• output processing means computation of user defined operations and production of output file

• potentially important performance gain when working in post-processing mode with many outputs; scalability with respect to the number of products

• load balance achieved by distributing processing according to a pre-computed profiling (post-processing production is fairly static)

• Direct in-memory communication between COSMO and fieldextra

• i/o (disk access) will become more and more of a bottleneck, in particular for EPS systems

• live visualization methods (VisIt) exist which use such an approach; CSCS is willing to help with this task
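Distributing output processing according to a pre-computed profile is essentially a static scheduling problem; one common approach is the longest-processing-time greedy heuristic (a sketch with illustrative product names and costs, not fieldextra code):

```python
import heapq

def assign_products(costs, n_workers):
    """Greedily assign products to the currently least-loaded worker,
    largest cost first (LPT heuristic), using per-product costs
    measured in a previous profiling run.

    costs: dict product -> measured cost
    Returns dict worker index -> list of assigned products.
    """
    heap = [(0.0, w) for w in range(n_workers)]   # (current load, worker)
    heapq.heapify(heap)
    plan = {w: [] for w in range(n_workers)}
    for product, cost in sorted(costs.items(), key=lambda kv: -kv[1]):
        load, w = heapq.heappop(heap)             # least-loaded worker
        plan[w].append(product)
        heapq.heappush(heap, (load + cost, w))
    return plan

plan = assign_products({"p1": 5.0, "p2": 3.0, "p3": 2.0, "p4": 1.0}, 2)
```

Because post-processing production is fairly static, such a plan can be computed once from the profiling data and re-used for every production run.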

Page 25: PT postprocessing Fieldextra


Feature Requests

→ See Excel Sheet

• DWD headers ?

! $RCSfile: src_leapfrog.f90,v $
! $Revision: 1.4 $  $Date: 2009/07/01 15:38:24 $
!+ Source module for computing the leapfrog dynamical time stepping
!------------------------------------------------------------------------------

MODULE src_leapfrog

!------------------------------------------------------------------------------
!
! Description:
!   .......
!
! Current Code Owner: .......
!
! History:
! Version    Date       Name
! ---------- ---------- ----
! V4_4       2008/07/16 Ulrich Schaettler, Dmitrii Mironov
! .......
!
! Code Description:
! Language: Fortran 90.
! Software Standards: "European Standards for Writing and
!   Documenting Exchangeable Fortran 90 Code".
!==============================================================================

• Graphical user interface ?

• M. Raschendorfer diagnostic tool for turbulence ?

• Others ?

Page 26: PT postprocessing Fieldextra


Sharing development effort

→ Resources at MeteoSwiss for fieldextra development are limited!

→ Quality standard must be maintained (production tool)!

• Implementing new operators or new types of ASCII output is reasonably easy, and should be done by the interested center (mail support from MeteoSwiss can be expected)

• output format: fxtr_write_specific, or fxtr_write_generic (general interest)

• operator: fxtr_operator_specific, or fxtr_operator_generic (general interest)

• MeteoSwiss resources will be reserved for changes requiring a deep knowledge of the code (or for own needs). Non MeteoSwiss resources should be found for tasks such as the development of a GUI.

Page 27: PT postprocessing Fieldextra


GRIB 2

• Vertical coordinate parameters (→ Slides by Uli)

• Problems with coding the reference atmosphere (→ Slides by Uli)

• How to identify the originating model
  • e.g. COSMO-1, COSMO-2, COSMO-7 or COSMO-EU, COSMO-DE
  • this is important information (e.g. for the end user)
  • there is no provision in the current GRIB2 standard for a direct coding
  • shall one use the local section, or some of the locally defined tables ('Background generating process identifier' or 'Analysis or forecast generating process identifier'), or something else?

• Coordinate usage within COSMO
  • Short names (and fieldextra dictionaries)
  • Local section, local tables (at least document on COSMO web site)
  • Local usage (e.g. new field, new horizontal coordinates)

Page 28: PT postprocessing Fieldextra


Feedback

• Do you plan to use fieldextra? For which type of application?

• Already confirmed:
  MeteoSwiss (JM. Bettems)
  COSMO LEPS (A. Montani)
  DWD / production (M. Buchhold)
  Poland (A. Mazur)
  Romania (???)

• A focal point should be nominated for each interested center / activity

• Who already tried the program? On which platform? With which compiler?
• Any problems with installation?
• Any problems running the code?

• What is missing in the documentation?

• ... your last opportunity to make a comment is now!

Page 29: PT postprocessing Fieldextra


!+****************************************************************************
SUBROUTINE generate_output(multi_pass_mode, just_on_time, last_call,        &
                           datacache, data_origin, tot_nbr_input,           &
                           out_paths, out_types, out_modes,                 &
                           out_grib_keys, out_spatial_filters,              &
                           out_subset_size, out_subdomain, out_gplist, out_loclist, &
                           out_data_reduction, out_postproc_modules,        &
                           nbr_gfield_spec, gen_spec, ierr, errmsg )
!=============================================================================
!
! Root procedure to generate output files
!
!------------------------------------------------------------------------------

  ! Dummy arguments
  LOGICAL,                              INTENT(IN)  :: multi_pass_mode      ! Multiple pass mode?
  LOGICAL, DIMENSION(:),                INTENT(IN)  :: just_on_time         ! True if prod. now
  LOGICAL,                              INTENT(IN)  :: last_call            ! True if last call
  CHARACTER(LEN=*),                     INTENT(IN)  :: datacache            ! Data cache file
  TYPE(ty_fld_orig),                    INTENT(IN)  :: data_origin          ! Data origin
  INTEGER, DIMENSION(:),                INTENT(IN)  :: tot_nbr_input        ! Expected nbr. input
  CHARACTER(LEN=*), DIMENSION(:),       INTENT(IN)  :: out_paths            ! Output files names
  TYPE(ty_out_spec), DIMENSION(:),      INTENT(IN)  :: out_types            ! types
  TYPE(ty_out_mode), DIMENSION(:),      INTENT(IN)  :: out_modes            ! modes
  INTEGER, DIMENSION(:,:),              INTENT(IN)  :: out_grib_keys        ! grib specs
  INTEGER, DIMENSION(:),                INTENT(IN)  :: out_subset_size      ! subset size
  INTEGER, DIMENSION(:,:),              INTENT(IN)  :: out_subdomain        ! subdomain definition
  INTEGER, DIMENSION(:,:,:),            INTENT(IN)  :: out_gplist           ! gp definition
  CHARACTER(LEN=*), DIMENSION(:,:),     INTENT(IN)  :: out_loclist          ! locations definition
  CHARACTER(LEN=*), DIMENSION(:,:),     INTENT(IN)  :: out_spatial_filters  ! Condition defining filter
  TYPE(ty_out_dred), DIMENSION(:),      INTENT(IN)  :: out_data_reduction   ! Data reduction spec
  CHARACTER(LEN=*), DIMENSION(:),       INTENT(IN)  :: out_postproc_modules ! Specific postprocessing
  INTEGER, DIMENSION(:,:),              INTENT(IN)  :: nbr_gfield_spec      !+ Specifications of
  TYPE(ty_fld_spec_root), DIMENSION(:), INTENT(IN)  :: gen_spec             !+ fields to generate
  INTEGER,                              INTENT(OUT) :: ierr                 ! Error status
  CHARACTER(LEN=*),                     INTENT(OUT) :: errmsg               ! error message

  ! Local parameters
  CHARACTER(LEN=*), PARAMETER :: nm='generate_output: '   ! Tag

  ! Local variables
  LOGICAL :: exception_detected, exception, use_postfix
  LOGICAL :: unique_ftype, multiple_grid, exist
  LOGICAL, DIMENSION(3*mx_iteration+1) :: tmp_fddata_alloc, tmp_gpdata_alloc
  LOGICAL, DIMENSION(3*mx_iteration+1) :: tmp_value_alloc, tmp_flag_alloc
  INTEGER :: i1, i2, i3, i_fd, i_vd
  INTEGER :: nbr_input
  INTEGER :: out_idx, ios, idx_vd_defined
  CHARACTER(LEN=strlen) :: messg, temporal_res, out_path
  TYPE(ty_fld_type) :: out_ftype

  ! Initialize variables
  !---------------------
  ierr = 0 ; errmsg = ''
  exception_detected = .FALSE.
  tmp_fddata_alloc(:) = .FALSE. ; tmp_gpdata_alloc(:) = .FALSE.
  tmp_value_alloc(:)  = .FALSE. ; tmp_flag_alloc(:)   = .FALSE.

  ! Create/update data cache file
  !-------------------------------------------------------------------------
  ! The cache file must reflect the state of data(:) after the last call to
  ! collect_output (i.e. before any field manipulation done in prepare_pout)

  ! Loop over each output file
  !---------------------------
  output_file_loop: &
  DO i1 = 1, nbr_ofile
    out_idx   = data(i1)%ofile_idx
    nbr_input = COUNT( data(i1)%ifile_used )

    ! Skip bogus output
    IF ( data(i1)%ofile_bogus ) CYCLE output_file_loop
    ! Skip completed output
    IF ( data(i1)%ofile_complete ) CYCLE output_file_loop
    ! Skip empty data array
    IF ( ALL(.NOT. data(i1)%defined) ) CYCLE output_file_loop
    ! Only prepare output when all possible associated data have been collected
    ! or when 'just on time' production is active
    IF ( .NOT. last_call                    .AND. &
         nbr_input < tot_nbr_input(out_idx) .AND. &
         .NOT. just_on_time(out_idx) ) CYCLE output_file_loop

    ! At this point the corresponding output file will be produced
    ! Keep track of completed output file
    IF ( nbr_input >= tot_nbr_input(out_idx) ) data(i1)%ofile_complete = .TRUE.

    ! Build name of output, considering a possible temporary postfix
    use_postfix = .FALSE.
    IF ( LEN_TRIM(out_postfix) /= 0 .AND. data(i1)%ofile_usepostfix .AND. &
         .NOT. (data(i1)%ofile_firstwrite .AND. data(i1)%ofile_complete) ) &
      use_postfix = .TRUE.
    out_path = out_paths(out_idx)
    IF ( use_postfix ) out_path = TRIM(out_path) // out_postfix

    ! Release memory allocated in previous call to prepare_pout (if any)
    DO i2 = 1, 3*mx_iteration+1
      IF ( tmp_value_alloc(i2) )  DEALLOCATE(data_tmp(i2)%values, data_tmp(i2)%defined)
      IF ( tmp_flag_alloc(i2) )   DEALLOCATE(data_tmp(i2)%flag)
      IF ( tmp_fddata_alloc(i2) ) THEN
        DEALLOCATE(data_tmp(i2)%field_type,   data_tmp(i2)%field_origin,     &
                   data_tmp(i2)%field_name,   data_tmp(i2)%field_grbkey,     &
                   data_tmp(i2)%field_trange,                                &
                   data_tmp(i2)%field_level,  data_tmp(i2)%field_ltype,      &
                   data_tmp(i2)%field_prob,   data_tmp(i2)%field_epsid,      &
                   data_tmp(i2)%field_vref,   data_tmp(i2)%field_ngrid,      &
                   data_tmp(i2)%field_scale,  data_tmp(i2)%field_offset,     &
                   data_tmp(i2)%field_vop,    data_tmp(i2)%field_vop_usetag, &
                   data_tmp(i2)%field_vop_nlev, data_tmp(i2)%field_vop_lev,  &
                   data_tmp(i2)%field_pop,    data_tmp(i2)%field_hop,        &
                   data_tmp(i2)%field_top,    data_tmp(i2)%nbr_level,        &
                   data_tmp(i2)%level_idx,    data_tmp(i2)%nbr_eps_member,   &
                   data_tmp(i2)%eps_member_idx, data_tmp(i2)%field_idx )
      ENDIF
      IF ( tmp_gpdata_alloc(i2) ) THEN
        DEALLOCATE(data_tmp(i2)%gp_coord, data_tmp(i2)%gp_idx, &
                   data_tmp(i2)%gp_lat, data_tmp(i2)%gp_lon, data_tmp(i2)%gp_h)
      ENDIF
    END DO

    ! Prepare data for print out (calculate new fields, ... ; populate data_pout)
    ! * Info message
    IF ( just_on_time(out_idx) ) THEN
      messg = ' (just on time output)'
    ELSE IF ( nbr_input >= tot_nbr_input(out_idx) ) THEN
      messg = ' (all associated input collected)'
    ELSE
      messg = ''
    ENDIF

Thank you for your attention!