Forecast Verification Research
Beth Ebert and Laurie Wilson, JWGFVR co-chairs
WWRP-JSC meeting, Geneva, 21-24 Feb 2011
Aims
Verification component of WWRP, in collaboration with WGNE, WCRP, CBS
• Develop and promote new verification methods
• Training on verification methodologies
• Ensure forecast verification is relevant to users
• Encourage sharing of observational data
• Promote importance of verification as a vital part of experiments
• Promote collaboration among verification scientists, model developers and forecast providers
Working group members
Beth Ebert (BOM, Australia)
Laurie Wilson (CMC, Canada)
• Barb Brown (NCAR, USA)
• Barbara Casati (Ouranos, Canada)
• Caio Coelho (CPTEC, Brazil)
• Anna Ghelli (ECMWF, UK)
• Martin Göber (DWD, Germany)
• Simon Mason (IRI, USA)
• Marion Mittermaier (Met Office, UK)
• Pertti Nurmi (FMI, Finland)
• Joel Stein (Météo-France)
• Yuejian Zhu (NCEP, USA)
FDPs and RDPs
Sydney 2000 FDP
Beijing 2008 FDP/RDP
SNOW-V10 RDP
Sochi 2014
MAP D-PHASE
Severe Weather FDP
Typhoon Landfall FDP
Beijing 2008 FDP
Real Time Forecast Verification (RTFV) system
• Fast qualitative and quantitative feedback on forecast system performance in real time
– Verification products generated whenever new observations arrive
• Ability to inter-compare forecast systems
• 3 levels of complexity
– Visual (quick look)
– Statistics (quantitative)
– Diagnostic (more information)
B08FDP lessons for real time verification
• Real time verification considered very useful
• Forecasters preferred scatterplots and quantile-quantile plots
• Format and standardization of nowcast products was critical to making a robust verification system
• Difficult to compare "like" products created with different aims (e.g., QPF for warning vs hydrological applications)
• Verification system improvements
– User-friendly web display
– More user options for exploring results
SNOW-V10
• Verification strategy
– User-oriented verification for Olympic period of all forecasts, tuned to decision points of VANOC
– Verification of parallel model forecasts for Jan to August 2010
– Nowcast and regional model verification
• Rich dataset
Suggested categories for SNOW-V10 verification — Table 5 (2nd Revised Suggestion for SNOW-V10 Verification):

Temperature (°C): T < -25 | -25 ≤ T < -20 | -20 ≤ T < -4 | -4 ≤ T < -2 | -2 ≤ T < 0 | 0 ≤ T < +2 | +2 ≤ T < +4 | T ≥ +4
RH (%): RH < 30 | 30 ≤ RH < 65 | 65 ≤ RH < 90 | 90 ≤ RH < 94 | 94 ≤ RH < 98 | RH ≥ 98
Winds (m/s): w < 3 | 3 ≤ w < 4 | 4 ≤ w < 5 | 5 ≤ w < 7 | 7 ≤ w < 11 | 11 ≤ w < 13 | 13 ≤ w < 15 | 15 ≤ w < 17 | w ≥ 17
Wind gust (m/s): w < 3 | 3 ≤ w < 4 | 4 ≤ w < 5 | 5 ≤ w < 7 | 7 ≤ w < 11 | 11 ≤ w < 13 | 13 ≤ w < 15 | 15 ≤ w < 17 | w ≥ 17
Wind direction: 339° ≤ d or d < 24° (N) | 24° ≤ d < 69° (NE) | 69° ≤ d < 114° (E) | 114° ≤ d < 159° (SE) | 159° ≤ d < 204° (S) | 204° ≤ d < 249° (SW) | 249° ≤ d < 294° (W) | 294° ≤ d < 339° (NW)
Visibility (m): v < 30 | 30 ≤ v < 50 | 50 ≤ v < 200 | 200 ≤ v < 300 | 300 ≤ v < 500 | v ≥ 500
Ceiling (m): c < 50 | 50 ≤ c < 120 | 120 ≤ c < 300 | 300 ≤ c < 750 | 750 ≤ c < 3000 | c ≥ 3000
Precip rate (mm/hr): r = 0 (None) | 0 < r ≤ 0.2 (Trace) | 0.2 < r ≤ 2.5 (Light) | 2.5 < r ≤ 7.5 (Moderate) | r > 7.5 (Heavy)
Precip type: No precip | Liquid | Freezing | Frozen | Mixed (w/Liquid) | Unknown
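Thresholds like those above lend themselves to a simple binning step before categorical scores are computed. A minimal sketch, assuming left-closed bins as in the table; the `categorize` helper is illustrative and not part of the SNOW-V10 system (the edges shown are the temperature row):

```python
import bisect

def categorize(value, edges):
    """Return the 1-based category index for `value` given ascending bin edges.

    Category 1 is everything below the first edge; the last category is
    everything at or above the final edge; interior bins are left-closed,
    matching the table's "a <= x < b" convention.
    """
    return bisect.bisect_right(edges, value) + 1

# Temperature (degC) edges from the table: 8 categories
temp_edges = [-25, -20, -4, -2, 0, 2, 4]

print(categorize(-30, temp_edges))  # 1: T < -25
print(categorize(-3, temp_edges))   # 4: -4 <= T < -2
print(categorize(5, temp_edges))    # 8: T >= +4
```

Using `bisect_right` means a value exactly on an edge falls into the upper category, which is what "a ≤ x < b" bins require.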
Example: Visibility verification
lam1k Min. Visibility (m) at VOL, HSS = 0.095 (rows: forecast, columns: observed)

Forecast \ Observed |  < 30 | 30 ≤ x < 50 | 50 ≤ x < 200 | 200 ≤ x < 300 | 300 ≤ x < 500 | > 500 | Total
< 30                |     0 |           0 |            0 |             0 |             0 |     0 |     0
30 ≤ x < 50         |     0 |           0 |            0 |             0 |             0 |     0 |     0
50 ≤ x < 200        |     0 |           0 |           52 |            20 |            22 |    43 |   137
200 ≤ x < 300       |     0 |           0 |           76 |            18 |            19 |   103 |   216
300 ≤ x < 500       |     0 |           1 |           26 |            15 |            12 |    60 |   114
> 500               |     0 |           9 |          831 |           246 |           170 |  3743 |  4999
Total               |     0 |          10 |          985 |           299 |           223 |  3949 |  5466
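The HSS quoted with the table can be reproduced directly from the counts. A minimal sketch of the multi-category Heidke skill score; the `heidke_skill_score` helper is illustrative (not the RTFV code), and the counts are those shown above:

```python
def heidke_skill_score(table):
    """Multi-category Heidke skill score from a square contingency table
    (rows = forecast categories, columns = observed categories)."""
    n = sum(sum(row) for row in table)
    hits = sum(table[i][i] for i in range(len(table)))          # correct categories
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    expected = sum(r * c for r, c in zip(row_totals, col_totals))  # n * chance hits
    return (n * hits - expected) / (n * n - expected)

# Visibility counts from the contingency table above (Total row/column omitted)
vis_table = [
    [0, 0,   0,   0,   0,    0],
    [0, 0,   0,   0,   0,    0],
    [0, 0,  52,  20,  22,   43],
    [0, 0,  76,  18,  19,  103],
    [0, 1,  26,  15,  12,   60],
    [0, 9, 831, 246, 170, 3743],
]

print(round(heidke_skill_score(vis_table), 3))  # 0.095
```

The score measures the fraction of correct category assignments beyond what random forecasts with the same marginal totals would achieve.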
Sochi 2014
Standard verification
Possible verification innovations:
• Road weather forecasts
• Real-time verification
• Timing of events – onset, duration, cessation
• Verification in the presence of observation uncertainty
• Neighborhood verification of high-resolution NWP, including in time-height plane
• Spatial verification of ensembles
• User-oriented probability forecast verification
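For the "timing of events" item in the list above, one simple measure is the onset and cessation error between forecast and observed time series. A hypothetical sketch (the `event_timing_errors` helper and the snowfall values are invented for illustration, assuming a single contiguous event per series):

```python
def event_timing_errors(forecast, observed, threshold):
    """Onset and cessation timing errors, in time steps, for one event.

    `forecast` and `observed` are equally spaced time series; an "event"
    is any value at or above `threshold`. Returns (onset error,
    cessation error), positive when the forecast is late.
    """
    def onset(series):
        return next(i for i, v in enumerate(series) if v >= threshold)

    def cessation(series):
        # index of the last value at or above the threshold
        return len(series) - 1 - next(
            i for i, v in enumerate(reversed(series)) if v >= threshold)

    return (onset(forecast) - onset(observed),
            cessation(forecast) - cessation(observed))

# Hourly snowfall rate (mm/h); event = rate >= 1.0
fcst = [0, 0, 0, 1.2, 2.0, 1.5, 0.5, 0]
obs  = [0, 0, 1.1, 1.8, 1.6, 1.2, 0, 0]
print(event_timing_errors(fcst, obs, 1.0))  # (1, 0): onset 1 h late, cessation on time
```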
Collaboration
• WWRP working groups
• THORPEX
– GIFS-TIGGE
– Subseasonal prediction
– Polar prediction
• CBS
– Severe Wx FDPs
– Coordination Group on Forecast Verification
• SRNWP
• COST 731
• ECMWF TAC subgroup on verification measures
Spatial Verification Method Intercomparison Project
• International comparison of many new spatial verification methods
• Methods applied by researchers to same datasets (precipitation; perturbed cases; idealized cases)
• Subjective forecast evaluations
• Workshops: 2007, 2008, 2009
• Weather and Forecasting special collection
http://www.rap.ucar.edu/projects/icp
Spatial Verification Method Intercomparison Project
• Future variables
– "Messy" precipitation
– Wind
– Cloud
• Future datasets
– MAP D-PHASE / COPS
– SRNWP / European data
– Nowcast dataset(s)
• Verification test bed
Publications
• Recommendations for verifying deterministic and probabilistic quantitative precipitation forecasts
• Recommendations for verifying cloud forecasts (this year)
• Recommendations for verifying tropical cyclone forecasts (next year)
• January 2008 special issue of Meteorological Applications on forecast verification
• 2009-2010 special collection of Weather & Forecasting on spatial verification
• DVD from 2009 Helsinki Verification Tutorial
Outreach
• Verification workshops and tutorials
– On-site, travelling
• EUMETCAL training modules
• Verification web page
• Sharing of tools
http://www.cawcr.gov.au/projects/verification/
International Verification Methods Workshops
4th Workshop – Helsinki 2009
Tutorial
• 26 students from 24 countries
• 3 days
• Lectures, hands-on (took tools home)
• Group projects - presented at workshop
Workshop
• ~100 participants
• Topics:
– User-oriented verification
– Verification tools & systems
– Coping with obs uncertainty
– Weather warning verification
– Spatial & scale-sensitive methods
– Ensembles
– Evaluation of seasonal and climate predictions
5th International Verification Methods Workshop
• Melbourne, December 2011
• 3-day tutorial + 3-day scientific workshop
• Additional tutorial foci
– Verifying seasonal predictions
– Brief intro to operational verification systems
• Capacity building for FDPs/RDPs, SWFDP, etc.
New focus areas for JWGFVR research
"Seamless verification" - consistent across space/time scales

[Figure: spatial scale (point, local, regional, global) vs. forecast lead time (minutes, hours, days, weeks, months, years, decades), spanning nowcasts, very short range, NWP, sub-seasonal prediction, seasonal prediction, decadal prediction and climate change]

Approaches:
• deterministic / categorical
• probabilistic
• distributional
• other?
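Of the approaches listed, the probabilistic one applies unchanged from nowcast to decadal ranges, which is part of its appeal for seamless verification. The Brier score is the standard example; a minimal sketch with invented forecast values:

```python
def brier_score(probs, outcomes):
    """Mean squared error of probability forecasts against binary (0/1) outcomes.

    0 is a perfect score; 1 is the worst possible. The same formula works
    at any lead time, from nowcasts to seasonal outlooks.
    """
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

# Hypothetical probability-of-precipitation forecasts and observed outcomes
probs = [0.9, 0.7, 0.2, 0.1, 0.6]
outcomes = [1, 1, 0, 0, 1]
print(round(brier_score(probs, outcomes), 3))  # 0.062
```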
New focus areas for JWGFVR research
Spatial methods for verifying ensemble predictions
• Neighborhood, scale-separation, feature-based, deformation

[Figure: ensemble rain forecasts compared in terms of rain area, average rain, rain volume and maximum rain]
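The fractions skill score (FSS) is one widely used neighborhood method of the kind listed above: it compares event fractions in windows of increasing size, rewarding forecasts that put rain in roughly the right place even when displaced. A minimal pure-Python sketch (the grids and threshold are invented for illustration):

```python
def neighborhood_fractions(binary, size):
    """Fraction of 1s in a (2*size+1)-wide square window centred at each
    grid point, with the window clipped at the grid edges."""
    ny, nx = len(binary), len(binary[0])
    out = []
    for j in range(ny):
        row = []
        for i in range(nx):
            j0, j1 = max(0, j - size), min(ny, j + size + 1)
            i0, i1 = max(0, i - size), min(nx, i + size + 1)
            cells = [binary[jj][ii] for jj in range(j0, j1) for ii in range(i0, i1)]
            row.append(sum(cells) / len(cells))
        out.append(row)
    return out

def fss(fcst, obs, threshold, size):
    """Fractions skill score at one neighborhood size: 1 is perfect, 0 is no skill."""
    fb = [[1 if v >= threshold else 0 for v in row] for row in fcst]
    ob = [[1 if v >= threshold else 0 for v in row] for row in obs]
    pf = neighborhood_fractions(fb, size)
    po = neighborhood_fractions(ob, size)
    mse = sum((f - o) ** 2 for rf, ro in zip(pf, po) for f, o in zip(rf, ro))
    ref = sum(f * f + o * o for rf, ro in zip(pf, po) for f, o in zip(rf, ro))
    return 1 - mse / ref if ref else 1.0

# A rain feature forecast one column too far west: poor at grid scale,
# better when judged over 3x3 neighborhoods
fcst = [[0, 5, 5, 0], [0, 5, 5, 0], [0, 0, 0, 0], [0, 0, 0, 0]]
obs  = [[0, 0, 5, 5], [0, 0, 5, 5], [0, 0, 0, 0], [0, 0, 0, 0]]
print(fss(fcst, obs, 1.0, 0))                          # 0.5 at grid scale
print(fss(fcst, obs, 1.0, 1) > fss(fcst, obs, 1.0, 0))  # True: skill grows with scale
```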
New focus areas for JWGFVR research
Warnings, including timing
[Figure: performance diagram of hit rate vs. success ratio (1-FAR) for warnings]
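Warning verification is often summarized on a performance diagram whose axes are the hit rate and the success ratio (1-FAR), both computed from a 2x2 warning contingency table. A sketch with hypothetical counts:

```python
def warning_scores(hits, misses, false_alarms):
    """Hit rate (POD) and success ratio (1 - FAR) from 2x2 warning counts.

    hits: warned events that occurred; misses: events with no warning;
    false_alarms: warnings with no event. A perfect system plots at
    (1, 1) on the performance diagram.
    """
    hit_rate = hits / (hits + misses)
    success_ratio = hits / (hits + false_alarms)
    return hit_rate, success_ratio

# Hypothetical counts for one season of warnings
pod, sr = warning_scores(hits=42, misses=18, false_alarms=28)
print(round(pod, 2), round(sr, 2))  # 0.7 0.6
```

Timing can be folded in by counting a warning as a hit only if it was issued with adequate lead time, which shifts the point toward the lower left as the timing requirement tightens.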