Page 1:

Data Comparison Methods

Lecture by B.Kerridge

Rutherford Appleton Laboratory, UK

ESA-MOST DRAGON 2 PROGRAMMEAdvanced Training Course in Atmospheric Remote Sensing

19-24th October 2009, Nanjing University

Page 2:

Contents

1. Introduction

2. Radiances

3. Ozone

4. Aerosol

5. Surface networks

6. Summary

Page 3:

1. Introduction

– Why are data comparisons needed?

1. Comparison of observation with theory is fundamental.
2. Validation of satellite data is also essential
→ inform users of data attributes and quality

– Which approaches are used?

– Quantitative comparison of a new data set with existing data sets of established attributes and quality (i.e. resolution, precision & accuracy)
– Spatial & temporal distributions and variances
– Biases and standard deviations between new and established data
– Sensor-to-sensor, sensor-to-model or sensor-to-analysis

– What level of stringency is necessary (i.e. precision and accuracy)?

– Atmospheric variability ← lifetime
– Quality of established observing techniques
• What (scientific or other) value are the new observations required to add?
• First detection ↔ well-established measurement techniques / sensors
→ Determine level of sophistication

Page 4:

2. Radiances / reflectances– The measured quantities (L1 data)

Page 5:

Comparison of GOME-1/ATSR-2 & SCIA/AATSR reflectances

[Figure: SCIA spectra, reflectance vs. wavelength (nm), with GOME coverage and AATSR channel positions (v16, v55, v67, v87) indicated.]

Page 6:

Inter-comparison requires:
– Accurate co-location of imager/spectrometer
– Spectral averaging of SCIA/GOME-1
– Spatial averaging of AATSR/ATSR-2
(a sketch of the two averaging steps follows the figure below)

Comparison of GOME-1/ATSR-2 & SCIA/AATSR reflectances

[Figure: AATSR / SCIA / GOME spatial coverage – SCIAMACHY and GOME footprints, with 80 km and 40 km scales indicated.]
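As a concrete illustration of the two averaging steps, here is a minimal sketch; the function names, the uniform wavelength grid, the crudely rectangular footprint and the ~111 km/degree conversion are all illustrative assumptions, not the actual processing chain.

```python
import numpy as np

def spectral_average(wl, spectrum, srf_wl, srf):
    """Weight a high-resolution SCIA/GOME-1 reflectance spectrum with an
    imager channel's spectral response function (uniform wl grid assumed)."""
    r = np.interp(wl, srf_wl, srf, left=0.0, right=0.0)  # SRF on spectrometer grid
    return np.sum(r * spectrum) / np.sum(r)

def spatial_average(refl, lat, lon, fp_lat, fp_lon,
                    half_ns_km=20.0, half_ew_km=40.0):
    """Average imager (AATSR/ATSR-2) pixels falling inside a rectangular
    spectrometer footprint centred at (fp_lat, fp_lon)."""
    dns = (lat - fp_lat) * 111.0                               # km, north-south
    dew = (lon - fp_lon) * 111.0 * np.cos(np.radians(fp_lat))  # km, east-west
    inside = (np.abs(dns) < half_ns_km) & (np.abs(dew) < half_ew_km)
    return refl[inside].mean()
```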

Page 7:

Comparison of AATSR, ATSR-2, GOME-1 and SCIAMACHY 670nm Reflectances

Two different orbits on 15th December 2002

Page 8:

3. Ozone– A retrieved quantity

Page 9:

Direct comparison of MIPAS O3 with HALOE IR solar occultation

• Solar occultation:
– High vertical resolution, precision & accuracy
– Sparse geographical coverage

• HALOE:
– 79 co-locations between 22/07/02 and 14/12/02
– Accuracy: 6% at 30-60 km; 20% at 15-30 km
– Co-location: <250 km on the same day (a sketch of this test follows)
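A minimal sketch of the co-location test (great-circle separation under 250 km on the same calendar day); the function and argument names are illustrative.

```python
import numpy as np

def colocated(lat1, lon1, day1, lat2, lon2, day2, max_km=250.0):
    """Slide criterion: haversine separation < 250 km on the same day."""
    R = 6371.0  # mean Earth radius, km
    p1, p2 = np.radians(lat1), np.radians(lat2)
    a = (np.sin((p2 - p1) / 2) ** 2
         + np.cos(p1) * np.cos(p2) * np.sin(np.radians(lon2 - lon1) / 2) ** 2)
    dist_km = 2 * R * np.arcsin(np.sqrt(a))
    return (dist_km < max_km) and (day1 == day2)
```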

Page 10:

Individual HALOE Profiles

[Figure: example co-located profiles at high latitude and in the subtropics.]

Page 11:

0.5-50 hPa: +5 to +15% bias (±20%); >50 hPa: positive biases and increased RMS

HALOE sampling is weighted to northern mid-latitudes; only 10% of co-locations are in the tropics

Ensemble of HALOE Profiles

Page 12:

Factors Limiting Direct Comparisons

• Precision:
– Direct satellite comparisons underestimate MIPAS precision, due to:
• Imperfect co-location cf. atmospheric variability
• Differences in representation of the O3 vertical profile
• Precision of the correlative sensor being factored in

• Accuracy:
– Vertical structure of the bias partly reflects the different vertical resolutions of the respective sensors

(the error budget behind the precision point is sketched below)
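To make the precision point concrete: if the two instruments' noise and the co-location (atmospheric variability) term are independent, they add in quadrature, so the raw RMS of differences overstates MIPAS noise. A hedged sketch, with illustrative names:

```python
import numpy as np

def expected_rms_difference(prec_mipas, prec_haloe, sigma_coloc):
    """RMS of MIPAS-minus-HALOE differences expected if all three error
    sources are independent; recovering prec_mipas alone requires
    subtracting the other two terms in quadrature."""
    return np.sqrt(prec_mipas**2 + prec_haloe**2 + sigma_coloc**2)
```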

Page 13:

Direct Assimilation of MIPAS Data

Ozone: July 10th 2003, 850K level

• Direct assimilation is equivalent to a Kalman smoother, but at far lower computational cost.
• Uses isentropic advection as the modelling constraint.
→ No chemistry or vertical advection
• The field is allowed to adjust continually towards the observations.
→ Analysis unbiased relative to the input data.
• Applied to MIPAS ozone, methane and water vapour products (Feb. to July 2003, v4.61/2)

M.Juckes, RAL

Page 14:

MIPAS Ozone Comparisons with Other Sensors

Assimilation allows comparison against other observations which are not coincident.

Biases (left) found to be small in the lower and mid-stratosphere.

Standard error (right) is <10% in most of the stratosphere.

Using a range of independent instruments helps establish confidence.

[Figure: mean difference (left, ppmv) and standard error (right, ppmv) vs. height (km); in both panels the outer limit of shading is 10% of the sample-mean profile, the inner boundary 1%, and the transition 5%.]
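A minimal sketch of the per-level statistics behind the two panels, assuming a matches-by-levels array of analysis-minus-observation differences; whether "standard error" here means the ensemble standard deviation or the error of the mean is not stated on the slide, and the sketch computes the latter.

```python
import numpy as np

def bias_and_stderr(diffs):
    """diffs: (n_matches, n_levels) analysis-minus-observation values.
    Returns the mean difference (bias) and the standard error of that
    mean at each level, ignoring missing matches."""
    n = np.sum(~np.isnan(diffs), axis=0)
    bias = np.nanmean(diffs, axis=0)
    stderr = np.nanstd(diffs, axis=0, ddof=1) / np.sqrt(n)
    return bias, stderr
```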

Page 15:

Assimilation of MLS ozone profiles: comparison of analyses with ozone sondes

[Figure: ozone profiles from sondes and analyses (Sonde, CTRL, MLS), ozone partial pressure (mPa) vs. pressure (1000-2 hPa), at three stations: Neumayer (lat = -70.7, lon = -8.3; month 200708, 10 sondes), Hohenpeissenberg (lat = 47.8, lon = 11.0; month 200709, 8 sondes) and Naha (lat = 26.2, lon = 127.7; month 200709, 3 sondes).]

• 1st July – 30th Sept 2007
• Ozone data actively assimilated into the ECMWF operational system:
– CTRL: SCIAMACHY total column (KNMI)
– MLS: SCIAMACHY total column + MLS O3 profiles

→ Agreement with ozonesonde profiles at pressures <200 hPa improved by assimilating MLS

R.Dragani, ECMWF

Page 16:

Comparison of TES Retrieved O3 Profiles with Ozonesondes & Airborne DIAL

x_final = x_a + A (x_DIAL − x_a)

– x_a: TES a priori profile
– x_DIAL: DIAL or sonde profile
– A: TES averaging kernel
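In code, applying the averaging kernel is a single matrix-vector operation; a minimal sketch, assuming all profiles are already on the TES retrieval grid:

```python
import numpy as np

def smooth_with_averaging_kernel(x_a, A, x_dial):
    """x_final = x_a + A (x_DIAL - x_a): degrade the high-resolution
    sonde/DIAL profile to TES vertical resolution and a priori
    sensitivity before differencing with the TES retrieval."""
    return x_a + A @ (x_dial - x_a)
```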

Page 17:

TES O3 – Ozonesondes

Worden et al.

Page 18:

TES O3 – Ozonesondes (contd.)

[Figure: comparison panels for TES V1 and TES V2.]

Page 19:

Airborne DIAL Measurements

• DIAL profiles ozone simultaneously above & below the DC-8 aircraft
• DIAL profiles <0.15° (lat/long) from the TES profile are averaged & interpolated to the TES pressure grid (a sketch of this step follows)
• Missing data in the DIAL profile → TES a priori
• DIAL accuracy: <10% (2 ppbv)
• Vertical resolution: 300 m

Richards et al
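A minimal sketch of the regridding step described above, assuming DIAL pressures increase monotonically; the names are illustrative:

```python
import numpy as np

def dial_to_tes_grid(p_dial, o3_dial, p_tes, o3_apriori):
    """Interpolate the averaged DIAL ozone profile onto the TES pressure
    grid in log-pressure, falling back to the TES a priori wherever DIAL
    has no data, as on this slide."""
    good = ~np.isnan(o3_dial)
    o3 = np.interp(np.log(p_tes), np.log(p_dial[good]), o3_dial[good],
                   left=np.nan, right=np.nan)
    return np.where(np.isnan(o3), o3_apriori, o3)
```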

Page 20:

TES O3 – DIAL

[Figure: comparison panels for TES V2 and TES V3.]

– Care is needed in comparisons near the tropopause, where the O3 vertical gradient is strong.
– Using AKs, the TES bias in the upper troposphere is seen to be reduced in V3

Page 21:

Comparison of OMI integrated ozone column to ECMWF analyses

• 70-day period from 12 Aug 2005

• Assimilation of operational obs:
– ozonesondes
– aircraft
– satellite IR radiances
– SBUV/2 partial columns
– SCIAMACHY columns
– MLS ozone profiles

• OMI data:
– No sunglint
– SZA < 84 deg
– Coincidence criterion: within ±1 hour of analysis time

Analysis by S.Migliorini, U.Reading

Page 22:

Total column ozone comparisons

• Retrieved profile: x_hat = x_a + A (x − x_a) + ε

• Integrated column: c_hat = g^T x_hat = g^T x_a + g^T A (x − x_a) + g^T ε
(g: vertical integration operator)

• The total column should be compared to the analysis mapped through the same a priori and averaging kernel:

c_hat_ana = g^T x_a + g^T A (x_ana − x_a)

• ECMWF analyses are first interpolated to OMI pixel locations
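A minimal sketch of the comparison quantity, with g the (assumed) vector of layer integration weights:

```python
import numpy as np

def column_for_comparison(g, x_a, A, x_ana):
    """g^T x_a + g^T A (x_ana - x_a): the ECMWF analysis profile mapped
    through the OMI a priori and averaging kernel before vertical
    integration, so retrieval smoothing cancels in the comparison."""
    return g @ x_a + g @ (A @ (x_ana - x_a))
```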

Page 23:

OMI ozone layer averaging kernels

[Figure: layer averaging kernels for an OMI nadir pixel, 15 Oct 2005 12 UTC; kernel values a < 1.]

Page 24:

OMI total column ozone

Page 25:

Simulation with ECMWF 3-D analysis

Page 26:

Integrated Column Differences (%)

For 70 days of data: [OMI − ECMWF] bias −3.773% and RMS 5.2%

Page 27:

4. Aerosol

Page 28:

Height-Integrated Aerosol

• Aerosol optical thickness (AOT), τ(λ), at λ = 550 and 870 nm

• Cloud screening & handling of surface BRDF are critical for aerosol retrieval from satellite sensors

Page 29:

Comparisons of AATSR & SEVIRI with MODIS and MISR

• MODIS & MISR fly on EOS-Terra, whose daytime equator crossing time is similar to Envisat's (~10 am)

• Terra's ascending node is in daytime whereas Envisat's is descending → observing times at high latitude are quite different from AATSR's

• SEVIRI on MSG samples hourly, though its viewing geometry is a fixed function of geographical location

Page 30:

Impact of coincident sampling on AATSR – MODIS Comparison in Sept’04

[Figure: scatter comparisons of AATSR and MODIS AOT under different sampling choices – Cor=0.64; Cor=0.48; Cor=0.56, SD=0.16; Cor=0.71, SD=0.08.]

Page 31:

SEVIRI aerosol optical thickness comparisons with MODIS & MISR, July 2005

[Figure: scatter panels of SEVIRI vs. MODIS and MISR – C=0.86, C=0.53, C=0.80, C=0.45.]

Page 32:

5. Ground-based networks

Page 33:

Aeronet

AErosol RObotic NETwork. http://aeronet.gsfc.nasa.gov

Standardized instruments, calibration & processing.

→ Geographically distributed observations of aerosol spectral optical depths & retrieved products

Page 34:

Aeronet

• Each ground station has a CIMEL sun-photometer
– λ's: 340, 380, 440, 500, 675, 870, 940 and 1020 nm
– Bandwidths: 2 nm at 340 nm, 4 nm at 380 nm, others 10 nm
– Narrow FOV ~1°

• Direct-sun measurement every 15 mins

• AOT from direct-sun extinction → Beer's law (see the sketch below)
– Corrected for Rayleigh scattering + trace-gas absorption
– Relative accuracy cf. other photometers <0.004
– Absolute accuracy <0.01-0.02

• [Angstrom coefficient fitted to AOT at 440, 500, 675, 870 nm]
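A hedged sketch of both steps: AOT via Beer's law from the direct-sun signal V (with V0 the extraterrestrial calibration constant and m the airmass), followed by an Angstrom-exponent fit; the variable names are illustrative.

```python
import numpy as np

def aot_direct_sun(V, V0, airmass, tau_rayleigh, tau_gas):
    """Beer's law: V = V0 exp(-m * tau_total), so total optical depth is
    -ln(V/V0)/m; subtracting the Rayleigh and trace-gas contributions
    leaves the aerosol optical thickness."""
    return -np.log(V / V0) / airmass - tau_rayleigh - tau_gas

def angstrom_exponent(wavelength_nm, aot):
    """Fit tau = beta * lambda^(-alpha) by linear regression in log-log
    space, e.g. over the 440/500/675/870 nm channels."""
    slope, _ = np.polyfit(np.log(wavelength_nm), np.log(aot), 1)
    return -slope
```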

Page 35:

Satellite – Aeronet Comparison

• Satellite grid-box containing the AERONET station is identified.

• For each day, AOT (and Angstrom coeff.) extracted within ±20 km of this satellite pixel.

• Aeronet measurements for each station extracted within ±30 minutes of the satellite measurements.

• For both Aeronet and the satellite sensor, the number of valid retrievals, means and standard deviations are stored.

• Minimum of 4 matches required.

• If the SD of either satellite or Aeronet exceeds 0.15, the comparison is discounted.

(a sketch of this screening is given below)
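A minimal sketch of the screening for one station-day; the data structures are assumptions, and since the slide does not say whether "matches" counts measurements or matched days, the sketch uses measurements.

```python
import numpy as np

def daily_match(sat_aot, aeronet_aot, min_n=4, max_sd=0.15):
    """sat_aot: valid satellite AOT retrievals within +/-20 km of the
    station's grid-box pixel; aeronet_aot: station AOT within +/-30 min
    of the overpasses. Returns the pair of means, or None if discounted."""
    sat = np.asarray(sat_aot, float)
    aer = np.asarray(aeronet_aot, float)
    if min(sat.size, aer.size) < min_n:           # minimum of 4 matches
        return None
    if sat.std(ddof=1) > max_sd or aer.std(ddof=1) > max_sd:
        return None                               # too variable: discount day
    return sat.mean(), aer.mean()
```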

Page 36:

SEVIRI – Aeronet individual time series comparison

Page 37:

SEVIRI – Aeronet time series comparison for the North Atlantic Ocean

[Figure: AOT time series from SEVIRI, Aeronet, MODIS and MISR.]

Page 38:

SEVIRI – Aeronet: statistical comparison

[Figure: four scatter panels – C=0.2, SD=0.07; C=0.84, SD=0.04; C=0.38, SD=0.12; C=0.81, SD=0.07.]

Page 39:

SEVIRI – Aeronet: Europe & Africa

[Figure: scatter panels – C=0.64, SD=0.06; C=0.36, SD=0.15.]

– SEVIRI performance is reasonable over sea and coasts
– High-reflectance land sites (Africa) are problematic

Page 40:

Some Factors Limiting Aeronet Comparisons

– Representativeness: point location vs. satellite field of view

– Only five aerosol types in retrievals; aerosol composition varies continuously

– Cloud flagging may be too stringent: upper limit on allowed AOT set too low

– High land-surface reflectance not modelled well

Page 41:

Network for the Detection of Atmospheric Composition Change (NDACC)

>70 high-quality stations observing the stratosphere and upper troposphere
→ impact of stratospheric changes on the troposphere and on global climate

LIDAR profiles:
– Raman lidar: water vapor
– Differential Absorption Lidar (DIAL): O3
– Backscatter lidars: aerosol
– Raman and Rayleigh lidars: temperature

Microwave radiometers: ozone, water vapor, and ClO profiles

UV/VISIBLE SPECTROMETERS: column ozone, NO2 (OClO and BrO)

FTIR SPECTROMETERS: column ozone, HCl, NO, NO2, ClONO2, and HNO3

DOBSON/BREWER: column ozone

SONDES: ozone and aerosol profiles

UV SPECTRORADIOMETERS: UV radiation at the ground

Page 42:

Page 43:

Total Column Carbon Observing Network – TCCON

• Network of ground-based FTS
• Near-IR solar absorption spectrometry
• 4,000–14,000 cm−1 at 0.02 cm−1 resolution
• Analysis with a standard algorithm (GFIT)
– Non-linear least squares → scale profile
→ CO2, CH4, O2 & other columns

• O2 used to convert column densities to pressure-weighted column-average mixing ratios (see the sketch below)

• Complementary surface in-situ measurements at each site

• To usefully constrain the global carbon budget – directly, and through validation of satellite column measurements – a precision of 0.1% is required!

• SCIAMACHY & GOSAT validation
→ Stringent accuracy requirements, e.g.:
– solar tracking
– correction for source fluctuations
– FTS instrument line shape, permanently monitored with an HCl cell in the solar beam
– spectroscopic parameters
– retrieval algorithms
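The O2 normalisation can be written in one line; 0.2095 is the standard dry-air O2 mole fraction used for this conversion, and the function name is illustrative.

```python
def xco2_from_columns(column_co2, column_o2):
    """Column-average dry-air mole fraction via the retrieved O2 column:
    XCO2 = 0.2095 * (column CO2) / (column O2). Ratioing against O2
    cancels errors common to both retrievals (pointing, surface
    pressure, instrument line shape)."""
    return 0.2095 * column_co2 / column_o2
```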

Page 44:

– Variability reflects atmospheric lifetimes: CH4 ~10yrs, N2O >100yrs

– Correction needed for stratospheric column variability

Sherlock et al

Page 45:

6. Summary

– Comparison of observation with theory is fundamental
– Validation of satellite data is also essential
– Important to compare like-with-like as far as possible
→ Apply observation operators to model fields and in assimilation
→ Account for sensor vertical smoothing & a priori also in comparisons with profiles observed at higher resolution

– Techniques of increasing sophistication are being used
→ to compare with models and other observations
→ to quantify the value added in assimilation

– Production of long-term satellite data sets of the essential climate variables will depend on surface networks for ground truth.