
Consiglio Nazionale delle Ricerche Istituto di Scienze dell’Atmosfera e del Clima Via Gobetti 101 I-40129 Bologna Italy Tel: +39-051-6398015 Fax: +39- 051-6399749 e-mail: [email protected] Web: www.isac.cnr.it/~meteosat

MUSIC – MUltiple-Sensor Precipitation Measurements, Integration, Calibration and Flood Forecasting

A Research Project supported by the European Commission under the Fifth Framework Programme

and contributing to the implementation of the Key Action “Sustainable Management and Quality of Water” within the Energy, Environment and Sustainable Development

Contract n°: EVK1-CT-2000-00058

The Rapid Update satellite rainfall estimation method: operational setup documentation

by

Francesca Torricella and Vincenzo Levizzani Istituto di Scienze dell’Atmosfera e del Clima-CNR

and

Alberto Ortolani, Samantha Melani and Andrea Antonini

CNR-IBIMET LaMMA

Deliverable 6.3

WP6 - Implementation of techniques for satellite image derived rainfall estimates


INDEX

1. INTRODUCTION
2. NAMING CONVENTIONS
3. GENERAL REMARKS
4. DEFINING THE DIRECTORY STRUCTURE FOR THE SOFTWARE
5. DEFINING THE GEOGRAPHICAL PROJECTION AND THE ANALYSIS AREA
6. PROCEDURES FOR PROCESSING METEOSAT IR DATA: PRE-PROCESSING
   6.1. GEOLOCATION
   6.2. LATITUDE AND LONGITUDE
   6.3. PIXEL TIME
   6.4. SATELLITE ZENITH ANGLE
   6.5. BYTESWAPPING
7. PROCEDURES FOR PROCESSING METEOSAT DATA: IMPORTING IR DATA SLOTS INTO TERASCAN FORMAT
8. PROCEDURES FOR PROCESSING SSM/I DATA: IMPORTING BRIGHTNESS TEMPERATURE DATA INTO TERASCAN FORMAT
9. PROCEDURES FOR PROCESSING SSM/I DATA: COMPUTING THE RAIN RATE MAPS FOR EACH ORBIT
10. CREATION OF STATISTIC FILES
11. ASSIGNING A RAIN RATE TO EACH IR PIXEL
12. REFERENCES
APPENDIX 1: Implementation of an operational chain of RU Technique for near real-time rainfall estimation
   Input satellite data
   TeraScan software
   Operational chain
   Procedure structure
   Setup
   METEOSAT data processing: automaticGEO.sh
   SSM/I data processing: automaticSSMI.sh
   Start up procedure: startup.sh
   Rainfall images visualisation
APPENDIX 2: UNIX shell scripts
APPENDIX 3: Glossary of terms



1. Introduction

The ISAC SatMet group runs and contributes to the continuous improvement of the software package hereinafter called Rapid Update (RU), which computes the rain rate at the geostationary time-space scale (Turk et al., 2000a, 2000b). The RU is based on a hybrid microwave-infrared (MW-IR) technique that correlates, through statistical probability matching, brightness temperatures (TB) measured by geostationary sensors with precipitation levels (rain rates RR [mm h-1]). The latter are derived from passive MW data using the operational algorithm of the National Environmental Satellite Data and Information Service (NESDIS) of the National Oceanic and Atmospheric Administration (NOAA) by Ferraro et al. (1995, 1997). The original operational arrangement at the Naval Research Laboratory (NRL), in which the runs are triggered by a steady flow of incoming geostationary and MW data in TeraScan® data format (TDF), has been adapted to the needs of case-study re-analysis. The required input data are obtained from different public sources and in different formats. An interface that transforms these data into the TeraScan® format required by the RU has been set up, so as to minimize the modifications to the core programs of the package. The computer codes that form the RU package are written in C, and C shell scripts manage the setting of parameters, directories, lat-lon grids, etc. The outputs of the package are TeraScan® files containing geolocated rain rate values.

2. Naming conventions

When referring to the original software package developed in an operational and automatic form by J. Turk we will use the name georain. Hereafter RU refers to the adapted test-case version of the software. The interval between two METEOSAT-7 images is 30' (25' to acquire a full image plus a five-minute retrace and standby period). This interval is known as a slot, and there are 48 slots in each day of operations. Slots are numbered from midnight UTC, so that slot 1 covers data acquired from 00:00 to 00:30 UTC, slot 2 from 00:30 to 01:00, and so on up to slot 48 (from 23:30 UTC to midnight) (EUMETSAT, 2002). (A minimal C sketch of this slot-to-start-time arithmetic is given after the overview below.)

3. General remarks

Note that all C programs must be compiled by linking the TeraScan® library directory, since they can contain TeraScan® commands or functions. This is an example:

# generic terascan compiler for Linux
gcc -g -I$TSCANROOT/include $1.c -L$TSCANROOT/lib -ltscan -lm -o $1

The following diagram sketches the main functions of the RU. The package can be subdivided into four main parts:
• preparation and pre-processing of GEO data;
• preparation and pre-processing of MW data; computation of rain rate maps at the LEO space-time resolution;
• set-up of geolocated statistical relationships;
• production of rain rate maps at the GEO space-time resolution.
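To make the slot convention concrete, here is a minimal C sketch (ours, for illustration only) of the slot-to-start-time arithmetic; it mirrors the computation performed in the get_meteosat_from_OpenMTP script of Appendix 2.

#include <stdio.h>

/* Nominal UTC start time of a METEOSAT-7 slot: slot 1 starts at 00:00,
 * slot 2 at 00:30, ..., slot 48 at 23:30 (same arithmetic as in
 * get_meteosat_from_OpenMTP, Appendix 2). */
static void slot_start_time(int slot, int *hh, int *mm)
{
    *hh = (slot - 1) / 2;
    *mm = ((slot - 1) % 2) * 30;
}

int main(void)
{
    int hh, mm;
    for (int slot = 1; slot <= 48; slot += 12) {
        slot_start_time(slot, &hh, &mm);
        printf("slot %2d starts at %02d:%02d UTC\n", slot, hh, mm);
    }
    return 0;
}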


[Figure 1 is a flowchart with the following processing blocks: IR data from the GEO satellite, with conversion from counts to radiances and then to TB; geolocation and conversion into TeraScan® format; time-space co-location over a grid covering the GEO domain (or a sub-frame); MW data from the LEO satellite (TB arrays); geolocation and conversion into TeraScan® format; rain rate (RR) computation via the NESDIS algorithm; creation of TB-RR geolocated relationships via the probability matching technique (PMM); assignment of a rain rate to each IR pixel.]

Figure 1: Main structure of the RU.

4. Defining the directory structure for the software

According to the main structure of the RU, the directory tree of the software is arranged as depicted below.

[Directory tree diagram. The recoverable directory names are: programs, METEOSAT, RU, ssmi, TRMM, preproc, EUMETSAT_to_TeraScan, hdf_to_tdf, rain, utility, and TeraScan/master.]

5. Defining the geographical projection and the analysis area

Among other things, georain allows the user to set a main master area and smaller inner regions of interest (for instance Central America, the Mediterranean, etc.). A master is, according to the TeraScan® definition, a dataset generated to describe a geographical area on a given type of map projection, and it is produced by a TeraScan® function. Masters are used by numerous TeraScan® functions to perform manipulations on other datasets that require a map in order to match satellite data to a specific geographical area in a given projection. To simplify the procedure, the RU works with only one master file, i.e. the one defining the geolocation and projection of the METEOSAT-7 pixels in B format (EUMETSAT, 2002). This subset covers the European, North African and Middle East regions. The final output is produced with this projection and resolution, which is indeed the best attainable resolution. By using the proper TeraScan® commands after the final production of the RR maps it is possible to change projection and resolution as needed. The code for creating the master is master_meteosat.sh.

> cat $HOME/TeraScan/master/master_meteosat.sh
master \
  projection=perspective \
  center_lat=0. \
  center_lon=0. \
  num_lines=625 \
  num_samples=1250 \
  pixel_width=4.5 \
  pixel_height=4.5 \
  view_radians=6.05 \
  rotate_angle=0. \
  move_center=yes \
  new_center_lat=40.2623 \
  new_center_lon=0. \
  master.meteosat_B

Figure 2: METEOSAT-7 IR slot for 6 August 2002, 02:30 UTC (B format).

6. Procedures for processing METEOSAT IR data: pre-processing

6.1. Geolocation

The nominal geolocation of each METEOSAT pixel is computed only once, when installing the software, and the lat-lon data are used for all the GEO data as long as only METEOSAT-7 data are used. For each pixel, the procedure computes latitude and longitude, acquisition time, and observing zenith angle.


6.2. Latitude and Longitude

Bformat2geo.f computes the latitude and longitude by means of the EUMETSAT procedure refgeo.f. A file latlon.dat is written, containing line, sample, lat, and lon. The file is then converted from ASCII to binary by means of ascii2bin.c. The output files are Blat.out and Blon.out.

6.3. Pixel time

Bformat_timepixel.c computes, for each pixel, the time elapsed since the start of the slot acquisition by means of the simple relationship:

#define LNTIME 0.6     /* time for each METEOSAT line  */
#define PXTIME 0.00024 /* time for each METEOSAT pixel */
pixtime = (line - 1.)*LNTIME + (sample - 1.)*PXTIME;

The output is a binary file named Bpixtime.out.

6.4. Satellite zenith angle

It is computed by means of zenith.c, a program that translates into C part of the EUMETSAT procedure POSMTO.f. The output binary file is Bzenith.out.

6.5. Byteswapping

Before being associated with the IR data, these binary files must be converted from the little-endian format (Linux) to the big-endian one (TeraScan®) by means of the TeraScan® command byteswap -4; the -4 option byte-rotates every four bytes. The resulting files Blat_swa.out, Blon_swa.out, Bpixtime_swa.out and Bzenith_swa.out are converted into TeraScan® format by means of the shells get_geo.sh, get_pixtime.sh and get_zenith.sh, which produce, respectively, Blon.geo, Btime.geo and Bzenith.geo. These files too are supplied directly.
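For readers unfamiliar with the endianness issue, the following C sketch shows what a 4-byte swap does to each word; it is a generic illustration only, not the TeraScan® byteswap implementation.

#include <stdint.h>
#include <stdio.h>

/* Reverse the byte order of a 4-byte word (little-endian <-> big-endian),
 * i.e. what a "-4" byte swap does to every 4-byte value in a file. */
static uint32_t swap4(uint32_t w)
{
    return (w >> 24) | ((w >> 8) & 0x0000ff00u) |
           ((w << 8) & 0x00ff0000u) | (w << 24);
}

int main(void)
{
    uint32_t v = 0x11223344u;
    printf("0x%08x -> 0x%08x\n", (unsigned)v, (unsigned)swap4(v));  /* prints 0x44332211 */
    return 0;
}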

7. Procedures for processing METEOSAT data: importing IR data slots into TeraScan format

To import the native EUMETSAT OpenMTP files (EUMETSAT, 2000) of the METEOSAT IR channel into the proper TeraScan® format, the following scripts have to be run in the order in which they are described. The procedure is rather involved, since it requires an intermediate transformation connected with obsolete data formats.


get_OpenMTP2ir

This script extracts the data in binary format and the (time-dependent) calibration coefficients in ASCII files. It calls the program OpenMTP2ir.c. In this script one has to set the following parameters:
1. I/O directory names;
2. days to be processed;
3. log file names.

get_meteosat_from_OpenMTP

This script imports the binary files created in the previous step into the final TeraScan® format. The main TeraScan® functions used are impbin, emath, assemble, setattr, imgrid and editdim. The digital counts are converted into radiances by means of the relationship

R = α (Cnt - C0)

where R is the radiance, α the calibration coefficient, Cnt the digital count, and C0 the space count, i.e. the readout of the radiometer when it looks at empty space. The appropriate temperature/radiance conversion tables are provided at http://www.eumetsat.de/en/index.html?area=left7.html&body=/en/dps/mpef/temp-rad_conv.html&a=730&b=1&c=700&d=700&e=0 and have to be applied to convert the radiances into TBs. There are different tables for each of the three currently operational spacecraft. To facilitate the process, an accurate exponential fit of the above-mentioned tables can be used. Regression coefficients for the relationship between radiance (W m-2 sr-1) and TB (K) for the IR and water vapor (WV) channels of Meteosat-2 through Meteosat-7 have been computed. The analytic form of the relationship is

R(T) = exp(A + B/T)

where R is the radiance (W m-2 sr-1), T the TB (K), and A and B regression coefficients (A dimensionless, B in K). The fit reproduces the tables with an rms error of less than 0.2 K in the range between 200 K and 330 K (a minimal C sketch of the inversion of this fit is given after the parameter list below). The following parameters need to be set in the script:
1. channels to be imported (only IR);
2. I/O directory names;
3. days and slots to be processed;
4. log file names;
5. parameters for the radiance-temperature conversion (see above);
6. name of the master file (always set to the one representing the METEOSAT-7 geolocation and perspective);
7. names of the files containing latitude, longitude, satellite zenith angle and pixel acquisition time (see above).
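As an illustration of how the exponential fit can be inverted in practice, the following C sketch converts a Meteosat-7 IR radiance into a TB using T = B / (ln R - A); the coefficient values are the Meteosat-7 IR ones set in the get_meteosat_from_OpenMTP script of Appendix 2, while the function and the sample radiance are ours, for illustration only.

#include <math.h>
#include <stdio.h>

/* Meteosat-7 IR regression coefficients (A dimensionless, B in K),
 * as set in get_meteosat_from_OpenMTP (Appendix 2). */
#define AIR   6.9618
#define BIR  -1255.5465

/* Invert R(T) = exp(A + B/T):  T = B / (ln R - A).
 * R in W m-2 sr-1, returned TB in K. */
static double radiance_to_tb(double radiance)
{
    return BIR / (log(radiance) - AIR);
}

int main(void)
{
    double rad = 12.0;   /* sample IR radiance, gives roughly 280 K */
    printf("R = %.1f W m-2 sr-1  ->  TB = %.1f K\n", rad, radiance_to_tb(rad));
    return 0;
}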


8. Procedures for processing SSM/I data: importing brightness temperature data into TeraScan format

The Special Sensor Microwave/Imager (SSM/I) TB data are obtained via ftp from the Global Hydrology Resource Center (GHRC) at the Global Hydrology and Climate Center (GHCC), Huntsville, Alabama (http://ghrc.msfc.nasa.gov/ghrc.html). The data are supplied in HDF format and can be visualized and managed using the HDFLook software. To manage these data the user must download the software suitable for her/his platform from http://hdf.ncsa.uiuc.edu/index.html. The format and content of the files are described in a GHRC technical document (GHRC, 2002). The files contain geolocated TB values for the seven SSM/I channels, broken into passes (i.e. a pole-to-pole swath, either ascending south to north or descending north to south). A complete pass contains 51 minutes of data. The first pass of a UTC day is the first complete pass of the day; a typical day contains 28 or 29 passes. For each pass there are three files: one with the TBs, one with the high resolution geolocation information, and one with the low resolution geolocation information (i.e. for the low-frequency channels), plus an ASCII text file containing summary information. The naming convention of the files is as follows: fxx_Tb_yyddd_ppZ.hdf, where xx is the satellite ID number (13, 14, 15, etc.), yyddd is the date, pp is the pass number (1-29), and Z is the pass direction (A = ascending, D = descending). This file contains the TBs, while the corresponding fxx_ln_yyddd_ppZ.hdf and fxx_hn_yyddd_ppZ.hdf files contain the low and high resolution geolocation data, and meta_fxx_yyddd.text contains the metadata.
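As a concrete illustration of this naming convention, the following C sketch decodes a TB file name into its components; both the helper function and the sample file name are hypothetical, written only to make the convention explicit.

#include <stdio.h>

/* Decode an SSM/I TB file name of the form fxx_Tb_yyddd_ppZ.hdf
 * (illustrative helper, not part of the RU package). */
static int decode_tb_name(const char *name, int *sat, int *yy, int *ddd,
                          int *pass, char *dir)
{
    int yyddd;
    if (sscanf(name, "f%2d_Tb_%5d_%2d%c.hdf", sat, &yyddd, pass, dir) != 4)
        return -1;            /* name does not match the convention */
    *yy  = yyddd / 1000;      /* two-digit year */
    *ddd = yyddd % 1000;      /* day of year    */
    return 0;
}

int main(void)
{
    int sat, yy, ddd, pass;
    char dir;
    if (decode_tb_name("f13_Tb_02218_05A.hdf", &sat, &yy, &ddd, &pass, &dir) == 0)
        printf("DMSP F-%d, year %02d, day %03d, pass %d, direction %c\n",
               sat, yy, ddd, pass, dir);
    return 0;
}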

get_ssmi_4_rain

This script imports the HDF files into the suitable TeraScan® format. The main TeraScan® functions used are hdftotdf, editvar, getval, setattr, editdim and varname. The long variable names contained in the HDF files are redefined so as to fit the maximum variable-name length supported by the TeraScan® software and the naming convention used in the original georain programs. The main steps of the C shell for each input orbit are as follows:
• extraction of the orbit and platform codes from the file name;
• setting of the geolocation file names;
• setting of the output file names;
• translation of HDF into TDF format by means of the TeraScan® command hdftotdf;
• extraction from each file (TB, ln, hn) of the orbit parameters (namely time and date) and check of the alignment of the files;
• transformation of the variable names in the TB, ln and hn files;
• definition of the x and y variables in the TB, ln and hn files;
• scaling of the variable (TB, lat, lon) values by 0.01;
• definition of the file attributes (satellite, sensor, pass_date, start_time).


9. Procedures for processing SSM/I data: computing the rain rate maps for each orbit

Each file containing the geolocated SSM/I TBs in TeraScan® format obtained in the previous step is then used as input for the rain C shell, which calls the program rain_landsea.c. This program implements the NESDIS operational rain algorithm by Ferraro. The output is a TeraScan® file containing geolocated rain rate values at the A-scan resolution of the SSM/I measurements. The program change_date.c is then called: it copies the lat, lon and rain variables from a TDF file to a new TDF file, and the date and time variables are extended to all the samples of each line.

10. Creation of statistic files

Having computed the rain rate maps for each available SSM/I orbit and the IR TBs from the METEOSAT data, the software creates the files containing the statistical relationships that relate TB to rain rate. This is accomplished in several steps.

stats_from_geo

For each METEOSAT slot, all the SSM/I orbits that may contain time-space coincident pixels are identified by means of the TeraScan® command nearestfile. For each MW file selected in this way and for each IR pixel, the pixels that have the same geolocation (with a tunable tolerance of 10 km) and the same acquisition time (tolerance = 15') are selected and stored in the so-called coincident files (program stats_from_geo.c). These are ASCII files containing (at the coarser MW spatial resolution) the latitude, the longitude, the mean and minimum TB, and the rain rate. The name of each coincident file carries the information about the MW and IR files from which it derives. Once these files are written, the program histogram.c computes the geolocated TB/RR relationships. The program subdivides the globe into 2.5° x 2.5° latitude-longitude boxes, and the coincident data are collected for each box until a given coverage (say 75%) of the corresponding histogram region (i.e. the area containing the box and the 8 surrounding boxes) is reached. From the coincident data the two cumulative probability distribution functions (cpdf) for TB and RR are generated. The TB/RR relationship is then derived using the statistical approach called probability matching method (PMM), which couples the TB and RR values that correspond to the same value of their respective cpdf (a minimal sketch of this matching step is given at the end of this section). The relationship is considered meaningful if more than 400 coincident points have been used to generate it; otherwise a null relationship is output for that box. The coincident files are analysed starting from the most recent and going backwards, until a maximum look-back time is reached (a tunable parameter, in general set to 24 hours). The geolocated relationships are then written out in ASCII format.
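A minimal sketch of the probability matching step is given below, assuming the coincident TB and rain rate samples for a histogram region have already been collected; it pairs values at equal cumulative probability (coldest TB with highest RR) and is meant only to illustrate the idea behind the PMM, not to reproduce histogram.c.

#include <stdio.h>
#include <stdlib.h>

static int asc(const void *a, const void *b)
{
    double d = *(const double *)a - *(const double *)b;
    return (d > 0) - (d < 0);
}
static int desc(const void *a, const void *b) { return asc(b, a); }

/* PMM sketch: with n coincident samples, the i-th coldest TB is paired with
 * the i-th highest rain rate, so that both have the same cumulative
 * probability.  On return, tb[i] -> rr[i] is the matched relationship. */
static void pmm_match(double *tb, double *rr, size_t n)
{
    qsort(tb, n, sizeof *tb, asc);    /* coldest TB first */
    qsort(rr, n, sizeof *rr, desc);   /* highest RR first */
}

int main(void)
{
    double tb[] = { 230.0, 215.0, 260.0, 205.0, 245.0 };  /* K, illustrative      */
    double rr[] = { 1.2, 0.0, 6.5, 0.3, 3.0 };            /* mm h-1, illustrative */
    size_t n = sizeof tb / sizeof tb[0];

    pmm_match(tb, rr, n);
    for (size_t i = 0; i < n; i++)
        printf("TB %.0f K  ->  RR %.1f mm/h\n", tb[i], rr[i]);
    return 0;
}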


11. Assigning a rain rate to each IR pixel

Once the relationships have been generated, the script lancia_georain assigns to each IR pixel in a METEOSAT slot the corresponding rain rate by means of the program georain.c. The script re-examines each IR slot and sets the parameters to run the program, in particular the flags that drive the screening process. georain.c looks for the histogram-matched TB-RR relationship closest to each pixel in the input file and converts its IR temperature to a rain rate by interpolation. Let's call this the "unscreened rain rate". Several screens then attempt to remove non-raining pixels. This reflects the nature of IR data: high, cold clouds are often non-precipitating and, conversely, warm, low clouds can be precipitating, especially over tropical oceans. There are currently four screens, specified in the input to georain.c (see the attached listing of the shell). The three screens spect_screen, time_screen and spatial_screen are binary tests: if they determine that it is not raining, they turn off the rain wherever the unscreened rain rate was non-zero. spect_screen is a simple channel-difference test; if either of the channels wvname or niname cannot be located in the input TDF, it is deactivated (spect_screen must be deactivated in this version of the software). time_screen can add a fair amount of overhead to the program: it searches for the previous IR datasets (the number is set within georain.c) and builds the time history of the IR temperature at each grid point; depending upon the time rate of change of the IR temperature, any non-zero rain rate can be set to zero (time_screen must be deactivated in this version of the software). spatial_screen looks in a boxsize x boxsize region for the minimum, mean, and standard deviation of the IR temperature at each grid point; depending upon some threshold tests, any non-zero rain rate can be set to zero (an illustrative sketch of this box-statistics logic is given at the end of this section). The screen orograph_check is not really a screen but rather a fairly sophisticated check to capture orographic rain conditions. It allows an adjustment of the rain rate by up to a factor of 3, depending upon the topographic elevation change delta-h (where h is the height in metres) and the wind speed, wind direction and mixing ratio at 850 hPa. orograph_check may increase the rain on the windward side of steep terrain, especially for an onshore wind, or it may decrease or turn off the rain on the lee side of steep terrain (orograph_check is still undergoing testing and must be deactivated in this version of the software).
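The logic of spatial_screen can be illustrated with the following C sketch, which computes the minimum, mean and standard deviation of the IR temperature in a boxsize x boxsize neighbourhood and zeroes the rain rate when the pixel fails a threshold test; the thresholds and the exact test are placeholders of ours, not those coded in georain.c.

#include <math.h>

/* Illustrative spatial screen for pixel (i, j) of an nl x ns IR temperature
 * grid: compute box minimum, mean and standard deviation, and turn the rain
 * off when the box looks too warm and too uniform.  Thresholds are
 * placeholders, not the georain.c values. */
static void spatial_screen(const float *tb, float *rain, int nl, int ns,
                           int i, int j, int boxsize)
{
    int half = boxsize / 2, n = 0;
    double sum = 0.0, sum2 = 0.0, tmin = 1.0e30;

    for (int di = -half; di <= half; di++)
        for (int dj = -half; dj <= half; dj++) {
            int ii = i + di, jj = j + dj;
            if (ii < 0 || ii >= nl || jj < 0 || jj >= ns)
                continue;                     /* stay inside the grid */
            double t = tb[ii * ns + jj];
            sum += t; sum2 += t * t; n++;
            if (t < tmin) tmin = t;
        }
    if (n == 0)
        return;

    double mean = sum / n;
    double sdev = sqrt(fmax(sum2 / n - mean * mean, 0.0));

    /* placeholder test: warm box minimum and little texture -> no rain */
    if (tmin > 250.0 && sdev < 2.0)
        rain[i * ns + j] = 0.0f;
}

int main(void)
{
    float tb[25], rain[25];
    for (int k = 0; k < 25; k++) { tb[k] = 255.0f; rain[k] = 1.0f; }
    spatial_screen(tb, rain, 5, 5, 2, 2, 3);   /* zeroes rain at (2, 2) */
    return rain[2 * 5 + 2] == 0.0f ? 0 : 1;
}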


12. References

EUMETSAT, 2000: Format Guide No. 1 – Basic imagery – OpenMTP Format. EUMETSAT EUM FG1, Rev. 2.1, 22 pp.
EUMETSAT, 2002: METEOSAT high resolution image dissemination – Technical description. EUMETSAT EUM TD 02, Rev. 6.1, 79 pp.
Ferraro, R. R., and G. F. Marks, 1995: The development of SSM/I rain-rate retrieval algorithms using ground-based radar measurements. J. Atmos. Oceanic Technol., 12, 755-770.
Ferraro, R. R., 1997: Special sensor microwave imager derived global rainfall estimates for climatological applications. J. Geophys. Res., 102 (D14), 16715-16735.
GHRC, 2002: An introduction to MSFC SSM/I Brightness Temperature Data Sets. Available at: http://microwave.nsstc.nasa.gov:5721/guide/ssmi_tb_dataset_guide.gd.html.
Turk, F. J., G. Rohaly, J. Hawkins, E. A. Smith, F. S. Marzano, A. Mugnai, and V. Levizzani, 2000a: Meteorological applications of precipitation estimation from combined SSM/I, TRMM and geostationary satellite data. In: Microwave Radiometry and Remote Sensing of the Earth's Surface and Atmosphere, P. Pampaloni and S. Paloscia, Eds., VSP Int. Sci. Publisher, Utrecht (The Netherlands), 353-363.
Turk, F. J., J. Hawkins, E. A. Smith, F. S. Marzano, A. Mugnai, and V. Levizzani, 2000b: Combining SSM/I, TRMM and infrared geostationary satellite data in a near-realtime fashion for rapid precipitation updates: advantages and limitations. Proc. 2000 EUMETSAT Meteorological Satellite Data Users' Conf., 452-459.


APPENDIX 1: Implementation of an operational chain of RU Technique for near real-time rainfall estimation

LaMMA has received the RU software and installed it on a dedicated workstation as part of an operational chain that runs continuously and produces a rainfall estimate every thirty minutes. Some software modifications were made to allow the implementation of the operational chain.

Input satellite data

The data from the SSM/I and METEOSAT-7 satellites are taken from different sources in different formats: the SSM/I data are downloaded via ftp from NOAA's Satellite Active Archive (SAA: http://www.saa.noaa.gov), while the METEOSAT data are received from LaMMA's Primary Data User Station (PDUS) by Tecnavia. The MW SSM/I sources are the Defense Meteorological Satellite Program (DMSP) F-13, F-14 and F-15 satellites currently in orbit. The SSM/I data, supplied in Temperature Data Record (TDR) format, consist of 12-bit precision antenna temperatures, along with Earth surface positions for each pixel. The following table shows the "internal structure" of each TDR file:

RECORD  FORMAT          LENGTH  CONTENT
1       I3, F16.4, I9   28      SAT ID, TIME, REV
2       128 F7.2        896     A-Scan Lat
3       128 F7.2        896     B-Scan Lat
4       128 F7.2        896     A-Scan Long
5       128 F7.2        896     B-Scan Long
6       128 I2          256     A-Scan Surface types
7       128 I2          256     B-Scan Surface types
8       64 F7.2         448     19V (TB)
9       64 F7.2         448     19H (TB)
10      64 F7.2         448     22V (TB)
11      64 F7.2         448     37V (TB)
12      64 F7.2         448     37H (TB)
13      128 F7.2        896     85V A-Scan (TB)
14      128 F7.2        896     85V B-Scan (TB)
15      128 F7.2        896     85H A-Scan (TB)
16      128 F7.2        896     85H B-Scan (TB)

where
SAT ID: satellite identification number
TIME: time
REV: revolution count since launch
Lat: latitude (-90 to 90)
Long: longitude (0 to 359.99)


The SAA archive also provides the TDR data processing software, which allows the digital data to be calibrated into TBs. It consists of C and Fortran programs. Specifically, ssmitdrta.c reads the SSM/I TDR data set file and writes the antenna temperatures, satellite ID, time, revolution number, latitudes and longitudes to an output file. This output file is read by the Fortran program ssmitdrtb.f, which converts the antenna temperatures to TBs. The C code ssmitdrlatlon.c reads the TDR data set and creates a new TDR file containing only the scans within specified latitude-longitude boundaries. The data are available at the SAA archive in a near real-time fashion, about an hour after the satellite passage, for a geographical area extending 80° x 80° in latitude-longitude; a simple user registration is needed to download the data. LaMMA is equipped with a PDUS for real-time High Resolution (HR) data acquisition (every half hour). The data, supplied in digital format, also contain geolocation, rectification and calibration parameters, as well as the temperatures for the various channels. The PDUS station acquires METEOSAT digital data in the A and B formats (EUMETSAT, 2002). The first represents the whole Earth disc; the second covers the European, North African and Middle East regions. The A and B formats are shown in Fig. A1-1.

Figure A1-1. A and B METEOSAT formats.

Because of the huge volume of data acquired every day by the PDUS station, the images (processed by the Tecnavia software) are stored in a non-standard TIFF format, which contains, besides the images of the three METEOSAT channels, TAG files for the geolocation. The LaMMA database contains the METEOSAT data records for the following spatial windows:
• Entire globe (512 x 512 pixels); visible (VIS), IR, WV channels; pixel resolution 25 x 25 km2 (i.e. the A format at low resolution).
• European area (1012 x 512 pixels); VIS, IR, WV channels; high resolution (i.e. a spatially reduced B format).
• Northern-central African area (1500 x 900 pixels); IR channel; high resolution (i.e. a subset of the A format).

The operational chain ingests data over the European area.

TeraScan software

The TeraScan software is based on the UNIX operating system and is dedicated to the processing and visualization of satellite data; the data must be converted into the proprietary TeraScan TDF format. TDF is an extremely versatile format capable of assimilating a wide variety of data types, shapes, and sizes. For example, a single dataset could contain satellite image data, random in-situ data, and 3-D model data. TDF is self-describing, and both dataset definitions and data are typically stored together in the same file. A TDF dataset can reference variables from several files using links. Links allow rapid import of non-TDF data and support lightweight dataset subsets and assemblies. TeraScan includes a C-language programming interface to TDF. This interface accesses TDF components as named objects, hiding all unnecessary details about the physical data layout. TDF is very similar to other self-describing dataset standards such as HDF and netCDF, and supports conversion to and from these and other formats. TeraScan has more than 500 command-line functions for capturing and processing satellite data. For example, TeraScan has functions for:
• Ingesting raw satellite telemetry or industry-standard archive format data.
• Importing structured ASCII or binary data.
• Defining area-of-interest maps using various projections, and remapping and interpolating data to those maps.
• Managing datasets, and retrieving and setting dataset and variable attributes, such as variable units.
• Element-by-element math, featuring multiple expressions, trigonometric and transcendental functions, access to satellite-pixel-sun angles, land masks, and neighbourhood statistics.
• Generating overlays, including coastlines, latitude/longitude grids, and topography/bathymetry contours.
• Generating pictures consisting of colored/enhanced imagery with embedded overlays, and exporting the pictures in PostScript, JPEG, GIF, TIFF, PPM, PNG, etc.
• Exporting non-picture data in plain ASCII, in binary, or in HDF or netCDF.

Operational chain

The system architecture is shown in Fig. A1-2. The core of the system is a workstation equipped with the UNIX operating system and the TeraScan software, which performs the automatic near-real-time data processing. The central processing unit is an Intel Pentium IV at 2.8 GHz, but in case of need (for example, when several procedures must run at the same time) the system offers optional dual processing. In order to ensure data security, two 36-GByte hard disks have been installed for redundancy.


Figure A1-2. System architecture at LaMMA.

The workstation ingests data from two types of satellite sensors: the METEOSAT IR channel and the seven-channel MW SSM/I. The rain rate maps can be displayed by means of TeraVision's graphical interface, and the real-time images are available on the LaMMA web site. All data are stored on CD or DVD media. The automatic, independent processing of METEOSAT and SSM/I data is implemented by C-language procedures and functions and UNIX shell scripts, launched by two ad hoc "crontabs":

Every 30 min: automaticGEO.sh
• Ingest METEOSAT data: downloadmeteosat.sh
• Register the infrared geostationary channel onto the desired map projection: get_meteosat_chanIR.sh
• Compute the rain rate from the latest global histogram file using the nearest TB-RR relationship, pixel by pixel: lancia_georain.sh + georain.c
• Grid the rainfall data onto a smaller master (istanMeditrain.sh) and export it in JPG format (tdftojpeg.sh) and GIF format (animate_rain.sh) for the web page.

Every 30 min (if a new SSM/I pass occurs): automaticSSMI.sh
• Ingest SSM/I data: downloadSAA.sh + get_ssmi.sh
• Compute the SSM/I rain field with Ferraro's algorithm: rain.sh
• Save all rain rate pixels into a file along with pixel geolocation, inserting date and time for chronological searching: stats_from_geo.sh
• Update the global histogram-matched TB-RR relationships: histogram.c

Procedure structure

Currently, an operational procedure runs on the workstation for rainfall estimation over the European area displayed in Figure A1-3.

Figure A1-3. Geographic area for the operational procedure.

The principal directory is /home/antonini/pioggia/programmi/. It contains the following sub-directories.

METEOSAT/
1. data/ (METEOSAT-7 data taken from the PDUS receiving station in TIFF format, plus files with latitude, longitude, time and satellite zenith angle)
2. src/ (executable programs)
   2.1 preproc/ (executable programs for the creation of latitude, longitude, time and satellite zenith angle)
3. gridded/ (METEOSAT-7 data converted to TDF format and geolocated in the selected area)
4. logfile/ (log directory containing all the log files of the executable programs)

ssmi/
1. data/ (SSM/I data downloaded from the SAA archive in TDR format)
2. src/ (executable programs)
3. rain/ (SSM/I calibrated data converted to TDF format, and rainfall data in TDF format geolocated in the selected area)
4. logfile/ (log directory containing all the log files of the executable programs)

RU/
1. src/ (executable programs)
2. coincident/ (METEOSAT-7 and SSM/I data spatially and temporally coincident)
3. histogram/ (IR TB and rain rate histograms)


4. istantaneous/ (rainfall maps produced by the system in TDF format)
5. logfile/ (log directory containing all the log files of the executable programs)

master/ (this directory contains the "master" files for the geolocation of the maps in the different projections)

Some executable programs are then available:
automaticGEO.sh
automaticSSMI.sh
startup.sh
tarzip.sh

Finally, we mention the file georain.paths, which defines the environment variables used by the procedure and is read by the scripts through the "source" command.

Setup

The operational chain requires some "crontab" procedures to be set for the automatic execution of the download and data processing procedures. The login must be done as user "root" and the command is

crontab -u antonini /home/antonini/crontabautomatic

The file /home/antonini/crontabautomatic sets the "crontab" that launches the procedures automaticGEO.sh and automaticSSMI.sh with the timing indicated in that file. To visualise the content of the "crontab" file any text file viewer can be used, for example "vi":

vi /home/antonini/crontabautomatic

METEOSAT data processing: automaticGEO.sh

When this procedure starts, it creates a "flag" file (flagmeteo) that temporarily suspends the cron-based script execution. The system creates the files needed to ingest satellite data with the script

/home/antonini/pioggia/programmi/METEOSAT/src/preproc/driver_preproc.sh

which associates to each pixel of the METEOSAT image the latitude, longitude, time (from the beginning of image acquisition) and satellite zenith angle information.


To define the area on which the images are geolocated and visualized, a master file (a latitude/longitude grid with the desired projection) is created by

/home/antonini/pioggia/programmi/METEOSAT/src/preproc/crea_master.sh

The system is now ready to ingest and process the satellite data. The input METEOSAT data are in TIFF format and are ingested with an ftp download procedure (within the LaMMA laboratory intranet):

/home/antonini/pioggia/programmi/METEOSAT/src/downloadmeteosat.sh

All satellite data are processed by the TeraScan software and must be converted into TDF format:

/home/antonini/pioggia/programmi/METEOSAT/src/get_meteosat_chanIR.sh

To convert the METEOSAT data from TIFF to TDF format the system uses the C program

/home/antonini/pioggia/programmi/METEOSAT/src/get_TIFF.c

which creates a binary file containing the calibrated IR TB array (one binary file per image). To ingest these files the system uses the command-line TeraScan function "impbin", which creates the TDF dataset. All TDF data are then projected onto the master (TeraScan command "imgrid") and combined with the latitude, longitude and time (geolocation) information. The output files are in the directory

/home/antonini/pioggia/programmi/METEOSAT/gridded/

From each METEOSAT IR channel image a rainfall map is created. The script

/home/antonini/pioggia/programmi/RU/src/lancia_georain.sh

is a driver for the C program

/home/antonini/pioggia/programmi/RU/src/georain

The rain rate is computed from the METEOSAT IR image by applying the latest updated IR vs. rain rate relationship, assigning each pixel to a 15° x 15° lat-lon box and taking the relationship available for that box. The parameters to implement some screening techniques in the rainfall calculation are set in the script lancia_georain.sh. The instantaneous rainfall maps in TDF format are in the directory

/home/antonini/pioggia/programmi/RU/istantaneous/


This file can be exported from TeraScan as an image in JPG format with the procedure

/home/antonini/pioggia/programmi/RU/src/tdftojpg.sh

SSM/I data processing: automaticSSMI.sh

Upon starting, this procedure creates a "flag" file (flagssmi) that temporarily suspends the cron-based script execution. The input SSM/I data, from the DMSP F-13, F-14 and F-15 satellites, are in TDR format. These data are retrieved with an ftp download from the SAA web site (www.saa.noaa.gov):

/home/antonini/pioggia/programmi/SSMI/src/downloadSAA.sh

The next step is the data processing. The data ingestion and conversion into TDF format is performed by

/home/antonini/pioggia/programmi/ssmi/src/get_ssmi.sh

The SSM/I data downloaded from the SAA are in TDR format, i.e. records of temperatures received from the sensor, so a calibration procedure is needed. The routines for the calibration are also available on the SAA web site. They are

/home/antonini/pioggia/programmi/ssmi/src/ssmitdrta.c

which converts the received data into antenna temperatures, and

/home/antonini/pioggia/programmi/ssmi/src/ssmitdrtb.c

which converts the antenna temperatures into absolute TBs. These TBs are structured in records (see the TDR scheme). The routine

/home/antonini/pioggia/programmi/ssmi/src/tbascii2tbbin.c

converts the format; the output consists of 5 binary files:
• low resolution SSM/I channel TB arrays;
• high resolution SSM/I channel TB arrays;
• low resolution latitude, longitude and surface type arrays;
• high resolution latitude, longitude and surface type arrays;
• low resolution time (from the beginning of acquisition) array.

These five files are ingested into TeraScan with the command-line "impbin" function, and five corresponding TDF files are created. From these five files the "assemble" function creates a single TDF file. This file is processed by the procedure

/home/antonini/pioggia/programmi/ssmi/src/rain.sh

which launches the C program


/home/antonini/pioggia/programmi/ssmi/src/rain_landsea.c

for the processing of the MW data with Ferraro's NOAA-NESDIS algorithm and the calculation of the rain rate for each pixel of the SSM/I orbit. The rain rate (MW based) and METEOSAT data are now ready for the RU processing procedure. Since the procedure is real time, only the current and previous day data are processed. For each METEOSAT file a loop with the following steps is started.

1. /home/antonini/pioggia/programmi/RU/stats_from_geo.sh searches for the temporally close (within a prefixed time window) rain rate (MW based) files and checks the spatial intersection with the "master" file

/home/antonini/pioggia/programmi/RU/src/mastersubssmi

If there is spatial and temporal coincidence, the coincident IR TB and rain rate (SSM/I) values for each pixel are extracted by

/home/antonini/pioggia/programmi/RU/src/expstat

The values obtained are recorded in a file with the same spatial resolution as the SSM/I data (the METEOSAT data are spatially averaged); these files are in the directory

/home/antonini/pioggia/programmi/RU/coincident

2. The next step of the loop is the routine /home/antonini/pioggia/programmi/RU/histogram.c, which builds the two histograms: one for the IR TBs and the other for the rain rate values. The Earth is subdivided into 15° x 15° boxes, which are further divided into an inner grid of 2.5° x 2.5° pixels. For each of these pixels the two histograms are built from the expstat.c program output files. Using the PMM method, the two histograms are matched considering all the 2.5° x 2.5° pixels belonging to the 15° x 15° box. The final result is a function, a rain rate vs. IR TB relationship: RR(TB).

Start up procedure: startup.sh

When the system has been stopped or turned off for some time and new satellite datasets have not been downloaded, the "start up" procedure needs to be run:

/home/antonini/pioggia/programmi/startup.sh

This procedure temporarily suspends the cron-based execution of the automaticGEO.sh and automaticSSMI.sh scripts, downloads the available SSM/I and METEOSAT data, and updates the histograms and the instantaneous rainfall fields.


To suspend the cron-based script execution while the startup.sh script is running, two flag files are created: flagmeteo and flagssmi. Both the SSM/I and the METEOSAT download procedures are launched twice, because downloading these large satellite datasets is very demanding for the system and further datasets may become available during the first execution. All the steps of this procedure are recorded in the log file startup.log.

Rainfall images visualisation

The command

tvis -true &

starts the TeraVision software, the graphical interface of TeraScan. All product datasets can be visualised by clicking on File->Open Data Shelf and choosing the dataset type by clicking on Library Shelf. The available data are:
• CASISTUDIOMETEOSAT: METEOSAT geolocated images (for case studies)
• CASISTUDIOSSMI: SSM/I geolocated images (for case studies)
• CASISTUDIOSTIMEPIOGGIA: Instantaneous rainfall maps (for case studies)
• ISTANTRAIN: Real-time instantaneous rainfall maps (European area)
• MAPPEPROVVISORIE: Real-time rainfall map (Italian area, for the Civil Protection)
• METEOSAT: Real-time METEOSAT geolocated images (European area)
• SSMI: Real-time SSM/I geolocated images (European area)
• MEDITRAIN: Real-time instantaneous rainfall maps (Mediterranean area)

By clicking on one of these options the available files are displayed. By selecting a file and the relevant variable and clicking on Open, the dataset is opened and displayed.


APPENDIX 2: UNIX shell scripts

get_OpenMTP2ir

#! /bin/csh
#
# F. Torricella - Sat. Met. Group - ISAC-CNR Bo. September 2002
#
# This script imports OpenMTP files supplied by EUMETSAT in the High Resolution
# Images format suitable to be input to get_meteosat cshell
#
#=================================================================
# ================== VARIABLES SETTING ======================
#=================================================================
#------------DIRECTORIES VAR-------------------
set INDIR = /workcyclone/pioggia/algeria/meteo-7/eumetsat/
set OUTDIR = /workcyclone/pioggia/algeria/meteo-7/eumetsat2/
set TMPDIR = /home/pioggia/temp
if (! -d $OUTDIR) mkdir -p $OUTDIR
if (! -d $TMPDIR) mkdir -p $TMPDIR
#----------------- SET LOG FILES -------------------------------
set IMPLOG = out.log    #tdf command standard and diagnostic output
set REPORT = status.log #command status results
rm -f $IMPLOG $REPORT
date >>$REPORT
#-------------------- DATE VAR ----------------------
foreach DAY (08 09 10)
  #---------------- Set files to import -----------
  set inflist = `ls -1 $INDIR/OpenMTP*-$DAY*.IR`
  echo $inflist
  #--------------- FOR EACH SLOT FILE ------------------------
  foreach SLOT ( $inflist )
    echo $SLOT
    ./OpenMTP2ir $SLOT $OUTDIR
    echo fine elaborazione slot $SLOT
  end
end
exit


get_meteosat_from_OpenMTP # #! /bin/csh # F.T. September 2002 # This script imports METEOSAT binary files and create TDF files. # The counts to Brightness Temp transformation is performed and # meteo-7 attributes are set to the tdf output files. # Geolocation is performed and imgrid is used to map data onto a master # Because the source of MET binary data are the archived EUMETSAT # in OpenMTP format, the coefficient needed for the calibration are not taken # from the *.hdr files as in get_meteosat but directly from data files supplied by # EUMETSAT #================================================================= # ================== VARIABLES SETTING ====================== #================================================================= #--------- CHANNELS TO READ: ONLY ir IN THIS VERSION !!! ------------------- set channels = 'ir' #--------- DIRECTORIES VAR ------------------- set INDIR = /workcyclone/pioggia/algeria/meteo-7/eumetsat2 set OUTDIR = /workcyclone/pioggia/algeria/meteo-7/grid-new #if (-e $OUTDIR)then #echo $OUTDIR already exists. Exiting. #exit -1 #endif #if (! -d $OUTDIR) mkdir -p $OUTDIR if ($INDIR == $OUTDIR)then echo Output dir $OUTDIR is the same as input dir $INDIR. echo .......exiting! echo Output dir $OUTDIR is the same as input dir $INDIR. >> $REPORT echo .......exiting!>> $REPORT exit endif #---------- DATE VAR ------------------------- set YY = 01 set YEAR = 2001 set JDAYI = 312 #Initial day set JDAYF = 314 #Final day #---------- SLOT TO IMPORT ------------------- set SLOTSTRING = '1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48' #set SLOTSTRING = '17 18' #---------- SET LOG FILES -------------------- set IMPLOG = out.log #tdf command standard and diagnostic output set REPORT = status_importa.log #command status results set REP_CALIB = status_calibration.log rm -f $IMPLOG $REPORT $REP_CALIB date >>$REPORT date >>$REP_CALIB


#----------- CALIBRATION VAR ------------------ # Radiance to Brightness Temperature for meteosat-7 set AIR = 6.9618 set BIR = -1255.5465 set AWV=9.2477 set BWV=-2233.4882 set ZERO = 273.15 #---------- SET MASTER ------------------------- set MASTER=/homewind/pioggia/TeraScan/master/master.meteosat_B set SEARCH_PARAM = 1 # Per il master di un'area in proiezione rettangolare # SERCH_PARAM = 3 assicura la copertura delle # latitudini a nord. Si perde definizione alle basse # latitudini. # Per proiezioni ortogonali, SEARCH_PARAM=1. # Si perde la regolarita' della griglia #---------- SET GEOFILES ------------------------ # Non ruotare le immagini. set LATFILE = /homewind/pioggia/RU/METEOSAT/Blat.geo set LONFILE = /homewind/pioggia/RU/METEOSAT/Blon.geo set TIMEFILE = /homewind/pioggia/RU/METEOSAT/Btime.geo set ZENITHFILE = /homewind/pioggia/RU/METEOSAT/Bzenith.geo if (! -e $LATFILE) then echo ----- Latitude file $LATFILE not found. Exiting. >> $REPORT exit -1 endif if (! -e $LONFILE) then echo ----- Longitude file $LONFILE not found. Exiting. >> $REPORT exit -1 endif #=================================================================== #=================================================================== #------------------------------------------------------------------- # ---------------- FOR EACH CHANNEL ---------------------------- echo Channels to import: $channels >> $REPORT foreach CHAN ($channels) #---------------------- FOR EACH JDAY ------------------------------ @ JDAY = ($JDAYI - 1) while ($JDAY < $JDAYF) @ JDAY = ($JDAY + 1) set DATE = $YY$JDAY #-------------------------------------------------- # *** -------------- FOR EACH SLOT ----------------- foreach SLOT ($SLOTSTRING ) #--- Compute start_time @ hh = ($SLOT - 1) / 2 @ mm = ( ($SLOT - 1) % 2) * 3 if ( $hh < 10 ) set hh = 0$hh set STARTTIME = $hh":"$mm"0:00" #--- Set I/O filenames and rad to temp params


set INFILE = $INDIR/$DATE$SLOT.ir echo $INFILE set FILEINF = $OUTDIR/$DATE$SLOT.inf set FILEIR = $OUTDIR/$DATE$SLOT.ir set A = $AIR set B = $BIR set VAR = IR set TEMP = mvissr_ir set FILEGEO = $FILEIR.geo set FILEGRID = $FILEIR.grid echo su set FILEHDR = $INDIR/$DATE$SLOT.hdr echo giu if (! -e $INFILE) then echo $JDAY $SLOT No $CHAN inputfile. Skipping >> $REP_CALIB echo ----- $JDAY $SLOT Input file $INFILE not found. Skipping. >> $REPORT continue endif rm -f $FILEINF $FILEIR $FILEGEO $FILEGRID #--- Composes the name of header fiel containing the claibration information set ALPHA = 0.000000 if (! -e $FILEHDR) then echo ----- $FILEHDR does not exists. >> $REPORT continue endif set ALPHA = `./get_ircal_from_OMTP $FILEHDR` set SPCCNT = `./get_irspc_from_OMTP $FILEHDR` echo $JDAY $SLOT --- $ALPHA $SPCCNT >> $REP_CALIB # Check for ALPHA not being 0.000000 if ($ALPHA == 0.000000) then echo No good hdrfile found. Exiting echo No good hdrfile found. Exiting >> $REP_CALIB continue endif echo ---------- `basename $FILEHDR` is used >> $REP_CALIB #------------------------------------------------- #-------- Importing bin files --------- echo + Importing file `basename $INFILE` echo $ALPHA $SPCCNT impbin dim_sizes='625 1250' dim_names='line sample' \ var_names=$VAR var_units='counts' var_types=byte \ var_badvals=0 sort_by_var=yes header_size=0 rel_offsets=0 \ instantiate=yes $INFILE $FILEINF>>&$IMPLOG if (! -e $FILEINF) then echo ----- Import binary file $FILEINF failed. Continue >> $REPORT echo ----- Import binary file $FILEINF failed. Continue continue endif #---------------Counts to Rad---------------- echo echo ---+ Counts to Rad emath expr_vars=$VAR expression="(x1-$SPCCNT)*$ALPHA" \ var_name=Rad var_units="Wm2sr-1" var_type=float \ $FILEINF $FILEINF.rad >>&$IMPLOG


if (! -e $FILEINF.rad) then echo ----- Count to Rad $FILEINF.rad failed. Skipping continue endif #---------------Rad to Temp (C)------------------ echo echo ---+ Rad to Temp emath expr_vars=Rad expression="$B/(log(x1)-$A)- $ZERO" \ var_name=$TEMP var_units=Celsius var_type=float \ $FILEINF.rad $FILEIR >>&$IMPLOG if (! -e $FILEIR) then echo ----- Rad to temp $FILEIR failed. Skipping >>$REPORT continue endif rm -f $FILEINF $FILEINF.rad #------------- Assemble TEMP LAT LON TIME ZENITH files--------- echo echo ---+ Assembling with lat lon time zenith files rm -f $FILEGEO assemble include_vars= extend_names=no instantiate=yes \ $FILEIR $LATFILE $LONFILE $TIMEFILE $ZENITHFILE $FILEGEO >>&$IMPLOG if ( $status != 0 ) then echo Assembling $FILEGEO failed. Skipping >>$REPORT continue endif #-------------- Set attributes echo ---+ Set `basename $FILEGEO` attributes #---Satellite setattr name=satellite units= datatype=str12 \ value=meteo-7 $FILEGEO >>&$IMPLOG if ( $status != 0 ) then echo Satellite attribute $FILEGEO failed. Continue. >>$REPORT continue endif #---Sensor set codice_sensore=8 #svissr sensor: gsm setattr name=sensor units=std_sensor datatype=long length=1\ value=$codice_sensore $FILEGEO >>&$IMPLOG if ( $status != 0 ) then echo Sensor attribute $FILEGEO failed. Continue. >>$REPORT continue endif #---Pass_date setattr name=pass_date units=std_date value=$YEAR.$JDAY \ $FILEGEO >>&$IMPLOG if ( $status != 0 ) then echo ----- Pass_date attribute $FILEGEO failed. Continue >>$REPORT continue endif #---Start_time setattr name=start_time units=std_time value=$STARTTIME \


$FILEGEO >>&$IMPLOG if ( $status != 0 ) then echo ----- Start_time attribute $FILEGEO failed. Continue >>$REPORT continue endif # #-------------------- IMGRID -------------------------------- # imgrid \ master_file=$MASTER grid_spacing=1 search_radius=$SEARCH_PARAM \ wt_power=2 z_variable="$TEMP sat_zenith time" z_index= \ $FILEGEO $FILEGRID >>&$IMPLOG if ( $status != 0 ) then echo $FILEGEO imgrid on $MASTER area failed. Continue >>$REPORT else #-------------- Set attributes #---Satellite setattr name=satellite units= datatype=str12 value=meteo-7 \ $FILEGRID >>&$IMPLOG if ( $status != 0 ) then echo Satellite attribute $FILEGRID failed. Continue. >>$REPORT continue endif #---Sensor set codice_sensore=8 #svissr sensor: gsm setattr name=sensor units=std_sensor datatype=long length=1\ value=$codice_sensore $FILEGRID >>&$IMPLOG if ( $status != 0 ) then echo Satellite attribute $FILEGRID failed. Continue. >>$REPORT continue endif #---Pass_date setattr name=pass_date units=std_date value=$YEAR.$JDAY \ $FILEGRID >>&$IMPLOG if ( $status != 0 ) then echo ----- Pass_date attribute $FILEGRID failed. Continue >>$REPORT continue endif #---Start_time setattr name=start_time units=std_time value=$STARTTIME \ $FILEGRID >>&$IMPLOG if ( $status != 0 ) then echo ----- Start_time attribute $FILEGRID failed. Continue >>$REPORT continue endif editdim dimname=line newname=ir_line varname= \ scale= offset= coord=3 $FILEGRID >>&$IMPLOG if ( $status != 0) then echo ----- $FILEGRID editdim line failed. Skipping >> $REPORT continue endif editdim dimname=sample newname=ir_sample varname= \ scale= offset= coord=2 $FILEGRID >>&$IMPLOG if ( $status != 0) then echo ----- $FILEGRID editdim line failed. Skipping >> $REPORT continue endif


echo Written outputfile $FILEGRID >>$REPORT endif #end status checking rm -f $FILEIR rm -f $FILEGEO end # End slot loop ------------- #------------------------------------ end # End day loop -------------------------- #-------------------------------------------------- end # End chan loop ------------------------------------ #--------------------------------------------------------- #rm -f $IMPLOG cp $0 $OUTDIR cp $REPORT $OUTDIR cp $REP_CALIB $OUTDIR exit


get_SSMI_4_rain #! /bin/csh # # R. Amorati - Sat. Met. Group - ISAO - CNR Bo. July 2001 # # This script imports SSM/I HDF files from GHRC archive # and creates TDF files to be input into rain_landsea # #================================================================= # ================== VARIABLES SETTING ====================== #================================================================= #------------DIRECTORIES VAR------------------- set INDIR = /workcyclone/pioggia/temporali-agosto2002/ssmi/HDF set OUTDIR = /workcyclone/pioggia/temporali-agosto2002/ssmi/Tb set TMPDIR = /workcyclone/pioggia/temp if (! -d $OUTDIR) mkdir -p $OUTDIR if (! -d $TMPDIR) mkdir -p $TMPDIR #----------------- SET LOG FILES ------------------------------- set IMPLOG = out.log #tdf command standard and diagnostic output set REPORT = status.log #command status results rm -f $IMPLOG $REPORT date >>$REPORT #-------------------- DATE VAR ---------------------- set YY = 02 set YEAR = 2002 set JDAY=218 foreach JDAY (218) set DATE = $YY$JDAY echo $DATE #-------------------- SENSOR ------------------------ set sensorlist='f13,f14,f15' #e.g 'f13,f14' #set sensorlist = f14 #---------------- Set hdf files to import ----------- #ls -1 $INDIR/[$sensorlist]*Tb*$DATE*.hdf set hdflist = `ls -1 $INDIR/[$sensorlist]*Tb*$DATE*.hdf` echo $hdflist #--------------- FOR EACH HDF TB FILE ------------------------ foreach TBHDF ( $hdflist ) echo $TBHDF # ---------- Tb file names ------------- set TBFILE = `basename $TBHDF` set ORBIT = ` ./get_orbit $TBFILE` if ($ORBIT == 'dayAD') then echo Skipping orbit 'dayAD' continue endif set SENSOR = ` ./get_sensor $TBFILE` if ($SENSOR == f13) then


set satname = f-13 set satcode = F13 else if($SENSOR == f14) then set satname = f-14 set satcode = F14 else if($SENSOR == f15) then set satname = f-15 set satcode = F15 else echo Sensor name not found. set satname = ? endif echo $TBFILE #----------- Geo files names---------------- set HNHDF = $INDIR/$SENSOR"_"hn"_"$DATE"_"$ORBIT.hdf set LNHDF = $INDIR/$SENSOR"_"ln"_"$DATE"_"$ORBIT.hdf if (! -e $HNHDF) then echo $HNHDF does not exist. Skipping orbit $ORBIT continue endif if (! -e $LNHDF) then echo $LNHDF does not exist. Skipping orbit $ORBIT continue endif #-------------- tdf output names ----------------- set TBTDF=$TMPDIR/$SENSOR"_"Tb"_"$DATE"_"$ORBIT.tdf set HNTDF=$TMPDIR/$SENSOR"_"hn"_"$DATE"_"$ORBIT.tdf set LNTDF=$TMPDIR/$SENSOR"_"ln"_"$DATE"_"$ORBIT.tdf set OUTFILE=$OUTDIR/$SENSOR"_"$DATE"_"$ORBIT.tdf # --------------- IMPORT TB HN LN------------------------ # hdftotdf import variable names no longer than 31 characters # It must be taken into account in next operation hdftotdf include_vars= shortname=no $TBHDF $TBTDF >>&$IMPLOG if ( $status != 0) then echo ----- HDF to TDF $TBHDF failed. Skipping >> $REPORT continue endif hdftotdf include_vars= shortname=no $LNHDF $LNTDF >>&$IMPLOG if ( $status != 0) then echo ----- HDF to TDF $LNHDF failed. Skipping >> $REPORT continue endif hdftotdf include_vars= shortname=no $HNHDF $HNTDF >>&$IMPLOG if ( $status != 0) then echo ----- HDF to TDF $HNHDF failed. Skipping >> $REPORT continue endif #----------- Extract pass_date and start_time ---------------- #---Pass_date set var_jday = `./get_first31 $satcode"_"Julian"_"Day"_"$DATE"_"$ORBIT` editvar include_vars=$var_jday var_units= \ bad_value= scale_factor=1 scale_offset= \ valid_min=-32768 valid_max=32768 $TBTDF set DATETB = `getval include_vars=$var_jday dim_indices="1 1" annotate=no printout=no $TBTDF`


editvar include_vars=$var_jday var_units= \
bad_value= scale_factor=1 scale_offset= \
valid_min=-32768 valid_max=32768 $HNTDF
set DATEHN = `getval include_vars=$var_jday dim_indices="1 1" annotate=no printout=no $HNTDF`
editvar include_vars=$var_jday var_units= \
bad_value= scale_factor=1 scale_offset= \
valid_min=-32768 valid_max=32768 $LNTDF
set DATELN = `getval include_vars=$var_jday dim_indices="1 1" annotate=no printout=no $LNTDF`
if ( $DATETB == $DATELN && $DATETB == $DATEHN ) then
set SAT_DATE=$DATETB
else
echo Pass_date does not match. Skipping. >> $REPORT
continue
endif
#---Start_time
set var_time = `./get_first31 $satcode"_"Time"_"of"_"Day"_"$DATE"_"$ORBIT`
editvar include_vars=$var_time var_units= \
bad_value= scale_factor=1 scale_offset= \
valid_min=-3.4028e+38 valid_max=3.4028e+38 $TBTDF
set TIMETB = `getval include_vars=$var_time dim_indices="1 1" annotate=no printout=no $TBTDF`
editvar include_vars=$var_time var_units= \
bad_value= scale_factor=1 scale_offset= \
valid_min=-3.4028e+38 valid_max=3.4028e+38 $HNTDF
set TIMEHN = `getval include_vars=$var_time dim_indices="1 1" annotate=no printout=no $HNTDF`
editvar include_vars=$var_time var_units= \
bad_value= scale_factor=1 scale_offset= \
valid_min=-3.4028e+38 valid_max=3.4028e+38 $LNTDF
set TIMELN = `getval include_vars=$var_time dim_indices="1 1" annotate=no printout=no $LNTDF`
if ( $TIMETB == $TIMELN && $TIMETB == $TIMEHN ) then
set STARTTIME=$TIMETB
else
echo Start_time does not match. Skipping. >> $REPORT
continue
endif
# ------------------- TB FILE -------------------------
#----------change variable names:
#  var_19V --> mi_19v
#  var_19H --> mi_19h
#  var_22V --> mi_22v
#  var_37V --> mi_37v
#  var_37H --> mi_37h
#  var_85V --> mi_85v
#  var_85H --> mi_85h
set var_19V = `./get_first31 $satcode"_"V19"_"$DATE"_"$ORBIT`
set var_19H = `./get_first31 $satcode"_"H19"_"$DATE"_"$ORBIT`
set var_22V = `./get_first31 $satcode"_"V22"_"$DATE"_"$ORBIT`
set var_37V = `./get_first31 $satcode"_"V37"_"$DATE"_"$ORBIT`
set var_37H = `./get_first31 $satcode"_"H37"_"$DATE"_"$ORBIT`


set var_85V = `./get_first31 $satcode"_"V85"_"$DATE"_"$ORBIT`
set var_85H = `./get_first31 $satcode"_"H85"_"$DATE"_"$ORBIT`
varname old_var_name=$var_19V new_var_name='mi_19v' $TBTDF >>&$IMPLOG
varname old_var_name=$var_19H new_var_name='mi_19h' $TBTDF >>&$IMPLOG
varname old_var_name=$var_22V new_var_name='mi_22v' $TBTDF >>&$IMPLOG
varname old_var_name=$var_37V new_var_name='mi_37v' $TBTDF >>&$IMPLOG
varname old_var_name=$var_37H new_var_name='mi_37h' $TBTDF >>&$IMPLOG
varname old_var_name=$var_85V new_var_name='mi_85v' $TBTDF >>&$IMPLOG
varname old_var_name=$var_85H new_var_name='mi_85h' $TBTDF >>&$IMPLOG
#-----def y coord for all Tb variables
foreach ID (4 6 8 10 12)
editdim dimname=fakeDim$ID newname=miline_lo varname= \
scale= offset= coord=3 $TBTDF >>&$IMPLOG
if ( $status != 0) then
echo ----- $TBTDF editdim line failed. Skipping >> $REPORT
continue
endif
end
foreach ID (14 16)
editdim dimname=fakeDim$ID newname=miline_hi varname= \
scale= offset= coord=3 $TBTDF >>&$IMPLOG
if ( $status != 0) then
echo ----- $TBTDF editdim line failed. Skipping >> $REPORT
continue
endif
end
#-----def x coord
foreach ID (5 7 9 11 13)
editdim dimname=fakeDim$ID newname=misamp_lo varname= \
scale= offset= coord=2 $TBTDF >>&$IMPLOG
if ( $status != 0) then
echo ----- $TBTDF editdim sample failed. Skipping >> $REPORT
continue
endif
end
foreach ID (15 17)
editdim dimname=fakeDim$ID newname=misamp_hi varname= \
scale= offset= coord=2 $TBTDF >>&$IMPLOG
if ( $status != 0) then
echo ----- $TBTDF editdim sample failed. Skipping >> $REPORT
continue
endif
end
#--------- Divide each Tb variable by the scale factor 100
editvar include_vars="mi_*" var_units=kelvin \
bad_value=-32768 scale_factor="0.01" scale_offset= \
valid_min=-32768 valid_max=32768 $TBTDF
# ----------------- LOW RES GEO ------------------------
#----------change variable names:
#  var_jday  --> date
#  var_time  --> time
#  var_latlo --> lat_lo
#  var_lonlo --> lon_lo
set var_latlo = `./get_first31 $satcode"_"Latitude"_("Low"_"Res")_"$DATE"_"$ORBIT`
set var_lonlo = `./get_first31 $satcode"_"Longitude"_("Low"_"Res")_"$DATE"_"$ORBIT`
varname old_var_name=$var_jday new_var_name='date' $LNTDF >>&$IMPLOG
if ( $status != 0) then
echo ----- $LNTDF date varname failed. Skipping >> $REPORT
continue
endif


varname old_var_name=$var_time new_var_name='time' \
$LNTDF >>&$IMPLOG
if ( $status != 0) then
echo ----- $LNTDF time failed. Skipping >> $REPORT
continue
endif
varname old_var_name=$var_latlo new_var_name='lat_lo' \
$LNTDF >>&$IMPLOG
if ( $status != 0) then
echo ----- $LNTDF latitude varname failed. Skipping >> $REPORT
continue
endif
varname old_var_name=$var_lonlo new_var_name='lon_lo' \
$LNTDF >>&$IMPLOG
if ( $status != 0) then
echo ----- $LNTDF longitude failed. Skipping >> $REPORT
continue
endif
#-----def y coord
foreach ID (0 2 4 6 8)
editdim dimname=fakeDim$ID newname=miline_lo varname= scale= \
offset= coord=3 $LNTDF >>&$IMPLOG
if ( $status != 0) then
echo ----- $LNTDF editdim line failed. Skipping >> $REPORT
continue
endif
end
#-----def x coord
foreach ID (5 7 9)
editdim dimname=fakeDim$ID newname=misamp_lo varname= scale= \
offset= coord=2 $LNTDF >>&$IMPLOG
if ( $status != 0) then
echo ----- $LNTDF editdim sample failed. Skipping >> $REPORT
continue
endif
end
#--------- Divide lat/lon by the scale factor 100
editvar include_vars="lat_lo" var_units=std_latitude \
bad_value=-32768 scale_factor="0.01" scale_offset= \
valid_min=-32768 valid_max=32768 $LNTDF
editvar include_vars="lon_lo" var_units=std_longitude \
bad_value=-32768 scale_factor="0.01" scale_offset= \
valid_min=-32768 valid_max=32768 $LNTDF
# ----------------- HIGH RES GEO ------------------------
#----------change variable names:
#  Data-Set-4 --> lat_hi
#  Data-Set-5 --> lon_hi
#  Data-Set-6 --> mi_sfc
set var_lathi = `./get_first31 $satcode"_"Latitude"_("High"_"Res")_"$DATE"_"$ORBIT`
set var_lonhi = `./get_first31 $satcode"_"Longitude"_("High"_"Res")_"$DATE"_"$ORBIT`
set var_sfc = `./get_first31 $satcode"_"Surface"_"Type"_("High"_"Res")_"$DATE"_"$ORBIT`
echo $var_lathi
echo $var_lonhi
echo $var_sfc
varname old_var_name=$var_lathi new_var_name='lat_hi' \
$HNTDF >>&$IMPLOG
if ( $status != 0) then


echo ----- $HNTDF latitude varname failed. Skipping >> $REPORT
continue
endif
varname old_var_name=$var_lonhi new_var_name='lon_hi' \
$HNTDF >>&$IMPLOG
if ( $status != 0) then
echo ----- $HNTDF longitude varname failed. Skipping >> $REPORT
continue
endif
varname old_var_name=$var_sfc new_var_name='mi_sfc' \
$HNTDF >>&$IMPLOG
if ( $status != 0) then
echo ----- $HNTDF surface type varname failed. Skipping >> $REPORT
continue
endif
#-----def y coord
foreach ID (4 6 8)
editdim dimname=fakeDim$ID newname=miline_hi varname= scale= \
offset= coord=3 $HNTDF >>&$IMPLOG
if ( $status != 0) then
echo ----- $HNTDF editdim line failed. Skipping >> $REPORT
continue
endif
end
#-----def x coord
foreach ID (5 7 9)
editdim dimname=fakeDim$ID newname=misamp_hi varname= scale= \
offset= coord=2 $HNTDF >>&$IMPLOG
if ( $status != 0) then
echo ----- $HNTDF editdim sample failed. Skipping >> $REPORT
continue
endif
end
#------------ Divide lat/lon by the scale factor 100
editvar include_vars="lat_hi" var_units=std_latitude \
bad_value=-32768 scale_factor="0.01" scale_offset= \
valid_min=-32768 valid_max=32768 $HNTDF
editvar include_vars="lon_hi" var_units=std_longitude \
bad_value=-32768 scale_factor="0.01" scale_offset= \
valid_min=-32768 valid_max=32768 $HNTDF
#----------- Assemble variables to output file
assemble include_vars="mi_19v mi_19h mi_22v mi_37v mi_37h mi_85v mi_85h mi_sfc lat_lo lon_lo date time" \
extend_names=no instantiate=yes \
$TBTDF $LNTDF $HNTDF $OUTFILE >>&$IMPLOG
if ( $status != 0 ) then
echo Assemble failed. Continue. >>$REPORT
else
setattr name=satellite units= datatype=str12 \
value=$satname $OUTFILE >>&$IMPLOG
if ( $status != 0 ) then
echo Satellite attribute $OUTFILE failed. Continue. >>$REPORT
continue
endif
setattr name=sensor_name units= datatype=str12 \
value=ssmi $OUTFILE >>&$IMPLOG


if ( $status != 0 ) then
echo Sensor_name attribute $OUTFILE failed. Continue. >>$REPORT
continue
endif
#---Pass_date
setattr name=pass_date units=std_date value=$YEAR.$SAT_DATE \
$OUTFILE >>&$IMPLOG
if ( $status != 0 ) then
echo ----- Pass_date attribute $OUTFILE failed. Continue >>$REPORT
continue
endif
#---Start_time
setattr name=start_time units=std_time value=$STARTTIME \
$OUTFILE >>&$IMPLOG
if ( $status != 0 ) then
echo ----- Start_time attribute $OUTFILE failed. Continue >>$REPORT
continue
endif
echo Written file $OUTFILE
endif
rm $TBTDF $LNTDF $HNTDF
end
end
exit
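The script above relies on three local helpers whose sources are not listed in this document: get_orbit, get_sensor and get_first31. The in-script comment notes that hdftotdf keeps only the first 31 characters of each imported variable name, so get_first31 presumably returns the first 31 characters of the HDF variable name it is given. The lines below are a minimal sketch of such a helper under that assumption; they are not the authors' code.

#! /bin/csh
# get_first31 (hypothetical sketch): print the first 31 characters of the
# variable name passed as $1, matching the hdftotdf 31-character limit.
echo "$1" | awk '{print substr($0,1,31)}'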


rain

#! /bin/csh -xv
# This script drives the Ferraro algorithm procedure.
# Set INPUT OUTPUT directories and ssmi file list.
# 18/3/2002 R.A.
setenv GEORAINROOT /home/pioggia/programmi/ssmi/rain
echo $GEORAINROOT
setenv SSMI1 /workcyclone/pioggia/Confronti-lamma/ssmi/isac/Tb
echo $SSMI1
setenv DONE /workcyclone/pioggia/Confronti-lamma/ssmi/isac/rain
if (! -d $DONE ) mkdir -p $DONE
ls $DONE
set ssmilist = `ls -1 $SSMI1/f1*_03329*.tdf`
foreach ssmitb ( $ssmilist )
echo $ssmitb
set tdffile = `basename $ssmitb`
set orbita = `./get_filename $tdffile`
set ssmitmp = $DONE/$orbita.tmp
set raintmp = $DONE/$orbita.rain
set outfile = $DONE/$orbita"_"rain.tdf
echo $tdffile
echo $orbita
echo $ssmitmp
echo $raintmp
echo $outfile
$GEORAINROOT/rain_landsea $ssmitb $ssmitmp
assemble include_vars="rainrate lat_lo lon_lo date time" \
extend_names=no instantiate=yes keep_hist=no \
$ssmitb $ssmitmp $raintmp
ls $raintmp
./change_date <<EOF
$raintmp
$outfile
rainrate
lat_lo
lon_lo
EOF
copyatt attributes= overwrite_atts=yes $ssmitb $outfile
rm $ssmitmp $raintmp
end
exit
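The rain driver uses two further local helpers not listed here: change_date, which reads the file and variable names from its standard input, and get_filename, which derives the base name used for the temporary and output files. Judging from how $orbita is reused with a .tdf suffix, get_filename most likely strips the trailing .tdf extension; the line below is a minimal sketch under that assumption, not the authors' code.

#! /bin/csh
# get_filename (hypothetical sketch): strip the trailing .tdf extension from
# the file name passed as $1, e.g. f13_02218_A1.tdf -> f13_02218_A1
basename "$1" .tdf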


stats_from_geo

#! /bin/csh
#--------------------------------------------
# J. Turk NRL
# Statistics generator, initiated with a geostationary file.
# Requires paths to the various directories where geostationary data
# are located (see foreach loop) and also to the SSMI and TRMM 2A12 directories.
# Output files are ASCII with the file naming, e.g.,
#   19990809.1828.f-13.19990809.1800.goes-10
# Output file format is as follows:
#   2000/04/26 12:06:34 36641 43594 f-15
#   2000/04/26 11:00:00 36641 39600 meteo-5
#   25.91 119.76 0.00 237.00 244.28 225.00 230.42 222.00 231.36 233.00 235.03
#   26.11 119.87 0.00 248.00 255.50 230.00 231.92 230.00 231.89 230.00 232.19
#   26.32 120.00 0.00 245.00 248.50 237.00 239.39 230.00 231.42 229.00 231.33
#   26.13 119.69 0.00 247.00 255.67 232.00 236.39 232.00 233.42 229.00 231.31
#   etc.
# -------------------------------------------------
# R. Amorati Sat. Met. Group - ISAO-CNR Aug 2001
#
# Modified J. Turk version to eliminate on-line usage.
# Only SSMI and METEO-7 are used.
#--------------------------------------------------------
if ( $#argv == 0 ) then
echo Usage: $0 full_pathname_geofile outdir master_file
exit 1
endif
unalias rm
onintr bail
set geo = `basename $1`  # Input geostationary full-path file name
set STATDIR = $2         # where ASCII output files go
set MASTER = $3
if ( ! -e $MASTER) then
echo Master file does not exist. Exiting....
exit 1
endif
set minir = -90.0   # min IR TB to allow for statistics, Celsius
set maxir = 30.0    # max IR TB to allow for statistics, Celsius
set maxsec = 900    # max time difference (seconds) for co-location
set maxstat = 2000  # max MB in STATDIR, keep enough for look-back time
set maxtest = 10    # max MB in TESTDIR, keep enough for look-back time
set maxlog = 200    # max MB in LOGDIR
set gsat = `getinfo attr_name=satellite $GEODIR/$geo`
if ( $status != 0 ) exit 1
set gdate = `getinfo attr_name=pass_date $GEODIR/$geo`
set gtime = `getinfo attr_name=start_time $GEODIR/$geo`
echo ___$gtime
set yyyymmdd = `tsdate output=yyyy_mm_dd std_format=yes date=$gdate`
set yyyy = $yyyymmdd[1]
set mm = $yyyymmdd[2]
set dd = $yyyymmdd[3]
set hh = `echo $gtime | awk '{print substr($0,1,2)}'`


set mn = `echo $gtime | awk '{print substr($0,4,2)}'`
set gname = $yyyy$mm$dd.$hh$mn.$gsat
echo $gname
cd $SCRATCH
#--------------------------------------------------------
#--- Loop through all geo data and look for spatial alignment.
#--- The names are the same as in refdata/satel
set found = 0
if ( $gsat == meteo-7) then
set irname = mvissr_ir
set minlat = 20
set maxlat = 60
set minlon = -60
set maxlon = 60
set gsat_filetime = 30
else
echo $gsat not found in list, exiting
exit 1
endif
#--- Check for IR variable
set dim = `varinfo var_name=$irname attr_name=size $GEODIR/$geo`
if ( $status != 0 ) then
echo `date -u` Missing variable $irname in $geo
exit 1
endif
# Use the same master as the meteosat master. No need to subset.
#rm -f geosubset_$$
#mastersub \
#  include_vars=$irname master_file=$MASTER instantiate=no \
#  $GEODIR/$geo geosubset_$$
#
#if ( ! -e geosubset_$$ ) exit 1
#--- Look for time and space coincident microwave pixels
set m = ssmi  # No trmm available
echo `date -u` Starting processing of $m data for intersection with $geo
set rrname = rainrate
set psat_filetime = 45
set maxnames = 10
set maxkm = 15.0
set latvar = lat_lo
set lonvar = lon_lo
set mwfmt = "*_rain.tdf"
#--- Set the time windows for nearestfile to the geostationary file
set maxmin = `expr $maxsec / 60`
set time1 = `expr -1 \* $psat_filetime - $maxmin`
set time2 = `expr $gsat_filetime + $maxmin`
set timewindow = "$time1 $time2"
#--- List all of the microwave files within timewindow, using
#    the date/time of the geo file as a reference
cd $MWDIR


set mwnames = `nearestfile directory=$MWDIR template="$mwfmt" time_units=minutes time_window="$timewindow" open_interval=yes read_time=yes same_sat=no max_files=$maxnames now_file=$GEODIR/$geo`
if ( $status != 0 ) then
cd $SCRATCH
echo `date -u` No mw files located for $geo
continue
else
cd $SCRATCH
set nmw = ${#mwnames}
echo Located $nmw mw files= $mwnames
endif
foreach mw ( $mwnames )
set mw = `basename $mw`
echo `date -u` Starting processing of $mw for intersection with $geo
set psat = `getinfo attr_name=satellite $MWDIR/$mw`
set pdate = `getinfo attr_name=pass_date $MWDIR/$mw`
set ptime = `getinfo attr_name=start_time $MWDIR/$mw`
echo ___________________________________________$ptime
set sign = `echo $ptime | awk '{print substr($0,1,1)}'`
set meno = "-"
if ( "$sign" == "$meno") then
set hh = `echo $ptime | awk '{print substr($0,2,2)}'`
set mn = `echo $ptime | awk '{print substr($0,5,2)}'`
else
set hh = `echo $ptime | awk '{print substr($0,1,2)}'`
set mn = `echo $ptime | awk '{print substr($0,4,2)}'`
endif
set yyyymmdd = `tsdate output=yyyy_mm_dd std_format=yes date=$pdate`
set yyyy = $yyyymmdd[1]
set mm = $yyyymmdd[2]
set dd = $yyyymmdd[3]
set pname = $yyyy$mm$dd.$hh$mn.$psat
set stfile = ${pname}.${gname}
echo
echo ++++++++++++++$stfile
echo
if ( -e $TESTDIR/$stfile ) then
echo Colocated file already processed $stfile
continue
endif
# --- mastersub mw file
set mastersubexe = mastersubssmi
rm -f mw_$$
$GEORAINROOT/$mastersubexe > /dev/null \
include_var=$rrname latitude_var=$latvar longitude_var=$lonvar \
master_file=$MASTER $MWDIR/$mw mw_$$
if ( ! -e mw_$$ ) then
echo `date -u` $mw does not intersect $geo
continue
endif
echo `date -u` Rainrate completed for $mw
#--- Add the attributes to the microwave file
setattr name=pass_date units=std_date value=$pdate mw_$$
setattr name=start_time units=std_time value=$ptime mw_$$
setattr name=satellite units= datatype=str12 value=$psat mw_$$


editvar \
include_vars=$rrname var_units="mm/hr" bad_value= \
scale_factor= scale_offset= valid_max= valid_min= mw_$$
set dim = `varinfo var_name=$rrname attr_name=size mw_$$`
if ( $status != 0 ) then
echo $rrname not located in microwave file
continue
endif
echo -------------+ Setting attributes DONE
#--- Export the coincident pixels to an ascii file
rm -f points_$$
$GEORAINROOT/expstat > /dev/null \
ir_varname=$irname rain_varname=$rrname \
max_km_diff=$maxkm max_time_diff=$maxsec \
tb_max=$maxir tb_min=$minir parallax_adj=n \
mw_$$ $GEODIR/$geo points_$$
#--- Store an attributes-only TDF in the "done" directory
assemble \
include_vars="-*" extend_names=no instantiate=yes \
$MWDIR/$mw $TESTDIR/$stfile
rm -f mw_$$
if ( ! -e points_$$ ) continue
if ( ! -d $STATDIR ) mkdir -p $STATDIR
#if ( ! -e $STATDIR/$stfile ) cp points_$$ $STATDIR/$stfile
cp points_$$ $STATDIR/$stfile
rm -f points_$$
if ( -e $STATDIR/$stfile ) set found = 1
end  # each mw file
#end  # each mwsat
echo `date -u` Ended processing of $geo
rm -f $SCRATCH/*_$$
diskhogs \
directory=$TESTDIR template='*' max_mbytes=$maxtest read_time=no dry_run=no
diskhogs \
directory=$LOGDIR template='*' max_mbytes=$maxlog read_time=no dry_run=no
if ( $found != 1 ) exit 1
diskhogs \
directory=$STATDIR template='*' max_mbytes=$maxstat read_time=no dry_run=no
exit 0
bail:
rm -f $SCRATCH/*_$$
exit 1
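Besides its three command-line arguments, stats_from_geo expects several environment variables (GEODIR, MWDIR, SCRATCH, TESTDIR, LOGDIR and GEORAINROOT) to be defined by the calling environment. The lines below sketch a possible invocation matching the Usage message above; the path names are placeholders chosen for illustration, not the operational settings.

#! /bin/csh
# Example call of stats_from_geo (sketch; placeholder paths)
setenv GEORAINROOT /home/pioggia/programmi/ssmi/rain  # location of mastersubssmi and expstat
setenv GEODIR  /data/meteosat/tdf       # registered geostationary TDF files
setenv MWDIR   /data/ssmi/rain          # SSM/I *_rain.tdf files from the rain script
setenv SCRATCH /tmp/georain_scratch     # working directory
setenv TESTDIR /data/stats/done         # attributes-only TDFs marking processed pairs
setenv LOGDIR  /data/stats/log          # log directory pruned by diskhogs
./stats_from_geo $GEODIR/20020806.1200.meteo-7 /data/stats/ascii master.tdf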


lancia_georain

#! /bin/csh
# Driver program for instantaneous global georain.
# The main_master is currently set for a 0.10-degree global resolution.
# Currently set up for any of: goes-10 goes-8 gms-5 meteo-7 meteo-5
# in order to simulate a dual (or multiple) capture system.
# 2001/01/18 J. Turk
# 2001/10/30 R. Amorati. From the J. Turk version, eliminating the loop over regions.
#            Changed usage: introduced an input directory rather than an input file.
#            Changed master regions.
#
unalias rm mv
onintr bail
set filelist = "$1"
echo $filelist
set same_master_geo = $2  # set to "yes" if rain is computed on the same master as meteosat IR
set main_master = $3
echo $same_master_geo
echo $main_master
if ( ! -e $main_master ) then
echo `date -u` main_master not specified, check define_regions script
exit 1
endif
#-------------------------------------------------------------------------------------
#--- Options for georain
set georainvar=rain        # name of rain variable
set spectral_screen=n      # apply 3.9-um and 6.7-um channels for rain/no-rain screen
set temporal_screen=n      # apply past 11-um IR history for rain/no-rain screen
set orographic_check=n     # use 1-degree NOGAPS fields as a check for orographic uplift
set spatial_screen=n       # apply spatial (boxsize x boxsize) checks for rain/no-rain screen
set boxsize=5
set maxmodel=50            # max MB in MODELDIR
set maxreg=10000           # max MB under REGDIR
set min_geo_cover=5        # minimum percent-coverage after main_master registration
#set reprocess = 0         # re-draw images from FILENAME
#---------------------------------------------------------------------------------------
foreach FILENAME ($filelist)
echo $FILENAME
set geo = `basename $FILENAME`
cd $SCRATCH
#-------------------------------------------------------


#--- Prepare large map-registration area, depending upon which geosats one has
set geosat = `getinfo attr_name=satellite $FILENAME`
if ( $status != 0 ) then
echo `date -u` $geo does not have satellite attribute
#goto bail
continue
endif
set passdate = `getinfo attr_name=pass_date $FILENAME`
if ( $status != 0 ) then
echo `date -u` $geo does not have pass_date attribute
#goto bail
continue
endif
set passtime = `getinfo attr_name=start_time $FILENAME`
if ( $status != 0 ) then
echo `date -u` $geo does not have start_time attribute
#goto bail
continue
endif
set yyyymmdd = `tsdate output=yyyy_mm_dd std_format=yes date=$passdate`
set year = $yyyymmdd[1]
set month = $yyyymmdd[2]
set day = $yyyymmdd[3]
set hour = `echo $passtime | awk '{print substr($0,1,2)}'`
set min = `echo $passtime | awk '{print substr($0,4,2)}'`
set ztime = $hour$min
set passtime = ${hour}:${min}:00  # reset this without any seconds
set datetime = $year$month$day.$hour$min.$geosat
# Use the same master as the meteosat master. No need to subset.
if ( $geosat == meteo-7 ) then
set irname = mvissr_ir
set wvname = mvissr_wv
set niname = none
else
echo `date -u` $geosat not found
exit 1
endif
set units = `varinfo var_name=$irname attr_name=units $FILENAME`
if ( $status != 0 ) then
echo `date -u` Failed to locate units in $geo
#goto bail
continue
endif
if ( $units == temp_deg_c || $units == Celsius || $units == celsius ) then
set k2c = 0
else if ( $units == temp_deg_k || $units == Kelvin || $units == kelvin ) then
set k2c = 273
else
echo `date -u` IR variable units of $units are not recognized, modify $0 accordingly
#goto bail
continue
endif
set dim = `varinfo var_name=$irname attr_name=size $FILENAME`
if ( $status == 0 ) then
set varnames = $irname


else
echo `date -u` No IR variable in $geo, exiting
#goto bail
continue
endif
set have_wv = 0
set dim = `varinfo var_name=$wvname attr_name=size $FILENAME`
if ( $status == 0 && $spectral_screen == y ) then
set varnames = "$varnames $wvname"
set have_wv = 1
endif
set have_ni = 0
set dim = `varinfo var_name=$niname attr_name=size $FILENAME`
if ( $status == 0 && $spectral_screen == y ) then
set varnames = "$varnames $niname"
set have_ni = 1
endif
rm -f data_$$
if ( $same_master_geo == yes ) then
echo same master as meteosat
cp $FILENAME reg_$$
else
mastersub \
include_vars="$varnames" master_file=$main_master instantiate=no \
$FILENAME data_$$
if ( ! -e data_$$ ) then
echo `date -u` $geo does not cover satellite-master for $geosat
continue
endif
rm -f mastersat_$$
echo `date -u` $geo is ready for further processing
#--------------------------------------------------------
#--- Map register and save TDF on the master grid
echo `date -u` Started map registration for $geo
fastreg \
master_file=$main_master include_vars="$varnames" poly_size= \
data_$$ reg_$$
if ( $status != 0 ) then
echo `date -u` fastreg of $geo failed
continue
endif
rm -f data_$$
endif
set cover = `count variable=$irname output=total_percent criteria=good reg_$$`
echo `date -u` $geo covers $cover percent before georain
if ( `expr $cover \< $min_geo_cover` ) then
echo `date -u` Insufficient coverage $cover percent before georain
continue
endif
echo `date -u` Started georain processing for $geo
$GEORAINROOT/georain > /dev/null \


ir_varname=$irname \
hist_dir=$HISTDIR \
spect_screen=$spectral_screen \
wv_varname=$wvname \
nir_varname=$niname \
time_screen=$temporal_screen \
imager_dir=$REGDIR \
box_size=$boxsize \
spatial_screen=$spatial_screen \
orograph_check=$orographic_check \
model_dir=$MODELDIR \
reg_$$ georain_$$
if ( -e georain_$$ ) then
set dim = `varinfo var_name=$georainvar attr_name=size georain_$$`
if ( $status != 0 ) then
set goodpts = 0
else
set goodpts = `count variable=$georainvar output=count criteria=good georain_$$`
endif
else
set goodpts = 0
endif
if ( $goodpts != 0 ) then
set histfile_used = `getinfo attr_name=histogram_file georain_$$`
echo Histogram file used= $histfile_used
assemble \
include_vars="$irname $georainvar" extend_names=no instantiate=no \
reg_$$ georain_$$ temp_$$
setattr name=histogram_file units= datatype=str256 value=$histfile_used \
temp_$$
set varnames = ( $irname rain )
else
echo `date -u` Rain variable not created
set varnames = $irname
continue
endif
#--- Add in date/time/sat attributes so that accumulations
#    can be calculated in time-sequential order
setattr name=satellite units= datatype=str12 value=$geosat temp_$$
setattr name=pass_date units=std_date value=$passdate temp_$$
setattr name=start_time units=std_time value=$passtime temp_$$
#--- Save the final TDF for use in the accumulations
if ( ! -d $REGDIR ) mkdir -p $REGDIR
instant temp_$$ $REGDIR/$datetime
rm -f temp_$$ reg_$$ georain_$$
echo `date -u` Ended georain processing for $geo
end  #--- main loop over input files
#--------------------------------------------------------
#diskhogs \
#  directory=$REGDIR template='*' max_mbytes=$maxreg read_time=no dry_run=no
#
#diskhogs \
#  directory=$MODELDIR template='*' max_mbytes=$maxmodel read_time=no dry_run=no
rm -f $SCRATCH/*_$$


cp $GEORAINROOT/$0 $REGDIR
exit 0
bail:
rm -f $SCRATCH/*_$$
exit 1
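lancia_georain takes the list of geostationary TDF files as its first argument, the same-master flag as its second and the main master file as its third; like stats_from_geo it also relies on environment variables set by the caller (SCRATCH, GEORAINROOT, HISTDIR, REGDIR, MODELDIR). The lines below sketch a possible invocation; the path names are placeholders for illustration only, not the operational configuration.

#! /bin/csh
# Example call of lancia_georain (sketch; placeholder paths)
setenv GEORAINROOT /home/pioggia/programmi/ssmi/rain  # location of the georain executable
setenv SCRATCH  /tmp/georain_scratch    # working directory
setenv HISTDIR  /data/georain/hist      # IR/rain-rate histogram files used by georain
setenv REGDIR   /data/georain/reg       # registered output TDFs, one per $datetime
setenv MODELDIR /data/georain/model     # NOGAPS fields for the orographic check
./lancia_georain "`ls /data/meteosat/tdf/2002080612*`" yes master.tdf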


APPENDIX 3: Glossary of terms

DMSP      Defense Meteorological Satellite Program
EUMETSAT  European Organisation for the Exploitation of Meteorological Satellites
GEO       Geostationary (satellite)
GHCC      Global Hydrology and Climate Center
GHRC      Global Hydrology Resource Center
HDF       Hierarchical Data Format
IR        Infrared
ISAC      Institute of Atmospheric Sciences and Climate
LaMMA     Laboratory for Meteorology and Environmental Modelling
LEO       Low Earth Orbit (satellite)
MW        Microwave
NESDIS    National Environmental Satellite Data and Information Service
NOAA      National Oceanic and Atmospheric Administration
NRL       Naval Research Laboratory
OpenMTP   Basic METEOSAT image format for the Meteosat Transition Programme
PDUS      Primary Data User Station
PMM       Probability Matching Method
RR        Rain rate
RU        Rapid Update
SAA       Satellite Active Archive
SSM/I     Special Sensor Microwave/Imager
TDF       TeraScan® Data Format
TDR       Temperature Data Record
VIS       Visible
WV        Water vapor