LES on massively parallel computers
Broadening of convective cells during cold air outbreaks:
A high resolution study using a parallelized LES-Model
Dr. Siegfried Raasch
Institut für Meteorologie und Klimatologie
Universität Hannover
Contents

• PALM – a parallelized LES-model
  - model equations
  - parallelization principles and strategy
  - performance analysis
• High resolution study of convective cells
- broadening of convective cells during cold air outbreaks
• Studies within AFO2000 and DEKLIM
- effects of surface inhomogeneities on boundary layer turbulence (including cloud coverage)
PALM equations
Filtered model equations (overbars denote resolved-scale quantities, double primes subgrid-scale fluctuations):

$$\frac{\partial \overline{u}_i}{\partial t} = -\frac{\partial\, \overline{u}_k \overline{u}_i}{\partial x_k} - \varepsilon_{ijk} f_j \overline{u}_k + \varepsilon_{i3j} f_3 u_{g_j} - \frac{1}{\rho_0} \frac{\partial \overline{p}^*}{\partial x_i} + g\, \frac{\overline{\theta}_v - \theta_{v_0}}{\theta_{v_0}}\, \delta_{i3} - \frac{\partial\, \overline{u_k'' u_i''}}{\partial x_k}$$

$$\frac{\partial \overline{\theta}_l}{\partial t} = -\frac{\partial\, \overline{u}_k \overline{\theta}_l}{\partial x_k} - \frac{\partial\, \overline{u_k'' \theta_l''}}{\partial x_k} + \left( \frac{\partial \overline{\theta}_l}{\partial t} \right)_{\mathrm{RAD}} + \left( \frac{\partial \overline{\theta}_l}{\partial t} \right)_{\mathrm{PREC}}$$

$$\frac{\partial \overline{q}}{\partial t} = -\frac{\partial\, \overline{u}_k \overline{q}}{\partial x_k} - \frac{\partial\, \overline{u_k'' q''}}{\partial x_k} + \left( \frac{\partial \overline{q}}{\partial t} \right)_{\mathrm{PREC}}$$

$$\frac{\partial \overline{u}_k}{\partial x_k} = 0$$

with the liquid water potential temperature and the total water content

$$\theta_l = \theta - \frac{L_v}{c_p} \frac{\theta}{T}\, q_l\,, \qquad q = q_v + q_l$$

and the Poisson equation for the perturbation pressure

$$\frac{\partial^2 \overline{p}^{\,*}}{\partial x_k \partial x_k} = \frac{\rho_0}{\Delta t} \frac{\partial \tilde{u}_k}{\partial x_k}$$
Advantages of using the set of liquid water potential temperature and total water content, θ_l and q (see e.g. Deardorff, 1976):

• θ_l and q are conserved quantities (as long as precipitation, radiation, and freezing processes are excluded), in particular under condensation
• no problems if saturation occurs in only part of a grid volume (otherwise, a subgrid-scale condensation scheme would be necessary)
• no extra variable for the liquid water content (lower memory demand)
• for dry convection or convection without condensation, the set θ_l and q is identical to potential temperature and specific humidity
• no additional phase-change terms are necessary in the prognostic equations
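To make the conservation property concrete, here is a minimal numerical sketch (illustrative only, not PALM code; the constants and variable names are chosen freely, and the θ/T factor in the θ_l definition is omitted for brevity): a condensation event converts vapour into liquid water and releases latent heat, yet θ_l and q are unchanged:

```python
# Minimal sketch (not PALM code): theta_l and q are invariant under a phase
# change that converts vapour to liquid water and releases latent heat.
L_V = 2.5e6   # latent heat of vaporization [J/kg]
C_P = 1005.0  # specific heat of dry air at constant pressure [J/(kg K)]

def theta_l(theta, q_l):
    """Liquid water potential temperature (theta/T factor omitted)."""
    return theta - (L_V / C_P) * q_l

def total_water(q_v, q_l):
    """Total water content: vapour plus liquid."""
    return q_v + q_l

# Before condensation: all water in the vapour phase
theta, q_v, q_l = 300.0, 8e-3, 0.0

# Condense dq of vapour: q_v decreases, q_l increases, and the
# released latent heat warms the parcel.
dq = 1e-3
theta_new = theta + (L_V / C_P) * dq
q_v_new, q_l_new = q_v - dq, q_l + dq

print(theta_l(theta, q_l), theta_l(theta_new, q_l_new))      # identical
print(total_water(q_v, q_l), total_water(q_v_new, q_l_new))  # identical
```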
Example: LES of a convective boundary layer (CBL)
• computational domain: 2000 m × 2000 m × 3000 m
• grid spacing: 25 m
• grid points: 80 × 80 × 65
• inversion height z_i: 800 m
• simulation period: 1 h
Why use a parallel computer?
• Many open problems in boundary layer research require extreme computational power
- interactions between turbulent structures of different scales: organized convection during cold-air outbreaks; flow around obstacles
- stably stratified turbulence: entrainment layers; katabatic flows
- test of subgrid-scale models
• Normal-sized LES studies run much faster than on single-processor computers
- a large number of runs with parameter variations can be completed in a short time
Program requirements for efficient use of massively parallel computers
• load balancing
• small communication overhead
• scalability (up to large numbers of processors)
best domain decomposition:
S. Raasch and M. Schröter, 2001: PALM – A Large-Eddy Simulation Model Performing on Massively Parallel Computers. Meteorol. Z., 10, 363-372.
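As a sketch of the decomposition logic (an illustrative helper, not PALM's actual routine; all names are hypothetical), a 2-D decomposition assigns each PE a contiguous x-y subdomain of nearly equal size, which directly provides the load balancing listed above:

```python
# Illustrative 2-D domain decomposition (hypothetical helper, not PALM code):
# split an nx x ny horizontal grid over a px x py processor grid.
def subdomain(nx, ny, px, py, rank):
    """Return the index ranges [x0, x1) x [y0, y1) owned by `rank`."""
    ix, iy = rank % px, rank // px      # position in the PE grid
    def split(n, p, i):
        base, rest = divmod(n, p)       # distribute the remainder points
        lo = i * base + min(i, rest)    # to the first `rest` PEs
        return lo, lo + base + (1 if i < rest else 0)
    x0, x1 = split(nx, px, ix)
    y0, y1 = split(ny, py, iy)
    return (x0, x1), (y0, y1)

# 160 x 160 horizontal grid points on a 4 x 4 PE grid -> 40 x 40 per PE
for rank in range(16):
    print(rank, subdomain(160, 160, 4, 4, rank))
```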
Decomposition consequences (I)

Central finite differences cause local data dependencies.
Solution: introduction of ghost points.
$$\left. \frac{\partial \psi}{\partial x} \right|_{i} \approx \frac{\psi_{i+1} - \psi_{i-1}}{2\, \Delta x}$$
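A serial toy version of the ghost-point idea (illustrative only; in the parallel model the ghost layers are filled by communication with the neighbouring PEs, whereas here a periodic neighbour is assumed): each subdomain stores one extra point per side, fills it with the neighbour's boundary value, and can then apply the central difference everywhere in its interior without further communication:

```python
import numpy as np

# Illustrative ghost-point update for a 1-D subdomain (not PALM code).
# In the parallel model the ghost layers are filled by MPI exchange with
# the neighbouring PEs; here they are filled from a periodic "neighbour".
nx, dx = 8, 25.0
u = np.sin(2 * np.pi * np.arange(nx) / nx)   # interior values of this PE

ug = np.empty(nx + 2)                        # one ghost point on each side
ug[1:-1] = u
ug[0]  = u[-1]   # left ghost  <- right boundary of the left neighbour
ug[-1] = u[0]    # right ghost <- left boundary of the right neighbour

# The central difference is now purely local, even at the subdomain edges:
dudx = (ug[2:] - ug[:-2]) / (2 * dx)
print(dudx)
```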
Decomposition consequences (II)
The FFT and the linear equation solver cause non-local data dependencies.

Solution: transposition of the 3D arrays.

Example: transpositions for solving the Poisson equation
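The following toy illustrates the spectral principle the solver relies on (a serial, doubly periodic 2-D example with numpy; PALM itself works on transposed 3-D arrays so that each PE holds complete grid lines along the current FFT direction): after the FFT, the Poisson equation reduces to a per-wavenumber division by −|k|²:

```python
import numpy as np

# Toy spectral Poisson solver on a doubly periodic 2-D grid (not PALM code):
# solve  laplacian(p) = f  by dividing the Fourier coefficients by -|k|^2.
n, L = 64, 2000.0
k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)   # angular wavenumbers
kx, ky = np.meshgrid(k, k, indexing="ij")
k2 = kx**2 + ky**2
k2[0, 0] = 1.0                    # avoid division by zero for the mean mode

x = np.arange(n) * L / n
X, Y = np.meshgrid(x, x, indexing="ij")
f = np.sin(2 * np.pi * X / L) * np.cos(4 * np.pi * Y / L)   # sample RHS

p_hat = -np.fft.fft2(f) / k2
p_hat[0, 0] = 0.0                 # the mean of p is arbitrary; set it to zero
p = np.real(np.fft.ifft2(p_hat))

# Residual check against a 2nd-order finite-difference laplacian:
h = L / n
lap = (np.roll(p, -1, 0) + np.roll(p, 1, 0) + np.roll(p, -1, 1)
       + np.roll(p, 1, 1) - 4 * p) / h**2
print(np.max(np.abs(lap - f)))    # small (spectral vs. 2nd-order difference)
```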
Scalability and performance (I)

• Results for SGI/Cray-T3E (160 × 160 × 64 grid points)

speedup:

$$s(P) = \frac{t_1}{t_P}$$

[figure: log-log plot of the speedup s(P) versus the number of PEs, PALM compared with the ideal linear speedup]
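Stated as code (trivial, but it fixes the definition; the timings below are placeholders, not measured PALM results):

```python
# Speedup s(P) = t(1) / t(P) and parallel efficiency
# (placeholder timings, NOT measured PALM data).
def speedup(t_1, t_P):
    return t_1 / t_P

def efficiency(t_1, t_P, P):
    return speedup(t_1, t_P) / P

t_1, t_64 = 3600.0, 70.0   # hypothetical wall-clock times [s]
print(speedup(t_1, t_64), efficiency(t_1, t_64, 64))
```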
Cell broadening (II)
vertical velocity at z = 1800 m (from: Müller and Chlond, 1996: BLM, 81, 289-323)

[figure: horizontal cross-section, 102.4 km × 102.4 km]
Cell broadening (III)
liquid water content q_l at z = 3100 m; vertical velocity at z = 2150 m

PALM results: 704 × 704 × 82 grid points, ~10 GByte, 115 h on 256 PEs
Cell broadening (IV)
[figure: spectra of the percentage of total energy, panels "with condensation" and "without condensation"]
for more information see: M. Schröter and S. Raasch, 2002: Broadening of Convective Cells. AMS 15th Symposium on Boundary Layers and Turbulence, Wageningen.
Effects of inhomogeneities (I)
prescribed surface heat flux:

$$Q(x, y) = Q_0 \left[\, 1 + 0.5 \cos\!\left(\frac{2 \pi x}{\lambda_x}\right) \cdot 0.5 \cos\!\left(\frac{2 \pi y}{\lambda_y}\right) \right]$$
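A short sketch of generating such a flux field on the model grid (illustrative only; Q_0, the wavelengths, and the grid are example values, not the study's actual setup):

```python
import numpy as np

# Illustrative sinusoidally modulated surface heat flux (example values;
# not the actual setup of the study).
nx, ny, dx = 128, 128, 50.0            # grid points and grid spacing [m]
Q0 = 0.1                               # mean kinematic heat flux [K m/s]
lam_x = lam_y = nx * dx                # one wavelength per domain extent

x = np.arange(nx) * dx
y = np.arange(ny) * dx
X, Y = np.meshgrid(x, y, indexing="ij")
Q = Q0 * (1.0 + 0.5 * np.cos(2 * np.pi * X / lam_x)
              * 0.5 * np.cos(2 * np.pi * Y / lam_y))

print(Q.mean(), Q.min(), Q.max())      # domain mean stays at Q0
```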
Effects of inhomogeneities (II)

vertical velocity in m s⁻¹ (phase-averaged)

[figure: phase-averaged vertical velocity, updraft areas and downdraft areas]
Effects of inhomogeneities (IV)
S. Raasch and G. Harbusch, 2001: An Analysis of Secondary Circulations and their Effects Caused by Small-Scale Surface Inhomogeneities Using LES. Boundary-Layer Meteorol., 101, 31-59.
Effects of inhomogeneities (V)
Further results:

• Inhomogeneities lead to a TKE increase in the mixed layer
• Secondary circulations may oscillate in time

Future studies within DEKLIM and AFO2000:

• Effects of irregular inhomogeneities and comparison with observations
• Runs with humidity
• Effects of secondary circulations and of an inhomogeneous latent heat flux on, e.g., cloud coverage

M. O. Letzel and S. Raasch, 2002: Large-Eddy Simulation of Thermally Induced Oscillations in the Convective Boundary Layer. Annual J. Hydraulic Eng., JSCE, 46, 67-72.
PALM user groups:

• IMUK, Universität Hannover
• Dept. of Civil Engineering, Tokyo Inst. of Technology
• Dept. of Atmospheric Sciences, Yonsei University, Seoul