Numerical Simulations in Astrophysics: The COAST Project — Daniel Pomarède, CEA/DAPNIA/SEDI/LILAS



Page 1: Numerical Simulations in Astrophysics The COAST Project Daniel Pomarède CEA/DAPNIA/SEDI/LILAS

Numerical Simulations in Astrophysics: The COAST Project

Daniel Pomarède, CEA/DAPNIA/SEDI/LILAS

Page 2:

The COAST Project

The COAST Project at DAPNIA

•The COAST (“Computational Astrophysics”) project is dedicated to the study of structure formation in the Universe:

– large-scale cosmological structures and galaxy formation

– turbulence in molecular clouds and star formation

– stellar MHD

– protoplanetary systems

•The project relies on numerical simulations performed on high-performance, massively parallel mainframes, and on software tools supporting the development, optimization, and validation of the simulation codes, as well as the processing and exploitation of their results:

– visualization

– numerical algorithms

– databases

– code management

Page 3:


The COAST Project at DAPNIA

•This transverse DAPNIA project involves:

– the SAp (“Service d’Astrophysique”, Astrophysics Division):
• 7 FTE permanent positions
• 6 PhDs and 2 postdocs

– the SEDI (“Service d’Electronique, des Détecteurs et de l’Informatique”, Electronics, Detectors and Computing Division):
• 3 FTE software engineers from the LILAS Laboratory
• Master students

Page 4:


The COAST Project members

•Astrophysics:
– E. Audit: ISM, star formation
– F. Bournaud: galactic dynamics
– S. Brun: solar modeling
– S. Charnoz: planet formation
– S. Fromang: MHD turbulence
– F. Masset: planetary migration
– R. Teyssier: cosmology, galaxy formation

•Software development:
– V. Gautard: numerical algorithms
– J.P. Le Fèvre: databases
– D. Pomarède: visualization
– B. Thooris: data formats, code management

Page 5:


The COAST Project numerical simulation codes

•Four simulation codes are developed at DAPNIA:
– RAMSES, a hybrid N-body and hydrodynamical AMR code, simulates the dark matter and the baryon gas
– HERACLES, a radiation hydrodynamics code used to study turbulence in interstellar molecular clouds
– ASH (in collaboration with the University of Colorado), dedicated to the study of stellar MHD
– JUPITER, a multi-resolution code used in the study of protoplanetary disk formation

•These codes are written in Fortran 90 or C and parallelized with MPI.

•They rely on numerical algorithms: equation solvers (Godunov, Riemann), adaptive mesh refinement techniques, CPU load balancing (Peano-Hilbert space-filling curves), …

Page 6:


The COAST Project Computing Resources

•Local DAPNIA resources, used for development and post-processing:

– funded by DAPNIA (120 k€):
• DAPHPC: a 96-core 2.6 GHz Opteron cluster (24 nodes with 8 GB of memory each and an InfiniBand interconnect)
• 2/3 of the computing time is allocated to COAST

– funded by universities:
• SAPPEM: an 8-processor Xeon platform with 32 GB of memory

– funded by the ANR (Agence Nationale de la Recherche):
• 4 visualization stations with 16 to 32 GB of RAM, ~1 TB of disk, 4 processors, graphics cards with 1 GB of memory, and 30-inch screens

•CEA resources at the CCRT (CEA National Supercomputing Center) for massive simulations: 4 million CPU hours in 2007

– Platine, ranked 12th in the TOP500 world supercomputer list (June 2007): 7456 Itanium cores, 23 TB of total memory, 47.7 Teraflops

– Tantale: an HP/Linux AMD Opteron cluster with 552 cores

Page 7:


The COAST Project Computing Resources

•Other resources for massive simulations: 2 million CPU hours for 2007
– DEISA Extreme Computing Initiative
– MareNostrum at the Barcelona Supercomputing Center, ranked 9th in the TOP500 world supercomputer list (June 2007): 10240 IBM PowerPC 2.3 GHz cores, 94.2 Teraflops, 20 TB of main memory

Page 8:


The COAST Project software development pool

•Data handling
– migration of the HERACLES code to the HDF5 Hierarchical Data Format developed at NCSA (National Center for Supercomputing Applications, USA)
– massively parallel I/O schemes

•Numerical algorithms
– development of a multiple-grid scheme for the HERACLES code
– radiation transfer / photo-ionization scheme
– MHD / Godunov schemes
– multiple-grid and AMR Poisson solver
– load-balancing schemes

•Database development:
– the HORIZON Virtual Observatory, a relational database storing the results of the “galaxy formation” simulations:
• halos, sub-halos, galaxy catalogs
• merger trees
– ODALISC (Opacity Database for Astrophysics, Lasers experiments and Inertial fusion SCience), a database of opacities and equations of state for the astrophysics and plasma/laser interaction communities

Page 9:
Page 10:


The COAST Project software development pool

•Visualization: development of SDvision, the Saclay/DAPNIA Visualization Interface
– IDL Object Graphics framework
– interactive 3D navigation and analysis
– visualization of RAMSES, HERACLES, JUPITER, and ASH data
– visualization of complex scenes with scalar fields (volume projection, 3D isosurfaces, slices), vector fields (streamlines), and particle clouds

Page 11:
Page 12:

•Interactive visualization of huge datasets on desktops

•Example: MareNostrum output #97: 100 GB of data, 2048 processors
– interactive selection of a subvolume (10% in each direction)
– data extraction through the Peano-Hilbert space-filling curve
– projection of the AMR up to level 13 onto an 800^3 Cartesian grid (4 GB of memory), suitable for interactive navigation

Page 13:

•HERACLES 1200x1200x1200 simulation of turbulence in the interstellar medium, run on 256 processors (box size ~20 pc)

•Maximum intensity projection of the density field

Page 14:
Page 15:


Navigation in the RAMSES AMR: synchronous spatial and resolution zooms

Page 16:


Visualization of temporal evolution: galaxy mergers

Page 17:


Highlights of recent COAST milestones

•The HORIZON Grand Challenge simulation at CEA/CCRT on Platine: the largest N-body cosmological simulation ever, performed with RAMSES: 6144 cores and 18 TB of RAM used for 2 months to simulate 70 billion particles, used to simulate future weak-lensing surveys like DUNE or LSST

•The HORIZON “galaxy formation” simulation at MareNostrum: 1024^3 dark matter particles, 4 billion AMR cells, box size 50 Mpc/h, spatial resolution 2 kpc; 2048 processors for computing plus 64 processors dedicated to I/O; 3 weeks of computation so far, down to z = 1.9; 20 TB of data generated and stored; from large-scale filaments to galactic discs

Page 18:


Highlights of a few recent publications

•About 40 refereed publications over 2006-2007. A few examples:

•In Astronomy & Astrophysics:
– “On the role of meridional flows in flux transport dynamo models”, L. Jouve and A.S. Brun, A&A 474 (2007) 239
– “On the structure of the turbulent interstellar atomic hydrogen”, P. Hennebelle and E. Audit, A&A 465 (2007) 431
– “Simulating planet migration in globally evolving disks”, A. Crida, A. Morbidelli, and F. Masset, A&A 461 (2007) 1173
– “A high order Godunov scheme with constrained transport and adaptive mesh refinement for astrophysical magnetohydrodynamics”, S. Fromang, P. Hennebelle, and R. Teyssier, A&A 457 (2006) 371

•In The Astrophysical Journal:
– “Simulations of turbulent convection in rotating young solarlike stars: differential rotation and meridional circulation”, J. Ballot, A.S. Brun, and S. Turck-Chièze, ApJ 669 (2007) 1190
– “On the migration of protogiant solid cores”, F. Masset, G. D’Angelo, and W. Kley, ApJ 652 (2006) 730
– “Disk surface density transitions as protoplanet traps”, F. Masset, A. Morbidelli, and A. Crida, ApJ 642 (2006) 478

•In Journal of Computational Physics:
– “Kinematic dynamos using constrained transport with high order Godunov schemes and adaptive mesh refinement”, R. Teyssier, S. Fromang, and E. Dormy, J. Comp. Phys. 218 (2006) 44

Page 19:


Publications in conferences

•Organisation of ASTRONUM-2007 “Numerical Modeling of Space Plasma Flows”, Paris, June 11-15, 2007, 80 participants
– 5 presentations by COAST members

•Supercomputing conferences:
– SC06 (Tampa), ISC07 (Dresden), SC07 (Reno)

•Visualization conferences:
– CGIV07 Computer Graphics, Imaging and Visualization, Bangkok, August 2007, IEEE Computer Society
– International Workshop on Visualization of High-resolution 3D Turbulent Flows, École Normale Supérieure, Paris, June 2007

•Computational physics:
– CCP2006 (Gyeongju, South Korea), CCP2007 (Brussels)
– ASTRONUM-2006 (1st edition, Palm Springs)

•Modeling and simulation:
– MSO2006, Botswana, September 2006
– EUROSIM2007, Ljubljana, September 2007

•Software:
– ADASS XVI (Tucson, 2006)

•Astrophysics:
– “Protostars and Planets V”, IAU Symposia, …

Page 20:


External funding for the COAST Project

•Successful applications to the ANR (Agence Nationale de la Recherche, the French National Research Agency):

– HORIZON
• objective: federate numerical simulation activities within a program focused on galaxy and large-scale structure formation
• budget = 500 k€
• DAPNIA leadership

– SYNERGHY
• a cross-disciplinary project focusing on simulations in astrophysics, hot dense matter, and inertial confinement fusion
• budget = 600 k€
• DAPNIA leadership

– MAGNET
• development of MHD numerical codes, and study of the generation and structure of magnetic fields in astrophysics
• budget = 400 k€

Page 21:


Perspectives for the COAST Project

•Computational astrophysics has a bright future, relying on the ever-increasing performance of massively parallel mainframes

•Recipe for success: synergy between astrophysicists, software developers, local computing resources, and access to supercomputers

•Many similar projects and initiatives are competing; a few examples:
– FLASH Center at the University of Chicago, organized in 6 groups with 41 members (Year 9 activities report, 2006): code (6), computational physics and validation (3), astrophysics (15), computer science (7), visualization (3), basic science (7)
– ASTROSIM European Network for Computational Astrophysics: 12 member organizations
– Applied Numerical Algorithms Group at Lawrence Berkeley, home of the Chombo Adaptive Mesh Refinement library and ChomboVis
– Laboratory for Computational Science and Engineering, University of Minnesota
– VIRGO Consortium for Cosmological Supercomputer Simulations: 20-25 scientists, heavy hardware resources at Durham (792 Opteron CPUs + 500 UltraSPARC processors) and Garching (816 POWER4 processors)

•To keep pace in this competition, the COAST Project needs:
– adequate local computing resources for development and post-processing: typically 32 processors per permanent scientist => 256 processors (versus 64 currently)
– additional strength in computer science (cluster management), data handling & visualization, and computational physics and validation

Page 22:

Backup slides

Page 23:

RAMSES: parallel graded octree AMR, with MHD

The code is freely available

Page 24:

Domain decomposition using space-filling curves

Fully Threaded Tree (Khokhlov 1998)

Cartesian mesh refined on a cell-by-cell basis

Octs: small grids of 8 cells, each pointing towards:
• 1 parent cell
• 6 neighboring parent cells
• 8 children octs

Coarse-fine boundaries: a 2-cell-thick buffer zone

Time integration using recursive sub-cycling

Parallel computing using the MPI library, with a domain decomposition based on the Peano-Hilbert curve

Algorithm inspired by TREE codes: locally essential tree

Tested and operational up to 6144 cores

Scaling depends on problem size and complexity

Page 25:

The AMR “octree” data structure of the RAMSES code

[Figure: the octree refinement shown at levels 2, 3, 5, 9, 11, and 14]

Basic element of the AMR structure: a group of 2^dim sibling cells called an “oct”

Page 26:

The RAMSES AMR

Levels 9 to 14: 4.1×10^7 cells

A formal resolution of 2^13 = 8192 cells in each direction is reached, amounting to a total of 8192^3 ≈ 5.5×10^11 cells

Thanks to this dynamic range, physical processes at very different scales are treated: from large-scale gravitational interaction to star formation in galaxies