A stochastic adventure for the ocean
Master research internship proposal
Scientific advisors: Arthur Vidard ([email protected]), Elise Arnaud ([email protected]), Etienne Memin ([email protected])
Websites: https://team.inria.fr/airsea/en/ and http://www.irisa.fr/fluminance/indexFluminance.html
Functional area: Laboratoire Jean Kuntzmann, Inria Project Team AIRSEA, IMAG Building, 700 avenue centrale, 38401 Saint Martin d'Heres
Keywords: data assimilation, particle filters, stochastic parameterisation, PDEs.
Context. One of the many achievements of computer-aided simulation has been the sharp improvement in human prediction capabilities for weather and climate, with undeniable benefits for society: agricultural planning, extreme weather preparedness, mitigation of climate change effects, etc. The quality of the simulations relies on the ability to acquire data about the present and past states of the system and to assimilate these data into the numerical model in order to make reliable predictions. This is referred to as data assimilation.
[Figure: the NATL60 ocean model. Horizontal discretisation: 5454 × 3474 × 300 ≈ 5.7 × 10^9 grid points. On 13,000 processors, one month of simulation takes one day; full storage of the output is impossible.]
Goals. Data assimilation can be seen as the art of compromise. It refers to a large variety of methods that combine all the sources of information available about a given system: mathematical equations (physical laws), observations (measurements of reality) and error statistics.
Representing physical model uncertainties properly is a tricky problem, and new approaches using stochastic parameterisation have recently been derived [1]. They lead to non-Gaussian statistics, which rule out the use of classical ensemble data assimilation methods, as these rely heavily on the Gaussian hypothesis.
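As a toy illustration of why stochastic parameterisation breaks the Gaussian hypothesis (this is a deliberately simplified sketch, not the transport-noise model of [1]): propagating an ensemble through a model with multiplicative noise produces a skewed, clearly non-Gaussian distribution, even when each noise increment is Gaussian.

```python
import numpy as np

rng = np.random.default_rng(1)

# Each ensemble member follows dX = sigma * X dW (multiplicative noise),
# integrated with an Euler-Maruyama scheme. The resulting distribution is
# approximately lognormal, hence skewed.
n_members, n_steps, dt, sigma = 10_000, 100, 0.01, 1.0
x = np.ones(n_members)
for _ in range(n_steps):
    x *= 1.0 + sigma * np.sqrt(dt) * rng.normal(size=n_members)

# Sample skewness: zero for a Gaussian ensemble, strongly positive here.
m, s = x.mean(), x.std()
skew = np.mean(((x - m) / s) ** 3)
print(skew)
```

A Gaussian-based ensemble method would summarise this ensemble by its mean and covariance only, discarding exactly the asymmetry that the stochastic parameterisation introduces.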
Particle filters are theoretically validated Monte-Carlo techniques that do not require Gaussian assumptions on the model. In that framework, the data assimilation problem is written as a large-scale Bayesian inverse problem: it requires estimating the probability distribution of the state variables, at a given time, conditioned on all previously observed measurement data. Particle filter algorithms are receiving growing attention from the data assimilation community [2]. They consist in sampling the state space in places where the data could appear. However, they perform poorly when the state space dimension is large [2,3].
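The propagate / weight / resample cycle described above can be sketched as follows. This is a minimal bootstrap particle filter on an assumed toy problem (a noisy random walk observed directly); the function and variable names are illustrative, not taken from any particular library.

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_particle_filter(x0, model, obs, obs_op, obs_std, n_steps):
    """Minimal bootstrap particle filter: propagate, weight, resample."""
    particles = x0.copy()              # shape (n_particles, state_dim)
    n = particles.shape[0]
    estimates = []
    for k in range(n_steps):
        # Propagate each particle through the (stochastic) model
        particles = model(particles)
        # Weight particles by the likelihood of observation y_k
        innov = obs[k] - obs_op(particles)
        logw = -0.5 * np.sum((innov / obs_std) ** 2, axis=1)
        w = np.exp(logw - logw.max())
        w /= w.sum()
        # Resample (multinomial) to fight weight degeneracy
        idx = rng.choice(n, size=n, p=w)
        particles = particles[idx]
        estimates.append(particles.mean(axis=0))
    return np.array(estimates)

# Toy twin experiment: a 2D random walk observed with Gaussian noise
n_particles, dim, n_steps = 500, 2, 20
truth, obs = np.zeros(dim), []
for _ in range(n_steps):
    truth = truth + rng.normal(0, 0.1, dim)
    obs.append(truth + rng.normal(0, 0.2, dim))
obs = np.array(obs)

model = lambda x: x + rng.normal(0, 0.1, x.shape)   # stochastic propagation
est = bootstrap_particle_filter(
    rng.normal(0, 1, (n_particles, dim)), model, obs, lambda x: x, 0.2, n_steps)
print(est.shape)
```

The weight step is where the dimension issue bites: in high dimensions almost all particles receive negligible likelihood, so nearly all the weight concentrates on one particle, which is the degeneracy problem mentioned above.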
The goal of this internship is to combine a stochastic parameterisation of model uncertainty with a particle filter. Evidence suggests that a properly defined error representation allows a better exploration of the state space, and is therefore a possible way to address the dimension issue. After a literature study, the intern will test the feasibility of the idea on a test case that implements a simplified 2D ocean model at coarse resolution in a double-gyre setting. This configuration is meant to represent an idealised North Atlantic circulation (Gulf Stream).
This internship can lead to a PhD, in which an application to realistic ocean models is foreseen. This work is part of an ongoing collaboration with Imperial College (London, UK) and Ifremer.
Prerequisites. Applied mathematics skills (optimisation, numerical analysis, probability / statistics) and programming skills (Matlab, Python, C or Fortran).
Bibliography
[1] V. Resseguier, E. Memin, B. Chapron (2017): Geophysical flows under location uncertainty, Part I: Random transport and general models. Geophysical and Astrophysical Fluid Dynamics, 111(3):149-176.
[2] P.J. van Leeuwen: Particle filters for the Geosciences. In Data Assimilation for the Geosciences, Oxford University Press Blue Book Series, 2014.
[3] E. Arnaud: Lecture notes in inverse methods and data assimilation. https://team.inria.fr/airsea/files/2018/11/Poly_InvMet-M2.pdf