- Charles-Edouard Bréhier (CNRS & Université Lyon 1)
- Evelyn Buckwar (Johannes Kepler University Linz)
- Erika Hausenblas (Montanuniversität Leoben)
- Ray Kawai (University of Tokyo)
- Gabriel Lord (Radboud University)
- Mikhail Tretyakov (University of Nottingham)
- Kostas Zygalakis (University of Edinburgh)
This is a One World Seminar.
The seminars take place biweekly on Wednesdays, 13:00-14:00 GMT.
To sign up for this seminar series, please complete this form.
Recordings from this seminar series are available here.
3 March 2021, 16:00-17:00 GMT *Please note the later start time
Andrew Stuart (Caltech) Inverse Problems Without Adjoints
I will describe the use of ensemble-based particle methods to solve Bayesian inverse problems, including ensemble Kalman methods, methods based on multiscale SDEs, and conjunctions of the two approaches.
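The adjoint-free character of ensemble Kalman methods can be illustrated on a toy linear problem; the sketch below is a minimal, generic ensemble Kalman inversion loop (all dimensions, names, and parameter values are illustrative, not the speaker's implementation), and it uses only forward evaluations of the map G, never its adjoint.

```python
# Minimal sketch of ensemble Kalman inversion (EKI) for a toy linear
# inverse problem y = G u + noise. Illustrative only; all values assumed.
import numpy as np

rng = np.random.default_rng(0)

d, J = 2, 50                        # parameter dimension, ensemble size
G = np.array([[1.0, 0.5], [0.0, 1.0]])
u_true = np.array([1.0, -1.0])
Gamma = 0.01 * np.eye(2)            # observation noise covariance
y = G @ u_true + rng.multivariate_normal(np.zeros(2), Gamma)

ensemble = rng.normal(size=(J, d))  # initial ensemble drawn from the prior

for _ in range(20):
    Gu = ensemble @ G.T                        # forward map on each member
    u_bar, Gu_bar = ensemble.mean(0), Gu.mean(0)
    # Empirical cross-covariances: adjoint-free, only forward evaluations
    C_ug = (ensemble - u_bar).T @ (Gu - Gu_bar) / J
    C_gg = (Gu - Gu_bar).T @ (Gu - Gu_bar) / J
    K = C_ug @ np.linalg.inv(C_gg + Gamma)     # Kalman gain
    # Perturbed-observation update of every ensemble member
    y_pert = y + rng.multivariate_normal(np.zeros(2), Gamma, size=J)
    ensemble = ensemble + (y_pert - Gu) @ K.T

print(ensemble.mean(0))  # ensemble mean moves toward u_true
```

For a nonlinear forward map the same loop applies with `ensemble @ G.T` replaced by evaluating the forward model on each member, which is what makes the method derivative- and adjoint-free.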
17 March 2021
Konstantinos Dareiotis (University of Leeds) Approximation of stochastic equations with irregular drifts
In this talk we will discuss the rate of convergence of the Euler scheme for stochastic differential equations with irregular drifts. Our approach relies on regularisation-by-noise techniques and, more specifically, on the recently developed stochastic sewing lemma. The advantages of this approach are numerous and include the derivation of improved (optimal) rates and the treatment of non-Markovian settings. We will consider drifts in Hölder and Sobolev classes, as well as drifts that are merely bounded and measurable. The latter is the first, and at the same time optimal, quantification of a convergence theorem of Gyöngy and Krylov. This talk is based on joint works with Oleg Butkovsky, Khoa Lê, and Máté Gerencsér.
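For readers unfamiliar with the scheme under discussion, the Euler (Euler-Maruyama) method is simple to state; the sketch below applies it to an SDE with a merely bounded, discontinuous drift of the kind the abstract mentions (the drift, horizon, and step counts are illustrative choices, not taken from the talk).

```python
# Euler-Maruyama for dX_t = b(X_t) dt + dW_t with a bounded,
# discontinuous drift b (illustrative example).
import numpy as np

rng = np.random.default_rng(42)

def b(x):
    # Bounded, measurable drift with a discontinuity at 0
    return np.where(x >= 0.0, -1.0, 1.0)

def euler_maruyama(x0, T, n_steps, n_paths):
    h = T / n_steps
    x = np.full(n_paths, x0, dtype=float)
    for _ in range(n_steps):
        dW = rng.normal(scale=np.sqrt(h), size=n_paths)
        x = x + b(x) * h + dW          # one Euler step
    return x

# Compare a coarse and a fine discretisation of E[X_T] by Monte Carlo
coarse = euler_maruyama(0.5, T=1.0, n_steps=10, n_paths=50_000).mean()
fine = euler_maruyama(0.5, T=1.0, n_steps=1000, n_paths=50_000).mean()
print(abs(coarse - fine))  # discrepancy shrinks as the coarse step is refined
```

Quantifying how fast that discrepancy shrinks in the step size, for drifts this irregular, is exactly the question of the talk.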
31 March 2021
Monika Eisenmann (Lund University) Sub-linear convergence of stochastic optimization methods in Hilbert space
In order to solve a minimization problem, a possible approach is to find the steady state of the corresponding gradient flow initial value problem through a long time integration. The well-known stochastic gradient descent (SGD) method then corresponds to the forward Euler scheme with a stochastic approximation of the gradient. Our goal is to find more suitable schemes that work well in the stochastic setting.
In the talk, we first present a stochastic version of the proximal point algorithm. This method corresponds to the backward Euler method with a stochastic approximation of the gradient. While it is an implicit method, it has better stability properties than the SGD method, and its advantages are most apparent when the implicit equation can be solved within an acceptable time frame. Secondly, we present a stochastic version of the tamed Euler scheme in this context. This method is fully explicit but remains stable for larger step sizes. We provide convergence results with a sub-linear rate, also in an infinite-dimensional setting, and illustrate the theoretical results with numerical examples.
A typical application for such optimization problems is supervised learning.
The talk is based on a joint work with Tony Stillfjord and Måns Williamson (both Lund University).
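The three schemes in this abstract can be sketched side by side on a one-dimensional quadratic objective, chosen here only because the implicit (backward Euler / proximal) step then has a closed form; the objective, noise model, and step-size rule below are illustrative assumptions, not the speakers' setting.

```python
# Illustrative comparison of SGD (forward Euler), the stochastic proximal
# point method (backward Euler), and a tamed explicit step, on
# f(x) = a x^2 / 2 with noisy gradient evaluations. All values assumed.
import numpy as np

rng = np.random.default_rng(1)
a = 4.0                                     # curvature of f(x) = a x^2 / 2

def noisy_grad(x):
    return a * x + rng.normal(scale=0.1)    # stochastic gradient estimate

def run(update, x0=5.0, steps=200):
    x = x0
    for k in range(1, steps + 1):
        x = update(x, noisy_grad(x), 1.0 / k)   # decreasing steps h_k = 1/k
    return x

# SGD: forward Euler step on the gradient flow
sgd = run(lambda x, g, h: x - h * g)
# Stochastic proximal point: for this quadratic, the implicit equation
# x_new = x - h * (a * x_new + noise) is solved exactly below
# (the noise is recovered from the forward evaluation as g - a * x)
proximal = run(lambda x, g, h: (x - h * (g - a * x)) / (1.0 + h * a))
# Tamed explicit step: the taming factor damps large gradients
tamed = run(lambda x, g, h: x - h * g / (1.0 + h * abs(g)))

print(sgd, proximal, tamed)   # all three approach the minimiser x* = 0
```

The proximal update contracts by 1/(1 + h a) for every step size, while plain SGD only contracts once h a < 2; the tamed step is explicit yet caps the move per step, which is the stability trade-off the abstract describes.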
Svetlana Dubinkina, Vrije Universiteit Amsterdam - Shadowing approach to data assimilation
Denis Talay, Inria and Ecole Polytechnique - Probability distributions of first hitting times of solutions to SDEs w.r.t. the Hurst parameter of the driving fractional Brownian noise: A sensitivity analysis
Evelyn Buckwar, Johannes Kepler University - A couple of ideas on splitting methods for SDEs
Andreas Prohl, University of Tübingen - Numerical methods for stochastic Navier-Stokes equations
Mireille Bossy, INRIA - SDEs with boundaries, modelling particle dynamics in turbulent flow
Raphael Kruse, Martin Luther University Halle-Wittenberg - On the BDF2-Maruyama method for stochastic evolution equations
Adrien Laurent, University of Geneva - Order conditions for sampling the invariant measure of ergodic stochastic differential equations in R^d and on manifolds
Chuchu Chen, Chinese Academy of Sciences - Probabilistic superiority of stochastic symplectic methods via large deviations principle
- This seminar was NOT recorded
Kostas Zygalakis, University of Edinburgh - Explicit stabilised Runge-Kutta methods and their application to Bayesian inverse problems
Xuerong Mao, University of Strathclyde - The truncated Euler-Maruyama method for stochastic differential delay equations
Charles-Edouard Bréhier, Université Claude Bernard Lyon 1 - Analysis of splitting schemes for the stochastic Allen-Cahn equation
Conall Kelly, University College Cork - A hybrid, adaptive numerical method for the Cox-Ingersoll-Ross model
Abdul Lateef Haji-Ali, Heriot Watt University - Sub-sampling and other considerations for efficient risk estimation in large portfolios
David Cohen, Umeå University - Drift-preserving schemes for stochastic Hamiltonian and Poisson systems
Gabriel Lord, Radboud University - Numerics and an SDE model for the stochastically forced vorticity equation
Marco Iglesias, University of Nottingham - Ensemble Kalman Inversion: from subsurface environments to composite materials
Ray Kawai, University of Tokyo - Stochastic approximation in adaptive Monte Carlo variance reduction
- This seminar was NOT recorded
Kody Law, University of Manchester - Bayesian static parameter estimation using multilevel and multi-index Monte Carlo
Akash Sharma & Michael Tretyakov, University of Nottingham - Computing ergodic limits of reflected diffusions and sampling from distributions with compact support
Georg Gottwald, The University of Sydney - Simulation of non-Lipschitz stochastic differential equations driven by α-stable noise: a method based on deterministic homogenisation
Marta Sanz-Solé, University of Barcelona - Global existence for stochastic waves with super-linear coefficients
Sonja Cox, University of Amsterdam - Efficient simulation of generalized Whittle-Matérn fields
Zoom is the online platform used to deliver this seminar series. For any questions relating to the seminars, please email Diane Horberry.
This seminar series is supported as part of the ICMS Online Mathematical Sciences Seminars.