Stochastic analysis is a branch of mathematics dealing with the analysis of dynamical systems evolving under the influence of randomness. Over the last 30 years, it has developed into one of the most active areas of mathematics worldwide, and UK-based mathematicians have pioneered its recent developments. A wide range of powerful and novel stochastic tools has been developed in recent years, with impressively diverse applications within and beyond mathematics, including physics, chemistry, biology, computer science and the social sciences.

Perhaps the most influential contribution in this area has been Lyons’ theory of rough paths, a mathematical language for describing the effects a stream can generate when interacting with non-linear dynamical systems. It has had a fundamental impact on extending Itô’s theory of SDEs beyond semimartingales and on the development of Hairer’s Fields Medal-winning work on regularity structures, which provides a robust solution theory for many singular stochastic PDEs arising from physics. In recent years, rough path theory has started to play a key role in the design of state-of-the-art machine learning algorithms for processing noisy, high-dimensional data streams in a wide range of contexts, including finance, cybersecurity, precision medicine and bioengineering.

During this time, the Stochastic Analysis Group at Imperial College London has grown to be one of the strongest and most active in Europe and internationally. The group proposes to organize a five-day conference to gather experts from different areas of theoretical and applied stochastic analysis to investigate and discuss current and future directions of this field, with special emphasis on the interplay between theory and applications. A special session is planned with presentations from industrial and academic practitioners. This conference will mark the 70th birthday of Prof. Terry Lyons.

Over the course of the conference, participants will have the opportunity to present their latest research, exchange ideas and collaborate with peers. Through plenary talks, panel discussions, and interactive sessions, the conference will provide a platform for participants to learn from one another and gain new insights into the latest trends and techniques in stochastic analysis.


Organisers

The conference is jointly organised by Thomas Cass (Imperial College London), Ilya Chevyrev (University of Edinburgh), Dan Crisan (Imperial College London), James Foster (University of Bath), Christian Litterer (University of York), Hao Ni (University College London) and Cristopher Salvi (Imperial College London).

Funders

London Mathematical Society, DataSig (EPSRC Project Grant EP/S026347/1), Making Cubature on Wiener Space Work (EPSRC Project Grant EP/V005413/1), Imperial College London (Cecilia Tanner Fund).

Invited speakers: titles and abstracts

 

Peter Friz (TU Berlin)

Title: Rough analysis of rough volatility models

Abstract: The question of what rough paths have to do with finance has received many answers in recent years; this talk is devoted to some of these. I will start by reporting on recent work that identifies the weak rate for a class of rough (Bergomi-type) volatility models. In the second part, I will present a rough-PDE-based extension of the Romano–Touzi formula, describing the law of an asset under partial conditioning, applicable in particular to local rough stochastic volatility models. Finally, I will present an extension of Gatheral’s diamond calculus, motivated by rough Heston in forward variance form, to expected signatures, offering systematic computations in general semimartingale models. Credit to numerous coworkers will be given in the talk.

René Carmona (Princeton University)

Title: Control of Conditional Processes

Sandy Davie (University of Edinburgh)

Title: Central limit bounds in Vaserstein metrics

Abstract: The Vaserstein metrics from optimal transport theory seem to be natural measures of distance between probability measures, and several authors have used them to give bounds for normal approximations as in the Central Limit Theorem. In particular Rio in 2011 gave quite sharp estimates for such bounds. I will describe how (under some moment conditions) asymptotic expansions can be obtained, in negative powers of the sample size, for  Vaserstein distances between the distribution of a sample mean and the corresponding normal distribution. These expansions throw some light on  a question raised by Villani.
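The expansions in the talk concern exact asymptotics, but the underlying phenomenon is easy to observe numerically. The sketch below is an illustrative experiment, not code from the talk; the choice of Exp(1) summands, the sample sizes and the quantile-coupling estimator are all assumptions. It estimates the 1-Wasserstein distance between the law of a standardized sample mean and the standard normal, which (under moment conditions) Rio-type bounds predict should shrink like n^{-1/2}:

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(0)
Phi_inv = NormalDist().inv_cdf

def w1_to_normal(n, reps=4000):
    """Monte Carlo estimate of the 1-Wasserstein distance between the law of
    the standardized mean of n Exp(1) variables and the standard normal,
    using the quantile-coupling formula W1 = E|F^{-1}(U) - G^{-1}(U)|."""
    means = rng.exponential(size=(reps, n)).mean(axis=1)
    z = np.sort((means - 1.0) * np.sqrt(n))            # standardized, sorted
    q = np.array([Phi_inv((i + 0.5) / reps) for i in range(reps)])
    return float(np.abs(z - q).mean())

for n in (10, 100, 1000):
    print(n, w1_to_normal(n))   # the distance shrinks roughly like n**-0.5
```

The estimate eventually saturates at the Monte Carlo resolution set by `reps`, so only the first factor-of-ten decays are visible at this sample size.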

Felix Otto (Max Planck Institute, Leipzig)

Title: Regularity structures: more geometry and less combinatorics

Abstract: Singular stochastic partial differential equations are those stochastic PDE in which the noise is so rough that the nonlinearity requires a renormalization. The guiding principle of renormalization is to preserve as many symmetries of the solution manifold as possible. We follow Hairer’s regularity structures, which however we re-interpret as providing an informal, and eventually rigorous, parameterization of the infinite-dimensional nonlinear solution manifold. We systematically follow this more geometric/analytic than combinatorial point-of-view: Instead of appealing to an expansion indexed by trees, we consider all partial derivatives w.r.t. the “constitutive” function defining the nonlinearity. Instead of a Gaussian calculus guided by Feynman diagrams arising from pairing nodes of two trees, we consider derivatives w.r.t. the noise, i.e. Malliavin derivatives. We interpret the Malliavin derivative of the parameterization as an approximate tangent vector to the solution manifold, which yields a sparse representation in terms of the parameterization itself, and paves the way for its stochastic estimate. Ultimately, this gives a characterization of the solution manifold that is oblivious to the divergent counter terms. This is work with L. Broux, R. Steele, P. Linares, M. Tempelmayr, and P. Tsatsoulis, based on work with J. Sauer, S. Smith, and H. Weber.

Martin Hairer (EPFL & Imperial College London)

Title: What happens at H = 1/4?

Abstract: It has been known since the work of Coutin & Qian that fractional Brownian motion has a canonical rough path lift for H > 1/4, while only non-canonical lifts are known when this condition fails. We show that the suitably rescaled lift of a mollified fractional Brownian motion converges to a limiting “pure area” rough path with entries given by standard Brownian motions that are asymptotically independent of the underlying fractional Brownian motion.

Michael Röckner (University of Bielefeld)

Title: Non-linear Fokker-Planck-Kolmogorov equations as gradient flows on the space of probability measures

Abstract: We propose a general method to identify nonlinear Fokker–Planck–Kolmogorov equations (FPK equations) as gradient flows on the space of Borel probability measures on Rd with a natural differential geometry. Our notion of gradient flow does not depend on any underlying metric structure such as the Wasserstein distance, but is derived from purely differential geometric principles. Moreover, we explicitly identify the associated energy functions and show that these are Lyapunov functions for the FPK solutions. Our main result covers classical and generalized porous media equations, where the latter have a generalized diffusivity function and a nonlinear transport-type first-order perturbation.

Yves Le Jan (Université Paris-Saclay)

Title: Moran model with selection

Abstract: We consider a biparental model in which the population is initially divided in two groups of different fitness. We determine the time evolution of the contribution of each initial group to the genome of the total population.

David Nualart (University of Kansas)

Title: Gaussian fluctuations for spatial averages of the stochastic heat equation

Abstract: The purpose of this talk is to survey several results on quantitative central limit theorems for spatial averages of the solution to the stochastic heat equation driven by a Gaussian noise with spatially homogeneous covariance. The estimates are based on a combination of Stein’s method for normal approximations and the techniques of Malliavin calculus. We will also discuss the case of the parabolic Anderson model driven by a Gaussian noise coloured in time, where the corresponding mild equation is formulated using the Skorohod integral.

Ofer Zeitouni (Weizmann Institute of Science)

Title: Optimal rigidity and maximum of the characteristic polynomial of Wigner matrices

Abstract: I will describe the determination to leading order of the maximum of the characteristic polynomial for Wigner matrices and $\beta$-ensembles. In the special case of Gaussian-divisible Wigner matrices, the method provides universality of the maximum up to tightness. These are the first universal results on the Fyodorov–Hiary–Keating conjectures for these models, and in particular they answer the question of optimal rigidity for the spectrum of Wigner matrices. The proofs combine dynamical techniques for universality of eigenvalue statistics with ideas surrounding the maxima of log-correlated fields and Gaussian multiplicative chaos. Joint work with Paul Bourgade and Patrick Lopatto.

Gerard Ben Arous (Courant Institute, NY University)

Title: Dynamical spectral transition for optimization in very high dimensions

Abstract: The dynamics of optimization needed for machine learning tasks take place in a very high-dimensional setting, but it seems that the most important aspects really happen in a much smaller dimension. In recent work with Reza Gheissari (Northwestern) and Aukosh Jagannath (Waterloo), we gave a general context for the existence of projected “effective dynamics” of SGD in very high dimensions for “summary statistics” in much smaller dimensions. These effective dynamics (and, in particular, their so-called “critical regime”) define a dynamical system in finite dimensions which may be quite complex, and which governs the performance of the learning algorithm. The next step is to understand how the system finds these “summary statistics”. This is done in our latest work with the same authors and with Jiaoyang Huang (Wharton, U Penn). It is based on a dynamical spectral transition (the BBP transition of random matrix theory): along the trajectory of the optimization path, the Gram matrix or the Hessian matrix develops outliers whose eigenspaces carry these effective dynamics. I will illustrate the use of this point of view on a few central examples from ML: classification for Gaussian mixtures, and the XOR task.
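The BBP transition invoked in the abstract can be seen in a toy computation (an illustrative sketch, not code from the talk; the matrix size, spike direction and spike strengths are assumptions): a rank-one spike θvvᵀ added to a Wigner matrix produces an outlier eigenvalue near θ + 1/θ once θ > 1, while for θ < 1 the top eigenvalue sticks to the bulk edge at 2.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000

# GOE-type Wigner matrix, normalized so the bulk spectrum fills [-2, 2]
A = rng.normal(size=(n, n)) / np.sqrt(n)
W = (A + A.T) / np.sqrt(2)
v = np.ones(n) / np.sqrt(n)          # unit-norm spike direction

def top_eig(theta):
    """Largest eigenvalue of the rank-one spiked matrix W + theta * v v^T."""
    return float(np.linalg.eigvalsh(W + theta * np.outer(v, v))[-1])

print(top_eig(0.5))   # subcritical: top eigenvalue stays near the edge 2
print(top_eig(2.0))   # supercritical: outlier near theta + 1/theta = 2.5
```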

Ben Hambly (University of Oxford)

Title: Particle systems with feedback and networks of neurons

Abstract: Interacting particle systems with feedback have been investigated recently, as they arise in models in finance and neuroscience as well as having connections to Stefan problems. I will discuss the features of these models and focus on the case of the leaky integrate-and-fire model in neuroscience. In this setting we will discuss the scaling limit of the empirical measure of the particle system, showing existence and uniqueness of solutions for an SPDE which describes the evolution of the whole system. We will also discuss the different effects that can result from adjusting the feedback parameter.

Alison Etheridge (University of Oxford)

Title: Forwards and backwards in spatially heterogeneous populations

Abstract: We introduce a broad class of individual based models that might describe how spatially heterogeneous populations live, die, and reproduce. Our primary interest is in understanding how genetic ancestry spreads across geography when looking back through time in these populations. A novelty is that by explicitly splitting reproduction into two phases (production of juveniles and their maturation) we produce a framework that not only captures models which when suitably scaled converge to classical reaction diffusion equations, but also ones with nonlinear diffusion that exhibit quite different behaviour. This is joint work with Tom Kurtz (Madison), Peter Ralph (Oregon) and Ian Letter and Terence Tsui (Oxford).

Christian Bayer (WIAS Berlin)

Title: Signatures for stochastic optimal control

Abstract: Models with memory play an increasingly important role in many applications, from finance to molecular dynamics. In a stochastic setting, memory means that the underlying stochastic process is not a Markov process. Such processes are particularly challenging for stochastic optimal control, as most state-of-the-art methods for solving stochastic optimal control problems rely heavily on the Markov property. Building on earlier works by Terry Lyons, we show that path signatures allow us to efficiently solve several classes of stochastic optimal control problems even when the underlying state process is not a Markov process. We provide theoretical analysis and numerical applications, with special emphasis on optimal stopping and optimal execution problems.

Zhongmin Qian (University of Oxford)

Title: On the duality of conditional laws of diffusions and applications

Abstract: In this talk I will report on the duality of conditional diffusion processes obtained recently for a large class of time-inhomogeneous diffusion processes. By using the duality of conditional laws, new types of Feynman–Kac formulas are established for solutions of some parabolic equations. Some applications to random vortex systems and to compressible flows will be discussed.

Andrew Stuart (Caltech)

Title: Gradient Flows for Sampling: Mean-Field Models, Gaussian Approximations and Affine Invariance

Abstract: Sampling a probability distribution with an unknown normalization constant is a fundamental problem in computational science and engineering. This task may be cast as an optimization problem over all probability measures by choice of a suitable energy function. An initial distribution can then be evolved to the desired minimizer (the target distribution) via a gradient flow with respect to a chosen metric. The choices of energy and metric lead to different approaches, and it is of interest to understand their role; we provide theoretical insights into these choices. Having chosen an energy and a metric, development of an actionable algorithm requires approximation of the gradient flow. Mean-field models, whose laws are governed by the gradient flow in the space of probability measures, may be identified; particle approximations of these mean-field models form the basis of algorithms. The gradient flow approach is also the basis of algorithms for variational inference, in which the optimization is performed over a parameterized family of probability distributions such as Gaussians or Gaussian mixtures; the underlying gradient flow is restricted to the parameterized family. Numerical results are presented to illustrate the resulting methodologies. Joint work with Y. Chen (NYU), D.Z. Huang (PKU), J. Huang (U Penn) and S. Reich (Potsdam).

Josef Teichmann (ETH Zurich)

Title: On the relation of real analytic functions on path spaces and signature expansions.

Abstract: Signature expansions and associated kernel techniques are important tools in approximation theory on path spaces. We show some relations to more classical notions of real analytic functions on path spaces and to some concepts of invariant theory. We then head towards a better understanding of reproducing kernel Hilbert spaces related to signature kernels. (Joint works with Christa Cuchiero, Walter Schachermayer, and Valentin Tissot-Daguette.)

Hans Buehler (XTX Markets)

Title: Learning to Trade

Abstract: We discuss advances in applying reinforcement learning to risk managing a variety of financial problems.

Salvador Ortiz-Latorre (University of Oslo)

Title: Deep learning methods for the stochastic filtering problem

Abstract: The stochastic filtering equations govern the evolution of the conditional distribution of a signal process given partial, and possibly noisy, observations arriving sequentially in time. Their numerical approximation plays a central role in many real-life applications, including numerical weather prediction, finance and engineering.  In this talk we will present an approach based on the combination of the PDE method (also known as the splitting-up method) for solving the stochastic filtering problem and a deep learning method to represent the solution of the PDE involved. Our approach follows a recent stream of research into deep learning based approximations of high dimensional PDEs and related works within the context of stochastic optimal control. Joint work with Dan Crisan (Imperial) and Alexander Lobbe (Imperial).

Harald Oberhauser (University of Oxford)

Title: Random Surfaces and Higher Dimensional Algebra

Abstract: The path signature provides a structured description of an (unparametrized) path by representing it as an element of a non-commutative group; path concatenation turns into group multiplication, and path reversal into group inversion. We study a generalization to surfaces, that is, maps indexed by a two-dimensional domain. It turns out that so-called double groups (aka crossed modules) are rich enough to faithfully capture the compositional structure of (unparametrized) surfaces, and that signature objects arise as solutions of differential equations. This has many consequences; one of them is that, analogously to how the expected path signature can characterize the laws of stochastic processes indexed by “time”, one can characterize the laws of random surfaces in an algebraically structured and principled way, thus providing a natural “characteristic function” for laws of random surfaces. We develop this construction up to a Young regime. Joint work with Darrick Lee.
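The group structure in the first sentence can be made concrete in a few lines. The sketch below is an illustrative computation for the classical path signature (not the surface construction of the talk): truncating at level two, the signature of a concatenation of two paths equals the tensor product of their signatures, which is Chen’s identity.

```python
import numpy as np

def sig_level2(path):
    """Truncated signature (levels 1 and 2) of a piecewise-linear path,
    given as an (n, d) array of points."""
    incs = np.diff(path, axis=0)          # segment increments
    S1 = incs.sum(axis=0)                 # level 1: total increment
    d = path.shape[1]
    S2 = np.zeros((d, d))
    running = np.zeros(d)
    for dx in incs:                       # accumulate segment by segment
        S2 += np.outer(running, dx) + 0.5 * np.outer(dx, dx)
        running += dx
    return S1, S2

# Chen's identity: sig(x * y) = sig(x) "tensor" sig(y) at levels 1 and 2
x = np.array([[0.0, 0.0], [1.0, 0.5], [1.5, 2.0]])
y = np.array([[1.5, 2.0], [2.0, 1.0], [3.0, 3.0]])   # starts where x ends
S1x, S2x = sig_level2(x)
S1y, S2y = sig_level2(y)
S1xy, S2xy = sig_level2(np.vstack([x, y[1:]]))       # concatenated path
assert np.allclose(S1xy, S1x + S1y)
assert np.allclose(S2xy, S2x + S2y + np.outer(S1x, S1y))
```

Concatenation acts additively at level one and picks up the cross term S1x ⊗ S1y at level two, which is exactly the group multiplication the abstract refers to.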

Paola Arrubarrena (Imperial College London)

Title: Anomaly Detection on Radio Astronomy Data using Signatures

Abstract: An anomaly detection methodology is presented that identifies whether a given observation is unusual by measuring its deviation from a corpus of non-contaminated observations. The signature transform is applied to the streamed data as a vectorization, to obtain a faithful representation in a fixed-dimensional feature space. The methodology is applied to radio astronomy data to identify very faint radio frequency interference (RFI) contaminating the rest of the data.

Maud Lemercier (University of Oxford)

Title: A High Order Solver for Signature Kernels

Abstract: Signature kernels are at the core of several machine learning algorithms for analysing multivariate time series. The signature kernel of two bounded-variation paths (such as piecewise-linear interpolations of time series data) is typically computed by solving a Goursat problem for a hyperbolic partial differential equation (PDE) in two independent time variables. However, this approach becomes considerably less practical for highly oscillatory input paths, as they have to be resolved at a fine enough scale to accurately recover their signature kernel, resulting in significant time and memory complexities. To mitigate this issue, we first show that the signature kernel of a broader class of paths, known as smooth rough paths, also satisfies a PDE, albeit in the form of a system of coupled equations. We then use this result to introduce new algorithms for the numerical approximation of signature kernels. As bounded-variation paths (and more generally geometric p-rough paths) can be approximated by piecewise smooth rough paths, one can replace the PDE with rapidly varying coefficients in the original Goursat problem by an explicit system of coupled equations with piecewise constant coefficients derived from the first few iterated integrals of the original input paths. While this approach requires solving more equations, these equations do not require looking back at the complex and fine structure of the initial paths, which significantly reduces the computational complexity associated with the analysis of highly oscillatory time series.
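For orientation, the baseline Goursat approach described at the start of the abstract can be sketched in a few lines. This is an illustrative first-order finite-difference solver for piecewise-linear paths (the standard cross scheme on the discretization grid), not the high-order method of the talk:

```python
import numpy as np

def sig_kernel(x, y):
    """First-order finite-difference solution of the Goursat problem
    d^2 k / ds dt = <dx/ds, dy/dt> k,  k(0, .) = k(., 0) = 1,
    for piecewise-linear paths x of shape (m, d) and y of shape (n, d)."""
    dx = np.diff(x, axis=0)               # segment increments of x
    dy = np.diff(y, axis=0)               # segment increments of y
    inner = dx @ dy.T                     # <dx_i, dy_j> on each grid cell
    m, n = inner.shape
    K = np.ones((m + 1, n + 1))           # boundary conditions k = 1
    for i in range(m):
        for j in range(n):
            K[i + 1, j + 1] = (K[i + 1, j] + K[i, j + 1] - K[i, j]
                               + 0.5 * inner[i, j] * (K[i + 1, j] + K[i, j + 1]))
    return K[-1, -1]

# sanity check: for two 1-d straight lines x_t = t, y_t = t on [0, 1] the
# kernel has the closed form sum_n 1/(n!)^2 ~ 2.2796, recovered on refinement
line = np.linspace(0.0, 1.0, 101).reshape(-1, 1)
print(sig_kernel(line, line))
```

Refining the grid of a highly oscillatory path is exactly the cost that the coupled-system approach of the abstract avoids.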

Adeline Fermanian (Califrais)

Title: Dynamic Survival Analysis with Controlled Latent States

Abstract: We consider the task of learning individual-specific intensities of counting processes from a set of static variables and irregularly sampled time series. We introduce a novel modelling approach in which the intensity is the solution to a controlled differential equation. We first design a neural estimator by building on neural controlled differential equations. Second, we show that our model can be linearized in the signature space under sufficient regularity conditions, yielding a signature-based estimator which we call CoxSig. We provide theoretical learning guarantees for both estimators, before showcasing the performance of our models on a vast array of simulated and real-world datasets from finance, predictive maintenance and food supply chain management.

Horatio Boedihardjo (University of Warwick)

Title: Lack of Gaussian Tail for integrals along fractional Brownian motions

Abstract: A result of Kusuoka and Stroock states that the solutions to elliptic stochastic differential equations, whose vector fields have uniformly bounded derivatives of all orders, have Gaussian tails. For rough differential equations driven by fractional Brownian motions with Hurst parameter H, under the same assumptions on the vector fields, a celebrated result of Cass–Litterer–Lyons established a max(1+2H,1/2)-Weibull upper bound for the tail probability of the solution. Establishing a general lower bound seems difficult, but we will discuss the particular case of integrals along fractional Brownian motions of smooth functions with uniformly bounded and non-degenerate derivatives of all orders. Joint work with Xi Geng.

Syoiti Ninomiya (Tokyo Institute of Technology)

Title: New deep learning machine architecture based on higher-order weak approximation algorithms for SDEs

Abstract: A new deep-learning neural network architecture based on high-order weak approximation algorithms for stochastic differential equations is proposed. The architecture enables deep learning machines to learn martingales efficiently. The behavior of deep neural networks based on this architecture when applied to the financial derivatives pricing problem is also reported. The crux of this new architecture lies in the higher-order weak approximation algorithms for SDEs of Runge-Kutta type, in which the approximation is realised by iterative substitutions and their linear combinations.

Stefano Goria (Thymia)

Title: Signatures for Machine Learning-Driven Mental Health Biomarkers

Abstract: Mental health care lags far behind physical health in many dimensions, with the lack of accessible and accurate biomarkers for its symptoms being one of the causes. In recent years, we have been witnessing an increased interest in mental health biomarkers particularly based on machine learning techniques applied to extremely rich data sources, such as speech or video. We review here our recent results from applying path signatures to speech and video data in order to predict the cognitive test scores used in diagnosing ASD and ADHD, positioning this method more broadly as a powerful tool to support the development of machine learning-driven biomarkers.

Blanka Horvath (University of Oxford)

Title: Pathwise methods for pricing, trading and market generation

Abstract: The emergence of machine learning solutions for pricing, hedging and trading has opened up new horizons for addressing these tasks under a large variety of models and market conditions. These solutions have also provided the first applications where more nuanced properties of markets became necessary than classical models could provide. In this setting, generative models and pathwise methods rooted in rough paths have proven to be powerful from several perspectives. At the same time, any model – a traditional stochastic model or a market generator – is at best an approximation of market reality, prone to model misspecification and estimation errors. The unfading question of how to furnish a modelling setup with tools that can address the risk of discrepancy between model and market reality also gains new colour as a multitude of new possibilities arises under the pathwise perspective.

 

Schedule

 

Monday 22 Tuesday 23 Wednesday 24 Thursday 25 Friday 26
8.30-8.55 Registration
9 – 9.45 Christian Bayer Adeline Fermanian Sandy Davie Ofer Zeitouni Josef Teichmann
9.45 – 10.30 René Carmona Peter Friz Michael Röckner Yves Le Jan Syoiti Ninomiya
Coffee break (10.30 – 11)
11 – 11.45 Maud Lemercier Martin Hairer Alison Etheridge David Nualart Hans Buehler
Lunch break (11.45 – 14)
14 – 14.45 Blanka Horvath Ben Hambly PhD session Gerard Ben Arous Salvador Ortiz-Latorre
14.45 – 15.30 Horatio Boedihardjo Felix Otto PhD session Zhongmin Qian Andrew Stuart
Coffee break (15.30 – 16) Closing remarks
16 – 16.45 Stefano Goria Harald Oberhauser PhD session Paola Arrubarrena
Conference dinner (invitation only) PhD students dinner (invitation only)


Registration is now closed.