Title: Efficient Finite Basis Physics-Informed Neural Networks

Abstract: Physics-informed neural networks (PINNs) offer promising advantages over traditional numerical methods for solving PDEs, such as being mesh-free and naturally suited for inverse problems, yet they often struggle to converge and scale to more complex, multi-scale problems. A promising way to scale PINNs is to use domain decomposition; for instance, finite basis physics-informed neural networks (FBPINNs) replace the global PINN with multiple local networks whose sum approximates the PDE solution. Whilst this divide-and-conquer strategy achieves better convergence and accuracy than PINNs, FBPINNs are still relatively slow to train compared to numerical methods. In this talk, we show it is possible to significantly accelerate the training of FBPINNs by linearising their optimisation problem when using extreme learning machines (ELMs) as their subdomain networks. We show comparisons of PINNs, FBPINNs and ELM-FBPINNs across a range of problems and discuss outstanding challenges in accelerating FBPINNs.