Scientific Computing and Numerics (SCAN) Seminar
Gilles Abstract: The conjugate gradient method (CG) is the algorithm of choice for solving large symmetric positive definite linear systems. Although self-adjoint elliptic partial differential equations are the differential analogues of symmetric positive definite systems, their spectral discretizations typically lead to non-normal, ill-conditioned systems on which CG cannot operate. In this talk, I present an extension of CG that applies directly to self-adjoint elliptic operators and leads to optimal-complexity solvers for the spectral solution of some PDEs on intervals and rectangular domains. I also discuss extensions to other Krylov subspace methods.
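For readers who want a concrete reference point, the sketch below shows the classical CG iteration for a symmetric positive definite matrix, i.e. the textbook algorithm the talk starts from, not the operator extension it presents. The function name `cg`, the tolerance, and the random test system are illustrative assumptions, not anything from the talk.

```python
# A minimal sketch of classical conjugate gradients for an SPD matrix A.
import numpy as np

def cg(A, b, tol=1e-10, maxit=500):
    x = np.zeros_like(b)
    r = b - A @ x          # residual
    p = r.copy()           # search direction
    rs = r @ r
    for _ in range(maxit):
        Ap = A @ p
        alpha = rs / (p @ Ap)   # step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p   # new A-conjugate direction
        rs = rs_new
    return x

# Quick check on a random SPD system.
n = 50
M = np.random.randn(n, n)
A = M @ M.T + n * np.eye(n)   # SPD by construction
b = np.random.randn(n)
x = cg(A, b)
print(np.linalg.norm(A @ x - b))
```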
Javeed Abstract: In this short talk, I will explain how SDEs can be simulated as ODEs. Chebfun uses this strategy. It represents Brownian motion as a random Fourier series, truncates, and then solves the resulting ODE dx/dt = f(x,t) + g(x,t) dB/dt with a conventional integrator. As the approximating series is truncated at higher and higher levels, the solutions of the ODE converge (almost surely) to those of the SDE. But are there other approximations with this guarantee? The answer is yes. In fact, not even the Fourier series representation of Brownian motion is unique. In part of this talk, I will describe another random Fourier series for Brownian motion that enables us to approximate SDEs as ODEs. What is special about this alternative series is that it is reminiscent of an SVD: its partial sums capture as much energy as possible. It is known as Brownian motion's Karhunen-Loève expansion. And as I will explain, the KL expansion is to stochastic processes what PCA is to IID data. Time permitting, I will also summarize some remarkable results published by a Rutgers mathematician (in 1977 and 1978) that establish a deep theoretical connection between SDEs and ODEs.
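To make the strategy concrete, here is a minimal Python sketch (not the speaker's or Chebfun's code) that replaces Brownian motion with a truncated Karhunen-Loève series and hands the resulting ODE to a conventional integrator. The truncation level, the geometric-Brownian-motion test equation, and all variable names are assumptions made for the sketch.

```python
# Simulate an SDE as an ODE by replacing Brownian motion with a truncated
# Karhunen-Loeve expansion and integrating with a standard ODE solver.
# (Smooth approximations of this kind typically converge to the
# Stratonovich solution of the SDE.)
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(0)
N = 200                      # truncation level of the KL series
T = 1.0                      # time horizon
Z = rng.standard_normal(N)   # i.i.d. N(0,1) KL coefficients

def dB(t):
    """Derivative of the truncated KL expansion of Brownian motion on [0, T]."""
    k = np.arange(1, N + 1)
    freqs = (k - 0.5) * np.pi / T
    return np.sqrt(2.0 / T) * np.sum(Z * np.cos(freqs * t))

# Illustrative SDE: dx = mu*x dt + sigma*x dB (geometric Brownian motion).
mu, sigma, x0 = 1.0, 0.5, 1.0
f = lambda x, t: mu * x
g = lambda x, t: sigma * x

# Solve dx/dt = f(x,t) + g(x,t) dB/dt with a conventional integrator.
sol = solve_ivp(lambda t, x: f(x, t) + g(x, t) * dB(t),
                (0.0, T), [x0], max_step=1e-3)
print(sol.y[0, -1])
```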