Special Seminar
General Relativity suffers in several ways, from source modeling to data analysis, from the "curse of dimensionality": the complexity of a problem grows beyond practical control as more physical parameters of interest are taken into account.
The typical example from source modeling is mapping the space of, say, binary black hole (BBH) collision waveforms using numerical relativity simulations, even if these are only used to calibrate or fit effective or phenomenological models. Each of these simulations is very expensive to run, and the space of configurations for BBHs in initially quasi-circular orbit is 7- or 8-dimensional (masses and spins), depending on the counting. One would therefore imagine that N^7 or N^8 numerical relativity simulations are needed to cover the space of solutions, with N the number of points per dimension; even a modest N = 10 would already imply 10^7 to 10^8 simulations. One is therefore interested in: i) choosing the most relevant points in parameter space to solve for, and ii) fast evaluation of new solutions once such an optimal set has been computed. In this context, beating the curse of dimensionality would include the property that the number of such optimal solutions is small and does not scale with the number of parameter dimensions (a toy sketch of such a greedy selection appears below).

In terms of searching for gravitational waves by matched filtering, an example in which no equation solving is involved is that of ringdown waveforms, such as those used to search for intermediate mass black holes (IMBHs). Even though these can be modeled in closed form (through quasi-normal modes), the size of catalogs for multi-mode ringdowns is huge for realistic searches. At the same time, IMBHs are expected to have a second, non-trivial mode beyond the dominant l=m=2 one, and studies have shown that neglecting it can lead to event losses of up to ~15% and to biases in parameter estimation. Multi-mode ringdown searches have also long been proposed as a consistency test of GR and/or of the no-hair theorem. Similarly, including spins in searches for gravitational waves from BBHs is desirable, and poses similar challenges.
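To make points i) and ii) concrete, here is a minimal sketch of a greedy reduced-basis selection. Everything in it is an illustrative assumption rather than the actual models or codes of this program: toy_waveform is a hypothetical stand-in for an expensive solver (a damped sinusoid, which happens to be the functional form of a single quasi-normal ringdown mode), and the two-parameter training set and tolerance are arbitrary.

```python
import numpy as np

def toy_waveform(params, t):
    """Hypothetical stand-in for an expensive solver: a damped sinusoid."""
    f, tau = params
    return np.exp(-t / tau) * np.sin(2.0 * np.pi * f * t)

def greedy_reduced_basis(training_params, t, tol=1e-10):
    """Greedily pick the most relevant parameter-space points (point i).

    Repeatedly adds the training waveform worst represented by the
    current basis, stopping when the largest squared projection error
    drops below `tol`.
    """
    waveforms = np.array([toy_waveform(p, t) for p in training_params])
    waveforms /= np.linalg.norm(waveforms, axis=1)[:, None]

    basis = [waveforms[0]]          # seed with an arbitrary first element
    selected = [0]
    while True:
        B = np.array(basis)
        proj = waveforms @ B.T @ B  # projection onto span(basis)
        errors = np.sum((waveforms - proj) ** 2, axis=1)
        worst = int(np.argmax(errors))
        if errors[worst] < tol:
            break
        residual = waveforms[worst] - proj[worst]   # Gram-Schmidt step
        basis.append(residual / np.linalg.norm(residual))
        selected.append(worst)
    return np.array(basis), selected

t = np.linspace(0.0, 10.0, 2048)
rng = np.random.default_rng(0)
training = list(zip(rng.uniform(0.5, 2.0, 1000),    # mode frequencies
                    rng.uniform(1.0, 5.0, 1000)))   # damping times
basis, picks = greedy_reduced_basis(training, t)
print(f"{len(picks)} basis elements represent {len(training)} training waveforms")
```

The number of selected points measures the effective dimensionality of the waveform family; the hope, borne out in the results discussed below, is that it saturates quickly rather than growing like N^d.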
From a parameter estimation point of view, an example is the need for fast (preferably real-time) evaluations of likelihoods, triggered by the detection of a gravitational wave, in order to compute the posterior probability distribution of the parameters of the source. Such calculations are typically carried out using Markov chain Monte Carlo (MCMC) simulations, which are very expensive, particularly so for compact binary coalescences, due to the number of cycles involved and the multi-dimensionality of the problem. There are two dominant costs in the likelihood computations required by MCMC: on-demand evaluation of waveforms (even if they come from approximate models), and computing many overlaps between a signal and members of a catalog while marching through high-dimensional (intrinsic and extrinsic) parameter spaces (a continuation of the sketch above, illustrating the overlap speed-up, appears below).

The apparently very different examples given above have a common origin, and can be approached with similar tools from reduced order modeling (ROM)/dimensional reduction. I will discuss a program of ROM in GR pursued in collaboration with several colleagues, the results we have obtained so far (which indicate that one can beat the curse of dimensionality), ongoing work, and plans for the future.
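As a hedged illustration of the second cost, the sketch below continues the toy example above: after projecting the signal and a template catalog once onto the reduced basis, each overlap collapses from a length-T inner product to a length-m one, with m the (small) basis size. The white-noise (plain Euclidean) inner product and the signal parameters are simplifying assumptions; a real search would use noise-weighted overlaps.

```python
# Continuing the sketch above: precompute reduced-basis coefficients once,
# then every overlap costs O(m) instead of O(T).
templates = np.array([toy_waveform(p, t) for p in training])   # catalog, shape (N, T)
template_coeffs = templates @ basis.T                          # done once, shape (N, m)

signal = toy_waveform((1.3, 2.7), t)    # hypothetical detected signal
signal_coeffs = basis @ signal          # done once per signal, shape (m,)

overlaps_fast = template_coeffs @ signal_coeffs   # N overlaps at O(m) each
overlaps_full = templates @ signal                # the same at O(T) each
print(np.max(np.abs(overlaps_fast - overlaps_full)))  # agree to basis accuracy
```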
Work done so far has been in collaboration with Harbir Antil, Priscilla Canizares, Sarah Caudill, Scott Field, Jonathan Gair, Chad Galley, Frank Herrmann, Jan Hesthaven, Ricardo Nochetto, and Evan Ochsner. Ongoing work additionally involves Jonathan Blackman, Mark Scheel, and Bela Szilagyi.