This is an active project currently making use of the HPC facilities at Newcastle University.
This project develops new ways to speed up and quantify the accuracy of large-scale scientific simulations. The key idea is to view many numerical computations as a form of extrapolation across different simulation fidelities (for example, running coarse, medium, and fine versions of the same model). By combining results from these multi-fidelity runs in a principled probabilistic way, the team can both estimate uncertainty and accelerate convergence to high-accuracy solutions.
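As a rough illustration of the underlying idea, the sketch below applies classical Richardson extrapolation to coarse, medium, and fine runs of a toy "simulation" (a trapezoidal-rule integral) and uses the spread between successive extrapolants as a crude uncertainty proxy. The solver, step sizes, and convergence order are illustrative assumptions; the project's probabilistic treatment is far more principled than this heuristic.

```python
import numpy as np

def simulate(n):
    """Stand-in for an expensive solver: trapezoidal-rule approximation of
    the integral of sin(x) over [0, pi] using n sub-intervals (true value
    2.0), so the discretisation error behaves like C / n**2."""
    x = np.linspace(0.0, np.pi, n + 1)
    y = np.sin(x)
    h = np.pi / n
    return h * (0.5 * y[0] + y[1:-1].sum() + 0.5 * y[-1])

# Coarse, medium and fine fidelities, each doubling the resolution.
coarse, medium, fine = simulate(4), simulate(8), simulate(16)

# Richardson extrapolation assumes error ~ C * h**p; with order p = 2 and a
# refinement factor r = 2, combining two fidelities cancels the leading term.
p, r = 2, 2
extrap_cm = (r**p * medium - coarse) / (r**p - 1)
extrap_mf = (r**p * fine - medium) / (r**p - 1)

# The spread between successive extrapolants gives a rough uncertainty
# estimate; a probabilistic treatment would replace this with a statistical
# model over the discretisation error.
estimate, spread = extrap_mf, abs(extrap_mf - extrap_cm)
print(f"estimate = {estimate:.8f} +/- {spread:.1e}  (true value 2.0)")
```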
Earlier work showed that this approach, Probabilistic Richardson Extrapolation, can significantly speed up demanding tasks such as whole-heart simulations when run in parallel on HPC systems. However, the number of simulations the method required grew impractically quickly with the problem dimension.
The project overcomes this limitation by introducing the concepts of effective dimension and extrapolation sparsity, which reflect structural features common to many modern numerical methods. These concepts allow the researchers to design sparse, efficient extrapolation schemes that dramatically reduce the number of simulations required while maintaining or improving accuracy, as the sketch below illustrates. The work combines new theory with extensive large-scale computational experiments on the HPC facility.
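To give a feel for why sparsity matters, the following sketch compares a full tensor grid of fidelity levels with a simple Smolyak-style sparse multi-index set. The parameter counts and the specific index-set rule are illustrative assumptions, not the project's actual construction, but they show the scale of the reduction in required runs.

```python
from itertools import product
from math import comb

def full_tensor(d, L):
    """All multi-indices with every component below L: L**d simulation runs."""
    return [k for k in product(range(L), repeat=d)]

def sparse_set(d, L):
    """Smolyak-style sparse set: keep only multi-indices whose components
    sum to at most L - 1. Its size is comb(L - 1 + d, d)."""
    return [k for k in product(range(L), repeat=d) if sum(k) <= L - 1]

d, L = 4, 5  # 4 fidelity parameters, 5 refinement levels each (illustrative)
print(len(full_tensor(d, L)))   # 625 runs for the full tensor grid
print(len(sparse_set(d, L)))    # 70 runs for the sparse set
print(comb(L - 1 + d, d))       # 70, the closed-form count
```

With four fidelity parameters and five levels each, the full grid needs 625 simulations while the sparse set needs only 70, and the gap widens rapidly as the dimension grows; effective dimension and extrapolation sparsity are what justify discarding the omitted runs without sacrificing accuracy.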