Practical discretisation error estimates for data-driven materials modelling (Swiss National Science Foundation, Oct 2025 – Sept 2029, 600k CHF)
Modern computational materials discovery frequently relies on machine-learned surrogates trained on density-functional theory (DFT) simulations of a large set of structures. The error in the training data itself is predominantly modelled as small, homoscedastic Gaussian noise, which requires the simulated DFT quantities to be of consistently good quality across the entire data set. Recent mathematical advances offer a quantitative way to model the error in energies and forces due to the chosen plane-wave cutoff. In this project we will (1) develop these discretisation error estimates into a routine tool for materials modelling and (2) employ such quantitative estimates to overcome the limitations of a homoscedastic error model within Gaussian process (GP) regression.
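The second aim above can be illustrated with a minimal sketch: instead of a single noise level for all training points, per-structure error estimates enter the GP as per-sample noise variances. The data, descriptors, and error magnitudes below are purely illustrative placeholders, not the project's actual datasets or estimators; scikit-learn's `GaussianProcessRegressor` is used here only because its `alpha` parameter accepts one noise variance per sample.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Toy stand-in for DFT training data: descriptors X, target energies y,
# and a per-structure discretisation error estimate sigma (all synthetic).
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(20, 1))        # structure descriptors
y = np.sin(4.0 * X[:, 0])                      # toy "DFT energies"
sigma = rng.uniform(0.01, 0.2, size=20)        # per-point error estimates

# Heteroscedastic GP regression: passing an array as `alpha` adds a
# different noise variance sigma_i^2 to each diagonal entry of the
# kernel matrix, rather than one homoscedastic noise level for all points.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.3), alpha=sigma**2)
gp.fit(X, y)

# Predictions then reflect the varying trustworthiness of the data.
mean, std = gp.predict(np.array([[0.5]]), return_std=True)
```

In a homoscedastic model one would instead pass a single scalar `alpha`, implicitly asserting that every DFT calculation in the training set is equally converged.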
A novel approach to first-principles inverse materials design leveraging algorithmic property derivatives (Swiss National Science Foundation, Apr 2024 – Mar 2026, 100k CHF)
National competence centre in research MARVEL, École Polytechnique Fédérale de Lausanne (EPFL) (2023 – 2026)
Swiss national research centre on the Computational Design and Discovery of Novel Materials, including researchers in experimental and computational physics and chemistry as well as computer science. Its goal is to bridge these communities and advance the state of the art of materials discovery. Our group joined the centre in May 2023 for the third funding phase, in the pillar on the Digital Infrastructure of Open Simulation and Data. A key aim is to integrate our software efforts surrounding efficient black-box algorithms with the high-throughput and pre-exascale platforms currently under development at MARVEL.
Center for Exascale Simulation of Material Interfaces in Extreme Environments, Massachusetts Institute of Technology. (2022 – 2025)
Research initiative advancing state-of-the-art first-principles simulations by connecting method development with (a) advances in programming languages, compiler technologies and performance engineering tools and (b) rigorous approaches to statistical inference and uncertainty quantification (UQ). On the former my main interaction is with the JuliaLab of Alan Edelman; on the latter I collaborate with Youssef Marzouk on uncertainty quantification in first-principles simulations.
Extreme-scale Mathematically-based Computational Chemistry, Sorbonne University, Inria Paris and École des Ponts ParisTech. (2019 – 2024)
Interdisciplinary initiative to advance the state of the art in first-principles simulations by integrating recent advances in numerical linear algebra (e.g. low-rank factorisations, randomised methods, GPU acceleration) and numerical analysis (e.g. a posteriori error estimation) into state-of-the-art molecular-dynamics and electronic-structure methods. Continuing from my postdoc, I participate in the development of black-box density-functional theory methods and a posteriori error estimates, working mainly with Eric Cancès and Antoine Levitt.
Eric Cancès (Numerical analysis, École des Ponts ParisTech): A posteriori error analysis for electronic-structure simulations.
Geneviève Dusson (Applied mathematics, CNRS): Error analysis in density-functional theory.
Gaspard Kemlin (Applied mathematics, Université de Picardie): Error analysis in density-functional theory.
Youssef Marzouk (Uncertainty quantification, Massachusetts Institute of Technology): Uncertainty quantification in density-functional theory.
Antoine Levitt (Numerical analysis, University of Paris-Saclay): Reliable black-box algorithms for electronic structure simulations.
Andre Laestadius (Applied mathematics, Oslo Met University): Mathematical formulation of density-functional theory.
Benjamin Stamm (Numerical analysis, Universität Stuttgart): Reduced basis methods & analysis of quantum-chemical methods.