I am a PhD student at ETH Zürich and the Max Planck Institute for Intelligent Systems, and a student researcher on the applied science team at Google Research.
I am advised by Gunnar Rätsch and Bernhard Schölkopf, and supported by the CLS fellowship.
Previously, I completed my MSc in computer science at EPFL, where I worked with Patrick Thiran and Matthias Grossglauser as a research scholar.
I spent the last year of my MSc as an intern at RIKEN AIP in Tokyo with Emtiyaz Khan.
My research focuses on probabilistic methods for deep learning and scientific applications.
I am interested in improving the generalization and efficiency of deep learning models through gradient-based optimization and inference.
Further, I work on extending deep learning models, within a probabilistic framework, to handle multiple modalities, forecasting horizons, and output types.
Both directions are motivated by problems in scientific applications such as biomedicine and neuroscience.
Previously, I also worked on methods for multimodal time series arising in mobility and political data.
Publications
(* denotes equal contribution)
Effective Bayesian Heteroscedastic Regression with Deep Neural Networks
A Immer*, E Palumbo*, A Marx, JE Vogt
NeurIPS, 2023
Kronecker-Factored Approximate Curvature for Modern Neural Network Architectures
R Eschenhagen, A Immer, RE Turner, F Schneider, P Hennig
NeurIPS, 2023
Learning Layer-wise Equivariances Automatically using Gradients
T v.d. Ouderaa, A Immer, M v.d. Wilk
NeurIPS, 2023
Stochastic Marginal Likelihood Gradients using Neural Tangent Kernels
A Immer, T v.d. Ouderaa, M v.d. Wilk, G Rätsch, B Schölkopf
ICML, 2023
On the Identifiability and Estimation of Causal Location-Scale Noise Models
A Immer, C Schultheiss, JE Vogt, B Schölkopf, P Bühlmann, A Marx
ICML, 2023
Invariance Learning in Deep Neural Networks with Differentiable Laplace Approximations
A Immer*, T v.d. Ouderaa*, G Rätsch, V Fortuin, M v.d. Wilk
NeurIPS, 2022
Probing as Quantifying Inductive Bias
A Immer*, LT Hennigen*, V Fortuin, R Cotterell
ACL, 2022
Laplace Redux – Effortless Bayesian Deep Learning
E Daxberger*, A Kristiadi*, A Immer*, R Eschenhagen*, M Bauer, P Hennig
NeurIPS, 2021
Scalable Marginal Likelihood Estimation for Model Selection in Deep Learning
A Immer, M Bauer, V Fortuin, G Rätsch, ME Khan
ICML, 2021
Improving predictions of Bayesian neural networks via local linearization
A Immer*, M Korzepa, M Bauer*
AISTATS, 2021
Continual Deep Learning by Functional Regularisation of Memorable Past
P Pan, S Swaroop, A Immer, R Eschenhagen, R Turner, ME Khan
NeurIPS (oral), 2020
Sub-Matrix Factorization for Real-Time Vote Prediction
A Immer, V Kristof, M Grossglauser, P Thiran
KDD (oral), 2020
Disentangling the Gauss-Newton Method and Approximate Inference for Neural Networks
A Immer
EPFL MSc Thesis, 2020
Approximate Inference Turns Deep Networks into Gaussian Processes
ME Khan, A Immer, E Abedi, M Korzepa
NeurIPS, 2019
Efficient Learning of Smooth Probability Functions from Bernoulli Tests with Guarantees
P Rolland, A Kavis, A Immer, A Singla, V Cevher
ICML, 2019
Variational Inference with Numerical Derivatives: Variance Reduction through Coupling
A Immer, GP Dehaene
Preprint, 2019