About me

I am a postdoctoral researcher (postdoc) at the Chair for the Methods of Machine Learning at the University of Tübingen, advised by Philipp Hennig. Before that, I did my Ph.D. in the same group as part of the IMPRS-IS (International Max Planck Research School for Intelligent Systems). I am working on more user-friendly training methods for machine learning. My work aims at ridding the field of deep learning of annoying hyperparameters and thus automating the training of deep neural networks.

Prior to joining the IMPRS-IS, I studied Simulation Technology (B.Sc. and M.Sc.) and Industrial and Applied Mathematics (M.Sc.) at the University of Stuttgart and the Technische Universiteit Eindhoven, respectively. My Master’s thesis was on constructing preconditioners for Toeplitz matrices. This project was done at ASML (Eindhoven), a company developing lithography systems for the semiconductor industry.

Interests
  • Deep Learning
  • Training Algorithms
  • Stochastic Optimization
  • Benchmarking
  • Artificial Intelligence
Education
  • Postdoctoral Researcher, 2022 -

    University of Tübingen

  • Ph.D. in Computer Science, 2017 - 2022

    University of Tübingen & MPI-IS, IMPRS-IS fellow

  • M.Sc. in Industrial and Applied Mathematics, 2015 - 2016

    TU/e Eindhoven

  • M.Sc. in Simulation Technology, 2015 - 2016

    University of Stuttgart

  • B.Sc. in Simulation Technology, 2011 - 2015

    University of Stuttgart

News

  • June 2023: We published the first paper from the MLCommons' Algorithms working group on arXiv titled "Benchmarking Neural Network Training Algorithms".
    In it, we motivate, present, and justify our new AlgoPerf: Training Algorithms benchmark. We plan to issue a Call for Submissions for the benchmark soon.
  • July 2022: I successfully defended my Ph.D. thesis with the title “Understanding Deep Learning Optimization via Benchmarking and Debugging”!
    I will continue to work as a postdoctoral researcher at the University of Tübingen.
  • September 2021: Our paper “Cockpit: A Practical Debugging Tool for the Training of Deep Neural Networks” has been accepted at NeurIPS 2021.
    In this work, we present a new kind of debugger, specifically designed for training deep nets.
  • May 2021: Our work “Descending through a Crowded Valley - Benchmarking Deep Learning Optimizers” has been accepted at ICML 2021.
    In it, we present an extensive comparison of fifteen popular deep learning optimizers.
  • April 2021: I was elected co-chair of the MLCommons working group on Algorithmic Efficiency, together with George Dahl.
    The working group will develop a set of rigorous and relevant benchmarks to measure speedups in neural network training due to algorithmic improvements, focusing on new training algorithms and models.
  • September & Oktober 2020: I have been distinguished as a top reviewer for ICML 2020 and received a Top 10% Reviewer award for NeurIPS 2020.
  • May 2019: Our paper “DeepOBS: A Deep Learning Optimizer Benchmark Suite” has been accepted at ICLR 2019.
    In the paper, we present a benchmark suite for deep learning optimization methods. I will be at the conference from May 6 through May 9 in New Orleans, USA.

Publications

Workshops, Talks & Summer Schools

Organized an ELLIS Workshop
I co-organized the ELLIS workshop on Algorithms for Deep Learning co-located with the inauguration of the Tübingen AI Center.
Invited Talk SPP 2353 Summer School
I gave an invited talk at the SPP 2353 Summer School focusing on Neural Network Training through the Lens of Benchmarking and Debugging.
Organized the HITY Workshop
I co-organized the Has it Trained Yet? workshop at NeurIPS 2022. In this workshop, we discussed algorithmic solutions for practical and efficient neural network training. During the workshop, we also presented our workshop paper Late-Phase Second-Order Training.
Invited Talk at the ML Evaluation Standards Workshop
I gave an invited talk at the ML Evaluation Standards Workshop at ICLR 2022. During the talk and the panel discussion, we examined the role of reproducibility and rigor in machine learning.
Invited Talk at University of Freiburg
I gave an invited talk about our research involving DeepOBS at the Machine Learning Lab of the University of Freiburg at the invitation of Prof. Frank Hutter.
Gaussian Process Summer School
Attended the Gaussian Process Summer School 2018 in Sheffield, learning about GPs, kernel design, and Bayesian optimization.
Microsoft Research AI Summer School
Attended the Microsoft Research AI Summer School 2018 in Cambridge.

Teaching

Software Practicals - Bundesliga Prediction
Supervised software practicals in which students built an app that predicts the results of Bundesliga matches using historical data and machine learning algorithms.
Probabilistic Inference and Learning
Lecture course by Prof. Dr. Philipp Hennig on probabilistic machine learning, covering topics such as probability theory, graphical models, Gaussian processes, and sampling methods.
Programming Course for MATLAB and C++
Block course introducing the basics of programming in both MATLAB and C++.

Education & Experience

University of Tübingen
Postdoctoral Researcher
January 2022 – Present, Tübingen

Working on making deep learning more user-friendly by focusing on the training algorithms.

Advisor: Prof. Dr. Philipp Hennig

University of Tübingen
Ph.D.
September 2017 – July 2022, Tübingen

Doctoral student in computer science.

Worked on improving deep learning optimization at the Max Planck Institute for Intelligent Systems and the University of Tübingen as part of the International Max Planck Research School for Intelligent Systems (IMPRS-IS).

Supervisor: Prof. Dr. Philipp Hennig

University of Stuttgart, IAG
Research Assistant
May 2017, Stuttgart
Assisted with the European Workshop on High Order Nonlinear Numerical Methods for Evolutionary PDEs.
 
University of Stuttgart, IADM
Research Assistant
March 2017, Stuttgart
Teaching assistant for a programming course on MATLAB and C++.
 
ASML
Internship
April 2016 – October 2016, Eindhoven
Worked on preconditioners for large linear systems.
 
University of Stuttgart & TU/e Eindhoven
Double Master’s Degree
July 2015 – January 2017, Stuttgart & Eindhoven

Focus on numerics and mathematical applications.

Master’s thesis at ASML with the title “Approximation of Inverses of BTTB Matrices for Preconditioning Applications”.

Supervisor: Michiel Hochstenbach, Ph.D., TU/e

University of Stuttgart, IAT
Student Assistant
October 2014 – May 2015, Stuttgart
Analysis, evaluation, and optimization of an agent-based warning dissemination model. Supported the round table at the Fraunhofer IAO on research in civil protection.
 
University of Stuttgart
Bachelor’s Degree
October 2011 – July 2015, Stuttgart

Focus on a broad education in mathematics, engineering, computer science, and the natural sciences.

Bachelor’s thesis in cooperation with the Fraunhofer Institute for Industrial Engineering with the title “Analysis, evaluation and optimization of an agent-based model simulating warning dissemination”.

Supervisor: Prof. Dr. Albrecht Schmidt

Contact