My primary research interest is machine learning with differentiable algorithms.
For example, I have developed a general framework for making algorithms differentiable, and I have also focused on differentiable logic gate networks, differentiable sorting, and differentiable rendering.
Beyond differentiable algorithms, my work on differentiability extends to various domains, including stochastic gradient estimation, analytical distribution propagation, second-order optimization, uncertainty quantification, domain adaptation, individual fairness, and efficient neural architectures.
I am a postdoctoral researcher at Stanford University in Stefano Ermon's group, collaborating with Christian Borgelt, Hilde Kuehne, Mikhail Yurochkin, Yuekai Sun, and Oliver Deussen, among others.
Previously, I worked at the University of Konstanz, TAU, DESY, PSI, and CERN, among other institutions.


News
Jul 2023 Our paper "Learning by Sorting: Self-supervised Learning with Group Ordering Constraints" was accepted to ICCV 2023!
May 2023 We released the Call for Papers for our ICML 2023 workshop. Consider submitting a 4-page paper and joining us in Hawaii on July 28: differentiable.xyz
Apr 2023 Our paper "Neural Machine Translation for Mathematical Formulae" was accepted to ACL 2023!
Apr 2023 Our workshop "Differentiable Almost Everything: Differentiable Relaxations, Algorithms, Operators, and Simulators" has been accepted for ICML 2023!
Feb 2023 Our paper "ISAAC Newton: Input-based Approximate Curvature for Newton's Method" was accepted to ICLR 2023!
Oct 2022 Our papers "Deep Differentiable Logic Gate Networks" and "Domain Adaptation meets Individual Fairness. And they get along" were accepted to NeurIPS 2022!
Jun 2022 I submitted my thesis on "Learning with Differentiable Algorithms"!
Jun 2022 Our paper "Differentiable Top-k Classification Learning" was accepted to ICML!
Mar 2022 Our paper "GenDR: A Generalized Differentiable Renderer" was accepted to CVPR!
Feb 2022 Our paper "Monotonic Differentiable Sorting Networks" was accepted to ICLR!
Oct 2021 Our papers "Learning with Algorithmic Supervision via Continuous Relaxations" and "Post-processing for Individual Fairness" were accepted to NeurIPS!

Research
The focus of my research is differentiability and the study of making non-differentiable operations differentiable.
Differentiable relaxations enable a plethora of optimization tasks:
from optimizing logic gate networks [1]
and optimizing through the 3D rendering pipeline [2, 3, 4]
to differentiating sorting and ranking [5, 6]
for supervised [7]
and self-supervised [8] learning.
Beyond differentiable algorithms, this branches out into various domains including
stochastic gradient estimation [9],
analytical distribution propagation [10],
second-order optimization [9, 11],
uncertainty quantification [10],
fairness [12, 13],
and efficient neural architectures [1, 14].
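As a toy illustration of the underlying idea (a minimal sketch, not the monotonic sorting networks from the papers above), the hard compare-and-swap step of a sorting network can be relaxed by replacing the discrete comparison with a sigmoid, which makes the entire sort smooth and thus amenable to gradient-based training:

```python
import math

def soft_swap(a, b, beta=10.0):
    """Differentiable relaxation of compare-and-swap: the hard
    comparison 1[a > b] is replaced by a sigmoid with inverse
    temperature beta, so both outputs are smooth in a and b."""
    p = 1.0 / (1.0 + math.exp(-beta * (a - b)))  # p ~ 1 if a > b
    soft_min = p * b + (1.0 - p) * a
    soft_max = p * a + (1.0 - p) * b
    return soft_min, soft_max

def soft_sort(xs, beta=10.0):
    """Relaxed odd-even transposition sort built from soft swaps.
    As beta -> infinity, this recovers the exact (hard) sort."""
    xs = list(xs)
    n = len(xs)
    for rnd in range(n):
        for i in range(rnd % 2, n - 1, 2):
            xs[i], xs[i + 1] = soft_swap(xs[i], xs[i + 1], beta)
    return xs
```

For large beta, `soft_sort([3.0, 1.0, 2.0], beta=50.0)` is numerically close to `[1.0, 2.0, 3.0]`, while for moderate beta the outputs interpolate smoothly between the inputs, providing usable gradients.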


Learning by Sorting: Self-supervised Learning with Group Ordering Constraints
Nina Shvetsova,
Felix Petersen,
Anna Kukleva,
Bernt Schiele,
Hilde Kuehne
in Proceedings of the International Conference on Computer Vision (ICCV 2023)


Deep Differentiable Logic Gate Networks
Felix Petersen,
Christian Borgelt,
Hilde Kuehne,
Oliver Deussen
in Proceedings of the 36th International Conference on Neural Information Processing Systems (NeurIPS 2022)
Code


Learning with Differentiable Algorithms
Felix Petersen
PhD thesis (summa cum laude), University of Konstanz


Neural Machine Translation for Mathematical Formulae
Felix Petersen,
Moritz Schubotz,
Andre Greiner-Petter,
Bela Gipp
in Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (ACL 2023)
YouTube


ISAAC Newton: Input-based Approximate Curvature for Newton's Method
Felix Petersen,
Tobias Sutter,
Christian Borgelt,
Dongsung Huh,
Hilde Kuehne,
Yuekai Sun,
Oliver Deussen
in Proceedings of the International Conference on Learning Representations (ICLR 2023)
YouTube Code


Differentiable Top-k Classification Learning
Felix Petersen,
Hilde Kuehne,
Christian Borgelt,
Oliver Deussen
in Proceedings of the 39th International Conference on Machine Learning (ICML 2022)
YouTube Code


Domain Adaptation meets Individual Fairness. And they get along.
Debarghya Mukherjee*,
Felix Petersen*,
Mikhail Yurochkin,
Yuekai Sun
in Proceedings of the 36th International Conference on Neural Information Processing Systems (NeurIPS 2022)


GenDR: A Generalized Differentiable Renderer
Felix Petersen,
Christian Borgelt,
Bastian Goldluecke,
Oliver Deussen
in Proceedings of the Conference on Computer Vision and Pattern Recognition (CVPR 2022)
YouTube Code


Monotonic Differentiable Sorting Networks
Felix Petersen,
Christian Borgelt,
Hilde Kuehne,
Oliver Deussen
in Proceedings of the International Conference on Learning Representations (ICLR 2022)
YouTube Code / diffsort library


Learning with Algorithmic Supervision via Continuous Relaxations
Felix Petersen,
Christian Borgelt,
Hilde Kuehne,
Oliver Deussen
in Proceedings of the 35th International Conference on Neural Information Processing Systems (NeurIPS 2021)
YouTube Code / AlgoVision library


Post-processing for Individual Fairness
Felix Petersen*,
Debarghya Mukherjee*,
Yuekai Sun,
Mikhail Yurochkin
in Proceedings of the 35th International Conference on Neural Information Processing Systems (NeurIPS 2021)
YouTube Code


Style Agnostic 3D Reconstruction via Adversarial Style Transfer
Felix Petersen,
Hilde Kuehne,
Bastian Goldluecke,
Oliver Deussen
in Proceedings of the IEEE Winter Conference on Applications of Computer Vision (WACV 2022)
YouTube


Differentiable Sorting Networks for Scalable Sorting and Ranking Supervision
Felix Petersen,
Christian Borgelt,
Hilde Kuehne,
Oliver Deussen
in Proceedings of the 38th International Conference on Machine Learning (ICML 2021)
YouTube


AlgoNet: C^{∞} Smooth Algorithmic Neural Networks
Felix Petersen,
Christian Borgelt,
Oliver Deussen


Pix2Vex: Image-to-Geometry Reconstruction using a Smooth Differentiable Renderer
Felix Petersen,
Amit H. Bermano,
Oliver Deussen,
Daniel Cohen-Or


Towards Formula Translation using Recursive Neural Networks
Felix Petersen,
Moritz Schubotz,
Bela Gipp
in Proceedings of the 11th Conference on Intelligent Computer Mathematics (CICM), 2018


LaTeX-EqChecker
 A framework for checking mathematical semantics in LaTeX documents
Felix Petersen
Presented in the Special Session of the 11th Conference on Intelligent Computer Mathematics (CICM), 2018
Slides


Individualized Maths Introduction Course
WS 2019 – SS 2022


Tutor: Analysis and Linear Algebra
SS 2019 – WS 2020


Seminar: Current Trends in Computer Graphics (+ Neural Networks, and Mathematical Language Processing)
WS 2019


Tutor: Discrete Mathematics
SS 2017
and SS 2018


Tutor: Programming Course 1 (Java)
WS 2016 and WS 2017

