I am a postdoctoral researcher in machine learning at EPFL, working jointly with the IdePHICS, SPOC and INDY labs.
My current research focuses on the behavior of graph neural networks.
From 2019 to 2022, I was a PhD student at École des Ponts (CERMICS); I then worked as a teaching assistant at MIT (JuliaLab).
Check out PhD Resources, my website for students and researchers.
Combinatorial optimization (CO) layers in machine learning (ML) pipelines are a powerful tool to tackle data-driven decision tasks, but they come with two main challenges. First, the solution of a CO problem often behaves as a piecewise constant function of its objective parameters. Given that ML pipelines are typically trained using stochastic gradient descent, the absence of slope information is very detrimental. Second, standard ML losses do not work well in combinatorial settings. A growing body of research addresses these challenges through diverse methods. Unfortunately, the lack of well-maintained implementations slows down the adoption of CO layers. In this paper, building upon previous works, we introduce a probabilistic perspective on CO layers, which lends itself naturally to approximate differentiation and the construction of structured losses. We recover many approaches from the literature as special cases, and we also derive new ones. Based on this unifying perspective, we present InferOpt.jl, an open-source Julia package that 1) allows turning any CO oracle with a linear objective into a differentiable layer, and 2) defines adequate losses to train pipelines containing such layers. Our library works with arbitrary optimization algorithms, and it is fully compatible with Julia’s ML ecosystem. We demonstrate its abilities using a pathfinding problem on video game maps as a guiding example, as well as three other applications from operations research.
@online{dalleLearningCombinatorialOptimization2022,
  title = {Learning with {{Combinatorial Optimization Layers}}: A {{Probabilistic Approach}}},
  shorttitle = {Learning with {{Combinatorial Optimization Layers}}},
  author = {Dalle, Guillaume and Baty, Léo and Bouvier, Louis and Parmentier, Axel},
  date = {2022-12-03},
  eprint = {2207.13513},
  eprinttype = {arxiv},
  eprintclass = {cs, math, stat},
  doi = {10.48550/arXiv.2207.13513},
  url = {http://arxiv.org/abs/2207.13513},
  urldate = {2023-01-26},
  pubstate = {preprint},
  note = {Submitted to the Journal of Machine Learning Research}
}
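To give a concrete feel for the probabilistic perspective described above, here is a minimal Julia sketch of the perturbation idea: a piecewise-constant CO oracle is smoothed by averaging it over Gaussian perturbations of its objective, which also yields the gradient of a Fenchel-Young loss (Berthet et al., 2020). This is only an illustration, not the actual API of InferOpt.jl; the names one_hot_argmax, perturbed_maximizer and fenchel_young_gradient are made up for the example.

using Random, Statistics

# Toy CO oracle: returns the simplex vertex maximizing the linear objective θᵀy.
# Any linear-objective solver (shortest path, matching, ...) could play this role.
function one_hot_argmax(θ::AbstractVector)
    y = zero(θ)
    y[argmax(θ)] = 1
    return y
end

# Perturbed maximizer ŷ_ε(θ) = E[oracle(θ + εZ)] with Z ~ N(0, I),
# estimated from M Monte Carlo samples; smooth in θ, unlike the raw oracle.
function perturbed_maximizer(oracle, θ; ε=1.0, M=100, rng=Random.default_rng())
    return mean(oracle(θ .+ ε .* randn(rng, length(θ))) for _ in 1:M)
end

# Gradient of the associated Fenchel-Young loss w.r.t. θ for a target solution ȳ:
# ∇_θ L(θ, ȳ) = ŷ_ε(θ) - ȳ (Berthet et al., 2020).
fenchel_young_gradient(oracle, θ, ȳ; kwargs...) =
    perturbed_maximizer(oracle, θ; kwargs...) .- ȳ

θ = [0.2, 1.0, -0.5]   # objective parameters predicted by an ML model
ȳ = [0.0, 0.0, 1.0]    # target combinatorial solution
g = fenchel_young_gradient(one_hot_argmax, θ, ȳ; ε=0.5, M=1_000)

Swapping one_hot_argmax for any solver with a linear objective turns it into a layer that standard stochastic gradient descent can train end to end.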
Dissertation
Machine Learning and Combinatorial Optimization Algorithms, with Applications to Railway Planning
This thesis investigates the frontier between machine learning and combinatorial optimization, two active areas of applied mathematics research. We combine theoretical insights with efficient algorithms, and develop several open source Julia libraries. Inspired by a collaboration with the Société nationale des chemins de fer français (SNCF), we study high-impact use cases from the railway world: train failure prediction, delay propagation, and track allocation.

In Part I, we provide mathematical background and describe software implementations for various tools that will be needed later on: implicit differentiation, temporal point processes, Hidden Markov Models and Multi-Agent Path Finding. Our publicly-available code fills a void in the Julia package ecosystem, aiming at ease of use without compromising on performance.

In Part II, we highlight theoretical contributions related to both statistics and decision-making. We consider a Vector AutoRegressive process with partial observations, and prove matching upper and lower bounds on the estimation error. We unify and extend the state of the art for combinatorial optimization layers in deep learning, gathering various approaches in a Julia library called InferOpt.jl. We also seek to differentiate through multi-objective optimization layers, which leads to a novel theory of lexicographic convex analysis.

In Part III, these mathematical and algorithmic foundations come together to tackle railway problems. We design a hierarchical model of train failures, propose a graph-based framework for delay propagation, and suggest new avenues for track allocation, with the Flatland challenge as a testing ground.
@thesis{dalleMachineLearningCombinatorial2022,
  type = {phdthesis},
  title = {Machine Learning and Combinatorial Optimization Algorithms, with Applications to Railway Planning},
  author = {Dalle, Guillaume},
  editora = {Meunier, Frédéric and De Castro, Yohann and Parmentier, Axel},
  editoratype = {collaborator},
  date = {2022-12-16},
  institution = {{École des Ponts ParisTech}},
  url = {https://www.theses.fr/2022ENPC0047},
  urldate = {2023-03-31},
  langid = {english}
}
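Part I of the thesis relies on implicit differentiation; the following minimal Julia sketch shows the underlying idea, namely differentiating a solution defined by optimality conditions via the implicit function theorem. It is a generic illustration rather than the API of any specific package; the toy problem and the names F, solve and implicit_jacobian are chosen for the example, with ForwardDiff computing the partial Jacobians.

using ForwardDiff

# Optimality conditions F(x, θ) = 0 that define x(θ) implicitly.
# Toy example: x(θ) minimizes ‖x - θ‖² + ‖x‖², whose first-order condition
# is 2(x - θ) + 2x = 0, i.e. x(θ) = θ / 2.
F(x, θ) = 2 .* (x .- θ) .+ 2 .* x

# Forward pass: trivial here, but in general it could be any black-box solver.
solve(θ) = θ ./ 2

# Implicit function theorem: ∂x/∂θ = -(∂F/∂x)⁻¹ (∂F/∂θ).
function implicit_jacobian(θ)
    x = solve(θ)
    ∂F∂x = ForwardDiff.jacobian(x̃ -> F(x̃, θ), x)
    ∂F∂θ = ForwardDiff.jacobian(θ̃ -> F(x, θ̃), θ)
    return -(∂F∂x \ ∂F∂θ)
end

θ = [1.0, -2.0, 3.0]
J = implicit_jacobian(θ)   # ≈ 0.5 * I, consistent with x(θ) = θ / 2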