Neur2RO: Neural Two-Stage Robust Optimization

Preprint (2025)
Author(s)

Justin Dumouchelle (University of Toronto)

E.A.T. Julien (TU Delft - Discrete Mathematics and Optimization)

Jannis Kurtz (Universiteit van Amsterdam)

Elias B. Khalil (University of Toronto)

Research Group
Discrete Mathematics and Optimization
Publication Year
2025
Language
English

Abstract

Robust optimization provides a mathematical framework for modeling and solving decision-making problems under worst-case uncertainty. This work addresses two-stage robust optimization (2RO) problems (also called adjustable robust optimization), wherein first-stage and second-stage decisions are made before and after uncertainty is realized, respectively. This results in a nested min-max-min optimization problem which is extremely challenging computationally, especially when the decisions are discrete. We propose Neur2RO, an efficient machine-learning-driven instantiation of column-and-constraint generation (CCG), a classical iterative algorithm for 2RO. Specifically, we learn to estimate the value function of the second-stage problem via a novel neural network architecture that is easy to optimize over by design. Embedding our neural network into CCG yields high-quality solutions quickly, as evidenced by experiments on two 2RO benchmarks, knapsack and capital budgeting. For knapsack, Neur2RO finds solutions that are within roughly 2% of the best-known values in a few seconds, compared to the three hours required by the state-of-the-art exact branch-and-price algorithm; for larger and more complex instances, Neur2RO finds even better solutions. For capital budgeting, Neur2RO outperforms three variants of the k-adaptability algorithm, particularly on the largest instances, with a substantial reduction in solution time. Our code and data are available at https://github.com/khalil-research/Neur2RO.
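As a brief aid to readers, the nested min-max-min structure mentioned in the abstract can be written, in one common generic notation that is not taken from the paper itself, as

\min_{x \in \mathcal{X}} \; \max_{\xi \in \Xi} \; \min_{y \in \mathcal{Y}(x, \xi)} \; c^{\top} x + q(\xi)^{\top} y,

where x collects the first-stage (here-and-now) decisions, \xi is the uncertain parameter ranging over the uncertainty set \Xi, and y collects the second-stage (wait-and-see) decisions, which may adapt to both x and the realized \xi. Neur2RO learns an estimate of the innermost minimization, the second-stage value function Q(x, \xi) = \min_{y \in \mathcal{Y}(x, \xi)} q(\xi)^{\top} y, and embeds that estimate into CCG.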
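The overall loop described in the abstract, column-and-constraint generation with a learned surrogate standing in for the exact second-stage problem, can be illustrated by the minimal sketch below. Everything in it is a hypothetical, brute-force stand-in chosen for readability: toy_value plays the role of the trained value-function estimate, and the candidate and uncertainty sets are tiny finite lists, whereas the actual method optimizes over the embedded neural network rather than enumerating candidates.

# Minimal, self-contained sketch of a column-and-constraint-generation (CCG)
# loop in which the second-stage value function is replaced by a surrogate
# (standing in for the trained neural network of Neur2RO). All names below
# (toy_value, first_stage_candidates, uncertainty_set) are illustrative
# placeholders, not the paper's actual implementation.

import itertools

def toy_value(x, xi):
    """Placeholder for the learned estimate of the objective c^T x + Q(x, xi)."""
    return sum(xi_i * (1 - x_i) for x_i, xi_i in zip(x, xi))

# Small discrete candidate sets so that both subproblems can be brute-forced.
first_stage_candidates = [x for x in itertools.product([0, 1], repeat=4) if sum(x) <= 2]
uncertainty_set = [(1, 0, 2, 0), (0, 2, 0, 1), (1, 1, 1, 1), (2, 0, 0, 2)]

def ccg(tol=1e-6, max_iters=50):
    scenarios = [uncertainty_set[0]]          # initial scenario subset
    for _ in range(max_iters):
        # Master problem: best first-stage decision against the scenarios seen so far.
        x_star, lb = min(
            ((x, max(toy_value(x, xi) for xi in scenarios)) for x in first_stage_candidates),
            key=lambda pair: pair[1],
        )
        # Adversarial subproblem: worst-case scenario for the incumbent x_star.
        xi_star, ub = max(
            ((xi, toy_value(x_star, xi)) for xi in uncertainty_set),
            key=lambda pair: pair[1],
        )
        if ub <= lb + tol:                    # no scenario worsens the incumbent: done
            return x_star, ub
        scenarios.append(xi_star)             # otherwise, generate the new scenario and repeat
    return x_star, ub

if __name__ == "__main__":
    x_best, worst_case = ccg()
    print("first-stage decision:", x_best, "worst-case objective:", worst_case)

The loop stops once the worst-case scenario for the incumbent first-stage decision no longer exceeds the master problem's objective, which is the standard CCG termination criterion.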

Metadata-only record. There are no files for this record.