Nonlinear Optimization with GPU-Accelerated Neural Network Constraints
Authors
Robert Parker, Oscar Dowson, Nicole LoGiudice, Manuel Garcia, Russell Bent
Abstract
We propose a reduced-space formulation for optimizing over trained neural networks where the network's outputs and derivatives are evaluated on a GPU. To do this, we treat the neural network as a "gray box" whose intermediate variables and constraints are not exposed to the optimization solver. Compared to the full-space formulation, in which intermediate variables and constraints are exposed to the optimization solver, the reduced-space formulation leads to faster solves and fewer iterations in an interior point method. We demonstrate the benefits of this method on two optimization problems: adversarial generation for a classifier trained on MNIST images, and security-constrained optimal power flow with transient feasibility enforced using a neural network surrogate.
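To make the reduced-space idea concrete, here is a minimal sketch in Python using scipy. The network weights, the target point, and the nonnegativity constraint are illustrative stand-ins, not the paper's actual models; the point is that the solver only ever queries the network's output and its input Jacobian (which the paper evaluates on a GPU), never the per-layer variables and equality constraints a full-space formulation would expose.

```python
import numpy as np
from scipy.optimize import minimize, NonlinearConstraint

# Hypothetical tiny "trained" network: one tanh hidden layer.
# These fixed weights stand in for a real trained model.
W1 = np.array([[1.0, -0.5], [0.3, 0.8]])
b1 = np.array([0.1, -0.2])
W2 = np.array([[0.7, -1.2]])
b2 = np.array([0.05])

def nn(x):
    """Reduced-space evaluation: hidden activations stay internal."""
    h = np.tanh(W1 @ x + b1)
    return W2 @ h + b2

def nn_jac(x):
    """Jacobian of the network output w.r.t. its inputs (chain rule)."""
    z = W1 @ x + b1
    d = 1.0 - np.tanh(z) ** 2        # tanh'(z)
    return W2 @ (d[:, None] * W1)    # W2 @ diag(tanh'(z)) @ W1

# Gray-box constraint: require the surrogate's output to be nonnegative.
# The solver sees only nn(x) and nn_jac(x), not the layer structure.
con = NonlinearConstraint(lambda x: nn(x).ravel(), 0.0, np.inf, jac=nn_jac)

# Minimize distance to a target input subject to the surrogate constraint,
# using scipy's interior-point-style trust-constr solver.
target = np.array([2.0, -1.0])
res = minimize(lambda x: np.sum((x - target) ** 2),
               x0=np.array([0.5, 0.5]),
               jac=lambda x: 2.0 * (x - target),
               constraints=[con], method="trust-constr")
```

A full-space formulation would instead introduce the hidden-layer variables `h` and the equalities `h = tanh(W1 x + b1)` as explicit decision variables and constraints, enlarging the problem the interior point method must factorize at every iteration.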
Paper Summary
Problem
Embedding a trained neural network as a constraint in a nonlinear program is expensive in the standard full-space formulation, where every intermediate layer adds variables and constraints that the optimization solver must handle.
Key Innovation
A reduced-space "gray box" formulation that hides the network's internals from the solver: only the network's outputs and derivatives are required, and these are evaluated on a GPU.
Practical Impact
Compared to the full-space formulation, the reduced-space approach yields faster solves and fewer interior point iterations, demonstrated on adversarial generation for an MNIST classifier and on security-constrained optimal power flow with a neural network surrogate enforcing transient feasibility.
Analogy / Intuitive Explanation
The solver treats the network like a sealed calculator: it can ask "what is the output, and how does it change with the inputs?" without ever opening the box to inspect each layer, so it carries far less bookkeeping through every iteration.
Paper Information
arXiv: 2509.22462v1