Published on Sun Apr 25 2021

DC3: A learning method for optimization with hard constraints

Priya L. Donti, David Rolnick, J. Zico Kolter

Deep Constraint Completion and Correction (DC3) is an algorithm for enforcing the hard constraints of optimization problems within deep learning. It enforces feasibility via a differentiable procedure and achieves near-optimal objective values while preserving feasibility.

Abstract

Large optimization problems with hard constraints arise in many settings, yet classical solvers are often prohibitively slow, motivating the use of deep networks as cheap "approximate solvers." Unfortunately, naive deep learning approaches typically cannot enforce the hard constraints of such problems, leading to infeasible solutions. In this work, we present Deep Constraint Completion and Correction (DC3), an algorithm to address this challenge. Specifically, this method enforces feasibility via a differentiable procedure, which implicitly completes partial solutions to satisfy equality constraints and unrolls gradient-based corrections to satisfy inequality constraints. We demonstrate the effectiveness of DC3 in both synthetic optimization tasks and the real-world setting of AC optimal power flow, where hard constraints encode the physics of the electrical grid. In both cases, DC3 achieves near-optimal objective values while preserving feasibility.
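The completion-and-correction procedure is the heart of the method. Below is a minimal PyTorch-style sketch of the idea on a toy linearly constrained problem; the variable partition, step sizes, and function names (`complete`, `correct`) are illustrative assumptions, not the paper's released code.

```python
import torch

# Toy problem: min_y ||y||^2  s.t.  A y = b,  G y <= h.
# DC3 idea: the network predicts only a partial solution z; equality
# constraints are "completed" algebraically, inequality violations are
# "corrected" by unrolled gradient steps, and both maps are differentiable.

n, m, k = 10, 4, 6                      # variables, equalities, inequalities
torch.manual_seed(0)
A, b = torch.randn(m, n), torch.randn(m)
G, h = torch.randn(k, n), torch.ones(k)

A_ind, A_dep = A[:, : n - m], A[:, n - m :]  # assume the last m columns are invertible
A_dep_inv = torch.inverse(A_dep)

def complete(z):
    """Completion: fill in the dependent variables so A y = b holds exactly."""
    y_dep = (b - z @ A_ind.T) @ A_dep_inv.T
    return torch.cat([z, y_dep], dim=-1)

def correct(z, steps=5, lr=0.01):
    """Correction: unrolled gradient steps on the inequality violation, taken
    in the space of independent variables so the equalities stay satisfied."""
    for _ in range(steps):
        y = complete(z)
        viol = torch.relu(y @ G.T - h)          # violation of G y <= h
        grad_y = 2 * viol @ G                   # d ||viol||^2 / d y
        # chain rule through the completion: dy_dep/dz = -A_dep_inv @ A_ind
        grad_z = grad_y[..., : n - m] - grad_y[..., n - m :] @ (A_dep_inv @ A_ind)
        z = z - lr * grad_z
    return complete(z)

net = torch.nn.Sequential(torch.nn.Linear(m, 64), torch.nn.ReLU(),
                          torch.nn.Linear(64, n - m))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for _ in range(100):                            # single problem instance for brevity
    y = correct(net(b))                         # (near-)feasible by construction
    loss = (y ** 2).sum()                       # objective only; the paper also adds
    opt.zero_grad(); loss.backward(); opt.step()  # a soft penalty on residual violations
```

The key design choice is that correction steps are taken in the reduced space of independent variables and mapped through the completion, so the equality constraints remain satisfied throughout the unrolled procedure.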

Related Papers

Mon Jan 04 2021
Machine Learning
Learning to Optimize Under Constraints with Unsupervised Deep Neural Networks
The computational complexity of optimization algorithms precludes near-optimal designs in real-time applications. In this paper, we propose an unsupervised deep learning (DL) solution for solving constrained optimization problems in real time.
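For context, a minimal sketch of the unsupervised idea under our own assumptions (the toy objective, constraint, and penalty weight are hypothetical): the network is trained directly on the problem's objective plus a constraint-violation penalty, with no labeled solutions.

```python
import torch

# Unsupervised training: no precomputed solutions are needed; the loss is the
# objective itself plus a penalty on constraint violation.
net = torch.nn.Sequential(torch.nn.Linear(4, 64), torch.nn.ReLU(),
                          torch.nn.Linear(64, 4))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
lam = 10.0                                    # penalty weight (hypothetical)
for _ in range(200):
    p = torch.rand(32, 4)                     # random problem parameters
    x = net(p)                                # candidate solutions
    objective = ((x - p) ** 2).sum(dim=1)     # toy objective
    violation = torch.relu(x.sum(dim=1) - 1)  # toy constraint: sum(x) <= 1
    loss = (objective + lam * violation ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()
```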
Sun Jan 26 2020
Machine Learning
Lagrangian Duality for Constrained Deep Learning
This paper explores the potential of Lagrangian duality for learning applications that feature complex constraints. Such constraints arise in many science and engineering domains, where the task amounts to learning to solve optimization problems. The paper also considers applications where the learning task must enforce constraints on the predictor itself.
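A hedged sketch of what a Lagrangian-dual training loop of this flavor can look like (the toy constraint and step sizes are our assumptions): the constraint enters the loss through a multiplier that is itself updated by dual ascent, rather than through a fixed penalty weight.

```python
import torch

# Lagrangian-dual training: alternate a primal step on the Lagrangian with a
# dual-ascent step on the multiplier.
net = torch.nn.Linear(4, 4)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
lam, rho = torch.zeros(1), 0.1                 # multiplier and its step size
for _ in range(200):
    p = torch.rand(32, 4)
    x = net(p)
    objective = ((x - p) ** 2).sum(dim=1).mean()
    violation = torch.relu(x.sum(dim=1) - 1).mean()   # toy constraint: sum(x) <= 1
    loss = objective + lam.item() * violation  # primal step on the Lagrangian
    opt.zero_grad(); loss.backward(); opt.step()
    lam = torch.clamp(lam + rho * violation.detach(), min=0)  # dual ascent
```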
Mon Jun 29 2020
Machine Learning
High-Fidelity Machine Learning Approximations of Large-Scale Optimal Power Flow
The AC Optimal Power Flow (AC-OPF) problem determines the generator setpoints that meet power demands at minimal cost while satisfying the underlying physical and operational constraints. It is non-convex and NP-hard, and computationally challenging for large-scale power systems.
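In standard notation (not copied from the paper), the AC-OPF problem referenced here and in the DC3 experiments has roughly this form, where the v_i are complex bus voltages, p^g and q^g are generator injections, p^d and q^d are demands, and Y is the bus admittance matrix:

```latex
\begin{aligned}
\min_{p^g,\, q^g,\, v}\quad
& \textstyle\sum_{i \in \mathcal{G}} c_i\!\left(p^g_i\right)
  && \text{(generation cost)} \\
\text{s.t.}\quad
& p^g_i - p^d_i = \textstyle\sum_{j} \Re\!\left\{ v_i\, v_j^{*}\, Y_{ij}^{*} \right\},
  \quad
  q^g_i - q^d_i = \textstyle\sum_{j} \Im\!\left\{ v_i\, v_j^{*}\, Y_{ij}^{*} \right\}
  && \text{(power flow)} \\
& \underline{p}^g_i \le p^g_i \le \overline{p}^g_i,
  \quad \underline{q}^g_i \le q^g_i \le \overline{q}^g_i
  && \text{(generator limits)} \\
& \underline{v}_i \le \lvert v_i \rvert \le \overline{v}_i
  && \text{(voltage limits)}
\end{aligned}
```

In DC3, the nonlinear power-flow equalities are handled by the completion step and the operational limits by the correction step.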
Thu Sep 19 2019
Artificial Intelligence
Predicting AC Optimal Power Flows: Combining Deep Learning and Lagrangian Dual Methods
The Optimal Power Flow (OPF) problem is a fundamental building block for the optimization of electrical power systems. This paper proposes a deep learning approach whose model exploits the information available in the prior states of the system, combined with a Lagrangian dual method to enforce the physical and engineering constraints. The experimental results show that its predictions are highly accurate, with average errors as low as 0.2%.
Mon Mar 22 2021
Artificial Intelligence
DeepOPF-V: Solving AC-OPF Problems Efficiently
Tue Mar 30 2021
Machine Learning
End-to-End Constrained Optimization Learning: A Survey
This paper surveys recent attempts at leveraging machine learning to solve constrained optimization problems. It focuses on work integrating combinatorial solvers and optimization methods with machine learning architectures. These approaches hold the promise of new hybrid machine learning and optimization methods.
Wed Feb 11 2015
Machine Learning
Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift
Batch Normalization allows us to use much higher learning rates and be less careful about initialization. It also acts as a regularizer, in some cases eliminating the need for Dropout. Applied to a state-of-the-art image classification model, Batch Normalization achieves the same accuracy with 14 times fewer training steps.
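As a quick illustration of what the layer computes (a self-contained sketch; inference-time running statistics are omitted, and PyTorch's built-in layer is used only for comparison): activations are normalized per feature over the mini-batch, then scaled and shifted by learned parameters.

```python
import torch

x = torch.randn(32, 8)                        # batch of 32 samples, 8 features
gamma, beta, eps = torch.ones(8), torch.zeros(8), 1e-5
mean, var = x.mean(dim=0), x.var(dim=0, unbiased=False)
y_manual = gamma * (x - mean) / torch.sqrt(var + eps) + beta

bn = torch.nn.BatchNorm1d(8)                  # the built-in layer does the same
y_builtin = bn(x)                             # (fresh layer: gamma=1, beta=0)
print(torch.allclose(y_manual, y_builtin, atol=1e-5))
```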
Mon Oct 28 2019
Machine Learning
Differentiable Convex Optimization Layers
Recent work has shown how to embed differentiable optimization problems as layers within deep learning architectures. We show that every disciplined parametrized program can be represented as the composition of an affine map from parameters to problem data, a solver, and an affine map from the solver's solution to a solution of the original problem. We then demonstrate how to efficiently differentiate through each of these components, allowing for end-to-end analytical differentiation.
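The authors released this as the cvxpylayers library; the following usage sketch mirrors its documented PyTorch interface (the specific problem, a small nonnegative regression, is an arbitrary example):

```python
import cvxpy as cp
import torch
from cvxpylayers.torch import CvxpyLayer

# A disciplined parametrized program embedded as a differentiable layer.
n, m = 2, 3
x = cp.Variable(n)
A = cp.Parameter((m, n))
b = cp.Parameter(m)
problem = cp.Problem(cp.Minimize(cp.pnorm(A @ x - b, p=1)), [x >= 0])
assert problem.is_dpp()                        # required for CvxpyLayer

layer = CvxpyLayer(problem, parameters=[A, b], variables=[x])
A_t = torch.randn(m, n, requires_grad=True)
b_t = torch.randn(m, requires_grad=True)
solution, = layer(A_t, b_t)                    # forward pass solves the problem
solution.sum().backward()                      # backward pass differentiates through it
```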
Mon Jun 10 2019
Artificial Intelligence
Tackling Climate Change with Machine Learning
Machine learning can be a powerful tool in reducing greenhouse gas emissions. We identify high impact problems where existing gaps can be filled by machine learning. We call on the machine learning community to join the global effort against climate change.
Tue Jun 19 2018
Machine Learning
Neural Ordinary Differential Equations
We introduce a new family of deep neural network models. Instead of specifying a discrete sequence of hidden layers, we parameterize the derivative of the hidden state using a neural network. The output of the network is then computed using a black-box differential equation solver.
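A minimal sketch using the authors' torchdiffeq reference implementation (network sizes and integration times are arbitrary): the hidden state evolves under a learned vector field, and the forward pass is an ODE solve.

```python
import torch
from torchdiffeq import odeint

# dz/dt = f(z, t) is a small network; the "layers" are replaced by an ODE solve.
class ODEFunc(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.net = torch.nn.Sequential(torch.nn.Linear(2, 16), torch.nn.Tanh(),
                                       torch.nn.Linear(16, 2))
    def forward(self, t, z):
        return self.net(z)

func = ODEFunc()
z0 = torch.randn(8, 2)                  # batch of initial states
t = torch.linspace(0.0, 1.0, 10)        # integration times
zt = odeint(func, z0, t)                # shape (10, 8, 2): state at each time
zt[-1].sum().backward()                 # gradients through the solver's ops
```

Here gradients flow back through the solver's operations; torchdiffeq's odeint_adjoint provides the constant-memory adjoint alternative described in the paper.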
Wed Feb 24 2016
Machine Learning
Group Equivariant Convolutional Networks
Group equivariant Convolutional Neural Networks (G-CNNs) are a generalization of convolutional neural networks. G-CNNs use G-convolutions, a new type of layer that enjoys a substantially higher degree of weight sharing than regular convolution layers.
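A minimal sketch of a lifting G-convolution for the four-fold rotation group C4 (our own illustration, not the paper's code): one learned filter bank is applied at all four 90° rotations, so rotating the input permutes the output orientation channels instead of producing unrelated features.

```python
import torch
import torch.nn.functional as F

class C4LiftingConv(torch.nn.Module):
    """Apply the same filters at all four 90-degree rotations."""
    def __init__(self, in_ch, out_ch, k=3):
        super().__init__()
        self.weight = torch.nn.Parameter(0.1 * torch.randn(out_ch, in_ch, k, k))
    def forward(self, x):                              # x: (B, in_ch, H, W)
        outs = [F.conv2d(x, torch.rot90(self.weight, r, dims=(2, 3)), padding=1)
                for r in range(4)]                     # shared weights, 4 orientations
        return torch.stack(outs, dim=2)                # (B, out_ch, 4, H, W)

conv = C4LiftingConv(3, 6)
x = torch.randn(1, 3, 8, 8)
y = conv(x)
y_rot = conv(torch.rot90(x, 1, dims=(2, 3)))
# Equivariance: rotating the input rotates the output spatially and
# cyclically shifts the orientation channels.
print(torch.allclose(y_rot, torch.rot90(y.roll(1, dims=2), 1, dims=(3, 4)),
                     atol=1e-5))
```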
Wed Mar 01 2017
Machine Learning
OptNet: Differentiable Optimization as a Layer in Neural Networks
This paper presents OptNet, a network architecture that integrates quadratic programs as individual layers in larger end-to-end trainable deep networks. We show how techniques from sensitivity analysis, bilevel optimization, and implicit differentiation can be used to exactly differentiate through these layers.
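The authors' qpth library exposes this as a differentiable QP solver; below is a usage sketch with our own toy problem data (the constraint matrices are arbitrary, and Q must be positive definite as qpth requires).

```python
import torch
from qpth.qp import QPFunction

n = 4
L = torch.randn(n, n)
Q = (L @ L.T + 0.1 * torch.eye(n)).unsqueeze(0)    # (1, n, n), positive definite
p = torch.randn(1, n, requires_grad=True)          # linear term as the layer input
G, h = torch.randn(1, 2, n), torch.ones(1, 2)      # G x <= h
A, b = torch.randn(1, 1, n), torch.zeros(1, 1)     # A x = b

# Forward pass solves  min_x 1/2 x^T Q x + p^T x  s.t.  G x <= h, A x = b;
# backward pass differentiates the argmin through the KKT conditions.
x_star = QPFunction(verbose=False)(Q, p, G, h, A, b)
x_star.sum().backward()
print(p.grad)                                      # d(sum of x*) / dp
```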