A parallelizable augmented Lagrangian method applied to large-scale non-convex-constrained optimization problems

Boland, N, Christiansen, J, Dandurand, B, Eberhard, A and Oliveira, F 2019, 'A parallelizable augmented Lagrangian method applied to large-scale non-convex-constrained optimization problems', Mathematical Programming, Series A, vol. 175, no. 1-2, pp. 503-536.


Document type: Journal Article
Collection: Journal Articles

Attached Files
n2006082898.pdf — Accepted Manuscript (application/pdf, 494.96 KB)
Title A parallelizable augmented Lagrangian method applied to large-scale non-convex-constrained optimization problems
Author(s) Boland, N
Christiansen, J
Dandurand, B
Eberhard, A
Oliveira, F
Year 2019
Journal name Mathematical Programming, Series A
Volume number 175
Issue number 1-2
Start page 503
End page 536
Total pages 34
Publisher Springer
Abstract We contribute improvements to a Lagrangian dual solution approach applied to large-scale optimization problems whose objective functions are convex, continuously differentiable and possibly nonlinear, while the non-relaxed constraint set is compact but not necessarily convex. Such problems arise, for example, in the split-variable deterministic reformulation of stochastic mixed-integer optimization problems. We adapt the augmented Lagrangian method framework to address the presence of nonconvexity in the non-relaxed constraint set and to enable efficient parallelization. The development of our approach is most naturally compared with the development of proximal bundle methods and especially with their use of serious step conditions. However, deviations from these developments allow for an improvement in efficiency with which parallelization can be utilized. Pivotal in our modification to the augmented Lagrangian method is an integration of the simplicial decomposition method and the nonlinear block Gauss-Seidel method. An adaptation of a serious step condition associated with proximal bundle methods allows for the approximation tolerance to be automatically adjusted. Under mild conditions optimal dual convergence is proven, and we report computational results on test instances from the stochastic optimization literature. We demonstrate improvement in parallel speedup over a baseline parallel approach.
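To make the method class concrete: the abstract describes relaxing coupling constraints via an augmented Lagrangian and minimizing over variable blocks one at a time (nonlinear block Gauss-Seidel), with a multiplier ascent step on the relaxed constraints. Below is a minimal, hypothetical sketch of that idea on a toy split-variable problem; it is not the authors' algorithm (which adds simplicial decomposition and a serious step condition), and all names and the problem data are illustrative assumptions.

```python
# Hedged illustration of the augmented Lagrangian + block Gauss-Seidel idea
# (NOT the paper's method) on a toy split-variable problem:
#   min (x1 - 1)^2 + (x2 + 3)^2   subject to the coupling constraint x1 - x2 = 0
# Augmented Lagrangian:
#   L_rho(x1, x2, lam) = f1(x1) + f2(x2) + lam*(x1 - x2) + (rho/2)*(x1 - x2)^2

def augmented_lagrangian_demo(rho=10.0, iters=200):
    x1, x2, lam = 0.0, 0.0, 0.0
    for _ in range(iters):
        # Block 1: minimize L_rho over x1 with x2 fixed (closed form, quadratic):
        #   2*(x1 - 1) + lam + rho*(x1 - x2) = 0
        x1 = (2.0 - lam + rho * x2) / (2.0 + rho)
        # Block 2: minimize L_rho over x2 with x1 fixed:
        #   2*(x2 + 3) - lam - rho*(x1 - x2) = 0
        x2 = (-6.0 + lam + rho * x1) / (2.0 + rho)
        # Dual update: ascent step on the multiplier of the relaxed constraint
        lam += rho * (x1 - x2)
    return x1, x2, lam

x1, x2, lam = augmented_lagrangian_demo()
# Converges to x1 = x2 = -1 (the coupled optimum) with multiplier lam = 4.
```

In the paper's setting, each block minimization is itself a subproblem over a nonconvex compact set (handled there via simplicial decomposition), and the blocks can be solved in parallel; this sketch only shows the outer primal-block / dual-update structure.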
Subject Optimisation
Operations Research
Numerical Analysis
Keyword(s) Augmented Lagrangian method
Proximal bundle method
Nonlinear block Gauss-Seidel method
Simplicial decomposition method
Parallel computing
DOI - identifier 10.1007/s10107-018-1253-9
Copyright notice © Springer-Verlag GmbH Germany, part of Springer Nature and Mathematical Optimization Society 2018
ISSN 0025-5610
Additional Notes This is a post-peer-review, pre-copyedit version of an article published in Mathematical Programming. The final authenticated version is available online at: https://doi.org/10.1007/s10107-018-1253-9
Citation counts: Cited 2 times in Thomson Reuters Web of Science; cited 0 times in Scopus.
Access Statistics: 16 Abstract Views, 20 File Downloads
Created: Mon, 29 Apr 2019, 13:04:00 EST by Catalyst Administrator
© 2014 RMIT Research Repository • Powered by Fez Software