Exploiting structure in nonconvex quadratic optimisation

File: Baltean-Lugojan-R-2019-PhD-Thesis.pdf
Description: Thesis
Size: 5.93 MB
Format: Adobe PDF
Title: Exploiting structure in nonconvex quadratic optimisation
Authors: Baltean-Lugojan, Radu
Item Type: Thesis or dissertation
Abstract: Finding globally optimal solutions to nonconvex quadratic optimisation problems deterministically and at scale requires exploiting problem structure. Moreover, engineering and industrial applications often present combinatorial aspects, where nonconvexity arises from the interplay of nonlinearity and integrality constraints. The structure this interplay introduces is difficult to exploit because the existing mathematical and algorithmic toolboxes for nonlinear-continuous and discrete-polyhedral optimisation differ. This thesis addresses two resulting challenges: commercially relevant pooling problems, which bundle bilinear nonconvexities with topological and polyhedral structure; and semidefinite relaxations (of nonlinear nonconvexity), which are difficult to integrate into polyhedral-based Branch & Cut global solvers.

First, we parametrically study pooling structure and explicitly identify sparsity via dominant active topologies under relaxed flow availability for single-quality instances. We associate sparse active topological patterns with explicit piecewise objective functions, validating a long-held and heuristically successful intuition. We formally derive strongly-polynomial solutions for several single-quality pooling problem subclasses, including some previously studied nonconvex instances. The conditions under which this sparse, strongly-polynomial piecewise structure vanishes due to combinatorial complexity have further implications for pooling relaxations in global solvers.

Second, we develop an effective lightweight linear outer-approximation of semidefinite relaxations, which we show integrates easily into global solvers. Compared to previous work, our proposed cuts are sparser in the number of row nonzeros and are explicitly selected to improve the objective. We explore the relevant engineering trade-offs for sparsity patterns on quadratic programming with box constraints, showing how they may immediately extend to quadratically constrained instances. A neural network estimator is key to selecting which strong cuts to generate using objective structure: ranking each candidate cut by its expected objective improvement would require solving many semidefinite optimisation problems, an expensive proposition at each Branch & Cut node. The estimator, trained a priori, before any target instance is seen, predicts these objective improvements and takes the computation offline, an application of machine learning to cut selection and global optimisation.
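To make the second contribution concrete, below is a minimal numpy sketch (not code from the thesis) of one standard way to derive a sparse linear outer-approximation cut from a semidefinite relaxation of box-constrained quadratic programming: take a small principal submatrix of the current relaxation point, and if it is not positive semidefinite, its most negative eigenvector yields a violated inequality that is linear in the original variables. The function name sparse_psd_cut, the variable names, and the fixed index set support are illustrative assumptions; the thesis's neural-network cut selection is not reproduced here.

```python
import numpy as np

def sparse_psd_cut(x_hat, X_hat, support):
    """Given the current relaxation point (x_hat, X_hat ~ x x^T) and a small
    index set `support`, return (support, v) defining the linear cut
        v^T [[1, x_S^T], [x_S, X_SS]] v >= 0,
    or None if the bordered principal submatrix is already PSD."""
    idx = np.asarray(support)
    n = len(idx)
    # Bordered principal submatrix [[1, x^T], [x, X]] restricted to `support`.
    M = np.empty((n + 1, n + 1))
    M[0, 0] = 1.0
    M[0, 1:] = M[1:, 0] = x_hat[idx]
    M[1:, 1:] = X_hat[np.ix_(idx, idx)]
    # Most negative eigenpair; eigh returns eigenvalues in ascending order.
    eigval, eigvec = np.linalg.eigh(M)
    if eigval[0] >= -1e-9:
        return None  # submatrix (numerically) PSD: no violated cut from this support
    v = eigvec[:, 0]
    # v^T M v = eigval[0] < 0, so the valid inequality v^T [[1, x^T], [x, X]] v >= 0,
    # which is linear in (x, X), cuts off the current relaxation point.
    return idx, v

# Illustrative usage on a synthetic relaxation point:
rng = np.random.default_rng(0)
x_hat = rng.uniform(size=10)
X_hat = np.outer(x_hat, x_hat) - 0.05 * np.eye(10)  # slightly violates X >= x x^T
cut = sparse_psd_cut(x_hat, X_hat, support=[2, 5, 7])
```

In this sketch the eigenvalue only measures how violated the cut is; the thesis instead ranks candidate supports by a learned estimate of the objective improvement each cut would produce, avoiding the semidefinite solves that an exact ranking would require.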
Content Version: Open Access
Issue Date: Oct-2019
Date Awarded: Jan-2020
URI: http://hdl.handle.net/10044/1/79281
DOI: https://doi.org/10.25560/79281
Copyright Statement: Creative Commons Attribution-NonCommercial Licence
Supervisor: Misener, Ruth
Parpas, Panos
Sponsor/Funder: Engineering and Physical Sciences Research Council
IBM
Department: Computing
Publisher: Imperial College London
Qualification Level: Doctoral
Qualification Name: Doctor of Philosophy (PhD)
Appears in Collections: Computing PhD theses