A faster interior-point method for sum-of-squares optimization
We present a faster interior-point method for optimizing sum-of-squares (SOS) polynomials, which are a central tool in polynomial optimization and capture convex programming in the Lasserre hierarchy. Let p = ∑_i q_i² be an n-variate SOS polynomial of degree 2d. Denoting by L = (n+d choose d) and U = (n+2d choose 2d) the dimensions of the vector spaces in which the q_i and p live, respectively, our algorithm runs in time Õ(L·U^1.87). This is polynomially faster than state-of-the-art SOS and semidefinite programming solvers [16, 15, 27], which achieve runtime Õ(L^0.5 · min{U^2.37, L^4.24}). The centerpiece of our algorithm is a dynamic data structure for maintaining the inverse of the Hessian of the SOS barrier function under the polynomial interpolant basis [27], which efficiently extends to multivariate SOS optimization, and which requires maintaining spectral approximations to low-rank perturbations of elementwise (Hadamard) products. This is the main challenge and departure from recent IPM breakthroughs using inverse maintenance, where low-rank updates to the slack matrix readily imply the same for the Hessian matrix.
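The two quantities in the runtime bound, and the Hadamard-product structure of the barrier Hessian that the data structure exploits, can be made concrete with a small numerical sketch. The snippet below is illustrative only: it uses the monomial basis and Chebyshev nodes on a univariate instance rather than the well-conditioned interpolant basis of [27], and the choices of n, d, and s are arbitrary. It computes L and U, forms a dual barrier Λ(s) = P·diag(s)·Pᵀ, and checks that the barrier's Hessian is the elementwise (Hadamard) square of Pᵀ Λ(s)⁻¹ P, which is the kind of structure the abstract refers to.

```python
import numpy as np
from math import comb

# Dimensions of the relevant polynomial spaces (univariate, n = 1, for
# concreteness): each q_i has degree d and lives in a space of dimension
# L = C(n+d, d); p = sum_i q_i^2 has degree 2d and lives in a space of
# dimension U = C(n+2d, 2d).
n, d = 1, 3
L = comb(n + d, d)           # here: 4
U = comb(n + 2 * d, 2 * d)   # here: 7

# Interpolation points t_1..t_U and the L x U matrix P whose column u is
# the vector of all L basis polynomials evaluated at t_u. Monomial basis
# and Chebyshev nodes are illustrative choices, not those of [27].
t = np.cos(np.pi * (np.arange(U) + 0.5) / U)   # Chebyshev nodes
P = np.vander(t, N=L, increasing=True).T       # shape (L, U)

# A strictly feasible dual point s > 0 (arbitrary illustrative choice).
s = np.ones(U)

# Dual SOS barrier: F(s) = -log det Lambda(s), where
#   Lambda(s) = sum_u s_u * p(t_u) p(t_u)^T = P diag(s) P^T   (L x L).
Lam = P @ np.diag(s) @ P.T

# Each summand A_u = p(t_u) p(t_u)^T has rank one, so the Hessian entries
#   H_{uv} = tr(Lam^{-1} A_u Lam^{-1} A_v) = (p(t_u)^T Lam^{-1} p(t_v))^2
# make H the elementwise (Hadamard) square of M = P^T Lam^{-1} P.
M = P.T @ np.linalg.solve(Lam, P)              # shape (U, U)
H = M * M                                      # Hadamard square

# Sanity checks: H is a symmetric PSD matrix of size U x U.
assert np.allclose(H, H.T)
assert np.min(np.linalg.eigvalsh(H)) > -1e-9
print(L, U, H.shape)
```

Because each Hessian entry is a square of an entry of M, a low-rank change to Λ(s)⁻¹ perturbs M by a low-rank term, and hence perturbs H through Hadamard products involving low-rank matrices; maintaining spectral approximations of exactly such perturbations is the challenge the abstract highlights.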
| Field | Value |
|---|---|
| Item Type | Chapter |
| Copyright holders | © 2022 The Authors |
| Departments | LSE > Academic Departments > Mathematics |
| DOI | 10.4230/LIPIcs.ICALP.2022.79 |
| Date Deposited | 14 Jul 2022 |
| Acceptance Date | 11 Apr 2022 |
| URI | https://researchonline.lse.ac.uk/id/eprint/115561 |
Explore Further
- https://personal.lse.ac.uk/natura/ (Author)
- https://www.scopus.com/pages/publications/85133487686 (Scopus publication)
- https://drops.dagstuhl.de/opus/ (Official URL)
