A faster interior-point method for sum-of-squares optimization

Jiang, S., Natura, B., & Weinstein, O. (2022). A faster interior-point method for sum-of-squares optimization. In Bojańczyk, M., Merelli, E., & Woodruff, D. P. (Eds.), 49th EATCS International Conference on Automata, Languages, and Programming (ICALP 2022). Schloss Dagstuhl – Leibniz-Zentrum für Informatik. https://doi.org/10.4230/LIPIcs.ICALP.2022.79

We present a faster interior-point method for optimizing sum-of-squares (SOS) polynomials, which are a central tool in polynomial optimization and capture convex programming in the Lasserre hierarchy. Let p = ∑_i q_i^2 be an n-variate SOS polynomial of degree 2d. Denoting by L := \binom{n+d}{d} and U := \binom{n+2d}{2d} the dimensions of the vector spaces in which the q_i's and p live, respectively, our algorithm runs in time Õ(L·U^{1.87}). This is polynomially faster than state-of-the-art SOS and semidefinite programming solvers [16, 15, 27], which achieve runtime Õ(L^{0.5}·min{U^{2.37}, L^{4.24}}). The centerpiece of our algorithm is a dynamic data structure for maintaining the inverse of the Hessian of the SOS barrier function under the polynomial interpolant basis [27], which efficiently extends to multivariate SOS optimization, and requires maintaining spectral approximations to low-rank perturbations of elementwise (Hadamard) products. This is the main challenge and departure from recent IPM breakthroughs using inverse-maintenance, where low-rank updates to the slack matrix readily imply the same for the Hessian matrix.
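For intuition, here is a minimal Python sketch (not the paper's algorithm or data structure) of two standard ingredients the abstract refers to: the dimensions L and U, and the classical Gram-matrix view of SOS, in which p is SOS exactly when p(x) = v(x)^T Q v(x) for some PSD matrix Q, with v(x) the vector of the L monomials of degree at most d. The closing Sherman-Morrison step only illustrates the general flavor of low-rank inverse maintenance; the specific polynomial, matrix H, and vector u are illustrative assumptions.

```python
import math
import numpy as np

# Dimensions from the abstract: the q_i's live in a space of dimension
# L = C(n+d, d) and p lives in a space of dimension U = C(n+2d, 2d).
def sos_dims(n: int, d: int) -> tuple[int, int]:
    return math.comb(n + d, d), math.comb(n + 2 * d, 2 * d)

print(sos_dims(1, 2))  # (3, 5): the univariate toy example below
print(sos_dims(3, 2))  # (10, 35)

# Gram-matrix view: p is SOS iff p(x) = v(x)^T Q v(x) for some PSD Q,
# where v(x) collects the L monomials of degree <= d.
# Toy example (n = 1, d = 2): p(x) = x^4 + 2x^2 + 1 = (x^2 + 1)^2,
# with v(x) = (1, x, x^2).
Q = np.array([[1.0, 0.0, 1.0],
              [0.0, 0.0, 0.0],
              [1.0, 0.0, 1.0]])  # v^T Q v = 1 + 2x^2 + x^4
eigvals = np.linalg.eigvalsh(Q)
assert eigvals.min() >= -1e-9  # Q is PSD, certifying that p is SOS

# Flavor of inverse maintenance: after a rank-1 update H -> H + u u^T,
# the Sherman-Morrison formula refreshes H^{-1} without refactorizing.
H = Q + 2.0 * np.eye(3)  # an arbitrary invertible matrix with known inverse
H_inv = np.linalg.inv(H)
u = np.array([1.0, 2.0, 0.5])
H_inv_new = H_inv - np.outer(H_inv @ u, u @ H_inv) / (1.0 + u @ H_inv @ u)
assert np.allclose(H_inv_new, np.linalg.inv(H + np.outer(u, u)))
```

In the paper's setting the update structure is harder than this sketch suggests: the Hessian involves Hadamard products, so low-rank changes to the underlying matrices do not directly translate into low-rank changes to the Hessian, which is the departure from prior inverse-maintenance IPMs noted above.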

Published version, licensed under Creative Commons Attribution 4.0.
