Structure exploitation

How to make the Moment-SOS hierarchy more tractable?


On the practical side, there is no free lunch: polynomial optimization methods usually suffer from severe scalability issues.
The underlying reason is that, for optimization problems involving polynomials in n variables of degree at most d, the size of the matrices involved in the Moment-SOS hierarchy of convex relaxations is proportional to n^d (at fixed d).
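
To make this scaling concrete, the side length of the order-d moment matrix equals the number of monomials in n variables of degree at most d, namely the binomial coefficient C(n+d, d), which grows like n^d for fixed d. The following minimal Python sketch (our illustrative names, not the team's code) tabulates this growth:

    from math import comb

    def moment_matrix_size(n, d):
        # Monomials of degree <= d in n variables index the rows and
        # columns of the order-d moment matrix.
        return comb(n + d, d)

    # At fixed d the size grows like n**d, e.g. quadratically for d = 2:
    for n in (10, 20, 40, 80):
        print(n, moment_matrix_size(n, 2))  # 66, 231, 861, 3321
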
The team has invested substantial recent effort in tackling these issues.
Fortunately, for many applications, one can look the problem in the eye and exploit the inherent structure of the cost and constraints describing it, in particular sparsity or symmetries:

  • correlative sparsity, occurring when there are few correlations between the variables of the input polynomials (see the sketch after this list);
  • term sparsity, occurring when the input polynomials involve only a small number of terms in comparison with the fully dense case;
  • invariance of all input polynomials under the action of a subgroup of the general linear group.
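
As an illustration of correlative sparsity, consider a hypothetical chained objective f = f_1(x_1, x_2) + f_2(x_2, x_3) + ... in which each term couples only two consecutive variables. The correlative sparsity graph then decomposes into cliques of two variables, each of which carries its own small moment matrix; the toy Python sketch below (our assumed example, not an actual solver) compares the resulting matrix sizes:

    from math import comb

    n, d = 50, 2
    # Chained coupling: the maximal cliques of the correlative sparsity
    # graph are the consecutive pairs {x_i, x_(i+1)}.
    cliques = [{i, i + 1} for i in range(n - 1)]

    dense = comb(n + d, d)                              # one dense moment matrix
    sparse = max(comb(len(c) + d, d) for c in cliques)  # largest clique matrix
    print(dense, sparse)  # 1326 versus 6: the dense relaxation is far larger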

In all cases, these properties of the original optimization problem induce specific sparsity/symmetry structures on the matrices arising in the convex relaxations, as the sketch below illustrates for a simple permutation symmetry.
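
For instance, a matrix that commutes with every coordinate permutation necessarily has the form a*I + b*(J - I), where J is the all-ones matrix, and a symmetry-adapted basis block-diagonalizes it into one 1x1 block plus one eigenvalue repeated n - 1 times. A toy numpy sketch under these assumptions (not the team's relaxation code):

    import numpy as np

    n, a, b = 5, 3.0, 1.0
    # Any matrix invariant under all coordinate permutations: a*I + b*(J - I).
    M = a * np.eye(n) + b * (np.ones((n, n)) - np.eye(n))

    # Symmetry-adapted orthonormal basis: the normalized all-ones vector,
    # completed to an orthonormal basis of its complement via QR.
    cols = np.hstack([np.ones((n, 1)) / np.sqrt(n), np.random.randn(n, n - 1)])
    Q = np.linalg.qr(cols)[0]

    # In this basis M becomes diagonal: a + (n-1)*b once, then a - b
    # repeated n - 1 times -- the block structure the SDP solver can exploit.
    print(np.round(Q.T @ M @ Q, 8))
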
Such schemes have been applied to important problems from various fields, including roundoff error bounds (computer arithmetic), robustness of deep networks (machine learning), entanglement (quantum information), AC optimal power flow (energy networks), and sphere packing (discrete geometry).
The team also aims to extend these frameworks to noncommutative optimization, set approximation, and the analysis of dynamical systems.