Convex relaxations of integral variational problems: pointwise dual relaxation and sum-of-squares optimization

Abstract

We present a method for finding lower bounds on the global infima of integral variational problems, wherein ∫_Ω f(x, u(x), ∇u(x)) dx is minimized over functions u : Ω ⊂ R^n → R^m satisfying given equality or inequality constraints. Each constraint may be imposed over Ω or its boundary, either pointwise or in an integral sense. These global minimizations are generally non-convex and intractable. We formulate a particular convex maximization, here called the pointwise dual relaxation (PDR), whose supremum is a lower bound on the infimum of the original problem. The PDR can be derived by dualizing and relaxing the original problem; its constraints are pointwise equalities or inequalities over finite-dimensional sets, rather than over infinite-dimensional function spaces. When the original minimization can be specified by polynomial functions of (x, u, ∇u), the PDR can be further relaxed by replacing pointwise inequalities with polynomial sum-of-squares (SOS) conditions. The resulting SOS program is computationally tractable when the dimensions m, n and the number of constraints are not too large. The framework presented here generalizes an approach of Valmorbida, Ahmadi, and Papachristodoulou (IEEE Trans. Automat. Contr., 61:1649–1654, 2016). We prove that the optimal lower bound given by the PDR is sharp for several classes of problems, whose special cases include leading eigenvalues of Sturm–Liouville problems and optimal constants of Poincaré inequalities. For these same classes, we prove that SOS relaxations of the PDR converge to the sharp lower bound as polynomial degrees are increased. Convergence of SOS computations in practice is illustrated for several examples.

Publication
arXiv