
Slater's condition

Not to be confused with Slater determinant.

In mathematics, Slater's condition (or Slater condition), named after Morton L. Slater, is a sufficient condition for strong duality to hold for a convex optimization problem. Informally, Slater's condition states that the feasible region must have an interior point (see technical details below).

Slater's condition is a specific example of a constraint qualification. In particular, if Slater's condition holds for the primal problem, then the duality gap is 0, and if the dual value is finite then it is attained.
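
As an illustration of what can go wrong without such a condition, consider minimizing $e^{-x}$ subject to $x^{2}/y \le 0$ over the domain $\{(x, y) : y > 0\}$. Every feasible point has $x = 0$, so the optimal value is $1$, while the dual function is identically $0$ and the optimal dual value is $0$: the duality gap is $1$. Slater's condition fails here because $x^{2}/y$ can never be made strictly negative.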

Formulation

Let $f_1, \ldots, f_m$ be real-valued functions on some subset $D$ of $\mathbb{R}^n$. We say that the functions satisfy the Slater condition if there exists some $x$ in the relative interior of $D$ for which $f_i(x) < 0$ for all $i$ in $1, \ldots, m$. We say that the functions satisfy the relaxed Slater condition if:

  • Some $k$ functions (say $f_1, \ldots, f_k$) are affine;
  • There exists $x \in \operatorname{relint} D$ such that $f_i(x) \le 0$ for all $i = 1, \ldots, k$, and $f_i(x) < 0$ for all $i = k + 1, \ldots, m$.
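
A minimal numerical sketch of this check (the helper name and the example constraints are illustrative, and the sketch assumes the common domain $D$ is all of $\mathbb{R}^n$, so the relative-interior requirement on $x$ is automatic):

import numpy as np

def satisfies_relaxed_slater(x, affine_fns, strict_fns, tol=1e-9):
    """Check the relaxed Slater condition at a candidate point x.

    affine_fns : the affine constraints f_1, ..., f_k; they only need to
                 satisfy f_i(x) <= 0.
    strict_fns : the remaining convex constraints f_{k+1}, ..., f_m; they
                 must satisfy f_i(x) < 0 strictly.
    Assumes the common domain D is all of R^n, so x automatically lies in
    relint(D).
    """
    x = np.asarray(x, dtype=float)
    affine_ok = all(f(x) <= tol for f in affine_fns)
    strict_ok = all(f(x) < -tol for f in strict_fns)
    return affine_ok and strict_ok

# Example: f_1(x) = x_1 + x_2 - 1 (affine), f_2(x) = x_1^2 + x_2^2 - 4 (convex).
# The point (0.2, 0.3) certifies the relaxed Slater condition for this pair.
print(satisfies_relaxed_slater([0.2, 0.3],
                               affine_fns=[lambda x: x[0] + x[1] - 1],
                               strict_fns=[lambda x: x[0]**2 + x[1]**2 - 4]))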

Application to convex optimization

Consider the optimization problem

Minimize $f_0(x)$
subject to:
$f_i(x) \le 0, \quad i = 1, \ldots, m$
$Ax = b$

where $f_0, \ldots, f_m$ are convex functions. This is an instance of convex programming. Slater's condition for convex programming states that there exists an $x^*$ that is strictly feasible, that is, all $m$ constraints are satisfied and the nonlinear constraints are satisfied with strict inequalities.

If a convex program satisfies Slater's condition (or the relaxed condition), and it is bounded from below, then strong duality holds. Mathematically, this states that strong duality holds if there exists an $x^* \in \operatorname{relint}(D)$ (where $\operatorname{relint}$ denotes the relative interior of the convex set $D := \bigcap_{i=0}^{m} \operatorname{dom}(f_i)$) such that

$f_i(x^*) < 0, \quad i = 1, \ldots, m$ (the convex, nonlinear constraints), and
$Ax^* = b.$
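
For example, consider minimizing $f_0(x) = x^2$ subject to the single constraint $f_1(x) = x + 1 \le 0$. The point $x = -2$ gives $f_1(-2) = -1 < 0$, so Slater's condition holds. The Lagrangian is $L(x, \lambda) = x^2 + \lambda(x + 1)$, so the dual function is $g(\lambda) = \lambda - \lambda^2/4$ for $\lambda \ge 0$, which is maximized at $\lambda = 2$ with value $1$. The primal optimum is attained at $x^* = -1$, also with value $1$, so the duality gap is zero and the dual optimum is attained, as the theorem guarantees.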

Generalized inequalities

Given the problem

Minimize $f_0(x)$
subject to:
$f_i(x) \le_{K_i} 0, \quad i = 1, \ldots, m$
$Ax = b$

where $f_0$ is convex and $f_i$ is $K_i$-convex for each $i$, Slater's condition says that if there exists an $x^* \in \operatorname{relint}(D)$ such that

$f_i(x^*) <_{K_i} 0, \quad i = 1, \ldots, m$, and
$Ax^* = b,$

then strong duality holds.
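
As a concrete instance, take each $K_i$ to be the cone of positive semidefinite matrices, so the constraints are linear matrix inequalities and strict feasibility means $f_i(x^*)$ is negative definite. A minimal sketch of that check (the helper name and example data are illustrative, assuming an affine matrix-valued constraint):

import numpy as np

def strictly_feasible_psd(F0, Fs, x, tol=1e-9):
    """Check strict feasibility of the LMI constraint
    f(x) = F0 + x_1*F_1 + ... + x_n*F_n <=_K 0 with respect to the
    positive semidefinite cone K: f(x) <_K 0 means f(x) is negative
    definite, i.e. its largest eigenvalue is strictly below zero.
    """
    Fx = F0 + sum(xi * Fi for xi, Fi in zip(x, Fs))
    return float(np.max(np.linalg.eigvalsh(Fx))) < -tol

# Example: F0 = -I, F1 = diag(1, -1); x = (0.5,) gives f(x) = diag(-0.5, -1.5),
# which is negative definite, so this point certifies Slater's condition.
F0 = -np.eye(2)
F1 = np.diag([1.0, -1.0])
print(strictly_feasible_psd(F0, [F1], [0.5]))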
