Optimization. Philippe Bich (Paris 1 Panthéon-Sorbonne and PSE) Paris, 2016.

Last time

Last time we saw: the implicit function theorem.

Chapter 5: Convexity

Section 1: Convexity of sets. Linear combinations, affine combinations, convex combinations. Two equivalent definitions of a convex set A ⊂ Rn. Examples. Convex hull, unit simplex, simplex. Compactness. Strict convexity. Examples.


Chapter 5: Convexity

Section 2: Convex functions. The C² case: convex (concave) functions from R to R; strictly convex or strictly concave C² functions. General definition (without C²) of convex, concave, strictly convex and strictly concave functions f : A ⊂ Rn → R. Quasi-concavity. Examples. First properties. Properties via the hypograph, the epigraph and upper level sets.


Chapter 5: Convexity

Section 2: Convex functions.
Convexity inequality 1: on the growth rate of f on an interval of R.
Convexity inequality 2: let f be differentiable on a convex subset C of Rn. Then f is convex if and only if ∀(x, y) ∈ C × C, f(y) ≥ f(x) + ⟨∇f(x), y − x⟩.
Convexity inequality 3: let f be differentiable on a convex subset C of Rn. Then f is convex if and only if ∀(x, y) ∈ C × C, ⟨∇f(y) − ∇f(x), y − x⟩ ≥ 0.
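As a quick sanity check (not part of the lecture), convexity inequality 2 can be verified numerically for a simple convex function; the choice f(x) = ‖x‖² and the random test points below are our own.

```python
# Hypothetical sanity check of convexity inequality 2 for f(x) = ||x||^2,
# which is convex, so f(y) >= f(x) + <grad f(x), y - x> must hold everywhere.
import numpy as np

def f(x):
    return float(x @ x)

def grad_f(x):
    return 2.0 * x

rng = np.random.default_rng(0)
holds = True
for _ in range(1000):
    x, y = rng.normal(size=2), rng.normal(size=2)
    # inequality 2 with a small tolerance for floating-point error
    if f(y) < f(x) + float(grad_f(x) @ (y - x)) - 1e-12:
        holds = False
```

For this f the gap in the inequality is exactly ‖y − x‖², consistent with the strict convexity of the squared norm.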


Chapter 5: Convexity

Section 2: Convex functions: regularity. A convex (resp. concave) function on a convex subset C of Rn is continuous on the interior of C. Example: illustrate a possible discontinuity at the boundary.

Chapter 5: Convexity

Section 3: Hessian and convexity. Symmetric real matrices, diagonalization. Negative (or positive) semidefinite matrices: two equivalent definitions (with eigenvalues or without). Application to the Hessian when n = 2: trace and determinant. Application to the Hessian when n = 3: criterion for positive or negative semidefiniteness.


Chapter 5: Convexity

Section 3: Hessian and convexity: questions.
Is [1 2; 2 1] PSD? NSD? PD? ND?
Is [3 1; 2 4] PSD? NSD? PD? ND?
Is [2 −1 0; −1 2 −1; 0 −1 2] PSD? NSD? PD? ND?
For which b is [2 −1 b; −1 2 −1; b −1 2] PSD? (Solution: b ∈ [−1, 2].)
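These questions can be checked numerically via the eigenvalue characterization (a symmetric matrix is PSD iff all eigenvalues are ≥ 0, PD iff all are > 0); the sketch below, with a tolerance of our own choosing, classifies the symmetric matrices above.

```python
# Rough classifier using the eigenvalue characterization of definiteness;
# the tolerance 1e-12 is an arbitrary choice of ours.
import numpy as np

def classify(M):
    ev = np.linalg.eigvalsh(M)  # eigenvalues of a symmetric matrix, ascending
    return {
        "PSD": bool(np.all(ev >= -1e-12)),
        "PD": bool(np.all(ev > 1e-12)),
        "NSD": bool(np.all(ev <= 1e-12)),
        "ND": bool(np.all(ev < -1e-12)),
    }

res_A = classify(np.array([[1.0, 2.0], [2.0, 1.0]]))  # eigenvalues -1 and 3
res_B = classify(np.array([[2.0, -1.0, 0.0],
                           [-1.0, 2.0, -1.0],
                           [0.0, -1.0, 2.0]]))  # 2 - sqrt(2), 2, 2 + sqrt(2)

def M_b(b):
    return np.array([[2.0, -1.0, b], [-1.0, 2.0, -1.0], [b, -1.0, 2.0]])

inside = classify(M_b(0.5))["PSD"]   # b = 0.5 lies in [-1, 2]
outside = classify(M_b(2.5))["PSD"]  # b = 2.5 does not
```

The first matrix is indefinite (one eigenvalue of each sign), the tridiagonal matrix is positive definite, and the b-scan agrees with the stated solution b ∈ [−1, 2].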


Chapter 5: Convexity

Section 3: Hessian and convexity. Equivalent condition for a C² function f : U ⊂ Rn → R to be convex (U convex open): f is convex if and only if Hess_x(f) is positive semidefinite at every x ∈ U. Equivalent condition for a C² function f : U ⊂ Rn → R to be concave (U convex open): f is concave if and only if Hess_x(f) is negative semidefinite at every x ∈ U.


Chapter 5: Convexity

Section 3: Hessian and convexity. Sufficient condition for a C² function f : U ⊂ Rn → R to be strictly convex (U convex open): if Hess_x(f) is positive definite at every x ∈ U, then f is strictly convex. Sufficient condition for a C² function f : U ⊂ Rn → R to be strictly concave (U convex open): if Hess_x(f) is negative definite at every x ∈ U, then f is strictly concave.


Chapter 5: Convexity

Section 3: Hessian and convexity: questions. Is f(x, y) = 2xy − 2x² − y² − 8x + 6y + 4 convex? Concave? Strictly convex? Strictly concave?
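One way to answer (a sketch of our own): the Hessian of this quadratic function is constant, so a single eigenvalue computation settles all four questions.

```python
# Hessian of f(x, y) = 2xy - 2x^2 - y^2 - 8x + 6y + 4: the second-order
# partials are constant, f_xx = -4, f_xy = f_yx = 2, f_yy = -2.
import numpy as np

H = np.array([[-4.0, 2.0],
              [2.0, -2.0]])
ev = np.linalg.eigvalsh(H)                # -3 - sqrt(5) and -3 + sqrt(5)
negative_definite = bool(np.all(ev < 0))  # True => f is strictly concave
```

Since the constant Hessian is negative definite everywhere, f is strictly concave (hence concave, and neither convex nor strictly convex).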

Chapter 6: Existence of solutions of optimization problems without differentiability assumptions

Section 1: Weierstrass theorem: existence for continuous mappings on a compact set.
Let f : K → R. Assume: i) f is continuous; ii) K ⊂ Rn is compact. Then
(P) max_{x ∈ K} f(x) and (Q) min_{x ∈ K} f(x)
both have at least one solution.
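The theorem can be illustrated numerically (an example of our own, not from the lecture): f(x) = x³ − x is continuous on the compact set K = [−2, 2], so both extrema are attained; a fine grid locates them.

```python
# Grid search for the extrema of the continuous f(x) = x^3 - x on the
# compact interval K = [-2, 2]; the Weierstrass theorem guarantees both
# the maximum and the minimum are attained on K.
import numpy as np

xs = np.linspace(-2.0, 2.0, 40001)  # includes both endpoints exactly
vals = xs**3 - xs
f_max = float(vals.max())  # attained at the boundary point x = 2, value 6
f_min = float(vals.min())  # attained at the boundary point x = -2, value -6
```

Note that on the open interval (−2, 2) the same function would have no maximum, which is why compactness matters.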


Chapter 6: Existence of solutions of optimization problems without differentiability assumptions

Section 2: Weakening of the Weierstrass theorem thanks to coercivity.
Let f : C → R. Assume: i) lim_{‖x‖→+∞} f(x) = +∞ (f is then said to be coercive); ii) C ⊂ Rn is closed. Then
(P) min_{x ∈ C} f(x)
has at least one solution.


Chapter 6: Existence of solutions of optimization problems without differentiability assumptions

Section 2: Weakening of the Weierstrass theorem thanks to coercivity.
Let f : C → R. Assume: i) lim_{‖x‖→+∞} f(x) = −∞ (that is, −f is coercive); ii) C ⊂ Rn is closed. Then
(P) max_{x ∈ C} f(x)
has at least one solution.


Chapter 6: Existence of solutions of optimization problems without differentiability assumptions

Section 2: Weakening of the Weierstrass theorem thanks to coercivity. Exercise: prove that
(P) min_{x ∈ R} (x^{2n} + x^{2n−1} + x² + x + 1)
has at least one solution.
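The exercise can be explored numerically (a sketch of ours, fixing n = 2 so the objective is x⁴ + x³ + x² + x + 1): coercivity confines the minimizer to a bounded interval, where a grid search finds it.

```python
# Coercive polynomial for n = 2: x^4 + x^3 + x^2 + x + 1. The even leading
# power dominates, so g(x) -> +infinity as |x| -> infinity; the search can
# therefore be restricted to a bounded interval.
import numpy as np

def g(x):
    return x**4 + x**3 + x**2 + x + 1.0

xs = np.linspace(-5.0, 5.0, 200001)
i = int(np.argmin(g(xs)))
x_star = float(xs[i])     # approximate minimizer, near -0.61
g_star = float(g(xs[i]))  # approximate minimum value, near 0.67
```

The existence proof in the exercise is exactly the coercivity argument: minimizing over the closed set R reduces to minimizing over a compact sublevel set.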

Chapter 7: Existence of solutions of optimization problems with differentiability assumptions

Section 1: Differentiable optimization problem with no constraints on an open domain: necessary condition.
Let f : U → R, where U ⊂ Rn is open. Assume x̄ ∈ U is a local solution of
(P) min_{x ∈ U} f(x).
Then: i) if f is differentiable at x̄, then ∇f(x̄) = 0, i.e. x̄ is a critical point of f; ii) if f is C² at x̄, then Hess_x̄(f) is positive semidefinite.


Chapter 7: Existence of solutions of optimization problems with differentiability assumptions

Section 1: Differentiable optimization problem (with no constraints) on an open domain: necessary condition.
Let f : U → R, where U ⊂ Rn is open. Assume x̄ ∈ U is a local solution of
(P) max_{x ∈ U} f(x).
Then: i) if f is differentiable at x̄, then ∇f(x̄) = 0, i.e. x̄ is a critical point of f; ii) if f is C² at x̄, then Hess_x̄(f) is negative semidefinite.


Chapter 7: Existence of solutions of optimization problems with differentiability assumptions

Section 2: Differentiable optimization problem with no constraints on an open domain: sufficient condition for a local optimum.
Let f : U → R, where U ⊂ Rn is open. If f is C² at a critical point x̄ of f and Hess_x̄(f) is positive definite, then x̄ is a local solution of
(P) min_{x ∈ U} f(x).

Chapter 7: Existence of solutions of optimization problems with differentiability assumptions

Section 2: Differentiable optimization problem with no constraints on an open domain: sufficient condition for a local optimum.
Let f : U → R, where U ⊂ Rn is open. If f is C² at a critical point x̄ of f and Hess_x̄(f) is negative definite, then x̄ is a local solution of
(P) max_{x ∈ U} f(x).

Chapter 7: Existence of solutions of optimization problems with differentiability assumptions

Section 3: Differentiable optimization problem with no constraints on an open domain: sufficient condition for a global minimum.
Let f : U → R, where U ⊂ Rn is open and convex. If f is convex on U, then any critical point x̄ of f is a global solution of
(P) min_{x ∈ U} f(x).

Chapter 7: Existence of solutions of optimization problems with differentiability assumptions

Section 3: Differentiable optimization problem with no constraints on an open domain: sufficient condition for a global maximum.
Let f : U → R, where U ⊂ Rn is open and convex. If f is concave on U, then any critical point x̄ of f is a global solution of
(P) max_{x ∈ U} f(x).

Chapter 7: Existence of solutions of optimization problems with differentiability assumptions

Section 3: Exercise.
1) Solve (P) max_{(x,y) ∈ R²} x² + xy + y² + y.
2) Solve (Q) min_{(x,y) ∈ R²} x² + xy + y² + y.
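A sketch of the computation for this exercise (our own check): the Hessian of f(x, y) = x² + xy + y² + y is the constant matrix [[2, 1], [1, 2]], which is positive definite, so f is strictly convex; hence (P) has no solution (f is unbounded above), while (Q) is solved by the unique critical point.

```python
# Strict convexity: the eigenvalues of the constant Hessian are 1 and 3 > 0.
# The critical point solves grad f = (2x + y, x + 2y + 1) = 0, i.e. the
# linear system H z = -b with b = (0, 1).
import numpy as np

H = np.array([[2.0, 1.0], [1.0, 2.0]])
pd = bool(np.all(np.linalg.eigvalsh(H) > 0))

b = np.array([0.0, 1.0])
z = np.linalg.solve(H, -b)  # critical point z = (1/3, -2/3)
f_min = float(z[0]**2 + z[0]*z[1] + z[1]**2 + z[1])  # minimum value -1/3
```

By the Section 3 result, the critical point (1/3, −2/3) of the convex f is the global solution of (Q), with value −1/3.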

Chapter 7: Existence of solutions of optimization problems with differentiability assumptions

Section 4: Differentiable optimization problem with constraints on an open domain: necessary conditions for a global optimum.
Let f : U → R be C², where U ⊂ Rn is open. Let f1 : U → R, ..., fm : U → R be C² functions, with m ≤ n. We consider
(P) max f(x) subject to f1(x) = 0, ..., fm(x) = 0.
To be able to write a necessary condition, we need a regularity condition on the constraints.

Chapter 7: Existence of solutions of optimization problems with differentiability assumptions

Section 4: Differentiable optimization problem with constraints on an open domain: necessary conditions for a global optimum.
We say that f1, ..., fm satisfy the constraint qualification at x* ∈ U if: (1) ∇f1(x*), ..., ∇fm(x*) are linearly independent, or (2) all the fi are affine.

Chapter 7: Existence of solutions of optimization problems with differentiability assumptions

Section 4: Differentiable optimization problem with constraints on an open domain: necessary conditions for a global optimum.
Let f : U → R be C², where U ⊂ Rn is open. Let f1 : U → R, ..., fm : U → R be C² functions, with m ≤ n. Assume that x* is a solution of
(P) max f(x) subject to f1(x) = 0, ..., fm(x) = 0,
and that f1, ..., fm satisfy the constraint qualification at x*. Then there exist m real numbers λ1, ..., λm such that ∇f(x*) = λ1∇f1(x*) + ... + λm∇fm(x*).

Chapter 7: Existence of solutions of optimization problems with differentiability assumptions

Section 4: Differentiable optimization problem with constraints on an open domain: necessary conditions for a global optimum.
Let f : U → R be C², where U ⊂ Rn is open. Let f1 : U → R, ..., fm : U → R be C² functions, with m ≤ n. Assume that x* is a solution of
(P) min f(x) subject to f1(x) = 0, ..., fm(x) = 0,
and that f1, ..., fm satisfy the constraint qualification at x*. Then there exist m real numbers λ1, ..., λm such that ∇f(x*) = λ1∇f1(x*) + ... + λm∇fm(x*).

Chapter 7: Existence of solutions of optimization problems with differentiability assumptions

Section 4: Differentiable optimization problem with constraints on an open domain: necessary conditions for a global optimum.
We can write the previous condition as ∇L(x, λ1, ..., λm) = 0, where L(x, λ1, ..., λm) = f(x) − λ1 f1(x) − ... − λm fm(x) is called the Lagrangian function.

Chapter 7: Existence of solutions of optimization problems with differentiability assumptions

Section 5: Example 1.
(P) max_{x² + y² = 1} x + y.
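Example 1 can be checked directly (our own sketch): the Lagrange condition (1, 1) = λ(2x, 2y) together with the constraint gives the candidate x = y = 1/√2, where the maximum value is √2.

```python
# Candidate from the Lagrange condition for max x + y subject to
# x^2 + y^2 = 1: from (1, 1) = lam * (2x, 2y) we get x = y, and the
# constraint then forces x = y = 1/sqrt(2).
import math

x = y = 1.0 / math.sqrt(2.0)
lam = 1.0 / (2.0 * x)  # from 1 = lam * 2x
value = x + y          # = sqrt(2)

# sanity: the constraint and both gradient equations hold
constraint_ok = abs(x**2 + y**2 - 1.0) < 1e-12
grad_ok = (abs(1.0 - lam * 2.0 * x) < 1e-12 and
           abs(1.0 - lam * 2.0 * y) < 1e-12)
```

The other critical point, x = y = −1/√2, gives the minimum −√2, so the constraint qualification (∇f1 = (2x, 2y) ≠ 0 on the circle) cleanly separates the two candidates.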

Chapter 7: Existence of solutions of optimization problems with differentiability assumptions

Section 5: Example 2.
(P) max_{x > 0, y > 0, x + y = 1} −x ln(x) − y ln(y).
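Example 2 is an entropy maximization (our own numerical sketch): the Lagrange condition −ln(x) − 1 = −ln(y) − 1 together with x + y = 1 forces x = y = 1/2, giving the maximum value ln 2.

```python
# Substituting y = 1 - x turns the problem into maximizing
# h(x) = -x ln(x) - (1 - x) ln(1 - x) on the open interval (0, 1).
import math

def h(x):
    y = 1.0 - x
    return -x * math.log(x) - y * math.log(y)

xs = [i / 10000.0 for i in range(1, 10000)]  # grid strictly inside (0, 1)
best_x = max(xs, key=h)
best_val = h(best_x)  # maximum ln(2), attained at x = 1/2
```

The constraint x + y = 1 is affine, so the constraint qualification holds automatically here (case (2) of the definition above).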