
Fritz John optimality conditions.


Problem

(Smooth optimization problem). We consider the following problem $$\min_{x\in X}f(x)\quad\text{subject to}\quad h_{i}(x)=0,\;i=1,\dots,m,\qquad g_{j}(x)\leq0,\;j=1,\dots,r,$$ where the $f,h_{i},g_{j}$ are smooth functions $\mathbb{R}^{n}\rightarrow\mathbb{R}$ and $X$ is a nonempty closed set.

Proposition

(Fritz John conditions). Let $x^{\ast}$ be a local minimum of the problem ( Smooth optimization problem ). Then there exist scalars $\mu_{0}^{\ast},\lambda_{1}^{\ast},\dots,\lambda_{m}^{\ast},\mu_{1}^{\ast},\dots,\mu_{r}^{\ast}$ such that

1. $-\left(\mu_{0}^{\ast}\nabla f(x^{\ast})+\sum_{i=1}^{m}\lambda_{i}^{\ast}\nabla h_{i}(x^{\ast})+\sum_{j=1}^{r}\mu_{j}^{\ast}\nabla g_{j}(x^{\ast})\right)\in N_{X}(x^{\ast}),$

2. $\mu_{j}^{\ast}\geq0$ for all $j=0,1,\dots,r,$

3. $\mu_{0}^{\ast},\lambda_{1}^{\ast},\dots,\lambda_{m}^{\ast},\mu_{1}^{\ast},\dots,\mu_{r}^{\ast}$ are not all equal to $0.$

4. Let $I=\{i\mid\lambda_{i}^{\ast}\neq0\}$ and $J=\{j\geq1\mid\mu_{j}^{\ast}\neq0\}.$ If $I\cup J\neq\emptyset$ then there exists a sequence $\{x_{k}\}\subset X$, $x_{k}\rightarrow x^{\ast}$, such that $f(x_{k})<f(x^{\ast})$ for all $k$ and $\lambda_{i}^{\ast}h_{i}(x_{k})>0$ for $i\in I$, $\mu_{j}^{\ast}g_{j}(x_{k})>0$ for $j\in J$, while $\left\vert h_{i}(x_{k})\right\vert=o(w_{k})$ for $i\notin I$ and $g_{j}^{+}(x_{k})=o(w_{k})$ for $j\notin J$, where $g_{j}^{+}=\max\{0,g_{j}\}$ and $w_{k}=\min\left\{\min_{i\in I}\left\vert h_{i}(x_{k})\right\vert,\min_{j\in J}g_{j}^{+}(x_{k})\right\}.$
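As a sanity check, the conditions (1)-(3) can be verified numerically on a small instance. The example below is an assumption for illustration, not taken from the text: minimize $f(x)=x_{1}+x_{2}$ subject to $h(x)=x_{1}^{2}+x_{2}^{2}-2=0$ with $X=\mathbb{R}^{2}$, so that $N_{X}(x^{\ast})=\{0\}$ and a local minimum is $x^{\ast}=(-1,-1)$.

```python
import numpy as np

# Hypothetical instance: f(x) = x1 + x2, h(x) = x1^2 + x2^2 - 2 = 0, X = R^2.
# A local (in fact global) minimum is x* = (-1, -1).
x_star = np.array([-1.0, -1.0])

grad_f = np.array([1.0, 1.0])   # gradient of f at x*
grad_h = 2.0 * x_star           # gradient of h at x*: (-2, -2)

# Candidate Fritz John multipliers: mu0 = 1 >= 0, lambda = 1/2, not both zero.
mu0, lam = 1.0, 0.5

# Condition (1): mu0*grad_f + lam*grad_h must lie in -N_X(x*) = {0}.
residual = mu0 * grad_f + lam * grad_h
assert np.allclose(residual, 0.0)
print("FJ residual at x*:", residual)
```

Here $\mu_{0}^{\ast}>0$, so the Fritz John conditions reduce to the usual Lagrange condition after dividing by $\mu_{0}^{\ast}$.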

Proof

Let $g_{j}^{+}(x)=\max\{0,g_{j}(x)\}$ and $$F_{k}(x)=f(x)+\frac{k}{2}\sum_{i=1}^{m}h_{i}^{2}(x)+\frac{k}{2}\sum_{j=1}^{r}\left(g_{j}^{+}(x)\right)^{2}+\frac{1}{2}\left\Vert x-x^{\ast}\right\Vert^{2}$$ for $k=1,2,...$

Consider the problems

$\min_{x\in X\cap S_{\varepsilon}}F_{k}(x)$ (Penalized problem)
where $S_{\varepsilon}=\{x\mid\left\Vert x-x^{\ast}\right\Vert\leq\varepsilon\}$ and $\varepsilon>0$ is such that $f(x^{\ast})\leq f(x)$ for all feasible $x\in S_{\varepsilon}$ (such $\varepsilon$ exists because $x^{\ast}$ is a local minimum).

By the classic version of the Weierstrass theorem there exists a solution $x_{k}$ of the problem ( Penalized problem ) for every $k$. In particular, $F_{k}(x_{k})\leq F_{k}(x^{\ast}).$ Note that $h_{i}(x^{\ast})=0,$ $g_{j}^{+}(x^{\ast})=0$ and $F_{k}(x^{\ast})=f(x^{\ast})$. Hence, we rewrite the last inequality as

$f(x_{k})+\frac{k}{2}\sum_{i=1}^{m}h_{i}^{2}(x_{k})+\frac{k}{2}\sum_{j=1}^{r}\left(g_{j}^{+}(x_{k})\right)^{2}+\frac{1}{2}\left\Vert x_{k}-x^{\ast}\right\Vert^{2}\leq f(x^{\ast})$ (FJ proof 1)
By construction, $\{x_{k}\}\subset S_{\varepsilon}$ is a bounded sequence. Therefore, it has one or more limit points $\bar{x}$.

Since $f$ is smooth, $\{f(x_{k})\}$ is bounded on $S_{\varepsilon}$. Therefore, $h_{i}(x_{k})\rightarrow0$ and $g_{j}^{+}(x_{k})\rightarrow0$, because otherwise the terms $\frac{k}{2}\sum_{i}h_{i}^{2}(x_{k})$ and $\frac{k}{2}\sum_{j}\left(g_{j}^{+}(x_{k})\right)^{2}$ could not remain bounded by $f(x^{\ast})-f(x_{k})$ as $k\rightarrow\infty$.

Thus, all the limit points $\bar{x}$ are feasible: $h_{i}(\bar{x})=0,$ $g_{j}(\bar{x})\leq0,$ $\bar{x}\in X\cap S_{\varepsilon}.$ Therefore, by the construction of $S_{\varepsilon}$ and $\varepsilon$,

$f(x^{\ast})\leq f(\bar{x})$ (FJ proof 2)
By passing to the limit in the inequality ( FJ proof 1 ) and combining with ( FJ proof 2 ) we conclude $f(\bar{x})+\frac{1}{2}\left\Vert\bar{x}-x^{\ast}\right\Vert^{2}\leq f(x^{\ast})\leq f(\bar{x})$, hence $\bar{x}=x^{\ast}$ for every limit point $\bar{x}$. Thus $\{x_{k}\}$ is convergent and $x_{k}\rightarrow x^{\ast}.$
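The convergence of the penalized minimizers can be observed numerically. A minimal sketch, assuming a hypothetical one-dimensional instance not taken from the text: minimize $f(x)=x$ subject to $h(x)=x-1=0$ with $X=\mathbb{R}$, so $x^{\ast}=1$; here $F_{k}$ is a strictly convex quadratic and its minimizer is available in closed form.

```python
# Hypothetical 1-D instance of the penalized problem:
# f(x) = x, h(x) = x - 1, X = R, hence x* = 1 and f(x*) = 1.
x_star, f_star = 1.0, 1.0

def penalized(x, k):
    # F_k(x) = f(x) + (k/2) h(x)^2 + (1/2) |x - x*|^2
    return x + 0.5 * k * (x - 1.0) ** 2 + 0.5 * (x - x_star) ** 2

for k in [1, 10, 100, 1000]:
    # F_k'(x) = 1 + (k + 1)(x - 1) = 0 gives the minimizer x_k = 1 - 1/(k+1).
    x_k = 1.0 - 1.0 / (k + 1)
    # (FJ proof 1): F_k(x_k) <= f(x*); and x_k -> x* as k grows.
    assert penalized(x_k, k) <= f_star
    print(k, x_k, abs(x_k - x_star))
```

The printed distances $\left\vert x_{k}-x^{\ast}\right\vert=1/(k+1)$ shrink to zero, matching the argument above.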

According to the proposition ( Minimum of a smooth function ), $-\nabla F_{k}(x_{k})\in T_{X\cap S_{\varepsilon}}(x_{k})^{\ast}.$ By the convergence $x_{k}\rightarrow x^{\ast}$, for large enough $k$ the $x_{k}$ is in the interior of $S_{\varepsilon}$, hence $-\nabla F_{k}(x_{k})\in T_{X}(x_{k})^{\ast}.$ We restrict our attention to such $k$.

We calculate

$\nabla F_{k}(x_{k})=\nabla f(x_{k})+\sum_{i=1}^{m}\xi_{i}^{k}\nabla h_{i}(x_{k})+\sum_{j=1}^{r}\zeta_{j}^{k}\nabla g_{j}(x_{k})+\left(x_{k}-x^{\ast}\right),\quad\xi_{i}^{k}=kh_{i}(x_{k}),\;\zeta_{j}^{k}=kg_{j}^{+}(x_{k})$ (FJ proof 3)
and introduce the notation $\delta_{k}=\left(1+\sum_{i=1}^{m}(\xi_{i}^{k})^{2}+\sum_{j=1}^{r}(\zeta_{j}^{k})^{2}\right)^{1/2},\quad\mu_{0}^{k}=\frac{1}{\delta_{k}},\;\lambda_{i}^{k}=\frac{\xi_{i}^{k}}{\delta_{k}},\;\mu_{j}^{k}=\frac{\zeta_{j}^{k}}{\delta_{k}}.$ Note that the sequence (in $k$) $\left\{\left(\mu_{0}^{k},\lambda_{1}^{k},\dots,\lambda_{m}^{k},\mu_{1}^{k},\dots,\mu_{r}^{k}\right)\right\}$ is bounded: its Euclidean norm equals $1$ for every $k$. Hence, it has a limit point $\left(\mu_{0}^{\ast},\lambda_{1}^{\ast},\dots,\lambda_{m}^{\ast},\mu_{1}^{\ast},\dots,\mu_{r}^{\ast}\right)$ of unit norm.

By dividing ( FJ proof 3 ) by $\delta_{k}$ we obtain $-\left(\mu_{0}^{k}\nabla f(x_{k})+\sum_{i=1}^{m}\lambda_{i}^{k}\nabla h_{i}(x_{k})+\sum_{j=1}^{r}\mu_{j}^{k}\nabla g_{j}(x_{k})+\frac{1}{\delta_{k}}\left(x_{k}-x^{\ast}\right)\right)\in T_{X}(x_{k})^{\ast}.$ We pass the last relationship to the limit $k\rightarrow\infty$ along a subsequence realizing the limit point and arrive at $-\left(\mu_{0}^{\ast}\nabla f(x^{\ast})+\sum_{i=1}^{m}\lambda_{i}^{\ast}\nabla h_{i}(x^{\ast})+\sum_{j=1}^{r}\mu_{j}^{\ast}\nabla g_{j}(x^{\ast})\right)\in N_{X}(x^{\ast})$

(compare with the definition ( Normal cone )).

To see that the statement (4) holds, note that by construction $\lambda_{i}^{k}=\frac{kh_{i}(x_{k})}{\delta_{k}}$, so $\lambda_{i}^{k}$ has the sign of $h_{i}(x_{k})$. If $i\in I$ then $\lambda_{i}^{k}\rightarrow\lambda_{i}^{\ast}\neq0$ along the subsequence, hence $\lambda_{i}^{\ast}h_{i}(x_{k})>0$ for large enough $k$. If $i\not\in I$ then the $i$-th component of the multiplier sequence, $\lambda_{i}^{k}$, has to vanish as $k\rightarrow\infty$. Hence, if $i\not\in I$ then $h_{i}(x_{k})$ vanishes quicker than any of the $h_{i}(x_{k})$ for $i\in I$. The consideration for the $g_{j}$ is identical. Finally, if $I\cup J\neq\emptyset$ then the penalty terms in ( FJ proof 1 ) are strictly positive for large $k$, hence $f(x_{k})<f(x^{\ast})$.
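The normalization by $\delta_{k}$ and the sign condition of statement (4) can be traced numerically. A sketch under the same assumed one-dimensional instance used as an illustration (not from the text): minimize $f(x)=x$ subject to $h(x)=x-1=0$, $X=\mathbb{R}$, $x^{\ast}=1$, for which the penalized minimizer is $x_{k}=1-1/(k+1)$.

```python
import math

# Hypothetical 1-D instance: f(x) = x, h(x) = x - 1, X = R, x* = 1.
# The minimizer of the k-th penalized problem (from F_k'(x) = 0) is:
k = 10**6
x_k = 1.0 - 1.0 / (k + 1)

xi_k = k * (x_k - 1.0)                    # xi_k = k h(x_k), tends to -1
delta_k = math.sqrt(1.0 + xi_k ** 2)      # normalization, tends to sqrt(2)
mu0, lam = 1.0 / delta_k, xi_k / delta_k  # approximate limit multipliers

# Condition (1) at x*: grad f = 1, grad h = 1, N_X(x*) = {0}.
assert abs(mu0 * 1.0 + lam * 1.0) < 1e-6
# Statement (4): lambda* h(x_k) > 0 and f(x_k) < f(x*) along the sequence.
assert lam * (x_k - 1.0) > 0.0
assert x_k < 1.0
print("mu0, lambda:", mu0, lam)
```

The multipliers approach $(1/\sqrt{2},-1/\sqrt{2})$, a unit-norm limit point exactly as in the proof.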





Copyright 2007