
Title: Nonsmooth Vector Functions and Continuous Optimization
Author: V. Jeyakumar, D.T. Luc
Language: English
File Size: 2.6 MB
Total Pages: 276
Table of Contents
Preface
Contents
	Pseudo-Jacobian Matrices
	Calculus Rules for Pseudo-Jacobian
	Openness of Continuous Vector Functions
	Nonsmooth Mathematical Programming Problems
	Monotone Operators and Nonsmooth Variational Inequalities
Bibliographical Notes
References
Notations
Index
                        
Document Text Contents
Page 1

NONSMOOTH VECTOR FUNCTIONS AND
CONTINUOUS OPTIMIZATION

Page 2

Optimization and Its Applications

VOLUME 10

Managing Editor
Panos M. Pardalos (University of Florida)

Editor—Combinatorial Optimization
Ding-Zhu Du (University of Texas at Dallas)

Advisory Board
J. Birge (University of Chicago)
C.A. Floudas (Princeton University)
F. Giannessi (University of Pisa)
H.D. Sherali (Virginia Polytechnic Institute and State University)
T. Terlaky (McMaster University)
Y. Ye (Stanford University)



Aims and Scope
Optimization has been expanding in all directions at an astonishing rate
during the last few decades. New algorithmic and theoretical techniques have
been developed, the diffusion into other disciplines has proceeded at a rapid
pace, and our knowledge of all aspects of the field has grown even more
profound. At the same time, one of the most striking trends in optimization is
the constantly increasing emphasis on the interdisciplinary nature of the field.
Optimization has been a basic tool in all areas of applied mathematics,
engineering, medicine, economics and other sciences.

The series Optimization and Its Applications publishes undergraduate
and graduate textbooks, monographs and state-of-the-art expository works
that focus on algorithms for solving optimization problems and also study
applications involving such problems. Some of the topics covered include
nonlinear optimization (convex and nonconvex), network flow problems,
stochastic optimization, optimal control, discrete optimization, multi-objective
programming, description of software packages, approximation
techniques and heuristic approaches.

Page 138


lim_{k→∞} vk = v0 ∈ Bm

and either

lim_{k→∞} Mk = M0 ∈ co F(0)    (3.28)

or

lim_{k→∞} tkMk = M∗ ∈ co[(F(0))∞ \ {0}],    (3.29)

where {tk} is some sequence of positive numbers converging to 0.
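For orientation, a dichotomy of this kind is usually obtained by testing whether the sequence {Mk} constructed earlier in the proof (not shown in this excerpt) is bounded; the following is only a sketch of that standard argument, under the assumption that the construction follows the usual pattern:

\[
\sup_k \|M_k\| < \infty \;\Longrightarrow\; M_{k_j} \to M_0 \quad \text{(case (3.28))};
\qquad
\|M_k\| \to \infty \;\Longrightarrow\; t_k := \frac{1}{\|M_k\|} \to 0,\ \ \|t_k M_k\| = 1,\ \ t_{k_j} M_{k_j} \to M_* \neq 0 \quad \text{(case (3.29))},
\]

with M0 ∈ co F(0) and M∗ ∈ co[(F(0))∞ \ {0}] supplied by the upper semicontinuity and recession-cone properties used before this excerpt.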
Let us see that each of (3.28) and (3.29) leads to a contradiction. First
assume that (3.28) holds. By hypothesis, M0 is K-surjective on C at 0,
which means that

0 ∈ int(M0(C) + K).

In view of Proposition 3.5.1, there exist some ε > 0 and k0 ≥ 1 such that

v0 + εBm ⊆ k0(M0(C ∩ Bn) + K). (3.30)

For this ε, choose k1 ≥ k0 so that

‖Mk − M0‖ < ε/4 for k ≥ k1. (3.31)

We now show that there is k2 ≥ k1 such that

v0 + (ε/2)Bm ⊆ k0(M0(Bn ∩ (C − xk)) + K) for k ≥ k2. (3.32)

Indeed, if this is not the case, then one may assume that for each xk there
is some bk ∈ (ε/2)Bm satisfying

v0 + bk ∉ k0(M0(Bn ∩ (C − xk)) + K).

Because the set on the right-hand side is convex, by the separation theorem,
there exists some ξk ∈ IRm with ‖ξk‖ = 1 such that

〈ξk, v0 + bk〉 ≤ 〈ξk, k0(M0(x) + y)〉 for all x ∈ Bn ∩ (C − xk), y ∈ K.
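The form of separation needed at this step can be recorded explicitly; a minimal statement that suffices here (the book's own separation theorem, cited in the Index, may be stated more generally):

\[
S \subseteq \mathbb{R}^m \ \text{convex},\quad p \notin S
\;\Longrightarrow\;
\exists\, \xi \in \mathbb{R}^m,\ \xi \neq 0:\quad
\langle \xi, p \rangle \le \langle \xi, s \rangle \ \ \text{for all } s \in S.
\]

Applied with p = v0 + bk and S = k0(M0(Bn ∩ (C − xk)) + K), and after normalizing ‖ξk‖ = 1, this gives the displayed inequality.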

Using subsequences if needed, one may again assume that

lim_{k→∞} bk = b0 ∈ (ε/2)Bm,

lim_{k→∞} ξk = ξ0 with ‖ξ0‖ = 1.

It then follows that

〈ξ0, v0 + b0〉 ≤ 〈ξ0, k0(M0(x) + y)〉 for all x ∈ Bn ∩ C, y ∈ K.
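The contradiction with (3.30) asserted in the next sentence can be verified directly; a short sketch of the computation, using only (3.30), b0 ∈ (ε/2)Bm, and ‖ξ0‖ = 1:

\[
0 < t < \tfrac{\varepsilon}{2},\quad w := v_0 + b_0 - t\,\xi_0
\;\Longrightarrow\;
\|b_0 - t\,\xi_0\| \le \|b_0\| + t \le \tfrac{\varepsilon}{2} + t < \varepsilon,
\]

so w ∈ v0 + εBm ⊆ k0(M0(C ∩ Bn) + K) by (3.30). Writing w = k0(M0(x) + y) with x ∈ Bn ∩ C and y ∈ K, the inequality above yields

\[
\langle \xi_0, v_0 + b_0 \rangle \;\le\; \langle \xi_0, w \rangle
\;=\; \langle \xi_0, v_0 + b_0 \rangle - t\,\|\xi_0\|^2
\;=\; \langle \xi_0, v_0 + b_0 \rangle - t,
\]

which is impossible for t > 0.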

Since v0 + b0 is an interior point of the set v0 + εBm, the obtained
inequality contradicts (3.30). Hence (3.32) is true. It follows from (3.30),
(3.31), and (3.32) for k ≥ k2 that

Page 139


v0 + (ε/2)Bm ⊆ k0{M0[Bn ∩ (C − xk)] + K}
            ⊆ k0{Mk[Bn ∩ (C − xk)] + (M0 − Mk)[Bn ∩ (C − xk)] + K}
            ⊆ k0{Mk[Bn ∩ (C − xk)] + (ε/4)Bm + K}.
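The last inclusion in this chain uses only (3.31) and the fact that ‖x‖ ≤ 1 for the arguments involved; a one-line check:

\[
x \in B_n \cap (C - x_k),\ \ k \ge k_1
\;\Longrightarrow\;
\|(M_0 - M_k)x\| \le \|M_0 - M_k\|\,\|x\| < \tfrac{\varepsilon}{4},
\]

so (M0 − Mk)[Bn ∩ (C − xk)] ⊆ (ε/4)Bm.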

Because the set Mk((C − xk) ∩ Bn) + K is convex, we deduce from the
above inclusion that

v0 + (ε/4)Bm ⊆ k0(Mk(Bn ∩ (C − xk)) + K) for k ≥ k2.

Now we choose k ≥ k2 so large that vk ∈ v0 + (ε/4)Bm and obtain

vk ∈ k(Mk(Bn ∩ (C − xk)) + K),

which contradicts (3.27). The case of (3.29) is proven by the same technique. □

The next result is a generalization of Proposition 3.1.11.

Proposition 3.5.4 Assume that the hypotheses of Proposition 3.5.3 hold.
Then there is a closed convex set D containing 0 with D\{0} ⊆ C such
that the set

∪_{y ∈ δBn} co[F(y) + (F(y))δ∞]

is equi-K-surjective on D around 0.

Proof. Let {Dk} be an increasing sequence of closed convex sets that exists
by Lemma 3.1.10; that is, the Dk satisfy

0 ∈ Dk ⊆ C ∪ {0} and C ⊆ cl[∪_{k=1}^∞ Dk].

Our aim is to apply Proposition 3.5.3 to the sets Dk. We show that for
k sufficiently large, every matrix of the set co(F(0)) ∪ co[(F(0))∞ \ {0}] is
K-surjective on Dk at 0. Suppose to the contrary that for each k = 1, 2, . . .
there is Mk ∈ co(F(0)) ∪ co[(F(0))∞ \ {0}] such that

0 ∉ int(Mk(Dk ∩ Bn) + K).

Because the set on the right-hand side is convex, by using the separation
theorem, we find ξk ∈ IRm with ‖ξk‖ = 1 such that

0 ≤ 〈ξk, Mk(x) + y〉 for x ∈ Dk ∩ Bn and y ∈ K. (3.33)

Without loss of generality we may assume that

lim_{k→∞} ξk = ξ0 with ‖ξ0‖ = 1

Page 275

Index

H-differential, 22

implicit function theorem, 115
inf-function, 77
injective matrix, 87
inverse function theorem, 115
Ioffe

approximate subdifferential, 30
controllability theorem, 125
prederivative, 20, 54

Jacobian matrix, 8

Kuhn–Tucker condition, 144

Lagrangian, 156
limit

outer horizon, 41
recession upper, 41
cosmic upper, 41
Kuratowski–Painlevé upper, 41

limiting normal cone, 17
Lipschitz constant, 7
Lipschitz function, 7
Lipschitz modulus, 71
LMO-approximation, 179
local minimizer, 64
local unique solution, 233
locally bounded set-valued map, 43, 70
locally Lipschitz function, 14

max-function, 62
mean value theorem, 67

asymptotic, 69
metric regularity, 137
Michel–Penot subdifferential, 30
min-function, 62
minimax theorem, 118
Mordukhovich

basic subdifferential, 29
coderivative, 17
second order subdifferential, 35
singular subdifferential, 29

multiobjective problem, 187
multiplier rule, 144, 189

open mapping theorem, 110
operator, 207

comonotone, 213
monotone, 207
pseudomonotone, 220
quasimonotone, 215
strictly monotone, 207
strongly monotone, 212

optimality condition, 64
Fritz John, 144
necessary, 143
second-order, 156, 183
sufficient, 162

partial order, 186
partial pseudo-Jacobian, 39
polyhedral set, 233
positive definite matrix, 208
positive semidefinite matrix, 208
prederivative, 21
problem

complementarity, 243
convex composite, 168
equality constraints, 143
intersection, 231
linearized, 236
locally Lipschitz, 150
minimax, 154
mixed constraints, 147
multiobjective, 186

pseudo-differential, 23
pseudo-Hessian, 33, 162

Fréchet, 163
pseudo-Jacobian, 10

strict pseudo-Jacobian, 55
densely regular, 208
Fréchet pseudo-Jacobian, 49
Gâteaux pseudo-Jacobian, 49
partial, 72
pseudo-Jacobian matrix, 10
regular pseudo-Jacobian, 10

pseudo-Jacobian map, 46
pseudo-Lipschitz property, 139

recession cone, 35
relative interior, 3

separation theorem, 4, 66
strict local minimum of order 2, 185
strict prederivative, 21
submap, 208
suboperator, 208
sup-function, 72, 77
support function, 20
surjective matrix, 87

tangent cone, 65
Taylor’s expansion, 80
Treiman linear generalized gradient, 32

upper semicontinuity, 40
upper semicontinuous hull, 44

Page 276


variational inequality, 230

Hartman–Stampacchia, 230

Minty, 230

Warga’s unbounded derivative container,
19, 54

weakly efficient point, 187

Zagrodny’s mean value theorem, 24
